
952 BigQuery Jobs - Page 2

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and contribute to key decisions.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the development and implementation of new features.
- Conduct code reviews and ensure coding standards are met.
- Troubleshoot and resolve complex technical issues.

Professional & Technical Skills:
- Must-have: Proficiency in Google BigQuery.
- Strong understanding of data modeling and database design.
- Experience with cloud platforms like Google Cloud Platform.
- Knowledge of SQL and query optimization techniques (see the sketch after this listing).
- Hands-on experience in developing scalable applications.
- Good to have: Experience with data warehousing solutions.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Google BigQuery.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
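To make the query-optimization skill concrete, here is a minimal, hedged sketch of estimating a BigQuery query's scan cost with a dry run before executing it; the dataset, table, and partition column are hypothetical:

```python
# Minimal sketch: estimate BigQuery scan cost with a dry run before executing.
# Assumes google-cloud-bigquery is installed and credentials are configured;
# `my_dataset.events` and its `event_date` partition column are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT user_id, COUNT(*) AS n_events
    FROM `my_dataset.events`
    WHERE event_date BETWEEN '2024-01-01' AND '2024-01-31'  -- prune partitions
    GROUP BY user_id
"""

# Dry run: BigQuery validates the query and reports bytes scanned without running it.
dry = client.query(sql, job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False))
print(f"Estimated scan: {dry.total_bytes_processed / 1e9:.2f} GB")

# Run for real only if the estimate is acceptable.
for row in client.query(sql).result():
    print(row.user_id, row.n_events)
```

Filtering on the partition column is what keeps the byte estimate (and the bill) down; the dry run makes that visible before any slots are consumed.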

Posted 4 days ago

Apply

10.0 - 15.0 years

22 - 37 Lacs

Bengaluru

Work from Office


Who We Are
At Kyndryl, we design, build, manage, and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward, always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers, and our communities.

The Role
Join Kyndryl as a Data Architect, where you will unlock the power of data to drive strategic decisions and shape the future of our business. As a key member of our team, you will harness your expertise in basic statistics, business fundamentals, and communication to uncover valuable insights and transform raw data into rigorous visualizations and compelling stories.

In this role, you will work closely with our customers as part of a top-notch team. You will dive deep into vast IT datasets, unravel the mysteries hidden within, and discover trends and patterns that will revolutionize our customers' understanding of their own landscapes. Armed with advanced analytical skills, you will draw compelling conclusions and develop data-driven insights that directly impact their decision-making processes.

Your Role and Responsibilities:
- Data Architecture Design: Design scalable, secure, and high-performance data architectures, including data warehouses, data lakes, and BI solutions.
- Data Modeling: Develop and maintain complex data models (ER, star, and snowflake schemas) to support BI and analytics requirements (a sketch follows this listing).
- BI Strategy and Implementation: Lead the design and implementation of BI solutions using platforms like Power BI, Tableau, Qlik, and Looker.
- ETL/ELT Management: Architect efficient ETL/ELT pipelines for data transformation and integration across multiple data sources.
- Data Governance: Implement data quality, data lineage, and metadata management frameworks to ensure data reliability and compliance.
- Performance Optimization: Optimize data storage and retrieval processes for speed, scalability, and efficiency.
- Stakeholder Collaboration: Work closely with business and technical teams to define data requirements and deliver actionable insights.
- Cloud and Big Data: Utilize cloud-native tools like Azure Synapse, AWS Redshift, GCP BigQuery, and Databricks for large-scale data processing.
- Mentorship: Guide junior data engineers and BI developers on best practices and advanced techniques.

Your ability to communicate and empathize with stakeholders will be invaluable. By understanding the business objectives and success criteria of each project, you will align your data analysis efforts with our overarching goals. With your mastery of business valuation, decision-making, project scoping, and storytelling, you will transform data into meaningful narratives that drive real-world impact.

At Kyndryl, we believe that data holds immense potential, and we are committed to helping you unlock it. You will have access to vast repositories of data, empowering you to delve deep and determine the root causes of defects and variation. If you are ready to unleash your analytical ability, collaborate with industry experts, and shape the future of data-driven decision making, then join us as a Data Architect at Kyndryl. Together, we will harness the power of data to redefine what is possible.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You're good at what you do and possess the required experience to prove it. Equally important, you have a growth mindset and are keen to drive your own personal and professional development. You are customer-focused, prioritizing customer success in your work, and you are open and borderless, naturally inclusive in how you work with others.

Required Skills and Experience
- Education: Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Experience: 8+ years in data architecture, BI, and analytics roles.
- BI Tools: Power BI, Tableau, Qlik, Looker, SAP Analytics Cloud.
- Data Modeling: ER, dimensional, star, and snowflake schemas.
- Cloud Platforms: Azure, AWS, GCP, Snowflake.
- Databases: SQL Server, Oracle, MySQL, NoSQL (MongoDB, DynamoDB).
- ETL Tools: Informatica, Talend, SSIS, Apache NiFi.
- Scripting: Python, R, SQL, DAX, MDX.
- Soft Skills: Strong communication, problem-solving, and leadership abilities; knowledge of deployment patterns; strong documentation, troubleshooting, and data-profiling skills; excellent analytical, conceptual, and problem-solving abilities; ability to manage multiple priorities and swiftly adapt to changing demands.

Preferred Skills and Experience
- Microsoft Certified: Azure Data Engineer Associate
- AWS Certified Data Analytics - Specialty
- Google Professional Data Engineer
- Tableau Desktop Certified Professional
- Power BI Data Analyst Associate

Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you, and everyone next to you, the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate and to build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter, wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone who works at Kyndryl, when asked "How Did You Hear About Us" during the application process, select "Employee Referral" and enter your contact's Kyndryl email address.
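As an illustration of the dimensional-modeling skill this role lists, here is a minimal sketch creating a toy star schema via BigQuery DDL issued from Python; all dataset, table, and column names are hypothetical, and BigQuery is just one of the warehouses the listing names:

```python
# Minimal star-schema sketch (hypothetical names): one fact table keyed to two dimensions.
# Assumes google-cloud-bigquery is installed and an `analytics` dataset exists.
from google.cloud import bigquery

client = bigquery.Client()
ddl = """
CREATE TABLE IF NOT EXISTS `analytics.dim_customer` (
    customer_key INT64, customer_name STRING, segment STRING);
CREATE TABLE IF NOT EXISTS `analytics.dim_date` (
    date_key INT64, calendar_date DATE, fiscal_quarter STRING);
CREATE TABLE IF NOT EXISTS `analytics.fact_sales` (
    customer_key INT64,   -- FK to dim_customer
    date_key INT64,       -- FK to dim_date
    quantity INT64,
    revenue NUMERIC);
"""
# BigQuery supports multi-statement scripts in a single query job.
client.query(ddl).result()
```

The design choice a star schema encodes is that measures live in one narrow fact table while descriptive attributes are pushed into dimensions, which keeps BI joins predictable for tools like Power BI or Looker.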

Posted 4 days ago

Apply

8.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office


What you'll be doing:
- Assist in developing machine learning models based on project requirements.
- Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality.
- Perform statistical analysis and fine-tuning using test results.
- Support training and retraining of ML systems as needed.
- Help build data pipelines for collecting and processing data efficiently.
- Follow coding and quality standards while developing AI/ML solutions.
- Contribute to frameworks that help operationalize AI models.

What we seek in you:
- 8+ years of experience in the IT industry.
- Strong programming skills in languages like Python.
- Hands-on experience with one cloud (GCP preferred).
- Experience working with Docker.
- Environment management (e.g., venv, pip, poetry).
- Experience with orchestrators like Vertex AI Pipelines, Airflow, etc. (see the sketch after this listing).
- End-to-end understanding of the full ML cycle.
- Data engineering and feature engineering techniques.
- Experience with ML modelling and evaluation metrics.
- Experience with TensorFlow, PyTorch, or another framework.
- Experience with model monitoring.
- Advanced SQL knowledge.
- Awareness of streaming concepts like windowing, late arrival, and triggers.
- Storage: CloudSQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases.
- Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices.
- Scheduling: Cloud Composer, Airflow.
- Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink.
- CI/CD: Bitbucket + Jenkins / GitLab; infrastructure as code: Terraform.

Life at Next:
At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.

Perks of working with us:
- Clear objectives to ensure alignment with our mission, fostering your meaningful contribution.
- Abundant opportunities for engagement with customers, product managers, and leadership.
- Guidance along progressive paths, with insightful input from managers through ongoing feedforward sessions.
- Cultivate and leverage robust connections within diverse communities of interest.
- Choose your mentor to navigate your current endeavors and steer your future trajectory.
- Continuous learning and upskilling opportunities through Nexversity.
- Flexibility to explore various functions, develop new skills, and adapt to emerging technologies.
- A hybrid work model promoting work-life balance.
- Comprehensive family health insurance coverage, prioritizing the well-being of your loved ones.
- Accelerated career paths to actualize your professional aspirations.

Who we are:
We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create tailor-made solutions that meet each customer's unique needs. Join our passionate team and tailor your growth with us!
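On the orchestration requirement, here is a hedged, minimal Kubeflow Pipelines (KFP v2) sketch of the kind of pipeline that can be compiled and submitted to Vertex AI Pipelines; the component logic is a placeholder, not a real training job:

```python
# Minimal KFP v2 pipeline sketch (placeholder logic) of the sort runnable on
# Vertex AI Pipelines. Assumes the `kfp` v2 SDK is installed.
from kfp import compiler, dsl

@dsl.component
def preprocess(rows: int) -> int:
    # Placeholder: pretend we cleaned `rows` records.
    return rows

@dsl.component
def train(rows: int) -> str:
    # Placeholder: pretend we trained a model on `rows` records.
    return f"model trained on {rows} rows"

@dsl.pipeline(name="toy-training-pipeline")
def pipeline(rows: int = 1000):
    cleaned = preprocess(rows=rows)
    train(rows=cleaned.output)  # data dependency defines the DAG edge

if __name__ == "__main__":
    # Produces a spec that Vertex AI Pipelines (or any KFP backend) can execute.
    compiler.Compiler().compile(pipeline, "toy_pipeline.json")
```

The DAG is implied by data flow rather than declared explicitly, which is the main way KFP differs from Airflow-style operator wiring.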

Posted 4 days ago

Apply

5.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office


What you'll do:
- Utilize advanced mathematical, statistical, and analytical expertise to research, collect, analyze, and interpret large datasets from internal and external sources to provide insight and develop data-driven solutions across the company.
- Build and test predictive models, including but not limited to credit risk, fraud, response, and offer-acceptance propensity.
- Take responsibility for the development, testing, validation, tracking, and performance enhancement of statistical models and other BI reporting tools, leading to innovative origination strategies within marketing, sales, finance, and underwriting.
- Leverage advanced analytics to develop innovative portfolio surveillance solutions to track and forecast loan losses, influencing key business decisions related to pricing optimization, credit policy, and overall profitability strategy.
- Use decision science methodologies and advanced data visualization techniques to implement creative automation solutions within the organization.
- Initiate and lead analysis to bring actionable insights to all areas of the business, including marketing, sales, collections, and credit decisioning.
- Develop and refine unit economics models to enable marketing and credit decisions.

What you'll need:
- 5 to 8 years of experience in data science or a related role, with a focus on Python programming and ML models.
- Strong Python skills; experience with Jupyter notebooks and packages like polars, pandas, numpy, scikit-learn, matplotlib, etc.
- Experience with the ML lifecycle: data preparation, training, evaluation, and deployment.
- Hands-on experience with GCP services for ML and data science.
- Experience with vector search and hybrid search techniques.
- Embedding generation using BERT, Sentence Transformers, or custom models.
- Embedding indexing and retrieval (Elastic, FAISS, ScaNN, Annoy); see the sketch after this listing.
- Experience with LLMs and use cases like RAG.
- Understanding of semantic vs. lexical search paradigms.
- Experience with Learning to Rank (LTR) and libraries like XGBoost and LightGBM with LTR support.
- Proficient in SQL and BigQuery.
- Experience with Dataproc clusters for distributed data processing using Apache Spark or PySpark.
- Model deployment using Vertex AI, Cloud Run, or Cloud Functions.
- Familiarity with BM25 ranking (Elasticsearch or OpenSearch) and vector blending.
- Awareness of search relevance evaluation metrics (precision@k, recall, nDCG, MRR).

Life at Next:
At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.

Perks of working with us:
- Clear objectives to ensure alignment with our mission, fostering your meaningful contribution.
- Abundant opportunities for engagement with customers, product managers, and leadership.
- Guidance along progressive paths, with insightful input from managers through ongoing feedforward sessions.
- Cultivate and leverage robust connections within diverse communities of interest.
- Choose your mentor to navigate your current endeavors and steer your future trajectory.
- Continuous learning and upskilling opportunities through Nexversity.
- Flexibility to explore various functions, develop new skills, and adapt to emerging technologies.
- A hybrid work model promoting work-life balance.
- Comprehensive family health insurance coverage, prioritizing the well-being of your loved ones.
- Accelerated career paths to actualize your professional aspirations.

Who we are:
We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create tailor-made solutions that meet each customer's unique needs. Join our passionate team and tailor your growth with us!
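To ground the embedding and retrieval items above, here is a small, hedged sketch of embedding generation with Sentence Transformers and exact nearest-neighbor retrieval with FAISS; the corpus is a toy stand-in:

```python
# Toy semantic-search sketch: Sentence Transformers embeddings + FAISS retrieval.
# Assumes sentence-transformers and faiss-cpu are installed; corpus is illustrative.
import faiss
from sentence_transformers import SentenceTransformer

corpus = [
    "personal loan interest rates",
    "how to report a lost credit card",
    "mortgage prepayment penalties",
]
model = SentenceTransformer("all-MiniLM-L6-v2")

# Normalized embeddings let inner product behave as cosine similarity.
emb = model.encode(corpus, normalize_embeddings=True)
index = faiss.IndexFlatIP(emb.shape[1])  # exact search; swap for IVF/HNSW at scale
index.add(emb)

query = model.encode(["lost my card, what do I do?"], normalize_embeddings=True)
scores, ids = index.search(query, 2)  # top-2 neighbors
for score, i in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {corpus[i]}")
```

A hybrid system would blend these dense scores with a lexical signal such as BM25 and evaluate the blend with the metrics the listing names (precision@k, nDCG, MRR).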

Posted 4 days ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Bengaluru

Work from Office


What you'll be doing:
- Assist in developing machine learning models based on project requirements.
- Work with datasets by preprocessing, selecting appropriate data representations, and ensuring data quality.
- Perform statistical analysis and fine-tuning using test results.
- Support training and retraining of ML systems as needed.
- Help build data pipelines for collecting and processing data efficiently.
- Follow coding and quality standards while developing AI/ML solutions.
- Contribute to frameworks that help operationalize AI models.

What we seek in you:
- Strong programming skills in languages like Python and Java.
- Hands-on experience with one cloud (GCP preferred).
- Experience working with Docker.
- Environment management (e.g., venv, pip, poetry).
- Experience with orchestrators like Vertex AI Pipelines, Airflow, etc.
- End-to-end understanding of the full ML cycle.
- Data engineering and feature engineering techniques.
- Experience with ML modelling and evaluation metrics.
- Experience with TensorFlow, PyTorch, or another framework.
- Experience with model monitoring.
- Advanced SQL knowledge.
- Awareness of streaming concepts like windowing, late arrival, and triggers (see the sketch after this listing).
- Storage: CloudSQL, Cloud Storage, Cloud Bigtable, BigQuery, Cloud Spanner, Cloud Datastore, vector databases.
- Ingest: Pub/Sub, Cloud Functions, App Engine, Kubernetes Engine, Kafka, microservices.
- Scheduling: Cloud Composer, Airflow.
- Processing: Cloud Dataproc, Cloud Dataflow, Apache Spark, Apache Flink.
- CI/CD: Bitbucket + Jenkins / GitLab; infrastructure as code: Terraform.

Life at Next:
At our core, we're driven by the mission of tailoring growth for our customers by enabling them to transform their aspirations into tangible outcomes. We're dedicated to empowering them to shape their futures and achieve ambitious goals. To fulfil this commitment, we foster a culture defined by agility, innovation, and an unwavering commitment to progress. Our organizational framework is both streamlined and vibrant, characterized by a hands-on leadership style that prioritizes results and fosters growth.

Perks of working with us:
- Clear objectives to ensure alignment with our mission, fostering your meaningful contribution.
- Abundant opportunities for engagement with customers, product managers, and leadership.
- Guidance along progressive paths, with insightful input from managers through ongoing feedforward sessions.
- Cultivate and leverage robust connections within diverse communities of interest.
- Choose your mentor to navigate your current endeavors and steer your future trajectory.
- Continuous learning and upskilling opportunities through Nexversity.
- Flexibility to explore various functions, develop new skills, and adapt to emerging technologies.
- A hybrid work model promoting work-life balance.
- Comprehensive family health insurance coverage, prioritizing the well-being of your loved ones.
- Accelerated career paths to actualize your professional aspirations.

Who we are:
We enable high-growth enterprises to build hyper-personalized solutions that transform their vision into reality. With a keen eye for detail, we apply creativity, embrace new technology, and harness the power of data and AI to co-create tailor-made solutions that meet each customer's unique needs. Join our passionate team and tailor your growth with us!
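To illustrate the streaming concepts named above (windowing in particular), here is a hedged Apache Beam sketch that counts events per 60-second fixed window; the in-memory source with manual event times stands in for a real stream such as Pub/Sub:

```python
# Hedged Apache Beam sketch: count events per 60-second fixed window.
# Assumes apache-beam is installed; the Create source is a stand-in for Pub/Sub.
import apache_beam as beam
from apache_beam.transforms import window

with beam.Pipeline() as p:
    (
        p
        | beam.Create([("click", 1.0), ("click", 30.0), ("click", 75.0)])
        | beam.Map(lambda kv: window.TimestampedValue(kv, kv[1]))  # assign event time (s)
        | beam.WindowInto(window.FixedWindows(60))                 # 60-second windows
        | beam.Map(lambda kv: (kv[0], 1))                          # key by event name
        | beam.CombinePerKey(sum)                                  # count per key per window
        | beam.Map(print)  # window [0,60) -> ('click', 2); window [60,120) -> ('click', 1)
    )
```

Late arrival and triggers would be handled on the same `WindowInto` step (via `allowed_lateness` and trigger arguments), which is why windowing is the anchor concept for the rest of the list.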

Posted 4 days ago

Apply

1.0 - 2.0 years

0 - 0 Lacs

Chennai

Work from Office


The ideal candidate should have a strong background in SQL, BigQuery, and Google Cloud Platform (GCP), with hands-on experience developing reports and dashboards using Looker Studio, Looker Standard, and LookML. Excellent communication skills and the ability to work collaboratively with cross-functional teams are essential for success in this role.

Key Responsibilities:
- Design, develop, and maintain dashboards and reports using Looker Studio and Looker Standard.
- Develop and maintain LookML models, explores, and views to support business reporting requirements.
- Optimize and write advanced SQL queries for data extraction, transformation, and analysis (see the sketch after this listing).
- Work with BigQuery as the primary data warehouse for managing and analyzing large datasets.
- Collaborate with business stakeholders to understand data requirements and translate them into scalable reporting solutions.
- Implement data governance, access controls, and performance optimizations within the Looker environment.
- Perform root-cause analysis and troubleshooting for reporting and data issues.
- Maintain documentation for Looker projects, data models, and data dictionaries.
- Stay updated with the latest Looker and GCP features and best practices.
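On the advanced-SQL side of this role, a common pattern is parameterizing the queries that back dashboard tiles; below is a hedged sketch using the BigQuery Python client, where the `reporting.orders` table and the `region` parameter are hypothetical:

```python
# Hedged sketch: parameterized BigQuery query of the kind that backs a dashboard tile.
# Assumes google-cloud-bigquery is installed; table and parameter are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
    SELECT order_date, SUM(amount) AS daily_revenue
    FROM `reporting.orders`
    WHERE region = @region          -- bound parameter, not string concatenation
    GROUP BY order_date
    ORDER BY order_date
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("region", "STRING", "APAC")]
)
for row in client.query(sql, job_config=job_config).result():
    print(row.order_date, row.daily_revenue)
```

Binding parameters rather than interpolating strings keeps the query plan cacheable and avoids injection, the same discipline Looker applies through templated filters.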

Posted 6 days ago

Apply

3.0 - 8.0 years

35 - 50 Lacs

Bengaluru

Work from Office


About the Role:
As a Data Engineer, you will be part of the Data Engineering team. The role is inherently multi-functional: the ideal candidate will work with Data Scientists, Analysts, and application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, a knack for understanding requirements clearly, and the ability to iterate quickly. Successful candidates will have strong engineering and communication skills and a belief that data-driven processes lead to phenomenal products.

What you'll do:
- Build and launch data pipelines and data products focused on the SMART Org, helping teams push the boundaries of insights, creating new product features using data, and powering machine learning models (see the sketch after this listing).
- Build cross-functional relationships to understand data needs, build key metrics, and standardize their usage across the organization.
- Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.

What You'll Need:
- Bachelor's/Master's degree in Computer Science or a related technical subject area, or an equivalent combination of education and experience.
- 3+ years of relevant work experience in data engineering with web-scale data sets.
- Demonstrated strength in data modeling, ETL development, and data lake architecture.
- Data warehousing experience with big data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.).
- Coding proficiency in at least one modern programming language (Python, Scala, etc.).
- Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing, plus query performance tuning skills on large data sets.
- Industry experience as a big data engineer working alongside cross-functional teams such as Software Engineering, Analytics, and Data Science, with a track record of manipulating, processing, and extracting value from large datasets.
- Strong business acumen.
- Experience leading large-scale data warehousing and analytics projects, including GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies in other cloud platforms like AWS and Azure.
- A team player who introduces and follows best practices in the data engineering space.
- Ability to communicate technical information and the results of engineering design effectively, both written and verbal, at all levels of the organization.

Good to have:
- Understanding of NoSQL databases and Pub-Sub architecture setup.
- Familiarity with BI tools like Looker, Tableau, AtScale, PowerBI, or similar.

PS: This role is with one of our clients, a leading name in the retail industry.
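As a concrete, hedged sketch of the pipeline work described above, the snippet below aggregates a raw events table with PySpark and writes the result to BigQuery through the Spark-BigQuery connector; bucket and table names are hypothetical:

```python
# Hedged PySpark sketch: aggregate raw events and write to BigQuery.
# Assumes the spark-bigquery connector is available on the cluster (e.g., Dataproc);
# bucket and table names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-metrics").getOrCreate()

events = spark.read.parquet("gs://my-bucket/raw/events/")
daily = (
    events
    .groupBy(F.to_date("event_ts").alias("event_date"), "product_id")
    .agg(F.count("*").alias("views"), F.countDistinct("user_id").alias("users"))
)

(daily.write.format("bigquery")
      .option("table", "analytics.daily_product_metrics")
      .option("temporaryGcsBucket", "my-bucket-tmp")  # staging bucket for the connector
      .mode("overwrite")
      .save())
```

In production, a scheduler such as Cloud Composer/Airflow would run this per partition date rather than overwriting the whole table.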

Posted 6 days ago

Apply

7.0 - 9.0 years

8 - 15 Lacs

Hyderabad

Hybrid


Role & Responsibilities

Role Overview:
We are seeking a talented and forward-thinking Data Engineer for one of the large financial services GCCs based in Hyderabad. Responsibilities include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.

Technical Requirements:
- Proficiency in ETL, batch, and streaming processing (see the sketch after this listing).
- Experience with BigQuery, Cloud Storage, and CloudSQL.
- Strong programming skills in Python, SQL, and Apache Beam for data processing.
- Understanding of data modeling and schema design for analytics.
- Knowledge of data governance, security, and compliance in GCP.
- Familiarity with machine learning workflows and integration with GCP ML tools.
- Ability to optimize performance within data pipelines.

Functional Requirements:
- Ability to collaborate with Data Operations, Software Engineers, Data Scientists, and business SMEs to develop data product features.
- Experience leading and mentoring peers within an existing development team.
- Strong communication skills to craft and communicate robust solutions.
- Proficient in working with Engineering Leads, Enterprise and Data Architects, and Business Architects to build appropriate data foundations.
- Willingness to work on contemporary data architecture in public and private cloud environments.

This role offers a compelling opportunity for a seasoned Data Engineer to drive transformative cloud initiatives within the financial sector, leveraging deep experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification:
- Engineering graduate / postgraduate

Criteria:
- Proficient in ETL, Python, and Apache Beam for data processing efficiency.
- Demonstrated expertise in BigQuery, Cloud Storage, and CloudSQL utilization.
- Strong collaboration skills with cross-functional teams for data product development.
- Comprehensive knowledge of data governance, security, and compliance in GCP.
- Experienced in optimizing performance within data pipelines for efficiency.
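To make the ETL requirement concrete, here is a hedged Apache Beam batch sketch that reads CSV lines from Cloud Storage and loads them into BigQuery; the bucket, table, and schema are hypothetical placeholders:

```python
# Hedged Apache Beam batch sketch: GCS CSV -> parsed dicts -> BigQuery.
# Assumes apache-beam[gcp] is installed; names are hypothetical placeholders.
import apache_beam as beam

def parse_line(line: str) -> dict:
    account_id, amount = line.split(",")
    return {"account_id": account_id, "amount": float(amount)}

with beam.Pipeline() as p:  # add GCP pipeline options to run this on Dataflow
    (
        p
        | beam.io.ReadFromText("gs://my-bucket/input/transactions.csv")
        | beam.Map(parse_line)
        | beam.io.WriteToBigQuery(
            "my_project:finance.transactions",
            schema="account_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

The same pipeline shape handles streaming by swapping the text source for `ReadFromPubSub` and adding windowing, which is why Beam suits the batch-plus-streaming brief here.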

Posted 6 days ago

Apply

3.0 - 7.0 years

14 - 18 Lacs

Bengaluru

Work from Office


Req ID: 284544
We are currently seeking a Systems Integration Advisor - Technical Architecture - Cloud Services - AWS to join our team in Bangalore, Karnataka (IN-KA), India (IN).

We are seeking a highly skilled and motivated mid-level AI/DS specialist to join our dynamic team at NTT DATA. The ideal candidate will have a strong background in artificial intelligence and data science, with expertise in natural language processing (NLP), generative AI (Gen-AI), and conversational AI. The candidate should be well-versed in the Microsoft AI platform, OpenAI, Databricks, Python, and common data science libraries and tools. Additionally, the candidate should be capable of fine-tuning large language models (LLMs) and familiar with AI/ML engineering and prompt engineering.

Key Responsibilities:
- Develop and implement AI/DS solutions to enhance business processes and customer experiences.
- Utilize NLP, Gen-AI, and conversational AI techniques to build and optimize AI models (a small sketch follows this listing).
- Work with the Microsoft AI platform, OpenAI, and Databricks to develop and deploy AI solutions.
- Write efficient and scalable code in Python, leveraging common data science libraries and tools.
- Fine-tune LLMs to meet specific project requirements.
- Collaborate with cross-functional teams to integrate AI/DS solutions into existing systems.
- Stay updated with the latest advancements in AI/DS and apply them to ongoing projects.
- Conduct prompt engineering to improve the performance and accuracy of AI models.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, AI, or a related field.
- Proven experience in AI/DS, with a focus on NLP, Gen-AI, and conversational AI.
- Proficiency in the Microsoft AI platform, OpenAI, Databricks, and Python.
- Strong knowledge of common data science libraries and tools.
- Experience in fine-tuning LLMs.
- Familiarity with AI/ML engineering and prompt engineering.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
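As a small, hedged illustration of the prompt-engineering side of this role, here is a Hugging Face `pipeline` call wrapped in a simple prompt template; the model choice is illustrative only, and real work would use a fine-tuned or hosted LLM:

```python
# Tiny sketch: prompt templating + Hugging Face text generation.
# The gpt2 model is illustrative only; swap in a fine-tuned or hosted LLM for real use.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

def answer(question: str, context: str) -> str:
    # Rudimentary prompt engineering: constrain the model with structured context.
    prompt = f"Context: {context}\nQuestion: {question}\nAnswer:"
    out = generator(prompt, max_new_tokens=40, do_sample=False)
    return out[0]["generated_text"]

print(answer("What does the policy cover?", "The policy covers water damage only."))
```

The template is the part that carries over to larger models: the same context-question-answer framing drives RAG-style applications regardless of which LLM sits behind it.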

Posted 6 days ago

Apply

4.0 - 9.0 years

8 - 12 Lacs

Noida

Work from Office


Req ID: 327316
We are currently seeking a GCP & GKE - Sr Cloud Engineer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Job Title / Role: GCP & GKE - Sr Cloud Engineer
Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 4+ yrs
Total Experience: 4+ Years

Mandatory Skills:

Technical Qualification / Knowledge:
- Expertise in assessing, designing, and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification.
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning, and migration execution.
- Prior experience using industry-leading or native discovery, assessment, and migration tools.
- Good knowledge of cloud technology, different patterns, deployment methods, and application compatibility.
- Good knowledge of GCP technologies and associated components and variations, including the Anthos Application Platform.
- Working knowledge of GCE, GAE, GKE, and GCS.
- Hands-on experience creating and provisioning compute instances using the GCP console, Terraform, and the Google Cloud SDK (see the sketch after this listing).
- Creating databases in GCP and in VMs.
- Knowledge of the data analysis tool BigQuery.
- Knowledge of cost analysis and cost optimization.
- Knowledge of Git and GitHub.
- Knowledge of Terraform and Jenkins.
- Monitoring VMs and applications using Stackdriver.
- Working knowledge of VPN and Interconnect setup.
- Hands-on experience setting up HA environments.
- Hands-on experience creating VM instances in Google Cloud Platform.
- Hands-on experience with Cloud Storage and storage retention policies.
- Managing users in Google IAM and granting them appropriate permissions.

GKE:
- Install tools: set up Kubernetes tools.
- Administer a cluster.
- Configure Pods and containers: perform common configuration tasks for both.
- Monitoring, logging, and debugging.
- Inject data into applications: specify configuration and other data for the Pods that run your workload.
- Run applications: run and manage both stateless and stateful applications.
- Run Jobs using parallel processing.
- Access applications in a cluster.
- Extend Kubernetes: understand advanced ways to adapt your Kubernetes cluster to the needs of your work environment.
- Manage cluster daemons: perform common tasks for managing a DaemonSet, such as performing a rolling update.
- Extend kubectl by creating and installing kubectl plugins.
- Manage HugePages: configure and manage huge pages as a schedulable resource in a cluster.
- Schedule GPUs: configure and schedule GPUs for use as a resource by nodes in a cluster.

Certification: GCP Engineer & GKE
Academic Qualification: B.Tech or equivalent, or MCA

Process / Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired.
- Knowledge of quality processes.
- Knowledge of security processes.

Soft Skills:
- Good communication skills and the ability to work directly with global customers.
- Timely and accurate communication.
- Demonstrates ownership of technical issues and engages the right stakeholders for timely resolution.
- Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation.
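Since the listing emphasizes provisioning compute instances programmatically, here is a hedged sketch using the google-cloud-compute Python client; the project, zone, machine type, and image are placeholders, and Terraform or the gcloud CLI would be the more usual routes in practice:

```python
# Hedged sketch: create a GCE VM with the google-cloud-compute client.
# Project, zone, machine type, and image below are placeholders.
from google.cloud import compute_v1

def create_vm(project: str, zone: str, name: str) -> None:
    instance = compute_v1.Instance(
        name=name,
        machine_type=f"zones/{zone}/machineTypes/e2-small",
        disks=[compute_v1.AttachedDisk(
            boot=True,
            auto_delete=True,
            initialize_params=compute_v1.AttachedDiskInitializeParams(
                source_image="projects/debian-cloud/global/images/family/debian-12"
            ),
        )],
        network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
    )
    op = compute_v1.InstancesClient().insert(
        project=project, zone=zone, instance_resource=instance
    )
    op.result()  # block until the create operation finishes

create_vm("my-project", "asia-south1-a", "demo-vm")
```

The equivalent Terraform resource (`google_compute_instance`) declares the same fields; the API client is useful when provisioning must happen inside application code or a pipeline step.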

Posted 6 days ago

Apply

4.0 - 9.0 years

8 - 12 Lacs

Bengaluru

Work from Office


Req ID: 327258
We are currently seeking a GCP & GKE - Sr Cloud Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Title / Role: GCP & GKE - Sr Cloud Engineer
Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 4+ yrs
Total Experience: 4+ Years

Mandatory Skills:

Technical Qualification / Knowledge:
- Expertise in assessing, designing, and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification.
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning, and migration execution.
- Prior experience using industry-leading or native discovery, assessment, and migration tools.
- Good knowledge of cloud technology, different patterns, deployment methods, and application compatibility.
- Good knowledge of GCP technologies and associated components and variations, including the Anthos Application Platform.
- Working knowledge of GCE, GAE, GKE, and GCS.
- Hands-on experience creating and provisioning compute instances using the GCP console, Terraform, and the Google Cloud SDK.
- Creating databases in GCP and in VMs.
- Knowledge of the data analysis tool BigQuery.
- Knowledge of cost analysis and cost optimization.
- Knowledge of Git and GitHub.
- Knowledge of Terraform and Jenkins.
- Monitoring VMs and applications using Stackdriver.
- Working knowledge of VPN and Interconnect setup.
- Hands-on experience setting up HA environments.
- Hands-on experience creating VM instances in Google Cloud Platform.
- Hands-on experience with Cloud Storage and storage retention policies.
- Managing users in Google IAM and granting them appropriate permissions.

GKE:
- Install tools: set up Kubernetes tools.
- Administer a cluster.
- Configure Pods and containers: perform common configuration tasks for both.
- Monitoring, logging, and debugging.
- Inject data into applications: specify configuration and other data for the Pods that run your workload.
- Run applications: run and manage both stateless and stateful applications.
- Run Jobs using parallel processing.
- Access applications in a cluster.
- Extend Kubernetes: understand advanced ways to adapt your Kubernetes cluster to the needs of your work environment.
- Manage cluster daemons: perform common tasks for managing a DaemonSet, such as performing a rolling update.
- Extend kubectl by creating and installing kubectl plugins.
- Manage HugePages: configure and manage huge pages as a schedulable resource in a cluster.
- Schedule GPUs: configure and schedule GPUs for use as a resource by nodes in a cluster.

Certification: GCP Engineer & GKE
Academic Qualification: B.Tech or equivalent, or MCA

Process / Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired.
- Knowledge of quality processes.
- Knowledge of security processes.

Soft Skills:
- Good communication skills and the ability to work directly with global customers.
- Timely and accurate communication.
- Demonstrates ownership of technical issues and engages the right stakeholders for timely resolution.
- Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation.

Posted 6 days ago

Apply

4.0 - 9.0 years

8 - 12 Lacs

Bengaluru

Work from Office


Req ID: 327315
We are currently seeking a GCP & GKE - Sr Cloud Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Title / Role: GCP & GKE - Sr Cloud Engineer
Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 4+ yrs
Total Experience: 4+ Years

Mandatory Skills:

Technical Qualification / Knowledge:
- Expertise in assessing, designing, and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification.
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning, and migration execution.
- Prior experience using industry-leading or native discovery, assessment, and migration tools.
- Good knowledge of cloud technology, different patterns, deployment methods, and application compatibility.
- Good knowledge of GCP technologies and associated components and variations, including the Anthos Application Platform.
- Working knowledge of GCE, GAE, GKE, and GCS.
- Hands-on experience creating and provisioning compute instances using the GCP console, Terraform, and the Google Cloud SDK.
- Creating databases in GCP and in VMs.
- Knowledge of the data analysis tool BigQuery.
- Knowledge of cost analysis and cost optimization.
- Knowledge of Git and GitHub.
- Knowledge of Terraform and Jenkins.
- Monitoring VMs and applications using Stackdriver.
- Working knowledge of VPN and Interconnect setup.
- Hands-on experience setting up HA environments.
- Hands-on experience creating VM instances in Google Cloud Platform.
- Hands-on experience with Cloud Storage and storage retention policies.
- Managing users in Google IAM and granting them appropriate permissions.

GKE:
- Install tools: set up Kubernetes tools.
- Administer a cluster.
- Configure Pods and containers: perform common configuration tasks for both.
- Monitoring, logging, and debugging.
- Inject data into applications: specify configuration and other data for the Pods that run your workload.
- Run applications: run and manage both stateless and stateful applications.
- Run Jobs using parallel processing.
- Access applications in a cluster.
- Extend Kubernetes: understand advanced ways to adapt your Kubernetes cluster to the needs of your work environment.
- Manage cluster daemons: perform common tasks for managing a DaemonSet, such as performing a rolling update.
- Extend kubectl by creating and installing kubectl plugins.
- Manage HugePages: configure and manage huge pages as a schedulable resource in a cluster.
- Schedule GPUs: configure and schedule GPUs for use as a resource by nodes in a cluster.

Certification: GCP Engineer & GKE
Academic Qualification: B.Tech or equivalent, or MCA

Process / Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired.
- Knowledge of quality processes.
- Knowledge of security processes.

Soft Skills:
- Good communication skills and the ability to work directly with global customers.
- Timely and accurate communication.
- Demonstrates ownership of technical issues and engages the right stakeholders for timely resolution.
- Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation.

Posted 6 days ago

Apply

4.0 - 9.0 years

8 - 12 Lacs

Chennai

Work from Office


Req ID: 327318
We are currently seeking a GCP & GKE - Sr Cloud Engineer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Job Title / Role: GCP & GKE - Sr Cloud Engineer
Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 4+ yrs
Total Experience: 4+ Years

Mandatory Skills:

Technical Qualification / Knowledge:
- Expertise in assessing, designing, and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification.
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning, and migration execution.
- Prior experience using industry-leading or native discovery, assessment, and migration tools.
- Good knowledge of cloud technology, different patterns, deployment methods, and application compatibility.
- Good knowledge of GCP technologies and associated components and variations, including the Anthos Application Platform.
- Working knowledge of GCE, GAE, GKE, and GCS.
- Hands-on experience creating and provisioning compute instances using the GCP console, Terraform, and the Google Cloud SDK.
- Creating databases in GCP and in VMs.
- Knowledge of the data analysis tool BigQuery.
- Knowledge of cost analysis and cost optimization.
- Knowledge of Git and GitHub.
- Knowledge of Terraform and Jenkins.
- Monitoring VMs and applications using Stackdriver.
- Working knowledge of VPN and Interconnect setup.
- Hands-on experience setting up HA environments.
- Hands-on experience creating VM instances in Google Cloud Platform.
- Hands-on experience with Cloud Storage and storage retention policies.
- Managing users in Google IAM and granting them appropriate permissions.

GKE:
- Install tools: set up Kubernetes tools.
- Administer a cluster.
- Configure Pods and containers: perform common configuration tasks for both.
- Monitoring, logging, and debugging.
- Inject data into applications: specify configuration and other data for the Pods that run your workload.
- Run applications: run and manage both stateless and stateful applications.
- Run Jobs using parallel processing.
- Access applications in a cluster.
- Extend Kubernetes: understand advanced ways to adapt your Kubernetes cluster to the needs of your work environment.
- Manage cluster daemons: perform common tasks for managing a DaemonSet, such as performing a rolling update.
- Extend kubectl by creating and installing kubectl plugins.
- Manage HugePages: configure and manage huge pages as a schedulable resource in a cluster.
- Schedule GPUs: configure and schedule GPUs for use as a resource by nodes in a cluster.

Certification: GCP Engineer & GKE
Academic Qualification: B.Tech or equivalent, or MCA

Process / Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired.
- Knowledge of quality processes.
- Knowledge of security processes.

Soft Skills:
- Good communication skills and the ability to work directly with global customers.
- Timely and accurate communication.
- Demonstrates ownership of technical issues and engages the right stakeholders for timely resolution.
- Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation.

Posted 6 days ago

Apply

4.0 - 9.0 years

8 - 12 Lacs

Hyderabad

Work from Office


Req ID: 327319
We are currently seeking a GCP & GKE - Sr Cloud Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Job Title / Role: GCP & GKE - Sr Cloud Engineer
Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 4+ yrs
Total Experience: 4+ Years

Mandatory Skills:

Technical Qualification/ Knowledge:
- Expertise in assessing, designing, and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification.
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning, and migration execution.
- Prior experience using industry-leading or native discovery, assessment, and migration tools.
- Good knowledge of cloud technology, different patterns, deployment methods, and application compatibility.
- Good knowledge of GCP technologies and associated components and variations, including the Anthos Application Platform.
- Working knowledge of GCE, GAE, GKE, and GCS.
- Hands-on experience creating and provisioning compute instances using the GCP console, Terraform, and the Google Cloud SDK.
- Creating databases in GCP and in VMs.
- Knowledge of the data analysis tool BigQuery.
- Knowledge of cost analysis and cost optimization.
- Knowledge of Git and GitHub.
- Knowledge of Terraform and Jenkins.
- Monitoring VMs and applications using Stackdriver.
- Working knowledge of VPN and Interconnect setup.
- Hands-on experience setting up HA environments.
- Hands-on experience creating VM instances in Google Cloud Platform.
- Hands-on experience with Cloud Storage and storage retention policies.
- Managing users in Google IAM and granting them appropriate permissions.

GKE:
- Install tools: set up Kubernetes tools.
- Administer a cluster.
- Configure Pods and containers: perform common configuration tasks for both.
- Monitoring, logging, and debugging.
- Inject data into applications: specify configuration and other data for the Pods that run your workload.
- Run applications: run and manage both stateless and stateful applications.
- Run Jobs using parallel processing.
- Access applications in a cluster.
- Extend Kubernetes: understand advanced ways to adapt your Kubernetes cluster to the needs of your work environment.
- Manage cluster daemons: perform common tasks for managing a DaemonSet, such as performing a rolling update.
- Extend kubectl by creating and installing kubectl plugins.
- Manage HugePages: configure and manage huge pages as a schedulable resource in a cluster.
- Schedule GPUs: configure and schedule GPUs for use as a resource by nodes in a cluster.

Certification: GCP Engineer & GKE
Academic Qualification: B.Tech or equivalent, or MCA

Process / Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired.
- Knowledge of quality processes.
- Knowledge of security processes.

Soft Skills:
- Good communication skills and the ability to work directly with global customers.
- Timely and accurate communication.
- Demonstrates ownership of technical issues and engages the right stakeholders for timely resolution.
- Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation.

Posted 6 days ago

Apply

1.0 - 6.0 years

5 - 9 Lacs

Noida, Chennai, Bengaluru

Work from Office


Req ID: 328283
We are currently seeking a Jr Cloud Engineer - GCP to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Title / Role: Jr Cloud Engineer - GCP
Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 1+ yrs
Total Experience: 1+ Years

Mandatory Skills:

Technical Qualification / Knowledge:
- Working knowledge of GCE, GAE, GKE, and GCS.
- Hands-on experience creating and provisioning compute instances using the GCP console, Terraform, and the Google Cloud SDK.
- Creating databases in GCP and in VMs.
- Knowledge of the data analysis tool BigQuery.
- Knowledge of cost analysis and cost optimization.
- Knowledge of Git and GitHub.
- Knowledge of Terraform and Jenkins.
- Monitoring VMs and applications using Stackdriver.
- Working knowledge of VPN and Interconnect setup.
- Hands-on experience setting up HA environments.
- Hands-on experience creating VM instances in Google Cloud Platform.
- Hands-on experience with Cloud Storage and storage retention policies (see the sketch after this listing).
- Managing users in Google IAM and granting them appropriate permissions.
- GKE: designing, implementing, managing, and deploying cloud-native applications in a Kubernetes environment.
- GKE: automation, troubleshooting issues, and mentoring team members.
- GKE: understanding the security, efficiency, and scalability of core services and capabilities.

Certification: GCP Engineer & GKE

Process / Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery.

Soft Skills:
- Good communication skills and the ability to work directly with global customers.
- Timely and accurate communication.
- Demonstrates ownership of technical issues and engages the right stakeholders for timely resolution.
- Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation.

Location: Bengaluru, Chennai, Noida, Pune
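One of the hands-on items above, Cloud Storage retention policies, can be scripted as in this hedged sketch with the google-cloud-storage client; the bucket name is a placeholder:

```python
# Hedged sketch: set and inspect a retention policy on a GCS bucket.
# Bucket name is a placeholder; requires appropriate IAM permissions.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-archive-bucket")

bucket.retention_period = 30 * 24 * 3600  # retain objects for 30 days (in seconds)
bucket.patch()                            # push the change to the bucket

print("retention (s):", bucket.retention_period)
print("policy locked:", bucket.retention_policy_locked)  # locked policies are irreversible
```

Locking the policy (a separate, deliberate API call) makes it permanent, which is why compliance setups check the `retention_policy_locked` flag before relying on it.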

Posted 6 days ago

Apply

4.0 - 9.0 years

13 - 17 Lacs

Bengaluru

Work from Office


Req ID: 327298
We are currently seeking a GCP Solution Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 4+ yrs
Total Experience: 4+ Years

Mandatory Skills:

Technical Qualification / Knowledge:
- Expertise in assessing, designing, and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification.
- Prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning, and migration execution.
- Prior experience using industry-leading or native discovery, assessment, and migration tools.
- Good knowledge of cloud technology, different patterns, deployment methods, and application compatibility.
- Good knowledge of GCP technologies and associated components and variations, including the Anthos Application Platform.
- Working knowledge of GCE, GAE, GKE, and GCS.
- Hands-on experience creating and provisioning compute instances using the GCP console, Terraform, and the Google Cloud SDK.
- Creating databases in GCP and in VMs.
- Knowledge of the data analysis tool BigQuery.
- Knowledge of cost analysis and cost optimization.
- Knowledge of Git and GitHub.
- Knowledge of Terraform and Jenkins.
- Monitoring VMs and applications using Stackdriver.
- Working knowledge of VPN and Interconnect setup.
- Hands-on experience setting up HA environments.
- Hands-on experience creating VM instances in Google Cloud Platform.
- Hands-on experience with Cloud Storage and storage retention policies.
- Managing users in Google IAM and granting them appropriate permissions.

GKE:
- Install tools: set up Kubernetes tools.
- Administer a cluster.
- Configure Pods and containers: perform common configuration tasks for both.
- Monitoring, logging, and debugging.
- Inject data into applications: specify configuration and other data for the Pods that run your workload.
- Run applications: run and manage both stateless and stateful applications.
- Run Jobs using parallel processing.
- Access applications in a cluster.
- Extend Kubernetes: understand advanced ways to adapt your Kubernetes cluster to the needs of your work environment.
- Manage cluster daemons: perform common tasks for managing a DaemonSet, such as performing a rolling update.
- Extend kubectl by creating and installing kubectl plugins.
- Manage HugePages: configure and manage huge pages as a schedulable resource in a cluster.
- Schedule GPUs: configure and schedule GPUs for use as a resource by nodes in a cluster.

Certification: GCP Engineer & GKE
Academic Qualification: B.Tech or equivalent, or MCA

Process / Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired.
- Knowledge of quality processes.
- Knowledge of security processes.

Soft Skills:
- Good communication skills and the ability to work directly with global customers.
- Timely and accurate communication.
- Demonstrates ownership of technical issues and engages the right stakeholders for timely resolution.
- Flexibility to learn and lead other technology areas, such as other public cloud technologies, private cloud, and automation.

Posted 6 days ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Chennai, Gurugram, Bengaluru

Work from Office


Req ID: 321996
We are currently seeking a GCP-Cloud Run Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

GCP - Cloud Run
Experience: 5+ years
Notice Period: Immediate to 30 days

We need a Senior Application Developer with Google Cloud Platform experience in BigQuery, SQL, and Cloud Run for a project involving the re-design and re-platforming of a legacy Revenue Allocation system. A sketch of this stack follows below.

Mandatory Skills: GCP BigQuery, SQL, Cloud Run
Desired Skills: Linux shell scripting is a huge plus; nice to have: Kafka, MQ Series, Oracle PL/SQL

Location: Bengaluru, Chennai, Gurugram, Hyderabad, Noida, Pune
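As a hedged sketch of the Cloud Run plus BigQuery stack this role names, here is a minimal Flask service of the kind Cloud Run could host; the `finance.allocations` table and the route are hypothetical:

```python
# Minimal Flask service deployable to Cloud Run; queries BigQuery per request.
# Cloud Run injects PORT; `finance.allocations` is a hypothetical table.
import os

from flask import Flask, jsonify
from google.cloud import bigquery

app = Flask(__name__)
bq = bigquery.Client()

@app.get("/allocations/<account_id>")
def allocations(account_id: str):
    job = bq.query(
        "SELECT period, amount FROM `finance.allocations` WHERE account_id = @id",
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("id", "STRING", account_id)]
        ),
    )
    return jsonify([dict(row.items()) for row in job.result()])

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```

Containerized (any image that listens on `$PORT`) and deployed with `gcloud run deploy`, a service like this scales to zero between requests, which suits intermittent allocation lookups.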

Posted 6 days ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Chennai

Work from Office


Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages:
   a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends.
   b. Develop record management processes and policies.
   c. Build and maintain relationships at all levels within the client base and understand their requirements.
   d. Provide sales data, proposals, data insights, and account reviews to the client base.
   e. Identify areas to increase efficiency and automation of processes.
   f. Set up and maintain automated data processes.
   g. Identify, evaluate, and implement external services and tools to support data validation and cleansing.
   h. Produce and track key performance indicators.
2. Analyze data sets and provide adequate information:
   a. Liaise with internal and external clients to fully understand data content.
   b. Design and carry out surveys and analyze survey data as per the customer requirement.
   c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools.
   d. Create data dashboards, graphs, and visualizations to showcase business performance and provide sector and competitor benchmarking.
   e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool.
   f. Develop predictive models and share insights with clients as per their requirements.

Deliver
No. | Performance Parameter | Measure
1 | Analyzes data sets and provides relevant information to the client | Number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy

Mandatory Skills: Google BigQuery.

Posted 6 days ago

Apply

7.0 - 12.0 years

35 - 40 Lacs

Pune

Work from Office


Greetings from Peoplefy Infosolutions!

We are hiring for one of our reputed MNC clients based in Pune. We are looking for candidates with 7+ years of experience in the skills below.

Primary skills:
- Understanding of AI/ML in data engineering
- Python data engineering
- Database: BigQuery or Snowflake

Interested candidates may share their CVs at chitralekha.so@peoplefy.com with the following details:
- Experience:
- CTC:
- Expected CTC:
- Notice Period:
- Location:

Posted 6 days ago

Apply

2.0 - 4.0 years

8 - 12 Lacs

Mumbai

Work from Office


The SAS to Databricks Migration Developer will be responsible for migrating existing SAS code, data processes, and workflows to the Databricks platform. This role requires expertise in both SAS and Databricks, with a focus on converting SAS logic into scalable PySpark and Python code (a sketch of such a conversion follows below). The developer will design, implement, and optimize data pipelines, ensuring seamless integration and functionality within the Databricks environment. Collaboration with various teams is essential to understand data requirements and deliver solutions that meet business needs.
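To illustrate the kind of conversion involved, here is a hedged sketch pairing a typical SAS DATA step (shown as a comment) with a PySpark equivalent; the dataset and column names are invented:

```python
# Hedged sketch of a SAS-to-PySpark conversion; dataset and column names are invented.
#
# SAS original:
#   data work.high_value;
#     set raw.claims;
#     where amount > 10000;
#     payout_ratio = paid / amount;
#   run;
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sas-migration").getOrCreate()

claims = spark.read.table("raw.claims")  # e.g., a Databricks catalog table
high_value = (
    claims
    .where(F.col("amount") > 10000)                               # SAS `where` clause
    .withColumn("payout_ratio", F.col("paid") / F.col("amount"))  # SAS derived column
)
high_value.write.mode("overwrite").saveAsTable("work.high_value")
```

The row-by-row semantics of a DATA step map onto column expressions over a distributed DataFrame, which is where the scalability gain (and most of the review effort) in these migrations comes from.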

Posted 6 days ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

New Delhi, Chennai, Bengaluru

Hybrid


Your day at NTT DATA

Senior GenAI Data Engineer

We are seeking an experienced Senior Data Engineer to join our team in delivering cutting-edge Generative AI (GenAI) solutions to clients. The successful candidate will be responsible for designing, developing, and deploying data pipelines and architectures that support the training, fine-tuning, and deployment of LLMs for various industries. This role requires strong technical expertise in data engineering, problem-solving skills, and the ability to work effectively with clients and internal teams.

What you'll be doing

Key Responsibilities:
- Pipeline Design: Design, develop, and manage data pipelines and architectures to support GenAI model training, fine-tuning, and deployment.
- Data Ingestion and Integration: Develop data ingestion frameworks to collect data from various sources, then transform and integrate it into a unified data platform for GenAI model training and deployment.
- GenAI Model Integration: Collaborate with data scientists to integrate GenAI models into production-ready applications, ensuring seamless model deployment, monitoring, and maintenance.
- Cloud Infrastructure Management: Design, implement, and manage cloud-based data infrastructure (e.g., AWS, GCP, Azure) to support large-scale GenAI workloads, ensuring cost-effectiveness, security, and compliance.
- Coding: Write scalable, readable, and maintainable code using object-oriented programming concepts in languages like Python, utilizing libraries such as Hugging Face Transformers, PyTorch, or TensorFlow.
- Performance Optimization: Optimize data pipelines, GenAI model performance, and infrastructure for scalability, efficiency, and cost-effectiveness.
- Data Security and Compliance: Ensure data security, privacy, and compliance with regulatory requirements (e.g., GDPR, HIPAA) across data pipelines and GenAI applications.
- Client Collaboration: Work with clients to understand their GenAI needs, design solutions, and deliver high-quality data engineering services.
- Innovation and R&D: Stay up to date with the latest GenAI trends, technologies, and innovations, applying research and development skills to improve data engineering services.
- Knowledge Sharing: Share knowledge, best practices, and expertise with team members, contributing to the growth and development of the team.

Requirements:
- Bachelor's degree in computer science, engineering, or related fields (Master's recommended)
- 5+ years of experience in data engineering, with a strong emphasis on cloud environments (AWS, GCP, Azure, or cloud-native platforms)
- Experience with vector databases (e.g., Pinecone, Weaviate, Faiss, Annoy) for efficient similarity search and storage of dense vectors in GenAI applications
- Proficiency in programming languages such as SQL, Python, and PySpark
- Strong data architecture, data modeling, and data governance skills
- Experience with big data platforms (Hadoop, Databricks, Hive, Kafka, Apache Iceberg), data warehouses (Teradata, Snowflake, BigQuery), and lakehouses (Delta Lake, Apache Hudi)
- Knowledge of DevOps practices, including Git workflows and CI/CD pipelines (Azure DevOps, Jenkins, GitHub Actions)
- Experience with GenAI frameworks and tools (e.g., TensorFlow, PyTorch, Keras)

Nice to have:
- Experience with containerization and orchestration tools like Docker and Kubernetes
- Experience integrating vector databases and implementing similarity search techniques, with a focus on GraphRAG
- Familiarity with API gateway and service mesh architectures
- Experience with low-latency/streaming, batch, and micro-batch processing
- Familiarity with Linux-based operating systems and REST APIs

Location: Delhi or Bangalore
Workplace type: Hybrid Working
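As a hedged illustration of the vector-database skill named above, here is a small Faiss sketch for dense-vector similarity search; the dimension and the data are synthetic stand-ins for real embeddings, not a production setup.

import faiss
import numpy as np

dim = 384  # embedding size of a typical small sentence encoder
rng = np.random.default_rng(0)
corpus = rng.random((1000, dim), dtype="float32")   # stand-in document embeddings
queries = rng.random((5, dim), dtype="float32")     # stand-in query embeddings

index = faiss.IndexFlatL2(dim)  # exact L2 search; IVF/HNSW variants scale further
index.add(corpus)

distances, ids = index.search(queries, 3)  # top-3 nearest corpus vectors per query
print(ids)  # row i lists the corpus ids closest to query i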

Posted 6 days ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

New Delhi, Chennai, Bengaluru

Hybrid


Your day at NTT DATA

We are seeking an experienced Data Engineer to join our team in delivering cutting-edge Generative AI (GenAI) solutions to clients. The successful candidate will be responsible for designing, developing, and deploying data pipelines and architectures that support the training, fine-tuning, and deployment of LLMs for various industries. This role requires strong technical expertise in data engineering, problem-solving skills, and the ability to work effectively with clients and internal teams.

What you'll be doing

Key Responsibilities:
- Pipeline Design: Design, develop, and manage data pipelines and architectures to support GenAI model training, fine-tuning, and deployment.
- Data Ingestion and Integration: Develop data ingestion frameworks to collect data from various sources, then transform and integrate it into a unified data platform for GenAI model training and deployment.
- GenAI Model Integration: Collaborate with data scientists to integrate GenAI models into production-ready applications, ensuring seamless model deployment, monitoring, and maintenance.
- Cloud Infrastructure Management: Design, implement, and manage cloud-based data infrastructure (e.g., AWS, GCP, Azure) to support large-scale GenAI workloads, ensuring cost-effectiveness, security, and compliance.
- Coding: Write scalable, readable, and maintainable code using object-oriented programming concepts in languages like Python, utilizing libraries such as Hugging Face Transformers, PyTorch, or TensorFlow.
- Performance Optimization: Optimize data pipelines, GenAI model performance, and infrastructure for scalability, efficiency, and cost-effectiveness.
- Data Security and Compliance: Ensure data security, privacy, and compliance with regulatory requirements (e.g., GDPR, HIPAA) across data pipelines and GenAI applications.
- Client Collaboration: Work with clients to understand their GenAI needs, design solutions, and deliver high-quality data engineering services.
- Innovation and R&D: Stay up to date with the latest GenAI trends, technologies, and innovations, applying research and development skills to improve data engineering services.
- Knowledge Sharing: Share knowledge, best practices, and expertise with team members, contributing to the growth and development of the team.

Requirements:
- Bachelor's degree in computer science, engineering, or related fields (Master's recommended)
- 5+ years of experience in data engineering, with a strong emphasis on cloud environments (AWS, GCP, Azure, or cloud-native platforms)
- Experience with vector databases (e.g., Pinecone, Weaviate, Faiss, Annoy) for efficient similarity search and storage of dense vectors in GenAI applications
- Proficiency in programming languages such as SQL, Python, and PySpark
- Strong data architecture, data modeling, and data governance skills
- Experience with big data platforms (Hadoop, Databricks, Hive, Kafka, Apache Iceberg), data warehouses (Teradata, Snowflake, BigQuery), and lakehouses (Delta Lake, Apache Hudi)
- Knowledge of DevOps practices, including Git workflows and CI/CD pipelines (Azure DevOps, Jenkins, GitHub Actions)
- Experience with GenAI frameworks and tools (e.g., TensorFlow, PyTorch, Keras)

Nice to have:
- Experience with containerization and orchestration tools like Docker and Kubernetes
- Experience integrating vector databases and implementing similarity search techniques, with a focus on GraphRAG
- Familiarity with API gateway and service mesh architectures
- Experience with low-latency/streaming, batch, and micro-batch processing
- Familiarity with Linux-based operating systems and REST APIs
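As one hedged example of the ingestion-and-integration responsibility above, here is a minimal PySpark sketch that lands raw JSON events in a Delta table; the bucket path, column, and table names are placeholders invented for the illustration.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingestion_sketch").getOrCreate()

# Ingest raw JSON events from object storage (placeholder path)
raw = spark.read.json("gs://example-bucket/events/2024/")

# Light standardisation before downstream GenAI training consumes the data
cleaned = (
    raw
    .dropDuplicates(["event_id"])
    .withColumn("ingested_at", F.current_timestamp())
)

# Land the unified dataset in a Delta table for downstream consumers
cleaned.write.format("delta").mode("append").saveAsTable("platform.events")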

Posted 6 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid


Role: GCP Data Engineer
Experience: 4+ years
Preferred: data engineering background
Location: Bangalore, Chennai, Pune, Gurgaon, Kolkata
Required skills: GCP data engineering experience, BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python-based data ingestion, Dataflow + Pub/Sub

Job Requirements:
- Have implemented and architected solutions on Google Cloud Platform using GCP components
- Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines
- Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning
- Experience programming in Java, Python, etc.
- Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases
- Google Professional Data Engineer / Solution Architect certification is a major advantage

Skills Required:
- 3-13 years of experience in IT, or professional services experience in IT delivery or large-scale IT analytics projects
- 3+ years of expert knowledge of Google Cloud Platform; other cloud platforms are nice to have
- Expert knowledge in SQL development
- Expertise in building data integration and preparation tools using cloud technologies (e.g., SnapLogic, Google Dataflow, Cloud Dataprep, Python)
- Ability to identify downstream implications of data loads/migration (e.g., data quality, regulatory)
- Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and provide best practices for pipeline operations
- Capability to work in a rapidly changing business environment and to enable simplified user access to massive data by building scalable data solutions
- Advanced SQL writing and experience in data mining (SQL, ETL, data warehousing) and using databases in a business environment with complex datasets
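To make the Dataflow + Pub/Sub line concrete, here is a stripped-down Apache Beam streaming sketch (Pub/Sub in, BigQuery out); the topic, table, and schema are placeholders, and real pipelines would add windowing and dead-letter handling.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Submit with --runner=DataflowRunner to execute on Google Cloud Dataflow
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/example-project/topics/events")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="event_id:STRING,amount:FLOAT,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )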

Posted 6 days ago

Apply

6.0 - 8.0 years

17 - 18 Lacs

Hyderabad, Pune

Work from Office


Role & responsibilities:
- Develop, implement, and optimize ETL/ELT pipelines for processing large datasets efficiently.
- Work extensively with BigQuery for data processing, querying, and optimization.
- Utilize Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub for data ingestion, storage, and event-driven processing.
- Perform performance tuning and testing of the ELT platform to ensure high efficiency and scalability.
- Debug technical issues, perform root cause analysis, and provide solutions for production incidents.
- Ensure data quality, accuracy, and integrity across data pipelines.
- Collaborate with cross-functional teams to define technical requirements and deliver solutions.
- Work independently on assigned tasks while maintaining high levels of productivity and efficiency.

Skills Required:
- Proficiency in SQL and PL/SQL for querying and manipulating data.
- Experience in Python for data processing and automation.
- Hands-on experience with Google Cloud Platform (GCP), particularly BigQuery (must-have), Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub.
- Experience with GitHub and CI/CD pipelines for automation and deployment.
- Performance tuning and performance testing of ELT processes.
- Strong analytical and debugging skills to resolve data and pipeline issues efficiently.
- Self-motivated and able to work independently as an individual contributor.
- Good understanding of data modeling, database design, and data warehousing concepts.
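As an illustration of the BigQuery-centred ELT work described above, here is a minimal google-cloud-bigquery sketch that runs a transform inside BigQuery and materialises the result; the project, dataset, and table names are invented for the example.

from google.cloud import bigquery

client = bigquery.Client()  # project and credentials come from the environment

sql = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `example-project.raw.transactions`
    GROUP BY customer_id
"""

# Write the query result to a destination table (ELT: transform in-warehouse)
job_config = bigquery.QueryJobConfig(
    destination="example-project.marts.customer_spend",
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

job = client.query(sql, job_config=job_config)
job.result()  # block until the transform finishes
print(f"Wrote {job.destination} in job {job.job_id}")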

Posted 6 days ago

Apply

6.0 - 8.0 years

8 - 16 Lacs

Hyderabad, Pune, Mumbai (All Areas)

Work from Office


Role & responsibilities:
- Develop, implement, and optimize ETL/ELT pipelines for processing large datasets efficiently.
- Work extensively with BigQuery for data processing, querying, and optimization.
- Utilize Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub for data ingestion, storage, and event-driven processing.
- Perform performance tuning and testing of the ELT platform to ensure high efficiency and scalability.
- Debug technical issues, perform root cause analysis, and provide solutions for production incidents.
- Ensure data quality, accuracy, and integrity across data pipelines.
- Collaborate with cross-functional teams to define technical requirements and deliver solutions.
- Work independently on assigned tasks while maintaining high levels of productivity and efficiency.

Skills Required:
- Proficiency in SQL and PL/SQL for querying and manipulating data.
- Experience in Python for data processing and automation.
- Hands-on experience with Google Cloud Platform (GCP), particularly BigQuery (must-have), Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub.
- Experience with GitHub and CI/CD pipelines for automation and deployment.
- Performance tuning and performance testing of ELT processes.
- Strong analytical and debugging skills to resolve data and pipeline issues efficiently.
- Self-motivated and able to work independently as an individual contributor.
- Good understanding of data modeling, database design, and data warehousing concepts.
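For the event-driven ingestion piece of this role, here is a small google-cloud-pubsub publishing sketch; the project and topic ids are placeholders invented for the illustration.

import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "ingest-events")

# Pub/Sub payloads are bytes; extra keyword args become string attributes
payload = json.dumps({"event_id": "abc123", "amount": 42.5}).encode("utf-8")
future = publisher.publish(topic_path, payload, source="batch-loader")

print(f"Published message id {future.result()}")  # result() blocks until acked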

Posted 6 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies