15.0 - 20.0 years
4 - 8 Lacs
mumbai
Work from Office
Project Role: Data Engineer. Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Google BigQuery. Good to have skills: Microsoft SQL Server, Google Cloud Data Services. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, e...
Posted 1 month ago
2.0 - 6.0 years
7 - 11 Lacs
bengaluru
Work from Office
As an entry level Application Developer at IBM, you'll work with clients to co-create solutions to major real-world challenges by using best practice technologies, tools, techniques, and products to translate system requirements into the design and development of customized systems. In your role, you may be responsible for: Working across the entire system architecture to design, develop, and support high quality, scalable products and interfaces for our clients Col...
Posted 1 month ago
3.0 - 8.0 years
6 - 10 Lacs
chennai
Work from Office
Overview: GCP, BigQuery, Composer, Python, PySpark, Dataproc, Airflow, SQL. Roles and Responsibility: Design and implement data pipelines using Google Cloud Platform services. Develop and maintain large-scale data architectures and systems. Collaborate with cross-functional teams to identify and prioritize project requirements. Ensure data quality and integrity by implementing robust testing and validation procedures. Optimize data processing workflows for improved performance and efficiency. Troubleshoot and resolve complex technical issues related to data engineering. Job Requirements: Strong understanding of Google Cloud Platform services and technologies. Experience with data modeling, desi...
Posted 1 month ago
6.0 - 10.0 years
5 - 15 Lacs
chennai
Hybrid
We are looking for seasoned professionals who thrive in platform engineering roles, especially those with hands-on experience in cloud environments. Ideal candidates are proactive, detail-oriented, and capable of supporting large-scale infrastructure projects. They should be comfortable working in hybrid setups, contributing to both the engineering and operational aspects of cloud platforms. CHENNAI CANDIDATES ONLY! Location: DLF Downtown Chennai, Jawaharlal Nehru Salai, next to American International School, Tharamani, Chennai, Tamil Nadu 600113. Years of Experience: 6+ years overall (mandatory), with 3 years of relevant GCP experience. Notice Period: Immediate to 30 days, serving or official. Key Responsibilities: Suppo...
Posted 1 month ago
2.0 - 7.0 years
9 - 13 Lacs
chennai
Work from Office
Overview: GCP, BigQuery, Composer, Python, PySpark, Dataproc, Airflow, SQL. Key Responsibilities: Design, develop, and maintain robust and scalable data pipelines using GCP services such as Dataflow, Dataproc, BigQuery, Pub/Sub, and Cloud Storage. Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver efficient solutions. Implement ETL/ELT workflows, data ingestion, transformation, and validation processes. Optimize performance and cost-efficiency of data processing jobs and cloud resources. Troubleshoot and resolve data pipeline issues and production incidents. Ensure data quality, governance, and compliance standards are met. Automate data w...
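For illustration, a minimal sketch of the kind of GCS-to-BigQuery batch ingestion step such a pipeline might include, using the google-cloud-bigquery client; the project, bucket, and table names here are hypothetical placeholders, not part of the listing:

```python
# Minimal GCS -> BigQuery batch load sketch (hypothetical project/bucket/table names).
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumes application-default credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                                   # infer schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/orders_2024.csv",         # hypothetical source file
    "example-project.analytics.orders",                # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # wait for the load job to complete

table = client.get_table("example-project.analytics.orders")
print(f"Loaded {table.num_rows} rows into {table.full_table_id}")
```

A production pipeline of the kind described above would typically wrap this load step with schema management, validation, and error handling.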
Posted 1 month ago
4.0 - 9.0 years
15 - 30 Lacs
bengaluru
Work from Office
Overview: We're looking for a motivated Senior Data Engineer to manage the transformation of data pipelines from Hadoop to BigQuery. You should have strong experience with Google Cloud Platform (GCP), BigQuery, and Dataproc, as well as proficiency in Linux/Unix environments and experience with Spark, Scala or Python, and shell scripting. Responsibilities / Mandatory Skills: Experience in pipeline migration from Hadoop to BigQuery. Proficiency with GCP, BigQuery, and Dataproc. Strong skills in Linux/Unix environments. Knowledge of Spark, Scala or Python. Advanced experience with SQL (any query language) and shell scripting. Strong analytical skills. Fintech and data migration experience is an added advantage. Requi...
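As a hedged illustration of the Hadoop-to-BigQuery migration work described above, the sketch below copies a Hive table into BigQuery from a Dataproc cluster using the spark-bigquery connector; the table, dataset, and bucket names are hypothetical:

```python
# PySpark sketch: copy a Hive table into BigQuery via the spark-bigquery connector
# (typically available on Dataproc clusters). All names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-to-bigquery-migration")
    .enableHiveSupport()      # read from the existing Hive metastore
    .getOrCreate()
)

# Read the legacy Hive table.
df = spark.table("legacy_warehouse.orders")

# Light transformation step, e.g. dropping a deprecated column.
df_clean = df.drop("obsolete_flag")

# Write to BigQuery; the connector stages data through a temporary GCS bucket.
(
    df_clean.write.format("bigquery")
    .option("table", "example-project.analytics.orders")
    .option("temporaryGcsBucket", "example-staging-bucket")
    .mode("overwrite")
    .save()
)
```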
Posted 1 month ago
2.0 - 6.0 years
7 - 11 Lacs
bengaluru
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world’s technology leader. Come to IBM and make a global impact! IBM’s Cloud Services are focused on supporting clients on their cloud journey across any platform to achieve their business goals. It en...
Posted 1 month ago
7.0 - 12.0 years
18 - 27 Lacs
chennai
Hybrid
Role & responsibilities: Strong 7+ years of experience in Big Data and GCP (BigQuery, Dataflow, Dataproc, Spanner). Good knowledge of and experience in SQL. Very good communication skills. Self-starter and learner. Willing to work from the office in hybrid mode. Preferred candidate profile
Posted 1 month ago
7.0 - 12.0 years
4 - 8 Lacs
pune
Work from Office
Should be capable of developing/configuring data pipelines in a variety of platforms and technologies. Possess the following technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools). Can demonstrate strong experience in writing complex SQL queries to perform data analysis on databases such as SQL Server, Oracle, Hive, etc. Experience with GCP (particularly Airflow, Dataproc, BigQuery) is an advantage. Have experience with creating solutions which power AI/ML models and generative AI. Ability to work independently on specialized assignments within the context of project deliverables. Take ownership of providing solutions and tools that iteratively increase engineerin...
Posted 1 month ago
6.0 - 9.0 years
6 - 10 Lacs
hyderabad, pune
Work from Office
Mandatory skill: ETL_GCP_Bigquery. Develop, implement, and optimize ETL/ELT pipelines for processing large datasets efficiently. Work extensively with BigQuery for data processing, querying, and optimization. Utilize Cloud Storage, Cloud Logging, Dataproc, and Pub/Sub for data ingestion, storage, and event-driven processing. Perform performance tuning and testing of the ELT platform to ensure high efficiency and scalability. Debug technical issues, perform root cause analysis, and provide solutions for production incidents. Ensure data quality, accuracy, and integrity across data pipelines. Collaborate with cross-functional teams to define technical requirements and deliver solutions. Work i...
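For illustration only, a minimal sketch of an event-driven ingestion step built on Pub/Sub, as mentioned in this listing; the project and subscription names are hypothetical, and the handler simply parses and acknowledges each message:

```python
# Minimal Pub/Sub subscriber sketch for event-driven ingestion.
# Project and subscription names are hypothetical placeholders.
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "example-project"
SUBSCRIPTION_ID = "raw-events-sub"

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Parse one event and acknowledge it; a real pipeline would also validate
    # the payload and write it onward (e.g. to Cloud Storage or BigQuery).
    event = json.loads(message.data.decode("utf-8"))
    print(f"received event id={event.get('id')}")
    message.ack()


streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
print(f"Listening for messages on {subscription_path} ...")

with subscriber:
    try:
        # Run for a bounded time in this sketch; a service would block indefinitely.
        streaming_pull_future.result(timeout=30)
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()
```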
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Backend Software Engineer at Wayfair, you will be joining the Fulfillment Optimization team, responsible for building platforms that optimize customer order fulfillment for Wayfair profitability and customer satisfaction. Your role will involve enhancing and scaling customer-facing platforms that provide fulfillment information on the website, from search pages to order delivery. You will play a crucial role in maintaining an accurate representation of the dynamic supply chain, predicting product packaging, warehouse flow, and surfacing necessary information for customers, suppliers, and carriers in milliseconds. **Key Responsibilities:** - Build and launch data pipelines and products f...
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer with over 10 years of experience, your role will involve designing and implementing scalable, reliable, and efficient data pipelines and architectures across various Google products and services. You will be responsible for developing and maintaining data models, schemas, and ontologies to support different data sources and use cases. Your expertise will be crucial in evaluating and recommending new data technologies and tools to enhance infrastructure and capabilities. Key Responsibilities: - Collaborate with product managers, engineers, and researchers to define data requirements and provide robust technical solutions. - Build and optimize batch and real-tim...
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer with 8+ years of experience, your role will involve designing and implementing scalable, reliable, and efficient data pipelines and architectures across Google products and services. You will be responsible for developing and maintaining data models, schemas, and ontologies to support various data sources and use cases. Additionally, you will evaluate and recommend new data technologies and tools to enhance infrastructure and capabilities. Key Responsibilities: - Collaborate with product managers, engineers, and researchers to define data requirements and deliver robust technical solutions. - Build and optimize batch and real-time data pipelines using Google C...
Posted 1 month ago
6.0 - 11.0 years
27 - 30 Lacs
chennai
Work from Office
Position Description: Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure, pipelines, etc., for collecting, storing, processing and analyzing large volumes of data efficiently and accurately. Key Responsibilities: 1) Collaborate with business and technology stakeholders to understand current and future data requirements 2) Design, build and maintain reliable, efficient and scalable data infrastructure for data collection, storage, transformation, and analysis 3) Plan, design, build and maintain scalable data solutions including data pipelines, data models, and applications for efficient and reliable data workflow 4...
Posted 1 month ago
4.0 - 9.0 years
5 - 15 Lacs
chennai
Hybrid
Role & responsibilities: GCP Data Engineer with BigQuery, Dataflow, Cloud Composer, SQL, Python, and PySpark, along with 3+ years of strong GCP experience; should have worked as a Module Lead / Tech Lead. Exp: 4 to 15 yrs. Locations: Pan India. Work Mode: Hybrid. Notice Period: Immediate to 30 days, or serving notice period.
Posted 1 month ago
4.0 - 9.0 years
5 - 15 Lacs
bengaluru
Hybrid
Role & responsibilities: GCP Data Engineer with BigQuery, Dataflow, Cloud Composer, SQL, Python, and PySpark, along with 3+ years of strong GCP experience; should have worked as a Module Lead / Tech Lead. Exp: 4 to 15 yrs. Locations: Pan India. Work Mode: Hybrid. Notice Period: Immediate to 30 days, or serving notice period.
Posted 1 month ago
4.0 - 9.0 years
5 - 15 Lacs
hyderabad
Hybrid
Role & responsibilities: GCP Data Engineer with BigQuery, Dataflow, Cloud Composer, SQL, Python, and PySpark, along with 3+ years of strong GCP experience; should have worked as a Module Lead / Tech Lead. Exp: 4 to 15 yrs. Locations: Pan India. Work Mode: Hybrid. Notice Period: Immediate to 30 days, or serving notice period.
Posted 1 month ago
6.0 - 11.0 years
9 - 13 Lacs
hyderabad, pune, bengaluru
Work from Office
Type: Contract. Description: Experienced GCP data engineer with BigQuery, SQL, Python, and Talend ETL programming on GCP or any cloud technology. Good experience in building pipelines of GCP components to load data into BigQuery and into Cloud Storage buckets. Excellent data analysis skills. Good written and oral communication skills. Self-motivated and able to work independently.
Posted 1 month ago
6.0 - 11.0 years
6 - 9 Lacs
hyderabad, pune
Work from Office
At least 8+ years of experience in any of the ETL tools: Prophecy, DataStage 11.5/11.7, Pentaho, etc. At least 3 years of experience in PySpark with GCP (Airflow, Dataproc, BigQuery), capable of configuring data pipelines. Strong experience in writing complex SQL queries to perform data analysis on databases such as SQL Server, Oracle, Hive, etc. Possess the following technical skills: SQL, Python, PySpark, Hive, ETL, Unix, Control-M (or similar scheduling tools). Ability to work independently on specialized assignments within the context of project deliverables. Take ownership of providing solutions and tools that iteratively increase engineering efficiencies. Design should help embed standard pr...
Posted 1 month ago
5.0 - 10.0 years
3 - 7 Lacs
chennai, gurugram, bengaluru
Work from Office
We are looking for a skilled professional with 5-10 years of experience to join our team as a GCP expert. The ideal candidate will have a strong background in BigQuery and GCP, in coding languages such as Java and Python, and in Hadoop. This position is available in Bangalore, Hyderabad, Chennai, Pune, Noida, and Gurgaon. Roles and Responsibility: Design and develop scalable and efficient data pipelines using BigQuery and GCP. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale data architectures using Cloud SQL and Apache Airflow. Ensure high-quality data processing and analysis by implementing robust testing and validation framework...
Posted 1 month ago
5.0 - 10.0 years
8 - 12 Lacs
chennai, gurugram, bengaluru
Work from Office
We are looking for a skilled professional with 5-10 years of experience to join our team in Bangalore, Hyderabad, Chennai, Pune, Noida, and Gurgaon. The ideal candidate will have expertise in the GCP Hadoop ecosystem and be able to handle batch data processing on the Hadoop ecosystem. Roles and Responsibility: Design and develop scalable data pipelines using the GCP Hadoop ecosystem. Process large datasets using Spark (Scala), Hive, and other relevant tools. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain technical documentation for data processing systems. Troubleshoot and resolve issues related to data processing and system performance. Op...
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
chennai, gurugram, bengaluru
Work from Office
We are looking for a skilled professional with 5-10 years of experience to join our team as a GCP Databases expert in Bangalore, Hyderabad, Chennai, Pune, Noida, and Gurgaon. The ideal candidate will have a strong background in GCP migration, database migration, and native GCP databases. Roles and Responsibility: Design and implement scalable and efficient GCP databases for large-scale applications. Develop and maintain database architectures using Cloud SQL, MySQL, PostgreSQL, and SQL Server. Collaborate with cross-functional teams to identify and prioritize database requirements. Ensure high availability and performance of GCP databases through monitoring and optimization techniques. Impleme...
Posted 1 month ago
4.0 - 9.0 years
10 - 17 Lacs
bangalore rural, bengaluru
Work from Office
Position: GCP Data Engineer. Location: Bangalore. Experience: 5-7 Years. Employment Type: Full-time. Are you a data enthusiast with a passion for building scalable data solutions on the cloud? Join our team as a GCP Data Engineer and help us design, develop, and operationalize cutting-edge data frameworks that power intelligent decision-making. Key Responsibilities: Design and develop data ingestion, real-time processing, and transformation frameworks using open-source tools. Work hands-on with technologies like Kafka, Apache Spark (SQL, Scala, Java), Python, Hadoop, Hive, and Airflow. Build and manage data pipelines using GCP services: Cloud Composer, BigQuery, and DataProc. Operatio...
Posted 1 month ago
5.0 - 9.0 years
3 - 7 Lacs
hyderabad, chennai, bengaluru
Work from Office
We are looking for a skilled professional with 5 to 9 years of experience in the field, based in Bangalore, Pune, Chennai, and Hyderabad. The ideal candidate should have a strong background in GCP, including BigQuery, Data Fusion, and Dataproc. Roles and Responsibility: Design and implement scalable data pipelines using GCP services such as BigQuery and Dataproc. Develop and maintain large-scale data warehouses and data lakes on GCP. Collaborate with cross-functional teams to identify and prioritize project requirements. Ensure data quality and integrity by implementing robust testing and validation procedures. Provide technical guidance and support to junior team members. Stay up-to-date...
Posted 1 month ago
4.0 - 7.0 years
3 - 7 Lacs
chennai
Work from Office
We are looking for a skilled professional with 4 to 7 years of experience in the field, located in Chennai. The ideal candidate should have expertise in Google BigQuery, SQL, Python, Apache Airflow, and Oracle-to-BigQuery DWH migration and modernization. Roles and Responsibility: Design and develop scalable data pipelines using Google BigQuery, Dataproc, and GCS. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale data warehouses using Oracle DB and PL/SQL. Implement data quality checks and validation processes to ensure data integrity. Optimize database performance and troubleshoot issues as needed. Work closely with stake...
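As a hedged sketch of the Airflow orchestration these roles describe, the DAG below schedules a single BigQuery transformation job; it assumes Airflow 2.4+ with the Google provider package installed, and all project, dataset, and table names are hypothetical:

```python
# Minimal Airflow DAG sketch: orchestrate one BigQuery transformation step.
# Assumes Airflow 2.4+ with apache-airflow-providers-google installed; names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="orders_daily_transform",
    start_date=datetime(2024, 1, 1),
    schedule="0 3 * * *",     # run daily at 03:00
    catchup=False,
) as dag:
    build_daily_summary = BigQueryInsertJobOperator(
        task_id="build_daily_summary",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example-project.analytics.orders_daily` AS
                    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
                    FROM `example-project.analytics.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )
```

In a fuller migration pipeline, an extraction task (e.g. landing Oracle exports in GCS) would precede a load step and this transformation task in the same DAG.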
Posted 1 month ago