5.0 - 10.0 years
19 - 22 Lacs
bengaluru
Work from Office
Job Summary: We are seeking an experienced GCP Data Engineer to join the Data Team at Equinix as a key member, leading end-to-end development of complex data engineering use cases and driving the evolution of Equinix's Data Lake platform. You will design and build enterprise-scale data infrastructure and analytics solutions on Google Cloud Platform while providing technical mentorship to the data engineering team. The ideal candidate combines deep technical expertise in cloud-native data technologies with proven leadership skills and a passion for building robust, scalable data platforms that drive strategic business insights. Responsibilities: Participate in design and implementation of enterpris...
Posted 1 hour ago
5.0 - 9.0 years
0 Lacs
delhi
On-site
As an Engineer with 5-8 years of experience, your role will involve working with competencies such as Apache Beam, GCP Cloud, and OpenShift. You should have hands-on experience in Java and Kafka Streaming; knowledge of GCP and Kubernetes is a plus. Your responsibilities will include developing and deploying data and analytics-led solutions on GCP, as well as designing highly available and scalable systems. You will be working on data engineering solutions using Cloud BigQuery, Cloud Dataflow, Cloud Bigtable, Cloud Storage, Cloud Spanner, and Cloud IAM. It's essential to have a good understanding of Apache Kafka and proficiency in cloud-based ETL/data orchestration tools like Apache Beam and ...
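To ground the stack above, here is a minimal, hedged sketch of a streaming Beam pipeline reading from Pub/Sub and writing to BigQuery. The listing emphasizes Java; the Beam Python SDK is used here purely for brevity, and the project, topic, table, and schema names are hypothetical placeholders.

```python
# Minimal Apache Beam streaming sketch: Pub/Sub -> BigQuery on Dataflow.
# Project, topic, table, and schema names are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        streaming=True,            # unbounded Pub/Sub source
        runner="DataflowRunner",   # or DirectRunner for local testing
        project="my-gcp-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-gcp-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.events",
                schema="event_id:STRING,payload:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```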
Posted 5 days ago
3.0 - 7.0 years
0 Lacs
punjab
On-site
You will be responsible for the following:
- Experience in GCP.
- Data migration experience from legacy systems, including SQL and Oracle.
- Experience building and designing data lake and data warehouse ETL pipelines on GCP.
- GCP data and analytics services such as Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, Cloud BigQuery, Cloud Fusion, Cloud Pub/Sub, Cloud Storage, and Cloud Functions.
- Using the cloud-native GCP CLI/gsutil along with scripting in Python and SQL (a minimal Python sketch follows this list).
- Experience with data governance, metadata management, and data masking & encryption using GCP tools such as Cloud Data Catalog and Cloud KMS.
No additional details about the company are mentioned in the job description.
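As a hedged illustration of the gsutil-style scripting mentioned above, the sketch below copies objects between buckets with the google-cloud-storage Python client; the bucket names, the prefix, and the copy_prefix helper are hypothetical.

```python
# Sketch of gsutil-style object management with the google-cloud-storage
# client; bucket and prefix names are hypothetical.
from google.cloud import storage


def copy_prefix(src_bucket_name: str, dst_bucket_name: str, prefix: str) -> None:
    """Copy every object under `prefix` from one bucket to another."""
    client = storage.Client()
    src_bucket = client.bucket(src_bucket_name)
    dst_bucket = client.bucket(dst_bucket_name)
    for blob in client.list_blobs(src_bucket_name, prefix=prefix):
        src_bucket.copy_blob(blob, dst_bucket, new_name=blob.name)
        print(f"copied gs://{src_bucket_name}/{blob.name}")


if __name__ == "__main__":
    copy_prefix("legacy-exports", "gcp-data-lake-raw", "oracle/daily/")
```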
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
Role Overview: As a PostgreSQL Database Developer with a minimum of 4 years of experience in database management, you will play a critical role in maintaining and optimizing database structures while ensuring exceptional customer experiences through client interactions. Your passion for technology and commitment to continuous learning will drive your success in this role.
Key Responsibilities:
- Proficiency in PL/SQL and PostgreSQL programming, with the ability to write complex SQL queries and stored procedures (a short sketch follows this list).
- Experience migrating database structure and data from Oracle to PostgreSQL, preferably on GCP AlloyDB or Cloud SQL.
- Familiarity with Cloud SQL/AlloyDB and tuning them for i...
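As a hedged sketch of the stored-procedure work described above, the snippet below calls a PostgreSQL procedure from Python with psycopg2. The connection settings, the refresh_daily_sales procedure, and the daily_sales table are hypothetical placeholders; a Cloud SQL instance would typically be reached through the Cloud SQL Auth Proxy.

```python
# Hedged sketch: calling a PostgreSQL stored procedure from Python with
# psycopg2, as one might after migrating Oracle PL/SQL packages to PL/pgSQL.
# Connection details, the procedure, and the table are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="127.0.0.1",        # e.g. the local Cloud SQL Auth Proxy endpoint
    dbname="reporting",
    user="app_user",
    password="change-me",
)
try:
    with conn, conn.cursor() as cur:
        # Procedures (PostgreSQL 11+) are invoked with CALL, functions with SELECT.
        cur.execute("CALL refresh_daily_sales(%s)", ("2024-01-31",))
        cur.execute("SELECT count(*) FROM daily_sales WHERE sales_date = %s", ("2024-01-31",))
        print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```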
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
kochi, kerala
On-site
As a Data Engineer at our company, you will be responsible for designing, developing, and optimizing ETL pipelines using PySpark on Google Cloud Platform (GCP). Your role will involve working with various GCP services such as BigQuery, Cloud Dataflow, Cloud Composer (Apache Airflow), and Cloud Storage for data transformation and orchestration.
Key Responsibilities:
- Develop and optimize Spark-based ETL processes for large-scale data processing (a PySpark sketch follows this list).
- Implement best practices for data governance, security, and monitoring in a cloud environment.
- Collaborate with data engineers, analysts, and business stakeholders to understand data requirements.
- Troubleshoot performance bottlenecks and optimize ...
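A minimal, hedged PySpark sketch of the kind of batch ETL step described above: read Parquet from Cloud Storage, aggregate, and write to BigQuery. The bucket, dataset, and column names are hypothetical, and the BigQuery write assumes the spark-bigquery connector is available on the Dataproc cluster.

```python
# Hedged PySpark sketch of a GCS -> transform -> BigQuery batch job.
# Bucket, dataset, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

# Read the raw partition for one day from the data lake.
orders = spark.read.parquet("gs://raw-zone/orders/dt=2024-01-31/")

# Aggregate completed orders per customer.
daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy("customer_id")
    .agg(F.sum("amount").alias("total_amount"),
         F.count("*").alias("order_count"))
)

# Write the result to BigQuery via the spark-bigquery connector.
(daily_totals.write
    .format("bigquery")
    .option("table", "analytics.daily_customer_totals")
    .option("temporaryGcsBucket", "etl-staging-bucket")
    .mode("overwrite")
    .save())
```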
Posted 2 weeks ago
4.0 - 6.0 years
12 - 16 Lacs
pune
Work from Office
Experience Level: 4-6 years. Note: please include a 3-minute video self-introduction.
Key Responsibilities:
- Design, build, and maintain scalable and reliable data pipelines using GCP services such as BigQuery, Cloud Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer (Airflow) (a minimal Composer DAG sketch follows this list).
- Implement data ingestion from diverse sources (e.g., APIs, databases, files) into the cloud data lake/warehouse.
- Optimize data processing workflows for performance, reliability, and cost efficiency.
- Develop and manage ETL/ELT processes using tools like Dataform, dbt, or custom scripts.
- Collaborate with data scientists, analysts, and stakeholders to deliver clean, well-documented datasets.
- Implement data quality, governance, and observabili...
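The sketch below is a hedged example of the Composer-orchestrated ingestion described above: a single-task Airflow DAG loading daily CSV files from Cloud Storage into BigQuery. The DAG id, bucket, object path, and destination table are hypothetical, and the parameter names assume an Airflow 2.4+ environment such as Cloud Composer 2.

```python
# Hedged Cloud Composer (Airflow) sketch: load a daily file from GCS into
# BigQuery. Bucket, object, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_orders_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",   # run at 02:00 daily
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders_to_bq",
        bucket="ingest-landing-zone",
        source_objects=["orders/{{ ds }}/*.csv"],   # templated by execution date
        destination_project_dataset_table="analytics.orders_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```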
Posted 2 weeks ago
11.0 - 17.0 years
25 - 40 Lacs
bengaluru
Remote
Dear Candidate, we have an opening for a Google Cloud Platform (GCP) Lead; please find the detailed description below. Company Details: Royal Cyber is a modernized e-business solutions provider specializing in software deployment. We are an IBM Premier Business Partner, an IBM Authorized Trainer, and a Microsoft Certified Gold Partner. Since its inception in 2002, our experts have been leaders in providing exceptional, award-winning services to organizations across different industry verticals all around the globe. Headquartered in Chicago, Illinois, we have a global footprint with offices and development centers across North America, Asia, Europe, Africa, and the Middle East. URL: www.royalcyber...
Posted 1 month ago
10.0 - 12.0 years
0 Lacs
mumbai, maharashtra, india
On-site
We are seeking a highly skilled and motivated GCP Data Architect to join our team. The Google Cloud Platform (GCP) Data Architect will be responsible for designing and implementing cloud-based solutions for enterprise-level clients using GCP. The role involves understanding clients' business requirements and translating them into technical solutions that meet their needs. The GCP Data Architect should have a strong understanding of cloud architecture, including compute, networking, storage, security, and automation. They should also have a deep understanding of GCP services such as App Engine, Compute Engine, Kubernetes Engine, BigQuery, Cloud Storage, Cloud SQL, and Cloud Pub/Sub, and tools suc...
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Platform Engineer - Tech Lead at Deutsche Bank in Pune, India, you will be part of the DB Technology global team of tech specialists. Your role involves leading a group of engineers working on cutting-edge technologies in Hadoop, Big Data, GCP, Terraform, BigQuery, Dataproc, and data management to develop robust data pipelines, ensure data quality, and implement efficient data management solutions. Your leadership will drive innovation, maintain high standards in data infrastructure, and mentor team members to support data-driven initiatives. You will collaborate with data engineers, analysts, cross-functional teams, and stakeholders to ensure the data platform meets the organiza...
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Engineer at our company, you will be responsible for designing scalable and robust AI/ML systems in production, focusing on high-performance and cost-effective solutions. Your expertise in various technologies, including GCP services like BigQuery, Cloud Dataflow, Pub/Sub, Dataproc, and Cloud Storage, along with programming languages such as Python, Java/Scala, and SQL, will be crucial for the success of our projects. Additionally, your experience with data processing tools like Apache Beam, Apache Kafka, and Cloud Dataprep, as well as orchestration tools like Apache Airflow and Terraform, will play a significant role in implementing efficient data pipelines. Knowledge of security ...
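Since the role above centres on streaming data into GCP services such as Pub/Sub and Dataflow, here is a hedged sketch of publishing JSON events to Cloud Pub/Sub, a common entry point for such pipelines; the project and topic names and the event payload are hypothetical.

```python
# Hedged sketch: publishing JSON events to Cloud Pub/Sub.
# Project, topic, and payload fields are hypothetical placeholders.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "ml-feature-events")

event = {"entity_id": "42", "feature": "clicks_7d", "value": 17}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("published message id:", future.result())  # blocks until the publish is acknowledged
```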
Posted 1 month ago
7.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a GCP DBT Manager, your primary responsibility will be to collaborate with the team in designing, building, and maintaining data pipelines and transformations using Google Cloud Platform (GCP) and the Data Build Tool (dbt). This role will involve utilizing tools such as BigQuery, Cloud Composer, and Python, and requires strong SQL skills and knowledge of data warehousing concepts. Additionally, you will play a crucial role in ensuring data quality, optimizing performance, and working closely with cross-functional teams (an illustrative BigQuery sketch follows this description).
Your key responsibilities will include:
Data Pipeline Development:
- Designing, building, and maintaining ETL/ELT pipelines using dbt and GCP services like Bi...
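dbt models themselves are written in SQL; to keep the sketches in one language, the hedged example below expresses the same kind of BigQuery transformation a dbt model would materialize, using the google-cloud-bigquery client. The project, dataset, and table names are hypothetical.

```python
# Hedged illustration: the kind of BigQuery transformation a dbt model would
# materialize, run here via the google-cloud-bigquery client.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

sql = """
CREATE OR REPLACE TABLE analytics.customer_revenue AS
SELECT customer_id,
       SUM(amount) AS lifetime_revenue,
       COUNT(*)    AS order_count
FROM   raw.orders
WHERE  status = 'COMPLETE'
GROUP  BY customer_id
"""

job = client.query(sql)   # starts the query job
job.result()              # waits for completion
print("rows in result table:",
      client.get_table("analytics.customer_revenue").num_rows)
```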
Posted 2 months ago
1.0 - 5.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Data Engineer at Synoptek, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines on the Google Cloud Platform (GCP). You will leverage your hands-on experience with GCP services such as BigQuery, Jitterbit, Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage to build efficient data processing solutions. Collaborating with cross-functional teams, you will translate their data needs into technical requirements, ensuring data quality, integrity, and security throughout the data lifecycle. Your role will involve developing and optimizing ETL/ELT processes to extract, transform, and load data from various sources into data warehouses and data lake...
Posted 2 months ago
3.0 - 7.0 years
0 Lacs
punjab
On-site
As a GCP Data Engineer in Australia, you will be responsible for leveraging your experience in Google Cloud Platform (GCP) to handle various aspects of data engineering. Your role will involve working on data migration projects from legacy systems such as SQL and Oracle. You will also be designing and building ETL pipelines for data lake and data warehouse solutions on GCP. In this position, your expertise in GCP data and analytics services will be crucial. You will work with tools like Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, Cloud BigQuery, Cloud Fusion, Cloud Pub/Sub, Cloud Storage, and Cloud Functions. Additionally, you will utilize the cloud-native GCP CLI/gsutil for opera...
Posted 2 months ago
4.0 - 10.0 years
0 Lacs
noida, uttar pradesh
On-site
We are seeking a PostgreSQL Database Developer with a minimum of 4 years of experience in database management. The ideal candidate should be passionate about technology, dedicated to continuous learning, and committed to providing exceptional customer experiences through client interactions.
Qualifications:
- A degree in BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or a related field.
- Expertise and hands-on experience in PostgreSQL, PL/SQL, Oracle, query optimization, performance tuning, and GCP Cloud (a tuning sketch follows this description).
Job Description: The responsibilities of the PostgreSQL Database Developer include:
- Proficiency in PL/SQL and PostgreSQL programming, with the ability to write complex SQL queries and stored pro...
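As a hedged illustration of the query-tuning work listed above, the snippet below runs EXPLAIN (ANALYZE, BUFFERS) through psycopg2 and prints the plan; the connection settings, the orders table, and the query itself are hypothetical placeholders.

```python
# Hedged sketch of a performance-tuning check: run EXPLAIN ANALYZE through
# psycopg2 and print the plan. Connection settings, table, and query are
# hypothetical placeholders.
import psycopg2

conn = psycopg2.connect(host="127.0.0.1", dbname="reporting",
                        user="app_user", password="change-me")
with conn, conn.cursor() as cur:
    cur.execute("""
        EXPLAIN (ANALYZE, BUFFERS)
        SELECT customer_id, SUM(amount)
        FROM   orders
        WHERE  order_date >= %s
        GROUP  BY customer_id
    """, ("2024-01-01",))
    # EXPLAIN returns one text column per plan line.
    for (line,) in cur.fetchall():
        print(line)
conn.close()
```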
Posted 2 months ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have a strong understanding of the tech stack including GCP Services such as BigQuery, Cloud Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Experience with Data Processing tools like Apache Beam (batch/stream), Apache Kafka, and Cloud Dataprep is crucial. Proficiency in programming languages like Python, Java/Scala, and SQL is required. Your expertise should extend to Orchestration tools like Apache Airflow (Cloud Composer) and Terraform, and Security aspects including IAM, Cloud Identity, and Cloud Security Command Center. Knowledge of Containerization using Docker and Kubernetes (GKE) is essential. Familiarity with Machine Learning platforms such as Google AI Platform, TensorFl...
Posted 2 months ago
4.0 - 10.0 years
0 Lacs
noida, uttar pradesh
On-site
We are seeking a PostgreSQL Database Developer with a minimum of 4 years of experience in database management. We are looking for an individual who is enthusiastic about technology, committed to continuous learning, and who approaches every client interaction as an opportunity to deliver exceptional customer service.
Qualifications:
- BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or any related degree.
- Proficiency in PostgreSQL, PL/SQL, Oracle, query optimization, performance tuning, and GCP Cloud.
Key Responsibilities:
- Proficient in PL/SQL and PostgreSQL programming, with the ability to write complex SQL queries and stored procedures.
- Experience in migrating database structure and data from Oracle to Postg...
Posted 2 months ago
0.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's , our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to , our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology ...
Posted 3 months ago