Senior Data Engineer (GCP, Python)

5.0 - 10.0 years

6.0 - 12.0 Lacs P.A.

Gurgaon / Gurugram, Haryana, India

Posted: 6 days ago | Platform: Foundit


Skills Required

Cloud Batch, Bigtable, Cloud Functions

Work Mode

On-site

Job Type

Full Time

Job Description

Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines for high-volume data processing
  • Optimize and automate data ingestion, transformation, and storage workflows
  • Handle both structured and unstructured data sources, ensuring data quality and consistency
  • Develop and maintain data models, data warehouses, and databases
  • Collaborate with cross-functional teams to support and enable data-driven decision-making
  • Ensure data security, privacy, and compliance with industry and regulatory standards
  • Troubleshoot and resolve data-related issues promptly and efficiently
  • Monitor and enhance system performance, reliability, and scalability
  • Stay up to date with emerging data technologies and recommend improvements to data architecture and engineering practices

What You Will Need

  • 5+ years of experience in data engineering, ETL development, or a related field
  • Strong programming skills in Python
  • Proficiency in SQL and experience with both relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB, MongoDB)
  • Proven experience building data pipelines on Google Cloud Platform (GCP) using services such as Dataflow, Cloud Batch, BigQuery, Bigtable, Cloud Functions, Cloud Workflows, and Cloud Composer
  • Solid understanding of data modeling, data warehousing, and data governance principles
  • Ability to mentor junior data engineers and assist with technical challenges
  • Familiarity with orchestration tools such as Apache Airflow
  • Experience with containerization and orchestration tools like Docker and Kubernetes
  • Proficiency with version control systems (e.g., Git) and CI/CD pipelines
  • Excellent problem-solving and communication skills
  • Ability to work effectively in a fast-paced, agile environment
  • Experience with Snowflake, big data technologies (e.g., Hadoop, Spark, Kafka), and AWS is a plus
  • Skilled at converting business requirements into technical documentation

Education and Experience

  • Bachelor's degree in Computer Science, Information Systems, Information Technology, or a related field
  • Certified development training/program is a plus
  • 5+ years of hands-on experience building data pipelines using Python and GCP
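The core of the role is ETL/ELT pipeline work in Python. As an illustrative, standard-library-only sketch of that extract/transform/load pattern (the CSV schema, table name, and sample rows are invented for illustration; a real pipeline for this role would target GCP services such as Dataflow and BigQuery rather than SQLite):

```python
import csv
import io
import sqlite3

# Hypothetical raw source data; stands in for a file landing in cloud storage.
RAW_CSV = """order_id,amount,currency
1001,250.00,USD
1002,99.50,USD
"""

def extract(source: str) -> list[dict]:
    """Read raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[tuple]:
    """Cast fields to typed tuples, dropping malformed rows."""
    out = []
    for row in rows:
        try:
            out.append((int(row["order_id"]), float(row["amount"]), row["currency"]))
        except (KeyError, ValueError):
            continue  # in practice: log and route bad rows to a dead-letter table
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Idempotent load into a warehouse-style table (re-runs don't duplicate rows)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW_CSV)), conn)
    print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
```

The idempotent load step mirrors a common warehouse practice (e.g., `MERGE` or write-truncate in BigQuery), so a pipeline retry does not double-count rows.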

S&P Global Market Intelligence

Financial Services

New York

Approximately 20,000 employees

627 Jobs

    Key People

  • Eddie Fishman

    VP, Product Management
  • J. P. O'Connor

    Head of Product Management
