Posted: 2 days ago
Work from Office
Full Time
Key Responsibilities
- Build and maintain scalable ETL/ELT data pipelines using Python and cloud-native tools (an illustrative sketch of such a pipeline follows the skills list below).
- Design and optimize data models and queries on Google BigQuery for analytical workloads.
- Develop, schedule, and monitor workflows using orchestration tools such as Apache Airflow or Cloud Composer.
- Ingest and integrate data from multiple structured and semi-structured sources, including MySQL, MongoDB, APIs, and cloud storage.
- Ensure data integrity, security, and quality through validation, logging, and monitoring systems.
- Collaborate with analysts and data consumers to understand requirements and deliver clean, usable datasets.
- Implement data governance, lineage tracking, and documentation as part of platform hygiene.

Must-Have Skills
- 1-4 years of experience in data engineering or backend development.
- Strong experience with Google BigQuery and Google Cloud Platform (GCP).
- Proficiency in Python for scripting, automation, and data manipulation.
- Solid understanding of SQL and experience with relational databases such as MySQL.
- Experience working with MongoDB and semi-structured data (e.g., JSON, nested formats).
- Exposure to data warehousing, data modeling, and performance tuning.
- Familiarity with Git-based version control and CI/CD practices.
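For illustration only (not part of the role description): a minimal sketch of the kind of ELT work listed above, using an Apache Airflow DAG to load a file from Cloud Storage into a BigQuery staging table and then run a SQL transformation. The project, dataset, table, and bucket names are hypothetical, and the sketch assumes the google-cloud-bigquery client and GCP credentials are configured in the Airflow environment.

```python
# Illustrative sketch only. All project, dataset, table, and bucket names below
# are hypothetical placeholders, not references to any real Lenskart system.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def load_csv_to_bigquery():
    """Load a CSV export from Cloud Storage into a BigQuery staging table."""
    client = bigquery.Client()  # uses application-default credentials
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    load_job = client.load_table_from_uri(
        "gs://example-bucket/exports/orders.csv",   # hypothetical source file
        "example-project.staging.orders",            # hypothetical staging table
        job_config=job_config,
    )
    load_job.result()  # block until the load job finishes


def transform_staging_to_mart():
    """Aggregate the staging table into an analytics table with SQL."""
    client = bigquery.Client()
    client.query(
        """
        CREATE OR REPLACE TABLE `example-project.analytics.daily_orders` AS
        SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM `example-project.staging.orders`
        GROUP BY order_date
        """
    ).result()


with DAG(
    dag_id="orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(
        task_id="load_csv_to_bigquery", python_callable=load_csv_to_bigquery
    )
    transform = PythonOperator(
        task_id="transform_staging_to_mart", python_callable=transform_staging_to_mart
    )
    load >> transform
```

In practice, a role like this would also add validation, logging, and alerting around each task, which the sketch omits for brevity.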
Lenskart