Posted: 21 hours ago
Work from Office
Full Time
Position Overview
We are looking for an experienced Lead Data Engineer to join our dynamic team. If you are passionate about building scalable software solutions and working collaboratively with cross-functional teams to define requirements and deliver solutions, we would love to hear from you.

Job Responsibilities:
- Develop and maintain data pipelines and ETL/ELT processes using Python
- Design and implement scalable, high-performance applications
- Work collaboratively with cross-functional teams to define requirements and deliver solutions
- Develop and manage near real-time data streaming solutions using Pub/Sub or Beam
- Contribute to code reviews, architecture discussions, and continuous improvement initiatives
- Monitor and troubleshoot production systems to ensure reliability and performance

Basic Qualifications:
- 5+ years of professional software development experience with Python
- Strong understanding of software engineering best practices (testing, version control, CI/CD)
- Experience building and optimizing ETL/ELT processes and data pipelines
- Proficiency with SQL and database concepts
- Experience with data processing frameworks (e.g., Pandas)
- Understanding of software design patterns and architectural principles
- Ability to write clean, well-documented, and maintainable code
- Experience with unit testing and test automation
- Experience working with any cloud provider (GCP is preferred)
- Experience with CI/CD pipelines and infrastructure as code
- Experience with containerization technologies like Docker or Kubernetes
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience)
- Proven track record of delivering complex software projects
- Excellent problem-solving and analytical thinking skills
- Strong communication skills and ability to work in a collaborative environment

Preferred Qualifications:
- Experience with GCP services, particularly Cloud Run and Dataflow
- Experience with stream processing technologies (Pub/Sub)
- Familiarity with big data technologies (Airflow)
- Experience with data visualization tools and libraries
- Knowledge of CI/CD pipelines with GitLab and infrastructure as code with Terraform
- Familiarity with platforms like Snowflake, BigQuery, or Databricks
- GCP Data Engineer certification
Shyftlabs