Posted: 2 days ago
Work from Office
Full Time
5 to 7 years of experience in data engineering
Responsibilities:
Architect and maintain scalable, secure, and reliable data platforms and pipelines
Design and implement data lake and data warehouse solutions on platforms such as Redshift, BigQuery, Snowflake, or Delta Lake
Build real-time and batch data pipelines using tools like Apache Airflow, Kafka, Spark, and DBT
Ensure data governance, lineage, quality, and observability
Collaborate with stakeholders to define data strategies, architecture, and KPIs
Lead code reviews and enforce best practices
Mentor junior and mid-level engineers
Optimize query performance, data storage, and infrastructure
Integrate CI/CD workflows for data deployment and automated testing
Evaluate and implement new tools and technologies as required
Requirements:
Demonstrate expert-level proficiency in Python and SQL
Possess deep knowledge of distributed systems and data processing frameworks
Be proficient in cloud platforms (AWS, GCP, or Azure), containerization, and CI/CD processes
Have experience with streaming platforms like Kafka or Kinesis and orchestration tools
Be highly skilled with Airflow, DBT, and data warehouse performance tuning
Exhibit strong leadership, communication, and mentoring skills
Omni Recruit