5 - 7 years
INR 10.0 - 19.0 Lacs P.A.
Delhi NCR, Gurgaon, Noida
Posted: 2 months ago
Work from Office
Full Time
Key Responsibilities:
- Design, develop, and maintain efficient data pipelines using Python/PySpark.
- Write complex SQL queries to extract, transform, and load (ETL) data.
- Collaborate with cross-functional teams to define and implement data strategies.
- Leverage cloud platforms (AWS, Azure, GCP) for scalable data solutions.
- Ensure data quality, integrity, and consistency across systems.
- Optimize the performance of data processes and identify bottlenecks.
- Monitor and troubleshoot data-related issues to ensure system reliability.

Skills & Qualifications:
- Strong proficiency in Python and PySpark for data processing.
- Advanced SQL skills for querying and manipulating large datasets.
- Hands-on experience with cloud platforms (AWS, Azure, GCP).
- Familiarity with data warehousing, ETL processes, and data pipeline orchestration.
- Understanding of big data technologies (e.g., Hadoop, Spark) is a plus.
- Ability to work independently and in a collaborative team environment.
- Strong problem-solving and analytical skills.

Must-Haves:
- 5+ years of experience as a Data Engineer.
- Immediate joiners only.
- Work location: Gurgaon only (Hybrid).
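To illustrate the kind of ETL work described above, here is a minimal sketch of an extract-transform-load step in Python. It uses the standard library's sqlite3 in place of a real warehouse, and the table and column names (raw_orders, region_totals) are hypothetical, chosen purely for illustration:

```python
import sqlite3

# Minimal ETL sketch using an in-memory SQLite database.
# Table/column names are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: raw source rows (in practice these would come from an upstream system).
cur.execute("CREATE TABLE raw_orders (order_id INTEGER, region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "North", 120.0), (2, "South", 80.0), (3, "North", 200.0), (4, "South", -5.0)],
)

# Transform + Load: drop invalid rows (a basic data-quality filter),
# aggregate per region, and write the result to a target table.
cur.execute("""
    CREATE TABLE region_totals AS
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS order_count
    FROM raw_orders
    WHERE amount > 0
    GROUP BY region
""")

totals = list(
    cur.execute("SELECT region, total_amount, order_count FROM region_totals ORDER BY region")
)
print(totals)  # → [('North', 320.0, 2), ('South', 80.0, 1)]
conn.close()
```

A production pipeline would run the same pattern at scale (e.g., PySpark DataFrames reading from cloud storage instead of local SQLite), but the extract/filter/aggregate/load flow is the same.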