Posted: 3 days ago
Work from Office
Full Time
Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion and processing
- Work with stakeholders to understand data requirements and translate them into technical solutions
- Ensure data quality, reliability, and governance
- Optimize data storage and retrieval for performance and cost efficiency
- Collaborate with Data Scientists, Analysts, and Developers to support their data needs
- Maintain and enhance data architecture to support business growth

Required Skills:
- Strong experience with SQL and relational databases (MySQL, PostgreSQL, etc.)
- Hands-on experience with Big Data technologies (Spark, Hadoop, Hive, etc.)
- Proficiency in Python/Scala/Java for data engineering tasks
- Experience with cloud platforms (AWS, Azure, or GCP)
- Familiarity with data warehouse solutions (Redshift, Snowflake, BigQuery, etc.)
- Knowledge of workflow orchestration tools (Airflow, Luigi, etc.)

Good to Have:
- Experience with real-time data streaming (Kafka, Flink, etc.)
- Understanding of CI/CD and DevOps practices for data workflows
- Exposure to data security, compliance, and data privacy practices

Qualifications:
- Bachelor's/Master's degree in Computer Science, IT, or a related field
- Minimum 3 years of experience in data engineering or a related field