Posted: 1 hour ago
Work from Office
Full Time
Job Overview

We are looking for an experienced Data Engineer with 2-3 years of hands-on experience in designing and maintaining scalable data infrastructure. You will work with cross-functional teams to ensure high-quality, accessible, and well-structured data systems that support business intelligence, analytics, and other data-driven needs.

Key Responsibilities

- Design, develop, and maintain robust ETL/ELT pipelines for ingesting, transforming, and loading data.
- Build and manage scalable data architectures on cloud platforms such as AWS, GCP, or Azure.
- Ensure data quality, integrity, and consistency through validation, auditing, and monitoring.
- Collaborate with analytics and product teams to gather data requirements and deliver optimized data sets.
- Implement and manage data lakes and data warehouses using tools like Redshift, BigQuery, Azure Synapse, or Snowflake.
- Develop automated workflows using orchestration tools like Apache Airflow or AWS Step Functions.
- Optimize data processing workflows for performance and cost-efficiency.
- Document data models, data flows, and transformation logic to ensure transparency and maintainability.
- Enforce data security, privacy, and governance best practices.
- Perform regular data audits and troubleshoot data issues.

Required Skills & Qualifications

- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Minimum 2-3 years of experience in data engineering roles.
- Strong hands-on experience with cloud data services (e.g., AWS S3, Glue, Athena, Redshift, or equivalents in GCP/Azure).
- Proficiency in Python, PySpark, and SQL for data manipulation and scripting.
- Experience with big data tools such as Spark or Hadoop.
- Solid understanding of data modeling, warehousing concepts, and distributed systems.
- Strong problem-solving skills with a focus on debugging data quality and performance issues.
- Experience with data visualization tools or platforms such as Power BI, Tableau, or Looker is a plus.
Good to Have

- Experience with version control and CI/CD for data pipelines.
- Familiarity with message queue systems like Kafka or Kinesis.
- Exposure to real-time data processing.
- Knowledge of infrastructure-as-code tools like Terraform or CloudFormation.
LenDenClub