Data Engineer

Experience: 2 - 5 years

Salary: 6 - 10 Lacs

Posted: 6 days ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Role Description
  • Design, develop, and optimize data pipelines to extract, transform, and load (ETL/ELT) data from a variety of sources.
  • Build and manage data models and data warehouses that support business intelligence, reporting, and analytics needs.
  • Leverage cloud technologies such as AWS, Azure, or Google Cloud Platform for building scalable, reliable, and efficient data solutions.
  • Develop and maintain automated data workflows using tools like Airflow, AWS Glue, Azure Data Factory, or similar technologies.
  • Work with large datasets and complex data structures, ensuring data quality, integrity, and performance.
  • Write and optimize SQL queries for complex data extraction, aggregation, and transformation tasks.
  • Integrate APIs to connect data sources, extract information, and facilitate real-time data processing.
  • Collaborate with business intelligence and data science teams to define data requirements and ensure the availability of clean, accurate data for analysis and decision-making.
  • Implement CI/CD pipelines for automated deployment of data pipelines and models.
  • Monitor the performance of data systems, ensuring reliability, availability, and scalability of data architectures.
  • Create and maintain comprehensive documentation for data pipelines, systems, and processes.
  • Stay up to date with emerging trends and technologies in the data engineering field and continuously improve data systems.
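The ETL/ELT responsibilities above can be sketched as a minimal Python pipeline. This is an illustrative sketch only: the sample records, column names, and in-memory SQLite target are assumptions for demonstration, not part of the role.

```python
import sqlite3

# Hypothetical raw records, standing in for data extracted from an API or file source.
RAW_ORDERS = [
    {"order_id": 1, "amount": "120.50", "region": "north"},
    {"order_id": 2, "amount": "80.00", "region": "south"},
    {"order_id": 3, "amount": None, "region": "north"},  # invalid record
]

def extract():
    """Extract step: in practice this would call an API or read source files."""
    return RAW_ORDERS

def transform(records):
    """Transform step: drop invalid rows, cast types, normalize values."""
    cleaned = []
    for r in records:
        if r["amount"] is None:
            continue  # basic data-quality gate
        cleaned.append((r["order_id"], float(r["amount"]), r["region"].upper()))
    return cleaned

def load(rows, conn):
    """Load step: write cleaned rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # -> 2 (invalid row filtered)
```

In a production setting each step would typically be a task in an orchestrator such as Airflow, with the quality gate expanded into explicit validation rules.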
Technical Skills
  • Solid experience as a Data Engineer or similar role in data architecture and pipeline development.
  • Strong experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Advanced knowledge of ETL/ELT processes, data modeling, and data warehousing (e.g., Snowflake, Redshift).
  • Proficiency in SQL for complex data transformation and querying.
  • Hands-on experience with data pipeline orchestration tools like Azure Data Factory, Apache Airflow, AWS Glue, or similar.
  • Strong programming skills in Python for automation, data processing, and integration tasks.
  • Experience working with big data technologies such as Hadoop, Spark, or Kafka is a plus.
  • Familiarity with GitHub for version control and CI/CD pipelines for deployment automation.
  • Strong understanding of data security, governance, and compliance best practices.
  • Experience with business intelligence tools such as Tableau, Power BI, or similar for reporting and data visualization.
  • Ability to work in an agile, fast-paced environment and manage multiple tasks simultaneously.
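The SQL proficiency called for above centers on aggregation and transformation queries. A small self-contained sketch using Python's built-in sqlite3 module (the table, columns, and threshold are illustrative assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, amount REAL);
INSERT INTO sales VALUES ('north', 100), ('north', 250), ('south', 75);
""")

# Aggregate revenue per region and keep only regions above a threshold --
# the kind of GROUP BY / HAVING transformation the role describes.
query = """
SELECT region, SUM(amount) AS total
FROM sales
GROUP BY region
HAVING SUM(amount) > 100
ORDER BY total DESC
"""
for region, total in conn.execute(query):
    print(region, total)  # -> north 350.0
```

The same pattern scales to warehouse engines such as Snowflake or Redshift, where the optimization work shifts to distribution keys, clustering, and query profiling.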
Qualifications
  • Master's degree in Computer Science, Engineering, Data Science, or a related field.
Nice-to-have skills
  • Experience with real-time data streaming platforms such as Kafka or AWS Kinesis.
  • Exposure to machine learning and AI technologies and how data engineering supports these initiatives.
  • Experience with Infrastructure as Code (IaC) tools such as Terraform or CloudFormation.
  • Knowledge of data lake architectures and modern data processing frameworks.
  • Experience with Tableau for building reports, dashboards, and visual analytics.

Incedo

Information Technology and Services

Utrecht
