Experience

2 - 4 years

Salary

3 - 8 Lacs

Posted: 8 months ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

Key Responsibilities

  • Design, develop, and maintain robust ETL pipelines for data ingestion, transformation, and loading into data warehouses (see the sketch after this list).
  • Optimize and improve data models to enhance performance and scalability.
  • Collaborate with data analysts, scientists, and other stakeholders to understand data requirements and deliver solutions.
  • Monitor and troubleshoot ETL workflows to ensure smooth operations and data quality.
  • Implement and enforce best practices for data governance, security, and compliance.
  • Analyze and resolve complex technical issues related to ETL processes.
  • Document ETL processes, data architecture, and operational workflows.
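To give a concrete feel for the pipeline work described above, here is a minimal ETL sketch in Python. It is illustrative only: it assumes a CSV source and uses SQLite as a stand-in for a warehouse, and the file, table, and column names are hypothetical.

```python
# Minimal illustrative ETL sketch: extract rows from a CSV file, apply a
# simple transformation, and load them into a SQLite table standing in
# for a warehouse. File, table, and column names are hypothetical.
import csv
import sqlite3


def extract(path: str) -> list[dict]:
    """Read raw records from a CSV source."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def transform(rows: list[dict]) -> list[tuple]:
    """Normalize fields and drop records missing a customer id."""
    cleaned = []
    for row in rows:
        if not row.get("customer_id"):
            continue  # basic data-quality rule: skip incomplete records
        cleaned.append((row["customer_id"], row.get("email", "").strip().lower()))
    return cleaned


def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Bulk-insert transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers (customer_id TEXT, email TEXT)"
    )
    conn.executemany("INSERT INTO customers VALUES (?, ?)", rows)
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    load(transform(extract("customers.csv")), conn)
```

In a production pipeline the same extract/transform/load split would typically target a tool like Informatica, Talend, or SSIS, or a cloud warehouse, rather than SQLite.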

Required Skills and Qualifications

  • Bachelor's degree in Computer Science, Data Engineering, or a related field.
  • 2+ years of experience in ETL development and data engineering.
  • Proficiency in ETL tools such as Informatica, Talend, SSIS, or equivalent.
  • Strong knowledge of SQL and database management systems (e.g., PostgreSQL, MySQL, SQL Server).
  • Hands-on experience with data integration and transformation in cloud environments (AWS, Azure, or Google Cloud).
  • Experience with data modeling and working with structured and unstructured data.
  • Familiarity with programming languages like Python, Scala, or Java for data manipulation.
  • Excellent problem-solving and communication skills.
  • NoSQL Databases: Hands-on experience with NoSQL databases like MongoDB, Cassandra, or DynamoDB.
  • Modern Data Lakehouses: Knowledge of modern data lakehouse platforms like Apache Iceberg, Snowflake, or Dremio.
  • Real-Time Data Processing: Experience developing and optimizing real-time data workflows using tools like Apache Kafka, Apache Flink, or AWS Kinesis (see the sketch after this list).
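As a small illustration of the real-time item above, the sketch below consumes JSON events with the kafka-python client (one of several Kafka clients for Python). The topic name, broker address, and event fields are assumptions for the example, not details from this role.

```python
# Hedged sketch of a real-time consumer using the kafka-python package;
# the "orders" topic, broker address, and event fields are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                  # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    order = message.value
    # Apply a lightweight check before handing the event off downstream,
    # e.g. to a warehouse loader or a stream processor.
    if order.get("amount", 0) > 0:
        print(f"processing order {order.get('id')}")
```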

Preferred Skills

  • Knowledge of Big Data technologies like Hadoop, Spark, or Kafka (a brief PySpark sketch follows this list).
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Familiarity with DevOps practices for data pipelines.
  • Understanding of machine learning workflows and data preparation for AI models.
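For the Spark item above, here is a brief PySpark sketch of a batch aggregation job; the input file, column names, and output path are hypothetical.

```python
# Brief PySpark sketch: read a CSV, aggregate events per day, and write
# the result out as Parquet. File and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-example").getOrCreate()

events = spark.read.csv("events.csv", header=True, inferSchema=True)
daily_counts = events.groupBy("event_date").count()
daily_counts.write.mode("overwrite").parquet("daily_counts.parquet")

spark.stop()
```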

Company

Focus Management Consultants

Industry

Management Consulting

Location

Business City
