Data Engineer

5 - 10 years

25 - 40 Lacs

Posted: 3 hours ago | Platform: Naukri

Work Mode: Hybrid

Job Type: Full Time

Job Description

Role: Data Engineer

Years of Experience: 5 - 10 years

Key Skills: PySpark, Python, SQL

Key responsibilities

  • Design and build data infrastructure: Create and maintain scalable systems, databases, and data pipeline architectures to support data collection, storage, and processing.
  • Transform and prepare data: Develop algorithms and methods to clean, combine, and transform raw data into a structured format suitable for analytics and machine learning (see the sketch after this list).
  • Ensure data quality and security: Implement data validation methods, monitor workflows, and ensure compliance with data governance and security policies.
  • Collaborate with teams: Work with data scientists, analysts, and business stakeholders to understand data requirements and deliver valuable insights for decision-making.
  • Troubleshoot and optimize: Monitor systems, identify and correct errors, and optimize existing frameworks to improve performance.
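
To make the "transform and prepare data" responsibility concrete, here is a minimal PySpark sketch; the input paths, table names, and columns are hypothetical placeholders rather than an actual pipeline from this role:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("prepare_orders").getOrCreate()

    # Hypothetical raw inputs: order events and a customer dimension.
    orders = spark.read.parquet("s3://raw-zone/orders/")
    customers = spark.read.parquet("s3://raw-zone/customers/")

    # Clean the raw events: drop duplicates, filter invalid rows, normalize types.
    cleaned = (
        orders
        .dropDuplicates(["order_id"])
        .filter(F.col("order_amount") > 0)
        .withColumn("order_date", F.to_date("order_ts"))
    )

    # Combine with customer attributes and aggregate into an analytics-ready table.
    curated = (
        cleaned.join(customers, on="customer_id", how="left")
        .groupBy("order_date", "customer_segment")
        .agg(
            F.sum("order_amount").alias("daily_revenue"),
            F.countDistinct("order_id").alias("order_count"),
        )
    )

    curated.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://curated-zone/daily_revenue/"
    )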

Essential skills and qualifications

  • Programming: Strong proficiency in languages such as PySpark, Python, and SQL.
  • Databases and big data technologies: Expertise in database systems and big data tools such as Hadoop and Spark.
  • Data architecture: A solid understanding of how to structure and manage data systems.
  • Problem-solving: The ability to solve problems creatively and apply data science to business needs.
  • Communication: Excellent communication skills to collaborate with different teams and stakeholders.
  • Education: A bachelor's degree in a field such as computer science, information technology, or applied mathematics is often required.

Technical Skills:

  • Programming Languages: Strong proficiency in Python and SQL is essential; Java or Scala is a plus.
  • Databases: Experience with relational and NoSQL databases.
  • Big Data Technologies: Experience with Apache Hadoop, Spark, or Kafka.
  • Cloud Platforms: Proficiency in AWS, Azure, or GCP and their data services.
  • ETL & Workflow Management: Experience designing ETL pipelines using tools like Apache Airflow (see the sketch after this list).
  • Data Modeling: Understanding of data modeling, schema design, and data warehousing concepts.
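
As a sketch of the ETL & Workflow Management item above, the following is a minimal Apache Airflow DAG, assuming Airflow 2.4+; the DAG id, schedule, and the extract/transform/load callables are hypothetical placeholders, not a pipeline described in this posting:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Hypothetical extract/transform/load steps; in a real pipeline these would
    # trigger Spark jobs, warehouse loads, data-quality checks, and so on.
    def extract():
        print("pull raw data from the source system")

    def transform():
        print("clean and reshape the raw data")

    def load():
        print("write the curated data to the warehouse")

    with DAG(
        dag_id="daily_sales_etl",        # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",               # 'schedule' is the Airflow 2.4+ parameter
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Linear dependency chain: extract -> transform -> load.
        extract_task >> transform_task >> load_task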

Soft Skills:

  • Problem-solving and analytical abilities.
  • Strong communication skills.
  • Ability to work independently and collaboratively.
  • Detail-oriented with a commitment to data accuracy.

Education and Experience:

  • Bachelor's degree in a relevant technical field.
  • Experience with data visualization tools (Tableau, Power BI) is a plus.

Wipzo Systech

Information Technology

Innovation City
