Experience: 4 - 5 years

Salary: 4 - 8 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

Job Summary:

We are looking for a Python Developer with strong data engineering and data handling skills. The ideal candidate will build scalable data pipelines, automate workflows, integrate datasets, and support analytics and ML teams with clean, reliable data solutions.

Key Responsibilities:

  • Develop, optimize, and maintain data pipelines using Python.
  • Work with large datasets and ensure efficient extraction, transformation, and loading (ETL).
  • Integrate data from various internal and external sources.
  • Build reusable Python modules and automation scripts.
  • Work closely with data engineers, analysts, and stakeholders to understand requirements.
  • Optimize performance of data processes and troubleshoot issues.
  • Ensure data quality, validation, and integrity across systems.
  • Deploy and monitor data workflows in production environments.

Technical Skills Required:

  • Strong hands-on experience in Python (Pandas, NumPy; PySpark preferred).
  • Experience with ETL development and building data pipelines.
  • Good understanding of SQL, complex queries, and stored procedures.
  • Experience with databases (PostgreSQL, MySQL, MongoDB, or similar).
  • Exposure to big data frameworks (Spark/Hadoop) is a strong advantage.
  • Experience with cloud platforms (AWS/Azure/GCP) for data services.
  • Familiarity with API integrations, automation, and scheduling tools (Airflow, cron).
  • Knowledge of version control (Git) and CI/CD practices.

Preferred Qualifications:

  • Advanced experience with Python libraries used in data workflows (Pandas, NumPy, PySpark, FastAPI/Flask for data services).
  • Hands-on experience with data warehousing concepts and building scalable data models.
  • Practical understanding of ML model pipelines and supporting data science teams with clean, production-ready datasets.
  • Experience working with workflow orchestration tools (Airflow, Prefect, Luigi).
  • Familiarity with containerization and deployment (Docker/Kubernetes) for data applications.
  • Strong understanding of cloud-based data services (AWS Glue, Redshift, Azure Data Factory, BigQuery).
  • Bachelor's degree in Computer Science, Engineering, or a related technical field.

Cloudthat Technologies

Cloud Computing

Bangalore
