Regular Data Engineer

3 - 6 years

12 - 16 Lacs

Posted: 12 hours ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

  • Project description
    Our client, a leading global financial markets data provider, runs a number of change programs to deliver high-quality software that connects financial markets across the globe through a real-time, high-frequency, low-latency data management chain. The projects are technically challenging, in a highly engaging environment that makes the most of cutting-edge technologies.
  • Responsibilities

  • Design, build, and maintain scalable ETL/ELT data pipelines to support analytics, reporting, and machine learning use cases.

  • Develop and optimize data models and warehouse structures in Snowflake to ensure high performance and reliability.

  • Collaborate with data analysts, data scientists, and business stakeholders to understand requirements and deliver clean, reliable, and accessible datasets.

  • Implement and manage data orchestration workflows (e.g., Airflow, AWS Step Functions) to ensure timely and accurate data delivery; a minimal DAG sketch follows this list.

  • Apply best practices in data governance, quality, and security across all pipelines and storage layers.

  • Monitor, troubleshoot, and optimize data processes for performance, scalability, and cost efficiency.

  • Leverage AWS cloud services to design resilient, scalable, and secure data architectures.

  • Use Python and SQL for advanced data transformations, automation, and performance optimization.

  • Document data pipelines, models, and workflows to ensure maintainability and transparency.

  • Contribute to the continuous improvement of the data engineering platform, adopting new tools, frameworks, and processes.
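
As a rough, illustrative sketch of the orchestration work above (not the client's actual code), the following minimal Airflow DAG wires an extract, transform, and load step together with the TaskFlow API (Airflow 2.4+). The DAG name, bucket paths, and schedule are all hypothetical.

    # Minimal, hypothetical Airflow DAG: extract -> transform -> load.
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def market_data_pipeline():
        @task
        def extract() -> str:
            # In practice this would pull raw files from S3 (e.g., via boto3).
            return "s3://raw-bucket/market_data/latest.csv"  # hypothetical path

        @task
        def transform(raw_path: str) -> str:
            # Placeholder for cleaning/validation; writes to a curated layer.
            return raw_path.replace("raw-bucket", "curated-bucket")

        @task
        def load(curated_path: str) -> None:
            # Typically a COPY INTO / MERGE against Snowflake; stubbed here.
            print(f"Loading {curated_path} into the warehouse")

        load(transform(extract()))


    market_data_pipeline()
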
  • Skills
    Must have

  • Strong proficiency in Python for ETL/ELT development and data pipeline automation.

  • Hands-on experience with AWS services (S3, Glue, Lambda, EMR, RDS) for building and managing scalable data solutions.

  • Expertise in Snowflake for cloud data warehousing, performance tuning, and large-scale data management.

  • Advanced SQL skills for data modeling, transformations, and query optimization.

  • Practical experience with dbt for modular, version-controlled data transformations and modeling (a short sketch follows the Nice to have list).

    Nice to have

  • Experience with data orchestration tools (Airflow, Prefect, AWS Step Functions).

  • Familiarity with big data frameworks (Apache Spark, Kafka, Kinesis).

  • Knowledge of CI/CD pipelines and version control (GitHub/GitLab, Jenkins).

  • Understanding of data modeling techniques (star/snowflake schemas, data lakehouse).

  • Exposure to monitoring & observability tools (CloudWatch, Datadog, Prometheus).

  • Awareness of containerization and deployment (Docker, Kubernetes).
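
To make the Must-have stack concrete, here is a minimal sketch (an assumption-laden example, not the client's code) of a Python transformation step against Snowflake using the snowflake-connector-python package. The warehouse, database, and table names (TRANSFORM_WH, ANALYTICS, STG_TRADES, FACT_TRADES) are hypothetical.

    # Hypothetical Python + Snowflake ELT step: an idempotent MERGE from a
    # staging table into a fact table, a common dbt-style transformation.
    import os

    import snowflake.connector

    # Credentials are read from the environment rather than hard-coded.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",  # hypothetical warehouse
        database="ANALYTICS",      # hypothetical database
        schema="CORE",
    )

    MERGE_SQL = """
    MERGE INTO FACT_TRADES AS tgt
    USING STG_TRADES AS src
      ON tgt.trade_id = src.trade_id
    WHEN MATCHED THEN UPDATE SET
      tgt.price = src.price, tgt.quantity = src.quantity
    WHEN NOT MATCHED THEN INSERT (trade_id, price, quantity)
      VALUES (src.trade_id, src.price, src.quantity)
    """

    try:
        cur = conn.cursor()
        cur.execute(MERGE_SQL)
        print(f"Rows affected: {cur.rowcount}")
    finally:
        conn.close()
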
Luxoft

IT Services and IT Consulting

Zug · New York