Posted: 3 weeks ago | Platform: LinkedIn

Work Mode: On-site
Job Type: Full Time

Job Description

About the Company

Our client is a trusted global innovator of IT and business services, operating in more than 50 countries. They specialize in digital and IT modernization, consulting, managed services, and industry-specific solutions. Committed to clients' long-term success, they empower organizations and society to move confidently into the digital future.


About the Role

We are looking for an experienced Data Engineer to join our dynamic team. The ideal candidate will have a strong background in building, optimizing, and maintaining scalable data pipelines and infrastructure on cloud platforms. You will be responsible for designing and implementing data solutions using Snowflake and AWS technologies to support business analytics and data science initiatives.

Key Responsibilities:

  • Design, build, and maintain scalable, reliable, and efficient data pipelines on AWS cloud infrastructure.
  • Develop and optimize data ingestion, transformation, and loading processes primarily using Snowflake (a minimal sketch follows this list).
  • Collaborate closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
  • Implement ETL/ELT workflows leveraging best practices for data integration and transformation.
  • Monitor and troubleshoot data pipeline performance, identify bottlenecks, and implement solutions to improve efficiency.
  • Ensure data quality, data governance, and compliance with organizational standards.
  • Work with orchestration tools and frameworks to automate workflows (knowledge of DBT or Python scripting is a plus).
  • Stay updated on the latest data engineering technologies and trends, and recommend improvements to existing data architecture.
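
To make the day-to-day work concrete, here is a minimal, non-authoritative sketch of the kind of ingest-transform-load step described above, using pandas and the snowflake-connector-python write_pandas helper. The account, credentials, file, and table names are placeholders, not details from this posting.

```python
# Minimal ELT sketch: read a raw CSV, apply a light transformation,
# and load the result into a Snowflake table with write_pandas.
# All account, credential, and object names are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract: read a raw file (in practice this might arrive via S3).
orders = pd.read_csv("raw_orders.csv")

# Transform: normalize column names and derive a simple metric.
orders.columns = [c.strip().upper() for c in orders.columns]
orders["ORDER_VALUE"] = orders["QUANTITY"] * orders["UNIT_PRICE"]

# Load: push the DataFrame into a Snowflake staging table.
conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="***",         # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    write_pandas(conn, orders, table_name="ORDERS_STG", auto_create_table=True)
finally:
    conn.close()
```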

Mandatory Skills:

  • Strong hands-on experience with the Snowflake data warehouse platform.
  • Expertise in AWS cloud services related to data engineering (e.g., S3, Glue, Lambda, Redshift, EC2); a sketch combining several of these follows this list.
  • Proven experience in designing and developing large-scale data pipelines and data solutions.
  • Solid understanding of SQL and data modeling concepts.
  • Experience working in Agile teams and familiarity with CI/CD pipelines for data workflows.
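
As an illustration of how these AWS services fit together, here is a hedged sketch of one common pattern: an S3 PUT event triggers a Lambda function that runs a Snowflake COPY INTO from an external stage. The stage, table, and environment-variable names are assumptions for illustration only.

```python
# Sketch: Lambda handler that loads a newly arrived S3 object into
# Snowflake via COPY INTO from a pre-created external stage.
# Stage, table, and environment-variable names are placeholders.
import os
import snowflake.connector

def lambda_handler(event, context):
    # An S3 PUT event carries the bucket and object key of the new file.
    key = event["Records"][0]["s3"]["object"]["key"]

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],  # prefer Secrets Manager
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        # @RAW_STAGE is assumed to already point at the landing bucket.
        conn.cursor().execute(
            f"COPY INTO ORDERS_STG FROM @RAW_STAGE/{key} "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()
    return {"loaded": key}
```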

Good to Have:

  • Experience with DBT (Data Build Tool) for data transformation and modeling.
  • Proficiency in Python scripting for data manipulation, automation, and workflow orchestration.
  • Familiarity with other big data technologies such as Kafka, Spark, or Airflow (see the orchestration sketch after this list).
  • Knowledge of data governance, metadata management, and data security best practices.
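
For a flavour of this "good to have" stack, here is a minimal Airflow sketch that orchestrates a Python ingestion step followed by a dbt build. The DAG id, schedule, and dbt project path are assumptions for illustration, not requirements from this posting.

```python
# Sketch: Airflow DAG chaining a Python ingestion task into a dbt build.
# The DAG id, schedule, and dbt project path are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def ingest_raw_data():
    # Placeholder for the ingestion logic (e.g., land files in S3,
    # then COPY them into a Snowflake staging schema).
    print("ingesting raw data")

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_raw_data)
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/dbt/analytics && dbt build",
    )
    ingest >> dbt_build
```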
