Posted: 1 day ago
On-site
Contractual
Our Client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, it has revenue of $1.8B and more than 35,000 associates worldwide. The company specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media.
Our Client is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.
Interview process: 2 rounds of technical interviews, including a client round.
We are seeking a skilled and proactive Data Engineer to join our team.
The ideal candidate will have hands-on experience with Snowflake and DBT, as well as foundational knowledge of Python for data scripting and automation. Exposure to AWS cloud environments is a plus. This role offers the opportunity to work on cutting-edge data solutions that support key business initiatives.
Key Responsibilities:
Design, develop, and maintain scalable data pipelines using DBT and Snowflake.
Perform data modeling, transformation, and integration tasks to support business requirements.
Write and maintain clean, efficient Python scripts for automation and workflow orchestration (a brief illustrative sketch follows this list).
Ensure data quality, integrity, and performance through best practices and testing.
Collaborate with cross-functional teams including analytics, operations, and engineering.
Participate in code reviews, sprint planning, and agile delivery processes.
Monitor and optimize existing data workflows and troubleshoot issues as they arise.
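As a rough illustration of the automation work described above (the dbt selector, table name, and connection settings below are placeholders, not details of this engagement), a Python script along these lines might trigger a dbt build and then run a simple row-count check against Snowflake:

```python
# Illustrative sketch only: the dbt selector, warehouse, and table names are
# placeholders, and credentials are assumed to come from environment variables.
import os
import subprocess

import snowflake.connector

# Trigger a dbt build for a selected set of models; dbt reads its own
# connection details from profiles.yml.
result = subprocess.run(
    ["dbt", "build", "--select", "staging+"],
    capture_output=True,
    text=True,
)
if result.returncode != 0:
    raise RuntimeError(f"dbt build failed:\n{result.stdout}\n{result.stderr}")

# Simple post-run sanity check: make sure the target table is not empty.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",  # placeholder warehouse name
)
try:
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM MY_DB.ANALYTICS.ORDERS")  # placeholder table
    row_count = cur.fetchone()[0]
    if row_count == 0:
        raise RuntimeError("Data quality check failed: ORDERS is empty")
    print(f"ORDERS row count: {row_count}")
finally:
    conn.close()
```

In practice, checks like this would more often live in dbt tests or an orchestrator rather than a standalone script; the sketch only shows the kind of scripting the role involves.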
Must-Have Skills:
Strong experience with Snowflake (data warehouse design, performance tuning, SQL); a short performance-review sketch follows this list.
Proficient in DBT (Data Build Tool) for transforming and modeling data.
Basic knowledge of Python, particularly for scripting and automation in a data environment.
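As a hedged sketch of the performance-tuning side of the Snowflake requirement (the connection details are placeholders, and querying SNOWFLAKE.ACCOUNT_USAGE assumes the role has the necessary privileges; that view can also lag by up to about 45 minutes), the following pulls the slowest recent queries as tuning candidates:

```python
# Hedged sketch: credentials are placeholders taken from environment variables.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
try:
    cur = conn.cursor()
    # Surface the slowest queries from the last day as candidates for tuning.
    cur.execute(
        """
        SELECT query_id,
               warehouse_name,
               total_elapsed_time / 1000 AS elapsed_seconds,
               query_text
        FROM snowflake.account_usage.query_history
        WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 10
        """
    )
    for query_id, warehouse, seconds, text in cur.fetchall():
        print(f"{query_id} | {warehouse} | {seconds:.1f}s | {text[:80]}")
finally:
    conn.close()
```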
Nice-to-Have Skills:
Familiarity with AWS services such as S3, Lambda, Glue, or Redshift (see the short S3 sketch after this section).
Understanding of CI/CD and version control systems like Git.
Qualifications:
6+ years of experience in data engineering or a related discipline.
Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities.
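For the AWS familiarity listed above, here is a minimal boto3 sketch (the bucket, key, and file names are invented for illustration) showing how an extract might be staged in S3 for downstream loading:

```python
# Illustrative only: bucket, key, and file names below are placeholders.
import boto3

s3 = boto3.client("s3")

# Stage a local extract in S3, where a Snowflake external stage or an
# AWS Glue job could pick it up for downstream loading.
s3.upload_file(
    Filename="daily_orders.csv",          # local file (placeholder)
    Bucket="example-data-lake-bucket",    # placeholder bucket
    Key="raw/orders/daily_orders.csv",    # placeholder object key
)
print("Uploaded daily_orders.csv to s3://example-data-lake-bucket/raw/orders/")
```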