Posted: 9 hours ago
On-site | Contractual
Our client is a global digital solutions and technology consulting company headquartered in Mumbai, India. The company generates annual revenue of over $4.29 billion (₹35,517 crore), reflecting 4.4% year-over-year growth in USD terms. It has a workforce of around 86,000 professionals operating in more than 40 countries and serves a global client base of over 700 organizations.
The company balances innovation with an open, friendly culture and the backing of a long-established parent company known for its ethical reputation. It guides customers from what’s now to what’s next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society.
Our client operates across several major industry sectors, including Banking, Financial Services & Insurance (BFSI), Technology, Media & Telecommunications (TMT), Healthcare & Life Sciences, and Manufacturing & Consumer. In the past year, the company achieved a net profit of $553.4 million (₹4,584.6 crore), marking a 1.4% increase from the previous year. It also recorded a strong order inflow of $5.6 billion, up 15.7% year-over-year, highlighting growing demand across its service lines.
Key focus areas include Digital Transformation, Enterprise AI, Data & Analytics, and Product Engineering—reflecting its strategic commitment to driving innovation and value for clients across industries.
Detailed JD:
"Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. Person will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics, and data science initiatives.
Key Responsibilities:
• Design and implement scalable data models and transformation pipelines using DBT on Snowflake.
• Write efficient and maintainable PL/SQL code for complex data processing and transformation tasks.
• Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions.
• Optimize Snowflake performance through query tuning, clustering, and resource management.
• Ensure data quality, integrity, and governance through testing, documentation, and monitoring.
• Participate in code reviews, architecture discussions, and continuous improvement initiatives.
• Maintain and enhance CI/CD pipelines for DBT projects (a minimal CI sketch follows this list).
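To make the CI/CD responsibility concrete, here is a minimal sketch of a pipeline step that installs dbt packages, then builds and tests a dbt project against Snowflake. The "ci" target name and the docs step are illustrative assumptions, not part of the role; dbt deps, dbt build, and dbt docs generate are standard dbt CLI commands.

```python
# Minimal CI step for a dbt-on-Snowflake project (sketch, not the client's actual pipeline).
# Assumptions: dbt is installed in the CI image and profiles.yml defines a "ci" target.
import subprocess
import sys


def run(cmd: list[str]) -> None:
    """Run a command, echo it, and fail the pipeline on a non-zero exit code."""
    print("$ " + " ".join(cmd))
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(result.returncode)


def main() -> None:
    run(["dbt", "deps"])                                # install packages from packages.yml
    run(["dbt", "build", "--target", "ci"])             # run models, tests, seeds, snapshots in DAG order
    run(["dbt", "docs", "generate", "--target", "ci"])  # build the docs catalog as a CI artifact


if __name__ == "__main__":
    main()
```

Because dbt build runs models and their tests together, a failing schema or data test stops the pipeline before anything reaches production.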
Required Qualifications:
• 3+ years of experience in data engineering or a related field.
• Strong hands-on experience with DBT (modular SQL development, testing, documentation).
• Proficiency in Snowflake (data warehousing, performance tuning, security).
• Advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages.
• Solid understanding of data modeling concepts (star/snowflake schemas, normalization).
• Experience with version control systems (e.g., Git) and CI/CD practices.
• Familiarity with orchestration tools (e.g., Airflow, dbt Cloud, Prefect) is a plus (see the scheduling sketch below).
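Orchestration tools are listed above as a plus; the sketch below shows one common pattern for scheduling a nightly dbt build on Snowflake from Airflow. The DAG id, project path, schedule, and target name are illustrative assumptions only, and the schedule argument assumes Airflow 2.4+ (older versions use schedule_interval).

```python
# Hypothetical Airflow DAG that runs a nightly dbt build against Snowflake.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_nightly",   # assumed name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",             # nightly at 02:00; Airflow 2.4+ keyword
    catchup=False,
) as dag:
    # Install dbt package dependencies declared in packages.yml.
    dbt_deps = BashOperator(
        task_id="dbt_deps",
        bash_command="cd /opt/dbt/project && dbt deps",  # assumed project path
    )
    # dbt build runs models, tests, seeds, and snapshots in dependency order.
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/dbt/project && dbt build --target prod",  # assumed target
    )
    dbt_deps >> dbt_build
```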