Posted: 4 days ago
On-site | Full Time
Strong data warehouse (DWH) experience.
Strong in at least one query language, such as MS SQL or PL/SQL.
Good understanding of cloud database best practices.
Experience with Snowflake and/or dbt is preferred but not mandatory.
Experience with CI/CD tools (e.g., GitHub).
Understanding of batch orchestration tools such as Apache Airflow and Control-M (a minimal Airflow sketch follows this list).
Experience with any kind of data migration project is nice to have.
Python is nice to have, but DWH skills are a must.
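For orientation, the sketch below shows the kind of daily batch orchestration referenced above, written as an Apache Airflow DAG. This is a minimal illustration only, assuming Airflow 2.4+; the DAG id, task names, and schedule are hypothetical and not part of any actual codebase for this role.

```python
# Minimal daily batch-load DAG sketch (assumes Apache Airflow 2.4+).
# The dag_id, task names, and schedule are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    # Pull the day's records from a source system (stubbed out here).
    print("extracting source data")


def load() -> None:
    # Load the staged records into the warehouse (stubbed out here).
    print("loading into the DWH")


with DAG(
    dag_id="daily_dwh_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract must finish before load runs
```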
Minimum Qualifications
Bachelor's degree in computer science, information technology, or a related field.
IT experience with a major focus on data warehouse/database-related projects.
Preferred Qualifications/Skills
You have experience in data warehousing, data modeling, and building data engineering pipelines.
You are well versed in data engineering methods such as ETL and ELT, implemented through scripting and/or tooling.
You are good at analyzing performance bottlenecks and providing enhancement recommendations. You have a passion for customer service and a desire to learn and grow as a professional and a technologist.
Strong analytical skills related to working with structured, semi-structured, and unstructured datasets.
Collaborating with product owners to identify requirements, define desired outcomes, and deliver trusted results.
Building processes supporting data transformation, data structures, metadata, dependency, and workload management.
Skilled in applying SCD (slowly changing dimensions), CDC (change data capture), and DQ/DV frameworks (a minimal SCD Type 2 sketch appears after this list).
Desire to continually keep up with advancements in data engineering practices.
Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
Experience with other data platforms: Oracle, SQL Server, MDM, etc.
Expertise in writing SQL and database objects: stored procedures, functions, and views. Hands-on experience with ETL/ELT, data security, SQL performance optimization, and job orchestration tools and technologies (e.g., dbt, APIs, Apache Airflow).
Experience in data modeling and relational database design.
Demonstrated ability to write new code that is well documented and stored in a version control system (we use GitHub and Bitbucket).
Good to have: experience with cloud platforms such as AWS, Azure, GCP, and Snowflake.
Good to have: strong programming/scripting skills (Python, PowerShell, etc.).
Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum in cross-functional teams (Product Owners, Scrum Masters, Architects, and data SMEs).
Excellent written and oral communication and presentation skills, with the ability to present architecture, features, and solution recommendations to global functional product portfolio technical leaders (Finance, HR, Marketing, Legal, Risk, IT), product owners, functional area teams across levels, and Global Data Product Portfolio Management teams (Enterprise Data Model, Data Catalog, Master Data Management).
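To make the SCD reference above concrete, here is a minimal Slowly Changing Dimension Type 2 sketch in Python/pandas. It is an illustration only: the `scd2_merge` helper and the column names (`is_current`, `valid_from`, `valid_to`) are assumptions, and production pipelines would typically express this logic in SQL or dbt rather than pandas.

```python
# SCD Type 2 sketch: expire changed current rows, append new versions.
# Function and column names are hypothetical; pandas is used for brevity.
import pandas as pd


def scd2_merge(dim: pd.DataFrame, incoming: pd.DataFrame,
               key: str, attrs: list[str], today: str) -> pd.DataFrame:
    current = dim[dim["is_current"]]
    merged = current.merge(incoming, on=key, suffixes=("", "_new"))
    # Keys whose tracked attributes differ from the current dimension row.
    changed = merged.loc[
        (merged[[f"{a}_new" for a in attrs]].to_numpy() !=
         merged[attrs].to_numpy()).any(axis=1),
        key,
    ]
    # Close out the old versions of the changed keys.
    mask = dim[key].isin(changed) & dim["is_current"]
    dim.loc[mask, ["is_current", "valid_to"]] = [False, today]
    # Append the incoming rows as the new current versions.
    new_rows = incoming[incoming[key].isin(changed)].assign(
        valid_from=today, valid_to=None, is_current=True
    )
    # Brand-new keys (not yet in dim) are omitted for brevity.
    return pd.concat([dim, new_rows], ignore_index=True)
```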
Genpact