Posted: 1 day ago
Platform: On-site
Part Time

Role Proficiency:
Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.
Role Overview

As a Data Quality Integration Engineer, you will play a key role in embedding data quality capabilities within our enterprise data landscape. You will focus on integrating industry-leading Data Quality tools (such as Ataccama and/or Collibra) with core platforms such as Snowflake and SQL databases, ensuring data quality processes are robust, scalable, and well governed.

Key Responsibilities

• Design, implement, and support the integration of Data Quality tools (Ataccama, Collibra, or similar) with Snowflake and SQL-based data platforms.
• Develop and maintain data pipelines and connectors enabling automated data quality assessments, profiling, cleansing, and monitoring.
• Collaborate with Data Governance, Data Architecture, and Data Engineering teams to align integration designs with business and governance requirements.
• Configure and manage Data Quality rules and workflows, ensuring alignment with business data quality KPIs, risk controls, and governance policies.
• Document integration solutions, workflows, and technical decisions to facilitate transparency and knowledge sharing.
• Troubleshoot integration issues, monitor performance, and optimise connector reliability and efficiency.
• Support user training and the adoption of data quality processes and tooling.

Required Skills and Experience

• Proven experience integrating Data Quality tools into enterprise data environments, ideally with Ataccama and/or Collibra.
• Hands-on experience with the Snowflake data warehousing platform and SQL databases (e.g., MS SQL Server, PostgreSQL, Oracle, MySQL).
• Strong SQL scripting and data pipeline development skills (preferably with Python, Scala, or similar).
• Thorough understanding of data quality concepts, including profiling, cleansing, enrichment, and monitoring.
• Experience working with data integration technologies and ETL frameworks.
• Knowledge of data governance frameworks, standards, and best practices.
• Familiarity with REST APIs and metadata integration is highly desirable.
• Experience in financial services, asset management, or other highly regulated sectors is advantageous.
• Strong documentation and communication skills.

Desirable

• Certification in Ataccama, Collibra, Snowflake, or related data governance/data engineering technologies.
• Experience with cloud platforms (AWS, Azure) hosting Snowflake and data tooling.
• Prior experience supporting data stewardship or data governance initiatives.
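The automated rule-based profiling mentioned in the responsibilities can be sketched in Python, one of the preferred pipeline languages. This is a minimal illustration, not the implementation used by any specific DQ tool: SQLite from the standard library stands in for Snowflake or another SQL database, and the table, columns, and rule names are hypothetical.

```python
import sqlite3


def profile_table(conn, table, rules):
    """Run each rule's failure predicate against a table and report counts.

    rules: list of (rule_name, sql_predicate) where the predicate selects
    rows that FAIL the rule.
    """
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    report = {"table": table, "row_count": total, "rule_failures": {}}
    for name, predicate in rules:
        failed = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {predicate}"
        ).fetchone()[0]
        report["rule_failures"][name] = failed
    return report


# Demo: an in-memory table (hypothetical schema) with intentionally bad rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (email TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO payments VALUES (?, ?)",
    [("a@x.com", 10.0), (None, 50.0), ("b@x.com", -5.0)],
)

rules = [
    ("email_not_null", "email IS NULL OR email = ''"),
    ("amount_non_negative", "amount < 0"),
]
report = profile_table(conn, "payments", rules)
print(report)
```

In a real integration, the same pattern of a named rule mapped to a failure predicate could be pushed down to the warehouse through a connector, with failing-row counts fed back to the DQ tool's monitoring and workflow layer.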
Skills: Data Quality, Snowflake, SQL Database
UST Global
Location: Trivandrum, Kerala, India
Salary: Not disclosed