Posted: 17 hours ago
Remote | Full Time
We are seeking a highly skilled and technically adept Data Governance Engineer to join our dynamic data team. The ideal candidate will have a strong background in developing and implementing data quality rules, along with extensive experience in database management and data quality assurance processes. You will play a crucial role in ensuring the accuracy, consistency, and reliability of our data systems through technical solutions.

Key Responsibilities:
- Develop, implement, and maintain comprehensive data governance rules and guidelines using advanced programming and scripting techniques.
- Design and build automated data validation frameworks to ensure the integrity of data across various systems.
- Collaborate with data engineers to integrate data quality checks into ETL pipelines and workflows.
- Create complex SQL queries to profile data and detect anomalies or deviations from expected patterns.
- Implement continuous monitoring solutions for data quality metrics and generate detailed reports to track performance over time.
- Develop and maintain documentation covering the technical implementation of data governance standards, procedures, and best practices.
- Provide technical training and support to teams on implementing data governance rules and best practices.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field; a Master's degree is a plus.
- Proven experience as a Data Quality Engineer or in a similar role with a focus on technical implementation.
- Advanced proficiency in SQL and experience with databases such as MySQL, PostgreSQL, and Oracle.
- Strong knowledge of ETL processes and tools (e.g., Informatica, Talend), with experience embedding data quality checks.
- Proficiency in programming languages such as Python or R for developing custom data validation scripts.
- Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure).
- Strong analytical skills, with the ability to develop complex algorithms for detecting patterns and anomalies in large datasets.
- Excellent problem-solving abilities focused on root cause analysis through technical means.
- Detail-oriented mindset with strong organizational skills for managing multiple projects simultaneously.

Preferred Skills:
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift).
- Familiarity with data visualization tools (e.g., Tableau, Power BI) for creating dashboards that monitor data quality metrics.
- Certifications related to data management or quality assurance.

We are looking for immediate joiners only. Please share your updated CV at monika.yadav@ness.com.
Ness