1376 Data Governance Jobs - Page 45

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Naukri logo

We are seeking a talented Data Engineer with strong expertise in Databricks, specifically Unity Catalog, PySpark, and SQL, to join our data team. You'll play a key role in building secure, scalable data pipelines and implementing robust data governance strategies using Unity Catalog.

Key Responsibilities
Design and implement ETL/ELT pipelines using Databricks and PySpark.
Work with Unity Catalog to manage data governance, access controls, lineage, and auditing across data assets.
Develop high-performance SQL queries and optimize Spark jobs.
Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
Ensure data quality and compliance across all stages of the data lifecycle.
Implement best practices for data security and lineage within the Databricks ecosystem.
Participate in CI/CD, version control, and testing practices for data pipelines.

Required Skills
Proven experience with Databricks and Unity Catalog (data permissions, lineage, audits).
Strong hands-on skills with PySpark and Spark SQL.
Solid experience writing and optimizing complex SQL queries.
Familiarity with Delta Lake, data lakehouse architecture, and data partitioning.
Experience with cloud platforms such as Azure or AWS.
Understanding of data governance, RBAC, and data security standards.

Preferred Qualifications
Databricks Certified Data Engineer Associate or Professional.
Experience with tools such as Airflow, Git, Azure Data Factory, or dbt.
Exposure to streaming data and real-time processing.
Knowledge of DevOps practices for data engineering.
Mandatory skill sets: Databricks
Preferred skill sets: Databricks
Education qualification: BE/BTech, ME/MTech, MBA, MCA
Degrees/Fields of Study required: Master of Computer Applications, Master of Business Administration, Bachelor of Engineering, Master of Engineering, Bachelor of Technology
Required Skills: Databricks Platform
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling
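The Unity Catalog duties in this listing (permissions, lineage, audits) are typically driven through SQL `GRANT` statements. The sketch below only composes such statements in plain Python; the catalog, table, and group names are hypothetical examples, and in a real Databricks workspace each string would be executed via `spark.sql(...)`.

```python
# Sketch: composing Unity Catalog permission grants as SQL strings.
# All catalog/schema/group names here are invented for illustration.

def grant_statement(privilege: str, table: str, principal: str) -> str:
    """Build a Unity Catalog GRANT statement for a three-level table name."""
    return f"GRANT {privilege} ON TABLE {table} TO `{principal}`"

grants = [
    grant_statement("SELECT", "main.sales.orders", "analysts"),
    grant_statement("MODIFY", "main.sales.orders", "data_engineers"),
]

for g in grants:
    # In a Databricks notebook this would run as: spark.sql(g)
    print(g)
```

Keeping grants as generated statements (rather than hand-typed SQL) makes the access model reviewable and repeatable across environments.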

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 19 Lacs

Hyderabad

Work from Office

Summary
Within the Operations Data domain team, this role is responsible for the design and implementation of Data Management, Business Process Design, and Governance processes, including the Data Operating Model, reporting to the LDC Core Data Lead and working in close collaboration with the Enterprise Data Owner (EDO) team and members of the Functional teams. This role will focus on establishing and developing the Novartis data capability in collaboration with the functions, as well as leading the implementation within LDC scope.

About the Role

Major accountabilities:
Accountable for overall global Material master business process design and improvement activities, in alignment with business goals and priorities and in close collaboration with the respective Solution teams and business process owners.
Accountable and responsible for ensuring consistency and completeness of the end-to-end design of Material master business processes and the underlying data design.
Accountable and responsible for designing the Material master data management processes to be comparable with best-in-class processes, and for identifying process improvement opportunities in line with Novartis guidelines and business-specific requirements.
Accountable for identifying digital solutioning options in close collaboration with the respective IT teams to ensure business user acceptance, enablement of business process automation capabilities, and best practices for data management processes.
Drive the overall plan for implementation and adoption of the Material master business process design in LDC releases, in close collaboration with Core and Adopt teams.
Responsible for gathering and implementing data requirements coming from Business Functions (Domain Pillars in LDC projects), GPO, the EDO team, and other dependent projects/programs.
Facilitate cultural change by improving data literacy across the business through training, education, and increased data consumption.
Act as the point of reference and contact for all queries related to Material master process and data design.
Drive the transition into the new ways of working defined by the Enterprise Operating Model within LDC scope.

Key performance indicators:
Delivery of key program milestones and deliverables on time and in quality, with full buy-in and support of country and global teams.

Minimum Requirements:
Education: master's degree or higher.
Work Experience:
At least 5 years' experience in a regional or global role in a material/product data related functional area such as Material master data management, Product master data management, or Supply chain master data, in a cross-functional setting.
Solid understanding of cross-functional master data management business process design and best practices in master data governance.
Experience with SAP MDG, SAP S/4HANA, materials management, and related data concepts.
Experience in SAP PLM / SAP EHS and/or Specification Management is an additional advantage.
Proven track record of detailing data concepts for material master, from both a conceptual and an operational governance perspective.
Proven track record of driving discussions and facilitating cross-functional decision making in a matrix organization.
Experience collaborating with diverse project teams, including functional solution teams, data governance teams, and system configuration workstreams.
Additional project management training or a certification/designation is desirable.

Skills:
Business acumen: very good understanding of Material master data models in connection with the operational significance of key data elements and cross-functional elements of data governance.
Curious and forward looking: looks for examples both inside and outside the company to identify fit-for-purpose design for the company.
Data savvy: proven experience analysing the as-is state and proposing solutions that are fit for purpose.
Technical and process knowledge: knowledge and understanding of driving data-driven business process definition and governance.
Collaboration and influencing skills: outspoken and experienced in interacting and driving solutioning in a cross-functional matrix organization; excellent interpersonal communication skills to drive conversations with virtual and diverse audiences.
Languages: fluency in business English is a must.

Posted 1 month ago

Apply

12.0 - 15.0 years

14 - 18 Lacs

Hyderabad

Work from Office

Lead the technical vision and strategy for the Data Engineering Center of Excellence across cloud platforms (GCP, Azure, AWS), cloud-agnostic platforms (Databricks, Snowflake), and legacy systems. This leadership role will establish architectural standards and best practices while providing pre-sales leadership for strategic opportunities.

Key Responsibilities
Define and drive the technical vision and roadmap for the Data Engineering CoE.
Establish cross-cloud architecture standards and best practices with emphasis on Azure, GCP, and AWS.
Lead pre-sales activities for strategic opportunities, particularly AWS-, Azure-, and GCP-focused clients.
Build the CoE's accelerator development framework.
Mentor and guide pillar architects across all platforms.
Drive platform selection decisions and integration strategies.
Establish partnerships with key technology providers, especially cloud providers.
Define governance models for the CoE implementation.
Represent the organization as a technical thought leader in client engagements.

Requirements
12+ years of data engineering experience, with 6+ years in leadership roles.
Deep expertise in Google Cloud Platform data services (BigQuery, Dataflow, Dataproc).
Strong knowledge of other cloud platforms (Azure Fabric/Synapse, Data Factory,

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Hyderabad

Work from Office

Responsibilities and Accountabilities:
Designing and building cloud-based data infrastructure and data pipelines using cloud services such as AWS.
Developing, testing, and deploying data integration processes that move data from various sources into cloud-based data warehouses or data lakes.
Collaborating with data scientists, business analysts, and other stakeholders to identify data requirements and develop appropriate data solutions.
Implementing and managing data governance policies, data quality, and data security measures to ensure data accuracy, consistency, and privacy.
Managing and monitoring cloud-based data infrastructure and data pipelines to ensure data availability, scalability, and reliability.
Taking business specification requirements through design, development, testing, and deployment.
Developing a strong understanding of business requirements, working with business users and business analysts to define technical and process requirements.
Building effective working relationships with team members and cross-functional colleagues.

About Experian
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics, and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Experience and Skills
Qualified to degree level in science or engineering, preferably.
Cloud experience: S3, Step Functions, Glue, and Airflow.
Good Python development skills for data transfers and extractions (ELT and ETL).
5 to 8 years of development experience building data pipelines using cloud technologies.
5+ years of experience in the architecture of modern data warehousing.
Excellent problem solving, design, debugging, and testing skills.
Ability to work with many different personality types while managing workload within a fast-paced, energetic, and dynamic workplace.

Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are key differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward and recognition, volunteering... the list goes on. Experian's strong people-first approach is award winning: Great Place To Work in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally 4.4 stars), to name a few. Check out Experian Life on social media or our Careers Site to understand why.
Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, colour, sexuality, physical ability, or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.
Experian Careers - Creating a better tomorrow together.

Posted 1 month ago

Apply

15.0 - 17.0 years

40 - 45 Lacs

Bengaluru

Work from Office

To propel our next phase of growth, ServiceNow is investing in a Financial Services Industry Solutions organization. We're building a team of collaborative individuals who are passionate about the opportunity to transform Financial Services using ServiceNow's powerful digital workflow platform. Join a team that re-shapes Financial Services, partnering with leading financial institutions and the most inspiring Fintechs in the world. We are always evolving. We are passionate about our product, and we live for our customers. We have high expectations, and a career at ServiceNow means challenging yourself to always be better.

We are looking for a Sr. Principal, Financial Services Architect to be part of our Financial Services Industry Solutions organization. You are a Financial Services industry expert with knowledge of the Financial Services solution stack across the application, workflow, and data/integration layers. You understand the core Financial Services business processes and the surrounding technology ecosystem, including systems of record, third-party data providers, and the Fintech/Regtech landscape. You have a technical vision for ServiceNow's opportunity to transform Financial Services and will work with customers and partners on solution architectures that realize this vision. You will work with pre-sales technical resources to gather solution requirements and help them build solution roadmaps and architectures. You will have visibility within ServiceNow and the opportunity to deliver results during sales cycles while achieving quarterly and annual sales goals for an assigned territory.

What you get to do in this role:
Solution Architecture: Architect Financial Services solutions built on the ServiceNow platform, both using current horizontal applications and platform services and driving requirements for Financial Services-specific product development. Consider future-state and transitional architectures in Financial Services.
Asset Creation: Create reference architectures and technical enablement guides for Financial Services solutions. Work with Solution Consulting teams to develop demo prototypes for solutions.
Field and Partner Enablement: Enable internal pre-sales technical resources and the System Integrator and Independent Software Vendor partner ecosystem on Financial Services solutions and architecture. Build a Financial Services community of technical ambassadors enabled on the ServiceNow Financial Services message and solution architecture.
Customer Engagement: Work with customers to create technology roadmaps around Financial Services solutions and to dive into architecture components. Bring deep technical knowledge of Financial Services to improve strategic opportunities and identify new use cases.
Market-Facing Engagement: Present to large customer audiences and build relationships with CxOs.

To be successful in this role you have:
Experience in leveraging, or critically thinking about how to integrate, AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
A minimum of 15 years of related work experience.
Experience working with the ServiceNow platform and ServiceNow customers, and an in-depth understanding of the ServiceNow architecture and platform.
Understanding of the top Financial Services regulations globally, with the ability to assess their impact on the architecture of a ServiceNow solution in Financial Services.
Experience working with sales, with the ability to work as an extended part of account teams.
Ability to provide expertise to, and work with, internal ServiceNow product teams.
Ability to interact at multiple levels within a customer account (Enterprise Architects, Technical Architects, Directors, VPs, and CxOs).
Ability to travel up to 30% of the time.
Knowledge of enterprise integration, service-oriented architectures, and micro-services.
Knowledge of security, data privacy, and data governance within Financial Services.
Instant customer credibility, with a record of building customer relationships.

Posted 1 month ago

Apply

8.0 - 13.0 years

14 - 18 Lacs

Bengaluru

Work from Office

The Solution Architect - Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations.

Key Responsibilities:
Data Architecture & Design: Design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms.
Data Integration & ETL Processes: Lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance.
Cognos Reporting: Oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations.
Data Engineering: Design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization.
Cloud Infrastructure: Develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms).
SQL Development: Write and optimize complex SQL queries for data extraction, manipulation, and reporting purposes, with a focus on performance and scalability.
Data Governance & Quality: Ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations).
Collaboration: Work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights.
Solution Architecture: Provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth.
Performance Optimization: Continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs).
Documentation: Create and maintain comprehensive technical documentation, including architecture diagrams, ETL process flows, and data dictionaries.

Required Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions.
Hands-on experience with Cognos (for reporting and dashboarding) and DB2 (for database management).
Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics.
Extensive experience in ETL development, data integration, and data transformation processes.
Strong knowledge of Python and SQL (advanced query writing, optimization, and troubleshooting).
Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud).
Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17).
Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance.
Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders.

Preferred Qualifications:
Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift).
Knowledge of machine learning workflows, leveraging Databricks for model training and deployment.
Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations.
Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies.
Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus.

Key Competencies:
Technical Leadership: Ability to guide and mentor development teams in implementing best practices for data architecture and engineering.
Analytical Skills: Strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability.
Collaborative Mindset: Ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders.
Attention to Detail: Meticulous attention to detail, ensuring high-quality data output and system performance.

Posted 1 month ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Data Quality:
Define and Measure Data Quality Metrics: Establish metrics for accuracy, completeness, validity, consistency, timeliness, and reliability.
Continuous Monitoring and Remediation: Regularly monitor data quality, conduct audits, perform root cause analysis for recurring data issues, and implement preventive measures and remediation plans.
Data Profiling: Develop and maintain comprehensive data profiles to understand data characteristics.
Data Validation: Create and implement validation rules to ensure that incoming data conforms to expected formats and values.
Data Cleansing: Design and execute data cleansing processes to correct errors and inconsistencies, enhancing overall data quality and reliability.

Data Governance:
Establish Governance Framework: Implement and enforce data governance practices to ensure compliance with regulatory requirements and corporate policies, ensuring data is managed according to best practices.
Metadata Management: Develop and maintain a comprehensive metadata repository to document data definitions, lineage, and usage, ensuring it is kept up to date and accessible to end users.
Understand User Needs: Collaborate with business users to identify data needs, pain points, and requirements, ensuring the data is fit for its intended use.
Identify Improvement Areas: Continuously seek opportunities for process improvement in data governance and quality management.
User Roles and Access Requirements: Understand user roles and access requirements for source systems, so that similar protection can be implemented in the analytical solutions.
Row-Level Security: Work with the data & analytics team to establish row-level security for analytical solutions, ensuring data is accessible only to authorised users.

Continuous Improvement:
Establish Naming Conventions: Define business-friendly table and column names, along with synonyms, to make data easily accessible using AI.
Create Synonyms: Implement synonyms to simplify data access and enhance data readability.
Reporting: Establish KPIs for data governance and quality efforts and create regular reports for stakeholders to track progress and demonstrate the value of data governance initiatives.
Feedback Loop: Establish a feedback loop where users can report data quality issues and suggest improvements.
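The data quality metrics named in this listing (completeness, validity, and so on) reduce to simple ratios over a dataset. A minimal sketch in plain Python, assuming an invented `email` field and an illustrative validity rule (in practice these checks would run inside a profiling or quality tool):

```python
# Sketch: two of the standard data quality metrics computed over
# in-memory records. Field names and the validity rule are assumptions.

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "not-an-email"},
]

def completeness(rows, field):
    """Share of rows where the field is present and non-null."""
    return sum(r.get(field) is not None for r in rows) / len(rows)

def validity(rows, field, rule):
    """Share of non-null values that satisfy the given rule."""
    values = [r[field] for r in rows if r.get(field) is not None]
    return sum(rule(v) for v in values) / len(values)

print(round(completeness(records, "email"), 2))  # 2 of 3 values present
print(round(validity(records, "email", lambda v: "@" in v), 2))  # 1 of 2 valid
```

Thresholds on such ratios are what typically feed the monitoring and remediation loop described above.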

Posted 1 month ago

Apply

8.0 - 12.0 years

13 - 17 Lacs

Noida

Work from Office

Job Title: Data Architect
Location: Jewar Airport, Noida
Experience: 8+ years

We are looking for a Data Architect to oversee our organization's data architecture, governance, and product lifecycle. The role focuses on managing data layers, maintaining data governance frameworks, and creating data products aligned with business objectives.

Key Responsibilities:
Design and maintain the Lakehouse architecture, including data lake setup and management.
Create and maintain data products, ensuring their alignment with business needs.
Develop and enforce data governance policies, including the maintenance of a data catalog.
Design data models and define database development standards.
Automate workflows using Python, CI/CD pipelines, and unit tests.

Required Skills and Experience:
Extensive experience in data architecture and data platform management.
Expertise in data governance, data modeling, and database development.
Proficiency in Python for automation and pipeline development.
Familiarity with Azure data services and data processing pipelines.
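The "unit tests" responsibility above can be illustrated with a minimal sketch: a pipeline transform plus an assertion-style test over it. The transform, its field names, and its normalization rules are hypothetical examples, not part of the posting.

```python
# Sketch: a tiny pipeline transform with a plain-assert unit test,
# of the kind a CI/CD pipeline would run on every commit.
# The record shape and cleaning rules are invented for illustration.

def clean_record(record: dict) -> dict:
    """Normalize a raw record: strip whitespace, uppercase the country code."""
    return {
        "name": record["name"].strip(),
        "country": record["country"].strip().upper(),
    }

def test_clean_record():
    raw = {"name": "  Acme  ", "country": " in "}
    assert clean_record(raw) == {"name": "Acme", "country": "IN"}

test_clean_record()
print("ok")
```

Keeping transforms as small pure functions like this is what makes pipeline code testable outside the orchestration framework.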

Posted 1 month ago

Apply

15.0 - 20.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Job Title: Data Analytics Lead
Experience: 15-20 years
Location: Bengaluru

Key Responsibilities
Team Leadership & Delivery Excellence: Lead a cross-functional team comprising data architects, analysts, business SMEs, and technologists to deliver high-impact data analytics solutions. Define and enforce best practices for efficient, scalable, and high-quality delivery. Inspire a culture of collaboration, accountability, and continuous improvement within the team.
Strategic Data Leadership: Develop and execute a data strategy aligned with client business objectives, ensuring seamless integration of analytics into decision-making processes. Collaborate with stakeholders to translate business needs into actionable data solutions, influencing strategic decisions.
Technical and Architectural Expertise: Architect and oversee data platforms, including SQL Server, Snowflake, and Power BI, ensuring optimal performance, scalability, and governance. Lead initiatives in Data Architecture, Data Modeling, and Data Warehouse (DWH) development, tailored to alternative investment strategies. Evaluate emerging technologies, such as big data and advanced analytics tools, and recommend their integration into client solutions. Champion data quality, integrity, and security, aligning with compliance standards in private equity and alternative investments.
Performance & Metrics: Define and monitor KPIs to measure team performance and project success, ensuring timely delivery and measurable impact. Collaborate with stakeholders to refine reporting, dashboarding, and visualization for decision support.
Governance & Compliance: Establish robust data governance frameworks in partnership with client stakeholders. Ensure adherence to regulatory requirements impacting private markets investments, including fund accounting and compliance.

What's on offer
Competitive and above-market salary.
Flexible hybrid work schedule with tools for both office and remote productivity.
Hands-on exposure to cutting-edge technology and global financial markets.
Opportunity to collaborate directly with international teams in New York and London.

Candidate Profile
Experience:
15+ years of progressive experience in program or project management within the capital markets and financial services sectors.
Demonstrated expertise in SQL Server, Snowflake, Power BI, ETL processes, and Azure cloud data platforms.
Hands-on experience with big data technologies and modern data architecture.
Proven track record in delivering projects emphasizing data quality, integrity, and accuracy.
Deep understanding of private markets, including areas such as private equity, private credit, CLOs, compliance, and regulations governing alternative investments.
Leadership & Collaboration:
Exceptional problem-solving skills and decision-making abilities in high-pressure, dynamic environments.
Experience leading multi-disciplinary teams to deliver large-scale data initiatives.
Strong client engagement and communication skills, fostering alignment and trust with stakeholders.
Preferred Certifications: Relevant certifications (e.g., CFA, Snowflake Certified Architect, or Microsoft Power BI Certified).
Education: Bachelor's degree in Computer Science, IT, Finance, Economics, or a related discipline; advanced degrees (MBA, MS in Computer Science, or related fields) preferred.

Interview Process
Initial recruiter call.
Interview with the technical team and the delivery and account leadership team at ThoughtFocus.
Interview with the client stakeholders.
Final HR discussion.

Posted 1 month ago

Apply

0.0 - 5.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer - DBT (Data Build Tool)
Experience: 0-5 years
Location: Bengaluru

Job Responsibilities
Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS.
Requirements definition, source data analysis and profiling, the logical and physical design of the data lake and data warehouse, and the design of data integration and publication pipelines.
Develop Snowflake deployment and usage best practices.
Help educate the rest of the team on the capabilities and limitations of Snowflake.
Build and maintain data pipelines adhering to suggested enterprise architecture principles and guidelines.
Design, build, test, and maintain data management systems.
Work in sync with internal and external team members such as data architects, data scientists, and data analysts to handle all sorts of technical issues.
Act as a technical leader within the team.
Work in an Agile/Lean model.
Deliver quality deliverables on time.
Translate complex functional requirements into technical solutions.

EXPERTISE AND QUALIFICATIONS
Essential Skills, Education and Experience:
B.E. / B.Tech. / MCA or equivalent degree, along with 4-7 years of experience in Data Engineering.
Strong experience with DBT concepts such as model building and configuration, incremental load strategies, macros, and DBT tests.
Strong experience in SQL.
Strong experience in AWS.
Creation and maintenance of an optimal data pipeline architecture for ingestion and processing of data.
Creation of the necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3, and Snowflake.
Experience with data storage technologies such as Amazon S3, SQL, and NoSQL.
Technical awareness of data modeling.
Experience working with stakeholders in different time zones.
Good to have: AWS data services development experience; working knowledge of big data technologies; experience collaborating with data quality and data governance teams; exposure to reporting tools like Tableau; Apache Airflow and Apache Kafka (nice to have); in-depth understanding of the payments domain (CRM, Accounting, etc.); regulatory reporting exposure.

Other Skills
Good communication skills; team player; problem solver; willing to learn new technologies, share your ideas, and assist other team members as needed.
Strong analytical and problem-solving skills: ability to define problems, collect data, establish facts, and draw conclusions.
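The "incremental load strategies" this listing asks for (in DBT, an incremental model) reduce to one core idea: append only the source rows newer than the highest timestamp already loaded. A minimal sketch in plain Python, with hypothetical table contents and column names; in DBT itself this logic lives in a model's `WHERE` clause against `{{ this }}`.

```python
# Sketch: high-water-mark incremental loading in plain Python.
# Row shapes and the "updated_at" column name are invented examples.

target = [{"id": 1, "updated_at": "2024-01-01"}]
source = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-02-01"},
]

def incremental_rows(source, target, ts_field="updated_at"):
    """Select source rows strictly newer than the max timestamp in target."""
    high_water = max((r[ts_field] for r in target), default="")
    return [r for r in source if r[ts_field] > high_water]

new_rows = incremental_rows(source, target)
target.extend(new_rows)  # only id 2 crosses the high-water mark
print([r["id"] for r in new_rows])
```

ISO-8601 date strings compare correctly as plain strings, which is why the sketch gets away without parsing them.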

Posted 1 month ago

Apply

3.0 - 4.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Job Title:Data Quality Engineer Experience3-4 Years Location:Bangalore : We are seeking a detail-oriented and highly motivated Data Quality Engineerto join our growing data team. In this role, you will be responsible for designing, implementing, and maintaining data quality frameworks to ensure the accuracy, completeness, consistency, and reliability of enterprise data. You will work closely with business stakeholders, data stewards, and data engineers to enforce data governance policies and utilize tools like Ataccamato support enterprise data quality initiatives. We only need immediate joiners. Key Responsibilities: Design and implement robust data quality frameworksand rules using Ataccama ONEor similar data quality tools. Develop automated data quality checks and validation routines to proactively detect and remediate data issues. Collaborate with business and technical teams to define data quality metrics, thresholds, and standards. Support the data governance strategyby identifying critical data elements and ensuring alignment with organizational policies. Monitor, analyze, and report on data quality trends, providing insights and recommendations for continuous improvement. Work with data stewards to resolve data issues and ensure adherence to data quality best practices. Support metadata management, data lineage, and data profiling activities. Document processes, data flows, and data quality rules to facilitate transparency and reproducibility. Conduct root cause analysis on data issues and implement corrective actions to prevent recurrence. Required Skills and Qualifications: Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field. 3+ years of experience in a Data Quality, Data Governance, or Data Engineering role. Hands-on experience with Ataccama ONE or similar data quality tools, including rule creation, data profiling, and issue management. 
Strong knowledge of data governance frameworks, principles, and best practices. Proficient in SQL and data analysis with the ability to query complex datasets. Experience with data management platforms and enterprise data ecosystems. Excellent problem-solving skills and attention to detail. Strong communication and stakeholder engagement skills. Preferred Qualifications: Experience with cloud data platforms (e.g., Snowflake, AWS, Azure). Familiarity with data catalog tools (e.g., Collibra, Alation). Knowledge of industry data standards and regulatory requirements (e.g., GDPR, HIPAA).
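To illustrate the kind of automated validation routine this role describes, here is a minimal, generic rule-based check in plain Python. This is only a sketch: in practice a tool like Ataccama ONE defines and runs such rules, and the rule names, fields, and sample records below are invented for illustration.

```python
# Illustrative sketch of rule-based data quality checks (not Ataccama itself).
# Rule names, fields, and sample records are invented for demonstration.
import re

# Each rule: (name, predicate applied to one record). A record passes a rule
# when the predicate returns True.
RULES = [
    ("completeness: email present", lambda r: bool(r.get("email"))),
    ("validity: email format",
     lambda r: bool(re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", r.get("email") or ""))),
    ("validity: age in range",
     lambda r: isinstance(r.get("age"), int) and 0 <= r["age"] <= 120),
]

def run_checks(records):
    """Return {rule_name: list of offending record indexes}."""
    violations = {name: [] for name, _ in RULES}
    for i, rec in enumerate(records):
        for name, pred in RULES:
            if not pred(rec):
                violations[name].append(i)
    return violations

sample = [
    {"email": "a@example.com", "age": 34},
    {"email": "", "age": 34},               # fails completeness and format
    {"email": "not-an-email", "age": 200},  # fails format and range
]
report = run_checks(sample)
```

A report like this is the raw input for the trend monitoring and root cause analysis the posting mentions: each rule's violation list can be tracked over time and traced back to specific records.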

Posted 1 month ago

Apply

4.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Sustainability Data Quality Manager

Duties & responsibilities: The Sustainability Data Quality Manager will join JLL's Sustainability Data and Reporting team to support our data management, platform, compliance and reporting functions. The role will report to the team's Regional Lead but work across our global client portfolio and multiple stakeholder groups to deliver regular data quality analytics reviews and reports, coordinating the resolution of issues found. The position is required to work collaboratively across internal global business lines, including JLL's Client Account, Technology and Operations teams, to help manage stakeholder expectations and maintain high quality service delivery. The candidate will have experience in delivering multiple programs of work in parallel and applying a strategy that learns from and leverages challenges and opportunities observed across the board.

The role will be responsible for several tasks, including: Partnering with business and technology teams to design, implement, and optimize data quality processes that support business operations and analytics. Developing a detailed understanding of key tracking and reporting platforms, including internal tools and how we support our clients in measuring their sustainability performance. Coordinating and managing adherence to the QA/QC process for key business groups. Identifying, assessing, and documenting data quality issues and their impact on business operations. Developing a detailed understanding of data structures within clients' data.

Performance objectives: Ability to actively manage concurrent projects and a strong talent for project coordination. Regularly communicate in a clear and non-technical way to internal JLL users. Be an integral part of the data and reporting team, completing a full review of data quality practices, identifying trends and potential issues, and communicating with others to implement changes and improvements as necessary.
Identify support, training and management processes that can be improved to increase scalability and efficiency.

Key skills: Ability to see patterns and spot trends within and across large data sets, applying an understanding of sustainability performance. Strong organizational and analytical skills, process-driven, with an orientation toward continuous improvement. Ability to clearly identify issues with data and raise them to the appropriate stakeholder. Ability to meet milestone dates and raise concerns early and often. Able to determine root causes of data discrepancies and recommend long-term solutions.

Candidate specification: 5+ years' experience in a similar role. High proficiency in Microsoft Excel and data management. Knowledge of other analytical tools such as Power BI, Tableau, Python, or SQL. Excellent communication skills, including the ability to identify and describe data anomalies and provide solutions accordingly. Lateral thinking and problem-solving skills. Ability to multi-task and manage priorities to meet deadlines. Familiarity with sustainability and carbon emissions reporting will be a strong advantage. Project management experience would be an advantage. This role requires high attention to detail and a strong process-driven approach.

Location: On-site – Bengaluru, KA / Mumbai, MH. Scheduled Weekly Hours: 40.

If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements. We're interested in getting to know you and what you bring to the table!

JLL Privacy Notice: Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process.
We endeavour to keep your personal information secure with an appropriate level of security and keep it for as long as we need it for legitimate business or legal reasons. We will then delete it safely and securely. Candidate Privacy Statement. For candidates in the United States, please see a full copy of our Equal Employment Opportunity and Affirmative Action policy here. Jones Lang LaSalle ("JLL") is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process – including the online application and/or overall selection process – you may contact us at Accommodation Requests. This email is only to request an accommodation. Please direct any other general recruiting inquiries to our Contact Us page.

Posted 1 month ago

Apply

3.0 - 6.0 years

9 - 14 Lacs

Mumbai

Work from Office


Role Overview: We are looking for a Talend Data Catalog Specialist to drive enterprise data governance initiatives by implementing Talend Data Catalog and integrating it with Apache Atlas for unified metadata management within a Cloudera-based data lakehouse. The role involves establishing metadata lineage, glossary harmonization, and governance policies to enhance trust, discovery, and compliance across the data ecosystem.

Key Responsibilities:
o Set up and configure Talend Data Catalog to ingest and manage metadata from source systems, the data lake (HDFS), Iceberg tables, the Hive metastore, and external data sources.
o Develop and maintain business glossaries, data classifications, and metadata models.
o Design and implement bi-directional integration between Talend Data Catalog and Apache Atlas to enable metadata synchronization, lineage capture, and policy alignment across the Cloudera stack.
o Map technical metadata from Hive/Impala to business metadata defined in Talend.
o Capture end-to-end lineage of data pipelines (e.g., from ingestion in PySpark to consumption in BI tools) using Talend and Atlas.
o Provide impact analysis for schema changes, data transformations, and governance rule enforcement.
o Support definition and rollout of enterprise data governance policies (e.g., ownership, stewardship, access control).
o Enable role-based metadata access, tagging, and data sensitivity classification.
o Work with data owners, stewards, and architects to ensure data assets are well-documented, governed, and discoverable.
o Provide training to users on leveraging the catalog for search, understanding, and reuse.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: 6–12 years in data governance or metadata management, with at least 2–3 years in Talend Data Catalog. Talend Data Catalog, Apache Atlas, Cloudera CDP, Hive/Impala, Spark, HDFS, SQL.
Business glossary, metadata enrichment, lineage tracking, stewardship workflows. Hands-on experience in Talend–Atlas integration, whether through REST APIs, Kafka hooks, or metadata bridges.
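As a rough illustration of the REST-based Talend–Atlas integration mentioned above: Apache Atlas exposes a v2 REST API that accepts entity JSON. The sketch below only builds a minimal, hypothetical payload for a Hive table (the type name and attribute keys follow Atlas conventions, but the database, table, cluster, and owner values are invented); actually synchronizing metadata would require an authenticated HTTP POST of this body to the Atlas entity endpoint, which is out of scope here.

```python
# Sketch: construct a minimal Apache Atlas v2 entity payload for a Hive table.
# The concrete values (sales.orders@prod, data_steward) are invented; a real
# integration would POST this JSON to the Atlas REST entity endpoint.
import json

def hive_table_entity(db, table, cluster, owner):
    # Atlas conventionally identifies Hive tables by "db.table@cluster".
    qualified = f"{db}.{table}@{cluster}"
    return {
        "entity": {
            "typeName": "hive_table",
            "attributes": {
                "qualifiedName": qualified,
                "name": table,
                "owner": owner,
            },
        }
    }

payload = hive_table_entity("sales", "orders", "prod", "data_steward")
body = json.dumps(payload)  # what would be sent as the HTTP request body
```

Keeping payload construction separate from transport, as here, makes the mapping from Talend's technical metadata to Atlas entities easy to unit-test without a live cluster.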

Posted 1 month ago

Apply

2.0 - 6.0 years

6 - 11 Lacs

Hyderabad

Work from Office


As a senior SAP Consultant, you will serve as a client-facing practitioner working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology or equivalent and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include: Strategic SAP Solution Focus: Working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs. Comprehensive Solution Delivery: Involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Overall 5-12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ and 3+ years of SAP functional experience specializing in design and configuration of SAP BODS/HANA SDI modules. Experience in gathering business requirements; able to create requirement specifications based on architecture, design, and detailing of processes. Able to prepare mapping sheets combining functional and technical expertise. All BODS consultants should primarily have data migration experience from different legacy systems to SAP or non-SAP systems. Data migration experience from SAP ECC to S/4HANA using the Migration Cockpit or other methods.
In addition to data migration experience, the consultant should have experience with or strong knowledge of BOIS (BO Information Steward) for data profiling and data governance.

Preferred technical and professional experience: BODS admin experience/knowledge. Working or strong knowledge of SAP Data Hub. Experience with or strong knowledge of HANA SDI (Smart Data Integration), using it as an ETL tool and developing flow graphs to validate/transform data. Ability to develop workflows and data flows based on specifications using various stages in BODS.

Posted 1 month ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Mumbai

Work from Office


We are seeking a skilled Python Developer with expertise in Django, Flask, and API development to join our growing team. The Python Developer will be responsible for designing and implementing backend services, APIs, and integrations that power our core platform. The ideal candidate should have a strong foundation in Python programming, experience with Django and/or Flask frameworks, and a proven track record of delivering robust and scalable solutions.

Primary Skill / Responsibilities: Design, develop, and maintain backend services and APIs using Python frameworks such as Django and Flask. Collaborate with front-end developers, product managers, and stakeholders to translate business requirements into technical solutions. Build and integrate RESTful APIs for seamless communication between our applications and external services.

Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience. 5+ years of professional experience as a Python Developer, with a focus on backend development.
Secondary Skill Amazon Elastic File System (EFS) Amazon Redshift Amazon S3 Apache Spark Ataccama DQ Analyzer AWS Apache Airflow AWS Athena Azure Data Factory Azure Data Lake Storage Gen2 (ADLS) Azure Databricks Azure Event Hub Azure Stream Analytics Azure Synapse Analytics BigID C++ Cloud Storage Collibra Data Governance (DG) Collibra Data Quality (DQ) Data Lake Storage Data Vault Modeling Databricks DataProc DDI Dimensional Data Modeling EDC AXON Electronic Medical Record (EMR) Extract, Transform & Load (ETL) Financial Services Logical Data Model (FSLDM) Google Cloud Platform (GCP) BigQuery Google Cloud Platform (GCP) Bigtable Google Cloud Platform (GCP) Dataproc HQL IBM InfoSphere Information Analyzer IBM Master Data Management (MDM) Informatica Data Explorer Informatica Data Quality (IDQ) Informatica Intelligent Data Management Cloud (IDMC) Informatica Intelligent MDM SaaS Inmon methodology Java Kimball Methodology Metadata Encoding & Transmission Standards (METS) Metasploit Microsoft Excel Microsoft Power BI NewSQL noSQL OpenRefine OpenVAS Performance Tuning Python R RDD Optimization SaS SQL Tableau Tenable Nessus TIBCO Clarity
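To give a concrete flavor of the RESTful backend work this posting describes, here is a deliberately minimal JSON endpoint sketched with only Python's standard WSGI interface. It is illustrative, not a recommendation: the route, the in-memory `ITEMS` store, and the response shape are invented, and a real service would use Django or Flask as the posting says.

```python
# Minimal WSGI sketch of a JSON REST endpoint (illustrative only; a real
# backend would use Django or Flask). Routes and data are invented.
import json

ITEMS = {"1": {"id": "1", "name": "example"}}  # stand-in "database"

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path.startswith("/items/"):
        item = ITEMS.get(path.rsplit("/", 1)[-1])
        if item is not None:
            body = json.dumps(item).encode()
            start_response("200 OK", [("Content-Type", "application/json")])
            return [body]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [b'{"error": "not found"}']

# Exercise the app without a network socket, the way a unit test would.
def call(path):
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    body = b"".join(app({"PATH_INFO": path}, start_response))
    return captured["status"], body

status, body = call("/items/1")
```

The same request/response contract carries over directly to a Django view or a Flask route; only the framework plumbing changes.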

Posted 1 month ago

Apply

7.0 - 11.0 years

15 - 20 Lacs

Mumbai

Work from Office


This role requires a deep understanding of data warehousing, business intelligence (BI), and data governance principles, with a strong focus on the Microsoft technology stack. Data Architecture: Develop and maintain the overall data architecture, including data models, data flows, and data quality standards. Design and implement data warehouses, data marts, and data lakes on the Microsoft Azure platform. Business Intelligence: Design and develop complex BI reports, dashboards, and scorecards using Microsoft Power BI. Data Engineering: Work with data engineers to implement ETL/ELT pipelines using Azure Data Factory. Data Governance: Establish and enforce data governance policies and standards.

Primary Skills / Experience: 15+ years of relevant experience in data warehousing, BI, and data governance. Proven track record of delivering successful data solutions on the Microsoft stack. Experience working with diverse teams and stakeholders.

Required Skills and Experience / Technical Skills: Strong proficiency in data warehousing concepts and methodologies. Expertise in Microsoft Power BI. Experience with Azure Data Factory, Azure Synapse Analytics, and Azure Databricks. Knowledge of SQL and scripting languages (Python, PowerShell). Strong understanding of data modeling and ETL/ELT processes.

Secondary Skills / Soft Skills: Excellent communication and interpersonal skills. Strong analytical and problem-solving abilities. Ability to work independently and as part of a team. Strong attention to detail and organizational skills.

Posted 1 month ago

Apply

1.0 - 3.0 years

7 - 10 Lacs

Coimbatore

Work from Office


The Opportunity: Entry-level position; support Avantor's data management strategies by investigating and resolving data quality issues in enterprise applications via deletion and merging, while safeguarding against data loss. Execute mass data management processes while ensuring data quality. Manage documentation, updates to the Data Dictionary, and data management training materials, under the guidance of the Enterprise Data Management & Analytics team. Coordinate and conduct mass data imports into core systems and mass data-cleansing initiatives, ensuring integrity and eliminating redundancy from corporate databases.

Job Summary: The Junior Associate in Customer & Vendor Master Data will be responsible for maintaining, updating, and ensuring the accuracy of customer and vendor information in the organization's database. This role requires high attention to detail and the ability to work collaboratively with internal teams and external stakeholders to support data integrity and smooth business operations. Experience - 0 to 1 Year.

Key Responsibilities: Maintain and update customer and vendor master data within the company's database, ensuring accuracy and completeness. Verify and validate new customer and vendor data by liaising with relevant departments or stakeholders. Assist with the creation and review of data entry guidelines and processes. Support the data entry process for both new customers and vendors, as well as modifications to existing records. Ensure compliance with data governance standards, including privacy policies and regulations. Collaborate with internal teams (e.g., Sales, Procurement, Finance) to resolve any discrepancies or issues related to master data. Monitor data quality and take proactive steps to identify and resolve data inaccuracies. Assist in running regular data audits and clean-up activities to maintain the integrity of the customer and vendor database. Prepare and maintain reports related to master data for review by management.
Provide support for system upgrades or data migration activities, ensuring data integrity is maintained. Assist in handling queries related to master data from both internal and external stakeholders.

Qualifications: Any degree. Strong attention to detail and accuracy in data entry and management. Basic understanding of database management systems and data governance principles. Proficient in Microsoft Office Suite (Excel, Word, etc.). Strong communication and interpersonal skills to collaborate with various teams. Ability to manage multiple tasks with competing deadlines. Previous experience in data management or administrative support is a plus.

Skills and Competencies: Attention to detail and accuracy in handling sensitive data. Analytical mindset with the ability to identify and resolve discrepancies. Strong organizational and time management skills. Problem-solving and troubleshooting abilities. Ability to work independently and as part of a team.

Disclaimer: The above statements are intended to describe the general nature and level of work being performed by employees assigned to this classification. They are not intended to be construed as an exhaustive list of all responsibilities, duties and skills required of employees assigned to this position. Avantor is proud to be an equal opportunity employer.

Why Avantor? Dare to go further in your career. Join our global team of 14,000+ associates whose passion for discovery and determination to overcome challenges relentlessly advances life-changing science. The work we do changes people's lives for the better. It brings new patient treatments and therapies to market, giving a cancer survivor the chance to walk his daughter down the aisle. It enables medical devices that help a little boy hear his mom's voice for the first time. Outcomes such as these create unlimited opportunities for you to contribute your talents, learn new skills and grow your career at Avantor.
We are committed to helping you on this journey through our diverse, equitable and inclusive culture which includes learning experiences to support your career growth and success. At Avantor, dare to go further and see how the impact of your contributions set science in motion to create a better world. Apply today! EEO Statement: We are an Equal Employment/Affirmative Action employer and VEVRAA Federal Contractor. We do not discriminate in hiring on the basis of sex, gender identity, sexual orientation, race, color, religious creed, national origin, physical or mental disability, protected Veteran status, or any other characteristic protected by federal, state/province, or local law. If you need a reasonable accommodation for any part of the employment process, please contact us by email at recruiting@avantorsciences.com and let us know the nature of your request and your contact information. Requests for accommodation will be considered on a case-by-case basis. Please note that only inquiries concerning a request for reasonable accommodation will be responded to from this email address. 3rd party non-solicitation policy:
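The merge-and-deduplicate work described in this posting can be sketched in outline. This is purely illustrative: the field names, the name-normalization match key, and the survivorship rule (newest non-empty value wins) are assumptions for demonstration, not Avantor's actual process.

```python
# Sketch: deduplicate vendor master records on a normalized name key and
# merge duplicates, keeping the most recently updated non-empty value per
# field. Fields and the survivorship rule are illustrative assumptions.
from datetime import date

def norm_key(rec):
    # Normalize the name for matching: case- and whitespace-insensitive.
    return " ".join(rec["name"].lower().split())

def merge_group(group):
    # Iterate oldest-to-newest so newer values overwrite older ones,
    # but never overwrite with an empty value (guards against data loss).
    merged = {}
    for rec in sorted(group, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value not in (None, ""):
                merged[field] = value
    return merged

def dedupe(records):
    groups = {}
    for rec in records:
        groups.setdefault(norm_key(rec), []).append(rec)
    return [merge_group(g) for g in groups.values()]

vendors = [
    {"name": "Acme  Corp", "tax_id": "T1", "phone": "", "updated": date(2024, 1, 5)},
    {"name": "acme corp", "tax_id": "", "phone": "555-0101", "updated": date(2024, 3, 2)},
    {"name": "Other Ltd", "tax_id": "T9", "phone": "555-0202", "updated": date(2024, 2, 1)},
]
clean = dedupe(vendors)  # the two "Acme" records collapse into one
```

Real master-data tools add fuzzy matching and audit trails on top of this pattern, but the core idea (match key, grouping, survivorship rule) is the same.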

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Pune

Work from Office


This job is hiring for a Pan-India location. Roles: Marketing Cloud Intelligence (Datorama) Specialist, Marketing Cloud Personalization (Interaction Studio), Marketing Cloud Account Engagement (Pardot), Salesforce Data Cloud, Salesforce Marketing Cloud B2C. We are seeking an experienced Salesforce Data Cloud professional. The ideal candidate will have a deep understanding of SFMC, Pardot, or Marketing Cloud Personalization, with a focus on data management, customer segmentation, and analytics within the Salesforce Data Cloud environment. This role will involve designing and implementing data solutions that drive customer engagement, optimize marketing efforts, and enhance data-driven decision-making. Key areas: Salesforce Data Cloud implementation, data management & integration, customer segmentation & personalization, analytics & reporting, data governance & compliance.

Posted 1 month ago

Apply

6.0 - 10.0 years

20 - 25 Lacs

Hyderabad

Work from Office


Position: Palantir Foundry & PySpark Data Engineer. Location: Hyderabad (PG&E Office). Key Skills: Palantir Foundry, Python, Spark, AWS, PySpark. Experience: 6-10 years is a perfect fit.

Responsibilities: Preferred candidates have experience with Palantir Foundry (Code Repository, Contour, Data Connection, and Workshop); Palantir Foundry experience is a must-have. Develop and enhance data processing, orchestration, monitoring, and more by leveraging popular open-source software, AWS, and GitLab automation. Collaborate with product and technology teams to design and validate the capabilities of the data platform. Identify, design, and implement process improvements: automating manual processes, optimizing for usability, re-designing for greater scalability. Provide technical support and usage guidance to the users of our platform's services. Drive the creation and refinement of metrics, monitoring, and alerting mechanisms to give us the visibility we need into our production services.

Qualifications: Experience building and optimizing data pipelines in a distributed environment. Experience supporting and working with cross-functional teams. Proficiency working in a Linux environment. 4+ years of advanced working knowledge of Palantir Foundry, SQL, Python, and PySpark. 2+ years of experience using a broad range of AWS technologies. Experience using tools such as Git/Bitbucket, Jenkins/CodeBuild, and CodePipeline. Experience with platform monitoring and alerting tools.

Posted 1 month ago

Apply

9.0 - 14.0 years

18 - 25 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Dear Professional, We are excited to present a unique opportunity at Cognizant, a leading IT firm renowned for fostering growth and innovation. We are seeking talented professionals with 9 to 14 years of experience in Power BI Administration, Power BI Desktop, Service, Workspace Management, Dataset Management, Report Publishing, Tenant Migration, Security, Performance Optimization, SQL Server, RLS, Data Governance, DAX Optimization, Azure Synapse Analytics, and L3/L4 support to join our dynamic team. Your expertise in these areas is highly sought after, and we believe your contributions will be instrumental in driving our projects to new heights. We offer a collaborative environment where your skills will be valued and nurtured. To proceed to the next step of the recruitment process, please provide us with the following details, along with your updated resume, to sathish.kumarmr@cognizant.com.

Please share the below details (mandatory): Full Name (as per PAN card): Contact number: Email: Current Location: Interested Locations: Total years of experience: Relevant years of experience: Current company: Notice period: Notice period negotiable (if yes, by how many days): If you are serving a notice period, please mention your last working day: Current CTC: Expected CTC: Availability for interview on weekdays: Highest qualification:

Additionally, we would like to schedule a virtual interview with you on 26th May 2025. Kindly confirm your availability for the same. We look forward to the possibility of you bringing your valuable experience to Cognizant. Please respond at your earliest convenience. Thanks & Regards, Sathish Kumar M R, HR - Cognizant, Sathish.KumarMR@cognizant.com

Posted 1 month ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Mumbai

Work from Office


We are hiring Informatica CDQ Professionals with 4 to 8 years of experience for a contract position (6 months to 1 year). Type: Contract (6 months to 1 year) Start Date: Immediate joiners preferred Skills Required: Strong hands-on experience with Informatica Cloud Data Quality (CDQ) Expertise in data profiling, data cleansing, and implementing data quality rules Solid knowledge of data governance and data management Strong troubleshooting and performance optimization skills To Apply: Please share your updated resume along with: Current CTC Expected CTC Current Location Email to: navaneetha@suzva.com

Posted 1 month ago

Apply

4.0 - 9.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Develop a strong understanding of business flows, processes and architecture, and leverage that in designing and developing BI content. Translate business/functional requirements into technical specifications encompassing the ETL, metadata layer and reporting layer. Significant experience in the areas of HANA modeling, HANA data provisioning, HANA views and SQL Script, and stored procedures. Hands-on development within all layers of the SAP Datasphere environment. Data acquisition, modeling, transformation, and load to the HANA platform. Design, build and execute data conversions from legacy systems. Write and execute test plans (unit, regression and integration). Technical and user documentation and training. Provide production support and user support, including researching data questions. Provide technical guidance and oversee work completed by junior team members.

Required Experience: 6+ years of full life-cycle experience in Data Warehouse/Reporting projects, preferably in an SAP environment. Hands-on experience in all facets of EDW architecture, data flow strategy, data modeling, and metadata & master data management. Experience with SAP HANA architecture and HANA modeling. Understanding of HANA data provisioning, HANA views and SQL Script. Experience in an SAP Datasphere implementation project with different data sources such as ERP (SAP ECC on HANA) and Oracle R-12. Knowledge of extracting data from various sources, including SAP and non-SAP systems, to SAP DS is required. Knowledge of advanced modeling concepts including Analytic Views, Attribute Views, Hierarchies, creating restricted & calculated columns, filter operations, variables, creating Calculation Views, SAP HANA SQL, SQL Script and procedures. Understanding of BI and HANA security (users, roles, privileges, etc.). Excellent written and verbal communication skills. Strong technical documentation skills. SAP Datasphere knowledge. Location: Remote.

Posted 1 month ago

Apply

5.0 - 9.0 years

25 - 30 Lacs

Pune, Gurugram, Bengaluru

Hybrid


Role - Microsoft Purview Consultant. Exp - 5-8 Years. Location - All EXL Locations (Hybrid).

Key Responsibilities: Data Governance & Compliance: Design and implement Microsoft Purview solutions to ensure data classification, retention, and protection, aligning with organizational and regulatory standards. Policy Development: Develop and enforce data policies, including Data Loss Prevention (DLP), Insider Risk Management (IRM), and Information Rights Management, to safeguard sensitive information. Integration & Architecture: Leverage Azure core infrastructure to integrate Purview with other Azure services and Microsoft 365 applications, ensuring robust and secure data governance solutions. Collaboration & Stakeholder Engagement: Work closely with IT, security, compliance, and business teams to understand requirements and deliver effective solutions, including providing training and support to end-users and IT staff. Documentation & Reporting: Generate comprehensive as-built documentation representing the total output of work delivered to clients, ensuring transparency and alignment with organizational policies.

Qualifications & Skills: Experience: Typically 3-8 years of experience in data governance, compliance, and security within a Microsoft 365 environment. Certifications: Relevant certifications, such as Microsoft Certified: Security, Compliance, and Identity Fundamentals or Microsoft Certified: Information Protection Administrator Associate, are often preferred. Technical Skills: Proficiency in Microsoft Purview, Microsoft 365 applications (Exchange Online, SharePoint, Teams, OneDrive), and Azure services. Analytical & Communication Skills: Strong analytical and problem-solving skills, along with excellent communication and interpersonal abilities to collaborate effectively with various teams.

Posted 1 month ago

Apply

5.0 - 9.0 years

8 - 12 Lacs

Mumbai

Hybrid


Requirements:
- Support the implementation of the data governance framework across Zurich Cover-More, ensuring regulatory compliance and adherence to Zurich standards.
- Collaborate with data owners to create documentation on data quality, privacy, and retention.
- Manage metadata in the data catalogue, ensuring accuracy for reviews and tests.
- Engage with various stakeholders to promote the adoption of the governance framework.
- Monitor project scope, timelines, and resources while identifying risks.
- Define and implement data quality rules with data owners and stewards.
- Work with legal and compliance teams to ensure adherence to data security and privacy regulations like GDPR.
- Develop training materials on data governance principles.
- Establish metrics to track and improve the effectiveness of the governance framework.
- Process access requests and evaluate changes related to data assets.
- Refine the framework based on feedback and business needs.
- Conduct Privacy Impact Assessments to identify and mitigate risks with personal data.
- Manage the OneTrust platform for Data Mapping and Automated Assessments.

**Qualifications:**
- Bachelor's degree in Computer Science, Engineering, or a related field with 5+ years of experience.
- Experience in data management or related fields.
- Understanding of data governance frameworks and concepts.
- Strong collaboration skills to work with cross-functional teams.
- Relevant certifications (CIPP/E, CDMP) are beneficial.
- Familiarity with GDPR, CCPA, and data privacy best practices.
- Experience conducting risk assessments and DPIAs.

Posted 1 month ago

Apply

3.0 - 6.0 years

12 - 22 Lacs

Gurugram

Work from Office


Overview: 170+ Years Strong. Industry Leader. Global Impact. At Pinkerton, the mission is to protect our clients. To do this, we provide enterprise risk management services and programs specifically designed for each client. Pinkerton employees are one of our most important assets and critical to the delivery of world-class solutions. Bonded together, we share a commitment to integrity, vigilance, and excellence. Pinkerton is an inclusive employer who seeks candidates with diverse backgrounds, experiences, and perspectives to join our family of industry subject matter experts.

The Data Engineer will be part of a high-performing and international team with the goal of expanding Data & Analytics solutions for our CRM application, which is live in all Securitas countries. Together with the dedicated Frontend & BI Developer, you will be responsible for managing and maintaining the Databricks-based BI Platform; the daily focus spans data model changes and the implementation and development of pipelines, but ETL will get most of your attention. Continuous improvement will require the ability to think bigger and work closely with the whole team. The Data Engineer (ETL Specialist) will collaborate with the Frontend & BI Developer to align on possibilities to improve the BI Platform deliverables, specifically for the CEP organization. Cooperation with other departments, such as integrations or specific IT/IS projects and business specialists, is part of the job. The expectation is to always take data privacy into consideration when talking about moving or sharing data with others. For that purpose, there is a need to develop the security layer as agreed with the legal department.

Responsibilities: Represent Pinkerton's core values of integrity, vigilance, and excellence.
Maintain and develop the Databricks workspace used to host the BI CEP solution. Actively advise on changes needed to the data model to accommodate new BI requirements. Develop and implement new ETL scripts and improve the current ones. Own the resolution of incoming tickets for both incidents and requests. Plan activities to stay close to the Frontend & BI Developer to foresee upcoming backend changes. Improve teamwork across team members by using the DevOps tool to keep track of the status of deliverables from start to end. Ensure understanding and visible implementation of the company's core values of integrity, vigilance and helpfulness. Know the skills and experience available and required in your area today and tomorrow, to drive liaison with other departments if needed. All other duties, as assigned.

Qualifications: At least 3+ years of experience in data engineering. Understanding of designing and implementing data processing architectures in Azure environments. Experience with different SSAS modelling techniques (preferably Azure/Databricks, Microsoft-related). Understanding of data management and treatment to secure data governance & security (platform management and administration). An analytical mindset with clear communication and problem-solving skills. Experience working in a SCRUM setup. Fluent in English, both spoken and written. Bonus: knowledge of additional language(s). Ability to communicate, present and influence credibly at all levels, both internally and externally. Business acumen & commercial awareness.

Working Conditions: With or without reasonable accommodation, requires the physical and mental capacity to effectively perform all essential functions; regular computer usage. Occasional reaching and lifting of small objects and operating office equipment. Frequent sitting, standing, and/or walking. Travel, as required.
Pinkerton is an equal opportunity employer to all applicants and positions without regard to race/ethnicity, color, national origin, ancestry, sex/gender, gender identity/expression, sexual orientation, marital/prenatal status, pregnancy/childbirth or related conditions, religion, creed, age, disability, genetic information, veteran status, or any protected status by local, state, federal or country-specific law.
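As a generic illustration of the ETL emphasis in the role above: the extract-transform-load cycle can be shown end to end with only the standard library. This is a sketch under stated assumptions; the schema, field names, and cleaning rules are invented, and the actual platform here is Databricks, not SQLite.

```python
# Tiny end-to-end ETL sketch using only the standard library: extract rows
# from CSV text, transform (clean and standardize), load into SQLite, then
# query. Schema and cleaning rules are invented for illustration.
import csv
import io
import sqlite3

RAW = "country,amount\nse,100\nNO,250\n,40\n"  # messy source extract

def transform(row):
    # Standardize country codes; empty values become an explicit UNKNOWN.
    country = row["country"].strip().upper() or "UNKNOWN"
    return country, int(row["amount"])

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (country TEXT, amount INTEGER)")
rows = [transform(r) for r in csv.DictReader(io.StringIO(RAW))]
conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
conn.commit()

total_by_country = dict(
    conn.execute("SELECT country, SUM(amount) FROM sales GROUP BY country")
)
```

In a Databricks pipeline the same three stages would typically be a source read, a PySpark transformation, and a write to a Delta table, but the shape of the work is the same.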

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies