6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position Overview
Job Title: Data Governance & Management – Associate
Location: Bangalore, India

Role Description
The Compliance and Anti-Financial Crime (CAFC) Data Office is responsible for Data Governance and Management across key functions including AFC, Compliance, and Legal. The team supports these functions in establishing and improving data governance to achieve critical business outcomes such as effective control operation, regulatory compliance, and operational efficiency. The CAFC Data Governance and Management team implements Deutsche Bank’s Enterprise Data Management Framework—focusing on controls, culture, and capabilities—to drive improved data quality, reduce audit and regulatory findings, and strengthen controls. As a member of the Divisional Data Office, the role holder will support both Run-the-Bank and Change-the-Bank initiatives, with a particular focus on Financial Crime Risk Assessment (FCRA) data collation, processing, testing, and automation.

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for 35 yrs. and above

Your Key Responsibilities
- Document and maintain existing and new processes; respond to internal and external audit queries and communicate updates clearly to both technical and non-technical audiences.
- Independently manage the FCRA data collection process across multiple metrics, including template generation, data collection, quality checks, and stakeholder escalation.
- Perform variance analysis and develop a deep understanding of underlying data sources used in Financial Crime Risk Assessment.
- Collaborate with TDI on new releases and ensure new data sources align with Deutsche Bank’s Data Governance standards.
- Maintain metadata in Collibra, visualize data lineage in Solidatus, and ensure certification and control coverage.
- Automate manual data processes using tools such as Python, SQL, and Power Query to improve efficiency and reduce operational risk (an illustrative sketch follows this description).
- Translate complex technical issues into simple, actionable insights for business stakeholders, demonstrating strong communication and stakeholder management skills.

Your Skills and Experience
- 6+ years of experience in data management within financial services, with a strong understanding of data risks and controls.
- Familiarity with industry-standard frameworks such as DCAM or DAMA (certification preferred).
- Hands-on experience with data cataloguing using Collibra, data lineage documentation using Solidatus, and data control assessment and monitoring.
- Proficiency in Python, SQL, and Power Query for data analysis and automation.
- Strong communication skills with the ability to explain technical concepts to non-technical stakeholders.
- Proven ability to work independently and collaboratively across global teams.

How We’ll Support You
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
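As an illustration of the kind of automation this role calls for, the sketch below uses Python and pandas to run a simple variance check on an FCRA-style metric extract. It is a minimal, hypothetical example; the file names, column names, and the 20% threshold are assumptions, not details from the posting.

```python
import pandas as pd

# Hypothetical extracts: current and prior FCRA metric snapshots.
current = pd.read_csv("fcra_metrics_current.csv")  # columns assumed: metric_id, business_unit, value
prior = pd.read_csv("fcra_metrics_prior.csv")

merged = current.merge(prior, on=["metric_id", "business_unit"], suffixes=("_cur", "_prior"))

# Flag period-on-period variances above an assumed 20% tolerance for stakeholder escalation.
merged["variance_pct"] = (merged["value_cur"] - merged["value_prior"]) / merged["value_prior"] * 100
breaches = merged[merged["variance_pct"].abs() > 20]

# Basic quality checks: missing values and duplicate metric keys.
missing = current[current["value"].isna()]
duplicates = current[current.duplicated(subset=["metric_id", "business_unit"], keep=False)]

print(f"{len(breaches)} variance breaches, {len(missing)} missing values, {len(duplicates)} duplicate rows")
breaches.to_csv("variance_escalations.csv", index=False)
```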
About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 days ago
15.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Your Team Responsibilities
MSCI has an immediate opening in one of our fastest-growing product lines. As a Lead Architect within Sustainability and Climate, you are an integral part of a team that works to develop high-quality architecture solutions for various software applications on modern cloud-based technologies. As a core technical contributor, you are responsible for conducting critical architecture solutions across multiple technical areas to support project goals. The systems under your responsibility will be amongst the most mission-critical systems of MSCI. They require deep technology expertise, a strong sense of enterprise system design, state-of-the-art scalability and reliability, and innovation. Your ability to make technology decisions within a consistent framework to support the growth of our company and products, lead software implementations in close partnership with global leaders and multiple product organizations, and drive technology innovation will be the key measures of your success in our dynamic and rapidly growing environment. At MSCI, you will be operating in a culture where we value merit and track record. You will own the full life-cycle of the technology services and provide management, technical and people leadership in the design, development, quality assurance and maintenance of our production systems, making sure we continue to scale our great franchise.

Your Skills and Experience That Will Help You Excel
- Prior senior software architecture roles.
- Proficiency in programming languages such as Python, Java, or Scala, and knowledge of SQL and NoSQL databases.
- Drive the development of conceptual, logical, and physical data models aligned with business requirements.
- Lead the implementation and optimization of data technologies, including Apache Spark; experience with a table format such as Delta or Iceberg (an illustrative sketch appears at the end of this posting).
- Strong hands-on experience in data architecture, database design, and data modeling.
- Proven experience as a Data Platform Architect or in a similar role, with expertise in Airflow, Databricks, Snowflake, Collibra, and Dremio.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Ability to dive into details; a hands-on technologist with strong core computer science fundamentals.
- Strong preference for financial services experience.
- Proven leadership of large-scale distributed software teams that have delivered great products on deadline.
- Experience in a modern iterative software development methodology.
- Experience with globally distributed teams and business partners.
- Experience in building and maintaining applications that are mission critical for customers.
- M.S. in Computer Science, Management Information Systems or a related engineering field.
- 15+ years of software engineering experience.
- Demonstrated consensus builder and collegial peer.

About MSCI
What we offer you:
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
- Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
- We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
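Since the role calls for hands-on Apache Spark experience and familiarity with table formats such as Delta or Iceberg, here is a minimal PySpark sketch that writes and reads a Delta table. It assumes a Spark session configured with the open-source delta-spark package; the path, dataset, and column names are illustrative, not taken from the posting.

```python
from pyspark.sql import SparkSession

# Assumes the delta-spark package is installed and on the Spark classpath.
spark = (
    SparkSession.builder.appName("delta-demo")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Illustrative climate-metric records.
df = spark.createDataFrame(
    [("ACME", 2023, 125.4), ("ACME", 2024, 118.9), ("GLOBEX", 2024, 97.2)],
    ["issuer", "year", "scope1_emissions"],
)

# Write as a Delta table, then read it back and query with Spark SQL.
df.write.format("delta").mode("overwrite").save("/tmp/emissions_delta")
spark.read.format("delta").load("/tmp/emissions_delta").createOrReplaceTempView("emissions")
spark.sql("SELECT issuer, AVG(scope1_emissions) AS avg_scope1 FROM emissions GROUP BY issuer").show()
```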
Posted 3 days ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description:
Key Capabilities and Responsibilities
- Core experience as an engineer/developer; demonstrated proficiency in coding languages and in working with APIs, connectors, and integrations.
- Experience working with various third-party tools or platforms; data governance (such as Collibra), privacy, or security tools are a definite plus, but not a firm requirement (an illustrative integration sketch follows this description).
- Ability and desire to learn a new platform – self-teach, investigate capabilities and functionality, and decipher how the platform works.
- Translate technical functionality and capabilities of the platform/tools into business language and present to team members – help team members understand how the platform/tools work and what’s possible.
- Help assess and define business requirements, recommend solutions, and then translate them into functional and technical requirements and development within the platform/tools; determine what should be tested and how, including solution testing and test-case development and execution.
- Document work in requirements, design, and technical documents that can be used by various team members and serve as the point of alignment and sign-off on work to be done.
- Articulate tasks to be done, estimate effort for those tasks, and execute and manage your own work according to timelines and commitments; experience working in sprints and iterative development (including prototyping and POCs), and using JIRA to organize and drive work forward.
- Collaborate across teams, leveraging the skillsets and knowledge of other team members to deliver end products; manage work with other team members.
- Troubleshoot and problem-solve to address issues and guide team members.

Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Dentsu
Time Type: Full time
Contract Type: Permanent
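As a rough illustration of the API/connector work described above, the sketch below calls a governance platform's REST API with Python's requests library. The base URL, endpoint path, and response shape are hypothetical placeholders, not documented Collibra endpoints; a real integration would follow the vendor's API reference.

```python
import os

import requests

# Hypothetical governance-platform endpoint and token; replace with real values from the vendor docs.
BASE_URL = os.environ.get("GOVERNANCE_API_URL", "https://governance.example.com/api")
TOKEN = os.environ["GOVERNANCE_API_TOKEN"]

session = requests.Session()
session.headers.update({"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"})

def list_assets(name_filter: str, limit: int = 50) -> list:
    """Fetch assets whose name matches a filter (illustrative endpoint and parameters)."""
    resp = session.get(f"{BASE_URL}/assets", params={"name": name_filter, "limit": limit}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("results", [])

if __name__ == "__main__":
    for asset in list_assets("customer"):
        print(asset.get("id"), asset.get("name"))
```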
Posted 3 days ago
4.0 years
0 Lacs
Delhi, India
On-site
Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!

Job Description

Job Summary:
Do you like to implement technology solutions that have a material impact on revenue production? Would you like to drive Digital Transformation across commercial products and services at a 113-year-old Wide Moat pure-play automation company? If your answer is yes, then Rockwell Automation wants you to join their growing Cloud Practice and effect change. Rockwell, a $7B industrial with 22,000 employees in 100 countries, is at the forefront of the IoT revolution and is rebuilding their IT organization to accommodate growth. We need Full Stack engineering pros with a DevOps mindset to drive innovative technology solutions. The Cloud DevOps Engineer is critical path for both transformation and innovation. This tech professional - a cloud delivery and operations native with solid fluency in networking technologies, automation, and scripting - will help Rockwell move to a Cloud-First operating model and create best-in-class, agile, on-demand IT products and services.

Your Responsibilities:
- You will implement, manage, and support the Information Technology (IT) infrastructure.
- You will develop systems software and analyze data to improve existing systems.
- You will support the design, deployment, and operations of a large-scale global hybrid cloud computing environment.
- The role designs and maintains modern and open toolchains, automation frameworks and test suites, and infrastructure tiers, advocating for end-to-end automated CI/CD DevOps pipelines.
- You will work with developers and the IT staff to oversee code releases by understanding the requirements.
- Demonstrate a curiosity for solving problems and the initiative to make improvements in an environment with significant technical debt and the accompanying cultural inertia, where a permanent bootstrap mentality is required.
- Independently resolve tactical challenges.
- Demonstrate strong analytical capabilities in problem-solving approaches.
- Be recognized for high-quality and high-impact results for self and others.
- Challenge the traditional way of doing things by moving beyond the easy, traditional, and obvious.
- Provide collaboration and instill a level of confidence in team members and key stakeholders.

Qualifications

Basic Qualifications:
- Bachelor's Degree in a related field

The Essentials - You Will Have:
- 4+ years of relevant experience on Azure cloud
- DevSecOps
- Cloud application architecture
- Kubernetes
- Cloud networking
- Confluent Cloud (administration)
- Collibra tool (infrastructure, integrations & configuration)
- Elastic Cloud (cluster management & configuration)
- Experience with Java, Python, SQL, or other domain-specific programming languages
- Demonstrable cloud development impact
- Experience working in a Cloud-First operating model or within a dedicated cloud team
- Service Provider/MSP experience

The Preferred - You Might Also Have:
- Experience with Agile development methodologies
- Ability to adapt quickly to new technologies and changing business requirements
- Unwavering commitment to, and the ability to model, the standards of behavior set in our Code of Conduct

What We Offer:
Our benefits package includes:
- Comprehensive mindfulness programs with a premium membership to Calm
- Volunteer paid time off available after 6 months of employment for eligible employees
- Company volunteer and donation matching program – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation
- Employee Assistance Program
- Personalized wellbeing programs through our OnTrack program
- On-demand digital course library for professional development and other local benefits!

At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles.

Rockwell Automation’s hybrid policy aligns that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
Posted 3 days ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Rockwell Automation is a global technology leader focused on helping the world’s manufacturers be more productive, sustainable, and agile. With more than 28,000 employees who make the world better every day, we know we have something special. Behind our customers - amazing companies that help feed the world, provide life-saving medicine on a global scale, and focus on clean water and green mobility - our people are energized problem solvers that take pride in how the work we do changes the world for the better. We welcome all makers, forward thinkers, and problem solvers who are looking for a place to do their best work. And if that’s you, we would love to have you join us!

Job Description

Job Summary:
Do you like to implement technology solutions that have a material impact on revenue production? Would you like to drive Digital Transformation across commercial products and services at a 113-year-old Wide Moat pure-play automation company? If your answer is yes, then Rockwell Automation wants you to join their growing Cloud Practice and effect change. Rockwell, a $7B industrial with 22,000 employees in 100 countries, is at the forefront of the IoT revolution and is rebuilding their IT organization to accommodate growth. We need Full Stack engineering pros with a DevOps mindset to drive innovative technology solutions. The Cloud DevOps Engineer is critical path for both transformation and innovation. This tech professional - a cloud delivery and operations native with solid fluency in networking technologies, automation, and scripting - will help Rockwell move to a Cloud-First operating model and create best-in-class, agile, on-demand IT products and services.

Your Responsibilities:
- You will implement, manage, and support the Information Technology (IT) infrastructure.
- You will develop systems software and analyze data to improve existing systems.
- You will support the design, deployment, and operations of a large-scale global hybrid cloud computing environment.
- The role designs and maintains modern and open toolchains, automation frameworks and test suites, and infrastructure tiers, advocating for end-to-end automated CI/CD DevOps pipelines.
- You will work with developers and the IT staff to oversee code releases by understanding the requirements.
- Demonstrate a curiosity for solving problems and the initiative to make improvements in an environment with significant technical debt and the accompanying cultural inertia, where a permanent bootstrap mentality is required.
- Independently resolve tactical challenges.
- Demonstrate strong analytical capabilities in problem-solving approaches.
- Be recognized for high-quality and high-impact results for self and others.
- Challenge the traditional way of doing things by moving beyond the easy, traditional, and obvious.
- Provide collaboration and instill a level of confidence in team members and key stakeholders.

Qualifications

Basic Qualifications:
- Bachelor's Degree in a related field

The Essentials - You Will Have:
- 4+ years of relevant experience on Azure cloud
- DevSecOps
- Cloud application architecture
- Kubernetes
- Cloud networking
- Confluent Cloud (administration)
- Collibra tool (infrastructure, integrations & configuration)
- Elastic Cloud (cluster management & configuration)
- Experience with Java, Python, SQL, or other domain-specific programming languages
- Demonstrable cloud development impact
- Experience working in a Cloud-First operating model or within a dedicated cloud team
- Service Provider/MSP experience

The Preferred - You Might Also Have:
- Experience with Agile development methodologies
- Ability to adapt quickly to new technologies and changing business requirements
- Unwavering commitment to, and the ability to model, the standards of behavior set in our Code of Conduct

What We Offer:
Our benefits package includes:
- Comprehensive mindfulness programs with a premium membership to Calm
- Volunteer paid time off available after 6 months of employment for eligible employees
- Company volunteer and donation matching program – your volunteer hours or personal cash donations to an eligible charity can be matched with a charitable donation
- Employee Assistance Program
- Personalized wellbeing programs through our OnTrack program
- On-demand digital course library for professional development and other local benefits!

At Rockwell Automation we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right person for this or other roles.

Rockwell Automation’s hybrid policy aligns that employees are expected to work at a Rockwell location at least Mondays, Tuesdays, and Thursdays unless they have a business obligation out of the office.
Posted 3 days ago
2.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology
Job Family Group: IT&S Group

Job Description:
Work location: Pune

You will work with
Being part of a digital delivery data group supporting real-time data for our Solutions group, you will apply your domain knowledge and familiarity with domain data processes to support the organisation. The data team provides daily operational data management, data engineering and analytics support to this organisation across a broad range of activity.

About The Role
The Data Steward applies practitioner-level knowledge of a business domain to curate and validate the accuracy, security and referential integrity of data required to drive compliance, safety and business-critical decision making. They are responsible for implementing technical changes and controls across systems of record and communicating planned changes to the data owners. They are responsible for implementing the data requirements to populate systems of record and transpose data between package software, including input quality checks on received data, and providing technical insights into creating, remediating and maintaining data definitions.

What you will deliver
- Act as a custodian of real-time engineering, reliability, maintenance and facilities data, ensuring data integrity, consistency, and compliance across the organization, prioritising safety and operational efficiency for the business. Your focus areas will include production data sets, production accounting, forecasting and production optimisation.
- Enforce data governance policies, standards, and regulations; participate in improvement of these based on business need.
- Assess, report on and resolve data quality issues through root cause analysis and remediation planning (an illustrative quality-check sketch appears at the end of this posting).
- Ensure that data, documents and models represent the physical reality of our assets.
- Implement the data requirements to populate systems of record and transpose data between package software, including input quality checks on received data.
- Work closely with data engineers and business analysts to ensure high-quality, standardized data.
- Support business users by providing guidance on data usage, access, and policies.
- Implement technical changes and controls across systems of record and communicate planned changes to the data owners.
- Assist in metadata management, ensuring all critical datasets are properly captured.
- Facilitate collaboration between business and technology teams to improve data literacy and governance.
- Support regulatory and compliance efforts related to data privacy, security, and access control.

What you will need to be successful (experience and qualifications)
Essential:
- Bachelor’s degree in a STEM area or equivalent experience
- Experience with real-time facility telemetry and industrial data platforms
- Hands-on experience in the AVEVA PI portfolio, including AVEVA PI Vision
- Experience with SCADA systems
- Knowledge and understanding of managing piping and instrumentation diagrams
- Working knowledge of oil and gas process equipment (separators, valves, heat exchangers, pumps, produced water system, glycol system, distillation)
- 2+ years' experience in stewardship of datasets within an operating Oil & Gas organisation or asset-intensive industry
- Strong understanding of data governance frameworks, master data management principles, policies, and compliance within the wells & subsurface data domain
- Ability to work with business and technical teams to resolve data quality issues
- Excellent communication and documentation skills
- Analytical mindset with a strong focus on data accuracy and process improvement

Desired:
- Proficiency in SQL and ability to work with large datasets
- Experience with Palantir Foundry
- Familiarity with cloud data platforms (AWS, Azure, or GCP)
- Experience with data governance tools (e.g., Collibra, Alation, Informatica)

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner!

Travel Requirement: Up to 10% travel should be expected with this role
Relocation Assistance: This role is eligible for relocation within country
Remote Type: This position is a hybrid of office/remote working
Skills:
Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
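To make the data-quality responsibilities above concrete, here is a small pandas sketch that checks a real-time telemetry extract for missing readings and gaps in the timestamp sequence. The file, tag, and column names, and the one-minute sampling assumption, are illustrative, not details from the posting.

```python
import pandas as pd

# Hypothetical telemetry extract: one row per sensor tag per minute.
df = pd.read_csv("telemetry_extract.csv", parse_dates=["timestamp"])  # columns assumed: tag, timestamp, value

issues = []
for tag, grp in df.sort_values("timestamp").groupby("tag"):
    # Missing readings for this tag.
    n_missing = int(grp["value"].isna().sum())
    # Gaps larger than the expected one-minute sampling interval.
    gaps = int(grp["timestamp"].diff().dt.total_seconds().gt(60).sum())
    if n_missing or gaps:
        issues.append({"tag": tag, "missing_values": n_missing, "timestamp_gaps": gaps})

report = pd.DataFrame(issues)
print(report if not report.empty else "No quality issues found")
report.to_csv("data_quality_report.csv", index=False)
```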
Posted 4 days ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
R3 - Senior Manager, Data Quality Engineer

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be a part of a team with passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company’s IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Role Overview
As a Data Quality Engineer you will play a pivotal role in ensuring the quality of our data across all domains, which will directly influence patients who use our life-saving products. Key tasks include data integration, ETL (Extract, Transform, Load) processes, and building and managing data quality routines. If you are passionate about data governance and want to make a significant impact, we encourage you to apply.

What Will You Do In This Role
As part of the enterprise Data Quality platform team, you will contribute to our success in the following areas:
- Work with our divisional partners to onboard their data to our data quality platform and help drive adoption within their teams.
- Understand divisional requirements and codify them within the data quality platform.
- Create and maintain data quality rules and checks (an illustrative sketch appears at the end of this posting).
- Review data quality reports and communicate findings with divisional stakeholders.
- Train users on the platform, promoting consistent use.
- Contribute to the development and documentation of standards for platform usage.
- Perform product engineering and develop automation utilities.

What Should You Have
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience.
- Hands-on professional who has been in the technology industry for a minimum of 7-11 years as a Data Engineer.
- Strong level of SQL is a must.
- Knowledge of data transformation (ETL/ELT) routines.
- Strong understanding of REST APIs and how to use them programmatically.
- Experience with Collibra Data Quality and with data visualization tools is an advantage.
- Knowledge of GitHub and Python is an advantage.
- Some experience with Spark/PySpark would be good.
- Good standard of professional communication and building working relationships with customers.
- Good time-management skills and ability to work independently.
- Innovative mindset, willingness to learn new areas and adapt to change.
- Strong work documentation habits with attention to detail and accuracy.
- Critical analytical thinking and a problem-solving attitude.
- Keen sense of urgency and customer focus.
- Team player spirit.

Who We Are
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What We Look For
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today.

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):
Required Skills: Data Engineering, Data Visualization, Design Applications, Software Configurations, Software Development, Software Development Life Cycle (SDLC), Solution Architecture, System Designs, Systems Integration, Testing
Preferred Skills:
Job Posting End Date: 07/12/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R346609
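As a rough sketch of what reusable data quality rules can look like outside any specific platform, the Python below applies simple completeness and uniqueness checks to a table loaded with pandas. The dataset, columns, and thresholds are assumptions for illustration; in practice such rules would be codified in the team's data quality platform.

```python
import pandas as pd

# Hypothetical dataset; in practice this would come from a governed source system.
df = pd.read_csv("batch_records.csv")  # columns assumed: batch_id, product_code, release_date, qc_result

def check_completeness(frame: pd.DataFrame, column: str, min_ratio: float = 0.99) -> dict:
    """Rule: the share of non-null values in `column` must meet a minimum threshold."""
    ratio = frame[column].notna().mean()
    return {"rule": f"completeness:{column}", "value": round(float(ratio), 4), "passed": ratio >= min_ratio}

def check_uniqueness(frame: pd.DataFrame, column: str) -> dict:
    """Rule: values in `column` must be unique (no duplicate keys)."""
    dupes = int(frame[column].duplicated().sum())
    return {"rule": f"uniqueness:{column}", "value": dupes, "passed": dupes == 0}

results = [
    check_completeness(df, "qc_result"),
    check_uniqueness(df, "batch_id"),
]
print(pd.DataFrame(results))
```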
Posted 4 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary:
Privacy, Security, & Risk Specialist (Manager, Data Governance_G4_EDAA0104)
As a Data Risk & Compliance Analyst within the Enterprise Data Governance (EDG) team, you will play a key role in supporting data risk management, privacy, and compliance efforts across the organization. You will operationalize and enhance processes that support secure data practices, regulatory alignment, and the protection of sensitive data assets. Working cross-functionally with business, legal, privacy, and cybersecurity teams, you will help ensure that data governance capabilities are implemented with integrity and transparency. This role combines technical acumen, risk assessment, and compliance management to support data discovery, access controls, data classification, and privacy risk assessments.

Who we’re looking for:

Primary Responsibilities:
- Risk & Privacy Controls Execution: Maintain and support risk and privacy controls across key processes such as data retention, access monitoring, and records destruction.
- Data Discovery & Classification Enablement: Help drive the implementation of data discovery, tagging, and classification activities by identifying structured data with privacy and regulatory implications.
- Governance Platform Integration: Collaborate in testing and integrating data governance capabilities with risk and compliance systems (e.g., GRC tools, OneTrust, ServiceNow IRM).

Key Responsibilities:
- Partner with the privacy, legal, and security teams to operationalize privacy-by-design, records management, and access governance.
- Support the creation, enhancement, and enforcement of data handling policies, including ROPA, data classification, and regulatory reporting.
- Maintain and analyze Records of Processing Activities (ROPA) and ensure accuracy and traceability of critical data elements.
- Assist with privacy and compliance risk assessments, tracking mitigation plans, and supporting enterprise audit requests.
- Align with Identity and Access Management teams to manage privileged access appropriately, supporting the governance of access control and provisioning.
- Assist in developing data quality metrics, health indices, and access provisioning dashboards.
- Provide expert guidance to EDG councils and data stewards regarding privacy, data protection, and compliance requirements.
- Support the organization in addressing questions about security classification, data-sharing agreements, and retention schedules.

Skills:
- Bachelor’s degree in Information Technology, Computer Science, or a related field.
- 5+ years of experience in data governance, privacy, information risk, and compliance.
- Familiarity with NIST CSF, the NIST Privacy Framework, and ISO 27001.
- Hands-on experience with GRC and privacy tools like OneTrust, RSA Archer, Collibra, or ServiceNow IRM.
- Strong understanding of data discovery and classification technologies; ability to define policies and regex rules (a small regex-based classification sketch appears at the end of this posting).
- Knowledge of information governance, access control, and secure records lifecycle management.
- Excellent analytical and communication skills with the ability to work across technical and business teams.
- Cybersecurity certifications preferred (e.g., CISSP, CISA).

Work location: Hyderabad, India
Work hours:
Work pattern: Full time role.
Work mode: Hybrid.
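As a small illustration of regex-based classification rules, the sketch below scans column values for patterns that look like emails or PAN-style identifiers and tags the columns accordingly. The patterns, tags, and thresholds are assumptions for demonstration, not McDonald's policies; production rules would live in the governance or discovery tooling.

```python
from typing import Optional

import pandas as pd

# Illustrative classification rules: tag -> regex pattern (examples only).
RULES = {
    "EMAIL": r"[\w.+-]+@[\w-]+\.[\w.-]+$",
    "PAN_ID": r"[A-Z]{5}[0-9]{4}[A-Z]$",      # India PAN-style identifier
    "PHONE_IN": r"\+?91[-\s]?\d{10}$",
}

def classify_column(values: pd.Series, sample_size: int = 100, threshold: float = 0.8) -> Optional[str]:
    """Return the tag whose pattern matches at least `threshold` of sampled non-null values."""
    sample = values.dropna().astype(str).head(sample_size)
    if sample.empty:
        return None
    for tag, pattern in RULES.items():
        if sample.str.match(pattern).mean() >= threshold:
            return tag
    return None

df = pd.read_csv("customer_extract.csv")  # hypothetical structured extract
tags = {col: classify_column(df[col]) for col in df.columns}
print({col: tag for col, tag in tags.items() if tag})
```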
Posted 4 days ago
3.0 years
0 Lacs
India
Remote
AWS Data Engineer
Location: Remote (India)
Experience: 3+ Years
Employment Type: Full-Time

About the Role:
We are seeking a talented AWS Data Engineer with at least 3 years of hands-on experience in building and managing data pipelines using AWS services. This role involves working with large-scale data, integrating multiple data sources (including sensor/IoT data), and enabling efficient, secure, and analytics-ready solutions. Experience in the energy industry or working with time-series/sensor data is a strong plus.

Key Responsibilities:
- Build and maintain scalable ETL/ELT data pipelines using AWS Glue, Redshift, Lambda, EMR, S3, and Athena (an illustrative sketch follows this description).
- Process and integrate structured and unstructured data, including sensor/IoT and real-time streams.
- Optimize pipeline performance and ensure reliability and fault tolerance.
- Collaborate with cross-functional teams including data scientists and analysts.
- Perform data transformations using Python, Pandas, and SQL.
- Maintain data integrity, quality, and security across the platform.
- Use Terraform and CI/CD tools (e.g., Azure DevOps) for infrastructure and deployment automation.
- Support and monitor pipeline workflows, troubleshoot issues, and implement fixes.
- Contribute to the adoption of emerging tools like AWS Bedrock, Textract, Rekognition, and GenAI solutions.

Required Skills and Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in data engineering using AWS.
- Strong skills in AWS Glue, Redshift, S3, Lambda, EMR, and Athena; Python, Pandas, and SQL; RDS, Postgres, and SAP HANA.
- Solid understanding of data modeling, warehousing, and pipeline orchestration.
- Experience with version control (Git) and infrastructure as code (Terraform).

Preferred Skills:
- Experience working with energy sector data or IoT/sensor-based data.
- Exposure to machine learning tools and frameworks (e.g., SageMaker, TensorFlow, Scikit-learn).
- Familiarity with big data technologies like Apache Spark and Kafka.
- Experience with data visualization tools (Tableau, Power BI, AWS QuickSight).
- Awareness of data governance and catalog tools such as AWS Data Quality, Collibra, and AWS Databrew.
- AWS Certifications (Data Analytics, Solutions Architect).
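A minimal sketch of the kind of pipeline step described above: read a raw extract from S3, apply a pandas transformation, and write the result back for downstream querying (for example via Athena). Bucket names, keys, and columns are placeholders I have assumed; a production job would typically run inside AWS Glue or Lambda with proper error handling.

```python
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")
RAW_BUCKET, CURATED_BUCKET = "example-raw-bucket", "example-curated-bucket"  # placeholder names

def transform(key: str) -> str:
    """Read a raw sensor CSV from S3, clean it, and write Parquet to the curated zone (needs pyarrow)."""
    obj = s3.get_object(Bucket=RAW_BUCKET, Key=key)
    df = pd.read_csv(io.BytesIO(obj["Body"].read()), parse_dates=["timestamp"])

    # Basic cleaning: drop rows with missing readings and deduplicate on sensor/timestamp.
    df = df.dropna(subset=["value"]).drop_duplicates(subset=["sensor_id", "timestamp"])
    df["reading_date"] = df["timestamp"].dt.date  # partition-friendly column

    out_key = key.replace("raw/", "curated/").replace(".csv", ".parquet")
    buf = io.BytesIO()
    df.to_parquet(buf, index=False)
    s3.put_object(Bucket=CURATED_BUCKET, Key=out_key, Body=buf.getvalue())
    return out_key

if __name__ == "__main__":
    print(transform("raw/sensors/2024-06-01.csv"))
```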
Posted 4 days ago
0.0 - 2.0 years
0 Lacs
Raipur, Chhattisgarh
On-site
Company Name: Interbiz Consulting Pvt Ltd
Position/Designation: Data Engineer
Job Location: Raipur (C.G.)
Mode: Work from office
Experience: 2 to 5 Years

We are seeking a talented and detail-oriented Data Engineer to join our growing Data & Analytics team. You will be responsible for building and maintaining robust, scalable data pipelines and infrastructure to support data-driven decision-making across the organization.

Key Responsibilities
- Design and implement ETL/ELT data pipelines for structured and unstructured data using Azure Data Factory, Databricks, or Apache Spark (an illustrative sketch follows this description).
- Work with Azure Blob Storage, Data Lake, and Synapse Analytics to build scalable data lakes and warehouses.
- Develop real-time data ingestion pipelines using Apache Kafka, Apache Flink, or Apache Beam.
- Build and schedule jobs using orchestration tools like Apache Airflow or Dagster.
- Perform data modeling using the Kimball methodology for building dimensional models in Snowflake or other data warehouses.
- Implement data versioning and transformation using DBT and Apache Iceberg or Delta Lake.
- Manage data cataloging and lineage using tools like Marquez or Collibra.
- Collaborate with DevOps teams to containerize solutions using Docker, manage infrastructure with Terraform, and deploy on Kubernetes.
- Set up and maintain monitoring and alerting systems using Prometheus and Grafana for performance and reliability.

Required Skills and Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
- 1-5+ years of experience in data engineering or related roles.
- Proficiency in Python, with strong knowledge of OOP and data structures & algorithms.
- Comfortable working in Linux environments for development and deployment.
- Strong command over SQL and understanding of relational (DBMS) and NoSQL databases.
- Solid experience with Apache Spark (PySpark/Scala).
- Familiarity with real-time processing tools like Kafka, Flink, or Beam.
- Hands-on experience with Airflow, Dagster, or similar orchestration tools.
- Deep experience with Microsoft Azure, especially Azure Data Factory, Blob Storage, Synapse, Azure Functions, etc. AZ-900 or other Azure certifications are a plus.
- Knowledge of dimensional modeling, Snowflake, Apache Iceberg, and Delta Lake.
- Understanding of modern Lakehouse architecture and related best practices.
- Familiarity with Marquez, Collibra, or other cataloging tools.
- Experience with Terraform, Docker, Kubernetes, and Jenkins or equivalent CI/CD tools.
- Proficiency in setting up dashboards and alerts with Prometheus and Grafana.

Interested candidates may share their CV on swapna.rani@interbizconsulting.com or visit www.interbizconsulting.com
Note: Immediate joiners will be preferred.

Job Type: Full-time
Pay: From ₹25,000.00 per month
Benefits: Food provided, Health insurance, Leave encashment, Provident Fund
Supplemental Pay: Yearly bonus

Application Question(s):
- Do you have at least 2 years of work experience in Python?
- Do you have at least 2 years of work experience in Data Science?
- Are you from Raipur, Chhattisgarh?
- Are you willing to work for more than 2 years?
- What is your notice period?
- What is your current salary and what are you expecting?

Work Location: In person
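As one illustrative shape of the real-time ingestion work listed above, here is a small PySpark Structured Streaming sketch that reads JSON events from a Kafka topic and appends them to a Delta table. The broker address, topic name, schema, and paths are assumptions; the posting does not prescribe this exact stack.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

# Requires the spark-sql-kafka and delta-spark packages on the Spark classpath.
spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "app-events")                  # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/app-events")
    .outputMode("append")
    .start("/tmp/tables/app_events")
)
query.awaitTermination()
```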
Posted 4 days ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Infosys BPM Limited is hiring for an Analyst role at the Pune location.

Analyst - Supplier Master Data
Location: Pune
Shift: Rotational
Experience: JL4A – 6+ yrs

Role/Responsibilities:
- Due diligence (kdPrevent): vendor creation and item creation, including validation of vendor particulars.
- Compliance/maintenance of vendor particulars and risk assessment: technical risk, business continuity, limited sourcing risk, dependency risk, financial risk, sustainability risk, cyber security risk.
- Bank detail verification and update, compliance check (with kdPrevent), financial check (BVD database).
- Supplier code of conduct and anti-bribery questionnaire.
- Run the kdPrevent report, manage the result, and store it on the ERP.
- Receive vendor registration supporting documents (including code of conduct and anti-bribery questionnaire responses), review, confirm, and store on Shared Google at entity level.
- Create the vendor on the ERP, using information supplied and validated.
- MDM strategy and data cleansing strategy development; experience in classification, master data management (material, vendor and customer master enrichment/cleansing), de-duplication, etc. (an illustrative de-duplication sketch follows this description).
- SAP functional knowledge – understanding of the overall SAP system structure; multi-tasking master data expert.
- Developing/participating in new solutions, tools, methodologies, and strategies in a growing MDM practice.
- Perform master data audits and validation to ensure conformance to business rules, standards, and metrics that meet business requirements.
- Prepare data design documentation such as the data model, data standards, CRUD matrix, and IT system integration matrix.
- Drive a project implementation from due diligence to final sign-off from the client, and maintain the SLAs on an agreed-upon basis.
- Perform pre-analysis activities such as classifying invoice and PO records to a specified hierarchy; conduct data reviews to ensure high quality and accuracy of reports on which analysis is performed.
- Project collaboration: work effectively both independently and as a member of cross-functional teams.

Skillset:
- Good understanding and work experience of master data and its impact on downstream processes.
- Minimum 8 years of professional experience in Master Data Management (key data objects: customer, vendor, material, product).
- Fluent verbal and written communication skills; presentation skills.
- Excellent data analysis and interpretation skills – proven skills in Excel modeling and analysis.
- Strong storytelling skills to deliver recommendations from data analysis – proven skills in PowerPoint, in a business-case presentation context.
- Knowledge and experience of key features of Master Data Management platforms (SAP ECC, SAP MDG, Talend, Stibo, Collibra, Informatica, Winshuttle, etc.).
- Participation in or experience with AI/ML-based projects will be an asset.
- Self-motivated and takes ownership.
- Project and team management skills.
- Skills in ERP, SQL and data visualization.
- Interpersonal skills and thought leadership.
- Effective communication and maintaining professional relations with the client and Infosys.
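A toy illustration of the de-duplication and cleansing work described above: the pandas sketch below normalizes vendor names and flags likely duplicate vendor records. The column names and the matching rule are simplified assumptions; real MDM cleansing would use richer matching (addresses, tax IDs, fuzzy scores) inside the MDM platform.

```python
import re

import pandas as pd

vendors = pd.read_csv("vendor_master_extract.csv")  # columns assumed: vendor_id, name, country, tax_id

def normalize_name(name: str) -> str:
    """Lowercase, strip punctuation and common legal suffixes so near-identical names compare equal."""
    name = re.sub(r"[^a-z0-9 ]", " ", str(name).lower())
    name = re.sub(r"\b(pvt|private|ltd|limited|inc|llc|gmbh)\b", " ", name)
    return re.sub(r"\s+", " ", name).strip()

vendors["name_key"] = vendors["name"].map(normalize_name)

# Candidate duplicates: same normalized name and country but different vendor IDs.
dupes = vendors[vendors.duplicated(subset=["name_key", "country"], keep=False)]
print(dupes.sort_values(["name_key", "vendor_id"])[["vendor_id", "name", "country"]])
dupes.to_csv("candidate_duplicates.csv", index=False)
```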
If interested, please share your updated resume with the below details to merlin.varghese@infosys.com:
Total Experience:
Relevant Experience:
Current CTC:
Expected CTC:
Notice Period:
Current Location:
Willing to Work from Office:
Flexible with night shifts:
Flexible to Relocate to Pune (if any):

Regards,
Infosys BPM Team.
Posted 4 days ago
2.0 - 5.5 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve continuous improvement and optimisation of the managed services process, tools and services. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. Role: Associate Tower: Data, Analytics & Specialist Managed Service Experience: 2.0 - 5.5 years Key Skills: AWS Educational Qualification: BE / B Tech / ME / M Tech / MBA Work Location: India. Job Description As an Associate, you will work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: Use feedback and reflection to develop self-awareness, personal strengths, and address development areas. Flexible to work in stretch opportunities/assignments. Demonstrate critical thinking and the ability to bring order to unstructured problems. Ticket quality and deliverables review, status reporting for the project. Adherence to SLAs, experience in incident management, change management and problem management. Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. Use straightforward communication, in a structured way, when influencing and connecting with others. Able to read situations and modify behavior to build quality relationships. Uphold the firm's code of ethics and business conduct. Demonstrate leadership capabilities by working with clients directly and leading the engagement. Work in a team environment that includes client interactions, workstream management, and cross-team collaboration. 
Good team player; take up cross-competency work and contribute to COE activities. Escalation/Risk management. Position Requirements Required Skills: AWS Cloud Engineer Job description: Candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas: Should have a minimum of 2 years' hands-on experience building advanced data warehousing solutions on leading cloud platforms. Should have a minimum of 1-3 years of Operate/Managed Services/Production Support experience. Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc. Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes. Should have experience in building efficient ETL/ELT processes using industry-leading tools like AWS, AWS Glue, AWS Lambda, AWS DMS, PySpark, SQL, Python, DBT, Prefect, Snowflake, etc. Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in AWS. Work together with data scientists and analysts to understand the needs for data and create effective data workflows. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage. Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Should have experience in building and maintaining Data Governance solutions (Data Quality, Metadata management, Lineage, Master Data Management and Data security) using industry-leading tools. Scaling and optimizing schema and performance tuning SQL and ETL pipelines in data lake and data warehouse environments. Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc. Should have experience of ITIL processes like Incident management, Problem Management, Knowledge management, Release management, Data DevOps, etc. Should have strong communication, problem-solving, quantitative and analytical abilities. Nice To Have AWS certification Managed Services - Data, Analytics & Insights Managed Service At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprise through technology and human-enabled experiences. 
Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment. Within our global Managed Services platform, we provide Data, Analytics & Insights, where we focus on the evolution of our clients' Data and Analytics ecosystem. Our focus is to empower our clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive in a high-paced work environment and are capable of working on a mix of critical Data, Analytics & Insights offerings and engagements including help desk support, enhancement, and optimization work, as well as strategic roadmap and advisory level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
Posted 5 days ago
8.0 - 10.0 years
0 Lacs
Andhra Pradesh, India
On-site
Summary about Organization A career in our Advisory Acceleration Center is the natural extension of PwC’s leading global delivery capabilities. The team consists of highly skilled resources that can assist in the areas of helping clients transform their business by adopting technology using bespoke strategy, operating model, processes, and planning. You will be at the forefront of helping organizations adopt innovative technology solutions that optimize business processes or enable scalable technology. Our team helps organizations transform their IT infrastructure, modernize applications and data management to help shape the future of business. An essential and strategic part of Advisory's multi-sourced, multi-geography Global Delivery Model, the Acceleration Centers are a dynamic, rapidly growing component of our business. The teams out of these Centers have achieved remarkable results in process quality and delivery capability, resulting in a loyal customer base and a reputation for excellence. Job Description As a Senior Data Governance Engineer, you will play a crucial role in the development and implementation of our data governance architecture & strategy. You will work closely with cross functional teams to ensure the integrity, quality, and security of our data assets. Your expertise in various Data Governance tools and custom implementations will be pivotal in driving our data governance initiatives forward. Key areas of expertise include Implement end to end data governance in medium to large sized data projects. Implement, configure, and maintain Data Governance tools such as Collibra, Apache Atlas, Microsoft PurView, BigID Evaluate and recommend appropriate DG tools and technologies based on business requirements. Define, implement, and monitor data quality rules and standards. Collaborate with data stewards, IT, legal, and business units to establish data governance processes. Provide guidance and support to data stewards. Work with business units to define, develop, and maintain business glossaries Ensure compliance with regulatory requirements and internal data governance frameworks. Collaborate with IT, data management teams, and business units to ensure alignment of data governance objectives. Communicate data governance initiatives and policies effectively across the organization. Qualifications Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Management, or a related field. 8 - 10 years of experience in data governance, data management, or a related field. Proven experience with Data Governance tools such as Collibra, Apache Atlas, Microsoft PurView, BigID and end to end data governance implementations. Experience with Cloud data quality monitoring and management Proficiency with cloud-native data services and tools on Azure and AWS Strong understanding of data quality principles and experience in defining and implementing data quality rules. Experience in implementing & monitoring data quality remediation workflows to address data quality issues. Experience serving in a data steward role with a thorough understanding of data stewardship responsibilities. Demonstrated experience in defining and maintaining business glossaries. Excellent analytical, problem solving, and organizational skills. Strong communication and interpersonal skills, with the ability to work effectively with cross functional teams. Knowledge of regulatory requirements related to data governance is a plus. 
Preferred Skills Certification in Data Governance or Data Management (e.g., CDMP, Collibra Certification). Knowledge of the Financial Services domain. Experience with data cataloging and metadata management. Familiarity with data governance, quality, and privacy regulations and frameworks (e.g., GDPR, CCPA, BCBS, COBIT, DAMA-DMBOK).
Posted 5 days ago
130.0 years
6 - 9 Lacs
Hyderābād
On-site
Job Description Senior Manager, Data Engineer The Opportunity Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be a part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats. Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company’s IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers. Role Overview Responsibilities Designs, builds, and maintains data pipeline architecture - ingest, process, and publish data for consumption. Batch processes collected data and formats it in an optimized, analysis-ready way. Ensures best-practice sharing across the organization. Enables delivery of data-analytics projects. Develops deep knowledge of the company's supported technology; understands the whole complexity/dependencies between multiple teams and platforms (people, technologies). Communicates intensively with other platforms/competencies to comprehend new trends and methodologies being implemented/considered within the company ecosystem. Understands customer and stakeholder business needs/priorities and helps build solutions that support our business goals. Establishes and manages close relationships with customers/stakeholders. Maintains an overview of data engineering market developments to explore new ways of delivering pipelines and increasing their value/contribution. Builds a “community of practice” leveraging experience from delivering complex analytics projects. Is accountable for ensuring that the team delivers solutions with high quality standards, timeliness, compliance and excellent user experience. Contributes to innovative experiments, specifically to idea generation, idea incubation and/or experimentation, identifying tangible and measurable criteria. Qualifications: Bachelor’s degree in Computer Science, Data Science, Information Technology, Engineering or a related field. 3+ years of experience as a Data Engineer or in a similar role, with a strong portfolio of data projects. 3+ years of SQL experience, with the ability to write and optimize queries for large datasets. 1+ years of experience and proficiency in Python for data manipulation, automation, and pipeline development. 
Experience with Databricks including creating notebooks and utilizing Spark for big data processing. Strong experience with data warehousing solution (such as Snowflake), including schema design and performance optimization. Experience with data governance and quality management tools, particularly Collibra DQ. Strong analytical and problem-solving skills, with an attention to detail. SAP Basis experience working on SAP S/4HANA deployments on Cloud platforms (example: AWS, GCP or Azure). Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation. Who we are: We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world. What we look for: Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today. #HYDIT Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular Relocation: VISA Sponsorship: Travel Requirements: Flexible Work Arrangements: Hybrid Shift: Valid Driving License: Hazardous Material(s): Required Skills: Business, Business, Business Intelligence (BI), Business Management, Contractor Management, Cost Reduction, Database Administration, Database Optimization, Data Engineering, Data Flows, Data Infrastructure, Data Management, Data Modeling, Data Optimization, Data Quality, Data Visualization, Design Applications, ETL Tools, Information Management, Management Process, Operating Cost Reduction, Senior Program Management, Social Collaboration, Software Development, Software Development Life Cycle (SDLC) {+ 1 more} Preferred Skills: Job Posting End Date: 08/13/2025 A job posting is effective until 11:59:59PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID: R350686
Posted 5 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Platform Support Provide technical support and troubleshoot issues related to the Starburst Enterprise Platform. Ensure platform performance, availability, and reliability using Helm charts for resource management. Deployment And Configuration Manage deployment and configuration of the Starburst Enterprise Platform on Kubernetes using Helm charts and YAML-based values files. Build and maintain Docker images as needed to support efficient, scalable deployments and integrations. Employ GitHub Actions for streamlined CI/CD processes. User Onboarding And Support Assist in onboarding users by setting up connections, catalogs, and data consumption client tools. Address user queries and incidents, ensuring timely resolution and issue triage. Maintenance And Optimization Perform regular updates, patching, and maintenance tasks to ensure optimal platform performance. Conduct application housekeeping, review user query logs, and perform access audits. Scripting And Automation Develop automation scripts using Python and GitHub pipelines to enhance operational efficiency. Document workflows and ensure alignment with business objectives. Broader Knowledge And Integration Maintain expertise in technologies like Immuta, Apache Ranger, Collibra, Snowflake, PostgreSQL, Redshift, Hive, Iceberg, dbt, AWS Lambda, AWS Glue, and Power BI. Provide insights and recommendations for platform improvements and integrations. New Feature Development And Integration Collaborate with feature and product development teams to design and implement new features and integrations with other data product value chain systems and tools. Assist in defining specifications and requirements for feature enhancements and new integrations. Automation And Innovation Identify opportunities for process automation and implement solutions to enhance operational efficiency. Innovate and contribute to the development of new automation tools and technologies. Incident Management Support incident management processes, including triaging and resolving technical challenges efficiently. Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field. Experience supporting and maintaining applications deployed on Kubernetes using Helm charts and Docker images. Understanding of RDS, GitHub Actions, and CI/CD pipelines. Proficiency in Python and YAML scripting for automation and configuration. Excellent problem-solving skills and the ability to support users effectively. Strong verbal and written communication skills. Preferred Qualifications Experience working with Kubernetes (k8s). Knowledge of data and analytical products like Immuta, Apache Ranger, Collibra, Snowflake, PostgreSQL, Redshift, Hive, Iceberg, dbt, AWS Lambda, AWS Glue, and Power BI. Familiarity with cloud environments such as AWS. Knowledge of additional scripting languages or tools is a plus. Beneficial Experience Exposure to Starburst or other data virtualization technologies like Dremio, Trino, Presto, and Athena.
Posted 5 days ago
15.0 years
0 Lacs
Bhubaneshwar
On-site
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Collibra Data Governance Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process, coordinating with team members, and ensuring project milestones are met. Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute on key decisions - Provide solutions to problems for their immediate team and across multiple teams - Lead the application development process - Coordinate with team members to ensure project milestones are met Professional & Technical Skills: - Must To Have Skills: Proficiency in Collibra Data Governance - Strong understanding of data governance principles - Experience in implementing data governance frameworks - Knowledge of data quality management practices - Familiarity with metadata management tools Additional Information: - The candidate should have a minimum of 5 years of experience in Collibra Data Governance - This position is based at our Bhubaneswar office - A 15 years full-time education is required 15 years full time education
Posted 5 days ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities Create Solution Outline and Macro Design to describe end-to-end product implementation in Data Platforms, including system integration, data ingestion, data processing, serving layer, design patterns, and platform architecture principles for the data platform. Contribute to pre-sales and sales support through RFP responses, solution architecture, planning and estimation. Contribute to reusable component / asset / accelerator development to support capability development. Participate in customer presentations as Platform Architect / Subject Matter Expert on Big Data, Azure Cloud and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews / product reviews and quality assurance, and work as design authority. Preferred Education Non-Degree Program Required Technical And Professional Expertise Experience in designing data products providing descriptive, prescriptive, and predictive analytics to end users or other systems. Experience in data engineering and architecting data platforms. Experience in architecting and implementing Data Platforms on Azure Cloud Platform. Experience on Azure cloud is mandatory (ADLS Gen 1 / Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow. Experience in the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks. Preferred Technical And Professional Experience Experience in architecting complex data platforms on Azure Cloud Platform and On-Prem. Experience and exposure to implementation of Data Fabric and Data Mesh concepts and solutions like Microsoft Fabric or Starburst or Denodo or IBM Data Virtualisation or Talend or Tibco Data Fabric. Exposure to Data Cataloging and Governance solutions like Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary, etc.
Posted 5 days ago
130.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description Senior Specialist, Data and Analytics Architect THE OPPORTUNITY Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Lead an organization driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be the leaders who have a passion for using data, analytics, and insights to drive decision-making, which will allow us to tackle some of the world's greatest health threats. Our Technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, Tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each tech center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers. Role Overview We are seeking a talented and motivated Technical Architect to join our Data and Analytics Strategy & Architecture team. Reporting to the Lead Architect, this mid-level Technical Architect role is critical in shaping the technical foundation of our cross-product architecture. The ideal candidate will focus on reference architecture, driving proofs of concept (POCs) and points of view (POVs), staying updated on industry trends, solving technical architecture issues, and enabling a robust data observability framework. The role will also emphasize enterprise data marketplaces and data catalogs to ensure data accessibility, governance, and usability. This position will also focus on creating a customer-centric development environment that is resilient and easily adoptable by various user personas. The outcome of the cross-product integration will be improved efficiency and productivity through accelerated provisioning times and a seamless user experience, eliminating the need for interacting with multiple platforms and teams. What Will You Do In The Role Collaborate with product line teams to design and implement cohesive architecture solutions that enable cross-product integration, spanning ingestion, governance, analytics, and visualization. Develop, maintain, and advocate for reusable reference architectures that align with organizational goals and industry standards. Lead technical POCs and POVs to evaluate new technologies, tools, and methodologies, providing actionable recommendations. Diagnose and resolve complex technical architecture issues, ensuring stability, scalability, and performance across platforms. Implement and maintain frameworks to monitor data quality, lineage, and reliability across data pipelines. Contribute to the design and implementation of an enterprise data marketplace to facilitate self-service data discovery, analytics, and consumption. 
Oversee and extend the use of Collibra or similar tools to enhance metadata management, data governance, and cataloging across the enterprise. Monitor emerging industry trends in data and analytics (e.g., AI/ML, data engineering, cloud platforms) and identify opportunities to incorporate them into our ecosystem. Work closely with data engineers, data scientists, and other architects to ensure alignment with the enterprise architecture strategy. Create and maintain technical documentation, including architecture diagrams, decision records, and POC/POV results. What Should You Have Strong experience with Databricks, Dataiku, Starburst and related data engineering/analytics platforms. Proficiency in AWS cloud platforms and AWS Data and Analytics technologies Knowledge of modern data architecture patterns like data Lakehouse, data mesh, or data fabric. Hands-on experience with Collibra or similar data catalog tools for metadata management and governance. Familiarity with data observability tools and frameworks to monitor data quality and reliability. Experience contributing to or implementing enterprise data marketplaces, including facilitating self-service data access and analytics. Exposure to designing and implementing scalable, distributed architectures. Proven experience in diagnosing and resolving technical issues in complex systems. Passion for exploring and implementing innovative tools and technologies in data and analytics. 3–5 years of total experience in data engineering, analytics, or architecture roles. Hands-on experience with developing ETL pipelines with DBT, Matillion and Airflow. Experience with data modeling, and data visualization tools (e.g., ThoughtSpot, Power BI). Strong communication and collaboration skills. Ability to work in a fast-paced, cross-functional environment. Focus on continuous learning and professional growth. Preferred Skills Certification in Databricks, Dataiku, or a major cloud platform. Experience with orchestration tools like Airflow or Prefect. Understanding of AI/ML workflows and platforms. Exposure to frameworks like Apache Spark or Kubernetes. Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation Who We Are We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world. What We Look For Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are intellectually curious, join us—and start making your impact today. 
Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails. Employee Status: Regular Relocation: VISA Sponsorship: Travel Requirements: Flexible Work Arrangements: Hybrid Shift: Valid Driving License: Hazardous Material(s): Required Skills: Business Enterprise Architecture (BEA), Business Process Modeling, Data Modeling, Emerging Technologies, Requirements Management, Solution Architecture, Stakeholder Relationship Management, Strategic Planning, System Designs Preferred Skills: Job Posting End Date: 07/3/2025 A job posting is effective until 11:59:59PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID: R345601
Posted 5 days ago
5.0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Collibra Data Governance Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process, coordinating with team members, and ensuring project milestones are met. Roles & Responsibilities: - Expected to be an SME - Collaborate and manage the team to perform - Responsible for team decisions - Engage with multiple teams and contribute on key decisions - Provide solutions to problems for their immediate team and across multiple teams - Lead the application development process - Coordinate with team members to ensure project milestones are met Professional & Technical Skills: - Must To Have Skills: Proficiency in Collibra Data Governance - Strong understanding of data governance principles - Experience in implementing data governance frameworks - Knowledge of data quality management practices - Familiarity with metadata management tools Additional Information: - The candidate should have a minimum of 5 years of experience in Collibra Data Governance - This position is based at our Bhubaneswar office - A 15 years full-time education is required
Posted 5 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Please note, this is a 12-month initial contract, with the possibility of extensions. This role is hybrid in 560037, Bengaluru. Insight Global are looking for a Data Management Business Analyst to join one of their premium clients in the financial services space. You will play a pivotal role in bridging the gap between business needs and technical solutions, with a strong emphasis on data governance and data management. You will ensure that the company's data assets are effectively governed, secure, and aligned with business objectives, with a specific focus on supporting the capture of data lineage across the technology estate. You will be the liaison for internal stakeholders when it comes to understanding requirements. You may also be involved in manipulating data at the same time. Must haves: 5+ years' experience in a Business Analyst and/or Data Analyst role with a focus on data governance, data management, or data quality Strong technical understanding of data systems, including databases (for example, SQL), data modelling, and data integration tools Proficiency in data analysis tools and techniques (such as Python, R, or Excel) Experience in developing and implementing data governance frameworks, policies, or standards Excellent communication and stakeholder management skills, with the ability to translate complex technical concepts into simplified business language Experience creating business requirement documentation (BRD) Strong understanding of regulatory compliance requirements related to data (for example GDPR, DORA, or industry-specific regulations) Bachelor’s degree in a relevant field such as Computer Science, Information Systems, Data Science, Business Administration, or equivalent Plusses: Hands-on experience with data governance tools (such as Collibra, Informatica or Solidatus) Familiarity with cloud-based data platforms (such as Azure, AWS or Google Cloud) Knowledge of modern data platforms (for example Snowflake, Databricks or Azure Data Lake) Knowledge of data visualization tools for presenting insights (for example Tableau or Power BI) Experience writing user stories Experience working in an Agile environment (using tools such as Jira is advantageous) Experience working in financial services or other regulated industries Understanding of machine learning or advanced analytics concepts An advanced degree in Data Science, Business Analytics, or related fields Professional certifications in business analysis (such as CBAP, CCBA), data analysis, or data governance (such as DAMA CDMP, CISA) are highly desirable
Posted 5 days ago
3.0 years
0 Lacs
Kochi, Kerala, India
Remote
AWS Data Engineer Location: Remote (India) Experience: 3+ Years Employment Type: Full-Time About the Role: We are seeking a talented AWS Data Engineer with at least 3 years of hands-on experience in building and managing data pipelines using AWS services. This role involves working with large-scale data, integrating multiple data sources (including sensor/IoT data), and enabling efficient, secure, and analytics-ready solutions. Experience in the energy industry or working with time-series/sensor data is a strong plus. Key Responsibilities: Build and maintain scalable ETL/ELT data pipelines using AWS Glue, Redshift, Lambda, EMR, S3, and Athena Process and integrate structured and unstructured data, including sensor/IoT and real-time streams Optimize pipeline performance and ensure reliability and fault tolerance Collaborate with cross-functional teams including data scientists and analysts Perform data transformations using Python, Pandas, and SQL Maintain data integrity, quality, and security across the platform Use Terraform and CI/CD tools (e.g., Azure DevOps) for infrastructure and deployment automation Support and monitor pipeline workflows, troubleshoot issues, and implement fixes Contribute to the adoption of emerging tools like AWS Bedrock, Textract, Rekognition, and GenAI solutions Required Skills and Qualifications: Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field 3+ years of experience in data engineering using AWS Strong skills in: AWS Glue, Redshift, S3, Lambda, EMR, Athena Python, Pandas, SQL RDS, Postgres, SAP HANA Solid understanding of data modeling, warehousing, and pipeline orchestration Experience with version control (Git) and infrastructure as code (Terraform) Preferred Skills: Experience working with energy sector data or IoT/sensor-based data Exposure to machine learning tools and frameworks (e.g., SageMaker, TensorFlow, Scikit-learn) Familiarity with big data technologies like Apache Spark, Kafka Experience with data visualization tools (Tableau, Power BI, AWS QuickSight) Awareness of data governance and catalog tools such as AWS Data Quality, Collibra, and AWS DataBrew AWS Certifications (Data Analytics, Solutions Architect)
Posted 6 days ago
15.0 years
0 Lacs
India
Remote
Job Title: Data Engineer Lead - AEP Location: Remote Experience Required: 12–15 years overall experience 8+ years in Data Engineering 5+ years leading Data Engineering teams Cloud migration & consulting experience (GCP preferred) Job Summary: We are seeking a highly experienced and strategic Lead Data Engineer with a strong background in leading data engineering teams, modernizing data platforms, and migrating ETL pipelines and data warehouses to Google Cloud Platform (GCP) . You will work directly with enterprise clients, architecting scalable data solutions, and ensuring successful delivery in high-impact environments. Key Responsibilities: Lead end-to-end data engineering projects including cloud migration of legacy ETL pipelines and Data Warehouses to GCP (BigQuery) . Design and implement modern ELT/ETL architectures using Dataform , Dataplex , and other GCP-native services. Provide strategic consulting to clients on data platform modernization, governance, and data quality frameworks. Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders. Define and enforce data engineering best practices , coding standards, and CI/CD processes. Mentor and manage a team of data engineers; foster a high-performance, collaborative team culture. Monitor project progress, ensure delivery timelines, and manage client expectations. Engage in technical pre-sales and solutioning , driving excellence in consulting delivery. Technical Skills & Tools: Cloud Platforms: Strong experience with Google Cloud Platform (GCP) – particularly BigQuery , Dataform , Dataplex , Cloud Composer , Cloud Storage , Pub/Sub . ETL/ELT Tools: Apache Airflow, Dataform, dbt (if applicable). Languages: Python, SQL, Shell scripting. Data Warehousing: BigQuery, Snowflake (optional), traditional DWs (e.g., Teradata, Oracle). DevOps: Git, CI/CD pipelines, Docker. Data Modeling: Dimensional modeling, Data Vault, star/snowflake schemas. Data Governance & Lineage: Dataplex, Collibra, or equivalent tools. Monitoring & Logging: Stackdriver, DataDog, or similar. Preferred Qualifications: Proven consulting experience with premium clients or Tier 1 consulting firms. Hands-on experience leading large-scale cloud migration projects . GCP Certification(s) (e.g., Professional Data Engineer, Cloud Architect). Strong client communication, stakeholder management, and leadership skills. Experience with agile methodologies and project management tools like JIRA. Show more Show less
Posted 6 days ago