2.0 - 8.0 years
12 - 13 Lacs
Bengaluru
Work from Office
Department: Medium Voltage Offers Engineering Center of Excellence

Overview: As an Engineering Data Steward within Schneider Electric's Medium Voltage Offers organization, you will play a critical role in managing and maintaining the integrity of engineering data. This position focuses on ensuring that data is accurate, accessible, and effectively utilized to support product development and decision-making processes.

Key Responsibilities:
- Data Management: Oversee the collection, organization, and maintenance of engineering data related to medium voltage products (PDM and PIM data for the SBU).
- Quality Assurance: Implement data quality standards and regularly audit data to ensure accuracy and compliance.
- Collaboration & Leadership: Work closely with engineering, Technical Offer Architects, Offer Data Leaders, and IT teams to ensure seamless integration of data across various platforms (PDM, ERPs, PIM, environmental data management systems) and proper application of data management rules.
- Documentation: Develop and maintain comprehensive documentation for data management processes and systems.
- Training and Support: Provide training and support to team members on data management best practices and tools.
- Reporting: Generate reports and dashboards to track data quality metrics and trends.

Qualifications:
- Bachelor's degree in Engineering, Mechanical Engineering, Computer Science, or a related field.
- Experience with data management and analysis in an engineering context.
- Strong analytical skills and attention to detail.
- Familiarity with data governance and data quality frameworks.
- Excellent communication and collaboration skills.

What We Offer:
- A dynamic work environment with opportunities for growth and development.
- A chance to contribute to innovative projects within the Medium Voltage line of business.
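The "implement data quality standards and regularly audit data" responsibility above can be sketched as a small rule-based audit. This is an illustrative sketch only: the field names and rules below are hypothetical, not Schneider Electric's actual PDM/PIM schema.

```python
# Rule-based data quality audit sketch; records and rules are hypothetical.
def audit_records(records, rules):
    """Apply each named rule to every record; collect (index, rule) failures."""
    failures = []
    for i, rec in enumerate(records):
        for name, check in rules.items():
            if not check(rec):
                failures.append((i, name))
    return failures

# Example rules: a completeness check and a validity check on product data.
RULES = {
    "part_number_present": lambda r: bool(r.get("part_number")),
    "voltage_in_range": lambda r: 1_000 <= r.get("voltage_v", 0) <= 52_000,
}

products = [
    {"part_number": "MV-1001", "voltage_v": 12_000},
    {"part_number": "", "voltage_v": 24_000},      # missing part number
    {"part_number": "MV-1003", "voltage_v": 400},  # below medium-voltage range
]

report = audit_records(products, RULES)
print(report)  # [(1, 'part_number_present'), (2, 'voltage_in_range')]
```

Running the audit on a schedule and trending the failure counts would feed the dashboards the posting mentions.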
Posted 1 month ago
6.0 - 11.0 years
18 - 20 Lacs
Hyderabad
Work from Office
As part of the CE Data Platform team, your role will involve establishing a clear vision for data engineering practices and aligning it with the data architecture. Collaboration with product managers is essential to understand business requirements and identify opportunities to leverage data. The position also entails creating, designing, and developing complex data processing pipelines. A cooperative relationship with the data intelligence team is necessary for designing scalable implementations and productionizing data models. The role involves writing clean, iterative code and using various continuous delivery practices to deploy, support, and operate data pipelines. Selecting suitable data modeling techniques, optimizing and designing physical data models, and understanding the trade-offs between various data modeling techniques form an integral part of this role.

Job Qualifications: You are passionate about data, able to build and operate data pipelines, and maintain data storage within distributed systems. This role requires a deep understanding of data modeling and experience with modern data engineering tools and platforms, along with cloud warehousing tools. It is perfect for individuals who can go deep into coding and lead junior members to implement a solution. Experience in defining and implementing data governance and security policies is crucial. Knowledge of DevOps and the ability to navigate all phases of the data and release life cycle is also essential.

Professional Skills:
- You are familiar with AWS and Azure Cloud.
- You have extensive knowledge of Snowflake; SnowPro Core certification is a must-have.
- You have used DBT in at least one project to deploy models in production.
- You have configured and deployed Airflow and integrated various operators in Airflow (especially DBT and Snowflake).
- You can design and build release pipelines and understand the Azure DevOps ecosystem.
- You have an excellent understanding of Python (especially PySpark) and can write metadata-driven programs.
- You are familiar with Data Vault (Raw, Business) and concepts like Point in Time tables and the Semantic Layer.
- You are resilient in ambiguous situations and can clearly articulate the problem in a business-friendly way.
- You believe in documenting processes, managing the artifacts, and evolving them over time.

Good-to-have skills:
- You have experience with data visualization techniques and can communicate insights appropriately for the audience.
- Experience with Terraform and HashiCorp Vault is highly desirable.
- Knowledge of Docker and Streamlit is a big plus.
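The "metadata-driven programs" skill above refers to pipelines whose transformations are described in configuration rather than hard-coded. A minimal sketch in plain Python (in the role itself the same idea would typically be expressed over PySpark DataFrames; the column names and operations here are hypothetical):

```python
# Metadata drives the pipeline: the spec, not the code, lists the transforms.
TRANSFORM_SPEC = [
    {"column": "email", "op": "lower"},
    {"column": "country", "op": "default", "value": "IN"},
]

def apply_spec(rows, spec):
    """Apply each metadata-described operation to every row."""
    out = []
    for row in rows:
        row = dict(row)  # avoid mutating the caller's data
        for step in spec:
            col = step["column"]
            if step["op"] == "lower" and isinstance(row.get(col), str):
                row[col] = row[col].lower()
            elif step["op"] == "default" and row.get(col) is None:
                row[col] = step["value"]
        out.append(row)
    return out

rows = [{"email": "User@Example.COM", "country": None}]
print(apply_spec(rows, TRANSFORM_SPEC))
# [{'email': 'user@example.com', 'country': 'IN'}]
```

Adding a new cleansing rule then means editing the spec (which can live in a table or YAML file), not redeploying code.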
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Mumbai
Work from Office
As an Audit Manager, you will bring to life Internal Audit's value proposition by supporting the bank to move at pace, safely, through our enterprise-wide lens and independence to deliver what matters for our customers, the board, and regulators. Your primary responsibility is to provide independent assurance on the risk and control environment. You will support the execution of the Internal Audit Plan for the division. You will be accountable for ensuring appropriate and timely assessment of the key risks and controls, and for the design and delivery of assurance activities and insights. You will conduct audit fieldwork for assigned audit areas, contributing to the assessment of the risk profile and controls of the business area under audit. You will critically analyse the appropriateness and effectiveness of internal controls within the business being audited. As part of the India branch Internal Audit team, you will assist in regulatory data submissions, periodic and ad hoc reports, and responding to regulatory queries during onsite regulatory inspections or as and when required by the regulators.

Banking is changing and we're changing with it, giving our people great opportunities to try new things, learn and grow. Whatever your role at ANZ, you'll be building your future, while helping to build ours.

Role Location: ANZ Branch, Mumbai, India
Role Type: Permanent, Full-time

What will your day look like? As an Audit Manager, you will be responsible for the following:
- Ensure delivery of timely and quality audit tasks assigned, reporting of findings, messaging to the business, and managing audit teams, providing real-time feedback.
- Ensure delivery of regulatory required submissions, agreed-upon procedures, data validations, and attestations to meet regulatory expectations.
- Develop and sustain positive and constructive relationships with key first- and second-line stakeholders.
- Ensure audit messages and recommendations are value-adding, material, show foresight, and are timely, commercial and pragmatic.
- Support the IA team to raise the standard of customer experience and actively contribute to IA being seen as a value-add business partner.
- Deliver appropriate assurance through quality audit reports supported by appropriate audit evidence.
- Manage or execute audits in alignment with IA methodology and IIA standards, and ensure methodology efficiencies are identified.
- Compile and keep up to date your own performance and development plans; solicit and act on development and performance feedback.

What will you bring? To grow and be successful in this role, you will ideally bring the following:
- Proven experience and understanding of the Institutional banking business.
- Knowledge of IT General Controls and understanding of payments technology and related controls.
- Good understanding of the Indian regulatory environment; experience of and familiarity with RBI's data submission, attestation and validation expectations for Information Systems Audit (for instance, RBI's Cyber Security Returns and SWIFT-related operational controls).
- Strong understanding and experience of key Operational Risk and Compliance frameworks and concepts.
- Knowledge and experience using data analytics tools and techniques to support audit work.
- Significant experience in internal audit, preferably with foreign banks operating in India, coupled with institutional banking and finance industry knowledge.
- Focus on technology and/or integrated audits.
- Ability to undertake risk and control analysis of business areas and processes, using this to design and deliver effective assurance activities.
- Understanding of technology and/or data governance-related regulatory requirements.
- Ability to use and apply data analytics techniques in the delivery of audits.
- Strong track record of delivering high quality work in complex technical areas (e.g. audit, assurance).
- Excellent verbal and written communication skills.
- Tertiary qualified, ideally with professional (CPA/CA) and/or postgraduate qualifications with strong technically relevant skills.
- Postgraduate IT qualifications with technically relevant skills (e.g. CISA, CISSP, CISM, ITIL).
Posted 1 month ago
5.0 - 10.0 years
6 - 10 Lacs
Noida
Work from Office
Salesforce Health Cloud Specialist
Location: Noida, India
Experience: 5 Years
Employment Type: Full-time

Job Overview: We are looking for a skilled and motivated Salesforce Health Cloud Specialist to join our growing team in Noida. The ideal candidate will possess hands-on experience with Salesforce Health Cloud, coupled with a strong grasp of healthcare workflows and cloud-based technology. This role will be pivotal in designing, implementing, and optimizing Salesforce Health Cloud solutions to enhance patient care and operational efficiency.

Key Responsibilities:
- Implementation & Configuration: Design, configure, and implement Salesforce Health Cloud solutions tailored to healthcare provider workflows and patient engagement needs.
- Customization & Optimization: Customize Health Cloud features for patient care management, care coordination, and compliance with healthcare standards and protocols.
- System Integration: Integrate Salesforce Health Cloud with other healthcare platforms (e.g., EHR/EMR systems) and third-party applications using APIs.
- Technical Support & Maintenance: Provide day-to-day support, issue resolution, and continuous improvement for existing Health Cloud applications.
- User Enablement: Develop training materials and conduct sessions to empower end-users with knowledge of system capabilities and best practices.
- Data Governance & Compliance: Ensure accuracy, consistency, and privacy of patient and provider data, adhering to HIPAA and other regulatory standards.
- Stakeholder Collaboration: Partner with healthcare teams, IT, and business stakeholders to gather requirements, deliver solutions, and drive adoption.
- Continuous Improvement: Stay updated with Salesforce Health Cloud enhancements and proactively recommend system optimizations.

Qualifications:
- Experience: 2-5 years of hands-on experience with Salesforce Health Cloud or related Salesforce platforms.
- Certifications: Salesforce Health Cloud Consultant certification (preferred). Additional Salesforce certifications such as Administrator or Platform Developer are a plus.
- Technical Proficiency: Solid understanding of Health Cloud components like Care Plans, Patient Timeline, and Provider Network. Experience with Apex, Visualforce, and Lightning Web Components is advantageous.
- Domain Knowledge: Good understanding of healthcare industry processes, workflows, and compliance requirements (e.g., HIPAA).
- Soft Skills: Excellent communication, documentation, problem-solving, and interpersonal skills.
- Collaboration: Proven ability to work effectively in cross-functional teams and manage competing priorities.

What We Offer:
- Competitive salary and performance-based bonuses
- Health insurance and wellness programs
- Career development and Salesforce certification support
- A collaborative and inclusive work culture
- Opportunity to work on impactful healthcare transformation projects
Posted 1 month ago
4.0 - 7.0 years
15 - 20 Lacs
Pune
Work from Office
As a Senior Solution Architect, you will work with a collaborative team and play a meaningful role in the integration of our businesses, where we connect all the digital teams and the consumers and procurers of IT to build a coordinated, flexible, effective IT architecture for bp. You will also work within data architecture, which specialises in our data and analytics platforms that provide the capability to progress data through its value chain to insight, and the data solutions based on those platforms.

What you will deliver:
- Architecture: Rigorously develop solution architectures, seeking practical solutions that optimize and re-use capabilities; build technical designs of services or applications and care passionately about the integrity of the IT capabilities developed.
- Technology: Be an excellent technologist, passionate about understanding and learning, contributing to digital transformation from an architectural perspective; bring hands-on skills in key technologies and quickly assess new technologies commercially.
- Data engineering and analytics: Focus on capabilities to realise insight from information and knowledge, spanning data analytics and data science, including business intelligence, machine learning pipelines and modeling, and other sophisticated analytics; maintain awareness of information modeling of data assets through to their implementation in data pipelines, and the associated data processing and storage techniques.
- Safety and compliance: The safety of people and customers is the highest priority; this role advocates and ensures architectures, designs, and processes enhance a culture of operational safety and improve digital security.
- Collaboration: Establish team abilities, demonstrating leadership through delegation, motivation, and trust; build positive relationships, advise leaders on technology, and mentor within digital delivery teams; encourage engagement with technology as a driver of change; understand long-term solution needs and foster rapport with team members inside and outside bp.

What you will need to be successful (experience and qualifications)

Technical skills:
- A Bachelor's (or higher) degree or equivalent work experience.
- A confirmed background in architecture with real-world experience of architecting.
- Deep-seated functional knowledge of key technology sets.
- Ability to be part of a tight-knit delivery team, accomplishing outstanding project outcomes in a respectful and encouraging culture.
- A proven grasp of architecture development and design thinking in an agile environment, adapting delivery techniques to drive outstanding project delivery.
- Capability in information architecture and data engineering/management processes, including data governance/modelling techniques and tools, processing methods and technologies.
- Capability in data analytics and data science architectures, including business intelligence, machine learning pipelines and modelling, and associated technologies.

Desirable skills:
- Data Governance and Modelling Tools, Databricks, Azure Data Services Architectures, Azure Data Analytics and Data Science Architectures
- Programming Languages: Python, Scala, Spark variants
- Business Modelling, Business Risk Management, User Experience Analysis, Emerging Technology Monitoring, IT Strategy and Planning, Systems Design, Capacity Management, Network Design, Service Acceptance, Systems Development Management

About bp: Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Additional Information: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Job Description

As a Sr Data Engineer, your role is to spearhead the data engineering teams and elevate the team to the next level! You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles, deploying Agile where possible, and create the foundations for good data stewardship with our new data products! You will also set up a solid code framework that is built to purpose yet has enough flexibility to adapt to new business use cases, a tough but rewarding challenge!

Responsibilities:
- Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale
- Lead Data Engineers to define, build and maintain the Data Platform
- Work on building a Data Lake in Azure Fabric, processing data from multiple sources
- Migrate the existing data store from Azure Synapse to Azure Fabric
- Implement data governance and access control
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards
- Present technical solutions, capabilities, considerations, and features in business terms
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives, such as Data Discovery, Data Lineage and Data Quality
- Lead the team and mentor junior resources; help your team members grow in their role and achieve their career aspirations
- Build data systems, pipelines, analytical tools and programs
- Conduct complex data analysis and report on results

Qualifications:
- 7+ years of experience as a data engineer or in a similar role in Azure Synapse or ADF, or relevant experience in Azure Fabric
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field
- Experience executing projects end to end
- At least one data engineering project delivered in Azure Synapse, ADF or Azure Fabric
- Experience handling multiple data sources
- Technical expertise with data models, data mining, and segmentation techniques
- Deep understanding, both conceptually and in practice, of at least one object-oriented library (Python, PySpark)
- Strong SQL skills and a good understanding of existing SQL warehouses and relational databases
- Strong Spark, PySpark, and Spark SQL skills and a good understanding of distributed processing frameworks
- Ability to build large-scale batch and real-time data pipelines
- Ability to work independently and mentor junior resources
- Desire to lead and develop a team of Data Engineers across multiple levels
- Experience or knowledge in Data Governance
- Azure Cloud experience with data modeling, CI/CD, Agile methodologies, and Docker/Kubernetes
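The Data Lineage initiative mentioned in the responsibilities can be pictured as a graph of which datasets feed which. A minimal sketch (the dataset names are hypothetical; real platforms capture this automatically via catalog tooling rather than hand-registered steps):

```python
# Coarse-grained lineage: record each pipeline step's output and inputs,
# then trace any derived dataset back to its raw sources.
lineage = {}  # derived dataset -> list of direct inputs

def register_step(output, inputs):
    lineage[output] = list(inputs)

def upstream(dataset):
    """Recursively resolve every raw source feeding `dataset`."""
    sources = set()
    for parent in lineage.get(dataset, []):
        if parent in lineage:
            sources |= upstream(parent)   # derived input: recurse
        else:
            sources.add(parent)           # raw source: no recorded inputs
    return sources

register_step("stg_orders", ["raw_orders"])
register_step("stg_customers", ["raw_customers"])
register_step("fct_sales", ["stg_orders", "stg_customers"])

print(sorted(upstream("fct_sales")))  # ['raw_customers', 'raw_orders']
```

The same traversal, run in reverse, answers the impact-analysis question ("what breaks downstream if this source changes?") that lineage initiatives usually exist to serve.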
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
We help the world run better

What you'll do: The main objective of the Senior Project Manager - Compliance & Certification is to accompany the security compliance governance of cloud products within the scope of our Information Security and Business Continuity Management Systems, taking a holistic, end-to-end approach. You'll collaborate with cross-functional teams, including IT, Legal, and Finance, to address technology and security compliance needs. Taking over responsibility for prioritising security compliance issues together with the respective operations or development teams will be part of your main activities. As technical security, security controls, and technical and organisational measures are converging in cloud computing, a broad and holistic view on all aspects of compliance and governance is necessary. Your role will contribute to managing compliance initiatives, including gap assessments, and consulting on the implementation and governance of controls to address regulatory requirements. You will consult and coordinate cross-functionally, especially in the area of legal, contractual and regulatory compliance and obligations. You are the main contact to translate these into controls and transfer them into governance activities, e.g. for SOC, C5, TISAX, and CSA STAR. Advising on framework and governance activities while providing a holistic overview of the strategic positioning of compliance and information security to the business units will be your day-to-day responsibility.

What you bring:
- 7+ years of project management and/or program management experience
- 4+ years of professional compliance, certifications, and software consulting experience
- Working experience in international and virtual projects
- Master's/Bachelor's degree in computer/natural science, economics, engineering or a related field; proficiency in business economics and process analytics
- Deep knowledge of ISO, SOC and C5 related certifications and audits
- Passion for compliance and a background in consulting (SDOL, IT Security Concepts, Product Security, Data Governance)
- Strong analytical and communication skills
- Proficiency in English, verbal and written
- Agility, openness, and ability to adapt to a dynamic environment
- Strong project management skills and a track record in external compliance and obligations
- Strong know-how of cloud platform technologies such as infrastructure management, management systems, information security, virtualization, containers, and networking

Meet your team: The Governance, Compliance & Certification Office is responsible for the governance of compliance of all customer-facing cloud development and operations of our stakeholders. We offer the possibility to be a key player within our highly motivated Office by translating compliance and legal language into tangible good practices, while supporting and consulting stakeholders along the cloud value chain.

Bring out your best. Successful candidates might be required to undergo a background verification with an external vendor.

Requisition ID: 426618 | Posted Date: May 21, 2025 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid
Posted 1 month ago
10.0 - 16.0 years
35 - 60 Lacs
Bengaluru
Hybrid
At least 5 years of experience in a complex business environment or international matrix organisation. Must have experience and knowledge in data governance. Strong IT background, including expertise in big data, cloud technology, monitoring solutions, machine learning (ML), and artificial intelligence (AI). Familiarity with data governance tools such as Collibra (preferred) or similar alternatives. Proven track record in product management, data management, and information technology systems and tools. Experience with the SAFe Agile framework. Knowledge of data analytics/dashboard tools like Qlik and Microsoft Power BI is a plus. Experience in the travel domain is nice to have.
Posted 1 month ago
1.0 - 6.0 years
16 - 17 Lacs
Pune
Work from Office
Join us as a Data Engineer - PySpark, SQL at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. As part of a team of developers, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions.

To be successful as a Data Engineer - PySpark, SQL you should have experience with:
- Hands-on experience in PySpark and strong knowledge of DataFrames, RDDs and Spark SQL
- Hands-on experience in PySpark performance optimization techniques
- Hands-on experience in developing, testing and maintaining applications on AWS Cloud
- A strong hold on the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena)
- Designing and implementing scalable and efficient data transformation/storage solutions with open table formats such as Delta, Iceberg and Hudi
- Experience in using DBT (Data Build Tool) with Snowflake/Athena/Glue for ELT pipeline development
- Experience in writing advanced SQL and PL/SQL programs
- Hands-on experience building reusable components using Snowflake and AWS tools/technology
- Having worked on at least two major project implementations
- Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage
- Experience in using orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage
- Knowledge of the Ab Initio ETL tool is a plus

Some other highly valued skills include:
- Ability to engage with stakeholders, elicit requirements/user stories and translate requirements into ETL components
- Ability to understand the infrastructure setup and provide solutions either individually or working with teams
- Good knowledge of Data Marts and Data Warehousing concepts
- Good analytical and interpersonal skills
- Implementing a cloud-based enterprise data warehouse with multiple data platforms, along with Snowflake and NoSQL environments, to build a data movement strategy

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, ensuring that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Assistant Vice President Expectations:
- Advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions and business divisions.
- Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives and determine reward outcomes.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others.
- For an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes.
- Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues.
- Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda.
- Take ownership for managing risk and strengthening controls in relation to the work done.
- Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function.
- Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy.
- Engage in complex analysis of data from multiple internal and external sources of information, such as procedures and practices (in other areas, teams, companies, etc.), to solve problems creatively and effectively.
- Communicate complex information, which could include sensitive information or information that is difficult to communicate because of its content or its audience.
- Influence or convince stakeholders to achieve outcomes.
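The "advanced SQL" the posting asks for usually means window functions and similar constructs. A self-contained sketch using Python's built-in sqlite3 (the `txn` table and its columns are illustrative only, not Barclays' schema; sqlite3 supports window functions on SQLite 3.25+, which ships with modern Python):

```python
import sqlite3

# Rank each account's transactions by amount within the account,
# a common window-function pattern (PARTITION BY + ORDER BY).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txn (account TEXT, amount REAL);
    INSERT INTO txn VALUES
        ('A', 100.0), ('A', 250.0), ('B', 75.0), ('A', 40.0), ('B', 300.0);
""")

rows = conn.execute("""
    SELECT account, amount,
           RANK() OVER (PARTITION BY account ORDER BY amount DESC) AS rnk
    FROM txn
    ORDER BY account, rnk
""").fetchall()

for row in rows:
    print(row)
# ('A', 250.0, 1) through ('B', 75.0, 2)
```

The same `RANK() OVER (PARTITION BY ... ORDER BY ...)` shape carries over directly to Spark SQL and Snowflake, the engines named above.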
Posted 1 month ago
1.0 - 6.0 years
16 - 17 Lacs
Pune
Work from Office
Join us as a Data Engineer - PySpark, SQL at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. As part of a team of developers, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions. To be successful as a Data Engineer - PySpark, SQL, you should have experience with: Hands-on experience in PySpark and strong knowledge of DataFrames, RDDs, and Spark SQL. Hands-on experience in PySpark performance optimisation techniques. Hands-on experience in developing, testing, and maintaining applications on AWS Cloud. Strong command of the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena). Design and implement scalable and efficient data transformation/storage solutions with open table formats such as Delta, Iceberg, and Hudi. Experience in using dbt (Data Build Tool) with Snowflake/Athena/Glue for ELT pipeline development. Experience in writing advanced SQL and PL/SQL programs. Hands-on experience building reusable components using Snowflake and AWS tools/technology. Should have worked on at least two major project implementations. Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage. Experience in using orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage. Knowledge of the Ab Initio ETL tool is a plus. Some other highly valued skills include: Ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components. Ability to understand the infrastructure setup and provide solutions either individually or working with teams. Good knowledge of data marts and data warehousing concepts. The resource should possess good analytical and interpersonal skills.
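For candidates weighing the DataFrame/RDD requirement above, the following is a minimal pure-Python sketch of the map/filter/aggregate shape a typical PySpark pipeline expresses (it does not use PySpark itself, and the record format and field names are illustrative, not from the posting):

```python
# Raw "records" as they might land from S3: one CSV-like line per event.
raw = ["2024-01-01,orders,120.50", "2024-01-01,refunds,-30.00",
       "2024-01-02,orders,99.99", "bad line"]

def parse(line):
    parts = line.split(",")
    if len(parts) != 3:
        return None  # malformed row, dropped downstream
    date, kind, amount = parts
    return {"date": date, "kind": kind, "amount": float(amount)}

# Analogue of rdd.map(parse).filter(not-None).filter(kind == "orders")
rows = [r for r in map(parse, raw) if r and r["kind"] == "orders"]

# Analogue of a groupBy("date").sum("amount") aggregation
totals = {}
for r in rows:
    totals[r["date"]] = totals.get(r["date"], 0.0) + r["amount"]

print(totals)  # {'2024-01-01': 120.5, '2024-01-02': 99.99}
```

In Spark the same chain would be distributed across partitions; the conceptual steps (parse, filter bad rows, aggregate by key) are identical.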
Implement a cloud-based enterprise data warehouse across multiple data platforms, including Snowflake and NoSQL environments, to build the data movement strategy. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune. Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure. Accountabilities: Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete, and consistent data. Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Develop processing and analysis algorithms fit for the intended data complexity and volumes. Collaborate with data scientists to build and deploy machine learning models. Analyst Expectations: To perform prescribed activities in a timely manner and to a consistently high standard, driving continuous improvement. Requires in-depth technical knowledge and experience in their assigned area of expertise, and a thorough understanding of the underlying principles and concepts within the area of expertise. They lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others.
OR, for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. They will have an impact on the work of related teams within the area. Partner with other functions and business areas. Take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services, and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.
Posted 1 month ago
5.0 - 8.0 years
8 - 15 Lacs
Pune
Work from Office
Core Technical Skills: Design and develop robust backend solutions using Java 11+/17 and Spring Boot. Build, test, and maintain scalable microservices in a cloud environment (AWS). Work with Kafka or other messaging systems for event-driven architecture. Write clean, maintainable code with high test coverage. Tools & Reporting: Java 11+/17, Spring Boot, AWS, Kafka. Soft Skills: Strong communication and coordination with app teams. Analytical thinking and problem-solving. Ability to work independently or collaboratively. Snowflake architecture & performance tuning, Oracle DB, SQL optimization, data governance, RBAC, data replication, Time Travel & Cloning, dynamic data masking, OEM & AWR reports, Apps DBA experience.
Posted 1 month ago
7.0 - 12.0 years
27 - 42 Lacs
Chennai
Work from Office
Collibra Certified Rangers (preferable). Analyze current architecture, data flow, data dependencies, and related documentation. Develop Collibra integration solutions. Trace data from the source system, across the various contact points of the data landscape, to the final destination system. Use the Lineage Harvester and adhere to industry best practices. Experience in metadata extraction and in building business and technical lineage among assets. Design, develop, and test Collibra integrations and workflows. Take advantage of the depth and breadth of integrations to connect the data ecosystem to the Collibra Data Intelligence Platform. Access Collibra-supported integrations, partner and other pre-built integrations, and APIs to gain visibility. Understand and share insights. Understand Data Governance, Metadata Management, Reference Data Management, Data Modeling, Data Integration, and Data Analysis. Work as a solution provider. Collibra API development.
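The "trace data from source to destination" requirement is, at its core, a walk over a lineage graph. This is not the Collibra API; it is a hedged, stdlib-only sketch of the idea, with made-up asset names:

```python
# Technical lineage as a directed graph: asset -> downstream assets.
# All asset names are hypothetical illustrations.
lineage = {
    "crm.accounts": ["staging.accounts"],
    "staging.accounts": ["dwh.dim_customer"],
    "dwh.dim_customer": ["bi.customer_report"],
}

def trace(asset, graph):
    """Depth-first walk returning every downstream path from `asset`."""
    downstream = graph.get(asset, [])
    if not downstream:
        return [[asset]]  # terminal asset: the final destination system
    paths = []
    for nxt in downstream:
        for path in trace(nxt, graph):
            paths.append([asset] + path)
    return paths

paths = trace("crm.accounts", lineage)
```

A harvester-style tool builds a graph like `lineage` automatically from SQL and ETL metadata; the traversal to answer "where does this field end up?" is the same shape.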
Posted 1 month ago
2.0 - 7.0 years
3 - 5 Lacs
Hyderabad
Work from Office
Send CVs to shilpa.srivastava@orcapod.work with the subject "Data Governance". Notice period: immediate to 15 days only. Excellent spoken English is required, as the interview will be conducted by an American panel of interviewers. Location: Hyderabad. Role & responsibilities. Work you will do: Facilitate requests for new data and metrics from a variety of stakeholders through the evaluation and approval steps outlined in the governance process. Communicate necessary clarifications with requestors where needed. Update governance and security documentation as new guidance is obtained through the team and/or our risk, data security, or governance partners. Provide recommendations for improvements to the processes in place. Develop knowledge and expertise of Talent Data Governance processes and of what sensitive fields and combinations of fields are. Function as a utility player across multiple areas on the team. Document detailed toolkits and support materials to help stakeholders understand and work through approval processes. The successful candidate will possess: Ability to balance detail (e.g., tracking nuanced stakeholder needs) while thinking strategically and holistically about underlying goals/needs. Ability to think strategically and understand how requests can have downstream impacts. Ability to formulate constructive questions to gather missing information from stakeholders. Ability to communicate effectively with a broad set of stakeholders and other project teams from across the business, spanning multiple levels. Proficiency in quickly understanding data governance and security and communicating their importance to stakeholders. Work well in a dynamic, complex, client- and team-focused environment with minimal oversight and an agile mindset. Analytical ability to delve into details while maintaining a broader business perspective. A self-starter who needs minimal guidance and instruction to perform daily responsibilities.
Strong organizational skills in executing multiple project priorities simultaneously. Strong written and oral communication skills to convey complex topics concisely. Knowledge of the professional services industry. Qualifications: Bachelor's degree and a minimum of 2 years of related experience, or 1 year of applied experience in data governance and/or data security. 2-3 years of experience in the following applications: Microsoft Office (e.g., PowerPoint, Excel, OneNote, Word, Teams). Preferred knowledge: Knowledge of data security and data governance and how they are applied in the professional services industry. Understanding of lean data methodology. Proficiency in Microsoft PowerPoint.
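The posting's point about "sensitive fields and combinations of fields" is worth making concrete: individually harmless fields can become sensitive together. A hedged sketch, with entirely hypothetical field names and rules (no real governance policy is implied):

```python
# A request is flagged when it contains a whole sensitive *combination*,
# even though each field alone might be approvable.
SENSITIVE_COMBOS = [
    {"name", "date_of_birth"},           # re-identification risk
    {"employee_id", "salary_band"},      # compensation exposure
]

def flag_request(requested_fields):
    """Return the sensitive combinations fully contained in a request."""
    fields = set(requested_fields)
    return [combo for combo in SENSITIVE_COMBOS if combo <= fields]

flags = flag_request(["name", "date_of_birth", "city"])
print(flags)  # [{'name', 'date_of_birth'}] -- 'city' alone adds nothing
```

In practice such rules live in a governance tool or policy document; the check itself is just set containment, as above.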
Posted 1 month ago
12.0 - 22.0 years
25 - 40 Lacs
Bangalore Rural, Bengaluru
Work from Office
Role & responsibilities. Requirements: Data Modeling (Conceptual, Logical, Physical) - minimum 5 years. Database Technologies (SQL Server, Oracle, PostgreSQL, NoSQL) - minimum 5 years. Cloud Platforms (AWS, Azure, GCP) - minimum 3 years. ETL Tools (Informatica, Talend, Apache NiFi) - minimum 3 years. Big Data Technologies (Hadoop, Spark, Kafka) - minimum 5 years. Data Governance & Compliance (GDPR, HIPAA) - minimum 3 years. Master Data Management (MDM) - minimum 3 years. Data Warehousing (Snowflake, Redshift, BigQuery) - minimum 3 years. API Integration & Data Pipelines - good to have. Performance Tuning & Optimization - minimum 3 years. Business Intelligence (Power BI, Tableau) - minimum 3 years. Job Description: We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric/Databricks/ . Experience and deep knowledge of at least one of these 3 platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights. Key Responsibilities: 1. Data Governance & Management: Establish and maintain a data usage hierarchy to ensure structured data access. Define data policies, standards, and governance frameworks to ensure consistency and compliance. Implement data quality management practices to improve accuracy, completeness, and reliability. Oversee metadata and Master Data Management (MDM) to enable seamless data integration across platforms. 2. Data Architecture & Migration: Lead the migration of data systems from legacy infrastructure to Microsoft Fabric. Design scalable, high-performance data architectures that support business intelligence and analytics.
Collaborate with IT and engineering teams to ensure efficient data pipeline development. 3. Advanced Analytics & Machine Learning: Identify and define use cases for advanced analytics that align with business objectives. Design and develop machine learning models to drive data-driven decision-making. Work with data scientists to operationalize ML models and ensure real-world applicability. Required Qualifications: Proven experience as a Data Architect or in a similar role in data management and analytics. Strong knowledge of data governance frameworks, data quality management, and metadata management. Hands-on experience with Microsoft Fabric and data migration from legacy systems. Expertise in advanced analytics, machine learning models, and AI-driven insights. Familiarity with data modelling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP). Strong communication skills with the ability to translate complex data concepts into business insights. Preferred candidate profile: immediate joiner.
Posted 1 month ago
12.0 - 16.0 years
30 - 40 Lacs
Bengaluru
Work from Office
We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric/Databricks/ . Experience and deep knowledge of at least one of these 3 platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights. Requirements: Data Modeling (Conceptual, Logical, Physical) - minimum 5 years. Database Technologies (SQL Server, Oracle, PostgreSQL, NoSQL) - minimum 5 years. Cloud Platforms (AWS, Azure, GCP) - minimum 3 years. ETL Tools (Informatica, Talend, Apache NiFi) - minimum 3 years. Big Data Technologies (Hadoop, Spark, Kafka) - minimum 5 years. Data Governance & Compliance (GDPR, HIPAA) - minimum 3 years. Master Data Management (MDM) - minimum 3 years. Data Warehousing (Snowflake, Redshift, BigQuery) - minimum 3 years. API Integration & Data Pipelines - good to have. Performance Tuning & Optimization - minimum 3 years. Business Intelligence (Power BI, Tableau) - minimum 3 years. Key Responsibilities: 1. Data Governance & Management: Establish and maintain a data usage hierarchy to ensure structured data access. Define data policies, standards, and governance frameworks to ensure consistency and compliance. Implement data quality management practices to improve accuracy, completeness, and reliability. Oversee metadata and Master Data Management (MDM) to enable seamless data integration across platforms. 2. Data Architecture & Migration: Lead the migration of data systems from legacy infrastructure to Microsoft Fabric. Design scalable, high-performance data architectures that support business intelligence and analytics. Collaborate with IT and engineering teams to ensure efficient data pipeline development. 3.
Advanced Analytics & Machine Learning: Identify and define use cases for advanced analytics that align with business objectives. Design and develop machine learning models to drive data-driven decision-making. Work with data scientists to operationalize ML models and ensure real-world applicability. Required Qualifications: Proven experience as a Data Architect or in a similar role in data management and analytics. Strong knowledge of data governance frameworks, data quality management, and metadata management. Hands-on experience with Microsoft Fabric and data migration from legacy systems. Expertise in advanced analytics, machine learning models, and AI-driven insights. Familiarity with data modelling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP). Strong communication skills with the ability to translate complex data concepts into business insights.
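Since MDM comes up in both Data Architect postings above, here is a minimal sketch of the core MDM operation: merging duplicate records into a "golden record" via a survivorship rule. Record shapes, system prefixes, and the freshest-non-empty-value rule are all illustrative assumptions, not a description of any specific MDM product:

```python
# Duplicate customer records from two hypothetical systems (CRM, ERP).
records = [
    {"id": "CRM-1", "email": "a@x.com", "phone": "",         "updated": "2024-01-01"},
    {"id": "ERP-9", "email": "",        "phone": "555-0101", "updated": "2024-03-01"},
]

def golden_record(dupes, fields=("email", "phone")):
    """Survivorship: for each field, take the first non-empty value
    from the most recently updated record downward."""
    ordered = sorted(dupes, key=lambda r: r["updated"], reverse=True)
    return {field: next((r[field] for r in ordered if r[field]), "")
            for field in fields}

merged = golden_record(records)
print(merged)  # {'email': 'a@x.com', 'phone': '555-0101'}
```

Real MDM platforms add matching (deciding which records are duplicates at all) and configurable survivorship; the merge step itself reduces to something like the above.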
Posted 1 month ago
4.0 - 8.0 years
18 - 22 Lacs
Hyderabad, Bengaluru
Work from Office
Job Type: C2H (Long Term) Required Skill Set: Core Technical Skills: Snowflake database design, architecture & performance tuning Strong experience in Oracle DB and SQL query optimization Expertise in DDL/DML operations, data replication, and failover handling Knowledge of Time Travel, Cloning, and RBAC (Role-Based Access Control) Experience with dynamic data masking, secure views, and data governance Tools & Reporting: Familiarity with OEM, Tuning Advisor, AWR reports Soft Skills: Strong communication, coordination with app teams Analytical thinking and problem-solving Ability to work independently or collaboratively Additional Experience: Previous role as Apps DBA or similar Exposure to agile methodologies Hands-on with Snowflake admin best practices, load optimization, and secure data sharing.
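To make the "dynamic data masking" requirement above concrete: the same column is returned raw to privileged roles and masked for everyone else. This pure-Python sketch only mimics what a Snowflake masking policy does; the role names and mask format are assumptions for illustration:

```python
# Role-based masking: privileged roles see raw data, others see a
# partially redacted form. Role names are hypothetical.
PRIVILEGED_ROLES = {"SECURITY_ADMIN", "PII_READER"}

def mask_email(value, role):
    if role in PRIVILEGED_ROLES:
        return value                       # raw value for privileged roles
    local, _, domain = value.partition("@")
    return local[0] + "***@" + domain      # masked form for everyone else

masked = mask_email("jane.doe@corp.com", "ANALYST")
full = mask_email("jane.doe@corp.com", "PII_READER")
print(masked)  # j***@corp.com
```

In Snowflake this logic lives in a `CREATE MASKING POLICY` attached to the column, evaluated per query against the session's current role, so applications never see unmasked data they are not entitled to.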
Posted 1 month ago
5 - 10 years
35 - 50 Lacs
Bengaluru
Work from Office
Position summary: We are seeking a Senior Software Development Engineer – Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions. Key Roles & Responsibilities: Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake. Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP). Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data. Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency. Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization. Implement data governance, security, and compliance best practices. Build and maintain data models, transformations, and data marts for analytics and reporting. Collaborate with data scientists, analysts, and business teams to define data engineering requirements. Automate infrastructure and deployments using Terraform, Airflow, or dbt. Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks. Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools. Basic Qualifications Bachelor’s or Master’s Degree in Computer Science or Data Science 5-8 years of experience in data engineering, big data processing, and cloud-based data platforms. Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks. Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization). Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine). 
Proficiency in SQL, Python, or Scala for data transformation and analytics. Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg). Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery). Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster. Strong understanding of data governance, access control, and encryption strategies. Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies. Preferred Qualifications Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub). Experience in BI and analytics tools (Tableau, Power BI, Looker). Familiarity with data observability tools (Monte Carlo, Great Expectations). Experience with machine learning feature engineering pipelines in Databricks. Contributions to open-source data engineering projects.
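The data-quality and observability duties above (Great Expectations is named as one option) boil down to declaring expectations over a dataset and counting violations as a pipeline gate. A stdlib-only sketch, with illustrative column names and rules:

```python
# Rows as a pipeline stage might see them.
rows = [
    {"order_id": 1,    "amount": 120.5},
    {"order_id": 2,    "amount": -3.0},  # violates non-negative expectation
    {"order_id": None, "amount": 9.9},   # violates not-null expectation
]

# Each expectation is a predicate a valid row must satisfy.
expectations = {
    "order_id_not_null": lambda r: r["order_id"] is not None,
    "amount_non_negative": lambda r: r["amount"] >= 0,
}

def run_checks(data, checks):
    """Count failing rows per expectation; a nonzero count can fail the run."""
    return {name: sum(0 if check(r) else 1 for r in data)
            for name, check in checks.items()}

report = run_checks(rows, expectations)
print(report)  # {'order_id_not_null': 1, 'amount_non_negative': 1}
```

Tools like Great Expectations add suites, persisted results, and docs on top, but an orchestrator task (e.g. in Airflow) gating on a report like this is the essential pattern.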
Posted 1 month ago
5 - 10 years
19 - 25 Lacs
Bengaluru
Work from Office
Location: Shell Technology Centre - Bangalore, India. End date to apply: May 29, 2025. Job requisition ID: R168611. Job Family Group: Downstream Supply Chain. Worker Type: Regular. Posting Start Date: May 13, 2025. Business unit: Downstream and Renewables. Experience Level: Experienced Professionals. About the Role. Where do you fit in? The Lubricants Americas business is on a transformational journey to grow our business by 30% in the next 3 years, enabled by a broader digital transformation, an S&OP refresh, and brilliant basics, amongst many other tactics, all of which are underpinned by accurate, dynamic master data. The business has over 10k ship-to locations, 20+ manufacturing locations, 10+ storage locations, 4k+ finished goods, and 1k+ raw materials, with thousands of different shipping lanes, vendors, etc.; all this to say, there is a significant amount of master data to be maintained in constantly evolving external market conditions. For us not only to be cost competitive, but also to deliver against our customer promise and leverage the various digital tools being implemented, our master data accuracy must grow significantly, and fast. Vision: To take our E2E Master Data Management from Emerging to Differentiated on the Accenture maturity scale, distinguished by leveraging proactive alerts to address master data anomalies, incorporating financial inputs into master data management, and developing and using in-depth business rule sets to audit and update rule sets. We are seeking an experienced and highly capable Master Data Management (MDM) Team Lead to oversee our MDM operations, which include finished goods, production, raw materials, and customer sourcing. This role involves leading the development and execution of MDM strategies and policies across the organization.
The ideal candidate will provide strong leadership to the MDM team, ensuring adherence to data governance standards and maintaining the accuracy and integrity of all master data. What's the role? Develop and execute master data management (MDM) strategies and policies to ensure data accuracy, completeness, and consistency across the organization. Lead and guide the MDM team, ensuring compliance with established data governance standards and best practices. Collaborate cross-functionally with other departments to maintain high-quality data and drive continuous improvement in data integrity. Design and oversee data quality initiatives, including profiling, cleansing, and enrichment processes. Ensure proper classification, mapping, and maintenance of data to support reliable reporting and analytics. Utilize data analytics tools such as Power BI and Spotfire to detect anomalies and drive corrective actions, leveraging automation tools like Alteryx for mass data updates. Maintain a deep understanding of all systems housing master data (e.g., SAP, TMS, O8) and their interdependencies. Foster strong relationships with stakeholders and business units to promote adherence to data governance policies and ensure alignment with organizational goals. Contribute to the creation and implementation of enterprise-wide data governance frameworks and standards. Provide regular updates and reports to leadership on the progress and impact of MDM initiatives. Continuously evaluate and enhance MDM processes, tools, and methodologies to ensure they remain effective and aligned with business needs. What we need from you: A Bachelor's degree in Computer Science, Supply Chain, Information Systems, or a related field. Minimum of 5-10 years of experience in Master Data Management or a related field.
5-10 years of experience in a supply chain environment. 3-5 years of SAP/ERP experience required, with strong exposure to at least two of the functional areas described above. Proven experience in leading an MDM team. Strong knowledge of data governance principles, best practices, and technologies. Experience with data profiling, cleansing, and enrichment tools. Ability to work with cross-functional teams to understand and address their master data needs. Proven ability to build predictive analytics tools using Power BI, Spotfire, or otherwise. Additional note: The time requirement for this position is M-F from 5:00 p.m. until 1:30 a.m. IST and will change in accordance with Daylight Saving Time in the US. A shift allowance (as per India country norms, i.e. INR 475 per day/shift) will be provided for the off-shift requirement. COMPANY DESCRIPTION: Shell is a global energy company where we work towards powering progress through more and cleaner energy solutions. We use advanced technologies and take an innovative approach to help build a sustainable energy future. In India, Shell has its business footprint in Information Technology, Projects & Technology, Finance Operations, Integrated Gas, Downstream & Upstream, spread across more than 7 main locations. An innovative place to work: Join us and you'll be adding your talent and imagination to a business with the ambition to shape the future. Whether by investing in renewables, exploring new ways to store energy, or developing technology that helps the world to use energy more efficiently, everyone at Shell does their part. An inclusive place to work: To power progress together, we need to attract and develop the brightest minds and make sure every voice is heard.
Here are just some of the ways we are nurturing an inclusive environment, one where you can express your ideas, extend your skills, and reach your potential. We are creating a space where people with disabilities can excel through a transparent recruitment process, workplace adjustments, and ongoing support in their roles. Feel free to let us know about your circumstances when you apply, and we'll take it from there. We are striving to be a pioneer of an inclusive and diverse workplace, promoting equality for employees regardless of sexual orientation or gender identity. We consider ourselves a flexible employer and want to support you in finding the right balance. We encourage you to discuss this with us in your application. A rewarding place to work: As an equal opportunity employer, combining our ideas through a creative, collaborative environment and global operations, we have developed, and will continue to nurture, a unique workplace with an impressive range of benefits to ensure that joining Shell is an inspired and rewarding career choice for everyone. DISCLAIMER: Please note: We occasionally amend or withdraw Shell jobs and reserve the right to do so at any time, including prior to the advertised closing date. Before applying, you are advised to read our data protection policy. This policy describes the processing that may be associated with your personal data and informs you that your personal data may be transferred to Shell/Shell Group companies around the world. The Shell Group and its approved recruitment consultants will never ask you for a fee to process or consider your application for a career with Shell. Anyone who demands such a fee is not an authorised Shell representative, and you are strongly advised to refuse any such demand. Shell is an Equal Opportunity Employer.
Posted 1 month ago
4 - 9 years
22 - 32 Lacs
Chennai
Work from Office
Key Responsibilities: Should currently be working as a Product/Project Manager in data product management. Develop and implement data governance policies, standards, and processes to ensure data integrity, quality, and security. Collaborate with cross-functional teams to embed governance best practices into data pipelines, analytics, and BI reporting. Define and monitor data quality metrics and KPIs. Set up data catalogs, classification, and metadata management for enhanced discoverability and compliance. Partner with IT, Security, and Compliance teams to ensure regulatory and policy adherence (e.g., GDPR, HIPAA). Leverage tools and technologies like SQL, Pandas Profiling, and Python to enhance data quality and governance workflows. Act as a subject matter expert on data governance strategies and tools. Skills and Experience: Bachelor's/Master's degree in Data Science, Information Management, Computer Science, or a related field. 8+ years of experience in data governance, data quality management, or BI reporting roles. Knowledge of data governance tools such as Collibra, OpenMetadata, DataHub, and Informatica. Proficiency in SQL, with hands-on experience in data profiling tools (e.g., Pandas Profiling). Strong understanding of data lifecycle management, privacy laws, and compliance frameworks. Excellent leadership, communication, and stakeholder management skills. Analytical mindset with experience in measuring and reporting on data quality KPIs.
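For a sense of what the profiling tools mentioned above automate: at minimum, per-column null counts and distinct counts over a table. A stdlib-only sketch with made-up sample data (real tools such as Pandas Profiling compute far richer statistics):

```python
# A tiny "table" as a list of row dicts; values are illustrative.
table = [
    {"country": "IN", "city": "Chennai"},
    {"country": "IN", "city": None},
    {"country": "US", "city": "Austin"},
]

def profile(rows):
    """Per-column null count and distinct (non-null) value count."""
    cols = rows[0].keys()
    return {
        col: {
            "nulls": sum(1 for r in rows if r[col] is None),
            "distinct": len({r[col] for r in rows if r[col] is not None}),
        }
        for col in cols
    }

stats = profile(table)
print(stats)
```

Null rate and cardinality are exactly the kind of numbers a governance team turns into the data quality KPIs the posting asks candidates to define and monitor.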
Posted 1 month ago
4 - 8 years
7 - 9 Lacs
Hyderabad
Work from Office
Overview: 170+ years strong. Industry leader. Global impact. At Pinkerton, the mission is to protect our clients. To do this, we provide enterprise risk management services and programs specifically designed for each client. Pinkerton employees are one of our most important assets and critical to the delivery of world-class solutions. Bonded together, we share a commitment to integrity, vigilance, and excellence. Pinkerton is an inclusive employer who seeks candidates with diverse backgrounds, experiences, and perspectives to join our family of industry subject matter experts. This role requires you to ensure that Physical Security Operations, covering the access control programs, CCTV monitoring programs, and data/analytics programs, function effectively. In this role you will be tasked with monitoring, reporting, investigating, analyzing, interpreting, and synthesizing data, as well as lapses in the physical security system of the organization, via electronic surveillance across multiple global locations. You will also serve as a point of contact for stakeholders from multiple locations. The role incorporates tasks such as data management initiatives and overseeing a team of data professionals. The ideal candidate will have a strong background in data governance and in preparing activity reports, alarm-based reports, and data-centric reports. Responsibilities: Represent Pinkerton's core values of integrity, vigilance, and excellence. Assist the PSOC Manager/Global Facilities Operations Director in effectively and seamlessly running the routine operations of the PSOC. Assist the PSOC Manager in documentation, data synthesis, and data analytics of the physical security operations. Enable documentation and data/record maintenance of occurrences reported within the spectrum of physical security of the organization. Communicate with various stakeholders about occurrences noticed during observation/monitoring.
Assist the PSOC Manager/Global Facilities Operations Director in audit-centric tasks by providing information as and when requested. Assist the PSOC Manager/Global Facilities Operations Director in tasks involving spontaneous and real-time information/data fetching. Data Management Leadership: Develop and implement data management strategies to ensure data integrity, quality, and security across the organization. Team Supervision: Manage a team of at least five data analysts, delegating tasks effectively and fostering a collaborative work environment. Application Technology and AI: Design and develop basic applications using Pegasystems technology or GenAI to streamline business processes and improve data workflows. Stakeholder Collaboration: Work closely with cross-functional teams to understand data requirements and deliver solutions that meet business needs. All other duties, as assigned. Qualifications: Educational background: Bachelor's degree in any discipline; a master's degree is a plus. Experience: Minimum of 3 years of experience in security and emergency incident management roles and data management roles. Proficiency in data governance and in MS Excel, Power BI, Power Automate, and other AI tools. Physical security expertise. Risk assessment. Crisis management. Technical proficiency. Communication skills. Working conditions: With or without reasonable accommodation, requires the physical and mental capacity to effectively perform all essential functions. Regular computer usage. Occasional reaching and lifting of small objects and operating office equipment. Frequent sitting, standing, and/or walking. Travel, as required.
Pinkerton is an equal opportunity employer to all applicants and positions without regard to race/ethnicity, color, national origin, ancestry, sex/gender, gender identity/expression, sexual orientation, marital/prenatal status, pregnancy/childbirth or related conditions, religion, creed, age, disability, genetic information, veteran status, or any protected status by local, state, federal or country-specific law.
Posted 1 month ago
5 - 10 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Application Developer Project Role Description: Design, build, and configure applications to meet business process and application requirements. Must-have skills: SAP Master Data Governance (MDG) Tool Good-to-have skills: No Function Specialty Minimum 5 year(s) of experience is required Educational Qualification: 15 years full-time education Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Lead and mentor junior professionals. Conduct regular team meetings to discuss progress and challenges. Stay updated on industry trends and best practices. Professional & Technical Skills: Must-Have Skills: Proficiency in SAP Master Data Governance (MDG) Tool. Strong understanding of data governance principles. Experience in data modeling and data quality management. Knowledge of SAP ERP systems and integration. Hands-on experience in configuring and customizing the SAP MDG Tool. Qualifications: 15 years full-time education
Posted 1 month ago
3 - 8 years
5 - 10 Lacs
Bengaluru
Work from Office
Project Role: Application Developer Project Role Description: Design, build, and configure applications to meet business process and application requirements. Must-have skills: SAP Master Data Governance (MDG) Tool Good-to-have skills: No Function Specialty Minimum 3 year(s) of experience is required Educational Qualification: 15 years full-time education Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in ensuring the smooth functioning of applications and processes. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions to work-related problems. Develop and implement SAP Master Data Governance (MDG) Tool solutions. Collaborate with cross-functional teams to gather and understand business requirements. Perform system testing and debugging to ensure applications are running smoothly. Provide technical support and guidance to end users. Stay updated with the latest technologies and trends in application development. Professional & Technical Skills: Must-Have Skills: Proficiency in SAP Master Data Governance (MDG) Tool. Strong understanding of data modeling and data governance principles. Experience in SAP MDG configuration and customization. Knowledge of SAP ERP systems and integration with SAP MDG. Hands-on experience in ABAP programming for SAP MDG solutions. Additional Information: The candidate should have a minimum of 3 years of experience in SAP Master Data Governance (MDG) Tool. This position is based at our Bengaluru office. A 15-year full-time education is required. Should have at least one end-to-end technical implementation experience. Qualifications: 15 years full-time education
Posted 1 month ago
7 - 12 years
9 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: SAP Master Data Governance (MDG) Tool Good-to-have skills: NA Minimum 7.5 year(s) of experience is required Educational Qualification: 15 years full-time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Lead the effort to design, build, and configure applications. Act as the primary point of contact. Manage the team and ensure successful project delivery. Professional & Technical Skills: Must-Have Skills: Proficiency in SAP Master Data Governance (MDG) Tool. Strong understanding of data governance principles and best practices. Experience in implementing and configuring the SAP MDG Tool. Hands-on experience in data modeling and data quality management. Solid grasp of SAP MDG integration with other SAP modules. Additional Information: The candidate should have a minimum of 7.5 years of experience in SAP Master Data Governance (MDG) Tool. This position is based at our Pune office. A 15-year full-time education is required. Qualifications: 15 years full-time education
Posted 1 month ago
3 - 8 years
7 - 12 Lacs
Mumbai, Gurugram, Delhi / NCR
Work from Office
Banking - Data & AI Consultant Find endless opportunities to solve our clients' toughest challenges, as you work with exceptional people, the latest tech, and leading companies. Practice: Banking, Industry Consulting, Strategy & Consulting Global Network (earlier named Capability Network) | Areas of Work: Data-Driven Consulting & Business Analytics | Level: Consultant | Location: Delhi, Gurgaon, Mumbai | Years of Exp: 3-9 years Explore an Exciting Career at Accenture Are you an outcome-oriented problem solver? Do you enjoy working on transformation strategies for global clients? Does working in an inclusive and collaborative environment spark your interest? Then Accenture is the right place for you to explore limitless possibilities. As part of the agile and distributed Strategy & Consulting Global Network team, you will contribute as a strategic partner, helping shape new business models, collaborating with clients in unconventional ways, rediscovering their purpose, and committing to constant innovation.
Bring in your banking and analytics expertise with a global perspective to enable banks and payment providers. Engage deeply with C-level executives to define and execute enterprise-wide Data & AI strategy programs, often becoming the "trusted advisor" on D&A strategy-related topics. Provide tailored advice and best practices to help customers implement mission-critical reforms and advances in analytics roadmap execution. Work closely with clients to solve complex business problems and deliver business value by leveraging data and analytics capabilities. Advise clients on analytics-driven revenue growth, intelligent operations, fraud, risk and regulatory management, and future-ready platforms. Design and implement campaigns around new products leveraging deep industry and client data. Define and implement a Digital Signals-to-Sales framework at scale. Lead project delivery and support in areas such as process improvement, systems enhancement, and user training. Interface with cross-functional client teams to perform project management activities. Support leaders in drafting winning pitches and discovering new business opportunities. Mentor team members and contribute to their professional development. Support asset development and contribute to thought leadership, staffing efforts, and capability development. Estimate resource effort and cost budgets to complete a project within a defined scope. Provide subject matter expertise, guidance, and recommendations that will drive the success and growth of the solutions and programs in the market.
Bring your best skills forward to excel in the role: Impeccable team management skills with an ability to engage effectively with multiple stakeholders Ability to lead client workshops and engage with stakeholders Ability to solve complex business problems through a methodical and structured solutioning approach and deliver client delight Strong analytical and writing skills to build viewpoints on industry trends Ability to develop and implement data analyses, data collection systems, and other strategies that optimize statistical efficiency and quality Ability to translate business requirements into non-technical, lay terms Knowledge of programming languages such as SQL and Python is good to have Knowledge of data visualisation tools such as Power BI, Qlik, and Tableau is good to have Excellent communication, interpersonal, and presentation skills Cross-cultural competence with an ability to thrive in a dynamic environment Qualifications Your experience counts! MBA from a Tier-1 B-school 3-9 years of work experience in Data and AI solutions delivery/implementation at top strategy, management, or technology consulting firms, and/or analytics firms serving banks or financial services companies Responsible for delivery and for supporting pre-sales opportunities for Data, Analytics, and AI solutions, working with the broader team in-country, across geographies, and with our ecosystem Experience in combining technical expertise with advanced product experience to deliver Data & AI solutions that solve business problems Experience with data integration and governance, preferably from data warehousing and migration projects Ability to identify problems or areas of improvement for the client using data analytics and modelling techniques and propose corrective solutions and an implementable roadmap Experience with Data Science and AI concepts Generation of new business by identifying opportunities in existing accounts Lead and support the delivery of technical solutions throughout each stage of the full systems lifecycle during the development of software solutions Participate in the development of engagement work plans; identify resource requirements and complete assigned tasks to budget and plan Effectively identify and communicate any necessary changes to engagement scope Help set and manage clients' expectations to ensure that all customers are delighted with Data and AI products and services Demonstrated ability to work collaboratively with business and technology practitioners and cross-functional teams to design and implement comprehensive strategy and operating models that enable Data & AI solutions for the banking industry
Posted 1 month ago