5.0 - 8.0 years
5 - 15 Lacs
Pune
Work from Office
Our goal is to reduce complexity and enhance transparency for our internal stakeholders, ensuring a scalable foundation for future growth. Our technology stack includes SAP BW, SAP Datasphere, Databricks, Microsoft Fabric, Microsoft Power Platform, and Power BI.

Background and Skills
To be successful in this role, you should hold a degree in Finance, Sales & Marketing, IT, Business Administration, Engineering, or a related field that enables you to approach tasks with a structured, analytical, and data-driven mindset. Previous experience with Finance processes, ERP systems (particularly SAP), ERP data models, Microsoft tools, and Power BI, combined with strong business acumen and exceptional collaboration skills, is essential.

Role & responsibilities
- Gather, analyze, and document business requirements clearly to guide Data Engineers in developing solutions aligned with business needs and technological capabilities
- Evaluate existing data processes and solutions, identifying opportunities for optimization and complexity reduction
- Engage with stakeholders to gather new requirements and clearly communicate process and data gaps, advocating for improvements and corrective actions
- Perform data modeling exercises to ensure a common data model is achieved across the various source data systems
- Cleanse and maintain data quality, actively supporting business stakeholders in continuous data quality improvement initiatives
- Ensure the development and maintenance of high-quality BI logic and reporting standards, delivering scalable, standardized outputs understandable by all stakeholders

Desired profile:
- Strong analytical mindset with excellent data querying and analytical skills
- Proactive, solution-oriented, and pragmatic approach to problem-solving
- Curiosity and openness to learning new methods and tools
- Independent and efficient work habits, demonstrating persistence and accountability
- Proactive ownership in driving continuous improvements, considering broader organizational impacts
- Flexibility and dedication to team success, willing to go above and beyond as necessary

Key Qualifications:
- 5+ years of experience in data analysis, business intelligence, or a similar role
- Strong proficiency in data analysis tools such as Excel, SQL, Power BI, or Tableau
- Extensive knowledge of statistical analysis, data modeling, and data visualization techniques
- Hands-on experience with ETL processes
- Ability to write complex DAX measures, KPIs, and perform advanced calculations in Power BI
- Demonstrated experience with manufacturing analytics
Posted 4 weeks ago
5.0 - 10.0 years
12 - 19 Lacs
Bengaluru
Work from Office
Preferred candidate profile
- 5+ years of hands-on experience in Salesforce development (Apex, Visualforce, Lightning Components, etc.)
- Strong understanding of Salesforce data models, security models, and automation tools (Flows, Process Builder, etc.)
- Proficiency in programming languages such as Apex, JavaScript, and SQL
- Experience with Salesforce integrations using REST/SOAP APIs, middleware, and third-party systems
- Familiarity with Salesforce deployment processes, including Change Sets, Salesforce DX, and version control (Git)
- Experience with Salesforce Lightning (UI development using Lightning Web Components or Aura Components)
- Knowledge of Salesforce AppExchange apps and third-party integrations
- Strong analytical and problem-solving skills with the ability to work independently and in a team environment
- Excellent communication skills to interact effectively with internal teams and business stakeholders
- Salesforce Developer Certification (Platform Developer I or II) is a plus
- Experience with Salesforce Marketing Cloud, Service Cloud, or Sales Cloud is a plus

Working Model: Work-from-office (5 days)
Shift timing: Day shift

Role & responsibilities
1. Development and Customization: Develop custom Salesforce applications using Apex, Visualforce, Lightning Web Components (LWC), and other Salesforce technologies. Customize Salesforce solutions to meet specific business requirements, including workflows, triggers, and APIs.
2. Integration: Design and implement integrations between Salesforce and external systems using REST/SOAP APIs, middleware, and tools like MuleSoft. Ensure seamless data flow and synchronization across systems (see the sketch after this listing).
3. Code Optimization and Best Practices: Write clean, efficient, and maintainable code following Salesforce best practices. Conduct code reviews and optimize existing code for performance improvements.
4. Collaboration: Work closely with functional consultants, business analysts, and other developers to deliver end-to-end solutions. Provide technical expertise and guidance during the project lifecycle.
5. Testing and Deployment: Perform unit testing, integration testing, and debugging of Salesforce applications. Deploy changes to production environments following change management protocols.
6. Support and Maintenance: Provide ongoing technical support and resolve issues post-implementation as part of support plans. Implement enhancements and upgrades to Salesforce environments.
7. Reporting & Analytics: Create and optimize Tableau dashboards to deliver actionable insights. Develop Salesforce reports and dashboards tailored to various user needs.
8. Continuous Improvement: Stay updated with Salesforce releases and incorporate new features into solutions. Drive scalability, efficiency, and user adoption across the Salesforce platform.

Interested professionals can mail their CV to: karis.paul@in.experis.com
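To make the REST-based integration work in item 2 concrete, here is a minimal Python sketch that queries Salesforce over its REST API. This is an illustration only: the instance URL and token are placeholders, and a real integration would obtain the token through an OAuth 2.0 flow (or use a client library such as simple-salesforce).

```python
import requests

# Placeholders -- in practice the token comes from an OAuth 2.0 flow.
INSTANCE_URL = "https://example.my.salesforce.com"  # hypothetical org URL
ACCESS_TOKEN = "<access-token>"
API_VERSION = "v58.0"

def run_soql(soql: str) -> list[dict]:
    """Execute a SOQL query via the Salesforce REST query endpoint."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]

for rec in run_soql("SELECT Id, Name FROM Account LIMIT 10"):
    print(rec["Id"], rec["Name"])
```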
Posted 4 weeks ago
7.0 - 10.0 years
13 - 18 Lacs
Noida
Work from Office
About the Job: We are seeking a highly skilled and experienced Power BI Developer to join our dynamic team. In this role, you will be instrumental in transforming raw data into insightful and actionable visualizations that drive business decisions. You will be responsible for the entire lifecycle of dashboard development, from understanding business needs and designing user-friendly interfaces in Figma to building robust data models and implementing sophisticated DAX calculations in Power BI. Collaboration with stakeholders and cross-functional teams is crucial to ensure the delivery of high-quality, impactful BI solutions.

Job Summary: As a Power BI Developer, you will leverage your deep expertise in data visualization, SQL/DAX, and UI/UX design principles using Figma to create interactive and visually compelling dashboards. You will work closely with business users to gather requirements, design intuitive interfaces, develop efficient data models, and implement robust reporting solutions within the Power BI ecosystem. Your ability to translate business needs into technical specifications and effectively communicate analytical findings will be key to your success in this role.

Key Responsibilities:
- Dashboard Design and Development: Design, develop, and deploy interactive dashboards and visual reports using Power BI Desktop and Power BI Service.
- UI/UX Prototyping with Figma: Collaborate with business users to understand their reporting needs and translate them into user-friendly wireframes, mockups, and prototypes using Figma.
- Figma to Power BI Implementation: Convert Figma designs into fully functional and aesthetically pleasing Power BI dashboards, ensuring adherence to UI/UX best practices.
- DAX Development: Write and optimize complex DAX (Data Analysis Expressions) calculations to derive meaningful insights and implement business logic within Power BI.
- SQL Querying and Optimization: Develop and optimize complex SQL queries to extract, transform, and load data from various data sources. This includes writing joins, window functions, Common Table Expressions (CTEs), and stored procedures (see the sketch after this listing).
- Data Modeling: Design and implement efficient and scalable data models within Power BI, adhering to data warehousing best practices (e.g., star schema, snowflake schema).
- Security Implementation: Implement and manage row-level security (RLS) and other security measures within Power BI to ensure data privacy and appropriate access control.
- Performance Tuning: Identify and implement performance optimization techniques within Power BI reports and data models to ensure optimal responsiveness and efficiency.
- Data Source Integration: Integrate Power BI with diverse data sources, including databases (e.g., SQL Server, Azure Synapse), cloud platforms, APIs, and other relevant systems.
- Stakeholder Communication: Present analytical findings, dashboard designs, and technical solutions to stakeholders in a clear, concise, and compelling manner.
- Requirements Gathering: Actively participate in gathering business requirements through user interviews, workshops, and documentation analysis.
- Agile Collaboration: Work effectively within Agile/Scrum teams, contributing to sprint planning, daily stand-ups, and retrospectives, ensuring timely delivery of assigned tasks.
- Documentation: Create and maintain comprehensive documentation for developed dashboards, data models, and processes.
- Continuous Improvement: Stay updated with the latest Power BI features, Figma updates, and industry best practices to continuously improve the quality and efficiency of BI solutions.

Required Skills:
- Experience: Minimum of 7 years of demonstrable experience in Business Intelligence and Data Analytics, with a strong focus on Power BI development.
- Power BI Expertise: Proven hands-on expertise in Power BI Desktop and Power BI Service, including advanced DAX capabilities (e.g., CALCULATE, measures, calculated columns, time intelligence functions).
- Figma Proficiency: Strong practical experience using Figma for UI/UX prototyping, wireframing, and creating visually appealing dashboard designs. Ability to translate design specifications into functional Power BI reports.
- SQL Proficiency: Deep understanding and proficiency in SQL, with the ability to write complex queries involving multiple joins, window functions, CTEs, and stored procedures across various database systems.
- Data Warehousing and Modeling: Solid understanding of data warehousing concepts, dimensional modeling (star and snowflake schemas), and ETL/ELT processes.
- Cloud Data Experience (Preferred): Experience working with cloud-based data sources such as Azure Synapse Analytics, SQL Server on Azure, or other cloud data platforms is a significant plus.
- Requirements Elicitation: Demonstrated ability to effectively gather business requirements, conduct user interviews, and translate them into clear and actionable BI solutions.
- Communication Skills: Excellent written and verbal communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical audiences.
- Problem-Solving: Strong analytical and problem-solving skills with the ability to troubleshoot issues and propose effective solutions.
- Teamwork: Ability to work collaboratively within cross-functional teams and contribute positively to a team environment.
- Agile Methodology: Experience working in Agile/Scrum development methodologies.

Education:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related quantitative field.
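As a hedged illustration of the SQL skills this posting emphasizes (CTEs plus window functions), the following self-contained Python snippet runs against an in-memory SQLite database; the table, columns, and data are invented for the example, and the same query shape carries over to SQL Server or Azure Synapse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', '2024-01', 100), ('North', '2024-02', 150),
        ('South', '2024-01', 80),  ('South', '2024-02', 120);
""")

# A CTE aggregates monthly totals; a window function ranks regions per month.
query = """
WITH monthly AS (
    SELECT region, month, SUM(amount) AS total
    FROM sales
    GROUP BY region, month
)
SELECT region, month, total,
       RANK() OVER (PARTITION BY month ORDER BY total DESC) AS rnk
FROM monthly
ORDER BY month, rnk;
"""
for row in conn.execute(query):
    print(row)
```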
Posted 4 weeks ago
2.0 - 3.0 years
3 - 4 Lacs
Nagpur
Work from Office
Responsibilities:
- Data Modeling & Integration
- Report & Dashboard Development
- Data Transformation
- Collaboration
- Performance Optimization
- Security & Access Control
- Training & Support

Qualification: Graduation in IT or CS field

Requirements:
- Proven experience as a Power BI Engineer or BI Developer, with a solid understanding of data modeling, visualization, and reporting.
- Proficiency in Power BI Desktop, Power BI Service, Power Query, DAX, and Power BI Gateway.
- Strong experience with SQL and data integration from different sources (e.g., databases, APIs, cloud storage).
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills and the ability to work in a collaborative team environment.
Posted 4 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Data Quality Engineer, you will collaborate with product, engineering, and customer teams to gather requirements and develop a comprehensive data quality strategy. You will lead data governance processes, including data preparation, obfuscation, integration, slicing, and quality control. Testing data pipelines, ETL processes, APIs, and system performance to ensure reliability and accuracy will be a key responsibility. Additionally, you will prepare test data sets, conduct data profiling, and perform benchmarking to identify inconsistencies or inefficiencies.

Creating and implementing strategies to verify the quality of data products and ensuring alignment with business standards will be crucial. You will set up data quality environments and applications in compliance with defined standards, contributing to CI/CD process improvements. Participation in the design and maintenance of data platforms, as well as building automation frameworks for data quality testing and resolving potential issues, will be part of your role (a minimal sketch of such a check follows this listing). Providing support in troubleshooting data-related issues to ensure timely resolution is also expected. It is essential to ensure that all data quality processes and tools align with organizational goals and industry best practices. Collaboration with stakeholders to enhance data platforms and optimize data quality workflows will be necessary to drive success in this role.

Requirements:
- Bachelor's degree in Computer Science or a related technical field involving coding, such as physics or mathematics
- At least three years of hands-on experience in Data Management, Data Quality verification, Data Governance, or Data Integration
- Strong understanding of data pipelines, Data Lakes, and ETL testing methodologies
- Proficiency in CI/CD principles and their application in data processing
- Comprehensive knowledge of SQL, including aggregation and window functions
- Experience in scripting with Python or similar programming languages
- Databricks and Snowflake experience is a must, with good exposure to notebooks, SQL editors, etc.
- Experience in developing test automation frameworks for data quality assurance
- Familiarity with Big Data principles and their application in modern data systems
- Experience in data analysis and requirements validation, including gathering and interpreting business needs
- Experience in maintaining QA environments to ensure smooth testing and deployment processes
- Hands-on experience in Test Planning, Test Case design, and Test Result Reporting in data projects
- Strong analytical skills, with the ability to approach problems methodically and communicate solutions effectively
- English proficiency at B2 level or higher, with excellent verbal and written communication skills

Nice to have:
- Familiarity with advanced data visualization tools to enhance reporting and insights
- Experience in working with distributed data systems and frameworks like Hadoop
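A minimal sketch of the kind of automated data-quality check described above: profiling a null rate and detecting duplicate keys with plain SQL aggregation. SQLite stands in for the real platform, and the table, columns, and thresholds are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, email TEXT);
    INSERT INTO customers VALUES (1, 'a@x.com'), (2, NULL), (2, 'b@x.com');
""")

def null_rate(table: str, column: str) -> float:
    """Fraction of rows where the column is NULL."""
    total, nulls = conn.execute(
        f"SELECT COUNT(*), SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END) "
        f"FROM {table}"
    ).fetchone()
    return nulls / total if total else 0.0

def duplicate_keys(table: str, key: str) -> int:
    """Number of key values that occur more than once."""
    return conn.execute(
        f"SELECT COUNT(*) FROM "
        f"(SELECT {key} FROM {table} GROUP BY {key} HAVING COUNT(*) > 1)"
    ).fetchone()[0]

# Evaluate each rule; both deliberately fail on this sample data.
checks = {
    "customers.email null-rate <= 10%": null_rate("customers", "email") <= 0.10,
    "customers.id has no duplicates": duplicate_keys("customers", "id") == 0,
}
for name, ok in checks.items():
    print(f"{name}: {'PASS' if ok else 'FAIL'}")
```

In a real framework these rules would live in configuration and run inside a CI/CD pipeline against Databricks or Snowflake rather than SQLite.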
Posted 4 weeks ago
3.0 - 7.0 years
0 Lacs
gujarat
On-site
We are searching for a skilled SQL and Data Integration Developer to join our team. As a SQL and Data Integration Developer, you will be responsible for designing, developing, and maintaining scalable SQL database solutions, including data warehouses and complex ETL/ELT processes. Your role will involve integrating third-party API data, optimizing database performance, and collaborating with cross-functional teams to support critical business applications.

The ideal candidate for this position should have a strong background in T-SQL development, enterprise data architecture, and cloud-based tools like Azure Data Factory. You should possess excellent communication skills and a proactive approach to problem-solving in a distributed team environment.

**Job Responsibilities:**
- Design, develop, and maintain SQL database schemas and scripts, including views, stored procedures, and SQL jobs.
- Develop logical and physical data models to ensure robust and scalable database design.
- Monitor and maintain existing data to ensure cleanliness, accuracy, consistency, and integrity.
- Write and optimize complex stored procedures, functions, and views using T-SQL for high-performance data operations.
- Build and maintain data integration workflows using ETL/ELT tools like Azure Data Factory to facilitate data movement between systems and environments.
- Integrate and import data from third-party APIs into SQL databases, ensuring data accuracy, security, and consistency (a minimal sketch follows this list).
- Collaborate closely with application developers to implement and optimize database structures that meet application requirements.
- Design and implement scalable data warehouse solutions that align with business goals and support enterprise analytics.
- Act as the primary liaison between the SQL development team and cross-functional teams, including marketing, accounting, graphic design, and customer support.
- Create comprehensive technical documentation, including design specifications, architecture documentation, and user instructions.
- Continuously evaluate existing software components and tools, recommending improvements to ensure efficiency and scalability.
- Participate in an agile development environment, coordinating with a distributed team across multiple time zones to deliver high-quality solutions on time and within budget.

**Job Requirements:**
- Work From Home (Shift: 2 pm - 11 pm), based in India.
- 5+ years of hands-on experience in SQL development.
- 3+ years of experience working with ETL/ELT tools, preferably Azure Data Factory.
- Proficiency in writing and optimizing complex T-SQL queries, with expertise in performance tuning and query development.
- Proven track record in developing and maintaining enterprise data warehouse architectures.
- Experience in integrating data from external APIs and importing third-party data into relational databases.
- Familiarity with version control systems such as TFS, Git, or Azure DevOps.
- Exposure to Azure SQL, SQL Source Control, and Azure Logic Apps is a plus.
- Knowledge of Master Data Management (MDM) concepts and practices is advantageous.
- Solid understanding of database troubleshooting and implementation of industry best practices.
- Bachelor's degree (or higher) in Computer Science, Information Systems, or a related field.
- Excellent communication and collaboration skills, with a proven ability to work effectively with globally distributed teams.
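To make the third-party API ingestion responsibility concrete, here is a hedged Python sketch that pulls JSON from a REST endpoint and upserts it into a SQL table. The endpoint URL and payload shape are assumptions, and SQLite stands in for the production database (with T-SQL and Azure Data Factory, the same pattern would typically be expressed as a pipeline plus a MERGE statement).

```python
import requests
import sqlite3

API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint

conn = sqlite3.connect("warehouse.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS orders (
        order_id TEXT PRIMARY KEY,
        customer TEXT,
        amount   REAL
    )
""")

resp = requests.get(API_URL, timeout=30)
resp.raise_for_status()

# Assumed payload: a JSON list of {"order_id", "customer", "amount"} objects.
# Upserting keeps re-runs idempotent.
for item in resp.json():
    conn.execute(
        """INSERT INTO orders (order_id, customer, amount)
           VALUES (:order_id, :customer, :amount)
           ON CONFLICT(order_id) DO UPDATE SET
               customer = excluded.customer,
               amount   = excluded.amount""",
        item,
    )
conn.commit()
```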
**Benefits:**
- Group Mediclaim Policy
- Parental Insurance Coverage
- Accident Policy
- Retirement Benefits (Provident Fund)
- Gratuity
- Overtime Bonus
- Paid Vacation & Holidays
- Profit Sharing & Incentives
Posted 4 weeks ago
15.0 - 19.0 years
0 Lacs
karnataka
On-site
As a Data Science Associate Director at Accenture Strategy & Consulting, Global Network Data & AI practice in the Resources team, you will be part of a dynamic group that helps clients grow their businesses through analytics and insights. Your role will involve working closely with clients and stakeholders to drive business growth, identify new opportunities, and develop advanced analytics models for various client problems. Your responsibilities will include solution architecture, design, deployment, and monitoring of analytics models, as well as collaborating with internal teams to drive sales and innovation. You will be expected to lead a team of data analysts, work on large-scale datasets, and provide thought leadership in key capability areas such as tools & technology and asset development.

Qualifications and Experience:
- Bachelor's/Master's degree in Mathematics, Statistics, Computer Science, Computer Engineering, or a related field
- 15+ years of experience as a Data Science professional focusing on cloud services
- Strong knowledge of Statistical Modeling, Machine Learning algorithms, and Experimental design
- Expertise in experimental test design and the ability to derive business strategies from statistical findings
- Experience in Utilities, Energy, Chemical, and Natural Resources industries preferred
- Proficiency in programming languages like Python, PySpark, SQL, or Scala
- Implementation of MLOps practices for streamlining the machine learning lifecycle
- Understanding of data integration, data modeling, and data warehousing concepts
- Excellent analytical, problem-solving, communication, and collaboration skills
- Relevant certifications in Azure Data Services or cloud data engineering are highly desirable

If you are a strategic thinker with excellent communication skills, a passion for innovation, and a drive to make a difference in the world of data science, we invite you to join our team at Accenture Strategy & Consulting.
Posted 4 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
As a Senior Power BI Consultant at TheMathCompany, you will play a crucial role in leading the design, development, and deployment of impactful dashboards and data visualizations. Your deep understanding of Power BI capabilities, strong data modeling skills, and ability to translate business requirements into actionable insights will be key in contributing to the success of our projects.

Your responsibilities will include leading the end-to-end development of Power BI dashboards, ensuring visual appeal, functionality, and business relevance. You will gather and interpret reporting requirements from stakeholders, architect data models, and design dashboard solutions accordingly. Leveraging advanced Power BI capabilities such as Power Query, DAX, M language, and API integrations, you will build robust reports that optimize performance and user experience.

In addition to dashboard development, you will be responsible for ensuring strong data governance and security within Power BI by managing user access and role-based permissions. Collaboration with data engineering and business teams to manage data connections and ensure data quality will be essential. Your role will also involve driving technical discussions, providing consulting expertise, and mentoring junior team members to support capability development within the team.

To excel in this role, you should have 4-6 years of hands-on experience with Power BI, including expertise in complex DAX queries and Power Query transformations. Strong SQL skills, familiarity with data warehousing, BI concepts, and dashboard performance optimization are required. Experience with data integration from multiple sources, handling complex data relationships, and designing intuitive and interactive dashboards for data storytelling and decision-making will be valuable. Certifications in Power BI or related visualization tools, exposure to big data technologies, understanding of agile methodologies, and experience in a consulting environment are preferred. Additionally, knowledge of DevOps practices, CI/CD pipelines, and version control in analytics workflows will be beneficial.

As a Mathemagician at TheMathCompany, you are expected to embody our culture and way of working, demonstrate ownership, strive for excellence in delivering results, actively engage in initiatives fostering company growth, and support diversity while appreciating different perspectives. Your high emotional intelligence, cultural adaptability, effective communication skills, and focus on continuous improvement and client value will contribute to your success in this role.
Posted 4 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting - Kinaxis Rapid Response Planning Senior Consultant (4-8 years)

The opportunity
EY GDS is a global major in value-added Digital Supply Chain services for its clients. As part of this rapidly growing business segment, you will play a critical role in developing solutions, implementations, and performance improvement of Kinaxis Rapid Response. This is a fantastic opportunity to be part of a leading firm whilst being instrumental in the growth of a new service offering.

Your key responsibilities
- Provide solutions proficiency to analyze and identify gaps, lead the solution design, and implement the Rapid Response application to meet business requirements.
- Lead implementation, configuration, testing, training, knowledge transfer, and documentation activities.
- Conduct workshops to understand end-to-end business process requirements and propose the best possible solution.
- Deliver high-quality client solutions that meet and exceed client/EY expectations and are delivered on-time and on-budget.
- Manage client solution delivery, including defining project approach, motivating project teams, monitoring and managing project risks, managing client and EY key stakeholders, and successfully delivering client solutions.
- Identify new business opportunities, including building strong client relations, understanding client needs and EY solution offerings, communicating client opportunities to EY leadership, and helping develop client opportunities.

Skills and attributes for success
- Gather business requirements and lead design discussions with customer and business teams.
- Work on proposals and RFPs.
- Analyze business requirements and perform fit-gap analysis.
- Develop detailed solution designs based on business requirements.
- Strong expertise in detailed configuration and testing of the Kinaxis Rapid Response planning tool.
- Assist customer/business teams during the UAT phase.
- Prepare and review project documentation.

To qualify for the role, you must have
- Functional: In-depth knowledge of demand planning and forecasting, and exposure to various forecasting techniques and concepts like promotion planning and consensus demand planning.
- Technical: Workbook development (table-based, composite, data modification), alerts and monitoring, hierarchies and filters, scenario hierarchy setup, control table configuration, planning engine knowledge, and data model modification including custom fields and custom tables. Knowledge of integrating Kinaxis with host ERP systems through data warehouses for both inbound and outbound interfaces, workflows, query development, and preparation of detailed functional specifications for enhancements, layouts, and reports.
- 4 to 8 years of experience in a supply chain consulting or operations role with proven experience in Kinaxis Rapid Response.
- Prior implementation experience of end-to-end demand planning projects using Kinaxis Rapid Response.
- Good understanding of functional and technical architecture to support data integration with multiple source and target systems.

Ideally, you'll also have
- Overall 4 to 8 years of experience as an SCM planner delivering projects in the Supply Chain Management, Planning & Logistics domain.
- Working experience with an onsite and offshore delivery model environment is preferred.
- Experience engaging with business partners and IT to understand requirements from various parts of an organization to drive the design, programming execution, and UAT for future state capabilities within the platform.
- Experience working in a fast-paced and dynamic environment while managing multiple projects and strict deadlines.
- Good understanding of outsourcing and offshoring, building win/win strategies and contracts with suppliers.

What we look for
- Consulting experience, including assessments and implementations.
- Functional and technical experience in SCM planning.
- Documenting requirements and processes (e.g., process flows).
- Working collaboratively in a team environment.
- Excellent oral and written communication skills.
- Kinaxis Rapid Response Author certification or Contributor certification will be an added advantage.

What working at EY offers
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies - and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around.
- Opportunities to develop new skills and progress your career.
- The freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 4 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
You will be responsible for designing and implementing scalable Snowflake data warehouse architectures, including schema modeling and data partitioning. You will lead or support data migration projects from on-premise or legacy cloud platforms to Snowflake, and develop ETL/ELT pipelines and integrate data using tools such as DBT, Fivetran, Informatica, Airflow, etc. It will be part of your role to define and implement best practices for data modeling, query optimization, and storage efficiency in Snowflake.

Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, to align architectural solutions will be essential. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be a key responsibility (a minimal sketch follows this listing). Working with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments will be part of your duties. Optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform will also be under your purview. Staying updated with Snowflake features, cloud vendor offerings, and best practices is crucial.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- X years of experience in data engineering, data warehousing, or analytics architecture.
- 3+ years of hands-on experience in Snowflake architecture, development, and administration.
- Strong knowledge of cloud platforms (AWS, Azure, or GCP).
- Solid understanding of SQL, data modeling, and data transformation principles.
- Experience with ETL/ELT tools, orchestration frameworks, and data integration.
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance.

Preferred Qualifications:
- Snowflake certification (SnowPro Core / Advanced).
- Experience in building data lakes, data mesh architectures, or streaming data platforms.
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics.
- Experience with Agile delivery models and CI/CD workflows.
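As a sketch of the masking-policy responsibility mentioned above, the snippet below issues standard Snowflake DDL through the snowflake-connector-python package. Connection parameters, the role, and the table and column names are placeholders, and masking policies require Snowflake Enterprise Edition or higher.

```python
import snowflake.connector

# Placeholder credentials -- supply real values from your environment.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<warehouse>", database="<database>", schema="<schema>",
)
cur = conn.cursor()

# Mask email addresses for every role except a privileged one.
cur.execute("""
    CREATE MASKING POLICY email_mask AS (val STRING)
    RETURNS STRING ->
        CASE WHEN CURRENT_ROLE() IN ('DATA_STEWARD') THEN val
             ELSE '***MASKED***' END
""")
cur.execute(
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask"
)
conn.close()
```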
Posted 4 weeks ago
5.0 - 9.0 years
0 Lacs
nagpur, maharashtra
On-site
The Data & Analytics Team at GlobalLogic is looking for a skilled Data Engineer with expertise in data integration and application development. In this role, you will play a crucial part in designing, engineering, governing, and enhancing the entire Data Platform to provide self-service access to customers, partners, and employees. Your responsibilities will include demonstrating proficiency in data & metadata management, data integration, data warehousing, data quality, machine learning, and core engineering principles.

Requirements:
- Minimum 5 years of experience in system/data integration, development, or implementation of enterprise and/or cloud software.
- Strong experience with Web APIs such as RESTful and SOAP.
- Proficiency in setting up data warehousing solutions and associated pipelines, including ETL tools (preferably Informatica Cloud).
- Demonstrated expertise in Python.
- Strong experience in data wrangling and query authoring in SQL and NoSQL environments for structured and unstructured data.
- Experience in a cloud-based computing environment, specifically GCP.
- Expertise in writing Business Requirement, Functional & Technical documentation.
- Proficiency in writing Unit & Functional Test Cases, Test Scripts & Run Books.
- Experience with incident management systems like Jira, ServiceNow, etc.
- Working knowledge of Agile software development methodology.
- Strong organizational and troubleshooting skills with attention to detail.
- Analytical ability, judgment, and problem analysis techniques.
- Excellent interpersonal skills and ability to work effectively in a cross-functional team.

Responsibilities:
- Lead system/data integration, development, or implementation efforts for enterprise and/or cloud software.
- Design and implement data warehousing solutions and associated pipelines for internal and external data sources, including ETL processes.
- Perform data wrangling and author complex queries in SQL and NoSQL environments for structured and unstructured data.
- Develop and integrate applications using Python and Web APIs (RESTful and SOAP).
- Provide operational support for the data platform and applications, including incident management.
- Create comprehensive Business Requirement, Functional, and Technical documentation.
- Develop Unit & Functional Test Cases, Test Scripts, and Run Books to ensure solution quality.
- Manage incidents effectively using systems like Jira, ServiceNow, etc.
- Prepare change management packages and implementation plans for migrations across different environments.
- Actively participate in Enterprise Risk Management Processes.
- Work within an Agile software development methodology, contributing to team success.
- Collaborate effectively within cross-functional teams.

GlobalLogic offers:
- A culture of caring that prioritizes people and fosters an inclusive environment.
- Continuous learning and development opportunities to help you grow personally and professionally.
- Interesting and meaningful work on impactful projects that shape the world.
- Balance and flexibility in work arrangements to achieve a healthy work-life balance.
- A high-trust organization where integrity is valued and upheld.

About GlobalLogic:
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences. Collaborating with forward-thinking companies, GlobalLogic helps transform businesses and redefine industries through intelligent products, platforms, and services.
Posted 4 weeks ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
As an AWS Solution Architect with 7-9 years of experience, you will be responsible for managing solution architecture, providing consulting services, and overseeing software development, integration, and business processes. Your day-to-day tasks will include designing, developing, and implementing scalable and reliable AWS cloud solutions in accordance with best practices. You will collaborate with cross-functional teams to meet project objectives and customer needs.

Your technical skills will include proficiency in working with relational databases like PostgreSQL, Microsoft SQL Server, and Oracle, along with experience in database schema design, optimization, and management. You should also have strong knowledge of AWS services such as S3, AWS DMS (Database Migration Service), and AWS Redshift Serverless. Experience in setting up and managing data pipelines using AWS DMS and creating and managing data storage solutions using AWS S3 is required (a minimal sketch follows this listing).

You should have expertise in data integration techniques and tools, designing and implementing ETL processes, and performing data mapping, transformation, and data cleansing activities. Experience in setting up and managing data warehouses, particularly AWS Redshift Serverless, and creating and managing views in AWS Redshift is essential. Proficiency in scripting languages like Python or SQL for automating data integration tasks and experience with automation tools and frameworks are necessary.

Analytical and problem-solving skills are crucial for this role. You should be able to analyze and interpret complex data sets, identify and resolve data integration issues, and troubleshoot data integration and migration problems effectively.

Soft skills such as collaboration, communication, and adaptability are important. You should be able to work collaboratively with stakeholders, document data integration processes, participate in design reviews, and provide input on data integration plans. Willingness to stay updated with the latest data integration tools and technologies and to recommend upgrades when necessary is expected. Knowledge of data security and privacy regulations and experience in ensuring adherence to data security and privacy standards during data integration processes are required.

Proven experience in similar data integration projects and familiarity with the specific requirements and challenges of integrating production relational databases with AWS services are essential. Holding AWS certifications such as AWS Certified Solutions Architect or AWS Certified Database - Specialty is a plus. Experience in solution architecture and consulting, proficiency in software development and integration, and knowledge of business process optimization and automation are preferred qualifications for this role.
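A minimal boto3 sketch of the DMS-to-S3 pipeline described above: start an existing replication task, then list a few objects in the landing bucket to confirm data is arriving. The task ARN and bucket name are placeholders, and the task, endpoints, and IAM permissions are assumed to already exist.

```python
import boto3

# Placeholders -- a real task ARN and bucket come from your environment.
TASK_ARN = "arn:aws:dms:us-east-1:123456789012:task:EXAMPLE"
BUCKET = "example-landing-bucket"

dms = boto3.client("dms", region_name="us-east-1")
s3 = boto3.client("s3", region_name="us-east-1")

# Kick off the migration task that copies the source database to S3.
dms.start_replication_task(
    ReplicationTaskArn=TASK_ARN,
    StartReplicationTaskType="start-replication",
)

# Spot-check the landing bucket for newly written objects.
for obj in s3.list_objects_v2(Bucket=BUCKET, MaxKeys=5).get("Contents", []):
    print(obj["Key"], obj["Size"])
```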
Posted 4 weeks ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. You will analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives.

In Oracle enterprise performance management at PwC, you will specialise in providing consulting services for enterprise performance management solutions using Oracle technologies. You will collaborate with clients to assess their performance management needs and design and implement Oracle-based solutions for budgeting, forecasting, financial consolidation, and reporting. Your role will involve providing training and support for seamless integration and utilisation of Oracle enterprise performance management tools, helping clients improve their financial planning and analysis processes and achieve their performance objectives.

A career in our Enterprise Performance Management practice, within Finance Consulting services, will provide you with the opportunity to work alongside CEOs, CFOs, Controllers, and Treasurers to optimize the structure of their finance functions and improve their contribution to the business. You will support clients by addressing the challenges of achieving appropriate standards of control, efficient back-office opportunities, and support to the business through insight and challenge. Your responsibilities will include helping clients optimize and align financial planning, consolidation, reporting, and analytics processes, systems, and information to provide business insights that drive better decisions and actions.

As part of the EPM practice at PwC, you will work with a strong and growing team with a key focus on EPM strategy, technology, delivery, and lifecycle management. We are looking for enthusiastic and ambitious individuals to join our practice and be part of the transformation journey for our clients and our people.

In this role, you will collaborate with both PwC and client team members throughout the implementation life cycle, including planning, configuration, design, build, testing, training, change management, go-live, and post-production support. Your responsibilities will include:
- Demonstrating good knowledge of OneStream, regulatory reporting, and the financial close process
- Implementing multi-GAAP and multi-currency applications in OneStream
- Building best practices within planning, forecasting, and reporting processes
- Designing metadata, configuring security, and writing business rules
- Understanding intercompany elimination, consolidation adjustments, and ownership accounting
- Utilizing Smart View and reporting tools effectively
- Demonstrating proficiency in end-to-end implementation of OneStream CPM/EPM projects
- Translating business requirements into OneStream solutions
- Developing various reports and dashboards as required
- Building prototype proof-of-concept applications within the OneStream platform
- Transforming FP&A processes from Excel-based to technology-supported integrated planning

To excel in this role, you should have 2-4 years of experience in OneStream with at least 1-2 end-to-end project experiences. You should possess functional knowledge of Consolidation and FP&A to guide business users effectively during the financial close process. Additionally, good communication and detailing skills are essential for success in this position.
Posted 4 weeks ago
9.0 - 13.0 years
0 Lacs
karnataka
On-site
As an AEP Architect, you will be responsible for designing and implementing advanced data analytics solutions utilizing Adobe Analytics, Adobe Target, Adobe Experience Platform (AEP), and Adobe Journey Optimizer (AJO). Your expertise in these technologies will be crucial in enhancing customer experiences and optimizing marketing strategies.

Your key responsibilities will include designing and implementing data analytics solutions, integrating data from various sources into AEP, designing and optimizing customer journeys using AJO, monitoring and optimizing the performance of analytics and targeting solutions, and collaborating with cross-functional teams to deliver data-driven insights and solutions.

To excel in this role, you should have proven experience in Adobe Analytics, Adobe Target, AEP, and AJO. Additionally, strong technical skills in data modeling, data integration, JavaScript, HTML, CSS, and RESTful APIs are required. Excellent problem-solving abilities, communication skills, and the ability to work effectively with diverse teams are essential for success in this position. Possessing Adobe certifications in Analytics, Target, AEP, and AJO would be advantageous.

If you are a seasoned Technical Architect with at least 9 years of experience and a passion for leveraging data insights to drive personalized customer experiences, this role offers an exciting opportunity to contribute to the success of our organization.
Posted 4 weeks ago
8.0 - 13.0 years
20 - 35 Lacs
Gurugram
Work from Office
- A minimum of 10 years of experience in data architecture, data engineering, or a related field.
- Proven expertise in Snowflake and data transformation processes within Snowflake.
- Strong background in Data Warehouse (DWH) and Business Intelligence (BI) architecture.
- Experience with Salesforce & CPQ data and architecture.
- Proficiency with BI tools such as Tableau, Power BI, Sigma Computing, and others.
- In-depth understanding of financial bookings and revenue reports in Salesforce and DWH.
- Excellent problem-solving skills and the ability to work under pressure in a fast-paced environment.
- Strong leadership and team management skills, with the ability to motivate and guide a team of technical professionals.
- Exceptional communication and collaboration skills, with the ability to interact effectively with stakeholders at all levels.
Posted 4 weeks ago
6.0 - 11.0 years
15 - 25 Lacs
Kochi, Chennai, Bengaluru
Hybrid
Job Role: Data Quality Integration Engineer
Location: PAN India

Role Overview
As a Data Quality Integration Engineer, you will be responsible for embedding data quality capabilities across enterprise data landscapes. You will lead the integration of advanced data quality tools such as Ataccama and Collibra with cloud data platforms like Snowflake and SQL databases. This role is essential in ensuring our data governance standards are met with robust, scalable, and automated data quality processes.

Role Proficiency
Develop scalable applications using suitable technical options. Optimize application development, maintenance, and performance. Reuse proven design patterns and manage peer development activities.

Key Responsibilities

Technical & Functional Responsibilities
- Design and implement integration of data quality tools (Ataccama, Collibra, etc.) with Snowflake and SQL-based platforms (a minimal sketch follows this listing).
- Develop automated pipelines and connectors for profiling, cleansing, monitoring, and validating data.
- Configure and manage data quality rules and workflows aligned to governance policies and KPIs.
- Troubleshoot integration issues, monitor performance, and optimize reliability and efficiency.
- Collaborate with Data Governance, Architecture, and Engineering teams to align solutions with business needs.
- Maintain comprehensive documentation for integration solutions and configurations.

Software Engineering Deliverables
- Code: Adhere to coding standards, perform peer reviews, and write optimized code.
- Documentation: Create/review design documents, templates, test cases, and checklists.
- Testing: Develop/review unit and integration test cases; support QA teams.
- Configuration: Define and manage configuration management practices.
- Release: Execute and oversee release processes.

Project & Team Management
- Estimate efforts for project deliverables and track timelines.
- Perform defect RCA, trend analysis, and propose quality improvements.
- Set and review FAST goals for self and team.
- Mentor team members, manage aspirations, and keep the team engaged.

Key Outcomes & Metrics
- Timely adherence to engineering and project standards.
- Minimal post-delivery defects and technical issues.
- Compliance with mandatory training and documentation processes.
- Increased customer satisfaction and domain relevance.

Skills & Technologies

Mandatory Skills
- Strong experience with data quality tools (Ataccama, Collibra).
- Hands-on with Snowflake and SQL databases (e.g., PostgreSQL, SQL Server, Oracle).
- Proficient in SQL scripting and data pipeline development (Python or Scala preferred).
- Sound understanding of data profiling, cleansing, enrichment, and monitoring.
- Familiar with REST APIs and metadata integration techniques.

Desirable Skills
- Experience in cloud platforms (AWS, Azure) hosting Snowflake.
- Certification in Collibra, Ataccama, or Snowflake.
- Exposure to financial services or regulated industries.
- Prior involvement in data stewardship/governance initiatives.

Soft Skills
- Strong analytical and problem-solving abilities.
- Ability to manage high-pressure environments and multiple priorities.
- Effective communication and presentation skills.
- Ability to mentor and guide junior engineers.
- Business etiquette in professional interactions.

Certifications (Preferred)
- Ataccama/Collibra Certified Professional
- Snowflake Architect/Developer Certification
- AWS/Azure Data Engineering Certifications

Domain Knowledge
- Deep understanding of enterprise data architecture and governance.
- Knowledge of financial services, insurance, or asset management domains is an advantage.
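A hedged sketch of the integration pattern this role centers on: evaluate a SQL-based quality rule on Snowflake, then publish the result to a governance tool over REST. The connection details, rule, table, and endpoint are all placeholders; a real deployment would use the vendor's documented API (for example Ataccama's or Collibra's) rather than this invented one.

```python
import requests
import snowflake.connector

GOVERNANCE_API = "https://governance.example.com/api/quality-results"  # hypothetical

# Placeholder credentials -- supply real values from your environment.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<warehouse>", database="<database>", schema="<schema>",
)

# A simple SQL-based rule: count rows missing a customer identifier.
failed_rows = conn.cursor().execute(
    "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL"
).fetchone()[0]

# Publish the outcome so the governance tool can track the rule over time.
resp = requests.post(
    GOVERNANCE_API,
    json={"rule": "orders.customer_id NOT NULL", "failed_rows": failed_rows},
    timeout=30,
)
resp.raise_for_status()
```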
Posted 4 weeks ago
4.0 - 9.0 years
6 - 11 Lacs
Gurugram
Work from Office
About the Role:
Grade Level (for internal use): 09

Department overview
automotiveMastermind provides U.S. automotive dealers with AI/behavior prediction analytics software and marketing solutions that improve the vehicle purchase process and results. The company's cloud-based technology helps dealers precisely predict automobile-buying behavior and automates the creation of micro-targeted customer communications, leading to proven higher sales and more consistent customer retention.

Responsibilities:
- Work closely with Product Management and Data Strategy leadership to understand short- and long-term roadmaps and overall Data product strategy
- Drive the backlog grooming agile sprint ceremony, acting as a bridge between business needs and technical implementation
- Present on behalf of agile teams in sprint review, reiterating business value delivered with each work increment completed
- Develop expertise on the existing aM ecosystem of integrations and data available within the system
- Collaborate with data analysts, data management, data science, and engineering teams to develop short- and long-term solutions to meet business needs and solve distinct problems
- Apply deep, creative, rigorous thinking to solve broad, platform-wide technical and/or business problems
- Identify key value drivers and key opportunities for/sources of error across products and processes
- Develop short-term preventive or detective measures, and lead medium/long-term product improvement initiatives arrived at via close collaboration with engineering, QA, and data support
- Coordinate with data engineers as appropriate to design and enable repeatable processes and generate deliverables to answer routine business questions

What We're Looking For:
Basic Required Qualifications:
- Minimum 4 years working experience as a Product Owner or Product Manager in an Agile scrum framework
- Experience using data and analytical processes to drive decision making, with the ability to explain how analysis was done to an executive audience
- Strong knowledge of the Agile development framework, with practical experience to support flexible application of principles
- Strong conceptual understanding of data integration technologies and standards
- Working familiarity with road-mapping and issue tracking software applications (Aha!, MS Azure DevOps, Salesforce)
- Familiarity with Microsoft Excel, SQL, BigQuery, MongoDB, and Postman preferred
- An advocate for the importance of leveraging data, a supporter of the use of data analysis in decision-making, and a fierce promoter of data and engineering best practices throughout the organization; passionate about empirical research
- A team player who is comfortable working with a globally distributed team across time zones
- A solid communicator, both with technology teams and with non-technical stakeholders
- Preferred: Experience with or awareness of and interest in dimensional data modeling concepts
- B.Tech/M.Tech qualified

Grade: 9
Location: Gurgaon
Hybrid Mode: twice a week work from office
Shift Time: 12 pm to 9 pm IST

About automotiveMastermind
Who we are: Founded in 2012, automotiveMastermind is a leading provider of predictive analytics and marketing automation solutions for the automotive industry and believes that technology can transform data, revealing key customer insights to accurately predict automotive sales. Through its proprietary automated sales and marketing platform, Mastermind, the company empowers dealers to close more deals by predicting future buyers and consistently marketing to them.
automotiveMastermind is headquartered in New York City. For more information, visit automotivemastermind.com.

At automotiveMastermind, we thrive on high energy at high speed. We're an organization in hyper-growth mode and have a fast-paced culture to match. Our highly engaged teams feel passionately about both our product and our people. This passion is what continues to motivate and challenge our teams to be best-in-class. Our cultural values of Drive and Help have been at the core of what we do and how we have built our culture through the years. This cultural framework inspires a passion for success while collaborating to win.

What we do: Through our proprietary automated sales and marketing platform, Mastermind, we empower dealers to close more deals by predicting future buyers and consistently marketing to them. In short, we help automotive dealerships generate success in their loyalty, service, and conquest portfolios through a combination of turnkey predictive analytics, proactive marketing, and dedicated consultative services.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
----

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.

If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

----
20 - Professional (EEO-2 Job Categories - United States of America), PDMGDV202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
Posted 4 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
About the Role:
Grade Level (for internal use): 11

The Role: The Knowledge Engineering team is seeking a Lead Knowledge Engineer to support our strategic transformation from a traditional data organization into a next-generation interconnected data intelligence organization.

The Team: The Knowledge Engineering team within data strategy and governance helps to lead fundamental organizational and operational change driving our linked data, open data, and data governance strategy, both internally and externally. The team partners closely with data and software engineering to envision and build the next generation of data architecture and tooling with modern technologies.

The Impact: Knowledge Engineering efforts occur within the broader context of major strategic initiatives to extend market leadership and build next-generation data, insights and analytics products that are powered by our world-class datasets.

What's in it for you: The Lead Knowledge Engineer role is an opportunity to work as an individual contributor in creatively solving complex challenges alongside visionary leadership and colleagues. It's a role with highly visible initiatives and outsized impact. The wider division has a great culture of innovation, collaboration, and flexibility with a focus on delivery. Every person is respected and encouraged to be their authentic self.

Responsibilities:
- Develop, implement, and continue to enhance ontologies, taxonomies, knowledge graphs, and related semantic artefacts for interconnected data, as well as topical/indexed query, search, and asset discovery
- Design and prototype data/software engineering solutions enabling the construction, maintenance, and consumption of semantic artefacts and the interconnected data layer to scale for various application contexts
- Provide thought leadership for strategic projects, ensuring timelines are feasible, work is effectively prioritized, and deliverables are met
- Influence the strategic semantic vision, roadmap, and next-generation architecture
- Execute on the interconnected data vision by creating linked metadata schemes to harmonize semantics across systems and domains
- Analyze and implement knowledge organization strategies using tools capable of metadata management, ontology management, and semantic enrichment
- Influence and participate in governance bodies to advocate for the use of established semantics and knowledge-based tools

Qualifications:
- Able to communicate complex technical strategies and concepts in a relatable way to both technical and non-technical stakeholders and executives to effectively persuade and influence
- 5+ years of experience with ontology development, semantic web technologies (RDF, RDFS, OWL, SPARQL) and open-source or commercial semantic tools (e.g., VocBench, TopQuadrant, PoolParty, RDFLib, triple stores); a small RDFLib example follows this list. Advanced studies in computer science, knowledge engineering, information sciences, or a related discipline preferred
- 3+ years of experience in advanced data integration with semantic and knowledge graph technologies in complex, enterprise-class, multi-system environment(s); skilled in all phases from conceptualization to optimization
- Programming skills in a mainstream programming language (Python, Java, JavaScript), with experience in utilizing cloud services (AWS, Google Cloud, Azure) a great bonus
- Understanding of the agile development life cycle and the broader data management discipline (data governance, data quality, metadata management, reference and master data management)
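Since RDFLib is among the tools named in the qualifications, here is a small runnable example of the core workflow: build a knowledge graph of triples and query it with SPARQL. The vocabulary and instances are invented for illustration.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/")  # hypothetical vocabulary

g = Graph()
g.add((EX.brent, RDF.type, EX.CrudeOilBenchmark))
g.add((EX.brent, EX.tradedIn, EX.europe))
g.add((EX.wti, RDF.type, EX.CrudeOilBenchmark))
g.add((EX.wti, EX.tradedIn, EX.northAmerica))

# SPARQL: list every benchmark and the region it trades in.
results = g.query("""
    PREFIX ex: <http://example.org/>
    SELECT ?benchmark ?region
    WHERE { ?benchmark a ex:CrudeOilBenchmark ; ex:tradedIn ?region . }
""")
for benchmark, region in results:
    print(benchmark, region)
```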
S&P Global Enterprise Data Organization is a unified, cross-divisional team focused on transforming S&P Global's data assets. We streamline processes and enhance collaboration by integrating diverse datasets with advanced technologies, ensuring efficient data governance and management.

About S&P Global Commodity Insights: At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the Energy Transition, S&P Global Commodity Insights' coverage includes oil and gas, power, chemicals, metals, agriculture, and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics, and workflow solutions in the global capital, commodity, and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today.

What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People, Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
Health & Wellness: Health care coverage designed for the mind and body.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision.
Posted 4 weeks ago
8.0 - 13.0 years
14 - 18 Lacs
Hyderabad
Work from Office
Overview: Customer Data Stewardship Sr Analyst (IBP)

Job Overview, PepsiCo Data Governance Program: PepsiCo is establishing a Data Governance program that will be the custodian of the processes, policies, rules, and standards by which the Company will define its most critical data. Enabling this program will:
- Define ownership and accountability of our critical data assets to ensure they are effectively managed and maintain integrity throughout PepsiCo's systems
- Leverage data as a strategic enterprise asset enabling data-based decision analytics
- Improve productivity and efficiency of daily business operations

Position Overview: The Customer Data Steward IBP role is responsible for working within the global data governance team and with their local businesses to maintain alignment with the Enterprise Data Governance's (EDG) processes, rules, and standards set to ensure data is fit for purpose.

Responsibilities
Primary Accountabilities:
- Deliver key elements of data discovery, source identification, data quality management, and cataloging for program and Customer Domain data.
- Ensure data accuracy and adherence to PepsiCo-defined global governance practices, as well as driving acceptance of PepsiCo's enterprise data standards and policies across the various business segments.
- Maintain and advise relevant stakeholders on data governance-related matters in the customer domain and with respect to Demand Planning in IBP, with a focus on the business use of the data.
- Define data quality rules from source systems, within the Enterprise Data Foundation, and through to the end-user systems to enable end-to-end data quality management and deliver a seamless user experience (see the sketch at the end of this job description).
- Advise on various projects and initiatives to ensure that any data-related changes and dependencies are identified, communicated, and managed to ensure adherence to the established Enterprise Data Governance standards.
- Accountable for ensuring that data-centric activities are aligned with the EDG program and leverage applicable data standards, governance processes, and overall best practices.

Data Governance Business Standards:
- Ensures alignment of the data governance processes and standards with applicable enterprise, business segment, and local data support models.
- Champions the single set of enterprise-level data standards and the repository of key elements pertaining to their in-scope data domain (e.g., Customer, Material, Vendor, Finance, Consumer), promoting their use throughout the PepsiCo organization.

Data Domain Coordination and Collaboration:
- Responsible for helping identify the need for sector-level data standards (and above) based on strategic business objectives and the evolution of enterprise-level capabilities and analytical requirements.
- Collaborates across the organization to ensure consistent and effective execution of data governance and management principles across PepsiCo's enterprise and analytical systems and data domains.
- Accountable for driving organizational acceptance of EDG-established data standards, policies, definitions, and process standards for critical and related enterprise data.
- Promotes and champions PepsiCo's Enterprise Data Governance capability and data management program across the organization.
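To make the "data quality rules" accountability concrete, here is a minimal, hypothetical Python sketch of the kind of rule a steward might codify for a customer extract; the column names and the pandas-based approach are illustrative assumptions, not PepsiCo's actual tooling.

```python
import pandas as pd

def check_customer_quality(df: pd.DataFrame) -> dict:
    """Run two illustrative data quality rules on a customer extract."""
    results = {}

    # Rule 1 (completeness): key columns must never be null.
    for col in ("customer_id", "country"):
        results[f"{col}_completeness"] = df[col].isna().mean() == 0.0

    # Rule 2 (uniqueness): customer_id must be unique across the extract.
    results["customer_id_uniqueness"] = not df["customer_id"].duplicated().any()

    return results

# Tiny example extract with one duplicate id and one missing country.
frame = pd.DataFrame({
    "customer_id": [101, 102, 102],
    "country": ["IN", None, "US"],
})
print(check_customer_quality(frame))
# {'customer_id_completeness': True, 'country_completeness': False,
#  'customer_id_uniqueness': False}
```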
Qualifications: 8+ years of experience working in Customer Operations, Demand Planning, Order to Cash, Commercial Data Governance, or Data Management within a global CPG.
Posted 4 weeks ago
3.0 - 5.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Overview: We are looking for an Associate Analyst to join our team. The candidate should have 3-5 years of hands-on experience with IDMC (Informatica Cloud Data Integration), 3-5 years of experience with SQL across databases such as Oracle, SQL Server, and MySQL, and 1-2 years of experience with Unix shell scripting, plus knowledge of scheduling tools such as Control-M and AutoSys. The candidate should have good communication skills, be able to understand requirements and articulate them in implementations, and have end-to-end project implementation knowledge, having delivered at least two projects end to end, including testing, UAT, and TCO activities.

Responsibilities:
Work independently with business stakeholders to gather and understand integration requirements, with minimal or no support from technical leads.
Take full ownership and drive projects from initiation to successful completion.
Demonstrate strong communication skills to effectively collaborate with business and technical teams.
Be responsible for end-to-end implementation of projects using the Informatica Cloud Data Integration tool.
Analyze requirements and develop appropriate SQL queries or Unix shell scripts as needed (see the reconciliation sketch after this job description).
Prepare comprehensive test case documents, submit detailed test reports, and support the business during SIT, UAT, and TCO phases.
Design mappings, mapping tasks, and taskflows in line with project requirements, with a solid understanding of applicable transformations.
Identify and resolve performance issues by tuning underperforming jobs.

Qualifications:
3-5 years of hands-on experience with Informatica Cloud Data Integration (IDMC).
3-5 years of experience writing and optimizing SQL queries in Oracle, SQL Server, and MySQL environments.
1-2 years of experience in Unix shell scripting.
Familiarity with job scheduling tools such as Control-M and AutoSys.
Strong communication skills with the ability to understand and articulate business requirements.
Experience in at least two end-to-end project implementations.
Solid understanding of testing phases, including SIT, UAT, and TCO.
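Since the role centres on SQL-backed validation during SIT and UAT, here is a minimal reconciliation sketch in Python; an in-memory SQLite database stands in for the Oracle/SQL Server/MySQL sources the posting names, and the table names are invented.

```python
import sqlite3

# SQLite stands in for the real source and target databases.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
""")

def reconcile(src: str, tgt: str) -> None:
    """Compare row counts after a load; a staple post-run UAT check."""
    src_count = conn.execute(f"SELECT COUNT(*) FROM {src}").fetchone()[0]
    tgt_count = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
    status = "PASS" if src_count == tgt_count else "FAIL"
    print(f"{src} -> {tgt}: source={src_count} target={tgt_count} [{status}]")

reconcile("src_orders", "tgt_orders")   # source=3 target=2 [FAIL]
```

In practice the same check would be parameterized per mapping and its output attached to the test report the posting asks for.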
Posted 4 weeks ago
12.0 - 13.0 years
15 - 19 Lacs
Hyderabad
Work from Office
Overview: PepsiCo is embarking on a significant initiative of digitalization and standardization of the FP&A solution across all its divisions to make the finance organization more Capable, more Agile, and more Efficient. The MOSAIC program is a key enabler of that vision. It is the FP&A solution of the PepsiCo Global Template (PGT) that, for the first time, aims to integrate vertical planning for Operating Units (OUs) or markets, and horizontal planning for functions (e.g., Global Procurement, Compensation and Benefits, etc.) that have accountability across markets. The program aims to harmonize data, planning processes, and ways of working across PepsiCo markets.

The Finance Application Developer / Architect (TM1) is a key contributor in designing, developing, and maintaining financial planning and analytics solutions using IBM Planning Analytics (TM1). This role combines technical expertise with a deep understanding of finance processes to create robust, scalable, and efficient systems that enable data-driven decision-making. The ideal candidate will excel in solution design, stakeholder collaboration, and aligning technical implementations with strategic business goals.

Responsibilities:
Design, Enhance, and Maintain the Mosaic Solution: Develop, troubleshoot, and maintain robust TM1/Planning Analytics applications, including cubes, rules, and TurboIntegrator (TI) processes, to support financial planning, forecasting, and reporting. Collaborate with stakeholders to design and implement scalable, future-proof solutions that meet business requirements.
Business Incident Triage: Engage with finance and business teams to understand objectives, gather requirements, and translate them into effective technical designs. Provide advisory support to optimize financial processes and restore the solution.
Optimize System Performance: Ensure the stability and performance of TM1 models, performing optimization and tuning to handle growing data and user demands efficiently.
Data Integration and Automation: Manage data flows between TM1 and other systems, automating processes for data loading, transformation, and reconciliation (see the sketch after this job description).
Governance and Standards: Implement best practices for data governance, model development, documentation, and version control to maintain system reliability and accuracy.
Training and Support: Deliver training and support to finance teams, empowering them to leverage TM1 solutions effectively for business insights.

Qualifications: Bachelor's degree required; Master's degree preferred. 12-13+ years of experience configuring, deploying, and managing TM1 (preferred) or SAP-based Financial Planning & Analysis solutions, with a focus on Topline Planning.
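As a sketch of the data loading and reconciliation automation mentioned above, the open-source TM1py library wraps the TM1 REST API from Python; everything below (server address, credentials, cube and dimension names) is a placeholder, not the actual MOSAIC model.

```python
from TM1py.Services import TM1Service

# Placeholder connection details; a real deployment supplies its own.
with TM1Service(address="localhost", port=8001,
                user="admin", password="apple", ssl=True) as tm1:

    # Write one forecast cell into a hypothetical planning cube.
    tm1.cubes.cells.write_value(
        125000,                                          # value
        "Topline Planning",                              # cube (illustrative)
        ("FY2025", "Forecast", "India", "Net Revenue"),  # one element per dim
    )

    # Read the value back with MDX to confirm the load.
    mdx = """
    SELECT {[Measure].[Net Revenue]} ON COLUMNS
    FROM [Topline Planning]
    WHERE ([Year].[FY2025], [Version].[Forecast], [Market].[India])
    """
    print(tm1.cubes.cells.execute_mdx_values(mdx))
```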
Posted 4 weeks ago
6.0 - 11.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Job Title: ETL Developer (SnapLogic)
Experience: 6-12 Years
Location: Bangalore

We are seeking a highly skilled and experienced SnapLogic contractor to join our team. The ideal candidate will have deep expertise in SnapLogic, including API development and utilizing the platform's agent functionality. This role is instrumental in driving seamless integrations and delivering robust data solutions for our organization.

Experience and Education Required: 6+ years of relevant experience as an ETL Developer with SnapLogic.

Technical Skills:
Design, develop, and maintain SnapLogic pipelines to support integration projects.
Build and manage APIs using SnapLogic to connect various data sources and systems.
Leverage SnapLogic agent functionality to enable secure and efficient data integration.
Collaborate with cross-functional teams to gather requirements and ensure solutions meet business needs.
Troubleshoot and optimize existing SnapLogic integrations to improve performance and reliability.
Document integration processes and provide guidance to team members on best practices.
Proven experience with SnapLogic, including API builds and agent functionality.
Strong understanding of integration patterns and best practices.
Proficiency in data integration and ETL processes.
Expertise in relational databases (Oracle, SSMS) and familiarity with NoSQL databases (MongoDB).
Knowledge of data warehousing concepts and data modelling.
Experience performing validations on large-scale data.
Strong REST API, JSON, and data transformation experience (see the sketch after this job description).
Experience with unit testing and integration testing.
Familiarity with large language models (LLMs) and their integration with data pipelines.
Experience in database architecture and optimization.
Knowledge of U.S. healthcare systems, data standards (e.g., HL7, FHIR), and compliance requirements (e.g., HIPAA).

Behavioral Skills:
Excellent documentation and presentation skills, analytical and critical thinking skills, and the ability to identify needs and take initiative.
Follow engineering best practices and principles within your organisation.
Work closely with a Lead Software Engineer.
Be an active member of the MMC Technology community: contribute, collaborate, and learn.
Build strong relationships with members of your engineering squad.
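SnapLogic itself is configured through its low-code pipeline designer, so as a neutral illustration of the REST/JSON transformation skills listed above, here is a small self-contained Python sketch that flattens a FHIR-flavored payload into a warehouse row; the field names are loosely modeled on a FHIR Patient resource and are not a complete implementation.

```python
import json

# Sample inbound payload, loosely shaped like a FHIR Patient resource.
inbound = json.loads("""
{
  "resourceType": "Patient",
  "id": "pat-001",
  "name": [{"family": "Rao", "given": ["Anita"]}],
  "birthDate": "1988-04-12"
}
""")

def to_warehouse_row(patient: dict) -> dict:
    """Flatten the nested resource into the flat row a target table expects."""
    name = patient["name"][0]
    return {
        "patient_id": patient["id"],
        "full_name": f'{" ".join(name["given"])} {name["family"]}',
        "birth_date": patient["birthDate"],
    }

print(to_warehouse_row(inbound))
# {'patient_id': 'pat-001', 'full_name': 'Anita Rao', 'birth_date': '1988-04-12'}
```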
Posted 4 weeks ago
5.0 - 10.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: SAS Administrator
Experience: 5+ Years
Location: Bangalore

Minimum 5+ years of experience as a SAS Administrator with expertise in both SAS 9.4 and SAS Viya environments.

Technical Skills:
Strong experience with SAS 9.4 and SAS Viya administration.
Proficiency in Linux/Unix and Windows operating systems.
Knowledge of SAS Grid Manager, SAS Studio, SAS Enterprise Guide, and SAS Visual Analytics.
Familiarity with cloud platforms (AWS, Azure, or GCP) for SAS Viya deployments.
Hands-on experience in implementing and configuring SAS solutions.
Understanding of data integration, ETL processes, and database connectivity (e.g., Oracle, Sybase, SQL Server, and Hadoop).

Responsibilities:
Design and implement SAS solutions based on business requirements. Collaborate with stakeholders to gather requirements and translate them into technical solutions. Deploy and configure SAS applications, including integration with other systems.
System Monitoring & Troubleshooting: Monitor SAS servers, logs, and processes to identify and resolve issues proactively (see the sketch after this job description). Troubleshoot and resolve performance bottlenecks, errors, and system failures. Provide root cause analysis and implement preventive measures.
Security & Compliance: Ensure SAS environments comply with organizational security policies and standards. Implement encryption, authentication, and authorization mechanisms. Conduct regular audits and apply security patches.
Documentation & Training: Create and maintain detailed documentation for SAS environments, configurations, and processes. Provide training and support to end-users and team members. Develop best practices and standard operating procedures for SAS administration.
Collaboration & Support: Work closely with IT teams, data scientists, and business analysts to support SAS-related projects. Assist in data integration, ETL processes, and reporting tasks. Provide on-call support for critical issues and scheduled maintenance.
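For the proactive log monitoring described above, here is a minimal Python sketch; the log path and error patterns are assumptions for illustration, and a real SAS 9.4 installation would point this at its own SASApp or Object Spawner logs.

```python
import re
from pathlib import Path

# Hypothetical log location and error patterns.
LOG_FILE = Path("/opt/sas/config/Lev1/SASApp/logs/SASApp_server.log")
PATTERNS = [re.compile(p) for p in (r"^ERROR", r"out of memory", r"I/O error")]

def scan_log(path: Path) -> list[str]:
    """Return log lines matching any known error pattern."""
    hits = []
    with path.open(errors="replace") as fh:
        for line in fh:
            if any(p.search(line) for p in PATTERNS):
                hits.append(line.rstrip())
    return hits

if LOG_FILE.exists():
    for hit in scan_log(LOG_FILE):
        print(hit)   # in practice: alert the on-call administrator
else:
    print(f"log not found: {LOG_FILE}")
```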
Posted 4 weeks ago
0.0 - 5.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer - DBT (Data Build Tool)
Experience: 0-5 Years
Location: Bengaluru

Job Responsibilities:
Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS: requirements definition, source data analysis and profiling, the logical and physical design of the data lake and data warehouse, and the design of data integration and publication pipelines.
Develop Snowflake deployment and usage best practices.
Help educate the rest of the team members on the capabilities and limitations of Snowflake.
Build and maintain data pipelines adhering to suggested enterprise architecture principles and guidelines.
Design, build, test, and maintain data management systems.
Work in sync with internal and external team members such as data architects, data scientists, and data analysts to handle all sorts of technical issues.
Act as technical leader within the team.
Work in an Agile/Lean model.
Deliver quality deliverables on time.
Translate complex functional requirements into technical solutions.

Expertise and Qualifications
Essential Skills, Education and Experience:
Should have a B.E. / B.Tech. / MCA or equivalent degree along with 4-7 years of experience in Data Engineering.
Strong experience in DBT concepts such as model building and configurations, incremental load strategies, macros, and DBT tests (see the sketch after this job description).
Strong experience in SQL.
Strong experience in AWS.
Creation and maintenance of an optimum data pipeline architecture for ingestion and processing of data.
Creation of the necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3, and Snowflake.
Experience in data storage technologies such as Amazon S3, SQL, and NoSQL.
Data modeling technical awareness.
Experience in working with stakeholders in different time zones.

Good to have:
AWS data services development experience.
Working knowledge of big data technologies.
Experience collaborating with data quality and data governance teams.
Exposure to reporting tools like Tableau.
Apache Airflow, Apache Kafka (nice to have).
Payments domain knowledge; in-depth understanding of CRM, Accounting, etc.
Regulatory reporting exposure.

Other Skills:
Good communication skills.
Team player.
Problem solver.
Willing to learn new technologies, share your ideas, and assist other team members as needed.
Strong analytical and problem-solving skills; ability to define problems, collect data, establish facts, and draw conclusions.
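dbt models themselves are SQL plus Jinja, but dbt-core (1.5+) also exposes a programmatic Python entry point, which suits orchestrating the incremental runs and dbt tests the posting mentions; the model name below is hypothetical, and the snippet assumes it is executed from within a configured dbt project.

```python
from dbt.cli.main import dbtRunner, dbtRunnerResult

# Programmatic invocation, available in dbt-core 1.5+.
dbt = dbtRunner()

# Run a hypothetical incremental model, then its tests.
for args in (
    ["run", "--select", "fct_payments"],    # incremental load
    ["test", "--select", "fct_payments"],   # dbt tests on the same model
):
    res: dbtRunnerResult = dbt.invoke(args)
    print(args[0], "succeeded" if res.success else "failed")
```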
Posted 4 weeks ago
6.0 - 8.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Job Title: ETL Developer (SnapLogic)
Experience: 6-8 Years
Location: Bangalore

Technical Skills:
Design, develop, and maintain SnapLogic pipelines to support integration projects.
Build and manage APIs using SnapLogic to connect various data sources and systems.
Leverage SnapLogic agent functionality to enable secure and efficient data integration.
Collaborate with cross-functional teams to gather requirements and ensure solutions meet business needs.
Troubleshoot and optimize existing SnapLogic integrations to improve performance and reliability.
Document integration processes and provide guidance to team members on best practices.
Proven experience with SnapLogic, including API builds and agent functionality.
Strong understanding of integration patterns and best practices.
Proficiency in data integration and ETL processes.
Expertise in relational databases (Oracle, SSMS) and familiarity with NoSQL databases (MongoDB).
Knowledge of data warehousing concepts and data modelling.
Experience performing validations on large-scale data.
Strong REST API, JSON, and data transformation experience.
Experience with unit testing and integration testing (see the testing sketch after this job description).
Familiarity with large language models (LLMs) and their integration with data pipelines.
Experience in database architecture and optimization.
Knowledge of U.S. healthcare systems, data standards (e.g., HL7, FHIR), and compliance requirements (e.g., HIPAA).

Behavioral Skills:
Excellent documentation and presentation skills, analytical and critical thinking skills, and the ability to identify needs and take initiative.
Follow engineering best practices and principles within your organisation.
Work closely with a Lead Software Engineer.
Be an active member of the MMC Technology community: contribute, collaborate, and learn.
Build strong relationships with members of your engineering squad.
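This posting also asks for unit and integration testing experience; independent of SnapLogic (which validates pipelines with its own tooling), here is a minimal Python unittest sketch for a hypothetical field-mapping function of the kind an integration developer would cover.

```python
import unittest

def map_status(source_code: str) -> str:
    """Hypothetical lookup used when transforming records between systems."""
    mapping = {"A": "ACTIVE", "I": "INACTIVE", "P": "PENDING"}
    return mapping.get(source_code, "UNKNOWN")

class MapStatusTest(unittest.TestCase):
    def test_known_codes(self):
        self.assertEqual(map_status("A"), "ACTIVE")
        self.assertEqual(map_status("P"), "PENDING")

    def test_unknown_code_falls_back(self):
        # Defensive default so unexpected source data never breaks the load.
        self.assertEqual(map_status("Z"), "UNKNOWN")

if __name__ == "__main__":
    unittest.main()
```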
Posted 4 weeks ago