3.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
We are looking for an experienced and visionary BI Architect to lead the design and evolution of our Business Intelligence architecture. In this strategic role, you'll work closely with cross-functional leaders to build scalable, high-performance data solutions that empower smarter, faster decisions across the organization. If you're passionate about driving impact through architecture and innovation, this is your opportunity to make a lasting difference.
Posted 2 weeks ago
8.0 - 10.0 years
32 - 37 Lacs
Pune
Work from Office
You enjoy shaping the future of product innovation as a core leader, driving value for customers, guiding successful launches, and exceeding expectations. Join our dynamic team and make a meaningful impact by delivering high-quality products that resonate with clients. As a Data Product Manager within the Connected Commerce Travel Domain of Consumer and Community Banking, you play a pivotal role in the team, promoting the innovation of new data product offerings and overseeing the complete data product life cycle. As a key leader, your responsibilities include representing the customer's perspective and developing profitable data products that deliver value to the customer. Leveraging your extensive knowledge of initiating a data product, you steer the successful launch of data products, collect vital feedback, and ensure superior client experiences. With a strong dedication to scalability, resiliency, and stability, you work closely with cross-functional teams to deliver high-quality data products that surpass customer expectations.
Job responsibilities
- Develop a data product strategy and product vision that delivers value to customers
- Partner with stakeholders in both India and the US, working with multiple tribes, squads, and analysts to identify, refine, scope, and prioritize high-value features
- Own, maintain, and develop a product backlog that enables development to support the overall strategic roadmap and value proposition
- Build the framework for, and track, the product's key success metrics, such as cost, features and functionality, risk posture, and reliability
- Own the layout of the Jira stories for each sprint and ensure they follow best practices and are in line with defined deliverables
- Conduct business acceptance of features and epics according to the definition of done
- Coach, train, manage, and mentor analysts and team members to improve the maturity and value of the product practices within the team
Required qualifications, capabilities, and skills
- Bachelor's or Master's degree
- 10+ years of experience or equivalent expertise in product management, agile development, data architecture, and other relevant technology domains
- Advanced knowledge of the data product development life cycle, including requirements, design, data querying, cloud technologies, and data analytics
- Experience as a people manager, including coaching and performance management
Preferred qualifications, capabilities, and skills
- Demonstrated prior experience working in a highly matrixed, complex organization
- Relevant experience in the travel domain or customer loyalty space
- Background or coursework in Computer Science
Posted 2 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Remote
Healthcare experience is mandatory.
Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.
Key Responsibilities:
Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization
Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality
Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions
Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources
Required Qualifications:
Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)
Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)
Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements
- Excellent communication skills, with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards
Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
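The dimensional-modeling expertise this role calls for can be illustrated with a minimal star-schema sketch. The table and column names below (`fact_claim`, `dim_member`, `dim_provider`) are invented for illustration, not taken from the posting, and SQLite stands in for the warehouse purely for portability; real health-plan models carry far more attributes and conformed dimensions.

```python
import sqlite3

# Minimal star schema: one fact table of claims surrounded by dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_member (
    member_key INTEGER PRIMARY KEY,
    member_id  TEXT,   -- source-system enrollment ID
    plan_type  TEXT    -- e.g. Medicare Advantage, Commercial
);
CREATE TABLE dim_provider (
    provider_key INTEGER PRIMARY KEY,
    npi          TEXT, -- National Provider Identifier
    specialty    TEXT
);
CREATE TABLE fact_claim (
    claim_key    INTEGER PRIMARY KEY,
    member_key   INTEGER REFERENCES dim_member(member_key),
    provider_key INTEGER REFERENCES dim_provider(provider_key),
    icd10_code   TEXT,  -- diagnosis code
    paid_amount  REAL
);
""")

cur.execute("INSERT INTO dim_member VALUES (1, 'M1001', 'Commercial')")
cur.execute("INSERT INTO dim_provider VALUES (1, '1234567890', 'Cardiology')")
cur.execute("INSERT INTO fact_claim VALUES (1, 1, 1, 'I10', 250.0)")

# Typical analytical query: paid amount rolled up by plan type and specialty.
cur.execute("""
SELECT m.plan_type, p.specialty, SUM(f.paid_amount)
FROM fact_claim f
JOIN dim_member m   ON f.member_key = m.member_key
JOIN dim_provider p ON f.provider_key = p.provider_key
GROUP BY m.plan_type, p.specialty
""")
print(cur.fetchall())  # [('Commercial', 'Cardiology', 250.0)]
```

The fact table holds additive measures (paid amounts) at claim grain, while the dimensions hold the descriptive attributes that regulatory reports such as HEDIS or MLR slice by.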
Posted 2 weeks ago
5.0 - 8.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Job Description Summary
The Sr Data Analyst - BI Reporting will play a key role in developing end-to-end reporting solutions, from data collection and transformation to report generation and visualization. This role involves working on the cutting edge of data engineering and analytics, leveraging machine learning, predictive modeling, and generative AI to drive business insights.
Roles and Responsibilities
- Design visualizations and create dashboards/reports using Power BI (Tableau experience is good to have).
- Explore, clean, and visualize data sets to prepare for analysis/reporting, ensuring data quality and consistency.
- Develop and maintain BI semantic data models for large-scale data warehouses/data lakes, eventually consumed by reporting tools.
- Leverage SQL and big data tools (e.g., Hadoop, Spark) for data manipulation and optimization.
- Build advanced data models and pipelines using SQL and other tools.
- Ensure data quality, consistency, and integrity throughout the data lifecycle.
- Collaborate closely with data engineers, analysts, and other stakeholders to understand data requirements and optimize the data flow architecture.
- Document data processes, data architecture, modelling/flow charts, and best practices for future reference and knowledge sharing.
Desired Characteristics
- 5 to 8 years of experience in data analytics, data mining/integration, BI development, reporting, and insights.
- Strong knowledge of SQL and experience with big data technologies such as Hadoop, Spark, or similar tools for data manipulation.
- Ability to develop advanced visualizations/reports that highlight trends, patterns, and outliers, making complex data easily understandable for various business functions.
- Experience implementing UI/UX best practices to improve navigation, data storytelling, and the overall usability of dashboards, ensuring that reports are actionable and user-friendly and provide the desired insights.
Additional Information
Relocation Assistance Provided: Yes
Posted 2 weeks ago
12.0 - 14.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Job Title: Database Expert/Architect
We are seeking a highly skilled Database Expert with deep expertise in both SQL and NoSQL technologies, particularly in large-scale data environments. The ideal candidate will have a strong background in Redshift and data warehousing concepts, along with foundational knowledge of Power BI or similar reporting tools.
Key Responsibilities:
- Design, develop, and optimize SQL and NoSQL queries and stored procedures for high-performance data retrieval across datasets exceeding 100 million records.
- Implement and manage database replication, partitioning, fragmentation, indexing strategies, and performance tuning.
- Oversee user and access management to ensure data security and compliance.
- Collaborate with reporting teams to support data visualization and dashboarding needs using Power BI or equivalent tools.
- Maintain and enhance database availability, scalability, and reliability through proactive monitoring and tuning.
Required Skills & Experience:
- Proven expertise in SQL and NoSQL database development and optimization.
- Strong hands-on experience with Amazon Redshift, MySQL, MongoDB, and Flyway.
- In-depth understanding of data warehousing principles and performance tuning techniques.
- 3+ years of experience working with AWS-managed database services.
- 1+ years of experience with Power BI or similar BI/reporting platforms.
Experience: 12-14 years
Primary Skill: Data Architecture
Additional Skill(s): MongoDB, SQL Development, Redshift
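As a rough illustration of the indexing and performance-tuning work described above, the sketch below shows how adding an index changes a query's access path. It uses SQLite purely for portability and an invented `events` table; Redshift achieves the analogous effect with distribution and sort keys rather than B-tree indexes, so this demonstrates the principle, not the platform.

```python
import sqlite3

# Small table of synthetic events to compare access paths before and
# after indexing the filter column.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
cur.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

# Without an index, the predicate forces a full table scan.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 42"
).fetchall()

# An index on the filter column turns the scan into an index search.
cur.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 42"
).fetchall()

print(plan_before[0][-1])  # a SCAN step over the whole table
print(plan_after[0][-1])   # a SEARCH step using idx_events_user
```

At the 100-million-row scale the posting mentions, the same idea extends to partition pruning: restricting a query to the partitions (or sorted blocks) that can contain matching rows, instead of touching the full dataset.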
Posted 2 weeks ago
2.0 - 4.0 years
4 - 7 Lacs
Gurugram
Work from Office
About the Role
As a Data Engineer at BluSlash Consulting, you will play a pivotal role in transforming raw data into actionable insights that drive strategic decisions. You will work closely with our experienced BI leadership team to identify critical business metrics, analyze complex datasets, and build robust data models. Your responsibilities will include developing and implementing data-driven solutions that address the unique challenges faced by our clients, ensuring their projects' success. You will utilize state-of-the-art data processing technologies and tools to create comprehensive reports and dashboards, providing clear, concise, and impactful visualizations. Through your expertise, you will help our clients understand their data better, uncover hidden trends, and make informed decisions that contribute to their overall growth and efficiency.
Requirements
- Bachelor's degree required, in any discipline; a degree or certification in data analytics subjects is a plus.
- 2-4 years of relevant work experience.
- Experience with data warehouses, distributed data platforms, and data lakes.
- Experience working with Excel, Python, and SQL.
- Must know data ingestion from multiple sources, data transformation, and data architecture.
- Excellent business and technical communication, organizational, and problem-solving skills.
- Knowledge of a visualization tool such as Power BI, Tableau, Qlik, or Looker is a plus.
- Ability to gather, analyze, restructure, identify, and articulate insights from qualitative and quantitative data.
- Team player, comfortable interacting with people.
Key Responsibilities
- Conduct requirements gathering and project scoping sessions with subject matter experts, business users, and executive stakeholders to discover and define business data needs.
- Design, build, and optimize the data architecture and Extract, Transform, and Load (ETL) pipelines to make data accessible for Business Data Analysts, Data Scientists, and business users, enabling data-driven decision-making.
- Work closely with analysts to productionize and scale value-creating capabilities, including data integrations and transformations, model features, and statistical and machine learning models.
- Drive the highest standards in data reliability, data integrity, and data governance, enabling accurate, consistent, and trustworthy data sets, business intelligence products, and analyses.
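The Extract, Transform, and Load pattern named in the responsibilities can be sketched as three composable stages. This is a toy sketch with an invented in-memory source and warehouse; a production pipeline would swap in real connectors, an orchestrator, and a real warehouse client.

```python
from typing import Iterable

def extract() -> Iterable[dict]:
    # Stand-in for reading from an API, file, or operational database.
    yield {"customer": " Alice ", "amount": "120.50"}
    yield {"customer": "Bob", "amount": "80.00"}

def transform(rows: Iterable[dict]) -> Iterable[dict]:
    # Normalize types and trim whitespace so downstream analysts see
    # consistent, query-ready values.
    for row in rows:
        yield {
            "customer": row["customer"].strip(),
            "amount": float(row["amount"]),
        }

def load(rows: Iterable[dict], warehouse: list) -> None:
    # Stand-in for a bulk insert into the warehouse.
    warehouse.extend(rows)

warehouse: list = []
load(transform(extract()), warehouse)
print(warehouse)
# [{'customer': 'Alice', 'amount': 120.5}, {'customer': 'Bob', 'amount': 80.0}]
```

Keeping each stage a pure function over row iterators makes the pipeline easy to test in isolation and to re-point at new sources as they are added.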
Posted 2 weeks ago
7.0 - 10.0 years
9 - 13 Lacs
Hyderabad
Work from Office
At Accellor, we are a trusted consultant that uses best-of-breed Cloud technology to deliver superior customer engagement and business effectiveness for clients. We bring a deep understanding of the Financial, Retail, High Tech, and Healthcare verticals. We've created an atmosphere that encourages curiosity, constant learning, and persistence. We encourage our employees to grow and explore their interests. We cultivate an environment of collaboration, autonomy, and delegation; we know our people have a strong work ethic and a sense of pride and ownership over their work. They are passionate, eager, and motivated, focused on building the perfect solution but never losing sight of the bigger picture.
As a Lead Data Engineer specializing in Snowflake and Databricks, you will be responsible for designing, developing, and delivering data engineering solutions using modern cloud data platforms. The candidate should have strong expertise in the data lifecycle, including data ingestion, transformation, and modeling, as well as experience with distributed data processing, data security, and integration with internal and external data sources. Additionally, the candidate should be proficient in leveraging best practices in data architecture and performance optimization. The role also requires the ability to drive end-to-end project delivery aligned with business objectives and ensure the realization of data-driven value.
Responsibilities:
- Demonstrated ability to have successfully completed multiple complex technical projects and to create the high-level design and architecture of the solution, including class, sequence, and deployment infrastructure diagrams.
- Take ownership of technical solutions from a design and architecture perspective, for projects in the presales phase as well as ongoing projects.
- Gather end-user requirements and write technical documentation.
- Suggest innovative solutions based on new technologies and the latest trends.
- Review the architectural/technological solutions for ongoing projects and ensure the right choice of solution.
- Work closely with client teams to understand their business, capture requirements, identify pain areas, propose an ideal solution accordingly, and win business.
Requirements:
- 7-10 years of experience working with Snowflake/Databricks in a data engineering or architecture role.
- Familiarity with programming languages such as Python, Java, or Scala for data processing and automation.
- Strong expertise in SQL, data modeling, and advanced query optimization techniques.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP) and their integration with Snowflake.
- Proficiency in ETL/ELT tools such as ADF, Fabric, etc.
- Experience with data visualization tools like Tableau, Power BI, or Looker.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to work in a fast-paced, dynamic environment.
- Certification in Databricks is an added advantage.
Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers.
Work-Life Balance: Accellor prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training, a stress management program, professional certifications, and technical and soft skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Personal Accident Insurance, a periodic health awareness program, extended maternity leave, annual performance bonuses, and referral bonuses.
Disclaimer: Accellor is proud to be an equal opportunity employer. We do not discriminate in hiring or any employment decision based on race, color, religion, national origin, age, sex (including pregnancy, childbirth, or related medical conditions), marital status, ancestry, physical or mental disability, genetic information, veteran status, gender identity or expression, sexual orientation, or other applicable legally protected characteristic.
Posted 2 weeks ago
4.0 - 7.0 years
12 - 16 Lacs
Noida
Work from Office
Build smarter systems. Unlock data. Power growth.
At SaaS Labs, we're scaling fast, and we're looking for someone who can turn CRM tools into growth engines, translate data into insights, and fuel GTM execution with speed and clarity. This isn't a traditional operations role. It's a hands-on opportunity to be at the heart of how we scale, helping sales, marketing, and growth teams move faster, execute smarter, and win bigger.
What You'll Do
- Build for Impact: Design and implement end-to-end CRM architectures (Salesforce/HubSpot), including complex flows, custom objects, field dependencies, and multi-system integrations that scale with our growth.
- Own Tooling & Process: Create robust data models, implement ETL processes, and architect reporting frameworks that transform raw pipeline data into actionable business intelligence.
- Integrate & Optimize the Tech Stack: Own full-stack integrations between CRM, marketing automation, sales engagement tools, and data warehouses, ensuring seamless data flow and system performance.
- Solve Complex Analytical Challenges: Partner with stakeholders to translate complex business problem statements into analytical frameworks by leveraging the full capabilities of the tech stack. Transform ambiguous analytical requirements into clear, actionable insights that help our GTM teams stay focused and accountable.
- Fuel GTM Velocity: Partner closely with Sales, CS, and Marketing leaders to streamline motions, set up scalable systems, and track performance outcomes.
What You Need to Succeed
- 4-7 years of hands-on experience in designing CRM architecture, systems, integrations, and scalable solutions; experience working in revenue operations at high-growth SaaS companies.
- Expert-level Salesforce administration, including Apex triggers, custom objects, complex validation rules, Process Builder, Flow Builder, and Lightning components.
- Proven experience with system integrations, API management, and data architecture across tools like HubSpot, Outreach, Salesloft, ZoomInfo, LinkedIn Sales Navigator, and marketing automation platforms; familiarity with automation platforms (Zapier/Workato).
- Proficient technical skills in SQL and data modeling, and experience with business intelligence tools (Tableau, Looker, Power BI) and data warehouses (Snowflake, BigQuery); strong Excel/Sheets skills.
- Comfortable working cross-functionally in a fast-paced startup environment.
Why Join Us
- High-ownership, high-impact role: you will work in the CEO's office.
- Work closely with senior GTM leaders.
- Fast-moving, outcome-driven environment.
We are a global SaaS company backed by leading investors, with $40M+ in ARR, powering revenue teams across 6,000+ businesses to close deals faster, personalize customer engagement, and scale their GTM operations effortlessly. Want to build the systems and insights that shape how we grow? Apply now and help us scale smarter.
Posted 2 weeks ago
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
Business Overview
Roundel is Target's entry into the media business: an advertising sell-side business built on the principles of first-party (people-based) data, brand-safe content environments, and proof that our marketing programs drive business results for our clients. We operate with an ethos of trust and transparency, believing that media works best when it works in everyone's best interest. At the very root of that, Roundel is here to drive business growth for our clients and redefine value in the industry by solving core industry challenges rather than copying current industry methods of operation. We are here to drive a key growth initiative for Target and to lead the industry to a better way of operating within the media marketplace.
As the Analyst Data Strategy P&I at Roundel, you would be responsible for understanding the data needs across the team, and you would develop and execute a roadmap to maximize data capabilities that align with the evolving needs of the P&I team. You would be the data custodian for the P&I team, responsible for scaling the data function in the form of new products and services. You will also be responsible for ensuring that the data across the products is consistent with the highest standards of accuracy. You will do so by bringing your knowledge and expertise of working in a data analytics/data-centric business environment with a strong foundation in data strategy. You would work towards building new data capabilities in partnership with Roundel's Product functions to enhance the reporting and insights machinery within the P&I team. You would work as a conduit between the Product and P&I teams to simplify data across sources, ensure data consistency and integrity, and develop new custom data products that support multiple use cases (MMM, Custom Analytics, etc.). Additionally, you will coordinate and lead both small and large working sessions with business stakeholders that drive to conclusions, agreement, and outcomes.
Key Responsibilities
- Act as the data custodian for the P&I team: build and maintain data sets for reporting, insights, custom analytics, and visualizations; create clear documentation and manage accessible data libraries.
- Create new data products that align with evolving P&I reporting needs and result in time and effort savings for analysts. Identify automation opportunities and gaps in technology and tools, and work with the product and business operations teams to find solutions and support.
- Data Warehousing: Identify new data sets that can be integrated into existing reporting platforms to enrich our reporting and insights. Continue to develop and help drive the strategy for the different data needs, including data architecture, reporting, insights, analytics, MMM, etc.
- Data Quality and Reliability: Develop procedures to enhance the accuracy and integrity of data by performing data analysis and by collaborating with the product team to enhance data collection and storage procedures. Ensure that the data from different sources flowing into the analyst-facing products is consistent and accurate for all business use cases. Ensure data accuracy, as well as reporting output quality control; as required, troubleshoot and identify root causes of data inaccuracy (manual vs. system errors).
- Work with cross-functional teams (Product, Measurement, BII, and GTM) to align on the data strategy and data roadmaps and implement solutions. Contribute to team-upskilling efforts required to familiarize them with the data products, and address any data challenges they face.
About You
- 5-7 years of working experience designing, building, and optimizing data structures for ETL processes; data wrangling and processing of large data; cleaning and preparing data; querying data; and conducting exploratory analysis, with a strong foundation in data strategy.
- Master's or bachelor's degree in one of the following: Analytics, Business Intelligence, Economics, or Engineering.
- Knowledge of database structures, building and improving scalable datasets, and managing data pipelines and infrastructure.
- Strong hands-on programming skills in Python, SQL, and Hadoop/Hive. Additional knowledge of Spark and Scala is desired but not mandatory.
- Proven hands-on experience developing and preparing datasets for use in business intelligence (BI) tools such as Domo (mandatory); Tableau and Power BI are good to have.
- Knowledgeable in data models, data mining, and segmentation techniques.
- A background in ad tech or retail media networks is desirable but not mandatory.
- Excellent oral and written communication skills and the ability to summarize and present complex information in a clear and concise manner to technical and non-technical audiences.
- Experience and passion for using data to explain or solve complex business problems and influence the invention of new systematic and operational processes.
- Experience working with ambiguity in a dynamic and challenging environment to drive long-term sustainable solutions.
- Proven experience of achieving results by leading, partnering with, and influencing peers and leaders.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
About the Role
Grade Level (for internal use): 11
The Role: The Knowledge Engineering team is seeking a Lead Knowledge Engineer to support our strategic transformation from a traditional data organization into a next-generation, interconnected data intelligence organization.
The Team: The Knowledge Engineering team within Data Strategy and Governance helps to lead fundamental organizational and operational change, driving our linked data, open data, and data governance strategy, both internally and externally. The team partners closely with data and software engineering to envision and build the next generation of data architecture and tooling with modern technologies.
The Impact: Knowledge Engineering efforts occur within the broader context of major strategic initiatives to extend market leadership and build next-generation data, insights, and analytics products that are powered by our world-class datasets.
What's in it for you: The Lead Knowledge Engineer role is an opportunity to work as an individual contributor, creatively solving complex challenges alongside visionary leadership and colleagues. It's a role with highly visible initiatives and outsized impact. The wider division has a great culture of innovation, collaboration, and flexibility, with a focus on delivery. Every person is respected and encouraged to be their authentic self.
Responsibilities:
- Develop, implement, and continue to enhance ontologies, taxonomies, knowledge graphs, and related semantic artefacts for interconnected data, as well as topical/indexed query, search, and asset discovery.
- Design and prototype data/software engineering solutions that enable scaling the construction, maintenance, and consumption of semantic artefacts and the interconnected data layer for various application contexts.
- Provide thought leadership for strategic projects, ensuring timelines are feasible, work is effectively prioritized, and deliverables are met.
- Influence the strategic semantic vision, roadmap, and next-generation architecture.
- Execute on the interconnected data vision by creating linked metadata schemes to harmonize semantics across systems and domains.
- Analyze and implement knowledge organization strategies using tools capable of metadata management, ontology management, and semantic enrichment.
- Influence and participate in governance bodies to advocate for the use of established semantics and knowledge-based tools.
Qualifications:
- Able to communicate complex technical strategies and concepts in a relatable way to both technical and non-technical stakeholders and executives, to effectively persuade and influence.
- 5+ years of experience with ontology development, semantic web technologies (RDF, RDFS, OWL, SPARQL), and open-source or commercial semantic tools (e.g., VocBench, TopQuadrant, PoolParty, RDFLib, triple stores); advanced studies in computer science, knowledge engineering, information sciences, or a related discipline preferred.
- 3+ years of experience in advanced data integration with semantic and knowledge graph technologies in complex, enterprise-class, multi-system environments; skilled in all phases from conceptualization to optimization.
- Programming skills in a mainstream programming language (Python, Java, JavaScript); experience utilizing cloud services (AWS, Google Cloud, Azure) is a great bonus.
- Understanding of the agile development life cycle and the broader data management discipline (data governance, data quality, metadata management, reference and master data management).
The S&P Global Enterprise Data Organization is a unified, cross-divisional team focused on transforming S&P Global's data assets. We streamline processes and enhance collaboration by integrating diverse datasets with advanced technologies, ensuring efficient data governance and management.
About S&P Global Commodity Insights
At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the Energy Transition, S&P Global Commodity Insights' coverage includes oil and gas, power, chemicals, metals, agriculture, and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics, and workflow solutions in the global capital, commodity, and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today.
What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
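The triple-based data model behind the RDF and SPARQL skills this posting lists can be illustrated with a toy pattern matcher. The entity names below are invented, and a real knowledge graph would live in a triple store queried via SPARQL (for example through RDFLib); this plain-Python sketch only mimics a single triple-pattern match.

```python
# Knowledge is modeled as subject-predicate-object triples.
triples = {
    ("acme_corp", "rdf:type", "Company"),
    ("acme_corp", "hasSector", "Energy"),
    ("globex", "rdf:type", "Company"),
    ("globex", "hasSector", "Technology"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None plays the role of a
    SPARQL variable that binds to anything."""
    return {
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    }

# Analogous to: SELECT ?s WHERE { ?s hasSector "Energy" }
print(sorted(t[0] for t in match(p="hasSector", o="Energy")))  # ['acme_corp']
```

Because every fact shares this one uniform shape, independently produced datasets can be merged by simply unioning their triple sets, which is what makes the "interconnected data layer" described above feasible across systems.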
Our People:
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
----
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision
----
10 - Officials or Managers (EEO-2 Job Categories - United States of America), DTMGOP103.2 - Middle Management Tier II (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
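As a concrete flavor of the triple-pattern querying this role centers on, here is a toy sketch in plain Python. It is illustrative only: production work would use RDF with SPARQL via rdflib or a triple store, and all names below (the triples and the `match()` helper) are invented for the example.

```python
# Toy in-memory triple store showing the subject-predicate-object pattern
# matching that SPARQL performs over an RDF graph. All data is made up.
triples = {
    ("Crude_Oil", "subClassOf", "Commodity"),
    ("Brent", "type", "Crude_Oil"),
    ("WTI", "type", "Crude_Oil"),
    ("Brent", "tradedAt", "ICE"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts like a SPARQL variable."""
    return [
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    ]

# Rough analogue of: SELECT ?s WHERE { ?s rdf:type :Crude_Oil }
crude_oils = sorted(t[0] for t in match(p="type", o="Crude_Oil"))
print(crude_oils)  # ['Brent', 'WTI']
```

A real ontology would also carry schema-level semantics (RDFS/OWL class hierarchies, inference), which this lookup-only sketch deliberately omits.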
Posted 2 weeks ago
12.0 - 15.0 years
0 - 3 Lacs
Bengaluru
Hybrid
Role & responsibilities
- Design, develop, and maintain scalable enterprise data architecture incorporating data warehouse, data lake, and data mesh concepts
- Create and maintain data models, schemas, and mappings that support reporting, business intelligence, analytics, and AI/ML initiatives
- Establish data integration patterns for batch and real-time processing using AWS services (Glue, DMS, Lambda), Redshift, Snowflake, or Databricks
- Define technical specifications for data storage, data processing, and data access patterns
- Develop data models and enforce data architecture standards, policies, and best practices
- Partner with business stakeholders to translate requirements into architectural solutions
- Lead data modernization initiatives, including legacy system migrations
- Create roadmaps for evolving data architecture to support future business needs
- Provide expert guidance on complex data problems and architectural decisions
Preferred candidate profile
- Bachelor's degree in Computer Science, Information Systems, or a related field; Master's degree preferred
- 8+ years of experience in data architecture, database design, data modelling, or related roles
- 5+ years of experience with cloud data platforms, particularly AWS data services
- 3+ years of experience architecting MPP database solutions (Redshift, Snowflake, etc.)
- Expert knowledge of data warehouse architecture and dimensional modelling
- Strong understanding of the AWS data services ecosystem (Redshift, S3, Glue, DMS, Lambda)
- Experience with SQL Server and migration to cloud data platforms
- Proficiency in data modelling, entity relationship diagrams, and schema design
- Working knowledge of data integration patterns and technologies (ETL/ELT, CDC)
- Experience with one or more programming/scripting languages (Python, SQL, Shell)
- Familiarity with data lake architectures and technologies (Parquet, Delta Lake, Athena)
- Excellent verbal and written communication skills, with the ability to translate complex technical concepts to varied audiences
- Strong stakeholder management and influencing skills
- Experience implementing data warehouse, data lake, and data mesh architectures
- Good to have: knowledge of machine learning workflows and feature engineering
- Understanding of regulatory requirements related to data (FedRAMP, GDPR, CCPA, etc.)
- Experience with big data technologies (Spark, Hadoop)
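To make the "dimensional modelling" requirement concrete, here is a minimal star-schema sketch: one fact table joined to dimension tables and rolled up with a typical aggregate query. SQLite stands in for a warehouse like Redshift or Snowflake, and all table, column, and value names are invented for illustration.

```python
# Minimal star schema: fact_sales keyed to dim_product and dim_date,
# queried with the classic "revenue by category by year" rollup.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE dim_date    (date_id    INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_sales  (
        product_id INTEGER REFERENCES dim_product(product_id),
        date_id    INTEGER REFERENCES dim_date(date_id),
        amount     REAL
    );
    INSERT INTO dim_product VALUES (1, 'Beverage'), (2, 'Snack');
    INSERT INTO dim_date    VALUES (10, 2024), (11, 2025);
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 150.0), (2, 11, 80.0);
""")

rows = con.execute("""
    SELECT p.category, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date    d ON d.date_id    = f.date_id
    GROUP BY p.category, d.year
    ORDER BY p.category, d.year
""").fetchall()
print(rows)  # [('Beverage', 2024, 100.0), ('Beverage', 2025, 150.0), ('Snack', 2025, 80.0)]
```

The design choice the role asks for is exactly this shape: measures live in narrow fact tables, descriptive attributes live in dimensions, and queries slice facts by dimension attributes.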
Posted 2 weeks ago
0.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Overview The Tech Strategy & Enterprise Solutions Consumer Capability at PepsiCo drives personalization at scale by leveraging a best-in-class technology stack. The team elevates consumer engagement through Salesforce Marketing Cloud, Data Cloud, MCP, and Service Cloud, enabling targeted campaigns and loyalty initiatives. Responsibilities Collaborate with cross-functional teams, including Business Analysts, Product Owners, and IT teams, to deeply understand Salesforce functionalities, assess risks, and develop robust solutions aligned with business requirements. Govern and maintain compliance with standards, including ISO and CMMI, to ensure alignment with organizational goals. Document solutions with a meticulous and detail-oriented approach, clearly articulating the how and why to build a comprehensive and accessible knowledge base. Act as the SPOC for Salesforce-related queries, ensuring timely and effective resolution. Manage user accounts, security settings, and data tasks such as imports, exports, and cleansing. Customize and maintain Salesforce objects, workflows, dashboards, and reports while ensuring usability and scalability. Monitor system performance, troubleshoot issues, and integrate Salesforce with third-party tools. Qualifications Mandatory Technical Skills Expertise in Salesforce administration, including custom objects, sharing settings, profiles, role hierarchies, Salesforce Shield, and GDPR compliance for secure system management. Proficiency in understanding and working with Salesforce Data Cloud-related metadata. Proficiency in Salesforce reporting, analytics, and data visualization tools for decision-making. Familiarity with risk assessment frameworks for system development. Salesforce Data Cloud Consultant certification (required). Salesforce Administrator Certification (required); advanced certifications like Salesforce Advanced Administrator and Platform App Builder are preferred.
Mandatory Non-Technical Skills A keen eye for detail and a proactive approach to identifying issues and inefficiencies. Strong problem-solving skills with the ability to develop and implement long-term solutions. A governance-focused mindset with a commitment to maintaining high operational standards. Effective communication and interpersonal skills to act as a trusted SPOC for stakeholders.
Posted 2 weeks ago
8.0 - 13.0 years
14 - 18 Lacs
Hyderabad
Work from Office
Overview Customer Data Stewardship Sr Analyst (IBP) Job Overview
PepsiCo Data Governance Program Overview: PepsiCo is establishing a Data Governance program that will be the custodian of the processes, policies, rules, and standards by which the Company will define its most critical data. Enabling this program will:
- Define ownership and accountability of our critical data assets to ensure they are effectively managed and maintain integrity throughout PepsiCo's systems
- Leverage data as a strategic enterprise asset enabling data-based decision analytics
- Improve productivity and efficiency of daily business operations
Position Overview: The Customer Data Steward IBP role is responsible for working within the global data governance team and with their local businesses to maintain alignment to the Enterprise Data Governance's (EDG) processes, rules, and standards set to ensure data is fit for purpose. Responsibilities Primary Accountabilities
- Deliver key elements of Data Discovery, Source Identification, Data Quality Management, and cataloging for program and Customer Domain data.
- Ensure data accuracy and adherence to PepsiCo-defined global governance practices, as well as driving acceptance of PepsiCo's enterprise data standards and policies across the various business segments.
- Maintain and advise relevant stakeholders on data governance-related matters in the customer domain and with respect to Demand Planning in IBP, with a focus on the business use of the data.
- Define Data Quality Rules from source systems, within the Enterprise Data Foundation, and through to the end-user systems to enable end-to-end Data Quality management and deliver a seamless user experience.
- Advise on various projects and initiatives to ensure that any data-related changes and dependencies are identified, communicated, and managed to ensure adherence with the Enterprise Data Governance established standards.
- Accountable for ensuring that data-centric activities are aligned with the EDG program and leverage applicable data standards, governance processes, and overall best practices.
Data Governance Business Standards
- Ensures alignment of the data governance processes and standards with applicable enterprise, business segment, and local data support models.
- Champions the single set of enterprise-level data standards and the repository of key elements pertaining to their in-scope data domain (e.g., Customer, Material, Vendor, Finance, Consumer) and promotes their use throughout the PepsiCo organization.
- Advises on various projects and initiatives to ensure that any data-related changes and dependencies are identified, communicated, and managed to ensure adherence with the Enterprise Data Governance established standards.
Data Domain Coordination and Collaboration
- Responsible for helping identify the need for sector-level data standards (and above) based on strategic business objectives and the evolution of enterprise-level capabilities and analytical requirements.
- Collaborates across the organization to ensure consistent and effective execution of data governance and management principles across PepsiCo's enterprise and analytical systems and data domains.
- Accountable for driving organizational acceptance of EDG-established data standards, policies, definitions, and process standards for critical/related enterprise data.
- Promotes and champions PepsiCo's Enterprise Data Governance capability and data management program across the organization.
Qualifications 8+ years of experience working in Customer Operations, Demand Planning, Order to Cash, Commercial Data Governance, or Data Management within a global CPG organization.
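The "Define Data Quality Rules" accountability above can be pictured as declarative checks applied to records, with failures surfaced for a steward to triage. This is a hypothetical sketch: the field names and rules are invented, and a real program would express them in a governance/data-quality platform rather than ad-hoc code.

```python
# Toy data-quality rule engine: each rule is a named predicate over a record;
# validate() returns the rules a record violates. All rules are illustrative.
RULES = {
    "customer_id must be present": lambda r: bool(r.get("customer_id")),
    "country must be ISO-2":       lambda r: len(r.get("country", "")) == 2,
    "credit_limit must be >= 0":   lambda r: r.get("credit_limit", 0) >= 0,
}

def validate(record):
    """Return the names of all rules the record violates."""
    return [name for name, check in RULES.items() if not check(record)]

good = {"customer_id": "C001", "country": "IN", "credit_limit": 5000}
bad  = {"customer_id": "",     "country": "IND", "credit_limit": -1}

print(validate(good))       # []
print(len(validate(bad)))   # 3
```

Keeping rules declarative like this is what lets the same definitions be applied "from source systems through to end-user systems", as the posting describes.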
Posted 2 weeks ago
10.0 - 15.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview The role will be responsible for successfully distributing the master data across the landscape, including MDG, S4 HANA, DDH, and downstream applications, and for ensuring data consistency and seamless movement of data to avoid any adverse impact to business transactions. The Data Conversion and ETL expert for the RTR area will have a good understanding of data architecture, data solutions, and systems capabilities based around SAP S4/HANA as the core platform, and should be able to understand and influence end-to-end business requirements so that a realistic and attainable solution is deployed. Responsibilities Partner with multiple Value Streams to define the data design and data standards for the S/4 migration project Partner with other sector data leads to integrate the data migration standards and activities Ensure data consistency across the landscape Develop standards and guidelines for master data interface modelling Support onboarding and KT for project resources commencing S4 migration/deployment projects Develop processes, templates, and migration tools (ETL) for new objects in scope for S4 deployment Qualifications Bachelor's degree required 10+ years of functional experience with data/conversions/interfaces Demonstrated ability to effectively communicate with all levels of the organization Ability to work flexible hours based on varying business requirements Solves highly complex problems within their work team Ability to quickly adapt to changes in timelines and sequences Adaptability and flexibility, including the ability to manage deadline pressure, ambiguity, and change
Posted 2 weeks ago
10.0 - 15.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview The role will be responsible for successfully distributing the master data across the landscape, including MDG, S4 HANA, DDH, and downstream applications, and for ensuring data consistency and seamless movement of data to avoid any adverse impact to business transactions. The Data Conversion and ETL expert will have a good understanding of data architecture, data solutions, and systems capabilities based around SAP S4/HANA as the core platform, and should be able to understand and influence end-to-end business requirements so that a realistic and attainable solution is deployed. Responsibilities Partner with multiple Value Streams to define the data design and data standards for the S/4 migration project Partner with other sector data leads to integrate the data migration standards and activities Ensure data consistency across the landscape Develop standards and guidelines for master data interface modelling Support onboarding and KT for project resources commencing S4 migration/deployment projects Develop processes, templates, and migration tools (ETL) for new objects in scope for S4 deployment Qualifications Bachelor's degree required 10+ years of functional experience with data/conversions/interfaces Demonstrated ability to effectively communicate with all levels of the organization Ability to work flexible hours based on varying business requirements Solves highly complex problems within their work team Ability to quickly adapt to changes in timelines and sequences Adaptability and flexibility, including the ability to manage deadline pressure, ambiguity, and change
Posted 2 weeks ago
5.0 - 10.0 years
19 - 25 Lacs
Hyderabad
Work from Office
Overview Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives. Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred. Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence. Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy. Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency. Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments. Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes. Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture. Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution. Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps. Responsibilities Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics. 
Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability. Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance. Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform. Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making. Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams. Support data operations and sustainment activities, including testing and monitoring processes for global products and projects. Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams. Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements. Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs. Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams. Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams. Support the development and automation of operational policies and procedures, improving efficiency and resilience. Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies. Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery. Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps. Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals. 
Utilize technical expertise in cloud and data operations to support service reliability and scalability. Qualifications 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred. 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance. 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams. Experience in a lead or senior support role, with a focus on DataOps execution and delivery. Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences. Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements. Customer-focused mindset, ensuring high-quality service delivery and operational efficiency. Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment. Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation. Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements. Understanding of operational excellence in complex, high-availability data environments. Ability to collaborate across teams, building strong relationships with business and IT stakeholders. Basic understanding of data management concepts, including master data management, data governance, and analytics. Knowledge of data acquisition, data catalogs, data standards, and data management tools. Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results. Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
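The "real-time data observability" and SLA-adherence duties above boil down to automated checks like the one sketched here: flag any table whose last successful load breaches its freshness SLA. This is a hedged, stdlib-only sketch; the table names and SLA hours are invented, and a real DataOps setup would read load timestamps from pipeline metadata or Azure Monitor rather than an in-memory dict.

```python
# Toy data-freshness check: alert when a table's last load exceeds its SLA.
from datetime import datetime, timedelta, timezone

SLA_HOURS = {"sales_daily": 24, "clicks_stream": 1}  # invented SLAs

def stale_tables(last_loaded, now):
    """Return the tables whose last successful load breaches their SLA."""
    return sorted(
        table for table, loaded_at in last_loaded.items()
        if now - loaded_at > timedelta(hours=SLA_HOURS[table])
    )

now = datetime(2025, 1, 2, 12, 0, tzinfo=timezone.utc)
last_loaded = {
    "sales_daily": now - timedelta(hours=30),      # breaches the 24h SLA
    "clicks_stream": now - timedelta(minutes=20),  # within the 1h SLA
}
print(stale_tables(last_loaded, now))  # ['sales_daily']
```

In practice such checks run on a schedule, feed monitoring dashboards, and trigger the automated remediation ("self-healing") the posting mentions.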
Posted 2 weeks ago
8.0 - 12.0 years
2 - 6 Lacs
Bengaluru
Work from Office
Job Title: Hadoop Engineer
Experience: 8-12 Years
Location: Bangalore
DevOps and CI/CD: Design, implement, and manage CI/CD pipelines using tools like Jenkins and GitOps to automate and streamline the software development lifecycle.
Containerization and Orchestration: Deploy and manage containerized applications using Kubernetes and OpenShift, ensuring high availability and scalability.
Infrastructure Management: Develop and maintain infrastructure as code (IaC) using tools like Terraform or Ansible.
Big Data Solutions: Architect and implement big data solutions using technologies such as Hadoop, Spark, and Kafka.
Distributed Systems: Design and manage distributed data architectures to ensure efficient data processing and storage.
Collaboration: Work closely with development, operations, and data teams to understand requirements and deliver robust solutions.
Monitoring and Optimization: Implement monitoring solutions and optimize system performance, reliability, and scalability.
Security and Compliance: Ensure infrastructure and data solutions adhere to security best practices and regulatory requirements.
Technical Skills:
- Proficiency in CI/CD tools such as Jenkins and GitOps.
- Strong experience with containerization and orchestration tools like Kubernetes and OpenShift.
- Knowledge of big data technologies such as Hadoop, Spark, and ETL tools.
- Proficiency in scripting languages such as Python, Bash, or Groovy.
- Familiarity with infrastructure as code (IaC) tools like Terraform or Ansible.
Posted 2 weeks ago
10.0 - 12.0 years
9 - 13 Lacs
Chennai
Work from Office
Job Title: Data Architect
Experience: 10-12 Years
Location: Chennai
- 10-12 years of experience as a Data Architect
- Strong expertise in streaming data technologies like Apache Kafka, Flink, Spark Streaming, or Kinesis
- Proficiency in programming languages such as Python, Java, Scala, or Go
- Experience with big data tools like Hadoop, Hive, and data warehouses such as Snowflake, Redshift, Databricks, Microsoft Fabric
- Proficiency in database technologies (SQL, NoSQL, PostgreSQL, MongoDB, DynamoDB, YugabyteDB)
- Should be flexible to work as an individual contributor
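A core operation behind the streaming technologies this role names (Kafka Streams, Flink, Spark Streaming) is windowed aggregation over an event stream. The sketch below shows the tumbling-window idea in plain Python on an invented batch of events; real engines do the same over unbounded streams with state, watermarks, and fault tolerance.

```python
# Conceptual tumbling-window aggregation: group (timestamp, value) events
# into fixed-size windows and sum each window. Events and the 10-second
# window size are invented for illustration.
from collections import defaultdict

def tumbling_window_sums(events, window_secs=10):
    """Sum event values per fixed window keyed by window start time."""
    windows = defaultdict(int)
    for ts, value in events:
        windows[ts // window_secs * window_secs] += value
    return dict(sorted(windows.items()))

events = [(1, 5), (4, 3), (12, 7), (19, 2), (25, 1)]
print(tumbling_window_sums(events))  # {0: 8, 10: 9, 20: 1}
```

The batch version is easy; what the listed frameworks add is doing this incrementally, exactly-once, over data that never stops arriving.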
Posted 2 weeks ago
10.0 - 15.0 years
4 - 8 Lacs
Noida
Work from Office
Highly skilled and experienced Data Modeler to join the Enterprise Data Modelling team. The candidate will be responsible for creating and maintaining conceptual, logical, and physical data models, ensuring alignment with industry best practices and standards. Working closely with business and functional teams, the Data Modeler will play a pivotal role in standardizing data models at portfolio and domain levels, driving efficiencies and maximizing the value of the client's data assets. Preference will be given to candidates with prior experience within an Enterprise Data Modeling team. The ideal domain experience would be Insurance or Investment Banking.
Roles and Responsibilities:
- Develop comprehensive conceptual, logical, and physical data models for multiple domains within the organization, leveraging industry best practices and standards.
- Collaborate with business and functional teams to understand their data requirements and translate them into effective data models that support their strategic objectives.
- Serve as a subject matter expert in data modeling tools such as ERwin Data Modeler, providing guidance and support to other team members and stakeholders.
- Establish and maintain standardized data models across portfolios and domains, ensuring consistency, governance, and alignment with organizational objectives.
- Identify opportunities to optimize existing data models and enhance data utilization, particularly in critical areas such as fraud, banking, and AML.
- Provide consulting services to internal groups on data modeling tool usage, administration, and issue resolution, promoting seamless data flow and application connections.
- Develop and deliver training content and support materials for data models, ensuring that stakeholders have the necessary resources to understand and utilize them effectively.
Collaborate with the enterprise data modeling group to develop and implement a robust governance framework and metrics for model standardization, with a focus on long-term automated monitoring solutions.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 10 years of experience working as a Data Modeler or in a similar role, preferably within a large enterprise environment.
- Expertise in data modeling concepts and methodologies, with demonstrated proficiency in creating conceptual, logical, and physical data models.
- Hands-on experience with data modeling tools such as ERwin Data Modeler, as well as proficiency in database environments such as Snowflake and Netezza.
- Strong analytical and problem-solving skills, with the ability to understand complex data requirements and translate them into effective data models.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
Key skills: problem-solving skills, business intelligence platforms, ERwin, data modeling, database management systems, data warehousing, ETL processes, big data technologies, agile methodologies, data governance, SQL, enterprise data modelling, data visualization tools, cloud data services, analytical skills, data architecture, communication skills
Posted 2 weeks ago
6.0 - 8.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Role Description: As a Data Engineering Lead, you will play a crucial role in overseeing the design, development, and maintenance of our organization's data architecture and infrastructure. You will be responsible for designing and developing the architecture for the data platform that ensures the efficient and effective processing of large volumes of data, enabling the business to make informed decisions based on reliable and high-quality data. The ideal candidate will have a strong background in data engineering, excellent leadership skills, and a proven track record of successfully managing complex data projects. Responsibilities : Data Architecture and Design: Design and implement scalable and efficient data architectures to support the organization's data processing needs Work closely with cross-functional teams to understand data requirements and ensure that data solutions align with business objectives ETL Development: Oversee the development of robust ETL processes to extract, transform, and load data from various sources into the data warehouse Ensure data quality and integrity throughout the ETL process, implementing best practices for data cleansing and validation Big Data Technology - Stay abreast of emerging trends and technologies in big data and analytics, and assess their applicability to the organization's data strategy Implement and optimize big data technologies to process and analyze large datasets efficiently Cloud Integration: Collaborate with the IT infrastructure team to integrate data engineering solutions with cloud platforms, ensuring scalability, security, and performance. Performance Monitoring and Optimization: Implement monitoring tools and processes to track the performance of data pipelines and proactively address any issues Optimize data processing. 
Documentation: Maintain comprehensive documentation for data engineering processes, data models, and system architecture. Ensure that team members follow documentation standards and best practices. Collaboration and Communication: Collaborate with data scientists, analysts, and other stakeholders to understand their data needs and deliver solutions that meet those requirements. Communicate effectively with technical and non-technical stakeholders, providing updates on project status, challenges, and opportunities. Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 6-8 years of professional experience in data engineering. In-depth knowledge of data modeling, ETL processes, and data warehousing. In-depth knowledge of building the data warehouse using Snowflake. Should have experience in data ingestion, data lakes, data mesh, and data governance. Must have experience in Python programming. Strong understanding of big data technologies and frameworks, such as Hadoop, Spark, and Kafka. Experience with cloud platforms, such as AWS, Azure, or Google Cloud. Familiarity with database systems like SQL, NoSQL, and data pipeline orchestration tools. Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Proven ability to work collaboratively in a fast-paced, dynamic environment.
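The ETL responsibilities this role describes (extract from sources, cleanse and validate, load into the warehouse) can be sketched end-to-end in a few lines. This is a toy example: the CSV data is invented and SQLite stands in for a warehouse such as Snowflake; a production pipeline would add orchestration, incremental loads, and monitoring.

```python
# Toy extract-transform-load pipeline: parse raw CSV, apply a cleansing
# rule (drop rows with missing amounts, cast types), load into a table.
import csv
import io
import sqlite3

RAW = "order_id,amount\n1,100.5\n2,\n3,42.0\n"  # row 2 has a missing amount

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Data-quality rule: discard rows with a missing amount, cast the rest.
    return [(int(r["order_id"]), float(r["amount"])) for r in rows if r["amount"]]

def load(rows, con):
    con.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?)", rows)

con = sqlite3.connect(":memory:")
load(transform(extract(RAW)), con)
print(con.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 142.5)
```

Separating the three stages into pure functions, as here, is what makes each stage independently testable, which is the maintainability property the posting's emphasis on data quality and documentation is after.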
Posted 2 weeks ago
10.0 - 12.0 years
37 - 40 Lacs
Chennai
Remote
10+ yrs (5+ yrs as Lead Role) Proven Exp Legacy App modernization Strong expert in Enterprise Arch (Data/Apps/Cloud) AWS Ecosystem (IAM, VPC, CloudWatch, RDS, Secrets Mgr) Knowledge of Camunda for BPM/Workflow & AWS API Gateway-Service architecture. Required Candidate profile Working Time (1.30 - 10 p.m - IST) Lead End-to-End Engagement Delivery Define modernization roadmaps, architectural decisions Oversee Solution Design, Risk Mgt CI/CD using GitHub, SonarQube for Code
Posted 2 weeks ago
10.0 - 14.0 years
25 - 30 Lacs
Pune
Work from Office
We are seeking a highly experienced Principal Solution Architect to lead the design, development, and implementation of sophisticated cloud-based data solutions for our key clients. The ideal candidate will possess deep technical expertise across multiple cloud platforms (AWS, Azure, GCP), data architecture paradigms, and modern data technologies. You will be instrumental in shaping data strategies, driving innovation through areas like GenAI and LLMs, and ensuring the successful delivery of complex data projects across various industries.
Key Responsibilities:
Solution Design & Architecture: Lead the architecture and design of robust, scalable, and secure enterprise-grade data solutions, including data lakes, data warehouses, data mesh, and real-time data pipelines on AWS, Azure, and GCP.
Client Engagement & Pre-Sales: Collaborate closely with clients to understand their business challenges, translate requirements into technical solutions, and present compelling data strategies. Support pre-sales activities, including proposal development and solution demonstrations.
Data Strategy & Modernization: Drive data and analytics modernization initiatives, leveraging cloud-native services, Big Data technologies, GenAI, and LLMs to deliver transformative business value.
Industry Expertise: Apply data architecture best practices across various industries (e.g., BFSI, Retail, Supply Chain, Manufacturing).
Requirements
Required Qualifications & Skills:
Experience: 10+ years of experience in IT, with a significant focus on data architecture, solution architecture, and data engineering.
Proven experience in a principal-level or lead architect role.
Cloud Expertise: Deep, hands-on experience with major cloud platforms. Azure: Microsoft Fabric, Data Lake, Power BI, Data Factory, Azure Purview; good understanding of Azure Service Foundry, Agentic AI, Copilot. GCP: BigQuery, Vertex AI, Gemini.
Data Science Leadership: Understanding and experience in integrating AI/ML capabilities, including GenAI and LLMs, into data solutions.
Leadership & Communication: Exceptional communication, presentation, and interpersonal skills. Proven ability to lead technical teams and manage client relationships.
Problem-Solving: Strong analytical and problem-solving abilities with a strategic mindset.
Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field.
Preferred Qualifications: Relevant certifications in AWS, Azure, GCP, Snowflake, or Databricks. Experience with Agentic AI, hyper-intelligent automation.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
vadodara, gujarat
On-site
The purpose of your role is to define and develop Enterprise Data Structure, Data Warehouse, Master Data, Integration, and transaction processing while maintaining and strengthening modeling standards and business information. You will define and develop Data Architecture that supports the organization and clients in new and existing deals. This includes partnering with business leadership to provide strategic recommendations, assessing data benefits and risks, creating data strategy and roadmaps, engaging stakeholders for data governance, ensuring data storage and database technologies are supported, monitoring compliance with Data Modeling standards, overseeing frameworks for data management, and collaborating with vendors and clients to maximize the value of information.

Additionally, you will be responsible for building enterprise technology environments for data architecture management. This involves developing, maintaining, and implementing standard patterns for data layers, data stores, and data hubs and lakes; evaluating implemented systems; collecting and integrating data; creating data models; implementing best security practices; and demonstrating strong experience in database architectures and design patterns.

You will also enable Delivery Teams by providing optimal delivery solutions and frameworks. This includes building relationships with delivery and practice leadership teams, defining database structures and specifications, establishing relevant metrics, monitoring system capabilities and performance, integrating new solutions, managing projects, identifying risks, ensuring quality assurance, recommending tools for reuse and automation, and supporting the integration team for better efficiency.

In addition, you will ensure optimal Client Engagement by supporting pre-sales teams, negotiating and coordinating with client teams, demonstrating thought leadership, and acting as a trusted advisor.
Join Wipro to reinvent your world and be a part of an end-to-end digital transformation partner with bold ambitions. Realize your ambitions in a business powered by purpose and empowered to design your reinvention. Applications from people with disabilities are explicitly welcome.
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
Embark on your transformative journey as a Solution Design Business Analyst - Vice President. You will be responsible for driving key strategic change initiatives for regulatory deliverables across Risk, Finance, and Treasury.

To excel in this role, you should have at least 10 years of experience in business/data analysis, enabling you to present complex data issues in a simple and engaging manner. Your expertise should extend to front-to-back system design, solving complex business problems, data gathering, data cleansing, and data validation. You will be expected to analyze large volumes of data, identify patterns, address data quality issues, conduct metrics analysis, and translate your analysis into valuable insights.

Additionally, you will play a crucial role in capturing business requirements and translating them into technical data requirements. Collaboration with stakeholders to ensure proposed solutions meet their needs and expectations is a key aspect of this role. You will also be involved in creating operational and process designs to ensure the successful delivery of proposed solutions within the agreed scope, as well as supporting change management activities.

Experience within the financial services industry, particularly in the banking sector within a Risk/Finance/Treasury role, will be highly valued. Proficiency in data analysis tools such as SQL, Hypercube, and Python, and in data visualization/reporting tools like Tableau, QlikView, Power BI, and Advanced Excel, will be beneficial. Familiarity with data modeling and data architecture is also desirable.

The primary purpose of this role is to support the organization in achieving its strategic objectives by identifying business requirements and proposing solutions to address business problems and opportunities.
Key Accountabilities include identifying and analyzing business problems and client requirements necessitating change within the organization, developing business requirements to address these challenges, collaborating with stakeholders to ensure proposed solutions align with their needs, creating business cases justifying investment in solutions, conducting feasibility studies to assess proposed solutions' viability, reporting on project progress to ensure timely and budget-compliant delivery, and supporting change management activities.

As a Vice President, you are expected to contribute to strategic planning, resource allocation, policy management, continuous improvement initiatives, and policy enforcement. Your leadership responsibilities may involve demonstrating a set of leadership behaviors focused on creating an environment for colleagues to excel. For individual contributors, being a subject matter expert within your discipline, guiding technical direction, leading collaborative assignments, and coaching team members are essential. You will also provide guidance on functional and cross-functional areas of impact and alignment, risk management, and organizational strategies.

Demonstrating a comprehensive understanding of the organization's functions, collaborating with various work areas, creating solutions based on analytical thought, building trusting relationships with stakeholders, and upholding Barclays Values and Mindset are crucial aspects of this role.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
We are seeking a highly motivated and experienced IT Enterprise Architect with a strong focus on end-to-end (E2E) customer service processes. As an IT Enterprise Architect, you will be instrumental in shaping and aligning our IT landscape, encompassing platforms like SAP, ServiceNow, and other customer-service-related systems. Your expertise will play a crucial role in driving the digital transformation of our global service processes to ensure scalability, resilience, and exceptional customer experience.

Your responsibilities will include enterprise architecture management, deriving IT strategies from business requirements, designing and maintaining end-to-end Enterprise Architecture for customer service processes, leading cross-functional workshops and architecture communities, developing the architecture framework and roadmap, guiding platform selection and integration, modeling IT architectures and processes, contributing to solution evaluations, coordinating communication with key decision-makers, and driving documentation and presentations for executive alignment.

To be successful in this role, you should possess a degree in computer science or industrial engineering, along with experience as an Enterprise Architect or Solution/Domain Architect in customer-facing IT landscapes. Familiarity with enterprise architecture methods and frameworks, governance structures, and IT Service Management frameworks, functional or IT implementation experience in customer service processes, and expertise in implementing customer service solutions are essential. Additionally, you should have extensive experience with data architecture, integration concepts, and cloud technologies.

In addition to your technical skills, you should have excellent English language proficiency, a good command of German, strong communication and presentation skills, organizational talent, and the ability to work effectively in a global environment.
Being results-oriented, quality-focused, and flexible, and possessing good analytical and conceptual skills, are also key attributes for this role.
Posted 3 weeks ago