7.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Manager – Azure Data Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (Big Data Architects) with strong technology and data understanding and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Develop standardized practices for delivering new products and capabilities using Big Data and cloud technologies, including data acquisition, transformation, analysis, modelling, governance and data management
- Interact with senior client technology leaders to understand their business goals, propose solutions, estimate effort, build architectures, and develop and deliver technology solutions
- Define and develop client-specific best practices around data management within a cloud environment
- Recommend design alternatives for data ingestion, processing and provisioning layers
- Design and develop data ingestion programs to process large data sets in batch mode using ADB, ADF, PySpark, Python, and Synapse
- Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies (a hedged sketch of both patterns follows this listing)
- Manage teams, with experience in end-to-end delivery
- Build technical capability and teams to deliver

Skills And Attributes For Success
- Strong understanding of and familiarity with all Cloud Ecosystem components
- Strong understanding of underlying cloud architectural concepts and distributed computing paradigms
- Experience in the development of large-scale data processing
- Experience with CI/CD pipelines for data workflows in Azure DevOps
- Hands-on programming experience in ADB, ADF, Synapse, Python, PySpark, SQL
- Hands-on expertise in cloud services like AWS and/or the Microsoft Azure ecosystem
- Solid understanding of ETL methodologies in a multi-tiered stack with data modelling and data governance
- Experience with BI and data analytics databases
- Experience converting business problems and challenges into technical solutions, considering security, performance, scalability, etc.
- Experience in enterprise-grade solution implementations
- Experience in performance benchmarking of enterprise applications
- Strong stakeholder, client, team, process and delivery management skills

To qualify for the role, you must have
- A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
- Excellent communication skills, written and verbal, formal and informal
- The ability to multi-task under pressure and work independently with minimal supervision
- A team-player mindset, enjoying work in a cooperative and collaborative team environment
- Adaptability to new technologies and standards
- Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support
- A minimum of 7 years' hands-on experience in one or more of the above areas
- A minimum of 8-11 years' industry experience

Ideally, you'll also have
- Project management skills
- Client management skills
- Solutioning skills

Nice to have:
- Knowledge of data security best practices
- Knowledge of Data Architecture Design Patterns

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
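As a concrete illustration of the batch and real-time ingestion patterns this role calls for, below is a minimal, hypothetical PySpark sketch. It is not EY delivery code: the storage account, container, Kafka broker, and topic names are placeholder assumptions.

```python
# Minimal sketch: batch ingestion from ADLS Gen2 plus a Kafka streaming read.
# All paths, broker addresses, and topic names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-and-stream-ingest").getOrCreate()

# Batch mode: read raw CSV landed in the lake and write curated Parquet.
raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplestore.dfs.core.windows.net/sales/"))
curated = raw.withColumn("ingest_ts", F.current_timestamp())
(curated.write.mode("overwrite")
 .parquet("abfss://curated@examplestore.dfs.core.windows.net/sales/"))

# Real-time mode: subscribe to a Kafka topic with Structured Streaming
# (requires the spark-sql-kafka connector package on the cluster).
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "live-events")
          .load()
          .selectExpr("CAST(value AS STRING) AS payload"))
query = (stream.writeStream.format("parquet")
         .option("path", "abfss://stream@examplestore.dfs.core.windows.net/events/")
         .option("checkpointLocation", "/tmp/checkpoints/events")
         .start())
```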
Posted 1 day ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
Work with Match360, Publisher, and watsonx integrations to modernize MDM workloads. Drive architectural decisions and ensure alignment with product roadmaps and enterprise standards.

Secondary: Informatica MDM (Desirable Skillset)
Understand key concepts of Informatica MDM, including:
- Landing, staging, base objects, trust and match rules
- Hierarchy configuration, E360 views, and SIF/REST API integrations
- Support data ingestion processes (batch and real-time), transformation, and cleansing routines via IDQ and Java-based user exits
- Provide insights and inputs to help us strategically position IBM MDM against Informatica, shaping unique assets and accelerators

Cross-Functional and Strategic Responsibilities
- Collaborate with data governance and business teams to implement DQ rules, lineage, and business glossaries
- Mentor junior developers; participate in design/code reviews and knowledge-sharing sessions
- Create and maintain documentation: architecture diagrams, integration blueprints, solution specs
- Stay current with modern MDM practices, AI/ML in data mastering, and cloud-first platforms (e.g., CP4D, IICS, Snowflake, Databricks)
- Experience with other database platforms and technologies (e.g., DB2, Oracle, SQL Server)
- Experience with containerization technologies (e.g., Docker, Kubernetes) and orchestration tools
- Knowledge of database regulatory compliance requirements (e.g., GDPR, HIPAA)

Your Role And Responsibilities
We are seeking an experienced and self-driven Senior MDM Consultant to design, develop, and maintain enterprise-grade Master Data Management solutions, with a primary focus on IBM MDM and foundational knowledge of Informatica MDM. This role will play a key part in advancing our data governance, quality, and integration strategies across customer, product, and party domains. Experience with IBM DataStage, Knowledge Catalog, Cloud Pak for Data, and Manta is important. You will work closely with cross-functional teams including Data Governance, Source System Owners, and Business Data Stewards to implement robust MDM solutions that ensure consistency, accuracy, and trustworthiness of enterprise data.

Strong Hands-on Experience With:
- Informatica MDM 10.x, IDQ, and Java-based user exits
- MDM components: base/landing/staging tables, relationships, mappings, hierarchy, E360
- Informatica PowerCenter, IICS, or similar ETL tools
- REST APIs, SOA, event-based integrations, and SQL/RDBMS
- IBM MDM core knowledge in matching, stewardship UI, workflows, and metadata management
- Data architecture, governance, data supply chain, and lifecycle management
- Communication, documentation, and stakeholder management
- Cloud MDM/SaaS solutions and DevOps automation for MDM deployments
- BAW, Consent Management, Account & Macro Role configuration

Preferred Education
Bachelor's Degree

Required Technical And Professional Expertise
As described above: a primary focus on IBM MDM with foundational knowledge of Informatica MDM, plus hands-on experience with IBM DataStage, Knowledge Catalog, Cloud Pak for Data, and Manta.

Preferred Technical And Professional Experience
Other required skills: IBM DataStage, Knowledge Catalog, Cloud Pak for Data, Manta
Posted 1 day ago
15.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
Joining the IBM Technology Expert Labs teams means you’ll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you’ll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients’ goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging your consultative skills, such as problem-solving, issue-/hypothesis-based methodologies, communication, and service orientation. As a member of IBM Technology Expert Labs, a team that is client-focused, courageous, pragmatic, and technical, you’ll collaborate with clients to optimize and trailblaze new solutions that address real business challenges. If you are passionate about success with both your career and solving clients’ business challenges, this role is for you. To help achieve this win-win outcome, a ‘day-in-the-life’ of this opportunity may include, but not be limited to:
- Solving Client Challenges Effectively: Understanding clients’ main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration and product support with a sense of urgency
- Agile Planning and Execution: Creating and executing agile plans where you are responsible for installing and provisioning, testing, migrating to production, and day-two operations
- Technical Solution Workshops: Conducting and participating in technical solution workshops
- Building Effective Relationships: Developing successful relationships at all levels—from engineers to CxOs—with experience of navigating challenging debate to reach healthy resolutions
- Self-Motivated Problem Solver: Demonstrating a natural bias towards self-motivation, curiosity and initiative, in addition to navigating data and people to find answers and present solutions
- Collaboration and Communication: Strong collaboration and communication skills as you work across the client, partner, and IBM teams

Preferred Education
Bachelor's Degree

Required Technical And Professional Expertise
- In-depth knowledge of the IBM Data & AI portfolio
- 15+ years of experience in software services
- 10+ years of experience in the planning, design, and delivery of one or more products from the IBM Data Integration and IBM Data Intelligence product platforms
- Experience in designing and implementing solutions on IBM Cloud Pak for Data, IBM DataStage NextGen, and Orchestration Pipelines
- 10+ years' experience with ETL and database technologies
- Experience in architectural planning and implementation for the upgrade/migration of these specific products
- Experience in designing and implementing Data Quality solutions
- Experience with installation and administration of these products
- Excellent understanding of cloud concepts and infrastructure
- Excellent verbal and written communication skills are essential

Preferred Technical And Professional Experience
- Experience with any of the DataStage, Informatica, SAS, or Talend products
- Experience with any of IKC, IGC, Axon
- Experience with programming languages like Java/Python
- Experience in AWS, Azure, Google, or IBM cloud platforms
- Experience with Red Hat OpenShift
- Good-to-have knowledge: Apache Spark, shell scripting, GitHub, JIRA
Posted 1 day ago
15.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
Joining the IBM Technology Expert Labs teams means you'll have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you'll bring together all the necessary technology and services to help customers solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best—running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators—always willing to help and be helped—as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
The candidate is responsible for:
- Db2 installation and configuration in the following environments: on-prem, multi-cloud, Red Hat OpenShift cluster, HADR, non-DPF and DPF
- Migration of other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2)
- Creating high-level and detail-level designs, and maintaining product roadmaps that include both modernization and leveraging cloud solutions
- Designing scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML
- Performing health checks of the databases, making recommendations, and delivering tuning at the database and system level (a hedged health-check sketch follows this listing)
- Deploying Db2 databases as containers within Red Hat OpenShift clusters
- Configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Leading the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with the overall enterprise data strategy and business objectives
- Defining and optimizing the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (Db2, Netezza, cloud data sources)
- Establishing best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse
- Acting as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams
- Mentoring junior architects and engineers, fostering their growth and knowledge in modern data platforms
- Participating in the development of architecture governance processes and promoting best practices across the organization
- Communicating complex technical concepts to both technical and non-technical stakeholders

Required Technical And Professional Expertise
- 15+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms
- Strong proficiency in Db2, SQL and Python
- Strong understanding of: database design and modelling (dimensional, normalized, NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms (AWS, Azure, GCP); big data technologies (e.g., Hadoop, Spark)
- Database migration project experience from one database to another (target database Db2)
- Experience deploying Db2 databases as containers within Red Hat OpenShift clusters and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Excellent communication, collaboration, problem-solving, and leadership skills

Preferred Technical And Professional Experience
- Experience with machine learning environments and LLMs
- Certification in IBM watsonx.data or related IBM data and AI technologies
- Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake)
- Exposure to, or an understanding of, database replication processes
- Experience integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures)
- Experience with NoSQL databases (e.g., MongoDB, Cassandra)
- Experience with data modeling tools (e.g., ER/Studio, ERwin)
- Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA)
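As a brief illustration of the database health checks this role mentions, here is a minimal, hypothetical Python sketch using the ibm_db driver and Db2's MON_GET_BUFFERPOOL table function. The connection string and the idea of reporting hit ratios are placeholder assumptions, not a prescribed IBM procedure.

```python
# Minimal sketch: report buffer pool hit ratios for a Db2 database.
# Connection details are placeholders; requires the ibm_db package.
import ibm_db

conn = ibm_db.connect(
    "DATABASE=sample;HOSTNAME=db2-host;PORT=50000;PROTOCOL=TCPIP;"
    "UID=db2inst1;PWD=example;", "", "")

sql = """
SELECT bp_name, pool_data_l_reads, pool_data_p_reads
FROM TABLE(MON_GET_BUFFERPOOL('', -2))
"""
stmt = ibm_db.exec_immediate(conn, sql)
row = ibm_db.fetch_assoc(stmt)
while row:
    logical = row["POOL_DATA_L_READS"]
    physical = row["POOL_DATA_P_READS"]
    # Hit ratio: fraction of logical reads served without a physical read.
    hit_ratio = 100.0 * (1 - physical / logical) if logical else 0.0
    print(f"{row['BP_NAME']:20s} hit ratio: {hit_ratio:5.1f}%")
    row = ibm_db.fetch_assoc(stmt)
ibm_db.close(conn)
```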
Posted 1 day ago
3.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting - Data and Analytics – GIG - Data Modeller

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity
We’re looking for a candidate with 3-7 years of expertise in data science, data analysis and visualization to act as Technical Lead to a larger team in the EY GDS DnA team and work on various Data and Analytics projects.

Your Key Responsibilities
- Lead and mentor a team throughout the design, development and delivery phases, and keep the team together in high-pressure situations
- Work as a senior team member contributing to various technical streams of EY DnA implementation projects
- Stay client-focused, with good presentation, communication and relationship-building skills
- Complete assigned tasks on time and report status regularly to the lead
- Collaborate with the technology team to support the development of analytical models with the effective use of data and analytic techniques, validate the model results and articulate the insights to the business team
- Interface and communicate with the onsite teams directly to understand the requirements and determine the optimum solutions
- Create technical solutions as per business needs by translating requirements and finding innovative solution options
- Provide product- and design-level functional and technical expertise along with best practices
- Get involved in business development activities like creating proof of concepts (POCs) and points of view (POVs), assist in proposal writing and service offering development, and develop creative PowerPoint content for presentations
- Participate in organization-level initiatives and operational activities
- Ensure continual knowledge management and contribute to internal L&D teams
- Build a quality work culture, foster teamwork and lead by example

Skills and attributes for success
- Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates
- Strong communication, presentation and team-building skills, and experience producing high-quality reports, papers, and presentations
- Experience executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint

To qualify for the role, you must have
- BE/BTech/MCA/MBA with 3+ years of industry experience in machine learning, visualization, data science and related offerings
- At least 3+ years of experience in BI and Analytics
- The ability to deliver end-to-end data solutions spanning analysis, mapping, profiling, ETL architecture and data modelling
- Knowledge and experience of at least one Insurance domain engagement (Life or Property and Casualty)
- Understanding of Business Intelligence, Data Warehousing and Data Modelling
- Good experience using CA Erwin or another similar modelling tool (an absolute must)
- Experience working with Guidewire DataHub and InfoCenter
- Strong knowledge of relational and dimensional data modelling concepts
- Ability to develop logical and physical data flow models for ETL applications
- Ability to translate data access, transformation and movement requirements into functional requirements and mapping designs
- Strong knowledge of data architecture, database structure, data analysis and SQL skills
- Experience in data management analysis: analyse business objectives and evaluate data solutions to meet customer needs
- Experience establishing scalable, efficient, automated processes for large-scale data analyses and management
- Ability to prepare and analyse historical data and identify patterns
- Ability to collaborate with the technology team, support the development of analytical models with the effective use of data and analytic techniques, validate the model results and articulate the insights to the business team
- Ability to drive business requirements gathering for analytics projects
- Intellectual curiosity and eagerness to learn new things
- Experience with unstructured data (an added advantage)
- Ability to effectively visualize and communicate analysis results
- Experience with big data and cloud (preferred)
- Experience, interest and adaptability in working in an Agile delivery environment
- Ability to work in a fast-paced environment where change is a constant, and to handle ambiguous requirements
- Exceptional interpersonal and communication skills (written and verbal)

Ideally, you’ll also have
- Good exposure to ETL tools
- Knowledge of P&C insurance (good to have)
- Understanding of Business Intelligence, Data Warehousing and Data Modelling
- Experience leading a team of at least 4 members (a must)
- Experience in the Insurance and Banking domains
- Prior client-facing skills; self-motivated and collaborative

What We Look For
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments.
Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 day ago
4.0 - 6.0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Title: SQL Database Administrator
Location: Nippon Q1 Business Centre, Kochi
Experience: 4-6 Years
Type: Full-time

About Us: Expericia Technologies is a fast-growing IT services company specializing in enterprise applications, custom software development, SharePoint, .NET, Azure, React and more. We're passionate about technology, innovation and building solutions that solve real-world problems. Join us to work on exciting projects, learn directly from industry veterans, and grow your career the right way.

About the Role: We are looking for an experienced SQL DBA (SQL Developer/Administrator) with 4-6 years of expertise in database management, performance optimization, and ETL processes. The ideal candidate will handle database administration tasks, optimize performance, and support analytics by developing ETL processes for data transformation and reporting.

Key Responsibilities:
· Design, develop, and optimize SQL queries, stored procedures, and reports.
· Perform data analysis and support decision-making with accurate, efficient reports.
· Collaborate with business teams to provide tailored database solutions.
· Optimize SQL queries and database performance, including troubleshooting and tuning (a hedged example of one such check follows this listing).
· Administer and manage SQL Server databases, ensuring availability, security, and data integrity.
· Implement and manage ETL processes for data extraction, transformation, and loading.
· Develop and maintain dashboards and reporting solutions using SQL and ETL tools.
· Ensure data quality and troubleshoot any ETL-related issues.
· Support database migrations, upgrades, and high-availability configurations.

Skills and Qualifications:
· 4-6 years of experience in SQL development and administration, with a focus on ETL processes.
· Strong expertise in T-SQL, SQL Server, and ETL tools (e.g., SSIS, Talend).
· Proficient in database performance tuning, query optimization, and backup/recovery strategies.
· Strong problem-solving and analytical skills.
· Bachelor’s degree in Computer Science or a related field.

Preferred Qualifications:
· Experience with cloud migration and data warehousing solutions.
· Experience with cloud platforms (AWS RDS, Azure SQL).
· Familiarity with high-availability configurations and data integration.
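As one hedged example of the performance troubleshooting mentioned above, the sketch below reads index fragmentation from a SQL Server DMV with pyodbc. The server, database, and the 30% rebuild threshold are illustrative assumptions, not a prescribed procedure.

```python
# Minimal sketch: list heavily fragmented indexes via sys.dm_db_index_physical_stats.
# Connection details are placeholders; requires the pyodbc package and an ODBC driver.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=sqlhost;"
    "DATABASE=SalesDB;Trusted_Connection=yes;")

query = """
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON ips.object_id = i.object_id AND ips.index_id = i.index_id
WHERE ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;
"""
for table_name, index_name, frag in conn.execute(query):
    # Indexes above roughly 30% fragmentation are typical rebuild candidates.
    print(f"{table_name}.{index_name}: {frag:.1f}% fragmented")
conn.close()
```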
Posted 1 day ago
4.0 - 8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Hi All,

We are hiring for our investment banking client in Mumbai (Powai) location.
Location: Mumbai—locals only.
Experience: 4-8 years
Budget: Open, competitive market rate
Interview Mode: 1st round virtual; 2nd/3rd rounds compulsorily face to face; may have more than 3 rounds.

Required Details: Total Experience, Relevant Experience, Current Company, Current Designation, Current CTC, Expected CTC, Notice Period, Current Location, Expected Location, Offer in Hand, Reason for Job Change, Degree, CGPA, Year Passed Out

JD: Requirements (indicate mandatory and/or preferred):

Mandatory
- Must have extensive development experience in Informatica
- Sound knowledge of transformations, mappings and workflows
- Good knowledge of relational databases (MSSQL/Oracle) and SQL
- Good knowledge of Oracle/SQL stored procedures, packages and functions
- Good knowledge of Unix shell scripting
- Good communication skills; must be able to interact at all levels on a wide range of issues
- Must adapt to dynamic business requirements that alter project flows; flexible to changes and able to multi-task
- Hardworking and self-motivated person

Preferred
- Investment Banking domain knowledge
- Proactive and willing to learn
- Knowledge of Autosys
- Knowledge of Python
Posted 1 day ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork. Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers. Collaborate with back-end web developers and programmers to improve usability. Conduct thorough testing of user interfaces on multiple platforms to ensure all designs render correctly and systems function properly.

Convert jobs from Talend ETL to Python and convert Lead SQLs to Snowflake. We need developers with Python and SQL skills: developers should be proficient in Python (especially Pandas, PySpark, or Dask) for ETL scripting, with strong SQL skills to translate complex queries. They need expertise in Snowflake SQL for migrating and optimizing queries, as well as experience with data pipeline orchestration (e.g., Airflow) and cloud integration for automation and data loading. Familiarity with data transformation, error handling, and logging is also essential. (A hedged sketch of this migration pattern follows below.)
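A minimal sketch of that migration pattern, assuming pandas for the transform step and the Snowflake Python connector for the load; the account, credentials, file, and table names are illustrative placeholders, not the client's actual jobs.

```python
# Minimal sketch: re-implement a Talend-style extract/transform in pandas,
# then load the result into Snowflake. All names are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Extract + transform: logic a Talend job might have done in tMap components.
df = pd.read_csv("leads.csv")
df["email"] = df["email"].str.strip().str.lower()
df = df.dropna(subset=["lead_id"])

# Load: push the frame into a Snowflake table.
conn = snowflake.connector.connect(
    account="example_account", user="etl_user", password="...",
    warehouse="ETL_WH", database="CRM", schema="PUBLIC")
success, _, nrows, _ = write_pandas(conn, df, "LEADS", auto_create_table=True)
print(f"loaded={success}, rows={nrows}")
conn.close()
```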
Posted 1 day ago
3.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting - Data and Analytics – GIG - Data Modeller

EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers and asset management firms, and insurance firms from leading Fortune 500 companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making.

The opportunity
We’re looking for a candidate with 3-7 years of expertise in data science, data analysis and visualization to act as Technical Lead to a larger team in the EY GDS DnA team and work on various Data and Analytics projects.

Your Key Responsibilities
- Lead and mentor a team throughout the design, development and delivery phases, and keep the team together in high-pressure situations
- Work as a senior team member contributing to various technical streams of EY DnA implementation projects
- Stay client-focused, with good presentation, communication and relationship-building skills
- Complete assigned tasks on time and report status regularly to the lead
- Collaborate with the technology team to support the development of analytical models with the effective use of data and analytic techniques, validate the model results and articulate the insights to the business team
- Interface and communicate with the onsite teams directly to understand the requirements and determine the optimum solutions
- Create technical solutions as per business needs by translating requirements and finding innovative solution options
- Provide product- and design-level functional and technical expertise along with best practices
- Get involved in business development activities like creating proof of concepts (POCs) and points of view (POVs), assist in proposal writing and service offering development, and develop creative PowerPoint content for presentations
- Participate in organization-level initiatives and operational activities
- Ensure continual knowledge management and contribute to internal L&D teams
- Build a quality work culture, foster teamwork and lead by example

Skills and attributes for success
- Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates
- Strong communication, presentation and team-building skills, and experience producing high-quality reports, papers, and presentations
- Experience executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint

To qualify for the role, you must have
- BE/BTech/MCA/MBA with 3+ years of industry experience in machine learning, visualization, data science and related offerings
- At least 3+ years of experience in BI and Analytics
- The ability to deliver end-to-end data solutions spanning analysis, mapping, profiling, ETL architecture and data modelling
- Knowledge and experience of at least one Insurance domain engagement (Life or Property and Casualty)
- Understanding of Business Intelligence, Data Warehousing and Data Modelling
- Good experience using CA Erwin or another similar modelling tool (an absolute must)
- Experience working with Guidewire DataHub and InfoCenter
- Strong knowledge of relational and dimensional data modelling concepts
- Ability to develop logical and physical data flow models for ETL applications
- Ability to translate data access, transformation and movement requirements into functional requirements and mapping designs
- Strong knowledge of data architecture, database structure, data analysis and SQL skills
- Experience in data management analysis: analyse business objectives and evaluate data solutions to meet customer needs
- Experience establishing scalable, efficient, automated processes for large-scale data analyses and management
- Ability to prepare and analyse historical data and identify patterns
- Ability to collaborate with the technology team, support the development of analytical models with the effective use of data and analytic techniques, validate the model results and articulate the insights to the business team
- Ability to drive business requirements gathering for analytics projects
- Intellectual curiosity and eagerness to learn new things
- Experience with unstructured data (an added advantage)
- Ability to effectively visualize and communicate analysis results
- Experience with big data and cloud (preferred)
- Experience, interest and adaptability in working in an Agile delivery environment
- Ability to work in a fast-paced environment where change is a constant, and to handle ambiguous requirements
- Exceptional interpersonal and communication skills (written and verbal)

Ideally, you’ll also have
- Good exposure to ETL tools
- Knowledge of P&C insurance (good to have)
- Understanding of Business Intelligence, Data Warehousing and Data Modelling
- Experience leading a team of at least 4 members (a must)
- Experience in the Insurance and Banking domains
- Prior client-facing skills; self-motivated and collaborative

What We Look For
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments.
Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 day ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
Job Description
- Design, develop, and maintain scalable data pipelines and systems using DBT and Big Data technologies.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Implement data models and transformations using DBT (a hedged model sketch follows this listing).
- Develop and maintain ETL processes to ingest and process large volumes of data from various sources.
- Optimize and troubleshoot data workflows to ensure high performance and reliability.
- Ensure data quality and integrity through rigorous testing and validation.
- Monitor and manage data infrastructure, ensuring security and compliance with best practices.
- Provide technical support and guidance to team members on data engineering best practices.

Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- Strong proficiency in DBT for data modeling and transformations.
- Hands-on experience with Big Data technologies (e.g., Hadoop, Spark, Kafka).
- Proficient in Python for data processing and automation.
- Experience with SQL and database management.
- Familiarity with data warehousing concepts and best practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Qualifications
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
- Knowledge of data governance and security practices.
- Certification in relevant technologies (e.g., DBT, Big Data platforms).
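dbt transformations are usually written as SQL models; on adapters such as Snowflake, Databricks, and BigQuery, dbt also supports Python models. Below is a minimal, hypothetical Python model, assuming the Snowflake adapter (where ref() returns a Snowpark DataFrame exposing to_pandas()) and upstream models named stg_orders and stg_customers; all table and column names are illustrative.

```python
# models/orders_enriched.py: a minimal dbt Python model sketch.
import pandas as pd

def model(dbt, session):
    dbt.config(materialized="table")

    # dbt resolves ref() to the materialized upstream relations.
    orders = dbt.ref("stg_orders").to_pandas()
    customers = dbt.ref("stg_customers").to_pandas()

    enriched = orders.merge(customers, on="CUSTOMER_ID", how="left")
    enriched["ORDER_VALUE_BAND"] = pd.cut(
        enriched["AMOUNT"],
        bins=[0, 100, 1000, float("inf")],
        labels=["small", "medium", "large"],
    )
    # The returned frame is materialized as the model's table.
    return enriched
```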
Posted 1 day ago
5.0 - 7.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Design, develop, and maintain scalable data processing applications using Spark and the PySpark API.
- 5+ years of experience in at least one of the following: Java, Spark, Scala, Python, with API development expertise.
- Write efficient, reusable, and well-documented code.
- Design and implement data pipelines using tools like Spark and PySpark.
- Strong analytical and problem-solving abilities to address technical challenges.
- Perform code reviews and provide constructive feedback to improve code quality.
- Design and implement data processing tasks that integrate with SQL databases (a hedged sketch follows this listing).
- Proficiency in data modeling, data lake, lakehouse, and data warehousing concepts.
- Experience with cloud platforms like AWS.
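As a hedged sketch of a data processing task that integrates with a SQL database, the example below reads a table over JDBC, aggregates in Spark, and writes the result back. The PostgreSQL host, table names, and credentials are placeholder assumptions.

```python
# Minimal sketch: PySpark reading from and writing to a SQL database over JDBC.
# Requires the matching JDBC driver jar on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("jdbc-aggregate").getOrCreate()

orders = (spark.read.format("jdbc")
          .option("url", "jdbc:postgresql://dbhost:5432/sales")
          .option("dbtable", "public.orders")
          .option("user", "etl").option("password", "...")
          .load())

# Aggregate in Spark, then persist the summary back to the database.
daily = (orders.groupBy("order_date")
         .agg(F.sum("amount").alias("total_amount")))

(daily.write.format("jdbc")
 .option("url", "jdbc:postgresql://dbhost:5432/sales")
 .option("dbtable", "public.daily_order_totals")
 .option("user", "etl").option("password", "...")
 .mode("overwrite")
 .save())
```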
Posted 1 day ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Ciklum is looking for a Data Engineer to join our team full-time in India.

We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.

About the role:
As a Data Engineer, become a part of a cross-functional development team engineering experiences of tomorrow.

Responsibilities:
- Design, build, and maintain robust data pipelines using PL/SQL, Oracle, and Java (Spring Boot)
- Develop and maintain data services and APIs following microservices architecture best practices
- Implement analytics and reporting solutions using tools such as OBIEE, ODI, or Oracle Apex
- Ensure performance, scalability, and reliability of ETL/ELT processes across structured and semi-structured data
- Participate in unit testing, data validation, and quality assurance for data services
- Collaborate with cross-functional teams to deliver data-driven solutions aligned with business objectives
- Troubleshoot data issues in development and production environments
- Engage in Agile/SAFe ceremonies like PI Planning, sprint planning, reviews, and retrospectives

Requirements:
- 4–6 years of hands-on experience in data engineering, preferably within financial services or enterprise environments
- Proficient in: PL/SQL, Oracle RDBMS; Java, Spring Boot, and REST-based APIs; ETL/ELT pipeline development; tools like OBIEE, ODI, or similar
- Familiarity with microservices, data integration, and software development best practices
- Strong problem-solving and debugging skills
- Effective communicator with the ability to collaborate across technical and non-technical teams
- Demonstrated initiative, adaptability, and a desire to learn

Desirable:
- Exposure to MongoDB and/or Oracle Apex
- Experience with cloud platforms such as AWS or Azure
- Proficiency in data visualization/reporting tools like Power BI or Tableau
- Understanding of SAFe Agile methodologies in large-scale data environments
- Awareness of data governance, lineage, and optimization techniques

What's in it for you?
- Care: your mental and physical health is our priority. We ensure comprehensive company-paid medical insurance, as well as financial and legal consultation
- Tailored education path: boost your skills and knowledge with our regular internal events (meetups, conferences, workshops), Udemy licence, language courses and company-paid certifications
- Growth environment: share your experience and level up your expertise with a community of skilled professionals, locally and globally
- Flexibility: hybrid work mode at Chennai or Pune
- Opportunities: we value our specialists and always find the best options for them. Our Resourcing Team helps change a project if needed to help you grow, excel professionally and fulfil your potential
- Global impact: work on large-scale projects that redefine industries with international and fast-growing clients
- Welcoming environment: feel empowered with a friendly team, open-door policy, informal atmosphere within the company and regular team-building events

About us:
At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you'll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress.
India is a strategic innovation hub for Ciklum, with growing teams in Chennai and Pune leading advancements in EdgeTech, AR/VR, IoT, and beyond. Join us to collaborate on game-changing solutions and take your career to the next level. Want to learn more about us? Follow us on Instagram, Facebook, LinkedIn. Explore, empower, engineer with Ciklum! Experiences of tomorrow. Engineered together. Interested already? We would love to get to know you! Submit your application. Can’t wait to see you at Ciklum.
Posted 1 day ago
5.0 - 10.0 years
20 - 27 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
We're Hiring | Platform Engineer @ Xebia

Locations: Bangalore | Bhopal | Chennai | Gurgaon | Hyderabad | Jaipur | Pune
Immediate Joiners (0-15 Days Notice Period Only)
Valid Passport is Mandatory

Xebia is on the lookout for passionate Platform Engineers with a strong mix of Azure Infrastructure as Code (IaC) with Terraform and Data Engineering expertise to join our Cloud Data Platform team.

What You'll Do:
- Design and deploy scalable Azure infrastructure using Terraform
- Build and optimize ETL/ELT pipelines using Azure Data Factory, Databricks, Event Hubs
- Automate infra provisioning and enforce security/governance via IaC
- Support CI/CD workflows with Git, Azure DevOps
- Work with VNETs, Key Vaults, Storage Accounts, Monitoring Tools
- Use Python, SQL, Spark for data transformation and processing

What We're Looking For:
- Hands-on experience in Azure IaC + Data Engineering
- Strong in scripting, automation, and monitoring
- Familiarity with real-time and batch processing
- Azure certifications (Data Engineer / DevOps) are a plus
- Must have a valid passport

Interested? Send your CV along with the following details to: vijay.s@xebia.com

Required Details: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / Last Working Day (if serving notice), Primary Skill Set, LinkedIn Profile URL, Do you have a valid passport? (Yes/No)

Please apply only if you haven't applied recently or aren't already in the process with any open Xebia roles. Let’s build the future of cloud-native data platforms together!
Posted 1 day ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing and security, it brings deep expertise for all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 47,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come.

Roles and Responsibilities
The Senior Tech Lead - Databricks leads the design, development, and implementation of advanced data solutions. The role requires extensive experience in Databricks, cloud platforms, and data engineering, with a proven ability to lead teams and deliver complex projects.

Responsibilities
- Lead the design and implementation of Databricks-based data solutions
- Architect and optimize data pipelines for batch and streaming data
- Provide technical leadership and mentorship to a team of data engineers
- Collaborate with stakeholders to define project requirements and deliverables
- Ensure best practices in data security, governance, and compliance
- Troubleshoot and resolve complex technical issues in Databricks environments
- Stay updated on the latest Databricks features and industry trends

Key Technical Skills & Responsibilities
- Experience in data engineering using Databricks or Apache Spark-based platforms
- Proven track record of building and optimizing ETL/ELT pipelines for batch and streaming data ingestion
- Hands-on experience with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Synapse Analytics, or Azure SQL Data Warehouse
- Proficiency in programming languages such as Python, Scala, and SQL for data processing and transformation
- Expertise in Spark (PySpark, Spark SQL, or Scala) and Databricks notebooks for large-scale data processing
- Familiarity with Delta Lake, Delta Live Tables, and the medallion architecture for data lakehouse implementations (a hedged sketch follows this listing)
- Experience with orchestration tools like Azure Data Factory or Databricks Jobs for scheduling and automation
- Design and implementation of Azure Key Vault and scoped credentials
- Knowledge of Git for source control and CI/CD integration for Databricks workflows, cost optimization, and performance tuning
- Familiarity with Unity Catalog, RBAC, or enterprise-level Databricks setups
- Ability to create reusable components, templates, and documentation to standardize data engineering workflows is a plus
- Ability to define best practices, support multiple projects, and mentor junior engineers is a plus
- Must have experience working with streaming data sources and Kafka (preferred)

Eligibility Criteria
- Bachelor’s degree in Computer Science, Data Engineering, or a related field
- Extensive experience with Databricks, Delta Lake, PySpark, and SQL
- Databricks certification (e.g., Certified Data Engineer Professional)
- Experience with machine learning and AI integration in Databricks
- Strong understanding of cloud platforms (AWS, Azure, or GCP)
- Proven leadership experience in managing technical teams
- Excellent problem-solving and communication skills

Our Offering
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment
- Wellbeing programs and work-life balance - integration and passion-sharing events
- Attractive salary and company initiative benefits
- Courses and conferences
- Hybrid work culture

Let’s grow together.
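The medallion (bronze/silver/gold) pattern named in the technical skills above can be sketched briefly. This is a minimal, hypothetical bronze-to-silver step with Delta Lake on Databricks; the lake paths and column names are placeholder assumptions.

```python
# Minimal sketch: promote raw bronze events to a cleaned, partitioned silver table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw events appended as-is from ingestion.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/events")

# Silver: deduplicated, validated, and conformed for downstream use.
silver = (bronze
          .dropDuplicates(["event_id"])
          .filter(F.col("event_ts").isNotNull())
          .withColumn("event_date", F.to_date("event_ts")))

(silver.write.format("delta")
 .mode("overwrite")
 .partitionBy("event_date")
 .save("/mnt/lake/silver/events"))
```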
Posted 1 day ago
3.0 - 6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About The Role
Grade Level (for internal use): 09

The Role: As a Software Developer with the Data & Research Development team, you will be responsible for developing and providing backend support across a variety of products within the Market Intelligence platform. Together, you will build scalable and robust solutions using Agile development methodologies with a focus on high availability to end users.

The Team: Do you love to collaborate and provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.

The Impact: We focus primarily on developing, enhancing and delivering required pieces of information and functionality to internal and external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact.

What’s in it for you?
- Opportunities for innovation and learning new state-of-the-art technologies
- Work in pure agile and scrum methodology

Responsibilities
- Deliver solutions within a multi-functional Agile team
- Develop expertise in our proprietary enterprise software products
- Set and maintain a level of excitement in using various technologies to develop, support, and iteratively deploy real enterprise-level software
- Achieve an understanding of customer environments and their use of the products
- Build solutions architecture, algorithms, and designs for solutions that scale to the customer's enterprise/global requirements
- Apply software engineering practices and implement automation across all elements of solution delivery

Basic Qualifications
What we’re looking for:
- 3-6 years of desktop application development experience with a deep understanding of design patterns and object-oriented programming
- Hands-on development experience using C#, .NET 4.0/4.5, WPF, ASP.NET, SQL Server
- Strong OOP and Service-Oriented Architecture (SOA) knowledge
- Strong understanding of cloud applications (containers, Docker, etc.); exposure to data ETL is a plus
- Ability to resolve serious performance-related issues through various techniques, including testing, debugging and profiling
- Strong problem-solving, analytical and communication skills
- A true “roll up the sleeves and get it done” working approach; demonstrated success as a problem solver, operating as a client-focused self-starter

Preferred Qualifications
- Bachelor's degree in computer science or computer engineering

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What’s In It For You?
Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow.
At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. 
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
Job ID: 313152
Posted On: 2025-05-05
Location: Hyderabad, Telangana, India
Posted 1 day ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Dear Candidates,
We are hiring for the position of ETL Developer; kindly find the details below.
Organization: Confidential (IT Services MNC)
Job Location: Bangalore, Marathahalli
Experience Required: 5-10 years
Designation: ETL Developer
Mode of Job: Work From Office
Job Description for ETL Developer - BPCE ES Dataverse Team
Profile Required: Technical profile with 5+ years in data warehousing and BI; strong fundamentals of data warehousing and BI concepts; experience in data integration, governance and management.
Skills Required (Mandatory):
Minimum experience of 5 years in IBM DataStage 11.7 and SQL/PL-SQL
Working knowledge of Oracle and PostgreSQL databases
Hands-on experience in Unix shell scripting (a small orchestration sketch follows this posting)
Personal Skills:
Good communication skills, written and verbal, with the ability to understand and interact with a diverse range of stakeholders
Ability to raise factual alerts and risks when necessary
Capability to work with cross-location team members and stakeholders to establish and maintain consistent delivery
Good To Have (Optional):
Technical - Reporting: Power BI, SAP BO, Tableau
Functional - Finance/Banking: asset finance / equipment finance / leasing
Role and Responsibilities:
Responsible for developing data warehousing solutions (DataStage, Oracle, PostgreSQL) as per requirements.
Must provide hands-on technical knowledge and take ownership while working with business users, project managers, technical leads, architects and testing teams.
Provide guidance to IT management in establishing both a short-term roadmap and a long-term DW/BI strategy.
Work closely with team members and stakeholders to ensure seamless development and delivery of assigned tasks.
Assist the team/lead in team management, identifying training needs and inducting new starters.
Take part in discussions on BI/DW forums alongside peers to the benefit of the organization.
Interested candidates, please share your resume at anshu.baranwal@rigvedtech.com.
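As a rough illustration of the scripting side of a role like this, here is a minimal sketch of triggering a DataStage job from Python, assuming the `dsjob` command-line utility that ships with IBM DataStage is on the PATH. The project and job names are hypothetical placeholders, not anything from this posting.

```python
# Minimal orchestration sketch: trigger an IBM DataStage job and check its
# exit status. Assumes the standard `dsjob` CLI is available; project and
# job names below are hypothetical placeholders.
import subprocess
import sys

PROJECT = "DW_PROJECT"   # hypothetical DataStage project
JOB = "LOAD_SALES_FACT"  # hypothetical job name

def run_datastage_job(project: str, job: str) -> int:
    """Run a DataStage job and wait for completion (-jobstatus blocks)."""
    result = subprocess.run(
        ["dsjob", "-run", "-jobstatus", project, job],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    return result.returncode

if __name__ == "__main__":
    rc = run_datastage_job(PROJECT, JOB)
    # Non-zero return codes typically indicate warnings or aborts;
    # alerting or retry logic would hook in here.
    sys.exit(rc)
```

In practice the same pattern is often written directly in a Unix shell script; the Python wrapper simply makes the status handling easier to test.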
Posted 1 day ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Experience: 7+ Years
Location: Noida - Sector 64
Key Responsibilities:
Data Architecture Design: Design, develop, and maintain the enterprise data architecture, including data models, database schemas, and data flow diagrams. Develop a data strategy and roadmap that aligns with the business objectives and ensures the scalability of data systems. Architect both transactional (OLTP) and analytical (OLAP) databases, ensuring optimal performance and data consistency.
Data Integration & Management: Oversee the integration of disparate data sources into a unified data platform, leveraging ETL/ELT processes and data integration tools (a minimal ELT sketch follows this posting). Design and implement data warehousing solutions, data lakes, and/or data marts that enable efficient storage and retrieval of large datasets. Ensure proper data governance, including the definition of data ownership, security, and privacy controls in accordance with compliance standards (GDPR, HIPAA, etc.).
Collaboration with Stakeholders: Work closely with business stakeholders, including analysts, developers, and executives, to understand data requirements and ensure that the architecture supports analytics and reporting needs. Collaborate with DevOps and engineering teams to optimize database performance and support large-scale data processing pipelines.
Technology Leadership: Guide the selection of data technologies, including databases (SQL/NoSQL), data processing frameworks (Hadoop, Spark), cloud platforms (Azure is a must), and analytics tools. Stay updated on emerging data management technologies, trends, and best practices, and assess their potential application within the organization.
Data Quality & Security: Define data quality standards and implement processes to ensure the accuracy, completeness, and consistency of data across all systems. Establish protocols for data security, encryption, and backup/recovery to protect data assets and ensure business continuity.
Mentorship & Leadership: Lead and mentor data engineers, data modelers, and other technical staff in best practices for data architecture and management. Provide strategic guidance on data-related projects and initiatives, ensuring that all efforts are aligned with the enterprise data strategy.
Required Skills & Experience:
Extensive Data Architecture Expertise: Over 7 years of experience in data architecture, data modeling, and database management. Proficiency in designing and implementing relational (SQL) and non-relational (NoSQL) database solutions. Strong experience with data integration tools (Azure tools are a must, plus any other third-party tools), ETL/ELT processes, and data pipelines.
Advanced Knowledge of Data Platforms: Expertise in the Azure cloud data platform (Data Lake, Synapse) is a must; experience with other platforms such as AWS (Redshift, S3) and/or Google Cloud Platform (BigQuery, Dataproc) is a bonus. Experience with big data technologies (Hadoop, Spark) and distributed systems for large-scale data processing. Hands-on experience with data warehousing solutions and BI tools (e.g., Power BI, Tableau, Looker).
Data Governance & Compliance: Strong understanding of data governance principles, data lineage, and data stewardship. Knowledge of industry standards and compliance requirements (e.g., GDPR, HIPAA, SOX) and the ability to architect solutions that meet these standards.
Technical Leadership: Proven ability to lead data-driven projects, manage stakeholders, and drive data strategies across the enterprise. Strong programming skills in languages such as Python, SQL, R, or Scala.
Certification: Azure Certified Solution Architect, Data Engineer, or Data Scientist certifications are mandatory.
Pre-Sales Responsibilities:
Stakeholder Engagement: Work with product stakeholders to analyze functional and non-functional requirements, ensuring alignment with business objectives.
Solution Development: Develop end-to-end solutions involving multiple products, ensuring security and performance benchmarks are established, achieved, and maintained.
Proof of Concepts (POCs): Develop POCs to demonstrate the feasibility and benefits of proposed solutions.
Client Communication: Communicate system requirements and solution architecture to clients and stakeholders, providing technical assistance and guidance throughout the pre-sales process.
Technical Presentations: Prepare and deliver technical presentations to prospective clients, demonstrating how proposed solutions meet their needs and requirements.
Additional Responsibilities:
Stakeholder Collaboration: Engage with stakeholders to understand their requirements and translate them into effective technical solutions.
Technology Leadership: Provide technical leadership and guidance to development teams, ensuring the use of best practices and innovative solutions.
Integration Management: Oversee the integration of solutions with existing systems and third-party applications, ensuring seamless interoperability and data flow.
Performance Optimization: Ensure solutions are optimized for performance, scalability, and security, addressing any technical challenges that arise.
Quality Assurance: Establish and enforce quality assurance standards, conducting regular reviews and testing to ensure robustness and reliability.
Documentation: Maintain comprehensive documentation of the architecture, design decisions, and technical specifications.
Mentoring: Mentor fellow developers and team leads, fostering a collaborative and growth-oriented environment.
Qualifications:
Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
Experience: Minimum of 7 years of experience in data architecture, with a focus on developing scalable and high-performance solutions.
Technical Expertise: Proficient in architectural frameworks, cloud computing, database management, and web technologies.
Analytical Thinking: Strong problem-solving skills, with the ability to analyze complex requirements and design scalable solutions.
Leadership Skills: Demonstrated ability to lead and mentor technical teams, with excellent project management skills.
Communication: Excellent verbal and written communication skills, with the ability to convey technical concepts to both technical and non-technical stakeholders.
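To make the ETL/ELT responsibility above concrete, here is a minimal ELT sketch in PySpark: land an OLTP table into a data-lake path as partitioned Parquet, leaving heavier transforms to downstream layers. This is an illustration under stated assumptions, not this employer's actual stack; the JDBC connection string, credentials, and lake path are placeholders.

```python
# Minimal ELT sketch: land an OLTP table into the raw zone of a data lake
# as partitioned Parquet. Connection details and paths are placeholders.
from datetime import date
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("oltp_to_lake").getOrCreate()

# Extract: read a source table over JDBC (credentials are hypothetical).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://oltp-host:1433;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Light transform: stamp the ingestion date for partitioning and lineage.
orders = orders.withColumn("ingest_date", F.lit(date.today().isoformat()))

# Load: append into the lake, partitioned by ingest date (hypothetical path).
(orders.write.mode("append")
       .partitionBy("ingest_date")
       .parquet("abfss://raw@datalake.dfs.core.windows.net/sales/orders"))
```

The design choice here is deliberately ELT rather than ETL: raw data lands first, so business transforms can be re-run against the lake without touching the source system again.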
Posted 1 day ago
2.0 - 4.0 years
4 - 6 Lacs
Pune
Work from Office
An Engineer is responsible for designing, developing and delivering significant components of engineering solutions to accomplish business goals. Key responsibilities of this role include active participation in the design and development of new application features and enhancements, investigating re-use, and ensuring that solutions are fit for purpose, maintainable, and can be integrated successfully into the overall solution and environment with clear, robust and well-tested deployments. Assists more junior members of the team and controls their work where applicable.
Your key responsibilities
Develops source code, including CI/CD pipelines and infrastructure- and application-related configurations, for all software components in accordance with the Detailed Software Requirements specification.
Provides quality development for technical infrastructure components (i.e., cloud configuration, networking and security, storage, Infrastructure as Code) and source code development.
Debugs, fixes and provides support to the L3 and L2 teams.
Verifies the developed source code through reviews (4-eyes principle).
Contributes to quality assurance by writing and conducting unit tests (see the sketch after this posting).
Ensures architectural changes (as defined by Architects) are implemented.
Contributes to problem and root cause analysis.
Integrates software components following the integration strategy.
Verifies integrated software components by unit and integrated software testing according to the software test plan; software test findings must be resolved.
Ensures that all code changes end up in Change Items (CIs); where applicable, develops routines to deploy CIs to the target environments.
Provides release deployments on non-Production Management controlled environments.
Supports creation of Software Product Training Materials, Software Product User Guides, and Software Product Deployment Instructions; checks consistency of documents with the respective Software Product Release.
Where applicable, manages maintenance of applications and performs technical change requests scheduled according to Release Management processes.
Fixes software defects/bugs, and measures and analyses code for quality.
Collaborates with colleagues participating in other stages of the Software Development Lifecycle (SDLC).
Identifies dependencies between software product components, between technical components, and between applications and interfaces.
Identifies product integration verifications to be performed based on the integration sequence and relevant dependencies.
Suggests and implements continuous technical improvements to the applications (scalability, reliability, availability, performance).
Your skills and experience
General Skills
Bachelor of Science degree from an accredited college or university with a concentration in Computer Science or Software Engineering (or equivalent), with a minor in Finance, Mathematics or Engineering.
Strong analytical skills.
Proficient communication skills; fluent in English (written/verbal).
Ability to work in virtual teams and in matrixed organizations.
Excellent team player; open-minded.
Keeps pace with technical innovation; understands the relevant business area.
Ability to share information and transfer knowledge and expertise to team members.
Ability to design and write code in accordance with provided business requirements.
Ability to contribute to QA strategy and architecture decisions.
Knowledge of IT delivery and architecture of cloud-native systems and applications.
Relevant Financial Services experience.
Ability to work in a fast-paced environment with competing and alternating priorities, with a constant focus on delivery.
Ability to balance business demands and IT fulfilment in terms of standardization, reducing risk and increasing IT flexibility.
Domain Specific Skills
Very good knowledge of the following technologies is needed:
Cloud offerings (GCP preferred)
Cloud services - IaaS, PaaS, SaaS
Cloud-native development and DevOps, API management, networking and configuration
Java or Python; good understanding of ETL/data pipelines
Very good knowledge of core processes and tools such as HP ALM, Jira, ServiceNow, SDLC, and Agile processes.
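The unit-testing responsibility above is easy to show concretely. Here is an illustrative sketch, not this bank's code: a tiny, pure ETL-style transform plus a pytest-style test. The function name and field names are made up for the example.

```python
# Illustrative only: a small, pure transform plus a pytest-style unit test,
# the kind of check implied by "contributes to quality assurance by writing
# and conducting unit testing". Names and fields are hypothetical.
from typing import Iterable

def normalise_amounts(rows: Iterable[dict]) -> list[dict]:
    """Convert amounts in minor units (paise/cents) to major units,
    dropping rows with missing amounts."""
    out = []
    for row in rows:
        if row.get("amount_minor") is None:
            continue
        out.append({**row, "amount": row["amount_minor"] / 100.0})
    return out

def test_normalise_amounts():
    rows = [
        {"id": 1, "amount_minor": 1250},
        {"id": 2, "amount_minor": None},  # should be dropped
    ]
    result = normalise_amounts(rows)
    assert len(result) == 1
    assert result[0]["amount"] == 12.50

if __name__ == "__main__":
    test_normalise_amounts()
    print("all checks passed")
```

Keeping transforms pure like this (no I/O inside the function) is what makes them unit-testable in the first place; the pipeline wiring is tested separately at integration level.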
Posted 1 day ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Req ID: 328445
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Senior Analyst to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).
Job Description
Role Description: As a Cognos Developer, you will be a key contributor to our business intelligence initiatives. You will be responsible for building, testing, and deploying Cognos reports, managing Framework Manager packages, and ensuring the accuracy and reliability of our data visualizations. Your ability to collaborate with cross-functional teams and your expertise in Cognos Analytics will be essential for success in this role.
Responsibilities:
Design, develop, and deploy Cognos reports and dashboards using Cognos Analytics 11/12.
Build and maintain Cognos reports using Framework Manager and Report Studio.
Develop reports with Drill Through, List, Crosstab, and Prompt pages, plus page grouping and sections.
Utilize Cognos Data Modules and the Visualization Gallery to create interactive and insightful visualizations.
Build, manage, and maintain Framework Manager packages; ensure data integrity and consistency within Cognos packages; optimize Framework Manager performance.
Understand and apply data warehousing concepts; possess basic knowledge of Extract, Transform, Load (ETL) processes.
Write and optimize SQL queries for data retrieval and manipulation; perform data analysis and validation using SQL (a brief validation sketch follows this posting).
Build, test, and deploy Cognos reports and dashboards, ensuring reports meet business requirements and quality standards.
Analyze business requirements and translate them into technical specifications; collaborate with stakeholders to understand reporting needs.
Create and maintain technical documentation for Cognos reports and packages.
Provide support to end-users on Cognos reporting.
Collaborate with cross-functional teams to deliver business intelligence solutions, communicating effectively with team members and stakeholders.
Technical Skills: Cognos Analytics, Oracle, Teradata
Experience in Cognos Analytics 11/12 (Data Modules, Framework Manager packages, Report Studio, Visualization Gallery, Cognos Dashboard).
Good knowledge of Cognos packages using Framework Manager.
Design and development of reports using Report Studio.
Good SQL skills for data retrieval and manipulation.
Experience in data warehousing and business intelligence.
Basic knowledge of Extract, Transform, Load (ETL) processes.
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.
NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
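The SQL validation work described above typically means reconciling what a report shows against what the warehouse holds. A self-contained sketch follows; SQLite stands in for Oracle/Teradata purely so the example runs anywhere, and the table and columns are hypothetical.

```python
# Self-contained sketch of SQL-based data validation of the kind the posting
# describes. SQLite stands in for Oracle/Teradata; table and columns are
# hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE fact_sales (region TEXT, amount REAL);
    INSERT INTO fact_sales VALUES ('North', 100.0), ('South', NULL), ('North', 50.0);
""")

# Validation 1: no NULL measures should reach the reporting layer.
null_count = conn.execute(
    "SELECT COUNT(*) FROM fact_sales WHERE amount IS NULL"
).fetchone()[0]
print(f"rows with NULL amount: {null_count}")  # -> 1, flag for investigation

# Validation 2: reconcile an aggregate against the report's expected total.
totals = conn.execute("""
    SELECT region, SUM(amount) AS total
    FROM fact_sales
    WHERE amount IS NOT NULL
    GROUP BY region
    ORDER BY region
""").fetchall()
print(totals)  # [('North', 150.0)]
```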
Posted 1 day ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Senior Manager - Product Quality Engineering Leader
Career Level - E
Introduction to role: Join our Commercial IT Data Analytics & AI (DAAI) team as a Product Quality Leader, where you will play a pivotal role in ensuring the quality and stability of our data platforms built on AWS services, Databricks, and SnapLogic. Based in Chennai GITC, you will drive the quality engineering strategy, lead a team of quality engineers, and contribute to the overall success of our data platform.
Accountabilities: As the Product Quality Team Leader for data platforms, your key accountabilities will include leadership and mentorship, quality engineering standards, collaboration, technical expertise, and innovation and process improvement. You will lead the design, development, and maintenance of scalable and secure data infrastructure and tools to support the data analytics and data science teams. You will also develop and implement data and data engineering quality assurance strategies and plans tailored to data product build and operations (a brief data-quality sketch follows this posting).
Essential Skills/Experience:
Bachelor’s degree or equivalent in Computer Engineering, Computer Science, or a related field
Proven experience in a product quality engineering or similar role, with at least 3 years of experience managing and leading a team
Experience working within a quality and compliance environment, applying policies, procedures, and guidelines
A broad understanding of cloud architecture (preferably AWS)
Strong experience in Databricks, PySpark and the AWS suite of applications (such as S3, Redshift, Lambda, Glue, EMR)
Proficiency in programming languages such as Python
Experience in Agile development techniques and methodologies
Solid understanding of data modelling, ETL processes and data warehousing concepts
Excellent communication and leadership skills, with the ability to collaborate effectively with technical and non-technical stakeholders
Experience with big data technologies such as Hadoop or Spark
Certification in AWS or Databricks
Prior significant experience working in a Pharmaceutical or Healthcare industry IT environment
When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.
At AstraZeneca, we are committed to disrupting an industry and changing lives. Our work has a direct impact on patients, transforming our ability to develop life-changing medicines. We empower the business to perform at its peak and lead a new way of working, combining cutting-edge science with leading digital technology platforms and data. We dare to lead, applying our problem-solving mindset to identify and tackle opportunities across the whole enterprise. Our spirit of experimentation is lived every day through our events like hackathons. We enable AstraZeneca to perform at its peak by delivering world-class technology and data solutions.
Are you ready to be part of a team that has the backing to innovate, disrupt an industry and change lives? Apply now to join us on this exciting journey!
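Data-engineering quality assurance on a Databricks/PySpark platform usually boils down to rule-based checks run against each load. A minimal sketch follows, assuming nothing about AstraZeneca's actual rules; the dataset, columns, and thresholds are illustrative.

```python
# Hedged sketch of rule-based data-quality checks in PySpark, in the spirit
# of the QA strategies this role owns. Dataset and rules are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()

df = spark.createDataFrame(
    [("p1", 10.0), ("p2", None), ("p2", -5.0)],
    ["product_id", "price"],
)

total = df.count()
checks = {
    # completeness: price must be populated
    "price_not_null": df.filter(F.col("price").isNull()).count() == 0,
    # validity: prices must be non-negative
    "price_non_negative": df.filter(F.col("price") < 0).count() == 0,
    # uniqueness: product_id should be a key
    "product_id_unique": df.select("product_id").distinct().count() == total,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # A real pipeline would quarantine bad records or fail the run with
    # an alert rather than raising locally.
    raise ValueError(f"data-quality checks failed: {failed}")
```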
Posted 1 day ago
10.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
Role Title: Head - Business Intelligence & AI
Reporting To: Chief Information Officer
Location of Posting: Corporate office, Vadodara
Position Overview: We are seeking a seasoned Head of Business Intelligence & AI to lead our data strategy, design scalable data models, and drive analytical and AI innovation across the organization. This role combines leadership in data science, AI and business analytics with deep technical expertise in data architecture and modelling, AI/ML, ETL, dashboards, and AI including GenAI and agentic AI. The ideal candidate will be a strategic thinker, technical expert, and effective communicator capable of aligning data initiatives with business objectives. As the Head of AI and Analytics in a chemical manufacturing organization, your role involves leveraging AI and analytics across all functions - R&D, production, supply chain, sales, marketing, finance, HR, and compliance - while incorporating dashboarding, ETL processes, and a data lake to enable data-driven decision-making.
Key Responsibilities:
Data Strategy Leadership: Define and drive the enterprise-wide business intelligence and analytics strategy, and align BI initiatives with overall business goals and digital transformation priorities. Formulate a comprehensive AI and analytics roadmap aligned with the organization’s goals, focusing on improving operational efficiency. Oversee the design and maintenance of a centralized data lake to store diverse data, ensuring scalability, security, and accessibility for cross-functional BI and AI initiatives. Identify cross-functional use cases, such as using AI to predict market demand, optimize pricing strategies, or enhance employee training programs. Apply AI for predictive maintenance of equipment and process optimization, while using BI to monitor production KPIs and identify bottlenecks through historical data analysis.
Stakeholder Engagement: Collaborate with executive leadership, functional heads, and IT to identify analytics needs, and translate business questions into actionable insights and dashboards.
Leadership: Lead the Analytics and AI team, provide strategic insights to the C-suite, and foster a data-driven culture. Develop and maintain interactive dashboards for all functions, providing real-time insights to stakeholders.
Data-Driven Decision Support: Deliver KPIs, scorecards, and predictive models to enable strategic decision-making, and promote advanced analytics, AI/ML initiatives, and scenario planning.
AI & GenAI Enablement: Spearhead AI and Generative AI initiatives, including hands-on leadership in deploying LLMs, implementing RAG (Retrieval-Augmented Generation) models (a toy RAG sketch follows this posting), and identifying data science-driven opportunities across the organization.
Data Governance & Quality: Ensure best practices in data governance, security, and quality management to uphold data integrity and compliance.
Education Qualification: Bachelor’s or Master’s in Computer Science, Data Science, Statistics, or a related field. A PhD is a plus.
Experience:
10+ years of experience in analytics, data architecture, or related roles.
Strong knowledge of data modelling techniques.
Understanding of data science: SQL, Python, R, and at least one cloud platform.
Experience with modern data warehousing tools (Snowflake, BigQuery, Redshift) and orchestration (Airflow, dbt).
Technical Competencies/Skills:
Analytics tools (data lake, Tableau) and integration with other systems.
Deep understanding of manufacturing processes and best practices.
Proven track record of implementing enterprise analytics solutions and predictive modelling at scale.
Strong hands-on experience with tools like Power BI, Tableau, Python/R, SQL, and cloud platforms (AWS/GCP/Azure or any other relevant cloud platform).
Experience setting up and managing data lakes and developing end-to-end data pipelines.
Sound understanding of AI/ML techniques, LLMs, GenAI tools, and emerging technologies in data science.
Experience with modern data warehousing tools (Snowflake, BigQuery, Redshift) and orchestration (Airflow, dbt).
Behavioural Competencies:
Strong leadership and team management skills.
Excellent communication and interpersonal skills.
High level of initiative and a proactive approach to problem-solving.
Ability to work under pressure and manage multiple priorities.
Excellent verbal and written communication skills, with the ability to present complex information to both technical and non-technical stakeholders.
Strong analytical and problem-solving skills, with the ability to make data-driven decisions.
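RAG, named in the responsibilities above, has a simple core: retrieve the documents most relevant to a question and feed them to a language model as grounding context. The toy sketch below shows that loop with TF-IDF retrieval and a stubbed model call; the corpus, question, and stub are all made up, and a real deployment would use a vector store and an actual LLM endpoint.

```python
# Toy Retrieval-Augmented Generation (RAG) sketch: retrieve the most
# relevant documents for a question and assemble them into a prompt.
# The LLM call is stubbed; corpus and question are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "Plant A batch records are stored in the production data lake.",
    "Price revisions require approval from the pricing committee.",
    "Reactor maintenance logs are refreshed nightly via the ETL pipeline.",
]
question = "Where are batch records kept?"

vectoriser = TfidfVectorizer().fit(corpus + [question])
doc_vecs = vectoriser.transform(corpus)
q_vec = vectoriser.transform([question])

# Retrieve the top-2 most similar documents as grounding context.
scores = cosine_similarity(q_vec, doc_vecs).ravel()
top_docs = [corpus[i] for i in scores.argsort()[::-1][:2]]

prompt = (
    "Answer using only the context below.\n\n"
    "Context:\n- " + "\n- ".join(top_docs) + f"\n\nQuestion: {question}"
)

def call_llm(prompt: str) -> str:
    # Stub: a production system would call a hosted or self-managed LLM here.
    return "(model response would appear here)"

print(call_llm(prompt))
```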
Posted 1 day ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description
SBS is a global financial technology company that helps banks and financial services adapt in a digital world. Trusted by over 1,500 financial institutions and large-scale lenders in 80 countries, including Santander and Mercedes-Benz, SBS provides a cloud platform with a composable architecture for digitizing operations such as banking, lending, compliance, and payments. Headquartered in Paris, France, SBS employs 3,400 people across 50 offices and is recognized as a top fintech company in Europe.
Senior Technical Team Leader - Business Intelligence, Data Governance & Reporting
Key Responsibilities
• Lead the development and execution of BI strategies, tools, and reporting solutions in alignment with business objectives.
• Serve as a subject matter expert for BI within the organization, supporting internal initiatives and mentoring team members on best practices.
• Design, implement, and maintain scalable data models, analytical layers, and interactive dashboards using modern BI tools (primarily Power BI).
• Continuously optimize BI architecture to ensure scalability, performance, and adaptability to evolving business needs.
• Apply performance optimization techniques to improve data processing, dashboard responsiveness, and user experience.
• Ensure high standards of data quality, consistency, and governance across all BI solutions.
• Collaborate closely with cross-functional teams including data engineers, data scientists, and business stakeholders to define and meet BI requirements.
• Utilize advanced Power BI features (DAX, Power Query, Power BI Service) to build robust, automated reporting and analytical solutions.
• Host workshops and office hours to guide business units on Power BI usage, self-service BI strategies, and technical troubleshooting.
• Stay abreast of emerging BI tools, trends, and methodologies to drive continuous innovation and improvement.
Desired Skills and Experience
• Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, Mathematics, or a related field.
• 10+ years of experience in Business Intelligence, including data warehousing, ETL pipelines, and reporting.
• Expert-level proficiency in BI tools, particularly Power BI; Certified Power BI Data Analyst Associate (PL-300) and Certified Data Management Professional (CDMP) - DAMA.
• Strong command of DAX, Power Query, and SQL for data modeling and integration, and Python for analysis.
• Proficient in Agile/Scrum or traditional project management methodologies.
• Fosters a collaborative team culture and encourages continuous learning.
• Acts as a bridge between technical teams and business stakeholders.
• Familiarity with modern cloud data platforms (e.g., Snowflake, Azure Synapse).
• Understanding of data governance, privacy, and security best practices.
• Excellent problem-solving and analytical thinking skills, with attention to detail.
• Ability to translate complex technical topics into clear, business-friendly language.
• Fluency in English, both written and spoken.
Posted 1 day ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description
Are You Ready to Make It Happen at Mondelēz International?
Join our Mission to Lead the Future of Snacking. Make It With Pride.
Together with analytics team leaders you will support our business with excellent data models to uncover trends that can drive long-term business results.
How You Will Contribute
You will:
Execute the business analytics agenda in conjunction with analytics team leaders
Work with best-in-class external partners who leverage analytics tools and processes
Use models/algorithms to uncover signals, patterns and trends to drive long-term business performance
Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver
What You Will Bring
A desire to drive your future and accelerate your career, and the following experience and knowledge:
Using data analysis to make recommendations to analytic leaders
Understanding of best-in-class analytics practices
Knowledge of key performance indicators (KPIs) and scorecards
Knowledge of BI tools like Tableau, Excel, Alteryx, R, Python, etc. is a plus
In This Role
As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders while staying current with the latest cloud technologies and best practices.
Role & Responsibilities:
Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices.
Technical Requirements:
Programming: Python, PySpark, Go/Java
Database: SQL, PL/SQL
ETL & Integration: dbt, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab Initio, Fivetran
Data Warehousing: SCD (a brief SCD Type 2 sketch follows this posting), schema types, data marts
Visualization: Databricks Notebooks, Power BI, Tableau, Looker
GCP Cloud Services: BigQuery, GCS, Cloud Functions, Pub/Sub, Dataflow, Dataproc, Dataplex
AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis
Supporting Technologies: graph databases/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow
Experience with the RGM.ai product would be an added advantage.
Soft Skills:
Problem-Solving: The ability to identify and solve complex data-related challenges.
Communication: Effective communication skills to collaborate with product owners, analysts, and stakeholders.
Analytical Thinking: The capacity to analyse data and draw meaningful insights.
Attention to Detail: Meticulousness in data preparation and pipeline development.
Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.
Within-country relocation support is available and, for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.
Business Unit Summary
At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy, and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen - and happen fast.
Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.
Job Type: Regular
Analytics & Modelling - Analytics & Data Science
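The "SCD" item in the technical requirements above refers to Slowly Changing Dimensions; Type 2 keeps a dated history of each attribute change. A hedged PySpark sketch of an SCD Type 2 upsert follows, with hypothetical tables and a single tracked attribute; production versions typically use a Delta Lake MERGE instead of unions.

```python
# Hedged sketch of an SCD Type 2 upsert in PySpark. Tables, columns, and
# dates are illustrative; a real pipeline would use Delta Lake MERGE.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

# Current dimension: one open row (valid_to IS NULL) per active version.
dim = spark.createDataFrame(
    [("C1", "Mumbai", "2024-01-01", None)],
    "customer_id string, city string, valid_from string, valid_to string",
)
# Today's snapshot from the source system.
src = spark.createDataFrame(
    [("C1", "Pune"), ("C2", "Delhi")],
    "customer_id string, city string",
)
load_date = "2025-01-01"  # placeholder run date

open_rows = dim.filter(F.col("valid_to").isNull())
closed_rows = dim.filter(F.col("valid_to").isNotNull())
cmp = open_rows.alias("o").join(src.alias("s"), "customer_id", "full_outer")

# 1. Close open rows whose tracked attribute changed.
to_close = (
    cmp.filter(F.col("o.city").isNotNull() & F.col("s.city").isNotNull()
               & (F.col("o.city") != F.col("s.city")))
       .select("customer_id", F.col("o.city").alias("city"),
               F.col("o.valid_from").alias("valid_from"),
               F.lit(load_date).alias("valid_to"))
)
# 2. Open new versions for changed rows and brand-new keys.
to_open = (
    cmp.filter(F.col("s.city").isNotNull()
               & (F.col("o.city").isNull() | (F.col("o.city") != F.col("s.city"))))
       .select("customer_id", F.col("s.city").alias("city"),
               F.lit(load_date).alias("valid_from"),
               F.lit(None).cast("string").alias("valid_to"))
)
# 3. Unchanged open rows carry over untouched.
unchanged = (
    cmp.filter(F.col("o.city").isNotNull()
               & (F.col("s.city").isNull() | (F.col("o.city") == F.col("s.city"))))
       .select("customer_id", F.col("o.city").alias("city"),
               F.col("o.valid_from").alias("valid_from"),
               F.col("o.valid_to").alias("valid_to"))
)

new_dim = closed_rows.unionByName(to_close).unionByName(unchanged).unionByName(to_open)
new_dim.orderBy("customer_id", "valid_from").show()
```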
Posted 1 day ago
5.0 - 10.0 years
8 - 14 Lacs
Navi Mumbai
Work from Office
Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans.
Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and comply with industry best practices.
Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security.
Data Integration: Define and implement data integration strategies to facilitate the seamless flow of information across systems.
Responsibilities:
Experience in data architecture and engineering
Proven expertise with the Snowflake data platform (a minimal connection sketch follows this posting)
Strong understanding of ETL/ELT processes and data integration
Experience with data modeling and data warehousing concepts
Familiarity with performance tuning and optimization techniques
Excellent problem-solving skills and attention to detail
Strong communication and collaboration skills
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Cloud & Data Architecture: AWS, Snowflake
ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions
Big Data & Analytics: Athena, Presto, Hadoop
Database & Storage: SQL, SnowSQL
Security & Compliance: IAM, KMS, data masking
Preferred technical and professional experience:
Cloud Data Warehousing: Snowflake (data modeling, query optimization)
Data Transformation: dbt (Data Build Tool) for ELT pipeline management
Metadata & Data Governance: Alation (data catalog, lineage, governance)
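For a sense of the Snowflake work described above, here is a minimal sketch using the snowflake-connector-python package. Every connection parameter is a placeholder, and a real deployment would pull credentials from a secrets manager rather than hard-coding them.

```python
# Minimal sketch of querying Snowflake from Python with
# snowflake-connector-python; all connection parameters are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # hypothetical account locator
    user="ETL_SVC",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Query optimisation starts with scanning less: select only the columns
    # you need and push filters down into Snowflake.
    cur.execute("""
        SELECT region, SUM(amount) AS total_amount
        FROM orders
        WHERE order_date >= DATEADD(day, -30, CURRENT_DATE)
        GROUP BY region
    """)
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```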
Posted 1 day ago
5.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together - when we combine your strengths with ours - is unstoppable. Are you ready to join a team that dreams as big as you do?
AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics.
Do You Dream Big? We Need You.
Job Description
Job Title: Manager - GBS Commercial
Location: Bangalore
Reporting to: Senior Manager - GBS Commercial
Purpose of the role
This role sits at the intersection of data science and revenue growth strategy, focused on developing advanced analytical solutions to optimize pricing, trade promotions, and product mix. The candidate will lead the end-to-end design, deployment, and automation of machine learning models and statistical frameworks that support commercial decision-making, predictive scenario planning, and real-time performance tracking. By leveraging internal and external data sources - including transactional, market, and customer-level data - this role will deliver insights into price elasticity, promotional lift, channel efficiency, and category dynamics. The goal is to drive measurable improvements in gross margin, ROI on trade spend, and volume growth through data-informed strategies.
Key tasks & accountabilities
Design and implement price elasticity models using linear regression, log-log models, and hierarchical Bayesian frameworks to understand consumer response to pricing changes across channels and segments (a minimal log-log sketch follows this posting).
Build uplift models (e.g., Causal Forests, XGBoost for treatment effects) to evaluate promotional effectiveness and isolate true incremental sales vs. base volume.
Develop demand forecasting models using ARIMA, SARIMAX, and Prophet, integrating external factors such as seasonality, promotions, and competitor activity.
Apply time-series clustering and k-means segmentation to group SKUs, customers, and geographies for targeted pricing and promotion strategies.
Construct assortment optimization models using conjoint analysis, choice modeling, and market basket analysis to support category planning and shelf optimization.
Use Monte Carlo simulations and what-if scenario modeling to assess revenue impact under varying pricing, promo, and mix conditions.
Conduct hypothesis testing (t-tests, ANOVA, chi-square) to evaluate the statistical significance of pricing and promotional changes.
Create LTV (lifetime value) and customer churn models to prioritize trade investment decisions and drive customer retention strategies.
Integrate Nielsen, IRI, and internal POS data to build unified datasets for modeling and advanced analytics in SQL, Python (pandas, statsmodels, scikit-learn), and Azure Databricks environments.
Automate reporting processes and real-time dashboards for price pack architecture (PPA), promotion performance tracking, and margin simulation using advanced Excel and Python.
Lead post-event analytics using pre/post experimental designs, including difference-in-differences (DiD) methods, to evaluate business interventions.
Collaborate with Revenue Management, Finance, and Sales leaders to convert insights into pricing corridors, discount policies, and promotional guardrails.
Translate complex statistical outputs into clear, executive-ready insights with actionable recommendations for business impact.
Continuously refine model performance through feature engineering, model validation, and hyperparameter tuning to ensure accuracy and scalability.
Provide mentorship to junior analysts, enhancing their skills in modeling, statistics, and commercial storytelling.
Maintain documentation of model assumptions, business rules, and statistical parameters to ensure transparency and reproducibility.
Other Competencies Required
Presentation Skills: Effectively presenting findings and insights to stakeholders and senior leadership to drive informed decision-making.
Collaboration: Working closely with cross-functional teams, including marketing, sales, and product development, to implement insights-driven strategies.
Continuous Improvement: Actively seeking opportunities to enhance reporting processes and insights generation to maintain relevance and impact in a dynamic market environment.
Data Scope Management: Managing the scope of data analysis, ensuring it aligns with the business objectives and insights goals.
Act as a steadfast advisor to leadership, offering expert guidance on harnessing data to drive business outcomes and optimize customer experience initiatives.
Serve as a catalyst for change by advocating for data-driven decision-making and cultivating a culture of continuous improvement rooted in insights gleaned from analysis.
Continuously evaluate and refine reporting processes to ensure the delivery of timely, relevant, and impactful insights to leadership stakeholders, while fostering an environment of ownership, collaboration, and mentorship within the team.
Technical Skills - Must Have
Data Manipulation & Analysis: Advanced proficiency in SQL, Python (pandas, NumPy), and Excel for structured data processing.
Data Visualization: Expertise in Power BI and Tableau for building interactive dashboards and performance tracking tools.
Modeling & Analytics: Hands-on experience with regression analysis, time series forecasting, and ML models using scikit-learn or XGBoost.
Data Engineering Fundamentals: Knowledge of data pipelines, ETL processes, and integration of internal/external datasets for analytical readiness.
Proficient in Power BI, advanced MS Excel (pivots, calculated fields, conditional formatting, charts, dropdown lists, etc.), MS PowerPoint, SQL and Python.
Business Environment
Work closely with Zone Revenue Management teams.
Work in a fast-paced environment.
Provide proactive communication to stakeholders.
This is an offshore role and requires comfort with working in a virtual environment; GCC is referred to as the offshore location.
The role requires working collaboratively with Zone/country business heads and GCC commercial teams.
Summarize insights and recommendations to be presented back to the business.
Continuously improve, automate, and optimize processes.
Geographical Scope: Global
Qualifications, Experience, Skills
Level Of Educational Attainment Required
Bachelor's or postgraduate degree in the field of Business & Marketing, Engineering/Solutions, or another equivalent degree, or equivalent work experience.
Previous Work Experience
5-8 years of experience in the Retail/CPG domain.
Extensive experience solving business problems using quantitative approaches.
Comfort with extracting, manipulating, and analyzing complex, high-volume, high-dimensionality data from varying sources.
And above all of this, an undying love for beer!
We dream big to create a future with more cheer.
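The log-log elasticity model named in the first key task above has a convenient property: in log(volume) = a + b·log(price) + e, the coefficient b is the price elasticity directly. Here is a minimal sketch on synthetic data using statsmodels; all numbers are made up for illustration.

```python
# Hedged sketch of a log-log price-elasticity model on synthetic data.
# In log(volume) = a + b*log(price) + e, the coefficient b is the price
# elasticity (e.g., b = -1.8 means a 1% price rise cuts volume ~1.8%).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
price = rng.uniform(50, 150, size=500)          # synthetic price points
true_elasticity = -1.8                          # assumed for the demo
volume = np.exp(8.0 + true_elasticity * np.log(price)
                + rng.normal(0, 0.1, size=500))  # synthetic demand

X = sm.add_constant(np.log(price))
model = sm.OLS(np.log(volume), X).fit()

print(f"estimated elasticity: {model.params[1]:.2f}")  # recovers ~ -1.8
print(f"95% CI: {model.conf_int()[1]}")
```

In practice the regression would add controls for promotions, seasonality, and channel, which is where the hierarchical Bayesian frameworks mentioned in the posting come in.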
Posted 1 day ago
The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.
India's major tech hubs are known for their thriving technology industries and often have a high demand for ETL professionals.
The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.
In the ETL field, a typical career path may include roles such as:
- Junior ETL Developer
- ETL Developer
- Senior ETL Developer
- ETL Tech Lead
- ETL Architect
As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.
Alongside ETL, professionals in this field are often expected to have skills in:
- SQL
- Data Warehousing
- Data Modeling
- ETL Tools (e.g., Informatica, Talend)
- Database Management Systems (e.g., Oracle, SQL Server)
Having a strong foundation in these related skills can enhance your capabilities as an ETL professional; the short sketch below shows the extract-transform-load pattern at its simplest.
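As a first taste of the pattern the whole field is named after, here is a toy end-to-end ETL using only the Python standard library. The file and table names are illustrative; real pipelines would use a dedicated tool like those listed above.

```python
# Toy end-to-end ETL: extract rows from a CSV, transform them, and load
# them into SQLite. File and table names are illustrative.
import csv
import sqlite3

# Extract (a two-row sample file is created inline for self-containedness).
with open("sales.csv", "w", newline="") as f:
    f.write("region,amount\nNorth,100\nSouth,250\n")
with open("sales.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: cast types and derive a simple field.
for row in rows:
    row["amount"] = float(row["amount"])
    row["amount_band"] = "high" if row["amount"] > 200 else "low"

# Load: insert into a warehouse-style target table.
conn = sqlite3.connect("warehouse.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL, amount_band TEXT)"
)
conn.executemany(
    "INSERT INTO sales VALUES (:region, :amount, :amount_band)", rows
)
conn.commit()
print(conn.execute("SELECT * FROM sales").fetchall())
```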
In ETL job interviews, expect questions covering core ETL concepts, SQL, data warehousing, performance tuning, and the specific tools listed on your resume.
As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!