0 years
0 Lacs
Andhra Pradesh, India
On-site
Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork. Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers. Collaborate with back-end web developers and programmers to improve usability. Conduct thorough testing of user interfaces on multiple platforms to ensure all designs render correctly and systems function properly. Convert jobs from Talend ETL to Python and convert lead SQLs to Snowflake. We are looking for developers with Python and SQL skills. Developers should be proficient in Python (especially Pandas, PySpark, or Dask) for ETL scripting, with strong SQL skills to translate complex queries. They need expertise in Snowflake SQL for migrating and optimizing queries, as well as experience with data pipeline orchestration (e.g., Airflow) and cloud integration for automation and data loading. Familiarity with data transformation, error handling, and logging is also essential.
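By way of illustration, a minimal sketch of the Talend-to-Python migration pattern this posting describes, using Pandas for the transformation step and the Snowflake Python connector for loading. The file name, table name, and connection parameters are hypothetical placeholders, not details from the posting:

```python
import logging

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("talend_migration")


def run_orders_job() -> None:
    # Extract: stand-in for whatever source the original Talend job read.
    df = pd.read_csv("orders_extract.csv")

    # Transform: typing and filtering a tMap component might have done.
    df["ORDER_DATE"] = pd.to_datetime(df["ORDER_DATE"])
    df = df[df["AMOUNT"] > 0]

    # Load: bulk-write the frame into Snowflake (placeholder credentials).
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="SALES", schema="PUBLIC",
    )
    try:
        success, _, nrows, _ = write_pandas(
            conn, df, table_name="ORDERS", auto_create_table=True
        )
        log.info("Loaded %s rows (success=%s)", nrows, success)
    except Exception:
        # Error handling and logging, as the posting calls out.
        log.exception("Load to Snowflake failed")
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    run_orders_job()
```

In a real migration, each Talend job's components would be mapped to equivalent, testable Python functions, with an orchestrator such as Airflow scheduling the resulting scripts.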
Posted 2 days ago
3.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY-Consulting - Data and Analytics – GIG - Data Modeller EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services that leverage deep industry experience with strong functional and technical capabilities and product knowledge. EY’s financial services practice provides integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY’s Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way we help create a compelling business case for embedding the right analytical practice at the heart of clients’ decision-making. The opportunity We’re looking for a candidate with 3-7 years of expertise in data science, data analysis and visualization, to act as Technical Lead to a larger team in the EY GDS DnA team working on various Data and Analytics projects. Your Key Responsibilities Lead and mentor a team throughout the design, development and delivery phases and keep the team intact in high-pressure situations. Work as a senior team member contributing to various technical streams of EY DnA implementation projects. Client focused with good presentation, communication and relationship building skills. Completion of assigned tasks on time and regular status reporting to the lead. Collaborate with the technology team and support the development of analytical models with the effective use of data and analytic techniques, validate the model results and articulate the insights to the business team. Interface and communicate with the onsite teams directly to understand the requirements and determine the optimum solutions. Create technical solutions as per business needs by translating their requirements and finding innovative solution options. Provide product- and design-level functional and technical expertise along with best practices. Get involved in business development activities like creating proof of concepts (POCs) and points of view (POVs), assist in proposal writing and service offering development, and be capable of developing creative PowerPoint content for presentations. Participate in organization-level initiatives and operational activities. Ensure continual knowledge management and contribute to internal L&D teams. Build a quality work culture, foster teamwork and lead by example. Skills and attributes for success Use an issue-based approach to deliver growth, market and portfolio strategy engagements for corporates. Strong communication, presentation and team building skills and experience in producing high quality reports, papers, and presentations.
Experience in executing and managing research and analysis of companies and markets, preferably from a commercial due diligence standpoint. To qualify for the role, you must have BE/BTech/MCA/MBA with 3+ years of industry experience with machine learning, visualization, data science and related offerings. At least 3+ years of experience in BI and Analytics. Ability to deliver end-to-end data solutions covering analysis, mapping, profiling, ETL architecture and data modelling. Knowledge and experience of at least one Insurance domain engagement, Life or Property & Casualty. Understanding of Business Intelligence, Data Warehousing and Data Modelling. Good experience using CA Erwin or a similar modelling tool is an absolute must. Experience working with Guidewire DataHub & InfoCenter. Strong knowledge of relational and dimensional data modelling concepts. Develop logical and physical data flow models for ETL applications. Translate data access, transformation and movement requirements into functional requirements and mapping designs. Strong knowledge of data architecture, database structure, data analysis and SQL skills. Experience in data management analysis. Analyse business objectives and evaluate data solutions to meet customer needs. Establish scalable, efficient, automated processes for large-scale data analyses and management. Prepare and analyse historical data and identify patterns. Collaborate with the technology team and support the development of analytical models with the effective use of data and analytic techniques. Validate the model results and articulate the insights to the business team. Drive business requirements gathering for analytics projects. Intellectual curiosity - eagerness to learn new things. Experience with unstructured data is an added advantage. Ability to effectively visualize and communicate analysis results. Experience with big data and cloud preferred. Experience, interest and adaptability to working in an Agile delivery environment. Ability to work in a fast-paced environment where change is a constant, and ability to handle ambiguous requirements. Exceptional interpersonal and communication skills (written and verbal). Ideally, you’ll also have Good exposure to any ETL tools. Good to have knowledge about P&C insurance. Understanding of Business Intelligence, Data Warehousing and Data Modelling. Must have led a team of at least 4 members. Experience in the Insurance and Banking domains. Prior client-facing experience; self-motivated and collaborative. What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries. What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments.
Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 2 days ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
Job Description Design, develop, and maintain scalable data pipelines and systems using DBT and Big Data technologies. Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs. Implement data models and transformations using DBT. Develop and maintain ETL processes to ingest and process large volumes of data from various sources. Optimize and troubleshoot data workflows to ensure high performance and reliability. Ensure data quality and integrity through rigorous testing and validation. Monitor and manage data infrastructure, ensuring security and compliance with best practices. Provide technical support and guidance to team members on data engineering best practices. Requirements Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a Data Engineer or in a similar role. Strong proficiency in DBT for data modeling and transformations. Hands-on experience with Big Data technologies (e.g., Hadoop, Spark, Kafka). Proficient in Python for data processing and automation. Experience with SQL and database management. Familiarity with data warehousing concepts and best practices. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Preferred Qualifications Experience with cloud platforms (e.g., AWS, Azure, Google Cloud). Knowledge of data governance and security practices. Certification in relevant technologies (e.g., DBT, Big Data platforms).
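As a hedged illustration of the DBT-centred modelling described above: dbt models are usually written in SQL, but adapters such as dbt-databricks also support Python models, which keeps this sketch in Python. The model and column names (fct_daily_orders, stg_orders, status, amount) are invented for the example:

```python
# models/fct_daily_orders.py -- a hypothetical dbt Python model
# (supported on adapters such as dbt-databricks, where `session`
# is a SparkSession and dbt.ref() returns a Spark DataFrame).
import pyspark.sql.functions as F


def model(dbt, session):
    # Materialize as a table; "incremental" is common for large volumes.
    dbt.config(materialized="table")

    # dbt.ref() records lineage to the upstream staging model.
    orders = dbt.ref("stg_orders")

    # A typical mart-layer transformation: one aggregate row per day.
    return (
        orders
        .where(F.col("status") == "completed")
        .groupBy("order_date")
        .agg(
            F.count("*").alias("order_count"),
            F.sum("amount").alias("total_amount"),
        )
    )
```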
Posted 2 days ago
5.0 - 7.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Design, develop, and maintain scalable data processing applications using Spark and PySpark. 5+ years of experience in at least one of the following: Java, Spark, Scala, Python, with API development expertise. Write efficient, reusable, and well-documented code. Design and implement data pipelines using tools like Spark and PySpark. Strong analytical and problem-solving abilities to address technical challenges. Perform code reviews and provide constructive feedback to improve code quality. Design and implement data processing tasks that integrate with SQL databases. Proficiency in data modeling, data lake, lakehouse, and data warehousing concepts. Experience with cloud platforms like AWS.
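A short, non-authoritative sketch of the kind of Spark/PySpark pipeline this role describes; the S3 paths and column names are assumptions made for illustration:

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Extract: read raw JSON events (hypothetical S3 location).
raw = spark.read.json("s3a://raw-bucket/orders/")

# Transform: enforce types, derive a partition column, drop invalid rows.
clean = (
    raw
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)
)

# Load: partitioned Parquet for efficient downstream SQL access.
(clean.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://curated-bucket/orders/"))
```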
Posted 2 days ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Ciklum is looking for a Data Engineer to join our team full-time in India. We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live. About the role: As a Data Engineer, become a part of a cross-functional development team engineering experiences of tomorrow. Responsibilities: Design, build, and maintain robust data pipelines using PL/SQL, Oracle, and Java (Spring Boot) Develop and maintain data services and APIs following microservices architecture best practices Implement analytics and reporting solutions using tools such as OBIEE, ODI, or Oracle Apex Ensure performance, scalability, and reliability of ETL/ELT processes across structured and semi-structured data Participate in unit testing, data validation, and quality assurance for data services Collaborate with cross-functional teams to deliver data-driven solutions aligned with business objectives Troubleshoot data issues in development and production environments Engage in Agile/SAFe ceremonies like PI Planning, sprint planning, reviews, and retrospectives Requirements: 4–6 years of hands-on experience in data engineering, preferably within financial services or enterprise environments Proficient in: PL/SQL, Oracle RDBMS Java, Spring Boot, and REST-based APIs ETL/ELT pipeline development Tools like OBIEE, ODI, or similar Familiarity with microservices, data integration, and software development best practices Strong problem-solving and debugging skills Effective communicator with the ability to collaborate across technical and non-technical teams Demonstrated initiative, adaptability, and a desire to learn Desirable: Exposure to MongoDB and/or Oracle Apex Experience with cloud platforms such as AWS or Azure Proficiency in data visualization/reporting tools like Power BI or Tableau Understanding of SAFe Agile methodologies in large-scale data environments Awareness of data governance, lineage, and optimization techniques What's in it for you? Care: your mental and physical health is our priority. We ensure comprehensive company-paid medical insurance, as well as financial and legal consultation Tailored education path: boost your skills and knowledge with our regular internal events (meetups, conferences, workshops), Udemy licence, language courses and company-paid certifications Growth environment: share your experience and level up your expertise with a community of skilled professionals, locally and globally Flexibility: hybrid work mode at Chennai or Pune Opportunities: we value our specialists and always find the best options for them. Our Resourcing Team helps change a project if needed to help you grow, excel professionally and fulfil your potential Global impact: work on large-scale projects that redefine industries with international and fast-growing clients Welcoming environment: feel empowered with a friendly team, open-door policy, informal atmosphere within the company and regular team-building events About us: At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you'll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress.
India is a strategic innovation hub for Ciklum, with growing teams in Chennai and Pune leading advancements in EdgeTech, AR/VR, IoT, and beyond. Join us to collaborate on game-changing solutions and take your career to the next level. Want to learn more about us? Follow us on Instagram, Facebook, LinkedIn. Explore, empower, engineer with Ciklum! Experiences of tomorrow. Engineered together. Interested already? We would love to get to know you! Submit your application. Can’t wait to see you at Ciklum.
Posted 2 days ago
5.0 - 10.0 years
20 - 27 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
We're Hiring | Platform Engineer @ Xebia Locations: Bangalore | Bhopal | Chennai | Gurgaon | Hyderabad | Jaipur | Pune Immediate Joiners (0-15 Days Notice Period Only) Valid Passport is Mandatory Xebia is on the lookout for passionate Platform Engineers with a strong mix of Azure Infrastructure as Code (IaC) with Terraform and Data Engineering expertise to join our Cloud Data Platform team. What You'll Do: Design & deploy scalable Azure infrastructure using Terraform Build & optimize ETL/ELT pipelines using Azure Data Factory, Databricks, Event Hubs Automate infra provisioning, enforce security/governance via IaC Support CI/CD workflows with Git, Azure DevOps Work with VNETs, Key Vaults, Storage Accounts, Monitoring Tools Use Python, SQL, Spark for data transformation & processing What We're Looking For: Hands-on experience in Azure IaC + Data Engineering Strong in scripting, automation, & monitoring Familiarity with real-time & batch processing Azure certifications (Data Engineer / DevOps) are a plus Must have a valid passport Interested? Send your CV along with the following details to: vijay.s@xebia.com Required Details: Full Name Total Experience Current CTC Expected CTC Current Location Preferred Location Notice Period / Last Working Day (if serving notice) Primary Skill Set LinkedIn Profile URL Do you have a valid passport? (Yes/No) Please apply only if you haven't applied recently or aren't already in the process with any open Xebia roles. Let’s build the future of cloud-native data platforms together!
Posted 2 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing and security, it brings deep expertise for all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 47,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come. Roles and Responsibilities The Senior Tech Lead - Databricks leads the design, development, and implementation of advanced data solutions, and must have extensive experience in Databricks, cloud platforms, and data engineering, with a proven ability to lead teams and deliver complex projects. Responsibilities Lead the design and implementation of Databricks-based data solutions. Architect and optimize data pipelines for batch and streaming data. Provide technical leadership and mentorship to a team of data engineers. Collaborate with stakeholders to define project requirements and deliverables. Ensure best practices in data security, governance, and compliance. Troubleshoot and resolve complex technical issues in Databricks environments. Stay updated on the latest Databricks features and industry trends. Key Technical Skills & Responsibilities Experience in data engineering using Databricks or Apache Spark-based platforms. Proven track record of building and optimizing ETL/ELT pipelines for batch and streaming data ingestion. Hands-on experience with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Synapse Analytics, or Azure SQL Data Warehouse. Proficiency in programming languages such as Python, Scala, SQL for data processing and transformation. Expertise in Spark (PySpark, Spark SQL, or Scala) and Databricks notebooks for large-scale data processing. Familiarity with Delta Lake, Delta Live Tables, and medallion architecture for data lakehouse implementations (see the sketch below this posting). Experience with orchestration tools like Azure Data Factory or Databricks Jobs for scheduling and automation. Design and implement Azure Key Vault and scoped credentials. Knowledge of Git for source control and CI/CD integration for Databricks workflows, plus cost optimization and performance tuning. Familiarity with Unity Catalog, RBAC, or enterprise-level Databricks setups. Ability to create reusable components, templates, and documentation to standardize data engineering workflows is a plus. Ability to define best practices, support multiple projects, and sometimes mentor junior engineers is a plus. Must have experience of working with streaming data sources and Kafka (preferred). Eligibility Criteria Bachelor’s degree in Computer Science, Data Engineering, or a related field Extensive experience with Databricks, Delta Lake, PySpark, and SQL Databricks certification (e.g., Certified Data Engineer Professional) Experience with machine learning and AI integration in Databricks Strong understanding of cloud platforms (AWS, Azure, or GCP) Proven leadership experience in managing technical teams Excellent problem-solving and communication skills Our Offering Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment. Wellbeing programs & work-life balance - integration and passion sharing events.
Attractive Salary and Company Initiative Benefits Courses and conferences Hybrid work culture Let’s grow together.
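As a minimal sketch of the bronze-to-silver step of the medallion architecture mentioned in the posting above, using Delta Lake with PySpark (as on Databricks, where Delta is the default table format); the storage paths and column names are assumptions:

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

# On Databricks a `spark` session already exists; this is for standalone runs.
spark = SparkSession.builder.appName("medallion_demo").getOrCreate()

# Bronze: land raw files as-is, stamped with load metadata.
bronze = (
    spark.read.json("/mnt/raw/sensors/")          # hypothetical landing zone
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").save("/mnt/bronze/sensors")

# Silver: dedupe and standardize types for downstream consumers.
silver = (
    spark.read.format("delta").load("/mnt/bronze/sensors")
    .dropDuplicates(["device_id", "event_ts"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/sensors")
```

A gold layer would then aggregate the silver tables into business-level marts; Delta Live Tables can express the same flow declaratively.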
Posted 2 days ago
3.0 - 6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About The Role Grade Level (for internal use): 09 The Role: As a Software Developer with the Data & Research Development team, you will be responsible for developing & providing backend support across a variety of products within the Market Intelligence platform. Together, you will build scalable and robust solutions using AGILE development methodologies with a focus on high availability to end users. The Team: Do you love to collaborate & provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams. The Impact: We focus primarily on developing, enhancing and delivering required pieces of information & functionality to internal & external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact. What’s in it for you? Opportunities for innovation and learning new state-of-the-art technologies To work in a pure Agile & Scrum methodology Responsibilities Deliver solutions within a multi-functional Agile team Develop expertise in our proprietary enterprise software products Set and maintain a level of excitement in using various technologies to develop, support, and iteratively deploy real enterprise-level software Achieve an understanding of customer environments and their use of the products Build solutions architecture, algorithms, and designs for solutions that scale to the customer's enterprise/global requirements Apply software engineering practices and implement automation across all elements of solution delivery Basic Qualifications What we’re looking for: 3-6 years of desktop application development experience with deep understanding of Design Patterns & Object-oriented programming. Hands-on development experience using C#, .Net 4.0/4.5, WPF, ASP.NET, SQL Server. Strong OOP and Service Oriented Architecture (SOA) knowledge. Strong understanding of cloud applications (containers, Docker, etc.) and exposure to data ETL will be a plus. Ability to resolve serious performance-related issues through various techniques, including testing, debugging and profiling. Strong problem solving, analytical and communication skills. Possess a true “roll up the sleeves and get it done” working approach; demonstrated success as a problem solver, operating as a client-focused self-starter. Preferred Qualifications Bachelor's degree in Computer Science or Computer Engineering About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow.
At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. 
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 313152 Posted On: 2025-05-05 Location: Hyderabad, Telangana, India
Posted 2 days ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Dear Candidates, We are hiring for the position of ETL Developer, kindly find the details below. Organization: Confidential (IT Service MNC Company) Job Location: Bangalore, Marathahalli Experience Required: 5-10 years Designation: ETL Developer Mode of Job: Work From Office Job Description for ETL Developer BPCE ES Dataverse Team Profile Required: Technical profile with 5+ years in Data warehousing and BI Strong fundamentals of Data warehousing and BI concepts Experience in Data Integration, Governance and Management Skills Required Mandatory: Minimum experience of 5 years in IBM DataStage 11.7 and SQL/PL-SQL Working knowledge of Oracle and PostgreSQL databases Should have hands-on experience in Unix shell scripting. Personal Skills: Good communication skills (written and verbal) with the ability to understand and interact with a diverse range of stakeholders Ability to raise factual alerts & risks when necessary Capability to work with cross-location team members/stakeholders in order to establish and maintain consistent delivery. Good To Have (Optional) Technical - Reporting: PowerBI, SAP BO, Tableau Functional - Finance/Banking - Asset finance / Equipment finance / Leasing Role and Responsibilities Responsible for developing data warehousing solutions (DataStage, Oracle, PostgreSQL) as per requirements. Must provide hands-on technical knowledge and take ownership while working with Business users, Project Managers, Technical Leads, Architects and Testing teams. Provide guidance to IT management in establishing both a short-term roadmap and long-term DW/BI strategy. Work closely with team members and stakeholders to ensure seamless development and delivery of assigned tasks. Assist the team/lead in team management, identifying training needs and inducting new starters. Take part in discussions on BI/DW forums alongside peers to the benefit of the organization. Interested candidates, please share your resume at anshu.baranwal@rigvedtech.com
Posted 2 days ago
7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Experience: 7+ Years Location: Noida-Sector 64 Key Responsibilities: Data Architecture Design: Design, develop, and maintain the enterprise data architecture, including data models, database schemas, and data flow diagrams. Develop a data strategy and roadmap that aligns with the business objectives and ensures the scalability of data systems. Architect both transactional (OLTP) and analytical (OLAP) databases, ensuring optimal performance and data consistency. Data Integration & Management: Oversee the integration of disparate data sources into a unified data platform, leveraging ETL/ELT processes and data integration tools. Design and implement data warehousing solutions, data lakes, and/or data marts that enable efficient storage and retrieval of large datasets. Ensure proper data governance, including the definition of data ownership, security, and privacy controls in accordance with compliance standards (GDPR, HIPAA, etc.). Collaboration with Stakeholders: Work closely with business stakeholders, including analysts, developers, and executives, to understand data requirements and ensure that the architecture supports analytics and reporting needs. Collaborate with DevOps and engineering teams to optimize database performance and support large-scale data processing pipelines. Technology Leadership: Guide the selection of data technologies, including databases (SQL/NoSQL), data processing frameworks (Hadoop, Spark), cloud platforms (Azure is a must), and analytics tools. Stay updated on emerging data management technologies, trends, and best practices, and assess their potential application within the organization. Data Quality & Security: Define data quality standards and implement processes to ensure the accuracy, completeness, and consistency of data across all systems. Establish protocols for data security, encryption, and backup/recovery to protect data assets and ensure business continuity. Mentorship & Leadership: Lead and mentor data engineers, data modelers, and other technical staff in best practices for data architecture and management. Provide strategic guidance on data-related projects and initiatives, ensuring that all efforts are aligned with the enterprise data strategy. Required Skills & Experience: Extensive Data Architecture Expertise: Over 7 years of experience in data architecture, data modeling, and database management. Proficiency in designing and implementing relational (SQL) and non-relational (NoSQL) database solutions. Strong experience with data integration tools (Azure tools are a must, plus other third-party tools), ETL/ELT processes, and data pipelines. Advanced Knowledge of Data Platforms: Expertise in the Azure cloud data platform is a must. Other platforms such as AWS (Redshift, S3), Azure (Data Lake, Synapse), and/or Google Cloud Platform (BigQuery, Dataproc) are a bonus. Experience with big data technologies (Hadoop, Spark) and distributed systems for large-scale data processing. Hands-on experience with data warehousing solutions and BI tools (e.g., Power BI, Tableau, Looker). Data Governance & Compliance: Strong understanding of data governance principles, data lineage, and data stewardship. Knowledge of industry standards and compliance requirements (e.g., GDPR, HIPAA, SOX) and the ability to architect solutions that meet these standards. Technical Leadership: Proven ability to lead data-driven projects, manage stakeholders, and drive data strategies across the enterprise. Strong programming skills in languages such as Python, SQL, R, or Scala.
Certification: Azure Certified Solution Architect, Data Engineer, Data Scientist certifications are mandatory. Pre-Sales Responsibilities: Stakeholder Engagement: Work with product stakeholders to analyze functional and non-functional requirements, ensuring alignment with business objectives. Solution Development: Develop end-to-end solutions involving multiple products, ensuring security and performance benchmarks are established, achieved, and maintained. Proof of Concepts (POCs): Develop POCs to demonstrate the feasibility and benefits of proposed solutions. Client Communication: Communicate system requirements and solution architecture to clients and stakeholders, providing technical assistance and guidance throughout the pre-sales process. Technical Presentations: Prepare and deliver technical presentations to prospective clients, demonstrating how proposed solutions meet their needs and requirements. Additional Responsibilities: Stakeholder Collaboration: Engage with stakeholders to understand their requirements and translate them into effective technical solutions. Technology Leadership: Provide technical leadership and guidance to development teams, ensuring the use of best practices and innovative solutions. Integration Management: Oversee the integration of solutions with existing systems and third-party applications, ensuring seamless interoperability and data flow. Performance Optimization: Ensure solutions are optimized for performance, scalability, and security, addressing any technical challenges that arise. Quality Assurance: Establish and enforce quality assurance standards, conducting regular reviews and testing to ensure robustness and reliability. Documentation: Maintain comprehensive documentation of the architecture, design decisions, and technical specifications. Mentoring: Mentor fellow developers and team leads, fostering a collaborative and growth-oriented environment. Qualifications: Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field. Experience: Minimum of 7 years of experience in data architecture, with a focus on developing scalable and high-performance solutions. Technical Expertise: Proficient in architectural frameworks, cloud computing, database management, and web technologies. Analytical Thinking: Strong problem-solving skills, with the ability to analyze complex requirements and design scalable solutions. Leadership Skills: Demonstrated ability to lead and mentor technical teams, with excellent project management skills. Communication: Excellent verbal and written communication skills, with the ability to convey technical concepts to both technical and non-technical stakeholders.
Posted 2 days ago
2.0 - 4.0 years
4 - 6 Lacs
Pune
Work from Office
An Engineer is responsible for designing, developing and delivering significant components of engineering solutions to accomplish business goals. Key responsibilities of this role include active participation in the design and development of new features and application enhancements, investigating re-use, ensuring that solutions are fit for purpose and maintainable, and can be integrated successfully into the overall solution and environment with clear, robust and well tested deployments. Assists more junior members of the team and controls their work where applicable. Your key responsibilities Develops source code, including CI/CD pipelines and infrastructure- and application-related configurations, for all Software Components in accordance with the Detailed Software Requirements specification. Provides quality development for technical infrastructure components (e.g., cloud configuration, networking and security, storage, Infrastructure as Code) and source code development. Debugs, fixes and provides support to the L3 and L2 teams. Verifies the developed source code by reviews (4-eyes principle). Contributes to quality assurance by writing and conducting unit testing. Ensures architectural changes (as defined by Architects) are implemented. Contributes to problem and root cause analysis. Integrates software components following the integration strategy. Verifies integrated software components by unit and integrated software testing according to the software test plan. Software test findings must be resolved. Ensures that all code changes end up in Change Items (CIs). Where applicable, develops routines to deploy CIs to the target environments. Provides Release Deployments on non-Production Management controlled environments. Supports creation of Software Product Training Materials, Software Product User Guides, and Software Product Deployment Instructions. Checks consistency of documents with the respective Software Product Release. Where applicable, manages maintenance of applications and performs technical change requests scheduled according to Release Management processes. Fixes software defects/bugs, measures and analyses code for quality. Collaborates with colleagues participating in other stages of the Software Development Lifecycle (SDLC). Identifies dependencies between software product components, between technical components, and between applications and interfaces. Identifies product integration verifications to be performed based on the integration sequence and relevant dependencies. Suggests and implements continuous technical improvements on the applications (scalability, reliability, availability, performance) Your skills and experience General Skills Bachelor of Science degree from an accredited college or university with a concentration in Computer Science or Software Engineering (or equivalent) with a minor in Finance, Mathematics or Engineering. Strong analytical skills. Proficient communication skills. Fluent in English (written/verbal). Ability to work in virtual teams and in matrixed organizations. Excellent team player. Open minded. Keeps pace with technical innovation. Understands the relevant business area Ability to share information, transfer knowledge and expertise to team members. Ability to design and write code in accordance with provided business requirements Ability to contribute to QA strategy and Architecture decisions. Knowledge of IT delivery and architecture of Cloud native systems and applications Relevant Financial Services experience.
Ability to work in a fast-paced environment with competing and alternating priorities with a constant focus on delivery. Ability to balance business demands and IT fulfilment in terms of standardization, reducing risk and increasing IT flexibility. Domain Specific Skills Very good knowledge of the following technologies is needed: Cloud offering (GCP preferred) Cloud services - IaaS, PaaS, SaaS Cloud-native development and DevOps, API management, networking and configuration Java or Python; good understanding of ETL/data pipelines Very good knowledge of core processes/tools such as HP ALM, Jira, ServiceNow, SDLC, Agile processes.
Posted 2 days ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Req ID: 328445 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Senior Analyst to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Job Description Role Description: As a Cognos Developer, you will be a key contributor to our business intelligence initiatives. You will be responsible for building, testing, and deploying Cognos reports, managing Framework Manager packages, and ensuring the accuracy and reliability of our data visualizations. Your ability to collaborate with cross-functional teams and your expertise in Cognos Analytics will be essential for success in this role. Responsibilities: Design, develop, and deploy Cognos reports and dashboards using Cognos Analytics 11/12. Build and maintain Cognos reports using Framework Manager and Report Studio. Develop reports with Drill Through, List, Crosstab, and Prompt pages, Page grouping & sections. Utilize Cognos Data Modules and Visualization Gallery to create interactive and insightful visualizations. Build, manage, and maintain Framework Manager packages. Ensure data integrity and consistency within Cognos packages. Optimize Framework Manager performance. Understand and apply data warehousing concepts. Possess basic knowledge of Extract, Transform, Load (ETL) processes. Write and optimize SQL queries for data retrieval and manipulation. Perform data analysis and validation using SQL. Build, test, and deploy Cognos reports and dashboards. Ensure reports meet business requirements and quality standards. Analyze business requirements and translate them into technical specifications. Collaborate with stakeholders to understand reporting needs. Create and maintain technical documentation for Cognos reports and packages. Provide support to end-users on Cognos reporting. Collaborate with cross-functional teams to deliver business intelligence solutions. Communicate effectively with team members and stakeholders. Technical Skills: Cognos Analytics, Oracle, Teradata. Experience in Cognos Analytics 11/12 (Data Modules, Framework Manager Packages, Report Studio, Visualization Gallery, Cognos Dashboard). Good knowledge in Cognos packages using Framework Manager. Design and develop reports using Report Studio. Good SQL skills for data retrieval and manipulation. Experience in data warehousing and business intelligence. Basic knowledge of Extract, Transform, Load (ETL) processes.
About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
Posted 2 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Senior Manager - Product Quality Engineering Leader Career Level - E Introduction to role: Join our Commercial IT Data Analytics & AI (DAAI) team as a Product Quality Leader, where you will play a pivotal role in ensuring the quality and stability of our data platforms built on AWS services, Databricks, and SnapLogic. Based in Chennai GITC, you will drive the quality engineering strategy, lead a team of quality engineers, and contribute to the overall success of our data platform. Accountabilities: As the Product Quality Team Leader for data platforms, your key accountabilities will include leadership and mentorship, quality engineering standards, collaboration, technical expertise, and innovation and process improvement. You will lead the design, development, and maintenance of scalable and secure data infrastructure and tools to support the data analytics and data science teams. You will also develop and implement data and data engineering quality assurance strategies and plans tailored to data product build and operations. Essential Skills/Experience: Bachelor’s degree or equivalent in Computer Engineering, Computer Science, or a related field Proven experience in a product quality engineering or similar role, with at least 3 years of experience in managing and leading a team. Experience of working within a quality and compliance environment and application of policies, procedures, and guidelines A broad understanding of cloud architecture (preferably in AWS) Strong experience in Databricks, PySpark and the AWS suite of applications (like S3, Redshift, Lambda, Glue, EMR). Proficiency in programming languages such as Python Experienced in Agile development techniques and methodologies. Solid understanding of data modelling, ETL processes and data warehousing concepts Excellent communication and leadership skills, with the ability to collaborate effectively with technical and non-technical stakeholders. Experience with big data technologies such as Hadoop or Spark. Certification in AWS or Databricks. Prior significant experience working in a Pharmaceutical or Healthcare industry IT environment. When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca, we are committed to disrupting an industry and changing lives. Our work has a direct impact on patients, transforming our ability to develop life-changing medicines. We empower the business to perform at its peak and lead a new way of working, combining cutting-edge science with leading digital technology platforms and data. We dare to lead, applying our problem-solving mindset to identify and tackle opportunities across the whole enterprise. Our spirit of experimentation is lived every day through our events like hackathons. We enable AstraZeneca to perform at its peak by delivering world-class technology and data solutions. Are you ready to be part of a team that has the backing to innovate, disrupt an industry and change lives? Apply now to join us on this exciting journey!
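To make the data quality assurance responsibilities above concrete, a minimal sketch of rule-based validation on a Databricks/PySpark stack; the dataset path, key column, and rules are hypothetical:

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("dq_checks").getOrCreate()

# Hypothetical curated dataset under test.
df = spark.read.format("delta").load("/mnt/silver/claims")

# Rule 1: the primary key must be present and unique.
null_keys = df.filter(F.col("claim_id").isNull()).count()
dupe_keys = df.groupBy("claim_id").count().filter("count > 1").count()

# Rule 2: monetary amounts must be non-negative.
bad_amounts = df.filter(F.col("amount") < 0).count()

failures = {
    "null_keys": null_keys,
    "dupe_keys": dupe_keys,
    "bad_amounts": bad_amounts,
}
if any(v > 0 for v in failures.values()):
    # Failing fast keeps bad data out of downstream marts.
    raise ValueError(f"Data quality checks failed: {failures}")
```

Frameworks such as Great Expectations, or Delta Live Tables expectations, offer more declarative versions of the same idea.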
Posted 2 days ago
10.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
Role Title: Head - Business Intelligence & AI Reporting To: Chief Information Officer Location of Posting: Corporate office, Vadodara Position Overview: We are seeking a seasoned Head - Business Intelligence & AI to lead our data strategy, design scalable data models, and drive analytical and AI innovation across the organization. This role combines leadership in data science, AI and business analytics with deep technical expertise in data architecture and modelling, AI/ML, ETL, dashboards, and AI including GenAI and Agentic AI. The ideal candidate will be a strategic thinker, technical expert, and effective communicator capable of aligning data initiatives with business objectives. As the Head of AI and Analytics in a chemical manufacturing organization, your role involves leveraging AI and analytics across all functions—R&D, production, supply chain, sales, marketing, finance, HR, and compliance—while incorporating dashboarding, ETL processes, and a data lake to enable data-driven decision-making. Key Responsibilities: Data Strategy Leadership - Define and drive the enterprise-wide business intelligence and analytics strategy; align BI initiatives with overall business goals and digital transformation priorities. Formulate a comprehensive AI and analytics roadmap aligned with the organization’s goals, focusing on improving operational efficiency. Oversee the design and maintenance of a centralized data lake to store diverse data, ensuring scalability, security, and accessibility for cross-functional BI and AI initiatives. Identify cross-functional use cases, such as using AI to predict market demand, optimize pricing strategies, or enhance employee training programs. Apply AI for predictive maintenance of equipment and process optimization while using BI to monitor production KPIs and identify bottlenecks through historical data analysis. Stakeholder Engagement - Collaborate with executive leadership, functional heads, and IT to identify analytics needs; translate business questions into actionable insights and dashboards. Leadership: Lead the Analytics and AI team, provide strategic insights to the C-suite, and foster a data-driven culture. Develop and maintain interactive dashboards for all functions, providing real-time insights to stakeholders. Data-Driven Decision Support - Deliver KPIs, scorecards, and predictive models to enable strategic decision-making; promote advanced analytics, AI/ML initiatives, and scenario planning. AI & GenAI Enablement: Spearhead AI and Generative AI initiatives, including hands-on leadership in deploying LLMs, implementing RAG (Retrieval-Augmented Generation) models (a toy sketch follows this listing), and identifying data science-driven opportunities across the organization. Data Governance & Quality: Ensure best practices in data governance, security, and quality management to uphold data integrity and compliance. Education Qualification: Bachelor’s or Master’s in Computer Science, Data Science, Statistics, or a related field. PhD is a plus. Experience: 10+ years of experience in analytics, data architecture, or related roles. Strong knowledge of data modelling techniques Understanding of data science (SQL, Python, R) and at least one cloud platform. Experience with modern data warehousing tools (Snowflake, BigQuery, Redshift) and orchestration (Airflow, DBT) Technical Competencies/Skills: Analytics tools (data lake, Tableau) and integration with other systems Deep understanding of manufacturing processes and best practices.
Proven track record of implementing enterprise analytics solutions and predictive modeling at scale. Strong hands-on experience with tools like Power BI, Tableau, Python/R, SQL, and cloud platforms (AWS/GCP/Azure) or any other relevant cloud platform. Experience setting up and managing data lakes and developing end-to-end data pipelines. Sound understanding of AI/ML techniques, LLMs, GenAI tools, and emerging technologies in data science. Experience with modern data warehousing tools (Snowflake, BigQuery, Redshift) and orchestration (Airflow, DBT). Behavioural Competencies: Strong leadership and team management skills. Excellent communication and interpersonal skills. High level of initiative and proactive approach to problem-solving. Ability to work under pressure and manage multiple priorities. Excellent verbal and written communication skills, with the ability to present complex information to both technical and non-technical stakeholders. Strong analytical and problem-solving skills, with the ability to make data-driven decisions.
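Since the posting above names RAG explicitly, here is a deliberately toy, dependency-free sketch of the retrieve-then-augment pattern; a production system would swap in learned embeddings, a vector store, and an actual LLM call where indicated:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; real systems use learned embeddings.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical internal knowledge snippets.
documents = [
    "Reactor 3 requires weekly catalyst inspection.",
    "Batch yields drop when humidity exceeds 60 percent.",
]

def build_prompt(question: str, k: int = 1) -> str:
    q = embed(question)
    # Retrieve: rank documents by similarity to the question.
    context = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]
    # Augment: in a real system this prompt would be sent to an LLM.
    return f"Context: {' '.join(context)}\nQuestion: {question}"

print(build_prompt("Why did batch yield fall?"))
```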
Posted 2 days ago
10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Company Description SBS is a global financial technology company that helps banks and financial services to adapt in a digital world. Trusted by over 1,500 financial institutions and large-scale lenders in 80 countries, including Santander and Mercedes-Benz, SBS provides a cloud platform with a composable architecture for digitizing operations such as banking, lending, compliance, and payments. Headquartered in Paris, France, SBS employs 3,400 people across 50 offices and is recognized as a top fintech company in Europe. Senior Technical Team Leader - Business Intelligence, Data Governance & Reporting Key Responsibilities • Lead the development and execution of BI strategies, tools, and reporting solutions in alignment with business objectives. • Serve as a subject matter expert for BI within the organization, supporting internal initiatives and mentoring team members on best practices. • Design, implement, and maintain scalable data models, analytical layers, and interactive dashboards using modern BI tools (primarily Power BI). • Continuously optimize BI architecture to ensure scalability, performance, and adaptability to evolving business needs. • Apply performance optimization techniques to improve data processing, dashboard responsiveness, and user experience. • Ensure high standards of data quality, consistency, and governance across all BI solutions. • Collaborate closely with cross-functional teams including data engineers, data scientists, and business stakeholders to define and meet BI requirements. • Utilize advanced Power BI features (DAX, Power Query, Power BI Service) to build robust, automated reporting and analytical solutions. • Host workshops and office hours to guide business units on Power BI usage, self-service BI strategies, and technical troubleshooting. • Stay abreast of emerging BI tools, trends, and methodologies to drive continuous innovation and improvement. Desired Skills and Experience • Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, Mathematics, or a related field. • 10+ years of experience in Business Intelligence, including data warehousing, ETL pipelines, and reporting. • Expert-level proficiency in BI tools, particularly Power BI. Certified Power BI Data Analyst Associate (PL-300) and Certified Data Management Professional (CDMP) - DAMA. • Strong command of DAX, Power Query, and SQL for data modeling and integration, and Python for analysis. • Proficient in Agile/Scrum or traditional project management methodologies. • Foster a collaborative team culture and encourage continuous learning. • Act as a bridge between technical teams and business stakeholders. • Familiarity with modern cloud data platforms (e.g., Snowflake, Azure Synapse, etc.). • Understanding of data governance, privacy, and security best practices. • Excellent problem-solving and analytical thinking skills, with attention to detail. • Ability to translate complex technical topics into clear, business-friendly language. • Fluency in English, both written and spoken.
Posted 2 days ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride. Together with analytics team leaders you will support our business with excellent data models to uncover trends that can drive long-term business results. How You Will Contribute You will: Execute the business analytics agenda in conjunction with analytics team leaders Work with best-in-class external partners who leverage analytics tools and processes Use models/algorithms to uncover signals/patterns and trends to drive long-term business performance Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver What You Will Bring A desire to drive your future and accelerate your career and the following experience and knowledge: Using data analysis to make recommendations to analytic leaders Understanding of best-in-class analytics practices Knowledge of key performance indicators (KPIs) and scorecards Knowledge of BI tools like Tableau, Excel, Alteryx, R, Python, etc. is a plus Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride. In This Role As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices. Role & Responsibilities: Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions. Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes. Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity. Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance. Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices to remain current in the field. Technical Requirements: Programming: Python, PySpark, Go/Java Database: SQL, PL/SQL ETL & Integration: DBT, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab Initio, Fivetran. Data Warehousing: SCD, Schema Types, Data Mart. Visualization: Databricks Notebook, Power BI, Tableau, Looker. GCP Cloud Services: BigQuery, GCS, Cloud Function, PubSub, Dataflow, DataProc, Dataplex. AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis. Supporting Technologies: Graph Database/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow. Experience with the RGM.ai product would be an added advantage. Soft Skills: Problem-Solving: The ability to identify and solve complex data-related challenges. Communication: Effective communication skills to collaborate with Product Owners, analysts, and stakeholders. Analytical Thinking: The capacity to analyse data and draw meaningful insights. Attention to Detail: Meticulousness in data preparation and pipeline development. Adaptability: The ability to stay updated with emerging technologies and trends in the data engineering field.
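For context, a minimal PySpark sketch of the pipeline pattern this role describes: extract, apply basic data-quality rules, and load to curated storage. The paths, column names, and validation rules are hypothetical:

```python
# Minimal PySpark ETL sketch (hypothetical paths and schema; a real DaaS
# pipeline would add orchestration, secrets handling, and a DQ framework).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_sales_etl").getOrCreate()

# Extract: read raw CSV landed in object storage.
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/sales/")

# Transform: basic data-quality rules (dedupe, type cast, null filter).
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)

# Load: append to the curated zone in a columnar format.
clean.write.mode("append").parquet("s3://example-bucket/curated/sales/")
```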
Within Country Relocation support available and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy Business Unit Summary At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen—and happen fast. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. Job Type Regular Analytics & Modelling Analytics & Data Science
Posted 2 days ago
5.0 - 10.0 years
8 - 14 Lacs
Navi Mumbai
Work from Office
Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans. Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and comply with industry best practices. Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security. Data Integration: Define and implement data integration strategies to facilitate the seamless flow of information across the organization. Responsibilities: Experience in data architecture and engineering Proven expertise with the Snowflake data platform Strong understanding of ETL/ELT processes and data integration Experience with data modeling and data warehousing concepts Familiarity with performance tuning and optimization techniques Excellent problem-solving skills and attention to detail Strong communication and collaboration skills Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Cloud & Data Architecture: AWS, Snowflake ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions Big Data & Analytics: Athena, Presto, Hadoop Database & Storage: SQL, SnowSQL Security & Compliance: IAM, KMS, Data Masking Preferred technical and professional experience Cloud Data Warehousing: Snowflake (Data Modeling, Query Optimization) Data Transformation: DBT (Data Build Tool) for ELT pipeline management Metadata & Data Governance: Alation (Data Catalog, Lineage, Governance)
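As a small illustration of the Snowflake work described above, a hedged sketch using the official Python connector; the account, credentials, and table names are placeholders:

```python
# Minimal Snowflake query sketch with the official Python connector
# (hypothetical account locator, credentials, and table).
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",  # placeholder account locator
    user="etl_user",
    password="secret",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```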
Posted 2 days ago
5.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Description Job Title: Manager - GBS Commercial Location: Bangalore Reporting to: Senior Manager - GBS Commercial Purpose of the role This role sits at the intersection of data science and revenue growth strategy, focused on developing advanced analytical solutions to optimize pricing, trade promotions, and product mix. The candidate will lead the end-to-end design, deployment, and automation of machine learning models and statistical frameworks that support commercial decision-making, predictive scenario planning, and real-time performance tracking. By leveraging internal and external data sources—including transactional, market, and customer-level data—this role will deliver insights into price elasticity, promotional lift, channel efficiency, and category dynamics. The goal is to drive measurable improvements in gross margin, ROI on trade spend, and volume growth through data-informed strategies. Key tasks & accountabilities Design and implement price elasticity models using linear regression, log-log models, and hierarchical Bayesian frameworks to understand consumer response to pricing changes across channels and segments (a minimal log-log sketch appears at the end of this listing). Build uplift models (e.g., Causal Forests, XGBoost for treatment effect) to evaluate promotional effectiveness and isolate true incremental sales vs. base volume. Develop demand forecasting models using ARIMA, SARIMAX, and Prophet, integrating external factors such as seasonality, promotions, and competitor activity. Apply time-series clustering and k-means segmentation to group SKUs, customers, and geographies for targeted pricing and promotion strategies. Construct assortment optimization models using conjoint analysis, choice modeling, and market basket analysis to support category planning and shelf optimization. Use Monte Carlo simulations and what-if scenario modeling to assess revenue impact under varying pricing, promo, and mix conditions. Conduct hypothesis testing (t-tests, ANOVA, chi-square) to evaluate statistical significance of pricing and promotional changes. Create LTV (lifetime value) and customer churn models to prioritize trade investment decisions and drive customer retention strategies. Integrate Nielsen, IRI, and internal POS data to build unified datasets for modeling and advanced analytics in SQL, Python (pandas, statsmodels, scikit-learn), and Azure Databricks environments. Automate reporting processes and real-time dashboards for price pack architecture (PPA), promotion performance tracking, and margin simulation using advanced Excel and Python.
Lead post-event analytics using pre/post experimental designs, including difference-in-differences (DiD) methods to evaluate business interventions. Collaborate with Revenue Management, Finance, and Sales leaders to convert insights into pricing corridors, discount policies, and promotional guardrails. Translate complex statistical outputs into clear, executive-ready insights with actionable recommendations for business impact. Continuously refine model performance through feature engineering, model validation, and hyperparameter tuning to ensure accuracy and scalability. Provide mentorship to junior analysts, enhancing their skills in modeling, statistics, and commercial storytelling. Maintain documentation of model assumptions, business rules, and statistical parameters to ensure transparency and reproducibility. Other Competencies Required Presentation Skills: Effectively presenting findings and insights to stakeholders and senior leadership to drive informed decision-making. Collaboration: Working closely with cross-functional teams, including marketing, sales, and product development, to implement insights-driven strategies. Continuous Improvement: Actively seeking opportunities to enhance reporting processes and insights generation to maintain relevance and impact in a dynamic market environment. Data Scope Management: Managing the scope of data analysis, ensuring it aligns with the business objectives and insights goals. Act as a steadfast advisor to leadership, offering expert guidance on harnessing data to drive business outcomes and optimize customer experience initiatives. Serve as a catalyst for change by advocating for data-driven decision-making and cultivating a culture of continuous improvement rooted in insights gleaned from analysis. Continuously evaluate and refine reporting processes to ensure the delivery of timely, relevant, and impactful insights to leadership stakeholders while fostering an environment of ownership, collaboration, and mentorship within the team. Technical Skills - Must Have Data Manipulation & Analysis: Advanced proficiency in SQL, Python (Pandas, NumPy), and Excel for structured data processing. Data Visualization: Expertise in Power BI and Tableau for building interactive dashboards and performance tracking tools. Modeling & Analytics: Hands-on experience with regression analysis, time series forecasting, and ML models using scikit-learn or XGBoost. Data Engineering Fundamentals: Knowledge of data pipelines, ETL processes, and integration of internal/external datasets for analytical readiness. Proficient in Power BI, Advanced MS Excel (Pivots, calculated fields, conditional formatting, charts, dropdown lists, etc.), MS PowerPoint, SQL & Python. Business Environment Work closely with Zone Revenue Management teams. Work in a fast-paced environment. Provide proactive communication to the stakeholders. This is an offshore role and requires comfort with working in a virtual environment. GCC is referred to as the offshore location. The role requires working in a collaborative manner with Zone/country business heads and GCC commercial teams. Summarize insights and recommendations to be presented back to the business. Continuously improve, automate, and optimize the process. Geographical Scope: Global Qualifications, Experience, Skills Level Of Educational Attainment Required Bachelor's or Post-Graduate degree in the field of Business & Marketing, Engineering/Solution, or other equivalent degree or equivalent work experience.
Previous Work Experience 5-8 years of experience in the Retail/CPG domain. Extensive experience solving business problems using quantitative approaches. Comfort with extracting, manipulating, and analyzing complex, high-volume, high-dimensionality data from varying sources. And above all of this, an undying love for beer! We dream big to create a future with more cheer.
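The log-log elasticity sketch referenced in the listing above, using statsmodels; the data and column names are invented purely for illustration:

```python
# Minimal log-log price elasticity sketch (hypothetical data and columns).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "units_sold": [1200, 950, 1430, 800, 1100, 990],
    "price":      [2.50, 2.90, 2.20, 3.10, 2.60, 2.80],
    "on_promo":   [0, 0, 1, 0, 1, 0],
})

# In a log-log specification, the coefficient on log(price) is directly
# interpretable as the price elasticity of demand.
model = smf.ols("np.log(units_sold) ~ np.log(price) + on_promo", data=df).fit()
elasticity = model.params["np.log(price)"]
print(f"Estimated price elasticity: {elasticity:.2f}")
```

A production version would of course use real POS data, control for seasonality and distribution, and validate the model out of sample; the hierarchical Bayesian variant mentioned in the posting extends this idea by pooling elasticities across channels and segments.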
Posted 2 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About The Role We're seeking an experienced Infrastructure Engineer to join our platform team, handling massive-scale data processing and analytics infrastructure that supports over 5B events per day and more than 5M DAU. We're looking for someone who can help us scale gracefully while optimizing for performance, cost, and resiliency. Key Responsibilities Design, implement, and manage our AWS infrastructure, with a strong emphasis on automation, resiliency, and cost-efficiency. Develop and oversee scalable data pipelines (for event processing, transformation, and delivery). Implement and manage stream processing frameworks (such as Kinesis, Kafka, or MSK). Handle orchestration and ETL workloads, employing services like AWS Glue, Athena, Databricks, Redshift, or Apache Airflow. Implement robust network, storage, and backup strategies for growing workloads. Monitor, debug, and resolve production issues related to data and infrastructure in real time. Implement IAM controls, logging, alerts, and security best practices across all components. Provide deployment automation (Docker, Terraform, CloudFormation) and collaborate with application engineers to enable smooth delivery. Build SOPs for support and set up a functioning 24x7 support system (including hiring the right engineers) to ensure system uptime and availability Required Technical Skills 5+ years of experience with AWS services (VPC, EC2, S3, Security Groups, RDS, Kinesis, MSK, Redshift, Glue). Experience designing and managing large-scale data pipelines with high-throughput workloads. Ability to handle 5 billion events/day and 1M+ concurrent users’ workloads gracefully. Familiar with scripting (Python, Terraform) and automation practices (Infrastructure as Code). Familiar with network fundamentals, Linux, scaling strategies, and backup routines. Collaborative team player — able to work with engineers, data analysts, and stakeholders. Preferred Tools & Technologies AWS: EC2, S3, VPC, Security Groups, RDS, Redshift, DocumentDB, MSK, Glue, Athena, CloudWatch Infrastructure as Code: Terraform, CloudFormation Scripted automation: Python, Bash Container orchestration: Docker, ECS or EKS Workflow orchestration: Apache Airflow, Dagster Streaming framework: Apache Kafka, Kinesis, Flink Other: Linux, Git, Security best practices (IAM, Security Groups, ACM) Education Bachelor's/Master's degree in Computer Science, Data Science, or related field Relevant professional certifications in cloud platforms or data technologies Why Join Us? Opportunity to work in a fast-growing audio and content platform. Exposure to multi-language marketing and global user base strategies. A collaborative work environment with a data-driven and innovative approach. Competitive salary and growth opportunities in marketing and growth strategy. Success Metrics ✅ Scalability: Ability to handle 1+ billion events/day with low latency and high resiliency. ✅ Cost-efficiency: Reduction in AWS operational costs by optimizing services, storage, and data transfer. ✅ Uptime/SLI: Achieve 99.9999% platform and pipeline uptimes with automated fallback mechanisms. ✅ Data delivery latency: Reduce event delivery latency to under 5 minutes for real-time processing. ✅ Security and compliance: Implement controls to pass PCI-DSS or SOC 2 audits with zero major findings. ✅ Developer productivity: Improve team delivery speed through self-service IaC modules and automated routines.
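A minimal Airflow sketch of the kind of orchestration this role covers; the DAG name, task logic, and storage details are hypothetical:

```python
# Minimal Airflow DAG sketch for a daily event-aggregation job
# (hypothetical task names; connections and storage are placeholders).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def aggregate_events(**context):
    # Placeholder: read raw events (e.g., from S3), aggregate, load to Redshift.
    print("Aggregating events for", context["ds"])

with DAG(
    dag_id="daily_event_aggregation",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    PythonOperator(task_id="aggregate_events", python_callable=aggregate_events)
```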
About KUKU Founded in 2018, KUKU is India’s leading storytelling platform, offering a vast digital library of audio stories, short courses, and microdramas. KUKU aims to be India’s largest cultural exporter of stories, culture and history to the world with a firm belief in “Create In India, Create For The World”. We deliver immersive entertainment and education through our OTT platforms: Kuku FM, Guru, Kuku TV, and more. With a mission to provide high-quality, personalized stories across genres, formats, and languages, KUKU continues to push boundaries and redefine India’s entertainment industry. 🌐 Website: www.kukufm.com 📱 Android App: Google Play 📱 iOS App: App Store 🔗 LinkedIn: KUKU 📢 Ready to make an impact? Apply now Skills: AWS services, Bash, networking, Kafka, data pipelines, Docker, Kinesis, ETL, Terraform, automation, AWS, security, EC2, CloudFormation, cloud, scripting, Linux, infrastructure, Amazon Redshift, Python, VPC, network fundamentals, workflow orchestration, stream processing frameworks, container orchestration, Dagster, Airflow, S3
Posted 2 days ago
3.0 - 6.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About The Role Grade Level (for internal use): 09 The Role: As a Software Developer with the Data & Research Development team, you will be responsible for developing & providing backend support across a variety of products within the Market Intelligence platform. Together, you will build scalable and robust solutions using AGILE development methodologies with a focus on high availability to end users. The Team: Do you love to collaborate & provide solutions? This team comes together across eight different locations every single day to craft enterprise grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams. The Impact: We focus primarily on developing, enhancing and delivering required pieces of information & functionality to internal & external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact. What’s in it for you? Opportunities for innovation and learning new state-of-the-art technologies To work in pure agile & scrum methodology Responsibilities Deliver solutions within a multi-functional Agile team Develop expertise in our proprietary enterprise software products Set and maintain a level of excitement in using various technologies to develop, support, and iteratively deploy real enterprise level software Achieve an understanding of customer environments and their use of the products Build solutions architecture, algorithms, and designs for solutions that scale to the customer's enterprise/global requirements Apply software engineering practices and implement automation across all elements of solution delivery Basic Qualifications What we’re looking for: 3-6 years of desktop application development experience with a deep understanding of Design Patterns & Object-oriented programming. Hands-on development experience using C#, .NET 4.0/4.5, WPF, ASP.NET, SQL Server. Strong OOP and Service Oriented Architecture (SOA) knowledge. Strong understanding of cloud applications (containers, Docker, etc.) and exposure to data ETL will be a plus. Ability to resolve serious performance related issues through various techniques, including testing, debugging and profiling. Strong problem solving, analytical and communication skills. Possess a true “roll up the sleeves and get it done” working approach; demonstrated success as a problem solver, operating as a client-focused self-starter. Preferred Qualifications Bachelor's degree in computer science or computer engineering About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow.
At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 313152 Posted On: 2025-05-05 Location: Hyderabad, Telangana, India
Posted 2 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
The ideal candidate will be responsible for designing, developing, and deploying scalable ETL processes using Informatica PowerCenter to support our data warehousing and analytics initiatives. You will collaborate with business and technical stakeholders to ensure high data quality, availability, and performance. Key Responsibilities: Design, develop, and maintain ETL workflows and mappings using Informatica PowerCenter or Informatica Intelligent Cloud Services (IICS). Extract, transform, and load data from various source systems (e.g., SQL Server, Oracle, flat files, cloud APIs) into data warehouses or operational data stores. Optimize ETL performance, conduct tuning, and ensure error handling and logging. Collaborate with data architects and analysts to understand data requirements and deliver high-quality data solutions. Work with QA teams to support data validation and testing efforts. Support data integration, migration, and transformation initiatives. Document ETL processes, data flows, and job schedules. Monitor daily ETL jobs and resolve production issues in a timely manner. Requirements Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent work experience). 3+ years of experience with Informatica PowerCenter or Informatica IICS. Strong SQL skills and experience with relational databases (e.g., Oracle, SQL Server, PostgreSQL). Solid understanding of data warehousing concepts and dimensional modeling. Experience in performance tuning and troubleshooting ETL processes. Hands-on experience with job scheduling tools (e.g., Autosys, Control-M, Tidal). Familiarity with version control systems and DevOps practices. Preferred Qualifications: Experience with cloud data platforms (e.g., Snowflake, AWS Redshift, Azure Synapse). Exposure to data governance and data quality tools. Knowledge of scripting languages (e.g., Shell, Python). Experience working in Agile/Scrum environments. Familiarity with BI tools (e.g., Tableau, Power BI) is a plus. Benefits This position comes with a competitive compensation and benefits package: Competitive salary and performance-based bonuses Comprehensive benefits package Home Office model Career development and training opportunities Flexible work arrangements (remote and/or office-based) Dynamic and inclusive work culture within a globally known group Private Health Insurance Pension Plan Paid Time Off Training & Development *Note: Benefits differ based on employee level
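To illustrate the error-handling and logging expectations above in plain Python (the posting itself centers on Informatica tooling), a hedged sketch with hypothetical file, DSN, and table names:

```python
# Minimal ETL sketch with logging and error handling
# (hypothetical names; assumes a reachable PostgreSQL instance).
import logging
import pandas as pd
from sqlalchemy import create_engine

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def run_etl(src_csv: str, dsn: str, target_table: str) -> None:
    try:
        df = pd.read_csv(src_csv)                          # extract
        df["loaded_at"] = pd.Timestamp.now(tz="UTC")       # transform: audit column
        engine = create_engine(dsn)
        df.to_sql(target_table, engine, if_exists="append", index=False)  # load
        log.info("Loaded %d rows into %s", len(df), target_table)
    except Exception:
        log.exception("ETL run failed")   # full traceback for production triage
        raise

# run_etl("orders.csv", "postgresql://user:pass@host/db", "stg_orders")
```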
Posted 2 days ago
1.0 - 2.6 years
0 Lacs
Hyderabad, Telangana, India
Remote
Summary Position Summary ServiceNow Configurator/Developer (Analyst) – Deloitte Support Services India Private Limited Solutions Delivery-Canada is an integral part of the Information Technology Services group. The principal focus of this organization is the development and maintenance of technology solutions that e-enable the delivery of Function and Marketplace Services and Management Information Systems. Solutions Delivery Canada develops and maintains solutions built on varied technologies like Salesforce, Microsoft technologies, SAP, Hadoop, ETL, BI, ServiceNow, Power Automate, OpenText. Solutions Delivery Canada has various groups which provide best-of-breed solutions to clients by following a streamlined system development methodology. Solutions Delivery Canada comprises groups like Usability, Application Architecture, Development and Quality Assurance and Performance. Work you’ll do Create, configure, and customize ServiceNow applications for new and existing implementations. Create and configure functional data such as Notifications and Service Level Agreements. Create and configure script objects such as Business Rules, Script Includes, UI Policies and Actions, Client Scripts, ACLs. Set up interfaces between ServiceNow and other platforms in line with integration opportunities identified by Solution Architects. Perform system and integration testing. Recommend Administration settings and best practices. Create documentation of the developments, unit test cases and implementation plans. Work effectively in diverse teams within an inclusive team culture where people are recognized for their contribution. Responsibilities Strategic Strong technical skills regarding technical topics and remote collaboration skills are critical to this role. Demonstrates an ability to deliver on project commitments. Produces work that consistently meets quality standards. Must have hands-on experience in the ITSM & ITBM modules of ServiceNow. Must have knowledge of UI Builder and Workspace configuration in ServiceNow. Should have hands-on experience in Business Rules, Script Includes, ACLs, and all server-side scripting in best practice. Knowledge of domain separation in ServiceNow is an add-on. Operational Design, Development and Implementation of ServiceNow customization including, but not limited to, core setup, workflow administration, reporting, data imports, custom scripting, and third-party software integrations. Should have a good understanding of Agile/SAFe Methodologies. Perform advanced customizations including Business Rules, UI Pages, UI Macros, UI Scripts, Script Includes, Client Scripts, workflows, custom tables, reports, etc. Perform workflow design, configuration, development, and data loads for ServiceNow platform applications. Responsible for programming workflow, enhancements, and integrations with ServiceNow platform applications. Should have REST/SOAP Web Services integration experience (a minimal REST example appears at the end of this listing). Good to have knowledge of the following ServiceNow applications: Discovery – On-Premises & Off-Premises ServiceNow Orchestration ITOM - IT Operations Management SPM/ITBM – IT Business Management HRSD (HR Service Delivery Fundamentals) ServiceNow Event Management Integration and ServiceNow Scripting (Glide scripting: JavaScript, AJAX, XML, JSON, HTML, and CSS) Maintain pace with ServiceNow versioning. Perform upgrades and customizations of ServiceNow platform applications based on guidance from project manager, architects, ITIL practice leads and customers.
Maintain and adhere to source code, configuration management, release management and software development best practices. Develop training materials and provide end-user or IT technician training on using the ServiceNow functionality. Provide in-person support daily to the customer and team. This will include direct interaction with the Executive staff and other key management. Maintain ServiceNow training and knowledge through self-learning, attending conferences and training. Responsible for proactive problem and risk management. Triage and fix defects found in the ServiceNow platform, applications, and workflows. Define and validate non-functional (technical) requirements and establish traceability between requirements and application architecture/design. End-to-end ownership of solutioning for current & new opportunities (from requirement analysis to proposal delivery). Work with SMEs, Leads, Managers, Resources & the Project/Delivery Manager (in case of specific inputs for the solution) on finalizing the solution and estimates. Work with project/delivery managers to build POCs (proofs of concept), prototypes and sample development. Work with project/delivery managers to devise the timeline/schedule for executing the project. Work as a bridge between the Client & Delivery team during the transition of won opportunities, and support the delivery team in the initial stages of the Discovery Phase, including discovery agenda finalization, facilitation material preparation, dry runs and the actual engagement. Timely & quality delivery of opportunities. Should have a good understanding of, and stay up to date on, the latest ServiceNow releases, features and issues. Should always align to best practices and strive toward innovative solutions. Should have a nuanced understanding of ITIL processes and be able to relate them to stakeholder requirements. Experience: 1-2.6 Years Work location: Hyderabad Shift Timings: 11 am - 8 pm Key Technical Skills, Experience and Knowledge At least 2-4 years of ServiceNow experience, including custom development and configuration. ServiceNow scripting experience using JavaScript, HTML, CSS, XML and REST/SOAP Web Services. Understanding and experience of Business Rules, Script Includes, UI Actions and all scripted aspects of ServiceNow. Customize ServiceNow UI and Service Portal through use of UI pages, CMS, CSS and Service Portal widgets. Strong knowledge of integrations and migrations. Deep understanding of ITIL. Strong understanding of ServiceNow administration settings. Deep functional and technical knowledge of the ServiceNow platform as well as experience delivering medium to large-scale ServiceNow implementations. Performs well in an agile environment with constant feedback and interaction with the team. Ability to accurately estimate level of effort/duration on projects and tasks. A positive attitude and perseverance required to troubleshoot/resolve complex technical issues whilst balancing multiple priorities. Demonstrated ability to troubleshoot technical issues.
Strong knowledge of the application development life cycle Executes design activities leveraging knowledge of all application design techniques; Ensures design is consistent with solution architecture; Ensures adherence to design standards; Performs technology proofs-of-concept to support design approaches Execute construction of solution that leverages knowledge of designated programming language(s) and ensures consistency with proposed design approach; Initiates peer reviews of system code; Establishes standards and leading practices Experience working with geographically distributed and culturally diverse work groups Strong written and verbal communication skills with the ability to present to IT and business leaders Demonstrated ability to stay current with development best practices, existing and emerging technology platforms, and industry trends Experience with formal software development methodologies, with a focus on Agile Certifications ServiceNow Certified Systems Administrator is a must. ServiceNow Implementation Specialist & CAD is a great bonus. Essential Competencies High degree of technical expertise in relevant areas Team Orientation and Team Lead Motivated team player willing to learn from others Analytical, logical, thorough and methodical Problem management skills Able to work without supervision using their initiative to be creative in solution design Excellent interpersonal manner, communication skills & customer focussed Education/Other: Bachelor’s Degree Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Professional development From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career. Requisition code: 302821
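The REST example referenced in the listing above: querying the ServiceNow Table API from Python. The instance URL and credentials are placeholders, not values from the posting:

```python
# Minimal sketch: query a ServiceNow table via the REST Table API
# (hypothetical instance and credentials; requires the `requests` package).
import requests

INSTANCE = "https://example.service-now.com"   # placeholder instance URL
AUTH = ("api_user", "api_password")            # placeholder credentials

resp = requests.get(
    f"{INSTANCE}/api/now/table/incident",
    params={"sysparm_query": "active=true", "sysparm_limit": 5},
    auth=AUTH,
    headers={"Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()
for record in resp.json()["result"]:
    print(record["number"], record.get("short_description", ""))
```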
Posted 2 days ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Qualcomm India Private Limited Job Area Information Technology Group, Information Technology Group > Systems Analysis General Summary We are seeking a Systems Analyst, Senior to join our growing organization with specialized skills in IBM Planning Analytics/TM1 and a functional understanding of Finance budgeting and forecasting. This role involves advanced development, troubleshooting, and implementation of TM1 solutions to meet complex business requirements. The person will be part of the Finance Planning and Reporting team, will work closely with his/her manager, and will help deliver the TM1 planning and budgeting roadmap for global stakeholders. Key Responsibilities Able to design and develop IBM Planning Analytics (TM1) solutions as per standards. Able to write logical, complex, concise, efficient, and well-documented code for both TM1 rules and Turbo Integrator processes. Good to have knowledge of Python and TM1py libraries (a minimal TM1py sketch appears at the end of this listing). Able to write business requirement specifications, define level of effort for projects/enhancements, and design and coordinate system tests to ensure solutions meet business requirements. SQL skills to be able to work with source data and understand source data structures. Good understanding of SQL and the ability to write complex queries. Understanding of cloud technologies, especially AWS and Databricks, will be an added advantage. Experience in client reporting and dashboard tools like Tableau, PA Web, PAFE. Understanding of ETL processes and data manipulation Working independently with little supervision Taking responsibility for own work and making decisions that are moderate in impact; errors may have financial impact or effect on projects, operations, or customer relationships; errors may require involvement beyond immediate work group to correct. Should provide ongoing system support, including troubleshooting and resolving issues to ensure optimal system performance and reliability Using verbal and written communication skills to convey information that may be complex to others who may have limited knowledge of the subject in question Using deductive and inductive problem solving; multiple approaches may be taken/necessary to solve the problem; often information is missing or incomplete; intermediate data analysis/interpretation skills may be required. Exercising substantial creativity to innovate new processes, procedures, or work products within guidelines or to achieve established objectives. Minimum Qualifications 3+ years of IT-relevant work experience with a Bachelor's degree. OR 5+ years of IT-relevant work experience without a Bachelor’s degree. Qualifications The ideal candidate will have 8-10 years of experience in designing, modeling, and developing enterprise performance management (EPM) applications using IBM Planning Analytics (TM1). Able to design and develop IBM Planning Analytics (TM1) solutions as per standards. Able to write logical, complex, concise, efficient, and well-documented code for both TM1 rules and Turbo Integrator processes. Lead the design, modeling, and development of TM1 applications, including TI scripting, MDX, rules, feeders, and performance tuning. Should be able to provide technical expertise in identifying, evaluating, and developing systems and procedures that are efficient, cost-effective and meet user requirements.
Plans and executes unit, integration and acceptance testing. Must be a good team player who can work seamlessly with Global teams and Data teams. Excellent communication and collaboration skills to work with business stakeholders. Has a functional understanding of Finance budgeting and forecasting. Understanding of cloud technologies, especially AWS and Databricks, will be an added advantage. Experience in Agile methodologies and JIRA user stories. Able to design and develop solutions using Python as per standards. Requires a bachelor's or master's degree in information science, computer science, or business, or equivalent work experience. Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries). Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law. To all Staffing and Recruiting Agencies: Our Careers Site is only for individuals seeking a job at Qualcomm. Staffing and recruiting agencies and individuals being represented by an agency are not authorized to use this site or to submit profiles, applications or resumes, and any such submissions will be considered unsolicited. Qualcomm does not accept unsolicited resumes or applications from agencies. Please do not forward resumes to our jobs alias, Qualcomm employees or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers. 3076094
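The TM1py sketch referenced in the listing above; the instance details, cube, and dimension names are hypothetical placeholders:

```python
# Minimal TM1py sketch (hypothetical instance details and cube/dimension names).
from TM1py import TM1Service

with TM1Service(address="tm1.example.com", port=8010, user="admin",
                password="secret", ssl=True) as tm1:
    # Write a single budget value to a (hypothetical) Budget cube.
    tm1.cubes.cells.write_value(150000, "Budget", ("FY2025", "Sales", "Amount"))

    # Read values back with MDX.
    mdx = """
    SELECT {[Measure].[Amount]} ON COLUMNS,
           {[Department].[Sales]} ON ROWS
    FROM [Budget] WHERE ([Year].[FY2025])
    """
    for coordinates, cell in tm1.cubes.cells.execute_mdx(mdx).items():
        print(coordinates, cell["Value"])
```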
Posted 2 days ago
0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Our Client is one of the United States' largest insurers, providing a wide range of insurance and financial services products with gross written premiums well over US$25 Billion (P&C). They proudly serve more than 10 million U.S. households with more than 19 million individual policies across all 50 states through the efforts of over 48,000 exclusive and independent agents and nearly 18,500 employees. Finally, our client is part of one of the largest Insurance Groups in the world. Role Overview The purpose of this role is to ensure smooth operations of our production data assets. Activities will include monitoring production systems for incident occurrence, alerting applicable parties when incidents arise, and incident triage and management. They will also carry out activities to prevent production incidents. The Data Production Support Analyst plays a crucial role in ensuring the smooth operation of our production data assets and overall operational efficiency. They ensure the reliability and accuracy of our data production processes. This role requires a blend of technical expertise, data acumen, problem-solving skills, the ability to work under pressure and the ability to work collaboratively with various teams. Responsibilities Works with the off-shore Application Operations team Administers, analyzes, and prioritizes systems issues and negotiates a course of action for resolution Supports workflow and solutions; troubleshoots user errors and supports reporting capabilities Utilizes system monitoring utilities to monitor system availability Extracts and compiles system monitoring data to create availability scorecards and reports System Monitoring: Continuously monitor IT systems to ensure optimal performance and availability, identifying and addressing potential issues before they escalate Monitoring and Maintenance: Regularly monitor production data assets to ensure they are functioning correctly and efficiently (a minimal freshness-check sketch appears at the end of this listing). Alerting applicable parties if an issue arises in production Issue Resolution: Work with the data team to identify, diagnose, and resolve technical issues related to production data assets. Work with relevant teams to implement effective solutions Incident Management: Manage and prioritize incidents, ensuring that they are resolved promptly and efficiently and follow the incident management process. Document incidents and resolutions for future reference Incident Management: Respond to and resolve technical issues reported by users or automated monitoring alerts. This includes diagnosing problems, identifying solutions, and implementing fixes Problem Analysis: Analyze recurring issues to identify root causes and implement long-term solutions to prevent future occurrences Root Cause Analysis: Conduct thorough investigations to determine the underlying causes of recurring incidents and implement preventive measures Preventative Measures: Identify incidents that recur and put solutions in place to prevent recurrence Data Integrity: Work with the data team to ensure the accuracy and integrity of data produced and provided to the business; work with the data teams to implement and maintain quality control measures to prevent errors Documentation: Maintain comprehensive documentation of processes, system configurations, and troubleshooting procedures. Ensure documentation is created and owned, whether by the data team or the production support team Support: Provide support to data teams, data users and stakeholders.
Respond to inquiries and assist with requests as applicable Optimization: Identify opportunities to optimize data production processes and implement improvements to enhance efficiency Performance Optimization: Analyze system performance and identify areas for improvement. Suggest and implement changes to enhance system efficiency and reliability. Requirements Qualifications/Skills Education: A bachelor's degree in computer science, information technology, or a related field is preferred Experience: Proven experience in data production support or a similar role. Familiarity with data production tools and technologies Technical Expertise: Strong knowledge of IT systems, applications, and troubleshooting techniques. Proficiency in relevant software and tools Technical Skills: Strong knowledge of database management, data warehousing, and ETL processes. Proficiency in programming languages such as SQL, Python, or Java Problem-Solving: Excellent analytical and problem-solving skills. Ability to diagnose and resolve technical issues efficiently Communication: Strong written and verbal communication skills. Ability to explain technical concepts to non-technical stakeholders Attention to Detail: High level of attention to detail and commitment to data accuracy; precision in monitoring systems and documenting incidents and solutions Team Player: Ability to work collaboratively in a team environment and build positive relationships with colleagues and stakeholders. Willingness to share knowledge and assist others Time Management: Strong organizational skills and the ability to manage multiple tasks and priorities effectively Adaptability: Flexibility to manage changing priorities and handle multiple tasks simultaneously Benefits This position comes with a competitive compensation and benefits package: Competitive salary and performance-based bonuses Comprehensive benefits package Home Office model Career development and training opportunities Flexible work arrangements (remote and/or office-based) Dynamic and inclusive work culture within a globally known group Private Health Insurance Pension Plan Paid Time Off Training & Development *Note: Benefits differ based on employee level
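The freshness-check sketch referenced in the listing above: a small proactive check that alerts when a table's last load is too old. The DSN, webhook, and table name are placeholders:

```python
# Minimal data-freshness monitoring sketch (hypothetical DSN, webhook, and
# table; a real setup would plug into the team's monitoring/alerting stack).
import datetime as dt
import requests
from sqlalchemy import create_engine, text

ENGINE = create_engine("postgresql://user:pass@host/warehouse")  # placeholder DSN
WEBHOOK = "https://hooks.example.com/alerts"                     # placeholder hook

def check_freshness(table: str, max_lag_minutes: int = 60) -> None:
    """Alert if the table's most recent load is older than the threshold."""
    with ENGINE.connect() as conn:
        last_load = conn.execute(text(f"SELECT MAX(loaded_at) FROM {table}")).scalar()
    stale = last_load is None or (
        dt.datetime.utcnow() - last_load > dt.timedelta(minutes=max_lag_minutes)
    )
    if stale:
        requests.post(WEBHOOK, timeout=10,
                      json={"text": f"{table} looks stale (last load: {last_load})"})

# check_freshness("stg_policies")
```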
Posted 2 days ago
5.0 - 7.0 years
18 - 20 Lacs
Mumbai
Work from Office
Work Timing: Standard IST. 6-month contract. Experience in batch/real-time integrations with ODI 11g, customizing knowledge modules, and design/development expertise. Skilled in ETL processes, PL/SQL, and building Interfaces, Packages, Load Plans & Sequences in ODI. Required Candidate profile: Experience with the ODI Master & Work Repositories, data modeling & ETL design, multi-system integration, error handling, automation & object migration in ODI. Performance tuning, unit testing & debugging of mappings. Perks and benefits: MNC
Posted 2 days ago
The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.
Major tech hubs such as Bengaluru, Hyderabad, Pune, Chennai, and the Delhi NCR are known for their thriving tech industries and often have a high demand for ETL professionals.
The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.
In the ETL field, a typical career path may include roles such as:
- Junior ETL Developer
- ETL Developer
- Senior ETL Developer
- ETL Tech Lead
- ETL Architect
As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.
Alongside ETL, professionals in this field are often expected to have skills in:
- SQL
- Data Warehousing
- Data Modeling
- ETL Tools (e.g., Informatica, Talend)
- Database Management Systems (e.g., Oracle, SQL Server)
Having a strong foundation in these related skills can enhance your capabilities as an ETL professional; the short sketch below shows how SQL and Python typically combine in day-to-day ETL work.
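As an illustration only (not tied to any specific posting above), here is a minimal sketch of a routine ETL task that combines SQL and Python; the connection strings, table, and column names are hypothetical:

```python
# Illustrative only: stage rows from a source database into a warehouse table,
# deduplicating on a business key (hypothetical names throughout).
import pandas as pd
from sqlalchemy import create_engine

source = create_engine("oracle+oracledb://user:pass@src-host/ORCL")    # placeholder
warehouse = create_engine("postgresql://user:pass@wh-host/analytics")  # placeholder

# Extract with SQL, transform with pandas, load with to_sql.
df = pd.read_sql("SELECT customer_id, name, updated_at FROM customers", source)
df = df.sort_values("updated_at").drop_duplicates("customer_id", keep="last")
df.to_sql("dim_customer_stage", warehouse, if_exists="replace", index=False)
print(f"Staged {len(df)} unique customers")
```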
Here are 25 interview questions that you may encounter in ETL job interviews:
As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!