3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
The Data Quality Analyst will collaborate with business stakeholders, Data Science, and wider data teams to enhance data quality throughout the organization and ensure data credibility in its usage. You will be responsible for developing a robust framework for data quality to uphold data integrity for regulatory and strategic needs. You will identify and address potential data quality issues at all stages of the data lifecycle and monitor data quality performance using tools and processes to maintain the highest standards.

In this role, you will work in close coordination with data stewards to resolve data integrity issues and guarantee the delivery of high-quality data. Additionally, you will collaborate closely with the data platform team and stakeholders to contribute to the implementation of the data quality framework and roadmap. It is essential to align data quality initiatives with the overall data governance strategies.

As a Data Quality Analyst, you will perform detailed root cause analysis of data issues and provide recommendations for preventing future defects. You will propose enhancements to streamline processes and enhance data management. You will also be responsible for implementing data quality rules in data quality tools to ensure compliance with enterprise data quality standards and requirements. Furthermore, you will advocate for high-quality data, ensuring that valuable data is governed, compliant, and delivers optimal value by identifying and resolving issues. You will also play a key role in contributing to data management KPI reporting by maintaining data quality scores.
Posted 3 days ago
3.0 - 6.0 years
7 - 10 Lacs
Hyderabad
Remote
Job Type: C2H (Contract to Hire)

As a Data Engineer, you will work in a diverse, innovative team, responsible for designing, building, and optimizing the data infrastructure and pipelines for our new healthcare company's data platform. You'll architect and construct our core data backbone on a modern cloud stack, enabling the entire organization to turn complex data into life-saving insights. In this role, you will have the opportunity to solve challenging technical problems, mentor team members, and collaborate with innovative people to build a scalable, reliable, and world-class data ecosystem from the ground up.

Core Responsibilities (essential job duties and responsibilities):
- Design, develop, and maintain data replication streams and data flows to bring data from various SAP and non-SAP sources into Snowflake (a minimal sketch follows below)
- Implement curated datasets on a modern data warehouse and data hub
- Interface directly with business and systems subject matter experts to understand analytic needs and determine logical data model requirements
- Work closely with data architects and senior analysts to identify common data requirements and develop shared solutions
- Support data integration engineers and data modelers
- Support and maintain data warehouse, ETL, and analytic platforms

Required Skills and Experience:
- Data warehouse and ETL background
- Advanced SQL programming capabilities
- Background in preparing data for analysis and reporting
- Familiarity with data governance principles and tools
- Success in a highly dynamic environment, with the ability to shift priorities with agility
- Ability to go from whiteboard discussion to code
- Willingness to explore and implement new ideas and technologies
- Ability to effectively communicate with technical and non-technical audiences
- Ability to work independently with minimal supervision

Minimum Qualifications:
- 4+ years of experience with SQL; Snowflake strongly preferred
- 3+ years of experience with SAP Datasphere
- 2+ years of experience working directly with subject matter experts in both business and technology domains
- 2+ years of experience with ERP data, preferably SAP S/4, MS Dynamics, and/or BPCS
- 1+ year of experience with Salesforce, Workday, Concur, or any other enterprise application

Nice-to-have:
- Experience with machine learning tools and processes
- Hands-on experience with Python
- Experience with Infrastructure as Code (IaC) principles and tools (e.g., Terraform, CloudFormation)

Education: Bachelor's in Computer Science, Information Systems, Engineering, a science discipline, or similar.
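Purely as a hedged illustration of the first responsibility above, here is a minimal sketch of publishing a curated dataset in Snowflake from Python using the snowflake-connector-python package. Every identifier here (account, credentials, RAW.SAP_VBAK, CURATED.SALES_ORDERS, the SAP-style column names) is a hypothetical placeholder, not something specified in the posting.

```python
# pip install snowflake-connector-python
import snowflake.connector

# Assumption: placeholder connection details, not real credentials.
conn = snowflake.connector.connect(
    account="your_account",
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
)

# Build a curated dataset from a hypothetical raw SAP replication table.
# VBELN/ERDAT/NETWR mirror SAP VBAK field names, used here only as examples.
curate_sql = """
CREATE OR REPLACE TABLE CURATED.SALES_ORDERS AS
SELECT
    VBELN AS order_id,
    ERDAT AS created_date,
    NETWR AS net_value
FROM RAW.SAP_VBAK
WHERE ERDAT >= DATEADD(year, -2, CURRENT_DATE())
"""

cur = conn.cursor()
try:
    cur.execute(curate_sql)
    cur.execute("SELECT COUNT(*) FROM CURATED.SALES_ORDERS")
    print("curated rows:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```

In practice the replication stream itself (SAP to Snowflake) would be handled by a dedicated tool, with SQL like the above maintaining the curated layer on top.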
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Business Requirements Analyst at LSEG, you will play a crucial role in the development and communication of Business Requirements for the Data Platform. Your responsibilities will include collaborating with Product Managers to create detailed maps of platform use cases, defining functional taxonomy, workflows, object definitions, and states, as well as specifying volumetric and non-functional requirements. You will work closely with programme collaborators to establish clear business success criteria and maintain a traceability model to ensure regulatory compliance. Throughout the delivery lifecycle, you will support design decisions, development, and testing activities, acting as a proxy-product owner when necessary. Building trust with product management and delivery Squads will be key to enabling flawless delivery.

To excel in this role, you should have industry experience in developing business requirements for major transformations in financial services. Proficiency in standard Business Analysis methods and tools such as use case mapping, requirements gathering, data modeling, and user story mapping is essential. Your ability to thrive in a technically sophisticated and evolving environment, respond calmly to changing requirements, and exhibit strong planning and organization skills will be critical. Excellent communication skills, both written and verbal, are necessary for presenting and explaining issues logically. Furthermore, you should be a dedicated team player capable of working independently and making clear decisions in complex situations.

Desired skills for this role include meticulous attention to detail, determination to focus on key business outcomes, experience in an agile DevOps environment, technical business analysis experience in the Financial Services industry, and familiarity with Microsoft cloud products and services. Prior experience in a large consulting firm is preferable.

At LSEG, we are committed to fostering a diverse and inclusive organization that values individuality and encourages new ideas. Joining us means being part of a global team of 25,000 people across 70 countries, where your unique perspective will be welcomed. We strive to create a collaborative and creative culture focused on sustainability and driving economic growth. Additionally, we offer a range of benefits and support, including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives.

As a Recruitment Agency Partner, it is your responsibility to ensure that candidates applying to LSEG are aware of our privacy notice, which outlines how we handle personal information and your rights as a data subject.
Posted 4 days ago
8.0 - 13.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Technical Skills:
- Deep knowledge of cloud technologies (AWS, Azure) and cloud-native architectures
- Strong expertise in system integration patterns, microservices architecture, RESTful APIs, and the functioning of middleware technologies
- Experience with Business Intelligence (BI) tools and data platforms (e.g., Power BI, Data Fabric, Talend ETL)
- Knowledge of enterprise software, databases, and data modeling (SQL, NoSQL)
- Proficiency in development frameworks and programming languages such as PHP, React, HTML, CSS, and JavaScript
- Exposure to programming languages relevant to BI applications desired (e.g., Python, R, or .NET)
- Experience developing and deploying solutions in AWS Cloud is preferred

Design and Architecture:
- Analytical mindset with the ability to analyze complex issues and provide effective solutions
- Strong understanding of system design patterns, enterprise architecture frameworks (e.g., TOGAF), and architectural principles

Experience & Education:
- Bachelor's degree in Computer Science, Engineering, or a related field; a Master's degree is a plus
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Are you someone with a relentless drive for perfection, always seeking to make things better? If so, you'll find a kindred spirit in Ford Quality. We're passionate about continuous improvement, constantly striving to deliver the highest quality products and services our customers deserve. Join us and become a key player in driving operational excellence. You'll contribute to innovative, proprietary initiatives like our Global Product Development System, Quality Operating System, and New Model Launch processes. This role offers fantastic cross-functional exposure, as you'll collaborate closely with integrated teams across Manufacturing, Product Development, Purchasing, Marketing, Sales, and Service.

In this exciting role, you'll be at the heart of our data-driven decision-making, analyzing vast amounts of data to pinpoint opportunities for improvement. Your insights will directly enhance quality performance and elevate the customer experience with our products. We truly believe that data holds immense power to help us create exceptional products and experiences that delight our customers. By providing actionable, persistent insights from a high-quality data platform, you'll empower our business and engineering teams to make even more impactful decisions.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As the Senior Product Manager for Platform at Enterpret, you will play a crucial role in defining and executing the strategic vision, product strategy, and roadmap for the core platform. This platform serves as the foundation of Enterpret's product, consolidating customer feedback from various sources and transforming it into valuable insights through the Knowledge Graph infrastructure and Adaptive Taxonomy engine, among others.

Your key responsibilities will include driving the strategy by leading the product vision, roadmap, and overall platform development. You will be responsible for delivering a robust and performant platform that provides near real-time, high-quality, predictive insights to customers while ensuring developer productivity and customer satisfaction at scale. Collaboration with engineering and product leadership is essential to make architectural decisions that enhance performance, scalability, reliability, security, and cost efficiency. You will also work cross-functionally to understand platform needs across different product teams, align on roadmap dependencies, and ensure the platform continues to support and accelerate overall product development.

Translating complex technical concepts into clear product requirements and owning key success metrics such as latency, scalability, reliability, cost, and internal developer velocity will be part of your role. Additionally, you will invest in improving developer experience through observability, documentation, and tooling to facilitate faster and higher-quality development by Enterpret teams. As a champion of platform-as-a-product, you will promote the platform's capabilities internally and externally, ensuring that shared services are well-understood, adopted, and designed with a customer-centric and metrics-driven approach. Your role will be instrumental in driving Enterpret's platform to new heights and maintaining its position as a key asset in delivering trusted insights to customers.
Posted 1 week ago
10.0 - 20.0 years
25 - 40 Lacs
Noida, Hyderabad/Secunderabad, Bangalore/Bengaluru
Hybrid
Dear candidate,

We found your profile suitable for our current opening. Please go through the JD below for a better understanding of the role.

Job Description:
Role: Technical Architect / Senior TA
Exp: 10 - 15 years
Employment Type: Full-time
Mode of work: Hybrid Model (3 days WFO)
Work Location: Hyderabad/Bangalore/Noida/Pune/Kolkata

Role Overview: We are looking for a skilled Business Analyst with strong domain experience in Real Estate Investment Trusts (REITs), specifically in Mortgage-Backed Securities (MBS) and/or Capital Allocation. The ideal candidate will have exposure to data platforms or application development and be capable of translating business needs into actionable insights and technical requirements.

Key Responsibilities:
- Understand and analyze business processes in the REIT domain (MBS, capital allocation)
- Collaborate with stakeholders to gather and document requirements
- Define domain models and mappings for data platforms
- Work closely with data engineering and application development teams
- Support product and platform enhancements through data-driven insights
- Participate in stakeholder meetings, L2 interviews, and managerial discussions

Required Skills:
- Strong domain knowledge in REITs: MBS and/or Capital Allocation
- Experience as a Business Analyst or in a similar analytical role
- Exposure to data platforms or application development projects
- Ability to define domain models and mappings
- Good understanding of SQL (basic knowledge is acceptable; writing skills are preferred)
- Excellent communication and documentation skills

Preferred Qualifications:
- Experience working in financial services, investment platforms, or real estate analytics
- Familiarity with tools like Power BI, Tableau, Snowflake, or similar
- Comfortable working in remote teams and cross-functional environments

How to Apply: Please share your updated resume highlighting relevant REIT domain experience, BA skills, and exposure to data platforms or app development. For organisation details, please see https://www.tavant.com/. If interested, please drop your resume to dasari.gowri@tavant.com.

Regards,
Dasari Krishna Gowri
Associate Manager - HR
www.tavant.com
Posted 1 week ago
9.0 - 14.0 years
19 - 34 Lacs
Bengaluru
Hybrid
Reporting to the Senior Product Manager, this role supports Master Reference Data (MRD), the centralised taxonomy and data governance solution that defines how Euromonitor structures and combines its various data sources. It is the single source of truth that lays out definitions for our taxonomy, enables seamless data integration across all our systems, and unlocks value for our clients by enabling all our data sources to be combined in any possible way.

The Senior Data Business Analyst will serve as the critical link between business stakeholders and technical teams: building in-depth knowledge of our various data sources; understanding our taxonomy challenges, clients' needs, ETL processes, and business objectives; analysing and documenting requirements; and working closely with architects and software engineers to design solutions for our data warehouse and master reference data that deliver scalable, high-quality data solutions that solve real user problems and align with business objectives.

Key responsibilities:
1. Requirement Gathering and Analysis: Independently lead sessions with stakeholders and the senior product manager. Navigate complex requirements with autonomy. Gather, analyse, and document business requirements. Translate business requirements into functional specifications with clear acceptance criteria.
2. Solution Design and Implementation: Collaborate with architects and software engineers to clarify requirements and design solutions. Reconcile conflicting requirements from multiple stakeholders and design solutions that balance priorities and meet shared objectives. Conduct user acceptance testing (UAT) and coordinate with stakeholders for feedback and sign-off. Ensure consistency and traceability of data across systems.
3. Stakeholder and Team Management: Participate in sprint planning, backlog grooming, and all other ceremonies. Discuss alternatives, cost-benefit, and trade-offs, and make informed recommendations to ensure solutions aligned with requirements and objectives are delivered on time and on budget. Build strong relationships with stakeholders at all levels. Manage stakeholder expectations and provide regular updates. Communicate progress, issues, and solutions effectively.
4. Documentation and Training: Create and maintain comprehensive and detailed documentation, ensuring it is up to date and accessible. Provide training and support to end-users.
5. Process Improvement: Utilise process modelling techniques to develop detailed process models and workflows. Implement process improvement frameworks to systematically identify and address inefficiencies in business processes.

The ideal candidate will demonstrate:
• A minimum of 8 years of experience as a Business Analyst, with recent experience specifically in data warehouse or data platform products. Must demonstrate expertise in capturing and translating complex data requirements into functional and non-functional requirements with clear acceptance criteria and test cases within Agile teams.
• A deep understanding of data platform technologies, ETL processes, and dimensional modelling (a must-have).
• Excellent communication and organisational skills. Oral and written fluency in English.
• Proficiency in business analysis tools and methodologies. Ability to produce high-quality documentation and artifacts to help stakeholders and the team understand requirements.
• Ability to manage multiple projects and priorities simultaneously, and to deal with ambiguity and the conflicting interests of different stakeholders.

Desirable attributes:
• Experience with Azure DevOps.
• Basic/intermediate knowledge of data programming languages like SQL, Python, or R.
• Experience with data visualization tools, preferably Power BI.
• A degree in Computer Science, Information Systems, Statistics, or a related field. A master's degree is a plus.
Posted 1 week ago
5.0 - 8.0 years
15 - 20 Lacs
Hyderabad
Work from Office
Job Title: Product Manager
Location: Hyderabad, India
Work Model: Hybrid (Onsite 3 days/week)
Team Function: Product
Level: Mid-Senior

Job Description: We are seeking a skilled Product Manager to support our enterprise data initiatives. This is a hybrid offshore role based in Hyderabad, India, where you will collaborate with global cross-functional teams to optimize and manage menu data across our brand platforms.

Key Responsibilities:
- Partner with stakeholders across Data, Technology, and Business units to understand menu-related data needs and translate them into functional product requirements
- Define user stories, acceptance criteria, and roadmaps aligned with business priorities and scalable data management practices
- Collaborate with engineering and data teams to ensure successful integration, delivery, and QA of product solutions
- Drive data consistency, governance, and standardization of menu attributes across systems and platforms
- Serve as a subject matter expert on menu data management tools and workflows
- Identify opportunities for process improvements and automation

Qualifications:
- 4-7 years of experience in product management or data platform/product roles
- Experience working with offshore/global teams in a hybrid model
- Ability to define and prioritize product backlogs in collaboration with cross-functional teams
- Proficiency in JIRA, Confluence, and other Agile product management tools
- Strong analytical skills with the ability to translate business needs into technical requirements
- Excellent communication and stakeholder management skills

Preferred Skills:
- Experience in the restaurant, foodservice, or retail industry
- Familiarity with data governance, data modeling, or enterprise data platforms
- Knowledge of the Inspire Brands ecosystem or similar enterprise environments
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
Karnataka
On-site
Build the future of the AI Data Cloud by joining the Snowflake team. Snowflake is at the forefront of the data revolution, committed to creating the world's greatest data and applications platform. Our "get it done" culture ensures that everyone at Snowflake has an equal opportunity to innovate on new ideas, create work with a lasting impact, and excel in a collaborative environment.

Snowflake's pre-sales organization is actively seeking an Associate Sales Engineer to join the Sales Engineering training program called Snowmaker. The purpose of Snowmaker is to nurture aspiring technical talent through a blend of education and mentorship. This six-month program provides comprehensive technical and sales skills training through classroom sessions, shadowing, and mentoring by sales and pre-sales leaders and peers.

As an Associate Sales Engineer, you will have the chance to familiarize yourself with Snowflake's technology portfolio, understand the needs and business challenges of customers from various industries, and grasp Snowflake's sales process to address them. You will apply your technical aptitude, exceptional communication skills, and creative problem-solving abilities on a daily basis. Upon successful completion of the program, you will join our regional Sales Engineering team and contribute to its success.

Upon the successful completion of the training, your responsibilities will include:
- Presenting Snowflake technology and vision to executives and technical contributors at prospects and customers
- Leveraging knowledge of a domain or industry to align Snowflake's value with the customers' business and technical problems
- Working hands-on with SEs, prospects, and customers to demonstrate and communicate the value of Snowflake technology throughout the sales cycle
- Maintaining a deep understanding of competitive and complementary technologies and vendors to position Snowflake effectively
- Collaborating with Product Management, Engineering, and Marketing to enhance Snowflake's products and marketing
- Providing post-sales technical guidance to the customers' technical team to drive customer utilization of Snowflake and digital transformation success
- Contributing to global and regional Sales Engineering initiatives

On day one, we expect you to have:
- A deep interest in translating customer needs and problems into technical solutions
- A passion for technology, a willingness to learn, and the ability to thrive in a fast-paced work environment
- Ability to present technical topics to various audiences via whiteboard sessions, presentations, and demos
- A university degree in Computer Science, Engineering, Mathematics, or related fields; equivalent experience is preferred
- Industry or internship experience focusing on data analytics, pre-sales, solution architecture, or data engineering
- Hands-on experience with SQL, Python, Scala, Spark, Java, cloud technology, data platforms, or data analytics (bonus)
- A strong desire to pursue a career in Sales Engineering

Snowflake is experiencing rapid growth, and we are expanding our team to support and accelerate our development. We are seeking individuals who share our values, challenge conventional thinking, and drive innovation while building a successful future for themselves and Snowflake. Join us and make an impact today!

For jobs in the United States, please refer to the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
The Senior Data Engineer role requires 5 to 8 years of experience and expertise in Azure Synapse and deep Azure data engineering. You will be responsible for designing and implementing data technology and modern data platform solutions within the Azure environment. Your key responsibilities will include collaborating with Data Architects, Presales Architects, and Cloud Engineers to deliver high-quality solutions, mentoring junior team members, and conducting research to stay updated with the latest industry trends. You will also be expected to develop and enforce best practices in data engineering and platform development.

We are looking for candidates with substantial experience in data engineering and Azure data services, strong analytical and problem-solving skills, proven experience working with diverse customers, and expertise in developing data pipelines, APIs, file formats, and databases. Familiarity with technologies such as Synapse, ADLS Gen2, Databricks, Azure Data Factory, Azure SQL, Key Vault, and Azure Security is essential. Experience with CI/CD practices, specifically within Azure DevOps, and agile delivery methods is preferred.

This is a full-time position based in Ahmedabad, India, with a hybrid work mode. The work schedule is from Monday to Friday during day shifts. As a Senior Data Engineer, you will have the opportunity to contribute to the development of cutting-edge data solutions, support various teams within the organization, and play a key role in mentoring and guiding junior team members.

To apply for this position, please provide information on your notice period, current annual salary, expected annual salary, and current city of residence. The ideal candidate for this role will have a minimum of 6 years of experience with Azure data services, Azure Synapse, Databricks, and Azure Data Factory. If you have a passion for data engineering, a drive for continuous learning, and a desire to work with innovative technologies, we encourage you to apply for this exciting opportunity.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Hyderabad, Telangana
On-site
As an IT Project Manager/Architect for Data Platform & Monitoring within Global Operations and Supply Chain IT, your primary responsibility is to lead the architecture, technical implementation, and overall management of the data platform and monitoring program. Your role is critical in the planning and execution of a strategic program that includes developing a centralized data platform to consolidate manufacturing systems data across all sites and implementing robust observability and monitoring capabilities for global manufacturing systems and applications. Success in this role demands strong coordination and communication skills to work seamlessly across cross-functional teams, ensuring alignment with organizational objectives, timelines, and delivery standards. You will be leading a team of 10-15 Global Operations Supply Chain team members in the core manufacturing and supply chain digital platform domain.

Your responsibilities will include developing a comprehensive project plan; defining project scope, goals, and objectives; identifying potential risks; leading a diverse cross-functional project team; establishing a collaborative environment; and working closely with business stakeholders to gather and document functional and technical requirements for the IT systems implementation solution. You will also lead the implementation of manufacturing IT systems, provide updates to the leadership team, and coordinate cross-functional teams and stakeholders to gather business and technical requirements, translating them into a clear, actionable 3-year data platform roadmap.

Minimum qualifications for this role include a Bachelor's degree (required), with an advanced degree preferred, along with a minimum of 10 years of relevant experience in IT project or program management roles and 4+ years of experience managing teams of 10+ members. Prior experience in regulated or validated industries is a strong plus. Strong documentation, organizational, and communication skills are essential, along with familiarity with project management tools and the ability to understand the customer's business problem and design effective solutions. Proven ability to deliver quality results within defined timelines, understanding of application lifecycle processes and system integration concepts, and the ability to thrive in a fast-paced, team-oriented environment are also required.

Skills needed for this role include a strong background in IT project management, especially in manufacturing or supply chain domains; experience leading multi-function, cross-team collaboration between IT and Business; managing program timelines, risks, status, and escalations; understanding and working within processes and tools; solid knowledge of SDLC and Agile/Waterfall/Hybrid project management principles; experience with project management tools like DevOps; strong knowledge of MS PowerPoint, MS Excel, and MS Project; experience managing project costing, budget forecasting, and resource management; and working knowledge of manufacturing IT systems like ERP, MES, etc.
Posted 2 weeks ago
3.0 - 8.0 years
5 - 12 Lacs
Navi Mumbai
Work from Office
Objectives aligned to this role: One place to manage all Informatica-related development jobs along with the setup support.

What would you do:
- Analyse database models and requirements: Informatica developers analyse a business's database storage and warehousing capabilities and assess the company's data requirements. They review data storage and access procedures and use Informatica tools to update, test, and provide solutions for data issues.
- Develop technical documents for Informatica systems: Informatica developers maintain up-to-date documentation of implementation, troubleshooting, and ETL processes related to Informatica systems. They also keep documents pertaining to specific issues and how they were resolved, including coding information and extraction and transformation processes.
- Integrate Informatica systems: One of the main roles of an Informatica developer is to develop target systems using Informatica software tools. These developers must integrate this system with a company's existing systems, troubleshoot any issues, and smoothly implement the Informatica cloud data management product.
- Develop Informatica workflows: The developer's core job is to follow the technical documentation and develop Informatica ETL workflows that pull data from the source, transform it, and load it into the target system.
- Conduct data quality tests: It is up to Informatica developers to regularly check the quality of stored data. They oversee mappings and workflows, check data integrity and accuracy, and perform data cleansing procedures as needed. (A minimal sketch of this kind of check follows below.)

Whom we are looking for: In addition to the expected technical skills required to be an Informatica developer, these professionals should be team leaders with strong analytical, creative, and time management skills. After examining several job postings, we found that employers tend to favor candidates who display the following abilities:
- 2 to 5 years of overall experience doing development work in Informatica ETL
- Computer skills: a thorough knowledge of computer programming, coding, and various operating and database systems is a must for Informatica developers
- Time management: Informatica developers should have the ability to quickly develop data warehousing systems and solve any issues to ensure the continued accuracy of business data
- Creativity: the ability to create mappings from scratch often calls for strong creative skills on the part of an Informatica developer
- Analytical thinking: Informatica developers should be able to analyse data needs and options and understand the needs of various clients
- Troubleshooting: when data warehousing systems are down, it falls to Informatica developers to quickly assess the problem and provide a solution
- Team collaboration: Informatica developers rarely work alone; they typically interact closely with database managers and other IT specialists when maintaining, storing, and retrieving data

Technical skills (required):
- Informatica PowerCenter tools (Workflow Manager, Workflow Monitor, Designer)
- Programming languages (SQL, XML)
- Data platforms (Oracle, Teradata, Hadoop)

IMMEDIATE JOINERS ONLY (15 DAYS OR LESS)
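To make the "conduct data quality tests" duty concrete, here is a minimal, self-contained sketch of the kind of check an ETL developer might script around a workflow: a row-count reconciliation plus a null check. It uses Python's built-in sqlite3 purely as a stand-in for the Oracle/Teradata sources the posting names; the table and column names are hypothetical.

```python
import sqlite3

# Stand-in source/target; in practice these would be Oracle/Teradata connections.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, NULL);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0), (3, NULL);
""")

def rowcount(table: str) -> int:
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def null_count(table: str, column: str) -> int:
    return conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]

# Reconciliation: target must match source row-for-row after the workflow runs.
assert rowcount("src_orders") == rowcount("tgt_orders"), "row counts diverged"

# Integrity: flag NULLs in columns the mapping is supposed to populate.
bad = null_count("tgt_orders", "amount")
print(f"tgt_orders.amount NULLs: {bad}")
```

In an Informatica shop, checks like these typically run as post-session SQL or a scheduled validation job against the target schema.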
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
About McDonald's: McDonald's Corporation, one of the world's largest employers with locations in more than 100 countries, is offering corporate opportunities in Hyderabad. The global offices of McDonald's are dynamic innovation and operations hubs, aimed at expanding the global talent base and in-house expertise of the company. The newly established office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating the ability of McDonald's to deliver impactful solutions for the business and customers worldwide.

Position Overview: McDonald's is looking for an exceptional Senior Data Product Engineering SRE to take charge of the development and operational excellence of data products that provide insights and drive crucial business decisions. This role requires a unique combination of a product engineering mindset, data platform expertise, and site reliability engineering practices to create, scale, and maintain customer-facing data products and internal analytics platforms. The Senior Data Product Engineering SRE will be responsible for the end-to-end reliability of data products, from ingestion to user experience, ensuring they deliver business value at scale.

Key Responsibilities:
- Define and implement a product reliability strategy for customer-facing analytics, dashboards, and data APIs
- Collaborate with Product Management to translate business requirements into scalable, reliable data product architectures
- Establish product metrics, KPIs, and success criteria for data products serving both external and internal customers
- Lead cross-functional initiatives to enhance data product adoption, engagement, and customer satisfaction
- Develop and maintain data products, including real-time dashboards, analytics APIs, and embedded analytics solutions
- Design user-centric data experiences focusing on performance, reliability, and scalability
- Implement A/B testing frameworks and experimentation platforms for data product optimization
- Set and maintain SLAs for data product availability, latency, and accuracy (a small illustrative sketch follows below)
- Implement comprehensive monitoring for user-facing data products, encompassing frontend and backend metrics
- Create automated testing frameworks for data product functionality, performance, and data quality
- Lead incident response for data product issues that impact customer experience
- Monitor and optimize data product performance from an end-user perspective, including page load times and query response times
- Implement user feedback collection and product analytics to drive continuous improvement
- Collaborate closely with Product, Engineering, Data Science, and Customer Success teams
- Establish engineering practices for data product development, encompassing code reviews and deployment processes
- Influence the product roadmap with technical feasibility and reliability considerations
- Advocate for data product best practices throughout the organization
- Strike a balance between innovation, operational stability, and customer commitments
- Collaborate with Product Management on feature prioritization and requirements

Required Qualifications:
- 8+ years of experience in product engineering, data engineering, or SRE roles
- 5+ years of experience in building customer-facing data products, analytics platforms, or business intelligence solutions
- 3+ years in senior or lead positions with direct team management experience
- Proven track record of delivering data products that drive measurable business impact
- Expertise in the product development lifecycle, from ideation to launch and optimization
- Advanced experience in building user-facing applications and APIs
- Deep expertise with analytics databases (Redshift, BigQuery, ClickHouse), real-time processing (Kafka, Spark Streaming), and BI tools (Tableau, Looker, Power BI)
- Proficiency in React, Vue.js, or Angular for constructing data visualization interfaces
- Advanced skills in Python, Java, or Node.js for API development and data services
- Expert-level SQL skills and experience optimizing queries for interactive analytics workloads
- Extensive experience with AWS or GCP data and compute services
- Strong product sense with the ability to balance technical constraints with user needs
- Experience with product analytics tools (Amplitude, Mixpanel, Google Analytics) and metrics-driven development
- Ability to understand business requirements and translate them into technical solutions
- Strong technical writing skills for customer-facing documentation and API specifications
- Experience with agile product development methodologies (Scrum, Kanban, Design Thinking)
- Proven track record of building and scaling product engineering teams

Work Location: Hyderabad, India
Work Pattern: Full-time role
Work Mode: Hybrid
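As a hedged illustration of the SLA bullet above (availability, latency, accuracy), this tiny Python sketch computes a p95 latency and an availability ratio from a batch of request records. The records and numbers are fabricated purely for illustration; a real SRE setup would pull these from monitoring telemetry rather than an in-memory list.

```python
# Hedged illustration: compute p95 latency and availability over request records.
requests = [
    {"latency_ms": 120, "ok": True},
    {"latency_ms": 340, "ok": True},
    {"latency_ms": 95,  "ok": False},
    {"latency_ms": 210, "ok": True},
]

# p95: the latency below which 95% of requests fall (index clamped for tiny samples).
latencies = sorted(r["latency_ms"] for r in requests)
p95 = latencies[min(len(latencies) - 1, int(0.95 * len(latencies)))]

# Availability: fraction of requests that succeeded.
availability = sum(r["ok"] for r in requests) / len(requests)

print(f"p95 latency: {p95} ms")
print(f"availability: {availability:.2%}")  # compare against the SLA target
```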
Posted 2 weeks ago
3.0 - 8.0 years
15 - 25 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Skill Name: InterSystems IRIS Tool
Work Location: USI location
Experience: 3 to 5 years

A resource with hands-on experience in the InterSystems IRIS Data Platform, proficient in ObjectScript, SQL, and integration technologies (REST, SOAP). FHIR is a must-have. Experience with data modeling, performance tuning, and deploying IRIS on Linux/Windows is required. Skills in Python, Java, .NET, and Docker are a plus.

Rounds of interview: R1 and R2 (client round if required)
Mode of interview (Virtual/In-person)
Work timing: 11 AM to 8 PM
Work Mode (Remote/On-site/Hybrid): Hybrid
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be the visionary Group Data Product Manager (GPM) for AI/ML & Metadata Management, responsible for leading the development of advanced AI/ML-powered metadata solutions. Your primary focus will be on establishing a cohesive and intuitive Data Platform tailored to cater to a variety of user roles including data engineers, producers, and consumers. Your role involves integrating various tools to create a unified platform that will significantly improve data discoverability, governance, and operational efficiency on a large scale.
Posted 3 weeks ago
10.0 - 18.0 years
2 - 3 Lacs
Hyderabad
Work from Office
Experience needed: 12-18 years
Type: Full-Time
Mode: WFO
Shift: General Shift IST
Location: Hyderabad
NP: Immediate joiner to 30 days

Job Summary: We are looking for an experienced and visionary Data Architect - Azure Data & Analytics to lead the design and delivery of scalable, secure, and modern data platform solutions leveraging Microsoft Azure and Microsoft Fabric. This role requires deep technical expertise in the Azure Data & Analytics ecosystem, strong experience in designing cloud-native architectures, and a strategic mindset to modernize enterprise data platforms.

Key Responsibilities:
- Architect and design modern data platform solutions on Microsoft Azure, including ingestion, transformation, storage, and visualization layers
- Lead implementation and integration of Microsoft Fabric, including OneLake, Direct Lake mode, and Fabric workloads (Data Engineering, Data Factory, Real-Time Analytics, Power BI)
- Define enterprise-level data architecture, including data lakehouse patterns, delta lakes, data marts, and semantic models (a rough sketch of the lakehouse pattern follows below)
- Collaborate with business stakeholders, data engineers, and BI teams to translate business needs into scalable cloud data solutions
- Design solutions using Azure-native services such as Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage (Gen2), Azure SQL, and Azure Event Hubs
- Establish best practices for data security, governance, DevOps, CI/CD pipelines, and cost optimization
- Guide implementation teams on architectural decisions and technical best practices across the data lifecycle
- Develop reference architectures and reusable frameworks for accelerating data platform implementations
- Stay updated on Microsoft's data platform roadmap and proactively identify opportunities to enhance data strategy
- Assist in developing RFPs, architecture assessments, and solution proposals

Required Skills & Qualifications:
- Proven 12-18 years of experience, including designing and implementing cloud-based modern data platforms on Microsoft Azure
- Deep knowledge and understanding of Microsoft Fabric architecture, including Data Factory, Data Engineering, Synapse Real-Time Analytics, and Power BI integration
- Expertise in Azure data services: Azure Synapse, Data Factory, Azure SQL, ADLS Gen2, Azure Functions, Azure Purview, Event Hubs, etc.
- Experience with data warehousing, lakehouse architectures, ETL/ELT, and data modeling
- Experience in data governance, security, role-based access (Microsoft Entra/Azure AD), and compliance frameworks
- Strong leadership and communication skills to influence both technical and non-technical stakeholders
- Familiarity with DevOps and infrastructure-as-code (e.g., ARM templates, Bicep, Terraform) is a plus

Preferred Qualifications:
- Microsoft Certified: Azure Solutions Architect Expert, Azure Data Engineer Associate, or Microsoft Fabric certification
- Experience with real-time data streaming, IoT, or machine learning pipelines in Azure
- Familiarity with multi-cloud data strategies or hybrid deployments is an advantage
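For the lakehouse bullet above, here is a rough sketch of a bronze-to-silver hop on ADLS Gen2 with Delta, assuming a Databricks or Fabric Spark runtime where a `spark` session and Delta support are preconfigured. The storage paths and column names are hypothetical placeholders, not part of the posting.

```python
from pyspark.sql import functions as F

# Assumes a Databricks/Fabric session with Delta enabled; paths are placeholders.
bronze_path = "abfss://lake@account.dfs.core.windows.net/bronze/orders"
silver_path = "abfss://lake@account.dfs.core.windows.net/silver/orders"

raw = spark.read.format("delta").load(bronze_path)

curated = (
    raw.dropDuplicates(["order_id"])                       # dedupe replayed records
       .filter(F.col("order_id").isNotNull())              # basic quality gate
       .withColumn("ingest_date", F.to_date("ingest_ts"))  # partition column
)

(curated.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("ingest_date")
    .save(silver_path))
```

In a full design, an orchestrator such as Azure Data Factory or a Fabric pipeline would schedule hops like this one, with gold-layer models built on the silver tables.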
Posted 3 weeks ago
9.0 - 14.0 years
20 - 35 Lacs
Bhubaneswar, Hyderabad, Mumbai (All Areas)
Hybrid
Strong knowledge of and working experience in Temenos Data Hub (TDH) and Temenos Analytics. Should have exposure to solutioning in TDH, namely: validating requirements from the bank, developing reports out of TDH/Analytics, and supporting integration around TDH/Analytics.

Required Candidate Profile: Should have experience with Data Stream and develop the skill set within the team. TLC certification on Temenos data platforms is an added advantage. Overall ownership of the programs around the data platform. Experience: 9-15 years.
Posted 3 weeks ago
9.0 - 14.0 years
15 - 19 Lacs
Bengaluru
Work from Office
About the Role: We are looking for an Associate Architect with at least 9+ years of experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.

Key Responsibilities:
- Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting
- Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink
- Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset
- Good understanding of open table formats like Delta and Iceberg
- Scale data quality frameworks to ensure data accuracy and reliability (a hand-rolled sketch of such checks follows below)
- Build data lineage tracking solutions for governance, access control, and compliance
- Collaborate with engineering, analytics, and business teams to identify opportunities and build or enhance self-serve data platforms
- Improve system stability, monitoring, and observability to ensure high availability of the platform
- Work with open-source communities and contribute to OSS projects aligned with Myntra's tech stack
- Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment

Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience:
- 9+ years of experience in building large-scale data platforms
- Expertise in big data architectures using Databricks, Trino, and Debezium
- Strong experience with streaming platforms, including Confluent Kafka
- Experience in data ingestion, storage, processing, and serving in a cloud-based environment
- Hands-on experience implementing data quality checks using Great Expectations
- Deep understanding of data lineage, metadata management, and governance practices
- Strong knowledge of query optimization, cost efficiency, and scaling architectures
- Familiarity with OSS contributions and keeping up with industry trends in data engineering
Soft Skills:
- Strong analytical and problem-solving skills with a pragmatic approach to technical challenges
- Excellent communication and collaboration skills to work effectively with cross-functional teams
- Ability to lead large-scale projects in a fast-paced, dynamic environment
- Passion for continuous learning, open-source collaboration, and building best-in-class data products
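To ground the data quality bullet, here is a hand-rolled pandas sketch of the kind of expectations involved. This is deliberately not the Great Expectations API itself (which packages checks like these as built-in expectations, e.g. column-not-null assertions); it is a minimal stand-in with invented column names, purely to illustrate the concept.

```python
import pandas as pd

# Toy dataset with invented columns; a real check would run against pipeline output.
df = pd.DataFrame({
    "order_id": [1, 2, 3, None],
    "amount":   [10.0, 20.0, -5.0, 15.0],
})

# Expectation 1: order_id must never be null.
nulls = int(df["order_id"].isna().sum())

# Expectation 2: amount must fall within a plausible range.
out_of_range = int((~df["amount"].between(0, 1_000_000)).sum())

report = {
    "expect_order_id_not_null": {"unexpected": nulls, "success": nulls == 0},
    "expect_amount_in_range": {"unexpected": out_of_range, "success": out_of_range == 0},
}
print(report)
```

A framework like Great Expectations adds suites, data docs, and scheduled validation on top of exactly this style of per-column assertion.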
Posted 3 weeks ago
8.0 - 11.0 years
17 - 30 Lacs
Kolkata, Bengaluru, Mumbai (All Areas)
Hybrid
Job title: Functional BA Manager
Location: Mumbai/Bangalore/Gurgaon/Kolkata
Experience: 8 to 11 years
Notice: Immediate to 30 days

Functional BAs who:
- Have led a team of at least 3 BAs in the past
- Have an understanding of data platform implementations
- Have experience in the FS industry
- Have project management skills/certifications
Posted 4 weeks ago
5.0 - 10.0 years
40 - 85 Lacs
Bengaluru
Work from Office
About the Team
The Data Platform Tech Team at Navi is instrumental in enabling data-driven decision-making across the organization. We build and manage the core infrastructure and tools required to collect, store, process, and analyze data at scale. Our platforms support self-serve capabilities for both batch and real-time data processing. We work closely with Analytics, Data Science, and Product teams to power a wide range of data use cases across Navi.

About the Role
As an SDE-3 on the Data Platform team at Navi, you'll design and build large-scale systems powering web events, analytics, and real-time data pipelines (a minimal streaming sketch follows below). You'll lead backend development, contribute to platform architecture, and solve complex data problems. This is a high-impact IC role with strong cross-functional collaboration and mentorship opportunities.

What We Expect From You
- Design, develop, and maintain backend services, data pipelines, and batch/real-time datasets related to web events and analytics
- Strong proficiency in at least one of the following languages: Java, Python, Scala
- Expertise in object-oriented design, design patterns, and data structures
- Lead the development of new foundational capabilities that enable our users to interact with, analyze, and derive insights from their data
- Solve complex and challenging problems at the intersection of low latency, high correctness, and full determinism
- Participate in code reviews, provide mentorship to junior team members, and enforce coding standards
- Investigate, diagnose, and resolve software defects and issues, ensuring a high level of product quality
- Contribute to the overall architecture and design of data platform frameworks
- Strong interpersonal skills, showcasing effective stakeholder management with product and design teams
- A minimum of 5 years of software development experience

Must Haves
- Familiarity with modern data lakehouse architectures and related technologies (e.g., Spark, Flink, Kafka, Trino)
- Prior experience with on-prem data platforms is preferred
- Demonstrated ability to quickly adapt to new and complex development environments, along with strong deep-dive analytical skills
- Previous success in mentoring and guiding junior engineers

Inside Navi
We are shaping the future of financial services for a billion Indians through products that are simple, accessible, and affordable. From Personal & Home Loans to UPI, Insurance, Mutual Funds, and Gold, we're building tech-first solutions that work at scale, with a strong customer-first approach. Founded by Sachin Bansal & Ankit Agarwal in 2018, we are one of India's fastest-growing financial services organisations. But we're just getting started!

Our Culture
The Navi DNA: Ambition. Perseverance. Self-awareness. Ownership. Integrity. We're looking for people who dream big when it comes to innovation. At Navi, you'll be empowered with the right mechanisms to work in a dynamic team that builds and improves innovative solutions. If you're driven to deliver real value to customers, no matter the challenge, this is the place for you. We chase excellence by uplifting each other, and that starts with every one of us.

Why You'll Thrive at Navi
At Navi, it's about how you think, build, and grow. You'll thrive here if:
- You're impact-driven: You take ownership, build boldly, and care about making a real difference.
- You strive for excellence: Good isn't good enough. You bring focus, precision, and a passion for quality.
- You embrace change: You adapt quickly, move fast, and always put the customer first.
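As a hedged sketch of the real-time side of this role, the snippet below consumes web events from Kafka with Spark Structured Streaming and computes per-minute counts. It assumes the spark-sql-kafka package is available on the cluster; the broker address, topic, schema, and checkpoint path are hypothetical placeholders, and a production pipeline would add schema-registry integration and a durable sink.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StringType, TimestampType

spark = SparkSession.builder.appName("web-events").getOrCreate()

# Hypothetical event schema for illustration.
event_schema = (StructType()
    .add("user_id", StringType())
    .add("event_name", StringType())
    .add("ts", TimestampType()))

events = (spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "web-events")                 # placeholder topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*"))

# Per-minute event counts, tolerating 5 minutes of late-arriving data.
counts = (events
    .withWatermark("ts", "5 minutes")
    .groupBy(F.window("ts", "1 minute"), "event_name")
    .count())

query = (counts.writeStream
    .outputMode("update")
    .format("console")                                  # real sinks: Delta, Kafka, etc.
    .option("checkpointLocation", "/tmp/chk/web-events")  # placeholder path
    .start())
query.awaitTermination()
```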
Posted 1 month ago
10.0 - 20.0 years
50 - 75 Lacs
Bengaluru
Work from Office
A leading player in cloud-based enterprise solutions is expanding its analytics leadership team in Bangalore. This pivotal role calls for a seasoned professional to drive the evolution of data products and analytics capabilities across international markets. The ideal candidate will possess the strategic vision, technical expertise, and stakeholder savvy to lead in a fast-paced, innovation-driven environment.

Key Responsibilities:
- Lead and mentor a dynamic team of product managers to scale enterprise-grade data lake and analytics platforms
- Drive program execution and delivery with a focus on performance, prioritization, and business alignment
- Define and execute the roadmap for an analytical data platform, ensuring alignment with strategic and user-centric goals
- Collaborate cross-functionally with engineering, design, and commercial teams to launch impactful BI solutions
- Translate complex business needs into scalable data models and actionable product requirement documents for multi-tenant SaaS products
- Champion AI-enabled analytics experiences to deliver smart, context-aware data workflows
- Maintain high standards in performance, usability, trust, and documentation of data products
- Ensure seamless execution of global data strategies through on-the-ground leadership in India
- Promote agile methodologies, metadata governance, and product-led thinking across teams

Ideal Candidate Profile:
- 10+ years in product leadership roles focused on data products, BI, or analytics in SaaS environments
- Deep understanding of modern data architectures, including dimensional modeling and cloud-native analytics tools
- Proven expertise in building multi-tenant data platforms serving external customer use cases
- Skilled in simplifying complex inputs into clear, scalable requirements and deliverables
- Familiarity with platforms like Delta Lake, dbt, ThoughtSpot, and similar tools
- Strong communicator with demonstrated stakeholder management and team leadership capabilities
- Experience launching customer-facing analytics products is a definite plus
- A passion for intuitive, scalable, and intelligent user experiences powered by data
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Hyderabad, Bengaluru
Work from Office
About our team
DEX is the central data org for Kotak Bank, managing the entire data experience of the bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built ground up, which provides great opportunities for technology fellows to build things from scratch and build one of the best-in-class data lakehouse solutions.

The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. As a member of this team, you get the opportunity to learn the fintech space, one of today's most sought-after domains; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and also be futuristic and build systems which can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing the central data warehouse for extremely high concurrency use cases; building connectors for different sources; building a customer feature repository; building cost optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team
- Design, implement, and support a data infrastructure from scratch
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA (a minimal orchestration sketch follows below)
- Extract, transform, and load data from various sources using SQL and AWS big data technologies
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Build data platforms, data pipelines, or data management and governance tools

BASIC QUALIFICATIONS for Data Engineer:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For Managers:
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams and streamline communication
- Prior work experience executing large-scale data engineering projects
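Since MWAA/Airflow orchestration appears in both the responsibilities and qualifications above, here is a minimal hedged sketch of a daily Airflow DAG using only core operators. The DAG id, task bodies, and S3 path are hypothetical; real extract/transform logic is stubbed out as prints.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    # Placeholder: pull a day's slice from a source system into S3.
    print("extracting to s3://bucket/raw/...")

def transform(**_):
    # Placeholder: run a Spark/Glue job over the raw slice.
    print("transforming raw -> curated")

with DAG(
    dag_id="daily_orders_pipeline",     # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2  # extract must finish before transform starts
```

On MWAA, a DAG file like this is dropped into the environment's S3 DAGs folder, with the stubbed callables replaced by Glue/EMR job triggers.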
Posted 1 month ago
3.0 - 8.0 years
8 - 15 Lacs
Pune
Hybrid
Role: Developer
Location: Pune (Hybrid)
Experience: 3 to 9 years
Notice Period: Immediate joiners to 1 month (only candidates serving notice period should apply)
Excellent communication skills required.

Mandatory skills (must appear in the roles and responsibilities):
- Data Platform
- Java
- Python
- Spark
- Kafka
- Cloud technologies (Azure/AWS)
- Databricks

Interested candidates, share your resume at dipti.bhaisare@in.experis.com
Posted 1 month ago
6.0 - 10.0 years
10 - 13 Lacs
Chennai, Bengaluru, Delhi / NCR
Work from Office
About the Role
Minimum 6+ years of experience as a Data Platform Engineer. A cloud data platform engineer designs, builds, and manages data storage and workflows/compute in cloud environments, ensuring that data is secure, accessible, and processed efficiently.

Roles & Responsibilities:
- Assist with the data platform blueprint and design
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models
- Develop and maintain data platform components
- Contribute to the overall success of the project

Professional & Technical Skills:
- Proficiency in the Databricks Unified Data Analytics Platform
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity (a small pandas sketch follows below)
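To make the data munging bullet concrete, here is a small hedged pandas sketch covering the cleaning, transformation, and normalization steps it names. The toy dataframe and column names are invented for illustration; on Databricks the same steps would typically run in PySpark at scale.

```python
import pandas as pd

# Invented toy data: messy names, a duplicate, and a missing value.
df = pd.DataFrame({
    "customer": [" Alice ", "bob", "bob", None],
    "spend":    [120.0, 80.0, 80.0, 45.0],
})

clean = (
    df.dropna(subset=["customer"])   # cleaning: drop rows missing the key field
      .assign(customer=lambda d: d["customer"].str.strip().str.title())  # transformation
      .drop_duplicates()             # cleaning: remove exact duplicates
)

# Normalization: min-max scale spend into [0, 1] for downstream modeling.
lo, hi = clean["spend"].min(), clean["spend"].max()
clean["spend_norm"] = (clean["spend"] - lo) / (hi - lo)
print(clean)
```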
Posted 1 month ago