18.0 - 22.0 years
0 Lacs
Haryana
On-site
As an Enterprise Architect in the Financial Services domain, you will play a crucial role in leading the design and integration of enterprise solutions that align with business objectives. With over 18 years of experience, you will focus on designing scalable, reliable, and secure solution architectures that meet both technical requirements and the overarching business strategy, specifically within the Capital Markets domain.

Your responsibilities will include collaborating with technical architects, engineering teams, and key stakeholders to ensure successful implementation, bridging the gap between business objectives and technical execution. You will align high-level solution architectural designs with the organization's immediate and long-term strategies, with a focus on cloud technologies and microservices integration patterns. Your role will involve leading the design of end-to-end solutions, making build-vs-buy decisions, selecting COTS solutions, and prioritizing technical debt based on architectural principles and the roadmap. Additionally, you will work closely with technical teams to ensure that the architectural vision is implemented successfully, balancing business needs with technical constraints.

Your expertise in Capital Markets, COTS solutions, cloud technologies (such as AWS and Azure), integration patterns, and security governance will be vital in this role. You will also need experience in RFP design, build-vs-buy decision making, and collaborating with business stakeholders to understand their problems and find suitable solutions. Strong communication skills, leadership abilities, and a collaborative mindset will be essential for fostering partnerships with internal and external teams to ensure solution delivery aligns with enterprise standards.

Ideally, you should have a strong understanding of financial services regulations and compliance, along with a background in financial systems solution architecture. A degree in computer science, information technology, or related fields, as well as TOGAF 9.1/9.2 certification, would be advantageous. Your ability to articulate the architectural vision clearly and guide decision-making processes across business and technical teams will be key to your success in this role.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
You have over 8 years of experience and are located in Balewadi, Pune.

Your technical skills and core competencies include a strong understanding of data architecture and models, experience leading data-driven projects, and expertise in data modelling paradigms such as Kimball, Inmon, Data Marts, Data Vault, and Medallion. You have solid experience with cloud-based data strategies and big data technologies, with a preference for AWS. You are adept at designing data pipelines for ETL, with expert knowledge of ingestion, transformation, and data quality. Hands-on experience in SQL is a must, including a deep understanding of PostgreSQL development, query optimization, and index design. You should be able to understand and manipulate intermediate to complex SQL, with thorough knowledge of PostgreSQL PL/pgSQL for complex warehouse workflows. You can apply advanced SQL and statistical concepts through SQL, and experience with PostgreSQL extensions like PostGIS is desired. Expertise in writing ETL pipelines combining Python and SQL is required, along with an understanding of data manipulation libraries in Python such as Pandas, Polars, and DuckDB. Experience in designing data visualizations with tools such as Tableau and Power BI is desirable.

Your responsibilities include participating in the design and development of features in the existing Data Warehouse, providing leadership in establishing connections between the engineering, product, and analytics/data science teams, designing, implementing, and updating existing and new batch ETL pipelines, defining and implementing data architecture, and partnering with both engineers and data analysts to build reliable datasets. You will work with various data orchestration tools (Apache Airflow, Dagster, Prefect, and others) and embrace a fast-paced start-up environment. You should be passionate about your job and enjoy a fast-paced international working environment. A background in the telecom industry is a plus but not a requirement. You love automating and enjoy monitoring.
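A minimal sketch of the Python + SQL ETL pattern described above, combining pandas for row-level cleaning with DuckDB for the SQL layer; the file, column, and table names are invented for illustration:

```python
import duckdb
import pandas as pd

# Extract: read a raw export (file and column names are illustrative).
raw = pd.read_csv("events.csv", parse_dates=["event_time"])

# Transform: basic cleaning in pandas before handing off to SQL.
clean = raw.dropna(subset=["user_id"]).drop_duplicates(subset=["event_id"])

# Load and aggregate: register the frame and run plain SQL over it in DuckDB.
con = duckdb.connect("warehouse.duckdb")
con.register("clean", clean)
con.execute("CREATE OR REPLACE TABLE events AS SELECT * FROM clean")
daily = con.execute("""
    SELECT CAST(event_time AS DATE) AS day, COUNT(*) AS events
    FROM events
    GROUP BY 1
    ORDER BY 1
""").fetchdf()
print(daily.head())
```

The same split of duties (pandas or Polars for row-level cleaning, SQL for set-based aggregation) carries over directly to PostgreSQL-backed warehouses.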
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be responsible for demonstrating thorough knowledge and a proven record of success in executing various functional and technical aspects of SAP Master Data Governance (MDG) projects following industry best practices. This includes data modelling, process modelling, UI modelling, business validation rules modelling, derivations, the Data Replication Framework (DRF), and workflow creation and maintenance. Your role requires a good understanding of the SAP MDG technical framework, including BADI, BAPI/RFC/FM, workflows, BRF+, Enterprise Services, IDoc, Floorplan Manager, Web Dynpro, Fiori, and the MDG API framework. Knowledge of SAP data dictionary tables, views, relationships, and the corresponding data architecture for ECC and S/4HANA across SAP master and transactional data entities is essential, including excellent functional knowledge of core master data objects such as customer, vendor, and material. Hands-on experience in configuring customer, vendor, finance, and product/material master data in MDG is necessary, including data harmonization involving de-duplication and mass changes, and data replication involving key/value mapping, SOA web services, and ALE/IDoc. Effective communication with customers and partners to understand specific enterprise data needs is a key aspect of this role. You should possess excellent written and verbal communication skills with the ability to impart ideas in technical, business, and user-friendly language. An appetite to acquire new knowledge, adapt, and contribute to fast innovation is important for success in this role. The ideal candidate will have a minimum of 5 years of experience in SAP Master Data Governance (MDG) with at least 2 full-cycle implementations. Implementation experience of SAP MDG in key domains such as Customer, Supplier, Material, and Finance master data is required. Hands-on experience with SAP Fiori, SAP MDG mass processing, consolidation, central governance, workflow, and BRF+ is essential. Experience in RDG is considered an added advantage for this role.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
Siemens Energy is looking for a highly skilled and experienced Senior MLOps Engineer to join the Digital Core function and contribute significantly to the data architecture and strategy within the organization. You will work closely with stakeholders to develop our machine learning environment, collaborating with business stakeholders and other teams to prioritize backlog items, offer consultancy on AI/ML solutions, support test automation, build CI/CD pipelines, and work on PoCs/MVPs using various hyperscaler offerings. If you are passionate about the environment and climate change and eager to be a part of the energy transition, then the Siemens Energy Data Analytics & AI team is the place for you. We are seeking innovative, enthusiastic, and versatile data, digital, and AI professionals to drive us forward on this exciting journey of energy transformation.

Your responsibilities will include onboarding new AI/ML use cases on AWS / Google Cloud Platform, defining MLOps architecture for AWS, GCP, or cloud-agnostic deployments, working with AI/ML services such as AWS SageMaker and GCP AutoML, developing PoCs/MVPs using AWS, GCP, and other MLOps services, implementing CI/CD pipelines using GitLab CI, writing infrastructure as code with AWS CDK scripts, providing consultancy to stakeholders on AI/ML solutions, supporting test automation, and deploying code bases to production environments.

To be successful in this role, you should have a Bachelor's degree in Computer Science, Mathematics, Engineering, Physics, or related fields (a Master's degree is a plus), around 10 years of hands-on experience in ML/AI development and operations, expertise in the ML life cycle and MLOps, proficiency in Python coding and Linux administration, experience with CI/CD pipelines and DevOps processes, familiarity with JIRA and Confluence, and excellent interpersonal and communication skills. Certification in AWS or GCP in the ML/AI area is preferred.

The Data Platforms and Services organization at Siemens Energy is committed to becoming a data-driven organization to support customers in transitioning to a more sustainable world. By using innovative technologies and treating data as a strategic asset, we aim to make sustainable, reliable, and affordable energy a reality. Siemens Energy values diversity and inclusion, welcoming applications from individuals of all backgrounds. We believe that through diversity we generate power; our combined creative energy is fueled by over 130 nationalities. We celebrate character and do not discriminate based on differences, upholding equal opportunities for all. If you are ready to make an impact in shaping the future of energy and are committed to innovation, decarbonization, and energy transformation, join us at Siemens Energy.

For more information about Siemens Energy and our commitment to diversity, visit: https://www.siemens-energy.com/employeevideo
To explore job opportunities at Siemens Energy, visit: https://jobs.siemens-energy.com/jobs
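As a flavour of the infrastructure-as-code work mentioned above, here is a minimal AWS CDK v2 sketch in Python; the stack and construct names are invented, and a production MLOps stack would add SageMaker resources and pipeline wiring:

```python
from aws_cdk import App, Stack, aws_s3 as s3
from constructs import Construct


class MlArtifactsStack(Stack):
    """Minimal infrastructure for storing ML training artifacts."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Versioned, encrypted bucket for model artifacts.
        s3.Bucket(
            self,
            "ModelArtifacts",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
        )


app = App()
MlArtifactsStack(app, "MlArtifactsStack")
app.synth()
```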
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are looking for a highly motivated and experienced Data and Analytics Senior Architect to lead our Master Data Management (MDM) and Data Analytics team. As the Data and Analytics Architect Lead, you will be responsible for defining and implementing the overall data architecture strategy to ensure alignment with business goals and support data-driven decision-making. Your role will involve designing scalable, secure, and efficient data systems, including databases, data lakes, and data warehouses. You will evaluate and recommend tools and technologies for data integration, processing, storage, and analytics while staying updated on industry trends.

You will lead a high-performing team, fostering a collaborative and innovative culture, and ensuring data integrity, consistency, and availability across the organization. You will manage the existing MDM solution and data platform, based on Microsoft Data Lake Gen 2 with Snowflake as the DWH and Power BI, managing data from core applications. Additionally, you will drive further development to handle additional data and capabilities to support our AI journey. The ideal candidate will possess strong leadership skills, a deep understanding of data management and technology principles, and the ability to collaborate effectively across different departments and functions.

**Principal Duties and Responsibilities:**

**Team Leadership:**
- Lead, mentor, and develop a high-performing team of data analysts and MDM specialists.
- Foster a collaborative and innovative team culture that encourages continuous improvement and efficiency.
- Provide technical leadership and guidance to the development teams and oversee the implementation of IT solutions.

**Architecture:**
- Define the overall data architecture strategy, aligning it with business goals and ensuring it supports data-driven decision-making.
- Identify, evaluate, and establish shared enabling technical capabilities for the division in collaboration with IT to ensure consistency, quality, and business value.
- Design and oversee the implementation of data systems, including databases, data lakes, and data warehouses, ensuring they are scalable, secure, efficient, and cost-effective.
- Evaluate and recommend tools and technologies for data integration, processing, storage, and analytics, staying updated on industry trends.

**Strategic Planning:**
- Develop and implement the MDM and analytics strategy aligned with the overall team and organizational goals.
- Work with the Enterprise Architect to align on the overall strategy and application landscape to ensure MDM and data analytics fit into the ecosystem.
- Identify opportunities to enhance data quality, governance, and analytics capabilities.

**Project Management:**
- Oversee project planning, execution, and delivery to ensure timely and successful completion of initiatives.
- Monitor project progress and cost, identify risks, and implement mitigation strategies.

**Stakeholder Engagement:**
- Collaborate with cross-functional teams to understand data needs and deliver solutions that support business objectives.
- Serve as a key point of contact for data-related inquiries and support requests.
- Develop business cases and proposals for IT investments and present them to senior management and stakeholders.

**Data/Information Governance:**
- Establish and enforce data/information governance policies and standards to ensure compliance and data integrity.
- Champion best practices in data management and analytics across the organization.

**Reporting and Analysis:**
- Utilize data analytics to derive insights and support decision-making processes.
- Document and present findings and recommendations to senior management.

**Knowledge, Skills and Abilities Required:**
- Bachelor's degree in Computer Science, Data Science, Information Management, or a related field; master's degree preferred.
- 10+ years of experience in data management, analytics, or a related field, with at least 2 years in a leadership role.
- Strong knowledge of master data management concepts, data governance, data technology, and analytics tools.
- Proficiency in data modeling, ETL processes, database management, big data technologies, and data integration techniques.
- Excellent project management skills with a proven track record of delivering complex projects on time and within budget.
- Strong analytical, problem-solving, and decision-making abilities.
- Exceptional communication and interpersonal skills.
- Team player, result-oriented, structured, with attention to detail and a strong work ethic.

**Special Competencies Required:**
- Proven leader with excellent structural skills, good at documenting and presenting.
- Strong executional skills to make things happen, not just generate ideas.
- Experience in working with analytics tools and data ingestion platforms.
- Experience in working with MDM solutions, preferably TIBCO EBX.
- Experience in working with Jira/Confluence.

**Additional Information:**
- Office, remote, or hybrid working.
- Ability to function within variable time zones.
- International travel may be required.
Posted 1 week ago
2.0 - 5.0 years
9 - 13 Lacs
Ahmedabad
Work from Office
Role Expectations

Advanced Analytics, Leadership & Cross-functional Collaboration
- Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities
- Manage and optimize processes for data intake, validation, mining, and engineering, as well as modelling, visualization, and communication deliverables
- Develop strategies for effective data analysis and reporting
- Define company-wide metrics and relevant data sources
- Build systems to transform raw data into actionable business insights
- Work closely with business leaders and stakeholders to understand their needs and translate them into functional and technical requirements
- Champion a data-driven culture, promoting the use of insights across the organization for informed decision-making
- Communicate technical complexities clearly to non-technical stakeholders and align diverse teams around common goals
- Lead digital transformation initiatives to foster innovation and modernization

IT Management and Strategic Planning
- Oversee the architectural planning, development, and operation of all IT systems, ensuring their scalability, performance, security, and continuous integration/continuous deployment (CI/CD)
- Evaluate, select, and implement cutting-edge technology platforms and infrastructure to enable business growth and competitive advantage
- Develop and manage an IT budget to optimize resource allocation and ensure ROI on IT investments
- Establish IT policies, standards, and procedures in line with best practices and regulatory compliance
- Drive the talent management lifecycle of the IT team, including hiring, training, coaching, and performance management

Profile We Are Looking At
- Working in/anchoring the Analytics team for the DTC business and marketplaces
- A person who lives, breathes, and dreams numbers
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field; a Master's degree or MBA is a plus
- Minimum of 3-4 years of experience in IT management and/or advanced analytics roles
- Proficiency in advanced analytics techniques (e.g., machine learning) and tools (e.g., Python, R, SQL, Tableau, Hadoop)
- Extensive experience with IT systems architecture, cloud-based solutions (AWS, Google Cloud, Azure), and modern development methodologies (Agile, Scrum, DevOps)
- Proven ability to lead and develop a high-performing team
- Strong communication, strategic thinking, and project management skills
- Familiarity with data privacy standards and regulations (e.g., GDPR, CCPA)
- Experience in creating breakthrough visualizations
- Understanding of RDBMS, data architecture/schemas, data integrations, data models, and data flows is a must
- Technical (ideal to have): exposure to our tech stack (PHP); Microsoft workflows knowledge
- Experience in the beauty and personal care industry is desirable
Posted 1 week ago
1.0 - 3.0 years
9 - 13 Lacs
Pune
Work from Office
Delivery Manager - Data Engineering (Databricks & Snowflake)

Position: Delivery Manager - Data Engineering
Location: Bavdhan/Baner, Pune
Experience: 7-10 years
Employment Type: Full-time

Job Summary
We are seeking a Delivery Manager - Data Engineering to oversee multiple data engineering projects leveraging Databricks and Snowflake. This role requires strong leadership skills to manage teams, ensure timely delivery, and drive best practices in cloud-based data platforms. The ideal candidate will have deep expertise in data architecture, ETL processes, cloud data platforms, and stakeholder management.

Key Responsibilities

Project & Delivery Management:
- Oversee the end-to-end delivery of multiple data engineering projects using Databricks and Snowflake
- Define project scope, timelines, milestones, and resource allocation to ensure smooth execution
- Identify and mitigate risks, ensuring that projects are delivered on time and within budget
- Establish agile methodologies (Scrum, Kanban) to drive efficient project execution

Data Engineering & Architecture Oversight:
- Provide technical direction on data pipeline architecture, data lakes, data warehousing, and ETL frameworks
- Ensure optimal performance, scalability, and security of data platforms
- Collaborate with data architects and engineers to design and implement best practices for data processing and analytics

Stakeholder & Client Management:
- Act as the primary point of contact for clients, senior management, and cross-functional teams
- Understand business requirements and translate them into technical solutions
- Provide regular status updates and manage client expectations effectively

Team Leadership & People Management:
- Lead, mentor, and develop data engineers, architects, and analysts working across projects
- Drive a culture of collaboration, accountability, and continuous learning
- Ensure proper resource planning and capacity management to balance workload effectively

Technology & Process Improvement:
- Stay up to date with emerging trends in Databricks, Snowflake, and cloud data technologies
- Continuously improve delivery frameworks, automation, and DevOps for data engineering
- Implement cost-optimization strategies for cloud-based data solutions

Required Skills & Experience

Technical Expertise:
- 10+ years of experience in data engineering and delivery management
- Strong expertise in Databricks, Snowflake, and cloud platforms (AWS, Azure, GCP)
- Hands-on experience in ETL, data modeling, and big data processing frameworks (Spark, Delta Lake, Apache Airflow, DBT)
- Understanding of data governance, security, and compliance standards (GDPR, CCPA, HIPAA, etc.)
- Familiarity with SQL, Python, Scala, or Java for data transformation

Project & Team Management:
- Proven experience in managing multiple projects simultaneously
- Strong knowledge of Agile, Scrum, and DevOps practices
- Experience in budgeting, forecasting, and resource management

Soft Skills & Leadership:
- Excellent communication and stakeholder management skills
- Strong problem-solving and decision-making abilities
- Ability to motivate and lead cross-functional teams effectively

Preferred Qualifications
- Experience with data streaming (Kafka, Kinesis, or Pub/Sub)
- Knowledge of ML and AI-driven data processing solutions
- Certifications in Databricks, Snowflake, or cloud platforms (AWS/Azure/GCP)

Apply or share your updated CV at hr@anvicybernetics
Posted 1 week ago
2.0 - 7.0 years
9 - 13 Lacs
Gurugram
Work from Office
We are looking for a skilled Database Specialist to join our team at Squareops, focusing on cloud infrastructure. The ideal candidate will have 2-7 years of experience in database management and cloud computing.

Roles and Responsibilities
- Design, implement, and manage databases for cloud infrastructure projects.
- Collaborate with cross-functional teams to identify and prioritize database requirements.
- Develop and maintain database documentation and technical specifications.
- Ensure data security, integrity, and compliance with industry standards.
- Troubleshoot and resolve complex database issues efficiently.
- Optimize database performance and scalability for large-scale applications.

Job Requirements
- Strong knowledge of database management systems and cloud computing platforms.
- Experience with designing and implementing scalable and secure databases.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.
- Familiarity with database development tools and technologies.
Posted 1 week ago
9.0 - 14.0 years
40 - 55 Lacs
Bengaluru
Work from Office
Roles and responsibilities:
- Collaborate with cross-functional teams to understand data requirements and design scalable and efficient data processing solutions.
- Develop and maintain data pipelines using PySpark and SQL on the Databricks platform.
- Optimise and tune data processing jobs for performance and reliability.
- Implement automated testing and monitoring processes to ensure data quality and reliability.
- Work closely with data scientists, data analysts, and other stakeholders to understand their data needs and provide effective solutions.
- Troubleshoot and resolve data-related issues, including performance bottlenecks and data quality problems.
- Stay up to date with industry trends and best practices in data engineering and Databricks.

Key Requirements:
- 8+ years of experience as a Data Engineer, with a focus on Databricks and cloud-based data platforms, including a minimum of 4 years of experience in writing unit/end-to-end tests for data pipelines and ETL processes on Databricks.
- Hands-on experience in PySpark programming for data manipulation, transformation, and analysis.
- Strong experience in SQL and writing complex queries for data retrieval and manipulation.
- Experience with Docker for containerising and deploying data engineering applications is good to have.
- Strong knowledge of the Databricks platform and its components, including Databricks notebooks, clusters, and jobs.
- Experience in designing and implementing data models to support analytical and reporting needs will be an added advantage.
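Pipeline code of this kind is usually written as pure PySpark transformations so the logic can be unit tested against a local SparkSession; a minimal sketch, with invented function and column names:

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def daily_revenue(orders: DataFrame) -> DataFrame:
    """Aggregate completed order amounts per day; kept pure for easy testing."""
    return (
        orders
        .filter(F.col("status") == "COMPLETE")
        .groupBy(F.to_date("created_at").alias("day"))
        .agg(F.sum("amount").alias("revenue"))
    )


def test_daily_revenue() -> None:
    spark = SparkSession.builder.master("local[1]").appName("test").getOrCreate()
    orders = spark.createDataFrame(
        [("2024-01-01 10:00:00", "COMPLETE", 10.0),
         ("2024-01-01 11:00:00", "CANCELLED", 99.0)],
        ["created_at", "status", "amount"],
    )
    result = daily_revenue(orders).collect()
    assert len(result) == 1 and result[0]["revenue"] == 10.0


if __name__ == "__main__":
    test_daily_revenue()
```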
Posted 1 week ago
7.0 - 9.0 years
8 - 14 Lacs
Agra
Work from Office
Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 1 week ago
7.0 - 9.0 years
8 - 14 Lacs
Vadodara
Work from Office
Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 1 week ago
7.0 - 9.0 years
8 - 14 Lacs
Faridabad
Work from Office
Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 1 week ago
7.0 - 9.0 years
8 - 14 Lacs
Jaipur
Work from Office
Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 1 week ago
7.0 - 9.0 years
8 - 14 Lacs
Nagpur
Work from Office
Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 1 week ago
3.0 - 6.0 years
15 - 16 Lacs
Pune
Work from Office
At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hiring MDM professionals in the following areas:

Job Title: Profisee MDM Developer
Experience Required: 4 to 6 years
Employment Type: Full-Time / Contract

Job Summary:
We are seeking a skilled and motivated Profisee MDM professional to join our data management team. The ideal candidate will have hands-on experience with Profisee MDM and a strong understanding of master data management principles, data governance, and data integration. You will play a key role in designing, implementing, and maintaining MDM solutions that support business operations and data quality initiatives.

Key Responsibilities:
- Design and configure Profisee MDM solutions, including data models, business rules, workflows, and user interfaces.
- Collaborate with business and IT stakeholders to gather requirements and define master data domains.
- Implement data quality rules and validation processes to ensure data accuracy and consistency.
- Integrate MDM solutions with enterprise systems using ETL tools and APIs.
- Monitor and maintain MDM performance, troubleshoot issues, and optimize configurations.
- Support data governance initiatives and ensure compliance with data standards and policies.
- Provide training and support to end-users and contribute to documentation and best practices.

Required Skills:
- 4-6 years of experience in Master Data Management, with at least 2-3 years of hands-on experience in Profisee MDM.
- Strong understanding of data modeling, data governance, and data quality principles.
- Proficiency in SQL and experience with ETL tools and data integration techniques.
- Familiarity with enterprise data architecture and systems (e.g., ERP, CRM).
- Excellent problem-solving, communication, and collaboration skills.

Required Technical/Functional Competencies:
- Domain/Industry Knowledge: Basic knowledge of customers' business processes and the relevant technology platform or product. Able to prepare process maps, workflows, business cases, and simple business models in line with customer requirements with assistance from SMEs, and apply industry standards/practices in implementation with guidance from experienced team members.
- Requirement Gathering and Analysis: Working knowledge of requirement management and requirement analysis processes, tools, and methodologies. Able to analyse the impact of a requested change/enhancement/defect fix, and identify dependencies or interrelationships among requirements and transition requirements for the engagement.
- Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs in design and architecture, adhering to industry standards/practices in implementation. Analyse various frameworks/tools, review the code, and provide feedback on improvement opportunities.
- Architecture Tools and Frameworks: Basic knowledge of industry architecture tools and frameworks. Able to analyse available tools and frameworks for review by the SME and plan for tool configurations and development.
- Architecture Concepts and Principles: Basic knowledge of architectural elements, SDLC, and methodologies. Able to apply various architectural constructs in projects and identify and implement various architectural patterns.
- Analytics Solution Design: High-level awareness of a wide range of core data science/analytics techniques, their advantages, disadvantages, and areas of application.
- Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial and open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies:
- Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
- Collaboration: Participates in team activities and reaches out to others in the team to achieve common goals.
- Agility: Demonstrates a willingness to accept and embrace differing ideas or perceptions which are beneficial to the organization.
- Customer Focus: Displays awareness of customers' stated needs and gives priority to meeting and exceeding customer expectations at or above expected quality within stipulated time.
- Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
- Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.

Certifications: Good to have.

Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture
Posted 1 week ago
7.0 - 8.0 years
32 - 45 Lacs
Pune
Work from Office
We are looking to add an experienced and enthusiastic Lead Data Scientist to our Jet2 Data Science team in India. Reporting to the Data Science Delivery Manager, the Lead Data Scientist is a key appointment in the Data Science Team, with responsibility for executing the data science strategy and realising the benefits we can bring to the business by combining insights gained from multiple large data sources with the contextual understanding and experience of our colleagues across the business.

In this exciting role, you will be joining an established team of 40+ data science professionals, based across our UK and India bases, who are using data science to understand, automate, and optimise key manual business processes, inform our marketing strategy, assess product development and revenue opportunities, and optimise operational costs. As Lead Data Scientist, you will have strong experience in leading data science projects and creating machine learning models, and be able to confidently communicate with and enthuse key business stakeholders.

Roles and Responsibilities
A typical day in your role at Jet2TT:
- As Lead Data Scientist, you will lead a team of data scientists and be responsible for delivering and managing day-to-day activities.
- The successful candidate will be highly numerate with a statistical background, experienced in using R, Python, or a similar statistical analysis package.
- You will be expected to work with internal teams across the business to identify and collaborate with stakeholders across the wider group.
- Leading and coaching a group of data scientists, you will plan and execute the use of machine learning and statistical modelling tools suited to the identified delivery or discovery problem.
- You will have a strong ability to analyse the created algorithms and models to understand how changes in metrics in one area of the business could impact other areas, and be able to communicate those analyses to key business stakeholders.
- You will identify efficiencies in the use of data across its lifecycle, reducing data redundancy, structuring data to ensure efficient use of time, and ensuring retained data/information provides value to the organisation and remains in line with legitimate business and/or regulatory requirements.
- Your ability to rise above groupthink and see beyond the here and now is matched only by your intellectual curiosity.
- Strong SQL skills and the ability to create clear data visualisations in tools such as Tableau or Power BI will be essential. You will also have experience in developing and deploying predictive models using machine learning frameworks and have worked with big data technologies.
- As we aim to realise the benefits of cloud technologies, some familiarity with cloud platforms like AWS for data science and storage would be desirable.
- You will be skilled in gathering data from multiple sources and in multiple formats, with knowledge of data warehouse design, logical and physical database design, and the challenges posed by data quality.
Qualifications, Skills and Experience (Candidate Requirements):
- Experience in leading a small to mid-size data science team
- Minimum 7 years of experience in the industry, with 4+ years in data science
- Experience in building and deploying machine learning algorithms, with detailed knowledge of applied statistics
- Good understanding of various data architectures: RDBMS, data warehouse, and big data
- Experience of working with regions such as the US, UK, Europe, or Australia is a plus
- Ability to liaise with data engineers, technology leaders, and business stakeholders
- Working knowledge of the Agile framework is good to have
- Demonstrates willingness to learn
- Mentoring and coaching team members
- Strong delivery performance, working on complex solutions in a fast-paced environment
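As an illustration of the modelling work described above, a minimal scikit-learn sketch trained on synthetic data; the features are stand-ins and carry no business meaning:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for tabular features such as booking lead time or basket value.
rng = np.random.default_rng(42)
X = rng.normal(size=(1_000, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1_000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Scale the features, then fit a simple baseline classifier.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.3f}")
```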
Posted 1 week ago
3.0 - 4.0 years
17 - 18 Lacs
Bengaluru
Work from Office
KPMG India is looking for an Azure Data Engineer - Consultant to join our dynamic team and embark on a rewarding career journey.
- Ensure that data is cleansed, mapped, transformed, and otherwise optimised for storage and use according to business and technical requirements
- Design solutions using Microsoft Azure services and other tools
- Automate tasks and deploy production-standard code (with unit testing, continuous integration, versioning, etc.)
- Load transformed data into storage and reporting structures in destinations including the data warehouse, high-speed indexes, real-time reporting systems, and analytics applications
- Build data pipelines to collectively bring together data
- Other responsibilities include extracting data, troubleshooting, and maintaining the data warehouse
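One concrete form the load step can take is landing a transformed Parquet file in Azure Blob Storage via the azure-storage-blob SDK; a minimal sketch with invented container, path, and environment-variable names (Parquet writing assumes pyarrow is installed):

```python
import io
import os

import pandas as pd
from azure.storage.blob import BlobServiceClient

# Transform a small frame and serialise it to Parquet in memory.
df = pd.DataFrame({"region": ["north", "south"], "sales": [120, 95]})
buf = io.BytesIO()
df.to_parquet(buf, index=False)

# Load: upload the bytes to a storage container.
service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONN"])
blob = service.get_blob_client(container="curated", blob="sales/daily.parquet")
blob.upload_blob(buf.getvalue(), overwrite=True)
```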
Posted 1 week ago
6.0 - 13.0 years
20 - 25 Lacs
Hyderabad
Work from Office
ServiceNow is looking for an experienced Data Platform Architect to join our Customer Journey Data Platform and Insights team. This is a senior-level, hands-on architecture role focused on building a unified, intelligent data platform that brings together signals across the entire customer journey, from product usage and learning engagement to support interactions and community activity. You will play a critical role in designing and delivering a scalable, secure, performant, and AI-ready data platform that powers customer understanding, insights generation, and value-driven actions. Experience with the ServiceNow platform, enterprise software, and AI/ML integration is highly preferred.

Key Responsibilities

Platform Architecture Design:
- Design and lead the technical architecture for a unified customer journey data platform leveraging modern data and AI technologies.
- Define data modeling standards, integration blueprints, and lifecycle management practices.
- Architect scalable, distributed data systems including data lakes, data mesh, data fabric, data warehouses, and real-time processing pipelines.
- Create reusable architectural patterns that support real-time and batch processing across structured and unstructured datasets.

Technical Strategy Leadership:
- Partner with engineering, product, and analytics teams to align technical roadmaps with customer insights and business objectives.
- Translate business needs and customer signals into data solutions and architectural strategies.
- Provide thought leadership in evaluating and implementing cutting-edge tools, frameworks, and best practices.
- Drive architectural governance, review processes, and architecture maturity models.

Cloud, Governance, and Security:
- Design and optimize cloud-native solutions using Azure, Snowflake, Databricks, and other modern platforms.
- Establish governance frameworks ensuring data quality, privacy, lineage, and compliance across all sources.
- Implement robust data security practices, including RBAC, encryption, and access auditing.

AI/ML Analytics Integration:
- Architect platforms to support AI/ML workflows, including feature stores, model inference pipelines, and feedback loops.
- Work closely with data scientists and ML engineers to embed intelligence across customer journeys.
- Enable data democratization and advanced insights through curated datasets, semantic layers, and analytical products.

ServiceNow Platform Alignment:
- Integrate deeply with ServiceNow platform data models, APIs, and event-based systems.
- Architect solutions that enhance customer understanding within IT/DT software domains and workflows powered by ServiceNow.
- Ensure platform extensibility and an API-driven architecture that supports self-service analytics, operational workflows, and personalization.

To be successful in this role you have: experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools,
Posted 1 week ago
4.0 - 8.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Job Description

We are seeking a Lead Snowflake Engineer to join our dynamic Data Engineering team. This role will involve owning the architecture, implementation, and optimization of our Snowflake-based data warehouse solutions while mentoring a team of engineers and driving project success. The ideal candidate will bring deep technical expertise in Snowflake, hands-on experience with DBT (Data Build Tool), and a collaborative mindset for working across data, analytics, and business teams.

Key Responsibilities:
- Design and implement scalable and efficient Snowflake data warehouse architectures and ELT pipelines.
- Leverage DBT to build and manage data transformation workflows within Snowflake.
- Lead data modeling efforts to support analytics and reporting needs across the organization.
- Optimize Snowflake performance, including query tuning, resource scaling, and storage usage.
- Collaborate with business stakeholders and data analysts to gather requirements and deliver high-quality data solutions.
- Manage and mentor a team of data engineers; provide technical guidance, code reviews, and career development support.
- Establish and enforce best practices for data engineering, including version control, CI/CD, documentation, and data quality.
- Ensure data solutions are secure, compliant, and aligned with privacy regulations (e.g., GDPR, CCPA).
- Continuously evaluate emerging tools and technologies to enhance our data ecosystem.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in data engineering, including at least 2+ years of hands-on experience with Snowflake.
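The ELT pattern this role centres on pushes transformation work into Snowflake itself, which is also what DBT models compile down to; a minimal sketch using the snowflake-connector-python client, with placeholder credentials and invented table names:

```python
import snowflake.connector

# Connection parameters are placeholders; in practice they come from a secrets store.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="MARTS",
)
try:
    cur = conn.cursor()
    # ELT-style transformation: the aggregation runs inside Snowflake.
    cur.execute("""
        CREATE OR REPLACE TABLE daily_orders AS
        SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM raw.orders
        GROUP BY order_date
    """)
finally:
    conn.close()
```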
Posted 1 week ago
8.0 - 13.0 years
20 - 25 Lacs
Ahmedabad
Work from Office
Position Overview
This role is responsible for defining and delivering ZURU's next-generation data architecture, built for global scalability, real-time analytics, and AI enablement. You will lead the unification of fragmented data systems into a cohesive, cloud-native platform that supports advanced business intelligence and decision-making. Sitting at the intersection of data strategy, engineering, and commercial enablement, this role demands both deep technical acumen and strong cross-functional influence. You will drive the vision and implementation of robust data infrastructure, champion governance standards, and embed a culture of data excellence across the organisation.

Position Impact
In the first six months, the Head of Data Architecture will gain a deep understanding of ZURU's operating model, technology stack, and data fragmentation challenges. You'll conduct a comprehensive review of the current architecture, identifying performance gaps, security concerns, and integration challenges across systems like SAP, Odoo, POS, and marketing platforms. By month twelve, you'll have delivered a fully aligned architecture roadmap, implementing cloud-native infrastructure, data governance standards, and scalable models and pipelines to support AI and analytics. You will have stood up a Centre of Excellence for Data, formalised global data team structures, and established yourself as a trusted partner to senior leadership.

What are you Going to do
- Lead Global Data Architecture: Own the design, evolution, and delivery of ZURU's enterprise data architecture across cloud and hybrid environments.
- Consolidate Core Systems: Unify data sources across SAP, Odoo, POS, IoT, and media into a single analytical platform optimised for business value.
- Build Scalable Infrastructure: Architect cloud-native solutions that support both batch and streaming data workflows using tools like Databricks, Kafka, and Snowflake.
- Implement Governance Frameworks: Define and enforce enterprise-wide data standards for access control, privacy, quality, security, and lineage.
- Enable Metadata Cataloguing: Deploy metadata management and cataloguing tools to enhance data discoverability and self-service analytics.
- Operationalise AI/ML Pipelines: Lead data architecture that supports AI/ML initiatives, including demand forecasting, pricing models, and personalisation.
- Partner Across Functions: Translate business needs into data architecture solutions by collaborating with leaders in Marketing, Finance, Supply Chain, R&D, and Technology.
- Optimise Cloud Cost and Performance: Roll out compute and storage systems that balance cost efficiency, performance, and observability across platforms.
- Establish Data Leadership: Build and mentor a high-performing data team across India and NZ, and drive alignment across engineering, analytics, and governance.
- Vendor and Tool Strategy: Evaluate external tools and partners to ensure the data ecosystem is future-ready, scalable, and cost-effective.

What are we Looking for
- 8+ years of experience in data architecture, with 3+ years in a senior or leadership role across cloud or hybrid environments
- Proven ability to design and scale large data platforms supporting analytics, real-time reporting, and AI/ML use cases
- Hands-on expertise with ingestion, transformation, and orchestration pipelines (e.g., Kafka, Airflow, DBT, Fivetran)
- Strong knowledge of ERP data models, especially SAP and Odoo
- Experience with data governance, compliance (GDPR/CCPA), metadata cataloguing, and security practices
- Familiarity with distributed systems and streaming frameworks like Spark or Flink
- Strong stakeholder management and communication skills, with the ability to influence both technical and business teams
- Experience building and leading cross-regional data teams

Tools & Technologies
- Cloud Platforms: AWS (S3, EMR, Kinesis, Glue), Azure (Synapse, ADLS), GCP
- Big Data: Hadoop, Apache Spark, Apache Flink
- Streaming: Kafka, Kinesis, Pub/Sub
- Orchestration: Airflow, Prefect, Dagster, DBT
- Warehousing: Snowflake, Redshift, BigQuery, Databricks Delta
- NoSQL: Cassandra, DynamoDB, HBase, Redis
- Query Engines: Presto/Trino, Athena
- IaC & CI/CD: Terraform, GitLab Actions
- Monitoring: Prometheus, Grafana, ELK, OpenTelemetry
- Security/Governance: IAM, TLS, KMS, Amundsen, DataHub, Collibra, DBT for lineage

What do we Offer
- Competitive compensation
- 5 working days with flexible working hours
- Medical insurance for self and family
- Training and skill development programs
- Work with the global team and make the most of its diverse knowledge
- Several discussions over multiple pizza parties
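For a sense of the orchestration layer such a platform typically relies on, a minimal Apache Airflow DAG sketch; the DAG id, task names, and task bodies are invented:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    # Placeholder body; a real task would pull ERP/POS extracts.
    print("pulling ERP and POS extracts")


def load() -> None:
    # Placeholder body; a real task would load curated warehouse tables.
    print("loading curated tables into the warehouse")


with DAG(
    dag_id="daily_customer_journey",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```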
Posted 1 week ago
3.0 - 8.0 years
10 - 14 Lacs
Pune
Work from Office
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: Learn more about us.

Key Attributes
- Hands-on experience in data migration and ETL tools like BODS
- LTMC knowledge with SAP functional understanding
- Excellent project management and communication skills
- End-to-end data migration landscape knowledge
- Lead role experience
- Domestic client handling experience, i.e., working with clients with domestic onsite presence
- Understanding of data security and data compliance
- Agile understanding
- Certification (good to have)
- Domain knowledge of the manufacturing industry sector

Mandatory skill sets: SAP BODS
Preferred skill sets: SAP BODS
Years of experience required: 3-8 years
Education qualification: BE, B.Tech, MCA, M.Tech
Education Degrees/Field of Study required: Bachelor of Technology, Master of Engineering, Bachelor of Engineering

Required Skills: SAP BO Data Services (BODS), Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
Posted 1 week ago
4.0 - 8.0 years
11 - 15 Lacs
Ahmedabad
Work from Office
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Roles & Responsibilities
- Hands-on experience with Power BI dashboard development, and willing to work as an individual contributor.
- Clear understanding of data warehousing concepts.
- Work closely with a data engineering team to perform data extraction and data transformation processes to create datasets.
- Good experience with different categories of DAX functions: time intelligence, filter, date, logical, text, number, and statistical functions.
- Good experience with visual-level, page-level, report-level, and drill-through filters for filtering data in a report.
- Experience in Row-Level Security (RLS) implementation in Power BI.
- Work with the On-Premises Data Gateway to refresh and schedule refreshes of datasets.
- Strong data transformation skills through Power Query Editor, with familiarity in the M language.
- Data modelling knowledge, including joins on multiple tables and creating new bridge tables.
- Knowledge of Power BI Desktop features like bookmarks, selections, sync slicers, and edit interactions.
- Knowledge of Power BI Service features like creating imports, scheduling extract refreshes, managing subscriptions, etc.
- Publishing and maintenance of apps in Power BI, including knowledge of configuring row-level and dashboard-level security in Power BI Service.
- Experience in creating and publishing reports in both web and mobile layouts.
- Able to perform unit testing such as functionality testing and data validation.
- Report performance optimization and troubleshooting.
- Clear understanding of UI and UX design.
- Hands-on working experience in SQL to write queries.
- Very good communication skills; must be able to discuss requirements effectively with business owners.

Mandatory skill sets: Power BI, DAX
Preferred skill sets: Power BI, DAX
Years of experience required: 4-8 years
Educational qualification: BE, B.Tech, MCA, M.Tech
Education Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology, Bachelor of Engineering

Required Skills: DAX Language, Power BI, Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Posted 1 week ago
2.0 - 4.0 years
6 - 9 Lacs
Chennai
Work from Office
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Requirements
- Hands-on experience in IICS / Informatica PowerCenter.
- Demonstrated involvement in an end-to-end IICS/IDMC project.
- Hands-on experience in Informatica PowerCenter and Structured Query Language (SQL).
- Data warehouse expertise.
- Experience in Extract, Transform, Load (ETL) testing.
- Effective communication skills.

Key Responsibilities
- Design and develop ETL processes using Informatica IICS / Informatica PowerCenter.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.
- Good expertise in IICS data integration / Informatica PowerCenter and application integration, plus Oracle SQL.
- Implement data integration solutions that ensure data accuracy and consistency.
- Monitor and optimize existing workflows for performance and efficiency.
- Troubleshoot and resolve any issues related to data integration and ETL processes.
- Maintain documentation for data processes and integration workflows.

Mandatory skill sets: ETL Informatica
Preferred skill sets: ETL Informatica
Years of experience required: 2-4 years
Education qualification: BTech/MBA/MCA
Education Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology

Required Skills: ETL (Informatica), Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Posted 1 week ago
2.0 - 4.0 years
5 - 9 Lacs
Hyderabad
Work from Office
About the role
You'll be at the heart of developing and maintaining our sophisticated in-house insurance products built on relational or document databases. You will have the opportunity to join one of our product teams and contribute to the development of functionality which generates real business impact.

About the team
We are a team that believes in engineering excellence and that our leaders should also be engineers themselves. We build applications that are carefully designed, thoughtfully implemented, and surpass the expectations of our users by working together with product owners. Quality and stability are first-class deliverables in everything we do, and we lead by example by embedding high standards into our processes.

Your responsibilities include
- Designing, developing, deploying, and supporting sustainable data/solution architectures such as design patterns, reference data architecture, and conceptual, logical, and physical data models for both relational and NoSQL databases.
- Data migration/ingestion/transfer from and to heterogeneous databases and file types.
- Performance optimization (query fine-tuning, indexing strategy, etc.).
- Supporting the project team in conducting public cloud data growth and data service consumption assessments and forecasts.
- Collaborating effectively within a cross-functional team including requirements engineers, QA specialists, and other application engineers.
- Staying current with emerging technologies and Generative AI developments to continuously improve our solutions.

About you
You're a naturally curious and thoughtful professional who thrives in a high-performance engineering environment. Your passion for coding is matched by your commitment to delivering business value. You believe in continuous learning, through self-improvement or by absorbing knowledge from those around you, and you're excited to contribute to a team that values technical excellence.

You should bring the following skills and experiences
- Proficiency in relational and NoSQL databases
- Proficiency in PL/SQL programming
- Strong data model and database design skills for both relational and NoSQL databases
- Experience with seamless data integration using Informatica and Azure Data Factory
- Previous public cloud experience, particularly with Microsoft Azure, is a must
Posted 1 week ago
6.0 - 8.0 years
22 - 27 Lacs
Hyderabad
Work from Office
We are seeking a Lead Snowflake Engineer to join our dynamic Data Engineering team. This role will involve owning the architecture, implementation, and optimization of our Snowflake-based data warehouse solutions while mentoring a team of engineers and driving project success. The ideal candidate will bring deep technical expertise in Snowflake, hands-on experience with DBT (Data Build Tool), and a collaborative mindset for working across data, analytics, and business teams.

Key Responsibilities:
- Design and implement scalable and efficient Snowflake data warehouse architectures and ELT pipelines.
- Leverage DBT to build and manage data transformation workflows within Snowflake.
- Lead data modeling efforts to support analytics and reporting needs across the organization.
- Optimize Snowflake performance, including query tuning, resource scaling, and storage usage.
- Collaborate with business stakeholders and data analysts to gather requirements and deliver high-quality data solutions.
- Manage and mentor a team of data engineers; provide technical guidance, code reviews, and career development support.
- Establish and enforce best practices for data engineering, including version control, CI/CD, documentation, and data quality.
- Ensure data solutions are secure, compliant, and aligned with privacy regulations (e.g., GDPR, CCPA).
- Continuously evaluate emerging tools and technologies to enhance our data ecosystem.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in data engineering, including at least 2+ years of hands-on experience with Snowflake.
Posted 1 week ago