10.0 - 18.0 years
2 - 3 Lacs
Hyderabad
Work from Office
Experience needed: 12-18 years | Type: Full-Time | Mode: WFO | Shift: General Shift (IST) | Location: Hyderabad | NP: Immediate Joinee - 30 days

Job Summary: We are looking for an experienced and visionary Data Architect - Azure Data & Analytics to lead the design and delivery of scalable, secure, and modern data platform solutions leveraging Microsoft Azure and Microsoft Fabric. This role requires deep technical expertise in the Azure Data & Analytics ecosystem, strong experience in designing cloud-native architectures, and a strategic mindset to modernize enterprise data platforms.

Key Responsibilities:
- Architect and design Modern Data Platform solutions on Microsoft Azure, including ingestion, transformation, storage, and visualization layers.
- Lead implementation and integration of Microsoft Fabric, including OneLake, Direct Lake mode, and Fabric workloads (Data Engineering, Data Factory, Real-Time Analytics, Power BI).
- Define enterprise-level data architecture, including data lakehouse patterns, delta lakes, data marts, and semantic models.
- Collaborate with business stakeholders, data engineers, and BI teams to translate business needs into scalable cloud data solutions.
- Design solutions using Azure-native services such as Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage (Gen2), Azure SQL, and Azure Event Hubs.
- Establish best practices for data security, governance, DevOps, CI/CD pipelines, and cost optimization.
- Guide implementation teams on architectural decisions and technical best practices across the data lifecycle.
- Develop reference architectures and reusable frameworks to accelerate data platform implementations.
- Stay updated on Microsoft's data platform roadmap and proactively identify opportunities to enhance data strategy.
- Assist in developing RFPs, architecture assessments, and solution proposals.
Requirements

Required Skills & Qualifications:
- 12-18 years of proven experience, including designing and implementing cloud-based modern data platforms on Microsoft Azure.
- Deep knowledge and understanding of Microsoft Fabric architecture, including Data Factory, Data Engineering, Synapse Real-Time Analytics, and Power BI integration.
- Expertise in Azure Data Services: Azure Synapse, Data Factory, Azure SQL, ADLS Gen2, Azure Functions, Azure Purview, Event Hubs, etc.
- Experience with data warehousing, lakehouse architectures, ETL/ELT, and data modeling.
- Experience in data governance, security, role-based access (Microsoft Entra/Azure AD), and compliance frameworks.
- Strong leadership and communication skills to influence both technical and non-technical stakeholders.
- Familiarity with DevOps and infrastructure-as-code (e.g., ARM templates, Bicep, Terraform) is a plus.

Preferred Qualifications:
- Microsoft Certified: Azure Solutions Architect Expert, Azure Data Engineer Associate, or Microsoft Fabric certification.
- Experience with real-time data streaming, IoT, or machine learning pipelines in Azure.
- Familiarity with multi-cloud data strategies or hybrid deployments is an advantage.
Posted 3 weeks ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
As a Principal Data Engineer at Skillsoft, you will play a crucial role in driving the advancement of Enterprise data infrastructure by designing and implementing the logic and structure for how data is set up, cleansed, and stored for organizational usage. You will be responsible for developing a Knowledge Management strategy to support Skillsoft's analytical objectives across various business areas.

Your role will involve building robust systems and reusable code modules to solve problems, working with the latest open-source tools and platforms to build data products, and collaborating with Product Owners and cross-functional teams in an agile environment. Additionally, you will champion the standardization of processes for data elements used in analysis, establish forward-looking data and technology objectives, manage a small team through project deliveries, and design rich data visualizations and interactive tools to communicate complex ideas to stakeholders. Furthermore, you will evangelize the Enterprise Data Strategy & Execution Team mission, identify opportunities to influence decision-making with supporting data and analysis, and seek additional data resources that align with strategic objectives.

To qualify for this role, you should possess a degree in Data Engineering, Information Technology, CIS, CS, or a related field, along with 7+ years of experience in Data Engineering/Data Management. You should have expertise in building cloud data applications, cloud computing, data engineering/analysis programming languages, and SQL Server. Proficiency in data architecture and data modeling, and experience with technology stacks for Metadata Management, Data Governance, and Data Quality, are essential. Additionally, experience working cross-functionally across an enterprise organization and in an Agile methodology environment is preferred. Your strong business acumen, analytical skills, technical abilities, and problem-solving skills will be critical in this role.
Experience with app and web analytics data, CRM, and ERP systems data is a plus. Join us at Skillsoft and be part of our mission to democratize learning and help individuals unleash their edge. If you find this opportunity intriguing, we encourage you to apply and be a part of our team dedicated to leadership, learning, and success at Skillsoft. Thank you for considering this role.
Posted 3 weeks ago
7.0 - 12.0 years
0 Lacs
maharashtra
On-site
As a Lead Data Engineer, you will leverage your 7 to 12+ years of hands-on experience in SQL database design, data architecture, ETL, data warehousing, data marts, data lakes, big data, cloud (AWS), and data governance domains. Your expertise in a modern programming language such as Scala, Python, or Java, with a preference for Spark/PySpark, will be crucial in this role.

This role requires experience with configuration management and version control tools like Git, along with familiarity working within a CI/CD framework. Experience in building frameworks will be considered a significant advantage. A minimum of 8 years of recent hands-on SQL programming experience in a big data environment is necessary, with a preference for experience in Hadoop/Hive. Proficiency in PostgreSQL, RDBMS, NoSQL, and columnar databases will be beneficial.

Your hands-on experience with AWS cloud data engineering components, including API Gateway, Glue, IoT Core, EKS, ECS, S3, RDS, Redshift, and EMR, will play a vital role in developing and maintaining ETL applications and data pipelines using big data technologies. Experience with Apache Kafka, Spark, and Airflow is a must-have for this position.

If you are excited about this opportunity and possess the required skills and experience, please share your CV with us at omkar@hrworksindia.com. We look forward to potentially welcoming you to our team. Regards, Omkar
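For readers unfamiliar with the extract/transform/load work the posting above describes, here is a minimal, hedged sketch of the pattern as composable Python functions. In a stack like the one named (Spark/PySpark, Kafka, Airflow) each stage would be a distributed job; the record shape and field names below are invented for illustration.

```python
# Sketch of an ETL pipeline as three composable stages. Field names
# ("id", "city") are hypothetical, not from any real source system.

def extract(rows):
    """Simulate reading raw records from a source system."""
    yield from rows

def transform(records):
    """Cleanse: drop rows missing the key, normalize city casing."""
    for r in records:
        if r.get("id") is None:
            continue  # quality gate: reject incomplete records
        yield {**r, "city": r.get("city", "").strip().title()}

def load(records, sink):
    """Append cleansed records to a target store (a list stands in here)."""
    for r in records:
        sink.append(r)
    return sink

raw = [
    {"id": 1, "city": "  pune "},
    {"id": None, "city": "mumbai"},   # dropped by the quality gate
    {"id": 2, "city": "BENGALURU"},
]
warehouse = load(transform(extract(raw)), [])
```

Because each stage is a generator, records stream through one at a time rather than materializing intermediate datasets, which is the same principle distributed engines apply at scale.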
Posted 3 weeks ago
12.0 - 16.0 years
0 Lacs
karnataka
On-site
This is a full-time position with D Square Consulting Services Pvt Ltd. As a Senior Data Architect / Modeler, you will collaborate closely with various business partners, product owners, data strategy, data platform, data science, and machine learning teams to innovate data products for end users. Your role will involve shaping the overall solution architecture and defining models for data products using best-in-class engineering practices. By working with stakeholders, you will comprehend business requirements and design/build data models that support acquisition, ingestion processes, and critical reporting and insight needs. To be successful in this role, you must have a minimum of 12 years of experience, with at least 7 years in Data & Analytics initiatives. You should possess a deep understanding of how data architecture and modeling facilitate data pipelines, data management, and analytics. Additionally, you need 5+ years of experience in data architecture & modeling within Consumer/Healthcare Goods industries and hands-on experience in Cloud Architecture (Azure, GCP, AWS) and related databases like Synapse, Databricks, Snowflake, and Redshift. Proficiency in SQL and familiarity with data modeling tools like Erwin or ER Studio is crucial. Your responsibilities will include leading data architecture and modeling efforts in collaboration with engineering and platform teams to develop next-generation product capabilities that drive business growth. You will focus on delivering reliable, high-quality data products to maximize business value and work within the DevSecOps framework to enhance data & analytics capabilities. Collaborating with Business Analytics leaders, you will translate business needs into optimal architecture designs and design scalable and reusable models for various functional areas of data products while adhering to FAIR principles. 
In this role, you will also collaborate with data engineers, solution architects, and other stakeholders to maintain and optimize data models. You will establish trusted partnerships with Data Engineering, Platforms, and Data Science teams to create business-relevant data models and ensure the maintenance of metadata rules, data dictionaries, and associated lineage details. Additionally, staying updated with emerging technologies and mentoring other data modelers in the team will be part of your responsibilities.

Qualifications for this position include an undergraduate degree in Technology, Computer Science, Applied Data Sciences, or related fields, with an advanced degree preferred. Experience with NoSQL and graph databases, as well as hands-on experience with data catalogs like Alation, Collibra, or similar tools, is beneficial. You should have a strong ability to challenge existing technologies and architecture while effectively influencing across the organization. Lastly, experience in a diverse company culture and a commitment to inclusion and equal-opportunity employment are desired traits for this role.
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Data Architect / Modeler, you will collaborate closely with various business partners, Product Owners, Data Strategy, Data Platform, Data Science, and Machine Learning (MLOps) teams to drive the development of innovative data products for end users. Your role will involve contributing to the development of overall solution architecture and defining models for data products by employing best-in-class engineering practices. By working with different stakeholders, you will gain insights into business needs and develop data models that support acquisition, ingestion processes, and critical reporting and insight requirements. With a minimum of 8 years of experience, including 5+ years of progressive experience in Data & Analytics initiatives, you are expected to possess knowledge of how data architecture and modeling support data pipelines, data management, and analytics. Additionally, you should have at least 3 years of hands-on experience in Cloud Architecture (Azure, GCP, AWS) and cloud-based databases (e.g., Synapse, Databricks, Snowflake, Redshift), along with 3+ years of data architecture & modeling experience in Consumer/Healthcare Goods companies. Your expertise should include proficiency in SQL, Erwin/ER Studio, and data modeling, as well as experience in designing and developing performance-tuned, reusable, and scalable data model standards and data dictionaries. Hands-on experience with data catalogs like Alation, Collibra, or similar tools is desirable. Furthermore, you should have 3 years of experience working with Agile methodology (Scrum/Kanban) in the DevSecOps model and possess strong interpersonal and communication skills. In this role, you will be responsible for providing guidance on data architecture and modeling to engineering and platform teams to create next-generation product capabilities that drive business growth. 
You will focus on the delivery of reliable, high-quality data products to maximize business value and collaborate with Business Analytics leaders to translate business needs into optimal architecture designs. Your responsibilities will include designing data architecture and scalable, reusable models for various functional areas of data products while adhering to "FAIR" principles. You will collaborate with data engineers, solution architects, and other stakeholders on data model maintenance and optimization, and establish trusted partnerships with Data Engineering, Platforms, and Data Science teams to architect business-relevant data models. Additionally, you will create and maintain metadata rules, data dictionaries, and associated lineage details of the data models.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
The Chief Data & Analytics Office (CDAO) at JPMorgan Chase is responsible for accelerating the firm's data and analytics journey. This includes ensuring the quality, integrity, and security of the company's data, as well as leveraging this data to generate insights and drive decision-making. The CDAO is also responsible for developing and implementing solutions that support the firm's commercial goals by harnessing artificial intelligence and machine learning technologies to develop new products, improve productivity, and enhance risk management effectively and responsibly. Within CDAO, The Firmwide Chief Data Office (CDO) is responsible for maximizing the value and impact of data globally, in a highly governed way. It consists of several teams focused on accelerating JPMorgan Chase's data, analytics, and AI journey, including data strategy, data impact optimization, privacy, data governance, transformation, and talent. As a Senior Associate at JPMorgan Chase within the Chief Data & Analytics team, you will be responsible for working with stakeholders to define governance and tooling requirements and building out the BCBS Data Governance framework. In addition, you will be responsible for delivering tasks in detailed project plans for the BCBS deliverables owned by the Firmwide CDO. Lastly, you will play a role in developing and syndicating the content used for the BCBS governance meetings. 
**Job Responsibilities:**
- Deliver on the BCBS book of work owned by the Firmwide CDO
- Support the definition, prioritization, and resolution of governance and requirements decisions needed by the BCBS program
- Collect, synthesize, analyze, and present project data and findings
- Conduct analyses to identify issues and formulate recommendations
- Develop regular, compelling communications on project status
- Research data governance requirements and potential solutions
- Collaborate effectively across organizations, functions, and geographies

**Required qualifications, capabilities, and skills:**
- Formal training or certification on Data Governance concepts and 3+ years of applied experience
- Diverse problem-solving experience
- Excellent communication skills (oral and written) and the ability to work effectively in cross-functional teams
- Excellent project management and organizational skills, with the ability to manage multiple deliverables simultaneously
- Strong interpersonal leadership and influencing skills
- Proficiency in MS Excel and PowerPoint

**Preferred qualifications, capabilities, and skills:**
- Familiarity with data management and governance, big data platforms, or data architecture
- BS/BA degree or equivalent experience; a Bachelor's degree in Business, Finance, Economics, or another related area is preferred
Posted 3 weeks ago
5.0 - 10.0 years
0 Lacs
kolkata, west bengal
On-site
As the Solution Architect for Salesforce CPQ at RSM, you will have the opportunity to work with a dynamic team that is dedicated to delivering exceptional professional services to the middle market globally. RSM's purpose is to instill confidence in a world of change, empowering clients and individuals to reach their full potential. With a workforce of over 15,000 employees in the U.S., Canada, and a global presence in 120 countries, RSM focuses on providing audit, tax, and consulting services to drive economic growth and understanding. Your role will involve serving as a subject matter expert on Salesforce Configure, Price, Quote projects, overseeing client delivery, proposals, new business opportunities, and knowledge management. You will collaborate with key decision-makers and company owners to understand their challenges and provide well-architected solutions that leverage the Salesforce platform effectively. Key Responsibilities: - Deliver as an individual contributor and lead a team of Business Analysts, Consultants, Developers, or Solution Architects in various projects. - Collaborate with Business Analysts to capture and understand client business requirements and recommend best practices. - Translate business requirements into well-architected solutions on the Salesforce platform, taking ownership of the solution design and project delivery. - Lead technical design sessions, estimate user stories, and develop technical solution documentation aligned with business objectives. - Demonstrate Salesforce CPQ expertise and educate clients on best practices while ensuring adherence across the implementation team. - Assist in creating best practice assets and accelerators in Lead-to-Cash/Quote-to-Cash processes. - Provide Salesforce CPQ expertise during sales efforts and stay updated on new product releases and capabilities from Salesforce. 
- Coach and mentor junior resources while maintaining responsibility for configuration and development on Salesforce projects.

Qualifications:
- 7-10 years of overall experience, with at least 5 years of hands-on Salesforce CPQ experience and 2+ years leading Salesforce CPQ project implementations as a Solution Architect.
- Strong communication and interpersonal skills, with the ability to manage tasks and drive issues to resolution.
- Salesforce CPQ Specialist and Salesforce Certified Administrator certifications; experience with Salesforce Revenue Cloud is a plus.
- Proficiency in Agile methodologies and in administering comprehensive training to end-users and CPQ admin users.
- Preferred certifications include Revenue Cloud Accredited Professional, Community Cloud Consultant, Sales Cloud Consultant, and more.

If you are looking for a challenging yet rewarding role where you can make a real impact, RSM offers a competitive benefits package and a supportive work environment that encourages personal and professional growth. Join us and be part of a team that values diversity, innovation, and excellence in client service. Apply now and discover the endless opportunities at RSM.
Posted 3 weeks ago
8.0 - 13.0 years
15 - 20 Lacs
Pune
Hybrid
EY is hiring for a Leading Client for the Data Governance Senior Analyst role, Pune location.

Role & responsibilities:
- Coordinate with Data Stewards/Data Owners to enable identification of critical data elements for SAP master data (Supplier/Finance/Bank master).
- Develop and maintain a business-facing data glossary and data catalog for SAP master data (Supplier, Customer, Finance (GL, Cost Center, Profit Center, etc.)), capturing data definitions, lineage, and usage.
- Develop and implement data governance policies, standards, and processes to ensure data quality, data management, and compliance for relevant SAP master data (Finance, Supplier, and Customer master data).
- Develop both end-state and interim-state architecture for master data, ensuring alignment with business requirements and industry best practices.
- Define and implement data models that align with business needs, and gather requirements for master data structures.
- Design scalable and maintainable data models, ensuring data creation through a single source of truth.
- Conduct data quality assessments and implement corrective actions to address data quality issues.
- Collaborate with cross-functional teams to ensure data governance practices are integrated into all relevant SAP business processes.
- Manage data cataloging and lineage to provide visibility into data assets, their origins, and transformations in the SAP environment.
- Facilitate governance forums, data domain councils, and change advisory boards to review data issues, standards, and continuous improvements.
- Collaborate with the Data Governance Manager to advance the data governance agenda.
- Prepare data documentation, including data models, process flows, governance policies, and stewardship responsibilities.
- Collaborate with IT, data management, and business units to implement data governance best practices and migrate from ECC to S/4 MDG.
- Monitor data governance activities, measure progress, and report on key metrics to senior management.
- Conduct training sessions and create awareness programs to promote data governance within the organization.
- Demonstrate deep understanding of SAP (and other ERP systems such as JD Edwards) master data structures such as Vendor, Customer, Cost Center, Profit Center, and GL Accounts.

Summary:
- SAP master data (Vendor, Customer, GL, Cost Center, etc.)
- Data governance implementation (transactional & master data)
- Data modeling & architecture (S/4HANA, ECC)
- Data cataloging, lineage, and quality assessment
- Governance forums & change advisory boards
- Experience in S/4HANA greenfield implementations
- Migration experience (ECC to S/4 MDG)

Preferred candidate profile:
- 8-14 years in data governance and SAP master data
- Strong understanding of upstream/downstream data impacts
- Expert in data visualization
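The data quality assessments mentioned above often begin with simple completeness (fill-rate) metrics over master data records. The following is an illustrative sketch only; the field names (VENDOR_ID, COUNTRY, BANK_KEY) are hypothetical stand-ins, not actual SAP table columns.

```python
# Completeness check over a toy vendor master extract. Field names are
# invented for illustration; a real assessment would read from SAP tables.

REQUIRED_FIELDS = ["VENDOR_ID", "COUNTRY", "BANK_KEY"]

def completeness_report(records):
    """Return the fill rate (0.0-1.0) per required field across a record set."""
    total = len(records)
    report = {}
    for field in REQUIRED_FIELDS:
        filled = sum(1 for r in records if r.get(field) not in (None, ""))
        report[field] = filled / total if total else 0.0
    return report

vendors = [
    {"VENDOR_ID": "V001", "COUNTRY": "IN", "BANK_KEY": "HDFC0001"},
    {"VENDOR_ID": "V002", "COUNTRY": "", "BANK_KEY": "ICIC0002"},   # missing country
    {"VENDOR_ID": "V003", "COUNTRY": "IN", "BANK_KEY": None},       # missing bank key
]
report = completeness_report(vendors)
```

A report like this feeds the corrective-action loop: fields below an agreed threshold are routed back to data stewards for remediation.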
Posted 3 weeks ago
8.0 - 13.0 years
25 - 40 Lacs
Gurugram, Mumbai (All Areas)
Work from Office
About the role

We are seeking an experienced, hands-on, and innovative Data Architect with expertise in the Azure Cloud platform to design, implement, and optimize scalable data solutions. The ideal candidate will have deep expertise in data architecture and cloud solutions, and experience with healthcare and benefits data systems. They will work closely with a cross-functional team of Data, DevOps, and Analytics engineers to architect a robust data platform for H&B, ensure efficient data management, and support enterprise-level decision-making processes.

The Role
- Partner with other architecture resources to lead the end-to-end architecture of the health and benefits data platform using Azure services, ensuring scalability, flexibility, and reliability.
- Develop a broad understanding of the data lake architecture, including the impact of changes on the whole system, the onboarding of clients, and the security implications of the solution.
- Design new, or improve upon existing, architecture including data ingestion, storage, transformation, and consumption layers.
- Define data models, schemas, and database structures optimized for H&B use cases including claims, census, placement, broking, and finance sources.
- Design solutions for seamless integration of diverse health and benefits data sources.
- Implement data governance and security best practices in compliance with industry standards and regulations using Microsoft Purview.
- Evaluate data lake architecture to understand how technical decisions may impact business outcomes, and suggest new solutions/technologies that better align to the Health and Benefits data strategy.
- Draw on internal and external practices to establish data lake architecture best practices and standards within the team, and ensure that they are shared and understood.
- Continuously develop technical knowledge and be recognised as a key resource across the global team.
- Collaborate with other specialists and/or technical experts to ensure the H&B Data Platform delivers to the highest possible standards and that solutions support stakeholder needs and business requirements.
- Initiate practices that will increase code quality, performance, and security.
- Develop recommendations for continuous improvement initiatives, applying deep subject matter knowledge to provide guidance at all levels on the potential implications of changes.
- Build the team's technical expertise/capabilities/skills through the delivery of regular feedback, knowledge sharing, and coaching.
- Analyze existing data design and suggest improvements that promote performance, stability, and interoperability.
- Work with product management and business subject matter experts to translate business requirements into good data lake design.
- Maintain the governance model on the data lake architecture through training, design reviews, code reviews, and progress reviews.
- Participate in the development of data lake architecture and roadmaps in support of business strategies.
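The ingestion -> storage -> transformation -> consumption layering described above is often organized as bronze/silver/gold zones in lakehouse designs. Below is a minimal, hedged sketch of that idea with plain Python structures standing in for lake tables; the claims records and plan names are invented, and a real platform would use ADLS Gen2 storage with Spark or Databricks jobs.

```python
# Toy medallion-style layering: raw (bronze) -> cleansed/typed (silver)
# -> aggregated for consumption (gold). All data below is fabricated.

bronze = [  # raw ingested claims records, stored untouched
    {"claim_id": "C1", "amount": "100.50", "plan": "dental"},
    {"claim_id": "C2", "amount": "80.00", "plan": "dental"},
    {"claim_id": "C3", "amount": "250.00", "plan": "vision"},
]

# Silver layer: parse string amounts into numeric types
silver = [{**r, "amount": float(r["amount"])} for r in bronze]

# Gold layer: total claim amount per plan, ready for reporting
gold = {}
for r in silver:
    gold[r["plan"]] = gold.get(r["plan"], 0.0) + r["amount"]
```

Keeping the raw layer untouched is the key design choice: downstream layers can always be rebuilt from it when cleansing rules or aggregations change.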
Posted 3 weeks ago
6.0 - 9.0 years
5 - 9 Lacs
Bengaluru
Work from Office
- Experience with machine learning environments and LLMs
- Certification in IBM watsonx.data or related IBM data and AI technologies
- Experience with integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures)
- Hands-on experience with a lakehouse platform (e.g., Databricks, Snowflake)
- Exposure to, or understanding of, DB replication processes
- Experience with NoSQL databases (e.g., MongoDB, Cassandra)
- Experience in data modeling tools (e.g., ER/Studio, ERwin)
- Knowledge of data governance and compliance standards (e.g., GDPR, HIPAA)
- Database administration skills
- Experience with data security, backup, and disaster recovery strategies
- Experience implementing WLM (workload management) on any of the databases (preferably DB2)
- Experience implementing data governance frameworks, including metadata management and data cataloging tools
Posted 3 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
Pune, Bengaluru, Vadodara
Work from Office
About Rearc

At Rearc, we're committed to empowering engineers to build awesome products and experiences. Success as a business hinges on our people's ability to think freely, challenge the status quo, and speak up about alternative problem-solving approaches. If you're an engineer driven by the desire to solve problems and make a difference, you're in the right place! Our approach is simple: empower engineers with the best tools possible to make an impact within their industry. We're on the lookout for engineers who thrive on ownership and freedom, possessing not just technical prowess but also exceptional leadership skills. Our ideal candidates are hands-on-keyboard leaders who don't just talk the talk but also walk the walk, designing and building solutions that push the boundaries of cloud computing. Founded in 2016, we pride ourselves on fostering an environment where creativity flourishes, bureaucracy is non-existent, and individuals are encouraged to challenge the status quo. We're not just a company; we're a community of problem-solvers dedicated to improving the lives of fellow software engineers. Our commitment is simple: finding the right fit for our team and cultivating a desire to make things better. If you're a cloud professional intrigued by our problem space and eager to make a difference, you've come to the right place. Join us, and let's solve problems together!

About the role

As a Data Engineer at Rearc, you'll contribute to the technical excellence of our data engineering team. Your expertise in data architecture, ETL processes, and data modeling will help optimize data workflows for efficiency, scalability, and reliability. You'll work closely with cross-functional teams to design and implement robust data solutions that meet business objectives and adhere to best practices in data management. Building strong partnerships with technical teams and stakeholders will be essential as you support data-driven initiatives and contribute to their successful implementation.

What you'll do
- Collaborate with Colleagues: Work closely with colleagues to understand customers' data requirements and challenges, contributing to the development of robust data solutions tailored to client needs.
- Apply DataOps Principles: Embrace a DataOps mindset and utilize modern data engineering tools and frameworks like Apache Airflow, Apache Spark, or similar to create scalable and efficient data pipelines and architectures.
- Support Data Engineering Projects: Assist in managing and executing data engineering projects, providing technical support and contributing to project success.
- Promote Knowledge Sharing: Contribute to our knowledge base through technical blogs and articles, advocating for best practices in data engineering and fostering a culture of continuous learning and innovation.

We're looking for:
- 2+ years of experience in data engineering, data architecture, or related fields, bringing valuable expertise in managing and optimizing data pipelines and architectures.
- A solid track record of contributing to complex data engineering projects, including assisting in the design and implementation of scalable data solutions.
- Hands-on experience with ETL processes, data warehousing, and data modeling tools, enabling the support and delivery of efficient and robust data pipelines.
- A good understanding of data integration tools and best practices, facilitating seamless data flow across systems.
- Familiarity with cloud-based data services and technologies (e.g., AWS Redshift, Azure Synapse Analytics, Google BigQuery), ensuring effective utilization of cloud resources for data processing and analytics.
- Strong analytical skills to address data challenges and support data-driven decision-making.
- Proficiency in implementing and optimizing data pipelines using modern tools and frameworks.
- Strong communication and interpersonal skills enabling effective collaboration with cross-functional teams and stakeholder engagement.

Your first few weeks at Rearc will be spent in an immersive learning environment where our team will help you get up to speed. Within the first few months, you'll have the opportunity to experiment with a lot of different tools as you find your place on the team.

Rearc is committed to a diverse and inclusive workplace. Rearc is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
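Orchestrators like Apache Airflow, named in the posting above, model a pipeline as a directed acyclic graph (DAG) of tasks. As a concept-only illustration (this is not Airflow's API), the stdlib sketch below resolves a valid run order for a small extract-to-report DAG with a topological sort; the task names are invented.

```python
# Resolve a run order for a toy pipeline DAG using the standard library.
# graphlib.TopologicalSorter ships with Python 3.9+.
from graphlib import TopologicalSorter

# task -> set of upstream dependencies that must finish first
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load"},
}

run_order = list(TopologicalSorter(dag).static_order())
```

Real orchestrators add scheduling, retries, and parallel execution of independent branches on top of exactly this ordering guarantee.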
Posted 3 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Hubli, Mangaluru, Mysuru
Work from Office
Job Snapshot
Location: Karnataka - Other, Karnataka
Job ID: JN -032025-95763
Category: TEK-Technology Operations Management (TOM)

Job Summary

Job opportunity with TEKsystems: Snaplogic Developer (Snaplogic, SQL, Kafka). Are you a passionate Snaplogic Developer with a knack for SQL and seamless system integration? We want to hear from you! We're looking for a talented developer to join our team and help us drive innovation through cutting-edge integrations and efficient data pipelines.

Years of Experience: 7+ years
Location: Bangalore (Hybrid)

What You'll Do:
* Design, develop, and implement Snaplogic integrations for a variety of business needs
* Utilize SQL to extract, manipulate, and manage data
* Build scalable, reusable, and high-performance integration solutions
* Collaborate with cross-functional teams to ensure smooth data flow across systems
* Troubleshoot and optimize Snaplogic pipelines for maximum performance

What We're Looking For:
* Proven experience with Snaplogic, building and maintaining integration pipelines
* Strong proficiency in SQL for data querying, transformation, and analysis
* Experience in integrating various systems and APIs via Snaplogic
* Solid understanding of data architecture and ETL processes
* Excellent problem-solving skills and attention to detail
* Strong communication and teamwork skills
Posted 3 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focussed and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment. The person will work on a variety of projects in a highly collaborative, fast-paced environment. The person will be responsible for software development activities of KPMG, India. As part of the development team, he/she will work on the full life cycle of the process, developing code and unit tests. He/she will work closely with the Technical Architect, Business Analyst, user interaction designers, and other software engineers to develop new product offerings and improve existing ones. Additionally, the person will ensure that all development practices are in compliance with KPMG's best practices, policies, and procedures. This role requires quick ramp-up on new technologies whenever required. Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Role: Azure Data Engineer Location: Bangalore Experience: 4 to 6 years Data Management: Design, implement, and manage data solutions on the Microsoft Azure cloud platform. Data Integration: Develop and maintain data pipelines, ensuring efficient data extraction, transformation, and loading (ETL) processes using Azure Data Factory. Data Storage: Work with various Azure data storage solutions like Azure SQL Database, Azure Data Lake Storage, and Azure Cosmos DB. Big Data Processing: Utilize big data technologies such as Azure Databricks and Apache Spark to handle and analyze large datasets. Data Architecture: Design and optimize data models and architectures to meet business requirements. Performance Monitoring: Monitor and optimize the performance of data systems and pipelines.
Collaboration: Collaborate with data scientists, analysts, and other stakeholders to support data-driven decision-making. Security and Compliance: Ensure data solutions comply with security and regulatory requirements. Technical Skills: Proficiency in Azure Data Factory, Azure Databricks, Azure SQL Database, and other Azure data tools. Analytical Skills: Strong analytical and problem-solving skills. Communication: Excellent communication and teamwork skills. Certifications: Relevant certifications such as Microsoft Certified: Azure Data Engineer Associate are a plus.
Posted 3 weeks ago
5.0 - 8.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Role Purpose The purpose of this role is to provide significant technical expertise in architecture planning and design of the concerned tower (platform, database, middleware, backup, etc.) as well as managing its day-to-day operations Do Provide adequate support in architecture planning, migration & installation for new projects in own tower (platform/database/middleware/backup) Lead the structural/architectural design of a platform/middleware/database/backup etc. according to various system requirements to ensure a highly scalable and extensible solution Conduct technology capacity planning by reviewing the current and future requirements Utilize and leverage the new features of all underlying technologies to ensure smooth functioning of the installed databases and applications/platforms, as applicable Strategize & implement disaster recovery plans and create and implement backup and recovery plans Manage the day-to-day operations of the tower Manage day-to-day operations by troubleshooting any issues, conducting root cause analysis (RCA) and developing fixes to avoid similar issues Plan for and manage upgrades, migration, maintenance, backup, installation and configuration functions for own tower Review the technical performance of own tower and deploy ways to improve efficiency, fine-tune performance and reduce performance challenges Develop a shift roster for the team to ensure no disruption in the tower Create and update SOPs, Data Responsibility Matrices, operations manuals, daily test plans, data architecture guidance etc. Provide weekly status reports to the client leadership team and internal stakeholders on database activities w.r.t.
progress, updates, status, and next steps Leverage technology to develop a Service Improvement Plan (SIP) through automation and other initiatives for higher efficiency and effectiveness Team Management Resourcing Forecast talent requirements as per the current and future business needs Hire adequate and right resources for the team Train direct reportees to make right recruitment and selection decisions Talent Management Ensure 100% compliance to Wipro's standards of adequate onboarding and training for team members to enhance capability & effectiveness Build an internal talent pool of HiPos and ensure their career progression within the organization Promote diversity in leadership positions Performance Management Set goals for direct reportees, conduct timely performance reviews and appraisals, and give constructive feedback to direct reports Ensure that organizational programs like Performance Nxt are well understood and that the team takes the opportunities presented by such programs, for themselves and the levels below them Employee Satisfaction and Engagement Lead and drive engagement initiatives for the team Track team satisfaction scores and identify initiatives to build engagement within the team Proactively challenge the team with larger and enriching projects/initiatives for the organization or team Exercise employee recognition and appreciation Mandatory Skills: Big Data Consulting. Experience: 5-8 Years.
Posted 3 weeks ago
3.0 - 6.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set: Technically strong and hands-on Self-driven Good client communication skills Able to work independently and a good team player Flexible to work PST hours (overlap for some hours) Past development experience for the Cisco client is preferred.
Posted 3 weeks ago
4.0 - 8.0 years
6 - 9 Lacs
Nagpur
Work from Office
Project Role: SAP CPI Consultant Project Role Description: Work on an end-to-end integration solution. Drive client discussions to define the integration requirements and translate the business requirements into the technology solution. Activities include mapping business processes to support applications, defining the data entities, and selecting integration technology components and patterns. Work Experience: 4-6 years Work location: Nagpur Must Have Skills: SAP CPI, API Key Responsibilities: The candidate is expected to work with the Functional and Data Architecture team members to facilitate design and development across the required project life cycle. Should be able to understand the requirements from the client and develop the interfaces independently. Ready to work from the office for 15 days.
Posted 3 weeks ago
12.0 - 18.0 years
50 - 65 Lacs
Bengaluru
Work from Office
Oversee the delivery of data engagements across a portfolio of client accounts, understanding their specific needs, goals & challenges Provide mentorship & guidance for the Architects, Project Managers, & technical teams for data engagements Required Candidate profile 12+ years of experience; should be hands-on in Data Architecture and an expert in Databricks or Azure. Should have held data engineering leadership or management roles
Posted 3 weeks ago
10.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Project description We are a leading international Bank that is going through a significant transformation of its front-to-back operations, marked as one of the bank's top 3 transformation agendas. F2B Business Architecture is a small central global team in CIB-CTB to support the delivery of these key front-to-back (F2B) transformation priorities of the management board. The Data Architecture team will play the central role of defining the data model that will align the business processes and ensure data lineage, effective controls, and implement efficient client strategy and reporting solutions. This will require building strong relationships with key stakeholders and helping deliver tangible value. The role will report to the India Head of Investment Bank and Cross Product F2B Operations. Responsibilities Be part of the CTB team to define and manage data models used to implement solutions to automate F2B business processes and controls Ensure the models follow the bank's data modelling standards and principles and influence them as necessary Actively partner with various functional leads & teams to socialize the data models towards adoption and execution of front-to-back solutions Skills Must have 10+ years in financial services, preferably in strategy and solutions in the Corporate and Investment Banking domain. Strong knowledge of transaction banking domain processes and controls for banking & trading business to drive conversations with business SMEs. Experience in developing models for transaction banking products is preferable. Knowledge of Loans or Cash/Deposits lifecycle and/or Customer lifecycle and related business data required to manage operations and analytics is desirable. Well-developed business requirements analysis skills, including good communication abilities (both speaking and listening), influencing, and stakeholder management (all levels up to managing director).
Can partner with Technology and business to understand current issues and articulate recommendations and solutions. Experience working in an enterprise agile environment in a matrix organization. Critical problem-solving skills, able to think tactically and strategically. Strong design experience and experience defining solutions. Knowledge of banking industry data models/best practices is a plus. Consolidates process, data, and existing architecture to drive recommendations and solutions. Strong data analysis skills, SQL/Python experience, and the ability to build data models are desirable. Nice to have: Good tech stack.
Posted 3 weeks ago
6.0 - 11.0 years
13 - 18 Lacs
Gurugram
Work from Office
Project description We are looking for an experienced Senior ServiceNow (SNOW) Engineer to join our IT Operations team. You are responsible for designing robust data models, developing custom reports, and building seamless API integrations within the ServiceNow platform. You should have a strong background in ITSM processes, data architecture, and hands-on experience with ServiceNow development and automation. You will play a pivotal role in optimizing our ServiceNow environment to enhance service delivery, operational visibility, and integration with enterprise systems. Responsibilities Internal Data Structures & Configuration Design, build, and maintain data models, tables, and relationships within the ServiceNow platform. Extend and customize out-of-the-box modules (e.g., CMDB, Incident, Change, Request, etc.) to meet business requirements. Ensure data integrity, normalization, and performance optimization across the ServiceNow environment. Collaborate with stakeholders to translate business requirements into scalable ServiceNow configurations or custom applications. Reporting & Dashboards Develop real-time dashboards and reports using ServiceNow Reporting Tools and Performance Analytics. Deliver insights into key ITSM metrics such as SLAs, incident trends, and operational KPIs. Automate the generation and distribution of recurring reports to stakeholders. Work with business and technical teams to define and implement reporting frameworks tailored to their needs. Automated Feeds & API Integration Develop and manage robust data integrations using ServiceNow REST/SOAP APIs. Build and maintain data pipelines to and from external systems (e.g., CMDB, HRIS, ERP, Flexera, etc.). Implement secure, scalable automation for data exchange with appropriate error handling, logging, and monitoring. Troubleshoot and resolve integration-related issues to ensure smooth system interoperability. 
Skills Must have Minimum 6+ years of hands-on experience with ServiceNow, including ITSM, CMDB, and integrations. Technical Expertise: Advanced knowledge of ServiceNow architecture, configuration, and scripting (JavaScript, Glide). Strong experience with REST/SOAP APIs for ServiceNow integrations. Solid understanding of relational databases, data normalization, and model optimization. Familiarity with common enterprise systems such as ERP, HRIS, Flexera, and CMDB tools. Reporting Skills: Proficiency in ServiceNow Performance Analytics, standard reporting, and dashboard design. Experience defining KPIs and building automated reporting solutions. Soft Skills: Strong communication and collaboration skills. Proven ability to translate business requirements into scalable ServiceNow solutions. Analytical and detail-oriented mindset with a problem-solving approach. Nice to have N/A.
Posted 3 weeks ago
8.0 - 13.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Project description Application Modernization Practice is a horizontal practice, supporting all business verticals in DXC. As a Senior Modernization Architect, you'll play a pivotal role in both shaping modernization solutions and supervising delivery execution. You will partner with sales, delivery, and clients to design transformation paths from legacy to modern architectures, integrating GenAI accelerators and helping deliver tangible business value. Responsibilities Collaborate on pre-sales engagements: assessments, proposals, orals, and business case creation. Design modernization paths from legacy systems (COBOL, z/OS, etc.) to modern stacks (Java, MSA, cloud). Lead effort estimation, tool strategy selection, and transformation approach definition. Provide architectural oversight during execution to ensure value realization. Participate in tooling evaluations and PoCs involving GenAI and automation accelerators. Skills Must have 8+ years in enterprise application architecture, with at least 3 years in modernization. Proven ability to assess legacy estates and define future-state architectures. Proficiency in mainframe tech (COBOL, DB2, CICS) and modern stacks (Java, Spring, microservices). Exposure to GenAI use cases in application engineering and code conversion. Strong client communication, technical documentation, and stakeholder alignment skills. Nice to have Java, Python, C#
Posted 3 weeks ago
7.0 - 12.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Date 25 Jun 2025 Location: Bangalore, KA, IN Company Alstom At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams, to turnkey systems, services, infrastructure, signalling, and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars. Your future role Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates. You'll play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, but also managing and optimizing object storage systems. We'll look to you for: Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow. Creating robust Python scripts for data ingestion, transformation, and validation. Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage. Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets. Implementing data quality checks, monitoring, and alerting mechanisms. Ensuring data security, governance, and compliance with industry standards. Mentoring junior engineers and promoting best practices in data engineering. All about you We value passion and attitude over experience. That's why we don't expect you to have every single skill. 
Instead, we've listed some that we think will help you succeed and grow in this role: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 7+ years of experience in data engineering or a similar role. Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark). Hands-on experience with Apache NiFi for data flow automation. Deep understanding of object storage systems and cloud data architectures. Proficiency in SQL and experience with both relational and NoSQL databases. Familiarity with cloud platforms (AWS, Azure, or GCP). Exposure to the Data Science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow. Experience working in cross-functional teams with Data Scientists and ML Engineers. Cloud certifications or relevant technical certifications are a plus. Things you'll enjoy Join us on a life-long transformative journey — the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career. You'll also: Enjoy stability, challenges, and a long-term career free from boring daily routines. Work with advanced data and cloud technologies to drive innovation. Collaborate with cross-functional teams and helpful colleagues. Contribute to innovative projects that have a global impact. Utilise our flexible and hybrid working environment. Steer your career in whatever direction you choose across functions and countries. Benefit from our investment in your development, through award-winning learning programs. Progress towards leadership roles or specialized technical paths. Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension). You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you! 
Important to note As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
entomo is an Equal Opportunity Employer. The company promotes and supports a diverse workforce at all levels across the organization. We ensure that our associates, potential hires, third-party support staff, and suppliers are not discriminated against directly or indirectly based on color, creed, caste, race, nationality, ethnicity, national origin, marital status, pregnancy, age, disability, religion or similar philosophical belief, sexual orientation, gender, or gender reassignment. We are looking for a skilled and experienced Data Engineer with 3 to 5 years of experience to design, build, and optimize scalable data pipelines and infrastructure. The ideal candidate will work closely with data scientists, analysts, and software engineers to ensure reliable and efficient data delivery throughout our data ecosystem. Key Responsibilities Design, implement, and maintain robust data pipelines using ETL/ELT frameworks Build and manage data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) Optimize data systems for performance, scalability, and cost-efficiency Ensure data quality, consistency, and integrity across various sources Collaborate with cross-functional teams to integrate data from multiple business systems Implement data governance, privacy, and security best practices Monitor and troubleshoot data workflows and perform root cause analysis on data issues Automate data integration and validation using scripting languages (e.g., Python, SQL) Work with DevOps teams to deploy data solutions using CI/CD pipelines Required Skills & Qualifications Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field 3-5 years of experience in data engineering or a similar role Strong proficiency in SQL and at least one programming language (Python, Java, or Scala) Experience with cloud platforms (AWS, Azure, or GCP) Hands-on experience with data pipeline tools (e.g., Apache Airflow, Luigi, DBT) Proficient with relational and 
NoSQL databases Familiarity with big data tools (e.g., Spark, Hadoop) Good understanding of data architecture, modeling, and warehousing principles Excellent problem-solving and communication skills Preferred Qualifications Certifications in cloud platforms or data engineering tools Experience with containerization tools (e.g., Docker, Kubernetes) Knowledge of real-time data processing tools (e.g., Kafka, Flink) Exposure to data privacy regulations (e.g., GDPR, HIPAA)
Posted 3 weeks ago
3.0 - 8.0 years
17 - 20 Lacs
India, Bengaluru
Work from Office
We are looking for a highly motivated and experienced IT Enterprise Architect (f/m/d) with a strong focus on end-to-end (E2E) customer service processes. You will play a key role in shaping and aligning our IT landscape across platforms such as SAP, ServiceNow, and other customer service-related systems. Your expertise will help drive the digital transformation of our global service processes, ensuring scalability, resilience, and excellent customer experience. Your tasks and responsibilities: You are responsible for enterprise architecture management (including business IT alignment and application portfolio analysis) and the derivation of IT strategies from business requirements. You design and maintain the end-to-end enterprise architecture for all customer service processes and supporting processes (e.g., spare parts management, returns management, technician skill matching, etc.). You lead cross-functional workshops and architecture communities to align business goals with IT strategy. You will drive the development of the architecture framework, the architecture roadmap, and the application and data architecture for the end-to-end customer service business process. You guide the selection and integration of platforms such as SAP S/4HANA, SAP Sales Cloud, Salesforce, Oracle Sales Cloud, and ServiceNow. You model IT architectures and processes and drive the consistent design, planning, and implementation of IT solutions. 
You contribute to solution evaluations, RFI/RFP processes, and vendor selection in the customer service space You coordinate communication with all key decision-makers and relevant stakeholders and advise them on the development of the IT landscape You drive documentation and presentations to ensure executive alignment Your qualifications and experience: You have a degree in computer science, industrial engineering, or a comparable qualification You have experience as an Enterprise Architect or Solution/Domain Architect in customer-facing IT landscapes You are familiar with enterprise architecture methods and frameworks, governance structures, and IT Service Management frameworks (e.g., TOGAF, Zachman, ITIL, etc.) You bring functional or IT implementation experience across all customer service processes and functions (installation and maintenance, customer service, field service, material logistics, finance, etc.) You have experience in the implementation of customer service solutions (e.g., ServiceNow, Salesforce, SAP Service Cloud, SAP Field Service Management, Oracle Sales Cloud, CPQ, Spryker, etc.) You have extensive experience with data architecture and integration concepts and a very good understanding of cloud technologies (e.g., Azure, AWS) You have gained practical experience with enterprise architecture tools such as BizzDesign, LeanIX, or Avolution and have good knowledge of modeling and managing business processes Your attributes and skills: In addition, you have sound technological know-how and several years of experience in complex technology landscapes We require a very good command of English, both spoken and written, for cooperation with specialist departments in Germany and abroad. 
Ideally, you also have a very good command of German You are an organizational talent and impress with good communication and presentation skills You are a team player with strong interpersonal skills who can operate confidently in a global environment We do not compromise on quality - you work in a results and quality-oriented manner with a high level of commitment and have good analytical and conceptual skills You are flexible in thought and action, have a quick understanding and constructive assertiveness
Posted 3 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Pune
Work from Office
Job Title Data Modeler Location Pune, Maharashtra, India Experience 4 to 8 Years As a Data Architect at JCI, you will play a pivotal role in designing and implementing robust data solutions that support our analytics and business intelligence initiatives. This role requires extensive experience in data modeling, data warehousing, and familiarity with cloud technologies. How you will do it Design and implement data architecture solutions that meet business requirements and align with the overall data strategy. Work closely with data engineers, data scientists, and other stakeholders to understand data requirements and ensure data availability and quality. Create and maintain data models, ensuring they are optimized for performance and scalability. Establish data governance practices to maintain data integrity and security across the organization. Lead the design and implementation of data integration processes, including ETL workflows and data pipelines. Evaluate and recommend new tools and technologies to improve data management capabilities. Provide technical leadership and mentorship to other team members in best practices for data architecture. Stay current with industry trends and advancements in data technologies and methodologies. What we look for Bachelor's degree in Computer Science, Information Technology, or a related field. 4 to 8 years of experience in data architecture or a similar role. Strong proficiency in SQL and experience with data modeling and database design. Experience with cloud data solutions, such as AWS, Azure, or Google Cloud Platform. Familiarity with data warehousing concepts and tools. Excellent analytical and problem-solving skills. Strong communication skills, with the ability to convey complex technical concepts to non-technical stakeholders. Join JCI and leverage your expertise to create impactful data solutions that drive our business forward!
Posted 3 weeks ago
7.0 - 12.0 years
15 - 27 Lacs
Pune
Hybrid
Role & responsibilities Join as an AVP - Business Analyst (Data Designer) at a leading UK-based bank, where you'll spearhead the evolution of the digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. You may be assessed on the key critical skills relevant for success in the role, such as experience with data design compliance with best practices, governance, and security policies, data profiling and analysis, and data design specifications, as well as job-specific skillsets. Basic/Essential Qualifications: Has designed and developed detailed data models, schemas, and database designs. Understands data requirements and translates them into effective data designs and data flows. Optimizes data structures for performance and scalability in alignment with business objectives. Experience in conducting data profiling and analysis to identify data quality issues and propose solutions. Understands data design specifications and maintains data dictionaries. Proficiency in SQL and familiarity with database management systems (e.g., Oracle, SQL Server, MySQL, Kafka, AWS, etc.). Desirable skillsets/good to have: Bachelor's degree in Business Administration, Data Science, or a related field. Proven experience in data modeling, database design, and data governance frameworks. Knowledge of data warehousing concepts and tools. Has a basic understanding of the financial crime domain. Excellent communication skills to interact with both technical and non-technical stakeholders.
Posted 3 weeks ago