8.0 - 12.0 years
0 Lacs
Bangalore, Karnataka
On-site
Role Overview: You will be responsible for architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions. Your role will involve designing scalable data architectures with Snowflake, integrating cloud technologies such as AWS, Azure, and GCP, and ETL/ELT tools like DBT. Additionally, you will guide teams in proper data modeling, transformation, security, and performance optimization.

Key Responsibilities:
- Architect and deliver highly scalable, distributed, cloud-based enterprise data solutions
- Design scalable data architectures with Snowflake and integrate cloud technologies like AWS, Azure, and GCP, and ETL/ELT tools such as DBT
- Guide teams in proper data modeling (star, snowflake schemas), transformation, security, and performance optimization
- Load data from disparate data sets and translate complex functional and technical requirements into detailed design
- Deploy Snowflake features such as data sharing, events, and lake-house patterns (a Snowpipe-style ingestion sketch follows below)
- Implement data security and data access controls and design
- Understand relational and NoSQL data stores, methods, and approaches (star and snowflake, dimensional modeling)
- Utilize AWS, Azure, or GCP data storage and management technologies such as S3, Blob/ADLS, and Google Cloud Storage
- Implement Lambda and Kappa Architectures
- Utilize Big Data frameworks and related technologies, with mandatory experience in Hadoop and Spark
- Utilize AWS compute services like AWS EMR, Glue, and SageMaker, as well as storage services like S3, Redshift, and DynamoDB
- Experience with AWS streaming services like AWS Kinesis, AWS SQS, and AWS MSK
- Troubleshoot and perform performance tuning in the Spark framework - Spark Core, SQL, and Spark Streaming
- Experience with flow tools like Airflow, NiFi, or Luigi
- Knowledge of application DevOps tools (Git, CI/CD frameworks), with experience in Jenkins or GitLab and rich experience in source code management tools like CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules

Qualifications Required:
- 8-12 years of relevant experience
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modeling techniques using Python/Java
- Strong expertise in end-to-end implementation of cloud data engineering solutions such as Enterprise Data Lake and Data Hub in AWS
- Proficiency in AWS, Databricks, and Snowflake data warehousing, including SQL and Snowpipe
- Experience in data security, data access controls, and design
- Strong AWS hands-on expertise with a programming background, preferably Python/Scala
- Good knowledge of Big Data frameworks and related technologies, with mandatory experience in Hadoop and Spark
- Good experience with AWS compute services like AWS EMR, Glue, and SageMaker, and storage services like S3, Redshift, and DynamoDB
- Experience with AWS streaming services like AWS Kinesis, AWS SQS, and AWS MSK
- Troubleshooting and performance tuning experience in the Spark framework - Spark Core, SQL, and Spark Streaming
- Experience in one of the flow tools like Airflow, NiFi, or Luigi
- Good knowledge of application DevOps tools (Git, CI/CD frameworks), with experience in Jenkins or GitLab and rich experience in source code management tools like CodePipeline, CodeBuild, and CodeCommit
- Experience with AWS CloudWatch, AWS CloudTrail, AWS Account Config, and AWS Config Rules

Kindly share your profile at dhamma.b.bhawsagar@pwc.com if you are interested in this opportunity.
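For context on the Snowpipe and SnowSQL skills this posting names, here is a minimal, hedged sketch of a Snowpipe-style ingestion setup driven from the Snowflake Python connector. The account, stage, table, and pipe names are hypothetical placeholders, and a real external stage would also need a storage integration or credentials.

```python
# Sketch of Snowpipe-style ingestion setup via the Snowflake Python
# connector. All identifiers below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",        # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="DATAHUB",
    schema="RAW",
)
cur = conn.cursor()
# External stage over S3 (credentials/storage integration omitted).
cur.execute("""
    CREATE STAGE IF NOT EXISTS raw_events_stage
    URL = 's3://example-bucket/events/'
    FILE_FORMAT = (TYPE = JSON)
""")
# Pipe that auto-ingests new stage files; assumes a raw_events table exists.
cur.execute("""
    CREATE PIPE IF NOT EXISTS raw_events_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw_events FROM @raw_events_stage
""")
cur.close()
conn.close()
```

The AUTO_INGEST pipe is what lets Snowflake pick up newly staged files without an explicitly scheduled COPY.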
Posted 3 days ago
2.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
Role Overview: You will be responsible for providing system solutions to Anaplan issues and enhancements. Your role will involve participating in prioritizing system issues for the development team, engaging in regular calls with the team to monitor progress, and managing the team for Anaplan service line tickets and resolution. Additionally, you will support existing Anaplan models, perform functional and technical testing of system projects, bug fixes, and enhancements, and coordinate with customer business and IT users.

Key Responsibilities:
- Experience in Anaplan modeling, integration, and data hub
- Building models, reviewing existing planning models, and performing enhancements
- Supporting the aggregation/distribution of data, implemented user stories, and defect prioritization
- Coordinating with customer business and IT users
- Managing workload for the team, assigning work to others, and overseeing overall risk and quality
- Outstanding problem-solving skills within functional and technical areas
- Determining architectural implications and identifying optimal solutions for complex problems
- Strong oral and written communication skills

Qualifications:
- 6 to 8 years of experience in Anaplan implementation
- Minimum 2 years of experience in production support for large Anaplan models, troubleshooting, prioritizing support tickets, and resolution
- Hands-on experience in Anaplan modeling, integration, and data hub
- Certified Solution Architect or Master Anaplanner
- Hands-on experience in Anaplan data integration: CloudWorks, files, API
Posted 3 days ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
Role Overview: As a Technical Consultant in the Advisory Acceleration Centre, your primary responsibility will be creating customized solutions for various UKG customers to meet their interfacing requirements with UKG's products in the Cloud environment. You will be utilizing your knowledge of the HCM/WFM product and Dell Boomi integration to design, code, test, and deploy interfaces, ensuring successful integration implementations.

Key Responsibilities:
- Follow the defined methodology to provide solutions for interfaces.
- Understand the requirements of the customer and prepare design documents accordingly.
- Code, test, and deploy interfaces using Boomi or by importing vendor files into UKG.
- Provide User Acceptance Testing support and deploy to production environments when necessary.
- Upgrade existing UKG customers to the new UKG Dimensions platform while maintaining a strong understanding of UKG solutions.
- Show flexibility in taking evening calls with clients and support major releases and upgrades during US business hours.

Qualifications Required:
- Bachelor's degree or equivalent in Computer Science or a related field.
- At least 10 years of industry experience.
- Dell Boomi AtomSphere Certified Process Developer 1 or 2 (preferred).
- Experience working with SQL relational database management systems and SQL scripting.
- Experience in creating interfaces for upstream/downstream applications and using Data Hub (preferred).
- Domain knowledge of HCM/WFM is an additional advantage.
- Proficiency in designing, building, testing, deploying, and scheduling integration processes involving third-party systems.
- Experience with Dell Boomi components, connectors, Application Source Qualifier, Mapping Designer, Transformations, and Flat File, JSON, XML, and EDI profiles.
- Preferred: Java/Groovy scripting experience, plus knowledge of REST APIs, the SOAP framework, XML, and web service design.
- Excellent oral and written communication skills with good customer-interfacing skills.

Additional Company Details: Joining PwC Acceleration Centers (ACs) offers you the opportunity to actively support various services, engage in challenging projects, and provide distinctive services to clients. You will participate in dynamic training to grow your technical and professional skills, focusing on enhanced quality and innovation. As part of the UKG implementation team, you will lead teams, manage client accounts, and drive consistency in delivery practices, ultimately defining processes and standards.
Posted 3 days ago
3.0 - 5.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Software, the fuel for mobility. We bring bold digital visions to life, so we're on the lookout for more curious and creative engineers who want to create change, one line of high-quality code at a time. Our transformation isn't for everyone, but if you're excited about solving the leading-edge technological challenges facing the auto industry, then let's talk about your next move.

Let's introduce ourselves. At Volvo Cars, curiosity, collaboration, and continuous learning define our culture. Join our mission to create sustainable transportation solutions that protect what matters most: people, communities, and the planet. As a Data Engineer, you will drive digital innovation, leading critical technology initiatives with global teams. You'll design and implement solutions impacting millions worldwide, supporting Volvo's vision for autonomous, electric, and connected vehicles.

What You'll Do - Key Responsibilities

Technical Leadership & Development
- Lead development and implementation using Advanced SQL, Data Warehousing, Data Modelling, and DataOps (a toy star-schema example follows below).
- Design, build, and maintain scalable solutions supporting global operations.
- Collaborate closely with stakeholders across the USA in product management and engineering.
- Promote technical excellence through code reviews, architecture decisions, and best practices.

Cross-Functional Collaboration
- Partner internationally using Microsoft Teams, Slack, SharePoint, and Azure DevOps.
- Participate in Agile processes and sprint planning.
- Share knowledge and maintain technical documentation across regions.
- Support 24/7 operations through on-call rotations and incident management.

Innovation & Continuous Improvement
- Research emerging technologies to enhance platform capabilities.
- Contribute to roadmap planning and architecture decisions.
- Mentor junior team members and encourage knowledge sharing.

What You'll Bring

Professional Experience
- 5 to 8 years of hands-on experience in software development, system administration, or related fields.
- Deep expertise in Advanced SQL, Data Warehousing, Data Modelling, and DataOps with proven implementation success.
- Experience collaborating with global teams across time zones.
- Preferred industry knowledge in automotive, manufacturing, or enterprise software.

Technical Proficiency
- Advanced skills in core technologies: Advanced SQL, Data Warehousing, Data Modelling, and DataOps.
- Strong grasp of cloud platforms, DevOps, and CI/CD pipelines.
- Experience with enterprise integration and microservices architecture.
- Skilled in database design and optimization with SQL and NoSQL.

Essential Soft Skills
- Excellent communication, able to explain complex technical topics.
- Adaptable in multicultural, globally distributed teams.
- Strong problem-solving abilities.

Additional Qualifications
- Business-level English fluency.
- Flexibility to collaborate across USA time zones.

Volvo Cars - For Life. For nearly a century, Volvo Cars has empowered people to move freely in a personal, sustainable, and safe way. Today, we are driving bold advancements in electrification, sustainability, and automotive safety. To realise our ambitious vision, we are seeking innovative minds who are ready to tackle the challenges of tomorrow, today. In our company, we believe extraordinary things are achieved by ordinary people with the drive to make a difference.
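To make the star-schema modelling requirement above concrete, here is a toy example using only the Python standard library; the vehicle and trip tables are hypothetical and chosen purely for illustration.

```python
# Toy illustration of star-schema (dimensional) modelling: one fact
# table joined to its dimension. Uses stdlib sqlite3 so it runs anywhere.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_vehicle (vehicle_id INTEGER PRIMARY KEY, model TEXT);
    CREATE TABLE fact_trip (
        trip_id INTEGER PRIMARY KEY,
        vehicle_id INTEGER REFERENCES dim_vehicle(vehicle_id),
        distance_km REAL
    );
    INSERT INTO dim_vehicle VALUES (1, 'EX90'), (2, 'XC40');
    INSERT INTO fact_trip VALUES (10, 1, 42.0), (11, 1, 8.5), (12, 2, 120.0);
""")
# Typical warehouse-style aggregate over the star schema.
for row in con.execute("""
    SELECT d.model, COUNT(*) AS trips, SUM(f.distance_km) AS total_km
    FROM fact_trip f JOIN dim_vehicle d USING (vehicle_id)
    GROUP BY d.model
"""):
    print(row)
```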
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
Maharashtra
On-site
As an AWS Senior Developer at PwC Advisory Acceleration Center, your role involves interacting with the Offshore Manager/Onsite Business Analyst to understand requirements and taking responsibility for end-to-end implementation of Cloud data engineering solutions like Enterprise Data Lake and Data Hub in AWS. Your strong experience in AWS cloud technology, planning and organization skills, and ability to work as a cloud developer/lead on an agile team to provide automated cloud solutions will be crucial for success.

**Key Responsibilities:**
- Architect and deliver highly scalable, distributed, cloud-based enterprise data solutions
- Implement Cloud data engineering solutions like Enterprise Data Lake and Data Hub in AWS
- Utilize Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and Big Data modeling techniques using Python/Java
- Translate complex requirements into detailed design
- Deploy Snowflake features such as data sharing, events, and lake-house patterns
- Ensure data security, access controls, and design
- Understand relational and NoSQL data stores, methods, and approaches (star and snowflake, dimensional modeling)
- Proficiency in Lambda and Kappa Architectures
- Utilize AWS EMR, Glue, SageMaker, S3, Redshift, DynamoDB, and streaming services like AWS Kinesis, AWS SQS, and AWS MSK (a Glue job-run sketch follows below)
- Troubleshoot and perform performance tuning in the Spark framework
- Experience in flow tools like Airflow, NiFi, or Luigi
- Familiarity with application DevOps tools (Git, CI/CD frameworks) - Jenkins or GitLab
- Use AWS CloudWatch, CloudTrail, Account Config, and Config Rules
- Understand cloud data migration processes, methods, and the project lifecycle
- Apply analytical and problem-solving skills effectively
- Demonstrate good communication and presentation skills

**Qualifications Required:**
- BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA

**Desired Knowledge / Skills:**
- Experience in building stream-processing systems using solutions such as Storm or Spark Streaming
- Knowledge of Big Data ML toolkits like Mahout, SparkML, or H2O
- Proficiency in Python
- Work experience in offshore/onsite engagements
- Familiarity with AWS services like Step Functions and Lambda

In addition to your technical skills, your ability to travel to client locations as per project requirements will be essential for this role. PwC Advisory Acceleration Center in Bangalore offers a high-performance culture based on passion for excellence, diversity, and inclusion, providing you with global leadership development frameworks and the latest digital technologies to support your career growth. Apply now if you believe PwC is the place where you can learn, grow, and excel.
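As a hedged illustration of the Glue-centric work described above, the sketch below starts a Glue job with boto3 and polls it to completion. The job name, region, and arguments are hypothetical.

```python
# Trigger an AWS Glue job and wait for a terminal state via boto3.
# Job name, region, and arguments are hypothetical placeholders.
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")
run = glue.start_job_run(
    JobName="enterprise-data-lake-ingest",            # hypothetical
    Arguments={"--source_path": "s3://example-bucket/raw/"},
)
run_id = run["JobRunId"]
while True:
    state = glue.get_job_run(JobName="enterprise-data-lake-ingest",
                             RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print("Glue job finished:", state)
        break
    time.sleep(30)   # poll every 30 seconds
```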
Posted 5 days ago
5.0 - 8.0 years
12 - 17 Lacs
Bengaluru
Work from Office
- IoT Engineer with a minimum of 7 years of experience
- Expertise in Scala, Angular, Python, Microservices, Docker, and Kubernetes

Good to have skills:
- Angular Apollo, GraphQL, TypeScript, IoT concepts, and MQTT (a subscriber sketch follows below)
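A minimal MQTT subscriber sketch for the IoT stack listed above, written against the paho-mqtt 1.x callback API; the broker host and topic are hypothetical.

```python
# Minimal MQTT subscriber (paho-mqtt 1.x callback API; pip install paho-mqtt).
# Broker address and topic are hypothetical placeholders.
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    print("Connected with result code", rc)
    client.subscribe("factory/+/telemetry")   # hypothetical topic filter

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload.decode())

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.example.com", 1883, 60)
client.loop_forever()   # blocks; processes messages as they arrive
```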
Posted 6 days ago
20.0 - 30.0 years
28 - 33 Lacs
Greater Noida
Work from Office
Practice Lead - Data & Analytics

Job Location: Greater Noida
Experience Required: 20 to 30 years
Mandatory Skills: Data & Analytics, Data Management, Cloud data migrations, Data Quality and Governance, Data Engineering, ETL/ELT, DataOps, BI and Analytics, Data Science and Machine Learning, or AI/Gen AI.

Job Description: Provide leadership and manage delivery across various client engagements.
- Prioritize accounts and customer value: rank accounts and projects based on strategic importance, urgency, etc., to ensure the most critical ones get priority attention.
- Allocate resources: ensure that the necessary resources (e.g., budget, personnel, technology) are allocated efficiently across the portfolio.
- Portfolio-wide performance metrics and analysis: use effective KPIs to measure the success and impact of projects within the portfolio.
- Optimize resource utilization: monitor resource usage to avoid overallocation or underutilization, ensuring optimal performance and cost-effectiveness.
- Track project progress: regularly review the status of ongoing projects to ensure they are on track with their timelines, budgets, and deliverables.
- Generate reports: provide detailed reports to stakeholders, highlighting progress, risks, and any necessary adjustments.
- Risk management: proactively identify potential risks that could impact the portfolio, such as resource constraints, technical challenges, or market changes. Develop and implement strategies to mitigate identified risks, ensuring minimal disruption to project execution.
- Stakeholder engagement: maintain an effective communication framework with all stakeholders. Manage expectations by ensuring stakeholders are informed about project statuses, potential risks, and any changes to the portfolio. Communicate effectively on delivery commitments and their execution as per the contractual agreement and statement of work.
- Ensure compliance and governance around service-level and contractual obligations.
- Accord the right level of urgency to financial planning, analysis, and budgeting at the project/portfolio level, with forward-looking planning on revenue projections and staffing capacity, and review these periodically with key internal stakeholders.
- Take stock of delivery teams' performance scorecards on a regular basis.
- Ensure that best practices and standards are implemented and followed across the services.
- Work with delivery managers to identify and implement process improvements to drive overall delivery excellence, cost reduction, and efficiency gains.

Kindly share your resume at anshul.meshram@coforge.com, along with a brief summary of your candidature that includes the headcount and business volume you have handled.
Posted 1 week ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Kronos
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Roles and Responsibilities:
- Collaborate with project teams and client stakeholders to support project delivery.
- Perform maintenance and configuration activities for Kronos modules such as accruals and timekeeper.
- Prior experience in supporting functional testing, integration testing, and UAT preferred.
- Assist the customer with testing, understanding the solution, and hand-holding during handover of the system.
- Experience in test automation and/or manual testing with respect to the UKG platform.
- Mentor junior members.
- Thrive in a team environment, while also possessing the ability to work independently.
- Proven ability to work creatively and analytically in a problem-solving environment.
- Solid interpersonal skills to interface with co-workers and customers and manage specific tasks to completion with minimal direction.
- Proven ability to build, manage, and foster a team-oriented environment.
- Desire to work in an information systems environment.

Technical and Professional Experience:
- Minimum of 6+ years of experience in the Pro WFM domain.
- Capability to understand the business case, features, high-level architecture, and benefits of Data Hub, and the general process of getting data into and out of the system.
- Identify the various settings related to pipelines and wrappers.
- Mandatory experience in migration from WFC to Pro WFM.
- Hands-on experience using Navigator, Paragon Transfer Manager, and the Data Migration Tool.
- Must have hands-on experience with Desktop reports.
- List implementation tasks and describe the mappings required for Pro WFM to successfully deliver pay code-related data to the Pro WFM Data Hub.
- Recognize how to validate reporting data in Data Hub to ensure consistency between Pro WFM and Data Hub, along with other Data Hub considerations such as role-level security, how to use the data dictionary, importing/exporting data, and transitioning customers to support.
- Design and determine whether a customer's reporting need is best met by a Dataview, standard report, modified standard report, or custom report.
- Make modifications to standard reports and publish the resulting reports.
- Create custom reports from scratch using BIRT Report Studio and implement computed columns in custom reports.
- Navigate and leverage information found in the Data Dictionary.

Additional Information: Ready to work in shifts.
Qualification: 15 years full-time education
Posted 1 week ago
5.0 - 10.0 years
30 - 35 Lacs
Mumbai
Work from Office
About the Job: Be the expert customers turn to when they need to build strategic, scalable systems. Red Hat Services is looking for a well-rounded Architect to join our team in Mumbai, covering Asia Pacific. In this role, you will design and implement modern platforms, onboard and build cloud-native applications, and lead architecture engagements using the latest open source technologies. You'll be part of a team of consultants who are leaders in open hybrid cloud, platform modernisation, automation, and emerging practices, including foundational AI integration. Working in agile teams alongside our customers, you'll build, test, and iterate on innovative prototypes that drive real business outcomes. This role is ideal for architects who can work across application, infrastructure, and modern AI-enabling platforms like Red Hat OpenShift AI. If you're passionate about open source, building solutions that scale, and shaping the future of how enterprises innovate, this is your opportunity.

What will you do:
- Design and implement modern platform architectures with a strong understanding of Red Hat OpenShift, container orchestration, and automation at scale.
- Manage Day-2 operations of Kubernetes container platforms by collaborating with infrastructure teams to define practices for platform deployment, platform hardening, platform observability, monitoring and alerting, capacity management, scalability, resiliency, and security operations (a brief cluster health-check sketch follows below).
- Lead the discovery, architecture, and delivery of modern platforms and cloud-native applications, using technologies such as containers, APIs, microservices, and DevSecOps patterns.
- Collaborate with customer teams to co-create AI-ready platforms, enabling future use cases with foundational knowledge of AI/ML workloads.
- Remain hands-on with development and implementation, especially in prototyping, MVP creation, and agile iterative delivery.
- Present strategic roadmaps and architectural visions to customer stakeholders, from engineers to executives.
- Support technical presales efforts, workshops, and proofs of concept, bringing in business context and value-first thinking.
- Create reusable reference architectures, best practices, and delivery models, and mentor others in applying them.
- Contribute to the development of standard consulting offerings, frameworks, and capability playbooks.

What will you bring:
- Strong experience with Kubernetes, Docker, and Red Hat OpenShift or equivalent platforms.
- In-depth expertise in managing multiple Kubernetes clusters across multi-cloud environments.
- Proven expertise in operationalising Kubernetes container platforms through the adoption of Service Mesh, GitOps principles, and serverless frameworks, including migrating from xKS to OpenShift.
- Proven leadership of modern software and platform transformation projects.
- Hands-on coding experience in multiple languages (e.g., Java, Python, Go).
- Experience with infrastructure as code, automation tools, and CI/CD pipelines.
- Practical understanding of microservices, API design, and DevOps practices.
- Applied experience with agile, scrum, and cross-functional team collaboration.
- Ability to advise customers on platform and application modernisation, with awareness of how platforms support emerging AI use cases.
- Excellent communication and facilitation skills with both technical and business audiences.
- Willingness to travel up to 40% of the time.

Nice to Have:
- Experience with Red Hat OpenShift AI, Open Data Hub, or similar MLOps platforms.
- Foundational understanding of AI/ML, including containerized AI workloads, model deployment, and open source AI frameworks.
- Familiarity with AI architectures (e.g., RAG, model inference, GPU-aware scheduling).
- Engagement in open source communities or a contributor background.
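As a small, hedged example of the Day-2 Kubernetes operations this role describes, the sketch below uses the official Kubernetes Python client to flag pods that are not in the Running phase; it assumes an existing kubeconfig.

```python
# Cluster health spot-check with the official Kubernetes Python client
# (pip install kubernetes). Assumes a kubeconfig is already configured.
from kubernetes import client, config

config.load_kube_config()                 # or load_incluster_config()
v1 = client.CoreV1Api()
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    if pod.status.phase != "Running":
        # Surface anything Pending, Failed, Succeeded-but-lingering, etc.
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: "
              f"{pod.status.phase}")
```

In practice this kind of check would feed an alerting pipeline rather than stdout, but the client calls are the same.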
Posted 1 week ago
8.0 - 12.0 years
10 - 20 Lacs
Mysuru, Pune, Bengaluru
Hybrid
Required Skills & Experience:
- Mandatory: Hands-on experience with MarkLogic.
- Strong expertise in Apache NiFi for data integration and orchestration.
- Cloud experience, preferably Azure (Data Factory, Data Lake, Functions, etc.).
- Proven experience in REST API design and development (a MarkLogic REST sketch follows below).
- Familiarity with Git Bash, Gradle, and any modern IDE (IntelliJ, Eclipse, VS Code, etc.).
- Prior experience working as part of a DevOps team with CI/CD practices.
- Excellent problem-solving and communication skills.
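A hedged sketch of the MarkLogic-plus-REST work this posting combines: writing and reading a JSON document through MarkLogic's documents endpoint with the requests library. The host, credentials, and document URI are hypothetical, and the /v1/documents app-server endpoint is assumed to be enabled on the target port.

```python
# Write and read a JSON document via MarkLogic's REST API.
# Host, port, credentials, and URIs are hypothetical placeholders.
import requests
from requests.auth import HTTPDigestAuth

BASE = "http://marklogic.example.com:8000/v1/documents"
AUTH = HTTPDigestAuth("admin", "admin")   # hypothetical credentials

doc = {"orderId": 42, "status": "shipped"}
requests.put(BASE, params={"uri": "/orders/42.json"},
             json=doc, auth=AUTH).raise_for_status()

resp = requests.get(BASE, params={"uri": "/orders/42.json"}, auth=AUTH)
resp.raise_for_status()
print(resp.json())
```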
Posted 2 weeks ago
6.0 - 11.0 years
10 - 20 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
We are seeking a skilled and experienced SAP Commerce Developer (SAP Hybris) with 6-12 years of expertise in e-commerce platform development. The candidate will work on customizing and implementing Hybris-based applications (B2C and B2B) with strong proficiency in WCMS, Solr, HMC, CMS, Product Cockpit, CronJobs, and ImpEx. Responsibilities include product data modeling, catalog structure design, and leveraging composable storefronts, OCC, and headless architecture (an OCC call sketch follows below). The developer will integrate SAP with backend systems, develop scalable REST/SOAP web services, and lead software development teams using agile methodologies. Hands-on experience with Java, J2EE, XML, AJAX, and JavaScript is essential. Knowledge of Hybris Data Hub, CPQ, and SAP integration is a strong advantage. Candidates must exhibit a proactive attitude, flexibility, and the ability to manage ambiguity while driving results. This role offers a six-month remote opportunity with flexible timings. Immediate joiners are preferred.

Keywords: SAP Commerce Developer, SAP Hybris, WCMS, Solr, HMC, Product Cockpit, REST/SOAP Web Services, Java, J2EE, Composable Storefront, SAP Integration, Headless Architecture, CPQ, Data Hub.

Location: Remote, Hyderabad, Ahmedabad, Pune, Chennai, Kolkata.
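For the headless/OCC architecture mentioned above, a minimal sketch of a storefront-style call to a SAP Commerce OCC REST endpoint; the base URL and site ID are hypothetical, and the path follows the usual OCC v2 convention.

```python
# Headless storefront call to a SAP Commerce (Hybris) OCC v2 endpoint.
# Base URL and site ID are hypothetical placeholders.
import requests

BASE = "https://api.example.com/occ/v2/electronics"   # hypothetical
resp = requests.get(f"{BASE}/products/search",
                    params={"query": "camera", "pageSize": 5})
resp.raise_for_status()
for product in resp.json().get("products", []):
    print(product.get("code"), product.get("name"))
```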
Posted 2 weeks ago
5.0 - 8.0 years
6 - 10 Lacs
Gurugram
Work from Office
We're growing rapidly and seeking a passionate Senior DevOps Engineer with 5-8 years of experience to join our team and help us build world-class infrastructure and deployment pipelines.

Key Responsibilities: As a Senior DevOps Engineer at SquareOps, you'll be expected to:
- Drive the scalability and reliability of our customers' cloud applications.
- Work directly with clients, engineering, and infrastructure teams to deliver high-quality solutions.
- Design and develop various systems from scratch with a focus on scalability, security, and compliance.
- Develop deployment strategies and build configuration management systems.
- Lead a team of junior DevOps engineers, providing guidance and support on day-to-day activities.
- Drive innovation within the team, promoting the adoption of new technologies and practices to improve project outcomes.
- Demonstrate ownership and accountability for project implementations, ensuring projects are delivered on time and within budget.
- Act as a mentor to junior team members, fostering a culture of continuous learning and growth.

Qualifications:
1. A proven track record in architecting complex production systems with multi-tier application stacks.
2. Expertise in designing solutions tailored to industry-specific requirements such as SaaS, AI, DataOps, and highly compliant enterprise architectures.
3. Extensive experience working with Kubernetes, various CI/CD tools, and cloud service providers, preferably AWS.
4. Proficiency in automating cloud infrastructure management, primarily with tools like Terraform, shell scripting, AWS Lambda, and EventBridge (a Lambda handler sketch follows below).
5. Solid understanding of cloud financial management strategies to ensure cost-effective use of cloud resources.
6. Experience in setting up high availability and disaster recovery for cloud infrastructure.
7. Strong problem-solving skills with an innovative mindset.
8. Excellent communication skills, capable of effectively liaising with clients, engineering, and infrastructure teams.
9. The ability to lead and mentor a team, guiding them to achieve their objectives.
10. High levels of empathy and emotional intelligence, with a talent for managing and resolving conflict.
11. An adaptable nature, comfortable working in a fast-paced, dynamic environment.
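A minimal sketch of the Lambda-plus-EventBridge automation pattern named in qualification 4: a Python handler, invoked on a schedule, that tags unowned EBS volumes. The tag key and filter are hypothetical.

```python
# AWS Lambda handler intended to run on an EventBridge schedule:
# find unattached EBS volumes and tag any that lack tags.
# Tag key/value and the filter are hypothetical placeholders.
import boto3

def handler(event, context):
    ec2 = boto3.client("ec2")
    volumes = ec2.describe_volumes(
        Filters=[{"Name": "status", "Values": ["available"]}]
    )["Volumes"]
    for vol in volumes:
        if not vol.get("Tags"):
            ec2.create_tags(Resources=[vol["VolumeId"]],
                            Tags=[{"Key": "owner", "Value": "unassigned"}])
    return {"checked": len(volumes)}
```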
Posted 3 weeks ago
12.0 - 20.0 years
35 - 60 Lacs
Bengaluru
Work from Office
Who We Are: At Kyndryl, we design, build, manage, and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward, always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers, and our communities.

The Role: Are you passionate about delivering exceptional service and revolutionizing the world of technology? We have an incredible opportunity for a talented individual to join our dynamic team as a Data Architect. In this customer-centric role, you will play a pivotal role in ensuring our customers receive top-notch service within a contractual framework. As a visionary leader, you will inspire and guide our team of experts to deliver high-quality and reliable information technology services. Working closely with the latest systems, software products, and networked devices, you will align our solutions perfectly with our customers' evolving business needs. Your deep knowledge of the services we provide, paired with your understanding of customer businesses, will enable you to propose and implement tailored solutions that exceed their expectations. You will be an integrated part of our customer account structure, fostering strong relationships with our customers and collaborating closely with our Delivery Partner. Together, you will create an environment that promotes innovation, collaboration, and customer success. By owning the technical and managerial support for our field engineers, technicians, system administrators, subject matter experts, and product support personnel, you will empower them to deliver, manage, maintain, and deploy IT services effectively. When it comes to troubleshooting incidents, problems, changes, and escalations, you will be at the forefront, providing swift support to fix any issues that may arise in malfunctioning services, operations, software, or equipment. Your expertise will be crucial in ensuring that our systems run smoothly, offering our customers a seamless experience. As a Data Architect, you will have the unique opportunity to collaborate with internal stakeholders and clients. Together, you will co-create, design, deploy, and maintain reliable, available, and future-proof systems and services. Your innovative ideas and leadership skills will play a vital role in shaping the technological landscape of our organization and the industry as a whole. If you are ready to make an impact, drive customer success, and be at the forefront of technological advancements, this is the role for you. Join our team and be part of an exhilarating journey as we reshape the IT services landscape with creativity, passion, and excellence.

Your Future at Kyndryl: Kyndryl has a global footprint, which means that as a Data Architect at Kyndryl you will have opportunities to work on projects and collaborate with colleagues from around the world. This role is dynamic and influential, offering a wide range of professional and personal growth opportunities that you won't find anywhere else.

Who You Are: You're good at what you do and possess the required experience to prove it. However, equally as important, you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused, someone who prioritizes customer success in their work. And finally, you're open and borderless, naturally inclusive in how you work with others.

We are seeking an experienced Cloud Data Architect with strong expertise in Databricks to lead and deliver end-to-end data architecture solutions across cloud-based delivery projects. The ideal candidate will have deep experience with cloud data ecosystems (Azure, AWS, or GCP), data engineering, and hands-on delivery using Databricks for data transformation, advanced analytics, and AI/ML use cases.

Key Responsibilities:
- Own the architecture and design of modern data platforms using Databricks on cloud (Azure/AWS/GCP).
- Develop scalable, high-performance data lakehouse architectures, incorporating Delta Lake and structured streaming.
- Architect robust data pipelines (ETL/ELT) using PySpark, Spark SQL, and Databricks notebooks (a pipeline sketch follows below).
- Work closely with business stakeholders, project managers, and engineering teams to understand data requirements and translate them into technical architecture.
- Optimize data processing workflows for performance, scalability, and cost-efficiency.
- Implement best practices around data governance, security, lineage, and compliance in cloud-native environments.
- Enable analytics, BI, and ML initiatives by ensuring strong data foundations.
- Provide technical leadership and mentor junior engineers and data modelers.
- Drive innovation and continuously evaluate emerging technologies in the data and AI space.

Required Skills & Experience:
- 10+ years of experience in data architecture, engineering, or analytics roles.
- Strong hands-on experience with Databricks, including Delta Lake, Spark, and MLflow.
- Expertise in cloud platforms (preferably Azure, but AWS/GCP also considered) and native cloud data services like Azure Data Lake Storage, Synapse, AWS S3, Redshift, or GCP BigQuery.
- Strong command of SQL, PySpark, Spark SQL, and distributed data processing.
- Solid experience in data modeling (dimensional and normalized), metadata management, and data quality frameworks.
- Strong understanding of DevOps/DataOps, CI/CD pipelines, and infrastructure-as-code (e.g., Terraform).
- Experience implementing data security, RBAC, and encryption in cloud environments.
- Excellent stakeholder communication, leadership, and documentation skills.

Preferred Skills and Experience:
- Certification in Databricks (e.g., Databricks Certified Data Engineer Professional or Lakehouse Fundamentals).
- Cloud certifications (Azure Solutions/Data Engineer, AWS Data Analytics, GCP Data Engineer).
- Experience with machine learning workflows, feature stores, or real-time analytics.
- Familiarity with data cataloging tools like Unity Catalog, Purview, or Collibra.

Being You: Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you, and everyone next to you, the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect: With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter, wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred! If you know someone that works at Kyndryl, when asked "How Did You Hear About Us" during the application process, select "Employee Referral" and enter your contact's Kyndryl email address.
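To ground the Databricks and Delta Lake responsibilities above, a short, hedged PySpark pipeline fragment; the paths and columns are hypothetical, and the Delta write assumes a Spark session with Delta Lake available, as on Databricks.

```python
# Illustrative raw-to-silver PySpark + Delta Lake pipeline fragment.
# Paths and schema are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lakehouse-sketch").getOrCreate()

raw = spark.read.json("/mnt/raw/events/")            # hypothetical path
cleaned = (raw
           .filter(F.col("event_type").isNotNull())  # drop malformed rows
           .withColumn("event_date", F.to_date("event_ts")))
(cleaned.write
        .format("delta")                             # Delta Lake table
        .mode("append")
        .partitionBy("event_date")
        .save("/mnt/silver/events/"))                # hypothetical path
```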
Posted 3 weeks ago
3.0 - 7.0 years
14 - 18 Lacs
Mumbai, Pune, Chennai
Work from Office
Project description: Develop a scalable data collection, storage, and distribution platform to house data from vendors, research providers, exchanges, PBs, and web scraping. Make data available to systematic and fundamental PMs and enterprise functions: Ops, Risk, Trading, and Compliance. Develop internal data products and analytics.

Responsibilities:
- Web scraping using scripts/APIs/tools
- Help build and maintain a greenfield data platform running on Snowflake and AWS
- Understand the existing pipelines and enhance them for new requirements (an Airflow DAG sketch follows below)
- Onboard new data providers
- Data migration projects

Skills - Must have:
- 10+ years of experience as a Data Engineer
- SQL
- Python
- Linux
- Containerization (Docker, Kubernetes)
- Good communication skills
- AWS
- Strong on the DevOps side of things (K8s, Docker, Jenkins)
- Readiness to work in the EU time zone
- Capital markets experience

Nice to have:
- Market data projects
- Snowflake (a big plus)
- Airflow

Location: Pune, Mumbai, Chennai, Bengaluru
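A hedged sketch of the orchestration side of this role: a small Airflow 2.x DAG chaining a scrape task into a Snowflake load task. The DAG id, task bodies, and schedule are placeholders.

```python
# Minimal Airflow 2.x DAG: scrape vendor data, then load it to Snowflake.
# DAG id, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def scrape_vendor_data(**_):
    print("scrape vendor endpoints here")

def load_to_snowflake(**_):
    print("COPY INTO staging tables here")

with DAG(dag_id="vendor_data_ingest",
         start_date=datetime(2024, 1, 1),
         schedule_interval="@daily",
         catchup=False) as dag:
    scrape = PythonOperator(task_id="scrape",
                            python_callable=scrape_vendor_data)
    load = PythonOperator(task_id="load",
                          python_callable=load_to_snowflake)
    scrape >> load   # load runs only after scrape succeeds
```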
Posted 4 weeks ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
As a Senior AWS Data Engineer - Cloud Data Platform at Teamware Solutions, a division of Quantum Leap Consulting Pvt. Ltd, located in Bangalore, you will be responsible for end-to-end implementation of Cloud data engineering solutions like Enterprise Data Lake and Data Hub in AWS. Working onsite in an office environment five days a week, you will collaborate with the Offshore Manager and Onsite Business Analyst to understand the requirements and deliver scalable, distributed, cloud-based enterprise data solutions.

You should have a strong background in AWS cloud technology, with 4-8 years of hands-on experience. Proficiency in architecting and delivering highly scalable solutions is a must, along with expertise in Cloud data engineering solutions, Lambda or Kappa Architectures, data management concepts, and data modelling. You should be proficient in AWS services such as EMR, Glue, S3, Redshift, and DynamoDB, and have experience in Big Data frameworks like Hadoop and Spark. Additionally, you must have hands-on experience with AWS compute and storage services, AWS streaming services, troubleshooting and performance tuning in the Spark framework (illustrated in the sketch below), and knowledge of application DevOps tools like Git and CI/CD frameworks. Familiarity with AWS CloudWatch, CloudTrail, Account Config, Config Rules, security, key management, and data migration processes is required, along with strong analytical skills. Good communication and presentation skills are essential for this role.

Desired skills include experience in building stream-processing systems, Big Data ML toolkits, Python, offshore/onsite engagements, flow tools like Airflow, NiFi, or Luigi, and AWS services like Step Functions and Lambda. A professional background of BE/B.Tech/MCA/M.Sc/M.E/M.Tech/MBA is preferred, and an AWS Certified Data Engineer certification is recommended.

If you are interested in this position and meet the qualifications mentioned above, please send your resume to netra.s@twsol.com.
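As an illustration of the Spark performance-tuning experience this posting asks for, here is a hedged sketch showing three common levers: adaptive query execution, shuffle-partition sizing, and a broadcast join for a small dimension table. The paths and values are hypothetical and would be tuned per workload.

```python
# Common Spark tuning levers: AQE, shuffle partitions, broadcast join.
# Paths and configuration values are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = (SparkSession.builder
         .appName("tuning-sketch")
         .config("spark.sql.adaptive.enabled", "true")      # AQE
         .config("spark.sql.shuffle.partitions", "400")     # tune per data size
         .getOrCreate())

facts = spark.read.parquet("s3://example-bucket/facts/")    # hypothetical
dims = spark.read.parquet("s3://example-bucket/dims/")      # hypothetical
# Broadcasting the small dimension table avoids a shuffle-heavy join.
joined = facts.join(broadcast(dims), "dim_id")
print(joined.count())
```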
Posted 1 month ago
3.0 - 7.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Working with Us: Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services, and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us

Summary: As a Data Engineer based out of our BMS Hyderabad site, you are part of the Data Platform team, supporting the larger Data Engineering community that delivers data and analytics capabilities across different IT functional domains. The ideal candidate will have a strong background in data engineering, DataOps, and cloud-native services, and will be comfortable working with both structured and unstructured data.

Key Responsibilities:
- Design, build, and maintain the ETL pipelines and data products, drive the evolution of those data products, and utilize the most suitable data architecture for our organization's data needs.
- Deliver high-quality data products and analytics-ready data solutions.
- Work with an end-to-end ownership mindset; innovate and drive initiatives through completion.
- Develop and maintain data models to support our reporting and analysis needs.
- Optimize data storage and retrieval to ensure efficient performance and scalability.
- Collaborate with data architects, data analysts, and data scientists to understand their data needs and ensure that the data infrastructure supports their requirements.
- Ensure data quality and integrity through data validation and testing (a small pandas check follows below).
- Implement and maintain security protocols to protect sensitive data.
- Stay up to date with emerging trends and technologies in data engineering and analytics.
- Closely partner with the Enterprise Data and Analytics Platform team, other functional data teams, and the Data Community lead to shape and adopt the data and technology strategy.
- Serve as the subject matter expert on data and analytics solutions; stay knowledgeable about evolving trends in data platforms and product-based implementation.
- Mentor other team members effectively to unlock their full potential.
- Comfortable working in a fast-paced environment with minimal oversight.
- Prior experience working in an Agile/product-based environment.

Qualifications & Experience:
- 7+ years of hands-on experience implementing and operating data capabilities and cutting-edge data solutions, preferably in a cloud environment.
- Breadth of experience in technology capabilities spanning the full life cycle of data management, including data lakehouses, master/reference data management, data quality, and analytics/AI-ML.
- In-depth knowledge and hands-on experience with AWS Glue services and the AWS data engineering ecosystem.
- Hands-on experience developing and delivering data and ETL solutions with technologies like AWS data services (Redshift, Athena, Lake Formation, etc.); Cloudera Data Platform and Tableau experience is a plus.
- 5+ years of experience in data engineering or software development.
- Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
- Strong programming skills in languages such as Python, R, PyTorch, PySpark, Pandas, Scala, etc.
- Experience with SQL and database technologies such as MySQL, PostgreSQL, Presto, etc.
- Experience with cloud-based data technologies such as AWS, Azure, or Google Cloud Platform.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Functional knowledge or prior experience in the Life Sciences Research and Development domain is a plus.
- Experience and expertise in establishing agile and product-oriented teams that work effectively with teams in the US and other global BMS sites.
- Initiates challenging opportunities that build strong capabilities for self and team; demonstrates a focus on improving processes, structures, and knowledge within the team; leads in analyzing current states, delivers strong recommendations for understanding complexity in the environment, and executes to bring complex solutions to completion.

If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career.

Uniquely Interesting Work, Life-changing Careers: With a single vision as inspiring as "Transforming patients' lives through science", every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion, and integrity bring out the highest potential of each of our colleagues.

On-site Protocol: BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based, and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role. Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility; for these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive company culture. For field-based and remote-by-design roles, the ability to physically travel to visit customers, patients, or business partners and to attend meetings on behalf of BMS as directed is an essential job function.

BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments, and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement.

BMS cares about your well-being and the well-being of our staff, customers, patients, and communities. As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters. BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/ Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.
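A small, hedged example of the data validation and testing responsibility above, using pandas; the columns and thresholds are invented for illustration.

```python
# Toy pandas data-quality check: duplicates, nulls, out-of-range values.
# Column names and the 5.0 threshold are hypothetical placeholders.
import pandas as pd

df = pd.DataFrame({
    "sample_id": [1, 2, 2, 4],
    "assay_value": [0.91, None, 1.10, 7.3],
})

issues = {
    "duplicate_ids": int(df["sample_id"].duplicated().sum()),
    "null_values": int(df["assay_value"].isna().sum()),
    "out_of_range": int((df["assay_value"] > 5.0).sum()),
}
print("data-quality findings:", issues)
# In a pipeline these counts would gate promotion of the data set.
```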
Posted 1 month ago
1.0 - 4.0 years
9 - 13 Lacs
Pune
Work from Office
Overview: The Data Technology group at MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers Reference, Market, and other critical data points to various products of the firm. The platform, hosted on the firm's data centers and the Azure and GCP public clouds, processes 100 TB+ of data and is expected to run 24x7. With increased focus on automation around systems development and operations, data-science-based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team for the purpose of supporting our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation, and it is committed to providing self-serve tools to our internal customers.

Responsibilities:
- Implement & maintain data catalogs: deploy and manage the data catalog tool Collibra to improve data discoverability and governance.
- Metadata & lineage management: automate metadata collection, establish data lineage, and maintain consistent data definitions across systems (a metadata-push sketch follows below).
- Enable data governance: collaborate with governance teams to apply data policies, classifications, and ownership structures in the catalog.
- Support self-service & adoption: promote catalog usage across teams through training, documentation, and continuous support.
- Cross-team collaboration: work closely with data engineers, analysts, and stewards to align catalog content with business needs.
- Tooling & automation: build scripts and workflows for metadata ingestion, tagging, and monitoring of catalog health; leverage AI tools to automate cataloging activities.
- Reporting & documentation: maintain documentation and generate usage metrics, ensuring transparency and operational efficiency.

Qualifications:
- Self-motivated, collaborative individual with a passion for excellence.
- B.E. in Computer Science or equivalent, with 5+ years of total experience and at least 2 years of experience working with Azure DevOps tools and technologies.
- Good working knowledge of source control applications like Git, with prior experience building deployment workflows using this tool.
- Good working knowledge of Snowflake, YAML, and Python.
- Tools: experience with data catalog platforms (e.g., Collibra, Alation, DataHub).
- Metadata & lineage: understanding of metadata management and data lineage.
- Scripting: proficient in SQL and Python for automation and integration.
- APIs & integration: ability to connect catalog tools with data sources using APIs.
- Cloud knowledge: familiar with cloud data services (Azure, GCP).
- Data governance: basic knowledge of data stewardship, classification, and compliance.
- Collaboration: strong communication skills to work across data and business teams.

What we offer you:
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/resumes. Please do not forward CVs/resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.
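As a hedged sketch of the metadata-ingestion automation in the responsibilities above: pushing a dataset's descriptive metadata to a catalog REST endpoint with requests. The URL, token, and payload schema are hypothetical; Collibra and DataHub each expose their own, different APIs for this.

```python
# Register a dataset's metadata with a catalog via a REST call.
# Endpoint URL, auth token, and payload fields are hypothetical.
import requests

CATALOG_URL = "https://catalog.example.com/api/assets"   # hypothetical
payload = {
    "name": "reference_prices",
    "domain": "Market Data",
    "owner": "data-engineering",
    "description": "Daily reference prices ingested from vendor feeds.",
}
resp = requests.post(CATALOG_URL, json=payload,
                     headers={"Authorization": "Bearer <token>"})
resp.raise_for_status()
print("registered asset:", resp.json())
```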
Posted 2 months ago
15.0 - 19.0 years
0 Lacs
Karnataka
On-site
The SF Data Cloud Architect plays a critical role within Salesforce's Professional Services team, assisting in pre-sales and leading the design and implementation of enterprise-grade data management solutions. As the SF Data Cloud Architect, you will be responsible for architecting scalable solutions across enterprise landscapes using Data Cloud. Your role involves ensuring that data is ready for enterprise AI, applying data governance guardrails, and supporting enterprise analytics and automation. This position covers the ANZ, ASEAN, and India markets.

The ideal candidate for this role should bring deep expertise in data architecture, the project lifecycle, and the Salesforce ecosystem. Additionally, strong soft skills, stakeholder engagement capabilities, and technical writing ability are essential. You will collaborate with cross-functional teams to shape the future of the customer's data ecosystem and enable data excellence at scale.

Key Responsibilities:
- Serve as a Salesforce Data Cloud trusted advisor, supporting and leading project delivery and customer engagements during the pre-sales cycle. Provide insights on how Data Cloud contributes to the success of AI projects.
- Offer architecture support by providing data and system architecture guidance to Salesforce account teams and customers. This includes reviewing proposed architectures and peer-reviewing project effort estimates, scope, and delivery considerations.
- Lead project delivery by working on cross-cloud projects and spearheading Data Cloud design and delivery. Collaborate with cross-functional teams, from developers to executives.
- Design and guide the customer's enterprise data architecture aligned with their business goals. Emphasize the importance of data ethics and privacy by ensuring that customer solutions adhere to relevant regulations and best practices in data security and privacy.
- Lead Data Cloud architecture enablement for key domains and cross-cloud teams.
- Collaborate with analytics and AI teams to ensure data readiness for advanced analytics, reporting, and AI/ML initiatives.
- Engage with stakeholders across multiple Salesforce teams and projects to deliver aligned and trusted data solutions. Influence executive customer stakeholders while aligning technology strategy with business value and ROI. Build strong relationships with internal and external teams to contribute to broader goals and growth.
- Create and maintain high-quality architecture blueprints, design documents, standards, and technical guidelines.

Technical Skills:
- Over 15 years of experience in data architecture or consulting, with expertise in solution design and project delivery.
- Deep knowledge of MDM, data distribution, and data modelling concepts.
- Expertise in data modelling with a strong understanding of metadata and lineage.
- Experience in executing data strategies, landscape architecture assessments, and proofs of concept.
- Excellent communication, stakeholder management, and presentation skills.
- Strong technical writing and documentation abilities.
- Basic understanding of Hadoop/Spark fundamentals is an advantage.
- Understanding of data platforms such as Snowflake, Databricks, AWS, GCP, and MS Azure.
- Experience with tools like Salesforce Data Cloud or similar enterprise data platforms; hands-on, deep Data Cloud experience is a strong plus.
- Working knowledge of enterprise data warehouse, data lake, and data hub concepts.
- Strong understanding of Salesforce products and functional domains like Technology, Finance, Telco, Manufacturing, and Retail is beneficial.

Expected Qualifications:
- Salesforce Certified Data Cloud Consultant - highly preferred.
- Salesforce Data Architect - preferred.
- Salesforce Application Architect - preferred.
- AWS Spark/DL, Azure Databricks, Fabric, Google Cloud, Snowflake, or similar - preferred.
Posted 2 months ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
A career in our Advisory Acceleration Centre is the natural extension of PwC's leading-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process,
Posted 2 months ago
6.0 - 11.0 years
10 - 20 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
We are seeking a skilled and experienced SAP Commerce Developer (SAP Hybris) with 6-12 years of expertise in e-commerce platform development. The candidate will work on customizing and implementing Hybris-based applications (B2C and B2B) with strong proficiency in WCMS, Solr, HMC, CMS, Product Cockpit, CronJobs, and ImpEx. Responsibilities include product data modeling, catalog structure design, and leveraging composable storefronts, OCC, and headless architecture. The developer will integrate SAP with backend systems, develop scalable REST/SOAP web services, and lead software development teams using agile methodologies. Hands-on experience with Java, J2EE, XML, AJAX, and JavaScript is essential. Knowledge of Hybris Data Hub, CPQ, and SAP integration is a strong advantage. Candidates must exhibit a proactive attitude, flexibility, and the ability to manage ambiguity while driving results. This role offers a six-month remote opportunity with flexible timings. Immediate joiners are preferred. Keywords: SAP Commerce Developer, SAP Hybris, WCMS, Solr, HMC, Product Cockpit, REST/SOAP Web Services, Java, J2EE, Composable Storefront, SAP Integration, Headless Architecture, CPQ, Data Hub. Location: Remote, Hyderabad, Ahmedabad, Pune, Chennai, Kolkata.
Posted 2 months ago
6.0 - 11.0 years
8 - 14 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
We are seeking a skilled and experienced SAP Commerce Developer (SAP Hybris) with 6-12 years of expertise in e-commerce platform development. The candidate will work on customizing and implementing Hybris-based applications (B2C and B2B) with strong proficiency in WCMS, Solr, HMC, CMS, Product Cockpit, CronJobs, and ImpEx. Responsibilities include product data modeling, catalog structure design, and leveraging composable storefronts, OCC, and headless architecture. The developer will integrate SAP with backend systems, develop scalable REST/SOAP web services, and lead software development teams using agile methodologies. Hands-on experience with Java, J2EE, XML, AJAX, and JavaScript is essential. Knowledge of Hybris Data Hub, CPQ, and SAP integration is a strong advantage. Candidates must exhibit a proactive attitude, flexibility, and the ability to manage ambiguity while driving results. This role offers a six-month remote opportunity with flexible timings. Immediate joiners are preferred. Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Posted 2 months ago
10.0 - 15.0 years
12 - 17 Lacs
Bengaluru
Work from Office
Project description
We are looking for an experienced Finance Data Hub Platform Product Manager to own the strategic direction, development, and management of the core data platform that underpins our Finance Data Hub. This role is focused on ensuring the platform is scalable, reliable, secure, and optimised to support data ingestion, transformation, and access across the finance organisation. As the Platform Product Manager, you will work closely with engineering, architecture, governance, and infrastructure teams to define the technical roadmap, prioritise platform enhancements, and ensure seamless integration with data and UI product streams. Your focus will be on enabling data products and services by ensuring the platform's core capabilities meet evolving business needs.
Key Responsibilities
- Platform Strategy & Vision: Define and own the roadmap for the Finance Data Hub platform, ensuring it aligns with business objectives and supports broader data product initiatives.
- Technical: Collaborate with architects, data engineers, and governance teams to define and prioritise platform capabilities, including scalability, security, resilience, and data lineage.
- Integration Management: Ensure the platform seamlessly integrates with data streams and serves UI products, enabling efficient data ingestion, transformation, storage, and consumption.
- Infrastructure Coordination: Work closely with infrastructure and DevOps teams to ensure platform performance, cost optimisation, and alignment with enterprise architecture standards.
- Governance & Compliance: Partner with data governance and security teams to ensure the platform adheres to data management standards, privacy regulations, and security protocols.
- Backlog Management: Own and prioritise the platform development backlog, balancing technical needs with business priorities, and ensuring timely delivery of enhancements.
- Agile Leadership: Support and often lead agile ceremonies, write clear user stories focused on platform capabilities, and facilitate collaborative sessions with technical teams.
- Stakeholder Communication: Provide clear updates on platform progress, challenges, and dependencies to stakeholders, ensuring alignment across product and engineering teams.
- Continuous Improvement: Regularly assess platform performance, identify areas for optimisation, and champion initiatives that enhance reliability, scalability, and efficiency.
- Risk Management: Identify and mitigate risks related to platform stability, security, and data integrity.
Skills
Must have
- Proven 10+ years of experience as a Product Manager focused on data platforms, infrastructure, or similar technical products.
- Strong understanding of data platforms and infrastructure, including data ingestion, processing, storage, and access within modern data ecosystems.
- Experience with cloud data platforms (e.g., Azure, AWS, GCP) and knowledge of data lake architectures.
- Understanding of data governance, security, and compliance best practices.
- Strong stakeholder management skills, particularly with technical teams (engineering, architecture, security).
- Experience managing product backlogs and roadmaps in an Agile environment.
- Ability to balance technical depth with business acumen to drive effective decision-making.
Nice to have
- Experience with financial systems and data sources, such as HFM, Fusion, or other ERPs.
- Knowledge of data orchestration and integration tools (e.g., Apache Airflow, Azure Data Factory); a minimal orchestration sketch follows this listing.
- Experience transitioning platforms from legacy technologies (e.g., Teradata) to modern solutions.
- Familiarity with cost optimisation strategies for cloud platforms.
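For illustration only, a minimal sketch of the kind of daily ingestion orchestration such a platform typically coordinates, assuming Apache Airflow; the DAG name, task name, and the ingestion step itself are hypothetical and not taken from the listing.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_finance_extract(**context):
    # Hypothetical ingestion step: pull one day's finance extract into the
    # data hub's landing zone. Real logic would live in a shared library.
    print(f"Ingesting finance extract for {context['ds']}")


with DAG(
    dag_id="finance_hub_daily_ingest",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(
        task_id="ingest_finance_extract",
        python_callable=ingest_finance_extract,
    )

A platform product manager would not write this code, but reasoning about DAGs like this one (scheduling, retries, backfills, dependencies) is central to prioritising the orchestration capabilities the role describes.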
Posted 2 months ago
8.0 - 13.0 years
13 - 17 Lacs
Noida
Work from Office
Join Us in Transforming Healthcare with the Power of Data & AI
At Innovaccer, we're on a mission to build the most advanced Healthcare Intelligence Platform ever created. Grounded in an AI-first design philosophy, our platform turns complex health data into real-time intelligence, empowering healthcare systems to make faster, smarter decisions. We are building a unified, end-to-end data platform that spans Data Acquisition & Integration, Master Data Management, Data Classification & Governance, Advanced Analytics & AI Studio, App Marketplace, AI-as-BI capabilities, and more. All of this is powered by an Agent-first approach, enabling customers to build solutions dynamically and at scale. You'll have the opportunity to define and develop platform capabilities that help healthcare organizations tackle some of the industry's most pressing challenges, such as kidney disease management, clinical trials optimization for pharmaceutical companies, supply chain intelligence for pharmacies, and many more real-world applications. We're looking for talented engineers and platform thinkers who thrive on solving large-scale, complex, and meaningful problems. If you're excited about working at the intersection of healthcare, AI, and cutting-edge platform engineering, we'd love to hear from you.
About the Role
We are looking for a Staff Engineer to design and develop highly scalable, low-latency data platforms and processing engines. This role is ideal for engineers who enjoy building core systems and infrastructure that enable mission-critical analytics at scale. You'll work on solving some of the toughest data engineering challenges in healthcare.
A Day in the Life
- Architect, design, and build scalable data tools and frameworks.
- Collaborate with cross-functional teams to ensure data compliance, security, and usability.
- Lead initiatives around metadata management, data lineage, and data cataloging (a minimal lineage-event sketch follows this listing).
- Define and evangelize standards and best practices across data engineering teams.
- Own the end-to-end lifecycle of tooling, from prototyping to production deployment.
- Mentor and guide junior engineers and contribute to technical leadership across the organization.
- Drive innovation in privacy-by-design, regulatory compliance (e.g., HIPAA), and data observability solutions.
What You Need
- 8+ years of experience in software engineering, with strong experience building distributed systems.
- Proficiency in backend development (Python, Java, Scala, or Go) and familiarity with RESTful API design.
- Expertise in modern data stacks: Kafka, Spark, Airflow, Snowflake, etc.
- Experience with open-source data governance frameworks like Apache Atlas, Amundsen, or DataHub is a big plus.
- Familiarity with cloud platforms (AWS, Azure, GCP) and their native data governance offerings.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
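As a rough illustration of the metadata and lineage work described above, here is a minimal sketch that publishes a data-lineage event to Kafka, assuming the kafka-python client; the topic name, event schema, pipeline name, and dataset names are hypothetical, not Innovaccer's actual design.

import json
from datetime import datetime, timezone

from kafka import KafkaProducer  # assumes the kafka-python package

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)

# Hypothetical lineage event: records that an output dataset was derived
# from two input datasets by a named pipeline run.
lineage_event = {
    "pipeline": "claims_normalization",  # hypothetical pipeline name
    "inputs": ["raw.claims", "ref.providers"],
    "output": "curated.claims_normalized",
    "emitted_at": datetime.now(timezone.utc).isoformat(),
}

producer.send("data-lineage-events", value=lineage_event)  # hypothetical topic
producer.flush()

Catalog tools such as DataHub consume event streams of roughly this shape to build searchable lineage graphs, which is why streaming plus governance experience appears together in the requirements.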
Posted 2 months ago
9.0 - 14.0 years
12 - 16 Lacs
Pune
Work from Office
Skills required: Strong SQL (minimum 6-7 years of experience), data warehouse, ETL
The Data and Client Platform Tech project provides all data-related services to internal and external clients of the SST business. The Ingestion team is responsible for getting and ingesting data into the Datalake. This is a global team with development teams in Shanghai, Pune, Dublin, and Tampa. The Ingestion team uses Big Data technologies such as Impala, Hive, Spark, and HDFS, and cloud technologies such as Snowflake for cloud data storage.
Responsibilities:
- Gain an understanding of the complex domain model and define the logical and physical data model for the Securities Services business.
- Constantly improve the ingestion, storage, and performance processes by analyzing them and automating them wherever possible.
- Define standards and best practices for the team in the areas of code standards, unit testing, continuous integration, and release management.
- Improve the performance of queries from lake tables and views (a brief tuning sketch follows this listing).
- Work with a wide variety of stakeholders (source systems, business sponsors, product owners, scrum masters, and enterprise architects) and articulate challenging technical details to various classes of people with excellent communication skills.
- Work in Agile Scrum and complete all assigned tasks and JIRAs per sprint timelines and standards.
Qualifications:
- 5-8 years of relevant experience in data development, ETL, data ingestion, and performance optimization.
- Strong SQL skills are essential; experience writing complex queries spanning multiple tables is required.
- Knowledge of Big Data technologies (Impala, Hive, Spark) is nice to have.
- Working knowledge of performance tuning of database queries: understanding the inner workings of the query optimizer, query plans, indexes, partitions, etc.
- Experience in systems analysis and programming of software applications in SQL and other Big Data query languages.
- Working knowledge of data modelling and dimensional modelling tools and techniques.
- Knowledge of working with high-volume data ingestion and high-volume historic data processing is required.
- Exposure to a scripting language like shell scripting or Python is required.
- Working knowledge of consulting and project management techniques and methods.
- Knowledge of working in Agile Scrum teams and processes.
- Experience in data quality, data governance, DataOps, and the latest data management techniques is a plus.
Education: Bachelor's degree/University degree or equivalent experience
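As a small illustration of the query-tuning work this role involves, a sketch using PySpark with hypothetical lake table names and paths; the point is partition pruning and plan inspection, not the specific schema.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lake_query_tuning").getOrCreate()

# Hypothetical lake table partitioned by business_date; filtering on the
# partition column lets the engine prune partitions rather than scan the
# full table.
trades = spark.read.parquet("/lake/securities/trades")  # hypothetical path
daily = (
    trades
    .filter(trades.business_date == "2024-06-03")
    .groupBy("account_id")
    .count()
)

# Inspect the physical plan to confirm the partition filter is pushed down;
# the same habit applies to reading query plans in Impala or Hive.
daily.explain(True)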
Posted 2 months ago
11.0 - 15.0 years
50 - 55 Lacs
Ahmedabad, Chennai, Bengaluru
Work from Office
Dear Candidate,
We are hiring a Zig Developer to create reliable and performant systems software. Zig emphasizes safety and manual control without hidden behavior, making it ideal for OS-level programming, game engines, or embedded development.
Key Responsibilities:
- Develop low-level systems using the Zig programming language.
- Replace or interface with C codebases using Zig's FFI.
- Focus on compile-time safety and performance tuning.
- Build tools, compilers, or libraries with deterministic behavior.
- Contribute to debugging, testing, and optimization.
Required Skills & Qualifications:
- Strong understanding of Zig, manual memory management, and no-runtime environments.
- Experience with C interop, embedded systems, or OS internals.
- Familiarity with LLVM, compilers, or real-time systems.
- Bonus: Interest in Rust, C++, or Go.
Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.
Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.
Srinivasa Reddy Kandi
Delivery Manager
Integra Technologies
Posted 2 months ago