
30 Data Hub Jobs

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

3.0 - 7.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Working with Us: Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us

Summary: As a Data Engineer based out of BMS Hyderabad, you are part of the Data Platform team, supporting the larger Data Engineering community, that delivers data and analytics capabilities across different IT functional domains. The ideal candidate will have a strong background in data engineering, DataOps, and cloud-native services, and will be comfortable working with both structured and unstructured data.

Key Responsibilities: The Data Engineer will be responsible for designing, building, and maintaining ETL pipelines and data products, evolving those data products, and selecting the most suitable data architecture for the organization's data needs. Responsible for delivering high-quality data products and analytics-ready data solutions. Work with an end-to-end ownership mindset; innovate and drive initiatives through completion. Develop and maintain data models to support our reporting and analysis needs. Optimize data storage and retrieval to ensure efficient performance and scalability. Collaborate with data architects, data analysts and data scientists to understand their data needs and ensure that the data infrastructure supports their requirements. Ensure data quality and integrity through data validation and testing. Implement and maintain security protocols to protect sensitive data. Stay up to date with emerging trends and technologies in data engineering and analytics. Closely partner with the Enterprise Data and Analytics Platform team, other functional data teams and the Data Community lead to shape and adopt the data and technology strategy. Serves as the Subject Matter Expert on Data & Analytics Solutions. Knowledgeable in evolving trends in data platforms and product-based implementation. Has an end-to-end ownership mindset in driving initiatives through completion. Comfortable working in a fast-paced environment with minimal oversight. Mentors other team members effectively to unlock full potential. Prior experience working in an Agile/product-based environment.

Qualifications & Experience: 7+ years of hands-on experience implementing and operating data capabilities and cutting-edge data solutions, preferably in a cloud environment. Breadth of experience in technology capabilities spanning the full life cycle of data management, including data lakehouses, master/reference data management, data quality and analytics/AI-ML, is needed. In-depth knowledge and hands-on experience with AWS Glue services and the AWS data engineering ecosystem. Hands-on experience developing and delivering data and ETL solutions with technologies like AWS data services (Redshift, Athena, Lake Formation, etc.); Cloudera Data Platform or Tableau experience is a plus. 5+ years of experience in data engineering or software development. Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Strong programming skills in languages such as Python, R, PyTorch, PySpark, Pandas, Scala, etc. Experience with SQL and database technologies such as MySQL, PostgreSQL, Presto, etc. Experience with cloud-based data technologies such as AWS, Azure, or Google Cloud Platform. Strong analytical and problem-solving skills. Excellent communication and collaboration skills. Functional knowledge or prior experience in the Life Sciences Research and Development domain is a plus. Experience and expertise in establishing agile, product-oriented teams that work effectively with teams in the US and other global BMS sites. Initiates challenging opportunities that build strong capabilities for self and team. Demonstrates a focus on improving processes, structures, and knowledge within the team. Leads in analyzing current states, delivers strong recommendations that account for the complexity of the environment, and executes to bring complex solutions to completion.

If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career.

Uniquely Interesting Work, Life-changing Careers: With a single vision as inspiring as "Transforming patients' lives through science", every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues.

On-site Protocol: BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role. Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility; for these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles, the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function.

BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement.

BMS cares about your well-being and the well-being of our staff, customers, patients, and communities. As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters.

BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area.

If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/

Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.
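For a concrete flavour of the AWS/PySpark pipeline work this role describes, here is a minimal ETL sketch; the bucket paths, column names, and validation rule are illustrative assumptions, not details from the posting:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal ETL sketch: read raw records, standardize, and write an
# analytics-ready, partitioned dataset. Paths and columns are hypothetical.
spark = SparkSession.builder.appName("claims-etl-sketch").getOrCreate()

raw = spark.read.parquet("s3://example-raw-bucket/claims/")  # placeholder path

clean = (
    raw.dropDuplicates(["claim_id"])                       # basic de-duplication
       .withColumn("claim_date", F.to_date("claim_date"))  # normalize types
       .filter(F.col("amount") > 0)                        # simple validation rule
)

(clean.write
      .mode("overwrite")
      .partitionBy("claim_date")                           # partition for query pruning
      .parquet("s3://example-curated-bucket/claims/"))
```

The same shape carries over to an AWS Glue job, where Glue supplies the Spark session and catalog integration.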

Posted 4 days ago

Apply

1.0 - 4.0 years

9 - 13 Lacs

Pune

Work from Office

Overview: The Data Technology group in MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers Reference, Market and other critical datapoints to various products of the firm. The platform, hosted on the firm's data centers and the Azure and GCP public clouds, processes 100 TB+ of data and is expected to run 24x7. With increased focus on automation around systems development and operations, Data Science based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation and is committed to providing self-serve tools to our internal customers.

Responsibilities:
Implement & Maintain Data Catalogs: Deploy and manage the data catalog tool Collibra to improve data discoverability and governance.
Metadata & Lineage Management: Automate metadata collection, establish data lineage, and maintain consistent data definitions across systems.
Enable Data Governance: Collaborate with governance teams to apply data policies, classifications, and ownership structures in the catalog.
Support Self-Service & Adoption: Promote catalog usage across teams through training, documentation, and continuous support.
Cross-Team Collaboration: Work closely with data engineers, analysts, and stewards to align catalog content with business needs.
Tooling & Automation: Build scripts and workflows for metadata ingestion, tagging, and monitoring of catalog health; leverage AI tools to automate cataloging activities.
Reporting & Documentation: Maintain documentation and generate usage metrics, ensuring transparency and operational efficiency.

Qualifications:
Self-motivated, collaborative individual with a passion for excellence.
B.E. in Computer Science or equivalent with 5+ years of total experience and at least 2 years of experience working with Azure DevOps tools and technologies.
Good working knowledge of source control applications like Git, with prior experience building deployment workflows using this tool.
Good working knowledge of Snowflake, YAML, Python.
Tools: Experience with data catalog platforms (e.g., Collibra, Alation, DataHub).
Metadata & Lineage: Understanding of metadata management and data lineage.
Scripting: Proficient in SQL and Python for automation and integration.
APIs & Integration: Ability to connect catalog tools with data sources using APIs.
Cloud Knowledge: Familiar with cloud data services (Azure, GCP).
Data Governance: Basic knowledge of data stewardship, classification, and compliance.
Collaboration: Strong communication skills to work across data and business teams.

What we offer you: Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
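The tooling and automation responsibilities above amount to scripting against the catalog's REST API. Below is a rough Python sketch of registering an asset via a Collibra-style REST endpoint; the base URL, endpoint path, IDs, and payload fields are assumptions for illustration and should be checked against the Collibra API documentation for the version in use:

```python
import requests

# Hypothetical sketch of registering a table as a catalog asset via a
# Collibra-style REST API. Instance URL, endpoint path, and payload fields
# are assumptions for illustration, not a verified Collibra contract.
BASE_URL = "https://example.collibra.com/rest/2.0"   # placeholder instance
session = requests.Session()
session.auth = ("svc_catalog_user", "***")           # placeholder credentials

payload = {
    "name": "finance_db.daily_positions",                # asset to register
    "domainId": "00000000-0000-0000-0000-000000000000",  # placeholder domain ID
    "typeId": "00000000-0000-0000-0000-000000000001",    # placeholder "Table" type ID
}

resp = session.post(f"{BASE_URL}/assets", json=payload, timeout=30)
resp.raise_for_status()
print("Created asset:", resp.json().get("id"))
```

In practice a workflow like this would run on a schedule, sweeping new tables out of the warehouse's information schema and into the catalog.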

Posted 2 weeks ago

Apply

15.0 - 19.0 years

0 Lacs

karnataka

On-site

The SF Data Cloud Architect plays a critical role within Salesforce's Professional Services team, assisting in pre-sales and leading the design and implementation of enterprise-grade Data Management solutions. As the SF Data Cloud Architect, you will be responsible for architecting scalable solutions across enterprise landscapes using Data Cloud. Your role involves ensuring that data is ready for enterprise AI, applying data governance guardrails, and supporting enterprise analytics and automation. This position covers the ANZ, ASEAN, and India markets. The ideal candidate for this role should bring deep expertise in data architecture, the project lifecycle, and Salesforce ecosystem knowledge. Additionally, possessing strong soft skills, stakeholder engagement capabilities, and technical writing ability is essential. You will collaborate with cross-functional teams to shape the future of the customer's data ecosystem and enable data excellence at scale.

Key Responsibilities:
- Serve as a Salesforce Data Cloud Trusted Advisor, supporting and leading project delivery and customer engagements during the pre-sales cycle. Provide insights on how Data Cloud contributes to the success of AI projects.
- Offer Architecture Support by providing Data and System Architecture guidance to Salesforce Account teams and Customers. This includes reviewing proposed architectures and peer-reviewing project effort estimates, scope, and delivery considerations.
- Lead Project Delivery by working on cross-cloud projects and spearheading Data Cloud Design & Delivery. Collaborate with cross-functional teams from Developers to Executives.
- Design and guide the customer's enterprise data architecture aligned with their business goals. Emphasize the importance of Data Ethics and Privacy by ensuring that customer solutions adhere to relevant regulations and best practices in data security and privacy.
- Lead Data Cloud architecture enablement for key domains and cross-cloud teams.
- Collaborate with analytics and AI teams to ensure data readiness for advanced analytics, reporting, and AI/ML initiatives.
- Engage with stakeholders across multiple Salesforce teams and projects to deliver aligned and trusted data solutions. Influence Executive Customer stakeholders while aligning technology strategy with business value and ROI. Build strong relationships with internal and external teams to contribute to broader goals and growth.
- Create and maintain high-quality architecture blueprints, design documents, standards, and technical guidelines.

Technical Skills:
- Over 15 years of experience in data architecture or consulting, with expertise in solution design and project delivery.
- Deep knowledge of MDM, Data Distribution, and Data Modelling concepts.
- Expertise in data modelling with a strong understanding of metadata and lineage.
- Experience in executing data strategies, landscape architecture assessments, and proof-of-concepts.
- Excellent communication, stakeholder management, and presentation skills.
- Strong technical writing and documentation abilities.
- Basic understanding of Hadoop/Spark fundamentals is an advantage.
- Understanding of data platforms such as Snowflake, Databricks, AWS, GCP, and MS Azure.
- Experience with tools like Salesforce Data Cloud or similar enterprise data platforms. Hands-on, deep Data Cloud experience is a strong plus.
- Working knowledge of enterprise data warehouse, data lake, and data hub concepts.
- Strong understanding of Salesforce products and functional domains like Technology, Finance, Telco, Manufacturing, and Retail is beneficial.

Expected Qualifications:
- Salesforce Certified Data Cloud Consultant - Highly Preferred.
- Salesforce Data Architect - Preferred.
- Salesforce Application Architect - Preferred.
- AWS Spark/DL, Azure Databricks, Fabric, Google Cloud, Snowflake, or similar - Preferred.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

A career in our Advisory Acceleration Centre is the natural extension of PwC's leading-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process,

Posted 2 weeks ago

Apply

6.0 - 11.0 years

10 - 20 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking a skilled and experienced SAP Commerce Developer (SAP Hybris) with 6-12 years of expertise in e-commerce platform development. The candidate will work on customizing and implementing Hybris-based applications (B2C and B2B) with strong proficiency in WCMS, Solr, HMC, CMS, Product Cockpit, CronJobs, and ImpEx. Responsibilities include product data modeling, catalog structure design, and leveraging composable storefronts, OCC, and headless architecture. The developer will integrate SAP with backend systems, develop scalable REST/SOAP web services, and lead software development teams using agile methodologies. Hands-on experience with Java, J2EE, XML, AJAX, and JavaScript is essential. Knowledge of Hybris Data Hub, CPQ, and SAP integration is a strong advantage. Candidates must exhibit a proactive attitude, flexibility, and the ability to manage ambiguity while driving results. This role offers a six-month remote opportunity with flexible timings. Immediate joiners are preferred. Keywords: SAP Commerce Developer, SAP Hybris, WCMS, Solr, HMC, Product Cockpit, REST/SOAP Web Services, Java, J2EE, Composable Storefront, SAP Integration, Headless Architecture, CPQ, Data Hub. Location: Remote, Hyderabad, Ahmedabad, Pune, Chennai, Kolkata.
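Since the role calls out OCC and headless architecture, here is a minimal sketch of a headless storefront call against SAP Commerce's OCC v2 REST API; the host, base site ID, and product code are placeholder assumptions, and the exact endpoint shape should be confirmed against the deployment's API documentation:

```python
import requests

# Minimal sketch of a headless storefront call against SAP Commerce's
# OCC v2 REST API. Host, base site ID, and product code are placeholders;
# confirm the endpoint shape against your deployment's API docs.
HOST = "https://api.example-commerce.com"   # hypothetical Commerce host
SITE = "electronics"                        # hypothetical base site ID

resp = requests.get(
    f"{HOST}/occ/v2/{SITE}/products/12345",  # hypothetical product code
    params={"fields": "FULL"},               # request the full product view
    timeout=30,
)
resp.raise_for_status()
product = resp.json()
print(product.get("name"), product.get("price", {}).get("formattedValue"))
```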

Posted 3 weeks ago

Apply

6.0 - 11.0 years

8 - 14 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking a skilled and experienced SAP Commerce Developer (SAP Hybris) with 6-12 years of expertise in e-commerce platform development. The candidate will work on customizing and implementing Hybris-based applications (B2C and B2B) with strong proficiency in WCMS, Solr, HMC, CMS, Product Cockpit, CronJobs, and ImpEx. Responsibilities include product data modeling, catalog structure design, and leveraging composable storefronts, OCC, and headless architecture. The developer will integrate SAP with backend systems, develop scalable REST/SOAP web services, and lead software development teams using agile methodologies. Hands-on experience with Java, J2EE, XML, AJAX, and JavaScript is essential. Knowledge of Hybris Data Hub, CPQ, and SAP integration is a strong advantage. Candidates must exhibit a proactive attitude, flexibility, and the ability to manage ambiguity while driving results. This role offers a six-month remote opportunity with flexible timings. Immediate joiners are preferred. Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 3 weeks ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Project description: We are looking for an experienced Finance Data Hub Platform Product Manager to own the strategic direction, development, and management of the core data platform that underpins our Finance Data Hub. This role is focused on ensuring the platform is scalable, reliable, secure, and optimized to support data ingestion, transformation, and access across the finance organisation. As the Platform Product Manager, you will work closely with engineering, architecture, governance, and infrastructure teams to define the technical roadmap, prioritize platform enhancements, and ensure seamless integration with data and UI product streams. Your focus will be on enabling data products and services by ensuring the platform's core capabilities meet evolving business needs.

Key Responsibilities:
Platform Strategy & Vision: Define and own the roadmap for the Finance Data Hub platform, ensuring it aligns with business objectives and supports broader data product initiatives.
Technical Collaboration: Collaborate with architects, data engineers, and governance teams to define and prioritise platform capabilities, including scalability, security, resilience, and data lineage.
Integration Management: Ensure the platform seamlessly integrates with data streams and serves UI products, enabling efficient data ingestion, transformation, storage, and consumption.
Infrastructure Coordination: Work closely with infrastructure and DevOps teams to ensure platform performance, cost optimisation, and alignment with enterprise architecture standards.
Governance & Compliance: Partner with data governance and security teams to ensure the platform adheres to data management standards, privacy regulations, and security protocols.
Backlog Management: Own and prioritise the platform development backlog, balancing technical needs with business priorities, and ensuring timely delivery of enhancements.
Agile Leadership: Support and often lead agile ceremonies, write clear user stories focused on platform capabilities, and facilitate collaborative sessions with technical teams.
Stakeholder Communication: Provide clear updates on platform progress, challenges, and dependencies to stakeholders, ensuring alignment across product and engineering teams.
Continuous Improvement: Regularly assess platform performance, identify areas for optimization, and champion initiatives that enhance reliability, scalability, and efficiency.
Risk Management: Identify and mitigate risks related to platform stability, security, and data integrity.

Skills - Must have:
Proven 10+ years' experience as a Product Manager focused on data platforms, infrastructure, or similar technical products.
Strong understanding of data platforms and infrastructure, including data ingestion, processing, storage, and access within modern data ecosystems.
Experience with cloud data platforms (e.g., Azure, AWS, GCP) and knowledge of data lake architectures.
Understanding of data governance, security, and compliance best practices.
Strong stakeholder management skills, particularly with technical teams (engineering, architecture, security).
Experience managing product backlogs and roadmaps in an Agile environment.
Ability to balance technical depth with business acumen to drive effective decision-making.

Nice to have:
Experience with financial systems and data sources, such as HFM, Fusion, or other ERPs.
Knowledge of data orchestration and integration tools (e.g., Apache Airflow, Azure Data Factory); see the sketch after this listing.
Experience with transitioning platforms from legacy technologies (e.g., Teradata) to modern solutions.
Familiarity with cost optimization strategies for cloud platforms.
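As a flavour of the orchestration tooling named in the nice-to-have list, here is a minimal Airflow 2.x-style DAG sketch for a finance ingestion flow; the DAG ID, task names, and callables are illustrative assumptions:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Minimal orchestration sketch: ingest -> validate -> publish.
# DAG ID, task names, and the Python callables are placeholders.

def ingest():
    print("pull GL extracts from source systems")

def validate():
    print("run data-quality checks before publishing")

def publish():
    print("load curated tables for downstream consumers")

with DAG(
    dag_id="finance_data_hub_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="ingest", python_callable=ingest)
    t2 = PythonOperator(task_id="validate", python_callable=validate)
    t3 = PythonOperator(task_id="publish", python_callable=publish)
    t1 >> t2 >> t3   # linear dependency chain
```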

Posted 4 weeks ago

Apply

8.0 - 13.0 years

13 - 17 Lacs

Noida

Work from Office

Join Us in Transforming Healthcare with the Power of Data & AI. At Innovaccer, we're building the most advanced Healthcare Intelligence Platform ever created. Grounded in an AI-first design philosophy, our platform turns complex health data into real-time intelligence, empowering healthcare systems to make faster, smarter decisions. We are building a unified, end-to-end data platform that spans Data Acquisition & Integration, Master Data Management, Data Classification & Governance, Advanced Analytics & AI Studio, App Marketplace, AI-as-BI capabilities, etc. All of this is powered by an Agent-first approach, enabling customers to build solutions dynamically and at scale. You'll have the opportunity to define and develop platform capabilities that help healthcare organizations tackle some of the industry's most pressing challenges, such as kidney disease management, clinical trials optimization for pharmaceutical companies, supply chain intelligence for pharmacies, and many more real-world applications. We're looking for talented engineers and platform thinkers who thrive on solving large-scale, complex, and meaningful problems. If you're excited about working at the intersection of healthcare, AI, and cutting-edge platform engineering, we'd love to hear from you.

About the Role: We are looking for a Staff Engineer to design and develop highly scalable, low-latency data platforms and processing engines. This role is ideal for engineers who enjoy building core systems and infrastructure that enable mission-critical analytics at scale. You'll work on solving some of the toughest data engineering challenges in healthcare.

A Day in the Life: Architect, design, and build scalable data tools and frameworks. Collaborate with cross-functional teams to ensure data compliance, security, and usability. Lead initiatives around metadata management, data lineage, and data cataloging. Define and evangelize standards and best practices across data engineering teams. Own the end-to-end lifecycle of tooling, from prototyping to production deployment. Mentor and guide junior engineers and contribute to technical leadership across the organization. Drive innovation in privacy-by-design, regulatory compliance (e.g., HIPAA), and data observability solutions.

What You Need: 8+ years of experience in software engineering with strong experience building distributed systems. Proficient in backend development (Python, Java, Scala, or Go) and familiar with RESTful API design. Expertise in modern data stacks: Kafka, Spark, Airflow, Snowflake, etc. Experience with open-source data governance frameworks like Apache Atlas, Amundsen, or DataHub is a big plus. Familiarity with cloud platforms (AWS, Azure, GCP) and their native data governance offerings. Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
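For the metadata and lineage initiatives this role describes, here is a rough sketch using the open-source DataHub Python emitter (the acryl-datahub package); the server URL and dataset names are placeholders, and the exact SDK classes should be verified against the SDK version in use:

```python
from datahub.emitter.mce_builder import make_dataset_urn
from datahub.emitter.mcp import MetadataChangeProposalWrapper
from datahub.emitter.rest_emitter import DatahubRestEmitter
from datahub.metadata.com.linkedin.pegasus2avro.dataset import (
    DatasetLineageType,
    Upstream,
    UpstreamLineage,
)

# Sketch: declare that a curated dataset is derived from a raw one, so the
# lineage edge shows up in the DataHub catalog. Names are placeholders.
emitter = DatahubRestEmitter(gms_server="http://localhost:8080")  # local GMS

upstream = Upstream(
    dataset=make_dataset_urn(platform="snowflake", name="raw.claims"),
    type=DatasetLineageType.TRANSFORMED,
)
lineage = UpstreamLineage(upstreams=[upstream])

mcp = MetadataChangeProposalWrapper(
    entityUrn=make_dataset_urn(platform="snowflake", name="curated.claims"),
    aspect=lineage,
)
emitter.emit(mcp)
```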

Posted 1 month ago

Apply

9.0 - 14.0 years

12 - 16 Lacs

Pune

Work from Office

Skills required: Strong SQL (minimum 6-7 years' experience), Data Warehouse, ETL. The Data and Client Platform Tech project provides all data-related services to internal and external clients of the SST business. The Ingestion team is responsible for getting and ingesting data into the Datalake. This is a global team with development teams in Shanghai, Pune, Dublin and Tampa. The Ingestion team uses Big Data technologies like Impala, Hive, Spark and HDFS, as well as cloud technologies such as Snowflake for cloud data storage.

Responsibilities: You will gain an understanding of the complex domain model and define the logical and physical data model for the Securities Services business. You will also constantly improve the ingestion, storage and performance processes by analyzing them and automating them wherever possible. You will be responsible for defining standards and best practices for the team in the areas of Code Standards, Unit Testing, Continuous Integration, and Release Management. You will be responsible for improving performance of queries from lake tables/views. You will be working with a wide variety of stakeholders (source systems, business sponsors, product owners, scrum masters, enterprise architects) and must possess excellent communication skills to articulate challenging technical details to various classes of people. You will be working in Agile Scrum and complete all assigned tasks/JIRAs as per sprint timelines and standards.

Qualifications: 5-8 years of relevant experience in data development, ETL, data ingestion and performance optimization. Strong SQL skills are essential; experience writing complex queries spanning multiple tables is required. Knowledge of Big Data technologies (Impala, Hive, Spark) is nice to have. Working knowledge of performance tuning of database queries: understanding the inner workings of the query optimizer, query plans, indexes, partitions, etc. (see the sketch after this listing). Experience in systems analysis and programming of software applications in SQL and other Big Data query languages. Working knowledge of data modelling and dimensional modelling tools and techniques. Knowledge of working with high-volume data ingestion and high-volume historic data processing is required. Exposure to scripting languages like shell scripting and Python is required. Working knowledge of consulting/project management techniques and methods. Knowledge of working in Agile Scrum teams and processes. Experience in data quality, data governance, DataOps and the latest data management techniques is a plus.

Education: Bachelor's degree/University degree or equivalent experience.
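The query-tuning expectation above (query plans, indexes, partitions) is easy to illustrate. The following self-contained Python sketch uses sqlite3 purely so it runs anywhere; the same inspect-plan, add-index, re-check habit carries over to Impala, Hive, or Snowflake:

```python
import sqlite3

# Self-contained illustration of reading a query plan and seeing the effect
# of an index. sqlite3 is used only so the example runs anywhere; table and
# column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER, account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [(i, f"acct{i % 100}", i * 1.5) for i in range(10_000)],
)

def show_plan(sql):
    for row in conn.execute("EXPLAIN QUERY PLAN " + sql):
        print(row)

query = "SELECT SUM(amount) FROM trades WHERE account = 'acct7'"
show_plan(query)  # before: full table scan
conn.execute("CREATE INDEX ix_trades_account ON trades(account)")
show_plan(query)  # after: the plan uses ix_trades_account
```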

Posted 1 month ago

Apply

11.0 - 15.0 years

50 - 55 Lacs

Ahmedabad, Chennai, Bengaluru

Work from Office

Dear Candidate, We are hiring a Zig Developer to create reliable and performant systems software. Zig emphasizes safety and manual control without hidden behavior, making it ideal for OS-level programming, game engines, or embedded development. Key Responsibilities: Develop low-level systems using the Zig programming language. Replace or interface with C codebases using Zig's FFI. Focus on compile-time safety and performance tuning. Build tools, compilers, or libraries with deterministic behavior. Contribute to debugging, testing, and optimization. Required Skills & Qualifications: Strong understanding of Zig, manual memory management, and its no-runtime model. Experience with C interop, embedded systems, or OS internals. Familiarity with LLVM, compilers, or real-time systems. Bonus: interest in Rust, C++, or Go. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Srinivasa Reddy Kandi, Delivery Manager, Integra Technologies

Posted 1 month ago

Apply

5.0 - 10.0 years

30 - 35 Lacs

Mumbai

Work from Office

About the Job: Be the expert customers turn to when they need to build strategic, scalable systems. Red Hat Services is looking for a well-rounded Architect to join our team in Mumbai, covering Asia Pacific. In this role, you will design and implement modern platforms, onboard and build cloud-native applications, and lead architecture engagements using the latest open source technologies. You'll be part of a team of consultants who are leaders in open hybrid cloud, platform modernisation, automation, and emerging practices, including foundational AI integration. Working in agile teams alongside our customers, you'll build, test, and iterate on innovative prototypes that drive real business outcomes. This role is ideal for architects who can work across application, infrastructure, and modern AI-enabling platforms like Red Hat OpenShift AI. If you're passionate about open source, building solutions that scale, and shaping the future of how enterprises innovate, this is your opportunity.

What will you do: Design and implement modern platform architectures with a strong understanding of Red Hat OpenShift, container orchestration, and automation at scale. Manage Day-2 operations of Kubernetes container platforms by collaborating with infrastructure teams in defining practices for platform deployment, platform hardening, platform observability, monitoring and alerting, capacity management, scalability, resiliency, and security operations. Lead the discovery, architecture, and delivery of modern platforms and cloud-native applications, using technologies such as containers, APIs, microservices, and DevSecOps patterns. Collaborate with customer teams to co-create AI-ready platforms, enabling future use cases with foundational knowledge of AI/ML workloads. Remain hands-on with development and implementation, especially in prototyping, MVP creation, and agile iterative delivery. Present strategic roadmaps and architectural visions to customer stakeholders, from engineers to executives. Support technical presales efforts, workshops, and proofs of concept, bringing in business context and value-first thinking. Create reusable reference architectures, best practices, and delivery models, and mentor others in applying them. Contribute to the development of standard consulting offerings, frameworks, and capability playbooks.

What will you bring: Strong experience with Kubernetes, Docker, and Red Hat OpenShift or equivalent platforms. In-depth expertise in managing multiple Kubernetes clusters across multi-cloud environments. Proven expertise in operationalisation of the Kubernetes container platform through the adoption of Service Mesh, GitOps principles, and Serverless frameworks. Migrating from XKS to OpenShift. Proven leadership of modern software and platform transformation projects. Hands-on coding experience in multiple languages (e.g., Java, Python, Go). Experience with infrastructure as code, automation tools, and CI/CD pipelines. Practical understanding of microservices, API design, and DevOps practices. Applied experience with agile, scrum, and cross-functional team collaboration. Ability to advise customers on platform and application modernisation, with awareness of how platforms support emerging AI use cases. Excellent communication and facilitation skills with both technical and business audiences. Willingness to travel up to 40% of the time.

Nice to Have: Experience with Red Hat OpenShift AI, Open Data Hub, or similar MLOps platforms. Foundational understanding of AI/ML, including containerized AI workloads, model deployment, and open source AI frameworks. Familiarity with AI architectures (e.g., RAG, model inference, GPU-aware scheduling). Engagement in open source communities or a contributor background.

About Red Hat: Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver high-performing Linux, cloud, container, and Kubernetes technologies. Spread across 40+ countries, our associates work flexibly across work environments, from in-office, to office-flex, to fully remote, depending on the requirements of their role. Red Hatters are encouraged to bring their best ideas, no matter their title or tenure. We're a leader in open source because of our open and inclusive environment. We hire creative, passionate people ready to contribute their ideas, help solve complex problems, and make an impact.

Inclusion at Red Hat: Red Hat's culture is built on the open source principles of transparency, collaboration, and inclusion, where the best ideas can come from anywhere and anyone. When this is realized, it empowers people from different backgrounds, perspectives, and experiences to come together to share ideas, challenge the status quo, and drive innovation. Our aspiration is that everyone experiences this culture with equal opportunity and access, and that all voices are not only heard but also celebrated. We hope you will join our celebration, and we welcome and encourage applicants from all the beautiful dimensions that compose our global village.

Equal Opportunity Policy (EEO): Red Hat is proud to be an equal opportunity workplace and an affirmative action employer. We review applications for employment without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, citizenship, age, veteran status, genetic information, physical or mental disability, medical condition, marital status, or any other basis prohibited by law. Red Hat supports individuals with disabilities and provides reasonable accommodations to job applicants. If you need assistance completing our online job application, email application-assistance@redhat.com. General inquiries, such as those regarding the status of a job application, will not receive a reply.
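Day-2 operations of the kind described above lean heavily on automation. As one small illustration, here is a minimal sketch using the official Kubernetes Python client; the namespace and the health check applied are illustrative assumptions:

```python
from kubernetes import client, config

# Minimal Day-2-style health sweep: list pods in a namespace and flag any
# that are not Running/Succeeded. The namespace is a placeholder.
config.load_kube_config()          # uses your local kubeconfig context
v1 = client.CoreV1Api()

for pod in v1.list_namespaced_pod(namespace="platform-services").items:
    phase = pod.status.phase
    if phase not in ("Running", "Succeeded"):
        print(f"attention: {pod.metadata.name} is {phase}")
```

In a real engagement this kind of check would live behind monitoring and alerting tooling rather than an ad hoc script, but the API surface is the same.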

Posted 1 month ago

Apply

7.0 - 12.0 years

25 - 27 Lacs

Pune, Bengaluru

Work from Office

Role & responsibilities: We are seeking an experienced and highly skilled CFIN Non-SAP Functional Consultant with a deep functional understanding of financial processes and extensive hands-on experience in integrating non-SAP source systems into SAP Central Finance (CFIN) using Magnitude/Datahub. This role focuses on enabling replication of financial data (FI/CO) from legacy or third-party ERP systems to SAP S/4HANA Central Finance. Primary responsibilities: Collaborate with cross-functional teams to understand the source system architecture and financial data flow. Design and implement end-to-end non-SAP system integrations to SAP CFIN using Magnitude/Datahub. Configure and manage replication of financial documents (GL, AP, AR, CO, etc.) from non-SAP systems to SAP CFIN. Ensure data mapping, data harmonization, and data transformation align with SAP CFIN requirements. Support ongoing operations and resolve issues related to data inconsistencies, failed replications, and error handling. Lead or participate in workshops, requirement gathering sessions, and technical/functional documentation efforts. Collaborate with the CFIN project team (SAP, Middleware, Source System SMEs) for smooth data integration. Perform impact analysis and support testing cycles (unit testing, SIT, UAT) for all integration scenarios. Work with Magnitude support or internal teams to troubleshoot and optimize performance or configuration. Required Qualifications & Skills: Minimum 7 years of experience in the IT/Finance domain with at least 4 years in CFIN projects. Hands-on experience integrating non-SAP ERPs (Oracle, JDE, PeopleSoft, etc.) to SAP CFIN using Magnitude/Datahub. Strong understanding of financial master data and transactional data replication in CFIN. Sound knowledge of source system extractors, interface design, and error handling in Datahub. Functional understanding of SAP FI/CO modules and their equivalents in non-SAP systems. Experience with mapping and reconciliation between source systems and SAP S/4HANA CFIN. Familiarity with SLT, AIF (Application Interface Framework), and error resolution processes in the SAP CFIN landscape.

Posted 1 month ago

Apply

7.0 - 12.0 years

8 - 18 Lacs

Pune, Bengaluru

Work from Office

Role & responsibilities - Primary responsibilities:
SLT Implementation & Management: Lead the setup and management of SLT for data replication between SAP and non-SAP systems. Ensure smooth data extraction, transformation, and loading (ETL) processes in real-time or batch modes.
Non-SAP to SAP Migration: Oversee the migration of data from non-SAP systems to SAP using SLT, ensuring data integrity, quality, and minimal disruption during the process.
Central Finance Support: Manage and support the implementation of Central Finance using SLT for seamless integration across financial systems. Configure and monitor SLT-based data replication for Central Finance projects.
Basis and Datahub/Magnitude: Collaborate with the SAP Basis team to ensure proper system configuration, maintenance, and performance of SLT landscapes. Work with Datahub/Magnitude solutions to manage and optimize data integration across SAP and non-SAP platforms.
Performance & Troubleshooting: Monitor and optimize the performance of SLT processes, including data replication and transformation jobs. Troubleshoot and resolve any issues related to data replication, integration, and performance bottlenecks.
Documentation & Best Practices: Maintain clear documentation for SLT configuration, migrations, and troubleshooting procedures. Follow SAP best practices to ensure efficient and secure data integration processes.

Required Qualifications & Skills: Strong experience with SAP Basis and SLT for both non-SAP to SAP migrations and Central Finance integrations. Expertise in SLT configuration, monitoring, and troubleshooting. Hands-on experience with Datahub/Magnitude solutions and their integration with SAP systems. Familiarity with SAP Central Finance architecture and data integration processes.

Posted 1 month ago

Apply

7.0 - 12.0 years

0 - 1 Lacs

Bengaluru

Work from Office

We are seeking an experienced and highly skilled CFIN Non-SAP Functional Consultant with a deep functional understanding of financial processes and extensive hands-on experience in integrating non-SAP source systems into SAP Central Finance (CFIN) using Magnitude/Datahub. This role focuses on enabling replication of financial data (FI/CO) from legacy or third-party ERP systems to SAP S/4HANA Central Finance. Primary responsibilities: Collaborate with cross-functional teams to understand the source system architecture and financial data flow. Design and implement end-to-end non-SAP system integrations to SAP CFIN using Magnitude/Datahub. Configure and manage replication of financial documents (GL, AP, AR, CO, etc.) from non-SAP systems to SAP CFIN. Ensure data mapping, data harmonization, and data transformation align with SAP CFIN requirements. Support ongoing operations and resolve issues related to data inconsistencies, failed replications, and error handling. Lead or participate in workshops, requirement gathering sessions, and technical/functional documentation efforts. Collaborate with the CFIN project team (SAP, Middleware, Source System SMEs) for smooth data integration. Perform impact analysis and support testing cycles (unit testing, SIT, UAT) for all integration scenarios. Work with Magnitude support or internal teams to troubleshoot and optimize performance or configuration. Required Qualifications & Skills: Minimum 7 years of experience in the IT/Finance domain with at least 4 years in CFIN projects. Hands-on experience integrating non-SAP ERPs (Oracle, JDE, PeopleSoft, etc.) to SAP CFIN using Magnitude/Datahub. Strong understanding of financial master data and transactional data replication in CFIN. Sound knowledge of source system extractors, interface design, and error handling in Datahub. Functional understanding of SAP FI/CO modules and their equivalents in non-SAP systems. Experience with mapping and reconciliation between source systems and SAP S/4HANA CFIN. Familiarity with SLT, AIF (Application Interface Framework), and error resolution processes in the SAP CFIN landscape.

Posted 1 month ago

Apply

5.0 - 9.0 years

13 - 17 Lacs

Pune

Work from Office

Diacto is looking for a highly capable Data Architect with 5 to 9 years of experience to lead cloud data platform initiatives with a primary focus on Snowflake and Azure Data Hub. This individual will play a key role in defining the data architecture strategy, implementing robust data pipelines, and enabling enterprise-grade analytics solutions. This is an on-site role based in our Baner, Pune office. Qualifications: B.E./B.Tech in Computer Science, IT, or related discipline; MCS/MCA or equivalent preferred. Key Responsibilities: Design and implement enterprise-level data architecture with a strong focus on Snowflake and Azure Data Hub. Define standards and best practices for data ingestion, transformation, and storage. Collaborate with cross-functional teams to develop scalable, secure, and high-performance data pipelines. Lead Snowflake environment setup, configuration, performance tuning, and optimization. Integrate Azure Data Services with Snowflake to support diverse business use cases. Implement governance, metadata management, and security policies. Mentor junior developers and data engineers on cloud data technologies and best practices. Experience and Skills Required: 5-9 years of overall experience in data architecture or data engineering roles. Strong, hands-on expertise in Snowflake, including design, development, and performance tuning. Solid experience with Azure Data Hub and Azure Data Services (Data Lake, Synapse, etc.). Understanding of cloud data integration techniques and ELT/ETL frameworks. Familiarity with data orchestration tools such as DBT, Airflow, or Azure Data Factory. Proven ability to handle structured, semi-structured, and unstructured data. Strong analytical, problem-solving, and communication skills. Nice to Have: Certifications in Snowflake and/or Microsoft Azure. Experience with CI/CD tools like GitHub for code versioning and deployment. Familiarity with real-time or near-real-time data ingestion. Why Join Diacto Technologies? Work with a cutting-edge tech stack and cloud-native architectures. Be part of a data-driven culture with opportunities for continuous learning. Collaborate with industry experts and build transformative data solutions.
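For the Snowflake performance-tuning work this role emphasizes, here is a minimal sketch using the Snowflake Python connector; the account, credentials, and object names are placeholder assumptions:

```python
import snowflake.connector

# Minimal connectivity-and-tuning check with the Snowflake Python connector.
# Account, credentials, and object names are illustrative placeholders.
conn = snowflake.connector.connect(
    account="example_account",      # placeholder
    user="svc_architect",           # placeholder
    password="***",
    warehouse="ANALYTICS_WH",
    database="FINANCE",
    schema="CURATED",
)

cur = conn.cursor()
try:
    # Inspect the plan of a candidate query to reason about pruning.
    cur.execute("EXPLAIN SELECT * FROM positions WHERE trade_date = CURRENT_DATE")
    for row in cur.fetchall():
        print(row)
finally:
    cur.close()
    conn.close()
```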

Posted 1 month ago

Apply

7.0 - 12.0 years

10 - 20 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office

End-to-end knowledge of Hybris Order Management and Hybris Data Hub integration. Experience in Spring, Hibernate and JEE technology frameworks, plus troubleshooting and bug-fixing. XHTML, CSS, JavaScript, XML and SQL. Hybris Core, Hybris eCommerce.

Posted 1 month ago

Apply

10.0 - 16.0 years

15 - 25 Lacs

Bengaluru

Work from Office

If interested, please apply on the link: https://forms.office.com/r/3Qjxw7hwYj

We are currently seeking a highly capable professional to join our team who can lead Data Governance programs and initiatives: designing governance frameworks, policies and standards, and implementing strategies for governance, data quality, security & compliance, and privacy. As a successful Data Governance professional you will be responsible for: Defining and driving Data Governance programmes and initiatives. Designing Data Governance and Data Quality frameworks. Driving and managing the implementation of Data Governance, Quality and Data Catalog projects. Hiring and enabling the team on Data Governance. Strong verbal and written communication to interface with varied senior stakeholders to convey and establish a POV. Understanding the regulatory landscape as per client requirements and collaborating to maintain enterprise data policies and standards. Designing test solutions to ensure data quality principles are maintained as may be required. Designing and delivering effective and actionable insights on data privacy and risk, proactively identifying opportunities to reduce risk through timely action and mitigating risk related to data management. Developing, implementing and promoting governance strategies; ensuring quality, security & compliance and privacy of data assets. Being the governance champion who leads data stewards, data owners and data users, communicates and reinforces the importance of data governance, and supports them to succeed in their roles. Collaborating with cross-functional teams to define governance policies, roles and responsibilities. Managing and reporting on compliance with data privacy policies and escalation of issues and concerns. Driving and managing the implementation of data governance tools and technologies to support data catalogs, business glossaries, technical & business metadata management, data lineage, active metadata management and data quality monitoring.

Required Skills: A Master's Degree in Business, Engineering, Sciences, IT, Computer Sciences or Statistics. 12 to 16 years of information systems experience, along with 10 years of experience in data governance, data management, and delivering data governance projects/programs. CDMP Certification is mandatory. Strong understanding of data governance frameworks, methodologies, and industry standards like DAMA, GDPR, CCPA, DPDPA, BCBS 239, etc. Aware of industry trends and priorities and able to apply them to governance and policies. Interest in the latest trends in AI & Data with Data Governance platforms. In-depth knowledge of data management, common data models, metadata management, data quality, master data management, data stewardship, data protection, etc. Strong experience and knowledge of data management tools and technologies, such as data governance platforms, data cataloging tools, data quality and lineage tools, and security and privacy tools. Strong team spirit, balanced by a healthy sense of autonomy. Excellent communication and interpersonal skills, with the ability to effectively collaborate with stakeholders at all levels of the organization. Hands-on experience with any of the following Data Governance tools: Collibra, Atlan, Alation, Datahub, IDGC. Knowledge of relational databases and SQL is a must. Ability to take responsibility for and manage complex projects, and be the trusted advisor of clients and consultants in the organization. Ability to work in a complex and constantly changing environment.

Posted 1 month ago

Apply

8.0 - 13.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Req ID: 324284. We are currently seeking an Oracle Cloud PLM Technical Consultant to join our team in Hyderabad, Telangana (IN-TG), India (IN). About the role: Technical consultant with 8+ years' experience in Oracle Cloud PLM.

Responsibilities:
Implementation and Integration: Lead the deployment and integration of Oracle Cloud PLM modules - PD (Product Development) and PDH (Product Data Hub), ensuring alignment with business processes.
Requirement Analysis: Gather requirements to understand client needs.
Configuration and Customization: Oversee the configuration and customization of the PLM module.
Data Migration Support: Generation of FBDI files from custom tables and different data sources/PLMs; programming expertise for extraction, validation and cleanup of data for migration.

Skillset requirements: Expertise in Oracle Cloud PLM and Agile PLM technical aspects. Strong programming skills, with expertise in Groovy script (e.g., validation rules, triggers, object functions). Expertise in integrating Oracle PLM Cloud with external systems using web services, OCI, etc. Programming expertise (Java, etc.) for supporting data extraction, cleanup, validation, transformation, etc. Ability to make application changes using Application Composer and to configure objects. Proficiency in PL/SQL reporting for Oracle Cloud. Deep understanding of Oracle Cloud database tables and structure. Experience with data loads into Oracle Cloud using FBDI (File-Based Data Import); a rough sketch of FBDI file preparation follows this listing. Proven experience in the PD (Product Development) and PDH (Product Data Hub) modules. Hands-on experience with data migration across Oracle Cloud PLM and Agile PLM environments. Excellent analytical, problem-solving, and communication skills. Ability to collaborate with both technical and business teams effectively. A results-driven and proactive approach with a track record of successful project delivery.
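The FBDI-related duties above boil down to producing correctly shaped CSV files and zipping them for import. The sketch below shows the mechanics in Python, with the caveat that the column layout is a placeholder, since each real FBDI template defines its own exact columns and order:

```python
import csv
import zipfile

# Sketch: export rows from a staging source into an FBDI-style CSV and zip
# it for upload. The column list is a placeholder; a real FBDI template
# defines the exact columns and ordering for each PLM object.
rows = [
    {"ItemNumber": "ITM-001", "Description": "Valve assembly", "Lifecycle": "Design"},
    {"ItemNumber": "ITM-002", "Description": "Pump housing", "Lifecycle": "Production"},
]
columns = ["ItemNumber", "Description", "Lifecycle"]

with open("ItemImport.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=columns)
    writer.writeheader()          # header row; some FBDI templates omit this
    writer.writerows(rows)

with zipfile.ZipFile("ItemImport.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write("ItemImport.csv")    # ready for File-Based Data Import upload
print("wrote ItemImport.zip")
```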

Posted 1 month ago

Apply

3.0 - 8.0 years

18 - 22 Lacs

Pune

Work from Office

Job ID: 199713. Required Travel: Minimal. Managerial: No. Location: India - Pune (Amdocs Site).

Who are we? Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers' innovative potential, empowering them to provide next-generation communication and media experiences for both the individual end user and enterprise customers. Our employees around the globe are here to accelerate service providers' migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $5.00 billion in fiscal 2024. For more information, visit www.amdocs.com

In one sentence: Our Consultant role provides services for Amdocs and 3rd-party applications, runs programs, applies functional business standard methodologies to transform business processes, crafts and leads change to restructure and transform organizations, and achieves business objectives.

What will your job look like?
Develop and Implement AI Models:
- Design, develop, and deploy generative AI models and machine learning algorithms to address various business needs.
- Develop and maintain documentation for prompt generation processes and best practices.
- Design and deliver cutting-edge AI systems to solve some of our clients' most challenging business problems.
- Deliver production-grade, mature AI/ML solutions for our clients at scale, built with automated, repeatable processes.
- Take ownership of and drive key architecture decisions around the AI/ML lifecycle (build, test, deploy, and monitor).
- Utilize best-of-breed practices, tools, and cloud solutions.
Performance Optimization:
- Optimize AI models for performance, scalability, and accuracy.
- Continuously improve prompt quality, performance and the overall AI prompt generation process.
- Analyze user feedback and data to identify areas for improvement in prompt quality and performance.
- Select optimally performing AI models through evaluation.
Research and Innovation:
- Stay updated with the latest advancements in AI and ML, and apply cutting-edge techniques to improve existing models and create new solutions.
- Conduct research on natural language processing and machine learning techniques to enhance prompt generation.
Collaboration:
- Collaborate with content, product and data teams to align prompts with company goals and user needs.
- Collaborate with data scientists and machine learning engineers to develop and improve AI models.
Documentation and Reporting:
- Document AI models, processes, and findings, and present them to stakeholders.

All you need is...
Educational Background:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field.
Technical Skills:
- Proficiency in programming languages such as Python, R, or Java.
- Proficiency in data management and querying languages (SQL/PySpark).
- Good understanding of DataOps, ModelOps and DevOps.
- Experience with AI/ML frameworks and libraries like TensorFlow, PyTorch, Keras, and Scikit-learn.
- Experience using large language models like Gemma, Llama, and Anthropic models.
- Experience using frameworks like LangChain, LangGraph, and RAG with cloud AI platforms (AWS SageMaker, GCP AI Platform, Azure ML) to architect AI solutions.
- Experience in deploying LLM-based bots on the web and integrating with Microsoft Power Apps.
- Experience in containerization of LLM-powered apps.
- Proven work experience as a Prompt Engineer or similar role.
- Strong understanding of generative models (e.g., GANs, VAEs) and deep learning techniques.
- Knowledge of natural language processing (NLP) and computer vision.
- Hands-on experience with agentic AI frameworks.
- Proven data engineering experience.
- More than 3 years of professional experience with cloud technologies (AWS, GCP, Azure).
- Relevant training and/or certifications in computer science, AI or a related field.
- Knowledge of automated testing and monitoring solutions.
- Experience with open-source modeling frameworks (e.g., TensorFlow, Keras, PyTorch, Scikit-learn).
- Experience in developing agentic AI solutions.
Communication:
- Strong verbal and written communication skills; ability to collaborate effectively with cross-functional teams and convey technical concepts to non-technical stakeholders.

Why you will love this job: You will be able to use your specific insights to lead business change on a large scale and drive transformation within our organization. You will be a key member of a global, dynamic and highly collaborative team with various possibilities for personal and professional development. You will have the opportunity to work in a multinational environment for the global market leader in its field! We offer a wide range of stellar benefits including health, dental, vision, and life insurance as well as paid time off, sick time, and parental leave! Amdocs is an equal opportunity employer. We welcome applicants from all backgrounds and are committed to fostering a diverse and inclusive workforce.
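The RAG and prompt engineering skills listed above can be illustrated with a deliberately framework-free toy: bag-of-words cosine similarity stands in for embedding search, whereas a real build would use the LangChain/vector-store stack named in the posting:

```python
import math
from collections import Counter

# Toy retrieval step of a retrieval-augmented generation (RAG) pipeline.
# Bag-of-words cosine similarity stands in for embedding search; a real
# system would use an embedding model and a vector store.
docs = {
    "billing": "Amdocs billing platforms support charging and invoicing flows.",
    "5g": "Service providers differentiate in the 5G era via network slicing.",
    "cloud": "Cloud migration automates operations for communication providers.",
}

def vec(text):
    return Counter(text.lower().split())

def cosine(a, b):
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

question = "How do providers differentiate in the 5G era?"
q = vec(question)
best = max(docs, key=lambda k: cosine(q, vec(docs[k])))

# The retrieved passage is spliced into the LLM prompt as grounding context.
prompt = f"Answer using this context:\n{docs[best]}\n\nQuestion: {question}"
print(prompt)
```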

Posted 1 month ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Kronos
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Roles and responsibilities:
- Collaborate with project teams and client stakeholders to support project delivery.
- Perform maintenance and configuration activities for Kronos modules such as accruals and timekeeper.
- Prior experience in supporting functional testing, integration testing, and UAT preferred.
- Assist the customer with testing, understanding the solution, and hand-holding during handover of the system.
- Experience in test automation and/or manual testing with respect to the UKG platform.
- Mentor junior members.
- Thrive in a team environment, while also possessing the ability to work independently.
- Proven ability to work creatively and analytically in a problem-solving environment.
- Solid interpersonal skills to interface with co-workers and customers and manage specific tasks to completion with minimal direction.
- Proven ability to build, manage, and foster a team-oriented environment.
- Desire to work in an information systems environment.

Technical and Professional Experience:
- Minimum of 4+ years of experience.
- Capability to understand the business case, features, high-level architecture, and benefits of Data Hub, and the general process of getting data into and out of the system.
- Identify the various settings related to pipelines and wrappers.
- Mandatory experience in migration from WFC to Pro WFM.
- Hands-on experience using Navigator, Paragon Transfer Manager, and the Data Migration Tool.
- List implementation tasks. Describe mappings required for Pro WFM to successfully deliver pay-code-related data to Pro WFM Data Hub.
- Recognize how to validate reporting data in Data Hub to ensure consistency between Pro WFM and Data Hub, along with other Data Hub considerations such as role-level security, how to use the data dictionary, importing/exporting data, and transitioning customers to support (a reconciliation sketch follows this listing).
- Design and determine whether a customer's reporting need is best met by a Dataview, standard report, modified standard report, or custom report.
- Make modifications to standard reports and publish the resulting reports.
- Create custom reports from scratch using BIRT Report Studio. Implement computed columns in custom reports.
- Navigate and leverage information found in the Data Dictionary.

Additional Information: Ready to work in shifts.
Qualification: 15 years full time education
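The validation duty above lends itself to a simple reconciliation job. Below is a hedged Python sketch that compares pay-code hour totals from a Pro WFM extract against the same totals landed in Data Hub; the file names, column names, and tolerance are assumptions for illustration, not UKG-documented interfaces.

# Hypothetical reconciliation check: compare per-pay-code hour totals
# between a Pro WFM extract and the corresponding Data Hub extract.
# Extract format (pay_code/hours columns) is assumed for illustration.
import csv
from collections import defaultdict

def totals_by_paycode(path: str) -> dict[str, float]:
    """Sum the 'hours' column per 'pay_code' in a CSV extract."""
    sums: dict[str, float] = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            sums[row["pay_code"]] += float(row["hours"])
    return sums

def reconcile(source_csv: str, datahub_csv: str, tolerance: float = 0.01) -> list[str]:
    """Report pay codes whose totals disagree beyond the tolerance."""
    src = totals_by_paycode(source_csv)
    hub = totals_by_paycode(datahub_csv)
    issues = []
    for code in sorted(set(src) | set(hub)):
        if abs(src.get(code, 0.0) - hub.get(code, 0.0)) > tolerance:
            issues.append(f"{code}: source={src.get(code, 0.0)} datahub={hub.get(code, 0.0)}")
    return issues

if __name__ == "__main__":
    # Placeholder file names; real extract locations are project-specific.
    for line in reconcile("pro_wfm_extract.csv", "datahub_extract.csv"):
        print("MISMATCH", line)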

Posted 2 months ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Kronos
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Roles and responsibilities:
- Collaborate with project teams and client stakeholders to support project delivery.
- Perform maintenance and configuration activities for Kronos modules such as accruals and timekeeper.
- Prior experience in supporting functional testing, integration testing, and UAT preferred.
- Assist the customer with testing, understanding the solution, and hand-holding during handover of the system.
- Experience in test automation and/or manual testing with respect to the UKG platform.
- Mentor junior members.
- Thrive in a team environment, while also possessing the ability to work independently.
- Proven ability to work creatively and analytically in a problem-solving environment.
- Solid interpersonal skills to interface with co-workers and customers and manage specific tasks to completion with minimal direction.
- Proven ability to build, manage, and foster a team-oriented environment.
- Desire to work in an information systems environment.

Technical and Professional Experience:
- Minimum of 6+ years of experience in the Pro WFM domain.
- Capability to understand the business case, features, high-level architecture, and benefits of Data Hub, and the general process of getting data into and out of the system.
- Identify the various settings related to pipelines and wrappers.
- Mandatory experience in migration from WFC to Pro WFM.
- Hands-on experience using Navigator, Paragon Transfer Manager, and the Data Migration Tool.
- Must have hands-on experience with Desktop reports.
- List implementation tasks. Describe mappings required for Pro WFM to successfully deliver pay-code-related data to Pro WFM Data Hub.
- Recognize how to validate reporting data in Data Hub to ensure consistency between Pro WFM and Data Hub, along with other Data Hub considerations such as role-level security, how to use the data dictionary, importing/exporting data, and transitioning customers to support.
- Design and determine whether a customer's reporting need is best met by a Dataview, standard report, modified standard report, or custom report.
- Make modifications to standard reports and publish the resulting reports.
- Create custom reports from scratch using BIRT Report Studio. Implement computed columns in custom reports.
- Navigate and leverage information found in the Data Dictionary.

Additional Information: Ready to work in shifts.
Qualification: 15 years full time education

Posted 2 months ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Zig Developer to build bare-metal, embedded, or systems-level applications with minimal runtime overhead.

Key Responsibilities:
- Develop applications in Zig with a focus on performance and safety
- Replace or extend C code with cleaner Zig equivalents
- Work on cross-compilation for embedded or platform-specific builds
- Contribute to tooling, kernel development, or embedded firmware
- Optimize binary sizes and compile times

Required Skills & Qualifications:
- Strong grasp of Zig and its manual memory management
- Familiarity with low-level programming, C interop, and cross-compilation
- Experience with bare-metal systems or firmware is a plus
- Bonus: kernel development or OS-level contributions

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies

Posted 2 months ago

Apply

1.0 - 4.0 years

5 - 9 Lacs

Chennai

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Software engineering is the application of engineering to the design, development, implementation, testing, and maintenance of software in a systematic method. The roles in this function cover all primary development activity across all technology functions that ensure we deliver high-quality code for our applications, products, and services, and that we understand customer needs and develop product roadmaps. These roles include, but are not limited to, analysis, design, coding, engineering, testing, debugging, standards, methods, tools analysis, documentation, research and development, maintenance, new development, operations, and delivery. Every position in the company has a requirement for building quality into every output. This also includes evaluating new tools, techniques, and strategies; automating common tasks; and building common utilities to drive organizational efficiency, with a passion for technology and solutions and influence of thought leadership on future capabilities and opportunities to apply technology in new and innovative ways. Basic, structured, standard approach to work.

Primary Responsibilities:
- Build data pipelines to process terabytes of data
- Orchestrate data tasks in Airflow to run on Kubernetes/Hadoop for the ingestion, processing, and cleaning of data (an illustrative DAG follows this listing)
- Create Docker images for various applications and deploy them on Kubernetes
- Design and build best-in-class processes to clean and standardize data
- Tune and optimize data processes
- Advance the team's DataOps culture (CI/CD, orchestration, testing, monitoring) and build out standard development patterns
- Drive efficiencies in current engineering processes via standardization and migration of existing on-premises processes to the cloud
- Ensure data quality by building best-in-class data quality monitoring that ensures all data products exceed customer expectations
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's degree in Computer Science or similar
- Good experience handling real-time, near-real-time, and batch data ingestion
- Experience building cloud-native data pipelines on AWS, Azure, or GCP, following best practices in cloud deployments
- Solid DataOps experience (CI/CD, orchestration, testing, monitoring)
- Hands-on experience with the following technologies:
  - Developing processes in Spark
  - Writing complex SQL queries
  - Building ETL/data pipelines
  - Related/complementary open-source software platforms and languages (e.g., Scala, Python, Java, Linux)
- Good understanding of data modelling techniques, e.g., Data Vault and Kimball star schema
- Excellent understanding of column-store RDBMSs (Databricks, Snowflake, Redshift, Vertica, ClickHouse)
- Proven track record of designing effective data strategies and leveraging modern data architectures that resulted in business value
- Demonstrated effective interpersonal, influence, collaboration, and listening skills
- Proven solid stakeholder management skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
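As an illustration of the Airflow orchestration described above, here is a minimal DAG sketch wiring the ingestion, cleaning, and quality-check steps as placeholder Python tasks. The DAG id and task bodies are assumptions; in the setup the posting describes, the heavy steps would typically launch Spark jobs or Kubernetes pods (for example via KubernetesPodOperator) rather than run inline.

# Illustrative Airflow 2.x DAG for an ingest -> clean -> quality-check
# flow. Task bodies are stubs standing in for real pipeline steps.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull batch and near-real-time data into the lake")

def clean():
    print("standardize and deduplicate the landed data")

def quality_check():
    print("fail the run if row counts or null rates regress")

with DAG(
    dag_id="claims_pipeline",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_ingest = PythonOperator(task_id="ingest", python_callable=ingest)
    t_clean = PythonOperator(task_id="clean", python_callable=clean)
    t_check = PythonOperator(task_id="quality_check", python_callable=quality_check)

    # Linear dependency chain; the quality gate runs last so bad data
    # never reaches downstream consumers unflagged.
    t_ingest >> t_clean >> t_check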

Posted 2 months ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring an OCaml Developer to build functional and type-safe applications for fintech, compilers, or language tooling projects.

Key Responsibilities:
- Write and maintain applications using OCaml
- Design algorithms and data structures for high-performance tasks
- Work on compilers, static analysis tools, or financial systems
- Interface with C bindings and build cross-platform binaries
- Contribute to code quality through tests and formal methods

Required Skills & Qualifications:
- Proficient in OCaml, functional programming, and type systems
- Familiarity with Jane Street's Core, Dune, and OPAM
- Understanding of immutability, pattern matching, and functors
- Bonus: experience in ReasonML or formal verification

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies

Posted 2 months ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Hyderabad, Chennai

Hybrid

Location: Hyderabad/Chennai
Model: Hybrid (3 days onsite and 2 days work from home)
Client: ACS Solution
End Client: Verizon
Experience: 5-9 years

Job Description:
We are seeking a skilled SAP Commerce Cloud Developer with expertise in the B2B Accelerator, Order Management, and Datahub integration to join our dynamic team in either our Hyderabad or Chennai location. As a key member of our e-commerce team, you will be responsible for designing, developing, and implementing solutions tailored to meet our clients' B2B commerce needs.

Responsibilities:
1. Design, develop, and customize SAP Commerce Cloud (formerly Hybris) B2B Accelerator features to meet business requirements.
2. Implement and integrate SAP Commerce Cloud Order Management functionalities to streamline order processing and fulfillment.
3. Configure and extend SAP Datahub to integrate with various internal and external systems, ensuring seamless data flow and synchronization (see the illustrative sketch after this listing).
4. Develop a base store in the SAP Hybris framework end to end, drawing on knowledge of Hybris Order Management and Hybris Data Hub integration, experience in Spring, Hibernate, and other J2EE technologies/frameworks, and front-end technologies (HTML, CSS, JavaScript, XML, and SQL).
5. Understand the current Hybris system implementation and technical customization.
6. Collaborate with cross-functional teams, including business stakeholders, architects, and project managers, to gather requirements and deliver high-quality solutions.
7. Provide technical guidance and support to junior developers, ensuring adherence to best practices and coding standards.
8. Perform code reviews, troubleshooting, and debugging to ensure the stability, performance, and scalability of the SAP Commerce Cloud platform.
9. Stay updated with the latest SAP Commerce Cloud and industry trends, best practices, and technologies, recommending innovative solutions to enhance the e-commerce platform.

Requirements:
1. Bachelor's degree in Computer Science, Engineering, or a related field.
2. 5-9 years of hands-on experience in SAP Commerce Cloud development, customization, and integration.
3. Proficiency in the SAP Commerce Cloud B2B Accelerator, including catalog management, pricing, account management, and order management.
4. Strong understanding of SAP Commerce Cloud Order Management capabilities, including order processing, fulfillment, and inventory management.
5. Experience with SAP Datahub integration, data modeling, and data synchronization between SAP Commerce Cloud and other systems.
6. Expertise in Java/J2EE development, the Spring framework, and web technologies (HTML, CSS, JavaScript).
7. Familiarity with SAP Commerce Cloud Backoffice customization, ImpEx scripts, and the SAP Hybris Cockpit.
8. Excellent communication skills and ability to work effectively in a collaborative team environment.
9. Strong analytical and problem-solving skills, with a proactive attitude towards issue resolution and continuous improvement.

Preferred Qualifications:
1. SAP Commerce Cloud certification is good to have.
2. Experience with Agile/Scrum methodologies and DevOps practices.
3. Prior experience in e-commerce or retail industry projects.
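To ground the Datahub synchronization items above, here is a hedged Python sketch of pushing a raw item batch to a Data Hub data feed over REST. The host, endpoint path, feed name, item type, and CSV columns are all assumptions for illustration rather than a documented interface; a real project would take them from its own Data Hub extension configuration.

# Hedged sketch: POST a CSV batch of raw items to a (hypothetical)
# Data Hub feed endpoint. URL shape and payload columns are assumed.
import requests

DATAHUB_URL = "https://datahub.example.com/datahub-webapp/v1"  # hypothetical host

def post_raw_items(feed: str, item_type: str, csv_payload: str) -> None:
    """Send one raw-item batch; raises on any non-2xx response."""
    resp = requests.post(
        f"{DATAHUB_URL}/data-feeds/{feed}/items/{item_type}",  # assumed path
        data=csv_payload.encode("utf-8"),
        headers={"Content-Type": "text/csv"},
        timeout=30,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    # Two illustrative rows; actual columns depend on the raw item model.
    batch = "isoCode;catalogId\nUSD;b2bCatalog\nEUR;b2bCatalog\n"
    post_raw_items("DEFAULT_FEED", "RawCurrency", batch)

The point of the sketch is the load-compose-publish shape of Data Hub integration work: raw batches land on a feed, and composition and publication to SAP Commerce Cloud are configured on the Data Hub side rather than in the sending client.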

Posted 2 months ago

Apply