5.0 years
0 Lacs
Greater Chennai Area
On-site
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: Apache Spark
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring that applications are developed according to the specified requirements and are aligned with business goals. Your typical day will involve collaborating with the team to understand application requirements, designing and developing applications using PySpark, and configuring them to meet business process needs. You will also test and debug applications to ensure their functionality and performance.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design and build applications using PySpark.
- Configure applications to meet business process requirements.
- Collaborate with the team to understand application requirements.
- Test and debug applications to ensure functionality and performance.

Professional & Technical Skills:
- Must-have skills: Proficiency in PySpark.
- Good-to-have skills: Experience with Apache Spark.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
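The posting's call for data munging in PySpark lends itself to a quick illustration. The sketch below is not from the listing: the in-memory dataset, column names, and cleaning rules are hypothetical, chosen only to show typical cleaning, transformation, and normalization steps.

```python
# Minimal PySpark data-munging sketch. Everything here is illustrative:
# the dataset, column names, and rules are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("munging-sketch").getOrCreate()

raw = spark.createDataFrame(
    [("  Alice ", "34", "chennai"), ("Bob", None, "CHENNAI"), (None, "29", "Chennai")],
    ["name", "age", "city"],
)

cleaned = (
    raw
    .withColumn("name", F.trim("name"))            # strip stray whitespace
    .withColumn("age", F.col("age").cast("int"))   # enforce a numeric type
    .withColumn("city", F.initcap("city"))         # normalize casing
    .dropna(subset=["name"])                       # drop rows missing a key field
)
cleaned.show()
```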
Posted 1 week ago
12.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Our Company
Changing the world through digital experiences is what Adobe's all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We're passionate about empowering people to create beautiful and powerful images, videos, and apps, and to transform how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

Role Summary
Digital Experience (DX) (https://www.adobe.com/experience-cloud.html) is a USD 4B+ business serving the needs of enterprises, including 95%+ of Fortune 500 organizations. Adobe Experience Manager, part of Adobe DX, is the world's largest CMS platform: a solution that helps enterprises create, manage, and deliver digital experiences across channels such as websites, mobile apps, and digital signage. According to a Forrester report, Experience Manager is the most robust CMS on the market, and more than 128,000 websites rely on its agile setup to manage their content. We are looking for strong, passionate engineers/managers to join our team as we scale the business by building next-gen products and adding customer value to our existing offerings. If you're passionate about innovative technology, we would be excited to talk to you!

What You'll Do
- Mentor and guide a high-performing engineering team to deliver outstanding results
- Lead the technical design, vision, and implementation strategy for next-gen multi-cloud services
- Partner with global leaders to help craft product architecture, roadmap, and release plans
- Drive strategic decisions ensuring successful project delivery and high code quality
- Apply standard methodologies and coding patterns to develop maintainable and modular solutions
- Optimize team efficiency through innovative engineering processes and teamwork models
- Attract, hire, and retain top talent while encouraging a positive, collaborative culture
- Lead discussions on emerging industry technologies and influence product direction

What You Need to Succeed
- 12+ years of experience in software development with a proven leadership track record, including at least 3 years as a manager leading a team of high-performing full-stack engineers
- Proficiency in Java/JSP for backend development and experience with frontend technologies such as React, Angular, or jQuery
- Experience with cloud platforms such as AWS or Azure
- Proficiency in version control, CI/CD pipelines, and DevOps practices
- Familiarity with Docker, Kubernetes, and Infrastructure-as-Code tools
- Experience with WebSockets or event-driven architectures
- Deep understanding of modern software architecture, including microservices and API-first development
- Proven use of AI/GenAI engineering productivity tools such as GitHub Copilot and Cursor; practical experience with Python would be helpful
- Exposure to open-source contribution models (Apache, Linux Foundation, or other third-party frameworks) would be an added advantage
- Strong problem-solving, analytical, and decision-making skills
- Excellent communication, collaboration, and management skills
- Passion for high-quality software and improving engineering processes
- BS/MS or equivalent experience in Computer Science or a related field

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
Posted 1 week ago
10.0 years
0 Lacs
Dehradun, Uttarakhand, India
On-site
Key Responsibilities
- Design and develop conceptual, logical, and physical data models to support enterprise data initiatives.
- Build, maintain, and optimize data models within Databricks Unity Catalog.
- Develop efficient data structures using Delta Lake, optimizing for performance, scalability, and reusability.
- Collaborate with data engineers, architects, analysts, and stakeholders to ensure data model alignment with ingestion pipelines and business goals.
- Translate business and reporting requirements into robust data architecture using best practices in data warehousing and Lakehouse design.
- Maintain comprehensive metadata artifacts, including data dictionaries, data lineage, and modeling documentation.
- Enforce and support data governance, data quality, and security protocols across data ecosystems.
- Continuously evaluate and improve modeling processes.

Skills and Experience
- 10+ years of hands-on experience in data modeling in Big Data environments.
- Familiarity with modern storage formats such as Parquet and ORC.
- Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices.
- Proficient in modeling methodologies including Kimball, Inmon, and Data Vault.
- Hands-on experience with modeling tools such as ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart.
- Proven experience in Databricks with Unity Catalog and Delta Lake.
- Strong command of SQL and Apache Spark for querying and transformation.
- Hands-on experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database.
- Exposure to Azure Purview or similar data cataloging tools.
- Strong communication and documentation skills, with the ability to work in cross-functional agile teams.

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure.
- Experience working in agile/scrum environments.
- Exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) is a plus.
(ref:hirist.tech)
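As a rough sketch of the Delta Lake modeling work described above, assuming a Databricks environment where a `spark` session is already available and Unity Catalog's three-level namespace is in use; the catalog, schema, table, and column names are hypothetical, not from the listing.

```python
# Illustrative only: creates a partitioned Delta table under a hypothetical
# Unity Catalog namespace (catalog.schema.table).
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.sales.fct_orders (
        order_id    BIGINT,
        customer_id BIGINT,
        order_date  DATE,
        amount      DECIMAL(18, 2)
    )
    USING DELTA
    PARTITIONED BY (order_date)
    COMMENT 'Order fact table (illustrative example)'
""")
```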
Posted 1 week ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Description
Role Proficiency: Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes
- Interpret the application/feature/component design and develop it in accordance with specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
- Optimize efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on FAST goals of team members.

Measures of Outcomes
- Adherence to engineering process and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during project execution
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected
Code: Code as per design; follow coding standards, templates, and checklists; review code for team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, requirements, and test cases/results.
Configure: Define and govern the configuration management plan; ensure compliance from the team.
Test: Review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
Domain Relevance: Advise software developers on the design and development of features and components with a deep understanding of the business problem being addressed for the client.
- Learn more about the customer domain, identifying opportunities to provide valuable additions to customers.
- Complete relevant domain certifications.

Manage Project: Manage delivery of modules and/or manage user stories.
Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
Estimate: Create and provide input for effort estimation for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review the reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.
Interface With Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.
Manage Team: Set FAST goals and provide feedback; understand aspirations of team members and provide guidance, opportunities, etc.; ensure the team is engaged in the project.
Certifications: Take relevant domain/technology certifications.

Skill Examples
- Explain and communicate the design/development to the customer
- Perform and evaluate test results against product specifications
- Break down complex problems into logical components
- Develop user interfaces and business software components
- Use data models
- Estimate time and effort required for developing/debugging features/components
- Perform and evaluate tests in the customer or target environment
- Make quick decisions on technical/project-related challenges
- Manage a team, mentor, and handle people-related issues in the team
- Maintain high motivation levels and positive dynamics in the team
- Interface with other teams, designers, and other parallel practices
- Set goals for self and team; provide feedback to team members
- Create and articulate impactful technical presentations
- Follow a high level of business etiquette in emails and other business communication
- Drive conference calls with customers, addressing customer questions
- Proactively ask for and offer help
- Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks
- Build confidence with customers by meeting deliverables on time and with quality
- Estimate the time, effort, and resources required for developing/debugging features/components
- Make appropriate utilization of software/hardware
- Strong analytical and problem-solving abilities

Knowledge Examples
- Appropriate software programs/modules
- Functional and technical design
- Programming languages: proficient in multiple skill clusters
- DBMS
- Operating systems and software platforms
- Software Development Life Cycle
- Agile methods: Scrum or Kanban
- Integrated development environments (IDE)
- Rapid application development (RAD)
- Modelling technology and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved

Additional Comments
Job Description: UST Global® is looking for a Technical Integration Lead who, under the general supervision of the Program Manager and Delivery Lead, will work to deliver the implementation of a healthcare ecosystem for our Health Plan clients. The Integration Lead will work with small to mid-size Health Plan clients, third-party vendors, and onshore and offshore UST associates to support system installations, conversions, and migrations by ensuring the smooth technical integration of products and services as well as data conversion.
Candidates should have excellent technical and communication skills and the ability to engage as part of a team working both physically together and virtually. You should be an excellent problem solver who is able to grasp customer needs and brainstorm ways to fulfill them.

As a Technical Integration Lead at UST Global, this is your opportunity to:
- Conduct technical design reviews to ensure architecture governance compliance.
- Work with the UST integration team to ensure solution quality and architecture compliance.
- Collaborate with numerous external vendors, many of whom are top in the healthcare industry.
- Independently test and debug Java and SQL code.
- Utilize your skills in Agile development methodologies, including JIRA, Scrum, Git, and Bitbucket.
- Exercise strong communication skills, including the ability to convey technical information effectively.
- Identify prospective areas of growth in automation by analyzing the integration/middleware landscape.
- Ensure repeatability and reusability of integration design, processes, and code.
- Consistently use analytical skills to synthesize complex information.
- Deploy expertise in estimation techniques.
- Perform other duties and special projects as assigned.
- Support or conduct staff mentoring as needed, and provide constructive feedback to staff.
- Potentially take responsibility for staff oversight on a project, including goal setting and performance monitoring.
- Comply with the organization's Code of Conduct, all regulatory and contractual requirements, organizational policies, procedures, and internal controls.

You bring:
- Experience in Core Java, Spring Boot, web services, XML, SQL, and middleware.
- Experience in developing, debugging, and testing SOAP web services; expertise with the SoapUI tool.
- Experience using Apache Camel as a middleware tool is an added advantage.
- Experience using Java 8 features is an added advantage.
- Expertise with SQL and relational databases.
- Experience with Agile tools and methodologies, including JIRA, Scrum, Git, and Bitbucket.
- Experience in architecting software solutions and an understanding of architecture governance.
- Bachelor's degree or higher in a business or technical field; 10+ years of relevant work experience in project management or software development; 7+ years of work experience in leadership, management, or similar roles.
- Work experience in client partner or account interaction is needed.
- Strong customer focus.
- Deep understanding of software development in a team, and a track record of implementing quality software solutions on time and on budget.
- Experience with a HealthEdge application and a managed care environment is preferred.
- Experience with Medicare and Medicaid is preferred.

Skills: Healthcare, Java, Spring Boot, Microservices
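The SOAP web-services work mentioned above can be pictured with a short client sketch. The posting centers on Java, but for brevity this uses Python's zeep library; the WSDL URL, operation name, and parameter are hypothetical placeholders, not from any real Health Plan system.

```python
# Hypothetical SOAP client sketch using the zeep library (pip install zeep).
from zeep import Client

client = Client("https://example.com/memberservice?wsdl")  # hypothetical WSDL

# zeep generates service methods from the WSDL; this call assumes the WSDL
# defines a GetMemberEligibility operation taking a member ID.
response = client.service.GetMemberEligibility(memberId="12345")
print(response)
```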
Posted 1 week ago
15.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description: Lead Software Engineer – Enterprise Solutions & Transformation

We are seeking an accomplished Lead Software Engineer with 15+ years of experience in IT and software development to architect, modernize, and deliver robust enterprise solutions. You will drive the transformation of legacy applications to modern cloud-native architectures, build and integrate scalable platforms, and champion best practices in DevOps, observability, and cross-functional collaboration. This technical leadership role is ideal for innovators passionate about enabling business agility through technology modernization and integration.

Roles and Responsibilities
- Architect, design, develop, test, and document enterprise-grade software solutions, aligning with business needs, quality standards, and operational requirements.
- Lead transformation and modernization efforts: evaluate and migrate legacy systems to modern, scalable, and maintainable architectures leveraging cloud-native technologies and microservices.
- Engineer integration solutions with platforms such as Apache Kafka, MuleSoft, and other middleware or messaging technologies to support seamless enterprise connectivity.
- Define and implement end-to-end architectures for both new and existing systems, ensuring scalability, security, performance, and maintainability.
- Collaborate with Solution and Enterprise Architects and portfolio stakeholders to analyze, plan, and realize features, enablers, and modernization roadmaps.
- Work closely with infrastructure engineers to provision, configure, and optimize cloud resources, especially within Azure (AKS, Cosmos DB, Event Hub).
- Champion containerization and orchestration using Docker and Azure Kubernetes Service (AKS) for efficient deployment and scaling.
- Drive observability: define and implement system monitoring, logging, and alerting strategies using tools such as Prometheus, Grafana, and the ELK Stack.
- Lead and participate in code and documentation reviews to uphold quality and engineering excellence.
- Mentor and coach engineers and developers, fostering technical growth and knowledge sharing.
- Troubleshoot and resolve complex issues across application, integration, and infrastructure layers.
- Advocate and implement modern DevOps practices: build and maintain robust CI/CD pipelines, Infrastructure-as-Code, and automated deployments.
- Continuously evaluate and adopt new tools, technologies, and processes to improve system quality, delivery, and operational efficiency.
- Translate business needs and legacy constraints into actionable technical requirements and provide accurate estimates for both new builds and modernization projects.
- Ensure NFRs (scalability, security, availability, performance) are defined, implemented, and maintained across all solutions.
- Collaborate cross-functionally with DevOps, support, and peer teams to ensure operational excellence and smooth transformation initiatives.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 15+ years of experience in IT and software development roles, with a track record of delivering enterprise-scale solutions.
- 5+ years of hands-on experience building Java-based, high-volume/high-transaction applications.
- 5+ years of experience with Java, Spring, and RESTful API development.
- 3+ years of experience in modernizing legacy applications or leading transformation initiatives.
- 3+ years of experience in performance tuning, application monitoring, and troubleshooting.
- 3+ years of experience with integration platforms (Kafka, MuleSoft, RabbitMQ, etc.).
- 2+ years of experience architecting solutions and leading technical design for enterprise systems.
- Experience working with container orchestration, especially Azure Kubernetes Service (AKS).

Preferred Qualifications
- 3+ years of experience in microservices architecture and system design.
- 3+ years in technical leadership or mentoring roles.
- 3+ years hands-on with cloud platforms (Azure, AWS, GCP, OpenStack).
- Experience with cloud resource provisioning (ARM templates, Terraform, Ansible, Chef).
- Strong DevOps skills: CI/CD pipelines with GitHub, Maven, Jenkins, Nexus, SonarQube.
- Advanced knowledge of observability (Prometheus, Grafana, ELK).
- Proficiency in the Unix/Linux command line and shell scripting.
- Expertise in asynchronous messaging, stream processing, and event-driven architectures.
- Experience in Agile/Scrum/Kanban environments.
- Familiarity with front-end technologies (HTML5, JavaScript frameworks, CSS3).
- Certifications in Java, Spring, Azure, or relevant integration/cloud technologies.
- Excellent communication skills for both technical and business audiences.

Technical Skills
- Languages & Frameworks: Java, Groovy, Spring (Boot, Cloud), REST
- Integration & Messaging: Kafka, MuleSoft, RabbitMQ, MQ, Redis, Hazelcast
- Legacy Modernization: Refactoring, rearchitecting, and migrating monolithic or legacy applications to modern platforms
- Databases: NoSQL (Cassandra, Cosmos DB), SQL
- Monitoring & Observability: Prometheus, Grafana, ELK Stack
- Orchestration: Docker, AKS (Azure Kubernetes Service)
- Cloud Platforms: Azure (Event Hub, Cosmos DB, AKS), AWS, GCP, OpenStack
- IaC & DevOps: Terraform, Ansible, Chef, Jenkins, Maven, Nexus, SonarQube, Git, Jira
- Scripting & Front-End: Node.js, React.js, Python, R

Why Join Us?
- Lead modernization and transformation of critical business systems to future-ready cloud architectures.
- Architect and deliver enterprise-scale, highly integrated, observable solutions.
- Mentor and inspire a talented engineering team.
- Shape the organization's technical direction in cloud, integration, and DevOps.
- Thrive in a collaborative, innovative, and growth-focused environment.
- Enjoy competitive compensation and opportunities for career advancement.

Weekly Hours: 40
Time Type: Regular
Location: IND:AP:Hyderabad / Argus Bldg 4f & 5f, Sattva, Knowledge City - Adm: Argus Building, Sattva, Knowledge City

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
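The Kafka-based integration work above can be pictured with a minimal producer. This is a generic sketch, not this employer's stack: the broker address, topic, and payload are hypothetical, and it uses the confluent-kafka Python client for brevity even though the role itself is Java-centric.

```python
# Minimal Kafka producer sketch using confluent-kafka (pip install confluent-kafka).
# Broker address, topic name, and payload are hypothetical placeholders.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

def on_delivery(err, msg):
    # Called once per message with the broker's acknowledgement or error.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

event = {"orderId": "12345", "status": "CREATED"}
producer.produce("orders", key="12345", value=json.dumps(event), callback=on_delivery)
producer.flush()  # block until outstanding messages are acknowledged
```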
Posted 1 week ago
5.0 years
0 Lacs
Dehra Dun, Uttarakhand, India
On-site
Our Expectations From You

Responsibilities
- Work with the development team and product manager to ideate web design and development, mobile applications, and software solutions.
- Build the front end and back end of websites (including e-commerce sites) and mobile or web applications with appealing visual designs.
- Develop and manage well-functioning databases, websites, and applications.
- Test software to ensure responsiveness and efficiency.
- Troubleshoot, debug, and upgrade software.
- Create and maintain security and data protection settings.
- Build websites and applications with mobile-responsive design.
- Write technical documentation and audio-video tutorials.
- Write clean, functional code on the front end and back end.

Requirements
- Qualified MCA/MTech from a recognized university.
- Proven experience as a Full Stack Developer.
- Experience in developing websites and mobile applications.
- Knowledge of multiple front-end languages and libraries (e.g. PHP, WordPress, HTML/CSS, JavaScript, XML, jQuery).
- Knowledge of multiple back-end languages (e.g. C#, Java, Python) and JavaScript frameworks (e.g. Angular, React, Node.js).
- Familiarity with databases (e.g. MySQL, MongoDB), web servers (e.g. Apache), and UI/UX design.
- Excellent communication and teamwork skills.

Would be awesome to have
- Strong communication skills.
- Ability to work with different teams.

Working At LN Webworks
We invite you to explore your career path here at LN Webworks and work with some of the largest and most recognized brands in the world. If you are capable and in pursuit of excellence in the IT industry, look no further: LN Webworks Pvt Ltd has competent full-time job opportunities that will boost your career trajectory. Our global delivery HQ is in Ludhiana, India, with a branch office in Dehradun, India, along with a sales office in New York, US.

Why LN Webworks?
- Present in 3 locations.
- Pioneers of digital experience services in the region. We started in 2013 with a team of 2, and now we have 80 happy team members across our Ludhiana and Dehradun locations. That's proof that we know how to survive and thrive :)
- This one is really interesting: the average tenure of our team members with LN Webworks is 5 years - we must be doing something right. :)
- An amazing, healthy, and calm workspace full of greenery all around. Both of our lawns are packed with basketball and badminton players during break times.
- Yoga sessions, dance sessions, tech presentations, coaching from experts, weekly activities, food stalls, in-house DJ parties, gaming competitions - you name it, we are doing all of this. Need we say more?
Posted 1 week ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Overview of 66degrees
66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With our unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work. At 66degrees, we believe in embracing the challenge and winning together. These values not only guide us in achieving our goals as a company, but also for our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professional and personal growth along the way.

Overview of Role
As a Data Engineer specializing in AI/ML, you'll be instrumental in designing, building, and maintaining the data infrastructure crucial for training, deploying, and serving our advanced AI and Machine Learning models. You'll work closely with Data Scientists, ML Engineers, and Cloud Architects to ensure data is accessible, reliable, and optimized for high-performance AI/ML workloads, primarily within the Google Cloud ecosystem.

Responsibilities
- Data Pipeline Development: Design, build, and maintain robust, scalable, and efficient ETL/ELT data pipelines to ingest, transform, and load data from various sources into data lakes and data warehouses, specifically optimized for AI/ML consumption.
- AI/ML Data Infrastructure: Architect and implement the underlying data infrastructure required for machine learning model training, serving, and monitoring within GCP environments.
- Google Cloud Ecosystem: Leverage a broad range of Google Cloud Platform (GCP) data services, including BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, Vertex AI, Composer (Airflow), and Cloud SQL.
- Data Quality & Governance: Implement best practices for data quality, data governance, data lineage, and data security to ensure the reliability and integrity of AI/ML datasets.
- Performance Optimization: Optimize data pipelines and storage solutions for performance, cost-efficiency, and scalability, particularly for large-scale AI/ML data processing.
- Collaboration with AI/ML Teams: Work closely with Data Scientists and ML Engineers to understand their data needs, prepare datasets for model training, and assist in deploying models into production.
- Automation & MLOps Support: Contribute to the automation of data pipelines and support MLOps initiatives, ensuring seamless integration from data ingestion to model deployment and monitoring.
- Troubleshooting & Support: Troubleshoot and resolve data-related issues within the AI/ML ecosystem, ensuring data availability and pipeline health.
- Documentation: Create and maintain comprehensive documentation for data architectures, pipelines, and data models.

Qualifications
- 1-2+ years of experience in Data Engineering, with at least 2-3 years directly focused on building data pipelines for AI/ML workloads.
- Deep, hands-on experience with core GCP data services such as BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, and Composer/Airflow.
- Strong proficiency in at least one relevant programming language for data engineering (Python is highly preferred).
- Strong SQL skills for complex data manipulation, querying, and optimization.
- Solid understanding of data warehousing concepts, data modeling (dimensional, 3NF), and schema design for analytical and AI/ML purposes.
- Proven experience designing, building, and optimizing large-scale ETL/ELT processes.
- Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop) and concepts.
- Exceptional analytical and problem-solving skills, with the ability to design solutions for complex data challenges.
- Excellent verbal and written communication skills, capable of explaining complex technical concepts to both technical and non-technical stakeholders.

66degrees is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to actual or perceived race, color, religion, sex, gender, gender identity, national origin, age, weight, height, marital status, sexual orientation, veteran status, disability status or other legally protected class.
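Since the role leans on Cloud Composer (managed Airflow) for orchestration, a minimal DAG sketch may help picture the pipeline work. The DAG ID, schedule, and task body below are hypothetical, not from the listing.

```python
# Minimal Airflow DAG sketch (the kind Cloud Composer runs). The DAG id,
# schedule, and task logic are hypothetical placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load():
    # Stand-in for a real ETL step, e.g. landing files in Cloud Storage
    # and loading them into BigQuery.
    print("running the extract-and-load step")

with DAG(
    dag_id="example_etl_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```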
Posted 1 week ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Teamwork makes the stream work.

Roku is changing how the world watches TV. Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the team
Roku runs one of the largest data lakes in the world. We store over 70 PB of data, run 10+ million queries per month, and scan over 100 PB of data per month. The Big Data team is responsible for building, running, and supporting the platform that makes this possible. We provide all the tools needed to acquire, generate, process, monitor, validate, and access the data in the lake, for both streaming and batch data. We are also responsible for generating the foundational data. The systems we provide include Scribe, Kafka, Hive, Presto, Spark, Flink, Pinot, and others. The team is actively involved in open source, and we plan to increase our engagement over time.

About the Role
Roku is in the process of modernizing its Big Data Platform. We are working on defining the new architecture to improve user experience, minimize cost, and increase efficiency. Are you interested in helping us build this state-of-the-art big data platform? Are you an expert in Big Data technologies? Have you looked under the hood of these systems? Are you interested in open source? If you answered "Yes" to these questions, this role is for you!

What you will be doing
- You will be responsible for streamlining and tuning existing Big Data systems and pipelines and building new ones. Making sure the systems run efficiently and with minimal cost is a top priority.
- You will be making changes to the underlying systems, and if an opportunity arises, you can contribute your work back into open source.
- You will also be responsible for supporting internal customers and on-call services for the systems we host. Providing a stable environment and a great user experience is another top priority for the team.

We are excited if you have
- 7+ years of production experience building big data platforms based upon Spark, Trino, or equivalent
- Strong programming expertise in Java, Scala, Kotlin, or another JVM language
- A robust grasp of distributed systems concepts, algorithms, and data structures
- Strong familiarity with the Apache Hadoop ecosystem: Spark, Kafka, Hive/Iceberg/Delta Lake, Presto/Trino, Pinot, etc.
- Experience working with at least 3 of the technologies/tools mentioned here: Big Data/Hadoop, Kafka, Spark, Trino, Flink, Airflow, Druid, Hive, Iceberg, Delta Lake, Pinot, Storm, etc.
- Extensive hands-on experience with a public cloud, AWS or GCP
- BS/MS degree in CS or equivalent
- AI literacy / AI growth mindset

Benefits
Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

The Roku Culture
Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a few very talented people can do more, at lower cost, than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast, and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV.

We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea: we come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002.

To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet.

By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
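Given the team's Presto/Trino mention, a tiny client sketch may make the query-serving side of such a platform concrete. This is illustrative only: the host, catalog, schema, and table are hypothetical, and it uses the trino Python client rather than anything specific to Roku's stack.

```python
# Hypothetical Trino query sketch using the trino Python client
# (pip install trino). Host, catalog, schema, and table are placeholders.
import trino

conn = trino.dbapi.connect(
    host="trino.example.internal",
    port=8080,
    user="analyst",
    catalog="hive",
    schema="default",
)
cur = conn.cursor()
cur.execute("SELECT event_date, count(*) FROM events GROUP BY event_date LIMIT 10")
for row in cur.fetchall():
    print(row)
```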
Posted 1 week ago
8.0 - 13.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Job Description

What you will do
As a Data Engineer, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member assisting in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate degree / Master's degree / Bachelor's degree and 8 to 13 years of Computer Science, IT, or related field experience.

Must-Have Skills:
- Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, Spark SQL), Snowflake, workflow orchestration, and performance tuning on big data processing
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
- Proficient in SQL and Python for extracting, transforming, and analyzing complex datasets from relational data stores
- Proficient in Python, with strong experience in ETL tools such as Apache Spark and various data processing packages, supporting scalable data workflows and machine learning pipeline development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Proven ability to optimize query performance on big data platforms
- Knowledge of data visualization and analytics tools such as Spotfire and Power BI

Preferred Qualifications:
- Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing
- Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
- Strong knowledge of Oracle / SQL Server, stored procedures, and PL/SQL; knowledge of the Linux OS
- Experience implementing Retrieval-Augmented Generation (RAG) pipelines, integrating retrieval mechanisms with language models
- Skilled in developing machine learning models using Python, with hands-on experience in deep learning frameworks including PyTorch and TensorFlow
- Strong understanding of data governance frameworks, tools, and best practices
- Knowledge of vector databases, including implementation and optimization
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Professional Certifications:
- Databricks certification preferred
- AWS Data Engineer/Architect

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
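The performance-tuning skill called out above can be illustrated with one common Spark pattern: broadcasting a small dimension table to avoid a shuffle-heavy join, then writing partitioned output for downstream file pruning. The file paths and column names are hypothetical, not from the listing.

```python
# Illustrative PySpark tuning sketch; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

orders = spark.read.parquet("/data/orders")        # large fact table
customers = spark.read.parquet("/data/customers")  # small dimension table

# Broadcasting the small side avoids a shuffle-heavy sort-merge join.
enriched = orders.join(broadcast(customers), "customer_id")

# Partitioned output lets downstream queries prune files by order_date.
(enriched.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("/data/enriched_orders"))
```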
Posted 1 week ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
[Role Name: IS Architecture]
Job Posting Title: Data Architect
Workday Job Profile: Principal IS Architect
Department Name: Digital, Technology & Innovation
Role GCF: 06A

About Amgen
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller, and longer. We discover, develop, manufacture, and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

About The Role
Role Description: The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of foundational data. This role will manage a team of data modelers.

Roles & Responsibilities:
- Provide oversight to data modeling team members
- Develop and maintain conceptual, logical, and physical data models to support business needs
- Establish and enforce data standards, governance policies, and best practices
- Design and manage metadata structures to enhance information retrieval and usability
- Maintain comprehensive documentation of the architecture, including principles, standards, and models
- Evaluate and recommend technologies and tools that best fit the solution requirements
- Evaluate emerging technologies and assess their potential impact
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency

Basic Qualifications and Experience: [GCF Level 6A]
- Doctorate degree and 8 years of experience in Computer Science, IT, or a related field, OR
- Master's degree with 12-15 years of experience in Computer Science, IT, or a related field, OR
- Bachelor's degree with 14-17 years of experience in Computer Science, IT, or a related field

Functional Skills:

Must-Have Skills:
- Data Modeling: Expert in creating conceptual, logical, and physical data models to represent information structures; able to interview and communicate with business subject matter experts to develop data models that are useful for their analysis needs
- Metadata Management: Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality
- Information Governance: Familiarity with policies and procedures for managing information assets, including security, privacy, and compliance
- Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, Spark SQL), and performance tuning on big data processing

Good-to-Have Skills:
- Experience with graph technologies such as Stardog, AllegroGraph, and MarkLogic

Professional Certifications:
- Certifications in Databricks are desired

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated awareness of presentation skills

Shift Information:
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
What You Will Do
As a Data Engineer, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.
Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member that assists in design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
Identify and resolve complex data-related challenges
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation
What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Master's degree / Bachelor's degree and 5 to 9 years of Computer Science, IT, or related field experience.
Must-Have Skills:
Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning on big data processing
Proficiency in data analysis tools (e.g., SQL); proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores
Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development
Strong understanding of data modeling, data warehousing, and data integration concepts
Proven ability to optimize query performance on big data platforms
Preferred Qualifications:
Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing
Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
Strong knowledge of Oracle / SQL Server, stored procedures, and PL/SQL; knowledge of the Linux OS
Knowledge of data visualization and analytics tools like Spotfire and Power BI
Strong understanding of data governance frameworks, tools, and best practices
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)
Professional Certifications:
Databricks certification (preferred)
AWS Data Engineer/Architect
Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated ability to function in a team setting
Demonstrated presentation skills
What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
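The posting above centers on ETL pipelines with built-in data quality. As a hedged sketch of that pattern (not Amgen's implementation), the following PySpark job casts types, deduplicates, applies a fail-fast validation gate, and appends to a curated Delta path; all names and paths are assumptions.

```python
# Illustrative ETL-with-quality-gate sketch; names and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-quality-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("/landing/orders.csv")

# Transform: cast types, normalize a string column, drop duplicate keys.
clean = (raw.withColumn("amount", F.col("amount").cast("double"))
            .withColumn("country", F.upper(F.trim("country")))
            .dropDuplicates(["order_id"]))

# Simple data-quality gate: fail fast if required fields are null.
bad = clean.filter(F.col("order_id").isNull() | F.col("amount").isNull()).count()
if bad > 0:
    raise ValueError(f"{bad} rows failed validation; aborting load")

# Assumes a Delta-enabled Spark session (e.g., on Databricks).
clean.write.mode("append").format("delta").save("/curated/orders")
```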
Posted 1 week ago
6.0 - 7.0 years
15 - 17 Lacs
India
On-site
About The Opportunity
This role is within the fast-paced enterprise technology and data engineering sector, delivering high-impact solutions in cloud computing, big data, and advanced analytics. We design, build, and optimize robust data platforms powering AI, BI, and digital products for leading Fortune 500 clients across industries such as finance, retail, and healthcare. As a Senior Data Engineer, you will play a key role in shaping scalable, production-grade data solutions with modern cloud and data technologies.
Role & Responsibilities
Architect and Develop Data Pipelines: Design and implement end-to-end data pipelines (ingestion → transformation → consumption) using Databricks, Spark, and cloud object storage.
Data Warehouse & Data Mart Design: Create scalable data warehouses/marts that empower self-service analytics and machine learning workloads.
Database Modeling & Optimization: Translate logical models into efficient physical schemas, ensuring optimal partitioning and performance management.
ETL/ELT Workflow Automation: Build, automate, and monitor robust data ingestion and transformation processes with best practices in reliability and observability.
Performance Tuning: Optimize Spark jobs and SQL queries through careful tuning of configurations, indexing strategies, and resource management.
Mentorship and Continuous Improvement: Provide production support, mentor team members, and champion best practices in data engineering and DevOps methodology.
Skills & Qualifications
Must-Have
6-7 years of hands-on experience building production-grade data platforms, including at least 3 years with Apache Spark/Databricks.
Expert proficiency in PySpark, Python, and advanced SQL with a record of performance tuning distributed jobs.
Proven expertise in data modeling, data warehouse/mart design, and managing ETL/ELT pipelines using tools like Airflow or dbt.
Hands-on experience with major cloud platforms such as AWS or Azure, and familiarity with modern lakehouse/data-lake patterns.
Strong analytical, problem-solving, and mentoring skills with a DevOps mindset and commitment to code quality.
Preferred
Experience with AWS analytics services (Redshift, Glue, S3) or the broader Hadoop ecosystem.
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Exposure to streaming pipelines (Kafka, Kinesis, Delta Live Tables) and real-time analytics solutions.
Familiarity with ML feature stores, MLOps workflows, or data governance frameworks.
Relevant certifications (Databricks, AWS, Azure) or active contributions to open source projects.
Location: India | Employment Type: Full-time
Skills: agile methodologies, team leadership, performance tuning, SQL, ELT, Airflow, AWS, data modeling, Apache Spark, PySpark, data, Hadoop, Databricks, Python, dbt, big data technologies, ETL, Azure
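On the Performance Tuning bullet, a brief sketch of configuration-level Spark tuning plus partition pruning, assuming a date-partitioned Parquet table; the settings, path, and column names are illustrative placeholders.

```python
# Hedged Spark tuning sketch; settings and paths are illustrative defaults.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("perf-sketch")
         # Adaptive query execution re-plans shuffles at runtime.
         .config("spark.sql.adaptive.enabled", "true")
         # Right-size shuffle parallelism instead of the 200-partition default.
         .config("spark.sql.shuffle.partitions", "64")
         .getOrCreate())

sales = spark.read.parquet("/lake/sales")  # assume partitioned by sale_date

# Filtering on the partition column lets Spark prune files rather than
# scanning the whole table.
q = sales.where("sale_date >= '2024-01-01'").groupBy("region").sum("amount")
q.explain()  # inspect the physical plan to confirm pruning and join strategy
```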
Posted 1 week ago
7.0 years
15 - 17 Lacs
India
Remote
Note: This is a remote role with occasional office visits. Candidates from Mumbai or Pune will be preferred.
About The Company
A fast-growing enterprise technology consultancy operating at the intersection of cloud computing, big-data engineering, and advanced analytics. The team builds high-throughput, real-time data platforms that power AI, BI and digital products for Fortune 500 clients across finance, retail and healthcare. By combining Databricks Lakehouse architecture with modern DevOps practices, they unlock insight at petabyte scale while meeting stringent security and performance SLAs.
Role & Responsibilities
Architect end-to-end data pipelines (ingestion → transformation → consumption) using Databricks, Spark and cloud object storage.
Design scalable data warehouses/marts that enable self-service analytics and ML workloads.
Translate logical data models into physical schemas; own database design, partitioning and lifecycle management for cost-efficient performance.
Implement, automate and monitor ETL/ELT workflows, ensuring reliability, observability and robust error handling.
Tune Spark jobs and SQL queries, optimizing cluster configurations and indexing strategies to achieve sub-second response times.
Provide production support and continuous improvement for existing data assets, championing best practices and mentoring peers.
Skills & Qualifications
Must-Have
6–7 years building production-grade data platforms, including 3+ years of hands-on Apache Spark/Databricks experience.
Expert proficiency in PySpark, Python and advanced SQL, with a track record of performance-tuning distributed jobs.
Demonstrated ability to model data warehouses/marts and orchestrate ETL/ELT pipelines with tools such as Airflow or dbt.
Hands-on with at least one major cloud platform (AWS or Azure) and modern lakehouse/data-lake patterns.
Strong problem-solving skills, a DevOps mindset and commitment to code quality; comfortable mentoring fellow engineers.
Preferred
Deep familiarity with the AWS analytics stack (Redshift, Glue, S3) or the broader Hadoop ecosystem.
Bachelor’s or Master’s degree in Computer Science, Engineering or a related field.
Experience building streaming pipelines (Kafka, Kinesis, Delta Live Tables) and real-time analytics solutions.
Exposure to ML feature stores, MLOps workflows and data-governance/compliance frameworks.
Relevant professional certifications (Databricks, AWS, Azure) or notable open-source contributions.
Benefits & Culture Highlights
Remote-first & flexible hours with 25+ PTO days and comprehensive health cover.
Annual training budget & certification sponsorship (Databricks, AWS, Azure) to fuel continuous learning.
Inclusive, impact-focused culture where engineers shape the technical roadmap and mentor a vibrant data community.
Skills: data modeling, big data technologies, team leadership, AWS, data, SQL, agile methodologies, performance tuning, ELT, Airflow, Apache Spark, PySpark, Hadoop, Databricks, Python, dbt, ETL, Azure
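For the orchestration skills this posting names (Airflow or dbt), a minimal Airflow DAG sketch, assuming Airflow 2.4 or later; the dag_id, schedule, and empty task callables are invented for illustration.

```python
# Minimal Airflow DAG sketch (assumes Airflow 2.4+); all names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():   ...  # pull from source (placeholder)
def transform(): ...  # clean and shape the data (placeholder)
def load():      ...  # publish to the warehouse (placeholder)

with DAG(
    dag_id="daily_sales_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # linear dependency chain
```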
Posted 1 week ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.
Job Description
Job Summary: We are seeking a skilled and motivated System Programmer to join our IT Infrastructure team. This role is responsible for the installation, configuration, maintenance, and performance of critical enterprise systems including Linux servers, Apache HTTP Server, and Oracle WebLogic. The ideal candidate will have strong scripting abilities and experience with writing SQL queries to support operational and development teams.
Key Responsibilities
Install, configure, and maintain Linux operating systems, Apache HTTP Server, and Oracle WebLogic application servers in development, test, and production environments.
Perform regular system patching and software updates to ensure platform security and stability.
Develop and maintain automation scripts (e.g., Bash, Python, or similar) to streamline system management tasks.
Write and optimize SQL queries to support reporting, troubleshooting, and system integration needs.
Monitor system performance and implement tuning improvements to maximize availability and efficiency.
Work closely with development, QA, and operations teams to support application deployments and troubleshoot system-related issues.
Maintain accurate system documentation, including configurations, procedures, and troubleshooting guides.
Participate in an on-call rotation and respond to incidents as required.
Required Qualifications
Overall 8-12 years of experience.
Proven experience with Linux system administration (RHEL, CentOS, or equivalent).
Hands-on experience with Apache HTTP Server and Oracle WebLogic.
Proficiency in scripting languages such as Bash, Python, or Perl.
Strong understanding of SQL and relational databases (e.g., Oracle, MySQL).
Familiarity with system monitoring tools and performance tuning.
Knowledge of security best practices and patch management procedures.
Excellent troubleshooting, analytical, and problem-solving skills.
Strong communication skills and ability to work in a collaborative team environment.
Preferred Qualifications
Experience with CI/CD pipelines, Ansible, ArgoCD, or other automation tools.
Exposure to cloud environments (e.g., AWS, Azure) or container technologies (e.g., Docker, Kubernetes).
Employee Type: Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
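The scripting duties above often reduce to small health-check utilities. Below is a hedged, standard-library-only Python sketch that probes Apache's conventional mod_status endpoint and a stand-in database (sqlite3 here; a real setup would use an Oracle or MySQL driver). The URL and database path are placeholders.

```python
# Hedged health-check sketch; the URL and DB path are placeholders.
import sys
import urllib.request
import sqlite3  # stand-in; a real setup would use an Oracle/MySQL driver

def check_http(url: str) -> bool:
    """Return True if the endpoint answers HTTP 200 within 5 seconds."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False

def check_db(path: str) -> bool:
    """Return True if a trivial query succeeds against the database."""
    try:
        with sqlite3.connect(path) as conn:
            conn.execute("SELECT 1")
        return True
    except sqlite3.Error:
        return False

if __name__ == "__main__":
    # '/server-status' is Apache httpd's conventional mod_status page.
    ok = check_http("http://localhost/server-status") and check_db("/tmp/app.db")
    print("healthy" if ok else "degraded")
    sys.exit(0 if ok else 1)
```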
Posted 1 week ago
3.0 years
4 Lacs
Delhi
On-site
Job Description: Hadoop & ETL Developer
Location: Shastri Park, Delhi
Experience: 3+ years
Education: B.E. / B.Tech / MCA / M.Sc. (IT or CS) / MS
Salary: Up to 80k (the final offer depends on the interview and experience)
Notice Period: Immediate joiners, or up to 20 days
Candidates from Delhi/NCR will be preferred.
Job Summary: We are looking for a Hadoop & ETL Developer with strong expertise in big data processing, ETL pipelines, and workflow automation. The ideal candidate will have hands-on experience in the Hadoop ecosystem, including HDFS, MapReduce, Hive, Spark, HBase, and PySpark, as well as expertise in real-time data streaming and workflow orchestration. This role requires proficiency in designing and optimizing large-scale data pipelines to support enterprise data processing needs.
Key Responsibilities
Design, develop, and optimize ETL pipelines leveraging Hadoop ecosystem technologies.
Work extensively with HDFS, MapReduce, Hive, Sqoop, Spark, HBase, and PySpark for data processing and transformation.
Implement real-time and batch data ingestion using Apache NiFi, Kafka, and Airbyte.
Develop and manage workflow orchestration using Apache Airflow.
Perform data integration across structured and unstructured data sources, including MongoDB and Hadoop-based storage.
Optimize MapReduce and Spark jobs for performance, scalability, and efficiency.
Ensure data quality, governance, and consistency across the pipeline.
Collaborate with data engineering teams to build scalable and high-performance data solutions.
Monitor, debug, and enhance big data workflows to improve reliability and efficiency.
Required Skills & Experience
3+ years of experience in the Hadoop ecosystem (HDFS, MapReduce, Hive, Sqoop, Spark, HBase, PySpark).
Strong expertise in ETL processes, data transformation, and data warehousing.
Hands-on experience with Apache NiFi, Kafka, Airflow, and Airbyte.
Proficiency in SQL and handling structured and unstructured data.
Experience with NoSQL databases like MongoDB.
Strong programming skills in Python or Scala for scripting and automation.
Experience in optimizing Spark and MapReduce jobs for high-performance computing.
Good understanding of data lake architectures and big data best practices.
Preferred Qualifications
Experience in real-time data streaming and processing.
Familiarity with Docker/Kubernetes for deployment and orchestration.
Strong analytical and problem-solving skills with the ability to debug and optimize data workflows.
If you have a passion for big data, ETL, and large-scale data processing, we’d love to hear from you!
Job Types: Full-time, Contractual / Temporary
Pay: From ₹400,000.00 per year
Work Location: In person
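On the real-time ingestion requirement (Kafka alongside NiFi/Airbyte), here is a minimal Spark Structured Streaming sketch that reads a Kafka topic and lands it as Parquet. It assumes the spark-sql-kafka connector is on the classpath, and the broker, topic, and paths are placeholders.

```python
# Hedged streaming-ingestion sketch; broker, topic, and paths are placeholders.
# Requires the spark-sql-kafka connector on the Spark classpath.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream-sketch").getOrCreate()

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "clickstream")
          .load())

# Kafka delivers raw bytes; cast the payload before transforming.
parsed = events.select(F.col("value").cast("string").alias("json"))

query = (parsed.writeStream
         .format("parquet")
         .option("path", "/lake/raw/clickstream")
         .option("checkpointLocation", "/lake/_chk/clickstream")
         .outputMode("append")
         .start())
query.awaitTermination()  # blocks while the stream runs
```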
Posted 1 week ago
5.0 - 9.0 years
3 - 9 Lacs
No locations specified
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
Sr Associate IS Architect
What you will do
Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to deliver actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and performing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has deep technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.
Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member that assists in design and development of the data pipeline
Stand up and enhance BI reporting capabilities through Cognos, Power BI, or similar tools
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks
Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Master's degree / Bachelor's degree with 5-9 years of experience in Computer Science, IT, or a related field
Functional Skills:
Must-Have Skills
Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience in using Databricks for building ETL pipelines and handling big data processing
Experience with data warehousing platforms such as Amazon Redshift or Snowflake
Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets
Experience in BI reporting tools such as Cognos, Power BI, and/or Tableau
Experienced with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps
Good-to-Have Skills:
Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena)
Experience with the Anaplan platform, including building, managing, and optimizing models and workflows, including scalable data integrations
Understanding of machine learning pipelines and frameworks for ML/AI models
Professional Certifications:
AWS Certified Data Engineer (preferred)
Databricks Certified (preferred)
Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated ability to function in a team setting
Demonstrated presentation skills
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
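As a hedged illustration of the BI-facing pipeline work this role describes (curated Spark data feeding Cognos or Power BI), the sketch below builds a monthly revenue mart; every table and column name is invented, and the downstream warehouse copy is only noted in a comment.

```python
# Illustrative reporting-mart build; all table and column names are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mart-sketch").getOrCreate()

fact = spark.read.parquet("/curated/fact_sales")
dim = spark.read.parquet("/curated/dim_customer")

mart = (fact.join(dim, "customer_id")
            .groupBy("region", F.date_trunc("month", "sale_ts").alias("month"))
            .agg(F.sum("amount").alias("revenue"),
                 F.countDistinct("customer_id").alias("active_customers")))

# Land as a table the reporting layer reads; a warehouse copy
# (Redshift/Snowflake) would typically follow via COPY or Snowpipe.
# Assumes the 'mart' database already exists in the metastore.
mart.write.mode("overwrite").saveAsTable("mart.monthly_revenue")
```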
Posted 1 week ago
8.0 - 13.0 years
3 - 6 Lacs
No locations specified
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
What you will do
Let’s do this. Let’s change the world. In this vital role you will work as a member of a Data Platform Engineering team that uses Cloud and Big Data technologies to craft, develop, implement, and maintain solutions to support various functions like Manufacturing, Commercial, Research and Development.
Roles & Responsibilities:
Collaborate with the Lead Architect, Business SMEs, and Data Scientists to design data solutions
Serve as a Lead Engineer for technical implementation of projects including planning, architecture, design, development, testing, and deployment following agile methodologies
Design and develop API services for managing Databricks resources, services, and features, and support data governance applications that manage the security of data assets in line with standards
Design and develop enterprise-level reusable components, frameworks, and services to enable data engineers
Proactively work on challenging data integration problems by implementing efficient ETL patterns and frameworks for structured and unstructured data
Automate and optimize data pipelines and frameworks for an easier and more efficient development process
Overall management of the Enterprise Data Fabric/Lake on the AWS environment to ensure that service delivery is efficient and business SLAs around uptime, performance, and capacity are met
Help define guidelines, standards, strategies, security policies, and change management policies to support the Enterprise Data Fabric/Lake
Advise and support project teams (project managers, architects, business analysts, and developers) on cloud platforms (AWS, Databricks preferred), tools, technology, and methodology related to the design and build of scalable, efficient, and maintainable Data Lake and other Big Data solutions
Experience developing in an Agile development environment and its ceremonies
Familiarity with code versioning using GitLab and code deployment tools
Mentor junior engineers and team members
What we expect of you
Basic Qualifications
Doctorate degree / Master's degree / Bachelor's degree and 8 to 13 years in Computer Science or Engineering
Must-Have Skills:
Proven hands-on experience with cloud platforms—AWS (preferred), Azure, or GCP.
Strong development experience with Databricks, Apache Spark, PySpark, and Apache Airflow.
Proficiency in Python-based microservices development and deployment.
Experience with CI/CD pipelines, containerization (Docker, Kubernetes/EKS), and infrastructure-as-code tools.
Demonstrated ability to build enterprise-grade, performance-optimized data pipelines in Databricks using Python and PySpark, following best practices and standards.
Solid understanding of SQL and relational/dimensional data modeling techniques.
Strong analytical and problem-solving skills to address complex data engineering challenges.
Familiarity with software engineering standard methodologies, including version control, automated testing, and continuous integration.
Hands-on experience with key AWS services: EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, and Glue.
Exposure to Agile tools such as Jira or Jira Align.
Good-to-Have Skills:
Experience building APIs and services for provisioning and managing AWS Databricks environments.
Knowledge of the Databricks SDK and REST APIs for managing workspaces, clusters, jobs, users, and permissions.
Familiarity with building AI/ML solutions using Databricks-native features.
Experience working with SQL/NoSQL databases and vector databases for large language model (LLM) applications.
Exposure to model fine-tuning and prompt engineering practices.
Experience developing self-service portals using front-end frameworks like React.js.
Ability to thrive in startup-like environments with minimal direction.
Good communication skills to effectively present technical information to leadership and respond to collaborator inquiries.
Certifications (preferred but not required):
AWS Certified Data Engineer
Databricks Certification
SAFe Agile Certification
Soft Skills:
Strong analytical and problem-solving attitude with the ability to troubleshoot sophisticated data and platform issues.
Exceptional communication skills—able to translate technical concepts into clear, business-relevant language for diverse audiences.
Collaborative and globally minded, with experience working effectively in distributed, multi-functional teams.
Self-motivated and proactive, demonstrating a high degree of ownership and initiative in driving tasks to completion.
Skilled at managing multiple priorities in fast-paced environments while maintaining attention to detail and quality.
Team-oriented with a growth mindset, contributing to shared goals and fostering a culture of continuous improvement.
Effective time and task management, with the ability to estimate, plan, and deliver work across multiple projects while ensuring consistency and quality.
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
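Given the posting's emphasis on APIs for managing Databricks resources, here is a hedged sketch using the documented Databricks REST endpoint for listing clusters. The host and token are read from environment variables as placeholders; this is an illustration, not Amgen's service code.

```python
# Hedged sketch of calling the Databricks REST API; host/token are placeholders.
# /api/2.0/clusters/list is a documented Databricks endpoint.
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token

resp = requests.get(
    f"{HOST}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Print a short inventory of clusters in the workspace.
for cluster in resp.json().get("clusters", []):
    print(cluster["cluster_id"], cluster["cluster_name"], cluster["state"])
```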
Posted 1 week ago
5.0 - 9.0 years
4 - 8 Lacs
No locations specified
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
What you will do
As a Sr. Associate IS Security Engineer at Amgen, you will play a critical role in ensuring the security and protection of the company's information systems and data. You will implement security measures, conduct security audits, analyze security incidents, and provide recommendations for improvements. Your strong knowledge of security protocols, network infrastructure, and vulnerability assessment will contribute to maintaining a secure IT environment.
Roles & Responsibilities:
Apply patches, perform OS upgrades, manage platform end-of-life.
Perform annual audits and periodic compliance reviews.
Support GxP validation and documentation processes.
Monitor and respond to security incidents.
Correlate alerts across platforms for threat detection.
Improve procedures through post-incident analysis.
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Master's degree / Bachelor's degree and 5 to 9 years of Computer Science, IT, or related field experience.
Must-Have Skills:
Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning on big data processing
Solid understanding of security technologies and their core functionality
Experience in analyzing cybersecurity threats with up-to-date knowledge of attack vectors and the cyber threat landscape.
Ability to prioritize tasks effectively and solve problems efficiently in a diverse, global team environment.
Good knowledge of Windows and/or Linux systems.
Experience with security alert correlation across different platforms.
Experience with ServiceNow, especially CMDB, the Common Service Data Model (CSDM), and IT Service Management.
SQL & Database Knowledge – Experience working with relational databases, querying data, and optimizing datasets.
Preferred Qualifications:
Familiarity with cloud services like AWS (e.g., Redshift, S3, EC2, IAM) and Databricks (Delta Lake, Unity Catalog, tokens, etc.)
Understanding of Agile methodologies (Scrum, SAFe)
Knowledge of DevOps and CI/CD practices
Familiarity with scientific or healthcare data domains
Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated ability to function in a team setting
Demonstrated presentation skills
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being.
From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
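For the alert-correlation skill listed above, a hedged PySpark sketch of the core idea: join alerts from two platforms on host within a five-minute window so multi-source hits surface first. The table names and schemas are hypothetical.

```python
# Illustrative alert-correlation sketch; table names and schemas are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("alert-corr-sketch").getOrCreate()

edr = spark.read.table("security.edr_alerts")      # endpoint alerts (host, rule, ts)
net = spark.read.table("security.network_alerts")  # network alerts (host, signature, ts)

# Correlate on host where the two alerts fire within 300 seconds of each other.
correlated = (edr.alias("e").join(
        net.alias("n"),
        (F.col("e.host") == F.col("n.host")) &
        (F.abs(F.col("e.ts").cast("long") - F.col("n.ts").cast("long")) <= 300),
    )
    .select("e.host", "e.rule", "n.signature", "e.ts"))

# Hosts flagged by both platforms within the window warrant triage first.
correlated.groupBy("host").count().orderBy(F.desc("count")).show()
```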
Posted 1 week ago
5.0 years
5 - 7 Lacs
Hyderābād
Remote
About the Role:
Grade Level (for internal use): 09
S&P Global Commodity Insights
The Role: Engineer II, Application Support Analyst. The Location: Hyderabad/Gurgaon, India
The Team: AppOps is responsible for providing high-quality operational and technical support for all Commodity Insights (CI) business-specific applications and systems, and for providing CI Business Partners with initial first-line remote support for IT issues and requests which occur during business hours in relation to the use of CI business-specific applications. The team ensures that standard operating procedures are followed for all incident and service requests received into the helpdesk function, and proactively monitors applications, responding to alerts and providing the business with periodic health check reports. We operate 24x7, which can involve working during APAC/EMEA/AMER hours and requires weekend support (rotational shifts, 5 days a week). Work hours can change depending on business requirements.
The Impact: You will be the first line of support for all requests and incidents raised by Commodity Insights business partners. You will ensure the business receives a prompt response to any requests and ensure issues are resolved within agreed service level agreements.
What’s in it for you: The position is part of the global Application Support team supporting users based in three time zones and across 26 offices. Exposure to application/product support, technical operations, monitoring, and projects in a role where you will interact directly with the business and learn the products and systems required to support the Platts business operations.
Responsibilities:
Provide initial first-line application/product support and triage of incidents and service requests for IT issues which occur during use of Platts applications.
Technical Excellence: In-depth technical understanding of all applications, monitoring tools, and all available technical resources. Executing effective weekend support, incident identification, effective shift handovers, major incident management, and process hygiene.
Log and capture incidents from all sources into the ticketing system (ServiceNow), ensuring correct categorization and prioritization of IT issues.
Application Support Operations: Ensure application operations excellence and guaranteed response times by actively monitoring application health checks and end-user emails/tickets, and ensuring all incidents/service requests are resolved in a timely and comprehensive manner. Server maintenance, monitoring, health checks, restarts, and BAU operational work. Provide 24x7 round-the-clock support to Platts business partners utilizing shift patterns.
Major Incident Management: Engage and drive major incidents during the weekends: initiate bridge calls, engage technical teams, and restore the service immediately.
Incident Hygiene: Adhere to the incident hygiene process, ensuring high hygiene in the incidents and requests handled.
Knowledge Management and competency development: Create and share SOPs, best-practice documents, checklists, and technical knowledge articles.
Resolve IT incidents to restore service as quickly as possible using the known error database. Escalate tickets to other teams as required.
Active participation in knowledge transitions, also coming up with process initiatives, delivering ideas and value to achieve the desired results.
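The health-check reporting described in these responsibilities can be as simple as the following hedged Python sketch, which probes a list of application endpoints and prints a status summary; the application names and URLs are placeholders, not real Platts systems.

```python
# Hedged health-check report sketch; app names and URLs are placeholders.
import datetime
import urllib.request

APPS = {
    "pricing-ui": "http://pricing.internal/health",  # placeholder URL
    "ingest-api": "http://ingest.internal/health",   # placeholder URL
}

def probe(url: str) -> str:
    """Return a short status string for one endpoint."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return "OK" if resp.status == 200 else f"HTTP {resp.status}"
    except OSError as exc:
        return f"DOWN ({exc})"

stamp = datetime.datetime.now().strftime("%Y-%m-%d %H:%M")
print(f"Health check report - {stamp}")
for name, url in APPS.items():
    print(f"  {name:12s} {probe(url)}")
```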
What We’re Looking For:
Basic Qualifications:
Experience working with various application monitoring systems and tools (Autosys / AppDynamics / Nagios / Naemon / Splunk preferred)
Experience in IT Service Management frameworks (ITIL or similar)
Knowledge of troubleshooting & supporting applications running on either Linux (preferred) or Windows Server OS
Exposure to industry-standard ITSM tools (ServiceNow strongly preferred)
Experience supporting cloud computing (AWS)
Familiarity with infrastructure concepts related to distributed applications (load balancers, networking, firewalls, NAT, virtual servers)
Exposure to working with tools like PuTTY, RDP, SSH, WinSCP, MySQL Query Browser, and Oracle SQL Developer
Familiarity with reporting and analysis tools (beneficial but not essential)
Experience working with collaborative platforms like Microsoft SharePoint, Box, OneDrive, and MS Teams
Good understanding of the Agile framework
Any knowledge of web servers (beneficial but not essential): Windows IIS, Linux Apache, or WebLogic (preferred)
Any knowledge of scripting languages (JScript/JavaScript, DOS batch, VBScript, Perl, Python, PowerShell, or shell script) preferred (beneficial but not essential)
Microsoft Office / Office 365, especially Excel (macros, worksheets, and add-ins)
Preferred Qualifications:
5+ years of relevant experience with a bachelor’s degree.
About S&P Global Commodity Insights
At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We’re a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating Energy Transition, S&P Global Commodity Insights’ coverage includes oil and gas, power, chemicals, metals, agriculture and shipping.
S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world’s foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world’s leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights .
What’s In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in.
We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. 
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 317512 Posted On: 2025-07-26 Location: Gurgaon, Haryana, India
Posted 1 week ago
5.0 - 9.0 years
7 - 8 Lacs
Hyderābād
On-site
Join Amgen’s Mission of Serving Patients
At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.
What you will do
As a Data Engineer, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.
Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member that assists in design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
Identify and resolve complex data-related challenges
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Master's degree / Bachelor's degree and 5 to 9 years of Computer Science, IT, or related field experience.
Must-Have Skills:
Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning on big data processing
Proficiency in data analysis tools (e.g., SQL); proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores
Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development
Strong understanding of data modeling, data warehousing, and data integration concepts
Proven ability to optimize query performance on big data platforms
Preferred Qualifications:
Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing
Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
Strong knowledge of Oracle / SQL Server, stored procedures, and PL/SQL; knowledge of the Linux OS
Knowledge of data visualization and analytics tools like Spotfire and Power BI
Strong understanding of data governance frameworks, tools, and best practices
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)
Professional Certifications:
Databricks certification (preferred)
AWS Data Engineer/Architect
Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated ability to function in a team setting
Demonstrated presentation skills
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
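Since this posting stresses SQL proficiency on big data platforms, a small hedged example of the analytical SQL involved: picking each customer's latest order with a window function via Spark SQL. The schema is invented for illustration.

```python
# Window-function example in Spark SQL; table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-sketch").getOrCreate()

latest = spark.sql("""
    SELECT customer_id, order_id, amount, order_ts
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY order_ts DESC) AS rn
        FROM curated.orders
    ) t
    WHERE t.rn = 1
""")
latest.show()
```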
Posted 1 week ago
5.0 years
7 - 7 Lacs
Gurgaon
Remote
About the Role:
Grade Level (for internal use): 09
S&P Global Commodity Insights
The Role: Engineer II, Application Support Analyst. The Location: Hyderabad/Gurgaon, India
The Team: AppOps is responsible for providing high-quality operational and technical support for all Commodity Insights (CI) business-specific applications and systems, and for providing CI Business Partners with initial first-line remote support for IT issues and requests which occur during business hours in relation to the use of CI business-specific applications. The team ensures that standard operating procedures are followed for all incident and service requests received into the helpdesk function, and proactively monitors applications, responding to alerts and providing the business with periodic health check reports. We operate 24x7, which can involve working during APAC/EMEA/AMER hours and requires weekend support (rotational shifts, 5 days a week). Work hours can change depending on business requirements.
The Impact: You will be the first line of support for all requests and incidents raised by Commodity Insights business partners. You will ensure the business receives a prompt response to any requests and ensure issues are resolved within agreed service level agreements.
What’s in it for you: The position is part of the global Application Support team supporting users based in three time zones and across 26 offices. Exposure to application/product support, technical operations, monitoring, and projects in a role where you will interact directly with the business and learn the products and systems required to support the Platts business operations.
Responsibilities:
Provide initial first-line application/product support and triage of incidents and service requests for IT issues which occur during use of Platts applications.
Technical Excellence: In-depth technical understanding of all applications, monitoring tools, and all available technical resources. Executing effective weekend support, incident identification, effective shift handovers, major incident management, and process hygiene.
Log and capture incidents from all sources into the ticketing system (ServiceNow), ensuring correct categorization and prioritization of IT issues.
Application Support Operations: Ensure application operations excellence and guaranteed response times by actively monitoring application health checks and end-user emails/tickets, and ensuring all incidents/service requests are resolved in a timely and comprehensive manner. Server maintenance, monitoring, health checks, restarts, and BAU operational work. Provide 24x7 round-the-clock support to Platts business partners utilizing shift patterns.
Major Incident Management: Engage and drive major incidents during the weekends: initiate bridge calls, engage technical teams, and restore the service immediately.
Incident Hygiene: Adhere to the incident hygiene process, ensuring high hygiene in the incidents and requests handled.
Knowledge Management and competency development: Create and share SOPs, best-practice documents, checklists, and technical knowledge articles.
Resolve IT incidents to restore service as quickly as possible using the known error database. Escalate tickets to other teams as required.
Active participation in knowledge transitions, also coming up with process initiatives, delivering ideas and value to achieve the desired results.
What We’re Looking For:
Basic Qualifications:
Experience working with various application monitoring systems and tools (Autosys, AppDynamics, Nagios/Naemon, Splunk preferred)
Experience with IT service management frameworks (ITIL or similar)
Knowledge of troubleshooting and supporting applications running on either Linux (preferred) or Windows server OS
Exposure to industry-standard ITSM tools (ServiceNow strongly preferred)
Experience supporting cloud computing (AWS)
Familiarity with infrastructure concepts related to distributed applications (load balancers, networking, firewalls, NAT, virtual servers)
Exposure to tools such as PuTTY, RDP, SSH, WinSCP, MySQL Query Browser, and Oracle SQL Developer
Familiarity with reporting and analysis tools (beneficial but not essential)
Experience working with collaborative platforms such as Microsoft SharePoint, Box, OneDrive, and MS Teams
Good understanding of the Agile framework
Knowledge of web servers (Windows IIS or Linux Apache, and WebLogic preferred) is beneficial but not essential
Knowledge of scripting languages (JScript and JavaScript, DOS, VBScript, Perl, Python, PowerShell, or shell script) is preferred but not essential
Microsoft Office / Office 365, especially Excel (macros, worksheets, and add-ins)
Preferred Qualifications:
5+ years of relevant experience with a bachelor’s degree.
About S&P Global Commodity Insights
At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We’re a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the Energy Transition, S&P Global Commodity Insights’ coverage includes oil and gas, power, chemicals, metals, agriculture and shipping.
S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world’s foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world’s leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights .
What’s In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in.
We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here .
-----------------------------------------------------------
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
-----------------------------------------------------------
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 317512
Posted On: 2025-07-26
Location: Gurgaon, Haryana, India
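As a loose illustration of the proactive application health monitoring this role describes, here is a small Python sketch. Everything in it is hypothetical: the service names and URLs are invented, and a real setup would raise alerts through tools like AppDynamics, Nagios, or ServiceNow rather than printing to stdout.

    import sys
    import requests  # third-party; pip install requests

    # Hypothetical endpoints an AppOps engineer might poll each shift.
    ENDPOINTS = {
        "pricing-api": "https://pricing.example.com/health",
        "market-data": "https://marketdata.example.com/health",
    }

    def check(name: str, url: str) -> bool:
        """Return True if the endpoint answers HTTP 200 within 5 seconds."""
        try:
            resp = requests.get(url, timeout=5)
        except requests.RequestException as exc:
            print(f"[ALERT] {name}: unreachable ({exc})")
            return False
        status = "OK" if resp.status_code == 200 else "ALERT"
        print(f"[{status}] {name}: HTTP {resp.status_code}")
        return resp.status_code == 200

    if __name__ == "__main__":
        results = [check(name, url) for name, url in ENDPOINTS.items()]
        # A non-zero exit code lets a scheduler such as Autosys flag the
        # run so an incident can be logged in the ticketing system.
        sys.exit(0 if all(results) else 1)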
Posted 1 week ago
8.0 years
5 - 10 Lacs
Bengaluru
On-site
We help the world run better
At SAP, we enable you to bring out your best. Our company culture is focused on collaboration and a shared passion to help the world run better. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from.
What you'll do:
You will take ownership of designing and building core integration frameworks that enable real-time, event-driven data flows between distributed SAP systems. As a senior contributor, you will work closely with architects to drive end-to-end development of services and pipelines supporting distributed data processing, data transformations and intelligent automation. This is a unique opportunity to contribute to SAP’s evolving data platform initiatives with hands-on involvement in Java, Python, Kafka, DevOps, real-time analytics, intelligent monitoring, BTP and Hyperscaler ecosystems.
Responsibilities:
Design and develop microservices using Java, RESTful APIs and messaging frameworks such as Apache Kafka (a minimal producer/consumer sketch follows this section).
Design and develop UIs based on SAP UI5/Fiori (a plus).
Design and develop an observability framework for customer insights.
Build and maintain scalable data processing and ETL pipelines that support real-time and batch data flows. Experience with Databricks is an advantage.
Accelerate the App2App integration roadmap by identifying reusable patterns, driving platform automation and establishing best practices.
Collaborate with cross-functional teams to enable secure, reliable and performant communication across SAP applications.
Build and maintain distributed data processing pipelines supporting large-scale data ingestion, transformation and routing.
Work closely with DevOps to define and improve CI/CD pipelines, monitoring and deployment strategies using modern GitOps practices.
Guide secure, cloud-native deployment of services on SAP BTP and the major Hyperscalers (AWS, Azure, GCP).
Collaborate with SAP’s broader data platform efforts, including Datasphere, SAP Analytics Cloud and the BDC runtime architecture.
Ensure adherence to best practices in microservices architecture, including service discovery, load balancing, and fault tolerance.
Stay up to date with the latest industry trends and technologies to continuously improve the development process.
What you bring:
8+ years of hands-on experience in backend development using Java, with strong object-oriented design and integration patterns.
Hands-on experience building ETL pipelines and working with large-scale data processing frameworks.
Exposure to log aggregation tools like Splunk, ELK, etc.
Experience or experimentation with tools such as Databricks, Apache Spark or other cloud-native data platforms is highly advantageous.
Familiarity with SAP Business Technology Platform (BTP), SAP Datasphere, SAP Analytics Cloud or HANA is highly desirable.
Experience designing CI/CD pipelines and working with containerization (Docker), Kubernetes and DevOps best practices.
Working knowledge of Hyperscaler environments such as AWS, Azure or GCP.
Passion for clean code, automated testing, performance tuning and continuous improvement.
Strong communication skills and ability to collaborate with global teams across time zones.
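The posting is Java-centric, but to keep this page's examples in a single language, here is a minimal event-driven sketch using the Python kafka-python package. The broker address and topic name are assumptions; a real App2App integration would add schemas, retries, and security configuration.

    import json
    from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

    BOOTSTRAP = "localhost:9092"   # assumed local broker
    TOPIC = "app2app-events"       # hypothetical topic name

    # Producer side: one service emits an integration event.
    producer = KafkaProducer(
        bootstrap_servers=BOOTSTRAP,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send(TOPIC, {"source": "system-a", "type": "customer.updated", "id": 42})
    producer.flush()

    # Consumer side: a downstream service reacts to the same stream.
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BOOTSTRAP,
        auto_offset_reset="earliest",
        consumer_timeout_ms=5000,  # stop iterating once the stream is idle
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    for message in consumer:
        print(f"received: {message.value}")

Decoupling producer and consumer through the topic is what gives the real-time, event-driven data flow the role describes: either side can be scaled or redeployed independently.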
Meet your Team
SAP is the market leader in enterprise application software, helping companies of all sizes and industries run at their best. As part of the Business Data Cloud (BDC) organization, the Foundation Services team is pivotal to SAP’s Data & AI strategy, delivering next-generation data experiences that power intelligence across the enterprise. Located in Bangalore, India, our team drives cutting-edge engineering efforts in a collaborative, inclusive and high-impact environment, enabling innovation and integration across SAP’s data platform #DevT3
Bring out your best
SAP innovations help more than four hundred thousand customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with two hundred million users and more than one hundred thousand employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and commitment to personal development. Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, you can bring out your best.
We win with inclusion
SAP’s culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone – regardless of background – feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to the Recruiting Operations Team: Careers@sap.com
For SAP employees: Only permanent roles are eligible for the SAP Employee Referral Program, according to the eligibility rules set in the SAP Referral Policy. Specific conditions may apply for roles in Vocational Training.
EOE AA M/F/Vet/Disability: Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, ethnicity, gender (including pregnancy, childbirth, et al), sexual orientation, gender identity or expression, protected veteran status, or disability.
Successful candidates might be required to undergo a background verification with an external vendor.
Requisition ID: 430165 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid.
Posted 1 week ago
4.0 - 6.0 years
3 - 4 Lacs
Bengaluru
On-site
What we offer
Our company culture is focused on helping our employees enable innovation by building breakthroughs together. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from. Apply now!
What you'll do:
We are seeking a hands-on Product Manager with strong technical acumen and a passion for data engineering to drive the evolution of our data foundation capabilities. In this role, you will work closely with engineering, architecture, design, and go-to-market teams to define product requirements, shape roadmap priorities, and deliver impactful services that power the BDC platform. You will bring customer empathy, execution focus, and a collaborative mindset to ensure delivery of valuable outcomes for both internal and external stakeholders.
Product Development & Execution
Define and manage product requirements and use cases based on customer needs, stakeholder inputs, and technical feasibility
Partner with engineering teams to deliver high-quality features on time and with measurable impact
Prioritize and manage the product backlog, balancing short-term iterations with long-term strategic goals
Support the creation of clear documentation, release notes, and user-facing communication
Data-Driven Insights
Use data and user feedback to continuously improve product features and drive customer value
Collaborate with teams to monitor adoption, measure impact, and identify opportunities
Cross-Functional Collaboration
Facilitate productive working relationships across BDC, SAP LOBs, and external partners
Ensure alignment between technical teams and business stakeholders on product objectives
Customer & Stakeholder Engagement
Gather feedback directly from internal users, partners, and customers to validate hypotheses and inform future development
Participate in customer calls, demos, and workshops to showcase capabilities and understand evolving needs
What you bring:
Experience: 4–6 years of product management experience in data engineering, platform, data integration or cloud services environments
Technical Expertise: Strong background in data engineering, including hands-on experience with ETL, data pipelines, databases, and analytics platforms. Knowledge of Apache Spark, data lakes, Delta Lake, cloud data warehouses, and object store technologies, plus experience building APIs for data sharing using “zero-copy share” techniques such as Delta and Iceberg, is highly desired (a brief Delta sketch follows this posting).
Customer Focus: Proven ability to translate user needs into product requirements and iterate quickly on feedback
Execution Skills: Strong organizational, collaboration, interpersonal and planning skills with a bias toward action and delivery
Communication Skills: Strong written and verbal communication skills, with the ability to articulate complex ideas clearly and effectively to both technical and non-technical audiences
Education: Bachelor’s degree in Computer Science, Engineering, Data Science or a related field. An advanced degree or MBA is a plus.
Meet your Team:
SAP Business Data Cloud (BDC) is SAP’s next-generation data platform that brings together data from SAP and non-SAP sources into a unified, open, and business-ready environment.
BDC enables organizations to harness the full power of their data with seamless integration, rich semantic context, and advanced governance capabilities. By providing trusted and connected data across landscapes, BDC empowers users to make better, faster, and more confident decisions. BDC Data Foundation Services is a forward-looking team at the heart of SAP’s Business Data Cloud (BDC) mission. We focus on building scalable, robust, and secure data product infrastructure services that empower customers with trusted, unified, and actionable data. As part of our growth journey, we are looking for a skilled and motivated Product Manager to join our team and contribute to the next wave of innovation in data foundations.
We are SAP
SAP innovations help more than 400,000 customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with 200 million users and more than 100,000 employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and commitment to personal development. Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, we build breakthroughs, together.
Our inclusion promise
SAP’s culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone – regardless of background – feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world.
SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to the Recruiting Operations Team: Americas: Careers.NorthAmerica@sap.com or Careers.LatinAmerica@sap.com, APJ: Careers.APJ@sap.com, EMEA: Careers@sap.com.
EOE AA M/F/Vet/Disability: Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, ethnicity, gender (including pregnancy, childbirth, et al), sexual orientation, gender identity or expression, protected veteran status, or disability.
Successful candidates might be required to undergo a background verification with an external vendor.
Requisition ID: 430237 | Work Area: Solution and Product Management | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time |
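To illustrate the zero-copy sharing idea mentioned in the requirements above, here is a brief PySpark sketch that writes a Delta table and reads it back in place. It assumes the delta-spark package is installed; the paths and sample rows are hypothetical.

    from delta import configure_spark_with_delta_pip
    from pyspark.sql import SparkSession

    # Configure a Spark session with the Delta Lake extensions.
    builder = (
        SparkSession.builder.appName("delta-sharing-sketch")
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config(
            "spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog",
        )
    )
    spark = configure_spark_with_delta_pip(builder).getOrCreate()

    df = spark.createDataFrame(
        [(1, "order_created"), (2, "order_shipped")], ["order_id", "event"]
    )

    # Write once as a Delta table. Consumers attach to the same storage
    # location instead of receiving a copied extract -- the "zero-copy" idea.
    df.write.format("delta").mode("overwrite").save("/tmp/events_delta")

    # Any Delta-capable engine can now read the table in place.
    spark.read.format("delta").load("/tmp/events_delta").show()

The same pattern applies to Iceberg tables; the point is that sharing happens by granting access to table metadata and files rather than by moving data.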
Posted 1 week ago