984 Databricks Jobs - Page 25

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 12.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Source: Naukri

Data Engineer (Python, Azure Databricks) – 7+ Years – Bangalore

Are you a seasoned Data Engineer passionate about turning complex datasets into scalable insights? Here's your chance to build robust data platforms and pipelines that support global decision-making at scale, within a forward-thinking organization that champions innovation and excellence.

Your Future Employer – A global enterprise delivering high-impact technology and operational services to Fortune-level clients, known for fostering a culture of innovation, agility, and collaboration.

Responsibilities:
1. Architect and implement data models and infrastructure to support analytics, reporting, and data science.
2. Build high-performance ETL pipelines and manage data integration from multiple sources (see the pipeline sketch below).
3. Maintain data quality, governance, and security standards.
4. Collaborate with cross-functional teams to translate business needs into technical solutions.
5. Troubleshoot, optimize, and document scalable data workflows.

Requirements:
1. 7+ years of experience as a Data Engineer, with at least 4 years in cloud ecosystems.
2. Strong expertise in Azure (ADF, Data Lake Gen2, Databricks) or AWS.
3. Proficiency in Python and SQL; experience with Spark, Kafka, or Hadoop is a plus.
4. Deep understanding of data warehousing, OLAP, and data modelling.
5. Familiarity with visualization tools such as Power BI, Tableau, or Looker.

What is in it for you:
- High-visibility projects with real-world impact.
- Access to cutting-edge cloud and big data technologies.
- Flexible hybrid work environment in Bangalore.
- Dynamic and collaborative global work culture.

Reach us: If you think this role aligns with your career, kindly write to me along with your updated CV at parul.arora@crescendogroup.in for a confidential discussion.

Disclaimer: Crescendo Global specializes in senior to C-level niche recruitment. We are passionate about empowering job seekers and employers with a memorable job search and leadership hiring experience. We do not discriminate based on race, religion, gender, or any other protected status.

Note: We receive a large number of applications daily, so it is difficult to respond to every candidate. Please assume your profile has not been shortlisted if you do not hear from us within 1 week. Your patience is highly appreciated.

Profile Keywords – Data Engineer Bangalore, Azure Data Factory, Azure Data Lake, Azure Databricks, ETL Developer, Big Data Engineer, Python Data Engineer, SQL Developer, Data Pipeline Developer, Cloud Data Engineering, Data Warehousing, Data Modelling, Spark Developer, Kafka Engineer, Hadoop Jobs, Power BI Developer, Tableau Analyst, CI/CD for Data, Streaming Data Engineer, DataOps
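
For illustration, a minimal sketch of the kind of Azure Databricks ETL pipeline this role describes, in PySpark. All storage paths, container names, and columns below are hypothetical placeholders, not taken from the posting:

```python
# Minimal batch ETL sketch for Azure Databricks (PySpark).
# Paths and column names are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSVs landed in an ADLS Gen2 container (e.g., by ADF)
raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

# Transform: deduplicate, fix types, drop bad rows
clean = (raw.dropDuplicates(["order_id"])
            .withColumn("order_ts", F.to_timestamp("order_ts"))
            .withColumn("order_date", F.to_date("order_ts"))
            .withColumn("amount", F.col("amount").cast("double"))
            .filter(F.col("amount") > 0))

# Load: write a partitioned Delta table for analytics consumers
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```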

Posted 3 weeks ago

5.0 - 8.0 years

25 - 35 Lacs

Gurugram, Bengaluru

Hybrid

Source: Naukri

Role & responsibilities:
- Work with data product managers, analysts, and data scientists to architect, build, and maintain data processing pipelines in SQL or Python.
- Build and maintain a data warehouse / data lakehouse for analytics, reporting, and ML predictions.
- Implement DataOps and related DevOps practices, focused on ETL pipelines for data analytics and reporting, and ELT pipelines for model training.
- Support, optimise, and transition our current processes to ensure well-architected implementations and best practices.
- Work in an agile environment within a collaborative product team using Kanban.
- Collaborate across departments, working closely with data science teams and with business (economists/data) analysts to refine their data requirements for various initiatives and data consumption needs.
- Educate and train colleagues such as data scientists, analysts, and stakeholders in data pipelining and preparation techniques, making it easier for them to integrate and consume the data they need for their own use cases.
- Help ensure compliance and governance during data use, so that data users and consumers use the data provisioned to them responsibly, through data governance and compliance initiatives.
- Become a data and analytics evangelist: promote the available data and analytics capabilities and expertise to business unit leaders, and educate them in leveraging these.

Preferred candidate profile – what you'll need to be successful:
- 8+ years of professional experience with data processing environments used in large-scale digital applications.
- Extensive experience programming in Python, Spark (SparkSQL), and SQL.
- Experience with warehouse technologies such as Snowflake, and with data modelling, lineage, and data governance tools such as Alation.
- Professional experience designing, building, and managing bespoke data pipelines (including ETL, ELT, and lambda architectures), using technologies such as Apache Airflow, Snowflake, Amazon Athena, AWS Glue, Amazon EMR, or equivalents (see the orchestration sketch below).
- Strong fundamental technical expertise in cloud-native technologies, such as serverless functions, API gateways, relational and NoSQL databases, and caching.
- Experience leading and mentoring data engineering teams.
- Experience working in teams with data scientists and ML engineers to build automated pipelines for data pre-processing and feature extraction.
- An advanced degree in software/data engineering, computer/information science, or a related quantitative field, or equivalent work experience.
- Strong verbal and written communication skills and the ability to work well with a wide range of stakeholders.
- Strong ownership; scrappy and biased for action.

Perks and benefits
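
As a concrete illustration of the orchestration stack named above, here is a minimal Apache Airflow DAG sketch for a daily ELT run. The task bodies, IDs, and names are hypothetical placeholders (and it assumes Airflow 2.4+ for the `schedule` argument):

```python
# Minimal daily ELT DAG sketch; task logic is a stand-in for real
# extract/load steps (e.g., landing files, then COPY INTO Snowflake).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_stage(**context):
    # e.g., land raw files in S3 or a Snowflake stage for the run date
    print("extracting source data for", context["ds"])

def load_and_transform(**context):
    # e.g., load staged data into the warehouse, then run SQL transforms
    print("loading and transforming for", context["ds"])

with DAG(
    dag_id="daily_elt_example",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_stage",
                             python_callable=extract_to_stage)
    transform = PythonOperator(task_id="load_and_transform",
                               python_callable=load_and_transform)
    extract >> transform  # run transform only after extraction succeeds
```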

Posted 3 weeks ago

6.0 - 11.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Source: Naukri

Who we are:
Samsara (NYSE: IOT) is the pioneer of the Connected Operations Cloud, a platform that enables organizations that depend on physical operations to harness Internet of Things (IoT) data to develop actionable insights and improve their operations. At Samsara, we are helping improve the safety, efficiency, and sustainability of the physical operations that power our global economy. Representing more than 40% of global GDP, these industries are the infrastructure of our planet, including agriculture, construction, field services, transportation, and manufacturing, and we are excited to help digitally transform their operations at scale. Working at Samsara means you'll help define the future of physical operations and be on a team that's shaping an exciting array of product solutions, including Video-Based Safety, Vehicle Telematics, Apps and Driver Workflows, Equipment Monitoring, and Site Visibility. As part of a recently public company, you'll have the autonomy and support to make an impact as we build for the long term.

About the role:
Samsara Technologies India Private Limited is looking for an experienced support engineer to join our Global Technical Support organization as a Product Support Engineer - Platform. In this role, you will focus on solving mission-critical issues across all Samsara cloud-connected software products. You will provide debugging, failure analysis, and design feedback to our Software Engineering teams, with the primary goal of improving product quality to ensure a world-class customer experience. You will innovate on tools and automation that simplify the customer product experience. This role combines troubleshooting and creative problem solving with a strong customer focus. We are looking for hands-on engineers who are passionate about Samsara's mission and resolve issues with a sense of urgency. The role also requires 10-15% engagement on cellular issues.

This role has a dotted line to the Support Director based in India. It is a hybrid position based in our Bengaluru office, requiring 3 days per week in the office and 2 days working remotely, plus on-site attendance as needed for training. You must reside within a 1.5-hour commuting distance of the office, and you will participate in an on-call rotation to triage and manage high-severity issues and outages. Relocation assistance will not be provided for this role. The candidate will work EMEA shift hours.

You should apply if:
- You want to impact the industries that run our world: your efforts will result in real-world impact, helping to keep the lights on, get food into grocery stores, reduce emissions, and, most importantly, ensure workers return home safely.
- You are the architect of your own career: if you put in the work, this role won't be your last at Samsara. We set our employees up for success and have built a culture that encourages rapid career development and countless opportunities to experiment and master your craft in a hyper-growth environment.
- You're energized by our opportunity: the vision we have to digitize large sectors of the global economy requires your full focus and best efforts to bring forth creative, ambitious ideas for our customers.
- You want to be with the best: at Samsara, we win together, celebrate together, and support each other. You will be surrounded by a high-caliber team that will encourage you to do your best.

In this role, you will:
- Identify and resolve cloud application issues for Samsara customers related to performance, roles & permissions, and reports.
- Analyze cloud data and logs to drive quality improvement.
- Serve as a subject matter expert and educator to our global customer support team.
- Develop services and tools to create and improve workflows for technical support and simplify the customer experience.
- Analyze product support trends and partner with the R&D team to build a world-class customer solution.
- Lead post-mortem analyses to identify learnings, root causes, systematic patterns that need attention, improvement opportunities, and relevant trends.
- Interface directly with cellular carriers on RCCAs for outages and close coverage gaps.
- Champion, role model, and embed Samsara's cultural values (Focus on Customer Success, Build for the Long Term, Adopt a Growth Mindset, Be Inclusive, Win as a Team) as we scale globally and across new offices.

Minimum requirements for the role:
- Bachelor's degree in computer science, software engineering, or a related field.
- 6+ years in a technical role, preferably cross-functional.
- Proficiency in data analytics tools such as Databricks and Tableau.
- Experience with SQL, GraphQL, and Python.
- Prior technical support experience required.
- Strong problem-solving skills.
- Fluent in English, with excellent communication and customer service skills.
- Resourceful, creative, and able to form strong relationships with R&D and product teams.
- Able to work in a fast-paced environment.
- Experience in direct customer interaction, incident response, and 24/7 on-call support is essential.

An ideal candidate also has:
- 6+ years of experience in product support, software development, or systems engineering for cloud-based products.
- Experience interfacing directly with cellular carriers.
- IoT, networking, and wireless troubleshooting skills.
- Executive presence & communication: excellent written and verbal communication skills tailored to a senior leadership audience.
- Ability to drive outcomes without authority: strong project management skills to prioritize, delegate, and drive action across departments.
- Operational urgency: hands-on experience delivering business results under tight timelines.
- Technical know-how: comfort interfacing with engineers, translating complex technical concepts into everyday language, and working with SaaS systems.
- Levelheadedness: self-assured and calm amidst high-pressure situations.
- A strong bias for action, the ability to deep-dive, insistence on the highest standards, and comfort working in a hyper-growth environment with shifting priorities.

At Samsara, we welcome everyone regardless of their background. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, gender, gender identity, sexual orientation, protected veteran status, disability, age, and other characteristics protected by law. We depend on the unique approaches of our team members to help us solve complex problems. We are committed to increasing diversity across our team and ensuring that Samsara is a place where people from all backgrounds can make an impact.

Benefits:
Full-time employees receive a competitive total compensation package along with employee-led remote and flexible working, health benefits, the Samsara for Good charity fund, and much, much more. Take a look at our Benefits site to learn more.

Accommodations:
Samsara is an inclusive work environment, and we are committed to ensuring equal opportunity in employment for qualified persons with disabilities. Please email accessibleinterviewing@samsara.com if you require any reasonable accommodations throughout the recruiting process.

Flexible Working:
At Samsara, we embrace a flexible working model that caters to the diverse needs of our teams. Our offices are open for those who prefer to work in person, and we also support remote work where it aligns with our operational requirements. For certain positions, being close to one of our offices or within a specific geographic area is important to facilitate collaboration, access to resources, or alignment with our service regions. In these cases, the job description will clearly indicate any working-location requirements. Our goal is to ensure that all members of our team can contribute effectively, whether they are working on-site, in a hybrid model, or fully remotely. All offers of employment are contingent upon an individual's ability to secure and maintain the legal right to work at the company and in the specified work location, if applicable.

Posted 3 weeks ago

12.0 - 22.0 years

40 - 60 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Source: Naukri

Job Description: Data Architect – Azure with MS Fabric
Location: Pune / Bangalore / Hyderabad
Experience: 12+ years

Role Overview:
As a Data Architect specializing in Azure with MS Fabric, you will play a pivotal role in designing and implementing robust data solutions that leverage Microsoft Fabric for cloud-based data management and analytics. Your expertise will guide clients through the complexities of data architecture, ensuring seamless integration with existing systems and optimizing data workflows. You will be responsible for leading projects from inception to completion, providing strategic insights and technical leadership throughout the process.

Required Skills and Qualifications:
- Experience: 14+ years in Data and Analytics, with a minimum of 7-8 years focused on Azure and at least 2 implementations using Microsoft Fabric.
- Data Architecture Expertise: Proven experience as a Data Architect, particularly in consulting and solution design, with a strong background in cloud data stacks.
- Technical Proficiency: Extensive knowledge of data modeling, database design, ETL processes, and data governance principles.
- MS Fabric: Hands-on experience with Microsoft Fabric, including data integration, data pipelines, and analytics capabilities.
- SQL Skills: Advanced SQL knowledge, with experience writing complex queries, performance tuning, and troubleshooting.
- Programming Skills: Proficiency in programming languages such as Java, Python, or Scala for building data pipelines.
- Methodologies: Familiarity with Agile, Scrum, and other project delivery methodologies.
- Stakeholder Management: Strong experience managing both internal and external stakeholders effectively.
- Certifications: Relevant certifications in Azure and Microsoft Fabric are an advantage.

Key Responsibilities:

Leadership & Strategy:
- Lead the design and implementation of end-to-end solutions using Microsoft Fabric.
- Collaborate with business and technical stakeholders to define data strategies.
- Act as the primary point of contact for all Fabric-related projects and initiatives.
- Provide mentorship and guidance to junior data engineers, BI developers, and analysts.

Architecture & Development:
- Design and manage Lakehouses, Data Warehouses, and Pipelines within Microsoft Fabric (see the notebook sketch below).
- Build scalable data models and visualizations using Power BI (with Fabric integration).
- Develop and maintain Dataflows, Notebooks, Spark Jobs, and Synapse Pipelines.
- Implement best practices in data governance, security, and compliance using Fabric's tools.

Project Execution:
- Lead cross-functional teams for successful project delivery.
- Ensure alignment of architecture with business KPIs and OKRs.
- Drive adoption of Fabric across business units.
- Perform code reviews and architectural assessments.

Monitoring & Optimization:
- Monitor data pipeline performance, troubleshoot issues, and tune performance.
- Ensure data quality, availability, and lineage using Microsoft Purview (or native Fabric tooling).
- Maintain documentation of data models, architecture, and workflows.
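
To make the Lakehouse work concrete, here is a minimal sketch of a Microsoft Fabric notebook cell that builds a Gold-layer table with PySpark. Table and column names are hypothetical, and it assumes the notebook is attached to a Fabric Lakehouse, where the `spark` session is predefined and Delta tables resolve by name:

```python
# Sketch of a Fabric notebook cell: aggregate a Silver table into a
# Gold reporting table for Power BI. All names are illustrative.
from pyspark.sql import functions as F

# Read a Silver-layer Delta table registered in the attached Lakehouse
sales = spark.read.table("silver_sales")

# Aggregate to a reporting-friendly Gold table
daily = (sales.groupBy("order_date", "region")
              .agg(F.sum("amount").alias("revenue"),
                   F.countDistinct("order_id").alias("orders")))

# Persist as a managed Delta table the semantic model can pick up
daily.write.format("delta").mode("overwrite").saveAsTable("gold_daily_sales")
```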

Posted 3 weeks ago

3.0 - 5.0 years

17 - 22 Lacs

Mumbai, Gurugram

Work from Office

Source: Naukri

Locations: Gurugram (DLF Building), Mumbai (Hiranandani)
Posted: Yesterday
Application end date: June 10, 2025 (10 days left to apply)
Job requisition ID: R_308095

Company: Mercer

Description: We are seeking a talented individual to join our Data Engineering team at Mercer. This role will be based in Gurgaon/Mumbai. This is a hybrid role that requires working at least three days a week in the office.

Senior Principal Engineer - Data Engineering

We will count on you to:
- Design, develop, and maintain scalable and robust data pipelines on Databricks.
- Collaborate with data scientists and analysts to understand data requirements and deliver solutions.
- Optimize and troubleshoot existing data pipelines for performance and reliability.
- Ensure data quality and integrity across various data sources.
- Implement data security and compliance best practices.
- Monitor data pipeline performance and conduct necessary maintenance and updates.
- Document data pipeline processes and technical specifications.
- Use analytical skills to solve complex problems associated with database development and management.
- Work with other teams, such as data scientists, business analysts, and Qlik developers, to identify organizational needs and design effective solutions.
- Provide technical leadership and guidance to the team, including code reviews, mentoring, and helping team members troubleshoot technical issues.
- Align the data engineering strategy with the wider organizational strategy, which may involve deciding which projects to prioritize, making technology choices, and planning for the team's growth and development.
- Ensure that all data engineering activities comply with relevant laws and regulations, and that data is stored and processed securely.
- Keep up to date with new technologies and methodologies in the field of data engineering, and foster a culture of innovation and continuous improvement within the team.
- Communicate effectively with both technical and non-technical stakeholders, explaining data infrastructure, strategies, and systems in an understandable way.

What you need to have:
- Bachelor's degree (BE/B.Tech) in Computer Science, IT, ECE, MIS, or a related qualification; a master's degree is always helpful.
- 3-5 years of experience in data engineering.
- Proficiency with Databricks or AWS (Glue, S3), Python, and Spark.
- Strong SQL skills and experience with relational databases.
- Knowledge of data warehousing concepts and ETL processes.
- Excellent problem-solving and analytical skills.
- Effective communication skills.

What makes you stand out:
- Exposure to a BI tool such as Qlik (preferred), Power BI, or Tableau.
- Hands-on experience with SQL or PL/SQL.
- Experience with big data technologies (e.g., Hadoop, Kafka).
- Agile, JIRA, and SDLC process knowledge.
- Teamwork and collaboration skills.
- Strong quantitative and analytical skills.

Why join our team:
- We help you be your best through professional development opportunities, interesting work, and supportive leaders.
- We foster a vibrant and inclusive culture where you can work with talented colleagues to create new solutions and have impact for colleagues, clients, and communities.
- Our scale enables us to provide a range of career opportunities, as well as benefits and rewards to enhance your well-being.

Mercer, a business of Marsh McLennan (NYSE: MMC), is a global leader in helping clients realize their investment objectives, shape the future of work, and enhance health and retirement outcomes for their people. Marsh McLennan is a global leader in risk, strategy, and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer, and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit mercer.com, or follow on LinkedIn and X.

Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law.

Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.

Posted 3 weeks ago

8.0 - 12.0 years

26 - 28 Lacs

Mumbai, Maharashtra, India

On-site

Source: Foundit

This role is for one of Weekday's clients.
Salary range: Rs 2600000 - Rs 2800000 (i.e., INR 26-28 LPA)
Min Experience: 8 years
Location: Mumbai
Job Type: full-time

About the Role:
We are seeking a highly skilled and experienced Data Architect to lead the design, development, and optimization of our enterprise data infrastructure. This is a strategic role for an individual passionate about modern data platforms, cloud technologies, and data governance. The ideal candidate will bring deep expertise in Data Engineering, Azure Data Services, Databricks, Power BI, and ETL frameworks to drive scalable and secure data architecture for enterprise analytics and reporting. As a Data Architect, you will collaborate with cross-functional teams, including business analysts, data scientists, and application developers, to ensure the delivery of high-quality, actionable data solutions. Your work will directly influence data-driven decision-making and operational efficiency across the organization.

Key Responsibilities:
- Architect Scalable Data Solutions: Design and implement robust, secure, and scalable data architectures using Microsoft Azure, Databricks, and ADF for large-scale enterprise environments.
- Data Engineering Leadership: Provide technical leadership and mentoring to data engineering teams on ETL/ELT best practices, data pipeline development, and optimization.
- Cloud-Based Architecture: Build and optimize data lakes and data warehouses on Azure, leveraging Azure Data Lake, Synapse Analytics, and Azure SQL services.
- Databricks Expertise: Use Azure Databricks for distributed data processing, real-time analytics, and machine learning data pipelines.
- ETL Frameworks: Design and maintain ETL workflows using Azure Data Factory (ADF), ensuring efficient movement and transformation of data from multiple sources.
- Visualization & Reporting: Collaborate with business stakeholders to deliver intuitive and insightful dashboards and reports using Power BI.
- Data Governance & Quality: Enforce data quality standards, lineage, and governance across all data assets, ensuring compliance and accuracy.
- Collaboration & Integration: Work with application developers and DevOps teams to integrate data systems with other enterprise applications.
- Documentation & Standards: Maintain detailed architecture diagrams, data dictionaries, and standard operating procedures for all data systems.

Required Skills & Qualifications:
- 8+ years of experience in data engineering, data architecture, or related fields.
- Proven experience designing and implementing cloud-based data solutions using Microsoft Azure.
- Hands-on expertise in Azure Databricks, ADF, Azure SQL, Data Lake Storage, and Power BI.
- Strong proficiency in ETL/ELT development, pipeline orchestration, and performance optimization.
- Solid understanding of data modeling, warehousing concepts (Kimball/Inmon), and big data technologies.
- Proficiency in scripting languages such as Python, SQL, and Spark.
- Experience managing data security, compliance, and governance in large enterprises.
- Strong problem-solving skills and a collaborative mindset.

Preferred Qualifications:
- Azure certifications such as Azure Data Engineer Associate or Azure Solutions Architect Expert.
- Experience with CI/CD pipelines for data workloads.
- Exposure to MDM (Master Data Management) and data catalog tools.

Posted 3 weeks ago

3.0 - 8.0 years

0 - 3 Lacs

Hyderabad

Work from Office

Source: Naukri

Job Summary:
We are looking for a Machine Learning Engineer with strong data engineering capabilities to support the development and deployment of predictive models in a smart manufacturing environment. This role involves building robust data pipelines, developing high-accuracy ML models for defect prediction, and implementing automated control systems for real-time corrective actions on the production floor.

Key Responsibilities:

Data Engineering & Integration:
- Validate and ensure the correct flow of data from InfluxDB/CDL to Smart box/Databricks.
- Assist data scientists in the initial modeling phase through reliable data provisioning.
- Provide ongoing support for data pipeline corrections and ad-hoc data extraction.

ML Model Development for Defect Prediction:
- Develop 3 separate ML models for predicting 3 types of defects based on historical data.
- Predict defect occurrence within a 5-minute window using artificial sampling techniques and dimensionality reduction (see the modelling sketch below).
- Deliver results with accuracy of at least 95%, precision and recall of at least 80%, and feature importance insights.

Closed-Loop Control System Implementation:
- Prescribe machine setpoint changes based on model outputs to prevent defect occurrence.
- Design and implement a closed-loop system that includes: real-time data fetching from production-line PLCs (via InfluxDB/CDL); deployment of ML models on Smart box; a pipeline to output recommendations to the appropriate PLC tag; and a retraining pipeline triggered by drift detection (cloud-based retraining when recommendations deviate from centerlines).

Qualifications:

Education: Bachelor's or Master's degree in Computer Science, Data Science, Electrical Engineering, or a related field.

Technical Skills:
- Proficient in Python and ML libraries (e.g., scikit-learn, XGBoost, pandas).
- Experience with InfluxDB and CDL for industrial data integration.
- Experience with Smart box and Databricks for model deployment and data processing.
- Experience with real-time data pipelines and industrial control systems (PLCs).
- Experience with model performance tracking and retraining pipelines.

Preferred:
- Experience in manufacturing analytics or predictive maintenance.
- Familiarity with Industry 4.0 principles and edge/cloud hybrid architectures.

Soft Skills:
- Strong analytical and problem-solving abilities.
- Effective communication with cross-functional teams (data science, automation, production).
- Attention to detail and focus on solution reliability.
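
A minimal sketch of the modelling approach named above (artificial sampling plus dimensionality reduction, evaluated on precision and recall), using scikit-learn and imbalanced-learn on synthetic stand-in data. Every name, shape, and number here is illustrative, not from the posting:

```python
# Defect-prediction sketch: SMOTE oversampling + PCA + a classifier,
# scored on accuracy, precision, and recall. Data is synthetic.
import numpy as np
from imblearn.over_sampling import SMOTE  # assumes imbalanced-learn is installed
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(5000, 60))            # stand-in sensor features per 5-min window
y = (rng.random(5000) < 0.05).astype(int)  # rare defect label (~5% positive)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)

# Artificial sampling: oversample the minority (defect) class,
# on the training split only, to avoid leaking into evaluation
X_bal, y_bal = SMOTE(random_state=42).fit_resample(X_tr, y_tr)

# Dimensionality reduction before fitting the model
pca = PCA(n_components=20).fit(X_bal)
clf = RandomForestClassifier(n_estimators=300, random_state=42)
clf.fit(pca.transform(X_bal), y_bal)

pred = clf.predict(pca.transform(X_te))
print("accuracy :", accuracy_score(y_te, pred))
print("precision:", precision_score(y_te, pred, zero_division=0))
print("recall   :", recall_score(y_te, pred, zero_division=0))
```

On real sensor data, feature-importance insights could come from `clf.feature_importances_` (mapped back through the PCA components) or from a model fit on the raw features.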

Posted 3 weeks ago

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to gather requirements, developing application features, and ensuring that the applications align with business objectives. You will also engage in problem-solving activities, providing innovative solutions to enhance application performance and user experience, while maintaining a focus on quality and efficiency throughout the development process.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with cloud computing platforms.
- Strong understanding of application development methodologies.
- Familiarity with data integration and ETL processes.
- Experience in performance tuning and optimization of applications.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 3 weeks ago

7.0 - 12.0 years

20 - 35 Lacs

Mumbai

Work from Office

Source: Naukri

Job Summary:
We are looking for a highly skilled Azure Data Engineer with a strong background in real-time and batch data ingestion and big data processing, particularly using Kafka and Databricks. The ideal candidate will have a deep understanding of streaming architectures, Medallion data models, and performance optimization techniques in cloud environments. This role requires hands-on technical expertise, including live coding during the interview process.

Key Responsibilities:
- Design and implement streaming data pipelines integrating Kafka with Databricks using Structured Streaming (see the sketch below).
- Architect and maintain a Medallion Architecture with well-defined Bronze, Silver, and Gold layers.
- Implement efficient ingestion using Databricks Autoloader for high-throughput data loads.
- Work with large volumes of structured and unstructured data, ensuring high availability and performance.
- Apply performance tuning techniques such as partitioning, caching, and cluster resource optimization.
- Collaborate with cross-functional teams (data scientists, analysts, business users) to build robust data solutions.
- Establish best practices for code versioning, deployment automation, and data governance.

Required Technical Skills:
- Strong expertise in Azure Databricks and Spark Structured Streaming, including processing modes (append, update, complete), output modes (append, complete, update), and checkpointing and state management.
- Experience with Kafka integration for real-time data pipelines.
- Deep understanding of the Medallion Architecture.
- Proficiency with Databricks Autoloader and schema evolution.
- Deep understanding of Unity Catalog and foreign catalogs.
- Strong knowledge of Spark SQL, Delta Lake, and DataFrames.
- Expertise in performance tuning (query optimization, cluster configuration, caching strategies).
- Data management strategies (must have).
- Excellent governance and access management skills.
- Strong grasp of data modelling, data warehousing concepts, and Databricks as a platform.
- Solid understanding of window functions.
- Proven experience with merge/upsert logic, implementing SCD Type 1 and Type 2, and handling CDC (Change Data Capture) scenarios.
- Industry expertise in at least one of Retail, Telecom, or Energy.
- Real-time use case execution.
- Data modelling.

Location: Mumbai
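
To illustrate the Kafka-to-Medallion pattern above, here is a minimal PySpark Structured Streaming sketch for Databricks: Kafka events are parsed into a stream, and each micro-batch is upserted into a Silver Delta table with a MERGE (SCD Type 1). The broker, topic, schema, and paths are hypothetical, `spark` is the predefined Databricks session, and the Silver table is assumed to already exist at the given path:

```python
# Kafka -> parsed stream -> Silver Delta upsert via foreachBatch.
# All names and paths below are illustrative placeholders.
from delta.tables import DeltaTable
from pyspark.sql import functions as F

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
       .option("subscribe", "orders")                     # hypothetical topic
       .load())

# Parse the Kafka value payload with a DDL-style schema string
events = raw.select(
    F.from_json(F.col("value").cast("string"),
                "order_id STRING, amount DOUBLE, updated_at TIMESTAMP").alias("e")
).select("e.*")

def upsert_to_silver(batch_df, batch_id):
    # SCD Type 1: last write wins per order_id
    silver = DeltaTable.forPath(spark, "/delta/silver/orders")
    (silver.alias("t")
           .merge(batch_df.alias("s"), "t.order_id = s.order_id")
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())

# Checkpointing gives recovery and exactly-once micro-batch bookkeeping
(events.writeStream
       .foreachBatch(upsert_to_silver)
       .option("checkpointLocation", "/chk/silver_orders")
       .start())
```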

Posted 3 weeks ago

5.0 - 9.0 years

20 - 30 Lacs

Pune

Hybrid

Source: Naukri

Job Summary:
We are looking for a highly skilled AWS Data Engineer with over 5 years of experience in designing, developing, and maintaining scalable data pipelines on AWS. The ideal candidate will be proficient in data engineering best practices and cloud-native technologies, with hands-on experience in building ETL/ELT pipelines, working with large datasets, and optimizing data architecture for analytics and business intelligence.

Key Responsibilities:
- Design, build, and maintain scalable and robust data pipelines and ETL processes using AWS services (e.g., Glue, Lambda, EMR, Redshift, S3, Athena); see the Glue job skeleton below.
- Collaborate with data analysts, data scientists, and stakeholders to understand data requirements and deliver high-quality solutions.
- Implement data lake and data warehouse architectures, ensuring data governance, data quality, and compliance.
- Optimize data pipelines for performance, reliability, scalability, and cost.
- Automate data ingestion and transformation workflows using Python, PySpark, or Scala.
- Manage and monitor data infrastructure, including logging, error handling, alerting, and performance metrics.
- Leverage infrastructure-as-code tools such as Terraform or AWS CloudFormation for infrastructure deployment.
- Ensure security best practices for data access and storage (IAM, KMS, encryption, etc.).
- Document data processes, architectures, and standards.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Minimum 5 years of experience as a Data Engineer with a focus on AWS cloud services.
- Strong experience building ETL/ELT pipelines using AWS Glue, EMR, Lambda, and Step Functions.
- Proficiency in SQL, Python, PySpark, and data modeling techniques.
- Experience working with data lakes (S3) and data warehouses (Redshift, Snowflake, etc.).
- Experience with Athena, Kinesis, Kafka, or similar streaming data tools is a plus.
- Familiarity with DevOps and CI/CD processes, using tools such as Git, Jenkins, or GitHub Actions.
- Understanding of data privacy, governance, and compliance standards such as GDPR, HIPAA, etc.
- Strong problem-solving and analytical skills, with the ability to work in a fast-paced environment.
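
As a concrete reference point, here is a skeleton of the kind of AWS Glue PySpark job this role describes, using the standard Glue job boilerplate. The S3 paths, columns, and job name are hypothetical placeholders:

```python
# AWS Glue PySpark job skeleton: read raw data from the lake,
# apply simple transforms, write a curated output. Illustrative only.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract from the data lake
df = spark.read.parquet("s3://example-lake/raw/events/")

# Transform: deduplicate and drop rows missing a timestamp
curated = df.dropDuplicates(["event_id"]).filter("event_ts IS NOT NULL")

# Load to a curated prefix for Athena/Redshift Spectrum consumers
curated.write.mode("overwrite").parquet("s3://example-lake/curated/events/")

job.commit()
```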

Posted 3 weeks ago

8.0 - 13.0 years

20 - 27 Lacs

Bengaluru

Work from Office

Source: Naukri

Location: Bengaluru
Interview Mode: Virtual
Experience Required: 8+ years

Mandatory skills:
- Azure Databricks
- Azure Data Factory (ADF)
- PySpark
- Azure Synapse Analytics
- Machine Learning (must have)

If interested, send your CV to jobs.redbats@gmail.com

Posted 3 weeks ago

6.0 - 11.0 years

13 - 23 Lacs

Hyderabad

Work from Office

Source: Naukri

Role & responsibilities:
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the development process and ensure successful project delivery.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Lead the design and development of applications.
- Act as the primary point of contact for the project team.
- Provide guidance and mentorship to junior team members.
- Collaborate with stakeholders to gather requirements and ensure project alignment.
- Ensure timely delivery of high-quality software solutions.

Professional & Technical Skills:
- Must-have skills: Proficiency in PySpark, Python, and Amazon Web Services (AWS).
- Strong understanding of data processing and analysis.
- Experience building scalable and efficient data pipelines.
- Knowledge of cloud computing and deployment on AWS infrastructure.

Preferred candidate profile: Please apply only if you are an immediate joiner and willing to work on a contract-to-hire (C2H) basis.

Posted 3 weeks ago

6.0 - 10.0 years

10 - 12 Lacs

Chennai

Work from Office

Source: Naukri

Responsibilities:
- Ensure data quality through testing and monitoring (see the sketch below).
- Collaborate with cross-functional teams on project delivery.
- Design, develop, and optimize RDBMS solutions.
- Lead ETL processes using Databricks.
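
As a small illustration of data-quality testing in a Databricks ETL step, a hedged sketch of a row-level quality gate run before publishing a table. The table names, columns, and rules are hypothetical, and `spark` is assumed to be a Databricks notebook's predefined session:

```python
# Data-quality gate sketch: count rule violations, fail fast if any,
# otherwise publish the curated table. Names are illustrative.
from pyspark.sql import functions as F

df = spark.read.table("staging.orders")

checks = {
    "null_order_id": df.filter(F.col("order_id").isNull()).count(),
    "negative_amount": df.filter(F.col("amount") < 0).count(),
    "duplicate_ids": df.count() - df.dropDuplicates(["order_id"]).count(),
}

failed = {name: n for name, n in checks.items() if n > 0}
if failed:
    # Surface violations so the pipeline run (and any monitoring) fails loudly
    raise ValueError(f"Data-quality checks failed: {failed}")

df.write.mode("overwrite").saveAsTable("curated.orders")
```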

Posted 3 weeks ago

1.0 - 3.0 years

3 - 5 Lacs

Hyderabad

Work from Office

Source: Naukri

Join Amgen's Mission of Serving Patients

At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission, to serve patients living with serious illnesses, drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease), we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Sr Associate IS Analyst

What you will do:
As a Sr Associate IS Analyst, you will join a collaborative team implementing and supporting the next generation of safety platforms and supporting existing and future technologies. In this role, you will analyze and resolve issues with adverse event data and file transmissions across integrated systems, leveraging data analysis to identify trends, optimize workflows, and prevent future incidents. Collaborating closely with various teams, you will develop insights and implement solutions to improve system performance, ensuring reliable and efficient adverse event flow to critical safety operations.

Roles & Responsibilities:
- Monitor, troubleshoot, and resolve issues related to adverse event distribution processing across multiple systems.
- Conduct detailed investigations into system disruptions, data anomalies, or processing delays, and implement corrective and preventive measures.
- Work closely with internal teams, external vendors, and business partners to address dependencies and resolve bottlenecks for critical issues.
- Design and maintain dashboards, reports, and analytics to monitor system performance and identify trends or areas of improvement.
- Present findings and recommendations to leadership, ensuring data-driven decision-making and clear transparency into system operations.
- Identify inefficiencies and propose data-driven solutions to optimize processes and enhance reliability.
- Collaborate on the development of test plans and scenarios to ensure robust validation of system updates, patches, and new features.
- Perform regression testing to verify that changes do not negatively impact existing system functionality.
- Support the creation and implementation of automated testing frameworks to improve efficiency and consistency.
- Support compliance with Key Control Indicators (KCIs) and contribute to overall process governance.

What we expect of you

Basic Qualifications and Experience:
- Master's degree and 1 to 3 years of experience in Computer Science, IT, or a related field; OR
- Bachelor's degree and 3 to 5 years of experience in Computer Science, IT, or a related field; OR
- Diploma and 7 to 9 years of experience in Computer Science, IT, or a related field.

Functional Skills:

Must-Have Skills:
- Demonstrated expertise in monitoring, troubleshooting, and resolving data and system issues.
- Experienced in database programming using SQL.
- Experience with file transfer processes and handling stuck or delayed files.
- Collaborative spirit and effective communication skills to work seamlessly in a cross-functional team.
- Experienced in Agile methodology.
- Hands-on experience with the ITIL framework.
- Knowledge of the SDLC process, including requirements, design, testing, data analysis, and change control.

Good-to-Have Skills:
- Experience with API integrations (such as MuleSoft) and the Databricks platform.
- Experience with programming languages such as Python.
- Experienced in managing GxP systems and implementing GxP projects.
- Extensive experience with the Software Development Lifecycle (SDLC).
- Knowledge of Artificial Intelligence (AI), Robotic Process Automation (RPA), Machine Learning (ML), Natural Language Processing (NLP), and Natural Language Generation (NLG) automation technologies, with experience building business requirements.
- Familiarity with cloud technologies such as AWS and Azure.
- Ability to explain technical concepts to non-technical clients.
- High-level understanding of pharmacovigilance terminology or R&D IT processes.
- Experience with an existing PV system such as Argus or ArisG is an added advantage.

Professional Certifications:
- SAFe for Teams certification (preferred)

Soft Skills:
- Excellent analytical and troubleshooting skills
- Excellent leadership and strategic thinking abilities
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Ability to deal with ambiguity and think on their feet

What you can expect of us:
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Shift Information:
This individual contributor position requires you to work a later shift and may be assigned a second- or third-shift schedule. Candidates must be willing and able to work during evening or night shifts, as required by business needs.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 4 weeks ago

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Source: Naukri

Role Description:
The Data Engineer is a key contributor to the Clinical Trial Data & Analytics (CTDA) team, driving the development of robust data pipelines and platforms to enable advanced analytics and decision-making. Operating within a SAFe Agile product team, this role ensures system performance, minimizes downtime through automation, and supports the creation of actionable insights from clinical trial data. Collaborating with product owners, architects, and engineers, the Data Engineer will implement and enhance analytics capabilities. Ideal candidates are detail-oriented professionals with strong technical skills, a problem-solving mindset, and a passion for advancing clinical operations through data engineering and analytics.

Roles & Responsibilities:
- Proficiency in developing interactive dashboards and visualizations using Spotfire, Power BI, and Tableau to provide actionable insights.
- Expertise in creating dynamic reports and visualizations that support data-driven decision-making and meet stakeholder requirements.
- Ability to analyze complex datasets and translate them into meaningful KPIs, metrics, and trends.
- Strong knowledge of data visualization best practices, including user-centric design, accessibility, and responsiveness.
- Experience integrating data from multiple sources (databases, APIs, data warehouses) into visualizations.
- Skilled in performance tuning of dashboards and reports to optimize responsiveness and usability.
- Ability to work with end users to define reporting requirements, develop prototypes, and implement final solutions.
- Familiarity with integrating real-time and predictive analytics within dashboards to enhance forecasting capabilities.

Basic Qualifications and Experience:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience.

Functional Skills:

Must-Have Skills:
- Proven hands-on experience with cloud platforms such as AWS, Azure, and GCP.
- Proficiency in Python, PySpark, and SQL, with practical experience in ETL performance tuning.
- Development knowledge in Databricks.
- Strong analytical and problem-solving skills to tackle complex data challenges, with expertise in analytical tools such as Spotfire, Power BI, and Tableau.

Good-to-Have Skills:
- Experience with software engineering best practices, including version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
- Familiarity with SQL/NoSQL databases and vector databases for large language models.
- Familiarity with prompt engineering and model fine-tuning.

Professional Certifications:
- AWS Certified Data Engineer (preferred)
- Databricks Certification (preferred)
- Any SAFe Agile certification (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

Posted 4 weeks ago

2.0 - 5.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Source: Naukri

Sr Associate Software Engineer, Safety ART - Safety Reporting

What you will do:
Let's do this. Let's change the world. In this vital role you will deliver innovative custom solutions for supporting pharmacovigilance (PV) and adhering to regulatory requirements from around the world. You will be an active participant in the team, working directly towards advancing technical features and enhancements of PV business applications, including Machine Learning and Natural Language Processing technologies.

Roles & Responsibilities:
- Write SQL queries to manipulate and visualize data using data visualization tools.
- Ensure the design and development of software solutions meets Amgen's architectural, security, quality, and development guidelines.
- Participate in Agile development ceremonies and practices.
- Develop and deliver robust technology solutions in a regulated environment by collaborating with business partners, information systems (IS) colleagues, and service providers.
- Author documentation for technical specifications and designs that satisfy detailed business and functional requirements.
- Work closely with business and IS teams to find opportunities.
- Craft and build end-to-end solutions using cloud technologies (e.g., Amazon Web Services) and Business Intelligence tools (e.g., Cognos, Tableau, and Spotfire) or other platforms.
- Contribute towards design and rapid Proof-of-Concept (PoC) development efforts for automated solutions that improve efficiency and simplify business processes.

What we expect of you:
We are all different, yet we all use our unique contributions to serve patients. The [vital attribute] professional we seek is a [type of person] with these qualifications.

Basic Qualifications:
- Master's degree with 1 to 3 years of experience in software engineering; OR
- Bachelor's degree with 4 to 5 years of experience in software engineering; OR
- Diploma with 7 to 9 years of experience in software engineering.

Preferred Qualifications:

Must-Have Skills:
- Experience and proficiency with at least one development programming language or technology, such as database SQL and Python.
- Experience with at least one Business Intelligence tool, such as Cognos, Tableau, or Spotfire.
- Familiarity with automation technologies such as UiPath, and a desire to learn and support them.
- Understanding of AWS/cloud storage, hosting, and compute environments.
- Experience in the Software Development Life Cycle (SDLC), including requirements, design, data analysis, testing, and change control.

Good-to-Have Skills:
- Experience with data modelling concepts.
- Understanding of API integrations such as MuleSoft and ETL technologies (e.g., Informatica, Databricks).
- Solid grounding in one or more general programming languages.
- Outstanding written and verbal communication skills, and the ability to explain technical concepts to non-technical clients.
- Experience with, or a demonstrable understanding of, Computer Systems Validation, including FDA 21 CFR Part 11 and GxP compliance.
- Sharp learning agility, problem solving, and analytical thinking.

Professional Certifications:
- Understanding of and experience with Agile methodology and DevOps, or an Agile Software Engineer certification (preferred).

Soft Skills:
- Strong communication and presentation skills
- Ability to work on multiple projects simultaneously
- Expertise in visualizing and manipulating large data sets
- Willingness to learn new technologies
- High learning agility, innovation, and analytical skills

What you can expect of us:
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 4 weeks ago

5.0 - 8.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Source: Naukri

Join Amgen's Mission of Serving Patients

At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission, to serve patients living with serious illnesses, drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease), we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lay within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What you will do:
Let's do this. Let's change the world. In this vital role, as an expert IS Architect, you will lead the design and implementation of integration frameworks for pharmacovigilance (PV) systems spanning both SaaS and internally hosted platforms. This role focuses on building secure, compliant, and scalable architectures to ensure seamless data flow between safety databases, external systems, and analytics platforms, without direct access to backend databases. The ideal candidate will work closely with PV system collaborators, SaaS vendors, and internal IT teams to deliver robust and efficient solutions.

Roles & Responsibilities:
- Design hybrid integration architectures to manage data flows between SaaS-based PV systems, internally hosted systems, and platforms.
- Implement middleware solutions to bridge on-premise and cloud environments, applying an API-first integration design pattern and establishing secure data-exchange mechanisms to ensure data consistency and compliance.
- Work with SaaS providers and internal IT teams to define the integration approach for Extract-Transform-Load (ETL), event-driven architecture, and batch processing.
- Design and maintain end-to-end data flow diagrams and blueprints that consider the unique challenges of hybrid environments.
- Define and enforce data governance frameworks to maintain data quality, integrity, and traceability across integrated systems.
- Lead all aspects of data lifecycle management for both cloud and internally hosted systems to ensure consistency and compliance.
- Act as the main point of contact between pharmacovigilance teams, SaaS vendors, internal IT staff, and other parties to align technical solutions with business goals.
- Ensure alignment with the delivery and platform teams so that applications follow approved Amgen architectural and development guidelines as well as data/software standards.
- Collaborate with analytics teams to ensure timely access to PV data for signal detection, trending, and regulatory reporting.
- Continuously evaluate and improve integration frameworks to adapt to evolving PV requirements, data volumes, and business needs.
- Provide technical guidance and mentorship to junior developers.

Basic Qualifications:
- Master's degree with 4 to 6 years of experience in Computer Science, software development, or a related field; OR
- Bachelor's degree with 6 to 8 years of experience in Computer Science, software development, or a related field; OR
- Diploma with 10 to 12 years of experience in Computer Science, software development, or a related field.

Must-Have Skills:
- Demonstrable experience architecting data pipelines and/or integrations across the technology landscape (SaaS, data lake, internally hosted systems).
- Experience with API integrations such as MuleSoft, and with ETL tools such as the Informatica platform, Snowflake, or Databricks.
- Strong problem-solving skills, particularly in hybrid system integrations.
- Superb communication and stakeholder leadership skills, with the ability to explain technical concepts to non-technical clients.
- Ability to balance technical solutions with business priorities and compliance needs.
- Passion for using technology to improve pharmacovigilance and patient safety.
- Experience with data transfer processes and handling stuck or delayed data files.
- Knowledge of testing methodologies and quality assurance standard processes.
- Proficiency with data analysis and QA tools.
- Understanding of data flows related to regulations such as GDPR and HIPAA.
- Experience with SQL/NoSQL databases, database programming languages, and data modelling concepts.

Good-to-Have Skills:
- Knowledge of the SDLC, including requirements, design, testing, data analysis, and change control.
- Knowledge of reporting tools (e.g., Tableau, Power BI).

Professional Certifications:
- SAFe for Architects certification (preferred)

Soft Skills:
- Excellent analytical skills for evaluating options in ambiguous scenarios
- Excellent leadership and progressive thinking abilities
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to balance multiple priorities
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
- Ability to influence and drive to an intended outcome
- Ability to hold team members accountable to commitments

Shift Information:
This position requires you to work a later shift and may be assigned a second- or third-shift schedule. Candidates must be willing and able to work during evening or night shifts, as required by business needs.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 4 weeks ago

Apply

0.0 - 3.0 years

1 - 4 Lacs

Hyderabad

Work from Office

Naukri logo

Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What you will do

Let’s do this. Let’s change the world. In this vital role you will be part of a collaborative team developing and implementing the next generation of safety platforms and supporting technologies. You will be responsible for designing, developing, and deploying complex software applications, and for mentoring junior developers. You will work closely with the business to develop high-quality, scalable, and maintainable solutions.

Roles & Responsibilities:
Design, develop, and deploy applications to support pharmacovigilance systems.
Develop and maintain solutions that enhance end-to-end (E2E) data reconciliation, ensuring consistency and accuracy across systems handling adverse events and product complaints (a minimal reconciliation sketch follows this posting).
Develop innovative solutions using generative AI technologies, including large language models (LLMs) such as OpenAI GPT, to enhance decision making and drive efficiency.
Using strong rapid prototyping skills, quickly translate concepts into working code.
Conduct code reviews to ensure code quality and consistency with standards.
Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
Identify and resolve technical challenges effectively.
Stay informed about industry developments, emerging trends, and standard practices relevant to systems and processes.
Work with partners to identify and prioritize system enhancements and new functionalities to meet the evolving business needs of drug-safety systems.
Find opportunities for automation and process improvements within the drug-safety ecosystem.
Take overall accountability for the technical implementation aspects of projects, including planning, architecture, design, development, and testing, following the Information Systems Change Control and GxP validation processes.
Collaborate with the delivery and platform teams to ensure that applications are aligned with approved architectural and development guidelines.
Maintain knowledge of market trends and developments in web application development frameworks and related emerging technologies in order to provide, recommend, and deliver best-practice solutions.
Support technical root cause analysis and work with software vendors to resolve pharmacovigilance system issues.
Basic Qualifications and Experience:
Master’s degree with 4 to 6 years of experience in Computer Science, software development, or a related field; OR
Bachelor’s degree with 6 to 8 years of experience in Computer Science, software development, or a related field; OR
Diploma with 10 to 12 years of experience in Computer Science, software development, or a related field

Must-Have Skills:
Experience in database programming languages and data modelling concepts, using SQL and Databricks.
Experience with reporting tools such as Tableau and Power BI.
Experience in one or more general programming languages, including but not limited to Java or Python.
Excellent problem-solving skills and a commitment to resolving challenges.
Knowledge of Artificial Intelligence (AI), Robotic Process Automation (RPA), Machine Learning (ML), Natural Language Processing (NLP), and Natural Language Generation (NLG) automation technologies, and experience building business requirements around them.
Extensive experience with the SDLC.
Collaborative spirit and effective communication to work seamlessly in a multi-functional team.
An ongoing commitment to learning and staying at the forefront of AI/ML advancements.
Experience with Application Programming Interface (API) integrations such as MuleSoft, and with Extract Transform Load (ETL) tools such as Informatica and Databricks.
Outstanding ability to explain technical concepts to non-technical clients.
Knowledge of ITIL processes.
Ability to support the technical implementation aspects of projects, including planning, architecture, design, development, and testing, following the Information Systems (IS) Change Control and GxP validation processes.
Demonstrated ability to make informed technology choices after due diligence and impact assessment.

Good-to-Have Skills:
3+ years of experience with COTS pharmacovigilance platforms (for example, Argus) or other safety databases is a plus.
Experience managing GxP systems and implementing GxP projects.
Experience with cloud technologies such as AWS and Azure.
Experience with DevOps.

Professional Certifications:
Certified SAFe® Agile Software Engineer
SAFe for Teams certification (preferred)

Soft Skills:
Strong verbal and written communication skills
Excellent analytical and problem-solving skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to deal with ambiguity and prioritize
Team-oriented, with a focus on achieving team goals
Ability to influence and drive to an intended outcome
Ability to hold peers accountable to commitments

Shift Information
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status.
We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
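As a small illustration of the E2E reconciliation duty above, the sketch below compares case identifiers between two extracts. The inline frames stand in for real safety-database and warehouse extracts, and the `case_id` key is an assumed column name.

```python
"""Illustrative reconciliation check between two system extracts."""
import pandas as pd

# Stand-ins for the two extracts; real inputs would come from the
# safety database and the downstream warehouse.
source = pd.DataFrame({"case_id": ["C1", "C2", "C3"]})
target = pd.DataFrame({"case_id": ["C1", "C3", "C4"]})

src_ids, tgt_ids = set(source["case_id"]), set(target["case_id"])

missing_downstream = sorted(src_ids - tgt_ids)     # never arrived in the target
unexpected_downstream = sorted(tgt_ids - src_ids)  # no upstream counterpart

print("missing downstream:", missing_downstream)        # -> ['C2']
print("unexpected downstream:", unexpected_downstream)  # -> ['C4']
```

A production check would reconcile counts, keys, and critical fields per load window and alert on any non-empty difference.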

Posted 4 weeks ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Naukri logo

ABOUT AMGEN

Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and to make people’s lives easier, fuller, and longer. We discover, develop, manufacture, and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

What you will do

Let’s do this. Let’s change the world. In this vital role you will deliver innovative custom solutions that support patient safety and adhere to regulatory requirements from around the world. You will be an active participant in the team, working directly on advancing technical features and enhancements of the business applications, including Machine Learning and Natural Language Processing technologies.

Roles & Responsibilities:
Develops and delivers robust technology solutions in a regulated environment by collaborating with business partners, information systems (IS) colleagues, and service providers
Authors documentation for technical specifications and designs that satisfy detailed business and functional requirements
Works closely with business and IS teams to find opportunities
Responsible for crafting and building end-to-end solutions using cloud technologies (e.g., Amazon Web Services) and Business Intelligence tools (e.g., Cognos, Tableau, and Spotfire) or other platforms
Contributes to design and rapid Proof-of-Concept (POC) development efforts for automated solutions that improve efficiency and simplify business processes; quickly and iteratively proves or disproves the concepts being considered
Ensures that the design and development of software solutions meet Amgen architectural, security, quality, and development guidelines
Participates in Agile development ceremonies and practices
Writes SQL queries to manipulate data and visualizes it using data visualization tools (a small SQL-to-BI sketch follows this posting)

What we expect of you
Master’s degree and 1 to 3 years of experience in software engineering OR Bachelor’s degree and 3 to 5 years of experience in software engineering OR Diploma and 7 to 9 years of experience in software engineering

Basic Qualifications:
Experience and proficiency with at least one development programming language/technology, such as database SQL and Python
Experience with at least one Business Intelligence tool such as Cognos, Tableau, or Spotfire
Familiarity with automation technologies such as UiPath, and a desire to learn and support them
Solid understanding of MuleSoft and ETL technologies (e.g., Informatica, Databricks)
Understanding of AWS/cloud storage, hosting, and compute environments is required

Preferred Qualifications:
Experience in database programming languages and data modelling concepts, including Oracle SQL and PL/SQL
Experience with API integrations such as MuleSoft
Solid grounding in one or more general programming languages, including but not limited to Java or Python
Outstanding written and verbal communication skills, and the ability to explain technical concepts to non-technical clients
Sharp learning agility, problem solving, and analytical thinking
Experience managing GxP systems and implementing GxP projects
Extensive expertise in the SDLC, including requirements, design, testing, data analysis, and change control

Professional Certifications:
Understanding of and experience with Agile methodology and DevOps

Soft Skills:
Strong communication and presentation skills
Ability to work on multiple projects simultaneously
Expertise in visualizing and manipulating large data sets
Willingness to learn new technologies
High learning agility, innovation, and analytical skills

Shift Information:
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
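The “SQL queries plus visualization tools” duty can be pictured with a tiny, self-contained sketch: an in-memory SQLite table stands in for the real reporting database, and the aggregated frame is what would feed a Cognos/Tableau/Spotfire extract. Table and column names are invented.

```python
"""Sketch: aggregate with SQL, hand the result to a BI/plotting layer."""
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (product TEXT, severity TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("A", "serious"), ("A", "non-serious"), ("B", "serious")],
)

# The SQL does the manipulation; the resulting frame is BI-tool ready.
df = pd.read_sql_query(
    "SELECT product, COUNT(*) AS n_events FROM events GROUP BY product",
    conn,
)
print(df)
```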

Posted 4 weeks ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Naukri logo

Sr Associate Software Engineer – Tech Enablement Team

What you will do

In this vital and technical role, you will deliver innovative custom solutions that support patient safety and adhere to regulatory requirements from around the world. You will be an active participant in the team, working directly on advancing technical features and enhancements of the business applications, including Machine Learning and Natural Language Processing technologies.

Develops and delivers robust technology solutions in a regulated environment by collaborating with business partners, information systems (IS) colleagues, and service providers
Authors documentation for technical specifications and designs that satisfy detailed business and functional requirements
Works closely with business and IS teams to find opportunities
Responsible for crafting and building end-to-end solutions using cloud technologies (e.g., Amazon Web Services) and Business Intelligence tools (e.g., Cognos, Tableau, and Spotfire) or other platforms (a small cloud-staging sketch follows this posting)
Contributes to design and rapid Proof-of-Concept (POC) development efforts for automated solutions that improve efficiency and simplify business processes; quickly and iteratively proves or disproves the concepts being considered
Ensures that the design and development of software solutions meet Amgen architectural, security, quality, and development guidelines
Participates in Agile development ceremonies and practices
Writes SQL queries to manipulate data and visualizes it using data visualization tools

What we expect of you
Master’s degree with 1 to 2 years of experience in Computer Science, Software Development, IT, or a related field OR Bachelor’s degree with 2 to 4 years of experience in Computer Science, Software Development, IT, or a related field OR Diploma with 5 to 8 years of experience in Computer Science, Software Development, IT, or a related field

Must-Have Skills:
Experience and proficiency with at least one development programming language/technology, such as database SQL and Python
Experience with at least one Business Intelligence tool such as Cognos, Tableau, or Spotfire
Familiarity with automation technologies such as UiPath, and a desire to learn and support them
Solid understanding of MuleSoft and ETL technologies (e.g., Informatica, Databricks)
Understanding of AWS/cloud storage, hosting, and compute environments is required

Good-to-Have Skills:
Experience in database programming languages and data modelling concepts, including Oracle SQL and PL/SQL
Experience with API integrations such as MuleSoft
Solid grounding in one or more general programming languages, including but not limited to Java or Python
Outstanding written and verbal communication skills, and the ability to explain technical concepts to non-technical clients
Sharp learning agility, problem solving, and analytical thinking
Experience managing GxP systems and implementing GxP projects
Extensive expertise in the SDLC, including requirements, design, testing, data analysis, and change control

Certification:
Understanding of and experience with Agile methodology and DevOps

Soft Skills:
Strong communication and presentation skills
Ability to work on multiple projects simultaneously
Expertise in visualizing and manipulating large data sets
Willingness to learn new technologies
High learning agility, innovation, and analytical skills

Shift Information:
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
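One plausible reading of the cloud-technologies duty, sketched under stated assumptions: staging a prepared extract in S3 with boto3. The bucket, key, and file name are placeholders; real credentials, encryption, and bucket policies would come from the platform team’s approved configuration.

```python
"""Sketch: stage a prepared extract in AWS object storage (assumed S3 layout)."""
import boto3

s3 = boto3.client("s3")  # credentials resolved from the environment or an IAM role

# Hypothetical bucket and date-partitioned key, purely for illustration.
s3.upload_file(
    Filename="daily_extract.csv",
    Bucket="example-safety-staging",
    Key="extracts/2024/06/daily_extract.csv",
)
print("extract staged for downstream BI consumption")
```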

Posted 4 weeks ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Naukri logo

ABOUT THE ROLE

You will play a key role in a regulatory submission content automation initiative that will modernize and digitize the regulatory submission process, positioning Amgen as a leader in regulatory innovation. The initiative leverages state-of-the-art technologies, including Generative AI, Structured Content Management, and integrated data, to automate the creation, review, and approval of regulatory content.

The role is responsible for sourcing and analyzing data for this initiative and supports designing, building, and maintaining the data pipelines that drive business actions and automation. This involves working with Operations source systems, finding the right data sources, standardizing data sets, and supporting data governance to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
Ensure a reliable, secure, and compliant operating environment.
Identify, extract, and integrate required business data from Operations systems residing in modern cloud-based architectures.
Design, develop, test, and maintain scalable data pipelines, ensuring data quality via ETL/ELT processes (a minimal pipeline sketch with a quality gate follows this posting).
Schedule and manage workflows to ensure pipelines run on schedule and are monitored for failures.
Implement data integration solutions and manage end-to-end pipeline projects, including scope, timelines, and risk.
Reverse-engineer schemas and explore source system tables to map local representations of target business concepts.
Navigate application UIs and backends to gain business domain knowledge and detect data inconsistencies.
Break down information models into fine-grained, business-contextualized data components.
Work closely with cross-functional teams, including product teams, data architects, and business SMEs, to understand requirements and design solutions.
Collaborate with data scientists to develop pipelines that meet dynamic business needs across regions.
Create and maintain data models, dictionaries, and documentation to ensure accuracy and consistency.
Adhere to SOPs, GDEs, and best practices for coding, testing, and reusable component design.

Basic Qualifications and Experience
Master’s degree and 1 to 3 years of Computer Science, IT, or related field experience OR Bachelor’s degree and 3 to 5 years of Computer Science, IT, or related field experience OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Functional Skills:

Must-Have Skills:
Hands-on experience with data practices, technologies, and platforms such as Databricks, Python, Prophecy, GitLab, and LucidChart
Proficiency in data analysis tools (e.g., SQL) and experience with data sourcing tools
Excellent problem-solving skills and the ability to work with large, complex datasets
Understanding of data governance frameworks, tools, and best practices
Knowledge of and experience with data standards (FAIR) and data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Good-to-Have Skills:
Experience with ETL tools and various Python packages related to data processing and machine learning model development
Strong understanding of data modeling, data warehousing, and data integration concepts
Knowledge of Python/R, Databricks, and cloud data platforms

Professional Certifications
Certified Data Engineer / Data Analyst (Databricks preferred)

Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills
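To make the “scalable pipelines with data quality via ETL/ELT” bullet concrete, here is a minimal PySpark sketch with an explicit quality gate. The data, columns, and output path are invented for illustration; on Databricks the SparkSession would be supplied by the runtime rather than built explicitly.

```python
"""Minimal ETL step with a data-quality gate (illustrative data and paths)."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Stand-in for a raw source extract.
raw = spark.createDataFrame(
    [(" D-1 ", "ok"), ("D-2", "ok"), ("D-2", "ok"), (None, "bad")],
    ["document_id", "status"],
)

# Standardise and deduplicate before publishing to the curated zone.
clean = (
    raw.withColumn("document_id", F.trim("document_id"))
       .dropDuplicates(["document_id"])
)

# Quality gate: quarantine bad rows instead of loading them silently.
bad = clean.filter(F.col("document_id").isNull())
print(f"quarantined {bad.count()} row(s) with null document_id")

good = clean.filter(F.col("document_id").isNotNull())
good.write.mode("overwrite").parquet("/tmp/curated_submissions")  # curated-zone stand-in
```

In a scheduled workflow, the quarantine count would be emitted as a metric so the orchestrator can alert on failures, matching the monitoring duty above.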

Posted 4 weeks ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Naukri logo

Data Platform Engineer

About Amgen
Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and to make people’s lives easier, fuller, and longer. We discover, develop, manufacture, and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

What you will do

Roles & Responsibilities:
Work as a member of a Data Platform Engineering team that uses Cloud and Big Data technologies to design, develop, implement, and maintain solutions supporting functional areas such as Manufacturing, Commercial, Research, and Development.
Work closely with the Enterprise Data Lake delivery and platform teams to ensure that applications are aligned with the overall architectural and development guidelines.
Research and evaluate technical solutions, including Databricks and AWS services, NoSQL databases, and Data Science packages, platforms, and tools, with a focus on enterprise deployment capabilities such as security, scalability, reliability, maintainability, and cost management.
Assist in building and managing relationships with internal and external business stakeholders.
Develop a basic understanding of core business problems and identify opportunities to use advanced analytics.
Assist in reviewing third-party providers for new feature/function/technical fit with EEA's data management needs.
Work closely with the Enterprise Data Lake ecosystem leads to identify and evaluate emerging providers of data management and processing components that could be incorporated into the data platform.
Work with platform stakeholders to ensure effective cost observability and control mechanisms are in place for all aspects of data platform management.
Experience developing in an Agile development environment, comfortable with Agile terminology and ceremonies.
Keen on adopting new responsibilities, facing challenges, and mastering new technologies.

What we expect of you

Basic Qualifications and Experience:
Master’s degree in a computer science or engineering field and 1 to 3 years of relevant experience OR Bachelor’s degree in a computer science or engineering field and 3 to 5 years of relevant experience OR Diploma and 8+ years of relevant work experience

Must-Have Skills:
Experience with Databricks (or Snowflake), including cluster setup, execution, and tuning
Experience with common data processing libraries: Pandas, PySpark, SQLAlchemy (a minimal extraction sketch follows this posting)
Experience with UI frameworks (Angular.js or React.js)
Experience with data lake, data fabric, and data mesh concepts
Experience with data modeling and performance tuning on relational databases
Experience building ETL or ELT pipelines; hands-on experience with SQL/NoSQL
Programming skills in one or more languages: SQL, Python, Java
Experience with software engineering best practices, including but not limited to version control (Git, GitLab), CI/CD (GitLab, Jenkins, etc.), automated unit testing, and DevOps
Exposure to Jira or Jira Align

Good-to-Have Skills:
Knowledge of the R language will be considered an advantage
Experience with cloud technologies, AWS preferred
Cloud certifications: AWS, Databricks, Microsoft
Familiarity with the use of AI for development productivity, such as GitHub Copilot, Databricks Assistant, Amazon Q Developer, or equivalent
Knowledge of Agile and DevOps practices
Skills in disaster recovery planning
Familiarity with load testing tools (JMeter, Gatling)
Basic understanding of AI/ML for monitoring
Knowledge of distributed systems and microservices
Data visualization skills (Tableau, Power BI)
Strong communication and leadership skills
Understanding of compliance and auditing requirements

Soft Skills:
Excellent analytical and problem-solving skills
Excellent written and verbal communication skills (English), translating technology content into business language at various levels
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Strong time and task management skills to estimate and successfully meet project timelines, with the ability to bring consistency and quality assurance across projects

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
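A minimal sketch of the Pandas-plus-SQLAlchemy stack named above, using an in-memory SQLite database as a stand-in for the real platform DSN; the `jobs` table is invented for the example.

```python
"""Sketch: relational extract via SQLAlchemy into Pandas."""
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite:///:memory:")  # stand-in for the real DSN

# Seed a toy table so the sketch is self-contained.
with engine.begin() as conn:
    conn.exec_driver_sql("CREATE TABLE jobs (id INTEGER, status TEXT)")
    conn.exec_driver_sql("INSERT INTO jobs VALUES (1, 'active'), (2, 'retired')")

# Pandas delegates connection handling to the SQLAlchemy engine.
df = pd.read_sql("SELECT id, status FROM jobs WHERE status = 'active'", engine)
print(df)
```

Swapping the connection string is all that changes when pointing the same code at a managed warehouse, which is why this pairing shows up so often in platform work.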

Posted 4 weeks ago

Apply

3.0 - 7.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Naukri logo

ABOUT AMGEN

Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and to make people’s lives easier, fuller, and longer. We discover, develop, manufacture, and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

What you will do

Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. The role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure it is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member assisting in the design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks
Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
Identify and resolve complex data-related challenges
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation

What we expect of you
Master’s degree and 4 to 6 years of Computer Science, IT, or related field experience OR Bachelor’s degree and 6 to 8 years of Computer Science, IT, or related field experience OR Diploma and 10 to 12 years of Computer Science, IT, or related field experience

Basic Qualifications:
Hands-on experience with big data technologies and platforms such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning on big data processing (a small join-tuning sketch follows this posting)
Proficiency in data analysis tools (e.g., SQL)
Proficiency in SQL for extracting, transforming, and analyzing complex datasets from relational data stores
Excellent problem-solving skills and the ability to work with large, complex datasets
Experience with ETL tools such as Apache Spark, and with various Python packages related to data processing and machine learning model development
Strong understanding of data modeling, data warehousing, and data integration concepts
Proven ability to optimize query performance on big data platforms

Preferred Qualifications:
Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing
Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
Strong understanding of data governance frameworks, tools, and best practices
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Professional Certifications:
AWS Certified Data Engineer preferred
Databricks certification preferred

Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
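One concrete instance of the performance-tuning skill this posting asks for, with invented table shapes: broadcasting a small dimension table so Spark avoids a shuffle join. This is a generic Spark technique, sketched here for illustration, not a prescribed approach.

```python
"""Sketch: avoid a shuffle join by broadcasting a small dimension table."""
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

# A large fact table and a tiny dimension table, both invented.
facts = spark.range(1_000_000).withColumnRenamed("id", "product_id")
dims = spark.createDataFrame([(0, "A"), (1, "B")], ["product_id", "product_name"])

# broadcast() ships the small table to every executor, avoiding a shuffle.
joined = facts.join(broadcast(dims), "product_id", "left")
joined.explain()  # plan should show BroadcastHashJoin rather than SortMergeJoin
```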

Posted 4 weeks ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Naukri logo

ABOUT AMGEN

Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and to make people’s lives easier, fuller, and longer. We discover, develop, manufacture, and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today.

What you will do

Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. The role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure it is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member assisting in the design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
Identify and resolve complex data-related challenges
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation

What we expect of you
Master’s degree and 1 to 3 years of Computer Science, IT, or related field experience OR Bachelor’s degree and 3 to 5 years of Computer Science, IT, or related field experience OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Basic Qualifications:
Hands-on experience with big data technologies and platforms such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning on big data processing
Proficiency in data analysis tools (e.g., SQL)
Proficiency in SQL for extracting, transforming, and analyzing complex datasets from relational data stores (a small SQL sketch follows this posting)
Experience with ETL tools such as Apache Spark, and with various Python packages related to data processing and machine learning model development
Strong understanding of data modeling, data warehousing, and data integration concepts
Proven ability to optimize query performance on big data platforms

Preferred Qualifications:
Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing
Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
Strong understanding of data governance frameworks, tools, and best practices
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Professional Certifications:
AWS Certified Data Engineer preferred
Databricks certification preferred

Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
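To picture the “extract, transform, and analyze with SQL” requirement, here is a self-contained window-function example that keeps only the latest version per case; the schema is invented for the sketch.

```python
"""Sketch: deduplicate to the latest record per key with a SQL window function."""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loads (case_id TEXT, version INTEGER, status TEXT);
    INSERT INTO loads VALUES ('C1', 1, 'draft'), ('C1', 2, 'final'),
                             ('C2', 1, 'final');
""")

latest = conn.execute("""
    SELECT case_id, status FROM (
        SELECT case_id, status,
               ROW_NUMBER() OVER (PARTITION BY case_id
                                  ORDER BY version DESC) AS rn
        FROM loads
    ) WHERE rn = 1
    ORDER BY case_id
""").fetchall()
print(latest)  # -> [('C1', 'final'), ('C2', 'final')]
```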

Posted 4 weeks ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Naukri logo

What you will do

Let’s do this. Let’s change the world. In this vital role you are responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. The role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure it is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member assisting in the design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency (a minimal dictionary-validation sketch follows this posting)
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Identify and resolve complex data-related challenges
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
Master’s degree and 1 to 3 years of Computer Science, IT, or related field experience OR Bachelor’s degree and 3 to 5 years of Computer Science, IT, or related field experience OR Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Preferred Qualifications:

Functional Skills:

Must-Have Skills:
Hands-on experience with big data technologies and platforms such as Databricks, Apache Spark (PySpark, SparkSQL), AWS, Redshift, Snowflake, workflow orchestration, and performance tuning on big data processing
Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools
Proficiency in SQL for extracting, transforming, and analyzing complex datasets from relational data stores
Experience with ETL tools such as Apache Spark, and with various Python packages related to data processing and machine learning model development
Proven ability to optimize query performance on big data platforms

Good-to-Have Skills:
Experience with data modeling and performance tuning on relational and graph databases (e.g., MarkLogic, AllegroGraph, Stardog, RDF triplestores)
Strong understanding of data modeling, data warehousing, and data integration concepts
Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing

Professional Certifications:
AWS Certified Data Engineer preferred
Databricks certification preferred

Soft Skills:
Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting

Shift Information:
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

Equal opportunity statement
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com
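As a sketch of the data-dictionary duty flagged above, the snippet below validates a frame against a simple column/dtype dictionary. The dictionary entries and column names are assumptions for illustration, not an actual data model.

```python
"""Sketch: enforce a simple data dictionary during ingestion."""
import pandas as pd

DATA_DICTIONARY = {              # expected column -> dtype, per the assumed model
    "case_id": "object",
    "received_date": "datetime64[ns]",
    "seriousness": "object",
}


def validate(df: pd.DataFrame) -> list[str]:
    """Return human-readable violations instead of failing silently."""
    issues = []
    for col, expected in DATA_DICTIONARY.items():
        if col not in df.columns:
            issues.append(f"missing column: {col}")
        elif str(df[col].dtype) != expected:
            issues.append(f"{col}: expected {expected}, got {df[col].dtype}")
    return issues


df = pd.DataFrame({
    "case_id": ["C1"],
    "received_date": pd.to_datetime(["2024-01-01"]),
    "seriousness": ["serious"],
})
print(validate(df) or "schema OK")
```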

Posted 4 weeks ago

Apply
