
24,121 ETL Jobs - Page 11

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum experience required: 5 years
Educational qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that the applications developed meet both user needs and technical specifications. Your role will require you to balance technical oversight with team management, fostering an environment of innovation and collaboration.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team skills and capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have: proficiency in PySpark.
- Strong understanding of data processing frameworks and distributed computing.
- Experience with data integration and ETL processes.
- Familiarity with cloud platforms and services related to data processing.
- Ability to mentor junior team members and provide technical guidance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in PySpark.
- This position is based in Pune.
- 15 years of full-time education is required.
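As a rough illustration of the kind of PySpark ETL work this role describes, here is a minimal sketch of a distributed extract-transform-load job; the file paths, column names, and aggregation are hypothetical, not taken from the posting.

```python
# Minimal PySpark ETL sketch (hypothetical paths and schema, for illustration only).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-example").getOrCreate()

# Extract: read raw order records from a source location (path is hypothetical).
orders = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Transform: cast types, filter bad rows, and aggregate revenue per customer.
clean = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
)
revenue = clean.groupBy("customer_id").agg(F.sum("amount").alias("total_revenue"))

# Load: write the result as partitioned Parquet for downstream consumers.
revenue.write.mode("overwrite").parquet("/data/curated/customer_revenue")

spark.stop()
```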

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Azure Data Services
Good-to-have skills: NA
Minimum experience required: 5 years
Educational qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also be responsible for maintaining communication with stakeholders to provide updates and gather feedback, ensuring that the applications meet the required specifications and quality standards. Your role will be pivotal in driving the success of the projects you oversee, fostering a collaborative environment, and mentoring team members to enhance their skills and performance.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions to enhance team capabilities and knowledge sharing.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have: proficiency in Microsoft Azure Data Services.
- Strong understanding of cloud computing principles and architecture.
- Experience with data integration and ETL processes.
- Familiarity with application development frameworks and methodologies.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
- This position is based in Pune.
- 15 years of full-time education is required.
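The Azure data-services skills listed above lend themselves to a short sketch. The snippet below moves a file into Azure Blob Storage with the azure-storage-blob Python SDK; the connection string, container, and blob names are placeholders, and this is only one of many ingestion patterns the role might use.

```python
# Minimal Azure Blob Storage ingestion sketch using the azure-storage-blob SDK.
# Connection string, container, and file names are placeholders.
from azure.storage.blob import BlobServiceClient

conn_str = "<your-storage-account-connection-string>"
service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("raw-data")

# Upload a local extract so downstream ETL (e.g., Azure Data Factory) can pick it up.
with open("daily_extract.csv", "rb") as f:
    container.upload_blob(name="ingest/daily_extract.csv", data=f, overwrite=True)

# Read it back to verify the round trip.
blob = container.download_blob("ingest/daily_extract.csv")
print(blob.readall()[:100])
```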

Posted 2 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Data Analytics Lead Analyst is a strategic professional who stays abreast of developments within their own field and contributes to directional strategy by considering their application in their own job and the business. Recognized technical authority for an area within the business. Requires basic commercial awareness. There are typically multiple people within the business who provide the same level of subject matter expertise. Developed communication and diplomacy skills are required in order to guide, influence, and convince others, in particular colleagues in other areas and occasional external customers. Significant impact on the area through complex deliverables. Provides advice and counsel related to the technology or operations of the business. Work impacts an entire area, which eventually affects the overall performance and effectiveness of the sub-function/job family.

Responsibilities:
- Deep hands-on experience with PySpark for data processing, ETL (Extract, Transform, Load) operations, data manipulation, and building distributed computing solutions on large datasets.
- Proficiency in designing and building robust data pipelines and data ingestion, transformation, and processing workflows.
- Solid understanding of data modeling principles and database design, and strong SQL skills for data querying and analysis.
- Ability to analyze data, identify patterns, uncover insights, and translate business needs into actionable data solutions.
- Leading and mentoring a team of data engineers or analysts, fostering best practices, and ensuring the delivery of high-quality data products.
- Working closely with product partners and business analysts to understand requirements and deliver impactful analytical solutions.

Qualifications: To be successful in this role, you should meet the following requirements:
- 8+ years of experience in handling distributed/big data projects.
- Proficiency in PySpark, Linux scripting, SQL, and big data tools.
- Technology stack: PySpark, ETL, Unix shell scripting, Python, Spark, SQL, Impala, Hive.
- Strong exposure to interpreting business requirements from a technical perspective.
- Design, develop, and implement IT solutions that fulfill business users' requirements and conform to a high standard of quality.
- Sound problem-solving skills and attention to detail.
- Strong communication, presentation, and team collaboration skills.
- Knowledge of automation and DevOps practices.
- Familiarity with agile development methodologies using Jira.

Education: Bachelor's/University degree or equivalent experience; potentially a Master's degree.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Data Analytics
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
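To make the PySpark-on-Hive analysis duties above concrete, here is a minimal sketch of a windowed pattern-finding query; the Hive table, columns, and ranking logic are hypothetical.

```python
# Sketch: PySpark analysis over a Hive-style table (table and column names are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("analytics-example").enableHiveSupport().getOrCreate()

trades = spark.table("trading.daily_positions")  # hypothetical Hive table

# Rank each account's largest positions per day, a typical pattern-finding query.
w = Window.partitionBy("account_id", "trade_date").orderBy(F.desc("notional"))
top_positions = (
    trades
    .withColumn("rank", F.row_number().over(w))
    .filter(F.col("rank") <= 5)
)
top_positions.write.mode("overwrite").saveAsTable("analytics.top_positions")
```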

Posted 2 days ago

Apply

12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

At PwC, our teams in Scaled Engineering Services are dedicated to delivering reliable, scalable, and cost-effective technology solutions that enable clients to achieve operational excellence and business agility. These teams apply technical expertise and a strong service-oriented mindset to support the design, development, deployment, and maintenance of enterprise-grade IT systems and applications. Professionals in engineering management roles design and implement technology solutions to meet clients' business needs, leveraging experience in analysing requirements and developing technical designs to enable the successful delivery of solutions.

Driven by a passion for engineering excellence and scalable design, you lead with purpose, guiding teams to deliver high-quality solutions that meet complex client needs. You take ownership of defining reliable solutions, ensuring technical integrity and enabling successful delivery through hands-on leadership and mentorship. By leveraging your expertise in backend systems, cloud-native technologies, and infrastructure best practices, you coach team members, maximize their strengths, and foster a culture of continuous improvement, all while aligning solutions with business goals and delivery commitments.

Skills: Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
- Take ownership of technical solutions and technical delivery, ensuring successful planning, estimation, execution, and deployment of complex IT initiatives.
- Collaborate with project and engineering leads to drive shared accountability for technical quality, delivery timelines, and overall solution outcomes.
- Mentor and guide developers and engineers, sharing solution insights and promoting best practices across the team.
- Use code and design reviews as a platform to elevate team capabilities and encourage continuous technical growth.
- Proactively address technical challenges, leading difficult conversations with clarity and professionalism, and escalating issues when necessary.

The Opportunity: When you join PwC Acceleration Centers (ACs), you step into a pivotal role focused on actively supporting various Acceleration Center services, from Advisory to Assurance, Tax, and Business Services. In our innovative hubs, you'll engage in challenging projects and provide distinctive services to support client engagements through enhanced quality and innovation. You'll also participate in dynamic and digitally enabled training that is designed to grow your technical and professional skills. As part of the Software and Product Innovation team, you will lead the development of modern applications and drive innovative engineering solutions. As a Manager, you will oversee engineering teams, fostering a culture of ownership and continuous improvement while delivering reliable and scalable solutions. This role offers the chance to work with advanced technologies and collaborate with cross-functional teams to deliver impactful projects that align with business goals.
Responsibilities:
- Lead the development and implementation of modern applications
- Supervise engineering teams to promote successful project execution
- Foster a culture of accountability and continuous enhancement
- Collaborate with cross-functional teams to align projects with business objectives
- Utilize advanced technologies to drive innovative engineering solutions
- Assure the delivery of scalable and dependable solutions
- Mentor team members to encourage professional growth and development
- Analyze project outcomes to identify areas for improvement

What You Must Have:
- Bachelor's Degree in Computer Science, Information Technology, or Engineering
- 12 years of experience
- Strong hands-on experience in building modern applications using Java, .NET, Node.js, Python, or similar technologies
- Solid understanding of data engineering concepts such as data pipelines, ETL workflows, and working with large datasets
- Good grasp of cloud-native solutions and experience with platforms like AWS, Azure, or GCP
- Strong understanding of DevOps principles, including CI/CD, infrastructure as code, containerization, and monitoring
- Proven leadership in managing engineering teams and delivering reliable, scalable solutions
- Ability to work in agile environments and manage delivery across sprints and releases
- Familiarity with solution design, API integration, and secure development practices
- Ability to engage with technical and solution architects to understand design goals and contribute practical implementation insights
- Excellent communication, team mentoring, and problem-solving skills
- Oral and written proficiency in English required

What Sets You Apart:
- Bachelor's Degree in Engineering or Technology with relevant specialization
- Proven experience in software delivery ownership
- Overseeing data engineering project implementations
- Guiding DevOps practices for system reliability
- Collaborating with architects on design validation
- Leading engineering teams with technical guidance
- Managing cross-functional coordination for project delivery
- Monitoring project health and delivery metrics
- Driving a culture of accountability and improvement
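Since the role emphasizes data pipelines and ETL workflows, here is a deliberately simple, hypothetical Python sketch of a three-stage pipeline; the endpoint URL and field names are invented for illustration.

```python
# Minimal extract-transform-load pipeline sketch in plain Python.
# The endpoint and field names are hypothetical.
import csv
import json
from urllib.request import urlopen

def extract(url: str) -> list[dict]:
    """Pull raw JSON records from an upstream API."""
    with urlopen(url) as resp:
        return json.load(resp)

def transform(records: list[dict]) -> list[dict]:
    """Keep completed orders and normalize the amount field."""
    return [
        {"order_id": r["id"], "amount": round(float(r["amount"]), 2)}
        for r in records
        if r.get("status") == "completed"
    ]

def load(rows: list[dict], path: str) -> None:
    """Write the cleaned rows to a CSV file for downstream use."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["order_id", "amount"])
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    load(transform(extract("https://api.example.com/orders")), "orders_clean.csv")
```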

Posted 2 days ago

Apply

12.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

At PwC, our teams in Scaled Engineering Services are dedicated to delivering reliable, scalable, and cost-effective technology solutions that enable clients to achieve operational excellence and business agility. These teams apply technical expertise and a strong service-oriented mindset to support the design, development, deployment, and maintenance of enterprise-grade IT systems and applications. Professionals in engineering management roles design and implement technology solutions to meet clients' business needs, leveraging experience in analysing requirements and developing technical designs to enable the successful delivery of solutions.

Driven by a passion for engineering excellence and scalable design, you lead with purpose, guiding teams to deliver high-quality solutions that meet complex client needs. You take ownership of defining reliable solutions, ensuring technical integrity and enabling successful delivery through hands-on leadership and mentorship. By leveraging your expertise in backend systems, cloud-native technologies, and infrastructure best practices, you coach team members, maximize their strengths, and foster a culture of continuous improvement, all while aligning solutions with business goals and delivery commitments.

Skills: Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
- Take ownership of technical solutions and technical delivery, ensuring successful planning, estimation, execution, and deployment of complex IT initiatives.
- Collaborate with project and engineering leads to drive shared accountability for technical quality, delivery timelines, and overall solution outcomes.
- Mentor and guide developers and engineers, sharing solution insights and promoting best practices across the team.
- Use code and design reviews as a platform to elevate team capabilities and encourage continuous technical growth.
- Proactively address technical challenges, leading difficult conversations with clarity and professionalism, and escalating issues when necessary.

The Opportunity: When you join PwC Acceleration Centers (ACs), you step into a pivotal role focused on actively supporting various Acceleration Center services, from Advisory to Assurance, Tax, and Business Services. In our innovative hubs, you'll engage in challenging projects and provide distinctive services to support client engagements through enhanced quality and innovation. You'll also participate in dynamic and digitally enabled training that is designed to grow your technical and professional skills. As part of the Software and Product Innovation team, you will lead the development of modern applications and drive innovative engineering solutions. As a Manager, you will oversee engineering teams, fostering a culture of ownership and continuous improvement while delivering reliable and scalable solutions. This role offers the chance to work with advanced technologies and collaborate with cross-functional teams to deliver impactful projects that align with business goals.
Responsibilities:
- Lead the development and implementation of modern applications
- Supervise engineering teams to promote successful project execution
- Foster a culture of accountability and continuous enhancement
- Collaborate with cross-functional teams to align projects with business objectives
- Utilize advanced technologies to drive innovative engineering solutions
- Assure the delivery of scalable and dependable solutions
- Mentor team members to encourage professional growth and development
- Analyze project outcomes to identify areas for improvement

What You Must Have:
- Bachelor's Degree in Computer Science, Information Technology, or Engineering
- 12 years of experience
- Strong hands-on experience in building modern applications using Java, .NET, Node.js, Python, or similar technologies
- Solid understanding of data engineering concepts such as data pipelines, ETL workflows, and working with large datasets
- Good grasp of cloud-native solutions and experience with platforms like AWS, Azure, or GCP
- Strong understanding of DevOps principles, including CI/CD, infrastructure as code, containerization, and monitoring
- Proven leadership in managing engineering teams and delivering reliable, scalable solutions
- Ability to work in agile environments and manage delivery across sprints and releases
- Familiarity with solution design, API integration, and secure development practices
- Ability to engage with technical and solution architects to understand design goals and contribute practical implementation insights
- Excellent communication, team mentoring, and problem-solving skills
- Oral and written proficiency in English required

What Sets You Apart:
- Bachelor's Degree in Engineering or Technology with relevant specialization
- Proven experience in software delivery ownership
- Overseeing data engineering project implementations
- Guiding DevOps practices for system reliability
- Collaborating with architects on design validation
- Leading engineering teams with technical guidance
- Managing cross-functional coordination for project delivery
- Monitoring project health and delivery metrics
- Driving a culture of accountability and improvement

Posted 2 days ago

Apply

8.0 - 10.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Who we are: Johnson Controls is the global leader for smart, healthy, and sustainable buildings. At Johnson Controls, we've been making buildings smarter since 1885, and our capabilities, depth of innovation experience, and global reach have been growing ever since. Today, we offer the world's largest portfolio of building products, technologies, software, and services; we put that portfolio to work to transform the environments where people live, work, learn, and play. This is where Johnson Controls comes in, helping drive the outcomes that matter most. Through a full range of systems and digital solutions, we make your buildings smarter. A smarter building is safer, more comfortable, more efficient, and, ultimately, more sustainable. Most important, smarter buildings let you focus more intensely on your unique mission. Better for your people. Better for your bottom line. Better for the planet. We're helping to create a healthy planet with solutions that decrease energy use, reduce waste, and make carbon neutrality a reality. Sustainability is a top priority for our company. We have committed to invest 75 percent of new product development R&D in climate-related innovation to develop sustainable products and services. We take sustainability seriously. Achieving net zero carbon emissions before 2040 is just one of our commitments to making the world a better place. Please visit and follow Johnson Controls on LinkedIn for recent exciting activities.

Why JCI: https://www.youtube.com/watch?v=nrbigjbpxkg
Asia-Pacific LinkedIn: https://www.linkedin.com/showcase/johnson-controls-asia-pacific/posts/?feedView=all
Career: The Power Behind Your Mission
OpenBlue: This is How a Space Comes Alive

What will you do?
- Solution Architecture Design: Design scalable and efficient data architectures using Snowflake that meet business needs and best practices
- Implementation: Lead the deployment of Snowflake solutions, including data ingestion, transformation, and visualization processes
- Data Governance & Security: Ensure compliance with global data regulations in accordance with the data strategy and cybersecurity initiatives
- Collaboration: Work closely with data engineers, data scientists, and business stakeholders to gather requirements and provide technical guidance
- Optimization: Monitor and optimize performance, storage, and cost of Snowflake environments, implementing best practices for data modeling and querying
- Integration: Integrate Snowflake with other cloud services and tools (e.g., ETL/ELT tools, BI tools, data lakes) to create seamless data workflows
- Documentation: Create and maintain documentation for architecture designs, data models, and operational procedures
- Training and Support: Provide training and support to teams on Snowflake usage and best practices
- Troubleshooting: Identify and resolve issues related to Snowflake performance, security, and data integrity
- Stay Updated: Keep abreast of Snowflake updates, new features, and industry trends to continually enhance solutions and methodologies
- Assist Data Architects in implementing Snowflake-based data warehouse solutions to support advanced analytics and reporting use cases

Requirements & Qualifications:
- Minimum: Bachelor's/Postgraduate/Master's degree in any stream
- Minimum 8-10 years of relevant experience as a Solutions Architect, Data Architect, or similar role
- Knowledge of the Snowflake data warehouse and understanding of data warehousing concepts, including ELT/ETL processes and data modelling
- Understanding of cloud platforms (AWS, Azure, GCP) and their integration with Snowflake
- Competency in data preparation and/or ETL tools to build and maintain data pipelines and flows
- Strong knowledge of databases, stored procedures (SPs), and optimization of large data sets
- SQL and Power BI/Tableau are mandatory, along with knowledge of any data integration tool
- Excellent communication and collaboration skills
- Strong problem-solving abilities and an analytical mindset
- Ability to work in a fast-paced, dynamic environment

What We Offer: We offer an exciting and challenging position. Joining us, you will become part of a leading global multi-industrial corporation defined by its stimulating work environment and job satisfaction. In addition, we offer outstanding career development opportunities which will stretch your abilities and channel your talents.

Diversity & Inclusion: Our dedication to diversity and inclusion starts with our values. We lead with integrity and purpose, focusing on the future and aligning with our customers' vision for success. Our High-Performance Culture ensures that we have the best talent that is highly engaged and eager to innovate. Our D&I mission elevates each employee's responsibility to contribute to our culture. It's through these contributions that we'll drive the mindsets and behaviors we need to power our customers' missions. You have the power. You have the voice. You have the culture in your hands.
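To make the Snowflake skills above concrete, here is a minimal sketch using the snowflake-connector-python package; the account, credentials, warehouse, and table names are placeholders, and the query is an invented example of a typical warehouse aggregation.

```python
# Minimal Snowflake query sketch using snowflake-connector-python.
# Account, credentials, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # A typical warehouse query: monthly revenue from a fact table.
    cur.execute(
        "SELECT DATE_TRUNC('month', order_date) AS month, SUM(amount) AS revenue "
        "FROM orders GROUP BY 1 ORDER BY 1"
    )
    for month, revenue in cur.fetchall():
        print(month, revenue)
finally:
    conn.close()
```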

Posted 2 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

8+ years of relevant experience in engineering software applications/products.

Technology stack:
- Java 8+, Spring Boot, Spring Core, Spring MVC, Spring Security, REST, microservices
- Databases: Oracle, MongoDB
- Distributed cache: Redis/Hazelcast
- Messaging: Kafka/ActiveMQ, Tibco EMS, IBM MQ
- Operating systems: Linux RHEL 7, Windows
- Familiarity with data pipeline and ETL technologies
- Knowledge of JIRA and CI/CD tools such as Jenkins, Maven, Bitbucket, Git, and uDeploy

Responsibilities:
- Research, design, implement, and manage software programs.
- Coordinate with all stakeholders on the right design, and provide an extensible low-level design with the right separation of concerns and abstractions, keeping the impact of design changes and backward compatibility in mind.
- Write modular, extensible, readable, performant, and secure code; participate in code reviews.
- Keep application security top of mind; follow secure design and architecture standards and practices.
- Create technical solution artifacts, code review records, and deployment plans.
- Troubleshoot cross-component issues and issues identified during static analysis, penetration testing, or deployment; narrow down root causes and resolve the symptoms.
- Use advanced language constructs, design principles, design patterns, the libraries and frameworks in use, the right data structures, and performance and scalability concepts, tools, and tech stacks, applying each appropriately.
- Be conversant with CI/CD.
------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills: Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
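The distributed-cache requirement above follows a well-known pattern. Below is a minimal cache-aside sketch, shown in Python with the redis-py client for consistency with the other examples on this page; the key names, TTL, and database stub are invented, and the same pattern applies in a Java/Spring service.

```python
# Cache-aside pattern sketch with Redis (redis-py). Key names and TTL are arbitrary;
# the same pattern applies in a Java/Spring service using Spring's caching support.
import json
import redis

r = redis.Redis(host="localhost", port=6379, db=0)

def fetch_from_database(user_id: str) -> dict:
    # Stand-in for a real Oracle/MongoDB lookup.
    return {"id": user_id, "name": "example"}

def get_user(user_id: str) -> dict:
    key = f"user:{user_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)          # cache hit
    user = fetch_from_database(user_id)    # cache miss: go to the database
    r.setex(key, 300, json.dumps(user))    # cache for 5 minutes
    return user

print(get_user("42"))
```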

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job title: Analyst - Data & Process Management
Location: Hyderabad/Mumbai
Travel expected: As per business need, if any
Job type: Permanent, full time

About The Job
Our Team: Sanofi Business Operations (SBO) is an internal Sanofi resource organization based in India, set up to centralize processes and activities supporting Specialty Care, Vaccines, General Medicines, CHC, CMO, and R&D, Data & Digital functions. SBO strives to be a strategic and functional partner for tactical deliveries to Medical, HEVA, and Commercial organizations in Sanofi, globally.

Main Responsibilities
The overall purpose and main responsibilities are listed below:
- Ensure the accuracy, security, and accessibility of organizational data.
- Identify and resolve data inconsistencies, redundancies, and quality issues.
- Maintain requirements documents, business rules, and metadata.
- Collaborate across departments to streamline data processes, implement governance frameworks, and provide insights that drive informed decision-making.

People:
- Maintain effective relationships with the end stakeholders within the allocated GBU and tasks, with the objective of developing education and communication content as required.
- Actively lead and develop SBO operations associates and ensure new technologies are leveraged.
- Initiate the contracting process and related documents within defined timelines.
- Collaborate with global stakeholders for project planning, setting up timelines, and maintaining the budget.

Performance:
- Refresh existing reports and identify improvement opportunities in reporting and BI tools, structure, and functionality as the latest insights become available.
- Create dashboards to synthesize and visualize key information and enable business decisions.
- Develop deal-tracking analytics and reporting capabilities.
- Collaborate with Digital to enhance data access across various sources and develop tools, technology, and processes to constantly improve quality and productivity.
- Collect, organize, and maintain datasets to ensure accuracy, completeness, and consistency.
- Monitor data pipelines and ETL processes, and ensure the smooth flow of data across systems.
- Develop and enforce data quality standards, governance policies, and best practices.
- Analyse data to identify trends, patterns, and insights that support decision-making.
- Build and maintain dashboards and reports using BI tools (e.g., Tableau, Power BI).
- Provide ad hoc data analysis for various departments and stakeholders.
- Performance indicators: adherence to timelines and quality targets.

Process:
- Support delivery of projects in terms of resourcing, coordination, quality, timeliness, efficiency, and high technical standards for deliveries made by the medical/field writing group, including scientific documents and clinical/medical reports.
- Contribute to overall quality enhancement by ensuring high scientific standards for the output produced by the medical/field writing group.
- Secure adherence to compliance procedures and internal/operational risk controls in accordance with all applicable standards.
- Refresh reports on a frequency/cycle basis (weekly/monthly/quarterly/annually), with QC checks for each refresh.
- Work cross-functionally to gather requirements, analyse data, and generate insights and reports that can be used by the GBU.

Stakeholder:
- Work closely with global teams and/or external vendors to ensure effective end-to-end project delivery of the designated publication/medical education deliverables, and prioritize work to deliver on time-sensitive requests.
- Provide and defend gathered intelligence, methodology, content, and conclusions to global leadership in a clear, concise format.

About You
Experience: 3+ years of experience in pharma data management, data governance, and data stewardship. In-depth knowledge of common databases such as IQVIA, APLD, SFMC, Google Analytics, and engagement and execution data.

Soft skills: Strong learning agility; ability to manage ambiguous environments and adapt to changing business needs; good interpersonal and communication skills; strong presentation skills a must; a team player who is curious, dynamic, and result-oriented and can work collaboratively; ability to think strategically in an ambiguous environment; ability to operate effectively in an international matrix environment and work across time zones; demonstrated leadership in driving innovation and automation leveraging advanced statistical and analytical techniques.

Technical skills: Bachelor's degree in Life Sciences, Pharmacy, Data Science, Computer Science, or a related field; a master's degree is preferred. 3-5 years of hands-on experience in pharmaceutical data and data management, with a focus on syndicated data, Specialty Pharmacy, and digital/multi-channel data. Strong technical expertise in tools and platforms such as AWS, Snowflake, Databricks, SQL, Python, and Informatica. Solid knowledge of pharmaceutical sales and marketing data sources like IQVIA and Veeva. Familiarity with pharmaceutical sales operations and the application of data within a pharmaceutical commercial operations context. Ability to translate business requirements into detailed data solutions. Familiarity with data governance and stewardship practices, ensuring data quality and compliance. Experience with analytical tools such as Power BI, VBA, and Alteryx is a plus. Ability to contribute to driving innovation and automation by leveraging advanced analytical and statistical techniques.

Education: Advanced degree in Management, Statistics, Decision Sciences, Engineering, Life Sciences, Business Analytics, or a related field (e.g., PhD/MBA/Master's).

Languages: Excellent knowledge of English and strong communication skills, written and spoken.

Why choose us?
- Bring the miracles of science to life alongside a supportive, future-focused team.
- Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally.
- Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact.
- Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs, and at least 14 weeks' gender-neutral parental leave.
- Play an instrumental part in creating best practice within our Go-to-Market Capabilities.

Pursue progress, discover extraordinary. Better is out there. Better medications, better outcomes, better science. But progress doesn't happen without people: people from different backgrounds, in different locations, doing different roles, all united by one thing, a desire to make miracles happen. So, let's be those people. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!

Join Sanofi and step into a new era of science, where your growth can be just as transformative as the work we do. We invest in you to reach further, think faster, and do what's never been done before. You'll help push boundaries, challenge convention, and build smarter solutions that reach the communities we serve. Ready to chase the miracles of science and improve people's lives? Let's Pursue Progress and Discover Extraordinary, together.

At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, protected veteran status, or other characteristics protected by law.
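Given the posting's emphasis on data quality checks and ETL monitoring, here is a small, hypothetical pandas sketch of the kind of validation that might run after each report refresh; the file, column names, and rules are invented for illustration.

```python
# Hypothetical data-quality check after a report refresh, using pandas.
# Column names and thresholds are invented for illustration.
import pandas as pd

df = pd.read_csv("weekly_sales_refresh.csv")

issues = []
if df["account_id"].isna().any():
    issues.append("missing account_id values")
if df.duplicated(subset=["account_id", "week"]).any():
    issues.append("duplicate account/week rows")
if (df["units"] < 0).any():
    issues.append("negative unit counts")

if issues:
    raise ValueError("QC failed: " + "; ".join(issues))
print(f"QC passed: {len(df)} rows validated")
```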

Posted 2 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description: Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support, and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will:
- Carry out software design, Scala and Spark development, and automated testing of new and existing components in an Agile, DevOps, and dynamic environment.
- Promote development standards, code reviews, mentoring, and knowledge sharing.
- Provide production support and troubleshooting.
- Implement tools and processes, handling performance, scale, availability, accuracy, and monitoring.
- Liaise with BAs to ensure that requirements are correctly interpreted and implemented.
- Participate in regular planning and status meetings.
- Provide input to the development process through involvement in sprint reviews and retrospectives.
- Provide input into system architecture and design.
- Perform peer code reviews.

Requirements: To be successful in this role, you should meet the following requirements:
- Scala development and design using Scala 2.10+, or Java development and design using Java 1.8+.
- Experience with most of the following technologies: Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN, Kafka, Hive, Python, ETL frameworks, MapReduce, SQL, RESTful services.
- Sound knowledge of working on the Unix/Linux platform.
- Hands-on experience building data pipelines using Hadoop components: Hive, Spark, Spark SQL.
- Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible and Jenkins), and requirement management in JIRA.
- Understanding of big data modelling techniques using relational and non-relational approaches.
- Experience debugging code issues and communicating the highlighted differences to the development team/architects.
- Experience with time-series/analytics databases such as Elasticsearch.
- Experience with scheduling tools such as Airflow and Control-M.
- Understanding or experience of cloud design patterns.
- Exposure to DevOps and Agile project methodologies such as Scrum and Kanban.
- Experience developing HiveQL and UDFs for analysing semi-structured/structured datasets.

Location: Pune and Bangalore

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working, and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by HSDI
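The Spark Streaming and Kafka requirements above can be illustrated with a brief PySpark Structured Streaming sketch; the broker address, topic name, and checkpoint path are placeholders, and the job assumes the matching spark-sql-kafka package is on the classpath.

```python
# Sketch: consume a Kafka topic with PySpark Structured Streaming.
# Broker, topic, and paths are placeholders. Requires the spark-sql-kafka package.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "trades")
    .load()
)

# Kafka delivers key/value as binary; decode the value and count events per key.
counts = (
    events
    .select(F.col("key").cast("string"), F.col("value").cast("string"))
    .groupBy("key")
    .count()
)

query = (
    counts.writeStream
    .outputMode("complete")
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/trades")
    .start()
)
query.awaitTermination()
```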

Posted 2 days ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

About Blenheim Chalcot: Blenheim Chalcot is the UK's leading digital venture builder, building and investing in innovative technology companies that are transforming industries. For over two decades, they have created a unique ecosystem that provides ventures with shared services, deep operational expertise, and a collaborative network to accelerate growth and success. As a portfolio company of Blenheim Chalcot, Avado benefits from this supportive environment, enabling it to focus on its mission to unlock potential and change lives through transformative learning experiences.

About Avado: Here at Avado, we believe in unlocking potential and changing lives. We are a professional academy that helps people and companies transform and grow through interactive learning experiences in data, finance, people, technology, and marketing. Our connected learning approach combines deep expertise, collaborative learning, scalable technology, and award-winning learning design to deliver learner experiences for people who want real change.

What You'll Do: In the role of Data Engineer, you will play a crucial part in shaping and maintaining Avado's data infrastructure and analytics capabilities. You will design, develop, and support robust ETL processes, maintain high-quality data models, and create insightful reporting visuals using SQL, Power BI, and SSRS. Your expertise will ensure the integrity, accuracy, and optimisation of our data workflows, proactively resolving complex challenges and facilitating effective communication between technical and non-technical stakeholders to support Avado's strategic objectives.

Role Responsibilities:
- Data Architecture & Modelling: Lead the design and implementation of robust, scalable data warehouse solutions on the Azure platform, using the Kimball methodology to create efficient staging and production layers.
- ETL/ELT Development: Design, build, and maintain ETL processes to integrate data from diverse sources, including APIs. Utilise Azure Data Factory and other tools to ensure efficient and reliable data ingestion into Azure SQL Server.
- Reporting & Analytics Support: Build and manage insightful reports and dashboards using Power BI and SSRS. Collaborate with data analysts to support complex SQL queries and ensure seamless data delivery for business intelligence.
- Performance & Optimisation: Analyse and optimise existing data workflows and queries for enhanced performance, scalability, and cost efficiency.
- Data Governance & Quality: Champion data quality, reliability, and integrity. Implement proactive data quality checks, monitoring processes, and data governance policies to maintain high standards.
- Automation & Version Control: Develop automated scripts using Python, PowerShell, or Bash for efficient data extraction and integration. Maintain and manage version control of all workflows and scripts using Git.
- Collaboration & Communication: Work closely with stakeholders to gather requirements, troubleshoot issues, and provide data-driven solutions. Effectively communicate complex technical concepts to both technical and non-technical audiences.
- Documentation: Create and maintain clear, comprehensive documentation for data models, processes, and workflows to support knowledge sharing.

What We're Looking For (Required):
- Experience: A minimum of 3 years of hands-on experience designing, building, and managing data warehouses, with at least 2 years of experience analysing and extracting data from diverse sources (including REST APIs).
- SQL Mastery: Demonstrated mastery of SQL, including the ability to write complex queries and create tables, views, stored procedures, and functions.
- Azure Proficiency: Expertise in the Azure data stack, including Azure Data Factory, Azure SQL Server, Azure DevOps, Key Vault, and Logic Apps.
- ETL Tools: Proven experience building and maintaining ETL pipelines using tools such as Azure Data Factory and SSIS.
- Reporting Tools: Experience developing reports and dashboards with visualisation tools like Power BI and SSRS.
- Scripting: Proficiency in scripting languages such as Python, PowerShell, or Bash for automation and orchestration.
- Generative AI: Knowledge of Generative AI principles and their practical application in data engineering.
- Version Control: Experience using source control tools, particularly Git.
- Core Competencies: Strong data modelling, problem-solving, and performance-tuning skills. Familiarity with Agile workflows and CI/CD pipelines.

Desirable:
- Cloud Cost Management: Understanding of strategies to manage and optimise Azure resource costs.
- Data Quality Tooling: Hands-on experience with tools and practices to monitor and improve data quality within a data pipeline.

What We Can Offer You:
- Be part of the world's leading digital venture builder.
- The opportunity to be a part of, and learn from, the incredibly diverse talent in BC.
- Exposure to the right mix of challenges, within a culture that promotes continuous learning and development, and the opportunity to work with Gen AI.
- A fun and open, if a little cricket-obsessed, atmosphere: we own the Rajasthan Royals IPL team!
- 24 days of annual leave and 10 public holidays.
- Private medical cover for you and your immediate family, and life insurance for yourself.

Important: At Blenheim Chalcot, we strive to create an environment where differences are not only accepted but greatly valued; where everyone can make the most of their capabilities and potential. We promote meritocracy, competence, and the sharing of ideas and opinions. We are driven by data and believe the diversity, agility, generosity, and curiosity of our people is what sets us apart as an organisation. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talent.

By submitting your CV you understand that we have a legitimate interest to use your personal data for the purposes of assessing your eligibility for this role. This means that we may use your personal data to contact you to discuss your CV or arrange an interview, or transfer your CV to the hiring manager(s) of the role you have applied for. You can ask us at any time to remove your CV from our database by emailing recruitment@blenheimchalcot.com, but please note that this means we will no longer consider you for the role you have applied for. You can review our privacy policy here.
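Since the role calls for extracting data from REST APIs with Python scripts, here is a minimal, hypothetical sketch of a paginated API pull; the endpoint, token, and pagination scheme are all invented for illustration.

```python
# Hypothetical paginated REST API extraction using the requests library.
# Endpoint, auth token, and pagination parameters are invented for illustration.
import csv
import requests

BASE_URL = "https://api.example.com/v1/enrolments"
HEADERS = {"Authorization": "Bearer <token>"}

def fetch_all() -> list[dict]:
    """Page through the API until an empty page is returned."""
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL, headers=HEADERS,
            params={"page": page, "per_page": 100}, timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

rows = fetch_all()
if rows:
    with open("enrolments.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=sorted(rows[0]))
        writer.writeheader()
        writer.writerows(rows)
```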

Posted 2 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position Summary

AI & Data: In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science, and cognitive technologies to uncover hidden relationships in vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing as-a-service offerings for continuous insights and improvements

Google Cloud Platform - Data Engineer: Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate it. Our clients' journeys span cloud strategy to implementation, migration of legacy applications to supporting operations of a cloud ecosystem, and everything in between. Deloitte's Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed. You will be working with other technologists to deliver cutting-edge solutions using Google Cloud Platform (GCP) services, programming, and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries, or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies, and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist.

Work you'll do: As a GCP Data Engineer, you will have multiple responsibilities depending on project type. As a Cloud Data Engineer, you will guide customers on how to ingest, store, process, analyze, and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic cloud customers. Together with the team, you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring, and much more.

The key responsibilities may involve some or all of the areas listed below:
- Act as a trusted technical advisor to customers and solve complex Big Data challenges.
- Create and deliver best-practice recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders.
- Identify new tools and processes to improve the cloud platform and automate processes.

Qualifications

Technical Requirements:
- BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience.
- Experience in Cloud SQL and Cloud Bigtable.
- Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub, and Genomics.
- Experience in Google Transfer Appliance, Cloud Storage Transfer Service, and BigQuery Data Transfer.
- Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and data processing algorithms (MapReduce, Flume).
- Experience working with technical customers.
- Experience writing software in one or more languages such as Java, C++, Python, Go, and/or JavaScript.

Consulting Requirements:
- 3-6 years of relevant consulting, industry, or technology experience.
- Strong problem-solving and troubleshooting skills.
- Strong communicator.
- Willingness to travel in case of project requirements.

Preferred Qualifications:
- Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments.
- Experience in technical consulting.
- Experience architecting, developing software, or building internet-scale, production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have).
- Experience working with big data, information retrieval, data mining, or machine learning, as well as experience building multi-tier, high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow).
- Working knowledge of ITIL and/or agile methodologies.

Our purpose: Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300075
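As a concrete taste of the BigQuery work this posting mentions, here is a minimal sketch using the google-cloud-bigquery client library; the project, dataset, table, and query are placeholders invented for illustration.

```python
# Minimal BigQuery query sketch using the google-cloud-bigquery client library.
# Project, dataset, and table names are placeholders; credentials come from the
# environment (e.g., GOOGLE_APPLICATION_CREDENTIALS).
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

sql = """
    SELECT station_id, COUNT(*) AS readings
    FROM `my-gcp-project.sensors.temperature`
    WHERE reading_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY station_id
    ORDER BY readings DESC
    LIMIT 10
"""

# Run the query and iterate over the result rows.
for row in client.query(sql).result():
    print(f"{row.station_id}: {row.readings} readings")
```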

Posted 2 days ago

Apply

3.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Position Summary

AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
- Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
- Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions
- Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements

Google Cloud Platform - Data Engineer
Cloud is shifting business models at our clients and transforming the way technology enables business. As our clients embark on this transformational journey to cloud, they are looking for trusted partners who can help them navigate it. Our clients' journeys span cloud strategy, implementation, migration of legacy applications, support of cloud ecosystem operations and everything in between. Deloitte's Cloud Delivery Center supports our client project teams in this journey by delivering the new solutions by which IT services are obtained, used, and managed. You will work with other technologists to deliver cutting-edge solutions using Google Cloud Platform (GCP) services, programming and automation tools for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new cloud solutions, migrating an application to co-exist in the hybrid cloud, deploying a global cloud application across multiple countries or supporting a set of cloud managed services. Our teams of technologists have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed. You will have an opportunity to leverage the skills you already have, try new technologies and develop skills that will improve your brand and career as a well-rounded, cutting-edge technologist.

Work you'll do
As a GCP Data Engineer you will have multiple responsibilities depending on project type. You will guide customers on how to ingest, store, process, analyze and explore/visualize data on the Google Cloud Platform. You will work on data migrations and transformational projects, and with customers to design large-scale data processing systems, develop data pipelines optimized for scaling, and troubleshoot potential platform issues. In this role you are the Data Engineer working with Deloitte's most strategic Cloud customers. Together with the team you will support customer implementation of Google Cloud products through architecture guidance, best practices, data migration, capacity planning, implementation, troubleshooting, monitoring and much more. The key responsibilities may involve some or all of the areas listed below:
- Act as a trusted technical advisor to customers and solve complex Big Data challenges.
- Create and deliver best-practices recommendations, tutorials, blog articles, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders.
- Identify new tools and processes to improve the cloud platform and automate processes.

Qualifications

Technical Requirements
- BA/BS degree in Computer Science, Mathematics or a related technical field, or equivalent practical experience.
- Experience in Cloud SQL and Cloud Bigtable.
- Experience in Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub and Genomics.
- Experience in Google Transfer Appliance, Cloud Storage Transfer Service and BigQuery Data Transfer.
- Experience with data processing software (such as Hadoop, Kafka, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume).
- Experience working with technical customers.
- Experience writing software in one or more languages such as Java, C++, Python, Go and/or JavaScript.

Consulting Requirements
- 3-6 years of relevant consulting, industry or technology experience
- Strong problem solving and troubleshooting skills
- Strong communicator
- Willingness to travel if required by the project

Preferred Qualifications
- Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools and environments.
- Experience in technical consulting.
- Experience architecting, developing software, or building internet-scale production-grade Big Data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have).
- Experience working with big data, information retrieval, data mining or machine learning, as well as experience building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow).
- Working knowledge of ITIL and/or agile methodologies.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300075
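To make the ingestion work described above concrete, here is a minimal sketch of a batch load from Cloud Storage into BigQuery using the google-cloud-bigquery Python client; the project, bucket, and table names are hypothetical:

```python
# A minimal GCS-to-BigQuery batch load, illustrating the kind of
# ingestion work the role describes. Bucket, dataset, and table
# names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

table_id = "my-project.analytics.daily_events"  # hypothetical table
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,           # skip the CSV header row
    autodetect=True,               # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/events/2024-01-01.csv",  # hypothetical path
    table_id,
    job_config=job_config,
)
load_job.result()  # block until the load job completes

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```

Production pipelines would typically wrap this pattern in Dataflow or an orchestrator with monitoring and retries, but the load-job mechanics are the same.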

Posted 2 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Google Cloud Platform - Data Engineer (Deloitte AI & Data). The position summary, responsibilities, and qualifications are identical to the Deloitte GCP Data Engineer listing above. Requisition code: 300075

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Google Cloud Platform - Data Engineer (Deloitte AI & Data). The position summary, responsibilities, and qualifications are identical to the Deloitte GCP Data Engineer listing above. Requisition code: 300075

Posted 2 days ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Google Cloud Platform - Data Engineer (Deloitte AI & Data). The position summary, responsibilities, and qualifications are identical to the Deloitte GCP Data Engineer listing above. Requisition code: 300075

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Where Data Does More. Join the Snowflake team.

Snowflake's Support team is expanding! We are looking for a Senior Cloud Support Engineer who likes working with data and solving a wide variety of issues, drawing on technical experience across operating systems, database technologies, big data, data integration, connectors, and networking.

Snowflake Support is committed to providing high-quality resolutions to help deliver data-driven business insights and results. We are a team of subject matter experts collectively working toward our customers' success. We form partnerships with customers by listening, learning, and building connections. Snowflake's values are key to our approach and success in delivering world-class Support. Putting customers first, acting with integrity, owning initiative and accountability, and getting it done are Snowflake's core values, which are reflected in everything we do.

As a Senior Cloud Support Engineer, your role is to delight our customers with your passion and knowledge of the Snowflake Data Warehouse. Customers will look to you for technical guidance and expert advice on their effective and optimal use of Snowflake. You will be the voice of the customer regarding product feedback and improvements for Snowflake's product and engineering teams. You will play an integral role in building knowledge within the team and be part of strategic initiatives for organizational and process improvements. Based on business needs, you may be assigned to work with one or more Snowflake Priority Support customers. You will develop a strong understanding of the customer's use case and how they leverage the Snowflake platform. You will deliver exceptional service, enabling them to achieve the highest levels of continuity and performance from their Snowflake implementation.

Ideally, you have worked in a 24x7 environment, handled technical case escalations and incident management, worked in technical support for an RDBMS, been on-call during weekends, and are familiar with database release management.

AS A SENIOR CLOUD SUPPORT ENGINEER AT SNOWFLAKE, YOU WILL:
- Drive technical solutions to complex problems, providing in-depth analysis and guidance to Snowflake customers and partners via email, web, and phone
- Adhere to response and resolution SLAs and escalation processes to ensure fast resolution of customer issues that exceeds expectations
- Demonstrate strong problem-solving skills and a process-oriented approach
- Utilize the Snowflake environment, connectors, third-party partner software, and tools to investigate issues
- Document known solutions to the internal and external knowledge base
- Report well-documented bugs and feature requests arising from customer-submitted requests
- Partner with engineering teams in prioritizing and resolving customer requests
- Participate in a variety of Support initiatives
- Provide support coverage during holidays and weekends based on business needs

OUR IDEAL SENIOR CLOUD SUPPORT ENGINEER WILL HAVE THE FOLLOWING:
- Bachelor's or Master's degree in Computer Science or an equivalent discipline
- 5+ years' experience in a Technical Support environment or a similar technical, customer-facing role
- Excellent written and verbal communication skills in English, with attention to detail
- Ability to reproduce and troubleshoot complex technical issues
- In-depth knowledge of one of the major cloud service providers' ecosystems
- Knowledge of ETL/ELT tools such as AWS Glue, EMR, Azure Data Factory, and Informatica
- Expert working knowledge of internet protocols such as TCP/IP, HTTP/S, SFTP, and DNS, and the ability to use diagnostic tools to troubleshoot connectivity issues
- In-depth understanding of the SSL/TLS handshake and troubleshooting SSL negotiation
- Advanced knowledge of driver configuration and troubleshooting for ODBC, JDBC, Go, and .NET
- High proficiency with system troubleshooting on a variety of operating systems (Windows, Mac, *nix), including many of the following tools: tcpdump, lsof, Wireshark, netstat, sar, perfmon, and Process Explorer
- Debugging experience in Python, Java, or Scala
- Experience with software development principles, including object-oriented programming and version control systems (e.g., Git, GitHub, GitLab)
- Familiarity with Kafka and Spark technologies

NICE TO HAVE:
- Understanding of the data loading/unloading process in Snowflake
- Understanding of Snowflake streams and tasks
- Expertise in database migration processes
- SQL skills, including JOINs, Common Table Expressions (CTEs), and Window Functions
- Experience supporting applications hosted on Amazon AWS or Microsoft Azure
- Familiarity with containerization technologies like Docker and Kubernetes
- Working experience with data visualization tools such as Tableau, Power BI, matplotlib, seaborn, and Plotly
- Experience developing CI/CD components for production-ready data pipelines
- Experience working with big data and/or MPP (massively parallel processing) databases
- Experience with data warehousing fundamentals and concepts
- Database migration and ETL experience
- Familiarity with data manipulation and analysis libraries such as pandas, NumPy, and SciPy
- Knowledge of authentication and authorization protocols (OAuth, JWT, etc.)

SPECIAL REQUIREMENTS:
- Participate in pager duty rotations during nights, weekends, and holidays
- Ability to work the 4th/night shift, which typically starts at 10 pm IST
- Flexibility with schedule changes to meet business needs

Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact?

For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
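As an illustration of the SQL skills called out above (CTEs and window functions) exercised through a driver, here is a minimal sketch using the snowflake-connector-python package; the connection parameters and the orders table are hypothetical:

```python
# A small illustration of CTEs and window functions run through the
# snowflake-connector-python driver. Connection parameters, schema,
# and table names below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="SUPPORT_ANALYST",      # hypothetical credentials
    password="...",
    account="myorg-myaccount",
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)

query = """
WITH ranked_orders AS (           -- common table expression
    SELECT customer_id,
           order_id,
           amount,
           ROW_NUMBER() OVER (    -- window function
               PARTITION BY customer_id
               ORDER BY amount DESC
           ) AS rank_in_customer
    FROM orders
)
SELECT customer_id, order_id, amount
FROM ranked_orders
WHERE rank_in_customer = 1        -- each customer's largest order
"""

try:
    cur = conn.cursor()
    for customer_id, order_id, amount in cur.execute(query):
        print(customer_id, order_id, amount)
finally:
    conn.close()
```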

Posted 2 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Google Cloud Platform - Data Engineer (Deloitte AI & Data). The position summary, responsibilities, and qualifications are identical to the Deloitte GCP Data Engineer listing above. Requisition code: 300075

Posted 2 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

We are seeking a talented and motivated Data Engineer to join our growing data team. You will play a key role in building scalable data pipelines, optimizing data infrastructure, and enabling data-driven solutions.

Primary Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines for batch and real-time data processing
- Build and optimize data models and data warehouses to support analytics and reporting
- Collaborate with analysts and software engineers to deliver high-quality data solutions
- Ensure data quality, integrity, and security across all systems
- Monitor and troubleshoot data pipelines and infrastructure for performance and reliability
- Contribute to internal tools and frameworks to improve data engineering workflows
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
- 5+ years of experience working on commercially available software and/or healthcare platforms as a Data Engineer
- 3+ years of solid experience designing and building Enterprise Data solutions on cloud
- 1+ years of experience developing solutions hosted within public cloud providers such as Azure or AWS, or private cloud/container-based systems using Kubernetes/OpenShift
- Experience with modern relational databases
- Experience with data warehousing services, preferably Snowflake
- Experience using modern software engineering and product development tools, including Agile/SAFe, Continuous Integration, Continuous Delivery, DevOps, etc.
- Solid experience operating in a quickly changing environment and driving technological innovation to meet business requirements
- Skilled at optimizing SQL statements
- Subject matter expertise in cloud technologies (preferably Azure) and the Big Data ecosystem

Preferred Qualifications
- Experience with real-time data streaming and event-driven architectures
- Experience building Big Data solutions on public cloud (Azure)
- Experience building data pipelines on Azure using Databricks (Spark), Scala, Azure Data Factory, Kafka and Kafka Streams, App Services, and Azure Functions
- Experience developing RESTful services in .NET, Java or any other language
- Experience with DevOps in data engineering
- Experience with microservices architecture
- Exposure to DevOps practices and infrastructure-as-code (e.g., Terraform, Docker)
- Knowledge of data governance and data lineage tools
- Ability to establish repeatable processes and best practices, and implement version control software in a cloud team environment

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
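For a flavor of the pipeline work described above, here is a minimal PySpark batch transformation of the kind a Databricks job on Azure might run; the storage paths and column names are hypothetical:

```python
# A minimal PySpark batch transformation (extract, transform, load).
# Storage paths and column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims-daily-agg").getOrCreate()

# Extract: read raw claim records landed by an upstream ingestion job
claims = spark.read.parquet(
    "abfss://raw@datalake.dfs.core.windows.net/claims/"
)

# Transform: keep approved claims, aggregate spend per member per day
daily_spend = (
    claims.filter(F.col("status") == "APPROVED")
          .groupBy("member_id",
                   F.to_date("service_ts").alias("service_date"))
          .agg(F.sum("billed_amount").alias("total_billed"))
)

# Load: write a partitioned table for downstream analytics/reporting
daily_spend.write.mode("overwrite").partitionBy("service_date").parquet(
    "abfss://curated@datalake.dfs.core.windows.net/claims_daily_spend/"
)
```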

Posted 2 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Overview

Job Title: FinOps Tooling Engineer, AVP
Location: Pune, India

Role Description
The DB Cloud FinOps function drives financial accountability for cloud consumption, providing distributed teams with insights into their consumption, spend and optimisation/control options to ensure cloud usage is managed efficiently. We are looking for a meticulous and proactive FinOps Tooling Engineer to support the FinOps tooling and optimization capability. We are seeking an individual with a real passion for technology and a strong technical background in Business Intelligence tools and data analysis. You should also have a track record of excellent problem-solving skills.

What We'll Offer You
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your Key Responsibilities
- Develop and maintain FinOps dashboards and reports in Looker, Cloudability and other Business Intelligence tools to provide clear insights into cloud spend, usage and optimization opportunities
- End-to-end management of the FinOps Business Intelligence platform, including user access, data connections and security settings
- Design and implement data models that enable efficient, scalable analytics in Looker and related platforms
- Automate and improve the reporting process to support forecasting and budgeting
- Develop models to support cost allocation and reallocation across consuming tenants, ensuring accurate chargeback/showback
- Analyze GCP billing data, usage data from Asset Inventory and Monitoring utilization data to assess application teams' GCP consumption, identify wasteful spend and provide actionable insights to optimize spend
- Leverage APIs and ETL processes to ingest and transform cloud cost and usage data into BI tools
- Manage code changes via GitHub workflows and version control

Your Skills and Experience
- Minimum 3 years' experience in Business Intelligence and data analytics
- Experience with Cloudability or other cloud cost management tools for building dashboards and data models
- Experience configuring FinOps tooling (e.g. Cloudability) programmatically by writing and posting JSON/XML to APIs
- Proficiency in SQL and data transformation tools, as well as Python
- Familiarity with cloud platforms such as GCP, AWS and Azure

How We'll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us and Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
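As a rough sketch of what configuring FinOps tooling programmatically by posting JSON to APIs can look like in practice, here is a short Python example; the endpoint, payload shape, and auth scheme are illustrative placeholders, not Cloudability's actual API contract:

```python
# A sketch of configuring a FinOps tool by posting JSON to its REST
# API. The base URL, payload fields, and auth scheme below are
# hypothetical placeholders, not a real vendor's API contract.
import requests

API_BASE = "https://api.example-finops.com/v3"  # hypothetical
API_KEY = "..."                                  # from a secret store

view_definition = {
    "title": "gcp-spend-by-team",
    "filters": [
        {"field": "vendor", "comparator": "==", "value": "GCP"},
    ],
    "sharedWithOrganization": True,
}

resp = requests.post(
    f"{API_BASE}/views",
    json=view_definition,   # serialized as the JSON request body
    auth=(API_KEY, ""),     # basic auth, key as the username
    timeout=30,
)
resp.raise_for_status()     # fail loudly on non-2xx responses
print("Created view:", resp.json()["id"])
```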

Posted 2 days ago

Apply

1.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Description
Want to participate in building the next generation of an online payment system that supports multiple countries and payment methods? Amazon Payment Services (APS) is a leading payment service provider in the MENA region, with operations spanning 8 countries, and offers online payment services to thousands of merchants. The APS team is building a robust payment solution for driving the best payment experience on and off Amazon. Over 100 million customers send tens of billions of dollars through our systems annually. We build systems that process payments at an unprecedented scale with accuracy, speed and mission-critical availability. We innovate to improve customer experience, with support for currency of choice, in-store payments, pay on delivery, credit and debit card payments, seller disbursements and gift cards. Many new exciting and challenging ideas are in the works.

Key job responsibilities
Data Engineers focus on managing data requests, maintaining operational excellence, and enhancing core infrastructure. You will collaborate closely with both technical and non-technical teams to design and execute roadmaps.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI or DataStage

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI MAA 15 SEZ
Job ID: A3049756
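To illustrate the data modeling, warehousing, and ETL-pipeline qualification at its simplest, here is a toy extract-transform-load in Python; sqlite3 stands in for a real warehouse, and the file and column names are hypothetical:

```python
# A toy end-to-end ETL pipeline: extract from CSV, transform in
# Python, load into a SQL table. sqlite3 stands in for a real
# warehouse; file and column names below are hypothetical.
import csv
import sqlite3

# Extract: read raw payment records from a landed file
with open("payments.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: normalize currency codes and drop refunded payments
cleaned = [
    (r["payment_id"], r["currency"].upper(), float(r["amount"]))
    for r in rows
    if r["status"] != "REFUNDED"
]

# Load: create the target table if needed and upsert the rows
conn = sqlite3.connect("warehouse.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS payments (
           payment_id TEXT PRIMARY KEY,
           currency   TEXT NOT NULL,
           amount     REAL NOT NULL
       )"""
)
conn.executemany(
    "INSERT OR REPLACE INTO payments VALUES (?, ?, ?)", cleaned
)
conn.commit()
conn.close()
```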

Posted 2 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Google Cloud Platform - Data Engineer (Deloitte AI & Data). The position summary, responsibilities, and qualifications are identical to the Deloitte GCP Data Engineer listing above. Requisition code: 300075

Posted 2 days ago

Apply

1.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Amazon Payment Services - Data Engineer. The description, key job responsibilities, and qualifications are identical to the APS Data Engineer listing above. Company - ADCI MAA 15 SEZ Job ID: A3049753

Posted 2 days ago

Apply

5.0 years

0 Lacs

Sahibzada Ajit Singh Nagar, Punjab, India

On-site

Position Type: Full time
Type Of Hire: Experienced (relevant combo of work and education)
Education Desired: Bachelor of Computer Science
Travel Percentage: 0%

Are you curious, motivated, and forward-thinking? At FIS you’ll have the opportunity to work on some of the most challenging and relevant issues in financial services and technology. Our talented people empower us, and we believe in being part of a team that is open, collaborative, entrepreneurial, passionate and, above all, fun.

About The Team
The FIS Global team serves a range of clients and provides banking and payment solutions. We support our clients’ requests and implement them on our platform, from onboarding new clients to integrating their systems into our platform solutions.

Location - Mohali
Shift - Day Shift

Job Summary
We are looking for a skilled and motivated Implementation Conversion Analyst Senior to join our IP Data Conversion Team. The ideal candidate will have 3–5 years of experience in data conversion and implementation projects, with strong technical expertise in C#, Angular, and SQL. This role is critical in ensuring seamless client data migration and integration during implementation cycles.

Key Responsibilities
- Lead and execute data conversion tasks for client onboarding and implementation projects
- Develop and maintain tools and scripts using C# and SQL to support data extraction, transformation, and loading (ETL)
- Collaborate with cross-functional teams to gather requirements and ensure accurate data mapping and validation
- Build and maintain front-end tools or dashboards using Angular to support data visualization and reporting
- Troubleshoot and resolve data-related issues during testing and go-live phases
- Document conversion processes, mappings, and technical specifications
- Mentor junior analysts and contribute to process improvements

Required Skills & Qualifications
- Bachelor’s degree in Computer Science, Information Technology, or a related field
- 3–5 years of hands-on experience in data conversion, implementation, or software development roles
- Proficiency in C# for backend development and scripting
- Strong experience with SQL for data manipulation, queries, and performance tuning
- Working knowledge of Angular for front-end development
- Solid understanding of data structures, ETL processes, and system integration
- Strong analytical and problem-solving skills
- Excellent communication and collaboration abilities

Preferred Qualifications
- Experience in the financial services domain
- Familiarity with Agile methodologies and version control systems (e.g., Git)
- Exposure to cloud platforms or data migration tools is a plus

What We Offer You
- An exciting opportunity to be part of the world’s leading FinTech product MNC
- Being part of a vibrant team and building a career in the banking domain
- A multifaceted job with a high degree of responsibility and a broad spectrum of opportunities
- A broad range of professional education and personal development possibilities – FIS is your final career step!

Privacy Statement
FIS is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how FIS protects personal information online, please see the Online Privacy Notice.

Sourcing Model
Recruitment at FIS works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. FIS does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company.

#pridepass
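Data conversion work like the above typically ends with a source-versus-target reconciliation check. The role’s tooling is C# and SQL; purely as an illustration of the pattern, here is a minimal Python sketch using the standard-library sqlite3 module, with hypothetical table and column names.

```python
# Sketch of a conversion reconciliation check: compare row counts and a
# balance checksum between a legacy table and its converted counterpart.
# Table and column names are hypothetical.
import sqlite3


def reconcile(conn: sqlite3.Connection, source: str, target: str) -> dict:
    """Return per-table (row_count, balance_sum) and whether they match."""
    checks = {}
    for table in (source, target):
        row_count, = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
        balance_sum, = conn.execute(
            f"SELECT COALESCE(SUM(balance), 0) FROM {table}"
        ).fetchone()
        checks[table] = (row_count, balance_sum)
    checks["match"] = checks[source] == checks[target]
    return checks


conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE legacy_accounts (id INTEGER, balance REAL);
    CREATE TABLE converted_accounts (id INTEGER, balance REAL);
    INSERT INTO legacy_accounts VALUES (1, 100.0), (2, 250.5);
    INSERT INTO converted_accounts VALUES (1, 100.0), (2, 250.5);
""")
print(reconcile(conn, "legacy_accounts", "converted_accounts"))  # match: True
```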

Posted 2 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description: Automation Tester

As a Test Lead / Architect, you are passionate about seeing customers succeed and about ensuring best-in-class product quality. The objective of the role is end-to-end ownership of testing of the product, acting as the custodian of product quality. With the Digital Engineering team, you will have the opportunity to join a fast-growing team that is embarking on a multi-year implementation as part of an ongoing digital modernization effort. As the project team ramps up, you will have the chance to help define and shape the vision of how the solution will be maintained and monitored to meet the business’ needs.

Experience Level: 5+ years

Roles and Responsibilities:
- Experience with telecom industry applications is a MUST
- Experience in testing CRM, web applications, billing and order management
- Deep experience in system-level debugging (including customer issues), with a good understanding of managing and triaging production-level issues
- Experience with a database query language such as SQL, in order to query and validate production data for analysis; exposure to database/ETL testing
- Understanding of the Microsoft Azure cloud environment
- Responsible for maintaining QE health through facilitated defect management using standardized triage, defect management, and communication processes
- Monitor trends and improvement opportunities using mean time to resolution, defect RCAs, and environment downtime as key performance indicators
- Conduct daily defect reviews with key program stakeholders, including dev, test, product, and support teams, to agree on the path forward and reduce the defect burn-down for all prioritized defects
- Monitor and ensure that overall defect processes are aligned with a single standard defect process across all test phases and teams
- Provide test estimations to leads for intake planning
- Good understanding of API testing, with the ability to perform API testing using relevant tools such as Postman, REST Assured or SoapUI
- Experience with a test/defect management tool (preferably JIRA/Zephyr)
- Partner with other leads and architects for test planning, assignment and reporting
- Monitor chats and host working sessions with impacted teams, ensuring there is a path forward and active investigation on defects until disposition
- Act as the escalation point for disputed defects and their resolution
- Conduct root cause analysis for defects and ensure each defect is closed under the right root-cause category
- Mitigate impediments and foster a work environment that supports high-performing team dynamics, continuous flow of the team’s work, and relentless improvement
- Responsible for designing a holistic test architecture and test solutions in alignment with business requirements and solution specifications
- Work with the Test Data team on test data requirements and fulfillment

Primary / Mandatory skills:
- 5+ years’ experience in product testing, with a minimum of 3+ years of experience in defect management/production validation testing
- Proven experience in testing/defect management and triaging in a fast-paced software development environment
- Experience using defect tracking tools like JIRA and creating reports using Power BI
- Experience in system-level debugging (including customer issues), with a good understanding of managing and triaging production-level issues
- Familiarity with a database query language such as SQL, in order to query and validate production data for analysis
- Exposure to test automation (Selenium)
- Good understanding of API testing, with the ability to perform API testing using relevant tools such as Postman, REST Assured or SoapUI
- Experience with a test/defect management tool (preferably JIRA/Zephyr)
- Expertise in preparing daily status reports/dashboards for management
- Decision maker at entry and exit of internal/development testing stages
- Able to coordinate proactively with the testing team, dev leads and other members
- Expertise in risk identification and analysis
- Proven expertise in agile software development, especially Scrum and Kanban
- Experience in the telecom industry is an added advantage
- Strong written and verbal communication skills

Technical Skills: Selenium, Java, JavaScript, JMeter, REST Assured, SQL, Maven, Eclipse/VS Code, Bitbucket, JIRA, Jenkins, Git/GitHub, DevOps, Postman

Additional information (if any): Willingness to work shift duties; willingness to learn is very important, as AT&T offers an excellent environment to learn digital transformation skills such as cloud.

Weekly Hours: 40
Time Type: Regular
Location: Bangalore, Karnataka, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
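The posting names Postman, REST Assured, and SoapUI for API testing; the same request-and-assert pattern looks like this in a minimal pytest sketch using the Python requests library. The endpoint, payload fields, and status values are hypothetical, not taken from the posting.

```python
# Minimal API-test sketch in pytest style. The service URL, routes, and
# response fields are hypothetical stand-ins for a system under test.
import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test


def test_create_order_returns_201_and_id():
    payload = {"customerId": "C-1001", "plan": "POSTPAID-5G"}
    resp = requests.post(f"{BASE_URL}/orders", json=payload, timeout=10)
    assert resp.status_code == 201
    body = resp.json()
    assert "orderId" in body


def test_get_order_round_trips_fields():
    resp = requests.get(f"{BASE_URL}/orders/ORD-42", timeout=10)
    assert resp.status_code == 200
    order = resp.json()
    assert order["status"] in {"PENDING", "ACTIVE", "CANCELLED"}
```

Run with `pytest` against a reachable test environment; the same checks translate directly into Postman test scripts or REST Assured assertions.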

Posted 2 days ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Senior Manager Software Development Engineering

What You Will Do
Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, developing, and maintaining software applications and solutions that meet business needs, and for ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, and other engineers to create high-quality, scalable software solutions, as well as automating operations, monitoring system health, and responding to incidents to minimize downtime.

Roles & Responsibilities:
- Provide technical leadership to enhance the culture of innovation, automation, and solving difficult scientific and business challenges. Technical leadership includes providing vision and direction to develop scalable, reliable solutions.
- Provide leadership to select right-sized and appropriate tools and architectures based on requirements, data source formats, and current technologies.
- Develop, refactor, research and improve Weave cloud platform capabilities.
- Understand business drivers and technical needs so that our cloud services seamlessly, automatically, and securely provide the best service.
- Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development.
- Build strong partnerships with stakeholders.
- Build data products and service processes that perform data transformation, metadata extraction, workload management and error processing to ensure high-quality data.
- Provide clear, integrated documentation for delivered solutions and processes.
- Collaborate with business partners to understand user stories and ensure the technical solution/build can deliver on those needs.
- Work with multi-functional teams to design and document effective and efficient solutions.
- Develop change management strategies and assist in their implementation.
- Mentor junior data engineers on standard methodologies in the industry and in the Amgen data landscape.

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications and Experience:
- Doctorate degree / Master’s degree / Bachelor’s degree and 12 to 17 years of Computer Science, IT or related field experience

Must-Have Skills:
- Superb communication and interpersonal skills, with the ability to work cross-functionally with multi-functional GTM, product, and engineering teams
- Minimum of 10+ years of overall software engineering or cloud architecture experience
- Minimum of 3+ years in an architecture role using public cloud solutions such as AWS
- Experience with the AWS technology stack

Good-to-Have Skills:
- Familiarity with big data technologies, AI platforms, and cloud-based data solutions
- Ability to work effectively across matrixed organizations and lead collaboration between data and AI teams
- Passion for technology and customer success, particularly in driving innovative AI and data solutions
- Experience working with teams of data scientists, software engineers and business experts to drive insights
- Experience with AWS services such as EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway
- Experience with big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)
- Good understanding of relevant data standards and industry trends
- Ability to understand new business requirements and prioritize them for delivery
- Experience working in the biopharma/life sciences industry
- Proficiency in one of the coding languages (Python, Java, Scala)
- Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.)
- Experience with schema design and dimensional data modeling
- Experience with software DevOps CI/CD tools, such as Git, Jenkins, Linux, and shell scripting
- Hands-on experience using Databricks/Jupyter or a similar notebook environment
- Experience working with GxP systems
- Experience working in an agile environment (i.e., user stories, iterative development, etc.)
- Experience with test-driven development and software test automation
- Experience working in a product environment
- Good overall understanding of business, manufacturing, and laboratory systems common in the pharmaceutical industry, as well as the integration of these systems through applicable standards

Soft Skills:
- Excellent analytical and troubleshooting skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to handle multiple priorities successfully
- Team-oriented, with a focus on achieving team goals

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
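As a rough sketch of the AWS-side ETL orchestration this role touches (S3 and Glue are both listed above), the following Python snippet lands a raw extract in S3, starts a Glue job, and polls until it finishes. It assumes AWS credentials are already configured, and the bucket and job names are hypothetical.

```python
# Sketch of S3 + Glue ETL orchestration with boto3. The bucket name and
# Glue job name are hypothetical; credentials are assumed to be configured.
import time

import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# Land a raw extract in S3 for the Glue job to pick up.
s3.upload_file("daily_extract.csv", "example-data-lake", "raw/daily_extract.csv")

# Kick off the transform job and poll until it reaches a terminal state.
run = glue.start_job_run(JobName="curate-daily-extract")
while True:
    status = glue.get_job_run(JobName="curate-daily-extract", RunId=run["JobRunId"])
    state = status["JobRun"]["JobRunState"]
    if state in {"SUCCEEDED", "FAILED", "STOPPED"}:
        print(f"Glue job finished with state: {state}")
        break
    time.sleep(30)
```

In practice this polling loop would usually be replaced by Step Functions or EventBridge-driven orchestration, but the sketch shows the underlying API calls.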

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
