
1016 ETL Process Jobs - Page 33

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 9.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Role Description
As a Manager of Information Systems at Amgen, you will provide leadership and oversight to the organization's information systems function. You will collaborate with stakeholders to understand business needs, develop IT strategies, manage IT projects, and ensure the effective delivery of information systems solutions. Your strong leadership skills, strategic mindset, and technical expertise will contribute to the alignment of IT initiatives with the company's goals. The candidate should be a self-starter with a strong passion for fostering innovation and excellence in the biotechnology industry. The role requires a solid background in the end-to-end software development lifecycle and Scaled Agile practice, coupled with leadership and transformation experience.

Roles & Responsibilities
- Collaborate with product managers, designers, and engineers to support CTDA operations by automating tasks, monitoring system health, and minimizing downtime through incident response
- Capture the voice of the customer to define business processes and product needs
- Collaborate with CTDA business stakeholders, Architects, and Engineering teams to prioritize release scopes and refine the product backlog
- Facilitate the breakdown of Epics into Features and sprint-sized User Stories, and participate in backlog reviews with the development team
- Clearly express features in User Stories/requirements so all team members and stakeholders understand how they fit into the product backlog
- Ensure Acceptance Criteria and Definition of Done are well-defined
- Stay focused on software development to ensure it meets requirements, providing proactive feedback to stakeholders
- Help develop and maintain a product roadmap that clearly outlines the planned features and enhancements, timelines, and achievements
- Identify and manage risks associated with the systems, requirement validation, and user acceptance
- Develop and maintain documentation of configurations, processes, changes, communication plans, and training plans for end users
- Ensure operational excellence, cybersecurity, and compliance
- Collaborate with geographically dispersed teams, including those in the US and other international locations
- Foster a culture of collaboration, innovation, and continuous improvement

Basic Qualifications and Experience
- Master's degree with 4 to 6 years of experience in Computer Science/Information Systems with Agile Software Development methodologies, OR
- Bachelor's degree with 6 to 8 years of experience in Computer Science/Information Systems with Agile Software Development methodologies, OR
- Diploma with 10 to 12 years of experience in Computer Science/Information Systems with Agile Software Development methodologies

Functional Skills
Must-Have Skills:
- Experience with Agile software development methodologies (Scrum) and SAFe
- Excellent communication skills and the ability to interface with senior leadership with confidence and clarity
- Strong knowledge of ETL processes
- Familiarity with regulatory requirements for Clinical Trials (e.g., 21 CFR Part 11, ICH)

Good-to-Have Skills:
- Experience in managing product features for PI planning and developing product roadmaps and user journeys
- Experience maintaining SaaS (software as a service) solutions and COTS (commercial off-the-shelf) solutions
- Technical thought leadership; able to communicate technical or complex subject matter in business terms
- Experience with data analysis, data modeling, and data visualization solutions such as Tableau and Spotfire

Professional Certifications (preferred or mandatory as noted):
- ITIL (preferred)
- SAFe for Teams certification (preferred)
- Certified Business Analysis Professional (preferred)

Soft Skills:
- Able to work under minimal supervision
- Skilled in providing oversight and mentoring team members; demonstrated ability in effectively delegating work
- Excellent analytical and gap/fit assessment skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

Shift Information
This position operates on the second shift, from 2:00 PM to 10:00 PM IST. Candidates must be willing and able to work during these hours.

Posted 2 months ago

Apply

1.0 - 4.0 years

1 - 5 Lacs

Hyderabad

Work from Office

Role Description
The role leverages domain and business process expertise to detail product requirements as epics and user stories, along with supporting artifacts like business process maps, use cases, and test plans, for the Clinical Trial Data & Analytics (CTDA) Product Team. This role involves working closely with varied business stakeholders across R&D, Data Engineers, Data Analysts, and Testers to ensure that the technical requirements for upcoming development are thoroughly elaborated. This enables the delivery team to estimate, plan, and commit to delivery with high confidence, and to identify test cases and scenarios that ensure the quality and performance of IT systems. You will analyze business requirements and design information systems solutions. You will collaborate with multi-functional teams to understand business needs, identify system enhancements, and drive system implementation projects. Your solid experience in business analysis, system design, and project management will enable you to deliver innovative and effective technology products.

Roles & Responsibilities
- Collaborate with System Architects and Product Owners to manage business analysis activities for CTDA systems, ensuring alignment with engineering and product goals
- Capture the voice of the customer to define business processes and product needs
- Collaborate with CTDA business stakeholders, Architects, and Engineering teams to prioritize release scopes and refine the product backlog
- Facilitate the breakdown of Epics into Features and sprint-sized User Stories, and participate in backlog reviews with the development team
- Clearly express features in User Stories/requirements so all team members and stakeholders understand how they fit into the product backlog
- Ensure Acceptance Criteria and Definition of Done are well-defined
- Stay focused on software development to ensure it meets requirements, providing proactive feedback to stakeholders
- Develop and execute effective product demonstrations for internal and external stakeholders
- Help develop and maintain a product roadmap that clearly outlines the planned features and enhancements, timelines, and achievements
- Identify and manage risks associated with the systems, requirement validation, and user acceptance
- Develop and maintain documentation of configurations, processes, changes, communication plans, and training plans for end users
- Ensure operational excellence, cybersecurity, and compliance
- Collaborate with geographically dispersed teams, including those in the US and other international locations
- Foster a culture of collaboration, innovation, and continuous improvement

Basic Qualifications and Experience
- Master's degree with 4 to 6 years of experience in Computer Science/Information Systems with Agile Software Development methodologies, OR
- Bachelor's degree with 6 to 8 years of experience in Computer Science/Information Systems with Agile Software Development methodologies, OR
- Diploma with 10 to 12 years of experience in Computer Science/Information Systems with Agile Software Development methodologies

Functional Skills
Must-Have Skills:
- Proven ability in translating business requirements into technical specifications and writing user requirement documents
- Experience with Agile software development methodologies (Scrum)
- Excellent communication skills and the ability to interface with senior leadership with confidence and clarity
- Strong knowledge of ETL processes
- Familiarity with regulatory requirements for Clinical Trials (e.g., 21 CFR Part 11, ICH)

Good-to-Have Skills:
- Experience in managing product features for PI planning and developing product roadmaps and user journeys
- Experience maintaining SaaS (software as a service) solutions and COTS (commercial off-the-shelf) solutions
- Technical thought leadership; able to communicate technical or complex subject matter in business terms
- Experience with data analysis, data modeling, and data visualization solutions such as Tableau and Spotfire
- Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.)
- Experience with AWS services (e.g., EC2, S3), Salesforce, Jira, API gateways, etc.

Professional Certifications (preferred or mandatory as noted):
- SAFe for Teams certification (preferred)
- Certified Business Analysis Professional (preferred)

Soft Skills:
- Able to work under minimal supervision
- Skilled in providing oversight and mentoring team members; demonstrated ability in effectively delegating work
- Excellent analytical and gap/fit assessment skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

Shift Information
This position operates on the second shift, from 2:00 PM to 10:00 PM IST. Candidates must be willing and able to work during these hours.

Posted 2 months ago

Apply

8.0 - 12.0 years

12 - 16 Lacs

Hyderabad

Work from Office

What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for ensuring the proper management, governance, quality, and accessibility of an organization's data. You will ensure data is trustworthy, well-organized, and effectively used, playing a crucial role in enabling data-driven decision-making and maintaining regulatory compliance. The ideal candidate has deep technical skills, experience with Syniti Knowledge Platform (SKP) technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and manage data migration solutions using the Syniti Knowledge Platform
- Configure and optimize Syniti Data Replication (SDR) and Syniti Advanced Data Migration (ADM) solutions
- Collaborate with team members to understand data governance and quality requirements and implement solutions to address them
- Monitor and improve data accuracy, completeness, consistency, and reliability
- Create and maintain metadata repositories for easy data identification and understanding
- Manage access controls to ensure authorized personnel can access data
- Safeguard sensitive and confidential data from unauthorized access
- Work with data owners, data analysts, and IT teams to align data practices with business objectives
- Act as a liaison between technical teams and business users to ensure data meets organizational needs
- Develop and maintain documentation for data processes, standards, and workflows
- Investigate and resolve data-related issues and discrepancies
- Develop and maintain metrics to track data quality and stewardship effectiveness

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience, OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience, OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Preferred Qualifications:
- Solid understanding of Syniti's Data Migration methodology
- Strong proficiency in Syniti's tools (ADM, Data Quality, MDM, Mass Maintenance, and SDR), working to prescribed development standards
- Good understanding of business processes for one or more assigned functional areas, and the ability to quickly learn about new areas
- Strong knowledge of the underlying technical data structures and definitions of the assigned functional process area
- Ability to develop data migration rules and reports in Syniti's tools per supplied specifications
- Ability to evaluate and contribute to ongoing data migration design for the assigned process area through data analysis, reporting, and collaboration with on-site Syniti and client colleagues
- Ability to create and maintain data load programs
- Experience with SAP Advanced Data Migration and Management (ADMM)
- Ability to maintain related documentation and clearly communicate status to Syniti team leadership and the customer PMO
- Execution of end-to-end data loads and reconciliations
- Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools
- Database experience (SQL Server, Oracle, etc.)

Good-to-Have Skills:
- Experience in ETL / Data Migration / Data Quality / Data Analysis
- ERP techno-functional or functional experience, SAP preferred
- Experience with SAP MDG Business Partner (Customer & Supplier), Material, and Finance objects
- Successful participation in at least one full ERP implementation lifecycle
- Working knowledge of a SAFe development environment
- Understanding of enterprise data strategy, data governance, and data infrastructure
- Working knowledge of cloud (e.g., AWS) and on-premises compute infrastructure

Professional Certifications:
- Agile Certified Practitioner (preferred)
- Syniti Certified Developer certification (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Good communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment.
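The "end-to-end data loads and reconciliations" mentioned in this posting can be sketched in miniature. The following is an illustrative Python example, not Syniti's actual tooling: SQLite stands in for the real source and target databases, and the table and column names are hypothetical.

```python
import sqlite3

def reconcile_counts(src_conn, tgt_conn, table):
    """Compare row counts between a source and target table after a load.

    Returns both counts and whether they match. In a real migration the
    connections would point at the actual source and target RDBMS, and
    the table name would be validated rather than interpolated.
    """
    src = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt = tgt_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return {"table": table, "source": src, "target": tgt, "match": src == tgt}

# Demo with in-memory databases: one row is "lost" during the load.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE materials (id INTEGER, name TEXT)")
src.executemany("INSERT INTO materials VALUES (?, ?)",
                [(1, "MAT-001"), (2, "MAT-002"), (3, "MAT-003")])
tgt.executemany("INSERT INTO materials VALUES (?, ?)",
                [(1, "MAT-001"), (2, "MAT-002")])

result = reconcile_counts(src, tgt, "materials")
print(result)  # source=3, target=2, match=False
```

A count check like this is only the first reconciliation tier; real migration projects typically also compare checksums or field-level values per business object.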

Posted 2 months ago

Apply

2.0 - 5.0 years

13 - 17 Lacs

Hyderabad

Work from Office

What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and implementing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member that assists in design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Adhere to standard methodologies for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience, OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience, OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Preferred Qualifications:
Functional Skills
Must-Have Skills:
- Proficiency in Python, PySpark, and Scala for data processing and ETL (Extract, Transform, Load) workflows, with hands-on experience using Databricks to build ETL pipelines and handle big data processing
- Experience with data warehousing platforms such as Amazon Redshift or Snowflake
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
- Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets
- Experience with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps

Good-to-Have Skills:
- Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena)
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Understanding of machine learning pipelines and frameworks for ML/AI models

Professional Certifications:
- AWS Certified Data Engineer (preferred)
- Databricks Certified (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com
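As a rough illustration of the ETL responsibilities this posting describes, here is a minimal extract-transform-load sketch in plain Python with SQLite. A production pipeline on the posting's stack (PySpark/Databricks loading into Redshift or Snowflake) would follow the same extract-validate-load shape at much larger scale; the record fields and cleaning rules here are invented for the example.

```python
import sqlite3

# Extract: in practice this would read from a source system, files, or S3;
# a small in-memory list of raw string records stands in here.
raw_rows = [
    {"id": "1", "amount": " 120.50", "region": "emea"},
    {"id": "2", "amount": "80",      "region": "AMER"},
    {"id": "3", "amount": "bad",     "region": "apac"},  # dirty record
]

def transform(row):
    """Clean and type one record; return None if it fails validation."""
    try:
        amount = float(row["amount"].strip())
    except ValueError:
        return None  # a real pipeline would route this to a quarantine table
    return (int(row["id"]), amount, row["region"].strip().upper())

# Load: write only the valid rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL, region TEXT)")
clean = [t for t in (transform(r) for r in raw_rows) if t is not None]
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)

loaded = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
print(f"loaded {loaded} of {len(raw_rows)} rows")  # loaded 2 of 3 rows
```

Keeping the transform a pure function, as here, is what makes the "automated unit testing" requirement in the posting practical: each cleaning rule can be tested without a database.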

Posted 2 months ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

What you will do
Let's do this. Let's change the world. In this vital role you will deliver innovative custom solutions that support patient safety and adhere to regulatory requirements from around the world. You will be an active participant in the team, working directly on advancing technical features and enhancements of the business applications, including Machine Learning and Natural Language Processing technologies.

Roles & Responsibilities:
- Develops and delivers robust technology solutions in a regulated environment by collaborating with business partners, information systems (IS) colleagues, and service providers
- Authors documentation for technical specifications and designs that satisfy detailed business and functional requirements
- Works closely with business and IS teams to find opportunities
- Responsible for crafting and building end-to-end solutions using cloud technologies (e.g., Amazon Web Services) and Business Intelligence tools (e.g., Cognos, Tableau, and Spotfire) or other platforms
- Contributes to design and rapid Proof-of-Concept (POC) development efforts for automated solutions that improve efficiency and simplify business processes; quickly and iteratively proves or disproves the concepts being considered
- Ensures that design and development of software solutions meet Amgen architectural, security, quality, and development guidelines
- Participates in Agile development ceremonies and practices
- Writes SQL queries to manipulate and visualize data using data visualization tools

What we expect of you
- Master's degree and 1 to 3 years of experience in software engineering, OR
- Bachelor's degree and 3 to 5 years of experience in software engineering, OR
- Diploma and 7 to 9 years of experience in software engineering

Basic Qualifications:
- Experience and proficiency with at least one development programming language/technology, such as database SQL and Python
- Experience with at least one Business Intelligence tool such as Cognos, Tableau, or Spotfire
- Familiarity with automation technologies such as UiPath, and a desire to learn and support them
- Solid understanding of MuleSoft and ETL technologies (e.g., Informatica, Databricks)
- Understanding of AWS/cloud storage, hosting, and compute environments is required

Preferred Qualifications:
- Experience with database programming languages and data modelling concepts, including Oracle SQL and PL/SQL
- Experience with API integrations such as MuleSoft
- Solid understanding of one or more general programming languages, including but not limited to Java or Python
- Outstanding written and verbal communication skills, and the ability to explain technical concepts to non-technical clients
- Sharp learning agility, problem solving, and analytical thinking
- Experience managing GxP systems and implementing GxP projects
- Extensive expertise in the SDLC, including requirements, design, testing, data analysis, and change control

Professional Certifications:
- Understanding and experience with Agile methodology and DevOps

Soft Skills:
- Strong communication and presentation skills
- Ability to work on multiple projects simultaneously
- Expertise in visualizing and manipulating large data sets
- Willingness to learn new technologies
- High learning agility, innovation, and analytical skills

Shift Information:
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
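The "write SQL queries to manipulate and visualize data" responsibility in this posting might look like the following sketch, which shapes raw rows into the tidy, aggregated result set a BI tool such as Tableau or Spotfire would consume. SQLite and the table/column names are illustrative assumptions, not the actual systems or schema.

```python
import sqlite3

# Hypothetical safety-case table; a BI dashboard would chart the
# aggregated output rather than the raw rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (id INTEGER, product TEXT, serious INTEGER)")
conn.executemany("INSERT INTO cases VALUES (?, ?, ?)", [
    (1, "ProductA", 1), (2, "ProductA", 0),
    (3, "ProductB", 1), (4, "ProductB", 1),
])

# Aggregate to one row per product: total cases, serious cases,
# and the percentage serious, ready for visualization.
rows = conn.execute("""
    SELECT product,
           COUNT(*)     AS total_cases,
           SUM(serious) AS serious_cases,
           ROUND(100.0 * SUM(serious) / COUNT(*), 1) AS pct_serious
    FROM cases
    GROUP BY product
    ORDER BY product
""").fetchall()

for row in rows:
    print(row)
# ('ProductA', 2, 1, 50.0)
# ('ProductB', 2, 2, 100.0)
```

Multiplying by 100.0 (a float) before dividing avoids the integer division that would otherwise truncate the percentage to 0.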

Posted 2 months ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Sr Associate Software Engineer – Tech Enablement Team

What you will do
In this vital and technical role, you will deliver innovative custom solutions that support patient safety and adhere to regulatory requirements from around the world. You will be an active participant in the team, working directly on advancing technical features and enhancements of the business applications, including Machine Learning and Natural Language Processing technologies.

Roles & Responsibilities:
- Develops and delivers robust technology solutions in a regulated environment by collaborating with business partners, information systems (IS) colleagues, and service providers
- Authors documentation for technical specifications and designs that satisfy detailed business and functional requirements
- Works closely with business and IS teams to find opportunities
- Responsible for crafting and building end-to-end solutions using cloud technologies (e.g., Amazon Web Services) and Business Intelligence tools (e.g., Cognos, Tableau, and Spotfire) or other platforms
- Contributes to design and rapid Proof-of-Concept (POC) development efforts for automated solutions that improve efficiency and simplify business processes; quickly and iteratively proves or disproves the concepts being considered
- Ensures that design and development of software solutions meet Amgen architectural, security, quality, and development guidelines
- Participates in Agile development ceremonies and practices
- Writes SQL queries to manipulate and visualize data using data visualization tools

What we expect of you
- Master's degree with 1 to 2 years of experience in Computer Science, Software Development, IT, or a related field, OR
- Bachelor's degree with 2 to 4 years of experience in Computer Science, Software Development, IT, or a related field, OR
- Diploma with 5 to 8 years of experience in Computer Science, Software Development, IT, or a related field

Must-Have Skills:
- Experience and proficiency with at least one development programming language/technology, such as database SQL and Python
- Experience with at least one Business Intelligence tool such as Cognos, Tableau, or Spotfire
- Familiarity with automation technologies such as UiPath, and a desire to learn and support them
- Solid understanding of MuleSoft and ETL technologies (e.g., Informatica, Databricks)
- Understanding of AWS/cloud storage, hosting, and compute environments is required

Good-to-Have Skills:
- Experience with database programming languages and data modelling concepts, including Oracle SQL and PL/SQL
- Experience with API integrations such as MuleSoft
- Solid understanding of one or more general programming languages, including but not limited to Java or Python
- Outstanding written and verbal communication skills, and the ability to explain technical concepts to non-technical clients
- Sharp learning agility, problem solving, and analytical thinking
- Experience managing GxP systems and implementing GxP projects
- Extensive expertise in the SDLC, including requirements, design, testing, data analysis, and change control

Certification:
- Understanding and experience with Agile methodology and DevOps

Soft Skills:
- Strong communication and presentation skills
- Ability to work on multiple projects simultaneously
- Expertise in visualizing and manipulating large data sets
- Willingness to learn new technologies
- High learning agility, innovation, and analytical skills

Shift Information:
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 2 months ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE
You will play a key role in a regulatory submission content automation initiative that will modernize and digitize the regulatory submission process, positioning Amgen as a leader in regulatory innovation. The initiative leverages state-of-the-art technologies, including Generative AI, Structured Content Management, and integrated data, to automate the creation, review, and approval of regulatory content.

The role is responsible for sourcing and analyzing data for this initiative and for supporting the design, build, and maintenance of the data pipelines that drive business actions and automation. This role involves working with Operations source systems, finding the right data sources, standardizing data sets, and supporting data governance to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities
- Ensure a reliable, secure, and compliant operating environment
- Identify, extract, and integrate required business data from Operations systems residing in modern cloud-based architectures
- Design, develop, test, and maintain scalable data pipelines, ensuring data quality via ETL/ELT processes
- Schedule and manage workflows to ensure pipelines run on schedule and are monitored for failures
- Implement data integration solutions and manage end-to-end pipeline projects, including scope, timelines, and risk
- Reverse-engineer schemas and explore source system tables to map local representations of target business concepts
- Navigate application UIs and backends to gain business domain knowledge and detect data inconsistencies
- Break down information models into fine-grained, business-contextualized data components
- Work closely with cross-functional teams, including product teams, data architects, and business SMEs, to understand requirements and design solutions
- Collaborate with data scientists to develop pipelines that meet dynamic business needs across regions
- Create and maintain data models, dictionaries, and documentation to ensure accuracy and consistency
- Adhere to SOPs, GDEs, and best practices for coding, testing, and reusable component design

Basic Qualifications and Experience
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience, OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience, OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Functional Skills
Must-Have Skills:
- Hands-on experience with data practices, technologies, and platforms such as Databricks, Python, Prophecy, GitLab, Lucidchart, etc.
- Proficiency in data analysis tools (e.g., SQL) and experience with data sourcing tools
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Understanding of data governance frameworks, tools, and best practices
- Knowledge of and experience with data standards (FAIR) and data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Good-to-Have Skills:
- Experience with ETL tools and various Python packages related to data processing and machine learning model development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Knowledge of Python/R, Databricks, and cloud data platforms

Professional Certifications
- Certified Data Engineer / Data Analyst (preferred, on Databricks)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills
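The "ensuring data quality via ETL/ELT processes" responsibility in this posting can be sketched as a set of rule-based checks run before a dataset is published. The record fields and rules below are illustrative assumptions for the example, not Amgen's actual submission data model or quality standards.

```python
# Hypothetical submission-document records; one is missing a country
# and one has a duplicate id and an invalid page count.
records = [
    {"doc_id": "SUB-001", "country": "US", "pages": 42},
    {"doc_id": "SUB-002", "country": "",   "pages": 17},
    {"doc_id": "SUB-001", "country": "DE", "pages": -3},
]

def run_checks(rows):
    """Return (row_index, dimension, message) for every rule violation.

    The three rules cover common data-quality dimensions:
    completeness, validity, and uniqueness.
    """
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        if not row["country"]:
            issues.append((i, "completeness", "country is empty"))
        if row["pages"] <= 0:
            issues.append((i, "validity", "pages must be positive"))
        if row["doc_id"] in seen:
            issues.append((i, "uniqueness", f"duplicate doc_id {row['doc_id']}"))
        seen.add(row["doc_id"])
    return issues

for issue in run_checks(records):
    print(issue)
# (1, 'completeness', 'country is empty')
# (2, 'validity', 'pages must be positive')
# (2, 'uniqueness', 'duplicate doc_id SUB-001')
```

In a scheduled pipeline, a non-empty issue list would typically fail the run or route the offending rows to a quarantine location for review, which is also how pipeline monitoring for failures gets its signal.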

Posted 2 months ago

Apply

3.0 - 7.0 years

4 - 7 Lacs

Hyderabad

Work from Office

ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today. What you will do Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Design, develop, and maintain data solutions for data generation, collection, and processing Be a key team member that assists in design and development of the data pipeline Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency Implement data security and privacy measures to protect sensitive data Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions Collaborate and communicate effectively with product
teams Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions Identify and resolve complex data-related challenges Adhere to best practices for coding, testing, and designing reusable code/components Explore new tools and technologies that will help to improve ETL platform performance Participate in sprint planning meetings and provide estimations on technical implementation What we expect of you Master’s degree and 4 to 6 years of Computer Science, IT or related field experience OR Bachelor’s degree and 6 to 8 years of Computer Science, IT or related field experience OR Diploma and 10 to 12 years of Computer Science, IT or related field experience Basic Qualifications: Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning on big data processing Proficiency in data analysis tools (e.g., SQL) Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores Excellent problem-solving skills and the ability to work with large, complex datasets Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development Strong understanding of data modeling, data warehousing, and data integration concepts Proven ability to optimize query performance on big data platforms Preferred Qualifications: Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms Strong understanding of data governance frameworks, tools, and best practices.
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA) Professional Certifications: AWS Certified Data Engineer preferred Databricks Certificate preferred Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated presentation skills EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
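The "optimize query performance" qualification above is, at its core, about making the engine use an index instead of a full scan. A stdlib sketch of that idea, using SQLite's EXPLAIN QUERY PLAN (the posting targets Spark/Snowflake, which expose the same concept through their own EXPLAIN facilities; plan wording varies by SQLite version, and the table is invented):

```python
import sqlite3

def plan_for(conn, sql):
    """Return SQLite's query plan as a single string."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(str(r) for r in rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)",
                 [(i, "EU" if i % 2 else "US", i * 1.5) for i in range(1000)])

query = "SELECT SUM(amount) FROM events WHERE region = 'EU'"
before = plan_for(conn, query)   # no index yet: plan shows a full table scan
conn.execute("CREATE INDEX idx_region ON events(region)")
after = plan_for(conn, query)    # planner now searches via idx_region
print("idx_region" in before, "idx_region" in after)
```

The same discipline applies on big data platforms, where the equivalents are partition pruning, clustering keys, and broadcast-vs-shuffle join choices rather than B-tree indexes.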

Posted 2 months ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world’s toughest diseases, and make people’s lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what’s known today. What you will do Let’s do this. Let’s change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Design, develop, and maintain data solutions for data generation, collection, and processing Be a key team member that assists in design and development of the data pipeline Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency Implement data security and privacy measures to protect sensitive data Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions Collaborate and communicate effectively with product teams Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions Identify and resolve complex data-related
challenges Adhere to best practices for coding, testing, and designing reusable code/components Explore new tools and technologies that will help to improve ETL platform performance Participate in sprint planning meetings and provide estimations on technical implementation What we expect of you Master’s degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor’s degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience Basic Qualifications: Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning on big data processing Proficiency in data analysis tools (e.g., SQL) Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development Strong understanding of data modeling, data warehousing, and data integration concepts Proven ability to optimize query performance on big data platforms Preferred Qualifications: Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms Strong understanding of data governance frameworks, tools, and best practices.
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA) Professional Certifications: AWS Certified Data Engineer preferred Databricks Certificate preferred Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated presentation skills EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
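The "workflow orchestration" qualification above boils down to running pipeline tasks with retries and a record of every attempt. A toy stand-in for what a real orchestrator (e.g., Databricks Workflows or Airflow) provides; the function and task names are invented for illustration:

```python
import time

def run_with_retries(task, max_attempts=3, delay_s=0.0):
    """Run a pipeline task, retrying on failure and recording each attempt,
    mimicking the retry/monitoring an orchestrator gives you for free."""
    attempts = []
    for attempt in range(1, max_attempts + 1):
        try:
            result = task()
            attempts.append((attempt, "success"))
            return result, attempts
        except Exception as exc:
            attempts.append((attempt, f"failed: {exc}"))
            time.sleep(delay_s)
    raise RuntimeError(f"task failed after {max_attempts} attempts: {attempts}")

# A flaky extract step that succeeds on its third call
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("source system unavailable")
    return ["row1", "row2"]

result, log = run_with_retries(flaky_extract)
print(result, len(log))
```

The attempt log is what feeds failure monitoring: in production it would be emitted to the orchestrator's run history or an alerting channel instead of returned.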

Posted 2 months ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

What you will do Let’s do this. Let’s change the world. In this vital role you are responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing Be a key team member that assists in design and development of the data pipeline Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency Implement data security and privacy measures to protect sensitive data Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions Collaborate and communicate effectively with product teams Identify and resolve complex data-related challenges Adhere to best practices for coding, testing, and designing reusable code/components Explore new tools and technologies that will help to improve ETL platform performance Participate in sprint planning meetings and provide estimations on technical implementation What we expect of you We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Master’s degree and 1 to 3 years of Computer Science, IT or related field experience OR Bachelor’s degree and 3 to 5 years of Computer Science, IT or related field experience OR Diploma and 7 to 9 years of Computer Science, IT or related field experience Preferred Qualifications: Functional Skills: Must-Have Skills: Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), AWS, Redshift, Snowflake, workflow orchestration, and performance tuning on big data processing Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development Proven ability to optimize query performance on big data platforms Good-to-Have Skills: Experience with data modeling and performance tuning on relational and graph databases (e.g., MarkLogic, AllegroGraph, Stardog, RDF triplestores) Strong understanding of data modeling, data warehousing, and data integration concepts Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing Professional Certifications: AWS Certified Data Engineer preferred Databricks Certificate preferred Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
Equal opportunity statement Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination Objects in your future are closer than they appear. Join us. careers.amgen.com
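The posting above asks for maintaining data dictionaries alongside data models. One way to keep a dictionary consistent with the live schema is to derive it from the database's own metadata; a minimal sqlite3 sketch with an invented table (a warehouse would use its INFORMATION_SCHEMA instead):

```python
import sqlite3

def data_dictionary(conn, table):
    """Derive a minimal data dictionary (column name, type, nullability)
    from a live table's schema via SQLite's PRAGMA table_info."""
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
    return [{"column": c[1], "type": c[2], "nullable": not c[3]} for c in cols]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE study (study_id TEXT NOT NULL, phase TEXT, enrolled INTEGER)")
for entry in data_dictionary(conn, "study"):
    print(entry)
```

Generating the dictionary from metadata rather than maintaining it by hand means documentation cannot drift from the deployed schema, which is the accuracy-and-consistency goal the posting names.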

Posted 2 months ago

Apply

0.0 - 2.0 years

3 - 5 Lacs

Hyderabad

Work from Office

What you will do Let’s do this. Let’s change the world. In this vital role you are responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing Be a key team member that assists in design and development of the data pipeline Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency Implement data security and privacy measures to protect sensitive data Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions Collaborate and communicate effectively with product teams Identify and resolve complex data-related challenges Adhere to best practices for coding, testing, and designing reusable code/components Explore new tools and technologies that will help to improve ETL platform performance Participate in sprint planning meetings and provide estimations on technical implementation What we expect of you We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications: Bachelor’s degree and 0 to 3 years of Computer Science, IT or related field experience OR Diploma and 4 to 7 years of Computer Science, IT or related field experience Preferred Qualifications: Functional Skills: Must-Have Skills: Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), AWS, Redshift, Snowflake, workflow orchestration, and performance tuning on big data processing Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development Good-to-Have Skills: Experience with data modeling and performance tuning on relational and graph databases (e.g., MarkLogic, AllegroGraph, Stardog, RDF triplestores) Understanding of data modeling, data warehousing, and data integration concepts Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing Professional Certifications: AWS Certified Data Engineer preferred Databricks Certificate preferred Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
Equal opportunity statement Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination Objects in your future are closer than they appear. Join us. careers.amgen.com
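"Implementing ETL processes to migrate and deploy data across systems", as the posting above puts it, usually requires idempotent loads so a replayed batch does not duplicate rows. A minimal upsert sketch using SQLite's ON CONFLICT clause (Snowflake and most warehouses express the same pattern with MERGE; table and key names are illustrative):

```python
import sqlite3

def upsert_batch(conn, rows):
    """Idempotent load: re-running the same batch updates existing keys
    instead of duplicating them, a common pattern for incremental ETL."""
    conn.executemany(
        "INSERT INTO metrics (metric_id, value) VALUES (?, ?) "
        "ON CONFLICT(metric_id) DO UPDATE SET value = excluded.value",
        rows,
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (metric_id TEXT PRIMARY KEY, value REAL)")
upsert_batch(conn, [("m1", 1.0), ("m2", 2.0)])
upsert_batch(conn, [("m1", 1.5)])  # replayed/updated batch does not duplicate
print(conn.execute("SELECT metric_id, value FROM metrics ORDER BY metric_id").fetchall())
```

Idempotency is what makes retry-on-failure safe: if a pipeline run dies halfway and is re-run, the target ends up in the same state as a single clean run.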

Posted 2 months ago

Apply

6.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office

What you will do Let’s do this. Let’s change the world. In this vital role you are responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing Leading and being hands-on for the technical design, development, testing, implementation, and support of data pipelines that load the data domains in the Enterprise Data Fabric and associated data services. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Be able to translate data models (ontology, relational) into physical designs that are performant, maintainable, and easy to use.
Implement data security and privacy measures to protect sensitive data Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions Collaborate and communicate effectively with product teams Identify and resolve complex data-related challenges Adhere to best practices for coding, testing, and designing reusable code/components Explore new tools and technologies that will help to improve ETL platform performance Participate in sprint planning meetings and provide estimations on technical implementation Collaborate with RunOps engineers to continuously increase our ability to push changes into production with as little manual overhead and as much speed as possible. What we expect of you We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Master’s degree and 4 to 6 years of Computer Science, IT or related field experience OR Bachelor’s degree and 6 to 8 years of Computer Science, IT or related field experience OR Diploma and 10 to 12 years of Computer Science, IT or related field experience Preferred Qualifications: Functional Skills: Must-Have Skills: Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), workflow orchestration, and performance tuning on big data processing Proficient in SQL for extracting, transforming, and analyzing complex datasets from both relational and graph data stores (e.g., MarkLogic, AllegroGraph, Stardog, RDF triplestores) Experience with ETL tools such as Apache Spark and Prophecy, and various Python packages related to data processing and machine learning model development Strong understanding of data modeling, data warehousing, and data integration concepts Able to take user requirements and develop data models for data analytics use cases.
Good-to-Have Skills: Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms Experience using graph databases such as Stardog, MarkLogic, Neo4j, and AllegroGraph, and writing SPARQL queries Experience working with agile development methodologies such as Scaled Agile Professional Certifications AWS Certified Data Engineer preferred Certified Data Engineer / Data Analyst (preferred on Databricks or cloud environments) Soft Skills: Excellent critical-thinking and problem-solving skills Strong communication and collaboration skills Demonstrated awareness of how to function in a team setting Demonstrated presentation skills Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements. Equal opportunity statement Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination Objects in your future are closer than they appear. Join us. careers.amgen.com
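The graph-database and SPARQL qualifications above rest on one core operation: matching (subject, predicate, object) patterns against a set of triples. A stdlib-only sketch of that idea, with invented triples standing in for a real triplestore such as Stardog or MarkLogic:

```python
def match(triples, pattern):
    """Match a (subject, predicate, object) pattern against a list of triples;
    None acts as a wildcard, playing the role of a SPARQL variable."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

triples = [
    ("study:001", "hasPhase", "Phase3"),
    ("study:001", "hasSite", "site:IN-01"),
    ("study:002", "hasPhase", "Phase1"),
]
# Roughly analogous to: SELECT ?s WHERE { ?s :hasPhase "Phase3" }
print(match(triples, (None, "hasPhase", "Phase3")))
```

A real SPARQL engine builds on this by joining multiple patterns over shared variables and applying filters, but the pattern-as-template mental model carries over directly.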

Posted 2 months ago

Apply

7.0 - 11.0 years

11 - 15 Lacs

Chennai, Guindy

Work from Office

Senior Technical Lead Chennai - Guindy, India Information Technology 17096 Overview We are seeking an experienced Senior Technical Lead to join our team for a critical migration project. This role will focus on migrating data and services from on-premise or legacy systems to cloud platforms (preferably AWS). The ideal candidate will have a strong background in software engineering and cloud technologies (especially AWS), a solid understanding of database technologies and data migration processes, and hands-on experience with data and application migration projects. Responsibilities Key Responsibilities: Lead data migration efforts from legacy systems (e.g., on-premises databases) to cloud-based platforms such as AWS Collaborate with cross-functional teams to gather requirements and define migration strategies. Develop and implement migration processes to move legacy applications and data to cloud platforms like AWS. Write scripts and automation to support data migration, system configuration, and cloud infrastructure provisioning. Optimize existing data structures and processes for performance and scalability in the new environment. Ensure the migration adheres to performance, security, and compliance standards. Identify potential issues, troubleshoot, and implement fixes during the migration process. Maintain documentation of migration processes and post-migration maintenance plans. Provide technical support post-migration to ensure smooth operation of the migrated systems. Requirements Primary Skills (Required): Proven experience in leading data migration projects and migrating applications, services, or data to cloud platforms (preferably AWS). Knowledge of migration tools such as AWS Database Migration Service (DMS), AWS Server Migration Service (SMS), and AWS Migration Hub Expertise in data mapping, validation, transformation, and ETL processes Proficiency in Python, Java, or similar programming languages.
Experience with scripting languages such as Shell, PowerShell, or Bash Cloud Technologies (AWS focus): Strong knowledge of AWS services relevant to data migration (e.g., S3, Redshift, Lambda, RDS, DMS, Glue). Experience in working with CI/CD pipelines (Jenkins, GitLab CI/CD) and infrastructure as code (IaC) using Terraform or AWS CloudFormation Experience in database management and migrating relational (e.g., MySQL, PostgreSQL, Oracle) and non-relational (e.g., MongoDB) databases.
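A recurring step in the migration work described above is validating that data arrived intact. A minimal row-count reconciliation between source and target, sketched with two in-memory SQLite databases standing in for the real source and AWS target systems (table names invented; AWS DMS offers its own validation, but a script like this is a common independent check):

```python
import sqlite3

def reconcile_counts(source, target, tables):
    """Compare row counts per table between source and target after a
    migration; mismatches flag tables needing investigation or re-load."""
    report = {}
    for table in tables:
        src = source.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        tgt = target.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        report[table] = {"source": src, "target": tgt, "match": src == tgt}
    return report

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for conn in (source, target):
    conn.execute("CREATE TABLE orders (id INTEGER)")
source.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(5)])
target.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(4)])  # one row missing
print(reconcile_counts(source, target, ["orders"]))
```

Row counts are only the first gate; checksums or per-column aggregates catch rows that arrived but were transformed incorrectly.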

Posted 2 months ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Chennai

Work from Office

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Snowflake Data Warehouse Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to enhance their skills and knowledge in data engineering.- Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must-Have Skills: Proficiency in Snowflake Data Warehouse.- Good-to-Have Skills: Experience with data modeling and database design.- Strong understanding of ETL processes and data integration techniques.- Familiarity with cloud platforms such as AWS or Azure.- Experience in performance tuning and optimization of data queries.
Additional Information:- The candidate should have minimum 5 years of experience in Snowflake Data Warehouse.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 months ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : No Additional Skills Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application design and functionality. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work related problems.- Assist in the documentation of application processes and workflows.- Engage in code reviews to ensure quality and adherence to best practices. Professional & Technical Skills: - Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Strong understanding of data integration techniques and ETL processes.- Experience with cloud computing platforms and services.- Familiarity with programming languages such as Python or Scala.- Knowledge of data modeling and database design principles. Additional Information:- The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Pune office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : SnapLogic Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and contribute to key decisions in application development. Roles & Responsibilities:- Expected to be an SME- Collaborate and manage the team to perform- Responsible for team decisions- Engage with multiple teams and contribute on key decisions- Provide solutions to problems for their immediate team and across multiple teams- Lead the integration and implementation of SnapLogic solutions- Develop and maintain SnapLogic pipelines- Troubleshoot and resolve issues related to SnapLogic integrations Professional & Technical Skills: - Must-Have Skills: Proficiency in SnapLogic- Strong understanding of ETL processes- Experience with API integrations- Knowledge of cloud platforms such as AWS or Azure- Hands-on experience with data transformation and mapping Additional Information:- The candidate should have a minimum of 5 years of experience in SnapLogic- This position is based at our Hyderabad office- A 15 years full-time education is required Qualification 15 years full time education
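SnapLogic pipelines are built in a visual designer rather than in code, but the "data transformation and mapping" skill the posting above names comes down to mapping an inbound API payload onto a target schema. A plain-Python sketch of that mapping step (field names invented, standing in for a SnapLogic Mapper snap):

```python
import json

def transform_payload(raw):
    """Map an inbound JSON API payload to a target record schema:
    rename fields, combine fields, and derive a boolean flag."""
    record = json.loads(raw)
    return {
        "customer_id": record["id"],
        "full_name": f'{record["first_name"]} {record["last_name"]}',
        "active": record.get("status") == "ACTIVE",
    }

raw = json.dumps({"id": 42, "first_name": "Asha", "last_name": "Rao", "status": "ACTIVE"})
print(transform_payload(raw))
```

In SnapLogic the same renames and derivations would be declared as expressions in a Mapper snap between a REST snap and a target endpoint, but the mapping logic to be designed is identical.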

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SnapLogic
Good-to-have skills: NA
Minimum experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and contribute to key decisions.

Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the development and implementation of SnapLogic applications.
- Troubleshoot and resolve issues in SnapLogic integrations.
- Stay updated on the latest SnapLogic features and best practices.

Professional & Technical Skills:
- Must-have: Proficiency in SnapLogic.
- Strong understanding of ETL processes.
- Experience with API integrations.
- Knowledge of cloud platforms such as AWS or Azure.
- Hands-on experience in designing and implementing complex data workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SnapLogic
Good-to-have skills: NA
Minimum experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Pune. You will play a crucial role in developing innovative solutions that enhance business operations and efficiency.

Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the development and implementation of SnapLogic solutions.
- Conduct code reviews and ensure adherence to coding standards.
- Troubleshoot and resolve technical issues in SnapLogic integrations.

Professional & Technical Skills:
- Must-have: Proficiency in SnapLogic.
- Strong understanding of ETL processes.
- Experience with API integrations.
- Knowledge of cloud platforms such as AWS or Azure.
- Hands-on experience in developing and maintaining SnapLogic pipelines.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- This position is based at our Pune office.
- A 15 years full-time education is required.
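The SnapLogic postings above all ask for "data transformation and mapping" between systems. SnapLogic itself is a low-code pipeline tool, but the underlying idea can be sketched in a few lines of plain Python: a declarative mapping renames source fields to a target schema. The field names here are hypothetical, chosen only to show the shape of the step.

```python
# Declarative field mapping: rename source fields to the target schema,
# passing unmapped fields through unchanged. All field names are made up.
FIELD_MAP = {"cust_id": "customer_id", "fname": "first_name"}

def map_record(record, field_map):
    # Rename each key per the mapping; keys not in the map are kept as-is.
    return {field_map.get(k, k): v for k, v in record.items()}

src = {"cust_id": 7, "fname": "Asha", "city": "Pune"}
out = map_record(src, FIELD_MAP)
```

In a visual integration tool, this same step would typically be configured as a "mapper" stage rather than written by hand.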

Posted 2 months ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 7.5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly within the existing infrastructure. You will also engage in problem-solving activities, providing support and enhancements to existing applications while ensuring alignment with business objectives.

Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Good-to-have: Experience with data integration tools.
- Strong understanding of data modeling and ETL processes.
- Familiarity with cloud computing concepts and services.
- Experience in performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
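The Snowflake posting above asks for data modeling and query optimization experience. The core of that work is dimensional modeling: fact tables joined to dimension tables and aggregated. As a toy illustration only (sqlite3 stands in for a warehouse engine; the table and column names are invented), a minimal star-schema query looks like this:

```python
import sqlite3

# Tiny dimensional-model sketch: one dimension table, one fact table,
# aggregated with a join. Table/column names are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (product_id INTEGER, qty INTEGER);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales VALUES (1, 3), (1, 2), (2, 5);
""")

# Aggregate the fact table per dimension attribute.
rows = con.execute("""
    SELECT p.name, SUM(f.qty) AS total_qty
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
```

In an actual warehouse, tuning this kind of query involves clustering/partitioning choices and examining the query profile rather than the SQL alone.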

Posted 2 months ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: No function specialty
Minimum experience: 7.5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions and ensure applications align with business needs.

Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct regular team meetings to discuss progress and challenges.
- Stay updated on industry trends and technologies to enhance application development.

Professional & Technical Skills:
- Must-have: Proficiency in SAP BusinessObjects Data Services; S/4HANA implementation knowledge; life sciences experience.
- Strong understanding of ETL processes and data integration.
- Experience in developing and optimizing data workflows.
- Knowledge of data quality management and data governance.
- Familiarity with SAP BusinessObjects reporting tools.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Posted 2 months ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum experience: 3 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and objectives.

Roles & Responsibilities:
- Perform independently and become an SME.
- Participate actively in team discussions.
- Contribute solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to analyze and address technical issues.
- Conduct code reviews and provide feedback to enhance code quality.
- Stay updated with industry trends and best practices in application development.
- Assist in troubleshooting and resolving technical issues in applications.

Professional & Technical Skills:
- Must-have: Proficiency in Ab Initio.
- Strong understanding of ETL processes and data integration.
- Experience with data warehousing concepts and methodologies.
- Hands-on experience in developing and optimizing ETL workflows.
- Knowledge of SQL and database management systems.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the team in implementing new technologies.
- Conduct regular code reviews and provide feedback for improvement.

Professional & Technical Skills:
- Must-have: Proficiency in Ab Initio, with a minimum of 5 years of experience.
- Strong understanding of ETL processes.
- Experience with data warehousing concepts.
- Knowledge of data modeling and database design.
- Hands-on experience in developing and implementing data integration solutions.

Additional Information:
- This position is based at our Bengaluru, Chennai & Pune offices.
- A 15 years full-time education is required.

Posted 2 months ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: AWS Glue
Good-to-have skills: NA
Minimum experience: 7.5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, keeping all stakeholders informed and involved in the development process. The role requires balancing technical oversight with team management, fostering an environment of innovation and collaboration.

Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Facilitate regular team meetings to discuss progress and address roadblocks.

Professional & Technical Skills:
- Must-have: Proficiency in AWS Glue.
- Good-to-have: Experience with AWS Lambda, AWS S3, and AWS Redshift.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data warehousing solutions.
- Familiarity with data governance and security best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in AWS Glue.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 2 months ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Oracle Data Integrator (ODI)
Good-to-have skills: NA
Minimum experience: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication among team members and stakeholders.

Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Ensure effective communication among team members and stakeholders.
- Identify and address any issues or roadblocks in the development process.

Professional & Technical Skills:
- Must-have: Proficiency in Oracle Data Integrator (ODI).
- Strong understanding of ETL processes.
- Experience in data integration and transformation.
- Knowledge of Oracle databases.
- Hands-on experience in designing and implementing data integration solutions.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Oracle Data Integrator (ODI).
- This position is based at our Pune office.
- A 15 years full-time education is required.

Posted 2 months ago

Apply

12.0 - 15.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: O9 Solutions
Good-to-have skills: NA
Minimum experience: 12 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding the team in implementing effective solutions. You will also engage in strategic planning and decision-making processes, ensuring that the applications align with organizational objectives and user needs. The role requires a balance of technical expertise and leadership skills to drive project success and foster a collaborative environment.

Roles & Responsibilities:
- Play the integration architect role on o9 implementation projects.
- Engage with client stakeholders to understand data requirements, carry out fit-gap analysis, and design the integration solution.
- Review and analyze data provided by the client, along with its technical and functional intent and interdependencies.
- Guide the integration team to build and deploy effective solutions that validate and transform customer data for integrated business planning and analytics.

Technical Skills:
- Minimum 10 to 15 years of experience implementing ETL solutions to integrate systems in client environments.
- Should have played the Integration Architect / Senior Integration Consultant / Senior Integration Developer role in at least 2 implementation projects.
- Strong experience with SQL, PySpark, Python, Spark SQL, and ETL tools.
- Proficiency in databases (SQL Server, Oracle, etc.); knowledge of DDL, DML, and stored procedures.
- Strong collaborator, team player, and individual contributor.
- Strong communication skills, with comfort speaking with business stakeholders.
- Strong problem solver with the ability to manage and lead the team to push the solution forward and drive progress.

Professional Skills:
- Proven ability to work creatively and analytically in a problem-solving environment.
- Proven ability to build, manage, and foster a team-oriented environment.
- Desire to work in an information systems environment.
- Excellent written, oral, and interpersonal communication skills.

Educational Qualification: BTech/BE/MCA

Additional Information:
- The candidate should have a minimum of 12 years of experience with O9 Solutions.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
- Open to travel (short / long term).
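The O9 role above centers on validating and transforming customer data before it enters a planning system. As a rough sketch of the kind of validation step an integration architect might specify (the required columns and record shapes here are entirely hypothetical), records can be split into loadable and rejected sets before transformation:

```python
# Schema-validation sketch: partition incoming records into those that have
# all required columns and those that should be rejected for remediation.
# The required column names are illustrative, not from any real o9 model.
REQUIRED = {"sku", "location", "qty"}

def validate(rows):
    good, bad = [], []
    for r in rows:
        missing = REQUIRED - r.keys()
        # Route the record based on whether any required column is absent.
        (bad if missing else good).append(r)
    return good, bad

good, bad = validate([
    {"sku": "A1", "location": "BLR", "qty": 10},
    {"sku": "A2", "qty": 4},  # missing "location" -> rejected
])
```

In practice this gate would run inside the ETL tool or a PySpark job, with rejected records logged back to the client for correction.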

Posted 2 months ago

Apply