
1016 ETL Process Jobs - Page 39

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

7 - 10 years

9 - 12 Lacs

Noida

Work from Office

Staff Software Data Engineer

R1 RCM Inc. is a leading provider of technology-enabled revenue cycle management services that transform and solve challenges across health systems, hospitals, and physician practices. Headquartered in Chicago, R1 is a publicly traded organization with employees throughout the US and multiple locations in India. Our mission is to be the one trusted partner to manage revenue, so providers and patients can focus on what matters most. Our priority is to always do what is best for our clients, patients, and each other. With our proven and scalable operating model, we complement a healthcare organization's infrastructure, quickly driving sustainable improvements to net patient revenue and cash flows while reducing operating costs and enhancing the patient experience.

Description: We are seeking a Staff Data Engineer with 7-10 years of experience to join our Data Platform team. This role reports to the Manager of Data Engineering and is involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.

Qualifications:
- Deep knowledge of and experience working with SSIS and T-SQL
- Experience with Azure Data Factory, Azure Databricks, and Azure Data Lake
- Experience with a programming language such as Python
- Experience with SQL and NoSQL database systems such as MongoDB
- Experience in distributed system architecture design
- Experience with cloud environments (Azure preferred)
- Experience acquiring and preparing data from primary and secondary disparate data sources (real-time preferred)
- Experience on large-scale data product implementations, with responsibility for technical delivery and for mentoring and managing peer engineers
- Experience with Power BI preferred
- Experience with agile methodology preferred
- Healthcare industry experience preferred

Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data
- Share your passion for experimenting with and learning new technologies
- Perform thorough data analysis, uncover opportunities, and address business problems
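Illustrative only, not part of the posting: a minimal sketch of the kind of extract-and-load step this role describes, copying changed rows from a source SQL Server database into a staging table with pyodbc. The connection strings, table names, and columns are hypothetical.

```python
# Minimal incremental ETL sketch (hypothetical names throughout): pull rows
# changed since a watermark from a source SQL Server table and load them
# into a warehouse staging table using pyodbc.
import pyodbc

SOURCE_DSN = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=src;DATABASE=ehr;UID=u;PWD=p"
TARGET_DSN = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=dw;DATABASE=staging;UID=u;PWD=p"

def run_incremental_load(watermark: str) -> int:
    """Copy claims modified since `watermark` into the staging table."""
    with pyodbc.connect(SOURCE_DSN) as src, pyodbc.connect(TARGET_DSN) as tgt:
        rows = src.cursor().execute(
            "SELECT claim_id, amount, modified_at FROM dbo.claims WHERE modified_at > ?",
            watermark,
        ).fetchall()
        cur = tgt.cursor()
        cur.executemany(
            "INSERT INTO stg.claims (claim_id, amount, modified_at) VALUES (?, ?, ?)",
            [tuple(r) for r in rows],
        )
        tgt.commit()
    return len(rows)

if __name__ == "__main__":
    print(run_incremental_load("2024-01-01"))
```

In practice an SSIS or Azure Data Factory pipeline would own this step; the sketch only shows the watermark-based extract/load pattern.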

Posted 2 months ago

Apply

10 - 14 years

35 - 40 Lacs

Noida

Work from Office

We are seeking a highly skilled and motivated Data Cloud Architect to join our Product and Technology team. As a Data Cloud Architect, you will play a key role in designing and implementing our cloud-based data architecture, ensuring scalability, reliability, and optimal performance for our data-intensive applications. Your expertise in cloud technologies, data architecture, and data engineering will drive the success of our data initiatives.

Responsibilities:
- Collaborate with cross-functional teams, including data engineers, data leads, product owners, and stakeholders, to understand business requirements and data needs
- Design and implement end-to-end data solutions on cloud platforms, ensuring high availability, scalability, and security
- Architect delta lakes, data lakes, data warehouses, and streaming data solutions in the cloud
- Evaluate and select appropriate cloud services and technologies to support data storage, processing, and analytics
- Develop and maintain cloud-based data architecture patterns and best practices
- Design and optimize data pipelines, ETL processes, and data integration workflows
- Implement data security and privacy measures in compliance with industry standards
- Collaborate with DevOps teams to deploy and manage data-related infrastructure on the cloud
- Stay up to date with emerging cloud technologies and trends to ensure the organization remains at the forefront of data capabilities
- Provide technical leadership and mentorship to data engineering teams

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience)
- 10 years of experience as a Data Architect, Cloud Architect, or in a similar role
- Expertise in cloud platforms such as Azure
- Strong understanding of data architecture concepts and best practices
- Proficiency in data modeling, ETL processes, and data integration techniques
- Experience with big data technologies and frameworks (e.g., Hadoop, Spark)
- Knowledge of containerization technologies (e.g., Docker, Kubernetes)
- Familiarity with data warehousing solutions (e.g., Redshift, Snowflake)
- Strong knowledge of security practices for data in the cloud
- Excellent problem-solving and troubleshooting skills
- Effective communication and collaboration skills
- Ability to lead and mentor technical teams

Additional Preferred Qualifications:
- Bachelor's or Master's degree in Data Science, Computer Science, or a related field
- Relevant cloud certifications (e.g., Azure Solutions Architect) and data-related certifications
- Experience with real-time data streaming technologies (e.g., Apache Kafka)
- Knowledge of machine learning and AI concepts in relation to cloud-based data solutions

Posted 2 months ago

Apply

7 - 10 years

9 - 12 Lacs

Noida

Work from Office

Position summary: We are seeking a Staff Data Engineer with 7-10 years of experience to join our Data Platform team. This role reports to the Manager of Data Engineering and is involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.

Key duties & responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data
- Share your passion for experimenting with and learning new technologies
- Perform thorough data analysis, uncover opportunities, and address business problems

Qualification: B.E./B.Tech/MCA or equivalent professional degree

Experience, skills, and knowledge:
- Deep knowledge of and experience working with SSIS and T-SQL
- Experience with Azure Data Factory, Azure Databricks, and Azure Data Lake
- Experience with a programming language such as Python or Scala
- Experience with SQL and NoSQL database systems such as MongoDB
- Experience in distributed system architecture design
- Experience with cloud environments (Azure preferred)
- Experience acquiring and preparing data from primary and secondary disparate data sources (real-time preferred)
- Experience on large-scale data product implementations, with responsibility for technical delivery and for mentoring and managing peer engineers
- Experience with Databricks preferred
- Experience with agile methodology preferred
- Healthcare industry experience preferred

Key competency profile:
- Spot new opportunities by anticipating change and planning accordingly
- Find ways to better serve customers and patients; be accountable for customer service of the highest quality
- Create connections across teams by valuing differences and including others
- Own your development by implementing and sharing your learnings
- Motivate each other to perform at our highest level
- Help people improve by learning from successes and failures
- Work the right way by acting with integrity and living our values every day
- Succeed by proactively identifying problems and solutions for yourself and others

Posted 2 months ago

Apply

3 - 6 years

7 - 11 Lacs

Hyderabad

Work from Office

Sr Semantic Engineer – Research Data and Analytics

What you will do
Let's do this. Let's change the world. In this vital role you will join Research's Semantic Graph Team as a dedicated and skilled Semantic Data Engineer, building and optimizing knowledge graph-based software and data resources. This role primarily focuses on working with technologies such as RDF, SPARQL, and Python. In addition, the position involves semantic data integration and cloud-based data engineering. The ideal candidate should possess experience in the pharmaceutical or biotech industry, demonstrate deep technical skills, be proficient with big data technologies, and have experience in semantic modeling. A deep understanding of data architecture and ETL processes is also essential for this role. You will be responsible for constructing semantic data pipelines, integrating both relational and graph-based data sources, ensuring seamless data interoperability, and leveraging cloud platforms to scale data solutions effectively.

Roles & Responsibilities:
- Develop and maintain semantic data pipelines using Python, RDF, SPARQL, and linked data technologies
- Develop and maintain semantic data models for biopharma scientific data
- Integrate relational databases (SQL, PostgreSQL, MySQL, Oracle, etc.) with semantic frameworks
- Ensure interoperability across federated data sources, linking relational and graph-based data
- Implement and optimize CI/CD pipelines using GitLab and AWS
- Leverage cloud services (AWS Lambda, S3, Databricks, etc.) to support scalable knowledge graph solutions
- Collaborate with global multi-functional teams, including research scientists, Data Architects, Business SMEs, Software Engineers, and Data Scientists, to understand data requirements, design solutions, and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Collaborate with data scientists, engineers, and domain experts to improve research data accessibility
- Adhere to standard processes for coding, testing, and designing reusable code/components
- Explore new tools and technologies to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation
- Maintain comprehensive documentation of processes, systems, and solutions
- Harmonize research data to appropriate taxonomies, ontologies, and controlled vocabularies for context and reference knowledge

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications and Experience:
- Doctorate degree, OR
- Master's degree with 4-6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field, OR
- Bachelor's degree with 6-8 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field, OR
- Diploma with 10-12 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field

Preferred Qualifications and Experience:
- 6+ years of experience designing and supporting biopharma scientific research data analytics (software platforms)

Functional Skills:

Must-Have Skills:
- Advanced semantic and relational data skills: proficiency in Python, RDF, SPARQL, graph databases (e.g., AllegroGraph), SQL, relational databases, ETL pipelines, big data technologies (e.g., Databricks), semantic data standards (OWL, W3C, FAIR principles), ontology development, and semantic modeling practices
- Cloud and automation expertise: good experience using cloud platforms (preferably AWS) for data engineering, along with Python for automation, data federation techniques, and model-driven architecture for scalable solutions
- Technical problem-solving: excellent problem-solving skills with hands-on experience in test automation frameworks (pytest), scripting tasks, and handling large, complex datasets

Good-to-Have Skills:
- Experience in biotech/drug discovery data engineering
- Experience applying knowledge graph, taxonomy, and ontology concepts in the life sciences and chemistry domains
- Experience with graph databases (AllegroGraph, Neo4j, GraphDB, Amazon Neptune)
- Familiarity with Cypher, GraphQL, or other graph query languages
- Experience with big data tools (e.g., Databricks)
- Experience in biomedical or life sciences research data management

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Good communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
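Illustrative only, not part of the posting: a minimal sketch of the kind of semantic-data step this role describes, loading RDF triples into a graph with Python's rdflib and querying them with SPARQL. The tiny example vocabulary and predicate names are hypothetical.

```python
# Minimal semantic-data sketch (hypothetical vocabulary): load RDF triples
# into an in-memory graph with rdflib and run a SPARQL query over them.
from rdflib import Graph

TURTLE = """
@prefix ex: <http://example.org/bio#> .
ex:geneA ex:encodes ex:proteinA .
ex:proteinA ex:targetOf ex:drugX .
"""

g = Graph()
g.parse(data=TURTLE, format="turtle")

# Link two hops: find drugs that target the protein a gene encodes.
QUERY = """
PREFIX ex: <http://example.org/bio#>
SELECT ?gene ?drug WHERE {
    ?gene ex:encodes ?protein .
    ?protein ex:targetOf ?drug .
}
"""
for gene, drug in g.query(QUERY):
    print(gene, "->", drug)
```

A production pipeline would parse from files or a triple store endpoint rather than an inline string; the query pattern is the same.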

Posted 2 months ago

Apply

3 - 5 years

4 - 8 Lacs

Hyderabad

Work from Office

Sr Associate Software Engineer – Finance

What you will do
The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member that assists in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree and 1-3 years of Computer Science, IT, or related field experience, OR
- Bachelor's degree and 3-5 years of Computer Science, IT, or related field experience, OR
- Diploma and 7-9 years of Computer Science, IT, or related field experience
- Proficiency in Python, PySpark, and Scala for data processing and ETL (extract, transform, load) workflows, with hands-on experience using Databricks for building ETL pipelines and handling big data processing
- Experience with data warehousing platforms such as Amazon Redshift or Snowflake
- Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
- Familiarity with big data frameworks like Apache Hadoop, Spark, and Kafka for handling large datasets
- Experience with software engineering best practices, including but not limited to version control (GitLab, Subversion, etc.), CI/CD (Jenkins, GitLab, etc.), automated unit testing, and DevOps

Preferred Qualifications:
- Experience with cloud platforms such as AWS, particularly data services (e.g., EKS, EC2, S3, EMR, RDS, Redshift/Spectrum, Lambda, Glue, Athena)
- Experience with the Anaplan platform, including building, managing, and optimizing models and workflows, including scalable data integrations
- Understanding of machine learning pipelines and frameworks for ML/AI models

Professional Certifications:
- AWS Certified Data Engineer (preferred)
- Databricks Certified (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now for a career that defies imagination. Objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
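Illustrative only, not part of the posting: a minimal PySpark sketch of the ETL workflow described above, reading raw CSV data, cleaning and aggregating it, and writing partitioned Parquet. The S3 paths and column names are hypothetical.

```python
# Minimal PySpark ETL sketch (hypothetical paths and columns): read raw CSV,
# clean and aggregate it, then write the result as partitioned Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("finance-etl-sketch").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3://example-bucket/raw/invoices/")
)

cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount").isNotNull())          # drop unparseable rows
    .groupBy("region", "invoice_date")
    .agg(F.sum("amount").alias("total_amount"))
)

cleaned.write.mode("overwrite").partitionBy("region").parquet(
    "s3://example-bucket/curated/invoice_totals/"
)
```

On Databricks the same code typically runs as a scheduled job or notebook task, with the `SparkSession` already provided.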

Posted 2 months ago

Apply

2 - 6 years

3 - 6 Lacs

Hyderabad

Work from Office

Role Description
The R&D Data Catalyst Team is responsible for building Data Searching, Cohort Building, and Knowledge Management tools that provide the Amgen scientific community with visibility into Amgen's wealth of human datasets, projects and study histories, and knowledge of various scientific findings. These solutions are pivotal tools in Amgen's goal to accelerate the speed of discovery and the speed to market of advanced precision medications.

The Sr. Data Engineer will be responsible for the end-to-end development of an enterprise analytics and data mastering solution leveraging Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and high-performing enterprise solutions that support research cohort-building and the advanced research pipeline. The ideal candidate will have experience creating and surfacing large unified repositories of human data, based on integrations from multiple repositories and solutions, and be exceptionally skilled with data analysis and profiling. You will collaborate closely with stakeholders, product team members, and related IT teams to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. The ideal candidate will have a strong background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering.

Roles & Responsibilities
- Design and build scalable enterprise analytics solutions using Databricks, Power BI, and other modern data tools
- Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation with the goal of reducing data proliferation
- Break down features into work that aligns with the architectural direction runway
- Participate hands-on in pilots and proofs-of-concept for new patterns
- Create robust documentation from data analysis and profiling, and propose designs and data logic
- Develop advanced SQL queries to profile and unify data
- Develop data processing code in SQL, along with semantic views, to prepare data for reporting
- Develop Power BI models and reporting packages
- Design robust data models and processing layers that support both analytical processing and operational reporting needs
- Design and develop solutions based on best practices for data governance, security, and compliance within Databricks and Power BI environments
- Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms
- Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability
- Collaborate with stakeholders to define data requirements, functional specifications, and project goals
- Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions

Basic Qualifications and Experience
- Master's degree with 4-6 years of experience as a Product Owner / Platform Owner / Service Owner, OR
- Bachelor's degree with 8-10 years of experience as a Product Owner / Platform Owner / Service Owner

Functional Skills:

Must-Have Skills
- Minimum of 3 years of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization
- Minimum of 6 years of hands-on experience building change-data-capture (CDC) ETL pipelines, data warehouse design and build, and enterprise-level data management
- Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads
- Deep understanding of Power BI, including model design, DAX, and Power Query
- Proven experience designing and implementing data mastering solutions and data governance frameworks
- Expertise in cloud platforms (AWS), data lakes, and data warehouses
- Strong knowledge of ETL processes, data pipelines, and integration technologies
- Strong communication and collaboration skills to work with cross-functional teams and senior leadership
- Ability to assess business needs and design solutions that align with organizational goals
- Exceptional hands-on capabilities with data profiling, data transformation, and data mastering
- Success in mentoring and training team members

Good-to-Have Skills:
- Experience in developing differentiated and deliverable solutions
- Experience with human data, ideally human healthcare data
- Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management

Professional Certifications (preferred unless noted as mandatory):
- ITIL Foundation or other relevant certifications (preferred)
- SAFe Agile Practitioner (6.0)
- Microsoft Certified: Data Analyst Associate (Power BI) or related certification
- Databricks Certified Professional or similar certification

Soft Skills:
- Excellent analytical and troubleshooting skills
- Deep intellectual curiosity
- Highest degree of initiative and self-motivation
- Strong verbal and written communication skills, including presentation of complex technical/business topics to varied audiences
- Confident technical leader
- Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources
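Illustrative only, not part of the posting: a minimal sketch of the change-data-capture (CDC) upsert pattern this role mentions, expressed as a Delta Lake MERGE run from PySpark, as is common on Databricks. The table, view, and column names are hypothetical.

```python
# Minimal CDC upsert sketch (hypothetical tables/columns): merge a batch of
# change records into a Delta target table. Requires a Delta-enabled Spark
# environment such as Databricks.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cdc-merge-sketch").getOrCreate()

# `staging.subject_changes` would normally be fed by a CDC stream; each row
# carries an `op` flag ('I'/'U'/'D') from the source system.
spark.sql("""
    MERGE INTO curated.subjects AS t
    USING staging.subject_changes AS s
    ON t.subject_id = s.subject_id
    WHEN MATCHED AND s.op = 'D' THEN DELETE
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```

The MERGE keeps the curated table in sync with the source while letting downstream Power BI models read a single, deduplicated copy of the data.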

Posted 2 months ago

Apply

2 - 5 years

3 - 7 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE

Role Description:
The Product Master Data Management (PMDM) Data Engineer is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. You will play a key role in a regulatory submission content automation initiative that will modernize and digitize the regulatory submission process, positioning Amgen as a leader in regulatory innovation. The initiative leverages state-of-the-art technologies, including Generative AI, structured content management, and integrated data, to automate the creation, review, and approval of regulatory content.

Roles & Responsibilities:
- Design, develop, and maintain solutions for data generation, collection, and processing
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Build and maintain back-end services using languages like Python, Java, or Node.js that provide secure, reliable, and scalable access to Product Master Data
- Collaborate with the design and product teams to understand user needs and translate them into technical requirements
- Write clean, efficient, and well-tested code
- Participate in code reviews and provide constructive feedback
- Maintain system uptime and optimal performance
- Learn and adapt to new technologies and industry trends
- Collaborate and communicate effectively with product teams
- Participate in sprint planning meetings and provide estimations on technical implementation

Basic Qualifications and Experience:
- Bachelor's degree and 0-3 years of Computer Science, IT, or related field experience, OR
- Diploma and 4-7 years of Computer Science, IT, or related field experience

Functional Skills:

Must-Have Skills:
- Hands-on experience with web API development
- Hands-on experience with back-end development; proficient with SQL/NoSQL databases; proficient in Python and SQL
- Ability to learn new technologies quickly
- Strong problem-solving and analytical skills
- Excellent communication and teamwork skills

Good-to-Have Skills:
- Strong understanding of data modeling, data warehousing, and data integration concepts

Professional Certifications:
- Certified Data Engineer / Data Analyst (preferred on Databricks or cloud environments)
- Machine Learning Certification (preferred on Databricks or cloud environments)

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
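Illustrative only, not part of the posting: a minimal sketch of the web API work this role describes, exposing read access to product master records with FastAPI in Python. The endpoint, record shape, and in-memory store are hypothetical.

```python
# Minimal web API sketch (hypothetical endpoint and data): a FastAPI service
# exposing read access to product master records.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="pmdm-sketch")

# In a real service this would be a database; a dict keeps the sketch runnable.
PRODUCTS = {
    "AMG-001": {"name": "Example Product", "status": "active"},
}

@app.get("/products/{product_id}")
def get_product(product_id: str) -> dict:
    """Return one product master record, or 404 if it is unknown."""
    record = PRODUCTS.get(product_id)
    if record is None:
        raise HTTPException(status_code=404, detail="product not found")
    return {"product_id": product_id, **record}

# Run locally with: uvicorn pmdm_sketch:app --reload
```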

Posted 2 months ago

Apply

4 - 6 years

10 - 14 Lacs

Hyderabad

Work from Office

ABOUT THE ROLE

Role Description
The R&D Data Catalyst Team is responsible for building Data Searching, Cohort Building, and Knowledge Management tools that provide the Amgen scientific community with visibility into Amgen's wealth of human datasets, projects and study histories, and knowledge of various scientific findings. These solutions are pivotal tools in Amgen's goal to accelerate the speed of discovery and the speed to market of advanced precision medications.

The Data Architect will be responsible for the end-to-end architecture of an enterprise analytics and data mastering solution leveraging Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and high-performing enterprise solutions that support research cohort-building and the advanced research pipeline. The ideal candidate will have proven experience creating and surfacing large unified repositories of human data, based on integrations from multiple sources and solutions. You will collaborate closely with stakeholders across departments, including data engineering, business intelligence, and IT teams, to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. The ideal candidate will have a strong background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering.

Roles & Responsibilities
- Architect scalable enterprise analytics solutions using Databricks, Power BI, and other modern data tools
- Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation with the goal of reducing data proliferation
- Support development planning by breaking down features into work that aligns with the architectural direction runway
- Participate hands-on in pilots and proofs-of-concept for new patterns
- Create robust documentation of architectural direction, patterns, and standards
- Present to and train engineers and cross-team collaborators on architecture strategy and patterns
- Collaborate with data engineers to build and optimize ETL pipelines, ensuring efficient data ingestion and processing from multiple sources
- Design robust data models and processing layers that support both analytical processing and operational reporting needs
- Develop and implement best practices for data governance, security, and compliance within Databricks and Power BI environments
- Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms
- Provide thought leadership and strategic guidance on data architecture, advanced analytics, and data mastering best practices
- Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability
- Serve as a subject matter expert on Power BI and Databricks, providing technical leadership and mentoring to other teams
- Collaborate with stakeholders to define data requirements, architecture specifications, and project goals
- Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions

Basic Qualifications and Experience
- Master's degree with 4-6 years of experience in data management and data architecture, OR
- Bachelor's degree with 6-8 years of experience in data management and data architecture

Functional Skills:

Must-Have Skills
- Minimum of 3 years of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization
- Minimum of 7 years of hands-on experience building change-data-capture (CDC) ETL pipelines, data warehouse design and build, and enterprise-level data management
- Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads
- Deep understanding of Power BI, including model design, DAX, and Power Query
- Proven experience designing and implementing data mastering solutions and data governance frameworks
- Expertise in cloud platforms (AWS), data lakes, and data warehouses
- Strong knowledge of ETL processes, data pipelines, and integration technologies
- Strong communication and collaboration skills to work with cross-functional teams and senior leadership
- Ability to assess business needs and design solutions that align with organizational goals
- Exceptional hands-on capabilities with data profiling, data transformation, and data mastering
- Success in mentoring and training team members

Good-to-Have Skills:
- Experience in developing differentiated and deliverable solutions
- Experience with human data, ideally human healthcare data
- Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management

Professional Certifications (preferred unless noted as mandatory):
- ITIL Foundation or other relevant certifications (preferred)
- SAFe Agile Practitioner (6.0)
- Microsoft Certified: Data Analyst Associate (Power BI) or related certification
- Databricks Certified Professional or similar certification

Soft Skills:
- Excellent analytical and troubleshooting skills
- Deep intellectual curiosity
- Highest degree of initiative and self-motivation
- Strong verbal and written communication skills, including presentation of complex technical/business topics to varied audiences
- Confident technical leader
- Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources

Posted 2 months ago

Apply

2 - 6 years

11 - 15 Lacs

Hyderabad

Work from Office

Amgen's Precision Medicine technology team is responsible for building Data Searching, Cohort Building, and Knowledge Management tools that provide the Amgen scientific community with visibility into Amgen's wealth of human datasets, projects and study histories, and knowledge of various scientific findings. These data include multiomics data (genomics, transcriptomics, proteomics, etc.), clinical study subject measurement and outcome data, images, and specimen inventory data. Our PMED data management, standardization, surfacing, and processing capabilities are pivotal tools in Amgen's goal to accelerate the speed of discovery and the speed to market of advanced precision medications.

The Solution and Data Architect will be responsible for the end-to-end architecture of an enterprise analytics and data mastering solution leveraging Databricks and Power BI. This role requires expertise in both data architecture and analytics, with the ability to create scalable, reliable, and high-performing enterprise solutions that support research cohort-building and the advanced research pipeline. The ideal candidate will have experience creating and surfacing large unified repositories of human data, based on integrations from multiple repositories and solutions. You will collaborate closely with stakeholders across departments, including data engineering, business intelligence, and IT teams, to design and implement data models, integrate data from various sources, and ensure best practices for data governance and security. The ideal candidate will have a strong background in data warehousing, ETL, Databricks, Power BI, and enterprise data mastering.

Roles & Responsibilities
- Architect scalable enterprise analytics solutions using Databricks, Power BI, and other modern data tools
- Leverage data virtualization, ETL, and semantic layers to balance the need for unification, performance, and data transformation with the goal of reducing data proliferation
- Support development planning by breaking down features into work that aligns with the architectural direction runway
- Participate hands-on in pilots and proofs-of-concept for new patterns
- Create robust documentation of architectural direction, patterns, and standards
- Present to and train engineers and cross-team collaborators on architecture strategy and patterns
- Collaborate with data engineers to build and optimize ETL pipelines, ensuring efficient data ingestion and processing from multiple sources
- Design robust data models and processing layers that support both analytical processing and operational reporting needs
- Develop and implement best practices for data governance, security, and compliance within Databricks and Power BI environments
- Ensure the integration of data systems with other enterprise applications, creating seamless data flows across platforms
- Provide thought leadership and strategic guidance on data architecture, advanced analytics, and data mastering best practices
- Develop and maintain Power BI solutions, ensuring data models and reports are optimized for performance and scalability
- Serve as a subject matter expert on Power BI and Databricks, providing technical leadership and mentoring to other teams
- Collaborate with stakeholders to define data requirements, architecture specifications, and project goals
- Continuously evaluate and adopt new technologies and methodologies to enhance the architecture and performance of data solutions

Basic Qualifications and Experience
- Master's degree with 6-8 years of experience in data management and data solution architecture, OR
- Bachelor's degree with 8-10 years of experience in data management and data solution architecture, OR
- Diploma and 10-12 years of experience in data management and data solution architecture

Functional Skills:

Must-Have Skills
- Minimum of 3 years of hands-on experience with BI solutions (preferably Power BI or Business Objects), including report development, dashboard creation, and optimization
- Minimum of 7 years of hands-on experience building change-data-capture (CDC) ETL pipelines, data warehouse design and build, and enterprise-level data management
- Hands-on experience with Databricks, including data engineering, optimization, and analytics workloads
- Deep understanding of Power BI, including model design, DAX, and Power Query
- Proven experience designing and implementing data mastering solutions and data governance frameworks
- Expertise in cloud platforms (AWS), data lakes, and data warehouses
- Strong knowledge of ETL processes, data pipelines, and integration technologies
- Strong communication and collaboration skills to work with cross-functional teams and senior leadership
- Ability to assess business needs and design solutions that align with organizational goals
- Exceptional hands-on capabilities with data profiling, data transformation, and data mastering
- Success in mentoring and training team members

Good-to-Have Skills:
- Experience in developing differentiated and deliverable solutions
- Experience with human data, ideally human healthcare data
- Familiarity with laboratory testing, patient data from clinical care, HL7, FHIR, and/or clinical trial data management

Professional Certifications (preferred unless noted as mandatory):
- ITIL Foundation or other relevant certifications (preferred)
- SAFe Agile Practitioner (6.0)
- Microsoft Certified: Data Analyst Associate (Power BI) or related certification
- Databricks Certified Professional or similar certification

Soft Skills:
- Excellent analytical and troubleshooting skills
- Deep intellectual curiosity
- Highest degree of initiative and self-motivation
- Strong verbal and written communication skills, including presentation of complex technical/business topics to varied audiences
- Confident technical leader
- Ability to work effectively with global, virtual teams, specifically including leveraging tools and artifacts to ensure clear and efficient collaboration across time zones
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong problem-solving and analytical skills; ability to learn quickly and to retain and synthesize complex information from diverse sources

Posted 2 months ago

Apply

1 - 4 years

2 - 6 Lacs

Hyderabad

Work from Office

Sr Associate Software Engineer – Tech Enablement Team

Job Posting Title: Sr Associate Software Engineer
Workday Job Profile: Sr Associate Software Engineer
Department Name: Digital, Technology & Innovation
Role GCF: 4

ABOUT THE ROLE

Role Description:
In this vital and technical role, you will deliver innovative custom solutions that support patient safety and adhere to regulatory requirements from around the world. You will be an active participant in the team, working directly towards advancing technical features and enhancements of the business applications, including work involving Machine Learning and Natural Language Processing technologies.

Roles & Responsibilities:
- Develops and delivers robust technology solutions in a regulated environment by collaborating with business partners, information systems (IS) colleagues, and service providers
- Authors documentation for technical specifications and designs that satisfy detailed business and functional requirements
- Works closely with business and IS teams to find opportunities
- Responsible for crafting and building end-to-end solutions using cloud technologies (e.g., Amazon Web Services) and Business Intelligence tools (e.g., Cognos, Tableau, and Spotfire) or other platforms
- Contributes towards design and rapid Proof-of-Concept (POC) development efforts for automated solutions that improve efficiency and simplify business processes; quickly and iteratively proves or disproves the concepts being considered
- Ensures design and development of software solutions meets Amgen architectural, security, quality, and development guidelines
- Participates in Agile development ceremonies and practices
- Writes SQL queries to manipulate and visualize data using data visualization tools

Basic Qualifications and Experience:
- Master's degree with 2-6 years of software engineering experience, OR
- Bachelor's degree with 4-6 years of software engineering experience, OR
- Diploma with 6-8 years of software engineering experience

Functional Skills:

Must-Have Skills:
- Experience and proficiency with at least one development programming language/technology, such as database SQL and Python
- Experience with at least one Business Intelligence tool, such as Cognos, Tableau, or Spotfire
- Familiarity with automation technologies such as UiPath, and a desire to learn and support them
- Solid understanding of MuleSoft and ETL technologies (e.g., Informatica, Databricks)
- Understanding of AWS/cloud storage, hosting, and compute environments is required

Good-to-Have Skills:
- Experience in database programming languages and data modelling concepts, including Oracle SQL and PL/SQL
- Experience with API integrations such as MuleSoft
- Solid understanding of one or more general programming languages, including but not limited to Java or Python
- Outstanding written and verbal communication skills, and the ability to explain technical concepts to non-technical clients
- Sharp learning agility, problem solving, and analytical thinking
- Experience managing GxP systems and implementing GxP projects
- Extensive expertise in the SDLC, including requirements, design, testing, data analysis, and change control

Professional Certifications:
- Understanding and experience with Agile methodology and DevOps

Soft Skills:
- Strong communication and presentation skills
- Ability to work on multiple projects simultaneously
- Expertise in visualizing and manipulating large data sets
- Willingness to learn new technologies
- High learning agility, innovation, and analytical skills

Shift Information:
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 2 months ago

Apply

3 - 7 years

4 - 7 Lacs

Hyderabad

Work from Office

What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and implementing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member that assists in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with multi-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to standard processes for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation
- Collaborate and communicate effectively with product teams

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree with 4-6 years of experience in Computer Science, IT, or a related field, OR
- Bachelor's degree with 6-8 years of experience in Computer Science, IT, or a related field, OR
- Diploma with 10-12 years of experience in Computer Science, IT, or a related field

Functional Skills:

Must-Have Skills:
- Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, Spark SQL), including workflow orchestration and performance tuning of big data processing
- Hands-on experience with various Python/R packages for EDA, feature engineering, and machine learning model training
- Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Strong understanding of data governance frameworks, tools, and standard methodologies
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Good-to-Have Skills:
- Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Knowledge of Python/R, Databricks, SageMaker, OMOP

Professional Certifications:
- Certified Data Engineer / Data Analyst (preferred on Databricks or cloud environments)
- Certified Data Scientist (preferred on Databricks or cloud environments)
- Machine Learning Certification (preferred on Databricks or cloud environments)
- SAFe for Teams certification (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

Shift Information:
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
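Illustrative only, not part of the posting: a minimal pandas sketch of the EDA and feature-engineering work mentioned above, profiling a dataset and deriving one simple feature. The file name and columns are hypothetical.

```python
# Minimal EDA sketch (hypothetical file and columns): quick data profiling
# and a simple derived feature, done with pandas.
import pandas as pd

df = pd.read_csv("measurements.csv")  # hypothetical dataset

# Profile the data: shape, missingness per column, and summary statistics.
print(df.shape)
print(df.isna().mean().sort_values(ascending=False))  # fraction missing
print(df.describe(include="all"))

# Simple feature-engineering example: bucket a numeric column into quartiles.
df["value_quartile"] = pd.qcut(df["value"], q=4, labels=False)
print(df.groupby("value_quartile")["value"].agg(["min", "max", "count"]))
```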

Posted 2 months ago

Apply

3 - 8 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and objectives.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation/contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Develop and implement software solutions to meet business requirements
- Collaborate with cross-functional teams to analyze and address technical issues
- Conduct code reviews and provide feedback to enhance code quality
- Stay updated with industry trends and best practices in application development
- Assist in troubleshooting and resolving technical issues in applications

Professional & Technical Skills:
- Must-have skills: proficiency in Ab Initio
- Strong understanding of ETL processes and data integration
- Experience with data warehousing concepts and methodologies
- Hands-on experience developing and optimizing ETL workflows
- Knowledge of SQL and database management systems

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio
- This position is based at our Hyderabad office
- 15 years of full-time education is required

Posted 2 months ago

Apply

5 - 10 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and application functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead and mentor junior professionals
- Conduct regular team meetings to discuss progress and challenges
- Stay updated on industry trends and best practices

Professional & Technical Skills:
- Must-have skills: proficiency in Ab Initio
- Strong understanding of ETL processes
- Experience with data integration and data warehousing
- Knowledge of data quality and data governance principles
- Hands-on experience with Ab Initio GDE and EME tools

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio
- This position is based at our Chennai office
- 15 years of full-time education is required

Posted 2 months ago

Apply

2 - 7 years

10 - 20 Lacs

Pune, Chennai, Bengaluru

Hybrid

Salary: 10-30 LPA
Experience: 2-7 years
Location: Gurgaon/Pune/Bangalore/Chennai
Notice period: Immediate to 30 days

Key Responsibilities:
- 2+ years of strong hands-on experience with Ab Initio technology
- Good knowledge of Ab Initio components such as Reformat, Join, Sort, Rollup, Normalize, Scan, Lookup, and MFS; of Ab Initio parallelism; and of products such as Metadata Hub, Conduct>It, Express>It, and Control Center; a clear understanding of concepts such as metaprogramming, continuous flows, and PDL is good to have
- Very good knowledge of data warehouses, SQL, and Unix shell scripting
- Knowledge of the ETL side of cloud platforms such as AWS or Azure, and of the Hadoop platform, is an added advantage
- Experience working with banking domain data is an added advantage
- Excellent technical knowledge in the design, development, and validation of complex ETL features using Ab Initio
- Excellent knowledge of integration with upstream and downstream processes and systems
- Ensure compliance with technical standards and processes
- Ability to engage and collaborate with stakeholders to deliver assigned tasks with defined quality goals
- Can work independently with minimum supervision and help the development team with technical issues
- Good communication and analytical skills

Posted 2 months ago

Apply

5 - 10 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must have skills: Snowflake Data Warehouse
Good to have skills: Python (programming language), dbt (Data Build Tool)
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Your day will involve working on data solutions and collaborating with teams to optimize data processes.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Develop and maintain data pipelines
- Ensure data quality and integrity
- Implement ETL processes

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse
- Good-to-have skills: experience with dbt (Data Build Tool)
- Strong understanding of data architecture
- Proficiency in SQL and database management
- Experience with cloud data platforms
- Knowledge of data modeling

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse
- This position is based at our Bengaluru office
- 15 years of full-time education is required
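Illustrative only, not part of the posting: a minimal sketch of the kind of Snowflake load-and-verify step this role involves, using the snowflake-connector-python package. The account identifier, credentials, stage, and table names are hypothetical.

```python
# Minimal Snowflake sketch (hypothetical account/credentials/objects):
# connect with snowflake-connector-python, load staged files, verify counts.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",     # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

cur = conn.cursor()
try:
    # Typical ELT step: copy staged files into a table, then verify row count.
    cur.execute("COPY INTO orders FROM @orders_stage FILE_FORMAT = (TYPE = CSV)")
    cur.execute("SELECT COUNT(*) FROM orders")
    print("rows loaded:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```

In a dbt-based setup, the transformations downstream of this load would live in versioned SQL models rather than ad hoc scripts.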

Posted 2 months ago

Apply

7 - 12 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Snowflake Data Warehouse
Good to have skills: Python (Programming Language), Data Building Tool
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your role involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
Expected to be an SME.
Collaborate with and manage the team to perform its tasks.
Responsible for team decisions.
Engage with multiple teams and contribute to key decisions.
Provide solutions to problems for their immediate team and across multiple teams.
Develop and maintain data solutions for data generation, collection, and processing.
Create data pipelines to ensure efficient data flow.
Implement ETL processes for data migration and deployment.

Professional & Technical Skills:
Must Have Skills: Proficiency in Snowflake Data Warehouse.
Good To Have Skills: Experience with Data Building Tool, Python (Programming Language).
Strong understanding of data architecture and data modeling.
Experience in developing and optimizing ETL processes.
Knowledge of cloud data platforms and services.

Additional Information:
The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
This position is based at our Bengaluru office.
A 15 years full time education is required.
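Migrating and deploying data across systems, as this role describes, often comes down to incremental upserts. A hedged sketch of that pattern as a Snowflake MERGE run from Python; the connection is assumed to come from snowflake.connector.connect(...) as in the previous example, and all object names are hypothetical.

MERGE_SQL = """
MERGE INTO analytics.dim_customer AS tgt
USING staging.customer_updates AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET
    tgt.email = src.email,
    tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
    VALUES (src.customer_id, src.email, src.updated_at)
"""

def apply_incremental_load(con):
    # Run one incremental merge inside an explicit transaction.
    cur = con.cursor()
    cur.execute("BEGIN")
    try:
        cur.execute(MERGE_SQL)
        cur.execute("COMMIT")
    except Exception:
        cur.execute("ROLLBACK")
        raise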

Posted 2 months ago

Apply

5 - 7 years

0 - 0 Lacs

Thiruvananthapuram

Work from Office

Job Summary: We are seeking a highly skilled SAP BODS Data Engineer with strong expertise in ETL development and Enterprise Data Warehousing (EDW). The ideal candidate will have a deep understanding of SAP BusinessObjects Data Services (BODS) and will be responsible for designing, developing, and maintaining robust data integration solutions.

Key Responsibilities:
Design, develop, and implement efficient ETL solutions using SAP BODS.
Build and optimize SAP BODS jobs, including job design, data flows, scripting, and debugging.
Develop and maintain scalable data extraction, transformation, and loading (ETL) processes from diverse data sources.
Create and manage data integration workflows to ensure high performance and scalability.
Collaborate closely with data architects, analysts, and business stakeholders to deliver accurate and timely data solutions.
Ensure data quality and consistency across different systems and platforms.
Troubleshoot and resolve data-related issues in a timely manner.
Document all ETL processes and maintain technical documentation.

Required Skills & Qualifications:
3+ years of hands-on experience with ETL development using SAP BODS.
Strong proficiency in SAP BODS job design, data flow creation, scripting, and debugging.
Solid understanding of data integration, ETL concepts, and data warehousing principles.
Proficiency in SQL for data querying, data manipulation, and performance tuning.
Familiarity with data modeling concepts and major database systems (e.g., Oracle, SQL Server, SAP HANA).
Excellent problem-solving skills and keen attention to detail.
Strong communication and interpersonal skills to facilitate effective collaboration.
Ability to work independently, prioritize tasks, and manage multiple tasks in a dynamic environment.

Required Skills: SAP, EDW, ETL
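SAP BODS jobs are assembled in the Data Services Designer rather than written as code, so the "data quality and consistency" duty above can only be sketched by analogy. One common check is reconciling row counts after a load; the sketch below assumes DB-API 2.0 connections and hypothetical table names.

def _count(con, table):
    # Row count via a plain DB-API cursor so it works across drivers.
    cur = con.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

def reconcile_row_counts(src_con, tgt_con, src_table, tgt_table):
    # Fail loudly when a load dropped or duplicated rows.
    src_count = _count(src_con, src_table)
    tgt_count = _count(tgt_con, tgt_table)
    if src_count != tgt_count:
        raise ValueError(f"Load mismatch: {src_table}={src_count}, {tgt_table}={tgt_count}")
    return src_count

if __name__ == "__main__":
    import sqlite3
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE src (id INTEGER); CREATE TABLE tgt (id INTEGER);
        INSERT INTO src VALUES (1), (2); INSERT INTO tgt VALUES (1), (2);
    """)
    print(reconcile_row_counts(con, con, "src", "tgt"))  # -> 2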

Posted 2 months ago

Apply

4 - 9 years

10 - 14 Lacs

Gurugram

Work from Office

KDataScience (USA & INDIA) is looking for a Senior Power BI Developer to join our dynamic team and embark on a rewarding career journey.

Power BI Solution Development:
Design, develop, and implement Power BI dashboards and reports that meet business needs
Create visually appealing and user-friendly reports for various stakeholders
Optimize Power BI data models for performance and efficiency

Data Modeling and Transformation:
Work with stakeholders to understand data requirements and design appropriate data models
Perform data cleansing, transformation, and integration to ensure accurate and reliable reporting
Implement best practices for data modeling and establish data governance standards

Data Integration:
Integrate Power BI with various data sources, including databases, APIs, and other data repositories
Collaborate with data engineers to ensure seamless data flow between systems
Implement ETL processes to extract, transform, and load data into Power BI

Performance Monitoring and Optimization:
Monitor the performance of Power BI reports and dashboards and implement optimizations as needed
Identify and resolve data-related issues affecting reporting accuracy and reliability
Provide recommendations for infrastructure improvements to enhance performance

Collaboration and Training:
Collaborate with cross-functional teams, including data analysts, business users, and IT, to gather requirements
Conduct training sessions for end-users and team members on Power BI best practices and usage
Stay informed about the latest Power BI features and updates
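One concrete way rows reach Power BI from code is the REST API's "Datasets - Post Rows" call against a push dataset. A hedged sketch follows; the dataset ID, table name, row schema, and Azure AD token are placeholders, and token acquisition is omitted.

import requests  # pip install requests

DATASET_ID = "<dataset-id>"
TABLE = "Sales"
TOKEN = "<azure-ad-access-token>"

rows = [
    {"region": "North", "amount": 1250.0},
    {"region": "South", "amount": 980.0},
]

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/tables/{TABLE}/rows",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"rows": rows},
    timeout=30,
)
resp.raise_for_status()  # 200 means the rows were accepted into the push dataset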

Posted 2 months ago

Apply

7 - 12 years

9 - 14 Lacs

Mumbai

Work from Office

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: Microsoft SQL Server Integration Services (SSIS)
Good to have skills: Microsoft SQL Server Reporting Services
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Software Development Lead, you will be responsible for developing and configuring software systems, applying knowledge of technologies, methodologies, and tools to support clients or projects in Mumbai.

Roles & Responsibilities:
Expected to be an SME
Collaborate with and manage the team to perform its tasks
Responsible for team decisions
Engage with multiple teams and contribute to key decisions
Provide solutions to problems for their immediate team and across multiple teams
Lead the team in implementing innovative solutions
Mentor junior team members for their professional growth

Professional & Technical Skills:
Must Have Skills: Proficiency in Microsoft SQL Server Integration Services (SSIS)
Good To Have Skills: Experience with Microsoft SQL Server Reporting Services
Strong understanding of database management and optimization
Expertise in ETL processes and data integration
Ability to troubleshoot and resolve complex technical issues

Additional Information:
The candidate should have a minimum of 7.5 years of experience in Microsoft SQL Server Integration Services (SSIS).
This position is based at our Mumbai office.
A 15 years full-time education is required.
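SSIS packages are authored in SQL Server Data Tools, not in Python, but a package deployed to the SSIS catalog can be launched from code through the SSISDB stored procedures. A hedged sketch using pyodbc; the server, folder, project, and package names are placeholders.

import pyodbc  # pip install pyodbc

con = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>;DATABASE=SSISDB;Trusted_Connection=yes;",
    autocommit=True,  # avoid wrapping the catalog calls in a user transaction
)
cur = con.cursor()
cur.execute("""
    SET NOCOUNT ON;
    DECLARE @execution_id BIGINT;
    EXEC catalog.create_execution
        @folder_name  = N'ETL',
        @project_name = N'Warehouse',
        @package_name = N'LoadFacts.dtsx',
        @execution_id = @execution_id OUTPUT;
    EXEC catalog.start_execution @execution_id;
    SELECT @execution_id;
""")
print("Started SSIS execution:", cur.fetchone()[0])
con.close()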

Posted 2 months ago

Apply

3 - 8 years

5 - 10 Lacs

Gurugram

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Informatica PowerCenter, Oracle Procedural Language Extensions to SQL (PL/SQL)
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Professional & Technical Skills:
Must Have Skills: Proficiency in Informatica PowerCenter and Oracle PL/SQL
Strong understanding of ETL processes and data integration
Experience in developing complex data mappings and transformations
Knowledge of data warehousing concepts and best practices
Hands-on experience in performance tuning and optimization of ETL processes

Roles & Responsibilities:
Expected to be an SME
Collaborate with and manage the team to perform its tasks
Responsible for team decisions
Engage with multiple teams and contribute to key decisions
Provide solutions to problems for their immediate team and across multiple teams
Lead and mentor junior professionals
Drive innovation and continuous improvement in application development

Additional Information:
The candidate should have a minimum of 4 years of experience.
This position is based at our Gurugram office; visiting the client office twice a week is a must.
A 15 years full-time education is required.
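Informatica mappings are built in the PowerCenter Designer, but the PL/SQL half of the skill set can be shown directly. A hedged sketch running an anonymous PL/SQL block from Python with the python-oracledb driver; credentials and table names are placeholders.

import oracledb  # pip install oracledb

PLSQL_BLOCK = """
BEGIN
    -- Example transformation step: archive yesterday's staged rows.
    INSERT INTO sales_archive (order_id, amount, load_date)
        SELECT order_id, amount, load_date
        FROM sales_staging
        WHERE load_date < TRUNC(SYSDATE);
    DELETE FROM sales_staging WHERE load_date < TRUNC(SYSDATE);
    COMMIT;
END;
"""

con = oracledb.connect(user="<user>", password="<password>", dsn="<host>/<service>")
with con.cursor() as cur:
    cur.execute(PLSQL_BLOCK)
con.close()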

Posted 2 months ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Software Development Engineer, you will analyze, design, code, and test multiple components of application code across one or more clients. You will also perform maintenance, enhancements, and/or development work throughout the day.

Roles & Responsibilities:
Expected to be an SME
Collaborate with and manage the team to perform its tasks
Responsible for team decisions
Engage with multiple teams and contribute to key decisions
Provide solutions to problems for their immediate team and across multiple teams
Lead team meetings to discuss progress and challenges
Mentor junior team members to enhance their skills
Stay updated on industry trends and technologies to suggest improvements

Professional & Technical Skills:
Must Have Skills: Proficiency in SAP BusinessObjects Data Services
Strong understanding of ETL processes
Experience with data integration and data quality management
Hands-on experience in data modeling and database design
Knowledge of SAP systems and integration with other platforms

Additional Information:
The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services
This position is based at our Bengaluru office
A 15 years full-time education is required
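As an illustration of the "data quality management" skill above, here is a self-contained check for orphaned foreign keys between a fact load and its dimension, using the standard library's sqlite3 with hypothetical table names.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY);
    CREATE TABLE fact_sales (sale_id INTEGER, product_id INTEGER);
    INSERT INTO dim_product VALUES (1), (2);
    INSERT INTO fact_sales VALUES (10, 1), (11, 3);  -- product 3 has no dimension row
""")
orphans = con.execute("""
    SELECT f.sale_id, f.product_id
    FROM fact_sales f
    LEFT JOIN dim_product d ON d.product_id = f.product_id
    WHERE d.product_id IS NULL
""").fetchall()
print("Orphaned fact rows:", orphans)  # -> [(11, 3)]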

Posted 2 months ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Data Analysis & Interpretation
Good to have skills: Snowflake Data Warehouse
Minimum 5 year(s) of experience is required
Educational Qualification: Minimum 15 years of full-time education

Key Responsibilities:
1. Familiarize yourself with the Accenture standards and policies for working in a project and client environment
2. Work with the Project Manager and Project Lead to get your client user accounts created
3. Lead the overall Snowflake transformation journey for the customer
4. Design and develop the new solution in Snowflake Data Warehouse
5. Prepare a test strategy and an implementation plan for the solution
6. Play the role of an end-to-end Data Engineer

Technical Experience:
1. 2 years of hands-on experience specifically in Snowflake Data Warehouse design and development projects
2. 4 years of hands-on experience in the SQL programming language and PL/SQL
3. 1 year of experience in JavaScript or any programming language (Python, ReactJS, Angular)
4. Good understanding of cloud data warehouse concepts, data warehousing concepts and dimensional modelling concepts
5. 1 year of experience in ETL technologies (Informatica, DataStage, Talend, SAP BODS, Ab Initio, etc.)

Professional Attributes:
1. Should be fluent in English communication
2. Should have handled direct client interactions in the past
3. Should be clear in written communications
4. Should have strong interpersonal skills
5. Should be conscious of European professional etiquette

Additional Info: Exposure to AWS, Amazon S3 and other Amazon cloud-hosting products related to analytics or databases
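The dimensional-modelling concepts this posting asks for include slowly changing dimensions. Below is a hedged, in-memory illustration of a Type 2 update, which closes the current row and appends a new version instead of overwriting; field names are hypothetical.

from datetime import date

dim_customer = [
    {"customer_id": 7, "city": "Noida", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
]

def scd2_update(dim, customer_id, new_city, as_of):
    # Type 2 change: expire the current row, then append the new version.
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return  # nothing changed, nothing to version
            row["valid_to"] = as_of
            row["is_current"] = False
    dim.append({"customer_id": customer_id, "city": new_city,
                "valid_from": as_of, "valid_to": None, "is_current": True})

scd2_update(dim_customer, 7, "Gurugram", date(2024, 6, 1))
for row in dim_customer:
    print(row)  # the Noida row is closed out; the Gurugram row is current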

Posted 2 months ago

Apply

3 - 8 years

5 - 9 Lacs

Bhubaneswar, Kolkata, Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Bhubaneswar.

Roles & Responsibilities:
Expected to perform independently and become an SME.
Active participation/contribution in team discussions is required.
Contribute to providing solutions to work-related problems.
Develop and implement software programs to meet business requirements.
Collaborate with team members to design and develop applications.
Troubleshoot and debug applications to ensure optimal performance.
Conduct code reviews and provide feedback to improve code quality.
Stay updated with industry trends and technologies to enhance application development.

Professional & Technical Skills:
Must Have Skills: Proficiency in Ab Initio.
Strong understanding of ETL processes and data integration.
Experience with data warehousing concepts and methodologies.
Hands-on experience in developing and optimizing data pipelines.
Knowledge of SQL and database management systems.

Additional Information:
The candidate should have a minimum of 3 years of experience in Ab Initio.
This position is based at our Bhubaneswar office.
A 15 years full time education is required.

Posted 2 months ago

Apply

3 - 8 years

5 - 10 Lacs

Bhubaneswar, Kolkata, Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality software solutions.

Roles & Responsibilities:
Expected to perform independently and become an SME.
Active participation/contribution in team discussions is required.
Contribute to providing solutions to work-related problems.
Collaborate with cross-functional teams to gather and analyze requirements.
Design, develop, and test applications based on business requirements.
Troubleshoot and debug issues in existing applications.
Ensure the performance, quality, and responsiveness of applications.
Participate in code reviews to maintain code quality.
Stay up-to-date with emerging technologies and industry trends.
Provide technical guidance and support to junior team members.

Professional & Technical Skills:
Must Have Skills: Proficiency in Ab Initio.
Good To Have Skills: Experience with data integration tools.
Strong understanding of ETL concepts and data warehousing principles.
Experience in designing and developing ETL workflows using Ab Initio.
Knowledge of SQL and database concepts.
Familiarity with version control systems such as Git.
Excellent problem-solving and analytical skills.

Additional Information:
The candidate should have a minimum of 3 years of experience in Ab Initio.
This position is based at our Bhubaneswar office.
A 15 years full time education is required.
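Designing ETL workflows, whatever the tool, reduces to running steps in dependency order. A minimal sketch of that idea with the standard library's graphlib; the step names are hypothetical.

from graphlib import TopologicalSorter  # standard library since Python 3.9

def extract():    print("extract source files")
def cleanse():    print("cleanse and validate")
def load_dims():  print("load dimension tables")
def load_facts(): print("load fact tables")

steps = {"extract": extract, "cleanse": cleanse,
         "load_dims": load_dims, "load_facts": load_facts}
deps = {  # each step lists the steps it depends on
    "cleanse": {"extract"},
    "load_dims": {"cleanse"},
    "load_facts": {"cleanse", "load_dims"},
}

for name in TopologicalSorter(deps).static_order():
    steps[name]()  # extract -> cleanse -> load_dims -> load_facts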

Posted 2 months ago

Apply

3 - 8 years

5 - 9 Lacs

Bhubaneswar, Kolkata, Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Bhubaneswar.

Roles & Responsibilities:
Expected to perform independently and become an SME.
Active participation/contribution in team discussions is required.
Contribute to providing solutions to work-related problems.
Develop and implement software programs to meet business requirements.
Collaborate with team members to design and develop applications.
Troubleshoot and debug applications to ensure optimal performance.
Conduct code reviews and provide feedback to improve code quality.
Stay updated on industry trends and technologies to enhance application development.

Professional & Technical Skills:
Must Have Skills: Proficiency in Ab Initio.
Strong understanding of ETL processes and data integration.
Experience with data warehousing concepts and methodologies.
Hands-on experience in developing and optimizing data pipelines.
Good To Have Skills: Experience with data modeling and database design.

Additional Information:
The candidate should have a minimum of 3 years of experience in Ab Initio.
This position is based at our Bhubaneswar office.
A 15 years full time education is required.

Posted 2 months ago

Apply