3.0 - 7.0 years
5 - 9 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Description: KPI Partners is seeking an experienced Senior Snowflake Administrator to join our dynamic team. In this role, you will be responsible for managing and optimizing our Snowflake environment to ensure performance, reliability, and scalability. Your expertise will contribute to designing and implementing best practices to facilitate efficient data warehousing solutions. Key Responsibilities: - Administer and manage the Snowflake platform, ensuring optimal performance and security. - Monitor system performance, troubleshoot issues, and implement necessary solutions. - Collaborate with data architects and engineers to design data models and optimize ETL processes. - Conduct regular backup and recovery procedures to protect data integrity. - Implement user access controls and security measures to safeguard data. - Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs. - Participate in the planning and execution of data migration to Snowflake. - Provide support for data governance and compliance initiatives. - Stay updated on Snowflake features and best practices, and provide recommendations for continuous improvement. Qualifications: - Bachelor's degree in Computer Science, Information Technology, or a related field. - 5+ years of experience in database administration, with a strong focus on Snowflake. - Hands-on experience with SnowSQL, SQL, and data modeling. - Familiarity with data ingestion tools and ETL processes. - Strong problem-solving skills and the ability to work independently. - Excellent communication skills and the ability to collaborate with technical and non-technical stakeholders. - Relevant certifications in Snowflake or cloud data warehousing are a plus. If you are a proactive, detail-oriented professional with a passion for data and experience in Snowflake administration, we would love to hear from you. Join KPI Partners and be part of a team that is dedicated to delivering exceptional data solutions for our clients.
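For illustration, a minimal sketch of the access-control and warehouse-tuning work this role covers, using the snowflake-connector-python client; the account, warehouse, role, and database names are hypothetical placeholders, not part of the posting:

```python
# Illustrative sketch: routine Snowflake administration via the
# snowflake-connector-python client. All object names (ANALYTICS_WH,
# REPORTING_RO, SALES_DB) are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",   # hypothetical account identifier
    user="admin_user",
    password="***",
    role="SECURITYADMIN",
)

statements = [
    # Right-size a warehouse and cap idle credit usage.
    "ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'MEDIUM' AUTO_SUSPEND = 300",
    # Create a read-only role and grant it least-privilege access.
    "CREATE ROLE IF NOT EXISTS REPORTING_RO",
    "GRANT USAGE ON DATABASE SALES_DB TO ROLE REPORTING_RO",
    "GRANT USAGE ON SCHEMA SALES_DB.PUBLIC TO ROLE REPORTING_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA SALES_DB.PUBLIC TO ROLE REPORTING_RO",
]

cur = conn.cursor()
try:
    for sql in statements:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```

Granting through roles rather than directly to users, under SECURITYADMIN, is what keeps privileges auditable in this kind of setup.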
Posted 2 weeks ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: ETL Developer Location: Hyderabad (5 days WFO) Experience Required: 4+ years as an ETL Developer We are looking for a talented Talend Developer with hands-on experience in Talend Management Console on Cloud and Snowflake to join our growing team. The ideal candidate will play a key role in building and optimizing ETL/ELT data pipelines, integrating complex data systems, and ensuring high performance across cloud environments. While experience with Informatica is a plus, it is not mandatory for this role. As a Talend Developer, you will be responsible for designing, developing, and maintaining data integration solutions to meet the organization’s growing data needs. You will collaborate with business stakeholders, data architects, and other data professionals to ensure the seamless and secure movement of data across platforms, ensuring scalability and performance. Key Responsibilities: Develop and maintain ETL/ELT data pipelines using Talend Management Console on Cloud to integrate data from various on-premises and cloud-based sources. Design, implement, and optimize data flows for data ingestion, processing, and transformation in Snowflake to support analytical and reporting needs. Utilize Talend Management Console on Cloud to manage, deploy, and monitor data integration jobs, ensuring robust pipeline management and process automation. Collaborate with data architects to ensure that the data integration solutions align with business requirements and follow best practices. Ensure data quality, performance, and scalability of Talend-based data solutions. Troubleshoot, debug, and optimize existing ETL processes to ensure smooth and efficient data integration. Document data integration processes, including design specifications, mappings, workflows, and performance optimizations. Collaborate with the Snowflake team to implement best practices for data warehousing and data transformation. Implement error-handling and data validation processes to ensure high levels of accuracy and data integrity. Provide ongoing support for Talend jobs, including post-deployment monitoring, troubleshooting, and optimization. Participate in code reviews and collaborate in an agile development environment. Required Qualifications: 2+ years of experience in Talend development, with a focus on using the Talend Management Console on Cloud for managing and deploying jobs. Strong hands-on experience with Snowflake data warehouse, including data integration and transformation. Expertise in developing ETL/ELT workflows for data ingestion, processing, and transformation. Experience with SQL and working with relational databases to extract and manipulate data. Experience working in cloud environments (e.g., AWS, Azure, or GCP) with integration of cloud-based data platforms. Strong knowledge of data integration, data quality, and performance optimization in Talend. Ability to troubleshoot and resolve issues in data integration jobs and processes. Solid understanding of data modeling concepts and best practices for building scalable data pipelines. Preferred Qualifications: Experience with Informatica is a plus but not mandatory. Experience with scripting languages such as Python or Shell scripting for automation. Familiarity with CI/CD pipelines and working in DevOps environments for continuous integration of Talend jobs. Knowledge of data governance and data security practices in cloud environments.
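As a rough illustration of the Snowflake side of such an ELT pipeline (in practice a Talend job deployed via the Management Console would orchestrate these steps), here is a minimal staged-load-and-merge sketch; the stage, table, and column names are hypothetical:

```python
# Illustrative sketch of a Snowflake ELT step: stage raw files, then
# merge them into a curated target table. Object names are hypothetical;
# a Talend job would normally orchestrate this sequence.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="RAW_DB", schema="PUBLIC",
)
cur = conn.cursor()

# Ingest staged CSV files into a landing table.
cur.execute("""
    COPY INTO RAW_ORDERS
    FROM @ORDERS_STAGE
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Upsert landing data into the curated target (a simple ELT transform).
cur.execute("""
    MERGE INTO CURATED.ORDERS t
    USING RAW_ORDERS s ON t.ORDER_ID = s.ORDER_ID
    WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS, t.UPDATED_AT = s.UPDATED_AT
    WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS, UPDATED_AT)
        VALUES (s.ORDER_ID, s.STATUS, s.UPDATED_AT)
""")
conn.close()
```

Loading with COPY INTO and transforming with MERGE keeps the heavy lifting inside Snowflake, which is the usual ELT pattern with this stack.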
Posted 2 weeks ago
6.0 years
0 Lacs
Mumbai, Maharashtra, India
Remote
We are seeking a talented individual to join our Metrics, Analytics & Reporting team at Marsh. This role will be based in Mumbai. This is a hybrid role that has a requirement of working at least three days a week in the office. Senior Manager - Metrics, Analytics & Reporting (Scrum Master) We will count on you to: Promote Agile principles and practices across teams, ensure Agile/Scrum concepts and principles are adhered to, and where necessary coach the teams in implementing and practicing them. Act as a bridge between development teams and stakeholders. Foster a culture of trust, collaboration, and accountability. Organize and facilitate Scrum ceremonies for Scrum teams. Track Scrum metrics, including team velocity and sprint/release progress, and communicate these internally and externally to improve transparency. Help and coach the product owner to establish and enforce sprint priorities and release delivery deadlines. Ensure business objectives are understood and achieved as per sprint commitments. Identify and remove obstacles to team progress. Prevent distractions that interfere with the ability of the team to deliver the sprint goals, through mediation, arbitration, and mitigation, addressing impediments with team members and the organizational hierarchy. Enable self-organizing, cross-functional teams. Ensure the Definition of Ready (DoR) is met for all prioritized requirements. Encourage adherence to the Definition of Done (DoD). Drive a collaborative and supportive team culture through team building and engagement practices. Drive continuous improvement through team retrospectives and facilitating process enhancements. Identify and resolve conflicts, promote constructive dialogue, and encourage innovation. Work closely with other Scrum Masters to align cross-team dependencies and best practices. What you need to have: 6+ years of experience as a Scrum Master in a distributed Agile team with CSM or equivalent certification. Solid understanding of Agile frameworks (Scrum, Kanban, SAFe, etc.). Proficiency in Jira/Confluence and Azure DevOps, and familiarity with different Agile practices such as Kanban/Lean. Proven track record of being a servant leader in a Scrum team, driving teams, removing blockers, and improving processes through retrospectives. Strong facilitation, conflict resolution, and mentoring skills. Ability to assist technical team members and senior non-technical product owners in making appropriate decisions (stakeholder management). Comfortable with responsibility for delivering results and resilient enough to handle pressure in balancing time, quality, and scope. Proven ability to coach and mentor others, a positive approach to complex problems, and a can-do attitude. Assertive and fact-based communicator, able to explain technical issues to a business audience and vice versa. Experience as a self-starter in a rapidly evolving and ambiguous environment, continuously learning and problem-solving quickly. Ability to identify and articulate risks and constructively challenge assumptions. Strong team player with influencing and negotiation skills in a virtual/remote environment, working with customers/developers across the globe. Excellent communication and interpersonal skills. Experience working with distributed or hybrid teams. What makes you stand out?
Understanding of the Data Quality domain and experience in delivering KPI dashboards. Track record of successful Agile transformations or scaling initiatives. Strong analytical mindset with a data-driven approach to problem-solving. Exposure to solutions such as SQL, QlikView, Qlik Sense, Informatica DQ, Power BI. Strong insurance and/or insurance broking business domain knowledge. SAFe 6 certification would be a big plus. Why join our team: We help you be your best through professional development opportunities, interesting work and supportive leaders. We foster a vibrant and inclusive culture where you can work with talented colleagues to create new solutions and have impact for colleagues, clients and communities. Our scale enables us to provide a range of career opportunities, as well as benefits and rewards to enhance your well-being. Marsh, a business of Marsh McLennan (NYSE: MMC), is the world’s top insurance broker and risk advisor. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit marsh.com, or follow on LinkedIn and X. Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law. Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one “anchor day” per week on which their full team will be together in person.
Posted 2 weeks ago
7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Job Information Date Opened 07/23/2025 Job Type Permanent RSD NO 10371 Industry IT Services Min Experience 15+ Max Experience 15+ City Chennai State/Province Tamil Nadu Country India Zip/Postal Code 600018 Job Description Job Summary: We are seeking a Data Architect to design and implement scalable, secure, and efficient data solutions that support Convey Health Solutions' business objectives. This role will focus on data modeling, cloud data platforms, ETL processes, and analytics solutions, ensuring compliance with healthcare regulations (HIPAA, CMS guidelines). The ideal candidate will collaborate with data engineers, BI analysts, and business stakeholders to drive data-driven decision-making. Key Responsibilities: Enterprise Data Architecture: Design and maintain the overall data architecture to support Convey Health Solutions’ data-driven initiatives. Cloud & Data Warehousing: Architect cloud-based data solutions (AWS, Azure, Snowflake, BigQuery) to optimize scalability, security, and performance. Data Modeling: Develop logical and physical data models for structured and unstructured data, supporting analytics, reporting, and operational processes. ETL & Data Integration: Define strategies for data ingestion, transformation, and integration, leveraging ETL tools like Informatica, Talend, dbt, or Apache Airflow. Data Governance & Compliance: Ensure data quality, security, and compliance with HIPAA, CMS, and SOC 2 standards. Performance Optimization: Optimize database performance, indexing strategies, and query performance for real-time analytics. Collaboration: Partner with data engineers, software developers, and business teams to align data architecture with business objectives. Technology Innovation: Stay up to date with emerging data technologies, AI/ML applications, and industry trends in healthcare data analytics. Required Qualifications: Education: Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field. Experience: 7+ years of experience in data architecture, data engineering, or related roles. Technical Skills: Strong expertise in SQL, NoSQL, and data modeling techniques. Hands-on experience with cloud data platforms (AWS Redshift, Snowflake, Google BigQuery, Azure Synapse). Experience with ETL frameworks (Informatica, Talend, dbt, Apache Airflow, etc.). Knowledge of big data technologies (Spark, Hadoop, Databricks). Strong understanding of data security and compliance (HIPAA, CMS, SOC 2, GDPR). Soft Skills: Strong analytical, problem-solving, and communication skills. Ability to work in a collaborative, agile environment. Preferred Qualifications: Experience in healthcare data management, claims processing, risk adjustment, or pharmacy benefit management (PBM). Familiarity with AI/ML applications in healthcare analytics. Certifications in cloud data platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.). At Indium, diversity, equity, and inclusion (DEI) are the cornerstones of our values. We champion DEI through a dedicated council, expert sessions, and tailored training programs, ensuring an inclusive workplace for all. Our initiatives, including the WE@IN women empowerment program and our DEI calendar, foster a culture of respect and belonging. Recognized with the Human Capital Award, we are committed to creating an environment where every individual thrives. Join us in building a workplace that values diversity and drives innovation.
Posted 2 weeks ago
10.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Location: Bangalore - Karnataka, India - EOIZ Industrial Area Job Family: Artificial Intelligence & Machine Learning Worker Type Reference: Regular - Permanent Pay Rate Type: Salary Career Level: T4(A) Job ID: R-46721-2025 Description & Requirements Introduction: A Career at HARMAN Technology Services (HTS) We’re a global, multi-disciplinary team that’s putting the innovative power of technology to work and transforming tomorrow. At HARMAN HTS, you solve challenges by creating innovative solutions. Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity’s needs. Work at the convergence of cross-channel UX, cloud, insightful data, IoT and mobility. Empower companies to create new digital business models, enter new markets, and improve customer experiences. Role: Data Architect with Microsoft Azure + Fabric + Purview. Skill Experience Required: 10+ years. Key Responsibilities of the role include: Data Engineering: Develop and implement data engineering projects, including data lakehouse or big data platforms. Knowledge of Azure Purview is a must. Knowledge of Azure Data Fabric. Ability to define reference data architecture. Cloud-native data platform experience in the Microsoft data stack, including Azure Data Factory and Databricks on Azure. Knowledge of the latest data trends, including data fabric and data mesh. Robust knowledge of ETL, data transformation, and data standardization approaches. Key contributor to the growth of the CoE, influencing client revenues through data and analytics solutions. Lead the selection, deployment, and management of data tools, platforms, and infrastructure. Ability to technically guide a team of data engineers. Oversee the design, development, and deployment of data solutions. Define, differentiate, and strategize new data services/offerings and create reference architecture assets. Drive partnerships with vendors on collaboration, capability building, go-to-market strategies, etc. Guide and inspire the organization about the business potential and opportunities around data. Network with domain experts. Collaborate with client teams to understand their business challenges and needs. Develop and propose data solutions tailored to clients' specific requirements. Influence client revenues through innovative solutions and thought leadership. Lead client engagements from project initiation to deployment. Build and maintain strong relationships with key clients and stakeholders. Build reusable methodologies, pipelines, and models. Create data pipelines for more efficient and repeatable data science projects. Design and implement data architecture solutions that support business requirements and meet organizational needs. Collaborate with stakeholders to identify data requirements and develop data models and data flow diagrams. Work with cross-functional teams to ensure that data is integrated, transformed, and loaded effectively across different platforms and systems. Develop and implement data governance policies and procedures to ensure that data is managed securely and efficiently. Develop and maintain a deep understanding of data platforms, technologies, and tools, and evaluate new technologies and solutions to improve data management processes. Ensure compliance with regulatory and industry standards for data management and security. Develop and maintain data models, data warehouses, data lakes and data marts to support data analysis and reporting. Ensure data quality, accuracy, and consistency across all data sources.
Knowledge of ETL and data integration tools such as Informatica, Qlik Talend, and Apache NiFi. Experience with data modeling and design tools such as ERwin, PowerDesigner, or ER/Studio. Knowledge of data governance, data quality, and data security best practices. Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform. Familiarity with programming languages such as Python, Java, or Scala. Experience with data visualization tools such as Tableau, Power BI, or QlikView. Understanding of analytics and machine learning concepts and tools. Knowledge of project management methodologies and tools to manage and deliver complex data projects. Skilled in using relational database technologies such as MySQL, PostgreSQL, and Oracle, as well as NoSQL databases such as MongoDB and Cassandra. Strong expertise in cloud-based databases such as Azure Data Lake, Synapse, Azure Data Factory, AWS Glue, Amazon Redshift, and Azure SQL. Knowledge of big data technologies such as Hadoop, Spark, Snowflake, Databricks, and Kafka to process and analyze large volumes of data. Proficient in data integration techniques to combine data from various sources into a centralized location. Strong data modeling, data warehousing, and data integration skills. People & Interpersonal Skills: Build and manage a high-performing team of data engineers and other specialists. Foster a culture of innovation and collaboration within the data team and across the organization. Demonstrate the ability to work in diverse, cross-functional teams in a dynamic business environment. Candidates should be confident, energetic self-starters with strong communication skills. Candidates should exhibit superior presentation skills and the ability to present compelling solutions which guide and inspire. Provide technical guidance and mentorship to the data team. Collaborate with other stakeholders across the company to align the vision and goals. Communicate and present the data capabilities and achievements to clients and partners. Stay updated on the latest trends and developments in the data domain. What is required for the role? 10+ years of experience in the information technology industry with a strong focus on data engineering and architecture, preferably as an Azure data engineering lead. 8+ years of data engineering or data architecture experience in successfully launching, planning, and executing advanced data projects. Data governance experience is mandatory. MS Fabric certified. Experience in working on RFPs/proposals, presales activities, business development, and overseeing delivery of data projects is highly desired. Educational Qualification: A master’s or bachelor’s degree in computer science, data science, information systems, operations research, statistics, applied mathematics, economics, engineering, or physics. The candidate should have demonstrated the ability to manage data projects and diverse teams, and should have experience in creating data and analytics solutions. Experience in building data solutions in any one or more domains: Industrial, Healthcare, Retail, Communication. Problem-solving, communication, and collaboration skills. Good knowledge of data visualization and reporting tools. Ability to normalize and standardize data as per key KPIs and metrics. Benefits: Opportunities for professional growth and development. Collaborative and supportive work environment. What We Offer: Access to employee discounts on world-class HARMAN/Samsung products (JBL, Harman Kardon, AKG, etc.)
Professional development opportunities through HARMAN University’s business and leadership academies. An inclusive and diverse work environment that fosters and encourages professional and personal development. You Belong Here HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you – all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want. About HARMAN: Where Innovation Unleashes Next-Level Technology Ever since the 1920s, we’ve been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today’s most sought-after performers, while our digital transformation solutions serve humanity by addressing the world’s ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners and each other. If you’re ready to innovate and do work that makes a lasting impact, join our talent community today! Important Notice: Recruitment Scams Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address.
We will never ask for payments, banking, credit card, personal financial information or access to your LinkedIn/email account during the screening, interview, or recruitment process. If you are asked for such information or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to us through: harmancareers@harman.com. HARMAN is proud to be an Equal Opportunity employer. HARMAN strives to hire the best qualified candidates and is committed to building a workforce representative of the diverse marketplaces and communities of our global colleagues and customers. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. HARMAN attracts, hires, and develops employees based on merit, qualifications and job-related performance. (www.harman.com)
Posted 2 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Educational Qualification: Bachelor of Engineering, BSc, BCA, MCA, MTech, MSc Service Line: Infosys Quality Engineering Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Knowledge of design principles and fundamentals of architecture. Understanding of performance engineering. Knowledge of quality processes and estimation techniques. Basic understanding of project domain. Ability to translate functional/nonfunctional requirements to system requirements. Ability to design and code complex programs. Ability to write test cases and scenarios based on the specifications. Good understanding of SDLC and agile methodologies. Awareness of latest technologies and trends. Logical thinking and problem-solving skills along with an ability to collaborate. Technical and Professional Requirements: Primary skills: Technology-ETL & Data Quality-ETL - Others. Preferred Skills: Technology-ETL & Data Quality-ETL - Others
Posted 2 weeks ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description: Job Title: Salesforce Architect – Data Cloud & Marketing Cloud. Hiring Locations: Bangalore, Pune, Trivandrum, Kochi, Hyderabad, Chennai. Experience Range: Total IT Experience: 8+ years. Salesforce Marketing Cloud Experience: Minimum 5 years (hands-on). Salesforce Data Cloud (CDP) Experience: Minimum 2 years. Leadership Experience: Experience in leading cross-functional teams and mentoring junior architects. Must Have Skills: Platform Expertise: Strong hands-on experience with Salesforce Data Cloud (formerly CDP): data unification, identity resolution, calculated insights, segmentation, data streams, harmonization rules. Deep hands-on expertise in Salesforce Marketing Cloud: Journey Builder, Email Studio, Mobile Studio, Automation Studio, Contact Builder. Development using AMPscript, SSJS, SQL, HTML/CSS, JavaScript. Integration experience using REST/SOAP APIs. Data model design and audience segmentation for large-scale, multi-channel campaigns. Design of real-time and batch-based data ingestion and activation flows. Proven ability to translate complex business requirements into scalable Salesforce architecture. Strong experience integrating Salesforce Marketing Cloud with Sales Cloud, Service Cloud, and third-party platforms. Experience in delivering projects in Agile environments, including sprint planning and estimation. Experience with ETL tools like MuleSoft, Informatica, or Talend. Ability to create architecture diagrams, reusable frameworks, and technical documentation. Awareness of data privacy laws (e.g., GDPR, CAN-SPAM) and compliance standards. Good To Have Skills: Experience with: Marketing Cloud Personalization (Interaction Studio), Datorama, Pardot, Social Studio, AWS/GCP for data storage or event processing. Familiarity with: Salesforce Administrator and Platform Developer I capabilities, Salesforce Marketing Cloud Personalization. Experience developing POCs and custom demos for client presentations. Experience working with enterprise architecture frameworks. Exposure to data governance, security models, and compliance audits. Certifications Required: Salesforce Marketing Cloud Consultant, Salesforce Marketing Cloud Developer, Salesforce Data Cloud Consultant. Nice To Have: Salesforce Administrator (ADM201), Platform Developer I, Marketing Cloud Personalization Specialist. Key Responsibilities: Architect and implement unified customer data strategies using Data Cloud. Lead technical discussions and requirement-gathering with business and technical teams. Design scalable multi-channel SFMC solutions for campaign execution. Manage integrations with Salesforce core clouds and external systems. Mentor developers, review code/designs, and ensure delivery. Create documentation, standards, and best practices. Ensure governance, compliance, and high delivery quality across engagements. Skills: Salesforce, AMPscript, JavaScript
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
We are looking for a skilled ETL Tester with hands-on experience in SQL and Python to join our Quality Engineering team. The ideal candidate will be responsible for validating data pipelines, ensuring data quality, and supporting the end-to-end ETL testing lifecycle in a fast-paced environment. Design, develop, and execute test cases for ETL workflows and data pipelines. Perform data validation and reconciliation using advanced SQL queries. Use Python for automation of test scripts, data comparison, and validation tasks. Work closely with Data Engineers and Business Analysts to understand data transformations and business logic. Perform root cause analysis of data discrepancies and report defects in a timely manner. Validate data across source systems, staging, and target data stores (e.g., Data Lakes, Data Warehouses). Participate in Agile ceremonies, including sprint planning and daily stand-ups. Maintain test documentation including test plans, test cases, and test results. Required qualifications to be successful in this role: 5+ years of experience in ETL/Data Warehouse testing. Strong proficiency in SQL (joins, aggregations, window functions, etc.). Experience in Python scripting for test automation and data validation. Hands-on experience with tools like Informatica, Talend, Apache NiFi, or similar ETL tools. Understanding of data models, data marts, and star/snowflake schemas. Familiarity with test management and bug tracking tools (e.g., JIRA, HP ALM). Strong analytical, debugging, and problem-solving skills. Good to Have: Exposure to Big Data technologies (e.g., Hadoop, Hive, Spark). Experience with Cloud platforms (e.g., AWS, Azure, GCP) and related data services. Knowledge of CI/CD tools and automated data testing frameworks. Experience working in Agile/Scrum teams. Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.
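By way of illustration, a minimal sketch of the Python-based data validation this role calls for: reconciling a source extract against the loaded target on row counts, keys, and a transformed column. The file names and columns are hypothetical placeholders:

```python
# Illustrative sketch of a basic ETL reconciliation check in Python,
# comparing a source extract with the loaded target. File names and
# key/amount columns are hypothetical.
import pandas as pd

source = pd.read_csv("source_extract.csv")
target = pd.read_csv("target_extract.csv")

# 1. Row-count reconciliation.
assert len(source) == len(target), (
    f"Row count mismatch: source={len(source)}, target={len(target)}"
)

# 2. Key-level comparison: find records present on one side only.
merged = source.merge(target, on="order_id", how="outer",
                      suffixes=("_src", "_tgt"), indicator=True)
orphans = merged[merged["_merge"] != "both"]
print(f"Unmatched keys: {len(orphans)}")

# 3. Column-level drift on matched rows (e.g., a transformed amount).
matched = merged[merged["_merge"] == "both"]
drift = matched[matched["amount_src"].round(2) != matched["amount_tgt"].round(2)]
print(f"Rows with amount drift: {len(drift)}")
```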
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
The team responsible for Business Analytics at Seagate is seeking a skilled individual to join as an SAP BODS Developer. In this role, you will be involved in SAP BODS Development projects, including requirements analysis, solution conception, implementation, and development. Working closely with various cross-functional teams, you will be tasked with developing solutions related to BODS, architecting, developing, and maintaining BODS jobs, as well as designing complex data flows and workflows. Your responsibilities will also include ensuring timely delivery, adherence to scope, and industry best practices. To excel in this role, you should possess expertise in SAP BODS, excellent verbal and written communication skills, and strong analytical capabilities. Familiarity with offshore/onsite work models, the ability to articulate complex problems and solutions clearly, and experience collaborating virtually with professionals from diverse backgrounds are crucial. Problem-solving skills and a collaborative team player mindset are essential traits for success. Ideal candidates will have hands-on experience in SAP BODS tool implementation, designing and developing ETL data flows and jobs, and executing data migration strategies involving SAP BW4HANA and Enterprise HANA. Proficiency in various Data Services transformations, such as Map Operation, Table Comparison, Row-Generation, and SQL transformations, is required. Additionally, knowledge of integrating non-SAP/Cloud systems with SAP BW4HANA using Data Services, SQL/PLSQL, and SAP BW is highly beneficial. Familiarity with BODS administration, SAP BW, and exposure to SDI/SDA/Informatica will be advantageous. The position is based in Pune, India, offering a dynamic work environment with innovative projects and various on-site amenities for personal and professional development. Employees can enjoy meals from multiple cafeterias, participate in recreational activities like walkathons and sports competitions, and engage in technical learning opportunities through the Technical Speaker Series. Cultural festivals, celebrations, and community volunteer activities further enrich the vibrant workplace culture. Join our team in Pune and contribute to our cutting-edge work in Business Analytics by leveraging your SAP BODS expertise and collaborative skills effectively.
Posted 2 weeks ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Knowledge, Skills, And Abilities: Ability to translate a logical data model into a relational or non-relational solution. Expert in one or more of the following ETL tools: SSIS, Azure Data Factory, AWS Glue, Matillion, Talend, Informatica, Fivetran. Hands-on experience in setting up end-to-end cloud-based data lakes. Hands-on experience in database development using views, SQL scripts, and transformations. Ability to translate complex business problems into data-driven solutions. Working knowledge of reporting tools like Power BI, Tableau, etc. Ability to identify data quality issues that could affect business outcomes. Flexibility in working across different database technologies and propensity to learn new platforms on-the-fly. Strong interpersonal skills. Team player prepared to lead or support depending on the situation.
Posted 2 weeks ago
15.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Project Role: Technology OpS Support Practitioner. Project Role Description: Own the integrity and governance of systems, including best practices for delivering services. Develop, deploy and support infrastructures, applications and technology initiatives from an architectural and operational perspective in conjunction with existing standards and methods of delivery. Must have skills: Informatica PowerCenter, Data Warehouse implementation, support for the TDM licensing package. Good to have skills: NA. Minimum 15 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As a Technology OpS Support Practitioner, you will be responsible for ensuring the integrity and governance of systems while adhering to best practices for service delivery. Your typical day will involve developing, deploying, and supporting infrastructures, applications, and technology initiatives, all while aligning with existing standards and methods of delivery. You will collaborate with various teams to ensure operational excellence and contribute to the overall success of technology initiatives within the organization. Roles & Responsibilities: - Expected to be a Subject Matter Expert with deep knowledge and experience. - Should have influencing and advisory skills. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Facilitate training sessions and workshops to enhance team capabilities. - Monitor and evaluate the effectiveness of implemented solutions and make necessary adjustments. Professional & Technical Skills: - Must Have Skills: Proficiency in Informatica PowerCenter. - Strong understanding of data integration and ETL processes. - Experience with data warehousing concepts and methodologies. - Familiarity with database management systems and SQL. - Ability to troubleshoot and resolve technical issues efficiently. Additional Information: - The candidate should have a minimum of 15 years of experience in Informatica PowerCenter. - This position is based at our Mumbai office. - A 15 years full time education is required.
Posted 2 weeks ago
10.0 - 15.0 years
20 - 35 Lacs
Noida, Bengaluru
Work from Office
Description: We are looking for a Python Developer with working knowledge of ETL workflows. Experience in data extraction using APIs and writing queries in PostgreSQL is mandatory. Requirements: Need a Python developer who has good experience in Python programming and problem solving. Should be good in data structures and implementation. Should be good in databases, i.e., relational databases and SQL. Should be proficient in requirements and implementation. Should have a degree in Computer Science. Should have good communication, prioritization, and organization skills. Should be keen on learning and upskilling. Job Responsibilities: Need a Python developer who has good experience in Python programming and problem solving. Should be good in data structures and implementation. Should be good in databases, i.e., relational databases and SQL. Should be proficient in requirements and implementation. Should have a degree in Computer Science. Should have good communication, prioritization, and organization skills. Should be keen on learning and upskilling. What We Offer: Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them. Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laidback environment — or even abroad in one of our global centers or client facilities! Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skill trainings. Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks and the GL Club, where you can drink coffee or tea with your colleagues over a game of table, and we offer discounts for popular stores and restaurants!
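As an illustration of the mandatory API-to-PostgreSQL workflow, a minimal sketch using requests and psycopg2; the endpoint URL, DSN, and table are hypothetical placeholders:

```python
# Illustrative sketch: pull records from a REST API and upsert them
# into PostgreSQL. The endpoint, connection string, and table layout
# are hypothetical.
import requests
import psycopg2

rows = requests.get("https://api.example.com/v1/orders", timeout=30).json()

conn = psycopg2.connect("dbname=analytics user=etl password=*** host=localhost")
with conn, conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS orders (
            order_id  BIGINT PRIMARY KEY,
            status    TEXT,
            amount    NUMERIC
        )
    """)
    for r in rows:
        # Upsert so the extraction can be re-run safely.
        cur.execute("""
            INSERT INTO orders (order_id, status, amount)
            VALUES (%(order_id)s, %(status)s, %(amount)s)
            ON CONFLICT (order_id) DO UPDATE
            SET status = EXCLUDED.status, amount = EXCLUDED.amount
        """, r)
conn.close()
```

The ON CONFLICT upsert makes the extraction idempotent, so the job can be re-run without duplicating rows.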
Posted 2 weeks ago
5.0 - 8.0 years
7 - 11 Lacs
Hyderabad
Work from Office
We are seeking an experienced MDM Data Analyst with 5-8 years of experience in MDM development, implementation, and operations of our Master Data Management (MDM) platforms, with hands-on experience in Informatica IDQ and Informatica MDM. This role involves hands-on implementation of MDM solutions using IDQ and Informatica MDM. To succeed in this role, the candidate must have strong IDQ and Informatica MDM technical experience. Roles & Responsibilities: Develop and implement MDM solutions using Informatica IDQ and Informatica MDM platforms. Define enterprise-wide MDM architecture, including IDQ, data stewardship, and metadata workflows. Define and implement the match/merge and survivorship strategy. Design and deliver MDM processes and data integrations using Unix, Python, and SQL. Collaborate with the backend data engineering team and the frontend custom UI team for strong integrations and a seamless, enhanced user experience, respectively. Coordinate with business and IT stakeholders to align MDM capabilities with organizational goals. Establish data quality metrics and monitor compliance using automated profiling and validation tools. Promote data governance and contribute to enterprise data modeling and approval workflows (DCRs). Ensure data integrity, lineage, and traceability across MDM pipelines and solutions. Basic Qualifications and Experience: Master's degree with 4-6 years of experience in Business, Engineering, IT or related field OR Bachelor's degree with 5-8 years of experience in Business, Engineering, IT or related field OR Diploma with 10-12 years of experience in Business, Engineering, IT or related field. Must-Have Skills: Deep knowledge of MDM tools (Informatica MDM) and data quality frameworks (IDQ), from configuring data assets to building end-to-end data pipelines and integrations for data mastering and orchestration of ETL pipelines. Very good understanding of reference data, hierarchies, and their integration with MDM. Hands-on experience with custom workflows (AVOS, Eclipse, etc.). Strong experience with external data enrichment services such as AddressDoctor. Strong experience in match/merge and survivorship rule strategy and implementation. Strong experience with group fields, cross-reference data, and UUIDs. Strong understanding of AWS cloud services and Databricks architecture. Proficiency in Python, SQL, and Unix for data processing and orchestration. Experience with data modeling, governance, and DCR lifecycle management (AVOS). Proven leadership and project management in large-scale MDM implementations. Able to implement end-to-end integrations, including API-based integrations, batch integrations, and flat-file-based integrations. Must have worked on at least 3 end-to-end implementations of MDM. Hands-on Unix and advanced SQL. Good-to-Have Skills: Experience with Tableau or Power BI for reporting MDM insights. Exposure to Agile practices and tools (JIRA, Confluence). Prior experience in Pharma/Life Sciences. Understanding of compliance and regulatory considerations in master data. Professional Certifications: Any MDM certification (e.g., Informatica). Any Data Analysis certification (SQL). Any cloud certification (AWS or Azure). Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions.
Ability to work effectively with global, virtual teams. EQUAL OPPORTUNITY STATEMENT: We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. Role GCF: 04A.
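For illustration, a minimal sketch of a survivorship rule of the kind this role implements: building a golden record from duplicate source records by taking each field from the most recently updated record that has a value. The record layout and source names are hypothetical:

```python
# Illustrative sketch of most-recent-wins survivorship for a golden
# record. Sources, fields, and dates are hypothetical.
from datetime import date

duplicates = [
    {"source": "CRM", "updated": date(2024, 3, 1),
     "name": "Acme Corp", "phone": None, "city": "Pune"},
    {"source": "ERP", "updated": date(2024, 6, 15),
     "name": "ACME Corporation", "phone": "+91-20-555-0100", "city": None},
]

def golden_record(records, fields):
    """Most-recent-wins survivorship, skipping null values per field."""
    ordered = sorted(records, key=lambda r: r["updated"], reverse=True)
    golden = {}
    for field in fields:
        golden[field] = next(
            (r[field] for r in ordered if r.get(field) is not None), None
        )
    return golden

print(golden_record(duplicates, ["name", "phone", "city"]))
# -> {'name': 'ACME Corporation', 'phone': '+91-20-555-0100', 'city': 'Pune'}
```

Real MDM platforms express the same idea declaratively (trust scores, source rankings, recency rules); the field-by-field fallback shown here is the core mechanic.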
Posted 2 weeks ago
15.0 - 16.0 years
22 - 30 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled and experienced Principal AI Solution Architect to join our dynamic team. The candidate will lead AI solutioning and design across enterprise and cross-functional teams. They will primarily work with the MDM CoE to lead and drive AI solutions and optimizations, and also provide thought leadership. The role involves developing and implementing AI strategies, collaborating with cross-functional teams, and ensuring the scalability, reliability, and performance of AI solutions. To succeed in this role, the candidate must have strong AI/ML, Data Science, and GenAI experience along with MDM knowledge. The candidate must have AI/ML, data science, and GenAI experience with technologies like PySpark/PyTorch, TensorFlow, LLMs, Autogen, Hugging Face, VectorDB, embeddings, RAGs, etc., along with knowledge of MDM (Master Data Management). Roles & Responsibilities: Lead the designing, solutioning, and development of enterprise-level GenAI applications using LLM frameworks such as Langchain, Autogen, and Hugging Face. Architect intelligent pipelines using PySpark, TensorFlow, and PyTorch within Databricks and AWS environments. Implement embedding models and manage VectorStores for retrieval-augmented generation (RAG) solutions. Integrate and leverage MDM platforms like Informatica and Reltio to supply high-quality structured data to ML systems. Utilize SQL and Python for data engineering, data wrangling, and pipeline automation. Build scalable APIs and services to serve GenAI models in production. Lead cross-functional collaboration with data scientists, engineers, and product teams to scope, design, and deploy AI-powered systems. Ensure model governance, version control, and auditability aligned with regulatory and compliance expectations. Basic Qualifications and Experience: Master's degree with 11-14 years of experience in Data Science, Artificial Intelligence, Computer Science, or related fields OR Bachelor's degree with 15-16 years of experience in Data Science, Artificial Intelligence, Computer Science, or related fields OR Diploma with 17-18 years of hands-on experience in Data Science, AI/ML technologies, or related technical domains. Functional Skills: Must-Have Skills: 14+ years of experience working in AI/ML or Data Science roles, including designing and implementing GenAI solutions. Extensive hands-on experience with LLM frameworks and tools such as Langchain, Autogen, Hugging Face, OpenAI APIs, and embedding models. Expertise in AI/ML solution architecture and design, and knowledge of industry best practices. Experience designing GenAI-based solutions using the Databricks platform. Hands-on experience with Python, PySpark, PyTorch, LLMs, VectorDB, embeddings, scikit-learn, Langchain, TensorFlow, APIs, Autogen, VectorStores, MongoDB, Databricks, and Django. Strong knowledge of AWS and cloud-based AI infrastructure. Excellent problem-solving skills. Strong communication and leadership skills. Ability to collaborate effectively with cross-functional teams and stakeholders. Experience in managing and mentoring junior team members. Must be able to provide thought leadership to junior team members. Good-to-Have Skills: Prior experience in Data Modeling, ETL development, and data profiling to support AI/ML workflows. Working knowledge of Life Sciences or Pharma industry standards and regulatory considerations. Proficiency in tools like JIRA and Confluence for Agile delivery and project collaboration.
Familiarity with MongoDB, VectorStores, and modern architecture principles for scalable GenAI applications. Professional Certifications: Any Data Analysis certification (SQL, Python, other DBs or programming languages). Any cloud certification (AWS or Azure). Data Science and ML certifications. Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams
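Purely as an illustration of the RAG retrieval step this kind of role designs, a minimal sketch: rank documents by embedding similarity and assemble a grounded prompt. The embed() function below is a toy stand-in (hashed bag-of-words); a real pipeline would call an embedding model, e.g. via Hugging Face or a provider API:

```python
# Illustrative RAG retrieval sketch: embed a query, rank documents by
# cosine similarity, and build a grounded prompt. embed() is a toy
# deterministic stand-in for a real embedding model.
import hashlib
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy embedding via hashed bag-of-words (stand-in only)."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

docs = [
    "MDM golden records unify customer data across systems.",
    "Databricks supports distributed PySpark pipelines.",
    "Survivorship rules decide which field values win during merge.",
]
doc_vecs = np.stack([embed(d) for d in docs])

query = "How are customer records unified in MDM?"
scores = doc_vecs @ embed(query)        # cosine similarity (unit vectors)
top_k = np.argsort(scores)[::-1][:2]    # retrieve the 2 best matches

context = "\n".join(docs[i] for i in top_k)
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```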
Posted 2 weeks ago
8.0 - 13.0 years
5 - 9 Lacs
Hyderabad
Work from Office
We are seeking an experienced MDM Engineer with 8-12 years of experience to lead development and operations of our Master Data Management (MDM) platforms, with strong hands-on data engineering experience. This role will involve handling the backend data engineering solution within the MDM team. This is a technical role that will require hands-on work. To succeed in this role, the candidate must have strong data engineering experience with technologies like SQL, Python, PySpark, Databricks, AWS, API integrations, etc. Roles & Responsibilities: Develop distributed data pipelines using PySpark on Databricks for ingesting, transforming, and publishing master data. Write optimized SQL for large-scale data processing, including complex joins, window functions, and CTEs for MDM logic. Implement match/merge algorithms and survivorship rules using Informatica MDM or Reltio APIs. Build and maintain Delta Lake tables with schema evolution and versioning for master data domains. Use AWS services like S3, Glue, Lambda, and Step Functions for orchestrating MDM workflows. Automate data quality checks using IDQ or custom PySpark validators with rule-based profiling. Integrate external enrichment sources (e.g., D&B, LexisNexis) via REST APIs and batch pipelines. Design and deploy CI/CD pipelines using GitHub Actions or Jenkins for Databricks notebooks and jobs. Monitor pipeline health using the Databricks Jobs API, CloudWatch, and custom logging frameworks. Implement fine-grained access control using Unity Catalog and attribute-based policies for MDM datasets. Use MLflow for tracking model-based entity resolution experiments if ML-based matching is applied. Collaborate with data stewards to expose curated MDM views via REST endpoints or Delta Sharing. Basic Qualifications and Experience: 8 to 13 years of experience in Business, Engineering, IT or related field. Functional Skills: Must-Have Skills: Advanced proficiency in PySpark for distributed data processing and transformation. Strong SQL skills for complex data modeling, cleansing, and aggregation logic. Hands-on experience with Databricks, including Delta Lake, notebooks, and job orchestration. Deep understanding of MDM concepts, including match/merge, survivorship, and golden record creation. Experience with MDM platforms like Informatica MDM or Reltio, including REST API integration. Proficiency in AWS services such as S3, Glue, Lambda, Step Functions, and IAM. Familiarity with data quality frameworks and tools like Informatica IDQ or custom rule engines. Experience building CI/CD pipelines for data workflows using GitHub Actions, Jenkins, or similar. Knowledge of schema evolution, versioning, and metadata management in data lakes. Ability to implement lineage and observability using Unity Catalog or third-party tools. Comfort with Unix shell scripting or Python for orchestration and automation. Hands-on experience with RESTful APIs for ingesting external data sources and enrichment feeds. Good-to-Have Skills: Experience with Tableau or Power BI for reporting MDM insights. Exposure to Agile practices and tools (JIRA, Confluence). Prior experience in Pharma/Life Sciences. Understanding of compliance and regulatory considerations in master data. Professional Certifications: Any MDM certification (e.g., Informatica, Reltio). Any Data Analysis certification (SQL, Python, PySpark, Databricks). Any cloud certification (AWS or Azure). Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions.
Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams
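For illustration, a minimal sketch of the PySpark/Delta work described above: deduplicate an incoming batch with a window function, then MERGE it into a master Delta table. Paths, keys, and columns are hypothetical, and a Databricks-style environment with a `spark` session and the Delta Lake library is assumed:

```python
# Illustrative Databricks-style MDM ingestion step: window-function
# dedupe of an incoming batch, then a Delta MERGE into the master
# table. Paths and columns are hypothetical; assumes `spark` exists.
from pyspark.sql import Window
from pyspark.sql import functions as F
from delta.tables import DeltaTable

updates = spark.read.parquet("/mnt/raw/customer_updates/")

# Keep only the latest record per business key (survivorship by recency).
w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (updates
          .withColumn("rn", F.row_number().over(w))
          .filter("rn = 1")
          .drop("rn"))

# Upsert into the golden-record Delta table.
master = DeltaTable.forPath(spark, "/mnt/mdm/customer_master")
(master.alias("t")
 .merge(latest.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```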
Posted 2 weeks ago
2.0 - 5.0 years
6 - 9 Lacs
Hyderabad
Work from Office
We are seeking an MDM Admin/Infrastructure Resource with 2-5 years of experience to support and maintain our enterprise MDM (Master Data Management) platforms using Informatica MDM and IDQ. This role is critical in ensuring the reliability, availability, and performance of master data solutions across the organization, utilizing modern tools like Databricks and AWS for automation, backup, recovery, and preventive maintenance. The ideal candidate will have strong experience in server maintenance, data recovery, data backup, and MDM software support. Roles & Responsibilities: Administer and maintain customer, product, and study master data using Informatica MDM and IDQ solutions. Perform data recovery and data backup processes to ensure master data integrity. Conduct server maintenance and preventive maintenance activities to ensure system reliability. Leverage Unix/Linux, Python, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement automation processes for data backup, recovery, and preventive maintenance. Utilize AWS cloud services for data storage and compute processes related to MDM. Support MDM software maintenance and upgrades. Track and manage data issues using tools such as JIRA and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance. Functional Skills: Must-Have Skills: Strong experience with Informatica MDM and IDQ platforms in administering and maintaining configurations. Strong experience in data recovery and data backup processes. Strong experience in server maintenance and preventive maintenance activities (strong hands-on Linux/Unix and server upgrade experience). Expertise in handling data backups, server backups, MDM product upgrades, and server upgrades. Good understanding of and hands-on experience with access control. Experience with IDQ, data modeling, and approval workflows/DCRs. Advanced SQL expertise and data wrangling. Knowledge of MDM, data governance, stewardship, and profiling practices. Good-to-Have Skills: Familiarity with Databricks and AWS architecture. Background in Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Basics of data engineering concepts. Basic Qualifications and Experience: Master's degree with 1-3 years of experience in Business, Engineering, IT or related field OR Bachelor's degree with 2-5 years of experience in Business, Engineering, IT or related field OR Diploma with 6-8 years of experience in Business, Engineering, IT or related field. Professional Certifications (preferred): Any ETL certification (e.g., Informatica). Any Data Analysis certification (SQL). Any cloud certification (AWS or Azure). Soft Skills: Excellent written and verbal communication skills (English) in translating technology content into business language at various levels. Ability to work effectively with global, virtual teams. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Strong problem-solving and analytical skills. Strong time and task management skills to estimate and successfully meet project timelines, with the ability to bring consistency and quality assurance across various projects.
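As an illustration of the backup automation this role maintains, a minimal sketch that archives a configuration directory and ships it to S3 with a timestamped key, using tarfile and boto3; the paths and bucket name are hypothetical:

```python
# Illustrative preventive-maintenance backup job: archive an MDM config
# directory and upload it to S3. SOURCE_DIR and BUCKET are hypothetical.
import tarfile
from datetime import datetime, timezone
from pathlib import Path

import boto3

SOURCE_DIR = Path("/opt/informatica/mdm/config")   # hypothetical path
BUCKET = "mdm-backups"                             # hypothetical bucket

stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
archive = Path(f"/tmp/mdm_config_{stamp}.tar.gz")

# Create a compressed archive of the configuration directory.
with tarfile.open(archive, "w:gz") as tar:
    tar.add(SOURCE_DIR, arcname=SOURCE_DIR.name)

# Upload under a dated prefix so old backups can be lifecycle-expired.
boto3.client("s3").upload_file(str(archive), BUCKET, f"daily/{archive.name}")
print(f"Backup uploaded: s3://{BUCKET}/daily/{archive.name}")
```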
Posted 2 weeks ago
10.0 - 15.0 years
9 - 14 Lacs
Hyderabad
Work from Office
We are seeking an accomplished and visionary Data Scientist/GenAI Lead to join Amgen's Enterprise Data Management team. As MDM Data Science Manager, you will lead the design, development, and deployment of Generative AI and ML models to power data-driven decisions across business domains. This role is ideal for an AI practitioner who thrives in a collaborative environment and brings a strategic mindset to applying advanced AI techniques to solve real-world problems.
To succeed in this role, the candidate must have strong AI/ML, Data Science, and GenAI experience along with MDM knowledge; candidates with only MDM experience are not eligible. The candidate must have AI/ML, Data Science, and GenAI experience with technologies such as PySpark/PyTorch, TensorFlow, LLMs, Autogen, Hugging Face, vector databases, embeddings, and RAG, along with knowledge of MDM (Master Data Management).
Roles & Responsibilities:
- Drive development of enterprise-level GenAI applications using LLM frameworks such as LangChain, Autogen, and Hugging Face.
- Architect intelligent pipelines using PySpark, TensorFlow, and PyTorch within Databricks and AWS environments.
- Implement embedding models and manage vector stores for retrieval-augmented generation (RAG) solutions (a minimal retrieval sketch follows this posting).
- Integrate and leverage MDM platforms like Informatica and Reltio to supply high-quality structured data to ML systems.
- Utilize SQL and Python for data engineering, data wrangling, and pipeline automation.
- Build scalable APIs and services to serve GenAI models in production.
- Lead cross-functional collaboration with data scientists, engineers, and product teams to scope, design, and deploy AI-powered systems.
- Ensure model governance, version control, and auditability aligned with regulatory and compliance expectations.
Basic Qualifications and Experience:
- Master's degree with 8-10 years of experience in Data Science, Artificial Intelligence, Computer Science, or related fields, OR
- Bachelor's degree with 10-14 years of experience in Data Science, Artificial Intelligence, Computer Science, or related fields, OR
- Diploma with 14-16 years of hands-on experience in Data Science, AI/ML technologies, or related technical domains.
Functional Skills:
Must-Have Skills:
- 10+ years of experience in AI/ML or Data Science roles, including designing and implementing GenAI solutions.
- Extensive hands-on experience with LLM frameworks and tools such as LangChain, Autogen, Hugging Face, OpenAI APIs, and embedding models.
- Strong programming background in Python and PySpark, with experience building scalable solutions using TensorFlow, PyTorch, and scikit-learn.
- Proven track record of building and deploying AI/ML applications in cloud environments such as AWS.
- Expertise in developing APIs, automation pipelines, and serving GenAI models using frameworks like Django, FastAPI, and Databricks.
- Solid experience integrating and managing MDM tools (Informatica/Reltio) and applying data governance best practices.
- Ability to guide the team on development activities and lead solution discussions.
- Core technical capabilities in the GenAI and Data Science space.
Good-to-Have Skills:
- Prior experience in data modeling, ETL development, and data profiling to support AI/ML workflows.
- Working knowledge of Life Sciences or Pharma industry standards and regulatory considerations.
- Proficiency in tools like JIRA and Confluence for Agile delivery and project collaboration.
- Familiarity with MongoDB, vector stores, and modern architecture principles for scalable GenAI applications.
Professional Certifications:
- Any ETL certification (e.g., Informatica)
- Any Data Analysis certification (SQL)
- Any cloud certification (AWS or Azure)
- Data Science and ML certification
Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions.
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
- Effective problem-solving skills to address data-related issues and implement scalable solutions.
- Ability to work effectively with global, virtual teams.
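The RAG responsibility above amounts to: embed documents, retrieve the nearest chunks for a query, and pass them to an LLM as context. Below is a minimal, framework-agnostic sketch in Python; the embed() function is a hypothetical stand-in for a real embedding model (Hugging Face, OpenAI, etc.), and the in-memory arrays stand in for a production vector store.

```python
import numpy as np

# Hypothetical embedding function; in practice this would call an
# embedding model rather than derive a vector from a hash seed.
def embed(text: str) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=384)           # 384-dim stand-in vector
    return v / np.linalg.norm(v)       # unit-normalize for cosine similarity

# Tiny in-memory "vector store": parallel list of chunks and a matrix of vectors.
chunks = [
    "HCP records are mastered in Reltio.",
    "Survivorship rules prefer the most recent source update.",
    "D&B enrichment supplies firmographic attributes.",
]
index = np.stack([embed(c) for c in chunks])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k chunks most similar to the query (cosine similarity)."""
    q = embed(query)
    scores = index @ q                 # unit vectors, so dot product = cosine
    return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

# The retrieved chunks would then be placed into the LLM prompt as context.
print(retrieve("How are survivorship rules applied?"))
```

A production system would swap the toy pieces for a real embedding model and a managed vector store, but the retrieve-then-prompt shape stays the same.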
Posted 2 weeks ago
6.0 - 9.0 years
50 - 55 Lacs
Hyderabad
Work from Office
We are seeking an experienced MDM Senior Data Engineer with 6-9 years of experience and expertise in backend engineering to work closely with the business on development and operations of our Master Data Management (MDM) platforms, with hands-on experience in Informatica or Reltio. This role will also involve guiding junior data engineers/analysts and quality experts to deliver high-performance, scalable, and governed MDM solutions that align with enterprise data strategy.
To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; candidates with only MDM experience are not eligible. The candidate must have data engineering experience with technologies such as SQL, Python, PySpark, Databricks, AWS, and API integrations, along with knowledge of MDM (Master Data Management).
Roles & Responsibilities:
- Develop MDM backend solutions and implement ETL and data engineering pipelines using Databricks, AWS, Python/PySpark, and SQL.
- Lead the implementation and optimization of MDM solutions using Informatica or Reltio platforms.
- Perform data profiling and identify the DQ rules needed.
- Define and drive enterprise-wide MDM architecture, including IDQ, data stewardship, and metadata workflows.
- Manage cloud-based infrastructure using AWS and Databricks to ensure scalability and performance.
- Ensure data integrity, lineage, and traceability across MDM pipelines and solutions.
- Provide mentorship and technical leadership to junior team members and ensure project delivery timelines.
- Help the custom UI team integrate with backend data via APIs or other integration methods for a better data stewardship user experience.
Basic Qualifications and Experience:
- Master's degree with 4-6 years of experience in Business, Engineering, IT or related field, OR
- Bachelor's degree with 6-9 years of experience in Business, Engineering, IT or related field, OR
- Diploma with 10-12 years of experience in Business, Engineering, IT or related field.
Functional Skills:
Must-Have Skills:
- Strong understanding of and hands-on experience with Databricks and AWS cloud services.
- Proficiency in Python, PySpark, SQL, and Unix for data processing and orchestration.
- Deep knowledge of MDM tools (Informatica, Reltio) and data quality frameworks (IDQ).
- Knowledge of customer master data (HCP, HCO, etc.).
- Experience with data modeling, governance, and DCR lifecycle management.
- Ability to implement end-to-end integrations, including API-based, batch, and flat-file integrations.
- Strong experience with external data enrichments such as D&B.
- Strong experience with match/merge and survivorship rule implementations (an illustrative sketch follows this posting).
- Very good understanding of reference data and its integration with MDM.
- Hands-on experience with custom workflows and building data pipelines/orchestrations.
Good-to-Have Skills:
- Experience with Tableau or Power BI for reporting MDM insights.
- Exposure to or knowledge of Data Science and GenAI capabilities.
- Exposure to Agile practices and tools (JIRA, Confluence).
- Prior experience in Pharma/Life Sciences.
- Understanding of compliance and regulatory considerations in master data.
Professional Certifications:
- Any MDM certification (e.g., Informatica, Reltio)
- Databricks certification (Data Engineer or Architect)
- Any cloud certification (AWS or Azure)
Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions.
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
- Effective problem-solving skills to address data-related issues and implement scalable solutions.
- Ability to work effectively with global, virtual teams.
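As a hedged sketch of the match/merge and survivorship skills listed above: the snippet below builds a naive match key and keeps the most recently updated record per key in PySpark. Real MDM rules are far richer (fuzzy matching, source rankings, attribute-level survivorship); the data and rule here are illustrative only.

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mdm-survivorship-sketch").getOrCreate()

# Hypothetical HCP source records; real pipelines would read from Delta/S3.
df = spark.createDataFrame(
    [
        ("dr jane doe", "CRM", "2024-05-01"),
        ("Dr Jane Doe", "D&B", "2024-06-15"),
        ("john smith",  "CRM", "2024-04-10"),
    ],
    ["name", "source", "updated_at"],
)

# Deterministic match key: a normalized name. Production matching would
# add fuzzy comparison, address/identifier checks, etc.
matched = df.withColumn("match_key", F.upper(F.trim(F.col("name"))))

# Survivorship: keep the most recently updated record per match key
# (ISO date strings sort correctly as text).
w = Window.partitionBy("match_key").orderBy(F.col("updated_at").desc())
golden = (
    matched.withColumn("rank", F.row_number().over(w))
           .filter(F.col("rank") == 1)
           .drop("rank")
)
golden.show(truncate=False)
```

The window-function pattern scales to attribute-level survivorship by ranking per attribute instead of per record.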
Posted 2 weeks ago
10.0 - 14.0 years
6 - 9 Lacs
Hyderabad
Work from Office
We are seeking an experienced MDM Manager with 10-14 years of experience to lead strategic development and operations of our Master Data Management (MDM) platforms, with hands-on experience in Informatica or Reltio. This role will involve managing a team of data engineers, architects, and quality experts to deliver high-performance, scalable, and governed MDM solutions that align with enterprise data strategy.
To succeed in this role, the candidate must have strong MDM experience along with Data Governance, DQ, and Data Cataloging implementation knowledge; a minimum of 6-8 years of core MDM technical experience is required (within 10-14 years of total experience).
Roles & Responsibilities:
- Lead the implementation and optimization of MDM solutions using Informatica or Reltio platforms.
- Define and drive enterprise-wide MDM architecture, including IDQ, data stewardship, and metadata workflows.
- Own match/merge and survivorship strategy and implementation.
- Design and deliver MDM processes and data integrations using Unix, Python, and SQL.
- Collaborate with the backend data engineering team and the frontend custom UI team for strong integrations and a seamless, enhanced user experience.
- Manage cloud-based infrastructure using AWS and Databricks to ensure scalability and performance.
- Coordinate with business and IT stakeholders to align MDM capabilities with organizational goals.
- Establish data quality metrics and monitor compliance using automated profiling and validation tools.
- Promote data governance and contribute to enterprise data modeling and approval workflows (DCRs).
- Ensure data integrity, lineage, and traceability across MDM pipelines and solutions.
- Provide mentorship and technical leadership to junior team members and ensure project delivery timelines.
- Lead custom UI design for a better data stewardship user experience.
Basic Qualifications and Experience:
- Master's degree with 8-10 years of experience in Business, Engineering, IT or related field, OR
- Bachelor's degree with 10-14 years of experience in Business, Engineering, IT or related field, OR
- Diploma with 14-16 years of experience in Business, Engineering, IT or related field.
Functional Skills:
Must-Have Skills:
- Deep knowledge of MDM tools (Informatica, Reltio) and data quality frameworks (IDQ), from configuring data assets to building end-to-end data pipelines, integrations for data mastering, and orchestration of ETL pipelines.
- Very good understanding of reference data, hierarchies, and their integration with MDM.
- Hands-on experience with custom workflows (AVOS, Eclipse, etc.).
- Strong experience with external data enrichment services such as D&B and AddressDoctor.
- Strong experience with match/merge and survivorship rule strategy and implementation.
- Strong experience with group fields, cross-reference data, and UUIDs (a deterministic-UUID sketch follows this posting).
- Strong understanding of AWS cloud services and Databricks architecture.
- Proficiency in Python, SQL, and Unix for data processing and orchestration.
- Experience with data modeling, governance, and DCR lifecycle management.
- Proven leadership and project management in large-scale MDM implementations.
- Ability to implement end-to-end integrations, including API-based, batch, and flat-file integrations.
- Must have worked on at least 3 end-to-end MDM implementations.
Good-to-Have Skills:
- Experience with Tableau or Power BI for reporting MDM insights.
- Exposure to Agile practices and tools (JIRA, Confluence).
- Prior experience in Pharma/Life Sciences.
- Understanding of compliance and regulatory considerations in master data.
Professional Certifications:
- Any MDM certification (e.g., Informatica, Reltio)
- Any Data Analysis certification (SQL)
- Any cloud certification (AWS or Azure)
Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions.
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
- Effective problem-solving skills to address data-related issues and implement scalable solutions.
- Ability to work effectively with global, virtual teams.
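To illustrate the cross-reference and UUID point above: deterministic UUIDs (RFC 4122, version 5) let the same source record always map to the same cross-reference key across reloads. Here is a small sketch using Python's standard uuid module; the namespace string and source names are hypothetical.

```python
import uuid

# Hypothetical namespace for this MDM domain; any fixed UUID works. The
# point is that the same (source, source_id) always yields the same UUID.
MDM_NAMESPACE = uuid.uuid5(uuid.NAMESPACE_DNS, "mdm.example.com")

def xref_uuid(source: str, source_id: str) -> uuid.UUID:
    """Deterministic cross-reference UUID for a source record."""
    return uuid.uuid5(MDM_NAMESPACE, f"{source}:{source_id}")

# Two loads of the same CRM record yield the same key, so cross-references
# stay stable across reprocessing; different sources never collide.
assert xref_uuid("CRM", "12345") == xref_uuid("CRM", "12345")
assert xref_uuid("CRM", "12345") != xref_uuid("D&B", "12345")
print(xref_uuid("CRM", "12345"))
```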
Posted 2 weeks ago
10.0 - 14.0 years
10 - 14 Lacs
Hyderabad
Work from Office
We are seeking an experienced MDM Engineering Manager with 10-14 years of experience to lead strategic development and operations of our Master Data Management (MDM) platforms, with hands-on experience in Informatica or Reltio and in data engineering. This role will involve managing a team of data engineers, architects, and quality experts to deliver high-performance, scalable, and governed MDM solutions that align with enterprise data strategy. This is a technical role that will require hands-on work.
To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; candidates with only MDM experience are not eligible. The candidate must have data engineering experience with technologies such as SQL, Python, PySpark, Databricks, AWS, and API integrations, along with knowledge of MDM (Master Data Management).
Roles & Responsibilities:
- Lead the implementation and optimization of MDM solutions using Informatica or Reltio platforms.
- Define and drive enterprise-wide MDM architecture, including IDQ, data stewardship, and metadata workflows.
- Design solutions and deliver enhanced MDM processes and data integrations using Unix, Python, and SQL.
- Manage cloud-based infrastructure using AWS and Databricks to ensure scalability and performance.
- Coordinate with business and IT stakeholders to align MDM capabilities with organizational goals.
- Establish data quality metrics and monitor compliance using automated profiling and validation tools.
- Promote data governance and contribute to enterprise data modeling and approval workflows (DCRs).
- Ensure data integrity, lineage, and traceability across MDM pipelines and solutions.
- Provide mentorship and technical leadership to junior team members and ensure project delivery timelines.
- Lead custom UI design for a better data stewardship user experience.
Basic Qualifications and Experience:
- Master's degree with 8-10 years of experience in Business, Engineering, IT or related field, OR
- Bachelor's degree with 10-14 years of experience in Business, Engineering, IT or related field, OR
- Diploma with 14-16 years of experience in Business, Engineering, IT or related field.
Functional Skills:
Must-Have Skills:
- Strong understanding of and hands-on experience with Databricks and AWS cloud services.
- Proficiency in Python, PySpark, SQL, and Unix for data processing and orchestration.
- Deep knowledge of MDM tools (Informatica, Reltio) and data quality frameworks (IDQ).
- Strong experience with data modeling, governance, and DCR lifecycle management.
- Proven leadership and project management in large-scale MDM implementations.
- Ability to implement end-to-end integrations, including API-based, batch, and flat-file integrations (an illustrative sketch follows this posting).
- Strong experience with external data enrichments such as D&B.
- Strong experience with match/merge and survivorship rule implementations.
- Must have worked on at least 3 end-to-end MDM implementations.
- Very good understanding of reference data and its integration with MDM.
- Hands-on experience with custom workflows.
Good-to-Have Skills:
- Experience with Tableau or Power BI for reporting MDM insights.
- Exposure to Agile practices and tools (JIRA, Confluence).
- Prior experience in Pharma/Life Sciences.
- Understanding of compliance and regulatory considerations in master data.
Professional Certifications:
- Any MDM certification (e.g., Informatica, Reltio)
- Any Data Analysis certification (SQL, Python, PySpark, Databricks)
- Any cloud certification (AWS or Azure)
Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions.
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
- Effective problem-solving skills to address data-related issues and implement scalable solutions.
- Ability to work effectively with global, virtual teams.
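As a hedged illustration of the "API-based, batch, and flat-file integrations" requirement above, here is a minimal Python sketch that reads a flat file and pushes each row to a REST endpoint. The URL, token, and file name are hypothetical placeholders, not a specific vendor's documented API.

```python
import csv

import requests  # widely used HTTP client

# Hypothetical endpoint and token; a real integration would use the MDM
# vendor's documented REST API and proper secret management.
API_URL = "https://mdm.example.com/api/records"
TOKEN = "REPLACE_ME"

def push_flat_file(path: str) -> None:
    """Batch integration: read a flat file and POST each row to the MDM API."""
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            resp = requests.post(
                API_URL,
                json=row,
                headers={"Authorization": f"Bearer {TOKEN}"},
                timeout=30,
            )
            # Surface failures instead of silently dropping rows.
            resp.raise_for_status()

push_flat_file("hcp_delta.csv")  # hypothetical input file
```

In practice the same shape handles all three integration styles: swap the CSV reader for a message consumer (API-driven) or a bulk-load endpoint (batch), keeping the error handling in one place.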
Posted 2 weeks ago
1.0 - 2.0 years
20 - 25 Lacs
Hyderabad
Work from Office
In this vital and technical role, you will deliver innovative custom solutions that support patient safety and adhere to regulatory requirements from around the world. You will be an active participant on the team, directly advancing technical features and enhancements of the business applications, including Machine Learning and Natural Language Processing technologies.
Roles & Responsibilities:
- Develops and delivers robust technology solutions in a regulated environment by collaborating with business partners, information systems (IS) colleagues, and service providers.
- Authors documentation for technical specifications and designs that satisfy detailed business and functional requirements.
- Works closely with business and IS teams to identify opportunities.
- Responsible for crafting and building end-to-end solutions using cloud technologies (e.g., Amazon Web Services) and Business Intelligence tools (e.g., Cognos, Tableau, and Spotfire) or other platforms.
- Contributes to design and rapid Proof-of-Concept (POC) development efforts for automated solutions that improve efficiency and simplify business processes; quickly and iteratively proves or disproves the concepts under consideration.
- Ensures design and development of software solutions meet Amgen architectural, security, quality, and development guidelines.
- Participates in Agile development ceremonies and practices.
- Writes SQL queries to manipulate and visualize data using data visualization tools (an illustrative query follows this posting).
What we expect of you:
- Master's degree with 1-2 years of experience in Computer Science, Software Development, IT or related field, OR
- Bachelor's degree with 2-4 years of experience in Computer Science, Software Development, IT or related field, OR
- Diploma with 5-8 years of experience in Computer Science, Software Development, IT or related field.
Must-Have Skills:
- Experience and proficiency with at least one development programming language/technology, such as database SQL and Python.
- Experience with at least one Business Intelligence tool, such as Cognos, Tableau, or Spotfire.
- Familiarity with automation technologies such as UiPath and a desire to learn and support them.
- Solid understanding of MuleSoft and ETL technologies (e.g., Informatica, Databricks).
- Understanding of AWS/cloud storage, hosting, and compute environments.
Good-to-Have Skills:
- Experience with database programming languages and data modeling concepts, including Oracle SQL and PL/SQL.
- Experience with API integrations such as MuleSoft.
- Solid understanding of one or more general programming languages, including but not limited to Java or Python.
- Outstanding written and verbal communication skills, with the ability to explain technical concepts to non-technical clients.
- Sharp learning agility, problem solving, and analytical thinking.
- Experience managing GxP systems and implementing GxP projects.
- Extensive expertise in SDLC, including requirements, design, testing, data analysis, and change control.
Certification:
- Understanding and experience with Agile methodology and DevOps.
Soft Skills:
- Strong communication and presentation skills.
- Ability to work on multiple projects simultaneously.
- Expertise in visualizing and manipulating large data sets.
- Willingness to learn new technologies.
- High learning agility, innovation, and analytical skills.
Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
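To illustrate the "write SQL queries to manipulate and visualize data" responsibility, here is a small self-contained Python example using sqlite3 as a stand-in for the real database. The table and figures are invented, but the aggregate-filter-order pattern is what typically feeds a BI tool such as Cognos, Tableau, or Spotfire.

```python
import sqlite3

# In-memory database as a stand-in for the real warehouse; the SQL itself
# would run largely unchanged on Oracle or Databricks.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE case_reports (region TEXT, severity TEXT, cnt INTEGER);
    INSERT INTO case_reports VALUES
        ('EU', 'serious',     12),
        ('EU', 'non-serious', 40),
        ('US', 'serious',      7);
""")

# Typical shaping query before visualization: aggregate, filter, order.
rows = con.execute("""
    SELECT region, SUM(cnt) AS serious_cases
    FROM case_reports
    WHERE severity = 'serious'
    GROUP BY region
    ORDER BY serious_cases DESC
""").fetchall()

for region, total in rows:
    print(region, total)
```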
What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 2 weeks ago
8.0 - 13.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role you will be responsible for designing, developing, and deploying Generative AI and ML models to power data-driven decisions across business domains. This role is ideal for an AI practitioner who thrives in a collaborative environment and brings a strategic approach to applying advanced AI techniques to solve real-world problems.
To succeed in this role, the candidate must have strong AI/ML, Data Science, and GenAI experience along with MDM (Master Data Management) knowledge; candidates with only MDM experience are not eligible.
Roles & Responsibilities:
- Develop enterprise-level GenAI applications using LLM frameworks.
- Design and develop thoughtful pipelines within Databricks and AWS environments.
- Implement embedding models and manage vector stores for retrieval-augmented generation (RAG) solutions.
- Integrate and leverage MDM platforms like Informatica and Reltio to supply high-quality structured data to ML systems.
- Use SQL and Python for data engineering, data wrangling, and pipeline automation.
- Build scalable APIs and services to serve GenAI models in production.
- Lead cross-functional collaboration with data scientists, engineers, and product teams to scope, design, and deploy AI-powered systems.
- Ensure model governance, version control, and auditability aligned with regulatory and compliance expectations.
- Bring strong rapid-prototyping skills and quickly translate concepts into working code.
- Stay updated with the latest trends, advancements, and best practices for the Veeva Vault Platform ecosystem.
- Design, develop, and implement applications and modules, including custom reports, SDKs, interfaces, and enhancements.
- Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
- Develop and implement unit tests, integration tests, and other testing strategies to ensure software quality, following the IS change control and GxP Validation process and exhibiting expertise in Risk-Based Validation methodology.
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.
- Integrate with other systems and platforms to ensure seamless data flow and functionality.
- Stay up to date on Veeva Vault features, new releases, and best practices around Veeva Platform governance.
What we expect of you: We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Doctorate, Master's, or Bachelor's degree and 8 to 13 years of Computer Science, IT, or related field experience.
Preferred Qualifications:
Functional Skills:
Must-Have Skills:
- 6+ years of experience in AI/ML or Data Science roles, including designing and implementing GenAI solutions.
- Extensive hands-on experience with LLM frameworks, OpenAI APIs, and embedding models.
- Consistent record of building and deploying AI/ML applications in cloud environments such as AWS.
- Expertise in developing APIs, automation pipelines, and serving GenAI models using frameworks like Django, FastAPI, and Databricks.
- Solid experience integrating and managing MDM tools (Informatica/Reltio) and applying data governance best practices.
- Ability to guide the team on development activities and lead solution discussions.
- Core technical capabilities in the GenAI and Data Science space.
- Experience with the Veeva Vault platform and its applications, including Veeva configuration settings and custom builds.
- 6-8 years of experience in the global pharmaceutical industry.
- Experience with version control systems such as Git.
Good-to-Have Skills:
- Prior experience in data modeling, ETL development, and data profiling to support AI/ML workflows.
- Working knowledge of Life Sciences or Pharma industry standards and regulatory considerations.
- Proficiency in tools like JIRA and Confluence for Agile delivery and project collaboration.
- Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL, etc.).
- Proficiency in programming languages such as Python, JavaScript, or other programming languages.
- Outstanding written and verbal communication skills, with the ability to translate technical concepts for non-technical audiences.
- Experience with ETL tools (Informatica, Databricks).
- Experience with API integrations such as MuleSoft.
- Solid understanding of and proficiency in writing SQL queries.
- Hands-on experience with reporting tools such as Tableau, Spotfire, and Power BI.
Professional Certifications:
- Any ETL certification (e.g., MuleSoft)
- Veeva Vault Platform Administrator or equivalent Vault certification (mandatory)
- Any Data Analysis certification (SQL)
- Any cloud certification (AWS or Azure)
- Data Science and ML certification
- SAFe for Teams (preferred)
Soft Skills:
- Excellent analytical and problem-solving skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.
Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
Equal opportunity statement: We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 2 weeks ago
5.0 - 7.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role on the Veeva Vault team, you will be responsible for designing, developing, and maintaining security solutions that meet business needs. This involves working closely with product managers, designers, and other engineers to create high-quality, scalable software solutions, as well as automating operations, monitoring system health, and responding to incidents to minimize downtime.
Roles & Responsibilities:
- Apply a solid understanding of Veeva Basic and Atomic security configuration.
- Ensure compliance with relevant regulations and maintain current certification status against various standards.
- Identify control gaps, advise teams on how to address them, and collect, organize, and review evidence for Veeva products; plan and manage audits, interact with external auditors, and advise management on risk and control issues.
- Lead day-to-day operations and maintenance of Amgen's R&D Veeva Vaults and hosted applications.
- Bring strong rapid-prototyping skills and quickly translate concepts into working code.
- Stay updated with the latest trends, advancements, and best practices for the Veeva Vault Platform ecosystem.
- Design, develop, and implement applications and modules, including custom reports, SDKs, interfaces, and enhancements.
- Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
- Develop and execute unit tests, integration tests, and other testing strategies to ensure software quality, following the IS change control and GxP Validation process and exhibiting expertise in Risk-Based Validation methodology.
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.
- Integrate with other systems and platforms to ensure seamless data flow and functionality.
- Stay up to date on Veeva Vault features, new releases, and best practices around Veeva Platform governance.
Basic Qualifications and Experience:
- Master's degree and 5 to 7 years of Computer Science, IT, or related field experience, OR
- Bachelor's degree and 7 to 9 years of Computer Science, IT, or related field experience.
Functional Skills:
Must-Have Skills:
- Experience with the Veeva Vault Platform and products, including Veeva configuration settings and custom builds.
- Strong knowledge of information systems and network technologies.
- 6-8 years of experience in the global pharmaceutical industry.
- Experience building configured and custom solutions on the Veeva Vault Platform.
- Experience managing systems and implementing and validating projects in GxP-regulated environments.
- Extensive expertise in SDLC, including requirements, design, testing, data analysis, and creating and managing change controls.
- Proficiency in programming languages such as Python, JavaScript, etc.
- Good understanding of software development methodologies, including Agile and Scrum.
- Experience with version control systems such as Git.
Good-to-Have Skills:
- Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL, etc.).
- Proficiency in programming languages such as Python, JavaScript, or other programming languages.
- Outstanding written and verbal communication skills, with the ability to translate technical concepts for non-technical audiences.
- Experience with ETL tools (Informatica, Databricks).
- Experience with API integrations such as MuleSoft.
- Solid understanding of and proficiency in writing SQL queries.
- Hands-on experience with reporting tools such as Tableau, Spotfire, and Power BI.
Professional Certifications:
- Veeva Vault Platform Administrator or equivalent Vault certification (mandatory)
- SAFe for Teams (preferred)
Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.
Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 2 weeks ago
1.0 - 3.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Let's do this. Let's change the world. In this vital role you will be responsible for designing, developing, and maintaining software applications and solutions that meet business needs and for ensuring the availability and performance of critical systems and applications. This role is for a technical functional lead/developer supporting the Veeva ClinOps Vault (suite of applications). It involves working closely with product managers, designers, and other engineers to create and maintain high-quality, scalable software solutions, as well as automating operations, monitoring system health, and responding to incidents to minimize downtime.
Roles & Responsibilities:
- Participate in technical discussions related to the Veeva Vault within the Clinical Trial Management, Monitoring, and Engagement (CTMME) product team.
- Drive development and maintenance activities per the release calendar by working with members of the product team and business partners.
- Conduct user acceptance testing with the customer, including coordination of all feedback, resolution of issues, and acceptance of the study.
- Support the requirements gathering and specification creation process for development and maintenance work.
- Communicate potential risks and contingency plans with project management to ensure process compliance with all regulatory and Amgen procedural requirements.
- Participate in and contribute to process, product, or methodology initiatives, and support developers and testers during the project lifecycle.
- Define, author, and present various architecture footprints, i.e., business, logical, integration, security, infrastructure, etc.
- Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
- Develop and implement unit tests, integration tests, and other testing strategies to ensure software quality, following the IS change control and GxP Validation process.
- Identify and resolve technical challenges, bugs, and maintenance requests effectively.
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time.
What we expect of you: We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master's degree and 1 to 3 years of experience in Computer Science, OR
- Bachelor's degree and 3 to 5 years of experience in Computer Science, OR
- Diploma and 7 to 9 years of experience in Computer Science.
Must-Have Skills:
- Proficiency in Veeva Vault configuration and customization.
- Proficiency in programming languages such as Python and JavaScript (preferred) or other programming languages.
- Strong understanding of software development methodologies, including Agile and Scrum.
- Experience with version control systems like Git.
- Solid understanding of and proficiency in writing SQL queries.
- Working knowledge of clinical trial processes, specifically software validation.
- Good problem-solving skills: identifying and fixing bugs, adapting to changes.
- Excellent communication skills: explaining design decisions, collaborating with teams.
Good-to-Have Skills:
- Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL, etc.).
- Outstanding written and verbal communication skills, with the ability to explain technical concepts to non-technical clients.
- Sharp learning agility, problem solving, and analytical thinking.
- Experience implementing GxP projects.
- Extensive expertise in SDLC, including requirements, design, testing, data analysis, and change control.
- Experience with API integrations such as MuleSoft.
- Experience with ETL tools (Informatica, Databricks).
What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 2 weeks ago
5.0 - 9.0 years
1 - 5 Lacs
Hyderabad
Work from Office
In this role with the Veeva Vault team, you will design, develop, and maintain software applications in Amgen's Veeva Vault eTMF. You will ensure system availability and performance, collaborating with product managers, designers, and engineers to create scalable solutions. Your tasks include automating operations, monitoring system health, and responding to incidents to minimize downtime.
Roles & Responsibilities:
- Bring strong rapid-prototyping skills and quickly translate concepts into working code.
- Lead day-to-day operations and maintenance of Amgen's Veeva Vault eTMF and hosted applications.
- Stay updated with the latest trends, advancements, and best practices for the Veeva Vault Platform ecosystem.
- Design, develop, and implement applications and modules, including custom reports, SDKs, interfaces, and enhancements.
- Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
- Develop and implement unit tests, integration tests, and other testing strategies to ensure software quality, following the IS change control and GxP Validation process and exhibiting expertise in Risk-Based Validation methodology.
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.
- Integrate with other systems and platforms to ensure seamless data flow and functionality.
- Stay up to date on Veeva Vault features, new releases, and best practices around Veeva Platform governance.
What we expect of you: We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications.
Basic Qualifications:
- Master's or Bachelor's degree and 5 to 9 years of relevant experience.
Must-Have Skills:
- Experience with Veeva Vault eTMF, including Veeva configuration settings and custom builds.
- Strong knowledge of information systems and network technologies.
- 6-8 years of experience in the global pharmaceutical industry.
- Experience building configured and custom solutions on the Veeva Vault Platform.
- Experience managing systems and implementing and validating projects in GxP-regulated environments.
- Extensive expertise in SDLC, including requirements, design, testing, data analysis, and creating and managing change controls.
- Proficiency in programming languages such as Python, JavaScript, etc.
- Solid understanding of software development methodologies, including Agile and Scrum.
- Experience with version control systems such as Git.
Good-to-Have Skills:
- Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL, etc.).
- Proficiency in programming languages such as Python, JavaScript, or other programming languages.
- Outstanding written and verbal communication skills, with the ability to translate technical concepts for non-technical audiences.
- Experience with ETL tools (Informatica, Databricks).
- Experience with API integrations such as MuleSoft.
- Solid understanding of and proficiency in writing SQL queries.
- Hands-on experience with reporting tools such as Tableau, Spotfire, and Power BI.
Professional Certifications:
- Veeva Vault Platform Administrator or equivalent Vault certification (must-have)
- SAFe for Teams (preferred)
Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.
Shift Information: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.
What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.
Posted 2 weeks ago