5.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Open Position: Data Engineer (GCP) – Technology

If you are an extraordinary developer who loves to push the boundaries to solve complex business problems using creative solutions, then we wish to talk with you. As an Analytics Technology Engineer, you will work on the Technology team that helps deliver our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building and maintaining technology services.

Responsibilities:
- Be an integral part of large-scale client business development and delivery engagements
- Develop the software and systems needed for end-to-end execution on large projects
- Work across all phases of the SDLC, and use software engineering principles to build scaled solutions
- Build the knowledge base required to deliver increasingly complex technology projects

Qualifications & Experience:
- A bachelor's degree in Computer Science or a related field with 5 to 10 years of technology experience

Desired Technical Skills:
- Data engineering and analytics on Google Cloud Platform; basic cloud computing concepts
- BigQuery, Google Cloud Storage, Cloud SQL, Pub/Sub, Dataflow, Cloud Composer, GCP Data Transfer, gcloud CLI
- Python, Google Cloud Python SDK, SQL
- Experience working with any NoSQL/columnar/MPP database
- Experience working with any ETL tool (Informatica, DataStage, Talend, Pentaho, etc.)
- Strong knowledge of database concepts, data modeling in RDBMS vs NoSQL, OLTP vs OLAP, MPP architecture

Other Desired Skills:
- Excellent communication and coordination skills
- Problem understanding, articulation and solutioning
- Quick learner and adaptable to new technologies
- Ability to research and solve technical issues

Responsibilities:
- Developing data pipelines (batch/streaming)
- Developing complex data transformations
- ETL orchestration
- Data migration
- Developing and maintaining data warehouses / data lakes

Good To Have:
- Experience working with Apache Spark / Kafka
- Machine learning concepts
- Google Cloud Professional Data Engineer certification

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
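For illustration only, here is a minimal sketch of the kind of batch pipeline step this role describes: loading a CSV file from Google Cloud Storage into BigQuery with the Google Cloud Python SDK. The project, dataset, table and bucket names are placeholders, and application-default credentials are assumed to be configured.

```python
# Minimal sketch: batch-load a CSV from Cloud Storage into BigQuery.
# Assumes google-cloud-bigquery is installed and credentials are configured.
from google.cloud import bigquery


def load_csv_to_bigquery(gcs_uri: str, table_id: str) -> None:
    """Load a CSV file from GCS into a BigQuery table, replacing its contents."""
    client = bigquery.Client()  # uses application-default credentials

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # header row
        autodetect=True,      # infer schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()  # block until the load job completes

    table = client.get_table(table_id)
    print(f"Loaded {table.num_rows} rows into {table_id}")


if __name__ == "__main__":
    load_csv_to_bigquery(
        gcs_uri="gs://example-bucket/landing/orders.csv",   # placeholder
        table_id="example-project.analytics.orders",        # placeholder
    )
```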
Posted 1 week ago
4.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position: Senior Software Engineer / Principal Software Engineer - ETL
Experience Expected: 4-6 years

Job Description:
- Designing, developing, and deploying the SQL-based data transformation portion of the data warehousing solution
- Definition and implementation of database development standards and procedures

Skills / Competencies:
- Ability to develop and debug complex and advanced SQL queries and stored procedures (must have)
- Hands-on experience with one or more ETL tools such as Talend or Informatica (good to have)
- Hands-on experience with at least one streaming/replication tool such as DMS, Qlik, GoldenGate, IICS or Open Flow
- Hands-on experience using Snowflake and Postgres databases
- Database optimization experience would be an added advantage (good to have)
- Excellent design, coding, testing, and debugging skills
- Experience with Agile methodologies; customer-facing experience will be an added advantage (good to have)
- Automation using Python, Java or any other tool will be an added advantage (good to have)
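As an illustration of the advanced SQL work described above, here is a minimal sketch that runs a CTE plus window-function query against Postgres from Python to keep only the latest record per business key. The connection details, schema and column names are placeholders, not part of the posting.

```python
# Minimal sketch: deduplicate staged rows with a CTE + ROW_NUMBER() and load the result.
# Assumes psycopg2 is installed; connection details and object names are placeholders.
import psycopg2

DEDUP_SQL = """
WITH ranked AS (
    SELECT
        customer_id,
        email,
        updated_at,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY updated_at DESC
        ) AS rn
    FROM staging.customer_raw
)
INSERT INTO warehouse.customer_current (customer_id, email, updated_at)
SELECT customer_id, email, updated_at
FROM ranked
WHERE rn = 1;
"""


def run_dedup_load() -> None:
    conn = psycopg2.connect(
        host="db-host", dbname="analytics", user="etl_user", password="***"  # placeholders
    )
    try:
        # "with conn" commits on success and rolls back on error.
        with conn, conn.cursor() as cur:
            cur.execute(DEDUP_SQL)
    finally:
        conn.close()


if __name__ == "__main__":
    run_dedup_load()
```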
Posted 1 week ago
6.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Manager IT Operations at UST HealthProof, you will lead and manage production support operations to ensure high service quality and customer satisfaction. Your responsibilities will include overseeing a geographically distributed support team responsible for health plan technology solutions. Reporting to the Director of Delivery, you will be tasked with managing SLAs, coordinating change and issue resolution, driving operational efficiency, and delivering continuous improvements aligned with business goals.

Key Responsibilities
- Ensure operational excellence for customer-facing technology delivery.
- Drive resolution of production incidents and conduct root cause analysis.
- Generate SLA/operational reports for internal stakeholders and customers.
- Manage incidents using ITSM tools like JIRA or ServiceNow.
- Coordinate with internal and external teams for support and upgrades.
- Lead customer calls, prioritize daily support issues, and handle escalations.
- Identify value-added innovations and efficiency opportunities.
- Mentor and guide the support team; manage team development and performance evaluations.
- Participate in contract renewals, SOWs, and onboarding activities.
- Ensure knowledge management and upskilling through platforms like TICL, GAMA, etc.
- Strategically contribute to account growth via resource planning and new engagements.

Mandatory Skills
- Minimum 6+ years managing production support in a mid to large-scale IT environment.
- Strong hands-on experience with ServiceNow/JIRA or other ITSM tools.
- Experience in SLA governance and operational reporting.
- Proven capability in SQL, Excel, and PowerPoint.
- Working knowledge of Cloud platforms (AWS/GCP).
- Excellent understanding of ITIL standards and practices.
- Experience managing support for enterprise applications or healthcare systems.

Good To Have Skills
- Informatica / Informatica Cloud experience (highly desirable).
- Knowledge of SOAP, EDI, and ETL processing.
- Familiarity with SaaS platforms and HealthEdge applications.
- PMP/Prince2/CSM certification or equivalent.
- Exposure to working with SOWs, SLAs, contract management, and change requests.
- Experience in working in an onshore-offshore delivery model.

Soft Skills
- Strong communication and presentation abilities.
- Customer-focused mindset and ability to foster strong relationships.
- High ownership, problem-solving attitude, and stakeholder management.
- Ability to manage critical escalations under pressure.
- Team mentoring, conflict resolution, and people development.
- Agility in multitasking across priorities and timelines.

Outputs & Success Metrics
- Timely and quality SLA/operational reporting.
- Effective incident reduction and implementation of permanent fixes.
- Improved customer satisfaction (C-SAT/NPS).
- Seamless knowledge transfers and upskilling initiatives.
- Measurable team engagement, development, and performance.
- Achievement of project/account financial targets (EBITDA).
- Value additions and innovations introduced in the engagement.

Certifications (Preferred)
- PMP / Prince2 / CSM
- ITIL v3 or v4 Foundation / Intermediate

About UST HealthProof
UST HealthProof is reshaping the future of health insurance operations by building best-in-class cloud-based administrative ecosystems. With a focus on reducing administrative costs and improving the healthcare experience, we aim to drive meaningful industry transformation while nurturing individual growth within a startup culture.
Posted 1 week ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role Overview
We are looking for an experienced Solution Architect - AI/ML & Data Engineering to lead the design and delivery of advanced data and AI/ML solutions for our clients. The ideal candidate will have a strong background in end-to-end data architecture, AI lifecycle management, cloud technologies, and emerging Generative AI.

Responsibilities:
- Collaborate with clients to understand business requirements and design robust data solutions.
- Lead the development of end-to-end data pipelines including ingestion, storage, processing, and visualization.
- Architect scalable, secure, and compliant data systems following industry best practices.
- Guide data engineers, analysts, and cross-functional teams to ensure timely delivery of solutions.
- Participate in pre-sales efforts: solution design, proposal creation, and client presentations.
- Act as a technical liaison between clients and internal teams throughout the project lifecycle.
- Stay current with emerging technologies in AI/ML, data platforms, and cloud services.
- Foster long-term client relationships and identify opportunities for business expansion.
- Understand and architect across the full AI lifecycle, from ingestion to inference and operations.
- Provide hands-on guidance for containerization and deployment using Kubernetes.
- Ensure proper implementation of data governance, modeling, and warehousing.

Requirements:
- Bachelor's or master's degree in computer science, data science, or a related field.
- 10+ years of experience as a Data Solution Architect or in a similar role.
- Deep technical expertise in data architecture, engineering, and AI/ML systems.
- Strong experience with Hadoop-based platforms, ideally Cloudera Data Platform or Data Fabric.
- Proven pre-sales experience: technical presentations, solutioning, and RFP support.
- Proficiency in cloud platforms (Azure preferred; also AWS or GCP) and cloud-native data tools.
- Exposure to Generative AI frameworks and LLMs like OpenAI and Hugging Face.
- Experience in deploying and managing applications on Kubernetes (AKS, EKS, GKE).
- Familiarity with data governance, data modeling, and large-scale data warehousing.
- Excellent problem-solving, communication, and client-facing skills.

Skills & Technology

Architecture & Engineering:
- Hadoop Ecosystem: Cloudera Data Platform, Data Fabric, HDFS, Hive, Spark, HBase, Oozie.
- ETL & Integration: Apache NiFi, Talend, Informatica, Azure Data Factory, AWS Glue.
- Warehousing: Azure Synapse, Redshift, BigQuery, Snowflake, Teradata, Vertica.
- Streaming: Apache Kafka, Azure Event Hubs, AWS.
- Cloud Platforms: Azure (preferred), AWS, GCP.
- Data Lakes: ADLS, AWS S3, Google Cloud.
- Platforms: Data Fabric, AI Essentials, Unified Analytics, MLDM, MLDE.

AI/ML & GenAI:
- Lifecycle Tools: MLflow, Kubeflow, Azure ML, SageMaker, Ray.
- Inference: TensorFlow Serving, KServe, Seldon.
- Generative AI: Hugging Face, LangChain, OpenAI API (GPT-4, etc.).

DevOps & Deployment:
- Kubernetes: AKS, EKS, GKE, open-source K8s, Helm.
- CI/CD: Jenkins, GitHub Actions, GitLab CI, Azure DevOps.

(ref:hirist.tech)
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
Pune, Maharashtra
On-site
Our client is a global IT services company with offices in India and the United States, specializing in digital transformation, IT collaborations, and utilizing technology to make a positive impact on businesses. They are currently seeking an experienced Informatica Architect to join their team.

As an Informatica Architect, you will be responsible for leading data governance, data catalog, and data quality efforts. With over 7 years of expertise in data quality and data cataloging, you will work closely with the Data & Analytics lead to ensure the integrity and quality of critical data within the product. Your role will involve developing efficient data processes using tools such as Informatica, Alation, Atlan, or Collibra.

Key responsibilities include overseeing the data elements of a complex product catalog, designing and developing data catalogs and data assets on leading tools, managing the Enterprise Glossary, configuring data catalog resources, implementing Critical Data Elements, and ensuring compliance with Data Governance Standards.

The ideal candidate will have 7-8 years of enterprise IICS data integration and management experience, along with practical experience configuring data governance resources and hands-on experience with Informatica CDQ and Data Quality. A strong understanding of data quality, data cataloging, and data governance best practices is essential. Preferred qualifications include administration and management of the Collibra/Alation data catalog tool, configuration of data profiling and data lineage, and working with Data Owners and stewards to understand catalog requirements.

If you have the required qualifications and experience, we invite you to apply online through our portal or via email at careers@speedmart.co.in. Join us in driving digital transformation and making a difference in the world of business.
Posted 1 week ago
12.0 - 18.0 years
0 Lacs
Karnataka
On-site
As a Senior Technical Analyst at Nasdaq, you will be a vital part of the team responsible for delivering complex technical systems to both new and existing customers. Your role will involve exploring new technologies within the FinTech industry and contributing to the ongoing transformation and innovation at Nasdaq.

Located in Bangalore, you will join the Enterprise Solutions team to drive the execution of central initiatives across Nasdaq's corporate technology portfolio, encompassing Software Products and Software Services. Your primary focus will be on developing state-of-the-art corporate software for Nasdaq's employees, collaborating with a dedicated team of professionals to enhance and restructure enterprise products.

Your responsibilities will include working on cross-functional projects globally, delivering essential solutions and services to Nasdaq's finance processes. You will engage in crucial design activities, interact with internal customers, and build strong relationships with key stakeholders in business and technology. Furthermore, you will have the opportunity to contribute to the development of sophisticated technology solutions and collaborate with subject matter experts within the Enterprise Solutions team.

Key Responsibilities:
- Drive cross-functional initiatives worldwide to support Nasdaq's finance operations
- Collaborate closely with internal customers and stakeholders to ensure successful design and implementation
- Establish and enforce development standards and best practices within the team
- Evaluate external software packages and provide recommendations for future use at Nasdaq
- Identify and propose solutions for configuration issues in ERP platforms supporting finance processes
- Deliver executive-level architecture presentations related to the Corporate Finance suite of platforms

Qualifications:
- 12 to 18 years of experience in software implementation and configuration within the financial ERP space
- Proficiency in Workday's Finance modules, Workday Extend, and Workday Studio
- Familiarity with procurement and expense management platforms like Coupa, Navan, etc.
- Strong executive-level presentation skills, both written and oral
- Bachelor's or Master's degree in computer science or related engineering fields

Desired Skills:
- Experience in Informatica will be an added advantage
- Knowledge of finance organization processes such as AP, procurement, GL accounting, asset management, and planning & forecasting
- Previous exposure to multinational organizations is beneficial

If this opportunity aligns with your experience and aspirations, we encourage you to apply promptly. Nasdaq offers a dynamic and inclusive work environment where individuals are empowered to innovate, collaborate, and grow. As part of our team, you will have access to various benefits, including an annual monetary bonus, stock ownership opportunities, health insurance, flexible work arrangements, and continuous learning resources. Come as you are, and let's build a future together at Nasdaq.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have a minimum of 5 years of experience in Ab Initio ETL, Informatica, and AWS. Your main responsibilities will include designing and implementing complex ETL solutions using Informatica or Ab Initio; scheduling, optimizing, and monitoring jobs using Autosys or Control-M; and analyzing and resolving system performance issues with tools like Dynatrace.

You will collaborate with architects, business analysts, and operations teams to ensure end-to-end system efficiency, drive process improvements and best practices in data pipeline development, and ensure data integrity, consistency, and accuracy in the ETL pipeline by implementing validation and exception handling procedures. Regular auditing and maintenance of data pipelines to ensure high-quality data delivery will also be part of your role.

About Virtusa: Virtusa embodies values of teamwork, quality of life, and professional and personal development. As part of a global team of 27,000 professionals, you will experience exciting projects, opportunities, and work with state-of-the-art technologies throughout your career at Virtusa. The company values collaboration and the team environment, and aims to provide a dynamic space for great minds to nurture new ideas and foster excellence.
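As an illustration of the validation and exception handling mentioned above, here is a minimal, tool-agnostic sketch: a reconciliation check that compares source and target row counts and fails loudly so a scheduler such as Autosys or Control-M can flag the job. The counts, tolerance and function names are assumptions for illustration only.

```python
# Minimal sketch: post-load reconciliation with explicit exception handling.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl_validation")


class ReconciliationError(Exception):
    """Raised when source and target row counts disagree beyond tolerance."""


def reconcile_counts(source_count: int, target_count: int, tolerance: int = 0) -> None:
    diff = abs(source_count - target_count)
    if diff > tolerance:
        # Raising makes the job exit non-zero so the scheduler marks it failed.
        raise ReconciliationError(
            f"Row count mismatch: source={source_count}, target={target_count}, diff={diff}"
        )
    log.info("Reconciliation passed: source=%s target=%s", source_count, target_count)


if __name__ == "__main__":
    # In a real pipeline these counts would come from SELECT COUNT(*) queries
    # against the source extract and the loaded target table.
    reconcile_counts(source_count=1_000_000, target_count=1_000_000)
```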
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As a System Analyst - Data Warehouse at our company, you will be responsible for collaborating with stakeholders to understand business requirements and translate them into data warehouse design specifications. Your role will involve developing and maintaining data warehouse architecture, including data models, ETL processes, and data integration strategies.

You will create, optimize, and manage ETL processes to extract, transform, and load data from various source systems into the data warehouse. Ensuring data quality and accuracy during the ETL process by implementing data cleansing and validation procedures will be a key part of your responsibilities.

Designing and maintaining data models, schemas, and hierarchies to support efficient data retrieval and reporting will be crucial. You will implement best practices for data modeling, including star schemas, snowflake schemas, and dimension tables. Integrating data from multiple sources, both structured and unstructured, into the data warehouse will be part of your daily tasks. You will work with API endpoints, databases, and flat files to collect and process data efficiently.

Monitoring and optimizing the performance of the data warehouse, identifying and resolving bottlenecks and performance issues, will be essential. You will implement indexing, partitioning, and caching strategies for improved query performance. Enforcing data governance policies and security measures to protect sensitive data within the data warehouse will be a priority, and you will ensure compliance with data privacy regulations such as GDPR or HIPAA.

Collaborating with business intelligence teams to provide support for reporting and analytics initiatives will also be part of your role. You will assist in the creation of data marts and dashboards for end-users, and maintain comprehensive documentation of data warehouse processes, data models, and ETL workflows. Additionally, you will train and mentor junior data analysts and team members.

To qualify for this role, you should have a Bachelor's degree in computer science, information technology, or a related field. A minimum of 3 years of experience as a Data Warehouse Systems Analyst is required. Strong expertise in data warehousing concepts, methodologies, and tools, as well as proficiency in SQL, ETL tools (e.g., Informatica, Talend, SSIS), and data modeling techniques, are essential. Knowledge of data governance, data security, and compliance best practices is necessary. Excellent problem-solving and analytical skills, along with strong communication and interpersonal skills for effective collaboration with cross-functional teams, will be beneficial in this role.

Immediate joiners will be preferred for this position. If you meet the qualifications and are looking to join a dynamic team in Mumbai, we encourage you to apply.
Posted 1 week ago
4.0 - 7.0 years
15 - 25 Lacs
Noida
Work from Office
Role & responsibilities
- Collaborate with customers' business and IT teams to understand integration requirements in the B2B/Cloud/API/Data/ETL/EAI integration space and implement solutions using the Adeptia platform.
- Design, develop, and configure complex integration solutions, ensuring scalability, performance, and maintainability.
- Take ownership of assigned modules and lead the implementation lifecycle from requirement gathering to production deployment.
- Troubleshoot issues during implementation and deployment, ensuring smooth system performance.
- Guide team members in addressing complex integration challenges and promote best practices and performance practices.
- Collaborate with offshore/onshore and internal teams to ensure timely execution and coordination of project deliverables.
- Write efficient, well-documented, and maintainable code, adhering to established coding standards.
- Review code and designs of team members, providing constructive feedback to improve quality.
- Participate in Agile processes, including Sprint Planning, Daily Standups, and Retrospectives, ensuring effective task management and delivery.
- Stay updated with emerging technologies to continuously enhance technical expertise and team skills.

Preferred candidate profile
- 5-7 years of hands-on experience in designing and implementing integration solutions across B2B, ETL, EAI, Cloud, API & Data Integration environments using leading platforms such as Adeptia, Talend, MuleSoft, or equivalent enterprise-grade tools.
- Proficiency in designing and implementing integration solutions, including integration processes, data pipelines, and data mappings, to facilitate the movement of data between applications and platforms.
- Proficiency in applying data transformation and data cleansing as needed to ensure data quality and consistency across different data sources and destinations.
- Good experience in performing thorough testing and validation of data integration processes to ensure accuracy, reliability, and data integrity.
- Proficiency in working with SOA, RESTful APIs, and SOAP web services with all security policies.
- Good understanding and implementation experience with various security concepts, best practices, security standards and protocols such as OAuth, SSL/TLS, SSO, SAML, and IdP (Identity Provider).
- Strong understanding of XML, XSD, XSLT, and JSON.
- Good understanding of RDBMS/NoSQL technologies (MSSQL, Oracle, MySQL).
- Proficiency with transport protocols (HTTPS, SFTP, JDBC) and experience with messaging systems such as Kafka, ASB (Azure Service Bus) or RabbitMQ.
- Hands-on experience in Core Java and exposure to commonly used Java frameworks.
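To illustrate the API-integration pattern this profile describes, here is a minimal sketch that obtains an OAuth 2.0 client-credentials token, calls a REST endpoint over HTTPS, and maps the JSON payload into flattened records. All URLs, client IDs and field names are placeholders; a real engagement would use the platform's own connectors and the customer's actual APIs.

```python
# Minimal sketch: OAuth 2.0 client-credentials call plus a simple JSON transformation.
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"   # placeholder
ORDERS_URL = "https://api.example.com/v1/orders"      # placeholder


def get_token(client_id: str, client_secret: str) -> str:
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def fetch_orders(token: str) -> list[dict]:
    resp = requests.get(
        ORDERS_URL,
        headers={"Authorization": f"Bearer {token}", "Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    # Transformation: keep only the fields the downstream system needs.
    return [
        {"order_id": o["id"], "amount": o["total"], "currency": o["currency"]}
        for o in resp.json().get("items", [])
    ]


if __name__ == "__main__":
    token = get_token("my-client-id", "my-client-secret")  # placeholders
    print(fetch_orders(token))
```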
Posted 1 week ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Primary skills: Technology -> Data Management - Data Integration -> Informatica

A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate the same into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of project domain
- Ability to translate functional/non-functional requirements to systems requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of latest technologies and trends
Posted 1 week ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Amgen
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

About The Role
Role Description: We are seeking an experienced MDM Engineer with 8–12 years of experience to lead development and operations of our Master Data Management (MDM) platforms, with hands-on data engineering experience. This role will involve handling the backend data engineering solution within the MDM team. This is a technical role that will require hands-on work. To succeed in this role, the candidate must have strong data engineering experience with technologies such as SQL, Python, PySpark, Databricks, AWS, and API integrations.

Roles & Responsibilities:
- Develop distributed data pipelines using PySpark on Databricks for ingesting, transforming, and publishing master data
- Write optimized SQL for large-scale data processing, including complex joins, window functions, and CTEs for MDM logic
- Implement match/merge algorithms and survivorship rules using Informatica MDM or Reltio APIs
- Build and maintain Delta Lake tables with schema evolution and versioning for master data domains
- Use AWS services like S3, Glue, Lambda, and Step Functions for orchestrating MDM workflows
- Automate data quality checks using IDQ or custom PySpark validators with rule-based profiling
- Integrate external enrichment sources (e.g., D&B, LexisNexis) via REST APIs and batch pipelines
- Design and deploy CI/CD pipelines using GitHub Actions or Jenkins for Databricks notebooks and jobs
- Monitor pipeline health using Databricks Jobs API, CloudWatch, and custom logging frameworks
- Implement fine-grained access control using Unity Catalog and attribute-based policies for MDM datasets
- Use MLflow for tracking model-based entity resolution experiments if ML-based matching is applied
- Collaborate with data stewards to expose curated MDM views via REST endpoints or Delta Sharing

Basic Qualifications and Experience:
- 8 to 13 years of experience in Business, Engineering, IT or a related field

Functional Skills:

Must-Have Skills:
- Advanced proficiency in PySpark for distributed data processing and transformation
- Strong SQL skills for complex data modeling, cleansing, and aggregation logic
- Hands-on experience with Databricks including Delta Lake, notebooks, and job orchestration
- Deep understanding of MDM concepts including match/merge, survivorship, and golden record creation
- Experience with MDM platforms like Informatica MDM or Reltio, including REST API integration
- Proficiency in AWS services such as S3, Glue, Lambda, Step Functions, and IAM
- Familiarity with data quality frameworks and tools like Informatica IDQ or custom rule engines
- Experience building CI/CD pipelines for data workflows using GitHub Actions, Jenkins, or similar
- Knowledge of schema evolution, versioning, and metadata management in data lakes
- Ability to implement lineage and observability using Unity Catalog or third-party tools
- Comfort with Unix shell scripting or Python for orchestration and automation
- Hands-on experience with RESTful APIs for ingesting external data sources and enrichment feeds

Good-to-Have Skills:
- Experience with Tableau or Power BI for reporting MDM insights
- Exposure to Agile practices and tools (JIRA, Confluence)
- Prior experience in Pharma/Life Sciences
- Understanding of compliance and regulatory considerations in master data

Professional Certifications:
- Any MDM certification (e.g. Informatica, Reltio, etc.)
- Any data analysis certification (SQL, Python, PySpark, Databricks)
- Any cloud certification (AWS or Azure)

Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders
- Effective problem-solving skills to address data-related issues and implement scalable solutions
- Ability to work effectively with global, virtual teams

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

GCF Level 05A
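As an illustration of the survivorship and golden-record work described above, here is a minimal PySpark sketch that, within each matched group, keeps the record from the most trusted source and breaks ties by recency. The paths, column names and source ranking are assumptions for illustration, not the platform's actual rules.

```python
# Minimal sketch: survivorship / golden-record selection in PySpark (e.g. on Databricks).
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mdm-survivorship").getOrCreate()

# Matched records, already grouped by an upstream match process (placeholder path).
records = spark.read.format("delta").load("/mnt/mdm/matched_customers")

# Lower rank = more trusted source system (assumed ordering).
source_rank = (
    F.when(F.col("source") == "CRM", 1)
     .when(F.col("source") == "ERP", 2)
     .otherwise(3)
)

# Within each match group, order by source trust, then by recency.
w = Window.partitionBy("match_group_id").orderBy(
    source_rank.asc(), F.col("last_updated").desc()
)

golden = (
    records
    .withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)   # keep the surviving record per group
    .drop("rn")
)

golden.write.format("delta").mode("overwrite").save("/mnt/mdm/golden_customers")
```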
Posted 1 week ago
40.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lay within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

Associate IS Bus Sys Analyst - Veeva eTMF

What You Will Do
In the Veeva Vault team, you will collaborate with business partners, product owners, developers, and testers to develop release strategies for Amgen's Veeva Vault eTMF. Ensure systems' availability and performance, work closely with product managers, designers, and engineers to create scalable software solutions, automate operations, monitor system health, and respond to incidents to reduce downtime.

Roles & Responsibilities:
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Lead day-to-day operations and maintenance of Amgen's Veeva Vault eTMF and hosted applications.
- Stay updated with the latest trends, advancements and standard processes for the Veeva Vault Platform ecosystem.
- Work closely with cross-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.
- Work on integrating with other systems and platforms to ensure seamless data flow and functionality.
- Stay up to date on Veeva Vault features, new releases and best practices around Veeva Platform Governance.
- Assess and plan for the impact of Veeva changes and releases on stakeholders.
- Assist in developing and managing communication plans to engage Veeva stakeholders and provide clear messaging.
- Support the design and delivery of training programs, offering ongoing support and coaching.
- Design, develop, and implement applications and modules, including custom reports, SDKs, interfaces, and enhancements.
- Analyze and understand the functional and technical requirements of applications, solutions and systems, and translate them into software architecture and design specifications.
- Develop and execute unit tests, integration tests, and other testing strategies to ensure the quality of the software, following IS change control and the GxP Validation process while exhibiting expertise in Risk Based Validation methodology.

Work Hours: This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required. Potential shifts (subject to change based on business requirements): Second Shift: 2:00pm – 10:00pm IST; Third Shift: 10:00pm – 7:00am IST.

What We Expect Of You
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Bachelor's degree and 2 to 6 years of Information Systems experience or related field OR

Must-Have Skills:
- Experience with Veeva Vault eTMF, including Veeva configuration settings and custom builds.
- Strong knowledge of information systems and network technologies.
- Experience in building configured and custom solutions on the Veeva Vault Platform.
- Experience in managing systems, implementing and validating projects in GxP regulated environments.
- Extensive expertise in SDLC, including requirements, design, testing, data analysis, and creating and managing change controls.
- Proficiency in programming languages such as Python, JavaScript etc.
- Strong understanding of software development methodologies, including Agile and Scrum.
- Experience with version control systems such as Git.

Good-to-Have Skills:
- Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL etc.)
- Proficiency in programming languages such as Python, JavaScript or other programming languages
- Outstanding written and verbal communication skills, and ability to translate technical concepts for non-technical audiences.
- Experience with ETL tools (Informatica, Databricks).
- Experience with API integrations such as MuleSoft.
- Solid understanding of and proficiency in writing SQL queries.
- Hands-on experience with reporting tools such as Tableau, Spotfire & Power BI.

Professional Certifications:
- Veeva Vault Platform Administrator or equivalent Vault certification (must-have)
- SAFe for Teams (preferred)

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

What You Can Expect Of Us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
5.0 - 10.0 years
35 - 40 Lacs
Mumbai
Work from Office
Job Title: AI/ML and Knowledge Management
Location: Mumbai, India
Corporate Title: Assistant Vice President

Role Description
The AI/ML and Knowledge Management team will drive the build of a 1 LOD data analytics and knowledge management capability, with a view towards supporting a proactive risk management function. The candidate will take active responsibility in developing solutions to achieve strategic and business objectives. The candidate will provide technical expertise and use large language models (AI/ML) effectively to build solutions to ensure continuous development of risk MI, data models for Risk and Control assessments, regulatory sentiment analysis, and knowledge management for Technology Risk across the CB, IB and Ops divisions.

The 1st line Tech Risk function for business divisions (CB, IB and Ops) at Deutsche Bank sits within the Divisional Control Office. CB and IB front-to-back have the largest footprint as a risk-bearing function within the banking divisions, and you will be part of a dynamic team which is consistently in demand for providing insights, assessments and managing IT and IS risks on behalf of the business. As part of the team, you will join the Bank's journey and contribute towards our strategic goal of managing technology risk within appetite whilst enabling adoption of emerging and new technologies for business growth. You will do so through promoting a data-enabled risk management function that provides business-division-aligned insights for informed decision making.

The role will work closely with stakeholders within the team and in business divisions to gather requirements and provide innovative solutions for risk insights and analytics capabilities. This role will report directly to the AI/ML and Knowledge Management Lead and has no line management responsibilities.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under child care assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complementary health screening for 35 yrs. and above

Your key responsibilities
- Establish and maintain a risk reporting model for IT and IS risk. This includes operational and risk dashboards for senior management views.
- Develop a knowledge management model which helps create a structured and indexable central repository for regulatory responses and presentations for various councils.
- Leverage opportunities to design innovative solutions that facilitate the periodic Risk and Control assessments with a singular view of contextual and reference data.
- Develop process models that aid intelligent response production for multiple and global regulatory and external queries.
- Be a catalyst and an enabler for sustainable IT&IS risk reduction in line with the changing regulatory landscape and overall internal controls framework.
- Partner with reporting functions in other teams to ensure alignment with business needs and the group risk management framework.

Your skills and experience
- Minimum 5 years' experience in data analytics, including designing and implementing data models and creating meaningful dashboards that drive insights.
- Overall experience in similar roles in a Technology division or in a Banking Technology division or IT audit in a cross-cultural and diverse operating environment.
- Good understanding of industry best practices for how risk data or AI models are defined, and how data quality and integrity is maintained.
- Programming languages: Python, MS SQL; data analytics and visualization tools: Tableau, SAP objects, Informatica, Alteryx.
- Experience in developing data standards, processes, and policies, as well as developing and implementing enterprise data strategies, operational data stores and data quality tools.
- Experience with dimensional modeling, change data capture methods and implementation of data warehouse and data lakehouse architectures.
- Strong team player with a result-oriented mindset and ability to deliver under tight timelines. Must be comfortable with navigating ambiguity to extract meaningful risk insights.

How we'll support you
- Training and development to help you excel in your career
- Flexible working to assist you balance your personal priorities
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
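As a purely illustrative sketch of the risk-MI aggregation this role describes, the snippet below rolls a hypothetical risk register up into a management view with pandas before it would be published to a dashboard tool. The input file, column names and the 90-day overdue rule are assumptions, not Deutsche Bank conventions.

```python
# Minimal sketch: aggregate an IT/IS risk register into a senior-management summary.
import pandas as pd


def build_risk_summary(path: str) -> pd.DataFrame:
    risks = pd.read_csv(path, parse_dates=["raised_date"])

    # Flag items that have been open for more than 90 days (assumed threshold).
    risks["overdue"] = (pd.Timestamp.today() - risks["raised_date"]).dt.days > 90

    summary = (
        risks.groupby(["division", "severity"])
             .agg(open_items=("risk_id", "count"),
                  overdue_items=("overdue", "sum"))
             .reset_index()
             .sort_values(["division", "severity"])
    )
    return summary


if __name__ == "__main__":
    print(build_risk_summary("it_is_risk_register.csv"))  # placeholder file
```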
Posted 1 week ago
7.0 - 12.0 years
35 - 40 Lacs
Noida
Work from Office
The engineer role is to support external data transmission, operations, scheduling and middleware transmission. It requires experience in Windows and Linux environments, knowledge of Informatica MFT & Data Exchange tools, and the ability to handle day-to-day customer transmissions and Informatica MFT/DX activities.

Responsibilities:
- Design and implement complex integration solutions through collaboration with engineers, application teams and the operations team across the global enterprise.
- Provide technical support to application developers when required. This includes promoting use of best practices, ensuring standardization across applications and troubleshooting.
- Create new setups and support existing transmissions.
- Diagnose and troubleshoot transmission and connection issues.
- Experience in Windows administration; expertise in IBM Workload Scheduler is good to have.
- Hands-on experience with tools like IIS, Informatica MFT & DX console, Splunk and IBM Workload Scheduler.
- Plan, engineer, and implement new transmissions as well as migrate existing setups.
- Participate in the evaluation and recommendation of new products and technologies.
- Represent the domain in relevant automation and value innovation efforts.
- Provide technical leadership, with the ability to think strategically and effectively communicate solutions to a variety of stakeholders.
- Debug production issues by analyzing logs directly and using tools like Splunk.
- Learn new technologies based on demand and help team members by coaching and assisting.
- Willing to work in rotational shifts.
- Good communication skills, with the ability to communicate clearly and effectively.

Knowledge, Skills and Abilities

Education: Bachelor's degree in computer science, Information Systems, or a related field.

Experience:
- 7+ years of total experience and at least 4+ years of experience in designing and implementing complex integration solutions through collaboration with engineers, application and operations teams.
- Create new setups and support existing transmissions.
- Experience with tools like IIS, Informatica MFT & DX console, Splunk and IBM Workload Scheduler.
- SSH/SSL/Tectia
- Microsoft IIS
- IBM Connect:Direct
- IBM Sterling
- Informatica MFT
- Operating system knowledge (Linux/Windows/AIX)
- Troubleshooting
- Azure DevOps pipeline knowledge
- Mainframe z/OS knowledge
- OpenShift and Kube
- Enterprise scheduling knowledge (Maestro)

Good to Have:
- Python and/or PowerShell
- Agile SAFe for Teams
- Ansible (automation)
- Elastic

Other Requirements (licenses, certifications, specialized training if required)

Working Relationships:
- Internal contacts (and purpose of relationship): MetLife internal partners
- External contacts (and purpose of relationship), if applicable: MetLife external partners
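For illustration of the kind of file transmission this role supports, here is a minimal sketch of a key-based SFTP delivery scripted in Python with paramiko. The host, paths and key file are placeholders; in practice such transfers would typically run through the managed MFT platform rather than an ad hoc script.

```python
# Minimal sketch: key-based SFTP file delivery with paramiko (assumed to be installed).
import paramiko


def send_file(host: str, user: str, key_path: str, local: str, remote: str) -> None:
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin known hosts in production
    ssh.connect(hostname=host, username=user, key_filename=key_path, timeout=30)
    try:
        sftp = ssh.open_sftp()
        sftp.put(local, remote)  # upload the outbound file
        print(f"Delivered {local} to {host}:{remote}")
    finally:
        ssh.close()


if __name__ == "__main__":
    send_file(
        host="partner.example.com",                     # placeholder
        user="mft_user",                                # placeholder
        key_path="/home/etl/.ssh/id_rsa",               # placeholder
        local="/data/outbound/claims_20240101.csv",     # placeholder
        remote="/inbound/claims_20240101.csv",          # placeholder
    )
```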
Posted 1 week ago
12.0 - 17.0 years
13 - 18 Lacs
Mumbai
Work from Office
Overview:
We are seeking an experienced Data Architect with over 12 years of expertise in data engineering, big data, and cloud data solutions, particularly on Microsoft Azure. The ideal candidate will have a proven track record of delivering scalable data architectures, building enterprise data lakes, leading complex migrations, and architecting real-time and batch data pipelines. You'll be responsible for end-to-end architecture, from data ingestion and transformation to governance, analytics, and performance optimization.

Key Responsibilities:

Architecture & Design
- Design scalable, high-performance, cloud-native data architectures using Azure Data Lake, Azure Synapse, and Databricks.
- Develop high-level and low-level architecture documents (HLD/LLD) for modern data platforms.
- Define data models using star and snowflake schemas, optimizing for analytics and query performance.

Data Engineering & ETL
- Lead the development of ETL/ELT pipelines using Azure Data Factory, PySpark, Spark SQL, and Databricks.
- Manage ingestion of structured and semi-structured data from diverse sources into Azure-based data lakes and warehouses.
- Implement real-time data pipelines using Azure Event Hubs and Structured Streaming.

Governance & Security
- Define and implement data governance frameworks including lineage, cataloging, access controls, and compliance (e.g., GDPR).
- Collaborate with MDM and governance teams using tools like Informatica AXON and EDC.

Performance Tuning & Optimization
- Drive cost-efficient architecture design with partitioning, caching, indexing, and cluster optimization.
- Monitor and troubleshoot data pipelines using Azure Monitor, Log Analytics, and Databricks tools.

Stakeholder Engagement
- Collaborate with data scientists, analysts, business stakeholders, and DevOps teams to deliver robust, scalable data platforms.
- Conduct design reviews and training sessions to support platform adoption and knowledge sharing.

Key Skills & Technologies:
- Cloud Platforms: Azure (ADF, ADLS, Azure SQL, Synapse, Databricks), AWS (S3, RDS, EC2)
- Big Data: Spark, Delta Lake, PySpark, Hadoop
- ETL Tools: Azure Data Factory, Informatica, IBM DataStage
- Data Modeling: Star, Snowflake, SCD, Fact & Dimension Tables
- Programming: Python, PySpark, SQL, Shell Scripting, R
- Visualization Tools: Power BI, Tableau, Cognos
- Data Governance: Informatica MDM, AXON, EDC

Certifications Preferred:
- Microsoft Certified: Azure Data Engineer Associate
- Databricks Data Engineer Associate / Professional
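To illustrate the partitioning pattern mentioned under performance tuning, here is a minimal PySpark sketch that lands curated data as a Delta table partitioned by load date so downstream queries can prune partitions. The storage paths and column names are placeholders, and a Spark runtime with Delta Lake and configured storage credentials is assumed.

```python
# Minimal sketch: partitioned Delta write on an Azure data lake (paths are placeholders).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curated-load").getOrCreate()

# Raw zone read; assumes the cluster has credentials for the storage account.
raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/sales/")

curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("load_date", F.to_date("order_timestamp"))
)

(
    curated.write
           .format("delta")
           .mode("overwrite")
           .partitionBy("load_date")   # enables partition pruning on date filters
           .save("abfss://curated@examplelake.dfs.core.windows.net/sales/")
)
```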
Posted 1 week ago
2.0 - 7.0 years
12 - 16 Lacs
Mumbai
Work from Office
Context
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment. We are creating a strategic solution architecture horizontal team to own, translate and drive this vision into various verticals, business or technology capability block owners and strategic projects.

Job Description
Role Objective: The Senior ETL Developer will design, develop, and optimize Talend data pipelines, ensuring the seamless integration of data from multiple sources to provide actionable insights for informed decision-making across the organization. A sound understanding of databases to store structured and unstructured data with optimized modelling techniques is required, along with good exposure to the data catalog and data quality modules of any leading product (preferably Talend).

Location: Mumbai
Years of Experience: 3-5 yrs

Roles & Responsibilities:
- Business Understanding: Collaborate with business analysts and stakeholders to understand business needs and translate them into ETL solutions.
- Arch/Design Documentation: Develop comprehensive architecture and design documentation for the data landscape.
- Dev Testing & Solution: Implement and oversee development testing to ensure the reliability and performance of the solution. Provide solutions to identified issues and continuously improve application performance.
- Understanding Coding Standards, Compliance & Infosecurity: Adhere to coding standards and ensure compliance with information security protocols and best practices.
- Non-functional Requirements: Address non-functional requirements such as performance, scalability, security, and maintainability in the design and development of the Talend-based ETL solution.

Technical Skills:
- Core tool exposure: Talend Data Integration, Talend Data Catalog, Talend Data Quality, relational databases (PostgreSQL, SQL Server, etc.)
- Core concepts: ETL, data load strategy, data modelling, data governance and management, query optimization and performance enhancement
- Cloud exposure: Experience working on one of the cloud service providers (AWS, Azure, GCP, OCI, etc.)
- SQL skills: Extensive knowledge and hands-on experience with SQL, query tuning, optimization, and best-practice understanding
- Soft skills: Very good communication and presentation skills; must be able to articulate thoughts and convince key stakeholders; should be able to guide and upskill team members

Good to Have:
- Programming languages: Knowledge and hands-on experience with languages like Python and R
- Relevant certifications related to the role

Total Experience:
1. Total experience in data engineering and data lake/data warehouse between 2-7 years.
2. Primary skill set in any ETL tool (SSIS / Talend / Informatica / DataStage).
3. Must have good experience working on any database (Teradata, Snowflake, Oracle, SQL Server, Sybase IQ, PostgreSQL, Redshift, Synapse) and be able to write SQL queries/scripts.
4. Good team player, with the ability to own assigned work and take it to closure.
5. Should have a clear understanding of data lake/data warehouse concepts.
Posted 1 week ago
15.0 - 22.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Job Description:
Job Title: Salesforce Program Development, VP
Location: Bangalore, India

Role Description
The Salesforce Program Development Team within DWS Global Technology is aiming to recruit a Senior Developer. This role is ideal for an experienced Salesforce Developer who is seeking a challenging and rewarding engagement, with the potential to grow both their career and their understanding of this strategic system. In DWS Asset Management, Salesforce is used for Client Relationship Management (CRM), Know Your Customer (KYC) and to support the DWS Asset Management Sales organisation in conforming to regulatory requirements such as MiFID or GDPR (EU Data Protection Rules).

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complementary health screening for 35 yrs. and above

Your key responsibilities
Reporting to the Salesforce Application Development Manager, the key objective of this role is to provide analysis and development of issues and develop components to help manage the implementation of Salesforce project deliverables, working with the business and technical delivery teams through the end-to-end software development lifecycle to deliver a high-quality solution that meets the client's needs. Specific responsibilities of the role include:
- Work with Business Analysts and Project Managers to understand functional requirements at a high level and set development expectations as needed for specific project deliverables
- Creation of relevant technical solutions for use in a hybrid agile development environment
- Collaboration with the development team as needed to deliver metadata components into the Salesforce system following our SDLC lifecycle promotion path
- Perform senior developer duties such as creating new code components, supporting and maintaining existing code components, and supporting our production environments and implementations using bank-approved tools

Your skills and experience
This role will suit a candidate who is comfortable operating within a team and is able to see the bigger development picture, as well as being immersed in the detail. It requires a dynamic, enthusiastic self-starter, with a strong work ethic, who has a passion for delivering tangible business value. The skills and experience that are most relevant to the role are:
- Very strong experience (15 to 22 years) with Salesforce configuration and development skills, to a certified level
- 15 to 22 years' experience working on Salesforce project deliverable components in the financial sector or a similar heavily regulated industry (Asset Management/Banking being preferred)
- Experience with Salesforce Sales Cloud, Salesforce Service Cloud, Salesforce Marketing Cloud, related installed AppExchange packages APTTUS and GEOPOINTE, Salesforce1 Mobile and Einstein Analytics
- Experience with Salesforce CRM technologies such as SOQL, Lightning Components, Visualforce Components, Apex Classes, Apex Triggers, JavaScript, Java, JSON, Flows, etc.
- Experience working with tools and deployments using tools like IntelliJ, Bitbucket, Git, TeamCity, Force.com IDE, Eclipse, ANT Migration Tool, Change Sets, Data Loader, and Informatica ETL tools
- Excellent problem-solving skills, with the ability and mindset to jump in during collaboration and resolve issues
- Highly developed written and verbal communication skills, with experience breaking down business problems into technical solutions and components

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

https://www.db.com/company/company.htm

We at DWS are committed to creating a diverse and inclusive workplace, one that embraces dialogue and diverse views, and treats everyone fairly to drive a high-performance culture. The value we create for our clients and investors is based on our ability to bring together various perspectives from all over the world and from different backgrounds. It is our experience that teams perform better and deliver improved outcomes when they are able to incorporate a wide range of perspectives. We call this #ConnectingTheDots.
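Purely as an illustration of the SOQL mentioned above, here is a minimal sketch that queries Salesforce CRM data from Python using the simple_salesforce package (an assumption; the role itself centres on Apex, Lightning and Visualforce development inside the platform). The credentials are placeholders.

```python
# Minimal sketch: run a SOQL query against Salesforce via simple_salesforce (assumed installed).
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",   # placeholder
    password="***",                # placeholder
    security_token="***",          # placeholder
)

soql = """
    SELECT Id, Name, Industry
    FROM Account
    WHERE LastModifiedDate = LAST_N_DAYS:7
    LIMIT 10
"""
result = sf.query(soql)

for record in result["records"]:
    print(record["Id"], record["Name"], record["Industry"])
```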
Posted 1 week ago
10.0 - 15.0 years
35 - 40 Lacs
Hyderabad
Work from Office
Job Position Summary
The MetLife Corporate Technology (CT) organization is evolving to enable MetLife's New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission is to create innovative, transformative and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most, our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate.

We are seeking a highly skilled, hands-on Senior Techno-functional HCM Oracle Specialist who is responsible for partnering with HR leaders, third-party vendors and IT executives to lead global transformation projects with the goal of attracting, developing and retaining talent across the organization. This position will be part of a fast-paced IT team leveraging the newly implemented Oracle HCM to support the employee hire-to-retire lifecycle (supporting all processes for HR, hire, learning, pay, benefits, etc. from an IT perspective) and interfaces of over 200 internal and external policy administrations. He/she should be a strategic thinker, an effective communicator, and an expert in technological development.

Key Relationships
Internal stakeholders

Key Responsibilities
- Stakeholder Management: Manage key business stakeholders to deliver required technology capabilities to support the digital transformation agenda. Drive prioritization of the product backlog. This includes managing key vendors providing the resources, SaaS and other capabilities.
- Technology Implementation: Implement and support projects on internal audit technology platforms, specifically Azure Cloud.
- Ways of Working: Adoption of Agile ways of working in the software delivery lifecycle.
- E2E software lifecycle management (architecture, design, development, testing and production).
- Evaluate/implement technical solutions supporting internal audit and SaaS-based solutions, talent development, performance management, and workforce analytics.
- Work with functional experts to translate user requirements into technical specifications.
- Partner with internal business process owners, technical team members, and senior management throughout the project life cycle.
- Act as the intermediary to facilitate a clear understanding among all parties about business assumptions and requirements, design, technical, testing, and production migration requirements.
- Drive the resolution and troubleshooting of issues during development and post-production support.
- Support day-to-day business enhancements.

Knowledge, Skills, and Abilities

Education
- A Bachelor's/Master's degree in computer science or an equivalent engineering degree.

Candidate Qualifications:
Education: Bachelor's degree in computer science, Information Systems or a related field.

Experience:
Required:
- 10-15 years of software development experience
- Deep expertise in Oracle Fusion Cloud Human Capital Management (HCM)
- Prior lead role or project management experience
- Software development experience in one or more of the following: Oracle HCM, HCM Extracts, Fast Formulas, HSDL, APIs, Synapse ETL, Redwood, Journeys, SQL, BI reporting

Preferred:
- Ability to manage systems testing including unit, QA, end-to-end and user acceptance testing
- Familiarity with the technology landscape supporting web and mobile delivery and integration solutions such as Informatica
- Experience managing vendors to SLAs
- Experience with the software development life cycle and related activities for the implementation and maintenance of HR systems
- Proven experience collaborating with peers to establish best practices to achieve high service levels
- Experience with MS Project, Visio, Excel, PowerPoint and related project delivery utilities

Skills and Competencies:
- Language: Proficiency at business level in English.
- Communication: Ability to influence and help communicate the organization's direction and ensure results are achieved.
- Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment.
- Diverse environment: Can-do attitude and ability to work in a high-paced environment.

Tech Stack
- Development & Delivery Methods: Agile (Scaled Agile Framework)
- Development Frameworks and Languages: SQL
- Oracle HCM Cloud tool and configuration, HCM Cloud systems, HCM Data Loader, HCM Spreadsheet Data Loader, HCM Fast Formula
- Oracle HCM: Functional knowledge of Payroll, Benefits, Time and Labor, Absence, Learning, Performance and/or Compensation
Posted 1 week ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Gyansys is looking for a resource with Informatica CDGC experience for an opportunity with one of our direct customers. Role & responsibilities: 5+ years of experience with Informatica and good hands-on experience in CDGC.
Posted 1 week ago
8.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
LSEG (London Stock Exchange Group) is more than a diversified global financial markets infrastructure and data business. We are dedicated, open-access partners with a dedication to excellence in delivering the services our customers expect from us. With extensive experience, deep knowledge and worldwide presence across financial markets, we enable businesses and economies around the world to fund innovation, manage risk and create jobs. It’s how we’ve contributed to supporting the financial stability and growth of communities and economies globally for more than 300 years. Through a comprehensive suite of trusted financial market infrastructure services – and our open-access model – we provide the flexibility, stability and trust that enable our customers to pursue their ambitions with confidence and clarity. People are at the heart of what we do and drive the success of our business. Our culture of connecting, creating opportunity and delivering excellence shape how we think, how we do things and how we help our people fulfil their potential. We embrace diversity and actively seek to attract individuals with unique backgrounds and perspectives. We break down barriers and encourage teamwork, enabling innovation and rapid development of solutions that make a difference. Our workplace generates an enriching and rewarding experience for our people and customers alike. Our vision is to build an inclusive culture in which everyone feels encouraged to fulfil their potential. We know that real personal growth cannot be achieved by simply climbing a career ladder – which is why we encourage and enable a wealth of avenues and interesting opportunities for everyone to broaden and deepen their skills and expertise. As a global organisation spanning 70 countries and one rooted in a culture of growth, opportunity, diversity and innovation, LSEG is a place where everyone can grow, develop and fulfil your potential with meaningful careers. Data Platforms is an exciting collective of teams spanning product and engineering teams. LSEG Data Platforms provides a range of tools and solutions for managing, accessing, and distributing financial data. They offer various platforms, including the LSEG Data Platform, LSEG DataScope Select, and LSEG DataScope Warehouse. These platforms enable users to access both LSEG's own data and third-party data, with options for real-time, delayed, and historical data delivery. Key Responsibilities: Design, develop, and maintain automated test scripts for ETL pipelines and data warehouse solutions. Develop comprehensive test strategies and plans for data validation, data quality, end-to-end testing, and performance testing. Build and implement effective testing strategies aligned with business and technical requirements. Plan, execute, and analyze performance testing to ensure scalability and reliability of data solutions. Collaborate with data engineers, Product Managers, and stakeholders to understand data requirements and business rules. Perform in-depth data testing, including source-to-target mapping, data transformation, and data integrity checks. Implement and maintain test automation frameworks using industry-standard tools (e.g., Python, SQL, Selenium, Informatica, Talend, etc.). Analyze test results, identify data anomalies, and work with development teams to resolve issues. Ensure compliance with financial industry standards and regulatory requirements. Report on test progress, quality metrics, and team performance to management. 
Required Skills & Experience: 8-10 years of experience in test automation, with a focus on ETL and data warehouse testing. Strong expertise in building testing strategies, planning performance testing, and executing end-to-end testing. Strong expertise in SQL, data profiling, and data validation techniques. Hands-on experience with automation tools and frameworks for ETL/data testing. In-depth understanding of data warehouse concepts, data modeling, and data integration. Exposure to financial industry concepts, products, and regulatory requirements. Proven experience managing and mentoring test teams. Excellent analytical, problem-solving, and communication skills. Preferred Qualifications: Knowledge of scripting languages (Python, Shell, etc.). ISTQB or equivalent testing certification. LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership , Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone’s race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, and how it’s obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
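To make the source-to-target validation described in the ETL test automation role above more concrete, here is a minimal, hedged sketch in Python. It compares row counts and a simple aggregate between a hypothetical staging table and warehouse table (src_trades / dw_trades are assumed names) using the standard sqlite3 module; a real implementation would point the same logic at the project's actual source and warehouse databases and wrap it in the team's test framework.

import sqlite3

def reconcile(conn, src_table, tgt_table, amount_col):
    """Compare row counts and a simple aggregate between source and target tables."""
    cur = conn.cursor()
    checks = {}
    for label, table in (("source", src_table), ("target", tgt_table)):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
        checks[label] = cur.fetchone()
    return {
        "row_count_match": checks["source"][0] == checks["target"][0],
        "amount_sum_match": checks["source"][1] == checks["target"][1],
        "details": checks,
    }

if __name__ == "__main__":
    # Hypothetical in-memory example; real tests would connect to the ETL source and warehouse.
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        "CREATE TABLE src_trades (trade_id INTEGER, notional REAL);"
        "CREATE TABLE dw_trades  (trade_id INTEGER, notional REAL);"
        "INSERT INTO src_trades VALUES (1, 100.0), (2, 250.5);"
        "INSERT INTO dw_trades  VALUES (1, 100.0), (2, 250.5);"
    )
    print(reconcile(conn, "src_trades", "dw_trades", "notional"))

In practice, checks like this would typically sit inside automated test cases and be extended with column-level profiling and source-to-target mapping rules.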
Posted 1 week ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role Profile LSEG are embarking on a Finance Transformation programme, delivering our Finance Vision and redefining the way we work to bring value and deliver sustainable growth for the business. The Programme shall drive efficiencies and maximise benefits for LSEG by transforming our ability to manage Financial Crime via our internal Engineering capabilities. We have an exciting opportunity for a Data Engineer to join our dynamic team within the London Stock Exchange Group, working as part of the Financial Crime Engineering team and supporting LSEG through designing and developing data warehousing solutions and integration of our core systems. The role sits within the Corporate Engineering Delivery function, which provides technology services to Corporate Functions teams across LSEG, and we are looking for someone with demonstrated ability to contribute to our Financial Crime Engineering team. This is a hands-on development position for a candidate with a demonstrable record in data analysis and database development, excellent development and problem-solving skills, and the creativity and self-motivation to deliver critically important projects against timelines and competing priorities with execution excellence (preferably in Financial Services). The ideal candidate has supported enterprise applications across various business units and has good exposure to various database engineering platforms. As a Data Engineer, you will be responsible for: Working with a team of Developers to deliver the data-led product with the desired functionalities and business outcomes. Managing the backlog, designing, developing, delivering and supporting data warehouse changes in an Agile methodology. Ensuring data structures and database designs align with application requirements, organization standards, business goals, and scalability needs. Developing ETL pipelines and processes for OLAP systems and data warehouses. Managing build and release activities for data solutions. Assisting with architecture artefacts to support product changes. Maintaining and optimizing data modelling through continuous improvements and enhancements. Working with product owners, architects, business analysts, the scrum master and other team members to deliver change on a timely basis. Knowledge/Skills: Hands-on experience in writing advanced SQL queries. Strong analytical problem-solving skills. Hands-on development experience with solid skills in designing, developing and deploying complex applications using OLTP and OLAP based databases. Hands-on expertise in physical data modeling and DB design with solid skills in performance tuning. Hands-on experience in troubleshooting and resolving database performance issues. Hands-on experience in building systems using modern, scalable, resilient, cloud-native architectures. Experience on AWS is desirable. Hands-on experience with multiple databases, including Snowflake, and implementing complex stored procedures. Good knowledge of data modeling concepts like dimensional modeling and DWH concepts like change data capture (CDC). Hands-on experience in Matillion, Boomi or any other ETL tools. Scripting knowledge (e.g. Python, Spark etc.) would be desirable. Ability to provide production support for Data Warehouse issues such as data load problems and transformation/translation problems.
Worked in Offshore / Onsite Engagements and collaborated across time zones Should be a great teammate Should be eager to learn new technology and/or functional areas Experience At least 6 years of experience in various databases and ETL tools like Informatica, Matillion Must have ETL E2E experience in documentation, development, testing & deployment to Production Bachelor’s degree in computer science or related field Strong Relational Database background and SQL Skills Proficiency in automation and continuous delivery methods Proficiency in all aspects of the Software Development Life Cycle in an Agile environment Experience leading & managing a development team, preferably in a financial technology or banking MNC. LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership , Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone’s race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, and how it’s obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
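To illustrate the change data capture (CDC) requirement mentioned in the Financial Crime Data Engineer role above, here is a minimal, hedged sketch in Python that applies a batch of changed records to a warehouse dimension table. The table and column names (dim_customer, customer_id, segment) are hypothetical, and in the actual stack the same pattern would more likely be expressed as Snowflake SQL (for example a MERGE statement) orchestrated by Matillion rather than hand-rolled in Python.

import sqlite3

def apply_cdc_batch(conn, changes):
    """Apply a batch of CDC records (operation, key, attributes) to dim_customer."""
    cur = conn.cursor()
    for op, customer_id, name, segment in changes:
        if op == "D":  # delete captured from the source
            cur.execute("DELETE FROM dim_customer WHERE customer_id = ?", (customer_id,))
        else:  # insert or update ("I"/"U") handled as an upsert
            cur.execute(
                "UPDATE dim_customer SET name = ?, segment = ? WHERE customer_id = ?",
                (name, segment, customer_id),
            )
            if cur.rowcount == 0:
                cur.execute(
                    "INSERT INTO dim_customer (customer_id, name, segment) VALUES (?, ?, ?)",
                    (customer_id, name, segment),
                )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT, segment TEXT)")
    apply_cdc_batch(conn, [("I", 1, "Acme Ltd", "Corporate"),
                           ("U", 1, "Acme Limited", "Corporate"),
                           ("I", 2, "Beta LLP", "SME")])
    print(conn.execute("SELECT * FROM dim_customer").fetchall())

A production pipeline would also track effective dates for slowly changing dimensions and audit columns, which this sketch omits.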
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
Hyderābād
On-site
Country India Working Schedule Full-Time Work Arrangement Hybrid Relocation Assistance Available No Posted Date 25-Jul-2025 Job ID 11114 Description and Requirements Position Summary The MetLife Corporate Technology (CT) organization is evolving to enable MetLife’s New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission is to create innovative, transformative and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most, our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate. We are seeking a highly skilled, hands-on Sr Techno-functional HCM Oracle Specialist who is responsible for partnering with HR Leaders, third-party vendors and IT Executives to lead global transformation projects with the goal of attracting, developing and retaining talent across the organization. This position will be a part of a fast-paced IT team leveraging the newly implemented Oracle HCM to support the employee hire-to-retire lifecycle (supporting all processes for HR, Hire, Learning, Pay, Benefits, etc. from an IT perspective) and interfaces to over 200 internal and external policy administration systems. He/she should be a strategic thinker, an effective communicator, and an expert in technological development. Key Relationships Internal Stakeholders – Key Responsibilities Stakeholder Management - Managing key business stakeholders to deliver required technology capabilities to support the digital transformation agenda. Driving prioritization of the product backlog. This includes managing key vendors providing the resources, SaaS & other capabilities. Technology Implementation – Implement and support projects on Internal Audit Technology platforms, specifically Azure Cloud. Ways of Working – Adoption of the Agile ways of working in the software delivery lifecycle. E2E Software Lifecycle Management (Architecture, Design, Development, Testing & Production). Evaluate/Implement technical solutions supporting Internal Audit and SaaS-based solutions, talent development, performance management, and workforce analytics. Work with Functional Experts to translate user requirements into Technical Specifications. Partner with internal business process owners, technical team members, and senior management throughout the project life cycle. Act as the intermediary to facilitate a clear understanding among all parties about business assumptions and requirements, design, technical, testing, and production migration requirements. Drive the resolution and troubleshooting of issues during development and post-production support. Responsible for supporting day-to-day business enhancements. Knowledge, Skills, and Abilities Education A Bachelor's/Master's degree in computer science or equivalent Engineering degree.
Candidate Qualifications: Education: Bachelor's degree in computer science, Information Systems or related field Experience: Required: 10-15 years of software development experience Deep expertise in Oracle Fusion Cloud Human Capital Management (HCM) Prior lead role or project management experience Software development experience in one or more of the following languages: Oracle HCM, HCM extracts, Fast Formulas, HSDL, API’s, Synapse ETL, Redwood, Journeys, SQL, BI Reporting Preferred: Ability to manage systems testing including unit, QA, end to end and user acceptance testing Familiar with technology landscape supporting web and mobile delivery and integration solutions such as Informatica. Experience managing vendors to SLA’s. Experience with software development life cycle and related activities to the implementation and maintenance of HR systems Proven experience collaborating with peers to establish best practices to achieve high service levels. Experience with MS Project, Visio, Excel, PowerPoint and related project delivery utilities. Skills and Competencies: Language: Proficiency at business level in English. Competencies Communication: Ability to influence and help communicate the organization’s direction and ensure results are achieved Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment Diverse environment: Can-do attitude and ability to work in a high paced environment Tech Stack Development & Delivery Methods: Agile (Scaled Agile Framework) Development Frameworks and Languages: SQL Oracle HCM Cloud tool and configuration HCM Cloud systems HCM Data Loader HCM Spreadsheet Data Loader HCM Fast Formula Oracle HCM: Functional Knowledge of Payroll, Benefits, Time and Labor, Absence, Learning, Performance and/or Compensation About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife , through its subsidiaries and affiliates, is one of the world’s leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
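As a hedged sketch of the kind of integration work implied by the Oracle HCM techno-functional role above (HCM extracts, APIs, downstream interfaces), the Python snippet below pulls worker records from an Oracle HCM Cloud REST endpoint and writes a small flat extract. The host name, endpoint path, and field names are assumptions for illustration only; a real interface would follow the tenant's actual API version, security configuration, and agreed data contract.

import csv
import requests  # assumed available; any HTTP client would do

BASE_URL = "https://example-hcm.oraclecloud.com"          # hypothetical tenant host
WORKERS_ENDPOINT = "/hcmRestApi/resources/latest/workers"  # path assumed for illustration

def extract_workers(session, out_path, limit=25):
    """Fetch one page of worker records and write a minimal CSV extract."""
    response = session.get(BASE_URL + WORKERS_ENDPOINT,
                           params={"limit": limit, "onlyData": "true"})
    response.raise_for_status()
    items = response.json().get("items", [])
    with open(out_path, "w", newline="", encoding="utf-8") as handle:
        writer = csv.writer(handle)
        writer.writerow(["PersonNumber", "DateOfBirth"])  # illustrative fields only
        for item in items:
            writer.writerow([item.get("PersonNumber"), item.get("DateOfBirth")])
    return len(items)

if __name__ == "__main__":
    session = requests.Session()
    session.auth = ("integration.user", "example-password")  # placeholder credentials
    print(extract_workers(session, "workers_extract.csv"))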
Posted 1 week ago
4.0 years
3 - 6 Lacs
Hyderābād
On-site
CDP ETL & Database Engineer The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, have an analytical mindset, and have a background in relational modeling in a Hybrid architecture. The candidate will help drive the business towards specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams. The candidate will join a team of developers across the US, India & Costa Rica. Responsibilities: ETL Development – The CDP ETL & Database Engineer will be responsible for building pipelines to feed downstream data processes. They will be able to analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML. Implementations & Onboarding – Will work with the team to onboard new clients onto the ZMP/CDP+ platform. The candidate will solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach towards development and will be able to document processes and workflows. Incremental Change Requests – The CDP ETL & Database Engineer will be responsible for analyzing change requests and determining the best approach towards their implementation and execution. This requires the engineer to have a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes. Change Data Management – The candidate will adhere to change data management procedures and actively participate in CAB meetings where change requests will be presented. Prior to introducing change, the engineer will ensure that processes are running in a development environment. The engineer will be asked to do peer-to-peer code reviews and solution reviews before production code deployment. Collaboration & Process Improvement – The engineer will be asked to participate in knowledge-share sessions where they will engage with peers and discuss solutions, best practices, and overall approach. The candidate will be able to look for opportunities to streamline processes with an eye towards building a repeatable model to reduce implementation duration.
Familiarity with Jira for workflow and time allocation. Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives. Required Skills: ETL – ETL tools such as Talend (Preferred, not required); DMExpress – Nice to have; Informatica – Nice to have. Database – Hands-on experience with the following database technologies: Snowflake (Required); MySQL/PostgreSQL – Nice to have; familiarity with NoSQL DB methodologies (Nice to have). Programming Languages – Can demonstrate knowledge of any of the following: PLSQL; JavaScript – Strong Plus; Python – Strong Plus; Scala – Nice to have. AWS – Knowledge of the following AWS services: S3; EMR (Concepts); EC2 (Concepts); Systems Manager / Parameter Store. Understands JSON data structures and key-value pairs. Working knowledge of code repositories such as Git and Win CVS, and workflow management tools such as Apache Airflow, Kafka, Automic/Appworx, and Jira. Minimum Qualifications: Bachelor's degree or equivalent. 4+ years' experience. Excellent verbal & written communication skills. Self-starter, highly motivated. Analytical mindset. Company Summary: Zeta Global is a NYSE-listed data-powered marketing technology company with a heritage of innovation and industry leadership. Founded in 2007 by entrepreneur David A. Steinberg and John Sculley, former CEO of Apple Inc and Pepsi-Cola, the Company combines the industry's 3rd largest proprietary data set (2.4B+ identities) with Artificial Intelligence to unlock consumer intent, personalize experiences and help our clients drive business growth. Our technology runs on the Zeta Marketing Platform, which powers 'end to end' marketing programs for some of the world's leading brands. With expertise encompassing all digital marketing channels – Email, Display, Social, Search and Mobile – Zeta orchestrates acquisition and engagement programs that deliver results that are scalable, repeatable and sustainable. Zeta Global is an Equal Opportunity/Affirmative Action employer and does not discriminate on the basis of race, gender, ancestry, color, religion, sex, age, marital status, sexual orientation, gender identity, national origin, medical condition, disability, veterans status, or any other basis protected by law. Zeta Global Recognized in Enterprise Marketing Software and Cross-Channel Campaign Management Reports by Independent Research Firm https://www.forbes.com/sites/shelleykohan/2024/06/1G/amazon-partners-with-zeta-global-to-deliver- gen-ai-marketing-automation/ https://www.cnbc.com/video/2024/05/06/zeta-global-ceo-david-steinberg-talks-ai-in-focus-at-milken- conference.html https://www.businesswire.com/news/home/20240G04622808/en/Zeta-Increases-3Q%E2%80%GG24- Guidance https://www.prnewswire.com/news-releases/zeta-global-opens-ai-data-labs-in-san-francisco-and-nyc- 300S45353.html https://www.prnewswire.com/news-releases/zeta-global-recognized-in-enterprise-marketing-software-and- cross-channel-campaign-management-reports-by-independent-research-firm-300S38241.html
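As a hedged illustration of the ETL file validation mentioned in the CDP ETL & Database Engineer role above, the sketch below (Python, standard library only) checks a hypothetical JSON-lines customer feed for required keys and basic types before it would be staged for load. The field names such as email and signup_ts are illustrative assumptions, not the platform's actual layout.

import json

REQUIRED_FIELDS = {"customer_id": int, "email": str, "signup_ts": str}  # hypothetical layout

def validate_feed(lines):
    """Return (good_records, errors) for a JSON-lines feed."""
    good, errors = [], []
    for lineno, raw in enumerate(lines, start=1):
        try:
            record = json.loads(raw)
        except json.JSONDecodeError as exc:
            errors.append((lineno, f"invalid JSON: {exc}"))
            continue
        missing = [f for f in REQUIRED_FIELDS if f not in record]
        bad_types = [f for f, t in REQUIRED_FIELDS.items()
                     if f in record and not isinstance(record[f], t)]
        if missing or bad_types:
            errors.append((lineno, f"missing={missing} bad_types={bad_types}"))
        else:
            good.append(record)
    return good, errors

if __name__ == "__main__":
    sample = ['{"customer_id": 1, "email": "a@example.com", "signup_ts": "2024-01-01"}',
              '{"customer_id": "2", "email": "b@example.com"}']
    ok, errs = validate_feed(sample)
    print(len(ok), "valid records;", errs)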
Posted 1 week ago
10.0 years
0 Lacs
Hyderābād
On-site
City/Cities Hyderabad Country India Working Schedule Full-Time Work Arrangement Hybrid Relocation Assistance Available No Posted Date 25-Jul-2025 Job ID 11139 Description and Requirements This position is responsible for design, implementation, and support of MetLife's enterprise data management and integration systems, the underlying infrastructure, and integrations with other enterprise systems and applications using AIX, Linux, or Microsoft Technologies. Job Responsibilities Provide technical expertise in the planning, engineering, design, implementation and support of data management and integration system infrastructures and technologies. This includes the systems operational procedures and processes Partner with the Capacity Management, Production Management, Application Development Teams and the Business to ensure customer expectations are maintained and exceeded Participate in the evaluation and recommendation of new products and technologies, maintain knowledge of emerging technologies for application to the enterprise Identify and resolve complex data management and integration system issues (Tier 3 support) utilizing product knowledge and structured troubleshooting tools and techniques Support Disaster Recovery implementation and testing as required Experience in design and developing Automation/Scripting (shell, Perl, PowerShell, Python, Java…) Good decision-making skills Take ownership for the deliverables from the entire team Strong collaboration with leadership groups Learn new technologies based on demand Coach other team members and bring them up to speed Track project status working with team members and report to leadership Participate in cross-departmental efforts Leads initiatives within the community of practice Willing to work in rotational shifts Good Communication skill with the ability to communicate clearly and effectively Knowledge, Skills and Abilities Education Bachelor's degree in computer science, Information Systems, or related field. Experience 10+ years of total experience and at least 7+ years of experience in Informatica applications implementation and support of data management and integration system infrastructures and technologies. This includes the system's operational procedures and processes. Participate in the evaluation and recommendation of new products and technologies, maintain knowledge of emerging technologies for application to the enterprise. 
Good understanding in Disaster Recovery implementation and testing Design and developing Automation/Scripting (shell, Perl, PowerShell, Python, Java…) Informatica PowerCenter Informatica PWX Informatica DQ Informatica DEI Informatica B2B/DX Informatica MFT Informatica MDM Informatica ILM Informatica Cloud (IDMC/IICS) Ansible (Automation) Operating System Knowledge (Linux/Windows/AIX) Azure Dev Ops Pipeline Knowledge Python and/or Powershell Agile SAFe for Teams Enterprise Scheduling Knowledge (Maestro) Troubleshooting Communications CP4D Datastage Mainframe z/OS Knowledge Open Shift Elastic Experience in creating and working on Service Now tasks/tickets Other Requirements (licenses, certifications, specialized training – if required) Working Relationships Internal Contacts (and purpose of relationship): MetLife internal partners External Contacts (and purpose of relationship) – If Applicable MetLife external partners About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife , through its subsidiaries and affiliates, is one of the world’s leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
Posted 1 week ago
3.0 years
0 Lacs
Hyderābād
On-site
Job Title – Master Data Analyst Preferred Location - Hyderabad, India Full time/Part Time - Full Time Build a career with confidence Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do. Job Summary We are seeking a detail-oriented and experienced Master Data Analyst to ensure the accuracy, consistency, and integrity of our critical master data across various enterprise systems. The Master Data Analyst will play a crucial role in data governance, data quality initiatives, and supporting business processes through reliable and well-managed master data. Key Responsibilities Develop, implement, and maintain master data management (MDM) policies, standards, and procedures. Ensure data quality, completeness, and consistency of master data (e.g., customer, product, vendor, material) across all relevant systems. Perform data profiling, cleansing, and validation to identify and resolve data quality issues. Collaborate with business units and IT teams to define data definitions, business rules, and data hierarchies. Act as a data steward, overseeing the creation, modification, and deletion of master data records. Support data integration efforts, ensuring master data is accurately and efficiently synchronized between systems. Document master data processes, data flows, and data lineage. Participate in projects related to data migration, system implementations, and data governance initiatives. Provide training and support to end-users on master data best practices and tools. Required Qualifications Bachelor's degree in Information Systems, Data Science, or a related quantitative field. 3+ years of experience in a Master Data Management (MDM), Data Quality, or Data Analyst role, specifically focused on master data. Strong understanding of master data concepts, data governance principles, and data lifecycle management. Proficiency with data analysis tools and techniques. Experience with enterprise resource planning (ERP) systems (e.g., SAP, Oracle, Microsoft Dynamics) and their master data structures. Experience with cloud platforms (AWS, Azure) or relevant data technologies. Excellent analytical, problem-solving, and communication skills, with the ability to translate technical concepts to non-technical stakeholders. Proven ability to work independently and collaboratively in a fast-paced environment. Preferred Qualifications Experience with MDM software solutions (e.g., Informatica MDM, SAP MDG). Familiarity with SQL and experience querying relational databases. Knowledge of SAP modules (ECC, CRM, BW) and of data governance, metadata management, and data cataloging tools (e.g., Alation, Collibra). Familiarity handling MDM in SAP ECC and SAP S/4 versions. Knowledge of data warehousing concepts and business intelligence tools (e.g., Power BI, Tableau). Experience with data governance frameworks and tools. Certifications in data management or related fields. Benefits We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
Have peace of mind and body with our health insurance Make yourself a priority with flexible schedules and leave Policy Drive forward your career through professional development opportunities Achieve your personal goals with our Employee Assistance Program. Our commitment to you Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class. Job Applicant's Privacy Notice: Click on this link to read the Job Applicant's Privacy Notice
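As a small, hedged illustration of the data profiling and duplicate detection described in the Master Data Analyst role above, the following Python sketch (standard library only) counts missing values and duplicate keys in a hypothetical customer master extract; the file name and column names are assumptions for illustration, not Carrier's actual data model.

import csv
from collections import Counter

def profile_master_data(path, key_column="customer_id"):
    """Report missing values per column and duplicate key values for a master data extract."""
    with open(path, newline="", encoding="utf-8") as handle:
        rows = list(csv.DictReader(handle))
    missing = Counter()
    keys = Counter()
    for row in rows:
        for column, value in row.items():
            if value is None or value.strip() == "":
                missing[column] += 1
        keys[row.get(key_column, "")] += 1
    duplicates = {k: n for k, n in keys.items() if n > 1}
    return {"row_count": len(rows),
            "missing_by_column": dict(missing),
            "duplicate_keys": duplicates}

if __name__ == "__main__":
    # Hypothetical usage; customer_master.csv is an assumed extract name.
    print(profile_master_data("customer_master.csv"))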
Posted 1 week ago