
1988 Migrate Jobs - Page 6

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Area(s) of responsibility

Role Summary
An Apigee X Architect is responsible for designing and implementing API management solutions on the Apigee platform. This role involves working closely with clients to migrate from older versions of Apigee to Apigee X, ensuring seamless integration and optimal performance.

Key Responsibilities
- Architectural Design: Develop and guide Apigee implementations, focusing on API security, traffic management, and reliability.
- Client Interaction: Work directly with clients to explain architectural considerations and the differences between Apigee versions.
- Proxy Development: Design effective proxies and shared flows that adhere to best practices.
- Platform Operations: Implement and operate Apigee platforms, including installation, automation, and CI/CD integration.
- Mentorship: Serve as a mentor and advisor to other consultants, helping resolve complex Apigee challenges.

Required Skills
- Extensive experience with Apigee platforms (Edge Cloud, OPDK, Hybrid, X).
- Proficiency in Apigee proxy development and network configuration.
- Familiarity with Google Cloud Platform and Apigee components (Cassandra, nginx routers, Kubernetes).
- Strong communication skills and client-facing consulting experience.
- Knowledge of API security, especially OAuth and secure API development.
- Bachelor's degree in computer science or a related field, or equivalent experience.

Preferred Qualifications
- Certification in Apigee API Management or Google Cloud Platform

Posted 5 days ago


15.0 years

2 - 8 Lacs

Hyderābād

On-site

Job Description: Lead Software Engineer – Enterprise Solutions & Transformation

We are seeking an accomplished Lead Software Engineer with 15+ years of experience in IT and software development to architect, modernize, and deliver robust enterprise solutions. You will drive the transformation of legacy applications to modern cloud-native architectures, build and integrate scalable platforms, and champion best practices in DevOps, observability, and cross-functional collaboration. This technical leadership role is ideal for innovators passionate about enabling business agility through technology modernization and integration.

Roles and Responsibilities
- Architect, design, develop, test, and document enterprise-grade software solutions, aligning with business needs, quality standards, and operational requirements.
- Lead transformation and modernization efforts: evaluate and migrate legacy systems to modern, scalable, and maintainable architectures leveraging cloud-native technologies and microservices.
- Engineer integration solutions with platforms such as Apache Kafka, MuleSoft, and other middleware or messaging technologies to support seamless enterprise connectivity.
- Define and implement end-to-end architectures for both new and existing systems, ensuring scalability, security, performance, and maintainability.
- Collaborate with Solution and Enterprise Architects and portfolio stakeholders to analyze, plan, and realize features, enablers, and modernization roadmaps.
- Work closely with infrastructure engineers to provision, configure, and optimize cloud resources, especially within Azure (AKS, Cosmos DB, Event Hub).
- Champion containerization and orchestration using Docker and Azure Kubernetes Service (AKS) for efficient deployment and scaling.
- Drive observability: define and implement system monitoring, logging, and alerting strategies using tools such as Prometheus, Grafana, and the ELK Stack.
- Lead and participate in code and documentation reviews to uphold quality and engineering excellence.
- Mentor and coach engineers and developers, fostering technical growth and knowledge sharing.
- Troubleshoot and resolve complex issues across application, integration, and infrastructure layers.
- Advocate and implement modern DevOps practices: build and maintain robust CI/CD pipelines, Infrastructure-as-Code, and automated deployments.
- Continuously evaluate and adopt new tools, technologies, and processes to improve system quality, delivery, and operational efficiency.
- Translate business needs and legacy constraints into actionable technical requirements and provide accurate estimates for both new builds and modernization projects.
- Ensure NFRs (scalability, security, availability, performance) are defined, implemented, and maintained across all solutions.
- Collaborate cross-functionally with DevOps, support, and peer teams to ensure operational excellence and smooth transformation initiatives.

Required Qualifications
- Bachelor's or master's degree in Computer Science, Information Systems, or a related field.
- 15+ years of experience in IT and software development roles, with a track record of delivering enterprise-scale solutions.
- 5+ years of hands-on experience building Java-based, high-volume/high-transaction applications.
- 5+ years of experience with Java, Spring, and RESTful API development.
- 3+ years of experience modernizing legacy applications or leading transformation initiatives.
- 3+ years of experience in performance tuning, application monitoring, and troubleshooting.
- 3+ years of experience with integration platforms (Kafka, MuleSoft, RabbitMQ, etc.).
- 2+ years of experience architecting solutions and leading technical design for enterprise systems.
- Experience with container orchestration, especially Azure Kubernetes Service (AKS).

Preferred Qualifications
- 3+ years of experience in microservices architecture and system design.
- 3+ years in technical leadership or mentoring roles.
- 3+ years of hands-on experience with cloud platforms (Azure, AWS, GCP, OpenStack).
- Experience with cloud resource provisioning (ARM templates, Terraform, Ansible, Chef).
- Strong DevOps skills: CI/CD pipelines with GitHub, Maven, Jenkins, Nexus, SonarQube.
- Advanced knowledge of observability tooling (Prometheus, Grafana, ELK).
- Proficiency with the Unix/Linux command line and shell scripting.
- Expertise in asynchronous messaging, stream processing, and event-driven architectures.
- Experience in Agile/Scrum/Kanban environments.
- Familiarity with front-end technologies (HTML5, JavaScript frameworks, CSS3).
- Certifications in Java, Spring, Azure, or relevant integration/cloud technologies.
- Excellent communication skills for both technical and business audiences.

Technical Skills
- Languages & Frameworks: Java, Groovy, Spring (Boot, Cloud), REST
- Integration & Messaging: Kafka, MuleSoft, RabbitMQ, MQ, Redis, Hazelcast
- Legacy Modernization: Refactoring, rearchitecting, and migrating monolithic or legacy applications to modern platforms
- Databases: NoSQL (Cassandra, Cosmos DB), SQL
- Monitoring & Observability: Prometheus, Grafana, ELK Stack
- Orchestration: Docker, AKS (Azure Kubernetes Service)
- Cloud Platforms: Azure (Event Hub, Cosmos DB, AKS), AWS, GCP, OpenStack
- IaC & DevOps: Terraform, Ansible, Chef, Jenkins, Maven, Nexus, SonarQube, Git, Jira
- Scripting & Front-End: Node.js, React.js, Python, R

Why Join Us?
- Lead modernization and transformation of critical business systems to future-ready cloud architectures.
- Architect and deliver enterprise-scale, highly integrated, observable solutions.
- Mentor and inspire a talented engineering team.
- Shape the organization's technical direction in cloud, integration, and DevOps.
- Thrive in a collaborative, innovative, and growth-focused environment.
- Enjoy competitive compensation and opportunities for career advancement.
Weekly Hours: 40
Time Type: Regular
Location: IND:AP:Hyderabad / Argus Bldg 4f & 5f, Sattva, Knowledge City - Adm: Argus Building, Sattva, Knowledge City

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Posted 5 days ago


8.0 - 13.0 years

7 - 8 Lacs

Hyderābād

On-site

Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Sr Data Engineer

What you will do
Let’s do this. Let’s change the world. In this vital role you will be responsible for "Run" and "Build" project portfolio execution, collaborating with business partners and other IS service leads to deliver IS capability and a roadmap in support of business strategy and goals. Real-world data analytics, visualization, and advanced technology play a vital role in supporting Amgen’s industry-leading, innovative Real World Evidence approaches. The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and implementing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed.
The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member assisting in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies to improve ETL platform performance
- Participate in sprint planning meetings and provide estimates for technical implementation
- Collaborate and communicate effectively with product teams

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master's degree / Bachelor's degree and 8 to 13 years of experience in Computer Science, IT, or a related field

Must-Have Skills:
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, Spark SQL), workflow orchestration, and performance tuning of big data processing
- Hands-on experience with various Python/R packages for EDA, feature engineering, and machine learning model training
- Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Strong understanding of data governance frameworks, tools, and standard processes
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Preferred Qualifications:
Good-to-Have Skills:
- Experience with ETL tools such as Apache Spark, and various Python packages for data processing and machine learning model development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Knowledge of Python/R, Databricks, SageMaker, OMOP

Professional Certifications:
- Certified Data Engineer / Data Analyst (preferably on Databricks or cloud environments)
- Certified Data Scientist (preferably on Databricks or cloud environments)
- Machine Learning certification (preferably on Databricks or cloud environments)
- SAFe for Teams certification (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated ability to function in a team setting
- Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 5 days ago


5.0 - 9.0 years

7 - 8 Lacs

Hyderābād

On-site

Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Data Engineer

What you will do
Let’s do this. Let’s change the world. In this vital role you will be responsible for "Run" and "Build" project portfolio execution, collaborating with business partners and other IS service leads to deliver IS capability and a roadmap in support of business strategy and goals. Real-world data analytics, visualization, and advanced technology play a vital role in supporting Amgen’s industry-leading, innovative Real World Evidence approaches. The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed.
The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member assisting in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies to improve ETL platform performance
- Participate in sprint planning meetings and provide estimates for technical implementation
- Collaborate and communicate effectively with product teams

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
- Master's degree / Bachelor's degree and 5 to 9 years of experience in Computer Science, IT, or a related field

Must-Have Skills:
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, Spark SQL), workflow orchestration, and performance tuning of big data processing
- Hands-on experience with various Python/R packages for EDA, feature engineering, and machine learning model training
- Proficiency in data analysis tools (e.g., SQL) and experience with data visualization tools
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Strong understanding of data governance frameworks, tools, and best practices
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Preferred Qualifications:
Good-to-Have Skills:
- Experience with ETL tools such as Apache Spark, and various Python packages for data processing and machine learning model development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Knowledge of Python/R, Databricks, SageMaker, OMOP

Professional Certifications:
- Certified Data Engineer / Data Analyst (preferably on Databricks or cloud environments)
- Certified Data Scientist (preferably on Databricks or cloud environments)
- Machine Learning certification (preferably on Databricks or cloud environments)
- SAFe for Teams certification (preferred)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated ability to function in a team setting
- Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 5 days ago


7.0 years

3 - 5 Lacs

Hyderābād

On-site

Country: India
Working Schedule: Full-Time
Work Arrangement: Hybrid
Relocation Assistance Available: No
Posted Date: 28-Jul-2025
Job ID: 11149

Description and Requirements

Position Summary
The Shared Application Platform Engineering team provides enterprise configuration and support for integration technologies such as IBM middleware tools like MQ, and ensures platform stability and process improvement. Responsibilities include planning, support, and implementation of application platform infrastructure, including operational processes and procedures.

Job Responsibilities
- Handle MQ admin BAU activities such as managing queue managers (QMGRs) and objects, maintenance, patching, and configuration
- Knowledge of SSL certificate management and of security vulnerabilities in MQ
- Schedule and monitor MQ backups; perform housekeeping and daily health checks
- Install and configure IBM MQ
- Support projects to upgrade or migrate MQ to new versions and apply Fix Packs, Refresh Packs, and interim fixes (iFixes)
- Set up new queue managers and their objects
- Investigate and troubleshoot issues in MQ
- Knowledge of MQ performance tuning and optimization
- Coordinate with systems administrators, UNIX, network, and DBA teams when scheduling and implementing software patches and upgrades
- Support development/functional teams with performance tuning and troubleshooting; coordinate with the IBM vendor
- Monitor and acknowledge incidents, change tickets, SRs, and problem tickets within SLA
- Working knowledge of RCAs, SIPs, and task automation
- Provide support for MQ DR activities
- Basic knowledge of shell scripting or Ansible to create and manage MQ admin automation tasks
- Good communication and written skills; comfortable interacting with clients and stakeholders
- Create knowledge base documents and SOPs for middleware support
- Handle problem management calls and provide RCAs for P1/P2 issues
- Good knowledge of IIB and/or APIC
- Basic knowledge of IBM CP4I and/or OpenShift Container Platform (OCP)

Knowledge, Skills and Abilities
Education
- Bachelor's degree in Computer Science, Information Systems, or a related field

Experience
- 7+ years of total experience, with at least 4 years in middleware applications such as IBM MQ: admin BAU activities (managing queue managers and objects, maintenance, patching, configuration), installing and configuring IBM MQ, scheduling and monitoring MQ backups, and performing housekeeping and daily health checks
- WebMethods, WebSphere Message Broker (WMB), IBM Integration Bus (IIB), CP4I, ACE, MQ, IBM API Connect v10, App Connect Professional (Cast Iron), Linux/AIX, SDLC, SSL

Good to Have
- OpenShift (Kubernetes), Ansible (automation), Elastic, Azure DevOps, YAML/JSON, Python and/or PowerShell, Agile, SAFe for Teams, DataPower

Other Requirements (licenses, certifications, specialized training – if required)

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!

Posted 5 days ago


7.0 years

3 - 5 Lacs

Hyderābād

On-site

Country: India
Working Schedule: Full-Time
Work Arrangement: Hybrid
Relocation Assistance Available: No
Posted Date: 28-Jul-2025
Job ID: 11150

Description and Requirements

Position Summary
The Shared Application Platform Engineering team provides enterprise configuration and support for integration technologies such as IBM middleware tools like MQ, and ensures platform stability and process improvement. Responsibilities include planning, support, and implementation of application platform infrastructure, including operational processes and procedures.

Job Responsibilities
- Handle MQ admin BAU activities such as managing queue managers (QMGRs) and objects, maintenance, patching, and configuration
- Knowledge of SSL certificate management and of security vulnerabilities in MQ
- Schedule and monitor MQ backups; perform housekeeping and daily health checks
- Install and configure IBM MQ
- Support projects to upgrade or migrate MQ to new versions and apply Fix Packs, Refresh Packs, and interim fixes (iFixes)
- Set up new queue managers and their objects
- Investigate and troubleshoot issues in MQ
- Knowledge of MQ performance tuning and optimization
- Coordinate with systems administrators, UNIX, network, and DBA teams when scheduling and implementing software patches and upgrades
- Support development/functional teams with performance tuning and troubleshooting; coordinate with the IBM vendor
- Monitor and acknowledge incidents, change tickets, SRs, and problem tickets within SLA
- Working knowledge of RCAs, SIPs, and task automation
- Provide support for MQ DR activities
- Basic knowledge of shell scripting or Ansible to create and manage MQ admin automation tasks
- Good communication and written skills; comfortable interacting with clients and stakeholders
- Create knowledge base documents and SOPs for middleware support
- Handle problem management calls and provide RCAs for P1/P2 issues
- Good knowledge of IIB and/or APIC
- Basic knowledge of IBM CP4I and/or OpenShift Container Platform (OCP)

Knowledge, Skills and Abilities
Education
- Bachelor's degree in Computer Science, Information Systems, or a related field

Experience
- 7+ years of total experience, with at least 4 years in middleware applications such as IBM MQ: admin BAU activities (managing queue managers and objects, maintenance, patching, configuration), installing and configuring IBM MQ, scheduling and monitoring MQ backups, and performing housekeeping and daily health checks
- WebMethods, WebSphere Message Broker (WMB), IBM Integration Bus (IIB), CP4I, ACE, MQ, IBM API Connect v10, App Connect Professional (Cast Iron), Linux/AIX, SDLC, SSL

Good to Have
- OpenShift (Kubernetes), Ansible (automation), Elastic, Azure DevOps, YAML/JSON, Python and/or PowerShell, Agile, SAFe for Teams, DataPower

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!

Posted 5 days ago


0 years

4 - 9 Lacs

Bengaluru

On-site

Job Title: Data Analyst (Power BI / Data Visualization Specialist)

Job Description:
We are seeking an experienced Data Analyst with a strong background in designing dashboards, reports, and KPIs to support data-driven decision-making. The ideal candidate will have hands-on expertise in Power BI, SQL, and dashboard scripting platforms like PLX, with a proven ability to transform complex data into actionable business insights.

Key Responsibilities:
- Design and deploy interactive dashboards to visualize KPIs, sales trends, and customer engagement metrics.
- Develop custom reports and visualizations using Power BI, PLX Dashboard/Script, and SQL.
- Manage Power BI Server, on-premises gateways, and production workspaces to ensure optimal performance.
- Perform data modelling, create DAX expressions, and maintain datasets aligned with business objectives.
- Migrate existing reports (e.g., from Tableau) to Power BI and integrate internal applications using APIs like Salesforce Connect.
- Simplify complex technical information for stakeholders and ensure accuracy, usability, and standardization of all reports.

Required Skills:
- Power BI (Desktop & Service), DAX, PLX Dashboard/Script
- SQL, MySQL
- Strong understanding of KPI development and data storytelling
- Experience with report migration and server management
- Excellent communication and stakeholder collaboration skills

Posted 5 days ago


7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Requirements Description and Requirements Position Summary The Shared Application Platform Engineering team provides enterprise configuration and support for integration technologies, such as IBM middleware tools like MQ, and ensures platform stability and process improvement. Responsibilities include planning, support, and implementation of application platform infrastructure, including operational processes and procedures. Job Responsibilities Handle MQ admin BAU activities such as managing QMGRs and objects, maintenance, patching, and configuration Knowledge of SSL certificate management and security vulnerabilities in MQ Schedule and monitor MQ backups, performing housekeeping and daily health checks Install and configure IBM MQ Support projects for MQ upgrades or migrations to new versions, applying Fix Packs, Interim Fixes, and Refresh Packs Set up new QMGRs and their objects Investigate and troubleshoot issues in MQ Knowledge of MQ performance tuning and optimization Coordinate with systems administrators, UNIX, network, and DBA teams to schedule and implement software patches and upgrades Support development/functional teams with performance tuning and troubleshooting, and coordinate with the IBM vendor Monitor and acknowledge Incidents/Change-Tickets/SRs/Problem-Tickets within SLA Working knowledge of RCAs, SIPs, and task automation Provide support for MQ DR activities Basic knowledge of shell scripting or Ansible to create and manage MQ admin automation tasks Good communication and written skills, interacting with clients and stakeholders Create knowledge base documents and SOPs for Middleware support Handle problem management calls and provide RCAs for P1/P2 issues Good knowledge of IIB and/or APIC Basic knowledge of IBM CP4I and/or OpenShift Container Platform (OCP) Knowledge, Skills And Abilities Education Bachelor's degree in Computer Science, Information Systems, or a related field Experience 7+ years of total experience and at least 4+ years of experience in middleware applications, including MQ admin BAU activities (managing QMGRs and objects, maintenance, patching, configuration), installing and configuring IBM MQ, and scheduling and monitoring MQ backups, housekeeping, and daily health checks WebMethods WebSphere Message Broker (WMB) IBM Integration Bus (IIB) CP4I ACE MQ IBM API Connect v10 App Connect Professional (Cast Iron) Linux / AIX SDLC SSL Good to Have: OpenShift (Kubernetes) Ansible (Automation) Elastic Azure DevOps YAML/JSON Python and/or PowerShell Agile SAFe for Teams DataPower About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!
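The QMGR and object administration tasks above are typically scripted through MQ's control commands and MQSC. A minimal sketch, assuming a hypothetical queue manager and queue (all names are illustrative, not taken from the posting):

```
* Create and start a queue manager, then feed it MQSC commands:
*   crtmqm QM1 && strmqm QM1 && runmqsc QM1 < objects.mqsc

* objects.mqsc -- define a local queue and a server-connection channel
DEFINE QLOCAL('APP.ORDERS.QUEUE') MAXDEPTH(5000) DEFPSIST(YES) REPLACE
DEFINE CHANNEL('APP.SVRCONN') CHLTYPE(SVRCONN) TRPTYPE(TCP) REPLACE

* Enable TLS on the channel (available cipher specs depend on MQ version)
ALTER CHANNEL('APP.SVRCONN') CHLTYPE(SVRCONN) SSLCIPH('ANY_TLS12_OR_HIGHER')

* Daily health check: queue manager status and current queue depth
DISPLAY QMSTATUS ALL
DISPLAY QSTATUS('APP.ORDERS.QUEUE') CURDEPTH
```

Wrapping such scripts in shell or an Ansible playbook is the usual route to the automation the posting asks for.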

Posted 5 days ago

Apply


20.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About The Role OSTTRA India The Role: Specialist - Professional Services The Team: The OSTTRA Technology team is composed of Capital Markets Technology professionals who build, support and protect the applications that operate our network. The technology landscape includes high-performance, high-volume applications as well as compute-intensive applications, leveraging contemporary microservices, cloud-based architectures. The Impact: Together, we build, support, protect and manage high-performance, resilient platforms that process more than 100 million messages a day. Our services are vital to automated trade processing around the globe, managing peak volumes and working with our customers and regulators to ensure the efficient settlement of trades and effective operation of global capital markets. What’s In It For You The role will primarily focus on delivering implementations & integrations. This position may additionally be required to produce cross-training materials in the agreed, standardised formats; take on primary & secondary responsibilities when delivering implementations & integrations with other team members; and engage in product UAT cycles. Specialist - Professional Services at all levels are expected to collaborate with other members of professional services, and other internal teams, in order to deliver implementations & integrations. The expected working hours in Gurgaon are 12 pm to 9 pm. Some tasks, such as deployment of changes, are required on Sundays as part of the role. This is an excellent opportunity to be part of a team based out of Gurgaon and to work with colleagues across multiple regions globally. The role is being opened to work on new initiatives within OSTTRA. Responsibilities Implementation & Integration Deliver implementations & integrations for multiple project types across the services (currently limited to ex.
Traiana services) offered within the FX&S pillar at OSTTRA Hand over to the operations teams once live Day one check in with the customer Finalising readiness to migrate to production, and liaising with the relevant counterparties (as required) Undergoing the UAT phase with the customer directly, unilaterally identifying issues, investigating those issues, and resolving those issues with the relevant internal or external team Gathering & setting up all required static data in UAT & production (as required) System configuration in UAT and production environments Connectivity & integration set up in the product Connectivity & integration set up in IC and/or Adapters Coordinate the development of the transformer based on the spec provided by Solution Design Create any required routing in IC Ensure that all integration changes & set ups undergo the required 4-eye checks prior to deployment in production Ensure all integrations follow the integration standards outlined Work effectively as part of a professional services project team on each implementation and/or integration, alongside a project manager and solution design manager Work effectively with key internal stakeholders outside of professional services during the implementation and/or integration, such as the connectivity team, product or development teams Demonstrate a positive customer experience during implementations & integrations, regardless of whether the Technical Project Manager leads discussions or is working behind the scenes on items Update the PSA system (e.g. 
Monday.com) on a daily basis so that the project manager has the correct information on project status, risks, issues and dependencies Creating and tracking UAT plans Ensure all required implementation & integration documentation is produced in the standard formats defined, and is made available prior to the point of go-live, including the operations handover material Effectively manage time so that tasks are completed by the expected due date Cross-Training Create cross-training materials in the pre-defined standardised formats on implementation & integration processes for project types To lead implementation & integrations as a primary resource, while developing a secondary resource Develop new core skills, and take on new project types To assist a primary resource during implementation & integrations, while acting as a secondary resource Where necessary during the professional services restructure, assist with other teams in their cross-training priorities and needs Teamwork Responsive, collaborative and engaged with the internal project management team assigned to each implementation and/or integration Engage, be open and be objective in post-project retrospectives to develop the team further Product UAT Executing the required UAT runbook Operations Escalations Act as an escalation point for certain project types / services from a technical project management perspective What We’re Looking For Knowledge of a message formats such as FIX, XML, JSON or CSV Work effectively as part of a team Ability to define and document detailed workflow processes Process-oriented with excellent organisational skills Ability to fulfil required project tasks in a timely manner Customer facing skills Creative problem solver Excellent verbal and written communication skills Understanding of the services offered by the OSTTRA FX & S pillar The Location: Gurgaon, India About Company Statement OSTTRA is a market leader in derivatives post-trade processing, bringing innovation, 
expertise, processes and networks together to solve the post-trade challenges of global financial markets. OSTTRA operates cross-asset post-trade processing networks, providing a proven suite of Credit Risk, Trade Workflow and Optimisation services. Together these solutions streamline post-trade workflows, enabling firms to connect to counterparties and utilities, manage credit risk, reduce operational risk and optimise processing to drive post-trade efficiencies. OSTTRA was formed in 2021 through the combination of four businesses that have been at the heart of post trade evolution and innovation for the last 20+ years: MarkitServ, Traiana, TriOptima and Reset. These businesses have an exemplary track record of developing and supporting critical market infrastructure and bring together an established community of market participants comprising all trading relationships and paradigms, connected using powerful integration and transformation capabilities. About OSTTRA Candidates should note that OSTTRA is an independent firm, jointly owned by S&P Global and CME Group. As part of the joint venture, S&P Global provides recruitment services to OSTTRA - however, successful candidates will be interviewed and directly employed by OSTTRA, joining our global team of more than 1,200 post trade experts. OSTTRA was formed in 2021 through the combination of four businesses that have been at the heart of post trade evolution and innovation for the last 20+ years: MarkitServ, Traiana, TriOptima and Reset. OSTTRA is a joint venture, owned 50/50 by S&P Global and CME Group. With an outstanding track record of developing and supporting critical market infrastructure, our combined network connects thousands of market participants to streamline end to end workflows - from trade capture at the point of execution, through portfolio optimization, to clearing and settlement. 
Joining the OSTTRA team is a unique opportunity to help build a bold new business with an outstanding heritage in financial technology, playing a central role in supporting global financial markets. Learn more at www.osttra.com. What’s In It For You? Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. 
Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), BSMGMT203 - Entry Professional (EEO Job Group) Job ID: 317339 Posted On: 2025-07-28 Location: Gurgaon, Haryana, India
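The role above asks for knowledge of message formats such as FIX, XML, JSON or CSV. As a rough illustration of the FIX tag=value format, a toy parser using only the Python standard library (the sample message is illustrative and deliberately incomplete):

```python
# FIX tag=value messages separate fields with the SOH character (\x01).
SOH = "\x01"

def parse_fix(message: str) -> dict[str, str]:
    """Parse a raw FIX message into a {tag: value} dict."""
    fields = {}
    for pair in message.strip(SOH).split(SOH):
        tag, _, value = pair.partition("=")
        fields[tag] = value
    return fields

# A minimal (incomplete) new-order-single: tag 35=D is MsgType, 55 is Symbol.
raw = SOH.join(["8=FIX.4.4", "35=D", "49=SENDER", "56=TARGET", "55=EUR/USD"]) + SOH
msg = parse_fix(raw)
print(msg["35"], msg["55"])  # → D EUR/USD
```

Real integrations also validate the BodyLength (9) and CheckSum (10) fields; this sketch only shows the field-splitting shape of the format.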

Posted 5 days ago

Apply

14.0 - 16.0 years

0 Lacs

Haryana, India

On-site

Qualification Position Summary As a BI Architect, you will be a core member of the BI Practice at Impetus. You will enable our clients to get real value out of their petabytes of data. You will work closely with client business stakeholders to create BI strategy, architect, design and lead the implementation of end-to-end BI solutions. You will help business teams make a quantum leap in accessing key business metrics and improve customer experience across multiple client engagements. Description of Role Gather, describe and prioritize requirements with client business stakeholders Architect and design overall BI solutions including logical and physical data models, ETL workflows, dashboards, and reports Create BI development, testing and production deployment strategy Lead teams of BI leads and developers, translate designed solutions to them, review their work, provide guidance Manage client communications and alignment on BI architecture and project plans Ensure high-quality BI solution deliveries Create and maintain BI standards, guidelines and best practices Create a skillset roadmap for the company and manage skills development across teams Be a thought leader in the industry - share knowledge, conduct training sessions, write whitepapers / case studies Contribute to pre-sales activities and acquire new projects Role Skills / Requirements Experience in architecting, designing, and implementing complex BI solutions in large-scale data management environments Expertise in at least two of the following BI tools – Power BI, Tableau, Qlik, Spotfire, QuickSight, Looker, MicroStrategy, SAP BO, Cognos – along with overall knowhow of multiple BI tools Deep knowledge of BI architecture and data warehousing Experience working with high-volume databases and MPPs Experience with the preparation of data (e.g., data profiling, data cleansing, volume assessment and partitioning)
Strong knowledge of one of the cloud providers – AWS / Azure / GCP Strong skills in cloud-based data intelligence platforms (Databricks / Snowflake) Strong skills in databases (Oracle / MySQL / DB2) and expertise in writing SQL queries Experience with and understanding of Generative AI implementation in BI tools Very well versed in HiveQL / Spark SQL / Impala SQL Working knowledge of scripting languages like Perl, Shell, Python is desirable Working knowledge of data lake and data lakehouse architecture Hands-on knowledge of ETL tools Hands-on knowledge of enterprise repository tools, data modelling tools, data mapping tools, and data profiling tools Experience of working in BI platform migration project(s) Good understanding of business to build the formulae and calculations to recreate or migrate existing reports and dashboards Skills in administering Power BI, Tableau or MicroStrategy servers for maximum efficiency and performance Skills in setting up infrastructure including BI servers, sizing / capacity planning, clustered deployment architecture and ability to provide deployment solutions Experience in customizing / extending the default functionalities of BI tools Experience of working in multiple business domains (e.g. BFSI, healthcare, telecom) is desirable Experience with agile-based development Outstanding communication, problem-solving and interpersonal skills Self-starter and resourceful, skilled in identifying and mitigating risks Out-of-the-box thinker Experience 14 to 16 years Job Reference Number 13100
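The data-warehousing and SQL skills above centre on star-schema queries: a fact table joined to dimensions and aggregated. A minimal self-contained sketch with sqlite3 (tables, rows, and figures are invented for illustration):

```python
import sqlite3

# Tiny star schema: one fact table keyed to one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
    INSERT INTO fact_sales  VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# The canonical BI roll-up: join fact to dimension, aggregate a measure.
rows = con.execute("""
    SELECT d.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product d USING (product_id)
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # → [('Books', 15.0), ('Games', 7.5)]
```

The same join-and-aggregate shape is what BI tools like Power BI or Tableau generate under the hood against warehouse schemas.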

Posted 5 days ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description Role Purpose The purpose of the role is to create exceptional and detailed architectural application designs, provide thought leadership, and enable delivery teams to deliver exceptional client engagement and satisfaction. Do 1. Develop application architecture for new deals / major change requests in existing deals a. Creates an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable. b. Manages application assets and directs the development efforts within an enterprise to improve solution delivery and agility c. Guides how to construct and assemble application components and services to support solution architecture and application development d. Maintains the frameworks and artefacts used in the implementation of an application, with reference to the systematic architecture of the overall application portfolio e. Responsible for application architecture paradigms such as service-oriented architecture (SOA) and, more specifically, microservices, ensuring businesses achieve agility and scalability for a faster time to market f.
Provide solutions for RFPs received from clients and ensure overall design assurance Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration design framework/architecture Create complete RFP responses tailored to the client's needs, standards, and technology stacks Provide technical leadership to the design, development and implementation of custom solutions through thoughtful use of modern technology Define and understand current state solutions and identify improvements, options & tradeoffs to define target state solutions Clearly articulate and sell architectural targets, recommendations and reusable patterns and accordingly propose investment roadmaps Evaluate and recommend solutions to integrate with the overall technology ecosystem Track industry and application trends and relate these to planning current and future IT needs g. Provides technical and strategic inputs during the project planning phase in the form of technical architectural designs and recommendations h. Account mining to find opportunities in existing clients i. Collaborates with all relevant parties in order to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture. j. Identifies implementation risks and potential impacts. k. Create new revenue streams within applications as APIs that can be leveraged by clients l. Bring knowledge of automation to applications by embracing Agile and DevOps principles to reduce manual effort 2. Understand application requirements and design a standardized application a. Creating Intellectual Property in the form of services, patterns, models and organizational approaches b. Designing patterns, best practices and reusable applications that can be used for future references c.
Ensure system capabilities are consumed by system components and set criteria for evaluating technical and business value in terms of Tolerate, Invest, Migrate and Eliminate d. Provide a platform to create standardized tools and ensure uniform design and techniques are maintained to reduce maintenance costs e. Coordinating input on risks, costs and opportunities for concepts f. Developing customised applications for customers aligned with their needs g. Perform design and code reviews thoroughly on a regular basis, keeping in mind the security measures h. Understanding design and production procedures and standards to create prototypes and finished products i. Work closely with systems analysts, software developers, data managers and other team members to ensure successful production of application software j. Offer viable solutions for various systems and architectures to different types of businesses k. Seamless integration of new and existing systems to eliminate potential problems, maintain data structure, and bring value in terms of development l. Transforming all applications into digital form and implementing and evolving a mesh app and service architecture that supports new technologies like IoT, blockchain, machine learning, automation, bots, etc. m. Cloud Transformation (Migration): Understanding non-functional requirements Producing artefacts such as deployment architecture and interface catalogue Identifying internal and external dependencies, vendor and internal IT management Supporting build and testing teams n. Cloud Transformation (Modernization): Understanding and defining target architecture in the integration space Assessing project pipeline/demand and aligning to target architecture Technical support of the delivery team in terms of POCs and technical guidance o. Keep up to date with the latest technologies in the market Mandatory Skills: SAP HANA Cloud Integration. Experience: 8-10 Years. Reinvent your world. We are building a modern Wipro.
We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 5 days ago

Apply

3.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Role – Syniti ADMM - Staff Job Description Experience – 3-5 years Experience in Syniti SAP Advanced Data Migration and Management (ADMM) design and architecture Should have worked on preparation of Technical Specification documents. Strong hands-on experience in ADMM as technical developer Thorough knowledge of LTMC / Migrate your Data App Sound Knowledge of SQL Experience on creating Profiling rules, Construction sheets and Syniti Replicate tool Good analytical skills to analyse the ETL issues and fix them independently Should have experience in SAP Data Migration project with an end-to-end implementation using ADMM. Should have knowledge on working with SAP system as source and target. Should be able to connect to customers and gather requirements and work independently on those requirements. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
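The profiling-rule work mentioned above amounts to checks such as null counts and duplicate keys on source data before migration. A toy sketch in plain Python (the field names are hypothetical and this is not Syniti ADMM's API, just the shape of the checks):

```python
from collections import Counter

def profile(records: list[dict], key: str) -> dict:
    """Basic data-migration profiling: null counts per field, duplicate keys."""
    nulls = {field: 0 for field in records[0]}
    keys = Counter()
    for rec in records:
        keys[rec[key]] += 1
        for field, value in rec.items():
            if value in (None, ""):
                nulls[field] += 1
    dupes = sorted(k for k, n in keys.items() if n > 1)
    return {"rows": len(records), "nulls": nulls, "duplicate_keys": dupes}

source = [
    {"MATNR": "1001", "NAME": "Pump"},
    {"MATNR": "1002", "NAME": ""},      # missing name -> counted as null
    {"MATNR": "1001", "NAME": "Pump"},  # duplicate material number
]
report = profile(source, key="MATNR")
print(report["duplicate_keys"], report["nulls"]["NAME"])  # → ['1001'] 1
```

In an ADMM project the equivalent rules would run as SQL against staging tables, with failures reported back in construction sheets for remediation.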

Posted 5 days ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Role – Syniti ADMM - Staff Job Description Experience – 3-5 years Experience in Syniti SAP Advanced Data Migration and Management (ADMM) design and architecture Should have worked on preparation of Technical Specification documents. Strong hands-on experience in ADMM as technical developer Thorough knowledge of LTMC / Migrate your Data App Sound Knowledge of SQL Experience on creating Profiling rules, Construction sheets and Syniti Replicate tool Good analytical skills to analyse the ETL issues and fix them independently Should have experience in SAP Data Migration project with an end-to-end implementation using ADMM. Should have knowledge on working with SAP system as source and target. Should be able to connect to customers and gather requirements and work independently on those requirements. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 5 days ago

Apply

13.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

MAIN PURPOSE OF ROLE Manage a group of customers to achieve designated sales target levels. Develop profitable business with new and existing customers. Main Responsibilities Possess and apply detailed product knowledge as well as thorough knowledge of the client's business. Responsible for the direct sales process, aiming at meeting and/or exceeding sales targets. In charge of sales expansion; introduces new products/services to clients and organizes visits to current and potential clients. Submit short- and long-range sales plans and prepare sales strategies utilizing available marketing programs to reach nominated targets. Responsible for retaining long-term customer relationships with established clients. Ensure that clients receive high-quality customer service. Inform clients of new products and services as they are introduced, and migrate information to the appropriate sales representative when clients have additional service needs. Qualifications Associates Degree (± 13 years)

Posted 5 days ago

Apply

4.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: BI Developer (JasperReports / Jaspersoft) Location: Chennai, Tamil Nadu (Onsite – Mandatory) Job Type: Full-Time Experience Required: 4 to 7 Years Start Date: Immediate Joiner – Mandatory Job Summary: We are seeking a skilled and detail-oriented BI Developer with hands-on experience in Jaspersoft / JasperReports for a full-time onsite position based in Chennai, Tamil Nadu. The ideal candidate will play a critical role in designing, developing, and maintaining business intelligence reports and dashboards. Strong expertise in PL/SQL and database management is essential. Key Responsibilities: Design, develop, and manage BI reports and dashboards using Jaspersoft / JasperReports. Convert and migrate reports from legacy systems into Jaspersoft. Write and optimize complex PL/SQL queries for efficient reporting. Ensure high-quality, error-free output within tight deadlines. Collaborate with cross-functional teams in a fast-paced environment. Required Skills: 4 to 7 years of hands-on experience with Jaspersoft / JasperReports. Proficiency in PL/SQL and relational databases such as Oracle and MySQL. Experience in report migration and conversion. Strong analytical, debugging, and problem-solving skills. Good communication skills and the ability to work independently. Preferred Qualifications: Candidates based in Tamil Nadu will be given preference. Experience working in Agile development environments is a plus.
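For context on the JasperReports work above: reports are defined in JRXML, where a SQL query populates fields consumed by report bands. A pared-down, hypothetical sketch (the query, field names, and layout are invented for illustration):

```xml
<jasperReport xmlns="http://jasperreports.sourceforge.net/jasperreports"
              name="sales_summary" pageWidth="595" pageHeight="842">
  <queryString>
    <![CDATA[SELECT region, SUM(amount) AS revenue FROM sales GROUP BY region]]>
  </queryString>
  <field name="region" class="java.lang.String"/>
  <field name="revenue" class="java.math.BigDecimal"/>
  <detail>
    <band height="20">
      <textField>
        <reportElement x="0" y="0" width="200" height="20"/>
        <textFieldExpression><![CDATA[$F{region}]]></textFieldExpression>
      </textField>
    </band>
  </detail>
</jasperReport>
```

Migrating a legacy report largely means recreating its query in the queryString, mapping result columns to fields, and rebuilding the layout band by band.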

Posted 5 days ago

Apply

2.5 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: SAP B1 Functional Consultant Location: On Site Experience: 2.5 Years+ (Experience in SAP B1 Preferred) Office Timings: 9:30 AM – 6:30 PM Salary: Based on Interview Notice Period: Immediate Joiner Job Description: We are hiring a Techno-Functional Consultant to lead and support ERP implementations across modules. The role includes requirement gathering, configuration, user training, UAT support, and post-go-live assistance. Key Responsibilities: Lead end-to-end ERP implementation Gather and document business requirements Configure ERP modules (Procurement, Sales, Production) Conduct training and support UAT Coordinate with technical teams for customizations Prepare and migrate opening balances Provide post-go-live support Skills Required: ERP Implementation Experience SQL Queries, Crystal Reports Client Handling, BBP & Document Preparation Strong Communication & Problem-Solving Willing to travel/relocate Finance & Accounts knowledge (preferred) Qualification: B.Tech / B.E / MCA / MBA / M.Tech

Posted 5 days ago

Apply

3.0 years

15 - 20 Lacs

Madurai, Tamil Nadu

On-site

Dear Candidate, Greetings of the day!! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ Or Email: kanthasanmugam.m@techmango.net Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is to deliver strategic technology solutions that advance its business partners' goals. We are a full-scale leading Software and Mobile App Development Company. Techmango is driven by the mantra "Clients' Vision is our Mission", and we stay true to this statement. Our aim is to be the technologically advanced and most loved organization providing prime quality and cost-efficient services with a long-term client relationship strategy. We are operational in the USA - Chicago, Atlanta; Dubai - UAE; and in India - Bangalore, Chennai, Madurai, Trichy. Techmango: https://www.techmango.net/ Job Title: GCP Data Engineer Location: Madurai Experience: 5+ Years Notice Period: Immediate About TechMango TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project. Role Summary As a GCP Data Engineer, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.
Key Responsibilities: Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP). Define data strategy, standards, and best practices for cloud data engineering and analytics. Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery. Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery). Architect data lakes, warehouses, and real-time data platforms. Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP). Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers. Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards. Provide technical leadership in architectural decisions and future-proofing the data ecosystem. Required Skills & Qualifications: 5+ years of experience in data architecture, data engineering, or enterprise data platforms. Minimum 3 years of hands-on experience with GCP data services. Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python / Java / SQL; data modeling (OLTP, OLAP, star/snowflake schema). Experience with real-time data processing, streaming architectures, and batch ETL pipelines. Good understanding of IAM, networking, security models, and cost optimization on GCP. Prior experience in leading cloud data transformation projects. Excellent communication and stakeholder management skills. Preferred Qualifications: GCP Professional Data Engineer / Architect Certification. Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics. Exposure to AI/ML use cases and MLOps on GCP. Experience working in agile environments and client-facing roles. What We Offer: Opportunity to work on large-scale data modernization projects with global clients. A fast-growing company with a strong tech and people culture.
Competitive salary, benefits, and flexibility. Collaborative environment that values innovation and leadership. Job Type: Full-time. Pay: ₹1,500,000.00 - ₹2,000,000.00 per year. Application Questions: Current CTC? Expected CTC? Notice period? (If you are serving your notice period, please mention your last working day.) Experience: GCP Data Architecture: 3 years (Required). BigQuery: 3 years (Required). Cloud Composer (Airflow): 3 years (Required). Location: Madurai, Tamil Nadu (Required). Work Location: In person
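As an illustration of the ingestion work described in the responsibilities above, here is a minimal, hypothetical parse-and-validate step written as a plain Python function. In a real Dataflow job it would be wrapped in a `beam.Map`/`beam.Filter` stage between the Pub/Sub source and the BigQuery sink; the message schema and field names are assumptions for the sketch, not anything specified by the posting.

```python
import json
from datetime import datetime, timezone

def parse_event(raw: bytes):
    """Parse one Pub/Sub message into a BigQuery-ready row.

    Returns None for malformed records so a downstream filter stage
    can route them to a dead-letter sink. The schema ("order_id",
    "amount") is illustrative only.
    """
    try:
        event = json.loads(raw.decode("utf-8"))
        return {
            "order_id": str(event["order_id"]),
            "amount": float(event["amount"]),
            "ingested_at": datetime.now(timezone.utc).isoformat(),
        }
    except (KeyError, ValueError, UnicodeDecodeError):
        return None

# In an Apache Beam pipeline this step would sit between source and sink:
#   events | beam.Map(parse_event) | beam.Filter(lambda r: r is not None)
```

Keeping the transform a pure function like this makes it unit-testable without spinning up a pipeline runner, which is one common way teams structure Dataflow code.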

Posted 5 days ago

Apply

9.0 - 15.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title- Snowflake Data Architect Experience- 9 to 15 Years Location- Gurugram Job Summary: We are seeking a highly experienced and motivated Snowflake Data Architect & ETL Specialist to join our growing Data & Analytics team. The ideal candidate will be responsible for designing scalable Snowflake-based data architectures, developing robust ETL/ELT pipelines, and ensuring data quality, performance, and security across multiple data environments. You will work closely with business stakeholders, data engineers, and analysts to drive actionable insights and ensure data-driven decision-making. Key Responsibilities: Design, develop, and implement scalable Snowflake-based data architectures. Build and maintain ETL/ELT pipelines using tools such as Informatica, Talend, Apache NiFi, Matillion, or custom Python/SQL scripts. Optimize Snowflake performance through clustering, partitioning, and caching strategies. Collaborate with cross-functional teams to gather data requirements and deliver business-ready solutions. Ensure data quality, governance, integrity, and security across all platforms. Migrate legacy data warehouses (e.g., Teradata, Oracle, SQL Server) to Snowflake. Automate data workflows and support CI/CD deployment practices. Implement data modeling techniques including dimensional modeling, star/snowflake schema, normalization/denormalization. Support and promote metadata management and data governance best practices. Technical Skills (Hard Skills): Expertise in Snowflake: Architecture design, performance tuning, cost optimization. Strong proficiency in SQL, Python, and scripting for data engineering tasks. Hands-on experience with ETL tools: Informatica, Talend, Apache NiFi, Matillion, or similar. Proficient in data modeling (dimensional, relational, star/snowflake schema). Good knowledge of Cloud Platforms: AWS, Azure, or GCP. Familiar with orchestration and workflow tools such as Apache Airflow, dbt, or DataOps frameworks. 
Experience with CI/CD tools and version control systems (e.g., Git). Knowledge of BI tools such as Tableau, Power BI, or Looker. Certifications (Preferred/Required): ✅ Snowflake SnowPro Core Certification – Required or Highly Preferred ✅ SnowPro Advanced Architect Certification – Preferred ✅ Cloud Certifications (e.g., AWS Certified Data Analytics – Specialty, Azure Data Engineer Associate) – Preferred ✅ ETL Tool Certifications (e.g., Talend, Matillion) – Optional but a plus Soft Skills: Strong analytical and problem-solving capabilities. Excellent communication and collaboration skills. Ability to translate technical concepts into business-friendly language. Proactive, detail-oriented, and highly organized. Capable of multitasking in a fast-paced, dynamic environment. Passionate about continuous learning and adopting new technologies. Why Join Us? Work on cutting-edge data platforms and cloud technologies Collaborate with industry leaders in analytics and digital transformation Be part of a data-first organization focused on innovation and impact Enjoy a flexible, inclusive, and collaborative work culture
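The performance-tuning duties above mention Snowflake clustering. As a rough sketch of what that looks like in practice, the snippet below renders illustrative `CREATE TABLE ... CLUSTER BY` DDL from Python; the table and column names are hypothetical, and a real deployment would execute such DDL through the Snowflake connector or a tool like dbt.

```python
def clustered_table_ddl(table: str, columns: dict, cluster_by: list) -> str:
    """Render CREATE TABLE DDL with a Snowflake CLUSTER BY clause.

    Clustering keys help Snowflake prune micro-partitions on large,
    frequently filtered tables. Names here are purely illustrative.
    """
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns.items())
    keys = ", ".join(cluster_by)
    return f"CREATE TABLE {table} ({cols}) CLUSTER BY ({keys});"

# Hypothetical fact table clustered on the columns queries filter by most.
ddl = clustered_table_ddl(
    "sales_fact",
    {"sale_date": "DATE", "region": "VARCHAR", "amount": "NUMBER(12,2)"},
    ["sale_date", "region"],
)
```

Choosing low-cardinality, frequently filtered columns (such as a date plus a region) as clustering keys is the usual guidance; over-clustering on high-cardinality keys tends to raise recluster cost without improving pruning.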

Posted 5 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Cloud Data Engineer | Database Administrator | ETL & Power BI | DevOps Enthusiast. Job Location: Hyderabad / Chennai. Job Type: Full Time. Experience: 6+ Yrs. Notice Period: Immediate to 15-day joiners are highly preferred. About the Role: We are seeking a Cloud Data Engineer & Database Administrator to join our Cloud Engineering team and support our cloud-based data infrastructure. This role focuses on optimizing database operations, enabling analytics/reporting tools, and driving automation initiatives to improve scalability, reliability, and cost efficiency across the data platform. Key Responsibilities: Manage and administer cloud-native databases, including Azure SQL, PostgreSQL Flexible Server, Cosmos DB (vCore), and MongoDB Atlas. Automate database maintenance tasks (e.g., backups, performance tuning, auditing, and cost optimization). Implement and monitor data archival and retention policies to enhance query performance and reduce costs. Build and maintain Jenkins pipelines and Azure Automation jobs for database and data platform operations. Design, develop, and maintain dashboards for cost tracking, performance monitoring, and usage analytics (Power BI/Tableau). Enable and manage authentication and access controls (Azure AD, MFA, RBAC). Collaborate with cross-functional teams to support workflows in Databricks, Power BI, and other data tools. Write and maintain technical documentation and standard operating procedures (SOPs) for data platform operations. Work with internal and external teams to ensure alignment of deliverables and data platform standards. Preferred Qualifications: Proven experience with cloud platforms (Azure preferred; AWS or GCP acceptable). Strong hands-on expertise with relational and NoSQL databases. Experience with Power BI (DAX, data modeling, performance tuning, and troubleshooting). Familiarity with CI/CD tools (Jenkins, Azure Automation) and version control (Git).
Strong scripting knowledge (Python, Bash, PowerShell) and experience with Jira, Confluence, and ServiceNow. Understanding of cloud cost optimization and billing/usage tracking. Experience implementing RBAC, encryption, and security best practices. Excellent problem-solving skills, communication, and cross-team collaboration abilities. Nice to Have: Hands-on experience with Databricks, Apache Spark, or Lakehouse architecture. Familiarity with logging, monitoring, and incident response for data platforms. Understanding of Kubernetes, Docker, Terraform, and advanced CI/CD pipelines. Required Skills: Bachelor's degree in computer science, Information Technology, or a related field (or equivalent professional experience). 6+ years of professional experience in data engineering or database administration. 3+ years of database administration experience in Linux and cloud/enterprise environments. About the Company: Everest DX – We are a Digital Platform Services company, headquartered in Stamford. Our platform/solution includes orchestration, intelligent operations with BOTs, and AI-powered analytics for enterprise IT. Our vision is to enable digital transformation for enterprises to deliver seamless customer experience, business efficiency, and actionable insights through an integrated set of futuristic digital technologies. Digital Transformation Services: We specialize in designing, building, developing, integrating, and managing cloud solutions, modernizing data centers, building cloud-native applications, and migrating existing applications into secure, multi-cloud environments to support digital transformation. Our Digital Platform Services enable organizations to reduce IT resource requirements and improve productivity, in addition to lowering costs and speeding digital transformation. Digital Platform: Cloud Intelligent Management (CiM), an autonomous hybrid cloud management platform that works across multi-cloud environments.
It helps enterprises get the most out of their cloud strategy while reducing cost and risk and increasing speed. To know more, please visit: http://www.everestdx.com
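One of the responsibilities above is implementing data archival and retention policies. A minimal, platform-agnostic sketch of the decision logic follows; the 35-day default window and the input shape are assumptions for illustration, and in practice a scheduled job (for example, an Azure Automation runbook) would call something like this before purging snapshots.

```python
from datetime import date, timedelta

def expired_backups(backups, today, retention_days=35):
    """Return the backup dates that fall outside the retention window.

    Everything strictly older than (today - retention_days) is a purge
    candidate; the retention length is an illustrative assumption.
    """
    cutoff = today - timedelta(days=retention_days)
    return sorted(d for d in backups if d < cutoff)

# Example: with a 35-day window ending 2025-03-10, only the January
# snapshot is outside the window.
stale = expired_backups(
    [date(2025, 1, 1), date(2025, 2, 20), date(2025, 3, 1)],
    today=date(2025, 3, 10),
)
```

Keeping the cutoff computation separate from the deletion call makes the policy easy to dry-run and audit before anything is actually removed.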

Posted 5 days ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Senior Data Engineer
Job Date: Jun 29, 2025
Job Requisition Id: 60803
Location: Pune, MH, IN
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world and it drives us beyond generational gaps and disruptions of the future. We are looking forward to hire Data Warehouse Professionals in the following areas : Job Description: Senior Data Engineer As a Senior Data Engineer, you will support the European World Area using the Windows & Azure suite of Analytics & Data platforms. The focus of the role is on the technical aspects and implementation of data gathering, integration and database design. We look forward to seeing your application! In This Role, Your Responsibilities Will Be: Data Ingestion and Integration: Collaborate with Product Owners and analysts to understand data requirements & design, develop, and maintain data pipelines for ingesting, transforming, and integrating data from various sources into Azure Data Services. Migration of existing ETL packages: Migrate existing SSIS packages to Synapse pipelines Data Modelling: Assist in designing and implementing data models, data warehouses, and databases in Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services. Data Transformation: Develop ETL (Extract, Transform, Load) processes using SQL Server Integration Services (SSIS), Azure Synapse Pipelines, or other relevant tools to prepare data for analysis and reporting. Data Quality and Governance: Implement data quality checks and data governance practices to ensure the accuracy, consistency, and security of data assets. Monitoring and Optimization: Monitor and optimize data pipelines and workflows for performance, scalability, and cost efficiency. Documentation: Maintain comprehensive documentation of processes, including data lineage, data dictionaries, and pipeline schedules. Collaboration: Work closely with cross-functional teams, including data analysts, data scientists, and business stakeholders, to understand their data needs and deliver solutions accordingly. 
Azure Services: Stay updated on Azure data services and best practices to recommend and implement improvements in our data architecture and processes. For This Role, You Will Need: 3-5 years of experience in data warehousing with on-premises or cloud technologies. Strong practical experience with Synapse pipelines / ADF. Strong practical experience developing ETL packages using SSIS. Strong practical experience with T-SQL or any variant from other RDBMS. Graduate degree in computer science or a relevant subject. Strong analytical and problem-solving skills. Strong communication skills in dealing with internal customers from a range of functional areas. Willingness to work flexible hours according to project requirements. Technical documentation skills. Fluent in English. Preferred Qualifications That Set You Apart: Oracle PL/SQL. Experience working with Azure services like Azure Synapse Analytics and Azure Data Lake. Working experience with Azure DevOps, paired with knowledge of Agile and/or Scrum methods of delivery. Languages: French, Italian, or Spanish would be an advantage. Agile certification. Who You Are: You understand the importance and interdependence of internal customer relationships. You seek out experts and innovators to learn about the impact emerging technologies might have on your business. You focus on priorities and set stretch goals. Our Offer to You: We understand the importance of work-life balance and are dedicated to supporting our employees' personal and professional needs. From competitive benefits plans and comprehensive medical care to equitable opportunities for growth and development, we strive to create a workplace that is supportive and rewarding. Depending on location, our flexible work-from-home policy allows you to make the best of your time, by combining quiet home office days with collaborative experiences in the office so that you can personalize your work-life mix.
Moreover, our global volunteer employee resource groups will empower you to connect with peers that share the same interests, promote diversity and inclusion, and positively contribute to communities around us. At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
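The role above includes implementing data quality checks before data reaches the warehouse. A minimal, tool-agnostic sketch of such a gate follows; the field names are hypothetical, and in the Synapse/SSIS stack the posting describes, the same checks would typically run as a pipeline validation step rather than standalone Python.

```python
def quality_report(rows, required, unique_key):
    """Basic data-quality gate for a pipeline load.

    Counts rows with missing required fields and duplicated keys so a
    load can be failed or quarantined before it reaches the warehouse.
    Field names in the example below are illustrative.
    """
    missing = sum(
        1 for r in rows if any(r.get(f) in (None, "") for f in required)
    )
    seen, dupes = set(), 0
    for r in rows:
        key = r.get(unique_key)
        if key in seen:
            dupes += 1
        seen.add(key)
    return {"rows": len(rows), "missing_required": missing, "duplicate_keys": dupes}

# One duplicate id and one empty required field in this toy batch.
report = quality_report(
    [{"id": 1, "name": "a"}, {"id": 1, "name": "b"}, {"id": 2, "name": ""}],
    required=["id", "name"],
    unique_key="id",
)
```

Returning counts rather than raising immediately lets the pipeline apply a threshold policy, for example failing the load only when the defect rate exceeds an agreed limit.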

Posted 6 days ago

Apply

15.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description: Lead Software Engineer – Enterprise Solutions & Transformation We are seeking an accomplished Lead Software Engineer with 15+ years of experience in IT and software development to architect, modernize, and deliver robust enterprise solutions. You will drive the transformation of legacy applications to modern cloud-native architectures, build and integrate scalable platforms, and champion best practices in DevOps, observability, and cross-functional collaboration. This technical leadership role is ideal for innovators passionate about enabling business agility through technology modernization and integration. Roles and Responsibilities Architect, design, develop, test, and document enterprise-grade software solutions, aligning with business needs, quality standards, and operational requirements. Lead transformation and modernization efforts: Evaluate and migrate legacy systems to modern, scalable, and maintainable architectures leveraging cloud-native technologies and microservices. Engineer integration solutions with platforms such as Apache Kafka, MuleSoft, and other middleware or messaging technologies to support seamless enterprise connectivity. Define and implement end-to-end architectures for both new and existing systems, ensuring scalability, security, performance, and maintainability. Collaborate with Solution and Enterprise Architects and portfolio stakeholders to analyze, plan, and realize features, enablers, and modernization roadmaps. Work closely with infrastructure engineers to provision, configure, and optimize cloud resources, especially within Azure (AKS, Cosmos DB, Event Hub). Champion containerization and orchestration using Docker and Azure Kubernetes Service (AKS) for efficient deployment and scaling. Drive observability: Define and implement system monitoring, logging, and alerting strategies using tools such as Prometheus, Grafana, and ELK Stack. 
Lead and participate in code and documentation reviews to uphold quality and engineering excellence. Mentor and coach engineers and developers, fostering technical growth and knowledge sharing. Troubleshoot and resolve complex issues across application, integration, and infrastructure layers. Advocate and implement modern DevOps practices: Build and maintain robust CI/CD pipelines, Infrastructure-as-Code, and automated deployments. Continuously evaluate and adopt new tools, technologies, and processes to improve system quality, delivery, and operational efficiency. Translate business needs and legacy constraints into actionable technical requirements and provide accurate estimates for both new builds and modernization projects. Ensure NFRs (scalability, security, availability, performance) are defined, implemented, and maintained across all solutions. Collaborate cross-functionally with DevOps, support, and peer teams to ensure operational excellence and smooth transformation initiatives. Required Qualifications Bachelor’s or master’s degree in computer science, Information Systems, or a related field. 15+ years of experience in IT and software development roles, with a track record of delivering enterprise-scale solutions. 5+ years of hands-on experience building Java-based, high-volume/high-transaction applications. 5+ years of experience with Java, Spring, and RESTful API development. 3+ years of experience in modernizing legacy applications or leading transformation initiatives. 3+ years of experience in performance tuning, application monitoring, and troubleshooting. 3+ years of experience with integration platforms (Kafka, MuleSoft, RabbitMQ, etc.). 2+ years of experience architecting solutions and leading technical design for enterprise systems. Experience working with container orchestration, especially Azure Kubernetes Service (AKS). Preferred Qualifications 3+ years of experience in microservices architecture and system design. 
3+ years in technical leadership or mentoring roles. 3+ years hands-on with cloud platforms (Azure, AWS, GCP, OpenStack). Experience with cloud resource provisioning (ARM templates, Terraform, Ansible, Chef). Strong DevOps skills: CI/CD pipelines with GitHub, Maven, Jenkins, Nexus, SonarQube. Advanced knowledge of observability (Prometheus, Grafana, ELK). Proficiency in Unix/Linux command line and shell scripting. Expert in asynchronous messaging, stream processing, and event-driven architectures. Experience in Agile/Scrum/Kanban environments. Familiarity with front-end technologies (HTML5, JavaScript frameworks, CSS3). Certifications in Java, Spring, Azure, or relevant integration/cloud technologies. Excellent communication skills for both technical and business audiences. Technical Skills Languages & Frameworks: Java, Groovy, Spring (Boot, Cloud), REST Integration & Messaging: Kafka, MuleSoft, RabbitMQ, MQ, Redis, Hazelcast Legacy Modernization: Refactoring, rearchitecting, and migrating monolithic or legacy applications to modern platforms. Databases: NoSQL (Cassandra, Cosmos DB), SQL Monitoring & Observability: Prometheus, Grafana, ELK Stack Orchestration: Docker, AKS (Azure Kubernetes Service) Cloud Platforms: Azure (Event Hub, Cosmos DB, AKS), AWS, GCP, OpenStack IaC & DevOps: Terraform, Ansible, Chef, Jenkins, Maven, Nexus, SonarQube, Git, Jira Scripting & Front-End: Node.js, React.js, Python, R Why Join Us? Lead modernization and transformation of critical business systems to future-ready cloud architectures. Architect and deliver enterprise-scale, highly integrated, observable solutions. Mentor and inspire a talented engineering team. Shape the organization’s technical direction in cloud, integration, and DevOps. Thrive in a collaborative, innovative, and growth-focused environment. Enjoy competitive compensation and opportunities for career advancement. 
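The integration responsibilities above (Kafka, RabbitMQ, event-driven architectures) usually raise the at-least-once delivery question: brokers may redeliver a message, so handlers need to be idempotent. Below is a broker-agnostic sketch of that pattern; all names are hypothetical, and a production consumer would use a real Kafka client and a durable deduplication store rather than an in-memory set.

```python
class IdempotentHandler:
    """Deduplicate events by id so at-least-once delivery is safe.

    In production the 'seen' set would live in a durable store
    (e.g. Redis or a database table keyed by event id), not in memory.
    """

    def __init__(self, process):
        self._process = process  # downstream side effect to run once per event
        self._seen = set()

    def handle(self, event: dict) -> bool:
        """Process the event; return False if skipped as a duplicate."""
        event_id = event["event_id"]
        if event_id in self._seen:
            return False
        self._process(event)
        self._seen.add(event_id)
        return True

# Simulate a broker redelivering event "a".
processed = []
handler = IdempotentHandler(processed.append)
for e in [{"event_id": "a"}, {"event_id": "a"}, {"event_id": "b"}]:
    handler.handle(e)
```

Marking the event as seen only after the side effect succeeds keeps the handler safe under crashes: a failure before the mark simply means one more (deduplicated) redelivery.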
Weekly Hours: 40 Time Type: Regular Location: IND:AP:Hyderabad / Argus Bldg 4f & 5f, Sattva, Knowledge City- Adm: Argus Building, Sattva, Knowledge City It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Posted 6 days ago

Apply

15.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: Lead Software Engineer – Enterprise Solutions & Transformation We are seeking an accomplished Lead Software Engineer with 15+ years of experience in IT and software development to architect, modernize, and deliver robust enterprise solutions. You will drive the transformation of legacy applications to modern cloud-native architectures, build and integrate scalable platforms, and champion best practices in DevOps, observability, and cross-functional collaboration. This technical leadership role is ideal for innovators passionate about enabling business agility through technology modernization and integration. Roles and Responsibilities Architect, design, develop, test, and document enterprise-grade software solutions, aligning with business needs, quality standards, and operational requirements. Lead transformation and modernization efforts: Evaluate and migrate legacy systems to modern, scalable, and maintainable architectures leveraging cloud-native technologies and microservices. Engineer integration solutions with platforms such as Apache Kafka, MuleSoft, and other middleware or messaging technologies to support seamless enterprise connectivity. Define and implement end-to-end architectures for both new and existing systems, ensuring scalability, security, performance, and maintainability. Collaborate with Solution and Enterprise Architects and portfolio stakeholders to analyze, plan, and realize features, enablers, and modernization roadmaps. Work closely with infrastructure engineers to provision, configure, and optimize cloud resources, especially within Azure (AKS, Cosmos DB, Event Hub). Champion containerization and orchestration using Docker and Azure Kubernetes Service (AKS) for efficient deployment and scaling. Drive observability: Define and implement system monitoring, logging, and alerting strategies using tools such as Prometheus, Grafana, and ELK Stack. 
Lead and participate in code and documentation reviews to uphold quality and engineering excellence. Mentor and coach engineers and developers, fostering technical growth and knowledge sharing. Troubleshoot and resolve complex issues across application, integration, and infrastructure layers. Advocate and implement modern DevOps practices: Build and maintain robust CI/CD pipelines, Infrastructure-as-Code, and automated deployments. Continuously evaluate and adopt new tools, technologies, and processes to improve system quality, delivery, and operational efficiency. Translate business needs and legacy constraints into actionable technical requirements and provide accurate estimates for both new builds and modernization projects. Ensure NFRs (scalability, security, availability, performance) are defined, implemented, and maintained across all solutions. Collaborate cross-functionally with DevOps, support, and peer teams to ensure operational excellence and smooth transformation initiatives. Required Qualifications Bachelor’s or master’s degree in computer science, Information Systems, or a related field. 15+ years of experience in IT and software development roles, with a track record of delivering enterprise-scale solutions. 5+ years of hands-on experience building Java-based, high-volume/high-transaction applications. 5+ years of experience with Java, Spring, and RESTful API development. 3+ years of experience in modernizing legacy applications or leading transformation initiatives. 3+ years of experience in performance tuning, application monitoring, and troubleshooting. 3+ years of experience with integration platforms (Kafka, MuleSoft, RabbitMQ, etc.). 2+ years of experience architecting solutions and leading technical design for enterprise systems. Experience working with container orchestration, especially Azure Kubernetes Service (AKS). Preferred Qualifications 3+ years of experience in microservices architecture and system design. 
3+ years in technical leadership or mentoring roles. 3+ years hands-on with cloud platforms (Azure, AWS, GCP, OpenStack). Experience with cloud resource provisioning (ARM templates, Terraform, Ansible, Chef). Strong DevOps skills: CI/CD pipelines with GitHub, Maven, Jenkins, Nexus, SonarQube. Advanced knowledge of observability (Prometheus, Grafana, ELK). Proficiency in Unix/Linux command line and shell scripting. Expert in asynchronous messaging, stream processing, and event-driven architectures. Experience in Agile/Scrum/Kanban environments. Familiarity with front-end technologies (HTML5, JavaScript frameworks, CSS3). Certifications in Java, Spring, Azure, or relevant integration/cloud technologies. Excellent communication skills for both technical and business audiences. Technical Skills Languages & Frameworks: Java, Groovy, Spring (Boot, Cloud), REST Integration & Messaging: Kafka, MuleSoft, RabbitMQ, MQ, Redis, Hazelcast Legacy Modernization: Refactoring, rearchitecting, and migrating monolithic or legacy applications to modern platforms. Databases: NoSQL (Cassandra, Cosmos DB), SQL Monitoring & Observability: Prometheus, Grafana, ELK Stack Orchestration: Docker, AKS (Azure Kubernetes Service) Cloud Platforms: Azure (Event Hub, Cosmos DB, AKS), AWS, GCP, OpenStack IaC & DevOps: Terraform, Ansible, Chef, Jenkins, Maven, Nexus, SonarQube, Git, Jira Scripting & Front-End: Node.js, React.js, Python, R Why Join Us? Lead modernization and transformation of critical business systems to future-ready cloud architectures. Architect and deliver enterprise-scale, highly integrated, observable solutions. Mentor and inspire a talented engineering team. Shape the organization’s technical direction in cloud, integration, and DevOps. Thrive in a collaborative, innovative, and growth-focused environment. Enjoy competitive compensation and opportunities for career advancement. 
Weekly Hours: 40 Time Type: Regular Location: IND:AP:Hyderabad / Argus Bldg 4f & 5f, Sattva, Knowledge City- Adm: Argus Building, Sattva, Knowledge City It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.

Posted 6 days ago

Apply

8.0 - 13.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Join Amgen’s Mission of Serving Patients At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career. Job Description What you will do As a Data Engineer, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed.
The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Roles & Responsibilities: Design, develop, and maintain data solutions for data generation, collection, and processing. Be a key team member assisting in the design and development of the data pipeline. Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems. Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions. Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency. Implement data security and privacy measures to protect sensitive data. Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions. Collaborate and communicate effectively with product teams. Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions. Identify and resolve complex data-related challenges. Adhere to best practices for coding, testing, and designing reusable code/components. Explore new tools and technologies that will help to improve ETL platform performance. Participate in sprint planning meetings and provide estimations on technical implementation. What We Expect Of You We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications: Doctorate degree / Master's degree / Bachelor's degree and 8 to 13 years of Computer Science, IT, or related field experience.
Must-Have Skills:

Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning on big data processing
Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
Proficient in SQL and Python for extracting, transforming, and analyzing complex datasets from relational data stores
Proficient in Python with strong experience in ETL tools such as Apache Spark and various data processing packages, supporting scalable data workflows and machine learning pipeline development
Strong understanding of data modeling, data warehousing, and data integration concepts
Proven ability to optimize query performance on big data platforms
Knowledge of data visualization and analytics tools such as Spotfire and Power BI

Preferred Qualifications:

Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing
Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
Strong knowledge of Oracle / SQL Server, stored procedures, and PL/SQL; knowledge of the Linux OS
Experience in implementing Retrieval-Augmented Generation (RAG) pipelines, integrating retrieval mechanisms with language models
Skilled in developing machine learning models using Python, with hands-on experience in deep learning frameworks including PyTorch and TensorFlow
Strong understanding of data governance frameworks, tools, and best practices
Knowledge of vector databases, including implementation and optimization
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Professional Certifications:

Databricks certification preferred
AWS Data Engineer/Architect

Soft Skills:

Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills

What You Can Expect Of Us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team.

careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What You Will Do

As a Data Engineer, you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed.
The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:

Design, develop, and maintain data solutions for data generation, collection, and processing
Be a key team member who assists in the design and development of the data pipeline
Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
Implement data security and privacy measures to protect sensitive data
Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
Collaborate and communicate effectively with product teams
Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
Identify and resolve complex data-related challenges
Adhere to best practices for coding, testing, and designing reusable code/components
Explore new tools and technologies that will help to improve ETL platform performance
Participate in sprint planning meetings and provide estimations on technical implementation

What We Expect Of You

We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:

Master's degree / Bachelor's degree and 5 to 9 years of Computer Science, IT, or related field experience.

Must-Have Skills:

Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning on big data processing
Proficiency in data analysis tools (e.g., SQL)
Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores
Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development
Strong understanding of data modeling, data warehousing, and data integration concepts
Proven ability to optimize query performance on big data platforms

Preferred Qualifications:

Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing
Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms
Strong knowledge of Oracle / SQL Server, stored procedures, and PL/SQL; knowledge of the Linux OS
Knowledge of data visualization and analytics tools such as Spotfire and Power BI
Strong understanding of data governance frameworks, tools, and best practices
Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Professional Certifications:

Databricks certification preferred
AWS Data Engineer/Architect

Soft Skills:

Excellent critical-thinking and problem-solving skills
Strong communication and collaboration skills
Demonstrated awareness of how to function in a team setting
Demonstrated presentation skills

What You Can Expect Of Us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team.
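The "optimize query performance on big data platforms" requirement above usually comes down to reading less data. The toy sketch below illustrates partition pruning, the idea behind many such optimizations: skip whole partitions whose metadata already rules them out. The partition layout and values are invented for the example:

```python
# Month-partitioned records: (date, amount) tuples grouped by partition key.
partitions = {
    "2024-01": [("2024-01-05", 10), ("2024-01-20", 30)],
    "2024-02": [("2024-02-11", 25)],
    "2024-03": [("2024-03-02", 45)],
}

def scan(month_filter):
    """Sum amounts for one month, pruning non-matching partitions entirely."""
    scanned = 0
    total = 0
    for month, rows in partitions.items():
        if month != month_filter:  # prune: these rows are never read
            continue
        for _, amount in rows:
            scanned += 1
            total += amount
    return total, scanned

total, rows_read = scan("2024-01")
```

On a real engine the same effect comes from partitioning tables on the columns most queries filter by, so the planner can eliminate files before any I/O happens.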
careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 6 days ago

Apply