957 Migrate Jobs - Page 17

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

0.0 years

0 - 0 Lacs

India

On-site

Hiring Alert: Study Abroad Counsellors (Experience: 0–3 Years)
Location: Noida (Onsite, Full-Time)

Migrate to Abroad is expanding, and we're on the lookout for motivated and passionate Study Abroad Counsellors with 0–3 years of experience to join our growing team!

Who We're Looking For:
- 0–3 years of relevant experience in higher education counselling
- Strong knowledge of universities in the USA, UK, Canada, Australia & Europe
- Excellent communication and time management skills

This is an onsite, full-time opportunity based out of Noida. Come be a part of a mission-driven team shaping global careers! DM me or send your resume to +91 8799701212.

Job Type: Full-time
Pay: ₹15,000.00 - ₹35,000.00 per month
Benefits: Cell phone reimbursement, commuter assistance
Schedule: Day shift
Supplemental Pay: Commission pay, performance bonus, yearly bonus
Work Location: In person
Expected Start Date: 01/07/2025

Posted 1 week ago

Apply

0 years

4 - 8 Lacs

Calcutta

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Data Engineer, Azure+Python!

Responsibilities
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using Big Data technologies such as Apache Hadoop and Apache Spark, with appropriate cloud services such as Amazon AWS.
- Build data pipelines by developing ETL (Extract-Transform-Load) processes (see the sketch following this listing).
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements/user stories in business meetings, assess the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability and improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work unchanged.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you!

Minimum Qualifications
- Experience in designing and implementing data pipelines, building data applications, and data migration on AWS.
- Strong experience implementing data lakes using AWS services such as Glue, Lambda, Step Functions, and Redshift.
- Experience with Databricks is an added advantage.
- Strong experience in Python and SQL.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills
- Master's degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering and Cloud certifications; Databricks certifications.
- Experience working with Oracle ERP.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of Change & Incident Management processes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit', paying to apply, or purchasing equipment or training.

Job: Lead Consultant
Primary Location: India-Kolkata
Schedule: Full-time
Education Level: Master's / Equivalent
Job Posting: Jun 4, 2025, 6:38:02 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
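For illustration, a minimal PySpark sketch of the kind of AWS Glue ETL job this posting describes: extract raw files from S3, cleanse them, and land columnar output for Redshift. The bucket paths, column names, and cleansing rules are hypothetical, not from the posting.

```python
# Minimal AWS Glue ETL sketch (PySpark). Paths and columns are placeholders.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_ctx = GlueContext(SparkContext())
spark = glue_ctx.spark_session
job = Job(glue_ctx)
job.init(args["JOB_NAME"], args)

# Extract: raw orders landed in S3 by an upstream process (hypothetical path)
orders = spark.read.option("header", "true").csv("s3://example-raw/orders/")

# Transform: basic deduplication, typing, and filtering
clean = (orders
         .dropDuplicates(["order_id"])
         .withColumn("amount", F.col("amount").cast("double"))
         .filter(F.col("amount") > 0))

# Load: Parquet output suitable for Redshift COPY or Spectrum queries
clean.write.mode("overwrite").parquet("s3://example-curated/orders/")

job.commit()
```

A real job would typically be created and scheduled through the Glue console or CLI; this only shows the job-script shape.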

Posted 1 week ago

Apply

5.0 years

0 Lacs

Calcutta

On-site

Project description
We have an ambitious goal to migrate a legacy system written in HLASM (High-Level Assembler) from the mainframe to a cloud-based Java environment for one of the largest banks in the USA.

Responsibilities
We are looking for an experienced Java Developer who can help perform the migration of the client platform:
- Write Java code following the new architecture.
- Troubleshoot, debug, and resolve issues within the new Java system.
- Collaborate with client teams to ensure alignment with project goals and deliver high-quality solutions.
- Maintain and fine-tune the Gen AI application that supports the migration process.
- Mandatory work from the office 5 days per week.

Skills

Must have
- Proficiency in Java development (5+ years).
- Deep understanding of enterprise application architecture patterns.
- Strong problem-solving and debugging skills.
- Experience working in distributed teams with US customers.
- Excellent communication skills for collaboration with the client teams.

Nice to have
- Experience with cloud platforms and services (preferably AWS).
- Experience with the Python language.
- Familiarity with large-scale system migrations and modernization efforts.
- Experience with HLASM or other low-level programming languages.
- Familiarity with Generative AI.
- Prior experience working in the banking or financial services industry.
- Knowledge of performance tuning and optimization in cloud environments.

Other
Languages: English: B2 Upper Intermediate
Seniority: Senior
Location: Kolkata, India
Req. VR-107927 | Java | BCM Industry | 04/06/2025

Posted 1 week ago

Apply

3.0 years

0 - 1 Lacs

Calcutta

On-site

Project description
Application Modernization Practice is a horizontal practice supporting all business verticals in Luxoft. We are looking for a Senior System Analyst who can work across different domains. For this project we have an ambitious goal to migrate a legacy system written in HLASM (High-Level Assembler) from the mainframe to a cloud-based Java environment for one of the largest banks in the USA. Compensation for NYC: 90,000-115,000 USD gross per year.

Responsibilities
- Develop, analyze, prioritize, and organize requirement specifications, data mappings, diagrams, and flowcharts for developers and testers to follow.
- Recover the design, requirement specifications, and functions of a system from an analysis of its code (currently High-Level Assembler, COBOL in the near future).
- Translate non-technical requirements into clear technical specifications.
- Convert low-level code descriptions into meaningful functional requirements.
- Organize requirements into features and stories.
- Maintain the knowledge base for the system and the project.
- Participate in the optimization of internal team processes.
- Mandatory work from the office 5 days per week.

Skills

Must have
- Technical background is a must.
- Must be ready to dive deep into High-Level Assembler code to reverse engineer requirements.
- Must be able to provide concise and coherent documentation in English (target audience is native speakers).
- Must be able to enhance requirements with models and diagrams as needed.

We expect from the candidate:
- At least 3 years of experience in a BA/SA role.
- An analytical mindset.
- Understanding of and practical experience with different types of requirements and their representation.
- Good understanding of non-functional requirements.
- Ability to communicate in both business and technical language.
- A high level of seniority when dealing with complex tasks.
- Experience in analyzing and documenting integrations.
- Experience working on projects with international stakeholders (USA, UK).
- Good theoretical and practical knowledge of analytical tools (functional/non-functional requirements, use cases, user stories, UML).

Nice to have
- Experience with the IBM z/OS mainframe.
- Understanding of IBM High-Level Assembler and COBOL.
- Experience with microservices architecture and message brokers.

Other
Languages: English: B2 Upper Intermediate
Seniority: Senior
Location: Kolkata, India
Req. VR-113322 | Functional/System Analysis | BCM Industry | 04/06/2025

Posted 1 week ago

Apply

5.0 years

9 - 9 Lacs

Indore

On-site

Project description
We have an ambitious goal to migrate a legacy system written in HLASM (High-Level Assembler) from the mainframe to a cloud-based Java environment for one of the largest banks in the USA.

Responsibilities
We are looking for an experienced Java Developer who can help perform the migration of the client platform:
- Write Java code following the new architecture.
- Troubleshoot, debug, and resolve issues within the new Java system.
- Collaborate with client teams to ensure alignment with project goals and deliver high-quality solutions.
- Maintain and fine-tune the Gen AI application that supports the migration process.
- Mandatory work from the office 5 days per week.

Skills

Must have
- Proficiency in Java development (5+ years).
- Deep understanding of enterprise application architecture patterns.
- Strong problem-solving and debugging skills.
- Experience working in distributed teams with US customers.
- Excellent communication skills for collaboration with the client teams.

Nice to have
- Experience with cloud platforms and services (preferably AWS).
- Experience with the Python language.
- Familiarity with large-scale system migrations and modernization efforts.
- Experience with HLASM or other low-level programming languages.
- Familiarity with Generative AI.
- Prior experience working in the banking or financial services industry.
- Knowledge of performance tuning and optimization in cloud environments.

Other
Languages: English: B2 Upper Intermediate
Seniority: Senior
Location: Indore, India
Req. VR-107927 | Java | BCM Industry | 04/06/2025

Posted 1 week ago

Apply

3.0 years

0 - 1 Lacs

Indore

On-site

Project description
Application Modernization Practice is a horizontal practice supporting all business verticals in Luxoft. We are looking for a Senior System Analyst who can work across different domains. For this project we have an ambitious goal to migrate a legacy system written in HLASM (High-Level Assembler) from the mainframe to a cloud-based Java environment for one of the largest banks in the USA. Compensation for NYC: 90,000-115,000 USD gross per year.

Responsibilities
- Develop, analyze, prioritize, and organize requirement specifications, data mappings, diagrams, and flowcharts for developers and testers to follow.
- Recover the design, requirement specifications, and functions of a system from an analysis of its code (currently High-Level Assembler, COBOL in the near future).
- Translate non-technical requirements into clear technical specifications.
- Convert low-level code descriptions into meaningful functional requirements.
- Organize requirements into features and stories.
- Maintain the knowledge base for the system and the project.
- Participate in the optimization of internal team processes.
- Mandatory work from the office 5 days per week.

Skills

Must have
- Technical background is a must.
- Must be ready to dive deep into High-Level Assembler code to reverse engineer requirements.
- Must be able to provide concise and coherent documentation in English (target audience is native speakers).
- Must be able to enhance requirements with models and diagrams as needed.

We expect from the candidate:
- At least 3 years of experience in a BA/SA role.
- An analytical mindset.
- Understanding of and practical experience with different types of requirements and their representation.
- Good understanding of non-functional requirements.
- Ability to communicate in both business and technical language.
- A high level of seniority when dealing with complex tasks.
- Experience in analyzing and documenting integrations.
- Experience working on projects with international stakeholders (USA, UK).
- Good theoretical and practical knowledge of analytical tools (functional/non-functional requirements, use cases, user stories, UML).

Nice to have
- Experience with the IBM z/OS mainframe.
- Understanding of IBM High-Level Assembler and COBOL.
- Experience with microservices architecture and message brokers.

Other
Languages: English: B2 Upper Intermediate
Seniority: Senior
Location: Indore, India
Req. VR-113322 | Functional/System Analysis | BCM Industry | 04/06/2025

Posted 1 week ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Overview
PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation, unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences, and IoT. The key to winning in these areas is being able to leverage enterprise data foundations, built on PepsiCo's global business scale, to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is tasked with developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation.

What PepsiCo Data Management and Operations does:
- Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company.
- Own day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset.
- Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science, and other stakeholders.
- Increase awareness of available data and democratize access to it across the company.

As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build and operations, and will drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create and lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products in areas like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. Ideally the candidate is flexible to work an alternative schedule: either a traditional Monday-to-Friday work week, Tuesday to Saturday, or Sunday to Thursday, depending on the coverage requirements of the job. The candidate can work with their immediate supervisor to change the work schedule on a rotational basis depending on product and project requirements.

Responsibilities
- Manage a team of data engineers and data analysts by delegating project responsibilities and managing their flow of work, empowering them to realize their full potential.
- Design, structure, and store data in unified data models, linked together to make the data reusable for downstream products.
- Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
- Create reusable accelerators and solutions to migrate data from legacy data warehouse platforms such as Teradata to Azure Databricks and Azure SQL (see the sketch following this listing).
- Enable and accelerate standards-based development, prioritizing reuse of code and adopting test-driven development, unit testing, and test automation with end-to-end observability of data.
- Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality, performance, and cost.
- Collaborate with internal clients (product teams, sector leads, data science teams) and external partners (SI partners/data providers) to drive solutioning and clarify solution requirements.
- Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects to build and support the right domain architecture for each application, following well-architected design standards.
- Define and manage SLAs for data products and processes running in production.
- Create documentation for learnings and knowledge transfer to internal associates.

Qualifications
- 12+ years of overall technology experience, including at least 5+ years of hands-on software development, data engineering, and systems architecture.
- 8+ years of experience with Data Lakehouse, Data Warehousing, and Data Analytics tools.
- 6+ years of experience in SQL optimization and performance tuning on MS SQL Server, Azure SQL, or another popular RDBMS.
- 6+ years of experience in Python/PySpark/Scala programming on big data platforms such as Databricks.
- 4+ years of cloud data engineering experience in Azure or AWS; fluent with Azure cloud services. Azure Data Engineering certification is a plus.
- Experience integrating multi-cloud services with on-premises technologies.
- Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines.
- Experience with data profiling and data quality tools like Great Expectations.
- Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one business intelligence tool such as Power BI or Tableau.
- Experience running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
- Experience with version control systems like ADO and GitHub, and CI/CD tools for DevOps automation and deployments.
- Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools.
- Experience with statistical/ML techniques is a plus.
- Experience building solutions in the retail or supply chain space is a plus.
- Understanding of metadata management, data lineage, and data glossaries is a plus.
- BA/BS in Computer Science, Math, Physics, or another technical field.
- Flexibility to work an alternative schedule (traditional Monday-to-Friday, Tuesday to Saturday, or Sunday to Thursday) depending on product and project coverage requirements. Candidates are expected to be in the office at the assigned location at least 3 days a week, with in-office days coordinated with the immediate supervisor.

Skills, Abilities, Knowledge:
- Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management.
- Proven track record of leading and mentoring data teams.
- Strong change manager; comfortable with change, especially that which arises through company growth.
- Ability to understand and translate business requirements into data and technical requirements.
- High degree of organization and the ability to manage multiple competing projects and priorities simultaneously.
- Positive and flexible attitude, adjusting to different needs in an ever-changing environment.
- Strong leadership, organizational, and interpersonal skills; comfortable managing trade-offs.
- Fosters a team culture of accountability, communication, and self-management.
- Proactively drives impact and engagement while bringing others along.
- Consistently attains or exceeds individual and team goals.
- Ability to lead others without direct authority in a matrixed environment.
- Comfortable working in a hybrid environment with teams consisting of contractors as well as FTEs spread across multiple PepsiCo locations.
- Domain knowledge in the CPG industry with a supply chain/GTM background is preferred.
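As a hedged illustration of one step this role describes, a PySpark sketch of moving a legacy warehouse table (e.g., Teradata) into a Delta table on Databricks. The hostname, credentials, and table names are placeholders, and a real pipeline would pull secrets from a vault and handle incremental loads.

```python
# Sketch: legacy warehouse table -> Delta Lake on Databricks.
# All connection details and names below are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("teradata-to-delta").getOrCreate()

# Extract over JDBC from the legacy warehouse (the JDBC driver must be
# installed on the cluster)
src = (spark.read.format("jdbc")
       .option("url", "jdbc:teradata://legacy-dw.example.com/DATABASE=sales")
       .option("dbtable", "sales.daily_orders")
       .option("user", "etl_user")
       .option("password", "<secret-from-key-vault>")
       .load())

# Land as a managed Delta table so downstream products can reuse it
(src.write.format("delta")
    .mode("overwrite")
    .saveAsTable("bronze.daily_orders"))
```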

Posted 1 week ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Description and Requirements

Position Summary:
The Shared Application Platform Engineering team provides enterprise configuration and support for integration technologies, such as IBM middleware tools like MQ, and ensures platform stability and process improvement. Responsibilities include planning, support, and implementation of application platform infrastructure, including operational processes and procedures.

Job Responsibilities:
- Handle MQ admin BAU activities such as managing queue managers (QMGRs) and objects, maintenance, patching, and configuration.
- Knowledge of SSL certificate management and security vulnerabilities in MQ.
- Schedule and monitor MQ backups; perform housekeeping and daily health checks.
- Install and configure IBM MQ.
- Support projects to upgrade MQ or migrate to new versions, applying fix packs, interim fixes, and refresh packs.
- Set up new queue managers and their objects.
- Investigate and troubleshoot issues in MQ.
- Knowledge of performance tuning and optimization of MQ.
- Coordinate with systems administrators, UNIX, network, and DBA teams on scheduling and implementing software patches and upgrades.
- Support development/functional teams with performance tuning and troubleshooting; coordinate with the IBM vendor.
- Monitor and acknowledge incidents, change tickets, SRs, and problem tickets within SLA.
- Working knowledge of RCAs and SIPs, and of automating tasks.
- Provide support for MQ DR activities.
- Basic knowledge of shell scripting or Ansible to create and manage MQ admin automation tasks (see the sketch following this listing).
- Create knowledge base documents and SOPs for Middleware support.
- Handle problem management calls and provide RCAs for P1/P2 issues.
- Good knowledge of IIB and/or APIC.
- Basic knowledge of IBM CP4I and/or OpenShift Container Platform (OCP).
- Willingness to work in rotational shifts.
- Good communication and written skills; comfortable interacting with clients and stakeholders.

Education:
Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience:
10+ years of total experience, with at least 7+ years of experience in middleware applications, including:
- MQ admin BAU activities such as managing QMGRs and objects, maintenance, patching, and configuration
- Installing and configuring IBM MQ
- Scheduling and monitoring MQ backups, housekeeping, and daily health checks
- Good knowledge of IIB and/or APIC
- Good knowledge of SSL certificate management and security vulnerabilities in MQ

Relevant technologies: webMethods, WebSphere Message Broker (WMB), IBM Integration Bus (IIB), CP4I, ACE, MQ, OpenShift (Kubernetes), Ansible (automation), IBM API Connect v10, App Connect Professional (Cast Iron), Linux/AIX, Elastic, Azure DevOps, YAML/JSON, Python and/or PowerShell, Agile (SAFe for Teams), SDLC, SSL, DataPower.

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!
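A small Python sketch of the daily health check this posting mentions, wrapped around the standard IBM MQ commands dspmq and runmqsc. The queue manager name is illustrative, and a production script would add alerting thresholds and error handling.

```python
# Daily MQ health-check sketch using the stock IBM MQ CLI tools.
# "QM1" is a hypothetical queue manager name.
import subprocess

def qmgr_status() -> str:
    """List queue managers and their state via dspmq."""
    out = subprocess.run(["dspmq"], capture_output=True, text=True, check=True)
    return out.stdout

def queue_depths(qmgr: str) -> str:
    """Ask runmqsc (reading MQSC from stdin) for local queue depths."""
    mqsc = "DISPLAY QLOCAL(*) CURDEPTH"
    out = subprocess.run(["runmqsc", qmgr], input=mqsc,
                         capture_output=True, text=True)
    return out.stdout

if __name__ == "__main__":
    print(qmgr_status())
    # Surface current depths so deep queues can be flagged in the BAU report
    for line in queue_depths("QM1").splitlines():
        if "CURDEPTH(" in line:
            print(line.strip())
```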

Posted 1 week ago

Apply

8.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Hydro Global Business Services (GBS) is an organizational area that operates as an internal service provider for the Hydro group. Its ultimate purpose is to deliver relevant IT, financial, and HR business services to all business areas within the company.

Role Purpose
The primary responsibility of the SAP Data Migration Expert (S/4) is to design, develop, and execute data migration objects within the scope of Hydro's SAPEX program. The role is a regional one (EU), but as much as possible, synergies among regions need to be ensured in the relevant objects and activities.

Responsibilities
- Utilize strong developer experience in all phases of an end-to-end SAP data migration.
- Working experience with the Data Migration Cockpit and the Fiori app "Migrate Your Data" is required.
- Apply strong ABAP developer knowledge to enhance and debug migration objects.
- Responsible for S/4HANA data migration; the Fiori app, LTMC, and LTMOM are a must.
- Working experience with Data Services as the ETL tool for data migration to the S/4HANA system.
- Review as-is data quality and support achieving the required data quality improvements.
- Support data extraction and transformation workshops to identify appropriate field mappings, value mappings, and business rules.
- Support the build of data profiling, cleansing, and transformation (see the sketch following this listing).
- Support data load and cutover activities to migrate data to the target environment.
- Provide medium and complex cleansing rules where required, and perform testing in the tool.

Work Experience
- 8+ years of experience in an SAP developer role in data migration, with a progression of increasingly complex job responsibilities during that period.
- Full-lifecycle SAP experience in an SAP manufacturing environment, with specific understanding of manufacturing, supply chain, shop floor, commercial, and finance processes and operations.
- Developer experience in ABAP and SAP S/4 DM tools is a must; certification is a plus.
- Developer experience with SAP Data Services is required.

Education, Specific Skills
- College or university degree (IT/business degree or equivalent).
- Certified Data Management Professional (CDMP) or equivalent.
- Good understanding of the SAP system's general architecture; broad knowledge of business functions.
- Specific knowledge of master and transactional data objects in SAP (all modules: SD, MM, WM, PP, QM, FI, and CO).
- ETL methodology, tools, and processes.
- Technical expertise in data models and database design; Oracle SQL or MS SQL; Access queries and report writing.
- Technical knowledge of BODS and the S/4HANA Migration Cockpit (2022 beneficial). LSMW is a plus.
- Fluent English is mandatory; any other language is a plus.

Expected Skills, Soft Skills, Competencies
- Experience in S/4HANA data migration with the Fiori app, LTMC, and LTMOM is a must.
- Development experience in Data Services (BODS) for data migration; a good understanding of SAP data migration with DS as ETL is a plus.
- Experience preparing technical specification documents.
- Experience in end-to-end data migration projects for SAP deployments.
- Experience in S/4HANA data migration projects with strong ABAP knowledge.
- Thorough knowledge of developing real-time interfaces such as IDoc, BAPI, LTMC staging tables, the S/4HANA Data Migration Cockpit, the Fiori app "Migrate Your Data", and ETL with Data Services.
- Knowledge of working with SAP systems and/or other ERPs as a source.
- Basic knowledge of data warehousing concepts: schema design and SQL queries.
- Technical understanding of data in terms of configuration data, master data, and transactional data handling.
- Able to estimate work effort, set proper priorities on his/her own tasks, and initiate escalations when necessary.
- A good team player: open to discussion and knowledge sharing, and able to coordinate conversations with cross-functional colleagues regarding their own data objects.

What We Offer You
- Working at the world's only fully integrated aluminum and leading renewable energy company
- Diverse, global teams
- Flexible work environment/home office
- The freedom to be creative and to learn from experts
- The possibility to grow with the company and gain new certificates
- An attractive benefit package

Please apply by uploading your CV and, optionally, a cover letter. Only applications received through our online system will be considered, not via e-mail.

Recruiter: Lima Mathew, Sr. HR Advisor People Resourcing
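For illustration, a small pandas sketch of the kind of pre-load profiling and cleansing pass described above, run before data reaches the Migration Cockpit (LTMC) staging tables or templates. The file, field names, and value-mapping rules are hypothetical; real rules come out of the field/value mapping workshops.

```python
# Pre-load profiling/cleansing sketch for a legacy extract.
# File and column names are placeholders.
import pandas as pd

legacy = pd.read_csv("legacy_customers.csv", dtype=str)

# Profile: how complete is each field we plan to map?
print(legacy.isna().mean().sort_values(ascending=False).head())

# Cleanse: two typical value-mapping rules as examples
legacy["COUNTRY"] = legacy["COUNTRY"].str.upper().replace({"UK": "GB"})
legacy = (legacy.dropna(subset=["CUSTOMER_ID"])
                .drop_duplicates("CUSTOMER_ID"))

# Hand off a clean extract for the LTMC template / staging-table load
legacy.to_csv("customers_cleansed.csv", index=False)
```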

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Power Apps Developer – High Priority Role
Location: Remote

Ermin Systems is actively seeking a skilled Power Apps Developer to lead the transformation of an existing Access-based application into a scalable, model-driven Power App solution.

Project Overview:
We are building a model-driven application to replicate and enhance a legacy Access-based system. The initial goal is to develop a mini-CRM for customer information management. This will be expanded over time to include modules for managing assets, products, and more.

Current Environment:
- Existing data is stored in an on-premise SQL Server, synced daily to Azure SQL
- The target data source for the Power App is Dataverse
- Database design and field definitions are mostly complete
- Customer information management is the first major feature and currently a critical gap

Key Responsibilities:
- Develop a model-driven Power App leveraging Dataverse as the backend
- Migrate data from Azure SQL to Dataverse (see the sketch following this listing)
- Collaborate with the team to enhance the current database schema where needed
- Build scalable, maintainable components for future expansion into additional modules
- Ensure the application adheres to best practices in Power Platform architecture

Requirements:
- Strong experience with Microsoft Power Apps, especially model-driven apps
- In-depth understanding of Dataverse and Azure SQL integration
- Ability to translate business needs into structured, low-code application workflows
- Familiarity with Power Automate and Power BI is a plus
- Strong problem-solving skills and the ability to work independently in a fast-paced environment

This is a high-priority engagement, and we are looking to onboard someone who can start immediately and take ownership of the build from day one. If you're interested in solving real business problems with modern tools, we'd love to hear from you.
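A hedged sketch of the Azure SQL to Dataverse migration step: read rows with pyodbc and create records through the Dataverse Web API. The org URL, table, and column names are hypothetical, the token would come from Azure AD (e.g., via MSAL), and a real run would batch requests rather than post row by row.

```python
# Azure SQL -> Dataverse migration sketch. All names are placeholders.
import pyodbc
import requests

TOKEN = "<azure-ad-bearer-token>"          # acquired out of band
API = "https://example-org.crm.dynamics.com/api/data/v9.2"

conn = pyodbc.connect("Driver={ODBC Driver 18 for SQL Server};"
                      "Server=example.database.windows.net;Database=crm;"
                      "Uid=etl;Pwd=<secret>")
rows = conn.execute("SELECT name, phone FROM dbo.Customers").fetchall()

headers = {"Authorization": f"Bearer {TOKEN}",
           "Content-Type": "application/json"}
for name, phone in rows:
    # 'accounts' is the standard Dataverse entity set; a custom table
    # would use its own plural logical name
    r = requests.post(f"{API}/accounts", headers=headers,
                      json={"name": name, "telephone1": phone})
    r.raise_for_status()
```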

Posted 1 week ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: UiPath Test Automation Engineer
Location: Chennai, Bangalore, or Gurgaon
Experience Required: Minimum 6 years in test automation, including at least 3 years in UiPath

Job Summary:
We are seeking a skilled and experienced UiPath Test Automation Engineer to join our team. The ideal candidate will have a strong background in test automation, with deep expertise in UiPath tools and frameworks. You will be responsible for developing, migrating, and maintaining automated test cases while continuously enhancing the test automation framework.

Key Responsibilities:
- Design, develop, and implement new automated test cases using UiPath.
- Migrate existing test cases to the UiPath test automation framework.
- Contribute to the development of reusable libraries, utilities, and reporting mechanisms within the test framework.
- Continuously improve automation processes and enhance the test framework for better efficiency and maintainability.
- Collaborate with team members to troubleshoot and resolve issues, ensuring minimal downtime and quick problem resolution.
- Work closely with developers, QA engineers, and other stakeholders to ensure robust and scalable automation solutions.

Required Qualifications & Skills:
- Minimum 6 years of experience in test automation, with at least 3 years of hands-on experience in UiPath test automation.
- Proficiency in UiPath Studio, UiPath Orchestrator, and related RPA components (see the sketch following this listing).
- Strong understanding of and practical experience in designing and building scalable UiPath test automation frameworks.
- Familiarity with version control systems such as Git and issue tracking tools like JIRA.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work independently and as part of a collaborative team environment.

Preferred Qualifications:
- Experience with CI/CD pipelines and integration of test automation with build systems.
- Knowledge of other automation tools or RPA platforms.
- Familiarity with Agile methodologies.

#UrgentHiring #Bangalore #Gurgaon #Chennai
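As a rough illustration of wiring UiPath into a CI pipeline, a Python sketch that starts a published process through the Orchestrator REST API. This assumes the classic on-premises authentication flow (cloud tenants use OAuth instead), and the URL, tenant, credentials, and release key are all placeholders.

```python
# Start a UiPath job via the Orchestrator REST API (on-prem style auth).
# Endpoints shown are the classic on-prem ones; treat them as assumptions
# and check your Orchestrator version's API docs.
import requests

BASE = "https://orchestrator.example.com"

auth = requests.post(f"{BASE}/api/Account/Authenticate", json={
    "tenancyName": "Default",
    "usernameOrEmailAddress": "ci-bot",
    "password": "<secret>",
})
token = auth.json()["result"]

# The ReleaseKey for a published process comes from GET /odata/Releases
resp = requests.post(
    f"{BASE}/odata/Jobs/UiPath.Server.Configuration.OData.StartJobs",
    headers={"Authorization": f"Bearer {token}"},
    json={"startInfo": {"ReleaseKey": "<release-guid>",
                        "Strategy": "ModernJobsCount",
                        "JobsCount": 1}},
)
resp.raise_for_status()
print(resp.json())
```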

Posted 1 week ago

Apply

6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Role Overview:
The Infrastructure Monitoring & Security Engineer will be responsible for enhancing and maintaining monitoring platforms (Zabbix, Cacti, Report Portal), DNS and syslog services, and ensuring the security hardening of all servers. This role requires hands-on experience in monitoring, performance tuning, upgrades, and security compliance across complex IT environments.

Key Responsibilities:
- Zabbix Monitoring Enhancements: Design and customize templates, items, triggers, and dashboards. Optimize alert rules, integrate with third-party tools (email, WhatsApp, Telegram), and improve Grafana visualizations (see the sketch following this listing).
- Zabbix System Tuning & Upgrades: Upgrade to the latest version, fine-tune performance (pollers, proxies, DB tuning), and manage data retention policies.
- Cacti Monitoring Platform Management: Upgrade platform components (RRDTool, Spine, PHP, MySQL), implement performance optimizations, and apply security patches.
- Report Portal Optimization: Upgrade to the latest version, implement RBAC, and configure secure access with SSL.
- Syslog Server Optimization: Manage log rotation, compression, and archival. Migrate logs to the archive server and upgrade the syslog service.
- TACACS+ Server Hardening: Implement SSL, configure certificates, and ensure secure access controls.
- DNS Server Administration: Set up and manage DNS (BIND/PowerDNS), zone configuration, DNSSEC, and performance tuning. Integrate monitoring for DNS health.
- Security & Compliance Hardening: Perform security audits, enforce firewall rules, disable root access, enable SSL/TLS across services, manage password policies, and ensure regular encrypted backups.

Required Skills:
- Zabbix, Grafana, Cacti, Report Portal
- Linux server administration (Ubuntu, CentOS, RHEL)
- DNS (BIND/PowerDNS), syslog-ng or rsyslog
- MySQL/PostgreSQL database tuning
- Shell scripting, basic Python scripting
- SSL/TLS, firewall configuration, SSH hardening
- CI/CD fundamentals and DevOps exposure (preferred)

Minimum Experience: 6+ years in infrastructure monitoring, Linux administration, and security hardening.
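For illustration, a Python sketch that pulls currently firing triggers from the Zabbix JSON-RPC API, the kind of integration the alerting work above builds on. The server URL and credentials are placeholders; note that recent Zabbix versions use "username" in user.login (older ones use "user"), so check your version.

```python
# Query active problem triggers from the Zabbix JSON-RPC API.
# URL and credentials are hypothetical.
import requests

URL = "https://zabbix.example.com/api_jsonrpc.php"

def rpc(method, params, auth=None):
    payload = {"jsonrpc": "2.0", "method": method, "params": params, "id": 1}
    if auth:
        payload["auth"] = auth  # newer versions also accept a Bearer header
    r = requests.post(URL, json=payload, timeout=10)
    r.raise_for_status()
    return r.json()["result"]

token = rpc("user.login", {"username": "api-user", "password": "<secret>"})

# Triggers currently in PROBLEM state (value = 1), most severe first
problems = rpc("trigger.get", {
    "output": ["description", "priority"],
    "filter": {"value": 1},
    "sortfield": "priority",
    "sortorder": "DESC",
}, auth=token)

for t in problems:
    print(t["priority"], t["description"])
```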

Posted 1 week ago

Apply

15.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Ready to join the future of innovation in IT at NXP? Become part of the startup of a dynamic team that is leading NXP on a digital transformation journey. Your role is to be an ambassador for the Agile and DevOps way of working within our global NXP organization. There is a lot of room for new ideas and innovation, and you will be supported in keeping a continuous focus on development, coaching, and creating a supportive environment for your team.

Windows and Virtualization System Architect / SME
- Subject matter expert with in-depth, hands-on experience managing Hyper-V, VMware, and hyper-converged solutions.
- Subject matter expert with in-depth, hands-on experience managing Windows operating systems, clustering, and various types of compute hardware, including blade server technologies such as HPE Synergy and C7000.
- Design and prepare solution blueprints and high-level and low-level designs of diverse infrastructure solutions for the above technologies, in order to implement, migrate, integrate, or transform services in on-premises, hybrid, and native cloud (Azure and AWS) datacenter environments.
- In-depth knowledge and hands-on experience integrating the above technologies with patching solutions such as WSUS and SCCM; Dell, NetApp, and Pure storage solutions; NetWorker backup; Oracle, MSSQL, and MySQL database solutions; and middleware services.
- Extensive experience in datacenter migrations involving the above technologies.
- Strong knowledge of and hands-on experience with virtualization migrations such as P2V and virtual machine migration across different platform products.
- Design, configure, and support active-active datacenters with virtualization and clustering technologies.
- Expertise in automating the technology stack using Ansible, Git, Splunk, REST APIs, and native OS scripting for provisioning, upgrades, changes, and management (see the sketch following this listing).
- Strong knowledge of monitoring solutions such as Splunk, Zabbix, HPE OneView, and native OS monitoring tools.
- Good knowledge of storage, backup, networking, and security products and principles.
- Ensure license compliance of products.
- Research, identify, select, and test technology products required for solution delivery and architectural improvements.
- Establish, implement, and document technology implementation, integration, and migration strategies to help the organization achieve strategic goals and objectives.
- Design and document DR architecture to ensure business continuity.
- Keep current on industry trends and new technologies for the system architecture.
- Manage integrated infrastructure solutions to help business functions achieve objectives in a cost-effective and efficient manner.
- Harmonize and maintain standardization of IT infrastructure solutions in datacenters in accordance with global IT architecture and security standards.
- Identify gaps, strategic impacts, financial impacts, and risks in the technical solution or offering, and provide technical support.
- Define monitoring KPIs and thresholds for proactive detection of availability and performance issues across the technology stack.
- Prepare, maintain, and track the technology refresh roadmap to improve efficiency, reliability, and performance, and to eliminate technical debt and security risks.
- Diagnose complex infrastructure issues and drive the support team to ensure zero-impact delivery of services through incident, problem, change, and risk management.
- Support technical support teams in fixing critical incidents and performing root cause analysis.
- Periodically audit existing systems infrastructure and architecture to ensure quality, compliance, and an accurate, high-level understanding of present capabilities.
- Periodically assess existing systems infrastructure and provide recommendations to improve capacity, quality, high availability, and performance.
- Recommend and coordinate upgrades, assisting business functions in technology planning aligned with growth projections from IT managers.
- Work with IT managers to understand requirements and issues, and guide technology support teams with strategic and technical steps to provide solutions.
- Define system solutions based on business function needs, cost, and required integration with existing applications, systems, or platforms.
- Report to IT managers and key stakeholders regarding findings, making recommendations and providing clear roadmaps for successful changes and upgrades.
- Collaborate with other IT managers, infrastructure teams, and application eco-domains to develop highly available and reliable systems solutions capable of supporting global IT goals.
- Oversee the support teams that implement changes in infrastructure, ensuring seamless integration of new technologies.
- Coordinate with project teams and IT managers to track and implement infrastructure migrations and changes.
- Review infrastructure changes and advise on the steps and plan needed to ensure business continuity.

Qualifications

Education & Experience
- Bachelor's degree in Information Technology, Computer Science, or a related field.
- 15+ years of experience in an IT architecture/SME role.

Preferred Qualifications/Certifications
- Related technology certifications are highly desirable.

Leadership & Soft Skills
- Excellent leadership, decision-making, and team-building abilities.
- Strong problem-solving skills with a focus on root cause analysis and proactive prevention.
- Analytical abilities; proficient in analyzing data and creating reports.
- Exceptional verbal and written communication and training skills, with the ability to convey technical concepts to non-technical audiences.
- Ability to work under pressure in high-stakes situations with a calm and focused approach.

More information about NXP in India...
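As a small illustration of the automation expertise this role calls for, a Python sketch that drives an Ansible playbook through the ansible-runner library. The data directory, playbook name, and variables are hypothetical.

```python
# Run an Ansible playbook from Python with ansible-runner.
# Paths, playbook name, and extravars are placeholders.
import ansible_runner

result = ansible_runner.run(
    private_data_dir="/opt/automation",    # contains inventory/ and project/
    playbook="patch_windows_cluster.yml",  # hypothetical playbook name
    extravars={"maintenance_window": "sat-02:00"},
)

print("status:", result.status, "rc:", result.rc)
# Surface any failed tasks from the event stream for the change record
for event in result.events:
    if event.get("event") == "runner_on_failed":
        print("failed task:", event["event_data"].get("task"))
```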

Posted 1 week ago

Apply

15.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Ready to join the future of innovation in IT at NXP? Become part of the startup of a dynamic team that is leading NXP on a digital transformation journey. Your role is to be an ambassador for the Agile and DevOps way of working within our global NXP organization. There is a lot of room for new ideas and innovation, and you will be supported in keeping a continuous focus on development, coaching, and creating a supportive environment for your team.

Linux and UNIX Architect / SME
- Subject matter expert with in-depth, hands-on experience managing Linux, AIX, Solaris, clustering, and various types of compute hardware, including blade server technologies such as HPE Synergy and C7000.
- Subject matter expert with in-depth, hands-on experience managing VMware, LPAR/VPAR, Solaris Zones, KVM-based virtualization, hyper-converged infrastructure, Docker, and Kubernetes.
- Design and prepare solution blueprints and high-level and low-level designs of diverse infrastructure solutions for the above technologies, in order to implement, migrate, integrate, or transform services in on-premises, hybrid, and native cloud (Azure and AWS) datacenter environments.
- In-depth knowledge and hands-on experience integrating the above technologies with Satellite; Dell, NetApp, and Pure storage solutions; NetWorker backup; Oracle, MSSQL, and MySQL database solutions; and middleware services.
- Extensive experience in datacenter migrations involving the above technologies.
- Strong knowledge of and hands-on experience with virtualization migrations such as P2V and virtual machine migration across different platform products.
- Design, configure, and support active-active datacenters with virtualization and clustering technologies.
- Expertise in automating the technology stack using Ansible, Git, Splunk, REST APIs, and native OS scripting for provisioning, upgrades, changes, and management.
- Strong knowledge of monitoring solutions such as Splunk, Zabbix, HPE OneView, and native OS monitoring tools.
- Good knowledge of storage, backup, networking, and security products and principles.
- Ensure license compliance of products.
- Research, identify, select, and test technology products required for solution delivery and architectural improvements.
- Establish, implement, and document technology implementation, integration, and migration strategies to help the organization achieve strategic goals and objectives.
- Design and document DR architecture to ensure business continuity.
- Keep current on industry trends and new technologies for the system architecture.
- Manage integrated infrastructure solutions to help business functions achieve objectives in a cost-effective and efficient manner.
- Harmonize and maintain standardization of IT infrastructure solutions in datacenters in accordance with global IT architecture and security standards.
- Identify gaps, strategic impacts, financial impacts, and risks in the technical solution or offering, and provide technical support.
- Define monitoring KPIs and thresholds for proactive detection of availability and performance issues across the technology stack.
- Prepare, maintain, and track the technology refresh roadmap to improve efficiency, reliability, and performance, and to eliminate technical debt and security risks.
- Diagnose complex infrastructure issues and drive the support team to ensure zero-impact delivery of services through incident, problem, change, and risk management.
- Support technical support teams in fixing critical incidents and performing root cause analysis.
- Periodically audit existing systems infrastructure and architecture to ensure quality, compliance, and an accurate, high-level understanding of present capabilities.
- Periodically assess existing systems infrastructure and provide recommendations to improve capacity, quality, high availability, and performance.
- Recommend and coordinate upgrades, assisting business functions in technology planning aligned with growth projections from IT managers.
- Work with IT managers to understand requirements and issues, and guide technology support teams with strategic and technical steps to provide solutions.
- Define system solutions based on business function needs, cost, and required integration with existing applications, systems, or platforms.
- Report to IT managers and key stakeholders regarding findings, making recommendations and providing clear roadmaps for successful changes and upgrades.
- Collaborate with other IT managers, infrastructure teams, and application eco-domains to develop highly available and reliable systems solutions capable of supporting global IT goals.
- Oversee the support teams that implement changes in infrastructure, ensuring seamless integration of new technologies.
- Coordinate with project teams and IT managers to track and implement infrastructure migrations and changes.
- Review infrastructure changes and advise on the steps and plan needed to ensure business continuity.

Qualifications

Education & Experience
- Bachelor's degree in Information Technology, Computer Science, or a related field.
- 15+ years of experience in an IT architecture/SME role.

Preferred Qualifications/Certifications
- Related technology certifications are highly desirable.

Leadership & Soft Skills
- Excellent leadership, decision-making, and team-building abilities.
- Strong problem-solving skills with a focus on root cause analysis and proactive prevention.
- Analytical abilities; proficient in analyzing data and creating reports.
- Exceptional verbal and written communication and training skills, with the ability to convey technical concepts to non-technical audiences.
- Ability to work under pressure in high-stakes situations with a calm and focused approach.

More information about NXP in India...

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

LivePerson (NASDAQ: LPSN) is the global leader in enterprise conversations. Hundreds of the world's leading brands — including HSBC, Chipotle, and Virgin Media — use our award-winning Conversational Cloud platform to connect with millions of consumers. We power nearly a billion conversational interactions every month, providing a uniquely rich data set and safety tools to unlock the power of Conversational AI for better customer experiences. At LivePerson, we foster an inclusive workplace culture that encourages meaningful connection, collaboration, and innovation. Everyone is invited to ask questions, actively seek new ways to achieve success, and reach their full potential. We are continually looking for ways to improve our products and make things better. This means spotting opportunities, solving ambiguities, and seeking effective solutions to the problems our customers care about.

Overview
We are looking for an experienced Data Engineer to provide data engineering expertise and support for various LivePerson analytical products, and to assist in migrating our existing data processing ecosystem from Hadoop (Spark, MapReduce, Java, and Scala) to Databricks on GCP. The goal is to leverage Databricks' scalability, performance, and ease of use to enhance our current workflows.

You Will
- Assessment and planning: Review the existing Hadoop infrastructure, including Spark and MapReduce jobs. Analyze Java and Scala codebases for compatibility with Databricks. Identify dependencies, libraries, and configurations that may require modification. Propose a migration plan with clear timelines and milestones.
- Code migration: Refactor Spark jobs to run efficiently on Databricks. Migrate MapReduce jobs where applicable, or rewrite them using the Spark DataFrame/Dataset API (a toy example follows this listing). Update Java and Scala code to comply with the Databricks runtime environment.
- Testing and validation: Develop unit and integration tests to ensure parity between the existing and new systems. Compare performance metrics before and after migration. Implement error handling and logging consistent with Databricks best practices.
- Optimization and performance tuning: Fine-tune Spark configurations for performance improvements on Databricks. Optimize data ingestion and transformation processes.
- Deployment and documentation: Deploy migrated jobs to production in Databricks. Document changes, configurations, and processes thoroughly. Provide knowledge transfer to internal teams if required.

Required Skills
- 6+ years of experience in data engineering, focused on building data pipelines, data platforms, and ETL (Extract, Transform, Load) processes on Hadoop and Databricks.
- Strong expertise in Databricks (Spark on Databricks, Delta Lake, etc.), preferably on GCP.
- Strong expertise in the Hadoop ecosystem (Spark, MapReduce, HDFS), with solid foundations in Spark and its internals.
- Proficiency in Scala and Java.
- Strong SQL knowledge.
- Strong understanding of data engineering and optimization techniques.
- Solid understanding of data governance, data modeling, and enterprise-scale data lakehouse platforms.
- Experience with test frameworks like Great Expectations.

Minimum Qualifications
- Bachelor's degree in Computer Science or a related field.
- Certified Databricks Engineer preferred.

You Should Be an Expert In
- Databricks, with Spark and its internals (3 years) - MUST
- Data engineering in the Hadoop ecosystem (5 years) - MUST
- Scala and Java (5 years) - MUST
- SQL - MUST

Benefits
- Health: medical, dental, and vision
- Time away: vacation and holidays
- Development: access to internal professional development resources

Why You'll Love Working Here
As leaders in enterprise customer conversations, we celebrate diversity, empowering our team to forge impactful conversations globally. LivePerson is a place where uniqueness is embraced, growth is constant, and everyone is empowered to create their own success. And, we're very proud to have earned recognition from Fast Company, Newsweek, and BuiltIn for being a top innovative, beloved, and remote-friendly workplace.

Belonging at LivePerson
We are proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations, and ordinances. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law. We are committed to the accessibility needs of applicants and employees. We provide reasonable accommodations to job applicants with physical or mental disabilities. Applicants with a disability who require reasonable accommodation for any part of the application or hiring process should inform their recruiting contact upon initial connection.

The talent acquisition team at LivePerson has recently been notified of a phishing scam targeting candidates applying for our open roles. Scammers have been posing as hiring managers and recruiters in an effort to access candidates' personal and financial information. This phishing scam is not isolated to LivePerson and has been documented in news articles and media outlets. Please note that any communication from our hiring teams at LivePerson regarding a job opportunity will only be made by a LivePerson employee with an @liveperson.com email address. LivePerson does not ask for personal or financial information as part of our interview process, including but not limited to your social security number, online account passwords, credit card numbers, passport information, and other related banking information. If you have any questions or concerns, please feel free to contact recruiting-lp@liveperson.com.
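A toy PySpark illustration of the migration pattern this role involves: a classic MapReduce/RDD-style aggregation rewritten with the Spark DataFrame API, which is the form that runs most efficiently on Databricks. The input paths are placeholders.

```python
# RDD-style aggregation vs. its DataFrame rewrite. Paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mr-to-dataframe").getOrCreate()

# Legacy style: RDD map/reduce, roughly how a ported MapReduce job looks
rdd_counts = (spark.sparkContext.textFile("/data/events.txt")
              .flatMap(lambda line: line.split())
              .map(lambda w: (w, 1))
              .reduceByKey(lambda a, b: a + b))

# Databricks-friendly rewrite: DataFrame API, optimized by Catalyst
df_counts = (spark.read.text("/data/events.txt")
             .select(F.explode(F.split(F.col("value"), r"\s+")).alias("word"))
             .groupBy("word")
             .count())

df_counts.show(10)
```

The DataFrame version benefits from Catalyst query optimization and Tungsten execution, which is a common motivation for rewriting RDD pipelines during this kind of migration.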

Posted 1 week ago

Apply

3.0 - 4.0 years

0 Lacs

Chandigarh, India

On-site

Company Description
Abroad Gateway is renowned for being a top IELTS coaching institute and education visa consultancy in Chandigarh. We provide professional IELTS coaching to aspiring students and individuals who wish to pursue higher studies abroad or migrate to developed countries like Canada, the UK, the USA, Australia, and New Zealand for permanent residency or citizenship. Our mission is to support our clients in achieving their dreams of studying or residing abroad.

Role Description
1. Provide comprehensive visa consultation to clients seeking tourist visas, study visas, spouse open work permits, and concurrent visas for Canada, Australia, and the United Kingdom.
2. Complete knowledge of the documents required for various types of visas.
3. Assess clients' eligibility and guide them through the visa application process, ensuring compliance with immigration regulations and requirements.

Requirements and Qualifications
1. Must have previous experience in visa counselling, preferably for Canada, the UK, the USA, and Australia, covering student visas, tourist visas, spouse open work permits, and concurrent visas.
2. 3-4 years of experience in immigration services.
3. Proven experience in a sales management role, managing a team of at least 2-3 employees.

Remuneration and Perks
Salary is not a bar for the right candidate.

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Puducherry, Puducherry

On-site

Full job description
Role: Laundromat Manager
Salary: Rs 25,000 - Rs 30,000 per month, plus other facilities
Benefits: Individual or family accommodation, food, fuel, and medical insurance are provided. A children's education allowance and a yearly bonus are also provided. If the family is ready to migrate, both can work for reasonable pay.
Office Location: Pondicherry
Education: HSC and above

Key Responsibilities:
- Oversee daily operations, including managing staff, ensuring cleanliness, maintaining equipment, and providing quality service to customers.

If interested, WhatsApp or call +91 90479 88988.

Job Type: Full-time
Pay: ₹25,000.00 - ₹30,000.00 per month
Benefits: Food provided, health insurance
Schedule: Day shift
Supplemental Pay: Yearly bonus
Ability to commute/relocate: Pondicherry, Puducherry: Reliably commute or planning to relocate before starting work (Preferred)
Work Location: In person

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Looking for a Mainframe Developer for a migration project to GCP. The ideal candidate will have experience writing and debugging mainframe code and migrating it to Java.

Experience: 5+ years
Location: Pune, Bangalore, Chennai, Hyderabad
Notice: 15-30 days

Key Responsibilities:
- Design, code, test, and debug mainframe applications using COBOL, JCL, CICS, DB2, and other mainframe languages.
- Assist in migrating legacy applications and data to modern platforms (GCP), ensuring minimal disruption to business operations.
- Analyze and optimize existing programs for performance enhancement, ensuring efficient use of mainframe resources.
- Work with cross-functional teams to gather requirements, provide technical support, and ensure smooth project execution.
- Perform unit, integration, and regression testing to validate developed functionality and ensure high-quality deliverables.

Technical Skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proficiency in COBOL, PL/I, JCL, and other mainframe languages.
- Strong experience with DB2 or similar mainframe database systems.
- Knowledge of mainframe tools and utilities, such as ChangeMan, OMEGAMON, File-AID, and Rational Developer for Z.
- Experience with mainframe migration projects, including migrating legacy applications and data to modern platforms.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork skills.
- Ability to manage multiple tasks and projects simultaneously.

Please share your updated resume with rosysmita.jena@atyeti.com

Posted 2 weeks ago

Apply

0.0 - 2.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Indeed logo

IT Full-Time Job ID: DGC00638 Chennai, Tamil Nadu 0-5 Yrs ₹2 - ₹6 Lacs yearly
Job description
REPORTING TO: GSC Network Manager - IT
GEOGRAPHICAL REPORTING LOCATION: Bangalore / Chennai
WORKING LOCATION: Bangalore / Chennai
NUMBER OF EMPLOYEES UNDER RESPONSIBILITY: 0
OVERALL OBJECTIVES: As part of the IT Infrastructure Support Team for Eurofins Food, Feed & Environment Testing Europe, you will join a growing international team to maintain newly deployed segregated IT User Zones and Application Availability Groups. Your role will be to support daily operations tasks, and may also require supporting operational deliverables for new infrastructure or migrating existing network components. You will collaborate with a number of different IT Support and Delivery Teams on a daily basis. The role requires networking and network security skills. Willingness to work in rotational shifts with an open mind, and the ability to cooperate with multiple individuals in a multicultural environment, is essential. You may also get the opportunity to participate in and drive Eurofins' major IT Infrastructure transition and transformation across the Food, Feed & Environment Testing Europe businesses.
SPECIFIC ASSIGNMENTS:
Maintain the current IT User Zones and IT Application Availability Groups network infrastructure
Design, plan, and implement network IT infrastructure for transformation / integration / segregation throughout the Food and Environment testing businesses across Europe. These projects involve installation, configuration, and migration of enterprise-level network IT infrastructure systems from legacy to new solutions
Propose and deliver solutions that best meet the business needs in terms of security, cost, and bandwidth
Work closely with Server, Cloud, and Application team members to deliver the IT infrastructure as a wider project; act based on an end-to-end view of the infrastructure
Document the solutions delivered and help transition the service to regional teams in the final stage of the project
REQUIRED SKILLS: An ideal candidate should have 0-4+ years of overall experience and 0-2+ years of relevant experience in all of the below:
MUST HAVE Knowledge / Skills
CCNA certified; CCNP and/or CCIE certification is a plus (CCIE lab)
Ideally a proven track record of LAN / WAN / MPLS / SD-WAN network infrastructure design and implementation during acquisitions, transitions, and delivery at enterprise level
Experience with protocols such as DNS, DHCP, NTP, HTTP, BGP, OSPF, PKI, certificates, etc.
Cisco-based routing and switching (CCNA mandatory, CCNP a must, CCIE lab desirable)
Knowledge of protocols: IP, SNMP, SMTP, OSPF, BGP, VLAN
ACLs / static routing / QoS
Switch stacking
STP / SPAN / RSPAN / switch virtualization / VSS (Virtual Switching System)
VLAN / private VLAN / inter-VLAN routing / VLAN migration / trunks
Switch security: storm control, VLAN hopping, DHCP snooping
Gateway redundancy: VRRP / HSRP / GLBP
Working knowledge of, and experience managing, wireless infrastructure (Meraki / Aruba / Aerohive, etc.)
Well versed in the ITIL framework to support business operations as per defined SLAs
Network component maintenance experience (upgrades, etc.)
Ability to read, understand, and create technical documentation, SOPs, runbooks, process guides, etc. for the delivered solutions
VeloCloud SD-WAN / MPLS
Network monitoring
Well versed in ticket handling and maintaining the CMDB through ServiceNow.
GOOD TO HAVE Knowledge / Skills
Experience with complex network migrations
Next-generation firewalls (Check Point, Palo Alto, and FortiGate)
Fortinet packet-flow and architecture knowledge, along with hardware and systems configuration knowledge: FortiManager, FortiGate NG Firewall, VPN, SD-WAN
Check Point packet-flow and architecture knowledge, along with hardware and systems configuration knowledge
Data centre network experience is a plus
Willingness to take ownership with a constructive mindset as a team player
Willingness to support operations beyond job hours as and when required
Very good command of written and spoken English
Ability to present technical projects in meetings
Experience building network infrastructure design diagrams (LLD and HLD) and transitions
ADDITIONAL Knowledge / Skills
VoIP
Network load balancing / application load balancers
VPN (ideally Pulse Secure)
Data centre network experience
Experience in complex network migration
Cloud networking
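
Tasks like the VLAN migration and inter-VLAN routing work listed above begin with clean subnet planning. A minimal sketch using Python's standard ipaddress module; the campus block and VLAN plan below are hypothetical:

```python
# Hypothetical address plan: carve per-VLAN /24 subnets out of a campus /16.
import ipaddress

campus = ipaddress.ip_network("10.20.0.0/16")
vlans = {101: "users", 102: "voice", 103: "printers", 200: "servers"}

subnets = iter(campus.subnets(new_prefix=24))
for vlan_id, name in sorted(vlans.items()):
    subnet = next(subnets)
    gateway = next(subnet.hosts())  # first usable address, e.g. the HSRP/VRRP VIP
    print(f"VLAN {vlan_id:>4} ({name:8s}): {subnet}  gw {gateway}  "
          f"{subnet.num_addresses - 2} usable hosts")
```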

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Everest DX - We are a Digital Platform Services company, headquartered in Stamford. Our platform and solutions include orchestration, intelligent operations with BOTs, and AI-powered analytics for enterprise IT. Our vision is to enable digital transformation for enterprises, delivering seamless customer experience, business efficiency, and actionable insights through an integrated set of futuristic digital technologies.
Digital Transformation Services: Specialized in designing, building, developing, integrating, and managing cloud solutions; modernizing data centers; building cloud-native applications; and migrating existing applications into secure, multi-cloud environments to support digital transformation. Our Digital Platform Services enable organizations to reduce IT resource requirements and improve productivity, in addition to lowering costs and speeding digital transformation.
Digital Platform: Cloud Intelligent Management (CiM) is an autonomous hybrid cloud management platform that works across multi-cloud environments and helps enterprises get the most out of their cloud strategy while reducing cost and risk and increasing speed.
Job Title: Informatica Developer
Job Location: Hyderabad
Job Type: Full Time
Experience: 5+ Yrs
Notice Period: Immediate to 30-day joiners are highly preferred.
Responsibilities
Hands-on experience with ETL and SQL.
Design, develop, and optimize ETL workflows using Informatica PowerCenter.
Implement cloud-based ETL solutions using Informatica IDMC and IICS.
Expertise in all transformations in PowerCenter and IDMC/IICS.
Experience with, or knowledge of, PowerCenter-to-IICS migration using the CDI PC tool or a similar tool.
Lead data migration projects, transitioning data from on-premise to cloud environments.
Write complex SQL queries and perform data validation and transformation.
Conduct detailed data analysis to ensure accuracy and integrity of migrated data.
Troubleshoot and optimize ETL processes for performance and error handling.
Collaborate with cross-functional teams to gather requirements and design solutions.
Create and maintain documentation for ETL processes and system configurations.
Implement industry best practices for data integration and performance tuning.
Required Skills
Hands-on experience with Informatica PowerCenter, IDMC, and IICS.
Strong expertise in writing complex SQL queries and database management.
Experience in data migration projects (on-premise to cloud).
Strong data analysis skills for large datasets and ensuring accuracy.
Solid understanding of ETL design & development concepts.
Familiarity with cloud platforms (AWS, Azure).
Experience with version control tools (e.g., Git) and deployment processes.
Preferred Skills
Experience with data lakes, data warehousing, or big data platforms.
Familiarity with Agile methodologies.
Knowledge of other ETL tools.
Why Join Us
Continuous Learning: The company invests in skills development through training, certifications, and access to cutting-edge projects in digital transformation.
Dynamic, Collaborative Culture: Unlike a large, rigid structure, Everest DX promotes a culture of open communication, creativity, and innovation.
Agile Work Environment: The company operates with a startup mindset, adapting quickly to changes, which can be more stimulating than rigid corporate processes.
Innovative Solutions: Working at a boutique company like Everest DX provides exposure to cutting-edge technologies, creative problem solving, and innovative product development.
(ref:hirist.tech)
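
The data-validation responsibility above — confirming that migrated tables match the source — is often answered with simple reconciliation queries. A minimal, self-contained sketch; sqlite3 stands in for the real source and target databases, and the table layout is hypothetical:

```python
# Reconciliation sketch: compare row counts and a numeric checksum per table.
# sqlite3 stands in for the actual source/target databases (an assumption).
import sqlite3

def summarize(conn: sqlite3.Connection, table: str) -> tuple[int, float]:
    """Return (row_count, sum_of_amount) as a cheap fingerprint of the table."""
    row = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}"
    ).fetchone()
    return row[0], row[1]

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
    db.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(1, 10.5), (2, 99.0), (3, 3.25)])

src, tgt = summarize(source, "orders"), summarize(target, "orders")
status = "OK" if src == tgt else "MISMATCH"
print(f"orders: source={src} target={tgt} -> {status}")
```

In practice the same fingerprinting runs against both ends of the migration, with per-column aggregates added for the fields that matter most.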

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Job Title: Database Administrator III - IN
Job responsibilities:
SQL Server administration: log maintenance, truncating database logs, trimming SQL backups, query tuning, etc.
Configure and monitor automation tasks: backups, index rebuild/reorganization, statistics jobs, history cleanup, integrity checks
HA/DR administration (review sync status, review the AG dashboard for health)
Implement a secure environment (no explicit user access; AD domains/groups)
Review data platform resources (CPU, memory, storage usage; ability to analyze performance bottlenecks, etc.)
Migrate databases from earlier SQL Server versions to the current SQL Server version
Create databases
Run commands and scripts in SQL
Install and configure SQL Server agents on various SQL database servers
Plan storage space requirements and place files for optimum I/O performance
Assist in database tuning by creating efficient stored procedures and T-SQL statements
Design, develop, and maintain reports using Reporting Services (SSRS)
Perform backup and restoration of databases, as well as troubleshoot and resolve user problems
Develop monitoring scripts and DBA tool scripts to help DBAs perform daily tasks
Develop T-SQL stored procedures, functions, and views, and add/change tables for data extraction, transformation, load, and reporting
Support database administrators in troubleshooting and resolving connectivity problems
Perform other DBA activities, including space management and performance monitoring
Certifications: MS SQL Server Administrator (DP-300 certification within 12 months)
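
Daily checks like the ones listed (database state, backup recency) are commonly scripted rather than run by hand. A minimal sketch, assuming pyodbc and an ODBC driver are installed; the connection string is a placeholder to adjust for your server:

```python
# Minimal SQL Server health-check sketch; connection details are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=master;Trusted_Connection=yes;"
)

conn = pyodbc.connect(CONN_STR)
cur = conn.cursor()

# State of every database (ONLINE, RESTORING, SUSPECT, ...).
cur.execute("SELECT name, state_desc, recovery_model_desc FROM sys.databases")
for name, state, recovery in cur.fetchall():
    print(f"{name:30s} {state:10s} recovery={recovery}")

# Most recent full backup per database, from msdb history.
cur.execute("""
    SELECT database_name, MAX(backup_finish_date)
    FROM msdb.dbo.backupset
    WHERE type = 'D'
    GROUP BY database_name
""")
for db, finished in cur.fetchall():
    print(f"last full backup of {db}: {finished}")

conn.close()
```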

Posted 2 weeks ago

Apply

4.0 - 12.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

JOB TITLE: Azure Cloud Engineer L-II, L-III
JOB DESCRIPTION: We are seeking a mid- to senior-level Azure Cloud Engineer to deliver cloud engineering services to Rackspace's Enterprise clients. The ideal candidate will have strong, hands-on technical skills, with the experience and consulting skills to understand, shape, and deliver against our customers' requirements.
Design and implement Azure cloud solutions that are secure, scalable, resilient, monitored, auditable, and cost-optimized
Build out new customer cloud solutions using cloud-native components
Write Infrastructure as Code (IaC) using Terraform
Write application/infra deployment pipelines using Azure DevOps or other industry-standard deployment and configuration tools
Use cloud foundational architecture and components to build out automated cloud environments, CI/CD pipelines, and supporting services frameworks
Work with developers to identify necessary Azure resources and automate their provisioning; document automation processes
Create and document a disaster recovery plan
Strong communication skills along with customer-facing experience
Respond to customer support requests within response-time SLAs
Troubleshoot performance degradation or loss of service as time-critical incidents
Take ownership of issues, including collaboration with other teams and escalation
Participate in a shared on-call rotation
Support the success and development of others in the team
Skills & Experience
Engineer with 4-12 years of experience in the Azure cloud, along with writing Infrastructure as Code (IaC) and building application/infra deployment pipelines
Experienced in on-prem/AWS/GCP-to-Azure migration using tooling such as Azure Migrate
Expert-level knowledge of Azure products & services, scaling, load balancing, etc.
Expert-level knowledge of Azure DevOps, Pipelines, and CI/CD
Expert-level knowledge of Terraform and scripting (Python/Shell/PowerShell)
Working knowledge of containerization technologies such as Kubernetes and AKS
Working knowledge of Azure networking, such as VPN gateways and VNets
Working knowledge of Windows or Linux operating systems, with experience supporting and troubleshooting stability and performance issues
Working knowledge of automating the management and enforcement of policies using Azure Policy or similar
Good understanding of other DevOps tools such as Ansible and Jenkins
Good understanding of native cloud application design, cloud application design patterns, and practices
Azure Admin, Azure DevOps, and Terraform-certified candidates are preferred.
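
Provisioning automation of this kind is often prototyped by wrapping the Azure CLI before the logic is committed to Terraform modules. A minimal sketch, assuming the az CLI is installed and `az login` has been run; the resource names and location are hypothetical:

```python
# Prototype provisioning step: create a resource group via the Azure CLI.
# Assumes `az login` has been run; names and locations are placeholders.
import json
import subprocess

def az(*args: str) -> dict:
    """Run an az command and return its parsed JSON output."""
    result = subprocess.run(
        ["az", *args, "--output", "json"],
        check=True, capture_output=True, text=True,
    )
    return json.loads(result.stdout)

group = az("group", "create", "--name", "rg-demo-dev", "--location", "eastus")
print(f"provisioned {group['name']} in {group['location']} "
      f"(state: {group['properties']['provisioningState']})")
```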

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

India

On-site

Linkedin logo

Job Profile Summary
The Cloud NoSQL Database Engineer performs database engineering and administration activities, including design, planning, configuration, monitoring, automation, self-serviceability, alerting, and space management. The role involves database backup and recovery, performance tuning, security management, and migration strategies. The ideal candidate will lead and advise on Neo4j and MongoDB database solutions, including migration, modernization, and optimization, while also supporting secondary RDBMS platforms (SQL Server, PostgreSQL, MySQL, Oracle). The candidate should be proficient in workload migrations to the cloud (AWS/Azure/GCP).
Key Responsibilities:
Database Administration: Install, configure, and maintain Neo4j (GraphDB) and MongoDB (NoSQL) databases in cloud and on-prem environments.
NoSQL Data Modeling: Design and implement graph-based models in Neo4j and document-based models in MongoDB to optimize data retrieval and relationships.
Performance Tuning & Optimization: Monitor and tune databases for query performance, indexing strategies, and replication performance.
Backup, Restore, & Disaster Recovery: Design and implement backup and recovery strategies for Neo4j, MongoDB, and secondary database platforms.
Migration & Modernization: Lead database migration strategies, including homogeneous and heterogeneous migrations between NoSQL, graph, and RDBMS platforms.
Capacity Planning: Forecast database growth and plan for scalability, optimal performance, and infrastructure requirements.
Patch Management & Upgrades: Plan and execute database software upgrades, patches, and service packs.
Monitoring & Alerting: Set up proactive monitoring and alerting for database health, performance, and potential failures using Datadog, AWS CloudWatch, Azure Monitor, or Prometheus.
Automation & Scripting: Develop automation scripts using Python, AWS CLI, PowerShell, and shell scripting to streamline database operations.
Security & Compliance: Implement database security best practices, including access controls, encryption, key management, and compliance with cloud security standards.
Incident & Problem Management: Work within ITIL frameworks to resolve incidents and service requests, and perform root cause analysis for problem management.
High Availability & Scalability: Design and manage Neo4j clustering, MongoDB replication/sharding, and HADR configurations across cloud and hybrid environments.
Vendor & Third-Party Tool Management: Evaluate, implement, and manage third-party tools for Neo4j, MongoDB, and cloud database solutions.
Cross-Platform Database Support: Provide secondary support for SQL Server (Always On, Replication, Log Shipping), PostgreSQL (Streaming Replication, Partitioning), MySQL (InnoDB Cluster, Master-Slave Replication), and Oracle (RAC, Data Guard, GoldenGate).
Cloud Platform Expertise: Hands-on experience with cloud-native database services such as AWS DocumentDB, DynamoDB, Azure CosmosDB, Google Firestore, and Google BigTable.
Cost Optimization: Analyze database workloads, optimize cloud costs, and recommend licensing enhancements.
Knowledge & Skills:
Strong expertise in Neo4j (Cypher Query Language, APOC, Graph Algorithms, GDS Library) and MongoDB (Aggregation Framework, Sharding, Replication, Indexing).
Experience with homogeneous and heterogeneous database migrations (NoSQL-to-NoSQL, Graph-to-RDBMS, RDBMS-to-NoSQL).
Familiarity with database monitoring tools such as Datadog, Prometheus, CloudWatch, and Azure Monitor.
Proficiency in automation using Python, AWS CLI, PowerShell, and Bash/shell scripting.
Experience in cloud-based database deployment using AWS RDS, Aurora, DynamoDB, Azure SQL, Azure CosmosDB, GCP Cloud SQL, Firebase, and BigTable.
Understanding of microservices and event-driven architectures, integrating MongoDB and Neo4j with applications using Kafka, RabbitMQ, or AWS SNS/SQS.
Experience with containerization (Docker, Kubernetes) and Infrastructure as Code (Terraform, CloudFormation, Ansible).
Strong analytical and problem-solving skills for database performance tuning and optimization.
Education & Certifications:
Bachelor's degree in Computer Science, Information Systems, or a related field.
Database specialty certifications in Neo4j and MongoDB (Neo4j Certified Professional, MongoDB Associate/Professional Certification).
Cloud certifications (AWS Certified Database - Specialty, Azure Database Administrator Associate, Google Cloud Professional Data Engineer).
Preferred Experience:
5+ years of experience in database administration, with at least 3 years dedicated to Neo4j and MongoDB.
Hands-on experience with GraphDB & NoSQL architecture and migrations.
Experience working in DevOps environments and automated CI/CD pipelines for database deployments.
Strong expertise in data replication, ETL, and database migration tools such as AWS DMS, Azure DMS, MongoDB Atlas Live Migrate, and the Neo4j ETL Tool.
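
Monitoring tasks like the replica-set health checks above are straightforward to script. A minimal sketch, assuming pymongo is installed and a replica set is reachable; the URI, database, and collection names are placeholders:

```python
# Replica-set health sketch; the URI and collection names are placeholders.
from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017/?replicaSet=rs0")

# replSetGetStatus reports each member's state (PRIMARY, SECONDARY, ...).
status = client.admin.command("replSetGetStatus")
for member in status["members"]:
    print(f"{member['name']:25s} {member['stateStr']:10s} "
          f"health={member['health']}")

# Index housekeeping: ensure a supporting compound index exists (idempotent).
orders = client["shop"]["orders"]
index_name = orders.create_index(
    [("customer_id", ASCENDING), ("created_at", ASCENDING)]
)
print("ensured index:", index_name)
```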

Posted 2 weeks ago

Apply

4.0 - 12.0 years

0 Lacs

India

On-site

Linkedin logo

JOB TITLE: Azure Cloud Engineer L-II, L-III
JOB DESCRIPTION: We are seeking a mid- to senior-level Azure Cloud Engineer to deliver cloud engineering services to Rackspace's Enterprise clients. The ideal candidate will have strong, hands-on technical skills, with the experience and consulting skills to understand, shape, and deliver against our customers' requirements.
Design and implement Azure cloud solutions that are secure, scalable, resilient, monitored, auditable, and cost-optimized
Build out new customer cloud solutions using cloud-native components
Write Infrastructure as Code (IaC) using Terraform
Write application/infra deployment pipelines using Azure DevOps or other industry-standard deployment and configuration tools
Use cloud foundational architecture and components to build out automated cloud environments, CI/CD pipelines, and supporting services frameworks
Work with developers to identify necessary Azure resources and automate their provisioning; document automation processes
Create and document a disaster recovery plan
Strong communication skills along with customer-facing experience
Respond to customer support requests within response-time SLAs
Troubleshoot performance degradation or loss of service as time-critical incidents
Take ownership of issues, including collaboration with other teams and escalation
Participate in a shared on-call rotation
Support the success and development of others in the team
Skills & Experience
Engineer with 4-12 years of experience in the Azure cloud, along with writing Infrastructure as Code (IaC) and building application/infra deployment pipelines
Experienced in on-prem/AWS/GCP-to-Azure migration using tooling such as Azure Migrate
Expert-level knowledge of Azure products & services, scaling, load balancing, etc.
Expert-level knowledge of Azure DevOps, Pipelines, and CI/CD
Expert-level knowledge of Terraform and scripting (Python/Shell/PowerShell)
Working knowledge of containerization technologies such as Kubernetes and AKS
Working knowledge of Azure networking, such as VPN gateways and VNets
Working knowledge of Windows or Linux operating systems, with experience supporting and troubleshooting stability and performance issues
Working knowledge of automating the management and enforcement of policies using Azure Policy or similar
Good understanding of other DevOps tools such as Ansible and Jenkins
Good understanding of native cloud application design, cloud application design patterns, and practices
Azure Admin, Azure DevOps, and Terraform-certified candidates are preferred.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

The Applications Development Intermediate Programmer Analyst is an intermediate-level position responsible for participating in the establishment and implementation of new or revised application systems and programs, in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.
Responsibilities:
Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code
Consult with users, clients, and other technology groups on issues; recommend programming solutions; and install and support customer exposure systems
Apply fundamental knowledge of programming languages for design specifications
Analyze applications to identify vulnerabilities and security issues, and conduct testing and debugging
Serve as advisor or coach to new or lower-level analysts
Identify problems, analyze information, and make evaluative judgements to recommend and implement solutions
Resolve issues by identifying and selecting solutions through the application of acquired technical experience, guided by precedents
Operate with a limited level of direct supervision; exercise independence of judgement and autonomy; act as SME to senior stakeholders and/or other team members
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
Qualifications:
3-7 years of relevant experience in the financial services industry
Intermediate-level experience in an applications development role
Consistently demonstrates clear and concise written and verbal communication
Demonstrated problem-solving and decision-making skills
Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements
Education:
Bachelor's degree/University degree or equivalent experience
This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
This position is needed to enhance the feed generation process and migrate to the strategic FI feed generators as part of the LM retirement project under Simplification and Modernization. The skills required for this job include: Java, Microservices, Kafka, Flink, SQL, Docker, ECS.
More about the role: This is a critical role to complete the migration of all feeds from the legacy feed publishers and Rivulet to the FI Common Feed generators and retire LM Simpliciti. The developer works with team leads to deliver the project on time. The candidate will design, develop, and migrate all jobs and clusters to the common FI feed generator platform (the strategic solution). The candidate will collaborate with other developers and deliver all changes following best practices and coding standards. Along with completing the LM feeds and publishers, the candidate will also help migrate all G10 and XVA feed publishers to the common feed generators as part of Simplification and Modernization.
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
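
Feed publication of the kind described above — moving feeds onto a common, Kafka-backed generator platform — reduces, at its simplest, to producing records onto a topic. A minimal sketch using the kafka-python client rather than the posting's Java stack, for consistency with the other examples here; the broker address, topic, and record shape are all placeholders:

```python
# Minimal feed-publisher sketch; broker address and topic name are placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    acks="all",  # wait for full in-sync replication before confirming
)

feed_record = {"feed": "FI_COMMON", "trade_id": "T-0001", "notional": 1_000_000}
future = producer.send("fi-feeds", value=feed_record)
metadata = future.get(timeout=10)  # block until the broker acknowledges
print(f"published to {metadata.topic} partition={metadata.partition} "
      f"offset={metadata.offset}")
producer.flush()
```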

Posted 2 weeks ago

Apply

Exploring Migrate Jobs in India

The job market for migrate professionals in India is currently thriving, with numerous opportunities available in various industries. Whether you are just starting your career or looking to make a job transition, migrate roles can offer a rewarding career path with growth opportunities.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

These cities are known for their booming IT sectors and have a high demand for migrate professionals.

Average Salary Range

The average salary range for migrate professionals in India varies based on experience levels. Entry-level positions can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can command salaries upwards of INR 10-15 lakhs per annum.

Career Path

A typical career path in the migrate field may involve starting as a Junior Developer, progressing to a Senior Developer, then moving up to a Tech Lead role. With experience and expertise, one could further advance to roles like Solution Architect or Project Manager.

Related Skills

In addition to migrate skills, professionals in this field are often expected to have knowledge in related areas such as cloud computing, database management, programming languages like Java or Python, and software development methodologies.

Interview Questions

  • What is data migration and why is it important? (basic)
  • Can you explain the difference between ETL and ELT processes? (medium)
  • How do you handle data quality issues during migration? (medium)
  • What tools have you used for data migration in your previous projects? (basic)
  • Describe a challenging data migration project you worked on and how you overcame obstacles. (medium)
  • What are the different types of data migration strategies? (advanced)
  • How do you ensure data integrity during the migration process? (medium)
  • Explain the concept of data mapping in the context of data migration. (basic)
  • Have you worked with any data migration automation tools? If so, which ones? (medium)
  • What are some common challenges faced during data migration projects and how do you address them? (medium)
  • How do you prioritize data migration tasks in a project with tight deadlines? (medium)
  • Can you discuss the role of metadata in data migration? (advanced)
  • What are some best practices for data migration to ensure project success? (medium)
  • How do you handle data security and compliance issues during migration? (medium)
  • What considerations should be taken into account when migrating data to a cloud environment? (medium)
  • Explain the concept of data deduplication and its importance in data migration (see the sketch after this list). (medium)
  • What role does data profiling play in the data migration process? (medium)
  • How do you ensure data accuracy and consistency post-migration? (medium)
  • Have you worked on any data migration projects involving legacy systems? If so, how did you approach them? (medium)
  • What is the difference between schema migration and data migration? (medium)
  • Describe a time when you had to roll back a data migration process. How did you handle it? (medium)
  • How do you handle stakeholder communication during a data migration project? (basic)
  • What are the key metrics you use to measure the success of a data migration project? (medium)
  • Can you explain the concept of data lineage and its importance in data migration? (advanced)
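
For the deduplication question flagged above, a concrete answer helps in an interview. A minimal sketch of key-based deduplication during a migration pass; the record layout is illustrative only:

```python
# Key-based deduplication sketch; the record layout is illustrative only.
def deduplicate(records, key_fields=("customer_id", "email")):
    """Keep the first occurrence of each logical key; count what was dropped."""
    seen = set()
    unique, dropped = [], 0
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        if key in seen:
            dropped += 1
            continue
        seen.add(key)
        unique.append(rec)
    return unique, dropped

rows = [
    {"customer_id": 1, "email": "a@x.com", "name": "A"},
    {"customer_id": 1, "email": "a@x.com", "name": "A (dup)"},
    {"customer_id": 2, "email": "b@x.com", "name": "B"},
]
clean, dropped = deduplicate(rows)
print(f"kept {len(clean)} records, dropped {dropped} duplicates")
```

The same idea scales to large datasets by hashing the key tuple or pushing the grouping into SQL (GROUP BY over the key columns) before the load step.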

Closing Remark

As you explore opportunities in the migrate job market in India, remember to showcase your skills and experience confidently during interviews. Prepare thoroughly, stay updated on industry trends, and demonstrate your passion for data migration. Best of luck on your job search journey!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies