
Migrate Jobs

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

10.0 years

0 Lacs

Ameerpet, Telangana, India

On-site

Job Description: Hands-On Technical Architect (Microsoft .NET Applications)

Position Overview: We are seeking a Hands-On Technical Architect specializing in Microsoft .NET applications to join our team. The ideal candidate will have a strong background in designing, developing, and overseeing the architecture of scalable, secure, and high-performing software applications using the .NET Framework and .NET Core. This role requires a blend of technical expertise and hands-on coding ability, coupled with the strategic mindset to lead application development from concept to delivery.

Key Responsibilities:
- Architect and Design Solutions: Lead the architecture, design, and development of enterprise-grade applications using .NET Core, ASP.NET MVC, C#, and other Microsoft technologies, ensuring scalability and performance.
- Hands-On Development: Be actively involved in coding, debugging, and reviewing application modules, leading by example in developing robust solutions.
- Requirements Analysis & Solution Design: Work closely with stakeholders, product managers, and development teams to gather requirements and translate them into detailed technical designs and development plans.
- Full Stack Development Guidance: Oversee front-end and back-end development activities, guiding teams on best practices for technologies such as React, Angular, Blazor, or other JavaScript frameworks and RESTful API development.
- Lead Code Reviews and Technical Standards: Establish and enforce best practices for coding standards, design patterns, performance optimization, security, and code reuse.
- Application Performance Optimization: Identify and resolve performance bottlenecks throughout the application stack, focusing on database queries, API responses, and front-end optimizations.
- System Integration and API Development: Design and implement APIs, microservices, and integrations with third-party systems, ensuring reliability, security, and ease of use.
- Modernize and Migrate Legacy Systems: Modernize existing .NET applications by upgrading to .NET Core and/or migrating on-premises applications to cloud platforms like Azure or AWS.
- Mentorship and Technical Leadership: Mentor and guide development teams, promoting continuous learning and skill development, as well as fostering a collaborative engineering culture.
- Continuous Improvement and Innovation: Evaluate new tools, technologies, and frameworks that can improve software delivery, performance, and security.

Qualifications & Skills

Technical Experience:
- Extensive .NET Framework and .NET Core Experience: Deep understanding of C#, ASP.NET Core, Entity Framework, LINQ, and associated tools for building web applications and APIs.
- Database Knowledge: Proficiency in SQL Server, Entity Framework Core, and knowledge of NoSQL databases (e.g., MongoDB) for effective data modeling and management.
- Front-End Expertise: Strong experience with front-end frameworks like Angular, React, Vue.js, or Blazor for developing responsive and dynamic web applications.
- API Design & Microservices: Hands-on experience in building and consuming RESTful APIs, implementing microservice architectures, and using message queues like Azure Service Bus or RabbitMQ for inter-service communication.
- Azure Cloud Services (Preferred): Experience with cloud computing platforms, especially Azure, including App Services, Azure Functions, Azure SQL, Blob Storage, App Insights, and Azure DevOps for CI/CD.
- Software Development Best Practices: Strong understanding of SOLID principles, design patterns, TDD/BDD, and DevOps practices, with hands-on experience in setting up CI/CD pipelines using tools like Azure DevOps, GitHub Actions, or Jenkins.
- Security and Compliance: Knowledge of best practices in application security, including OAuth, JWT, data encryption, authentication/authorization patterns, and compliance with data protection regulations (e.g., GDPR).

Professional Experience & Skills:
- Experience: 10+ years of experience in .NET development.
- Architectural Experience: Proven experience in architecting and designing high-performance, scalable, and secure software solutions in .NET environments.
- Hands-On Coding Skills: Ability to code and prototype solutions, taking the lead on complex development tasks when necessary.
- Problem-Solving and Analytical Abilities: Strong ability to analyze, design, and troubleshoot technical issues, offering innovative solutions to meet business needs.
- Leadership & Collaboration: Ability to lead development teams, guide junior developers, and work collaboratively with stakeholders to align technical solutions with business goals.
- Effective Communication: Excellent written and verbal communication skills, with the ability to convey technical concepts to non-technical stakeholders.
- Agile Experience: Familiarity with Agile methodologies (Scrum/Kanban) and experience working in cross-functional, collaborative teams.

Preferred Qualifications:
- Certifications: Microsoft certifications such as Azure Solutions Architect Expert, Azure Developer Associate, or similar relevant certifications.
- Experience with Legacy Systems: Prior experience in modernizing and refactoring legacy systems to align with modern standards and practices.
- Experience with DevOps & Automation: Hands-on experience in setting up build pipelines, automated testing, and deployment processes.
- Familiarity with Serverless and Containerization: Knowledge of serverless computing (e.g., Azure Functions) and containerization tools like Docker and Kubernetes for deployment and scaling.
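As a quick illustration of the token-based authentication this posting lists under Security and Compliance: the role is .NET-centric, but the JWT issue-and-validate flow is language-agnostic, so the sketch below uses Python with the PyJWT library purely for brevity. The secret, claims, and expiry values are placeholder assumptions.

```python
# Minimal JWT issue/validate sketch (PyJWT). Key, claims, and expiry are made up.
import datetime
import jwt  # PyJWT

SECRET = "replace-with-a-managed-secret"  # e.g. pulled from a key vault

def issue_token(user_id: str, roles: list[str]) -> str:
    claims = {
        "sub": user_id,
        "roles": roles,
        "exp": datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(minutes=30),
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")

def validate_token(token: str) -> dict:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on bad tokens.
    return jwt.decode(token, SECRET, algorithms=["HS256"])

if __name__ == "__main__":
    token = issue_token("user-42", ["architect"])
    print(validate_token(token)["roles"])
```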

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana

On-site

Req ID: 333208 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Sr BODS Developer - Track Lead to join our team in Hyderabad, Telangana (IN-TG), India (IN). Sr SAP BODS Developer - Track Lead Position Overview Understand and execute data migration blueprints (migration concepts, transformation rules, mappings, selection criteria) Understand and contribute to the documentation of the data mapping specifications, conversion rules, technical design specifications as required Build the conversion processes and associated programs that will migrate the data per the design and conversion rules that have been signed-off by the client Execution of all data migration technical steps (extract, transform & load) as well as Defect Management and Issue Resolution Perform data load activities for each mock load, cutover simulation and production deployment identified in L1 plan into environments identified Provide technical support, defect management, and issue resolution during all testing cycles, including Mock Data Load cycles Complete all necessary data migration documentation necessary to support system validation / compliance requirements Support the development of unit and end-to-end data migration test plans and test scripts (including testing for data extraction, transformation, data loading, and data validation) Job Requirements 8+ Yrs. of overall technical experience in SAP BODS with all the SAP BODS application modules (Extract, Transform, Load) 5+ Yrs. of experience with Data Migration experience with S/4 HANA/ECC Implementations Experience in BODS Designer Components- Projects, Jobs, Workflow, Data Flow, Scripts, Data Stores and Formats Experience in BODS performance tuning techniques using parallel processing (Degree of Parallelism), Multithreading, Partitioning, and Database Throughputs to improve job performance Extensive experience in ETL using SAP BODS and SAP IS with respect to SAP Master / Transaction Data Objects in SAP FICO, SAP SD, SAP MM/WM, SAP Plant Maintenance, SAP Quality Management etc. is desirable Experience with Data Migration using LSMW, IDOCS, LTMC About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. 
This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
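For readers unfamiliar with the extract-transform-load-reconcile cycle this posting describes, the sketch below illustrates it conceptually. SAP BODS is a graphical ETL tool, so this is not BODS code; it is a small pandas example with hypothetical file names and field mappings, showing the kind of record-count reconciliation run after each mock load or cutover simulation.

```python
# Conceptual extract -> transform -> load -> reconcile sketch; inputs are hypothetical.
import pandas as pd

def transform_materials(src: pd.DataFrame) -> pd.DataFrame:
    # Illustrative mapping of legacy material-master fields to target fields.
    out = src.rename(columns={"MATNR": "material", "MAKTX": "description"})
    out["material"] = out["material"].str.strip().str.upper()
    return out.drop_duplicates(subset="material")

def reconcile(source: pd.DataFrame, loaded: pd.DataFrame) -> dict:
    # The kind of count check run after each mock load / production deployment.
    return {
        "source_rows": len(source),
        "loaded_rows": len(loaded),
        "missing": len(set(source["material"]) - set(loaded["material"])),
    }

if __name__ == "__main__":
    src = pd.read_csv("legacy_material_extract.csv")   # hypothetical extract file
    tgt = transform_materials(src)
    tgt.to_csv("s4hana_material_load.csv", index=False)
    print(reconcile(tgt, pd.read_csv("s4hana_material_load.csv")))
```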

Posted 3 weeks ago

Apply

0.0 - 3.0 years

0 Lacs

Chennai, Tamil Nadu

Remote

Job description Job Title: Jaspersoft developer Location: INDIA (Remote OK) Experience: 5 to 10 Years Start Date: Immediate Joiners Preferred Job Summary: We are looking for a skilled and detail-oriented freelance BI Developer with hands-on experience in Jaspersoft / JasperReports . The ideal candidate will be responsible for developing, converting, and maintaining business intelligence reports and dashboards, with a strong focus on PL/SQL and data handling. Key Responsibilities: Design, develop, and maintain BI reports and dashboards using Jaspersoft / JasperReports. Migrate and convert reports from legacy systems into Jaspersoft. Write, optimize, and troubleshoot complex PL/SQL queries for reporting. Work independently and deliver quality output within deadlines. Required Skills: 1 to 3 years of experience with Jaspersoft / JasperReports. Proficiency in PL/SQL and relational databases (Oracle, MySQL, etc.). Experience in report conversion and migration projects. Strong analytical and problem-solving abilities. Good communication skills and ability to work independently. Preferred: Candidates based in Tamil Nadu. Immediate or short notice availability. Experience working in Agile environments is a plus. Industry Software Development Employment Type Full-time Job Type: Full-time Benefits: Health insurance Provident Fund Work from home Work Location: In person

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description

Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company. A leader in the convenience store and fuel space with over 17,000 stores in 31 countries, serving more than 6 million customers each day. It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long-term success.

About The Role

We are looking for a Senior Data Engineer with a collaborative, “can-do” attitude who is committed and strives with determination and motivation to make their team successful; a Sr. Data Engineer who has experience architecting and implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands-on sourcing, manipulation, and delivery of data from enterprise business systems to the data lake and data warehouse. This role will help drive Circle K’s next phase in the digital journey by modeling and transforming data to achieve actionable business outcomes. The Sr. Data Engineer will create, troubleshoot and support ETL pipelines and the cloud infrastructure involved in the process, and will also support the visualizations team.

Roles and Responsibilities

Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals. Demonstrate deep technical and domain knowledge of relational and non-relational databases, Data Warehouses, and Data Lakes, among other structured and unstructured storage options. Determine solutions that are best suited to develop a pipeline for a particular data source. Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development. Be efficient in ETL/ELT development using Azure cloud services and Snowflake, including testing and operation/support processes (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance). Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery. Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders. Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability). Stay current with and adopt new tools and applications to ensure high-quality and efficient solutions. Build a cross-platform data strategy to aggregate multiple sources and process development datasets. Be proactive in stakeholder communication; mentor/guide junior resources by doing regular KT/reverse KT, help them identify production bugs/issues if needed, and provide resolution recommendations.

Job Requirements

Bachelor’s Degree in Computer Engineering, Computer Science or related discipline; Master’s Degree preferred. 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment. 5+ years of experience with setting up and operating data pipelines using Python or SQL. 5+ years of advanced SQL programming: PL/SQL, T-SQL. 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization. Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads. 5+ years of strong and extensive hands-on experience in Azure, preferably data-heavy/analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data. 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions. 5+ years of experience in defining and enabling data quality standards for auditing and monitoring. Strong analytical abilities and a strong intellectual curiosity. In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts. Understanding of REST and good API design. Experience working with Apache Iceberg, Delta tables and distributed computing frameworks. Strong collaboration and teamwork skills and excellent written and verbal communication skills. Self-starter and motivated, with the ability to work in a fast-paced development environment. Agile experience highly desirable. Proficiency in the development environment, including IDE, database server, Git, Continuous Integration, unit-testing tools, and defect management tools.

Knowledge

Strong knowledge of Data Engineering concepts (data pipeline creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management). Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques. Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks. Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control tools, Master Data Management (MDM) and Data Quality tools. Strong experience in ETL/ELT development, QA and operation/support processes (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance). Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting. ADF, Databricks and Azure certification is a plus.

Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
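A minimal sketch of the Snowflake side of such a pipeline, assuming the snowflake-connector-python package, a pre-created stage and table, and placeholder credentials; an ADF or Databricks job would typically trigger a load like this and follow it with a simple data-quality check.

```python
# Load staged files into Snowflake, then run a basic reconciliation query.
# Account, stage, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="etl_service",        # placeholder
    password="***",            # use a secret store in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGE",
)
try:
    cur = conn.cursor()
    cur.execute("COPY INTO STAGE.SALES FROM @SALES_STAGE FILE_FORMAT = (TYPE = CSV)")
    cur.execute("SELECT COUNT(*), COUNT(DISTINCT order_id) FROM STAGE.SALES")
    total, distinct_ids = cur.fetchone()
    print(f"rows={total}, distinct order ids={distinct_ids}")
finally:
    conn.close()
```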

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description

Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company. A leader in the convenience store and fuel space with over 17,000 stores in 31 countries, serving more than 6 million customers each day. It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long-term success.

About The Role

We are looking for a Data Engineer with a collaborative, “can-do” attitude who is committed and strives with determination and motivation to make their team successful; a Data Engineer who has experience implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands-on sourcing, manipulation, and delivery of data from enterprise business systems to the data lake and data warehouse. This role will help drive Circle K’s next phase in the digital journey by transforming data to achieve actionable business outcomes.

Roles and Responsibilities

Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals. Demonstrate technical and domain knowledge of relational and non-relational databases, Data Warehouses, and Data Lakes, among other structured and unstructured storage options. Determine solutions that are best suited to develop a pipeline for a particular data source. Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development. Be efficient in ELT/ETL development using Azure cloud services and Snowflake, including testing and operational support (RCA, Monitoring, Maintenance). Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery. Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders. Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability). Stay current with and adopt new tools and applications to ensure high-quality and efficient solutions. Build a cross-platform data strategy to aggregate multiple sources and process development datasets. Be proactive in stakeholder communication; mentor/guide junior resources by doing regular KT/reverse KT, help them identify production bugs/issues if needed, and provide resolution recommendations.

Job Requirements

Bachelor’s degree in Computer Engineering, Computer Science or related discipline; Master’s Degree preferred. 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment. 3+ years of experience with setting up and operating data pipelines using Python or SQL. 3+ years of advanced SQL programming: PL/SQL, T-SQL. 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization. Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads. 3+ years of strong and extensive hands-on experience in Azure, preferably data-heavy/analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data. 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions. 3+ years of experience in defining and enabling data quality standards for auditing and monitoring. Strong analytical abilities and a strong intellectual curiosity. In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts. Understanding of REST and good API design. Experience working with Apache Iceberg, Delta tables and distributed computing frameworks. Strong collaboration and teamwork skills and excellent written and verbal communication skills. Self-starter and motivated, with the ability to work in a fast-paced development environment. Agile experience highly desirable. Proficiency in the development environment, including IDE, database server, Git, Continuous Integration, unit-testing tools, and defect management tools.

Preferred Skills

Strong knowledge of Data Engineering concepts (data pipeline creation, Data Warehousing, Data Marts/Cubes, Data Reconciliation and Audit, Data Management). Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques. Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks. Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control tools, Master Data Management (MDM) and Data Quality tools. Strong experience in ETL/ELT development, QA and operation/support processes (RCA of production issues, Code/Data Fix Strategy, Monitoring and maintenance). Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting. ADF, Databricks and Azure certification is a plus.

Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
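The listing calls out PySpark and Azure storage; the sketch below shows an illustrative PySpark batch transformation of that kind. The storage account, paths, and column names are hypothetical.

```python
# Illustrative PySpark job: clean raw sales files and publish a daily aggregate.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("store_sales_daily").getOrCreate()

raw = (spark.read.option("header", True)
       .csv("abfss://raw@lakedemo.dfs.core.windows.net/sales/2024-06-01/"))  # hypothetical path

clean = (raw.withColumn("sale_amount", F.col("sale_amount").cast("double"))
            .withColumn("sale_date", F.to_date("sale_date", "yyyy-MM-dd"))
            .dropna(subset=["store_id", "sale_date"])
            .dropDuplicates(["store_id", "ticket_id"]))

daily = clean.groupBy("store_id", "sale_date").agg(
    F.sum("sale_amount").alias("total_sales"),
    F.countDistinct("ticket_id").alias("tickets"),
)

daily.write.mode("overwrite").partitionBy("sale_date").parquet(
    "abfss://curated@lakedemo.dfs.core.windows.net/sales_daily/")
```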

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

JOB_POSTING-3-72477-2

Job Description

Role Title: VP, Solution Architect (L12)

Company Overview

Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India’s Best Companies to Work for by Great Place to Work. We were among the Top 50 India’s Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~52% women talent. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities for all to take up leadership roles.

Organization Overview

Synchrony's Engineering Team is a dynamic and innovative team dedicated to driving technological excellence. As a member of this team, you'll play a pivotal role in designing and developing cutting-edge tech stacks and solutions that redefine industry standards. The credit card that we use every day to purchase our essentials and later settle the bills - a simple process that we are all used to on a day-to-day basis. Now, consider the vast complexity hidden behind this seemingly simple process, operating tirelessly for millions of cardholders. The sheer volume of data processed is mind-boggling. Fortunately, advanced technology stands ready to automate and manage this constant torrent of information, ensuring smooth transactions around the clock, 365 days a year. Our collaborative environment encourages creative problem-solving and fosters career growth. Join us to work on diverse projects, from fintech to data analytics, and contribute to shaping the future of technology. If you're passionate about engineering and innovation, Synchrony's Engineering Team is the place to be.

Role Summary/Purpose

Reporting to the Senior Engineering Manager, this is a hands-on expert who thrives in system design, architecture and complex domain environments and leads the team by example in terms of code quality and on-time deliverables. To be successful in this role, deep expertise in solution and cloud-native application architecture, event-driven architecture, and building resilient batch applications using Spring Batch, together with the ability to influence and lead a team, are musts. The Solution Architect will develop and lead a team of solution-driven engineers responsible for developing, optimizing, and monitoring products & capabilities across Synchrony’s Apply-Buy platforms. The ideal candidate will have a solid and established track record of building successful, efficient & reliable applications, cloud migration experience, and experience partnering with business & technology stakeholders to guide the delivery of agile, cost-effective, and high-performance complex cloud solutions.

Key Responsibilities

Motivate, influence and lead a team of Engineers, Dev Leads, Tech Leads and SREs. Architect applications across the full stack using Synchrony standards, i.e., leveraging PCF, AWS, J2EE, Spring (Java) and React. Implement new technologies and assist developers as they migrate to new technologies. Coach and empower them to work efficiently and take responsibility for delivery. Design and own common code modules, and review and approve them using a formal approval process. Be a Java/Solution Architecture CoP champion, contribute to the community of practice, and encourage team members to contribute to their respective Communities of Practice. Work in Agile Development and the Scaled Agile Framework. Manage and automate the DevOps process hands-on. Influence and collaborate to deliver technical solutions with high quality, performance, and sound design principles. Mentor Tech Leads (AVPs) to lead/build successful teams. Hire and retain talent. Perform other duties and/or special projects as assigned.

Qualifications/Requirements

Bachelor's degree in computer science or a related degree with modern application development experience, with a minimum of 10+ years of experience in Technology; or, in lieu of a degree, 12+ years of experience required. 6+ years in Software Architecture and Development using Java, Spring and modern front-end frameworks like React. Strong expertise in building widgets that are hosted internally and/or consumed by 3rd-party applications. Strong experience in developing microservices with the 12-factor app methodology. Proven experience in designing and building applications from the ground up, with hands-on involvement across all phases of the software development life cycle. Strong experience in distributed architecture with acumen for monitoring and incident management, leveraging tools like Splunk and New Relic. Working experience with the Scaled Agile Framework, and familiarity with tools like JIRA. Strong experience with CI/CD pipelines, processes and tools. Strong experience writing unit and integration tests and familiarity with frameworks like JUnit, Mockito and Spring Test. Familiarity with designing applications using SOLID principles, Java and microservice design patterns, with business acumen. Familiarity with cloud platforms like Pivotal Cloud Foundry. Familiarity with Behavior Driven Development and API test automation. Working knowledge of RDBMS. Strong communication skills with technical and non-technical peers. Ability to analyze and use structured problem solving and available tools to troubleshoot systems, identify root cause, and determine action plans, impact and resolution options. Passionate about learning and understanding diversified business domains and technologies. Ability to quickly learn new technologies and frameworks.

Eligibility Criteria

Bachelor's degree in computer science or a related degree with modern application development experience, with a minimum of 10+ years of experience in Technology; or, in lieu of a degree, 12+ years of experience required. Work Timings: 2 PM - 11 PM IST. This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.

For Internal Applicants

Understand the criteria or mandatory skills required for the role before applying. Inform your manager and HRM before applying for any role on Workday. Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format). There must not be any corrective action plan in place (First Formal/Final Formal, LPP). Only L10+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply.

Grade/Level: 12

Job Family Group: Information Technology

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

India

Remote

Job Title: Talend with SAP Master Data Specialist for S/4HANA Public Cloud Migration | Remote | Long Term/Full Time

We are seeking a highly skilled and experienced Talend & SAP Master Data Specialist to play a critical role in our SAP S/4HANA Public Cloud data migration project. The ideal candidate will possess deep expertise in the Talend data integration platform and extensive knowledge of SAP Master Data, specifically within the context of SAP S/4HANA Public Cloud. This role will be instrumental in ensuring the successful extraction, transformation, loading, cleanup, and enrichment of data for our new S/4HANA environment.

Responsibilities

Data Migration Strategy & Execution: Collaborate with the project team to define and implement data migration strategies for SAP S/4HANA Public Cloud, leveraging Talend for efficient data extraction, transformation, and loading (ETL). Talend Development: Design, develop, test, and deploy robust and scalable ETL jobs using Talend Data Integration (or similar Talend products) to migrate master and transactional data from various source systems to SAP S/4HANA Public Cloud. SAP Master Data Expertise: Apply in-depth knowledge of SAP Master Data objects (e.g., Material Master, Customer Master, Vendor Master, Business Partner, GL Accounts, etc.) to ensure data integrity, consistency, and adherence to S/4HANA Public Cloud best practices. Data Quality & Governance: Lead data cleanup and enrichment activities, identifying and resolving data discrepancies, inconsistencies, and incompleteness. Implement data quality rules and processes to ensure high-quality data in the target S/4HANA system. Data Mapping & Transformation: Develop complex data mappings and transformations between source systems and SAP S/4HANA Public Cloud, ensuring accurate and efficient data conversion. Collaboration: Work closely with functional consultants, business users, and technical teams to understand data requirements, validate transformed data, and resolve any data-related issues. Documentation: Create and maintain comprehensive documentation for data migration processes, ETL jobs, data mappings, and data quality rules. Troubleshooting & Optimization: Identify and troubleshoot data migration issues and performance bottlenecks, and provide effective solutions. Optimize ETL processes for efficiency and performance.

Required Skills & Qualifications

5+ years of hands-on experience with Talend Data Integration (or other relevant Talend products) for complex data migration and integration projects. Extensive experience (5+ years) with SAP Master Data concepts, structures, and best practices across various modules (e.g., SD, MM, FICO) in an SAP ECC or S/4HANA environment. Proven experience with SAP S/4HANA Public Cloud data migration projects is highly desirable. Strong understanding of data governance, data quality, and data stewardship principles. Proficiency in SQL and experience with various database systems. Experience with data profiling, data cleansing, and data enrichment techniques. Excellent analytical and problem-solving skills with a keen eye for detail. Strong communication and interpersonal skills, with the ability to collaborate effectively with technical and business stakeholders. Ability to work independently and as part of a team in a fast-paced project environment.

Preferred Qualifications

Certifications in Talend or SAP S/4HANA. Experience with other data migration tools or methodologies. Knowledge of SAP Activate methodology.
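Talend jobs implement this kind of cleanup and enrichment graphically (for example, in tMap and tUniqRow components); the pandas sketch below only illustrates the sort of master-data cleanup rule described above. The columns, country codes, and file names are hypothetical.

```python
# Conceptual master-data cleanup sketch; not Talend code.
import pandas as pd

def clean_business_partners(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    # Normalize identifiers and country codes before mapping to S/4HANA.
    df["tax_id"] = df["tax_id"].str.replace(r"\W", "", regex=True).str.upper()
    df["country"] = df["country"].str.upper().replace({"IND": "IN", "USA": "US"})
    # Flag incomplete records rather than silently dropping them.
    df["dq_issue"] = df[["name", "tax_id", "country"]].isna().any(axis=1)
    return df.drop_duplicates(subset=["tax_id", "country"], keep="first")

if __name__ == "__main__":
    partners = pd.read_csv("ecc_business_partner_extract.csv")  # hypothetical extract
    clean_business_partners(partners).to_csv("s4_bp_load_file.csv", index=False)
```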

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Snowflake Data Warehouse Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse. - Good To Have Skills: Experience with data modeling and database design. - Strong understanding of ETL processes and data integration techniques. - Familiarity with cloud platforms and services related to data storage and processing. - Experience in performance tuning and optimization of data queries. Additional Information: - The candidate should have minimum 5 years of experience in Snowflake Data Warehouse. - This position is based in Mumbai. - A 15 years full time education is required.
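A hedged sketch of one common Snowflake loading pattern for a role like this, using write_pandas from snowflake-connector-python; the account details are placeholders and the target table is assumed to already exist.

```python
# Load a curated DataFrame into an existing Snowflake staging table.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

orders = pd.read_parquet("curated/orders.parquet")  # hypothetical input file

conn = snowflake.connector.connect(
    account="my_account", user="etl_service", password="***",  # placeholders
    warehouse="LOAD_WH", database="ANALYTICS", schema="CORE",
)
try:
    # Assumes ANALYTICS.CORE.ORDERS_STG already exists with matching columns.
    success, _, nrows, _ = write_pandas(conn, orders, "ORDERS_STG")
    print(f"loaded={success}, rows={nrows}")
finally:
    conn.close()
```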

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Snowflake Data Warehouse Good to have skills : Data Engineering, Databricks Unified Data Analytics Platform Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are efficient, scalable, and aligned with business objectives. You will also monitor and optimize existing data processes to enhance performance and reliability, while staying updated with the latest industry trends and technologies to continuously improve data management practices. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute in providing solutions to work related problems. - Collaborate with stakeholders to gather and analyze data requirements. - Design and implement data models that support business needs. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse. - Good To Have Skills: Experience with Data Engineering, Databricks Unified Data Analytics Platform. - Strong understanding of ETL processes and data integration techniques. - Experience with data quality assurance and data governance practices. - Familiarity with cloud-based data solutions and architecture. Additional Information: - The candidate should have minimum 3 years of experience in Snowflake Data Warehouse. - This position is based at our Mumbai office. - A 15 years full time education is required.
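As an illustration of the Databricks "good to have" skill, the sketch below publishes a curated Delta table that downstream Snowflake-facing jobs could consume. It assumes a Databricks runtime (or Spark with the Delta Lake package); the mount path, columns, and table names are invented.

```python
# Illustrative Databricks/PySpark step: curate staged data into a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("publish_customer_dim").getOrCreate()

stage = spark.read.parquet("/mnt/stage/customers/")           # hypothetical mount
dim = (stage.filter(F.col("is_active"))
            .withColumn("load_ts", F.current_timestamp())
            .select("customer_id", "customer_name", "segment", "load_ts"))

(dim.write.format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.dim_customer"))                   # assumes the analytics database exists
```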

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description

Some careers have more impact than others. If you’re looking for a career where you can make a real impression, join HSBC and discover how valued you’ll be. HSBC is one of the largest banking and financial services organisations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Assistant Vice President.

Principal Responsibilities

The FDS Data Team are seeking to recruit a Data Modeller with a passion for organising and transforming complex Finance data into actionable insights represented within data model structures that are fit for purpose. The role requires a strong analytical mindset and a good understanding of various data modelling techniques and tools, with a proven track record. The individual should have exposure to designing and implementing efficient data models that cater to the data sourcing, storage and usage needs of the Finance business and/or front-to-back business domains within a global financial institution. Support the design and development of FDS conceptual, logical and application data models as per HSBC's Future State Architecture (Data Asset Strategy) and work across Finance business teams to drive understanding, interpretation, design, and implementation. Support Finance business and change teams to migrate to target state data models and Data Asset delivery, driving improvement on current feeds and data issues. Develop data modelling schemas aligned with Enterprise data models and supporting Finance Data Assets. Contribute to FDS program model development planning and scheduling. Continuously improve the FDS data modelling estate, adhering to risks, controls, security, and regulatory compliance standards. Advise on and support Finance modelling data requirements that support new use cases and data changes. Serve as the FDS data modelling subject matter expert. Seek opportunities to simplify, automate, rationalise, and improve the efficiency of Finance IT and modelling solutions. Update and maintain the key FDS modelling artefacts (i.e., Confluence, SharePoint, documents, reports, roadmap, and other domain artefacts). Provide data modelling and technical advice as well as maintain ongoing relationships. Provide feedback in a timely manner to ensure that model development or modification meets the business need.

Requirements

Minimum of 5 years' experience of data management and modelling solutions working as a Data Modeller within the Financial Services sector is essential, preferably in a Treasury/Finance function and/or related front office environment. Good communication skills with the ability to influence and present data models (as well as concepts) to technology and business stakeholders. Good collaboration skills with the ability to demonstrate experience achieving outcomes in a matrixed environment, partnering with data modellers from other domains to build and join shared and reusable data assets. Experience of working with Agile and Scrum in a large scalable Agile environment; this should include participation and progress reporting in daily standups. Experience working with leading data modelling tools and producing modelling documentation using tools such as Visual Paradigm, ERwin, PowerDesigner, ER Studio, etc. Knowledge of data modelling standards and modelling technical documentation using Entity Relationship Diagrams (ERD), Unified Modelling Language (UML) or BIAN. Results-oriented with the ability to produce solutions that deliver organisational benefit. Understanding of issue and data quality management, prioritisation, business case development, remediation planning and tactical or strategic solution delivery. Exposure to data governance initiatives such as lineage, masking, retention policy, and data quality. Strong analytical and problem-solving skills, with the ability to work unsupervised and take ownership of key deliverables.

You’ll achieve more at HSBC

HSBC is an equal opportunity employer committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, people with disabilities, color, national origin, veteran status, etc. We consider all applications based on merit and suitability to the role. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by HSBC Electronic Data Processing (India) Private Ltd
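The modelling artefacts in this role are ERDs maintained in tools such as ERwin or PowerDesigner; the sketch below merely shows the same logical building blocks (entities, keys, relationships) expressed as code with SQLAlchemy. The entities are invented for illustration and are not HSBC's FDS model.

```python
# Two invented entities with a one-to-many relationship, as a "model as code" illustration.
from sqlalchemy import Column, Date, ForeignKey, Integer, Numeric, String
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class LegalEntity(Base):
    __tablename__ = "legal_entity"
    entity_id = Column(Integer, primary_key=True)
    entity_name = Column(String(100), nullable=False)
    balances = relationship("GLBalance", back_populates="entity")

class GLBalance(Base):
    __tablename__ = "gl_balance"
    balance_id = Column(Integer, primary_key=True)
    entity_id = Column(Integer, ForeignKey("legal_entity.entity_id"), nullable=False)
    gl_account = Column(String(10), nullable=False)
    posting_date = Column(Date, nullable=False)
    amount = Column(Numeric(18, 2), nullable=False)
    entity = relationship("LegalEntity", back_populates="balances")
```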

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

India

On-site

Our client is embarking on a digital transformation initiative aimed at modernizing field data capture processes. The goal is to retire outdated Maximo Mobile forms and migrate field forms into a modern, user-friendly mobile interface. This solution will integrate seamlessly with existing Maximo workflows and support critical field operations. Key Responsibilities Analyze and document existing Maximo Mobile field forms Design and configure mobile data capture modules using a low-code/no-code mobile platform Integrate the new mobile solution with Maximo to support key business workflows Lead the end-to-end migration of legacy forms Support module deployment, UAT testing, and user training Collaborate with PMO and business stakeholders to ensure timely delivery of milestones Provide go-live support and thorough documentation  Requirements 7+ years of experience with IBM Maximo Experience with mobile form platforms Strong understanding of Maximo integration points and workflow automation Proven track record of managing large-scale form migrations and deployments Excellent communication, stakeholder coordination, and documentation skills Experience in field operations, utilities, or environmental services is a plus
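A hypothetical integration check for this kind of migration: pulling open work orders over Maximo's REST interface so the new mobile forms can be validated against live data. The host, object structure name, and API-key header below are assumptions; confirm them against the actual Maximo REST/OSLC configuration before use.

```python
# Fetch in-progress work orders from an assumed Maximo REST endpoint.
import requests

BASE = "https://maximo.example.com/maximo/oslc/os/mxwo"    # assumed object structure URL
HEADERS = {"apikey": "***", "Accept": "application/json"}   # assumed API-key auth

params = {
    "oslc.select": "wonum,description,status,siteid",
    "oslc.where": 'status="INPRG"',
    "lean": 1,
}
resp = requests.get(BASE, headers=HEADERS, params=params, timeout=30)
resp.raise_for_status()
for wo in resp.json().get("member", []):
    print(wo.get("wonum"), "-", wo.get("description"))
```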

Posted 3 weeks ago

Apply

12.0 years

25 - 35 Lacs

Madurai

On-site

Dear Candidate, Greetings of the day!! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn: https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/ or email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. It holds a primary objective of delivering strategic solutions towards the goals of its business partners in terms of technology. We are a full-scale, leading Software and Mobile App Development Company. Techmango is driven by the mantra “Clients Vision is our Mission”, and we stay true to this statement. Our vision is to be the technologically advanced & most loved organization providing prime quality and cost-efficient services with a long-term client relationship strategy. We are operational in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy). Techmango: https://www.techmango.net/

Job Title: GCP Data Architect Location: Madurai Experience: 12+ Years Notice Period: Immediate

About TechMango TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities: Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP). Define data strategy, standards, and best practices for cloud data engineering and analytics. Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery. Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery). Architect data lakes, warehouses, and real-time data platforms. Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP). Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers. Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards. Provide technical leadership in architectural decisions and future-proofing the data ecosystem.

Required Skills & Qualifications: 10+ years of experience in data architecture, data engineering, or enterprise data platforms. Minimum 3–5 years of hands-on experience in GCP data services. Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python / Java / SQL; data modeling (OLTP, OLAP, Star/Snowflake schema). Experience with real-time data processing, streaming architectures, and batch ETL pipelines. Good understanding of IAM, networking, security models, and cost optimization on GCP. Prior experience in leading cloud data transformation projects. Excellent communication and stakeholder management skills.

Preferred Qualifications: GCP Professional Data Engineer / Architect Certification.
Experience with Terraform, CI/CD, GitOps, Looker / Data Studio / Tableau for analytics. Exposure to AI/ML use cases and MLOps on GCP. Experience working in agile environments and client-facing roles. What We Offer: Opportunity to work on large-scale data modernization projects with global clients. A fast-growing company with a strong tech and people culture. Competitive salary, benefits, and flexibility. Collaborative environment that values innovation and leadership. Job Type: Full-time Pay: ₹2,500,000.00 - ₹3,500,000.00 per year Application Question(s): Current CTC ? Expected CTC ? Notice Period ? (If you are serving Notice period please mention the Last working day) Experience: GCP Data Architecture : 3 years (Required) BigQuery: 3 years (Required) Cloud Composer (Airflow): 3 years (Required) Location: Madurai, Tamil Nadu (Required) Work Location: In person
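A minimal Apache Beam sketch of the GCS-to-BigQuery ingestion pattern named in the responsibilities; the bucket, dataset, and schema are placeholders, and actually running it on Dataflow (or the DirectRunner with BigQuery output) requires GCP credentials and the usual pipeline options such as project and temp location.

```python
# Batch ingestion: read CSV lines from GCS, parse them, and append to BigQuery.
import csv
import apache_beam as beam

def parse_row(line: str) -> dict:
    store_id, sale_date, amount = next(csv.reader([line]))
    return {"store_id": store_id, "sale_date": sale_date, "amount": float(amount)}

with beam.Pipeline() as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://example-raw/sales/*.csv", skip_header_lines=1)
     | "Parse" >> beam.Map(parse_row)
     | "Write" >> beam.io.WriteToBigQuery(
           "example-project:analytics.sales",                      # placeholder table
           schema="store_id:STRING,sale_date:DATE,amount:FLOAT",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
```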

Posted 3 weeks ago

Apply

6.0 - 8.0 years

0 Lacs

Chennai

On-site

Job Summary We are seeking a skilled Cloud Engineer with 6 to 8 years of experience to join our team. The ideal candidate will have expertise in BC-DR Azure Migrate Tool Rehost-Replatform Refactor-Rearchitect Azure Assessment and Capacity Planning. This role focuses on enhancing user experience services in a hybrid work model with day shifts. No travel is required. Responsibilities Lead the design and implementation of cloud solutions using Azure technologies to optimize performance and scalability. Oversee the migration process using Azure Migrate Tool ensuring seamless transition and minimal downtime. Provide expertise in BC-DR strategies to ensure data integrity and business continuity. Collaborate with cross-functional teams to rehost replatform refactor and rearchitect applications for cloud environments. Conduct comprehensive Azure assessments to identify opportunities for improvement and cost optimization. Develop capacity planning strategies to ensure resources are efficiently allocated and utilized. Enhance user experience services by integrating innovative cloud solutions that meet business needs. Monitor cloud infrastructure to ensure high availability and performance addressing any issues proactively. Implement security best practices to protect cloud environments and sensitive data. Support the development of cloud-based applications ensuring they align with organizational goals. Provide technical guidance and support to team members fostering a collaborative work environment. Evaluate emerging cloud technologies and recommend solutions that align with business objectives. Document processes and procedures to ensure knowledge transfer and operational efficiency. Qualifications Demonstrate proficiency in BC-DR and Azure Migrate Tool showcasing successful project implementations. Possess strong skills in rehosting replatforming refactoring and rearchitecting applications for cloud environments. Exhibit expertise in conducting Azure assessments and developing capacity planning strategies. Have a solid understanding of user experience services and how cloud solutions can enhance them. Show experience in monitoring and maintaining cloud infrastructure for optimal performance. Display knowledge of security best practices in cloud environments. Be adept at collaborating with cross-functional teams to achieve project goals. Required Skills Technical Skills- Azure MSCRM Azure Domain Skills- Industrial Manufacturing Nice to have skills Technical Skills- Azure Public Cloud Admin Azure Cloud Native Security Domain Skills- Technology MBG CG Shift Day 12:00PM-10:00PM Roles & Responsibilities Admin Azure Cloud Azure Cloud Discovery AWS Resource Request History Ensure application performance uptime and scale maintaining high standards of code quality and thoughtful design Managing cloud environments in accordance with company security guidelines Develop and implement technical efforts to design build and deploy Azure/AWS applications at the direction of lead architects including large-scale data processing computationally intensive statistical modelling and advanced analytics Participate in all aspects of the software development life cycle for Azure/AWS solutions including planning requirements development testing and quality assurance Troubleshoot incidents Certifications Required Azure Solutions Architect Expert Microsoft Certified: Azure Administrator Associate
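An assessment-style inventory sketch using the Azure SDK for Python: listing the VMs in a subscription is one small input to the Azure Migrate discovery and capacity-planning work described above. The subscription ID is a placeholder and authentication relies on whatever DefaultAzureCredential can find (CLI login, managed identity, etc.).

```python
# List VMs in a subscription as a starting point for a migration inventory.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder
compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

for vm in compute.virtual_machines.list_all():
    size = vm.hardware_profile.vm_size if vm.hardware_profile else "unknown"
    print(f"{vm.name:30} {vm.location:15} {size}")
```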

Posted 3 weeks ago

Apply

1.0 years

11 - 15 Lacs

Bengaluru

On-site

Work Location : Kuala Lumpur, Malaysia Contract duration : 1 Year (extendable) Key Responsibilities: Design, develop, and maintain mission-critical applications using PowerBuilder (v12.x or later). Utilize DataWindow, PowerScript, and Integrated SQL for UI and data access layers. Collaborate with business analysts to gather requirements and translate them into technical designs (DataWindows, windows, scripts). Troubleshoot, debug, and resolve application issues promptly (production and pre-production environments). Enhance legacy systems, support modernization (such as REST integration, UI refresh), and migrate existing modules where necessary. Perform unit testing, support UAT, and coordinate with QA teams to ensure quality deliverables. Document technical designs, code changes, and contribute to knowledge-base articles. Work closely within Agile/Scrum or Waterfall teams, participating in stand-ups, sprint planning, and code reviews. Required Qualifications: 1–5 years of hands-on experience with PowerBuilder development Strong proficiency in SQL and relational databases (e.g., Sybase ASE, SQL Server, Oracle). Good understanding of software development lifecycle (SDLC)—Agile/Scrum familiarity preferred. Solid analytical skills and debugging capabilities. Excellent communication and collaboration skills; ability to work with cross-functional teams. Eagerness to learn and contribute to modernization efforts, including cloud integrations. Job Type: Contractual / Temporary Contract length: 12 months Pay: ₹1,100,000.00 - ₹1,500,000.00 per year Experience: PowerBuilder development: 2 years (Required)

Posted 3 weeks ago

Apply

6.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary We are seeking a skilled Cloud Engineer with 6 to 8 years of experience to join our team. The ideal candidate will have expertise in BC-DR Azure Migrate Tool Rehost-Replatform Refactor-Rearchitect Azure Assessment and Capacity Planning. This role focuses on enhancing user experience services in a hybrid work model with day shifts. No travel is required. Responsibilities Lead the design and implementation of cloud solutions using Azure technologies to optimize performance and scalability. Oversee the migration process using Azure Migrate Tool ensuring seamless transition and minimal downtime. Provide expertise in BC-DR strategies to ensure data integrity and business continuity. Collaborate with cross-functional teams to rehost replatform refactor and rearchitect applications for cloud environments. Conduct comprehensive Azure assessments to identify opportunities for improvement and cost optimization. Develop capacity planning strategies to ensure resources are efficiently allocated and utilized. Enhance user experience services by integrating innovative cloud solutions that meet business needs. Monitor cloud infrastructure to ensure high availability and performance addressing any issues proactively. Implement security best practices to protect cloud environments and sensitive data. Support the development of cloud-based applications ensuring they align with organizational goals. Provide technical guidance and support to team members fostering a collaborative work environment. Evaluate emerging cloud technologies and recommend solutions that align with business objectives. Document processes and procedures to ensure knowledge transfer and operational efficiency. Qualifications Demonstrate proficiency in BC-DR and Azure Migrate Tool showcasing successful project implementations. Possess strong skills in rehosting replatforming refactoring and rearchitecting applications for cloud environments. Exhibit expertise in conducting Azure assessments and developing capacity planning strategies. Have a solid understanding of user experience services and how cloud solutions can enhance them. Show experience in monitoring and maintaining cloud infrastructure for optimal performance. Display knowledge of security best practices in cloud environments. Be adept at collaborating with cross-functional teams to achieve project goals. Required Skills Technical Skills- Azure MSCRM Azure Domain Skills- Industrial Manufacturing Nice to have skills Technical Skills- Azure Public Cloud Admin Azure Cloud Native Security Domain Skills- Technology MBG CG Shift Day 12:00PM-10:00PM Roles & Responsibilities Admin Azure Cloud Azure Cloud Discovery AWS Resource Request History Ensure application performance uptime and scale maintaining high standards of code quality and thoughtful design Managing cloud environments in accordance with company security guidelines Develop and implement technical efforts to design build and deploy Azure/AWS applications at the direction of lead architects including large-scale data processing computationally intensive statistical modelling and advanced analytics Participate in all aspects of the software development life cycle for Azure/AWS solutions including planning requirements development testing and quality assurance Troubleshoot incidents Certifications Required Azure Solutions Architect Expert Microsoft Certified: Azure Administrator Associate

Posted 3 weeks ago

Apply

8.0 - 11.0 years

0 Lacs

Andhra Pradesh

On-site

HIH - Software Engineering Associate Advisor

Position Summary:
Evernorth, a leading Health Services company, is looking for exceptional data engineers/developers for our Data and Analytics organization. In this role, you will actively participate with your development team on initiatives that support Evernorth's strategic goals, working with subject matter experts to understand the business logic you will be engineering. As a software engineer, you will help develop an integrated architectural strategy to support next-generation reporting and analytical capabilities on an enterprise-wide scale. You will work in an agile environment, delivering user-oriented products that will be available both internally and externally to our customers, clients, and providers.

Candidates will have the opportunity to work with a range of technologies and data manipulation concepts. Specifically, this may include developing healthcare data structures and data transformation logic to enable analytics and reporting for customer journeys, personalization opportunities, proactive actions, text mining, action prediction, fraud detection, text/sentiment classification, collaborative filtering/recommendation, and/or signal detection. This position applies these skills to some of the most exciting and massive health data opportunities that exist here at Evernorth.

The ideal candidate will work in a team environment that demands technical excellence, whose members are expected to hold each other accountable for the overall success of the end product. The focus for this team is on delivering innovative solutions to complex problems, with a mind to drive simplicity in the refinement and support of the solution by others.

Job Description & Responsibilities:
- Be accountable for delivery of business functionality.
- Work on the AWS cloud to migrate/re-engineer data and applications from on-premises to cloud.
- Engineer solutions conformant to enterprise standards, architecture, and technologies.
- Provide technical expertise through a hands-on approach, developing solutions that automate testing between systems.
- Perform peer code reviews, merge requests, and production releases.
- Implement design/functionality using Agile principles.
- Demonstrate a proven track record of quality software development and an ability to innovate outside of traditional architecture/software patterns when needed.
- Collaborate in a high-performing team environment, with an ability to influence and be influenced by others.
- Bring a quality mindset, not just for code quality but also for ongoing data quality, monitoring data to identify problems before they have business impact.
- Be entrepreneurial and business minded; ask smart questions, take risks, and champion new ideas.
- Take ownership and accountability.

Experience Required:
- 8 - 11 years of experience in application program development

Experience Desired:
- Knowledge and/or experience with healthcare information domains.
- Documented experience in a business intelligence or analytic development role on a variety of large-scale projects.
- Documented experience working with databases larger than 5TB and excellent data analysis skills.
- Experience with TDD/BDD.
- Experience working with Spark and real-time analytic frameworks.

Education and Training Required:
- Bachelor's degree in Engineering or Computer Science

Primary Skills:
- Python, Databricks, Teradata, SQL, UNIX, ETL, data structures, Looker, Tableau, Git, Jenkins, RESTful & GraphQL APIs
- AWS services such as Glue, EMR, Lambda, Step Functions, CloudTrail, CloudWatch, SNS, SQS, S3, VPC, EC2, RDS, IAM

Additional Skills:
- Ability to rapidly prototype and storyboard/wireframe as part of application design.
- Write referenceable and modular code.
- Willingness to continuously learn and share learnings with others.
- Ability to communicate design processes, ideas, and solutions clearly and effectively to teams and clients.
- Ability to manipulate and transform large datasets efficiently.
- Excellent troubleshooting skills to root-cause complex issues.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
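Because the role centers on Python/Spark ETL for healthcare analytics, here is a minimal PySpark sketch of the read-transform-validate-write flow the responsibilities describe. The paths, table columns, and thresholds are hypothetical, used only to make the pattern concrete.

```python
# Minimal PySpark sketch (hypothetical paths and columns): one ETL step with a
# basic data-quality gate, illustrating read -> transform -> validate -> write.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-etl-sketch").getOrCreate()

# Extract: raw claim events (path is illustrative)
claims = spark.read.parquet("s3://example-bucket/raw/claims/")

# Transform: normalize a date column and derive a simple flag
curated = (
    claims
    .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
    .withColumn("is_high_cost", F.col("billed_amount") > 10000)
)

# Data-quality gate: refuse to publish if required keys are missing
null_keys = curated.filter(F.col("member_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"{null_keys} rows missing member_id; aborting publish")

# Load: write a partitioned curated layer (path is illustrative)
curated.write.mode("overwrite").partitionBy("service_date").parquet(
    "s3://example-bucket/curated/claims/"
)
```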

Posted 3 weeks ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Company Description
Infocus Technologies Pvt Ltd, based in Kolkata, is a premier SAP, ERP, and Cloud consulting firm. As a Gold partner of SAP in Eastern India and an AWS Standard consulting partner, we specialize in SAP consulting, resourcing, and training services. Infocus is ISO 9001:2008 DNV certified and CMMI Level 3 certified, providing services including SAP ECC & S4 HANA implementation, migration, version upgrades, and Enterprise Application Integration (EAI) solutions. Our expertise extends to AWS Cloud services, helping clients migrate and host SAP infrastructure on the cloud.

Role Description
This is a full-time on-site role for an IT Recruiter located in Kolkata. The IT Recruiter will be responsible for managing the full life cycle of recruitment, including sourcing, screening, interviewing, and hiring candidates for technical positions. The recruiter will develop and implement effective recruiting strategies, manage job postings, coordinate with hiring managers, and ensure the recruitment process aligns with the company's objectives and standards.

Qualifications
- MBA in HR
- B.Tech

Location: Kolkata

Posted 3 weeks ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description

Role Proficiency:
Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' developmental activities.

Outcomes:
- Interpret the application/feature/component design and develop it in accordance with specifications.
- Code, debug, test, document, and communicate product/component/feature development stages.
- Validate results with user representatives; integrate and commission the overall solution.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions.
- Optimise efficiency, cost, and quality.
- Influence and improve customer satisfaction.
- Set FAST goals for self/team; provide feedback on team members' FAST goals.

Measures of Outcomes:
- Adherence to engineering process and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during the execution of the project
- Number of defects in the code
- Number of defects post delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected:
- Code: Code as per design; follow coding standards, templates, and checklists; review code for team and peers.
- Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, and requirements test cases/results.
- Configure: Define and govern the configuration management plan; ensure compliance from the team.
- Test: Review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
- Domain Relevance: Advise software developers on design and development of features and components with a deep understanding of the business problem being addressed for the client; learn more about the customer domain, identifying opportunities to provide valuable additions to customers; complete relevant domain certifications.
- Manage Project: Manage delivery of modules and/or manage user stories.
- Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
- Estimate: Create and provide input for effort estimation for projects.
- Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review the reusable documents created by the team.
- Release: Execute and monitor the release process.
- Design: Contribute to creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models.
- Interface with Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos.
- Manage Team: Set FAST goals and provide feedback; understand aspirations of team members and provide guidance, opportunities, etc.; ensure the team is engaged in the project.
- Certifications: Take relevant domain/technology certifications.

Skill Examples:
- Explain and communicate the design/development to the customer
- Perform and evaluate test results against product specifications
- Break down complex problems into logical components
- Develop user interfaces and business software components
- Use data models
- Estimate time and effort required for developing/debugging features/components
- Perform and evaluate tests in the customer or target environment
- Make quick decisions on technical/project-related challenges
- Manage a team, mentor, and handle people-related issues in the team
- Maintain high motivation levels and positive dynamics in the team
- Interface with other teams, designers, and other parallel practices
- Set goals for self and team; provide feedback to team members
- Create and articulate impactful technical presentations
- Follow a high level of business etiquette in emails and other business communication
- Drive conference calls with customers, addressing customer questions
- Proactively ask for and offer help
- Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks
- Build confidence with customers by meeting deliverables on time with quality
- Estimate the time, effort, and resources required for developing/debugging features/components
- Make appropriate utilization of software/hardware
- Strong analytical and problem-solving abilities

Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical designing
- Programming languages - proficient in multiple skill clusters
- DBMS
- Operating systems and software platforms
- Software Development Life Cycle
- Agile - Scrum or Kanban methods
- Integrated development environments (IDE)
- Rapid application development (RAD)
- Modelling technologies and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved

Additional Comments:
We are seeking a highly skilled Senior AWS Cloud Engineer with a strong background in cloud migration projects. The ideal candidate will have hands-on experience in designing, implementing, and managing cloud infrastructure on AWS, with a proven track record of migrating legacy systems to the cloud. Experience with mainframe systems is a strong plus.

Key Responsibilities:
- Lead and execute end-to-end cloud migration projects from on-premises or legacy systems to AWS.
- Design and implement scalable, secure, and highly available cloud architectures.
- Collaborate with development teams to modernize and migrate .NET/mainframe-based applications to AWS.
- Develop and maintain Infrastructure as Code (IaC) using tools like CloudFormation or Terraform.
- Optimize cloud usage and costs through automation and monitoring.
- Provide technical leadership and mentorship to junior engineers.
- Work closely with stakeholders to gather requirements and translate them into technical solutions.
- (Optional) Interface with mainframe systems during migration or integration phases.

Required Skills & Qualifications:
- Hands-on experience with AWS services (EC2, S3, Lambda, RDS, VPC, IAM, SQS, SNS, etc.).
- Strong AWS experience, including EC2, S3, EKS, Lambda, Glue, and CloudWatch for logging and monitoring.
- Intermediate experience with Terraform for infrastructure automation.
- Experience with AWS MWAA, AWS Aurora Postgres, and EMR Serverless for data pipeline management.
- Proven experience in cloud migration projects, including rehosting, replatforming, and refactoring.
- Strong understanding of DevOps practices, CI/CD pipelines, and automation tools (e.g., GitHub Actions, Jenkins, CodePipeline).
- Experience with containerization (Docker, ECS, EKS) and serverless architectures.
- Proficiency in scripting languages like PowerShell, Python, or Bash.
- Familiarity with monitoring and logging tools (CloudWatch, ELK, Datadog).
- Excellent communication and problem-solving skills.

Skills: AWS Services, EKS, Terraform, AWS EMR
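Since the role combines AWS migration work with Python scripting, here is a minimal boto3 sketch of the kind of discovery script migration waves often start with. The region and the "MigrationWave" tag key are assumptions for illustration.

```python
# Minimal boto3 sketch (region and the "MigrationWave" tag key are assumptions):
# inventory EC2 instances and their wave tag as a starting point for migration discovery.
import boto3

def list_instances(region: str = "ap-south-1"):
    ec2 = boto3.client("ec2", region_name=region)
    paginator = ec2.get_paginator("describe_instances")
    rows = []
    for page in paginator.paginate():
        for reservation in page["Reservations"]:
            for inst in reservation["Instances"]:
                tags = {t["Key"]: t["Value"] for t in inst.get("Tags", [])}
                rows.append({
                    "id": inst["InstanceId"],
                    "type": inst["InstanceType"],
                    "state": inst["State"]["Name"],
                    "wave": tags.get("MigrationWave", "unassigned"),
                })
    return rows

if __name__ == "__main__":
    for row in list_instances():
        print(row)
```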

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Business Agility
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Need a Databricks resource with Azure cloud experience.
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with data architects and analysts to design scalable data solutions.
- Implement best practices for data governance and security throughout the data lifecycle.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with Business Agility.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud platforms and services related to data storage and processing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full-time education is required.
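As the role is about Databricks pipelines that migrate and deploy data across systems, here is a minimal sketch of an incremental upsert from a staging dataset into a curated Delta table. It assumes a Databricks/Delta Lake runtime where the delta tables API is available; the paths and the "customer_id" join key are hypothetical.

```python
# Minimal Delta upsert sketch (paths and the "customer_id" key are hypothetical;
# assumes a Databricks/Delta runtime where delta.tables is available).
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Incoming batch landed by an upstream ingestion job (illustrative path)
updates = spark.read.parquet("/mnt/staging/customers_batch/")

# Target curated Delta table (illustrative path)
target = DeltaTable.forPath(spark, "/mnt/curated/customers")

# Merge: update existing customers, insert new ones
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```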

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Greater Hyderabad Area

On-site

Company Description
Orghire is a leading staffing and recruitment partner offering customized hiring solutions across multiple industries, including IT, finance, healthcare, engineering, and support. Our mission is to connect businesses with top-tier talent and help individuals discover meaningful career paths. We simplify the hiring process by sourcing and delivering highly qualified candidates, boosting efficiency, and reducing costs. Orghire provides a wide range of services such as permanent recruitment, temporary staffing, RPO, and project-based hiring.

Role Description
This is a full-time on-site role for a .NET Developer with React, located in the Greater Hyderabad Area.

Summary: The Programmer Analyst develops and modifies complex technical computer software and meets project deadlines to ensure satisfactory function of the software.

Duties and Responsibilities:
- Works with Business Systems Analysts and Quality Assurance colleagues to complete assigned user stories that deliver business value
- Remediates any bugs or vulnerabilities introduced into the code
- Provides technical support to programming and operations areas

Basic Qualifications:
- Bachelor's degree
- 5+ years of knowledge and experience with Visual Studio, Visual Studio Code, .NET, and C#
- 5+ years of knowledge and experience with React
- 5+ years of knowledge and/or experience in a Windows environment
- Ability to comprehend and execute the business requirements contained in user stories
- Basic competence with Azure DevOps for both user story management and pull request management

Preferred Qualifications:
- Strong troubleshooting and problem-solving skills
- Ability to understand Webforms pages in order to migrate them to React
- Knowledge/experience with version control practices and version control tools
- Good understanding of IT system administration best practices
- Good understanding of SQL databases and writing queries
- Experience working on projects that involve business segments
- Strong interpersonal skills, a focus on customer service, and the ability to work well with other IT, vendor, and business groups
- Excellent communication, design, documentation, analytical, and SDLC skills
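The qualifications above call out SQL databases and query writing; as a minimal sketch of a parameterized query, here is a Python/pyodbc example (Python is used purely for brevity, the team's stack is .NET/C#, where the same pattern applies via SqlCommand parameters). The connection string, table, and column names are hypothetical.

```python
# Minimal parameterized-query sketch (connection string, table, and columns are
# hypothetical); shown in Python/pyodbc for brevity.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql.example.internal;DATABASE=OrdersDb;Trusted_Connection=yes;"
)

def open_orders_for_customer(customer_id: int):
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        # Parameter markers (?) keep the query safe from SQL injection
        cursor.execute(
            "SELECT OrderId, OrderDate, Total "
            "FROM dbo.Orders WHERE CustomerId = ? AND Status = 'Open'",
            customer_id,
        )
        return cursor.fetchall()

if __name__ == "__main__":
    for row in open_orders_for_customer(1001):
        print(row.OrderId, row.OrderDate, row.Total)
```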

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Greater Hyderabad Area

On-site

Trianz is a leading-edge technology platforms and services company that accelerates digital transformations at Fortune 100 and emerging companies worldwide in data & analytics, digital experiences, cloud infrastructure, and security. The company has developed a disruptive "IP Led Transformations" vision, strategy, and business model over the past 3 years. Some of the company's IP was recently acquired by AWS, and its overall business model has taken off sharply in 2024. Trianz is led by Sri Manchala, a former special forces officer from the Indian army and author of Crossing the Digital Faultline | Trianz, and a team of veterans from well-known firms such as Deloitte, HCL, KPMG, Wipro, Microsoft, TATA, AWS, GE, etc.

About Trianz: Trianz believes that companies around the world face three challenges in their digital transformation journeys: shrinking 'time to transform' due to competition and AI, lack of digital-ready talent, and uncertain economic conditions. To help clients leapfrog over these challenges, Trianz has built IP and platforms that have transformed the adoption of the cloud, data, analytics & insights, and AI. Specifically, the following Trianz platforms are changing the way companies approach transformations in various disciplines:

Concierto: A fully automated platform to Migrate, Manage, and Maximize the multi and hybrid cloud. A zero-code SaaS platform, Concierto allows teams to migrate to AWS, Azure and GCP and manage them efficiently from a single pane of glass. Visit www.concierto.cloud for more information.

Avrio Data to AI Platform: Avrio is a Data to AI SaaS platform designed to drive data-led transformation at lightning speed. Through conversational AI, organizations seamlessly engage with all their data, unlocking real-time insights and uncovering hidden opportunities and risks, all within one powerful platform. Visit www.avriodata.ai to know more.

Pulse: Recognizing that workforces will be distributed, mobile, and fluid, Trianz has built a 'future of work' digital workplace platform called Pulse. Visit www.trianz.com/Pulse

Since the market launch of this strategy in mid-2023, Trianz has experienced enormous growth, success, and recognition. Some of Trianz's built IP in data and analytics was acquired by Amazon. Since then, Trianz has been made an engineering partner of Amazon for building/supporting connected ecosystems across multiple AWS platforms. Most recently, Trianz and AWS have signed a strategic collaboration agreement within which the two companies will work on joint roadmaps/solutions for the cloud; AWS will buy Trianz | Concierto in bulk for AWS partners to use for migrations; AWS will also recommend Concierto to their MSPs; and finally, AWS Professional Services and Trianz have signed an agreement for joint solutioning and customer delivery. Read more: Trianz enters into a Strategic Collaboration Agreement with AWS to Revolutionize Cloud Adoption and Management (yahoo.com)

Given all this, Trianz is experiencing significant demand for its SW platforms and consequent growth. To support this growth, Trianz has recently raised private equity capital to scale the company over the next several years (Trianz Announces Strategic Growth Capital Investment by Capital Square Partners, prnewswire.com). It is now bolstering its senior and mid-level leadership with top talent across GTM, Engineering, Services, and Partnership organizations.
We are seeking leaders driven by our purpose: to help customers accelerate digital transformations and build the next-generation software and services organization.

Trianz | Accelerating Digital Evolution
Leaders in Product Engineering, Data & Analytics Consulting, APPS & Experience Consulting, Hybrid Cloud Consulting, IT infrastructure services, managed services and IT security consulting.

Job Role: Senior .NET Full-Stack Architect

Key Responsibilities:
- Design and develop scalable, high-performance, and secure web applications using .NET, Angular 12+, and SQL Server.
- Collaborate with cross-functional teams to gather requirements, define system architecture, and oversee the development lifecycle.
- Implement and optimize RESTful APIs to support the front-end applications.
- Develop and maintain complex stored procedures and database schemas to support the application's data needs.
- Mentor and guide junior developers, ensuring adherence to best practices and coding standards.
- Stay up to date with the latest technologies, frameworks, and industry trends in the .NET ecosystem.
- Actively participate in code reviews, architectural discussions, and process improvements.

Qualifications:
- 10-14 years of experience in .NET development, with a strong understanding of Angular 8+ and SQL Server.
- Proficient in designing and implementing RESTful APIs and working with stored procedures.
- Extensive experience in developing and maintaining web applications using .NET, Angular, and SQL Server.
- Proven track record of leading and mentoring cross-functional teams in software development projects.
- Excellent problem-solving and critical-thinking skills, with the ability to think strategically and implement creative solutions.
- Strong communication and collaboration skills, with the ability to work effectively with both technical and non-technical stakeholders.
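The responsibilities emphasize stored procedures behind the application's data layer; as a minimal illustration of invoking one, here is a sketch using the ODBC call escape syntax. Python/pyodbc is used for brevity (the production code would be .NET, where CommandType.StoredProcedure serves the same purpose); the procedure name, parameters, and connection string are hypothetical.

```python
# Minimal stored-procedure call sketch (procedure name, parameters, and
# connection string are hypothetical); {CALL ...} is the ODBC escape sequence.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql.example.internal;DATABASE=AppDb;Trusted_Connection=yes;"
)

def monthly_revenue(year: int, month: int):
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute("{CALL dbo.usp_MonthlyRevenue (?, ?)}", year, month)
        return cursor.fetchall()

if __name__ == "__main__":
    for row in monthly_revenue(2024, 6):
        print(list(row))
```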

Posted 3 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About VOIS
VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

About VOIS India
In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations and more.

Core Competencies
- Excellent knowledge of EKS, Kubernetes, and related AWS components.
- Kubernetes networking.
- Kubernetes DevOps, including deployment of a Kubernetes EKS cluster using IaC (Terraform) and CI/CD pipelines.
- EKS secret management, autoscaling, and lifecycle management.
- EKS security using AWS native services.
- Excellent understanding of AWS cloud services such as VPC, EC2, ECS, S3, EBS, ELB, Elastic IPs, Security Groups, etc.
- AWS component deployment using Terraform.
- Application onboarding on Kubernetes using Argo CD.
- AWS CodePipeline, CodeBuild, CodeCommit.
- HashiCorp stack, HashiCorp Packer.
- Bitbucket and Git.
- Profound cloud technology, network, security and platform expertise (AWS, Google Cloud or Azure).
- Good documentation and communication skills.
- Good understanding of ELK, CloudWatch, Datadog.

Roles & Responsibilities
- Manage project-driven integration and day-to-day administration of cloud solutions.
- Develop prototypes, and design and build modules and solutions for cloud platforms in iterative agile cycles; develop, maintain, and optimize the business outcome.
- Conduct peer reviews and maintain coding standards.
- Drive automation using CI/CD with Jenkins or Argo CD.
- Drive cloud solution automation and integration activity for the cloud provider (AWS) and tenant (project) workloads.
- Build and deploy AWS cloud infrastructure using CloudFormation and Terraform scripts.
- Use Ansible and Python to perform routine tasks like user management, security hardening, etc.
- Provide professional technical consultancy to migrate and transform existing on-premises applications to the public cloud, and support all Cloud-related programmes and existing environments.
- Design and deploy a Direct Connect network between AWS and the datacentre.
- Train and develop AWS expertise within the organisation.
- Proven troubleshooting skills to resolve issues related to cloud network, storage and performance management.

VOIS Equal Opportunity Employer Commitment
VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics.

As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!
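Since the role centers on EKS administration with Python and AWS-native tooling, here is a minimal boto3 sketch of the kind of routine check an admin script might run against a cluster. The cluster name and region are hypothetical.

```python
# Minimal EKS status check with boto3 (cluster name and region are hypothetical).
import boto3

def eks_cluster_summary(name: str = "platform-eks", region: str = "eu-west-1") -> dict:
    eks = boto3.client("eks", region_name=region)
    cluster = eks.describe_cluster(name=name)["cluster"]
    return {
        "name": cluster["name"],
        "status": cluster["status"],    # e.g. ACTIVE, CREATING, FAILED
        "version": cluster["version"],  # Kubernetes version
        "endpoint": cluster["endpoint"],
        "logging_enabled": any(
            t["enabled"] for t in cluster["logging"]["clusterLogging"]
        ),
    }

if __name__ == "__main__":
    print(eks_cluster_summary())
```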

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Key Responsibilities:
- Develop AWS serverless applications using .NET and Angular
- Design, build, test, and maintain REST APIs and services
- Migrate legacy apps to modern serverless architecture
- Embrace SOLID principles and clean coding practices
- Collaborate with architects, testers, and DevOps for seamless delivery
- Mentor junior team members and lead technical discussions
- Contribute to CI/CD pipelines and participate in code reviews

✅ What We're Looking For:
- Bachelor's degree in computer science or equivalent experience
- 3+ years in serverless app development using C# on AWS
- Strong experience in: .NET Core / C#, Angular, SQL Server, Entity Framework, REST APIs, JSON, Bootstrap
- Solid understanding of SOLID principles and design patterns

⭐ Good to Have:
- 10+ years in the Microsoft tech stack
- Experience with Azure DevOps, Git, JIRA
- Familiarity with TDD, Agile/Scrum, and CI/CD processes

📩 Interested candidates can apply or share their CV at: sameerafazil@yitroglobal.com
🕐 Immediate joiners or candidates with short notice are highly preferred.
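As the role builds serverless REST APIs on AWS, here is a minimal Lambda handler sketch in the shape API Gateway proxy integrations expect. It is shown in Python for brevity (the production code here would be C#/.NET), and the route and payload fields are hypothetical.

```python
# Minimal Lambda handler sketch in the API Gateway proxy-integration shape
# (route and response fields are hypothetical).
import json

def handler(event, context):
    # API Gateway proxy integration passes path parameters under "pathParameters"
    order_id = (event.get("pathParameters") or {}).get("orderId", "unknown")
    body = {"orderId": order_id, "status": "PROCESSING"}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

if __name__ == "__main__":
    # Local smoke test with a fake API Gateway event
    fake_event = {"pathParameters": {"orderId": "ord-123"}}
    print(handler(fake_event, None))
```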

Posted 3 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Greetings from MSRcosmos!!

We are hiring for Desktop Engineers.
Experience: 2-5 years
Work Location: Hyderabad (work from office)
Notice Period: Immediate
Mode of Interview: F2F only
Shift Timings: Rotational shifts
Languages: English, Hindi, Telugu

Key Responsibilities:

Desktop Support & OS Management
- Install, configure, and maintain operating systems (Windows & Linux) on desktops and laptops.
- Provide first-level support for hardware, software, and peripheral device issues.
- Set up and configure user accounts, email clients (IMAP, POP3, SMTP), and user profiles.
- Install and configure common developer and enterprise software.
- Configure and troubleshoot network and local printers, print queues, and sharing permissions.

Network & System Troubleshooting
- Diagnose and resolve LAN issues including IP conflicts, switch/router issues, DNS, and DHCP problems.
- Use network tools such as ping, tracert, ipconfig, and netstat for troubleshooting.
- Monitor and maintain local area networks and ensure optimal performance.

Virtual Infrastructure (VMware)
- Deploy, clone, migrate, and manage VMs within vCenter.
- Monitor resource usage (CPU, memory, disk, network) of VMs and ESXi hosts.
- Manage snapshots, VM templates, and customization specs.
- Troubleshoot VM performance issues like CPU Ready, memory ballooning, and disk latency.

Backup and Restore (Veeam)
- Configure and manage Veeam Backup & Replication jobs for vCenter virtual machines.
- Verify backup integrity and perform restore operations as required.

System & Asset Management
- Manage disk partitions, formatting, and mounting on Windows and Linux systems.
- Maintain asset inventory including laptops, desktops, monitors, and peripherals.
- Configure and monitor biometric devices and maintain logs.

DevOps & Database Support (Basic Level)
- Install and configure DevOps agents on Windows/Linux systems.
- Create and manage DevOps projects and assign roles to Active Directory (AD) users.
- Install and configure basic MySQL/PostgreSQL databases on Windows/Linux platforms.

Application Support
- Working knowledge of installing Python, Java, Node.js, Apache, React, Angular, Docker, NPM, etc.
- Setting the Java path.
- Installing MS Office.

Interested candidates can share resumes to supraja@msr-it.com
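Since the role leans on basic network diagnostics (ping, DNS, tracert), here is a minimal Python sketch of the kind of quick reachability check a desktop engineer might script. The target host list is illustrative only.

```python
# Minimal reachability-check sketch (hosts are illustrative): wraps the system
# ping command and a DNS lookup for a quick first-level diagnosis.
import platform
import socket
import subprocess

HOSTS = ["8.8.8.8", "intranet.example.local"]  # illustrative targets

def can_ping(host: str) -> bool:
    # Windows uses -n for count; Linux/macOS use -c
    count_flag = "-n" if platform.system().lower() == "windows" else "-c"
    result = subprocess.run(
        ["ping", count_flag, "2", host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def resolves(host: str) -> bool:
    try:
        socket.gethostbyname(host)
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    for host in HOSTS:
        print(f"{host}: DNS={'ok' if resolves(host) else 'fail'}, "
              f"ping={'ok' if can_ping(host) else 'fail'}")
```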

Posted 3 weeks ago

Apply

15.0 years

0 Lacs

Satara, Maharashtra, India

On-site

Join us as a Supply Chain Manager and Planning lead in Satara, Maharashtra, to be responsible for overseeing the planning and execution of supply chain activities, including demand forecasting, procurement, order handling and alignment with the inventory management team, warehouse activities, and ensuring a high performance of our end-to-end Customer Service.

About The Job
At Alfa Laval, we always go that extra mile to overcome the toughest challenges. Our driving force is to accelerate success for our customers, people and planet. You can only achieve that by having dedicated people with a curious mind. Curiosity is the spark behind great ideas, and great ideas drive progress. As a member of our team, you thrive in a truly diverse and inclusive workplace based on care and empowerment. You are here to make a difference, constantly building bridges to the future with sustainable solutions that have an impact on our planet's most urgent problems, making the world a better place every day.

About The Position
Local Assembly Satara is one of 7 assembly supply sites in the world within the Product Group Gasketed Plate Heat Exchanger (GPHE). From Satara we supply our whole range of GPHE. We are now looking for a Unit Manager for the Supply Chain process for the Local Assembly Site in Satara. As UM Supply Chain you are responsible for overseeing the planning and execution of supply chain activities, including demand forecasting, procurement, order handling and alignment with the inventory management team, warehouse activities, and ensuring a high performance of our end-to-end Customer Service. You are responsible for securing the team's daily performance as well as team improvements and competence. You ensure that the group is working towards set targets and following our processes. Your role is to make sure that you, together with the team, work in an efficient way according to our business principles and requirements, adding value to our customers.

The role is situated in Satara, Maharashtra, India, and you'll report to the Factory Manager. You will be part of the Local Assembly Satara Management Team. You'll work in close collaboration with the rest of the organization on end-to-end improvements and drive our factory to meet future requirements.

The Local Assembly Factory in Satara is in a major program for setting new standards for customer service, with a high level of MRP system integration and implementation of Lean concepts in our Supply Chain, where we are developing the methods of working and serving our production lines. During 2024 we will focus on increasing the capabilities in our processes to be able to deliver 50% more products with shorter lead times and prepare ourselves for further volume increases the year after. We are also preparing to migrate to a new MRP system within a couple of years. An automated order flow process will require closeness to our markets. We will seek to understand needs to increase our service level by building a Lean, flow-based supply chain organization. Our assembly lines in the factory will run as Lean lines, and supply chain processes are to be managed in accordance with the same principle (One Piece Flow). You will have a key role in these projects.

Who Are You?
We believe you are a natural leader with a clear sense of urgency, clear values, and integrity. Safety is our top priority, and we expect that from you as well. With a strategic mindset and a can-do attitude, you act on our strategies within the Business Unit and create results according to set goals. You communicate in an inclusive and engaging way and believe that results and behavior are equally important. We are looking for a leader who wants to drive and handle change at Gemba. You have the courage to think differently, seeing opportunities rather than problems. Through support, attendance, and genuine interest in people, you help your employees grow in their roles.

What You Know
We believe that you have a Bachelor's degree in Mechanical or Production Engineering, Supply Chain Management, Business Administration or a related field, or the relevant work experience. A master's degree is a plus. You possess 15+ years of experience, with at least 5-7 years' experience in supply chain management for worldwide industrial products, although we may consider other backgrounds and will put strong and healthy leadership as our priority. You have a proven track record of successfully leading and transforming supply chain operations, preferably at a managerial level. You need to have expertise in the implementation, improvement, and management of the S&OP process with a proven high-business-impact track record. You are both operative, to ensure the daily deliveries, and strategic, to drive continuous improvements. Experience in manufacturing transformation and implementing new concepts is an advantage. You are fluent in English, both verbal and written. Knowledge of a second language is a plus.

Responsibilities
You have the responsibility for a team consisting of 15 young, energetic and dynamic colleagues that are waiting to reveal their full potential in developing our business and processes and supporting our customers.

Physical & Environmental Factors
Office environment with frequent attendance on the shop floor. Safety equipment required when present on the shop floor: footwear, hearing, eyewear. Environmental factors (hazardous materials, work location, work surfaces, exposure).

Why Should You Apply
We offer you an interesting and challenging position in an open and friendly environment where we help each other to develop and create value for our customers. It is an exciting place to build a global network with different nationalities. Your work will have a true impact on Alfa Laval's future success, and you will be learning new things every day.

"We care about diversity, inclusion and equity in our recruitment processes. We also believe behavioural traits can provide important insights into a candidate's fit for a role. To help us achieve this we apply Pymetrics assessments, and upon application you will be invited to play the assessment games."

Posted 3 weeks ago

Apply