Job Description
How you will contribute:
Integration Architecture Ownership:
- Take ownership of the end-to-end integration architecture across all planning tracks (Demand, Supply, etc.)
- Design and maintain the overall integration strategy, ensuring scalability, reliability, and security.
- Oversee inbound and outbound data transformations and orchestration processes.
Decision Support & Guidance:
- Support decision-making related to integration disposition, data transformations, and performance assessments.
- Provide guidance and recommendations on integration approaches, technologies, and best practices.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
Inter-Tenant Data Transfer Design:
- Design and implement secure and efficient inter-tenant data transfer mechanisms.
- Ensure data integrity and consistency across different o9 environments.
Team Guidance & Mentoring:
- Provide technical guidance and mentorship to junior team members on building and maintaining interfaces.
- Share best practices for integration development, testing, and deployment.
- Conduct code reviews and ensure adherence to coding standards.
CI/CD Implementation:
- Design and implement a robust CI/CD pipeline for integration deployments.
- Automate integration testing and deployment processes to ensure rapid and reliable releases.
Batch Orchestration Design:
- Design and implement batch orchestration processes for all planning tracks.
- Optimize batch processing schedules to minimize processing time and resource utilization.
Technical Leadership & Implementation:
- Serve as a technical leader and subject matter expert on o9 integration.
- Lead and participate in the implementation of end-to-end SCM solutions.
- Provide hands-on support for troubleshooting and resolving integration issues.
Qualifications:
Experience:
- Delivered at least 2-3 comprehensive end-to-end SCM product implementations as a Technical Architect.
- At least 8 years of experience in SDLC with a key emphasis on architecting, designing, and developing solutions using big data technologies.
Technical Skills:
- Proficiency in Python, PySpark, and SQL, plus experience building SSIS packages.
- Experience with workflow management tools such as Airflow and SSIS.
- Experience with cloud infrastructure such as Amazon Web Services (AWS), Azure, or Google Cloud preferred.
- Experience working with Parquet, JSON, RESTful APIs, HDFS, and Delta Lake, and with query frameworks such as Hive and Presto.
- Deep understanding and hands-on experience writing orchestration workflows and/or API code (knowledge of Apache NiFi is a plus).
- Strong hands-on expertise in building scalable interfaces, performance tuning, data cleansing, and validation strategies.
- Experience working with version control platforms (e.g., GitHub, Azure DevOps).
- Experience with Delta Lake and PySpark is a must.
Other Skills:
- Good to have experience in Cloud Data Quality, Source Systems Analysis, Business Rules Validation, Source Target Mapping Design, Performance Tuning, and High-Volume Data Loads.
- Familiarity with Agile methodology.
- Proficient in the use of Microsoft Excel/PowerPoint/Visio for analysis and presentation.
- Excellent communication and interpersonal skills.
- Strong problem-solving and analytical abilities.
- Ability to work independently and as part of a team.
- Proactive and results-oriented.
- Ability to thrive in a fast-paced environment.
Work schedule:
- 3 days work from office per week / 2 days work from home
No Relocation support available
Job Type: Regular
Software & Applications
Technology & Digital