Konnecting Tree

7 Job openings at Konnecting Tree
SAP SD Consultant | Chennai | 6-11 years | INR 8.4 - 12.0 Lacs P.A. | Work from Office | Full Time

Direct USA-based client (contract): SAP SD with HANA & Contract Management Consultant
Location: Chennai (Hybrid - 3 days onsite per week)
Duration: 12+ months (contract)
Working hours: 11 am - 3 pm IST at the office; 7 pm - 11 pm IST work from home

SAP SD Consultant (Architect) | Chennai | 10-15 years | INR 24.0 - 30.0 Lacs P.A. | Work from Office | Full Time

Responsibilities:
* Lead SAP SD implementations from planning to go-live.
* Collaborate with cross-functional teams on RAR design and configuration.
* Design and architect SAP SD solutions using best practices.

SAP EDI Consultant | Chennai | 7-12 years | INR 18.0 - 24.0 Lacs P.A. | Work from Office | Full Time

Responsibilities:
* Collaborate with cross-functional teams on project delivery.
* Provide technical support and training to clients.
* Design, implement, and maintain SAP EDI solutions.

SAP PI/PO Developer | Chennai | 9-14 years | INR 18.0 - 24.0 Lacs P.A. | Work from Office | Full Time

Responsibilities:
* Collaborate with cross-functional teams on project delivery.
* Ensure data integrity through quality assurance processes.
* Design, develop, test & maintain SAP PI/PO solutions.

SAP PI/PO Developer | Chennai | 9-14 years | INR 18.0 - 24.0 Lacs P.A. | Work from Office | Full Time

Responsibilities:
* Collaborate with cross-functional teams on project deliverables.
* Design, develop, test & maintain SAP PI/PO solutions.
* Ensure data security compliance through proper access controls.

Data Engineer | Noida | 6-10 years | INR 12.0 - 15.0 Lacs P.A. | Work from Office | Full Time

Responsibilities:
* Design, develop & maintain data pipelines using Spark, Python & Azure Databricks.
* Collaborate with cross-functional teams on project requirements & deliverables.

Data Engineer | Chennai | 8-12 years | INR 12.0 - 13.32 Lacs P.A. | Work from Office | Full Time

Responsibilities:
* Design, develop, and maintain data pipelines using Databricks, PySpark, and Python.
* Optimize database performance through data modeling and query optimization.