
96 Integration Tools Jobs - Page 2

Set up a Job Alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

You will be working with HCL Software, a product development division of HCL Tech, to deliver software solutions that cater to the transformative needs of clients worldwide. The software developed by HCL Software spans the AI, Automation, Data & Analytics, Security, and Cloud domains, and has received accolades for its innovation and quality. Your primary focus will be on the Unica+ Marketing Platform, a product that empowers clients to execute precise, high-performance marketing campaigns across channels such as social media, AdTech platforms, mobile applications, and websites. Driven by data and AI, the platform enables clients to create hyper-personalized offers and messages for customer acquisition, product awareness, and retention.

As a Senior & Lead Python Developer specializing in Data Science and AI/ML, you are expected to leverage your 8+ years of experience to deliver AI-driven marketing campaigns effectively. Your responsibilities will include Python programming, statistical analysis and modeling, data cleaning and preprocessing, SQL and database management, exploratory data analysis, machine learning algorithms, deep learning frameworks, model evaluation and optimization, and deployment of machine learning models.

To excel in this role, you must have 8-12 years of Python development experience, with at least 4 years dedicated to data science and machine learning. Familiarity with Customer Data Platforms (CDP) such as Treasure Data, Epsilon, Tealium, Adobe, Salesforce, and AWS SageMaker will be advantageous. Proficiency with integration tools and frameworks such as Postman, Swagger, and API gateways is desired, along with expertise in REST, JSON, XML, and SOAP. A degree in Computer Science or IT is a prerequisite. Excellent communication and interpersonal skills are essential, as you will be collaborating within an agile team environment, and your ability to work effectively with others and apply agile methodologies will be crucial for success.

The role may require approximately 30% travel, and the preferred location is Pune, India. If you meet the qualifications and possess the necessary skills, we invite you to join our dynamic team at HCL Software to contribute to cutting-edge software solutions and drive innovation in data science and AI/ML.
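The data cleaning and preprocessing duties listed above can be sketched in a few lines. This is a minimal, illustrative example with invented field names and rules, not HCL's actual pipeline: drop records missing a key, coerce types, default missing values, then scale a numeric column before modeling.

```python
# Illustrative data cleaning/preprocessing sketch; all field names and
# rules are hypothetical examples, not any real marketing dataset.

def clean_records(records):
    """Drop rows missing a customer id, coerce age to int, default spend to 0."""
    cleaned = []
    for r in records:
        if not r.get("customer_id"):
            continue  # drop records without a key
        cleaned.append({
            "customer_id": r["customer_id"],
            "age": int(r["age"]) if r.get("age") not in (None, "") else None,
            "spend": float(r.get("spend") or 0.0),
        })
    return cleaned

def min_max_scale(values):
    """Scale numeric values to [0, 1], a common step before ML training."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

raw = [
    {"customer_id": "C1", "age": "34", "spend": "120.5"},
    {"customer_id": "", "age": "29", "spend": "80"},     # dropped: no id
    {"customer_id": "C2", "age": "", "spend": None},     # defaults applied
]
rows = clean_records(raw)
scaled = min_max_scale([r["spend"] for r in rows])
print(rows)
print(scaled)  # [1.0, 0.0]
```

In practice this would be done with pandas or scikit-learn transformers, but the logic is the same.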

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

As a Software Engineering Technical Specialist at Kyndryl CIO, you will play a crucial role in the Hire to Retire portfolio by designing, implementing, and maintaining integrations between Workday and other systems. Your key responsibilities will include developing integration solutions that meet business needs, creating data mapping and transformation rules, utilizing RESTful and SOAP APIs, performing comprehensive testing of integrations, troubleshooting and resolving integration issues, and ensuring compliance with data security standards and organizational policies. You will collaborate closely with functional consultants, IT teams, and business stakeholders to gather requirements, understand integration needs, and ensure successful implementation. You will also manage and implement changes to integrations, monitor their performance, and ensure they are well designed, reliable, and aligned with business processes and goals.

To excel in this role, you must have a minimum of 6 years of experience with Workday integration development, maintenance, and support. Proficiency with Workday's integration tools and technologies is essential, along with system integration skills, knowledge of programming languages, API integration experience, familiarity with common data formats, database skills, security and authentication knowledge, error handling and debugging abilities, testing and validation experience, and change management skills.

Preferred qualifications include a strong understanding of Workday integrations across multiple modules, Workday Pro certifications, excellent problem-solving skills, strong communication and interpersonal skills, the ability to work collaboratively with cross-functional teams, experience with requirements gathering, testing, validation, end-user training, troubleshooting, and support, knowledge of industry best practices for Workday implementations, the ability to manage multiple projects and priorities simultaneously, and strong analytical and critical thinking skills. If you are a talented individual with a growth mindset, a customer-focused approach, and an inclusive work style, and you possess the technical expertise and professional experience required for this role, we encourage you to apply and join our dynamic team at Kyndryl CIO.
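The "data mapping and transformation rules" this role creates are often expressed declaratively. Here is a hedged sketch of that pattern with invented field names (this is not Workday's actual schema): each rule pairs a target field with a transform function, so new mappings are configuration rather than code.

```python
# Hypothetical declarative field-mapping sketch for moving worker records
# between systems. Field names and rules are illustrative only.

MAPPING = {
    "Worker_ID": ("employee_id", str),                       # coerce to string
    "Legal_Name": ("full_name", str.title),                  # normalize casing
    "Hire_Date": ("start_date", lambda v: v.replace("/", "-")),  # date format
}

def transform(source_record, mapping):
    """Apply (target_field, transform_fn) rules to a source record."""
    out = {}
    for src_field, (dst_field, fn) in mapping.items():
        if src_field in source_record:
            out[dst_field] = fn(source_record[src_field])
    return out

src = {"Worker_ID": 1042, "Legal_Name": "jane doe", "Hire_Date": "2021/03/15"}
result = transform(src, MAPPING)
print(result)
```

Keeping the rules in a table like this makes them easy to review with functional consultants and to test field by field.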

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

You have 8-10+ years of integration experience, specifically involving Oracle Utilities and MDM integration with other enterprise platforms. You have hands-on experience with Oracle MDM and the Oracle Utilities Application Framework, as well as with Oracle SOA and Oracle Integration Cloud. Your expertise includes working with integration tools and middleware such as web services and REST APIs, and you have a good understanding of enterprise architecture.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a highly valued member of the technical solutions team, you will serve as the primary technical point of contact, ensuring the smooth delivery of integrations from the initial stages through go-live. You are required to have a minimum of 5 years of experience in EDI and API integration development. Your responsibilities will include integrating data between various source and target formats, developing and maintaining integrations using integration tools, and handling data formats such as XML, JSON, EDIFACT, X12, raw text files, Excel, CSV, and tagged separated files.

You should have knowledge of EDI protocols and communication methods such as OFTP(2), FTP, SFTP, AS2, HTTP, and HTTPS. Familiarity with SOAP and REST API design, proficiency with API sandboxes, and a strong understanding of mapping and integration tools are also crucial requirements. Experience with integration tools such as Lobster_data, MuleSoft, Boomi, Cleo Integration Cloud, and Jitterbit Harmony is expected. Monitoring and managing integrations for optimal performance, troubleshooting integration issues, and communicating effectively with team members and stakeholders through various channels are also part of the role, as is the ability to work independently and collaboratively in a remote setting.

Hands-on experience with low-code/no-code tools such as Microsoft Power Platform and Azure is preferred. You should be able to produce comprehensive supporting documentation, including requirements, standard document formats, translation and mapping methods, and the communication networks used for sending and receiving documents. Desirable skills include familiarity with Cargowise XML (XUS/XUE/XUT), experience with Altova MapForce, prior experience in the freight forwarding industry, and expertise in building EDI/API integrations with third parties such as Project44, Inttra, Gravity, E2Open, and Wakeo.
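A large part of this work is converting between the formats the posting lists. As a minimal sketch (the column names are invented), here is a CSV-to-JSON conversion using only the standard library, with type coercion applied during the mapping:

```python
import csv
import io
import json

# Illustrative format conversion: CSV payload -> JSON. Column names are
# made up; real EDI/API mappings are driven by partner specifications.

csv_payload = """order_id,qty,sku
1001,5,ABC-1
1002,2,XYZ-9
"""

def csv_to_json(text):
    """Parse CSV rows into dicts, coerce numeric fields, emit JSON."""
    rows = list(csv.DictReader(io.StringIO(text)))
    for r in rows:
        r["qty"] = int(r["qty"])  # coerce during mapping
    return json.dumps(rows)

converted = csv_to_json(csv_payload)
print(converted)
```

Dedicated mapping tools like Lobster_data or MapForce generate this kind of transformation graphically, but the underlying parse-map-serialize flow is the same.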

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

kochi, kerala

On-site

As an SAP Group Reporting & Consolidation Consultant, you will have the opportunity to leverage your expertise in SAP S/4HANA Group Reporting to drive financial consolidation processes. Joining our Finance Transformation team, you will play a pivotal role in shaping real-time, cloud-first reporting environments for global enterprises.

Your responsibilities will include leading the end-to-end implementation of SAP Group Reporting in S/4HANA; designing consolidation structures such as units, hierarchies, rules, and versions; and configuring currency translation, intercompany eliminations, and group journals. You will ensure compliance with IFRS, GAAP, and local statutory standards, manage monthly, quarterly, and annual group close cycles, and deliver real-time insights through SAP Analytics Cloud (SAC). Operating within SAP S/4HANA Public Cloud environments, you will also integrate SAP with tools like LucaNet via flat files or APIs.

We are seeking candidates with a minimum of 6 years of experience in SAP FI/CO, including at least 3 years of hands-on experience in SAP Group Reporting & Consolidation. Your background should demonstrate experience with multi-entity, multi-currency, global finance setups, familiarity with SAP S/4HANA Public Cloud, and knowledge of integration tools and data mapping techniques.

Joining our team means being part of a global effort to drive finance transformation, using modern tools and public cloud technologies to enable smarter decision-making. If you are ready to take on this exciting challenge, apply now or refer someone from your network who would be a perfect fit for this role.

#SAPJobs #FinanceTransformation #SAPGroupReporting #SAPConsolidation #S4HANA #PublicCloud #SAPFI #AnalyticsCloud #HiringNow #FinanceCareers
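Two of the consolidation steps named above, currency translation and intercompany elimination, can be illustrated with a toy calculation. The rates, entity codes, and flag-based elimination below are invented for the sketch; real Group Reporting configuration is far richer (rate types, elimination rules, versions):

```python
# Toy consolidation sketch: translate local amounts to the group currency,
# then drop intercompany entries before summing. All data is invented.

RATES_TO_GROUP = {"EUR": 1.0, "USD": 0.90, "INR": 0.011}  # hypothetical rates to EUR

def translate(amount, currency):
    """Translate a local-currency amount into the group currency."""
    return round(amount * RATES_TO_GROUP[currency], 2)

def eliminate_intercompany(entries):
    """Remove entries flagged as intra-group before consolidation."""
    return [e for e in entries if not e.get("intercompany")]

entries = [
    {"entity": "DE01", "amount": 1000.0, "currency": "EUR", "intercompany": False},
    {"entity": "US01", "amount": 500.0, "currency": "USD", "intercompany": False},
    {"entity": "DE01", "amount": 200.0, "currency": "EUR", "intercompany": True},  # eliminated
]
group_total = sum(translate(e["amount"], e["currency"])
                  for e in eliminate_intercompany(entries))
print(group_total)  # 1000.0 + 450.0 = 1450.0
```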

Posted 2 weeks ago

Apply

8.0 - 15.0 years

0 Lacs

hyderabad, telangana

On-site

As an Oracle OIC Technical Consultant - Principal Consultant, you will be responsible for gathering and understanding detailed technical requirements related to conversions, interfaces, reports, integrations, and custom development. You will need excellent written, verbal, and interpersonal communication skills, along with the ability to deliver under pressure; being self-motivated and adaptable to changing and competing demands is essential for this role.

Your role will involve troubleshooting technical issues with tenacity and working closely with functional teams to optimize and tailor the Oracle Cloud system to the organization's needs. You will deliver technical components related to the implementation, upgrade, and maintenance of Oracle Cloud systems. Your responsibilities will include developing and fixing reports using BI Publisher, performing data conversions, designing and developing OIC-based integrations, and working on custom application development and extensions using APEX and VBCS. You will also prepare and submit technical documentation, participate in requirements-gathering discussions with business stakeholders, and manage integrations between the Cloud and external applications or systems within the organization.

Collaborating with stakeholders to understand their needs and translate them into technical requirements is a key aspect of this role. You will provide training and support to business users and work closely with the IT team to ensure data security, backup, and recovery plans are effective. Additionally, you will collaborate with Oracle Support on escalated support issues, and you should have knowledge of integration tools and third-party applications. ERP Cloud functional certifications would be a plus.

If you have 8-15 years of experience and a Bachelor's degree, and you are looking for a challenging role in Hyderabad, Mumbai, or Bangalore, this opportunity might be the right fit for you.

Posted 2 weeks ago

Apply

3.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

As a Teamcenter Developer, your primary responsibility will be to design and develop custom solutions that meet business requirements within the Teamcenter platform. You will implement and enhance BMIDE configurations, including custom properties, business rules, and templates, and develop and customize workflows, handlers, and triggers using Workflow Designer. Your role will also involve customizing and extending Teamcenter functionality using ITK, RAC, SOA, and TCXML, and developing and maintaining integrations with third-party systems such as ERP, CAD, and MES tools using middleware like T4x.

You will be expected to write efficient, scalable, and maintainable code while adhering to coding standards; to debug, test, and troubleshoot issues in customizations, integrations, and configurations; and to tune Teamcenter solutions for system efficiency. Maintaining and managing source code repositories using tools like Git or SVN is also part of your duties. You will assist in the implementation of new Teamcenter modules or updates and participate in data migration tasks, including script development and testing for legacy data import/export.

Collaboration with Teamcenter architects, administrators, and business analysts to gather and understand requirements will be crucial, and you will work closely with cross-functional teams to ensure seamless integration of Teamcenter with other enterprise systems. Documentation of technical designs, code changes, and user guides for future reference is also expected. You will stay current on the latest Teamcenter technologies, tools, and enhancements; research and suggest new tools, technologies, or methods to improve development efficiency and solution quality; and implement best practices while ensuring compliance with organizational policies.

To be successful in this role, you should have strong proficiency in Siemens Teamcenter development, including Active Workspace customization, ITK for server-side customization, RAC customization using Java and Eclipse, and SOA for web services and integrations. A solid understanding of PLM concepts such as BOM management, change management, classification, stylesheets, Query Builder, PLMXML, and access control is required, as are problem-solving and analytical skills, the ability to work in a collaborative environment, and comfort with agile methodologies. Excellent verbal and written communication skills, attention to detail, and a focus on delivering high-quality solutions are also important. Ideally, you hold a Bachelor's degree in Computer Science, Information Technology, or a related field and have 3 to 10 years of experience in Teamcenter development and customization; prior involvement in end-to-end Teamcenter implementation projects would be beneficial.

Desired Skills:
- Experience with CAD integrations
- Knowledge of database systems
- Familiarity with integration tools like T4x (Teamcenter Gateway) or APIs for ERP

Interview Mode: Virtual
Work Mode: Remote
(Source: hirist.tech)
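One PLM concept the requirements name, BOM management, can be illustrated with a quantity rollup through a multi-level bill of materials. The part numbers and dict structure below are invented for the sketch; in Teamcenter this data would come through ITK or SOA services, not a plain dict:

```python
# Hedged sketch of a multi-level BOM rollup: total leaf-component
# quantities needed to build one top-level assembly. Data is invented.

BOM = {
    "BIKE": [("WHEEL", 2), ("FRAME", 1)],
    "WHEEL": [("SPOKE", 32), ("RIM", 1)],
    "FRAME": [],
    "SPOKE": [],
    "RIM": [],
}

def rollup(part, qty=1, totals=None):
    """Recursively accumulate leaf-level quantities for `qty` units of `part`."""
    totals = {} if totals is None else totals
    children = BOM.get(part, [])
    if not children:
        totals[part] = totals.get(part, 0) + qty
        return totals
    for child, n in children:
        rollup(child, qty * n, totals)
    return totals

totals = rollup("BIKE")
print(totals)  # 2 wheels -> 64 spokes, 2 rims; plus 1 frame
```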

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Gurugram

Work from Office

Job Title: Kafka Integration Specialist

Job Description: We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools like Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats like Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.
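One data-streaming concept the posting lists, partitioning, can be sketched without a broker: messages with the same key are hashed to the same partition, which preserves per-key ordering. Note that real Kafka Java clients default to a murmur2 hash of the key bytes; md5 is used here only to keep the example deterministic and standard-library-only:

```python
import hashlib

# Illustrative key-based partitioning sketch (not Kafka's actual
# murmur2-based partitioner): same key -> same partition -> per-key order.

def partition_for(key: str, num_partitions: int) -> int:
    """Map a message key to a partition index deterministically."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# The same key always lands on the same partition:
p1 = partition_for("customer-42", 6)
p2 = partition_for("customer-42", 6)
print(p1, p2, p1 == p2)
```

This is why choosing a good message key (e.g., a customer id) matters: it balances load across partitions while keeping each entity's events ordered.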

Posted 2 weeks ago

Apply

12.0 - 15.0 years

2 - 6 Lacs

Hyderabad, Telangana, India

Remote

- Work independently as part of multi-cultural teams delivering SuccessFactors / SAP HCM solutions to customers
- Lead multi-cultural teams delivering SuccessFactors / SAP HCM solutions to customers
- Work with customers to understand their requirements and provide them the right solutions
- Guide customers, both on-site and offshore, on SAP best practices for quick wins
- Experience working in the remote delivery model
- Application-specific solution consulting (for a specific application: creation of the business blueprint/configuration workbook, implementation based on the blueprint, creation of test cases, test scheduling and execution, key-user training, go-live support, and post-go-live support)
- Performance of feasibility studies / solution reviews
- Support of pre-sales activities
- Leading and conducting requirements-gathering workshops and trainings
- Industry / corporate process implementation across all related applications (SAP/non-SAP)
- Develop and assure the quality of process models
- Demonstrating advanced / expert knowledge of modeling standards and tools
- Support in escalated projects
- Understands the Business Process Library approach and contributes to it
- Acts as a coach for colleagues
- At least 4-5 end-to-end implementations of SuccessFactors / SAP HCM

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a ServiceNow Architect within our IT/Enterprise Applications department based in Hyderabad, India, you will play a crucial role in designing, developing, and leading scalable and secure ServiceNow solutions aligned with our business goals. Your responsibilities will encompass application development, system integration, automation, and mentoring development teams, while ensuring the platform's performance, reliability, and compliance.

You will design scalable, high-performance ServiceNow solutions tailored to our business needs, including developing and customizing applications, workflows, modules, and user interfaces on the ServiceNow platform. You will analyze existing business processes to implement automation that improves efficiency and reduces manual effort. Managing integrations with both cloud-based and on-premise legacy systems using SOAP/REST protocols will be a key part of the role, as will ensuring high-quality solution delivery through rigorous testing, performance tuning, and collaboration with stakeholders across the business.

To excel in this role, you should have a Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field, at least 5 years of hands-on experience in ServiceNow development, and certifications such as ServiceNow Certified Application Developer (CAD) and ServiceNow Certified Implementation Specialist (CIS). A strong command of JavaScript, HTML, and CSS, along with experience with web services such as SOAP and REST, is essential. Familiarity with Agile/Scrum development methodologies, problem-solving skills, effective leadership abilities, and excellent communication and interpersonal capabilities are also required, as are adaptability to fast-paced, evolving environments and strong stakeholder engagement skills.

Experience with ITSM, ITOM, HRSD, or other ServiceNow modules, exposure to ServiceNow Governance, Risk, and Compliance (GRC) or Security Operations, and familiarity with integration tools like MuleSoft and Dell Boomi will be advantageous.
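A pattern that comes up constantly when managing the SOAP/REST integrations described above is retrying transient failures with exponential backoff. This is a generic, platform-agnostic sketch (nothing here is ServiceNow-specific, and the failing endpoint is faked):

```python
import time

# Generic retry-with-exponential-backoff sketch for flaky integration
# calls. The endpoint below is a fake that fails twice, then succeeds.

def call_with_retry(fn, attempts=4, base_delay=0.01):
    """Retry fn() on ConnectionError with growing delays; re-raise on the last try."""
    for i in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))  # 1x, 2x, 4x, ...

calls = {"n": 0}
def flaky_endpoint():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return {"status": "ok"}

result = call_with_retry(flaky_endpoint)
print(result, "after", calls["n"], "calls")
```

Production integrations usually add jitter to the delay and a cap on total wait time so simultaneous retries do not stampede a recovering system.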

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You are Omnissa! The world is evolving fast, and organizations everywhere, from corporations to schools, are under immense pressure to provide flexible, work-from-anywhere solutions. They need IT infrastructure that empowers employees and customers to access applications from any device, on any cloud, all while maintaining top-tier security. That's where Omnissa comes in. The Omnissa Platform is the first AI-driven digital work platform that enables smart, seamless, and secure work experiences from anywhere. It uniquely integrates multiple industry-leading solutions, including Unified Endpoint Management, Virtual Apps and Desktops, Digital Employee Experience, and Security & Compliance, through common data, identity, administration, and automation services. Built on the vision of autonomous workspaces (self-configuring, self-healing, and self-securing), Omnissa continuously adapts to the way people work, delivering personalized and engaging employee experiences while optimizing security, IT operations, and costs.

We're experiencing rapid growth, and this is just the beginning of our journey! At Omnissa, you are driven by a shared mission to maximize value for our customers. Our five Core Values guide us: Act in Alignment, Build Trust, Foster Inclusiveness, Drive Efficiency, and Maximize Customer Value, all with the aim of achieving shared success for our clients and our team. As a global private company with over 4,000 employees, we're always looking for passionate, talented individuals to join us. If you're ready to make an impact and help shape the future of work, we'd love to hear from you!

What is the opportunity?

We are looking for a skilled Senior Salesforce Engineer with 5+ years of hands-on experience in Salesforce platform development and administration. The ideal candidate will be responsible for designing, developing, and deploying customized solutions within Salesforce, improving business processes, and leading technical projects. This role requires strong technical expertise, problem-solving skills, and the ability to work in an agile, fast-paced environment.

Key Responsibilities:
- Salesforce Development: Design, develop, test, and implement custom applications, flows, and configurations on Salesforce (Sales Cloud, Service Cloud).
- Custom Code: Create Apex classes, triggers, Lightning components (Aura/LWC), and Visualforce pages, and work with SOQL.
- Integration: Design and implement integrations between Salesforce and third-party systems using REST/SOAP APIs, middleware, and platform events.
- Solution Design: Understand business requirements and translate them into technical specifications and scalable solutions, following best practices.
- Quality Assurance: Write test classes for code coverage, perform unit testing, and support UAT.
- Deployment: Participate in code reviews, manage releases, and handle deployment activities using CI/CD tools.
- Documentation: Maintain clear documentation of processes, solutions, and system architecture for future use and training.
- Support & Troubleshooting: Provide production support for issues and optimize platform performance.
- Security & Compliance: Ensure Salesforce security best practices are implemented.

What will you bring to Omnissa?
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in Salesforce platform development, including hands-on experience with Apex, Lightning components (Aura/LWC), Visualforce, and integration tools.
- In-depth understanding of Salesforce declarative and programmatic development.
- Strong experience in Salesforce configuration, administration, and security management.
- Proven experience integrating Salesforce with other systems using APIs.
- Experience with version control and deployment tools (Git, etc.).
- Experience working in an Agile/Scrum environment.
- Excellent problem-solving skills and the ability to work independently and as part of a team.
- Strong communication skills, both written and verbal.

Preferred Skills:
- Experience implementing Salesforce Communities / Experience Cloud.
- Familiarity with data migration tools.
- Salesforce certifications such as Platform Developer I & II, App Builder, or Salesforce Administrator are preferred.

Why Join Us:
- Opportunity to work on innovative projects and cutting-edge Salesforce technology.
- Competitive salary, benefits, and professional growth opportunities.
- Collaborative and flexible work environment that encourages learning and development.

Location: Bengaluru
Location Type: Hybrid / Onsite

Omnissa's commitment to diversity & inclusion: "Omnissa is committed to building a workforce that reflects the communities we serve across the globe. We believe this brings unique perspectives, experiences, and ideas, which are essential for driving innovation and achieving business success. We hire based on merit and with equal opportunity for all." Omnissa is an Equal Employment Opportunity company and prohibits discrimination and harassment of any kind.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

3 - 6 Lacs

Noida

Work from Office

We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools like Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats like Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

3 - 6 Lacs

Pune

Work from Office

We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools like Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats like Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

3 - 8 Lacs

Bengaluru, Karnataka, India

On-site

In this role, you will:

Operational Excellence
- Incident Management: Ensure timely and effective resolution of incidents to minimize downtime and impact on sellers.
- Problem Management: Identify and resolve recurring issues through thorough root cause analysis and long-term solutions.
- Process Improvement: Continuously evaluate and enhance operational processes to improve efficiency and effectiveness.

Collaboration and Communication
- Cross-functional Collaboration: Work closely with development, quality engineering, and other relevant teams to ensure seamless operations and issue resolution. Partner closely with the Sales, Finance, Business Technology, and Product teams to ensure systems integration, seamless information flow, and alignment with the overall business strategy.
- Stakeholder Communication: Maintain clear and open communication with key stakeholders, providing regular updates on operational performance and ongoing initiatives.

Strategic Planning and Execution
- Operational Strategy: Develop and implement strategies to enhance the performance and reliability of the GTMS ecosystem.
- Resource Allocation: Efficiently allocate resources to ensure optimal team performance and support for critical projects.
- Cost Management: Monitor and manage operational costs, ensuring budget adherence and cost-effective solutions.

Quality Assurance and Release Management
- Quality Standards: Ensure that all releases and updates meet the highest standards of quality and reliability.
- Release Coordination: Oversee the release management process, ensuring smooth and efficient deployments with minimal disruption.

Team Leadership and Development
- Lead and Manage Teams: Oversee the L2 and L3 support teams, ensuring they are effectively addressing and resolving issues.
- Mentorship and Training: Provide guidance, mentorship, and professional development opportunities to team members.
- Performance Management: Set performance goals, conduct regular reviews, and implement improvement plans as needed.
- Vendor Management: Identify, negotiate, and manage partnerships with third-party vendors to ensure the delivery of top-notch tools and services.
- Performance Metrics: Establish, track, and report key performance indicators to gauge success and identify opportunities for improvement.
- Compliance and Security: Ensure all systems adhere to industry standards, legal requirements, and best practices in security and data privacy.
- Budget Management: Oversee budget planning, allocation, and management within the area of responsibility, employing deep business acumen.
- Champion, role model, and embed Samsara's cultural principles (Focus on Customer Success, Build for the Long Term, Adopt a Growth Mindset, Be Inclusive, Win as a Team) as we scale globally and across new offices.
- Hire, develop, and lead an inclusive, engaged, and high-performing team.

Minimum requirements for the role:
- Bachelor's degree in Business, Computer Science, Information Technology, or a related field; Master's degree preferred.
- 3+ years of experience in leadership roles focusing on sales systems (e.g., Salesforce Sales Cloud, CPQ).
- Experience managing an operations team driving system stability and operational excellence.
- 3+ years of Salesforce experience.
- Deep business acumen, with a strong understanding of CRM systems, integration tools, quality assurance, and analytics platforms.
- Advanced influencing and communication skills, with a strong executive presence.
- Exceptional strategic thinking and problem-solving skills.
- Excellent collaboration skills, with experience working closely with Sales and Channel teams.
- A customer-centric mindset, with a focus on delivering value and a great customer experience.
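One operational KPI this role would establish and track for incident management is mean time to resolution (MTTR). A minimal sketch, using invented incident data with timestamps expressed in minutes for simplicity:

```python
# Illustrative MTTR (mean time to resolution) calculation over closed
# incidents. The incident records below are invented example data.

incidents = [
    {"id": "INC1", "opened": 0,   "resolved": 90},   # 90 min
    {"id": "INC2", "opened": 10,  "resolved": 40},   # 30 min
    {"id": "INC3", "opened": 100, "resolved": 160},  # 60 min
]

def mttr(incidents):
    """Average resolution time across closed incidents."""
    durations = [i["resolved"] - i["opened"] for i in incidents]
    return sum(durations) / len(durations)

print(mttr(incidents))  # (90 + 30 + 60) / 3 = 60.0
```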

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

We are looking for a Salesforce Data Integration Lead to join our team in Bengaluru, India. In this role, you will play a crucial part in driving our data integration initiatives. Your responsibilities will include designing and implementing data integration strategies for Salesforce, collaborating with cross-functional teams to enhance data processes, and ensuring seamless data flow across various systems. Your primary tasks will include monitoring, validating, and optimizing data quality throughout the integration lifecycle, utilizing ETL tools for effective management of large datasets, and developing documentation for data flows and integration processes. You will also provide training and support to team members on best practices in data management. The ideal candidate will have proven experience in Salesforce integrations and API management, a strong grasp of database concepts and ETL processes, excellent problem-solving skills with keen attention to detail, and the ability to communicate complex technical information clearly. Experience with middleware tools would be an added advantage. If you are ready to make a significant impact and grow with us, apply now and be a part of our dynamic team!

Posted 2 weeks ago

Apply

5.0 - 10.0 years

3 - 6 Lacs

Ahmedabad

Work from Office

Job Title : Kafka Integration Specialist Job Description : We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems. Key Responsibilities : - Design, implement, and maintain Kafka-based data pipelines. - Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies. - Manage Kafka clusters, ensuring high availability, scalability, and performance. - Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions. - Implement best practices for data streaming, including message serialization, partitioning, and replication. - Monitor and troubleshoot Kafka performance, latency, and security issues. - Ensure data integrity and implement failover strategies for critical data pipelines. Required Skills : - Strong experience in Apache Kafka (Kafka Streams, Kafka Connect). - Proficiency in programming languages like Java, Python, or Scala. - Experience with distributed systems and data streaming concepts. - Familiarity with Zookeeper, Confluent Kafka, and Kafka Broker configurations. - Expertise in creating and managing topics, partitions, and consumer groups. - Hands-on experience with integration tools such as REST APIs, MQ, or ESB. - Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment. Nice to Have : - Experience with monitoring tools like Prometheus, Grafana, or Datadog. - Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation. - Knowledge of data serialization formats like Avro, Protobuf, or JSON. Qualifications : - Bachelor's degree in Computer Science, Information Technology, or related field. - 4+ years of hands-on experience in Kafka integration projects.
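The posting's point about "message serialization, partitioning, and replication" rests on one key idea: keyed messages are hashed to a partition, so all events for the same key stay ordered on one partition. A minimal sketch of that assignment logic follows; note the real Kafka clients use a murmur2 hash, and CRC32 stands in here only to keep the illustration dependency-free.

```python
import zlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    """Pick a partition for a keyed message.

    Simplified illustration: real Kafka producers hash the key with
    murmur2; CRC32 is used here as a stand-in. The property that matters
    is determinism - the same key always maps to the same partition,
    which is what preserves per-key ordering.
    """
    if key is None:
        # Real clients spread unkeyed messages round-robin / sticky instead.
        raise ValueError("unkeyed messages are not hash-partitioned")
    return zlib.crc32(key) % num_partitions

# Same key -> same partition, so all events for order-42 stay ordered.
p1 = assign_partition(b"order-42", 6)
p2 = assign_partition(b"order-42", 6)
print(p1 == p2)  # True
```

Because the mapping depends on the partition count, adding partitions to an existing topic changes where keys land, which is why partition counts are usually planned up front.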

Posted 2 weeks ago

Apply

5.0 - 10.0 years

3 - 6 Lacs

Chennai

Work from Office

Job Title : Kafka Integration Specialist Job Description : We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems. Key Responsibilities : - Design, implement, and maintain Kafka-based data pipelines. - Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies. - Manage Kafka clusters, ensuring high availability, scalability, and performance. - Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions. - Implement best practices for data streaming, including message serialization, partitioning, and replication. - Monitor and troubleshoot Kafka performance, latency, and security issues. - Ensure data integrity and implement failover strategies for critical data pipelines. Required Skills : - Strong experience in Apache Kafka (Kafka Streams, Kafka Connect). - Proficiency in programming languages like Java, Python, or Scala. - Experience with distributed systems and data streaming concepts. - Familiarity with Zookeeper, Confluent Kafka, and Kafka Broker configurations. - Expertise in creating and managing topics, partitions, and consumer groups. - Hands-on experience with integration tools such as REST APIs, MQ, or ESB. - Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment. Nice to Have : - Experience with monitoring tools like Prometheus, Grafana, or Datadog. - Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation. - Knowledge of data serialization formats like Avro, Protobuf, or JSON. Qualifications : - Bachelor's degree in Computer Science, Information Technology, or related field. - 4+ years of hands-on experience in Kafka integration projects.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

You will be joining our team as a Camunda Developer, bringing with you 3 to 5 years of experience with the Camunda BPM platform. Your key skills and responsibilities will include: - Strong proficiency in BPMN 2.0, DMN, and workflow modeling. - Proficiency in Java, Spring Boot, and microservices architecture. - Experience with REST/SOAP APIs, database technologies (SQL/NoSQL), and integration tools. - Familiarity with Docker, Kubernetes, and CI/CD pipelines. - A good understanding of business process automation, event-driven architecture, and SOA. - Experience with Jolt would be a plus. If you are passionate about Camunda development and meet these requirements, we would love to have you on board!
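One concrete touchpoint between the REST-API and Camunda skills above: Camunda 7's REST endpoint for starting a process instance (POST /process-definition/key/{key}/start) expects each process variable wrapped in a {"value": ..., "type": ...} object. A small sketch of building that body, with the type mapping simplified to the common cases:

```python
import json

def start_process_payload(variables: dict) -> str:
    """Build the JSON body for Camunda 7's start-process-instance REST
    endpoint. Camunda expects each variable wrapped as
    {"value": ..., "type": ...}; the type mapping below covers only the
    common primitives and is a sketch, not an exhaustive converter.
    """
    type_map = {bool: "Boolean", int: "Integer", float: "Double", str: "String"}
    wrapped = {
        name: {"value": value, "type": type_map.get(type(value), "Json")}
        for name, value in variables.items()
    }
    return json.dumps({"variables": wrapped})

# Body you would POST to /process-definition/key/invoice-approval/start
body = start_process_payload({"amount": 1200, "approved": False})
```

The exact lookup on `type(value)` (rather than isinstance) is deliberate here, so that `False` maps to "Boolean" even though bool is a subclass of int in Python.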

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

pune, maharashtra

On-site

You are a results-driven Data Project Manager (PM) responsible for leading data initiatives within a regulated banking environment, focusing on leveraging Databricks and Confluent Kafka. Your role involves overseeing the successful end-to-end delivery of complex data transformation projects aligned with business and regulatory requirements. In this position, you will be required to lead the planning, execution, and delivery of enterprise data projects using Databricks and Confluent. This includes developing detailed project plans, delivery roadmaps, and work breakdown structures, as well as ensuring resource allocation, budgeting, and adherence to timelines and quality standards. Collaboration with data engineers, architects, business analysts, and platform teams is essential to align on project goals. You will act as the primary liaison between business units, technology teams, and vendors, facilitating regular updates, steering committee meetings, and issue/risk escalations. Your technical oversight responsibilities include managing solution delivery on Databricks for data processing, ML pipelines, and analytics, as well as overseeing real-time data streaming pipelines via Confluent Kafka. Ensuring alignment with data governance, security, and regulatory frameworks such as GDPR, CBUAE, and BCBS 239 is crucial. Risk and compliance management are key aspects of your role, involving ensuring regulatory reporting data flows comply with local and international financial standards and managing controls and audit requirements in collaboration with Compliance and Risk teams. The required skills and experience for this role include 7+ years of Project Management experience within the banking or financial services sector, proven experience in leading data platform projects, a strong understanding of data architecture, pipelines, and streaming technologies, experience in managing cross-functional teams, and proficiency in Agile/Scrum and Waterfall methodologies. 
Technical exposure to Databricks (Delta Lake, MLflow, Spark), Confluent Kafka (Kafka Connect, kSQL, Schema Registry), Azure or AWS cloud platforms, integration tools, CI/CD pipelines, and Oracle ERP implementation is expected. Preferred qualifications include PMP/Prince2/Scrum Master certification, familiarity with regulatory frameworks, and a strong understanding of data governance principles. The ideal candidate will hold a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. Key performance indicators for this role include on-time, on-budget delivery of data initiatives, uptime and SLAs of data pipelines, user satisfaction, and compliance with regulatory milestones.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

We are looking for a Salesforce Data Integration Lead to lead our data integration initiatives in Bengaluru, India. As a detail-oriented professional with extensive experience in Salesforce, data management, and integration tools, you will play a crucial role in ensuring seamless data flow across multiple systems. Your responsibilities will include designing and implementing data integration strategies for Salesforce, collaborating with cross-functional teams to gather requirements, and enhancing data processes. You will also be responsible for monitoring, validating, and optimizing data quality throughout the integration lifecycle, utilizing ETL tools effectively, and developing documentation for data flows and integration processes. Additionally, you will provide training and support to team members on best practices in data management. The ideal candidate will have proven experience in Salesforce integrations and API management, a strong understanding of database concepts and ETL processes, excellent problem-solving skills with attention to detail, and the ability to communicate complex technical information clearly. Experience with middleware tools would be a plus. If you are ready to make an impact and grow with us, apply now!

Posted 2 weeks ago

Apply

0.0 - 2.0 years

1 - 3 Lacs

Navi Mumbai

Work from Office

Job Summary: We are seeking a skilled and detail-oriented ERP Technical Developer to support the design, development, customization, and maintenance of our ERP system. The ideal candidate will work closely with business analysts, functional consultants, and end users to understand requirements and deliver robust technical solutions that align with organizational goals. Key Responsibilities: Develop, customize, and maintain ERP system modules (e.g., Finance, HR, Supply Chain, Manufacturing). Write and optimize code, reports, interfaces, and data conversions using appropriate ERP tools and languages. Collaborate with functional teams to analyze requirements and translate them into technical specifications. Integrate ERP systems with third-party applications and external systems via APIs, middleware, or custom interfaces. Troubleshoot technical issues and provide timely resolutions. Participate in ERP upgrades, patching, and migration activities. Ensure data integrity and security within the ERP environment. Maintain technical documentation including customizations, integrations, and support procedures. Follow software development best practices and ensure compliance with IT policies and procedures. Technical Skills: Proficiency in programming languages used by the ERP platform. Strong knowledge of ERP architecture and database management systems. Experience with integration tools and technologies. Understanding of ERP data models and workflows. Soft Skills: Strong analytical and problem-solving skills. Effective communication and documentation abilities. Ability to work collaboratively in cross-functional teams.
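Integrating an ERP with a third-party application, as described above, usually starts with a field mapping: translating internal column names to the partner API's schema and dropping internal-only fields. A minimal sketch of that step; all field names here are hypothetical, for illustration only:

```python
import json

# Hypothetical mapping from internal ERP invoice columns to the names a
# partner API expects; both sides are illustrative, not a real schema.
FIELD_MAP = {
    "inv_no": "invoiceNumber",
    "cust_id": "customerId",
    "amt": "totalAmount",
}

def to_api_payload(erp_record: dict) -> str:
    """Rename mapped columns to the partner's schema and silently drop
    any columns (e.g. internal flags) the partner does not accept."""
    mapped = {
        api_name: erp_record[erp_name]
        for erp_name, api_name in FIELD_MAP.items()
        if erp_name in erp_record
    }
    return json.dumps(mapped)

payload = to_api_payload(
    {"inv_no": "INV-1001", "cust_id": 7, "amt": 250.0, "internal_flag": 1}
)
```

Keeping the mapping in a single table like this, rather than scattering renames through the code, is what makes such interfaces maintainable when either schema changes.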

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

You should have 8-10+ years of integration experience, specifically working with Oracle Utilities and MDM integration with other enterprise platforms. Hands-on experience with Oracle MDM and the Oracle Utilities Application Framework is a must, as is experience with Oracle SOA or Oracle Integration Cloud. Strong expertise in integration tools and middleware such as web services and REST APIs is required. Additionally, an understanding of Enterprise Architecture will be beneficial in this role.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a highly valued member of the technical solutions team, you will serve as the primary technical point of contact responsible for ensuring the seamless delivery of integrations from the initial stages through to the go-live phase. With a minimum of 5 years of experience in EDI and API integration development, you will be expected to integrate data from various source and target formats. Your responsibilities will include developing and maintaining integrations using integration tools, with proficiency in handling data formats such as XML, JSON, EDIFACT, X12, raw text files, Excel, CSV, and tab-separated files. Additionally, you should possess knowledge of EDI protocols and communication methods such as OFTP/OFTP2, FTP, SFTP, AS2, HTTP, and HTTPS. You should be well-versed in SOAP and REST API designs, proficient in API sandbox environments, and demonstrate a strong understanding of mapping and integration tools. Experience with integration tools such as Lobster_data, MuleSoft, Boomi, Cleo Integration Cloud, and Jitterbit Harmony would be highly advantageous. Monitoring and managing integrations to ensure optimal performance, troubleshooting integration issues effectively, and maintaining clear communication with team members and stakeholders through various channels are crucial aspects of this role. The ability to work independently and collaboratively in a remote work environment is essential. Hands-on experience with low-code to no-code tools such as Microsoft Power Platform and Azure is preferred. You should also be capable of producing comprehensive supporting documentation, including requirements, standard document formats, translation and mapping guidelines, preferred communication methods, and communication networks for sending and receiving documents.
Desirable skills include familiarity with Cargowise XML (XUS/XUE/XUT), experience with Altova MapForce, a background in the freight forwarding industry, and expertise in building integrations with third parties such as Project44, Inttra, Gravity, E2Open, and Wakeo.
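To make the X12 format mentioned above concrete: an X12 interchange is a flat string of segments ending in a segment terminator (commonly "~"), each segment split into elements by a separator (commonly "*"). A deliberately simplified parsing sketch; real X12 declares its separators in the ISA header and also has component and repetition separators, which this ignores:

```python
def parse_x12_segments(message: str, seg_term: str = "~",
                       elem_sep: str = "*") -> list[list[str]]:
    """Split a raw X12 message into segments and elements.

    Highly simplified: assumes fixed separators instead of reading them
    from the ISA envelope, and ignores component/repetition separators.
    """
    return [
        seg.split(elem_sep)
        for seg in message.strip().split(seg_term)
        if seg.strip()  # drop trailing empty fragment after last "~"
    ]

# Fragment of a hypothetical 850 (purchase order) transaction set.
sample = "ST*850*0001~BEG*00*SA*PO123~SE*3*0001~"
segments = parse_x12_segments(sample)
# segments[0] is the segment ID followed by its elements, e.g. ST, 850, 0001
```

The first element of each segment is its segment ID (ST, BEG, SE, ...), which is what mapping tools such as those listed above key their translation rules on.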

Posted 2 weeks ago

Apply

5.0 - 9.0 years

3 - 6 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems. Key Responsibilities : - Design, implement, and maintain Kafka-based data pipelines. - Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies. - Manage Kafka clusters, ensuring high availability, scalability, and performance. - Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions. - Implement best practices for data streaming, including message serialization, partitioning, and replication. - Monitor and troubleshoot Kafka performance, latency, and security issues. - Ensure data integrity and implement failover strategies for critical data pipelines. Required Skills : - Strong experience in Apache Kafka (Kafka Streams, Kafka Connect). - Proficiency in programming languages like Java, Python, or Scala. - Experience with distributed systems and data streaming concepts. - Familiarity with Zookeeper, Confluent Kafka, and Kafka Broker configurations. - Expertise in creating and managing topics, partitions, and consumer groups. - Hands-on experience with integration tools such as REST APIs, MQ, or ESB. - Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment. Nice to Have : - Experience with monitoring tools like Prometheus, Grafana, or Datadog. - Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation. - Knowledge of data serialization formats like Avro, Protobuf, or JSON. Qualifications : - Bachelor's degree in Computer Science, Information Technology, or related field. - 4+ years of hands-on experience in Kafka integration projects.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

karnataka

On-site

You should have 3-8 years of experience as a Workday Technical Consultant, with key skills in Workday Technical, Workday Finance, and Workday HCM. Your responsibilities will include hands-on work in the Advanced Compensation and Talent Management modules, plus maintenance and configuration of the existing Workday Core HCM application. You will also assist in updating system configuration, test and deploy business process solutions, and provide best practices and solutions for the Core HCM functional area. Additionally, you will create Workday reports (including dashboards) and work with various integration tools and methodologies such as EIBs, CCW, Studio, PICOF, and PECI. Experience in Workday release management is required. Strong communication skills are essential for working with business stakeholders. Being active in the Workday Community and performing POCs to improve Workday solutions and implementation methodology is expected. Knowledge of BIRT reports will be an added advantage. If you find this opportunity interesting, please share your CV at Sneha.Gedam@ltimindtree.com.

Posted 3 weeks ago

Apply