
593 Jobs in Remote - Page 5

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

1.0 - 3.0 years

0 - 3 Lacs

Remote, India

On-site

About the job

Description
The Executive Customer Relations (ECR) team manages IN trans-related escalations for rescue, root-causing, and medium/long-term systemic changes. The team manages multiple programs, such as INOPS VPI, ECR email escalations, and customer rescue programs, and liaises with stakeholders across the IN network to develop proactive rescue mechanisms, solutions, and systemic fixes for the opportunities identified through root cause analysis, improving customer experience.

Key job responsibilities
- Respond to inquiries from leaders, in addition to resolving contacts received through escalation channels.
- Communicate effectively and professionally with CS and non-CS departments.
- Work on detailed root cause analysis.
- Recognize systemic and quality concerns contributing to poor customer experiences and communicate them to the appropriate stakeholders.

A day in the life
You will address customer issues both by rescuing the customer and by identifying and measuring the root cause of the customer's experience failure, then presenting your findings and recommendations to the right stakeholders, who can fix the process or technology that caused the defect.

Basic Qualifications
- A relentless obsession for the customer.
- Excellent English communication skills, both verbal and written.
- Prior experience in customer service.
- Flexibility in work hours based on operational requirements.
- Ability to work independently, self-motivation, and flexibility in approaching responsibilities and change.
- Good judgment and discretion.
- Excellent decision-making skills to effectively manage the needs of the customer and the business.
- Goal-driven and target-oriented, able to step back and look at the bigger picture, manage through ambiguity, and be prepared to get involved.

Preferred Qualifications
- Prior experience in customer service.
- Perfection in responses to internal leaders is required.
- MS-Office Suite (Word, PowerPoint, Excel, SharePoint).

Posted 1 week ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Remote, India

On-site

Description
The Executive Customer Relations (ECR) team manages IN trans-related escalations for rescue, root-causing, and medium/long-term systemic changes. The team manages multiple programs, such as INOPS VPI, ECR email escalations, and customer rescue programs, and liaises with stakeholders across the IN network to develop proactive rescue mechanisms, solutions, and systemic fixes for the opportunities identified through root cause analysis, improving customer experience.

Key job responsibilities
- Respond to inquiries from leaders, in addition to resolving contacts received through escalation channels.
- Communicate effectively and professionally with CS and non-CS departments.
- Work on detailed root cause analysis.
- Recognize systemic and quality concerns contributing to poor customer experiences and communicate them to the appropriate stakeholders.

A day in the life
You will address customer issues both by rescuing the customer and by identifying and measuring the root cause of the customer's experience failure, then presenting your findings and recommendations to the right stakeholders, who can fix the process or technology that caused the defect.

Basic Qualifications
- A relentless obsession for the customer.
- Excellent English communication skills, both verbal and written.
- Prior experience in customer service.
- Flexibility in work hours based on operational requirements.
- Ability to work independently, self-motivation, and flexibility in approaching responsibilities and change.
- Good judgment and discretion.
- Excellent decision-making skills to effectively manage the needs of the customer and the business.
- Goal-driven and target-oriented, able to step back and look at the bigger picture, manage through ambiguity, and be prepared to get involved.

Preferred Qualifications
- Prior experience in customer service.
- Perfection in responses to internal leaders is required.
- MS-Office Suite (Word, PowerPoint, Excel, SharePoint).

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Remote, India

On-site

Job Description

MUST HAVES:
- 3+ years of Salesforce administration in Salesforce Lightning
- Salesforce Service Cloud
- Exposure to Apex programming
- Configuration experience
- Security experience (permission sets, profiles, roles, etc.)

PLUSSES:
- Automation experience
- Experience with Salesforce APIs
- CI/CD, Git, and GitHub experience

DAY-TO-DAY
This person will work on the Red Hat internal support team, assisting with the migration from Salesforce Classic to Salesforce Lightning. The team is working toward an environment of minimal customization, so they are looking for an administrator who can come in and help with configuration, data management, reporting, analytics, and security administration. Looking for someone who has:
- Knowledge of the Salesforce security model
- Proven ability to design and implement new processes and facilitate user adoption
- Understanding of Salesforce best practices and functionality
- Data management abilities (experience with Excel for data export and manipulation, and loading the results back into the system using data loader tools)
- Ability to understand and articulate complex requirements.
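The data-management bullet above (export, manipulate, reload via data loader tools) follows a simple round-trip pattern. As a hedged sketch, here is that pattern in plain Python over CSV text; the column names and the fill-in-blank-Region rule are hypothetical, and a real workflow would use Salesforce Data Loader or similar tooling rather than this code.

```python
import csv
import io

def transform_export(export_csv: str, default_region: str = "EMEA") -> str:
    """Fill blank Region values in a CSV export before reloading it.

    Takes CSV text (as exported from a report) and returns CSV text
    ready for a data-loader import. Column names are hypothetical.
    """
    reader = csv.DictReader(io.StringIO(export_csv))
    rows = []
    for row in reader:
        if not row.get("Region"):           # blank or missing value
            row["Region"] = default_region  # normalize before reload
        rows.append(row)

    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

sample = "Id,Name,Region\n001,Acme,\n002,Globex,APAC\n"
print(transform_export(sample))
```

The same export-clean-reload shape applies whether the manipulation step happens in Excel, a script, or an ETL tool.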

Posted 1 week ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Remote, India

On-site

Job Description
We are looking for a talented and experienced Data Engineer with a strong focus on ETL projects to join our team. The ideal candidate will have extensive experience in data warehouse projects and proficiency in Apache NiFi, PostgreSQL, Java or Node.js, and AWS. The primary responsibility will be to design, develop, and maintain ETL processes to support our data infrastructure. The backend application will be built in Node.js, so proficiency in that technology is preferred.

The candidate should be able to:
- Design, develop, and implement ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Collaborate with cross-functional teams to gather requirements and translate business needs into technical specifications.
- Optimize ETL processes for performance and scalability.
- Ensure data quality and integrity throughout the ETL process.
- Troubleshoot and resolve issues related to data processing and ETL pipelines.
- Stay updated with industry best practices and emerging technologies in data engineering.

EXPERTISE AND QUALIFICATIONS
NOTE: The minimum requirement for the job is hands-on experience in Apache NiFi and Java.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Extensive experience in data warehouse projects with a focus on ETL development.
- Proficiency in Apache NiFi for data ingestion and processing.
- Strong SQL skills and experience with relational databases, preferably PostgreSQL.
- Experience with either Java or Node.js.
- Familiarity with AWS services.
- Hands-on experience with Node.js for backend application development.
- Certification in AWS or relevant technologies.
- Experience with other ETL tools and frameworks.
- Strong analytical and problem-solving skills.
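The ETL responsibilities above boil down to extract, transform, load. A minimal sketch of that loop, using Python's built-in sqlite3 as a stand-in for the PostgreSQL warehouse; the table name, cleaning rules, and quality check are illustrative, not from the posting:

```python
import sqlite3

def run_etl(records, conn):
    """Minimal extract-transform-load pass.

    records: iterable of raw dicts (the "extract" output).
    Cleans each record (transform) and inserts it (load). sqlite3
    stands in here for the PostgreSQL warehouse named in the posting.
    """
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL)")
    cleaned = [
        (int(r["id"]), round(float(r["amount"]), 2))
        for r in records
        if r.get("amount") is not None    # drop rows failing a quality check
    ]
    conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
raw = [{"id": "1", "amount": "19.996"}, {"id": "2", "amount": None}]
print(run_etl(raw, conn))  # → 1
```

In a NiFi deployment the same three stages map onto processors and connections rather than function calls, but the data-quality gating in the transform step is the part the posting emphasizes.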

Posted 1 week ago

Apply

8.0 - 10.0 years

8 - 10 Lacs

Remote, India

On-site

Job Description
- Bachelor's degree or higher in a quantitative field such as analytics, engineering, mathematics, computer science, physics, or a related technical discipline
- 8+ years (6+ years with a Master's) of professional experience in a data-driven, analytical finance role
- Demonstrated proficiency in SQL and Tableau, and experience with Python/R, to be tested during technical interviews
- Experience in quantitative analysis
- Demonstrated strong communication skills
- Strong data storytelling skills, including the ability to distill technical insights into a clear and compelling narrative
- Experience building intuitive Tableau dashboards that produce actionable data insights
- Experience working independently and as a member of a cross-functional team
- Hands-on professional experience in finance analytics, preferably at a SaaS subscription-based company
- Experience with dbt is a bonus

Posted 1 week ago

Apply

6.0 - 10.0 years

6 - 10 Lacs

Remote, India

On-site

JOB DESCRIPTION

SUMMARY: The Software Developer is responsible for the development and maintenance of software products. These responsibilities include assisting with requirements gathering, coding, testing, and installing applications. This position will work with the development manager to ensure that project deadlines are met and that the work meets user requirements.

JOB DUTIES & RESPONSIBILITIES:
- Build scalable, secure, and robust technical solutions aligned with the product vision, learning and incorporating new technologies as appropriate.
- Produce high-quality, bug-free code per the coding standards, in close collaboration with other members of the Engineering/QA team.
- Participate in Agile methodologies for all aspects of the software development lifecycle (SDLC).
- Develop high-level and low-level technical design documentation.
- Work with the support team to resolve production support issues.
- Participate in peer reviews.
- Mentor junior developers and help build the knowledge base.
- Assist with technical documentation.
- Consult with QA staff on strategies for testing specific work items.
- Other duties as deemed necessary by management.
- Contribute to the success of the organization by helping others accomplish job results, learning new skills needed by the team, and finding new ways to help the team.

EXPERTISE AND QUALIFICATIONS

WORK EXPERIENCE AND EDUCATION REQUIREMENTS:
- Bachelor's degree from a four-year college or university in Computer Science or a relevant stream
- 8 to 10 years of total experience
- 6+ years of software development experience: design, build, and deploy scalable cloud solutions, utilizing the full spectrum of AWS Cloud PaaS offerings
- Experience building efficient backend services using Node.js and NestJS (mandatory)
- Experience developing interactive, responsive user interfaces using ReactJS, Tailwind CSS, and Zustand (mandatory)
- Experience working with Docker / EKS (mandatory)
- Experience building infrastructure resources on AWS efficiently using Terraform (preferred)
- Experience implementing CI/CD pipelines with tools like GitHub Actions / AWS CodePipeline (preferred)
- Strong problem-solving skills in a fast-paced environment
- Capable of system tuning, code optimization, and bug fixing
- Working experience in an agile team is preferred

Posted 1 week ago

Apply

5.0 - 6.0 years

5 - 6 Lacs

Remote, India

On-site

Job Description
- Proactively drive data science engineering projects forward with a self-motivated, go-getter attitude, effectively navigating ambiguity and managing at-times incomplete requirements.
- Build on top of and utilize Azure or AWS cloud platforms, leveraging familiarity with components such as IAM, storage, compute, services, and application development.
- Develop, maintain, and optimize Python code bases to ensure performance, readability, and adherence to code standards like PEP 8, including implementing comprehensive test coverage.
- Design, develop, and deploy scalable, performant Python web services and APIs for diverse architectures, including synchronous and asynchronous REST APIs.
- Implement and maintain event-driven product architectures and batch processing systems to support scalable, efficient data processing.
- Develop and deploy LLM-based and GenAI applications using tools and frameworks such as OpenAI, Hugging Face Transformers, LlamaIndex, and vector stores/databases like Chroma, FAISS, Qdrant, and Weaviate.
- Utilize the Elastic Stack (Elasticsearch, Logstash, Kibana) and Databricks for data processing and analytics, as preferred.
- Effectively use version control, containerization, and CI/CD pipelines, deploying applications on Azure using Git and Docker.
- Manage SQL databases such as PostgreSQL, ensuring efficient backend operations and data integrity.
- Collaborate effectively with cross-functional teams, including data scientists, engineers, and architects, to build and release data and AI applications.
- Communicate clearly and effectively in English, facilitating interaction and collaboration within a globally distributed and diverse data science and engineering team.

SKILLS
- Utilize distributed data processing frameworks such as Apache Spark, Dask, and Databricks.
- Implement logging and monitoring using Azure Log Analytics, Azure Monitor, or Prometheus.
- Manage the development lifecycle, including code development, testing, and deployment to production environments.
- Apply parallel processing and concurrency paradigms in Python and .NET (desirable).
- Develop basic front ends with JavaScript/TypeScript and frameworks like React or Vue (desirable).
- Proficiency in a statically typed language such as C# (.NET) or Java (desirable).

PROFESSIONAL EXPERIENCE/QUALIFICATIONS
- 5-6 years of development and backend engineering experience, with at least a year of experience shipping production-grade codebases.
- Bachelor's (B.E., B.Tech) or Master's (M.E., M.Tech) in computer science or software engineering is required.
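The event-driven batch-processing bullet above can be sketched with nothing but the standard library: producers put events on an asyncio.Queue and a small worker pool drains them concurrently. The doubling step stands in for real processing; everything here is illustrative, not from the posting.

```python
import asyncio

async def worker(queue: asyncio.Queue, results: list) -> None:
    """Consume events from the queue until a None sentinel arrives."""
    while True:
        event = await queue.get()
        if event is None:
            queue.task_done()
            break
        results.append(event * 2)   # stand-in for real processing
        queue.task_done()

async def process_batch(events):
    """Event-driven batch sketch: enqueue events, drain with 3 workers."""
    queue: asyncio.Queue = asyncio.Queue()
    results: list = []
    workers = [asyncio.create_task(worker(queue, results)) for _ in range(3)]
    for e in events:
        await queue.put(e)
    for _ in workers:
        await queue.put(None)       # one shutdown sentinel per worker
    await asyncio.gather(*workers)
    return sorted(results)

print(asyncio.run(process_batch([1, 2, 3])))  # → [2, 4, 6]
```

The same queue-and-workers shape scales up to a message broker and containerized consumers in a production event-driven architecture.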

Posted 1 week ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Remote, India

On-site

Job Description
- Build scalable, secure, and robust technical solutions aligned with the product vision, learning and incorporating new technologies as appropriate.
- Produce high-quality, bug-free code per the coding standards, in close collaboration with other members of the Engineering/QA team.
- Participate in Agile methodologies for all aspects of the software development lifecycle (SDLC).
- Develop high-level and low-level technical design documentation.
- Work with the support team to resolve production support issues.
- Participate in peer reviews.
- Mentor junior developers and help build the knowledge base.
- Assist with technical documentation.
- Consult with QA staff on strategies for testing specific work items.
- Other duties as deemed necessary by management.
- Contribute to the success of the organization by helping others accomplish job results, learning new skills needed by the team, and finding new ways to help the team.

- Bachelor's degree from a four-year college or university in Computer Science or a relevant stream
- 10+ years of software development experience: design, build, and deploy scalable cloud solutions, utilizing the full spectrum of AWS Cloud PaaS offerings
- Experience building efficient backend services using Node.js and NestJS (mandatory)
- Experience developing interactive, responsive user interfaces using ReactJS, Tailwind CSS, and Zustand (mandatory)
- Experience working with SQL databases (preferably PostgreSQL), creating complex SQL queries, and query optimization
- Experience working with Docker / EKS (mandatory)
- Experience building infrastructure resources on AWS efficiently using Terraform (preferred)
- Experience implementing CI/CD pipelines with tools like GitHub Actions / AWS CodePipeline (preferred)
- Strong problem-solving skills in a fast-paced environment
- Capable of system tuning, code optimization, and bug fixing
- Working experience in an agile team is preferred.

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Remote, India

On-site

Job Description

External Description
- Collaborating with cross-functional teams to understand client needs and pain points related to data processing and storage.
- Developing and executing a product strategy to centralize and optimize data through ETL processes and our EDW.
- Leading the design and implementation of efficient data pipelines to ensure seamless data extraction, transformation, and loading.
- Identifying opportunities to enhance data-driven decision-making processes for clients through improved data infrastructure.
- Working closely with KF Digital solution areas product management teams, engineering, design, and architecture to prioritize features and ensure timely delivery of product enhancements.
- Analyzing market trends and competitor offerings to inform product roadmap decisions.
- Communicating the value proposition of the ETL processes and EDW to internal stakeholders and clients.

Education:
- BA/BS in business administration, computer science/IT, engineering, design, marketing, or data science.
- 3-5 years of product management experience across the product lifecycle (ideation to launch to sunset)
- 1-3 years of hands-on experience working with ETL processes and enterprise data warehouses, transforming data into organized, accessible formats (ETL workflows, data integration, data modeling, data architecture, data warehousing, and big data solutions).
- HCM background (talent management, talent assessment, core HR, or workforce management)
- Experience with HRIS, ATS, or LMS integrations is a big plus.

External Skills And Expertise
- Strong written and verbal communication skills
- Fluent in data processing and integration processes, with the ability to translate these processes for non-technical stakeholders.
- Ability to communicate ideas through data flow diagrams and process maps.
- Ability to work collaboratively with engineering, data architecture, and IT teams.
- Experience with the Atlassian suite (Jira, Confluence, Trello), ProductBoard, Aha!, Figma, Lucid, or other related tools for roadmaps, requirements, and wireframing.
- Advanced PowerPoint and Excel skills (or similar), with proficiency in data analysis and reporting.
- Well organized and able to plan and execute projects in a structured manner.

Posted 1 week ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Remote, India

On-site

Job Description
- Collaborating with cross-functional teams to understand client needs and pain points related to data utilization.
- Developing and executing a product strategy to centralize and visualize organizational data on the platform.
- Leading the design and implementation of intuitive visualization tools to showcase insights derived from client data.
- Identifying opportunities to enhance data-driven decision-making processes for clients through the analytics app.
- Working closely with KF Digital solution areas product management teams, engineering, design, and architecture to prioritize features and ensure timely delivery of product enhancements.
- Analysing market trends and competitor offerings to inform product roadmap decisions.
- Communicating the value proposition of the analytics app to internal stakeholders and clients.

Education:
- BA/BS in business administration, computer science/IT, engineering, design, marketing, or data science.
- 5-7 years of product management experience across the product lifecycle (ideation to launch to sunset)
- 3-5 years of hands-on SaaS experience working with enterprise data, transforming it into actionable insights and intuitive visualizations (reports, dashboards, analytics, telemetry, usage analytics, enterprise insights, data analytics and visualization, business intelligence, consumer insights, or big data solutions).
- HCM background (talent management, talent assessment, core HR, or workforce management)
- Experience with HRIS, ATS, or LMS integrations is a big plus.
- Strong written and verbal communication skills
- Fluent in technical processes, with the ability to translate them for non-technical stakeholders.
- Ability to communicate ideas through mocks/wireframes.
- Ability to work collaboratively with engineering, design, and architecture.
- Experience with the Atlassian suite (Jira, Confluence, Trello), ProductBoard, Aha!, Figma, Lucid, or other related tools for roadmaps, requirements, and wireframing.
- Advanced PowerPoint and Excel skills (or similar).
- Well organized and able to plan and execute projects in a structured manner.

Posted 1 week ago

Apply

4.0 - 6.0 years

4 - 6 Lacs

Remote, India

On-site

Job Description

MUST HAVES:
- 4+ years of Business Systems Analyst experience within Salesforce
- Experience with CI/CD development processes based in GitHub
- Experience setting up permission sets
- Configuration experience in Salesforce
- Experience creating Flows in Salesforce

PLUSSES:
- Scratch orgs within Salesforce
- Experience working on multiple workstreams at once
- Salesforce Marketing Cloud
- Experience integrating with other Salesforce tools

DAY-TO-DAY
This resource will work to deliver Salesforce capabilities to members for applications. They will need to be capable of moving between the technical and business sides of the organization, performing administrative tasks and supporting the current workstream. The team works in two-week sprints, and responsibilities include:
- Gathering requirements
- Setting up permission sets
- Configuration
- Solution design
- Documentation
- Development
- Testing and implementation
- User training and support
- Salesforce integration with other systems
- Report and dashboard development
- Workflow automation

Posted 1 week ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Remote, India

On-site

Job Description

MUST HAVES:
- 5+ years of SFDC/Salesforce implementation and development experience
- Apex & LWC programming experience
- Git or other version control experience
- CI/CD environment experience
- Experience working in the Salesforce Partner Ecosystem

PLUSSES:
- Integration between Salesforce and REST/SOAP (i.e., writing custom integrations with APIs)
- Scratch orgs
- Flows
- Jira
- Experience with Workday, Oracle, or CIAM

DAY-TO-DAY
This resource will work on the Salesforce Partner Relationship Management workstream within Experience Cloud for the second phase of the organization's Salesforce migration from Classic to Lightning. This product team is currently focused on bridging the connection between Salesforce and vendor tools through API connectivity. The team works in two-week sprints, and responsibilities include:
- Helping other team members and assisting with data analysis
- Working on assigned stories
- Taking part in refinement meetings
- Troubleshooting and deployments
- Attending stand-ups and Scrum meetings
- Connecting Salesforce with REST and SOAP endpoints
- Asking questions about stories and helping to size them
- Participating in peer reviews with other team members
- Promoting changes through GitHub from the scratch org to the SIT environment
- Fixing bugs found by the testing team

Posted 1 week ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Remote, India

On-site

Job Description:
- Lead and participate in the end-to-end implementation of Salesforce solutions, with a focus on Service Cloud. Collaborate with cross-functional teams to design, develop, and deploy high-quality solutions tailored to meet client requirements.
- Work closely with stakeholders to gather, analyze, and document detailed requirements. Utilize your extensive experience to translate business needs into technical specifications, ensuring alignment with Salesforce best practices.
- Demonstrate a deep understanding of Service Cloud functionalities, including case management, knowledge base, omni-channel support, and service automation. Leverage your expertise to optimize Service Cloud configurations for complex business processes.
- Apply your extensive knowledge of help desk systems to design and implement seamless integrations with Salesforce Service Cloud. Ensure a smooth transition and an enhanced user experience for help desk operations.
- Collaborate effectively with development teams, administrators, and business analysts to drive successful Salesforce implementations. Provide technical guidance and mentorship to junior team members.
- Conduct thorough testing and validation of Salesforce solutions to guarantee functionality, performance, and security. Address and resolve any issues promptly to deliver high-quality, error-free implementations.
- Create comprehensive technical documentation, including solution design documents, process flow diagrams, and system configurations. Maintain accurate records of customizations and integrations for future reference.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of hands-on experience in Salesforce implementations, specifically in Service Cloud.
- Advanced proficiency in English (both written and spoken).
- Strong expertise in large-scale help desk implementations and related technologies.
- Proven experience gathering and translating complex business requirements into technical solutions.
- Salesforce certifications (e.g., Certified Application Architect, Certified System Architect) preferred.
- Exceptional problem-solving skills and ability to work in a fast-paced environment.
- Excellent interpersonal and communication skills, with keen attention to detail.

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Remote, India

On-site

Target environment: web applications and APIs hosted on cloud services.

- Conducting performance testing: creating, modifying, and executing test scripts to verify system functionality, capacity, reliability, and scalability.
- Analysing results: analysing application CPU usage, heap memory, garbage collection (GC) activity, and threads using profiling tools.
- Monitoring Azure/AWS cloud resources for reliability during performance tests.
- Monitoring and diagnosing: monitoring application health, diagnosing performance issues, and identifying bottlenecks to suggest improvements.
- Collaboration: working closely with development teams to handle bug fixes and resolve performance problems.
- Automation: developing automated test scenarios for performance testing.
- Reporting: generating performance test reports and providing recommendations for system enhancements.
- Load testing: solid experience in load and stress testing.

EXPERTISE AND QUALIFICATIONS
- Technical proficiency: expertise in distributed test automation execution, configuration of monitors, and performance monitoring tools.
- Problem-solving: ability to analyze and profile performance issues to find root causes and provide solutions.
- Programming knowledge: familiarity with scripting languages and the ability to create and modify test scripts.
- System monitoring: skill in monitoring system resources such as memory and CPU utilization on various operating systems.
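A load test ultimately reduces to timing many requests and summarizing the latency distribution. A minimal sketch, assuming a stand-in workload instead of real HTTP traffic; the percentile function uses the nearest-rank method, which is one common convention among several.

```python
import random
import time

def measure_latencies(call, requests: int = 50):
    """Run `call` repeatedly and record wall-clock latencies in ms."""
    samples = []
    for _ in range(requests):
        start = time.perf_counter()
        call()
        samples.append((time.perf_counter() - start) * 1000.0)
    return samples

def percentile(samples, pct: float) -> float:
    """Nearest-rank percentile, the usual load-test summary statistic."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

# Stand-in workload; a real run would issue requests against the system
# under test (e.g. with JMeter, k6, or Locust) instead of sleeping.
def fake_request():
    time.sleep(random.uniform(0.001, 0.003))

latencies = measure_latencies(fake_request, requests=20)
print(f"p95 = {percentile(latencies, 95):.2f} ms")
```

Dedicated tools also drive concurrency and ramp-up profiles, but the p95/p99 summary they report is computed the same way.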

Posted 1 week ago

Apply

2.0 - 6.0 years

2 - 6 Lacs

Remote, India

On-site

- Proven work history with Dedicated SQL Pools. Microsoft Azure Synapse Analytics experience is essential (Dedicated SQL Pool, Azure Data Factory, Azure Storage).
- Design and implement end-to-end data solutions using Azure Synapse.
- Will likely have a degree in Computer Science, Statistics, Informatics, Information Technology, or another quantitative field.
- Excellent communication skills, with the ability to work within a team, across the business, and build strong relationships.
- End-to-end data warehouse experience: ingestion, ETL, big data pipelines, data architecture, message queuing, stream processing, BI/reporting, and data security.
- Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management, as well as the ability to manipulate, process, and extract value from large, disconnected structured and unstructured datasets.
- Advanced SQL/relational database knowledge (including SSIS) and query authoring (SQL).
- Experience performing root cause analysis on data, answering specific business questions, and identifying opportunities for improvement.
- Understanding of machine learning and artificial intelligence, including ML libraries and frameworks (TensorFlow, Spark, etc.).
- Experience with data governance (quality, lineage, data dictionary, and security).
- Proficiency in a programming language such as Python (or equivalent) for data engineering tasks.
- Familiarity with Git for version control and Azure DevOps for continuous integration and continuous deployment (CI/CD).
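The root-cause-analysis and data-governance bullets above usually start with simple profiling queries: row counts, NULL keys, duplicate keys. A sketch using sqlite3 as a stand-in for a Synapse dedicated SQL pool; the table and column names are invented for the example.

```python
import sqlite3

def quality_report(conn, table: str, key: str):
    """Basic data-quality checks of the kind a warehouse ingest runs:
    total rows, NULL key values, and duplicated key values."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {key} IS NULL").fetchone()[0]
    dupes = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {key} FROM {table} "
        f"WHERE {key} IS NOT NULL GROUP BY {key} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    return {"rows": total, "null_keys": nulls, "duplicate_keys": dupes}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "a"), (1, "b"), (None, "c")])
print(quality_report(conn, "customers", "id"))
# → {'rows': 3, 'null_keys': 1, 'duplicate_keys': 1}
```

The same counts, tracked over time per ingest batch, become the quality-monitoring signal a governance process alerts on.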

Posted 1 week ago

Apply

7.0 - 14.0 years

2 - 6 Lacs

Remote, India

On-site

Job Description

Role and Responsibilities:
- Handle day-to-day administration tasks such as configuring clusters and workspaces; monitor platform health, troubleshoot issues, and perform routine maintenance and upgrades.
- Evaluate new features and enhancements introduced by Databricks from a security, compliance, and manageability perspective.
- Implement and maintain security controls to protect the Databricks platform and the data within it. Collaborate with the security team to ensure compliance with data privacy and regulatory requirements.
- Develop and enforce governance policies and practices, including access management, data retention, and data classification.
- Optimize the platform's performance by monitoring resource utilization, identifying and resolving bottlenecks, and fine-tuning configurations.
- Collaborate with infrastructure and engineering teams to ensure that the platform meets scalability and availability requirements.
- Work closely with data analysts, data scientists, and other users to understand their requirements and provide technical support.
- Automate platform deployment, configuration, and monitoring processes using scripting languages and automation tools.
- Collaborate with the DevOps team to integrate the Databricks platform into the overall infrastructure and CI/CD pipelines.

What we look for:
- 7+ years of experience with big data technologies such as Apache Spark, cloud-native data lakes, and data mesh platforms, in a technical architecture or consulting role
- Strong experience administering and managing Databricks or other big data platforms on AWS
- Python programming skills in technical areas that support the deployment and integration of Databricks-based solutions
- Understanding of the latest services offered by Databricks, evaluating those services, and understanding how they fit into the platform
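The resource-utilization monitoring duty above can be sketched as a threshold check over per-cluster metrics. The metric shape and limits below are hypothetical; a real implementation would pull the numbers from the Databricks REST API or the monitoring stack rather than a literal dict.

```python
def utilization_alerts(metrics, cpu_limit=0.85, mem_limit=0.90):
    """Flag clusters whose utilization breaches thresholds.

    metrics: {cluster_name: {"cpu": fraction, "mem": fraction}} — this
    shape is an assumption for the sketch, not a Databricks API format.
    """
    alerts = []
    for cluster, m in sorted(metrics.items()):
        if m["cpu"] > cpu_limit:
            alerts.append(f"{cluster}: CPU {m['cpu']:.0%} above limit")
        if m["mem"] > mem_limit:
            alerts.append(f"{cluster}: memory {m['mem']:.0%} above limit")
    return alerts

sample = {"etl-prod": {"cpu": 0.91, "mem": 0.40},
          "adhoc": {"cpu": 0.30, "mem": 0.95}}
print(utilization_alerts(sample))
```

Wired into a scheduler and a notification channel, a check like this becomes the automated monitoring the posting asks for.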

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Remote, India

On-site

Job Description Responsibilities In this role, you'll: Serve as the Global Oracle SME for the Source to Pay area - Payables, Cash Management and Expenses modules You will act as the solution architect for this area mindful of how solutions impact the current and future ERP IT landscape. You will also mentor and act as a coach for junior members of the global S2P team. Be comfortable with articulating solution proposals to all levels of management and end users. Cultivate strong business relationships with key stakeholders (In India, U.S. and EMEA) You will be comfortable working with globally dispersed I.T. teams and senior business leadership being predominantly based in the U.S. Qualifications Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having desired qualifications make for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table! Required: Bachelor's degree in Business, Information Technology, or related field. Advanced degrees and Oracle Fusion Certifications are a plus. Knowledge of Oracle Cloud SaaS platform and related technologies to extract / upload data Knowledge of industry leading practices - deep techno functional detail in S2P and other processes including (but not restricted to) Vendor management, Self-service Procurement, OCR technologies and capabilities, Invoice processing/Accounts Payable, Cash management, tax (especially Withholding tax), Expenses and GL accounting.You will be responsible for design and architecture decisions with the team. 
- Architect and lead cross-functional requirement sessions to elicit, document, and analyse business requirements and functional specifications, including identifying unspoken or conflicting requirements and challenging the norm.
- 5+ years of experience and an in-depth understanding of S2P processes in Oracle Cloud ERP. Experience within a global, SaaS, technology company preferred.
- Architect, design, document, configure, deliver, and automate in areas of S2P across various applications such as Onit, Beeline, Lease calcs, and other integrated (external to Oracle AP) finance applications.
- Proficient in process and solution engineering, and in managing and delivering multiple cross-functional projects and integrations.
- Excellent verbal, presentation, written, and interpersonal communication skills; English is a must.
- Experience working with a distributed team across time zones, and comfortable working in second shift (2 - 11 PM IST).
- Minimum of 3 end-to-end implementation projects in Oracle ERP.
- Detail-oriented with strong organizational skills.
- Teamwork-oriented with a strong focus on customer satisfaction.

Desired:
- Oracle Fusion certifications
- SAFe / Agile certifications

Focus Areas

Functional Focus Areas:
- Oracle Cloud techno-functional knowledge - PO, AP, CE & Expenses
- Project lead experience
- Integrations
- Stakeholder management

Functional Competencies:
- Oracle S2P functional knowledge - focusing on Payables, Expenses & Cash Management
- Focus on AP, Expenses and Cash Management
- Business process flows
- Problem solving and analysis
- Integrations
- Working in global teams
- Project experience / scenarios

Posted 1 week ago

Apply

4.0 - 8.0 years

4 - 8 Lacs

Remote, India

On-site

Core Skills and Competencies
1. Design, develop, and maintain data pipelines, ETL/ELT processes, and data integrations to support efficient and reliable data ingestion, transformation, and loading.
2. Collaborate with API developers and other stakeholders to understand data requirements and ensure the availability, reliability, and accuracy of the data.
3. Optimize and tune the performance of data processes and workflows to ensure efficient data processing and analysis at scale.
4. Implement data governance practices, including data quality monitoring, data lineage tracking, and metadata management.
5. Work closely with infrastructure and DevOps teams to ensure the scalability, security, and availability of the data platform and data storage systems.
6. Continuously evaluate and recommend new technologies, tools, and frameworks to improve the efficiency and effectiveness of data engineering processes.
7. Collaborate with software engineers to integrate data engineering solutions with other systems and applications.
8. Document and maintain data engineering processes, including data pipeline configurations, job schedules, and monitoring and alerting mechanisms.
9. Stay up-to-date with industry trends and advancements in data engineering, cloud technologies, and data processing frameworks.
10. Provide mentorship and guidance to junior data engineers, promoting best practices in data engineering and ensuring the growth and development of the team.
11. Able to implement and troubleshoot REST services in Python.

External Skills and Expertise
1. Strong proficiency in Python or another programming language commonly used in data engineering, such as Java.
2. Excellent SQL skills with the ability to work with data across different SQL databases, including Postgres, Databricks, and SQL Server; NoSQL databases are a plus.
3. In-depth knowledge of cloud platforms such as AWS, Azure, or Google Cloud, and experience with the services and tools they offer for data storage, processing, and analytics (e.g., S3, Redshift, BigQuery, Dataflow).
4. Experience in building and optimizing ETL/ELT processes, data pipelines, and workflows using tools like Apache NiFi or Apache Kafka.
5. Familiarity with data modeling, data warehousing concepts, and data governance best practices.
6. Strong problem-solving and troubleshooting skills, with the ability to identify and resolve data quality and performance issues.
7. Experience with version control systems, such as Git, and knowledge of CI/CD practices in a data engineering context.
8. Knowledge of data security and privacy principles, as well as experience implementing data access controls and managing sensitive data.
9. Knowledge of Elasticsearch/Logstash implementation is very desirable.
10. Understanding of distributed computing principles and experience with distributed data processing frameworks like Apache Spark or Hadoop.
11. Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes.
12. Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams and translate business requirements into technical solutions.
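A minimal sketch of the extract-transform-load pattern the first responsibility describes, using only the Python standard library. The CSV field names, table name, and the data-quality rule (drop rows with unparseable amounts) are illustrative assumptions, not a specific pipeline from this posting:

```python
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalize names and cast amounts, dropping malformed rows."""
    out = []
    for row in rows:
        try:
            out.append((row["name"].strip().lower(), float(row["amount"])))
        except (KeyError, ValueError):
            continue  # simple data-quality filtering
    return out

def load(records: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: insert cleaned records into a target table; return the row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", records)
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

raw = "name,amount\n Alice ,10.5\nBob,oops\nCarol,3\n"
conn = sqlite3.connect(":memory:")
print(load(transform(extract(raw)), conn))  # 2 (Bob's row is dropped)
```

The same extract/transform/load split is what orchestration tools like NiFi or workflow schedulers wrap at production scale; keeping each stage a pure function makes the pipeline testable in isolation.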

Posted 1 week ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Remote, India

On-site

Job Description
- Responsible for high-quality, bug-free development per the coding standards, in close collaboration and interaction with other members of the development/QA team.
- Participate in daily project scrum meetings and provide a daily personal status report.
- Create user documentation for completed solutions.
- Work with the support team to resolve production support issues.
- Mentor junior developers and help build the knowledge base.
- Participate in peer reviews.
- Assist with technical documentation.
- Consult with QA staff on strategies for testing specific work items.
- Other duties as deemed necessary by management.
- Contribute to the success of the organization by helping others accomplish job results, learning new skills needed by the team, and finding new ways to help the team.

Requirements:
- Bachelor's degree from a four-year college or university in Computer Science or IT
- 3+ years of software development and testing experience using NodeJS, Java
- Experience working with REST APIs and API security
- Experience working with ReactJS/Angular
- Experience with SQL (Postgres, MSSQL), ORMs, NoSQL (MongoDB)
- Experience in product development
- Excellent communication skills
- Understanding of the SDLC
- Capable of system tuning, code optimization, and bug solving
- Familiar with source control principles and systems
- Strong problem-solving skills in a fast-paced environment
- Working experience in an agile team is preferred

Posted 1 week ago

Apply

4.0 - 8.0 years

4 - 8 Lacs

Remote, India

On-site

Job Description
Build scalable, secure, and robust technical solutions with product vision, learning and incorporating new technologies as appropriate.
- Produce high-quality and bug-free code per the coding standards, in close collaboration and interaction with other members of the Engineering/QA team.
- Participate in Agile methodologies for all aspects of the software development lifecycle (SDLC) process.
- Develop high-level and low-level technical design documentation.
- Work with the support team to resolve production support issues.
- Participate in peer reviews.
- Mentor junior developers and help build the knowledge base.
- Assist with technical documentation.
- Consult with QA staff on strategies for testing specific work items.
- Other duties as deemed necessary by management.
- Contribute to the success of the organization by helping others accomplish job results, learning new skills needed by the team, and finding new ways to help the team.

External Skills and Expertise
- Bachelor's degree from a four-year college or university in Computer Science or relevant streams
- 4+ years of software development experience designing, building, and deploying scalable cloud solutions, utilizing the full spectrum of AWS Cloud PaaS offerings
- Experience building efficient backend services using Node.js / NestJS (Mandatory)
- Experience building infrastructure resources on AWS efficiently using Terraform (Mandatory)
- Experience implementing CI/CD pipelines with tools like GitHub Actions / AWS CodePipeline (Preferred)
- Experience working with Docker / EKS (Mandatory)
- Experience developing interactive and responsive user interfaces, leveraging ReactJS and Tailwind CSS (Preferred)
- Experience with any of the following tools is an added advantage: Kafka / Keycloak / Grafana / Elasticsearch
- Strong problem-solving skills in a fast-paced environment
- Capable of system tuning, code optimization, and bug solving
- Working experience in an agile team is preferred

Posted 1 week ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Remote, India

On-site

Job Description
Documentum experience is a must.
- Design, build, and maintain the CI/CD infrastructure and tools, including the container platform (Kubernetes).
- Experience with Documentum version upgrades; WebLogic experience.
- Implement features and functionality for major and minor releases, mainly on the automation stack, including Documentum platform technologies as necessary (D2, Indexing Services, Retention Policy Services).
- Work on the technical design of CI/CD components.
- Ensure that project/enhancement work is delivered within agreed time, cost, and quality constraints, following the release calendars.
- Resolve issues and provide debugging if necessary, primarily for CI/CD tool components.
- Address all CI/CD infrastructure technical questions.
- Responsible for continuous improvement and management of changes affecting the services.
- Collaborate with the application and project management teams that build and manage the applications running on top of the CI/CD platforms, to ensure all necessary requests are managed within the CI/CD framework.
- Take accountability for ensuring adherence to Security and Compliance policies and procedures.
- Work with multiple product vendors to address bugs and perform root-cause analysis.

Posted 1 week ago

Apply

5.0 - 15.0 years

5 - 15 Lacs

Remote, India

On-site

Job Description
Build scalable, secure, and robust technical solutions with product vision, learning and incorporating new technologies as appropriate.
- Produce high-quality and bug-free code per the coding standards, in close collaboration and interaction with other members of the Engineering/QA team.
- Participate in Agile methodologies for all aspects of the software development lifecycle (SDLC) process.
- Develop high-level and low-level technical design documentation.
- Work with the support team to resolve production support issues.
- Participate in peer reviews.
- Mentor junior developers and help build the knowledge base.
- Assist with technical documentation.
- Consult with QA staff on strategies for testing specific work items.
- Other duties as deemed necessary by management.
- Contribute to the success of the organization by helping others accomplish job results, learning new skills needed by the team, and finding new ways to help the team.

Requirements:
- Bachelor's degree from a four-year college or university in Computer Science or relevant streams
- 6+ years of software development experience designing, building, and deploying scalable cloud solutions, utilizing the full spectrum of AWS Cloud PaaS offerings
- Experience building efficient backend services using Node.js and NestJS (Mandatory)
- Experience working with SQL databases (preferably PostgreSQL), creating complex SQL queries, and query optimization
- Experience working with Docker / EKS (Mandatory)
- Experience building infrastructure resources on AWS efficiently using Terraform (Preferred)
- Experience implementing CI/CD pipelines with tools like GitHub Actions (Preferred)
- Strong problem-solving skills in a fast-paced environment
- Capable of system tuning, code optimization, and bug solving
- Working experience in an agile team is preferred

Posted 1 week ago

Apply

10.0 - 15.0 years

10 - 15 Lacs

Remote, India

On-site

Job Description
Build scalable, secure, and robust technical solutions with product vision, learning and incorporating new technologies as appropriate.
- Produce high-quality and bug-free code per the coding standards, in close collaboration and interaction with other members of the Engineering/QA team.
- Participate in Agile methodologies for all aspects of the software development lifecycle (SDLC) process.
- Develop high-level and low-level technical design documentation.
- Work with the support team to resolve production support issues.
- Participate in peer reviews.
- Mentor junior developers and help build the knowledge base.
- Assist with technical documentation.
- Consult with QA staff on strategies for testing specific work items.
- Other duties as deemed necessary by management.
- Contribute to the success of the organization by helping others accomplish job results, learning new skills needed by the team, and finding new ways to help the team.

External Skills and Expertise
- Bachelor's degree from a four-year college or university in Computer Science or relevant streams
- 10+ years of software development experience designing, building, and deploying scalable cloud solutions, utilizing the full spectrum of AWS Cloud PaaS offerings
- Experience building efficient backend services using Node.js and NestJS (Mandatory)
- Experience developing interactive and responsive user interfaces, leveraging ReactJS, Tailwind CSS, and Zustand (Mandatory)
- Experience working with SQL databases (preferably PostgreSQL), creating complex SQL queries, and query optimization
- Experience working with Docker / EKS (Mandatory)
- Experience building infrastructure resources on AWS efficiently using Terraform (Preferred)
- Experience implementing CI/CD pipelines with tools like GitHub Actions / AWS CodePipeline (Preferred)
- Strong problem-solving skills in a fast-paced environment
- Capable of system tuning, code optimization, and bug solving
- Working experience in an agile team is preferred

Posted 1 week ago

Apply

0.0 - 2.0 years

0 - 2 Lacs

Remote, India

On-site

Job Description

Mandatory Skills: SAP S/4HANA Cloud for Warehouse Management, Public Edition; SAP Supply Chain Management; SAP-SCM: TM

Requirements:
- Application-specific requirement analysis and solution consulting
- Performance of feasibility studies / providing best solutions
- Solid understanding of SAP S/4HANA and SAP TM, and integration with other related modules
- Understands the business process in the supply chain
- Customer-facing support
- Personal drive, with a track record of working to tight timelines in a challenging environment

Key Responsibilities and Objectives
- Extensive experience and understanding of the SAP Logistics Management (TM / LBN) applications implementation and support life cycle, and how this is used to support core business capabilities and processes for the supply chain
- Strong experience in the logistics management process
- Develop and enhance SAP TM interfaces, reports, and workflows, following best practices for development in an SAP TM environment
- Involvement in at least ONE full-cycle implementation in SAP TM
- Optimize performance and ensure seamless integration with other SAP modules
- Support testing, troubleshooting, and deployment of SAP TM solutions
- Provide hypercare support and technical guidance to BAU teams going forward
- Articulate and a good communicator, with the ability to explain technical issues in an easy-to-understand way
- Experience of working within a global organisation
- Broad knowledge of IT and business context
- Understands the importance of effective communication and influencing skills within teams, vendors, suppliers, stakeholders, and leadership teams

Key Interfaces & Stakeholders
- Role reports to the Lipton IT Lead
- Engagement with product vendor(s) for supply chain
- Engagement with supply chain functional leaders, process owners, and super users

Posted 1 week ago

Apply

2.0 - 6.0 years

2 - 6 Lacs

Remote, India

On-site

Job Description
Ascendion acquired Zenith, a legacy PL1-based multi-bank financial management platform (accounting, cash management, bill pay, payroll, outsourced services, and integrated banking) serving high-net-worth individuals. To modernize and expand market share, we are rebuilding the platform, enhancing functionality, and enabling white-labelling to drive deposit growth and customer acquisition.

Posted 1 week ago

Apply


