2.0 - 7.0 years
9 - 13 Lacs
Pune, Gurugram
Work from Office
We are seeking a highly skilled Development Lead with expertise in Generative AI and Large Language Models (LLMs) in particular to join our dynamic team. As a Development Lead, you will play a key role in developing cutting-edge LLM applications and systems for our clients. Your primary focus will be on driving innovation and leveraging LLMs to create impactful solutions. The ideal candidate will have a strong technical background and a passion for pushing the boundaries of LLM applications.
Job Description:
Responsibilities:
• Develop and extend digital products and creative applications with LLM technologies at their core.
• Lead a product development and product operations team to develop, enhance, and extend existing digital products built on top of LLMs.
• Lead client onboarding, rollout, and adoption efforts, maximizing use of the product across multiple clients.
• Lead enhancements and extensions for client-specific capabilities and requests.
• Successfully lead and deliver projects involving Cloud Gen-AI platforms and Cloud AI services, data pre-processing, Cloud AI PaaS solutions, and LLMs.
• Work with base foundation LLMs and fine-tuned LLMs, across a variety of different LLMs and LLM APIs.
• Conceptualize, design, and build experiences and solutions that demonstrate the minimum required functionality within tight timelines.
• Collaborate with creative technology leaders and cross-functional teams to test the feasibility of new ideas, refine and validate client requirements, and translate them into working prototypes and, from there, into scalable Gen-AI solutions.
• Research emerging trends and techniques in generative AI and LLMs to stay at the forefront of innovation.
• Research new products, platforms, and frameworks in generative AI on an ongoing basis and stay on top of this dynamic, evolving field.
• Design and optimize Gen-AI apps for efficient data processing and model leverage.
• Implement LLMOps processes and manage Gen-AI apps and models across the lifecycle, from prompt management to results evaluation.
• Evaluate and fine-tune models to ensure high performance and accuracy.
• Collaborate with engineers to develop and integrate AI solutions into existing systems.
• Stay up to date with the latest advancements in Gen-AI and contribute to the company's technical knowledge base.
Must-Have:
• Strong expertise in Python development and the Python ecosystem, including frameworks/libraries for front-end and back-end development, data processing, API integration, and AI/ML solution development.
• Minimum 2 years of hands-on experience working with Large Language Models.
• Hands-on experience building production solutions with a variety of LLMs and models: primarily the Azure OpenAI GPT model family, but also Google Gemini, Anthropic Claude, etc.
• Deep expertise in Cloud Gen-AI platforms, services, and APIs, primarily Azure OpenAI.
• Deep hands-on experience with RAG pipelines and enterprise technologies, solutions, and frameworks, including LangChain, LlamaIndex, etc.
• Solid hands-on experience developing end-to-end RAG pipelines.
• Solid hands-on experience with AI and LLM workflows.
• Experience with LLM model registries (Hugging Face), LLM APIs, embedding models, etc.
• Experience with vector databases (Azure AI Search, AWS Kendra, FAISS, Milvus, etc.).
• Experience with LLM evaluation frameworks such as Ragas, and their use to evaluate and improve LLM outputs.
• Experience in data preprocessing and post-processing model/results evaluation.
• Hands-on experience with API integration and orchestration across multiple platforms.
• Good experience with workflow builders and low-code workflow tools such as Azure Logic Apps or n8n (Nodemation).
• Good experience with serverless cloud applications, including cloud/serverless functions on Azure.
• Good experience with automation workflows and building automation solutions to facilitate rapid onboarding for digital products.
• Ability to lead design and development teams for full-stack Gen-AI apps and products/solutions built on LLMs and diffusion models.
• Ability to lead design and development for creative experiences and campaigns built on LLMs and diffusion models.
Nice-to-Have Skills (not essential, but useful):
• Good understanding of Transformer models and how they work.
• Hands-on experience with fine-tuning LLMs at scale.
• Good experience with agent-driven Gen-AI architectures and solutions, and working with AI agents.
• Some experience with single-agent and multi-agent orchestration solutions.
• Hands-on experience with diffusion models and AI art models, including SDXL, DALL-E 3, Adobe Firefly, and Midjourney, is highly desirable.
• Hands-on experience with image processing and creative automation at scale using AI models.
• Hands-on experience with image and media transformation and adaptation at scale using AI art and diffusion models.
• Hands-on experience with dynamic creative use cases using AI art and diffusion models.
• Hands-on experience with fine-tuning diffusion models, including techniques such as LoRA for AI art models.
• Hands-on experience with AI speech models and services, including text-to-speech and speech-to-text.
• Good background in machine learning solutions and algorithms.
• Experience designing, developing, and deploying production-grade machine learning solutions.
• Experience with Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs).
• Experience with custom ML model development and deployment.
• Proficiency in deep learning frameworks such as TensorFlow, PyTorch, or Keras.
• Strong knowledge of machine learning algorithms and their practical applications.
• Experience with cloud ML platforms such as Azure ML Service, AWS SageMaker, and NVIDIA AI Foundry.
• Hands-on experience with video generation models.
• Hands-on experience with 3D generation models.
Location: DGS India - Pune - Kharadi EON Free Zone Brand: Dentsu Creative Time Type: Full time Contract Type: Consultant
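Since the listing above centers on RAG pipelines, a minimal, framework-free sketch of the retrieve-then-prompt pattern may help illustrate what is being asked for. The toy `embed` function and documents below are stand-ins; a real pipeline would use an embedding model and a vector store such as FAISS or Azure AI Search, as the must-haves state:

```python
import math

def embed(text):
    # Toy embedding: keyword counts over a tiny fixed vocabulary, a stand-in
    # for a real embedding model (e.g. one served via Azure OpenAI).
    vocab = ["refund", "shipping", "invoice", "warranty", "password"]
    t = text.lower()
    return [t.count(w) for w in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank documents by similarity to the query embedding and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    # Ground the eventual LLM call in retrieved context; the completion call
    # itself is omitted here.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

In production the same three steps (embed, retrieve, assemble prompt) are typically handled by frameworks the listing names, such as LangChain or LlamaIndex.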
Posted 1 week ago
1.0 - 3.0 years
0 - 1 Lacs
Raipur
Work from Office
Tally | MS Office | Excel | All bank-related work | GST filing | Auditing | TDS calculation and return | Construction company experience | Salary sheet preparation | Banking reconciliation | Data entry | Billing | E-way bill generation
Posted 1 week ago
0.0 - 3.0 years
0 - 1 Lacs
Hyderabad
Work from Office
Job Summary: Little Einsteins Neknampur - International Preschool is looking for a motivated, detail-oriented, and friendly Receptionist & Admin Executive to join our team. This role focuses on managing the front desk, assisting with administrative tasks, and ensuring smooth day-to-day operations at our preschool.
Key Responsibilities:
Reception & Administrative Duties:
• Greet and assist parents, children, and visitors with a warm and professional demeanor.
• Answer and direct phone calls, respond to emails, and manage daily correspondence.
• Maintain accurate records of student attendance, enrollment forms, and other essential documentation.
• Schedule appointments, school tours, and meetings as required.
• Assist with the enrollment process, including distributing and collecting forms from parents.
• Manage inventory of office supplies and classroom materials, ensuring timely reordering.
• Perform general administrative tasks such as filing, data entry, photocopying, and documentation support.
• Provide administrative assistance to teachers and staff as needed.
General Skills and Attributes:
• Excellent organizational skills and attention to detail.
• Ability to multitask and prioritize tasks effectively.
• Friendly and approachable demeanor with a passion for working with children and families.
• Basic knowledge of Microsoft Office Suite (Word, Excel, PowerPoint) and other office tools.
Work Environment: The role is based at the Little Einsteins Neknampur - International Preschool in a friendly, educational, and interactive setting. Frequent interaction with children, parents, and staff is part of the role, contributing to a warm, community-focused work environment.
Application Process: Interested candidates are invited to send their resume outlining their relevant experience to: leneknampur@lepreschools.com We look forward to hearing from enthusiastic candidates who are eager to contribute to the success of our preschool and daycare center!
Little Einsteins Neknampur H.No: 4-3/13, Plot No: 13 & 14, EVV Colony, Near Chinthachettu Circle, Neknampur, Alkapur, Manikonda, Hyderabad. Pin Code: 500089. Mobile Number: 9030257030
Posted 1 week ago
0.0 years
1 - 3 Lacs
Ahmedabad
Work from Office
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Mega Virtual Drive for Customer Service roles - English + Hindi Language on 16th June 2025 (Monday) || Ahmedabad Location
Date: 16-June-2025 (Monday)
MS Teams meeting ID: 466 340 118 246 3
MS Teams Passcode: fM9uG9Te
Time: 12:00 PM - 1:00 PM
Job Location: Ahmedabad (Work from office)
Languages Known: Hindi + English
Shifts: Flexible with any shift
Responsibilities
• Respond to customer queries and customers' concerns
• Provide support for data collection to enable recovery of the account for the end user
• Maintain a deep understanding of client processes and policies
• Reproduce customer issues and escalate product bugs
• Provide excellent customer service to our customers
• Exhibit capacity for critical thinking and analysis
• Showcase a proven work ethic, with the ability to work well both independently and within a larger collaborative environment
Qualifications we seek in you
Minimum qualifications
• Graduate (any discipline except law)
• Only freshers are eligible
• Fluency in English and Hindi is mandatory
Preferred qualifications
• Effective probing, analyzing, and understanding skills
• Analytical skills with a customer-centric approach
• Excellent proficiency in written English, with a neutral English accent
• Ability to work on a flexible schedule (including weekend shifts)
Why join Genpact?
• Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation
• Make an impact: drive change for global enterprises and solve business challenges that matter
• Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities
• Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day
• Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress
Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. **Note: Please keep your E-Aadhar card handy while appearing for interview.
Posted 1 week ago
3.0 - 5.0 years
5 - 12 Lacs
Hyderabad
Work from Office
Job Summary: Join our team of professionals focused on anti-money laundering, client due diligence, sanctions screening, and anti-bribery and corruption. We use domain knowledge combined with technology to offer our clients a unique package of operational delivery and excellence: rebuilding compliance frameworks, providing support and recommendations on regulatory requirements, and implementing process enhancements with a commitment to both quantity and quality of delivery. Your role will be driven by our clients' needs and by your ability to align your domain knowledge to your role requirements for financial crime compliance.
Your key responsibilities:
• Review monthly alerts and understand alert generation due to threshold breaches or spikes in a customer account.
• Understand transaction patterns in terms of suspicious/anomalous activity while performing periodic reviews.
• Must have exposure to global alert management tools (Actimize, Norkom, UCM, etc.).
• Establish and implement money-laundering rules in the transaction monitoring system, covering all bank products.
• Determine the source and utilization of funds for customers.
• Interpret KYC policies, procedures, and laws and put them into practice; should be aware of UBOs.
• Perform KYC reviews on high-, medium-, and low-risk entities.
• Knowledge of PEP classification and naming conventions.
• Understanding of high-risk jurisdictions, sanctioned entities/individuals, different types of trade sanctions, SDN, etc.
• Fair understanding of identifying the relationship between customers and counterparties/intermediaries.
• Exposure to preparing AML case logs and validating information on transactions and counterparties via external applications (e.g., LexisNexis, DB, etc.).
• Adapt to multitasking and meeting deadlines in a high-pressure environment.
• Strong documentation skills to clearly articulate alert dispositions.
To qualify for the role you must have:
• A bachelor's degree and around 2-4 years of work experience, with good knowledge of transaction monitoring/KYC; 3 years of experience in compliance or a related position.
• A degree in finance, accounting, business, or a related discipline.
• Exceptional research and analytical skills, with the ability to analyze large amounts of data, decipher higher-risk attributes (transactional, geographical, product, customer type, etc.), and develop well-reasoned recommendations.
• Ability to perform KYC reviews on different entity types such as trusts, hedge funds, and regulated entities.
• Good understanding of the USA PATRIOT Act, BSA, and CIP, and knowledge of World-Check, LexisNexis, and negative-news searches.
• Strong, proven communication skills demonstrated through effective writing and presentations to clients and internal stakeholders.
Posted 1 week ago
5.0 - 10.0 years
12 - 22 Lacs
Bengaluru
Work from Office
Detailed JD (Roles and Responsibilities):
• Proficiency in Snowflake and Unix scripting.
• Experience working on data warehouse projects involving Snowflake, with exposure to ETL and data processing.
• Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
• Good exposure to Python scripting.
• Good exposure to SQL concepts.
• Good understanding of data warehouse concepts.
Mandatory skills: Snowflake, SQL, Unix
Desired/secondary skills: Python
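As a rough illustration of the SQL and data-processing concepts this role asks for: a staple warehouse task is keeping only the latest row per key, which Snowflake expresses with `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)`. The sketch below mirrors that pattern in plain Python; column names are made up for illustration:

```python
# Keep only the latest record per customer_id, mirroring the SQL pattern:
#   QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id
#                              ORDER BY updated_at DESC) = 1
# Column names ("customer_id", "updated_at") are illustrative, not from a
# real schema.
def latest_per_key(rows, key="customer_id", order_by="updated_at"):
    best = {}
    for row in rows:
        k = row[key]
        # Replace the stored row whenever a more recent one appears.
        if k not in best or row[order_by] > best[k][order_by]:
            best[k] = row
    return list(best.values())
```

The same single pass over the data is what the warehouse engine effectively performs when the partition fits in memory.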
Posted 1 week ago
2.0 - 7.0 years
20 - 25 Lacs
Hyderabad
Work from Office
About the ORG: Amazon is looking for an energetic and enthusiastic candidate to join the fast-paced world of payroll operations. Payroll is processed on a weekly, bi-weekly, and monthly basis in multiple states, and this person will perform a variety of technical tasks relative to assigned areas of responsibility, including data compilation and support of the Payroll Team.
Responsibilities include:
• Process payroll utilizing vendor payroll software
• Audit payroll-related data
• Process and input garnishments, child support, levies, and liens
• Review and process timesheet input records for employees
• Process manual check calculations, work with the vendor to process stop payments/reversals, enter paycheck card entries, assist with check distribution, and back up other payroll analysts as needed
• Produce timely responses to employee inquiries
A day in the life: Processing US payroll and garnishment orders received from courts and agencies.
Basic Qualifications:
• Graduation or equivalent degree
• 2+ years of relevant experience
• Proficient in Microsoft Excel
• Flexible to work in shifts
Preferred Qualifications:
• US/CA payroll experience
• FPC certified
Posted 1 week ago
9.0 - 14.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Candidates for this position are preferred to be based in Bangalore, India, and will be expected to comply with their team's hybrid work schedule requirements.
The Advertising Optimization & Automation Science team is central to this effort. We leverage machine learning and generative AI to streamline campaign workflows, delivering impactful recommendations on budget allocation, target Return on Ad Spend (tROAS), and SKU selection. Additionally, we are developing intelligent systems for creative optimization and exploring agentic frameworks to further simplify and enhance advertiser interactions.
We are looking for an experienced Senior Machine Learning Scientist to join the Advertising Optimization & Automation Science team. In this role, you will be responsible for building intelligent, ML-powered systems that drive personalized recommendations and campaign automation within Wayfair's advertising platform. You will work closely with other scientists, as well as members of our internal Product and Engineering teams, to apply your ML expertise to define and deliver 0-to-1 capabilities that unlock substantial commercial value and directly enhance advertiser outcomes.
What You'll Do:
• Design and build intelligent budget, tROAS, and SKU recommendations, and simulation-driven decisioning that extends beyond the current advertising platform capabilities.
• Lead the next phase of GenAI-powered creative optimization and automation to drive significant incremental ad revenue and improve supplier outcomes.
• Raise technical standards across the team by promoting best practices in ML system design and development.
• Partner cross-functionally with Product, Engineering, and Sales to deliver scalable ML solutions that improve supplier campaign performance.
• Ensure systems are designed for reuse, extensibility, and long-term impact across multiple advertising workflows.
• Research and apply best practices in advertising science, GenAI applications in creative personalization, and auction modeling, keeping Wayfair at the forefront of innovation in supplier marketing optimization.
• Collaborate with Engineering teams (AdTech, ML Platform, Campaign Management) to build and scale the infrastructure needed for automated, intelligent advertising decisioning.
We Are a Match Because You Have:
• Bachelor's or Master's degree in Computer Science, Mathematics, Statistics, or a related field.
• 9+ years of experience building large-scale machine learning algorithms.
• 4+ years of experience in an architect or technical leadership position.
• Strong theoretical understanding of statistical models such as regression and clustering, and of ML algorithms such as decision trees, neural networks, transformers, and NLP techniques.
• Proficiency in programming languages such as Python and relevant ML libraries (e.g., TensorFlow, PyTorch) to develop production-grade products.
• Strategic thinking with a customer-centric mindset and a desire for creative problem solving, looking to make a big impact in a growing organization.
• Demonstrated success influencing senior-level stakeholders on strategic direction based on recommendations backed by in-depth analysis; excellent written and verbal communication.
• Ability to partner cross-functionally to own and shape technical roadmaps.
• Intellectual curiosity and a desire to always be learning!
Nice to have:
• Experience with GCP, Airflow, and containerization (Docker).
• Experience building scalable data processing pipelines with big data tools such as Hadoop, Hive, SQL, Spark, etc.
• Familiarity with Generative AI and agentic workflows.
• Experience in Bayesian learning, multi-armed bandits, or reinforcement learning.
Your personal data is processed in accordance with our Candidate Privacy Notice ( https://www.wayfair.com/careers/privacy ).
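The multi-armed bandits mentioned among the nice-to-haves map naturally onto ad budget allocation. Below is a minimal epsilon-greedy sketch; the campaign bookkeeping and reward model are invented for illustration and are not Wayfair's actual system:

```python
import random

# Epsilon-greedy bandit: mostly pick the campaign with the best observed
# average reward (exploit), occasionally pick at random (explore).
# stats maps campaign name -> (total_reward, pulls).
def choose_campaign(stats, epsilon=0.1, rng=random):
    if rng.random() < epsilon:
        return rng.choice(list(stats))  # explore
    def mean(c):
        total, pulls = stats[c]
        # Unseen arms get priority so every campaign is tried at least once.
        return total / pulls if pulls else float("inf")
    return max(stats, key=mean)         # exploit

def update(stats, campaign, reward):
    # Record one observation (e.g. attributed revenue per unit of spend).
    total, pulls = stats[campaign]
    stats[campaign] = (total + reward, pulls + 1)
```

Production systems would layer contextual features and delayed-conversion handling on top, but the explore/exploit loop is the same.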
Posted 1 week ago
5.0 - 9.0 years
20 - 27 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
• Designing and implementing data processing systems using Microsoft Fabric, Azure Data Analytics, Databricks, and other distributed frameworks (e.g., Hadoop, Spark, Snowflake, Airflow).
• Writing efficient and scalable code to process, transform, and clean large volumes of structured and unstructured data.
• Designing data pipelines: Snowflake Data Cloud uses data pipelines to ingest data into its system from sources like databases, cloud storage, or streaming platforms. A Snowflake Data Engineer designs, builds, and fine-tunes these pipelines to make sure that all data is loaded into Snowflake correctly.
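The transform-and-clean duties above can be pictured as a small validation stage that runs before rows are bulk-loaded into the warehouse. Field names and rules here are illustrative, not from any real schema:

```python
# Minimal transform/clean stage of an ingestion pipeline. Rows that fail
# validation are routed to a reject list instead of being loaded, a common
# pattern ahead of a bulk-load step such as Snowflake's COPY INTO.
def clean_stage(raw_rows):
    loaded, rejected = [], []
    for row in raw_rows:
        amount = str(row.get("amount", "")).strip()
        try:
            row["amount"] = float(amount)   # enforce a numeric type
        except ValueError:
            rejected.append(row)            # quarantine malformed rows
            continue
        row["country"] = str(row.get("country", "")).strip().upper()
        loaded.append(row)
    return loaded, rejected
```

Keeping rejects instead of dropping them silently makes the pipeline auditable, which matters once volumes are large.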
Posted 1 week ago
7.0 - 12.0 years
5 - 10 Lacs
Ahmedabad
Work from Office
Job Title: Senior Specialist - Data Protection
Locations: London/Essex/India | Hybrid
Get To Know the Team: Join this dynamic team as a Senior Specialist in the Data Protection team, where you will support the company's products and/or service offerings by ensuring compliance with applicable federal securities laws and state requirements.
Why You Will Love It Here!
• Flexibility: Hybrid work model and a business casual dress code, including jeans
• Your Future: Professional development reimbursement
• Work/Life Balance: Flexible personal/vacation time off, sick leave, paid holidays
• Your Wellbeing: Medical, dental, vision, Employee Assistance Program, parental leave
• Diversity & Inclusion: Committed to welcoming, celebrating, and thriving on diversity
• Training: Hands-on, team-customized, including SS&C University
• Extra Perks: Discounts on fitness clubs, travel, and more!
What You Will Get To Do:
• Provide support, advice, and guidance across the Data Protection Framework, including data processing queries, data breach management, supplier due diligence, ROPAs, and privacy notices
• Create and assess DPIAs for change projects across the business
• Establish effective collaboration with the relevant internal stakeholders at all levels to facilitate delivery of the operational elements of the Data Protection Framework
• Develop relationships with internal and external parties to provide effective communication on data protection issues
• Participate in internal projects to provide context in the application of data protection requirements, supporting the creation of effective and pragmatic solutions
• Work closely with the business to ensure data protection processes and procedures remain fit for purpose and are updated to reflect data protection changes
• Provide support to the senior management of the Data Protection team in respect of data protection matters
• Identify and take action to help build upon our data protection culture
• Provide input/updates at relevant workshops on data-protection-related matters
What You Will Bring:
• Knowledge and a working understanding of data protection laws (especially UK & EU GDPR), codes of practice, and regulator guidance
• Working knowledge of data protection frameworks
• A data protection qualification (or willingness to work towards obtaining one) such as CIPP/E or PC.dp
• A pragmatic approach to data protection compliance
• A desire to work as part of a supportive, professional team and willingness to develop your knowledge and experience in data protection.
Posted 1 week ago
0.0 - 2.0 years
8 - 11 Lacs
Kozhikode
Work from Office
1. Batch Processing: Oversee the scheduling, processing, and management of batches in the LMS. Ensure that all batches are processed accurately and on time.
2. Data Management: Manage the data associated with batch processing. This includes ensuring data integrity, troubleshooting data issues, and making necessary corrections.
3. System Monitoring: Monitor the LMS to ensure it is functioning properly. Identify and resolve any issues that may affect batch processing.
4. Collaboration: Work closely with other teams, such as the Category Manager and Course Development, to ensure smooth operation of the LMS.
5. Reporting: Generate and provide reports on batch processing activities. Use these reports to identify areas for improvement and make recommendations.
Posted 1 week ago
1.0 - 5.0 years
4 - 8 Lacs
Mumbai
Work from Office
• Develop and maintain HR dashboards and reports using Power BI.
• Automate data processing using Excel macros to streamline HR workflows.
• Create and manage applications using Power Apps for HR process automation.
• Ensure data accuracy and generate insights for HR decision-making.
• Collaborate with HR teams to improve reporting efficiency and analytics.
• Manage employee data and ensure compliance with organizational policies.
Qualifications:
• Expertise in Excel macros, Power BI, and Power Apps.
• Strong analytical skills and understanding of HR data management.
• Ability to automate HR processes and generate meaningful reports.
• Problem-solving skills with attention to detail.
Posted 1 week ago
5.0 - 11.0 years
11 - 13 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
• Expertise in handling large-scale structured and unstructured data; efficiently handled large-scale generative AI datasets and outputs.
• Familiarity with Docker tools and pipenv/conda/poetry environments.
• Comfort following Python project management best practices (use of setup.py, logging, pytest, relative module imports, Sphinx docs, etc.).
• Familiarity with GitHub (clone, fetch, pull/push, raising issues and PRs, etc.).
• High familiarity with the use of DL theory/practices in NLP applications.
• Comfort coding in Hugging Face, LangChain, Chainlit, TensorFlow and/or PyTorch, scikit-learn, NumPy, and pandas.
• Comfort using two or more open-source NLP modules like spaCy, TorchText, fastai.text, farm-haystack, and others.
• Knowledge of fundamental text data processing (use of regex, token/word analysis, spelling correction/noise reduction in text, segmenting noisy unfamiliar sentences/phrases at the right places, deriving insights from clustering, etc.).
• Real-world implementation of BERT or other fine-tuned transformer models (sequence classification, NER, or QA), from data preparation and model creation through inference and deployment.
• Use of GCP services like BigQuery, Cloud Functions, Cloud Run, Cloud Build, and Vertex AI.
• Good working knowledge of other open-source packages to benchmark and derive summaries.
• Experience using GPU/CPU on cloud and on-prem infrastructure.
Education: Bachelor's in Engineering or Master's degree in Computer Science, Engineering, Maths, or Science. Completion of modern NLP/LLM courses or participation in open competitions is also welcome.
Responsibilities:
• Design NLP/LLM/GenAI applications/products following robust coding practices.
• Explore SoTA models/techniques that can be applied to automotive industry use cases.
• Conduct ML experiments to train/infer models; if needed, build models that abide by memory and latency restrictions.
• Deploy REST APIs or a minimalistic UI for NLP applications using Docker and Kubernetes tools.
• Showcase NLP/LLM/GenAI applications to users in the best way possible through web frameworks (Dash, Plotly, Streamlit, etc.).
• Converge multiple bots into super apps using LLMs with multimodalities.
• Develop agentic workflows using AutoGen, Agent Builder, and LangGraph.
• Build modular AI/ML products that can be consumed at scale.
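The fundamental text preprocessing this listing mentions (regex use, noise reduction, tokenization) can be sketched in a few lines. The patterns below are illustrative; real pipelines tune them per corpus:

```python
import re

# Strip URLs and non-alphanumeric noise, then tokenize on whitespace.
# Both patterns are deliberately simple examples, not production rules.
URL_RE = re.compile(r"https?://\S+")
NOISE_RE = re.compile(r"[^a-z0-9\s]")

def preprocess(text):
    text = text.lower()
    text = URL_RE.sub(" ", text)    # remove links before noise stripping
    text = NOISE_RE.sub(" ", text)  # drop punctuation and symbols
    return text.split()             # whitespace tokenization
```

Steps like spelling correction or subword tokenization (as used by BERT-style models) would slot in after this basic normalization.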
Posted 1 week ago
10.0 - 12.0 years
9 - 10 Lacs
Bengaluru
Work from Office
Introduction: Automotive . About the Role Harman Automotive Services is looking for a Manager - Sales and Revenue Operations to join the Automotive Services Business Operations team. In this role, you would be positioned within the Sales ecosystem and acting as a Strategic partner who is responsible for improving efficiency, introducing and executing transformative programs, provide inputs which would be key to growth and profitability. In this role you would be collaborating with the Sales, Presales, Customer Success, Finance, and Delivery teams to participate in activities required for the smooth functioning of the overall sales function. What You Will Do Managing the end-to-end Sales and Revenue Operations for a globally and functionally distributed Sales ecosystem, ensuring conventional governance of sales regimen and creative transformational methods to keep the sales engine humming. Provide actionable and strategic inputs on forecast, pipeline health, and risks ensuring that organizational goals are followed, and course corrections and interventions are not delayed. Participate in account plan reviews to understand Customer Success team s account footprint strategy and provide recommendations and track metrics related to customer engagement, farming opportunities, and profitability. Partner with Sales, Presales, Customer Success, Sales Strategy, and Partnerships teams to become accountable for the governance of sales pipeline, current pursuits, demand velocity, account plan, strategic initiatives and demand planning for new and existing businesses. Ensuring Sales data processing, management, and hygiene and understanding the importance of managing a multi-tool, multi-source data gathering and information extraction system. Work closely with Finance, Business Leads and Engineering Delivery teams to influence profitability metrics and plan ramp-ups, ramp-downs, and workforce additions and regional expansions. 
Participate in Sales reviews and governance calls to close open actions and blockers for seamless execution of the Sales function. Hold regular sync-ups with stakeholders from the Sales and Customer Success teams, presenting data efficiently through various dashboards to help them take decisions and actions for profitable business and growth. Follow up and ensure key steps in the sales lifecycle business processes are actioned on time and within SLAs. Manage a completion-based follow-up cadence with the functional step owners in different business processes. Publish weekly reports for business processes flagging progress, risks, and pending actions. Ensure data validation based on Harman standards for tools in the Harman ecosystem. What You Need to Be Successful 10-12+ years of experience in Sales or Revenue Operations within the Software Services industry. Experience working with Automotive Tier-1 or OEM companies or clients. Master's degree in business administration, preferably an MBA. Knowledge of key processes of Sales and Revenue Operations: Governance, Deal Tracking, Bid Management Lifecycle, Cost Control, and Revenue Optimization. Excellent hands-on experience using Salesforce. Proven business awareness or hands-on experience in: Working with Bid Management teams, with an understanding of the bidding lifecycle. Working with Customer Success, Farming, or Account Management teams, understanding the sales lifecycle of existing businesses. Working with New Business or hunting teams, understanding customer ecosystems and pipeline management. Working with Partnerships and Strategy teams, understanding and contributing to the overall Sales vision and strategy of the organization. Working with Finance controlling teams, understanding the commercial aspects of deal structuring and positioning. Working with Delivery and Engineering teams, understanding deal transition, resource planning, and ramp-up design for an opportunity from inception to kick-off. 
Expert in data analytics and representation, with the ability to create data visualizations in Microsoft Excel for multi-source and multidimensional data. Proven experience in creating presentations related to Sales and Revenue Operations, and a keen eye for mapping critical KPIs together to showcase the overall health of the Sales organization. Exceptional presentation skills and the ability to create and deliver presentations to multiple business stakeholders. Creative problem-solver with the ability to work with a blank slate and inspire others. Excellent organizational and cross-functional skills. Strong verbal, written, and presentation skills for effective communication at all levels of the organization. Bonus Points if You Have Experience working within an Automotive OEM or Tier-1 supplier in the Automotive industry. Experience using AI tools to maximize efficiency in Business Operations and Workforce Management. What Makes You Eligible Willingness to travel. Willingness to work in an office. Any offer of employment is conditioned upon the successful completion of a background investigation. What We Offer Flexible work environment, allowing for full-time remote work globally for positions that can be performed outside a HARMAN or customer location. Access to employee discounts on world-class HARMAN and Samsung products (JBL, Harman Kardon, AKG, etc.). Extensive training opportunities through our own HARMAN University. Competitive wellness benefits. Tuition reimbursement. Be Brilliant employee recognition and rewards program. An inclusive and diverse work environment that fosters and encourages professional and personal development. You Belong Here. About HARMAN: Where Innovation Unleashes Next-Level Technology
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines. About the team The mission of Roku's Data Engineering team is to develop a world-class big data platform so that internal and external customers can leverage data to grow their businesses. Data Engineering works closely with business partners and Engineering teams to collect metrics on existing and new initiatives that are critical to business success. As a Senior Data Engineer working on Device metrics, you will design data models and develop scalable data pipelines to capture different business metrics across Roku devices. About the role Roku pioneered streaming to the TV. We connect users to the streaming content they love, enable content publishers to build and monetize large audiences, and provide advertisers with unique capabilities to engage consumers. 
Roku streaming players and Roku TV models are available around the world through direct retail sales and licensing arrangements with TV brands and pay-TV operators. With tens of millions of players sold across many countries, thousands of streaming channels, and billions of hours watched over the platform, building a scalable, highly available, fault-tolerant big data platform is critical to our success. This role is based in Bangalore, India and requires hybrid working, with 3 days in the office. What you'll be doing Build highly scalable, available, fault-tolerant distributed data processing systems (batch and streaming) that process tens of terabytes of data ingested every day into a petabyte-sized data warehouse Build quality data solutions and refine existing diverse datasets into simplified data models that encourage self-service Build data pipelines that optimize for data quality and are resilient to poor-quality data sources Own the data mapping, business logic, transformations, and data quality Perform low-level systems debugging, performance measurement, and optimization on large production clusters Participate in architecture discussions, influence the product roadmap, and take ownership of and responsibility for new projects Maintain and support existing platforms and evolve them to newer technology stacks and architectures We're excited if you have Extensive SQL skills Proficiency in at least one scripting language; Python is required Experience in big data technologies like HDFS, YARN, MapReduce, Hive, Kafka, Spark, Airflow, Presto, etc. Proficiency in data modeling, including designing, implementing, and optimizing conceptual, logical, and physical data models to support scalable and efficient data architectures. 
Experience with AWS, GCP, or Looker is a plus Ability to collaborate with cross-functional teams such as developers, analysts, and operations to execute deliverables 5+ years professional experience as a data or software engineer BS in Computer Science; MS in Computer Science preferred #LI-AR3 Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits, which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter. The Roku Culture We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet . By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
Posted 1 week ago
7.0 - 12.0 years
6 - 10 Lacs
Bengaluru
Work from Office
We're Celonis, the global leader in Process Mining technology and one of the world's fastest-growing SaaS firms. We believe there is a massive opportunity to unlock productivity by placing data and intelligence at the core of business processes - and for that, we need you to join us. The Team: Our team is responsible for building the Celonis end-to-end Task Mining solution. Task Mining is the technology that allows businesses to capture user interaction (desktop) data, so they can analyze how people get work done, and how they can do it even better. We own all the related components, e.g. the desktop client, the related backend services, the data processing capabilities, and Studio frontend applications. The Role: Celonis is looking for a Senior Software Engineer to build new features and increase the reliability of our Task Mining solution. You would contribute to the development of our Task Mining Client, so expertise in C# and the .NET Framework is required; knowledge of Java and Spring Boot is a plus. The work you'll do: Implement highly performant and scalable desktop components to improve our existing Task Mining software Own the implementation of end-to-end solutions: leading the design, implementation, build, and delivery to customers Increase the maintainability, reliability, and robustness of our software Continuously improve and automate our development processes Document procedures and concepts, and share knowledge within and across teams Manage complex requests from support, finding the right technical solution and managing the communication with stakeholders Occasionally work directly with customers, including getting to know their system in detail and helping them debug and improve their setup. The qualifications you need: 7+ years of professional experience building .NET applications Passion for writing clean code that follows SOLID principles Hands-on experience in C# and the .NET Framework. Experience in user interface development using WPF and MVVM. 
Familiarity with Java and the Spring framework is a plus. Familiarity with containerization technologies (e.g. Docker) Experience with REST APIs and/or distributed microservice architectures Experience with monitoring and log analysis tools (e.g. Datadog) Experience writing and setting up unit and integration tests Experience refactoring legacy components. Able to supervise and coach junior colleagues Experience interacting with customers is a plus. Strong communication skills. What Celonis Can Offer You: Pioneer Innovation: Work with the leading, award-winning process mining technology, shaping the future of business. Accelerate Your Growth: Benefit from clear career paths, internal mobility, a dedicated learning program, and mentorship opportunities. Receive Exceptional Benefits: Including generous PTO, hybrid working options, company equity (RSUs), comprehensive benefits, extensive parental leave, dedicated volunteer days, and much more. Prioritize Your Well-being: Access to resources such as gym subsidies, counseling, and well-being programs. Connect and Belong: Find community and support through dedicated inclusion and belonging programs. Make Meaningful Impact: Be part of a company driven by strong values that guide everything we do: Live for Customer Value, The Best Team Wins, We Own It, and Earth Is Our Future. Collaborate Globally: Join a dynamic, international team of talented individuals. Empowered Environment: Contribute your ideas in an open culture with autonomous teams. About Us: Celonis makes processes work for people, companies, and the planet. The Celonis Process Intelligence Platform uses industry-leading process mining and AI technology and augments it with business context to give customers a living digital twin of their business operation. It's system-agnostic and without bias, and provides everyone with a common language for understanding and improving businesses. 
Celonis enables its customers to continuously realize significant value across the top, bottom, and green line. Celonis is headquartered in Munich, Germany, and New York City, USA, with more than 20 offices worldwide. Get familiar with the Celonis Process Intelligence Platform by watching this video. Celonis Inclusion Statement: At Celonis, we believe our people make us who we are and that The Best Team Wins. We know that the best teams are made up of people who bring different perspectives to the table. And when everyone feels included, able to speak up, and knows their voice is heard - that's when creativity and innovation happen. Your Privacy: Any information you submit to Celonis as part of your application will be processed in accordance with the Celonis Accessibility and Candidate Notices. By submitting this application, you confirm that you agree to the storing and processing of your personal data by Celonis as described in our Privacy Notice for the Application and Hiring Process. Please be aware of common job offer scams, impersonators, and frauds. Learn more here.
Posted 1 week ago
4.0 - 9.0 years
25 - 30 Lacs
Bengaluru
Work from Office
The Lending team at Grab is dedicated to building safe, secure loan products catering to all user segments across SEA. Our mission is to promote financial inclusion and support underbanked partners across the region. Data plays a pivotal role in our lending operations, guiding decisions across credit assessment, collections, reporting, and beyond. You will report to the Lead Data Engineer. This role is based in Bangalore. Get to Know the Role: As a Data Engineer in the Lending Data Engineering team, you will work with data modellers, product analytics, product managers, software engineers, and business stakeholders across SEA to understand the business and data requirements. You will build and manage the data asset, including acquisition, storage, processing, and use channels, using some of the most scalable and resilient open source big data technologies like Flink, Airflow, Spark, Kafka, Trino, and more on cloud infrastructure. You are encouraged to think out of the box and have fun exploring the latest patterns and designs. The Critical Tasks You Will Perform: Develop scalable, reliable ETL pipelines to ingest data from diverse sources. Build expertise in real-time data availability to support accurate real-time metric definitions. Implement data quality checks and governance best practices for data cleansing, assurance, and ETL operations. Use existing data platform tools to set up and manage pipelines. Improve data infrastructure performance to ensure reliable insights for decision-making. Design next-gen data lifecycle management tools/frameworks for batch, real-time, API-based, and serverless use cases. Build solutions using AWS services like Glue, Redshift, Athena, Lambda, S3, Step Functions, EMR, and Kinesis. Use tools like Amazon MSK/Kinesis for real-time data processing and metric tracking. Skills you need Essential Skills You'll Need: 4+ years of experience building scalable, secure, distributed data pipelines. 
Proficiency in Python, Scala, or Java for data engineering solutions. Knowledge of big data technologies like Flink, Spark, Trino, Airflow, Kafka, and AWS services (EMR, Glue, Redshift, Kinesis, and Athena). Solid experience with SQL, data modelling, and schema design. Hands-on with AWS storage and compute services (S3, DynamoDB, Athena, and Redshift Spectrum). Experience working with NoSQL, columnar, and relational databases. Curious and eager to explore new data technologies and solutions. Familiarity with in-house and AWS-native tools for efficient pipeline development. Ability to design event-driven architectures using SNS, SQS, Lambda, or similar serverless technologies. Experience with data structures, algorithms, or ML concepts. What we offer About Grab and Our Workplace Grab is Southeast Asia's leading superapp. From getting your favourite meals delivered to helping you manage your finances and getting around town hassle-free, we've got your back with everything. In Grab, purpose gives us joy and habits build excellence, while harnessing the power of Technology and AI to deliver the mission of driving Southeast Asia forward by economically empowering everyone, with heart, hunger, honour, and humility. Life at Grab We care about your well-being at Grab; here are some of the global benefits we offer: We have your back with Term Life Insurance and comprehensive Medical Insurance. With GrabFlex, create a benefits package that suits your needs and aspirations. Celebrate moments that matter in life with loved ones through Parental and Birthday leave, and give back to your communities through Love-all-Serve-all (LASA) volunteering leave. We have a confidential Grabber Assistance Programme to guide and uplift you and your loved ones through life's challenges. What We Stand For at Grab We are committed to building an inclusive and equitable workplace that enables diverse Grabbers to grow and perform at their best. 
As an equal opportunity employer, we consider all candidates fairly and equally regardless of nationality, ethnicity, religion, age, gender identity, sexual orientation, family commitments, physical and mental impairments or disabilities, and other attributes that make them unique.
Posted 1 week ago
2.0 - 7.0 years
12 - 17 Lacs
Bengaluru
Work from Office
2 - 7 years of experience in Python Good understanding of big data ecosystems and frameworks such as Hadoop, Spark, etc. Experience in developing data processing tasks using PySpark. Expertise in at least one popular cloud provider, preferably AWS, is a plus. Good knowledge of any RDBMS/NoSQL database with strong SQL writing skills Experience with data warehouse tools like Snowflake is a plus. Experience with any one ETL tool is a plus Strong analytical and problem-solving capability Excellent verbal and written communication skills Client-facing skills: solid experience working with clients directly, to be able to build trusted relationships with stakeholders Ability to collaborate effectively across global teams Strong understanding of data structures, algorithms, object-oriented design, and design patterns Experience in the use of multi-dimensional data, data curation processes, and the measurement/improvement of data quality. General knowledge of business processes, data flows, and quantitative models that generate or consume data Independent thinker, willing to engage, challenge, and learn new technologies Qualification Bachelor's or Master's degree in Computer Science or a related field. Certification from professional bodies is a plus. SELECTION PROCESS Candidates should expect 3 - 4 rounds of personal or telephonic interviews to assess fitment and communication skills.
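The PySpark data-processing work this posting describes can be illustrated with a tiny aggregation. Below is a minimal sketch in plain Python (stdlib only, so it stays self-contained; in PySpark the same shape would be expressed as `df.groupBy("key").avg("value")`). The function and sample data are hypothetical:

```python
from collections import defaultdict

def average_by_key(rows):
    """Group (key, value) pairs and compute the mean per key --
    the aggregation one would express in PySpark as
    df.groupBy("key").avg("value")."""
    totals = defaultdict(lambda: [0.0, 0])  # key -> [running sum, count]
    for key, value in rows:
        totals[key][0] += value
        totals[key][1] += 1
    return {k: s / n for k, (s, n) in totals.items()}

rows = [("a", 2.0), ("a", 4.0), ("b", 10.0)]
print(average_by_key(rows))  # {'a': 3.0, 'b': 10.0}
```

In a real Spark job the grouping and averaging would be distributed across executors; the shape of the computation is the same.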
Posted 1 week ago
9.0 - 11.0 years
11 - 13 Lacs
Gurugram
Work from Office
Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. How will you make an impact in this role? You will be responsible for delivering highly impactful analytics to understand and optimize our commercial acquisition site experience and increase digital conversion. Deliver strategic analytics focused on digital acquisition and membership experiences. Define and build key KPIs to monitor channel/product/platform health and success Support the development of new products and capabilities Deliver read-outs of campaigns, uncovering insights and learnings that can be used to further optimize the channels Gain a deep functional understanding of enterprise-wide product capabilities and associated platforms over time, and ensure analytical insights are relevant and actionable Power in-depth strategic analysis and provide analytical and decision support by mining digital activity data along with American Express closed-loop data Minimum Qualifications Advanced degree in a quantitative field (e.g. Finance, Engineering, Mathematics, Computer Science) Some experience with Big Data technologies (BigQuery, Hive, Spark), Python, and SQL. Experience in large-scale data processing and handling; an understanding of data science is a plus. Ability to work in a dynamic, cross-functional environment, with strong attention to detail. Excellent communication skills with the ability to engage, influence, and encourage partners to drive collaboration and alignment. Preferred Qualifications Strong analytical/conceptual thinking competence to solve unstructured and complex business problems and articulate key findings to senior leaders/partners in a succinct and concise manner. Basic knowledge of statistical techniques for experimentation and hypothesis testing: regression, t-test, or chi-square test. Strong programming skills
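The t-test named in the preferred qualifications can be computed without a statistics library. A minimal sketch of Welch's two-sample t statistic (illustrative only; the sample data and function name are made up, and the degrees-of-freedom/p-value step is omitted):

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples
    (unequal variances allowed)."""
    na, nb = len(sample_a), len(sample_b)
    mean_a = sum(sample_a) / na
    mean_b = sum(sample_b) / nb
    # Unbiased sample variances
    var_a = sum((x - mean_a) ** 2 for x in sample_a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in sample_b) / (nb - 1)
    return (mean_a - mean_b) / math.sqrt(var_a / na + var_b / nb)

# Example: page-load times (seconds) for two site variants
print(round(welch_t([5.1, 6.0, 7.2], [4.0, 4.4, 4.9]), 2))
```

In practice one would use `scipy.stats.ttest_ind(a, b, equal_var=False)` to also get the p-value; the sketch only shows the statistic itself.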
Posted 1 week ago
10.0 - 14.0 years
25 - 30 Lacs
Bengaluru
Work from Office
At Visa, the Corporate Information Technology, Billing & Incentives Platforms team enables Visa's revenue growth through flexible pricing engines and global revenue platforms built on next-generation technologies. This includes managing system requirements, evaluating cutting-edge technologies, design, development, integration, quality assurance, implementation, and maintenance of corporate revenue applications. The team works closely with business owners of these services to deliver custom-developed solutions, as well as implement industry-leading packaged software. This team has embarked on a major transformational journey to build and implement best-of-breed revenue and billing applications to transform our business as well as our technology. The candidate should enjoy working on diverse technologies and should be excited to take initiative to solve complex business problems and get the job done while taking on new challenges. You should thrive in team-oriented and fast-paced environments where each team member is vital to the overall success of the projects. Key Responsibilities Develop and maintain test automation scripts using PySpark for big data applications. Collaborate with data engineers and developers to understand data processing workflows and requirements. Design and implement automated tests for data ingestion, processing, and transformation in a Hadoop ecosystem. Perform data validation, data integrity, and performance testing for Spark applications. Utilize Spark-specific concepts such as RDDs, DataFrames, Datasets, and Spark SQL in test automation. Create and manage CI/CD pipelines for automated testing in a big data environment. Identify, report, and track defects, and work with the development team to resolve issues. Optimize and tune Spark jobs for performance and scalability. Maintain and update test cases based on new features and changes in the application. Document test plans, test cases, and test results comprehensively. 
Perform QA and manual testing for payments applications, ensuring compliance with business requirements and standards. Work with limited direction, usually within a complex environment, to drive delivery of solutions and meet service levels. Work productively with stakeholders in multiple countries and time zones. Through active engagement, collaboration, effective communication, quality, integrity, and reliable delivery, develop and maintain a trusted and valued relationship with the team, customers, and business partners. Basic Qualifications Bachelor's degree, OR 3+ years of relevant work experience Preferred Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field. Relevant certifications in Big Data or Spark
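The data-validation responsibility described above (checking ingested rows for missing fields and nulls before a Spark job consumes them) might be sketched as follows. This is plain Python rather than PySpark so it stays self-contained, and the function and field names are hypothetical:

```python
def validate_rows(rows, required_fields, non_null=()):
    """Return (row_index, problem) pairs for rows that are missing
    required fields or carry nulls in columns declared non-null."""
    problems = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if field not in row:
                problems.append((i, f"missing field: {field}"))
        for field in non_null:
            if row.get(field) is None:
                problems.append((i, f"null value: {field}"))
    return problems

sample = [
    {"txn_id": "t1", "amount": 12.5},
    {"amount": None},  # missing txn_id, null amount
]
print(validate_rows(sample, required_fields=("txn_id", "amount"),
                    non_null=("amount",)))
```

In an automated test suite, an empty result from such a check would become an assertion that gates the pipeline run.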
Posted 1 week ago
0.0 - 4.0 years
6 - 7 Lacs
Pune
Work from Office
About KPI Partners. KPI Partners is a leading provider of data analytics solutions, dedicated to helping organizations transform data into actionable insights. Our innovative approach combines advanced technology with expert consulting, allowing businesses to leverage their data for improved performance and decision-making. Job Description. We are seeking a skilled and motivated Data Engineer with experience in Databricks to join our dynamic team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and data processing solutions that support our analytics initiatives. You will collaborate closely with data scientists, analysts, and other engineers to ensure the consistent flow of high-quality data across our platforms. Key skills: Python, PySpark, Databricks, ETL, Cloud (AWS, Azure, or GCP) Key Responsibilities. - Develop, construct, test, and maintain data architectures (e.g., large-scale data processing systems) in Databricks. - Design and implement ETL (Extract, Transform, Load) processes to move and transform data from various sources to target systems. - Collaborate with data scientists and analysts to understand data requirements and design appropriate data models and structures. - Optimize data storage and retrieval for performance and efficiency. - Monitor and troubleshoot data pipelines to ensure reliability and performance. - Engage in data quality assessments, validation, and troubleshooting of data issues. - Stay current with emerging technologies and best practices in data engineering and analytics. Qualifications. - Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. - Proven experience as a Data Engineer or in a similar role, with hands-on experience in Databricks. - Strong proficiency in SQL and programming languages such as Python or Scala. - Experience with cloud platforms (AWS, Azure, or GCP) and related technologies. 
- Familiarity with data warehousing concepts and data modeling techniques. - Knowledge of data integration tools and ETL frameworks. - Strong analytical and problem-solving skills. - Excellent communication and teamwork abilities. Why Join KPI Partners? - Be part of a forward-thinking team that values innovation and collaboration. - Opportunity to work on exciting projects across diverse industries. - Continuous learning and professional development opportunities. - Competitive salary and benefits package. - Flexible work environment with hybrid work options. If you are passionate about data engineering and excited about using Databricks to drive impactful insights, we would love to hear from you! KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
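The "data quality assessments, validation, and troubleshooting" responsibility above usually starts with a standardization pass over raw records in the Transform step of an ETL pipeline. A minimal sketch (the record shape and field names are assumptions for illustration):

```python
def clean_record(raw):
    """Standardize one raw record: trim whitespace, lowercase the
    email, and coerce the amount to float (None if unparseable)."""
    try:
        amount = float(raw.get("amount"))
    except (TypeError, ValueError):
        amount = None  # flag unparseable amounts instead of failing
    return {
        "name": (raw.get("name") or "").strip(),
        "email": (raw.get("email") or "").strip().lower(),
        "amount": amount,
    }

print(clean_record({"name": " Ada ", "email": "ADA@X.COM", "amount": "3.5"}))
```

In Databricks the same per-record logic would typically be expressed as DataFrame column transformations so it runs distributed; the sketch just shows the cleaning rules.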
Posted 1 week ago
3.0 - 8.0 years
2 - 6 Lacs
Hyderabad
Work from Office
Azure Data Engineer - Soulpage IT Solutions March 6, 2025 Position: Azure Data Engineer Skill set: Azure Databricks and Data Lake implementation Experience: 3+ years Notice Period: Immediate to 15 days Location: WFO, Hyderabad Job Type: Full-Time Positions: 2 Job Summary: We are looking for a highly skilled Azure Data Engineer with expertise in Azure Databricks and Data Lake implementation to design, develop, and optimize our data pipelines. The engineer will be responsible for integrating data from multiple sources and ensuring data is cleaned, standardized, and normalized for ML model building. This role involves working closely with stakeholders to understand data requirements and ensuring seamless data flow across different platforms. Key Responsibilities: Data Lake & Pipeline Development Design, develop, and implement scalable Azure Data Lake solutions. Build robust ETL/ELT pipelines using Azure Databricks, Data Factory, and Synapse Analytics. Optimize data ingestion and processing from multiple structured and unstructured sources. Implement data cleaning, standardization, and normalization processes to ensure high data quality. Implement best practices for data governance, security, and compliance. Optimize data storage and retrieval for performance and cost-efficiency. Monitor and troubleshoot data pipelines, ensuring minimal downtime. Work closely with data scientists, analysts, and business stakeholders to define data needs. Maintain thorough documentation for data pipelines, transformations, and integrations. Assist in developing ML-ready datasets by ensuring consistency across integrated data sources. Required Skills & Qualifications: 3+ years of experience in data engineering, with a focus on Azure cloud technologies. Expertise in Azure Databricks, Data Factory, and Data Lake. Strong proficiency in Python, SQL, and PySpark for data processing and transformations. 
Understanding of ML data preparation workflows, including feature engineering and data normalization. Knowledge of data security and governance principles. Experience in optimizing ETL pipelines for scalability and performance. Strong analytical and problem-solving skills. Excellent written and verbal communication skills. Preferred Qualifications: Azure certifications: Azure Data Engineer Associate, Azure Solutions Architect. Why Join Us? Work on cutting-edge Azure cloud and data technologies. Collaborate with a dynamic and innovative team solving complex data challenges. Competitive compensation and career growth opportunities. Application Process: Interested candidates can send their resumes to [email protected] with the subject line: Application for Azure Data Engineer. We look forward to welcoming passionate individuals to our team!
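The data-normalization step mentioned in the qualifications above can be sketched as min-max scaling, one common choice when preparing ML-ready features (a minimal illustration; the function name is hypothetical):

```python
def min_max_normalize(values):
    """Rescale numeric values to the [0, 1] range; a constant
    column maps to all zeros to avoid division by zero."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

print(min_max_normalize([0.0, 5.0, 10.0]))  # [0.0, 0.5, 1.0]
```

In a pipeline, the min/max would be computed on the training split only and reused for later data, so that serving-time inputs are scaled consistently.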
Posted 1 week ago
2.0 - 7.0 years
3 - 7 Lacs
Kochi
Work from Office
Create the Annual Activity Planner and share it with the client and TPV. Approve and publish the final version of the agreed annual payroll calendar and system set-up. Agree the password format for the year. Service Delivery Act as the first point of escalation for payroll queries. Handle all non-payroll-related tickets under the correct function. Mass upload and master data processing in hrX (only if applicable) Exchange event monitoring (only for hrX clients) Manage RCAs - arrange RCAs, validate quality, etc. Use LVMS or BO reports to ensure all tickets are closed on time by the TPV Responsible for updating, maintaining, and enforcing the Defined Work Instructions (DWIs) and CLIENT Solution Workbook Responsible for the resolution of technical/functional issues escalated from the team, CLIENT, and/or Partner, ensuring all system issues/defects are reported correctly and tickets are logged with the necessary details and evidence so Application Services and/or Products can investigate SLA Reporting Cross-check the KPIs against the actual results and report to the TPV to identify and correct any deviation Update SLAs and fail reasons in LVMS, reported on a monthly basis. Change Requests Check Client/Strada CSW/SOW for compliance Check Strada/TPV CSW/SOW for compliance Notify the PSM of Change Requests raised Apply the CR process as per the VPS 3.0 standard process Update the CSW and get the client's approval on the changes in the documents Escalations SPOC for TPVs First escalation point for clients. Include escalations in the RAG with the PSM's help Manage issues that need to be escalated - TPV related Security and Compliance Initiate the SI process in case any SI is detected by the PSA Perform SOC1 controls Hyper-care Participate in hyper-care calls Collaborate with the Project Manager, PSM, and OA team for integration support, etc. Support and validate the tests performed during the pre-go-live phase. 
(UAT/SIT testing and data mapping configuration, support in process definition) VPS process Walkthrough call with all the new CLIENTs during Hypercare Governance Manage regular Operations calls (Corrections call/Post-payroll call etc) Prepare post payroll Review Deck. Manages Operational Plan to track actions/issues. Manage issues that need to be escalated - TPV related Ensure adherence to all agreed schedules as per SOW for Client/TPV Collaborate with PSMs to ensure on the quality of services provided by the TPV provided to the client. Requirements 2+ years of client /vendor management experience in similar industries Experience in leading and handling client call 3 years Degree/Dipolma
Posted 1 week ago
7.0 - 10.0 years
30 - 35 Lacs
Surat
Work from Office
KP Group is looking for a Sr. Manager to join our dynamic team and embark on a rewarding career journey. Delegating responsibilities and supervising business operations. Hiring, training, motivating, and coaching employees as they provide attentive, efficient service to customers; assessing employee performance and providing helpful feedback and training opportunities. Resolving conflicts or complaints from customers and employees. Monitoring store activity and ensuring it is properly provisioned and staffed. Analyzing information and processes and developing more effective or efficient processes and strategies. Establishing and achieving business and profit objectives. Maintaining a clean, tidy business and ensuring that signage and displays are attractive. Generating reports and presenting information to upper-level managers or other parties. Ensuring staff members follow company policies and procedures. Other duties to ensure the overall health and success of the business.
Posted 1 week ago
8.0 - 10.0 years
10 - 12 Lacs
Bengaluru
Work from Office
ECMS Req # / Demand ID: 529266. PU: DNA. Role: Technology Lead. Number of openings: 1. Duration of project: 1-2 years. Years of experience: 8-10.
Detailed job description - Skill Set: 6+ years of industry experience, including cloud technologies. Very strong hands-on experience in Databricks with AWS cloud services for data engineering/data processing. Hands-on experience in AWS cloud-based development and integration. Proficiency in Scala and Spark DataFrames for data processing and application development. Practical experience with data engineering and data ingestion/orchestration with Apache Airflow, and the accompanying DevOps with CI/CD tools. Strong knowledge of Spark and Databricks SQL for data engineering pipelines. Experience with the offshore/onshore model and agile methodology. Gathering requirements, understanding the business need, and holding regular discussions with the tech team on design and development activities. Good experience working with the client architect/design team to understand the architecture and requirements and work on the development. Experience working in the financial industry. Certifications in Databricks and AWS will be an added advantage.
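The pipeline work described above follows the common extract-transform-load shape. As a rough, dependency-free sketch (the record fields and the filter rule are illustrative assumptions; in the role itself each stage would be a Spark/Databricks job orchestrated as an Airflow task):

```python
# Rough ETL sketch in plain Python; each function stands in for what would
# be a Spark DataFrame stage on Databricks in the posting's stack.

def extract():
    # Stand-in for reading from a source table or S3 path (assumed shape).
    return [{"id": 1, "amount": 120.0}, {"id": 2, "amount": -5.0}]

def transform(rows):
    # Example business rule (an assumption): drop non-positive amounts and
    # add a derived column, as a Spark filter/withColumn pair would.
    return [{**r, "amount_cents": int(r["amount"] * 100)}
            for r in rows if r["amount"] > 0]

def load(rows, sink):
    # Stand-in for writing to a target table; here we append to a list.
    sink.extend(rows)
    return len(rows)

sink = []
loaded = load(transform(extract()), sink)
```

Keeping extract, transform, and load as separate callables mirrors how such stages are wired into an orchestrator as independent tasks with their own retries and schedules.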
Posted 1 week ago