
714 Vertex Jobs - Page 24

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

Pune, Maharashtra, India

On-site


We're not hiring a manager. We're hiring a builder. In 1914 Ernest Shackleton posted an ad: "Men wanted for hazardous journey. Low wages, bitter cold, long hours of complete darkness. Safe return doubtful. Honour and recognition in event of success." Commenda lives by the same spirit, minus the bitter cold and low wages. Backed by leading VCs including 1517 Fund, Pelion Venture Partners, Boost, Urban Innovation Fund, and more, we're racing to build the world's best transaction tax engine. If you're looking for a 9-to-5, close this tab. If you're ready to work harder than you ever have, and love it, read on.

The mandate
Own every rate, rule, threshold, and filing requirement for VAT and GST worldwide, starting with the EU, UK, India, UAE, Canada, and Brazil. Ship country coverage fast. Your north-star metric is simple: how many countries we launch, and how quickly.

What you'll do
- Research, verify, and maintain VAT/GST rates, boundaries, place-of-supply logic, product taxability, and return schemas
- Structure that content in clear, well-documented spreadsheets our engineers can ingest directly
- Track legislative changes in real time and push accurate updates into production inside 48 hours
- Model complex scenarios (OSS/IOSS, reverse charge, marketplace rules, imports/exports) so our API returns the right calculation every time
- Create rollout playbooks and, over time, hire and mentor regional specialists, while staying hands-on yourself
- Use AI and other automation tools to multiply your output

What you'll bring
- Five to ten years of deep VAT/GST expertise across multiple regions
- A proven record of turning raw legislation into clean, structured data or guidance that others implement
- Familiarity with tax engines (Vertex, Avalara, Stripe Tax, etc.) and how their content pipelines work
- Comfort collaborating with engineers and product teams, even if you don't write code yourself
- A relentless bias for action, an ownership mentality, and the stamina for startup pace ("Nobody ever changed the world on 40 hours a week")
- Crisp written and spoken English; additional languages are a plus
Degrees are optional. Results aren't.

Why Commenda
- Competitive salary plus meaningful equity
- Work side by side with founders and engineers in San Francisco, Bangalore, or London
- The chance to build the global VAT/GST platform you've always wished existed and watch it go live in weeks, not years
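The scenario-modeling work described here (place-of-supply, reverse charge) can be pictured as a lookup over a structured rate table. A minimal sketch, assuming simplified example rates and a simplified cross-border B2B reverse-charge rule; this is not Commenda's engine and the rates are not verified current law:

```python
# Illustrative VAT lookup. Rates and the reverse-charge rule are
# simplified examples only, not verified legislation.
VAT_RATES = {"GB": 0.20, "DE": 0.19, "IN": 0.18}  # assumed standard rates

def vat_due(amount, supplier_country, customer_country, reverse_charge=False):
    """Return VAT charged by the supplier on a sale."""
    if reverse_charge and supplier_country != customer_country:
        return 0.0  # customer self-accounts for VAT in their own country
    rate = VAT_RATES[customer_country]
    return round(amount * rate, 2)
```

In a real engine the table would carry effective dates, reduced rates, and product-taxability dimensions, which is exactly the structured content this role maintains.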

Posted 4 weeks ago

Apply

0 years

0 Lacs

Kolkata metropolitan area, West Bengal, India

On-site


We are seeking an experienced AI Solution Architect to lead the design and implementation of AI-driven, cloud-native applications. The ideal candidate will possess deep expertise in Generative AI, Agentic AI, cloud platforms (AWS, Azure, GCP), and modern data engineering practices. This role involves collaborating with cross-functional teams to deliver scalable, secure, and intelligent solutions in a fast-paced, innovation-driven environment.

Key Responsibilities:
- Design and architect AI/ML solutions, including Generative AI, Retrieval-Augmented Generation (RAG), and fine-tuning of Large Language Models (LLMs) using frameworks such as LangChain, LangGraph, and Hugging Face.
- Implement cloud migration strategies to move monolithic systems to microservices/serverless architectures on AWS, Azure, and GCP.
- Lead development of document automation systems leveraging models such as BART and LayoutLM, and Agentic AI workflows.
- Architect and optimize data lakes, ETL pipelines, and analytics dashboards using Databricks, PySpark, Kibana, and MLOps tools.
- Build centralized search engines using Elasticsearch, Solr, and Neo4j for intelligent content discovery and sentiment analysis.
- Ensure application and ML pipeline security with tools such as SonarQube, WebInspect, and container security tooling.
- Collaborate with InfoSec and DevOps teams to maintain CI/CD pipelines, perform vulnerability analysis, and ensure compliance.
- Guide modernization initiatives across application stacks and coordinate BCDR-compliant infrastructure for mission-critical services.
- Provide technical leadership and mentoring to engineering teams during all phases of the SDLC.

Required Skills & Qualifications:
- Hands-on experience with Generative AI, LLMs, prompt engineering, LangChain, AutoGen, Vertex AI, and AWS Bedrock
- Python, Java (Spring Boot, Spring AI), PyTorch
- Vector and graph databases: Elasticsearch, Solr, Neo4j
- Cloud platforms: AWS, Azure, GCP (CAF, serverless, containerization)
- DevSecOps: SonarQube, OWASP, OAuth2, container security
- Strong background in application modernization, cloud-native architecture, and MLOps orchestration
- Familiarity with front-end technologies: HTML, JavaScript, React, jQuery
- Any AI/ML certification from a reputed institute
- Bachelor's degree in Computer Science, Engineering, or Mathematics
- 10+ years of total experience, with extensive tenure as a Solution Architect in AI- and cloud-driven transformations
- Advanced knowledge of leading architecture solutions in the industry area
- Strong interpersonal and collaboration skills
- Ability to explain technical concepts to non-technical audiences
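The RAG pattern this role centers on can be illustrated without any framework. A toy sketch, assuming a word-overlap retriever and a hypothetical two-document corpus; real systems use vector embeddings and an actual LLM call via LangChain or similar:

```python
# Toy Retrieval-Augmented Generation sketch: pick the most relevant
# passage by word overlap, then assemble it into an LLM prompt.
def retrieve(query, docs):
    """Return the doc sharing the most words with the query."""
    q = set(query.lower().split())
    return max(docs, key=lambda d: len(q & set(d.lower().split())))

def build_prompt(query, docs):
    context = retrieve(query, docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Invoices are archived nightly to cold storage.",    # hypothetical corpus
    "Refunds are processed within five business days.",
]
prompt = build_prompt("How fast are refunds processed?", docs)
```

The grounding step (retrieve before generate) is what distinguishes RAG from plain prompting: the model answers from retrieved context rather than from its weights alone.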

Posted 4 weeks ago

Apply

10 years

0 Lacs

Pune, Maharashtra, India

On-site


Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Google Cloud Platform Architecture
Good-to-have skills: Google Cloud Machine Learning Services
Minimum experience required: 7.5 years
Educational qualification: 15 years of full-time education

Summary: As an AI/ML lead, you will be responsible for developing applications and systems that utilize AI tools and Cloud AI services. Your typical day will involve applying CCAI and Gen AI models as part of the solution, utilizing deep learning, neural networks, and chatbots. You should have hands-on experience in creating, deploying, and optimizing chatbots and voice applications using Google Conversational Agents and other tools.

Roles & Responsibilities:
- Solution and design CCAI applications and systems utilizing Google Cloud Machine Learning Services, Dialogflow CX, Agent Assist, and Conversational AI.
- Design, develop, and maintain intelligent chatbots and voice applications using Google Dialogflow CX.
- Integrate Dialogflow agents with various platforms, such as Google Assistant, Facebook Messenger, Slack, and websites.
- Integrate with IVR and telephony systems such as Twilio, Genesys, and Avaya; demonstrate proficiency in webhook setup and API integration.
- Develop Dialogflow CX flows, pages, and webhooks, as well as playbooks and the integration of tools into playbooks.
- Create agents in Agent Builder and integrate them into an end-to-end pipeline using Python.
- Apply Gen AI / Vertex AI models as part of the solution, utilizing deep learning, neural networks, chatbots, and image processing.
- Work with Google Vertex AI to build, train, and deploy custom AI models that enhance chatbot capabilities.
- Implement and integrate backend services (using Google Cloud Functions or other APIs) to fulfill user queries and actions.
- Document technical designs, processes, and setup for various integrations.
- Experience with programming languages such as Python/Node.js.

Professional & Technical Skills:
- Must-have: hands-on CCAI/Dialogflow CX experience and an understanding of generative AI.
- Good-to-have: cloud data architecture; Cloud ML/PCA/PDE certification.
- Strong understanding of AI/ML algorithms, NLP, and related techniques.
- Experience with chatbots, generative AI models, and prompt engineering.
- Experience delivering cloud or on-prem application pipelines at production-ready quality.

Additional Information:
- The candidate should have a minimum of 10 years of experience across Google Cloud Machine Learning Services, Gen AI, Vertex AI, and CCAI.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- 15 years of full-time education is required.
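The webhook fulfillment step mentioned above boils down to returning a JSON payload the agent can speak back to the user. A minimal sketch built as a plain dict; the field names follow the Dialogflow CX `WebhookResponse` JSON shape as I understand it, so verify against the current CX reference before relying on them:

```python
# Sketch of a Dialogflow CX webhook fulfillment payload, built by hand.
# Field names assume the CX WebhookResponse JSON shape (camelCase).
import json

def make_webhook_response(reply_text):
    return {
        "fulfillmentResponse": {
            "messages": [{"text": {"text": [reply_text]}}]
        }
    }

body = json.dumps(make_webhook_response("Your order has shipped."))
```

In a deployed flow this dict would be serialized and returned from a Cloud Function (or any HTTPS endpoint) registered as the webhook for a CX page.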

Posted 4 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

Remote


Cresta is on a mission to turn every customer conversation into a competitive advantage by unlocking the true potential of the contact center. Our platform combines the best of AI and human intelligence to help contact centers discover customer insights and behavioral best practices, automate conversations and inefficient processes, and empower every team member to work smarter and faster. Born from the prestigious Stanford AI Lab, Cresta's co-founder and chairman is Sebastian Thrun, the genius behind Google X, Waymo, Udacity, and more. Our leadership also includes CEO Ping Wu, the co-founder of Google Contact Center AI and the Vertex AI platform, and CTO and co-founder Tim Shi, an early member of OpenAI.

We've assembled a world-class team of AI and ML experts, go-to-market leaders, and top-tier investors including Andreessen Horowitz, Greylock Partners, Sequoia, and former AT&T CEO John Donovan. Our valued customers include brands like Intuit, Cox Communications, Hilton, and CarMax, and we've been recognized by Forbes and Bain Consulting as one of the top private AI companies in the world. Join us on this thrilling journey to revolutionize the workforce with AI. The future of work is here, and it's at Cresta.

Cresta is excited to expand our operations to India and build a tech hub in the region. These early joiners will play a key role in establishing our engineering team in India, helping to shape the future of the site. We are a remote-first environment, offering opportunities to work with a global team across Europe, the US, and Canada. Over time, we aim to create a co-working space in India, moving towards a hybrid environment.

We are seeking multiple highly skilled Analytics Engineers. You will be responsible for a combination of data engineering and product analytics tasks, contributing to the optimisation of our data ecosystem. The successful candidate will work closely with cross-functional teams to design, develop, and maintain data pipelines, perform in-depth analysis to derive actionable insights, and build dashboards to showcase product health.

About Your Role
As an Analytics Engineer, you will:
- Collaborate with stakeholders to understand data requirements, and design and implement scalable data pipelines.
- Partner with product teams to define and implement tracking mechanisms for key product metrics.
- Analyze user behavior, product usage, and other relevant data to provide insights that drive product improvements.
- Create and maintain dashboards and reports to communicate analytical findings to stakeholders.
- Collaborate with cross-functional teams to integrate analytics into product development processes.

Qualifications
- Previous experience in a startup or product-first company is a plus.
- Familiarity with ClickHouse or similar columnar databases for managing large-scale, real-time analytical queries.
- Experience with product analytics, including defining and tracking relevant metrics.
- Strong proficiency in data engineering, ETL processes, and database management.
- Proficiency in SQL, a scripting language (Python, R), and a data visualization tool such as Hex or Power BI.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.

Conclusion
Compensation for this position includes a base salary, equity, and a variety of benefits. Actual base salaries will be based on candidate-specific factors, including experience, skillset, and location, as well as applicable local pay requirements. We are actively hiring for this role in India, Romania, Berlin, the US, and Canada. Your recruiter can provide further details.
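The metric-definition side of this kind of analytics work can be sketched in a few lines. A toy example, assuming a hypothetical event schema (a `user` and a `day` per row); in practice this would be a SQL aggregate over a ClickHouse or warehouse table:

```python
# Illustrative product-metric sketch: daily active users (DAU) from
# raw event rows. The event fields are hypothetical, not a real schema.
from collections import defaultdict

events = [
    {"user": "u1", "day": "2024-05-01"},
    {"user": "u2", "day": "2024-05-01"},
    {"user": "u1", "day": "2024-05-02"},
    {"user": "u1", "day": "2024-05-02"},  # duplicate event, same user
]

def daily_active_users(rows):
    seen = defaultdict(set)        # day -> set of distinct users
    for r in rows:
        seen[r["day"]].add(r["user"])
    return {day: len(users) for day, users in seen.items()}

dau = daily_active_users(events)
```

The deduplication by user (a set per day) is the point: active-user metrics count distinct users, not raw events.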
We have noticed a rise in recruiting impersonations across the industry, where scammers attempt to access candidates' personal and financial information through fake interviews and offers. All Cresta recruiting email communications will always come from the @cresta.com domain. Any outreach claiming to be from Cresta via other sources should be ignored. If you are uncertain whether you have been contacted by an official Cresta employee, reach out to recruiting@cresta.ai

Posted 4 weeks ago

Apply

10 - 15 years

0 Lacs

Pune, Maharashtra, India

On-site


MSCI Inc. seeks a dynamic SAP Finance Technology Director. This role is part of the Business Technology department and will lead the Finance Tech team located in India (Mumbai or Pune), working closely with finance and business stakeholders to ensure the systems that support MSCI's order-to-cash, record-to-report, revenue recognition, procure-to-pay, and FP&A processes operate seamlessly and efficiently.

As the SAP Finance Technology Director, you will be responsible for driving global projects for the implementation, enhancement, and maintenance of both finance systems and finance processes to automate and streamline operations. This includes working with SAP modules like ECC, BPC, and GRC while ensuring integration with other financial and reporting tools like RevStream, Vertex, Concur, or Board, as well as other core solutions like Salesforce or Workday. You will collaborate across regions and departments to ensure that technology solutions meet business needs and regulatory requirements, driving continuous improvements while ensuring compliance with SOX controls and audit requirements. You will manage a team of direct reports and contractors to support the overall operations of both financial systems and financial processes.

As a Finance Technology Lead, you will be responsible for:
- Finance systems management: operate, maintain, and improve the systems that support order-to-cash, record-to-report, revenue recognition, procure-to-pay, and FP&A processes.
- System enhancements: lead the identification, development, and implementation of system enhancements that automate and improve the accuracy and efficiency of finance operations.
- Cross-functional collaboration: work closely with business teams (finance, sales, sales operations, product, client services, etc.) and IT teams to translate functional requirements into technical solutions. Ensure smooth integration of financial systems with other financial and operational platforms.
- Project management: oversee projects related to system upgrades (such as a migration to SAP S/4HANA), new functionality deployments, and process automation initiatives. Ensure that projects are delivered on time and within budget.
- Compliance and audit support: ensure that the systems supporting revenue processes comply with internal controls and regulatory requirements, including SOX cycles, Standard Operating Procedures (SOPs), and audit protocols.
- Training and support: provide technical guidance and training to finance and IT teams on the systems and processes supporting revenue recognition. Act as the go-to expert for system-related inquiries and troubleshooting.
- Stakeholder communication: communicate system updates, enhancements, and issues to senior IT leadership and key stakeholders within the finance and business departments.

What you will bring:
- Technical expertise in ERP systems: extensive experience with SAP systems (e.g., ECC, BPC, GRC). Experience integrating with complementary business applications such as Salesforce, Concur, Vertex, or Workiva is helpful.
- Business process knowledge: familiarity with finance processes such as order-to-cash, record-to-report, procure-to-pay, revenue recognition, and FP&A.
- Project management: proven experience managing large-scale global IT projects, particularly those involving financial systems implementation or upgrades.
- Cross-functional collaboration: ability to collaborate effectively with finance, business, and IT teams to deliver technology solutions that meet business requirements.
- Systems integration: strong understanding of systems integration principles, particularly as they relate to financial reporting and revenue management systems.
- Compliance and security: knowledge of SOX compliance, internal controls, and system security requirements as they pertain to financial systems.
- Problem-solving skills: analytical and problem-solving skills with the ability to troubleshoot complex system issues and provide efficient solutions.
- Leadership and team development: experience managing a team of IT professionals, providing leadership, coaching, and guidance in system support and project execution.
- Communication skills: strong written and verbal communication skills in English, with the ability to clearly explain technical concepts to non-technical stakeholders.

Preferred Qualifications
- Master's degree in a technology- or finance-related domain
- 10-15 years of strong operational experience with SAP or similar ERP systems

What We Offer You
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.
Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 4 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire SAP FICO professionals in the following areas:

Job Description: SAP FICO Tax
Experience required: 8-10 years

Key Responsibilities
Technology:
- Perform tax configuration and testing in SAP and any tax reporting application
- Baseline configuration in SAP/Vertex
- Experience in SAP FI, MM, and SD
- Provide support to business users in resolving tax-related questions and issues; troubleshoot and handle Remedy tickets
- Proficient in product mapping file creation and mass upload in Vertex

Business:
- Interact with tax departments and other business segments for overall design, implementation, and maintenance
- Work with deployment teams to deploy the solution to end users

Qualifications and Requirements
Essential qualifications:
- Bachelor's degree or professional qualification in tax, accounting, finance, or a related field
- Computer science, information technology, or project management certification is a plus
- Good verbal and written communication skills

Key competencies and technologies:
- 5+ years of progressive relevant experience in SAP and the OneSource Indirect Tax application
- Knowledge of indirect tax and withholding tax (preferred)
- SAP experience in FI, MM, and SD (preferred)
- Hands-on experience with Vertex integration
- Advanced MS Excel capabilities and the ability to manage significant amounts of data
- Strong communication and interpersonal skills, with a proven ability to communicate tax issues and requirements clearly to non-tax team members and work effectively across functional teams
- Excellent project management, organizational, and documentation skills, including the ability to multitask and prioritize
- Conversant with support processes and methodologies
- Demonstrated ability to work well in a team environment and collaborate with various people and organizations to develop win/win results
- High level of proficiency in information technology and the linkage between business processes, people, and systems
- Experience on a large project with multiple teams

Other skills and abilities:
- Ability to work in a globally distributed setting without supervision
- Self-driven, proactive, systems thinking

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture

Posted 4 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Join Us
At Vodafone, we're not just shaping the future of connectivity for our customers – we're shaping the future for everyone who joins our team. When you work with us, you're part of a global mission to connect people, solve complex challenges, and create a sustainable and more inclusive world. If you want to grow your career whilst finding the perfect balance between work and life, Vodafone offers the opportunities to help you belong and make a real impact.

What You'll Do
- Develop analytical models on Vertex AI with in-depth knowledge of Python, machine learning, and BigQuery
- Provide technical guidance and accountability for developing analytical models on HR data utilizing tools such as Looker, Vertex AI, and GCP
- Work closely with SuccessFactors product owners and data teams to document requirements for the semantics layer and lead requirement-gathering sessions
- Understand GCP data models and query them using BigQuery for data analysis and reconciliation
- Work in SAFe Agile models and maintain the JIRA tool as required
- Define and develop GCP semantic models and the Global HR Data Analytics architecture for various global HR systems

Who You Are
Qualifications:
- 10-12 years of experience in GCP (process and semantics layer) development using BigQuery and other data processing languages
- Experience with Vertex AI is a must; experience with other HR analytics tools is a plus
- Expertise in presenting data in visualization tools such as Qlik Sense, Power BI, GCP, and Vertex AI
- Strong SQL and BigQuery skills, with the ability to write effective queries involving multiple tables and subqueries
- Experience in global HR data/processes, preferably in SuccessFactors or other HR systems

Key Performance Indicators:
- Strong skills in Global HR Data Analytics – GCP process and semantics layer, BigQuery, SQL, Python, Vertex AI, and machine learning
- Understanding of HR data and processes, preferably in SuccessFactors and GCP
- Strong communication skills to work with product owners, data teams, and customers to understand GCP requirements for HR Qlik dashboards

Not a perfect fit?
Worried that you don't meet all the desired criteria exactly? At Vodafone we are passionate about empowering people and creating a workplace where everyone can thrive, whatever their personal or professional background. If you're excited about this role but your experience doesn't align exactly with every part of the job description, we encourage you to still apply, as you may be the right candidate for this role or another opportunity.

What's In It For You
Who we are: We are a leading international Telco, serving millions of customers. At Vodafone, we believe that connectivity is a force for good. If we use it for the things that really matter, it can improve people's lives and the world around us. Through our technology we empower people, connecting everyone regardless of who they are or where they live, and we protect the planet whilst helping our customers do the same.

Belonging at Vodafone isn't a concept; it's lived, breathed, and cultivated through everything we do. You'll be part of a global and diverse community, with many different minds, abilities, backgrounds, and cultures. We're committed to increasing diversity, ensuring equal representation, and making Vodafone a place where everyone feels safe, valued, and included.

If you require any reasonable adjustments or have an accessibility request as part of your recruitment journey, for example extended time or breaks in between online assessments, please refer to https://careers.vodafone.com/application-adjustments/ for guidance. Together we can.
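The "semantics layer" idea that recurs in this role can be pictured as metric definitions declared once and turned into SQL on demand. A minimal sketch, assuming hypothetical table, column, and metric names (not Vodafone's HR model):

```python
# Illustrative semantics-layer sketch: declare each metric once, then
# generate BigQuery-style SQL from the definition.
# Table/column/metric names are hypothetical.
METRICS = {
    "headcount": {"agg": "COUNT(DISTINCT employee_id)", "table": "hr.workers"},
    "avg_tenure": {"agg": "AVG(tenure_years)", "table": "hr.workers"},
}

def metric_sql(name, group_by=None):
    """Render a SELECT for the named metric, optionally grouped."""
    m = METRICS[name]
    cols = f"{group_by}, " if group_by else ""
    sql = f"SELECT {cols}{m['agg']} AS {name} FROM {m['table']}"
    if group_by:
        sql += f" GROUP BY {group_by}"
    return sql

q = metric_sql("headcount", group_by="department")
```

Centralizing definitions this way is what lets Looker, Qlik, and ad-hoc BigQuery users all report the same number for the same metric.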

Posted 4 weeks ago

Apply

0 - 1 years

0 - 0 Lacs

Chhawni, Indore, Madhya Pradesh

Work from Office


Job Title: Computer Operator
Location: DB Vertex Technologies, Chhawni, Indore, Madhya Pradesh
Job Type: Full-time, Permanent, Fresher

We are looking for a reliable and detail-oriented Computer Operator to join our team. If you're comfortable working with computer systems and have basic technical knowledge, this could be the perfect opportunity for you!

Key Responsibilities:
- Operate and maintain computer systems and hardware
- Perform data entry, processing, and verification tasks
- Manage print jobs, reports, and system backups
- Monitor system performance and report issues promptly
- Ensure data accuracy and maintain system logs and documentation

Requirements:
- Basic knowledge of MS Office (Word, Excel, etc.)
- Strong attention to detail and organizational skills
- Ability to work independently and meet deadlines

What We Offer:
- Supportive and professional work environment
- On-the-job training and skill development
- Competitive salary and growth opportunities

If you meet the above requirements and are ready to start your career in a dynamic role, we encourage you to apply!

Pay: ₹7,000.00 - ₹15,000.00 per month
Schedule: Day shift
Work Location: In person
Ability to commute/relocate: Chhawni, Indore, Madhya Pradesh: reliably commute or plan to relocate before starting work (Preferred)
Education: Any graduation
Experience: 0 to 1 year total work (Preferred)
Language: English (Preferred)

Posted 4 weeks ago

Apply

0 - 1 years

0 - 0 Lacs

Chhawni, Indore, Madhya Pradesh

Work from Office


Job Title: Digital Marketing Executive
Location: DB Vertex Technologies, Chhawni, Indore, Madhya Pradesh
Job Type: Full-time, Permanent, Fresher

Job Responsibilities:
- Plan, execute, and optimize digital marketing campaigns across platforms (Google Ads, Meta, LinkedIn, etc.)
- Manage SEO/SEM strategies to boost organic reach and website traffic
- Create and manage engaging content for social media and email campaigns
- Monitor performance using analytics tools and prepare reports
- Stay up to date with the latest trends and best practices in digital marketing

Requirements:
- 0–1 years of proven experience in digital marketing
- Hands-on experience with Ads Manager and SEO tools (e.g., SEMrush, Ahrefs)
- Strong understanding of marketing funnels, CRO, and content strategy
- Excellent written and verbal communication skills
- Creative thinker with attention to detail and data

What We Offer:
- A collaborative and growth-focused work environment
- Opportunities for learning and career advancement
- Competitive salary and performance incentives

If you meet the above requirements and are ready to start your career in a dynamic role, we encourage you to apply!

Pay: ₹7,000.00 - ₹15,000.00 per month
Schedule: Day shift
Work Location: In person
Ability to commute/relocate: Chhawni, Indore, Madhya Pradesh: reliably commute or plan to relocate before starting work (Preferred)
Education: Any graduation
Experience: 0 to 1 year total work (Preferred)

Posted 4 weeks ago

Apply

0 - 1 years

0 - 0 Lacs

Chhawni, Indore, Madhya Pradesh

Work from Office

Indeed logo

Job Title: SEO Executive / SEO Specialist
Location: DB Vertex Technologies, Chhawni, Indore, Madhya Pradesh
Job Type: Full-time, Permanent, Fresher

Job Responsibilities:
- Develop and implement effective SEO strategies
- Perform keyword research and optimize website content
- Improve website ranking through on-page and off-page SEO
- Monitor website performance using SEO tools (Google Analytics, Search Console, etc.)
- Build high-quality backlinks and manage link-building strategies
- Stay updated with the latest SEO trends and algorithm changes
- Collaborate with the content and marketing team for SEO-friendly content

Requirements:
- Experience in SEO (freshers with knowledge can apply)
- Strong understanding of on-page, off-page, and technical SEO
- Familiarity with SEO tools like Ahrefs, SEMrush, Moz, etc.
- Good analytical and problem-solving skills
- Basic knowledge of HTML and WordPress is a plus

Benefits:
- Competitive salary
- Growth opportunities
- Friendly work environment

If you meet the above requirements and are ready to start your career in a dynamic role, we encourage you to apply!

Pay: ₹7,000.00 - ₹15,000.00 per month
Schedule: Day shift
Ability to commute/relocate: Chhawni, Indore, Madhya Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Education: Any Graduation
Experience: Total work: 0 to 1 year (Preferred)
Work Location: In person

Posted 4 weeks ago

Apply

0 years

0 Lacs

Surat, Gujarat, India

Remote

Linkedin logo

About Cloudairy: Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. Cloudairy values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that outstanding ideas can come from anyone on the team. Location: Surat or Pune, 2-days WFH Option Employment Type: Full-time Experience: 3–5 years Role Overview: We are looking for a specialized Node.js Integration Developer who can build and maintain deep, production-grade integrations between our SaaS product and external systems like CRMs, project management tools, communication platforms, and payment or identity systems. This is a pure integration role - your work will directly power connected experiences by integrating our core platform with the tools our customers use every day. Responsibilities: Develop and maintain third-party service integrations (e.g., cloud providers, Jira, Microsoft Teams, Slack, Stripe, HubSpot, Azure DevOps, etc.). Build and manage webhook-based integrations and event-driven workflows. Work with token-based APIs and manage scoped access securely. Create and maintain reusable integration layers, modules, or SDKs for multi-tenant use. Monitor API updates and breaking changes, and ensure backward compatibility across integrated systems. Implement retry strategies, error recovery mechanisms, and logging/monitoring for all integrations. Work cross-functionally with product, QA, and partner engineering teams to ensure smooth onboarding and usage of integrations. Collaborate with AI and product teams to embed AI features in the platform (e.g., smart automation, suggestions, content generation, and predictive logic). Help integrate with AI platforms and APIs such as OpenAI, Bedrock, or Vertex AI using Node.js SDKs. Document each integration clearly for internal teams and customer-facing usage. Required Skills: Strong experience in Node.js focused on backend service integration. 
Excellent understanding of REST APIs, including pagination, filtering, throttling, and error responses. Experience implementing and managing webhooks and related infrastructure. Ability to work with API tokens, secrets, scopes, and secure storage. Skilled in analyzing and reverse-engineering APIs with limited or outdated documentation. Familiarity with multi-tenant integration handling, including mapping and isolating data per tenant. Proficient in using tools like Postman, API logging, and debugging tools for third-party systems. Demonstrated ability to integrate cloud service provider APIs, including the AWS SDK for Node.js (e.g., EC2, CloudWatch, S3) and the Azure SDK for Node.js (e.g., Resource Graph, Cost Analysis). Experience integrating AI models or platforms via APIs or SDKs, including OpenAI (ChatGPT/GPT-4), Amazon Bedrock, Google Vertex AI, or Azure OpenAI Service, as well as custom AI endpoints built in Python or hosted via serverless or container services.

Bonus Skills: Familiarity with iFrame embedding, SDK injection, or frontend integration points. Experience with async job processing systems (e.g., Redis queues). Knowledge of API signing methods, HMAC-based authentication, and webhook signature validation. Experience in building internal tools or dashboards to manage integrations. Prior experience working in an integration platform team or owning integration hubs.

Why Join Us: Own and shape the direction of our third-party integrations. Focus entirely on technical integration depth, not general backend CRUD work. Be part of a fast-moving SaaS team with high-impact engineering problems. Opportunity to work with real-world enterprise tools and data in secure and scalable ways.

If interested, please share your resume at 📧 Email: hr@cloudairy.com 📱 Contact: 9227009276
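The webhook signature validation mentioned under Bonus Skills generally follows one pattern: the provider signs the raw payload with a shared secret, and the receiver recomputes and compares the HMAC. A minimal sketch (shown in Python for brevity; the secret, payload, and signing scheme are illustrative, as header names and encodings vary by provider):

```python
import hashlib
import hmac

def sign_payload(secret, payload):
    """Compute the hex HMAC-SHA256 signature a provider would attach to a webhook."""
    return hmac.new(secret, payload, hashlib.sha256).hexdigest()

def verify_webhook(secret, payload, received_sig):
    """Recompute the signature and compare in constant time to resist timing attacks."""
    expected = sign_payload(secret, payload)
    return hmac.compare_digest(expected, received_sig)

secret = b"whsec_demo"  # hypothetical shared secret for this sketch
body = b'{"event":"invoice.paid","id":123}'
sig = sign_payload(secret, body)
assert verify_webhook(secret, body, sig)
assert not verify_webhook(secret, b'{"event":"tampered"}', sig)
```

The constant-time comparison (`hmac.compare_digest`) matters in practice: a naive `==` comparison can leak signature prefixes through response timing.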

Posted 4 weeks ago

Apply

10 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

What You’ll Do
- Design, develop and deploy agent-based AI systems using LLMs
- Build and scale Retrieval-Augmented Generation (RAG) pipelines for real-time and offline inference
- Develop and optimize training workflows for fine-tuning and adapting models to domain-specific tasks
- Collaborate with cross-functional teams to integrate knowledge bases into agent frameworks
- Drive best practices in AI engineering, model lifecycle management, and production deployment on Google Cloud (GCP)
- Implement version control strategies using Git, manage code repositories, and ensure best practices in code management
- Develop and manage CI/CD pipelines using Jenkins or other relevant tools to streamline deployment and updates
- Monitor, evaluate, and improve model performance post-deployment on Google Cloud
- Communicate technical findings and insights to non-technical stakeholders
- Participate in technical discussions and contribute to strategic planning

What Experience You Need
- Master's or Bachelor's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field
- 5+ years of experience in AI/ML engineering, with a strong focus on LLM-based applications
- 10+ years of overall IT experience
- Proven experience in building agent-based applications using Gemini, OpenAI or similar models
- Deep understanding of RAG systems, vector databases, and knowledge retrieval strategies
- Hands-on experience with LangChain and LangGraph frameworks
- Solid background in model training, fine-tuning, evaluation and deployment
- Strong coding skills in Python and experience with modern MLOps practices

What Could Set You Apart
- Familiarity with frontend integration of AI agents (e.g., using Angular, Mesop or similar frameworks)
- Experience with Google Cloud services like BigQuery, Vertex AI, and Agent Builder
- Exposure to the Angular framework

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
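The RAG pipelines this role describes follow one core pattern: embed documents, retrieve the ones most similar to the query, and prepend them to the model prompt as context. A minimal sketch of the retrieval step, using bag-of-words vectors as a stand-in for a real embedding model (all documents and names here are invented for illustration):

```python
import math
from collections import Counter

def embed(text):
    # Toy stand-in for a real embedding model: bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "credit scores are computed from payment history",
    "the cafeteria menu changes weekly",
    "payment history drives credit risk models",
]
top = retrieve("how is credit payment history used", docs)
# The two credit-related documents outrank the irrelevant one.
assert "cafeteria" not in " ".join(top)
prompt = "Context:\n" + "\n".join(top) + "\n\nQuestion: how is credit payment history used"
```

In a production pipeline the `embed` step would call an embedding model and the ranking would run against a vector database, but the retrieve-then-augment structure is the same.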

Posted 1 month ago

Apply

0 - 3 years

0 Lacs

Andhra Pradesh

Work from Office

Indeed logo

Experience Level: 8+ years, with at least 2 to 3 years in AI/ML/GenAI
Primary Skills: Google Gemini, GCP, Vertex AI

Key Responsibilities
- Design and implement GenAI architectures leveraging Google Cloud and Gemini AI models
- Lead solution architecture and integration of generative AI models into enterprise applications
- Collaborate with data scientists, engineers, and business stakeholders to define AI use cases and technical strategy
- Develop and optimize prompt engineering, model fine-tuning, and deployment pipelines
- Design scalable data storage and retrieval layers using PostgreSQL, BigQuery, and vector databases (e.g., Vertex AI Search, Pinecone, or FAISS)
- Evaluate third-party GenAI APIs and tools for integration
- Ensure compliance with data security, privacy, and responsible AI guidelines
- Support performance tuning, monitoring, and optimization of AI solutions in production
- Stay updated with evolving trends in GenAI and GCP offerings, especially related to Gemini and Vertex AI

Required Skills and Qualifications
- Proven experience architecting AI/ML or GenAI systems on Google Cloud Platform
- Hands-on experience with Google Gemini, Vertex AI, and related GCP AI tools
- Strong understanding of LLMs, prompt engineering, and text generation frameworks
- Proficiency in PostgreSQL, including advanced SQL and performance tuning
- Experience with MLOps, CI/CD pipelines, and AI model lifecycle management
- Solid knowledge of Python, APIs, RESTful services, and cloud-native architecture
- Familiarity with vector databases and semantic search concepts
- Strong communication and stakeholder management skills

Preferred Qualifications
- GCP certifications (e.g., Professional Cloud Architect, Professional Machine Learning Engineer)
- Experience in model fine-tuning and custom LLM training
- Knowledge of LangChain and RAG (Retrieval-Augmented Generation) frameworks
- Exposure to data privacy regulations (GDPR, HIPAA, etc.)
- Background in natural language processing (NLP) and deep learning

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka

Work from Office

Indeed logo

SPEED & SPIRIT is what we look for in our candidates, defined by some simple values that inspire us to BE DRIVEN in our performance, BE VIBRANT in our sporting legacy, BE TOGETHER in our team spirit, and BE YOU to let our individual talent and experience shine. Applying for a job at PUMA is easy. Simply click APPLY ONLINE and follow the steps to upload your application.

YOUR TALENT
Eligibility Criteria and Functional Competencies Required:
- Adobe Photoshop
- Adobe Premiere
- Microsoft Office
- Email communication
- Stakeholder coordination
- AI tools such as Vertex, Prome AI, and Gemini

YOUR MISSION
Key Objectives: We are looking for an Image Editor specialized in editing e-commerce catalog images (apparel on models, products, ghost mannequins, flat lays) to join the PUMA Studio team. Your key objectives will be:
- Image editing using Photoshop
- Editing for Puma.com and other marketplace portals
- Color correction and the first step of QC before submission to the final QC team
- E-commerce catalog image editing covering apparel on models, products, ghost mannequins, flat lays, infographics, AI, creative editing, and high-end retouching for banners and other creatives
- Photo manipulation and catalog image editing
- Color correction with product
- Enhancing the customer experience by showcasing the product in the best possible way: removal of blemishes, skin retouching, cut-outs, liquifying the product, and manipulation
- Elevated content: enriching the product's regular content with AI, infographics, and creative images

PUMA provides equal opportunities for all job applicants, regardless of race, color, religion, national origin, sex, gender identity or expression, sexual orientation, age, or disability. Equality for all is one of the core principles at PUMA and we do not tolerate any form of harassment or discrimination. PUMA supports over 21,000 employees across 51 countries. The PUMA Group owns the brands PUMA, Cobra Golf and stichd, and is headquartered in Herzogenaurach, Germany.

Posted 1 month ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana

On-site

Indeed logo

General Information Locations : Hyderabad, Telangana, India Role ID 209285 Worker Type Regular Employee Studio/Department CTO - EA Digital Platform Work Model Hybrid Description & Requirements Electronic Arts creates next-level entertainment experiences that inspire players and fans around the world. Here, everyone is part of the story. Part of a community that connects across the globe. A place where creativity thrives, new perspectives are invited, and ideas matter. A team where everyone makes play happen. Software Engineer II - EADP - Data The EA Digital Platform (EADP) group is the core powering the global EA ecosystem. We provide the foundation for all of EA’s incredible games and player experiences with high-level platforms like Cloud, Commerce, Data and AI, Gameplay Services, Identity and Social. By providing reusable capabilities that game teams can easily integrate into their work, we let them focus on making some of the best games in the world and creating meaningful relationships with our players. We’re behind the curtain, making it all work together. Come power the future of play with us. The Challenge Ahead: We are looking for developers who want to work on a large-scale distributed data system that empowers EA Games to personalize player experience and engagement. Responsibilities You will help with designing, implementing and optimizing the infrastructure for AI model training and deployment platform You will help with integrating AI capabilities into existing software systems and applications You will develop tools and systems to monitor the performance of the platform in real-time, analyzing key metrics, and proactively identify and address any issues or opportunities for improvement. 
You will participate in code reviews to maintain code quality and ensure best practices You will help with feature and operation enhancement for platform under senior guidance You will help with improving the stability and observability of the platform Qualifications Bachelor's degree or foreign degree equivalent in Computer Science, Electrical Engineering, or related field. 3+ years of experience with software development and model development Experience with a programming language such as Go, Java or Scala Experience with scripting languages such as bash, awk, python Experience with Scikit-Learn, Pandas, Matplotlib 3+ years of experience with Deep Learning frameworks like PyTorch, TensorFlow, CUDA Hands-on experience of any ML Platform (Sagemaker, Azure ML, GCP Vertex AI) Experience with cloud services and modern data technologies Experience with data streaming and processing systems About Electronic Arts We’re proud to have an extensive portfolio of games and experiences, locations around the world, and opportunities across EA. We value adaptability, resilience, creativity, and curiosity. From leadership that brings out your potential, to creating space for learning and experimenting, we empower you to do great work and pursue opportunities for growth. We adopt a holistic approach to our benefits programs, emphasizing physical, emotional, financial, career, and community wellness to support a balanced life. Our packages are tailored to meet local needs and may include healthcare coverage, mental well-being support, retirement savings, paid time off, family leaves, complimentary games, and more. We nurture environments where our teams can always bring their best to what they do. Electronic Arts is an equal opportunity employer. 
All employment decisions are made without regard to race, color, national origin, ancestry, sex, gender, gender identity or expression, sexual orientation, age, genetic information, religion, disability, medical condition, pregnancy, marital status, family status, veteran status, or any other characteristic protected by law. We will also consider employment qualified applicants with criminal records in accordance with applicable law. EA also makes workplace accommodations for qualified individuals with disabilities as required by applicable law.

Posted 1 month ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Linkedin logo

Company Description
Tatvic is a marketing analytics company focused on generating insights from data through a long association with Google and its infrastructure. We breed, recognize, and reward performance. As a company we are growing very fast and we are under transformation. To enable this transformation we need future leaders with the vision to balance execution and strategic understanding. Website: www.tatvic.com

Mission
The Senior Data Scientist will provide value to clients by deriving actionable insights from data through feature definition, machine learning model creation, testing and validation, optimization, and presenting results in an actionable format. The goal is to enable data-driven business decisions for clients.

Role Responsibilities
Responsibilities w.r.t. the customer:
- Communicate with customers to discover and understand the problem statement
- Design a solution that clearly aligns the problem statement, solution details, output, and success criteria, and shows how the impact of the solution maps to a specific business objective
- Take a bird's-eye view of the platform to create an effective solution
- Feature engineering: identify features that would matter and ensure that the logic for feature selection is transparent and explainable
- Model selection: use pre-trained models, AutoML, APIs, or individual algorithms and libraries for effective model selection and optimal implementation
- Optimize the model to increase its effectiveness with proper data cleansing and feature engineering refinements
- Deploy the model for batch or real-time predictions using methodologies like MLOps
- Display or export the output into a visualization platform
- Create a POC providing data insight for the customer at short notice
- Maintain and manage project execution trackers and documentation
- Keep the promises made to the customer in terms of deliverables, deadlines, and quality
Innovation And Asset Building Responsibilities Design and Build reusable solutions that can be reused for multiple customers. Create clear documentation on architecture and design concepts & technical decisions in the project. Conduct internal sessions to educate cross-team stakeholders to improve literacy of the domain & solutions. Maintain coding standards & build reusable code & libraries for future use and enhancing Engineering at Tatvic. Stay up-to-date with innovations in data science and its applications in Tatvic relevant domains. Frequently Perform POCs to get hands-on experience with new technologies, including Google Cloud tools designed for Data Science applications. Explore the usage of data science in various business and web analytics applications Technical Skills Data Handling: Manage data from diverse sources, including structured tables, unstructured text, images, videos, and streaming/real-time data. For scalable data processing and analysis, utilize cloud platforms (preferably Google Cloud) such as BigQuery, VertexAI, and Cloud Storage. Feature Engineering: Identify and select relevant features with transparent and explainable logic. Design new derived features to enhance model performance and enable deeper insights. Utilize advanced techniques like Pearson Coefficient and SHAP values for feature importance and correlation analysis. Model Development: Select and build models based on problem requirements, using pre-trained models, AutoML, or custom algorithms. Experience in Linear, Non-linear, Timeseries (RNNs, LSTMs), Tree-based models (XGBoost, LightGBM), and other foundational approaches. Apply advanced modeling techniques like CNNs for image processing, RCNNs, YOLO for object detection, and RAGs and LLM tuning for text and search-related tasks. Optimize models with hyperparameter tuning, Bayesian optimization, and appropriate evaluation strategies. 
Model Evaluation: Assess model performance using metrics suited to the data type and problem type. For categorical data: precision, recall, F1 score, ROC-AUC, and precision-recall curves. For numerical data: metrics like Mean Absolute Error (MAE), Root Mean Square Error (RMSE), Mean Squared Error (MSE), and R-squared (R²). Deployment & MLOps: Deploy models for batch or real-time predictions using MLOps frameworks, leveraging tools such as Vertex AI and Kubeflow for efficient and scalable deployment pipelines. Integrate outputs with visualization platforms to deliver actionable insights and drive decision-making. Innovation: Stay current with trends in AI and data science, including LLMs, grounding techniques, and innovations in temporal and sequential data modeling. Regularly conduct POCs to experiment with emerging tools and technologies. Code Practices & Engineering: Write clean, maintainable, and scalable code following industry best practices. Adhere to version control (e.g., Git) for collaborative development and maintain coding standards. Implement error handling, logging, and monitoring to ensure reliability in production systems. Collaborate with other teams to integrate data science models into broader system architectures. Performance Optimization: Optimize model and data processing pipelines for computational efficiency and scalability. Use parallel processing, distributed computing, and hardware accelerators (e.g., GPUs, TPUs) where applicable. Documentation & Reusability: Maintain comprehensive technical documentation for all solutions. Design and build reusable assets to streamline future implementations. Technical Tools and Platforms: Google Cloud (BigQuery, Vertex AI, Cloud Storage); Python (libraries: TensorFlow, Scikit-learn, XGBoost, LightGBM, etc.); SQL/NoSQL databases; MLOps frameworks (Kubeflow, Vertex AI Pipelines); visualization tools (Power BI, Tableau, Google Data Studio).
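As a concrete illustration of the regression metrics listed above (MAE, MSE, RMSE, R²), here is a minimal plain-Python sketch; in practice a project like this would use scikit-learn's `metrics` module, and the sample values are invented for illustration:

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute MAE, MSE, RMSE, and R² for paired true/predicted values.
    Assumes y_true is not constant (otherwise SS_tot would be zero)."""
    n = len(y_true)
    errors = [t - p for t, p in zip(y_true, y_pred)]
    mae = sum(abs(e) for e in errors) / n
    mse = sum(e * e for e in errors) / n
    rmse = math.sqrt(mse)
    mean_t = sum(y_true) / n
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    r2 = 1 - (mse * n) / ss_tot  # R² = 1 - SS_res / SS_tot
    return {"mae": mae, "mse": mse, "rmse": rmse, "r2": r2}

m = regression_metrics([3.0, 5.0, 7.0, 9.0], [2.5, 5.0, 7.5, 9.0])
assert abs(m["mae"] - 0.25) < 1e-9
assert abs(m["mse"] - 0.125) < 1e-9
```

The same pattern extends to the categorical metrics mentioned above (precision, recall, F1), which count true/false positives and negatives instead of averaging errors.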

Posted 1 month ago

Apply

10 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

What You’ll Do
- Design, develop and deploy agent-based AI systems using LLMs
- Build and scale Retrieval-Augmented Generation (RAG) pipelines for real-time and offline inference
- Develop and optimize training workflows for fine-tuning and adapting models to domain-specific tasks
- Collaborate with cross-functional teams to integrate knowledge bases into agent frameworks
- Drive best practices in AI engineering, model lifecycle management, and production deployment on Google Cloud (GCP)
- Implement version control strategies using Git, manage code repositories, and ensure best practices in code management
- Develop and manage CI/CD pipelines using Jenkins or other relevant tools to streamline deployment and updates
- Monitor, evaluate, and improve model performance post-deployment on Google Cloud
- Communicate technical findings and insights to non-technical stakeholders
- Participate in technical discussions and contribute to strategic planning

What Experience You Need
- Master's or Bachelor's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field
- 5+ years of experience in AI/ML engineering, with a strong focus on LLM-based applications
- 10+ years of overall IT experience
- Proven experience in building agent-based applications using Gemini, OpenAI or similar models
- Deep understanding of RAG systems, vector databases, and knowledge retrieval strategies
- Hands-on experience with LangChain and LangGraph frameworks
- Solid background in model training, fine-tuning, evaluation and deployment
- Strong coding skills in Python and experience with modern MLOps practices

What Could Set You Apart
- Familiarity with frontend integration of AI agents (e.g., using Angular, Mesop or similar frameworks)
- Experience with Google Cloud services like BigQuery, Vertex AI, and Agent Builder
- Exposure to the Angular framework

Posted 1 month ago

Apply

5 - 10 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

We are seeking a highly skilled and motivated Lead DS/ML Engineer to join our team. The role is critical to the development of a cutting-edge reporting, insights, and recommendations platform designed to measure and optimize online marketing campaigns. The ideal candidate has a strong foundation in data engineering (ELT, data pipelines) and advanced machine learning, and is comfortable working across data engineering, the ML model lifecycle, and cloud-native technologies: building scalable data pipelines, developing ML models, and deploying solutions in production.

Job Description:
Key Responsibilities:

Data Engineering & Pipeline Development
- Design, build, and maintain scalable ELT pipelines for ingesting, transforming, and processing large-scale marketing campaign data
- Ensure high data quality, integrity, and governance using orchestration tools like Apache Airflow, Google Cloud Composer, or Prefect
- Optimize data storage, retrieval, and processing using BigQuery, Dataflow, and Spark for both batch and real-time workloads
- Implement data modeling and feature engineering for ML use cases

Machine Learning Model Development & Validation
- Develop and validate predictive and prescriptive ML models to enhance marketing campaign measurement and optimization
- Experiment with different algorithms (regression, classification, clustering, reinforcement learning) to drive insights and recommendations
- Leverage NLP, time-series forecasting, and causal inference models to improve campaign attribution and performance analysis
- Optimize models for scalability, efficiency, and interpretability

MLOps & Model Deployment
- Deploy and monitor ML models in production using tools such as Vertex AI, MLflow, Kubeflow, or TensorFlow Serving
Implement CI/CD pipelines for ML models, ensuring seamless updates and retraining. Develop real-time inference solutions and integrate ML models into BI dashboards and reporting platforms. Cloud & Infrastructure Optimization Design cloud-native data processing solutions on Google Cloud Platform (GCP), leveraging services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow. Work on containerized deployment (Docker, Kubernetes) for scalable model inference. Implement cost-efficient, serverless data solutions where applicable. Business Impact & Cross-functional Collaboration Work closely with data analysts, marketing teams, and software engineers to align ML and data solutions with business objectives. Translate complex model insights into actionable business recommendations. Present findings and performance metrics to both technical and non-technical stakeholders. Qualifications & Skills: Educational Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence, Statistics, or a related field. Certifications in Google Cloud (Professional Data Engineer, ML Engineer) is a plus. Must-Have Skills: Experience: 5-10 years with the mentioned skillset & relevant hands-on experience Data Engineering: Experience with ETL/ELT pipelines, data ingestion, transformation, and orchestration (Airflow, Dataflow, Composer). ML Model Development: Strong grasp of statistical modeling, supervised/unsupervised learning, time-series forecasting, and NLP. Programming: Proficiency in Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and SQL for large-scale data processing. Cloud & Infrastructure: Expertise in GCP (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Storage) or equivalent cloud platforms. MLOps & Deployment: Hands-on experience with CI/CD pipelines, model monitoring, and version control (MLflow, Kubeflow, Vertex AI, or similar tools). 
Data Warehousing & Real-time Processing: Strong knowledge of modern data platforms for batch and streaming data processing. Nice-to-Have Skills: Experience with Graph ML, reinforcement learning, or causal inference modeling. Working knowledge of BI tools (Looker, Tableau, Power BI) for integrating ML insights into dashboards. Familiarity with marketing analytics, attribution modeling, and A/B testing methodologies. Experience with distributed computing frameworks (Spark, Dask, Ray). Location: Bengaluru Brand: Merkle Time Type: Full time Contract Type: Permanent
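The ELT pattern this role centers on (ingest raw data first, transform afterwards) can be sketched end to end in a few lines. The campaign schema below is invented for illustration, and a production pipeline would orchestrate these steps with Airflow or Composer and run the transform inside BigQuery rather than in-memory Python:

```python
def extract(raw_rows):
    """Ingest raw campaign events as-is (the 'EL' of ELT: land data before transforming)."""
    return list(raw_rows)

def transform(rows):
    """Aggregate spend and clicks per campaign, then derive cost-per-click (CPC)."""
    agg = {}
    for r in rows:
        c = agg.setdefault(r["campaign"], {"spend": 0.0, "clicks": 0})
        c["spend"] += r["spend"]
        c["clicks"] += r["clicks"]
    return {
        name: {**c, "cpc": c["spend"] / c["clicks"] if c["clicks"] else None}
        for name, c in agg.items()
    }

raw = [
    {"campaign": "spring_sale", "spend": 120.0, "clicks": 300},
    {"campaign": "spring_sale", "spend": 80.0, "clicks": 100},
    {"campaign": "brand", "spend": 50.0, "clicks": 0},
]
report = transform(extract(raw))
assert report["spring_sale"]["cpc"] == 0.5
assert report["brand"]["cpc"] is None
```

Note the zero-click campaign yields `None` rather than a division error; guarding derived metrics like this is a typical data-quality concern in these pipelines.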

Posted 1 month ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

Your Role and Responsibilities
What you’ll do: As a Data Scientist – Artificial Intelligence, your responsibilities include:

AI & Machine Learning Model Development
- Developing ML models for predictive analytics, fraud detection, and automation
- Working with deep learning (DL) and natural language processing (NLP) models for text and speech processing
- Implementing AI-driven anomaly detection for data quality and governance

Big Data & Model Deployment
- Building and deploying ML models on Cloudera Machine Learning (CML)
- Utilizing Apache Spark and PySpark for processing large-scale datasets
- Working with Kafka and Iceberg to integrate AI solutions into real-time data pipelines

Data Quality & Governance
- Supporting AI-powered data quality monitoring with Talend DQ
- Assisting in metadata management, data lineage tracking, and automated data validation
- Utilizing Denodo for AI-driven data virtualization and federated learning
Security & Compliance Ensuring AI models comply with Bank’s data security and governance policies. Supporting AI-driven encryption and anomaly detection techniques using Thales CipherTrust. Collaboration & Documentation Working with data engineers and analysts to develop AI solutions aligned with business needs. Documenting model architectures, experiment results, and optimization techniques. Assisting in AI-driven reporting and visualization using Qlik Sense/Tableau. Preferred Education Master's Degree Required Technical And Professional Expertise 4-7 years of experience in AI, ML, and Data Science. Strong programming skills in Python, SQL, and ML frameworks (TensorFlow, PyTorch, Scikit-learn). Hands-on experience with big data platforms (Cloudera, Apache Spark, Kafka, Iceberg). Experience with NLP, deep learning, and AI for automation. Understanding of data governance, metadata management, and AI-driven data quality. Familiarity with GitLab, Sonatype Nexus, and CheckMarx for AI model deployment. Preferred Technical And Professional Experience Experience with AI/ML solutions for Banking and financial services. Knowledge of cloud AI platforms (AWS SageMaker, Azure ML, GCP Vertex AI). Exposure to AI ethics, explainable AI (XAI), and bias detection in ML models. Understanding of graph databases (DGraph Enterprise) for AI-powered insights. Certifications in IBM AI Engineering, Cloudera Data Science, or Google/AWS AI. Show more Show less
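The AI-driven anomaly detection this posting mentions can be illustrated in miniature by flagging values that deviate strongly from the mean. A hypothetical, stdlib-only sketch (real pipelines of the kind described would stream records from Kafka and use learned models, not a fixed z-score rule):

```python
import statistics

def zscore_anomalies(values, threshold=3.0):
    """Flag values whose z-score exceeds the threshold.

    A toy illustration of statistical anomaly detection for data
    quality checks; the threshold and data are illustrative only.
    """
    mean = statistics.fmean(values)
    std = statistics.pstdev(values)
    if std == 0:
        return []  # constant series: nothing can be anomalous
    return [v for v in values if abs(v - mean) / std > threshold]

# Mostly-uniform transaction amounts with one extreme outlier.
amounts = [100, 102, 98, 101, 99, 100, 103, 97, 5000]
print(zscore_anomalies(amounts, threshold=2.0))  # the 5000 stands out
```

Note that with a single extreme point in a small sample the outlier inflates the standard deviation, which is why the example lowers the threshold to 2.0; production systems typically use robust statistics or model-based scores instead.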

Posted 1 month ago

Apply


0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Your Role And Responsibilities

Who you are: A senior data scientist specializing in advanced analytics, with expertise in machine learning (ML), predictive modeling, and statistical analysis. You have sound experience leveraging big-data technologies, AI, and automation to solve complex business problems and enhance decision-making. You have worked with Cloudera Data Platform, Apache Spark, Kafka, and Iceberg tables, and you understand how to design and deploy scalable AI/ML models. Your role will be instrumental in data modernization efforts, applying AI-driven insights to enhance efficiency, optimize operations, and mitigate risks.

What you'll do: As a Data Scientist – Advanced Analytics, your responsibilities include:

AI & Machine Learning Model Development
  • Developing AI/ML models for predictive analytics, fraud detection, and customer segmentation.
  • Implementing time-series forecasting, anomaly detection, and optimization models.
  • Working with deep learning (DL) and natural language processing (NLP) for AI-driven automation.

Big Data & Scalable AI Pipelines
  • Processing and analyzing large datasets using Apache Spark, PySpark, and Iceberg tables.
  • Deploying real-time models and streaming analytics with Kafka.
  • Supporting AI model training and deployment on Cloudera Machine Learning (CML).

Advanced Analytics & Business Impact
  • Performing exploratory data analysis (EDA) and statistical modelling.
  • Delivering AI-driven insights to improve business decision-making.
  • Supporting data quality and governance initiatives using Talend DQ.

Data Governance & Security
  • Ensuring AI models comply with the bank's data governance and security policies.
  • Implementing AI-driven anomaly detection and metadata management.
  • Utilizing Thales CipherTrust for data encryption and compliance.

Collaboration & Thought Leadership
  • Working closely with data engineers, analysts, and business teams to integrate AI-driven solutions.
  • Presenting AI insights and recommendations to stakeholders and leadership teams.
  • Contributing to the development of best practices for AI and analytics.

Preferred Education
  • Master's degree.

Required Technical And Professional Expertise
  • 3-7 years of experience in AI, ML, and advanced analytics.
  • Proficiency in Python, R, SQL, and ML frameworks (scikit-learn, TensorFlow, PyTorch).
  • Hands-on experience with big-data technologies (Cloudera, Apache Spark, Kafka, the Iceberg table format).
  • Strong knowledge of statistical modelling, optimization, and feature engineering.
  • Understanding of MLOps practices and AI model deployment.

Preferred Technical And Professional Experience
  • Develop and implement advanced analytics models, including predictive, prescriptive, and diagnostic analytics, to solve business challenges and optimize decision-making.
  • Work with large, complex datasets and appropriate tools to derive analytical solutions.
  • Build and deploy machine learning models (supervised and unsupervised), statistical models, and data-driven algorithms for forecasting, segmentation, classification, and anomaly detection.
  • Strong hands-on experience in Python, Spark, and cloud computing; able to work independently and deploy deep learning models using various architectures.
  • Perform exploratory data analysis (EDA) to uncover trends, relationships, and outliers in large, complex datasets.
  • Design and create features that improve model accuracy and business relevance.
  • Create insightful visualizations and dashboards that communicate findings to stakeholders; translate complex data insights into clear, actionable recommendations.
  • Work closely with business leaders, engineers, and analysts to understand business requirements and translate them into analytical solutions that address strategic goals.
  • Exposure to graph AI using Dgraph Enterprise.
  • Knowledge of cloud-based AI platforms (AWS SageMaker, Azure ML, GCP Vertex AI).
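Of the techniques this posting lists, time-series forecasting has an especially compact classical baseline: simple exponential smoothing. A hypothetical, stdlib-only sketch (production work at the scale described would use Spark-backed data and richer models; the data here is made up):

```python
def exponential_smoothing(series, alpha=0.5):
    """Return the smoothed level after each observation.

    Simple exponential smoothing: each new level is a weighted blend
    of the latest observation and the previous level. The final level
    serves as the one-step-ahead forecast.
    """
    if not 0 < alpha <= 1:
        raise ValueError("alpha must be in (0, 1]")
    level = series[0]
    levels = [level]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
        levels.append(level)
    return levels

weekly_sales = [10, 12, 11, 13, 12]
print(exponential_smoothing(weekly_sales, alpha=0.5)[-1])  # 12.0
```

A higher alpha tracks recent observations more aggressively; a lower alpha smooths noise at the cost of reacting slowly to genuine level shifts.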

Posted 1 month ago

Apply


2 years

0 Lacs

Mumbai Metropolitan Region

On-site


Relocation Assistance Offered Within Country
Job Number #166101 - Mumbai, Maharashtra, India

Who We Are

Colgate-Palmolive Company is a global consumer products company operating in over 200 countries, specializing in Oral Care, Personal Care, Home Care, Skin Care, and Pet Nutrition. Our products are trusted in more households than any other brand in the world, making us a household name! Join Colgate-Palmolive, a caring, innovative growth company reimagining a healthier future for people, their pets, and our planet. Guided by our core values of Caring, Inclusive, and Courageous, we foster a culture that inspires our people to achieve common goals. Together, let's build a brighter, healthier future for all.

About Colgate-Palmolive

Do you want to come to work with a smile and leave with one as well? In between those smiles, your day consists of working in a global organization, continually learning and collaborating, having stimulating discussions, and making impactful contributions! If this is how you see your career, Colgate is the place to be! Our dependable household brands, dedicated employees, and sustainability commitments make us a company passionate about building a future to smile about for our employees, consumers, and surrounding communities. The pride in our brand fuels a workplace that encourages creative thinking, champions experimentation, and promotes authenticity, which has contributed to our enduring success. If you want to work for a company that lives by its values, then give your career a reason to smile... every single day.

The Experience

The Global Data Science & Advanced Analytics team at Colgate-Palmolive focuses on commercial business cases that have a revenue impact for the business, with recommended actions and scope for scalability across markets. The role requires working with cross-functional teams and leaders across Information Technology (UI/UX front end, back end, MLOps), data architecture, data engineering, division analytics leads, and business teams. The team is structured with analytics leads, data science leads, data scientists, and analytics engineers to solve business questions, provide actionable insights, and drive growth for the organization.

The role will lead a team of data scientists to develop scalable enterprise solutions from scratch in the areas of marketing and media effectiveness, marketing mix models, revenue growth management, and category forecasting. It requires an understanding of internal and external data sources (syndicated market data, point of sales, Nielsen, SAP, retailer ePOS, etc.) and the ability to work with large datasets to derive insights and recommendations. The candidate should have a proven ability to write scalable code in SQL, Python, and R to build and deploy statistical and machine learning models in production using Airflow, and to conceptualize algorithms that guide the team toward business objectives and goals. Knowledge of cloud environments (Google Cloud, Snowflake) is crucial. The candidate should be an analytical problem solver, proactive and responsive to business needs. Communication is a prime requisite, along with storytelling to present complex model insights in simple business language to stakeholders.

Who Are You

  • You are a data science expert and leader, solving business questions and leading a high-performing data science and advanced analytics team.
  • An expert coder in SQL and Python, building scalable enterprise-level data science solutions in marketing mix modelling and revenue growth management, and addressing business problems that require descriptive, diagnostic, predictive, and/or prescriptive analytics.
  • Able to build regression and machine learning models (ridge, lasso, logistic regression, Bayesian regression, random forests, XGBoost, optimization, simulation) at scale and deploy them in production on cloud platforms using Airflow.
  • Leading, mentoring, and coaching data scientists on key technical and domain topics through solution and code reviews to meet business objectives.
  • End-to-end accountability for the quality of recommendations, insights, and data science solutions developed and deployed in the organization globally.
  • Able to prioritize multiple work assignments, multi-task across conflicting priorities, and meet deadlines.
  • Excellent written and verbal communication; good facilitation and project management skills.
  • You connect the dots: you understand business needs and questions, identify the right analytics capabilities to solve them, present and tell the story of model results, insights, recommendations, and actions to Marketing and Customer Development (Sales) teams and to division/global analytics leads, and communicate complex quantitative insights in a precise, actionable manner to business stakeholders.
  • You are a collaborator: you work with cross-functional teams in GIT, data architecture, and analytics engineering to drive the delivery and project management of large-scale, enterprise-wide data science solutions, and collaborate with division/global analytics teams and business partners across geographies in different time zones.
  • You are an innovator: you drive new value from insights by connecting external and internal data sources, enhance existing data science solutions by exploring new techniques, and identify, design, and implement internal process improvements (automating manual processes, optimizing data delivery, exploring new data sources, redesigning infrastructure for greater scalability).

Qualifications

What you'll need:
  • Bachelor's degree required (Master's or MBA preferred) in data science, computer science, engineering, information technology, or a related quantitative and technical field (statistics, business analytics, econometrics).
  • 8+ years of data science and analytics experience developing and deploying machine learning algorithms in a production environment.
  • Experience setting up or managing a data science team is a must (2+ years).
  • Expert knowledge in the following domains: marketing mix modeling, revenue growth management, optimization techniques, forecasting, recommendation engines, and marketing analytics in CPG or consulting companies.
  • Extraordinary coding skills in SQL, Python, PySpark, and R to implement statistical and machine learning algorithms such as linear regression, ridge, lasso, PyMC, Bayesian methods, glmnet, decision trees, random forests, XGBoost, and SVM.
  • Strong knowledge of MLOps using Docker, Airflow, Kubernetes, Databricks, or Dataiku on any cloud platform.
  • Proficiency with cloud platforms (Google Cloud, Snowflake) and experience dealing with high-volume, high-dimensionality data from varying sources (cloud platforms, Nielsen, SAP, retailer ePOS; both structured and unstructured data).
  • Experience supporting and working with multi-functional teams in a dynamic environment.
  • Proficiency in reporting insights and delivering presentations via Excel, Google Sheets, Google Slides, DOMO, and other visualization tools.
  • Proven ability to provide structure, think strategically in a complex business context, and operate effectively in ambiguity.
  • Ability to independently plan and execute deliveries; comfortable in a client/business-facing role.
  • Exceptional communication and collaboration skills to understand business partner needs and deliver solutions.

What You'll Need (Preferred)
  • Experience building web apps using Pydash, Plotly, R Shiny, or Streamlit.
  • Experience in deep learning, reinforcement learning, image recognition, AI algorithms, and GenAI.
  • Experience with Databricks, Docker, and Azure Data Factory.
  • Experience with machine learning APIs on cloud services: Azure, AWS, Vertex AI.

Our Commitment to Diversity, Equity & Inclusion

Achieving our purpose starts with our people: ensuring our workforce represents the people and communities we serve, and creating an environment where our people feel they belong, can be our authentic selves, feel treated with respect, and have the support of leadership to impact the business in a meaningful way.

Equal Opportunity Employer

Colgate is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity, sexual orientation, national origin, ethnicity, age, disability, marital status, veteran status (United States positions), or any other characteristic protected by law. Reasonable accommodation during the application process is available for persons with disabilities. Please complete this request form should you require accommodation.
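The ridge and lasso techniques named above can be sketched in miniature: for a single feature with no intercept, ridge regression has a closed form. A toy, stdlib-only illustration of how the penalty shrinks the coefficient (an assumed example with made-up numbers, not a description of Colgate's actual modeling stack, which would use libraries like scikit-learn):

```python
def ridge_1d(x, y, lam=0.0):
    """Closed-form ridge estimate for one feature, no intercept:

        w = sum(x_i * y_i) / (sum(x_i ** 2) + lambda)

    With lambda = 0 this is ordinary least squares; a larger lambda
    shrinks the coefficient toward zero.
    """
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    return sxy / (sxx + lam)

x = [1.0, 2.0, 3.0]
y = [2.0, 4.0, 6.0]            # exactly y = 2x
print(ridge_1d(x, y))          # 2.0 (no penalty: recovers the true slope)
print(ridge_1d(x, y, lam=14))  # 1.0 (heavy penalty: coefficient shrunk)
```

In marketing mix work this shrinkage is what keeps dozens of correlated media channels from producing wildly unstable coefficient estimates.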

Posted 1 month ago

Apply

3 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Experience: 3+ years
Skills needed: Core Python, LLM, Generative AI, FastAPI
Job location: Ahmedabad (work from office)

  • Minimum of 3+ years of experience in AI-based application development.
  • Fine-tune pre-existing models to improve performance and accuracy.
  • Experience with TensorFlow, PyTorch, scikit-learn, or similar ML frameworks, and familiarity with APIs like OpenAI or Vertex AI.
  • Experience with NLP tools and libraries (e.g., NLTK, spaCy, GPT, BERT).
  • Implement frameworks like LangChain, Anthropic Constitutional AI, OpenAI, and Hugging Face, along with prompt engineering techniques, to build robust and scalable AI applications.
  • Evaluate and analyze RAG solutions and utilize best-in-class LLMs to define customer experience solutions (fine-tuning large language models).
  • Architect and develop advanced generative AI solutions leveraging state-of-the-art large language models (LLMs) such as GPT, Llama, PaLM, BLOOM, and others.
  • Strong understanding of and experience with open-source multimodal LLMs to customize and create solutions.
  • Explore and implement cutting-edge techniques like few-shot learning, reinforcement learning, multi-task learning, and transfer learning for AI model training and fine-tuning.
  • Proficiency in data pre-processing, feature engineering, and data visualization using tools like Pandas, NumPy, and Matplotlib.
  • Optimize model performance through experimentation, hyperparameter tuning, and advanced optimization techniques.
  • Proficiency in Python, with the ability to get hands-on with coding at a deep level.
  • Develop and maintain APIs using Python's FastAPI, Flask, or Django for integrating AI capabilities into various systems.
  • Ability to write optimized, high-performing queries against relational databases (e.g., MySQL, PostgreSQL) or non-relational databases (e.g., MongoDB, Cassandra).
  • Enthusiasm for continuous learning and professional development in AI and related technologies.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Knowledge of cloud services like AWS, Google Cloud, or Azure.
  • Proficiency with version control systems, especially Git.
  • Familiarity with data pre-processing techniques and pipeline development for AI model training.
  • Experience deploying models using Docker and Kubernetes.
  • Experience with AWS Bedrock and SageMaker is a plus.
  • Strong problem-solving skills with the ability to translate complex business problems into AI solutions.
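The RAG work this posting describes rests on a retrieval step: score stored documents against a query and pass the best matches to the LLM as context. A toy sketch of that step using bag-of-words cosine similarity (real systems use neural embeddings and a vector store; the documents and names here are purely illustrative):

```python
import math
from collections import Counter

def embed(text):
    # Toy "embedding": bag-of-words term counts.
    # Real RAG pipelines use neural embedding models instead.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    # Rank documents by similarity to the query; keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "GST rates for electronics in India",
    "Cloudera Spark cluster tuning guide",
    "VAT registration thresholds in the EU",
]
print(retrieve("What is the VAT threshold in Europe?", docs))
```

Evaluating a RAG solution, as the posting asks, then amounts to checking that this retrieval step surfaces relevant passages and that the LLM's answers stay grounded in them.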

Posted 1 month ago

Apply

0.0 - 31.0 years

0 - 0 Lacs

Tonk Phatak, Jaipur

Remote


Snapmint is on a mission to democratize no/low-cost installment purchases for the next 200 million Indians. Of the 300 million credit-eligible consumers in India, fewer than 30 million actively use credit cards. Snapmint is reinventing credit for the next 200 million consumers by giving them the freedom to buy what they want and pay in installments without a credit card. In a short period of time, Snapmint has reached over a million consumers in 2,200 cities and has powered over 200 crores worth of purchases.

  • Minimum 1 year of experience in an email/chat/in-call and outbound calling process in any BPO, e-commerce, fintech, or customer experience organization.
  • For email/chat, typing speed must be between 30-35 wpm.
  • Proven customer support experience or experience as a client service representative.
  • Qualified candidates will be comfortable in a multi-tasking, high-energy environment; they will be creative and analytical problem solvers with a passion for excellent customer service.
  • Strong phone contact handling skills and active listening.
  • Maintain regular and reliable attendance, including the daily schedule as assigned.
  • Graduation and above (12th pass only in exceptional cases).
  • Excellent verbal/written communication skills and basic computer knowledge.
  • Age should be a maximum of 28.
  • Work from the office, 6 days a week (roster off).
  • Near Tonk Road, Dev Nagar (maximum travel distance should be 10-12 km).
  • Tamil, Telugu, and Malayalam languages are a plus.
  • Candidates with experience at Teleperformance, Vertex Cosmos, Girnar, Dealshare, Innovana Thinklabs, or Dr ITM will be preferred.

Posted 1 month ago

Apply

Exploring Vertex Jobs in India

India has seen a rise in demand for professionals with expertise in Vertex, a cloud-based tax technology solution. Companies across various industries are actively seeking individuals with skills in Vertex to manage their tax compliance processes efficiently. If you are a job seeker looking to explore opportunities in this field, read on to learn more about the Vertex job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Hyderabad
  4. Pune
  5. Chennai

Average Salary Range

The salary range for Vertex professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with several years in the industry can earn upwards of INR 12-15 lakhs per annum.

Career Path

In the Vertex domain, a typical career progression path may include roles such as Tax Analyst, Tax Consultant, Tax Manager, and Tax Director. Professionals may advance from Junior Tax Analyst to Senior Tax Analyst, and eventually take on leadership roles as Tax Managers or Directors.

Related Skills

Alongside expertise in Vertex, professionals in this field are often expected to have skills in tax compliance, tax regulations, accounting principles, and data analysis. Knowledge of ERP systems and experience in tax software implementation can also be beneficial.

Interview Questions

  • What is Vertex and how is it used in tax compliance? (basic)
  • Can you explain the difference between sales tax and value-added tax? (basic)
  • How do you stay updated on changes in tax laws and regulations? (basic)
  • Describe a challenging tax compliance project you worked on and how you overcame obstacles. (medium)
  • How do you ensure accuracy in tax calculations using Vertex? (medium)
  • What are some common challenges faced in implementing Vertex solutions for clients? (medium)
  • Can you walk us through a recent tax audit you were involved in? (medium)
  • How do you handle disputes with tax authorities regarding tax filings? (advanced)
  • In your opinion, what are the key factors to consider when choosing a tax technology solution like Vertex? (advanced)
  • How do you approach training and educating team members on using Vertex effectively? (advanced)
  • Describe a scenario where you had to customize Vertex to meet specific client requirements. (advanced)
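For the accuracy question above, one concrete point a candidate can make is that monetary tax math should avoid binary floating point, which cannot represent values like 0.1 exactly. A hypothetical sketch (not Vertex's actual API; the function and rounding policy are illustrative) using Python's Decimal:

```python
from decimal import Decimal, ROUND_HALF_UP

def vat_amount(net_price, rate_percent):
    """Compute VAT on a net price using exact decimal arithmetic.

    Tax engines work in decimal and round to the smallest currency
    unit with an explicit policy (half-up here, for illustration).
    """
    net = Decimal(str(net_price))
    rate = Decimal(str(rate_percent)) / Decimal("100")
    return (net * rate).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

print(vat_amount("19.99", "20"))  # 4.00 (19.99 * 0.20 = 3.998, rounded up)
```

The same discipline, exact decimal types plus an explicit, jurisdiction-appropriate rounding rule, is what keeps calculated tax in agreement with filed returns.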

Closing Remark

As you explore job opportunities in the Vertex domain in India, remember to showcase your expertise, skills, and experience confidently during interviews. Prepare thoroughly for technical questions and demonstrate your understanding of tax compliance processes. With dedication and continuous learning, you can build a successful career in Vertex roles. Good luck!

cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies