
1489 Vertex Jobs - Page 8

Set up a Job Alert
JobPe aggregates results for easy application access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Join a leading force in the financial technology and enterprise support sector, where innovation meets precision. This role is ideal for professionals passionate about ensuring the robustness of critical tax systems in a dynamic, on-site environment. We support essential financial operations and tax calculations for diverse client portfolios in India.

As a Production Support Specialist for the Vertex Tax System, your primary responsibility will be to provide end-to-end support that ensures high system availability and accuracy. You will troubleshoot, diagnose, and swiftly resolve production issues to maintain seamless tax operations. Collaboration with cross-functional teams is essential, and you will escalate issues when necessary to guarantee minimal disruption. Monitoring system performance, implementing updates, and executing system enhancements will be part of your routine to optimize efficiency. Documentation of resolutions, maintenance of process standards, and contribution to continuous improvement initiatives are also key aspects of this role. Additionally, you will engage in on-site support activities, participate in on-call rotations, and adhere to emergency response protocols.

To excel in this role, you must have proven experience in production support for the Vertex Tax System or similar tax software environments. Strong technical troubleshooting abilities, coupled with hands-on experience in Linux and SQL environments, are must-have skills. Familiarity with critical business processes around tax calculations and financial reporting is essential, as is prior experience in on-site production support roles with a proactive problem-solving approach. Exposure to financial software support, ERP systems, or ITIL production support frameworks is preferred, and a degree in Engineering, Computer Science, or a related field would be advantageous.
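A small part of the troubleshooting work described above can be sketched as log triage: scanning application logs for error entries to escalate first. The log format and messages below are invented for illustration and are not the Vertex Tax System's actual logging.

```python
# Hypothetical log excerpt of the kind a production-support specialist
# would triage; timestamps and messages are invented for illustration.
log_lines = [
    "2024-05-01 10:00:01 INFO tax calc completed for client 42",
    "2024-05-01 10:00:02 ERROR vertex lookup timed out for client 57",
    "2024-05-01 10:00:03 INFO retry scheduled",
    "2024-05-01 10:00:04 ERROR vertex lookup timed out for client 58",
]

def triage(lines):
    """Collect ERROR entries so they can be escalated first."""
    return [line for line in lines if " ERROR " in line]

errors = triage(log_lines)
```

In practice the same filter would run over streamed log files (e.g. with `grep` on Linux), but the idea is identical: isolate failures quickly so escalation and resolution can start.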
As part of our team, you will enjoy the benefits of working in an innovative environment that rewards initiative, expertise, and collaboration. You will engage in challenging projects that drive professional growth and skill enhancement. We offer a competitive compensation package with clear career advancement pathways in a high-impact, on-site setting.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Analyst, Inclusive Innovation & Analytics, Center for Inclusive Growth

The Center for Inclusive Growth is the social impact hub at Mastercard. The organization seeks to ensure that the benefits of an expanding economy accrue to all segments of society. Through actionable research, impact data science, programmatic grants, stakeholder engagement and global partnerships, the Center advances equitable and sustainable economic growth and financial inclusion around the world. The Center’s work is at the heart of Mastercard’s objective to be a force for good in the world.
Reporting to the Vice President, Inclusive Innovation & Analytics, the Analyst will 1) create and/or scale data, data science, and AI solutions, methodologies, products, and tools to advance inclusive growth and the field of impact data science, 2) work on the execution and implementation of key priorities to advance external and internal data-for-social strategies, and 3) manage operations to ensure operational excellence across the Inclusive Innovation & Analytics team.

Key Responsibilities

Data Analysis & Insight Generation
- Design, develop, and scale data science and AI solutions, tools, and methodologies to support inclusive growth and impact data science.
- Analyze structured and unstructured datasets to uncover trends, patterns, and actionable insights related to economic inclusion, public policy, and social equity.
- Translate analytical findings into compelling visualizations and dashboards that inform policy, program design, and strategic decision-making.
- Create dashboards, reports, and visualizations that communicate findings to both technical and non-technical audiences.
- Provide data-driven support for convenings involving philanthropy, government, private sector, and civil society partners.

Data Integration & Operationalization
- Assist in building and maintaining data pipelines for ingesting and processing diverse data sources (e.g., open data, text, survey data).
- Ensure data quality, consistency, and compliance with privacy and ethical standards.
- Collaborate with data engineers and AI developers to support backend infrastructure and model deployment.

Team Operations
- Manage team operations, meeting agendas, project management, and strategic follow-ups to ensure alignment with organizational goals.
- Lead internal reporting processes, including the preparation of dashboards, performance metrics, and impact reports.
- Support team budgeting, financial tracking, and process optimization.
- Support grantees and grants management as needed.
- Develop briefs, talking points, and presentation materials for leadership and external engagements.
- Translate strategic objectives into actionable data initiatives and track progress against milestones.
- Coordinate key activities and priorities in the portfolio, working across teams at the Center and the business as applicable to facilitate collaboration and information sharing.
- Support the revamp of the Measurement, Evaluation, and Learning frameworks and workstreams at the Center.
- Provide administrative support as needed.
- Manage ad-hoc projects and event organization.

Qualifications
- Bachelor’s degree in Data Science, Statistics, Computer Science, Public Policy, or a related field.
- 2–4 years of experience in data analysis, preferably in a mission-driven or interdisciplinary setting.
- Strong proficiency in Python and SQL; experience with data visualization tools (e.g., Tableau, Power BI, Looker, Plotly, Seaborn, D3.js).
- Familiarity with unstructured data processing and robust machine learning concepts.
- Excellent communication skills and the ability to work across technical and non-technical teams.
Technical Skills & Tools

Data Wrangling & Processing
- Data cleaning, transformation, and normalization techniques
- Pandas, NumPy, Dask, Polars
- Regular expressions, JSON/XML parsing, web scraping (e.g., BeautifulSoup, Scrapy)

Machine Learning & Modeling
- Scikit-learn, XGBoost, LightGBM
- Proficiency in supervised/unsupervised learning, clustering, classification, regression
- Familiarity with LLM workflows and tools like Hugging Face Transformers and LangChain (a plus)

Visualization & Reporting
- Power BI, Tableau, Looker
- Python libraries: Matplotlib, Seaborn, Plotly, Altair
- Dashboarding tools: Streamlit, Dash
- Storytelling with data and stakeholder-ready reporting

Cloud & Collaboration Tools
- Google Cloud Platform (BigQuery, Vertex AI), Microsoft Azure
- Git/GitHub, Jupyter Notebooks, VS Code
- Experience with APIs and data integration tools (e.g., Airflow, dbt)

Ideal Candidate
You are a curious and collaborative analyst who believes in the power of data to drive social change. You’re excited to work with cutting-edge tools while staying grounded in the real-world needs of communities and stakeholders.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard’s security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

R-253034
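The data cleaning and normalization techniques this role asks for can be illustrated with a minimal, standard-library-only sketch. The listing names Pandas and similar tools for real workloads; the records and fields below are hypothetical.

```python
import re

# Toy records illustrating common data-quality problems: inconsistent
# casing, stray whitespace, and missing values.
raw_records = [
    {"region": "  Karnataka ", "income": "12,500"},
    {"region": "karnataka",    "income": "9800"},
    {"region": "HARYANA",      "income": None},
]

def clean(records):
    """Normalize region names and parse income strings into integers,
    dropping records with a missing income."""
    out = []
    for r in records:
        if r["income"] is None:
            continue  # nothing to analyze; drop the record
        out.append({
            "region": r["region"].strip().title(),
            "income": int(re.sub(r"[^\d]", "", r["income"])),
        })
    return out

cleaned = clean(raw_records)
```

With Pandas the same step would be a few vectorized calls (`str.strip`, `str.title`, `dropna`), but the transformation logic is the same.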

Posted 1 week ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Analyst, Inclusive Innovation & Analytics, Center for Inclusive Growth

The Center for Inclusive Growth is the social impact hub at Mastercard. The organization seeks to ensure that the benefits of an expanding economy accrue to all segments of society. Through actionable research, impact data science, programmatic grants, stakeholder engagement and global partnerships, the Center advances equitable and sustainable economic growth and financial inclusion around the world. The Center’s work is at the heart of Mastercard’s objective to be a force for good in the world.
Reporting to the Vice President, Inclusive Innovation & Analytics, the Analyst will 1) create and/or scale data, data science, and AI solutions, methodologies, products, and tools to advance inclusive growth and the field of impact data science, 2) work on the execution and implementation of key priorities to advance external and internal data-for-social strategies, and 3) manage operations to ensure operational excellence across the Inclusive Innovation & Analytics team.

Key Responsibilities

Data Analysis & Insight Generation
- Design, develop, and scale data science and AI solutions, tools, and methodologies to support inclusive growth and impact data science.
- Analyze structured and unstructured datasets to uncover trends, patterns, and actionable insights related to economic inclusion, public policy, and social equity.
- Translate analytical findings into compelling visualizations and dashboards that inform policy, program design, and strategic decision-making.
- Create dashboards, reports, and visualizations that communicate findings to both technical and non-technical audiences.
- Provide data-driven support for convenings involving philanthropy, government, private sector, and civil society partners.

Data Integration & Operationalization
- Assist in building and maintaining data pipelines for ingesting and processing diverse data sources (e.g., open data, text, survey data).
- Ensure data quality, consistency, and compliance with privacy and ethical standards.
- Collaborate with data engineers and AI developers to support backend infrastructure and model deployment.

Team Operations
- Manage team operations, meeting agendas, project management, and strategic follow-ups to ensure alignment with organizational goals.
- Lead internal reporting processes, including the preparation of dashboards, performance metrics, and impact reports.
- Support team budgeting, financial tracking, and process optimization.
- Support grantees and grants management as needed.
- Develop briefs, talking points, and presentation materials for leadership and external engagements.
- Translate strategic objectives into actionable data initiatives and track progress against milestones.
- Coordinate key activities and priorities in the portfolio, working across teams at the Center and the business as applicable to facilitate collaboration and information sharing.
- Support the revamp of the Measurement, Evaluation, and Learning frameworks and workstreams at the Center.
- Provide administrative support as needed.
- Manage ad-hoc projects and event organization.

Qualifications
- Bachelor’s degree in Data Science, Statistics, Computer Science, Public Policy, or a related field.
- 2–4 years of experience in data analysis, preferably in a mission-driven or interdisciplinary setting.
- Strong proficiency in Python and SQL; experience with data visualization tools (e.g., Tableau, Power BI, Looker, Plotly, Seaborn, D3.js).
- Familiarity with unstructured data processing and robust machine learning concepts.
- Excellent communication skills and the ability to work across technical and non-technical teams.
Technical Skills & Tools

Data Wrangling & Processing
- Data cleaning, transformation, and normalization techniques
- Pandas, NumPy, Dask, Polars
- Regular expressions, JSON/XML parsing, web scraping (e.g., BeautifulSoup, Scrapy)

Machine Learning & Modeling
- Scikit-learn, XGBoost, LightGBM
- Proficiency in supervised/unsupervised learning, clustering, classification, regression
- Familiarity with LLM workflows and tools like Hugging Face Transformers and LangChain (a plus)

Visualization & Reporting
- Power BI, Tableau, Looker
- Python libraries: Matplotlib, Seaborn, Plotly, Altair
- Dashboarding tools: Streamlit, Dash
- Storytelling with data and stakeholder-ready reporting

Cloud & Collaboration Tools
- Google Cloud Platform (BigQuery, Vertex AI), Microsoft Azure
- Git/GitHub, Jupyter Notebooks, VS Code
- Experience with APIs and data integration tools (e.g., Airflow, dbt)

Ideal Candidate
You are a curious and collaborative analyst who believes in the power of data to drive social change. You’re excited to work with cutting-edge tools while staying grounded in the real-world needs of communities and stakeholders.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard’s security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.

R-253034

Posted 1 week ago

Apply

3.0 years

0 Lacs

Andhra Pradesh, India

On-site

- 3+ years of overall Gen AI / AI-ML development experience
- Exposure to Python, Vertex AI, TensorFlow, DocAI, GKE, Cloud Functions, serverless, Cloud Run, Terraform, LangChain, vector databases, one-/few-shot learning, Gemini foundation models, notebooks, NLP, OCR, NER, RAG, and LLMs
- Impeccable analytical and problem-solving skills
- 4+ years of strong hands-on experience as a Python developer
- Extensive math and computer skills, with a deep understanding of probability, statistics, and algorithms
- In-depth knowledge of machine learning frameworks, like Keras or PyTorch
- Familiarity with data structures, data modeling, and software architecture
- Excellent time management and organizational skills
- Desire to learn
- Excellent communication and collaboration skills
- Innovative mind with a passion for continuous learning
- General knowledge of building machine learning systems
- Bachelor’s degree (or equivalent) in computer science, mathematics, or a related field
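The retrieval step behind the RAG and vector-database items above can be sketched in a few lines. The embeddings below are made-up toy vectors; a real system would obtain them from an embedding model (for example, one served via Vertex AI) and store them in a vector database.

```python
import math

# Toy "document embeddings": invented 3-dimensional vectors standing in
# for real embedding-model output, purely to illustrate retrieval.
doc_vectors = {
    "invoice tax rules": [0.9, 0.1, 0.0],
    "gke deployment guide": [0.1, 0.8, 0.3],
    "ocr pipeline notes": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec, k=1):
    """Return the k document keys most similar to the query vector --
    the 'R' in retrieval-augmented generation."""
    ranked = sorted(doc_vectors, key=lambda d: cosine(query_vec, doc_vectors[d]), reverse=True)
    return ranked[:k]

top = retrieve([0.85, 0.15, 0.05])
```

A vector database performs exactly this nearest-neighbor search, just at scale and with approximate indexing instead of a full sort.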

Posted 1 week ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Spyne
At Spyne, we are transforming how cars are marketed and sold with cutting-edge Generative AI. What started as a bold idea, using AI-powered visuals to help auto dealers sell faster online, has now evolved into a full-fledged, AI-first automotive retail ecosystem. Backed by $16M in Series A funding from Accel, Vertex Ventures, and other top investors, we’re scaling at breakneck speed:
- Launched industry-first AI-powered Image, Video & 360° solutions for automotive dealers
- Launching a Gen AI-powered Automotive Retail Suite to power Inventory, Marketing, and CRM for dealers
- Onboarded 1,500+ dealers across the US, EU and other key markets in the 2 years since launch
- Gearing up to onboard 10K+ dealers across a global market of 200K+ dealers
- 150+ member team with a near-equal split between R&D and GTM

Learn more about our products: Spyne AI Products - StudioAI, RetailAI. Series A announcement - CNBC-TV18, Yourstory. More about us - ASOTU, CNBC Awaaz.

Role Overview
This is a unique opportunity to own and build the Customer Onboarding & Success function for Spyne’s next-gen CRM + Conversational AI suite, a strategic product poised to outpace our core offering. You'll begin as a hands-on IC (Individual Contributor), managing high-impact programs and onboarding journeys, and will later build your own team as the function scales.

What’s the mission?
👉 Ensure a smooth onboarding journey for our first 30 strategic customers
👉 Deliver $1M+ in quarterly revenue
👉 And most importantly, after achieving PMF, lead the Customer Success charter for the product

📍 Location: Gurgaon (Onsite)
⏰ Working Hours: 6 PM – 3 AM IST (US Hours) | Work From Office Only
💼 Experience: 4–6 years
🧑‍💼 Reports to: Sanjay Varnwal, CEO & Co-founder

Key Responsibilities
🚀 Program Ownership & Execution
- Own the end-to-end onboarding lifecycle for Spyne’s first 30 strategic customers in the U.S. market.
- Lead the charge in delivering $1M+ in quarterly revenue by ensuring timely onboarding and activation.
- Build and manage complex onboarding journeys for mid-to-large automotive dealers.

🧩 Strategy, Process & Automation
- Define scalable onboarding frameworks and build SOPs for repeatable execution.
- Collaborate with Product, Engineering, AI, and GTM teams to integrate onboarding workflows into the platform.
- Drive automation-first delivery models, enabling onboarding of 100+ dealers/month post-PMF.

📈 Insights, Feedback & Product Alignment
- Run continuous feedback loops between customers and the product team.
- Provide structured insights from onboarding experiences to influence roadmap and feature prioritization.

👥 Path to Team Building
- After building foundational processes, prepare for a transition into a People Manager role with clear hiring plans.
- Define roles, structure, and metrics for a future Customer Onboarding & Success team.

What Will Make You Successful?
- 5–6 years in Strategy, Customer Success, or Program Management roles in B2B SaaS or AI-led startups
- Proven record of working with cross-functional teams in fast-paced environments
- Past experience building scalable, process-led functions from scratch
- Ability to translate customer problems into product-driven solutions
- Familiarity with CRM systems, automation tools, and AI-led engagement tools
- Strong executive presence, communication, and stakeholder management skills
- Demonstrated maturity to start as an IC with the capability to build and lead a team
- Willingness to work US hours (6 PM to 3 AM IST) onsite in Gurgaon
- Deep alignment with Spyne’s core values: Customer Obsession; Think 10X, Not 10%; Extreme Ownership; Relentless Innovation

Why Spyne?
- Culture: High-ownership, zero-politics, execution-first
- Growth: $5M to $20M ARR trajectory
- Learning: Work with top GTM leaders and startup veterans
- Exposure: Global exposure across U.S., EU, and India markets
- Compensation: Competitive base + performance incentives + stock options

Posted 1 week ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: SAP
Management Level: Senior Manager

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As an SAP consulting generalist at PwC, you will focus on providing consulting services across various SAP applications to clients, analysing their needs, implementing software solutions, and offering training and support for effective utilisation of SAP applications. Your versatile knowledge will allow you to assist clients in optimising operational efficiency and achieving their strategic objectives.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law.
We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary: A career within….

Responsibilities:

AI Architecture & Development
- Design and implement generative AI models (e.g., Transformers, GANs, VAEs, Diffusion Models).
- Architect Retrieval-Augmented Generation (RAG) systems and multi-agent frameworks.
- Fine-tune pre-trained models for domain-specific tasks (e.g., NLP, vision, genomics).
- Ensure model scalability, performance, and interpretability.

System Integration & Deployment
- Integrate AI models into full-stack applications using modern frameworks (React, Node.js, Django).
- Deploy models using cloud platforms (AWS SageMaker, Azure ML, GCP Vertex AI).
- Implement CI/CD pipelines and containerization (Docker, Kubernetes).

Collaboration & Leadership
- Work with data scientists, engineers, and domain experts to translate business/scientific needs into AI solutions.
- Lead architectural decisions across the model lifecycle: training, deployment, monitoring, and versioning.
- Provide technical mentorship and guidance to junior team members.

Compliance & Documentation
- Ensure compliance with data privacy standards (HIPAA, GDPR).
- Maintain comprehensive documentation for models, systems, and workflows.

Required Qualifications:
- Bachelor’s or Master’s in Computer Science, Engineering, Data Science, or a related field.
- 5+ years in AI/ML development; 3+ years in architecture or technical leadership roles.
- Proficiency in Python, JavaScript, and frameworks like TensorFlow and PyTorch.
- Experience with cloud platforms (AWS, Azure, GCP) and DevOps tools.
- Strong understanding of NLP, computer vision, or life sciences applications.
Preferred Qualifications:
- Experience in domains like marketing, capital markets, or life sciences (e.g., drug discovery, genomics).
- Familiarity with Salesforce Einstein and other enterprise AI tools.
- Knowledge of regulatory standards (FDA, EMA) and ethical AI practices.
- Experience with multimodal data (text, image, genomic, clinical).

Mandatory skill sets: Gen AI Architect
Preferred skill sets: Gen AI
Years of experience required: 10+ years
Education qualification: B.Tech, MBA, MCA, M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: AI Architecture
Optional Skills: Generative AI
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
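One small, concrete piece of the RAG architecture work this role describes is prompt assembly: grounding the model's answer with retrieved passages before calling the LLM. The template and passages below are illustrative assumptions, not PwC's actual implementation.

```python
# Hypothetical retrieved passages; a real system would fetch these from
# a vector store ranked by similarity to the user's question.
retrieved_passages = [
    "Tax rates for category A goods changed in FY2024.",
    "Category A includes packaged foods and beverages.",
]

def build_rag_prompt(question, passages):
    """Ground the model's answer by prepending retrieved context."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}\n"
    )

prompt = build_rag_prompt("Did tax rates for packaged foods change?", retrieved_passages)
```

The assembled prompt would then be sent to a hosted model (SageMaker, Azure ML, or Vertex AI endpoints, as the posting lists); the grounding step itself is model-agnostic.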

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Ciklum is looking for a Data Engineer to join our team full-time in India. We are a custom product engineering company that supports both multinational organizations and scaling startups in solving their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.

About the role:
As a Data Engineer, you will become part of a cross-functional development team working on GenAI solutions for digital transformation across Enterprise Products. The team you will join is responsible for the design, development, and deployment of innovative enterprise technology, tools, and standard processes to support the delivery of tax services. The team focuses on delivering comprehensive, value-added, and efficient tax services to our clients. It is a dynamic team with professionals from tax technical, technology development, change management, and project management backgrounds. The team consults and executes on a wide range of initiatives involving process and tool development and implementation, including training development, engagement management, tool design, and implementation.
Responsibilities:
- Build, deploy, and maintain mission-critical analytics solutions that process terabytes of data quickly at big-data scale
- Contribute design, code, and configurations; manage data ingestion, real-time streaming, batch processing, and ETL across multiple data storages
- Performance-tune complicated SQL queries and data flows

Requirements:
- Experience coding in SQL/Python, with solid CS fundamentals including data structure and algorithm design
- Hands-on implementation experience with a combination of the following technologies: Hadoop, MapReduce, Kafka, Hive, Spark, SQL and NoSQL data warehouses
- Experience with the Azure cloud data platform
- Experience working with vector databases (Milvus, Postgres, etc.)
- Knowledge of embedding models and retrieval-augmented generation (RAG) architectures
- Understanding of LLM pipelines, including data preprocessing for GenAI models
- Experience deploying data pipelines for AI/ML workloads (*), ensuring scalability and efficiency
- Familiarity with model monitoring (*), feature stores (Feast, Vertex AI Feature Store), and data versioning
- Experience with CI/CD for ML pipelines (*) (Kubeflow, MLflow, Airflow, SageMaker Pipelines)
- Understanding of real-time streaming for ML model inference (Kafka, Spark Streaming)
- Knowledge of data warehousing design, implementation and optimization
- Knowledge of data quality testing, automation and results visualization
- Knowledge of BI report and dashboard design and implementation (Power BI)
- Experience supporting data scientists and complex statistical use cases is highly desirable

What's in it for you?
- Strong community: Work alongside top professionals in a friendly, open-door environment
- Growth focus: Take on large-scale projects with a global impact and expand your expertise
- Tailored learning: Boost your skills with internal events (meetups, conferences, workshops), Udemy access, language courses, and company-paid certifications
- Endless opportunities: Explore diverse domains through internal mobility, finding the best fit to gain hands-on experience with cutting-edge technologies
- Care: We’ve got you covered with company-paid medical insurance, mental health support, and financial & legal consultations

About us:
At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you’ll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress. India is a strategic innovation hub for Ciklum, with growing teams in Chennai and Pune leading advancements in EdgeTech, AR/VR, IoT, and beyond. Join us to collaborate on game-changing solutions and take your career to the next level.

Want to learn more about us? Follow us on Instagram, Facebook, LinkedIn. Explore, empower, engineer with Ciklum!

Interested already? We would love to get to know you! Submit your application. We can’t wait to see you at Ciklum.
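The ingest-transform-load pattern at the heart of the data-engineering responsibilities in this listing can be sketched with the standard library's sqlite3 standing in for a real warehouse. Table and column names are illustrative only; production pipelines would run over Spark, Kafka, or an Azure data platform as the posting describes.

```python
import sqlite3

# Extract: raw rows as they might arrive from an upstream source,
# with amounts still encoded as strings.
raw = [("acme", "100.5"), ("acme", "49.5"), ("globex", "200.0")]

# Load target: an in-memory SQLite table standing in for a warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (client TEXT, amount REAL)")

# Transform + load: parse amounts to floats before inserting.
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(client, float(amount)) for client, amount in raw],
)

# A typical aggregation a downstream dashboard or BI report would run.
totals = dict(
    conn.execute("SELECT client, SUM(amount) FROM events GROUP BY client")
)
```

The same shape scales up: extraction becomes a streaming consumer or batch reader, the transform becomes a Spark job, and the aggregate becomes a warehouse query feeding Power BI.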

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
We are looking for an enthusiastic AI/ML Developer with 3-5 years of experience to design, develop, and deploy AI/ML solutions. The ideal candidate is passionate about AI, skilled in machine learning, deep learning, and MLOps, and eager to work on cutting-edge projects.

Key Skills & Experience:
- Programming: Python (TensorFlow, PyTorch, Scikit-learn, Pandas)
- Machine Learning: supervised and unsupervised learning, deep learning, NLP, computer vision
- Model Deployment: Flask, FastAPI, AWS SageMaker, Google Vertex AI, Azure ML
- MLOps & Cloud: Docker, Kubernetes, MLflow, Kubeflow, CI/CD pipelines
- Big Data & Databases: Spark, Dask, SQL, NoSQL (PostgreSQL, MongoDB)

Soft Skills:
- Strong analytical and problem-solving mindset
- Passion for AI innovation and continuous learning
- Excellent teamwork and communication abilities

Qualifications:
- Bachelor’s/Master’s in Computer Science, AI, Data Science, or related fields
- AI/ML certifications are a plus

Career Level - IC4
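As a toy illustration of the supervised-learning skills this listing asks for, here is a nearest-centroid classifier in pure Python. Production work would use the frameworks named above (Scikit-learn, TensorFlow, PyTorch); the training data here is invented.

```python
import math
from collections import defaultdict

# Invented 2-D training points with class labels.
train = [
    ([1.0, 1.0], "low"), ([1.2, 0.8], "low"),
    ([5.0, 5.2], "high"), ([4.8, 5.0], "high"),
]

def fit(samples):
    """Compute one centroid per class -- the entire 'model'."""
    sums = defaultdict(lambda: [0.0, 0.0])
    counts = defaultdict(int)
    for (x, y), label in samples:
        sums[label][0] += x
        sums[label][1] += y
        counts[label] += 1
    return {lbl: (s[0] / counts[lbl], s[1] / counts[lbl]) for lbl, s in sums.items()}

def predict(model, point):
    """Assign the class whose centroid is closest to the point."""
    return min(model, key=lambda lbl: math.dist(point, model[lbl]))

model = fit(train)
label = predict(model, [4.9, 5.1])
```

This is the same train/predict split every supervised-learning API exposes (e.g. Scikit-learn's `fit`/`predict`); real models simply learn richer decision boundaries than a pair of centroids.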
Diversity & Inclusion:
An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation.

Oracle offers a highly competitive suite of employee benefits designed on the principles of parity, consistency, and affordability. The overall package includes certain core elements such as medical coverage, life insurance, access to retirement planning, and much more. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business.

At Oracle, we believe that innovation starts with diversity and inclusion, and to create the future we need talent from various backgrounds, perspectives, and abilities. We ensure that individuals with disabilities are provided reasonable accommodation to successfully participate in the job application and interview process, and to perform crucial job functions in potential roles. That’s why we’re committed to creating a workforce where all individuals can do their best work. It’s when everyone’s voice is heard and valued that we’re inspired to go beyond what’s been done before.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Ciklum is looking for a Data Engineer to join our team full-time in India. We are a custom product engineering company that supports both multinational organizations and scaling startups in solving their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live. About the role: As a Data Engineer, you will become part of a cross-functional development team working on GenAI solutions for digital transformation across Enterprise Products. The team you will be working with is responsible for the design, development, and deployment of innovative enterprise technology, tools, and standard processes to support the delivery of tax services. The team focuses on the ability to deliver comprehensive, value-added, and efficient tax services to our clients. It is a dynamic team with professionals of varying backgrounds, from tax technical and technology development to change management and project management. The team consults and executes on a wide range of initiatives involving process and tool development and implementation, including training development, engagement management, tool design, and implementation. 
Responsibilities: Building, deploying, and maintaining mission-critical analytics solutions that process terabytes of data quickly at big-data scale Contributing design, code, and configuration; managing data ingestion, real-time streaming, batch processing, and ETL across multiple data stores Performance tuning of complicated SQL queries and data flows Requirements: Experience coding in SQL/Python, with solid CS fundamentals including data structure and algorithm design Hands-on implementation experience with a combination of the following technologies: Hadoop, MapReduce, Kafka, Hive, Spark, SQL and NoSQL data warehouses Experience with the Azure cloud data platform Experience working with vector databases (Milvus, Postgres, etc.) Knowledge of embedding models and retrieval-augmented generation (RAG) architectures Understanding of LLM pipelines, including data preprocessing for GenAI models Experience deploying data pipelines for AI/ML workloads, ensuring scalability and efficiency Familiarity with model monitoring, feature stores (Feast, Vertex AI Feature Store), and data versioning Experience with CI/CD for ML pipelines (Kubeflow, MLflow, Airflow, SageMaker Pipelines) Understanding of real-time streaming for ML model inference (Kafka, Spark Streaming) Knowledge of data warehousing design, implementation, and optimization Knowledge of data quality testing, automation, and results visualization Knowledge of BI report and dashboard design and implementation (Power BI) Experience supporting data scientists and complex statistical use cases is highly desirable What's in it for you? 
Strong community: Work alongside top professionals in a friendly, open-door environment Growth focus: Take on large-scale projects with a global impact and expand your expertise Tailored learning: Boost your skills with internal events (meetups, conferences, workshops), Udemy access, language courses, and company-paid certifications Endless opportunities: Explore diverse domains through internal mobility, finding the best fit to gain hands-on experience with cutting-edge technologies Care: We’ve got you covered with company-paid medical insurance, mental health support, and financial & legal consultations About us: At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you’ll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress. India is a strategic innovation hub for Ciklum, with growing teams in Chennai and Pune leading advancements in EdgeTech, AR/VR, IoT, and beyond. Join us to collaborate on game-changing solutions and take your career to the next level. Want to learn more about us? Follow us on Instagram , Facebook , LinkedIn . Explore, empower, engineer with Ciklum! Interested already? We would love to get to know you! Submit your application. We can’t wait to see you at Ciklum.
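The SQL performance-tuning responsibility in this posting can be illustrated with a small self-contained sketch; sqlite3 stands in for a real warehouse, and the table, column, and index names are invented for the example:

```python
import sqlite3

# In-memory database standing in for a warehouse table; names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events (user_id, amount) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(10_000)],
)

query = "SELECT SUM(amount) FROM events WHERE user_id = 42"

# Before tuning: the filter column has no index, so the plan is a full scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# A covering index on (user_id, amount) lets the same query be answered by
# an index search without touching the base table.
conn.execute("CREATE INDEX idx_events_user ON events (user_id, amount)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

total = conn.execute(query).fetchone()[0]
```

The same habit scales to warehouse engines: read the query plan first, then add or adjust indexes (or partitioning) so the filter is served by a search rather than a full scan.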

Posted 1 week ago

Apply

6.0 - 10.0 years

6 - 16 Lacs

Noida, Hyderabad, Bengaluru

Work from Office

Job Description Role: SAP Vertex Integration Consultant Experience: 6-10 years Location: Hyderabad Understand client requirements; provide solutions and functional specifications, and implement technical components accordingly. Ability to create Technical Design Documents (TDD) and unit test documents for the technical solutions being implemented. Excellent communication, analytical and interpersonal skills as a consultant, with the ability to play the role of team lead. Must have experience in end-to-end implementation of Vertex integration with SAP. Hands-on experience with SAP and Vertex configuration and troubleshooting with integrated applications. Good understanding of global and country-specific tax procedures. Deep knowledge of the Vertex tax engine, including tax rules, calculations, and data structure configurations. Good experience with Vertex setups (Tax Product Category (TPC), Tax Product Driver (TPD), Tax Assist rules). Experience with both output and input taxes. Able to communicate effectively with the team on technical requirements. Strong knowledge of SAP FICO business processes such as P2P, OTC, AP, AR, GL, and Controlling; strong knowledge of month/year-end processes. Knowledge of working with interfaces such as ALE, EDI IDoc, and RFC. Good knowledge of SAP MM/SD/FI integration processes. Experience in developing custom objects, including requirement gathering, preparing functional design documents, estimation, conducting UT and SIT, and knowledge of SAP tables in the areas of FI and CO. Responsible for supporting the translation of business requirements into standard/custom solutions in the finance domain, focusing the skillset on the ERP solution with an emphasis on tax. VAT determination and withholding tax expertise.

Posted 1 week ago

Apply

9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Hiring: Python Developers (6–9 Yrs) | Chennai & Hyderabad | F2F Interview – 26th July Are you a Python developer with strong cloud experience? Join us at Virtusa – a global leader in digital engineering. We are conducting a Face-to-Face Interview Drive for multiple roles in Chennai and Hyderabad on Saturday, 26th July . Position Details: Role: Python Developer Experience: 6–9 Years Job Type: Full-Time No. of Positions: 20 Work Location: Chennai – Navallur Office / Hyderabad – Campus Office Interview Date: Saturday, 26th July Interview Mode: Face-to-Face ONLY Key Responsibilities: Design, develop, and maintain scalable applications using Python Build and manage cloud-native applications across AWS , Azure , or GCP Implement and manage API integrations including authentication mechanisms (e.g., OAuth , API keys) Set up and manage Python development environments using pip , Conda , and virtual environments Write clean, efficient, and testable code following best practices Collaborate with cross-functional teams (DevOps, Data Engineering, Product) Troubleshoot application issues and ensure optimal performance Must-Have Skills: Strong programming expertise in Python Experience in Python environment setup and dependency management (pip, conda) Hands-on knowledge of API integration techniques (OAuth, API Keys, RESTful APIs) Cloud Expertise (Must have any one combination): Python + AWS (e.g., S3, Lambda, SageMaker AI, EC2) Python + Azure (e.g., Azure ML, Functions, Blob Storage) Python + GCP (e.g., Vertex AI, GCS, Cloud Functions) Preferred Qualifications: Experience working in Agile environments Familiarity with CI/CD pipelines Exposure to cloud security and cost optimization practices Knowledge of containerization (Docker, Kubernetes) is a plus How to Apply: 📨 Send your updated resume to shalini.v@saranshinc.com Take advantage of this great opportunity to accelerate your career—join us at the upcoming interview drive! 
#PythonJobs #CloudCareers #AWS #Azure #GCP #HiringNow #InterviewDrive #ChennaiJobs #HyderabadJobs #TechHiring #PythonDevelopers
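The API-integration skills this posting asks for (OAuth bearer tokens vs. plain API keys) can be sketched as a small helper; the function name, the X-API-Key header convention, and the URL are illustrative assumptions, not a fixed standard:

```python
import urllib.request

def auth_headers(scheme: str, credential: str) -> dict:
    """Build HTTP auth headers for two common schemes.

    "api_key" uses an X-API-Key header (a common but non-universal
    convention) and "oauth" uses an OAuth 2.0 bearer token.
    """
    if scheme == "api_key":
        return {"X-API-Key": credential}
    if scheme == "oauth":
        return {"Authorization": f"Bearer {credential}"}
    raise ValueError(f"unknown auth scheme: {scheme}")

# Attach the headers to a request object; no network call is made here,
# and the URL is a placeholder.
req = urllib.request.Request(
    "https://api.example.com/v1/items",
    headers=auth_headers("oauth", "token-123"),
)
```

In a real integration the same helper would feed whatever HTTP client the project uses; keeping header construction in one place makes rotating credentials or switching schemes a one-line change.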

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Senior/Lead Data Scientist Locations: Hyderabad NP: Immediate Joiner - 15 Days Skills: Python, Data Science, ML Model Building, GCP, Azure, AWS Who we are Tiger Analytics is a global leader in AI and analytics, helping Fortune 1000 companies solve their toughest challenges. We offer full-stack AI and analytics services & solutions to empower businesses to achieve real outcomes and value at scale. We are on a mission to push the boundaries of what AI and analytics can do to help enterprises navigate uncertainty and move forward decisively. Our purpose is to provide certainty to shape a better tomorrow. Our team of 4000+ technologists and consultants are based in the US, Canada, the UK, India, Singapore and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Many of our team leaders rank in Top 10 and 40 Under 40 lists, exemplifying our dedication to innovation and excellence. We are a Great Place to Work-Certified™ (2022-24), recognized by analyst firms such as Forrester, Gartner, HFS, Everest, ISG and others. We have been ranked among the ‘Best’ and ‘Fastest Growing’ analytics firms by Inc., Financial Times, Economic Times and Analytics India Magazine. Curious about the role? What would your typical day look like? Lead and contribute to developing sophisticated machine learning models, predictive analytics, and statistical analyses to solve complex business problems. Demonstrate proficiency in programming languages such as Python, with the ability to write clean, efficient, and maintainable code. Use your robust problem-solving skills to develop data-driven solutions, analyse complex datasets, and derive actionable insights that lead to impactful outcomes. Take ownership of end-to-end model development, from problem definition and data exploration to model training, validation, deployment, and monitoring, delivering scalable solutions in real-world settings. 
Work closely with clients to understand their business objectives, identify opportunities for analytics-driven solutions, and communicate findings clearly and promptly. Collaborate with cross-functional teams, including data engineers, software developers, and business stakeholders, to integrate machine learning solutions into business processes, with an emphasis on production-grade deployment. Implement and scale AI/ML models on Google Cloud Platform using tools like Vertex AI Workbench, Vertex AI Pipelines (Kubeflow), and BigQuery to accelerate development and streamline production deployment. Deploy, serve, and manage high-performance models using Vertex AI Endpoints, Cloud Run, and Cloud Composer, ensuring resilience and observability via Vertex Model Monitoring, Cloud Monitoring, and Logging. What do we expect? ● 6+ years’ experience in data science and ML model development ● A passion for writing high-quality, modular, and scalable code (Python), with hands-on involvement across end-to-end project execution. 
● Solid understanding of regression, classification, and statistical methods ● Proven experience deploying machine learning models in production using Google Cloud Vertex AI (Training Jobs, Custom Training, Model Registry, Scoring Jobs, Experiment Tracking using TensorBoard) ● Experience in orchestrating ML pipelines using Vertex AI Pipelines (Kubeflow) and/or Cloud Composer (Apache Airflow) ● Hands-on exposure to GCP services like Vertex AI SDK, Cloud Storage, Cloud Run, and CI/CD pipelines (GitHub Actions, Cloud Build) ● Experience with monitoring and maintaining ML models in production, using Vertex AI Model Monitoring and related GCP tools Good to Have ● Experience with Generative AI tools like Vertex RAG Engine, Agent Builder, Dialogflow CX, and Google Agent Development Kit (ADK) ● Familiarity with API management (Apigee), VPC networking, load balancers, and GCP SDKs such as Discovery Engine SDK, Reasoning Engine SDK, and GKE You are important to us, let’s stay connected! Every individual comes with a different set of skills and qualities so even if you don’t tick all the boxes for the role today, we urge you to apply as there might be a suitable/unique role for you tomorrow. We are an equal opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire. Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry. Additional Benefits: Health insurance (self & family), virtual wellness platform, and knowledge communities.

Posted 1 week ago

Apply

7.0 years

6 - 9 Lacs

Gurgaon

On-site

Job Description: Generative AI Engineer (GCP, Python, RAG, Vertex AI) Location: Gurugram/Noida Experience Required: 7-9 years (2+ years specifically in Generative AI) Job Summary: We are seeking an experienced and highly skilled Generative AI Engineer with strong hands-on experience in building and deploying AI/ML models, particularly in the domain of Generative AI. The ideal candidate will have deep technical expertise in Python, Flask, Google Cloud Platform (GCP), Vertex AI, prompt engineering, embeddings, and Retrieval-Augmented Generation (RAG) applications. Key Responsibilities: · Design, build, and deploy scalable AI/ML models using Python and Flask, tailored for real-time and batch processing. · Leverage Google Cloud Platform (GCP) services, particularly Vertex AI, for training, hosting, and managing machine learning models. · Develop and optimize RAG (Retrieval-Augmented Generation) pipelines using vector databases and embeddings. · Write and fine-tune prompts for Generative AI models ensuring alignment with application-specific goals. · Create APIs/endpoints for seamless model integration using Flask and RESTful design. · Collaborate closely with product managers, data scientists, and engineers to define and deliver GenAI-powered solutions. · Present findings, models, and technical strategies to both technical and non-technical stakeholders. Required Qualifications: · 8+ years of overall experience in AI/ML domain. · Minimum 2 years of experience specifically in Generative AI applications. · Proven hands-on experience with GCP, especially Vertex AI, AI Platform, and related ML services. · Strong Python programming skills, including development of Flask-based APIs. · Experience with RAG, vector search engines, and embedding models. · Solid understanding of prompt engineering and application of LLMs in production. · Familiarity with deep learning techniques and transformer-based architectures. · Excellent communication and documentation skills. 
Preferred Skills (Bonus Points): · Exposure to Natural Language Processing (NLP) techniques and libraries. · Experience with tools like LangChain, OpenAI API, or LlamaIndex. · Proficiency in data visualization and data analysis for monitoring model performance. · Familiarity with CI/CD pipelines for ML models and containerization (Docker, Kubernetes). Immediate joiners required. Excellent communication skills required, as there will be client interaction. Job Types: Full-time, Contractual / Temporary Contract length: 6 months Pay: ₹600,000.00 - ₹900,000.00 per year Schedule: Day shift Application Question(s): How many years of experience do you have in Gen AI? Do you have experience with GCP cloud services & Vertex AI? Are you open to working in a contractual role for 6-8 months? We need to fill this position urgently. Are you an immediate joiner? Experience: AI/ML: 7 years (Required) Language: English (Required) Application Deadline: 26/07/2025 Expected Start Date: 01/08/2025
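As a rough illustration of the retrieval step in the RAG pipelines this posting describes, here is a dependency-free sketch; a real system would call a learned embedding model and a vector database rather than the toy bag-of-words vectors below, and the documents and query are invented:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real pipeline would call an embedding
    # model and store its vectors in a vector database.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented mini-corpus.
docs = [
    "invoices are due within thirty days",
    "the api supports batch and real time processing",
    "refunds are processed in five business days",
]
query = "when are invoices due"

# Retrieval: rank documents by similarity to the query and keep the best.
ranked = sorted(docs, key=lambda d: cosine(embed(query), embed(d)), reverse=True)
context = ranked[0]

# Augmentation: ground the generation prompt in the retrieved context.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

The final prompt would then go to the LLM; swapping the toy pieces for real embeddings and a vector store changes the components but not this retrieve-then-augment shape.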

Posted 1 week ago

Apply

3.0 years

3 - 4 Lacs

Mohali

On-site

Job Title: Pre-Sales Technical Business Analyst (AI/ML & MERN Stack) Job Type: Full-time | Pre-Sales | Technical Consulting About the Role: We are seeking a dynamic Pre-Sales Technical Business Analyst with a strong foundation in AI/ML solutions , MERN stack technologies , and API integration . This role bridges the gap between clients’ business requirements and our technical solutions, playing a pivotal role in shaping proposals, leading product demos, and translating client needs into technical documentation and strategic solutions. Key Responsibilities: Client Engagement: Collaborate with the sales team to understand client requirements, pain points, and objectives. Participate in discovery calls, solution walkthroughs, and RFP/RFI responses. Solution Design & Technical Analysis: Analyze and document business needs, converting them into detailed technical requirements. Propose architectural solutions using AI/ML models and the MERN stack (MongoDB, Express.js, React.js, Node.js) . Provide input on data pipelines, model training, and AI workflows where needed. Technical Presentations & Demos: Prepare and deliver compelling demos and presentations for clients. Act as a technical expert during pre-sales discussions to communicate the value of proposed solutions. Documentation & Proposal Support: Draft technical sections of proposals, SoWs, and functional specs. Create user flows, diagrams, and system interaction documents. Collaboration: Work closely with engineering, product, and delivery teams to ensure alignment between business goals and technical feasibility. Conduct feasibility analysis and risk assessments on proposed features or integrations. Required Skills & Experience: 3+ years in a Business Analyst or Pre-Sales Technical Consultant role. Proven experience in AI/ML workflows (understanding of ML lifecycle, model deployment, data prep). 
Strong technical knowledge of the MERN stack – including RESTful APIs, database schema design, and frontend/backend integration. Solid understanding of API design , third-party integrations, and system interoperability. Ability to translate complex technical concepts into simple business language. Hands-on experience with documentation tools like Swagger/Postman for API analysis. Proficient in writing user stories, business cases, and technical specifications. Preferred Qualifications: Exposure to cloud platforms (AWS, Azure, GCP) and ML platforms (SageMaker, Vertex AI, etc.). Experience with Agile/Scrum methodologies. Familiarity with AI use cases like recommendation systems, NLP, predictive analytics. Experience with data visualization tools or BI platforms is a plus. Job Types: Full-time, Permanent Pay: ₹30,000.00 - ₹40,000.00 per month Work Location: In person

Posted 1 week ago

Apply

0 years

0 - 1 Lacs

Gāndhīnagar

On-site

FinDocGPT Internship Opportunity – AI/ML & GenAI Developer Intern Company: ArgyleEnigma Tech Labs Product: FinDocGPT – India’s first AI-powered assistant that decodes complex financial documents in simple, regional language Stipend: ₹8,000 – ₹12,000/month Duration: 6 months Start Date: Immediate About Us At FinDocGPT, we are building a revolutionary product that empowers millions of Indians to understand financial documents like health insurance, loans, and mutual fund T&Cs in their own language, using cutting-edge AI/ML and GenAI technologies. We’re backed by Google for Startups, supported by the Reserve Bank Innovation Hub, and on a mission to bridge the financial literacy gap in India. Intern Role – AI/ML & Generative AI Developer We are looking for highly motivated interns who want to gain hands-on experience in deploying GenAI models, NLP pipelines, and ML-based document processing. Responsibilities Work on document classification, NER, and summarization using LLMs like LLaMA, Mistral, Groq, and open-source models. Fine-tune models or use APIs and frameworks like LangChain, Hugging Face, OpenAI, and Google Vertex AI Preprocess, clean, and structure financial documents (PDFs, scans, emails, etc.) Implement prompt engineering & RAG-based workflows Collaborate with product & design to build smart, user-friendly interfaces Requirements Strong understanding of Python, NLP, and basic ML concepts Familiarity with transformers, BERT, T5, or GPT architectures Experience (even academic) with Hugging Face, LangChain, or LLM APIs Bonus: Exposure to Google Cloud, AWS, or Docker Hunger to learn, experiment fast, and make real impact What You’ll Gain Direct mentorship from industry experts (ex-Morgan Stanley, PIMCO, Google) Real-world exposure to India-first AI use cases Opportunity to convert to a full-time role based on performance Work that will directly impact financial inclusion in India How to Apply? 
Send your CV, GitHub/portfolio, and 2–3 lines on why you want this role to info@arglyeenigma.com or https://tinyurl.com/hr-aetl with subject: “Internship Application – FinDocGPT” Job Type: Internship Contract length: 6 months Pay: ₹8,000.00 - ₹12,000.00 per month Ability to commute/relocate: Gandhinagar, Gujarat: Reliably commute or planning to relocate before starting work (Required) Work Location: In person Expected Start Date: 01/08/2025

Posted 1 week ago

Apply

6.0 - 9.0 years

6 - 10 Lacs

Noida

On-site

Key Responsibilities: Design, build, and deploy scalable AI/ML models using Python and Flask, tailored for real-time and batch processing. Leverage Google Cloud Platform (GCP) services, particularly Vertex AI, for training, hosting, and managing machine learning models. Develop and optimize RAG (Retrieval-Augmented Generation) pipelines using vector databases and embeddings. Write and fine-tune prompts for Generative AI models, ensuring alignment with application-specific goals. Create APIs/endpoints for seamless model integration using Flask and RESTful design. Collaborate closely with product managers, data scientists, and engineers to define and deliver GenAI-powered solutions. Present findings, models, and technical strategies to both technical and non-technical stakeholders. Required Qualifications: 6-9 years of overall experience in the AI/ML domain. Minimum 2 years of experience specifically in Generative AI applications. Proven hands-on experience with GCP, especially Vertex AI, AI Platform, and related ML services. Strong Python programming skills, including development of Flask-based APIs. Experience with RAG, vector search engines, and embedding models. Solid understanding of prompt engineering and application of LLMs in production. Familiarity with deep learning techniques and transformer-based architectures. Excellent communication and documentation skills. Preferred Skills: Exposure to Natural Language Processing (NLP) techniques and libraries. Experience with tools like LangChain, OpenAI API, or LlamaIndex. Proficiency in data visualization and data analysis for monitoring model performance. Familiarity with CI/CD pipelines for ML models and containerization (Docker, Kubernetes). Job Type: Contractual / Temporary Contract length: 6 months Pay: ₹50,000.00 - ₹90,000.00 per month

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Key Responsibilities • Build Gen AI-enabled solutions using online and offline LLMs, SLMs and TLMs tailored to domain-specific problems. • Deploy agentic AI workflows and use cases using frameworks like LangGraph, CrewAI, etc. • Apply NLP, predictive modelling and optimization techniques to develop scalable machine learning solutions. • Integrate enterprise knowledge bases using Vector Databases and Retrieval-Augmented Generation (RAG). • Apply advanced analytics to address complex challenges in the Healthcare, BFSI and Manufacturing domains. • Deliver embedded analytics within business systems to drive real-time operational insights. Required Skills & Experience • 3–5 years of experience in applied Data Science or AI roles. • Experience working in any one of the following domains: BFSI, Healthcare/Health Sciences, Manufacturing or Utilities. • Proficiency in Python, with hands-on experience in libraries such as scikit-learn and TensorFlow. • Practical experience with Gen AI (LLMs, RAG, vector databases), NLP and building scalable ML solutions. • Experience with time series forecasting, A/B testing, Bayesian methods and hypothesis testing. • Strong skills in working with structured and unstructured data, including advanced feature engineering. • Familiarity with analytics maturity models and the development of Analytics Centres of Excellence (CoEs). • Exposure to cloud-based ML platforms like Azure ML, AWS SageMaker or Google Vertex AI. • Data visualization using Matplotlib, Seaborn and Plotly; experience with Power BI is a plus. What We Look for (Values & Behaviours) • AI-First Thinking – Passion for leveraging AI to solve business problems. • Data-Driven Mindset – Ability to extract meaningful insights from complex data. • Collaboration & Agility – Comfortable working in cross-functional teams with a fast-paced mindset. • Problem-Solving – Think beyond the obvious to unlock AI-driven opportunities. 
• Business Impact – Focus on measurable outcomes and real-world adoption of AI. • Continuous Learning – Stay updated with the latest AI trends, research and best practices.
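The A/B testing and hypothesis-testing experience this posting lists can be illustrated with a standard two-proportion z-test in pure Python; the conversion counts are invented for the example:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool the conversion rate under the null hypothesis of no difference.
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF, computed via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Invented example: variant A converts 100/1000 users, variant B 120/1000.
z, p = two_proportion_z(100, 1000, 120, 1000)
```

With these numbers the p-value comes out around 0.15, so the observed lift would not clear a conventional 0.05 significance threshold without more traffic.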

Posted 1 week ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Description Amazon is seeking a Tax Analyst II to join its Direct Tax Reporting team in Hyderabad, India, supporting the Foreign Tax Reporting and Compliance team. The Amazon Tax Department is a fast-paced, team-focused, dynamic environment. The mission of the Foreign Tax Reporting and Compliance team is to comply with foreign local tax reporting requirements and manage the worldwide, tax-related filings of Amazon’s foreign entities. This position will be primarily focused on preparing and reviewing India/APAC corporate tax reporting (direct tax returns, tax accounting & tax assessments) as well as the international aspects of the US GAAP worldwide income tax reporting. A successful candidate will have excellent organizational and communication skills, strong attention to detail, and the ability to employ technology tools to streamline large amounts of data for tax reporting. They can also prioritize multiple tasks with teammates around the globe in a deadline-driven, dynamic environment, and will be self-motivated to build cross-functional process improvements. Key job responsibilities Prepare and/or review quarterly advance tax computations for various entities, ensuring timely payment of taxes. Prepare and/or review annual India income tax returns for Amazon’s overseas entities. Prepare and/or review monthly/quarterly tax computations to support Amazon’s worldwide US GAAP provision for India/APAC-based entities. Review financial statements and reports and support closing of local country financial statements. Support and review local country transfer pricing compliance and ensure it is completed on time. Extract, analyse, and review data and make appropriate recommendations. Coordinate information requests with internal and external service providers to ensure accurate and timely closure of tax reporting deliverables. Identify process improvements which increase efficiency and scalability of data. 
About The Team The Foreign Reporting and Compliance team is responsible for direct income tax reporting obligations in 80+ countries outside of the U.S. This includes income tax accounting for statutory reporting under local GAAP (including related internal controls), management of the statutory audit from an income tax perspective, ownership of local income tax filings, cash tax management, tax audit/controversy support, and tax planning/M&A support. FRC tax owners must not only understand transaction-related tax issues but also be able to coordinate and manage issues with local advisors, Tax Planning, US Provision, M&A, Transfer Pricing, and business owners. Basic Qualifications Experience working in a large public accounting firm or multi-national corporate tax department CA or Masters from a recognized institute or equivalent preferred Preferred Qualifications 2+ years of experience maintaining and operating transaction tax calculation software (e.g. Vertex) Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI HYD 13 SEZ Job ID: A3040440
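The quarterly advance-tax computation mentioned in the responsibilities can be sketched as follows. The cumulative installment rates (15%, 45%, 75%, 100% by the four due dates) reflect the general Indian advance-tax schedule and are stated here as an assumption; real computations involve estimates, credits, and exceptions this sketch ignores:

```python
# Assumed cumulative installment schedule (general Indian advance-tax rule):
# 15% by Jun 15, 45% by Sep 15, 75% by Dec 15, 100% by Mar 15.
SCHEDULE = [("Jun 15", 0.15), ("Sep 15", 0.45), ("Dec 15", 0.75), ("Mar 15", 1.00)]

def advance_tax_installments(estimated_annual_tax: float) -> list:
    """Return (due_date, amount_payable_this_installment) pairs."""
    rows, paid = [], 0.0
    for due_date, cumulative in SCHEDULE:
        # Each installment tops the total paid up to the cumulative target.
        due_now = estimated_annual_tax * cumulative - paid
        rows.append((due_date, round(due_now, 2)))
        paid += due_now
    return rows

installments = advance_tax_installments(1_000_000)
```

For an estimated annual tax of ₹10,00,000 this yields installments of ₹1,50,000, ₹3,00,000, ₹3,00,000, and ₹2,50,000, summing back to the estimate.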

Posted 1 week ago

Apply

1.0 - 2.0 years

0 Lacs

Kochi, Kerala, India

On-site

Join Vertex Securities, a leading stock broking company, as a Marketing Officer and play a vital role in shaping our brand and outreach strategies. Ideal candidates should be passionate about driving impactful marketing campaigns. Job Responsibilities: Creating and delivering messages to communicate with customers, prospects and other target audiences. Creating content for e-mail marketing, social media, banners, posters and brochures. Writing and updating website content regularly. An overall understanding of and interest in branding, media, advertising and PR. Supporting the marketing team with tools, creatives and materials. Managing and scheduling social media posts and campaigns. Running digital marketing campaigns across platforms. Performing SEO for website and blog content. Qualifications: Graduation; specialisation in Marketing/PR will be an added advantage. Experience: 1-2 years

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald’s: One of the world’s largest employers, with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald’s global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe. Company Description: McDonald’s new growth strategy, Accelerating the Arches, encompasses all aspects of our business as the leading global omni-channel restaurant brand. As the consumer landscape shifts, we are using our competitive advantages to further strengthen our brand. One of our core growth strategies is to Double Down on the 3Ds (Delivery, Digital and Drive Thru). McDonald’s will accelerate technology innovation so 65M+ customers a day will experience a fast, easy experience, whether at one of our 25,000 and growing Drive Thrus, through McDelivery, dine-in or takeaway. Leading this revolution is McDonald’s Global Technology organization, made up of intrapreneurs who get to build really cool tech with scary-smart people using the latest innovations like AI, IoT, and edge computing. We do this working alongside diverse, global teams who are always hungry for a challenge. It’s a bonus when you get to see your family and friends use the tech you build at their favorite McD restaurant. Job Description: This opportunity is part of the Global Technology Enterprise Products & Platforms (EPP) Team, focused on the Treasury and Tax applications Kyriba and Vertex, where our vision is to always be a people-led, product-centric, forward-thinking & trusted technology partner. 
The Senior Technical Analyst supports the Technical Product Management leadership on technical/IT delivery topics (e.g., trade-offs in implementation approaches, tech stack selection) and provides technical guidance for developers/squad members. Manages the output (quality and efficiency) of internal/external squads to ensure they are delivering to the standards required by McDonald's. Participates in roadmap and backlog preparation. Builds and maintains technical process flows and solution architecture diagrams at the product level. Leads acceptance criteria creation and validation of development work. Supports hiring and development of engineers, and also acts as a technical developer assisting the development team. This position reports to the Technical Product Manager.

Responsibilities & Accountabilities:

Product roadmap and backlog preparation: In partnership with the TPM, participate in roadmap and backlog prioritization, providing technical perspectives when translating Epics into user stories for the development team, as well as in backlog refinement. Analyze existing business processes, workflows, and systems to identify inefficiencies, bottlenecks, and opportunities for improvement. Build and update documents required as part of the product development lifecycle, in close coordination with product, technical, and business teams (business and functional requirements for feature development, functional test scripts, and process documents). Create detailed requirements documents, user stories, use cases, and acceptance criteria to guide the development and implementation of technology solutions. Ensure that requirements are clear, comprehensive, and aligned with business goals.
Agile ceremonies: Attend all product team ceremonies and act as leader of the Software Development Engineers.

Technical solutioning and feature development/releases: Work with boundary and integration systems to troubleshoot and mitigate any source/destination issues and requirements. Work with and support Business Users and Product Teams on Incident, Request, Problem, Change, and Knowledge Management. Apply knowledge of the Agile software development process, including Agile techniques and delivery practices, and promote adoption of Agile methodologies to secure an outcome-driven mindset in product teams. Bring a fundamental understanding of Oracle Cloud ERP and areas such as: Oracle APIs, SQL, XML, PL/SQL, OTBI/BIP/FRS reports, FBDI, ADFDI, BPM workflows, BI Extract for FTP, Integration and Personalization, and Oracle Fusion Data Intelligence (FDI). Understand ESS Jobs, OM Extensions, Flex Fields (DFF, EFF, KFF), Lookups, Value Sets, and Fusion Apps Functional Setup Manager configurations. Analyze patches and plan patch deployment activities. Review and prioritize defects impacting customers. Serve as a point person for product questions and dig deep into problems to help find solutions. Work on market requirements, design solutions, and assess technical issues, driving resolutions with the team. Collaborate with other technology teams, including internal teams, service providers, and vendors. Ensure application service levels are maintained to agreed standards. Be accountable for the deployment of new features, including QA, push to production, and defect remediation. Ensure code development is in line with architectural, quality, and security standards and best practices. Maintain documentation standards for all software and lead acceptance criteria validation of development work.
Ensure product delivery is done to a high standard, with high performance across latency, scalability, extensibility, and security.

Qualifications:

Basic Qualifications: Bachelor’s degree in computer science or engineering. 5+ years of technical analyst experience with banking and/or tax systems. Experience working collaboratively with business partners to drive outcomes and solve complex technical challenges. Excellent interpersonal, written, and verbal communication skills, with the ability to understand business requirements and translate them into technical roadmaps. Ability to grasp technical concepts quickly and translate technical verbiage into easily understandable language. Solid grasp of the software development lifecycle and Agile development practices. Ability to contribute as a hands-on technologist when the need arises (e.g., writing and debugging scripts, setting up GitHub repos, troubleshooting issues with servers and prototypes). Experience applying data to make informed business decisions and to facilitate continuous improvement. Positive thinking skills, including creating a safe environment to learn and challenge, accepting risk-taking, and a willingness to be wrong. An experienced testing background with high attention to detail is desirable.

Preferred Qualifications: Experience in operational aspects of banking systems as they relate to Oracle ERP. Experience with Vertex or other tax engines is a plus. Knowledge of FTP/H2H connections, encryption/decryption, certificate renewals, Bank/Kyriba/ERP APIs, SWIFT protocols, FIN/FileAct channels, etc. Experience in connectivity projects and migrations, including bank connections and third-party integrations to our clients (SFTP, API, certificate renewal, encryption…). Understanding of Cash Management, Cash Positioning and Forecasting, Liquidity Management, Payments, and Payment Interfaces. Understanding of technical protocols such as TCP/IP, LAN/WAN, SSO, SAML 2.0, and APIs is a plus.
Strong technical debugging skills. Proficiency in SQL, data integration tools, and scripting languages. Experience creating and maintaining detailed requirements and technical documentation, including architecture diagrams, configuration guides, and user manuals. Strong analytical skills and the ability to solve complex business challenges related to lease and financial integration. Excellent communication and stakeholder management skills, with the ability to engage both business and IT teams. Experience working in product-centric organizations and/or product owner certification. Experience with JIRA and Confluence. Experience implementing and managing IT General Controls (ITGC) to ensure SOX compliance. Experience in DevOps, security, and systems performance is desirable. Understanding of cloud architecture and Oracle Cloud security. Foundational expertise in security: security standards, SSO, SAML, OAuth, etc. Knowledge of Oracle ERP Cloud Finance modules. Experience working with cybersecurity experts to ensure that systems are resilient against cyber threats and comply with relevant regulations and standards is desirable.
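The qualifications above lean heavily on SQL proficiency in a treasury/ERP reconciliation context. As a hedged illustration only — the table names, columns, and data below are invented for the example and are not an Oracle, Kyriba, or Vertex schema — a typical reconciliation query pattern (left join to find missing or mismatched records) might look like:

```python
import sqlite3

# Hypothetical tables: ERP-side payment records vs. bank statement lines.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE erp_payments (payment_id TEXT, amount REAL);
    CREATE TABLE bank_lines   (payment_id TEXT, amount REAL);
    INSERT INTO erp_payments VALUES ('P1', 100.0), ('P2', 250.0), ('P3', 75.0);
    INSERT INTO bank_lines   VALUES ('P1', 100.0), ('P2', 240.0);
""")

# Flag payments missing from the bank statement, or present with a different amount.
rows = conn.execute("""
    SELECT e.payment_id, e.amount AS erp_amount, b.amount AS bank_amount
    FROM erp_payments e
    LEFT JOIN bank_lines b ON b.payment_id = e.payment_id
    WHERE b.payment_id IS NULL OR b.amount != e.amount
    ORDER BY e.payment_id
""").fetchall()

for payment_id, erp_amount, bank_amount in rows:
    print(payment_id, erp_amount, bank_amount)
```

SQLite stands in here for whatever database the role actually uses; the join-and-compare shape is the portable part.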

Posted 1 week ago

Apply

0.0 years

0 - 0 Lacs

Gandhinagar, Gujarat

On-site

FinDocGPT Internship Opportunity – AI/ML & GenAI Developer Intern
Company: ArgyleEnigma Tech Labs
Product: FinDocGPT – India’s first AI-powered assistant that decodes complex financial documents in simple, regional language
Stipend: ₹8,000 – ₹12,000/month
Duration: 6 months
Start Date: Immediate

About Us: At FinDocGPT, we are building a revolutionary product that empowers millions of Indians to understand financial documents like health insurance, loans, and mutual fund T&Cs in their own language, using cutting-edge AI/ML and GenAI technologies. We’re backed by Google for Startups, supported by the Reserve Bank Innovation Hub, and on a mission to bridge the financial literacy gap in India.

Intern Role – AI/ML & Generative AI Developer: We are looking for highly motivated interns who want to gain hands-on experience deploying GenAI models, NLP pipelines, and ML-based document processing.

Responsibilities:
- Work on document classification, NER, and summarization using LLMs like LLaMA, Mistral, Groq, and open-source models.
- Fine-tune models or use APIs such as LangChain, HuggingFace, OpenAI, and Google Vertex AI.
- Preprocess, clean, and structure financial documents (PDFs, scans, emails, etc.).
- Implement prompt engineering and RAG-based workflows.
- Collaborate with product & design to build smart, user-friendly interfaces.

Requirements:
- Strong understanding of Python, NLP, and basic ML concepts.
- Familiarity with transformer architectures such as BERT, T5, or GPT.
- Experience (even academic) with HuggingFace, LangChain, or LLM APIs.
- Bonus: exposure to Google Cloud, AWS, or Docker.
- Hunger to learn, experiment fast, and make a real impact.

What You’ll Gain:
- Direct mentorship from industry experts (ex-Morgan Stanley, PIMCO, Google).
- Real-world exposure to India-first AI use cases.
- Opportunity to convert to a full-time role based on performance.
- Work that will directly impact financial inclusion in India.

How to Apply: Send your CV, GitHub/portfolio, and 2–3 lines on why you want this role to info@arglyeenigma.com or https://tinyurl.com/hr-aetl with the subject “Internship Application – FinDocGPT”.

Job Type: Internship
Contract length: 6 months
Pay: ₹8,000.00 – ₹12,000.00 per month
Ability to commute/relocate: Gandhinagar, Gujarat: Reliably commute or plan to relocate before starting work (Required)
Work Location: In person
Expected Start Date: 01/08/2025
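The responsibilities above include implementing RAG-based workflows over financial documents. A minimal, dependency-free sketch of the retrieval step — the document, query, and naive word-overlap scoring are invented for illustration; a real pipeline would use embeddings and a vector store instead:

```python
def split_sentences(text):
    """Naive sentence-level chunker; production systems use smarter splitting."""
    return [s.strip() + "." for s in text.split(".") if s.strip()]

def retrieve(query, chunks, k=1):
    """Rank chunks by word overlap with the query. A real RAG system would
    embed both sides and do a vector similarity search (e.g. in FAISS)."""
    q = set(query.lower().split())
    return sorted(chunks, key=lambda c: len(q & set(c.lower().split())), reverse=True)[:k]

def build_prompt(query, context_chunks):
    """Assemble a grounded prompt; the actual LLM call is omitted here."""
    return "Answer using only this context:\n" + "\n".join(context_chunks) + f"\n\nQuestion: {query}"

doc = ("The policy covers hospitalization expenses up to 5 lakh. "
       "Pre-existing diseases are covered after a waiting period of 3 years.")
query = "What is the waiting period for pre-existing diseases?"
top = retrieve(query, split_sentences(doc))
prompt = build_prompt(query, top)
print(prompt)
```

The interesting part for an intern is that the whole pattern — chunk, retrieve, stuff context into a prompt — survives intact when the toy scorer is swapped for real embeddings.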

Posted 1 week ago

Apply

1.0 - 3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Overview: We are looking for a hybrid AI Platform Engineer to lead the design of high-performance prompts and the development of custom AI agents and integrations across our internal platforms. Collaborating closely with our AI Platform Manager and functional business units, this role bridges the creative precision of prompt engineering with the technical fluency required to build and deploy powerful LLM-based tools using APIs, system integrations, and emerging AI infrastructure.

Key Responsibilities:

Prompt Engineering & Optimization: Design, test, and refine prompts for general and use-case-specific applications. Evaluate and improve LLM output quality based on accuracy, latency, hallucination, and user satisfaction. Maintain a scalable prompt library with robust documentation and experimentation tracking.

Custom Agent & GPT Development: Build and maintain custom GPTs (via OpenAI) or Google agents (Gemini, Vertex AI). Configure tools, memory, system prompts, and API-calling capabilities for agent workflows. Ensure agents follow internal data governance and compliance policies.

API & Platform Integration: Integrate LLM functionality into internal systems (e.g., dashboards, Slack, CRMs, internal portals). Work with APIs from OpenAI, Google Cloud, Azure, and others to create functional pipelines. Support rapid prototyping and deployment of AI features in close coordination with product and automation teams.

Evaluation & Instrumentation: Define key performance metrics (quality, reliability, usage patterns). Collaborate with data engineering to log, analyze, and visualize LLM performance. Implement feedback loops to continuously optimize prompt and agent behavior.

Enablement & Documentation: Provide tooling and guidance to business users and product teams for prompt-based workflows. Contribute to internal AI playbooks and best practices around prompt usage and API integration.
Requirements: 1-3 years of experience in AI/ML, NLP applications, or full-stack product development. Hands-on experience with OpenAI APIs, Google Cloud Vertex AI/Gemini, or similar platforms. Strong experience with prompt engineering and LLM evaluation methods. Proficiency in Python and working with APIs, JSON, and vector databases. Familiarity with LangChain, LlamaIndex, or custom agent frameworks. Excellent collaboration and communication skills.
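One responsibility above is maintaining a scalable prompt library with documentation and experimentation tracking. A minimal sketch of what such a registry could look like — the class names, fields, and metrics are assumptions for illustration, not a description of any particular tool:

```python
from dataclasses import dataclass, field

@dataclass
class PromptVersion:
    template: str              # prompt text with {placeholders}
    notes: str = ""            # documentation for this iteration
    metrics: dict = field(default_factory=dict)  # e.g. accuracy, latency, user rating

class PromptLibrary:
    """Tracks named prompts across versions so experiments stay reproducible."""

    def __init__(self):
        self._prompts = {}     # name -> list of PromptVersion

    def register(self, name, template, notes=""):
        """Add a new version of a prompt; returns its version index."""
        self._prompts.setdefault(name, []).append(PromptVersion(template, notes))
        return len(self._prompts[name]) - 1

    def render(self, name, version=-1, **kwargs):
        """Fill a version's template (latest by default) with variables."""
        return self._prompts[name][version].template.format(**kwargs)

    def record_metric(self, name, version, key, value):
        """Attach an evaluation result to a specific version."""
        self._prompts[name][version].metrics[key] = value

lib = PromptLibrary()
v0 = lib.register("summarize", "Summarize this document:\n{doc}", notes="baseline")
v1 = lib.register("summarize", "Summarize in 3 bullet points:\n{doc}", notes="constrained output")
lib.record_metric("summarize", v1, "user_rating", 4.5)
print(lib.render("summarize", doc="Quarterly revenue rose 12%."))
```

Versioning plus attached metrics is the core of experiment tracking: an A/B comparison between v0 and v1 is just a lookup, not an archaeology project.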

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Your Qualifications: Bachelor's degree with more than 6 years of relevant experience. Experience analyzing complex problems and translating them into data science algorithms. Good grasp of NLP and GenAI, with project experience. Experience in machine learning, supervised and unsupervised: NLP, classification, data/text mining, multi-modal supervised and unsupervised models, neural networks, deep learning algorithms. Experience in statistical learning: predictive & prescriptive analytics, web analytics, parametric and non-parametric models, regression, time series, dynamic/causal models, statistical learning, guided decisions, topic modeling. Experience with big data analytics: identifying trends, patterns, and outliers in large volumes of data. Embedding generation from training materials; storage and retrieval from vector databases; set-up and provisioning of managed LLM gateways; development of retrieval-augmented generation (RAG) based LLM agents; model selection; iterative prompt engineering and fine-tuning based on accuracy and user feedback; monitoring and governance. Lead role mentoring multiple junior analysts on approach and results.
Strong experience in Python, PySpark, Google Cloud Platform, Vertex AI, Kubeflow, and model deployment. Strong experience with big data platforms: Hadoop (Hive, MapReduce, HQL, Scala).

Responsibilities: Play a key role in solving complex business problems and drive actionable insights from petabytes of data. Utilize a product mindset to build, scale, and deploy holistic data science products after successful prototyping. Demonstrate an incremental solution approach, with an agile and flexible ability to overcome practical problems. Lead small teams and participate in data science project teams by serving as the technical lead. Partner with senior team members to assess customer needs and define business questions. Clearly articulate and present recommendations to business partners, and influence future plans based on insights. Partner and engage with associates in other regions to deliver the best services to customers around the globe. Work with a customer-centric mindset to deliver high-quality, business-driven analytic solutions. Mentor peers and analysts across the division in analytical best practices. Drive innovation in approach, method, practices, process, outcome, delivery, or any component of end-to-end problem solving.
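The qualifications above reference embedding generation and retrieval from vector databases. The core retrieval operation reduces to nearest-neighbor search by cosine similarity; here is a dependency-free sketch with invented 3-dimensional "embeddings" (real systems use model-generated vectors of hundreds of dimensions behind an approximate index such as FAISS or Vertex AI Vector Search):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def nearest(query_vec, store, k=2):
    """Brute-force top-k by cosine similarity; vector databases replace
    this linear scan with an approximate nearest-neighbor index."""
    return sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)[:k]

# Toy corpus of (text, embedding) pairs; the vectors are made up for illustration.
store = [
    ("refund policy", [0.9, 0.1, 0.0]),
    ("delivery times", [0.0, 0.9, 0.2]),
    ("return window", [0.8, 0.2, 0.1]),
]
hits = nearest([1.0, 0.0, 0.0], store)
print([text for text, _ in hits])
```

Everything above the index choice — embed, score, take top-k — is the same whether the store holds three toy vectors or millions of document chunks.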

Posted 1 week ago

Apply

8.0 years

0 Lacs

Sholinganallur, Tamil Nadu, India

On-site

Role: MLE + Vertex AI
Mode: Permanent, full time
Experience: 4-8 years

Job Description: The candidate should be a self-starter, able to contribute independently in the absence of guidance. Strong Vertex AI experience is a prerequisite, including moving multiple MLE workloads onto Vertex AI. The client is not looking to act as guides/mentors. “They are seeking an MLE with hands-on experience in delivering machine learning solutions using Vertex AI and strong Python skills. The person must have 5+ years of experience, with 3+ in MLE. Advanced knowledge of machine learning, engineering industry frameworks, and professional standards. Demonstrated proficiency using cloud technologies and integrating with ML services, including GCP Vertex AI, DataRobot, or AWS SageMaker, in large and complex organisations, and experience with SQL and Python environments.” Experience in technology delivery, waterfall and agile. Python and SQL skills. Experience with distributed programming (e.g., Apache Spark, PySpark). Software engineering experience/skills. Experience working with big data cloud platforms (Azure, Google Cloud Platform, AWS). DevOps experience. CI/CD experience. Experience with unit testing and TDD. Experience with infrastructure as code. Direct client interaction.

Must Have Skills: Vertex AI, MLE, AWS, Python, SQL

Interested candidates can reach us @ 7338773388 or careers@w2ssolutions.com & hr@way2smile.com
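Among the must-haves above are unit testing and TDD for ML engineering work. A small illustration of the practice — the preprocessing function and its expected behaviour are invented for the example, not taken from the role:

```python
def standardize(values):
    """Scale a feature to zero mean and unit variance, a common ML preprocessing step."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = var ** 0.5
    if std == 0:
        return [0.0] * n  # constant feature: define the output as all zeros
    return [(v - mean) / std for v in values]

# TDD-style checks written alongside (or before) the implementation:
out = standardize([2.0, 4.0, 6.0])
assert abs(sum(out)) < 1e-9                                 # zero mean
assert abs(sum(v * v for v in out) / len(out) - 1) < 1e-9   # unit variance
assert standardize([5.0, 5.0]) == [0.0, 0.0]                # edge case: zero spread
print("all checks passed")
```

The edge case is the point: writing the zero-spread test first forces a decision about behaviour that would otherwise surface as a division-by-zero crash in production.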

Posted 1 week ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

Remote

Job Information:
Job Opening ID: JRF520
Date Opened: 07/23/2025
Job Type: Full time
Industry: IT Services
City: Bangalore South
State/Province: Karnataka
Country: India
Zip/Postal Code: 560102

Job Description: As an AI Engineer, your role typically involves programming. You will be responsible for your commitments in terms of time, effort, and quality of work. You will likely be part of a larger offshore team and are expected to work collaboratively with your peers onsite and offshore to deliver milestone/sprint-based deliverables. Typical expected activities are: Program and deliver as per the scope provided by the delivery leads/onsite managers. Actively participate in discussions/scrum meetings to comprehend and understand your scope of work, and deliver as per your estimates/commitments. Proactively reach out to others when you need assistance and to showcase your work. Work independently on your assigned work.

Requirements: Should have experience in the below:
- Python (advanced proficiency)
- PyTorch / TensorFlow (model development & deployment)
- LangChain / LlamaIndex (LLM orchestration)
- HuggingFace Transformers
- OpenAI, AWS Bedrock, Vertex AI, or Azure OpenAI APIs
- Vector databases for RAG (Weaviate / Milvus / FAISS / MongoDB)
- MongoDB / PostgreSQL (structured data retrieval & joins)
- Redis / DynamoDB (fast caching & lookup)
- Kafka / RabbitMQ (message queues for real-time inference)
- FastAPI / Flask (backend APIs for ML serving)
- Docker (containerization for model inference)
- CI/CD pipelines (GitHub Actions / GitLab CI / Jenkins)
- MLflow / Weights & Biases (experiment tracking & model management)
- S3 / GCS (storage for model artifacts & datasets)
- ElasticSearch / OpenSearch (search over structured/unstructured text)
- Linting & testing: Pytest, Black, Ruff, Flake8
- Type hinting, documentation standards, YAML/JSON config management
- LLM training: fine-tuning (PEFT), pre-training

Should be familiar with at least two GenAI platforms, such as Amazon Bedrock, Copilot, Vertex AI, etc. Should be familiar with at least two models, such as Claude, ChatGPT, Gemini, etc. Should be familiar with the RAG pattern and agentic AI models.

Architectural Requirements: Strong understanding of AI system design: data ingestion → preprocessing → model → retrieval → response. Experience leading teams on LLM/NLP/ML projects end-to-end. Excellent architectural decision-making and a scalability mindset. Familiarity with prompt engineering, evaluation metrics, and benchmarking. Strong communication, documentation, and client-handling skills.

Benefits: Insurance benefits for self and spouse, including maternity benefits. Ability to work on many products, as the organization is focused on working with several ISVs. Monthly sessions to understand the direction of each function, and an opportunity to interact with the entire hierarchy of the organization. Celebrations are commonplace, physical or virtual; participate in several games with your coworkers. Voice your opinions on topics other than your work in Chimera Talks. Hybrid working models: remote + office.
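The listing above asks for familiarity with agentic AI models. At its core, an agent is a loop that parses the model's chosen action, dispatches to a tool, and feeds the observation back. A dependency-free sketch with a hard-coded stand-in for the LLM call — the action format, tool names, and `fake_llm` function are all invented for illustration, not any framework's API:

```python
def fake_llm(prompt):
    """Stand-in for a real LLM call (e.g. Bedrock / Vertex AI / OpenAI).
    Returns an action string the agent loop can parse."""
    if "Observation:" in prompt:
        return "FINAL: " + prompt.rsplit("Observation: ", 1)[1]  # answer from tool result
    if "42 * 17" in prompt:
        return "CALL calculator: 42 * 17"
    return "FINAL: I don't know"

TOOLS = {
    # Toy calculator only; never eval untrusted input in real systems.
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
}

def run_agent(task, max_steps=3):
    """Minimal observe-act loop: ask the model, run the chosen tool, feed back the result."""
    prompt = task
    for _ in range(max_steps):
        action = fake_llm(prompt)
        if action.startswith("FINAL:"):
            return action.removeprefix("FINAL:").strip()
        tool_name, arg = action.removeprefix("CALL ").split(":", 1)
        observation = TOOLS[tool_name.strip()](arg.strip())
        prompt = f"{task}\nObservation: {observation}"
    return "step limit reached"

print(run_agent("What is 42 * 17?"))
```

Swapping `fake_llm` for a real model call and `TOOLS` for real functions is exactly what frameworks like LangChain automate; the loop itself is this small.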

Posted 1 week ago

Apply