5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You will be responsible for developing, training, and fine-tuning machine learning models for AI/ML applications. This includes designing and implementing data pipelines for data processing, model training, and inference. Additionally, you will deploy models using MLOps practices and integrate them with cloud infrastructure. Collaboration with product managers and designers to conceptualize AI-driven features will also be a key part of your role, and you will be expected to research and implement ML and AI techniques to improve performance.

To excel in this role, you should be proficient in Python and ML frameworks such as Scikit-learn, XGBoost, TensorFlow, and PyTorch. Experience with SQL and ETL data pipelines, including data processing and feature engineering, will be beneficial. Familiarity with Docker and container-based deployments for building cloud-agnostic products is required. A strong understanding of AI and machine learning concepts such as supervised learning, unsupervised learning, deep learning, and reinforcement learning is essential. Knowledge of at least one cloud platform (AWS, Azure, GCP) and ML deployment strategies, preferably Azure, is preferred. Exposure to LLMs (e.g., OpenAI, Hugging Face, Mistral) and foundation models will be an advantage, and an understanding of common statistical models is also expected.

If you have 5 to 7 years of experience in the relevant field and possess the skills and qualifications above, we would like to hear from you.
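As an illustration of the train-then-infer workflow this posting describes, here is a minimal sketch in plain Python: scale a feature, fit a model, and run inference on new data. The gradient-descent linear model below is an illustrative stand-in for the Scikit-learn/XGBoost stacks the posting names, not anything specific to this role.

```python
# Minimal train/predict pipeline sketch in pure Python.
# Stand-in for a Scikit-learn/XGBoost workflow: scale features,
# fit a linear model by gradient descent, then run inference.

def standardize(xs):
    """Feature engineering step: zero-mean, unit-variance scaling."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    std = var ** 0.5 or 1.0
    return [(x - mean) / std for x in xs], mean, std

def train(xs, ys, lr=0.1, epochs=500):
    """Fit y = w*x + b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = sum((w * x + b - y) * x for x, y in zip(xs, ys)) * 2 / n
        grad_b = sum((w * x + b - y) for x, y in zip(xs, ys)) * 2 / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy "training" data generated from y = 3x + 1.
raw_x = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [4.0, 7.0, 10.0, 13.0, 16.0]

scaled_x, mean, std = standardize(raw_x)
w, b = train(scaled_x, ys)

# Inference: scale the incoming feature with the training statistics.
new_x = (6.0 - mean) / std
prediction = w * new_x + b
print(round(prediction, 2))
```

The same shape carries over to real pipelines: the scaling statistics learned at training time must be reused at inference time, which is exactly what MLOps tooling packages alongside the model.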
Posted 1 month ago
2.0 - 7.0 years
5 - 11 Lacs
Pune
Work from Office
Role: Generative AI Engineer (Contract)
Experience: 2+ years in AI/ML development
Notice Period: Immediate joiner / up to 30 days
Location: Pune (preference for candidates based in Pune)

Key Responsibilities: Design and deploy Generative AI models (LLMs, diffusion models) on Google Cloud Platform (GCP) using Vertex AI, BigQuery, and Cloud Storage. Develop end-to-end AI solutions including data pipelines, APIs, and microservices. Work on prompt engineering to optimize AI model performance. Analyze datasets to uncover insights and optimize business outcomes. Collaborate with cross-functional teams for seamless AI integration.

Must-Have Skills: Strong experience with Generative AI, LLMs, and diffusion models. Hands-on experience with Google Cloud Platform (GCP), especially Vertex AI. Proficiency in Python, TensorFlow/PyTorch, and machine learning/deep learning. Experience with Power BI for data visualization. Excellent problem-solving and collaboration skills.

Good to Have: MLOps, Docker/Kubernetes, chatbot development experience.
Posted 1 month ago
0.0 - 2.0 years
0 - 1 Lacs
Gurugram
Work from Office
Hiring AI Trainees in Gurgaon! Work on cutting-edge Agentic AI in healthcare (RAG, LLMs, automation workflows). Ideal for freshers eager to build real-world AI apps. On-site role. Apply: mcuraimplementation@gmail.com, humanresource@mcura.com
Posted 1 month ago
6.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Python programming expertise: data structures, OOP, recursion, generators, iterators, decorators, and familiarity with regular expressions. Working knowledge of and experience with a deep learning framework (PyTorch or TensorFlow), including embedding representations. Strong experience in Python and production system development. Familiarity with SQL database interactions; Elasticsearch document indexing and querying; Docker and Dockerfiles; REST APIs and JSON structure; Python packages like FastAPI; Git operations; shell scripting; and PyCharm for development, debugging, and profiling. Experience with Kubernetes. Experience with LLMs and GenAI.

Desired Skills: NLP toolkits like NLTK, spaCy, Gensim, and scikit-learn. Familiarity with basic natural language concepts and handling: tokenization, lemmatization, stemming, edit distances, named entity recognition, syntactic parsing, etc. Good knowledge of and experience with PyTorch or TensorFlow. More complex operations with Elasticsearch, such as creating indices and indexable fields. Good experience with Kubernetes.
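To make the Python-language items above concrete, here is a small self-contained sketch combining three of them: a decorator, a generator, and a regular expression. The function names are illustrative, not taken from the posting.

```python
import re
from functools import wraps

def logged(fn):
    """Decorator: records each call's positional arguments on the wrapper."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        wrapper.calls.append(args)
        return fn(*args, **kwargs)
    wrapper.calls = []
    return wrapper

@logged
def tokens(text):
    """Generator: lazily yields lowercase word tokens via a regex."""
    for match in re.finditer(r"[A-Za-z]+", text):
        yield match.group(0).lower()

# The generator is lazy; list() drives it to completion.
result = list(tokens("Python 3.12: decorators, generators, iterators!"))
print(result)
print(len(tokens.calls))
```

`functools.wraps` keeps the wrapped function's name and docstring intact, which matters once decorated code meets debuggers and profilers like the PyCharm tooling the posting mentions.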
Posted 1 month ago
3.0 - 6.0 years
8 - 18 Lacs
Bengaluru
Work from Office
RPA Developer
Location: Bangalore [BTM Layout]
Work Model: Hybrid (3 days office, 2 days WFH)
Experience: 3+ years

We have an opening in RPA (APA). The skills below are required along with A360: Proof of Concept (POC) or project experience with LLMs (e.g., ChatGPT). Exposure to AWS AI and Claude Sonnet (AI/LLM-related work). Experience with Automation Anywhere and its integration with AI solutions. Familiarity with APA (Agentic Process Automation).

Responsibilities: Provide guidance with process design. Design, develop, and test automation workflows. Deploy RPA components including bots, robots, development tools, code repositories, and logging tools. Support the launch and implementation of RPA solutions. Create process and end-user documentation. Assure the quality of the automation (QA processes). Work with Business Analysts, Scrum Masters, QA Analysts, Product Owners, and other cross-functional resources to define and deliver business-impacting projects.

Certifications: 1. Automation Anywhere RPA Developer

Qualifications preferred: 1. Graduation: Bachelor's in engineering or computer science. 2. Post-Graduation (preferred): Master's in engineering or computer science.
Posted 1 month ago
1.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced Reporting GenAI Consultant with a strong background in developing AI-driven reporting solutions. Your role involves building and integrating Generative AI capabilities into BI platforms to enable natural language insights, automated report generation, and interactive dialogue with data. You will leverage your hands-on experience with LLMs, prompt engineering, and modern data visualization tools to deliver innovative reporting solutions.

Your responsibilities include designing, developing, and deploying GenAI-based reporting solutions that generate insight summaries, dashboards, and narrative analytics from structured and unstructured data. You will build natural language interfaces and conversational agents for querying data, enabling users to interact with reports in plain English. Additionally, you will integrate GenAI services such as ChatGPT, Azure OpenAI, or Vertex AI with enterprise BI platforms such as Power BI, Tableau, Qlik, and ThoughtSpot. Furthermore, you will implement automated insight generation using LLMs to summarize trends, detect anomalies, and generate key takeaways. Collaboration with data engineering and BI teams is crucial to optimize data models and ensure clean, prompt-ready datasets. You will design and fine-tune prompts and templates for contextual report summarization and storytelling, and conduct POCs and pilots to evaluate the feasibility and impact of GenAI-driven reporting use cases. It is essential to ensure that solutions are secure, scalable, and compliant with enterprise governance policies.

To excel in this role, you should have 10+ years of experience in Business Intelligence/Analytics, with 1-2 years in Generative AI implementations. Strong experience in Power BI with exposure to augmented analytics features is required. Your expertise should include working with LLMs for natural language understanding and summarization, prompt engineering, few-shot learning, and custom summarization models. A good understanding of data storytelling, narrative generation, and auto-generated insights is essential. Experience in integrating APIs for AI models into web or reporting tools is beneficial, along with familiarity with Python or JavaScript for model integration and backend logic. Excellent communication and stakeholder management skills are also necessary.

Preferred qualifications include experience with RAG (Retrieval-Augmented Generation), LangChain, or similar frameworks; exposure to voice-based analytics or speech-to-insight solutions; knowledge of data governance, privacy (GDPR/CPRA), and enterprise security standards; and familiarity with cloud platforms like Azure.
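The prompt-and-template design work this posting describes might be sketched as below. The template wording, the few-shot example, and the metrics schema are all illustrative assumptions; the function only assembles the prompt and does not call any particular LLM API.

```python
# Sketch of a few-shot prompt template for report summarization.
# The template text and the example row are illustrative assumptions;
# the assembled prompt would be sent to whichever LLM the platform uses.

FEW_SHOT_EXAMPLE = (
    "Data: region=EU, revenue_change=+12%, churn_change=-1.5%\n"
    "Summary: EU revenue grew 12% while churn improved by 1.5 points."
)

TEMPLATE = (
    "You are a BI assistant. Summarize the metrics as one narrative insight.\n\n"
    "{example}\n\n"
    "Data: {data}\n"
    "Summary:"
)

def build_prompt(metrics):
    """Flatten a metrics dict into the few-shot template."""
    data = ", ".join(f"{k}={v}" for k, v in metrics.items())
    return TEMPLATE.format(example=FEW_SHOT_EXAMPLE, data=data)

prompt = build_prompt({"region": "APAC", "revenue_change": "+8%"})
print(prompt)
```

Ending the prompt at "Summary:" steers the model to complete the narrative directly, which is the usual shape for automated-insight prompts of this kind.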
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Software Engineer III at JPMorgan Chase within the AI/ML Data Platform team, you will play a crucial role in designing and delivering cutting-edge technology products in a secure, stable, and scalable manner. Your expertise will be instrumental in implementing critical technology solutions across various technical domains to support the firm's business objectives effectively. You will collaborate closely with business stakeholders, product teams, and technology experts to develop software solutions that align with strategic goals. Your responsibilities will include architecting, designing, and developing AI products using generative AI, natural language processing, and other AI-ML technologies. Working alongside software developers, data scientists, and product teams, you will establish timelines for product features and ensure effective communication with business stakeholders. In this role, you will conduct data modeling for AI software solutions, devise data persistence strategies, and create robust data pipelines. Setting coding standards for repositories, performing code reviews, and overseeing product deployments on public and private clouds will also be part of your responsibilities. You will be responsible for managing server costs through monitoring and tuning to ensure efficient operations. To qualify for this role, you should have formal training or certification in software engineering concepts along with a minimum of 3 years of practical experience. Your hands-on experience should cover system design, application development, testing, operational stability, and Agile SDLC. Proficiency in Python, Java, and JavaScript is essential, along with expertise in technologies like FastAPI, Spring, Agent Building tools, and LLMs. Additionally, you should possess advanced knowledge of automation and continuous delivery methods, with a strong grasp of agile methodologies such as CI/CD, Application Resiliency, and Security. 
Demonstrated proficiency in software applications and technical processes related to cloud, AI, ML, and mobile technologies is crucial. A deep understanding of the financial services industry, IT systems, microservice design patterns, data structures, algorithms, and cloud services like AWS and Terraform is highly desirable. Preferred qualifications include exposure to Python libraries like pandas, SciPy, and NumPy, as well as familiarity with Python concurrency through multiprocessing. Knowledge of grid computing concepts and the financial services industry will be advantageous in this role.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Analytics focused Senior Software Engineer at PubMatic, you will be responsible for developing advanced AI agents to enhance data analytics capabilities. Your expertise in building and optimizing AI agents, along with strong skills in Hadoop, Spark, Scala, Kafka, Spark Streaming, and cloud-based solutions, will play a crucial role in improving data-driven insights and analytical workflows. Your key responsibilities will include building and implementing a highly scalable big data platform to process terabytes of data, developing backend services using Java, REST APIs, JDBC, and AWS, and building and maintaining Big Data pipelines using technologies like Spark, Hadoop, Kafka, and Snowflake. Additionally, you will design and implement real-time data processing workflows, develop GenAI-powered agents for analytics and data enrichment, and integrate LLMs into existing services for query understanding and decision support. You will work closely with cross-functional teams to enhance the availability and scalability of large data platforms and PubMatic software functionality. Participating in Agile/Scrum processes, discussing software features with product managers, and providing customer support over email or JIRA will also be part of your role. We are looking for candidates with three plus years of coding experience in Java and backend development, solid computer science fundamentals, expertise in developing software engineering best practices, hands-on experience with Big Data tools, and proven expertise in building GenAI applications. The ability to lead feature development, debug distributed systems, and learn new technologies quickly are essential. Strong interpersonal and communication skills, including technical communications, are highly valued. To qualify for this role, you should have a bachelor's degree in engineering (CS/IT) or an equivalent degree from well-known Institutes/Universities. 
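The real-time processing workflows mentioned above can be pictured, in miniature, as the tumbling-window aggregation below. It is a pure-Python stand-in for what would normally run on Spark Streaming or a Kafka consumer, and the event schema (timestamp, impression count) is an assumption for illustration.

```python
from collections import defaultdict

# Pure-Python stand-in for a Spark Streaming / Kafka tumbling-window job:
# group (timestamp_secs, ad_impressions) events into fixed 10-second
# windows and sum impressions per window.

def tumbling_window_counts(events, window_secs=10):
    """Aggregate impression counts per non-overlapping time window."""
    windows = defaultdict(int)
    for ts, impressions in events:
        window_start = (ts // window_secs) * window_secs
        windows[window_start] += impressions
    return dict(sorted(windows.items()))

stream = [(1, 5), (4, 3), (11, 7), (19, 2), (25, 1)]
print(tumbling_window_counts(stream))
```

A real pipeline adds watermarking for late events and checkpointing for fault tolerance, but the window-key arithmetic is the same idea.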
PubMatic employees globally have returned to our offices via a hybrid work schedule to maximize collaboration, innovation, and productivity. Our benefits package includes paternity/maternity leave, healthcare insurance, broadband reimbursement, and office perks like healthy snacks, drinks, and catered lunches.

About PubMatic: PubMatic is a leading digital advertising platform that provides transparent advertising solutions to publishers, media buyers, commerce companies, and data owners. Our vision is to enable content creators to run a profitable advertising business and invest back into the multi-screen and multi-format content that consumers demand.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
Develop mathematical models for trading, forecasting, or data processing, and continuously enhance existing models. Analyze large datasets to identify trends and insights, and conduct research on financial markets and economic conditions. Utilize strong coding skills in Python and other languages to implement mathematical models. Explore unconventional data sources to drive innovation and enhance model performance. Translate mathematical models into algorithms for practical application. Manage risk by developing mathematical models to assess and mitigate risks associated with products or strategies. Build predictive models based on historical data using Monte Carlo modeling techniques. Demonstrate proficiency in writing pseudocode and Python code to test mathematical models. Possess a good understanding of trading systems and trade execution processes to minimize latency and slippage. Have comprehensive knowledge of financial and capital markets to make informed decisions. Oversee the development and implementation of investment strategies using mathematical and statistical tools. Utilize AI, machine learning, and linear latent models to enhance efficiency and productivity. Apply data science, mathematics, and statistics to develop or enhance investment strategies. Ensure compliance with SEBI regulations and other legal requirements. Identify and implement mathematical and statistical models for effective risk management of strategies and products. Establish and monitor internal controls to manage and mitigate risks efficiently.

Requirements:
- Bachelor's degree in Mathematics; Master's degree preferred, or currently pursuing a PhD in Mathematics.
- Relevant experience in mathematics, statistics, computer science, or finance is preferred.
- Minimum of 3-5 years of experience in building mathematical models and data science.
- Proven ability to develop and implement mathematical models in live market strategies.
- Uphold high ethical standards and prioritize clients' best interests.

This is a full-time position with a day shift schedule requiring in-person work.
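The Monte Carlo modeling this posting mentions can be sketched as a simulation of terminal prices under geometric Brownian motion. The drift, volatility, and starting price below are illustrative assumptions, not values from the posting; real desk models would calibrate them from market data.

```python
import math
import random

# Monte Carlo sketch: simulate one-year terminal prices under geometric
# Brownian motion, S_T = S_0 * exp((mu - sigma^2/2) + sigma * Z), and
# estimate the mean. All parameters here are illustrative assumptions.

def simulate_terminal_prices(s0, mu, sigma, n_paths, seed=42):
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    prices = []
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)  # one standard-normal draw per path
        prices.append(s0 * math.exp((mu - 0.5 * sigma**2) + sigma * z))
    return prices

paths = simulate_terminal_prices(s0=100.0, mu=0.05, sigma=0.2, n_paths=50_000)
estimate = sum(paths) / len(paths)
print(round(estimate, 1))
```

A useful sanity check on any such simulator: the analytic expectation is s0 * exp(mu), so the simulated mean should converge toward it as the path count grows.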
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
haryana
On-site
The Specialized Analytics Manager provides full leadership and supervisory responsibility. You will provide operational/service leadership and direction to team(s) and apply in-depth disciplinary knowledge through the provision of value-added perspectives or advisory services. You may contribute to the development of new techniques, models, and plans within the area of expertise. Excellent communication and diplomacy skills are required for this role. You will generally have responsibility for volume, quality, and timeliness of end results, and shared responsibility for planning and budgets. Your work will affect an entire area, which ultimately affects the overall performance and effectiveness of the sub-function/job family. You will have full supervisory responsibility, ensuring motivation and development of the team through professional leadership. This includes duties such as performance evaluation, compensation, hiring, discipline and terminations, as well as direction of daily tasks and responsibilities.

The Data Science Analyst is a developing professional role within the GWFO team. In this role, you will apply specialized knowledge in machine learning, statistical modeling, and data analysis to monitor, assess, analyze, and evaluate processes and data. You will identify opportunities to leverage advanced analytics to improve business outcomes, automate processes, and generate actionable insights. Additionally, you will contribute to the development and deployment of innovative AI solutions, including generative AI and agentic AI applications. You will work with cross-functional stakeholders to gather and process operational data from various sources to examine past business performance and identify areas for improvement.

As a successful candidate, you should ideally have 7+ years of relevant experience in data science, machine learning, or a related field. You should possess advanced process management skills; be organized and detail-oriented; be curious about learning and developing new skill sets, particularly in the area of artificial intelligence; and have a positive, can-do mindset. Strong programming skills in Python and proficiency in relevant data science libraries such as scikit-learn, TensorFlow, PyTorch, and Transformers are expected. Experience with statistical modeling techniques, building GenAI solutions, agentic AI frameworks, and data visualization tools such as Tableau or Power BI is required. Strong logical reasoning capabilities, willingness to learn new skills, and good communication and presentation skills are also essential. You should have a Bachelor's/University degree or equivalent experience.

This job description provides a high-level review of the types of work performed; other job-related duties may be assigned as required. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 1 month ago
6.0 - 13.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
We are seeking a highly experienced candidate with over 13 years of experience for the role of Technical Project Manager (Data) in the Trivandrum/Kochi location. As a Technical Project Manager, your responsibilities will revolve around owning the end-to-end delivery of data platform, AI, BI, and analytics projects, ensuring alignment with business objectives and stakeholder expectations. Your role will involve developing and maintaining comprehensive project plans, roadmaps, and timelines covering data ingestion, transformation, governance, AI/ML models, and analytics deliverables. Leading cross-functional teams comprising data engineers, data scientists, BI analysts, architects, and business stakeholders to deliver high-quality, scalable solutions within the defined budget and timeframe will be a key aspect of this role.

Furthermore, you will be responsible for defining, prioritizing, and managing product and project backlogs covering data pipelines, data quality, governance, AI services, and BI dashboards or reporting tools. Collaboration with business units to capture requirements and translate them into actionable user stories and acceptance criteria for data and analytics solutions is crucial. Overseeing BI and analytics areas, including dashboard development, embedded analytics, self-service BI enablement, and ad hoc reporting capabilities, will also be part of your responsibilities. It is imperative to ensure that data quality, lineage, security, and compliance requirements are integrated throughout the project lifecycle in collaboration with governance and security teams. Coordinating UAT, performance testing, and user training to ensure successful adoption and rollout of data and analytics products is vital. Acting as the primary point of contact for all project stakeholders, providing regular status updates, managing risks and issues, and escalating when necessary are essential aspects of this role. Additionally, you will facilitate agile ceremonies such as sprint planning, backlog grooming, demos, and retrospectives to foster a culture of continuous improvement, and drive post-deployment monitoring and optimization of data and BI solutions to meet evolving business needs and performance standards.

Primary skills required for this role:
- Over 13 years of experience in IT, with at least 6 years in roles such as Technical Product Manager, Technical Program Manager, or Delivery Lead
- Hands-on development experience in data engineering, including data pipelines, ETL processes, and data integration workflows
- Proven track record of managing data engineering, analytics, or AI/ML projects end to end
- Solid understanding of modern data architecture: data lakes, warehouses, pipelines, ETL/ELT, governance, and AI tooling
- Hands-on familiarity with cloud platforms (e.g., Azure, AWS, GCP) and DataOps/MLOps practices
- Strong knowledge of Agile methodologies, sprint planning, and backlog grooming
- Excellent communication and stakeholder management skills, including working with senior executives and technical leads

Secondary skills that would be beneficial:
- Background in computer science, engineering, data science, or analytics
- Experience with, or a solid understanding of, data engineering tools and services in AWS, Azure, and GCP
- Exposure to, or a solid understanding of, BI, analytics, LLMs, RAG, prompt engineering, or agent-based AI systems
- Experience leading cross-functional teams in matrixed environments
- Certifications such as PMP, CSM, SAFe, or equivalent are a plus

If you meet the above requirements and are looking for a challenging opportunity in Technical Project Management within the data domain, we encourage you to apply before the closing date on 18-07-2025.
Posted 1 month ago
1.0 - 5.0 years
0 Lacs
noida, uttar pradesh
On-site
Job Summary: As an Associate Product Manager (APM) at Rezo, you will play a key role in supporting the product development lifecycle from ideation to launch by gathering requirements, conducting market research, and collaborating with cross-functional teams. You will assist in defining product features, prioritizing enhancements, and ensuring successful product delivery and user satisfaction.

Key Responsibilities:
Product Requirements & Roadmapping:
- Gather and document product requirements from stakeholders and users.
- Assist in defining, prioritizing, and maintaining the product roadmap.
Market & User Research:
- Conduct market research to identify customer needs, market gaps, and competitive trends.
- Analyze user feedback and product analytics to inform feature enhancements.
Cross-Functional Collaboration:
- Coordinate with engineering, operations, marketing, and sales teams to ensure seamless product development and launch.
- Understand requirements raised by cross-functional teams and ensure a smooth development cycle.
Project Execution:
- Support the Product Manager in managing project timelines and deliverables.
- Track key metrics and report on product performance post-launch.
Quality Assurance & Documentation:
- Participate in product testing and quality assurance processes.
- Create and maintain product documentation, training materials, and user guides.
UI and Wireframing:
- Build wireframes for new feature requests, ensuring best-in-class user experience and design hygiene.
- Monitor and analyze competitor products and industry trends.

Requirements & Qualifications:
Education: Bachelor's degree in Engineering, Business Administration, Marketing, Computer Science, or a related field.
Experience:
- 1-3 years of experience in product management, project development, or a related area (internships excluded).
- Familiarity with product management tools (e.g., Figma, Confluence, JIRA) is a must.
Skills:
- Strong analytical and problem-solving abilities.
- Excellent communication and collaboration skills.
- Ability to manage multiple tasks and prioritize effectively.
- Basic understanding of LLMs, Generative AI, web technologies, and software development processes.
Attributes:
- Proactive, detail-oriented, and eager to learn.
- Comfortable working in a fast-paced, dynamic environment.

Why Join Rezo?
- Opportunity to work on innovative products with a talented, supportive team.
- Hands-on mentorship and career growth in product management.
- Collaborative and inclusive work culture.
Posted 1 month ago
3.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As part of a global leader in Cloud, AI, and Digital technologies, you will be responsible for leveraging GenAI models and frameworks to enhance the efficiency and productivity of businesses worldwide. You will serve as the primary technical expert for GenAI technologies, focusing on developing, implementing, and optimizing advanced AI models and solutions to expand our GenAI footprint across clients. You will provide technical leadership, drive innovation, and ensure the successful integration of AI technologies into our IT services for clients. Your responsibilities will include defining best practices, standards, and templates for AI solutions; conducting workshops, demos, and client presentations on GenAI use cases; and collaborating closely with cross-functional teams to identify, evaluate, and implement use cases across different customers. In addition, you will evaluate and integrate emerging GenAI technologies to enhance conversational AI capabilities; provide technical leadership, mentorship, and guidance to teams; and troubleshoot and resolve complex platform integration and functionality issues. You will also be hands-on, independently building quick prototypes and demonstrating them to key decision-makers.

To be successful in this role, you should have 10+ years of total IT experience, with at least 3-4 years of hands-on experience with AI/GenAI technologies. Expertise in NLP, NLU, NLQ, and NLG technologies is essential, along with core Python proficiency to build applications and engineer GenAI solutions at enterprise scale. You must have delivered production-grade projects on agentic AI frameworks, with hands-on experience in frameworks such as AutoGen, LangGraph, or CrewAI. Experience with NLQ-to-SQL GenAI applications using core Python tools and frameworks, as well as databases, data warehouses, and data lakes on cloud platforms, is required.

Deep knowledge of architecting product-scale GenAI applications on Microsoft Azure, AWS, or GCP cloud environments is a must. Strong programming skills in Python, Node.js, Java, or similar; API programming; and familiarity with LLMs, prompt engineering, fine-tuning, and embeddings are desired. Excellent problem-solving skills, solution architecture capabilities, and client-facing communication skills are essential, as is the ability to translate business needs into technical solutions and manage multiple projects and priorities in a fast-paced environment. Moreover, expertise in building technical proposals with winning engineering architectures, technical articulation for a CXO-level audience, and experience developing fast PoVs and blueprints on new and emerging GenAI models, tools, and frameworks will be advantageous for this role.
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
noida, uttar pradesh
On-site
Amity University Uttar Pradesh, Noida is inviting applications for the position of Consultant (Scientific Technical/Non-Medical) at the Amity Centre for Artificial Intelligence (ACAI). The position is for a duration of one year with a monthly compensation of Rs. 1 Lakh. The selected candidate will work on the project titled "Personalized Recommender System for Virus Research and Diagnosis Laboratory Network: Advancing Diagnostic Decision-Making through Artificial Intelligence", sponsored by the Indian Council of Medical Research (ICMR). The ideal candidate should have a B.Tech./M.Tech./M.Sc. or Ph.D. with an engineering or science background. Experience with deep learning models, LLMs, and Transformers is preferred. Desirable qualifications include 2-3 years of research experience in artificial intelligence, particularly in deep learning and Transformers, as well as publications in peer-reviewed journals (SCI/Scopus indexed). Interested candidates who meet the above qualifications are requested to submit their detailed CV via email to ai@amity.edu with a cc to sushils@amity.edu within 15 days of this advertisement. Please mention "Application for Consultant (Scientific Technical/Non-Medical) - Amity Centre for Artificial Intelligence" in the subject line of the email. Please note that no TA/DA will be provided to candidates for attending the interview.
Posted 1 month ago
6.0 - 11.0 years
20 - 35 Lacs
Hyderabad
Work from Office
Senior Data Scientist

As a Senior Data Scientist at TTEC Digital, you will work on a team with diverse clients and industries to create data-driven solutions, develop cutting-edge algorithms, and provide expert insights to increase our understanding of customer behavior and needs. You will help clients turn data into actionable insights that drive tangible outcomes, improve performance, and lead the market. This position requires an in-depth understanding and use of statistical and data analysis tools.

What you'll be doing:
- Build new analytics solutions; analyze pre-existing analytic solutions, provide suggestions on how to evolve them, and improve their efficiency and effectiveness
- Build client-facing presentations to review analytic results, conveying technical concepts around analytic solutions to non-technical audiences in a compelling story
- Where required, work as an independent contributor on projects, managing end-to-end deliverables
- Handle communications with internal and external stakeholders, scope client requirements, develop analytics frameworks/solutions, and implement and deploy solutions

What skillsets we are looking for:
- 4-6 years in the analytics domain developing and deploying ML/AI solutions
- 1+ years working on NLP projects: text mining, using LLMs, working with Hugging Face/Transformer models and publicly available LLM APIs
- Proficient understanding of machine learning and deep learning methods
- Excellent programming skills in Python, Pandas, PySpark, and SQL
- Excellent communication skills: ability to describe findings to technical and non-technical audiences
- Good knowledge of the MS Office suite of products (Excel and PowerPoint)
- Ability to create end-to-end, client-ready PowerPoint presentations (outline, visualization, results, summary)
- Experience with at least one of the following: Azure, GCP, Databricks, AWS; GCP is preferred

Who we are: TTEC Digital Consulting is a pioneer in customer experience, engagement, and growth solutions. We utilize a holistic approach, applying solutions from both our Engage and Digital segments to help companies provide an amazing experience to their customers, inspire customer loyalty, and grow their business. We provide end-to-end advisory and execution services, giving leaders the confidence and tools to rethink how to compete with a combination of our logic, intellect, and ability to make sense of the data with our creativity and experience.

TTEC Digital Analytics India, 12th Floor, Salarpuria Sattva Knowledge City, HITEC City, Hyderabad, Telangana 500081
Posted 1 month ago
1.0 - 5.0 years
0 Lacs
hyderabad, telangana
On-site
NTT DATA is looking for a GCP Python Gen AI LLM RAG Vertex AI specialist to join their team in Hyderabad, Telangana, India. As a potential candidate, you should have at least 4 years of Software Engineering experience or equivalent, demonstrated through work experience, training, military experience, or education. It is essential to have a minimum of 2 years of experience working with GCP (Google Cloud Platform) or an alternate public/hybrid cloud, delivering products with cloud services and cloud architectures at scale. In addition, you should have 2+ years of experience with Python and 3+ years of experience with GenAI, LLMs, RAG, vector databases, and conversational bots. Furthermore, 1+ years of experience with Playbooks and Vertex AI is required for this role, and hands-on exposure to ADK and Voice AI is a must. While not mandatory, experience with LangChain and/or LangGraph is considered a plus. Additionally, 4+ years of Contact Center industry experience would be advantageous, including design, development, testing, and integration with vendors, CRMs, and business applications. Proven knowledge of contact center subdomains such as IVR/IVA, NLU/NLP, real-time omni-channel agent experience, customer journey, and CX/AX experience optimization using AI/ML is beneficial. Moreover, familiarity with Node JS, Java, Spring Boot, Kafka, distributed caches (GemFire, Redis), Elasticsearch, GraphQL, NoSQL databases (Cassandra or Mongo), graph databases, and public cloud marketplace services is a good-to-have skill set. At least 2 years of experience with deep Domain-Driven Design and cloud-native microservices designed and developed for massive scale and seamless resiliency, deployed on PCF/VMware Tanzu, K8s, or serverless cloud technologies, is also an added advantage. NTT DATA is a trusted global innovator of business and technology services, with a commitment to helping clients innovate, optimize, and transform for long-term success.

Being a part of NTT DATA means being part of a diverse team of experts in over 50 countries, with a robust partner ecosystem. Their services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a leading provider of digital and AI infrastructure globally, and as part of the NTT Group, they invest significantly in R&D to help organizations and society move confidently and sustainably into the digital future. Visit their website for more information.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
maharashtra
On-site
You will be joining a renowned consulting firm known for being consistently ranked as one of the world's best places to work. The company has maintained a top position on Glassdoor's Best Places to Work list since 2009, emphasizing the importance of extraordinary teams in their business strategy. By intentionally bringing together diverse backgrounds, cultures, experiences, perspectives, and skills in a supportive and inclusive work environment, they ensure that every individual can thrive both professionally and personally. As part of the Application Engineering experts team within the AI, Insights & Solutions division, you will collaborate with a multidisciplinary group of professionals including analytics, engineering, product management, and design experts. Your role will involve leveraging deep technical expertise along with business acumen to assist clients in addressing their most transformative challenges. Working in integrated teams, you will develop data-driven strategies and innovative solutions to drive competitive advantage for clients by harnessing the power of data and artificial intelligence. Your responsibilities will include designing, developing, and maintaining cloud-based AI applications using a full-stack technology stack to deliver high-quality, scalable, and secure solutions. You will collaborate with cross-functional teams to define and implement analytics features, utilize Kubernetes and containerization technologies for deployment, develop APIs and microservices, ensure robust security measures, monitor application performance, contribute to coding standards, stay updated on emerging technologies, automate deployment processes, and collaborate closely with clients to assess opportunities and develop analytics solutions. To qualify for this position, you are required to have a Master's degree in Computer Science, Engineering, or a related technical field, along with at least 6 years of experience at a Senior or Staff level. 
Proficiency in client-side and server-side technologies, cloud platforms, Python, Git, DevOps, CI/CD, and various other technical skills is necessary. Additionally, strong interpersonal and communication skills, curiosity, proactivity, critical thinking, and a solid foundation in computer science fundamentals are essential for this role. This role also requires a willingness to travel up to 30% of the time. If you are looking for an opportunity to work in a collaborative and supportive environment, continuously learn and grow, and contribute to developing cutting-edge analytics solutions for clients across different sectors, this position may be the perfect fit for you.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As an AI Generalist at Scouto, you will play a crucial role in redefining recruitment through the utilization of an autonomous AI recruiter. With a focus on seamless sourcing, outreach, video-screening, ranking, and scheduling of candidates, all achieved without human intervention, Scouto is on a mission to revolutionize the hiring process with instant, effortless, and truly AI-powered solutions. If you are passionate about operating at the forefront of AI, automation, product development, and growth, we are excited to hear from you.

In this role, you will have the opportunity to wear multiple hats as a founding member of the team. Your responsibilities will include building and automating internal processes, ensuring customer satisfaction, providing valuable insights for product enhancement, and contributing to revenue generation. The dynamic and fast-paced environment at Scouto is ideal for individuals who thrive in ambiguity, embrace rapid experimentation, and are eager to leverage AI tools hands-on.

Your primary focus areas will be divided as follows:
- Operations & Automation (40%): Identify and automate repetitive workflows within support, sales, and internal operations using AI APIs, Zapier/Make, LangChain, or light scripting. Take ownership of the tools required to maintain the smooth operation of Scouto.
- Customer Success & Support (20%): Manage onboarding processes and support tickets, collaborate with engineering teams to resolve issues, and ensure customers derive maximum value from the product.
- Product Feedback & Growth (20%): Translate user feedback into actionable product enhancements, develop rapid prototypes for solutions, and establish a feedback loop with the product development team.
- Sales & Upsell (20%): Participate in product demonstrations, address technical inquiries, identify opportunities for upselling, and introduce automation to enhance the sales process.

We are seeking candidates who possess the following qualities:
- Proficiency in AI technologies, including LLMs, prompt engineering, and no-code AI stacks.
- Sales and growth-oriented mindset with the ability to articulate value propositions and understand revenue drivers.
- Technical expertise in APIs, Zapier/Make, Retool, and familiarity with light scripting languages.
- Customer-centric approach with strong problem-solving skills.
- Entrepreneurial spirit with a proactive, self-directed attitude and a passion for adapting to rapid changes.
- Bonus points for prior experience in SaaS, AI startups, customer success, sales engineering, or product operations.

Joining Scouto offers you the opportunity to:
- Contribute to shaping the future of AI-driven recruitment as an integral part of the founding team.
- Collaborate closely with the founder and core team, gaining extensive visibility and a broad scope of responsibilities.
- Accelerate your career progression towards leadership roles in success, growth, or operations.
- Enjoy competitive compensation, flexible work arrangements, and the autonomy to define your role.

If you are eager to contribute to building, automating, and scaling at a rapid pace, we look forward to discussing how you can be a part of Scouto's innovative journey.
Posted 1 month ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
Scouto is revolutionizing the recruitment industry by introducing an autonomous AI recruiter that can handle sourcing, outreach, video-screening, candidate ranking, and scheduling seamlessly, without requiring human intervention. With a robust AI foundation and significant market traction, we are dedicated to simplifying the hiring process by making it instant, effortless, and fully AI-driven. If you are thrilled by the convergence of AI, automation, product development, and business growth, we are eager to have you on board.

As a foundational AI Generalist at Scouto, you will play a versatile role, taking on various responsibilities such as establishing and automating internal procedures, ensuring customer satisfaction, providing valuable insights for product enhancement, and contributing to revenue generation. The fast-paced and dynamic environment provides an exciting opportunity for individuals who thrive in uncertainty, enjoy rapid experimentation, and are hands-on with AI technologies.

Your primary areas of focus will include:
- Operations & Automation (40%): Identify repetitive tasks within support, sales, and internal operations, and automate them using AI APIs, Zapier/Make, LangChain, or lightweight scripting. Take ownership of the tools that are essential for the smooth functioning of Scouto.
- Customer Success & Support (20%): Manage onboarding processes and handle support tickets, collaborate with the engineering team to troubleshoot issues, and ensure customers derive maximum value from our services.
- Product Feedback & Growth (20%): Translate user pain points into actionable product suggestions, create rapid prototypes for solutions, and maintain a feedback loop with the product development team.
- Sales & Upsell (20%): Participate in product demonstrations, address technical inquiries, identify opportunities for upselling, and introduce automation into the sales pipeline.

We are seeking individuals who possess the following qualifications:
- Proficiency in AI technologies, familiarity with LLMs, prompt engineering, and experience with no-code AI stacks.
- Sales and growth-oriented mindset with the ability to pitch ideas and comprehend revenue-driving strategies.
- Technical expertise in APIs, Zapier/Make, Retool, and a bonus for proficiency in light scripting.
- Customer-centric approach with strong problem-solving skills.
- Entrepreneurial spirit, self-motivated, and adaptable to fast-paced environments.
- Extra points for previous experience in SaaS, AI startups, customer success, sales engineering, or product operations.

Joining us means:
- Contributing to shaping the future of AI-powered recruitment as an integral part of the founding team.
- Working closely with the founder and core team, offering significant scope for growth and visibility.
- Accelerating your career progression towards leadership roles in success, growth, or operations.
- Competitive compensation, flexible work arrangements, and the autonomy to define your role.

If you are excited about the prospect of driving innovation, streamlining processes, and expanding rapidly, we would love to connect with you. Let's start a conversation about your future at Scouto.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a skilled Data Scientist specializing in Generative AI, you will be responsible for designing, developing, and deploying state-of-the-art AI models to tackle real-world business challenges. Your role will involve working with Large Language Models (LLMs), Generative Adversarial Networks (GANs), Retrieval-Augmented Generation (RAG) frameworks, and transformer architectures to create production-ready solutions.

Your key responsibilities will include:
- Designing, developing, and fine-tuning advanced Generative AI models such as LLMs, GANs, and diffusion models.
- Implementing and enhancing RAG and transformer-based architectures to enable contextual understanding and document intelligence.
- Customizing and optimizing LLMs for specific domain applications.
- Building, maintaining, and optimizing ML pipelines and infrastructure for model training, evaluation, and deployment.
- Collaborating with engineering teams to integrate AI models into user-facing applications.
- Staying updated with the latest trends and research in Generative AI, open-source frameworks, and tools.
- Analyzing model outputs for quality and performance, ensuring adherence to ethical AI practices.

To excel in this role, you should possess the following skills:
- Strong proficiency in Python and deep learning frameworks like TensorFlow, PyTorch, and Hugging Face Transformers.
- Deep understanding of GenAI architectures such as LLMs, RAG, GANs, and autoencoders.
- Experience in fine-tuning models using techniques like LoRA, PEFT, or equivalents.
- Knowledge of vector databases like FAISS and Pinecone, and of embedding generation methods.
- Experience in handling datasets, preprocessing, and synthetic data generation.
- Solid grasp of NLP concepts, prompt engineering, and safe AI practices.
- Hands-on experience in API development, model deployment, and cloud platforms such as AWS, GCP, and Azure.

By leveraging your expertise in Generative AI and staying abreast of industry advancements, you will play a crucial role in developing cutting-edge solutions to address complex business problems.
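The retrieval step behind the RAG frameworks and vector databases named in this posting reduces to nearest-neighbor search over embeddings. Below is a minimal, illustrative sketch in plain NumPy: the bag-of-words `embed` function and the toy documents are invented stand-ins for a real embedding model and a vector database such as FAISS or Pinecone.

```python
import numpy as np

# Toy corpus standing in for a real document store.
documents = [
    "Invoices are processed within 30 days.",
    "Refunds require manager approval.",
    "Support is available 24/7 via chat.",
]

def tokenize(text: str) -> list[str]:
    return [t.strip(".,?!").lower() for t in text.split()]

# Fixed vocabulary built from the corpus; a real system would use a
# learned embedding model rather than bag-of-words counts.
vocab = sorted({t for d in documents for t in tokenize(d)})

def embed(text: str) -> np.ndarray:
    """Unit-norm bag-of-words vector over the corpus vocabulary."""
    toks = tokenize(text)
    vec = np.array([toks.count(t) for t in vocab], dtype=float)
    n = np.linalg.norm(vec)
    return vec / n if n else vec

# Precomputed document matrix: the "vector database".
index = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    scores = index @ embed(query)  # dot product of unit vectors = cosine similarity
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

print(retrieve("manager approval for refunds"))
# In a full RAG pipeline, the retrieved text is prepended to the LLM prompt as context.
```

The same shape carries over to production systems: only `embed` (a transformer model) and `index` (FAISS, Pinecone, or similar) change.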
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a founding AI Generalist at Scouto, you will play a pivotal role in redefining recruitment by leveraging an autonomous AI recruiter. Your responsibilities will encompass operations & automation, customer success & support, product feedback & growth, and sales & upsell.

In the realm of Operations & Automation, you will be tasked with identifying repetitive workflows and automating them using AI APIs, Zapier/Make, LangChain, or light scripting. It will be your responsibility to ensure that the tools required to keep Scouto running smoothly are effectively managed.

Handling Customer Success & Support will involve managing onboarding processes, addressing support tickets, troubleshooting with the engineering team, and guaranteeing that customers derive maximum value from the platform.

Your role will also require you to provide valuable insights for product enhancement by converting user pain points into actionable ideas, prototyping solutions, and collaborating closely with the development team to refine the product.

Additionally, you will be involved in Sales & Upsell activities, which include participating in product demos, addressing technical queries, identifying upsell opportunities, and infusing automation into the sales pipeline to enhance efficiency.

The ideal candidate for this position must possess the following qualities:
- AI-Native Fluency: Demonstrates a deep understanding and proficiency in AI tools and LLMs, and regularly prototypes automations using tools like Zapier/Make and light scripting.
- Customer Empathy & Problem-Solving: Exhibits a knack for diagnosing customer issues and devising effective solutions across various aspects of the product and processes.
- Growth & Sales Mindset: Comfortable with conducting demos, driving upgrades, and leveraging consultative sales strategies.
- Product & Process Thinking: Capable of translating feedback into structured product insights, prototyping scalable workflows, and enhancing user experience through no/low-code solutions.
- Extreme Ownership & Hustle: Takes on tasks with a founder mindset, proactively fills gaps, adapts quickly to changing circumstances, and thrives in ambiguous situations.
- Communication Excellence: Possesses excellent written and verbal communication skills, can simplify complex AI workflows for diverse audiences, and is proficient in creating documentation, help articles, and product guides.

Nice-to-have qualifications include prior experience in SaaS, AI startups, customer success, sales engineering, or product operations, as well as active engagement within AI communities.

Joining Scouto presents the opportunity to shape the future of AI-driven hiring, work closely with the founder and core team, and accelerate your career growth into leadership roles. Competitive compensation, a flexible work environment, and the autonomy to define your role further add to the appeal of this position.
Posted 1 month ago
5.0 - 9.0 years
0 - 0 Lacs
karnataka
On-site
You will be responsible for building and interpreting machine learning models on real business data from the SigView platform, such as Logistic Regression, Boosted Trees (Gradient Boosting), Random Forests, and Decision Trees. Your tasks will include identifying data sources, integrating multiple sources or types of data, and applying data analytics expertise within a data source to develop methods that compensate for limitations and extend the applicability of the data.

Moreover, you will be expected to extract data from relevant data sources, including internal systems and third-party data sources, through manual and automated web scraping. Your role will involve validating third-party metrics by cross-referencing various syndicated data sources, and determining which numerical variables will be used as-is from the raw datasets, which will be categorized into buckets, and which will be used to create new calculated numerical variables. You will perform exploratory data analysis using PySpark to finalize the list of compulsory variables necessary to solve the business problem, and transform formulated problems into implementation plans for experiments by applying appropriate data science methods, algorithms, and tools. Additionally, you will work with offshore teams after data preparation to identify the best statistical model/analytical solution that can be applied to the available data to solve the business problem and derive actionable insights.

Your responsibilities will also include collating the results of the models and preparing detailed technical reports showcasing how the models can be used and modified for different scenarios in the future to develop predictive insights. You will develop multiple reports to facilitate the generation of various business scenarios and provide features for users to generate scenarios.

Furthermore, you will interpret the results of tests and analyses to develop insights into formulated problems within the business/customer context and provide guidance on risks and limitations. Acquiring and using broad knowledge of innovative data analytics methods, algorithms, and tools, including Spark, Elasticsearch, Python, Databricks, Azure, Power BI, Azure Cloud services, LLMs/Gen AI, and the Microsoft suite, will be crucial for success in this role. This position may involve telecommuting and requires 10% national travel to meet with clients.

The minimum requirements for this role include a Bachelor's degree in Electronics Engineering, Computer Engineering, Data Analytics, Computer Science, or a related field, plus five (5) years of progressive experience in the job offered or a related occupation. Special skill requirements include applying statistical methods to validate results and support strategic decisions; building and interpreting advanced machine learning models; using tools such as Python, Scikit-Learn, XGBoost, Databricks, Excel, and Azure Machine Learning for data preparation and model validation; integrating diverse data sources using data analytics techniques; and performing data analysis and predictive model development using AI/ML algorithms. Mathematical knowledge of Statistics, Probability, Differentiation and Integration, Linear Algebra, and Geometry will be beneficial. Familiarity with data science libraries such as NumPy, SciPy, and Pandas; Azure Data Factory for data pipeline design; and NLTK, spaCy, Hugging Face Transformers, Azure Text Analytics, OpenAI, Word2Vec, and BERT will also be advantageous. The base salary for this position ranges from $171,000 to $190,000 per annum for 40 hours per week, Monday to Friday.
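The model families this posting names (Logistic Regression, Gradient Boosting, Random Forests, Decision Trees) map directly onto scikit-learn estimators. A minimal sketch of fitting and comparing them, using synthetic data in place of the non-public SigView platform data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary-classification data standing in for real business data.
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

models = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "gradient_boosting": GradientBoostingClassifier(random_state=42),
    "random_forest": RandomForestClassifier(random_state=42),
    "decision_tree": DecisionTreeClassifier(random_state=42),
}

# Fit each model and compare held-out accuracy; in practice this comparison
# would use cross-validation and business-relevant metrics, not raw accuracy alone.
scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores[name] = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {scores[name]:.3f}")
```

Interpreting the fitted models (logistic-regression coefficients, tree feature importances) is the part of the workflow that turns model results into the business insights described above.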
If you have any applications, comments, or questions regarding the job opportunity described, please contact Piyush Khemka, VP, Business Operations, at 111 Town Square Pl., Suite 1203, Jersey City, NJ 07310.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
The Data Science Analyst role at Citi within the GWFO team is a developing professional position that involves applying specialized knowledge in machine learning, statistical modeling, and data analysis to monitor, assess, analyze, and evaluate processes and data. The main responsibilities include identifying opportunities to leverage advanced analytics to improve business outcomes, automate processes, and generate actionable insights. Additionally, the individual in this role will contribute to the development and deployment of innovative AI solutions, including generative AI and agentic AI applications, while collaborating with cross-functional stakeholders.

As a Data Science Analyst at Citi, you will be expected to:
- Gather and process operational data from various cross-functional stakeholders to examine past business performance and identify areas for improvement.
- Apply machine learning techniques to identify data patterns and trends, and provide insights to enhance business decision-making capabilities in various areas.
- Develop and implement machine learning models for predictive analytics, forecasting, and optimization.
- Design, build, and deploy Generative AI solutions to enhance customer experience, automate tasks, and personalize interactions.
- Experiment with agentic AI frameworks to create autonomous systems that can learn, adapt, and solve complex problems.
- Develop tools and techniques to evaluate the performance and impact of AI solutions.
- Translate data into consumer or customer behavioral insights to drive targeting and segmentation strategies, and effectively communicate findings to business partners and senior leaders.
- Continuously explore and evaluate new data sources, tools, and capabilities, focusing on cutting-edge AI technologies, to improve processes and strategies.
- Collaborate closely with internal and external business partners to build, implement, track, and enhance decision strategies.

Skills and Experience:
- 5+ years of relevant experience in data science, machine learning, or a related field.
- Advanced process management skills; organized and detail-oriented.
- Curiosity about learning and developing new skill sets, particularly in the area of artificial intelligence.
- Positive outlook with a can-do mindset.
- Strong programming skills in Python and proficiency in relevant data science libraries such as scikit-learn, TensorFlow, PyTorch, and Transformers.
- Experience with statistical modeling techniques, including regression, classification, and clustering.
- Experience building GenAI solutions using LLMs and vector databases.
- Experience with agentic AI frameworks such as LangChain and LangGraph, and with MLOps.
- Experience with data visualization tools such as Tableau or Power BI.
- Other skills required include strong logical reasoning capabilities, willingness to learn new skills, and good communication and presentation skills.

Education:
- Bachelor's/University degree or equivalent experience in a quantitative field such as computer science, statistics, mathematics, or engineering. Master's degree preferred.

Working at Citi offers more than just a job - it means joining a global family of dedicated individuals where you can grow your career, contribute to your community, and make a real impact.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a visionary technical co-founder, you will combine engineering excellence with philosophical depth to embark on the journey of building something rare, intelligent, and deeply rooted. You will be driven by first-principle problem solving and AI that reasons instead of merely responding. Your role will involve leading the technical architecture of a foundational AI product while embodying the qualities of humility to listen and the conviction to build. Your core responsibilities will revolve around spearheading the design, development, and iteration of VIKAS AI's reasoning engine. This will include building and fine-tuning LLMs, integrating RAG pipelines, and designing multi-agent systems. You will architect systems that strike a balance between cultural depth, emotional nuance, and technical performance. Collaborating closely with the team, you will shape core product features such as culturally intelligent LLMs, an aesthetically curated image engine, and a short-form AI news video layer. Additionally, you will co-create the technical roadmap, hire the early team, and oversee the infrastructure. To excel in this role, strong experience in machine learning, NLP, and LLMs is essential. You should be proficient with Transformers, LangChain, HuggingFace, or similar frameworks and possess solid knowledge of Python, vector databases, and inference infrastructure. Experience with RAG (Retrieval-Augmented Generation) and agent-based architectures, as well as familiarity with embedding models, fine-tuning, and prompt engineering, will set you up for success. An added advantage would be an interest in Indic language modeling or symbolic reasoning, experience in building low-latency, high-context systems, and an eye for clean code, ethics, and culture. Beyond your technical skills, your philosophical outlook and values will play a crucial role in defining who you are as a builder-philosopher. 
Your commitment to cultural intelligence, ethical AI, and context-aware systems will be evident as you strive to bridge the gap in current AI capabilities. You will prioritize truth over hype, depth over speed, and alignment over noise, reflecting a holistic approach to your work. In return for your contributions, you will receive co-founder status with meaningful equity, the opportunity to be at the forefront of building a product with global potential and local essence, complete creative and architectural freedom to innovate from the ground up, and the guidance of a visionary founder who is relentless, clear, and deeply committed. This role offers you the chance to not just follow existing trends but to redefine the narrative by putting India on the map through innovative thinking and impactful solutions. If you are intrigued by this opportunity and resonate with the vision outlined, feel free to reach out by DM or email at vikasai150807@gmail.com. Let's engage in conversations that transcend mere features and delve into shaping promising futures.
Posted 1 month ago
10.0 - 15.0 years
0 Lacs
noida, uttar pradesh
On-site
You are an experienced OCI AI Architect who will be responsible for leading the design and deployment of Gen AI, agentic AI, and traditional AI/ML solutions on Oracle Cloud. Your role requires a deep understanding of Oracle Cloud architecture; Gen AI, agentic, and AI/ML frameworks; data engineering; and OCI-native services. The ideal candidate will combine deep technical expertise in AI/ML and Gen AI on OCI with domain knowledge in Finance and Accounting.

Your key responsibilities will include designing, architecting, and deploying AI/ML and Gen AI solutions on OCI using native AI services, building agentic AI solutions using frameworks such as LangGraph, CrewAI, and AutoGen, leading the development of AI/ML pipelines, and providing technical guidance on MLOps, model versioning, deployment automation, and AI governance. You will collaborate with functional SMEs, application teams, and business stakeholders to identify AI opportunities, advocate for OCI-native capabilities, and support customer presentations and solution demos.

To excel in this role, you should have 10-15 years of experience in Oracle Cloud and AI, with at least 5 years of proven experience in designing, architecting, and deploying AI/ML and Gen AI solutions on the OCI AI stack. Strong Python development experience, knowledge of LLMs such as Cohere and GPT, proficiency in AI/ML/Gen AI frameworks like TensorFlow, PyTorch, and Hugging Face, and hands-on experience with OCI services are required. Additionally, skills in AI governance, agentic AI frameworks, and AI architecture principles, along with leadership abilities, are crucial for success.

Qualifications for this position include Oracle Cloud certifications such as OCI Architect Professional, OCI Generative AI Professional, and OCI Data Science Professional, as well as a degree in Computer Science or an MCA. Any degree or diploma in AI would be preferred.

Experience with front-end programming languages, Finance domain solutions, Oracle Cloud deployment, and knowledge of Analytics and Data Science would be advantageous. If you are a highly skilled and experienced OCI AI Architect with a passion for designing cutting-edge AI solutions on Oracle Cloud, we invite you to apply and join our team for this exciting opportunity.
Posted 1 month ago