
23 Gemini Jobs

Set up a job alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 - 8.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Naukri logo

We are seeking a highly skilled and experienced Senior Cloud Native Developer to join our team and drive the design, development, and delivery of cutting-edge cloud-based solutions on Google Cloud Platform (GCP). This role emphasizes technical expertise, best practices in cloud-native development, and a proactive approach to implementing scalable and secure cloud solutions.

Responsibilities
• Design, develop, and deploy cloud-based solutions using GCP, adhering to architecture standards and best practices
• Code and implement Java applications using GCP native services such as GKE, Cloud Run, Cloud Functions, Firestore, Cloud SQL, and Pub/Sub
• Select appropriate GCP services to address functional and non-functional requirements
• Demonstrate deep expertise in GCP PaaS, serverless, and database services
• Ensure compliance with security and regulatory standards across all cloud solutions
• Optimize cloud-based solutions to enhance performance, scalability, and cost-efficiency
• Stay updated on emerging cloud technologies and industry trends
• Collaborate with cross-functional teams to architect and deliver successful cloud implementations
• Leverage foundational knowledge of GCP AI services, including Vertex AI, Code Bison, and Gemini models, when applicable

Requirements
• 5+ years of extensive experience in designing, implementing, and maintaining applications on GCP
• Comprehensive expertise in using GCP services, including GKE, Cloud Run, Cloud Functions, Firestore, Firebase, and Cloud SQL
• Knowledge of advanced GCP services such as Apigee, Spanner, Memorystore, Service Mesh, Gemini Code Assist, Vertex AI, and Cloud Monitoring
• Solid understanding of cloud security best practices and expertise in implementing security controls in GCP
• Proficiency in cloud architecture principles and best practices, with a focus on scalable and reliable solutions
• Experience with automation and configuration management tools, particularly Terraform, along with a strong grasp of DevOps principles
• Familiarity with front-end technologies like Angular or React

Nice to have
• Familiarity with GCP GenAI solutions and models, including Vertex AI, Codebison, and Gemini models
• Background in working with front-end frameworks and technologies to complement back-end cloud development
• Capability to design end-to-end solutions integrating modern AI and cloud technologies
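For illustration only (an editorial addition, not part of the posting): a minimal Pub/Sub publish sketch using the google-cloud-pubsub client. The role itself is Java-focused; this Python sketch merely shows the messaging pattern, and the project and topic IDs are placeholders.

```python
# Minimal Pub/Sub publish sketch (placeholder project/topic IDs); assumes the google-cloud-pubsub package.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "orders")  # placeholders

# Messages are bytes; extra keyword arguments become message attributes.
future = publisher.publish(topic_path, b'{"order_id": 123}', origin="orders-service")
print("published message id:", future.result())  # result() blocks until the server acknowledges
```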

Posted 3 days ago

Apply

3.0 - 8.0 years

0 - 3 Lacs

Hyderabad

Work from Office

Naukri logo

SUMMARY

Project Lead

Job Overview: The Project Manager will oversee task planning, allocation, and tracking for development projects, ensuring efficient task assignment and timely progress reporting. The role requires expertise in LAMP stack development (Laravel, CodeIgniter), API security, and knowledge of mobile development (Flutter/Firebase) and AI technologies like OpenAI and Gemini.

Key Responsibilities:
• Task Planning & Allocation: Coordinate task assignment and project timelines.
• Task Monitoring & Reporting: Track progress and update management on task statuses.
• API Development & Integration: Oversee secure API design and integrations.
• Collaboration & Leadership: Lead team collaborations, sprint planning, and mentoring.
• Architecture & Infrastructure: Guide the development of scalable infrastructure, CI/CD pipelines, and cloud test labs.
• Mobile & AI Integration: Manage mobile development (Flutter/Firebase) and integrate AI technologies into projects.

Required Qualifications:
• Education: Bachelor's in Computer Science or a related field; PMP/Scrum Master certification is a plus.
• Experience: 3+ years in LAMP stack development (Laravel, CodeIgniter); experience with REST/GraphQL API design and security; mobile development using Flutter/Firebase, with cloud-based emulators; AI integration experience (OpenAI, Gemini); proven ability to manage and mentor development teams.
• Skills: Strong LAMP stack expertise (Linux, Apache, MySQL, PHP); API development and security best practices; CI/CD, cloud infrastructure, and scalable solutions; excellent leadership, communication, and project management skills.

Desired Attributes:
• Strong multitasking and project management capabilities.
• Proactive approach to task allocation and project delivery.
• Ability to adapt to new technologies, particularly in web, mobile, and AI.

Posted 3 days ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Gurugram

Remote

Naukri logo

Job Title: AI-Native Full Stack Engineer

Job Description: We are hiring a Full Stack Engineer to help build and scale AI-powered products for a client-facing role in our IT consulting unit. You will work across Next.js, Python backends, and AWS infrastructure, while actively leveraging modern AI tools (Cursor, Devin, v0, Claude, Gemini, GPT-4o, etc.) to accelerate development. Beyond code, we value candidates who have strong opinions on LLMs and actively integrate AI agents into their workflows. You will contribute to everything from product features to internal tools, working closely with a nimble, high-performance team deeply embedded in AI product development.

Job Requirements:
• 3+ years of professional experience as a full stack developer
• Strong Next.js (or React + SSR) experience
• Strong Python backend experience
• AWS deployment and scaling experience (EC2, S3, RDS, etc.)
• Demonstrated use of AI tools for coding: Cursor, Devin, v0, etc.
• Strong understanding of the LLM landscape (Claude 4 Opus, GPT-4o, Gemini 2.5 Pro, open-source models)

Role & responsibilities: Direct first-round technical evaluation link: https://pehchaan.me/jobApply/RJR83dfTgj-KRI1q-EgkUw

Posted 6 days ago

Apply

10.0 - 17.0 years

25 - 40 Lacs

Chennai

Work from Office

Naukri logo

• LLMs: OpenAI, Gemini, CoPilot, etc.
• Good knowledge of RAG pipeline architectures
• Fine-tuning, prompt tuning, and instruction tuning of LLMs
• Machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn)
• Cloud services (GCP, Azure, AWS)

Posted 6 days ago

Apply

2.0 - 5.0 years

12 - 20 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Naukri logo

Experienced Dialogflow CX & Generative AI candidate to design, develop, and optimize AI-driven conversational solutions. The ideal candidate will have expertise in Dialogflow CX, Conversational AI, and Generative AI technologies.

Required Skills & Qualifications:
• Hands-on experience with Dialogflow CX and Conversational AI platforms.
• Strong knowledge of NLP, AI, and ML techniques.
• Experience working with Generative AI models (e.g., Vertex AI, Gemini, GPT, BERT).
• Proficiency in Google Cloud, Python, APIs, and AI-driven automation.
• Ability to design complex conversational workflows with intents, contexts, and entities.
• Experience integrating AI chatbots with various platforms and third-party services.

Key Responsibilities:
• Develop and optimize chatbot/voicebot solutions using Google Dialogflow CX.
• Design conversational flows with advanced NLP techniques for natural and effective communication.
• Integrate Generative AI models to enhance the chatbot's ability to generate human-like responses.
• Fine-tune AI models for accuracy, efficiency, and improved user experience.
• Collaborate with cross-functional teams to ensure smooth implementation and functionality.
• Monitor, analyze, and improve chatbot/voicebot performance using data insights and user feedback.
• Stay updated with the latest advancements in Conversational AI and GenAI technologies.
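For illustration only (an editorial addition, not part of the posting): a minimal sketch of sending a text query to a Dialogflow CX agent with the google-cloud-dialogflowcx Python client. The project, location, agent, and session IDs are placeholders, and the regional-endpoint assumption may need adjusting for your agent.

```python
# Minimal Dialogflow CX detect-intent sketch (placeholder IDs; verify the region/endpoint for your agent).
from google.cloud import dialogflowcx_v3 as cx

def detect_intent_text(project: str, location: str, agent: str, session: str, text: str) -> None:
    # Regional agents require a regional API endpoint, e.g. "us-central1-dialogflow.googleapis.com".
    client = cx.SessionsClient(client_options={"api_endpoint": f"{location}-dialogflow.googleapis.com"})

    session_path = client.session_path(project, location, agent, session)
    query_input = cx.QueryInput(text=cx.TextInput(text=text), language_code="en")

    response = client.detect_intent(
        request=cx.DetectIntentRequest(session=session_path, query_input=query_input)
    )

    # Each response message may carry one or more text replies from the matched intent/flow.
    for message in response.query_result.response_messages:
        if message.text:
            print(" ".join(message.text.text))

if __name__ == "__main__":
    detect_intent_text("my-project", "us-central1", "my-agent-id", "test-session-1", "I want to reset my password")
```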

Posted 2 weeks ago

Apply

1.0 years

3 Lacs

Bangalore, Karnataka, IN

On-site

Internshala logo

About the job: This isn't your typical EA job. You'll work directly with Sachin Amarnath, a founder, strategist, and educator who operates across multiple ventures in education, consulting, real estate, and digital content. This role is perfect for someone looking for a structured, high-learning, low-politics environment where clarity, communication, and consistency matter more than buzzwords. You'll be Sachin's right hand, organizing chaos, ensuring nothing falls through the cracks, and helping him deliver excellence across 6+ active ventures.

Key responsibilities (in the first 3-6 months, under close mentorship):
1. Manage Sachin's calendar and prioritize his meetings.
2. Track tasks, deadlines, and deliverables using shared tools (Notion, task boards).
3. Prepare, track, and follow up on invoices with clients and vendors.
4. Maintain clean, organized minutes of meetings (MOMs) and follow-through.
5. Conduct quick online research and structure information into insights.
6. Draft professional emails, proposals, LinkedIn posts, and pitch decks.
7. Assist in lead qualification and basic funnel follow-up using WhatsApp and CRM.
8. Build and format PowerPoints, Excel sheets, trackers, and planning docs.

You'll be working across:
1. Ascend School of Construction Business (education & mentorship).
2. First pillar Consulting (business strategy & digital transformation).
3. Sayspace (advisory-led growth initiatives).
4. Construction Management Training Institute (CMTI) (corporate L&D programs).
5. Aurum Superfoodz (e-commerce and brand support).
6. HNIs & LinkedIn Strategy (personal branding projects).

Travel & work flexibility:
1. Occasional travel (all expenses covered) for events, trainings, or reviews.
2. Base location in Bangalore preferred. Work is hybrid with high flexibility.

Who can apply: Only candidates who have a minimum of 1 year of experience and are from Bangalore.

Salary: ₹ 3,00,000/year
Experience: 1 year(s)
Deadline: 2025-06-26 23:59:59
Other perks: Free snacks & beverages
Skills required: MS-PowerPoint, MS-Excel, LinkedIn Marketing, Canva, Google Workspace, Effective Communication, Notion, ChatGPT and Gemini

Other Requirements (bonus, not mandatory):
1. Civil engineering or architecture background.
2. Familiarity with the construction or edtech industries.
3. Knack for design thinking or marketing campaigns.
4. Experience with tools like Facebook Ads or Revit.

About Company: Ascend School of Construction Business is a new-age educational B-School that produces world-class techno-managers and executives for the construction industry.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

15 - 20 Lacs

Chennai

Work from Office

Naukri logo

Prompt Engineer
Experience: 3 to 6 years
Location: Chennai
Timing: General shift
Notice period: Less than 15 days
Contract period: 6 months

JD:
Role Overview: We are looking for a talented and creative Prompt Engineer to join our AI team. In this role, you will design, develop, and refine prompts for large language models (LLMs) to solve business problems, improve product functionality, and optimize user experiences. Your work will directly impact how our products interact with users and leverage cutting-edge AI models.

Responsibilities:
• Design and iterate on effective prompts to drive high-quality responses from LLMs (e.g., GPT-4, Claude, Gemini).
• Collaborate with AI researchers, data scientists, and product managers to develop prompt strategies aligned with business goals.
• Evaluate LLM performance, analyze outputs, and optimize prompts based on performance metrics.
• Conduct A/B testing of prompts and maintain a library of prompt templates.
• Stay current on developments in generative AI, NLP, and prompt engineering best practices.
• Develop documentation, tools, and internal guides to help teams effectively use LLMs.
• Ensure ethical and responsible use of AI systems, identifying and mitigating model bias or inappropriate outputs.

Qualifications:
Required:
• Bachelor's degree in Computer Science, Linguistics, Cognitive Science, or a related field.
• Strong understanding of LLMs and natural language processing (NLP).
• Hands-on experience with prompt design for ChatGPT, Claude, Gemini, or similar models.
• Excellent written communication and analytical skills.
• Familiarity with programming/scripting (Python preferred) and basic data handling.
Preferred:
• Master's degree or higher in a relevant field.
• Experience with fine-tuning or RLHF (Reinforcement Learning from Human Feedback).
• Exposure to prompt chaining, function calling, and multi-model orchestration.
• Experience using tools such as LangChain, LlamaIndex, or vector databases like Pinecone or Weaviate.

Sincerely,
Varsha L TS
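For illustration only (an editorial addition, not part of the posting): a minimal sketch of the prompt-template and A/B-comparison workflow this role describes, using the openai Python SDK. The template names, ticket text, and model name are placeholders, and real evaluations would score outputs rather than just print them.

```python
# Illustrative prompt A/B sketch: compare two prompt variants on the same inputs.
# Assumes the `openai` package and an OPENAI_API_KEY in the environment; the model name is a placeholder.
from openai import OpenAI

client = OpenAI()

PROMPT_LIBRARY = {
    "summarize_v1": "Summarize the following support ticket in one sentence:\n{ticket}",
    "summarize_v2": "You are a support analyst. In at most 20 words, state the customer's core problem:\n{ticket}",
}

def run_prompt(template_name: str, ticket: str) -> str:
    prompt = PROMPT_LIBRARY[template_name].format(ticket=ticket)
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    return response.choices[0].message.content

tickets = ["My invoice for May was charged twice and I need a refund."]
for ticket in tickets:
    for name in ("summarize_v1", "summarize_v2"):
        print(name, "->", run_prompt(name, ticket))
# A real A/B test would score each variant's outputs (human ratings or automatic metrics) before choosing one.
```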

Posted 3 weeks ago

Apply

0.0 - 1.0 years

1 - 2 Lacs

Bengaluru

Remote

Naukri logo

Are you an exceptionally skilled and highly motivated individual with a deep passion for quantitative finance, data science, end-to-end automation, cutting-edge AI tools, and creating impactful educational content? Do you possess an unparalleled command of Python, advanced mathematics, statistical modeling, and the ability to effectively communicate complex concepts for online learning? If so, we have an extraordinary opportunity for you!

We are seeking a Highly Paid Quant Analyst Intern for a demanding yet incredibly rewarding 6-month remote internship in India. This role is designed for ambitious individuals who are ready to dive deep into real-world quantitative challenges, drive efficiency through comprehensive automation, leverage the power of new AI tools (like Claude, ChatGPT, Perplexity, Gemini), and crucially, develop high-quality quantitative education content for our online platforms.

Key Responsibilities:
• Quantitative Research & Model Development: Apply your expertise to research, develop, and refine quantitative models, trading strategies, and analytical tools.
• Quantitative Education Content Creation: Design, develop, and refine engaging and accurate quantitative education content (e.g., lessons, exercises, case studies, coding tutorials) for Gotraddy's online courses and training programs.
• Data Science & Analytics: Perform in-depth analysis of financial datasets, identify patterns, and extract actionable insights to inform and enhance our course materials and practical exercises.
• End-to-End Workflow Automation: Design, build, and maintain robust automated pipelines and tools for data processing, model testing, and the creation of interactive learning simulations and demonstrations, leveraging tools like n8n and Make.
• AI Tool Integration & Prompt Engineering: Proactively explore and effectively utilize new AI tools (e.g., Claude, ChatGPT, Perplexity, Gemini) through advanced prompt engineering to assist in research, content generation, problem-solving, and efficiency improvements for educational material.
• Mathematical & Statistical Application: Leverage your understanding of advanced mathematical concepts (e.g., probability, stochastic calculus, optimization) to create clear, practical examples and solve complex problems for educational purposes.
• Content Validation & Improvement: Rigorously test and validate existing and new quantitative content, ensuring accuracy, relevance, pedagogical effectiveness, and a seamless online learning experience.
• Collaboration & Innovation: Work closely with our experienced educators and content creators to translate complex quantitative concepts into accessible and engaging learning experiences.

What We're Looking For:
• Education: Currently pursuing or recently completed a Bachelor's, Master's, or Ph.D. in a highly quantitative field such as Quantitative Finance, Mathematics, Statistics, Computer Science, Data Science, or a related discipline.
• Exceptional Python Skills: Expert-level proficiency in Python, including extensive experience with libraries like NumPy, Pandas, SciPy, Scikit-learn, and ideally, specialized quantitative finance libraries.
• Strong Data Science Acumen: Proven ability in data manipulation, statistical analysis, machine learning (regression, classification, time series), and data visualization.
• Solid Mathematical Foundation: Deep understanding of linear algebra, multivariate calculus, probability theory, stochastic processes, and numerical methods.
• Automation Prowess: Demonstrated experience in automating complex workflows and processes using Python scripting and/or workflow automation tools like n8n or Make.
• Proficiency in Prompt Engineering: Demonstrated ability to effectively use and extract valuable insights from large language models and other AI tools (e.g., Claude, ChatGPT, Perplexity, Gemini).
• Content Creation Aptitude: Strong ability to articulate complex quantitative concepts clearly, concisely, and engagingly for an online learning audience. Experience with educational content development is a significant plus.
• Highly Motivated: A proactive, self-starter attitude with a strong desire to learn, contribute, and elevate educational content.
• Problem-Solving: Excellent analytical and problem-solving skills, with a keen eye for detail.
• Communication: Strong verbal and written communication skills.
• Location: Ability to work remotely from India.

Desired Skills (not strictly mandatory, but highly regarded):
• Advanced Python Development: Experience with more complex Python frameworks or building robust applications.
• In-depth Data Science Techniques: Exposure to advanced topics like deep learning for financial applications, causal inference, or advanced time series analysis.
• Mathematical Sophistication: Knowledge of advanced optimization, numerical methods, or stochastic calculus applied to finance.
• AI Tool Power User: Proven track record of leveraging AI tools (Claude, ChatGPT, Perplexity, Gemini) for complex problem-solving, code generation, or sophisticated content ideation.
• Workflow Automation Mastery: Experience in designing and implementing complex, multi-step automated workflows using tools like n8n or Make for diverse applications.
• Educational Content Design: Prior experience in designing curricula, writing lessons, or creating interactive learning modules for quantitative subjects.
• Creative Writing & Pedagogy: Ability to distill complex technical information into clear, engaging, and creatively presented educational materials suitable for diverse learning styles.

What We Offer:
• Exceptional Compensation: This is a highly competitive and very well-paid internship, acknowledging your top-tier skills and potential contribution to our firm.
• Impactful Contribution: Your work will directly enhance the learning experience for aspiring quantitative professionals globally, shaping the future of quantitative education.
• Deep Learning & Growth: An unparalleled opportunity to deepen your quantitative skills and master the integration of cutting-edge AI, workflow automation, and educational content creation.
• Mentorship: Direct mentorship from seasoned quantitative professionals and educators who are also exploring AI and automation frontiers.
• Dynamic Remote Environment: A collaborative, intellectually stimulating, and fast-paced remote work environment that fosters innovation in quantitative education, AI application, and content development.
• Flexibility: The convenience of a remote role, allowing you to work from anywhere in India.
• Career Pathway: Strong potential for a full-time conversion offer upon successful completion of the internship, contributing to our core education and research initiatives.

Internship Duration: 6 Months
Location: Remote (India)

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Chennai

Work from Office

Naukri logo

• RAG pipeline architectures
• Fine-tuning, prompt tuning, and instruction tuning of LLMs
• Machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn)
• Data wrangling, data cleaning, data preprocessing, and data

Posted 3 weeks ago

Apply

10.0 - 16.0 years

40 - 50 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

About the Client: Company is a global leader in digital transformation, cybersecurity, cloud, and high-performance computing. Headquartered in France, Company serves clients across diverse industries with innovative solutions that drive business agility and operational excellence. With a strong commitment to sustainability and cutting-edge technology, Company supports organizations in their digital journeys through consulting, systems integration, and managed services. The company's expertise spans artificial intelligence, data analytics, and IoT, helping clients achieve secure and efficient digital operations worldwide. Company is recognized for its focus on decarbonization and delivering trusted, secure digital services.

Job Description

Responsibilities:
• Lead the development and implementation of AI-driven solutions using GCP services and Gemini technologies.
• Collaborate with stakeholders to understand business requirements and translate them into scalable, secure technical solutions.
• Design and optimize AI architectures for performance, reliability, and cost-effectiveness.
• Establish best practices for AI model development, deployment, and monitoring.
• Ensure compliance with ethical AI principles, data governance, and security standards.
• Provide technical leadership and mentorship to cross-functional teams, including data scientists and engineers.
• Stay updated with emerging trends in AI and cloud technologies, particularly within the GCP ecosystem.
• Troubleshoot and resolve complex technical issues related to AI systems.
• Document architectural designs, technical decisions, and best practices for future reference.

Eligibility Criteria:
• Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
• Extensive experience with GCP AI services and Gemini technologies.
• Strong understanding of AI architecture, machine learning, and optimization techniques.
• Proficiency in programming languages such as Python, Java, or similar.
• Experience with cloud-based AI solutions and MLOps practices is a plus.
• Familiarity with ethical AI principles and frameworks.
• Knowledge of data governance, compliance standards, and security protocols.
• Excellent problem-solving and analytical skills.
• Proven leadership and team management abilities.
• Ability to work in a fast-paced environment and manage multiple priorities.

Posted 3 weeks ago

Apply

0.0 years

3 - 6 Lacs

Delhi, Delhi, IN

On-site

Internshala logo

About the job:

Key responsibilities:
1. Build AI-driven tools and products using APIs (OpenAI, Gemini, etc.).
2. Design and fine-tune prompts for various use cases.
3. Integrate vector databases (Pinecone, ChromaDB) for retrieval-augmented generation (RAG).
4. Use tools like LangChain or LlamaIndex for multi-step workflows.
5. Collaborate with designers, content teams, and founders to turn ideas into polished tools.

Who can apply: Only Computer Science Engineering students.

Salary: ₹ 3,20,000 - 6,50,000/year
Experience: 0 year(s)
Deadline: 2025-06-22 23:59:59
Skills required: Natural Language Processing (NLP), Deep Learning, Prompt Engineering, ChatGPT, Claude, Gemini, LLMOps and Model fine-tuning

Other Requirements:
1. Degree: B.Tech in AI/ML, or other fields with completed AI/ML projects.
2. Strong understanding of LLM APIs (OpenAI, Claude, Gemini, etc.).
3. REST API integration and deployment knowledge.
4. GitHub portfolio with working AI tools or integrations.

About Company: Stirring Minds is a premier startup ecosystem in India, dedicated to helping businesses launch, scale, and succeed. As a leading incubator, we provide funding, co-working spaces, and mentorship to support the growth of innovative companies. In addition to our incubator services, we also host the largest startup event in the country, Startup Summit Live, bringing together entrepreneurs and industry leaders to connect, learn, and collaborate. Our community-driven approach extends beyond our event and incubator offerings, as we work to create communities of like-minded individuals who can support and learn from one another. We have been recognized by top media outlets both in India and internationally, including the BBC, The Guardian, Entrepreneur, and Business Insider. Our goal is to provide a comprehensive ecosystem for startups and help turn their ideas into reality.
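For illustration only (an editorial addition, not part of the posting): a minimal retrieval-augmented generation sketch using ChromaDB for the retrieval step. The documents, collection name, and question are placeholders, and the final LLM call is left to whichever API (OpenAI, Gemini, etc.) is in use.

```python
# Illustrative RAG sketch: index a few documents, retrieve the most relevant ones, assemble a prompt.
# Assumes the `chromadb` package; documents and the downstream LLM call are placeholders.
import chromadb

client = chromadb.Client()  # in-memory instance; persistent clients are also available
collection = client.create_collection(name="docs")

# Index a few documents (Chroma embeds them with its default embedding function).
collection.add(
    ids=["d1", "d2", "d3"],
    documents=[
        "Refunds are processed within 5 business days.",
        "Password resets are done from the account settings page.",
        "Premium plans include priority support.",
    ],
)

question = "How long does a refund take?"
results = collection.query(query_texts=[question], n_results=2)
context = "\n".join(results["documents"][0])

# The assembled prompt would then be sent to an LLM API (OpenAI, Gemini, etc.) for answer generation.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```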

Posted 3 weeks ago

Apply

4 - 9 years

6 - 16 Lacs

Noida

Hybrid

Naukri logo

Hexaware is conducting a walk-in interview for Data Scientist (GenAI) / Lead Data Scientist (GenAI) / Data Scientist Architect (GenAI) at the Noida location on 12th April 2025 (Saturday). We are urgently looking for immediate/early joiners. Interested candidates can share their CV at umaparvathyc@hexaware.com.

MUST HAVE:
1. Strong experience as a Data Scientist (GenAI).
2. Strong hands-on experience with GenAI LLM models (ChatGPT, LLAMA 2, etc.), vector databases, LangChain, LangGraph, LlamaIndex, Azure/AWS, Bedrock, and GPT-4.
3. Strong in Python, machine learning, deep learning architecture, NLP, and OCR.

Primary Skills:
• Good understanding of GenAI LLM models (ChatGPT, LLAMA 2, etc.), vector databases, LangChain, and LlamaIndex.
• Hands-on experience with deep learning architecture, NLP, and OCR.
• Python FastAPI experience; SDA-based implementations for all the APIs.
• The architect should be hands-on and review the code developed by the developers.
• Should be able to take on and work on spike stories assigned to them.

Posted 2 months ago

Apply

15 - 20 years

17 - 22 Lacs

Pune, Bengaluru, Hyderabad

Work from Office

Naukri logo

Skills: AI Architect, CoPilot, OpenAI, Gemini, LLMs, Python, Java, or Scala, cloud platforms, GenAI and artificial intelligence solutions, AI/ML, AIOps

Posted 2 months ago

Apply

8 - 13 years

30 - 37 Lacs

Pune, Bengaluru

Hybrid

Naukri logo

Xoriant is hiring a GCP Developer! Interested candidates, please share your resume with ashwini.naik@xoriant.com along with the following details: current CTC, expected CTC, notice period, and current location.

Job Summary: We are seeking a highly skilled GCP Developer and Subject Matter Expert (SME) with an extensive background in GCP architecture. This pivotal role requires extensive hands-on experience in designing and deploying solutions on the Google Cloud Platform (GCP). The successful candidate will play a critical role within our team and must be able to communicate effectively with both clients and team members.

Key Responsibilities:
• Requires 7+ years of experience working on GCP projects and platforms.
• Design and implement scalable, secure, and high-performing cloud solutions on GCP, leveraging extensive hands-on experience with the platform.
• Develop and refine architecture blueprints and technical roadmaps for GCP-based solutions, with proficiency in GCP products and tools such as Vertex AI, Gemini models, Agentspace, NotebookLM, G Suite, and Model Garden.
• Partner with enterprise architects, security, and SMEs to establish GCP DevOps standards and procedures.
• Provide expert-level recommendations and best practices for utilizing GCP products and services, applying demonstrated experience in delivering automated, secure cloud infrastructure solutions at enterprise scale.
• Collaborate with clients to understand their requirements and deliver tailored solutions, utilizing excellent oral and written communication skills to interact effectively with both technical and non-technical stakeholders.
• Document architecture designs, technical configurations, and deployment procedures thoroughly to ensure transparency and reproducibility.
• Ensure robust monitoring and incident response for deployed solutions through continuous improvement.
• Deploy and manage enterprise-scale GCP infrastructure using Infrastructure as Code (IaC) tools such as Terraform.
• Automate deployment pipelines using CI/CD tooling, particularly GitHub Actions, with experience in leading these initiatives.
• Experience with scripting languages such as Python, Bash, and PowerShell.

Posted 2 months ago

Apply

5 - 10 years

22 - 37 Lacs

Chennai

Hybrid

Naukri logo

Hexaware is conducting a walk-in interview for Data Scientist (GenAI) / Lead Data Scientist (GenAI) / Data Scientist Architect (GenAI) at the Chennai location on 29th March 2025 (Saturday). We are urgently looking for immediate/early joiners. Interested candidates can share their CV at umaparvathyc@hexaware.com.

MUST HAVE:
• 1.5+ years of experience and a good understanding of GenAI LLM models (ChatGPT, LLAMA 2, etc.), vector databases, LangChain, and LlamaIndex.
• 4.5+ years in machine learning, deep learning architecture, NLP, and OCR.

Primary Skills:
• Good understanding of GenAI LLM models (ChatGPT, LLAMA 2, etc.), vector databases, LangChain, and LlamaIndex.
• Hands-on experience with deep learning architecture, NLP, and OCR.
• Python FastAPI experience; SDA-based implementations for all the APIs.
• The architect should be hands-on and review the code developed by the developers.
• Should be able to take on and work on spike stories assigned to them.

Posted 2 months ago

Apply

4 - 9 years

0 - 1 Lacs

Pune

Hybrid

Naukri logo

We are looking for a Data Scientist who will support building AI tools for customers. The candidate should be skilled at using large data sets to find opportunities for product and process optimization, must have a proven ability to drive business results with data-based insights, and must be comfortable working with a wide range of stakeholders and functional teams.

Primary Skills: Python, AI, ML (Machine Learning), Gen AI (Generative AI + RAG), NLP, LLM, Deep Learning, Computer Vision, AWS, Cloud, or Azure OpenAI (any cloud)
Experience: 3+ to 16 years
Notice period: Immediate joiner to 45 days
Job Location: Pune

• More than 3 years of experience in the Data Science and AI/ML domain.
• At least 2-3 years of experience in machine learning and deep learning.
• Experience working on NLP; conversant with Python programming.
• Good knowledge of and experience working on Generative AI; adept in prompt engineering.
• Should have a good understanding of and experience working on Azure Cognitive Services.
• Should be able to work on NLP and Generative AI based tasks independently with minimum supervision.
• Good communication skills.
• Excellent understanding of machine learning techniques and algorithms, such as GPTs, CNN, RNN, k-NN, Naive Bayes, SVM, Decision Forests, etc.
• Experience using business intelligence tools (e.g., Tableau, Power BI) and data frameworks (e.g., Hadoop).
• Experience with cloud-native skills.
• Knowledge of SQL and Python; familiarity with Scala.
• Analytical mind, business acumen, and strong math skills (e.g., statistics, algebra).
• Experience with common data science toolkits, such as TensorFlow, Keras, PyTorch, pandas, Microsoft CNTK, NumPy, etc. Deep expertise in at least one of these is highly desirable.
• Experience with NLP, NLG, and large language models like BERT, LLaMA, LaMDA, GPT, BLOOM, PaLM, DALL-E, etc.
• Great communication and presentation skills; experience working in a fast-paced team culture.
• Experience with AI/ML tooling like AWS SageMaker, Azure Cognitive Services, Google Colab, Jupyter Notebook, Hadoop, PySpark, Hive, AWS EMR, etc.
• Experience with NoSQL databases, such as MongoDB, Cassandra, HBase, and vector databases.
• Good understanding of applied statistics skills, such as distributions, statistical testing, regression, etc.
• Should be a data-oriented person with an analytical mind and business acumen.

Posted 2 months ago

Apply

6 - 11 years

4 - 9 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Work from Office

Naukri logo

Implement and optimize LLM models such as GPT, Gemini, Llama, and Falcon, as well as RAG-based solutions. Fine-tune and optimize LLMs. Apply techniques such as GANs, VAEs, and Transformers. Implement solutions using Azure, GCP, and AWS services and frameworks. Knowledge and implementation experience of different LLMs and Generative AI use cases. Experience with LLM models and fine-tuning. Good understanding of GANs, VAEs, and Transformers. Experience with cloud platforms like Azure, GCP, and AWS. Bachelor's, Master's, or PhD with 6-10 years of experience in AI. Strong programming skills in Python. Experience with source code management systems. Knowledge of search engines and information retrieval. Excellent problem-solving and analytical skills. Strong communication and teamwork skills.

Posted 2 months ago

Apply

5 - 10 years

1 - 3 Lacs

Hyderabad

Remote

Naukri logo

Greetings from Aptimized! We are currently hiring for the position of Gemini Code Assist Specialist – Google Cloud Platform. We are looking for a professional with strong experience in network design, implementation, troubleshooting, and network security.

Position Title: Gemini Code Assist Specialist – Google Cloud Platform
Contract: 6 months (Location: Hyderabad)
Office Address: Mahaveer Techno Park, 4th Floor, Plot No. 6, Survey No. 64, Software Units Layout, HITEC City, Hyderabad, Telangana 500081
Walk-in Time: 10.00 AM - 1.00 PM IST

Job Overview: Aptimized is seeking an experienced senior Google Gemini developer to join our rapidly growing team. In this role, you will help automate CAD drawing analysis and produce project proposals with pricing using Google Gemini assisted software development tools. You will work closely with customers to leverage Gemini, drive product adoption, and provide technical expertise to ensure their success.

Key Responsibilities:
• Subject Matter Expertise: Lead the design of GenAI solutions, optimize ML infrastructure, and guide the development of data preparation and model optimization strategies. Utilize the Gemini product to analyze uploaded blueprint images, accurately identifying HVAC-specific symbols, dimensions, and annotations. Extract critical information from blueprints, including ductwork layouts, equipment locations, zoning, specifications, and material codes, using advanced OCR.
• Customer Engagement: Work with customers to understand their technical needs, develop AI-assisted software development use cases, and resolve blockers in their software development lifecycle.
• Product Development Collaboration: Partner with product management to prioritize improvements to the Gemini model, ensuring alignment with customer needs.
• Business Opportunity Identification: Identify and qualify business opportunities, address technical objections, and create strategies to resolve them.
• Content Creation: Develop technical materials (e.g., best practices, tutorials, code samples, presentations) to enable customers and internal teams to make the most of Google Cloud tools.
• Technical Support and Consultations: Conduct product and solution briefings, proof-of-concept work, and consulting sessions to support customers in leveraging the Google Gemini platform.

Qualifications:
Required:
• Proficiency in one or more programming languages/frameworks such as Python, Java, GoLang, JavaScript, TypeScript, PyTorch, and Jupyter/Colab notebooks.
• Experience with frameworks like Spring Boot and Model-View-Controller (MVC).
• Knowledge of modern software development lifecycle tools and methodologies, including IDEs, CI/CD, and container orchestration.
Preferred:
• 5-8 years of experience in software development and with data structures/algorithms.
• 3-5 years of experience with state-of-the-art GenAI techniques (e.g., LLMs, multi-modal models, large vision models) or with GenAI-related concepts (language modeling, computer vision).
• 3-5 years of experience leading ML design and optimizing ML infrastructure (e.g., model deployment, model evaluation, data processing, debugging, fine-tuning).
• Ability to create prototypes/demos and present them effectively to customers, integrating them with existing systems and processes.
• Proven ability to engage with C-level executives and influence decisions.
• Excellent consulting and communication skills, with experience in leading discovery and planning sessions to define scope and success criteria.
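For illustration only (an editorial addition, not part of the posting): a minimal sketch of sending a drawing image plus an extraction prompt to a Gemini model with the google-generativeai SDK. The model name, file path, and prompt are placeholders, and real blueprint extraction would need far more structured prompting and validation.

```python
# Illustrative multimodal extraction sketch (placeholders throughout);
# assumes the `google-generativeai` and `Pillow` packages and a GOOGLE_API_KEY in the environment.
import os
import google.generativeai as genai
from PIL import Image

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-pro")  # placeholder model name

blueprint = Image.open("floor_plan.png")  # hypothetical blueprint image
prompt = (
    "List the HVAC-related symbols, duct dimensions, and equipment annotations "
    "visible in this drawing as structured JSON."
)

# Text and image parts are passed together; the model returns a single text response.
response = model.generate_content([prompt, blueprint])
print(response.text)
```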

Posted 2 months ago

Apply

5 - 10 years

25 - 40 Lacs

Pune

Remote

Naukri logo

Job Role: Google Cloud AI/ML Engineer We are seeking an experienced and innovative Google Cloud AI/ML Engineer to design, build, and deploy AI and ML solutions on Google Cloud Platform (GCP). The ideal candidate will be responsible for developing cutting-edge AI/ML solutions to solve complex business problems, leveraging the suite of tools and technologies available in GCP. Key Responsibilities : Design, build, train, and deploy machine learning models using Vertex AI, while integrating Gemini for advanced multi-modal AI capabilities (e.g., text and image data processing). Leverage Google Clouds pre-built AI APIs (Vision AI, Natural Language AI, Translation AI) to accelerate development workflows. Work on end-to-end AI/ML pipelines, from data ingestion to model deployment, using Vertex Pipelines, Cloud Build, and other workflow automation tools. Automate scalable pipelines for end-to-end AI/ML workflows, including data preparation, model training, and monitoring, utilizing tools like Cloud Dataflow and BigQuery ML. Process, clean, and transform data to make it usable for modeling using Dataprep, BigQuery, and Cloud Data Fusion. Build efficient, secure, and scalable data pipelines with tools like Cloud Storage, BigQuery, and Cloud Dataflow to support ML workflows effectively. Experiment with Gemini's advanced generative AI and multi-modal capabilities in Generative AI Studio and Model Garden to unlock innovative AI applications. Use tools such as Vertex AI Workbench, Codey AI, and Gemini to simplify prototyping, streamline debugging, and accelerate AI/ML solution development. Empower development teams by integrating predictive and generative AI insights powered by Gemini into enterprise workflows. Develop generative AI features by leveraging Gemini, PaLM APIs, and Generative AI Studio for business-critical applications. Fine-tune machine learning models to improve accuracy, speed, and scalability, using Vertex AI AutoML and custom training environments. Monitor and retrain AI/ML models to ensure ongoing performance aligned with business needs, leveraging MLOps tools such as Vertex Pipelines, Vertex AI Model Monitoring, and Kubernetes Engine (GKE). Ensure scalability, security, and robustness of AI/ML models through Vertex AI Model Monitoring, ML Metadata Tracking, and other advanced tools. Seamlessly integrate AI/ML models into enterprise systems using Cloud Run, Cloud Functions, and custom APIs. Collaborate with interdisciplinary teams to ensure the successful development and implementation of AI/ML workflows aligned with organizational goals. Stay informed about the latest advancements in generative AI tools, Gemini, PaLM APIs, and GCP AI technologies, applying these innovations to improve solutions and services. Explore emerging AI/ML technologies on Google Cloud to expand organizational capabilities and achieve business objectives. Utilize Geminis advanced features to address challenges that require multi-modal capabilities, generative AI, or large language models (LLMs). Ensure compliance with Responsible AI practices, including fairness, transparency, and ethical considerations, using GCP tools like Explainable AI on Vertex AI and data governance features. Maintain compliance with data security, governance regulations, and organizational standards such as Google Cloud IAM, audit logging, and encryption in transit. Required Skills/Qualifications : 3+ years of experience in designing, developing, and deploying AI/ML models on Google Cloud Platform (GCP). 
Expertise in developing AI/ML solutions using key GCP tools, including Vertex AI, BigQuery ML, Gemini, and Cloud AI APIs like Vision, NLP, and Translation. Proficiency in leveraging Gemini for multi-modal AI applications and integrating generative AI solutions into practical workflows. Strong programming skills in Python, R, or Java, with experience in AI frameworks such as TensorFlow and PyTorch. Knowledge of data engineering tools, such as Cloud Dataflow, BigQuery, Cloud Pub/Sub, and Cloud Data Fusion, for building scalable pipelines. Hands-on experience with Vertex AI AutoML, Vertex Pipelines, and Vertex AI Model Monitoring for model lifecycle management and MLOps implementation. Proficiency in building and deploying ML models in serving environments such as GKE, Cloud Run, and Cloud Functions. Strong understanding of generative AI technologies, including Gemini, PaLM APIs, Codey AI, and Generative AI Studio, for enhanced productivity and AI-powered development. Familiarity with Responsible AI principles, leveraging GCP tools such as Explainable AI and following data fairness and bias management practices. Strong collaboration, communication, and problem-solving skills for working with technical teams and business stakeholders. Bachelors or Masters degree in Computer Science, Data Science, AI/ML, or a related field. Relevant certifications like Google Professional Machine Learning Engineer, Google Cloud Professional Data Engineer. Job Role: Google Cloud AI Solution Architect Role Overview: We are looking for a Google Cloud AI Solution Architect with deep expertise in Google Cloud Platform (GCP) services to design and implement AI solutions on Google Cloud. This role requires proficiency in Google Cloud AI/ML tools, AI-powered development frameworks like Codey AI, and strategic AI integrations using Gemini, Generative AI Studio, Vertex AI, and PaLM APIs. Key Responsibilities : Design end-to-end AI/ML solutions using Google Cloud services, including Vertex AI, BigQuery ML, Cloud AI APIs (Vision AI, NLP AI, and Translation AI), and generative AI tools like Generative AI Studio, Gemini, and PaLM APIs, integrating capabilities like Codey AI for rapid prototyping and productivity. Architect scalable, secure, and high-performance AI systems that seamlessly integrate into cloud-native enterprise business applications. Assess business objectives and requirements to determine where AI/ML solutions and multi-modal AI capabilities like those provided by Gemini can deliver maximum value and ROI. Lead the development of AI/ML pipelines, including data preparation, model development, deployment, and monitoring, leveraging tools like Vertex Pipelines and Cloud Build. Integrate AI models into enterprise cloud-based architectures while ensuring security, robustness, and scalability. Implement scalable, efficient data pipelines using BigQuery, Cloud Dataflow, and Cloud Data Fusion for data transformation and preparation in AI/ML workflows. Leverage Geminis advanced multi-modal capabilities (e.g., processing text, images, and other data types) for innovative AI solutions, as well as Vertex AI Prediction for deploying and managing ML models. Provide strategic direction on enterprise AI adoption, aligning Google Cloud AI/ML solutions with organizational needs and using tools like Gemini for cutting-edge multi-modal implementations. Collaborate with data scientists, AI engineers, and other stakeholders to ensure architectural alignment and successful AI/ML solution delivery. 
Drive innovation by staying informed about Google's evolving AI/ML capabilities, including Generative AI advancements like Gemini and Responsible AI practices. Foster the adoption of LLM-based generative AI tools for enterprise AI development and processes, leveraging both PaLM APIs and Gemini. Advise on MLOps best practices using Vertex AI Pipelines, Cloud Build, and Google Kubernetes Engine (GKE) to automate and optimize model development, monitoring, and lifecycle management. Ensure alignment with compliance standards and data governance policies using GCP tools like Google Cloud IAM, Cloud Audit Logs, and Data Catalog for secure AI architecture. Required Skills/Qualifications : 5+ years of experience in designing and implementing AI/ML solutions, with a focus on Google Cloud technologies and enterprise-level deployments. Expertise in architecting AI/ML solutions using Google Cloud AI/ML services, including Vertex AI, BigQuery ML, Gemini, and Cloud AI APIs (Vision AI, NLP AI, Text-to-Speech). Proficiency in leveraging Generative AI tools, such as Gemini, PaLM APIs, Generative AI Studio, and Codey AI, for delivering cutting-edge generative and multi-modal AI solutions. Strong foundation in programming languages such as Python, R, and Java, with hands-on experience in TensorFlow and PyTorch for AI/ML development. Skilled in managing and optimizing AI workflows with tools like Vertex Feature Store, Vertex Experiments, and Vertex AI Model Monitoring. Hands-on experience with data tools such as BigQuery, Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage for data preparation and analytics. Proficiency in deploying and managing scalable AI/ML models using frameworks like Google Kubernetes Engine (GKE) and Cloud Run. Experience with MLOps practices leveraging tools like Cloud Build, Vertex Pipelines, and Vertex Model Registry for CI/CD in AI/ML solutions. Strong knowledge of GCP DevOps practices, including containerization using Docker, orchestration using Kubernetes, and scalable architecture using GKE. Ability to provide strategic guidance on the adoption of generative AI tools like Gemini and PaLM APIs in enterprise-level projects to streamline AI-powered processes. Familiarity with Responsible AI principles, including explainability, bias mitigation, and fairness, leveraging tools like Explainable AI for Vertex AI. Strong problem-solving abilities and a deep understanding of integrating AI into real-world business environments. Excellent collaboration and communication skills to engage with technical and business stakeholders effectively. Bachelors or Masters degree in Computer Science, AI/ML, Data Science, or a related field. Relevant certifications like Google Professional Machine Learning Engineer, Google Cloud Professional Data Engineer, or TensorFlow Developer Certification. Google Cloud Data Architect Job Summary We are seeking skilled and motivated Google Cloud Data Architects to join our team. The candidate will have hands-on experience in designing, implementing, and optimizing enterprise-scale data architectures, leveraging the full suite of Google Cloud Platform (GCP) data services. This role is critical for supporting our data-driven projects by ensuring efficient, secure, and scalable data solutions across various platforms. Key Responsibilities • Design and develop scalable, secure, and cost-effective data architectures to support the organization's (including clients') data processing and analytics needs for traditional and AI-driven environments. 
• Work with Google Cloud data services such as BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage, Cloud Spanner, Firestore, and Cloud SQL. • Architect data ingestion, integration, and ETL/ELT strategies. • Design and implement data lakes, data warehouses, and real-time analytics solutions. • Define data governance, security, and compliance frameworks in accordance with organizational policies and industry regulations. • Ensure data accuracy, completeness, and consistency across systems. • Automate data workflows and optimize performance using Cloud Composer (Apache Airflow), Cloud Functions, or Python. • Monitor and troubleshoot data pipeline performance issues to identify opportunities for optimization. • Collaborate with data engineers, data scientists, analysts, and business stakeholders to define data requirements. • Manage and enhance data integration across hybrid and multi-cloud environments to ensure seamless data flow. • Implement best practices for data security, privacy, and access controls. • Stay updated with the latest advancements in Google Cloud data technologies and recommend innovative solutions. Required Skills/Qualifications • Proven experience working with Google Cloud data services, including (but not limited to) BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage, Cloud SQL, Cloud Spanner, Firestore, and Pub/Sub. • Strong expertise in data modeling, data governance, data security, and database design principles. • Experience in architecting scalable and high-performance data solutions on Google Cloud. • Expertise in big data technologies such as Dataproc (Hadoop/Spark) and real-time data processing (Pub/Sub, Dataflow, Kafka). • Proficiency in SQL, Python, or Java for data processing and automation. • Experience with Looker or Data Studio for advanced data visualization and reporting. • Strong problem-solving skills and the ability to design robust, cost-effective, and scalable data architectures. • Knowledge of Vertex AI, ML Ops best practices, and ML model data pipelines. • Experience with data lakehouse architectures and hybrid/multi-cloud strategies. • Strong communication and leadership skills, with the ability to engage both technical and non-technical stakeholders. • Familiarity with DevOps (CI/CD for data pipelines) and Infrastructure as Code (Terraform, Deployment Manager, or Cloud Build). • Bachelors/Masters degree in Computer Science, Information Technology, or a related field. • Google Cloud Certifications such as Professional Data Engineer or Professional Cloud Architect is preferred. Google Cloud Data Engineer Job Summary We are seeking skilled and motivated Google Cloud Data Engineers to join our team in associate roles. The candidate will have hands-on experience in developing, implementing, and optimizing data pipelines, leveraging the full suite of Google Cloud Platform (GCP) data services. This role is critical for supporting our data-driven projects by ensuring efficient, secure, and reliable data solutions and flows across various platforms. Key Responsibilities • Develop, maintain, and optimize scalable data pipelines to support the organizations (including clients') data processing and analytics needs for traditional and AI-driven environments. • Work with Google Cloud data services such as BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage, Cloud SQL, Cloud Spanner, and Firestore. • Design data ingestion, integration, and ETL/ELT strategies. 
• Design and implement ETL/ELT processes to extract, transform, and load data from diverse sources into centralized data repositories. • Design and implement data lakes, data warehouses, and real-time analytics solutions. • Ensure data accuracy, completeness, and security within pipelines and storage. • Maintain the quality of structured and unstructured data across platforms. • Automate data workflows using technologies like Cloud Functions, Cloud Composer (Apache Airflow), or Python. • Monitor and troubleshoot performance issues to identify opportunities for optimization. • Collaborate with data scientists, architects, analysts, and business stakeholders. • Manage and enhance data integration across platforms to ensure uninterrupted flow of information. • Perform data cleansing, transformation, and enrichment techniques to maintain high-quality datasets. • Implement and enforce data governance, security, and privacy practices in compliance with organizational and regulatory standards. • Stay updated with the latest advancements in Google Cloud data technologies and recommend innovative solutions. Required Skills/Qualifications • Proven experience working with Google Cloud data services, including (but not limited to) BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud SQL, Cloud Storage, Firestore, and Pub/Sub. • Solid understanding of data modeling, data governance, data security & privacy, and database design principles. • Experience with designing data architectures supporting AI/ML workloads and big data technologies like Dataproc (Hadoop/Spark) is a plus. • Proficiency in SQL and Python for data processing and automation. • Experience with Looker or Data Studio for data visualization and reporting is a plus. • Strong problem-solving skills with the ability to translate complex data into clear insights. • Understanding of data warehousing, lakehouse architectures, and real-time analytics. • Knowledge of Vertex AI, ML Ops practices, and ML model data requirements. • Experience with preparing and transforming data for ML model training, including Vertex AI integration with BigQuery and Dataproc is a plus. • Strong communication skills, with the ability to engage both technical and non-technical stakeholders. • Familiarity with DevOps (CI/CD for data pipelines) and Infrastructure as Code (Terraform, Deployment Manager, or Cloud Build). • Bachelors/Masters degree in Computer Science, Information Technology, or a related field. • Google Cloud Certifications such as Professional Data Engineer is preferred. Compensation: Day rate of INR10,000 to INR15,000 per day Permanent salary between 20 lakh to 50 lakh, depending on the experience Working from Home Opportunity to travel onsite
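For illustration only (an editorial addition, not part of the posting above): the AI/ML Engineer role describes building generative features with Vertex AI and Gemini; a minimal sketch of one such call through the vertexai SDK follows. The project ID, region, and model name are placeholders.

```python
# Illustrative Vertex AI + Gemini call (placeholder project, region, and model name);
# assumes the `google-cloud-aiplatform` package, which provides the `vertexai` SDK, and application-default credentials.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholders

model = GenerativeModel("gemini-1.5-pro")  # placeholder model name
response = model.generate_content(
    "Draft a one-paragraph summary of last week's pipeline failures for the on-call report."
)
print(response.text)
```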

Posted 2 months ago

Apply

3 - 6 years

15 - 30 Lacs

Hyderabad

Hybrid

Naukri logo

We are RadarRadar, experts in the commodity production, trade, and processing industry. As a technology company we continuously aim to support our clients with strong data & analytics and business intelligence tools. It is our mission to enable companies to unlock the full potential of their data to improve risk and margin management and boost performance.

As a Machine Learning (ML) Engineer you will work alongside an engineering team that builds RadarRadar machine learning tools for our clients. We are looking for an individual with deep experience in designing, implementing, assessing, and refining machine learning models. You love to work and think progressively, because as first movers we need to stay ahead of market trends and deliver first-class products.

What will you do:
• Design and implement ML/AI solutions to optimize and enhance the RadarRadar product suite.
• Adhere to high-quality development principles while delivering solutions on time.
• Participate in scrum-related activities (stand-ups, planning, demo sessions, etc.).
• Work closely with cross-functional teams, including product and engineering, to align ML/AI capabilities with business goals.
• Create technical documentation and specifications for feature integrations.
• Conduct research on, and implement, the latest industry-standard best practices.
• Break down, estimate, and implement features containing multiple tasks.
• Review, assess, and improve the full set of activities presented above.

What will you bring:
• 3+ years of professional experience as an ML engineer or in a similar role.
• Proficiency with state-of-the-art ML models and frameworks (e.g., Keras, NumPy, scikit-learn, PyTorch, TensorFlow).
• Knowledge of and interest in current progress in the AI world, and interest in enterprise-level development of AI models using LLM APIs (e.g., OpenAI, Gemini, Claude, Anthropic).
• Expertise in hyperparameter tuning to optimize model performance.
• Familiarity with Explainable AI (XAI) techniques.
• Advanced Python programming skills (min. 2 years of experience).
• Proficient T-SQL (SQL Server) knowledge (min. 1-2 years), with experience working with stored procedures, functions, triggers, indexes, dynamic SQL, and query performance tuning.
• Comfortable with complexity and the pursuit of excellent results; willing to learn on the job.
• A proactive thinker who can work independently and bring innovative ideas.
• Balances hard skills with interpersonal ability.
• Strong knowledge of clean code principles.
• Ability to work in a collaborative manner.

What you will get:
• Rotterdam-based hybrid workplace model.
• A competitive salary and work with an amazing team.
• Flexible working hours and your own laptop.
• Friday drinks and (virtual) team events.
• Company merchandise.
• An inspiring environment where you learn every day.
• A personal development plan to help you reach your personal goals.
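For illustration only (an editorial addition, not part of the posting): a minimal hyperparameter-tuning sketch of the kind the role calls for, using scikit-learn's GridSearchCV on a toy dataset. The dataset, parameter grid, and scoring metric are placeholders.

```python
# Illustrative hyperparameter-tuning sketch using scikit-learn on a built-in toy dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Search a small grid of forest sizes and depths with 5-fold cross-validation.
param_grid = {"n_estimators": [100, 300], "max_depth": [None, 5, 10]}
search = GridSearchCV(RandomForestClassifier(random_state=42), param_grid, cv=5, scoring="f1")
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("held-out accuracy:", search.best_estimator_.score(X_test, y_test))
```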

Posted 2 months ago

Apply

3 - 6 years

12 - 22 Lacs

Pune, Bengaluru, Hyderabad

Hybrid

Naukri logo

Role & responsibilities:
• Well-versed with the SDLC, STLC, and Agile methodologies.
• Strong experience in testing with assistive technologies on mobile and web platforms such as NVDA, JAWS, VoiceOver, Talkback, etc.
• Experience with Document Accessibility Testing using assistive technologies and automated tools like PDF Checker, PAC, MS Accessibility tools, etc.
• Strong understanding of WCAG 2.2/2.1 guidelines and region-specific regulations such as ADA, AODA, DDA, etc.
• Demonstrable experience with WAI-ARIA.
• Ability to work with Developers and Product Owners to understand Accessibility requirements and implementation details, and to develop detailed Accessibility test strategies.

Posted 3 months ago

Apply

5 - 10 years

12 - 18 Lacs

Gurgaon/Gurugram, Delhi / NCR

Work from Office

Naukri logo

We are looking for a versatile candidate with strong experience in end-to-end development of Java applications. The candidate will be responsible for the tasks below.

Responsibilities:
• 5+ years of core Java development experience; Java 1.8 experience is a must.
• Thorough understanding of OOP concepts, design principles, and implementation of different types of design patterns.
• Extensive experience with Java EE platforms, Spring Boot, Spring MVC, Spring Batch, and microservices.
• Experience with Java, Spring Framework, Hibernate, JPA, and RESTful web services.
• Sound understanding of concepts like exception handling, serialization/deserialization, immutability, etc.
• Good knowledge of enums, collections, annotations, and generics; basic understanding of Java Memory Management (JMM), including garbage collection concepts.
• Strong expertise in algorithms and data structures.
• Working closely with product managers or individuals/teams.
• Strong experience with unit testing and test-driven development.
• Be a product owner.

Required Candidate Profile: The candidate should be a team player with experience in developing solutions within a specified time frame. He/she should have a desire to learn new technologies and an exploring nature, willing to take on work that has not been done before.

Posted 1 month ago

Apply

5 - 7 years

8 - 12 Lacs

Hyderabad, Chennai

Hybrid

Naukri logo

Job Summary: We are seeking a forward-thinking AI Business Analyst with a strong foundation in IT business analysis and a passion for AI-driven innovation. The ideal candidate will possess exceptional analytical skills, technical expertise, and experience in delivering transformative digital solutions. This role will play a pivotal part in bridging the gap between business needs and cutting-edge AI technologies to drive strategic value across the organization.

Key Responsibilities:
• Requirements Gathering: Engage with stakeholders to elicit, analyze, and document business requirements. Ensure business needs are aligned with strategic objectives and technological capabilities.
• Solution Design: Evaluate current business processes and identify opportunities for AI integration and digital transformation. Collaborate with cross-functional teams to design innovative, data-driven solutions.
• Project Support: Assist project managers in delivering AI and digital initiatives on time and within scope. Support end-to-end project lifecycles, including planning, execution, and post-implementation analysis.
• Data Analysis: Leverage data analytics tools to analyze large and complex datasets. Generate insights and actionable recommendations to inform decision-making and optimize processes.
• Stakeholder Communication: Maintain strong communication channels with internal and external stakeholders. Translate technical concepts into clear, business-friendly language.
• Documentation: Prepare comprehensive documentation, including business requirements, functional specifications, process flows, and user guides.
• Training & Support: Facilitate user training and onboarding for new AI systems and digital tools. Provide ongoing support to ensure user adoption and satisfaction.

Qualifications:
• Education: Bachelor's degree in Computer Science, Information Technology, Business Administration, or a related field.
• Experience: 5-7 years of experience as an IT Business Analyst.
• Technical Proficiency: Familiarity with data analytics platforms and visualization tools. Exposure to AI technologies (e.g., ChatGPT, Gemini, machine learning frameworks). Experience with project management tools (e.g., Jira, Trello, Asana).

Preferred Qualifications:
• AI Expertise: Experience with AI models, natural language processing, predictive analytics, or intelligent automation.
• Digital Transformation: Hands-on experience in initiatives such as cloud migration, process automation, and digital strategy development.
• Certifications: Relevant certifications in Business Analysis (e.g., CBAP), AI/ML, or Agile methodologies are a plus.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies