
1657 FastAPI Jobs - Page 36

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Responsibilities:
• Design and develop GenAI-based solutions using LLMs (e.g., Bedrock, OpenAI, Claude) for text, image, table, diagram, and multi-modal applications.
• Implement a multi-agent system that integrates structured and unstructured data sources, including knowledge graphs, embeddings, and vector databases.
• Build and deploy agentic AI workflows capable of autonomous task completion, using frameworks like LangChain, LangGraph, or CrewAI.
• Perform fine-tuning, retraining, or adaptation of open-source or proprietary LLMs for specific domain tasks.
• Collaborate with data scientists and domain experts to curate and preprocess training datasets.
• Integrate models with scalable backend APIs or pipelines (REST, FastAPI, gRPC) for real-world applications.
• Stay updated with state-of-the-art research and actively contribute to enhancing model performance and interpretability.
• Optimize inference, model serving, and memory management for deployment at scale.
Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, AI/ML, Data Science, or a related field.
• 5+ years of hands-on experience in Deep Learning, NLP, and LLMs.
• Proven experience with at least one end-to-end project involving multi-modal RAG and Agentic AI.
• Proficiency in Python and ML/DL libraries such as PyTorch, TensorFlow, Transformers (Hugging Face), LangChain, LangGraph, Bedrock, or similar.
• Experience in fine-tuning or adapting LLMs (using LoRA, QLoRA, PEFT, or full fine-tuning).
• Experience in building multi-agent systems.
• Strong understanding of knowledge graphs, embeddings, vector databases (e.g., FAISS, Chroma, Weaviate), and prompt engineering.
• Strong understanding of and experience with a cloud platform such as AWS.
• Familiarity with containerization (Docker, Kubernetes).
Preferred Skills:
• Experience in the biopharma industry.
• Design and implement user-friendly interfaces for AI applications.
• Utilize modern web frameworks (e.g., React, Vue.js) to create engaging user experiences.
• Develop scalable and efficient backend systems to support the deployment of AI models.
• Integrate with cloud platforms (AWS) for infrastructure management.
• Hands-on experience with vision-language models (e.g., CLIP, BLIP, LLaVA).
• Publications, Kaggle competitions, or GitHub projects in GenAI.
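Editor's note: the retrieval layer this posting describes (embeddings plus a vector database such as FAISS feeding an LLM) can be illustrated with a minimal sketch. The model name, sample documents, and query below are illustrative assumptions, not part of the posting.

# Minimal retrieval sketch for a RAG pipeline: embed documents, index them in
# FAISS, and fetch the nearest chunks for a query. Assumes sentence-transformers
# and faiss-cpu are installed; the model name and documents are placeholders.
import faiss
from sentence_transformers import SentenceTransformer

documents = [
    "Bedrock exposes foundation models behind a single AWS API.",
    "LoRA fine-tuning updates a small set of adapter weights.",
    "Vector databases store embeddings for similarity search.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")            # hypothetical model choice
doc_vectors = model.encode(documents, convert_to_numpy=True).astype("float32")

index = faiss.IndexFlatL2(doc_vectors.shape[1])            # exact L2 search
index.add(doc_vectors)

query = "How does LoRA adapt a large language model?"
query_vec = model.encode([query], convert_to_numpy=True).astype("float32")
distances, ids = index.search(query_vec, k=2)              # top-2 chunks

for rank, doc_id in enumerate(ids[0]):
    print(rank, documents[doc_id])
# The retrieved chunks would then be stuffed into the LLM prompt as grounding context.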

Posted 2 weeks ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

--POSITION OPEN FOR IMMEDIATE JOINERS ONLY--
--THOSE WHO HAVE APPLIED EARLIER NEED NOT APPLY AGAIN--
The following selection criteria will be strictly followed:
- Graduated in 2020 or before
- 4+ years of experience in the industry, excluding any career breaks
- Python or Java proficiency is a MUST
- Should be able to join within 30 days once the offer letter is released
Budget - There is no budget cap for the right talent
Location - Noida
Tech Stack - GCP, Pub/Sub, Postgres/PostGIS, Redis, Django/FastAPI/Spring Boot, GDAL, GeoServer, etc.
What will you be building?
The SiteRecon platform consists of GIS-based web applications, mobile applications, and several custom integrations - both tool-based and API-based - for its customers. As a full-stack engineer, you would be responsible for owning feature deliveries encompassing several interfaces: what's built, how it's built, and how to make it a success. You will have a team of dedicated engineers, sales experts, and marketing maestros who will work together with you to make the product, and yourself, successful.
What are your responsibilities?
- Owning entire code bases, improving them over time, and killing tech debt.
- Production systems: you will have full access to the end delivery systems and will handle security issues, performance issues, and QA issues. Isn't that what complete ownership stands for?
- You make decisions and drive what's built in the future at SiteRecon. Whatever you need - tools, processes, people - you will have it all to make great software.
A little bit about us - We are helping property maintenance contractors in the US automate property mapping and site visits through our mapping platform: https://order.siterecon.ai/
Hiring Manager - https://www.linkedin.com/in/innovationchef/
Drop a note to ankit@siterecon.ai if you want to discuss more about what you would be doing!
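Editor's note: as a rough illustration of the stack listed above (FastAPI in front of Postgres/PostGIS), here is a minimal sketch of an endpoint that computes a parcel's area with a PostGIS function. The table name, column names, and connection string are assumptions for illustration only.

# Minimal FastAPI + PostGIS sketch: fetch a parcel geometry's area in square
# meters. Assumes a "parcels" table with an id and a geography column named
# "boundary"; the DSN and schema are hypothetical.
import psycopg2
from fastapi import FastAPI, HTTPException

app = FastAPI()
DSN = "dbname=siterecon user=app password=secret host=localhost"  # placeholder

@app.get("/parcels/{parcel_id}/area")
def parcel_area(parcel_id: int) -> dict:
    conn = psycopg2.connect(DSN)
    try:
        with conn.cursor() as cur:
            # ST_Area on a geography column returns square meters.
            cur.execute(
                "SELECT ST_Area(boundary) FROM parcels WHERE id = %s",
                (parcel_id,),
            )
            row = cur.fetchone()
    finally:
        conn.close()
    if row is None:
        raise HTTPException(status_code=404, detail="parcel not found")
    return {"parcel_id": parcel_id, "area_sq_m": row[0]}

# Run locally with: uvicorn app:app --reload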

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

🔍 We're Hiring: AI Engineer / GenAI Engineer (3+ years)
📍 Location: Hyderabad (Full-time, Hybrid)
🏢 Join Our Mission to Build the Future with GenAI
Are you passionate about Generative AI, LLMs, and pushing the boundaries of Agentic AI? Do you thrive at the intersection of innovation and impact? We're looking for a skilled AI Engineer to join our dynamic team developing multi-modal, intelligent, and autonomous AI systems that transform the way industries operate.
🚀 What You'll Do:
Design and develop cutting-edge GenAI solutions using LLMs (OpenAI, Claude, Bedrock) across text, image, and tabular data.
Build multi-agent AI workflows using LangChain, LangGraph, and CrewAI, integrated with knowledge graphs and vector databases.
Fine-tune or adapt open-source/proprietary LLMs (LoRA, QLoRA, PEFT) for domain-specific tasks.
Integrate scalable APIs and pipelines using FastAPI, REST, or gRPC.
Collaborate with data scientists to curate datasets and optimize real-world performance.
Drive innovation by staying on top of the latest research and development in GenAI, VLMs, and RAG.
✅ What We're Looking For:
5+ years of hands-on experience in Deep Learning, NLP, and LLMs.
Experience in building multi-modal RAG systems and Agentic AI solutions.
Proficiency in Python, PyTorch, Hugging Face, LangChain, Bedrock, etc.
Strong knowledge of embeddings, vector databases (FAISS, Chroma, Weaviate), and prompt engineering.
Experience with cloud platforms (AWS) and containerization tools (Docker/Kubernetes).
⭐ Bonus Skills:
Experience in the biopharma domain
Hands-on with vision-language models (CLIP, BLIP, LLaVA)
Strong frontend (React/Vue.js) and backend (FastAPI, Flask) experience
GitHub projects, publications, or Kaggle contributions in GenAI

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

🚀 AI Engineering Intern (SDE) - Founding Tech Interns | Opportunity of a Lifetime
Location: Gurgaon (In-Office)
Duration: 3-6 months (flexible based on academic schedule)
Start Date: Immediate openings
Open to: Tier 1 college students graduating in 2025 only
Compensation: Stipend + pre-placement offer potential + founders' recommendation for global fellowships (Google, Meta, etc.)
🧠 About Us - Darwix AI
Darwix AI is on a mission to solve a problem no one has cracked yet: building real-time, multilingual conversational intelligence for omnichannel enterprise sales teams using the power of Generative AI. We're building India's answer to Gong + Refract + Harvey AI, trained on 1M+ hours of sales conversations and packed with industry-first features like live agent coaching, speech-to-text in 11 Indic languages, and autonomous sales enablement nudges. We have global clients, intense velocity, and a team of ex-operators from IIMs, IITs, and top-tier AI labs.
🌌 Why This Internship is Unlike Anything Else
💡 Work on a once-in-a-decade problem, pushing the boundaries of GenAI + speech + edge compute.
🛠️ Ship real products used by enterprise teams across India and the Middle East.
🧪 Experiment freely: train models, optimize pipelines, fine-tune LLMs, or build scrapers that work in 5 languages.
🚀 Move fast, learn faster, with direct mentorship from the founding engineering and AI team.
🏆 A proof-of-excellence opportunity that stands out in every future job, B-school, or YC application.
💻 What You'll Do
Build and optimize core components of our real-time agent assist engine (Python + FastAPI + Kafka + Redis).
Train, evaluate, and integrate Whisper, wav2vec, or custom STT models on diverse datasets.
Work on LLM/RAG pipelines, prompt engineering, or vector DB integrations.
Develop internal tools to analyze, visualize, and scale insights from conversations across languages.
Optimize for latency, reliability, and multilingual accuracy in dynamic customer environments.
🌟 Who You Are
Pursuing a B.Tech/B.E. or dual degree from IITs, IIITs, BITS, NIT Trichy/Warangal/Surathkal, or other top-tier institutes.
Comfortable with Python, REST APIs, and database operations. Bonus: familiarity with FastAPI, LangChain, or Hugging Face.
Passionate about AI/ML, especially NLP, GenAI, ASR, or multimodal systems.
Always curious, always shipping, always pushing yourself beyond the brief.
Looking for an internship that actually matters, not one where you're just fixing CSS.
🌐 Tech You'll Touch
Python, FastAPI, Kafka, Redis, MongoDB, Postgres
Whisper, Deepgram, wav2vec, Hugging Face Transformers
OpenAI, Anthropic, Gemini APIs
LangChain, FAISS, Pinecone, LlamaIndex
Docker, GitHub Actions, Linux environments
🎯 What's in It for You
A pre-placement offer for the best performers.
A chance to be a founding engineer post-graduation.
Exposure to the VC ecosystem, client demos, and GTM strategies.
Stipend + access to the tools, courses, and compute resources you need to thrive.
🚀 Ready to Build the Future?
If you're one of those rare folks who can combine deep tech with deep curiosity, this is your call to adventure. Join us in building something that's never been done before.
Apply now at careers@cur8.in. Attach your CV + GitHub/portfolio + a line on why this excites you. Bonus points if you share a project you've built or an AI problem you're obsessed with.
Darwix AI | GenAI for Revenue Teams | Built from India for the World
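Editor's note: the speech-to-text work mentioned above (Whisper-style STT models) can be sketched with the open-source openai-whisper package. The model size, language, and audio file name below are assumptions; a production pipeline would add streaming, batching, and language detection.

# Minimal speech-to-text sketch with the open-source openai-whisper package.
# The model size ("base") and audio path are placeholders; real agent-assist
# pipelines would stream audio and pick a larger or fine-tuned model.
import whisper

model = whisper.load_model("base")                          # downloads weights on first use
result = model.transcribe("sales_call.wav", language="hi")  # Hindi chosen as an example

print(result["text"])                                       # full transcript
for segment in result["segments"]:                          # per-segment timing
    print(f"{segment['start']:.1f}s - {segment['end']:.1f}s: {segment['text']}")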

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Source: LinkedIn

Job Role: Sr Full Stack Developer (Python + Angular + GCP/AWS/Azure)
Experience: 5+ years
Notice period: Immediate
Location: Trivandrum/Kochi
Introduction
We are looking for a Sr Full Stack (Python & Angular) Developer who will take ownership of building and maintaining complex backend systems, APIs, and applications using Python, with Angular on the frontend. Profiles with BFSI / payment-system integration experience are desired.
Responsibilities include:
• Design, develop, and maintain backend applications, APIs, and services using Python.
• Write clean, maintainable, and scalable code following industry standards and best practices.
• Optimize application performance and ensure high availability and scalability.
• Review code and mentor junior developers to ensure code quality and foster knowledge sharing.
• Implement unit and integration tests to ensure application robustness.
• Set up and manage CI/CD pipelines using tools like Jenkins, GitLab CI, or CircleCI.
• Collaborate with DevOps to deploy applications on cloud platforms, preferably Google Cloud Platform (GCP).
• Design and build cloud-native applications using APIs, containers, and Kubernetes.
• Leverage GCP services to develop scalable and efficient solutions.
• Ensure application security, manage access controls, and comply with data privacy regulations.
• Work closely with frontend developers, DevOps engineers, and product managers for seamless project delivery.
• Design, manage, and optimize relational and NoSQL databases (PostgreSQL, MySQL, MongoDB).
• Monitor application performance using tools like Prometheus, Grafana, or Datadog.
• Build dynamic, responsive UIs using Angular and JavaScript.
• Develop and maintain reusable Angular components in collaboration with UX/UI teams.
Primary Skills:
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
• 5-7 years of experience as a Python developer, with a focus on product development (backend and frontend), and hands-on experience with Angular.
• Proven experience in designing and deploying scalable applications and microservices; app integration experience is preferred.
• Python - FastAPI (Flask/Django)
• API development (RESTful services)
• Cloud platforms - Google Cloud Platform (GCP) preferred
• Familiarity with database management systems (PostgreSQL, MySQL, MongoDB) and ORMs (e.g., SQLAlchemy, Django ORM)
• Knowledge of CI/CD pipelines - Jenkins, GitLab CI, CircleCI
• Frontend development - JavaScript, Angular
• Code versioning - Git
• Testing - unit and integration testing
• Strong understanding of security principles, authentication (OAuth2, JWT), and data protection.
Secondary Skills:
• Monitoring tools - Prometheus, Grafana, Datadog
• Security and compliance standards - GDPR, PCI, SOC 2
• DevOps collaboration
• UX/UI collaboration for Angular components
• Experience with asynchronous programming (e.g., asyncio, aiohttp)
• Experience with big data technologies like Spark or Hadoop
• Experience with machine learning libraries (e.g., TensorFlow, PyTorch) is a plus
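Editor's note: since this role stresses unit and integration testing of Python APIs, here is a minimal sketch of how a FastAPI endpoint and its test can live side by side. The /payments route, payload model, and test values are illustrative assumptions, not taken from the posting.

# Minimal FastAPI endpoint plus a pytest-style test using TestClient.
# The route and its fields are hypothetical examples.
from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()

class Payment(BaseModel):
    amount: float
    currency: str = "INR"

@app.post("/payments")
def create_payment(payment: Payment) -> dict:
    # In a real service this would persist to PostgreSQL and call a payment gateway.
    return {"status": "accepted", "amount": payment.amount, "currency": payment.currency}

client = TestClient(app)

def test_create_payment():
    response = client.post("/payments", json={"amount": 499.0})
    assert response.status_code == 200
    assert response.json()["currency"] == "INR"   # default applied by Pydantic

if __name__ == "__main__":
    test_create_payment()
    print("ok")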

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

About the Company
Re:Sources is the backbone of Publicis Groupe, the world's third-largest communications group. Formed in 1998 as a small team to service a few Publicis Groupe firms, Re:Sources has grown to 5,000+ people servicing a global network of prestigious advertising, public relations, media, healthcare, and digital marketing agencies. We provide technology solutions and business services including finance, accounting, legal, benefits, procurement, tax, real estate, treasury, and risk management to help Publicis Groupe agencies do their best: create and innovate for their clients. In addition to providing essential, everyday services to our agencies, Re:Sources develops and implements platforms, applications, and tools to enhance productivity, encourage collaboration, and enable professional and personal development. We continually transform to keep pace with our ever-changing communications industry and thrive on a spirit of innovation felt around the globe. With our support, Publicis Groupe agencies continue to create and deliver award-winning campaigns for their clients.
About the Role
The main purpose of this role is to advance the application of business intelligence, advanced data analytics, and machine learning for Marcel. The role involves working with other data scientists, engineers, and product owners to ensure the delivery of all commitments on time and at high quality.
Responsibilities
Develop and maintain robust Python-based backend services and RESTful APIs to support machine learning models in production.
Deploy and manage containerized applications using Docker and orchestrate them using Azure Kubernetes Service (AKS).
Implement and manage ML pipelines using MLflow for model tracking, reproducibility, and deployment.
Design, schedule, and maintain automated workflows using Apache Airflow to orchestrate data and ML pipelines.
Collaborate with data scientists to productize NLP models, with a focus on language models, embeddings, and text preprocessing techniques (e.g., tokenization, lemmatization, vectorization).
Ensure high code quality and version control using Git; manage CI/CD pipelines for reliable deployment.
Handle unstructured text data and build scalable backend infrastructure for inference and retraining workflows.
Participate in system design and architecture reviews for scalable and maintainable machine learning services.
Proactively monitor, debug, and optimize ML applications in production environments.
Communicate technical solutions and project status clearly to team leads and product stakeholders.
Qualifications
Minimum relevant experience: 5 years; maximum relevant experience: 9 years.
Bachelor's degree in engineering, computer science, statistics, mathematics, information systems, or a related field from an accredited college or university, or equivalent work experience; a Master's degree from an accredited college or university is preferred.
Required Skills
Proficiency in Python and frameworks like FastAPI or Flask for building APIs.
Solid hands-on experience with Docker, Kubernetes (AKS), and deploying production-grade applications.
Familiarity with MLflow, including model packaging, logging, and deployment.
Experience with Apache Airflow for orchestrating ETL and ML workflows.
Understanding of NLP pipelines, language models (e.g., BERT, GPT variants), and associated libraries (e.g., spaCy, Hugging Face Transformers).
Exposure to cloud environments, preferably Azure.
Strong debugging, testing, and optimization skills for scalable systems.
Experience working with large datasets and unstructured data, especially text.
Preferred Skills
Advanced knowledge of data science techniques, and experience building, maintaining, and documenting models.
Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
Experience building and optimizing ADF- and PySpark-based data pipelines, architectures, and data sets on Graph and Azure Data Lake.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Strong analytic skills related to working with unstructured datasets.
Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
Working knowledge of message queuing, stream processing, and highly scalable Azure-based data stores.
Strong project management and organizational skills.
Experience supporting and working with cross-functional teams in a dynamic environment.
Understanding of Node.js is a plus, but not required.
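Editor's note: the MLflow responsibilities above (model tracking, reproducibility, deployment) can be illustrated with a small tracking sketch. The experiment name, data, and hyperparameters are assumptions for illustration; a production setup would point MLflow at a shared tracking server.

# Minimal MLflow tracking sketch: log parameters, a metric, and a fitted model
# so a run can be reproduced and later deployed. Experiment name, data, and
# hyperparameters are placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("marcel-nlp-demo")                # hypothetical experiment name

with mlflow.start_run():
    C = 0.5
    model = LogisticRegression(C=C, max_iter=200).fit(X_train, y_train)
    mlflow.log_param("C", C)
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, artifact_path="model")
# `mlflow ui` then shows the run; the logged model can be registered or served.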

Posted 2 weeks ago

Apply

0 years

0 Lacs

Maharashtra, India

On-site

Source: LinkedIn

About Company: They balance innovation with an open, friendly culture and the backing of a long-established parent company known for its ethical reputation. We guide customers from what's now to what's next by unlocking the value of their data and applications to solve their digital challenges, achieving outcomes that benefit both business and society.
Job Title: Python Developer
Location: Pune
Experience: 6+ yrs
Job Type: Contract to hire (min. 1+ yr)
Notice Period: Immediate joiners
Mandatory Skills: Python, FastAPI, NoSQL, MongoDB
Job Description:
Backend developer with Python skills along with NoSQL (MongoDB/DocumentDB)
Writing and maintaining efficient, reusable, and reliable code
Good experience working with NoSQL databases
Strong API design and implementation
Hands-on performance optimization
Familiar with Python frameworks and libraries
Good hands-on experience in server-side development
Collaborate with front-end developers, designers, and other stakeholders
Strong communication
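Editor's note: for the MongoDB/DocumentDB work described above, a minimal PyMongo sketch looks like the following. The connection string, database, collection, and field names are assumptions for illustration.

# Minimal PyMongo sketch: insert a document and query it back with a filter.
# The URI, database name, and fields are placeholders.
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")   # hypothetical connection string
orders = client["shop"]["orders"]

orders.create_index([("customer_id", ASCENDING)])   # support frequent lookups

orders.insert_one({"customer_id": 42, "total": 1299.0, "status": "paid"})

for doc in orders.find({"customer_id": 42, "status": "paid"}).limit(10):
    print(doc["_id"], doc["total"])

client.close()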

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

About the Company
iLink Digital is a Global Software Solution Provider and Systems Integrator that delivers next-generation technology solutions to help clients solve complex business challenges, improve organizational effectiveness, increase business productivity, realize sustainable enterprise value, and transform their businesses inside out. iLink integrates software systems and develops custom applications, components, and frameworks on the latest platforms for IT departments, commercial accounts, application service providers (ASPs), and independent software vendors (ISVs). iLink solutions are used in a broad range of industries and functions, including healthcare, telecom, government, oil and gas, education, and life sciences. iLink's expertise includes Cloud Computing & Application Modernization, Data Management & Analytics, Enterprise Mobility, Portal, Collaboration & Social Employee Engagement, Embedded Systems, and User Experience Design. What makes iLink's offerings unique is that we use pre-created frameworks designed to accelerate software development and the implementation of business processes for our clients. iLink has over 60 frameworks (solution accelerators), both industry-specific and horizontal, that can be easily customized and enhanced to meet your current business challenges.
Requirements
Job Summary: We are looking for a skilled and motivated Python and AWS Developer to join our dynamic engineering team. The ideal candidate will have strong expertise in Python (3.x) and modern web frameworks, AWS cloud services, and solid knowledge of relational and graph databases. You'll be responsible for designing, developing, and deploying scalable cloud-native applications and serverless architectures.
Key Responsibilities
- Develop, test, and maintain Python-based backend systems with a focus on object-oriented design, coding standards, and scalability.
- Build and maintain RESTful APIs using frameworks such as Django and FastAPI.
- Design and implement AWS cloud solutions leveraging services like Lambda, Step Functions, ECS, API Gateway, CloudWatch, and S3.
- Write clean, maintainable, and efficient code following industry best practices.
- Participate in architecture discussions, code reviews, and agile development cycles.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Monitor and troubleshoot performance issues in production environments.
- Maintain CI/CD pipelines.
- Design and optimize relational databases (RDS) such as PostgreSQL and MySQL, and data warehousing solutions like Snowflake.
Preferred/Bonus Skills
- Experience with OpenAI APIs, Large Language Models (LLMs), or prompt engineering.
- Familiarity with CI/CD pipelines using tools like GitHub Actions or GitLab CI.
- Exposure to asynchronous programming and message queues.
Benefits
Competitive salaries
Medical insurance
Employee referral bonuses
Performance-based bonuses
Flexible work options & fun culture
Robust learning & development programs
In-house technology training
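Editor's note: as a rough sketch of the serverless pattern listed above (API Gateway invoking a Python Lambda that reads from S3 via boto3), the handler below shows the basic shape. The bucket name, object key, and event fields are assumptions, not part of the posting.

# Minimal AWS Lambda handler sketch: API Gateway proxy event in, JSON response
# out, with a small S3 read via boto3. Bucket, key, and payload are placeholders.
import json
import boto3

s3 = boto3.client("s3")                       # reused across warm invocations
BUCKET = "ilink-demo-bucket"                  # hypothetical bucket name

def lambda_handler(event, context):
    # API Gateway proxy integration puts path parameters under "pathParameters".
    report_id = (event.get("pathParameters") or {}).get("report_id", "latest")

    obj = s3.get_object(Bucket=BUCKET, Key=f"reports/{report_id}.json")
    report = json.loads(obj["Body"].read())

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"report_id": report_id, "summary": report.get("summary")}),
    }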

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

This position requires strong application development and coding skills, as this person will also troubleshoot, support, and fix the existing Deductions Link web scraping and ETL code base, and may develop additional features and functionality to enhance the product or onboard new customers.
Technical Skills Needed/Desired
Web Scraping: Experience in web scraping using Selenium or related technologies.
SQL Proficiency: Strong familiarity with writing and optimizing SQL queries.
Python Development: Some experience with Python programming, including supporting Python-based applications.
API Development: Proficient in developing new APIs using web frameworks such as FastAPI or Django, and experience with ORM (Object-Relational Mapping) tools.
Problem Solving: Excellent problem-solving skills with the ability to debug and troubleshoot complex issues.
Log Analysis: Capable of searching and analyzing logs to diagnose and resolve issues.
GCP Basics: Foundational knowledge of GCP services and cloud computing concepts.
Team Collaboration: Strong communication skills and a proven ability to work effectively within a team.
Continuous Learning: Demonstrates curiosity and a willingness to learn and adapt to new technologies and methodologies.
Required Qualifications
Bachelor's degree in Computer Science, Computer Engineering, or Information Systems and/or related work experience (open-source web services development).
8+ years of Python API/RESTful services development experience using FastAPI or Django.
5+ years of experience with databases like PostgreSQL and MySQL.
Experience in web scraping using Selenium and BeautifulSoup.
Experience writing unit and functional tests using PyTest.
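Editor's note: a minimal sketch of the Selenium-plus-BeautifulSoup scraping pattern this posting calls for is shown below. The target URL and CSS selectors are placeholders; real scrapers would add explicit waits, pagination, and error handling.

# Minimal web-scraping sketch: drive a headless Chrome with Selenium, then
# parse the rendered HTML with BeautifulSoup. URL and selectors are placeholders.
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--headless=new")          # run without a visible browser

driver = webdriver.Chrome(options=options)
try:
    driver.get("https://example.com/invoices")  # hypothetical portal page
    soup = BeautifulSoup(driver.page_source, "html.parser")

    # Collect rows from a hypothetical invoice table.
    for row in soup.select("table.invoices tr"):
        cells = [td.get_text(strip=True) for td in row.find_all("td")]
        if cells:
            print(cells)
finally:
    driver.quit()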

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

This position requires strong application development and coding skills, as this person will also troubleshoot, support, and fix the existing Deductions Link web scraping and ETL code base, and may develop additional features and functionality to enhance the product or onboard new customers.
Technical Skills Needed/Desired
Web Scraping: Experience in web scraping using Selenium or related technologies.
SQL Proficiency: Strong familiarity with writing and optimizing SQL queries.
Python Development: Some experience with Python programming, including supporting Python-based applications.
API Development: Proficient in developing new APIs using web frameworks such as FastAPI or Django, and experience with ORM (Object-Relational Mapping) tools.
Problem Solving: Excellent problem-solving skills with the ability to debug and troubleshoot complex issues.
Log Analysis: Capable of searching and analyzing logs to diagnose and resolve issues.
GCP Basics: Foundational knowledge of GCP services and cloud computing concepts.
Team Collaboration: Strong communication skills and a proven ability to work effectively within a team.
Continuous Learning: Demonstrates curiosity and a willingness to learn and adapt to new technologies and methodologies.
Required Qualifications
Bachelor's degree in Computer Science, Computer Engineering, or Information Systems and/or related work experience (open-source web services development).
8+ years of Python API/RESTful services development experience using FastAPI or Django.
5+ years of experience with databases like PostgreSQL and MySQL.
Experience in web scraping using Selenium and BeautifulSoup.
Experience writing unit and functional tests using PyTest.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Source: Naukri

Develop and deploy ML pipelines using MLOps tools, build FastAPI-based APIs, support LLMOps and real-time inferencing, collaborate with DS/DevOps teams, and ensure performance and CI/CD compliance in AI infrastructure projects.
Required Candidate Profile
Experienced Python developer with 4-8 years in MLOps, FastAPI, and AI/ML system deployment. Exposure to LLMOps, GenAI models, and containerized environments, with strong collaboration across the ML lifecycle.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Source: Naukri

Design and build scalable REST APIs using FastAPI, integrate optimized SQL queries, collaborate with cross-functional teams, ensure secure coding, participate in agile sprints, and deploy services using CI/CD and Docker.
Required Candidate Profile
Experienced Python developer with 5+ years in backend/API development using FastAPI, strong SQL proficiency, and knowledge of Docker, CI/CD, cloud, and secure coding practices.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

This position requires strong application development and coding skills, as this person will also troubleshoot, support, and fix the existing Deductions Link web scraping and ETL code base, and may develop additional features and functionality to enhance the product or onboard new customers.
Technical Skills Needed/Desired
Web Scraping: Experience in web scraping using Selenium or related technologies.
SQL Proficiency: Strong familiarity with writing and optimizing SQL queries.
Python Development: Some experience with Python programming, including supporting Python-based applications.
API Development: Proficient in developing new APIs using web frameworks such as FastAPI or Django, and experience with ORM (Object-Relational Mapping) tools.
Problem Solving: Excellent problem-solving skills with the ability to debug and troubleshoot complex issues.
Log Analysis: Capable of searching and analyzing logs to diagnose and resolve issues.
GCP Basics: Foundational knowledge of GCP services and cloud computing concepts.
Team Collaboration: Strong communication skills and a proven ability to work effectively within a team.
Continuous Learning: Demonstrates curiosity and a willingness to learn and adapt to new technologies and methodologies.
Required Qualifications
Bachelor's degree in Computer Science, Computer Engineering, or Information Systems and/or related work experience (open-source web services development).
8+ years of Python API/RESTful services development experience using FastAPI or Django.
5+ years of experience with databases like PostgreSQL and MySQL.
Experience in web scraping using Selenium and BeautifulSoup.
Experience writing unit and functional tests using PyTest.

Posted 2 weeks ago

Apply

5.0 - 31.0 years

0 - 0 Lacs

Mohali

Remote

Source: Apna

We are a fast-growing AI and IT consultancy based in Mohali, India, specializing in practical AI solutions for real-world business challenges. Our team combines deep technical expertise with industry knowledge to deliver custom AI and machine learning applications, leveraging the latest technologies, including Large Language Models (LLMs), to transform how businesses operate.
Experience: 5+ years | Type: Full-time, On-site, Monday to Saturday | Location: Mohali, Punjab
Key Responsibilities
Backend Development: Design and develop scalable FastAPI backends with RESTful APIs
AI PoC Development: Build proofs of concept using computer vision and LLMs
Performance Optimization: Debug and optimize backend systems and AI workflows
Security & Compliance: Implement robust security measures and data protection
Cross-functional Collaboration: Work with front-end developers and product managers
Required Skills
Python Expertise: 5+ years of backend development experience
Frameworks: Strong FastAPI experience; familiarity with Flask/Django
AI/ML Stack: NumPy, Pandas, Scikit-learn, PyTorch or TensorFlow
Database Management: MySQL, PostgreSQL, MongoDB with ORM tools
Cloud Platforms: AWS, Azure, or Google Cloud experience
API Development: Building and consuming RESTful APIs
AI Experience: Computer vision algorithms and LLM implementation
Version Control: Git proficiency and Agile workflows
Preferred Qualifications
Docker and Kubernetes experience
Open-source project contributions
CI/CD pipeline experience
Active involvement in AI/ML communities
What We Offer
Work on cutting-edge AI projects and transformative PoCs
Access to state-of-the-art development tools (GitHub Copilot, Cursor)
Competitive compensation and benefits
Collaborative, innovation-focused team environment
Continuous learning and career growth opportunities

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Surat, Gujarat, India

On-site

Source: LinkedIn

Openings: 01
Experience: 2 years
Location: Surat - Varachha
Benefits
5-Day Working Week
Paid Leaves
Complimentary Health Insurance
Overtime Pay
Fun Activities
Personal Loan
Employee Training
Positive Work Environment
Professional Development
We are looking for an experienced AI/ML cum Python Developer with 2 years of hands-on work in machine learning, Python development, and API integration. The ideal candidate should also have experience building AI agents: smart systems that can plan tasks, make decisions, and work independently using tools like LangChain, AutoGPT, or similar frameworks. You'll be part of a collaborative team, working on real-world AI projects and helping us build intelligent, scalable solutions.
Key Responsibilities:
Develop, train, and deploy machine learning models using frameworks such as TensorFlow, PyTorch, or Scikit-learn.
Develop AI agents capable of decision-making and multi-step task execution.
Write efficient and maintainable Python code for data processing, automation, and backend services.
Design and implement REST APIs or backend services for model integration.
Handle preprocessing, cleaning, and transformation of large datasets.
Evaluate model accuracy and performance, and make necessary optimizations.
Collaborate with cross-functional teams including UI/UX, QA, and product managers.
Stay updated with the latest trends and advancements in AI/ML.
Key Performance Areas (KPAs):
Development of AI/ML algorithms and backend services.
AI agent development and performance.
Model evaluation, testing, and optimization.
Seamless deployment and integration of models in production.
Technical documentation and project support.
Research and implementation of emerging AI technologies.
Key Performance Indicators (KPIs):
Accuracy and efficiency of AI models delivered.
Clean, reusable, and well-documented Python code.
Timely delivery of assigned tasks and milestones.
Issue resolution and minimal bugs in production.
Contribution to innovation and internal R&D efforts.
Required Skills & Qualifications:
Bachelor's or Master's degree in Computer Science, IT, or a related field.
Minimum 2 years of experience in Python and machine learning.
Hands-on with AI agent tools like LangChain, AutoGPT, OpenAI APIs, Pinecone, etc.
Strong foundation in algorithms, data structures, and mathematics.
Experience with Flask, FastAPI, or Django for API development.
Good understanding of model evaluation and optimization techniques.
Familiarity with version control tools like Git.
Strong communication and team collaboration skills.
Interview Process:
HR Round
Technical Round
Practical Round
Salary Negotiation
Offer Release

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Source: LinkedIn

Marcus Evans, founded in 1983, is a global Business Intelligence and Event Marketing company with 49 offices in 20+ countries. Headquartered in the UK, we are rated among the top 20 UK companies for work/life balance. Working across all industry sectors, our Summits, Conferences, and Online Events brands deliver innovative high-level content, networking, and one-to-one meeting platforms that bring together key decision-makers on a truly global scale. Our client base is comprised of C-level executives from 98% of existing Fortune 1000 companies, and we are world leaders in a variety of industries, including healthcare, legal, pharmaceutical, investments, energy, and packaging.
We are in search of a highly skilled Python Developer with a solid background in application development, strong expertise in Python programming, and a keen focus on creating high-quality, reliable software. The ideal candidate should have 4+ years of experience in a similar role within a software development/IT environment. In this role, you will be responsible for designing, developing, and maintaining high-quality software solutions that meet business needs. This role is located in our Kerala office.
Key Responsibilities
Design, develop, test, and deploy scalable and efficient Python applications, with a focus on utilizing FastAPI for building APIs.
Write clean and maintainable code, adhering to best practices and coding standards.
Collaborate with cross-functional teams to define, design, and implement new features, ensuring high-quality, performant solutions.
Participate in code reviews to maintain code quality and ensure the adoption of best practices.
Work closely with stakeholders to understand project requirements and deliver solutions that meet business needs.
Stay updated on industry trends, emerging technologies, and best practices to continually enhance development processes.
Develop and maintain technical documentation.
Provide technical support and troubleshooting for software applications.
Must-Have Skills
Bachelor's degree in Computer Science, Engineering, or a related field.
Proven experience as a Python Developer with 4+ years of relevant work experience.
Strong understanding of software development principles, design patterns, and best practices.
Experience with FastAPI for building APIs.
Strong knowledge of Python web frameworks such as Flask or Django.
Proficient understanding of code versioning tools like Git and SVN.
Understanding of software development principles, including agile methodologies.
Experience with relational databases such as MySQL and SQL Server.
Excellent problem-solving skills and attention to detail.
Strong communication skills and ability to work in a team environment.
Good-to-Have Skills
Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes).
Knowledge of non-relational databases such as MongoDB.
Familiarity with cloud platforms (e.g., AWS, Azure, GCP).
Familiarity with front-end technologies (HTML, CSS, JavaScript) is advantageous.
Compensation & Benefits
Competitive base salary.
Part of a highly skilled and motivated development team.
Bonus available based on performance.
May require relocating to Mumbai, India, with occasional international travel.
We are an equal opportunity employer and value diversity. All employment is decided on the basis of qualifications, merit, and business need.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

9 - 19 Lacs

Jaipur

Work from Office

Source: Naukri

(Notice period should not be more than 30 days)
Job Summary:
We are seeking a Python Backend Engineer with experience in AI/ML workflows, particularly LLMs and LangChain. The role focuses on developing Generative AI-driven applications, integrating LLMs, and building data pipelines and backend services. Prior experience with generative AI is strongly preferred.
Key Responsibilities:
Implement LangChain/LangGraph workflows for AI-driven automation.
Fine-tune LLMs for business applications.
Build and optimize FastAPI-based backend services.
Manage data pipelines and storage using PostgreSQL.
Deploy and scale AI models using Docker (Kubernetes is a plus).
Work with AWS SageMaker, GCP Vertex AI, or Azure ML for model deployment.
Improve inference efficiency and reduce latency for real-time AI applications.
Required Skills:
Strong Python backend development (FastAPI preferred).
Experience with LLMs and prompt engineering.
Hands-on experience with LangChain/LangGraph.
Knowledge of PostgreSQL and scalable data storage.
Familiarity with Docker and Kubernetes for containerization.
Exposure to at least one cloud platform (AWS/GCP/Azure).
Nice to Have:
Experience with vector databases (e.g., Pinecone, FAISS, Qdrant).
Experience with fine-tuning LLMs.
Familiarity with microservices.
Familiarity with MLOps best practices.
Familiarity with any of AWS SageMaker, GCP Vertex AI, or Azure ML for AI model deployment.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

About Us
Zycus is a pioneer in Cognitive Procurement software and has been a trusted partner of choice for large global enterprises for two decades. Zycus has been consistently recognized by Gartner, Forrester, and other analysts for its Source-to-Pay integrated suite. Zycus powers its S2P software with the revolutionary Merlin AI Suite. Merlin AI takes over the tactical tasks and empowers procurement and AP officers to focus on strategic projects; it offers data-driven actionable insights for quicker and smarter decisions, and its conversational AI offers a B2C-type user experience to end users. Zycus helps enterprises drive real savings, reduce risks, and boost compliance, and its seamless, intuitive, and easy-to-use user interface ensures high adoption and value across the organization. Start your #CognitiveProcurement journey with us, as you are #MeantforMore.
We Are an Equal Opportunity Employer
Zycus is committed to providing equal opportunities in employment and creating an inclusive work environment. We do not discriminate against applicants on the basis of race, color, religion, gender, sexual orientation, national origin, age, disability, or any other legally protected characteristic. All hiring decisions will be based solely on qualifications, skills, and experience relevant to the job requirements.
Job Description
Zycus is looking for a passionate and curious AI Intern to join our innovative team. If you're eager to work with cutting-edge technologies like LLMs (Large Language Models), open-source AI frameworks, and advanced NLP systems, this internship is your gateway to a high-impact career in Artificial Intelligence.
What You Will Learn and Work On
Assist in building AI solutions using open-source models (e.g., Llama 2, Mistral, Hugging Face) and third-party LLM APIs like OpenAI and Anthropic.
Contribute to research and experimentation on advanced techniques such as Retrieval-Augmented Generation (RAG), GraphRAG, and agent systems using tools like LangChain or LlamaIndex.
Support the team in deploying AI models using scalable tools like vLLM, FastAPI, or Flask.
Help integrate AI functionalities into real-world enterprise applications.
Participate in developing data pipelines for AI projects, from data preprocessing to model evaluation.
Stay updated on the latest AI trends and assist the team in identifying areas for innovation and improvement.
Job Requirements
Experience & Qualifications
Bachelor's/Master's degree in Computer Science, AI, or a related field.
Strong interest in AI/ML, with some exposure to Python and frameworks such as PyTorch, TensorFlow, or Hugging Face.
Familiarity with at least one web framework (FastAPI or Flask preferred).
Understanding of how LLMs work, including concepts like embeddings, fine-tuning, and prompt engineering (project work or self-learning is welcome).
Good problem-solving skills and willingness to learn complex AI workflows in a production setting.
Why Intern with Zycus?
Real-world Projects: Work on live AI initiatives and contribute to Zycus' AI-driven products.
Mentorship: Learn from top AI professionals and collaborate with experienced engineers.
Innovation Culture: Get hands-on experience with the latest in GenAI and open-source advancements.
Global Exposure: Collaborate with teams working on international deployments and enterprise-grade solutions.
Growth Potential: Many of our interns are offered full-time roles based on performance and fit.
About Zycus
Zycus is a pioneer in Cognitive Procurement software and a global leader in Source-to-Pay solutions. Powered by our Merlin AI Suite, Zycus automates tactical work, surfaces insights, and transforms enterprise procurement experiences. Join us and be part of the next generation of AI innovation. Start your #CognitiveProcurement journey with us - you are #MeantforMore.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Location: Chennai, TN / Hyderabad, TG / Bangalore, KA / Pune, MH
Must-Have Skills:
Backend Development: Python (FastAPI / Flask)
AI Integration: OpenAI API (GPT-4o)
Cloud Services: Azure Functions, Azure Storage Account, Cosmos DB
Database & Caching: NoSQL (Cosmos DB)
APIs & Authentication: RESTful APIs, OAuth, JWT
DevOps & Deployment: CI/CD pipelines, Azure DevOps
Code Quality: Unit Testing, Debugging, Performance Optimization
Good-to-Have Skills:
Vector Databases for AI: Azure AI Search
Logging & Monitoring: Application Insights, Azure Monitor
Security: Azure Active Directory (AAD), Role-Based Access Control (RBAC)
Retrieval-Augmented Generation with vector databases
LLM Orchestration: LangChain or Semantic Kernel
Experience in OCR, automation, natural-language data analytics, and AI agents is preferred.
Experience in CRM, personalization, virtual agents, and self-service AI apps is preferred.
Work Hours: 04:00 AM - 12:00 PM PST

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Ghaziabad, Uttar Pradesh, India

On-site

Source: LinkedIn

About Epikdoc AI
At Epikdoc AI, we're on a mission to redefine healthcare documentation and diagnostics through intelligent automation. From AI-powered dental diagnostics to telehealth practice tools, we are building seamless and scalable platforms that empower doctors, patients, and insurers alike. If you're excited about solving real-world problems through AI and automation, we'd love to hear from you!
Role Overview
We are looking for a Python Developer with strong backend expertise, particularly in Django, to join our fast-growing tech team. This role is perfect for someone who thrives in a startup environment and wants to build products that impact thousands of clinicians and patients globally.
Key Responsibilities
Design, develop, and maintain backend systems using Python and Django (primary), and optionally Flask or FastAPI.
Build secure, efficient RESTful APIs and support frontend integration.
Collaborate closely with frontend developers, AI engineers, and product managers to deliver seamless experiences.
Integrate systems such as payment gateways, user authentication, and third-party APIs.
Manage and query databases including MySQL, PostgreSQL, and MongoDB.
Set up Docker containers and contribute to deployment on AWS or similar cloud platforms.
Write clean, maintainable, scalable code and conduct peer reviews.
Contribute to CI/CD pipelines and version control with Git.
Stay updated on backend, AI, and DevOps trends, bringing best practices into the team.
Requirements
Proficient in Python with 2+ years of experience in Django.
Experience building scalable web applications and APIs.
Familiarity with HTML, CSS, and JS for backend-frontend integration.
Solid grasp of SQL and NoSQL databases (MySQL, PostgreSQL, MongoDB).
Experience with Docker, AWS, Git, and version control workflows.
Integration experience with Stripe, Razorpay, Firebase, etc.
Excellent debugging, problem-solving, and performance optimization skills.
Nice to Have
Experience with FastAPI, WebSockets, GraphQL, or asynchronous frameworks.
Familiarity with AI/ML, especially NLP and LLMs (OpenAI, Hugging Face).
Experience with Celery, Redis, RabbitMQ, or other task/message queues.
Exposure to Sentry, Prometheus, or Grafana for monitoring/logging.
Contributions to open-source projects or past freelance experience.
Why Join Us?
Work on meaningful AI-driven healthcare tools used by doctors across the world.
Be part of a small, passionate, and growing team where your voice matters.
Flexible working hours.
Be a founding tech team member in a backed, fast-scaling startup.
Ready to Build the Future of Healthcare Tech?
👉 Apply now with your resume, GitHub (or portfolio), and a short note on why you're excited about Epikdoc AI.
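Editor's note: for the task-queue items listed under "Nice to Have" (Celery with Redis or RabbitMQ), here is a minimal sketch of a background task a Django or FastAPI view could enqueue. The broker URL and task body are assumptions for illustration.

# Minimal Celery sketch: a background task that could, for example, generate a
# diagnostic report after an upload. Broker URL and task logic are placeholders.
from celery import Celery

app = Celery("epikdoc_tasks", broker="redis://localhost:6379/0")  # hypothetical broker

@app.task(bind=True, max_retries=3)
def generate_report(self, scan_id: int) -> str:
    try:
        # Real code would fetch the scan, run the model, and store the result.
        return f"report for scan {scan_id} ready"
    except Exception as exc:
        # Retry with exponential backoff on transient failures.
        raise self.retry(exc=exc, countdown=2 ** self.request.retries)

# A web view enqueues work without blocking the request:
#   generate_report.delay(scan_id=42)
# A worker processes it:  celery -A tasks worker --loglevel=info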

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources, and opportunities to unleash their full potential. The power we create together, when we combine your strengths with ours, is unstoppable. Are you ready to join a team that dreams as big as you do?
AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics.
Do You Dream Big? We Need You.
Job Description
Job Title: AI Engineer
Location: Bangalore
Reporting to: Senior Manager
Purpose of the Role
The Global Brand Analytics Team at Anheuser-Busch InBev (AB InBev) is tasked with helping strengthen AB InBev's brands across the globe. The solutions we provide aim to extract insights and give guidance on actions from all our brand-related data. The derived data-driven insights play a pivotal role in empowering our Marketing teams to make well-informed decisions backed by cutting-edge modelling techniques and thorough analysis.
Key Tasks & Accountabilities
Large Language Models (LLMs): Experience with LangChain and LangGraph; proficiency in building agentic patterns like ReAct, ReWoo, and LLMCompiler.
Multi-modal Retrieval-Augmented Generation (RAG): Expertise in multi-modal AI systems (text, images, audio, video); designing and optimizing chunking strategies and clustering for large-scale data processing.
Streaming & Real-time Processing: Experience with audio/video streaming and real-time data pipelines; low-latency inference and deployment architectures.
NL2SQL: Natural-language-driven SQL generation for databases; experience with natural language interfaces to databases and query optimization.
API Development: Building scalable APIs with FastAPI for AI model serving.
Containerization & Orchestration: Proficient with Docker for containerized AI services; experience with orchestration tools for deploying and managing services.
Data Processing & Pipelines: Experience with chunking strategies for efficient document processing; building data pipelines to handle large-scale data for AI model training and inference.
AI Frameworks & Tools: Experience with AI/ML frameworks like TensorFlow and PyTorch; proficiency in LangChain, LangGraph, and other LLM-related technologies.
Prompt Engineering: Expertise in advanced prompting techniques like Chain of Thought (CoT) prompting, LLM-as-judge, and self-reflection prompting; experience with prompt compression and optimization using tools like LLMLingua, AdaFlow, TextGrad, and DSPy; strong understanding of context window management and optimizing prompts for performance and efficiency.
Qualifications, Experience, Skills
Level of educational attainment required: degree in business analytics, data science, statistics, or economics, and/or a degree in Engineering, Mathematics, or Computer Science.
And above all of this, an undying love for beer!
We dream big to create a future with more cheers.
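Editor's note: the NL2SQL accountability above (natural-language-driven SQL generation) can be sketched as a single prompted LLM call that is given the table schema as context. The schema, question, model name, and prompt are illustrative assumptions; a production system would validate and sandbox the generated SQL before running it.

# Minimal NL2SQL sketch: pass a table schema plus a user question to an LLM and
# ask for a single SQL statement back. Schema, question, and model name are
# placeholders; generated SQL should be reviewed before execution.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SCHEMA = """
TABLE brand_sales (
    brand     TEXT,
    country   TEXT,
    month     DATE,
    volume_hl NUMERIC
)
"""

question = "Which three brands sold the most volume in India in 2024?"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[
        {"role": "system",
         "content": "You translate questions into a single ANSI SQL query. "
                    "Use only the tables and columns in the schema. "
                    f"Schema:\n{SCHEMA}"},
        {"role": "user", "content": question},
    ],
    temperature=0,
)

sql = response.choices[0].message.content
print(sql)  # review, parse, and run against a read-only replica, never directly in prod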

Posted 2 weeks ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Source: Naukri

We are looking for a Senior Python Software Engineer with at least 5 years of hands-on experience in developing and maintaining Python-based applications. The ideal candidate is a problem solver who thrives in a fast-paced environment and is passionate about building scalable, high-performance, and secure software solutions. In this role, you will work closely with cross-functional teams, contributing to the design, development, and optimization of our internal systems. Your expertise will help shape our technical landscape while ensuring best practices in code quality, testing, and maintainability.
Key Responsibilities:
Develop and maintain a variety of internal software applications using Python and related technologies.
Collaborate with cross-functional teams to design, develop, and implement new features and enhancements.
Ensure software solutions meet standards for scalability, performance, and security.
Write clean, efficient, and well-documented code that satisfies project requirements.
Participate in code reviews, providing and receiving feedback to improve overall code quality.
Debug, troubleshoot, and resolve software defects and technical issues in a timely manner.
Follow agile development methodologies, participating in sprint planning, daily stand-ups, and retrospectives.
Continuously improve technical skills and stay current with industry best practices and emerging technologies.
Conduct comprehensive unit and integration testing to ensure code quality and reliability.
Assist in the deployment of applications and provide ongoing maintenance of production environments to ensure smooth operations.
Required Skills & Experience:
At least 5 years of continuous, professional experience as a Python 3 software engineer, developing, deploying, and maintaining production-grade applications.
Minimum 3 years of hands-on experience with at least 4 of the following Python frameworks, libraries, and tools: FastAPI, Pydantic, SQLAlchemy, Pandas, and messaging queues (e.g., Celery, Kafka, RabbitMQ).
Minimum 3 years of experience working in a Linux/Unix environment with expertise in system navigation, scripting, and troubleshooting.
Deep understanding of best practices for building scalable, high-performance, and secure software solutions.
Strong analytical, problem-solving, and debugging skills with a proven ability to diagnose and resolve complex issues efficiently.
Demonstrated commitment to continuous learning, innovation, and enhancing both individual and team performance.
Extensive experience with unit and integration testing; proven expertise in designing, implementing, and maintaining robust unit and integration tests to ensure software reliability and quality.
Ability to troubleshoot and resolve dependency conflicts, versioning issues, and environment inconsistencies.
Self-starter with the ability to independently set up a complete Python development environment from scratch.
Proven ability to collaborate effectively with cross-functional teams to drive projects forward and deliver high-quality solutions.
Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
Preferred Qualifications:
1+ years of experience with Python packaging using setuptools, Poetry, or related tools.
Familiarity with publishing Python packages to PyPI or private repositories.
Experience automating package builds and releases (e.g., GitHub Actions, Bitbucket Pipelines, CI/CD).
Strong ability to diagnose and resolve dependency conflicts, versioning issues, and environment inconsistencies to ensure seamless development and deployment workflows.
Experience designing and configuring Python-based stacks, including FastAPI, Pydantic, SQLAlchemy, Pandas, Celery, and other relevant libraries.
Comfortable compiling and installing Python from source when necessary.
Commitment to Continuous Learning:
Proven commitment to continuous learning, staying ahead of industry trends, and driving innovation by adopting emerging technologies, optimizing best practices, and applying creative problem-solving to real-world challenges. This includes actively exploring advancements in software development, contributing to open-source projects, obtaining relevant certifications, or implementing innovative solutions to improve efficiency and scalability. Examples:
Staying current with Python ecosystem updates (e.g., migrating to FastAPI for better performance and async capabilities).
Contributing to open-source projects or engaging with developer communities.
Earning industry certifications (e.g., AWS Certified Solutions Architect, Google Cloud Professional Developer).
Implementing automation to streamline CI/CD pipelines and enhance deployment efficiency.
Researching and adopting best practices for security and performance optimization in production environments.
This role provides an excellent opportunity for growth and offers exposure to a broad range of software development challenges. If you are passionate about coding and working in a collaborative, agile environment, we'd love to hear from you!
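Editor's note: since the stack above centers on FastAPI, Pydantic, and SQLAlchemy, here is a minimal sketch showing how a SQLAlchemy model and a Pydantic schema typically pair up. The table, field names, and in-memory SQLite database are assumptions for illustration.

# Minimal SQLAlchemy + Pydantic sketch: define a table, insert a row, and
# serialize it with a Pydantic model. Names and the in-memory database are placeholders.
from pydantic import BaseModel
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class TicketORM(Base):
    __tablename__ = "tickets"
    id = Column(Integer, primary_key=True)
    title = Column(String, nullable=False)
    priority = Column(Integer, default=3)

class TicketSchema(BaseModel):
    id: int
    title: str
    priority: int

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(TicketORM(title="Fix login bug", priority=1))
    session.commit()
    row = session.query(TicketORM).first()
    # Building the schema explicitly avoids Pydantic v1/v2 ORM-mode differences.
    print(TicketSchema(id=row.id, title=row.title, priority=row.priority))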

Posted 2 weeks ago

Apply

4.0 - 7.0 years

0 Lacs

India

Remote

Source: LinkedIn

Job Title: Generative AI Engineer
Location: Remote
Experience: 4 to 7 Years
Job Type: Contractual
About the Role:
We are seeking talented and experienced Generative AI Engineers to join our growing remote team. In this role, you will be responsible for designing, developing, and deploying cutting-edge generative AI solutions using modern frameworks and LLM ecosystems. You'll collaborate with cross-functional teams to build intelligent, scalable, and production-ready systems.
Key Responsibilities:
Design and develop microservice-based applications for AI use cases using Python and FastAPI.
Build and deploy LLM-based chatbots using GPT-4, LangChain, and Azure OpenAI services.
Develop and maintain retrieval-augmented generation (RAG) pipelines using ChromaDB for document-based Q&A systems.
Integrate solutions with Azure AI tools and services for scalable deployment.
Implement effective prompt engineering techniques to optimize LLM performance.
Collaborate with DevOps and product teams to ensure high-quality delivery and reliability of AI solutions.
Optional: Work with LangGraph for advanced workflow orchestration in LangChain.
Required Skills & Qualifications:
4-7 years of overall experience in AI/ML, with at least 2 years in Generative AI.
Strong proficiency in Python and FastAPI.
Hands-on experience with GPT-4, LangChain, and Azure OpenAI.
Experience working with ChromaDB or other vector databases.
Proven experience in microservices architecture and deployment.
Solid understanding of prompt engineering and chatbot development.
Strong debugging and problem-solving skills.
Nice to Have:
Experience with LangGraph for managing complex LangChain workflows.
Familiarity with MLOps and CI/CD pipelines for AI model deployment.
What We Offer:
Competitive compensation.
100% remote working opportunity.
Work on state-of-the-art AI projects with a skilled and passionate team.
Opportunity to grow and learn in the rapidly evolving GenAI landscape.
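Editor's note: the ChromaDB-backed RAG pipeline described above can be sketched in a few lines: add document chunks to a collection, then query by similarity to build context for the chatbot. The collection name, documents, and the use of Chroma's default embedding function are assumptions for illustration.

# Minimal ChromaDB sketch for a document Q&A pipeline: store chunks, then pull
# the most similar ones for a question. Collection name and texts are placeholders.
import chromadb

client = chromadb.Client()                      # in-memory instance for the sketch
docs = client.create_collection(name="policy_docs")

docs.add(
    ids=["doc-1", "doc-2", "doc-3"],
    documents=[
        "Employees may work remotely up to three days per week.",
        "Expense claims must be filed within 30 days.",
        "VPN access requires multi-factor authentication.",
    ],
    metadatas=[{"source": "hr"}, {"source": "finance"}, {"source": "it"}],
)

results = docs.query(query_texts=["How soon do I need to submit expenses?"], n_results=2)
for doc_id, text in zip(results["ids"][0], results["documents"][0]):
    print(doc_id, text)
# The retrieved chunks would be injected into the GPT-4 prompt as grounding context.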

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


We are seeking a Full Stack Python Developer with strong experience in both frontend and backend development, and deep familiarity with Azure Cloud and serverless architecture. In this full-time role, you'll build modern, scalable web applications and services using Python, JavaScript frameworks, and Azure-native tools. You'll work cross-functionally to develop secure, performant, and user-friendly applications that run entirely in the cloud.

Job Description:

Responsibilities:
  • Develop end-to-end web applications using Python for the backend and React / JavaScript for the frontend.
  • Design, build, and deploy serverless applications on Microsoft Azure using services such as Azure Functions, Azure API Management, Azure Blob Storage, and Azure Cosmos DB / MongoDB.
  • Use the Python runtime inside Azure Functions, building serverless functions with the Python v2 programming model and Azure Blueprints; use Blueprints to define and register new Azure Functions, and use Python modules and an object-oriented programming model to modularize function definition and implementation (see the sketch after this listing).
  • Build and maintain RESTful APIs, microservices, and integrations with third-party services.
  • Work closely with designers, PMs, and QA to deliver high-quality, user-centric applications.
  • Optimize applications for performance, scalability, and cost-efficiency on Azure.
  • Implement DevOps practices using CI/CD pipelines.
  • Write clean, modular, and well-documented code, following best practices and secure coding guidelines.
  • Participate in sprint planning, code reviews, and agile ceremonies.

Required Skills (Must Have):
  • 3–5 years of professional experience in full stack development.
  • Strong proficiency in object-oriented Python, with frameworks like FastAPI, Flask, or Django.
  • Solid experience with frontend frameworks such as React.js or similar.
  • Proven experience with Azure serverless architecture, including Azure Functions, Azure API Management, Azure Storage, and Cosmos DB.
  • Understanding of event-driven architecture and asynchronous APIs in Azure.
  • Experience working with Azure serverless functions, including Durable Functions, within Azure.
  • Experience with API integrations, secure data handling, and cloud-native development.
  • Proficient in working with Git, Agile methodologies, and software development best practices.
  • Ability to design and develop scalable and efficient applications.
  • Excellent problem-solving and analytical skills.
  • Strong communication and teamwork abilities.

Preferred Skills (Good to Have):
  • Experience with Azure App Service, Azure Key Vault, Application Insights, and Azure Monitor for observability and secure deployments.
  • Familiarity with authentication and authorization mechanisms such as Azure Active Directory (Azure AD), OAuth2, and JWT.
  • Exposure to containerization technologies including Docker, Azure Container Registry (ACR), and Azure Kubernetes Service (AKS).
  • Understanding of cost optimization, resilience, and security best practices in cloud-native and serverless applications.
  • Knowledge of integration with the Azure OpenAI service and working with LLM models inside Azure apps.
  • Knowledge of LLM frameworks such as LangChain and LlamaIndex, and experience building intelligent solutions using AI agents and orchestration frameworks.
  • Awareness of modern AI application architecture, including Retrieval-Augmented Generation (RAG) and semantic search.

Qualifications:
  • Bachelor's degree in Computer Science, Computer Engineering, or a related field.
  • 3+ years of experience in software development.
  • Strong understanding of building cloud-native applications in a serverless ecosystem.
  • Strong understanding of software development methodologies (e.g., Agile).

Location: DGS India - Pune - Kharadi EON Free Zone
Brand: Dentsu Creative
Time Type: Full time
Contract Type: Consultant
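The Blueprint-based structure mentioned in the responsibilities could look roughly like the sketch below, assuming a hypothetical hello route; this is an outline of the Python v2 programming model, not code from the employer.

    # http_blueprint.py -- sketch of an Azure Functions Python v2 Blueprint (names are illustrative).
    import azure.functions as func

    bp = func.Blueprint()

    @bp.route(route="hello", methods=["GET"])
    def hello(req: func.HttpRequest) -> func.HttpResponse:
        name = req.params.get("name", "world")
        return func.HttpResponse(f"Hello, {name}!")

    # function_app.py -- the app entry point registers every function defined on the blueprint.
    # app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)
    # app.register_functions(bp)

Splitting routes across Blueprints like this keeps each module focused while function_app.py stays a thin registration layer.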

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Design and development responsibilities include developing, testing, and deploying software components or modules using a microservices architecture, utilizing programming languages such as Python for backend API development and ReactJS for frontend development. The role involves designing and coding different services, creating class diagrams, and re-architecting the existing Python codebase to ensure it is scalable, reusable, and production-ready.

Cloud deployment involves implementing and managing the deployment of microservices in AWS, leveraging services such as EC2, Lambda, ECS, S3, RDS, and others. The role also includes automating repetitive tasks and optimizing development processes using CI/CD pipelines and other tools. Security is a key focus, requiring the implementation of best practices throughout the software development and deployment lifecycle to ensure that all services are secure and compliant with industry standards. Additionally, maintaining comprehensive documentation for software components, processes, and deployments is expected.

From a technical experience perspective, the candidate must demonstrate proficiency in programming languages such as Python and ReactJS. They should have strong experience with AWS services including EC2, S3, RDS, and Lambda. Experience with containerization technologies like Docker and orchestration tools such as ECS and EKS is essential. Familiarity with CI/CD tools such as Jenkins and GitHub Actions is required, along with a solid understanding of REST APIs and messaging systems. Experience with monitoring and logging tools, particularly AWS CloudWatch and the ELK stack, is also important.

The must-have skills for this role include software service development using microservice architecture with ReactJS and Python (FastAPI framework), experience developing solutions within Systems Development Life Cycle (SDLC) processes, and a strong understanding of SOLID principles and best practices. The candidate should also have hands-on experience deploying services in AWS infrastructure, knowledge of Docker containerization, deep expertise in core Python, strong familiarity with REST API concepts, and unit testing experience.

Preferred qualifications include knowledge of Kubernetes, experience with tools such as JIRA and Confluence, and familiarity with integrating GenAI LLM models via API development, along with SSO implementation.

Candidates should have a minimum of 8 years of experience in software development, specifically using a microservice architecture and the AWS cloud stack, with expertise in software engineering using ReactJS and Python FastAPI.
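Given the emphasis on FastAPI services, REST concepts, and unit testing in this listing, the following is a minimal sketch assuming a hypothetical health-check endpoint; the module layout and names are illustrative only.

    # service.py -- illustrative FastAPI microservice endpoint (names are assumptions).
    from fastapi import APIRouter, FastAPI

    router = APIRouter()

    @router.get("/health")
    def health_check() -> dict:
        return {"status": "ok"}

    app = FastAPI()
    app.include_router(router)

    # test_service.py -- unit test using FastAPI's TestClient (the httpx package must be installed).
    from fastapi.testclient import TestClient

    client = TestClient(app)

    def test_health_check():
        response = client.get("/health")
        assert response.status_code == 200
        assert response.json() == {"status": "ok"}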

Posted 2 weeks ago

Apply

Exploring FastAPI Jobs in India

FastAPI is a modern web framework for building APIs with Python that is gaining popularity in the tech industry. If you are a job seeker looking to explore opportunities in the FastAPI domain in India, you're in the right place. This article provides insights into the FastAPI job market in India, including top hiring locations, salary ranges, career progression, related skills, and common interview questions.

Top Hiring Locations in India

  1. Bengaluru
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

Average Salary Range

The salary range for FastAPI professionals in India varies by experience level. Entry-level positions can expect a salary range of INR 4-6 lakhs per annum, while experienced professionals can earn anywhere from INR 10-20 lakhs per annum.

Career Path

In the FastAPI domain, a career typically progresses as follows:

  1. Junior Developer
  2. Mid-level Developer
  3. Senior Developer
  4. Tech Lead

Related Skills

Besides proficiency in FastAPI itself, skills that are often expected or helpful include:

  • Python programming
  • RESTful APIs
  • Database management (SQL or NoSQL)
  • Frontend technologies like HTML, CSS, and JavaScript

Interview Questions

  • What is FastAPI and how is it different from other Python web frameworks? (basic)
  • Explain the main features of FastAPI. (basic)
  • How do you handle authentication and authorization in FastAPI? (medium)
  • Can you explain dependency injection in FastAPI? (medium)
  • What is Pydantic and how is it used in FastAPI? (medium) (a short sketch after this list illustrates Pydantic validation and dependency injection)
  • How do you handle request validation in FastAPI? (medium)
  • What are the advantages of using asynchronous programming with FastAPI? (medium)
  • How do you perform testing in FastAPI applications? (medium)
  • Explain the role of middleware in FastAPI. (medium)
  • Can you discuss the performance benefits of FastAPI compared to other frameworks? (advanced)
  • How do you handle background tasks in FastAPI? (advanced)
  • Explain the process of deploying a FastAPI application to production. (advanced)
  • How does FastAPI handle exceptions and errors? (advanced)
  • What are OpenAPI schemas and how are they used in FastAPI? (advanced)
  • Can you discuss the scalability aspects of FastAPI applications? (advanced)
  • How do you optimize database queries in FastAPI applications? (advanced)
  • Explain the process of integrating FastAPI with Docker. (advanced)
  • What are API routers in FastAPI and how do you use them? (advanced)
  • How do you handle file uploads in FastAPI applications? (advanced)
  • Can you discuss the security best practices in FastAPI development? (advanced)
  • Explain the process of versioning APIs in FastAPI. (advanced)
  • How do you handle CORS in FastAPI applications? (advanced)
  • What is the role of dependency management in FastAPI projects? (advanced)
  • How do you monitor and log FastAPI applications in production? (advanced)
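To make a few of the questions above concrete (Pydantic models, request validation, and dependency injection), here is a minimal sketch; the UserCreate model, its field constraints, and the settings dependency are illustrative assumptions rather than canonical answers.

    # Illustrative sketch for several of the interview topics above; all names are assumptions.
    from fastapi import Depends, FastAPI
    from pydantic import BaseModel, Field

    app = FastAPI()

    class UserCreate(BaseModel):
        # Pydantic performs request validation: invalid payloads get a 422 response automatically.
        username: str = Field(min_length=3)
        age: int = Field(ge=18)

    def get_settings() -> dict:
        # A dependency: FastAPI calls this and injects its return value into the endpoint.
        return {"greeting": "Welcome"}

    @app.post("/users")
    def create_user(user: UserCreate, settings: dict = Depends(get_settings)) -> dict:
        return {"message": f"{settings['greeting']}, {user.username}!", "age": user.age}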

Closing Remark

As you explore opportunities in the FastAPI job market in India, remember to prepare thoroughly and apply confidently. With the right skills and knowledge, you can excel in your career as a FastAPI professional. Good luck!
