
1760 FastAPI Jobs - Page 42

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

India

Remote

Source: LinkedIn

Job Title: Backend Engineer
Job Type: Full-time, Contractor
Location: Remote

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary: Join our customer's team as a Backend Engineer, where you will design, develop, and maintain innovative backend systems powering next-generation applications. Leverage your expertise in Python and cloud-native technologies to build robust, scalable services in a dynamic and collaborative environment. If you are passionate about backend excellence and eager to learn, this is your perfect next step.

Key Responsibilities:
- Design, develop, and maintain backend systems using Python, Flask, and SQL.
- Implement and manage RESTful APIs to support front-end and external integrations.
- Deploy, monitor, and manage applications on AWS and Kubernetes platforms.
- Collaborate closely with front-end developers and cross-functional team members to ensure seamless and efficient system integration.
- Optimize backend performance, scalability, and reliability to support growing user and data demands.
- Troubleshoot, debug, and resolve complex software and infrastructure issues.
- Actively contribute to continuous improvement and share knowledge within the team.

Required Skills and Qualifications:
- Proven experience with Python development and frameworks such as Flask and FastAPI.
- Strong understanding of SQL, with production experience using MySQL databases.
- Hands-on expertise deploying and managing applications on AWS and Kubernetes.
- Excellent written and verbal communication skills, with a focus on clarity and collaboration.
- Experience building, documenting, and maintaining REST APIs.
- Demonstrated ability to work remotely and independently while thriving in a collaborative team setting.
- Proactive attitude with a hunger to learn, adapt, and rapidly master new technologies.

Preferred Qualifications:
- Familiarity with continuous integration and deployment pipelines (CI/CD).
- Previous experience working in agile and distributed teams.
- Knowledge of system monitoring, logging, and performance tuning tools.
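For readers gauging the RESTful API work this posting describes, below is a minimal FastAPI sketch of the kind of endpoint involved; the Item resource and its fields are hypothetical examples, not part of the listing, and a real service would back them with MySQL rather than an in-memory dict.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

items = {}  # in-memory store standing in for a real MySQL table

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item):
    # Reject duplicates so the endpoint stays idempotent-friendly.
    if item_id in items:
        raise HTTPException(status_code=409, detail="Item already exists")
    items[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int):
    if item_id not in items:
        raise HTTPException(status_code=404, detail="Item not found")
    return items[item_id]

# Run locally with: uvicorn main:app --reload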

Posted 2 weeks ago

Apply

0.0 - 1.0 years

0 Lacs

Chennai District, Tamil Nadu

On-site

Source: Indeed

We are looking for young minds with a minimum of 0-1 years of experience in Python development and engineering to develop our SaaS products. This role provides the opportunity to learn product development and gain software engineering experience to build a successful career. If you are an energetic and goal-oriented person, read further.

Location: Chennai
Graduation: BE/ME/MCA/MS Information Technology

Job Responsibilities:
- Develop software modules and functionalities as per the requirements using Python and related libraries.
- Develop Python-based jobs, functions, and data processing modules as per the design guidelines provided by the senior developer and lead.
- Test the developed modules as per the requirements and confirm functional readiness.
- Integrate with the PostgreSQL database, third-party APIs, and libraries to complete end-to-end functionality.
- Responsible for SQL query development.
- Follow best practices while implementing application modules.

Required Candidate Profile:
- Some basic experience in Python Flask, Django, and FastAPI.
- Exposure to data processing using Python libraries is an added value.
- Exposure to Celery jobs, RabbitMQ, and other Python libraries is an added value.
- Good understanding of SQL (PostgreSQL or MySQL).
- Should be strong in backend technologies.

Please note that due to the high volume of applications, we may not reply to all submissions. Only qualified candidates will receive an email response and call.

Job Type: Full-time
Schedule: Morning shift
Education: Bachelor's (Required)
Experience: Python: 1 year (Required)
Location: Chennai District, Tamil Nadu (Required)
Work Location: In person
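As a rough illustration of the Celery/RabbitMQ exposure the profile mentions, here is a minimal task sketch; the broker URL and task body are placeholders, assuming a local RabbitMQ instance.

from celery import Celery

# Broker URL is a placeholder; RabbitMQ's default local URL is assumed here.
app = Celery("jobs", broker="amqp://guest:guest@localhost:5672//")

@app.task
def process_record(record_id: int) -> str:
    # Stand-in for a data processing step, e.g. reading from PostgreSQL,
    # transforming the row, and writing results back.
    return f"processed record {record_id}"

# Enqueue from application code:
# process_record.delay(42)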

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Title: Python Developer
Experience Level: 5-8 Years
Location: Hyderabad

Job Description
We are seeking an experienced Lead Python Developer with a proven track record of building scalable and secure applications, specifically in the travel and tourism industry. The ideal candidate should possess in-depth knowledge of Python, modern development frameworks, and expertise in integrating third-party travel APIs. This role demands a leader who can foster innovation while adhering to industry standards for security, scalability, and performance.

Roles and Responsibilities
- Application Development: Architect and develop robust, high-performance applications using Python frameworks such as Django, Flask, and FastAPI.
- API Integration: Design and implement seamless integration with third-party APIs, including GDS, CRS, OTA, and airline-specific APIs, to enable real-time data retrieval for booking, pricing, and availability.
- Data Management: Develop and optimize complex data pipelines to manage structured and unstructured data, utilizing ETL processes, data lakes, and distributed storage solutions.
- Microservices Architecture: Build modular applications using microservices principles to ensure scalability, independent deployment, and high availability.
- Performance Optimization: Enhance application performance through efficient resource management, load balancing, and faster query handling to deliver an exceptional user experience.
- Security and Compliance: Implement secure coding practices, manage data encryption, and ensure compliance with industry standards such as PCI DSS and GDPR.
- Automation and Deployment: Leverage CI/CD pipelines, containerization, and orchestration tools to automate testing, deployment, and monitoring processes.
- Collaboration: Work closely with front-end developers, product managers, and stakeholders to deliver high-quality, user-centric solutions aligned with business goals.

Requirements
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Technical Expertise:
  - At least 4 years of hands-on experience with Python frameworks like Django, Flask, and FastAPI.
  - Proficiency in RESTful APIs, GraphQL, and asynchronous programming.
  - Strong knowledge of SQL/NoSQL databases (PostgreSQL, MongoDB) and big data tools (e.g., Spark, Kafka).
  - Experience with cloud platforms (AWS, Azure, Google Cloud), containerization (Docker, Kubernetes), and CI/CD tools (e.g., Jenkins, GitLab CI).
  - Familiarity with testing tools such as PyTest, Selenium, and SonarQube.
  - Expertise in travel APIs, booking flows, and payment gateway integrations.
- Soft Skills:
  - Excellent problem-solving and analytical abilities.
  - Strong communication, presentation, and teamwork skills.
  - A proactive attitude with a willingness to take ownership and perform under pressure.
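To illustrate the asynchronous third-party API integration this role centres on, here is a small FastAPI + httpx sketch; the supplier URL, route parameter, and response schema are hypothetical stand-ins for real GDS/OTA endpoints, which have their own authentication and payloads.

import httpx
from fastapi import FastAPI

app = FastAPI()

# Hypothetical supplier endpoint; real GDS/OTA APIs differ in auth and schema.
SUPPLIER_URL = "https://api.example-ota.com/v1/availability"

@app.get("/availability/{route}")
async def availability(route: str):
    # Non-blocking outbound call so one slow supplier does not stall the worker.
    async with httpx.AsyncClient(timeout=10.0) as client:
        resp = await client.get(SUPPLIER_URL, params={"route": route})
        resp.raise_for_status()
        return resp.json()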

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

🚀 Job Title: Backend Engineer – Python (Fresher Friendly)
Location: Hyderabad, India
Type: Full-time | In-office / Hybrid
Experience: 0–2 years

💡 About Nova
Nova is India's first vibe-first, emotionally intelligent dating app for Gen Z and Millennials. We don't do mindless swipes. We build respectful, meaningful, and calm digital spaces for people to connect with emotional clarity, consent, and kindness — powered by smart AI.

🛠 What You'll Do
- Build, maintain, and optimize APIs and backend services that power our core dating experience (matching, messaging, nudges).
- Implement features like NovaScore™, Red Flag Detection, Karma Reputation System, and Engagement Engine using clean Python code.
- Collaborate with frontend, design, and AI teams to create emotionally safe and responsive user flows.
- Work with modern cloud backends (Firebase, Supabase, or PostgreSQL-based stacks).
- Integrate AI services into the backend for emotional tone analysis, red flag detection, and match recommendations.
- Write scalable code for real-time chat, behavioral logging, and user safety controls.

💻 Tech Stack
Primary Language: Python (FastAPI, Django)
Database: Firebase Firestore / PostgreSQL
Cloud Services: Supabase / Google Cloud Platform / Firebase
AI/ML Integration: NLP APIs, transformer models (preferred but not required)
Tools: Git, Docker, REST APIs, Firebase Functions

🧠 Who We're Looking For
- Fresh graduates or early-career engineers with solid Python fundamentals
- Passionate about emotionally aware and ethical tech
- Curious about AI and how it can support mental well-being and human connection
- Eager to learn and contribute to a small, fast-moving, impact-focused team
- Comfortable with debugging, taking ownership, and shipping features quickly

🌱 Bonus (but not mandatory)
- Built or contributed to a personal project using a Python backend
- Exposure to AI/NLP models or Firebase services
- Understanding of RESTful API architecture and security best practices

💖 What You'll Love
- Be part of a startup rewriting the rules of dating in India
- Build features that matter — real impact on user emotions and well-being
- Learn fast in a collaborative, user-obsessed team
- Work on futuristic AI tools to promote emotional safety and consent
- Office snacks + chai breaks + real banter (we mean it!)

📩 How to Apply
Send us a short note with:
- What excites you about Nova
- One Python project you're proud of (can be a GitHub link or description)
- Your resume
Email: novadatingapp9@gmail.com
Subject: Application – Backend Engineer – Nova

Posted 2 weeks ago

Apply

5.0 - 9.0 years

37 - 55 Lacs

Gurugram, Greater Noida, Delhi / NCR

Hybrid

Source: Naukri

Overview: We are seeking an exceptional and technically accomplished Senior Data Scientist to join our rapidly growing team. This role is ideal for a hands-on AI expert with deep experience in Large Language Models (LLMs), Natural Language Processing (NLP), Cognitive AI, Conversational AI, and Agentic AI systems. You will work on designing, developing, and scaling state-of-the-art AI solutions that drive business innovation, operational intelligence, and customer experience transformation. This is a high-impact individual contributor role, focused on technical excellence and product-grade AI development.

Key Responsibilities

LLM & Cognitive AI Solution Development
- Architect and fine-tune LLM-based solutions (e.g., GPT, LLaMA, Claude, Mistral) using transfer learning, PEFT, and retrieval-augmented generation (RAG).
- Develop Agentic AI architectures with autonomous behavior and multi-agent collaboration capabilities.
- Apply prompt engineering, in-context learning, and Toolformer-style techniques to enable complex reasoning and task execution.
- Build Conversational AI systems using Transformer-based backbones and orchestration tools (LangChain, Haystack, Rasa, Semantic Kernel).

Natural Language Processing (NLP)
- Design and implement advanced NLP pipelines for semantic search, summarization, NER, text classification, and knowledge extraction.
- Utilize and integrate tools such as SpaCy, BERT, RoBERTa, GloVe, Word2Vec, NLTK, and TextBlob.
- Conduct advanced text preprocessing, dependency parsing, sentiment analysis, tokenization, and multilingual NLP modeling.

AI/ML Engineering
- Build scalable ML pipelines with FastAPI, gRPC, and RESTful interfaces.
- Containerize services using Docker and deploy to production environments via Kubernetes and cloud-native workflows.
- Integrate models into real-time and batch data systems, optimizing for latency, throughput, and reliability.

Cloud & Infrastructure
- Develop and deploy solutions on AWS, Azure, or Google Cloud Platform (GCP).
- Leverage services like SageMaker, Vertex AI, Azure ML Studio, and serverless orchestration tools.

MLOps & CI/CD
- Design and maintain end-to-end MLOps pipelines using tools like MLflow, Kubeflow, or TFX for reproducibility and traceability.
- Automate model testing, monitoring, versioning, and rollback strategies in production.

Big Data & Analytics
- Apply Spark, Hadoop, and Hive to process large-scale datasets efficiently.
- Perform complex data transformations and statistical modeling to extract actionable insights.

Database & Data Engineering
- Interface with SQL and NoSQL systems (PostgreSQL, MongoDB, DynamoDB, etc.) for efficient data retrieval and processing.
- Design data schemas and optimize query performance for AI-centric applications.

Cross-functional Collaboration
- Work closely with product managers, engineers, and domain experts to identify high-impact problems and deliver AI-driven solutions.
- Communicate findings and solution architectures clearly to technical and non-technical stakeholders.

Technical Skills
- Expert in Python, PyTorch, HuggingFace Transformers, LangChain, and OpenAI/Anthropic/LLaMA APIs.
- Deep understanding of Transformer architectures, Attention Mechanisms, and LLM fine-tuning techniques.
- Familiarity with Agentic AI paradigms, including memory management, autonomous goal completion, and tool use.
- Strong experience with cloud platforms, container orchestration (Docker + Kubernetes), and deployment automation.
- Proficient in statistical modeling, probabilistic reasoning, and experimental design.

Bonus
- Experience in Online Reputation Management (ORM) or product-centric platforms.
- Publications or open-source contributions in AI/ML/NLP.
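As a sketch of the retrieval step in the RAG workflows listed above, the snippet below uses sentence-transformers and FAISS; the model name, documents, and query are illustrative only, and a production pipeline would feed the retrieved passages into an LLM prompt as grounding context.

import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "Our refund policy allows returns within 30 days.",
    "Premium support is available on the enterprise plan.",
    "The API rate limit is 100 requests per minute.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedder
doc_vecs = model.encode(docs, normalize_embeddings=True)

# Inner product on normalized vectors is equivalent to cosine similarity.
index = faiss.IndexFlatIP(doc_vecs.shape[1])
index.add(np.asarray(doc_vecs, dtype="float32"))

query = "How many API calls can I make per minute?"
q_vec = model.encode([query], normalize_embeddings=True)
scores, ids = index.search(np.asarray(q_vec, dtype="float32"), 2)

for rank, doc_id in enumerate(ids[0]):
    print(rank, float(scores[0][rank]), docs[doc_id])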

Posted 2 weeks ago

Apply

7.0 - 12.0 years

40 - 65 Lacs

New Delhi, Gurugram, Greater Noida

Hybrid

Source: Naukri

Role Overview: We're looking for a Lead Data Scientist to join our fast-growing, mission-driven team. This is a high-impact leadership role at the intersection of technology, strategy, and innovation, where your expertise will power intelligent solutions that redefine how we think about data, products, and customer experience. If you thrive in high-performance environments and want to lead cutting-edge projects with Generative AI, LLMs, and large-scale NLP systems, this is your opportunity to make a lasting impact.

Key Responsibilities:

Build the Future with AI/ML
- Architect scalable AI systems to solve real-world, high-value business challenges across domains.
- Design and deploy cutting-edge solutions with Generative AI, Transformer architectures, and custom Large Language Models (LLMs).
- Apply state-of-the-art ML techniques (prompt engineering, transfer learning, multi-modal learning) to push the boundaries of what's possible.

NLP at Scale
- Lead the development of advanced NLP systems: semantic search, question answering, summarization, classification, and conversational AI.
- Work with best-in-class models and libraries: BERT, GPT, RoBERTa, Word2Vec, SpaCy, NLTK, Hugging Face Transformers, and more.
- Leverage language understanding to transform unstructured data into strategic insights.

Full-Lifecycle Model Deployment
- Build robust MLOps pipelines for training, testing, deployment, and monitoring using MLflow, Kubeflow, Airflow, and CI/CD tools.
- Containerize and orchestrate AI models using Docker and Kubernetes across cloud environments (AWS, GCP, Azure).
- Design APIs and inference services using FastAPI or gRPC, optimized for performance, scalability, and uptime.

Lead, Inspire, and Elevate
- Build and mentor a world-class data science team, nurturing talent and fostering a culture of curiosity, experimentation, and technical excellence.
- Drive the data science strategy, influence key decisions, and align AI initiatives with business goals.
- Establish best-in-class practices for model development, code quality, documentation, and performance evaluation.

Data at the Core
- Work hands-on with structured, semi-structured, and unstructured data.
- Optimize high-volume data pipelines, integrate with SQL and NoSQL databases, and leverage big data ecosystems like Spark, Hive, and Hadoop.
- Use statistical modeling and experimental design to derive actionable intelligence from data.

Collaborate Across the Enterprise
- Work closely with engineering, product, and executive teams to define problems, explore solutions, and deliver AI products that move the needle.
- Translate technical concepts into strategic recommendations and insights that influence roadmaps and revenue.
- Champion AI literacy and advocate for ethical, explainable, and scalable data science across the organization.

Education
Bachelor's or Master's in Computer Science, AI, Data Science, Machine Learning, Engineering, or a related technical discipline. A PhD is a plus.

Experience
- 6+ years of experience in data science or machine learning, with 2+ years in a leadership or managerial role.
- Proven record of building and deploying AI solutions at scale in real-world production environments.
- Experience in GenAI, NLP, cloud-native architectures, and full-cycle ML development.

Technical Expertise
- Languages & Libraries: Python, Scikit-learn, PyTorch, TensorFlow, Hugging Face, NumPy, Pandas
- NLP: BERT, GPT, Word2Vec, SpaCy, NLTK, TextBlob, CoreNLP, Transformer-based architectures
- Cloud & Deployment: AWS, Azure, GCP, Docker, Kubernetes, FastAPI, gRPC
- Big Data: Spark, Hive, Hadoop, Kafka
- Data Systems: PostgreSQL, MongoDB, Cassandra, Snowflake, BigQuery
- MLOps & Monitoring: MLflow, Airflow, Kubeflow, Prometheus, Grafana

Nice to Have
- Experience in online reputation management, customer analytics, or product-led organizations.
- Familiarity with responsible AI, fairness, and explainability techniques (SHAP, LIME).
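To make the MLOps tooling above concrete, here is a small MLflow tracking sketch; the experiment name, model, and dataset are arbitrary examples, not taken from the posting.

import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

mlflow.set_experiment("demo-classifier")  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))

    # Parameters, metrics, and the serialized model are recorded for traceability.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")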

Posted 2 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Description
Navtech is looking for an AI/ML Engineer to join our growing data science and machine learning team. In this role, you will be responsible for building, deploying, and maintaining machine learning models and pipelines that power intelligent products and data-driven decisions.

Working as an AI/ML Engineer at Navtech, you will:
- Design, develop, and deploy machine learning models for classification, regression, clustering, recommendations, or NLP tasks.
- Clean, preprocess, and analyze large datasets to extract meaningful insights and features.
- Work closely with data engineers to develop scalable and reliable data pipelines.
- Experiment with different algorithms and techniques to improve model performance.
- Monitor and maintain production ML models, including retraining and model drift detection.
- Collaborate with software engineers to integrate ML models into applications and services.
- Document processes, experiments, and decisions for reproducibility and transparency.
- Stay current with the latest research and trends in machine learning and AI.

Who are we looking for exactly?
- 2-4 years of hands-on experience in building and deploying ML models in real-world applications.
- Strong knowledge of Python and ML libraries such as Scikit-learn, TensorFlow, PyTorch, XGBoost, or similar.
- Experience with data preprocessing, feature engineering, and model evaluation techniques.
- Solid understanding of ML concepts such as supervised and unsupervised learning, overfitting, regularization, etc.
- Experience working with Jupyter, pandas, NumPy, and visualization libraries like Matplotlib or Seaborn.
- Familiarity with version control (Git) and basic software engineering practices.
- You consistently demonstrate strong verbal and written communication skills as well as strong analytical and problem-solving abilities.
- You should have a master's or bachelor's (BS) degree in Computer Science, Software Engineering, IT, Technology Management, or a related field, with education throughout in English medium.

We'll REALLY love you if you:
- Have knowledge of cloud platforms (AWS, Azure, GCP) and ML services (SageMaker, Vertex AI, etc.).
- Have knowledge of GenAI prompting and hosting of LLMs.
- Have experience with NLP libraries (spaCy, Hugging Face Transformers, NLTK).
- Have familiarity with MLOps tools and practices (MLflow, DVC, Kubeflow, etc.).
- Have exposure to deep learning and neural network architectures.
- Have knowledge of REST APIs and how to serve ML models (e.g., Flask, FastAPI, Docker).

Why Navtech?
- Performance review and appraisal twice a year.
- Competitive pay package with additional bonus & benefits.
- Work with US, UK & Europe based industry-renowned clients for exponential technical growth.
- Medical insurance cover for self & immediate family.
- Work with a culturally diverse team.

Navtech is a premier IT software and services provider. Navtech's mission is to increase public cloud adoption and build cloud-first solutions that become trendsetting platforms of the future. We have been recognized as the Best Cloud Service Provider at GoodFirms for ensuring good results with quality services. Here, we strive to innovate and push technology and service boundaries to provide best-in-class technology solutions to our clients at scale. We deliver to our clients globally from our state-of-the-art design and development centers in the US & Hyderabad. We're a fast-growing company with clients in the United States, UK, and Europe. We are also a certified AWS partner.

You will join a team of talented developers, quality engineers, and product managers whose mission is to impact above 100 million people across the world with technological services by the year 2030. (ref:hirist.tech)
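A brief sketch of the preprocessing, regularization, and model-evaluation skills this listing asks for, using a scikit-learn pipeline with cross-validation; the dataset and hyperparameters are illustrative.

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

# Keeping preprocessing inside the pipeline means it is re-fit per CV fold,
# avoiding data leakage; C controls the strength of L2 regularization.
clf = make_pipeline(StandardScaler(), LogisticRegression(C=1.0, max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(scores.mean(), scores.std())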

Posted 2 weeks ago

Apply

1.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Source: LinkedIn

Key Responsibilities
- Fine-tune and deploy LLMs (e.g., LLaMA 3.2) using LoRA, QLoRA, and related optimization techniques
- Build intelligent systems combining NLP, image analysis, and pattern recognition
- Develop and integrate retrieval-augmented generation (RAG) pipelines using LangChain, LlamaIndex, and vector databases like FAISS or Weaviate
- Apply statistical analysis, feature engineering, and advanced data analysis techniques using NumPy and Pandas
- Train and evaluate models with CNNs, transformers, and neural networks in PyTorch, TensorFlow, or Keras
- Package and deploy scalable inference APIs with FastAPI on AWS or Azure
- Collaborate directly with product and engineering teams to ship AI features into production

Requirements
- Possess 1+ years of hands-on experience in data science, machine learning, or deep learning roles
- Demonstrate strong experience with Python, NumPy, Pandas, and modern ML frameworks (PyTorch or TensorFlow)
- Show practical understanding of transformers, CNNs, and neural network training pipelines
- Have exposure to LLMs, vector databases, and/or retrieval-augmented generation (RAG) systems
- Be familiar with FastAPI and deploying ML models to the cloud (AWS or Azure)
- Hold a solid grounding in statistics, data wrangling, and model evaluation techniques

About Company: Softsensor.ai is a USA and India-based corporation focused on delivering outcomes to clients using data. Our expertise lies in a collection of people, methods, and accelerators to rapidly deploy solutions for our clients. Our principals have significant experience with leading global consulting firms & corporations and delivering large-scale solutions. We are focused on data science and analytics for improving process and organizational performance. We are working on cutting-edge data science technologies like NLP, CNN, and RNN and applying them in the business context.
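The LoRA fine-tuning mentioned above might look roughly like the following with Hugging Face PEFT; the base checkpoint (gpt2) and the target module names are placeholders, since the real projection names depend on the LLaMA variant actually used, and training would then proceed with a standard transformers Trainer loop.

from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM

# Base checkpoint is illustrative; substitute the model you are licensed to use.
base = AutoModelForCausalLM.from_pretrained("gpt2")

lora_cfg = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                        # low-rank adapter dimension
    lora_alpha=16,              # scaling factor applied to the adapter output
    lora_dropout=0.05,
    target_modules=["c_attn"],  # attention projection names differ per architecture
)

model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()  # only the small adapter matrices are trainable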

Posted 2 weeks ago

Apply

1.0 - 31.0 years

0 - 0 Lacs

Vaishali Nagar, Jaipur

Remote

Source: Apna

Job Title: Python Developer (Django, Flask, FastAPI)
Location: On-site – Jaipur, Rajasthan
Experience Required: Minimum 2 Years
Company Type: Startup – Fast-paced and growth-focused
Mode: Full-time | Immediate Joiners Preferred

About the Role: Briskcovey Technologies is actively looking for a Python Developer with strong hands-on experience in Django, Flask, and FastAPI. The ideal candidate should be passionate about backend development, have a startup mindset, and be ready to contribute to a high-growth tech environment.

Key Responsibilities:
• Design and implement robust backend systems using Django, Flask, and FastAPI.
• Integrate third-party APIs and services.
• Build scalable and secure RESTful APIs.
• Optimize application performance; write clean and maintainable code.
• Collaborate with frontend developers, QA, and design teams for smooth deployment.
• Participate in code reviews, testing, and troubleshooting.

Key Skills Required:
• Minimum 2 years of experience with Python and related web frameworks (Django, Flask, FastAPI).
• Strong knowledge of REST APIs, ORMs, and SQL/NoSQL databases.
• Familiarity with version control (Git) and CI/CD pipelines.
• Knowledge of Docker, PostgreSQL, or cloud services is a plus.
• Strong problem-solving skills and startup adaptability.

Why Join Us?
• Opportunity to grow in a startup environment with real-time exposure to projects.
• Collaborative team and learning-driven atmosphere.
• Fixed Saturday & Sunday off.
• Competitive compensation and skill-based growth path.

Apply Now: Send your resume to hr@briskcovey.com
Walk-in Interviews (Jaipur): Monday to Friday, 12 PM – 5 PM

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

India

On-site

Source: LinkedIn

Description

About the advertising web applications team
The advertising web applications team is responsible for the customer systems allowing users to explore our advertising offerings and location/audience insights, book omni-channel advertising campaigns, and view detailed campaign reports and location analytics. We build UI components and pages with React and Angular and API endpoints with Python/Flask and Java. We also work closely with AWS and manage a number of related microservices: auth, search, video transcoding, feature flags, etc.

The senior level represents expert professionals with significant experience who plan, design, organize, and execute large units of work in collaboration with stakeholders. These employees are subject matter experts in technologies and business practices who coach, mentor, and supervise less experienced staff members.

You Will
- Lead engineering efforts across multiple software components.
- Write excellent production code and tests and help others improve in code reviews.
- Analyze high-level requirements to design, document, estimate, and build systems.
- Collaborate with your supporting DevOps engineer to create reusable architectural components.
- Coach and mentor engineers within the team to develop their skills and abilities.
- Continuously improve the team's practices in code quality, reliability, performance, testing, automation, logging, monitoring, alerting, and build processes.

You Have
This is our ideal wish list, but most people don't check every box on every job description. If you meet most of the criteria below and you're excited about the opportunity and willing to learn, we'd love to hear from you.
- Education and professional experience: B.Tech/B.E. or Master's degree in CS or other engineering branches with 6+ years of experience in technology, or 8+ years of experience in technology.
- The following skills/certifications: JavaScript, Python, SQL/MySQL, AWS, Git
- Additional nice-to-have skills/certifications: Flask, FastAPI, React, Angular, Linux, Elasticsearch

You Are
- A team player who is organized, flexible, and willing to adapt.
- Not afraid of new technologies and driven to learn.
- A detail-oriented person who catches problems early and adjusts.
- A strong communicator who can collaborate with multiple business and engineering stakeholders and work through conflicting needs.
- A problem solver with a maker mindset, working with multiple teams to propose and build software solutions that make a positive impact on the business.

Benefits
At GroundTruth, we want our employees to be comfortable with their benefits so they can focus on doing the work they love.
- Parental leave - maternity and paternity
- Flexible time off (earned leaves, sick leaves, birthday leave, bereavement leave & company holidays)
- In-office daily catered breakfast, lunch, snacks, and beverages
- Health cover for any hospitalization; covers both nuclear family and parents
- Tele-med for free doctor consultation, discounts on health checkups and medicines
- Wellness/gym reimbursement
- Pet expense reimbursement
- Childcare expenses and reimbursements
- Employee referral program
- Education reimbursement program
- Skill development program
- Cell phone reimbursement (mobile subsidy program)
- Internet reimbursement/postpaid cell phone bill/or both
- Birthday treat reimbursement
- Employee Provident Fund Scheme offering different tax saving options such as Voluntary Provident Fund and employee and employer contribution up to 12% of basic
- Creche reimbursement
- Co-working space reimbursement
- National Pension System employer match
- Meal card for tax benefit
- Special benefits on salary account

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Location(s): Quay Building 8th Floor, Bagmane Tech Park, Bengaluru, IN
Line of Business: Data Estate (DE)
Job Category: Engineering & Technology
Experience Level: Experienced Hire

At Moody's, we unite the brightest minds to turn today's risks into tomorrow's opportunities. We do this by striving to create an inclusive environment where everyone feels welcome to be who they are, with the freedom to exchange ideas, think innovatively, and listen to each other and customers in meaningful ways.

If you are excited about this opportunity but do not meet every single requirement, please apply! You still may be a great fit for this role or other open roles. We are seeking candidates who model our values: invest in every relationship, lead with curiosity, champion diverse perspectives, turn inputs into actions, and uphold trust through integrity.

Role Overview
We are seeking a highly skilled and experienced Senior Full Stack Engineer to join our dynamic team. You will play a crucial role in designing, developing, deploying, and maintaining highly resilient, low-latency web applications that form the core of our user experience. We're looking for a hands-on expert with deep proficiency in the modern JavaScript ecosystem, particularly Node.js, TypeScript, and React. While your core expertise lies in JavaScript technologies, experience developing backend systems with Python and/or Java is valuable. As a senior member of the team, you will significantly influence our technical direction, mentor other engineers, and champion software development best practices.

Key Responsibilities
- Take ownership of the design, development, testing, deployment, and maintenance of robust, scalable, highly resilient, low-latency web applications
- Lead the implementation of complex features, focusing on performant front-end solutions (React, TypeScript) and efficient back-end services (primarily Node.js)
- Architect and implement solutions optimized for speed, scalability, and reliability across the entire stack
- Design, build, document, and maintain clean, efficient, and scalable APIs
- Collaborate effectively with product managers, designers, and fellow engineers to translate requirements into well-architected technical solutions
- Write high-quality, maintainable, secure, and thoroughly tested code
- Actively participate in code reviews, providing constructive feedback
- Diagnose, troubleshoot, and resolve complex technical issues across all environments
- Mentor junior and mid-level engineers, fostering their technical growth
- Stay abreast of emerging web technologies, evaluating and proposing their adoption where beneficial
- Contribute significantly to architectural discussions, helping to shape our technical landscape

Required Qualifications & Skills
- 5+ years of professional experience in full-stack software development, with a proven track record of shipping complex web applications
- Demonstrable experience building and operating web applications with high availability and low latency
- Strong proficiency in JavaScript and TypeScript
- Extensive experience with Node.js for building scalable back-end services
- Strong proficiency in React and its ecosystem (state management, hooks)
- Solid command of modern web technologies (HTML5, CSS3)
- Experience designing and building robust APIs following RESTful principles
- Understanding of fundamental software engineering principles and architectural design patterns
- Experience working with relational databases and at least one NoSQL database
- Proficiency with Git and modern CI/CD practices
- Experience with testing frameworks (unit, integration, end-to-end)
- Strong analytical, problem-solving, and debugging capabilities
- Excellent communication and interpersonal skills

Preferred Qualifications & Skills
- Experience with Python (Django, Flask, FastAPI) and/or Java (Spring Boot)
- Familiarity with graph databases, particularly Neo4j
- Cloud platform experience (AWS, Azure, or GCP)
- Experience with Docker and Kubernetes
- Knowledge of microservices architecture patterns
- Experience with caching strategies (Redis, Memcached)
- Understanding of message queues and event-driven architecture
- Experience with observability tools for monitoring, logging, and tracing

Moody's is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, sexual orientation, gender expression, gender identity or any other characteristic protected by law. Candidates for Moody's Corporation may be asked to disclose securities holdings pursuant to Moody's Policy for Securities Trading and the requirements of the position. Employment is contingent upon compliance with the Policy, including remediation of positions in those holdings as necessary. For more information on the Securities Trading Program, please refer to the STP Quick Reference guide on ComplianceNet.

Please note: STP categories are assigned by the hiring teams and are subject to change over the course of an employee's tenure with Moody's.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Source: LinkedIn

About the Company: Transnational AI Private Limited is a next-generation AI-first company committed to building scalable, intelligent systems for the digital marketplace, insurance, employment, and healthcare sectors. We drive innovation through AI engineering, data science, and seamless platform integration powered by event-driven architectures.

Role Summary: We are looking for a highly motivated AI Engineer with strong experience in Python, FastAPI, and event-driven microservice architecture. You will be instrumental in building intelligent, real-time systems that power scalable AI workflows across our platforms. This role combines deep technical engineering skills with a product-oriented mindset.

Key Responsibilities:
- Architect and develop AI microservices using Python and FastAPI within an event-driven ecosystem.
- Implement and maintain asynchronous communication between services using message brokers like Kafka, RabbitMQ, or NATS.
- Convert AI/ML models into production-grade, containerized services integrated with streaming and event-processing pipelines.
- Design and document async REST APIs and event-based endpoints with comprehensive OpenAPI/Swagger documentation.
- Collaborate with AI researchers, product managers, and DevOps engineers to deploy scalable and secure services.
- Develop reusable libraries, automation scripts, and shared components for AI/ML pipelines.
- Maintain high standards for code quality, testability, and observability using unit tests, logging, and monitoring tools.
- Work within Agile teams to ship features iteratively with a focus on scalability, resilience, and fault tolerance.

Required Skills and Experience:
- Proficiency in Python 3.x with a solid understanding of asynchronous programming (async/await).
- Hands-on experience with FastAPI; knowledge of Flask or Django is a plus.
- Experience building and integrating event-driven systems using Kafka, RabbitMQ, Redis Streams, or similar technologies.
- Strong knowledge of event-driven microservices, pub/sub models, and real-time data streaming architectures.
- Exposure to deploying AI/ML models using PyTorch, TensorFlow, or scikit-learn.
- Familiarity with containerization (Docker), orchestration (Kubernetes), and cloud platforms (AWS, GCP, Azure).
- Experience with unit testing frameworks such as PyTest, and observability tools like Prometheus, Grafana, or OpenTelemetry.
- Understanding of security principles including JWT, OAuth2, and API security best practices.

Nice to Have:
- Experience with MLOps pipelines and tools like MLflow, DVC, or Kubeflow.
- Familiarity with Protobuf, gRPC, and async I/O with WebSockets.
- Prior work in real-time analytics, recommendation systems, or workflow orchestration (e.g., Prefect, Airflow).
- Contributions to open-source projects or an active GitHub/portfolio.

Educational Background: Bachelor's or Master's degree in Computer Science, Software Engineering, Artificial Intelligence, or a related technical discipline.

Why Join Transnational AI:
- Build production-grade AI infrastructure powering real-world applications.
- Collaborate with domain experts and top engineers across marketplace, insurance, and workforce platforms.
- Flexible, remote-friendly environment with a focus on innovation and ownership.
- Competitive compensation, bonuses, and continuous learning support.
- Work on high-impact projects that influence how people discover jobs, get insured, and access personalized digital services.
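As one possible shape of the async, event-driven pattern this role describes, here is a minimal aiokafka producer sketch; the broker address, topic name, and payload are placeholders, and in a real FastAPI service the producer would normally be created once at startup rather than per call.

import asyncio
import json
from aiokafka import AIOKafkaProducer

# Broker address and topic are placeholders for a real Kafka deployment.
BROKER = "localhost:9092"
TOPIC = "inference.requests"

async def publish_event(payload: dict) -> None:
    producer = AIOKafkaProducer(bootstrap_servers=BROKER)
    await producer.start()
    try:
        # send_and_wait blocks until the broker acknowledges the message.
        await producer.send_and_wait(TOPIC, json.dumps(payload).encode("utf-8"))
    finally:
        await producer.stop()

asyncio.run(publish_event({"model": "demo", "input": "hello"}))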

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Hyderabad

Hybrid

Source: Naukri

• Strong proficiency in core Python, data handling and manipulation, and exception handling.
• Strong hands-on experience with at least one Python web framework (such as Django, Flask, etc.).
• Good knowledge of FastAPI.

Posted 2 weeks ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

On-site

Source: LinkedIn

About Us
At WaysAhead, we blend the latest AI techniques with deep industry and analytics expertise to unlock business value from data. Our mission? To transform complex data into actionable insights and help clients make smarter, faster decisions. If you're passionate about innovation and love solving real-world problems through data, let's talk.

Role Overview
We're looking for a Data Scientist to join our growing team in New Delhi. You'll work on cutting-edge problems at the intersection of retail intelligence and AI, building LLMs, vision-based systems, and scalable ML models. Expect to collaborate closely with product teams to turn ideas into impactful AI-powered features.

What You'll Do
- Build ML models for NLP, vision, recommendation, and classification tasks
- Fine-tune LLMs and implement RAG-based systems
- Develop REST APIs and deploy models via Docker, FastAPI, or Flask
- Write modular, clean Python code for ML pipelines
- Collaborate with cross-functional teams to deliver AI features
- Stay updated with the latest in AI, LLMs, and deployment best practices

Must-Have Skills
- Python (advanced), Scikit-learn, LLMs/Transformers
- REST API development, model deployment (Docker, FastAPI, Flask)
- RAG pipelines, Hugging Face/OpenAI APIs
- MSSQL, algorithm development

Good-to-Have Skills
- Git, CI/CD, LangChain, prompt engineering, Streamlit

Nice-to-Have Skills
- AWS/Azure pipelines, OpenCV, YOLO, ONNX, PyTorch

Qualifications
- Master's in Data Science, Statistics, Computer Science, or a related field
- Strong analytical mindset, teamwork, and communication skills
- Experience with data visualization tools (Tableau, Power BI) is a plus
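For a sense of the Hugging Face tooling named above, a two-line pipeline example; the default sentiment checkpoint is used here for brevity, whereas production work would pin a specific model version.

from transformers import pipeline

# Downloads a default sentiment model on first run; pin a checkpoint for reproducibility.
classifier = pipeline("sentiment-analysis")
print(classifier("The new dashboard makes weekly reporting much faster."))
# Expected shape of output: [{'label': 'POSITIVE', 'score': 0.99...}]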

Posted 2 weeks ago

Apply

4.0 - 5.0 years

0 Lacs

Vadodara, Gujarat, India

On-site

Source: LinkedIn

Job Description
We are seeking an experienced AI Engineer with 4-5 years of hands-on experience in designing and implementing AI solutions. The ideal candidate should have a strong foundation in developing AI/ML-based solutions, including expertise in Computer Vision (OpenCV). Additionally, proficiency in developing, fine-tuning, and deploying Large Language Models (LLMs) is essential.

As an AI Engineer, the candidate will work on cutting-edge AI applications, using LLMs like GPT, LLaMA, or custom fine-tuned models to build intelligent, scalable, and impactful solutions, and will collaborate closely with Product, Data Science, and Engineering teams to define, develop, and optimize AI/ML models for real-world business applications.

Key Responsibilities
- Research, design, and develop AI/ML solutions for real-world business applications; RAG is a must.
- Collaborate with Product & Data Science teams to define core AI/ML platform features.
- Analyze business requirements and identify pre-trained models that align with use cases.
- Work with multi-agent AI frameworks like LangChain, LangGraph, and LlamaIndex.
- Train and fine-tune LLMs (GPT, LLaMA, Gemini, etc.) for domain-specific tasks.
- Implement Retrieval-Augmented Generation (RAG) workflows and optimize LLM inference.
- Develop NLP-based GenAI applications, including chatbots, document automation, and AI agents.
- Preprocess, clean, and analyze large datasets to train and improve AI models.
- Optimize LLM inference speed, memory efficiency, and resource utilization.
- Deploy AI models in cloud environments (AWS, Azure, GCP) or on-premises infrastructure.
- Develop APIs, pipelines, and frameworks for integrating AI solutions into products.
- Conduct performance evaluations and fine-tune models for accuracy, latency, and scalability.
- Stay updated with advancements in AI, ML, and GenAI technologies.

Required Skills & Experience
- AI & Machine Learning: Strong experience in developing & deploying AI/ML models.
- Generative AI & LLMs: Expertise in LLM pretraining, fine-tuning, and optimization.
- NLP & Computer Vision: Hands-on experience in NLP, Transformers, OpenCV, YOLO, R-CNN.
- AI Agents & Multi-Agent Frameworks: Experience with LangChain, LangGraph, LlamaIndex.
- Deep Learning & Frameworks: Proficiency in TensorFlow, PyTorch, Keras.
- Cloud & Infrastructure: Strong knowledge of AWS, Azure, or GCP for AI deployment.
- Model Optimization: Experience in LLM inference optimization for speed & memory efficiency.
- Programming & Development: Proficiency in Python and experience in API development.
- Statistical & ML Techniques: Knowledge of Regression, Classification, Clustering, SVMs, Decision Trees, Neural Networks.
- Debugging & Performance Tuning: Strong skills in unit testing, debugging, and model evaluation.
- Hands-on experience with Vector Databases (FAISS, ChromaDB, Weaviate, Pinecone).

Good to Have
- Experience with multi-modal AI (text, image, video, speech processing).
- Familiarity with containerization (Docker, Kubernetes) and model serving (FastAPI, Flask, Triton).
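A tiny OpenCV preprocessing sketch of the computer-vision side of this role; the image path and threshold values are placeholders chosen for illustration.

import cv2

# Path is a placeholder; any local image works.
img = cv2.imread("sample.jpg")
if img is None:
    raise FileNotFoundError("sample.jpg not found")

# Classic preprocessing chain: grayscale -> blur -> edge map.
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)

cv2.imwrite("edges.jpg", edges)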

Posted 2 weeks ago

Apply

2.0 - 4.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Source: Naukri

We're looking for people with a strong interest in building successful products or systems, who are comfortable dealing with lots of moving pieces, have exquisite attention to detail, and are comfortable learning new technologies and methods.

Job Description:
Role: Backend Engineers to join our team and build seamless integrations with third-party systems. This role is ideal for engineers with experience in FastAPI, Python, and designing scalable backend services. You will own the entire lifecycle of integration projects, from gathering requirements to deployment, ensuring efficient and maintainable solutions.

Responsibilities:
- Design & Development: Build and maintain backend services, APIs, and data pipelines in a Python-based stack.
- Integration Development: Develop and optimize integrations with external systems, handling data synchronization and transformations efficiently.
- Technical Ownership: Own the end-to-end development process, from gathering functional requirements to testing and deployment.
- Scalability & Performance: Identify performance bottlenecks, improve system reliability, and ensure scalable architecture.
- Collaboration: Work closely with internal teams (product, support, and other engineering teams) to understand requirements and provide robust technical solutions.
- Continuous Improvement: Proactively identify areas for improvement in existing integrations and backend systems.

Requirements:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience: Minimum 3 years of software development experience, primarily in Python.

Mandatory Skills:
- Hands-on experience with FastAPI.
- Experience designing and maintaining backend services and APIs.
- Understanding of SQL and NoSQL databases.
- Familiarity with asynchronous programming and event-driven architecture.

Preferred Skills:
- Experience with Django or other Python frameworks.
- Exposure to B2B SaaS products or enterprise integrations.
- Understanding of authentication and security best practices (OAuth, JWT, etc.).
- Experience working with ETL pipelines and data synchronization.

Soft Skills:
- Strong problem-solving and analytical skills.
- Ability to make technical trade-offs and balance short-term and long-term goals.
- Excellent communication skills for cross-functional collaboration.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

The Applications Development Senior Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to Generative AI Shared Services Platform applications/systems and programming activities.

Responsibilities:
- Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, and model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas
- Monitor and control all phases of the development process (analysis, design, construction, testing, and implementation) and provide user and operational support on applications to business users
- Utilize in-depth specialty knowledge of applications development to analyze complex problems/issues, provide evaluation of business processes, system processes, and industry standards, and make evaluative judgements
- Recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality
- Consult with users/clients and other technology groups on issues, recommend advanced programming solutions, and install and assist customer exposure systems
- Ensure essential procedures are followed and help define operating standards and processes
- Serve as advisor or coach to new or lower-level analysts
- Has the ability to operate with a limited level of direct supervision; can exercise independence of judgement and autonomy; acts as SME to senior stakeholders and/or other team members
- Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency

Mandatory Qualifications:
- 5-8 years of relevant experience
- Working proficiency in Python
- Writing clean, scalable, and modular Python code following best practices
- Experience with RESTful API development (preferably using FastAPI or Flask)
- Experience with object-oriented programming (OOP), exception handling, and logging
- Proficiency in asynchronous programming (async/await, threading, multiprocessing) for handling concurrent requests efficiently
- Building and optimizing microservices-based architectures for shared services
- Database integration (PostgreSQL/MongoDB, Oracle)
- Writing unit, integration, and end-to-end test cases using PyTest or similar frameworks
- Basic understanding of CI/CD pipelines and their role in automated deployments
- Setting up GitHub CI/CD pipelines (or similar) for automated testing and deployments
- Writing and maintaining Dockerfiles for containerized applications
- Working knowledge of Kubernetes/OpenShift ECS clusters
- Strong understanding of API authentication and authorization (JWT, OAuth, API Gateway)
- Proficient in optimizing database queries, indexing strategies, and schema design
- Experience in systems analysis and programming of software applications
- Experience in managing and implementing successful projects
- Working knowledge of consulting/project management techniques/methods
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements

Desired Qualifications:
- Implementing SonarQube for code quality, coverage, and static analysis
- Familiarity with data science libraries (scikit-learn, TensorFlow)
- Awareness of Large Language Model (LLM) integration and RAG architectures for shared AI services
- Familiarity with vector databases (e.g., PGVector, FAISS) for retrieval-augmented generation (RAG)
- Conceptual knowledge of embeddings, tokenization, and retrieval mechanisms in AI-driven applications
- Implementing logging and observability for services using the ELK Stack, Grafana, or similar software

Education:
Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
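Illustrating the PyTest-based testing expectation above with FastAPI's built-in TestClient; the /health endpoint is a toy example, not from the posting.

from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()

@app.get("/health")
def health():
    return {"status": "ok"}

client = TestClient(app)

def test_health_returns_ok():
    # TestClient drives the app in-process, so no server needs to be running.
    response = client.get("/health")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}

# Run with: pytest test_health.py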

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

We are seeking a skilled and innovative Deep Learning Engineer to join our AI/ML team. As a Deep Learning Engineer, you will develop, train, and deploy computer vision models that solve complex visual problems. You will work on cutting-edge technology involving image processing, object detection, and video analysis, collaborating with cross-functional teams to create impactful real-world applications.

Role: Data Scientist / Deep Learning Engineer
Location: Pune

General Summary of the Role
- Develop and optimize computer vision models for object detection (YOLO, Faster R-CNN, SSD) and image classification (ResNet, MobileNet, EfficientNet, ViTs).
- Work with OCR technologies (Tesseract, EasyOCR, CRNN, TrOCR) for text extraction from images.
- Work with PyTorch, TensorFlow, and OpenCV for deep learning and image processing.
- Implement sequence-based models (RNNs, LSTMs, GRUs) for vision tasks.
- Optimize software for real-time performance on multiple platforms.
- Implement and deploy AI models via Flask/FastAPI and integrate with SQL/NoSQL databases.
- Use Git/GitHub for version control and team collaboration.
- Apply ML algorithms (regression, decision trees, clustering) as needed.
- Review code, mentor team members, and enhance model efficiency.
- Stay updated with advancements in deep learning and multimodal AI.

Required Skills & Qualifications
- Python proficiency for AI development.
- Experience with PyTorch, TensorFlow, and OpenCV.
- Knowledge of object detection (YOLO, Faster R-CNN, SSD) and image classification (ResNet, MobileNet, EfficientNet, ViTs).
- Experience with OCR technologies (Tesseract, EasyOCR, CRNN, TrOCR).
- Experience with RNNs, LSTMs, and GRUs for sequence-based tasks.
- Experience with Generative Adversarial Networks (GANs) and Diffusion Models for image generation.
- Familiarity with REST APIs (Flask/FastAPI) and SQL/NoSQL databases.
- Strong problem-solving and real-time AI optimization skills.
- Experience with Git/GitHub for version control.
- Knowledge of Docker, Kubernetes, and model deployment at scale on serverless and on-prem platforms.
- Understanding of vector databases (FAISS, Milvus).

Preferred Qualifications
- Experience with cloud platforms (AWS, GCP, Azure).
- Experience with Vision Transformers (ViTs) and Generative AI (GANs, Stable Diffusion, LMMs).
- Familiarity with frontend technologies.
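A short torchvision sketch of the image-classification inference path described above; it assumes torchvision 0.13+ for the weights API, and the input image path is a placeholder.

import torch
from PIL import Image
from torchvision import models
from torchvision.models import ResNet18_Weights

weights = ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights)
model.eval()

# The weights object carries the exact resize/crop/normalize steps the checkpoint expects.
preprocess = weights.transforms()

img = Image.open("sample.jpg")  # placeholder path
batch = preprocess(img).unsqueeze(0)

with torch.no_grad():
    logits = model(batch)

probs = logits.softmax(dim=1)
top = probs.argmax(dim=1).item()
print(weights.meta["categories"][top], probs[0, top].item())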

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Attention all Python Full Stack Developers! Join a dynamic and growing company where you can utilize your leadership skills and technical expertise to drive innovation and make a real impact.

We are seeking a talented and motivated Full Stack Developer to join our dynamic team. Our ideal candidate should have a passion for technology, a strong understanding of software development principles, and a desire to continuously learn and grow their skills. As a Full Stack Developer, you will have the opportunity to work on a variety of projects, developing server-side logic for web-based applications on a market-leading, highly accessed commercial digital content platform based on a high-performance consumer electronics device. You will be responsible for overseeing the development process from start to finish, working closely with our design and product teams to ensure that our applications meet the needs of our users and exceed their expectations. You will be working with the latest technologies and frameworks and will have the opportunity to shape the direction of our development strategy as we grow and evolve. We offer a supportive and inclusive work environment and opportunities for professional growth and advancement. If you are ready to take your career to the next level and build the future, apply today!

Primary Skills
- Strong understanding of database management systems (DBMS) such as MySQL, PostgreSQL, or Oracle.
- Strong proficiency in front-end technologies (HTML, CSS, JavaScript, React, Angular, or Vue.js).
- Experience with database design and data modeling.
- Strong experience as a Full Stack Developer with expertise in Python and web frameworks, including FastAPI or other popular API frameworks in Python.
- Experience with microservices-based system design and deployment.
- Experience working with third-party collaboration (e.g., TCMS).
- Experience building high-performance and scalable web services.
- Solid understanding of MVC and stateless APIs, and building RESTful APIs.
- Strong sense of ownership for the end product and passion for writing high-quality and well-architected code.
- Experience with one or more NoSQL databases such as DynamoDB or Redis.
- Experience with MySQL or other relational database systems.
- Experience with version control concepts and Git.
- Experience with AWS and AWS infrastructure, platform, and services is an advantage.
- Experience and comfort building cloud-native applications.
- Familiarity with frontend development.
- Familiarity with code versioning tools such as Git.
- Strong problem-solving and analytical skills.
- Excellent communication, presentation, and collaboration skills.
- Passion for staying up-to-date with the latest industry trends and technologies.
- Bilingual (Japanese) is a plus.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad, Bengaluru

Work from Office

Naukri logo

Your future duties and responsibilities:
Skills: pgvector, Vertex AI, FastAPI, Flask, Kubernetes
Develops and optimizes AI applications for production, ensuring seamless integration with enterprise systems and front-end applications. Builds scalable API layers and microservices using FastAPI, Flask, Docker, and Kubernetes to serve AI models in real-world environments. Implements and maintains AI pipelines with MLOps best practices, leveraging tools like Azure ML, Databricks, AWS SageMaker, and Vertex AI. Ensures high availability, reliability, and performance of AI systems through rigorous testing, monitoring, and optimization. Works with agentic frameworks such as LangChain, LangGraph, and AutoGen to build adaptive AI agents and workflows. Experience with GCP, AWS, or Azure, utilizing services such as Vertex AI, Bedrock, or Azure OpenAI model endpoints. Hands-on experience with vector databases such as pgvector, Milvus, Azure Search, and AWS OpenSearch, and with embedding models such as Ada and Titan. Collaborates with architects and scientists to transition AI models from research to fully functional, high-performance production systems.
Skills: Azure Search, Flask, Kubernetes
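As an illustration of the vector-database requirement, the following is a hedged sketch of a FastAPI endpoint running a pgvector similarity query. The DSN, the documents table, the toy 3-dimensional embeddings, and the /search route are all assumptions made for the example; production code would typically use a connection pool and an embedding model to produce the query vector.

```python
# A hedged sketch of a pgvector similarity lookup exposed through FastAPI.
# Assumes PostgreSQL with the pgvector extension and a table like:
#   CREATE TABLE documents (id serial, content text, embedding vector(3));
import psycopg2
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
DSN = "dbname=app user=app password=secret host=localhost"  # placeholder DSN


class Query(BaseModel):
    embedding: list[float]   # produced upstream by an embedding model
    limit: int = 5


@app.post("/search")
def search(query: Query):
    # pgvector accepts a bracketed string literal cast to ::vector;
    # <-> is its distance operator, so ORDER BY returns nearest neighbours.
    vector_literal = "[" + ",".join(str(x) for x in query.embedding) + "]"
    conn = psycopg2.connect(DSN)
    try:
        with conn, conn.cursor() as cur:
            cur.execute(
                "SELECT id, content FROM documents "
                "ORDER BY embedding <-> %s::vector LIMIT %s",
                (vector_literal, query.limit),
            )
            rows = cur.fetchall()
    finally:
        conn.close()
    return [{"id": r[0], "content": r[1]} for r in rows]
```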

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

India

On-site

Linkedin logo

Description

The Position
We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You’ll work within the Engineering team to build and improve our platforms to deliver flexible and creative solutions to our utility partners and end users, and help us achieve our ambitious goals for our business and the planet. We are seeking a skilled and passionate Data Engineer - Business Intelligence with expertise in data engineering and BI reporting to join our development team. As a Data Engineer, you will play a crucial role in developing different components, harnessing the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and data processing, and to identifying the crucial data required for insightful analysis. You'll tackle obstacles related to database integration and untangle complex, unstructured data sets. You will also work on creating BI reports as well as the development of a Business Intelligence platform that will enable users to create reports and dashboards based on their requirements. You will coordinate with the rest of the team working on different layers of the infrastructure; therefore, a commitment to collaborative problem solving, sophisticated design, and a quality product is important. You will own the development and its quality independently and be responsible for high-quality deliverables. And you will work with a great team with excellent benefits.

Responsibilities & Skills
You should: Be excited to work with talented, committed people in a fast-paced environment. Have proven experience as a Data Engineer with a focus on BI reporting. Be designing, building, and maintaining high-performance solutions with reusable and reliable code. Use a rigorous approach for product improvement and customer satisfaction. Love developing great software as a seasoned product engineer. Be ready, able, and willing to jump onto a call with stakeholders to help solve problems. Be able to deliver against several initiatives simultaneously. Have a strong eye for detail and quality of code. Have an agile mindset. Have strong problem-solving skills and attention to detail.

Required Skills (Data Engineer)
You ideally have 2 or more years of professional experience. Design, build, and maintain scalable data pipelines and ETL processes to support business analytics and reporting needs. Strong experience with SQL for querying and transforming large datasets and optimizing query performance in relational databases. Proficiency in Python for building and automating data pipelines, ETL processes, and data integration workflows. Familiarity with big data frameworks such as Apache Spark or PySpark for distributed data processing. Strong understanding of data modeling principles for building scalable and efficient data architectures (e.g., star schema, snowflake schema). Good to have: experience with Databricks for managing and processing large datasets, implementing Delta Lake, and leveraging its collaborative environment. Knowledge of Google Cloud Platform (GCP) services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage for end-to-end data engineering solutions. Familiarity with version control systems such as Git and CI/CD pipelines for managing code and deploying workflows. Awareness of data governance and security best practices, including access control, data masking, and compliance with industry standards. Exposure to monitoring and logging tools like Datadog, Cloud Logging, or the ELK stack for maintaining pipeline reliability. Ability to understand business requirements and translate them into technical requirements. Inclination to design solutions for complex data problems. Ability to deliver against several initiatives simultaneously as a multiplier. Demonstrable experience with writing unit and functional tests.

Required Skills (BI Reporting)
Strong experience in developing Business Intelligence reports and dashboards via tools such as Tableau, Power BI, Sigma, etc. Ability to analyse and deeply understand the data, relate it to the business application, and derive meaningful insights from the data.

Preferred
The following experiences are not required, but you'll stand out from other applicants if you have any of the following, in our order of importance: You are an experienced developer with a minimum of 2+ years of professional experience. Work experience and strong proficiency in Python, SQL, and BI reporting, and their associated frameworks (like Flask, FastAPI, etc.). Experience with cloud infrastructure like AWS/GCP or another cloud service provider. CI/CD experience. You are a Git guru and revel in collaborative workflows. You work on the command line confidently and are familiar with all the goodies that the Linux toolkit can provide. Familiarity with Apache Spark and PySpark.

Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Uplight provides equal employment opportunities to all employees and applicants and prohibits discrimination and harassment of any type without regard to race (including hair texture and hairstyles), color, religion (including head coverings), age, sex, national origin, caste, disability status, genetics, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
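The pipeline work described above can be pictured with a small, hypothetical PySpark batch job: read raw events, aggregate them into a reporting-friendly table, and write partitioned Parquet for BI tools. The file paths and column names (event_date, user_id, kwh_saved) are invented purely for illustration.

```python
# A minimal sketch of a batch ETL step: raw events in, aggregated mart out.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bi-etl-sketch").getOrCreate()

# Raw events landed by an upstream ingestion job (path is an assumption).
raw = spark.read.csv("/data/raw/events.csv", header=True, inferSchema=True)

# Aggregate to one row per user per day for the reporting layer.
daily_savings = (
    raw.groupBy("event_date", "user_id")
       .agg(F.sum("kwh_saved").alias("total_kwh_saved"))
)

# Partitioning by date keeps downstream BI queries cheap.
daily_savings.write.mode("overwrite").partitionBy("event_date").parquet(
    "/data/marts/daily_savings"
)
spark.stop()
```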

Posted 2 weeks ago

Apply

1.0 years

0 Lacs

India

On-site

We are hiring an experienced AI Developer who specializes in computer vision, object detection, and OCR technologies. The ideal candidate will work closely with the development team to design, train, and deploy machine learning models that extract meaningful insights from visual data. You will be responsible for model training, data annotation, API integration, and creating end-to-end AI pipelines.

Job Responsibilities
Develop and fine-tune object detection models using YOLOv5/YOLOv8, Detectron2, or SSD. Preprocess image data using OpenCV (contours, morphology, edge detection, etc.). Train custom models using labeled datasets, including bounding box annotations. Use annotation tools such as CVAT, LabelImg, or Roboflow to label and manage datasets. Prepare data in YOLO format and apply augmentation techniques to increase accuracy. Integrate OCR engines such as Tesseract or EasyOCR to extract text from images. Associate spatial text with detected objects using bounding box linking and proximity-based matching. Build and deploy REST APIs using Flask or FastAPI for serving trained models. Collaborate with frontend/backend developers to integrate AI models into applications.

Job Qualifications: Must-Have Skills
Experience with YOLO (v5/v8), Detectron2, or SSD object detection models. Proficiency in image preprocessing techniques using OpenCV. Hands-on experience with annotation tools like CVAT, LabelImg, or Roboflow. Ability to prepare and format datasets for training (e.g., YOLO format, augmentation). Strong knowledge of OCR tools such as Tesseract OCR or EasyOCR. Experience in linking spatial text with detected objects and proximity-based matching. Proficiency in building RESTful APIs using Flask or FastAPI. Solid understanding of Python, machine learning, and deep learning frameworks (e.g., PyTorch, TensorFlow).

Nice to Have
Knowledge of Docker, cloud deployment (AWS/GCP), and CI/CD practices. Experience with model monitoring and performance optimization. Familiarity with database technologies (e.g., MongoDB, PostgreSQL).

Job Type: Full-time
Benefits: Health insurance, Provident Fund
Schedule: Day shift
Education: Bachelor's (Preferred)
Experience: Python (PyTorch, TensorFlow): 1 year (Required); object detection models: 1 year (Required); OpenCV: 1 year (Required)
Location: Kulathur, Thiruvananthapuram, Kerala (Preferred)
Work Location: In person
Application Deadline: 30/06/2025
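The "proximity-based matching" responsibility above can be illustrated with a short, library-free sketch that links each OCR text box to the nearest detected object by comparing box centres. The box format, the 150-pixel threshold, and the sample data are assumptions made for the example.

```python
# A hedged sketch of proximity-based matching between OCR text boxes and
# detected objects. Boxes are assumed to be (x_min, y_min, x_max, y_max).
from math import dist


def centre(box):
    x_min, y_min, x_max, y_max = box
    return ((x_min + x_max) / 2, (y_min + y_max) / 2)


def link_text_to_objects(text_boxes, object_boxes, max_distance=150):
    """Return {text_index: object_index} for texts within max_distance pixels."""
    links = {}
    if not object_boxes:
        return links
    for ti, (text, tbox) in enumerate(text_boxes):
        candidates = [
            (dist(centre(tbox), centre(obox)), oi)
            for oi, (label, obox) in enumerate(object_boxes)
        ]
        best_distance, best_index = min(candidates)
        if best_distance <= max_distance:
            links[ti] = best_index
    return links


if __name__ == "__main__":
    texts = [("SN-1234", (105, 40, 180, 60))]           # e.g. from EasyOCR
    objects = [("meter", (90, 30, 300, 220)), ("panel", (600, 30, 800, 220))]
    print(link_text_to_objects(texts, objects))          # prints {0: 0}
```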

Posted 2 weeks ago

Apply

10.0 years

7 - 8 Lacs

Hyderābād

On-site

Job Information
Date Opened: 05/14/2025
Job Type: Permanent
RSD NO: 11028
Industry: IT Services
Min Experience: 10
Max Experience: 10+
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500001

Job Description
Job Title: Backend Developer – Python
Experience: 10+ Years
Location: HYD/CHN/BLR
Key Skills: Python with FastAPI, pandas, Datadog, AWS S3.

About the Role: We are looking for a seasoned Backend Developer with over 10 years of experience in Python development. The ideal candidate should have hands-on experience in designing scalable APIs, working with monitoring tools like Datadog, and developing backend services using modern frameworks such as Flask, Django, or FastAPI. A strong grasp of asynchronous programming and cloud integrations is highly preferred.

Primary Responsibilities: Design, develop, and maintain robust backend applications using Python. Build and manage API endpoints using Flask, Django, or FastAPI. Implement asynchronous programming using asyncio. Leverage logging libraries and monitoring tools like Datadog. Work with PostgreSQL or MongoDB for database operations. Use API management tools such as Postman or MuleSoft for testing and documentation. Optional exposure to S3/AWS and gRPC is a plus. Collaborate with frontend developers, DevOps, and product managers. Write clean, testable code with a focus on performance and scalability.

Must-Have Skills: Strong programming skills in Python. Experience creating and managing API endpoints (Flask/Django/FastAPI). Familiarity with asyncio for asynchronous operations. Proficiency with Datadog and logging tools. Basic knowledge of PostgreSQL and/or MongoDB. Hands-on experience with API testing tools (e.g., Postman, MuleSoft). Excellent communication and analytical skills. Strong problem-solving aptitude and business acumen.

Nice to Have: Knowledge of NumPy and pandas. Familiarity with AWS services (e.g., S3). Exposure to gRPC.

At Indium, diversity, equity, and inclusion (DEI) are the cornerstones of our values. We champion DEI through a dedicated council, expert sessions, and tailored training programs, ensuring an inclusive workplace for all. Our initiatives, including the WE@IN women empowerment program and our DEI calendar, foster a culture of respect and belonging. Recognized with the Human Capital Award, we are committed to creating an environment where every individual thrives. Join us in building a workplace that values diversity and drives innovation.
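As a rough illustration of the stack named in this posting (FastAPI, pandas, S3), here is a hedged sketch of an endpoint that reads a CSV from S3 and returns summary statistics. The bucket name, key layout, and route are placeholders, and plain logging stands in for the Datadog integration, which is usually wired up through an agent or a dedicated log handler.

```python
# A minimal sketch: FastAPI route that pulls a CSV from S3 and summarises it.
import logging

import boto3
import pandas as pd
from fastapi import FastAPI, HTTPException

logger = logging.getLogger("reports")
app = FastAPI()
s3 = boto3.client("s3")


@app.get("/reports/{name}/summary")
def summarise_report(name: str):
    try:
        # Bucket and key naming are assumptions for the example.
        obj = s3.get_object(Bucket="example-reports-bucket", Key=f"{name}.csv")
    except s3.exceptions.NoSuchKey:
        raise HTTPException(status_code=404, detail="report not found")
    frame = pd.read_csv(obj["Body"])               # StreamingBody is file-like
    logger.info("summarised report %s with %d rows", name, len(frame))
    return {
        "rows": len(frame),
        "columns": list(frame.columns),
        "numeric_means": frame.mean(numeric_only=True).to_dict(),
    }
```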

Posted 2 weeks ago

Apply

3.0 years

6 - 10 Lacs

Gurgaon

On-site

Company Overview: Schneider Electric is a global leader in energy management and automation, committed to providing innovative solutions that ensure Life Is On everywhere, for everyone, and at every moment. We are expanding our team in Gurugram and looking for a Senior Cloud Architect to enhance our cloud capabilities and drive the integration of digital technologies in our operations.

Job Description: As a Senior Design Engineer at Schneider Electric, you will play a crucial role in developing and implementing IoT solutions across our global infrastructure, with a primary focus on edge software. This position requires practical, hands-on ability to implement, manage, and optimize edge-based software solutions, ensuring efficient data processing for a large fleet of edge gateways and devices (hundreds of thousands) deployed in the field.

Key Responsibilities: Develop scalable, high-performance edge computing solutions for IoT applications. Independently design and implement asynchronous task processing using Python (asyncio, Twisted, Tornado, etc.) for efficient data handling and device communication. Develop and optimize IoT data pipelines, integrating sensors, edge devices, and cloud-based platforms. Work on device-to-cloud communication using MQTT, WebSockets, or other messaging protocols. Ensure software is secure, reliable, and optimized for resource-constrained edge environments. Design and optimize Linux-based networking for edge devices, including network configuration, VPNs, firewalls, and traffic shaping. Implement and manage Linux process management, including systemd services, resource allocation, and performance tuning for IoT applications. Stay updated with emerging IoT, edge computing, and Linux networking technologies.

Requirements (Technical): 3-5 years of overall experience in software engineering with a strong focus on Python development. Expertise in Python, with experience in asynchronous programming, task processing frameworks, and web frameworks (e.g., asyncio, Twisted, FastAPI, Flask). Strong knowledge of Linux networking, including TCP/IP, DNS, firewalls (iptables/nftables), VPNs, and network security. Experience in Linux process management, including systemd, resource limits (cgroups), and performance tuning. Good understanding of IoT architectures, protocols (MQTT, HTTP/REST), and edge computing frameworks. Hands-on experience with Docker. Proficiency and experience with Git or another VCS. Excellent problem-solving skills and the ability to lead complex technical projects.

Good to have: Knowledge of Rust, C++, or Golang for performance-critical edge applications. Prior experience of working in IoT. Understanding of BACnet/Modbus protocols. Familiarity with cloud IoT platforms (AWS IoT, Azure IoT, Google Cloud IoT) and their integration with edge devices.

Soft Skills: Excellent problem-solving abilities and strong communication skills. Advanced verbal and written communication skills, including the ability to explain and present technical concepts to a diverse set of audiences. Good judgment, time management, and decision-making skills. Strong teamwork and interpersonal skills; ability to communicate and thrive in a cross-functional environment. Willingness to work outside the documented job description. Has a “whatever is needed” attitude.

Qualifications
Preferred Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Working experience in designing robust, scalable, and maintainable asynchronous Python applications. Prior experience in building cloud-connected edge IoT solutions. Prior experience in the energy sector or industrial automation is advantageous.

Primary Location: IN-Haryana-Gurgaon
Schedule: Full-time
Unposting Date: Ongoing
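The asynchronous task-processing requirement can be sketched with plain asyncio: a producer queues telemetry readings (in a real gateway these might arrive over an MQTT subscription) and a small pool of workers forwards them. Device names, readings, and timings are invented for illustration.

```python
# A hedged sketch of asynchronous task processing on an edge gateway.
import asyncio
import random


async def produce(queue: asyncio.Queue) -> None:
    for i in range(10):
        reading = {"device": f"gw-{i % 3}", "kw": round(random.random() * 5, 2)}
        await queue.put(reading)
        await asyncio.sleep(0.1)          # simulates message arrival spacing
    for _ in range(3):
        await queue.put(None)             # one stop signal per worker


async def worker(name: str, queue: asyncio.Queue) -> None:
    while True:
        reading = await queue.get()
        if reading is None:
            break
        # Real code would validate, buffer, and forward to the cloud here.
        await asyncio.sleep(0.05)         # simulates upload latency
        print(f"{name} forwarded {reading}")


async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue(maxsize=100)
    workers = [worker(f"worker-{n}", queue) for n in range(3)]
    await asyncio.gather(produce(queue), *workers)


if __name__ == "__main__":
    asyncio.run(main())
```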

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Position Overview
We are seeking a skilled Backend Developer with expertise in Python to join our dynamic development team. The ideal candidate will have a strong foundation in server-side development, database management, and API design, with a passion for building scalable and efficient backend systems.

Key Responsibilities
- Design, develop, and maintain robust backend applications using Python
- Build and optimize RESTful APIs and web services
- Implement asynchronous programming patterns and manage concurrent processes
- Work with both SQL and NoSQL databases to ensure optimal data storage and retrieval
- Collaborate with frontend developers and other team members to deliver integrated solutions
- Participate in code reviews and maintain high coding standards
- Contribute to CI/CD pipeline development and deployment processes
- Troubleshoot and debug production issues
- Write comprehensive unit tests and documentation

Please send your resumes to support@reelwise.in

Required Qualifications

Experience & Education
- 3+ years of professional software development experience
- Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience)

Technical Skills
- Strong proficiency in Python programming
- Deep understanding of asynchronous programming, concurrency, and async task management
- Knowledge of Python coding standards and best practices (PEP 8, code organization, design patterns)
- Experience with SQL databases (PostgreSQL preferred)
- Experience with NoSQL databases (MongoDB preferred)
- Hands-on experience with API development using FastAPI or Django
- Understanding of CI/CD pipelines and deployment automation
- Familiarity with version control systems (Git)
- Knowledge of containerization technologies (Docker preferred)

Preferred Qualifications
- Experience with message queues and pub/sub systems (Redis, RabbitMQ, Apache Kafka, or similar)
- Knowledge of cloud platforms (AWS, GCP, or Azure)
- Experience with microservices architecture
- Familiarity with testing frameworks (pytest, unittest)
- Understanding of security best practices in backend development
- Experience with monitoring and logging tools
- Knowledge of performance optimization techniques
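To illustrate the unit-testing expectation above, here is a minimal, hypothetical example of testing a FastAPI route with pytest and TestClient (TestClient relies on Starlette's test utilities and, in recent FastAPI versions, the httpx package). The /health endpoint is made up for the example; running `pytest` against this file exercises the route in-process.

```python
# A small sketch of unit-testing a FastAPI route with pytest.
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()


@app.get("/health")
def health():
    return {"status": "ok"}


client = TestClient(app)


def test_health_returns_ok():
    response = client.get("/health")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}
```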

Posted 2 weeks ago

Apply

Exploring FastAPI Jobs in India

FastAPI is a modern web framework for building APIs with Python that is gaining popularity in the tech industry. If you are a job seeker looking to explore opportunities in the FastAPI domain in India, you're in the right place. This article will provide you with insights into the FastAPI job market in India, including top hiring locations, salary ranges, career progression, related skills, and interview questions.
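For readers new to the framework, a minimal FastAPI application looks roughly like the sketch below (the /items route is illustrative). Saved as main.py, it can typically be served with `uvicorn main:app --reload`, with interactive documentation generated automatically at /docs.

```python
# A minimal FastAPI application, included as a hedged illustration.
from typing import Optional

from fastapi import FastAPI

app = FastAPI()


@app.get("/items/{item_id}")
def read_item(item_id: int, q: Optional[str] = None):
    # Path and query parameters are parsed and validated from the type hints.
    return {"item_id": item_id, "q": q}
```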

Top Hiring Locations in India

  1. Bengaluru
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

Average Salary Range

The salary range for FastAPI professionals in India varies based on experience levels. Entry-level positions can expect a salary range of INR 4-6 lakhs per annum, while experienced professionals can earn anywhere from INR 10-20 lakhs per annum.

Career Path

In the FastAPI domain, a career typically progresses as follows:

  1. Junior Developer
  2. Mid-level Developer
  3. Senior Developer
  4. Tech Lead

Related Skills

Besides proficiency in FastAPI, other skills that are often expected or helpful alongside it include:

  • Python programming
  • RESTful APIs
  • Database management (SQL or NoSQL)
  • Frontend technologies like HTML, CSS, and JavaScript

Interview Questions

  • What is FastAPI and how is it different from other Python web frameworks? (basic)
  • Explain the main features of FastAPI. (basic)
  • How do you handle authentication and authorization in FastAPI? (medium)
  • Can you explain dependency injection in FastAPI? (medium)
  • What is Pydantic and how is it used in FastAPI? (medium) (see the sketch after this list)
  • How do you handle request validation in FastAPI? (medium)
  • What are the advantages of using asynchronous programming with FastAPI? (medium)
  • How do you perform testing in FastAPI applications? (medium)
  • Explain the role of middleware in FastAPI. (medium)
  • Can you discuss the performance benefits of FastAPI compared to other frameworks? (advanced)
  • How do you handle background tasks in FastAPI? (advanced)
  • Explain the process of deploying a FastAPI application to production. (advanced)
  • How does FastAPI handle exceptions and errors? (advanced)
  • What are OpenAPI schemas and how are they used in FastAPI? (advanced)
  • Can you discuss the scalability aspects of FastAPI applications? (advanced)
  • How do you optimize database queries in FastAPI applications? (advanced)
  • Explain the process of integrating FastAPI with Docker. (advanced)
  • What are API routers in FastAPI and how do you use them? (advanced)
  • How do you handle file uploads in FastAPI applications? (advanced)
  • Can you discuss the security best practices in FastAPI development? (advanced)
  • Explain the process of versioning APIs in FastAPI. (advanced)
  • How do you handle CORS in FastAPI applications? (advanced)
  • What is the role of dependency management in FastAPI projects? (advanced)
  • How do you monitor and log FastAPI applications in production? (advanced)
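
As a study aid for a few of the questions above (Pydantic, request validation, and dependency injection), here is a small, hedged sketch. The Item model and the pagination dependency are invented for illustration and use Pydantic v2 naming (model_dump).

```python
# A sketch of Pydantic request validation plus a Depends-based dependency.
from fastapi import Depends, FastAPI
from pydantic import BaseModel, Field

app = FastAPI()


class Item(BaseModel):
    # Pydantic validates and documents the request body from these fields.
    name: str = Field(min_length=1)
    price: float = Field(gt=0)


def pagination(limit: int = 10, offset: int = 0) -> dict:
    # Declared once, injected into any route that lists resources.
    return {"limit": limit, "offset": offset}


@app.post("/items")
def create_item(item: Item):
    return {"created": item.model_dump()}


@app.get("/items")
def list_items(page: dict = Depends(pagination)):
    return {"items": [], **page}
```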

Closing Remark

As you explore opportunities in the FastAPI job market in India, remember to prepare thoroughly and apply confidently. With the right skills and knowledge, you can excel in your career as a FastAPI professional. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies