
1760 FastAPI Jobs - Page 44

JobPe aggregates job listings for easy access; you apply directly on the original job portal.

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

About the Python Architect role at Cybage: Providing architectural services for turn-key projects and new clients with large-scale requirements. R&D on new technologies, frameworks, and tools in the Python ecosystem. Travel on a need basis for new customer engagements and during the discovery phase to transfer knowledge.

Python Architect - Web/API/Application: Designing, building and maintaining scalable and secure services and REST APIs in at least one Python framework such as Django, Flask, FastAPI, etc., or in Python with gRPC. Expertise in at least one RDBMS such as Postgres, MySQL, Oracle, etc. and one NoSQL database such as MongoDB, Redis, etc. Familiarity with different caching strategies and use of at least one caching solution such as Redis, Memcached, etc. to do the same. Designing for distributed / asynchronous jobs and familiarity with tools such as Celery, Redis Queue, Kafka, etc. to implement the same. Building these services in at least one cloud platform such as AWS, Azure, GCP, etc. with an eye on scalability, performance and high availability. Experienced in building these using containerization (Docker, etc.) and orchestration (Kubernetes, etc.) techniques. Ensuring code quality and use of at least one tool / linter such as pylint, black, flake8, etc. Using automated tests to validate the services and APIs built in order to allow for continuous delivery. Effectively monitor and observe these services by tracking application logs through at least one logging tool such as Splunk, Datadog Logs, AWS CloudWatch, etc. and application metrics such as latency, concurrency, etc. over at least one monitoring tool such as Datadog APM/RUM, AWS CloudWatch, etc. Expertise in identifying the right thresholds and alerts on these services and plugging these into alerting tools such as PagerDuty, OpsGenie, AWS CloudWatch alarms, etc. in order to respond to incidents quickly and effectively.

Python Architect - Data: Designing, building and maintaining effective and scalable data solutions using Python. Creating and maintaining data integration processes, ETL (Extract, Transform, Load) workflows, and data pipelines (Airflow, etc.) to seamlessly transport data between systems. Expertise in parallel processing of massive datasets and use of Spark, Hadoop, etc. to do the same. Experienced in working with datasets hosted in at least one data warehouse such as Snowflake, Amazon Redshift, etc. Familiarity with reporting on datasets using at least one BI tool such as Looker, Tableau, Power BI, Quicksight, etc. Expertise in at least one RDBMS such as Postgres, MySQL, Oracle, etc. and one NoSQL database such as MongoDB, Redis, etc. Building these in at least one cloud platform such as AWS, Azure, GCP, etc. with an eye on scalability, performance and high availability. Experienced in building these using containerization (Docker, etc.) and orchestration (Kubernetes, etc.) techniques. Ensuring code quality and use of at least one tool / linter such as pylint, black, flake8, etc. Using automated tests to validate the services built in order to allow for continuous delivery. Effectively monitor and observe these services by tracking service logs through at least one logging tool such as Splunk, Datadog Logs, AWS CloudWatch, etc. and service metrics such as latency, concurrency, etc. over at least one monitoring tool such as Datadog APM, AWS CloudWatch, etc. Expertise in identifying the right thresholds and alerts on these services and plugging these into alerting tools such as PagerDuty, OpsGenie, AWS CloudWatch alarms, etc. in order to respond to incidents quickly and effectively. Passion for maintaining software configurations in code and familiarity with at least one of Ansible, Terraform, Helm, etc. to do the same.
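
As an illustration of the distributed/asynchronous-job pattern this role describes, here is a minimal sketch of a FastAPI endpoint handing work off to a Celery worker. It assumes a local Redis broker/result backend; the task name, payload and URLs are illustrative, not taken from the posting.

```python
# Minimal sketch: FastAPI endpoint that offloads work to a Celery worker.
# Assumes a local Redis instance as broker/result backend; all names are illustrative.
from celery import Celery
from fastapi import FastAPI
from pydantic import BaseModel

celery_app = Celery(
    "jobs",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)

@celery_app.task
def generate_report(customer_id: int) -> dict:
    # Placeholder for a long-running job (ETL step, PDF render, etc.)
    return {"customer_id": customer_id, "status": "done"}

app = FastAPI()

class ReportRequest(BaseModel):
    customer_id: int

@app.post("/reports")
def enqueue_report(req: ReportRequest):
    # Return immediately with a task id instead of blocking the request.
    task = generate_report.delay(req.customer_id)
    return {"task_id": task.id}

@app.get("/reports/{task_id}")
def report_status(task_id: str):
    result = celery_app.AsyncResult(task_id)
    return {"task_id": task_id, "state": result.state}
```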

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Position: SDE 2 - Full-Stack Python Developer Competitive Salary + Equity + Benefits 📍 Location: Hyderabad, India (Onsite) 🛠 Experience: 2 - 3 years About Us At GroAR, we're at the forefront of building an AI-driven product designed to redefine revenue operations for B2B and B2C firms. As an early-stage team member, you'll have the unique opportunity to work on an unprecedented generative AI project, gaining hands-on experience in AI engineering, automation, and large-scale data processing. Role Overview We are looking for a Full-Stack Python Developer who is passionate about building scalable AI-driven applications. This role involves developing AI-powered SaaS solutions, ensuring seamless integration between the frontend, backend, and AI models. You'll play a key role in architecting, developing, and optimizing AI prompting techniques. Responsibilities Develop and maintain a scalable full-stack architecture for our AI-driven SaaS product. Design and implement AI-friendly databases optimized for high-speed data processing and retrieval. Work on prompt engineering to optimize AI outputs and ensure high-quality responses. Work on integrating LLM models and automation workflows into our product. Implement DevOps best practices, including CI/CD pipelines, cloud deployments, and infrastructure automation. Write clean, efficient, and maintainable Python-based backend code. Develop user-friendly front-end interfaces to interact with AI-generated insights. Optimize application performance, security, and scalability. Qualifications ✔ 2 - 3 years of experience in full-stack development, preferably in a B2B SaaS environment. ✔ Strong proficiency in Python (Flask/Django/FastAPI). ✔ Experience with AI-powered applications, including prompt engineering and AI model integration. ✔ Knowledge of database architecture, especially for AI-driven systems (SQL, NoSQL, Vector DBs like Pinecone/Weaviate). ✔ Hands-on experience with DevOps practices (AWS/GCP/Azure, Docker, Kubernetes, CI/CD pipelines). ✔ Familiarity with front-end technologies (React, Vue, or Angular). ✔ Strong problem-solving skills and the ability to work in a fast-paced startup environment. Why Join Us? 🚀 Competitive Salary + Equity + Benefits 🧠 Work on a Game-Changing AI Product ⚡ Opportunity to Take Ownership & Drive Innovation 📈 Be Part of a High-Growth Startup from the Ground Up
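
For context on the prompt-engineering plus backend-API combination described above, here is a minimal sketch of a FastAPI route that fills a prompt template and delegates to an LLM. The call_llm helper is a hypothetical stand-in for whichever provider SDK the team actually uses.

```python
# Minimal sketch of a prompt-template + API layer; call_llm is a hypothetical
# placeholder for the real LLM provider SDK, and the route name is illustrative.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

PROMPT_TEMPLATE = (
    "You are a revenue-operations assistant.\n"
    "Summarise the following account notes in three bullet points:\n\n{notes}"
)

class SummariseRequest(BaseModel):
    notes: str

def call_llm(prompt: str) -> str:
    # Hypothetical stand-in: swap in the real provider SDK call here.
    return "- bullet 1\n- bullet 2\n- bullet 3"

@app.post("/summarise")
def summarise(req: SummariseRequest):
    prompt = PROMPT_TEMPLATE.format(notes=req.notes)
    return {"summary": call_llm(prompt)}
```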

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

India

Remote

Source: LinkedIn

Ubique Systems is hiring. Location- Work From Home (Remote) Experience: 3-5 years Role: Data Engineer Type: Contract Job Description: Data Pipelines: Proven experience in building scalable and reliable data pipelines BigQuery: Expertise in writing complex SQL transformations; hands-on with indexing and performance optimization Ingestion: Skilled in data scraping and ingestion through RESTful APIs and file-based sources Orchestration: Familiarity with orchestration tools like Prefect, Apache Airflow (nice to have) Tech Stack: Proficient in Python, FastAPI, and PostgreSQL End-to-End Workflows: Capable of owning ingestion, transformation, and delivery processes Interested? Kindly share your CV with siddhi.divekar@ubique-systems.com
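
A minimal sketch of the API-to-PostgreSQL ingestion step this contract centres on, using requests and SQLAlchemy; the endpoint URL, table name and connection string are assumptions for illustration.

```python
# Minimal sketch of an API-to-Postgres ingestion step; source API, table name
# and credentials are illustrative assumptions.
import requests
from sqlalchemy import create_engine, text

API_URL = "https://api.example.com/v1/orders"  # hypothetical source API
engine = create_engine("postgresql+psycopg2://user:pass@localhost:5432/warehouse")

def ingest_orders() -> int:
    resp = requests.get(API_URL, params={"page_size": 500}, timeout=30)
    resp.raise_for_status()
    rows = resp.json()["results"]

    with engine.begin() as conn:  # transactional insert
        conn.execute(
            text(
                "INSERT INTO raw_orders (id, amount, created_at) "
                "VALUES (:id, :amount, :created_at)"
            ),
            rows,
        )
    return len(rows)

if __name__ == "__main__":
    print(f"Ingested {ingest_orders()} rows")
```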

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

Source: LinkedIn

Responsibilities Responsible for ensuring that data pipelines are deployed successfully into production Deploy backend micro-services to AWS cloud leveraging the full spectrum of AWS cloud services Collaborate with the product team (developers, product owner, designer, etc.) to handle product development, release management, infrastructure provisioning and deployment during the implementation Skills needed: Minimum requirements 5 years of work experience deploying data pipelines in a production environment Experience with Python and related data libraries (e.g. Pandas, NumPy) Experience with database languages (e.g. SQL) Experience with Docker Experience with Flask deployment of micro-services, preferably FastAPI Experience working in a multi-disciplinary team of data scientists, software engineers, product managers and subject domain experts Experience in an Agile working environment Preferred requirements Experience with AWS cloud services like RDS, EKS Experience with CI/CD tools like Jenkins Experience with Dagster/Airflow is preferred Experience with SQLAlchemy and Alembic libraries is a plus
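
A minimal sketch of the Flask/FastAPI-style micro-service pattern mentioned above, here as a small FastAPI service exposing a health check and a Pandas-based summary step; the file path and column names are illustrative assumptions.

```python
# Minimal sketch of a small data micro-service; the CSV path and column names
# are illustrative stand-ins for a real pipeline source (S3, RDS, etc.).
import pandas as pd
from fastapi import FastAPI

app = FastAPI()

@app.get("/health")
def health():
    return {"status": "ok"}

@app.post("/pipelines/daily-summary")
def daily_summary():
    # In a real pipeline this would read from S3/RDS rather than a local file.
    df = pd.read_csv("events.csv", parse_dates=["created_at"])
    summary = (
        df.groupby(df["created_at"].dt.date)["amount"]
        .sum()
        .reset_index(name="total_amount")
    )
    return summary.to_dict(orient="records")
```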

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Vesu, Surat, Gujarat

On-site

Source: Indeed

Company Description Welcome to ScriptJet! We are a dynamic and innovative company dedicated to providing exceptional products and services to our valued clients. Our team is committed to delivering outstanding customer experiences by focusing on quality, reliability, and innovation. ScriptJet specializes in Web Development and Mobile App Development, offering a wide range of products and services tailored to meet diverse client needs across various industries. Role Description This is a full-time on-site role for a Python + Next.js Developer located in Surat. The Developer will be responsible for front-end and back-end web development. The day-to-day tasks include coding, testing, debugging, and implementing web applications to meet client requirements. Qualifications: Strong understanding of Front-End and Back-End Web Development Proficiency in Python (Django, FastAPI, or similar frameworks) Hands-on experience with JavaScript and modern React.js development Familiarity with Next.js framework is a strong plus Solid understanding of Redux or other state management tools Good knowledge of RESTful APIs and web services Experience with version control systems (Git/GitHub) Basic understanding of Node.js is a bonus Strong problem-solving and analytical skills Ability to work collaboratively in a fast-paced, team-oriented environment Bachelor's degree in Computer Science, Software Engineering, or related field Job Type: Full-time Pay: ₹10,000.00 - ₹35,000.00 per month Location Type: In-person Ability to commute/relocate: Vesu, Surat, Gujarat: Reliably commute or planning to relocate before starting work (Required) Application Question(s): Current CTC: Work Location: In person

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

About Cybage: Cybage Software is a technology consulting organization specializing in outsourced product engineering services. Our unique offerings span the technological spectrum–from cutting-edge software development to transformative digital strategies. In 1995, Cybage was founded with a mission to revolutionize the product engineering landscape, empowering businesses to soar above their limitations. What commenced as a modest venture has transformed into a pioneering force, shaping the digital future with innovative solutions tailored to our clients' unique needs. Technical Qualifications: Minimum 4 years of work experience using advanced Python programming & microservices. Well versed with developing REST APIs and services using at least one Python web framework (Django, Flask, FastAPI). Knowledge of Google Cloud (GCP) and corresponding cloud services and deployment process understanding (minimum 3+ years). Python + container or orchestration use (Docker / Kubernetes / etc.). Exposure to Cloud - GCP, App Engine, Firestore, SQL and BigQuery will be an added advantage. Strong knowledge of Data Structures & Algorithms, OOP, threads, and parallel processing. Hands-on experience working with relational SQL (PostgreSQL, MySQL, BigQuery) and NoSQL (Datastore, MongoDB) databases. Good knowledge of version control with Git, GitHub & SVN. Good with developing applications on Linux with sufficient Bash scripting knowledge. Expertise in the areas of Big Data and/or Data Analytics is a plus. Good to have knowledge of DevOps tools (like Docker & GKE) and third-party tools like Airflow, Elastic Search, etc. Key Responsibilities: Write effective & scalable code using Python and improve functionality of existing systems. Debug and optimize performance to ensure scalability and reliability with Python test framework tools like pytest and PyUnit. Conduct code reviews and ensure the delivery of high-quality code. Coordinate with internal & external teams to understand user requirements and provide technical solutions. Verify compliance with accessibility standards. Manage the developer team - provide technical guidance and address technical challenges and risks. Keep abreast of the latest technologies and tools.
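
The posting calls for validating code with Python test frameworks such as pytest; below is a minimal sketch of exercising a FastAPI endpoint with pytest and FastAPI's TestClient. The /items route and its payload are assumptions for illustration.

```python
# Minimal sketch: pytest-style tests against a FastAPI endpoint using TestClient.
# The endpoint and payload are illustrative, not taken from the posting.
from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

@app.post("/items")
def create_item(item: Item):
    return {"name": item.name, "price": item.price, "currency": "INR"}

client = TestClient(app)

def test_create_item_returns_payload():
    resp = client.post("/items", json={"name": "widget", "price": 9.5})
    assert resp.status_code == 200
    assert resp.json()["currency"] == "INR"

def test_create_item_rejects_bad_payload():
    resp = client.post("/items", json={"name": "widget"})  # missing price
    assert resp.status_code == 422  # FastAPI validation error
```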

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

Source: LinkedIn

Job Title: Senior Software Engineer Job Type: Full-time, Contractor About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market. Job Summary: We are seeking a highly skilled Senior Software Engineer to join one of our top customers, committed to designing and implementing high-performance microservices. The ideal candidate will have extensive experience with Python, FastAPI, task queues, WebSockets and Kubernetes to build scalable solutions for our platforms. This is an exciting opportunity for those who thrive in challenging environments and have a passion for technology and innovation. Key Responsibilities: Design and develop backend services using Python, with an emphasis on FastAPI for high-performance applications. Architect and orchestrate microservices to handle high-concurrency I/O requests efficiently. Deploy and manage applications on AWS, ensuring robust and scalable solutions are delivered. Implement and maintain messaging queues using Celery, RabbitMQ, or AWS SQS. Utilize WebSockets and asynchronous programming to enhance system responsiveness and performance. Collaborate with cross-functional teams to ensure seamless integration of solutions. Continuously improve system reliability, scalability, and performance through innovative design and testing. Required Skills and Qualifications: Proven experience in production deployments with user bases exceeding 10k. Expertise in Python and FastAPI, with strong knowledge of microservices architecture. Proficiency in working with queues and asynchronous programming. Hands-on experience with databases such as Postgres, MongoDB, or Databricks. Comprehensive knowledge of Kubernetes for running scalable microservices. Exceptional written and verbal communication skills. Consistent work history without overlapping roles or career gaps. Preferred Qualifications: Experience with GoLang for microservice development. Familiarity with data lake technologies such as Iceberg. Understanding of deploying APIs in Kubernetes environments.
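
A minimal sketch of the WebSocket plus asynchronous-programming pattern the role highlights: a FastAPI WebSocket route streaming progress updates for a long-running job. The payload shape is an illustrative assumption.

```python
# Minimal sketch: FastAPI WebSocket endpoint streaming progress updates.
# The "job progress" payloads are illustrative assumptions.
import asyncio
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()

@app.websocket("/ws/progress")
async def progress(websocket: WebSocket):
    await websocket.accept()
    try:
        for pct in range(0, 101, 20):
            # Simulate a long-running backend job reporting progress.
            await asyncio.sleep(1)
            await websocket.send_json({"progress": pct})
        await websocket.send_json({"status": "complete"})
    except WebSocketDisconnect:
        # Client went away; stop streaming updates.
        pass
```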

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 Lacs

Gurugram, Haryana

On-site

Source: Indeed

Company Overview: Schneider Electric is a global leader in energy management and automation, committed to providing innovative solutions that ensure Life Is On everywhere, for everyone, and at every moment. We are expanding our team in Gurugram and looking for a Senior Cloud Architect to enhance our cloud capabilities and drive the integration of digital technologies in our operations. Job Description: As a Senior Design Engineer at Schneider Electric, you will play a crucial role in developing and implementing IoT solutions across our global infrastructure, with a primary focus on Edge software. This position requires practical, hands-on ability to implement, manage, and optimize edge-based software solutions, ensuring efficient data processing for large-scale fleets of edge gateways and devices (hundreds of thousands) deployed in the field. Key Responsibilities: Develop scalable, high-performance Edge computing solutions for IoT applications. Independently design and implement asynchronous task processing using Python (asyncio, Twisted, Tornado, etc.) for efficient data handling and device communication. Develop and optimize IoT data pipelines, integrating sensors, edge devices, and cloud-based platforms. Work on device-to-cloud communication using MQTT, WebSockets, or other messaging protocols. Ensure software is secure, reliable, and optimized for resource-constrained edge environments. Design and optimize Linux-based networking for edge devices, including network configuration, VPNs, firewalls, and traffic shaping. Implement and manage Linux process management, including systemd services, resource allocation, and performance tuning for IoT applications. Stay updated with emerging IoT, edge computing, and Linux networking technologies. Requirements: Technical 3-5 years of overall experience in software engineering with a strong focus on Python development. Expertise in Python, with experience in asynchronous programming, task processing frameworks, and web frameworks (e.g., asyncio, Twisted, FastAPI, Flask). Strong knowledge of Linux networking, including TCP/IP, DNS, firewalls (iptables/nftables), VPNs, and network security. Experience in Linux process management, including systemd, resource limits (cgroups), and performance tuning. Good understanding of IoT architectures, protocols (MQTT, HTTP/REST), and edge computing frameworks. Hands-on experience with Docker. Proficiency and experience with Git or any other VCS. Excellent problem-solving skills and the ability to lead complex technical projects. Good to have: Knowledge of Rust, C++, or Golang for performance-critical edge applications. Prior experience of working in IoT. Understanding of BACnet/Modbus protocols. Familiarity with cloud IoT platforms (AWS IoT, Azure IoT, Google Cloud IoT) and their integration with edge devices. Soft Skills: Excellent problem-solving abilities and strong communication skills. Advanced verbal and written communication skills, including the ability to explain and present technical concepts to a diverse set of audiences. Good judgment, time management, and decision-making skills. Strong teamwork and interpersonal skills; ability to communicate and thrive in a cross-functional environment. Willingness to work outside the documented job description. Has a "whatever is needed" attitude. Qualifications Preferred Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or related field. Working experience in designing robust, scalable & maintainable asynchronous Python applications. Prior experience in building cloud-connected Edge IoT solutions. Prior experience in the energy sector or industrial automation is advantageous. Primary Location: IN-Haryana-Gurgaon Schedule: Full-time Unposting Date: Ongoing
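
A minimal sketch of the asynchronous task-processing pattern described above, using only the standard-library asyncio queue: many device readers feed one uplink worker. Device names and the publish step are illustrative stand-ins.

```python
# Minimal sketch: asyncio-based task processing for edge telemetry.
# Device ids, readings, and the "uplink" step are illustrative stand-ins.
import asyncio
import random

async def device_reader(device_id: str, queue: asyncio.Queue) -> None:
    # Stand-in for polling a sensor or receiving an MQTT message.
    for _ in range(3):
        await asyncio.sleep(random.uniform(0.1, 0.5))
        await queue.put({"device": device_id, "temp_c": round(random.uniform(20, 30), 1)})

async def uplink_worker(queue: asyncio.Queue) -> None:
    while True:
        reading = await queue.get()
        # Real code would publish to the cloud (MQTT/HTTPS) here.
        print("uplink:", reading)
        queue.task_done()

async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue(maxsize=100)
    readers = [asyncio.create_task(device_reader(f"gw-{i}", queue)) for i in range(5)]
    worker = asyncio.create_task(uplink_worker(queue))
    await asyncio.gather(*readers)
    await queue.join()  # wait until all queued readings are uplinked
    worker.cancel()

if __name__ == "__main__":
    asyncio.run(main())
```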

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

Source: LinkedIn

As a Frontend Engineer, you will have the unique opportunity to shape the DNA of our user experience. Given the no/low-code nature of our platform, we're seeking someone who is not only technically adept but also deeply passionate about creating an engaging, intuitive, and performant user interface. Responsibilities Work directly with the founders to conceptualize and implement an exceptional user interface. Lead frontend architecture decisions, ensuring the platform is scalable and maintainable. Drive forward web performance optimization. Advocate for and implement UI/UX best practices, ensuring users are at the heart of every design decision. Incorporate user feedback to continuously iterate and refine the platform. Stay updated with the latest frontend technologies and integrate them where relevant. Establish foundational frontend practices that will guide future team members. Requirements Degree in a technical discipline, preferably computer science. Experience with React/Next.js or similar frontend frameworks. Experience with the entire web development process (design, development, and deployment). Ability to integrate seamlessly with backend APIs. Familiarity with source control tools, preferably Git. In-depth knowledge of responsive design. A keen eye for user experience and UI design. Strong communication skills and a collaborative mindset. Problem-solving aptitude and enthusiasm for overcoming challenges. Nice To Have Experience in early-stage startup environments. Experience with no/low-code platforms. Understanding of web performance optimization and best practices. Familiarity with Python FastAPI or similar backend frameworks.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Delhi, India

Remote

Source: LinkedIn

Opportunity: Contract Engineer - AI Chatbot & Salesforce Integration Location: Remote Engagement Type: Contract About the Role: We're seeking an experienced Contract Engineer to implement an AI-powered chatbot integrated with Salesforce CRM. The ideal candidate will work on a real-time assistant to handle lead capture, scheduling, document handling, and multi-department routing. Responsibilities: Develop and deploy intelligent chatbots using OpenAI/GPT models and LangChain (or similar frameworks) Integrate chatbot workflows with Salesforce objects (e.g., Leads, Cases, Custom Objects) Design and expose backend APIs for smooth interaction with external systems (FTP, portals, databases) Implement authentication, session management, and user-specific routing in the bot Support Experience Cloud integrations (Instructor/Trainee/Scheduling portals) Ensure scalability, prompt handling, logging, and graceful error fallback Collaborate with internal stakeholders to align features with training and engineering teams. Skills & Experience: Strong experience with Python (LangChain / FastAPI / Flask) Proficient in OpenAI API or similar LLM providers (Anthropic, Cohere, etc.) Hands-on with Salesforce integration (REST APIs, Apex, Flow; LWC knowledge is a plus) Familiar with Heroku, AWS Lambda, or cloud hosting platforms Working knowledge of JSON, Webhooks, and OAuth 2.0 Ability to translate business workflows into automated conversational flows. Nice to Have: Experience building bots for enterprise CRM environments Understanding of UI frameworks (e.g., React.js) for embedding bots on web apps (ref:hirist.tech)

Posted 2 weeks ago

Apply

1.0 years

5 - 6 Lacs

Jaipur, Rajasthan, IN

On-site

Source: Internshala

About the job: Key responsibilities: 1. Fine-tune and deploy LLMs (e.g., LLaMA 3.2) using LoRA, QLoRA, and related optimization techniques 2. Build intelligent systems combining NLP, image analysis, and pattern recognition 3. Develop and integrate retrieval-augmented generation (RAG) pipelines using LangChain, LlamaIndex, and vector databases like FAISS or Weaviate 4. Apply statistical analysis, feature engineering, and advanced data analysis techniques using NumPy and Pandas 5. Train and evaluate models with CNNs, transformers, and neural networks in PyTorch, TensorFlow, or Keras 6. Package and deploy scalable inference APIs with FastAPI on AWS or Azure 7. Collaborate directly with product and engineering teams to ship AI features into production Requirements: 1. Possess 1+ years of hands-on experience in data science, machine learning, or deep learning roles 2. Demonstrate strong experience with Python, NumPy, Pandas, and modern ML frameworks (PyTorch or TensorFlow) 3. Show practical understanding of transformers, CNNs, and neural network training pipelines 4. Have exposure to LLMs, vector databases, and/or retrieval-augmented generation (RAG) systems 5. Be familiar with FastAPI and deploying ML models to the cloud (AWS or Azure) 6. Hold a solid grounding in statistics, data wrangling, and model evaluation techniques Who can apply: Only those candidates can apply who have a minimum of 1 year of experience and are Computer Science Engineering students. Salary: ₹ 5,00,000 - 6,00,000 /year Experience: 1 year(s) Deadline: 2025-06-28 23:59:59 Other perks: 5 days a week Skills required: Python, SQL, Machine Learning, Natural Language Processing (NLP), TensorFlow, FastAPI, Keras, NumPy and Pandas About Company: Softsensor.ai is a USA and India-based corporation focused on delivering outcomes to clients using data. Our expertise lies in a collection of people, methods, and accelerators to rapidly deploy solutions for our clients. Our principals have significant experience with leading global consulting firms & corporations and delivering large-scale solutions. We are focused on data science and analytics for improving process and organizational performance. We are working on cutting-edge data science technologies like NLP, CNN, and RNN and applying them in the business context.
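
A minimal sketch of the vector-retrieval step behind the RAG pipelines mentioned above, using FAISS with random stand-in embeddings; a real system would use an embedding model and an actual document corpus.

```python
# Minimal sketch of the retrieval step in a RAG pipeline using FAISS.
# Embeddings are random stand-ins; the dimension and documents are assumptions.
import faiss
import numpy as np

dim = 384  # typical sentence-embedding size (assumption)
docs = ["refund policy", "shipping times", "warranty terms"]
doc_vectors = np.random.rand(len(docs), dim).astype("float32")

index = faiss.IndexFlatL2(dim)  # exact L2 search; fine for small corpora
index.add(doc_vectors)

query_vector = np.random.rand(1, dim).astype("float32")
distances, ids = index.search(query_vector, 2)

# The retrieved documents would then be placed into the LLM prompt as context.
retrieved = [docs[i] for i in ids[0]]
print(retrieved, distances[0])
```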

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Python - Technical Architect Exp - 10 - 14 yrs Domain - INSURANCE Role - Development & architecture Location - Gurugram Qualification and Experience Recognized with a Bachelor's degree in Computer Science, Information Technology, or equivalent. Work experience - Overall experience of 10-14 years. Recognizable domain knowledge and awareness of basic insurance and regulatory frameworks. Previous experience working in the insurance industry (AINS Certification is a plus). Key Capabilities and Competencies Knowledge Proven experience as a Software Architect or Technical Project Manager with architectural responsibilities. Strong proficiency in Python and relevant frameworks (Django, Flask, FastAPI). Strong understanding of the software development lifecycle (SDLC), agile methodologies (Scrum, Kanban) and DevOps practices. Expertise in the Azure cloud ecosystem and architecture design patterns. Familiarity with Azure DevOps, CI/CD pipelines, monitoring and logging. Experience with RESTful APIs, microservices architecture and asynchronous processing. Deep understanding of insurance domain processes such as claims management, policy administration etc. Experience in database design and data modelling with SQL (MySQL) and NoSQL (Azure Cosmos DB). Knowledge of security best practices including data encryption, API security and compliance standards. Knowledge of SAST and DAST security tools is a plus. Major Responsibilities: Develop and customize solutions, including workflows, Workviews, and application integrations. Integrate with other enterprise applications and systems. Perform system upgrades and migrations to ensure optimal performance. Troubleshoot and resolve issues related to applications and workflows using the Diagnostic console. Ensure data integrity and security within the system. Maintain documentation for system configurations, workflows, and processes. Stay updated on best practices, new features and industry trends. Hands-on in Waterfall & Agile Scrum methodology. Working on software issues and specifications and performing design/code reviews. Engaging in the assignment of work to the development team resources, ensuring effective transition of knowledge, design assumptions and development expectations. Ability to mentor developers and lead cross-functional technical teams. Collaborate with stakeholders to gather requirements and translate them into technical specifications for effective workflow/Workview design. Assist in the training of end-users and provide support as needed. Contributing to the organizational values by actively working with agile development teams, methodologies, and toolsets. Driving concise, structured, and effective communication with peers and clients. Detailed JD will be shared as we screen your CV.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

8 - 14 Lacs

Bengaluru

Work from Office

Source: Naukri

About the Role : We are seeking a highly motivated and skilled Full Stack Developer with expertise in Python, React.js, and AI to join our growing team. You will be responsible for building and maintaining robust, scalable, and user-friendly web applications that integrate advanced AI functionalities. This role requires a strong understanding of both frontend and backend development, as well as hands-on experience with AI scripting and modern DevOps practices. Responsibilities : - Develop and maintain backend applications using Python frameworks such as Django, Flask, or FastAPI. - Design and implement engaging and responsive user interfaces using React.js. - Integrate AI functionalities into web applications using tools like OpenAI, LangChain, TensorFlow, PyTorch, or similar. - Design and develop efficient and secure RESTful APIs for seamless communication between frontend and backend. - Implement and manage database solutions, including both SQL and NoSQL databases. - Deploy and manage applications using Docker and Kubernetes, ensuring scalability and reliability. - Implement and maintain CI/CD pipelines to automate builds, tests, and deployments. - Utilize Git and version control systems (GitHub, GitLab, or Bitbucket) for collaborative development and code management. - Troubleshoot and resolve technical issues across the full stack, ensuring optimal application performance. - Collaborate effectively with cross-functional teams, including product managers, designers, and other developers. - Stay up-to-date with the latest trends and technologies in web development and AI, proactively seeking opportunities for improvement. Skills : - Python Development : 3+ years of experience in Python development, with proficiency in Django, Flask, or FastAPI. - React.js Development : 3+ years of experience in React.js and frontend development, with a strong understanding of modern JavaScript and TypeScript. - AI Scripting : Proven experience with AI scripting tools like OpenAI, LangChain, TensorFlow, PyTorch, or similar. - API Development : Proficiency in designing and developing efficient and secure RESTful APIs. - Frontend Technologies : Strong knowledge of JavaScript, TypeScript, and modern frontend frameworks and libraries. - Database Management : Hands-on experience with database management, including both SQL and NoSQL databases. - Containerization & Orchestration : Experience with Docker and Kubernetes for containerization and orchestration. - CI/CD : Experience with CI/CD pipelines and tools for automated deployments. - Version Control : Familiarity with Git, GitHub, GitLab, or Bitbucket for version control and collaborative development. - Problem-Solving : Strong problem-solving skills and the ability to work independently, taking initiative to resolve complex issues.

Posted 2 weeks ago

Apply

0.0 - 3.0 years

2 - 5 Lacs

Pune

Work from Office

Source: Naukri

Job Description: Python Training, Internship and Job Assistance Position: Python Intern Location: Pune, Maharashtra, India Duration: 6 months Mode: Offline Stipend: Unpaid Training Program: Free Training with Job Assistance upon Successful Completion Pre-Placement Offer (PPO) Opportunity: 2.5 LPA CTC (Based on performance) Note: CANDIDATES SHOULD BE READY TO LEARN NEW TECHNOLOGIES About the Role: We are seeking a motivated and detail-oriented Python Intern to join our team. The ideal candidate will have a foundational understanding of API development using frameworks like Flask and FastAPI, a strong grasp of Python libraries such as Pandas and NumPy, and a knack for problem-solving. Knowledge of data analytics tools and Large Language Models (LLMs) will be considered a plus. Key Responsibilities: Develop and maintain RESTful APIs using Flask and FastAPI. Work with data manipulation and analysis libraries like Pandas and NumPy. Assist in building scalable and efficient back-end solutions. Solve real-world problems with logical and analytical thinking. Support data analytics efforts using tools like Power BI (preferred). Collaborate with cross-functional teams to understand requirements and deliver solutions. Explore and implement applications of LLMs and other advanced technologies (as needed). Requirements: Proficiency in Python programming. Hands-on experience or knowledge of Flask and FastAPI frameworks. Familiarity with data analysis libraries such as Pandas and NumPy. Strong problem-solving and logical thinking skills. Good understanding of RESTful API design principles. Knowledge of data analytics tools like Power BI is a plus. Awareness of Large Language Models (LLMs) and their potential applications is a bonus. Eagerness to learn and work in a collaborative environment. What We Offer: Hands-on experience in API development and data analytics projects. Mentorship from experienced professionals. Opportunity to explore cutting-edge technologies like LLMs. A learning-focused environment to develop both technical and analytical skills. Potential for full-time opportunities based on performance.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Mumbai

Work from Office

Source: Naukri

Summary Bitkraft Technologies LLP is looking for a skilled Senior Python Developer to join our software engineering team. You will be working across the technology stack on cutting-edge, innovative web development projects for our custom services business. The ideal candidate will have a solid background in building scalable, maintainable, and high-performance applications using Python and FastAPI. You will be responsible for leading a development team, engaging in technical design, and ensuring best practices are followed for delivering quality solutions. If you love solving problems, are a team player and want to work in a fast-paced environment with core technical and business challenges, we would like to meet you. Must-Know Skills Strong proficiency in Python/FastAPI (highest priority). Solid understanding of design patterns and object-oriented programming (OOP). Experience with Test-Driven Development (TDD). Hands-on experience with ORMs for database interaction. Kafka / Kafka consumers: recent practical exposure is required. Preferred Skills Familiarity with Java Spring Boot. Knowledge of basic AWS services. Understanding of Docker and containerization concepts. Experience with multi-threading and concurrent programming. Nice-to-Have Skills (Non-Mandatory) Frontend/UI exposure: While React experience is not a priority, familiarity with UI development practices from a deployment and debugging perspective is desirable. However, candidates will not be expected to contribute to frontend development. TimeSeries DB and Druid: Can be omitted from the evaluation criteria. Apache Iceberg familiarity is a plus but not required. Key Responsibilities Lead and mentor a team of developers to ensure best practices, code quality, and high-performance application development. Participate in the technical design and code review process to ensure architecture and solutions align with business requirements and scalability goals. Set development standards and ensure adherence to them across the team to deliver high-quality solutions. Build and optimize reusable Python libraries that are thread-safe, performance-optimized, and include robust exception handling. Collaborate closely with product owners, senior technical architects, and global teams to define application requirements and deliver solutions. Ensure continuous improvement in development processes by leveraging TDD and agile methodologies. Maintain a focus on building scalable and maintainable software that supports business growth and is capable of handling increasing data loads. Experience : 6+ years
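
Since recent Kafka-consumer exposure is called out as a must-know skill, here is a minimal sketch of a JSON-consuming Kafka consumer using the kafka-python client; the broker address, topic and processing step are assumptions for illustration.

```python
# Minimal sketch of a Kafka consumer loop using the kafka-python client.
# Topic, broker address, and the processing step are illustrative assumptions.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                   # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    group_id="order-processor",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

def process(order: dict) -> None:
    # Placeholder for the real business logic (DB write, downstream call, etc.)
    print(f"processing order {order.get('id')}")

for message in consumer:
    # kafka-python yields ConsumerRecord objects; offsets are auto-committed by default.
    process(message.value)
```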

Posted 2 weeks ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Gurugram

Work from Office

Source: Naukri

Role Overview: As a Data Scientist, you will lead the development of advanced analytics and Generative AI (GenAI) solutions that deliver measurable impact for clients. You will combine deep technical knowledge in machine learning and GenAI with solid software engineering practices to build scalable, robust tools and applications. Responsibilities: Design, build, and deploy advanced analytics and GenAI solutions, including custom agents and LLM-integrated applications. Lead the development of modular, maintainable, and production-ready codebases using best practices in software engineering (e.g., version control, testing, CI/CD). Build interactive apps and dashboards using frameworks like Streamlit, enabling seamless business interaction with AI tools. Develop APIs and microservices to integrate GenAI tools with client systems or platforms. Identify and mitigate risks associated with GenAI (e.g., hallucination, bias, privacy, regulatory concerns), and implement robust logging, monitoring, and evaluation mechanisms. Translate complex analytical and AI concepts into clear recommendations for business and technical stakeholders. Coach and mentor junior data scientists, ensuring high-quality code and technical growth across the team. Collaborate with data engineers, DevOps, and product managers to ensure end-to-end solution delivery. Qualifications: 5+ years of experience in data science, with strong experience delivering ML or GenAI models and applications in production. Proficiency in Python with strong command of relevant libraries (e.g., pandas, scikit-learn, PyTorch, LangChain, FastAPI). Solid software engineering skills: experience with Git, testing frameworks (e.g., pytest), CI/CD pipelines (e.g., GitHub Actions, GitLab CI), and packaging/distribution of code. Experience building modular architectures for GenAI applications and agents (e.g., using LangChain, OpenAI function calling, vector stores).
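
A minimal sketch of the kind of Streamlit front end described above, wiring a text box to a hypothetical generate_answer backend call; in practice that function would invoke the GenAI/retrieval pipeline.

```python
# Minimal sketch of a Streamlit app for interacting with a GenAI tool.
# generate_answer is a hypothetical stand-in for the real backend call.
import streamlit as st

def generate_answer(question: str) -> str:
    # Swap in the real LLM / retrieval pipeline here.
    return f"(stub) You asked: {question}"

st.title("Client Analytics Assistant")
question = st.text_area("Ask a question about your data")

if st.button("Run") and question.strip():
    with st.spinner("Thinking..."):
        st.write(generate_answer(question))
```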

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

India

On-site

Source: LinkedIn

Description The Position We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You’ll work within the Engineering team to build and improve our platforms to deliver flexible and creative solutions to our utility partners and end users and help us achieve our ambitious goals for our business and the planet. We are seeking a skilled and passionate Data Engineer with expertise in Python to join our development team. As a Data Engineer, you will play a crucial role developing different components, harnessing the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, data processing and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. You will coordinate with the rest of the team working on different layers of the infrastructure. Therefore, a commitment to collaborative problem solving, sophisticated design, and quality product is important. You will own the development and its quality independently and be responsible for high quality deliverables. And you will work with a great team with excellent benefits. Responsibilities & Skills You should: Be excited to work with talented, committed people in a fast-paced environment. Use a data-driven approach and actively work on product & technology roadmap at strategy level and day-to-day tactical level. Have a proven experience as a Data Engineer with a focus on Python. Be designing, building, and maintaining high performance solutions with reusable, and reliable code. Use a rigorous approach for product improvement and customer satisfaction. Love developing great software as a seasoned product engineer. Be ready, able, and willing to jump onto a call with a partner or customer to help solve problems. Be able to deliver against several initiatives simultaneously. Have a strong eye for detail and quality of code. Have an agile mindset. Have strong problem-solving skills and attention to detail. Required Skills (Data Engineer) You are an experienced developer – you ideally have 4 or more years of professional experience Design, build, and maintain scalable data pipelines and ETL processes to support business analytics and reporting needs. Strong proficiency in Python for building and automating data pipelines, ETL processes, and data integration workflows. Strong Experience with SQL for querying and transforming large datasets, and optimizing query performance in relational databases. Familiarity with big data frameworks such as Apache Spark or PySpark for distributed data processing. Hands-on experience with data pipeline orchestration tools like Apache Airflow or Prefect for workflow automation. Strong Understanding of data modeling principles for building scalable and efficient data architectures (e.g., star schema, snowflake schema). Good to have experience with Databricks for managing and processing large datasets, implementing Delta Lake, and leveraging its collaborative environment. Knowledge of Google Cloud Platform (GCP) services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage for end-to-end data engineering solutions. Familiarity with version control systems such as Git and CI/CD pipelines for managing code and deploying workflows. Awareness of data governance and security best practices, including access control, data masking, and compliance with industry standards. 
Exposure to monitoring and logging tools like Datadog, Cloud Logging, or ELK stack for maintaining pipeline reliability. Ability to understand business requirements and translate them into technical requirements. Expertise in solutions design. Demonstrable experience with writing unit and functional tests. Ability to deliver against several initiatives simultaneously as a multiplier. Required Skills (Python) You are an experienced developer - a minimum of 4+ years of professional experience Python experience, preferably both 2.7 and 3.x Strong Python knowledge - familiar with OOP, data structures and algorithms Work experience & strong proficiency in Python and its associated frameworks (like Flask, FastAPI etc.) Experience in designing and implementing scalable microservice architecture Familiarity with RESTful APIs and integration of third-party APIs 2+ years building and managing APIs to industry-accepted RESTful standards Demonstrable experience with writing unit and functional tests Application of industry security best practices to application and system development Experience with database systems such as PostgreSQL, MySQL, or MongoDB Preferred: The following experiences are not required, but you'll stand out from other applicants if you have any of the following, in our order of importance: Experience with cloud infrastructure like AWS/GCP or other cloud service provider experience Serverless architecture, preferably AWS Lambda Solid CI/CD experience You are a Git guru and revel in collaborative workflows. You work on the command line confidently and are familiar with all the goodies that the Linux toolkit can provide Knowledge of modern authorization mechanisms, such as JSON Web Tokens. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Uplight provides equal employment opportunities to all employees and applicants and prohibits discrimination and harassment of any type without regard to race (including hair texture and hairstyles), color, religion (including head coverings), age, sex, national origin, caste, disability status, genetics, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
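
A minimal sketch of an orchestrated extract-transform-load flow of the sort described above, written as an Airflow 2.x-style DAG; the task bodies, DAG id and schedule are illustrative assumptions (older Airflow versions use schedule_interval instead of schedule).

```python
# Minimal sketch of an Airflow 2.x-style DAG for an extract -> transform -> load flow.
# Task bodies, DAG id, and schedule are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw events from the source API / bucket")

def transform():
    print("clean and aggregate into reporting tables")

def load():
    print("write results to the warehouse (e.g. BigQuery)")

with DAG(
    dag_id="daily_energy_usage",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```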

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

We are building full stack AI-agentic teams (hyper-specialised autonomous agents that use LLM back-ends, MCP servers, tailor-made tools and retrieval pipelines) to completely replace desk-job based businesses which currently require high-value ($100k+) human employees. We invite IIT/NIT students (ideal for 2025/26 batches) to join us as Software-Engineering Interns and help design, build, and benchmark these agents at production scale. 1 | Internship Snapshot • Role : Software Engineer Intern – LLM / Autonomous-Agent Development • Duration : 10–12 weeks (May–July 2025, flexible), can extend based on mutual fit • Location : Remote (IST overlap) • Stipend : Competitive + PPO-track (get to be a part of the founding team with ESOPs) • Stack: Python, FastAPI, LangChain/LangGraph, pgvector/FAISS, Docker, MCP servers • Work directly with founders (IIT/IIM alumni) 2 | What You Will Build • LLM-backed micro-agents that own an entire sub-task/workflow end-to-end • Retrieval pipelines using hybrid vector + keyword search (FAISS / pgvector) for tool-augmented reasoning. • Agent evaluation harnesses (pytest + LangChain), echoing best practice from industrial agent teams 3 | Core Responsibilities • Spec & API design – Draft OpenAPI contracts and JSON Schema for new agents. • Build agent functionalities: Integrate sessions, memory, knowledge graphs, state etc. • Build tools – Write typed, test-covered Python; integrate async I/O and streaming where it improves token efficiency • Infrastructure – Containerise services and deploy to our EKS staging cluster with auto-scaling triggers. • Documentation & handover – Produce clear READMEs 4 | Ideal Candidate Must-Have • Solid CS fundamentals (DSA, OS, networks) • Fluency in Python; Git workflows • Experience with agent development on Agno or OpenAI SDKs • SQL + basic schema design Nice-to-have • REST, MCP and Knowledge graphs • AWS basics (S3, EKS) 5 | How to apply • Please fill this form: https://forms.gle/gjNN9bBSVMTmqs6aA

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

Source: LinkedIn

Please find below the job description for a full-time role with us, MRCC Group. Please share your resume and salary confirmation if interested. Key Responsibilities: • Design, build, and deploy AI models using ML, DL, and GenAI techniques. • Work on Agentic AI frameworks and build intelligent, goal-directed agents using frameworks like LangChain and LangGraph. • Create and expose RESTful APIs using FastAPI for model serving. • Build and maintain scalable pipelines for data preprocessing, training, and inference. • Collaborate with cross-functional teams to develop AI-powered solutions from scratch to production. • Stay updated with the latest research and trends in AI/ML/NLP/Agentic systems. • Project experience in building AI solutions like chatbots, sentiment analysis, image classification, etc. • Knowledge of cloud platforms (Azure) for deploying AI models. Key Requirements: • Bachelor's/Master's degree in Computer Science or Data Science. • Strong proficiency in Python and relevant AI/ML libraries. • Hands-on experience with Machine Learning and Deep Learning, Generative AI (LLMs, transformers, embeddings), and Natural Language Processing (NLP).

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Greater Chennai Area

On-site

Source: LinkedIn

Redefine the future of customer experiences. One conversation at a time. We’re changing the game with a first-of-its-kind, conversation-centric platform that unifies team collaboration and customer experience in one place. Powered by AI, built by amazing humans. Our culture is forward-thinking, customer-obsessed and built on an unwavering belief that connection fuels business and life; connections to our customers with our signature Amazing Service®, our products and services, and most importantly, each other. Since 2008, 100,000+ companies and 1M+ users rely on Nextiva for customer and team communication. If you’re ready to collaborate and create with amazing people, let your personality shine and be on the frontlines of helping businesses deliver amazing experiences, you’re in the right place. Build Amazing - Deliver Amazing - Live Amazing - Be Amazing We’re looking for a highly skilled and hands-on RAG (Retrieval-Augmented Generation) & Prompt Engineer to join our applied AI team. You’ll work with cutting-edge open-source and proprietary LLMs (like LLaMA, Mistral, Claude, GPT-4o, etc.) to build, prompt, and orchestrate intelligent agents that are capable, reliable, and production-ready. This role is perfect for someone who has experience developing prompt chains, implementing tool-calling workflows, and debugging AI agents at scale. Key Responsibilities Design, develop, and iterate on prompt strategies tailored to downloadable models and major APIs (LLaMA, Mistral, Claude, GPT-4o, etc.). Architect and implement RAG pipelines with a deep understanding of embedding models, retrievers, and context optimization techniques. Create prompt chains and tool-calling workflows for dynamic agent behavior using Responses API and similar frameworks. Design, test, and deploy foolproof agent architectures using OpenAI tool calling and agent protocol layers. Write robust Guardrails and control flows for agents to prevent unintended behaviors and ensure task compliance. Debug and maintain agent codebases, ensuring reliability and scalability of deployed services. Apply basic knowledge of OpenAI Operator and related orchestration tools to manage agent lifecycle. Collaborate with researchers and infra teams to optimize prompt efficiency and latency. Must-Have Qualifications 3 - 5 years of experience in AI engineering, prompt engineering, or applied ML roles. Proven experience working with both downloadable open-source models and hosted APIs. Strong knowledge of LLM prompt design patterns, prompt chaining, and failure handling. Ability to build agent systems that are secure, auditable, and self-healing. Good coding and debugging skills in Python (or relevant stack) with focus on AI orchestration. Familiarity with agent deployment pipelines, containerized environments, and CI/CD flows. Tech Stack We Use Python, FastAPI, LangChain / LlamaIndex. OpenAI, Anthropic, HuggingFace. Vector DBs (Weaviate, Pinecone, Qdrant). Responses API, OpenAI Operator, A2A SDK. Docker, GitHub Actions, GCP/AWS. Bonus (Nice-to-Have Skills) Experience building agents from scratch, especially with agent transfer logic and persistent memory. Understanding of Model Context Protocols and how to integrate them into multi-agent LLM stacks. Familiarity with A2A SDK for agent-to-agent communication and delegation. Hands-on experience with LoRA / QLoRA techniques for fine-tuning GPT-style models on downstream or domain-specific tasks. Experience with vector DBs, context compression, or multi-turn reasoning at scale. 
Total Rewards Our Total Rewards offerings are designed to allow our employees to take care of themselves and their families so they can be their best, in and out of the office. Our compensation packages are tailored to each role and candidate's qualifications. We consider a wide range of factors, including skills, experience, training, and certifications, when determining compensation. We aim to offer competitive salaries or wages that reflect the value you bring to our team. Depending on the position, compensation may include base salary and/or hourly wages, incentives, or bonuses. Medical 🩺 - Medical insurance coverage is available for employees, their spouse, and up to two dependent children with a limit of 500,000 INR, as well as their parents or in-laws for up to 300,000 INR. This comprehensive coverage ensures that essential healthcare needs are met for the entire family unit, providing peace of mind and security in times of medical necessity. Group Term & Group Personal Accident Insurance 💼 - Provides insurance coverage against the risk of death / injury during the policy period sustained due to an accident caused by violent, visible & external means. Coverage Type - Employee Only Sum Insured - 3 times of annual CTC with minimum cap of INR 10,00,000 Free Cover Limit - 1.5 Crore Work-Life Balance ⚖️ - 15 days of Privilege leaves per calendar year, 6 days of Paid Sick leave per calendar year, 6 days of Casual leave per calendar year. Paid 26 weeks of Maternity leaves, 1 week of Paternity leave, a day off on your Birthday, and paid holidays Financial Security 💰 - Provident Fund & Gratuity Wellness 🤸 - Employee Assistance Program and comprehensive wellness initiatives Growth 🌱 - Access to ongoing learning and development opportunities and career advancement At Nextiva, we're committed to supporting our employees' health, well-being, and professional growth. Join us and build a rewarding career! Established in 2008 and headquartered in Scottsdale, Arizona, Nextiva secured $200M from Goldman Sachs in late 2021, valuing the company at $2.7B. To check out what's going on at Nextiva, check us out on Instagram, Instagram (MX), YouTube, LinkedIn, and the Nextiva blog.
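
A minimal sketch of the retrieve-then-prompt step at the heart of a RAG pipeline like the one this role works on. The embeddings here are random stand-ins; a real system would use an embedding model and a vector store such as Pinecone, Qdrant or Weaviate.

```python
# Minimal sketch of retrieve-then-prompt for a RAG pipeline.
# Embeddings are random stand-ins; documents and the question are illustrative.
import numpy as np

docs = [
    "Agents can transfer conversations between departments.",
    "The platform unifies team collaboration and customer conversations.",
    "Analytics dashboards summarise call volume by queue.",
]
doc_vecs = np.random.rand(len(docs), 64)   # stand-in embeddings
query_vec = np.random.rand(64)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank documents by similarity to the query and keep the top two as context.
top = sorted(range(len(docs)), key=lambda i: cosine(doc_vecs[i], query_vec), reverse=True)[:2]
context = "\n".join(docs[i] for i in top)

prompt = (
    "Answer using only the context below. If the answer is not in the context, say so.\n\n"
    f"Context:\n{context}\n\nQuestion: How do agents hand off conversations?"
)
print(prompt)  # this string would be sent to the LLM
```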

Posted 2 weeks ago

Apply

7.0 - 12.0 years

0 - 2 Lacs

Hyderabad, Chennai

Work from Office

Source: Naukri

We are seeking a highly skilled and experienced Senior Python Product Development Engineer to join our engineering team. In this role, you will be responsible for designing, developing, and optimizing Python-based products and services that are core to our business. You will work closely with product managers, designers, and fellow engineers to translate user needs into functional and scalable software solutions. Key Responsibilities Lead the design and implementation of core Python-based services and applications. Collaborate with product management to define feature requirements and technical specifications. Develop and maintain clean, efficient, and well-documented code. Optimize performance and scalability of systems to support high growth and usage. Conduct code reviews and mentor junior and mid-level engineers. Integrate third-party APIs and tools where appropriate. Drive continuous improvement in development processes and tooling. Participate in architectural discussions and contribute to long-term technical strategy. Ensure software meets all requirements of quality, security, scalability, and maintainability. Collaborate with QA and DevOps to ensure high-quality releases and deployments. Required Qualifications Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5+ years of hands-on experience in Python software development. Proven track record in product-focused engineering roles, preferably in SaaS or platform companies. Strong knowledge of modern Python frameworks (e.g., Django, FastAPI, Flask). Experience with RESTful API design and development. Familiarity with cloud platforms (AWS, GCP, or Azure) and containerized environments (Docker, Kubernetes). Proficient in database design and development (PostgreSQL, MySQL, or NoSQL databases). Solid understanding of software engineering principles, data structures, and algorithms. Strong communication and collaboration skills. Preferred Qualifications Experience with asynchronous programming (e.g., asyncio, Celery). Familiarity with CI/CD pipelines and version control systems like Git. Background in microservices architecture. Exposure to frontend technologies (e.g., React, Vue) is a plus. Experience in agile/scrum development environments.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Greater Chennai Area

On-site

Source: LinkedIn

Are you a coding genius, a debugging maniac, and a relentless problem-solving ninja? Do you dream in Java, Python, and Go, and wake up eager to architect the future? If yes, then Mindsprint wants YOU to join our elite squad of software wizards! Why Join Mindsprint? At Mindsprint, we don't just build software — we craft experiences, innovate relentlessly, and push the boundaries of what code can do. If you thrive in a fast-paced, intellectually charged environment where your ideas spark revolutions, you'll fit right in. Your Mission, Should You Choose to Accept It: 1. Architect & Innovate Lead the design and implementation of scalable, high-performance features across Java, Python, and Go ecosystems. Write clean, modular, and elegant code that even your future self will thank you for. Set the gold standard for coding practices and mentor junior developers to become coding maestros themselves. 2. Debugging Wizardry Hunt down bugs like a seasoned detective and obliterate them with surgical precision. Collaborate with QA and DevOps teams to ensure rock-solid, stable releases. 3. Testing & Automation Guru Champion automated testing frameworks and CI/CD pipelines to keep our codebase bulletproof. Craft comprehensive unit, integration, and system tests that catch issues before they catch us. 4. Lifelong Learning & Tech Evangelism Stay ahead of the curve by mastering new technologies, frameworks, and methodologies. Share your knowledge through code reviews, tech talks, and team workshops. 5. Collaboration & Leadership Drive sprint planning, retrospectives, and daily stand-ups with infectious enthusiasm. Liaise with product owners, architects, and stakeholders to translate vision into reality. 6. Codebase Custodian Maintain and refactor legacy code with care and precision, ensuring longevity and scalability. Monitor live applications and swiftly resolve production issues like a true guardian of uptime. What You Bring to the Table: Technical Sorcery: Mastery in Java, Python, and Go — you speak their languages fluently and can juggle all three without breaking a sweat. Deep experience with frameworks like Spring Boot, Django/Flask/FastAPI, and Gin/Echo. Expert-level Git ninja skills for seamless collaboration. Solid grasp of relational (MySQL/PostgreSQL) and NoSQL (MongoDB/Redis) databases. Proven track record building and consuming RESTful APIs. Hands-on experience with Docker, Kubernetes, and CI/CD pipelines. Testing wizardry with JUnit, pytest, or equivalent frameworks. Soft Skills: Intellectual curiosity that borders on obsession. Communication skills sharp enough to cut through the noise. Leadership that inspires and uplifts. Problem-solving prowess that turns puzzles into poetry. Join Us & Code the Future! If you're eager to innovate, obsessed with clean code, and ready to sprint with the best, apply now and show us why you're the coding maestro we've been dreaming of.

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

Remote


Python Backend Developer (FastAPI + LLM/ML Integrations)
Location: Remote / Hybrid
Type: Contract / Full-time
Experience: 3+ Years

Role Overview:
We are seeking a skilled Python Backend Developer to design, build, and optimize RESTful APIs for seamless interaction with Large Language Models (LLMs), machine learning services, and cloud-based AI tools. You will work closely with frontend teams to deliver robust API services, manage database interactions, and integrate AI/ML components into scalable backend systems. The ideal candidate will have strong expertise in FastAPI, Python-based API development, text extraction (PDFs, documents), and cloud-based ML services (AWS Bedrock, Vertex AI, etc.).

Key Responsibilities:
Design and develop high-performance REST APIs using FastAPI for frontend consumption.
Integrate with LLMs (OpenAI, Anthropic, Llama 2, etc.) and ML models (RAG pipelines, embeddings, fine-tuning).
Extract and process text/data from PDFs, documents, and unstructured sources using Python libraries (PyPDF2, pdfplumber, unstructured.io, etc.).
Work with databases (PostgreSQL, MongoDB, Redis) for efficient data storage/retrieval.
Implement authentication (JWT/OAuth), rate limiting, and API security best practices.
Collaborate with frontend teams to optimize API responses and ensure smooth integration.
Deploy and manage APIs on AWS/GCP (Lambda, API Gateway, EC2, Docker).
Work with AI/ML tools like LangChain, LlamaIndex, Weaviate, or Pinecone for retrieval-augmented workflows.
Write clean, scalable, and well-documented code with unit/integration testing (Pytest).

Technology Stack:
Backend: Python, FastAPI, Flask (optional), async programming
AI/ML Integrations: OpenAI API, Hugging Face, AWS Bedrock, LangChain, LlamaIndex
Database: PostgreSQL, MongoDB, Redis, ORMs (SQLAlchemy, Pydantic)
Document Processing: PyPDF2, pdfplumber, unstructured.io, OCR tools (Tesseract)
Cloud & DevOps: AWS (Lambda, API Gateway, S3), Docker, CI/CD (GitHub Actions)
API Tools: Postman, Swagger/OpenAPI, RESTful standards
Version Control: Git, GitHub/GitLab

Required Skills & Experience:
3+ years of Python backend development (FastAPI/Flask/Django).
Strong experience in building REST APIs for frontend/UI consumption.
Familiarity with LLM APIs, RAG pipelines, and AI/ML cloud services.
Knowledge of text extraction from PDFs/documents and data preprocessing.
Experience with relational & NoSQL databases (PostgreSQL, MongoDB).
Understanding of JWT, OAuth, API security, and rate limiting.
Basic knowledge of AWS/GCP cloud services (Lambda, S3, EC2).
Ability to work in an Agile/remote team environment.

Nice to Have:
Experience with asynchronous programming (async/await in FastAPI).
Knowledge of WebSockets for real-time updates.
Familiarity with ML model deployment (SageMaker, Vertex AI).
Exposure to frontend frameworks (React, Next.js) for better API collaboration.

Prior exposure to HIPAA-compliant projects is a must.
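
One responsibility above is extracting text from uploaded PDFs behind a FastAPI endpoint. As a hedged sketch under assumed names (the /extract-text route and response shape are illustrative, not the employer's API), it might look like this with pdfplumber:

```python
# Minimal FastAPI + pdfplumber sketch; route, response shape, and error codes
# are assumptions for illustration only.
import io

import pdfplumber
from fastapi import FastAPI, File, HTTPException, UploadFile

app = FastAPI()


@app.post("/extract-text")
async def extract_text(file: UploadFile = File(...)) -> dict:
    if file.content_type != "application/pdf":
        raise HTTPException(status_code=415, detail="Please upload a PDF file.")
    raw = await file.read()
    with pdfplumber.open(io.BytesIO(raw)) as pdf:
        pages = [page.extract_text() or "" for page in pdf.pages]
    return {"filename": file.filename, "text": "\n".join(pages)}
```

The extracted text could then be chunked and passed to an embedding model or LLM in a retrieval-augmented workflow, which is the integration work the posting describes.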

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Primary Skill: Python, Flask
Secondary Skill: Cloud, Database

Strong hands-on programming experience with Python (Flask, Django, or FastAPI preferred).
Expertise in Python with FastAPI.
Proficiency in designing and consuming RESTful APIs and GraphQL.
Solid understanding of RESTful API development and integration.
Practical experience with cloud platforms (AWS Lambda, Azure Functions, GCP App Engine, or similar services).
Basic exposure to or working knowledge of AI/ML concepts and frameworks (such as scikit-learn, TensorFlow, PyTorch).
Good understanding of microservices architecture and containerization (Docker preferred).
Proficient with relational and NoSQL databases (PostgreSQL, MySQL, MongoDB, etc.).
Experience in product development with SaaS applications.
Experience with version control systems (Git) and CI/CD pipelines.
Knowledge of asynchronous programming and message queues (RabbitMQ, Kafka) is a plus.
Strong problem-solving, analytical, and communication skills.
Familiarity with machine learning and data science workflows (e.g., TensorFlow, PyTorch).
Experience with front-end technologies such as JavaScript, React, or Angular is desirable.
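
Since the primary skill here is Flask with RESTful API design, a minimal hedged sketch (the /products routes and in-memory store are illustrative assumptions, not the actual product) might look like this:

```python
# Minimal Flask REST sketch; PRODUCTS is an in-memory stand-in for a real database.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

PRODUCTS = {1: {"id": 1, "name": "starter-plan", "price": 0}}


@app.get("/products/<int:product_id>")
def get_product(product_id: int):
    product = PRODUCTS.get(product_id)
    if product is None:
        abort(404, description="Product not found")
    return jsonify(product)


@app.post("/products")
def create_product():
    payload = request.get_json(force=True)
    product_id = max(PRODUCTS, default=0) + 1
    product = {"id": product_id, "name": payload["name"], "price": payload.get("price", 0)}
    PRODUCTS[product_id] = product
    return jsonify(product), 201
```

In a real SaaS deployment the dictionary would be replaced by PostgreSQL or MongoDB access, and the service could be containerized with Docker and hosted on a cloud platform such as AWS Lambda or GCP App Engine, as the listing mentions.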

Posted 2 weeks ago

Apply

9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

We have growing Python demand and expect it to keep increasing, so we are hiring Python Developers against the job description below.

Python Developer (FastAPI/Flask/Django)
Experience Range: 5 – 9 Years (a healthy mix within this experience range)

Responsibilities
Build scalable backend applications using FastAPI, Flask or Django.
Design RESTful APIs and implement data models with Pydantic.
Utilize AWS services (DynamoDB, S3, EKS) for deployment.
Write unit tests and integrate with CI/CD pipelines.
Collaborate with cross-functional teams.

Requirements
Python development experience.
Experience with any one of the following frameworks: Django, FastAPI, or Flask.
AWS services knowledge (DynamoDB, S3, EKS).
Pydantic and unit-testing experience, e.g., with PyTest.
Git for version control.

Nice to Have
Containerization (Docker) and orchestration (Kubernetes).
CI/CD pipeline experience.
NoSQL database knowledge.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
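
The responsibilities above pair Pydantic data models with PyTest-style unit testing. As a hedged sketch only (the Order model, /orders route, and test are hypothetical, not EY code), the combination might look like this in FastAPI:

```python
# Illustrative FastAPI + Pydantic + pytest sketch; all names are assumptions.
from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel, Field

app = FastAPI()


class Order(BaseModel):
    sku: str = Field(min_length=1)
    quantity: int = Field(gt=0)


@app.post("/orders")
def create_order(order: Order) -> dict:
    # Pydantic has already validated sku and quantity before this handler runs.
    return {"sku": order.sku, "quantity": order.quantity, "status": "accepted"}


def test_create_order_validates_quantity() -> None:
    client = TestClient(app)
    assert client.post("/orders", json={"sku": "ABC-1", "quantity": 2}).status_code == 200
    # A non-positive quantity is rejected by the model with a 422 response.
    assert client.post("/orders", json={"sku": "ABC-1", "quantity": 0}).status_code == 422
```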

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies