Jobs
Interviews

4097 FastAPI Jobs - Page 41

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0.0 - 4.0 years

0 Lacs

Karnataka

On-site

As a Python Intern at 8byte, you will have the opportunity to work on developing robust, scalable, and secure AI applications for enterprise environments. You will collaborate with experienced engineers and researchers to contribute to critical components of our AI infrastructure. Your responsibilities will include data parsing and processing, search indexing, LLM routing systems, API development, testing and debugging, documentation, and research into new tools and techniques to enhance existing processes and features.

To be successful in this role, you should be currently pursuing a Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related technical field. You should have strong foundational knowledge of Python programming, familiarity with data structures and algorithms, and a basic understanding of databases (SQL/NoSQL) and data manipulation concepts. An eagerness to learn new technologies, excellent problem-solving skills, attention to detail, and good communication and teamwork abilities are essential. Desired skills that would be considered a bonus include experience with any Python web framework (e.g., Flask, Django, FastAPI), familiarity with version control systems (e.g., Git), and a basic understanding of machine learning concepts, especially Natural Language Processing (NLP).

At 8byte, you will gain hands-on experience with cutting-edge AI technologies and real-world enterprise projects. You will receive mentorship from experienced AI engineers and researchers in a collaborative and supportive work environment. This internship offers you the opportunity to make a tangible impact on our products and clients, along with a stipend commensurate with industry standards. Please note that this is a paid internship.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

0 Lacs

Bengaluru

Hybrid

Role & Responsibilities / Technical & Behavioral Competencies

Programming/technical languages: strong Python development skills, Bash, YAML, FastAPI.

Essential: Kubernetes and Helm deployment; Jinja templating; CI/CD with Tekton; management of APIs (curl, and the Python requests library to implement calls); HashiCorp Vault; MongoDB (NoSQL database management); Airflow; Kibana.

Desirable: Git/Bitbucket; ITIL methodology; VS Code; LDAP/Active Directory; Confluence; Postman; Sentry; CyberArk; Keycloak; JupyterHub; Illumio; Coder; AVI (load balancer service); AlertManager; exporters; Celery; Postgres + Timescale; Prometheus; Promscale; Promtail; Loki; Redis; Superset; OPA; Grafana; HA-Proxy; incident post-mortem skills.

Preferred candidate profile
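The posting above asks for API management with curl and the Python requests library against services such as HashiCorp Vault. As a dependency-free illustration of that call pattern, here is a minimal sketch using only the standard library; the base URL, token, and secret path are made-up placeholders (Vault's real `X-Vault-Token` header and `/v1/` path prefix are used for flavour), and the request is only constructed, not sent, so it runs without network access:

```python
import json
import urllib.request

def build_vault_request(base_url: str, token: str, path: str) -> urllib.request.Request:
    """Build an authenticated GET request, the pattern used with Vault-style APIs.

    base_url, token, and path are illustrative placeholders.
    """
    return urllib.request.Request(
        url=f"{base_url}/v1/{path}",
        headers={"X-Vault-Token": token, "Accept": "application/json"},
        method="GET",
    )

def parse_secret(raw_body: str) -> dict:
    """Decode the JSON body a call like the above would return."""
    return json.loads(raw_body)["data"]

req = build_vault_request("https://vault.example.com", "s.demo-token", "secret/app")
print(req.full_url)  # https://vault.example.com/v1/secret/app
print(parse_secret('{"data": {"db_password": "hunter2"}}'))
```

With the requests library the equivalent send would be `requests.get(url, headers=...)`; the construction and JSON handling above carry over unchanged.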

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Position Title: Senior Executive - Programmer Analyst. Location: Chennai, India. Band: A2. Designation: Lead Programmer Analyst.

Overview - We are seeking a highly skilled and experienced Lead Programmer Analyst specializing in Microsoft technologies to join our dynamic team. As a Lead Programmer Analyst, you will play a critical role in shaping the success of our technology projects. Your responsibilities will include architecture design, implementation, and overseeing the development and deployment of Microsoft-based solutions that meet our internal needs. You’ll collaborate closely with architects, business analysts, project managers, developers, testers and other stakeholders to ensure the successful delivery of projects within scope, budget, and schedule.

Technical Skills - Microsoft .NET Stack: Proficiency in .NET 8.0, C#, ASP.NET Core, and MVC. Experience with building Web APIs and Minimal APIs. Familiarity with front-end technologies such as React, TypeScript, and NodeJS. Data Persistence and Messaging: Hands-on experience with ORMs (Object-Relational Mappers). Knowledge of messaging and streaming technologies. NoSQL Databases: Understanding of NoSQL databases and their use cases. Microsoft Azure: Designing and implementing cloud-based solutions using Azure services: Azure App Services, Azure Functions, Azure Web Jobs, Azure SQL Database, Azure Storage.

Additional Skills and Value Additions - Experience working in Agile/Scrum environments. Familiarity with Agile methodologies and Scrum practices. Python: general Python skills; data handling using Python; API development using FastAPI or Flask; knowledge of PySpark. Big Data: exposure to technologies such as Databricks and Snowflake; familiarity with Spark.

Good to Have - Relevant Microsoft certifications are a plus. Experience with healthcare data analytics, machine learning, or AI technologies. Certification in healthcare IT (e.g., Certified Professional in Healthcare Information and Management Systems, CHPS).
Soft Skills - Strong communication skills, oral and written. Ability to work with stakeholders across geographies, and to build and sustain teams. Mentor people and create a high-performing organization, fostering talent and resolving conflicts.

Education - Master’s or Bachelor’s degree with good grades from a top-tier engineering college.

Business Domain - US Healthcare Insurance & Payer Analytics: insurance fraud, waste & abuse; recovery audit & utilization review; compliance adherence & coding accuracy; payer management & code classification management.

Requirements & Responsibilities - Architectural Design and Implementation: Design scalable, reliable, and high-performance solutions based on Microsoft technologies, including but not limited to .NET, Azure, SQL Server, and SharePoint Online. Provide expertise in creating robust architectures that align with business objectives. Requirements Gathering and Analysis: Collaborate with stakeholders to understand business objectives and technical requirements. Translate requirements into architectural blueprints and design specifications. Mentoring and Knowledge Transfer: Mentor new engineers, helping them adapt to the software development environment. Share best practices and guide their learning journey. Alignment with Business Goals: Work closely with project managers, business analysts, and quality assurance teams to ensure that technical solutions align with business requirements. Code Quality and Security: Conduct thorough code reviews; enforce coding standards, best practices, and security guidelines. Continuous Learning and Adaptation: Stay informed about emerging technologies, trends, and best practices and evaluate their applicability to ongoing projects and solutions. Troubleshooting and Issue Resolution: Assist in resolving complex technical issues during development or deployment.
Cloud Migration: Lead the migration of on-premises applications to the cloud, specifically leveraging the Microsoft Azure platform.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

FSO Full Stack Developer JD - .Net + Python. Requirements: Full stack developer with 8+ years of experience. Level: Senior 3. Must have: Dotnet, Python, FastAPI, Angular, SQL, Azure. Good to have: React. Resource should be willing to work on new tech stacks/frameworks by upskilling/cross-skilling on the job (as per engagement requirements).

Key Responsibilities: Integrating emerging technologies to enhance audit service delivery quality, solving business/client challenges through technology. Building user-centred applications, combining business knowledge with technology experience to build automation and analytic solutions with a focus on the financial services industry. Performing full life-cycle software development using an agile framework, including delivery and integration with other systems and applications in a cloud-based environment. As a full-stack engineer, you will get to work on a wide range of new products changing the way engineers do research, share knowledge, design, and build new software. Driving various stages of the development lifecycle, ensuring application/software architecture is aligned to stakeholder requirements. Creating and/or modifying application delivery process strategy based on industry trends, experience, leading practices, and the direction of major players. Establishing strategy and business cases for the integration of multiple applications. Leading the design of complex integration strategies and plans and the evaluation of complex system integration initiatives, advising on any necessary course corrections. Developing and maintaining long-term business relationships and networks.
EY | Building a better working world EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

6 - 12 Lacs

Ahmedabad

Work from Office

Perks & Benefits: 5-day work week; group medical insurance; professional development & culture; no bar for the right candidate; no bond policy.

Position Summary: We are looking for a Full-Stack Python Developer with strong expertise in Python and the FastAPI framework to develop and maintain scalable web applications and APIs. The ideal candidate should have a solid understanding of backend development, with experience in handling authentication, API integrations, and task scheduling. Knowledge of React, JavaScript, and modern frontend frameworks is a plus.

Required Skills: Strong proficiency in Python and extensive knowledge of the FastAPI framework. Solid understanding of Python's core concepts (data structures, OOP, exception handling, and performance optimization). Experience with API development, including authentication (OAuth2, JWT), API Gateway, and external API integration. Proficiency in cron jobs and task scheduling to automate processes. Understanding of asynchronous programming and event-driven architectures. Strong experience with Git and version control systems. Good knowledge of HTML, CSS, and JavaScript for effective collaboration with front-end teams.

Preferred Skills: Experience with ReactJS, NextJS, and other front-end libraries. Knowledge of Node.js for building microservices or working across stacks. Experience with Docker, CI/CD pipelines, and cloud platforms (AWS, Azure, GCP). Familiarity with SQL and NoSQL databases. Exposure to caching strategies (Redis, Memcached) and message queues (RabbitMQ, Kafka).

Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 2+ years of hands-on experience in Python and FastAPI development.
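For candidates wondering what the JWT requirement above involves: a JWT is just two base64url-encoded JSON segments plus an HMAC signature. The sketch below hand-rolls HS256 signing and verification with the standard library purely to show the mechanics; a production FastAPI service would use a maintained library (e.g. PyJWT) and also validate claims such as `exp`. The secret and payload here are illustrative.

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # base64url without padding, as JWT requires
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: str) -> str:
    """Produce an HS256 JWT by hand (demonstration only)."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def verify_jwt(token: str, secret: str):
    """Return the payload if the signature checks out, else None."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None
    padded = body + "=" * (-len(body) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_jwt({"sub": "user-42"}, "demo-secret")
print(verify_jwt(token, "demo-secret"))   # {'sub': 'user-42'}
print(verify_jwt(token, "wrong-secret"))  # None
```

The constant-time `hmac.compare_digest` is the important detail: a naive `==` comparison would leak timing information about the signature.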

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Delhi, India

On-site

About Us: Bain & Company is a global consultancy that helps the world’s most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network), with nodes across various geographies. The BCN is an integral part and the largest unit of Expert Client Delivery (ECD). ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutioning across all industries, specific domains for corporate cases, client development, private equity diligence or Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services and Shared Services.

Who You Will Work With: The Consumer Products Center of Expertise collaborates with Bain’s global Consumer Products Practice leadership, client-facing Bain leadership and teams, and with end clients on the development and delivery of Bain’s proprietary CP products and solutions. These solutions aim to answer strategic questions of Bain’s CP clients relating to brand strategy (consumer needs, assortment, pricing, distribution), revenue growth management (pricing strategy, promotions, profit pools, trade terms), negotiation strategy with key retailers, optimization of COGS etc. You will work as part of the team in the CP CoE, comprising a mix of Director, Managers, Project Leads, Associates and Analysts working to implement cloud-based end-to-end advanced analytics solutions.
Delivery models on projects vary from working as part of a CP Center of Expertise, a broader global Bain case team within the CP ringfence, or other industry CoEs such as FS / Retail / TMT / Energy / CME etc. with the BCN on a need basis. The AS is expected to have a knack for seeking out challenging problems and coming up with their own ideas, which they will be encouraged to brainstorm with their peers and managers. They should be willing to learn new techniques and be open to solving problems with an interdisciplinary approach. They must have excellent coding skills and should demonstrate a willingness to write modular, reusable, and functional code.

What You’ll Do: Collaborate with data scientists working with Python, LLMs, NLP, and Generative AI to design, fine-tune, and deploy intelligent agents and chains-based applications. Develop and maintain front-end interfaces for AI and data science applications using React.js / Angular / Next.js and/or Streamlit/Dash, enhancing user interaction with complex machine learning and NLP-driven systems. Build and integrate Python-based machine learning models with backend systems via RESTful APIs using frameworks like FastAPI, Flask or Django. Translate complex business problems into scalable technical solutions, integrating AI capabilities with robust backend and frontend systems. Assist in the design and implementation of scalable data pipelines and ETL workflows using DBT, PySpark, and SQL, supporting both analytics and generative AI solutions. Leverage containerization tools like Docker and utilize Git for version control, ensuring code modularity, maintainability, and collaborative development. Deploy ML-powered and data-driven applications on cloud platforms such as AWS or Azure, optimizing for performance, scalability, and cost-efficiency. Contribute to internal AI/ML Ops platforms and tools, streamlining model deployment, monitoring, and lifecycle management.
Create dashboards, visualizations, and presentations using tools like Tableau/Power BI, Plotly, and Seaborn to drive business insights. Proficiency with Excel and PowerPoint, and strong business communication in stakeholder interactions.

About You: A Master’s degree or higher in Computer Science, Data Science, Engineering, or related fields; Bachelor's candidates with relevant industry experience will also be considered. Proven experience (2 years for Master’s; 3+ years for Bachelor’s) in AI/ML, software development, and data engineering. Solid understanding of LLMs, NLP, Generative AI, chains, agents, and model fine-tuning methodologies. Proficiency in Python, with experience using libraries such as Pandas, NumPy, Plotly, and Seaborn for data manipulation and visualization. Experience working with modern Python frameworks such as FastAPI for backend API development. Frontend development skills using HTML, CSS, JavaScript/TypeScript, and modern frameworks like React.js; Streamlit knowledge is a plus. Strong grasp of data engineering concepts, including ETL pipelines, batch processing using DBT and PySpark, and working with relational databases like PostgreSQL, Snowflake etc. Good working knowledge of cloud infrastructure (AWS and/or Azure) and deployment best practices. Familiarity with MLOps/AIOps tools and workflows, including CI/CD pipelines, monitoring, and container orchestration (with Docker and Kubernetes). Good to have: experience in BI tools such as Tableau or Power BI. Good to have: prior exposure to consulting projects or the CP (Consumer Products) business domain.

What Makes Us a Great Place To Work: We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years.
We believe that diversity, inclusion and collaboration is key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote

🚀 We're Hiring: Founding Backend Python Developer @ Vakta.AI 🚀 Ready to build the backbone of India's next-generation conversational AI platform? Join us as a Founding Backend Python Developer at Vakta.AI and architect systems that will power millions of intelligent conversations. At Vakta.AI, we're developing India's most advanced conversational AI infrastructure, from high-performance API gateways to real-time chat engines and scalable AI model serving platforms. We're building the robust foundation that makes seamless human-AI interaction possible at massive scale.

Why this is an exceptional opportunity: You'll be a founding team member, working directly with our core team to design the entire backend architecture and technical infrastructure from the ground up. Competitive salary + significant ESOPs: we want you to grow with us and share in our success story. You'll solve complex distributed systems challenges that directly enable breakthrough AI experiences for millions of users. Tremendous growth potential, complete ownership, and the chance to build and mentor your own backend engineering team as we expand.

What we're looking for: Strong expertise in Python backend development (Django/FastAPI/Flask, async programming, microservices). Experience with databases (PostgreSQL, MongoDB, Redis), message queues, and distributed systems. Someone who loves building robust, scalable systems and can architect production-ready solutions from scratch. A startup mindset: proactive, solution-oriented, and excited about tackling complex technical challenges.

Bonus points for: Experience with cloud platforms (AWS/GCP/Azure) and containerization (Docker/Kubernetes). Knowledge of API design, WebSocket implementations, and real-time systems. Understanding of ML model deployment and serving infrastructure.

Additional details: Great salary + equity (ESOPs), competitive package, negotiable for the right candidate. We need someone who can join immediately (or very soon!)
Location: Hybrid/Remote with flexibility

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description - The Position: We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You’ll work within the Engineering team to build and improve our platforms to deliver flexible and creative solutions to our utility partners and end users, and help us achieve our ambitious goals for our business and the planet. We are seeking a skilled and passionate Senior Software Engineer with expertise in Python and React to join our development team. As a Senior Fullstack Developer, you will be a crucial member of our development team, responsible for leading and driving the development of complex, scalable, and high-performance Python-based applications. One of your main focuses will be developing and supporting efficient, reusable and highly scalable APIs and components to deliver a compelling experience to users across platforms. You will collaborate with cross-functional teams, mentor junior developers, and coordinate with the rest of the team working on different layers of the infrastructure. Therefore, a commitment to collaborative problem solving, sophisticated design, and a quality product is important. You will take part in planning and strategy to come up with solutions, with full ownership. You will own the development and its quality independently and be responsible for high-quality deliverables. And you will work with a great team with excellent benefits.

Responsibilities & Skills - You should: Be excited to work with talented, committed people in a fast-paced environment. Use a data-driven approach and actively work on the product & technology roadmap at the strategy level and the day-to-day tactical level. Design, build, and maintain high-performance responsive web applications and dashboards with reusable and reliable code. Use a rigorous approach to product improvement and customer satisfaction. Love developing great software as a seasoned product engineer.
Be ready, able, and willing to jump onto a call with a partner or customer to help solve problems. Have a strong eye for detail and quality of code. Have an agile mindset. Have strong problem-solving skills and attention to detail. Have the ability to understand business requirements and translate them into technical requirements, and to deliver against several initiatives simultaneously as a multiplier.

Required Skills (Python): You are an experienced developer, with a minimum of 7+ years of professional experience. Python experience, preferably both 2.7 and 3.x. Strong Python knowledge: familiar with OOP, data structures and algorithms. Work experience and strong proficiency in Python and its associated frameworks (Flask, FastAPI, etc.). Experience in designing and implementing scalable microservice architecture. Familiarity with RESTful APIs and integration of third-party APIs. 3+ years building and managing APIs to industry-accepted RESTful standards. Demonstrable experience with writing unit and functional tests. Application of industry security best practices to application and system development. Experience with database systems such as PostgreSQL, MySQL, or MongoDB.

Required Skills (React): React experience, preferably React 15 or higher, 2+ years. Thorough understanding of React.js and its core principles. Familiarity with newer specifications of ECMAScript. Experience with popular React.js workflows (such as Flux or Redux). Experience with modern front-end build pipelines and tools such as Babel, Webpack, NPM, etc.
A knack for benchmarking and optimization. Demonstrable experience with writing unit and functional tests.

Preferred: The following experiences are not required, but you'll stand out from other applicants if you have any of the following, in our order of importance: Experience with cloud infrastructure such as AWS/GCP or another cloud service provider. Serverless architecture, preferably AWS Lambda. Experience with the PySpark, Pandas, SciPy, and NumPy libraries is a plus. Experience in microservices architecture. Solid CI/CD experience. You are a Git guru and revel in collaborative workflows. You work on the command line confidently and are familiar with all the goodies the Linux toolkit can provide. Knowledge of modern authorization mechanisms, such as JSON Web Tokens.

Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Posted 2 weeks ago

Apply

4.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Application Developer – Open Source (Python Developer) Experience: 4 to 10 Years Location: Chennai, India Job Type: Full-Time Job Summary We are seeking a skilled Python Developer with strong knowledge and hands-on experience in designing, building, and maintaining scalable, secure, and high-performance applications. The ideal candidate should be comfortable working in a fast-paced environment and collaborating with cross-functional teams. Key Responsibilities Design, develop, test, and deploy Python-based applications. Build and maintain scalable and secure backend services and APIs. Optimize performance and ensure high availability and responsiveness. Collaborate with product managers, designers, and other developers to deliver quality software. Write clean, maintainable, and efficient code following best practices. Participate in code reviews and contribute to team knowledge sharing. Must-Have Skills Strong programming skills in Python Experience with one or more Python frameworks (e.g., Django, Flask, FastAPI) Knowledge of RESTful APIs and integration practices Familiarity with version control systems like Git Understanding of security, performance tuning, and scalability principles Good-to-Have Skills Experience with containerization tools (e.g., Docker) Exposure to CI/CD pipelines Knowledge of relational and/or NoSQL databases Familiarity with cloud platforms (e.g., AWS, Azure, GCP) Preferred Qualifications Bachelor’s degree in Computer Science, Information Technology, or a related field Strong problem-solving skills and the ability to work independently and collaboratively

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Position: Python Developer. Experience: 4-6 years. Job Location: Noida Sector 126.

Roles and Responsibilities: Define scalable event-driven pipelines capable of handling lakhs to millions of messages per minute using Kafka. Develop and optimize Python services using Flask, FastAPI (or Django). Build REST APIs, CLI tools, and components interfacing with legacy systems (e.g., AMOS, CORBA). High-Volume Event Processing: Design, implement, and optimize Kafka producers/consumers; manage clusters, topics, partitions, and schema registries. Ensure pipeline reliability, performance, and durable stream handling. Database Design & Optimization: Architect PostgreSQL schemas optimized for concurrency, partitioning, and real-time analytics. Troubleshoot performance bottlenecks and optimize query execution. Mentorship & Leadership: Mentor junior developers. Define engineering best practices and set standards across the team.

Required Qualifications: 4+ years of Python experience building production services. Proficiency with Flask, FastAPI, or Django. Mandatory deep experience with Apache Kafka. Strong PostgreSQL skills: schema design, indexing, partitioning, high concurrency. Strong debugging, profiling, and optimization skills. Proven track record of mentoring junior team members. Good communication, collaboration, and documentation habits.

Preferred Skills: Experience with RabbitMQ or Celery. Familiarity with NoSQL databases (e.g., MongoDB, Cassandra). Knowledge of stream-processing frameworks (e.g., Kafka Streams, Spark, Flink). Candidates from the telecom industry/projects are preferred.
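The Kafka producer/consumer pattern this role centres on can be modelled without a broker. The sketch below uses an `asyncio.Queue` as a stand-in for a topic: the bounded queue plays the role of broker backpressure, and a sentinel value stands in for end-of-stream. With real Kafka you would use a client library and commit offsets instead; all names here are illustrative.

```python
import asyncio

async def producer(queue: asyncio.Queue, n_events: int) -> None:
    """Publish events; with Kafka this would be producer.send(topic, value=...)."""
    for i in range(n_events):
        await queue.put({"event_id": i, "payload": f"msg-{i}"})
    await queue.put(None)  # sentinel: end of stream

async def consumer(queue: asyncio.Queue, processed: list) -> None:
    """Consume and process events; with Kafka this is the poll/commit loop."""
    while True:
        event = await queue.get()
        if event is None:
            break
        processed.append(event["event_id"])

async def run_pipeline(n_events: int) -> list:
    # Bounded queue: producers block when the consumer falls behind,
    # the same backpressure role a broker's buffering plays.
    queue: asyncio.Queue = asyncio.Queue(maxsize=100)
    processed: list = []
    await asyncio.gather(producer(queue, n_events), consumer(queue, processed))
    return processed

print(asyncio.run(run_pipeline(5)))  # [0, 1, 2, 3, 4]
```

At the volumes the posting mentions, partitioning the topic and running one consumer per partition is what makes this pattern scale horizontally.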

Posted 2 weeks ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Us: Yubi stands for ubiquitous. But Yubi will also stand for transparency, collaboration, and the power of possibility. From being a disruptor in India’s debt market to marching towards global corporate markets, and from one product to a holistic product suite of seven products, Yubi is the place to unleash potential. Freedom, not fear. Avenues, not roadblocks. Opportunity, not obstacles.

About Yubi: Yubi, formerly known as CredAvenue, is re-defining global debt markets by freeing the flow of finance between borrowers, lenders, and investors. We are the world's possibility platform for the discovery, investment, fulfillment, and collection of any debt solution. At Yubi, opportunities are plenty and we equip you with tools to seize them. In March 2022, we became India's fastest fintech and most impactful startup to join the unicorn club with a Series B fundraising round of $137 million. In 2020, we began our journey with a vision of transforming and deepening the global institutional debt market through technology. Our two-sided debt marketplace helps institutional and HNI investors find the widest network of corporate borrowers and debt products on one side, and helps corporates discover investors and access debt capital efficiently on the other. Switching between platforms is easy, which means investors can lend, invest and trade bonds, all in one place. All of our platforms shake up the traditional debt ecosystem and offer new ways of digital finance. Yubi Credit Marketplace - With the largest selection of lenders on one platform, our credit marketplace helps enterprises partner with lenders of their choice for any and all capital requirements.
Yubi Invest - Fixed income securities platform for wealth managers & financial advisors to channel client investments in fixed income. Financial Services Platform - Designed for financial institutions to manage co-lending partnerships & asset-based securitization. Spocto - Debt recovery & risk mitigation platform. Corpository - Dedicated SaaS solutions platform powered by decision-grade data, analytics, pattern identification, early warning signals and predictions for lenders, investors and business enterprises. So far, we have on-boarded over 17,000+ enterprises, 6,200+ investors & lenders and have facilitated debt volumes of over INR 1,40,000 crore. Backed by marquee investors like Insight Partners, B Capital Group, Dragoneer, Sequoia Capital, LightSpeed and Lightrock, we are the only-of-its-kind debt platform globally, revolutionizing the segment. At Yubi, people are at the core of the business and our most valuable assets. Yubi is constantly growing, with 1000+ like-minded individuals today who are changing the way people perceive debt. We are a fun bunch who are highly motivated and driven to create a purposeful impact. Come, join the club to be a part of our epic growth story.

Job Summary: We are looking for a highly skilled Data Scientist (LLM) to join our AI and Machine Learning team. The ideal candidate will have a strong foundation in Machine Learning (ML), Deep Learning (DL), and Large Language Models (LLMs), along with hands-on experience in building and deploying conversational AI/chatbots. The role requires expertise in LLM agent development frameworks such as LangChain, LlamaIndex, AutoGen, and LangGraph. You will work closely with cross-functional teams to drive the development and enhancement of AI-powered applications.

Key Responsibilities: Develop, fine-tune, and deploy Large Language Models (LLMs) for various applications, including chatbots, virtual assistants, and enterprise AI solutions.
Build and optimize conversational AI solutions, with at least 1 year of experience in chatbot development. Implement and experiment with LLM agent development frameworks such as LangChain, LlamaIndex, AutoGen, and LangGraph. Design and develop ML/DL-based models to enhance natural language understanding capabilities. Work on retrieval-augmented generation (RAG) and vector databases (e.g., FAISS, Pinecone, Weaviate, ChromaDB) to enhance LLM-based applications. Optimize and fine-tune transformer-based models such as GPT, LLaMA, Falcon, Mistral, Claude, etc. for domain-specific tasks. Develop and implement prompt engineering techniques and fine-tuning strategies to improve LLM performance. Work on AI agents, multi-agent systems, and tool-use optimization for real-world business applications. Develop APIs and pipelines to integrate LLMs into enterprise applications. Research and stay up to date with the latest advancements in LLM architectures, frameworks, and AI trends.

Requirements - Required Skills & Qualifications: 3-5 years of experience in Machine Learning (ML), Deep Learning (DL), and NLP-based model development. Hands-on experience in developing and deploying conversational AI/chatbots is a plus. Strong proficiency in Python and experience with ML/DL frameworks such as TensorFlow, PyTorch, and Hugging Face Transformers. Experience with LLM agent development frameworks like LangChain, LlamaIndex, AutoGen, LangGraph. Knowledge of vector databases (e.g., FAISS, Pinecone, Weaviate, ChromaDB) and embedding models. Understanding of prompt engineering and fine-tuning LLMs. Familiarity with cloud services (AWS, GCP, Azure) for deploying LLMs at scale. Experience in working with APIs, Docker, and FastAPI for model deployment. Strong analytical and problem-solving skills. Ability to work independently and collaboratively in a fast-paced environment.

Good to Have: Experience with multi-modal AI models (text-to-image, text-to-video, speech synthesis, etc.).
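As a toy illustration of the retrieval step in the RAG work described above: real systems embed text with a neural model and query a vector database (FAISS, Pinecone, etc.), but the ranking logic is cosine similarity either way. The sketch below substitutes word-count vectors for learned embeddings; the documents are made up.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: term counts. Real RAG uses dense model embeddings."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list, k: int = 1) -> list:
    """Rank documents by similarity to the query: the retrieval step of RAG."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "loan disbursal happens after lender approval",
    "bond trading settles on a T+1 cycle",
    "chatbot escalation routes to a human agent",
]
print(retrieve("when does bond trading settle", docs))
```

The top-ranked document is then injected into the LLM prompt as context, which is the "augmented generation" half of RAG.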
Knowledge of knowledge graphs and symbolic AI.
Understanding of MLOps and LLMOps for deploying scalable AI solutions.
Experience in automated evaluation of LLMs and bias mitigation techniques.
Research experience or published work in LLMs, NLP, or Generative AI is a plus.

Why Join Us?
Opportunity to work on cutting-edge LLM and Generative AI projects.
Collaborative and innovative work environment.
Competitive salary and benefits.
Career growth opportunities in AI and ML research and development.
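The retrieval step behind the RAG work this listing describes can be sketched in a few lines. This is an illustrative brute-force version using toy, made-up embeddings; in practice a vector database such as FAISS, Pinecone, Weaviate, or ChromaDB performs this nearest-neighbour search at scale over real embedding vectors.

```python
# Sketch of embedding-based retrieval: score a query vector against
# stored document vectors and return the closest matches.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec, doc_vecs, k=2):
    """Return indices of the k documents most similar to the query."""
    scored = sorted(range(len(doc_vecs)),
                    key=lambda i: cosine_similarity(query_vec, doc_vecs[i]),
                    reverse=True)
    return scored[:k]

# Toy 3-dimensional "embeddings" for three documents and one query.
docs = [[1.0, 0.0, 0.0], [0.9, 0.1, 0.0], [0.0, 1.0, 0.0]]
query = [1.0, 0.05, 0.0]
print(retrieve(query, docs))  # → [0, 1]
```

At scale the same idea holds: embed the query, score it against stored document embeddings, and pass the top-k passages to the LLM as context.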

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

🚀 Hiring: .NET Full Stack Developer
📍 Location: Hyderabad, India
📅 Experience: 4 – 6 Years
📧 Interested candidates can send their resumes to: ayesha@coretek.io

We are looking for a passionate and skilled .NET Full Stack Developer to join our team!

Key Skills Required:
✅ .NET full stack development
✅ Python server-side scripting and API development
✅ Python frameworks: Flask, Django, or FastAPI
✅ Azure Cloud development
✅ JavaScript, HTML, and CSS

#DotNet #FullStackDeveloper #HyderabadJobs #PythonDeveloper #AzureDeveloper #Flask #Django #FastAPI #JavaScriptDeveloper #FrontendDeveloper #BackendDeveloper #NowHiring #ITJobs #TechJobsIndia #HiringAlert #CoretekJobs

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Software Engineer (Next.js + FastAPI)
Job Type: Full-Time, Contractor
Location: Pune / Gurugram

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary: We're looking for an experienced Software Engineer with strong hands-on expertise in Next.js and FastAPI to join our growing engineering team. In this role, you will take end-to-end ownership of features, work across the frontend and backend stack, and collaborate closely with product and design teams. If you're passionate about crafting performant web applications, managing complex tasks effectively, and communicating your ideas clearly, we'd love to hear from you.

Key Responsibilities:
Design, develop, and deploy modern web applications using Next.js (React) and FastAPI.
Build scalable APIs and backend services with performance and maintainability in mind.
Translate product requirements into high-quality, testable, and maintainable code.
Manage project tasks, timelines, and priorities with minimal supervision.
Collaborate with designers, product managers, and fellow engineers to deliver impactful user experiences.
Conduct code reviews, identify and fix bugs, and help maintain a high standard of code quality.
Stay current with emerging trends in full-stack development and propose improvements proactively.

Required Skills and Qualifications:
7+ years of relevant full-stack development experience.
Strong proficiency in Next.js, React, and modern JavaScript/TypeScript.
Hands-on experience with FastAPI, Python, and asynchronous backend patterns.
Solid knowledge of RESTful APIs, microservices, and modern software architecture.
Ability to manage tasks independently and communicate clearly with stakeholders.
Excellent problem-solving skills and a bias for action.
Strong verbal and written communication abilities.

Preferred Qualifications:
Experience working with cloud infrastructure (AWS, GCP, or Azure).
Familiarity with Docker, CI/CD pipelines, and scalable deployment workflows.
Previous experience in a leadership, mentoring, or tech lead role.
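A minimal sketch of the asynchronous backend pattern the requirements mention, using only the standard library. The fetch_user / fetch_orders names are hypothetical stand-ins for real I/O (a database query, an HTTP call); inside a FastAPI handler the same awaits would apply.

```python
# Independent I/O-bound calls awaited concurrently rather than
# sequentially, so total latency tracks the slowest call, not the sum.
import asyncio

async def fetch_user(user_id: int) -> dict:
    await asyncio.sleep(0.01)  # stands in for a DB or HTTP call
    return {"id": user_id, "name": f"user-{user_id}"}

async def fetch_orders(user_id: int) -> list:
    await asyncio.sleep(0.01)
    return [{"user_id": user_id, "order": "A-1"}]

async def profile(user_id: int) -> dict:
    # gather() runs both coroutines concurrently on the event loop.
    user, orders = await asyncio.gather(fetch_user(user_id),
                                        fetch_orders(user_id))
    return {**user, "orders": orders}

result = asyncio.run(profile(7))
print(result["name"])  # → user-7
```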

Posted 2 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

🚀 AI Engineering Intern (SDE) – Founding Tech Interns | Opportunity of a Lifetime
Location: Gurgaon (In-Office)
Duration: 3–6 months (flexible based on academic schedule)
Start Date: Immediate openings
Open to: Tier 1 college students graduating in 2025 and 2026
Compensation: Stipend + Pre-Placement Offer potential

🧠 About Us – Darwix AI
Darwix AI is on a mission to solve a problem no one's cracked yet: building real-time, multilingual conversational intelligence for omnichannel enterprise sales teams using the power of Generative AI. We're building India's answer to Gong + Refract + Harvey AI, trained on 1M+ hours of sales conversations and packed with industry-first features like live agent coaching, speech-to-text in 11 Indic languages, and autonomous sales enablement nudges. We've got global clients, insane velocity, and a team of ex-operators from IIMs, IITs, and top-tier AI labs.

🌌 Why This Internship is Unlike Anything Else
Work on a once-in-a-decade problem, pushing the boundaries of GenAI + speech + edge compute.
Ship real products used by enterprise teams across India & the Middle East.
Experiment freely: train models, optimize pipelines, fine-tune LLMs, or build scrapers that work in 5 languages.
Move fast, learn faster, with direct mentorship from the founding engineering and AI team.
Proof-of-excellence opportunity: stand out in every future job, B-school, or YC application.

💻 What You'll Do
Build and optimize core components of our real-time agent assist engine (Python + FastAPI + Kafka + Redis).
Train, evaluate, and integrate Whisper, Wav2Vec, or custom STT models on diverse datasets.
Work on LLM/RAG pipelines, prompt engineering, or vector DB integrations.
Develop internal tools to analyze, visualize, and scale insights from conversations across languages.
Optimize for latency, reliability, and multilingual accuracy in dynamic customer environments.

🌟 Who You Are
Pursuing a B.Tech/B.E. or dual degree from IITs, IIITs, BITS, NIT Trichy/Warangal/Surathkal, or other Tier-1 institutes, preferably in Computer Science or allied fields.
Comfortable with Python, REST APIs, and database operations. Bonus: familiarity with FastAPI, LangChain, or Hugging Face.
Passionate about AI/ML, especially NLP, GenAI, ASR, or multimodal systems.
Always curious, always shipping, always pushing yourself beyond the brief.
Looking for an internship that actually matters, not one where you're just fixing CSS.

🌐 Tech You'll Touch
Python, FastAPI, Kafka, Redis, MongoDB, Postgres
Whisper, Deepgram, Wav2Vec, Hugging Face Transformers
OpenAI, Anthropic, Gemini APIs
LangChain, FAISS, Pinecone, LlamaIndex
Docker, GitHub Actions, Linux environments

🎯 What's in It for You
A pre-placement offer for the best performers.
A chance to be a founding engineer post-graduation.
Exposure to the VC ecosystem, client demos, and GTM strategies.
Stipend + access to the tools, courses, and compute resources you need to thrive.

🚀 Ready to Build the Future?
If you're one of those rare folks who can combine deep tech with deep curiosity, this is your call to adventure. Join us in building something that's never been done before. Apply now at careers@cur8.in. Attach your CV + GitHub/portfolio + a line on why this excites you. Bonus points if you share a project you've built or an AI problem you're obsessed with.

Darwix AI | GenAI for Revenue Teams | Built from India for the World

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Experience: 5 to 10 years

Responsibilities:
• Design, develop, and implement well-tested, reusable, and maintainable Python code.
• Utilize various Python libraries and frameworks (e.g., FastAPI, Django, Flask, Pandas, NumPy) to implement functionality.
• Integrate various data sources (APIs, databases) to manipulate and analyze data.
• Optimize code for performance, scalability, and security.
• Write unit and integration tests for code coverage and stability.
• Collaborate with designers and other developers to translate requirements into efficient solutions.
• Participate in code reviews, providing constructive feedback to improve code quality.
• Stay up to date with the latest Python trends, libraries, and best practices.
• Debug and troubleshoot complex issues to ensure optimal application performance.
• Proactively suggest improvements and optimizations to the existing codebase.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Position: AI + Backend Engineer
Location: Gurugram
Experience Required: 7–8+ years

We are looking for a highly capable and innovative AI + Backend Engineer to join our dynamic team in Gurugram. The ideal candidate should bring in-depth experience working with Large Language Models (LLMs), Natural Language Processing (NLP), and intelligent agent frameworks, combined with solid backend development skills. This role requires a strong grasp of machine learning operations, API-centric architectures, and distributed systems.

Role & Responsibilities:
Design and develop advanced AI solutions leveraging LLMs for tasks like content generation, entity extraction, summarization, and intelligent agent development.
Fine-tune and implement models for dense retrieval, question answering, and semantic search use cases.
Build and maintain high-performance backend systems and RESTful APIs that support AI model integration.
Use vector databases and embedding-based retrieval methods (e.g., Pinecone, FAISS, Weaviate) to enhance model outputs.
Apply Retrieval-Augmented Generation (RAG) frameworks to support dynamic, context-aware responses.
Work closely with ML engineers and researchers to streamline model training, deployment, and optimization processes.
Monitor and enhance the performance of deployed models and inference pipelines in production environments.
Set up and manage end-to-end data pipelines for preprocessing, training, and deployment workflows.
Stay abreast of the latest research and industry developments in LLMs, NLP, and AI agents to continuously improve solutions.

Key Requirements:
7–8+ years of experience in AI/ML systems development and backend engineering, with a strong focus on NLP and LLM-based systems.
Expertise in Python and deep learning libraries such as PyTorch, TensorFlow, and Hugging Face Transformers.
Strong experience in model fine-tuning, prompt engineering, and building intelligent agent-based applications.
Practical knowledge of working with vector databases and retrieval mechanisms.
Proficiency in building and scaling backend platforms using frameworks like FastAPI, Flask, or Django.
Solid experience with cloud platforms like AWS, Google Cloud Platform, or Azure for AI deployment.
Skilled in using Docker, Kubernetes, and related tools for model containerization and orchestration.
Familiarity with MLOps practices, CI/CD pipelines, and model version control.
Excellent analytical, debugging, and communication skills, with the ability to work effectively in agile, cross-functional teams.
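The Retrieval-Augmented Generation step named in the responsibilities can be illustrated with a small prompt-assembly helper. build_rag_prompt is a hypothetical function for illustration, not an API of LangChain or any framework mentioned above; real pipelines add retrieval, token budgeting, and citation handling.

```python
# Stitch retrieved passages into the prompt so the LLM answers from
# supplied context rather than from its parametric memory alone.
def build_rag_prompt(question: str, passages: list[str]) -> str:
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer the question using only the context below. "
        "Cite passages by number.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

prompt = build_rag_prompt(
    "What is co-lending?",
    ["Co-lending lets two lenders jointly fund a loan.",
     "Securitization pools assets into tradable securities."],
)
print(prompt.splitlines()[0])
```

The string returned here would be sent to the model; the numbered passages are what make grounded, citable answers possible.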

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Bangalore North Rural, Karnataka, India

On-site

About Us
Yubi stands for ubiquitous. But Yubi will also stand for transparency, collaboration, and the power of possibility. From being a disruptor in India's debt market to marching towards global corporate markets, and from one product to a holistic suite of seven products, Yubi is the place to unleash potential. Freedom, not fear. Avenues, not roadblocks. Opportunity, not obstacles.

About Yubi
Yubi, formerly known as CredAvenue, is redefining global debt markets by freeing the flow of finance between borrowers, lenders, and investors. We are the world's possibility platform for the discovery, investment, fulfillment, and collection of any debt solution. At Yubi, opportunities are plentiful, and we equip you with the tools to seize them. In March 2022, we became India's fastest fintech and most impactful startup to join the unicorn club with a Series B fundraising round of $137 million. In 2020, we began our journey with a vision of transforming and deepening the global institutional debt market through technology. Our two-sided debt marketplace helps institutional and HNI investors find the widest network of corporate borrowers and debt products on one side, and helps corporates discover investors and access debt capital efficiently on the other. Switching between platforms is easy, which means investors can lend, invest, and trade bonds, all in one place. All of our platforms shake up the traditional debt ecosystem and offer new ways of digital finance.

Yubi Credit Marketplace - With the largest selection of lenders on one platform, our credit marketplace helps enterprises partner with lenders of their choice for any capital requirements.
Yubi Invest - Fixed income securities platform for wealth managers & financial advisors to channel client investments in fixed income
Financial Services Platform - Designed for financial institutions to manage co-lending partnerships & asset-based securitization
Spocto - Debt recovery & risk mitigation platform
Accumn - Dedicated SaaS solutions platform powered by decision-grade data, analytics, pattern identification, early warning signals, and predictions for lenders, investors, and business enterprises

So far, we have onboarded over 17,000 enterprises and 6,200+ investors & lenders and have facilitated debt volumes of over INR 1,40,000 crore. Backed by marquee investors like Insight Partners, B Capital Group, Dragoneer, Sequoia Capital, LightSpeed, and Lightrock, we are a one-of-its-kind debt platform globally, revolutionizing the segment.

At Yubi, people are at the core of the business and our most valuable assets. Yubi is constantly growing, with 1000+ like-minded individuals today who are changing the way people perceive debt. We are a fun bunch who are highly motivated and driven to create a purposeful impact. Come join the club to be a part of our epic growth story.

About The Job
Job Title: Data Scientist 2 (LLM/GenAI)
Location: Bangalore
Experience: 2 - 4 years
Employment Type: Full-time

Job Summary: We seek a highly skilled Data Scientist (LLM) to join our AI and Machine Learning team. The ideal candidate will have a strong foundation in Machine Learning (ML), Deep Learning (DL), and Large Language Models (LLMs), along with hands-on experience in building and deploying conversational AI/chatbots. The role requires expertise in LLM agent development frameworks such as LangChain, LlamaIndex, AutoGen, and LangGraph. You will work closely with cross-functional teams to drive the development and enhancement of AI-powered applications.
Key Responsibilities:
Develop, fine-tune, and deploy Large Language Models (LLMs) for various applications, including chatbots, virtual assistants, and enterprise AI solutions.
Build and optimize conversational AI solutions, with at least 1 year of experience in chatbot development.
Implement and experiment with LLM agent development frameworks such as LangChain, LlamaIndex, AutoGen, and LangGraph.
Design and develop ML/DL-based models to enhance natural language understanding capabilities.
Work on retrieval-augmented generation (RAG) and vector databases (e.g., FAISS, Pinecone, Weaviate, ChromaDB) to enhance LLM-based applications.
Optimize and fine-tune transformer-based models such as GPT, LLaMA, Falcon, Mistral, Claude, etc., for domain-specific tasks.
Develop and implement prompt engineering techniques and fine-tuning strategies to improve LLM performance.
Work on AI agents, multi-agent systems, and tool-use optimization for real-world business applications.
Develop APIs and pipelines to integrate LLMs into enterprise applications.
Research and stay up-to-date with the latest advancements in LLM architectures, frameworks, and AI trends.

Requirements
Required Skills & Qualifications:
2-4 years of experience in Machine Learning (ML), Deep Learning (DL), and NLP-based model development.
Hands-on experience in developing and deploying conversational AI/chatbots is a plus.
Strong proficiency in Python and experience with ML/DL frameworks such as TensorFlow, PyTorch, and Hugging Face Transformers.
Experience with LLM agent development frameworks like LangChain, LlamaIndex, AutoGen, and LangGraph.
Knowledge of vector databases (e.g., FAISS, Pinecone, Weaviate, ChromaDB) and embedding models.
Understanding of prompt engineering and fine-tuning LLMs.
Familiarity with cloud services (AWS, GCP, Azure) for deploying LLMs at scale.
Experience working with APIs, Docker, and FastAPI for model deployment.
Strong analytical and problem-solving skills.
Ability to work independently and collaboratively in a fast-paced environment.

Good to Have:
Experience with multi-modal AI models (text-to-image, text-to-video, speech synthesis, etc.).
Knowledge of knowledge graphs and symbolic AI.
Understanding of MLOps and LLMOps for deploying scalable AI solutions.
Experience in automated evaluation of LLMs and bias mitigation techniques.
Research experience or published work in LLMs, NLP, or Generative AI is a plus.

Benefits
Why Join Us?
This is an opportunity to work on cutting-edge LLM and Generative AI projects.
Collaborative and innovative work environment.
Competitive salary and benefits.
Career growth opportunities in AI and ML research and development.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chandigarh

On-site

Role Overview
We are looking for a Python AI/ML Developer with a passion for Large Language Models (LLMs), rapid prototyping, and scalable backend development. If you're excited about building AI-native applications using frameworks like FastAPI, Django, Gradio, and Streamlit, and you've dabbled in digital or affiliate marketing, we want to hear from you.

Key Responsibilities
Build and deploy AI-powered applications using Python and LLM APIs (e.g., OpenAI, LLaMA, Mistral).
Develop RESTful and asynchronous APIs using FastAPI and Django.
Integrate and manage databases like PostgreSQL and MongoDB.
Create intuitive frontend UIs with Gradio and Streamlit.
Design and fine-tune prompts for various LLM use cases (text generation, classification, semantic search, etc.).
Collaborate with product, design, and marketing teams to translate ideas into production-ready tools.
Bonus: apply your understanding of digital marketing funnels or affiliate campaigns in product design.

Requirements
Strong foundation in Python and a basic understanding of AI/ML workflows.
Exposure to one or more LLM APIs (OpenAI, Cohere, Hugging Face models, etc.).
Working knowledge of FastAPI, Django, PostgreSQL, and MongoDB.
Experience or project work using Gradio and/or Streamlit.
Demonstrated ability to write effective prompts and build LLM chains or workflows.
Strong problem-solving mindset and eagerness to learn.
Git and version control proficiency.

Bonus Points
Knowledge of digital marketing, affiliate marketing, or performance marketing.
Knowledge of LangChain, LLMOps, or RAG-based pipelines.

Job Type: Full-time
Work Location: In person
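The "LLM chains or workflows" requirement above can be sketched as prompt templates composed into steps. The stub_llm below is a stand-in for a real LLM API call (OpenAI, LLaMA, Mistral, etc.) so the wiring is visible without network access; all names are illustrative, not from any real library.

```python
# A chain step = a prompt template plus the model call that fills it;
# the output of one step becomes the input of the next.
def make_step(template: str, llm):
    """Return a callable that fills the template and sends it to the LLM."""
    def step(**kwargs):
        return llm(template.format(**kwargs))
    return step

def stub_llm(prompt: str) -> str:
    # Stand-in for an API call; echoes so the chain's wiring is visible.
    return f"LLM_RESPONSE({prompt})"

classify = make_step("Classify the sentiment of: {text}", stub_llm)
summarize = make_step("Summarize in one line: {text}", stub_llm)

first = classify(text="Great product!")
second = summarize(text=first)  # output of one step feeds the next
print(second)
```

Frameworks like LangChain wrap this same composition pattern with retries, streaming, and tracing on top.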

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

PwC AC is hiring for Data Scientist. Apply and get a chance to work with one of the Big 4 companies, PwC AC.

Job Title: Data Scientist
Years of Experience: 3-7 years
Shift Timings: 11 AM - 8 PM
Qualification: Graduate and above (full time)

About PwC CTIO – AI Engineering
PwC's Commercial Technology and Innovation Office (CTIO) is at the forefront of emerging technology, focused on building transformative AI-powered products and driving enterprise innovation. The AI Engineering team within CTIO is dedicated to researching, developing, and operationalizing cutting-edge technologies such as Generative AI, Large Language Models (LLMs), AI Agents, and more. Our mission is to continuously explore what's next, enabling business transformation through scalable AI/ML solutions while remaining grounded in research, experimentation, and engineering excellence.

Role Overview
We are seeking a Senior Associate – Data Science/ML/DL/GenAI to join our high-impact, entrepreneurial team. This individual will play a key role in designing and delivering scalable AI applications, conducting applied research in GenAI and deep learning, and contributing to the team's innovation agenda. This is a hands-on, technical role ideal for professionals passionate about AI-driven transformation.

Key Responsibilities
Design, develop, and deploy machine learning, deep learning, and Generative AI solutions tailored to business use cases.
Build scalable pipelines using Python (and frameworks such as Flask/FastAPI) to operationalize data science models in production environments.
Prototype and implement solutions using state-of-the-art LLM frameworks such as LangChain, LlamaIndex, or LangGraph, and develop Streamlit/Chainlit applications for demo purposes.
Design advanced prompts and develop agentic LLM applications that autonomously interact with tools and APIs.
Fine-tune and pre-train LLMs (using Hugging Face and similar libraries) to align with business objectives.
Collaborate in a cross-functional setup with ML engineers, architects, and product teams to co-develop AI solutions.
Conduct R&D in NLP, CV, and multi-modal tasks, and evaluate model performance with production-grade metrics.
Stay current with AI research and industry trends; continuously upskill to integrate the latest tools and methods into the team's work.

Required Skills & Experience
3 to 7 years of experience in Data Science/ML/AI roles.
Bachelor's degree in Computer Science, Engineering, or an equivalent technical discipline (BE/BTech/MCA).
Proficiency in Python and related data science libraries: Pandas, NumPy, SciPy, Scikit-learn, TensorFlow, PyTorch, Keras, etc.
Hands-on experience with Generative AI, including prompt engineering, LLM fine-tuning, and deployment.
Experience with agentic LLMs and task orchestration using tools like LangGraph or AutoGPT-like flows.
Strong knowledge of NLP techniques, transformer architectures, and text analysis.
Proven experience working with cloud platforms (preferably Azure; AWS/GCP also considered).
Understanding of production-level AI systems, including CI/CD, model monitoring, and cloud-native architecture (need not develop from scratch).
Familiarity with ML algorithms: XGBoost, GBM, k-NN, SVM, Decision Forests, Naive Bayes, Neural Networks, etc.
Exposure to deploying AI models via APIs and integrating them into larger data ecosystems.
Strong understanding of model operationalization and lifecycle management.

Good to Have
Experience with Docker, Kubernetes, and containerized deployments for ML workloads.
Use of MLOps tooling and pipelines (e.g., MLflow, Azure ML, SageMaker).
Experience in full-stack AI applications, including visualization (e.g., Power BI, D3.js).
Demonstrated track record of delivering AI-driven solutions as part of large-scale systems.

Soft Skills & Team Expectations
Strong written and verbal communication; able to explain complex models to business stakeholders.
Ability to independently document work, manage requirements, and self-drive technical discovery.
Desire to innovate, improve, and automate existing processes and solutions.
Active contributor to team knowledge sharing, technical forums, and innovation drives.
Strong interpersonal skills to build relationships across cross-functional teams.
A mindset of continuous learning and technical curiosity.

Preferred Certifications (at least two preferred)
Certifications in Machine Learning, Deep Learning, or Natural Language Processing.
Python programming certifications (e.g., PCEP/PCAP).
Cloud certifications (Azure/AWS/GCP) such as Azure AI Engineer, AWS ML Specialty, etc.

Why Join PwC CTIO?
Be part of a mission-driven AI innovation team tackling industry-wide transformation challenges.
Gain exposure to bleeding-edge GenAI research, rapid prototyping, and product development.
Contribute to a diverse portfolio of AI solutions spanning pharma, finance, and core business domains.
Operate in a startup-like environment within the safety and structure of a global enterprise.
Accelerate your career as a deep tech leader in an AI-first future.
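The agentic LLM applications this role describes follow a simple loop: the model chooses a tool, the runtime executes it, and the observation is fed back until the model answers. A rule-based stub stands in for the LLM's decision step here, and every name is illustrative, not an API of LangGraph or any tool named above.

```python
# Minimal tool-use agent loop: decide -> act -> observe -> repeat.
def calculator(expr: str) -> str:
    # Toy tool: evaluate simple arithmetic, restricted to safe characters.
    allowed = set("0123456789+-*/. ()")
    assert set(expr) <= allowed
    return str(eval(expr))

TOOLS = {"calculator": calculator}

def stub_policy(question: str, observations: list) -> dict:
    # Stand-in for the LLM's decision step: call a tool, then finish.
    if not observations:
        return {"action": "calculator", "input": "6 * 7"}
    return {"action": "finish", "answer": f"The result is {observations[-1]}"}

def run_agent(question: str, max_steps: int = 5) -> str:
    observations = []
    for _ in range(max_steps):
        decision = stub_policy(question, observations)
        if decision["action"] == "finish":
            return decision["answer"]
        observations.append(TOOLS[decision["action"]](decision["input"]))
    return "gave up"

print(run_agent("What is 6 times 7?"))  # → The result is 42
```

A real agent replaces stub_policy with an LLM call that emits the tool name and arguments, typically as structured JSON, with the observation appended to the conversation each turn.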

Posted 2 weeks ago

Apply

7.0 - 12.0 years

22 - 25 Lacs

India

On-site

TECHNICAL ARCHITECT

Key Responsibilities
1. Designing technology systems: Plan and design the structure of technology solutions, and work with design and development teams to assist with the process.
2. Communicating: Communicate system requirements to software development teams, and explain plans to developers and designers. Also communicate the value of a solution to stakeholders and clients.
3. Managing stakeholders: Work with clients and stakeholders to understand their vision for the systems, and manage stakeholder expectations.
4. Architectural oversight: Develop and implement robust architectures for AI/ML and data science solutions, ensuring scalability, security, and performance. Oversee architecture for data-driven web applications and data science projects, providing guidance on best practices in data processing, model deployment, and end-to-end workflows.
5. Problem solving: Identify and troubleshoot technical problems in existing or new systems, and assist with solving technical problems when they arise.
6. Ensuring quality: Ensure systems meet security and quality standards, and monitor systems to ensure they meet both user needs and business goals.
7. Project management: Break down project requirements into manageable pieces of work, and organise the workloads of technical teams.
8. Tool & framework expertise: Utilise relevant tools and technologies, including but not limited to LLMs, TensorFlow, PyTorch, Apache Spark, cloud platforms (AWS, Azure, GCP), web app development frameworks, and DevOps practices.
9. Continuous improvement: Stay current on emerging technologies and methods in AI, ML, data science, and web applications, bringing insights back to the team to foster continuous improvement.

Technical Skills
1. Proficiency in AI/ML frameworks such as TensorFlow, PyTorch, Keras, and scikit-learn for developing machine learning and deep learning models.
2. Knowledge or experience working with self-hosted or managed LLMs.
3. Knowledge or experience with NLP tools and libraries (e.g., SpaCy, NLTK, Hugging Face Transformers) and familiarity with Computer Vision frameworks like OpenCV and related libraries for image processing and object recognition.
4. Experience or knowledge in back-end frameworks (e.g., Django, Spring Boot, Node.js, Express) and building RESTful and GraphQL APIs.
5. Familiarity with microservices, serverless, and event-driven architectures. Strong understanding of design patterns (e.g., Factory, Singleton, Observer) to ensure code scalability and reusability.
6. Proficiency in modern front-end frameworks such as React, Angular, or Vue.js, with an understanding of responsive design, UX/UI principles, and state management (e.g., Redux).
7. In-depth knowledge of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra), as well as caching solutions (e.g., Redis, Memcached).
8. Expertise in tools such as Apache Spark, Hadoop, Pandas, and Dask for large-scale data processing.
9. Understanding of data warehouses and ETL tools (e.g., Snowflake, BigQuery, Redshift, Airflow) to manage large datasets.
10. Familiarity with visualisation tools (e.g., Tableau, Power BI, Plotly) for building dashboards and conveying insights.
11. Knowledge of deploying models with TensorFlow Serving, Flask, FastAPI, or cloud-native services (e.g., AWS SageMaker, Google AI Platform).
12. Familiarity with MLOps tools and practices for versioning, monitoring, and scaling models (e.g., MLflow, Kubeflow, TFX).
13. Knowledge or experience in CI/CD, IaC, and cloud-native toolchains.
14. Understanding of security principles, including firewalls, VPCs, IAM, and TLS/SSL for secure communication.
15. Knowledge of API Gateway, service mesh (e.g., Istio), and NGINX for API security, rate limiting, and traffic management.
Experience Required: Technical Architect with 7 - 12 years of experience
Salary: 22-25 LPA
Job Types: Full-time, Permanent
Pay: ₹2,200,000.00 - ₹2,500,000.00 per year
Work Location: In person

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Delhi

On-site

About us
Bain & Company is a global consultancy that helps the world's most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition, and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry.

In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network), with its nodes across various geographies. BCN is an integral and the largest unit of Expert Client Delivery (ECD). ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutioning across all industries, specific domains for corporate cases, client development, private equity diligence, or Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services, and Shared Services.

Who you will work with
The Consumer Products Center of Expertise collaborates with Bain's global Consumer Products Practice leadership, client-facing Bain leadership and teams, and with end clients on the development and delivery of Bain's proprietary CP products and solutions. These solutions aim to answer strategic questions of Bain's CP clients relating to brand strategy (consumer needs, assortment, pricing, distribution), revenue growth management (pricing strategy, promotions, profit pools, trade terms), negotiation strategy with key retailers, optimization of COGS, etc. You will work as part of the team in the CP CoE, comprising a mix of a Director, Managers, Project Leads, Associates, and Analysts working to implement cloud-based end-to-end advanced analytics solutions.
Delivery models on projects vary from working as part of the CP Center of Expertise, a broader global Bain case team within the CP ringfence, or other industry CoEs such as FS / Retail / TMT / Energy / CME / etc. with BCN on a need basis.

The AS is expected to have a knack for seeking out challenging problems and coming up with their own ideas, which they will be encouraged to brainstorm with their peers and managers. They should be willing to learn new techniques and be open to solving problems with an interdisciplinary approach. They must have excellent coding skills and should demonstrate a willingness to write modular, reusable, and functional code.

What you'll do
Collaborate with data scientists working with Python, LLMs, NLP, and Generative AI to design, fine-tune, and deploy intelligent agents and chains-based applications.
Develop and maintain front-end interfaces for AI and data science applications using React.js / Angular / Next.js and/or Streamlit / Dash, enhancing user interaction with complex machine learning and NLP-driven systems.
Build and integrate Python-based machine learning models with backend systems via RESTful APIs using frameworks like FastAPI, Flask, or Django.
Translate complex business problems into scalable technical solutions, integrating AI capabilities with robust backend and frontend systems.
Assist in the design and implementation of scalable data pipelines and ETL workflows using DBT, PySpark, and SQL, supporting both analytics and generative AI solutions.
Leverage containerization tools like Docker and utilize Git for version control, ensuring code modularity, maintainability, and collaborative development.
Deploy ML-powered and data-driven applications on cloud platforms such as AWS or Azure, optimizing for performance, scalability, and cost-efficiency.
Contribute to internal AI/ML Ops platforms and tools, streamlining model deployment, monitoring, and lifecycle management.
Create dashboards, visualizations, and presentations using tools like Tableau / Power BI, Plotly, and Seaborn to drive business insights.
Proficient with Excel and PowerPoint, with strong business communication in stakeholder interactions.
About you
A Master’s degree or higher in Computer Science, Data Science, Engineering, or related fields; Bachelor's candidates with relevant industry experience will also be considered.
Proven experience (2 years for Master’s; 3+ years for Bachelor’s) in AI/ML, software development, and data engineering.
Solid understanding of LLMs, NLP, Generative AI, chains, agents, and model fine-tuning methodologies.
Proficiency in Python, with experience using libraries such as Pandas, NumPy, Plotly, and Seaborn for data manipulation and visualization.
Experience working with modern Python frameworks such as FastAPI for backend API development.
Frontend development skills using HTML, CSS, JavaScript/TypeScript, and modern frameworks like React.js; Streamlit knowledge is a plus.
Strong grasp of data engineering concepts, including ETL pipelines, batch processing using DBT and PySpark, and working with relational databases like PostgreSQL, Snowflake, etc.
Good working knowledge of cloud infrastructure (AWS and/or Azure) and deployment best practices.
Familiarity with MLOps/AIOps tools and workflows, including CI/CD pipelines, monitoring, and container orchestration (with Docker and Kubernetes).
Good to have: experience with BI tools such as Tableau or Power BI.
Good to have: prior exposure to consulting projects or the CP (Consumer Products) business domain.
What makes us a great place to work
We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years.
We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign as a great place to work for diversity and inclusion, women, LGBTQ people and parents.
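The ETL-pipeline work listed in the role above can be sketched, at its simplest, in plain Python. This is only an illustrative toy: the field names (`sku`, `units`, `unit_price`) are invented for the example, and a real pipeline would use DBT or PySpark against a warehouse, as the posting notes.

```python
# Minimal extract-transform-load sketch over in-memory records.
# All field names are hypothetical, chosen only for illustration.

def extract(rows):
    """Extract: yield only well-formed records (has a SKU, positive units)."""
    for row in rows:
        if "sku" in row and row.get("units", 0) > 0:
            yield row

def transform(rows):
    """Transform: derive a revenue field per record."""
    for row in rows:
        yield {**row, "revenue": row["units"] * row["unit_price"]}

def load(rows):
    """Load: aggregate revenue per SKU into a dict (stand-in for a warehouse table)."""
    table = {}
    for row in rows:
        table[row["sku"]] = table.get(row["sku"], 0.0) + row["revenue"]
    return table

raw = [
    {"sku": "A", "units": 2, "unit_price": 10.0},
    {"sku": "B", "units": 0, "unit_price": 5.0},   # dropped in extract
    {"sku": "A", "units": 1, "unit_price": 10.0},
]
warehouse = load(transform(extract(raw)))   # {"A": 30.0}
```

Because each stage is a generator, records stream through the pipeline one at a time, which is the same batch-friendly shape a DBT or PySpark job would formalize.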

Posted 2 weeks ago


5.0 - 6.0 years

8 - 15 Lacs

India

On-site

We are seeking a highly skilled Python Developer with expertise in Machine Learning and Data Analytics to join our team. The ideal candidate should have 5-6 years of experience in developing end-to-end ML-driven applications and handling data-driven projects independently. You will be responsible for designing, developing, and deploying Python-based applications that leverage data analytics, statistical modeling, and machine learning techniques.
Key Responsibilities:
Design, develop, and deploy Python applications for data analytics and machine learning.
Work independently on machine learning model development, evaluation, and optimization.
Develop ETL pipelines and process large-scale datasets for analysis.
Implement scalable and efficient algorithms for predictive analytics and automation.
Optimize code for performance, scalability, and maintainability.
Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
Integrate APIs and third-party tools to enhance functionality.
Document processes, code, and best practices for maintainability.
Required Skills & Qualifications:
5-6 years of professional experience in Python application development.
Strong expertise in Machine Learning, Data Analytics, and AI frameworks (TensorFlow, PyTorch, Scikit-learn, etc.).
Proficiency in Python libraries such as Pandas, NumPy, SciPy, and Matplotlib.
Experience with SQL and NoSQL databases (PostgreSQL, MongoDB, etc.).
Hands-on experience with big data technologies (Apache Spark, Delta Lake, Hadoop, etc.).
Strong experience in developing APIs and microservices using FastAPI, Flask, or Django.
Good understanding of data structures, algorithms, and software development best practices.
Strong problem-solving and debugging skills.
Ability to work independently and handle multiple projects simultaneously.
Good to have: working knowledge of cloud platforms (Azure/AWS/GCP) for deploying ML models and data applications.
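As a toy illustration of the "model development, evaluation, and optimization" loop these requirements describe, here is a hand-rolled one-feature least-squares fit. A real project would use Scikit-learn or TensorFlow as listed above; this sketch only shows the train-then-predict shape of the work.

```python
# Toy "train and predict" loop: ordinary least squares on a single feature,
# computed by hand as a stand-in for the ML-framework work the posting names.

def fit(xs, ys):
    """Fit y = slope * x + intercept by ordinary least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx   # (slope, intercept)

def predict(model, x):
    slope, intercept = model
    return slope * x + intercept

# Training data follows y = 2x, so the fit recovers slope 2, intercept 0.
model = fit([1, 2, 3, 4], [2, 4, 6, 8])
```

Evaluation in practice would hold out a test set and compute an error metric, but the fit/predict split above is the interface every mainstream framework mirrors.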
Job Type: Full-time
Pay: ₹800,000.00 - ₹1,500,000.00 per year
Schedule: Day shift
Ability to commute/relocate: Chandrasekharpur, Bhubaneswar, Orissa: Reliably commute or planning to relocate before starting work (Preferred)
Experience: Python: 5 years (Required)
Work Location: In person
Expected Start Date: 01/08/2025

Posted 2 weeks ago


8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Role Proficiency: Leverage expertise in a technology area (e.g. Informatica transformations, Teradata data warehouses, Hadoop, Analytics). Responsible for architecture on small/mid-size projects.
Outcomes
Implement data extraction and transformation, a data warehouse (ETL, data extracts, data-load logic, mappings, workflows, stored procedures), a data analysis solution, data reporting solutions, or cloud data tools on any one of the cloud providers (AWS/Azure/GCP).
Understand business workflows and related data flows.
Develop designs for data acquisition and data transformation or data modelling; apply business intelligence to data, or design data fetching and dashboards.
Design information structure and work/dataflow navigation.
Define backup, recovery and security specifications.
Enforce and maintain naming standards and a data dictionary for data models.
Provide, or guide the team to produce, estimates.
Help the team develop proofs of concept (POCs) and solutions relevant to customer problems.
Able to troubleshoot problems while developing POCs.
Architect/Big Data specialty certification (AWS/Azure/GCP/general, e.g. Coursera or a similar learning platform, or any ML certification).
Measures Of Outcomes
Percentage of billable time spent in a year developing and implementing data transformation or data storage.
Number of best practices documented for new tools and technologies emerging in the market.
Number of associates trained on the data service practice.
Outputs Expected
Strategy & Planning: Create or contribute short-term tactical solutions to achieve long-term objectives and an overall data management roadmap. Implement methods and procedures for tracking data quality, completeness, redundancy and improvement. Ensure that data strategies and architectures meet regulatory compliance requirements. Begin engaging external stakeholders, including standards organizations, regulatory bodies, operators and scientific research communities, or attend conferences on data in the cloud.
Operational Management: Help Architects establish governance, stewardship and frameworks for managing data across the organization. Provide support in implementing the appropriate tools, software applications and systems to support data technology goals. Collaborate with project managers and business teams on all projects involving enterprise data. Analyse data-related issues with systems integration, compatibility and multi-platform integration.
Project Control And Review: Provide advice to teams facing complex technical issues in the course of project delivery. Define and measure project- and program-specific architectural and technology quality metrics.
Knowledge Management & Capability Development: Publish and maintain a repository of solutions, best practices, standards and other knowledge articles for data management. Conduct and facilitate knowledge-sharing and learning sessions across the team. Gain industry-standard certifications in the technology or area of expertise. Support technical skill building (including hiring and training) for the team based on inputs from the project manager/RTEs. Mentor new team members in technical areas. Gain and cultivate domain expertise to provide the best and most optimized solution to the customer (delivery).
Requirement Gathering And Analysis: Work with customer business owners and other teams to collect, analyze and understand the requirements, including NFRs/define NFRs. Analyze gaps/trade-offs based on the current system context and industry practices; clarify the requirements by working with the customer. Define the systems and sub-systems that make up the program.
People Management: Set goals and manage the performance of team engineers. Provide career guidance to technical specialists and mentor them.
Alliance Management: Identify alliance partners based on an understanding of service offerings and client requirements. In collaboration with the Architect, create a compelling business case around the offerings. Conduct beta testing of the offerings and their relevance to the program.
Technology Consulting: In collaboration with Architects II and III, analyze the application and technology landscape, processes and tools to arrive at the architecture options that best fit the client program. Analyze cost vs. benefits of solution options. Support Architects II and III in creating a technology/architecture roadmap for the client. Define the architecture strategy for the program.
Innovation And Thought Leadership: Participate in internal and external forums (seminars, paper presentations, etc.). Understand the client's existing business at the program level and explore new avenues to save cost and bring process efficiency. Identify business opportunities to create reusable components/accelerators, and reuse existing components and best practices.
Project Management Support: Assist the PM/Scrum Master/Program Manager to identify technical risks and come up with mitigation strategies.
Stakeholder Management: Monitor the concerns of internal stakeholders like Product Managers and RTEs, and external stakeholders like client architects, on architecture aspects. Follow through on commitments to achieve timely resolution of issues. Conduct initiatives to meet client expectations. Work to expand your professional network in the client organization at team and program levels.
New Service Design: Identify potential opportunities for new service offerings based on customer voice/partner inputs. Conduct beta testing/POCs as applicable. Develop collateral and guides for GTM.
Skill Examples
Use data services knowledge to create POCs that meet business requirements; contextualize the solution to the industry under the guidance of Architects.
Use technology knowledge to create proofs of concept (POCs) and (reusable) assets under the guidance of the specialist. Apply best practices in your own area of work, helping with performance troubleshooting and other complex troubleshooting.
Define, decide and defend the technology choices made; review solutions under guidance.
Use knowledge of technology trends to provide inputs on potential areas of opportunity for UST.
Use independent knowledge of design patterns, tools and principles to create high-level designs for the given requirements. Evaluate multiple design options and choose the option with the best possible trade-offs. Conduct knowledge sessions to enhance the team's design capabilities. Review the low- and high-level designs created by Specialists for efficiency (consumption of hardware, memory, memory leaks, etc.).
Use knowledge of software development processes, tools and techniques to identify and assess incremental improvements to the software development process, methodology and tools. Take technical responsibility for all stages of the software development process. Conduct optimal coding with a clear understanding of memory leakage and its impact.
Implement global standards and guidelines relevant to programming and development; come up with 'points of view' and new technological ideas.
Use knowledge of project management and Agile tools and techniques to support, plan and manage medium-size projects/programs as defined within UST, identifying risks and mitigation strategies.
Use knowledge of project metrics to understand their relevance in the project. Collect and collate project metrics and share them with the relevant stakeholders.
Use knowledge of estimation and resource planning to create estimates and plan resources for specific modules or small projects with detailed requirements or user stories in place.
Strong proficiency in understanding data workflows and dataflow. Attention to detail. High analytical capabilities.
Knowledge Examples
Data visualization. Data migration. RDBMSs (relational database management systems), SQL, Hadoop technologies like MapReduce, Hive and Pig. Programming languages, especially Python and Java. Operating systems like UNIX and MS Windows. Backup/archival software.
Additional Comments
AI Architect Role Summary: Hands-on AI Architect with strong expertise in Deep Learning, Generative AI, and real-world AI/ML systems. The role involves leading the architecture, development, and deployment of AI agent-based solutions, supporting initiatives such as intelligent automation, anomaly detection, and GenAI-powered assistants across enterprise operations and engineering. This is a hands-on role ideal for someone who thrives in fast-paced environments, is passionate about AI innovations, and can adapt across multiple opportunities based on business priorities.
Key Responsibilities:
Design and architect AI-based solutions, including multi-agent GenAI systems using LLMs and RAG pipelines.
Build POCs, prototypes, and production-grade AI components for operations, support automation, and intelligent assistants.
Lead end-to-end development of AI agents for use cases such as triage, RCA automation, and predictive analytics.
Leverage GenAI (LLMs) and time-series models to drive intelligent observability and performance management.
Work closely with product, engineering, and operations teams to align solutions with domain and customer needs.
Own the model lifecycle from experimentation to deployment using modern MLOps and LLMOps practices.
Ensure scalable, secure, and cost-efficient implementation across AWS and Azure cloud environments.
Key Skills & Technology Areas:
AI/ML Expertise: 8+ years in AI/ML, with hands-on experience in deep learning, model deployment, and GenAI.
LLMs & Frameworks: GPT-3+, Claude, LLAMA3, LangChain, LangGraph, Transformers (BERT, T5), RAG pipelines, LLMOps.
Programming: Python (advanced), Keras, PyTorch, Pandas, FastAPI, Celery (for agent orchestration), Redis.
Modeling & Analytics: Time-series forecasting, predictive modeling, synthetic data generation.
Data & Storage: ChromaDB, Pinecone, FAISS, DynamoDB, PostgreSQL, Azure Synapse, Azure Data Factory.
Cloud & Tools: AWS (Bedrock, SageMaker, Lambda); Azure (Azure ML, Azure Databricks, Synapse); GCP (Vertex AI – optional).
Observability Integration: Splunk, ELK Stack, Prometheus.
DevOps/MLOps: Docker, GitHub Actions, Kubernetes, CI/CD pipelines, model monitoring & versioning.
Architectural Patterns: Microservices, event-driven architecture, multi-agent systems, API-first design.
Other Requirements:
Proven ability to work independently and collaboratively in agile, innovation-driven teams.
Strong problem-solving mindset and product-oriented thinking.
Excellent communication and technical storytelling skills.
Flexibility to work across multiple opportunities based on business priorities.
Experience in Telecom, E-Commerce, or Enterprise IT Operations is a plus.
Skills: Python, Pandas, AI/ML, GenAI
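The retrieval step of the RAG pipelines this role mentions can be sketched with simple term overlap. This is only a conceptual stand-in: a production system would embed documents and query a vector store such as FAISS or Pinecone from the skills list, and the sample documents below are invented for illustration.

```python
# Toy retrieval step of a RAG pipeline: rank candidate documents by how many
# query terms they share, then return the top k. Real systems replace the
# overlap score with embedding similarity against a vector store.

def retrieve(query, docs, k=1):
    """Return up to k documents with the highest term overlap with the query."""
    q_terms = set(query.lower().split())
    scored = [(len(q_terms & set(d.lower().split())), d) for d in docs]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [d for score, d in scored[:k] if score > 0]

docs = [
    "anomaly detection for network telemetry",
    "quarterly sales report template",
]
hits = retrieve("detect anomaly in telemetry", docs)
```

In a full pipeline the retrieved text would then be stuffed into the LLM prompt, which is the "augmented generation" half of RAG.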

Posted 2 weeks ago


1.0 - 3.0 years

60 - 96 Lacs

Pune

Work from Office

Responsibilities: Experience with Machine Learning, NLP, Generative AI, LLMs, APIs, Django. Develop CRM solutions with HubSpot & FastAPI/Flask frameworks. Design, develop & maintain RAG applications on cloud platforms. Deploy a Model Context Protocol server.

Posted 2 weeks ago


0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Please find the JD for Senior Full Stack Engineer (Python + MERN):
Location: Noida
Company Profile: MeeTri is a global provider of Information Technology services and business solutions. We leverage deep industry and functional expertise and leading technology practices to help clients transform their highest-value business processes and improve their business performance. MeeTri is led by a team of seasoned executives with extensive experience, industry knowledge, and technology expertise. Our management team is committed to excellence in customer satisfaction and technical innovation, and to partnering with best-of-breed technology and distribution partners. Our vision is to achieve global IT services leadership in providing value-added, high-quality IT solutions to our clients in selected horizontal and vertical segments, by combining technology skills, domain expertise, process focus, and a commitment to long-term client relationships.
Job Description:
Strong experience with Python (Flask, FastAPI) for backend development, building efficient APIs (REST/GraphQL).
Optimize backend performance using AsyncIO, multithreading, and multiprocessing techniques.
Lead technical aspects of projects, including architecture design and technology stack decisions.
Develop modern web applications using the MERN stack (MongoDB, Express.js, React.js, Node.js).
Mentor and guide junior developers, conduct code reviews, and enforce best practices.
Work with PostgreSQL and MongoDB for database management, optimization, and design.
Deploy and manage applications using AWS (EC2, Lambda, S3, RDS).
Implement CI/CD pipelines and automation for smooth deployment and continuous integration.
Collaborate with UX/UI designers to create intuitive and responsive user interfaces.
Participate in Agile development processes, including sprint planning and retrospectives.
Ensure high application performance, scalability, and security across both frontend and backend.
Implement cloud-based solutions for high availability and disaster recovery.
Skills:
Proficient in Python, Node.js, React.js, Express.js, Flask, and Django.
Experience with PostgreSQL, MongoDB, and database optimization.
Expertise in AsyncIO, multithreading, and multiprocessing for concurrency.
Familiarity with AWS services (EC2, Lambda, S3, RDS).
Experience with Git, CI/CD tools, and version control systems.
Ability to lead teams, mentor junior developers, and make technical decisions.
Strong problem-solving and debugging skills.
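The AsyncIO concurrency called out in the skills above can be sketched as follows. Note that `fetch()` is a hypothetical stand-in for an I/O-bound call (HTTP request, database query); the pattern, not the function, is the point.

```python
# Sketch of AsyncIO concurrency: fan out several I/O-bound "calls"
# concurrently instead of awaiting them one by one.
import asyncio

async def fetch(name, delay):
    """Hypothetical stand-in for an HTTP or database call."""
    await asyncio.sleep(delay)          # simulate I/O latency
    return f"{name}:done"

async def main():
    # gather() runs the coroutines concurrently, so total wall time is
    # roughly max(delays) rather than their sum; results keep call order.
    return await asyncio.gather(
        fetch("users", 0.01),
        fetch("orders", 0.02),
    )

results = asyncio.run(main())
```

For CPU-bound work the posting's other two tools apply instead: multithreading for blocking libraries and multiprocessing to sidestep the GIL.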

Posted 2 weeks ago
