0 years
0 Lacs
Bangalore North Rural, Karnataka, India
On-site
About the Role:
Okulo Aerospace is looking for a versatile full-stack engineer who can build fast, take ownership, and grow with the company. You'll design, develop, and scale our web applications using React (frontend), FastAPI (backend), Docker, and REST APIs. This is a high-impact role where you'll help shape the architecture, culture, and future of our product. You'll be the second software engineer.

Requirements
✔ Frontend: Strong in Next.js/React.js, JavaScript/TypeScript, Tailwind CSS.
✔ Backend: Experience with FastAPI (or Flask/Django) and Python.
✔ APIs: Comfortable designing and consuming RESTful APIs.
✔ DevOps: Hands-on with Docker (Kubernetes is a bonus).
✔ Database: NoSQL (MongoDB).
✔ Git/GitHub: Clean, organized version control.
✔ Problem-Solving: Ability to debug, optimize, and learn quickly.
Preferred: WebSockets, AWS basics, and experience with testing (Jest, PyTest).

Responsibilities
Build end-to-end features, from UI (React) to backend (FastAPI) to deployment (Docker).
Help define best practices in code, architecture, and DevOps as we scale.
Work directly on product decisions.
Optimize performance, fix bugs, and ensure scalable, maintainable code.
Own what you build: this isn't a "follow orders" role; you'll have real responsibility.

Benefits
Intellectual fulfillment and rapid personal/professional development.
Fast-paced, innovative team and work environment.
Financial compensation on a case-by-case basis.
Flexible work hours, professional autonomy, open-door policy, informal work setting, and strong meritocracy.
Posted 1 week ago
3.0 years
0 Lacs
Greater Nashik Area
On-site
Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources, and opportunities to unleash their full potential. The power we create together, when we combine your strengths with ours, is unstoppable. Are you ready to join a team that dreams as big as you do?

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You.

Job Description
Job Title: Data Scientist
Location: Bangalore
Reporting to: Senior Manager Analytics

Purpose of the role
Anheuser-Busch InBev (AB InBev)'s Supply Analytics team builds competitively differentiated solutions that enhance brewery efficiency through data-driven insights. We optimize processes, reduce waste, and improve productivity by leveraging advanced analytics and AI-driven solutions. As a Data Scientist you will work at the intersection of:
Conceptualizing analytical solutions for business problems using statistical models and programming techniques.
Applying machine learning solutions.
Best-in-class cloud technology and microservices architecture.
DevOps best practices, including model serving and data and code versioning.

Key tasks & accountabilities
Develop and fine-tune Gen AI models to solve business problems, leveraging LLMs and other advanced AI techniques.
Design, implement, and optimize AI-driven solutions that enhance automation, efficiency, and decision-making.
Work with cloud-based architectures to deploy and scale AI models efficiently using best-in-class microservices.
Apply DevOps and MLOps best practices for model serving, data and code versioning, and continuous integration/deployment.
Collaborate with cross-functional teams (engineering, business, and product) to translate business needs into AI-driven solutions.
Ensure model interpretability, reliability, and performance, continuously improving accuracy and reducing bias.
Develop internal tools and utilities to enhance team productivity and streamline workflows.
Maintain best coding practices, including proper documentation, testing, logging, and performance monitoring.
Stay up to date with the latest advancements in Gen AI, LLMs, and deep learning to incorporate innovative approaches into projects.

Qualifications, Experience, Skills
Level of educational attainment required: Bachelor's or Master's degree in Computer Application, Computer Science, or any engineering discipline.
Previous work experience: Minimum 3 years of relevant experience.

Technical skills required
Programming languages: Proficiency in Python.
Mathematics and statistics: Strong understanding of linear algebra, calculus, probability, and statistics.
Machine learning algorithms: Knowledge of supervised, unsupervised, and reinforcement learning techniques.
Natural language processing (NLP): Understanding of techniques such as tokenization, POS tagging, named entity recognition, and machine translation.
LLMs: Experience with LangChain, inference from and fine-tuning of LLMs for specific tasks, and prompt engineering.
Data preprocessing: Skills in data cleaning, normalization, augmentation, and handling imbalanced datasets.
Database management: Experience with SQL and NoSQL databases such as MongoDB and Redis.
Cloud platforms: Familiarity with Azure and Google Cloud Platform.
DevOps: Knowledge of CI/CD pipelines, Docker, Kubernetes.

Other skills required
APIs: Experience with FastAPI or Flask.
Software development: Understanding of the software development lifecycle (SDLC) and Agile methodologies.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.
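The prompt-engineering side of this role can be sketched with nothing but the standard library: a reusable template plus a helper that folds retrieved context into the final prompt, the way a LangChain-style pipeline would before calling the model. The template text, field names, and brewery example are invented for illustration; the actual LLM call (OpenAI, Azure, etc.) is omitted since it needs credentials.

```python
# Prompt construction for an LLM call: a fixed template with slots for
# retrieved context and the user's question. A real pipeline would pass the
# returned string to a chat-completion API.
from string import Template

PROMPT = Template(
    "You are a brewery-operations assistant.\n"
    "Context:\n$context\n\n"
    "Question: $question\n"
    "Answer concisely using only the context above."
)

def build_prompt(context_chunks: list[str], question: str) -> str:
    # Join retrieved context chunks into one bulleted block.
    context = "\n".join(f"- {c}" for c in context_chunks)
    return PROMPT.substitute(context=context, question=question)
```

Keeping prompts in named templates like this makes them versionable alongside code, which fits the posting's emphasis on data and code versioning.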
Posted 1 week ago
2.0 - 4.0 years
6 - 9 Lacs
Mumbai
Work from Office
Seeking a Full-Stack Developer to build intelligent, task-driven AI agents using React and FastAPI. Must blend AI/ML expertise with software skills to create scalable, modular systems for API/UI interaction.

Required candidate profile:
1. 2+ years in software (full-stack) development
2. Strong in Next.js
3. Has built scalable web apps
4. Well-versed in prompt engineering
Posted 1 week ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Responsibilities
• Provide senior-level production support for Splunk apps and integrations across on-prem and cloud environments.
• Diagnose and resolve issues related to app scripts, APIs, dashboards, search queries, alerting logic, and data ingestion.
• Troubleshoot and debug application-level issues using logs, scripts, and system diagnostics.
• Drive automation of repeated tasks and scripts; maintain and improve operational tooling.
• Work with development and engineering teams to analyze bugs and deploy fixes.
• Support deployment upgrades, migrations, and configuration changes across environments.
• Act as a technical point of contact for assigned Splunk applications and use cases.
• Ensure proper documentation of root cause analyses and contribute to the knowledge base.

Skillset
• 5+ years' experience supporting Splunk applications in hybrid environments, including customer interaction and case management.
• Strong scripting expertise in Python, Bash, and JavaScript; experience with GitHub and FastAPI.
• Experience with integration methods such as APIs, the HTTP Event Collector (HEC), and scripted inputs.
• Hands-on experience troubleshooting network data using Wireshark, tcpdump, etc.
• Good knowledge of Splunk integrations with on-prem systems, SaaS apps, network components, databases, etc.
• Excellent knowledge of SQL, PL/SQL, SQLAlchemy, etc.
• Self-driven and adaptable, with strong documentation and communication skills.

You can also share your CV at vanshita.pawar@hummingbirdconsulting.work
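One of the integration methods named above, the HTTP Event Collector, accepts JSON events over HTTPS. The sketch below builds (but does not send) such a request with the standard library; the host, port, token, and sourcetype values are placeholders, while the payload shape and the `Authorization: Splunk <token>` header follow HEC's documented format.

```python
# Build a Splunk HEC request: JSON payload with "event"/"sourcetype"/"time"
# keys, POSTed to /services/collector/event with a Splunk token header.
# Sending is left to the caller (urllib.request.urlopen), since it needs a
# reachable Splunk instance.
import json
import time
import urllib.request

def build_hec_request(host: str, token: str, event: dict,
                      sourcetype: str = "_json") -> urllib.request.Request:
    payload = {
        "time": time.time(),      # event timestamp, epoch seconds
        "sourcetype": sourcetype,
        "event": event,           # the actual event body
    }
    return urllib.request.Request(
        f"https://{host}:8088/services/collector/event",
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Splunk {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
```

Wrapping request construction in a helper like this also makes the payload easy to assert on in unit tests, which helps when debugging ingestion issues.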
Posted 1 week ago
9.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

We have growing demand for Python work and foresee this increasing in the coming days. We are hiring Python Developers per the JD below:

Python Developer (FastAPI/Flask/Django)
Experience range: 5-9 years (a healthy mix within this range)

Responsibilities
Build scalable backend applications using FastAPI, Flask, or Django
Design RESTful APIs and implement data models with Pydantic
Utilize AWS services (DynamoDB, S3, EKS) for deployment
Write unit tests and integrate with CI/CD pipelines
Collaborate with cross-functional teams

Requirements
Python development experience
Experience with at least one of the following frameworks: Django, FastAPI, or Flask
AWS services knowledge (DynamoDB, S3, EKS)
Pydantic and unit-testing experience, e.g. PyTest
Git for version control

Nice to Have
Containerization (Docker) and orchestration (Kubernetes)
CI/CD pipeline experience
NoSQL database knowledge

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
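The unit-testing requirement above can be sketched in a few lines: a pure function plus PyTest-style test functions that assert both the happy path and the error path. The discount logic is invented purely for illustration.

```python
# A small pure function and two PyTest-style tests. PyTest discovers
# test_* functions automatically; here they are also plain callables.
def apply_discount(price: float, pct: float) -> float:
    """Return price reduced by pct percent, rounded to 2 decimals."""
    if not 0 <= pct <= 100:
        raise ValueError("pct must be between 0 and 100")
    return round(price * (1 - pct / 100), 2)

def test_apply_discount_basic():
    assert apply_discount(200.0, 25) == 150.0

def test_apply_discount_rejects_bad_pct():
    # The error path matters as much as the happy path.
    try:
        apply_discount(100.0, 150)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError")
```

In a CI/CD pipeline these run via `pytest` on every push, which is the integration the JD asks for.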
Posted 1 week ago
6.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Summary
Position Summary

Strategy & Analytics
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing As-a-Service offerings for continuous insights and improvements

Python Developer - Sr. Consultant
The position is suited for individuals who can work in a constantly challenging environment and deliver effectively and efficiently. As a Data Engineer, you will be an integral member of our Data & Analytics team, responsible for the design and development of pipelines using cutting-edge technologies.

Work you'll do
Implement security and data protection
Implement ETL pipelines for data from a wide variety of sources using Python and SQL
Deliver data and insights in real time
Participate in architectural, design, and product sessions
Apply unit-testing and debugging skills
Collaborate with other developers, testers, and system engineers to ensure the quality of deliverables and any product enhancements

Qualifications
Required:
6-9 years of technology consulting experience
Education: Bachelor's/Master's degree in Computer Science / MCA / M.Sc / MBA
A minimum of 2 years of experience in unit testing and debugging
Excellent knowledge of Python, along with knowledge of at least one Python web framework (Django, Flask, FastAPI, Pyramid)
Extensive experience with Pandas/NumPy dataframes: slicing, data wrangling, aggregations; lambda functions and decorators; vector operations on Pandas dataframes/series; application of applymap, apply, and map functions
Understanding of how to choose a framework based on specific needs and requirements
Understanding of the threading limitations of Python and of multi-process architecture
Basic understanding of front-end technologies such as JavaScript, HTML5, and CSS3

Primary Skills
Python and data analysis libraries (Pandas, NumPy, SciPy)
Django
DS/Algo
SQL (read and write)
CRUD
Awareness of microservices

Preferred:
Good understanding of the fundamental design principles behind a scalable application
Good understanding of accessibility and security compliance
Familiarity with event-driven programming in Python
Proficient understanding of code versioning tools (Git, Mercurial, or SVN)
Knowledge of PowerShell and SQL Server
Familiarity with big data technologies like Spark or Flink, and comfort working with web-scale datasets
An eye for detail, good data intuition, and a passion for data quality
Good knowledge of user authentication and authorization between multiple systems, servers, and environments
Appreciation of the importance of great documentation, and data-debugging skills

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300058
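The Pandas idioms this posting asks about (vector operations, Series.map, apply with a lambda, and aggregations) can be sketched in a few lines. The sales data and column names are invented for the example.

```python
# Core Pandas idioms: vectorized column math, Series.map for lookups,
# apply + lambda for row-wise rules, and groupby for aggregation.
import pandas as pd

df = pd.DataFrame({
    "region": ["N", "S", "N"],
    "units": [10, 4, 6],
    "price": [2.0, 3.5, 2.0],
})

# Vector operation: no explicit Python loop.
df["revenue"] = df["units"] * df["price"]

# Series.map: translate codes via a dict lookup.
df["region_name"] = df["region"].map({"N": "North", "S": "South"})

# apply + lambda: an element-wise rule on a Series.
df["tier"] = df["revenue"].apply(lambda r: "high" if r >= 14 else "low")

# Aggregation: sum revenue per region.
total = df.groupby("region_name")["revenue"].sum()
```

Preferring the vectorized forms over `iterrows`-style loops is exactly the data-wrangling fluency the qualifications list is probing for.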
Posted 1 week ago
3.0 - 6.0 years
0 Lacs
Greater Kolkata Area
On-site
Summary
Position Summary

Strategy & Analytics
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing As-a-Service offerings for continuous insights and improvements

Python Developer - Consultant
The position is suited for individuals who can work in a constantly challenging environment and deliver effectively and efficiently. As a Data Engineer, you will be an integral member of our Data & Analytics team, responsible for the design and development of pipelines using cutting-edge technologies.

Work you'll do
Implement security and data protection
Implement ETL pipelines for data from a wide variety of sources using Python and SQL
Deliver data and insights in real time
Participate in architectural, design, and product sessions
Apply unit-testing and debugging skills
Collaborate with other developers, testers, and system engineers to ensure the quality of deliverables and any product enhancements

Qualifications
Required:
3-6 years of technology consulting experience
Education: Bachelor's/Master's degree in Computer Science / MCA / M.Sc / MBA
A minimum of 2 years of experience in unit testing and debugging
Excellent knowledge of Python, along with knowledge of at least one Python web framework (Django, Flask, FastAPI, Pyramid)
Extensive experience with Pandas/NumPy dataframes: slicing, data wrangling, aggregations; lambda functions and decorators; vector operations on Pandas dataframes/series; application of applymap, apply, and map functions
Understanding of how to choose a framework based on specific needs and requirements
Understanding of the threading limitations of Python and of multi-process architecture
Basic understanding of front-end technologies such as JavaScript, HTML5, and CSS3

Primary Skills
Python and data analysis libraries (Pandas, NumPy, SciPy)
Django
DS/Algo
SQL (read and write)
CRUD
Awareness of microservices

Preferred:
Good understanding of the fundamental design principles behind a scalable application
Good understanding of accessibility and security compliance
Familiarity with event-driven programming in Python
Proficient understanding of code versioning tools (Git, Mercurial, or SVN)
Knowledge of PowerShell and SQL Server
Familiarity with big data technologies like Spark or Flink, and comfort working with web-scale datasets
An eye for detail, good data intuition, and a passion for data quality
Good knowledge of user authentication and authorization between multiple systems, servers, and environments
Appreciation of the importance of great documentation, and data-debugging skills

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300054
Posted 1 week ago
6.0 - 9.0 years
0 Lacs
Greater Kolkata Area
On-site
Summary
Position Summary

Strategy & Analytics
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing As-a-Service offerings for continuous insights and improvements

Python Developer - Sr. Consultant
The position is suited for individuals who can work in a constantly challenging environment and deliver effectively and efficiently. As a Data Engineer, you will be an integral member of our Data & Analytics team, responsible for the design and development of pipelines using cutting-edge technologies.

Work you'll do
Implement security and data protection
Implement ETL pipelines for data from a wide variety of sources using Python and SQL
Deliver data and insights in real time
Participate in architectural, design, and product sessions
Apply unit-testing and debugging skills
Collaborate with other developers, testers, and system engineers to ensure the quality of deliverables and any product enhancements

Qualifications
Required:
6-9 years of technology consulting experience
Education: Bachelor's/Master's degree in Computer Science / MCA / M.Sc / MBA
A minimum of 2 years of experience in unit testing and debugging
Excellent knowledge of Python, along with knowledge of at least one Python web framework (Django, Flask, FastAPI, Pyramid)
Extensive experience with Pandas/NumPy dataframes: slicing, data wrangling, aggregations; lambda functions and decorators; vector operations on Pandas dataframes/series; application of applymap, apply, and map functions
Understanding of how to choose a framework based on specific needs and requirements
Understanding of the threading limitations of Python and of multi-process architecture
Basic understanding of front-end technologies such as JavaScript, HTML5, and CSS3

Primary Skills
Python and data analysis libraries (Pandas, NumPy, SciPy)
Django
DS/Algo
SQL (read and write)
CRUD
Awareness of microservices

Preferred:
Good understanding of the fundamental design principles behind a scalable application
Good understanding of accessibility and security compliance
Familiarity with event-driven programming in Python
Proficient understanding of code versioning tools (Git, Mercurial, or SVN)
Knowledge of PowerShell and SQL Server
Familiarity with big data technologies like Spark or Flink, and comfort working with web-scale datasets
An eye for detail, good data intuition, and a passion for data quality
Good knowledge of user authentication and authorization between multiple systems, servers, and environments
Appreciation of the importance of great documentation, and data-debugging skills

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300058
Posted 1 week ago
6.0 - 9.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary
Position Summary

Strategy & Analytics
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. Together with the Strategy practice, our Strategy & Analytics portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance, and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms
Leverage automation, cognitive, and science-based techniques to manage data, predict scenarios, and prescribe actions
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise, and providing As-a-Service offerings for continuous insights and improvements

Python Developer - Sr. Consultant
The position is suited for individuals who can work in a constantly challenging environment and deliver effectively and efficiently. As a Data Engineer, you will be an integral member of our Data & Analytics team, responsible for the design and development of pipelines using cutting-edge technologies.

Work you'll do
Implement security and data protection
Implement ETL pipelines for data from a wide variety of sources using Python and SQL
Deliver data and insights in real time
Participate in architectural, design, and product sessions
Apply unit-testing and debugging skills
Collaborate with other developers, testers, and system engineers to ensure the quality of deliverables and any product enhancements

Qualifications
Required:
6-9 years of technology consulting experience
Education: Bachelor's/Master's degree in Computer Science / MCA / M.Sc / MBA
A minimum of 2 years of experience in unit testing and debugging
Excellent knowledge of Python, along with knowledge of at least one Python web framework (Django, Flask, FastAPI, Pyramid)
Extensive experience with Pandas/NumPy dataframes: slicing, data wrangling, aggregations; lambda functions and decorators; vector operations on Pandas dataframes/series; application of applymap, apply, and map functions
Understanding of how to choose a framework based on specific needs and requirements
Understanding of the threading limitations of Python and of multi-process architecture
Basic understanding of front-end technologies such as JavaScript, HTML5, and CSS3

Primary Skills
Python and data analysis libraries (Pandas, NumPy, SciPy)
Django
DS/Algo
SQL (read and write)
CRUD
Awareness of microservices

Preferred:
Good understanding of the fundamental design principles behind a scalable application
Good understanding of accessibility and security compliance
Familiarity with event-driven programming in Python
Proficient understanding of code versioning tools (Git, Mercurial, or SVN)
Knowledge of PowerShell and SQL Server
Familiarity with big data technologies like Spark or Flink, and comfort working with web-scale datasets
An eye for detail, good data intuition, and a passion for data quality
Good knowledge of user authentication and authorization between multiple systems, servers, and environments
Appreciation of the importance of great documentation, and data-debugging skills

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities, and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 300058
Posted 1 week ago
0 years
0 Lacs
India
On-site
About Creditors Academy:
At Creditors Academy, we're revolutionizing financial and legal education using cutting-edge technology. We empower learners with powerful tools to understand credit, law, and sovereignty in the modern age. Now, we're building a next-generation, AI-powered Learning Management System (LMS) that integrates content generation, adaptive learning, and smart engagement into a single platform. We are looking for a high-potential AI Engineer / Machine Learning Engineer who will be the driving force behind the development of AI features, AI-generated content, and ML infrastructure for our educational ecosystem.

Role Overview:
As an AI Engineer at Creditors Academy, you will work closely with product, design, and instructional teams to build a smart LMS that thinks, talks, draws, teaches, and evaluates. You will implement AI/ML models, automate content creation, and integrate third-party AI tools into the system. This is a hybrid role that combines hands-on development, prompt engineering, and AI system design, with end-to-end ownership of features that touch thousands of learners.

Key Responsibilities:
1. AI-Powered Content Creation & Automation
- Build systems that use LLMs (like GPT-4, Claude, etc.) to:
  - Auto-generate course modules, lesson plans, descriptions, quizzes
  - Write AI scripts for voiceovers, explainers, or chatbot tutors
  - Summarize complex credit/law topics into digestible content
- Develop AI workflows that automate:
  - Comics and visual storytelling
  - Educational video generation (via tools like RunwayML, Pika, Synthesia)
  - AI voiceovers (via ElevenLabs, Descript, etc.)
  - Image generation (via Midjourney, DALL·E, Stable Diffusion)
2. AI/ML System Design & Training
- Fine-tune or train models to:
  - Deliver personalized learning journeys
  - Analyze user progress and suggest improvements
  - Power AI tutors/chatbots that can answer legal and credit-related queries
- Build AI capabilities for:
  - AI picture reader (image-to-text for PDFs, legal forms, etc.)
  - Auto-marking and adaptive assessments
  - Generating dynamic learning feedback
3. LMS Development with Embedded AI
- Co-develop our custom LMS platform with inbuilt AI tools:
  - AI-assisted content editing
  - Admin dashboards with analytics
  - Student activity tracking with real-time insights
- Integrate AI with:
  - Shopify for storefront automation (course sales, upsells)
  - Stripe or Razorpay for payments
  - Multimedia delivery systems
4. Data Engineering & Model Performance
- Create structured data pipelines for:
  - Tracking learner progress
  - Model training (behavioral prediction, quiz scoring)
- Monitor, evaluate, and optimize AI model performance
- Ensure secure data storage, ethical AI usage, and compliance with privacy laws

Required Skills & Qualifications:
Educational Background:
- B.Tech / B.E. / M.Tech in Computer Science, Artificial Intelligence, Data Science, or a related field
Core Technical Skills:
- Python (NumPy, Pandas, FastAPI/Flask)
- Machine Learning & NLP: Scikit-learn, TensorFlow, PyTorch, HuggingFace Transformers
- Familiarity with OpenAI, Claude, Gemini, or similar APIs
- Generative AI (LLMs, text-to-image, text-to-video tools)
- Git/GitHub, REST APIs, containerized deployments (Docker)
AI Tools Experience (Preferred):
- ChatGPT, Claude, Copilot (for content)
- Midjourney, DALL·E, RunwayML, Pika (for media)
- ElevenLabs, Descript, PlayHT (for voiceovers)
- LangChain, Pinecone, Weaviate (for retrieval-augmented generation)
Bonus Skills (Nice to Have):
- Prompt engineering for educational use cases
- Shopify app or store integrations
- Web backend frameworks: Node.js, Django, or similar
- Experience with LMS tools (Moodle, TalentLMS, or custom LMS)
- Google Cloud / AWS / Azure AI services

What You Will Gain:
- Lead the AI vision for a cutting-edge EdTech platform
- Hands-on access to the latest AI tools and APIs
- Creative freedom to experiment with AI-generated education
- Collaborate directly with domain experts, designers, and founders
- Competitive salary with growth-based incentives

Application Process:
To apply, send your:
- Resume / CV
- GitHub or portfolio link (with relevant AI/ML projects)
- Optional: A short note describing an AI tool you've built or integrated

Job Type: Full-time
Pay: ₹17,000.00 - ₹25,000.00 per year
Schedule: Day shift, Monday to Friday
Work Location: In person
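As a hint of what the LLM-driven quiz generation described in this posting can look like in practice, here is a minimal sketch. The function names are illustrative, and the actual model call (which needs an API key) is shown only as a comment:

```python
# Sketch: auto-generating quiz questions with an LLM API.
# build_quiz_prompt and parse_quiz are invented names for illustration.
import json

def build_quiz_prompt(topic: str, n_questions: int = 3) -> str:
    """Build a prompt asking the model for quiz questions as JSON."""
    return (
        f"Write {n_questions} multiple-choice questions about {topic}. "
        'Respond with JSON: [{"question": ..., "options": [...], "answer": ...}]'
    )

def parse_quiz(raw: str) -> list[dict]:
    """Parse and sanity-check the model's JSON reply."""
    quiz = json.loads(raw)
    for item in quiz:
        # Guard against a common LLM failure: an answer not among the options.
        assert item["answer"] in item["options"], "answer must be one of the options"
    return quiz

# In production, the prompt would go to a chat-completion endpoint, e.g.:
# client.chat.completions.create(model="gpt-4o",
#                                messages=[{"role": "user", "content": prompt}])
```

Validating the model's structured output before it reaches learners (as `parse_quiz` does) is the part that tends to matter most in an LMS pipeline.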
Posted 1 week ago
6.0 years
0 Lacs
Hyderābād
Remote
Your opportunity
At New Relic, we provide businesses with a state-of-the-art observability platform, leveraging advanced technologies to deliver real-time insights into the performance of software applications and infrastructure. We enable organizations to monitor, analyze, and optimize their systems to achieve enhanced reliability, performance, and user experience. New Relic is a leader in the industry and has been at the forefront of developing cutting-edge AI/ML solutions to revolutionize observability.

We are seeking an experienced and dynamic Lead Backend Engineer (Python) to join our AI/ML team. You will develop scalable web services and APIs using Python and its extended ecosystem for our Agentic Platform, which will be the nucleus of AI-driven workflows at New Relic. Your responsibilities will include ideating, implementing, and owning the low-level design of the services and leading the rest of the team.

What you'll do
- Drive the design, development, and enhancement of core features and functionalities of our AI platform with micro-services architecture and deliver scalable, secure, and reliable solutions
- Be proactive in identifying and addressing performance bottlenecks, applying optimizations, and maintaining the stability and availability of our platform
- Build thoughtful, high-quality code that is easy to read and maintain
- Collaborate with your team, external contributors, and others to help solve problems
- Write and share proposals to improve team processes and approaches
This role requires
- Bachelor's degree in Computer Science discipline or related field
- 6+ years of experience as a Software Engineer working with Python, developing production-grade applications
- Demonstrated experience in designing, developing, and maintaining large-scale cloud platforms, with a strong understanding of scalable distributed systems and microservices architecture
- Proficiency in back-end frameworks such as Flask/FastAPI; Pydantic for robust models; asyncio and aiohttp for asynchronous request handling; decorators for abstraction; Pytest for testing
- Competency in using Python threading and multiprocessing modules for parallel task execution; knowledge of coroutines; understanding of the GIL and its implications for concurrency
- Experience in building secure infrastructure, having simulated race-condition attacks and injection attacks; leading teams through real incident-management situations with strong debugging skills
- Demonstrated experience working with both relational and NoSQL DBs; message queueing systems (SQS/Kafka/RabbitMQ)
- Up to date with cloud technologies: AWS/Azure/GCP, serverless, Docker, Kubernetes, CI/CD pipelines, among others

Bonus points if you have
- Master's in Computer Science discipline
- Exposure to Machine Learning and GenAI technologies
- Experience with Authentication/Authorization services
- Communication protocols: gRPC
- GraphQL API working knowledge

Please note that visa sponsorship is not available for this position.

Fostering a diverse, welcoming and inclusive environment is important to us. We work hard to make everyone feel comfortable bringing their best, most authentic selves to work every day. We celebrate our talented Relics' different backgrounds and abilities, and recognize the different paths they took to reach us, including nontraditional ones. Their experiences and perspectives inspire us to make our products and company the best they can be.
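A minimal sketch of the asynchronous request-handling pattern this role's requirements mention: coroutines run concurrently on a single thread via the event loop, so the GIL is not a bottleneck for I/O-bound work. The service names are invented, and a real implementation would use aiohttp for the network calls:

```python
# Toy illustration of asyncio-based concurrent I/O (no real network calls).
import asyncio

async def fetch_status(service: str) -> tuple[str, str]:
    """Stand-in for an aiohttp request to a service's health endpoint."""
    await asyncio.sleep(0.01)  # simulate network latency without blocking the loop
    return service, "ok"

async def poll_all(services: list[str]) -> dict[str, str]:
    # gather() schedules all coroutines concurrently; total wall time is
    # roughly one sleep, not one per service.
    results = await asyncio.gather(*(fetch_status(s) for s in services))
    return dict(results)

statuses = asyncio.run(poll_all(["ingest", "api", "worker"]))
print(statuses)  # {'ingest': 'ok', 'api': 'ok', 'worker': 'ok'}
```

For CPU-bound work, by contrast, the GIL does serialize threads, which is why the posting also asks about the multiprocessing module.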
We're looking for people who feel connected to our mission and values, not just candidates who check off all the boxes. If you require a reasonable accommodation to complete any part of the application or recruiting process, please reach out to resume@newrelic.com. We believe in empowering all Relics to achieve professional and business success through a flexible workforce model. This model allows us to work in a variety of workplaces that best support our success, including fully office-based, fully remote, or hybrid.

Our hiring process
In compliance with applicable law, all persons hired will be required to verify identity and eligibility to work and to complete employment eligibility verification. Note: Our stewardship of the data of thousands of customers means that a criminal background check is required to join New Relic. We will consider qualified applicants with arrest and conviction records based on individual circumstances and in accordance with applicable law including, but not limited to, the San Francisco Fair Chance Ordinance.

Headhunters and recruitment agencies may not submit resumes/CVs through this website or directly to managers. New Relic does not accept unsolicited headhunter and agency resumes, and will not pay fees to any third-party agency or company that does not have a signed agreement with New Relic.

Candidates are evaluated based on qualifications, regardless of race, religion, ethnicity, national origin, sex, sexual orientation, gender expression or identity, age, disability, neurodiversity, veteran or marital status, political viewpoint, or other legally protected characteristics.

Review our Applicant Privacy Notice at https://newrelic.com/termsandconditions/applicant-privacy-policy
Posted 1 week ago
5.0 years
6 - 9 Lacs
Hyderābād
On-site
About the job
We are seeking an experienced Data Engineering Specialist interested in challenging the status quo to ensure the seamless creation and operation of the data pipelines needed by Sanofi's advanced analytics, AI, and ML initiatives for the betterment of our global patients and customers.

Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions, to accelerate R&D, manufacturing, and commercial performance and bring better drugs and vaccines to patients faster, to improve health and save lives.

Main Responsibilities:
- Establish technical designs to meet Sanofi requirements aligned with the architectural and data standards
- Own the entire back end of the application, including the design, implementation, test, and troubleshooting of the core application logic, databases, data ingestion and transformation, data processing and orchestration of pipelines, APIs, CI/CD integration, and other processes
- Fine-tune and optimize queries using Snowflake platform and database techniques
- Optimize ETL/data pipelines to balance performance, functionality, and other operational requirements
- Assess and resolve data pipeline issues to ensure performance and timeliness of execution
- Assist with technical solution discovery to ensure technical feasibility
- Assist in setting up and managing CI/CD pipelines and development of automated tests
- Develop and manage microservices using Python
- Conduct peer reviews for quality, consistency, and rigor for production-level solutions
- Design application architecture for efficient concurrent user handling, ensuring optimal performance during high-usage periods
- Own all areas of the product lifecycle: design, development, test, deployment, operation, and support

About you
Qualifications:
- 5+ years of relevant experience developing backend, integration, data pipelining, and infrastructure
- Expertise in database optimization and performance improvement
- Expertise in Python, PySpark, and Snowpark
- Experience with data warehousing and object-relational databases (Snowflake and PostgreSQL) and writing efficient SQL queries
- Experience in cloud-based data platforms (Snowflake, AWS)
- Proficiency in developing robust, reliable APIs using Python and the FastAPI framework
- Expert in ELT and ETL; experience working with large data sets and performance and query optimization. IICS is a plus
- Understanding of data structures and algorithms
- Understanding of DBT is a plus
- Experience in modern testing frameworks (SonarQube; K6 is a plus)
- Strong collaboration skills, willingness to work with others to ensure seamless integration of the server side and client side
- Knowledge of DevOps best practices and associated tools is a plus, especially in the setup, configuration, maintenance, and troubleshooting of:
  - Containers and containerization technologies (Kubernetes, Argo, Red Hat OpenShift)
  - Infrastructure as code (Terraform)
  - Monitoring and logging (CloudWatch, Grafana)
  - CI/CD pipelines (JFrog Artifactory)
  - Scripting and automation (Python, GitHub, GitHub Actions)
  - JIRA & Confluence
  - Workflow orchestration (Airflow)
  - Message brokers (RabbitMQ)
Education:
- Bachelor's degree in computer science, engineering, or a similar quantitative field of study

Why choose us?
- Bring the miracles of science to life alongside a supportive, future-focused team.
- Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally.
- Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact.
- Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs, and at least 14 weeks' gender-neutral parental leave.
- Opportunity to work in an international environment, collaborating with diverse business teams and vendors, working in a dynamic team, and fully empowered to propose and implement innovative ideas.

Pursue Progress. Discover Extraordinary.
Progress doesn't happen without people: people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. You can be one of those people. Chasing change, embracing new ideas and exploring all the opportunities we have to offer. Let's pursue Progress. And let's discover Extraordinary together.

At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity.

Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!

Languages: English is a must
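As a toy illustration of the extract-transform-load work the Sanofi role above centers on, here is a sketch using Python's built-in sqlite3 in place of Snowflake/PostgreSQL. The table and column names are invented for the example:

```python
# Toy ETL step: load raw records, drop dirty rows, aggregate into a clean table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (patient_id TEXT, dose_mg REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("p1", 5.0), ("p1", 7.5), ("p2", None)],  # None models a dirty record
)

# Transform: filter out null doses and aggregate per patient -- the kind of
# cleanup a pipeline does before loading a warehouse table.
conn.execute(
    "CREATE TABLE doses_per_patient AS "
    "SELECT patient_id, SUM(dose_mg) AS total_mg "
    "FROM raw_events WHERE dose_mg IS NOT NULL GROUP BY patient_id"
)
rows = conn.execute("SELECT * FROM doses_per_patient ORDER BY patient_id").fetchall()
print(rows)  # [('p1', 12.5)]
```

In a real Snowflake pipeline the same filter-then-aggregate shape would typically live in an orchestrated SQL or Snowpark step rather than an in-memory database.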
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a highly responsible and accountable Application Support Engineer to provide after-market support for our AMR (Autonomous Mobile Robot) product line. In this role, you will be the primary point of contact for clients facing issues with deployed AMR solutions, ensuring timely and effective resolution. You will also play a key role in deploying CI/CD updates and maintaining robust automation workflows. The ideal candidate will possess strong troubleshooting skills, a deep understanding of robotic systems, and the ability to manage both technical and customer-facing tasks.

After-Market Support:
- Serve as the first point of contact for clients experiencing issues with AMR systems, including UI, localization, sensor calibration, and integration problems.
- Investigate and resolve complex system-level issues, collaborating with cross-functional teams as needed.
- Document issues, root causes, and solutions for internal knowledge sharing and continuous improvement.

CI/CD Deployment:
- Regularly deploy CI/CD updates to client sites, ensuring minimal downtime and seamless integration.
- Manage and troubleshoot deployment pipelines, including telemetry, logging, and update rollouts.
- Maintain and improve deployment scripts and automation tools.
- Assist team members in configuring their own testing environments and understanding workflows.

Site Deployment and Support:
- Participate in site setup, deployment, and ongoing support for AMR solutions.
- Monitor system health, telemetry, and logs to proactively identify and address potential issues.
- Implement small updates and patches as requested by the team or clients.

Continuous Improvement:
- Analyze recurring issues and contribute to root cause analysis and solution strategies.
- Provide feedback to development teams to improve product reliability and performance.
- Stay up to date with the latest advancements in AMR technology, CI/CD practices, and automation tools.

Requirements
Technical Skills:
- 3-5 years of relevant experience in technical support, automation, or deployment roles for robotics, automation, or IoT systems.
- Strong troubleshooting and debugging skills in complex robotic systems (ROS/ROS2 experience preferred).
- C++/Python, Git, Linux, Docker, FastAPI.
- Proficiency in setting up and maintaining testing environments (E2E, integration, automation).
- Knowledge of sensor integration (LIDAR, RealSense, etc.) and localization systems.
- Familiarity with Linux environments and command-line tools.
- Scripting skills (Python, Bash) for automation and deployment tasks.

Product Knowledge:
- Understanding of AMR architecture, workflows, and integration with warehouse management systems.
- Experience with UI/UX troubleshooting and error diagnostics.
- Knowledge of telemetry, logging, and system health monitoring.

Soft Skills:
- Excellent communication and customer service skills for client interactions.
- High level of responsibility and accountability for issue resolution and deployment tasks.
- Ability to work collaboratively in a team and support colleagues with technical guidance.
- Proactive problem-solving and a continuous improvement mindset.

Benefits
- Exciting and challenging problems addressed using wide-ranging technologies and tools.
- Competitive salary.
- Great team culture, peers, and workplace.
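One of the proactive-monitoring duties in this posting, flagging unhealthy robots from telemetry logs, can be sketched as below. The log format, robot ids, and threshold are invented for illustration:

```python
# Toy telemetry health check: count ERROR lines per robot and flag outliers.
from collections import Counter

def flag_unhealthy(log_lines: list[str], max_errors: int = 2) -> set[str]:
    """Return robot ids with more than max_errors ERROR entries."""
    errors = Counter()
    for line in log_lines:
        robot_id, level, _msg = line.split(" ", 2)  # "<id> <LEVEL> <message>"
        if level == "ERROR":
            errors[robot_id] += 1
    return {rid for rid, n in errors.items() if n > max_errors}

logs = [
    "amr-01 INFO localization ok",
    "amr-02 ERROR lidar timeout",
    "amr-02 ERROR lidar timeout",
    "amr-02 ERROR pose lost",
    "amr-03 INFO battery 87%",
]
print(flag_unhealthy(logs))  # {'amr-02'}
```

A production version would read a log stream or telemetry API rather than a list, and would likely alert on error rate over a time window instead of a raw count.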
Posted 1 week ago
1.0 - 2.0 years
0 Lacs
Delhi
On-site
NASSCOM Campus, Sector 126, Noida, NCR

About the Role
We are seeking a dynamic and technically proficient AI/ML Engineer to support our AI/ML R&D initiatives in cybersecurity and take ownership of TechSagar.in, a knowledge repository for India's emerging technology capabilities. The ideal candidate will possess hands-on experience in generative AI, emerging technologies, and product management. This is a hybrid role combining deep technical development with stakeholder engagement and platform evangelism.

Key Responsibilities
AI/ML & Cybersecurity Innovation:
- Support R&D efforts to prototype generative AI models for real-time threat detection and cybersecurity.
- Design, develop, and deploy machine learning models tailored to cyber threat intelligence and anomaly detection.
- Research and implement novel AI approaches, including multi-agent and reasoning-based systems.
- Develop distributed security monitoring frameworks using tools like AutoGen, CrewAI, etc.
- Build LLM-powered threat analysis tools using LangChain and LlamaIndex, and integrate them with enterprise infrastructure.
- Apply MLOps best practices for model deployment, performance monitoring, and continuous integration.
- Optimize vector stores (Qdrant, FAISS, Pinecone, etc.) for RAG-based systems.
- Create synthetic datasets for AI training and model evaluation.
- Use Pydantic for data validation within AI pipelines.

TechSagar Product Responsibilities:
- Manage and evolve the TechSagar.in platform, enhancing features, ensuring data integrity, and driving usage.
- Liaise with tech partners, government bodies, startups, and academia to enrich platform content.
- Strategize and execute industry engagement plans to market TechSagar and establish its relevance.
- Represent TechSagar in external forums, conferences, and industry meetings.
- Collect user feedback, define the product roadmap, and ensure alignment with AI/ML advancements.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, or a related field.
- 1-2 years of hands-on experience in AI/ML model development and deployment.
- Strong programming expertise in Python.
- Familiarity with LangChain, LlamaIndex, and large language models (LLMs).
- Experience in applying AI to cybersecurity or vulnerability analysis.
- Good understanding of machine learning algorithms, data pipelines, and model evaluation.
- Excellent communication skills for technical and stakeholder engagement.

Preferred Skills
- Exposure to generative AI, LLMs, and chain-of-thought reasoning techniques.
- Working knowledge of MLOps tools such as MLflow, Docker, etc.
- Familiarity with FastAPI or Flask for API development.
- Ability to preprocess, clean, and analyze large datasets efficiently.
- Experience in integrating AI tools with legacy or existing security systems.

Technologies & Frameworks
- LLM Frameworks: LangChain, LlamaIndex
- Multi-agent Systems: AutoGen, CrewAI
- Vector Databases: FAISS, Pinecone, Qdrant, Elasticsearch, AstraDB
- MLOps Tools: MLflow, Docker
- Programming & APIs: Python, FastAPI/Flask
- Data Validation: Pydantic

Why Join Us?
- Be at the forefront of AI innovation in cybersecurity and national technology initiatives.
- Lead and shape a strategic tech product (TechSagar) with national impact.
- Collaborate with thought leaders in the AI, cybersecurity, and emerging tech ecosystem.
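A toy version of the vector-store retrieval step in the RAG pipelines this role mentions: in production this would use FAISS or Qdrant with learned embeddings, but the core nearest-neighbor idea looks like the sketch below. The document names and 3-d "embeddings" are made up:

```python
# Toy RAG retrieval: pick the document whose embedding is closest to the
# query embedding by cosine similarity.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

corpus = {
    "phishing indicators": [0.9, 0.1, 0.0],
    "ransomware response": [0.1, 0.9, 0.1],
    "vector db tuning": [0.0, 0.2, 0.9],
}

def retrieve(query_vec):
    """Return the corpus document most similar to the query embedding."""
    return max(corpus, key=lambda doc: cosine(query_vec, corpus[doc]))

print(retrieve([0.8, 0.2, 0.1]))  # phishing indicators
```

Dedicated vector databases exist because this linear scan does not scale; they replace `max(...)` with approximate nearest-neighbor indexes over millions of high-dimensional vectors.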
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About The Role
Grade Level (for internal use): 03

Who We Are
Kensho is a 120-person AI and machine learning company within S&P Global. With expertise in machine learning and data discovery, we develop and deploy novel solutions for S&P Global and its customers worldwide. Our solutions help businesses harness the power of data and Artificial Intelligence to innovate and drive progress. Kensho's solutions and research focus on speech recognition, entity linking, document extraction, automated database linking, text classification, natural language processing, and more. Are you looking to solve hard problems and enjoy working with teammates with diverse perspectives? If so, we would love to help you excel here at Kensho.

About The Team
Kensho's Applications group develops the web apps and APIs that deliver Kensho's AI capabilities to our customers. Our teams are small, product-focused, and intent on shipping high-quality code that best leverages our efforts. We're collegial, humble, and inquisitive, and we delight in learning from teammates with backgrounds, skills, and interests different from our own.

The Kensho Link team, within the Applications department, builds a machine learning service that allows users to map entities in their datasets to unique entities drawn from S&P Global's world-class company database with precision and speed. Link started as an internal Kensho project to help the S&P Global Market Intelligence team integrate datasets more quickly into their platform. It uses ML-based algorithms trained to return high-quality links, even when the data inputs are incomplete or contain errors. In simple words, Kensho's Link product connects the scattered information about a company in one place, and it does so at scale. Link leverages a variety of NLP and ML techniques to process and link millions of company entities in hours.
About The Role
As a Senior Backend Engineer you will develop reliable, secure, and performant APIs that apply Kensho's AI capabilities to specific customer workflows. You will collaborate with colleagues from Product, Machine Learning, Infrastructure, and Design, as well as with other engineers within Applications. You have a demonstrated capacity for depth, and are comfortable working with a broad range of technologies. Your verbal and written communication is proactive, efficient, and inclusive of your geographically-distributed colleagues. You are a thoughtful, deliberate technologist and share your knowledge generously.

Equivalent to Grade 11 Role (Internal)

You Will
- Design, develop, test, document, deploy, maintain, and improve software
- Manage individual project priorities, deadlines, and deliverables
- Work with key stakeholders to develop system architectures, API specifications, implementation requirements, and complexity estimates
- Test assumptions through instrumentation and prototyping
- Promote ongoing technical development through code reviews, knowledge sharing, and mentorship
- Optimize Application Scaling: Efficiently scale ML applications to maximize compute resource utilization and meet high customer demand
- Address Technical Debt: Proactively identify and propose solutions to reduce technical debt within the tech stack
- Enhance User Experiences: Collaborate with Product and Design teams to develop ML-based solutions that enhance user experiences and align with business goals
- Ensure API security and data privacy by implementing best practices and compliance measures
- Monitor and analyze API performance and reliability, making data-driven decisions to improve system health
- Contribute to architectural discussions and decisions, ensuring scalability, maintainability, and performance of the backend systems
Qualifications
- At least 5+ years of direct experience developing customer-facing APIs within a team
- Thoughtful and efficient communication skills (both verbal and written)
- Experience developing RESTful APIs using a variety of tools
- Experience turning abstract business requirements into concrete technical plans
- Experience working across many stages of the software development lifecycle
- Sound reasoning about the behavior and performance of loosely-coupled systems
- Proficiency with algorithms (including time and space complexity analysis), data structures, and software architecture
- At least one domain of demonstrable technical depth
- Familiarity with CI/CD practices and tools to streamline deployment processes
- Experience with containerization technologies (e.g., Docker, Kubernetes) for application deployment and orchestration

Technologies We Love
- Python, Django, FastAPI
- mypy, OpenAPI
- RabbitMQ, Celery, Kafka
- OpenSearch, PostgreSQL, Redis
- Git, Jsonnet, Jenkins, Docker, Kubernetes
- Airflow, AWS, Terraform
- Grafana, Prometheus
- ML libraries: PyTorch, Scikit-learn, Pandas

What's In It For You?

Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all.
From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values
Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.

Our Benefits Include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Inclusive Hiring And Opportunity At S&P Global
At S&P Global, we are committed to fostering an inclusive workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and equal opportunity, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.

If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), BSMGMT203 - Entry Professional (EEO Job Group)
Job ID: 312713
Posted On: 2025-04-15
Location: Hyderabad, Telangana, India
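The Kensho posting above emphasizes API performance and reliability; a token-bucket rate limiter is one classic building block behind those goals. Here is a minimal sketch (the class, parameters, and numbers are invented for illustration, not Kensho's code):

```python
# Toy token-bucket rate limiter: each request spends one token; tokens
# refill continuously at a fixed rate, allowing short bursts up to capacity.
class TokenBucket:
    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_per_sec = refill_per_sec
        self.last = 0.0  # timestamp of the previous allow() call

    def allow(self, now: float) -> bool:
        """Refill based on elapsed time, then spend one token if available."""
        elapsed = now - self.last
        self.last = now
        self.tokens = min(self.capacity, self.tokens + elapsed * self.refill_per_sec)
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(capacity=2, refill_per_sec=1.0)
decisions = [bucket.allow(t) for t in (0.0, 0.1, 0.2, 1.5)]
print(decisions)  # [True, True, False, True]
```

Time is passed in explicitly to keep the sketch deterministic; a real middleware would read the clock itself and keep one bucket per client or API key.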
Posted 1 week ago
0 years
0 Lacs
Noida
On-site
Hello! You've landed on this page, which means you're interested in working with us. Let's take a sneak peek at what it's like to work at Innovaccer.

Engineering at Innovaccer
With every line of code, we accelerate our customers' success, turning complex challenges into innovative solutions. Collaboratively, we transform each data point we gather into valuable insights for our customers. Join us and be part of a team that's turning dreams of better healthcare into reality, one line of code at a time. Together, we're shaping the future and making a meaningful impact on the world.

About the Role
The technology that once promised to simplify patient care has brought more issues than anyone ever anticipated. At Innovaccer, we defeat this beast by making full use of all the data healthcare has worked so hard to collect, and replacing long-standing problems with ideal solutions. Data is our bread and butter for innovation. We are looking for interns on our AI team who are passionate about solving real-world healthcare problems. In this role, they will leverage healthcare data to build algorithms that personalize treatments based on the clinical and behavioural history of patients. We are looking for superstars who will define and build the next generation of predictive analytics tools in healthcare.

A Day in the Life
- Development of various artificial intelligence initiatives to help improve the health and wellness of patients
- Write well-documented, maintainable Python code
- Coordinate with the other team members in the development processes
- Work with leaders and other stakeholders to understand their pain points and build large-scale solutions
- Work with our data platform team to help them successfully integrate the AI capability or algorithms into the platform
- Define and execute on the roadmap

What You Need
- Strong hands-on experience in Python
- Understanding of Python frameworks like FastAPI, Django, etc.
- Understanding of LLMs and agentic frameworks
- Experience in manipulating/transforming data, model selection, and model training
- Knowledge of NLP libraries like spaCy, NLTK
- Understanding of deep learning frameworks like Keras, TensorFlow
- Strong written and spoken communication skills
- Ability to coordinate management activities with others on the team

The core skills we are seeking include:
- Python
- Django / Flask / FastAPI

Additionally, the following skills are good to have:
- Power BI, Generative AI, LLMs, LangChain
- Deep Learning, Machine Learning, Natural Language Processing (NLP)

Here's What We Offer
- Generous Leave Benefits: Enjoy generous leave benefits of up to 40 days.
- Parental Leave: Experience one of the industry's best parental leave policies to spend time with your new addition.
- Sabbatical Leave Policy: Want to focus on skill development, pursue an academic career, or just take a break? We've got you covered.
- Health Insurance: We offer health benefits and insurance to you and your family for medically related expenses related to illness, disease, or injury.
- Pet-Friendly Office*: Spend more time with your treasured friends, even when you're away from home. Bring your furry friends with you to the office and let your colleagues become their friends, too. *Noida office only
- Creche Facility for children*: Say goodbye to worries and hello to a convenient and reliable creche facility that puts your child's well-being first. *India offices

Where and how we work
Our Noida office is situated in a posh tech space, equipped with various amenities to support our work environment. Here, we follow a five-day work schedule, allowing us to efficiently carry out our tasks and collaborate effectively within our team.

Innovaccer is an equal-opportunity employer.
We celebrate diversity, and we are committed to fostering an inclusive and diverse workplace where all employees, regardless of race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, marital status, or veteran status, feel valued and empowered. About Innovaccer Innovaccer Inc. is the data platform that accelerates innovation. The Innovaccer platform unifies patient data across systems and care settings and empowers healthcare organizations with scalable, modern applications that improve clinical, financial, operational, and experiential outcomes. Innovaccer’s EHR-agnostic solutions have been deployed across more than 1,600 hospitals and clinics in the US, enabling care delivery transformation for more than 96,000 clinicians, and helping providers work collaboratively with payers and life sciences companies. Innovaccer has helped its customers unify health records for more than 54 million people and generate over $1.5 billion in cumulative cost savings. The Innovaccer platform is the #1 rated Best-in-KLAS data and analytics platform by KLAS, and the #1 rated population health technology platform by Black Book. For more information, please visit innovaccer.com. Check us out on YouTube, Glassdoor, LinkedIn, and innovaccer.com. Disclaimer: Innovaccer does not charge fees or require payment from individuals or agencies for securing employment with us. We do not guarantee job spots or engage in any financial transactions related to employment. If you encounter any posts or requests asking for payment or personal information, we strongly advise you to report them immediately to our HR department at px@innovaccer.com. Additionally, please exercise caution and verify the authenticity of any requests before disclosing personal and confidential information, including bank account details.
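To give a flavor of the NLP preprocessing work such an internship might involve, here is a minimal, dependency-free sketch (the function name and tiny stopword list are illustrative only; production pipelines would use spaCy or NLTK, as the posting notes):

```python
import re

# Toy stopword list for illustration; real work would use a curated one.
STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is"}

def preprocess(text: str) -> list[str]:
    """Lowercase, tokenize on alphanumeric runs, and drop common stopwords."""
    tokens = re.findall(r"[a-z0-9]+", text.lower())
    return [t for t in tokens if t not in STOPWORDS]

print(preprocess("The patient is recovering in the cardiology ward"))
# → ['patient', 'recovering', 'cardiology', 'ward']
```

The same normalize-tokenize-filter shape is the usual first stage before feature extraction or embedding, whatever library ultimately does the heavy lifting.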
Posted 1 week ago
1.0 years
0 - 0 Lacs
Indore
On-site
Python Developer – Flask & FastAPI Expert

Location: Indore (Onsite Only)
Experience: 1 to 2+ years (freshers/trainees with live projects and strong technical skills can also apply)
Salary: Not a constraint for the deserving candidate – hike on current CTC!

Key Skills Required
- Expertise in Flask & FastAPI (Django not required)
- Experience in live project development & deployment
- Strong knowledge of server management & maintenance
- Understanding of REST APIs & clean code architecture
- Database handling (MySQL, MongoDB, etc.)
- Excellent client communication & project-handling skills

Apply Now: hr.technorizen@gmail.com | hr@technorizen.com
Contact Us: +91 88500 4047

Let's build your next big opportunity – together!
#PythonDeveloper #Flask #FastAPI #IndoreJobs #HiringNow #TechnorizenHiring #OnsiteJob #FreshersWelcome #CTCHike #BackendDeveloper

Job Types: Full-time, Permanent
Pay: ₹8,374.56 - ₹35,504.03 per month
Benefits: Health insurance, paid sick time
Schedule: Day shift, weekend availability
Work Location: In person
Posted 1 week ago
4.0 - 8.0 years
4 - 8 Lacs
Hyderabad, Pune
Work from Office
Hi, we are hiring a Python Developer with 4+ years of experience. This is a full-time, work-from-office job in Hyderabad/Pune. Please send your resume to recruiter1@lightuptechnologies.com

Mandatory requirements for resume submission:
- All employment start (date of joining) and end (last working date) dates with month and year
- All educational details, including start and end dates with month and year
- A mobile number, email ID, and LinkedIn profile URL for all profiles
- Candidate PAN card copy, Aadhaar card copy, passport copy, and resume (candidates missing any of these documents will not be considered)
- Current CTC
- Expected CTC
- Current location

Notice Period: 10 days (immediate joiners only)
CTC: 4 LPA - 8 LPA (based on your interview)
Mandatory Skills: Microservices, Flask, and FastAPI

Roles and Responsibilities:
- Experience in Python development, creating APIs, problem solving, debugging, and writing API automation scripts in Python
- Must have experience in microservices
- Must have experience in Flask and FastAPI; able to build an API using either
- Hands-on experience with PostgreSQL and SQLAlchemy
- Able to create and consume RESTful web services
- Good experience with Pandas, NumPy, data structures, MongoDB, and OpenCV
- Exposure to writing Python code with multithreading and multiprocessing
- Experience working with Agile methodology
- Experience with data visualization libraries like Matplotlib, OpenCV, and ReportLab
- Writing efficient, reusable, testable, and scalable code
- Good hands-on scripting experience
- Multi-process and multi-thread architecture; performance tuning and automation of applications
- Testing and debugging software applications with Python test framework tools
- Coming up with digital tools for online traffic monitoring
- Ability to integrate databases and various data sources into a unified system

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 4+ years of professional experience as a Python developer
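The multithreading/multiprocessing requirement in the listing above boils down to choosing the right executor from the standard library's `concurrent.futures`. A minimal sketch (the `fetch` function and URLs are placeholders, not from the posting):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(url: str) -> str:
    """Stand-in for an I/O-bound call, e.g. hitting a REST endpoint."""
    return f"response from {url}"

urls = [f"https://api.example.com/item/{i}" for i in range(3)]

# Threads suit I/O-bound work (the GIL is released while waiting);
# swap in ProcessPoolExecutor for CPU-bound work to use multiple cores.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(fetch, urls))

print(results[0])  # → response from https://api.example.com/item/0
```

Because both executors share the same `map`/`submit` interface, switching a pipeline between threads and processes is usually a one-line change.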
Posted 1 week ago
3.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Purpose:
The job holder will be responsible for coding, designing, deploying, and debugging development projects; taking part in analysis, requirement gathering, and design; and owning and delivering the automation of data engineering pipelines.

Roles and Responsibilities:
- Solid understanding of backend performance optimization and debugging
- Formal training or certification in software engineering concepts and proficient applied experience
- Strong hands-on experience with Python
- Experience in developing microservices using Python with FastAPI
- Commercial experience in both backend and frontend engineering
- Hands-on experience with AWS cloud-based application development, including EC2, ECS, EKS, Lambda, SQS, SNS, RDS (Aurora MySQL & Postgres), DynamoDB, EMR, and Kinesis
- Strong engineering background in machine learning, deep learning, and neural networks
- Experience with a containerized stack using Kubernetes or ECS for development, deployment, and configuration
- Experience with Single Sign-On/OIDC integration and a deep understanding of OAuth and JWT/JWE/JWS
- Knowledge of AWS SageMaker and data analytics tools
- Proficiency in frameworks such as TensorFlow, PyTorch, or similar
- Familiarity with LangChain, LangGraph, or any agentic framework is a strong plus
- Python engineering experience

Education Qualification:
Graduation: Bachelor of Science (B.Sc) / Bachelor of Technology (B.Tech) / Bachelor of Computer Applications (BCA)
Post-Graduation: Master of Science (M.Sc) / Master of Technology (M.Tech) / Master of Computer Applications (MCA)

Experience: 3-8 years.
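On the OAuth/JWT point above: a JWT is three base64url-encoded segments joined by dots (header.payload.signature). A stdlib-only sketch of inspecting one — note it deliberately does NOT verify the signature, which any real OIDC integration must do (typically with a library such as PyJWT against the provider's keys):

```python
import base64
import json

def b64url_decode(segment: str) -> bytes:
    """base64url decode with stripped '=' padding restored."""
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def inspect_jwt(token: str) -> tuple[dict, dict]:
    """Return (header, claims). Does NOT verify the signature."""
    header_b64, payload_b64, _signature = token.split(".")
    return (json.loads(b64url_decode(header_b64)),
            json.loads(b64url_decode(payload_b64)))

def b64url(data: dict) -> str:
    """Encode a dict as a base64url JWT segment (padding stripped)."""
    raw = base64.urlsafe_b64encode(json.dumps(data).encode())
    return raw.decode().rstrip("=")

# Build a toy unsigned token purely to demonstrate the structure.
token = f'{b64url({"alg": "none"})}.{b64url({"sub": "user-1"})}.'
header, claims = inspect_jwt(token)
print(claims["sub"])  # → user-1
```

JWE (encrypted) and JWS (signed) tokens share this segment layout, which is why the listing groups them together.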
Posted 1 week ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Us
Groundbreaker. Game changer. Pioneer. TRC has long set the bar for clients who require more than just engineering, combining science with the latest technology to devise innovative solutions that stand the test of time. From pipelines to power plants, roadways to reservoirs, schoolyards to security solutions, clients look to TRC for breakthrough thinking backed by the innovative follow-through of an industry leader. TRC's professionals work with a broad range of commercial, industrial, and government clients and the communities they serve. We deliver breakthrough solutions that address local needs so our clients can better succeed in an ever-changing world. Working at TRC means tackling interesting, meaningful projects. We pride ourselves on our collaborative spirit, entrepreneurial zeal, and agile corporate structure. We recognize that the expertise of our staff is our strongest asset, so we generously reward employees for successful performance and invest in their careers through training and the development of new skills and certifications.

Overview
We are seeking a highly skilled Full Stack Developer with 5-8 years of experience specializing in FastAPI, Python, and ReactJS to join our dynamic team as a member of the technical team. The ideal candidate will have a strong background in REST and GraphQL API development, front-end frameworks (React Hooks, Redux Store, Material UI, Axios), and databases (MongoDB/DocumentDB). You will be responsible for developing and maintaining scalable, high-performance web applications, ensuring seamless integration between front-end and back-end components. Strong communication and technical skills to effectively communicate with other members of the team are a must.

Responsibilities
- Work as an independent developer on product features.
- Work closely with tech leads to implement features to a high standard of quality, aiming to get them right the first time.
- Work closely with the development team to deliver as per the plan.
- Work closely with QA and BAs to understand requirements.
- Mentor team members to maintain the quality of deliverables.
- Estimate the effort and schedule for features, enhancements, and bugs.
- Debug, analyze, and find the root cause of any bugs in the product features.

Qualifications
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- 5-8 years of experience in software development with a focus on full-stack development, especially product development.
- Strong knowledge of Python with object-oriented programming.
- Strong experience in FastAPI development with REST and GraphQL.
- Strong experience in ReactJS, React Hooks, Redux Store, Material UI, and Axios.
- Strong experience in MongoDB or DocumentDB.
- Exceptional verbal and written communication skills in English, adept at communicating with offshore and onshore teams.
- Exposure to Agile and Scrum methodologies.
- Excellent organizational skills.

Preferred Qualifications
- Experience in the efficiency segment of the energy domain.
- Master's degree in Computer Science, Software Engineering, or a related field.
- Experience with CRM systems.
- Exposure to AWS web services is desirable.
- Relevant certifications in the technologies mentioned above.
- Good to have: hands-on experience with HTML5 and CSS3 frameworks.

EEO Statement
TRC is an Equal Opportunity Employer. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, veteran status, or any other characteristic protected by applicable law. All employment decisions are made based on qualifications, merit, and business needs. We celebrate diversity and are committed to creating an inclusive environment for all employees. The complete job description and application are available on TRC's career site. TRC accepts applications for this position on an ongoing, rolling basis and reserves the right to cancel this posting at any time.
Posted 1 week ago
4.0 - 9.0 years
6 - 12 Lacs
Hyderabad
Work from Office
ABOUT THE ROLE

Role Description:
We are seeking an accomplished and visionary Data Scientist / GenAI Developer to join Amgen's Enterprise Data Management team. As part of the MDM team, you will be responsible for designing, developing, and deploying Generative AI and ML models to power data-driven decisions across business domains. This role is ideal for an AI practitioner who thrives in a collaborative environment and brings a strategic mindset to applying advanced AI techniques to solve real-world problems. To succeed in this role, the candidate must have strong AI/ML, data science, and GenAI experience along with MDM knowledge; candidates with only MDM experience are not eligible. The candidate must have AI/ML, data science, and GenAI experience with technologies such as PySpark, PyTorch, TensorFlow, LLMs, AutoGen, Hugging Face, vector databases, embeddings, and RAG, along with knowledge of MDM (Master Data Management).

Roles & Responsibilities:
- Develop enterprise-level GenAI applications using LLM frameworks such as LangChain, AutoGen, and Hugging Face.
- Design and develop intelligent pipelines using PySpark, TensorFlow, and PyTorch within Databricks and AWS environments.
- Implement embedding models and manage vector stores for retrieval-augmented generation (RAG) solutions.
- Integrate and leverage MDM platforms like Informatica and Reltio to supply high-quality structured data to ML systems.
- Utilize SQL and Python for data engineering, data wrangling, and pipeline automation.
- Build scalable APIs and services to serve GenAI models in production.
- Lead cross-functional collaboration with data scientists, engineers, and product teams to scope, design, and deploy AI-powered systems.
- Ensure model governance, version control, and auditability aligned with regulatory and compliance expectations.

Basic Qualifications and Experience:
- Master's degree with 4-6 years of experience in Business, Engineering, IT, or a related field, OR
- Bachelor's degree with 6-9 years of experience in Business, Engineering, IT, or a related field, OR
- Diploma with 10-12 years of experience in Business, Engineering, IT, or a related field

Functional Skills:

Must-Have Skills:
- 6+ years of experience working in AI/ML or Data Science roles, including designing and implementing GenAI solutions.
- Extensive hands-on experience with LLM frameworks and tools such as LangChain, AutoGen, Hugging Face, OpenAI APIs, and embedding models.
- Strong programming background with Python and PySpark, and experience building scalable solutions using TensorFlow, PyTorch, and scikit-learn.
- Proven track record of building and deploying AI/ML applications in cloud environments such as AWS.
- Expertise in developing APIs, automation pipelines, and serving GenAI models using frameworks like Django, FastAPI, and Databricks.
- Solid experience integrating and managing MDM tools (Informatica/Reltio) and applying data governance best practices.
- Ability to guide the team on development activities and lead solution discussions.
- Core technical capabilities in the GenAI and data science space.

Good-to-Have Skills:
- Prior experience in data modeling, ETL development, and data profiling to support AI/ML workflows.
- Working knowledge of Life Sciences or Pharma industry standards and regulatory considerations.
- Proficiency in tools like JIRA and Confluence for Agile delivery and project collaboration.
- Familiarity with MongoDB, vector stores, and modern architecture principles for scalable GenAI applications.

Professional Certifications:
- Any ETL certification (e.g., Informatica)
- Any data analysis certification (SQL)
- Any cloud certification (AWS or Azure)
- Data Science and ML certification

Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions.
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
- Effective problem-solving skills to address data-related issues and implement scalable solutions.
- Ability to work effectively with global, virtual teams.
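The retrieval step of the RAG solutions described above reduces to nearest-neighbor search over embeddings. A dependency-free sketch of that core idea (the 3-dimensional vectors and document names are toy stand-ins for real embedding-model output and a real vector store):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Tiny in-memory "vector store": document -> toy embedding.
store = {
    "drug master record": [0.9, 0.1, 0.0],
    "supplier address":   [0.1, 0.8, 0.2],
    "batch yield report": [0.0, 0.2, 0.9],
}

def retrieve(query_vec: list[float], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query embedding."""
    ranked = sorted(store, key=lambda doc: cosine(store[doc], query_vec),
                    reverse=True)
    return ranked[:k]

print(retrieve([0.85, 0.15, 0.05]))  # → ['drug master record']
```

A production vector store replaces the linear scan with an approximate index, but the contract — embed the query, rank by similarity, pass the top-k documents to the LLM — is exactly this.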
Posted 1 week ago
3.0 years
0 - 0 Lacs
Chandigarh, India
Remote
Experience: 3+ years
Salary: GBP 1785-2500 / month (based on experience)
Expected Notice Period: 15 days
Shift: (GMT+01:00) Europe/London (BST)
Opportunity Type: Remote
Placement Type: Full-time contract for 6 months (40 hrs a week / 160 hrs a month)
(Note: This is a requirement for one of Uplers' clients – the UK's leading AgriTech company.)

What do you need for this opportunity?
Must-have skills: AgriTech industry, Large Language Models, NVIDIA Jetson, Raspberry Pi, Blender, Computer Vision, OpenCV, Python, PyTorch/TensorFlow, segmentation, extraction, regression

UK's Leading AgriTech Company is Looking for:

Location: Remote
Type: 6-month contract
Experience Level: 3-5 years
Industry: AgriTech | Sustainability | AI for Renewables

About Us
We're an AI-first company transforming the renewable and sustainable agriculture space. Our mission is to harness advanced computer vision and machine learning to enable smart, data-driven decisions in the livestock and agricultural ecosystem. We focus on practical applications such as automated weight estimation of cattle, livestock monitoring, and resource optimization to drive a more sustainable food system.

Role Overview
We are hiring a Computer Vision Engineer to develop intelligent image-based systems for livestock management, focusing on cattle weight estimation from images and video feeds. You will be responsible for building scalable vision pipelines, working with deep learning models, and bringing AI to production in real-world farm settings.

Key Responsibilities
- Design and develop vision-based models to predict cattle weight from 2D/3D images, video, or depth data.
- Build image acquisition and preprocessing pipelines using multi-angle camera data.
- Implement classical and deep learning-based feature extraction techniques (e.g., body measurements, volume estimation).
- Conduct camera calibration, multi-view geometry analysis, and photogrammetry for size inference.
- Apply deep learning architectures (e.g., CNNs, ResNet, UNet, Mask R-CNN) for object detection, segmentation, and keypoint localization.
- Build 3D reconstruction pipelines using stereo imaging, depth sensors, or photogrammetry.
- Optimize and deploy models for edge devices (e.g., NVIDIA Jetson) or cloud environments.
- Collaborate with data scientists and product teams to analyze livestock datasets, refine prediction models, and validate outputs.
- Develop tools for automated annotation, model training pipelines, and continuous performance tracking.

Required Qualifications & Skills
- Computer Vision: Object detection, keypoint estimation, semantic/instance segmentation, stereo imaging, and structure-from-motion.
- Weight Estimation Techniques: Experience in livestock monitoring, body condition scoring, and volumetric analysis from images/videos.
- Image Processing: Noise reduction, image normalization, contour extraction, 3D reconstruction, and camera calibration.
- Data Analysis & Modeling: Statistical modeling, regression techniques, and feature engineering for biological data.

Technical Stack
- Programming Languages: Python (mandatory)
- Libraries & Frameworks: OpenCV, PyTorch, TensorFlow, Keras, scikit-learn
- 3D Processing: Open3D, PCL (Point Cloud Library), Blender (optional)
- Data Handling: NumPy, Pandas, DVC
- Annotation Tools: LabelImg, CVAT, Roboflow
- Cloud & DevOps: AWS/GCP, Docker, Git, CI/CD pipelines
- Deployment Tools: ONNX, TensorRT, FastAPI, Flask (for model serving)

Preferred Qualifications
- Prior experience working in agritech, animal husbandry, or precision livestock farming.
- Familiarity with Large Language Models (LLMs) and integrating vision + language models for domain-specific insights.
- Knowledge of edge computing for on-farm device deployment (e.g., NVIDIA Jetson, Raspberry Pi).
- Contributions to open-source computer vision projects or relevant publications in CVPR, ECCV, or similar conferences.

Soft Skills
- Strong problem-solving and critical thinking skills
- Clear communication and documentation practices
- Ability to work independently and collaborate in a remote, cross-functional team

Why Join Us?
- Work at the intersection of AI and sustainability
- Be part of a dynamic and mission-driven team
- Opportunity to lead innovation in an emerging field of agritech
- Flexible remote work environment

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
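At its simplest, the core modeling task in the listing above — predicting weight from image-derived body measurements — is a regression problem. A minimal ordinary-least-squares sketch on made-up heart-girth/weight pairs (illustrative numbers only, not real livestock data; production models would feed richer features into scikit-learn or a deep network):

```python
def fit_line(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Hypothetical heart-girth (cm) vs weight (kg) pairs, e.g. extracted
# from segmentation masks and keypoint measurements.
girth = [150.0, 160.0, 170.0, 180.0]
weight = [300.0, 350.0, 400.0, 450.0]

slope, intercept = fit_line(girth, weight)
print(round(slope * 175.0 + intercept))  # → 425
```

The vision pipeline's job is to produce reliable measurements like `girth` from pixels; the regression layer that maps them to weight can then stay simple and auditable.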
Posted 1 week ago
3.0 years
0 - 0 Lacs
Thiruvananthapuram, Kerala, India
Remote
Experience : 3.00 + years Salary : GBP 1785-2500 / month (based on experience) Expected Notice Period : 15 Days Shift : (GMT+01:00) Europe/London (BST) Opportunity Type : Remote Placement Type : Full Time Contract for 6 Months(40 hrs a week/160 hrs a month) (*Note: This is a requirement for one of Uplers' client - UK's Leading AgriTech Company) What do you need for this opportunity? Must have skills required: AgriTech Industry, Large Language Models, Nvidia Jetson, Raspberry PI, Blender, Computer Vision, opencv, Python, Pytorch/tensorflow, Segmentation, Extraction, Regression UK's Leading AgriTech Company is Looking for: Location: Remote Type: 6 months contract Experience Level : 3–5 Years Industry: Agritech | Sustainability | AI for Renewables About Us We're an AI-first company transforming the renewable and sustainable agriculture space. Our mission is to harness advanced computer vision and machine learning to enable smart, data-driven decisions in the livestock and agricultural ecosystem. We focus on practical applications such as automated weight estimation of cattle , livestock monitoring, and resource optimization to drive a more sustainable food system. Role Overview We are hiring a Computer Vision Engineer to develop intelligent image-based systems for livestock management, focusing on cattle weight estimation from images and video feeds. You will be responsible for building scalable vision pipelines, working with deep learning models, and bringing AI to production in real-world farm settings . Key Responsibilities Design and develop vision-based models to predict cattle weight from 2D/3D images, video, or depth data. Build image acquisition and preprocessing pipelines using multi-angle camera data. Implement classical and deep learning-based feature extraction techniques (e.g., body measurements, volume estimation). Conduct camera calibration, multi-view geometry analysis, and photogrammetry for size inference. 
Apply deep learning architectures (e.g., CNNs, ResNet, UNet, Mask R-CNN) for object detection, segmentation, and keypoint localization. Build 3D reconstruction pipelines using stereo imaging, depth sensors, or photogrammetry. Optimize and deploy models for edge devices (e.g., NVIDIA Jetson) or cloud environments. Collaborate with data scientists and product teams to analyze livestock datasets, refine prediction models, and validate outputs. Develop tools for automated annotation, model training pipelines, and continuous performance tracking. Required Qualifications & Skills Computer Vision: Object detection, keypoint estimation, semantic/instance segmentation, stereo imaging, and structure-from-motion. Weight Estimation Techniques: Experience in livestock monitoring, body condition scoring, and volumetric analysis from images/videos. Image Processing: Noise reduction, image normalization, contour extraction, 3D reconstruction, and camera calibration. Data Analysis & Modeling: Statistical modeling, regression techniques, and feature engineering for biological data. Technical Stack Programming Languages: Python (mandatory) Libraries & Frameworks: OpenCV, PyTorch, TensorFlow, Keras, scikit-learn 3D Processing: Open3D, PCL (Point Cloud Library), Blender (optional) Data Handling: NumPy, Pandas, DVC Annotation Tools: LabelImg, CVAT, Roboflow Cloud & DevOps: AWS/GCP, Docker, Git, CI/CD pipelines Deployment Tools: ONNX, TensorRT, FastAPI, Flask (for model serving) Preferred Qualifications Prior experience working in agritech, animal husbandry, or precision livestock farming. Familiarity with Large Language Models (LLMs) and integrating vision + language models for domain-specific insights. Knowledge of edge computing for on-farm device deployment (e.g., NVIDIA Jetson, Raspberry Pi). Contributions to open-source computer vision projects or relevant publications in CVPR, ECCV, or similar conferences. 
Soft Skills Strong problem-solving and critical thinking skills Clear communication and documentation practices Ability to work independently and collaborate in a remote, cross-functional team Why Join Us? Work at the intersection of AI and sustainability Be part of a dynamic and mission-driven team Opportunity to lead innovation in an emerging field of agritech Flexible remote work environment How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
Posted 1 week ago
3.0 years
0 - 0 Lacs
Dehradun, Uttarakhand, India
Remote
Experience : 3.00 + years Salary : GBP 1785-2500 / month (based on experience) Expected Notice Period : 15 Days Shift : (GMT+01:00) Europe/London (BST) Opportunity Type : Remote Placement Type : Full Time Contract for 6 Months(40 hrs a week/160 hrs a month) (*Note: This is a requirement for one of Uplers' client - UK's Leading AgriTech Company) What do you need for this opportunity? Must have skills required: AgriTech Industry, Large Language Models, Nvidia Jetson, Raspberry PI, Blender, Computer Vision, opencv, Python, Pytorch/tensorflow, Segmentation, Extraction, Regression UK's Leading AgriTech Company is Looking for: Location: Remote Type: 6 months contract Experience Level : 3–5 Years Industry: Agritech | Sustainability | AI for Renewables About Us We're an AI-first company transforming the renewable and sustainable agriculture space. Our mission is to harness advanced computer vision and machine learning to enable smart, data-driven decisions in the livestock and agricultural ecosystem. We focus on practical applications such as automated weight estimation of cattle , livestock monitoring, and resource optimization to drive a more sustainable food system. Role Overview We are hiring a Computer Vision Engineer to develop intelligent image-based systems for livestock management, focusing on cattle weight estimation from images and video feeds. You will be responsible for building scalable vision pipelines, working with deep learning models, and bringing AI to production in real-world farm settings . Key Responsibilities Design and develop vision-based models to predict cattle weight from 2D/3D images, video, or depth data. Build image acquisition and preprocessing pipelines using multi-angle camera data. Implement classical and deep learning-based feature extraction techniques (e.g., body measurements, volume estimation). Conduct camera calibration, multi-view geometry analysis, and photogrammetry for size inference. 
Apply deep learning architectures (e.g., CNNs, ResNet, UNet, Mask R-CNN) for object detection, segmentation, and keypoint localization. Build 3D reconstruction pipelines using stereo imaging, depth sensors, or photogrammetry. Optimize and deploy models for edge devices (e.g., NVIDIA Jetson) or cloud environments. Collaborate with data scientists and product teams to analyze livestock datasets, refine prediction models, and validate outputs. Develop tools for automated annotation, model training pipelines, and continuous performance tracking. Required Qualifications & Skills Computer Vision: Object detection, keypoint estimation, semantic/instance segmentation, stereo imaging, and structure-from-motion. Weight Estimation Techniques: Experience in livestock monitoring, body condition scoring, and volumetric analysis from images/videos. Image Processing: Noise reduction, image normalization, contour extraction, 3D reconstruction, and camera calibration. Data Analysis & Modeling: Statistical modeling, regression techniques, and feature engineering for biological data. Technical Stack Programming Languages: Python (mandatory) Libraries & Frameworks: OpenCV, PyTorch, TensorFlow, Keras, scikit-learn 3D Processing: Open3D, PCL (Point Cloud Library), Blender (optional) Data Handling: NumPy, Pandas, DVC Annotation Tools: LabelImg, CVAT, Roboflow Cloud & DevOps: AWS/GCP, Docker, Git, CI/CD pipelines Deployment Tools: ONNX, TensorRT, FastAPI, Flask (for model serving) Preferred Qualifications Prior experience working in agritech, animal husbandry, or precision livestock farming. Familiarity with Large Language Models (LLMs) and integrating vision + language models for domain-specific insights. Knowledge of edge computing for on-farm device deployment (e.g., NVIDIA Jetson, Raspberry Pi). Contributions to open-source computer vision projects or relevant publications in CVPR, ECCV, or similar conferences. 
Soft Skills Strong problem-solving and critical thinking skills Clear communication and documentation practices Ability to work independently and collaborate in a remote, cross-functional team Why Join Us? Work at the intersection of AI and sustainability Be part of a dynamic and mission-driven team Opportunity to lead innovation in an emerging field of agritech Flexible remote work environment How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you! Show more Show less
Posted 1 week ago
3.0 years
0 - 0 Lacs
Patna, Bihar, India
Remote
Experience : 3.00 + years Salary : GBP 1785-2500 / month (based on experience) Expected Notice Period : 15 Days Shift : (GMT+01:00) Europe/London (BST) Opportunity Type : Remote Placement Type : Full Time Contract for 6 Months(40 hrs a week/160 hrs a month) (*Note: This is a requirement for one of Uplers' client - UK's Leading AgriTech Company) What do you need for this opportunity? Must have skills required: AgriTech Industry, Large Language Models, Nvidia Jetson, Raspberry PI, Blender, Computer Vision, opencv, Python, Pytorch/tensorflow, Segmentation, Extraction, Regression UK's Leading AgriTech Company is Looking for: Location: Remote Type: 6 months contract Experience Level : 3–5 Years Industry: Agritech | Sustainability | AI for Renewables About Us We're an AI-first company transforming the renewable and sustainable agriculture space. Our mission is to harness advanced computer vision and machine learning to enable smart, data-driven decisions in the livestock and agricultural ecosystem. We focus on practical applications such as automated weight estimation of cattle , livestock monitoring, and resource optimization to drive a more sustainable food system. Role Overview We are hiring a Computer Vision Engineer to develop intelligent image-based systems for livestock management, focusing on cattle weight estimation from images and video feeds. You will be responsible for building scalable vision pipelines, working with deep learning models, and bringing AI to production in real-world farm settings . Key Responsibilities Design and develop vision-based models to predict cattle weight from 2D/3D images, video, or depth data. Build image acquisition and preprocessing pipelines using multi-angle camera data. Implement classical and deep learning-based feature extraction techniques (e.g., body measurements, volume estimation). Conduct camera calibration, multi-view geometry analysis, and photogrammetry for size inference. 
- Apply deep learning architectures (e.g., CNNs, ResNet, UNet, Mask R-CNN) for object detection, segmentation, and keypoint localization.
- Build 3D reconstruction pipelines using stereo imaging, depth sensors, or photogrammetry.
- Optimize and deploy models for edge devices (e.g., NVIDIA Jetson) or cloud environments.
- Collaborate with data scientists and product teams to analyze livestock datasets, refine prediction models, and validate outputs.
- Develop tools for automated annotation, model training pipelines, and continuous performance tracking.

Required Qualifications & Skills
- Computer Vision: Object detection, keypoint estimation, semantic/instance segmentation, stereo imaging, and structure-from-motion.
- Weight Estimation Techniques: Experience in livestock monitoring, body condition scoring, and volumetric analysis from images/videos.
- Image Processing: Noise reduction, image normalization, contour extraction, 3D reconstruction, and camera calibration.
- Data Analysis & Modeling: Statistical modeling, regression techniques, and feature engineering for biological data.

Technical Stack
- Programming Languages: Python (mandatory)
- Libraries & Frameworks: OpenCV, PyTorch, TensorFlow, Keras, scikit-learn
- 3D Processing: Open3D, PCL (Point Cloud Library), Blender (optional)
- Data Handling: NumPy, Pandas, DVC
- Annotation Tools: LabelImg, CVAT, Roboflow
- Cloud & DevOps: AWS/GCP, Docker, Git, CI/CD pipelines
- Deployment Tools: ONNX, TensorRT, FastAPI, Flask (for model serving)

Preferred Qualifications
- Prior experience working in agritech, animal husbandry, or precision livestock farming.
- Familiarity with Large Language Models (LLMs) and integrating vision + language models for domain-specific insights.
- Knowledge of edge computing for on-farm device deployment (e.g., NVIDIA Jetson, Raspberry Pi).
- Contributions to open-source computer vision projects or relevant publications in CVPR, ECCV, or similar conferences.
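At its simplest, the weight-estimation task described above reduces to extracting a geometric feature from a segmentation mask and regressing weight against it. The sketch below is illustrative only: the data is synthetic, the area-to-weight relationship is a made-up assumption, and a real pipeline would use a trained segmentation model (e.g., Mask R-CNN) plus camera calibration to convert pixel areas into physical units.

```python
# Illustrative sketch: regress cattle weight on a mask-area feature.
# All numbers here are synthetic assumptions, not real livestock data.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-animal feature: side-view segmentation-mask area in pixels.
mask_area = rng.uniform(40_000, 90_000, size=50)

# Synthetic "ground truth" weights (kg), roughly linear in area plus noise.
weight_kg = 0.006 * mask_area + 80 + rng.normal(0, 8, size=50)

# Fit a simple linear model: weight ≈ a * area + b.
a, b = np.polyfit(mask_area, weight_kg, deg=1)

def predict_weight(area_px: float) -> float:
    """Predict weight (kg) from mask area (pixels) using the fitted line."""
    return a * area_px + b

print(round(predict_weight(60_000), 1))
```

In practice the single area feature would be replaced by richer measurements (body length, girth, estimated volume from 3D reconstruction), but the feature-then-regress structure stays the same.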
Soft Skills
- Strong problem-solving and critical thinking skills
- Clear communication and documentation practices
- Ability to work independently and collaborate in a remote, cross-functional team

Why Join Us?
- Work at the intersection of AI and sustainability
- Be part of a dynamic and mission-driven team
- Opportunity to lead innovation in an emerging field of agritech
- Flexible remote work environment

How to apply for this opportunity?
Step 1: Click on Apply! and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 week ago
FastAPI is a modern web framework for building APIs with Python that is gaining popularity in the tech industry. If you are a job seeker looking to explore opportunities in the FastAPI domain in India, you're in the right place. This article covers the FastAPI job market in India, including top hiring locations, salary ranges, career progression, related skills, and interview questions.
The salary range for FastAPI professionals in India varies by experience level. Entry-level positions can expect a salary range of INR 4-6 lakhs per annum, while experienced professionals can earn anywhere from INR 10-20 lakhs per annum.
In the FastAPI domain, a career typically progresses as follows:
- Junior Developer
- Mid-level Developer
- Senior Developer
- Tech Lead
Besides proficiency in FastAPI itself, skills that are often expected or helpful include:
- Python programming
- RESTful APIs
- Database management (SQL or NoSQL)
- Frontend technologies like HTML, CSS, and JavaScript
As you explore opportunities in the FastAPI job market in India, remember to prepare thoroughly and apply confidently. With the right skills and knowledge, you can excel in your career as a FastAPI professional. Good luck!