
4114 Retrieval Jobs - Page 35

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

4.0 years

0 Lacs

Greater Bhopal Area

Remote

Source: LinkedIn

Data Research Engineer (AI/LLM) – Forbes Advisor (via Uplers)

Experience: 4+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Location: Remote (hybrid for candidates based in Chennai or Mumbai)
Placement Type: Full-time, permanent position
(Note: This is a requirement for one of Uplers' clients, Forbes Advisor.)

Must-have skills: TensorFlow, PyTorch, RAG, LangChain

Forbes Advisor is looking for:

Forbes Advisor is a consumer-focused initiative under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news, and reviews on personal finance, health, business, and everyday life decisions. We give consumers the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance so they can make informed decisions with confidence. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace brings decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology, and Sales, to its global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate, and travel.

The Data Extraction Team is a brand-new team that plays a crucial role in the organization by designing, implementing, and overseeing advanced web scraping frameworks. Its core function is creating and refining tools and methodologies to efficiently gather precise, meaningful data from a diverse range of digital platforms. The team also builds robust data pipelines and implements Extract, Transform, Load (ETL) processes that move harvested data into our data storage systems, ensuring it is readily available for analysis and use. (A hedged code sketch of such a scraping-and-ETL flow appears after this posting.)

A typical day for a Data Research Engineer involves generating ideas for how the company and team can best harness AI/LLMs, both to simplify operations within the team and to streamline the research team's work in gathering and retrieving large data sets. This is a leadership role: you set a vision for the future use of AI/LLMs within the team and the company, think outside the box, and proactively engage with new technologies and ideas to keep the team moving forward in the AI/LLM field. The candidate should also be willing to acquire at least basic skills in scraping and data pipelining.

Responsibilities:
- Develop methods to leverage the potential of LLMs and AI within the team.
- Proactively find new ways to engage the team with AI/LLMs and streamline team processes.
- Anticipate how emerging AI/LLM technologies could be harnessed early on, so the team is ahead of the game when those technologies arrive.
- Assist in acquiring and integrating data from various sources, including web crawling and API integration.
- Stay up to date with emerging technologies and industry trends.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Help cross-functional teams understand data requirements.
- Take accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Use online resources such as Stack Overflow, ChatGPT, and Bard effectively, keeping their capabilities and limitations in mind.

Skills and experience:
- Bachelor's degree in Computer Science, Data Science, or a related field; higher qualifications are a plus.
- Proactive, creative thinking about upcoming AI/LLM technologies and how to use them for the team's and company's benefit; a "think outside the box" mentality.
- Experience prompting LLMs in a streamlined way, accounting for the fact that an LLM can "hallucinate" and return wrong information.
- Experience building agentic AI platforms with modular capabilities and autonomous task execution (CrewAI, LangChain, etc.).
- Proficiency in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (ChromaDB, Pinecone, etc.); a hedged sketch of such a pipeline follows this posting.
- Experience managing a team of AI/LLM experts is a plus, including setting goals and objectives for the team and fine-tuning complex models.
- Strong proficiency in Python programming.
- Proficiency in SQL and data querying is a plus.
- Familiarity with web crawling techniques and API integration is a plus but not a must.
- Experience in AI/ML engineering and data extraction.
- Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.).
- Strong understanding of machine learning frameworks (TensorFlow, PyTorch).
- Ability to design and build AI models using LLMs and to integrate LLM solutions with existing systems via APIs.
- Ability to collaborate with the team to implement and optimize AI solutions, and to monitor and improve model performance and accuracy.
- Familiarity with Agile development methodologies is a plus.
- Strong problem-solving and analytical skills with attention to detail; creative and critical thinking.
- Ability to work collaboratively in a team environment; good, effective communication skills.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities; comfortable with autonomy and able to work independently.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month).
- Monthly Wellness Reimbursement Program to promote health and well-being.
- Monthly Office Commutation Reimbursement Program.
- Paid paternity and maternity leave.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for the interview.

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal beyond this one; depending on the assessments you clear, you can apply for them as well.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
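The posting above describes the Data Extraction Team's web scraping frameworks and ETL pipelines only at a high level. As an illustration (not part of the original posting), here is a minimal, hypothetical sketch of that kind of extract-transform-load flow in Python. The URL, CSS selector, and table name are placeholders, and a real pipeline would add scheduling, error handling, deduplication, and politeness controls (robots.txt, rate limits).

```python
# Hypothetical minimal extract-transform-load sketch (illustrative only).
# Assumes requests and beautifulsoup4 are installed; the URL and CSS
# selector below are placeholders, not real endpoints.
import sqlite3

import requests
from bs4 import BeautifulSoup

URL = "https://example.com/articles"   # placeholder source page
SELECTOR = "article h2"                # placeholder CSS selector for headlines


def extract(url: str) -> str:
    """Fetch raw HTML for one page."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.text


def transform(html: str) -> list[tuple[str]]:
    """Parse the HTML and keep only cleaned headline text."""
    soup = BeautifulSoup(html, "html.parser")
    return [(h.get_text(strip=True),) for h in soup.select(SELECTOR)]


def load(rows: list[tuple[str]], db_path: str = "research.db") -> None:
    """Write the transformed rows into a local SQLite table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS headlines (title TEXT)")
        conn.executemany("INSERT INTO headlines (title) VALUES (?)", rows)


if __name__ == "__main__":
    load(transform(extract(URL)))
```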
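The skills list also names Retrieval-Augmented Generation pipelines (ChromaDB, Pinecone) and prompting that guards against hallucination. The following is a minimal, hypothetical RAG sketch using ChromaDB's default in-memory client and embedding model; the documents are placeholders, and the final LLM call is left as a stub because the posting does not name a model provider. Frameworks such as LangChain typically wrap these same steps.

```python
# Hypothetical minimal RAG sketch (illustrative only).
# Assumes the chromadb package is installed; documents are placeholders.
import chromadb

client = chromadb.Client()  # in-memory instance; a persistent client could be used instead
collection = client.create_collection(name="research_notes")

# Index a few placeholder documents (Chroma embeds them with its default model).
collection.add(
    ids=["doc-1", "doc-2"],
    documents=[
        "Forbes Advisor covers personal finance topics such as credit cards and loans.",
        "The Data Extraction Team builds web scraping frameworks and ETL pipelines.",
    ],
)


def answer(question: str, k: int = 2) -> str:
    """Retrieve the k most relevant chunks and assemble a grounded prompt."""
    results = collection.query(query_texts=[question], n_results=k)
    context = "\n".join(results["documents"][0])
    prompt = (
        "Answer using only the context below; say 'I don't know' if it is not there.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    # An LLM call would go here (e.g. via an API client); returning the
    # assembled prompt keeps this sketch runnable without credentials.
    return prompt


print(answer("What does the Data Extraction Team do?"))
```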

Posted 1 week ago

Apply

4.0 years

0 Lacs

Visakhapatnam, Andhra Pradesh, India

Remote

Source: LinkedIn

Data Research Engineer (AI/LLM) – Uplers, for Forbes Advisor. Same posting as the first listing above; the full description is identical.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Indore, Madhya Pradesh, India

Remote

Source: LinkedIn

Data Research Engineer (AI/LLM) – Uplers, for Forbes Advisor. Same posting as the first listing above; the full description is identical.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Surat, Gujarat, India

Remote

Source: LinkedIn

Data Research Engineer (AI/LLM) – Uplers, for Forbes Advisor. Same posting as the first listing above; the full description is identical.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Chandigarh, India

Remote

Source: LinkedIn

Data Research Engineer (AI/LLM) – Uplers, for Forbes Advisor. Same posting as the first listing above; the full description is identical.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Dehradun, Uttarakhand, India

Remote

Source: LinkedIn

Data Research Engineer (AI/LLM) – Uplers, for Forbes Advisor. Same posting as the first listing above; the full description is identical.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Mysore, Karnataka, India

Remote

Source: LinkedIn

Data Research Engineer (AI/LLM) – Uplers, for Forbes Advisor. Same posting as the first listing above; the full description is identical.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Vijayawada, Andhra Pradesh, India

Remote

Source: LinkedIn

Experience : 4.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Forbes Advisor) What do you need for this opportunity? Must have skills required: TensorFlow, PyTorch, rag, LangChain Forbes Advisor is Looking for: Location - Remote (For candidate's from Chennai or Mumbai it's hybrid) Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Extraction Team is a brand-new team who plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Their core function involves creating and refining tools and methodologies to efficiently gather precise and meaningful data from a diverse range of digital platforms. Additionally, this team is tasked with constructing robust data pipelines and implementing Extract, Transform, Load (ETL) processes. These processes are essential for seamlessly transferring the harvested data into our data storage systems, ensuring its ready availability for analysis and utilization. A typical day in the life of a Data Research Engineer will involve coming up with ideas regarding how the company/team can best harness the power of AI/LLM, and use it not only simplify operations within the team, but also to streamline the work of the research team in gathering/retrieving large sets of data. The role is that of a leader who sets a vision for the future of AI/LLM’s use within the team and the company. They think outside the box and are proactive in engaging with new technologies and developing new ideas for the team to move forward in the AI/LLM field. The candidate should also at least be willing to acquire some basic skills in scraping and data pipelining. Responsibilities: Develop methods to leverage the potential of LLM and AI within the team. Proactive at finding new solutions to engage the team with AI/LLM, and streamline processes in the team. Be a visionary with AI/LLM tools and predict how the use of future technologies could be harnessed early on so that when these technologies come out, the team is ahead of the game regarding how it could be used. Assist in acquiring and integrating data from various sources, including web crawling and API integration. 
Stay updated with emerging technologies and industry trends. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Contribute to cross-functional teams in understanding data requirements. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Data Science, or a related field. Higher qualifications is a plus. Think proactively and creatively regarding the next AI/LLM technologies and how to use them to the team’s and company’s benefits. “Think outside the box” mentality. Experience prompting LLMs in a streamlined way, taking into account how the LLM can potentially “hallucinate” and return wrong information. Experience building agentic AI platforms with modular capabilities and autonomous task execution. (crewai, lagchain, etc.) Proficient in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration. (chromadb, pinecone, etc) Experience managing a team of AI/LLM experts is a plus: this includes setting up goals and objectives for the team and fine-tuning complex models. Strong proficiency in Python programming Proficiency in SQL and data querying is a plus. Familiarity with web crawling techniques and API integration is a plus but not a must. Experience in AI/ML engineering and data extraction Experience with LLMs, NLP frameworks (spaCy, NLTK, Hugging Face, etc.) Strong understanding of machine learning frameworks (TensorFlow, PyTorch) Design and build AI models using LLMs Integrate LLM solutions with existing systems via APIs Collaborate with the team to implement and optimize AI solutions Monitor and improve model performance and accuracy Familiarity with Agile development methodologies is a plus. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Ability to work collaboratively in a team environment. Good and effective communication skills. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). 
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
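The must-have skills above center on Retrieval-Augmented Generation with a vector store such as ChromaDB or Pinecone. As a rough illustration only (the collection name, documents, and prompt template below are hypothetical, and the final LLM call is left as a stub since any provider could be used), a minimal RAG retrieval step might look like this in Python:

    import chromadb

    # In-memory client; a persistent client would be used in production.
    client = chromadb.Client()
    collection = client.create_collection(name="research_docs")

    # Index a few example documents (hypothetical content).
    collection.add(
        ids=["doc1", "doc2", "doc3"],
        documents=[
            "Credit card annual fees are often waived in the first year.",
            "Term life insurance premiums depend on age and health profile.",
            "High-yield savings accounts compound interest daily or monthly.",
        ],
    )

    def build_rag_prompt(question: str, n_results: int = 2) -> str:
        # Retrieve the most semantically similar documents for the question.
        results = collection.query(query_texts=[question], n_results=n_results)
        context = "\n".join(results["documents"][0])
        # Ground the prompt in retrieved context; sending it to an LLM is provider-specific.
        return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

    print(build_rag_prompt("How do savings accounts earn interest?"))

Frameworks such as LangChain or CrewAI wrap the same retrieve-then-prompt loop in higher-level abstractions.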

Posted 1 week ago

Apply

4.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

Remote

Linkedin logo

Experience : 4.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full-time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Forbes Advisor) What do you need for this opportunity? Must have skills required: TensorFlow, PyTorch, RAG, LangChain Forbes Advisor is looking for: Location - Remote (For candidates from Chennai or Mumbai, it's hybrid) Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Extraction Team is a brand-new team that plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Their core function involves creating and refining tools and methodologies to efficiently gather precise and meaningful data from a diverse range of digital platforms. Additionally, this team is tasked with constructing robust data pipelines and implementing Extract, Transform, Load (ETL) processes. These processes are essential for seamlessly transferring the harvested data into our data storage systems, ensuring its ready availability for analysis and utilization. A typical day in the life of a Data Research Engineer will involve coming up with ideas regarding how the company/team can best harness the power of AI/LLM, and use it not only to simplify operations within the team but also to streamline the work of the research team in gathering/retrieving large sets of data. The role is that of a leader who sets a vision for the future of AI/LLM’s use within the team and the company. They think outside the box and are proactive in engaging with new technologies and developing new ideas for the team to move forward in the AI/LLM field. The candidate should also at least be willing to acquire some basic skills in scraping and data pipelining. Responsibilities: Develop methods to leverage the potential of LLM and AI within the team. Be proactive in finding new solutions to engage the team with AI/LLM and streamline processes in the team. Be a visionary with AI/LLM tools and predict how future technologies could be harnessed early on so that when these technologies come out, the team is ahead of the game regarding how they could be used. Assist in acquiring and integrating data from various sources, including web crawling and API integration. 
Stay updated with emerging technologies and industry trends. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Contribute to cross-functional teams in understanding data requirements. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Leverage online resources such as Stack Overflow, ChatGPT, Bard, etc. effectively, while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Data Science, or a related field. Higher qualifications are a plus. Think proactively and creatively regarding the next AI/LLM technologies and how to use them to the team’s and company’s benefit. “Think outside the box” mentality. Experience prompting LLMs in a streamlined way, taking into account how the LLM can potentially “hallucinate” and return wrong information. Experience building agentic AI platforms with modular capabilities and autonomous task execution (CrewAI, LangChain, etc.). Proficient in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (ChromaDB, Pinecone, etc.). Experience managing a team of AI/LLM experts is a plus: this includes setting up goals and objectives for the team and fine-tuning complex models. Strong proficiency in Python programming. Proficiency in SQL and data querying is a plus. Familiarity with web crawling techniques and API integration is a plus but not a must. Experience in AI/ML engineering and data extraction. Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.). Strong understanding of machine learning frameworks (TensorFlow, PyTorch). Design and build AI models using LLMs. Integrate LLM solutions with existing systems via APIs. Collaborate with the team to implement and optimize AI solutions. Monitor and improve model performance and accuracy. Familiarity with Agile development methodologies is a plus. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Ability to work collaboratively in a team environment. Good and effective communication skills. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health and well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances of getting shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). 
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply

4.0 years

0 Lacs

Patna, Bihar, India

Remote

Linkedin logo

Experience : 4.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full-time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Forbes Advisor) What do you need for this opportunity? Must have skills required: TensorFlow, PyTorch, RAG, LangChain Forbes Advisor is looking for: Location - Remote (For candidates from Chennai or Mumbai, it's hybrid) Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace’s global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Extraction Team is a brand-new team that plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Their core function involves creating and refining tools and methodologies to efficiently gather precise and meaningful data from a diverse range of digital platforms. Additionally, this team is tasked with constructing robust data pipelines and implementing Extract, Transform, Load (ETL) processes. These processes are essential for seamlessly transferring the harvested data into our data storage systems, ensuring its ready availability for analysis and utilization. A typical day in the life of a Data Research Engineer will involve coming up with ideas regarding how the company/team can best harness the power of AI/LLM, and use it not only to simplify operations within the team but also to streamline the work of the research team in gathering/retrieving large sets of data. The role is that of a leader who sets a vision for the future of AI/LLM’s use within the team and the company. They think outside the box and are proactive in engaging with new technologies and developing new ideas for the team to move forward in the AI/LLM field. The candidate should also at least be willing to acquire some basic skills in scraping and data pipelining. Responsibilities: Develop methods to leverage the potential of LLM and AI within the team. Be proactive in finding new solutions to engage the team with AI/LLM and streamline processes in the team. Be a visionary with AI/LLM tools and predict how future technologies could be harnessed early on so that when these technologies come out, the team is ahead of the game regarding how they could be used. Assist in acquiring and integrating data from various sources, including web crawling and API integration. 
Stay updated with emerging technologies and industry trends. Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines. Contribute to cross-functional teams in understanding data requirements. Assume accountability for achieving development milestones. Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Leverage online resources such as Stack Overflow, ChatGPT, Bard, etc. effectively, while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Data Science, or a related field. Higher qualifications are a plus. Think proactively and creatively regarding the next AI/LLM technologies and how to use them to the team’s and company’s benefit. “Think outside the box” mentality. Experience prompting LLMs in a streamlined way, taking into account how the LLM can potentially “hallucinate” and return wrong information. Experience building agentic AI platforms with modular capabilities and autonomous task execution (CrewAI, LangChain, etc.). Proficient in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (ChromaDB, Pinecone, etc.). Experience managing a team of AI/LLM experts is a plus: this includes setting up goals and objectives for the team and fine-tuning complex models. Strong proficiency in Python programming. Proficiency in SQL and data querying is a plus. Familiarity with web crawling techniques and API integration is a plus but not a must. Experience in AI/ML engineering and data extraction. Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.). Strong understanding of machine learning frameworks (TensorFlow, PyTorch). Design and build AI models using LLMs. Integrate LLM solutions with existing systems via APIs. Collaborate with the team to implement and optimize AI solutions. Monitor and improve model performance and accuracy. Familiarity with Agile development methodologies is a plus. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Ability to work collaboratively in a team environment. Good and effective communication skills. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health and well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances of getting shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). 
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Linkedin logo

When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. What You Will Be Doing... The Commercial Data & Analytics - Impact Analytics team is part of the Verizon Global Services (VGS) organization. The Impact Analytics team addresses high-impact, analytically driven projects focused within three core pillars: Customer Experience, Pricing & Monetization, Network & Sustainability. In this role, you will analyze large data sets to draw insights and solutions to help drive actionable business decisions. You will also apply advanced analytical techniques and algorithms to help us solve some of Verizon’s most pressing challenges. Use your analysis of large structured and unstructured datasets to draw meaningful and actionable insights Envision and test for corner cases. Build analytical solutions and models by manipulating large data sets and integrating diverse data sources Present the results and recommendations of statistical modeling and data analysis to management and other stakeholders Lead the development and implementation of advanced reports and dashboard solutions to support business objectives. Identify data sources and apply your knowledge of data structures, organization, transformation, and aggregation techniques to prepare data for in-depth analysis Deeply understand business requirements and translate them into well-defined analytical problems, identifying the most appropriate statistical techniques to deliver impactful solutions. Assist in building data views from disparate data sources which power insights and business cases Apply statistical modeling techniques / ML to data and perform root cause analysis and forecasting Develop and implement rigorous frameworks for effective base management. Collaborate with cross-functional teams to discover the most appropriate data sources and fields that cater to the business needs Design modular, reusable Python scripts to automate data processing Clearly and effectively communicate complex statistical concepts and model results to both technical and non-technical audiences, translating your findings into actionable insights for stakeholders. What we’re looking for… You have strong analytical skills and are eager to work in a collaborative environment with global teams to drive ML applications in business problems, develop end-to-end analytical solutions and communicate insights and findings to leadership. You work independently and are always willing to learn new technologies. You thrive in a dynamic environment and are able to interact with various partners and cross-functional teams to implement data science driven business solutions. You Will Need To Have Bachelor’s degree or six or more years of work experience Six or more years of relevant work experience Experience in managing a team of data scientists that supports a business function. 
Proficiency in SQL, including writing queries for reporting, analysis and extraction of data from big data systems (Google Cloud Platform, Teradata, Spark, Splunk, etc.) Curiosity to dive deep into data inconsistencies and perform root cause analysis Programming experience in Python (Pandas, NumPy, SciPy and Scikit-Learn) Experience with visualization tools such as Matplotlib, Seaborn, Tableau, Grafana, etc. A deep understanding of various machine learning algorithms and techniques, including supervised and unsupervised learning Understanding of time series modeling and forecasting techniques Even better if you have one or more of the following: Experience with cloud computing platforms (e.g., AWS, Azure, GCP) and deploying machine learning models at scale using platforms like Domino Data Lab or Vertex AI Experience in applying statistical ideas and methods to data sets to answer business problems. Ability to collaborate effectively across teams for data discovery and validation Experience in deep learning, recommendation systems, conversational systems, information retrieval, computer vision Expertise in advanced statistical modeling techniques, such as Bayesian inference or causal inference. Excellent interpersonal, verbal and written communication skills. Where you’ll be working In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours 40 Equal Employment Opportunity Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
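The requirements above combine Pandas/Scikit-Learn work with time series forecasting. A small, hedged sketch of that workflow on synthetic data follows; the column names and lag-feature choice are illustrative, not Verizon's actual pipeline:

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    # Synthetic monthly usage series with a trend and yearly seasonality.
    months = pd.date_range("2022-01-01", periods=36, freq="MS")
    usage = 100 + 2 * np.arange(36) + 10 * np.sin(2 * np.pi * np.arange(36) / 12)
    df = pd.DataFrame({"month": months, "usage": usage})

    # Simple lag features: previous month and the same month last year.
    df["lag_1"] = df["usage"].shift(1)
    df["lag_12"] = df["usage"].shift(12)
    train = df.dropna()

    model = LinearRegression().fit(train[["lag_1", "lag_12"]], train["usage"])

    # One-step-ahead forecast from the latest observed lags.
    next_features = pd.DataFrame({"lag_1": [df["usage"].iloc[-1]],
                                  "lag_12": [df["usage"].iloc[-12]]})
    print("next-month forecast:", round(float(model.predict(next_features)[0]), 2))

In practice the same pattern extends to richer features, proper backtesting, and dedicated forecasting libraries.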

Posted 1 week ago

Apply

15.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Our Company Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours! About The Team The Adobe Express team is building a path-breaking, all-in-one creative application for all platforms - web, desktop, and mobile. This platform combines the power of Adobe’s creative technologies with intuitive, AI-enhanced workflows that empower anyone to create standout content quickly and effortlessly. The Opportunity We’re looking for a Senior Machine Learning Engineer (Architect) to play a key role in shaping the next generation of AI-powered creative experiences in Adobe Express. In this role, you will build high-impact AI workflows for Adobe Express in the image editing space, concentrating on brand-new generative AI workflows, including personalized assistants and intelligent creative tools, and help accelerate AI culture by sharing knowledge, encouraging experimentation, and improving developer efficiency. Experience Requirements: 15+ years of proven experience in hands-on Machine Learning work. As a Machine Learning Engineer, you will: Build and scale advanced ML models to make image editing easy and seamless for users Develop AI Agents for Adobe Express in the imaging space Partner closely with product, design and engineering teams across Adobe to integrate Adobe’s latest generative AI capabilities into user-facing features. Help drive a culture of AI innovation and learning through internal knowledge-sharing, best-practice documentation, and experimentation frameworks that boost team productivity. Detailed Responsibilities: Research, design, and implement advanced ML models and scalable pipelines across training, inference, and deployment stages, using techniques in computer vision, NLP, deep learning, and generative AI. Integrate Large Language Models (LLMs) and agent-based frameworks to support multimodal creative workflows—enabling rich, context-aware, dynamic user experiences. Design, implement, and optimize subagent architectures for supporting modular and intelligent assistance across various creative tasks in Adobe Express. Collaborate with multi-functional teams to translate product requirements into ML solutions—especially those related to Harmony GenAI, smart recommendations, and generative tooling. Contribute to the development of internal platforms for model experimentation, A/B testing, performance monitoring, and continuous improvement. Stay up-to-date with evolving ML/GenAI research, tools, and frameworks—including federated learning, retrieval-augmented generation, and optimization for real-time inference. Champion an AI-first culture by mentoring peers, promoting learning opportunities, and encouraging innovation at both technical and organizational levels. Special Skills Requirements: Proficiency in Python for model development and C++ for systems integration. Strong hands-on experience with TensorFlow, PyTorch, and emerging GenAI toolkits. 
Experience working with LLMs, agent architectures, and user interfaces powered by AI technology. Deep understanding of computer vision and NLP techniques, especially for multimodal AI applications. Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
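The subagent architecture mentioned above is, at its core, a router that hands each user intent to a specialised handler. A minimal sketch of that pattern follows, with plain Python functions standing in for real agents; the intents, handlers, and routing rule are all hypothetical rather than Adobe's actual design:

    from typing import Callable, Dict

    def crop_agent(request: str) -> str:
        # A real subagent would call an imaging model or editing tool here.
        return f"[crop-agent] planned crop for: {request}"

    def background_agent(request: str) -> str:
        return f"[background-agent] planned background edit for: {request}"

    # Registry mapping intents to specialised subagents.
    SUBAGENTS: Dict[str, Callable[[str], str]] = {
        "crop": crop_agent,
        "background": background_agent,
    }

    def route(request: str) -> str:
        # Toy intent detection; a production system would use an LLM-based classifier.
        intent = "background" if "background" in request.lower() else "crop"
        return SUBAGENTS[intent](request)

    print(route("Remove the background and keep the subject"))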

Posted 1 week ago

Apply

12.0 - 18.0 years

0 Lacs

Tamil Nadu, India

Remote

Linkedin logo

Join us as we work to create a thriving ecosystem that delivers accessible, high-quality, and sustainable healthcare for all. This position requires expertise in designing, developing, debugging, and maintaining AI-powered applications and data engineering workflows for both local and cloud environments. The role involves working on large-scale projects, optimizing AI/ML pipelines, and ensuring scalable data infrastructure. As a PMTS, you will be responsible for integrating Generative AI (GenAI) capabilities, building data pipelines for AI model training, and deploying scalable AI-powered microservices. You will collaborate with AI/ML, Data Engineering, DevOps, and Product teams to deliver impactful solutions that enhance our products and services. Additionally, it would be desirable if the candidate has experience in retrieval-augmented generation (RAG), fine-tuning pre-trained LLMs, AI model evaluation, data pipeline automation, and optimizing cloud-based AI deployments. Responsibilities AI-Powered Software Development & API Integration Develop AI-driven applications, microservices, and automation workflows using FastAPI, Flask, or Django, ensuring cloud-native deployment and performance optimization. Integrate OpenAI APIs (GPT models, Embeddings, Function Calling) and Retrieval-Augmented Generation (RAG) techniques to enhance AI-powered document retrieval, classification, and decision-making. Data Engineering & AI Model Performance Optimization Design, build, and optimize scalable data pipelines for AI/ML workflows using Pandas, PySpark, and Dask, integrating data sources such as Kafka, AWS S3, Azure Data Lake, and Snowflake. Enhance AI model inference efficiency by implementing vector retrieval using FAISS, Pinecone, or ChromaDB, and optimize API latency with tuning techniques (temperature, top-k sampling, max tokens settings). Microservices, APIs & Security Develop scalable RESTful APIs for AI models and data services, ensuring integration with internal and external systems while securing API endpoints using OAuth, JWT, and API Key Authentication. Implement AI-powered logging, observability, and monitoring to track data pipelines, model drift, and inference accuracy, ensuring compliance with AI governance and security best practices. AI & Data Engineering Collaboration Work with AI/ML, Data Engineering, and DevOps teams to optimize AI model deployments, data pipelines, and real-time/batch processing for AI-driven solutions. Engage in Agile ceremonies, backlog refinement, and collaborative problem-solving to scale AI-powered workflows in areas like fraud detection, claims processing, and intelligent automation. Cross-Functional Coordination and Communication Collaborate with Product, UX, and Compliance teams to align AI-powered features with user needs, security policies, and regulatory frameworks (HIPAA, GDPR, SOC2). Ensure seamless integration of structured and unstructured data sources (SQL, NoSQL, vector databases) to improve AI model accuracy and retrieval efficiency. Mentorship & Knowledge Sharing Mentor junior engineers on AI model integration, API development, and scalable data engineering best practices, and conduct knowledge-sharing sessions. Education & Experience Required 12-18 years of experience in software engineering or AI/ML development, preferably in AI-driven solutions. Hands-on experience with Agile development, SDLC, CI/CD pipelines, and AI model deployment lifecycles. 
Bachelor’s Degree or equivalent in Computer Science, Engineering, Data Science, or a related field. Proficiency in full-stack development with expertise in Python (preferred for AI), Java Experience with structured & unstructured data: SQL (PostgreSQL, MySQL, SQL Server) NoSQL (OpenSearch, Redis, Elasticsearch) Vector Databases (FAISS, Pinecone, ChromaDB) Cloud & AI Infrastructure AWS: Lambda, SageMaker, ECS, S3 Azure: Azure OpenAI, ML Studio GenAI Frameworks & Tools: OpenAI API, Hugging Face Transformers, LangChain, LlamaIndex, AutoGPT, CrewAI. Experience in LLM deployment, retrieval-augmented generation (RAG), and AI search optimization. Proficiency in AI model evaluation (BLEU, ROUGE, BERT Score, cosine similarity) and responsible AI deployment. Strong problem-solving skills, AI ethics awareness, and the ability to collaborate across AI, DevOps, and data engineering teams. Curiosity and eagerness to explore new AI models, tools, and best practices for scalable GenAI adoption. About Athenahealth Here’s our vision: To create a thriving ecosystem that delivers accessible, high-quality, and sustainable healthcare for all. What’s unique about our locations? From an historic, 19th century arsenal to a converted, landmark power plant, all of athenahealth’s offices were carefully chosen to represent our innovative spirit and promote the most positive and productive work environment for our teams. Our 10 offices across the United States and India — plus numerous remote employees — all work to modernize the healthcare experience, together. Our Company Culture Might Be Our Best Feature. We don't take ourselves too seriously. But our work? That’s another story. athenahealth develops and implements products and services that support US healthcare: It’s our chance to create healthier futures for ourselves, for our family and friends, for everyone. Our vibrant and talented employees — or athenistas, as we call ourselves — spark the innovation and passion needed to accomplish our goal. We continue to expand our workforce with amazing people who bring diverse backgrounds, experiences, and perspectives at every level, and foster an environment where every athenista feels comfortable bringing their best selves to work. Our size makes a difference, too: We are small enough that your individual contributions will stand out — but large enough to grow your career with our resources and established business stability. Giving back is integral to our culture. Our athenaGives platform strives to support food security, expand access to high-quality healthcare for all, and support STEM education to develop providers and technologists who will provide access to high-quality healthcare for all in the future. As part of the evolution of athenahealth’s Corporate Social Responsibility (CSR) program, we’ve selected nonprofit partners that align with our purpose and let us foster long-term partnerships for charitable giving, employee volunteerism, insight sharing, collaboration, and cross-team engagement. What can we do for you? Along with health and financial benefits, athenistas enjoy perks specific to each location, including commuter support, employee assistance programs, tuition assistance, employee resource groups, and collaborative workspaces — some offices even welcome dogs. In addition to our traditional benefits and perks, we sponsor events throughout the year, including book clubs, external speakers, and hackathons. 
And we provide athenistas with a company culture based on learning, the support of an engaged team, and an inclusive environment where all employees are valued. We also encourage a better work-life balance for athenistas with our flexibility. While we know in-office collaboration is critical to our vision, we recognize that not all work needs to be done within an office environment, full-time. With consistent communication and digital collaboration tools, athenahealth enables employees to find a balance that feels fulfilling and productive for each individual situation.
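The responsibilities above revolve around exposing retrieval behind scalable APIs with vector indexes such as FAISS, Pinecone, or ChromaDB. The sketch below shows the general FastAPI-plus-FAISS pattern only; the endpoint path, embedding dimension, and random vectors are placeholders, and a real service would embed query text with an actual model:

    import faiss
    import numpy as np
    from fastapi import FastAPI

    app = FastAPI()

    # Toy index: 8-dimensional random document vectors (placeholders for real embeddings).
    DIM = 8
    doc_vectors = np.random.rand(100, DIM).astype("float32")
    index = faiss.IndexFlatL2(DIM)
    index.add(doc_vectors)

    @app.get("/search")
    def search(k: int = 3) -> dict:
        # In a real service the incoming query text would be embedded here.
        query = np.random.rand(1, DIM).astype("float32")
        distances, ids = index.search(query, k)
        return {"doc_ids": ids[0].tolist(), "distances": distances[0].tolist()}

    # Run with: uvicorn module_name:app --reload  (module name is hypothetical)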

Posted 1 week ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Accounting Accounting of Sales, Purchase, Expenses, JV, etc. Bank Reconciliation Monthly Entry for Prepaid and Provisions Entry for Petty Cash and Investments Entries for Payroll and Claims Preparation of Purchase Order, Invoices, Credit Note and Debit Note Petty Cash Maintaining and disbursement of Petty Cash to Employees / Office Boy for petty Expenses Providing advance to Office boy and taking expense submission from him Preparing proper details / voucher for the expenses and attaching supporting documents to the same Checking that approvals exist for all expenses done and liaising with concerned employees to get missing details Arranging withdrawal of cash from the Bank Account for Petty Cash Checking of Claims along with supporting documents and as per Limits mentioned in Policy Liaising with Employees to obtain missing documentation / approvals Maintaining and updating Master Claim Sheet Statutory Compliance Updating TDS Tracker for the year Filing Challans Financial Year wise for future reference / Retrieval Ensuring all Compliance / Filings related to Direct, Indirect Tax, SEZ / STPI, ROC, TP, Labour Laws as applicable are done on a timely basis Preparing Transfer Pricing Statements and coordinating with consultant for TP Agreements Collaborating with Factory and other teams to create Transfer Pricing Master Data for Intercompany movement of Goods ROC Ensuring all ROC Compliances are done on a timely basis Liaising with ROC Consultant as and when required. Vendor Empanelment and Reconciliation Filling of Vendor Empanelment Forms Preparation of Vendor Account Reconciliation Liaising with Vendor to procure proper invoices Sharing Bank Payment Advice with Vendors Replying to Vendor Queries Banking Work Liaising with Bank Representative for calling of Forms and Certificates Filling of Forms required for any service / request Employee / Payroll Related Acting as an intermediary between the Payroll Service Provider and the Employee in case of any disconnect related to Tax Queries / Documents. Helping HR in preparing the Monthly Payroll Reconciliation to be shared with Gareth for Monthly Payroll Approval. Accounts Finalization and Audit Providing basic data / support to Auditor for Accounts Finalization and Audit Closing books on Monthly basis Preparing Form 26AS Reconciliation Budgeting and MIS Preparing Annual Budget and Re-forecast on Quarterly Basis Preparing Variance Analysis and Trend Analysis. Preparing and reviewing Monthly P & L & Other MIS (Financial Metrics) as required by the Management Preparing Departmental Budget and tracking of actuals for comparison Debtors and Receivable Management Preparing various Sales Analytic Report, Amortization Sheet and Debtors Statement Maintaining Creditor statement and processing weekly payments including outward remittance and documentation for the same. Treasury Management Monitor and prepare cash flow to ensure liquidity for operations. Manage working capital, including accounts receivable and accounts payable Optimize deployment of funds to the right investment avenues. Other Activity Correspondence with Banks, Auditors, Consultants, Insurance Agents and others for any updates/changes in the banking operations, registrations, advisory, Quotes for Corporate Policies etc. Co-ordination with Vendor to prepare DSC for Directors Filling up forms related to Import of Goods and assisting in factory operations Filing and Scanning of Documents Any other Accounts Related Activity as allocated from Time to Time

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Job Summary Proficiency with major search engines and platforms such as Coveo, Elasticsearch, Solr, MongoDB Atlas, or similar technologies. Experience with Natural Language Processing (NLP) and machine learning techniques for search relevance and personalization. Ability to design and implement ranking algorithms and relevance tuning. Experience with A/B testing and other methods for optimizing search results. Experience with analyzing search logs and metrics to understand user behavior and improve search performance. Deep understanding of indexing, data storage, and retrieval mechanisms (RAG). Experience with data integration, ETL processes, and data normalization. Knowledge of scaling search solutions to handle large volumes of data and high query loads. Strong knowledge of programming languages like C#.NET, Python, or JavaScript for developing and customizing search functionalities. Experience in integrating search solutions with various APIs and third party systems. Understanding of how search interfaces impact user experience and ways to improve search usability and efficiency. Experience with enterprise level systems and an understanding of how search integrates with broader IT infrastructure and business processes.
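Relevance tuning of the kind described above usually starts with field boosting in the query DSL. Below is a small, illustrative query using the official Elasticsearch Python client (8.x-style keyword arguments); the index name, fields, and boost values are hypothetical:

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")  # placeholder cluster URL

    # Boost title matches over description matches when ranking results.
    response = es.search(
        index="products",
        query={
            "multi_match": {
                "query": "wireless noise cancelling headphones",
                "fields": ["title^3", "description"],
            }
        },
        size=10,
    )

    for hit in response["hits"]["hits"]:
        print(hit["_score"], hit["_source"].get("title"))

A/B testing then compares such boost settings against click-through and conversion metrics.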

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Linkedin logo

When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. What You Will Be Doing... The Commercial Data & Analytics - Impact Analytics team is part of the Verizon Global Services (VGS) organization. The Impact Analytics team addresses high-impact, analytically driven projects focused within three core pillars: Customer Experience, Pricing & Monetization, Network & Sustainability. In this role, you will analyze large data sets to draw insights and solutions to help drive actionable business decisions. You will also apply advanced analytical techniques and algorithms to help us solve some of Verizon’s most pressing challenges. Use your analysis of large structured and unstructured datasets to draw meaningful and actionable insights Envision and test for corner cases. Build analytical solutions and models by manipulating large data sets and integrating diverse data sources Present the results and recommendations of statistical modeling and data analysis to management and other stakeholders Lead the development and implementation of advanced reports and dashboard solutions to support business objectives. Identify data sources and apply your knowledge of data structures, organization, transformation, and aggregation techniques to prepare data for in-depth analysis Deeply understand business requirements and translate them into well-defined analytical problems, identifying the most appropriate statistical techniques to deliver impactful solutions. Assist in building data views from disparate data sources which power insights and business cases Apply statistical modeling techniques / ML to data and perform root cause analysis and forecasting Develop and implement rigorous frameworks for effective base management. Collaborate with cross-functional teams to discover the most appropriate data sources and fields that cater to the business needs Design modular, reusable Python scripts to automate data processing Clearly and effectively communicate complex statistical concepts and model results to both technical and non-technical audiences, translating your findings into actionable insights for stakeholders. What we’re looking for… You have strong analytical skills and are eager to work in a collaborative environment with global teams to drive ML applications in business problems, develop end-to-end analytical solutions and communicate insights and findings to leadership. You work independently and are always willing to learn new technologies. You thrive in a dynamic environment and are able to interact with various partners and cross-functional teams to implement data science driven business solutions. You Will Need To Have Bachelor’s degree or six or more years of work experience Six or more years of relevant work experience Experience in managing a team of data scientists that supports a business function. 
Proficiency in SQL, including writing queries for reporting, analysis and extraction of data from big data systems (Google Cloud Platform, Teradata, Spark, Splunk, etc.) Curiosity to dive deep into data inconsistencies and perform root cause analysis Programming experience in Python (Pandas, NumPy, SciPy and Scikit-Learn) Experience with visualization tools such as Matplotlib, Seaborn, Tableau, Grafana, etc. A deep understanding of various machine learning algorithms and techniques, including supervised and unsupervised learning Understanding of time series modeling and forecasting techniques Even better if you have one or more of the following: Experience with cloud computing platforms (e.g., AWS, Azure, GCP) and deploying machine learning models at scale using platforms like Domino Data Lab or Vertex AI Experience in applying statistical ideas and methods to data sets to answer business problems. Ability to collaborate effectively across teams for data discovery and validation Experience in deep learning, recommendation systems, conversational systems, information retrieval, computer vision Expertise in advanced statistical modeling techniques, such as Bayesian inference or causal inference. Excellent interpersonal, verbal and written communication skills. Where you’ll be working In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours 40 Equal Employment Opportunity Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

On-site

Linkedin logo

About Billigence Billigence is a boutique data consultancy with global outreach and clientele, transforming the way organizations work with data. We leverage proven, cutting-edge technologies to design, tailor, and implement advanced Business Intelligence solutions with high added value across a wide range of applications—from process digitization through to Cloud Data Warehousing, Visualisation, Data Science, Engineering, and Data Governance. Headquartered in Sydney, Australia, with offices worldwide—including Bangalore—we help clients navigate complex business challenges, unlock efficiencies, and enable scalable adoption of data and AI culture. About the Role We’re looking for a highly experienced Senior AI Engineer based in India to join our growing global team. This role is focused on designing, developing, and deploying scalable Generative AI (GenAI) solutions for our enterprise clients. You’ll work closely with cross-functional teams across the UK, EU, and APAC regions to drive innovation in AI/ML and contribute to high-impact engagements across industries. This is a hands-on leadership role that combines deep technical work with consultancy, solution design, and AI strategy. What You’ll Do Architect and develop scalable GenAI pipelines, APIs, and microservices for both real-time and batch AI applications using frameworks such as FastAPI, Ray, and LangServe. Design robust prompt strategies for instruction-following, reasoning, and multi-turn conversations, with a strong focus on Retrieval-Augmented Generation (RAG) architectures. Lead embedding model selection and fine-tuning to optimise semantic search and domain-specific GenAI solutions. Oversee LLM Ops workflows including orchestration, deployment, rollback, evaluation, and monitoring. Drive fine-tuning efforts on large language models (LLMs) to tailor outputs for regulated and industry-specific contexts. Establish AI testing and evaluation frameworks (including hallucination detection, safety filters, regression tests, and output quality). Implement observability, lineage tracking, and CI/CD automation using MLOps tools like MLflow, Databricks, Azure ML, or Vertex AI. Lead continuous improvement based on telemetry, user feedback, and cost-performance trade-offs. Actively support pre-sales engagements and client solutioning with a consultative, business-outcome-first mindset. What You’ll Need 5+ years of experience in AI/ML engineering, including 2+ years focused on Generative AI and LLMs. Hands-on expertise in Python, cloud platforms (Azure, AWS, or GCP), and GenAI frameworks. Deep understanding of vector search, RAG pipelines, embedding models, and model fine-tuning. Familiarity with MLOps practices and tools (MLflow, Airflow, Databricks, Azure ML, etc.). Experience in enterprise-grade solutioning and deploying LLMs in production environments. Strong leadership skills with experience guiding cross-functional technical teams. Excellent communication and stakeholder engagement skills in a client-facing environment. Bachelor’s or master’s degree in Computer Science, Data Science, Engineering, or a related field. Inclusion and Equal Opportunities We are always on the lookout for talented individuals to join our team. Billigence is an equal opportunity employer and is committed to creating an inclusive environment for all employees and applicants. We welcome applications from all backgrounds, regardless of race, religion, gender, sexual orientation, disability, or age.
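Embedding model selection and semantic search, called out above, can be prototyped in a few lines with sentence-transformers; the model name below is a common public checkpoint and the corpus is purely illustrative:

    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("all-MiniLM-L6-v2")  # small public checkpoint

    corpus = [
        "Quarterly revenue grew 12 percent year over year.",
        "The data warehouse migration finished ahead of schedule.",
        "Customer churn decreased after the loyalty programme launch.",
    ]
    corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

    query_embedding = model.encode("Why did churn go down?", convert_to_tensor=True)
    scores = util.cos_sim(query_embedding, corpus_embeddings)[0]

    best = int(scores.argmax())
    print("best match:", corpus[best], "score:", float(scores[best]))

Swapping in a different or fine-tuned embedding model only changes the checkpoint name, which is what makes side-by-side comparisons straightforward during model selection.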

Posted 1 week ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

About Holiday Tribe Holiday Tribe is a Great Place To Work® Certified™, seed-stage VC-funded travel-tech brand based in Gurugram. We specialize in crafting unforgettable leisure travel experiences by integrating advanced technology, leveraging human expertise, and prioritizing customer success. With holidays curated across 30+ destinations worldwide, partnerships with renowned tourism boards, and recognition as the Emerging Holiday Tech Company at the India Travel Awards 2023, Holiday Tribe is transforming the travel industry. Our mission is to redefine how Indians experience holidays making travel planning faster, smarter, and more personalized, ensuring every trip is truly seamless and unforgettable. The Role We are seeking a talented AI Engineer to join our growing engineering team and play a pivotal role in building our AI-first travel platform. You will be responsible for designing, developing, and deploying AI systems that power personalized travel recommendations, intelligent itinerary generation, and our innovative sales assistant tools. This is a unique opportunity to shape the future of travel technology while working with state-of-the-art AI capabilities. Key Responsibilities: AI System Development Design and implement Retrieval Augmented Generation (RAG) systems for travel recommendation and itinerary planning Build and optimize large language model integrations using frameworks like LangChain for travel-specific use cases Develop semantic search capabilities using vector databases and embedding models for travel content discovery Create tool-calling architectures that enable AI agents to interact with booking systems, inventory APIs, and external travel services Implement intelligent conversation flows for customer interactions and sales assistance Travel Intelligence Platform Build personalized recommendation engines that understand traveler preferences, seasonal factors, and destination characteristics Develop natural language processing capabilities for interpreting customer travel requests and preferences Implement real-time itinerary generation systems that consider multiple constraints (budget, time, preferences, availability) Create AI-powered tools to assist travel experts in creating customized packages faster Build semantic search engines for finding relevant travel content based on user intent and contextual understanding AI Agent & Tool Integration Design and implement function calling systems that allow LLMs to execute actions like booking confirmations, inventory checks, and pricing queries Build multi-agent systems where specialized AI agents handle different aspects of travel planning (accommodation, transportation, activities) Create tool orchestration frameworks that enable AI systems to chain multiple API calls for complex travel operations Implement safety and validation layers for AI-initiated actions in critical systems Data & Model Operations Work with travel knowledge graphs to enhance AI understanding of destinations, accommodations, and activities Implement hybrid search systems combining semantic similarity with traditional keyword-based search Build vector indexing strategies for efficient similarity search across large travel content databases Implement model evaluation frameworks to ensure high-quality AI outputs Optimize AI model performance for cost-efficiency and response times Collaborate with data engineers to build robust data pipelines for AI training and inference Cross-functional Collaboration Partner with product teams to translate 
travel domain requirements into AI capabilities Work closely with backend engineers to integrate AI services into the broader platform architecture Collaborate with UX teams to design intuitive AI-human interaction patterns Support sales and customer success teams by improving AI assistant capabilities Required Qualifications: Technical Skills 3+ years of experience in AI/ML engineering with focus on natural language processing and large language models Strong expertise in RAG (Retrieval Augmented Generation) systems including vector databases, embedding models, and retrieval strategies Hands-on experience with LangChain or similar LLM orchestration frameworks, including tool calling and agent patterns Proficiency with semantic search technologies including vector databases, embedding models, and similarity search algorithms Experience with tool calling and function calling in LLM applications, including API integration and action validation Proficiency with major LLM APIs (OpenAI, Anthropic, Google, etc.) and understanding of prompt engineering best practices Experience with vector databases such as Milvus, Weaviate, Chroma, or similar solutions Strong Python programming skills with experience in AI/ML libraries (transformers, sentence-transformers, scikit-learn) AI/ML Foundation Solid understanding of transformer architectures, attention mechanisms, and modern NLP techniques Deep knowledge of embedding models and semantic similarity techniques (sentence transformers, dense retrieval methods) Experience with hybrid search architectures combining dense and sparse retrieval methods Knowledge of fine-tuning approaches and model adaptation strategies Understanding of agent-based AI systems and multi-step reasoning capabilities Understanding of AI evaluation metrics and testing methodologies Familiarity with MLOps practices and model deployment strategies Software Engineering Experience building production-grade AI applications with proper error handling and monitoring Experience with API integration and orchestration for complex multi-step workflows Understanding of API design and microservices architecture Familiarity with cloud platforms (AWS, GCP, Azure) and their AI/ML services Experience with version control, CI/CD, and collaborative development practices Preferred Qualifications: Advanced AI Experience Experience with multi-modal AI systems (text, images, structured data) Advanced knowledge of agent frameworks (LangGraph, CrewAI, AutoGen) and agentic workflows Experience with advanced semantic search techniques including re-ranking, query expansion, and result fusion Experience with model fine-tuning, especially for domain-specific applications Knowledge of tool use optimization and function calling best practices Understanding of AI safety, bias mitigation, and responsible AI practices Technical Depth Experience with advanced RAG techniques (hybrid search, re-ranking, query expansion, contextual retrieval) Knowledge of vector search optimization including indexing strategies, similarity metrics, and performance tuning Experience building tool-calling systems that integrate with external APIs and services Knowledge of graph databases and knowledge graph construction Familiarity with conversational AI and dialogue management systems Experience with A/B testing frameworks for AI systems
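Tool calling, emphasised in the requirements above, means letting the model decide when to invoke a typed function such as an availability check. A hedged sketch with the OpenAI Python client (v1-style) follows; the tool schema, hotel id, and model name are hypothetical, and an API key is assumed to be set in the environment:

    import json
    from openai import OpenAI  # assumes OPENAI_API_KEY is set in the environment

    client = OpenAI()

    # Hypothetical tool the model may choose to call.
    tools = [{
        "type": "function",
        "function": {
            "name": "check_hotel_availability",
            "description": "Check if a hotel has rooms for a given check-in date.",
            "parameters": {
                "type": "object",
                "properties": {
                    "hotel_id": {"type": "string"},
                    "check_in": {"type": "string", "description": "YYYY-MM-DD"},
                },
                "required": ["hotel_id", "check_in"],
            },
        },
    }]

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": "Any rooms at hotel H123 on 2025-01-10?"}],
        tools=tools,
    )

    # If the model decided to call the tool, validate the arguments and execute it here.
    for call in response.choices[0].message.tool_calls or []:
        print(call.function.name, json.loads(call.function.arguments))

A validation layer between the model's arguments and the real booking API is what keeps such actions safe.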

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

About Company Our client is a Gurgaon-based, venture-backed consumer technology startup that aims to develop a next-generation platform and revolutionize user behavior through the integration of AI. They are looking to hire their Machine Learning leader to shape and deploy core recommendation and personalization systems. Role and Responsibilities Develop recommendation, ranking, and personalization systems. Create adaptive, real-time matchmaking engines that learn from user interactions. Design algorithms to deliver personalized content feeds. Implement cold-start solutions and collaborate with Data, Product, and Backend teams. Deploy models efficiently, leveraging observability tools and model registries. Lead the growth of the ML engineering team and foster a strong ML culture. Skills and Qualifications 5–10 years of experience in personalization, recommendations, search, or ranking at scale. Background in B2C platforms such as social, ecommerce, gaming, or video, with a top-tier educational pedigree. Proficient in recommendation techniques like collaborative filtering, deep retrieval models, learning-to-rank, embeddings, and LLMs. Capable of both training and deploying models within end-to-end pipelines. Skilled in offline/online evaluation, A/B testing, and metric optimization. Experience with vector search, graph algorithms, and LLM-based methods is preferred but not mandatory. What's On Offer Opportunity to own the full lifecycle—from design to deployment—building a scalable, real-time ranking infrastructure. Opportunity to be a founding member of a well-funded consumer-tech start-up looking to disrupt a prime market. Competitive compensation with a strong equity upside.
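Embedding-based retrieval, one of the recommendation techniques listed above, can be illustrated with a toy NumPy example: represent a user as the mean of the item vectors they interacted with, then rank unseen items by dot product. The dimensions and data below are synthetic, not a production model:

    import numpy as np

    rng = np.random.default_rng(0)
    n_items, dim = 1000, 32
    item_vectors = rng.normal(size=(n_items, dim))  # learned embeddings in practice

    # Items the user has interacted with (hypothetical ids).
    history = [12, 87, 401]
    user_vector = item_vectors[history].mean(axis=0)

    # Score every item and exclude the ones already seen.
    scores = item_vectors @ user_vector
    scores[history] = -np.inf
    top_k = np.argsort(-scores)[:5]
    print("recommended item ids:", top_k.tolist())

Real systems replace the random vectors with embeddings learned from interactions and serve the ranking through an approximate nearest-neighbour index.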

Posted 1 week ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Summary Position Summary Analyst – Collection Curation - Knowledge Services Do you like working with people associated with an organization that enables flow of knowledge and expertise to our client service professionals around the world? Then this might be the perfect opportunity for you. Collection Curation team is looking for highly motivated individuals with expertise in communications, attention to detail, positive attitude, presence, and ability to interact across many levels to support delivery of projects with Deloitte member firms around the world. Work you’ll do As a part of this team, you will be responsible for: Deliver on aligned collection curation topics Conduct basic internal and external research on topic for understanding of key trends and potential scope. Conduct basic research, and analysis to identify topic trends which may impact Collection scope using internal portals and secondary/third-party research databases such as Factiva, Hoovers, OneSource, Thomson, and industry journals etc. Demonstrate good understanding of aligned industry/business Curate topic aligned content from internal portals Prepare intermittent deliverables for covering current content counts; usage; freshness; gaps; experts Conduct outreach for content based on Collection priorities Manage intranet portals for Collections by supporting activities including content management, optimization, and sourcing Ensure content pertaining to Collections on the repositories and the web portal is tagged with proper metadata, taxonomy values for easy search and retrieval Contribute various documents to aligned repositories Act as super delegate to assist with, and collaborate with practice/business teams to drive people profile completion Perform content usage analysis using available metrics reports, identify trends, and determine content archival, acquisition and promotion strategies for Collections to share with curator to inform curation priorities Manage knowledge sharing through various communities of practice, micro-blogging tools and other collaboration spaces Project/ Process management Support the curator and the senior member to plan and engage in activities for building and launching Collection content Demonstrate good communication and presentation skills, and ability to create process documents and training materials Focus on quality and strict adherence to governance guidelines and standards The team The Collection Curation Team works closely with global Deloitte practitioners to acquire and publish content related to key topics across Businesses and Industries to the global management portal and manage the topic related intranet sites. This content is intended to help Deloitte practitioners find relevant content just in time to win new projects and deliver quicker and quality deliverables to their clients. 
Qualifications

Mandatory skills:
Graduate/Bachelor’s degree with more than 2 years of relevant experience
Educational qualification in Business Administration, English Language, Mass Communication, Humanities, Library Sciences, Commerce, Information Systems, or similar
Experience in secondary research

Summary of skills (core capabilities, skills, and interpersonal skills):
Secondary research, customer engagement/relationships, content curation, process excellence, basic to intermediate proficiency in the MS Office Suite (Excel, PowerPoint, and Word), team player, SEO, conflict management, taxonomy, decision making, SharePoint, time management, Photoshop, networking, Tableau dashboards, data/metrics analysis, intranet and social media technologies, and experience with one or more businesses/industries (an added advantage)

Key skills:
Good understanding of the aligned industry/business/topics and the ability to sift through differing industry terms
Intermediate secondary research skills: able to synthesize large amounts of quantitative and qualitative data and integrate them into meaningful reports and recommendations
Proficiency in understanding and utilizing GenAI tools
Understanding of content management platforms
Understanding of intranet social media tools
Basic understanding of the content management life cycle, client confidentiality, taxonomy, and search
Detail-oriented, with the ability to identify relevant content
Understanding of metrics and analytics interpretation
Excellent verbal and written communication skills, with the ability to influence appropriate outcomes
Attention to detail and high-quality deliverables
Able to work independently and as part of a team with professionals at all levels
Able to prioritize tasks, work on multiple assignments, and raise concerns/questions where appropriate
Ability to work across cultures and in a virtual environment
Self-motivated and a strong team player
Ability to build networks within the organization
Effectively leverage internal social media and collaboration tools to connect people to people and people to content

Location: Hyderabad
Work timings: 11 AM to 8 PM

How You’ll Grow
At Deloitte, we’ve invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as part of our efforts, we provide our professionals with a variety of learning and networking opportunities, including exposure to leaders, sponsors, coaches, and challenging assignments, to help accelerate their careers along the way. No two people learn in exactly the same way, so we provide a range of resources including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad offices, is an extension of Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Deloitte’s culture
Our positive and supportive culture encourages our people to do their best work every day.
We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship
Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world.

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 305194

Posted 1 week ago

Apply

0.0 - 2.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

Indeed logo

Hiring for AI Engineer – Python Developer

Job Description:
We are seeking a talented Python Developer with hands-on experience in AI chatbot development and familiarity with the Model Context Protocol (MCP) to join our AI team. You will be responsible for developing intelligent, context-aware conversational systems that integrate seamlessly with our internal knowledge base and enterprise services. The ideal candidate is technically proficient, proactive, and capable of translating complex AI interactions into scalable backend solutions.

Key Responsibilities
1. Design and develop robust AI chatbots using Python and integrate them with LLM APIs (e.g., OpenAI, Google AI, etc.).
2. Implement and manage the Model Context Protocol (MCP) to optimize context injection, session management, and model-aware interactions.
3. Build and maintain secure pipelines for knowledge base access that allow the chatbot to accurately respond to internal queries.
4. Work with internal teams to define and evolve the contextual metadata strategy (roles, user state, query history, etc.).
5. Contribute to internal tooling and framework development for contextual AI applications.

Required Skills & Experience
1. 3+ years of professional Python development experience.
2. Proven track record in AI chatbot development, particularly using LLMs.
3. Understanding of the Model Context Protocol (MCP) and its role in enhancing AI interaction fidelity and relevance.
4. Strong experience integrating with AI APIs (e.g., OpenAI, Azure OpenAI).
5. Familiarity with Retrieval-Augmented Generation (RAG) pipelines and vector-based search (e.g., Pinecone, Weaviate, FAISS).
6. Experience designing systems that ingest and structure unstructured knowledge (e.g., PDF, Confluence, Google Drive docs).
7. Comfortable working with RESTful APIs, event-driven architectures, and context-aware services.
8. Good understanding of data handling, privacy, and security standards related to enterprise AI use.

Job Location: Indore
Joining: Immediate
Share your resume at talent@jstechalliance.com or contact 0731-3122400; WhatsApp: 8224006397
Job Type: Full-time
Schedule: Day shift

Application Question(s):
Immediate joiner?
Have you completed your Bachelor's/Master's degree?

Experience:
Python: 3 years (Required)
Model Context Protocol (MCP): 3 years (Required)
LLM APIs: 3 years (Required)
Artificial Intelligence: 2 years (Required)

Location: Indore, Madhya Pradesh (Required)
Work Location: In person
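To make the RAG requirement in item 5 concrete, here is a minimal, illustrative retrieval sketch using FAISS in Python. The sample documents, the toy hashing-based embed() function, and the retrieve() helper are placeholders invented for this example; a real pipeline would use a proper embedding model and pass the retrieved passages to an LLM API for the generation step.

    import numpy as np
    import faiss

    DIM = 64

    def embed(text: str) -> np.ndarray:
        # Placeholder embedding: a normalized bag of hashed tokens,
        # standing in for a real sentence-embedding model.
        vec = np.zeros(DIM, dtype="float32")
        for token in text.lower().split():
            vec[hash(token) % DIM] += 1.0
        norm = np.linalg.norm(vec)
        return vec / norm if norm else vec

    documents = [
        "Leave policy: employees accrue 1.5 days of paid leave per month.",
        "VPN access requires a ticket approved by the IT service desk.",
        "Expense reports must be filed within 30 days of purchase.",
    ]

    # Build an inner-product index over the toy knowledge base.
    index = faiss.IndexFlatIP(DIM)
    index.add(np.stack([embed(d) for d in documents]))

    def retrieve(query: str, k: int = 2) -> list[str]:
        # Embed the query, search the index, and return the top-k passages.
        scores, ids = index.search(embed(query).reshape(1, -1), k)
        return [documents[i] for i in ids[0]]

    # The retrieved passages would then be injected into the chatbot prompt.
    print(retrieve("How many days do I have to submit an expense report?"))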

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Description
The JP Economics and Decision Science team is a central science team working across a variety of topics in the Retail business and beyond. We work closely with business leaders to drive change at Amazon. We focus on solving long-term, ambiguous, and challenging problems, while providing advisory support to help solve short-term business pain points. Key topics include pricing, product selection, delivery speed, profitability, and customer experience. We tackle these issues by building novel economic/econometric models, machine learning systems, and high-impact experiments that we integrate into business, financial, and system-level decision making. We are looking for a Business Intelligence Engineer to work alongside our Scientists to build statistical and ML models, and to work directly with senior leaders to tackle their most critical problems in the Retail business.

Key job responsibilities
Use data insights to drive change across a wide range of retail business inputs
Explore and analyze data, work with Product Managers to understand customer behaviors, and develop complex data pipelines
Work closely with our scientists to design scalable model architectures; identify appropriate metrics and feature sets for the problem, and build/maintain automated pipelines for their extraction
Foster a culture of continuous engineering improvement through mentoring and feedback

Basic Qualifications
5+ years of SQL experience
Experience programming to extract, transform, and clean large (multi-TB) data sets
Experience with the theory and practice of design of experiments and statistical analysis of results
Experience with AWS technologies
Experience in scripting for automation (e.g., Python) and advanced SQL skills
Experience with the theory and practice of information retrieval, data science, machine learning, and data mining
8+ years of analytical experience

Preferred Qualifications
Experience working directly with business stakeholders to translate between data and business needs
Experience managing, analyzing, and communicating results to senior leadership

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - Karnataka - A66
Job ID: A2993715
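As a pointer to what the "design of experiments and statistical analysis of results" requirement typically involves, here is a small, illustrative two-sided z-test comparing conversion rates between an A/B test's control and treatment groups. The counts are made-up numbers and the two_proportion_ztest() helper is written just for this example; a real analysis would also consider power, multiple testing, and practical significance.

    import math

    # Made-up A/B test outcomes: (conversions, visitors) per variant.
    control = (480, 10_000)
    treatment = (535, 10_000)

    def two_proportion_ztest(a, b):
        x1, n1 = a
        x2, n2 = b
        p1, p2 = x1 / n1, x2 / n2
        # Pooled proportion under the null hypothesis of equal rates.
        pooled = (x1 + x2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p2 - p1) / se
        # Two-sided p-value from the standard normal tail.
        p_value = math.erfc(abs(z) / math.sqrt(2))
        return z, p_value

    z, p = two_proportion_ztest(control, treatment)
    lift = treatment[0] / treatment[1] - control[0] / control[1]
    print(f"lift = {lift:.4f}, z = {z:.2f}, p = {p:.4f}")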

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

This role is for one of our clients.
Industry: Technology, Information and Media
Seniority level: Mid-Senior level

About The Role
We’re seeking an experienced and forward-thinking Lead AI Engineer to head the development of cutting-edge AI applications leveraging large language models (LLMs) and generative AI techniques. In this high-impact role, you will guide the architecture, deployment, and optimization of AI systems that power real-world, intelligent applications. You’ll be at the intersection of research and engineering, leading a team of top-tier AI engineers while shaping the future of AI capabilities within the organization.

What You’ll Do
Architect AI Solutions: Design and lead the development of end-to-end AI systems that leverage LLMs, retrieval-augmented generation (RAG), and autonomous agent frameworks.
Model Integration and Optimization: Work hands-on with both open-source models (Llama, Mistral, Falcon, etc.) and proprietary models (GPT-4, Claude, Gemini), fine-tuning them for specialized applications using proprietary data.
Agentic Systems & RAG: Build intelligent agents capable of reasoning, planning, and performing multi-step tasks using agentic architectures and RAG pipelines with vector databases like FAISS or Weaviate.
Production-Grade Engineering: Oversee scalable deployment, versioning, and monitoring of AI models in production using modern MLOps practices and cloud-native platforms (AWS, GCP, Azure).
Cross-functional Collaboration: Partner with data scientists, product managers, and software engineers to transform complex problems into elegant AI-driven products.
AI Team Leadership: Mentor a team of engineers, drive technical direction, and manage project priorities, sprint planning, and roadmap execution.
Innovation & Research: Stay ahead of the curve by tracking advancements in the AI/LLM space. Promote a culture of experimentation, rapid prototyping, and responsible AI.
Ethics & Evaluation: Implement best practices for model interpretability, bias mitigation, and responsible deployment of AI systems.

Must-Have Qualifications
5+ years of hands-on experience in AI/ML engineering with a primary focus on NLP and LLMs.
Proven experience in designing and deploying production-grade AI systems.

Technical Skills
Strong command of Python and frameworks like PyTorch, TensorFlow, LangChain, and LlamaIndex.
Solid knowledge of transformer architectures, embeddings, fine-tuning techniques, and tokenization.
Experience integrating with vector databases (FAISS, Pinecone, Weaviate).
Proficiency in designing APIs and working with distributed systems.
Familiarity with MLOps, containerization (Docker), and orchestration tools (Kubernetes).

Leadership
Experience managing engineering teams and projects, with a track record of shipping successful AI solutions.

Cloud & Deployment
Deep experience deploying AI systems on AWS, GCP, or Azure, and optimizing workloads for cost and performance.

Nice-to-Have
Experience working with multi-modal AI models (text, image, video, audio).
Knowledge of RLHF (Reinforcement Learning from Human Feedback) methodologies.
Contributions to open-source AI projects or peer-reviewed research.
Prior experience at a startup or AI research lab.
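Since the posting distinguishes agentic systems from plain RAG, here is a small, illustrative sketch of the tool-calling loop at the core of most agent architectures. The call_llm() stub, the lookup_order() tool, and the TOOLS registry are placeholders invented for this example; in practice the stub would be a real LLM API call and the loop would add error handling, token budgets, and tracing.

    import json

    def lookup_order(order_id: str) -> str:
        # Placeholder tool; a real agent would call an internal service here.
        return json.dumps({"order_id": order_id, "status": "shipped"})

    TOOLS = {"lookup_order": lookup_order}

    def call_llm(messages: list[dict]) -> dict:
        # Stub standing in for a real LLM API call. It requests one tool
        # call and then finishes, so the loop below always terminates.
        if not any(m["role"] == "tool" for m in messages):
            return {"action": "tool", "tool": "lookup_order", "args": {"order_id": "A-1043"}}
        return {"action": "final", "answer": "Order A-1043 has shipped."}

    def run_agent(user_query: str, max_steps: int = 5) -> str:
        messages = [{"role": "user", "content": user_query}]
        for _ in range(max_steps):
            decision = call_llm(messages)
            if decision["action"] == "final":
                return decision["answer"]
            # Dispatch the requested tool and feed its result back to the model.
            result = TOOLS[decision["tool"]](**decision["args"])
            messages.append({"role": "tool", "content": result})
        return "Stopped: step limit reached."

    print(run_agent("Where is my order A-1043?"))

The step limit is the simplest guard against an agent looping indefinitely; production frameworks layer retries, structured output validation, and cost limits on top of the same basic loop.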

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Linkedin logo

About Us:
We at Ascendeum provide Data and AdTech strategy consultation to businesses worldwide. We started our journey with private funding and have been operating profitably since 2015. We are a lean team of 50+ highly productive and smart people spread across various parts of India who have been consistently delivering intelligent solutions that enable enterprise-level websites and apps to maximize their digital advertising returns.
Official Website: https://ascendeum.com/
LinkedIn: https://www.linkedin.com/company/ascendeum/mycompany/

Job Details:
• Position: Senior Backend Web Developer
• Location: Remote (India)
• Employment Type: Full-time

Key Responsibilities:
• Design, develop, and optimize database structures using MySQL/MariaDB to ensure efficient data storage and retrieval.
• Develop robust web applications using PHP 8 and either the Laravel or Symfony framework.
• Create intuitive and user-friendly interfaces by leveraging HTML, CSS, and JavaScript, focusing on building a minimal viable interface.
• Utilize Git for version control, ensuring code integrity and collaboration within the development team.
• Enhance system performance and automation through Linux-based shell scripting (bash).
• Collaborate with cross-functional teams to understand requirements and deliver high-quality solutions.
• Communicate effectively with team members and stakeholders to provide updates, gather feedback, and ensure project alignment.

Preferred Skills and Qualifications:
• Proficiency in database design and optimization with MySQL/MariaDB.
• Strong command of PHP 8 and experience with the Laravel and/or Symfony frameworks.
• Good understanding of HTML, CSS, and JavaScript.
• Experience with version control systems, particularly Git.
• Familiarity with Linux environments and proficiency in shell scripting (bash).
• Exceptional communication skills, both verbal and written.
• Demonstrated ability to work independently and collaboratively within a team environment.

Posted 1 week ago

Apply

5.0 years

1 - 5 Lacs

Hyderābād

On-site

GlassDoor logo

Minimum qualifications:
Bachelor’s degree, or equivalent practical experience.
5 years of experience with software development in one or more programming languages (e.g., Python, C, C++, Java, JavaScript).
1.5 years of experience in a leadership role (technical leadership or people management, supervision, or team leadership).

Preferred qualifications:
Master's degree or PhD in Computer Science or a related technical field.
1 year of experience working in a complex, matrixed organization.

About the job
Like Google's own ambitions, the work of a Software Engineer goes beyond just Search. Software Engineering Managers have not only the technical expertise to take on and provide technical leadership to major projects, but also manage a team of Engineers. You not only optimize your own code but make sure Engineers are able to optimize theirs. As a Software Engineering Manager you manage your project goals, contribute to product strategy, and help develop your team. Teams work all across the company, in areas such as information retrieval, artificial intelligence, natural language processing, distributed computing, large-scale system design, networking, security, data compression, and user interface design; the list goes on and is growing every day. Operating with scale and speed, our exceptional software engineers are just getting started, and as a manager, you guide the way. With technical and leadership expertise, you manage engineers across multiple teams and locations, manage a large product budget, and oversee the deployment of large-scale projects across multiple sites internationally.

At Corp Eng, we build world-leading business solutions that scale a more helpful Google for everyone. As Google’s IT organization, we provide end-to-end solutions for organizations across Google. We deliver the right tools, platforms, and experiences for all Googlers as they create more helpful products and services for everyone. In the simplest terms, we are Google for Googlers.

Responsibilities
Recruit engineers to join the team, mentor them as they onboard, and help them manage their careers.
Work with product management and other technical leaders to develop technical road maps.
Prototype, develop, and deploy new spatial measurement technologies to figure out how Google's spaces are being used.
Own scalable distributed systems (APIs, microservices) for measuring, storing, serving, and rendering spatial analytics.
Leverage standard technologies (e.g., Angular TS, Java) combined with Google-created technologies (e.g., gRPC, Spanner).

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies