
4114 Retrieval Jobs - Page 34

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

50.0 years

0 Lacs

Sahibzada Ajit Singh Nagar, Punjab, India

On-site


AS A COMPANY
HRS, a pioneer in business travel, aims to elevate every stay through innovative technology. With over 50 years of experience, their digital platform, driven by ProcureTech, TravelTech, and FinTech, transforms how companies and travelers Stay, Work, and Pay. ProcureTech digitally revolutionizes lodging procurement, connecting corporations and suppliers in a cutting-edge ecosystem. This enables seamless efficiency and automation, surpassing travelers' expectations. TravelTech redefines the online lodging experience, offering personalized content from selection to check-in, ensuring an unparalleled journey for corporate travelers. In FinTech, HRS introduces advancements like mobile banking and digital payments, turning corporate back offices into touchless lodging enablers, eliminating legacy cost barriers. The innovative 2-click book-to-pay feature streamlines interactions for travelers and hoteliers. Combining these technology propositions, HRS unlocks exponential catalyst effects. Their data-driven focus delivers value-added services and high-return network effects, creating substantial customer value. Growing exponentially since 1972, HRS now serves over 35% of the global Fortune 500 and leading hotel chains. Join HRS to shape the future of business travel, empowered by a culture of growth and setting new industry standards worldwide.

BUSINESS UNIT
HRS' Product House is a critical function in driving the success of the company's Lodging-as-a-Service (LaaS) platform. The department collaborates with cross-functional teams to define the product vision, roadmap, and strategy, and prioritizes features using analytics and data to meet business goals and deliver an exceptional experience for stakeholders. Product Managers at HRS own the program backlog, define product increments and releases, and are responsible for the product vision, roadmap, pricing, licensing, and ROI.
They possess strong business and technical knowledge, as well as excellent communication and prioritization skills. The department operates based on HRS' leadership principles, putting the customer view first and striving for customer success over commercial success. They think and act big, challenging the status quo, and constantly leaving their comfort zones to achieve growth. As coaches, they hire the most likely to win and help develop team members to become the best through radically candid feedback. Product Managers at HRS are learning pioneers, continually seeking to improve processes, products, commercial models, technologies, and ways of working. They take ownership of the entire customer experience, seeking truth and committing to decisions once they are made. To succeed in the role, candidates must possess strong business and technical know-how, prioritize tasks accurately, and have excellent communication skills. They must also have up-to-date knowledge of the latest trends and technologies and be comfortable presenting their ideas to internal stakeholders.

POSITION
As a Product Manager on the AI Operations team, you will play a central role in transforming our customer support platform through intelligent automation and agentic AI. The AI Ops team is responsible for digitizing and managing a comprehensive customer support ecosystem leveraging cutting-edge AI technologies. You will work closely with design, engineering, data science, customer support leaders, and external partners to define and build sophisticated solutions that enable frictionless customer experiences across multiple channels while driving significant operational efficiency.

CHALLENGE
Build an enterprise-grade AI-powered support platform that ensures seamless experiences for customers across all touchpoints, reducing support interactions by 80% through intelligent automation and self-service capabilities.
Work with our ecosystem partners to integrate our AI-driven solutions into their existing workflows, demonstrate measurable business impact through key performance indicators, and continuously enhance value over time. Identify new product opportunities within our multichannel support and CX case management tools, driving a comprehensive roadmap informed by support analytics, qualitative research, customer feedback, and emerging AI/ML technologies. Collaborate with external vendors (Genesys, Cognigy, AWS Bedrock) to develop custom, scalable integrations that align with our unique requirements while supporting our vision for proactive, personalized customer support. Demonstrate strong expertise in developing and managing agentic AI systems, with the ability to optimize performance, implement robust safeguards, and continuously improve agent capabilities. Navigate a dynamic, fast-paced environment where you'll need to balance immediate operational needs with strategic innovation, exercising autonomy to drive product improvements and make data-driven decisions. Apply structured decision-making frameworks to evaluate opportunities and trade-offs, communicating recommendations through compelling narratives and product requirement documents. Develop a long-term vision and product strategy for the evolution of our AI support platform, effectively communicating with senior leaders to secure buy-in and align cross-functional teams toward executing the strategy.

FOR THIS EXCITING MISSION YOU ARE EQUIPPED WITH...
Outcomes-driven with an established track record of delivering measurable business impact through customer-facing AI products and automation tools. 7+ years of product management experience in enterprise software products, preferably with significant exposure to AI/ML technologies, customer support platforms, or CX management tools.
Demonstrated experience with agentic AI systems, including prompt engineering, retrieval-augmented generation (RAG), and the integration of large language models into production environments. Strong analytical mindset with experience using metrics to identify, size, and solve complex customer support challenges, particularly around scaling efficiency while maintaining or improving service quality. Exceptional written and verbal communication skills with the ability to translate complex technical concepts for diverse stakeholders and drive alignment across engineering, support operations, and leadership. Proven ability to develop and execute product experimentation frameworks, testing hypotheses quickly and making data-driven decisions to improve AI agent performance and customer outcomes. Equally comfortable discussing LLM fine-tuning with an ML engineer, reviewing conversation flows with a UX designer, or analyzing support metrics with operations leaders. A hands-on, problem-solving attitude that enables you to tackle challenges directly, whether debugging AI agent behavior, optimizing conversation flows, or investigating customer escalations. Experience with relevant technologies such as conversational AI platforms, knowledge management systems, and data analysis tools; familiarity with SQL and visualization tools like Power BI or MicroStrategy is highly desirable.

PERSPECTIVE
Access to a globally united and mutually responsible "Tribe of Intrapreneurs" that is passionately dedicated to renewing the travel industry and, in doing so, reinventing how businesses stay, work and pay. Our entrepreneurially driven environment of full ownership and execution focus offers you the playground to contribute to a greater mission, while growing personally and professionally throughout this unique journey.
You will continuously learn from a radical culture of retrospectives and continuous improvement, and actively contribute to making business life better, smarter and more sustainable.

LOCATION, MOBILITY, INCENTIVE

Posted 1 week ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world. Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the Microsoft Fabric platform team builds and maintains the operating system and provides customers a unified data stack to run an entire data estate. The platform provides a unified experience, unified governance, enables a unified business model, and a unified architecture. The Fabric Data Analytics, Insights, and Curation team is leading the way at understanding the Microsoft Fabric composite services and empowering our strategic business leaders. We work with very large and fast-arriving data and transform it into trustworthy insights. We build and manage pipelines, transformations, platforms, models, and so much more that empowers the Fabric product. As an Engineer on our team your core function will be Data Engineering, with opportunities in Analytics, Science, Software Engineering, DevOps, and Cloud Systems. You will be working alongside other Engineers, Scientists, Product, Architecture, and Visionaries bringing forth the next generation of data democratization products. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company.
As a result, our customers are better served.

Responsibilities
You will develop and maintain data pipelines, including solutions for data collection, management, transformation, and usage, ensuring accurate data ingestion and readiness for downstream analysis, visualization, and AI model training.
You will review, design, and implement end-to-end software life cycles, encompassing design, development, CI/CD, service reliability, recoverability, and participation in agile development practices, including on-call rotation.
You will review and write code to implement performance monitoring protocols across data pipelines, building visualizations and aggregations to monitor pipeline health. You'll also implement solutions and self-healing processes that minimize points of failure across multiple product features.
You will anticipate data governance needs, designing data modeling and handling procedures to ensure compliance with all applicable laws and policies.
You will plan, implement, and enforce security and access control measures to protect sensitive resources and data.
You will perform database administration tasks, including maintenance and performance monitoring.
You will collaborate with Product Managers, Data and Applied Scientists, Software and Quality Engineers, and other stakeholders to understand data requirements and deliver phased solutions that meet test and quality programs' data needs, and support AI model training and inference.
You will become an SME of our team's products and provide inputs for strategic vision.
You will champion process, engineering, architecture, and product best practices in the team.
You will work with other team Seniors and Principals to establish best practices in our organization.
You will embody our culture and values.

Qualifications
Required/Minimum Qualifications: Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 4+ years' experience in business analytics, data science, software development, data modeling, or data engineering work; OR Master's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 2+ years' experience in business analytics, data science, software development, or data engineering work; OR equivalent experience. 2+ years of experience in software or data engineering, with proven proficiency in C#, Java, or equivalent. 2+ years in one scripting language for data retrieval and manipulation (e.g., SQL or KQL). 2+ years of experience with ETL and data cloud computing technologies, including Azure Data Lake, Azure Data Factory, Azure Synapse, Azure Logic Apps, Azure Functions, Azure Data Explorer, and Power BI or equivalent platforms.
Preferred/Additional Qualifications: 1+ years of demonstrated experience implementing data governance practices, including data access, security and privacy controls and monitoring to comply with regulatory standards.
Other Requirements: Ability to meet Microsoft, customer and/or government security screening requirements is required for this role.
These requirements include, but are not limited to the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter. Equal Opportunity Employer (EOP) #azdat #azuredata #fabricdata #dataintegration #azure #synapse #databases #analytics #science Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
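The self-healing pipeline processes mentioned in the responsibilities above can be illustrated with a minimal sketch: retry a flaky ingestion stage and record simple health metrics. This is a hedged, generic example; the function names, thresholds, and metrics are illustrative, not any specific Azure or Fabric API.

```python
# Sketch of a self-healing pipeline step: retry a transient failure and
# collect basic health metrics. All names here are illustrative.
def run_with_retry(stage, max_attempts=3):
    """Run a pipeline stage, retrying on failure; return (result, attempts)."""
    for attempt in range(1, max_attempts + 1):
        try:
            return stage(), attempt
        except RuntimeError:
            if attempt == max_attempts:
                raise  # exhausted retries: surface the failure to monitoring

calls = {"n": 0}
def flaky_ingest():
    """Toy ingestion stage that fails once, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient source error")
    return ["row1", "row2"]

rows, attempts = run_with_retry(flaky_ingest)
health = {"rows_ingested": len(rows), "attempts": attempts}
print(health)
```

In a real deployment these counters would feed the pipeline-health visualizations and aggregations the posting describes, rather than a local dict.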

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Location: Hyderabad
Experience Level: Experienced (3 to 5 years)
Education: Bachelor's degree in Technology, Business, or related field

About Us
NationsBenefits is recognized as one of the fastest-growing companies in America and a leading Healthcare Fintech provider. We specialize in supplemental benefits, flex cards, and member engagement solutions. Partnering with managed care organizations, we offer innovative healthcare solutions that drive growth, improve outcomes, reduce costs, and deliver value to members. Through our comprehensive suite of supplemental benefits, fintech payment platforms, and member engagement solutions, we enable health plans to provide high-quality benefits that address social determinants of health, ultimately improving member health outcomes and satisfaction. Our compliance-focused infrastructure, proprietary technology systems, and premier service delivery model empower our health plan partners to offer value-based care to millions of members. We offer a dynamic and fulfilling work environment that attracts top talent, where associates are encouraged to contribute to delivering premier service to both internal and external customers. Our goal is to transform the healthcare industry for the better! We provide opportunities for career advancement across multiple locations in the US, South America, and India.

Job Description
We are seeking a highly skilled and motivated Machine Learning Engineer with expertise in Retrieval-Augmented Generation (RAG), BERT (Bidirectional Encoder Representations from Transformers), and Pinecone to join our innovative AI/ML team.

Responsibilities:
Work on cutting-edge ML models and AI-driven applications: Focus on improving retrieval-augmented generation (RAG) systems and developing scalable solutions.
Design and fine-tune BERT-based models: Implement models for a range of tasks including sentiment analysis, named entity recognition (NER), and question answering.
Utilize Pinecone for scalable vector search solutions: Enhance the performance of search and retrieval systems by implementing fast, scalable, and high-performing vector search techniques.
Manage large datasets: Apply best practices in machine learning to ensure robust model training, validation, and deployment pipelines.
Collaborate with cross-functional teams: Work closely with Data Scientists, Product Managers, and Backend Engineers to deploy AI solutions into production.
Conduct research and experimentation: Improve existing algorithms and model performance in retrieval-augmented generation.
Optimize model performance: Ensure the efficiency, accuracy, and scalability of solutions in production environments.
If you're passionate about leveraging AI/ML technologies to transform the healthcare industry, we would love to hear from you!
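The retrieval step at the core of the RAG and vector-search work described above can be sketched in a few lines. This is a minimal illustration using plain NumPy cosine similarity in place of a managed vector store like Pinecone; the documents and embedding vectors are made up for the example (a real system would embed text with a BERT-style encoder).

```python
# Toy retrieval step of a RAG system: rank documents by cosine similarity.
import numpy as np

def top_k(query_vec, doc_vecs, k=2):
    """Return indices of the k document vectors most similar to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                       # cosine similarity per document
    return np.argsort(scores)[::-1][:k]  # highest-scoring indices first

docs = ["flex card balance", "claims appeal process", "pharmacy benefits"]
# Hypothetical 3-dimensional embeddings, purely for illustration.
vecs = np.array([[1.0, 0.1, 0.0],
                 [0.0, 1.0, 0.2],
                 [0.1, 0.0, 1.0]])
query = np.array([0.9, 0.2, 0.1])
hits = top_k(query, vecs, k=2)
print([docs[i] for i in hits])
```

A vector database does essentially this at scale, with approximate nearest-neighbor indexes replacing the exhaustive dot product.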

Posted 1 week ago

Apply

6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


About Godrej Industries Group (GIG)
Godrej Industries Group (GIG) is a vibrant group of listed Godrej companies. It has a clear focus on Chemicals, FMCG, Real Estate, Agriculture and Financial Services, a set of diverse industries, most of which are defining new India's growth story. At GIG we seek to achieve this growth by fostering an inspiring place to work, while inculcating shared value through a philosophy of 'Good & Green'. Within the group, Godrej Industries Limited is in the business of oleo-chemicals, surfactants, finance & investments, and estate management. In the past few years, the group has also focused on increasing its global footprint in developing economies like Latin America, Indonesia and Africa through its FMCG arm, Godrej Consumer Products Limited (GCPL). GCPL is a leader among the Indian-born FMCG companies with leading household and personal care products. The real estate arm, Godrej Properties Limited (GPL), brings the group's philosophy of innovation and excellence to the real estate industry. It aims to deliver superior value to all stakeholders through extraordinary and imaginative spaces created out of deep customer focus and insight. The agri-business arm, Godrej Agrovet Ltd (GAVL), is dedicated to improving the productivity of Indian farmers by innovating products and services that sustainably increase crop and livestock yields. The company operates in animal feed, oil palm, agri inputs, hybrid seeds, and poultry, in which it is a leader. Godrej Capital (GC) is the vertical that aims to finance your dreams. We understand what's important to you, and taking the Group's legacy of trust, we bring you financial solutions to secure your future, creating moments of joy. www.godrejgroup.com

Designation: AI Engineer – AI Lab
Location: Mumbai, India

Job Purpose
As an AI Engineer in the Godrej AI Lab, you will play a key role in developing and experimenting with AI and GenAI solutions to solve business problems.
This role is deeply technical but also exploratory in nature. You will be responsible for testing feasibility, building prototypes, running experiments, and co-creating solutions with both internal teams and external partners. Your work will directly contribute to evaluating new use cases, scaling successful pilots, and embedding AI into our business workflows. All solutioning, prototyping, testing, and development will be anchored by you, making this a highly engaging role at the heart of real-world AI delivery.

Roles & Responsibilities
Build, test, and iterate on AI and GenAI solutions using industry-standard tools and platforms.
Anchor technical components of AI use cases, from experimentation to solution deployment.
Collaborate with managers, internal tech teams, and external partners to co-develop POCs and scalable solutions.
Conduct feasibility assessments and support decision-making through rapid experimentation.
Translate abstract business problems into model-ready formats, including data preparation and feature engineering.
Work on GenAI components such as prompt design, LLM integration, and retrieval-augmented generation (RAG).
Maintain clean, well-documented code and version control practices.
Integrate solutions with APIs and backend systems as needed, in partnership with engineering teams.
Support model evaluation, optimization, and iterative improvement cycles.
Contribute to the AI Lab's internal knowledge-sharing and tooling base.

Educational Qualification
Bachelor's or Master's in Computer Science, Engineering, or a related technical field.

Experience
4–6 years of hands-on experience in AI/ML, GenAI, or applied data science roles. Experience building or experimenting with AI solutions in practical business settings. Exposure to collaborative delivery with tech teams or solution partners.

Skills
Strong proficiency in Python and libraries like Pandas, NumPy, scikit-learn, PyTorch, TensorFlow.
Experience with GenAI APIs (OpenAI, PaLM, Hugging Face) and vector databases (e.g., Pinecone, FAISS).
Working knowledge of prompt engineering, LLMs, RAG, and LangChain-style pipelines.
Comfortable with code versioning (Git), API-based integration, and cloud environments (GCP, AWS, Azure).
Ability to think critically, test hypotheses, and learn rapidly through experimentation.
Good collaboration skills and attention to clean, modular code.

An inclusive Godrej
Before you go, there is something important we want to highlight. There is no place for discrimination at Godrej. Diversity is the philosophy of who we are as a company, and has been for over a century. It's in our DNA, not just a nice-to-have. Being more diverse, especially having our team members reflect the diversity of our businesses and communities, helps us innovate better and grow faster. We hope this resonates with you. We take pride in being an equal opportunities employer. We recognise merit and encourage diversity. We do not tolerate any form of discrimination on the basis of nationality, race, colour, religion, caste, gender identity or expression, sexual orientation, disability, age, or marital status, and ensure equal opportunities for all our team members. If this sounds like a role for you, apply now! We look forward to meeting you.
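The prompt-design and RAG skills listed above come together in the prompt-assembly step of a retrieval pipeline. The sketch below is a hedged, dependency-free illustration: the keyword-overlap retriever stands in for a vector store, and the function names are made up for the example, not a specific LangChain API.

```python
# Minimal RAG-style prompt assembly: retrieve supporting passages, then
# compose a grounded prompt for an LLM. Names and data are illustrative.
def retrieve(query, corpus, k=2):
    """Toy keyword-overlap retriever standing in for a vector store."""
    q_words = set(query.lower().split())
    scored = sorted(corpus, key=lambda d: -len(q_words & set(d.lower().split())))
    return scored[:k]

def build_prompt(query, passages):
    """Assemble a prompt that constrains the LLM to the retrieved context."""
    context = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = ["Home loans require income proof.",
          "Gold loans are secured by pledged gold.",
          "Fixed deposits earn interest."]
query = "What secures a gold loan?"
passages = retrieve(query, corpus)
prompt = build_prompt(query, passages)
print(prompt)
```

In a production pipeline the retriever would query an embedding index and the prompt would be sent to an LLM API; the assembly logic, however, stays this simple.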

Posted 1 week ago

Apply

2.0 years

0 Lacs

India

On-site


Please find the job description below.

Job Title: GCP Data Modeler
Duration: Full Time
Location: Hybrid (Hyderabad, Chennai, Bengaluru, Pune, Nagpur)
Required skills: GCP, BigQuery, Dataflow, LookML, Looker, SQL, Python

Job Description: Senior Data Modeler with Expertise in GCP and Looker

Overview: We are seeking a highly skilled and experienced Data Modeler to join our data and analytics team. The ideal candidate will have deep expertise in data modeling, particularly with Google Cloud Platform (GCP), and a strong background in managing complex data projects. This role involves designing scalable data models, optimizing workflows, and ensuring seamless data integration to support strategic business decisions.

Key Responsibilities:
Data Modeling: Design, develop, and maintain conceptual, logical, and physical data models to support data warehousing and analytics needs. Ensure data models are scalable, efficient, and aligned with business requirements.
Database Design: Create and optimize database schemas, tables, views, indexes, and other database objects in Google BigQuery. Implement best practices for database design to ensure data integrity and performance.
ETL Processes: Design and implement ETL (Extract, Transform, Load) processes to integrate data from various source systems into BigQuery. Use tools like Google Cloud Dataflow, Apache Beam, or other ETL tools to automate data pipelines.
Data Integration: Work closely with data engineers to ensure seamless integration and consistency of data across different platforms. Integrate data from on-premises systems, third-party applications, and other cloud services into GCP.
Data Governance: Implement data governance practices to ensure data quality, consistency, and security. Define and enforce data standards, naming conventions, and documentation.
Performance Optimization: Optimize data storage, processing, and retrieval to ensure high performance and scalability.
Use partitioning, clustering, and other optimization techniques in BigQuery.
Collaboration: Collaborate with business stakeholders, data scientists, and analysts to understand data requirements and translate them into effective data models. Provide technical guidance and mentorship to junior team members.
Data Visualization: Work with data visualization tools like Looker, Looker Studio, or Tableau to create interactive dashboards and reports. Develop LookML models in Looker to enable efficient data querying and visualization.
Documentation: Document data models, ETL processes, and data integration workflows. Maintain up-to-date documentation to facilitate knowledge sharing and onboarding of new team members.

Required Expertise:
Looker: 2-5+ years of strong proficiency in Looker, including LookML, dashboard creation, and report development.
BigQuery: 5+ years of extensive experience with Google BigQuery, including data warehousing, SQL querying, and performance optimization.
SQL & Python: 10+ years of advanced SQL and Python skills for data manipulation, querying, and modelling.
ETL: 10+ years of hands-on experience with ETL processes and tools for data integration from various source systems.
Cloud Services: Familiarity with Google Cloud Platform (GCP) services, particularly BigQuery, Cloud Storage, and Dataflow.
Data Modelling Techniques: Proficiency in various data modelling techniques such as star schema, snowflake schema, normalized and denormalized models, and dimensional modelling. Knowledge of data modelling frameworks, including Data Mesh, Data Vault, Medallion architecture, and methodologies by Kimball and Inmon, is highly advantageous.
Problem-Solving: Excellent problem-solving skills and the ability to work on complex, ambiguous projects.
Communication: Strong communication and collaboration skills, with the ability to work effectively in a team environment.
Project Delivery: Proven track record of delivering successful data projects and driving business value through data insights.

Preferred Qualifications:
Education: Bachelor's or Master's degree in Data Science, Computer Science, Information Systems, or a related field.
Certifications: Google Cloud certification relevant to data modeling or data engineering.
Visualization Tools: Experience with other data visualization tools such as Looker, Looker Studio and Tableau.
Programming: Familiarity with programming languages such as Python for data manipulation and analysis.
Data Warehousing: Knowledge of data warehousing concepts and best practices.
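The extract-transform-load flow this role centers on can be sketched in miniature: pull raw rows, normalize them against a small dimension lookup (star-schema style), and load them into a fact table. This is a hedged illustration in plain Python with an in-memory "warehouse"; the table and field names are hypothetical, not from any specific GCP project.

```python
# Miniature ETL pipeline: extract -> transform (dimension lookup, type
# normalization) -> load into an in-memory fact table. Names illustrative.
def extract():
    """Stand-in for reading rows from a source system."""
    return [{"user_id": 1, "amount": "120.50"},
            {"user_id": 2, "amount": "80.00"}]

def transform(rows, user_dim):
    """Resolve dimension keys and coerce types, star-schema style."""
    out = []
    for r in rows:
        out.append({
            "user_name": user_dim[r["user_id"]],  # dimension lookup
            "amount": float(r["amount"]),         # string -> numeric
        })
    return out

def load(rows, warehouse):
    """Append transformed rows to the fact table."""
    warehouse.setdefault("fact_sales", []).extend(rows)

user_dim = {1: "asha", 2: "ravi"}   # toy user dimension
warehouse = {}
load(transform(extract(), user_dim), warehouse)
print(warehouse["fact_sales"])
```

Tools like Dataflow or Apache Beam orchestrate the same three stages at scale, with BigQuery playing the role of the warehouse dict.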

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Designation: Computer Vision Engineer
Location: Gurgaon
Salary: Up to 35 LPA
Working: 5 Days
Experience: 5+ Years

Key Responsibilities:
- Design and fine-tune deep learning models for Text2SQL, TTS, and image generation tasks
- Build and optimize Generative AI models using GPT, BERT, T5, Claude, and similar LLMs
- Work with GANs (e.g., Wav2Lip, LipSync) for audio-visual synthesis
- Leverage NVIDIA frameworks like Triton and Omniverse
- Integrate AI models into scalable applications with optimal performance
- Fine-tune models for domain-specific use cases with a focus on accuracy
- Research advancements in NLP and diffusion models for fashion-based image generation
- Deploy and maintain AI solutions in production environments
- Build Retrieval-Augmented Generation (RAG) systems for better knowledge retrieval
- Use and manage vector/graph databases to support AI implementations
- Apply basic computer vision and image processing in real-world projects
- Benchmark AI model performance using standard metrics and datasets
- Develop custom object detection pipelines (YOLOv8, Faster R-CNN, SSD)
- Work closely with engineers, scientists, and stakeholders to meet business needs

Required Skills & Qualifications:
- 5+ years in AI/ML with a focus on Generative AI and LLMs
- Expert in Python, C++, and JavaScript
- Strong with PyTorch, TensorFlow, and Keras
- Experience in NLP, transformer models, and deep learning
- Knowledge of MLOps tools is a plus
- Familiar with AWS, GCP, or Azure for cloud deployments
- Experience with RAG systems, vector (e.g., FAISS, Pinecone) and graph databases (e.g., Neo4j)
- Strong foundation in computer vision, CNNs, and object detection
- Familiar with OpenCV, PIL, and agentic AI tools (LangGraph, CrewAI, AutoGen)
- Proficient in Flask, FastAPI, and WebSocket for serving AI models
- Strong grasp of benchmarking standards like GLUE, ImageNet, BLEU, ROUGE, COCO
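A core post-processing step in the object detection pipelines named above (YOLOv8, Faster R-CNN, SSD) is non-maximum suppression (NMS), which discards overlapping duplicate detections. The sketch below is a compact, dependency-free illustration; the boxes, scores, and IoU threshold are made up for the example.

```python
# Non-maximum suppression: keep the highest-scoring box in each cluster
# of overlapping detections. Boxes are (x1, y1, x2, y2) tuples.
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda box: (box[2] - box[0]) * (box[3] - box[1])
    return inter / (area(a) + area(b) - inter)

def nms(boxes, scores, thresh=0.5):
    """Return indices of kept boxes, highest score first."""
    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
    keep = []
    for i in order:
        # keep a box only if it does not overlap a better box too much
        if all(iou(boxes[i], boxes[j]) <= thresh for j in keep):
            keep.append(i)
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # box 1 overlaps box 0 heavily and is suppressed
```

Production frameworks implement the same idea with vectorized tensor operations and per-class suppression, but the logic is this greedy loop.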

Posted 1 week ago

Apply

4.0 years

0 Lacs

Agra, Uttar Pradesh, India

Remote


Experience: 4+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(Note: This is a requirement for one of Uplers' clients, Forbes Advisor.)

Must-have skills: TensorFlow, PyTorch, RAG, LangChain

Location: Remote (hybrid for candidates from Chennai or Mumbai)

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Extraction Team is a brand-new team that plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks.
Their core function involves creating and refining tools and methodologies to efficiently gather precise and meaningful data from a diverse range of digital platforms. Additionally, this team is tasked with constructing robust data pipelines and implementing Extract, Transform, Load (ETL) processes. These processes are essential for seamlessly transferring the harvested data into our data storage systems, ensuring its ready availability for analysis and utilization. A typical day in the life of a Data Research Engineer will involve coming up with ideas for how the company and team can best harness the power of AI/LLMs, using them not only to simplify operations within the team but also to streamline the work of the research team in gathering and retrieving large sets of data. The role is that of a leader who sets a vision for the future of AI/LLM use within the team and the company. They think outside the box and are proactive in engaging with new technologies and developing new ideas for the team to move forward in the AI/LLM field. The candidate should also be willing to acquire at least some basic skills in scraping and data pipelining.

Responsibilities:
Develop methods to leverage the potential of LLMs and AI within the team.
Be proactive at finding new solutions to engage the team with AI/LLMs and streamline processes in the team.
Be a visionary with AI/LLM tools, anticipating early how future technologies could be harnessed so that when they arrive, the team is ahead of the game in how to use them.
Assist in acquiring and integrating data from various sources, including web crawling and API integration.
Stay updated with emerging technologies and industry trends.
Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
Contribute to cross-functional teams in understanding data requirements.
Assume accountability for achieving development milestones.
Prioritize tasks to ensure timely delivery, in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Leverage online resources effectively like StackOverflow, ChatGPT, Bard, etc., while considering their capabilities and limitations. Skills And Experience Bachelor's degree in Computer Science, Data Science, or a related field. Higher qualifications is a plus. Think proactively and creatively regarding the next AI/LLM technologies and how to use them to the team’s and company’s benefits. “Think outside the box” mentality. Experience prompting LLMs in a streamlined way, taking into account how the LLM can potentially “hallucinate” and return wrong information. Experience building agentic AI platforms with modular capabilities and autonomous task execution. (crewai, lagchain, etc.) Proficient in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration. (chromadb, pinecone, etc) Experience managing a team of AI/LLM experts is a plus: this includes setting up goals and objectives for the team and fine-tuning complex models. Strong proficiency in Python programming Proficiency in SQL and data querying is a plus. Familiarity with web crawling techniques and API integration is a plus but not a must. Experience in AI/ML engineering and data extraction Experience with LLMs, NLP frameworks (spaCy, NLTK, Hugging Face, etc.) Strong understanding of machine learning frameworks (TensorFlow, PyTorch) Design and build AI models using LLMs Integrate LLM solutions with existing systems via APIs Collaborate with the team to implement and optimize AI solutions Monitor and improve model performance and accuracy Familiarity with Agile development methodologies is a plus. Strong problem-solving and analytical skills with attention to detail. Creative and critical thinking. Ability to work collaboratively in a team environment. 
Good and effective communication skills. Experience with version control systems, such as Git, for collaborative development. Ability to thrive in a fast-paced environment with rapidly changing priorities. Comfortable with autonomy and ability to work independently. Perks: Day off on the 3rd Friday of every month (one long weekend each month) Monthly Wellness Reimbursement Program to promote health well-being Monthly Office Commutation Reimbursement Program Paid paternity and maternity leaves How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
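The ETL duties described for the Data Extraction Team follow the standard three-stage shape: pull raw records, normalize and deduplicate them, then load them into storage. A toy stdlib-only sketch using an in-memory SQLite database (the sample rows, field names, and table schema are invented purely for illustration; a real pipeline would extract from scraped pages or APIs):

```python
import sqlite3

def extract() -> list[dict]:
    """Extract: in production these rows would come from a scraper or an
    API; here they are hard-coded samples (note the deliberate duplicate)."""
    return [
        {"title": "Data Research Engineer", "salary": "Confidential"},
        {"title": "Data Research Engineer", "salary": "Confidential"},
    ]

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: trim whitespace and drop duplicate records."""
    seen, out = set(), []
    for r in rows:
        key = (r["title"].strip(), r["salary"].strip())
        if key not in seen:
            seen.add(key)
            out.append(key)
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write the cleaned rows into a jobs table and report the count."""
    conn.execute("CREATE TABLE IF NOT EXISTS jobs (title TEXT, salary TEXT)")
    conn.executemany("INSERT INTO jobs VALUES (?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM jobs").fetchone()[0]

conn = sqlite3.connect(":memory:")
count = load(transform(extract()), conn)
print(count)  # → 1 (the duplicate row was dropped in transform)
```

Keeping each stage a separate pure function makes it easy to swap the extract step for a crawler or a third-party source without touching transform or load.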
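The RAG skill listed above boils down to a retrieve-then-generate loop: embed the query, rank stored documents by similarity, and ground the prompt in the top hits before calling the model. A minimal, library-free sketch of that loop; the bag-of-words "embedding" and the stubbed LLM call are illustrative stand-ins, since a production pipeline would use a real embedding model plus a vector store such as ChromaDB or Pinecone:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector.
    A real pipeline would call an embedding model here."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by similarity to the query and keep the top k."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Ground the prompt in retrieved context before the (stubbed) LLM call.
    Constraining the model to the retrieved context is the usual hedge
    against hallucinated answers."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using ONLY this context:\n{context}\n\nQ: {query}"

docs = [
    "ETL pipelines move scraped data into the warehouse.",
    "The wellness program reimburses gym fees monthly.",
]
print(retrieve("How does scraped data reach the warehouse?", docs))
# → ['ETL pipelines move scraped data into the warehouse.']
```

The swap-in points are exactly where the listing's named tools fit: `embed` becomes a model call, and the sorted scan over `docs` becomes a vector-store query.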

Posted 1 week ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote



Posted 1 week ago

Apply

4.0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote



Posted 1 week ago

Apply

4.0 years

0 Lacs

Kolkata, West Bengal, India

Remote



Posted 1 week ago

Apply

4.0 years

0 Lacs

Cuttack, Odisha, India

Remote



Posted 1 week ago

Apply

4.0 years

0 Lacs

Bhubaneswar, Odisha, India

Remote


Experience: 4+ years. Salary: Confidential (based on experience). Shift: (GMT+05:30) Asia/Kolkata (IST). Opportunity Type: Remote. Placement Type: Full-time Permanent Position. (Note: This is a requirement for one of Uplers' clients - Forbes Advisor.) What do you need for this opportunity? Must-have skills: TensorFlow, PyTorch, RAG, LangChain. Forbes Advisor is looking for: Location - Remote (for candidates from Chennai or Mumbai it's hybrid). Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel. The Data Extraction Team is a brand-new team that plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks.
Their core function involves creating and refining tools and methodologies to efficiently gather precise and meaningful data from a diverse range of digital platforms. Additionally, this team is tasked with constructing robust data pipelines and implementing Extract, Transform, Load (ETL) processes. These processes are essential for seamlessly transferring the harvested data into our data storage systems, ensuring its ready availability for analysis and utilization. A typical day in the life of a Data Research Engineer will involve coming up with ideas for how the company and team can best harness the power of AI/LLMs, using them not only to simplify operations within the team but also to streamline the work of the research team in gathering and retrieving large sets of data. The role is that of a leader who sets a vision for the future use of AI/LLMs within the team and the company. They think outside the box and are proactive in engaging with new technologies and developing new ideas for the team to move forward in the AI/LLM field. The candidate should also at least be willing to acquire some basic skills in scraping and data pipelining.

Responsibilities:
- Develop methods to leverage the potential of LLMs and AI within the team.
- Be proactive in finding new solutions to engage the team with AI/LLMs and streamline processes in the team.
- Be a visionary with AI/LLM tools: anticipate how future technologies could be harnessed early on, so that when those technologies arrive the team is ahead of the game in how they could be used.
- Assist in acquiring and integrating data from various sources, including web crawling and API integration.
- Stay updated with emerging technologies and industry trends.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Contribute to cross-functional teams in understanding data requirements.
- Assume accountability for achieving development milestones.
Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities. Collaborate with and assist fellow members of the Data Research Engineering Team as required. Leverage online resources such as Stack Overflow, ChatGPT, Bard, etc. effectively, while considering their capabilities and limitations.

Skills And Experience:
- Bachelor's degree in Computer Science, Data Science, or a related field; higher qualifications are a plus.
- Thinks proactively and creatively about upcoming AI/LLM technologies and how to use them to the team's and company's benefit; a "think outside the box" mentality.
- Experience prompting LLMs in a streamlined way, taking into account that an LLM can "hallucinate" and return wrong information.
- Experience building agentic AI platforms with modular capabilities and autonomous task execution (e.g., CrewAI, LangChain).
- Proficient in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (e.g., ChromaDB, Pinecone).
- Experience managing a team of AI/LLM experts is a plus; this includes setting goals and objectives for the team and fine-tuning complex models.
- Strong proficiency in Python programming.
- Proficiency in SQL and data querying is a plus.
- Familiarity with web crawling techniques and API integration is a plus, but not a must.
- Experience in AI/ML engineering and data extraction.
- Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.).
- Strong understanding of machine learning frameworks (TensorFlow, PyTorch).
- Ability to design and build AI models using LLMs, integrate LLM solutions with existing systems via APIs, collaborate with the team to implement and optimize AI solutions, and monitor and improve model performance and accuracy.
- Familiarity with Agile development methodologies is a plus.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Comfortable with autonomy and able to work independently.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leaves

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
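The Retrieval-Augmented Generation (RAG) pipelines named in the requirements can be illustrated with a minimal, dependency-free sketch. This is not the client's actual pipeline: a toy keyword-overlap score stands in for the embedding similarity a real vector store (e.g., ChromaDB or Pinecone) would provide, and all names below are hypothetical.

```python
# Minimal illustration of the RAG pattern: retrieve the most relevant
# documents for a query, then assemble them into a grounded prompt that a
# real pipeline would send to an LLM. Toy word-overlap scoring stands in
# for vector similarity.

def score(query: str, doc: str) -> int:
    """Count how many query words appear in the document (toy relevance)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents with the highest overlap score."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Assemble a grounded prompt from the retrieved context."""
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

docs = [
    "Forbes Advisor provides expert-written insights on personal finance.",
    "ETL pipelines move scraped data into storage for analysis.",
    "RAG pipelines ground LLM answers in retrieved documents.",
]
query = "How do RAG pipelines ground LLM answers?"
context = retrieve(query, docs)
prompt = build_prompt(query, context)
```

Grounding the prompt in retrieved documents is exactly the hallucination mitigation the skills list asks candidates to be aware of: the model is asked to answer from supplied context rather than from memory alone.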

Posted 1 week ago

Apply

4.0 years

0 Lacs

Guwahati, Assam, India

Remote



Posted 1 week ago

Apply

4.0 years

0 Lacs

Raipur, Chhattisgarh, India

Remote



Posted 1 week ago

Apply

4.0 years

0 Lacs

Ranchi, Jharkhand, India

Remote



Posted 1 week ago

Apply

4.0 years

0 Lacs

Jamshedpur, Jharkhand, India

Remote


Experience: 4+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time Permanent Position
(Note: This is a requirement for one of Uplers' clients, Forbes Advisor.)

What do you need for this opportunity?
Must-have skills: TensorFlow, PyTorch, RAG, LangChain

Forbes Advisor is looking for:
Location: Remote (hybrid for candidates from Chennai or Mumbai)

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance, so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate and travel.

The Data Extraction Team is a brand-new team that plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Its core function is creating and refining tools and methodologies to efficiently gather precise, meaningful data from a diverse range of digital platforms. The team also builds robust data pipelines and implements Extract, Transform, Load (ETL) processes, which are essential for seamlessly moving the harvested data into our data storage systems and keeping it ready for analysis and use.

A typical day for a Data Research Engineer involves generating ideas on how the company and team can best harness the power of AI/LLMs, not only to simplify operations within the team but also to streamline the research team's work in gathering and retrieving large data sets. This is a leadership role that sets a vision for the future of AI/LLM use within the team and the company: someone who thinks outside the box, proactively engages with new technologies, and develops new ideas to move the team forward in the AI/LLM field. The candidate should also be willing to acquire at least basic skills in scraping and data pipelining.

Responsibilities:
- Develop methods to leverage the potential of LLMs and AI within the team.
- Proactively find new solutions that engage the team with AI/LLMs and streamline team processes.
- Act as a visionary with AI/LLM tools: anticipate how emerging technologies could be harnessed so that when they arrive, the team is already ahead on how to use them.
- Assist in acquiring and integrating data from various sources, including web crawling and API integration.
- Stay updated with emerging technologies and industry trends.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Help cross-functional teams understand data requirements.
- Take accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Use online resources such as Stack Overflow, ChatGPT, Bard, etc. effectively, while keeping their capabilities and limitations in mind.

Skills and Experience:
- Bachelor's degree in Computer Science, Data Science, or a related field; higher qualifications are a plus.
- Thinks proactively and creatively about upcoming AI/LLM technologies and how to use them to the team's and company's benefit; a "think outside the box" mentality.
- Experience prompting LLMs in a streamlined way, accounting for the model's potential to "hallucinate" and return wrong information.
- Experience building agentic AI platforms with modular capabilities and autonomous task execution (CrewAI, LangChain, etc.).
- Proficiency in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (ChromaDB, Pinecone, etc.).
- Experience managing a team of AI/LLM experts is a plus; this includes setting goals and objectives for the team and fine-tuning complex models.
- Strong proficiency in Python programming.
- Proficiency in SQL and data querying is a plus.
- Familiarity with web crawling techniques and API integration is a plus but not a must.
- Experience in AI/ML engineering and data extraction.
- Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.).
- Strong understanding of machine learning frameworks (TensorFlow, PyTorch).
- Ability to design and build AI models using LLMs, and to integrate LLM solutions with existing systems via APIs.
- Collaborate with the team to implement and optimize AI solutions; monitor and improve model performance and accuracy.
- Familiarity with Agile development methodologies is a plus.
- Strong problem-solving and analytical skills with attention to detail; creative and critical thinking.
- Ability to work collaboratively in a team environment.
- Good, effective communication skills.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities; comfortable with autonomy and able to work independently.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leaves

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
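For candidates unfamiliar with the Retrieval-Augmented Generation (RAG) skill named in this listing, the flow can be sketched in a few lines: retrieve the most relevant document for a query, then ground the LLM's prompt in that context to curb hallucination. This is only an illustrative sketch under stated assumptions; a production pipeline would use a real vector store such as ChromaDB or Pinecone with learned embeddings, whereas the documents and the bag-of-words overlap scorer below are invented stand-ins.

```python
# Minimal RAG-style sketch: retrieve the best-matching document, then build
# an LLM prompt grounded in that context. The corpus and the word-overlap
# scorer are illustrative assumptions standing in for a vector store.
from collections import Counter

DOCS = [
    "LangChain provides chains and agents for LLM applications.",
    "ChromaDB is an open-source embedding database for RAG pipelines.",
    "PyTorch is a machine learning framework with dynamic computation graphs.",
]

def score(query: str, doc: str) -> int:
    # Lowercase word-count overlap stands in for embedding similarity.
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    return sum((q & d).values())

def retrieve(query: str, docs=DOCS) -> str:
    # Return the document with the highest overlap score.
    return max(docs, key=lambda doc: score(query, doc))

def build_prompt(query: str) -> str:
    # Grounding the answer in retrieved context is what mitigates the
    # "hallucination" risk the posting emphasises.
    context = retrieve(query)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

if __name__ == "__main__":
    print(build_prompt("Which embedding database supports RAG pipelines?"))
```

Swapping `retrieve` for a ChromaDB or Pinecone query (and `build_prompt` for a LangChain chain) turns this toy into the shape of a real pipeline.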

Posted 1 week ago

Apply

4.0 years

0 Lacs

Amritsar, Punjab, India

Remote



Posted 1 week ago

Apply

4.0 years

0 Lacs

Jaipur, Rajasthan, India

Remote



Posted 1 week ago

Apply

4.0 years

0 Lacs

Greater Lucknow Area

Remote



Posted 1 week ago

Apply

4.0 years

0 Lacs

Nashik, Maharashtra, India

Remote



Posted 1 week ago

Apply

4.0 years

0 Lacs

Thane, Maharashtra, India

Remote

Linkedin logo

Experience: 4.00+ years | Salary: Confidential (based on experience) | Shift: (GMT+05:30) Asia/Kolkata (IST) | Opportunity Type: Remote | Placement Type: Full-time Permanent Position (Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

Must-have skills: TensorFlow, PyTorch, RAG, LangChain

Location: Remote (hybrid for candidates from Chennai or Mumbai)

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news, and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology, and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate, and travel.

The Data Extraction Team is a brand-new team that plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Its core function is creating and refining tools and methodologies to efficiently gather precise, meaningful data from a diverse range of digital platforms. The team also builds robust data pipelines and implements Extract, Transform, Load (ETL) processes, which are essential for moving harvested data into our storage systems and keeping it available for analysis.

A typical day for a Data Research Engineer involves generating ideas for how the company and team can best harness AI/LLMs, using them not only to simplify operations within the team but also to streamline the research team's work in gathering and retrieving large data sets. The role is that of a leader who sets a vision for the future use of AI/LLMs within the team and the company: someone who thinks outside the box, engages proactively with new technologies, and develops new ideas to move the team forward in the AI/LLM field. The candidate should also be willing to acquire at least basic skills in scraping and data pipelining.

Responsibilities:
- Develop methods to leverage the potential of LLMs and AI within the team.
- Proactively find new ways to engage the team with AI/LLMs and streamline team processes.
- Anticipate how emerging AI/LLM technologies could be harnessed, so the team is ahead of the game when those technologies arrive.
- Assist in acquiring and integrating data from various sources, including web crawling and API integration.
- Stay updated on emerging technologies and industry trends.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Contribute to cross-functional teams in understanding data requirements.
- Assume accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Leverage online resources such as StackOverflow, ChatGPT, and Bard effectively, while considering their capabilities and limitations.

Skills and Experience:
- Bachelor's degree in Computer Science, Data Science, or a related field; higher qualifications are a plus.
- Proactive, creative thinking about upcoming AI/LLM technologies and how to use them to the team's and company's benefit; a "think outside the box" mentality.
- Experience prompting LLMs in a streamlined way, accounting for the possibility that an LLM may "hallucinate" and return wrong information.
- Experience building agentic AI platforms with modular capabilities and autonomous task execution (e.g., CrewAI, LangChain).
- Proficiency in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (e.g., ChromaDB, Pinecone).
- Experience managing a team of AI/LLM experts is a plus, including setting goals and objectives for the team and fine-tuning complex models.
- Strong proficiency in Python programming.
- Proficiency in SQL and data querying is a plus.
- Familiarity with web crawling techniques and API integration is a plus, but not a must.
- Experience in AI/ML engineering and data extraction.
- Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.).
- Strong understanding of machine learning frameworks (TensorFlow, PyTorch).
- Ability to design and build AI models using LLMs.
- Ability to integrate LLM solutions with existing systems via APIs.
- Collaborate with the team to implement and optimize AI solutions.
- Monitor and improve model performance and accuracy.
- Familiarity with Agile development methodologies is a plus.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Ability to work collaboratively in a team environment.
- Good, effective communication skills.
- Experience with version control systems such as Git for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Comfortable with autonomy and able to work independently.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leave

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talent find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
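The role above calls for Retrieval-Augmented Generation (RAG) pipelines. As a minimal sketch of the idea: retrieve the documents most relevant to a query, then assemble a grounded prompt around them. The retriever here is a toy bag-of-words cosine similarity; a production setup would use an embedding model and a vector store such as ChromaDB or Pinecone, and the assembled prompt would be sent to an LLM (omitted here, since that call depends on the chosen provider).

```python
# Minimal RAG sketch: toy lexical retrieval plus grounded prompt assembly.
# Assumes nothing beyond the standard library; the example documents are
# invented for illustration.
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term frequencies for a piece of text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=2):
    """Return the k documents most similar to the query."""
    qv = vectorize(query)
    return sorted(documents, key=lambda d: cosine(qv, vectorize(d)), reverse=True)[:k]

def build_prompt(query, documents):
    """Assemble a grounded prompt: retrieved context plus the user question."""
    context = retrieve(query, documents)
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using ONLY this context:\n{joined}\n\nQuestion: {query}"

docs = [
    "HRS serves over 35% of the global Fortune 500.",
    "Pinecone and ChromaDB are common vector stores for RAG.",
    "ETL stands for Extract, Transform, Load.",
]
print(build_prompt("What does ETL stand for?", docs))
```

Constraining the model to answer "using ONLY this context" is one common mitigation for hallucination, since the retrieved passages give the reader (and downstream checks) something concrete to verify answers against.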

Posted 1 week ago

Apply

4.0 years

0 Lacs

Kanpur, Uttar Pradesh, India

Remote

Linkedin logo

Experience: 4.00+ years | Salary: Confidential (based on experience) | Shift: (GMT+05:30) Asia/Kolkata (IST) | Opportunity Type: Remote | Placement Type: Full-time Permanent Position (Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

Must-have skills: TensorFlow, PyTorch, RAG, LangChain

Location: Remote (hybrid for candidates from Chennai or Mumbai)

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news, and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology, and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate, and travel.

The Data Extraction Team is a brand-new team that plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Its core function is creating and refining tools and methodologies to efficiently gather precise, meaningful data from a diverse range of digital platforms. The team also builds robust data pipelines and implements Extract, Transform, Load (ETL) processes, which are essential for moving harvested data into our storage systems and keeping it available for analysis.

A typical day for a Data Research Engineer involves generating ideas for how the company and team can best harness AI/LLMs, using them not only to simplify operations within the team but also to streamline the research team's work in gathering and retrieving large data sets. The role is that of a leader who sets a vision for the future use of AI/LLMs within the team and the company: someone who thinks outside the box, engages proactively with new technologies, and develops new ideas to move the team forward in the AI/LLM field. The candidate should also be willing to acquire at least basic skills in scraping and data pipelining.

Responsibilities:
- Develop methods to leverage the potential of LLMs and AI within the team.
- Proactively find new ways to engage the team with AI/LLMs and streamline team processes.
- Anticipate how emerging AI/LLM technologies could be harnessed, so the team is ahead of the game when those technologies arrive.
- Assist in acquiring and integrating data from various sources, including web crawling and API integration.
- Stay updated on emerging technologies and industry trends.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Contribute to cross-functional teams in understanding data requirements.
- Assume accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Leverage online resources such as StackOverflow, ChatGPT, and Bard effectively, while considering their capabilities and limitations.

Skills and Experience:
- Bachelor's degree in Computer Science, Data Science, or a related field; higher qualifications are a plus.
- Proactive, creative thinking about upcoming AI/LLM technologies and how to use them to the team's and company's benefit; a "think outside the box" mentality.
- Experience prompting LLMs in a streamlined way, accounting for the possibility that an LLM may "hallucinate" and return wrong information.
- Experience building agentic AI platforms with modular capabilities and autonomous task execution (e.g., CrewAI, LangChain).
- Proficiency in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (e.g., ChromaDB, Pinecone).
- Experience managing a team of AI/LLM experts is a plus, including setting goals and objectives for the team and fine-tuning complex models.
- Strong proficiency in Python programming.
- Proficiency in SQL and data querying is a plus.
- Familiarity with web crawling techniques and API integration is a plus, but not a must.
- Experience in AI/ML engineering and data extraction.
- Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.).
- Strong understanding of machine learning frameworks (TensorFlow, PyTorch).
- Ability to design and build AI models using LLMs.
- Ability to integrate LLM solutions with existing systems via APIs.
- Collaborate with the team to implement and optimize AI solutions.
- Monitor and improve model performance and accuracy.
- Familiarity with Agile development methodologies is a plus.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Ability to work collaboratively in a team environment.
- Good, effective communication skills.
- Experience with version control systems such as Git for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Comfortable with autonomy and able to work independently.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leave

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talent find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Req ID: 327884

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an AI Data Scientist - Digital Engineering Sr. Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

ARTIFICIAL INTELLIGENCE
AI Data Scientist | Focused on Generative AI & LLMs
- Design and develop AI/ML models with a focus on LLMs (e.g., GPT, LLaMA, Mistral, Falcon, Claude).
- Apply prompt engineering, fine-tuning, and transfer learning techniques to customize models for enterprise use cases.
- Work with vector databases and retrieval-augmented generation (RAG) pipelines for contextual response generation.
- Collaborate with data engineers, AI engineers, and MLOps teams to deploy models in production environments.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
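The NTT DATA role above mentions working with vector databases inside RAG pipelines. The core operation is nearest-neighbor lookup over embeddings, which this hedged sketch illustrates with an in-memory store. The `embed` function is a toy deterministic hashing embedder standing in for a real embedding model, and the add/query interface only loosely mimics managed services such as Pinecone; it is not any vendor's actual API.

```python
# Toy in-memory vector store with cosine-similarity search, illustrating
# the lookup step of a RAG pipeline. Vectors are L2-normalized, so the
# dot product used in query() equals cosine similarity.
import hashlib
import math

def embed(text, dims=16):
    """Toy embedding: hash each token into a fixed-size dense vector."""
    vec = [0.0] * dims
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

class VectorStore:
    """Minimal in-memory vector index with cosine-similarity search."""
    def __init__(self):
        self.items = []  # (text, vector) pairs

    def add(self, text):
        self.items.append((text, embed(text)))

    def query(self, text, k=1):
        qv = embed(text)
        scored = sorted(
            self.items,
            key=lambda item: sum(a * b for a, b in zip(qv, item[1])),
            reverse=True,
        )
        return [t for t, _ in scored[:k]]

store = VectorStore()
store.add("GPT and LLaMA are large language models.")
store.add("Prompt engineering customizes model behavior.")
print(store.query("Which are large language models?", k=1)[0])
```

In a real deployment the hashing embedder would be replaced by a learned embedding model, and the linear scan by an approximate-nearest-neighbor index, but the retrieve-then-generate contract stays the same.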

Posted 1 week ago

Apply

4.0 years

0 Lacs

Nagpur, Maharashtra, India

Remote

Linkedin logo

Experience: 4.00+ years | Salary: Confidential (based on experience) | Shift: (GMT+05:30) Asia/Kolkata (IST) | Opportunity Type: Remote | Placement Type: Full-time Permanent Position (Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

Must-have skills: TensorFlow, PyTorch, RAG, LangChain

Location: Remote (hybrid for candidates from Chennai or Mumbai)

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news, and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology, and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate, and travel.

The Data Extraction Team is a brand-new team that plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Its core function is creating and refining tools and methodologies to efficiently gather precise, meaningful data from a diverse range of digital platforms. The team also builds robust data pipelines and implements Extract, Transform, Load (ETL) processes, which are essential for moving harvested data into our storage systems and keeping it available for analysis.

A typical day for a Data Research Engineer involves generating ideas for how the company and team can best harness AI/LLMs, using them not only to simplify operations within the team but also to streamline the research team's work in gathering and retrieving large data sets. The role is that of a leader who sets a vision for the future use of AI/LLMs within the team and the company: someone who thinks outside the box, engages proactively with new technologies, and develops new ideas to move the team forward in the AI/LLM field. The candidate should also be willing to acquire at least basic skills in scraping and data pipelining.

Responsibilities:
- Develop methods to leverage the potential of LLMs and AI within the team.
- Proactively find new ways to engage the team with AI/LLMs and streamline team processes.
- Anticipate how emerging AI/LLM technologies could be harnessed, so the team is ahead of the game when those technologies arrive.
- Assist in acquiring and integrating data from various sources, including web crawling and API integration.
- Stay updated on emerging technologies and industry trends.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Contribute to cross-functional teams in understanding data requirements.
- Assume accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Leverage online resources such as StackOverflow, ChatGPT, and Bard effectively, while considering their capabilities and limitations.

Skills and Experience:
- Bachelor's degree in Computer Science, Data Science, or a related field; higher qualifications are a plus.
- Proactive, creative thinking about upcoming AI/LLM technologies and how to use them to the team's and company's benefit; a "think outside the box" mentality.
- Experience prompting LLMs in a streamlined way, accounting for the possibility that an LLM may "hallucinate" and return wrong information.
- Experience building agentic AI platforms with modular capabilities and autonomous task execution (e.g., CrewAI, LangChain).
- Proficiency in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (e.g., ChromaDB, Pinecone).
- Experience managing a team of AI/LLM experts is a plus, including setting goals and objectives for the team and fine-tuning complex models.
- Strong proficiency in Python programming.
- Proficiency in SQL and data querying is a plus.
- Familiarity with web crawling techniques and API integration is a plus, but not a must.
- Experience in AI/ML engineering and data extraction.
- Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.).
- Strong understanding of machine learning frameworks (TensorFlow, PyTorch).
- Ability to design and build AI models using LLMs.
- Ability to integrate LLM solutions with existing systems via APIs.
- Collaborate with the team to implement and optimize AI solutions.
- Monitor and improve model performance and accuracy.
- Familiarity with Agile development methodologies is a plus.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Ability to work collaboratively in a team environment.
- Good, effective communication skills.
- Experience with version control systems such as Git for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Comfortable with autonomy and able to work independently.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leave

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talent find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

Remote

Linkedin logo

Experience: 4.00+ years | Salary: Confidential (based on experience) | Shift: (GMT+05:30) Asia/Kolkata (IST) | Opportunity Type: Remote | Placement Type: Full-time Permanent Position (Note: This is a requirement for one of Uplers' clients - Forbes Advisor)

Must-have skills: TensorFlow, PyTorch, RAG, LangChain

Location: Remote (hybrid for candidates from Chennai or Mumbai)

Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news, and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

At Marketplace, our mission is to help readers turn their aspirations into reality. We arm people with trusted advice and guidance so they can make informed decisions they feel confident in and get back to doing the things they care about most. We are an experienced team of industry experts dedicated to helping readers make smart decisions and choose the right products with ease. Marketplace boasts decades of experience across dozens of geographies and teams, including Content, SEO, Business Intelligence, Finance, HR, Marketing, Production, Technology, and Sales. The team brings rich industry knowledge to Marketplace's global coverage of consumer credit, debt, health, home improvement, banking, investing, credit cards, small business, education, insurance, loans, real estate, and travel.

The Data Extraction Team is a brand-new team that plays a crucial role in our organization by designing, implementing, and overseeing advanced web scraping frameworks. Its core function is creating and refining tools and methodologies to efficiently gather precise, meaningful data from a diverse range of digital platforms. The team also builds robust data pipelines and implements Extract, Transform, Load (ETL) processes, which are essential for moving harvested data into our storage systems and keeping it available for analysis.

A typical day for a Data Research Engineer involves generating ideas for how the company and team can best harness AI/LLMs, using them not only to simplify operations within the team but also to streamline the research team's work in gathering and retrieving large data sets. The role is that of a leader who sets a vision for the future use of AI/LLMs within the team and the company: someone who thinks outside the box, engages proactively with new technologies, and develops new ideas to move the team forward in the AI/LLM field. The candidate should also be willing to acquire at least basic skills in scraping and data pipelining.

Responsibilities:
- Develop methods to leverage the potential of LLMs and AI within the team.
- Proactively find new ways to engage the team with AI/LLMs and streamline team processes.
- Anticipate how emerging AI/LLM technologies could be harnessed, so the team is ahead of the game when those technologies arrive.
- Assist in acquiring and integrating data from various sources, including web crawling and API integration.
- Stay updated on emerging technologies and industry trends.
- Explore third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Contribute to cross-functional teams in understanding data requirements.
- Assume accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities.
- Collaborate with and assist fellow members of the Data Research Engineering Team as required.
- Leverage online resources such as StackOverflow, ChatGPT, and Bard effectively, while considering their capabilities and limitations.

Skills and Experience:
- Bachelor's degree in Computer Science, Data Science, or a related field; higher qualifications are a plus.
- Proactive, creative thinking about upcoming AI/LLM technologies and how to use them to the team's and company's benefit; a "think outside the box" mentality.
- Experience prompting LLMs in a streamlined way, accounting for the possibility that an LLM may "hallucinate" and return wrong information.
- Experience building agentic AI platforms with modular capabilities and autonomous task execution (e.g., CrewAI, LangChain).
- Proficiency in implementing Retrieval-Augmented Generation (RAG) pipelines for dynamic knowledge integration (e.g., ChromaDB, Pinecone).
- Experience managing a team of AI/LLM experts is a plus, including setting goals and objectives for the team and fine-tuning complex models.
- Strong proficiency in Python programming.
- Proficiency in SQL and data querying is a plus.
- Familiarity with web crawling techniques and API integration is a plus, but not a must.
- Experience in AI/ML engineering and data extraction.
- Experience with LLMs and NLP frameworks (spaCy, NLTK, Hugging Face, etc.).
- Strong understanding of machine learning frameworks (TensorFlow, PyTorch).
- Ability to design and build AI models using LLMs.
- Ability to integrate LLM solutions with existing systems via APIs.
- Collaborate with the team to implement and optimize AI solutions.
- Monitor and improve model performance and accuracy.
- Familiarity with Agile development methodologies is a plus.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Ability to work collaboratively in a team environment.
- Good, effective communication skills.
- Experience with version control systems such as Git for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Comfortable with autonomy and able to work independently.

Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leave

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talent find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies