
370 Parsing Jobs - Page 9

JobPe aggregates listings so they are easy to find, but you apply directly on the original job portal.

7.0 - 10.0 years

2 - 3 Lacs

Gurgaon

On-site

Experience: 7 - 10 Years
Location: Gurgaon / Hybrid mode
CTC to be offered: Mention your current and expected CTC
Notice Period: Immediate to 30 days
Key Skills: Splunk, SIEM domain, backend operations (UF, HF, SH, indexer cluster), log management, log collection, parsing, normalization, retention practices, logs/license optimization, design, deployment and implementation, data parsimony, German data security standards, Splunk logging infrastructure, observability tools (ELK, DataDog), network architecture, Linux administration, Syslog, Python, PowerShell or Bash, OEM SIEM, HLD, LLD, implementation guides, operation manuals

Job Description: As Lead Splunk, your role and responsibilities would include:
- Hands-on experience in the SIEM domain.
- Expert knowledge of Splunk backend operations (UF, HF, SH, and indexer cluster) and architecture.
- Expert knowledge of log management and Splunk SIEM, with an understanding of log collection, parsing, normalization, and retention practices.
- Expertise in logs/license optimization techniques and strategy.
- Good understanding of the design, deployment, and implementation of a scalable SIEM architecture.
- Understanding of data parsimony as a concept, especially in terms of German data security standards.
- Working knowledge of integrating the Splunk logging infrastructure with third-party observability tools (e.g., ELK, DataDog).
- Experience in identifying security and non-security logs and applying adequate filters or re-routing the logs accordingly.
- Expertise in understanding network architecture and identifying the components of impact.
- Expertise in Linux administration and proficiency in working with Syslog.
- Proficiency in scripting languages such as Python, PowerShell, or Bash to automate tasks.
- Expertise with OEM SIEM tools, preferably Splunk, and experience with open-source SIEM/log storage solutions such as ELK or Datadog.
- Very good documentation of HLDs, LLDs, implementation guides, and operation manuals.

Note: (i) Our client is looking for immediate and early joiners. (ii) A LinkedIn profile is a must. (iii) As this is an immediate, high-priority requirement, interested candidates should share their resumes with a photograph in Word (.doc) format.
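The responsibilities above revolve around identifying security vs. non-security logs and routing them appropriately. As a minimal illustration of that triage idea (the keyword rules and index names below are invented examples, not a Splunk API):

```python
import re

# Classify raw syslog lines as "security" or "non-security" so they can be
# routed to different Splunk indexes. The patterns are hypothetical examples.
SECURITY_PATTERNS = [
    re.compile(r"\bsshd\b.*\bFailed password\b"),
    re.compile(r"\bsudo\b.*\bCOMMAND=", re.IGNORECASE),
    re.compile(r"\b(denied|unauthorized|firewall)\b", re.IGNORECASE),
]

def route(line: str) -> str:
    """Return the target index name for one syslog line."""
    if any(p.search(line) for p in SECURITY_PATTERNS):
        return "security"      # forward to the SIEM index
    return "os_metrics"        # cheaper index with shorter retention

if __name__ == "__main__":
    sample = "Jun 12 10:01:22 host sshd[114]: Failed password for root"
    print(route(sample))       # -> security
```

In a real deployment this classification would live in forwarder/indexer configuration rather than a script, but the filtering logic is the same.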

Posted 2 weeks ago

Apply

0 years

3 - 5 Lacs

Chennai

On-site

Primary Responsibilities:
- Design and develop AI-driven web applications using Streamlit and LangChain.
- Implement multi-agent workflows with LangGraph.
- Integrate Claude 3 (via AWS Bedrock) into intelligent systems for document and image processing.
- Work with FAISS for vector search and similarity matching.
- Develop document integration solutions for PDF, DOCX, XLSX, PPTX, and image-based formats.
- Implement OCR and summarization features using EasyOCR, PyMuPDF, and AI models.
- Create features such as spell-check, chatbot accuracy tracking, and automatic re-training pipelines.
- Build secure apps with SSO authentication, transcript downloads, and reference link generation.
- Integrate external platforms such as Confluence, SharePoint, ServiceNow, Veeva Vault, Outlook, G.Net/G.Share, and JIRA.
- Collaborate on architecture, performance optimization, and deployment.

Required Skills:
- Strong expertise in Streamlit, LangChain, LangGraph, and Claude 3 (AWS Bedrock).
- Hands-on experience with boto3, FAISS, EasyOCR, and PyMuPDF.
- Advanced skills in document parsing and image/video-to-text summarization.
- Proficiency in modular architecture design and real-time AI response systems.
- Experience in enterprise integration with tools such as ServiceNow, Confluence, Outlook, and JIRA.
- Familiarity with chatbot monitoring and retraining strategies.

Secondary Skills:
- Working knowledge of PostgreSQL, JSON, and file I/O with Python libraries such as os, io, time, datetime, and typing.
- Experience with dataclasses and numpy for efficient data handling and numerical processing.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a global team of 27,000 people that cares about your growth and seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis covered by applicable law. All employment is decided on the basis of qualifications, merit, and business need.
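One of the duties above is FAISS-based vector search and similarity matching. Here is a minimal, self-contained sketch of that pattern; the random vectors stand in for real embeddings produced by a model:

```python
import numpy as np
import faiss  # pip install faiss-cpu

# Index some document-chunk embeddings and retrieve the nearest neighbours
# for a query vector. Dimensions and data are illustrative stand-ins.
dim = 384                                    # typical sentence-embedding size
chunks = np.random.rand(1000, dim).astype("float32")

index = faiss.IndexFlatL2(dim)               # exact L2 search
index.add(chunks)                            # store all chunk vectors

query = np.random.rand(1, dim).astype("float32")
distances, ids = index.search(query, 5)      # top-5 most similar chunks
print(ids[0], distances[0])
```

For larger corpora an approximate index (e.g., IVF or HNSW) would replace IndexFlatL2, trading a little recall for speed.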

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Bengaluru

On-site

Company Description: Bosch Global Software Technologies Private Limited is a 100%-owned subsidiary of Robert Bosch GmbH, one of the world's leading global suppliers of technology and services, offering end-to-end engineering, IT, and business solutions. With over 28,200 associates, it is the largest software development center of Bosch outside Germany, making it the technology powerhouse of Bosch in India, with a global footprint and presence in the US, Europe, and the Asia-Pacific region.

Job Description - Roles & Responsibilities:
- Support testing efforts for the XR software stack and adjacent technologies such as game/3D app dev kits (Unity 3D, Unreal Engine), connectivity (USB, Wi-Fi 6, Wi-Fi 6E), and multimedia (video, camera, graphics, audio, display).
- Write and execute test cases and device automation.
- Evaluate technologies, systems, and devices through testing, logging, and analysis.
- Create test environments using software and hardware tools.
- Develop test procedures, execute tests, and isolate problems.
- Develop automation systems for executing tests and parsing results.
- Develop app/test content for exercising new features.
- Actively study user experience to improve the customer experience of Qualcomm multimedia solutions across ARM-based chipset products.

Qualifications
Educational qualification: Bachelor's or Master's degree in Computer Science, Electronics, Electrical Engineering, or a related field.
Experience: 4-8 years.
Skills:
- Experience working with Windows, Linux, and Android.
- Expertise in Windows and Linux OS concepts and tools (bat scripts, Linux commands).
- Expert in C/C++/Python programming.
- Experience with game/3D app dev kits (Unity 3D, Unreal Engine), connectivity (USB, Wi-Fi 6, Wi-Fi 6E), and multimedia (video, camera, graphics, audio, display).
- Outstanding problem-solving skills.
- Excellent communication and teamworking skills.
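A core duty above is developing automation that executes tests and parses results. A minimal sketch of the parsing half, assuming a hypothetical [PASS]/[FAIL] log format rather than any actual Bosch or Qualcomm tooling:

```python
import re
from collections import Counter

# Scan a device log for PASS/FAIL markers and tally the outcomes.
# The log format here is an invented example.
RESULT_RE = re.compile(r"^\[(PASS|FAIL)\]\s+(\S+)", re.MULTILINE)

def summarize(log_text: str) -> Counter:
    """Count test outcomes found in a raw device log."""
    return Counter(status for status, _name in RESULT_RE.findall(log_text))

log = """[PASS] camera_preview_launch
[FAIL] wifi6e_reconnect
[PASS] video_decode_4k"""
print(summarize(log))   # Counter({'PASS': 2, 'FAIL': 1})
```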

Posted 2 weeks ago

Apply

0 years

0 - 0 Lacs

Bengaluru

Remote

Job Description: We are seeking a creative and independent Web Crawler Developer to join our Seattle-based construction team. The ideal candidate will have a keen eye for detail, a passion for problem-solving, and the ability to think outside the box to develop sophisticated web scraping solutions.

Responsibilities:
- Design, implement, and maintain web crawlers that can effectively extract data from various websites.
- Analyze web page structures and adapt crawlers to extract relevant information efficiently.
- Monitor crawler performance and make necessary adjustments to ensure optimal data collection.
- Work independently to identify new opportunities for data extraction and offer insightful recommendations.
- Ensure compliance with legal and ethical standards for data scraping.
- Collaborate with data analysts and other team members to understand data needs and improve data accuracy.
- Keep up to date with the latest web scraping technologies and best practices.

Qualifications:
- Strong experience with web scraping tools and frameworks (e.g., Scrapy, BeautifulSoup, Selenium).
- Proficiency in programming languages such as Python, Java, or others relevant to web crawling.
- Experience handling and parsing different data formats such as HTML, JSON, and XML.
- Excellent problem-solving skills and the ability to think outside the box.
- Ability to work independently and manage multiple tasks efficiently.
- Solid understanding of web protocols (HTTP, HTTPS) and web technologies.
- Familiarity with version control systems, preferably Git.
- Knowledge of data privacy laws and ethical web scraping practices.

Preferred:
- Experience with cloud services like AWS or Azure for deploying and managing web crawlers.
- Understanding of databases and data storage solutions.
- Previous experience in a similar role or related projects.

Job Type: Contractual / Temporary
Contract length: 2 months
Pay: ₹76,000.00 - ₹80,000.00 per month
Benefits: Work from home
Supplemental Pay: Performance bonus
Expected Start Date: 03/06/2025
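As a sketch of the crawl-and-extract cycle described above, here is a minimal requests + BeautifulSoup example; the URL and CSS selector are placeholders, and a production crawler would also respect robots.txt, throttle itself, and handle transient failures:

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

def extract_titles(url: str) -> list[str]:
    """Fetch a page and pull out listing titles via a CSS selector."""
    resp = requests.get(url, timeout=10, headers={"User-Agent": "demo-bot"})
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # The selector is a hypothetical example; adapt it to the target site.
    return [h.get_text(strip=True) for h in soup.select("h2.title")]

if __name__ == "__main__":
    print(extract_titles("https://example.com/listings"))
```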

Posted 2 weeks ago

Apply

3.0 years

4 - 10 Lacs

Pune

Remote

We help the world run better

At SAP, we enable you to bring out your best. Our company culture is focused on collaboration and a shared passion to help the world run better. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from.

The SAP HANA Database and Analytics Core engine team is looking for an intermediate or senior developer to contribute to our Knowledge Graph Database System engine development. In this role, you will design and develop features for, and maintain, our Knowledge Graph engine, which runs inside the SAP HANA in-memory database. At SAP, all members of the engineering team, including management, are hands-on and close to the code. If you think you can thrive in such an environment and you have the necessary skills and experience, please do not hesitate to apply.

WHAT YOU'LL DO
As a developer, you will have the opportunity to:
- Contribute hands-on coding, design, and architecture best suited to our team size and performance targets.
- Collaborate in a team environment that extends to colleagues in remote locations and from various lines of business within the company.
- Communicate with and guide other teams to construct the best possible queries for their needs.
- Assess new technologies, tools, and infrastructure to keep up with the rapid pace of change.
- Embrace lean and agile software development principles.
- Debug, troubleshoot, and communicate with customers about issues with their data models and queries.
- Continually enhance existing skills and seek new areas for personal development.

WHAT YOU BRING
- Bachelor's degree or equivalent university education in computer science or engineering, with 3-5 years of experience developing enterprise-class software.
- Experience in development with modern C++.
- Knowledge of database internals such as query optimizer/planner, query executor, system management, transaction management, and/or persistence.
- Knowledge of SQL and of graph technologies such as RDF/SPARQL.
- Knowledge of the full SDLC and of developing tests using Python or other tools.
- Experience designing and developing well-encapsulated, object-oriented code.
- Solution-oriented and open-minded; able to manage collaboration with sister teams and partner resources in remote locations.
- High service and customer orientation; skilled in process optimization and driving lasting change.
- Strong analytical thinking and problem-solving skills.
- Interpersonal skills: team player, proactive networker, results- and execution-oriented, motivated to work in an international and intercultural environment.
- Excellent oral and written communication and presentation skills.

MEET YOUR TEAM
The team is responsible for developing HANA Knowledge Graph, a high-performance graph analytics database system made available to SAP customers, partners, and various internal groups as part of the HANA Multi Model Database System. It is specifically designed for processing large-scale graph data and executing complex graph queries with high efficiency. HANA Knowledge Graph enables organizations to gain insights from their graph datasets, discover patterns, perform advanced graph analytics, and unlock the value of interconnected data.

HANA Knowledge Graph uses a massively parallel processing (MPP) architecture to harness the power of distributed computing. It is built on the W3C web standards for graph data and query language: RDF and SPARQL. The components of the HANA Knowledge Graph system include storage, data load, query parsing, query planning and optimization, query execution, transaction management, memory management, network communications, system management, data persistence, backup & restore, and performance tuning. At SAP, HANA Knowledge Graph is set to play a critical role in the development of several AI products.

Bring out your best
SAP innovations help more than four hundred thousand customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with two hundred million users and more than one hundred thousand employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and a commitment to personal development. Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, you can bring out your best.

We win with inclusion
SAP's culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone, regardless of background, feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and need accommodation or special assistance to navigate our website or complete your application, please send an e-mail with your request to the Recruiting Operations Team: Careers@sap.com

For SAP employees: Only permanent roles are eligible for the SAP Employee Referral Program, according to the eligibility rules set in the SAP Referral Policy. Specific conditions may apply for roles in Vocational Training.

EOE AA M/F/Vet/Disability: Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, ethnicity, gender (including pregnancy, childbirth, et al.), sexual orientation, gender identity or expression, protected veteran status, or disability. Successful candidates might be required to undergo a background verification with an external vendor.

Requisition ID: 396628 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid.
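Since the role centres on RDF data and SPARQL query processing, here is a standalone sketch of a SPARQL property-path query using the open-source rdflib library (not HANA itself); the ex: vocabulary is invented purely for illustration:

```python
from rdflib import Graph  # pip install rdflib

# A tiny RDF graph in Turtle syntax, then a SPARQL query over it.
turtle = """
@prefix ex: <http://example.org/> .
ex:hana   ex:partOf ex:sap .
ex:kgraph ex:partOf ex:hana .
"""
g = Graph()
g.parse(data=turtle, format="turtle")

q = """
SELECT ?component WHERE {
    ?component ex:partOf+ ex:sap .   # transitive: anything under ex:sap
}
"""
for row in g.query(q, initNs={"ex": "http://example.org/"}):
    print(row.component)   # ex:hana and ex:kgraph
```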

Posted 2 weeks ago

Apply

9.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Introduction: As a Hardware Developer at IBM, you'll get to work on the systems driving the quantum revolution and the AI era. Join an elite team of engineering professionals who enable IBM customers to make better decisions more quickly on the most trusted hardware platform in today's market.

Your Role and Responsibilities: We are seeking a highly motivated Test Engineer to be part of the hardware team. Join a great team of engineering professionals involved in the development, validation, and delivery of DFT patterns, and in testing those patterns, for IBM's microprocessor chip design team.

Preferred Education: Master's degree.

Required Technical and Professional Expertise:
- 4-9 years of experience in ATE test development, silicon debug, and production support for complex SoC or ASIC devices.
- Strong expertise in test program development, test vector translation, timing setup, and ATE bring-up workflows.
- Proven ability in debugging test failures, analyzing yield and parametric issues, and resolving silicon bring-up and characterization challenges.
- Experience with RMA debug: reproducing, analyzing, and isolating failures in customer-returned or field-returned silicon.
- Hands-on experience with PVT (Process, Voltage, Temperature) characterization using ATE.
- Experience in pattern generation, pattern retargeting, and vector-level debug using standard ATE tools (e.g., Teradyne, Advantest).
- Strong knowledge of pin margin analysis, voltage/timing margining, and correlation between simulation and ATE results.
- Proficiency in automation and scripting using VB (Visual Basic), Perl, Python, and TCL for test flow automation, log parsing, and pattern manipulation.
- Effective collaboration with cross-functional teams, including design, validation, product engineering, and silicon debug, to ensure test robustness and quality.
- Excellent debug and bring-up skills, considered key requirements for this role.
- Detail-oriented, with solid analytical and problem-solving abilities.
- Strong communication skills and the ability to work across global teams.

Preferred Technical and Professional Experience:
- Experience with the Teradyne UltraFlex (UFlex) tester is a plus.
- Familiarity with microcontroller architecture, embedded firmware, and functional verification concepts.
- Experience in post-silicon validation, system-level debug, and yield optimization workflows.
- Knowledge of processor-based test flows, scan diagnostics, and test time optimization.
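The role above calls for Python scripting for log parsing. A minimal sketch of that idea, summarizing per-pin margins from a measurement CSV; the column names are hypothetical, not a real Teradyne or Advantest datalog format:

```python
import csv
from statistics import mean

def pin_margins(path: str) -> dict[str, float]:
    """Average the measured margin (mV) per pin from a results CSV."""
    by_pin: dict[str, list[float]] = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Assumed columns: pin, margin_mv (illustrative only).
            by_pin.setdefault(row["pin"], []).append(float(row["margin_mv"]))
    return {pin: mean(vals) for pin, vals in by_pin.items()}

# Usage: print(pin_margins("lot42_margins.csv"))
```

A real flow would add outlier rejection and correlation against simulation limits, but the parse-then-aggregate shape is the same.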

Posted 2 weeks ago

Apply

14.0 - 16.0 years

0 Lacs

Pune, Maharashtra, India

On-site


We have an urgent job opportunity: C# Developer with XML Expertise
Location: Pune

Overview: We are seeking a talented and motivated C# Developer with a strong background in XML technologies to join our dynamic team in the life insurance sector. The ideal candidate will have 14-16 years of experience in software development, a robust understanding of C# programming and XML handling, and familiarity with the nuances of the US life insurance domain. You will play a key role in developing and maintaining software solutions that support our business operations and enhance our customer experience.

Key Responsibilities:
- Design, develop, and implement software applications using C# that meet business requirements in the life insurance sector.
- Work with XML data formats for data interchange and processing within our applications.
- Collaborate with cross-functional teams, including business analysts, quality assurance, and project management, to gather and refine requirements.
- Perform code reviews, unit testing, and debugging to ensure high-quality software delivery.
- Maintain and enhance existing applications to improve performance and functionality.
- Document software designs, processes, and technical specifications to facilitate knowledge sharing and compliance.
- Stay current with industry trends and technologies related to life insurance and software development.

Qualifications:
- 14-16 years of professional experience in software development using C#, the .NET framework, and related technologies.
- Strong understanding of XML and experience in parsing, transforming, and validating XML data.
- Knowledge of life insurance industry practices, products, and regulatory requirements.
- Experience with SQL Server or other database technologies for data management.
- Proficiency in working with RESTful and SOAP web services.
- Familiarity with Agile development methodologies and tools.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and the ability to work collaboratively in a team-oriented environment.

Preferred Qualifications:
- Experience with related technologies such as ASP.NET, MVC frameworks, and cloud services (e.g., Azure).
- Knowledge of life insurance policy administration systems or underwriting processes.
- Certifications in C#, XML technologies, or Agile methodologies are a plus.
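The core technical requirement above is parsing and validating XML policy data. For consistency with the other sketches on this page, this example is in Python rather than C#; the policy document shape is a hypothetical illustration, not a real carrier schema:

```python
import xml.etree.ElementTree as ET

# A made-up life-insurance policy fragment, parsed and lightly validated.
doc = """<policy id="P-1001">
  <insured><name>A. Sample</name><age>42</age></insured>
  <faceAmount currency="USD">250000</faceAmount>
</policy>"""

root = ET.fromstring(doc)
face = root.find("faceAmount")
# Minimal "validation": required fields must exist before use.
assert root.get("id") and face is not None, "required fields missing"
print(root.get("id"), face.get("currency"), int(face.text))
```

In C# the equivalent would typically use XDocument/XmlSerializer with XSD validation, which this stdlib sketch deliberately simplifies.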

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Data Integration Developer

About the Role: We are looking for a skilled Data Integration Developer to join our award-winning team, which recently earned the "Outstanding Data Engineering Team" award at DES 2025. In this role, you will be instrumental in building and maintaining cloud-based data pipelines that power AI and Intelligent Document Automation services. Your work will directly support scalable, production-grade workflows that transform structured and unstructured documents into actionable data using cutting-edge machine learning solutions. You'll collaborate cross-functionally with data scientists, AI/ML engineers, cloud engineers, and product owners to ensure robust pipeline design, integration, observability, and performance at scale.

Key Responsibilities:
- Design, develop, and maintain end-to-end data ingestion and integration pipelines on Google Cloud Platform (GCP).
- Implement robust workflows from document ingestion and file triggers to downstream ML integration and storage (e.g., BigQuery).
- Integrate and manage RESTful APIs and asynchronous job queues to support ML/OCR services.
- Collaborate with AI/ML teams to deploy and scale intelligent automation solutions.
- Containerize services using Docker and manage deployments via Cloud Run and Cloud Build.
- Ensure production readiness through monitoring, logging, and observability best practices.
- Utilize Infrastructure-as-Code tools (e.g., Terraform) for provisioning and environment consistency.
- Work in Agile/Scrum teams, participating in sprint planning, reviews, and backlog refinement.

Required Skills & Experience:
- 7+ years of experience in data integration, cloud data engineering, or backend development.
- Strong proficiency in Python, REST APIs, and handling asynchronous workloads.
- Solid hands-on experience with GCP services, including Pub/Sub, Cloud Storage (GCS), BigQuery, Cloud Run, and Cloud Build.
- Experience with Docker and managing containerized microservices.
- Familiarity with Infrastructure-as-Code tools like Terraform.
- Exposure to AI/ML integration, OCR pipelines, or document understanding frameworks is a strong advantage.
- Experience with CI/CD pipelines and automated deployment workflows.

Preferred Qualifications:
- Knowledge of data formats like JSON, XML, or PDF parsing.
- Prior experience working on document processing, intelligent automation, or MLOps projects.
- GCP certification (e.g., Professional Data Engineer or Associate Cloud Engineer) is a plus.

Why Join Us? Join a recognized leader in data engineering, recently awarded "Outstanding Data Engineering Team" at DES 2025. Work on mission-critical AI and automation products that directly impact real-world use cases. Thrive in a collaborative, learning-driven culture with opportunities to grow in AI/ML and cloud technologies.

Experience: 7 to 12 Yrs
Location: Chennai/Pune/Coimbatore/Bangalore
Notice: Immediate to 1 Week

Regards,
TA Team, KANINI Software Solutions
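One pattern named above is wiring file triggers to downstream processing via Pub/Sub. A minimal sketch of a pull subscriber reacting to GCS object notifications, following the standard google-cloud-pubsub client usage; the project and subscription IDs are placeholders:

```python
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

subscriber = pubsub_v1.SubscriberClient()
# Placeholder names; real values come from your GCP project.
subscription_path = subscriber.subscription_path("my-project", "doc-ingest-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # GCS notifications carry the bucket/object of the uploaded document.
    bucket = message.attributes.get("bucketId")
    name = message.attributes.get("objectId")
    print(f"new document: gs://{bucket}/{name}")  # hand off to OCR/ML here
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        streaming_pull_future.result(timeout=30)  # brief blocking demo
    except TimeoutError:
        streaming_pull_future.cancel()
```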

Posted 2 weeks ago

Apply

7.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Title: Lead Splunk Engineer
Location: Gurgaon (Hybrid)
Experience: 7-10 Years
Employment Type: Full-time
Notice Period: Immediate joiners preferred

Job Summary: We are seeking an experienced Lead Splunk Engineer to design, deploy, and optimize SIEM solutions, with expertise in Splunk architecture, log management, and security event monitoring. The ideal candidate will have hands-on experience in Linux administration, scripting, and integrating Splunk with tools like ELK and DataDog.

Key Responsibilities:
✔ Design and deploy scalable Splunk SIEM solutions (UF, HF, SH, indexer clusters).
✔ Optimize log collection, parsing, normalization, and retention.
✔ Ensure license and log optimization for cost efficiency.
✔ Integrate Splunk with third-party tools (ELK, DataDog, etc.).
✔ Develop automation scripts (Python/Bash/PowerShell).
✔ Create technical documentation (HLD, LLD, runbooks).

Skills Required:
🔹 Expert in Splunk (architecture, deployment, troubleshooting)
🔹 Strong SIEM and log management knowledge
🔹 Linux/Unix administration
🔹 Scripting (Python, Bash, PowerShell)
🔹 Experience with ELK/DataDog
🔹 Understanding of German data security standards (GDPR/data parsimony)

Why Join Us? Opportunity to work with cutting-edge security tools. Hybrid work model (Gurgaon-based). Collaborative, growth-oriented environment.

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Title: Sr. Splunk Architect
Experience: 7+ Years
Location: Gurgaon (Hybrid)
Notice Period: Immediate joiner / serving

Responsibilities: As Lead Splunk, your role and responsibilities would include:
- Hands-on experience in the SIEM domain.
- Expert knowledge of Splunk backend operations (UF, HF, SH, and indexer cluster) and architecture.
- Expert knowledge of log management and Splunk SIEM; understanding of log collection, parsing, normalization, and retention practices.
- Expertise in logs/license optimization techniques and strategy.
- Good understanding of designing, deploying, and implementing a scalable SIEM architecture.
- Understanding of data parsimony as a concept, especially in terms of German data security standards.
- Working knowledge of integrating the Splunk logging infrastructure with third-party observability tools (e.g., ELK, DataDog).
- Experience in identifying security and non-security logs and applying adequate filters or re-routing the logs accordingly.
- Expertise in understanding network architecture and identifying the components of impact.
- Expert in Linux administration; proficient in working with Syslog.
- Proficiency in scripting languages like Python, PowerShell, or Bash to automate tasks.
- Expertise with OEM SIEM tools, preferably Splunk; experience with open-source SIEM/log storage solutions like ELK or Datadog.
- Very good documentation of HLDs, LLDs, implementation guides, and operation manuals.

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Title: QA Networking (Manual + Python)
Location: Bangalore
Experience: 10+ Years
Notice Period: Immediate to 30 Days

About the Company: Calsoft is a leading technology-first partner providing digital and product engineering services. For over 25 years, Calsoft has been helping customers solve business challenges through technologies in storage, virtualization, networking, security, cloud, AI/ML, IoT, and telecommunications. As a technology-first company, our mission is to find new ways to solve customer problems by leveraging agility, technological innovation, and deep engineering expertise. With a global presence, thousands of innovators, and a clientele including Fortune 500 companies, Calsoft is a trusted digital transformation partner.

Job Title: SONiC Platform Validation Engineer (Layer 1 / Layer 2)

Job Description: We are seeking a highly skilled and motivated SONiC Platform Validation Engineer to join our team. The ideal candidate will be responsible for validating and ensuring the robustness of the SONiC platform at Layer 1 and will have SerDes-level testing experience. This role involves working closely with hardware and software teams to ensure seamless integration and optimal performance.

Key Responsibilities:
- Develop and execute test plans for validating Layer 1 functionalities of the SONiC platform or any other NOS.
- Hands-on experience with industry network operating systems such as Cisco XR/JUNOS at the L1 and L2 levels.
- Hands-on with designing and developing scripted test automation infrastructure for system electrical validation and test automation.
- Experience using Python for data parsing/analysis of L1-L3 performance metrics (e.g., SerDes performance, FEC SER, packet statistics, power, temperature).
- Experience with iterative test-infrastructure handling with traffic generators and PDUs through API RPCs, TCP/IP, and Python.
- Good understanding of PCS/PMA and MAC-level test cases.
- Good understanding of Ethernet standards, including speed and FEC.
- Ensure all hardware and protocols meet industry standards and regulatory requirements, including Ethernet (IEEE 802.3), VLANs (IEEE 802.1Q), and VRF.
- Experience with Layer 1 testing and hardware validation for data center switching products from 1.8T to 25.6T, with 40G, 100G, and 400G/800G ports.
- Hands-on experience with SerDes tuning and channel optimization: Tx FIR tuning for host-side links between ASIC/retimer/module loopbacks, and optical eye compliance to meet IEEE 802.3 requirements at 10G, 25G, and 50Gbps data rates.
- Hands-on with different media types such as DAC/AEC/optical modules from 10G to 800G, in form factors such as QSFP-DD/OSFP/QSFP56/QSFP112.
- Must have worked with traffic generators such as IXIA/XENA/JDSU/Spirent.
- Collaborate with hardware teams to ensure proper integration and functionality of hardware components (e.g., ASICs).
- Perform detailed analysis and debugging of Layer 1 issues.
- Document test results and provide comprehensive reports to stakeholders.
- Work with cross-functional teams to resolve issues and improve platform stability.
- Stay updated on the latest advancements in SONiC and related technologies.
- Collaborate with firmware and driver teams to validate Layer 1 interactions with higher layers.

Qualifications:
- Bachelor's or Master's degree in Electrical Engineering, Computer Science, or a related field.
- 10+ years of experience in Layer 1 and Layer 2 validation.
- Strong understanding of Layer 1 and Layer 2 protocols and technologies.
- Experience with SONiC or any other NOS platform.
- Proficiency in scripting languages such as Python or Bash.
- Familiarity with hardware validation and debugging tools.
- Excellent problem-solving skills and attention to detail.

Application Instructions: Please share the following mandatory details along with your application: total IT experience; experience in a validation engineer role; experience with networking protocols; Layer 1 and Layer 2 validation experience; experience with Docker/containerization; current CTC; expected CTC; current location; notice period; and whether you are comfortable working in Indore (in-person interview required).
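Among the skills above is using Python to parse L1 metrics such as FEC symbol error rates. A minimal sketch of that kind of parsing; the CLI dump format and threshold are invented examples, not actual SONiC output:

```python
import re

# Pull per-lane FEC symbol-error-rate counters out of a (hypothetical)
# NOS CLI dump and flag lanes above a limit.
FEC_RE = re.compile(r"lane(?P<lane>\d+)\s+ser=(?P<ser>[\d.eE+-]+)")

def failing_lanes(cli_dump: str, limit: float = 1e-5) -> list[int]:
    readings = {int(m["lane"]): float(m["ser"]) for m in FEC_RE.finditer(cli_dump)}
    return [lane for lane, ser in readings.items() if ser > limit]

dump = "lane0 ser=2.1e-07\nlane1 ser=3.4e-04\nlane2 ser=9.9e-08"
print(failing_lanes(dump))  # [1]
```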

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka

On-site


- 3+ years of experience building models for business applications
- PhD, or Master's degree and 4+ years of experience in CS, CE, ML, or a related field
- Experience with patents or publications at top-tier peer-reviewed conferences or journals
- Experience programming in Java, C++, Python, or a related language
- Experience in any of the following areas: algorithms and data structures, parsing, numerical optimization, data mining, parallel and distributed computing, high-performance computing

Do you want to join an innovative team of scientists who use machine learning and statistical techniques to create state-of-the-art solutions for providing better value to Amazon's customers? Do you want to build and deploy advanced algorithmic systems that help optimize millions of transactions every day? Are you excited by the prospect of analyzing and modeling terabytes of data to solve real-world problems? Do you like to own end-to-end business problems/metrics and directly impact the profitability of the company? Do you like to innovate and simplify? If yes, then you may be a great fit for the Machine Learning team for India Consumer Businesses. If you have an entrepreneurial spirit, know how to deliver, love to work with data, are deeply technical, highly innovative, and long for the opportunity to build solutions to challenging problems that directly impact the company's bottom line, we want to talk to you.

Major responsibilities:
- Use machine learning and analytical techniques to create scalable solutions for business problems
- Analyze and extract relevant information from large amounts of Amazon's historical business data to help automate and optimize key processes
- Design, develop, evaluate, and deploy innovative and highly scalable models for predictive learning
- Research and implement novel machine learning and statistical approaches
- Work closely with software engineering teams to drive real-time model implementations and new feature creation
- Work closely with business owners and operations staff to optimize various business operations
- Establish scalable, efficient, automated processes for large-scale data analyses, model development, model validation, and model implementation
- Mentor other scientists and engineers in the use of ML techniques

About the team: The India Machine Learning team works closely with the business and engineering teams in building ML solutions that create an impact for Amazon's IN businesses. This is a great opportunity to leverage your machine learning and data mining skills to create a direct impact on consumers and end users.

Experience using Unix/Linux. Experience in professional software development.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Title: Automation Lead
Company: WellnessLiving
Location: Bangalore, India
Salary: Monthly service fee (based on experience)
Job Type: Contract
Date Posted: December 3rd, 2024
Reporting Manager: QA Manager

About Us: At WellnessLiving, we empower thousands of health and wellness business owners to turn their entrepreneurial dreams into reality. Our mission-critical software fuels their vision, supporting millions of clients around the world in their wellness journeys. With a deep commitment to putting our customers first, we foster a culture that values high performance, adaptability, and accountability. If you are a skilled professional who thrives in a fast-paced, customer-focused environment and is passionate about making a meaningful impact on the health and wellness industry, we would love to connect with you.

About You: We are seeking an experienced and detail-oriented Automation QA Lead to join our QA team. The ideal candidate will bring a solid technical background in test automation, a passion for quality assurance, and demonstrated expertise in leading automation initiatives. As the Automation QA Lead, you will be responsible for designing, developing, and executing automated test strategies to ensure the highest product quality. Additionally, you will mentor junior team members and collaborate closely with cross-functional teams to deliver world-class products.

At WellnessLiving, our team is driven by four core values that shape everything we do. If you share these values and meet the qualifications outlined for this role, we encourage you to apply; we'd love to learn more about you!
- Customer First: We approach every challenge with a customer-focused lens, driven by an obsession with our customers' happiness and success.
- Excellence: We approach every task, whether big or small, with a steadfast commitment to exceptional execution and the pursuit of greatness.
- Accountability: We take full ownership of our decisions, actions, and outcomes, both successes and failures.
- Adaptability: We recognize that sustained success demands that we be malleable and purposefully evolve, acknowledging that the world is dynamic and constantly changing.

Responsibilities:
- Develop and maintain automation testing strategies.
- Select and optimize tools/frameworks for test automation (frontend, backend, and API testing).
- Regularly review strategies for improved test coverage and reliability.
- Mentor team members in automation best practices.
- Work with stakeholders, including QA managers, developers, and product managers.
- Collaborate with DevOps for seamless CI/CD integration.
- Create test plans, strategies, and test cases.
- Establish standards for execution and documentation.
- Manage test environments and datasets effectively.
- Track test coverage, results, and defect status.
- Perform root cause analysis for failures.
- Deliver progress reports to stakeholders.
- Optimize automation processes for efficiency and reliability.
- Enhance test coverage while minimizing maintenance costs.
- Stay updated with industry trends and tools.

Skills & Qualifications:
- Bachelor's degree in Computer Science or a related field.
- 8+ years in QA, including 5+ years in test automation and 3+ years in a lead role.
- API testing tools like Postman and REST Assured, with JSON parsing and API contract testing.
- Performance tools like JMeter to assess scalability and throughput.
- Security testing with OWASP ZAP/Burp Suite for vulnerabilities.
- Proficiency in Cypress, Selenium, and Karate, with scripting in Java or Python.
- Testing in Docker/Kubernetes with tools like Testcontainers.
- CI/CD tools (Jenkins, GitHub CI/Actions).
- Test management tools (TestRail, Xray, Zephyr).
- Leadership and team-building.
- Problem-solving and analytical capabilities.
- Strong communication for technical and non-technical collaboration.

Please note that those who meet the qualifications for the position will be contacted directly. We appreciate you taking the time and look forward to reviewing your application.

WellnessLiving is an equal-opportunity employer. At WellnessLiving, we are proud to embrace and celebrate differences. Employment at WellnessLiving is based purely on a candidate's qualifications and experience as they directly relate to professional competencies. WellnessLiving does not discriminate against any employee or potential employee because of race, creed, color, religion, gender, sexual orientation, gender identity/expression, national origin, disability, age, genetic information, veteran status, marital status, family or parental status, or any other status protected by the laws and regulations in the locations where we operate. Furthermore, we will not tolerate bias or discrimination of any kind from our employees or customers. At WellnessLiving, we bring everyone together to create something incredible! We are a unique and diverse blend of leaders and action-takers, and that mindset encompasses our passion and commitment to our product and our employees. We utilize AI to generate summaries of interview notes as part of our candidate evaluation process. This helps ensure a fair and consistent review while maintaining a human-centered hiring approach.
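The qualifications above mention API contract testing with JSON parsing (e.g., via REST Assured). Here is the same idea sketched in Python with requests and jsonschema; the endpoint and schema are placeholders:

```python
import requests
from jsonschema import validate  # pip install requests jsonschema

# A hypothetical response contract for a booking endpoint.
BOOKING_SCHEMA = {
    "type": "object",
    "required": ["id", "status"],
    "properties": {
        "id": {"type": "integer"},
        "status": {"type": "string", "enum": ["confirmed", "cancelled"]},
    },
}

def test_get_booking():
    resp = requests.get("https://api.example.com/bookings/1", timeout=5)
    assert resp.status_code == 200
    # Fails the test if the parsed JSON violates the contract.
    validate(instance=resp.json(), schema=BOOKING_SCHEMA)
```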

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

Remote


We help the world run better

At SAP, we enable you to bring out your best. Our company culture is focused on collaboration and a shared passion to help the world run better. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from.

The SAP HANA Database and Analytics Core engine team is looking for an intermediate or senior developer to contribute to our Knowledge Graph Database System engine development. In this role, you will design and develop features for, and maintain, our Knowledge Graph engine, which runs inside the SAP HANA in-memory database. At SAP, all members of the engineering team, including management, are hands-on and close to the code. If you think you can thrive in such an environment and you have the necessary skills and experience, please do not hesitate to apply.

What You'll Do
As a developer, you will have the opportunity to:
- Contribute hands-on coding, design, and architecture best suited to our team size and performance targets.
- Collaborate in a team environment that extends to colleagues in remote locations and from various lines of business within the company.
- Communicate with and guide other teams to construct the best possible queries for their needs.
- Assess new technologies, tools, and infrastructure to keep up with the rapid pace of change.
- Embrace lean and agile software development principles.
- Debug, troubleshoot, and communicate with customers about issues with their data models and queries.
- Continually enhance existing skills and seek new areas for personal development.

What You Bring
- Bachelor's degree or equivalent university education in computer science or engineering, with 3-5 years of experience developing enterprise-class software.
- Experience in development with modern C++.
- Knowledge of database internals such as query optimizer/planner, query executor, system management, transaction management, and/or persistence.
- Knowledge of SQL and of graph technologies such as RDF/SPARQL.
- Knowledge of the full SDLC and of developing tests using Python or other tools.
- Experience designing and developing well-encapsulated, object-oriented code.
- Solution-oriented and open-minded; able to manage collaboration with sister teams and partner resources in remote locations.
- High service and customer orientation; skilled in process optimization and driving lasting change.
- Strong analytical thinking and problem-solving skills.
- Interpersonal skills: team player, proactive networker, results- and execution-oriented, motivated to work in an international and intercultural environment.
- Excellent oral and written communication and presentation skills.

MEET YOUR TEAM
The team is responsible for developing HANA Knowledge Graph, a high-performance graph analytics database system made available to SAP customers, partners, and various internal groups as part of the HANA Multi Model Database System. It is specifically designed for processing large-scale graph data and executing complex graph queries with high efficiency. HANA Knowledge Graph enables organizations to gain insights from their graph datasets, discover patterns, perform advanced graph analytics, and unlock the value of interconnected data.

HANA Knowledge Graph uses a massively parallel processing (MPP) architecture to harness the power of distributed computing. It is built on the W3C web standards for graph data and query language: RDF and SPARQL. The components of the HANA Knowledge Graph system include storage, data load, query parsing, query planning and optimization, query execution, transaction management, memory management, network communications, system management, data persistence, backup & restore, and performance tuning. At SAP, HANA Knowledge Graph is set to play a critical role in the development of several AI products.

Bring out your best
SAP innovations help more than four hundred thousand customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management. As a cloud company with two hundred million users and more than one hundred thousand employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and a commitment to personal development. Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, you can bring out your best.

We win with inclusion
SAP's culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone, regardless of background, feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and need accommodation or special assistance to navigate our website or complete your application, please send an e-mail with your request to the Recruiting Operations Team: Careers@sap.com

For SAP employees: Only permanent roles are eligible for the SAP Employee Referral Program, according to the eligibility rules set in the SAP Referral Policy. Specific conditions may apply for roles in Vocational Training.

EOE AA M/F/Vet/Disability: Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, ethnicity, gender (including pregnancy, childbirth, et al.), sexual orientation, gender identity or expression, protected veteran status, or disability. Successful candidates might be required to undergo a background verification with an external vendor.

Requisition ID: 396628 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: .

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Responsibilities:
- Design, develop, test, and deploy robust and scalable Android TV applications using Kotlin/Java.
- Implement user interfaces optimized for the 10-foot experience, ensuring intuitive navigation and engaging visuals.
- Integrate with various backend services and APIs to fetch and display content, handle user authentication, and manage subscriptions.
- Optimize application performance, memory usage, and responsiveness on a variety of Android TV devices, including set-top boxes, smart TVs, and streaming sticks.
- Work with video playback technologies, including ExoPlayer, DRM solutions (Widevine, PlayReady), and adaptive streaming protocols (HLS, DASH).
- Ensure applications adhere to Android TV design guidelines and best practices, including D-pad navigation, voice search integration, and Leanback UI components.
- Debug and resolve complex technical issues, identify root causes, and implement effective solutions.
- Participate in code reviews, contribute to architectural discussions, and mentor junior engineers.
- Stay up to date with the latest Android TV platform features, tools, and industry trends.
- Collaborate with QA engineers to define test plans and ensure high-quality releases.
- Document technical designs, implementation details, and APIs.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- 3+ years of professional experience in Android application development.
- Proven experience developing and deploying applications specifically for Android TV.
- Strong proficiency in the Kotlin and Java programming languages.
- Deep understanding of the Android SDK, Android Jetpack components, and Material Design principles.
- Extensive experience with the Android TV Leanback library and its components.
- Solid understanding of video streaming technologies, including ExoPlayer, HLS, and DASH.
- Experience with RESTful APIs, JSON parsing, and asynchronous programming.
- Familiarity with version control systems (Git).
- Ability to write clean, maintainable, and well-documented code.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork skills.

Preferred Qualifications:
- Experience with DRM technologies (e.g., Widevine, PlayReady).
- Familiarity with CI/CD pipelines for Android TV applications.
- Experience with dependency injection frameworks (e.g., Dagger Hilt, Koin).
- Knowledge of unit testing and integration testing frameworks (e.g., JUnit, Espresso).
- Experience with analytics and crash reporting tools.
- Understanding of accessibility best practices for TV applications.
- Contributions to open-source projects related to Android TV.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai

On-site

Role: Senior Gen AI Engineer
Job Location: Hyderabad
Mode of Interview: Virtual

Job Description:
- Collect and prepare data for training and evaluating multimodal foundation models. This may involve cleaning and processing text data or creating synthetic data.
- Develop and optimize large-scale generative models such as GANs (Generative Adversarial Networks) and VAEs (Variational Autoencoders).
- Work on tasks involving language modeling, text generation, understanding, and contextual comprehension.
- Regularly review and fine-tune large language models to ensure maximum accuracy and relevance for custom datasets.
- Build and deploy AI applications on cloud platforms (any hyperscaler: Azure, GCP, or AWS).
- Integrate AI models with our company's data to enhance and augment existing applications.

Role & Responsibilities:
- Handle data preprocessing, augmentation, and generation of synthetic data.
- Design and develop backend services using Python or .NET to support OpenAI-powered solutions (or any other LLM solution).
- Develop and maintain AI pipelines.
- Work with custom datasets, utilizing techniques like chunking and embeddings, to train and fine-tune models. (See the sketch after this listing.)
- Integrate Azure Cognitive Services (or equivalent platform services) to extend functionality and improve AI solutions.
- Collaborate with cross-functional teams to ensure smooth deployment and integration of AI solutions.
- Ensure the robustness, efficiency, and scalability of AI systems.
- Stay updated with the latest advancements in AI and machine learning technologies.

Skills & Experience:
- Strong foundation in machine learning, deep learning, and computer science.
- Expertise in generative AI models and techniques (e.g., GANs, VAEs, Transformers).
- Experience with natural language processing (NLP); computer vision experience is a plus.
- Ability to work independently and as part of a team.
- Knowledge of advanced Python programming, especially AI-centric libraries like TensorFlow, PyTorch, and Keras, including the ability to implement and manipulate the complex algorithms fundamental to developing generative AI models.
- Knowledge of NLP for text generation projects, such as text parsing, sentiment analysis, and the use of transformers like GPT (generative pre-trained transformer) models.
- Experience in data management, including data preprocessing, augmentation, and generation of synthetic data: cleaning, labeling, and augmenting data to train and improve AI models.
- Experience in developing and deploying AI models in production environments.
- Knowledge of cloud services (AWS, Azure, GCP) and understanding of containerization technologies like Docker and orchestration tools like Kubernetes for deploying, managing, and scaling AI solutions.
- Ability to bring new ideas and innovative solutions to our clients.
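The responsibilities above mention chunking and embeddings for working with custom datasets. A minimal sketch of the chunking step; the sizes are illustrative, and embed() is a stub for whichever embedding model is actually used:

```python
def chunk_text(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Split a document into overlapping character chunks."""
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start : start + size])
        start += size - overlap     # overlap preserves context at borders
    return chunks

def embed(chunk: str) -> list[float]:
    """Stub: call the embedding model (OpenAI, Azure, etc.) here."""
    raise NotImplementedError

doc = "example text " * 500
print(len(chunk_text(doc)))   # number of chunks to embed and index
```

Token-based splitting (rather than characters) and sentence-boundary awareness are common refinements of this basic loop.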

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


We're Hiring: AI DevOps Engineer - ML, LLM & Cloud for Battery & Livestock Intelligence
📍 Hyderabad / Remote | 🧠 3-8 Years Experience
💼 Full-time | Deep Tech | AI-Driven IoT

At Vanix Technologies, we're solving real-world problems using AI built on top of IoT data, from predicting the health of electric vehicle batteries to monitoring livestock behavior with BLE sensors. We're looking for a hands-on AI DevOps Engineer who understands not just how to build ML/DL models, but also how to turn them into intelligent cloud-native services. If you've worked on battery analytics or sensor-driven ML, and you're excited by the potential of LLMs plus real-time IoT, this is for you.

What You'll Work On

🔋 EV Battery Intelligence
- Build models for SOH, true SOH, SOC, and RUL prediction, thermal event detection, and high-risk condition classification.
- Ingest and process time-series data from the BMS, CAN bus, GPS, and environmental sensors.
- Deliver analytics that plug into our BatteryTelematicsPro SaaS for OEMs and fleet customers.

🐄 Livestock Monitoring AI
- Analyze BLE sensor data from our cattle wearables (motion, temperature, rumination proxies).
- Develop models for health anomaly detection, estrus prediction, movement patterns, and outlier behaviors.
- Power actionable insights for farmers via mobile dashboards and alerts.

🤖 Agentic AI & LLM Integration
- Chain ML outputs with LLMs (e.g., GPT-4, Claude) using LangChain or similar frameworks.
- Build AI assistants that summarize events, auto-generate alerts, and respond to user queries using both structured and ML-derived data.
- Support AI-powered explainability and insight-generation layers on top of raw telemetry.

☁️ Cloud ML Engineering & DevOps
- Deploy models on AWS (Lambda, SageMaker, EC2, ECS, CloudWatch).
- Design and maintain CI/CD pipelines for data, models, and APIs.
- Optimize performance, cost, and scalability of cloud workloads.

✅ You Must Have
- A solid foundation in ML/DL for time-series/telemetry data.
- Hands-on experience with PyTorch, TensorFlow, Scikit-learn, or XGBoost.
- Experience with battery analytics or sensor-based animal behavior prediction.
- Understanding of LangChain / OpenAI APIs / LLM orchestration.
- AWS fluency: Lambda, EC2, S3, SageMaker, CloudWatch.
- Python APIs.

Nice to Have
- MLOps stack (MLflow, DVC, W&B).
- BLE signal processing or CAN bus protocol parsing.
- Prompt engineering or fine-tuning experience.
- Exposure to edge-to-cloud model deployment.

Why Vanix Technologies? Because we're not another AI lab; we're a deep-tech company building production-ready AI platforms that interact with real devices in the field, used by farmers, OEMs, and EV fleets. You'll work at the intersection of IoT + AI + LLMs, hardware + cloud, and mission-critical data + everyday impact.
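As a rough illustration of the SOH-prediction work described above, here is a toy regression on synthetic telemetry features using scikit-learn; the features and data are stand-ins, not real BMS/CAN inputs:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in features: e.g. cycle count, avg temp, depth of
# discharge, C-rate. Real inputs would be engineered from BMS/CAN logs.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4))
soh = 100 - 2.5 * X[:, 0] - 1.2 * X[:, 1] + rng.normal(0, 0.5, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, soh, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)
print(f"R^2 on held-out data: {model.score(X_te, y_te):.3f}")
```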

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Your Role and Impact: R Systems International is seeking a skilled Mobile Engineer with experience in native app development for both iOS (Swift) and Android (Kotlin). You should have a strong understanding of design principles, including MVVM for Android or MVC for iOS, and expertise in RESTful services, APIs, and JSON parsing. Knowledge of design best practices, Agile methodologies, Git, JIRA, and CI/CD is essential. You will be responsible for creating high-quality software, collaborating with UI/UX teams, writing unit tests, and adhering to coding standards. A proactive approach to solving technical challenges and improving code quality is key to success in this role.

Your Contribution:
- Analyze requirements and collaborate with architects and senior engineers to create software designs of moderate complexity.
- Work with UI/UX teams to ensure requirements traceability from definition to implementation.
- Participate in peer reviews and pull requests to maintain high-quality software standards.
- Adhere to coding standards and best practices for creating reusable code.
- Write unit tests to ensure software reliability.
- Interface with Product Owners and Scrum Masters for ticket/issue management.
- Develop complex views that interact with network and data layers, and contribute to improvements.
- Lead large refactors/design improvements and work with multiple scrum teams to implement new features.
- Participate in technical assessments and scoping, and manage code changes based on business requirements and product enhancements.
- Estimate work, support project planning, and report progress to management.
- Present software concepts, designs, or code in design review forums.
- Lead and contribute to technical discussions in communities of practice, design reviews, and other technical meetings.
- Maintain in-depth knowledge of platform-specific features, frameworks, and components.

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Come work at a place where innovation and teamwork come together to support the most exciting missions in the world!

Brief Description:
Principal Software Development Engineer for the Cloud Agent Endpoint Security product.

Full Job Description:
We invite you to be part of a motivated and agile Qualys engineering team responsible for developing high-end cloud-based security solutions. This opening is your chance to work in the rapidly expanding field of computer security, in a company with excellent customer ratings and outstanding growth rates. In this position you will work on network security solutions to deliver cutting-edge products, including advanced endpoint security technology. This position is for our fastest-growing R&D center in Pune, India, which is part of a multi-continent engineering team.

Responsibilities:
Develop an understanding of the product functionality, spanning the in-field appliance to cloud services: the end-to-end architecture, how customers use the product, how it fits into the overall Qualys security platform and its value-add, and various customer use-case scenarios. This perspective is required for in-depth understanding and handling of customer queries.
Lead initiatives across the engineering lifecycle, including writing, reading, and comprehending the codebase and participating in design, code, and test-case reviews.
Develop in-depth knowledge of the endpoint security and networking domains.
Contribute to appliance stack development. Understand the existing appliance architecture well enough to own new feature development: design, develop, deliver.
Study and decipher the documentation needed to accomplish the task at hand: endpoint security technologies including EDR, anti-malware (EPP), XDR, and MDR; other Cloud Agent standards, RFCs, and protocol specifications; network topologies; networking fundamentals (the TCP/IP stack, switches, routers, networking protocols, firewalls); Linux platform fundamentals; virtualization; deep packet inspection; and so on.
Debug issues in the product reported by internal QA teams or by customers in production, and suggest solutions.
Interact with QA teams to describe product features and methods to test them: functionality, performance, and negative scenarios.
Document designs and test plans as part of development activities.
Communicate with other team members, including the QA team, and collaborate as required.

Qualifications:
Must have:
Degree in Computer Science, Electronics, or Instrumentation.
10 to 15 years of software development or testing experience, with a Windows development background and knowledge of Windows internals.
Professional experience developing products in related tech domains is a strong advantage — for example, deep packet inspection, packet parsing and fast packet-processing techniques, firewalls, networking protocols, socket programming, virtualization, and hypervisors.
Ability to write and comprehend C/C++ and/or Python code.
Passion to build a career in endpoint systems and software close to the OS level.
Good academic record.
Good reading and comprehension skills, with the ability to read the technical literature of network security products and draw inferences.
Ability to operate Windows Operating System commands and related applications.
Good written and verbal communication skills.

Additional skills that are good to have:
Good debugging skills and the ability to inspect packet captures.
Understanding of Linux boot loaders, GRUB, kernel compilation, and networking stack internals; TCP/IP knowledge.
Knowledge of one or more protocols used in network security systems, e.g. LDAP, DHCP, ARP, DNS.
Knowledge of Layer 2 and Layer 3 switching, high availability, VPN, and VLAN technologies.
Working knowledge of deploying virtual machines, e.g. with VMware.
Good understanding of database concepts and good working knowledge of Oracle, PL/SQL, or Postgres.
Excellent analytical and problem-solving skills, excellent written and oral communication, self-starter, and highly motivated.
Ability to work in a dynamic environment and adapt quickly to change.
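To give a feel for the packet-parsing work the role references, here is a minimal sketch in Python (one of the two languages the posting names) that unpacks an IPv4 header with struct. The sample bytes are hand-built for illustration, not real capture data.

```python
# Hedged sketch: decoding the fixed 20-byte IPv4 header from raw bytes.
import struct

def parse_ipv4_header(raw: bytes) -> dict:
    # Fields: version/IHL, DSCP, total length, ID, flags/frag offset,
    # TTL, protocol, checksum, source address, destination address.
    ver_ihl, dscp, total_len, ident, flags_frag, ttl, proto, checksum, src, dst = \
        struct.unpack("!BBHHHBBH4s4s", raw[:20])
    return {
        "version": ver_ihl >> 4,
        "ihl": ver_ihl & 0x0F,
        "total_length": total_len,
        "ttl": ttl,
        "protocol": proto,
        "src": ".".join(map(str, src)),
        "dst": ".".join(map(str, dst)),
    }

# Hand-built header: version 4, IHL 5, TTL 64, protocol 6 (TCP), 10.0.0.1 -> 10.0.0.2
sample = bytes([0x45, 0, 0, 40, 0, 1, 0, 0, 64, 6, 0, 0, 10, 0, 0, 1, 10, 0, 0, 2])
print(parse_ipv4_header(sample))
```

Fast-path packet processing in a real appliance would do this in C/C++ over zero-copy buffers, but the header layout and bit-level decoding are the same.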

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

AI Agent Development – Python (CrewAI + LangChain)
Location: Noida / Gwalior (On-site)
Experience Required: Minimum 3+ years
Employment Type: Full-time

🚀 About the Role
We're seeking an AI Agent Developer (Python) with hands-on experience in CrewAI and LangChain to join our cutting-edge AI product engineering team. If you thrive at the intersection of LLMs, agentic workflows, and autonomous tooling — this is your opportunity to build real-world AI agents that solve complex problems at scale. You'll be responsible for designing, building, and deploying intelligent agents that leverage prompt engineering, memory systems, vector databases, and multi-step tool execution strategies.

🧠 Core Responsibilities
Design and develop modular, asynchronous Python applications using clean-code principles.
Build and orchestrate intelligent agents using CrewAI: defining agents, tasks, memory, and crew dynamics.
Develop custom chains and tools using LangChain (LLMChain, AgentExecutor, memory, structured tools).
Implement prompt engineering techniques such as ReAct, few-shot, and chain-of-thought reasoning.
Integrate with APIs from OpenAI, Anthropic, Hugging Face, or Mistral for advanced LLM capabilities.
Use semantic search and vector stores (FAISS, Chroma, Pinecone, etc.) to build RAG pipelines.
Extend tool capabilities: web scraping, PDF/document parsing, API integrations, and file handling.
Implement memory systems for persistent, contextual agent behavior.
Leverage DSA and algorithmic skills to structure efficient reasoning and execution logic.
Deploy containerized applications using Docker, Git, and modern Python packaging tools.

🛠️ Must-Have Skills
Python 3.x (async, OOP, type hinting, modular design)
CrewAI (Agent, Task, Crew, memory, orchestration) – must have
LangChain (LLMChain, tools, AgentExecutor, memory)
Prompt engineering (few-shot, ReAct, dynamic templates)
LLMs and APIs (OpenAI, Hugging Face, Anthropic)
Vector stores (FAISS, Chroma, Pinecone, Weaviate)
Retrieval-Augmented Generation (RAG) pipelines
Memory systems: BufferMemory, ConversationBuffer, VectorStoreMemory
Asynchronous programming (asyncio, LangChain hooks)
DSA / algorithms (graphs, queues, recursion, time/space optimization)

💡 Bonus Skills
Experience with machine learning libraries (Scikit-learn, XGBoost, TensorFlow basics)
Familiarity with NLP concepts (embeddings, tokenization, similarity scoring)
DevOps familiarity (Docker, GitHub Actions, Pipenv/Poetry)

🧭 Why Join Us?
Work on cutting-edge LLM agent architecture with real-world impact.
Be part of a fast-paced, experiment-driven AI team.
Collaborate with passionate developers and AI researchers.
Opportunity to build from scratch and influence core product design.
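For candidates new to the CrewAI vocabulary the posting uses (agents, tasks, crews), here is a minimal sketch assuming a recent crewai release and an OPENAI_API_KEY in the environment; the roles and task text are illustrative, not a product spec.

```python
# Hedged sketch of a two-agent CrewAI crew; API shape per recent crewai versions.
from crewai import Agent, Task, Crew

researcher = Agent(
    role="Research Analyst",
    goal="Summarize recent developments in agentic LLM frameworks",
    backstory="You condense technical material into crisp briefs.",
)

writer = Agent(
    role="Technical Writer",
    goal="Turn research notes into a short internal memo",
    backstory="You write clearly for engineering audiences.",
)

research = Task(
    description="Collect three key points about multi-agent orchestration.",
    expected_output="Three bullet points with one-line explanations.",
    agent=researcher,
)

memo = Task(
    description="Draft a five-sentence memo from the research bullets.",
    expected_output="A five-sentence memo.",
    agent=writer,
)

crew = Crew(agents=[researcher, writer], tasks=[research, memo])
print(crew.kickoff())  # runs tasks in order, passing context between agents
```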

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Lead Python Engineer – Backend & AI Integrations
Location: Gurgaon
Working Days: Monday to Friday, with 2nd and 4th Saturdays off
Working Hours: 10:30 AM – 8:00 PM
Experience: 3–8 years
Function: Backend Engineering | AI Platform Integration | Scalable Systems

About Darwix AI
Darwix AI is one of India's fastest-growing GenAI SaaS companies powering real-time decision intelligence for enterprise revenue teams. Our platform transforms frontline performance through:
Transform+: Live agent nudges & call intelligence
Sherpa.ai: GenAI-powered multilingual sales coach
Store Intel: Computer vision for in-store sales analysis
We serve market leaders across BFSI, real estate, and retail — including IndiaMart, Wakefit, GIVA, Sobha, and Bank Dofar. Our stack processes thousands of voice conversations daily, powers real-time dashboards, and delivers high-stakes nudges that directly impact multi-crore revenue pipelines. We are building at the intersection of voice AI, backend scale, and real-time analytics. You will play a key role in shaping the tech foundation that drives this mission.

Role Overview
We're looking for a Lead Python Engineer to architect, own, and scale the core backend systems that power Darwix AI's GenAI applications. You'll work at the confluence of backend engineering, data pipelines, speech processing, and AI model integrations — supporting everything from real-time call ingestion to multi-tenant analytics dashboards. You will lead a high-performing engineering pod, collaborate with product, AI, and infra teams, and mentor junior engineers. This is a high-impact, ownership-first role with direct influence over product velocity, system performance, and enterprise reliability.

Key Responsibilities
1. Backend Architecture & Infrastructure
Design and maintain scalable APIs and backend systems using Python (FastAPI)
Optimize data flow for speech-to-text transcription, diarization outputs, and call-scoring workflows
Build and maintain modular service components (STT, scoring engine, notification triggers)
Manage asynchronous job queues (Celery, Redis) for large batch processing
Ensure high availability, security, and scalability of backend systems across geographies

2. AI/ML Integration & Processing Pipelines
Integrate with LLMs (OpenAI, Cohere, Hugging Face) and inference APIs for custom use cases
Handle ingestion and parsing of STT outputs (WhisperX, Deepgram, etc.)
Work closely with the AI team to productionize model outputs into usable product layers
Manage embedding pipelines, RAG workflows, and retrieval caching across client tenants

3. Database & Data Engineering
Design and maintain schemas across PostgreSQL, MongoDB, and TimescaleDB
Optimize read/write operations for large call data, agent metrics, and dashboard queries
Collaborate on real-time analytics systems used by enterprise sales teams
Implement access controls and tenant-isolation logic for sensitive sales data

4. Platform Reliability, Monitoring & Scaling
Collaborate with the DevOps team on infrastructure orchestration (Docker, Kubernetes, GitHub Actions)
Set up alerting, logging, and auto-recovery protocols for uptime guarantees
Drive version control and CI/CD automation for releases with minimal regression
Support benchmarking, load testing, and latency-reduction initiatives

5. Technical Leadership & Team Collaboration
Mentor junior engineers, review pull requests, and enforce code quality standards
Collaborate with product managers on scoping and technical feasibility
Break down large tech initiatives into sprints and delegate effectively
Take ownership of technical decisions and present trade-offs with clarity

Required Skills & Experience
3–8 years of hands-on backend engineering experience, primarily in Python
Strong grasp of FastAPI, REST APIs, job queues (Celery), and async workflows
Solid experience with relational and NoSQL databases: PostgreSQL, MongoDB, Redis
Familiarity with production systems involving large-scale API calls or streaming data
Prior experience integrating 3rd-party APIs (e.g., OpenAI, CRM, VoIP, or transcription vendors)
Working knowledge of Docker, CI/CD pipelines (GitHub Actions preferred), and basic infra scaling
Experience working in high-growth SaaS or data-product companies

Bonus Skills (Preferred, Not Mandatory)
Experience with LLM applications, vector stores (FAISS, Pinecone), and RAG pipelines
Familiarity with speech-to-text engines (WhisperX, Deepgram) and audio processing
Prior exposure to multi-tenant SaaS systems with role-based access and usage metering
Knowledge of OAuth2, webhooks, and event-driven architectures
Experience with frontend collaboration (Angular/React) and mobile APIs
Contributions to open-source projects, technical blogs, or developer communities
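To illustrate the FastAPI-plus-Celery pattern at the heart of responsibility 1, here is a hedged sketch of an endpoint that queues a long-running call-scoring job to a Redis-backed worker. The broker URL, task name, and payload fields are assumptions for illustration, not Darwix internals.

```python
# Hedged sketch: FastAPI endpoint enqueueing work to a Celery worker over Redis.
from celery import Celery
from fastapi import FastAPI
from pydantic import BaseModel

celery_app = Celery("scoring", broker="redis://localhost:6379/0",
                    backend="redis://localhost:6379/1")

@celery_app.task
def score_call(call_id: str) -> dict:
    # Placeholder for STT parsing + scoring logic running on the worker.
    return {"call_id": call_id, "score": 0.87}

app = FastAPI()

class CallIn(BaseModel):
    call_id: str

@app.post("/calls/score")
async def enqueue_scoring(payload: CallIn):
    job = score_call.delay(payload.call_id)  # returns immediately; worker does the work
    return {"job_id": job.id, "status": "queued"}
```

The request handler stays non-blocking while heavy transcription and scoring happen asynchronously, which is what makes the queue-based design scale across batch loads.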

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Join our dynamic team as a Web Scraping Engineer and play a crucial role in driving our data-driven strategies. As a key player, you will develop and maintain innovative solutions to automate data extraction, parsing, and structuring from various online sources. Your expertise will empower our business intelligence, market research, and decision-making processes. If you are passionate about automation, dedicated to ethical practices, and have a knack for solving complex problems, we want you!

Key Responsibilities
Design, implement, and maintain web scraping solutions to collect structured data from publicly available online sources and APIs
Parse, clean, and transform extracted data to ensure accuracy and usability for business needs
Store and organize collected data in databases or spreadsheets for easy access and analysis
Monitor and optimize scraping processes for efficiency, reliability, and compliance with relevant laws and website policies
Troubleshoot issues related to dynamic content, anti-bot measures, and changes in website structure
Collaborate with data analysts, scientists, and other stakeholders to understand data requirements and deliver actionable insights
Document processes, tools, and workflows for ongoing improvements and knowledge sharing

Requirements
Proven experience in web scraping, data extraction, or web automation projects
Proficiency in Python or similar programming languages, and familiarity with libraries such as BeautifulSoup, Scrapy, or Selenium
Strong understanding of HTML, CSS, JavaScript, and web protocols
Experience with data cleaning, transformation, and storage (e.g., CSV, JSON, SQL/NoSQL databases)
Knowledge of legal and ethical considerations in web scraping, with a commitment to compliance with website terms of service and data privacy regulations
Excellent problem-solving and troubleshooting skills
Ability to work independently and manage multiple projects simultaneously

Preferred Qualifications
Experience with cloud platforms (AWS, GCP, Azure) for scalable data solutions
Familiarity with workflow automation and integration with communication tools (e.g., email, Slack, APIs)
Background in market research, business intelligence, or related fields

Skills: data extraction, data cleaning, BeautifulSoup, business intelligence, web automation, JavaScript, web scraping, data privacy regulations, web protocols, Selenium, Scrapy, SQL, data transformation, NoSQL, CSS, market research, automation, Python, HTML
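As a concrete illustration of the extract-parse-store loop described above, here is a minimal sketch with requests and BeautifulSoup. The URL and CSS selector are placeholders; a production scraper would also honor robots.txt, rate limits, and the target site's terms of service.

```python
# Illustrative scraper: fetch a page, parse listing titles, save to CSV.
import csv
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/listings", timeout=10)  # placeholder URL
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
rows = [{"title": el.get_text(strip=True), "href": el.get("href", "")}
        for el in soup.select("a.listing-title")]  # hypothetical selector

with open("listings.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["title", "href"])
    writer.writeheader()
    writer.writerows(rows)
```

Dynamic, JavaScript-rendered pages — the "dynamic content" the responsibilities mention — would swap requests for a browser driver such as Selenium while keeping the same parse-and-store structure.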

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Position Title: R&D Data Engineer

About The Job
At Sanofi, we're committed to providing the next-gen healthcare that patients and customers need. It's about harnessing data insights and leveraging AI responsibly to search deeper and solve sooner than ever before. Join our R&D Data & AI Products and Platforms Team as an R&D Data Engineer and you can help make it happen.

What You Will Be Doing
Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions, to accelerate R&D, manufacturing, and commercial performance and bring better drugs and vaccines to patients faster, to improve health and save lives.
The R&D Data & AI Products and Platforms Team is a key team within R&D Digital, focused on developing and delivering Data and AI products for R&D use cases. This team plays a critical role in pursuing broader democratization of data across R&D and providing the foundation to scale AI/ML, advanced analytics, and operational analytics capabilities.
As an R&D Data Engineer, you will join this dynamic team committed to driving strategic and operational digital priorities and initiatives in R&D. You will work as part of a Data & AI Product Delivery Pod, led by a Product Owner, in an agile environment to deliver Data & AI products. As part of this team, you will be responsible for the design and development of data pipelines and workflows to ingest, curate, process, and store large volumes of complex structured and unstructured data, with the opportunity to work on multiple data products serving multiple areas of the business.

Our vision for digital, data analytics and AI
Join us on our journey in enabling Sanofi's digital transformation through becoming an AI-first organization. This means:
AI Factory – Versatile Teams Operating in Cross-Functional Pods: Utilizing digital and data resources to develop AI products, bringing data management, AI, and product development skills to products, programs, and projects to create an agile, fulfilling, and meaningful work environment.
Leading-Edge Tech Stack: Experience building products that will be deployed globally on a leading-edge tech stack.
World-Class Mentorship and Training: Working with renowned leaders and academics in machine learning to further develop your skill set.
We are an innovative global healthcare company with one purpose: to chase the miracles of science to improve people's lives. We're also a company where you can flourish and grow your career, with countless opportunities to explore, make connections with people, and stretch the limits of what you thought was possible. Ready to get started?

Main Responsibilities
Data Product Engineering:
Provide input into the engineering feasibility of developing specific R&D Data/AI products
Provide input to the Data/AI Product Owner and Scrum Master to support planning, capacity, and resource estimates
Design, build, and maintain scalable and reusable ETL/ELT pipelines to ingest, transform, clean, and load data from sources into central platforms/repositories
Structure and provision data to support modeling and data discovery, including filtering, tagging, joining, parsing, and normalizing data
Collaborate with the Data/AI Product Owner and Scrum Master to share progress on engineering activities and flag any delays, issues, bugs, or risks with proposed remediation plans
Design, develop, and deploy APIs, data feeds, or specific features required by product design and user stories
Optimize data workflows to drive high performance and reliability of implemented data products
Oversee and support junior engineers with Data/AI product testing requirements and execution

Innovation & Team Collaboration
Stay current on industry trends, emerging technologies, and best practices in data product engineering
Contribute to a team culture of innovation, collaboration, and continuous learning within the product team

About You
Key Functional Requirements & Qualifications:
Bachelor's degree in software engineering or a related field, or equivalent work experience
3-5 years of experience in data product engineering, software engineering, or another related field
Understanding of the R&D business and data environment preferred
Excellent communication and collaboration skills
Working knowledge of and comfort with Agile methodologies

Key Technical Requirements & Qualifications
Proficiency with data analytics and statistical software (incl. SQL, Python, Java, Excel, AWS, Snowflake, Informatica)
Deep understanding and proven track record of developing data pipelines and workflows

Why Choose Us?
Bring the miracles of science to life alongside a supportive, future-focused team
Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally
Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact
Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs

Pursue Progress. Discover Extraordinary.
Progress doesn't happen without people — people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. You can be one of those people. Chasing change, embracing new ideas, and exploring all the opportunities we have to offer. Let's pursue progress. And let's discover extraordinary together.
At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!
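To make the "ingest, transform, clean, and load" responsibility concrete, here is a minimal ETL sketch in pandas. The column names, values, and output path are illustrative assumptions, not Sanofi schemas; a production pipeline would target platforms like Snowflake or AWS as the posting notes.

```python
# Minimal ETL sketch: extract, parse/normalize, and load curated data.
import pandas as pd

# Extract: inline frame stands in for a raw source extract.
raw = pd.DataFrame({
    "assay_id": ["A1", "A2", "A2", None],          # illustrative column names
    "measured_at": ["2024-01-05", "2024-01-06", "2024-01-06", "2024-01-07"],
    "value": ["1.2", "3.4", "3.4", "bad"],
})

# Transform: drop unusable rows and duplicates, parse types, normalize values.
df = raw.dropna(subset=["assay_id"]).drop_duplicates().copy()
df["measured_at"] = pd.to_datetime(df["measured_at"])
df["value"] = pd.to_numeric(df["value"], errors="coerce")
df = df.dropna(subset=["value"])
df["value_z"] = (df["value"] - df["value"].mean()) / df["value"].std()

# Load: parquet stands in for the central platform/repository target.
df.to_parquet("curated_assays.parquet", index=False)
```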

Posted 2 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems; ensuring secure, efficient data storage and retrieval; enabling self-service data exploration; and supporting stakeholders with insightful reporting and analysis.

Grade: T5
Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.

Accountabilities
What your main responsibilities are:
Data Pipeline – Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
Data Integration – Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization. Data pre-processing includes collecting, parsing, managing, analyzing, and visualizing large sets of data.
Data Quality Management – Cleanse the data and improve data quality and readiness for analysis. Drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms.
Data Transformation – Process data by cleansing it and transforming it into proper storage structures for querying and analysis using ETL and ELT processes.
Data Enablement – Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations.

Qualifications & Specifications
Master's/Bachelor's degree in Engineering, Computer Science, Math, Statistics, or equivalent.
Strong programming skills in Python/PySpark/SAS.
Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization.
Experience on cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps.
Hands-on experience with Databricks, Delta Lake, and Workflows.
Knowledge of DevOps processes and tools such as Docker, CI/CD, Kubernetes, Terraform, and Octopus.
Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs.
Experience with a BI tool like Power BI (good to have).
Cloud migration experience (good to have).
Cloud and data engineering certification (good to have).
Experience working in an Agile environment.
4-6 years of relevant work experience is required.
Experience with stakeholder management is an added advantage.

What We Are Looking For
Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred.

Knowledge, Skills And Abilities
Fluency in English
Analytical Skills
Accuracy & Attention to Detail
Numerical Skills
Planning & Organizing Skills
Presentation Skills
Data Modeling and Database Design
ETL (Extract, Transform, Load) Skills
Programming Skills

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.
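To ground the cleanse-and-transform accountability in code, here is a hedged PySpark sketch of a typical pipeline step; the paths, table, and column names are illustrative, not FedEx schemas.

```python
# Hedged sketch: deduplicate, filter, normalize, and write partitioned curated data.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

events = spark.read.json("s3://example-bucket/raw/events/")  # placeholder path
clean = (
    events
    .dropDuplicates(["event_id"])                      # illustrative key column
    .filter(F.col("event_ts").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .withColumn("channel", F.lower(F.trim(F.col("channel"))))
)
clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)
```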

Posted 2 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems; ensuring secure, efficient data storage and retrieval; enabling self-service data exploration; and supporting stakeholders with insightful reporting and analysis.

Grade: T5
Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.

Accountabilities
What your main responsibilities are:
Data Pipeline – Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
Data Integration – Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization. Data pre-processing includes collecting, parsing, managing, analyzing, and visualizing large sets of data.
Data Quality Management – Cleanse the data and improve data quality and readiness for analysis. Drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms.
Data Transformation – Process data by cleansing it and transforming it into proper storage structures for querying and analysis using ETL and ELT processes.
Data Enablement – Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations.

Qualifications & Specifications
Master's/Bachelor's degree in Engineering, Computer Science, Math, Statistics, or equivalent.
Strong programming skills in Python/PySpark/SAS.
Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization.
Experience on cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps.
Hands-on experience with Databricks, Delta Lake, and Workflows.
Knowledge of DevOps processes and tools such as Docker, CI/CD, Kubernetes, Terraform, and Octopus.
Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs.
Experience with a BI tool like Power BI (good to have).
Cloud migration experience (good to have).
Cloud and data engineering certification (good to have).
Experience working in an Agile environment.
4-6 years of relevant work experience is required.
Experience with stakeholder management is an added advantage.

What We Are Looking For
Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred.

Knowledge, Skills And Abilities
Fluency in English
Analytical Skills
Accuracy & Attention to Detail
Numerical Skills
Planning & Organizing Skills
Presentation Skills
Data Modeling and Database Design
ETL (Extract, Transform, Load) Skills
Programming Skills

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.
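Since this posting also calls out Databricks and Delta Lake, here is a hedged sketch of a different facet of the same stack: an idempotent Delta MERGE (upsert). It assumes a Databricks runtime or a delta-spark-configured session; paths and join keys are illustrative.

```python
# Hedged sketch: idempotent upsert of staged updates into a curated Delta table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-upsert-sketch").getOrCreate()

updates = spark.read.parquet("/mnt/staging/customer_updates")  # placeholder path
target = DeltaTable.forPath(spark, "/mnt/curated/customers")   # placeholder path

(target.alias("t")
 .merge(updates.alias("u"), "t.customer_id = u.customer_id")   # illustrative key
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```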

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies