
1124 OCR Jobs - Page 4

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 15.0 years

0 Lacs

noida, uttar pradesh

On-site

As the leader of the AI Center of Excellence (CoE) at RMSI, you will play a crucial role in fostering innovation and excellence in AI-based solutions. Your responsibilities will include defining and executing the AI strategy in alignment with RMSI's business goals, leading the development and deployment of AI/ML models and solutions, collaborating with cross-functional teams to integrate AI into existing solutions, and ensuring the scalability, performance, and business impact of AI solutions.

You will be expected to drive research and experimentation in machine learning, deep learning, computer vision, and predictive analytics; establish best practices and governance frameworks for AI development and deployment; collaborate with business leaders to understand their needs and translate them into AI solutions; and present progress, insights, and recommendations to senior leadership and external stakeholders. Furthermore, you will be responsible for building and mentoring a high-performing team of data scientists, ML engineers, and domain experts, fostering a culture of innovation, collaboration, and continuous learning within the organization.

To be successful in this role, you should hold a B.Tech/M.Tech degree in Computer Science, Engineering, or a related field, with at least 15 years of overall experience, including a minimum of 6 years in AI/ML or data science roles. You should have proven experience in leading AI initiatives or Centers of Excellence in a corporate environment, strong programming and ML toolkit knowledge (such as Python and TensorFlow), and expertise in ML algorithms and OCR/NLP tools. Experience with cloud AI platforms (such as AWS, Azure, GCP) and modern ML frameworks (like TensorFlow and PyTorch) will be beneficial. Additionally, excellent leadership, communication, and stakeholder management skills, as well as experience in client engagement and cross-functional collaboration, are desired qualities for this role at RMSI.

Posted 5 days ago

Apply

0.0 - 3.0 years

0 Lacs

kochi, kerala

On-site

You will be responsible for the functions, activities, and skills required for the analysis, design, coding, integration, testing, and maintenance of Intelligent Document Processing modules and systems. Your role will involve building NLP-based solutions for query and document analysis, processing, information extraction, and document classification, as well as context-based information retrieval. Additionally, you will conduct research to advance state-of-the-art deep learning and NLP technologies.

As an AI Intern at Xerox Holdings Corporation, you will work in Kochi, India, in a hybrid work mode with timings from 10 AM to 7 PM (IST). The ideal candidate for this role should have 6 months to 1 year of experience and hold a B.Tech/MCA/BCA qualification.

In this role, you will need proficiency and experience in the technical area of Intelligent Document Processing, including digitization, OCR/ICR/OMR, LLMs, classification methodologies, data extraction methodologies, ML, AI, NLP, and more. Experience in designing and developing highly scalable templates and training documents on IDP for efficient data extraction from semi-/un-structured PDFs or images is essential. Additionally, experience with Docker, Flask APIs, Redis, and Celery is required. You will work closely with Solution Architects/Team Leads, prepare technical design documents, and implement automated deployment. Understanding and practicing Agile methodologies, working within the Software Development Lifecycle (SDLC) using code management and release tools, and proficiency in working with relational databases and SQL scripting are key aspects of this role. A clear understanding of architecture and infrastructure requirements and setup is necessary. Experience or proficiency in .NET (C#, VB, C++, Java) and Python development languages would be useful.

This position offers an opportunity to contribute to the advancement of digital transformation, augmented reality, robotic process automation, additive manufacturing, Industrial Internet of Things, and cleantech at Xerox Holdings Corporation. If you are passionate about innovation and have the required technical skills and experience, we encourage you to apply for this exciting role. Learn more about Xerox at www.xerox.com and explore our commitment to diversity and inclusion.
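The stack named in this posting (Flask APIs, Redis, Celery) is a common pattern for running OCR asynchronously in an IDP pipeline: the API accepts a document, hands the heavy OCR work to a background worker, and returns a job ID. The following is a minimal, hypothetical sketch of that wiring; the queue URLs, endpoint names, and use of pytesseract are illustrative assumptions, not details from the posting.

```python
# Hypothetical sketch: Flask enqueues OCR work, a Celery worker backed by Redis runs it.
import pytesseract                      # assumed OCR engine; the posting does not name one
from PIL import Image
from celery import Celery
from flask import Flask, jsonify, request

# Redis acts as both broker and result backend (URLs are placeholders).
celery_app = Celery(
    "idp_ocr",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)

@celery_app.task
def extract_text(image_path: str) -> str:
    """Run OCR on a stored page image and return the raw text."""
    return pytesseract.image_to_string(Image.open(image_path))

flask_app = Flask(__name__)

@flask_app.route("/ocr", methods=["POST"])
def submit_ocr_job():
    # The caller is assumed to have already saved the page image to shared storage.
    image_path = request.json["image_path"]
    job = extract_text.delay(image_path)          # enqueue; the API does not block on OCR
    return jsonify({"job_id": job.id}), 202

@flask_app.route("/ocr/<job_id>", methods=["GET"])
def get_ocr_result(job_id: str):
    result = celery_app.AsyncResult(job_id)
    if not result.ready():
        return jsonify({"status": "pending"}), 202
    return jsonify({"status": "done", "text": result.get()})
```

A worker would be started separately (for example with `celery -A ocr_tasks.celery_app worker`), keeping the API responsive while documents are digitised in the background.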

Posted 5 days ago

Apply

10.0 - 20.0 years

35 - 40 Lacs

Mumbai

Work from Office

Job Title: Data Science Expert (Mentor & Trainer)
Location: Onsite, Mumbai, India
Employment Type: Full-Time

About the Role: We are seeking an experienced and highly skilled Data Science Expert to join our growing team at our Mumbai office. This is a full-time, onsite role focused not only on solving complex data problems but also on mentoring and training Junior Data Science Engineers. The ideal candidate will bring deep technical expertise in data science and machine learning, along with a passion for teaching and developing talent.

Key Responsibilities: Lead the development of end-to-end data science solutions using advanced ML, NLP, and Computer Vision techniques. Train, mentor, and support junior data science engineers in coding, model development, and best practices. Architect and implement AI-driven solutions such as chatbots, OCR systems, and facial recognition applications. Translate complex business problems into actionable data science projects and deliver measurable results. Design and lead internal workshops, code reviews, and learning sessions to upskill the team. Collaborate with engineering and product teams to deploy models and insights into production environments. Stay abreast of the latest AI/ML trends and integrate cutting-edge techniques into projects where applicable.

Desired Skills & Qualifications: Experience: 6+ years in Data Science/Machine Learning with at least 1–2 years of team mentoring or leadership experience. Education: Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.

Technical Expertise Required: Strong proficiency in Python and SQL; R is a plus. Solid hands-on experience with Deep Learning and Neural Networks, particularly in Natural Language Processing (NLP), Generative AI, and Computer Vision. Familiarity with frameworks and libraries such as TensorFlow, Keras, PyTorch, OpenCV, SpaCy, NLTK, BERT, ELMo, etc. Experience developing chatbots, OCR, and face recognition systems is preferred. Hands-on knowledge of cloud platforms (AWS, Azure, or Google Cloud Platform). Experience applying statistical and data mining techniques such as GLM, regression, clustering, random forests, boosting, decision trees, etc. Strong understanding of model validation, performance tuning, and deployment strategies.

Soft Skills: Excellent communication and presentation skills, especially in explaining complex models to non-technical audiences. Demonstrated ability to mentor, train, and lead junior team members effectively. Strong analytical and problem-solving mindset with a detail-oriented approach.

What We Offer: Competitive salary and benefits. A collaborative and intellectually stimulating environment. Career growth and leadership development opportunities within a fast-paced team.

Posted 5 days ago

Apply

1.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site

Requisition ID: 34097 At Terumo Blood and Cell Technologies, our 7,000+ global associates are proud to come to work each day, knowing that what we do impacts the lives of patients around the world. We make medical devices and related products that are used to collect, separate, manufacture and process various components of blood and cells. With our innovative technologies and service offerings, we touch a patient’s life every second of every day and are committed to continuing to increase the number of patients we serve. With some of the best and brightest minds in the industry, an unmatched global footprint, comprehensive benefits and a distinct culture, Terumo Blood and Cell Technologies is a great place to work, grow and be part of a team that is focused on making a difference. Consider joining our team and unlock your potential. JOB TITLE: Finance Staff Accountant Working Title: Executive – Accounts Payable Job Summary The ideal candidate will process invoices, reconcile vendor statements, resolve discrepancies, and ensure timely payments while maintaining strong internal controls and compliance with company policies. Essential Duties Receive, review, and process vendor invoices and employee reimbursements. Match invoices with purchase orders and delivery receipts (3-way match). Ensure accuracy of billing, proper approvals, and timely posting in the ERP system within 3 WD’s of receipt. Reconcile vendor accounts monthly/quarterly and address discrepancies. Coordinate with procurement, operations, and vendors to resolve invoice/payment issues. Handle monthly closing activities for AP including accruals and reports. Maintain documentation in accordance with audit and compliance standards. Support internal and external audits related to accounts payable. Minimum Qualification Requirements Education Bachelor’s degree in Accounting, Commerce, or related field (B.Com, M.Com). Experience 1-2 years of relevant experience in Accounts Payable function. Skills Familiarity with ERP systems (MS-Navision, SAP, Oracle, etc.). Working knowledge of GST, TDS, and other statutory requirements. Good Excel skills (vlookups, pivots, basic formulas). Strong attention to detail and ability to manage high transaction volumes. Good communication and coordination skills. Experience working in a shared services or centralized AP team. Exposure to invoice automation tools or OCR-based systems. Terumo Penpol is part of Terumo Group and is headquartered in Thiruvananthapuram, Kerala. As India’s largest blood bag manufacturer, we provide a comprehensive range of blood collection, storage and processing solutions for blood centers, hospitals and therapeutic apheresis centers in India and abroad. Our high-quality products touch the lives of patients in over 80 countries across the world. True to our mission of contributing to society through healthcare, our award-winning Corporate Social Responsibility initiatives reinforce our organizational values and culture. Firmly rooted in core values collectively termed as the ‘Associate Spirit’, our work culture fosters an environment conducive to growth and continuous learning. People are our greatest asset, and we place as much importance in their professional development as in the research and development of our products.
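For readers unfamiliar with the 3-way match mentioned in the duties above, the idea is simply that an invoice is only cleared for posting when it agrees with the purchase order and the goods receipt. The toy illustration below makes that check concrete; all field names and the tolerance are invented for the example and are not part of Terumo's process.

```python
# Toy illustration of a 3-way match: invoice vs. purchase order vs. goods receipt.
# All field names and the tolerance are invented for the example.
def three_way_match(invoice, purchase_order, receipt, amount_tolerance=0.01):
    issues = []
    if invoice["po_number"] != purchase_order["po_number"]:
        issues.append("invoice references a different PO")
    if invoice["quantity"] > receipt["quantity_received"]:
        issues.append("billed quantity exceeds quantity received")
    expected = purchase_order["unit_price"] * invoice["quantity"]
    if abs(invoice["amount"] - expected) > amount_tolerance * expected:
        issues.append("invoice amount deviates from PO price")
    return (len(issues) == 0, issues)

ok, issues = three_way_match(
    {"po_number": "PO-1001", "quantity": 10, "amount": 5000.0},
    {"po_number": "PO-1001", "unit_price": 500.0},
    {"quantity_received": 10},
)
print(ok, issues)   # True, [] -> cleared for posting; otherwise raise a discrepancy
```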

Posted 5 days ago

Apply

2.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Company Description Skillturk is an insure-tech company that uses artificial intelligence to automate insurance review processes, directly affecting the premiums charged by insurance companies. We leverage OCR, automation, and human-in-the-middle to transform various raw documents into highly accurate digital data. Our AI-powered automation workflow validates property attributes in real time with 99% accuracy, helping insurance companies increase their sales significantly. Role Description Job Title: Social Media Executive Location: Indore Job Type: Full-time (Work from Office) Working Days: Monday to Saturday Shift Timings: 10:00 AM – 7:00 PM Job Description: We are looking for a creative, results-driven Social Media & Digital Marketing Executive to join our team. The ideal candidate should have hands-on experience in managing social media platforms, executing digital campaigns, and optimizing online visibility through SEO and SEM. Prior experience in the e-commerce industry is highly desirable. Key Responsibilities: Develop and manage social media content calendars for Instagram, Facebook, LinkedIn, etc. Create, schedule, and publish engaging content (posts, reels, videos, stories) aligned with the brand voice Monitor platform trends, engagement, and audience behavior Respond to comments, DMs, and community interactions professionally and promptly Collaborate with the design and content team to execute campaigns Analyze social media performance metrics and provide regular reports Stay updated with platform algorithm changes and new digital tools Support in managing SEO (on-page/off-page), SEM (Google Ads, Facebook Ads), and keyword strategy Assist in executing paid ad campaigns and influencer collaborations Ensure brand consistency and optimization across all digital platforms Managing and updating website/blog content via WordPress (basic knowledge preferred) Requirements: Bachelor’s degree in Marketing, Communications, or a related field 1–2 years of experience in social media management and digital marketing Solid understanding of SEO , SEM , and digital growth strategies Experience working in or with e-commerce brands Proficiency in social media tools and analytics platforms Excellent written and verbal communication skills Creative mindset with strong attention to detail Basic design or video editing skills using Canva , Adobe Suite , or similar tools Experience with WordPress is a plus

Posted 5 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Area(s) of responsibility

About Birlasoft: Birlasoft is a powerhouse where domain expertise, enterprise solutions, and digital technologies converge to redefine business processes. We take pride in our consultative and design thinking approach, driving societal progress by enabling our customers to run businesses with unmatched efficiency and innovation. As part of the CKA Birla Group, a multibillion-dollar enterprise, we boast a 12,500+ professional team committed to upholding the Group's 162-year legacy. Our core values prioritize Diversity, Equity, and Inclusion (DEI) initiatives, along with Corporate Social Responsibility (CSR) activities, demonstrating our dedication to building inclusive and sustainable communities. Join us in shaping a future where technology seamlessly aligns with purpose.

About the Job: The Technical Lead will focus on the development, implementation, and engineering of GenAI applications using the latest LLMs and frameworks. This role requires hands-on expertise in Python programming, cloud platforms, and advanced AI techniques, along with additional skills in front-end technologies, data modernization, and API integration. The Technical Lead will be responsible for building applications from the ground up, ensuring robust, scalable, and efficient solutions.

Job Title: Developer – GenAI Application Development
Location: Pune/Hyderabad
Educational Background: Bachelor's degree in Computer Science, Information Technology, or a related field.
Mode of Work: Hybrid
Experience Required: 6+ years relevant

Job Description

Key Responsibilities:
Coding and Development: Write clean, efficient, and maintainable code for GenAI applications using Python, open-source frameworks, and the LangChain/LangGraph and AutoGen frameworks.
Fine-Tuning Models: Fine-tune LLMs and SLMs using techniques like PEFT, LoRA, and QLoRA for specific use cases.
Open-Source Frameworks: Work with frameworks like Hugging Face, LangChain, and others to build GenAI solutions.
Azure AI Expertise: Design and deploy scalable AI solutions leveraging a comprehensive suite of Azure AI services.
Integration and Deployment: Integrate generative AI models into existing enterprise systems and applications. Implement robust MLOps practices, CI/CD pipelines (e.g., Azure DevOps, GitHub, Jenkins), and containerization (Docker, Kubernetes) for seamless deployment. Knowledge of AWS would be an added advantage.
Data Preprocessing: Build and maintain data preprocessing pipelines for training and fine-tuning models.
API Integration: Integrate REST, SOAP, and other APIs for data ingestion, processing, and output delivery.
Model Evaluation: Evaluate model performance using metrics and benchmarks and iterate to improve results.
Prototyping: Quickly prototype and demonstrate GenAI applications to showcase capabilities and gather feedback.
Front-End Development: Collaborate with front-end developers to integrate GenAI capabilities into user-friendly interfaces using tools like Streamlit or React.
Version Control: Use Git and other version control systems to manage code and collaborate with team members.
Technical Documentation: Create clear and concise documentation for code, models, and processes.
Collaboration: Work closely with data scientists, engineers, and product managers to deliver high-impact solutions.
Learning and Growth: Continuously learn and stay updated with the latest advancements in GenAI, open-source tools, and cloud technologies.
Debugging and Optimization: Identify and fix bugs, optimize code, and improve application performance.

Required Skills:
Python Programming: Strong proficiency in core Python and OOP for developing GenAI applications.
Fine-Tuning Techniques: Hands-on experience with fine-tuning methods like PEFT, LoRA, and QLoRA.
Open-Source Frameworks: Expertise in Hugging Face, LangChain, LlamaIndex, and other open-source libraries.
Cloud Platforms: Familiarity with Azure, GCP, and AWS for deploying and managing GenAI models.
Data Preprocessing: Skills in building and maintaining data preprocessing pipelines.
API Integration: Experience with REST, SOAP, and other protocols for API integration.
Model Evaluation: Knowledge of metrics and benchmarks for evaluating model performance.
Front-End Tools: Basic knowledge of front-end tools like Streamlit, React, or JavaScript for UI integration.
Version Control: Proficiency in Git and version control best practices.
Technical Documentation: Ability to create clear and concise technical documentation.
Problem-Solving: Strong analytical and problem-solving skills to debug and optimize code.
Collaboration: Excellent teamwork and communication skills to work effectively in cross-functional teams.
Rapid Prototyping: Ability to quickly prototype and demonstrate GenAI applications.
Continuous Learning: A growth mindset with a passion for learning and staying updated with the latest GenAI trends and technologies.

Good to Have:
Document Intelligence: Proficiency in OCR and document intelligence using open-source and cloud tools.
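The fine-tuning techniques this posting lists (PEFT, LoRA, QLoRA) boil down to attaching small trainable adapter matrices to a frozen base model. As a rough sketch of what that setup looks like with Hugging Face's peft library, with the base model name and hyperparameters as illustrative assumptions rather than values from the posting:

```python
# Minimal LoRA setup sketch with Hugging Face transformers + peft.
# The model name and hyperparameters are placeholders, not values from the posting.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_model_name = "meta-llama/Llama-2-7b-hf"   # assumed open-weight LLM
tokenizer = AutoTokenizer.from_pretrained(base_model_name)
model = AutoModelForCausalLM.from_pretrained(base_model_name)

lora_config = LoraConfig(
    r=8,                                   # rank of the adapter matrices
    lora_alpha=16,                         # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()   # typically well under 1% of the base model's weights

# From here, training proceeds with the usual transformers Trainer or a custom loop,
# updating only the adapter weights while the base model stays frozen.
```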

Posted 5 days ago

Apply

0 years

0 Lacs

India

Remote

Job Title: AI/ML Engineer Job Type: Freelance Job Role: Remote About the Project: We are hiring on behalf of one of our clients who is building an AI-powered estimating tool for the construction industry. This tool aims to streamline project estimation using OCR, AI models, and intuitive interfaces. Key Responsibilities: Integrate ChatGPT (OpenAI) API for contextual estimation logic Use Tesseract OCR and OpenCV for document/image processing Develop and manage APIs with FastAPI Manage data storage and queries with PostgreSQL Handle document uploads and storage using AWS S3 Collaborate with frontend/mobile developers for full-stack integration Must-Have Skills: Experience with ChatGPT / OpenAI API Strong knowledge of Tesseract OCR and OpenCV Proficiency in FastAPI , PostgreSQL , and AWS S3 Nice-to-Have Skills: Knowledge of Scikit-learn or LightGBM Familiarity with deployment tools like Vercel Exposure to React Native and Next.js
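As a rough sketch of how the pieces above (FastAPI, OpenCV, Tesseract OCR) typically fit together for document uploads, assuming pytesseract and opencv-python are installed; the endpoint name and preprocessing choices are illustrative, not taken from the client's specification:

```python
# Hypothetical sketch: a FastAPI endpoint that OCRs an uploaded document image.
import cv2
import numpy as np
import pytesseract
from fastapi import FastAPI, File, UploadFile

app = FastAPI()

def preprocess(image_bytes: bytes) -> np.ndarray:
    """Basic cleanup before OCR: grayscale + Otsu binarisation."""
    image = cv2.imdecode(np.frombuffer(image_bytes, np.uint8), cv2.IMREAD_GRAYSCALE)
    _, binary = cv2.threshold(image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return binary

@app.post("/extract-text")
async def extract_text(file: UploadFile = File(...)):
    contents = await file.read()
    text = pytesseract.image_to_string(preprocess(contents))
    # In the real tool, this text would feed the estimation logic (e.g. an OpenAI call),
    # with documents persisted to S3 and structured results stored in PostgreSQL.
    return {"filename": file.filename, "text": text}
```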

Posted 5 days ago

Apply

4.0 - 7.0 years

10 - 20 Lacs

Coimbatore

Remote

Required Skills:
Proficiency in Python, especially for data processing, OCR, and ML
Experience using AWS services: S3, Lambda, SageMaker, Textract, IAM, KMS
Knowledge of OCR tools (e.g., Tesseract, AWS Textract, OpenCV for image cleaning)
Familiarity with ML libraries: Scikit-learn, TensorFlow/PyTorch, HuggingFace Transformers
Basic API development using Flask or FastAPI
Understanding of containerization with Docker
Exposure to Git and CI/CD pipelines
Strong debugging, documentation, and code modularization skills

Interested candidates can share their resume with priyadharshini.r@g2tsolutions.com
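For context on the Textract item in the skills list above, a minimal synchronous call via boto3 looks roughly like the sketch below; the region, bucket, and key are placeholders, and real pipelines would typically use the asynchronous Textract APIs for multi-page PDFs.

```python
# Minimal AWS Textract sketch with boto3; region, bucket, and key are placeholders.
import boto3

textract = boto3.client("textract", region_name="us-east-1")

response = textract.detect_document_text(
    Document={"S3Object": {"Bucket": "my-invoice-bucket", "Name": "scans/sample-page.png"}}
)

# Collect the detected lines of text in reading order.
lines = [block["Text"] for block in response["Blocks"] if block["BlockType"] == "LINE"]
print("\n".join(lines))
```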

Posted 5 days ago

Apply

2.0 - 3.0 years

2 Lacs

Cochin

On-site

We are a fast-growing technology startup based in Cochin, Kerala, focused on building innovative AI-powered software solutions for the healthcare, retail, and hospitality industries. We’re looking for a passionate AI Developer / Engineer to join our team and help us take our products to the next level. Key Responsibilities: Design, develop, and deploy AI/ML models for real-world applications. Build and optimize NLP, Computer Vision, or Predictive Analytics modules. Preprocess data and build datasets for training and inference. Integrate AI models into production-ready software using Python and REST APIs. Collaborate with software developers, product managers, and domain experts. Required Skills: 2–3 years of experience in AI/ML development. Proficient in Python and frameworks like TensorFlow, PyTorch, Scikit-learn . Experience with NLP, OCR, or Computer Vision projects. Solid understanding of data preprocessing , model training , and evaluation metrics . Ability to work with APIs , Databases (SQL/NoSQL) , and cloud tools. Experience with version control systems (e.g., Git). Preferred Skills (Bonus): Experience with AI in healthcare or OCR for documents/prescriptions . Knowledge of LLMs (e.g., GPT, LLaMA) or Generative AI . Deployment experience using Docker , Kubernetes , or AWS/GCP/Azure . Familiarity with Flutter , Node.js , or full-stack environments. Job Type: Full-time Pay: From ₹24,000.00 per month Work Location: In person Expected Start Date: 19/08/2025

Posted 5 days ago

Apply

8.0 years

4 - 8 Lacs

Hyderābād

On-site

Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Specialist Automation Engineer – Digital Experience & Automation

What you will do: Let’s do this. Let’s change the world. In this vital role you will design, build, and scale intelligent automations—leveraging RPA, machine learning, AI services, and cloud-native development—to streamline Digital Technology & Innovation (DTI) operations and elevate workforce productivity across the enterprise. You will lead rapid proofs-of-concept, own complex automation projects from ideation through production, and collaborate with process owners, architects, and engineers to deliver measurable business outcomes.

Roles & Responsibilities:
Engineer end-to-end automations: design, code, test, deploy, and maintain robust solutions that reduce manual effort and cycle time.
Rapid prototyping: deliver proof-of-concept automations to validate feasibility and value, iterating quickly with stakeholders.
Process re-engineering: work with SMEs to map, optimize, and automate workflows using RPA, AI/ML, and cloud services.
Project ownership: drive high-visibility automation initiatives, ensuring scope, quality, schedule, and cost targets are met.
Hands-on development: build reusable components, APIs, and orchestration logic in Python, JavaScript/TypeScript, or similar languages.
Intelligent automation: integrate cognitive services (NLP, OCR, predictive models) to create self-learning, adaptive solutions.
Collaboration & guidance: partner with ML engineers, cloud architects, and DevOps teams to operationalize and scale automations.
Standards & best practices: contribute to coding guidelines, CI/CD pipelines, and governance frameworks that enable sustainable growth.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications: Master's degree / Bachelor's degree and 8 to 13 years of Information Systems experience.

Preferred Qualifications:

Must-Have Skills: 2+ years building automations with one or more leading platforms (UiPath, Automation Anywhere, Blue Prism, Power Automate, etc.). Proven full-stack development skills in at least one modern language (Python, JavaScript/TypeScript, C#, Java, Go, etc.). Experience delivering cloud-native solutions (AWS, Azure, or GCP) using services such as Lambda/Functions, API Gateway, S3/Blob, and serverless data stores. Familiarity with ML/AI concepts and practical integration of models or cognitive services into automation workflows.
Demonstrated success leading or contributing to Agile/Scrum or DevOps delivery teams. Strong analytical and problem-solving capabilities; ability to translate operational pain points into technical designs. Nice to Have Skills 3–5 years designing, deploying, and operating automations that span SaaS, cloud, and on-prem environments. Deep proficiency in Python and experience developing, training, or tuning machine-learning models. Hands-on experience with process-mining / intelligence tools (Celonis, UiPath Process Mining, etc.). Solid understanding of CI/CD pipelines, infrastructure-as-code, and containerization (Docker, Kubernetes). Familiarity with AWS services such as EC2, S3, Lambda, Glue, Athena, and Redshift. Exposure to citizen-development programs and governance of low-code/no-code solutions. Soft Skills Strong collaboration and influencing skills across technical and non-technical teams. Proven ability to prioritize, manage multiple initiatives, and deliver under tight deadlines. Clear, concise communicator—able to tailor messages to engineers, product owners, and leadership. High degree of initiative, ownership, and accountability; thrives in fast-changing environments. What you can expect of us As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 5 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Location: Hyderabad Work Mode: Hybrid (2–3 days on-site/week) Experience Required: Minimum 3+ Years Salary Range: ₹6 – ₹10 LPA (based on experience and skill set) Disclaimer – Please Read Before You Apply We are not accepting applications from freshers or freelance-only profiles . This role requires prior corporate experience (minimum 2 years) with a clear understanding of how to automate contact extraction workflows using Python and NLP. This role goes beyond writing Python scripts—it’s about applying NLP (Natural Language Processing) in real-world scenarios where data isn’t clean or structured. We are looking for someone who can extract contact information like names, emails, phone numbers, and company details from documents such as PDFs, resumes, scanned images, and emails —even when the formats vary or the layout is messy. You should be able to: Understand the structure and flow of unstructured text Apply NLP concepts to locate and extract relevant contact details Build logic to automate this extraction across different document types Think critically and creatively to handle inconsistent data inputs You don’t need to rely on pre-built solutions—we value your ability to reason through the problem and implement your own approach . If you're passionate about NLP and love solving messy data problems, we encourage you to apply. To ensure you've read this section thoroughly, we’ve included a small check in the application process. The keyword is the number 7 — you’ll be asked for it at the end of the form. About the Role A leading AI-focused organization is looking for a Junior Python Developer with strong experience in machine learning and natural language processing. This is a great opportunity to work closely with senior engineers on cutting-edge AI initiatives that involve building, training, and deploying intelligent models at scale. Key Responsibilities Develop and optimize Python scripts for data preprocessing, training, and evaluating NLP/ML models. Contribute to ML pipeline development using Scikit-learn, TensorFlow, or PyTorch. Deploy ML models via REST APIs using Flask or FastAPI. Handle data in various formats such as JSON, CSV, and PDF using spaCy, PyMuPDF, or regex-based logic. Participate in testing, debugging, and validation of machine learning workflows. Maintain documentation on code, model performance, and technical decisions. Stay updated with tools like Hugging Face, LangChain, and MLflow. Mandatory Skills Minimum 3 years of hands-on Python development experience. Proficient in Python libraries such as NumPy, Pandas, and Matplotlib. Experience with NLP tasks using tools like spaCy or regex for rule-based NER. Understanding of core ML algorithms: classification, regression, clustering. Familiar with ML frameworks: Scikit-learn, TensorFlow, or PyTorch. Experience developing and consuming REST APIs. Proficient with Git and collaborative version control. Nice to Have Experience with Hugging Face Transformers or LangChain. Familiarity with MLflow or similar model lifecycle tools. Exposure to OCR or intelligent document processing projects. How to Apply Please send your updated resume to komal.bansal@zetamicron.com Shortlisted candidates will be contacted for further discussions.
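To make the extraction task described in this posting concrete, here is a minimal sketch of the usual first pass: regular expressions for emails and phone numbers plus spaCy NER for names and organisations. It assumes the en_core_web_sm model is installed, the patterns are deliberately simplified, and real documents would need far more normalisation and layout handling than this.

```python
# Minimal contact-extraction sketch: regex for emails/phones, spaCy NER for names/orgs.
# Assumes `python -m spacy download en_core_web_sm` has been run; patterns are simplified.
import re
import spacy

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"(?:\+91[\s-]?)?[6-9]\d{9}")   # simplified Indian mobile pattern

nlp = spacy.load("en_core_web_sm")

def extract_contacts(text: str) -> dict:
    doc = nlp(text)
    return {
        "emails": EMAIL_RE.findall(text),
        "phones": PHONE_RE.findall(text),
        "names": [ent.text for ent in doc.ents if ent.label_ == "PERSON"],
        "companies": [ent.text for ent in doc.ents if ent.label_ == "ORG"],
    }

sample = "Reach Priya Sharma at priya.sharma@example.com or +91 9876543210, Acme Analytics Pvt Ltd."
print(extract_contacts(sample))
```

For scanned images and messy PDFs, an OCR pass (and plenty of text cleanup) would sit in front of this step.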

Posted 5 days ago

Apply

4.0 - 9.0 years

7 - 17 Lacs

Pune

Work from Office

We are looking for a technically sound and process-driven VIM / P2P Functional Analyst with experience in OCR technologies and Azure services to support our Invoice Management Automation (VMI) platform. The role focuses on managing the OCR data extraction pipeline, validating incoming invoice metadata, and ensuring smooth integration with SAP through Azure Logic Apps and JSON payloads. This is not a direct SAP consultant role, but it requires a good understanding of the SAP invoice posting lifecycle to align the automation logic accordingly.

Key Responsibilities:
* Manage OCR data capture and validation processes for incoming PDF invoices.
* Configure and monitor Azure Logic Apps, Functions, and Service Bus for invoice data flow.
* Work with business users to validate invoice fields (vendor, amount, PO number, etc.) and route exceptions.
* Collaborate with SAP Functional Consultants to understand IDoc/API mappings and post-processing logic.
* Maintain invoice queues, monitor failures, and resolve mapping or data exceptions.
* Work with the development team to refine parsing logic or enhance OCR templates.
* Document exception cases, handling procedures, and escalation points.
* Conduct UAT, prepare test cases, and train end users on exception dashboards.

Required Skills and Experience:
* 4–6 years of total experience, including a minimum of 2 years in OCR-based invoice processing (ABBYY, Azure Form Recognizer, etc.) and 2+ years working on Azure Logic Apps / Azure Functions / API integrations.
* Good understanding of the invoice processing lifecycle in SAP (FI-AP), though not mandatory to configure SAP.
* Ability to interpret SAP posting logs, IDoc messages, and error statuses.
* Experience working with JSON, XML, and integration payload formats.
* Familiarity with SharePoint, Power Automate, or workflow visualization tools.
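The validation step this posting describes, checking OCR-extracted invoice fields before the JSON payload is handed to SAP, is mostly rule checks. Below is a simplified, hypothetical sketch; the field names, PO number format, and rules are invented for illustration and would come from the actual field mapping.

```python
# Hypothetical validation of OCR-extracted invoice metadata before routing to SAP.
# Field names, rules, and the PO number format are invented for illustration.
import re

REQUIRED_FIELDS = ("vendor_id", "invoice_number", "invoice_date", "po_number", "amount")
PO_PATTERN = re.compile(r"^45\d{8}$")   # example format for a standard purchase order number

def validate_invoice_payload(payload: dict) -> list:
    """Return a list of exception reasons; an empty list means the payload can be posted."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS if not payload.get(f)]
    if payload.get("po_number") and not PO_PATTERN.match(str(payload["po_number"])):
        errors.append("PO number does not match the expected format")
    try:
        if float(payload.get("amount", 0)) <= 0:
            errors.append("amount must be positive")
    except (TypeError, ValueError):
        errors.append("amount is not numeric")
    return errors

payload = {"vendor_id": "V-1001", "invoice_number": "INV-778", "invoice_date": "2025-08-01",
           "po_number": "4500012345", "amount": "18250.00"}
print(validate_invoice_payload(payload))   # [] -> ready for posting; otherwise route to exceptions
```

In a platform like the one described, anything returned by such a check would land in the exception queue handled through the workflow tooling rather than being raised in-process.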

Posted 5 days ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

We are seeking a skilled RPA Developer with 3+ years of experience in UiPath and Power Automate to design, develop, and implement automation solutions. The ideal candidate will work closely with business teams to streamline workflows, improve efficiency, and support digital transformation initiatives. Key Responsibilities: Design, develop, and implement RPA solutions using UiPath and Power Automate . Collaborate with business analysts and stakeholders to identify automation opportunities . Create and maintain RPA workflows, scripts, and automation bots . Troubleshoot, debug, and optimize existing automation processes for efficiency and performance. Integrate RPA solutions with various enterprise applications like ERP, CRM, databases, and APIs . Ensure compliance with security, governance, and best practices in RPA development. Develop and maintain technical documentation for automation processes. Provide support, monitoring, and maintenance of deployed automation solutions. Stay up-to-date with RPA trends, tools, and best practices . Required Skills and Qualifications: Education: Bachelor’s degree in Computer Science, Information Technology, or a related field. Technical Skills: 3+ years of hands-on experience in UiPath and Power Automate . Strong expertise in RPA development, deployment, and maintenance . Experience in Orchestrator, attended/unattended bots, and REFramework. Proficiency in Power Automate Desktop and Cloud flows . Knowledge of API integration, SQL, and scripting languages (e.g., Python, VBScript, JavaScript). Understanding of AI/ML models, OCR tools, and process mining is a plus. Experience with Microsoft Power Platform, Power Apps, and Power BI is a bonus. Soft Skills: Strong problem-solving and analytical skills . Good communication skills to collaborate with business and technical teams . Ability to work in agile methodologies and handle multiple projects. Preferred Certifications: UiPath Certified RPA Developer (UiARD) or equivalent. Microsoft Power Automate Certification (PL-500) or similar.

Posted 5 days ago

Apply

5.0 years

0 Lacs

India

On-site

Job Description: Flutter Developer (Dart) Key Responsibilities Flutter Development Build and maintain cross-platform mobile applications for Android and iOS using Flutter (Dart). Implement advanced features such as: Biometric authentication (Face ID, Fingerprint) OTP-based fallback login Push notifications via Firebase Document uploads/downloads Secure in-app transactions and chat functionality API & Middleware Integration Consume and integrate RESTful APIs over secure protocols. Collaborate with Node.js backend teams to: Authenticate via JWT Process and manage data through middleware Debug and optimize API flows for performance and reliability UI/UX & Figma Design Integration Convert Figma design prototypes into responsive and scalable Flutter UIs. Ensure pixel-perfect UI across various screen sizes and devices. Apply Material Design or custom design systems to ensure consistent branding and UX behavior. Collaborate with designers to ensure alignment between design and functionality. Implement responsive and accessible layouts as per modern usability standards. Security, Performance & Compliance Adhere to data privacy and security practices (e.g., GDPR, HIPAA). Ensure encrypted data storage and secure session management. Optimize performance for fast load times, responsiveness, and low crash rates. Participate in QA/UAT cycles and address production issues as needed. Requirements Must-Have 5+ years of experience with Flutter and Dart Proven expertise in integrating RESTful APIs securely Hands-on with Firebase services (Messaging, Crashlytics, Analytics) Proficiency in translating Figma designs into functional mobile UIs Experience implementing biometric login and secure local storage Basic understanding of Node.js for middleware integration Nice-to-Have Experience with Azure Blob Storage or Firebase for document/file handling Familiarity with CI/CD pipelines for app builds and deployment Knowledge of accessibility standards and multi-language support Soft Skills Strong communication and collaboration, especially with designers, backend teams, and QA Keen eye for detail and commitment to building polished, user-friendly interfaces Ownership mindset and problem-solving attitude Agile mindset with experience in Scrum or similar methodologies Preferred Qualifications Bachelor’s degree in Computer Science, Software Engineering, or related field Experience in regulated domains (e.g., healthcare, finance, insurance) Exposure to chatbots, OCR, or data dashboards is a plus Skills: firebase,figma,flutter,multi-language support,data privacy,azure blob storage,biometric authentication,ios,restful apis,android,hipaa,dart,ui/ux design,ci/cd,node.js,material design,mobile applications,biometric login,gdpr

Posted 5 days ago

Apply

5.0 years

0 Lacs

India

On-site

Job Description: Flutter Developer (Dart) Key Responsibilities Flutter Development Build and maintain cross-platform mobile applications for Android and iOS using Flutter (Dart). Implement advanced features such as: Biometric authentication (Face ID, Fingerprint) OTP-based fallback login Push notifications via Firebase Document uploads/downloads Secure in-app transactions and chat functionality API & Middleware Integration Consume and integrate RESTful APIs over secure protocols. Collaborate with Node.js backend teams to: Authenticate via JWT Process and manage data through middleware Debug and optimize API flows for performance and reliability UI/UX & Figma Design Integration Convert Figma design prototypes into responsive and scalable Flutter UIs. Ensure pixel-perfect UI across various screen sizes and devices. Apply Material Design or custom design systems to ensure consistent branding and UX behavior. Collaborate with designers to ensure alignment between design and functionality. Implement responsive and accessible layouts as per modern usability standards. Security, Performance & Compliance Adhere to data privacy and security practices (e.g., GDPR, HIPAA). Ensure encrypted data storage and secure session management. Optimize performance for fast load times, responsiveness, and low crash rates. Participate in QA/UAT cycles and address production issues as needed. Requirements Must-Have 5+ years of experience with Flutter and Dart Proven expertise in integrating RESTful APIs securely Hands-on with Firebase services (Messaging, Crashlytics, Analytics) Proficiency in translating Figma designs into functional mobile UIs Experience implementing biometric login and secure local storage Basic understanding of Node.js for middleware integration Nice-to-Have Experience with Azure Blob Storage or Firebase for document/file handling Familiarity with CI/CD pipelines for app builds and deployment Knowledge of accessibility standards and multi-language support Soft Skills Strong communication and collaboration, especially with designers, backend teams, and QA Keen eye for detail and commitment to building polished, user-friendly interfaces Ownership mindset and problem-solving attitude Agile mindset with experience in Scrum or similar methodologies Preferred Qualifications Bachelor’s degree in Computer Science, Software Engineering, or related field Experience in regulated domains (e.g., healthcare, finance, insurance) Exposure to chatbots, OCR, or data dashboards is a plus Skills: flutter,figma,hipaa,restful apis,dart,azure blob storage,multi-language support,material design,biometric login,firebase services,firebase,ci/cd,gdpr,node.js,biometric authentication

Posted 5 days ago

Apply

0.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Job Description – AI Developer (Agentic AI Frameworks, Computer Vision & LLMs)
Location: Hybrid – Bangalore

About the Role: We’re seeking an AI Developer who specializes in agentic AI frameworks—LangChain, LangGraph, CrewAI, or equivalents—and who can take both vision and language models from prototype to production. You will lead the design of multi-agent systems that coordinate perception (image classification and extraction), reasoning, and action, while owning the end-to-end deep-learning life-cycle (training, scaling, deployment, and monitoring).

Key Responsibilities:
Agentic AI Frameworks (Primary Focus): Architect and implement multi-agent workflows using LangChain, LangGraph, CrewAI, or similar. Design role hierarchies, state graphs, and tool integrations that enable autonomous data processing, decision-making, and orchestration. Benchmark and optimize agent performance (cost, latency, reliability).
Image Classification & Extraction: Build and fine-tune CNN/ViT models for classification, detection, OCR, and structured data extraction. Create scalable data-ingestion, labeling, and augmentation pipelines.
LLM Fine-Tuning & Retrieval-Augmented Generation (RAG): Fine-tune open-weight LLMs with LoRA/QLoRA and PEFT; perform SFT, DPO, or RLHF as needed. Implement RAG pipelines using vector databases (FAISS, Weaviate, pgvector) and domain-specific adapters.
Deep Learning at Scale: Develop reproducible training workflows in PyTorch/TensorFlow with experiment tracking (MLflow, W&B). Serve models via TorchServe/Triton/KServe on Kubernetes, SageMaker, or GCP Vertex AI.
MLOps & Production Excellence: Build robust APIs/micro-services (FastAPI, gRPC). Establish CI/CD, monitoring (Prometheus, Grafana), and automated retraining triggers. Optimize inference on CPU/GPU/Edge with ONNX/TensorRT, quantization, and pruning.
Collaboration & Mentorship: Translate product requirements into scalable AI services. Mentor junior engineers, conduct code and experiment reviews, and evangelize best practices.

Minimum Qualifications:
B.S./M.S. in Computer Science, Electrical Engineering, Applied Math, or a related discipline.
5+ years building production ML/DL systems with strong Python & Git.
Demonstrable expertise in at least one agentic AI framework (LangChain, LangGraph, CrewAI, or comparable).
Proven delivery of computer-vision models for image classification/extraction.
Hands-on experience fine-tuning LLMs and deploying RAG solutions.
Solid understanding of containerization (Docker) and cloud AI stacks (AWS/Azure).
Knowledge of distributed training, GPU acceleration, and performance optimization.

Job Type: Full-time
Pay: Up to ₹1,200,000.00 per year
Experience: AI, LLM, RAG: 4 years (Preferred); Vector database, Image classification: 4 years (Preferred); containerization (Docker): 3 years (Preferred); ML/DL systems with strong Python & Git: 3 years (Preferred); LangChain, LangGraph, CrewAI: 3 years (Preferred)
Location: Bangalore, Karnataka (Preferred)
Work Location: In person
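The RAG pipelines mentioned in this posting reduce to three steps: embed documents, index the vectors, and retrieve nearest neighbours to ground the LLM prompt. A minimal retrieval sketch with FAISS and sentence-transformers follows; the embedding model and toy corpus are placeholders, not project specifics.

```python
# Minimal retrieval sketch for a RAG pipeline: embed, index with FAISS, search.
# The embedding model and toy corpus are placeholders, not project specifics.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

corpus = [
    "Invoices must be approved before posting.",
    "OCR extracts vendor, amount, and PO number from scanned documents.",
    "Agents can call external tools to fetch reference data.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = encoder.encode(corpus, normalize_embeddings=True).astype("float32")

index = faiss.IndexFlatIP(embeddings.shape[1])   # inner product == cosine on normalized vectors
index.add(embeddings)

query = encoder.encode(["Which fields does OCR pull from an invoice?"],
                       normalize_embeddings=True).astype("float32")
scores, ids = index.search(query, k=2)
retrieved = [corpus[i] for i in ids[0]]
# `retrieved` would be stuffed into the LLM prompt as grounding context.
print(retrieved)
```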

Posted 5 days ago

Apply

0.0 - 3.0 years

0 Lacs

Kochi, Kerala

On-site

We are a fast-growing technology startup based in Cochin, Kerala, focused on building innovative AI-powered software solutions for the healthcare, retail, and hospitality industries. We’re looking for a passionate AI Developer / Engineer to join our team and help us take our products to the next level. Key Responsibilities: Design, develop, and deploy AI/ML models for real-world applications. Build and optimize NLP, Computer Vision, or Predictive Analytics modules. Preprocess data and build datasets for training and inference. Integrate AI models into production-ready software using Python and REST APIs. Collaborate with software developers, product managers, and domain experts. Required Skills: 2–3 years of experience in AI/ML development. Proficient in Python and frameworks like TensorFlow, PyTorch, Scikit-learn . Experience with NLP, OCR, or Computer Vision projects. Solid understanding of data preprocessing , model training , and evaluation metrics . Ability to work with APIs , Databases (SQL/NoSQL) , and cloud tools. Experience with version control systems (e.g., Git). Preferred Skills (Bonus): Experience with AI in healthcare or OCR for documents/prescriptions . Knowledge of LLMs (e.g., GPT, LLaMA) or Generative AI . Deployment experience using Docker , Kubernetes , or AWS/GCP/Azure . Familiarity with Flutter , Node.js , or full-stack environments. Job Type: Full-time Pay: From ₹24,000.00 per month Work Location: In person Expected Start Date: 19/08/2025

Posted 5 days ago

Apply

6.0 years

0 Lacs

Sahibzada Ajit Singh Nagar, Punjab, India

On-site

Software Engineer
🏭 Company: NYX India (wholly owned subsidiary of NYX Inc.)
📍 Location: Mohali (Sahibzada Ajit Singh Nagar), Punjab
🕒 Full-Time
💼 Experience – 6-8 years

Driven by Innovation. Defined by Excellence. NYX Inc. is a market-leading minority business provider of automotive interior and under-hood solutions, known for its commitment to innovation, quality, and continuous improvement. We are seeking a Software Engineer to join our fast-growing software team in Mohali, with strong expertise across modern Microsoft technologies and a passion for writing scalable, performant, and clean code.

🔑 Must-Have Skills:
✅ 5–6 years of hands-on development experience
✅ .NET Core, ASP.NET, MVC Core, Web API Core
✅ C# and VB.NET
✅ REST API design & integration
✅ Frontend experience with Angular (version ≤ 19) + TypeScript
✅ HTML5, CSS, and Responsive Web Design
✅ SQL Server expertise – querying, optimization, stored procedures
✅ Understanding of scalable architecture and secure coding practices

Good to Have:
🌟 Python (especially with OCR capabilities)
🌟 React with Material UI
🌟 Writing test cases using JEST (React/Angular)
🌟 Familiarity with Agile development and DevOps practices
🌟 Active Directory concepts

Posted 5 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Title: ML Developer
Language: English
Location: Pune/Hyderabad
Duration: Full-time permanent role
Workplace type: Work from office
Experience: 5+ years

Machine Learning (ML) Developer JD: Onboarding and Know Your Customer (OBKYC) Value Stream

The Onboarding and Know Your Customer (OBKYC) Enabler Value Stream has been established to deliver common, group-wide onboarding and KYC capabilities and services. It brings together business, operations and technology colleagues to co-design and build solutions to deliver better products and services for our customers. This value stream is intended to deliver towards our Future State Architecture (FSA) and the Digital Acceleration Programme, enabling a consistent approach in how we deliver change across the bank to improve the experience of our customers and the resilience of our infrastructure, and to allow us to embrace innovative technologies. Our global businesses, operations and technology teams work closely together to help design and build digital services that allow our millions of customers around the world to bank quickly, simply, and securely.

OBKYC scope incorporates onboarding products, platforms, and a delivery capability particularly suited to client-aligned agile delivery at pace. The products provide our CIB and Wealth & Private Banking client base with onboarding capabilities to enable a best-in-class staff and client experience. Solutions are tailored to suit the client’s needs, from Treasury departments of large multi-nationals to sole traders. We are investing heavily across these domains with a strategic focus on increasing adoption of AI capabilities through our flagship AI journeys, day-to-day engineering and overall ways of working. To accelerate achieving our vision, we are seeking an experienced AI Engineer to join the Client Services and OBKYC Technology group. The role will be Poland-based but will work with our teams globally.

Machine Learning Developer – Principal responsibilities:
· Building production-ready models to drive content extraction and classification from images and text-based sources.
· Working closely with business teams to understand requirements and iteratively design and develop solutions.
· Collaborating with product managers and technical teams.
· Creating, testing and iterating new and existing products and features.
· Designing and building Python/ML/OCR-based components.
· Supporting not only the development of the product but also the full lifecycle, including deployment, testing and production support of the application.

Required experience:
· MUST HAVE: Strong experience in Document AI/intelligent document processing using traditional models and Generative AI, particularly in using open-source models for achieving business outcomes.
· Experience delivering to production in Python; a focus on machine learning, deep learning, natural language processing, generative AI, image processing and OCR are all additional positives.
· Experience with some of the following frameworks: TensorFlow, PyTorch, Hugging Face, spaCy, OpenCV, Regex or equivalents.
· Experience delivering safe code to production, focusing on cybersecurity and resilience of the application and APIs.

Nice to Have:
· Experience using PostgreSQL for data storage and management.
· Proficiency with Azure’s core services like Azure Virtual Machines, and experience with one or all of Azure CLI, Azure Kubernetes Service (AKS) and Azure DevOps.
· Experience delivering in teams releasing at a high cadence to production.
If interested, kindly share your updated CV with arulkiruthiga@sloka.eu (or) arul.k@kamkon.in
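For the classification half of the role above, a baseline that is often stood up before any deep-learning model is a simple TF-IDF plus linear classifier. A toy sketch with scikit-learn follows; the training texts and labels are invented purely for illustration.

```python
# Toy document-classification baseline with scikit-learn (TF-IDF + logistic regression).
# The training texts and labels are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "invoice number 4711 total amount due 30 days",
    "passport number issued by government of india",
    "purchase order for 20 units delivery next month",
    "national identity card date of birth and address",
]
labels = ["invoice", "id_document", "purchase_order", "id_document"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
model.fit(texts, labels)

print(model.predict(["total amount payable within 30 days of invoice date"]))
```

In practice the OCR output of scanned pages would feed this step, and a transformer-based classifier would typically replace the linear model once enough labeled documents exist.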

Posted 6 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Technical Lead-App Development Location: Bengaluru Total exp: Min 5 to 8 years Skills: Esker, SAP ECC / S4 / Finance (Accounts Payable) and/or Procure to Pay Process Key Responsibilities: Required to have: Esker implementation and support experience – 2+ years Experience implementing and supporting SAP or JDE, companies, languages and currencies Experience / Good knowledge in Esker auto-learning, OCR mapping, touchless processing, teaching and workflow Familiar with Esker dashboards, workspaces, punchouts catalogs Expertise in Esker Accounts Payable and Purchasing functionality and processes Experience / working knowledge of SAP Finance (Accounts Payable) and/or SAP Procure to Pay process Preferred to have: Experience implementing and supporting multiple ERPS, companies, languages and currencies Experience with SAP ECC and S/4 Expert in SAP ECC / S4 / Finance (Accounts Payable) and/or Procure to Pay Process Familiar with the change management process

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

kochi, kerala

On-site

You will be responsible for the functions, activities, and skills required for the analysis, design, coding, integration, testing, and maintenance of Intelligent Document Processing modules and systems. Your main focus will be on building NLP-based solutions for query and document analysis, processing, information extraction, document classification, and context-based information retrieval. Additionally, you will conduct research to advance state-of-the-art deep learning and NLP technologies.

Your role will involve building knowledge of the organization, processes, and customers. You should have knowledge and experience in your own discipline while continuously acquiring higher-level knowledge and skills. You will receive a moderate level of guidance and direction, with moderate decision-making authority guided by policies, procedures, and business operations protocol.

In terms of technical skills, you should have proficiency and experience in the technical area of Intelligent Document Processing, including digitization, OCR/ICR/OMR, LLMs, classification methodologies, data extraction methodologies, ML, AI, DL, NLP, etc. Experience in designing and developing highly scalable templates and training documents on IDP for efficient data extraction from semi-/un-structured PDFs or images is essential. Familiarity with Docker, Flask APIs, Redis, and Celery is a must. You will work closely with Solution Architects/Team Leads, prepare technical design documents, and implement automated deployment. Understanding and practicing Agile methodologies and the Software Development Lifecycle (SDLC) using code management and release tools (MS DevOps, GitHub, Team Foundation Server), working with relational databases and SQL scripting (MS SQL Server), and a clear understanding of architecture and infrastructure requirements are crucial for this role. Experience or proficiency in Python would be useful for this position.

This is a fully remote position.

Posted 1 week ago

Apply

1.0 years

0 Lacs

India

Remote

About Us: Upscrape is a fast-growing data automation and web scraping company building advanced scraping systems, custom data pipelines, and API-driven solutions for enterprise clients. We work on complex real-world challenges that require precision, scale, and expertise. As we continue to grow, we are looking to bring on an experienced developer to join our core technical team. Position Overview: We’re hiring a full‑time Software Engineer (Python / Full‑Stack) with strong experience in web applications, RESTful API development, and web scraping/browser automation. The ideal candidate has built production‑level systems, from front‑end interfaces to backend services, understands anti‑bot protections, and can independently own end‑to‑end data extraction and delivery workflows. This is a highly technical role perfect for someone who thrives on solving complex problems and shipping meaningful features that make an immediate impact. Key Responsibilities: Design, build, and maintain web applications (front‑end + back‑end) using Flask, FastAPI or Django paired with modern JavaScript frameworks. Develop and document RESTful APIs to serve and manage data. Implement and maintain web scraping/browser‑automation pipelines for dynamic, protected sites (Playwright, Selenium, Puppeteer), as part of broader data workflows. Architect and operate proxy management , IP rotation, and anti‑blocking solutions. Ensure high reliability with robust error handling , retry logic, monitoring, and scalability. Collaborate closely with the founder and cross‑functional team to define requirements, estimate tasks, and deliver client projects on time. Required Experience & Skills: 1+ years building production‑level web applications or scraping systems. Python proficiency with frameworks/libraries: Flask or FastAPI (preferred), Django, Requests, Async/Aiohttp, Scrapy. Front‑end fundamentals : HTML/CSS, JavaScript, and experience integrating with React or Next.js. API development : design, documentation (OpenAPI/Swagger), versioning. Database experience: PostgreSQL, MongoDB, or similar. DevOps skills: Docker, Git, CI/CD, Linux environments. Strong debugging, optimization, and problem‑solving abilities. Clear, consistent communication and a collaborative mindset. Bonus (Nice to Have): Experience with AI‑powered data tooling (LLMs, OCR, GPT‑4, Cursor, Claude Code). Familiarity with large‑scale architectures handling millions of records . Prior work in SaaS , productized data services, or cloud platforms (AWS, GCP, Azure). The Right Fit: We’re looking for someone who is: Self‑driven – takes full ownership from design through deployment. Execution‑oriented – really fast at shipping clean, maintainable code. Experienced – knows what they’re doing and can mentor others. Outcome‑focused – prioritizes working systems and client impact over theory. Detail‑oriented – writes clean code, follows best practices, and documents thoroughly. What We Offer: 100% remote, full‑time position. Stable, long‑term role with clear growth paths. Direct, efficient communication. Opportunity to work on high‑impact, cutting‑edge projects. Competitive compensation: ₹6 LPA – ₹12 LPA , based on skills and experience. 
How to Apply (Important Filter): In your application, please include: links or code samples of web apps, APIs, or scraping projects you've built; your preferred tools and libraries, and why; a brief overview of your approach to extracting data from highly dynamic websites; and a note on your experience with AI tools (e.g., Cursor, Claude Code) and your typical development turnaround time. We look forward to seeing how you can help us build the next generation of data products!

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Why Choose Ideas2IT: Ideas2IT has all the good attributes of a product startup and a services company. Since we launch our own products, you will have ample opportunities to learn and contribute; and because single-product companies stagnate in the technologies they use, our multiple product initiatives and customer-facing projects give you the opportunity to work on various technologies. AGI is going to change the world, and big companies like Microsoft are betting heavily on it; we are following suit. As a Data Engineer, you will focus exclusively on engineering data pipelines for complex products.

What's in it for you? A robust distributed platform to manage a self-healing swarm of bots on unreliable network/compute. Large-scale cloud-native applications. A Document Comprehension Engine leveraging RNNs and other recent OCR techniques. A completely data-driven low-code platform. You will leverage cutting-edge technologies like Blockchain, IoT, and Data Science as you work on projects for leading Silicon Valley startups. Your role does not start or end with just Java development; you will enjoy the freedom to share your suggestions on the choice of tech stacks across the length of the project. If there is a certain technology you would like to explore, you can do your own technical PoCs. You will work in a culture that values capability over experience and treats continuous learning as a core tenet.

Here's what you'll bring: Proficiency in SQL and experience with database technologies (e.g., MySQL, PostgreSQL, SQL Server). Experience in any one of the cloud environments – AWS or Azure. Experience with data modeling, data warehousing, and building ETL pipelines. Experience building large-scale data pipelines and data-centric applications using any distributed storage platform. Experience with data processing tools like Pandas and PySpark. Experience with cloud services like S3, Lambda, SQS, Redshift, Azure Data Factory, ADLS, and Function Apps. Expertise in one or more high-level languages (Python/Scala). Ability to handle large-scale structured and unstructured data from internal and third-party sources. Ability to collaborate with analytics and business teams to improve data models that feed business intelligence tools, increase data accessibility, and foster data-driven decision-making across the organization. Experience with data visualization tools like Power BI and Tableau. Experience with containerization technologies like Docker and Kubernetes.

About Us: Ideas2IT stands at the intersection of Technology, Business, and Product Engineering, offering high-caliber product development services. Initially conceived as a CTO consulting firm, we've evolved into thought leaders in cutting-edge technologies such as Generative AI, assisting our clients in embracing innovation. Our forte lies in applying technology to address business needs, demonstrated by our track record of developing AI-driven solutions for industry giants like Facebook, Bloomberg, Siemens, Roche, and others. Harnessing our product-centric approach, we've incubated several AI-based startups—including Pipecandy, Element5, IdeaRx, and Carefi.in—that have flourished into successful ventures backed by venture capital. With fourteen years of remarkable growth behind us, we're steadfast in pursuing ambitious objectives. P.S. We're all about diversity, and our doors are wide open to everyone. Join us in celebrating the awesomeness of differences!
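As a hedged illustration of the pipeline work described above (not an Ideas2IT codebase), here is a minimal PySpark ETL step. The file paths, column names, and aggregation are hypothetical; it assumes the pyspark package is installed and a local Spark session is sufficient.

```python
# Minimal extract-transform-load sketch: raw JSON events in, curated Parquet out.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw semi-structured events (a local path stands in for cloud storage).
raw = spark.read.json("data/raw/orders/*.json")

# Transform: fix types, drop incomplete rows, and aggregate to a daily summary.
daily = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .dropna(subset=["order_id", "order_ts", "amount"])
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "country")
       .agg(F.count("order_id").alias("orders"), F.sum("amount").alias("revenue"))
)

# Load: write partitioned Parquet that a warehouse or BI tool can pick up downstream.
daily.write.mode("overwrite").partitionBy("order_date").parquet("data/curated/daily_orders")

spark.stop()
```

The same read-clean-aggregate-write shape carries over when the source is S3 or ADLS and the sink is Redshift or a lakehouse table; only the connectors change.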

Posted 1 week ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Role: As a Data Engineer, you'll build and maintain data pipelines and architectures. Responsibilities include optimizing databases and ETL processes using Python or SQL, and collaborating with data teams for informed decision-making.

Why Choose Ideas2IT: Ideas2IT has all the good attributes of a product startup and a services company. Since we launch our own products, you will have ample opportunities to learn and contribute; and because single-product companies stagnate in the technologies they use, our multiple product initiatives and customer-facing projects give you the opportunity to work on various technologies. AGI is going to change the world, and big companies like Microsoft are betting heavily on it; we are following suit. As a Data Engineer, you will focus exclusively on engineering data pipelines for complex products.

What's in it for you? A robust distributed platform to manage a self-healing swarm of bots on unreliable network/compute. Large-scale cloud-native applications. A Document Comprehension Engine leveraging RNNs and other recent OCR techniques. A completely data-driven low-code platform. You will leverage cutting-edge technologies like Blockchain, IoT, and Data Science as you work on projects for leading Silicon Valley startups. Your role does not start or end with just Java development; you will enjoy the freedom to share your suggestions on the choice of tech stacks across the length of the project. If there is a certain technology you would like to explore, you can do your own technical PoCs. You will work in a culture that values capability over experience and treats continuous learning as a core tenet.

Here's what you'll bring: Proficiency in SQL and experience with database technologies (e.g., MySQL, PostgreSQL, SQL Server). Experience in any one of the cloud environments – AWS or Azure. Experience with data modeling, data warehousing, and building ETL pipelines. Experience building large-scale data pipelines and data-centric applications using any distributed storage platform. Experience with data processing tools like Pandas and PySpark. Experience with cloud services like S3, Lambda, SQS, Redshift, Azure Data Factory, ADLS, and Function Apps. Expertise in one or more high-level languages (Python/Scala). Ability to handle large-scale structured and unstructured data from internal and third-party sources. Ability to collaborate with analytics and business teams to improve data models that feed business intelligence tools, increase data accessibility, and foster data-driven decision-making across the organization. Experience with data visualization tools like Power BI and Tableau. Experience with containerization technologies like Docker and Kubernetes.

Posted 1 week ago

Apply

12.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Position: AI/ML Lead (Associate Director Level). Experience: 12+ years. Mode: Hybrid. Salary: Up to 75 LPA. We are looking to hire an AI/ML Lead for one of our well-established MNC clients.

The role requires proficiency in pulling structured and unstructured data from legacy systems to create a data architecture that can drive AI/ML efforts. You should be able to engage actively with internal stakeholders, helping them understand the intricacies of AI/ML solutions and the factors involved in successful execution, and to leverage the latest AI offerings (ChatGPT, GPT-4, Copilots, Tesseract OCR, OCR with CNNs) from Microsoft, AWS, IBM, and other providers to deliver solutions specific to the business interests. Also expected: a deep hands-on understanding of OCR and BI technologies; a proven track record in thought leadership, particularly in IT modernization and innovation, to expedite immigration benefits and solve business problems; and a solution-architect mindset, adept at planning data usage for delivering outcomes through multiple applications and continuously improving AI & ML models.

Essential Qualifications: 12+ years in IT with a focus on enterprise data architecture, ML, and BI. 5+ years leading data teams on enterprise-scale AI/ML or RPA efforts. 3+ years of hands-on experience leading teams in the use of deep learning frameworks (e.g., TensorFlow or PyTorch). Experience with prompt engineering for conversational AI. Proficiency in managing big data and distributed computing environments (e.g., Databricks, Spark, Dask). Proven experience in leveraging LLMs for analytics and decision making. High proficiency in AWS or Azure data processing (e.g., EC2, S3, Redshift). Hands-on leadership experience. Bachelor's degree in Data Science, Statistics, Computer Science, IT Management, Engineering, or similar.

Bonus Qualifications: Expertise in advanced AI domains such as Natural Language Processing (NLP), OCR with CNNs, and AGI, showcasing a broad and deep understanding of AI technologies. Client engagement and collaboration: excellent skills in engaging with clients, coupled with a strong ability to collaborate on new ideas and concepts, indicating both interpersonal and creative capabilities. Innovation and team leadership: experience in creating and implementing innovative roadmaps and aligning teams for efficient delivery, reflecting leadership and strategic planning abilities. Data process management: demonstrated expertise in managing and moving data through various processes while ensuring accountability for outcomes, emphasizing operational and execution skills. Work ethic and team spirit: an independent worker who embodies a strong work ethic and team spirit, ensuring a balance between autonomy and collaboration.
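To illustrate the kind of OCR-plus-LLM workflow this role references (a sketch under stated assumptions, not the client's actual solution), the snippet below OCRs a scanned page with Tesseract and asks an LLM to return a fixed JSON schema. The model name, field list, and file path are placeholders; it assumes the pytesseract and openai packages are installed and an OPENAI_API_KEY is set in the environment.

```python
# Sketch: pair Tesseract OCR with an LLM prompt to pull structured fields from a scan.
import json
from PIL import Image
import pytesseract
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT_TEMPLATE = (
    "Extract the following fields from the document text and answer with JSON only: "
    "invoice_number, invoice_date, total_amount.\n\nDocument text:\n{text}"
)

def extract_fields(image_path: str) -> dict:
    # Step 1: OCR the scanned page into plain text.
    text = pytesseract.image_to_string(Image.open(image_path))
    # Step 2: ask the LLM to map the free text onto a fixed schema.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": PROMPT_TEMPLATE.format(text=text)}],
    )
    # The model is instructed to reply with JSON only; parse it into a dict.
    return json.loads(response.choices[0].message.content)

if __name__ == "__main__":
    print(extract_fields("sample_invoice.png"))  # hypothetical input file
```

In a production setting this would sit behind the kind of data architecture and governance the qualifications describe, with validation of the model output before it feeds downstream BI or decision systems.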

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies