
3280 Extraction Jobs - Page 41

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

India

On-site

Source: LinkedIn

Role: AI Engineer

Join AiDP: Revolutionizing Document Automation through AI

At AiDP, we're transforming complex document workflows into seamless experiences with powerful AI-driven automation. We're on a mission to redefine efficiency, accuracy, and collaboration in finance, insurance, and compliance. To continue pushing boundaries, we're looking for exceptional talent.

Your Mission:
- Develop, deploy, and optimize cutting-edge machine learning models for accurate extraction and structuring of data from complex documents.
- Design and implement scalable NLP pipelines to handle vast quantities of unstructured and structured data.
- Continuously refine models through experimentation and data-driven analysis to maximize accuracy and efficiency.
- Collaborate closely with product and engineering teams to deliver impactful, real-world solutions.

We're looking for:
- Proven expertise in NLP, machine learning, and deep learning, with solid knowledge of frameworks such as PyTorch, TensorFlow, Hugging Face, or scikit-learn.
- Strong proficiency in Python and experience with data processing tools (Pandas, NumPy, Dask).
- Experience deploying models to production using containerization technologies (Docker, Kubernetes) and cloud platforms (AWS, Azure, GCP).
- Familiarity with version control systems (Git) and continuous integration/continuous deployment (CI/CD) pipelines.
- Background in computer science, including understanding of algorithms, data structures, and software engineering best practices.
- Strong analytical thinking, problem-solving skills, and a passion for tackling challenging issues in document automation and compliance workflows.
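The core task above is extracting structured fields from documents. As a rough, toy illustration of the idea (the field names, patterns, and sample text are invented for the example; a production system at this level would use trained NLP models rather than regexes):

```python
import re

# Hypothetical patterns for a few invoice fields; illustration only.
PATTERNS = {
    "invoice_number": re.compile(r"Invoice\s*(?:No\.?|Number)[:\s]+([A-Z0-9-]+)"),
    "total": re.compile(r"Total\s*(?:Due)?[:\s]+\$?([\d,]+\.\d{2})"),
    "date": re.compile(r"Date[:\s]+(\d{4}-\d{2}-\d{2})"),
}

def extract_fields(text: str) -> dict:
    """Return a dict of whichever fields the patterns can find."""
    out = {}
    for name, pattern in PATTERNS.items():
        match = pattern.search(text)
        if match:
            out[name] = match.group(1)
    return out

doc = "Invoice No: INV-2024-001\nDate: 2024-05-01\nTotal Due: $1,250.00"
print(extract_fields(doc))
```

The rule-based version breaks down on noisy layouts, which is exactly where the learned models the posting describes earn their keep.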

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

WHO WE ARE:
EOS IT Solutions is a Global Technology and Logistics company, providing Collaboration and Business IT Support services to some of the world's largest industry leaders, delivering forward-thinking solutions based on multi-domain architecture. Customer satisfaction and commitment to superior quality of service are our top business priorities, along with investing in and supporting our partners and employees. We are a true international IT provider and are proud to deliver our services through global simplicity with trusted transparency.

POSITION OVERVIEW:
We are seeking a highly motivated and detail-oriented Supply Chain Analyst with proficiency in SQL to join our dynamic team. The ideal candidate will play a crucial role in optimizing our supply chain processes by leveraging data analysis and insights, and will be responsible for collecting, analyzing, and interpreting data to support strategic decision-making within the supply chain department.

WHAT YOU'LL DO:
Data Analysis:
- Utilize SQL to extract, transform, and analyze large datasets related to supply chain operations.
- Generate reports and dashboards to provide insights into key performance indicators (KPIs) such as inventory levels, order fulfillment, and supplier performance.
- Identify trends, anomalies, and opportunities for process improvement through data-driven analysis.
Forecasting and Demand Planning:
- Collaborate with cross-functional teams to develop accurate demand forecasts based on historical data, market trends, and other relevant factors.
- Assist in the development and maintenance of inventory models to optimize stock levels and minimize shortages or excess inventory.
Supplier Relationship Management:
- Evaluate supplier performance through data analysis and develop strategies to enhance relationships and improve overall supply chain efficiency.
- Monitor supplier lead times, delivery performance, and quality metrics to ensure alignment with organizational goals.
Process Optimization:
- Identify areas for process improvement within the supply chain, utilizing data insights to streamline operations and reduce costs.
- Work closely with stakeholders to implement changes and monitor the impact on key performance metrics.
Collaboration:
- Collaborate with cross-functional teams including procurement, logistics, and operations to ensure seamless coordination and communication across the supply chain network.
- Provide analytical support for strategic decision-making and participate in regular meetings to discuss supply chain performance and improvement initiatives.

WHAT YOU'LL NEED TO SUCCEED:
- Bachelor's degree in Supply Chain Management, Business Analytics, or a related field.
- Proven experience as a Supply Chain Analyst with a strong emphasis on data analysis.
- Proficiency in SQL for data extraction, transformation, and analysis.
- Familiarity with data visualization tools (e.g., Tableau, Power BI) is a plus.
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication and interpersonal skills.
- Ability to work collaboratively in a team environment.
- Knowledge of supply chain concepts, processes, and best practices.

EOS is committed to creating a diverse and inclusive work environment and is proud to be an equal opportunity employer. We invite you to consider opportunities at EOS regardless of your gender; gender identity; gender reassignment; age; religious or similar philosophical belief; race; national origin; political opinion; sexual orientation; disability; marital or civil partnership status; or other non-merit factor.
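The SQL-driven KPI work described above can be sketched in a few lines; this toy example (table, column names, and data all invented, run here against in-memory SQLite rather than a real warehouse) computes a supplier on-time delivery rate, one of the supplier-performance KPIs the role mentions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE deliveries (
    supplier   TEXT,
    promised   TEXT,   -- promised delivery date (ISO format)
    delivered  TEXT    -- actual delivery date
);
INSERT INTO deliveries VALUES
    ('Acme', '2024-05-01', '2024-05-01'),
    ('Acme', '2024-05-10', '2024-05-12'),
    ('Bolt', '2024-05-03', '2024-05-02');
""")

# On-time delivery rate per supplier: fraction of deliveries
# that arrived on or before the promised date.
rows = conn.execute("""
    SELECT supplier,
           AVG(CASE WHEN delivered <= promised THEN 1.0 ELSE 0.0 END) AS on_time_rate
    FROM deliveries
    GROUP BY supplier
    ORDER BY supplier
""").fetchall()
print(rows)  # [('Acme', 0.5), ('Bolt', 1.0)]
```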

Posted 1 week ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

We are looking for a developer to join our product dev team. We are a web-based SaaS product with a universal (iOS/Android) mobile app, and we also use AI/ML for entity extraction and a rules engine. Your core focus will be software development, unit testing, and defect fixing.

PYTHON DEV RESPONSIBILITIES
1. Participate in requirement gathering and design discussions.
2. Develop software: web-based APIs, database queries (ORM/MySQL), etc.
3. Perform unit testing and address defects and change requests (CRs).
4. Resolve defects and other issues with the QA team and SME.

PYTHON DEV REQUIREMENTS
1. Minimum 2 years' experience in a Python framework (Flask preferable).
2. RESTful API development.
3. Third-party API integration.
4. Hands-on experience with an ORM (SQLAlchemy preferred).
5. Raw queries in MySQL.
6. Good analytical skills, communication skills, and a team player.
7. Knowledge of Python 3 preferred.
8. Knowledge of accounting or finance domain SaaS products preferred.
9. Knowledge of React Native preferred.
10. Knowledge of a version control system (viz. Git) preferred.
11. Bachelor's degree in computer science or engineering preferred.

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

BIM Engineer - 2D CAD / 3D Tekla, Marine
Chennai (Hybrid) | Permanent

Do you want to join one of the leading design consultancies? Can you add to our success by producing high-quality 2D CAD work and 3D BIM models in Tekla for marine and heavy-infrastructure projects, while also developing your career and reputation? Do you hold a degree in Civil Engineering? Then join us in designing the future.

Join our Marine team in Chennai
The Marine Department of COWI India undertakes projects from both international and domestic markets. It offers a comprehensive range of consulting services related to the development of ports, harbours, coastal and marine structures, and heavy civil infrastructure. In this position, you will join our global team of experts and work closely with the Head of the Section. Your primary task in our team will be the production of high-quality 2D models in AutoCAD and BIM models in Tekla for marine structures, including piled jetties, retaining structures such as diaphragm walls, sheet pile walls, embankments, steel structures, breakwaters, and revetments.

On a day-to-day basis, you will:
- Be involved in clash detection and coordination of 2D/3D models prepared in various software (Tekla/Revit) with Navisworks and other coordination software.
- Understand the BIM execution plan and implement it on the project.
- Coordinate with internal and external design engineers for the information required to prepare BIM and generate drawings and quantities.
- Perform and document quality checks at each stage of submission drawings and quantities/BOQ.
- Be responsible for the estimation, planning, and tracking of the project.
- Develop and implement standard procedures suitable for a marine BIM workflow.
- Interface with concrete 2D AutoCAD / 3D Tekla modellers across geographies.
- Perform quality checks on 3D/2D models and drawings to ensure accuracy and adherence to COWI standards.
- Identify and resolve design and modelling issues as they arise during the design process.

Your Skills. Our Team. Together, we design the future.
The first step to success in this role is being eager to collaborate with the people around you, whether they are colleagues, partners, or customers. You develop ties with others by acting respectfully and delivering on your promises. And you never get set in your ways, but keep exploring new insights and ways to improve. Furthermore, you'll:
- Hold a B.Tech in Civil Engineering and come with 5-10 years of total work experience, with at least 5 years in civil 2D AutoCAD and Tekla.
- Have experience in model preparation and extraction of 2D drawings using Civil 3D and AutoCAD.
- Have strong skills in using Navisworks and InfraWorks and in creating parametric subassemblies.
- Have experience in the BIM process: creating and updating projects in BIM and updating model contents to support ongoing designs.

A place to work and so much more
At COWI, we work together with our customers to shape a sustainable and liveable world. We do it by applying our knowledge and curiosity, and sometimes even our courage, to create the solutions the world needs today to enable a better tomorrow. That is why we say no to fossil-based projects and aspire to have 100 percent of our revenue come from activities that move our customers toward sustainability. We value differences and development and cultivate an environment of belonging and having fun. Because that is what brings out the best in you, at work and at home. With offices primarily located in Scandinavia, the UK, North America, and India, we are currently 7,500 people who bring their expertise in engineering, architecture, energy, and environment into play. Get to know us even better at our website, www.cowi.com, where you can learn more about our projects, our strategy, what we want to achieve, and what life is like at COWI.

Equal opportunity employer
COWI provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to ethnicity, colour, religion, gender, national origin, age, or disability. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.

Posted 1 week ago

Apply

3.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Responsibility:
Generative AI experience is a must. We are looking for a data scientist who will help us discover the information hidden in vast amounts of data and help us make smarter decisions to deliver AI/ML-based enterprise software products.
• Develop solutions related to machine learning, natural language processing, deep learning, and generative AI to address business needs.
• Your primary focus will be applying language/vision techniques, developing LLM-based applications, and building high-quality prediction systems.
• Analyze data: collaborate with cross-functional teams to understand data requirements and identify relevant data sources. Analyze and preprocess data to extract valuable insights and ensure data quality.
• Evaluation and optimization: evaluate model performance using appropriate metrics and iterate on solutions to enhance performance and accuracy. Continuously optimize algorithms and models to adapt to evolving business requirements.
• Documentation and reporting: document methodologies, findings, and outcomes in clear and concise reports. Communicate results effectively to technical and non-technical stakeholders.

Work experience background required:
• Experience building software from the ground up in a corporate or startup environment.

Essential skillsets required:
• 3-6 years' experience in software development.
• Educational background: strong computer science and math/statistics.
• Experience with open-source LLMs and the LangChain framework, and designing efficient prompts for LLMs.
• Proven ability with NLP and text-based extraction techniques.
• Experience in generative AI technologies, such as diffusion and/or language models.
• Excellent understanding of machine learning techniques and algorithms, such as k-NN, Naive Bayes, SVM, Decision Forests, etc.
• Familiarity with cloud computing platforms such as GCP or AWS, and experience deploying and monitoring models in a cloud environment.
• Experience with common data science toolkits, such as NumPy and Pandas.
• Proficiency in query languages such as SQL.
• Good applied statistics skills, such as distributions, statistical testing, regression, etc.
• Experience working with large datasets, along with data modeling, language development, and database technologies.
• Knowledge of machine learning and deep learning frameworks (e.g., TensorFlow, Keras, scikit-learn, CNTK, or PyTorch), NLP, recommender systems, personalization, segmentation, microservices architecture, and API development.
• Ability to adapt to a fast-paced, dynamic work environment and learn new technologies quickly.
• Excellent verbal and written communication skills.
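Among the classical algorithms the posting lists, k-NN is the simplest to show end to end. A minimal sketch with toy 2-D points (data invented for the example; real work would use scikit-learn):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of ((x, y), label); returns the majority label
    among the k training points nearest to query."""
    neighbours = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_predict(train, (0.5, 0.5)))  # a
print(knn_predict(train, (5.5, 5.5)))  # b
```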

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

Looking for a highly versatile AI engineer who can manage data ingestion and tagging, and build an LLM-powered knowledge system end to end.

Job Role: AI Automation Engineer
Job Type: Part-time
Work Mode: Remote

Responsibilities:
- Build a secure, centralized "data room" from files (emails, docs, drive folders, etc.)
- Clean, tag, and index documents using embeddings + a vector database (e.g., Chroma)
- Implement a simple ChatGPT-style Q&A interface over the data
- Handle basic transcription + summarization for select video/audio
- Ensure compatibility across desktop, tablet, and mobile
- Prioritize privacy and human-in-the-loop validation

Ideal Skills:
- LangChain or LlamaIndex, OpenAI/Gemini embeddings
- FastAPI or Node.js backend experience
- Dropbox / Google Drive API integrations
- Email + document parsing, metadata extraction
- Bonus: Streamlit or lightweight UI prototyping
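The retrieval step at the heart of the embeddings + vector-database Q&A system described above reduces to nearest-neighbour search by cosine similarity. A toy sketch (the 3-d vectors and document names are invented; a real build would use learned embeddings and a store such as Chroma):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Pretend these are embedding vectors for indexed documents.
docs = {
    "refund policy":  [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.2],
    "privacy notice": [0.0, 0.2, 0.9],
}

def top_match(query_vec):
    """Return the document whose embedding is most similar to the query."""
    return max(docs, key=lambda name: cosine(docs[name], query_vec))

print(top_match([0.8, 0.2, 0.1]))  # refund policy
```

In the full pipeline the retrieved document is then passed to the LLM as context for answering the user's question.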

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

Job Title: SQL Developer Trainee
Location: Remote
Job Type: Internship (Full-Time)
Duration: 1–3 Months
Stipend: ₹25,000/month
Department: Data & Engineering

Job Summary:
We are seeking a detail-oriented and motivated SQL Developer Trainee to join our team remotely. This internship is designed for recent graduates or students who want to gain practical experience in database development, writing SQL queries, and working with data in real-world applications.

Key Responsibilities:
- Write, test, and optimize SQL queries for data extraction and reporting
- Assist in designing and maintaining database structures (tables, views, indexes, etc.)
- Help ensure data integrity, accuracy, and security across systems
- Support the team in troubleshooting and debugging database-related issues
- Collaborate with developers and analysts to fulfill data requirements for projects
- Document query logic and database-related processes

Qualifications:
- Bachelor's degree (or final-year student) in Computer Science, Information Technology, or a related field
- Strong understanding of SQL and relational databases (e.g., MySQL, PostgreSQL, SQL Server)
- Familiarity with database design and normalization
- Analytical mindset with good problem-solving skills
- Ability to work independently in a remote setting
- Eagerness to learn and grow in a data-driven environment

Preferred Skills (Nice to Have):
- Experience with procedures, triggers, or functions in SQL
- Exposure to BI/reporting tools (Power BI, Tableau, etc.)
- Understanding of data warehousing concepts
- Familiarity with cloud-based databases or platforms

What We Offer:
- Monthly stipend of ₹25,000
- Remote work opportunity
- Hands-on experience with real-world datasets and projects
- Mentorship and structured learning sessions
- Certificate of Completion
- Potential for full-time employment based on performance
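The "tables, views, indexes" part of the responsibilities above fits in one small script; this sketch (schema and data invented, using in-memory SQLite as a stand-in for MySQL/PostgreSQL/SQL Server) creates all three and then reports through the view:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL);
-- Index to speed up lookups and grouping by customer.
CREATE INDEX idx_orders_customer ON orders(customer);
-- A reporting view: total spend per customer.
CREATE VIEW customer_totals AS
    SELECT customer, SUM(amount) AS total
    FROM orders GROUP BY customer;
INSERT INTO orders (customer, amount) VALUES
    ('alice', 10.0), ('bob', 5.0), ('alice', 7.5);
""")

rows = conn.execute(
    "SELECT customer, total FROM customer_totals ORDER BY customer").fetchall()
print(rows)  # [('alice', 17.5), ('bob', 5.0)]
```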

Posted 1 week ago

Apply


4.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Source: LinkedIn

We're Hiring: Computer Vision Engineer
Location: India (Remote/Hybrid)
Type: Full-Time | Applied AI R&D
Experience: Minimum 4 years
Note: This opportunity is for experienced professionals only. If you're a fresher or currently pursuing your degree, we appreciate your interest, but this role is not a fit at the moment.

Company Overview
Weai Labs is an India-based AI/ML research and development company focused on accelerating the transformation of cutting-edge ideas into real-world solutions. We partner with enterprises, research institutions, startups, and universities to build impactful AI products. Our mission is to bridge the gap between innovation and deployment in the AI/ML space. We are looking for a talented and experienced Computer Vision Engineer to join our growing team. This role is ideal for someone passionate about building applied AI systems that make a tangible difference.

About the Role: Computer Vision Engineer
In this role, you will lead the design and implementation of advanced computer vision models for image and video analysis. You'll contribute directly to commercial products involving object detection, keypoint-based scoring, biometric estimation, and real-time tracking.

Key Responsibilities:
- Design and develop computer vision models for:
  - Visual attribute estimation (classification/regression tasks)
  - Scoring using keypoint detection and geometric features
  - Object tracking and identification across sequences
- Build and train deep learning models using PyTorch or TensorFlow
- Apply advanced techniques including:
  - Object detection (YOLO, Detectron2, etc.)
  - Keypoint estimation (MediaPipe, DeepLabCut, etc.)
  - Similarity learning (Siamese networks or related architectures)
- Collaborate with engineering teams to define system requirements and integrate AI models into cloud-based platforms
- Contribute to optimization and deployment pipelines using OpenCV, NumPy, and cloud compute resources

Minimum Qualifications:
- Bachelor's or Master's degree in Engineering, Computer Science, or a related field with a focus on Computer Vision or Machine Learning
- Minimum of 4 years of hands-on experience in deep learning and computer vision
- Proficiency in Python and experience with frameworks like PyTorch and TensorFlow
- Solid understanding of object detection, classification, and visual feature extraction
- Experience with image processing tools such as OpenCV
- Familiarity with biometric matching or similarity-based recognition systems

Preferred Qualifications:
- Experience building production-ready AI systems in a cloud or SaaS environment
- Familiarity with keypoint tracking, statistical scoring systems, or visual measurement techniques
- Exposure to edge or embedded vision systems
- Domain experience in areas such as medical imaging, agriculture, sports analytics, or wildlife monitoring

Why Join Us
At Weai Labs, you'll be part of a mission-driven team dedicated to solving real-world problems with cutting-edge AI. This is an opportunity to work on high-impact projects that integrate science, engineering, and scalable technology.

To Apply:
Submit your resume or connect with us here on LinkedIn to know more.
jobs@weailabs.com
+91 8072457947 (WhatsApp only)
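"Scoring using keypoint detection and geometric features" typically means deriving measurements from detected landmarks. A small sketch of one such geometric feature (the coordinates are invented; in practice the keypoints would come from a detector such as MediaPipe, e.g. shoulder-elbow-wrist for a pose-scoring system):

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at keypoint b, formed by segments b->a and b->c.
    Points are (x, y) tuples; result is in [0, 180]."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0])
        - math.atan2(a[1] - b[1], a[0] - b[0]))
    ang = abs(ang)
    return 360 - ang if ang > 180 else ang

# Right angle: points above and to the right of the joint.
print(joint_angle((0, 1), (0, 0), (1, 0)))  # 90.0
```

A scoring system would compare angles like this against reference ranges rather than use them raw.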

Posted 1 week ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Join us as a Data Analyst
- Take on a new challenge in a cutting-edge data team, in which you'll manage the analysis of complex bank, franchise, or function data to identify business issues and opportunities
- We'll look to you to provide high-quality analytical input to help develop and implement innovative processes and resolve problems across the bank
- This is a hands-on role in which you'll hone your data analysis expertise and gain valuable experience in a dynamic area of our business
- We're offering this role at vice president level

What you'll do
As a Data Analyst, you'll play a key role in supporting the delivery of high-quality business solutions. You'll be performing data extraction, storage, manipulation, processing, and analysis, alongside developing and performing standard queries to ensure data quality and identify data inconsistencies and missing data.

Day-to-day, you'll also be:
- Collecting, profiling, and mapping appropriate data to use in new or existing solutions, as well as for ongoing data activities
- Identifying and documenting data migration paths and processes, and standardising data naming, data definitions, and modelling
- Interpreting customer needs and turning them into functional or data requirements and process models
- Building and maintaining collaborative partnerships with key business stakeholders

The skills you'll need
We're looking for someone with at least 10 years of experience using data analysis tools and delivering data analysis in a technology or IT function. We'll also look for:
- An in-depth understanding of the interrelationships of data and multiple data domains
- Experience in data analysis of complex organisational, franchise, or function data to identify business issues and opportunities
- A background in delivering research based on qualitative and quantitative data across a range of subjects
- Excellent communication and interpersonal skills
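The "standard queries to ensure data quality and identify ... missing data" mentioned above are usually simple aggregate checks. A toy sketch (table, columns, and data invented, run against in-memory SQLite):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER, branch TEXT, balance REAL);
INSERT INTO accounts VALUES
    (1, 'north', 100.0),
    (2, NULL,    250.0),
    (3, 'south', NULL);
""")

# Profile missing data: total rows and NULL counts per column.
row = conn.execute("""
    SELECT COUNT(*) AS total,
           SUM(CASE WHEN branch  IS NULL THEN 1 ELSE 0 END) AS missing_branch,
           SUM(CASE WHEN balance IS NULL THEN 1 ELSE 0 END) AS missing_balance
    FROM accounts
""").fetchone()
print(row)  # (3, 1, 1)
```

Run routinely, checks like this flag inconsistencies before they propagate into downstream analysis.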

Posted 1 week ago

Apply

0.0 - 5.0 years

0 Lacs

Pune, Maharashtra

On-site

Source: Indeed

JOB DESCRIPTION:
Role: Data Scientist / Deep Learning Engineer
Location: Senapati Bapat Road, Pune
Experience: 5-10 Years

General Summary of the Role:
- Develop and optimize computer vision models for object detection (YOLO, Faster R-CNN, SSD) and image classification (ResNet, MobileNet, EfficientNet, ViTs).
- Work with OCR technologies (Tesseract, EasyOCR, CRNN, TrOCR) for text extraction from images.
- Work with PyTorch, TensorFlow, and OpenCV for deep learning and image processing.
- Implement sequence-based models (RNNs, LSTMs, GRUs) for vision tasks.
- Optimize software for real-time performance on multiple platforms.
- Implement and deploy AI models via Flask/FastAPI and integrate with SQL/NoSQL databases.
- Use Git/GitHub for version control and team collaboration.
- Apply ML algorithms (regression, decision trees, clustering) as needed.
- Review code, mentor team members, and enhance model efficiency.
- Stay updated with advancements in deep learning and multimodal AI.

Required Skills & Qualifications:
- Python proficiency for AI development.
- Experience with PyTorch, TensorFlow, and OpenCV.
- Knowledge of object detection (YOLO, Faster R-CNN, SSD) and image classification (ResNet, MobileNet, EfficientNet, ViTs).
- Experience with OCR technologies (Tesseract, EasyOCR, CRNN, TrOCR).
- Experience with RNNs, LSTMs, and GRUs for sequence-based tasks.
- Experience with Generative Adversarial Networks (GANs) and diffusion models for image generation.
- Familiarity with REST APIs (Flask/FastAPI) and SQL/NoSQL databases.
- Strong problem-solving and real-time AI optimization skills.
- Experience with Git/GitHub for version control.
- Knowledge of Docker, Kubernetes, and model deployment at scale on serverless and on-prem platforms.
- Understanding of vector databases (FAISS, Milvus).

Preferred Qualifications:
- Experience with cloud platforms (AWS, GCP, Azure).
- Experience with Vision Transformers (ViTs) and generative AI (GANs, Stable Diffusion, LMMs).
- Familiarity with frontend technologies.

Job Types: Full-time, Permanent
Pay: ₹1,800,000.00 - ₹2,500,000.00 per year
Benefits: Health insurance, paid sick time, paid time off, Provident Fund
Schedule: Day shift, Monday to Friday

Application Question(s):
- What is your notice period?
- What's your current CTC?

Experience:
- Deep learning / data scientist: 5 years (Required)
- PyTorch, Python: 5 years (Required)
- TensorFlow: 5 years (Required)
- Cloud platforms (AWS, GCP, Azure): 5 years (Required)
- Vision Transformers (ViTs) and generative AI: 5 years (Required)
- Video processing & computer vision: 5 years (Required)

Location: Pune, Maharashtra (Required)
Work Location: In person
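A common preprocessing step before OCR engines like Tesseract is binarization: thresholding a grayscale image to pure black and white. A toy sketch of the idea (here a hand-written 3x3 list of pixel values stands in for a real image loaded with OpenCV):

```python
def binarize(image, threshold=128):
    """Map each grayscale pixel to 0 (dark) or 255 (light)."""
    return [[255 if px >= threshold else 0 for px in row] for row in image]

gray = [[ 12, 200, 130],
        [255,  90,  40],
        [128, 127, 250]]
print(binarize(gray))  # [[0, 255, 255], [255, 0, 0], [255, 0, 255]]
```

In practice, adaptive thresholding (e.g., OpenCV's `cv2.adaptiveThreshold`) handles uneven lighting far better than this single global cutoff.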

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

About Lead India:
Lead India is a forward-thinking IT company offering end-to-end digital solutions, software development, and data-driven services. We believe in nurturing fresh talent and giving them the opportunity to work on meaningful, real-world projects in a supportive remote environment.

Internship Overview:
We are seeking a motivated and detail-oriented SQL Developer Intern to join our team. In this role, you'll be working with databases, writing and optimizing SQL queries, and contributing to the development and maintenance of our data systems.

Key Responsibilities:
- Write, test, and optimize SQL queries and stored procedures
- Assist in the design and maintenance of database schemas
- Perform data extraction, transformation, and loading (ETL) tasks
- Collaborate with the data and development teams to troubleshoot and improve database performance
- Document database processes and ensure data accuracy and integrity
- Support reporting needs with custom queries and data exports

Required Skills:
- Good understanding of SQL and relational databases (MySQL, PostgreSQL, SQL Server, etc.)
- Basic knowledge of database design and normalization
- Familiarity with tools like MySQL Workbench, SSMS, or pgAdmin
- Strong problem-solving and analytical skills
- Attention to detail and the ability to work independently in a remote setup

Nice to Have:
- Exposure to ETL tools or processes
- Basic knowledge of performance tuning and query optimization
- Understanding of data warehousing concepts
- Experience with scripting languages (Python, Shell) for automation

What You'll Gain:
- Practical experience working on real SQL/database projects
- Guidance from experienced developers and mentors
- Remote work flexibility
- Internship certificate and Letter of Recommendation
- Opportunity for a full-time role or pre-placement offer (based on performance)
- Salary of up to ₹25,000 per month
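The extract-transform-load tasks listed above can be shown end to end in miniature; this sketch (data, schema, and cleaning rules all invented) extracts rows from a CSV, normalizes the names, and loads them into in-memory SQLite:

```python
import csv
import io
import sqlite3

raw = "name,signup\nAlice ,2024-01-05\n bob,2024-02-10\n"

# Extract: parse the CSV into dict rows.
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: trim whitespace and normalize case.
cleaned = [(r["name"].strip().lower(), r["signup"]) for r in rows]

# Load: insert into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, signup TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", cleaned)

print(conn.execute("SELECT name FROM users ORDER BY name").fetchall())
```

Production ETL adds error handling, idempotent loads, and scheduling, but the three-stage shape is the same.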

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

Position: Data Analyst Intern (Full-Time)
Company: Lead India
Location: Remote
Stipend: ₹25,000/month
Duration: 1–3 months (Full-Time Internship)

About Lead India:
Lead India is a forward-thinking technology company that helps businesses make smarter decisions through data. We provide meaningful internship opportunities for emerging professionals to gain real-world experience in data analysis, reporting, and decision-making.

Role Overview:
We are seeking a Data Analyst Intern to support our data and product teams in gathering, analyzing, and visualizing business data. This internship is ideal for individuals who enjoy working with numbers, identifying trends, and turning data into actionable insights.

Key Responsibilities:
- Analyze large datasets to uncover patterns, trends, and insights
- Create dashboards and reports using tools like Excel, Power BI, or Tableau
- Write and optimize SQL queries for data extraction and analysis
- Assist in data cleaning, preprocessing, and validation
- Collaborate with cross-functional teams to support data-driven decisions
- Document findings and present insights to stakeholders

Skills We're Looking For:
- Strong analytical and problem-solving skills
- Basic knowledge of SQL and data visualization tools (Power BI, Tableau, or Excel)
- Familiarity with Python for data analysis (pandas, matplotlib) is a plus
- Good communication and presentation skills
- Detail-oriented with a willingness to learn and grow

What You'll Gain:
- ₹25,000/month stipend
- Real-world experience in data analysis and reporting
- Mentorship from experienced analysts and developers
- Remote-first, collaborative work environment
- Potential for a Pre-Placement Offer (PPO) based on performance

Posted 1 week ago

Apply



0 years

0 Lacs

India

Remote

Linkedin logo

About Lead India: Lead India is a forward-thinking IT company offering end-to-end digital solutions, software development, and data-driven services. We believe in nurturing fresh talent and giving them the opportunity to work on meaningful, real-world projects in a supportive remote environment.

Internship Overview: We are seeking a motivated and detail-oriented SQL Developer Intern to join our team. In this role, you'll be working with databases, writing and optimizing SQL queries, and contributing to the development and maintenance of our data systems.

Key Responsibilities:
• Write, test, and optimize SQL queries and stored procedures
• Assist in the design and maintenance of database schemas
• Perform data extraction, transformation, and loading (ETL) tasks
• Collaborate with the data and development teams to troubleshoot and improve database performance
• Document database processes and ensure data accuracy and integrity
• Support reporting needs with custom queries and data exports

Required Skills:
• Good understanding of SQL and relational databases (MySQL, PostgreSQL, SQL Server, etc.)
• Basic knowledge of database design and normalization
• Familiarity with tools like MySQL Workbench, SSMS, or pgAdmin
• Strong problem-solving and analytical skills
• Attention to detail and ability to work independently in a remote setup

Nice to Have:
• Exposure to ETL tools or processes
• Basic knowledge of performance tuning and query optimization
• Understanding of data warehousing concepts
• Experience with scripting languages (Python, Shell) for automation

What You'll Gain:
• Practical experience working on real SQL/database projects
• Guidance from experienced developers and mentors
• Remote work flexibility
• Internship certificate and Letter of Recommendation
• Opportunity for a full-time role or pre-placement offer (based on performance)
• A stipend of up to ₹25,000 per month
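For the query optimization and performance tuning skills listed above, a small sketch of how an index changes a query plan, again using Python's built-in sqlite3 with a hypothetical users table (the schema is invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO users (email, city) VALUES (?, ?)",
    [(f"user{i}@example.com", "Pune" if i % 2 else "Kochi") for i in range(1000)],
)

# Without an index, this point lookup scans the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM users WHERE email = ?",
    ("user42@example.com",),
).fetchall()

# Adding an index on the filtered column turns the scan into an index search.
conn.execute("CREATE INDEX idx_users_email ON users(email)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM users WHERE email = ?",
    ("user42@example.com",),
).fetchall()

# The last column of each plan row is a human-readable description.
print(plan_before[-1][-1])
print(plan_after[-1][-1])
```

The exact plan wording varies across SQLite versions, but the before/after contrast (SCAN versus SEARCH ... USING INDEX) is the same diagnostic an intern would read in MySQL's `EXPLAIN` or PostgreSQL's `EXPLAIN ANALYZE`.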

Posted 1 week ago

Apply

4.0 years

0 Lacs

India

On-site

Linkedin logo

Mandatory Skills: Azure Cloud Technologies, Azure Data Factory, Azure Databricks (Advanced Knowledge), PySpark, CI/CD Pipeline (Jenkins, GitLab CI/CD, or Azure DevOps), Data Ingestion, SQL

Seeking a skilled Data Engineer with expertise in Azure cloud technologies, data pipelines, and big data processing. The ideal candidate will be responsible for designing, developing, and optimizing scalable data solutions.

Responsibilities

Azure Databricks and Azure Data Factory Expertise:
• Demonstrate proficiency in designing, implementing, and optimizing data workflows using Azure Databricks and Azure Data Factory.
• Provide expertise in configuring and managing data pipelines within the Azure cloud environment.

PySpark Proficiency:
• Possess a strong command of PySpark for data processing and analysis.
• Develop and optimize PySpark code to ensure efficient and scalable data transformations.

Big Data & CI/CD Experience:
• Troubleshoot and optimize data processing tasks on large datasets.
• Design and implement automated CI/CD pipelines for data workflows, using tools like Jenkins, GitLab CI/CD, or Azure DevOps to automate the building, testing, and deployment of data pipelines.

Data Pipeline Development & Deployment:
• Design, implement, and maintain end-to-end data pipelines for various data sources and destinations.
• Write unit tests for individual components, integration tests to ensure that different components work together correctly, and end-to-end tests to verify the entire pipeline's functionality.
• Familiarity with GitHub/repositories for deployment of code.
• Ensure data quality, integrity, and reliability throughout the entire data pipeline.

Extraction, Ingestion, and Consumption Frameworks:
• Develop frameworks for efficient data extraction, ingestion, and consumption.
• Implement best practices for data integration and ensure seamless data flow across the organization.

Collaboration and Communication:
• Collaborate with cross-functional teams to understand data requirements and deliver scalable solutions.
• Communicate effectively with stakeholders to gather and clarify data-related requirements.

Requirements
• Bachelor's or master's degree in Computer Science, Data Engineering, or a related field.
• 4+ years of relevant hands-on experience in data engineering with Azure cloud services and advanced Databricks.
• Strong analytical and problem-solving skills in handling large-scale data pipelines.
• Experience in big data processing and working with structured & unstructured datasets.
• Expertise in designing and implementing data pipelines for ETL workflows.
• Strong proficiency in writing optimized queries and working with relational databases.
• Experience in developing data transformation scripts and managing big data processing using PySpark.

Skills: SQL, Azure, Azure Databricks, PySpark, Data Ingestion, Azure Cloud Technologies, Azure Data Factory, CI/CD Pipelines (Jenkins, GitLab CI/CD, or Azure DevOps), Azure Databricks (Advanced Knowledge)
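The extraction, ingestion, and consumption framework described above can be sketched in miniature as three plain-Python stages. The raw feed, column names, and cleansing rule are invented for illustration; a real pipeline on this stack would pull from ADLS/Blob storage via ADF and run the transforms in PySpark:

```python
import csv
import io
import json

# Hypothetical raw feed; stands in for a file landed by an ingestion job.
RAW_CSV = "order_id,amount,currency\n1,100.5,INR\n2,,INR\n3,250.0,USD\n"

def extract(source: str) -> list[dict]:
    """Extraction stage: parse the raw feed into records."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(records: list[dict]) -> list[dict]:
    """Ingestion-side cleansing: drop rows with missing amounts, cast types."""
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "currency": r["currency"]}
        for r in records
        if r["amount"]  # empty string means the amount was missing
    ]

def load(records: list[dict]) -> str:
    """Consumption stage: serialise curated records for downstream use."""
    return json.dumps(records)

curated = transform(extract(RAW_CSV))
print(load(curated))
```

Keeping the three stages as separate functions mirrors the framework idea in the JD: each stage can be unit-tested on its own, and the composition becomes the end-to-end test.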

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chakan, Pune, Maharashtra

On-site

Indeed logo

Job Title: Python Engineer – Vision-Based Projects
Experience: 3+ Years
Location: Pune
Employment Type: Full-time

Job Summary: We are looking for a skilled and passionate Python Engineer with hands-on experience in vision-based projects such as object detection, image classification, and video analytics. The ideal candidate should have a solid understanding of computer vision libraries, machine learning techniques, and real-time data processing.

Key Responsibilities:
• Design, develop, and optimize computer vision algorithms using Python.
• Work on vision-based applications including object detection, image segmentation, and video analysis.
• Implement and train deep learning models using frameworks like TensorFlow, PyTorch, or OpenCV.
• Integrate computer vision solutions into real-time systems or cloud environments.
• Collaborate with cross-functional teams including data scientists, backend engineers, and product managers.
• Maintain and improve code quality, testing, and documentation.
• Stay updated with the latest trends and advancements in vision-based AI/ML solutions.

Required Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• Minimum 3 years of experience in Python programming.
• Strong experience with computer vision libraries: OpenCV, Pillow, scikit-image, etc.
• Experience with deep learning frameworks: TensorFlow, Keras, or PyTorch.
• Proficiency in image and video processing, feature extraction, and real-time inference.
• Experience with REST APIs and cloud deployment (AWS/GCP) is a plus.
• Familiarity with Git, Docker, and CI/CD pipelines.

Good to Have:
• Experience with edge devices like NVIDIA Jetson, Raspberry Pi, etc.
• Knowledge of classical machine learning (SVM, Random Forests, etc.).
• Understanding of calibration, tracking, and 3D vision techniques.

Job Type: Full-time
Pay: ₹300,000.00 - ₹900,000.00 per year
Benefits: Flexible schedule
Location Type: In-person
Schedule: Day shift, Monday to Friday
Ability to commute/relocate: Chakan, Pune, Maharashtra: Reliably commute or planning to relocate before starting work (Preferred)
Education: Bachelor's (Preferred)
Experience: Software deployment: 3 years (Required)
Work Location: In person
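As a toy illustration of the feature extraction and segmentation work described above, here is a dependency-free sketch on a hand-written 4x4 grayscale "image"; production code for this role would use OpenCV or NumPy arrays, and the image values here are invented:

```python
# A 4x4 grayscale image as nested lists of 0-255 intensities.
image = [
    [0, 0, 128, 255],
    [0, 64, 128, 255],
    [64, 64, 192, 255],
    [64, 128, 192, 255],
]

def histogram(img, bins=4):
    """Bin pixel intensities into a coarse histogram feature vector."""
    counts = [0] * bins
    for row in img:
        for px in row:
            counts[min(px * bins // 256, bins - 1)] += 1
    return counts

def threshold(img, t=128):
    """Simple binary segmentation: 1 where intensity >= t, else 0."""
    return [[1 if px >= t else 0 for px in row] for row in img]

feat = histogram(image)   # a classical, model-free feature vector
mask = threshold(image)   # the crudest possible segmentation mask
print(feat)  # [3, 4, 3, 6]
```

Intensity histograms and fixed thresholds are the classical baselines that deep models (and OpenCV's `calcHist`/`threshold`) generalise; writing them out once makes the library versions much less opaque.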

Posted 1 week ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Total Experience: 0–6 Months

What we want you to do:
• Work with diagnostic lab partners for smooth functioning of infectious disease tests
• Work with partners to ensure timely upgrades to the test are done
• Liaise with partners for technical support as needed
• Document activities

What we are looking for in you:
• MSc in Microbiology or Biotechnology
• Experience in DNA extraction & RT-PCR
• Ability to quickly prioritize and execute tasks
• Good oral and written communication skills
• Strong analytical skills and attention to detail
• Strong documentation skills
• Compulsory rotational shifts (day shift, mid shift, night shift)
• 6 days working with rotational week off

What you will gain:
• Exposure to working with one of the leading companies in genomics
• Experience working with advanced sequencing technologies in the diagnostic industry, i.e., NGS, WGS, Nanopore, and Illumina
• Skills in troubleshooting during sequencing

Skills: data management, biotechnology, Word, DNA extraction, communication skills, data entry, NGS, organizational skills, RNA isolation, Microsoft Office, analytical skills, PowerPoint, Excel, troubleshooting, record-keeping, documentation, RT-PCR

Posted 1 week ago

Apply

4.0 years

0 Lacs

Chandigarh, India

On-site

Linkedin logo

Experience Required: 4+ Years

Key Responsibilities:
• Design, build, and maintain scalable and reliable data pipelines on Databricks, Snowflake, or equivalent cloud platforms.
• Ingest and process structured, semi-structured, and unstructured data from a variety of sources including APIs, RDBMS, and file systems.
• Perform data wrangling, cleansing, transformation, and enrichment using PySpark, Pandas, NumPy, or similar libraries.
• Optimize and manage large-scale data workflows for performance, scalability, and cost-efficiency.
• Write and optimize complex SQL queries for transformation, extraction, and reporting.
• Design and implement efficient data models and database schemas with appropriate partitioning and indexing strategies for a Data Warehouse or Data Mart.
• Leverage cloud services (e.g., AWS S3, Glue, Kinesis, Lambda) for storage, processing, and orchestration.
• Build containerized solutions using Docker and manage deployment pipelines via CI/CD tools such as Azure DevOps, GitHub Actions, or Jenkins.
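The wrangling, cleansing, and enrichment steps above can be sketched with invented sensor data; the record shape and imputation rule are hypothetical, and real work at this scale would use PySpark or Pandas rather than plain lists:

```python
from statistics import median

# Hypothetical readings with a gap to cleanse.
readings = [
    {"id": 1, "value": 10.0},
    {"id": 2, "value": None},
    {"id": 3, "value": 12.0},
    {"id": 4, "value": 11.0},
]

def cleanse(rows):
    """Impute missing values with the median of the observed ones."""
    observed = [r["value"] for r in rows if r["value"] is not None]
    fill = median(observed)
    return [dict(r, value=r["value"] if r["value"] is not None else fill)
            for r in rows]

def enrich(rows, threshold=11.0):
    """Enrichment: derive a flag column from the cleansed values."""
    return [dict(r, high=r["value"] >= threshold) for r in rows]

curated = enrich(cleanse(readings))
print(curated[1])  # the imputed row now carries value 11.0 and high=True
```

The same cleanse-then-enrich composition maps one-to-one onto `fillna` followed by a derived column in Pandas, or chained `withColumn` calls in PySpark.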

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Please find the JD below:
• 5+ years of hands-on experience in data engineering/ETL using Databricks on AWS/Azure cloud infrastructure and functions.
• 3+ years of experience in Power BI and data warehousing, performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• Experience with AWS (e.g., S3, Athena, Glue, Lambda) preferred.
• Deep understanding of data warehousing concepts (Dimensional/star-schema, SCD2, Data Vault, denormalized, OBT), implementing highly performant data ingestion pipelines from multiple sources.
• Strong proficiency in Python and SQL.
• Deep understanding of Databricks platform features (Delta Lake, Databricks SQL, MLflow).
• Experience with CI/CD on Databricks using tools such as Bitbucket, GitHub Actions, and the Databricks CLI.
• Integrating the end-to-end Databricks pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
• Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
• Experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT), and MLflow.
• Basic working knowledge of API- or stream-based data extraction processes like the Salesforce API and Bulk API.
• Understanding of data management principles (quality, governance, security, privacy, life cycle management, cataloguing).
• Nice to have: Databricks certifications and AWS Solution Architect certification.
• Nice to have: experience building data pipelines from various business applications like Salesforce, Marketo, NetSuite, Workday, etc.
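One of the warehousing concepts the JD names, SCD Type 2, can be sketched in plain Python on an invented customer dimension. This is only the bookkeeping logic; a Databricks implementation would express the same close-and-append pattern as a Delta Lake MERGE:

```python
from datetime import date

# Existing dimension with one open-ended current row (invented data).
dim = [{"key": 1, "customer": "Acme", "city": "Pune",
        "valid_from": date(2024, 1, 1), "valid_to": None, "current": True}]

def scd2_update(dim, customer, new_city, as_of):
    """SCD Type 2: close the current row, append a new version on change."""
    for row in dim:
        if row["customer"] == customer and row["current"]:
            if row["city"] == new_city:
                return dim  # attribute unchanged, keep history as-is
            row["valid_to"] = as_of
            row["current"] = False
    dim.append({"key": max(r["key"] for r in dim) + 1,
                "customer": customer, "city": new_city,
                "valid_from": as_of, "valid_to": None, "current": True})
    return dim

scd2_update(dim, "Acme", "Mumbai", date(2025, 6, 1))
print([(r["city"], r["current"]) for r in dim])  # [('Pune', False), ('Mumbai', True)]
```

The payoff of Type 2 over a plain overwrite is that fact rows joined on the surrogate key keep pointing at the version of the customer that was true when the fact occurred.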

Posted 1 week ago

Apply

0.0 - 1.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

Indeed logo

We are urgently looking to fill the position of "Junior Operator – Assembly".
Experience: Fresher to 1 year only

Roles and responsibilities:
• Assembling the machine as per the requirement
• Mechanical maintenance and fabrication
• Maintaining the stock as per the requirement
• Coordinating with the rest of the departments for regular activities
• Visiting the site for installation purposes, if required (rarely)
• Knowledge of extraction systems or dust collection machines
• Should work on both mechanical and electrical tasks

Job Location: Coimbatore – Mathampalayam (641 019)
Candidates near Bilichi, Mathampalayam, Vellamadai, Onnipalayam, Karamadai, Kuppepalayam, Periyanaickenpalayam, Gudalur, Veerapandi, Shanthi Medu, Press Colony, ITI, Kasthuripalayam, Jothipuram, Andaalpuram, and Kalattiyur are most welcome.

Skills Required:
• Diploma or ITI in Mechanical/Industrial/Production
• Fresher to 1 year of experience, as per job requirements
• Local Coimbatore candidates preferred
• Any technical or additional course in mechanical or a relevant field is an added advantage

Benefits from the organization:
• ESI
• EPF
• Refreshment
• Yearly bonus
• Accommodation
• T-shirt and safety shoes

Job Types: Full-time, Permanent, Fresher
Pay: ₹13,000.00 - ₹15,000.00 per month
Benefits: Health insurance, Leave encashment, Provident Fund
Ability to commute/relocate: Coimbatore North, Coimbatore - 641019, Tamil Nadu: Reliably commute or planning to relocate before starting work (Required)
Education: Diploma (Preferred)
Experience: Mechanical assembly: 1 year (Preferred)
Expected Start Date: 10/06/2025

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Description:
• 5+ years of hands-on experience in data engineering/ETL using Databricks on AWS/Azure cloud infrastructure and functions.
• 3+ years of experience in Power BI and data warehousing, performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• Experience with AWS (e.g., S3, Athena, Glue, Lambda) preferred.
• Deep understanding of data warehousing concepts (Dimensional/star-schema, SCD2, Data Vault, denormalized, OBT), implementing highly performant data ingestion pipelines from multiple sources.
• Strong proficiency in Python and SQL.
• Deep understanding of Databricks platform features (Delta Lake, Databricks SQL, MLflow).
• Experience with CI/CD on Databricks using tools such as Bitbucket, GitHub Actions, and the Databricks CLI.
• Integrating the end-to-end Databricks pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
• Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
• Experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT), and MLflow.
• Basic working knowledge of API- or stream-based data extraction processes like the Salesforce API and Bulk API.
• Understanding of data management principles (quality, governance, security, privacy, life cycle management, cataloguing).
• Nice to have: Databricks certifications and AWS Solution Architect certification.
• Nice to have: experience building data pipelines from various business applications like Salesforce, Marketo, NetSuite, Workday, etc.
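The API-based extraction mentioned above usually means walking a paginated cursor until the source reports no more pages. A dependency-free sketch with a stubbed fetch function standing in for the vendor API; the page shape, cursor values, and record contents are all invented:

```python
# Stand-in for a paginated REST/Bulk-style API response, keyed by cursor.
FAKE_PAGES = {
    None: {"records": [{"id": 1}, {"id": 2}], "next": "p2"},
    "p2": {"records": [{"id": 3}], "next": None},
}

def fetch_page(cursor):
    """Pretend HTTP call; a real extractor would call the vendor API here."""
    return FAKE_PAGES[cursor]

def extract_all(fetch):
    """Walk the cursor chain, yielding records until no next page remains."""
    cursor = None
    while True:
        page = fetch(cursor)
        yield from page["records"]
        cursor = page["next"]
        if cursor is None:
            break

records = list(extract_all(fetch_page))
print(records)  # [{'id': 1}, {'id': 2}, {'id': 3}]
```

Writing the extractor as a generator over an injected `fetch` keeps the pagination logic testable without network access, which is exactly the seam a CI pipeline needs.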

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Linkedin logo

Work Level: Individual
Core: Responsible
Leadership: Team Alignment
Industry Type: Information Technology
Function: Database Administrator
Key Skills: mSQL, SQL Writing, PL/SQL
Education: Graduate

Note: This is a requirement for one of the Workassist hiring partners. This is a remote position.

Responsibilities:
• Write, optimize, and maintain SQL queries, stored procedures, and functions.
• Assist in designing and managing relational databases.
• Perform data extraction, transformation, and loading (ETL) tasks.
• Ensure database integrity, security, and performance.
• Work with developers to integrate databases into applications.
• Support data analysis and reporting by writing complex queries.
• Document database structures, processes, and best practices.

Company Description: Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
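On the "stored procedures and functions" responsibility above: SQLite (used here only because it ships with Python) has no stored procedures, but application-defined SQL functions cover similar ground, reusable logic callable from any query. The table, data, and tax rule are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, tax_rate REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 100.0, 0.18), (2, 200.0, 0.12)])

def gross(amount, rate):
    """Reusable business rule: gross amount including tax, rounded."""
    return round(amount * (1 + rate), 2)

# Register the Python function so SQL queries can call it by name.
conn.create_function("gross", 2, gross)

totals = conn.execute(
    "SELECT id, gross(amount, tax_rate) FROM orders ORDER BY id"
).fetchall()
print(totals)  # [(1, 118.0), (2, 224.0)]
```

In MySQL or PostgreSQL the same rule would live server-side as `CREATE FUNCTION` / `CREATE PROCEDURE`, which keeps the logic next to the data instead of in application code.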

Posted 1 week ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Description – Senior Data Scientist

As a Senior Data Scientist, your responsibilities will encompass developing advanced data analytics models, implementing machine learning algorithms, and optimizing data pipelines. You will play a pivotal role in delivering data-driven solutions that enhance our products and operations and those of our clients, ultimately driving business growth. Collaboration with cross-functional teams, management, and senior leadership will be key as you translate complex data into actionable insights and strategies. We are looking for individuals who are passionate about harnessing the power of data science and machine learning to solve complex business challenges and drive innovation.

Roles & Responsibilities:
• Advanced Analytics: Develop and implement data analytics models, machine learning algorithms, and data pipelines.
• Problem Solving: Tackle challenging data science problems, from predictive modeling to natural language processing.
• Data-Driven Insights: Extract actionable insights from data, enabling data-driven decision-making across the enterprise.
• ML Model Development: Develop and deploy machine learning models and algorithms.
• Stakeholder Communication: Communicate effectively to address complex business problems, sharing technical concepts and insights with non-technical stakeholders and guiding them in making informed decisions.
• A/B Testing: Design and implement A/B testing methodologies to drive data-driven decision-making and product optimization.
• Mentor and lead a team of data scientists.
• Participate in and attend daily scrum calls and other project meetings.
• Continuously learn, acquiring new skills and sharing knowledge pertaining to the field of data science.

Qualifications & Experience:
• A Master's or advanced degree in Data Science, Computer Science, Statistics, Engineering, or a related field.
• Minimum 10+ years of experience, with around 7+ years in machine learning, data science, and delivering data-driven solutions in a corporate environment.
• Strong proficiency in machine learning, statistical analysis, and data visualization.
• Expertise in programming languages such as Python.
• SQL expertise for data extraction, transformation, and analysis.
• Strong problem-solving and strategic thinking abilities.
• Excellent interpersonal and communication skills (both oral and written), with the ability to communicate at various levels with clarity and precision.
• Ability to collaborate effectively across cross-functional teams.
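The A/B testing responsibility above typically reduces to a two-proportion z-test on conversion counts. A self-contained sketch using only the standard library; the experiment numbers are invented:

```python
from math import erf, sqrt

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 200/2000 control vs 260/2000 variant conversions.
z, p = ab_z_test(200, 2000, 260, 2000)
print(round(z, 2), round(p, 4))
```

A p-value below the pre-registered threshold (commonly 0.05) supports shipping the variant; in practice the sample size should be fixed by a power calculation before the experiment starts, not decided by peeking at p.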

Posted 1 week ago

Apply

Exploring Extraction Jobs in India

The extraction job market in India is a thriving industry with numerous opportunities for job seekers. Extraction jobs involve extracting valuable resources such as oil, gas, minerals, and other natural resources from the earth. These roles are essential for the growth and development of various sectors in the country.

Top Hiring Locations in India

  1. Mumbai
  2. Delhi
  3. Bangalore
  4. Kolkata
  5. Hyderabad

These cities are known for their active hiring in extraction roles, with a high demand for skilled professionals in the industry.

Average Salary Range

The average salary range for extraction professionals in India varies based on experience and skills. Entry-level positions can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can earn upwards of INR 10-15 lakhs per annum.

Career Path

In the extraction industry, a typical career path may involve starting as a Junior Engineer or Technician, moving on to roles such as Senior Engineer, Project Manager, and eventually reaching positions like Operations Manager or Director.

Related Skills

In addition to extraction skills, professionals in this field are often expected to have knowledge of geology, environmental regulations, safety procedures, and project management.

Interview Questions

  • What is the importance of exploration in the extraction industry? (basic)
  • How do you ensure compliance with environmental regulations during extraction processes? (medium)
  • Can you explain the difference between surface mining and underground mining? (medium)
  • What are some of the challenges faced in the extraction industry, and how would you address them? (medium)
  • Describe a successful extraction project you were involved in and the role you played. (advanced)
  • How do you stay updated on new technologies and advancements in the extraction industry? (basic)
  • What steps would you take to improve efficiency in extraction processes? (medium)
  • How do you prioritize safety in extraction operations? (medium)
  • Can you discuss a time when you had to handle a difficult situation during an extraction project? (advanced)
  • What software tools or technologies are you proficient in for extraction work? (basic)
  • Explain the importance of risk assessment in extraction operations. (medium)
  • How do you ensure quality control in extraction processes? (medium)
  • What are the key factors to consider when selecting a site for extraction activities? (medium)
  • How do you manage stakeholder relationships in the extraction industry? (medium)
  • Describe a time when you had to work under strict deadlines in an extraction project. How did you handle it? (advanced)
  • What strategies would you implement to reduce the environmental impact of extraction activities? (medium)
  • Can you discuss a time when you had to troubleshoot a technical issue during an extraction operation? (advanced)
  • How do you handle conflicts within a team working on an extraction project? (medium)
  • What are the different types of extraction methods used in the industry, and when would you use each? (advanced)
  • How do you ensure cost-effectiveness in extraction operations? (medium)
  • Explain the role of technology in modern extraction processes. (basic)
  • What are the key components of a successful extraction plan? (medium)
  • How do you assess the feasibility of an extraction project? (medium)
  • Can you discuss a time when you had to adapt to unexpected changes in an extraction project? (advanced)
  • How do you ensure the health and safety of workers in extraction operations? (medium)

Closing Remark

As you prepare for interviews and explore opportunities in the extraction industry in India, remember to showcase your skills, experience, and passion for the field. With the right preparation and confidence, you can excel in extraction roles and contribute to the growth of this dynamic industry. Good luck!
