Data Engineer, I

2 - 5 years

9 - 13 Lacs

Posted: 1 week ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Overview

A Data Engineer designs, develops, programs, and implements Machine Learning solutions; builds Artificial/Augmented Intelligence systems, agentic workflows, and data engineering workflows; and performs statistical modelling and measurement. The role applies data engineering, feature engineering, statistical methods, ML modelling, and AI techniques to structured, unstructured, and diverse "big data" sources of machine-acquired data to generate actionable insights and foresights that solve real-life business problems and drive product feature development and enhancement. A strong understanding of databases, SQL, cloud technologies, and modern data integration and orchestration tools such as Azure Data Factory (ADF) is required to succeed in this role.

Responsibilities
  • Integrates state-of-the-art machine learning algorithms and develops new methods
  • Develops tools to support analysis and visualization of large datasets
  • Develops and codes software programs; implements industry-standard AutoML models (speech, computer vision, text data, LLM), statistical models, relevant ML models for device/machine-acquired data, and AI models and algorithms
  • Identifies meaningful foresights from predictive ML models over large data and metadata sources; interprets and communicates foresights, insights, and findings from experiments to product managers, service managers, business partners, and business managers
  • Uses rapid development tools (business intelligence tools, graphics libraries, data modelling tools) to communicate research findings to relevant stakeholders through visual graphics, data models, machine learning model features, and feature engineering/transformations
  • Analyzes, reviews, and tracks trends and tools in the Data Science, Machine Learning, Artificial Intelligence, and IoT space
  • Interacts with cross-functional teams to identify questions and issues for data engineering and ML model feature engineering
  • Evaluates and recommends improvements to data collection mechanisms to improve the efficacy of ML model predictions
  • Meets with customers, partners, product managers, and business leaders to present findings, predictions, and foresights; gathers customer-specific requirements for business problems/processes; identifies data collection constraints and implementation alternatives for models
Required Skills
  • Working knowledge of MLOps, LLMs, and agentic AI/workflows
  • Programming Skills: Proficiency in Python and experience with ML frameworks such as TensorFlow and PyTorch
  • LLM Expertise: Hands-on experience in training, fine-tuning, and deploying LLMs
  • Foundational Model Knowledge: Strong understanding of open-weight LLM architectures, including training methodologies, fine-tuning techniques, hyperparameter optimization, and model distillation
  • Data Pipeline Development: Strong understanding of data engineering concepts, feature engineering, and workflow automation using Airflow or Kubeflow
  • Cloud & MLOps: Experience deploying ML models in cloud environments such as AWS, GCP (Google Vertex AI), or Azure using Docker and Kubernetes
  • Modelling: Designs and implements predictive and optimisation models incorporating diverse data types
  • Strong SQL and Azure Data Factory (ADF) skills
Qualifications

• Minimum Education:
  o Bachelor's, Master's, or Ph.D. degree in Computer Science or Engineering.
• Minimum Work Experience (years):
  o 1+ years of experience programming with at least one of the following languages: Python, Scala, Go.
  o 1+ years of experience in SQL and data transformation.
  o 1+ years of experience developing distributed systems using open-source technologies such as Spark and Dask.
  o 1+ years of experience with relational or NoSQL databases running in Linux environments (MySQL, MariaDB, PostgreSQL, MongoDB, Redis).
• Key Skills and Competencies:
  o Experience working with AWS / Azure / GCP environments is highly desired.
  o Experience with data models in the Retail and Consumer Products industry is desired.
  o Experience working on agile projects and understanding of agile concepts is desired.
  o Demonstrated ability to learn new technologies quickly and independently.
  o Excellent verbal and written communication skills, especially in technical communications.
  o Ability to work toward and achieve stretch goals in a very innovative and fast-paced environment.
  o Ability to work collaboratively in a diverse team environment.
  o Ability to telework.
  o Expected travel: Not expected.

Zebra Technologies

Technology - Automatic Identification and Data Capture

Vernon Hills
