Data/Analytics Engineer

3 - 6 years

5 - 15 Lacs

Posted: 2 days ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

Dear Candidate,

We are hiring for a Data/Analytics Engineer position in Gurgaon for a big company.

If interested, kindly share your updated CV at gayatri.d@honeybeetechsolutions.com

JD -

Role - Data/Analytics Engineer

Experience - 3-6 Years

Location - Gurgaon

JD in Detail -

Position Name: Data/Analytics Engineer

Position type: Permanent

Total Exp: 3-6 Years

Notice Period: Immediate

Asset: Laptop mandatory for the interview

Work Location: Gurgaon

Work Type: WFO

Interview Rounds: 2

Interview Mode: Round 1 - Technical (Virtual) / Round 2 - Technical (Virtual or F2F) / Round 3 - HR

Job Description

We are looking for a skilled and hands-on Data/Analytics Engineer to build and maintain the backbone of our data infrastructure. In this role, you will be responsible for setting up robust data flows for AI and Data Analytics, ensuring that high-quality data is available for modelling and decision-making. You will bridge the gap between data engineering and data science by implementing efficient ETL pipelines and establishing strong ML Ops practices.

This role is a strong fit if you are proficient in Python, Airflow, and the Google Cloud ecosystem, and thrive on optimizing data architecture for scale.

Key Responsibilities

Data Pipeline & ETL Architecture

- Pipeline Orchestration: Design, develop, and maintain scalable ETL/ELT pipelines using Apache Airflow and Python to automate data workflows.

- Data Integration: Seamlessly integrate data from various RDBMS sources and external APIs into Google BigQuery to create a unified data source for analytics and AI.

- Data Flow Optimization: Ensure efficient data flow for AI and analytics use cases, optimizing for latency, cost, and reliability.

ML Ops & Model Deployment

- Model Operationalization: Collaborate with Data Scientists to deploy machine learning models into production using Google Cloud Vertex AI.

- ML Ops Implementation: Establish and maintain ML Ops practices, including model monitoring, versioning, retraining pipelines, and CI/CD for machine learning.

- Infrastructure Management: Manage the underlying infrastructure required for model training and serving, ensuring high availability and performance.

Data Warehousing & Governance

- BigQuery Management: Manage and optimize data warehousing in Google BigQuery, ensuring appropriate schema design, partitioning, and clustering for performance.

- Data Quality: Implement checks and balances to ensure data accuracy and consistency across the pipeline.

- Documentation: Maintain comprehensive documentation of data lineage, pipeline architecture, and operational runbooks.

Collaboration

- Work as a key member of the Data Science team to understand model requirements and provide the necessary data infrastructure.

- Collaborate with the Tech and Business teams to integrate data pipelines with broader organizational infrastructure.
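To give a flavour of the data-quality responsibility above, here is a minimal sketch of a quality gate that could sit between extraction and a warehouse load. The field names, date format, and 10% threshold are hypothetical illustrations, not part of the actual role:

```python
from datetime import datetime

# Hypothetical required schema for incoming records; a real pipeline
# would derive this from source-system metadata, not hard-code it.
REQUIRED_FIELDS = {"customer_id", "amount", "event_date"}

def validate_record(record: dict) -> list:
    """Return a list of data-quality problems found in one record."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        problems.append("amount is not numeric")
    if "event_date" in record:
        try:
            datetime.strptime(record["event_date"], "%Y-%m-%d")
        except (TypeError, ValueError):
            problems.append("event_date is not an ISO date")
    return problems

def run_quality_gate(records: list, max_bad_ratio: float = 0.1) -> list:
    """Drop bad records; fail the step if too many records are bad."""
    good = [r for r in records if not validate_record(r)]
    bad_ratio = 1 - len(good) / len(records) if records else 0.0
    if bad_ratio > max_bad_ratio:
        raise ValueError(f"quality gate failed: {bad_ratio:.0%} bad records")
    return good
```

In an Airflow deployment, a function like `run_quality_gate` would typically run inside a task (for example a `PythonOperator` or `@task` callable) between the extraction step and the BigQuery load, so that bad batches fail loudly instead of polluting the warehouse.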

Skills & Experience

Experience:

- 3-6 years of professional experience in Data Engineering, Analytics Engineering, or ML Engineering.

- Proven experience in building production-grade data pipelines and deploying ML models.

Technical Proficiency:

- Core Languages: Strong proficiency in Python for data manipulation and scripting.

- Orchestration: Expert-level knowledge of Apache Airflow for scheduling and monitoring workflows.

- Cloud Data Stack: Deep expertise in Google Cloud Platform (GCP), specifically BigQuery for warehousing and Vertex AI for AI/ML workloads.

- Database & ETL: Strong command of RDBMS (SQL) and extensive experience with ETL operations to transform raw data into analytical assets.

- ML Ops: Familiarity with ML Ops tools and methodologies (e.g., Kubeflow, MLflow, or native Vertex AI pipelines).

Soft Skills:

- Strong problem-solving skills and attention to detail.

- Ability to work in a fast-paced, agile environment.

- Good communication skills to articulate technical challenges to non-technical stakeholders.

Good-to-Have / Plus

- Domain Expertise: Prior experience in the BFSI domain (Wealth Management, Insurance, Mutual Funds).

- Certifications: Google Professional Data Engineer or Google Professional Machine Learning Engineer.

- Exposure to AWS S3 and RDS.

Education

B.Tech / B.E. / MCA / M.Sc or equivalent in Computer Science, Information Technology, or Data Science.

Thanks,

Gayatri D

IT Recruiter

gayatri.d@honeybeetechsolutions.com

Honeybee Tech Solutions

Information Technology

Innovation City
