Posted: 2 days ago | Work from Office | Full Time
Dear Candidate,
We are hiring for a Data/Analytics Engineer position at the Gurgaon location of Big Company.
If interested, kindly share your updated CV at gayatri.d@honeybeetechsolutions.com
JD -
Role - Data/Analytics Engineer
Experience - 3-6 Years
Location - Gurgaon
JD in Detail -
Position Name: Data/Analytics Engineer
Position type: Permanent
Total Exp: 3-6 Years
Notice Period: Immediate
Asset: Laptop mandatory for the interview
Work Location: Gurgaon
Work Type: WFO
Interview Rounds: 3
Interview Mode: Round 1 - Technical (Virtual) / Round 2 - Technical (Virtual or F2F) / Round 3 - HR
Job Description
We are looking for a skilled and hands-on Data/Analytics Engineer to build and maintain the backbone of our data infrastructure. In this role, you will be responsible for setting up robust data flows for AI and data analytics, ensuring that high-quality data is available for modelling and decision-making. You will bridge the gap between data engineering and data science by implementing efficient ETL pipelines and establishing strong ML Ops practices. If you are proficient in Python, Airflow, and the Google Cloud ecosystem, and thrive on optimizing data architecture for scale, this role is for you.

Key Responsibilities

Data Pipeline & ETL Architecture
- Pipeline Orchestration: Design, develop, and maintain scalable ETL/ELT pipelines using Apache Airflow and Python to automate data workflows.
- Data Integration: Seamlessly integrate data from various RDBMS sources and external APIs into Google BigQuery to create a unified data source for analytics and AI.
- Data Flow Optimization: Ensure efficient data flow for AI and analytics use cases, optimizing for latency, cost, and reliability.

ML Ops & Model Deployment
- Model Operationalization: Collaborate with Data Scientists to deploy machine learning models into production using Google Cloud Vertex AI.
- ML Ops Implementation: Establish and maintain ML Ops practices, including model monitoring, versioning, retraining pipelines, and CI/CD for machine learning.
- Infrastructure Management: Manage the underlying infrastructure required for model training and serving, ensuring high availability and performance.

Data Warehousing & Governance
- BigQuery Management: Manage and optimize data warehousing in Google BigQuery, ensuring appropriate schema design, partitioning, and clustering for performance.
- Data Quality: Implement checks and balances to ensure data accuracy and consistency across the pipeline.
- Documentation: Maintain comprehensive documentation of data lineage, pipeline architecture, and operational runbooks.

Collaboration
- Work as a key member of the Data Science team to understand model requirements and provide the necessary data infrastructure.
- Collaborate with the Tech and Business teams to integrate data pipelines with the broader organizational infrastructure.
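As a loose illustration of the pipeline and data-quality responsibilities above, here is a minimal extract-validate-load sketch in plain Python (no Airflow dependency; the rows, column names, and rejection rules are entirely hypothetical):

```python
# Minimal extract -> validate -> load sketch for one pipeline step.
# Rows, column names, and rules are hypothetical, for illustration only.

def extract():
    # Stand-in for reading rows from an RDBMS source or external API.
    return [
        {"id": 1, "amount": 120.0, "ts": "2024-01-01"},
        {"id": 2, "amount": 75.5,  "ts": "2024-01-01"},
        {"id": 2, "amount": 75.5,  "ts": "2024-01-01"},  # duplicate id
        {"id": 3, "amount": None,  "ts": "2024-01-02"},  # missing amount
    ]

def validate(rows):
    # "Checks and balances": drop duplicate ids and rows with null
    # amounts, and report what was rejected so the run can be monitored.
    seen, clean, rejected = set(), [], []
    for row in rows:
        if row["id"] in seen or row["amount"] is None:
            rejected.append(row)
        else:
            seen.add(row["id"])
            clean.append(row)
    return clean, rejected

def load(rows):
    # Stand-in for an append into a warehouse table (e.g. BigQuery).
    return len(rows)

clean, rejected = validate(extract())
print(f"loaded={load(clean)}, rejected={len(rejected)}")  # loaded=2, rejected=2
```

In a real pipeline each of these functions would become an Airflow task, with the rejected-row count feeding the monitoring described above.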
Skills & Experience

Experience:
- 3-6 years of professional experience in Data Engineering, Analytics Engineering, or ML Engineering.
- Proven experience in building production-grade data pipelines and deploying ML models.

Technical Proficiency:
- Core Languages: Strong proficiency in Python for data manipulation and scripting.
- Orchestration: Expert-level knowledge of Apache Airflow for scheduling and monitoring workflows.
- Cloud Data Stack: Deep expertise in Google Cloud Platform (GCP), specifically BigQuery for warehousing and Vertex AI for AI/ML workloads.
- Database & ETL: Strong command of RDBMS (SQL) and extensive experience with ETL operations to transform raw data into analytical assets.
- ML Ops: Familiarity with ML Ops tools and methodologies (e.g., Kubeflow, MLflow, or native Vertex AI pipelines).

Soft Skills:
- Strong problem-solving skills and attention to detail.
- Ability to work in a fast-paced, agile environment.
- Good communication skills to articulate technical challenges to non-technical stakeholders.
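To give a flavour of the partitioning and clustering skills mentioned above, a BigQuery DDL statement of the kind a candidate would be expected to write might look like the following (composed here as a Python string; the dataset, table, and column names are invented for illustration):

```python
# Hypothetical BigQuery DDL demonstrating date partitioning and
# clustering; all names are invented for illustration.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.transactions (
  txn_id      STRING NOT NULL,
  customer_id STRING,
  amount      NUMERIC,
  txn_ts      TIMESTAMP
)
PARTITION BY DATE(txn_ts)   -- prunes scanned bytes (and cost) by date
CLUSTER BY customer_id      -- co-locates rows for common filters
"""
```

Partitioning on the event timestamp and clustering on a frequently filtered key is the standard BigQuery pattern for keeping query cost and latency down at scale.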
Good-to-Have / Plus
- Domain Expertise: Prior experience in the BFSI domain (Wealth Management, Insurance, Mutual Funds).
- Certifications: Google Professional Data Engineer or Google Professional Machine Learning Engineer.
- Exposure to AWS S3 and RDS.
Education
B.Tech / B.E. / MCA / M.Sc or equivalent in Computer Science, Information Technology, or Data Science.
Thanks,
Gayatri D
IT Recruiter
gayatri.d@honeybeetechsolutions.com
Honeybee Tech Solutions