(Services) Data Engineer

Experience: 5 years

Salary: 3 - 8 Lacs

Posted: 2 weeks ago | Platform: GlassDoor


Work Mode: On-site

Job Type: Part Time

Job Description

As a Data Engineer, you design, build, and optimize large-scale data pipelines and platforms across cloud environments. You manage data integration from multiple business systems, ensuring high data quality, performance, and governance, and you collaborate with cross-functional teams to deliver trusted, scalable, and secure data solutions that enable analytics, reporting, and decision-making.

Meet the job

  • Data Engineering: Design, build, and optimize scalable ETL/ELT pipelines using Azure Data Factory, Databricks, PySpark, and SQL;
  • Cloud Data Platforms: Manage and integrate data across Azure (Synapse, Data Lake, Event Hub, Key Vault) and GCP (BigQuery, Cloud Storage);
  • API Integration: Develop workflows for data ingestion and processing via REST APIs and web services, including integrations with BambooHR, Salesforce, and Oracle NetSuite;
  • Data Modeling & Warehousing: Build and maintain data models, warehouses, and lakehouse structures to support analytics and reporting needs;
  • Performance Optimization: Optimize Spark jobs, SQL queries, and pipeline execution for scalability, performance, and cost-efficiency;
  • Governance & Security: Ensure data privacy, security, and compliance while maintaining data lineage and cataloging practices;
  • Collaboration: Partner with business stakeholders, analysts, and PMO teams to deliver reliable data for reporting and operations;
  • Documentation: Create and maintain technical documentation for data processes, integrations, and pipeline workflows.
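
The pipeline responsibilities above would typically be built in PySpark on Databricks; as a rough illustration, the extract-transform-load pattern they describe can be sketched in plain Python (the field names and in-memory "source" below are hypothetical placeholders, not the actual schemas of BambooHR, Salesforce, or NetSuite):

```python
# Minimal ETL sketch: extract raw records, clean and deduplicate them,
# then "load" them into a target store. All data here is illustrative.

def extract():
    # Stand-in for a REST-API or database read from a source system.
    return [
        {"employee_id": "101", "name": " Asha ", "dept": "Finance"},
        {"employee_id": "101", "name": "Asha", "dept": "Finance"},   # duplicate row
        {"employee_id": "102", "name": "Ravi", "dept": None},        # missing dept
    ]

def transform(records):
    # Trim whitespace, fill missing values, and deduplicate on employee_id.
    seen, cleaned = set(), []
    for r in records:
        key = r["employee_id"]
        if key in seen:
            continue
        seen.add(key)
        cleaned.append({
            "employee_id": key,
            "name": r["name"].strip(),
            "dept": r["dept"] or "Unassigned",
        })
    return cleaned

def load(records, target):
    # Stand-in for a warehouse/lakehouse write (e.g. to Synapse or BigQuery).
    target.extend(records)

warehouse = []
load(transform(extract()), warehouse)
```

In PySpark the same cleanup steps would map onto DataFrame operations such as `dropDuplicates` and `na.fill`, with the load step handled by a writer to the target warehouse or lakehouse table.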

How about you
  • Education: Bachelor's/Master's degree in Computer Science, Engineering, Analytics, Mathematics, Statistics, IT, or equivalent;
  • Experience: 5+ years of experience in Data Engineering and large-scale data migration projects;
  • Technical Skills: Proficient in SQL, Python, and PySpark for data processing and transformation;
  • Big Data & Cloud: Hands-on expertise with Apache Spark, Databricks, and Azure Data Services (ADF, Synapse, Data Lake, Event Hub, Key Vault);
  • GCP Knowledge: Exposure to Google Cloud Platform (BigQuery, Cloud Storage) and multi-cloud data workflows;
  • Integration Tools: Exposure to tools such as Workato for API-based data ingestion and automation;
  • Best Practices: Strong understanding of ETL/ELT development best practices and performance optimization;
  • Added Advantage: Certifications in Azure or GCP cloud platforms;
  • Domain Knowledge: Knowledge of Oracle NetSuite, BambooHR, and Salesforce data ingestion, as well as PMO data operations, is preferred;
  • Soft Skills: Strong problem-solving skills, effective communication, and ability to work both independently and in cross-functional teams while mentoring junior engineers.
