
Data Engineer

11 - 19 years

20 - 30 Lacs

Posted: 2 days ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Job Title: Data Engineer
Location: Hyderabad, Kochi, Trivandrum
Experience Required: 10-19 Yrs
Skills: Primary - Scala, PySpark, Python / Secondary - ETL, SQL, Azure

Role Proficiency

The role demands expertise in building robust, scalable data pipelines that support ingestion, wrangling, transformation, and integration of data from multiple sources. The ideal candidate should have hands-on experience with ETL tools (e.g., Informatica, AWS Glue, Databricks, GCP DataProc) and strong programming skills in Python, PySpark, SQL, and optionally Scala. Proficiency across various data domains and familiarity with modern data warehouse and lakehouse architectures (Snowflake, BigQuery, Delta Lake) are essential. A solid understanding of DevOps and infrastructure cost optimization is required.

Key Responsibilities & Outcomes

Technical Development
- Develop high-performance data pipelines and applications.
- Optimize development using design patterns and reusable solutions.
- Create and tune code using best practices for performance and scalability.
- Develop schemas, data models, and data storage solutions (SQL/NoSQL/Delta Lake).
- Perform debugging, testing, and validation to ensure solution quality.

Documentation & Design
- Produce high-level and low-level design (HLD, LLD, SAD) and architecture documentation.
- Prepare infra costing, source-target mappings, and business requirement documentation.
- Contribute to and govern documentation standards, templates, and checklists.

Project & Team Management
- Support the Project Manager in planning, delivery, and sprint execution.
- Estimate effort and provide input on resource planning.
- Lead and mentor junior team members, define goals, and monitor progress.
- Monitor and manage the defect lifecycle, including RCA and proactive quality improvements.

Customer Interaction
- Gather and clarify requirements with customers and architects.
- Present design alternatives and conduct product demos.
- Ensure alignment with customer expectations and solution architecture.

Testing & Release
- Design and review unit/integration test cases and execution strategies.
- Provide support during system/integration testing and UAT.
- Oversee and execute release cycles and configurations.

Knowledge Management & Compliance
- Maintain compliance with configuration management plans.
- Contribute to internal knowledge repositories and reusable assets.
- Stay updated and certified on relevant technologies/domains.

Measures of Success (KPIs)
- Adherence to engineering processes and delivery schedules.
- Number of post-delivery defects and non-compliance issues.
- Reduction in recurring defects and faster resolution of production bugs.
- Timeliness in detecting, responding to, and resolving pipeline/data issues.
- Improvements in pipeline efficiency (e.g., runtime, resource utilization).
- Team engagement and upskilling; completion of relevant certifications.
- Zero or minimal data security/compliance breaches.

Expected Deliverables
- Code: High-quality data transformation scripts and pipelines; peer-reviewed, optimized, and reusable code.
- Documentation: Design documents, technical specifications, test plans, and infra cost estimations.
- Configuration & Testing: Configuration management plans and test execution results.
- Knowledge Sharing: Contributions to SharePoint, internal wikis, and client university platforms.

Skill Requirements

Mandatory Technical Skills
- Languages: Python, PySpark, Scala
- ETL Tools: Apache Airflow, Talend, Informatica, AWS Glue, Databricks, DataProc
- Cloud Platforms: AWS, GCP, Azure (esp. BigQuery, DataFlow, ADF, ADLS)
- Data Warehousing: Snowflake, BigQuery, Delta Lake, Lakehouse architecture
- Performance Tuning: For large-scale distributed systems and pipelines

Additional Skills
- Experience in data model design and optimization.
- Good understanding of data schemas, window functions, and data partitioning strategies.
- Awareness of data governance, security standards, and compliance.
- Familiarity with DevOps, CI/CD, and infrastructure cost estimation.

Certifications (Preferred)
- Cloud certifications (e.g., AWS Data Analytics, GCP Data Engineer)
- Informatica or Databricks certification
- Domain-specific certifications based on project/client need

Soft Skills
- Strong analytical and problem-solving capabilities
- Excellent communication and documentation skills
- Ability to work independently and collaboratively in cross-functional teams
- Stakeholder management and customer interaction
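
For illustration only, below is a minimal PySpark sketch of the kind of pipeline work described under Role Proficiency and Additional Skills: ingesting a source table, applying a window-function transformation, and writing a partitioned Delta Lake output. The "orders" dataset, all column names, and the paths are hypothetical and are not taken from this posting.

    # Illustrative sketch only; dataset, columns, and paths are hypothetical.
    from pyspark.sql import SparkSession, Window, functions as F

    spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

    # Ingest: read a raw source table (placeholder path)
    orders = spark.read.parquet("/data/raw/orders")

    # Transform: keep the latest order per customer using a window function
    w = Window.partitionBy("customer_id").orderBy(F.col("order_ts").desc())
    latest = (
        orders.withColumn("rn", F.row_number().over(w))
              .filter(F.col("rn") == 1)
              .drop("rn")
    )

    # Load: write a partitioned Delta Lake table (assumes the delta-spark package is available)
    (latest.write.format("delta")
           .mode("overwrite")
           .partitionBy("order_date")
           .save("/data/curated/orders_latest"))

In practice the same ingest-transform-load pattern would be orchestrated and scaled with the ETL tools listed above (e.g., Airflow for scheduling, Databricks or AWS Glue for execution).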

UST

IT Services and IT Consulting

Aliso Viejo, CA

10001 Employees

1185 Jobs

Key People
  • Kris Canekeratne, Co-Founder & CEO
  • Sandeep Reddy, President
