Lead II - Data Engineering

7 - 9 years

8 - 9 Lacs

Posted: 13 hours ago | Platform: GlassDoor


Work Mode: On-site

Job Type: Part Time

Job Description

    7 - 9 Years
    1 Opening
    Trivandrum


Role description

Role Proficiency:

This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
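
To make these expectations concrete, below is a minimal PySpark sketch of the ingest-wrangle-join-load flow described above. The file paths, column names, and Delta output location are illustrative assumptions, not part of the role definition, and a Delta-enabled Spark session is assumed.

```python
# Minimal sketch of an ingest -> wrangle -> join -> load pipeline.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Ingest: read raw CSV and JSON sources
orders = spark.read.option("header", True).csv("/raw/orders.csv")
customers = spark.read.json("/raw/customers.json")

# Wrangle/transform: cast types and drop rows missing key fields
orders = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .dropna(subset=["order_id", "customer_id"])
)

# Join the two sources and load into a Delta table
# (assumes the delta-spark package is configured on the session)
enriched = orders.join(customers, on="customer_id", how="left")
enriched.write.format("delta").mode("overwrite").save("/curated/orders_enriched")
```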

Outcomes:

  • Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and the reuse of proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others.
  • Interpret requirements, create optimal architecture, and design solutions in accordance with specifications.
  • Document and communicate milestones/stages for end-to-end delivery.
  • Code using best standards, and debug and test solutions to ensure best-in-class quality.
  • Tune the performance of code and align it with the appropriate infrastructure, understanding the cost implications of licenses and infrastructure.
  • Create data schemas and models effectively (a schema sketch follows this list).
  • Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes.
  • Validate results with user representatives, integrating the overall solution.
  • Influence and enhance customer satisfaction and employee engagement within project teams.
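
As a small illustration of the schema-design bullet above, the hypothetical PySpark snippet below defines an explicit schema and enforces it at read time; all field names and the source path are invented for the example.

```python
# Hypothetical schema definition for a customer feed; all field names
# and the source path are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType,
)

spark = SparkSession.builder.appName("schema-demo").getOrCreate()

customer_schema = StructType([
    StructField("customer_id", StringType(), nullable=False),
    StructField("segment", StringType(), nullable=True),
    StructField("lifetime_value", DoubleType(), nullable=True),
    StructField("created_at", TimestampType(), nullable=True),
])

# Enforcing the schema at read time surfaces drift immediately instead
# of letting malformed records propagate downstream.
customers = spark.read.schema(customer_schema).json("/raw/customers.json")
customers.printSchema()
```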

Measures of Outcomes:

  • Adherence to engineering processes and standards
  • Adherence to schedule/timelines
  • Adherence to SLAs where applicable
  • Number of defects post delivery
  • Number of non-compliance issues
  • Reduction in recurrence of known defects
  • Quick turnaround of production bugs
  • Completion of applicable technical/domain certifications
  • Completion of all mandatory training requirements
  • Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
  • Average time to detect, respond to, and resolve pipeline failures or data issues
  • Number of data security incidents or compliance breaches

Outputs Expected:

Code:

  • Develop data processing code with guidance, ensuring performance and scalability requirements are met.
  • Define coding standards, templates, and checklists.
  • Review code for team and peers.


Documentation:

  • Create/review templates, checklists, guidelines, and standards for design/process/development.
  • Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, and test cases and results.


Configure:

  • Define and govern the configuration management plan.
  • Ensure compliance from the team.


Test:

  • Review/create unit test cases, scenarios, and execution.
  • Review test plans and strategies created by the testing team.
  • Provide clarifications to the testing team.


Domain Relevance:

  • Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs.
  • Learn more about the customer domain and identify opportunities to add value.
  • Complete relevant domain certifications.


Manage Project:

  • Support the Project Manager with project inputs.
  • Provide inputs on project plans or sprints as needed.
  • Manage the delivery of modules.


Manage Defects:

  • Perform defect root cause analysis (RCA) and mitigation.
  • Identify defect trends and implement proactive measures to improve quality.


Estimate:

  • Create and provide input for effort and size estimation, and plan resources for projects.


Manage Knowledge:

  • Consume and contribute to project-related documents, SharePoint libraries, and client universities.
  • Review reusable documents created by the team.


Release:

  • Execute and monitor the release process.


Design:

  • Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.


Interface with Customer:

  • Clarify requirements and provide guidance to the Development Team.
  • Present design options to customers.
  • Conduct product demos.
  • Collaborate closely with customer architects to finalize designs.


Manage Team:

  • Set FAST goals and provide feedback.
  • Understand team members' aspirations and provide guidance and opportunities.
  • Ensure team members are upskilled.
  • Engage the team in projects.
  • Proactively identify attrition risks and collaborate with BSE on retention measures.


Certifications:

  • Obtain relevant domain and technology certifications.

Skill Examples:

  • Proficiency in SQL, Python, or other programming languages used for data manipulation.
  • Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
  • Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud, particularly their data-related services (e.g. AWS Glue, BigQuery).
  • Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications.
  • Experience in performance tuning.
  • Experience in data warehouse design and cost improvements.
  • Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
  • Ability to communicate and explain design/development aspects to customers.
  • Ability to estimate time and resource requirements for developing/debugging features/components.
  • Experience participating in RFP responses and solutioning.
  • Ability to mentor team members and guide them in relevant upskilling and certification.

Knowledge Examples:

  • Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
  • Proficiency in SQL for analytics, including windowing functions (an illustrative query follows this list).
  • Understanding of data schemas and models.
  • Familiarity with domain-related data.
  • Knowledge of data warehouse optimization techniques.
  • Understanding of data security concepts.
  • Awareness of patterns, frameworks, and automation practices.
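
As a small illustration of the windowing-function point above, here is a PySpark SQL sketch; the sales table, columns, and values are fabricated for the example.

```python
# Illustrative windowing-function query run through PySpark SQL.
# The sales data and column names are made up for demonstration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("window-demo").getOrCreate()

spark.createDataFrame(
    [("north", "2024-01", 100.0), ("north", "2024-02", 140.0),
     ("south", "2024-01", 90.0), ("south", "2024-02", 70.0)],
    ["region", "month", "revenue"],
).createOrReplaceTempView("sales")

# Rank months by revenue within each region and compute a running total
spark.sql("""
    SELECT region, month, revenue,
           RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
""").show()
```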

Additional Comments:

Role Purpose:

The Cloud Engineer ensures the stability, performance, and scalability of the Data Transformation Platform hosted in Microsoft Azure. They will work as part of a team responsible both for the operational support needed to maintain platform uptime and for the delivery of new features and enhancements driven by stakeholder requirements. This includes designing, building, and optimising infrastructure and CI/CD workflows using tools such as Databricks, Terraform, and Azure DevOps. The Engineer will also work closely with developers to build pipelines for ingestion and transformation.

Key Accountabilities / Responsibilities:

  • Working as part of a team that is responsible for the uptime of data transformation platforms, as well as any feature requests fed in by stakeholders.
  • Building scalable and reliable Azure DevOps pipelines for infrastructure deployments.
  • Working in collaboration with data engineers, developers, and security teams.
  • Maintaining up-to-date documentation for the platform, as well as thoroughly documenting and showcasing any new features built.
  • Participating in the out-of-hours (OOH) support call-out rota.

Required Skills & Experience:

  • Strong hands-on experience with Azure Cloud, Azure Databricks, and data integration workflows.
  • Confident in Terraform for IaC in Azure.
  • Understanding of Azure networking (VNets, NSGs, VPNs, Private Endpoints, DNS).
  • Experience building and managing Azure DevOps pipelines for code, infrastructure, and data workflows.
  • Familiarity with monitoring, logging, and alerting using Azure Monitor, Log Analytics, or similar tools (a query sketch follows this section).
  • Comfortable with scripting (PowerShell, Bash, or Python).
  • Understanding of cloud security best practices (IAM, RBAC, Key Vault, policies).
  • Excellent communication, documentation, and collaboration skills.
  • Previous experience in Data Platform teams preferred.
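
As a sketch of the monitoring and scripting skills listed above, the following Python example queries Log Analytics via the azure-monitor-query package. It assumes azure-identity and azure-monitor-query are installed and the caller has read access to the workspace; the workspace ID and KQL query are placeholders.

```python
# Minimal sketch: pull recent error records from a Log Analytics
# workspace. Assumes azure-identity and azure-monitor-query are
# installed; the workspace ID and KQL query are placeholders.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

credential = DefaultAzureCredential()
client = LogsQueryClient(credential)

response = client.query_workspace(
    workspace_id="<workspace-id>",  # placeholder, not a real workspace
    query="AzureDiagnostics | where Level == 'Error' | take 10",
    timespan=timedelta(hours=24),
)

# Each result table carries column metadata plus row tuples.
for table in response.tables:
    for row in table.rows:
        print(row)
```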

Skills

Azure Databricks, Terraform, Azure DevOps, Azure Cloud

About UST

UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
