Data Engineer Team Lead

9 - 14 years

0 Lacs

Posted: 1 week ago | Platform: Naukri


Work Mode: Hybrid

Job Type: Full Time

Job Description

Role Description:

The candidate must have architect-level experience with technologies such as Azure SQL, Python, PySpark, Databricks, Azure, and Kafka, and should be well versed in Agile methodologies and CI/CD deployment models.

Roles & Responsibilities:

  • Own the technical data governance practices and ensure data sovereignty, privacy, security, and regulatory compliance.
  • Continuously challenge the status quo of how things have been done in the past.
  • Build a data access strategy to securely democratize data and enable research, modelling, machine learning, and artificial intelligence work.
  • Help define the tools and pipeline patterns our engineers and data engineers use to transform data and support our analytics practice.
  • Work in a cross-functional team to translate business needs into data architecture solutions.
  • Ensure data solutions are built for performance, scalability, and reliability.
  • Mentor junior data architects and team members.
  • Keep current on technology: distributed computing, big data concepts, and architecture.
  • Analyze and manage customer master data using Azure services.
  • Optimize advanced SQL queries and perform data analysis to validate and ensure master data integrity.
  • Leverage Python, PySpark, Azure SQL, and Databricks for scalable data processing and automation (see the sketch after this list).
  • Implement data stewardship processes and workflows, including CI/CD processes.
  • Utilize Azure cloud services for data storage and compute, and assess cost optimization techniques.
  • Contribute to metadata and data modeling activities.
  • Track and manage data issues using tools such as JIRA and document processes in Confluence.
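
To make the PySpark and Databricks responsibilities above more concrete, here is a minimal sketch of a customer master-data validation job of the kind described. The table names, columns, and survivorship rule are illustrative assumptions, not the team's actual implementation, and it assumes a Databricks environment with Delta Lake available.

```python
# Minimal PySpark sketch: validate and de-duplicate a hypothetical customer
# master data table on Databricks. All table and column names are placeholders.
from pyspark.sql import SparkSession, functions as F, Window

spark = SparkSession.builder.appName("customer-master-validation").getOrCreate()

# Assumed source table registered in the metastore
customers = spark.table("raw.customer_master")

# Basic integrity checks: non-null business key and a well-formed email address
validated = (
    customers
    .filter(F.col("customer_id").isNotNull())
    .withColumn("email_valid", F.col("email").rlike(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"))
)

# Keep the latest record per customer_id (a simple survivorship rule)
latest = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
deduped = (
    validated
    .withColumn("rn", F.row_number().over(latest))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

# Persist the curated output as a Delta table for downstream consumers
deduped.write.format("delta").mode("overwrite").saveAsTable("curated.customer_master")
```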

Must-Have Skills:

Data Architecture Design:

  • Design and implement scalable, secure, and high-performance data architectures to support diverse business needs.
  • Develop and maintain data models, data dictionaries, and metadata repositories.
  • Define and enforce data standards, policies, and procedures.

Data Strategy & Governance:

  • Collaborate with stakeholders to maintain data strategies and roadmaps.
  • Establish and maintain data governance frameworks to ensure data quality and compliance, and implement reusable data quality frameworks (see the sketch after this list).
  • Evaluate and recommend new data technologies and tools.
  • Change how we think, act, and utilize our data by performing exploratory and quantitative analytics, data mining, and discovery.
  • Ensure data security and privacy through appropriate access controls and encryption.
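
As a sketch of what a reusable data quality framework could look like, a simple rule-based checker over a PySpark DataFrame might be structured as below. The rule names, helper functions, and example table are assumptions for illustration only.

```python
# Hypothetical reusable data-quality helper: each rule is a named predicate
# evaluated against a PySpark DataFrame; results can feed dashboards or alerts.
from dataclasses import dataclass
from typing import Callable
from pyspark.sql import DataFrame, functions as F

@dataclass
class Rule:
    name: str
    check: Callable[[DataFrame], int]  # returns the number of violating rows

def not_null(column: str) -> Rule:
    return Rule(f"{column}_not_null",
                lambda df: df.filter(F.col(column).isNull()).count())

def unique(column: str) -> Rule:
    return Rule(f"{column}_unique",
                lambda df: df.groupBy(column).count()
                             .filter(F.col("count") > 1).count())

def run_rules(df: DataFrame, rules: list[Rule]) -> dict[str, int]:
    """Apply every rule and return {rule_name: violation_count}."""
    return {rule.name: rule.check(df) for rule in rules}

# Example usage on an assumed curated table:
# violations = run_rules(spark.table("curated.customer_master"),
#                        [not_null("customer_id"), unique("customer_id")])
```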

Data Integration & Management:

  • Design ETL/ELT processes for data integration from various sources.
  • Build software across our data platform, including event-driven data processing, storage, and serving through scalable and highly available APIs, with modern, cutting-edge technologies (an example follows this list).
  • Optimize data storage and retrieval for efficient data access.
  • Work closely with data analysts and business stakeholders to make data easily accessible and understandable to them.
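
For the event-driven processing mentioned above, a minimal Spark Structured Streaming sketch that consumes a Kafka topic and lands events in a Delta table might look like the following. The broker, topic, and paths are placeholders, and it assumes the Kafka connector package is available (as it is on Databricks).

```python
# Minimal Spark Structured Streaming sketch: consume events from a Kafka topic
# and append them to a Delta table for downstream serving.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("event-ingest").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "customer-events")             # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to strings and keep the event time
parsed = events.select(
    F.col("key").cast("string").alias("event_key"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_time"),
)

query = (
    parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/customer-events")  # placeholder path
    .outputMode("append")
    .start("/mnt/curated/customer_events")                             # placeholder path
)
query.awaitTermination()
```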

Cloud Data Solutions:

  • Design and implement cloud-based data solutions using platforms like AWS, Azure, Snowflake, and Databricks.
  • Optimize cloud data storage and processing for cost-effectiveness and performance.
  • Stay up to date with the latest cloud data technologies and trends.
  • Develop and enforce data engineering, security, and data quality standards through automation.

Performance & Optimization:

  • Monitor and optimize data system performance.
  • Troubleshoot and resolve data-related issues.
  • Conduct performance tuning and capacity planning.

Collaboration & Communication:

  • Help us stay ahead of the curve by working closely with data engineers, stream processing specialists, API developers, our DevOps team, and analysts to design systems that can scale elastically.
  • Work closely with business analysts & business users to understand data requirements.
  • Communicate complex technical concepts to non-technical stakeholders.
  • Provide technical leadership and mentorship to junior team members.
  • Think of new ways to make our data platform more scalable, resilient, and reliable, then work across our team to put your ideas into action.
  • Help build and maintain foundational data products such as, but not limited to, Finance, Titles, Content Sales, Theatrical, and Consumer Products.
  • Work closely with various other data engineering teams to roll out new capabilities.
  • Build processes and tools to maintain machine learning pipelines in production (see the scheduling sketch after this list).
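
As an illustration of maintaining machine learning pipelines on a scheduler, here is a minimal Apache Airflow 2.x sketch: a daily DAG that refreshes features, retrains, and validates a model. The DAG name, schedule, and task bodies are placeholders, not a prescribed production setup.

```python
# Hypothetical Airflow 2.x DAG for routine ML pipeline maintenance.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def refresh_features():
    print("Refreshing feature tables...")   # e.g. trigger a Databricks job

def retrain_model():
    print("Retraining the model on the latest features...")

def validate_model():
    print("Comparing new model metrics against the production baseline...")

with DAG(
    dag_id="ml_pipeline_maintenance",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",        # assumes Airflow 2.4+ (use schedule_interval on older versions)
    catchup=False,
) as dag:
    features = PythonOperator(task_id="refresh_features", python_callable=refresh_features)
    train = PythonOperator(task_id="retrain_model", python_callable=retrain_model)
    validate = PythonOperator(task_id="validate_model", python_callable=validate_model)

    features >> train >> validate
```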

Professional Certifications:

  • Any Architect certification
  • Any cloud certification (AWS, Azure, or GCP)

Qualifications & Experiences:

  • Bachelor's degree in computer science, information systems, or information technology.
  • 8+ years of experience building and scaling data platforms, data architecture, data modeling, and data warehousing.
  • Strong experience with cloud data technologies, including Azure, Python scripting, Apache Airflow, and Databricks.
  • Passion for working with data and deriving insights to answer business questions that drive actions and decision-making.
  • Strong experience with SQL and data modelling tools and concepts.
  • Experience with data virtualization and data mesh architectures.
  • Experience with one or more automation and scheduling tools (for example, Redwood or Airflow).
  • Experience leveraging creative analytics techniques and output to tell a story that drives business decisions.
  • Solid business acumen and critical problem-solving ability, with a capacity for strategic thinking.
  • Comfort with ambiguity and the ability to manage multiple projects at the same time.
  • Excellent communication, presentation, and customer relationship skills.

Soft Skills:

  • Strong analytical abilities to assess and improve master data processes and solutions.
  • Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
  • Effective problem-solving skills to address data-related issues and implement scalable solutions.
  • Ability to work individually and effectively with global, virtual teams.
  • Strong vendor management and stakeholder management skills.

HTC Global Services

IT Services and IT Consulting

Troy Michigan
