Data Platform Engineer

5 - 10 years

22 - 30 Lacs

Posted: 7 hours ago | Platform: Naukri


Work Mode

Hybrid

Job Type

Full Time

Job Description

Data Platform Engineer:

Key Position Responsibilities

  • Work with the Data Platform team to design, implement, and maintain scalable, secure, and resilient data platform solutions.
  • Deliver solutions that meet requirements within agreed timeframes, ensuring quality and desired outcomes.
  • Develop and manage data platform infrastructure using CI/CD pipelines and automation tools.
  • Support and manage key data platform services, such as data lakes, data warehouses, stream processing, data transformation, data integration, data quality, and analytics tools. Enable consuming teams within Enterprise Data to effectively utilise these services.
  • Collaborate with the Cloud team to ensure alignment with best practices while managing AWS cloud services consumed by Enterprise Data, including AWS accounts, object storage, containerised applications, relational databases, and database migration services.
  • Collaborate with the data governance team to enhance data security, quality, and regulatory compliance practices.
  • Partner with vendors and the Cloud team to define operational resilience targets, manage risks, and develop controls, procedures, and playbooks for key data platform services, ensuring alignment with observability and alerting standards.
  • Adhere to established incident management and reporting procedures and escalate issues to senior managers as necessary.
  • Stay informed about industry trends, emerging technologies, and market changes in data platforms and cloud services.

Skills, Qualifications, Experience:

  • Minimum 2 years of experience in a data platform, cloud, or DevOps role
  • Able to work independently, self-organise, scope work and adapt to changing requirements
  • Excellent communication and interpersonal skills, both written and verbal, including the ability to present to both technical and non-technical stakeholders
  • At least 2 years' experience with a programming language (Python, Golang, or Scala are a plus)
  • Strong understanding of and/or experience with core AWS services, as well as relational databases (RDS) and Kubernetes (EKS)
  • Experience with at least one data lake or warehousing solution (e.g. Databricks, Snowflake, Redshift), or stream processing/analytics tools (e.g. Kafka, Flink)
  • Proficiency in IaC and CI/CD tools such as Terraform, Buildkite, and GitHub Actions
  • Experience with orchestration frameworks such as Airflow and/or dbt
  • Experience integrating core data management services with BI tools (e.g., Tableau), data science frameworks (e.g., SageMaker, MLflow, TensorFlow), and integration platforms (e.g., Segment, Salesforce Data Cloud)
  • Experience in incident and issue management
  • Committed to ongoing learning and development

Valuelabs

IT Services and IT Consulting

Hyderabad, Telangana
