Senior Data Platform Architect

Experience: 9 - 13 years

Salary: 25 - 30 Lacs

Posted: 3 weeks ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Project description
We are seeking a Platform Engineer with deep proficiency and hands-on data engineering experience. This individual should have a comprehensive understanding of both data platforms and software engineering, enabling them to integrate the platform effectively into the wider IT ecosystem.
Responsibilities

  • Manage and optimize data platforms (Databricks, Palantir).

  • Ensure high availability, security, and performance of data systems.

  • Provide valuable insights about data platform usage.

  • Optimize computing and storage for large-scale data processing.

  • Design and maintain system libraries (Python) used in ETL pipelines and platform governance.

  • Optimize ETL Processes: Enhance and tune existing ETL processes for better performance, scalability, and reliability (see the sketch after this list).
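
For illustration only, a minimal PySpark sketch of the kind of ETL tuning this role covers; the paths, table names, and column names are hypothetical and not part of the role description.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical example: tune an existing ETL step by pruning columns early,
# filtering before the join, and controlling output partitioning.
spark = SparkSession.builder.appName("etl-tuning-sketch").getOrCreate()

# Read only the columns downstream consumers need (column pruning),
# and push the date filter down before the join.
trades = (
    spark.read.parquet("/data/raw/trades")  # hypothetical path
    .select("trade_id", "account_id", "amount", "trade_date")
    .filter(F.col("trade_date") >= "2024-01-01")
)

accounts = spark.read.parquet("/data/raw/accounts").select("account_id", "region")

# Broadcast the small dimension table to avoid a shuffle-heavy join.
enriched = trades.join(F.broadcast(accounts), on="account_id", how="left")

# Repartition by the write key so output files are evenly sized.
(
    enriched.repartition("region")
    .write.mode("overwrite")
    .partitionBy("region")
    .parquet("/data/curated/trades_by_region")  # hypothetical path
)
```
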
Skills

Must have

  • Minimum 10 years of experience in IT/Data.

  • Minimum 5 years of experience as a Data Platform Engineer/Data Engineer.

  • Bachelor's degree in IT or a related field.

  • Infrastructure & Cloud: Azure, AWS (expertise in storage, networking, compute).

  • Data Platform Tools: Any of Palantir, Databricks, Snowflake.

  • Programming: Proficiency in PySpark for distributed computing and Python for ETL development.

  • SQL: Expertise in writing and optimizing SQL queries, preferably with experience in databases such as PostgreSQL, MySQL, Oracle, or Snowflake.

  • Data Warehousing: Experience working with data warehousing concepts and platforms, ideally Databricks.

  • ETL Tools: Familiarity with ETL tools and processes.

  • Data Modelling: Experience with dimensional modelling, normalization/denormalization, and schema design.

  • Version Control: Proficiency with version control tools such as Git to manage codebases and collaborate on development.

  • Data Pipeline Monitoring: Familiarity with monitoring tools (e.g., Prometheus, Grafana, or custom monitoring scripts) to track pipeline performance.

  • Data Quality Tools: Experience implementing data validation, cleaning, and quality frameworks, ideally Monte Carlo (a generic validation sketch follows below).

Nice to have

  • Containerization & Orchestration: Docker, Kubernetes.

  • Infrastructure as Code (IaC): Terraform.

  • Understanding of Investment Data domain (desired).

  • Location: Pune, Mumbai, Chennai, Bengaluru
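
As a generic illustration of the data-validation experience listed above (this is not the Monte Carlo API; the dataset path and column names are hypothetical), a lightweight post-load quality check in PySpark might look like this:

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical sketch of simple data-quality checks run after an ETL load.
spark = SparkSession.builder.appName("dq-checks-sketch").getOrCreate()
df = spark.read.parquet("/data/curated/trades_by_region")  # hypothetical path

checks = {
    # Primary key must be present and unique.
    "trade_id_not_null": df.filter(F.col("trade_id").isNull()).count() == 0,
    "trade_id_unique": df.count() == df.select("trade_id").distinct().count(),
    # Amounts should never be negative in this hypothetical dataset.
    "amount_non_negative": df.filter(F.col("amount") < 0).count() == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # In a real pipeline this would raise an alert via the monitoring stack.
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed.")
```
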
    Luxoft

    IT Services and IT Consulting

Zug · New York
