Platform Data Engineer

Experience: 5 - 10 years

Salary: 7 - 12 Lacs

Posted: 2 months ago | Platform: Naukri

Work Mode

Work from Office

Job Type

Full Time

Job Description

Total Experience: 5 - 10 years
Relevant Experience: 5+ years
Mandatory Skills: Platform Data Engineer with 5+ years of hands-on experience deploying and configuring Airflow on top of a Kubernetes engine (an illustrative sketch follows this description).

Responsibilities:
- Designing and implementing the overall data platform architecture.
- Selecting and integrating various data technologies and tools (e.g., cloud platforms such as AWS, Azure, and GCP, data lakes, data warehouses, streaming platforms such as Kafka, and containerization technologies).
- Ensuring the scalability, reliability, performance, security, and governance of the data platform.
- Setting up and managing the infrastructure for data pipelines, data storage, and data processing.
- Developing internal frameworks and tools to improve the efficiency and usability of the data platform for other data teams (including Data Engineers and Data Scientists).
- Implementing monitoring and observability for the data platform.
- Collaborating with software engineering teams to integrate the data platform with other systems.
- Often involved in capacity planning and cost optimization of the data infrastructure.

Required Tech Stack Experience:
- Strong experience in implementing and managing table formats using Apache Iceberg (version 0.13.2).
- Proficiency in building and maintaining batch processing capabilities using Apache Spark (version 3.4 and above).
- Expertise in developing and managing streaming data processing solutions with Apache Spark Streaming (version 3.4 and above); see the second sketch below.
- Demonstrated ability to implement and administer Role-Based Access Control (RBAC) using Apache Ranger (version 2.6.0).
- Solid understanding and experience in managing messaging platforms, specifically Apache Kafka (version 3.9 and above).
- Experience with near-real-time data storage solutions, particularly RDBMS such as Oracle (version 19c).
- Familiarity with and ability to implement Data Quality (DQ) frameworks, preferably using Great Expectations (version 1.3.4).
- Experience in establishing and utilizing data lineage and data catalog solutions, ideally with OpenLineage and DataHub (version 0.15.0).
- Competence in leveraging query engines for batch processing, such as Trino (version 4.7.0).
- Hands-on experience in managing container platforms, specifically SKE (version 1.29 on AKS).
- Proficiency in utilizing workflow and scheduling tools, such as Airflow (version 2.10.4).
- Experience with ETL/ELT frameworks, including DBT (Data Build Tool).
- Knowledge of and experience with data tokenization technologies, such as Protegrity (version 9.2).

Domain: Banking
Approx. Vendor Billing Rate: 15,000 INR/Day
Work Location: Offshore - 5 days WFO - Bangalore
Mode of Interview (Telephonic / Face to Face / Skype): Teams
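As an illustration of the mandatory Airflow-on-Kubernetes skill, here is a minimal sketch of an Airflow DAG that launches a containerised Spark job as a pod, assuming Airflow 2.x with the apache-airflow-providers-cncf-kubernetes provider installed. The DAG id, namespace, image, and job script are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch (assumptions noted above): schedule a Spark batch job as a
# Kubernetes pod from Airflow. All names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

with DAG(
    dag_id="daily_spark_batch",          # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Launch the job as a pod on the same Kubernetes cluster that hosts Airflow.
    run_spark_job = KubernetesPodOperator(
        task_id="run_spark_job",
        namespace="data-platform",                    # hypothetical namespace
        image="registry.example.com/spark-job:3.4",   # hypothetical image
        cmds=["spark-submit"],
        arguments=["/opt/jobs/daily_load.py"],        # hypothetical job script
        get_logs=True,
    )
```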

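For the streaming side of the stack (Kafka, Spark Structured Streaming, Iceberg), a hedged sketch of the kind of pipeline implied by the requirements, assuming Spark 3.4+ with the Iceberg runtime and Kafka connector on the classpath. The catalog name, warehouse path, brokers, topic, and table identifier are hypothetical placeholders.

```python
# Minimal sketch (assumptions noted above): read events from a Kafka topic and
# append them to an Iceberg table with Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-to-iceberg")
    # Register an Iceberg catalog; the warehouse path is a placeholder.
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3a://example-bucket/warehouse")
    .getOrCreate()
)

# Read the raw event stream from Kafka (placeholder brokers and topic).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")
    .option("subscribe", "events")
    .load()
    .select(
        col("key").cast("string"),
        col("value").cast("string"),
        col("timestamp"),
    )
)

# Append micro-batches into an Iceberg table (namespace assumed to exist).
query = (
    events.writeStream
    .format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/events")
    .toTable("lake.raw.events")
)
query.awaitTermination()
```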
Clifyx Technology

Technology

Innovation City
