Posted: 2 weeks ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Qualifications:

• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 7–9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark.
• Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse.
• Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices.
• Solid grasp of data governance, metadata tagging, and role-based access control.
• Proven ability to mentor and grow engineers in a matrixed or global environment.
• Strong verbal and written communication skills, with the ability to operate cross-functionally.
• Certifications in Azure, Databricks, or Snowflake are a plus.

Preferred Skills:

• Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management).
• Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control, Master Data Management (MDM), and data quality tools.
• Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance).
• Hands-on experience with data stores and tooling such as Azure SQL DB, Snowflake, MySQL, Cosmos DB, Blob Storage, and Python/Unix shell scripting.
• ADF, Databricks, and Azure certifications are a plus.

Technologies We Use:

Databricks, Azure SQL DW/Synapse, Snowflake, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, scripting (PowerShell, Bash), Git, Terraform, Power BI.

Responsibilities:

• Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms (a minimal sketch of this kind of pipeline step follows the description).
• Lead the technical execution of non-domain-specific initiatives (e.g., reusable dimensions, TLOG standardization, enablement pipelines).
• Architect data models and reusable layers consumed by multiple downstream pods.
• Guide platform-wide patterns such as parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks.
• Mentor and coach team members.
• Partner with product and platform leaders to ensure engineering consistency and delivery excellence.
• Act as an L3 escalation point for operational data issues impacting foundational pipelines.
• Own engineering best practices, sprint planning, and quality across the Enablement pod.
• Contribute to platform discussions and architectural decisions across regions.
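For context, the responsibilities above center on parameterized, auditable pipeline steps. Below is a minimal PySpark sketch of what one such step might look like; the paths, column names (load_date, processed_at), and audit layout are illustrative assumptions for this example, not details taken from the posting.

```python
# Minimal sketch of a parameterized, auditable load step (illustrative only;
# paths, columns, and the audit layout are assumptions, not from the posting).
import argparse
from datetime import datetime, timezone

from pyspark.sql import SparkSession, functions as F


def run_load(spark, source_path: str, target_path: str, load_date: str) -> None:
    """Read one partition of source data, stamp it, write it out, and append
    an audit row with the row count so downstream reconciliation can verify it."""
    df = (
        spark.read.parquet(source_path)
        .where(F.col("load_date") == load_date)          # parameterized incremental filter
        .withColumn("processed_at", F.current_timestamp())
    )

    row_count = df.count()
    df.write.mode("overwrite").parquet(f"{target_path}/load_date={load_date}")

    # Small audit record for reconciliation and pipeline-recovery checks.
    audit = spark.createDataFrame(
        [(load_date, row_count, datetime.now(timezone.utc).isoformat())],
        ["load_date", "row_count", "run_ts"],
    )
    audit.write.mode("append").parquet(f"{target_path}/_audit")


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--source-path", required=True)
    parser.add_argument("--target-path", required=True)
    parser.add_argument("--load-date", required=True)    # e.g. 2024-01-31
    args = parser.parse_args()

    spark = SparkSession.builder.appName("parameterized_load").getOrCreate()
    run_load(spark, args.source_path, args.target_path, args.load_date)
    spark.stop()
```

In practice a step like this would be invoked from an ADF or Databricks job with the parameters supplied by the orchestrator, which is what lets the same code be reused across datasets.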

Riktam Technology Consulting

Information Technology & Services

51-200 Employees

9 Jobs

Key People

• Sandeep Ghosh, CEO
• Srinivasan J, CTO
