Experience: 7-9 years
Salary: 22-35 Lacs
Posted: 2 weeks ago
Work from Office
Full Time
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7-9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark.
- Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse.
- Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices.
- Solid grasp of data governance, metadata tagging, and role-based access control.
- Proven ability to mentor and grow engineers in a matrixed or global environment.
- Strong verbal and written communication skills, with the ability to operate cross-functionally.
- Certifications in Azure, Databricks, or Snowflake are a plus.

Preferred Skills:
- Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management).
- Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control tools, Master Data Management (MDM), and data quality tools.
- Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance).
- Hands-on experience with databases and storage such as Azure SQL DB, Snowflake, MySQL, Cosmos DB, and Blob Storage, along with Python/Unix shell scripting.
- ADF, Databricks, and Azure certifications are a plus.

Technologies We Use:
Databricks, Azure SQL DW/Synapse, Snowflake, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, scripting (PowerShell, Bash), Git, Terraform, Power BI

Responsibilities:
- Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms.
- Lead the technical execution of non-domain-specific initiatives (e.g., reusable dimensions, TLOG standardization, enablement pipelines).
- Architect data models and reusable layers consumed by multiple downstream pods.
- Guide platform-wide patterns such as parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks.
- Mentor and coach team members.
- Partner with product and platform leaders to ensure engineering consistency and delivery excellence.
- Act as an L3 escalation point for operational data issues impacting foundational pipelines.
- Own engineering best practices, sprint planning, and quality across the Enablement pod.
- Contribute to platform discussions and architectural decisions across regions.
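To give a sense of the day-to-day work behind the responsibilities above, here is a minimal, illustrative PySpark sketch of a parameterized, auditable load step of the kind such pipelines typically contain. All names (function, paths, tables, parameters) are hypothetical and do not refer to any existing Riktam pipeline.

from pyspark.sql import SparkSession, functions as F

# Illustrative sketch only: a parameterized load with simple audit columns.
def run_enablement_load(source_path: str, target_table: str, batch_id: str) -> None:
    spark = SparkSession.builder.appName("enablement-load").getOrCreate()

    df = (
        spark.read.parquet(source_path)
        # Audit columns support reconciliation and recovery of a failed batch.
        .withColumn("batch_id", F.lit(batch_id))
        .withColumn("load_ts", F.current_timestamp())
    )

    # Idempotent overwrite of this batch keeps reruns safe after a failure.
    (
        df.write.format("delta")
        .mode("overwrite")
        .option("replaceWhere", f"batch_id = '{batch_id}'")
        .saveAsTable(target_table)
    )

In practice a step like this would typically be triggered from an ADF or Databricks workflow, with batch_id supplied as a run parameter so reruns and audits line up with the orchestration layer.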
Riktam Technology Consulting
Information Technology & Services
51-200 Employees