Job Description
About The Role
Project Role: Custom Software Engineer
Project Role Description: Develop custom software solutions to design, code, and enhance components across systems or applications. Use modern frameworks and agile practices to deliver scalable, high-performing solutions tailored to specific business needs.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Data Build Tool (dbt)
Minimum 3 years of experience required
Educational Qualification: 15 years of full-time education
Job Title: Data Platform Support Engineer (Snowflake / dbt / Fivetran / Maestro)

Role Overview:
We are looking for a Data Platform Support Engineer to monitor and maintain our Unified Data Platform (UDP) built on Snowflake. The platform ingests data from multiple source systems using Fivetran and HVR and processes it through multiple transformation layers using dbt, with Maestro as the orchestration and scheduling tool. The ideal candidate will be responsible for ensuring the smooth operation of data pipelines, promptly identifying and resolving issues, performing root cause analysis, executing standard operating procedures (SOPs), and communicating effectively with stakeholders and escalation teams.

Key Responsibilities:
- Monitor end-to-end data pipelines and job executions in Maestro across the ingestion (Fivetran/HVR) and transformation (dbt) layers.
- Detect and triage failures or performance issues in data loads and transformations.
- Perform initial troubleshooting and apply defined remedial actions based on SOPs.
- Analyze the root cause of recurring or complex issues and recommend long-term fixes or improvements.
- Communicate effectively with relevant teams (data engineering, platform, and business users) regarding job status, delays, and incidents.
- Escalate issues to Level 2/Level 3 support per the escalation matrix when they are beyond the scope of L1/L2 handling.
- Document incidents and maintain detailed logs of investigations, resolutions, and preventive actions.
- Collaborate with development teams to improve monitoring, alerting, and error-handling processes.
- Contribute to the continuous improvement of SOPs and runbooks.

Required Skills & Experience:
- 3–5 years of experience in data operations, data platform support, or ETL monitoring roles.
- Hands-on exposure to Snowflake and a basic understanding of data warehousing concepts.
- Good SQL knowledge for querying tables, understanding dbt code, and validating data issues (an illustrative query appears after these lists).
- Familiarity with dbt (data build tool): able to read and understand dbt code and interpret dbt logs.
- Experience with job scheduling/orchestration tools (e.g., Maestro, Airflow, Control-M).
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication and incident reporting skills.
- Willingness to work in rotational shifts to ensure full-day support coverage (8:00 AM – 5:30 PM or 12:00 PM – 9:30 PM).
- Bachelor of Engineering or equivalent degree required.

Good to Have:
- Experience with Fivetran, HVR, or similar data integration tools.
- Knowledge of cloud platforms (AWS/Azure/GCP).
- Familiarity with Git-based workflows and CI/CD pipelines for dbt.
- Exposure to ITIL processes (Incident, Problem, Change Management).
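
To illustrate the kind of SQL-based validation this role involves, here is a minimal sketch of freshness and reconciliation checks a support engineer might run in Snowflake. The schema and table names (udp.raw_sales.orders, udp.analytics.fct_orders) are hypothetical placeholders, not actual platform objects; _fivetran_synced is the standard metadata column Fivetran adds to ingested tables.

    -- Freshness check on a Fivetran-ingested table (placeholder names).
    SELECT
        MAX(_fivetran_synced) AS last_synced_at,
        COUNT(*)              AS row_count
    FROM udp.raw_sales.orders;

    -- Reconcile row counts between a raw table and its dbt-transformed model
    -- to spot loads that completed but dropped or duplicated records.
    SELECT
        (SELECT COUNT(*) FROM udp.raw_sales.orders)     AS raw_rows,
        (SELECT COUNT(*) FROM udp.analytics.fct_orders) AS transformed_rows;
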
Qualification: 15 years of full-time education