Snowflake Admin

Experience: 5 years

Salary: 0 Lacs

Posted: 1 day ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

About Straive:

Straive is a market-leading Content and Data Technology company providing data services, subject matter expertise, and technology solutions across multiple domains.

Data Analytics & AI Solutions, Data AI-Powered Operations, and Education & Learning form the core pillars of the company’s long-term vision. The company is a specialized solutions provider to business information providers in finance, insurance, legal, real estate, life sciences, and logistics. Straive continues to be the leading content services provider to research and education publishers.

Data Analytics & AI Services: Our Data Solutions business has become critical to our clients' success. We use technology and AI with human experts in the loop to create data assets that our clients use to power their data products and their end customers' workflows. As our clients expect us to become their future-fit Analytics and AI partner, they look to us for help in building enterprise data analytics and AI capabilities.


Website: https://www.straive.com/

Role Overview

We are seeking a Data Platform Operations Engineer to join us in building, automating, and operating our Enterprise Data Platform. This role is ideal for someone with a unique combination of DataOps/DevOps, Data Engineering, and Database Administration expertise. As a key member of our Data & Analytics team, you will ensure our data infrastructure is reliable, scalable, secure, and high-performing—enabling data-driven decision-making across the business.

Key Responsibilities

  • Snowflake Administration:

    Own the administration, monitoring, configuration, and optimization of our Snowflake data warehouse. Implement and automate user/role management, resource monitoring, scaling strategies, and security policies.
  • Fivetran Management:

    Configure, monitor, and troubleshoot Fivetran pipelines for seamless ingestion from SaaS applications, ERPs, and operational databases. Resolve connector failures and optimize sync performance and cost.
  • DataOps/Automation:

    Build/improve CI/CD workflows using Git and other automation tools for data pipeline deployment, testing, and monitoring.
  • Infrastructure as Code (IaC):

    Implement and maintain infrastructure using tools like Terraform and Titan to ensure consistent, repeatable, and auditable environments.
  • Platform Monitoring & Reliability:

    Implement automated checks and alerting across Snowflake, Fivetran, and dbt processes to ensure platform uptime, data freshness, and SLA compliance. Proactively identify and resolve platform issues and performance bottlenecks.
  • Database Performance and Cost Optimization:

    Monitor and optimize database usage (queries, compute, storage) for speed and cost-effectiveness. Partner with data engineers and analysts to optimize SQL and refine warehouse utilization.
  • Security & Compliance:

    Enforce security best practices across the data platform (access controls, encryption, data masking). Support audits and compliance requirements (e.g., SOC2).
  • Data Quality Operations:

    Build and automate data health and quality checks (using dbt tests and/or custom monitors). Rapidly triage and resolve data pipeline incidents with root cause analyses.
  • Documentation & Process:

    Ensure all operational procedures (run books, escalation paths, knowledge base) and infrastructure documentation are accurate, up-to-date, and easily accessible.
  • Collaboration:

    Partner with Data Architects, Data Engineers, and DevOps Engineers to understand data flow requirements, troubleshoot issues, and continuously enhance platform capabilities.
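To give a feel for the Platform Monitoring & Reliability responsibility above, here is a minimal sketch of a data-freshness SLA check. Table names, SLA values, and timestamps are illustrative assumptions only, not part of this role's actual stack; in practice the load timestamps would come from Snowflake or Fivetran metadata.

```python
from datetime import datetime, timedelta, timezone

def stale_tables(last_loaded: dict, sla_minutes: dict, now=None):
    """Return tables whose most recent load is older than their freshness SLA."""
    now = now or datetime.now(timezone.utc)
    breaches = []
    for table, loaded_at in last_loaded.items():
        # Fall back to a 60-minute SLA for tables without an explicit one.
        sla = timedelta(minutes=sla_minutes.get(table, 60))
        if now - loaded_at > sla:
            breaches.append(table)
    return breaches

# Example: 'orders' loaded 30 minutes ago (within SLA); 'invoices' 3 hours ago.
now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
loads = {
    "orders": datetime(2024, 1, 1, 11, 30, tzinfo=timezone.utc),
    "invoices": datetime(2024, 1, 1, 9, 0, tzinfo=timezone.utc),
}
print(stale_tables(loads, {"orders": 60, "invoices": 60}, now=now))  # ['invoices']
```

A real deployment would wire the output of such a check into alerting rather than printing it.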

Required Experience & Skills

  • 5+ years in a DataOps, DevOps, Data Engineering, or Database Administration role in cloud data environments.
  • Hands-on experience administering Snowflake, including security, performance tuning, cost management, and automation.
  • Strong expertise with Fivetran setup, management, and incident troubleshooting.
  • Proficiency in dbt for ELT development, testing, and orchestration.
  • Advanced SQL skills for troubleshooting, diagnostics, and optimization.
  • Proficient with version control (Git) and experience designing/deploying data pipelines in a collaborative environment.
  • Scripting skills (Python, Bash, etc.) for workflow automation, data operations tasks, and deployment pipelines.
  • Experience with cloud platforms (AWS/Azure); knowledge of core services such as IAM, data storage, and data transfer.
  • Strong understanding of platform reliability, monitoring, and observability (alerting, dashboarding, log analysis).
  • Comfortable with Infrastructure as Code concepts and tools (Terraform).
  • Experience working with business and analytics teams to translate ops support needs into scalable technical solutions.
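The dbt testing and data-quality items above ultimately reduce to simple row-level assertions. As a rough illustration (dbt itself expresses these declaratively in YAML; this hand-rolled Python version is only a sketch of the idea), `not_null` and `unique` checks might look like:

```python
def not_null(rows, column):
    """dbt-style not_null test: return indices of rows where the column is None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def unique(rows, column):
    """dbt-style unique test: return values that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes, key=repr)

rows = [{"id": 1}, {"id": 2}, {"id": 2}, {"id": None}]
print(not_null(rows, "id"))  # [3]
print(unique(rows, "id"))    # [2]
```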


Technical Stack

● Required: Snowflake, Terraform, GitHub Actions, AWS, dbt, Fivetran

● Preferred: Titan, Datacoves
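On the cost-optimization side of the role, the core task is comparing consumed compute against a budget. A toy version follows; warehouse names, credit counts, and the per-credit price are made-up assumptions, and real numbers would come from Snowflake's usage views.

```python
def over_budget(credits_used: dict, budget: dict, price_per_credit: float = 3.0):
    """Flag warehouses whose spend (credits * price) exceeds their budget.

    Returns a mapping of warehouse name -> amount over budget.
    """
    return {
        wh: round(used * price_per_credit - budget.get(wh, 0.0), 2)
        for wh, used in credits_used.items()
        if used * price_per_credit > budget.get(wh, 0.0)
    }

# ETL_WH spent 120 credits * 3.0 = 360 against a 300 budget -> 60 over.
print(over_budget({"ETL_WH": 120, "BI_WH": 40}, {"ETL_WH": 300, "BI_WH": 200}))
# {'ETL_WH': 60.0}
```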

Straive | IT Services and IT Consulting | Columbia