Posted: 1 week ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

The Role

As a Data Engineer, you will be responsible for designing, building, and maintaining robust data pipelines and workflows. You will work closely with data analysts, data scientists, and other stakeholders to ensure data availability, quality, and reliability, and contribute to our data-driven success.

Key Responsibilities


  • Data Modeling and Transformation: Design, develop, and maintain data models and transformations using dbt to create analysis-ready datasets.
  • ETL Automation: Build and optimize ETL pipelines using Snowflake and Python, ensuring efficient data extraction, transformation, and loading processes.
  • Workflow Automation: Implement and manage data workflows using Apache Airflow, scheduling and orchestrating data-related tasks and jobs.
  • Data Quality: Establish data quality checks and validation procedures to ensure data accuracy and integrity.
  • Collaboration: Collaborate closely with data engineering, data science, and business teams to gather requirements and deliver data solutions.
  • Documentation: Maintain comprehensive documentation for data pipelines, models, workflows, and automation processes.
  • Security: Implement and adhere to data security best practices within Snowflake and other data-related systems.
  • Performance Optimization: Monitor and optimize data pipelines and workflows for performance and scalability.
  • Troubleshooting: Diagnose and resolve data pipeline issues, errors, and performance bottlenecks.
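The "Data Quality" responsibility above can be made concrete with a small sketch. This is an illustrative assumption, not part of the posting: a minimal row-level validation pass of the kind a pipeline might run before loading data into Snowflake (all function, table, and field names here are hypothetical).

```python
# Illustrative sketch only: a minimal data-quality check of the kind
# the "Data Quality" responsibility describes. All names are hypothetical.

def validate_rows(rows, required_fields):
    """Return a list of (row_index, problem) tuples for rows failing checks."""
    problems = []
    for i, row in enumerate(rows):
        for field in required_fields:
            value = row.get(field)
            # Flag missing keys, None values, and empty strings.
            if value is None or value == "":
                problems.append((i, f"missing {field}"))
    return problems

# Example: two rows, one with an empty email field.
rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
]
print(validate_rows(rows, ["id", "email"]))  # [(1, 'missing email')]
```

In practice such checks are often expressed as dbt tests or Airflow task-level validations rather than hand-rolled functions, but the underlying idea is the same: fail fast before bad rows reach downstream consumers.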


Qualifications


  • Bachelor’s degree in Computer Science, Information Technology, or a related field (Master’s preferred).
  • 3+ years of relevant experience required.
  • Proficiency in dbt, Snowflake, Python, and Apache Airflow.
  • Strong SQL skills for data manipulation and querying.
  • Previous experience in data engineering and ETL development.
  • Knowledge of data warehousing concepts and best practices.
  • Excellent problem-solving and communication skills.
  • Ability to work collaboratively in a team-oriented environment.
  • Attention to detail and a commitment to delivering high-quality data solutions.
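The SQL requirement above can be illustrated with a short, self-contained sketch. Note the assumptions: SQLite stands in for Snowflake purely so the example runs anywhere, and the table and column names are hypothetical.

```python
# Illustrative only: a simple aggregation query of the kind the SQL
# requirement describes, run against an in-memory SQLite database
# (standing in for Snowflake; all names are hypothetical).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 10.0), ("bob", 5.0), ("alice", 7.5)],
)

# Total spend per customer, highest first.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY total DESC"
).fetchall()
print(rows)  # [('alice', 17.5), ('bob', 5.0)]
```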

Preferred Qualifications

  • Snowflake certification(s).
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Familiarity with version control systems like Git.
  • Strong scripting skills for automation tasks.
  • Experience with big data technologies (e.g., Spark, Hadoop).

Saras Analytics

Data Analytics

San Francisco
