Experience

7 - 14 years

Posted: 1 week ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

Key Responsibilities

  • Develop and optimize complex SQL queries, including joins (inner/outer), filters, and aggregations.
  • Work with diverse datasets from multiple database sources, ensuring data quality and integrity.
  • Leverage Python for data manipulation, including functions, iterations, API requests, and JSON flattening (see the sketch after this list).
  • Use Python to interpret, manipulate, and process data to facilitate downstream analysis.
  • Design, implement, and optimize ETL processes and workflows.
  • Manage data ingestion from various formats (e.g., JSON, Parquet, TXT, XLSX) using tools like Informatica, Teradata, DataStage, Talend, and Snowflake.
  • Demonstrate expertise in Azure services, specifically ADF, Databricks, and Azure Data Lake.
  • Create, manage, and optimize cloud-based data pipelines.
  • Integrate data sources via Fivetran or custom connectors (e.g., PySpark, ADF).
  • Lead the implementation of Snowflake as an ETL and storage layer.
  • Ensure seamless data connectivity, including handling semi-structured/unstructured data (a Snowflake sketch follows the Good to Have list).
  • Promote code and manage changes across various environments.
  • Proficient in writing complex SQL scripts, including stored procedures, views, and functions.
  • Hands-on experience with Snowflake in multiple projects.
  • Familiarity with DBT for transformation logic and Fivetran for data ingestion.
  • Strong understanding of data modeling and data warehousing fundamentals.
  • Experience with GitHub for version control and code management.
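
The Python responsibilities above are easiest to picture with code. The following is a minimal sketch, assuming the `requests` library and an invented endpoint and payload shape, of fetching records from an API and flattening nested JSON for downstream loading:

```python
import json
import requests

def flatten(record: dict, parent_key: str = "", sep: str = "_") -> dict:
    """Recursively flatten a nested JSON object into a single-level dict.
    Lists are serialized to JSON strings so rows stay tabular."""
    flat = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten(value, new_key, sep))
        elif isinstance(value, list):
            flat[new_key] = json.dumps(value)
        else:
            flat[new_key] = value
    return flat

# Hypothetical endpoint; a real pipeline would add auth, paging, and retries.
response = requests.get("https://api.example.com/v1/orders", timeout=30)
response.raise_for_status()
# Assumes the API returns a JSON array of objects.
rows = [flatten(rec) for rec in response.json()]
```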

Skills & Experience

  • 7 to 14 years of experience in Data Engineering, with a focus on SQL, Python, ETL, and cloud technologies.
  • Hands-on experience with Snowflake implementation and data pipeline management.
  • In-depth understanding of Azure cloud tools and services, such as ADF, Databricks, and Azure Data Lake.
  • Expertise in designing and managing ETL workflows, data mapping, and ingestion from multiple data sources/formats.
  • Proficient in Python for data interpretation, manipulation, and automation tasks.
  • Strong knowledge of SQL, including advanced techniques such as stored procedures and functions (see the sketch after this list).
  • Experience with GitHub for version control and collaborative development.
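
As a rough illustration of the SQL skills listed above, the sketch below runs a join-plus-filter-plus-aggregation query through snowflake-connector-python; the connection details and the orders/customers/returns tables are placeholders, not details from the posting:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder connection details; real values would come from a secrets store.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="REPORTING",
)

# Representative inner/outer joins, a filter, and an aggregation;
# all table and column names are hypothetical.
QUERY = """
SELECT c.region,
       COUNT(o.order_id) AS order_count,
       SUM(o.amount)     AS revenue
FROM orders o
INNER JOIN customers c ON c.customer_id = o.customer_id
LEFT OUTER JOIN returns r ON r.order_id = o.order_id
WHERE o.order_date >= DATEADD(month, -1, CURRENT_DATE)
  AND r.order_id IS NULL   -- keep only orders that were not returned
GROUP BY c.region
ORDER BY revenue DESC
"""

with conn.cursor() as cur:
    cur.execute(QUERY)
    for region, order_count, revenue in cur.fetchall():
        print(region, order_count, revenue)
conn.close()
```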

Good to Have

  • Experience with other cloud platforms (e.g., AWS, GCP).
  • Familiarity with DataOps and continuous integration/continuous delivery (CI/CD) practices.
  • Prior experience leading or mentoring teams of data engineers.
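
Returning to the semi-structured data responsibility noted earlier: a common Snowflake pattern is to land raw JSON in a VARIANT column and unnest it with LATERAL FLATTEN. The sketch below is hypothetical throughout (connection details, table names, and JSON paths are invented):

```python
import snowflake.connector  # same placeholder connection style as above

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)

statements = [
    # Land raw JSON documents in a single VARIANT column.
    "CREATE TABLE IF NOT EXISTS raw_events (doc VARIANT)",
    # Flatten the nested events array into one row per event;
    # the JSON paths are hypothetical.
    """
    CREATE OR REPLACE VIEW events_flat AS
    SELECT doc:batch_id::STRING         AS batch_id,
           e.value:id::NUMBER           AS event_id,
           e.value:payload.type::STRING AS event_type,
           e.value:ts::TIMESTAMP_NTZ    AS event_ts
    FROM raw_events,
         LATERAL FLATTEN(input => doc:events) e
    """,
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```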
(ref:hirist.tech)
