2 - 3 years

5.0 - 15.0 Lacs P.A.

Bengaluru

Posted: 3 weeks ago | Platform: Naukri

Skills Required

ETL Tool, Python, SQL, Airflow, Data Pipeline, Snowflake, Cloud Platform, Data Warehousing, ELT

Work Mode

Work from Office

Job Type

Full Time

Job Description

Role & responsibilities

- Design, develop, and maintain scalable ETL/ELT data pipelines
- Work with structured and unstructured data from various sources (APIs, databases, cloud storage, etc.)
- Optimize data workflows and ensure data quality, consistency, and reliability
- Collaborate with cross-functional teams to understand data requirements and deliver solutions
- Maintain and improve our data infrastructure and architecture
- Monitor pipeline performance and troubleshoot issues in real time

Preferred candidate profile

- 2-3 years of experience in data engineering or a similar role
- Proficiency in SQL and Python (or Scala/Java for data processing)
- Experience with ETL tools (e.g., Airflow, dbt, Luigi)
- Familiarity with cloud platforms such as AWS, GCP, or Azure
- Hands-on experience with data warehouses (e.g., Redshift, BigQuery, Snowflake)
- Knowledge of distributed data processing frameworks such as Spark or Hadoop
- Experience with version control systems (e.g., Git)
- Exposure to data modeling and schema design
- Experience working with CI/CD pipelines for data workflows
- Understanding of data privacy and security practices
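For orientation, the kind of pipeline work described above is often expressed as an Airflow DAG. Below is a minimal sketch, assuming Apache Airflow 2.x; the task logic, schedule, and data values are illustrative placeholders and are not part of this posting.

# Minimal extract-transform-load DAG sketch (assumes Apache Airflow 2.4+;
# older 2.x releases use schedule_interval instead of schedule).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Hypothetical: pull raw records from a source API or database.
    return [{"id": 1, "amount": 42.0}]


def transform(ti, **context):
    # Read the extract step's output from XCom and apply a simple cleanup.
    rows = ti.xcom_pull(task_ids="extract")
    return [{**row, "amount": round(row["amount"], 2)} for row in rows]


def load(ti, **context):
    # Hypothetical: write the transformed rows to a warehouse table
    # (e.g., Snowflake or BigQuery) via the appropriate provider hook.
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Would load {len(rows)} rows")


with DAG(
    dag_id="example_etl_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the steps in sequence: extract, then transform, then load.
    extract_task >> transform_task >> load_task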

Information Technology
Hackerville
