
Fugetron Corporation

1 job opening at Fugetron Corporation
Snowflake Data Engineer
Location: Mumbai, Mumbai (All Areas)
Experience: 2 - 7 years
Salary: INR 5.0 - 15.0 Lacs P.A.
Work Mode: Work from Office
Employment Type: Full Time

We are seeking a highly skilled Data Engineer with a strong background in Snowflake and Azure Data Factory (ADF), and solid experience in Python and SQL. The ideal candidate will play a critical role in designing and building robust, scalable data pipelines that enable modern cloud-based data platforms, including data warehouses and data lakes.

Key Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines using Snowflake, ADF, and Python to support data warehouse and data lake architectures.
- Build and automate data ingestion pipelines from various structured and semi-structured sources (APIs, flat files, cloud storage, databases) into Snowflake-based data lakes and data warehouses (a minimal ingestion sketch follows this listing).
- Perform full-cycle data migration from on-premise and cloud databases (e.g., Oracle, SQL Server, Redshift, MySQL) to Snowflake.
- Optimize Snowflake workloads: schema design, clustering, partitioning, materialized views, and query performance tuning.
- Develop and orchestrate data workflows using Azure Data Factory pipelines, triggers, and data flows.
- Implement data quality checks, validation processes, and monitoring mechanisms for production pipelines.
- Collaborate with cross-functional teams, including Data Scientists, Analysts, and DevOps, to support diverse data needs.
- Ensure data integrity, security, and governance throughout the data lifecycle.
- Maintain comprehensive documentation on pipeline design, schema changes, and architectural decisions.

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 2+ years of hands-on experience with Snowflake, including Snowflake SQL, SnowSQL, Snowpipe, Streams, Tasks, and performance optimization (a Streams and Tasks sketch also follows this listing).
- 1+ year of experience with Azure Data Factory (ADF): pipeline design, linked services, datasets, triggers, and integration runtime.
- Strong Python skills for scripting, automation, and data manipulation.
- Advanced SQL skills: ability to write efficient, complex queries, procedures, and analytical expressions.
- Experience designing and implementing data lakes and data warehouses on cloud platforms.
- Familiarity with Azure cloud services, including Azure Data Lake Storage (ADLS), Blob Storage, Azure SQL, and Azure DevOps.
- Experience with orchestration tools such as Airflow, dbt, or Prefect is a plus.
- Understanding of data modeling, data warehousing principles, and ETL/ELT best practices.
- Experience in building scalable data architectures for analytics and business intelligence use cases.

Preferred Qualifications (Nice to Have)
- Experience with CI/CD pipelines for data engineering (e.g., Azure DevOps, GitHub Actions).
- Familiarity with Delta Lake, Parquet, or other big data formats.
- Knowledge of data security and governance tools such as Purview or Informatica.
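For illustration only (not part of the role description), below is a minimal sketch of the kind of flat-file ingestion into Snowflake mentioned under Key Responsibilities, using the snowflake-connector-python package. The account settings, warehouse, database, schema, stage, table, and file names are all hypothetical placeholders.

import os
import snowflake.connector

# Connection details are assumed to come from environment variables;
# the warehouse, database, and schema names below are hypothetical.
conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()
try:
    # Create a target table and an internal stage (hypothetical names).
    cur.execute(
        "CREATE TABLE IF NOT EXISTS ORDERS_RAW "
        "(ORDER_ID NUMBER, ORDER_TS TIMESTAMP_NTZ, AMOUNT NUMBER(10,2))"
    )
    cur.execute(
        "CREATE STAGE IF NOT EXISTS ORDERS_STAGE "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Upload a local flat file to the stage, then bulk-load it with COPY INTO.
    cur.execute("PUT file:///data/orders_2024.csv @ORDERS_STAGE AUTO_COMPRESS = TRUE")
    cur.execute(
        "COPY INTO ORDERS_RAW FROM @ORDERS_STAGE "
        "PATTERN = '.*orders_2024.*' ON_ERROR = 'ABORT_STATEMENT'"
    )
finally:
    cur.close()
    conn.close()

In practice such a script would typically be parameterized and triggered from an orchestrator such as ADF or Airflow rather than run by hand.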
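Similarly, a hedged sketch of the Streams and Tasks usage listed under Required Skills & Qualifications: a stream captures changes on the raw table, and a scheduled task merges new rows into a curated table whenever the stream has data. Object names, the schedule, and the ORDERS_CURATED target table are hypothetical; the function expects a cursor opened with the same connector pattern as in the previous sketch.

def create_incremental_load_objects(cur):
    """Create a stream on the raw table and a scheduled task that loads
    new rows into a curated table (hypothetical object names)."""
    # The stream tracks changes on ORDERS_RAW since its last consumption.
    cur.execute("CREATE STREAM IF NOT EXISTS ORDERS_RAW_STREAM ON TABLE ORDERS_RAW")
    # The task runs on a schedule, but only when the stream has new data.
    cur.execute("""
        CREATE TASK IF NOT EXISTS LOAD_ORDERS_CURATED
          WAREHOUSE = ETL_WH
          SCHEDULE = '15 MINUTE'
          WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_RAW_STREAM')
        AS
          INSERT INTO ORDERS_CURATED (ORDER_ID, ORDER_TS, AMOUNT)
          SELECT ORDER_ID, ORDER_TS, AMOUNT FROM ORDERS_RAW_STREAM
    """)
    # Tasks are created in a suspended state; resume to start the schedule.
    cur.execute("ALTER TASK LOAD_ORDERS_CURATED RESUME")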