We are seeking a highly skilled and motivated Data Ingestion Engineer to join our data platform team. This role involves building and maintaining scalable, reliable, and secure data pipelines that power analytics, reporting, and decision-making across the organization. You will work with tools such as Talend, Azure Data Factory (ADF), dbt, PostgreSQL, Snowflake, and Python to enable smooth data ingestion, transformation, and delivery across cloud and on-premises systems.
________________________________________
Key Responsibilities
- Design, develop, and maintain data ingestion pipelines using Talend, ADF, and Python
- Ingest, transform, and load data from various internal and external sources into Snowflake and PostgreSQL
- Develop and manage modular SQL transformations using dbt
- Build and automate Python scripts for data parsing, validation, and integration as needed (see the sketch after this list)
- Collaborate with data analysts, data scientists, and business teams to understand data requirements and deliver clean, usable datasets
- Monitor and optimize pipeline performance and troubleshoot ingestion issues
- Implement data quality checks, logging, alerting, and governance processes
- Maintain technical documentation of data pipelines and workflows
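
To give a sense of the day-to-day Python work referenced above, here is a minimal illustrative sketch of a parsing-and-validation step with logging; the file name, schema, and field names are hypothetical and are not part of our actual stack:

```python
import csv
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("ingest")

# Hypothetical required schema for an incoming extract
REQUIRED_FIELDS = ("order_id", "customer_id", "amount")

def validate_row(row: dict) -> bool:
    """Basic quality checks: required fields present and amount numeric."""
    if any(not row.get(field) for field in REQUIRED_FIELDS):
        return False
    try:
        float(row["amount"])
    except ValueError:
        return False
    return True

def ingest(path: str) -> list[dict]:
    """Parse a CSV extract, keep valid rows, and log rejects for alerting."""
    good, rejected = [], 0
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            if validate_row(row):
                good.append(row)
            else:
                rejected += 1
                log.warning("rejected row: %s", row)
    log.info("ingested %d rows, rejected %d", len(good), rejected)
    return good

if __name__ == "__main__":
    rows = ingest("orders.csv")  # hypothetical source file
```

In practice, valid rows would then be staged into Snowflake or PostgreSQL, with the reject counts feeding the data quality and alerting processes described above.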
________________________________________
Required Skills & Experience
- 3+ years of experience in data engineering, ETL development, or data integration
- Strong hands-on experience with Talend (Open Studio or Enterprise Edition)
- Experience with Azure Data Factory (ADF) for orchestration and pipeline automation
- Proficiency with Python for scripting, automation, and data manipulation
- Solid working knowledge of Snowflake and PostgreSQL
- Experience using dbt for SQL-based transformation, version control, and documentation
- Strong SQL skills and familiarity with data modeling concepts (e.g., star/snowflake schemas)
- Familiarity with Git and version-controlled development environments
________________________________________
Preferred Qualifications
- Experience with API integration and handling large data files (CSV, JSON, XML); see the batching sketch after this list
- Familiarity with Azure Blob Storage, Azure Data Lake, or similar
- Exposure to cloud data platforms (Azure, AWS, or GCP)
- Understanding of data governance, lineage, and cataloging tools
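
As a rough illustration of what "handling large data files" means in this role, one common pattern is streaming a file in fixed-size batches so memory use stays flat regardless of file size. This is a minimal sketch; the file name and batch size are hypothetical:

```python
import json
from itertools import islice
from typing import Iterator

def read_jsonl_batches(path: str, batch_size: int = 5000) -> Iterator[list[dict]]:
    """Stream a large JSON-lines file in fixed-size batches
    so the whole file is never held in memory at once."""
    with open(path, encoding="utf-8") as fh:
        records = (json.loads(line) for line in fh if line.strip())
        while batch := list(islice(records, batch_size)):
            yield batch

if __name__ == "__main__":
    total = 0
    for batch in read_jsonl_batches("events.jsonl"):  # hypothetical file name
        # Each batch would typically be bulk-loaded into a staging table
        # (e.g., via Snowflake COPY INTO or PostgreSQL COPY).
        total += len(batch)
    print(f"processed {total} records")
```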
Skills: PostgreSQL, Snowflake, Talend