Experience: 6 years
Salary: 0 Lacs
Posted: 14 hours ago
Work mode: On-site
Employment type: Contractual
threeS Data, a cutting-edge technology startup based in Coimbatore, India, specializes in Data Architecture, Management, Governance, Analytics, Intelligence, Business Intelligence, Automation, and Machine Learning. Founded in 2024, we focus on delivering simple, smart, and significant solutions that meet our clients' desired outcomes. Our engagements are partnerships, dedicated to understanding the complexities of day-to-day operations and offering practical, honest approaches to deliver exceptional results.
· Design, develop, and maintain ETL pipelines using Apache Airflow, making use of its orchestration and monitoring capabilities.
· Translate business requirements into robust data transformation pipelines across cloud data platforms.
· Develop reusable ETL components to support a configuration-driven architecture.
· Integrate data from multiple sources: Redshift, flat files, APIs, Excel, and relational databases.
· Implement transformation logic such as cleansing, standardization, enrichment, and de-duplication (an illustrative window-function de-duplication query appears after this list).
· Manage incremental and full loads, along with slowly changing dimension (SCD) handling strategies.
· Write performant SQL queries for data staging and transformation within Redshift and Snowflake.
· Utilize joins, window functions, and aggregations effectively.
· Apply indexing and query tuning to support high-performance workloads.
· Optimize data pipelines and orchestrations for large-scale data volumes.
· Tune SQL queries and monitor execution plans.
· Implement best practices in distributed data processing and cloud-native optimizations.
· Implement robust error handling and logging in Airflow DAGs.
· Enable retry logic, alerting mechanisms, and failure notifications (see the DAG sketch after this list).
· Conduct unit and integration testing of ETL jobs.
· Validate data outputs against business rules and source systems.
· Support QA during UAT cycles and help resolve data defects.
· Deploy pipelines using Git-based CI/CD practices.
· Schedule and monitor DAGs using Apache Airflow and integrated tools.
· Troubleshoot failures and ensure data pipeline reliability.
· Document data flows, DAG configurations, transformation logic, and operational procedures.
· Maintain change logs and update job dependency charts.
· Work closely with data architects, analysts, and BI teams to define and fulfill data needs.
· Participate in stand-ups, sprint planning, and post-deployment reviews.
· Ensure ETL processes adhere to data security, governance, and privacy regulations (HIPAA, GDPR, etc.).
· Follow naming conventions, version control standards, and deployment protocols.
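
For illustration only, not part of the role description: a minimal sketch of the ROW_NUMBER()-based de-duplication pattern referenced in the responsibilities above. The staging table stg_customers, its columns, and the run_query() helper are hypothetical placeholders; the SQL itself uses standard Redshift/Snowflake window-function syntax.

```python
# Hypothetical de-duplication step: keep the most recent record per business key.
DEDUP_SQL = """
CREATE TABLE stg_customers_dedup AS
SELECT customer_id, email, updated_at
FROM (
    SELECT
        customer_id,
        email,
        updated_at,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id      -- one row per business key
            ORDER BY updated_at DESC      -- most recent record wins
        ) AS row_num
    FROM stg_customers
) ranked
WHERE row_num = 1;
"""


def run_query(sql: str) -> None:
    """Placeholder executor; a real pipeline would run this through a
    Redshift or Snowflake connection (for example via an Airflow provider)."""
    print(sql)


if __name__ == "__main__":
    run_query(DEDUP_SQL)
```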
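
Similarly, the retry, alerting, and failure-notification expectations for Airflow DAGs could look roughly like the sketch below. The DAG id, the PythonOperator task, and the notify_on_failure() callback are assumptions made for illustration; the retry and callback parameters are standard Airflow 2.x settings.

```python
# Hypothetical Airflow DAG showing retry logic and failure alerting (Airflow 2.4+).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # Placeholder alert hook: a real deployment might page on-call or post to
    # Slack/email; here we only log the failed task instance.
    print(f"Task failed: {context['task_instance'].task_id}")


def load_incremental_batch():
    # Placeholder for the actual extract/transform/load step.
    print("Loading incremental batch into staging...")


default_args = {
    "owner": "data-engineering",
    "retries": 3,                              # retry transient failures
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_on_failure,  # alert when retries are exhausted
}

with DAG(
    dag_id="example_incremental_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    PythonOperator(
        task_id="load_incremental_batch",
        python_callable=load_incremental_batch,
    )
```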
o 6+ years of hands-on experience in ETL development.
o Strong understanding of data warehousing concepts and cloud-based data ecosystems.
o Familiarity with handling flat files, APIs, and external sources.
o Experience with job orchestration, error handling, and scalable transformation patterns.
o Ability to work independently and meet deadlines.
o Knowledge of Git, Azure DevOps, or other version control and CI/CD platforms.