Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines for data extraction, transformation, and loading from diverse sources.
- Perform complex SQL development, including queries, stored procedures, and performance tuning on MS SQL and other relational databases.
- Implement data modeling and schema design for data warehousing and reporting solutions.
- Work on API integrations, including token-based authentication.
- Optimize and troubleshoot ETL processes for performance, scalability, and reliability.
- Develop reusable data integration interfaces and automate processes using scripting (e.g., Unix shell).
- Ensure data quality, integrity, and security across systems.
- Collaborate with cross-functional teams to define data requirements and deliver solutions.

Required Skills & Experience
- Strong knowledge of SQL, with proven ability to write and optimize complex queries.
- Solid understanding of ETL/ELT concepts, data warehousing, and schema design.
- Experience with data ingestion, database schema builds, AWS Glue, Redshift, data importers, and reporting.
- Hands-on experience with MS SQL, DB2, PostgreSQL, and Redshift.
- Experience with cloud infrastructure (AWS, Azure, or GCP preferred).
- Proficiency in scripting languages such as Unix shell and Perl.
- Strong knowledge of API integrations and token-based authentication.
- Proven ability to analyse, debug, and resolve data quality and performance issues.
- Experience in database design and data modelling.