SUMMARY
About tsworks:
tsworks is a leading technology innovator, providing transformative products and services designed for the digital-first world. Our mission is to provide domain expertise, innovative solutions, and thought leadership to drive exceptional user and customer experiences. Demonstrating this commitment, we have a proven track record of championing digital transformation for industries such as Banking, Travel and Hospitality, and Retail (including e-commerce and omnichannel), as well as Distribution and Supply Chain, delivering impactful solutions that drive efficiency and growth. We take pride in fostering a workplace where your skills, ideas, and attitude shape meaningful customer engagements.
About This Role:
tsworks Technologies India Private Limited is seeking driven and motivated Senior Data Engineers to join its Digital Services Team. You will get hands-on experience with projects employing industry-leading technologies. The role will initially focus on the operational readiness and maintenance of existing applications, and will transition into a build-and-maintain role over the long term.
Requirements
Position: Sr. Data Engineer
Experience: 3 to 10+ Years
Location: Bangalore, India
Required Qualifications
- Strong proficiency in Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Storage, etc.
- Expertise in DevOps and CI/CD implementation
- Good knowledge of SQL
- Excellent communication skills
In This Role, You Will
- Design, implement, and manage scalable and efficient data architecture on the Azure cloud platform.
- Develop and maintain data pipelines for efficient data extraction, transformation, and loading (ETL) processes.
- Perform complex data transformations and processing using Azure Data Factory, Azure Databricks, Snowflake's data processing capabilities, or other relevant tools.
- Develop and maintain data models within Snowflake and related tools to support reporting, analytics, and business intelligence needs.
- Collaborate with cross-functional teams to understand data requirements and design appropriate data integration solutions.
- Integrate data from various sources, both internal and external, ensuring data quality and consistency.
- Ensure data models are designed for scalability, reusability, and flexibility.
- Implement data quality checks, validations, and monitoring processes to ensure data accuracy and integrity across Azure and Snowflake environments.
- Adhere to data governance standards and best practices to maintain data security and compliance.
- Handle performance optimization on the Azure Data Factory (ADF) and Snowflake platforms.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs and deliver actionable insights.
- Provide guidance and mentorship to junior team members to enhance their technical skills.
- Maintain comprehensive documentation for data pipelines, processes, and architecture within both Azure and Snowflake environments, including best practices, standards, and procedures.
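Several of the responsibilities above involve implementing data-quality checks and validations inside pipelines. As a minimal, illustrative sketch only (the `validate_batch` helper, column names, and thresholds are hypothetical, not part of any tsworks or Azure codebase), such a quality gate might look like:

```python
# Hypothetical data-quality gate, as might run inside a pipeline step
# before loading a batch downstream. Column names are illustrative.
def validate_batch(rows, required_cols, min_rows=1):
    """Return a list of human-readable issues; an empty list means the batch passes."""
    issues = []
    # Volume check: guard against empty or truncated extracts.
    if len(rows) < min_rows:
        issues.append(f"expected at least {min_rows} rows, got {len(rows)}")
    # Completeness check: flag rows with missing required values.
    for i, row in enumerate(rows):
        missing = [c for c in required_cols if row.get(c) in (None, "")]
        if missing:
            issues.append(f"row {i}: missing values for {missing}")
    return issues

batch = [
    {"order_id": "1001", "amount": 25.0},
    {"order_id": "1002", "amount": None},  # incomplete row, should be flagged
]
print(validate_batch(batch, ["order_id", "amount"]))
```

In practice the same idea would be expressed with the tooling named above (for example Databricks/Spark jobs or ADF validation activities), with failures routed to monitoring rather than printed.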
Skills & Knowledge
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in Information Technology, designing, developing, and executing solutions.
- 3+ Years of hands-on experience in designing and executing data solutions on Azure cloud platforms as a Data Engineer.
- Strong proficiency in Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Storage, etc.
- Familiarity with Snowflake data platform would be an added advantage.
- Hands-on experience in data modelling and in building batch and real-time pipelines using Python, Java, or JavaScript, and experience working with RESTful APIs.
- Expertise in DevOps and CI/CD implementation.
- Hands-on experience with SQL and NoSQL databases.
- Hands-on experience in data modelling, implementation, and management of OLTP and OLAP systems.
- Experience with data modelling concepts and practices.
- Familiarity with data quality, governance, and security best practices.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
- Familiarity with machine learning concepts and integration of ML pipelines into data workflows.
- Hands-on experience working in an Agile setting.
- Ability to articulate, create, and maintain technical and non-technical documentation.
- Public cloud certifications are desired.