Senior Database Engineer
Posted: 16 hours ago
Platform: On-site | Part Time
As a Senior Database Engineer, you will be responsible for expanding and optimizing our data architecture and data
pipelines. The ideal candidate is an experienced data engineer and data wrangler who enjoys optimizing data
systems and building them from the ground up. The Database Engineer must be self-directed and comfortable
supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the
prospect of data process automation and optimizing or even re-designing our company’s data architecture to
support our next generation of products and data initiatives.
Key Responsibilities:
— Build the data pipeline for optimal extraction, transformation, and loading of data from a wide variety of data
sources using SQL and cloud database technologies.
— Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related
technical issues and support their data needs.
— Assemble large, complex data sets that meet functional / non-functional business requirements.
— Refactor and optimize existing database code to take advantage of new SQL features, close security gaps, and increase robustness and maintainability of the code.
— Identify, design, and implement internal process improvements: automating manual processes and optimizing data delivery for greater scalability, etc.
— Unit test databases and perform bug fixes.
— Develop best practices for database design and development activities.
— Take on technical leadership responsibilities of database projects across various scrum teams.
— Manage exploratory data analysis to support dashboard development (desirable).
Key Requirements:
Experience: 8-12 years preferred.
Required Skills:
— Strong experience in SQL with expertise in relational databases (PostgreSQL preferred, cloud-hosted in
AWS/Azure/GCP).
— Strong experience with cloud-based data warehouses such as Snowflake (preferred) or similar platforms such as Azure
Synapse.
— Competence in data preparation and/or ETL/ELT tools such as IBM StreamSets, SnapLogic, DBT, etc. (preferably
strong working experience in one or more) to build and maintain complex data pipelines and flows that handle
large volumes of data.
— Understanding of data modelling techniques and working knowledge of OLAP systems.
— Ability to fine-tune report-generating queries.
— Understanding of index design and performance-tuning techniques.
— Familiarity with database security techniques such as Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
— Experience in understanding source data from various platforms and mapping it into Entity
Relationship (ER) models for data integration and reporting (desirable).
— Exposure to source control tools such as Git and Azure DevOps.
— Understanding of Agile methodologies (Scrum, Kanban).
— Experience with NoSQL databases and migrating data into other types of databases with real-time replication
(desirable).
— Experience with CI/CD automation tools (desirable).
— Programming experience in Golang, Python, or any other programming language, and experience with visualization tools (Power
BI/Tableau) (desirable).
Personal Attributes:
— Very good communication skills.
— Ability to understand requirements, user stories, and acceptance criteria.
— Willingness to collaborate with distributed teams across different time zones.
At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity.
Diversity, equity and inclusion (tietoevry.com)