Posted: 5 hours ago
Work mode: Hybrid
Employment type: Full Time
Summary:
- Proficiency with Snowflake's cloud data platform and its features, such as Snowpipe, Time Travel, and Zero-Copy Cloning (see the first sketch after this list).
- Experience with SnowSQL, Snowflake's command-line interface, and the ability to write complex SQL queries for large-scale data analytics.
- Understanding of data warehousing concepts and hands-on experience with Snowflake's unique architecture and caching mechanisms.
- Familiarity with data migration strategies to Snowflake from other databases or data warehouses, and knowledge of ETL processes and tools such as Matillion, Talend, or Informatica that integrate with Snowflake.
- Knowledge of Snowflake's security features, such as encryption, data masking, and role-based access control, to ensure data compliance and security (see the second sketch after this list).
- Experience implementing complex ETL pipelines in Snowflake for the ingestion and transformation of diverse data sources, strengthening the analytical data foundation.
- Design, implement, and manage scalable data solutions in the Snowflake environment for optimized data storage and processing.
- Migrate existing data domains and flows from relational data stores to the cloud data store (Snowflake).
- Identify and optimize new and existing data workflows.
- Identify and implement data integrity practices.
- Integrate data governance and data science tools with the Snowflake ecosystem in line with practice standards.
- Support the development of data models and ETL processes to ensure high-quality data ingestion into the cloud data store.
- Collaborate with team members to design and implement effective data workflows and transformations.
- Assist in the maintenance and optimization of Snowflake environments to improve performance and reduce costs.
- Contribute to proofs of concept, documentation, and best practices for data management and governance within the Snowflake ecosystem.
- Participate in code reviews and provide constructive feedback to improve the quality of team deliverables.
- Design and develop data ingestion pipelines using Talend/Informatica, following industry best practices.
- Write efficient SQL and Python scripts for large-dataset analysis and build end-to-end automation processes on a set schedule.
- Design and implement a data distribution layer using the Snowflake REST API.
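As an illustration of the Snowflake features named above, here is a minimal sketch of Time Travel and Zero-Copy Cloning, assuming snowflake-connector-python is installed; the connection parameters, warehouse, database, and table names are hypothetical placeholders.

```python
# A minimal sketch of Snowflake Time Travel and Zero-Copy Cloning;
# all connection parameters and object names below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",  # hypothetical warehouse
    database="SALES_DB",       # hypothetical database
    schema="PUBLIC",
)

with conn.cursor() as cur:
    # Time Travel: query the table as it looked 3600 seconds (1 hour) ago.
    cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
    print("row count one hour ago:", cur.fetchone()[0])

    # Zero-Copy Cloning: create a writable copy that shares the
    # underlying micro-partitions with the source table.
    cur.execute("CREATE TABLE orders_dev CLONE orders")

conn.close()
```

The clone is immediately writable but incurs no additional storage up front, because it shares storage with the source table until either side's data diverges.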
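And a minimal sketch of the security features mentioned above (role-based access control and dynamic data masking); the connection parameters, role, policy, table, and column names are all hypothetical placeholders, and the statements assume a role with sufficient privileges.

```python
# A minimal sketch of Snowflake RBAC grants and a dynamic data
# masking policy; every name below is a hypothetical placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    role="SECURITYADMIN",  # assumed to hold the needed privileges
)

ddl = [
    # RBAC: create an analyst role and grant it read-only access.
    "CREATE ROLE IF NOT EXISTS analyst_role",
    "GRANT USAGE ON DATABASE sales_db TO ROLE analyst_role",
    "GRANT USAGE ON SCHEMA sales_db.public TO ROLE analyst_role",
    "GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.public TO ROLE analyst_role",
    # Dynamic data masking: only PII_ADMIN sees raw email addresses;
    # every other role gets the masked value at query time.
    """CREATE OR REPLACE MASKING POLICY sales_db.public.email_mask
       AS (val STRING) RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
            ELSE '***MASKED***' END""",
    """ALTER TABLE sales_db.public.customers
       MODIFY COLUMN email SET MASKING POLICY sales_db.public.email_mask""",
]

with conn.cursor() as cur:
    for stmt in ddl:
        cur.execute(stmt)
conn.close()
```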
What you’ll bring to the role:
- At least 7 years' relevant experience would generally be expected to develop the skills required for this role.
- Experience in data analysis, data object development, and modelling in the Snowflake data store.
- Snowflake REST API experience (see the sketch after this list).
- Talend ETL experience.
- Experience developing efficient SQL queries and Python scripts.
- Proven ability to work in distributed systems.
- Proficiency with relational databases (such as DB2) and SQL querying, with a focus on data transformations.
- Excellent problem-solving skills and a team-oriented mindset.
- Strong data modelling concepts and schema design for both relational and cloud data stores.
- Strong communication skills, both verbal and written; capable of collaborating effectively across a variety of IT and business groups, regions, and roles.
- Familiarity with data visualization tools such as Tableau and Power BI is a plus.
- Experience collaborating with data scientists and experts to integrate machine learning models into Snowflake.
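For the Snowflake REST API requirement, here is a minimal sketch using Snowflake's SQL API (v2) to serve query results to a downstream consumer. It assumes a key-pair JWT has already been generated elsewhere; the account URL, warehouse, table, and query are hypothetical placeholders.

```python
# A minimal sketch of a data distribution call via Snowflake's SQL
# REST API (v2); the account URL, token, and query are placeholders.
import requests

ACCOUNT_URL = "https://my_account.snowflakecomputing.com"  # hypothetical
JWT_TOKEN = "<key-pair JWT generated elsewhere>"           # placeholder

resp = requests.post(
    f"{ACCOUNT_URL}/api/v2/statements",
    headers={
        "Authorization": f"Bearer {JWT_TOKEN}",
        "X-Snowflake-Authorization-Token-Type": "KEYPAIR_JWT",
        "Content-Type": "application/json",
        "Accept": "application/json",
    },
    json={
        "statement": "SELECT region, SUM(amount) FROM orders GROUP BY region",
        "warehouse": "ANALYTICS_WH",  # hypothetical warehouse
        "timeout": 60,                # seconds before the call aborts
    },
)
resp.raise_for_status()

# Each row in the response's "data" field is a list of column values.
for row in resp.json()["data"]:
    print(row)
```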
Dexian India Technologies