Posted: 1 day ago
Platform: On-site
Full Time
Role Description

Job Overview:
UST is seeking a skilled Snowflake Engineer with 5 to 7 years of experience to join our team. The ideal candidate will play a key role in the development, implementation, and optimization of data solutions on the Snowflake cloud platform. The candidate should have strong expertise in Snowflake and PySpark, a solid understanding of ETL processes, and proficiency in data engineering and data processing technologies. This role is essential for designing and maintaining high-performance data pipelines and data warehouses, with a focus on scalability, efficient data storage, and data transformation using PySpark.

Key Responsibilities

Snowflake Data Warehouse Development:
- Design, implement, and optimize data warehouses on the Snowflake cloud platform.
- Ensure effective use of Snowflake’s features for scalable, efficient, and high-performance data storage and processing.

Data Pipeline Development:
- Develop, implement, and optimize end-to-end data pipelines on the Snowflake platform.
- Design and maintain ETL workflows to enable seamless data processing across systems.

Data Transformation with PySpark:
- Leverage PySpark for data transformations within the Snowflake environment (see the illustrative sketch after this description).
- Implement complex data cleansing, enrichment, and validation processes using PySpark to ensure the highest data quality.

Collaboration:
- Work closely with cross-functional teams to design data solutions aligned with business requirements.
- Engage with stakeholders to understand business needs and translate them into technical solutions.

Optimization:
- Continuously monitor and optimize data storage, processing, and retrieval performance in Snowflake.
- Leverage Snowflake’s capabilities for scalable data storage and processing to ensure efficient performance.

Required Qualifications

Experience:
- 5 to 7 years of experience as a Data Engineer, with a strong emphasis on Snowflake.
- Proven experience in designing, implementing, and optimizing data warehouses on the Snowflake platform.
- Expertise in PySpark for data processing and analytics.

Technical Skills:
- Snowflake: Strong knowledge of Snowflake architecture, features, and best practices for data storage and performance optimization.
- PySpark: Proficiency in PySpark for data transformation, cleansing, and processing within the Snowflake environment.
- ETL: Experience with ETL processes to extract, transform, and load data into Snowflake.
- Programming Languages: Proficiency in Python, SQL, or Scala for data processing and transformations.
- Data Modeling: Experience with data modeling techniques and designing efficient data schemas for optimal performance in Snowflake.

Skills: Snowflake, PySpark, SQL, ETL
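For the "Data Transformation with PySpark" responsibility above, the following is a minimal, hypothetical sketch of the kind of work described: cleansing and enriching a DataFrame with PySpark, then loading it into Snowflake through the Spark-Snowflake connector. It assumes the spark-snowflake connector and Snowflake JDBC driver are available on the Spark classpath; all table, column, path, and connection names are illustrative placeholders, not details from this posting.

```python
# Hypothetical sketch: PySpark cleansing/enrichment followed by a load into
# Snowflake via the Spark-Snowflake connector. Names and credentials below
# are placeholders; a real job would pull secrets from a vault or config.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-cleanse-load").getOrCreate()

# Read raw source data (path is a placeholder).
raw = spark.read.parquet("s3://example-bucket/raw/orders/")

# Cleansing and validation: deduplicate, drop rows missing the key,
# normalize a string column, and derive a simple enrichment column.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("country", F.upper(F.trim(F.col("country"))))
       .withColumn("order_value", F.col("quantity") * F.col("unit_price"))
)

# Connection options for the Spark-Snowflake connector.
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "STAGING",
    "sfWarehouse": "TRANSFORM_WH",
}

# Write the transformed DataFrame into a Snowflake table.
(
    cleaned.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "ORDERS_CLEAN")
    .mode("overwrite")
    .save()
)
```

In practice, similar transformations could also be pushed down into Snowflake itself (for example via SQL or Snowpark); the connector-based pattern shown here is just one common way of pairing PySpark with Snowflake.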
UST
Location: Thiruvananthapuram
Salary: 4.4375 - 7.65 Lacs P.A.