Posted: 8 hours ago
On-site
Full Time
Job Summary:
Data Engineer with strong hands-on experience in Looker, Snowflake, AWS Cloud, and Python (PySpark). This role focuses on creating impactful data visualizations and dashboards, building and optimizing data pipelines, and supporting cloud-based workflows.

Key Responsibilities:
· Design and develop Looker dashboards and data visualizations to meet business requirements.
· Write and optimize complex SQL queries in Snowflake for data transformation and reporting (see the Snowflake sketch after this posting).
· Collaborate with stakeholders to understand reporting needs and translate them into Looker models and Explores.
· Support and maintain data pipelines in AWS Cloud, particularly using Amazon MWAA (Airflow); a minimal DAG sketch follows below.
· Work with PySpark to process large-scale datasets when needed (see the PySpark sketch below).
· Occasionally contribute to infrastructure-as-code tasks using Terraform for cloud resource provisioning.
· Ensure data quality, reliability, and governance across all data workflows.
· Collaborate with cross-functional teams, including data analysts, engineers, and business stakeholders.

Technical Skills:
· 3+ years of experience with Looker.
· Strong hands-on experience with Snowflake and SQL.
· Proficiency in AWS Cloud services, especially Amazon MWAA (Airflow).
· Experience with Python and PySpark for data processing.
· Familiarity with Terraform and Infrastructure as Code (IaC) principles.
· Good understanding of data modeling and BI best practices.

Work Distribution:
· 60% focus on Looker (dashboarding, visualization, SQL in Snowflake)
· 40% focus on AWS Cloud and data engineering (Airflow, PySpark, Terraform)

Preferred Qualifications:
· Experience in agile development environments.
· Excellent problem-solving and communication skills.
· Ability to work independently and manage multiple priorities.
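To make the Snowflake SQL responsibility concrete, here is a minimal sketch of running a transformation from Python with the snowflake-connector-python package. Every connection parameter and table name (raw.orders, reporting.daily_orders) is a hypothetical placeholder, not this employer's actual environment:

```python
# Hypothetical sketch: run a Snowflake transformation from Python.
# All credentials and object names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",    # placeholder
    user="example_user",          # placeholder
    password="***",               # placeholder; prefer a secrets manager in practice
    warehouse="REPORTING_WH",     # placeholder
    database="ANALYTICS",         # placeholder
)
try:
    with conn.cursor() as cur:
        # Rebuild a daily reporting table from a raw orders table.
        cur.execute(
            """
            CREATE OR REPLACE TABLE reporting.daily_orders AS
            SELECT order_date,
                   COUNT(*)    AS order_count,
                   SUM(amount) AS revenue
            FROM raw.orders
            GROUP BY order_date
            """
        )
finally:
    conn.close()
```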
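The pipeline work runs on Amazon MWAA, AWS's managed Airflow service. Below is a minimal sketch of a daily DAG of the kind this role would support, assuming Airflow 2.x; the DAG id, task, and schedule are illustrative only:

```python
# Hypothetical sketch of a daily Airflow DAG as deployed on Amazon MWAA.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def refresh_reporting_table():
    # Placeholder for a Snowflake refresh step, e.g. via a Snowflake
    # operator or the connector shown in the previous sketch.
    print("Refreshing reporting table in Snowflake...")

with DAG(
    dag_id="daily_snowflake_refresh",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    PythonOperator(
        task_id="refresh_reporting_table",
        python_callable=refresh_reporting_table,
    )
```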
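And for the large-scale processing mentioned above, a minimal PySpark sketch that rolls raw events up to a daily grain for BI reporting; the S3 paths and column names are hypothetical:

```python
# Hypothetical sketch: aggregate raw events to a daily grain with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_events_rollup").getOrCreate()

# Read raw event data (placeholder S3 location).
events = spark.read.parquet("s3://example-bucket/raw/events/")

# Derive the event date and count events per day and type.
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Write the curated table for downstream Looker reporting (placeholder path).
daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_events/")
```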
Interested candidates can share their resumes at tisha.vaghela@proclink.com.
Proclink
Hyderabad, Telangana, India
Salary: Not disclosed