
Data Architect
R Systems

Experience: 0 years

Salary: 0 Lacs

Posted: 1 week ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Technical Requirements

- SQL (advanced level): Strong command of complex SQL logic, including window functions, CTEs, and pivot/unpivot; proficient in stored procedure and SQL script development. Experience writing maintainable SQL for transformations (illustrated in the first sketch below).
- Python for ETL: Ability to write modular, reusable ETL logic in Python. Familiarity with JSON manipulation and API consumption (illustrated in the second sketch below).
- ETL pipeline development: Experience building ETL/ELT pipelines, including data profiling, validation, quality/health checks, error handling, logging, and notifications.

Mandatory Skills

- PySpark and Yellowbrick.

Nice-to-Have Skills

- Experience with AWS Redshift, Databricks, and Yellowbrick.
- Knowledge of CI/CD practices for data workflows.

Key Responsibilities

- Collaborate with data analysts and architects to develop and test robust ETL pipelines using SQL and Python in Databricks and Yellowbrick.
- Perform data quality checks; develop and maintain validation frameworks to ensure high data quality and reliability.
- Optimize database queries for performance and cost-efficient data processing.
- Leverage expertise in AWS Redshift, PostgreSQL, Databricks, and Yellowbrick to design and implement scalable data solutions.
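As a minimal illustration of the advanced SQL and PySpark work described above, the first sketch runs a CTE with a window function through Spark SQL, as one might on Databricks. It is a sketch only: the source table raw_orders, the target table curated.latest_orders, and the column names are hypothetical, not part of the posting.

    # Illustrative only: a PySpark step running advanced SQL (CTE + window function).
    # Table and column names (raw_orders, customer_id, order_ts, amount) are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("etl_sql_example").getOrCreate()

    # Keep only the most recent order per customer using ROW_NUMBER over a window.
    latest_orders = spark.sql("""
        WITH ranked AS (
            SELECT
                customer_id,
                order_ts,
                amount,
                ROW_NUMBER() OVER (
                    PARTITION BY customer_id
                    ORDER BY order_ts DESC
                ) AS rn
            FROM raw_orders
        )
        SELECT customer_id, order_ts, amount
        FROM ranked
        WHERE rn = 1
    """)

    # Persist the transformed result for downstream consumers (hypothetical target table).
    latest_orders.write.mode("overwrite").saveAsTable("curated.latest_orders")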
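The second sketch outlines, under stated assumptions, the kind of modular Python ETL step the description calls for: pulling JSON from an API, applying a basic quality check, logging, and handling errors. The endpoint URL, field names, and output path are hypothetical, and the local JSON write stands in for a real warehouse load.

    # Illustrative only: a small, modular Python ETL step covering API consumption,
    # JSON handling, a basic data quality check, logging, and error handling.
    import json
    import logging

    import requests

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("etl_example")

    API_URL = "https://api.example.com/v1/customers"  # hypothetical endpoint

    def extract(url: str) -> list[dict]:
        """Pull records from a JSON API, raising on HTTP errors."""
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        return response.json()

    def validate(records: list[dict]) -> list[dict]:
        """Simple quality check: keep only records with a non-empty id field."""
        valid = [r for r in records if r.get("id")]
        dropped = len(records) - len(valid)
        if dropped:
            logger.warning("Dropped %d records failing validation", dropped)
        return valid

    def load(records: list[dict], path: str) -> None:
        """Write cleaned records to a local JSON file (stand-in for a warehouse load)."""
        with open(path, "w", encoding="utf-8") as fh:
            json.dump(records, fh)
        logger.info("Loaded %d records to %s", len(records), path)

    if __name__ == "__main__":
        try:
            load(validate(extract(API_URL)), "customers_clean.json")
        except requests.RequestException:
            logger.exception("Extract step failed; a notification hook would fire here")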

