
Data Engineer - Azure/Synapse Analytics

Experience: 0 years

Salary: 0 Lacs

Posted: 6 days ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Key Responsibilities

- Azure Synapse Development: Design, develop, and optimize data solutions within Azure Synapse Analytics, leveraging its capabilities for data warehousing, data lakes, and big data processing.
- Data Pipeline Development (ADF): Build, manage, and monitor scalable and efficient data pipelines using Azure Data Factory (ADF) for data ingestion, transformation, and orchestration.
- Data Warehousing & Modelling: Apply expertise in data warehousing principles and data modelling techniques to design and implement robust data structures.
- Snowflake & Stored Procedures: Work extensively with Snowflake, including data loading, transformation, and query optimization. Develop and maintain complex stored procedures in various database environments.
- ETL/ELT Processes: Implement and enhance ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes to move data efficiently between disparate systems.
- Data Quality & Monitoring: Implement and ensure adherence to data quality frameworks. Use data monitoring tools to ensure data integrity and reliability (an illustrative check is sketched after this description).
- Job Scheduling: Configure and manage job scheduling for automated data workflows and pipeline execution.
- Data Format Handling: Work proficiently with various data formats, including JSON, XML, CSV, and Parquet (an illustrative format-conversion sketch also follows this description).
- Agile Collaboration: Participate actively in an Agile development environment, using tools such as JIRA for task management and collaboration.
- Communication: Clearly communicate technical concepts and solutions to team members and stakeholders, maintaining a formal and professional tone.

Skills

- Azure Synapse: Good experience with Azure Synapse Analytics.
- Azure Data Factory (ADF): Good experience with Azure Data Factory.
- Snowflake: Good experience with Snowflake.
- Stored Procedures: Strong experience with stored procedures.
- Data Engineering Fundamentals: Experience with ETL/ELT processes, data warehousing, and data modelling.
- Data Quality & Operations: Experience with data quality frameworks, monitoring tools, and job scheduling.
- Data Formats: Knowledge of data formats such as JSON, XML, CSV, and Parquet.
- Agile: Experience with Agile methodology and tools such as JIRA.
- Language: Fluent in English (strong written, verbal, and presentation skills).
- Communication: Good communication and formal writing skills.

(ref:hirist.tech)
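
For illustration only, a minimal sketch of the kind of format handling described above: converting CSV and newline-delimited JSON inputs to Parquet with pandas (a Parquet engine such as pyarrow is assumed to be installed). The file and column names are hypothetical placeholders, not part of this posting.

```python
# Illustrative sketch: convert CSV and JSON inputs to Parquet.
# Assumes pandas plus a Parquet engine (pyarrow or fastparquet) are installed.
import pandas as pd

def csv_to_parquet(csv_path: str, parquet_path: str) -> None:
    """Load a CSV file and persist it as a Parquet file."""
    df = pd.read_csv(csv_path)
    df.to_parquet(parquet_path, index=False)

def json_to_parquet(json_path: str, parquet_path: str) -> None:
    """Load newline-delimited JSON records and persist them as Parquet."""
    df = pd.read_json(json_path, lines=True)
    df.to_parquet(parquet_path, index=False)

if __name__ == "__main__":
    # Hypothetical file names, for demonstration only.
    csv_to_parquet("orders.csv", "orders.parquet")
    json_to_parquet("events.jsonl", "events.parquet")
```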
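
Likewise, a minimal sketch of a data quality check of the kind the "Data Quality & Monitoring" responsibility refers to. The specific rules (required columns, null-fraction limit, minimum row count) are hypothetical examples, not requirements taken from the posting.

```python
# Illustrative sketch: simple data quality checks on a pandas DataFrame.
import pandas as pd

def run_quality_checks(df: pd.DataFrame,
                       required_columns: list[str],
                       max_null_fraction: float = 0.01,
                       min_rows: int = 1) -> list[str]:
    """Return a list of rule violations; an empty list means all checks passed."""
    failures: list[str] = []
    missing = [c for c in required_columns if c not in df.columns]
    if missing:
        failures.append(f"missing columns: {missing}")
    if len(df) < min_rows:
        failures.append(f"row count {len(df)} is below the minimum of {min_rows}")
    for col in required_columns:
        if col in df.columns:
            null_frac = df[col].isna().mean()
            if null_frac > max_null_fraction:
                failures.append(
                    f"column '{col}' null fraction {null_frac:.2%} exceeds {max_null_fraction:.2%}"
                )
    return failures

if __name__ == "__main__":
    # Hypothetical sample data, for demonstration only.
    sample = pd.DataFrame({"order_id": [1, 2, None], "amount": [10.0, 5.5, 7.25]})
    print(run_quality_checks(sample, required_columns=["order_id", "amount"]))
```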
