Home
Jobs

182 Azure Synapse Jobs - Page 8

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3 - 8 years

3 - 8 Lacs

Hyderabad

Work from Office

Source: Naukri

Name of Organization: Jarus Technologies (India) Pvt. Ltd.
Organization Website: www.jarustech.com
Position: Senior Software Engineer - Data Warehouse
Domain Knowledge: Insurance (Mandatory)
Job Type: Permanent
Location: Hyderabad - IDA Cherlapally, ECIL and Divyasree Trinity, Hi-Tech City
Experience: 3+ years
Education: B.E. / B.Tech. / M.C.A.
Resource Availability: Immediate, or within a maximum of 30 days

Technical Skills:
• Strong knowledge of data warehousing concepts and technologies
• Proficiency in SQL and other database languages
• Experience with ETL tools (e.g., Informatica, Talend, SSIS)
• Familiarity with data modelling techniques
• Experience building dimensional data modelling objects, dimensions, and facts
• Experience with cloud-based data warehouse platforms (e.g., AWS Redshift, Azure Synapse, Google BigQuery)
• Familiarity with optimizing SQL queries and improving ETL processes for better performance
• Knowledge of data transformation, cleansing, and validation techniques
• Experience with incremental loads, change data capture (CDC), and data scheduling
• Comfortable with version control systems like Git
• Familiarity with BI tools like Power BI for visualization and reporting

Responsibilities:
• Design, develop, and maintain data warehouse systems and ETL (Extract, Transform, Load) processes
• Develop and optimize data models and schemas to support business needs
• Design and implement data warehouse architectures, including physical and logical designs
• Design and develop dimensions, facts, and bridges
• Ensure data quality and integrity throughout the ETL process
• Design and implement relational and multidimensional database structures
• Understand data structures and fundamental design principles of data warehouses
• Analyze and modify data structures to adapt them to business needs
• Identify and resolve data quality issues and data warehouse problems
• Debug ETL processes and data warehouse queries

Communication Skills:
• Good communication skills to interact with customers
• Ability to understand requirements for implementing an insurance warehouse system
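The incremental loads and change data capture (CDC) named in the skills above boil down to extracting only rows changed since the last run. A minimal watermark-based sketch, with invented table rows and field names, looks like this:

```python
# Hypothetical watermark-based incremental extract; row shape and dates
# are illustrative, not any employer's actual schema.
from datetime import datetime

def incremental_extract(source_rows, last_watermark):
    """Return rows modified after the stored watermark, plus the advanced watermark."""
    new_rows = [r for r in source_rows if r["modified_at"] > last_watermark]
    new_watermark = max((r["modified_at"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

rows = [
    {"id": 1, "modified_at": datetime(2024, 1, 1)},
    {"id": 2, "modified_at": datetime(2024, 2, 1)},
    {"id": 3, "modified_at": datetime(2024, 3, 1)},
]
delta, wm = incremental_extract(rows, datetime(2024, 1, 15))
# delta holds only ids 2 and 3; wm advances to the newest change seen
```

In a real warehouse the watermark would be persisted (e.g., in a control table) between scheduled runs rather than passed in memory.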

Posted 1 month ago

Apply

3 - 5 years

6 - 8 Lacs

Pune

Work from Office

Source: Naukri

Job Title: Senior Data Engineer
Experience Required: 3 to 5 Years
Location: Baner, Pune
Job Type: Full-Time (WFO)

Job Summary
We are seeking a highly skilled and motivated Senior Data Engineer to join our dynamic team. The ideal candidate will have extensive experience building and managing scalable data pipelines, working with cloud platforms such as Microsoft Azure and AWS, and using tools such as data lakes, PySpark, and Azure Data Factory. The role involves collaborating with cross-functional teams to design and implement robust data solutions that support business intelligence, analytics, and decision-making.

Key Responsibilities
• Design, develop, and maintain scalable ETL pipelines to ingest, transform, and process large datasets from various sources
• Build and optimize data pipelines and architectures for efficient and secure data processing
• Work extensively with Azure Data Lake, Azure Data Factory, and Azure Synapse Analytics for cloud data integration and management
• Utilize Databricks and PySpark for advanced big data processing and analytics
• Implement data modelling and design data warehouses to support business intelligence tools like Power BI
• Ensure data quality, governance, and security using Azure DevOps and Azure Functions
• Develop and maintain SQL Server databases and write optimized SQL queries for analytics and reporting
• Collaborate with stakeholders to gather requirements and translate them into effective data engineering solutions
• Implement data architecture best practices to support big data initiatives and analytics use cases
• Monitor, troubleshoot, and improve data workflows to ensure seamless data flow

Required Skills and Qualifications
Educational Background: Bachelor's or master's degree in computer science, information systems, or a related field
Technical Skills:
• Strong expertise in ETL development, data engineering, and data pipeline development
• Proficiency in Azure Data Lake, Azure Data Factory, and Azure Synapse Analytics
• Advanced knowledge of Databricks, PySpark, and Python for data processing
• Hands-on experience with SQL Azure, SQL Server, and data warehousing solutions
• Knowledge of Power BI for reporting and dashboard creation
• Familiarity with Azure Functions, Azure DevOps, and cloud computing in Microsoft Azure
• Understanding of data architecture and data modelling principles
• Experience with big data tools and frameworks
Experience:
• Proven experience designing and implementing large-scale data processing systems
• Hands-on experience with data warehouses (DWH) and big data workloads
• Ability to work with both structured and unstructured datasets
Soft Skills:
• Strong problem-solving and analytical skills
• Excellent communication and collaboration abilities to work effectively in a team
• A proactive mindset with a passion for learning and adopting new technologies

Preferred Skills
• Experience with Azure data warehouse technologies
• Knowledge of Azure Machine Learning or similar AI/ML frameworks
• Familiarity with data governance and data compliance practices
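The extract → transform → load work this role describes can be sketched minimally in plain Python; in practice it would run as PySpark on Databricks or as Azure Data Factory activities, and every name below is invented for illustration:

```python
# Toy ETL pipeline: each stage is a plain function, composed at the end.
# Source data, field names, and the in-memory "warehouse" are stand-ins.
def extract():
    # stand-in for reading raw records from a data lake or source system
    return [{"order_id": 1, "amount": "100.5"}, {"order_id": 2, "amount": "bad"}]

def transform(rows):
    # cast and validate; drop malformed records
    clean = []
    for r in rows:
        try:
            clean.append({"order_id": r["order_id"], "amount": float(r["amount"])})
        except ValueError:
            continue  # in production this row would go to a quarantine table
    return clean

def load(rows, sink):
    sink.extend(rows)      # stand-in for writing to a warehouse table
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
# the malformed row is dropped; one clean row reaches the warehouse
```

The same staged shape (with quarantining instead of silently dropping rows) is what the "data quality, governance, and security" bullet is asking a candidate to build at scale.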

Posted 1 month ago

Apply

6 - 11 years

12 - 17 Lacs

Gurgaon

Work from Office

Source: Naukri

Job Responsibilities / Skill Set Needed
• Data Architecture and Management: Understanding of Azure SQL technology, including SQL databases, operational data stores, and data transformation processes
• Azure Data Factory: Expertise in using Azure Data Factory for ETL processes, including creating and managing pipelines
• Python Programming: Proficiency in writing Python scripts, particularly using the pandas library, for data cleaning and transformation tasks
• Azure Functions: Experience with Azure Functions for handling and processing Excel files, making them suitable for database import
• API Integration: Skills in integrating various data sources, including APIs, into the data warehouse

Preferred candidate profile
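The pandas cleaning step this listing names (preparing spreadsheet data for database import) typically means normalising column names, dropping incomplete rows, and coercing types. A minimal sketch, with invented column names and values:

```python
# Illustrative pandas clean-up before database import; the columns
# ("Customer ID", "Amount") and rules are assumptions, not the employer's spec.
import pandas as pd

def clean_for_import(df):
    df = df.copy()
    # snake_case the headers so they match database column names
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.dropna(subset=["customer_id"])  # key column must be present
    # coerce bad numeric strings to NaN, then default them to 0.0
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0.0)
    return df

raw = pd.DataFrame({
    "Customer ID": [1, None, 3],
    "Amount": ["10.5", "7", "oops"],
})
clean = clean_for_import(raw)
# the row with a missing customer_id is dropped; the bad amount becomes 0.0
```

In the Azure Functions scenario described, `raw` would come from `pd.read_excel` on the uploaded file rather than an inline literal.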

Posted 1 month ago

Apply

8 - 13 years

15 - 30 Lacs

Bengaluru

Work from Office

Source: Naukri

Design, develop, and maintain scalable ETL pipelines, data lakes, and hosting solutions using Azure tools. Ensure data quality, performance optimization, and compliance across hybrid and cloud environments.

Required Candidate Profile
Data engineer with experience in Azure data services, ETL workflows, scripting, and data modeling. Strong collaboration with analytics teams and hands-on pipeline deployment using best practices.

Posted 1 month ago

Apply

8 - 12 years

20 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Source: Naukri

Role & Responsibilities
As a Cloud Technical Lead - Data, you will:
• Build and maintain data pipelines to enable faster, better, data-informed decision-making through customer enterprise business analytics
• Collaborate with stakeholders to understand their strategic objectives and identify opportunities to leverage data and improve data quality
• Design, develop, and maintain large-scale data solutions on the Azure cloud platform
• Implement ETL pipelines using Azure Data Factory, Azure Databricks, and related services
• Develop and deploy data models and data warehousing solutions using Azure Synapse Analytics and Azure SQL Database
• Build performant, robust, and resilient data storage solutions using Azure Blob Storage, Azure Data Lake, Snowflake, and related services
• Develop and implement data security policies to ensure compliance with industry standards
• Provide support for data-related issues and mentor junior data engineers on the team
• Define and manage data governance policies to ensure data quality and compliance with industry standards
• Collaborate with data architects, data scientists, developers, and business stakeholders to design data solutions that meet business requirements
• Coordinate with users to understand data needs and deliver data with a focus on quality, reuse, consistency, security, and regulatory compliance
• Conceptualize and visualize data frameworks

Preferred Candidate Profile
• Bachelor's degree in computer science, information technology, or a related field
• 8+ years of experience in data engineering, with 3+ years of hands-on Databricks experience
• Strong expertise in the Microsoft Azure cloud platform and services, particularly Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure SQL Database
• Extensive experience working with large data sets, with hands-on skills in data architecture, data modeling, and database design
• Strong programming skills in SQL, Python, and PySpark
• Experience with Unity Catalog and dbt, plus data governance knowledge
• Good to have: experience with Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, the optimizer, metadata management, data sharing, and stored procedures
• Experience in an Agile development environment, applying DevOps along with data quality and governance principles
• Good leadership skills to guide and mentor less experienced personnel
• Ability to contribute to continual improvement by suggesting architecture improvements or new technologies, mentoring junior employees, and taking on ad-hoc responsibilities
• Experience with cross-team collaboration and relationship building
• Ability to communicate effectively through presentations and in verbal and written form
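The "data models and data warehousing solutions" bullet usually implies handling slowly changing dimensions. A toy Type 2 upsert (expire the old version, insert the new one) can be sketched as follows; the `current`/`valid_from` layout is a common convention, not this employer's actual schema:

```python
# Illustrative SCD Type 2 upsert over an in-memory dimension table.
def scd2_upsert(dim, key, new_attrs, load_date):
    """Expire the current row for `key` if attributes changed, then insert the new version."""
    for row in dim:
        if row["key"] == key and row["current"]:
            if row["attrs"] == new_attrs:
                return dim  # no change: keep history as-is
            row["current"] = False  # expire the old version
    dim.append({"key": key, "attrs": new_attrs, "valid_from": load_date, "current": True})
    return dim

dim = [{"key": "C1", "attrs": {"city": "Pune"}, "valid_from": "2024-01-01", "current": True}]
scd2_upsert(dim, "C1", {"city": "Mumbai"}, "2024-06-01")
# dim now holds two versions of C1; only the Mumbai row is current
```

In Synapse or Databricks this same logic is typically expressed as a `MERGE` statement rather than a Python loop.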

Posted 1 month ago

Apply

5 - 8 years

15 - 22 Lacs

Gurugram, Chennai

Work from Office

Source: Naukri

Role & Responsibilities
• Experience: 6+ years of experience in data analysis, with at least 2+ years in Data Vault modeling. Prior experience in the financial services domain is highly preferred.
• Technical Skills: Strong proficiency in SQL and hands-on experience with the Data Vault 2.0 methodology. Familiarity with data analysis tools such as Python, R, or SAS. Experience with ETL/ELT tools and cloud data platforms (e.g., Azure Synapse, AWS Redshift, or GCP BigQuery). Knowledge of WhereScape 3D and RED for Data Vault modelling.
• Data Visualization: Proficiency in creating dashboards and reports using tools such as Power BI, Tableau, or Qlik.
• Soft Skills: Excellent analytical thinking and problem-solving abilities. Strong communication skills to collaborate effectively with technical and non-technical stakeholders.
• Knowledge of Financial Services: Understanding of key financial metrics and regulatory requirements, such as Basel III or SOX compliance.
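A core mechanic of the Data Vault 2.0 methodology named above is deriving hub keys by hashing the normalised business key. A minimal sketch, using the common double-pipe delimiter and upper-casing convention (the business key itself is invented):

```python
# Illustrative Data Vault 2.0 hash-key generation for a hub load.
import hashlib

def hash_key(*business_keys):
    """MD5 of the concatenated, trimmed, upper-cased business key (a common DV2.0 convention)."""
    normalised = "||".join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

hub_customer = {}
hk = hash_key(" cust-42 ")
# inserting via the hash key makes hub loads idempotent
hub_customer.setdefault(hk, {"customer_bk": "CUST-42", "load_dts": "2024-06-01"})
```

Because normalisation happens before hashing, the same business key always lands on the same hub row regardless of source-system casing or padding, which is what makes multi-source hub loads repeatable.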

Posted 1 month ago

Apply

8 - 13 years

12 - 20 Lacs

Hyderabad

Remote

Source: Naukri

Mid / Sr. Data Engineers (5-8 years of relevant experience):
a. Own and execute the entire data transformation from Snowflake to an Azure Synapse instance
b. Implement missing metrics and adapt existing ones
c. Set up version control and establish the refresh schedule
d. Create a data quality framework for validation and monitoring

Note: Interested professionals can share their resume at raghav.b@aciinfotech.com
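For a Snowflake-to-Synapse migration, the data quality framework in point (d) usually starts with source/target reconciliation: matching row counts and per-column checksums. A minimal sketch, with the inputs standing in for query results from each platform:

```python
# Illustrative migration reconciliation check; the row dicts stand in for
# result sets fetched from Snowflake (source) and Synapse (target).
def reconcile(source_rows, target_rows, column):
    checks = {
        "row_count_match": len(source_rows) == len(target_rows),
        "checksum_match": sum(r[column] for r in source_rows)
                          == sum(r[column] for r in target_rows),
    }
    checks["passed"] = all(checks.values())
    return checks

snowflake_rows = [{"amount": 10}, {"amount": 20}]
synapse_rows = [{"amount": 10}, {"amount": 20}]
result = reconcile(snowflake_rows, synapse_rows, "amount")
# both checks pass for an identical copy
```

A production framework would run such checks per table on the refresh schedule from point (c) and alert on any failure, rather than returning a dict.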

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies