
26 Azure Fabric Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are seeking a skilled and seasoned Senior Data Engineer to become a valued member of our innovative team. The ideal candidate should possess a solid foundation in data engineering and demonstrate proficiency in Azure, particularly Azure Data Factory (ADF), Azure Fabric, Databricks, and Snowflake. In this role, you will be responsible for the design, construction, and upkeep of data pipelines, ensuring data quality and accessibility, as well as collaborating with various teams to support our data-centric initiatives. Your responsibilities will include crafting, enhancing, and sustaining robust data pipelines utilizing tools such as Azure Data Factory, Azure Fabric, Databricks, and Snowflake. Moreover, you will work closely with data scientists, analysts, and stakeholders to comprehend data requirements, guarantee data availability, and maintain data quality. Implementing and refining ETL processes to efficiently ingest, transform, and load data from diverse sources into data warehouses, data lakes, and Snowflake will also be part of your role. Furthermore, you will play a crucial role in ensuring data integrity and security by adhering to best practices and data governance policies. Monitoring and rectifying data pipelines for timely and accurate data delivery, as well as optimizing data storage and retrieval processes to enhance performance and scalability, will be among your key responsibilities. Staying abreast of industry trends and best practices in data engineering and cloud technologies is essential, along with mentoring and providing guidance to junior data engineers. To qualify for this position, you should hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Additionally, you must have over 5 years of experience in data engineering, with a strong emphasis on Azure, ADF, Azure Fabric, Databricks, and Snowflake. 
Proficiency in SQL, experience in data modeling and database design, and solid programming skills in Python, Scala, or Java are prerequisites. Familiarity with big data technologies such as Apache Spark, Hadoop, and Kafka, along with a sound grasp of data warehousing concepts and solutions, including Azure Synapse Analytics and Snowflake, is highly desirable. Knowledge of data governance, data quality, and data security best practices, exceptional problem-solving skills, and effective communication and collaboration abilities within a team setting are essential. Preferred qualifications include experience with other Azure services such as Azure Blob Storage, Azure SQL Database, and Azure Cosmos DB; familiarity with DevOps practices and tools for CI/CD in data engineering; and certifications in Azure Data Engineering, Snowflake, or related areas.
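The core responsibility in postings like this one is building ETL processes that ingest, transform, and load data. As a rough illustration of that pattern, here is a minimal, self-contained Python sketch of an extract-transform-load step; the sample data, table name, and in-memory SQLite target are hypothetical stand-ins for the Azure sources and Snowflake/Azure SQL sinks named in the ad.

```python
import csv
import io
import sqlite3

# Extract: read rows from a source (a CSV string stands in for a real feed).
RAW = """order_id,amount,region
1,120.50,south
2,80.00,north
3,not_a_number,south
"""

def extract(raw: str):
    return list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types, normalise values, and quarantine bad rows
# instead of failing the whole batch.
def transform(rows):
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append((int(row["order_id"]), float(row["amount"]), row["region"].upper()))
        except ValueError:
            rejected.append(row)  # kept aside for later inspection
    return clean, rejected

# Load: write the clean rows into the target store.
def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
clean, rejected = transform(extract(RAW))
load(clean, conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

Here two of the three rows load and one is quarantined; in ADF or Databricks the same extract-transform-load split maps onto linked services, notebook transformations, and sink datasets.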

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are seeking a skilled and seasoned Senior Data Engineer to become a part of our innovative team. The perfect candidate will possess a solid foundation in data engineering and proficiency in Azure, Azure Data Factory (ADF), Azure Fabric, Databricks, and Snowflake. This position necessitates the creation, development, and upkeep of data pipelines, ensuring data quality and accessibility, and collaborating with various teams to support our data-centric initiatives. Your responsibilities will include designing, developing, and maintaining robust data pipelines utilizing Azure Data Factory, Azure Fabric, Databricks, and Snowflake. You will work closely with data scientists, analysts, and stakeholders to comprehend data requirements and guarantee the availability and quality of data. Implementing and refining ETL processes to handle the ingestion, transformation, and loading of data from diverse sources into data warehouses, data lakes, and Snowflake will also be a key aspect of your role. Additionally, you will be responsible for upholding data integrity and security through the implementation of best practices and compliance with data governance policies. Monitoring and resolving data pipeline issues to ensure the timely and accurate delivery of data, as well as enhancing data storage and retrieval processes to boost performance and scalability, will be essential tasks. It is crucial to stay abreast of industry trends and best practices in data engineering and cloud technologies. Furthermore, you will have the opportunity to mentor and provide guidance to junior data engineers, offering technical expertise and assistance as required. To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, along with over 5 years of experience in data engineering, with a strong emphasis on Azure, ADF, Azure Fabric, Databricks, and Snowflake. 
Proficiency in SQL, experience in data modeling and database design, and strong programming skills in Python, Scala, or Java are also essential. Familiarity with big data technologies such as Apache Spark, Hadoop, and Kafka, as well as a solid grasp of data warehousing concepts and experience with data warehousing solutions (e.g., Azure Synapse Analytics, Snowflake), is required. Knowledge of data governance, data quality, and data security best practices, excellent problem-solving abilities, and effective communication and collaboration skills within a team setting are all highly valued. Preferred qualifications include experience with other Azure services such as Azure Blob Storage, Azure SQL Database, and Azure Cosmos DB; familiarity with DevOps practices and tools for CI/CD in data engineering; and certifications in Azure Data Engineering, Snowflake, or related fields.

Posted 4 days ago

Apply

7.0 - 11.0 years

15 - 25 Lacs

Hyderabad

Hybrid

Role Purpose: The Senior Data Engineer will support and enable the Data Architecture and the Data Strategy, supporting solution architecture and engineering for data ingestion and modelling challenges. The role will support the deduplication of enterprise data tools, working with the Lonza Data Governance Board, Digital Council and IT to drive towards a single Data and Information Architecture. This is a hands-on engineering role with a focus on business and digital transformation, responsible for managing and maintaining the Data Architecture and the solutions that deliver the platform, with operational support and troubleshooting. The Senior Data Engineer will also manage (from a day-to-day delivery perspective, with no reporting-line changes) and coordinate the Data Engineering team members (internal and external) working on the various project implementations.

Experience:
- 7-10 years of experience with digital transformation and data projects.
- Experience in designing, delivering and managing data infrastructures.
- Proficiency in using cloud services (Azure) for data engineering, storage and analytics.
- Strong SQL and NoSQL experience; data modelling.
- Hands-on experience developing pipelines and setting up architectures in Azure Fabric.
- Team management experience (internal and external resources).
- Good understanding of data warehousing, data virtualization and analytics.
- Experience working with data analysts, data scientists and BI teams to deliver on data requirements.
- Data catalogue experience is a plus.
- ETL pipeline design is a plus.
- Python development skills are a plus.
- Real-time data ingestion (e.g. Kafka).

Licenses or Certifications: Beneficial: ITIL, PM, CSM, Six Sigma, Lean.

Knowledge: Good understanding of integration, ETL, API and data sharing concepts. Understanding/awareness of visualization tools is a plus. Knowledge and understanding of relevant legal and regulatory requirements, such as CFR 21 Part 11, the EU General Data Protection Regulation, the Health Insurance Portability and Accountability Act (HIPAA) and the GxP validation process, would be a plus.

Skills: The position requires a pragmatic leader with sound knowledge of data, integration and analytics. Excellent written and verbal communication skills, interpersonal and collaborative skills, and the ability to communicate technical concepts to non-technical audiences. Excellent analytical skills, the ability to manage and contribute to multiple projects under strict timelines, and the ability to work well in a demanding, dynamic environment and meet overall objectives. Project management skills (scheduling and resource management) are a plus. Ability to motivate cross-functional, interdisciplinary teams to achieve tactical and strategic goals. Data catalogue project and team management skills are a plus. Strong SAP skills are a plus.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be joining YASH Technologies, a leading technology integrator focused on helping clients enhance competitiveness, optimize costs, and drive business transformation in an increasingly virtual world. As a Microsoft Fabric Professional, you will work with cutting-edge technologies across Azure Fabric, Azure Data Factory, Azure Databricks, Azure Synapse, Azure SQL, and ETL processes. Your key responsibilities will include creating pipelines, datasets, dataflows, and integration runtimes, and monitoring pipelines in Azure. You will extract, transform, and load data from source systems using Azure Databricks and create SQL scripts for complex queries. Additionally, you will develop Synapse pipelines to migrate data from Gen2 to Azure SQL and work on data migration pipelines to the Azure cloud (Azure SQL). Experience with Azure Data Catalog and with big data batch processing, interactive processing, and real-time processing solutions will be beneficial for this role. While certifications are considered good to have, YASH Technologies provides an inclusive team environment where you are empowered to create a career path aligned with your aspirations. The workplace culture is grounded in principles like flexible work arrangements, emotional positivity, trust, transparency, open collaboration, and all necessary support for realizing business goals. Join us at YASH Technologies for stable employment and a great atmosphere with an ethical corporate culture.

Posted 6 days ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be joining Lexitas, a high-growth company that values strong personal relationships with clients and delivers reliable, accurate, and professional services across offerings such as local and national court reporting, medical record retrieval, process service, registered agent services, and legal talent outsourcing. As a multinational corporation, Lexitas has established a subsidiary in Chennai, India, known as Lexitas India Pvt. Ltd., aimed at becoming the Lexitas Global Capability Center. This center will focus on building a world-class IT development team and evolving into a Shared Services hub for several corporate functions. To learn more about Lexitas, visit https://www.lexitaslegal.com. This is a full-time position based in Chennai, India. In this role, you will lead the design and development of advanced Power BI reports and dashboards, offer guidance on data modeling and DAX calculations, collaborate with stakeholders to define data requirements, ensure data security and compliance, and troubleshoot and optimize Power BI solutions. The ideal candidate should have 6 to 8+ years of experience with reporting tools and 3 to 5+ years of hands-on Power BI development experience, along with proficiency in SQL and data warehouse concepts; expertise in developing and optimizing complex Power BI solutions; experience developing, debugging, and writing complex MS SQL queries; familiarity with data pipeline orchestration and automation; skills in performance tuning of Power BI reports and SQL queries; the ability to architect end-to-end BI solutions; strong communication skills to lead cross-functional teams; and the project management capabilities to deliver results. Certifications in Power BI are highly desirable, as is an understanding of the cloud and Azure Fabric. Qualifications for this position include a bachelor's degree in computer science (a master's degree is preferred) and 8+ years of proven experience.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You have an exciting opportunity to join YASH Technologies as a Microsoft Fabric Professional. As part of our team, you will work with cutting-edge technologies to drive business transformation and create real positive change in an increasingly virtual world. Your main responsibilities will include working with Azure Fabric, Azure Data Factory, Azure Databricks, Azure Synapse, Azure SQL, and ETL processes. You will create pipelines, datasets, dataflows, and integration runtimes, and monitor pipelines to trigger runs. Additionally, you will be involved in extracting, transforming, and loading data from source systems using Azure Databricks, as well as creating SQL scripts for complex queries. Moreover, you will work on creating Synapse pipelines to migrate data from Gen2 to Azure SQL, data migration pipelines to the Azure cloud (Azure SQL), and database migration from an on-prem SQL Server to the Azure Dev environment using Azure DMS and Data Migration Assistant. Experience with Azure Data Catalog and with big data batch processing, interactive processing, and real-time processing solutions is a plus. As a Microsoft Fabric Professional, you are encouraged to pursue relevant certifications to enhance your skills. At YASH Technologies, we provide a supportive and inclusive team environment where you can create a career that aligns with your goals. Our Hyperlearning workplace is built on flexibility, emotional positivity, trust, transparency, and open collaboration to help you achieve your business goals while maintaining stable employment in a great atmosphere with an ethical corporate culture.

Posted 1 week ago

Apply

3.0 - 8.0 years

6 - 15 Lacs

Ahmedabad

Work from Office

Job Description: As an ETL Developer, you will be responsible for designing, building, and maintaining ETL pipelines using the MSBI stack, Azure Data Factory (ADF), and Fabric. You will work closely with data engineers, analysts, and other stakeholders to ensure data is accessible, reliable, and processed efficiently.

Key Responsibilities:
- Design, develop, and deploy ETL pipelines using ADF and Fabric.
- Collaborate with data engineers and analysts to understand data requirements and translate them into efficient ETL processes.
- Optimize data pipelines for performance, scalability, and robustness.
- Integrate data from various sources, including S3, relational databases, and APIs.
- Implement data validation and error handling mechanisms to ensure data quality.
- Monitor and troubleshoot ETL jobs to ensure data accuracy and pipeline reliability.
- Maintain and update existing data pipelines as data sources and requirements evolve.
- Document ETL processes, data models, and pipeline configurations.

Qualifications:
- Experience: 3+ years of experience in ETL development, with a focus on ADF, the MSBI stack, SQL, Power BI, and Fabric.
- Technical Skills: Strong expertise in ADF, the MSBI stack, SQL, and Power BI. Proficiency in programming languages such as Python or Scala. Hands-on experience with ADF, Fabric, Power BI, and MSBI. Solid understanding of data warehousing concepts, data modeling, and ETL best practices. Familiarity with orchestration tools like Apache Airflow is a plus.
- Data Integration: Experience integrating data from diverse sources, including relational databases, APIs, and flat files.
- Problem-Solving: Strong analytical and problem-solving skills with the ability to troubleshoot complex ETL issues.
- Communication: Excellent communication skills, with the ability to work collaboratively with cross-functional teams.
- Education: Bachelor's degree in computer science, engineering, or a related field, or equivalent work experience.

Nice to Have: Experience with data lakes and big data processing. Knowledge of data governance and security practices in a cloud environment.
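The error-handling responsibility this posting lists can be sketched as a small retry-with-backoff wrapper of the kind ETL jobs typically put around flaky source reads. This is an illustrative pattern, not ADF's own mechanism (ADF pipelines configure retries declaratively on activities); the failing source and retry counts below are hypothetical.

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn, retrying on failure with exponential backoff."""
    last_err = None
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as err:  # illustrative catch-all; narrow in real code
            last_err = err
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s, ...
    raise RuntimeError(f"giving up after {attempts} attempts") from last_err

# A flaky "source" that fails twice before succeeding.
calls = {"n": 0}

def flaky_read():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return [{"id": 1}, {"id": 2}]

rows = with_retries(flaky_read)
```

The wrapper absorbs transient failures but still surfaces a hard error once the budget is exhausted, which keeps monitoring honest: the job either recovers quietly or fails loudly.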

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Microsoft Fabric Professional at YASH Technologies, you will work with cutting-edge technologies to bring about real positive change in an increasingly virtual world. You will have the opportunity to contribute to business transformation by leveraging your experience in Azure Fabric, Azure Data Factory, Azure Databricks, Azure Synapse, Azure Storage Services, Azure SQL, ETL, Azure Cosmos DB, Event Hub, Azure Data Catalog, Azure Functions, and Azure Purview. With 5-8 years of experience in Microsoft Cloud solutions, you will be involved in creating pipelines, datasets, dataflows, and integration runtimes, and monitoring pipelines. Your role will also entail extracting, transforming, and loading data from source systems using Azure Databricks, as well as preparing DB design documents based on client requirements. Collaborating with the development team, you will create database structures, queries, and triggers while working on SQL scripts and Synapse pipelines for data migration to Azure SQL. Your responsibilities will include building data migration pipelines to the Azure cloud, migrating databases from an on-prem SQL Server to the Azure Dev environment, and implementing data governance in Azure. Additionally, you will work on data migration pipelines for on-prem SQL Server data to the Azure cloud, utilizing Azure Data Catalog, and will draw on experience with big data batch processing, interactive processing, and real-time processing solutions. Mandatory certifications are required for this role. At YASH Technologies, you will have the opportunity to create a career path tailored to your aspirations within an inclusive team environment. Our Hyperlearning workplace is built on principles of flexible work arrangements, free spirit, emotional positivity, agile self-determination, trust, transparency, open collaboration, support for the realization of business goals, stable employment, and an ethical corporate culture. Join us to embark on a journey of continuous learning, unlearning, and relearning in a dynamic and evolving technology landscape.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Bhubaneswar

On-site

As a Senior Azure Data Engineer, you will be an integral part of our dynamic data team, contributing your expertise and skills to design, build, optimize, and maintain scalable data solutions on the Azure platform. Your primary responsibilities will include developing robust data pipelines using Azure Data Factory, Azure Data Lake, and Azure SQL, as well as working with Azure Fabric, Cosmos DB, and SQL Server to create end-to-end data solutions. You will also be involved in Database Design, Data Modeling, Performance Tuning, and writing complex SQL queries to support data processing and reporting requirements. Proactive optimization strategies and leading data migration efforts will be key aspects of your role, along with collaborating with cross-functional teams to translate business requirements into technical solutions. Maintaining documentation and adhering to industry best practices for security, compliance, and scalability will also be essential. The ideal candidate for this position should possess proven experience in Azure Fabric, SQL Server, Azure Data Factory, Azure Data Lake, and Cosmos DB. Strong hands-on expertise in complex SQL queries, query efficiency and optimization, database design, data modeling, data migration techniques, and performance tuning is also required. A solid understanding of cloud infrastructure and data integration patterns in Azure is essential for success in this role. Nice-to-have qualifications include Microsoft Azure certifications related to Data Engineering or Azure Solutions Architecture, as well as experience working in agile environments with CI/CD practices. The required qualifications for this position include a minimum of 5+ years of experience in the software industry, a B.Tech/M.Tech in CS/IT, or a related field, and excellent verbal and written communication skills. 
If you are a motivated and skilled Senior Azure Data Engineer looking to join a growing team and make a significant impact in the field of data engineering, we encourage you to apply for this exciting opportunity.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Bhubaneswar

On-site

You are a highly skilled and motivated Senior Azure Data Engineer with 5-10 years of experience, sought to join our expanding data team. Your expertise lies in cloud-based data engineering, particularly with hands-on experience in various Azure services. In this role, you will be pivotal in designing, constructing, optimizing, and managing scalable data solutions that align with our business objectives. Your responsibilities will encompass: - Designing and implementing robust and scalable data pipelines utilizing Azure Data Factory, Azure Data Lake, and Azure SQL. - Extensive work with Azure Fabric, Cosmos DB, and SQL Server to create and enhance end-to-end data solutions. - Conducting Database Design, Data Modeling, and Performance Tuning to uphold system reliability and data integrity. - Crafting and refining complex SQL queries to support data ingestion, transformation, and reporting requirements. - Proactively implementing SQL optimization and preventive maintenance strategies for efficient database performance. - Leading data migration initiatives from on-premise to cloud or across various Azure services. - Collaborating with cross-functional teams to gather requirements and translate them into technical solutions. - Maintaining clear documentation and adhering to industry best practices for security, compliance, and scalability. Your required skills include proven experience with Azure Fabric, SQL Server, Azure Data Factory, Azure Data Lake, and Cosmos DB. You must possess strong hands-on expertise in complex SQL queries, SQL query efficiency and optimization, database design, data modeling, data migration techniques, and performance tuning. A solid understanding of cloud infrastructure and data integration patterns in Azure is essential. Nice to have qualifications: - Microsoft Azure certifications related to Data Engineering or Azure Solutions Architecture. - Experience working in agile environments with CI/CD practices. 
To qualify for this role, you must have a minimum of 5+ years of experience in the software industry, hold a B.Tech/M.Tech in CS/IT or a related field, and exhibit excellent verbal and written communication skills. Join us in this exciting opportunity to contribute to our dynamic data team and shape the future of our data solutions.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

10 - 20 Lacs

Chennai, Bengaluru

Hybrid

We are hiring for a Big Data Lead.

Years of Experience: 6-10 yrs
Primary Skill Set: MS Fabric/Azure Fabric, Python/PySpark, SQL
Work Location: Bangalore/Chennai
Work Mode: Hybrid
Notice Period: Immediate - 30 days

Kindly share the following details:
- Updated CV
- Relevant Skills
- Total Experience
- Current Company
- Current CTC
- Expected CTC
- Notice Period
- Current Location
- Preferred Location

Posted 3 weeks ago

Apply

10.0 - 17.0 years

25 - 40 Lacs

Chennai

Work from Office

Extensive experience in big data architecture, with a focus on cloud-native and/or cloud-based services and solutions, and in data processing technologies such as Hadoop, Spark, and Kafka within the cloud ecosystem (AWS, Azure, and GCP).

Posted 1 month ago

Apply

6.0 - 11.0 years

3 - 12 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities:
- Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale.
- Lead Data Engineers to define, build, and maintain the Data Platform.
- Work on building a Data Lake in Azure Fabric, processing data from multiple sources.
- Migrate the existing data store from Azure Synapse to Azure Fabric.
- Implement data governance and access control.
- Drive the development effort end to end for on-time delivery of high-quality solutions that conform to requirements and the architectural vision and comply with all applicable standards.
- Present technical solutions, capabilities, considerations, and features in business terms. Effectively communicate status, issues, and risks in a precise and timely manner.
- Further develop critical initiatives such as Data Discovery, Data Lineage, and Data Quality.
- Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations.
- Build data systems, pipelines, analytical tools, and programs; conduct complex data analysis and report on results.

Qualifications:
- 5+ years of experience as a data engineer or in a similar role in Azure Synapse or ADF, or relevant experience in Azure Fabric.
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field.
- Must have experience executing projects end to end; at least one data engineering project should have used Azure Synapse, ADF, or Azure Fabric.
- Experience handling multiple data sources.
- Technical expertise with data models, data mining, and segmentation techniques.
- Deep understanding, both conceptual and practical, of at least one object-oriented library (Python, PySpark).
- Strong SQL skills and a good understanding of existing SQL warehouses and relational databases.
- Strong Spark, PySpark, and Spark SQL skills and a good understanding of distributed processing frameworks.
- Experience building large-scale batch and real-time data pipelines.
- Ability to work independently and mentor junior resources.
- Desire to lead and develop a team of Data Engineers across multiple levels.
- Experience or knowledge in Data Governance.
- Azure cloud experience with data modeling, CI/CD, Agile methodologies, and Docker/Kubernetes.
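One concrete pattern behind the batch-pipeline and migration work described above is incremental loading with a watermark: each run pulls only rows newer than the last high-water mark. A minimal pure-Python sketch follows; the in-memory source table and ISO-8601 timestamps are hypothetical stand-ins for a Synapse or Fabric lake table, and Spark specifics are deliberately omitted.

```python
# Source table: (id, updated_at) pairs; in practice a database or lake table.
# ISO-8601 timestamp strings compare correctly as plain strings.
SOURCE = [
    (1, "2024-01-01T10:00:00"),
    (2, "2024-01-02T09:30:00"),
    (3, "2024-01-03T12:00:00"),
]

def incremental_load(source, watermark):
    """Return rows strictly newer than the watermark, plus the new watermark."""
    new_rows = [r for r in source if r[1] > watermark]
    new_watermark = max((r[1] for r in new_rows), default=watermark)
    return new_rows, new_watermark

# First run: everything after an initial epoch watermark is pulled.
batch1, wm = incremental_load(SOURCE, "1970-01-01T00:00:00")

# Second run with no new source data: nothing to pull, watermark unchanged.
batch2, wm2 = incremental_load(SOURCE, wm)
```

Persisting the watermark between runs (in a control table or pipeline variable) is what keeps repeated loads idempotent and cheap, which matters when the same pipeline also feeds a Synapse-to-Fabric migration.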

Posted 1 month ago

Apply

7.0 - 12.0 years

2 - 11 Lacs

Hyderabad, Telangana, India

On-site

Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale. Lead Data Engineers to define, build, and maintain the Data Platform. Work on building a Data Lake in Azure Fabric, processing data from multiple sources. Migrate the existing data store from Azure Synapse to Azure Fabric. Implement data governance and access control. Drive the development effort end to end for on-time delivery of high-quality solutions that conform to requirements and the architectural vision and comply with all applicable standards. Present technical solutions, capabilities, considerations, and features in business terms. Effectively communicate status, issues, and risks in a precise and timely manner. Further develop critical initiatives such as Data Discovery, Data Lineage, and Data Quality. Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations. Build data systems, pipelines, analytical tools, and programs. Conduct complex data analysis and report on results. Qualifications: 7+ years of experience as a data engineer or in a similar role in Azure Synapse or ADF, or relevant experience in Azure Fabric. Degree in Computer Science, Data Science, Mathematics, IT, or a similar field.

Posted 1 month ago

Apply

5.0 - 10.0 years

2 - 12 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities: Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale. Lead Data Engineers to define, build, and maintain the Data Platform. Work on building a Data Lake in Azure Fabric, processing data from multiple sources. Migrate the existing data store from Azure Synapse to Azure Fabric. Implement data governance and access control. Drive the development effort end to end for on-time delivery of high-quality solutions that conform to requirements and the architectural vision and comply with all applicable standards. Present technical solutions, capabilities, considerations, and features in business terms. Effectively communicate status, issues, and risks in a precise and timely manner. Further develop critical initiatives such as Data Discovery, Data Lineage, and Data Quality. Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations. Build data systems, pipelines, analytical tools, and programs. Conduct complex data analysis and report on results. Qualifications: 5+ years of experience as a data engineer or in a similar role in Azure Synapse or ADF, or relevant experience in Azure Fabric. Degree in Computer Science, Data Science, Mathematics, IT, or a similar field.

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Design and implement robust and scalable data pipelines using Azure Data Factory, Azure Data Lake, and Azure SQL. Work extensively with Azure Fabric, Cosmos DB, and SQL Server to develop and optimize end-to-end data solutions. Perform database design, data modeling, and performance tuning to ensure system reliability and data integrity. Write and optimize complex SQL queries to support data ingestion, transformation, and reporting needs. Proactively implement SQL optimization and preventive maintenance strategies to ensure efficient database performance. Lead data migration efforts from on-premise to cloud or across Azure services. Collaborate with cross-functional teams to gather requirements and translate them into technical solutions. Maintain clear documentation and follow industry best practices for security, compliance, and scalability. Required Skills: Proven experience working with Azure Fabric, SQL Server, Azure Data Factory, Azure Data Lake, and Cosmos DB. Strong hands-on expertise in complex SQL queries, SQL query efficiency and optimization, database design and data modeling, and data migration techniques and performance tuning. Solid understanding of cloud infrastructure and data integration patterns in Azure. Experience working in agile environments with CI/CD practices. Nice to Have: Microsoft Azure certifications related to Data Engineering or Azure Solutions Architecture. Location: Bengaluru, Hyderabad, Chennai, Pune, Noida, Mumbai.
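"SQL query efficiency and optimization" in practice usually starts with reading the query plan and adding an index where the plan shows a full scan. The SQLite sketch below walks through that workflow; the table and column names are made up, and Azure SQL would use execution plans and CREATE INDEX in the same spirit rather than this exact API.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, "SOUTH" if i % 2 else "NORTH", float(i)) for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN reports how SQLite intends to execute the statement;
    # the human-readable detail is the last column of each plan row.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM orders WHERE region = 'SOUTH'"

before = plan(query)  # without an index, the plan is a full table scan
conn.execute("CREATE INDEX idx_orders_region ON orders(region)")
after = plan(query)   # with the index, the plan searches via idx_orders_region
```

The before/after plan comparison is the point: an optimization only counts once the plan confirms the engine actually uses the index.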

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Pune

Work from Office

Azure Cloud Data Lead

Job Title: Azure Cloud Data Lead
Location: Pune, India
Experience: 7 - 12 Years
Work Mode: Full-time, Office-based

Company Overview: Smartavya Analytica is a niche Data and AI company based in Mumbai, established in 2017. We specialize in data-driven innovation, transforming enterprise data into strategic insights. With expertise spanning 25+ data modernization projects and datasets of up to 24 PB in a single implementation, we have successfully delivered data and AI projects across multiple industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are specialists in Cloud, Hadoop, Big Data, AI, and Analytics, with a strong focus on data modernization for on-premises, private, and public cloud platforms. Visit us at: https://smart-analytica.com

Job Summary: We are looking for a highly experienced Azure Cloud Data Lead to oversee the architecture, design, and delivery of enterprise-scale cloud data solutions. This role demands deep expertise in Azure Data Services, strong hands-on experience with data engineering and governance, and a strategic mindset to guide cloud modernization initiatives across complex environments.

Key Responsibilities:
- Architect and design data lakehouses, data warehouses, and analytics platforms using Azure Data Services.
- Lead implementations using Azure Data Factory (ADF), Azure Synapse Analytics, and Azure Fabric (OneLake ecosystem).
- Define and implement data governance frameworks, including cataloguing, lineage, security, and quality controls.
- Collaborate with business stakeholders, data engineers, and developers to translate business requirements into scalable Azure architectures.
- Ensure platform design meets performance, scalability, security, and regulatory compliance needs.
- Guide migration of on-premises data platforms to Azure cloud environments.
- Create architectural artifacts: solution blueprints, reference architectures, governance models, and best-practice guidelines.
- Collaborate with sales/presales teams in customer meetings to understand business requirements and scope of work, and propose relevant solutions.
- Drive MVPs/PoCs and capability demos for prospective customers and opportunities.

Must-Have Skills:
- 7 - 12 years of experience in data architecture, data engineering, or analytics solutions.
- Hands-on expertise in Azure Cloud services: ADF, Synapse, Azure Fabric (OneLake), and Databricks (good to have).
- Strong understanding of data governance, metadata management, and compliance frameworks (e.g., GDPR, HIPAA).
- Deep knowledge of relational and non-relational databases (SQL, NoSQL) on Azure.
- Experience with security practices (IAM, RBAC, encryption, data masking) in cloud environments.
- Strong client-facing skills with the ability to present complex solutions clearly.

Preferred Certifications:
- Microsoft Certified: Azure Solutions Architect Expert
- Microsoft Certified: Azure Data Engineer Associate
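One of the governance controls this role names (data masking) can be sketched in a few lines: deterministic pseudonymization of a PII column before it lands in a shared zone. This is a hedged, stdlib-only illustration; the key and column layout are hypothetical, and in Azure the secret would live in Key Vault rather than in code.

```python
import hashlib
import hmac

# Hypothetical key for illustration only; never hard-code secrets in practice.
SECRET_KEY = b"example-only-key"

def mask_value(value: str) -> str:
    """Deterministically pseudonymize a value: equal inputs map to equal
    tokens, so joins and distinct counts still work, but the original
    cannot be read back without the key."""
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

rows = [
    {"customer_email": "a@example.com", "spend": 120.0},
    {"customer_email": "b@example.com", "spend": 80.0},
    {"customer_email": "a@example.com", "spend": 40.0},
]
masked = [{**r, "customer_email": mask_value(r["customer_email"])} for r in rows]
# The two rows for a@example.com share one token, so aggregations still work.
print(masked[0]["customer_email"])
```

Keyed (HMAC) masking is chosen over a bare hash so that the tokens cannot be reversed by brute-forcing common email addresses.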

Posted 1 month ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring Microsoft Fabric professionals in the following areas:

Experience: 4-6 Years

Job Description:
- Experience in Azure Fabric, Azure Data Factory, Azure Databricks, Azure Synapse, Azure SQL, and ETL
- Create pipelines, datasets, dataflows, and integration runtimes; monitor pipelines and trigger runs
- Extract, transform, and load data from source systems and process the data in Azure Databricks
- Create SQL scripts to perform complex queries
- Create Synapse pipelines to migrate data from Gen2 to Azure SQL
- Build data migration pipelines to the Azure cloud (Azure SQL)
- Migrate databases from on-premises SQL Server to an Azure dev environment using Azure DMS and Data Migration Assistant
- Experience using Azure Data Catalog
- Experience in big data batch processing, interactive processing, and real-time processing solutions

Certifications: Good to have

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.

Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture
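The pipeline-authoring work in the list above (pipelines, datasets, copy activities) can be illustrated by the abbreviated JSON shape an ADF copy pipeline compiles down to, built here as a plain Python dict so the structure is easy to inspect. Dataset names are hypothetical, and a real definition carries more fields (policies, linked service and integration runtime references).

```python
import json

# Abbreviated sketch of an ADF pipeline definition: one Copy activity
# moving data from an ADLS Gen2 dataset into an Azure SQL dataset.
pipeline = {
    "name": "pl_copy_gen2_to_sql",
    "properties": {
        "activities": [
            {
                "name": "CopyLakeToAzureSql",
                "type": "Copy",
                "inputs": [{"referenceName": "ds_adls_gen2_orders", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "ds_azuresql_orders", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "ParquetSource"},
                    "sink": {"type": "AzureSqlSink", "writeBehavior": "insert"},
                },
            }
        ]
    },
}
# Serialize the definition the way it would be stored or deployed.
print(json.dumps(pipeline, indent=2)[:120])
```

The key idea is that the activity references datasets by name rather than embedding connection details, which is what lets the same pipeline be promoted across environments.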

Posted 1 month ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Contractual hiring. Hiring manager profile: linkedin.com/in/yashsharma1608. Payroll of: https://www.nyxtech.in/

Azure Data Engineer with Fabric

The Role: Lead Data Engineer (payroll; client: Brillio)

About the Role:
Experience: 6 to 8 years
Location: Bangalore, Hyderabad, Pune, Chennai, Gurgaon (Hyderabad is preferred)
Notice: 15 days / 30 days
Budget: 15 LPA
Azure Fabric experience is mandatory
Skills: Azure OneLake, data pipelines, Apache Spark, ETL, Data Factory, Azure Fabric, SQL, Python/Scala

Key Responsibilities:
- Data Pipeline Development: Lead the design, development, and deployment of data pipelines using Azure OneLake, Azure Data Factory, and Apache Spark, ensuring efficient, scalable, and secure data movement across systems.
- ETL Architecture: Architect and implement ETL (Extract, Transform, Load) workflows, optimizing the process for data ingestion, transformation, and storage in the cloud.
- Data Integration: Build and manage data integration solutions that connect multiple data sources (structured and unstructured) into a cohesive data ecosystem. Use SQL, Python, Scala, and R to manipulate and process large datasets.
- Azure OneLake Expertise: Leverage Azure OneLake and Azure Synapse Analytics to design and implement scalable data storage and analytics solutions that support big data processing and analysis.
- Collaboration with Teams: Work closely with Data Scientists, Data Analysts, and BI Engineers to ensure that the data infrastructure supports analytical needs and is optimized for performance and accuracy.
- Performance Optimization: Monitor, troubleshoot, and optimize data pipeline performance to ensure high availability, fast processing, and minimal downtime.
- Data Governance & Security: Implement best practices for data governance, data security, and compliance within the Azure ecosystem, ensuring data privacy and protection.
- Leadership & Mentorship: Lead and mentor a team of data engineers, promoting a collaborative and high-performance team culture. Oversee code reviews, design decisions, and the implementation of new technologies.
- Automation & Monitoring: Automate data engineering workflows, job scheduling, and monitoring to ensure smooth operations. Use tools like Azure DevOps, Airflow, and other relevant platforms for automation and orchestration.
- Documentation & Best Practices: Document data pipeline architecture, data models, and ETL processes, and contribute to the establishment of engineering best practices, standards, and guidelines.
- Innovation: Stay current with industry trends and emerging technologies in data engineering, cloud computing, and big data analytics, driving innovation within the team.
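The extract-transform-load responsibilities described above can be sketched end to end in a few lines of plain Python, standing in for the Spark / Data Factory pipelines the role actually uses. Column names and the in-memory "warehouse" are hypothetical.

```python
import csv
import io

# Toy source: a CSV stream standing in for a lake or source system.
raw = io.StringIO("region,amount\nwest,10\nwest,5\neast,7\n")

def extract(handle):
    """Read raw records from the source."""
    return list(csv.DictReader(handle))

def transform(records):
    """Aggregate amounts per region (the 'T' of ETL)."""
    totals = {}
    for rec in records:
        totals[rec["region"]] = totals.get(rec["region"], 0) + int(rec["amount"])
    return totals

def load(totals, target):
    """Write the transformed result into the target store."""
    target.update(totals)
    return target

warehouse = load(transform(extract(raw)), {})
print(warehouse)  # → {'west': 15, 'east': 7}
```

Keeping the three stages as separate functions mirrors how pipeline activities stay independently testable and re-runnable in a real orchestrator.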

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Required skills:
- GitHub Actions + DevOps
- Terraform / Bicep
- AKS cluster setup and deployment
- Cloud technology, majorly Azure: SQL Server, Fabric, Databricks, etc.
Good to have: PowerShell, flawless communication

Job Description: We are seeking an experienced Sr. Azure GitHub DevOps Engineer to join our team, supporting a global enterprise client. In this role, you will be responsible for designing and optimizing DevOps pipelines, leveraging GitHub Actions and Azure DevOps tools to streamline software delivery and infrastructure automation. This role requires expertise in GitHub Actions, Azure-native services, and modern DevOps methodologies to enable seamless collaboration and ensure scalable, secure, and efficient cloud-based solutions.

Key Responsibilities:
- GitHub Actions Development: Design, implement, and optimize CI/CD workflows using GitHub Actions to support multi-environment deployments. Leverage GitHub Actions for automated builds, tests, and deployments, ensuring integration with Azure services. Create reusable GitHub Actions templates and libraries for consistent DevOps practices.
- GitHub Repository Administration: Manage GitHub repositories, branching strategies, and access permissions. Implement GitHub features like Dependabot, code scanning, and security alerts to enhance code quality and security.
- Azure DevOps Integration: Utilize Azure Pipelines in conjunction with GitHub Actions to orchestrate complex CI/CD workflows. Configure and manage Azure services such as Azure Kubernetes Service (AKS) for container orchestration; Azure Application Gateway and Azure Front Door for load balancing and traffic management; Azure Monitor, Azure App Insights, and Azure Key Vault for observability, diagnostics, and secure secrets management; and Helm charts and Microsoft Bicep for Infrastructure as Code.
- Automation & Scripting: Develop robust automation scripts using PowerShell, Bash, or Python to streamline operational tasks. Automate monitoring, deployments, and environment management workflows.
- Infrastructure Management: Oversee and maintain cloud environments with a focus on scalability, security, and reliability. Implement containerization strategies using Docker and orchestration via AKS.
- Collaboration: Partner with cross-functional teams to align DevOps practices with business objectives while maintaining compliance and security standards.
- Monitoring & Optimization: Deploy and maintain monitoring and logging tools to ensure system performance and uptime. Optimize pipeline execution times and infrastructure costs.
- Documentation & Best Practices: Document GitHub Actions workflows, CI/CD pipelines, and Azure infrastructure configurations. Advocate for best practices in version control, security, and DevOps methodologies.

Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field (preferred).
- Experience: 3+ years of experience in DevOps engineering with a focus on GitHub Actions and Azure DevOps tools. Proven track record of designing CI/CD workflows using GitHub Actions in production environments. Extensive experience with Azure services, including AKS, Azure Front Door, Azure Application Gateway, Azure Key Vault, Azure App Insights, and Azure Monitor. Hands-on experience with Infrastructure as Code tools, including Microsoft Bicep and Helm charts.

Technical Skills:
- GitHub Actions Expertise: Deep understanding of GitHub Actions, workflows, and integrations with Azure services.
- Scripting & Automation: Proficiency in PowerShell, Bash, and Python for creating automation scripts and custom GitHub Actions.
- Containerization & Orchestration: Experience with Docker and Kubernetes, including Azure Kubernetes Service (AKS).
- Security Best Practices: Familiarity with securing CI/CD pipelines, secrets management, and cloud environments.
- Monitoring & Optimization: Hands-on experience with Azure Monitor, App Insights, and logging solutions to ensure system reliability.

Soft Skills:
- Strong problem-solving and analytical abilities.
- Excellent communication and collaboration skills, with the ability to work in cross-functional and global teams.
- Detail-oriented with a commitment to delivering high-quality results.

Preferred Qualifications:
- Experience in DevOps practices within the financial or tax services industries.
- Familiarity with advanced GitHub features such as Dependabot, security alerts, and CodeQL.
- Knowledge of additional CI/CD platforms like Jenkins or CircleCI.
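The automation-scripting side of this role often comes down to small, reliable helpers. As a hedged sketch, here is a retry-with-exponential-backoff wrapper for an idempotent deployment step; `flaky_health_check` is a stand-in for a real call (e.g. probing an AKS endpoint), and the sleep function is injectable so the sketch stays testable.

```python
import time

def retry(step, attempts=4, base_delay=0.5, sleep=time.sleep):
    """Call step() up to `attempts` times, doubling the delay after each
    failure; re-raise the last exception if every attempt fails."""
    for attempt in range(attempts):
        try:
            return step()
        except Exception:
            if attempt == attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))

# Stand-in for a real health probe: fails twice, then succeeds.
calls = {"n": 0}
def flaky_health_check():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("endpoint not ready")
    return "healthy"

# No-op sleep so the example runs instantly.
result = retry(flaky_health_check, sleep=lambda s: None)
print(result, calls["n"])  # → healthy 3
```

Only idempotent steps should be wrapped this way; a non-idempotent deployment action retried blindly can apply a change twice.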

Posted 1 month ago

Apply

10.0 - 16.0 years

25 - 27 Lacs

Chennai

Work from Office

We at Dexian India are looking to hire a Cloud Data PM with over 10 years of hands-on experience in AWS/Azure, DWH, and ETL. The role is based in Chennai with a shift from 2.00pm to 11.00pm IST.

Key qualifications we seek in candidates include:
- Solid understanding of SQL and data modeling
- Proficiency in DWH architecture, including EDW/DM concepts and Star/Snowflake schema
- Experience in designing and building data pipelines on the Azure Cloud stack
- Familiarity with Azure Data Explorer, Data Factory, Databricks, Synapse Analytics, Azure Fabric, Azure Analysis Services, and Azure SQL Data Warehouse
- Knowledge of Azure DevOps and CI/CD pipelines
- Previous experience managing scrum teams and working as a Scrum Master or Project Manager on at least 2 projects
- Exposure to on-premises transactional database environments like Oracle, SQL Server, Snowflake, MySQL, and/or Postgres
- Ability to lead enterprise data strategies, including data lake delivery
- Proficiency in data visualization tools such as Power BI or Tableau, and statistical analysis using R or Python
- Strong problem-solving skills with a track record of deriving business insights from large datasets
- Excellent communication skills and the ability to provide strategic direction to technical and business teams
- Prior experience in presales, RFP and RFI responses, and proposal writing is mandatory
- Capability to explain complex data solutions clearly to senior management
- Experience in implementing, managing, and supporting data warehouse projects or applications
- Track record of leading full-cycle implementation projects related to Business Intelligence
- Strong team and stakeholder management skills
- Attention to detail, accuracy, and ability to meet tight deadlines
- Knowledge of application development, APIs, microservices, and integration components

Tools & Technology Experience Required:
- Strong hands-on experience in SQL or PL/SQL
- Proficiency in Python
- SSIS or Informatica (one of the tools is mandatory)
- BI: Power BI or Tableau (one of the tools is mandatory)
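The star-schema knowledge this listing asks for can be illustrated with a minimal dimensional model: one fact table keyed to two dimensions, queried with a star join. `sqlite3` stands in for a warehouse engine here, and the schema is hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Two dimension tables plus a fact table that references them by
# surrogate key - the defining shape of a star schema.
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    revenue REAL
);
INSERT INTO dim_date VALUES (1, '2024-01'), (2, '2024-02');
INSERT INTO dim_product VALUES (10, 'books'), (20, 'games');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 20, 50.0), (2, 10, 25.0);
""")

# Star join: the fact table joins out to each dimension on its key,
# and grouping by dimension attributes yields the report.
rows = conn.execute("""
    SELECT d.month, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()
print(rows)
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables.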

Posted 2 months ago

Apply

10.0 - 16.0 years

25 - 28 Lacs

Chennai, Tamil Nadu, India

On-site

We at Dexian India are looking to hire a Cloud Data PM with over 10 years of hands-on experience in AWS/Azure, DWH, and ETL. The role is based in Chennai with a shift from 2.00pm to 11.00pm IST.

Key qualifications we seek in candidates include:
- Solid understanding of SQL and data modeling
- Proficiency in DWH architecture, including EDW/DM concepts and Star/Snowflake schema
- Experience in designing and building data pipelines on the Azure Cloud stack
- Familiarity with Azure Data Explorer, Data Factory, Databricks, Synapse Analytics, Azure Fabric, Azure Analysis Services, and Azure SQL Data Warehouse
- Knowledge of Azure DevOps and CI/CD pipelines
- Previous experience managing scrum teams and working as a Scrum Master or Project Manager on at least 2 projects
- Exposure to on-premises transactional database environments like Oracle, SQL Server, Snowflake, MySQL, and/or Postgres
- Ability to lead enterprise data strategies, including data lake delivery
- Proficiency in data visualization tools such as Power BI or Tableau, and statistical analysis using R or Python
- Strong problem-solving skills with a track record of deriving business insights from large datasets
- Excellent communication skills and the ability to provide strategic direction to technical and business teams
- Prior experience in presales, RFP and RFI responses, and proposal writing is mandatory
- Capability to explain complex data solutions clearly to senior management
- Experience in implementing, managing, and supporting data warehouse projects or applications
- Track record of leading full-cycle implementation projects related to Business Intelligence
- Strong team and stakeholder management skills
- Attention to detail, accuracy, and ability to meet tight deadlines
- Knowledge of application development, APIs, microservices, and integration components

Tools & Technology Experience Required:
- Strong hands-on experience in SQL or PL/SQL
- Proficiency in Python
- SSIS or Informatica (one of the tools is mandatory)
- BI: Power BI or Tableau (one of the tools is mandatory)

Posted 2 months ago

Apply

9.0 - 14.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Design, deploy, and optimize Azure-based data pipelines and architectures. Ensure scalability, data integrity, and CI/CD automation. Collaborate with analytics teams and lead data engineering initiatives across hybrid data platforms.

Required candidate profile: Bachelor's in CS/IT with 7-12 years of experience in Azure data engineering. Strong in ADF, Synapse, Databricks, and CI/CD. Able to mentor junior engineers and optimize large-scale data systems.

Posted 2 months ago

Apply

10.0 - 20.0 years

20 - 35 Lacs

Chennai

Work from Office

Hadoop, Spark, and Kafka in the cloud ecosystem (AWS, Azure, GCP). Data modeling, data integration, and data management best practices. Compliance with GDPR, HIPAA, and other industry regulations.

Posted 2 months ago

Apply

15.0 - 20.0 years

20 - 30 Lacs

Pune

Work from Office

We are seeking a seasoned Technical Project Manager to oversee and guide large service engagements, involving teams of 35-50 individuals. This role requires a balance of technical know-how, exceptional leadership abilities, and proven project management skills.

Posted 2 months ago

Apply