1.0 - 6.0 years
0 Lacs
andhra pradesh
On-site
As a Data Engineer working with Microsoft Fabric, you will be responsible for designing, developing, and optimizing data pipelines, reporting solutions, and analytics frameworks using Microsoft Fabric. Your role will involve collaborating with stakeholders and technical teams to deliver scalable, secure, and high-performing analytics solutions.

You will work closely with data architects, analysts, and business stakeholders to gather analytics requirements and build data solutions using Microsoft Fabric components such as Data Factory, OneLake, Synapse, and Power BI. Your responsibilities will include developing and optimizing pipelines for ingestion, transformation, and integration, as well as creating and maintaining semantic models and datasets for reporting. Ensuring compliance with best practices for performance, governance, and security of Fabric solutions will also be a key aspect of the role. Additionally, you will support migration projects, conduct proof-of-concepts, create and maintain documentation of ETL processes, data flows, and data mappings, and guide and train client teams on Fabric adoption.

To excel in this role, you should have 4-6 years of experience in data analytics, BI, or cloud platforms, with at least 1 year of hands-on experience in Microsoft Fabric, specifically Data Factory, OneLake, Synapse, and Power BI semantic models and reporting. Strong SQL and data modeling skills, experience with ETL/ELT and performance tuning, familiarity with Azure and cloud data platforms, and strong communication and client-facing skills are essential. Knowledge of the Azure Data Stack (ADF, Synapse, Databricks), governance, security, and compliance, as well as consulting/IT services experience, will be beneficial.

This is a full-time position located in Visakhapatnam, with health insurance and Provident Fund benefits provided. The work location is in person.
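The ingestion-and-transformation duties described above typically follow an incremental (watermark) pattern: extract only rows changed since the last run, then reshape them for the semantic model. The sketch below is a hedged, pure-Python illustration of the idea rather than Fabric-specific code; the table shape, column names, and cutoff date are all hypothetical.

```python
from datetime import datetime

def incremental_extract(rows, watermark):
    """Return only rows modified after the last high-water mark."""
    return [r for r in rows if r["modified_at"] > watermark]

def transform(rows):
    """Example transformation: rename columns and round a derived amount."""
    return [
        {"order_id": r["id"], "amount_inr": round(r["amount"], 2)}
        for r in rows
    ]

# Simulated source table with a modification timestamp per row.
source = [
    {"id": 1, "amount": 100.0, "modified_at": datetime(2024, 1, 1)},
    {"id": 2, "amount": 250.5, "modified_at": datetime(2024, 3, 1)},
]

# Only row 2 falls after the watermark, so only it is extracted and loaded.
new_rows = incremental_extract(source, watermark=datetime(2024, 2, 1))
loaded = transform(new_rows)
```

In a real Fabric pipeline the watermark would live in a control table and the extract/transform steps would be Data Factory activities or notebook cells, but the control flow is the same.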
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
At Capgemini Invent, we believe that difference drives change. As inventive transformation consultants, we blend our strategic, creative, and scientific capabilities to collaborate closely with clients in delivering cutting-edge solutions tailored to the challenges of today and tomorrow. Our approach is informed and validated by science and data, superpowered by creativity and design, and underpinned by purpose-driven technology.

Your role will require proficiency in technologies such as MS Fabric, Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Lakehouses, OneLake, Data Pipelines, Real-Time Analytics, Power BI integration, and semantic models. You will be responsible for integrating Fabric capabilities to ensure seamless data flow, governance, and collaboration across teams. A strong understanding of Delta Lake, Parquet, and distributed data systems is essential, as are strong programming skills in Python, PySpark, Scala, or Spark SQL/T-SQL for data transformations.

In terms of your profile, we are looking for individuals with strong experience implementing and managing a Lakehouse using Databricks and the Azure tech stack (ADLS Gen2, ADF, Azure SQL). Proficiency in data integration techniques, ETL processes, and data pipeline architectures is crucial. An understanding of machine learning algorithms and AI/ML frameworks (such as TensorFlow and PyTorch) and Power BI is an added advantage. MS Fabric and PySpark proficiency are must-have skills for this role.

Working with us, you will benefit from flexible work arrangements that support remote work or flexible hours, helping you maintain a healthy work-life balance. Our commitment to your career growth is at the heart of our mission, with an array of career growth programs and diverse professions to help you explore a world of opportunities. You will also have the opportunity to earn valuable certifications in the latest technologies, such as Generative AI.

Capgemini is a global business and technology transformation partner that accelerates organizations' dual transition to a digital and sustainable world, creating tangible impact for enterprises and society. With a diverse team of over 340,000 members in more than 50 countries, Capgemini's 55-year-plus heritage is built on trust from clients to unlock technology's value in addressing their entire business needs. The company delivers end-to-end services and solutions spanning strategy, design, and engineering, driven by market-leading capabilities in AI, generative AI, cloud, and data, complemented by deep industry expertise and a strong partner ecosystem.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for a skilled and experienced Microsoft Fabric Engineer to join our data engineering team. Your main responsibilities will include designing, developing, and maintaining data solutions using Microsoft Fabric, working across key workloads such as Data Engineering, Data Factory, Data Science, Real-Time Analytics, and Power BI. The role requires a deep understanding of Synapse Data Warehouse, OneLake, Notebooks, Lakehouse architecture, and Power BI integration within the Microsoft ecosystem.

Key responsibilities include designing and implementing scalable and secure data solutions; building and maintaining data pipelines using Dataflows Gen2 and Data Factory; working with Lakehouse architecture and managing datasets in OneLake; developing and optimizing notebooks (PySpark or T-SQL) for data transformation and processing; collaborating with data analysts and business users to create interactive dashboards and reports using Power BI (within Fabric); leveraging Synapse Data Warehouse and KQL databases for structured and real-time analytics; monitoring and optimizing the performance of data pipelines and queries; and ensuring data quality, security, and governance practices are adhered to.

You should have at least 3 years of hands-on experience with Microsoft Fabric or similar tools in the Microsoft data stack. You must be proficient in tools such as Data Factory (Fabric), Synapse Data Warehouse/SQL analytics endpoints, Power BI integration, and DAX, and have a solid understanding of data modeling, ETL/ELT processes, and real-time data streaming. Experience with KQL (Kusto Query Language) is a plus, and familiarity with Microsoft Purview, Azure Data Lake, or Azure Synapse Analytics is advantageous.

Overall, as a Microsoft Fabric Engineer, you will play a crucial role in designing, developing, and maintaining data solutions using Microsoft Fabric, collaborating with various teams to ensure data quality and security, and staying current with Microsoft Fabric updates and best practices to recommend enhancements. Qualifications required for this role include proficiency in Microsoft Fabric, OneLake, Data Factory, Data Lake, and Data Mesh.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer, you will be responsible for designing and implementing data models, warehouses, and databases using Microsoft Fabric, Azure Synapse Analytics, and Azure Data Lake. Your role will involve developing ETL pipelines using tools such as SQL Server Integration Services (SSIS), Azure Synapse Pipelines, and Azure Data Factory. You will leverage Fabric Lakehouse for Power BI reporting, real-time analytics, and automation, while ensuring optimal data integration, governance, security, and performance.

Collaboration with cross-functional teams to develop scalable data solutions will be a key aspect of the job. You will implement Medallion Architecture for efficient data processing and work in an Agile environment, applying DevOps principles for automation and CI/CD processes.

Your skills should include proficiency in Microsoft Fabric and Azure Lakehouse, OneLake, Data Pipelines, Power BI, Synapse, Data Factory, and Data Lake. Experience in data warehousing and ETL, both on-premises and in the cloud, using SQL, Python, SSIS, and Synapse Pipelines is essential, along with strong knowledge of data modeling and architecture (including Medallion Architecture), integration, governance, security, and performance tuning.

In addition, you should have expertise in analytics and reporting tools such as Power BI and Excel (formulas, macros, pivots), and in ERP systems like SAP and Oracle. Problem-solving skills, the ability to collaborate in Agile and DevOps environments, and a degree in Computer Science, Engineering, or a related field are necessary. Familiarity with Azure DevOps, Agile, and Scrum methodologies, as well as Microsoft certifications (particularly Agile certification), would be advantageous.
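Medallion Architecture, mentioned above, layers data as bronze (raw), silver (validated and typed), and gold (aggregated for reporting). A minimal pure-Python sketch of the layering, with hypothetical records; real Fabric or Synapse implementations would use Delta tables and Spark, but the flow is the same.

```python
# Bronze layer: raw records as landed, everything still a string.
bronze = [
    {"sku": "A", "qty": "3", "price": "10.0"},
    {"sku": "A", "qty": "2", "price": "10.0"},
    {"sku": "B", "qty": "x", "price": "5.0"},   # bad record: qty not numeric
]

def to_silver(rows):
    """Silver layer: cast types and drop rows that fail validation."""
    out = []
    for r in rows:
        try:
            out.append({"sku": r["sku"], "qty": int(r["qty"]), "price": float(r["price"])})
        except ValueError:
            continue  # a real pipeline would quarantine the row instead
    return out

def to_gold(rows):
    """Gold layer: aggregate revenue per SKU for reporting."""
    agg = {}
    for r in rows:
        agg[r["sku"]] = agg.get(r["sku"], 0.0) + r["qty"] * r["price"]
    return agg

silver = to_silver(bronze)   # the malformed "B" row is dropped
gold = to_gold(silver)
```

The separation means a bad source record is caught at the silver boundary and never distorts the gold aggregates that Power BI reports read.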
Posted 2 weeks ago
4.0 - 9.0 years
17 - 32 Lacs
pune, bangalore rural, bengaluru
Work from Office
Big 4 hiring in large numbers in Bangalore/Pune for the role below. Please call 7208835287 / 7208835290 / 7738402343 or send your CV to it@contactxindia.com.

Role & responsibilities

Mandatory Skills
• Bachelor's or higher degree in Computer Science or a related discipline, or equivalent (minimum 4+ years of work experience).
• At least 3+ years of consulting or client service delivery experience in Azure/Microsoft data engineering.
• At least 1+ years of experience developing data ingestion, data processing, and analytical pipelines for big data, relational databases such as SQL Server, and data warehouse solutions such as Synapse, Azure Databricks, and Microsoft Fabric.
• Hands-on experience implementing data ingestion, ETL, and data processing using Azure services: Fabric, OneLake, ADLS, Azure Data Factory, Azure Functions, services in Microsoft Fabric, etc.
• Minimum of 1+ years of hands-on experience in Azure and big data technologies such as Fabric, Databricks, Python, SQL, ADLS/Blob, and PySpark/Spark SQL.
• Minimum of 1+ years of RDBMS experience.
• Experience using big data file formats and compression techniques.
• Experience with developer tools such as Azure DevOps, Visual Studio Team Server, Git, etc.

Primary Roles and Responsibilities
An Azure Data Engineer is responsible for designing, building, and maintaining the data infrastructure of an organization using Azure cloud services. This includes creating data pipelines, integrating data from various sources, and implementing data security and privacy measures. The Azure Data Engineer will also monitor and troubleshoot data flows and optimize data storage and processing for performance and cost efficiency.

Preferred Skills
• Experience developing and deploying ETL solutions on Azure using ADF, Notebooks, Synapse Analytics, Azure Functions, and other services.
• Experience developing and deploying ETL solutions on Azure using services in Microsoft Fabric.
• Role-based Microsoft certifications (DP-600, DP-203, DP-900, AI-102, AI-900, etc.).
• Knowledge of Microsoft Power BI: building reports/dashboards and generating insights for business users.
• Knowledge of Azure RBAC and IAM; understanding of access controls and security on Azure.
• Aligned with Microsoft's vision and roadmap for the latest tools and technologies in the market.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
haryana
On-site
About CoPoint AI: CoPoint AI is a specialized consulting firm focused on transforming businesses through process improvement, data insights, and technology-driven innovation. Leveraging AI technologies, Microsoft cloud platforms, and modern web development frameworks, we deliver intelligent, scalable solutions that drive measurable impact for our clients. Our team partners across industries to design and deploy solutions that streamline operations, enhance customer experiences, and enable data-driven growth.

Our Vision: We transform businesses through process improvement and data insights, leveraging AI on the Microsoft stack.

Our Values:
- Be Purposeful: think straight, communicate, always do the right thing.
- In Partnership: with our team, for our clients, in our communities.
- Create Impact: deliver value-based solutions, help individuals achieve their dreams, demand profitable growth.

Role Overview: As a Senior Consultant at CoPoint AI, you will drive end-to-end delivery of both AI-enabled data solutions and modern web applications. You will blend technical expertise in AI, Microsoft platforms, and full-stack web development with business insight to architect and implement impactful solutions across client environments.

Key Responsibilities:
- Design and develop enterprise applications using Microsoft Fabric components: Power BI, Data Factory, Synapse, Lakehouse, Dataflows, and OneLake.
- Architect and deploy data integration pipelines for real-time and batch processing.
- Develop and maintain custom web applications integrated with Fabric data sources.
- Automate workflows and implement Data Activator triggers for operational intelligence.
- Build and manage semantic models, Power BI reports, and paginated reports.
- Design event-driven, API-first architectures using .NET Core, Python, and Azure Functions.
- Integrate Azure AI services (e.g., Azure OpenAI, Cognitive Search) into enterprise applications.
- Lead sprint-based feature delivery and contribute to solution architecture.
- Translate complex business requirements into scalable software and data designs.
- Mentor junior developers and support knowledge sharing across the team.
- Support proposal development and solution demos in pre-sales engagements.

Qualifications:
- 2-4 years of experience in development, data engineering, or BI solution delivery.
- Hands-on expertise with Microsoft Fabric components (Power BI, Synapse, Data Factory, etc.).
- Proficiency in C#, .NET, or Python for backend development.
- Familiarity with Power BI Desktop, DAX, and Power Query M.
- Experience with REST APIs, Azure Logic Apps, or Power Platform connectors.
- Strong grasp of cloud-native design patterns and DevOps practices.
- Exposure to Azure Data Lake, Delta Lake, or Parquet-based storage systems.
- Understanding of data governance, quality, and security models.
- Excellent communication, documentation, and presentation skills.

What You Should Expect:
- A culture of continuous learning with certification support.
- Clear career advancement pathways.
- Competitive compensation and benefits.
- Flexible work arrangements.
- A collaborative environment that values innovation and creativity.

Ready to shape the future of enterprise technology? Join our team of Microsoft technology experts and make an impact.
Posted 3 weeks ago
4.0 - 7.0 years
5 - 10 Lacs
mumbai
Work from Office
About the Role
We're looking for a hands-on Data Engineer who can take full ownership of the data engineering lifecycle, from data ingestion to deployment, in a hybrid cloud environment built on Microsoft Fabric and Google Cloud Platform (GCP). You'll be stepping into an existing ecosystem with mature pipelines, strong DevOps practices, and a close-knit cross-functional team. This role is ideal for someone who is comfortable working independently, can manage multiple priorities, and is equally adept at building new pipelines and improving existing ones.

Key Responsibilities
- Develop, enhance, and maintain end-to-end data pipelines across Microsoft Fabric and GCP.
- Work with tools like Fabric Dataflows, Synapse (Lakehouse), BigQuery, Composer, and others to support both batch and near-real-time processing.
- Build and maintain ELT/ETL frameworks using Python, PySpark, SQL, and relevant platform-native services.
- Deploy pipelines and manage infrastructure via Git-based CI/CD workflows (e.g., Azure DevOps / GitHub Actions).
- Collaborate with data analysts, product managers, and other engineers to translate business needs into scalable data models and flows.
- Monitor data quality and job performance, and proactively resolve data issues.
- Support and improve our data observability, governance, and documentation practices.
- Work independently and act as a point of contact for ongoing pipeline or platform-related discussions.

Required Skills
- Strong experience with both Microsoft Fabric (Synapse, OneLake, Dataflows) and GCP (BigQuery, Cloud Storage, Composer).
- Solid hands-on skills in SQL, Python, and Spark/PySpark.
- Proven experience building and deploying modular, reusable, production-grade data pipelines.
- Experience managing version-controlled codebases and CI/CD for data workflows.
- Good understanding of data warehousing principles, data modeling, and performance tuning.
- Familiarity with monitoring tools, job orchestration, and alerting strategies.

Nice to Have
- Prior work with Power BI, Looker, or any BI integration layer.
- Cloud certifications in GCP or Microsoft are a plus.
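Monitoring data quality, as the responsibilities above call for, is commonly implemented as a set of named rules evaluated against every batch before it is published downstream. A minimal pure-Python sketch of that pattern; the rule names and columns are hypothetical, and production systems would use a framework like Great Expectations or platform-native checks instead.

```python
def run_checks(rows, checks):
    """Evaluate named data-quality rules against a batch; return the names of failed rules."""
    return [name for name, rule in checks.items() if not rule(rows)]

# A hypothetical batch of order rows.
orders = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": 80.0},
]

# Each rule is a predicate over the whole batch.
checks = {
    "not_empty": lambda rs: len(rs) > 0,
    "unique_ids": lambda rs: len({r["order_id"] for r in rs}) == len(rs),
    "positive_amounts": lambda rs: all(r["amount"] > 0 for r in rs),
}

failed = run_checks(orders, checks)  # empty list means the batch passed
```

Wiring the failed-rule list into an alerting channel turns this into the proactive issue resolution the posting describes.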
Posted 3 weeks ago
4.0 - 10.0 years
0 Lacs
karnataka
On-site
This posting covers two related roles.

As a Power BI + Microsoft Fabric Lead (10+ years of experience), you will lead the strategy and architecture for BI initiatives. Your responsibilities will include designing and delivering end-to-end Power BI and Microsoft Fabric solutions, collaborating with stakeholders to define data and reporting goals, and driving the adoption of best practices and performance optimization. Your expertise in Power BI, including DAX, Power Query, and advanced visualizations, will be essential to the success of high-impact BI initiatives.

As a Power BI + Microsoft Fabric Developer (4+ years of experience), you will develop dashboards and interactive reports using Power BI, build robust data models, and implement Microsoft Fabric components such as Lakehouse, OneLake, and Pipelines. Working closely with cross-functional teams, you will gather and refine requirements to ensure high performance and data accuracy across reporting solutions. Hands-on experience with Microsoft Fabric tools such as Data Factory, OneLake, Lakehouse, and Pipelines will be crucial for delivering effective data solutions.

Key Skills Required:
- Strong expertise in Power BI (DAX, Power Query, advanced visualizations)
- Hands-on experience with Microsoft Fabric (Data Factory, OneLake, Lakehouse, Pipelines)
- Solid understanding of data modeling, ETL, and performance tuning
- Ability to collaborate effectively with business and technical teams

Joining our team will give you the opportunity to work with cutting-edge Microsoft technologies, lead high-impact BI initiatives, and thrive in a collaborative, innovation-driven environment. We offer a competitive salary and benefits package to reward your expertise and contributions. If you are passionate about leveraging Power BI and Microsoft Fabric to drive data-driven insights and solutions, we invite you to apply for this full-time position.

Application Questions:
- What is your current and expected CTC?
- What is your notice period? If you are serving your notice period, what is your Last Working Day (LWD)?

Experience Required:
- Power BI: 4 years (required)
- Microsoft Fabric: 4 years (required)

Work Location: In person
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Engineer (MS Fabric) at our Chennai (Excelencia) office, you will leverage your 4+ years of experience to design, build, and optimize data pipelines using Microsoft Fabric, Azure Data Factory, and Synapse Analytics. Your primary responsibilities will include developing and maintaining Lakehouses, Notebooks, and dataflows within the Microsoft Fabric ecosystem; ensuring efficient data integration, quality, and governance across OneLake and other Fabric components; and implementing real-time analytics pipelines for high-throughput data processing.

To excel in this role, you must be proficient in Microsoft Fabric, Azure Data Factory (ADF), Azure Synapse Analytics, Delta Lake, OneLake, Lakehouses, Python, PySpark, Spark SQL, T-SQL, and ETL/ELT development. The work will involve collaborating with cross-functional teams to define and deliver end-to-end data engineering solutions, participating in Agile ceremonies, and using tools like JIRA for project tracking and delivery. You will also perform complex data transformations across various data formats and handle large-scale data warehousing and analytics workloads.

Preferred skills include a strong understanding of distributed computing and cloud-native data architecture, experience with DataOps practices and data quality frameworks, familiarity with CI/CD for data pipelines, and proficiency with monitoring tools and job scheduling frameworks to ensure the reliability and performance of data systems. Strong problem-solving and analytical thinking, excellent communication and collaboration skills, and a self-motivated, proactive approach with a continuous-learning mindset are the essential soft skills for success in this role.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
kochi, kerala
On-site
You will be working as a Senior Microsoft Fabric Developer on an immediate project to develop and implement an automated reporting solution using Microsoft Fabric. Your primary responsibilities will include using Microsoft Fabric as the primary platform, Azure Logic Apps for API integrations, Power BI for report creation within Fabric, Power Automate for report distribution, and OneLake for data storage.

The role requires deep expertise in Microsoft Fabric, focusing on data integration, processing, and report development. You should have a strong background in Power BI, specifically within the Fabric environment, and proficiency in Azure Logic Apps for API integrations. Familiarity with Power Automate for workflow automation, an understanding of data modeling and ETL processes, and experience with SQL and data analysis are also essential.

Desired skills include knowledge of MSP operations and common tools, experience with Microsoft 365 security features and reporting, familiarity with PDF generation from Power BI reports, an understanding of data privacy and security best practices, and previous experience creating reporting solutions for service providers.

Beyond technical skills, you are expected to have excellent written and verbal communication skills in English. You should be able to work independently, take initiative, and approach problem-solving proactively. Business acumen, cost-awareness, and a commitment to seeing the project through to successful completion are also key criteria. Additionally, you must be available to overlap with Irish time zones for at least 4 hours per day.

If you meet these requirements and are ready to contribute to the project's success, we look forward to receiving your application.
Posted 1 month ago
2.0 - 5.0 years
8 - 15 Lacs
Gurugram
Remote
Job Description: We are looking for a talented and driven MS Fabric Developer / Data Analytics Engineer with expertise in the Microsoft Fabric ecosystem, data transformation, and analytics. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, working with real-time analytics, and implementing best practices in data modeling and reporting.

Key Responsibilities:
- Work with MS Fabric components, including Data Lake, OneLake, Lakehouse, Warehouse, and Real-Time Analytics
- Develop and maintain data transformation scripts using Power Query, T-SQL, and Python
- Build scalable and efficient data models and pipelines for analytics and reporting
- Collaborate with BI teams and business stakeholders to deliver data-driven insights
- Implement best practices for data governance, performance tuning, and storage optimization
- Support real-time and near-real-time data streaming and transformation tasks

Required Skills:
- Hands-on experience with MS Fabric and associated data services
- Strong command of Power Query, T-SQL, and Python for data transformations
- Experience working in modern data lakehouse and real-time analytics environments

Good to Have:
- DevOps knowledge for automating deployments and managing environments
- Familiarity with Azure services and cloud data architecture
- Understanding of CI/CD pipelines for data projects
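Of the transformation tools listed above, Power Query's Unpivot is a staple reshaping step, and the same operation is easy to express in the posting's other scripting language, Python. A hedged sketch with hypothetical column names, turning wide quarterly columns into long (attribute, value) rows:

```python
def unpivot(rows, id_col, value_cols):
    """Turn wide columns into (attribute, value) pairs, like Power Query's Unpivot Columns."""
    out = []
    for r in rows:
        for col in value_cols:
            out.append({id_col: r[id_col], "attribute": col, "value": r[col]})
    return out

# One wide row with two quarterly measure columns...
wide = [{"region": "North", "q1": 10, "q2": 15}]

# ...becomes two long rows, one per (region, quarter) pair.
long_rows = unpivot(wide, "region", ["q1", "q2"])
```

Long format is usually what downstream models and real-time aggregations want, since adding a new quarter then adds rows rather than a schema change.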
Posted 1 month ago
5.0 - 10.0 years
7 - 11 Lacs
Hyderabad
Hybrid
Database Administrator (DBA) - T-SQL / Microsoft Fabric / Azure Data Services

Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related discipline
- 5+ years of hands-on experience as a DBA, with strong exposure to T-SQL, SSMS, and SQL Server 2019 or later
- Solid knowledge of Microsoft Fabric components and their interoperability with the Power Platform ecosystem
- Experience with Azure SQL Database, Azure Managed Instance, and Data Lake (Gen2 / OneLake)
- Strong understanding of RDBMS design, data normalization, and performance tuning techniques
- Hands-on experience with HA/DR mechanisms such as Always On Availability Groups, log shipping, and Azure failover groups
- Proficiency with monitoring and diagnostic tools: SQL Profiler, Extended Events, Azure Log Analytics, Query Performance Insight
- Experience implementing data privacy, encryption (e.g., TDE, Always Encrypted), firewall rules, and security auditing

Preferred Skills & Tools
- Proficiency in Azure Data Factory (ADF), Azure Synapse, and Power BI Dataflows
- Familiarity with Microsoft Purview for data lineage and governance
- Hands-on experience with CI/CD pipelines for SQL using Azure DevOps YAML
- Understanding of Fabric workspace administration, capacity planning, and security roles
- Knowledge of NoSQL / Azure Cosmos DB is a plus
- Experience with monitoring tools like Grafana or Prometheus (especially in hybrid setups)
- Scripting experience in Python and/or PowerShell for automation
- Experience with ERP integrations and third-party data replication tools such as Fivetran, BryteFlow, and Qlik Replicate

Qualification: Bachelor's degree in Computer Science, Information Technology, Business Administration, or a related field

Skills: T-SQL, SSMS, SQL Server 2019, Microsoft Fabric, Power Platform, Azure SQL Database, Azure Managed Instance, Data Lake Gen2, OneLake, Always On Availability Groups, Log Shipping, Azure Failover Groups, SQL Profiler, Extended Events, Azure Log Analytics, Query Performance Insight, TDE, Always Encrypted, Azure Data Factory (ADF), Azure Synapse, Power BI Dataflows, Microsoft Purview, Azure DevOps, YAML, Grafana, Prometheus, Fivetran, BryteFlow, Qlik Replicate
Posted 1 month ago
4.0 - 8.0 years
15 - 27 Lacs
Indore, Hyderabad
Hybrid
Data Engineer - D365 OneLake Integration Specialist

Position Overview: We are seeking an experienced Data Engineer with expertise in Microsoft D365 ERP and OneLake integration to support a critical acquisition integration project. The successful candidate will assess existing data integrations, collaborate with our data team to migrate pipelines to Snowflake using Matillion, and ensure seamless data flow for go-live-critical reports by November 2025.

Role & responsibilities:
- Assessment & Documentation: Analyze and document existing D365 to OneLake/Fabric integrations and data flows
- Data Pipeline Migration: Collaborate with the current data team to redesign and migrate data integrations from D365 to Snowflake using Matillion
- Integration Architecture: Understand and map current Power BI reporting dependencies and data sources
- Go-Live Support: Identify critical reports for go-live and recommend optimal data integration strategies
- Technical Collaboration: Work closely with the existing data engineering team to leverage current Snowflake and Matillion expertise
- Knowledge Transfer: Document findings and provide recommendations on existing vs. new integration approaches
- ERP Implementation Support: Support the acquired company's ERP go-live timeline and requirements

Required Qualifications:

Technical Skills
- 3+ years of experience with Microsoft Dynamics 365 ERP data integrations
- 2+ years of hands-on experience with Microsoft OneLake and the Fabric ecosystem
- Strong experience with the Snowflake data warehouse platform
- Proficiency in the Matillion ETL tool for data pipeline development
- Experience with Power BI data modeling and reporting architecture
- Strong SQL skills and data modeling expertise
- Knowledge of Azure Data Factory or similar cloud ETL tools
- Experience with REST APIs and data connector frameworks

Business & Soft Skills
- Experience supporting ERP implementation projects and go-live activities
- Strong analytical and problem-solving skills for complex data integration challenges
- Excellent documentation and communication skills
- Ability to work in fast-paced, deadline-driven environments
- Experience in M&A integration projects (preferred)
- Project management skills and the ability to prioritize go-live-critical deliverables

Preferred candidate profile
- Microsoft Azure certifications (DP-203, DP-900)
- Snowflake SnowPro certification
- Previous experience with acquisition integration projects
- Knowledge of financial and operational reporting requirements
- Familiarity with data governance and compliance frameworks
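Assessing and migrating the integrations described above usually starts from the pipeline dependency graph: each pipeline can only be migrated or re-run after its upstreams. Python's standard-library graphlib gives a ready-made topological ordering. The pipeline names below are hypothetical, purely for illustration of the technique.

```python
from graphlib import TopologicalSorter

# Hypothetical dependency map: each key runs only after its listed upstream pipelines.
deps = {
    "d365_extract": set(),
    "onelake_staging": {"d365_extract"},
    "snowflake_load": {"onelake_staging"},
    "powerbi_refresh": {"snowflake_load"},
}

# static_order() yields a valid execution (and migration) order.
order = list(TopologicalSorter(deps).static_order())
```

Documenting the graph once also answers the "existing vs. new integration" question per report: any go-live-critical report maps to a chain of nodes that must be migrated together.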
Posted 1 month ago
3.0 - 7.0 years
12 - 15 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
We are looking for an experienced Data Engineer / BI Developer with strong hands-on expertise in Microsoft Fabric technologies, including OneLake, Lakehouse, Data Lake, Warehouse, and Real-Time Analytics, along with proven skills in Power BI, Azure Synapse Analytics, and Azure Data Factory (ADF). The ideal candidate should also have working knowledge of DevOps practices for data engineering and deployment automation.

Key Responsibilities:
- Design and implement scalable data solutions using Microsoft Fabric components: OneLake, Data Lake, Lakehouse, Warehouse, and Real-Time Analytics
- Build and manage end-to-end data pipelines integrating structured and unstructured data from multiple sources
- Integrate Microsoft Fabric with Power BI, Synapse Analytics, and Azure Data Factory to enable modern data analytics solutions
- Develop and maintain Power BI datasets, dashboards, and reports using data from Fabric Lakehouses or Warehouses
- Implement data governance, security, and compliance policies within the Microsoft Fabric ecosystem
- Collaborate with stakeholders on requirements gathering, data modeling, and performance tuning
- Leverage Azure DevOps / Git for version control, CI/CD pipelines, and deployment automation of data artifacts
- Monitor, troubleshoot, and optimize data flows and transformations for performance and reliability

Required Skills:
- 3-8 years of experience in data engineering, BI development, or similar roles
- Strong hands-on experience with the Microsoft Fabric ecosystem: OneLake, Data Lake, Lakehouse, Warehouse, Real-Time Analytics
- Proficiency in Power BI for interactive reporting and visualization
- Experience with Azure Synapse Analytics, ADF (Azure Data Factory), and related Azure services
- Good understanding of data modeling, SQL, T-SQL, and Spark/Delta Lake concepts
- Working knowledge of DevOps tools and CI/CD processes for data deployment (Azure DevOps preferred)
- Familiarity with DataOps and version control practices for data solutions

Preferred Qualifications:
- Microsoft certifications (e.g., DP-203, PL-300, or Microsoft Fabric certifications) are a plus
- Experience with Python, Notebooks, or KQL for Real-Time Analytics is advantageous
- Knowledge of data governance tools (e.g., Microsoft Purview) is a plus

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Posted 2 months ago
8.0 - 13.0 years
8 - 17 Lacs
Chennai
Remote
MS Fabric (Data Lake, OneLake, Lakehouse, Warehouse, Real-Time Analytics) and integration with Power BI, Synapse, and Azure Data Factory; DevOps knowledge; team-leading experience.
Posted 2 months ago
3.0 - 7.0 years
5 - 10 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Role & Responsibilities

Job Description: We are seeking a skilled and experienced Microsoft Fabric Engineer to join our data engineering team. The ideal candidate will have a strong background in designing, developing, and maintaining data solutions using Microsoft Fabric, including experience across key workloads such as Data Engineering, Data Factory, Data Science, Real-Time Analytics, and Power BI. The role requires a deep understanding of Synapse Data Warehouse, OneLake, Notebooks, Lakehouse architecture, and Power BI integration within the Microsoft ecosystem.

Key Responsibilities:
- Design and implement scalable and secure data solutions using Microsoft Fabric
- Build and maintain data pipelines using Dataflows Gen2 and Data Factory
- Work with Lakehouse architecture and manage datasets in OneLake
- Develop notebooks (PySpark or T-SQL) for data transformation and processing
- Collaborate with data analysts to create interactive dashboards and reports using Power BI (within Fabric)
- Leverage Synapse Data Warehouse and KQL databases for structured and real-time analytics
- Monitor and optimize the performance of data pipelines and queries
- Ensure adherence to data quality, security, and governance practices
- Stay current with Microsoft Fabric updates and roadmap, recommending enhancements

Required Skills:
- 3+ years of hands-on experience with Microsoft Fabric or similar tools in the Microsoft data stack
- Strong proficiency with Data Factory (Fabric), Synapse Data Warehouse / SQL analytics endpoints, Power BI integration and DAX, Notebooks (PySpark, T-SQL), and Lakehouse and OneLake
- Understanding of data modeling, ETL/ELT processes, and real-time data streaming
- Experience with KQL (Kusto Query Language) is a plus
- Familiarity with Microsoft Purview, Azure Data Lake, or Azure Synapse Analytics is advantageous

Qualifications: Microsoft Fabric, OneLake, Data Factory, Data Lake, Data Mesh
Posted 3 months ago