
153 Microsoft Fabric Jobs - Page 4

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 - 9.0 years

9 - 13 Lacs

Mumbai

Work from Office

About the job:

Role: Microsoft Fabric Data Engineer

Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.

Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.

Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems (SAP ECC, S/4HANA, SAP BW, etc.) is a plus.
- Nice to have: ability to develop dashboards or reports using tools like Power BI.

Coding fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.
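The pipeline skills this role lists (ingestion, transformation, loading) follow the classic extract-transform-load shape. As a rough, hypothetical sketch in plain Python with sqlite3 (a stand-in for a Fabric Data Factory pipeline or PySpark notebook; all table and column names are invented):

```python
import sqlite3

# Minimal ETL sketch (illustrative stand-in for a Fabric pipeline;
# table and column names are hypothetical).

def extract(conn):
    # Extract: pull raw order rows from a source table.
    return conn.execute("SELECT order_id, amount, country FROM raw_orders").fetchall()

def transform(rows):
    # Transform: drop invalid amounts and normalize country codes.
    return [(oid, round(amt, 2), country.upper())
            for oid, amt, country in rows if amt is not None and amt > 0]

def load(conn, rows):
    # Load: write cleansed rows into the warehouse table.
    conn.executemany("INSERT INTO clean_orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INT, amount REAL, country TEXT)")
conn.execute("CREATE TABLE clean_orders (order_id INT, amount REAL, country TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                 [(1, 19.999, "in"), (2, -5.0, "us"), (3, None, "uk"), (4, 42.0, "de")])

load(conn, transform(extract(conn)))
result = conn.execute("SELECT * FROM clean_orders ORDER BY order_id").fetchall()
print(result)  # [(1, 20.0, 'IN'), (4, 42.0, 'DE')]
```

In a real Fabric implementation the same three stages would be expressed as pipeline activities or PySpark transformations rather than in-process Python, but the separation of concerns is the same.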

Posted 3 weeks ago

Apply

6.0 - 10.0 years

9 - 13 Lacs

Kolkata

Work from Office

About the job:

Role: Microsoft Fabric Data Engineer

Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.

Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.

Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems (SAP ECC, S/4HANA, SAP BW, etc.) is a plus.
- Nice to have: ability to develop dashboards or reports using tools like Power BI.

Coding fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

1 - 2 Lacs

Hyderabad

Work from Office

Experience needed: 10-15 years
Type: Full-Time | Mode: WFO | Shift: General Shift IST | Location: Hyderabad | Notice Period: Immediate to 30 days

Job Summary:
We are seeking a highly experienced, results-driven Power BI Architect to lead the design, development, and implementation of enterprise-level BI solutions. The ideal candidate has deep expertise in Power BI architecture, data modeling, visualization, DAX, and Power BI/Fabric administration, along with a solid foundation in Microsoft Azure and Microsoft Entra. You will work closely with data engineers, analysts, and stakeholders to build a scalable, secure data visualization ecosystem.

Key Responsibilities:
- Design end-to-end Power BI architecture, including data ingestion, modeling, visualization, and governance.
- Lead implementation of dimensional data models to support enterprise reporting and analytics needs.
- Develop and optimize Power BI reports and dashboards using DAX, M Language (Power Query), and advanced visualizations.
- Architect and manage the Power BI Service environment, including workspaces, datasets, dataflows, gateways, and security.
- Define and implement Power BI SDLC processes, including versioning, deployment pipelines, and documentation.
- Manage Power BI/Fabric administration tasks, including tenant settings, capacity management, and usage monitoring.
- Ensure best practices in report performance tuning, data refresh optimization, and data security.
- Collaborate with Azure teams to integrate Power BI solutions with Microsoft Azure services (Data Lake, Synapse, Data Factory, etc.).
- Implement Microsoft Entra (Azure AD) role-based access controls and security for BI content.
- Provide thought leadership and mentorship to BI developers and analysts.
- Stay current on Microsoft's data and analytics roadmap and assess its applicability to ongoing projects.

Required Skills & Qualifications:
- Strong experience with Power BI Desktop, Power BI Service, and Power BI Premium/Fabric.
- Expertise in DAX and Power Query (M Language).
- Proven experience with dimensional modeling and data warehousing concepts.
- Proficiency in ETL processes and integrating data from multiple sources.
- Demonstrated success leading enterprise BI implementations.
- Solid understanding of Power BI governance, security, and lifecycle management.
- Experience with the Microsoft Azure platform, especially Azure Data Services.
- Knowledge of Microsoft Entra (Azure AD) for authentication and access management.
- Strong communication and stakeholder management skills.

Preferred Qualifications:
- Microsoft Certified: Power BI Data Analyst Associate or Azure Data Engineer Associate.
- Familiarity with DevOps and CI/CD pipelines for Power BI deployments.
- Experience working in Agile/Scrum environments.
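The dimensional-modeling requirement in this role centers on fact tables joined to dimension tables in a star schema. A minimal sketch of that shape (hypothetical table names; plain SQL via sqlite3 rather than Power BI/DAX, which this listing actually calls for):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Star-schema sketch: one fact table keyed to a date dimension.
# In Power BI this relationship would back a DAX measure instead.
conn.executescript("""
CREATE TABLE dim_date (date_key INT PRIMARY KEY, year INT, month INT);
CREATE TABLE fact_sales (date_key INT, revenue REAL);
INSERT INTO dim_date VALUES (20240101, 2024, 1), (20240201, 2024, 2);
INSERT INTO fact_sales VALUES (20240101, 100.0), (20240101, 50.0), (20240201, 75.0);
""")

# Aggregate revenue by month via the dimension -- the SQL analogue of a
# "Total Revenue" measure sliced by a date hierarchy.
rows = conn.execute("""
    SELECT d.year, d.month, SUM(f.revenue)
    FROM fact_sales f JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.year, d.month ORDER BY d.month
""").fetchall()
print(rows)  # [(2024, 1, 150.0), (2024, 2, 75.0)]
```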

Posted 3 weeks ago

Apply

5.0 - 7.0 years

16 - 25 Lacs

Kochi

Remote

Role is with one of our customers in Dubai; 100% remote/offshore.

Data Engineer with expertise in Microsoft Fabric, Databricks, and related technologies. We are seeking a skilled, proactive Data Engineer to design, operate, and enhance our data infrastructure. This role focuses on managing data warehouses, pipelines, and storage systems, with a strong emphasis on Microsoft Fabric, Databricks, and modern CI/CD practices.

Key Responsibilities:
- Data Warehousing & Storage: Design, implement, and maintain scalable data warehouses using Microsoft Fabric, Azure Synapse, and Databricks. Manage structured and unstructured data across OneLake, Delta Lake, and other storage solutions.
- Data Pipelines & Ingestion: Build and optimize data ingestion pipelines using Azure Data Factory, Databricks, Spark, and Dataflows Gen2. Ensure reliable ETL/ELT processes for real-time and batch data movement.
- Development & CI/CD: Maintain and version-control code repositories (e.g., GitHub, Azure Repos). Implement CI/CD pipelines for data workflows using Azure DevOps or GitHub Actions.
- Platform Administration: Administer Microsoft Fabric and Databricks environments, including Lakehouse and Data Engineering workloads. Monitor, troubleshoot, and fine-tune data workflows for performance and reliability.
- Governance: Ensure data quality, security, and compliance with governance standards.

Required Skills & Experience:
- Experience: 5+ years in data engineering or similar roles
- Microsoft Fabric: hands-on with OneLake, Lakehouse, Spark notebooks, and Databricks
- Languages: proficient in SQL and Python (or PySpark)
- Tools: Azure Data Factory, Databricks, Azure Synapse, Git, Power BI
- CI/CD: experience with DevOps pipelines and infrastructure-as-code
- Data Modelling: strong grasp of dimensional modelling, metadata management, and data governance

Preferred Qualifications:
- Microsoft Certified: Azure Data Engineer Associate, or a Microsoft Fabric/Databricks certification
- Familiarity with hybrid cloud environments and data mesh/fabric concepts
- Experience with structured streaming and large-scale data processing

Thanks & Regards,
Fatima, Team Lead - Recruitment
Mobile: 9819067889 | Email: fatima@varishthainfotech.com | http://www.varishthainfotech.com
Office M13, Business Venue Building, Oud Metha, Dubai, UAE, PO Box 33161

Posted 3 weeks ago

Apply

10.0 - 12.0 years

25 - 30 Lacs

Noida, Hyderabad

Work from Office

We’re hiring an Azure Data Architect with 10+ years of experience in designing end-to-end data solutions using ADF, Synapse, Databricks, Data Lake, and Python/SQL.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Noida

Work from Office

We're Hiring | Microsoft Fabric Developer | 3-7 Yrs | Noida | Hybrid

Location: Noida (hybrid, 3 days/week in office)
Experience: 3 to 7 years
Joiners: immediate to a maximum 2-week notice period only
Shift: 2:00 PM to 10:30 PM IST (cab provided)

Key Responsibilities:
- Set up and manage the Microsoft Fabric platform
- Build secure, scalable Lakehouses and implement Azure Data Factory pipelines
- Design and manage data warehouses for analytics
- Develop and manage reports and semantic models using Fabric
- Write complex SQL queries and Spark SQL, and build data solutions using PySpark
- Schedule and optimize Spark jobs for performance and reliability
- Leverage Data Activator for real-time analytics and insights

Must-Have Skills:
- 3+ years of experience with Microsoft Fabric, OneLake, and Lakehouses
- Proven expertise with Azure Data Factory and ETL
- Strong knowledge of Power BI, data warehousing, and data governance
- Proficiency in Python, PySpark, and Spark SQL
- Practical exposure to real-time analytics in Fabric

Good-to-Have:
- Knowledge of AWS services and Snowflake

To Apply: Send your resume to vijay.s@xebia.com with these details: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / LWD (if serving), Primary Skills, LinkedIn Profile.

Note: Apply only if you are an immediate joiner or on a notice period of 2 weeks or less, and are not in process for any other open roles with us.

#MicrosoftFabric #FabricDeveloper #PySparkJobs #PowerBI #AzureDataFactory #ImmediateJoiners #NoidaHiring #HybridJobs #XebiaHiring #BigData #ETL #Lakehous

Posted 4 weeks ago

Apply

6.0 - 11.0 years

17 - 25 Lacs

Pune, Chennai, Bengaluru

Hybrid

Job Title: Data Engineer
Experience: 6+ years
Location: Chennai, Coimbatore, Bangalore, Pune

About the Role:
We are seeking a skilled Data Engineer with hands-on experience in Microsoft Fabric and Azure Synapse Analytics to build scalable data pipelines, optimize data models, and modernize analytics platforms in a cloud-first environment.

Key Skills Required:
- Microsoft Fabric components (Lakehouse, OneLake, Data Pipelines, Real-Time Analytics, Power BI integration)
- Azure Synapse Analytics
- Azure data engineering
- Python / PySpark
- Good understanding of ETL/ELT processes and data warehouse best practices

Good to Have:
- Azure and Microsoft Fabric certifications
- Experience with real-time and event-driven data processing
- Familiarity with data governance tools

Why Join Us?
- Join KANINI's award-winning Data Engineering Team, recognized as the "Outstanding Data Engineering Team" at DES 2025
- Work on real-world problems at the intersection of data, AI, and product development
- Be part of a collaborative, innovative, and growth-driven environment
- Access cutting-edge tools, leadership support, and continuous learning opportunities
- Enjoy flexible work options and competitive compensation

Posted 4 weeks ago

Apply

10.0 - 16.0 years

30 - 40 Lacs

Chennai

Remote

Greetings from Sutherland! We are hiring a Senior Manager - Data Scientist. This is a 5-day work-from-home/remote opportunity with UK shift hours (3:00 PM to 12:00 AM). Please see below the job description and required qualifications.

The Sr. Manager / Manager - Data Science at Sutherland plays a vital role in analysing data from various sources and providing insights that help drive performance in the right direction. The position employs mathematical and statistical methods to analyse various forms of data (from analytics tools, monitoring data, surveys, KPI data, etc.) and recommends solutions or changes to processes and procedures to help improve performance.

Job Description:
- Collect data from various sources such as analytics tools, CRM, surveys, and monitoring.
- Thoroughly clean and prune data to discard irrelevant information.
- Explore and examine data from a variety of angles to determine hidden weaknesses, trends, and/or opportunities.
- Conduct undirected research and frame open-ended industry questions.
- Employ sophisticated analytics tools and statistical methods to build efficient descriptive, diagnostic, and predictive models, e.g. trend analysis, root-cause analysis of opportunities, prediction of KPIs, and employee behaviour such as attrition.
- Prescribe solutions, process changes, employee profiling, and training/coaching needs.
- Devise data-driven solutions for business challenges.
- Communicate predictions and findings to management through effective data visualizations, reports, and presentations.
- Measure the effectiveness of actions such as training and coaching.
- Recommend cost-effective changes to existing procedures and strategies.
- Work collaboratively with stakeholders, managers, and supervisors in various departments to collect data and implement required changes.
- Keep management updated on transformational initiatives and their effectiveness.

Qualifications:
- Graduate/Postgraduate in Science, Mathematics, Statistics, or Engineering
- 10+ years of overall experience in the IT/ITES industry supporting clients from different verticals
- Strong analytical ability with good knowledge of data mining, data visualization, and statistical tools and methods
- Ability to build effective models for data analysis and analytics
- Excellent project management skills, with the ability to create and execute action plans while working with multiple stakeholders
- Effective communicator with experience interacting with internal and external stakeholders
- Experience with digital tools and transformation technology

Posted 1 month ago

Apply

5.0 - 10.0 years

18 - 25 Lacs

Mumbai, Thane

Work from Office

Role & responsibilities:
- Assess the current Synapse Analytics workspace, including pipelines, notebooks, datasets, and SQL scripts.
- Rebuild or refactor Synapse pipelines, notebooks, and data models using Fabric-native services.
- Collaborate with data engineers, architects, and business stakeholders to ensure functional parity post-migration.
- Validate data integrity and performance in the new environment.
- Document the migration process, architectural decisions, and any required support materials.
- Provide knowledge transfer and guidance to internal teams on Microsoft Fabric capabilities.

Preferred candidate profile:
- Proven experience with Azure Synapse Analytics (workspaces, pipelines, dedicated/serverless SQL pools, Spark notebooks).
- 5 years of Synapse/Azure cloud experience; given how recently Fabric launched, candidates will likely have only 1 to 2 years of Fabric experience.
- Hands-on experience with Microsoft Fabric (Data Factory, OneLake, Power BI integration).
- Strong proficiency in SQL, Python, and Spark.
- Solid understanding of data modeling, ETL/ELT pipelines, and data integration patterns.
- Familiarity with Azure Data Lake, Azure Data Factory, and Power BI.
- Experience with Lakehouse architecture and Delta Lake in Microsoft Fabric.
- Experience with CI/CD practices for data pipelines.
- Excellent communication skills and the ability to work cross-functionally.

Nice-to-Have Skills:
- Familiarity with DataOps or DevOps practices in Azure environments.
- Prior involvement in medium- to large-scale cloud platform migrations.
- Knowledge of security and governance features in Microsoft Fabric.
- Knowledge of the Dynamics Dataverse link to Fabric.
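Post-migration validation of the kind this role describes ("validate data integrity... functional parity") is often bootstrapped with row-count and aggregate comparisons between the Synapse source and the Fabric target. A minimal, hypothetical sketch, using sqlite3 in place of both engines and an invented table schema:

```python
import sqlite3

def table_fingerprint(conn, table):
    """Row count plus an order-independent aggregate as a cheap parity check."""
    count, total = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}").fetchone()
    return count, round(total, 2)

# Simulate a source (Synapse) and a migrated target (Fabric) table.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE sales (id INT, amount REAL)")
    db.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.5), (2, 20.0)])

src_fp = table_fingerprint(source, "sales")
tgt_fp = table_fingerprint(target, "sales")
print(src_fp == tgt_fp)  # True when the migrated table matches the source
```

Real migrations would add per-column checksums and sampled row-level diffs, but count-plus-aggregate fingerprints catch most gross load failures cheaply.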

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Develop and maintain data pipelines, ETL/ELT processes, and workflows to ensure the seamless integration and transformation of data. Architect, implement, and optimize scalable data solutions. Required Candidate profile Work closely with data scientists, analysts, and business stakeholders to understand requirements and deliver actionable insights. Partner with cloud architects and DevOps teams

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Pune

Remote

Design databases and data warehouses, build Power BI solutions, support enterprise business intelligence, contribute as a strong team player, and drive continuous improvement. Experience in SQL, Oracle, SSIS/SSRS, Azure, ADF, CI/CD, Power BI, DAX, and Microsoft Fabric. Required Candidate profile: source-system data structures, data extraction, data transformation, warehousing, DB administration, query development, and Power BI development. WORK FROM HOME.

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 12 Lacs

Bengaluru

Work from Office

We are seeking an experienced Data Engineer to join our dynamic product development team. In this role, you will be responsible for designing, building, and optimizing data pipelines that ensure efficient data processing and insightful analytics. You will work collaboratively with cross-functional teams, including data scientists, software developers, and product managers, to transform raw data into actionable insights while adhering to best practices in data architecture, security, and scalability.

Role & responsibilities:
- Design, build, and maintain scalable ETL processes to ingest, process, and store large datasets.
- Collaborate with cross-functional teams to integrate data from various sources, ensuring data consistency and quality.
- Leverage Microsoft Azure services for data storage, processing, and analytics, integrating with our CI/CD pipeline on Azure Repos.
- Continuously optimize data workflows for performance and scalability, identifying bottlenecks and implementing improvements.
- Deploy and monitor ML/GenAI models in production environments.
- Develop and enforce data quality standards and data validation checks, and ensure compliance with security and privacy policies.
- Work closely with backend developers (PHP/Node/Python) and DevOps teams to support seamless data operations and deployment.
- Stay current with industry trends and emerging technologies to continually enhance data strategies and methodologies.

Required Skills & Qualifications:
- Minimum of 4 years in data engineering or a related field.
- In-depth understanding of streaming technologies such as Kafka and Spark Streaming.
- Strong proficiency in SQL, Python, and Spark SQL for data manipulation, processing, and automation.
- Solid understanding of ETL/ELT frameworks, data pipeline design, data modelling, data warehousing, and data governance principles.
- In-depth knowledge of performance tuning and optimizing data processing jobs, including debugging long-running jobs.
- Proficient in Azure technologies such as ADB, ADF, SQL (including complex queries), PySpark, Python, Synapse, Fabric, Delta Tables, and Unity Catalog.
- Deep understanding of cloud platforms (e.g., AWS, Azure, Google Cloud) and data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
- Good knowledge of Agile and SDLC/CI-CD practices and tools, with a good understanding of distributed systems.
- Proven ability to work effectively in agile/scrum teams, collaborating across disciplines.
- Excellent analytical, troubleshooting, and problem-solving skills, with attention to detail.

Preferred candidate profile:
- Experience with NoSQL databases and big data processing frameworks, e.g., Apache Spark.
- Knowledge of data visualization and reporting tools.
- Strong understanding of data security, governance, and compliance best practices.
- Effective communication skills, with the ability to translate technical concepts for non-technical stakeholders.
- Knowledge of AI-Ops and LLM data pipelines.

Why Join GenXAI?
- Innovative Environment: work on transformative projects in a forward-thinking, collaborative setting.
- Career Growth: opportunities for professional development and advancement within a rapidly growing company.
- Cutting-Edge Tools: gain hands-on experience with industry-leading technologies and cloud platforms.
- Collaborative Culture: join a diverse team where your expertise is valued and your ideas make an impact.
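The streaming requirement above (Kafka, Spark Streaming) ultimately comes down to incremental aggregation over unbounded input. A toy tumbling-window count in plain Python, with simulated events and hypothetical field names, illustrates the idea that engines like Spark Structured Streaming implement at scale:

```python
from collections import defaultdict

# Toy tumbling-window aggregation (pure-Python stand-in for a streaming
# engine; the (timestamp, user) event shape is hypothetical).

WINDOW_SECONDS = 60

def window_counts(events):
    """Count events per user within fixed 60-second windows."""
    counts = defaultdict(int)
    for ts, user in events:
        window_start = ts - (ts % WINDOW_SECONDS)  # bucket timestamp into a window
        counts[(window_start, user)] += 1
    return dict(counts)

events = [(5, "a"), (30, "a"), (61, "b"), (70, "a"), (130, "b")]
print(window_counts(events))
# {(0, 'a'): 2, (60, 'b'): 1, (60, 'a'): 1, (120, 'b'): 1}
```

A production system adds what this sketch omits: event-time vs processing-time handling, late-data watermarks, and fault-tolerant state.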

Posted 1 month ago

Apply

8.0 - 12.0 years

14 - 24 Lacs

Pune

Work from Office

Role & responsibilities:

Experience: 8-10 years in the Data and Analytics domain with expertise in the Microsoft data tech stack.
Leadership: Experience managing teams of 8-10 members.
Technical Skills: Expertise in tools such as Microsoft Fabric, Azure Synapse Analytics, Azure Data Factory, Power BI, SQL Server, and Azure Databricks. Strong understanding of data architecture, pipelines, and governance. Understanding of another data platform such as Snowflake, Google BigQuery, or Amazon Redshift is a plus. Tech stack: DBT and Databricks or Snowflake; Microsoft BI - Power BI, Synapse, and Fabric.
Project Management: Proficiency in project management methodologies (Agile, Scrum, or Waterfall).

Key Responsibilities:
- Project Delivery & Management: Take part in project delivery; help define the project plan and ensure delivery timelines are met. Maintain quality control and ensure client satisfaction at all stages.
- Team Leadership & Mentorship: Lead, mentor, and manage a team of 5 to 8 professionals. Conduct performance evaluations and provide opportunities for skill enhancement. Foster a collaborative, high-performance work environment.
- Client Engagement: Act as the primary point of contact on the technical front. Understand client needs and ensure expectations are met or exceeded. Conduct bi-weekly and monthly project reviews with the customer.
- Technical Expertise & Innovation: Stay updated with the latest trends in Microsoft data technologies (Microsoft Fabric, Azure Synapse, Power BI, SQL Server, Azure Data Factory, etc.). Provide technical guidance and support to the team.

Regards,
Ruchita Shete, Busisol Sourcing Pvt. Ltd.
Tel No: 7738389588 | Email: ruchita@busisol.net

Posted 1 month ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Chennai

Work from Office

About the job:

Role: Microsoft Fabric Data Engineer

Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.

Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.

Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems (SAP ECC, S/4HANA, SAP BW, etc.) is a plus.
- Nice to have: ability to develop dashboards or reports using tools like Power BI.

Coding fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.

Posted 1 month ago

Apply

6.0 - 9.0 years

9 - 13 Lacs

Kolkata

Work from Office

Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.

Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.

Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems (SAP ECC, S/4HANA, SAP BW, etc.) is a plus.
- Nice to have: ability to develop dashboards or reports using tools like Power BI.

Coding fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.

Posted 1 month ago

Apply

8.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Role Responsibilities:
- Design and implement data pipelines using MS Fabric.
- Develop data models to support business intelligence and analytics.
- Manage and optimize ETL processes for data extraction, transformation, and loading.
- Collaborate with cross-functional teams to gather and define data requirements.
- Ensure data quality and integrity in all data processes.
- Implement best practices for data management, storage, and processing.
- Conduct performance tuning for data storage and retrieval for enhanced efficiency.
- Generate and maintain documentation for data architecture and data flow.
- Participate in troubleshooting data-related issues and implement solutions.
- Monitor and optimize cloud-based solutions for scalability and resource efficiency.
- Evaluate emerging technologies and tools for potential incorporation in projects.
- Assist in designing data governance frameworks and policies.
- Provide technical guidance and support to junior data engineers.
- Participate in code reviews and ensure adherence to coding standards.
- Stay updated with industry trends and best practices in data engineering.

Qualifications:
- 8+ years of experience in data engineering roles.
- Strong expertise in MS Fabric and related technologies.
- Proficiency in SQL and relational database management systems.
- Experience with data warehousing solutions and data modeling.
- Hands-on experience with ETL tools and processes.
- Knowledge of cloud computing platforms (Azure, AWS, GCP).
- Familiarity with Python or similar programming languages.
- Ability to communicate complex concepts clearly to non-technical stakeholders.
- Experience implementing data quality measures and data governance.
- Strong problem-solving skills and attention to detail.
- Ability to work independently in a remote environment.
- Experience with data visualization tools is a plus.
- Excellent analytical and organizational skills.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience in Agile methodologies and project management.
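The data-quality responsibilities in this listing ("ensure data quality and integrity", "implementing data quality measures") are commonly bootstrapped as automated validation checks run against each load. A minimal, framework-free sketch, where the rule names and sample records are hypothetical:

```python
# Minimal data-quality check sketch (hypothetical rules; real pipelines
# would typically use a validation framework or platform-native checks).

def run_checks(rows):
    """Run simple uniqueness, completeness, and validity checks over records."""
    failures = []
    ids = [r["id"] for r in rows]
    if len(ids) != len(set(ids)):
        failures.append("duplicate ids")          # uniqueness check
    if any(r["email"] is None for r in rows):
        failures.append("null emails")            # completeness check
    if any(not (0 <= r["age"] <= 120) for r in rows):
        failures.append("age out of range")       # validity check
    return failures

batch = [
    {"id": 1, "email": "a@x.com", "age": 34},
    {"id": 2, "email": None, "age": 29},
    {"id": 2, "email": "c@x.com", "age": 150},
]
print(run_checks(batch))  # ['duplicate ids', 'null emails', 'age out of range']
```

Wiring such checks into the pipeline so a non-empty failure list blocks the load (or raises an alert) is what turns ad-hoc validation into governance.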

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Pune

Work from Office

Dear Candidate, We are excited to share an opportunity at Avigna.AI for the position of Data Engineer. We are looking for professionals with strong data engineering experience who can contribute to building scalable, intelligent data solutions and have a passion for solving complex problems.

Position Details:
- Role: Data Engineer
- Location: Pune, Baner (work from office)
- Experience: 7+ years
- Working Days: Monday to Friday (9:00 AM to 6:00 PM)
- Education: Bachelor's or Master's in Computer Science, Engineering, Mathematics, or a related field
- Company Website: www.avigna.ai
- LinkedIn: Avigna.AI

Key Responsibilities:
- Design and develop robust data pipelines for large-scale data ingestion, transformation, and analytics.
- Implement scalable Lakehouse architectures using tools like Microsoft Fabric for structured and semi-structured data.
- Work with Python, PySpark, and Azure services to support data modelling, automation, and predictive insights.
- Develop custom KQL queries and manage data using Power BI, Azure Cosmos DB, or similar tools.
- Collaborate with cross-functional teams to integrate data-driven components with application backends and frontends.
- Ensure secure, efficient, and reliable CI/CD pipelines for automated deployments and data updates.

Skills & Experience Required:
- Strong proficiency in Python, PySpark, and cloud-native data tools
- Experience with Microsoft Azure services (e.g., App Services, Functions, Cosmos DB, Active Directory)
- Hands-on experience with Microsoft Fabric (preferred/good to have)
- Working knowledge of Power BI and building interactive dashboards for business insights
- Familiarity with CI/CD practices for automated deployments
- Exposure to integrating machine learning into data workflows (nice to have)
- Strong analytical and problem-solving skills with attention to detail

Good to Have:
- Experience with KQL (Kusto Query Language)
- Background in simulation models or mathematical modeling
- Knowledge of Power Platform integration (Power Pages, Power Apps)

Benefits:
- Competitive salary
- Health insurance coverage
- Professional development opportunities
- Dynamic and collaborative work environment

Important Note: Kindly share your resume at talent@avigna.ai. When sharing your profile, please use the subject line: "Applying for Data Engineer role JOBID:ZR_14_JOB".

Posted 1 month ago

Apply

6.0 - 9.0 years

9 - 13 Lacs

Mumbai

Work from Office

Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.

Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.

Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems (SAP ECC, S/4HANA, SAP BW, etc.) is a plus.
- Nice to have: ability to develop dashboards or reports using tools like Power BI.

Coding fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 10 Lacs

Pune

Hybrid

Design, develop, and deploy Power BI dashboards, data models, and reports. Collaborate with stakeholders to meet business needs through data visualization and BI solutions. Required Candidate profile 4–10 years’ experience in Power BI development. Skilled in data modelling, data visualization, and data analysis. Strong collaboration and communication skills are essential.

Posted 1 month ago

Apply

7.0 - 10.0 years

15 - 25 Lacs

Gurgaon, Haryana, India

On-site

The Cloud Data Architect will lead client engagements, guiding stakeholders toward optimized, cloud-native data architectures. This role is pivotal in defining modernization strategies, designing future-state data platforms, and integrating Microsoft Fabric solutions.

Key Responsibilities:
- Lead client interviews and workshops to understand current and future data needs
- Conduct technical reviews of Azure infrastructure, including Databricks, Synapse Analytics, and Power BI
- Design scalable, optimized architecture solutions with a focus on Microsoft Fabric integration
- Define and refine data governance frameworks, including cataloguing, lineage, and quality standards
- Deliver strategic, actionable project outputs in line with client expectations
- Evaluate and ensure the quality and accuracy of deliverables
- Collaborate with business and domain stakeholders to capture and implement business logic
- Manage end-to-end project delivery, including coordination with client and internal teams
- Communicate effectively with global stakeholders across various channels
- Troubleshoot and resolve complex issues across dev, test, UAT, and production environments
- Ensure quality checks and adherence to service-level agreements and turnaround times

Required Skills and Experience:
- Bachelor's or Master's degree in Computer Science, Finance, Information Systems, or a related field
- Minimum 7 years of experience in data and cloud architecture roles
- Proven experience engaging with client stakeholders and leading solution architecture
- Deep expertise in the Azure data platform: Synapse, Databricks, Azure Data Factory, Azure SQL, Power BI
- Strong knowledge of data governance best practices, including data quality, cataloguing, and lineage
- Familiarity with Microsoft Fabric and its integration into enterprise environments
- Experience creating modernization roadmaps and designing target architectures
- Excellent verbal and written communication skills
- Strong analytical, organizational, and problem-solving abilities
- Self-starter capable of working independently and in team environments
- Experience delivering projects in agile development environments
- Project management and team leadership capabilities

Posted 1 month ago

Apply

3.0 - 5.0 years

10 - 12 Lacs

Bengaluru

Hybrid

Notice Period : Immediate

Key Responsibilities :
- Design and implement data models and schemas to support business intelligence and analytics.
- Perform data engineering tasks as needed to support analytical activities.
- Develop clear, concise, and insightful data visualizations and dashboards.
- Interpret complex datasets and communicate findings through compelling visualizations and storytelling.
- Work closely with stakeholders to understand data requirements and deliver actionable insights.
- Maintain documentation and ensure data quality and integrity.

Reporting Structure :
- Direct reporting to the Senior Data Engineer
- Dotted-line reporting to the CTO

Required Skills & Qualifications :
- Proficiency in Power BI, Microsoft Fabric, and Power Query.
- Experience designing and implementing data models and schemas.
- Familiarity with basic data engineering tasks.
- Advanced SQL skills for querying and analyzing data.
- Exceptional ability to translate complex data into clear, actionable insights.
- Strong ability to communicate complex data insights effectively to technical and non-technical audiences.

Preferred Skills :
- Experience with Python for data manipulation and analysis.
- Experience in the finance, tax, or professional services industries.
- Familiarity with Salesforce data models and integrations.
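As an illustration of the data modeling and SQL skills this role calls for, here is a minimal star-schema sketch: a fact table joined to two dimension tables and aggregated the way a BI dashboard would. It uses Python's sqlite3 as a lightweight stand-in for Azure SQL or a Fabric Warehouse; all table names and data are hypothetical.

```python
# Hypothetical star schema: dimensions hold descriptive attributes,
# the fact table holds measures plus foreign keys to each dimension.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, amount REAL);
""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, 2024, 1), (20240201, 2024, 2)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 20240101, 100.0), (2, 20240101, 50.0), (1, 20240201, 75.0)])

# A typical reporting query: aggregate the fact by dimension attributes.
rows = cur.execute("""
SELECT d.year, d.month, p.category, SUM(f.amount)
FROM fact_sales f
JOIN dim_product p ON p.product_id = f.product_id
JOIN dim_date d    ON d.date_id    = f.date_id
GROUP BY d.year, d.month, p.category
ORDER BY d.year, d.month
""").fetchall()
print(rows)  # [(2024, 1, 'Hardware', 150.0), (2024, 2, 'Hardware', 75.0)]
```

The same shape (conformed dimensions around a central fact) is what Power BI semantic models and Fabric warehouses expect from a well-designed schema.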

Posted 1 month ago

Apply

5.0 - 10.0 years

0 - 0 Lacs

Pune, Chennai

Hybrid

Ciklum is looking for a Senior Microsoft Fabric Data Engineer to join our team full-time in India. We are a custom product engineering company that supports both multinational organizations and scaling startups in solving their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.

About the role :
We are seeking a highly skilled and experienced Senior Microsoft Fabric Data Engineer to design, develop, and optimize advanced data solutions leveraging the Microsoft Fabric platform. You will be responsible for building robust, scalable data pipelines, integrating diverse and large-scale data sources, and enabling sophisticated analytics and business intelligence capabilities. This role requires extensive hands-on expertise with Microsoft Fabric, a deep understanding of Azure data services, and mastery of modern data engineering practices.

Responsibilities :
- Lead the design and implementation of highly scalable and efficient data pipelines and data warehouses using Microsoft Fabric and a comprehensive suite of Azure services (Data Factory, Synapse Analytics, Azure SQL, Data Lake)
- Develop, optimize, and oversee complex ETL/ELT processes for data ingestion, transformation, and loading from a multitude of disparate sources, ensuring high performance with large-scale datasets
- Ensure the highest level of data integrity, quality, and governance throughout the entire Fabric environment, establishing best practices for data management
- Collaborate extensively with stakeholders, translating intricate business requirements into actionable, resilient, and optimized data solutions
- Proactively troubleshoot, monitor, and fine-tune data pipelines and workflows for peak performance and efficiency, particularly in handling massive datasets
- Architect and manage workspace architecture, implement robust user access controls, and enforce data security in strict compliance with privacy regulations
- Automate platform tasks and infrastructure management using advanced scripting languages (Python, PowerShell) and Infrastructure as Code (Terraform, Ansible) principles
- Document comprehensive technical solutions, enforce code modularity, and champion best practices in version control and documentation across the team
- Stay at the forefront of Microsoft Fabric updates and new features, and contribute significantly to continuous improvement initiatives and the adoption of cutting-edge technologies

Requirements :
- Minimum of 5+ years of progressive experience in data engineering, with at least 3 years of hands-on, in-depth work on Microsoft Fabric and a wide array of Azure data services
- Exceptional proficiency in SQL, Python, and advanced data transformation tools (e.g., Spark, PySpark notebooks)
- Mastery of data warehousing concepts, dimensional modeling, and advanced ETL best practices
- Extensive experience with complex hybrid cloud and on-premises data integration scenarios
- Profound understanding of data governance, security protocols, and compliance standards
- Excellent problem-solving, analytical, and communication skills, with the ability to articulate complex technical concepts clearly to both technical and non-technical audiences

Desirable :
- Experience with Power BI, Azure Active Directory, and managing very large-scale data infrastructure
- Strong familiarity with Infrastructure as Code and advanced automation tools
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent extensive experience)

What's in it for you?
- Care: your mental and physical health is our priority. We ensure comprehensive company-paid medical insurance, as well as financial and legal consultation
- Tailored education path: boost your skills and knowledge with our regular internal events (meetups, conferences, workshops), Udemy licence, language courses and company-paid certifications
- Growth environment: share your experience and level up your expertise with a community of skilled professionals, locally and globally
- Flexibility: hybrid work mode at Chennai or Pune
- Opportunities: we value our specialists and always find the best options for them. Our Resourcing Team helps change a project if needed to help you grow, excel professionally and fulfil your potential
- Global impact: work on large-scale projects that redefine industries with international and fast-growing clients
- Welcoming environment: feel empowered with a friendly team, open-door policy, informal atmosphere within the company and regular team-building events

About us :
At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you'll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress.

India is a strategic innovation hub for Ciklum, with growing teams in Chennai and Pune leading advancements in EdgeTech, AR/VR, IoT, and beyond. Join us to collaborate on game-changing solutions and take your career to the next level. Want to learn more about us? Follow us on Instagram, Facebook, and LinkedIn.

Explore, empower, engineer with Ciklum! Experiences of tomorrow. Engineered together.

Interested already? We would love to get to know you! Submit your application. Can't wait to see you at Ciklum.
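The data-integrity and quality-gate responsibilities listed above follow a common pattern: validate each incoming record against explicit rules, load only the rows that pass, and quarantine the rest. A minimal plain-Python sketch of that pattern (all field names and rules here are hypothetical; in Fabric this logic would typically live in a PySpark notebook or Dataflow):

```python
# Illustrative ETL sketch with an inline data-quality gate.
RAW = [
    {"id": 1, "amount": "120.5", "country": "IN"},
    {"id": 2, "amount": "bad",   "country": "IN"},   # fails the type check
    {"id": 3, "amount": "80.0",  "country": None},   # fails the completeness check
]

def validate(row):
    """Quality gate: amount must parse as a float, country must be present."""
    try:
        float(row["amount"])
    except (TypeError, ValueError):
        return False
    return row["country"] is not None

def transform(row):
    """Cast raw string fields into typed values ready for loading."""
    return {"id": row["id"], "amount": float(row["amount"]), "country": row["country"]}

def run_pipeline(records):
    good, rejected = [], []
    for row in records:
        (good if validate(row) else rejected).append(row)
    loaded = [transform(r) for r in good]   # only validated rows are loaded
    return loaded, rejected

loaded, rejected = run_pipeline(RAW)
print(len(loaded), len(rejected))  # 1 2
```

Keeping the rejected rows, rather than silently dropping them, is what makes the quality gate auditable for governance purposes.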

Posted 1 month ago

Apply

5.0 - 7.0 years

15 - 25 Lacs

Pune, Ahmedabad

Hybrid

Key Responsibilities :
- Design, develop, and optimize data pipelines and ETL/ELT workflows using Microsoft Fabric, Azure Data Factory, and Azure Synapse Analytics.
- Implement Lakehouse and Warehouse architectures within Microsoft Fabric, supporting medallion (bronze-silver-gold) data layers.
- Collaborate with business and analytics teams to build scalable and reliable data models (star/snowflake) using Azure SQL, Power BI, and DAX.
- Utilize Azure Analysis Services, Power BI Semantic Models, and Microsoft Fabric Dataflows for analytics delivery.
- Apply very good hands-on experience with Python for data transformation and processing.
- Apply CI/CD best practices and manage code through Git version control.
- Ensure data security, lineage, and quality using data governance best practices and Microsoft Purview (if applicable).
- Troubleshoot and improve performance of existing data pipelines and models.
- Participate in code reviews, testing, and deployment activities.
- Communicate effectively with stakeholders across geographies and time zones.

Required Skills :
- Hands-on experience with Microsoft Fabric (Lakehouse, Warehouse, Dataflows, Pipelines).
- Strong knowledge of Azure Synapse Analytics, Azure Data Factory, Azure SQL, and Azure Analysis Services.
- Proficiency in Power BI and DAX for data visualization and analytics modeling.
- Strong Python skills for scripting and data manipulation.
- Experience in dimensional modeling, star/snowflake schemas, and Kimball methodologies.
- Familiarity with CI/CD pipelines, DevOps, and Git-based versioning.
- Understanding of data governance, data cataloging, and quality management practices.
- Excellent verbal and written communication skills.
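The medallion (bronze-silver-gold) layering mentioned above can be sketched in a few lines: bronze holds raw records as landed, silver holds deduplicated and validated rows, and gold holds business-level aggregates for reporting. A plain-Python illustration (the data and rules are hypothetical; in Fabric each layer would be a Lakehouse table written from a PySpark notebook or pipeline):

```python
# Bronze: raw records as landed, duplicates and incomplete rows included.
bronze = [
    {"order_id": "A1", "qty": 2, "price": 10.0},
    {"order_id": "A1", "qty": 2, "price": 10.0},    # duplicate landing
    {"order_id": "A2", "qty": None, "price": 5.0},  # incomplete record
    {"order_id": "A3", "qty": 1, "price": 7.5},
]

# Silver: deduplicated on the business key, validated, and enriched.
seen = set()
silver = []
for r in bronze:
    if r["order_id"] in seen or r["qty"] is None:
        continue
    seen.add(r["order_id"])
    silver.append({**r, "revenue": r["qty"] * r["price"]})

# Gold: a business-level aggregate, ready for a report or semantic model.
gold = {"total_revenue": sum(r["revenue"] for r in silver),
        "order_count": len(silver)}
print(gold)  # {'total_revenue': 27.5, 'order_count': 2}
```

Each layer only ever reads from the one below it, which is what keeps lineage traceable from a gold-level metric back to the raw landed rows.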

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Key Responsibilities :
- Design and implement data models and schemas to support business intelligence and analytics.
- Perform data engineering tasks as needed to support analytical activities.
- Develop clear, concise, and insightful data visualizations and dashboards.
- Interpret complex datasets and communicate findings through compelling visualizations and storytelling.
- Work closely with stakeholders to understand data requirements and deliver actionable insights.
- Maintain documentation and ensure data quality and integrity.

Reporting Structure :
- Direct reporting to the Senior Data Engineer
- Dotted-line reporting to the CTO

Required Skills & Qualifications :
- Proficiency in Power BI, Microsoft Fabric, and Power Query.
- Experience designing and implementing data models and schemas.
- Familiarity with basic data engineering tasks.
- Advanced SQL skills for querying and analyzing data.
- Exceptional ability to translate complex data into clear, actionable insights.
- Strong ability to communicate complex data insights effectively to technical and non-technical audiences.

Posted 1 month ago

Apply

5.0 - 7.0 years

9 - 12 Lacs

Bengaluru

Work from Office

Join our growing Data & Analytics practice as a Data Analytics & Visualization Consultant and play a key role in designing, building, and governing enterprise-grade dashboards and low-code solutions that enable data-driven decision-making across the firm and for our clients. Candidates must be ready for a face-to-face (F2F) interview in Bangalore. We are looking for a hands-on, results-driven individual with proven expertise in Power BI, Power Apps, and SQL, along with exposure to modern cloud data ecosystems. Familiarity with Snowflake, Microsoft Fabric best practices, and Finance domain knowledge will be considered valuable assets. This role spans the full delivery lifecycle, including requirements gathering, data modelling, solution design, development, testing, deployment, and support.

Responsibilities :
- Collaborate with business stakeholders to gather and translate business requirements into technical solutions.
- Design and develop end-to-end Power BI dashboards including data models, DAX calculations, row-level security, and performance optimization.
- Build and deploy Power Apps solutions to automate workflows and integrate with Microsoft 365 and data platforms.
- Write and optimize complex SQL queries to transform, clean, and extract data from Snowflake or Azure-based data platforms.
- Connect Power BI to Snowflake using best practices (ODBC, DirectQuery, Import modes).
- Author views and stored procedures on Azure SQL/Synapse to enable scalable and governed reporting.
- Understand and apply Microsoft Fabric concepts and infrastructure best practices for scalable BI and data integration.
- Develop workflows using Alteryx or similar data-prep tools as needed.
- Build data ingestion and transformation pipelines using Azure Data Factory or Synapse pipelines.
- Collaborate with data engineers to ensure data quality, integrity, and availability.
- Monitor and troubleshoot solutions, ensuring performance and reliability.
- Mentor junior team members and support internal knowledge-sharing initiatives.

Qualifications :
- Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field; Master's degree preferred.
- 5-7 years of experience in Business Intelligence or Analytics roles.
- Expertise in Power BI (data modelling, DAX, visuals, optimization), Power Apps (canvas apps, connectors, integration), and SQL (query performance, views, procedures).
- Hands-on experience with Azure Data Factory / Synapse Pipelines and data-prep tools like Alteryx or equivalent.
- Strong communication skills, with the ability to present technical concepts to business stakeholders.
- Practical, solution-oriented mindset with strong problem-solving skills.
- Experience with Snowflake (architecture, best practices, optimization).
- Exposure to the Finance domain (e.g., FP&A, P&L dashboards, financial metrics).
- Experience with other BI tools like Tableau or QlikView is a plus.
- Familiarity with Microsoft Fabric and its infrastructure components.
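The "views for governed reporting" pattern mentioned above means BI tools read from a stable view that encodes the reporting rules once, rather than each dashboard re-implementing them against raw tables. A minimal sketch using sqlite3 as a stand-in for Azure SQL/Synapse (the table, columns, and posting rule are hypothetical finance-flavoured examples):

```python
# A governed reporting view: filtering and aggregation rules live in the view,
# so every downstream dashboard sees identical, centrally maintained logic.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE gl_entries (account TEXT, period TEXT, amount REAL, is_posted INTEGER);
INSERT INTO gl_entries VALUES
  ('Revenue', '2024-01', 1000.0, 1),
  ('Revenue', '2024-01',  250.0, 0),  -- unposted entry: excluded from reporting
  ('COGS',    '2024-01', -400.0, 1);

CREATE VIEW v_pnl AS
SELECT period, account, SUM(amount) AS total
FROM gl_entries
WHERE is_posted = 1
GROUP BY period, account;
""")

# A dashboard would query the view, never the raw ledger table.
rows = conn.execute("SELECT account, total FROM v_pnl ORDER BY account").fetchall()
print(rows)  # [('COGS', -400.0), ('Revenue', 1000.0)]
```

Changing a reporting rule (say, the posting filter) then requires updating one view definition instead of every report that consumes it.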

Posted 1 month ago

Apply