
14 MS Fabric Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

5 - 10 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Source: Foundit

Key Responsibilities:
• Design, develop, and maintain cloud infrastructure using Azure and MS Fabric: architect and implement cloud solutions leveraging Microsoft Azure services and MS Fabric, ensuring the infrastructure supports scalability, reliability, performance, and cost-efficiency.
• Integrate containerization and orchestration technologies: utilize Kubernetes and Docker; manage and optimize Azure Kubernetes Service (AKS) deployments.
• Implement DevOps practices and automation: develop CI/CD pipelines to automate code deployment and infrastructure provisioning; use automation tools and Terraform to streamline operations and reduce manual intervention.
• Collaborate with development teams to build and deploy cloud-native applications: provide guidance and support for designing and implementing cloud-native applications, and ensure they are optimized for cloud environments.
• Monitor, troubleshoot, and optimize cloud infrastructure: implement monitoring and alerting systems to ensure infrastructure health; optimize resource usage and performance; develop cost optimization strategies for efficient use of Azure resources; troubleshoot and resolve issues quickly to minimize impact on users; ensure high availability and uptime of applications. (A small illustrative sketch follows.)
• Enhance system security and compliance: implement security best practices, ensure compliance with industry standards, and perform regular security assessments and audits.

Education: Bachelor's or Master's degree in computer science, information systems, or a related engineering discipline.

Behavioral Competencies:
• Outstanding technical leader with proven hands-on experience configuring and deploying DevOps tooling for successful delivery.
• Innovative and aligned with new product development technologies and methods.
• Excellent communication skills; able to guide, influence, and convince others in a matrix organization.
• Demonstrated teamwork and collaboration in a professional setting; proven capabilities with worldwide teams.
• Prior experience working with European customers is preferable but not mandatory.

Experience and Skills:
• 5 to 10 years in IT and/or digital companies or startups.
• Knowledge of Ansible; extensive knowledge of cloud technologies, particularly Microsoft Azure and MS Fabric.
• Proven experience with containerization and orchestration tools such as Kubernetes and Docker; experience with Azure Kubernetes Service (AKS), Terraform, and DevOps practices.
• Strong automation skills, including scripting and the use of automation tools.
• Proven track record in designing and implementing cloud infrastructure, optimizing cloud resource usage and performance, and applying Azure cost optimization strategies.
• Proven experience ensuring application uptime and rapid troubleshooting in case of failures.
• Strong understanding of security best practices and compliance standards.
• Proven experience providing technical guidance to teams, managing customer expectations, driving decisions collaboratively, resolving conflicts, and ensuring follow-through.
• Extensive knowledge of software development and system operations; proven experience designing stable solutions, testing, and debugging.
• Proficient in English; proficiency in French is a plus.

Performance Measurements: On-Time Delivery (OTD); Infrastructure Reliability and Availability; Cost Optimization and Efficiency; Application Uptime and Failure Resolution.
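As a minimal, hypothetical sketch of the monitoring work this role describes (assuming the official `kubernetes` Python client and a kubeconfig already pointing at the target AKS cluster; names are illustrative, not the employer's actual tooling):

```python
# Minimal AKS/Kubernetes health check: list pods that are not Running
# and flag them for alerting. Illustrative only; assumes `pip install kubernetes`.
from kubernetes import client, config

def unhealthy_pods():
    config.load_kube_config()            # reads ~/.kube/config by default
    v1 = client.CoreV1Api()
    problems = []
    for pod in v1.list_pod_for_all_namespaces().items:
        phase = pod.status.phase
        if phase not in ("Running", "Succeeded"):
            problems.append((pod.metadata.namespace, pod.metadata.name, phase))
    return problems

if __name__ == "__main__":
    for ns, name, phase in unhealthy_pods():
        print(f"ALERT {ns}/{name}: {phase}")  # in practice, push to an alerting system
```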

Posted 1 week ago

Apply

4.0 - 6.0 years

4 - 6 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Generate CPG business insights and reports using Excel, PowerPoint, and Power BI.

Key Responsibilities:
• Develop and maintain Power BI reports, dashboards, and visualizations that provide meaningful insights to stakeholders.
• Create comprehensive content and presentations to support business decisions and strategies.
• Extract and analyze data from NIQ, Circana, Spins, and other relevant sources.
• Work with cross-functional teams to develop and implement data-driven solutions, including data visualizations, reports, and dashboards.
• Manage analytics projects and work streams, and build dashboards and reports.
• Provide expert-level support to stakeholders on analytics and data visualization.
• Present findings and recommendations to stakeholders in a clear and concise manner.

Required Education: Bachelor's degree. Preferred Education: Master's degree.

Required Technical and Professional Expertise:
• B.Tech, Bachelor's, or Master's degree in Computer Science, Science, or a relevant field.
• 4-6 years of experience in data analysis or a related field.
• Proficiency in MS Fabric, Power BI, SQL, and Excel, and experience with NIQ, Circana, and Spins.

Preferred Technical and Professional Experience:
• Strong analytical skills and attention to detail.
• Excellent communication and presentation abilities.
• Ability to manage multiple tasks and meet deadlines.
• Experience in the CPG industry.
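A toy example of the kind of insight extraction described above, assuming only pandas and a hypothetical syndicated-data extract (column names are made up, not an actual NIQ/Circana schema):

```python
# Toy CPG insight pull: year-over-year sales growth by category.
# Column names are hypothetical placeholders for a syndicated-data extract.
import pandas as pd

sales = pd.DataFrame({
    "category":     ["Snacks", "Snacks", "Beverages", "Beverages"],
    "year":         [2023, 2024, 2023, 2024],
    "dollar_sales": [120.0, 138.0, 90.0, 85.5],
})

pivot = sales.pivot_table(index="category", columns="year", values="dollar_sales")
pivot["yoy_growth_pct"] = (pivot[2024] / pivot[2023] - 1) * 100
print(pivot.round(1))  # in practice, this feeds a Power BI dataset or a slide
```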

Posted 1 week ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Source: Naukri

Design, develop, and implement data solutions using Microsoft Fabric, including data pipelines, data warehouses, and data marts; build data transformations and workflows on the platform.
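A minimal sketch of what one such pipeline step can look like in a Fabric notebook (assumes a Fabric Spark runtime with a default lakehouse attached; paths and table names are illustrative):

```python
# Illustrative Fabric notebook cell: ingest raw CSV files into a lakehouse
# table. Paths and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided by the Fabric runtime

orders = (
    spark.read.option("header", True).csv("Files/raw/orders/")  # lakehouse Files area
    .withColumn("ingested_at", F.current_timestamp())
)

# Delta is the default table format in Fabric lakehouses.
orders.write.mode("append").saveAsTable("orders_bronze")
```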

Posted 1 week ago

Apply

3.0 - 8.0 years

9 - 19 Lacs

Bengaluru, Delhi / NCR

Work from Office

Source: Naukri

Key Responsibilities:
• Lead the implementation and optimization of Microsoft Purview across the client's data estate on MS Fabric and the Azure cloud platform (ADF, Databricks, etc.).
• Define and enforce data governance policies, data classification, sensitivity labeling, and data lineage to ensure readiness for GenAI use cases.
• Collaborate with data engineers, architects, and AI/ML teams to ensure data discoverability, compliance, and ethical AI readiness.
• Design and implement data cataloging strategies to support GenAI model training and inference.
• Provide guidance on data access controls, privacy, and regulatory compliance (e.g., GDPR, HIPAA).
• Conduct workshops and training sessions for client stakeholders on Purview capabilities and best practices.
• Monitor and report on data governance KPIs and GenAI readiness metrics.

Required Skills & Qualifications:
• Proven experience as a Microsoft Purview SME in enterprise environments.
• Strong knowledge of Microsoft Fabric, OneLake, and Synapse Data Engineering.
• Experience with data governance frameworks and metadata management.
• Hands-on experience with data classification, sensitivity labels, and data lineage tracking.
• Understanding of compliance standards and data privacy regulations.
• Excellent communication and stakeholder management skills.

Preferred Qualifications:
• Microsoft certifications in Azure Data, Purview, or Security & Compliance.
• Experience working with Azure OpenAI, Copilot integrations, or other GenAI platforms.
• Background in data science, AI ethics, or ML operations is a plus.
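Purview performs classification natively across the estate; purely as a conceptual illustration of what sensitivity classification means, here is a toy rule-based scanner (all patterns and labels are simplified assumptions, not Purview's actual classifiers):

```python
# Toy illustration of rule-based data classification, the concept Purview
# automates at scale. Patterns and labels are simplified examples.
import re

CLASSIFIERS = {
    "Email Address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "India PAN":     re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),
}

def classify_column(values):
    """Return the set of sensitivity labels matched in a column sample."""
    labels = set()
    for value in values:
        for label, pattern in CLASSIFIERS.items():
            if pattern.search(str(value)):
                labels.add(label)
    return labels

sample = ["alice@example.com", "ABCDE1234F", "hello"]
print(classify_column(sample))  # {'Email Address', 'India PAN'}
```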

Posted 2 weeks ago

Apply

2.0 - 5.0 years

3 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Title: Power BI Developer
Experience: 2-3 Years
Location: Bangalore - Indiranagar (Work from Office Only)
Employment Type: Full-Time

Job Description: We are looking for a Power BI Developer with 2-3 years of hands-on experience in designing and developing BI reports and dashboards using Power BI. Candidates with experience in Microsoft Fabric will be given preference. Strong communication skills are essential, as the role involves close collaboration with cross-functional teams.

Key Responsibilities:
• Develop, design, and maintain interactive dashboards and reports in Power BI
• Work closely with stakeholders to gather requirements and translate them into effective data visualizations
• Optimize data models for performance and usability
• Implement row-level security and data governance best practices
• Stay updated with Power BI and MS Fabric capabilities and best practices

Requirements:
• 2-3 years of hands-on Power BI development experience
• Familiarity with Power Query, DAX, and data modeling techniques
• Experience in Microsoft Fabric is a plus
• Strong analytical and problem-solving skills
• Excellent verbal and written communication skills

Interested candidates, kindly share your CV and the details below to usha.sundar@adecco.com:
1) Present CTC (Fixed + VP)
2) Expected CTC
3) No. of years' experience
4) Notice Period
5) Offer in hand
6) Reason for Change
7) Present Location

Posted 2 weeks ago

Apply

3.0 - 4.0 years

5 - 6 Lacs

Hyderabad

Work from Office

Source: Naukri

Overview: This role serves as an Associate Analyst on the GTM Data Analytics COE project development team. It is a go-to resource for building and maintaining the key reports, data pipelines, and advanced analytics needed to bring insights to light for senior leaders and Sector and field end users.

Responsibilities: The COE's core competencies are mastery of data visualization, data engineering, data transformation, and predictive and prescriptive analytics.
• Enhance data discovery, processes, testing, and data acquisition from multiple platforms.
• Apply detailed knowledge of PepsiCo's applications for root-cause problem-solving.
• Ensure compliance with PepsiCo IT governance rules and design best practices.
• Participate in project planning with stakeholders to analyze business opportunities and define end-to-end processes.
• Translate operational requirements into actionable data presentations.
• Support data recovery and integrity issue resolution between business and PepsiCo IT.
• Provide performance reporting for the GTM function, including ad-hoc requests using internal shipment data systems.
• Develop on-demand reports and scorecards for improved agility and visualization.
• Collate and analyze large data sets to extract meaningful insights on performance trends and opportunities.
• Present insights and recommendations to the GTM Leadership team regularly.
• Manage expectations through effective communication with headquarters partners.
• Ensure timely and accurate data delivery per service level agreements (SLAs).
• Collaborate across functions to gather insights for action-oriented analysis.
• Identify and act on opportunities to improve work delivery; implement process improvements, reporting standardization, and optimal technology use.
• Foster an inclusive and collaborative environment.
• Provide baseline support for monitoring SPA mailboxes, work intake, and other ad-hoc requests and queries.

Qualifications:
• Undergraduate degree in Business or a related technology field
• 3-4 years of working experience in Power BI
• 1-2 years of working experience in SQL and Python

Preferred Qualifications:
• Information technology or analytics experience is a plus
• Familiarity with Power BI/Tableau, Python, SQL, Teradata, Azure, MS Fabric
• Strong analytical, critical-thinking, and problem-solving skills, with great attention to detail
• Strong time management skills; ability to multitask, set priorities, and plan

Posted 3 weeks ago

Apply

4.0 - 9.0 years

0 - 25 Lacs

Hyderabad, Pune, Greater Noida

Work from Office

Source: Naukri

Roles and Responsibilities:
• Design, develop, test, deploy, and maintain large-scale data pipelines using Azure Data Factory (ADF) to integrate various data sources into a centralized platform.
• Collaborate with cross-functional teams to gather requirements for data integrations and ensure seamless delivery of high-quality solutions.
• Develop complex SQL queries to extract insights from large datasets stored in relational databases such as PostgreSQL or MySQL.
• Troubleshoot issues related to data pipeline failures, identify root causes, and implement fixes to prevent future occurrences.

Job Requirements:
• 4-9 years of experience in designing and developing data integration solutions using ADF or similar tools such as Informatica PowerCenter or Talend Open Studio.
• Strong understanding of Microsoft Azure services, including storage options (e.g., Blob Storage), compute resources (e.g., Virtual Machines), and networking concepts (e.g., VPN).
• Proficiency in writing complex SQL queries against large datasets in relational databases such as PostgreSQL or MySQL.
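As a small, hypothetical illustration of the SQL analysis work named above (assumes `psycopg2` and an existing PostgreSQL `orders` table; all identifiers and the connection string are made up):

```python
# Hypothetical example: pull a per-customer revenue summary from PostgreSQL.
# Assumes `pip install psycopg2-binary`; adjust the DSN to your environment.
import psycopg2

QUERY = """
SELECT customer_id,
       COUNT(*)       AS order_count,
       SUM(total_amt) AS revenue
FROM orders
WHERE order_date >= %s
GROUP BY customer_id
HAVING SUM(total_amt) > %s
ORDER BY revenue DESC
LIMIT 10;
"""

with psycopg2.connect("dbname=sales user=analyst host=localhost") as conn:
    with conn.cursor() as cur:
        cur.execute(QUERY, ("2024-01-01", 10000))
        for customer_id, order_count, revenue in cur.fetchall():
            print(customer_id, order_count, revenue)
```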

Posted 3 weeks ago

Apply

8 - 12 years

19 - 30 Lacs

Pune, Bengaluru

Work from Office

Source: Naukri

About Position: We at Persistent are looking for a Data Engineering Lead with experience in MS Fabric, SQL, and Python, along with knowledge of data extraction and ETL processes.

Role: Data Engineering Lead
Location: Pune, Bangalore
Experience: 8+ years
Job Type: Full Time Employment

What You'll Do:
• Work with the business to understand requirements and translate them into low-level design
• Design and implement robust, fault-tolerant, scalable, and secure data pipelines using PySpark notebooks in MS Fabric (see the sketch after this listing)
• Review peers' code and mentor junior team members
• Participate in sprint planning and other agile ceremonies
• Drive automation and efficiency in data ingestion, data movement, and data access workflows
• Contribute ideas to ensure that required standards and processes are in place, and actively look for opportunities to enhance standards and improve process efficiency

Expertise You'll Bring:
• Around 8 to 12 years of experience, with at least 1 year in MS Fabric and the Azure cloud
• Leadership: ability to lead and mentor junior data engineers and help with planning and estimation
• Data migration: experience migrating and re-modeling large enterprise data from legacy warehouses to a Lakehouse (Delta Lake) on MS Fabric or Databricks
• Strong data engineering skills: proficiency in extract, transform, load (ETL) processes, data modeling, and database management; experience setting up pipelines using notebooks and ADF, along with monitoring and alert notifications
• Experience with data lake technologies: MS Fabric, Azure, Databricks, Python, an orchestration tool such as Apache Airflow or Azure Data Factory, Azure Synapse with stored procedures, and Azure Data Lake Storage
• Data integration knowledge: familiarity with batch processing, streaming, real-time ingestion, auto-loader, change data capture, and creation of fact and dimension tables
• Programming skills: proficiency in SQL, Python, and PySpark for data manipulation and transformation
• DP-700 certification is preferred

Benefits:
• Competitive salary and benefits package
• Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
• Opportunity to work with cutting-edge technologies
• Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
• Annual health check-ups
• Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.

Our company fosters a values-driven and people-centric work environment that enables our employees to: accelerate growth, both professionally and personally; impact the world in powerful, positive ways, using the latest technologies; enjoy collaborative innovation, with diversity and work-life wellbeing at the core; and unlock global opportunities to work and learn with the industry's best. Let's unleash your full potential at Persistent. "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."
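A minimal sketch of the change-data-capture pattern this role mentions, written as a Delta MERGE in a Fabric or Databricks notebook (table names are hypothetical; assumes a Delta-enabled Spark session):

```python
# Illustrative CDC upsert: merge a batch of changed rows into a Lakehouse
# dimension table. Table names are hypothetical; requires the Delta-enabled
# Spark session that Fabric and Databricks notebooks provide.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    MERGE INTO dim_customer AS target
    USING customer_changes AS source
    ON target.customer_id = source.customer_id
    WHEN MATCHED AND source.is_deleted = true THEN DELETE
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```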

Posted 1 month ago

Apply

2 - 7 years

11 - 18 Lacs

Pune, Navi Mumbai, Mumbai (All Areas)

Work from Office

Source: Naukri

Job Title: Data Integration Developer

Overview: Develop, update, maintain, test, and document ETL code and processes to accommodate source system changes, as well as warehouse improvements, bug fixes, and new ETL steps/pipelines.

Responsibilities:
• Design, develop, and maintain data integration/ETL processes across a variety of technologies
• Test and document integration steps and processes
• Coordinate with the development team to streamline new functionality into the infrastructure
• Maintain documentation of analytic methodologies, ensure the quality of all deliverables, and manage project timelines and scopes
• Be a good team player, with the ability to handle multiple projects with minimum supervision
• Provide regular progress and update reports to the leadership team

Qualifications:
• Comprehensive understanding of data modeling within a data warehouse
• 2+ years of enterprise ETL development experience
• Strong T-SQL knowledge
• Strong experience with Azure Data Factory, Azure SQL, or Azure Synapse
• Experience with MS Fabric, Databricks, or data visualization tools
• Experience with Python and PySpark
• Advanced analytical thinking
• Experience with data visualization products (e.g., Power BI, Tableau, or similar)
• Excellent communication and organizational skills
• Experience with AI/ML

Posted 2 months ago

Apply

4 - 8 years

6 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

As a Big Data Engineer, you will be responsible for the design, implementation, security, and ongoing support of Scality data storage. You will partner with data scientists as well as product stakeholders to create an advanced analytics ecosystem.
• Obtain an overall understanding of data and processes
• Partner with the Product Owner on requirements and brainstorm implementation options
• Participate in planning sessions to determine technical designs and estimates
• Develop and deliver solutions using Big Data/Spark tools and technologies
• Collaborate effectively with other teams (internal and external) in a very dynamic agile environment
• Learn new tools and implement continuous improvements in support of the Product's strategic vision

Key Responsibilities:
• Partners closely with team members on Big Data solutions for our data science community and analytic users
• Partners with 360F teams on Big Data efforts
• Executes moderately complex functional work tracks for the team
• Contributes to the development of moderately complex prototypes and department applications that integrate big data and advanced analytics to make business decisions
• Develops innovative solutions to Big Data issues and challenges within the team
• Contributes to the development of moderately complex technical solutions using Big Data techniques in data & analytics processes
• Leverages Big Data best practices and lessons learned to develop technical solutions
• Understands Big Data related problems and requirements to identify the correct technical approach
• Uses new areas of Big Data technologies (ingestion, processing, distribution) and researches delivery methods that can solve business problems

Key Skills: Experience in Spark, Scala, Python, SQL, Unix, Kubernetes, and Airflow; experience in Power BI and MS Fabric is preferable. Additionally, the role supports on-call jobs during India daytime on a rotation basis.

Education: 4-year Bachelor's degree (preferred)
Experience: 2 or more years of experience (preferred)
Supervisory Responsibilities: This job does not have supervisory duties.
Education & Experience (in lieu): In lieu of the above education requirements, an equivalent combination of education and experience may be considered.
Primary Skills: Big Data Engineering, Big Data Systems, Big Data Technologies, Data Science, Influencing Others
Shift Time: Shift B (India)
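A minimal sketch of the orchestration work this stack implies, written as a toy Airflow DAG (DAG and task names are hypothetical, and the task bodies are placeholders rather than real Spark jobs):

```python
# Toy Airflow DAG: nightly ingest-then-validate pipeline. Task logic is a
# placeholder; a real deployment would call Spark jobs or provider operators.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull raw files into the data lake")   # placeholder for a Spark submit

def validate():
    print("run row-count and schema checks")     # placeholder for quality checks

with DAG(
    dag_id="nightly_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    ingest_task >> validate_task  # validate runs only after ingest succeeds
```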

Posted 2 months ago

Apply

8 - 12 years

22 - 30 Lacs

Hyderabad

Remote

Source: Naukri

We are looking for a skilled Data Engineer with strong expertise in SQL development, SSIS, and Microsoft Fabric. The ideal candidate should have hands-on experience in T-SQL development and in working across the Bronze and Gold layers of Microsoft Fabric. Experience with Power BI is a plus.

Key Responsibilities:
SQL & SSIS Development:
• Design, develop, and optimize SQL queries, stored procedures, and functions
• Develop and maintain SSIS packages for ETL processes
• Ensure data quality, performance tuning, and error handling in ETL pipelines
MS Fabric & Data Engineering:
• Work on the Bronze and Gold layers of Microsoft Fabric for data transformation and enrichment (see the sketch after this listing)
• Develop and manage T-SQL scripts for data processing and transformation
• Implement best practices for data modeling and data pipeline optimization
Power BI Integration (Preferred):
• Collaborate with BI teams to support Power BI dashboards and reports
• Optimize datasets and queries for efficient reporting
Collaboration & Process Improvement:
• Work closely with data analysts, business teams, and other engineers to deliver data solutions
• Ensure best practices in database design, performance tuning, and security
• Automate workflows and optimize data pipelines for scalability

Required Skills & Qualifications:
Technical Expertise:
• Strong hands-on experience in SQL development and T-SQL scripting
• Experience in SSIS (SQL Server Integration Services) for ETL processes
• Knowledge of Microsoft Fabric, specifically the Bronze and Gold layers
• Familiarity with data engineering concepts, data modeling, and data warehousing
• Exposure to Power BI for data visualization (preferred)
Other Skills:
• Ability to troubleshoot and optimize SQL queries for performance
• Experience handling large datasets and optimizing ETL processes
• Strong problem-solving and analytical skills
• Excellent communication and teamwork abilities

Preferred Qualifications:
• Experience with Azure Data Services (Azure Synapse, Data Factory, etc.) is a plus
• Certifications in Microsoft SQL Server, Azure Data Engineering, or Power BI are beneficial
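A minimal sketch of a Bronze-to-Gold hop in the medallion pattern the listing refers to (hypothetical table and column names; assumes a Fabric notebook's Spark session):

```python
# Illustrative Bronze-to-Gold transformation: cleanse raw orders and publish
# a curated daily revenue table. Table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("orders_bronze")

gold = (
    bronze
    .dropDuplicates(["order_id"])                       # basic cleansing
    .filter(F.col("total_amt").isNotNull())
    .groupBy(F.to_date("order_ts").alias("order_date")) # enrich: daily grain
    .agg(F.sum("total_amt").alias("daily_revenue"),
         F.countDistinct("customer_id").alias("buyers"))
)

gold.write.mode("overwrite").saveAsTable("revenue_gold_daily")
```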

Posted 2 months ago

Apply

5 - 8 years

7 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

• Understanding of the current architecture and design principles of the Databricks ecosystem
• Managing and securing data assets in Databricks using Unity Catalog
• Monitoring and observability using Datadog
• Code quality and security analysis with SonarQube
• Code walkthroughs to explain current Python/PySpark scripts for data processing and analysis
• Version control with GitHub and automating workflows using GitHub Actions
• Understanding of financial operations and cost management in cloud environments
• Understanding of third-party vendor coordination and the escalation matrix (if any)
• Understanding of existing DevOps for the CI/CD pipeline
• Understanding of current application performance, monitoring, and logging practices

Posted 3 months ago

Apply

2 - 5 years

4 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

• 8+ years of IT experience, with at least 3 years on data warehousing projects
• Worked on at least 3 data warehousing projects, contributing to the development of ETL solutions
• Proficient with Python and PySpark for data manipulation, analysis, and extraction
• Advanced SQL: very strong understanding of databases and data modeling; must possess hands-on experience writing complex SQL
• Azure Data Factory (ADF): a key service for data integration and orchestration; should know how to create data pipelines, schedule activities, and manage data movement and transformation using ADF
• Azure Databricks: a cloud-based big data analytics platform based on Apache Spark; should be adept at using Databricks for data engineering tasks like data ingestion, transformation, and analysis
• Azure SQL Database: Microsoft's fully managed relational database service; should be proficient in using it for data storage, retrieval, and basic manipulation
• Azure DevOps: knowledge of Azure DevOps is valuable for implementing Continuous Integration and Continuous Deployment (CI/CD) pipelines for data engineering solutions
• Monitoring and optimization: understanding how to monitor the performance of data engineering solutions and optimize them for better efficiency is crucial
• Data quality and data cleaning: knowing how to ensure data quality and perform data-cleaning operations to maintain reliable data is important for data engineers
• Data modeling and ETL/ELT: skilled in data modeling techniques and Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) processes for data integration
• Good to have, Apache Spark: understanding of Spark and its various components; Spark is a fast, general-purpose data processing engine that can run in-memory, making it well-suited for iterative algorithms and interactive data analysis

Posted 3 months ago

Apply

2 - 7 years

4 - 8 Lacs

Maharashtra

Work from Office

Source: Naukri

Configures and manages the MS Fabric environment. Implements the security model and framework. Troubleshoots and resolves issues related to the MS Fabric environment. Collaborates with the DevOps engineers to ensure smooth deployment and integration of MS Fabric resources.

Posted 3 months ago

Apply