
319 Microsoft Fabric Jobs - Page 13

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

10 - 20 years

20 - 35 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office

Job Summary: As a Solution Architect, you will collaborate with our sales, presales, and COE teams to provide technical expertise and support throughout the new business acquisition process. You will play a crucial role in understanding customer requirements, presenting our solutions, and demonstrating the value of our products. You thrive in high-pressure environments, maintain a positive outlook, and understand that career growth is a journey that requires strategic choices. You have strong written and verbal communication skills that enable you to convey complex technical concepts clearly and effectively. You are a customer-focused, self-motivated, and responsible team player who works well under pressure. You must have experience managing RFPs/RFIs, delivering client demos and presentations, and converting opportunities into winning bids. You bring a strong work ethic, a positive attitude, and enthusiasm for new challenges; you can multi-task and prioritise with good time management and are willing to learn. You should be able to work independently with little or no supervision, be process-oriented and methodical, and demonstrate a quality-first approach. The ability to convert clients' business challenges and priorities into winning proposals and bids through excellence in technical solutioning will be the key performance indicator for this role.

What you'll do:
- Architecture & Design: Develop high-level architecture designs for scalable, secure, and robust solutions.
- Technology Evaluation: Select appropriate technologies, frameworks, and platforms for business needs.
- Cloud & Infrastructure: Design cloud-native, hybrid, or on-premises solutions using AWS, Azure, or GCP.
- Integration: Ensure seamless integration between enterprise applications, APIs, and third-party services.
- Design and develop scalable, secure, and performant data architectures on Microsoft Azure and/or new-generation analytics platforms like Microsoft Fabric.
- Translate business needs into technical solutions by designing secure, scalable, and performant data architectures on cloud platforms.
- Select and recommend appropriate data services (e.g., Fabric, Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, Power BI) to meet specific data storage, processing, and analytics needs.
- Develop and recommend data models that optimise data access and querying.
- Design and implement data pipelines for efficient data extraction, transformation, and loading (ETL/ELT).
- Understand conceptual, logical, and physical data modelling.
- Choose and implement appropriate data storage, processing, and analytics services based on specific data needs (e.g., data lakes, data warehouses, data pipelines).
- Understand and recommend data governance practices, including data lineage tracking, access control, and data quality monitoring.

What you will bring:
- 10+ years of working in data analytics and AI technologies from consulting, implementation, and design perspectives.
- Certifications in data engineering, analytics, cloud, or AI will be a distinct advantage.
- A Bachelor's degree in engineering/technology or an MCA from a reputed college is a must.
- Prior experience working as a solution architect during the presales cycle will be an advantage.

Soft skills: communication skills, presentation skills, flexible and hard-working, high IQ and EQ.
Technical skills: knowledge of presales processes; a basic understanding of business analytics and AI.

Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience in content marketing with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your marketing skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive stipend and potential for growth within the company.
- Recognized for excellence in data and AI solutions with industry awards and accolades.

Employee Benefits

Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.

Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.

Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.

Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.

Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.

Professional Development Benefits:
- L&D with FLEX (Enterprise Learning Repository): Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.

Posted 4 months ago

Apply

6.0 - 11.0 years

10 - 20 Lacs

Hyderabad, Gurugram

Work from Office

Technical Skills (Required):
- Azure DevOps: Repos, Boards, Pipelines (build/release), CI/CD configuration and management
- Infrastructure as Code: ARM templates, Terraform or Bicep for provisioning and versioning
- Azure Data Factory: design, implement and monitor ETL/ELT pipelines
- Microsoft Fabric: access, workspace configuration, pipeline orchestration and data integration
- Azure Analysis Services & Cubes: model deployment, scaling and performance tuning
- Entra ID (Azure AD) based access: manage group-based permissions for Power BI workspaces and reports
- Scripting & Automation: PowerShell and Python to build custom pipelines, approval flows and deployment processes
- Version Control & Branching: Git workflows, pull requests and merge strategies
- Azure Databricks Administration: workspace and cluster provisioning, performance tuning, security permissions, and integration into CI/CD pipelines
- Cost Management & Allocation: implement Azure Cost Management (budgets, alerts), tagging strategies and reporting to split and track expenses across different business areas and markets (see the sketch below)

Technical Skills (Good-to-Have):
- Containerization & Orchestration: Docker, Kubernetes (AKS)
- Messaging & Integration: Azure Service Bus, Event Grid or Event Hubs
- Serverless & App Services: Functions, Logic Apps for automation tasks
- Monitoring & Logging: Azure Monitor, Log Analytics, Application Insights
- Security & Compliance: Azure Policy, RBAC, network security basics
- Cross-Cloud Familiarity: basic experience with AWS/GCP DevOps tools
- Confluence: for documentation

Soft Skills:
- Effective Communication: presenting technical insights clearly to diverse audiences
- Advanced English Proficiency: reading/writing documentation, collaborating with global teams
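
To make the cost-allocation requirement above concrete, here is a minimal, illustrative Python sketch (not part of the listing) that splits spend by a cost-centre tag from an Azure Cost Management CSV export. The file name, column names (`ResourceId`, `Cost`, `Tags`) and the `costCenter` tag key are assumptions and would need to match the real export schema.

```python
"""Illustrative only: aggregate an assumed Cost Management CSV export by a cost-centre tag."""
import csv
import json
from collections import defaultdict

REQUIRED_TAG = "costCenter"      # hypothetical tag key used for charge-back
EXPORT_FILE = "cost_export.csv"  # hypothetical Cost Management export file


def allocate_costs(path):
    totals = defaultdict(float)
    untagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Assumption: the Tags column holds a JSON object; treat parse
            # failures as untagged so they surface in reporting.
            try:
                tags = json.loads(row.get("Tags") or "{}")
            except json.JSONDecodeError:
                tags = {}
            bucket = tags.get(REQUIRED_TAG, "UNTAGGED")
            if bucket == "UNTAGGED":
                untagged.append(row.get("ResourceId", "<unknown>"))
            totals[bucket] += float(row.get("Cost", 0) or 0)
    if untagged:
        print(f"{len(untagged)} resources missing the '{REQUIRED_TAG}' tag")
    return dict(totals)


if __name__ == "__main__":
    for area, cost in sorted(allocate_costs(EXPORT_FILE).items()):
        print(f"{area}: {cost:,.2f}")
```

Flagging untagged resources alongside the totals is what makes a tagging strategy enforceable rather than aspirational.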

Posted Date not available

Apply

4.0 - 8.0 years

8 - 18 Lacs

Noida

Work from Office

We are seeking a Data Analyst with strong Power BI expertise and experience working across both modern and legacy analytics platforms. You will create analytical solutions, enable business self-service capabilities, and support the transition from legacy systems (Excel, SSRS, OLAP Cubes) to the Microsoft Fabric ecosystem. This role combines reporting, ad-hoc analysis, data investigation, and user enablement while collaborating closely with Data Engineers and Platform Engineers to deliver reliable, high-quality insights.

Key Responsibilities

1. Reporting & Analysis
- Develop and enhance Power BI dashboards, including new filters, performance optimisations, and UX improvements.
- Perform ad-hoc analysis to answer business questions across multiple data sources.
- Maintain and optimise existing reports in Excel, SSRS, and Power BI.
- Conduct exploratory data analysis to identify trends, patterns, and opportunities.
- Support end-of-month finance processes and EDW/IDW operations.

2. Data Investigation & Quality
- Investigate and resolve data quality issues using systematic "is/is not" root cause analysis.
- Collaborate with engineering teams to identify, document, and resolve discrepancies.
- Validate data accuracy across both Fabric and legacy systems (see the reconciliation sketch below).
- Provide business users with clarity on data definitions, limitations, and correct usage.

3. Self-Service Enablement & Training
- Empower business teams to access data independently via self-service tools.
- Provide training and documentation for Power BI and related Fabric capabilities.
- Guide tool selection (Excel, Power BI, or other Fabric tools) based on requirements.
- Assist in gathering business requirements for vendor-delivered solutions and validate outputs.

4. Business Insights & Strategic Support
- Analyse performance metrics to identify improvement opportunities.
- Support strategic initiatives with predictive modelling and forecasting (where applicable).
- Present actionable insights and recommendations to stakeholders.
- Provide data-driven input for business cases and investment decisions.

Essential Skills & Experience
- 2+ years in business intelligence or data analysis roles.
- Advanced Power BI skills (dashboard creation, DAX, data modelling, troubleshooting).
- Strong SQL skills for querying and analysis (Fabric & SQL Server).
- Experience with legacy systems: Excel/VBA, SSRS, OLAP Cubes.
- Systematic problem-solving approach ("is/is not" methodology).
- Strong communication skills to explain technical issues and guide tool adoption.
- Proven experience supporting migrations from legacy to modern analytics platforms.

Preferred Skills
- Microsoft certifications (PL-300, DP-600).
- Experience with Python, R, or similar analytical languages.
- Background in finance, operations, or commercial analysis.
- Familiarity with Microsoft Fabric notebooks or Jupyter notebooks.
- Statistical analysis or data science knowledge.
- Experience training business users on analytics tools.

Ways of Working
- Agile/Scrum environment with flexibility for BAU priorities.
- Close collaboration within a three-role squad:
  - Data Platform Engineer: platform governance and vendor oversight
  - Data Engineer: data pipelines and gold tables
  - Data Analyst: business insights and self-service enablement
- Hybrid technology environment (modern cloud + on-premises systems).
- Strong vendor coordination responsibilities for data product delivery.

Success Metrics
- Improved platform performance and cost optimisation.
- High data quality and reliability across reports.
- Increased adoption of self-service analytics.
- On-time, standards-compliant vendor project delivery.
- Measurable progress in migrating users from legacy to modern tools.
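
As an illustration of the validation work described above, the following is a minimal Python/pandas sketch (not part of the listing) of an "is/is not" style reconciliation between a legacy extract and a Fabric extract. The file names, key column and measure column are hypothetical placeholders.

```python
"""Illustrative only: reconcile two assumed extracts on a business key and a numeric measure."""
import pandas as pd

KEY = "order_id"      # assumed business key
MEASURE = "amount"    # assumed numeric measure to compare

legacy = pd.read_csv("legacy_extract.csv")   # placeholder legacy (SSRS/Excel) extract
fabric = pd.read_csv("fabric_extract.csv")   # placeholder Fabric extract

# Which keys ARE in one system but ARE NOT in the other?
merged = legacy.merge(fabric, on=KEY, how="outer",
                      suffixes=("_legacy", "_fabric"), indicator=True)
only_legacy = merged[merged["_merge"] == "left_only"]
only_fabric = merged[merged["_merge"] == "right_only"]

# For keys present in both, where do the measures disagree?
both = merged[merged["_merge"] == "both"].copy()
both["diff"] = both[f"{MEASURE}_legacy"] - both[f"{MEASURE}_fabric"]
mismatches = both[both["diff"].abs() > 0.01]

print(f"rows only in legacy: {len(only_legacy)}")
print(f"rows only in Fabric: {len(only_fabric)}")
print(f"value mismatches:    {len(mismatches)}")
```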

Posted Date not available

Apply

2.0 - 5.0 years

4 - 9 Lacs

Noida

Work from Office

We are seeking a skilled Data Engineer to design, build, and maintain high-performance data pipelines within the Microsoft Fabric ecosystem. The role involves transforming raw data into analytics-ready assets, optimising data performance across both modern and legacy platforms, and collaborating closely with Data Analysts to deliver reliable, business-ready gold tables. You will also coordinate with external vendors during build projects to ensure adherence to standards.

Key Responsibilities

Pipeline Development & Integration
- Design and develop end-to-end data pipelines using Microsoft Fabric (Data Factory, Synapse, Notebooks).
- Build robust ETL/ELT processes to ingest data from both modern and legacy sources.
- Create and optimise gold tables and semantic models in collaboration with Data Analysts.
- Implement real-time and batch processing with performance optimisation.
- Build automated data validation and quality checks across Fabric and legacy environments.
- Manage integrations with SQL Server (SSIS packages, cube processing).

Data Transformation & Performance Optimisation
- Transform raw datasets into analytics-ready gold tables following dimensional modelling principles.
- Implement complex business logic and calculations within Fabric pipelines.
- Create reusable data assets and standardised metrics with Data Analysts.
- Optimise query performance across Fabric compute engines and SQL Server.
- Implement incremental loading strategies for large datasets (see the sketch below).
- Maintain and improve performance across both Fabric and legacy environments.

Business Collaboration & Vendor Support
- Partner with Data Analysts and stakeholders to understand requirements and deliver gold tables.
- Provide technical guidance to vendors during data product development.
- Ensure vendor-built pipelines meet performance and integration standards.
- Collaborate on data model design for both ongoing reporting and new analytics use cases.
- Support legacy reporting systems including Excel, SSRS, and Power BI.
- Resolve data quality issues across internal and vendor-built solutions.

Quality Assurance & Monitoring
- Write unit and integration tests for data pipelines.
- Implement monitoring and alerting for data quality.
- Troubleshoot pipeline failures and data inconsistencies.
- Maintain documentation and operational runbooks.
- Support deployment and change management processes.

Required Skills & Experience

Essential
- 2+ years of data engineering experience with Microsoft Fabric and SQL Server environments.
- Strong SQL expertise for complex transformations in Fabric and SQL Server.
- Proficiency in Python or PySpark for data processing.
- Integration experience with SSIS, SSRS, and cube processing.
- Proven performance optimisation skills across Fabric and SQL Server.
- Experience coordinating with vendors on technical build projects.
- Strong collaboration skills with Data Analysts for gold table creation.

Preferred
- Microsoft Fabric or Azure certifications (DP-600, DP-203).
- Experience with Git and CI/CD for data pipelines.
- Familiarity with streaming technologies and real-time processing.
- Background in BI or analytics engineering.
- Experience with data quality tools and monitoring frameworks.
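
The incremental-loading responsibility above can be sketched concretely. Below is a minimal, illustrative PySpark example (not part of the listing) of an upsert into a gold Delta table; the table names, business key and watermark column are assumptions, and it presumes a Spark runtime with the Delta Lake Python API, such as a Fabric Lakehouse notebook.

```python
"""Illustrative only: incremental upsert into an assumed gold Delta table."""
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# 1. Read the latest silver-layer rows newer than the gold table's watermark.
silver = spark.read.table("silver_sales")                    # hypothetical source table
last_load = spark.read.table("gold_sales").agg(F.max("load_ts")).first()[0]
increment = silver if last_load is None else silver.where(F.col("load_ts") > F.lit(last_load))

# 2. Upsert into the gold table keyed on the business key.
gold = DeltaTable.forName(spark, "gold_sales")               # hypothetical target table
(gold.alias("t")
     .merge(increment.alias("s"), "t.order_id = s.order_id")
     .whenMatchedUpdateAll()
     .whenNotMatchedInsertAll()
     .execute())
```

Keying the MERGE on the business key keeps reloads idempotent, so a failed run can simply be repeated.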

Posted Date not available

Apply

4.0 - 8.0 years

10 - 15 Lacs

Noida

Work from Office

The Data Platform Engineer will be responsible for designing, implementing, and optimising the technical foundation of our Microsoft Fabric environment. This role combines deep platform architecture expertise with governance, security, cost management, and vendor oversight to ensure high performance, scalability, and compliance across enterprise data systems. You will provide architectural leadership for both internal teams and external vendors, driving best practices for building robust, future-ready data products on the Microsoft Fabric platform.

Key Responsibilities

Platform Architecture & Infrastructure
- Design and implement enterprise-grade Microsoft Fabric architecture with a focus on performance, scalability, and cost efficiency.
- Build and maintain core data infrastructure, including data lakes, semantic layers, and integration frameworks.
- Establish and manage security models, access controls, and compliance frameworks across Fabric and legacy environments (see the sketch below).
- Oversee performance monitoring, capacity planning, and resource optimisation.
- Implement cost governance practices, including usage monitoring, charge-back models, and optimisation strategies.
- Design and maintain integrations with legacy systems (SQL Server, SSIS, SSRS, Excel/VBA, OLAP cubes).

Governance & Standards
- Define and enforce enterprise data governance policies, quality standards, and architectural principles.
- Establish development standards, deployment pipelines, and Infrastructure-as-Code (IaC) practices.
- Maintain comprehensive platform documentation, architectural guidelines, and best practice standards.
- Implement data lineage, cataloguing, and metadata management solutions.
- Design and execute disaster recovery and business continuity strategies.

Technical Leadership & Vendor Management
- Mentor and guide Data Engineers and Data Analysts on platform usage and best practices.
- Lead architectural governance for external vendors developing data products on the Fabric platform.
- Review and approve vendor technical designs and implementation strategies.
- Ensure adherence to architectural standards and integration patterns.
- Troubleshoot complex technical issues in both internal and vendor-delivered solutions.
- Coordinate rollout of new platform capabilities and features.

Strategic Platform Development
- Evaluate and integrate new Microsoft Fabric features to enhance platform capabilities.
- Develop integration patterns for connecting external systems and data sources.
- Plan and execute platform upgrades, migrations, and scalability initiatives.
- Collaborate with enterprise architecture teams to align with broader IT strategy.

Required Skills & Experience

Essential
- 4+ years in data platform engineering with modern cloud platforms.
- 2+ years of hands-on experience in Microsoft Fabric administration and architecture.
- Proven expertise in cost management and performance optimisation for large-scale data platforms.
- Strong security background, including Azure AD, RBAC, and compliance frameworks (GDPR, HIPAA, etc.).
- Integration experience with legacy systems (SQL Server, SSIS, SSRS, Excel/VBA).
- Advanced performance tuning skills for Fabric compute engines and SQL Server.
- Demonstrated scalability planning for expanding data volumes and user bases.

Highly Valued
- Microsoft Azure certifications (AZ-305, DP-203, DP-700).
- Experience with enterprise data governance tools and frameworks.
- Background in platform engineering or DevOps practices.
- Familiarity with data mesh or federated analytics architectures.
- Experience with monitoring and observability tools.
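
As a small illustration of the access-control oversight described above, here is a hypothetical Python sketch (not part of the listing) that reviews workspace role assignments against an approved policy. The workspace names, group names and role model are invented; a real review would read assignments from a platform inventory or admin export.

```python
"""Illustrative only: flag Admin grants that fall outside an assumed approved policy."""

# Approved policy: which groups may hold the Admin role per workspace (hypothetical).
APPROVED_ADMINS = {
    "Finance-Gold":    {"grp-data-platform-admins"},
    "Sales-Analytics": {"grp-data-platform-admins"},
}

# Current assignments, e.g. loaded from an exported inventory file (hypothetical values).
current_assignments = [
    {"workspace": "Finance-Gold",    "group": "grp-data-platform-admins", "role": "Admin"},
    {"workspace": "Finance-Gold",    "group": "grp-finance-analysts",     "role": "Viewer"},
    {"workspace": "Sales-Analytics", "group": "grp-sales-contractors",    "role": "Admin"},
]


def find_violations(assignments):
    """Return Admin grants that are not covered by the approved policy."""
    return [a for a in assignments
            if a["role"] == "Admin"
            and a["group"] not in APPROVED_ADMINS.get(a["workspace"], set())]


for v in find_violations(current_assignments):
    print(f"Unapproved Admin grant: {v['group']} on {v['workspace']}")
```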

Posted Date not available

Apply

8.0 - 13.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid

Role & responsibilities

Mandatory skill set: MS Fabric, ADF (Azure Data Factory), ADB (Azure Databricks)

Responsibilities:
- Data Integration: Design, develop, and maintain data integration solutions using Microsoft Fabric, ensuring seamless data flow across various systems.
- Data Modeling: Create and manage data models within Microsoft Fabric to support analytics and reporting needs.
- ETL Processes: Develop and optimize ETL (Extract, Transform, Load) processes to ensure data quality and consistency (see the sketch below).
- Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions.
- Performance Tuning: Monitor and optimize the performance of data pipelines and workflows within Microsoft Fabric.
- Documentation: Document data integration processes, architectures, and workflows for future reference and compliance.
- Best Practices: Implement best practices for data engineering and ensure adherence to data governance policies.
- Troubleshooting: Identify and resolve issues related to data integration and performance.

Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: Proven experience in data engineering, specifically with Microsoft Fabric and related technologies.
- Programming Skills: Proficiency in programming languages such as SQL, Python, or C#.
- Cloud Platforms: Experience with Microsoft Azure and its data services.
- Data Warehousing: Familiarity with data warehousing concepts and tools, particularly within the Microsoft ecosystem.
- Version Control: Experience with version control systems like Git.
- Communication Skills: Strong verbal and written communication skills to collaborate with cross-functional teams.

Preferred Skills:
- Experience with Power BI for data visualization and reporting.
- Knowledge of machine learning frameworks and libraries.
- Familiarity with CI/CD practices for data pipelines.
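
To ground the ETL responsibility above, here is a minimal, illustrative PySpark sketch (not part of the listing) of a raw-to-curated step: read CSV, standardise types, deduplicate, and write a Delta table. The paths, column names and target table are assumptions, and a Spark runtime with Delta support (e.g. a Fabric or Databricks notebook) is presumed.

```python
"""Illustrative only: a basic cleanse-and-load step over assumed paths and columns."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = (spark.read
            .option("header", "true")
            .csv("Files/raw/orders/"))            # hypothetical landing path

clean = (raw
         .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
         .withColumn("amount", F.col("amount").cast("double"))
         .dropDuplicates(["order_id"])
         .filter(F.col("order_id").isNotNull()))

(clean.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("silver_orders"))              # hypothetical target table
```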

Posted Date not available

Apply

4.0 - 9.0 years

12 - 22 Lacs

Noida

Hybrid

Strong proficiency in Power BI (data modeling, measures, relationships, and visualization best practices). Hands-on experience with DAX functions and optimization techniques. Good understanding of Microsoft Fabric and its components.

Posted Date not available

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Remote

Strong expertise in Microsoft Fabric, Python/PySpark, SQL, and data warehousing/lakehouse concepts. Excellent analytical, problem-solving, and communication skills with experience in agile environments; relevant certifications are an advantage.

Posted Date not available

Apply

5.0 - 10.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role: Microsoft Fabric Developer
Experience: 5+ years
Location: Hyderabad / Bangalore / Chennai / Vizag / Kolhapur
Primary Skills: Microsoft Fabric, Azure Databricks, PySpark, SQL
Please share your updated CV: ramumu.in@mouritech.com

Role and Responsibilities:
- Hands-on with Microsoft Fabric, including Lakehouse, Synapse, Pipelines, and Dataflows.
- Proficiency in DAX, Power Query (M), and SQL.
- Strong understanding of Delta Lake, Parquet, and OneLake storage (see the sketch below).
- Knowledge of Spark Notebooks, KQL, and data engineering within Fabric.
- Experience with source control integration (Git) and CI/CD pipelines.
- Familiarity with Azure Data Factory, Azure Synapse Analytics, or Databricks is a plus.
- Excellent analytical and problem-solving skills.
- Strong communication and stakeholder management skills.
- Lead the architecture, design, and implementation of end-to-end data solutions using Microsoft Fabric.
- Define and enforce data governance, security, workspace management, and CI/CD strategies across Fabric components.
- Design Lakehouse, Warehouse, and Real-Time Analytics models using OneLake, Delta Lake, and Synapse within Microsoft Fabric.
- Architect and oversee data pipelines, dataflows, notebooks, and event-driven workloads.
- Collaborate with enterprise architects, data engineers, BI developers, and business users to align solutions with business needs.
- Define data modeling standards, naming conventions, and best practices for data engineering and reporting.
- Provide technical leadership and mentorship to delivery teams using Power BI and Fabric.
- Evaluate emerging Microsoft technologies and recommend adoption strategies.
- Work closely with stakeholders on performance tuning, cost optimization, and governance enforcement.
- Ensure compliance, security policies, and auditing across the Microsoft Fabric platform.

Regards,
Ramu M
7993140441
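
As a concrete illustration of the Delta Lake/OneLake skills above, here is a minimal, hypothetical PySpark sketch (not part of the listing) that appends landed Parquet files into a Lakehouse Delta table while tolerating additive schema changes. The folder and table names are placeholders, and a Fabric Spark notebook (or any Spark runtime with Delta Lake) is assumed.

```python
"""Illustrative only: append assumed Parquet landings into an assumed Lakehouse Delta table."""
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

events = spark.read.parquet("Files/landing/events/")   # hypothetical OneLake folder

(events.write
       .format("delta")
       .mode("append")
       .option("mergeSchema", "true")   # tolerate new columns appearing in the source
       .saveAsTable("bronze_events"))   # hypothetical Lakehouse table

# Quick sanity check of what arrived.
spark.sql("SELECT COUNT(*) AS rows FROM bronze_events").show()
```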

Posted Date not available

Apply

7.0 - 10.0 years

10 - 16 Lacs

Kolkata

Hybrid

Role & responsibilities: Microsoft Fabric.

Preferred candidate profile: This Data Engineer role covers designing, building, and maintaining data pipelines and systems. It includes collecting, transforming, and storing data; ensuring data quality and accessibility; and collaborating with other teams to meet business needs.

Posted Date not available

Apply

8.0 - 12.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Key Responsibilities
1. Plan, design, and execute test strategies for cloud applications, ensuring comprehensive testing coverage.
2. Implement and maintain automated testing frameworks for efficient and reliable testing processes.
3. Lead and mentor the testing team to deliver high-quality results within the stipulated timelines.
4. Collaborate with cross-functional teams to identify testing requirements and priorities.
5. Perform ETL testing and ensure data accuracy and integrity in data warehouse automation projects.
6. Develop test cases, scenarios, and scripts to validate data transformations and load processes (see the sketch below).
7. Monitor test results, identify defects, and work with development teams to resolve issues promptly.
8. Stay updated on industry best practices, tools, and technologies related to cloud testing and data automation processes.

Must-have technical skills:
- ETL Testing
- Power BI
- Microsoft Fabric Testing
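
To make the test-script requirement above concrete, here is a minimal, illustrative Python sketch (not part of the listing) of pytest-style ETL checks: row counts, null business keys and an aggregate reconciliation. The staged file names and column names are assumptions about how a test harness might expose source and target data.

```python
"""Illustrative only: pytest-style ETL validation over assumed staged extracts."""
import pandas as pd

SOURCE = "staging_source.csv"    # hypothetical source extract
TARGET = "warehouse_target.csv"  # hypothetical post-load extract


def load():
    return pd.read_csv(SOURCE), pd.read_csv(TARGET)


def test_row_counts_match():
    src, tgt = load()
    assert len(src) == len(tgt), f"row count mismatch: {len(src)} vs {len(tgt)}"


def test_no_null_business_keys():
    _, tgt = load()
    assert tgt["customer_id"].notna().all(), "null business keys after load"


def test_amount_totals_reconcile():
    src, tgt = load()
    # Allow a small tolerance for rounding applied during transformation.
    assert abs(src["amount"].sum() - tgt["amount"].sum()) < 0.01
```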

Posted Date not available

Apply

10.0 - 15.0 years

25 - 40 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Key Responsibilities:
- Data Modeling: Design, develop, and implement robust data models within Microsoft Fabric to support business analytics and reporting requirements (see the sketch below).
- Data Integration: Integrate diverse data sources (structured and unstructured) into the Fabric ecosystem for seamless data flow and transformation.
- Performance Optimization: Tune and optimize the performance of data models, ensuring fast and efficient querying, data processing, and storage.
- Collaboration: Partner with data engineers, analysts, and business stakeholders to ensure that data models meet business needs and technical specifications.
- Best Practices: Implement industry best practices for data modeling, data governance, and data quality assurance within the Microsoft Fabric environment.
- Documentation & Reporting: Maintain clear and concise documentation of data models, processes, and methodologies. Provide regular reports and updates on model performance and enhancements.
- Automation & CI/CD: Leverage automation tools and continuous integration/continuous deployment (CI/CD) pipelines to streamline model deployment and maintenance.
- Troubleshooting & Support: Troubleshoot data modeling issues and provide timely support for ongoing projects or system performance concerns.
- Training & Knowledge Sharing: Share your knowledge and expertise in Microsoft Fabric with the wider data team, supporting ongoing training and development.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- Proven experience working with Microsoft Fabric, Power BI, Fabric Data Pipeline, Azure Data Factory, or similar technologies.
- Strong background in data modeling (dimensional, relational, data warehousing, OLAP cubes).
- Hands-on experience with SQL, DAX, and other relevant data query languages.
- Familiarity with ETL/ELT processes, data integration, and data pipeline design.
- Experience with cloud technologies, particularly Microsoft Azure and related services (e.g., Azure Data Lake, Azure SQL Database).
- In-depth understanding of data governance, security, and best practices in cloud data management.
- Solid experience working in an agile development environment and collaborating with cross-functional teams.
- Strong problem-solving skills, with the ability to think critically and optimize complex data solutions.
- Ability to work through complex modeling and security issues.

Preferred Qualifications:
- Microsoft Certified: Azure Data Engineer Associate or related certifications.
- Experience with Power BI report/dashboard development and integration.
- Knowledge of data science concepts and working with large-scale datasets.
- Familiarity with Data Lake and Data Warehouse design and optimization.
- Understanding of DevOps principles and integration in data engineering practices.

Skills & Competencies:
- Technical Skills: Proficiency in SQL, DAX, Azure, Microsoft Fabric, Power BI, and data modeling tools.
- Analytical Thinking: Strong ability to understand complex business requirements and translate them into efficient data models.
- Communication: Excellent verbal and written communication skills, with the ability to present complex technical concepts to non-technical stakeholders.
- Collaboration: Ability to work well in a team-oriented environment, collaborating with engineers, analysts, and business stakeholders.
- Attention to Detail: High attention to detail in designing scalable, efficient, and well-documented data models.
- Project Management: Ability to manage multiple priorities and projects with tight deadlines.
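
As an illustration of the data-modeling work above, the following is a minimal, hypothetical PySpark sketch (not part of the listing) that builds a simple customer dimension with surrogate keys from a cleaned source table. The table and column names are placeholders, and a Spark runtime with Delta support is assumed.

```python
"""Illustrative only: build an assumed customer dimension with surrogate keys."""
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

source = spark.read.table("silver_customers")         # hypothetical cleaned source

dim_customer = (source
    .select("customer_id", "customer_name", "segment", "country")
    .dropDuplicates(["customer_id"])
    # Deterministic surrogate key: dense ordering over the business key.
    .withColumn("customer_sk",
                F.row_number().over(Window.orderBy("customer_id")))
    .withColumn("effective_from", F.current_date()))

(dim_customer.write
             .format("delta")
             .mode("overwrite")
             .saveAsTable("gold_dim_customer"))        # hypothetical target table
```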

Posted Date not available

Apply

6.0 - 10.0 years

17 - 22 Lacs

Pune

Work from Office

We are seeking a highly skilled and experienced Data Engineer with 5-10 years of experience, including strong expertise in Microsoft Fabric. The ideal candidate will be responsible for designing, developing, and maintaining scalable data solutions using Microsoft's unified data platform. You will collaborate with cross-functional teams to drive data strategy, architecture, and implementation across the organization.

Key Responsibilities:
- Design and implement end-to-end data solutions using Microsoft Fabric (including OneLake, Data Factory, Synapse, Power BI, and Data Activator).
- Develop and maintain data pipelines, dataflows, and datasets for analytics and reporting.
- Collaborate with business stakeholders to understand data requirements and translate them into technical solutions.
- Optimize data models and queries for performance and scalability.
- Ensure data quality, governance, and security best practices are followed.
- Monitor and troubleshoot data workflows and resolve issues proactively.
- Stay updated with the latest features and best practices in Microsoft Fabric and related technologies.
- Migrate data from on-premises systems to Fabric on Azure, then build reports with analytics and Power BI dashboards (see the sketch below).

Required Skills & Qualifications:
- 5-10 years of experience in data engineering, BI, or analytics roles.
- Strong hands-on experience with Microsoft Fabric components.
- Proficiency in Power BI, Data Factory, Synapse, and OneLake.
- Solid understanding of data modeling, ETL/ELT processes, and SQL.
- Experience with Azure Data Services is a plus.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Microsoft certifications in Azure or Power Platform.
- Experience working in Agile/Scrum environments.
- Familiarity with data governance and compliance frameworks.
- Experience in the investment domain.
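
To illustrate the on-premises-to-Fabric migration item above, here is a minimal, hypothetical PySpark sketch (not part of the listing) that copies a SQL Server table into a Lakehouse Delta table over JDBC. The connection string, credentials, table names and availability of a JDBC driver are all assumptions; in many environments this hop is handled by a Data Factory copy activity or an on-premises data gateway instead.

```python
"""Illustrative only: one-off JDBC copy of an assumed SQL Server table into an assumed Lakehouse table."""
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

jdbc_url = "jdbc:sqlserver://onprem-host:1433;databaseName=SalesDB"  # placeholder connection

orders = (spark.read
               .format("jdbc")
               .option("url", jdbc_url)
               .option("dbtable", "dbo.Orders")        # hypothetical source table
               .option("user", "etl_reader")           # placeholder credentials
               .option("password", "<from-key-vault>")
               .load())

(orders.write
       .format("delta")
       .mode("overwrite")
       .saveAsTable("bronze_orders"))                  # hypothetical Lakehouse table
```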

Posted Date not available

Apply

6.0 - 11.0 years

40 - 45 Lacs

Bengaluru

Hybrid

Key Skills: Azure ADF, Microsoft Fabric, Azure Databricks, Power BI

Roles and Responsibilities:
- Design and implement scalable BI and data architecture leveraging Microsoft Fabric, Lakehouse paradigms, and medallion architecture.
- Architect and optimize ETL/ELT pipelines using Azure Data Factory, SQL Server, Dataflows, and Fabric Pipelines.
- Integrate on-premises and cloud data sources using the On-premises Data Gateway (ODG).
- Create and manage semantic data models to bridge technical structures and business terminology.
- Build, manage, and optimize Lakehouses, Data Warehouses, and data pipelines in Microsoft Fabric.
- Design robust Power BI dashboards using advanced DAX with real-time data connectivity.
- Implement data governance, security, and compliance frameworks using Microsoft Purview or Fabric-native tools.
- Lead initiatives for AI-enhanced BI development and delivery across the analytics ecosystem.
- Define and enforce data quality rules, transformation logic, and cleansing operations in collaboration with Data Engineers (see the sketch below).
- Collaborate with stakeholders to translate business requirements into scalable technical architectures.
- Provide mentorship to BI developers, data engineers, and analysts, fostering a data-first culture.
- Drive enterprise-wide adoption of Microsoft Fabric and promote modern BI practices.
- Ensure end-to-end data lineage, quality, and availability across reporting and analytics layers.
- Document all architectural decisions, data models, workflows, and semantic layer specifications.

Skills Required:

Must-Have:
- Hands-on experience with Azure Data Factory (ADF) and Azure Databricks
- Expertise in Microsoft Fabric for building modern data platforms
- Deep understanding of data modeling, ELT/ETL design, and Lakehouse architecture
- Strong skills in Power BI, including DAX, real-time data, and custom visuals
- Knowledge of data governance, data security, and compliance tools like Microsoft Purview

Nice-to-Have:
- Experience with AI-driven development within BI tools
- Knowledge of data virtualization principles
- Familiarity with DevOps practices for BI deployments
- Exposure to data mesh or data fabric architectural patterns

Education: Bachelor's degree in a related field
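
As a small illustration of the data-quality enforcement item above, here is a hypothetical PySpark sketch (not part of the listing) in which rows that pass a set of rules continue to the curated layer while failures are quarantined for review. The rules, columns and table names are invented examples of the kind of checks an architect might standardise.

```python
"""Illustrative only: route rows through assumed data-quality rules into valid and quarantine tables."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.read.table("bronze_transactions")            # hypothetical input table

rules = (
    F.col("transaction_id").isNotNull()
    & (F.col("amount") >= 0)
    & F.col("currency").isin("USD", "EUR", "INR")
)
passes = F.coalesce(rules, F.lit(False))                # treat NULL rule results as failures

valid = df.filter(passes)
quarantine = df.filter(~passes).withColumn("dq_failed_at", F.current_timestamp())

valid.write.format("delta").mode("append").saveAsTable("silver_transactions")
quarantine.write.format("delta").mode("append").saveAsTable("dq_quarantine_transactions")
```

Quarantining instead of silently dropping keeps failed rows auditable, which is what makes a cleansing rule enforceable downstream.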

Posted Date not available

Apply

15.0 - 20.0 years

20 - 30 Lacs

Noida

Work from Office

Job Title: DBA Architect (Azure ADF, Databricks) – 15+ Years – Noida – Immediate Joiners
Payroll: Venpa Staffing Services (preferred) | Permanent also acceptable
Work Location: Noida (Onsite)
Shift: EST shift
Notice Period: Immediate joiners only
CTC: As per market standards

Job Summary: BrickRed Systems, through Venpa Staffing Services, is urgently hiring a DBA Architect with deep expertise in Azure Data Factory (ADF), Azure Databricks, and Data Fabric technologies. This is a high-impact onsite role for experienced architects with 15+ years of end-to-end data platform expertise and leadership experience.

Key Responsibilities:
- Architect and implement scalable, secure, and enterprise-grade Azure-based data solutions
- Lead design-to-delivery of data architectures including ADF, Databricks, Synapse, Data Lake, and Data Fabric
- Drive data governance, ingestion, transformation, and security initiatives
- Collaborate with engineers, analysts, and DevOps teams for solution alignment
- Define architecture best practices, patterns, and governance frameworks
- Migrate legacy/on-premises data platforms to Azure cloud environments
- Optimize performance, cost, and reliability of data pipelines
- Mentor and lead junior engineers and architects

Required Skills & Experience:
- 15+ years in database technology and architecture
- 3-5+ years in a dedicated Architect role
- Expert in Azure ADF, Azure Databricks, and Data Fabric solutions
- Strong hands-on experience in data modeling, warehousing, and ETL/ELT
- Proficient in SQL, NoSQL, and Big Data technologies
- Knowledge of CI/CD, cost optimization, and data monitoring on Azure
- Strong leadership, governance, and compliance experience

Preferred Qualifications:
- Azure certifications (e.g., Azure Data Engineer Associate, Solutions Architect)
- Experience with Microsoft Fabric
- Exposure to AI/ML integration in data pipelines
- Excellent communication and stakeholder management

Note: Candidates must be available immediately.

Apply now: if you're a seasoned data architect ready to make an impact, email your resume to karthika@venpastaffing.com or call +91 9036237987. Subject line: DBA Architect – Noida – Immediate Joiner

Posted Date not available

Apply

5.0 - 10.0 years

14 - 24 Lacs

Bengaluru

Work from Office

Dear Candidate,

Greetings! We have an urgent requirement for a Microsoft Fabric Engineer with a CMM Level 5 client (permanent position).

Location: Bangalore
Work Mode: Hybrid
Experience: 5-10 years
Notice Period: Immediate to 30 days (only candidates currently serving notice)

JD: We need a Microsoft Fabric certified candidate with Azure and PySpark experience; only relevant candidates should apply. If interested, kindly share your updated CV to Manasa@skyonn.com, and please refer your friends and colleagues.

Thank you.

Posted Date not available

Apply

18.0 - 22.0 years

55 - 60 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled and experienced Data Architect to join our team. The ideal candidate will have at least 18 years of experience in data engineering and analytics and a proven track record of designing and implementing complex data solutions. As a Senior Principal Data Architect, you will be expected to design, create, deploy, and manage Blackbaud's data architecture. This role has considerable technical influence within the Data Platform and Data Engineering teams and the Data Intelligence Center of Excellence at Blackbaud. This individual acts as an evangelist for proper data strategy with other teams at Blackbaud and assists with the technical direction, specifically with data, of other projects.

Responsibilities:
- Develop and direct the strategy for all aspects of Blackbaud's Data and Analytics platforms, products, and services.
- Set, communicate, and facilitate technical direction for the AI Center of Excellence and collaboratively beyond it.
- Design and develop breakthrough products, services, or technological advancements in the Data Intelligence space that expand our business.
- Work alongside product management to craft technical solutions that solve customer business problems.
- Own the technical data governance practices and ensure data sovereignty, privacy, security, and regulatory compliance.
- Continuously challenge the status quo of how things have been done in the past.
- Build a data access strategy to securely democratize data and enable research, modelling, machine learning, and artificial intelligence work.
- Help define the tools and pipeline patterns our engineers and data engineers use to transform data and support our analytics practice.
- Work in a cross-functional team to translate business needs into data architecture solutions.
- Ensure data solutions are built for performance, scalability, and reliability.
- Mentor junior data architects and team members.
- Keep current on technology: distributed computing, big data concepts, and architecture.
- Promote internally how data within Blackbaud can help change the world.

Required Qualifications:
- 18+ years of experience in data and advanced analytics.
- At least 8 years of experience working on data technologies in Azure/AWS.
- Expertise in SQL and Python.
- Expertise in SQL Server, Azure Data Services, and other Microsoft data technologies.
- Expertise in Databricks and Microsoft Fabric.
- Strong understanding of data modeling, data warehousing, data lakes, data mesh, and data products.
- Experience with machine learning.
- Excellent communication and leadership skills.

Preferred Qualifications:
- Experience working with .NET/Java and microservice architecture.

Posted Date not available

Apply

6.0 - 10.0 years

17 - 22 Lacs

Pune

Work from Office

We are seeking a highly skilled and experienced Data Engineer with 5-10 years of experience, including strong expertise in Microsoft Fabric. The ideal candidate will be responsible for designing, developing, and maintaining scalable data solutions using Microsoft's unified data platform. You will collaborate with cross-functional teams to drive data strategy, architecture, and implementation across the organization.

Key Responsibilities:
- Design and implement end-to-end data solutions using Microsoft Fabric (including OneLake, Data Factory, Synapse, Power BI, and Data Activator).
- Develop and maintain data pipelines, dataflows, and datasets for analytics and reporting.
- Collaborate with business stakeholders to understand data requirements and translate them into technical solutions.
- Optimize data models and queries for performance and scalability.
- Ensure data quality, governance, and security best practices are followed.
- Monitor and troubleshoot data workflows and resolve issues proactively.
- Stay updated with the latest features and best practices in Microsoft Fabric and related technologies.
- Migrate data from on-premises systems to Fabric on Azure, then build reports with analytics and Power BI dashboards.

Required Skills & Qualifications:
- 5-10 years of experience in data engineering, BI, or analytics roles.
- Strong hands-on experience with Microsoft Fabric components.
- Proficiency in Power BI, Data Factory, Synapse, and OneLake.
- Solid understanding of data modeling, ETL/ELT processes, and SQL.
- Experience with Azure Data Services is a plus.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.

Preferred Qualifications:
- Microsoft certifications in Azure or Power Platform.
- Experience working in Agile/Scrum environments.
- Familiarity with data governance and compliance frameworks.
- Experience in the investment domain.

Posted Date not available

Apply

7.0 - 12.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Key Responsibilities:
1. Plan, design, and execute test strategies for cloud applications, ensuring comprehensive testing coverage.
2. Implement and maintain automated testing frameworks for efficient and reliable testing processes.
3. Lead and mentor the testing team to deliver high-quality results within the stipulated timelines.
4. Collaborate with cross-functional teams to identify testing requirements and priorities.
5. Perform ETL testing and ensure data accuracy and integrity in data warehouse automation projects.
6. Develop test cases, scenarios, and scripts to validate data transformations and load processes.
7. Monitor test results, identify defects, and work with development teams to resolve issues promptly.
8. Stay updated on industry best practices, tools, and technologies related to cloud testing and data automation processes.

Requirements:
- Experience: 7-12 years
- Location: Bangalore, Hyderabad, Chennai & Noida
- Notice Period: Immediate joiner to 45 days

Must-have technical skills:
- Power BI
- ETL Testing
- DWH Testing
- Microsoft Fabric Testing

Posted Date not available

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies