153 Microsoft Fabric Jobs - Page 5

JobPe aggregates job results for easy access, but you apply directly on the original job portal.

8.0 - 10.0 years

0 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Microsoft Fabric Professionals in the following areas:

Position: Data Analytics Lead
Experience: 8+ years

Responsibilities:
- Build, manage, and foster a high-functioning team of data engineers and data analysts.
- Collaborate with business and technical teams to capture and prioritize platform ingestion requirements.
- Experience of working with the manufacturing industry in building a centralized data platform for self-service reporting.
- Lead the data analytics team members, providing guidance, mentorship, and support to ensure their professional growth and success.
- Manage customer, partner, and internal data on the cloud and on-premises.
- Evaluate and understand current data technologies and trends, and promote a culture of learning.
- Build an end-to-end data strategy, from collecting requirements from the business to modelling the data and building reports and dashboards.

Required Skills:
- Experience in data engineering and architecture, with a focus on developing scalable cloud solutions in Azure Synapse / Microsoft Fabric / Azure Databricks.
- Accountable for the data group's activities, including architecting, developing, and maintaining a centralized data platform covering operational data, the data warehouse, the data lake, Data Factory pipelines, and related data services.
- Experience in designing and building operationally efficient pipelines utilising core Azure components such as Azure Data Factory, Azure Databricks, and PySpark.
- Strong understanding of data architecture, data modelling, and ETL processes.
- Proficiency in SQL and PySpark.
- Strong knowledge of building Power BI reports and dashboards.
- Excellent communication skills.
- Strong problem-solving and analytical skills.

Required Technical/Functional Competencies:
- Domain/Industry Knowledge: Basic knowledge of the customer's business processes and relevant technology platform or product. Able to prepare process maps, workflows, business cases, and simple business models in line with customer requirements with assistance from SMEs, and apply industry standards/practices in implementation with guidance from experienced team members.
- Requirement Gathering and Analysis: Working knowledge of requirement management and requirement analysis processes, tools & methodologies. Able to analyse the impact of a change request, enhancement, or defect fix and identify dependencies or interrelationships among requirements and transition requirements for an engagement.
- Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products, provide inputs to design and architecture adhering to industry standards/practices, analyze various frameworks/tools, review code, and provide feedback on improvement opportunities.
- Architecture Tools and Frameworks: Working knowledge of industry architecture tools & frameworks. Able to identify the pros/cons of tools & frameworks available in the market, use them per customer requirements, and explore new tools/frameworks for implementation.
- Architecture Concepts and Principles: Working knowledge of architectural elements, SDLC, and methodologies. Able to provide architectural design/documentation at an application or functional-capability level, implement architectural patterns in solutions and engagements, and communicate architecture direction to the business.
- Analytics Solution Design: Knowledge of statistical and machine learning techniques such as classification, linear regression modelling, clustering, and decision trees. Able to identify the cause of errors and their potential solutions.
- Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial and open-source data science/analytics tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies:
- Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
- Collaboration: Shares information within the team, participates in team activities, and asks questions to understand other points of view.
- Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
- Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
- Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
- Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.
- Resolves Conflict: Displays sensitivity in interactions and strives to understand others' views and concerns.

Certifications: Mandatory

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and an ethical corporate culture.

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Jaipur

Work from Office

We are looking for a skilled Power BI Developer to design, develop, and manage business intelligence solutions that provide actionable insights. You will work closely with stakeholders to understand data needs and transform complex datasets into intuitive, interactive dashboards and reports using Power BI.

Key Responsibilities:
- Understand business requirements and translate them into technical BI solutions
- Design and develop Power BI dashboards, reports, and datasets
- Optimize data models for performance and scalability
- Use Python scripts within Power BI for advanced data manipulation, forecasting, and automation (see the sketch after this listing)
- Integrate data from multiple sources (SQL, Excel, APIs, etc.)
- Work with DAX and Power Query for data transformation and calculations
- Collaborate with data engineers and business analysts to ensure data accuracy
- Maintain and support existing BI solutions and troubleshoot issues
- Stay current on Power BI updates and best practices

Requirements:
- Bachelor's degree; MBA or relevant technical degree preferred, with 3-7 years of experience in data analytics.
- Excellent visualization and storytelling skills, and familiarity with best practices for conveying insights in an intuitive and visually compelling manner.
- Experience in creating, collecting, analyzing, and communicating business insights from data.
- Strong background in retention, churn, engagement, and marketing analytics metrics.
- Knowledge of self-service BI approaches and predictive analytics solutions is a strong advantage.
- Proficiency in SQL and experience working in a Microsoft Azure environment.
- Experience with statistical techniques (e.g., R, hypothesis testing).
- 2-3 years of hands-on experience with BI tools such as Tableau, Power BI, or Fabric.
- Knowledge of GCS or AWS and AI/ML is a plus.

Preferred Qualifications:
- Microsoft Power BI certification
- Experience with large datasets and performance tuning
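As a rough illustration of the "Python scripts within Power BI" point above: Power BI Desktop's Run Python script step exposes the current query as a pandas DataFrame named `dataset`. A minimal sketch (the column names are hypothetical, and a tiny stand-in DataFrame is fabricated here so it runs standalone):

```python
import pandas as pd

# In Power BI's "Run Python script" step, the current query arrives as a
# pandas DataFrame named `dataset`. We fabricate a stand-in so the sketch
# runs outside Power BI; the column names are hypothetical.
dataset = pd.DataFrame({
    "order_date": pd.date_range("2024-01-01", periods=10, freq="D"),
    "revenue": [120, 135, 128, 150, 160, 142, 155, 170, 165, 180],
})

dataset = dataset.sort_values("order_date")
# 7-row moving average as a naive smoothing baseline for trend visuals.
dataset["revenue_7d_avg"] = dataset["revenue"].rolling(7, min_periods=1).mean()

# Any DataFrame left at the end of the script becomes the step's output table.
print(dataset.tail())
```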

Posted 1 month ago

Apply

5.0 - 6.0 years

13 - 14 Lacs

Bhopal, New Delhi, Pune

Hybrid

Design, build & maintain Power BI/Fabric semantic models, reports, dashboards & pipelines using advanced DAX, Power Query, and KQL; optimize performance and refreshes, handle complex models, and collaborate. Contact: 9063478484 / v.aparna@tekgenieservices.com

Posted 1 month ago

Apply

7.0 - 12.0 years

7 - 12 Lacs

Pune, Maharashtra, India

On-site

As part of a critical healthcare IT transformation, Xpress Wellness is migrating its data infrastructure from Google Cloud Platform (GCP) to Microsoft Azure and building an end-to-end ETL and reporting system to deliver key KPIs via Power BI. We are seeking a hands-on Technical Lead - Azure Data Engineering to lead the data engineering workstream of the project. The ideal candidate will have deep expertise in Azure cloud data services, strong experience in data migration from GCP to Azure, and a solid understanding of data governance, compliance, and Azure storage architectures.

Key Responsibilities:
- Lead the technical design and implementation of data pipelines and storage on Azure.
- Drive the GCP-to-Azure data migration strategy and execution.
- Oversee the development of scalable ETL/ELT processes using Azure Data Factory, Synapse, or Fabric.
- Ensure alignment with data governance and healthcare compliance standards.
- Collaborate with architects, data engineers, and Power BI developers to enable accurate KPI delivery.
- Provide technical mentorship to junior engineers and ensure best practices are followed.
- Act as the primary technical point of contact for data engineering-related discussions.

Key Skills & Qualifications:
- 7-12 years of experience in data engineering, with 3-6 years in Azure Cloud.
- Strong experience with Azure Data Factory, Azure Data Lake, Microsoft Fabric, ETL, Synapse Analytics, and Azure Storage services.
- Hands-on experience in data migration projects from GCP to Azure.
- Knowledge of data governance, Microsoft Purview, privacy, and compliance (HIPAA preferred).
- Excellent communication and stakeholder management skills.
- Relevant Microsoft certifications are a plus.

Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience, and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer: Group Health Insurance covering a family of 4, Term Insurance and Accident Insurance, Paid Holidays & Earned Leaves, Paid Parental Leave, Learning & Career Development, Employee Wellness.

Posted 1 month ago

Apply

2.0 - 5.0 years

8 - 13 Lacs

Pune, Maharashtra, India

On-site

We are looking for a proactive and detail-oriented Junior Data Engineer with 2 to 5 years of experience to join our cloud data transformation team. The candidate will work closely with the Data Engineering Lead and Solution Architect to support data migration, pipeline development, testing, and integration efforts on the Microsoft Azure platform.

Key Responsibilities:
- Data Migration Support: Assist in migrating structured and semi-structured data from GCP storage systems to Azure Blob Storage, Azure Data Lake, or Synapse. Help validate and reconcile data post-migration to ensure completeness and accuracy (a minimal reconciliation sketch follows this listing).
- ETL/ELT Development: Build and maintain ETL pipelines using Azure Data Factory, Synapse Pipelines, or Microsoft Fabric. Support the development of data transformation logic (SQL/ADF/Dataflows). Ensure data pipelines are efficient, scalable, and meet defined SLAs.
- Data Modeling & Integration: Support the design of data models to enable effective reporting in Power BI. Prepare clean, structured datasets ready for downstream KPI reporting and analytics use cases.
- Testing & Documentation: Conduct unit and integration testing of data pipelines. Maintain documentation of data workflows, metadata, and pipeline configurations.
- Collaboration & Learning: Collaborate with the Data Engineering Lead, BI Developers, and other team members. Stay current with Azure technologies and best practices under the guidance of senior team members.

Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience: 2 to 5 years of hands-on experience in data engineering or analytics engineering roles, with exposure to at least one cloud platform (preferably Microsoft Azure).

Technical Skills Required:
- Experience with SQL and data transformation logic.
- Familiarity with Azure data services like Azure Data Factory, Synapse Analytics, Blob Storage, Data Lake, or Microsoft Fabric.
- Basic knowledge of ETL/ELT concepts and data warehousing principles, and familiarity with Unix shell scripting.
- Familiarity with Power BI datasets or Power Query is a plus.
- Good understanding of data quality and testing practices.
- Exposure to version control systems like Git.

Soft Skills:
- Eagerness to learn and grow under the mentorship of experienced team members.
- Strong analytical and problem-solving skills.
- Ability to work in a collaborative, fast-paced team environment.
- Good written and verbal communication skills.

Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience, and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer: Group Health Insurance covering a family of 4, Term Insurance and Accident Insurance, Paid Holidays & Earned Leaves, Paid Parental Leave, Learning & Career Development, Employee Wellness.

Job Location: Pune, India
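One common way to do the post-migration validation mentioned above is to compare row counts and row-level hashes between source and target extracts. A minimal pandas sketch, assuming both sides can be exported to CSV; the file names and key column are hypothetical:

```python
import pandas as pd

# Hypothetical extracts of the same table from GCP (source) and Azure (target).
source = pd.read_csv("gcs_export_orders.csv")
target = pd.read_csv("adls_export_orders.csv")

# 1. Completeness: row counts should match.
assert len(source) == len(target), f"row count mismatch: {len(source)} vs {len(target)}"

# 2. Accuracy: compare a stable hash of each row, keyed by the primary key.
def row_hashes(df: pd.DataFrame, key: str) -> pd.Series:
    cols = sorted(df.columns)  # sort columns so export order doesn't matter
    return pd.util.hash_pandas_object(
        df.sort_values(key)[cols].reset_index(drop=True), index=False
    )

mismatched = (row_hashes(source, "order_id") != row_hashes(target, "order_id")).sum()
print(f"rows with differing content: {mismatched}")
```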

Posted 1 month ago

Apply

2.0 - 5.0 years

5 - 15 Lacs

Hyderabad

Work from Office

Company Overview: Accordion works at the intersection of sponsors and management teams throughout every stage of the investment lifecycle, providing hands-on, execution-focused support to elevate data and analytics capabilities. So, what does it mean to work at Accordion? It means joining 1,000+ analytics, data science, finance & technology experts in a high-growth, agile, and entrepreneurial environment while transforming how portfolio companies drive value. It also means making your mark on Accordion's future by embracing a culture rooted in collaboration and a firm-wide commitment to building something great, together. Headquartered in New York City with 10 offices worldwide, Accordion invites you to join our journey.

Data & Analytics (Accordion | Data & Analytics): Accordion's Data & Analytics (D&A) team delivers cutting-edge, intelligent solutions to a global clientele, leveraging a blend of domain knowledge, sophisticated technology tools, and deep analytics capabilities to tackle complex business challenges. We partner with Private Equity clients and their Portfolio Companies across diverse sectors, including Retail, CPG, Healthcare, Media & Entertainment, Technology, and Logistics. The D&A team delivers data and analytical solutions designed to streamline reporting capabilities and enhance business insights across vast and complex data sets ranging from Sales, Operations, Marketing, Pricing, Customer Strategies, and more.

Location: Hyderabad, Telangana

Role Overview: Accordion is looking for a Senior Data Engineer with Database/Data Warehouse/Business Intelligence experience. He/she will be responsible for the design, development, configuration/deployment, and maintenance of the above technology stack, and must have an in-depth understanding of the various tools & technologies in the domain to design and implement robust and scalable solutions that address client requirements, current and future, at optimal cost. The Senior Data Engineer should be able to understand various architectures and recommend the right fit depending on the use case of the project. A successful Senior Data Engineer should possess strong working business knowledge and familiarity with multiple tools and techniques, along with industry standards and best practices in Business Intelligence and Data Warehousing environments, plus strong organizational, critical thinking, and communication skills.

What You will do:
- Understand the business requirements thoroughly to design and develop the BI architecture.
- Determine business intelligence and data warehousing solutions that meet business needs.
- Perform data warehouse design and modelling according to established standards.
- Work closely with the business teams to arrive at methodologies to develop KPIs and metrics.
- Work with the Project Manager in developing and executing project plans within the assigned schedule and timeline.
- Develop standard reports and functional dashboards based on business requirements.
- Develop and deliver high-quality reports in a timely and accurate manner.
- Conduct training programs and knowledge transfer sessions for junior developers when needed.
- Recommend improvements to provide optimum reporting solutions.

Ideally, you have:
- Undergraduate degree (B.E/B.Tech.); tier-1/tier-2 colleges preferred.
- 2-5 years of experience in a related field.
- Proven expertise in SSIS, SSAS, and SSRS (MSBI Suite).
- In-depth knowledge of databases (SQL Server, MySQL, Oracle, etc.) and data warehouses (Azure Synapse, AWS Redshift, Google BigQuery, Snowflake, etc.).
- In-depth knowledge of business intelligence tools (any one of Power BI, Tableau, Qlik, DOMO, Looker, etc.).
- Good understanding of Azure (Data Factory & Pipelines, SQL Database & Managed Instances, DevOps, Logic Apps, Analysis Services) and AWS (Glue, Aurora Database, Dynamo Database, Redshift, QuickSight).
- Proven ability to take initiative and be innovative.
- Analytical mind with a problem-solving attitude.

Why Explore a Career at Accordion:
- High growth environment: Semi-annual performance management and promotion cycles, coupled with a strong meritocratic culture, enable a fast track to leadership responsibility.
- Cross-domain exposure: Interesting and challenging work streams across industries and domains that always keep you excited, motivated, and on your toes.
- Entrepreneurial environment: Intellectual freedom to make decisions and own them. We expect you to spread your wings and assume larger responsibilities.
- Fun culture and peer group: Non-bureaucratic and fun working environment; a strong peer environment that will challenge you and accelerate your learning curve.

Other benefits for full-time employees:
- Health and wellness programs that include employee health insurance covering immediate family members and parents, term life insurance for employees, free health camps for employees, discounted health services (including vision, dental) for employees and family members, free doctor consultations, counsellors, etc.
- Corporate meal card options for ease of use and tax benefits.
- Team lunches, company-sponsored team outings, and celebrations.
- Cab reimbursement for women employees beyond a certain time of the day.
- Robust leave policy to support work-life balance, and a specially designed leave structure to support women employees for maternity and related requests.
- Reward and recognition platform to celebrate professional and personal milestones.
- A positive & transparent work environment, including various employee engagement and employee benefit initiatives to support personal and professional learning and development.

Posted 1 month ago

Apply

12.0 - 18.0 years

25 - 40 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities:
- Azure Cloud Services (PaaS & IaaS): Proficient in deploying and managing cloud-based solutions using Azure's Platform-as-a-Service and Infrastructure-as-a-Service offerings.
- Data Engineering & Analytics:
  - Azure Synapse Analytics: Integrated big data and data warehousing capabilities for comprehensive analytics solutions.
  - Azure Data Factory: Developed and orchestrated ETL/ELT pipelines for seamless data movement and transformation.
  - Azure Databricks & PySpark: Engineered scalable data processing workflows and machine learning models.
  - Azure Stream Analytics: Implemented real-time data stream processing for immediate insights.
  - Microsoft Fabric: Utilized AI-powered analytics for unified data access and management.
- Business Intelligence & Reporting:
  - Power BI & SSRS: Designed and developed interactive dashboards and reports for data visualization and decision-making.
  - SQL Server Analysis Services (SSAS): Built OLAP cubes and tabular models for multidimensional data analysis.
- Data Governance & Security: Microsoft Purview: Established comprehensive data governance frameworks to ensure compliance and data integrity.
- DevOps & Automation: Azure DevOps: Implemented CI/CD pipelines and automated deployment processes for efficient software delivery.

Preferred candidate profile - Technical Skills:
- Cloud Computing: Azure Cloud Services (PaaS & IaaS), Active Directory, Application Insights, Azure Stream Analytics, Azure Search, Data Factory, Key Vault, SQL Azure, Azure Analysis Services, Azure Synapse Analytics (DW), Azure Data Lake, PySpark, Microsoft Fabric.
- Database & BI Tools: SQL, T-SQL, SSIS, SSRS, SQL Server Management Studio (SSMS) 2016/2014, SQL Server Job Agent, Import and Export Data, Linked Servers.
- Reporting Tools: SSRS, Power BI reports, Tableau, Excel.

Posted 1 month ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Pune

Work from Office

Position Details:
Role: Data Engineer
Location: Pune, Baner (Work from Office)
Experience: 6+ years
Working Days: Monday to Friday (9:30 AM to 6:30 PM)
Education: Bachelor's or Master's in Computer Science, Engineering, Mathematics, or a related field
Company Website: www.avigna.ai
LinkedIn: Avigna.AI

Key Responsibilities:
- Design and develop robust data pipelines for large-scale data ingestion, transformation, and analytics (see the PySpark sketch after this listing).
- Implement scalable Lakehouse architectures using tools like Microsoft Fabric for structured and semi-structured data.
- Work with Python, PySpark, and Azure services to support data modeling, automation, and predictive insights.
- Develop custom KQL queries and manage data using Power BI, Azure Cosmos DB, or similar tools.
- Collaborate with cross-functional teams to integrate data-driven components with application backends and frontends.
- Ensure secure, efficient, and reliable CI/CD pipelines for automated deployments and data updates.

Skills & Experience Required:
- Strong proficiency in Python, PySpark, and cloud-native data tools
- Experience with Microsoft Azure services (e.g., App Services, Functions, Cosmos DB, Active Directory)
- Hands-on experience with Microsoft Fabric (preferred or good to have)
- Working knowledge of Power BI and building interactive dashboards for business insights
- Familiarity with CI/CD practices for automated deployments
- Exposure to machine learning integration into data workflows (nice to have)
- Strong analytical and problem-solving skills with attention to detail

Good to Have:
- Experience with KQL (Kusto Query Language)
- Background in simulation models or mathematical modeling
- Knowledge of Power Platform integration (Power Pages, Power Apps)

Benefits:
- Competitive salary
- Health insurance coverage
- Professional development opportunities
- Dynamic and collaborative work environment
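A minimal PySpark sketch of the kind of ingest-and-transform pipeline step this listing describes, assuming a Spark environment such as a Fabric notebook; the paths, table, and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-ingest").getOrCreate()

# Hypothetical raw CSV landing zone -> cleaned Delta table in a Lakehouse.
raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("Files/landing/sales/*.csv")  # Fabric Lakehouse-style relative path
)

cleaned = (
    raw.dropDuplicates(["order_id"])          # de-duplicate on the business key
    .withColumn("order_date", F.to_date("order_date"))
    .filter(F.col("amount") > 0)              # drop obviously invalid rows
)

# Delta is the default table format in Fabric Lakehouses.
cleaned.write.format("delta").mode("overwrite").saveAsTable("sales_clean")
```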

Posted 1 month ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Bengaluru

Remote

Key Responsibilities:
- Design and implement data solutions using MS Fabric, including data pipelines, data warehouses, and data lakes
- Lead and mentor a team of data engineers, providing technical guidance and oversight
- Collaborate with stakeholders to understand data requirements and deliver data-driven solutions
- Develop and maintain large-scale data systems, ensuring data quality, integrity, and security
- Troubleshoot data pipeline issues and optimize data workflows for performance and scalability
- Stay up-to-date with MS Fabric features and best practices, applying knowledge to improve data solutions

Requirements:
- 8+ years of experience in data engineering, with expertise in MS Fabric, Azure Data Factory, or similar technologies
- Strong programming skills in languages like Python, SQL, or C#
- Experience with data modeling, data warehousing, and data governance
- Excellent problem-solving skills, with the ability to troubleshoot complex data pipeline issues
- Strong communication and leadership skills, with experience leading teams

Posted 1 month ago

Apply

5.0 - 9.0 years

15 - 25 Lacs

Bengaluru

Hybrid

Position: Data Engineer. Skills Required: Experience in Python/PySpark and strong SQL Server. Good to have: Azure Databricks, Azure Data Factory (ADF), Azure Synapse, or Snowflake.

Posted 1 month ago

Apply

1.0 - 4.0 years

5 - 9 Lacs

Noida, Mohali

Work from Office

- Support the development of internal web applications and tools. - Help build and maintain backend services. - Contribute to frontend development using React.js or Vue.js. - Assist in setting up and managing cloud-based infrastructure.

Posted 1 month ago

Apply

7.0 - 12.0 years

8 - 18 Lacs

Kolkata

Remote

Position: Sr Azure Data Engineer
Location: Remote
Time: CET time

Role & responsibilities: We are seeking a highly skilled Senior Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in Microsoft Azure, Microsoft Fabric, Azure SQL, Azure Synapse, Python, and Power BI; knowledge of Oracle DB and data replication tools is preferred. This role involves designing, developing, and maintaining robust data pipelines and ensuring efficient data processing and integration across various platforms. The candidate understands the stated needs and requirements of stakeholders, produces high-quality deliverables, monitors their own work to ensure delivery within the desired performance standards, and understands the importance of delivering within the expected time, budget, and quality standards, flagging any deviation. Good communication skills and a team player.

- Design and Development: Architect, develop, and maintain scalable data pipelines using Microsoft Fabric and Azure services, including Azure SQL and Azure Synapse.
- Data Integration: Integrate data from multiple sources, ensuring data consistency, quality, and availability using data replication tools (a toy incremental-load sketch follows this listing).
- Data Management: Manage and optimize databases, ensuring high performance and reliability.
- ETL Processes: Develop and maintain ETL processes to transform data into actionable insights.
- Data Analysis: Use Python and other tools to analyze data, create reports, and provide insights to support business decisions.
- Visualization: Develop and maintain dashboards and reports in Power BI to visualize complex data sets.
- Performance Tuning: Optimize database performance and troubleshoot any issues related to data processing and integration.

Preferred candidate profile:
- Minimum 7 years of experience in data engineering or a related field.
- Proven experience with Microsoft Azure services, including Fabric, Azure SQL, and Azure Synapse.
- Strong proficiency in Python for data analysis and scripting.
- Extensive experience with Power BI for data visualization.
- Knowledge of Oracle DB and experience with data replication tools.
- Proficient in SQL and database management.
- Experience with ETL tools and processes.
- Strong understanding of data warehousing concepts and architectures.
- Familiarity with cloud-based data platforms and services.
- Analytical skills: ability to analyze complex data sets and provide actionable insights.
- Problem-solving: strong problem-solving skills and the ability to troubleshoot data-related issues.
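As a toy illustration of the watermark-style incremental load pattern common in the replication work above (not this team's actual design), a Python sketch using pandas and SQLAlchemy; the connection strings, table, and column names are all hypothetical:

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical source (Oracle) and target (Azure SQL) connections.
source = create_engine("oracle+oracledb://user:pwd@src-host/orclpdb")
target = create_engine(
    "mssql+pyodbc://user:pwd@tgt-host/dwh?driver=ODBC+Driver+18+for+SQL+Server"
)

with target.connect() as conn:
    # Watermark: the latest modification timestamp already loaded.
    last_loaded = conn.execute(
        text("SELECT MAX(modified_at) FROM stg_orders")
    ).scalar() or "1900-01-01"

# Pull only rows changed since the last run.
delta = pd.read_sql(
    text("SELECT * FROM orders WHERE modified_at > :wm"),
    source,
    params={"wm": last_loaded},
)

# Light transform, then append to the staging table.
delta["amount"] = delta["amount"].fillna(0)
delta.to_sql("stg_orders", target, if_exists="append", index=False)
```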

Posted 1 month ago

Apply

12.0 - 14.0 years

20 - 30 Lacs

Indore, Hyderabad

Work from Office

Microsoft Fabric Data Engineer
Experience: 12-14 years
Location: Hyderabad/Indore
Notice Period: Immediate
Primary Skill: Microsoft Fabric
Secondary Skill: Azure Data Factory (ADF)

- 12+ years of experience in Microsoft Azure Data Engineering for analytical projects.
- Proven expertise in designing, developing, and deploying high-volume, end-to-end ETL pipelines for complex models, including batch and real-time data integration frameworks, using Azure, Microsoft Fabric, and Databricks.
- Extensive hands-on experience with Azure Data Factory, Databricks (with Unity Catalog), Azure Functions, Synapse Analytics, Data Lake, Delta Lake, and Azure SQL Database for managing and processing large-scale data integrations.
- Experience in Databricks cluster optimization and workflow management to ensure cost-effective and high-performance processing.
- Sound knowledge of data modelling, data governance, data quality management, and data modernization processes.
- Develop architecture blueprints and technical design documentation for Azure-based data solutions.
- Provide technical leadership and guidance on cloud architecture best practices, ensuring scalable and secure solutions.
- Keep abreast of emerging Azure technologies and recommend enhancements to existing systems.
- Lead proofs of concept (PoCs) and adopt agile delivery methodologies for solution development and delivery.

Posted 1 month ago

Apply

2.0 - 5.0 years

8 - 18 Lacs

Pune

Work from Office

Scope of Work:
- Collaborate with the lead Business/Data Analyst to gather and analyse business requirements for data processing and reporting solutions.
- Maintain and run existing Python code, ensuring smooth execution and troubleshooting any issues that arise.
- Develop new features and enhancements for data processing, ingestion, transformation, and report building.
- Implement best coding practices to improve code quality, maintainability, and efficiency.
- Work within Microsoft Fabric to manage data integration, warehousing, and analytics, ensuring optimal performance and reliability.
- Support and maintain CI/CD workflows using Git-based deployments or other automated deployment tools, preferably in Fabric.
- Develop complex business rules and logic in Python to meet functional specifications and reporting needs (a minimal unit-testing sketch follows this listing).
- Participate in an agile development environment, providing feedback, iterating on improvements, and supporting continuous integration and delivery processes.

Requirements:
- This person will be an individual contributor responsible for programming, maintenance support, and troubleshooting tasks related to data movement, processing, ingestion, transformation, and report building.
- Advanced-level Python developer.
- Moderate-level experience working in a Microsoft Fabric environment (at least one, and preferably two or more, client projects in Fabric).
- Well-versed in modelling, databases, data warehousing, data integration, and the technical elements of business intelligence technologies.
- Ability to understand business requirements and translate them into functional specifications for reporting applications.
- Experience with Git-based deployments or other CI/CD workflow options, preferably in Fabric.
- Strong verbal and written communication skills.
- Ability to perform in an agile environment where continual development is prioritized.
- Working experience in the financial industry domain and familiarity with financial accounting terms and statements, such as general ledger, balance sheet, and profit & loss statements, would be a plus.
- Ability to create Power BI dashboards, KPI scorecards, and visual reports would be a plus.
- Degree in Computer Science or Information Systems, along with a good understanding of financial terms or working experience in banking/financial institutions, is preferred.
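To illustrate the kind of Python business-rule code plus unit test this role describes — a generic sketch, not the client's actual rules; the rule and names are hypothetical:

```python
import pytest

def classify_balance(amount: float) -> str:
    """Hypothetical business rule: bucket a general-ledger balance."""
    if amount < 0:
        return "credit"
    if amount == 0:
        return "zero"
    return "debit"

# Parametrized pytest cases pin the rule's behavior at its boundaries.
@pytest.mark.parametrize(
    "amount,expected",
    [(-10.0, "credit"), (0.0, "zero"), (99.5, "debit")],
)
def test_classify_balance(amount, expected):
    assert classify_balance(amount) == expected
```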

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Remote

Role & responsibilities: The Test Lead oversees the testing strategy and execution for the Microsoft Fabric migration and Power BI reporting solutions. This offshore role ensures quality, reliability, and client satisfaction through rigorous validation. The successful candidate will have a strong testing background and coordination skills.

Responsibilities:
- Develop and execute the testing strategy for Microsoft Fabric and Power BI deliverables.
- Validate data migration, pipeline functionality, and report accuracy against requirements.
- Coordinate with the Offshore Project Manager to align testing with development milestones.
- Collaborate with onsite technical leads to validate results and resolve defects.
- Oversee offshore testers, ensuring comprehensive coverage and quality standards.
- Proactively identify risks and articulate solutions to minimize delivery issues.

Skills:
- Bachelor's degree in IT, computer science, or a related field.
- 5+ years of experience in test leadership for data platforms and BI solutions.
- Knowledge of Microsoft Fabric, Power BI, and data migration testing.
- Proficiency with testing tools (e.g., Azure DevOps, Selenium) and SQL.
- Strong communication and stakeholder management skills.
- Detail-oriented with a focus on quality and continuous improvement.

JD for Data Modeler: The Data Modeler designs and implements data models for Microsoft Fabric and Power BI, supporting the migration from Oracle/Informatica. This offshore role ensures optimized data structures for performance and reporting needs. The successful candidate will bring expertise in data modeling and a collaborative approach.

Responsibilities:
- Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions.
- Implement data models for relational, dimensional, and data lake environments on target platforms.
- Collaborate with the Offshore Data Engineer and Onsite Data Modernization Architect to ensure model alignment.
- Define and govern data modeling standards, tools, and best practices.
- Optimize data structures for query performance and scalability.
- Provide updates on modeling progress and dependencies to the Offshore Project Manager.

Skills:
- Bachelor's or Master's degree in computer science, data science, or a related field.
- 5+ years of data modeling experience with relational and NoSQL platforms.
- Proficiency with modeling tools (e.g., Erwin, ER/Studio) and SQL.
- Experience with Microsoft Fabric, data lakes, and BI data structures.
- Strong analytical and communication skills for team collaboration.
- Attention to detail with a focus on performance and consistency.
- Strong management, communication, and presentation skills.

Posted 1 month ago

Apply

3.0 - 6.0 years

0 - 1 Lacs

Chennai, Bengaluru

Hybrid

Job Description: An experienced and skilled BI engineer for designing, developing, and deploying business intelligence solutions using Microsoft Power BI.

Mandatory Skills:
- 3+ years of experience in Power BI
- Strong knowledge of data transformation using Power Query
- Ability to write complex DAX formulas for data aggregation, filtering, ranking, etc.
- Strong knowledge of schema modelling in Power BI
- Thorough knowledge of RLS (row-level security) implementation in Power BI
- Ability to create report mockups/wireframes based on requirements
- Knowledge of the Power BI service and Gateway
- Working experience writing complex queries, and data analysis skills
- Good working experience in UI/UX design of reports and storytelling
- Good communication skills, ability to learn new things, good attitude towards work, and team skills

Good to have:
- Microsoft Fabric experience
- Data engineering skills
- Azure DevOps

Posted 1 month ago

Apply

12.0 - 18.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Microsoft Fabric, Azure Analysis Services, Power BI, Azure SQL, Dimensional Data Modeling

Posted 1 month ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

JOB DESCRIPTION: We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks, including Adobe tag management and pixel management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.

Key Responsibilities:
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking.
- Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows.
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented.
- Implement cryptographic hashing (e.g., SHA-256); see the sketch after this listing.
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems.
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials.
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data.
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties.
- Collaborate with security teams to ensure all data sharing and processing complies with Azure's data security standards and enterprise privacy frameworks.
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools.

EXPERTISE AND QUALIFICATIONS
Required Skills:
- Strong hands-on experience with Fabric and building scalable APIs.
- Experience implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs.
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations.
- Proficiency with Azure cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices.
- Familiarity with OCI for hybrid-cloud integration scenarios.
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256).
- Experience with Adobe tag management, specifically pixel governance and lifecycle.
- Proven ability to collaborate across functions, especially with marketing and analytics teams.

Soft Skills:
- Strong communication skills to explain technical concepts to non-technical stakeholders.
- Proven ability to collaborate across teams, especially with marketing, product, and data analytics.
- Adaptable and proactive in learning and applying evolving technologies and regulatory changes.
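For the SHA-256 point above: server-side tracking APIs such as Meta's Conversions API expect customer identifiers like email addresses to be normalized and then SHA-256 hashed before sending. A minimal Python sketch of the normalize-then-hash step (trim and lowercase are the commonly documented rules for email; verify the exact per-field rules against each platform's docs):

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize an email (trim, lowercase) and return its SHA-256 hex digest."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Both variants normalize to the same digest, so they match server-side.
assert hash_email(" User@Example.COM ") == hash_email("user@example.com")
print(hash_email("user@example.com"))
```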

Posted 1 month ago

Apply

4.0 - 8.0 years

10 - 12 Lacs

Hyderabad

Remote

Role & responsibilities:
- Develop and maintain data platforms using Microsoft Fabric and Databricks
- Design and operate robust, scalable data pipelines for SAP, MS Dynamics, and other cloud/on-premise sources
- Create and manage a cloud-based Enterprise Data Lake for BI solutions
- Translate business and customer needs into data collection and processing workflows
- Support and optimize data science workflows and algorithms
- Administer and monitor the performance of data platforms

Preferred candidate profile:
Required Skills & Experience:
- Experience working with both structured and unstructured data
- Strong programming skills in Python and SQL (Scala is a plus)
- Hands-on experience with Microsoft Fabric (Lakehouse, Data Factory, etc.) and/or Databricks (Spark)
- Proficient in Power BI and working with APIs
- Solid understanding of data security best practices
- Azure knowledge is a plus (Storage, Networking, Billing, Security)

Education & Background:
- Bachelor's or Master's in Business Informatics, Computer Science, or a related field
- 5+ years of professional experience as a Data Engineer, preferably in industrial companies
- Familiarity with Apache Spark and DevOps practices
- Agile project experience and human-centered design are desirable
- Fluent in English with strong analytical and problem-solving skills

Posted 1 month ago

Apply

4.0 - 9.0 years

9 - 19 Lacs

Pune

Work from Office

We are seeking a Data Engineer with strong expertise in Microsoft Fabric and Databricks to support our enterprise data platform initiatives.

Role: Data Engineer - Microsoft Fabric & Databricks
Location: Pune / Remote

Key Responsibilities:
• Develop and maintain scalable data platforms using Microsoft Fabric for BI and Databricks for real-time analytics.
• Build robust data pipelines for SAP, MS Dynamics, and other cloud/on-prem sources.
• Design enterprise-scale Data Lakes and integrate structured/unstructured data.
• Optimize algorithms developed by data scientists and ensure platform reliability.
• Collaborate with data scientists, architects, and business teams in a global environment.
• Perform general administration, security, and monitoring of data platforms.

Mandatory Skills:
• Experience with Microsoft Fabric (Warehouse, Lakehouse, Data Factory, DataFlow Gen2, Semantic Models) and/or Databricks (Apache Spark).
• Strong background in Python, SQL (Scala is a plus), and API integration.
• Hands-on experience with Power BI and various database technologies (RDBMS, OLAP, Time Series).
• Experience working with large datasets, preferably in an industrial or enterprise environment.
• Proven skills in performance tuning, data modeling, data mining, and cloud security (Azure preferred).

Nice to Have:
• Knowledge of Azure data services (Storage, Networking, Billing, Security).
• Experience with DevOps, agile software development, and working in international/multicultural teams.

Candidate Requirements:
• 4+ years of experience as a data engineer.
• Bachelor's or Master's degree in Computer Science, Information Systems, or related fields.
• Strong problem-solving skills and high attention to detail.
• Proficiency in English (written and verbal).

Please share your resume at Neesha1@damcogroup.com

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Support a multi-product SaaS platform using SQL Server (on-prem & Azure). Key duties include monitoring overnight jobs, optimizing T-SQL & SSIS ETL, and resolving performance issues. Must handle backups, tuning, indexing, and high availability setups Required Candidate profile Experience with SQL Server 2016+, Azure SQL, and SSIS is required. Preferred skills: Microsoft Fabric, CDC/Event Hubs, Power BI, DevOps. Strong problem-solving, independent work & team collaboration

Posted 1 month ago

Apply

6.0 - 9.0 years

0 - 3 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid

Preferred candidate profile:
- Experience in typical SDLC phases: requirement gathering, design, build, unit testing, and post-production warranty support.
- Design and implement scalable ETL data pipelines and reporting dashboards using MS Fabric.
- Integrate Azure data workloads in MS Fabric, such as OneLake, Azure Purview, Data Factory, Synapse Analytics, Fabric Power BI, and Key Vault, for efficient data flow and security.
- Develop and orchestrate data pipelines using MS Fabric notebooks for ETL/ELT processes and Fabric Power BI reporting.
- Perform unit testing for data pipelines using MS Fabric notebooks for ETL/ELT processes and Fabric Power BI reporting.
- Employ Spark, SQL, and Python to wrangle, transform, and analyze data for various needs.
- Set up batch jobs to move data from on-premises source systems to Azure data services such as MS Fabric, ADLS, Databricks, and Synapse Analytics.
- Implement version control and continuous integration practices for secure and reliable data processing.
- Collaborate effectively with data scientists, analysts, and stakeholders to understand data needs and translate them into technical solutions.
- Document data pipelines and processes for knowledge sharing and maintainability.
- Good communication skills and team collaboration, managing a team of 5-8 members.

Posted 1 month ago

Apply

6.0 - 11.0 years

16 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Description: Data Architect - Azure with MS Fabric
Location: Pune/Bangalore/Hyderabad
Experience: 6+ years

Job Summary: We're seeking an experienced Data Engineer Lead to architect, design, and implement data solutions using Microsoft Fabric. The successful candidate will lead a team of data engineers, collaborating with stakeholders to deliver scalable, efficient, and reliable data pipelines. Strong technical expertise in MS Fabric, data modeling, and data warehousing is required.

Key Responsibilities:
- Design and implement data solutions using MS Fabric, including data pipelines, data warehouses, and data lakes
- Lead and mentor a team of data engineers, providing technical guidance and oversight
- Collaborate with stakeholders to understand data requirements and deliver data-driven solutions
- Develop and maintain large-scale data systems, ensuring data quality, integrity, and security
- Troubleshoot data pipeline issues and optimize data workflows for performance and scalability
- Stay up-to-date with MS Fabric features and best practices, applying knowledge to improve data solutions

Requirements:
- 5+ years of experience in data engineering, with expertise in MS Fabric, Azure Data Factory, or similar technologies
- Strong programming skills in languages like Python, SQL, or C#
- Experience with data modeling, data warehousing, and data governance
- Excellent problem-solving skills, with the ability to troubleshoot complex data pipeline issues
- Strong communication and leadership skills, with experience leading teams

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Hyderabad

Remote

Hiring for a top MNC for a Data Modeler position (long-term contract, 2+ years).

The Data Modeler designs and implements data models for Microsoft Fabric and Power BI, supporting the migration from Oracle/Informatica. This offshore role ensures optimized data structures for performance and reporting needs. The successful candidate will bring expertise in data modeling and a collaborative approach.

Responsibilities:
- Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions.
- Implement data models for relational, dimensional, and data lake environments on target platforms.
- Collaborate with the Offshore Data Engineer and Onsite Data Modernization Architect to ensure model alignment.
- Define and govern data modeling standards, tools, and best practices.
- Optimize data structures for query performance and scalability.
- Provide updates on modeling progress and dependencies to the Offshore Project Manager.

Skills:
- Bachelor's or Master's degree in computer science, data science, or a related field.
- 5+ years of data modeling experience with relational and NoSQL platforms.
- Proficiency with modeling tools (e.g., Erwin, ER/Studio) and SQL.
- Experience with Microsoft Fabric, data lakes, and BI data structures.
- Strong analytical and communication skills for team collaboration.
- Attention to detail with a focus on performance and consistency.
- Strong management, communication, and presentation skills.

Posted 2 months ago

Apply

5.0 - 10.0 years

6 - 18 Lacs

Bengaluru

Work from Office

We are looking for a skilled and proactive Data Engineer with hands-on experience in Azure Data Services and Microsoft Fabric. In this role, you'll be responsible for building robust, scalable data pipelines and enabling enterprise-grade analytics solutions.

Posted 2 months ago

Apply