12.0 - 18.0 years
25 - 40 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities

Azure Cloud Services (PaaS & IaaS):
- Proficient in deploying and managing cloud-based solutions using Azure's Platform-as-a-Service and Infrastructure-as-a-Service offerings.

Data Engineering & Analytics:
- Azure Synapse Analytics: integrated big data and data warehousing capabilities for comprehensive analytics solutions.
- Azure Data Factory: developed and orchestrated ETL/ELT pipelines for seamless data movement and transformation.
- Azure Databricks & PySpark: engineered scalable data processing workflows and machine learning models.
- Azure Stream Analytics: implemented real-time data stream processing for immediate insights.
- Microsoft Fabric: utilized AI-powered analytics for unified data access and management.

Business Intelligence & Reporting:
- Power BI & SSRS: designed and developed interactive dashboards and reports for data visualization and decision-making.
- SQL Server Analysis Services (SSAS): built OLAP cubes and tabular models for multidimensional data analysis.

Data Governance & Security:
- Microsoft Purview: established comprehensive data governance frameworks to ensure compliance and data integrity.

DevOps & Automation:
- Azure DevOps: implemented CI/CD pipelines and automated deployment processes for efficient software delivery.

Preferred candidate profile

Technical Skills:
- Cloud Computing: Azure Cloud Services (PaaS & IaaS), Active Directory, Application Insights, Azure Stream Analytics, Azure Search, Azure Data Factory, Key Vault, SQL Azure, Azure Analysis Services, Azure Synapse Analytics (DW), Azure Data Lake, PySpark, Microsoft Fabric
- Database & BI Tools: SQL, T-SQL, SSIS, SSRS, SQL Server Management Studio (SSMS) 2016/2014, SQL Server Job Agent, Import and Export Data, Linked Servers
- Reporting Tools: SSRS, Power BI reports, Tableau, Excel
Posted 2 weeks ago
8.0 - 13.0 years
15 - 30 Lacs
Pune
Work from Office
Role & responsibilities

Position Details:
- Role: Data Engineer
- Location: Pune, Baner (Work from Office)
- Experience: 6+ years
- Working Days: Monday to Friday (9:30 AM - 6:30 PM)
- Education: Bachelor's or Master's in Computer Science, Engineering, Mathematics, or a related field
- Company Website: www.avigna.ai
- LinkedIn: Avigna.AI

Key Responsibilities:
- Design and develop robust data pipelines for large-scale data ingestion, transformation, and analytics.
- Implement scalable Lakehouse architectures using tools like Microsoft Fabric for structured and semi-structured data.
- Work with Python, PySpark, and Azure services to support data modeling, automation, and predictive insights.
- Develop custom KQL queries and manage data using Power BI, Azure Cosmos DB, or similar tools.
- Collaborate with cross-functional teams to integrate data-driven components with application backends and frontends.
- Ensure secure, efficient, and reliable CI/CD pipelines for automated deployments and data updates.

Skills & Experience Required:
- Strong proficiency in Python, PySpark, and cloud-native data tools
- Experience with Microsoft Azure services (e.g., App Services, Functions, Cosmos DB, Active Directory)
- Hands-on experience with Microsoft Fabric (preferred)
- Working knowledge of Power BI and building interactive dashboards for business insights
- Familiarity with CI/CD practices for automated deployments
- Exposure to machine learning integration into data workflows (nice to have)
- Strong analytical and problem-solving skills with attention to detail

Good to Have:
- Experience with KQL (Kusto Query Language)
- Background in simulation models or mathematical modeling
- Knowledge of Power Platform integration (Power Pages, Power Apps)

Benefits:
- Competitive salary
- Health insurance coverage
- Professional development opportunities
- Dynamic and collaborative work environment
Posted 2 weeks ago
8.0 - 13.0 years
14 - 24 Lacs
Bengaluru
Remote
Key Responsibilities:
- Design and implement data solutions using MS Fabric, including data pipelines, data warehouses, and data lakes
- Lead and mentor a team of data engineers, providing technical guidance and oversight
- Collaborate with stakeholders to understand data requirements and deliver data-driven solutions
- Develop and maintain large-scale data systems, ensuring data quality, integrity, and security
- Troubleshoot data pipeline issues and optimize data workflows for performance and scalability
- Stay up-to-date with MS Fabric features and best practices, applying knowledge to improve data solutions

Requirements:
- 8+ years of experience in data engineering, with expertise in MS Fabric, Azure Data Factory, or similar technologies
- Strong programming skills in languages such as Python, SQL, or C#
- Experience with data modeling, data warehousing, and data governance
- Excellent problem-solving skills, with the ability to troubleshoot complex data pipeline issues
- Strong communication and leadership skills, with experience leading teams
Posted 2 weeks ago
5.0 - 9.0 years
15 - 25 Lacs
Bengaluru
Hybrid
Position: Data Engineer

Skills Required: Experience in Python/PySpark and strong SQL Server skills.

Good to have: Azure Databricks, Azure Data Factory (ADF), Azure Synapse, or Snowflake.
Posted 2 weeks ago
1.0 - 4.0 years
5 - 9 Lacs
Noida, Mohali
Work from Office
- Support the development of internal web applications and tools. - Help build and maintain backend services. - Contribute to frontend development using React.js or Vue.js. - Assist in setting up and managing cloud-based infrastructure.
Posted 2 weeks ago
7.0 - 12.0 years
8 - 18 Lacs
Kolkata
Remote
Position: Sr. Azure Data Engineer
Location: Remote
Time zone: CET

Role & responsibilities

We are seeking a highly skilled Senior Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in Microsoft Azure, Microsoft Fabric, Azure SQL, Azure Synapse, Python, and Power BI; knowledge of Oracle DB and data replication tools is preferred. This role involves designing, developing, and maintaining robust data pipelines and ensuring efficient data processing and integration across various platforms. The candidate understands the stated needs and requirements of stakeholders, produces high-quality deliverables, monitors their own work to ensure delivery within the desired performance standards, and shows concern when delivery deviates from the expected time, budget, or quality standards. Good communication skills and a team player.

- Design and Development: Architect, develop, and maintain scalable data pipelines using Microsoft Fabric and Azure services, including Azure SQL and Azure Synapse.
- Data Integration: Integrate data from multiple sources, ensuring data consistency, quality, and availability using data replication tools.
- Data Management: Manage and optimize databases, ensuring high performance and reliability.
- ETL Processes: Develop and maintain ETL processes to transform data into actionable insights.
- Data Analysis: Use Python and other tools to analyze data, create reports, and provide insights to support business decisions.
- Visualization: Develop and maintain dashboards and reports in Power BI to visualize complex data sets.
- Performance Tuning: Optimize database performance and troubleshoot issues related to data processing and integration.

Preferred candidate profile
- Minimum 7 years of experience in data engineering or a related field.
- Proven experience with Microsoft Azure services, including Microsoft Fabric, Azure SQL, and Azure Synapse.
- Strong proficiency in Python for data analysis and scripting.
- Extensive experience with Power BI for data visualization.
- Knowledge of Oracle DB and experience with data replication tools.
- Proficient in SQL and database management.
- Experience with ETL tools and processes.
- Strong understanding of data warehousing concepts and architectures.
- Familiarity with cloud-based data platforms and services.
- Analytical skills: ability to analyze complex data sets and provide actionable insights.
- Problem-solving: strong problem-solving skills and the ability to troubleshoot data-related issues.
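The ETL responsibilities described above (ingest, apply a quality gate, derive fields) can be sketched in plain Python; the function name, sample schema, and threshold below are hypothetical, and a production pipeline would run this kind of logic inside Azure Data Factory, Synapse, or Fabric rather than as a standalone script:

```python
import csv
import io

def transform_orders(raw_csv: str) -> list[dict]:
    """Toy ETL transform: parse raw CSV, drop rows failing a quality check,
    and derive a reporting column. Schema and rules are illustrative only."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    out = []
    for row in rows:
        # Basic data-quality gate: skip rows with a missing amount
        if not row["amount"]:
            continue
        amount = float(row["amount"])
        out.append({
            "order_id": row["order_id"],
            "amount": amount,
            # Derived column used downstream by reports (threshold is made up)
            "tier": "high" if amount >= 1000 else "standard",
        })
    return out

raw = "order_id,amount\nA1,1500\nA2,\nA3,250\n"
print(transform_orders(raw))  # A2 is dropped; A1 is "high", A3 is "standard"
```

The same shape (read, validate, derive, write) recurs whether the sink is Azure SQL, a Lakehouse table, or a Power BI dataset.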
Posted 2 weeks ago
12.0 - 14.0 years
20 - 30 Lacs
Indore, Hyderabad
Work from Office
Microsoft Fabric Data Engineer
- Experience: 12-14 years
- Location: Hyderabad / Indore
- Notice Period: Immediate
- Primary Skill: Microsoft Fabric
- Secondary Skill: Azure Data Factory (ADF)

Requirements:
- 12+ years of experience in Microsoft Azure data engineering for analytical projects.
- Proven expertise in designing, developing, and deploying high-volume, end-to-end ETL pipelines for complex models, including batch and real-time data integration frameworks using Azure, Microsoft Fabric, and Databricks.
- Extensive hands-on experience with Azure Data Factory, Databricks (with Unity Catalog), Azure Functions, Synapse Analytics, Data Lake, Delta Lake, and Azure SQL Database for managing and processing large-scale data integrations.
- Experience in Databricks cluster optimization and workflow management to ensure cost-effective and high-performance processing.
- Sound knowledge of data modelling, data governance, data quality management, and data modernization processes.
- Develop architecture blueprints and technical design documentation for Azure-based data solutions.
- Provide technical leadership and guidance on cloud architecture best practices, ensuring scalable and secure solutions.
- Keep abreast of emerging Azure technologies and recommend enhancements to existing systems.
- Lead proofs of concept (PoCs) and adopt agile delivery methodologies for solution development and delivery.

www.yash.com
Posted 2 weeks ago
2.0 - 5.0 years
8 - 18 Lacs
Pune
Work from Office
Scope of Work:
- Collaborate with the lead Business/Data Analyst to gather and analyse business requirements for data processing and reporting solutions.
- Maintain and run existing Python code, ensuring smooth execution and troubleshooting any issues that arise.
- Develop new features and enhancements for data processing, ingestion, transformation, and report building.
- Implement best coding practices to improve code quality, maintainability, and efficiency.
- Work within Microsoft Fabric to manage data integration, warehousing, and analytics, ensuring optimal performance and reliability.
- Support and maintain CI/CD workflows using Git-based deployments or other automated deployment tools, preferably in Fabric.
- Develop complex business rules and logic in Python to meet functional specifications and reporting needs.
- Participate in an agile development environment, providing feedback, iterating on improvements, and supporting continuous integration and delivery processes.

Requirements:
- This person will be an individual contributor responsible for programming, maintenance support, and troubleshooting tasks related to data movement, processing, ingestion, transformation, and report building.
- Advanced-level Python developer.
- Moderate experience working in a Microsoft Fabric environment (at least one, and preferably two or more, client projects in Fabric).
- Well-versed in modelling, databases, data warehousing, data integration, and the technical elements of business intelligence technologies.
- Ability to understand business requirements and translate them into functional specifications for reporting applications.
- Experience with Git-based deployments or other CI/CD workflow options, preferably in Fabric.
- Strong verbal and written communication skills.
- Ability to perform in an agile environment where continual development is prioritized.
- Working experience in the financial industry domain and familiarity with financial accounting terms and statements (general ledger, balance sheet, and profit & loss) would be a plus.
- Ability to create Power BI dashboards, KPI scorecards, and visual reports would be a plus.
- A degree in Computer Science or Information Systems, along with a good understanding of financial terms or working experience in banking/financial institutions, is preferred.
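"Business rules in Python" for financial reporting often reduce to mapping raw records onto accounting categories. A minimal sketch, assuming a hypothetical chart of accounts (the code ranges below are invented for illustration, not a real standard):

```python
def classify_account(account_code: int) -> str:
    """Map a general-ledger account code to a statement section.
    The numeric ranges are illustrative only; every organization
    defines its own chart of accounts."""
    if 1000 <= account_code < 2000:
        return "assets"        # balance sheet
    if 2000 <= account_code < 3000:
        return "liabilities"   # balance sheet
    if 4000 <= account_code < 5000:
        return "revenue"       # profit & loss
    if 5000 <= account_code < 6000:
        return "expenses"      # profit & loss
    return "unclassified"      # flagged for manual review

print(classify_account(1200))  # assets
print(classify_account(4500))  # revenue
```

Keeping rules like this in small, testable functions (rather than buried in notebook cells) is what "best coding practices" in the scope above typically means in practice.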
Posted 3 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Remote
Role & responsibilities

The Test Lead oversees the testing strategy and execution for the Microsoft Fabric migration and Power BI reporting solutions. This offshore role ensures quality, reliability, and client satisfaction through rigorous validation. The successful candidate will have a strong testing background and coordination skills.

Responsibilities
- Develop and execute the testing strategy for Microsoft Fabric and Power BI deliverables.
- Validate data migration, pipeline functionality, and report accuracy against requirements.
- Coordinate with the Offshore Project Manager to align testing with development milestones.
- Collaborate with onsite technical leads to validate results and resolve defects.
- Oversee offshore testers, ensuring comprehensive coverage and quality standards.
- Proactively identify risks and articulate solutions to minimize delivery issues.

Skills
- Bachelor's degree in IT, computer science, or a related field.
- 5+ years of experience in test leadership for data platforms and BI solutions.
- Knowledge of Microsoft Fabric, Power BI, and data migration testing.
- Proficiency with testing tools (e.g., Azure DevOps, Selenium) and SQL.
- Strong communication and stakeholder management skills.
- Detail-oriented with a focus on quality and continuous improvement.

JD for Data Modeler

The Data Modeler designs and implements data models for Microsoft Fabric and Power BI, supporting the migration from Oracle/Informatica. This offshore role ensures optimized data structures for performance and reporting needs. The successful candidate will bring expertise in data modeling and a collaborative approach.

Responsibilities
- Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions.
- Implement data models for relational, dimensional, and data lake environments on target platforms.
- Collaborate with the Offshore Data Engineer and Onsite Data Modernization Architect to ensure model alignment.
- Define and govern data modeling standards, tools, and best practices.
- Optimize data structures for query performance and scalability.
- Provide updates on modeling progress and dependencies to the Offshore Project Manager.

Skills
- Bachelor's or Master's degree in computer science, data science, or a related field.
- 5+ years of data modeling experience with relational and NoSQL platforms.
- Proficiency with modeling tools (e.g., Erwin, ER/Studio) and SQL.
- Experience with Microsoft Fabric, data lakes, and BI data structures.
- Strong analytical and communication skills for team collaboration.
- Attention to detail with a focus on performance and consistency.
Posted 3 weeks ago
3.0 - 6.0 years
0 - 1 Lacs
Chennai, Bengaluru
Hybrid
Job Description: An experienced and skilled BI engineer for designing, developing, and deploying business intelligence solutions using Microsoft Power BI.

Mandatory Skills
- 3+ years of experience in Power BI
- Strong knowledge of data transformation using Power Query
- Ability to write complex DAX formulas for data aggregation, filtering, ranking, etc.
- Strong knowledge of schema modelling in Power BI
- Thorough knowledge of RLS (row-level security) implementation in Power BI
- Ability to create report mockups/wireframes based on requirements
- Knowledge of Power BI service and Gateway
- Working experience writing complex queries, plus data analysis skills
- Good working experience in UI/UX design of reports and storytelling
- Good communication skills, ability to learn new things, a good attitude towards work, and team skills

Good to have skills:
- Microsoft Fabric experience
- Data engineering skills
- Azure DevOps
Posted 3 weeks ago
12.0 - 18.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Microsoft Fabric, Azure Analysis Services, PowerBI, Azure SQL, Dimensional Data Modeling
Posted 3 weeks ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
JOB DESCRIPTION

We are looking for a highly skilled API & Pixel Tracking Integration Engineer to lead the development and deployment of server-side tracking and attribution solutions across multiple platforms. The ideal candidate brings deep expertise in CAPI integrations (Meta, Google, and other platforms), secure data handling using cryptographic techniques, and experience working within privacy-first environments like Azure Clean Rooms. This role requires strong hands-on experience in C# development, Azure cloud services, OCI (Oracle Cloud Infrastructure), and marketing technology stacks, including Adobe Tag Management and Pixel Management. You will work closely with engineering, analytics, and marketing teams to deliver scalable, compliant, and secure data tracking solutions that drive business insights and performance.

Key Responsibilities:
- Design, implement, and maintain CAPI integrations across Meta, Google, and all major platforms, ensuring real-time and accurate server-side event tracking.
- Utilize Fabric and OCI environments as needed for data integration and marketing intelligence workflows.
- Develop and manage custom tracking solutions leveraging Azure Clean Rooms, ensuring user NFAs are respected and privacy-compliant logic is implemented.
- Implement cryptographic hashing (e.g., SHA-256).
- Use Azure Data Lake Gen1 & Gen2 (ADLS), Cosmos DB, and Azure Functions to build and host scalable backend systems.
- Integrate with Azure Key Vault to securely manage secrets and sensitive credentials.
- Design and execute data pipelines in Azure Data Factory (ADF) for processing and transforming tracking data.
- Lead pixel and tag management initiatives using Adobe Tag Manager, including pixel governance and QA across properties.
- Collaborate with security teams to ensure all data sharing and processing complies with Azure's data security standards and enterprise privacy frameworks.
- Monitor, troubleshoot, and optimize existing integrations using logs, diagnostics, and analytics tools.

EXPERTISE AND QUALIFICATIONS

Required Skills:
- Strong hands-on experience with Fabric and building scalable APIs.
- Experience implementing Meta CAPI, Google Enhanced Conversions, and other platform-specific server-side tracking APIs.
- Knowledge of Azure Clean Rooms, with experience developing custom logic and code for clean data collaborations.
- Proficiency with Azure cloud technologies, especially Cosmos DB, Azure Functions, ADF, Key Vault, ADLS, and Azure security best practices.
- Familiarity with OCI for hybrid-cloud integration scenarios.
- Understanding of cryptography and secure data handling (e.g., hashing email addresses with SHA-256).
- Experience with Adobe Tag Management, specifically pixel governance and lifecycle.
- Proven ability to collaborate across functions, especially with marketing and analytics teams.

Soft Skills:
- Strong communication skills to explain technical concepts to non-technical stakeholders.
- Proven ability to collaborate across teams, especially with marketing, product, and data analytics.
- Adaptable and proactive in learning and applying evolving technologies and regulatory changes.
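The SHA-256 email hashing mentioned in the qualifications can be sketched briefly. This posting's stack is C#, but the logic is language-agnostic; shown here in Python with the stdlib. The normalization (trim, lowercase) is the common pattern server-side tracking platforms expect before hashing, though exact rules vary by platform and should be checked against each platform's documentation:

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize an email address, then return its SHA-256 hex digest.
    Used so two parties can match users without exchanging raw emails."""
    normalized = email.strip().lower()  # whitespace and case must not change the hash
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same user yields the same digest regardless of formatting
print(hash_email("  User@Example.COM "))
print(hash_email("user@example.com"))
```

Because hashing is deterministic, both sides of a clean-room or CAPI match must apply identical normalization, which is why it lives in one shared helper rather than being repeated inline.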
Posted 3 weeks ago
4.0 - 8.0 years
10 - 12 Lacs
Hyderabad
Remote
Role & responsibilities:
- Develop and maintain data platforms using Microsoft Fabric and Databricks
- Design and operate robust, scalable data pipelines for SAP, MS Dynamics, and other cloud/on-premise sources
- Create and manage a cloud-based Enterprise Data Lake for BI solutions
- Translate business and customer needs into data collection and processing workflows
- Support and optimize data science workflows and algorithms
- Administer and monitor the performance of data platforms

Preferred candidate profile:

Required Skills & Experience:
- Experience working with both structured and unstructured data
- Strong programming skills in Python and SQL (Scala is a plus)
- Hands-on experience with Microsoft Fabric (Lakehouse, Data Factory, etc.) and/or Databricks (Spark)
- Proficient in Power BI and working with APIs
- Solid understanding of data security best practices
- Azure knowledge is a plus (Storage, Networking, Billing, Security)

Education & Background:
- Bachelor's or Master's in Business Informatics, Computer Science, or a related field
- 5+ years of professional experience as a Data Engineer, preferably in industrial companies
- Familiarity with Apache Spark and DevOps practices
- Agile project experience and human-centered design are desirable
- Fluent in English with strong analytical and problem-solving skills
Posted 3 weeks ago
4.0 - 9.0 years
9 - 19 Lacs
Pune
Work from Office
We are seeking a Data Engineer with strong expertise in Microsoft Fabric and Databricks to support our enterprise data platform initiatives.

Role: Data Engineer, Microsoft Fabric & Databricks
Location: Pune / Remote

Key Responsibilities:
- Develop and maintain scalable data platforms using Microsoft Fabric for BI and Databricks for real-time analytics.
- Build robust data pipelines for SAP, MS Dynamics, and other cloud/on-prem sources.
- Design enterprise-scale Data Lakes and integrate structured/unstructured data.
- Optimize algorithms developed by data scientists and ensure platform reliability.
- Collaborate with data scientists, architects, and business teams in a global environment.
- Perform general administration, security, and monitoring of data platforms.

Mandatory Skills:
- Experience with Microsoft Fabric (Warehouse, Lakehouse, Data Factory, DataFlow Gen2, Semantic Models) and/or Databricks (Apache Spark).
- Strong background in Python, SQL (Scala is a plus), and API integration.
- Hands-on experience with Power BI and various database technologies (RDBMS, OLAP, Time Series).
- Experience working with large datasets, preferably in an industrial or enterprise environment.
- Proven skills in performance tuning, data modeling, data mining, and cloud security (Azure preferred).

Nice to Have:
- Knowledge of Azure data services (Storage, Networking, Billing, Security).
- Experience with DevOps, agile software development, and working in international/multicultural teams.

Candidate Requirements:
- 4+ years of experience as a data engineer.
- Bachelor's or Master's degree in Computer Science, Information Systems, or related fields.
- Strong problem-solving skills and high attention to detail.
- Proficiency in English (written and verbal).

Please share your resume at Neesha1@damcogroup.com
Posted 4 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Support a multi-product SaaS platform using SQL Server (on-prem & Azure). Key duties include monitoring overnight jobs, optimizing T-SQL & SSIS ETL, and resolving performance issues. Must handle backups, tuning, indexing, and high availability setups Required Candidate profile Experience with SQL Server 2016+, Azure SQL, and SSIS is required. Preferred skills: Microsoft Fabric, CDC/Event Hubs, Power BI, DevOps. Strong problem-solving, independent work & team collaboration
Posted 4 weeks ago
6.0 - 9.0 years
0 - 3 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Hybrid
Preferred candidate profile
- Experience in typical SDLC phases: requirement gathering, design, build, unit testing, and post-production warranty support
- Design and implement scalable ETL data pipelines and reporting dashboards using MS Fabric
- Integrate Azure data workloads in MS Fabric (OneLake, Azure Purview, Data Factory, Synapse Analytics, Fabric Power BI, and Key Vault) for efficient data flow and security
- Develop and orchestrate data pipelines using MS Fabric notebooks for ETL/ELT processes and Fabric Power BI reporting
- Perform unit testing for data pipelines built with MS Fabric notebooks and for Fabric Power BI reporting
- Employ Spark, SQL, and Python to wrangle, transform, and analyze data for various needs
- Set up batch jobs to move data from on-premises source systems to Azure data services such as MS Fabric, Azure Data Lake, Databricks, and Synapse Analytics
- Implement version control and continuous integration practices for secure and reliable data processing
- Collaborate effectively with data scientists, analysts, and stakeholders to understand data needs and translate them into technical solutions
- Document data pipelines and processes for knowledge sharing and maintainability
- Good communication skills
- Team collaboration; manage a team of 5-8 members
Posted 4 weeks ago
6.0 - 11.0 years
16 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Job Description: Data Architect - Azure with MS Fabric
Location: Pune/Bangalore/Hyderabad
Experience: 6+ Years

Job Summary: We're seeking an experienced Data Engineer Lead to architect, design, and implement data solutions using Microsoft Fabric. The successful candidate will lead a team of data engineers, collaborating with stakeholders to deliver scalable, efficient, and reliable data pipelines. Strong technical expertise in MS Fabric, data modeling, and data warehousing is required.

Key Responsibilities:
- Design and implement data solutions using MS Fabric, including data pipelines, data warehouses, and data lakes
- Lead and mentor a team of data engineers, providing technical guidance and oversight
- Collaborate with stakeholders to understand data requirements and deliver data-driven solutions
- Develop and maintain large-scale data systems, ensuring data quality, integrity, and security
- Troubleshoot data pipeline issues and optimize data workflows for performance and scalability
- Stay up-to-date with MS Fabric features and best practices, applying knowledge to improve data solutions

Requirements:
- 5+ years of experience in data engineering, with expertise in MS Fabric, Azure Data Factory, or similar technologies
- Strong programming skills in languages such as Python, SQL, or C#
- Experience with data modeling, data warehousing, and data governance
- Excellent problem-solving skills, with the ability to troubleshoot complex data pipeline issues
- Strong communication and leadership skills, with experience leading teams
Posted 1 month ago
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad
Remote
Hiring for a top MNC for a Data Modeler position (long-term contract, 2+ years).

The Data Modeler designs and implements data models for Microsoft Fabric and Power BI, supporting the migration from Oracle/Informatica. This offshore role ensures optimized data structures for performance and reporting needs. The successful candidate will bring expertise in data modeling and a collaborative approach.

Responsibilities
- Develop conceptual, logical, and physical data models for Microsoft Fabric and Power BI solutions.
- Implement data models for relational, dimensional, and data lake environments on target platforms.
- Collaborate with the Offshore Data Engineer and Onsite Data Modernization Architect to ensure model alignment.
- Define and govern data modeling standards, tools, and best practices.
- Optimize data structures for query performance and scalability.
- Provide updates on modeling progress and dependencies to the Offshore Project Manager.

Skills
- Bachelor's or Master's degree in computer science, data science, or a related field.
- 5+ years of data modeling experience with relational and NoSQL platforms.
- Proficiency with modeling tools (e.g., Erwin, ER/Studio) and SQL.
- Experience with Microsoft Fabric, data lakes, and BI data structures.
- Strong analytical and communication skills for team collaboration.
- Attention to detail with a focus on performance and consistency.
Posted 1 month ago
5.0 - 10.0 years
6 - 18 Lacs
Bengaluru
Work from Office
We are looking for a skilled and proactive Data Engineer with hands-on experience in Azure Data Services and Microsoft Fabric. In this role, you'll be responsible for building robust, scalable data pipelines and enabling enterprise-grade analytics solutions.
Posted 1 month ago
10.0 - 16.0 years
27 - 37 Lacs
Hyderabad
Work from Office
Data Architect: Microsoft Fabric, Snowflake & Modern Data Platforms
Location: Hyderabad
Employment Type: Full-Time

Position Overview: We are seeking a seasoned Data Architect with strong consulting experience to lead the design and delivery of modern data solutions across global clients. This role emphasizes hands-on architecture and engineering using Microsoft Fabric and Snowflake, while also contributing to internal capability development and practice growth. The ideal candidate will bring deep expertise in data modeling, modern data architecture, and data engineering, with a passion for innovation and client impact.

Key Responsibilities:

Client Delivery & Architecture (75%)
- Serve as the lead architect for client engagements, designing scalable, secure, and high-performance data solutions using Microsoft Fabric and Snowflake.
- Apply modern data architecture principles, including data lakehouse, ELT/ETL pipelines, and real-time streaming.
- Collaborate with cross-functional teams (data engineers, analysts, architects) to deliver end-to-end solutions.
- Translate business requirements into technical strategies with measurable outcomes.
- Ensure best practices in data governance, quality, and security are embedded in all solutions.
- Deliver scalable data modeling solutions for various use cases leveraging a modern data platform.

Practice & Capability Development (25%)
- Contribute to the development of reusable assets, accelerators, and reference architectures.
- Support internal knowledge sharing and mentoring across the India-based consulting team.
- Stay current with emerging trends in data platforms, AI/ML integration, and cloud-native architectures.
- Collaborate with global teams to align on delivery standards and innovation initiatives.

Qualifications:
- 10+ years of experience in data architecture and engineering, preferably in a consulting environment.
- Proven experience with Microsoft Fabric and Snowflake platforms.
- Strong skills in data modeling, data pipeline development, and performance optimization.
- Familiarity with Azure Synapse, Azure Data Factory, Power BI, and related Azure services.
- Excellent communication and stakeholder management skills.
- Experience working with global delivery teams and agile methodologies.

Preferred Certifications:
- SnowPro Core Certification (preferred but not required)
- Microsoft Certified: Fabric Analytics Engineer Associate
- Microsoft Certified: Azure Solutions Architect Expert
Posted 1 month ago
12.0 - 22.0 years
40 - 60 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Job Description: Data Architect - Azure with MS Fabric
Location: Pune/Bangalore/Hyderabad
Experience: 12+ Years

Role Overview:
As a Data Architect specializing in Azure with MS Fabric, you will play a pivotal role in designing and implementing robust data solutions that leverage Microsoft Fabric for cloud-based data management and analytics. Your expertise will guide clients through the complexities of data architecture, ensuring seamless integration with existing systems and optimized data workflows. You will lead projects from inception to completion, providing strategic insight and technical leadership throughout.

Required Skills and Qualifications:
Experience: 14+ years in Data and Analytics, with a minimum of 7-8 years focused on Azure and at least 2 implementations using Microsoft Fabric.
Data Architecture Expertise: Proven experience as a Data Architect, particularly in consulting and solution design, with a strong background in cloud data stacks.
Technical Proficiency: Extensive knowledge of data modeling, database design, ETL processes, and data governance principles.
MS Fabric: Hands-on experience with Microsoft Fabric, including data integration, data pipelines, and analytics capabilities.
SQL Skills: Advanced SQL knowledge, with experience writing complex queries, performance tuning, and troubleshooting.
Programming Skills: Proficiency in programming languages such as Java, Python, or Scala for building data pipelines.
Methodologies: Familiarity with Agile, Scrum, and other project delivery methodologies.
Stakeholder Management: Strong experience managing both internal and external stakeholders effectively.
Certifications: Relevant certifications in Azure and Microsoft Fabric are an advantage.

Key Responsibilities:
Leadership & Strategy
Lead the design and implementation of end-to-end solutions using Microsoft Fabric.
Collaborate with business and technical stakeholders to define data strategies.
Act as the primary point of contact for all Fabric-related projects and initiatives.
Provide mentorship and guidance to junior data engineers, BI developers, and analysts.
Architecture & Development
Design and manage Lakehouses, Data Warehouses, and Pipelines within Microsoft Fabric.
Build scalable data models and visualizations using Power BI (with Fabric integration).
Develop and maintain Dataflows, Notebooks, Spark Jobs, and Synapse Pipelines.
Implement best practices in data governance, security, and compliance using Fabric's tools.
Project Execution
Lead cross-functional teams for successful project delivery.
Ensure alignment of architecture with business KPIs and OKRs.
Drive adoption of Fabric across business units.
Perform code reviews and architectural assessments.
Monitoring & Optimization
Monitor data pipeline performance, troubleshoot issues, and tune performance.
Ensure data quality, availability, and lineage using Microsoft Purview (or native Fabric tooling).
Maintain documentation of data models, architecture, and workflows.
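The data-quality gate described under Monitoring & Optimization can be sketched in plain Python. This is an illustrative, framework-agnostic example, not a Microsoft Fabric API: the function name `validate_batch` and the null-ratio threshold are assumptions for the sketch.

```python
# Minimal sketch of the kind of data-quality check a pipeline might run
# before publishing a table. All names (validate_batch, required_cols,
# max_null_ratio) are illustrative assumptions, not a real Fabric API.

def validate_batch(rows, required_cols, max_null_ratio=0.05):
    """Return (ok, issues) for a batch of dict-shaped records."""
    issues = []
    if not rows:
        return False, ["empty batch"]
    for col in required_cols:
        nulls = sum(1 for r in rows if r.get(col) is None)
        ratio = nulls / len(rows)
        if ratio > max_null_ratio:
            issues.append(f"{col}: {ratio:.0%} nulls exceeds {max_null_ratio:.0%}")
    return (not issues), issues

# One of three rows is missing 'amount' (33% nulls > 5% threshold),
# so the batch is rejected with a diagnostic message.
batch = [{"id": 1, "amount": 10.0},
         {"id": 2, "amount": None},
         {"id": 3, "amount": 7.5}]
ok, issues = validate_batch(batch, ["id", "amount"])
```

In a real deployment, a failed check would typically fail the pipeline run and surface the issue list in monitoring, rather than silently loading bad data.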
Posted 1 month ago
5.0 - 8.0 years
12 - 13 Lacs
Pune
Work from Office
Job Title: Data Engineer - Microsoft Fabric & Databricks
Location: Pune / Remote
Experience: 5+ Years
Job Type: Full-Time
We are looking for a skilled Data Engineer with expertise in Microsoft Fabric and Databricks to support
Posted 1 month ago
2.0 - 5.0 years
3 - 8 Lacs
Bengaluru
Work from Office
Job Title: Power BI Developer
Experience: 2-3 Years
Location: Bangalore - Indiranagar (Work from Office Only)
Employment Type: Full-Time

Job Description:
We are looking for a Power BI Developer with 2-3 years of hands-on experience in designing and developing BI reports and dashboards using Power BI. Candidates with experience in Microsoft Fabric will be given preference. Strong communication skills are essential, as the role involves close collaboration with cross-functional teams.

Key Responsibilities:
Develop, design, and maintain interactive dashboards and reports in Power BI.
Work closely with stakeholders to gather requirements and translate them into effective data visualizations.
Optimize data models for performance and usability.
Implement row-level security and data governance best practices.
Stay updated with Power BI and MS Fabric capabilities and best practices.

Requirements:
2-3 years of hands-on Power BI development experience.
Familiarity with Power Query, DAX, and data modeling techniques.
Experience in Microsoft Fabric is a plus.
Strong analytical and problem-solving skills.
Excellent verbal and written communication skills.

Interested candidates, kindly share your CV and the details below to usha.sundar@adecco.com
1) Present CTC (Fixed + VP)
2) Expected CTC
3) No. of years of experience
4) Notice Period
5) Offer in hand
6) Reason for change
7) Present location
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
Create Solution Outlines and Macro Designs describing end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles.
Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation.
Contribute to reusable component / asset / accelerator development to support capability building.
Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies.
Participate in customer PoCs to deliver the agreed outcomes.
Participate in delivery and product reviews and quality assurance, and act as design authority.

Required education
Bachelor's Degree
Preferred education
Master's Degree

Required technical and professional expertise
Experience in designing data products that provide descriptive, prescriptive, and predictive analytics to end users or other systems.
Experience in data engineering and architecting data platforms.
Experience architecting and implementing data platforms on the Azure Cloud Platform; Azure experience is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), along with Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow.
Experience with the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) on Cloudera or Hortonworks.

Preferred technical and professional experience
Experience architecting complex data platforms on the Azure Cloud Platform and on-prem.
Experience and exposure to implementing Data Fabric and Data Mesh concepts with solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric.
Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake data glossary, etc.
Posted 1 month ago
10.0 - 14.0 years
19 - 34 Lacs
Pune
Work from Office
Role & responsibilities
Collaborate with cross-functional teams to understand data requirements.
Utilize Fabric Lakehouse for data storage and processing.
Design and implement data models and databases using MS Fabric and Azure services.
Develop ETL processes using SSIS, Azure Synapse Pipelines, etc.
Ensure data quality and governance.
Optimize data pipelines for performance and scalability.

Requirements:
Proficiency in BI tools such as Power BI and Tableau.
Experience with data integration tools such as Azure Data Factory.
Expertise in Microsoft Fabric and the Azure Cloud Platform.
Strong problem-solving and communication skills.
Ability to work independently and in a team.
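The ETL responsibilities above can be illustrated with a minimal, pure-Python extract-transform-load sketch. The source rows, column names, and the INR-to-lakh conversion are hypothetical; a production pipeline would run in SSIS or an Azure Synapse / Data Factory pipeline rather than a standalone script.

```python
# Minimal ETL sketch. All data, column names, and the lakh conversion
# (1 lakh = 100,000) are hypothetical stand-ins, not a real pipeline.

def extract():
    # Stand-in for reading from a source system or Lakehouse table.
    return [
        {"order_id": "A1", "amount_inr": 1_250_000, "region": "pune"},
        {"order_id": "A2", "amount_inr": 980_000, "region": "mumbai"},
    ]

def transform(rows):
    # Normalize region casing and derive the amount in lakhs.
    return [
        {
            "order_id": r["order_id"],
            "region": r["region"].title(),
            "amount_lakhs": round(r["amount_inr"] / 100_000, 2),
        }
        for r in rows
    ]

def load(rows, sink):
    # Stand-in for writing to a warehouse table; here, an in-memory list.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

Keeping each stage a pure function like this makes the transform step independently testable, which is the property tools such as Synapse Pipelines encourage by separating ingestion, transformation, and sink activities.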
Posted 1 month ago