
70 Microsoft Fabric Jobs - Page 3

Set up a job alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

0.0 years

0 Lacs

India

On-site

Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky's-the-limit thinking in a cloud-enabled world. Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the Microsoft Fabric platform team builds and maintains the operating system and provides customers a unified data stack to run an entire data estate. The platform delivers a unified experience and unified governance, and enables a unified business model and a unified architecture. The Microsoft Fabric Platform - Capacity Analytics team is hiring a Software Engineer II to develop UX experiences for Capacity Analytics. We do not just value differences or different perspectives; we seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Req ID: 326727. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Microsoft Fabric Specialist to join our team in Hyderabad, Telangana (IN-TG), India (IN). Job Description: We are seeking a Mid-Level Microsoft Fabric Support Specialist to join our IT team. The ideal candidate will be responsible for providing technical support, troubleshooting, and ensuring the smooth operation of Microsoft Fabric services. This role requires a deep understanding of Microsoft Fabric, data integration, and analytics solutions, along with strong problem-solving skills. Key Responsibilities: - Provide technical support and troubleshooting for Microsoft Fabric services. - Assist in the implementation, configuration, and maintenance of Microsoft Fabric environments. - Monitor system performance and resolve issues proactively. - Collaborate with cross-functional teams to optimize data workflows and analytics solutions. - Document support procedures, best practices, and troubleshooting steps. - Assist in user training and onboarding for Microsoft Fabric-related tools and applications. - Stay up to date with the latest Microsoft Fabric updates and best practices. Required Qualifications: - 5+ years of experience in IT support, with a focus on Microsoft Fabric or related technologies. - Strong knowledge of Microsoft Fabric, Power BI, Azure Synapse, and data integration tools. - Experience with troubleshooting and resolving issues in a cloud-based environment. - Familiarity with SQL, data pipelines, and ETL processes. - Excellent problem-solving and communication skills. - Ability to work independently and collaboratively in a team environment. Preferred Qualifications: - Microsoft certifications related to Fabric, Azure, or Power BI. - Experience with automation and scripting (PowerShell, Python, etc.). - Understanding of security and compliance considerations in cloud-based data platforms. About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at . NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click.
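The posting above asks for proactive monitoring of Microsoft Fabric services and scripting skills (PowerShell, Python). As a minimal, hedged sketch of what such automation might look like, the Python snippet below polls the Power BI REST API for a dataset's recent refresh history and flags failures. The dataset ID and the way the Azure AD access token is obtained are assumptions made for illustration only; adapt them to your tenant, permissions, and authentication setup.

```python
# Minimal sketch: flag failed refreshes for one Power BI / Fabric dataset.
# Assumes you already hold a valid Azure AD access token with permission to
# read dataset metadata; acquiring the token (MSAL, service principal, etc.)
# is outside the scope of this sketch.
import os
import requests

ACCESS_TOKEN = os.environ["PBI_ACCESS_TOKEN"]        # assumption: token supplied via env var
DATASET_ID = "00000000-0000-0000-0000-000000000000"  # hypothetical dataset ID

url = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/refreshes?$top=10"
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}, timeout=30)
resp.raise_for_status()

for refresh in resp.json().get("value", []):
    status = refresh.get("status")  # e.g. "Completed", "Failed", "Unknown"
    if status == "Failed":
        print(f"Refresh {refresh.get('requestId')} failed at {refresh.get('endTime')}: "
              f"{refresh.get('serviceExceptionJson')}")
    else:
        print(f"Refresh at {refresh.get('endTime')}: {status}")
```

A support engineer might run a script like this on a schedule and raise a ticket when failures appear; the same request pattern extends to other Power BI and Fabric REST endpoints.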

Posted 1 month ago

Apply

10.0 - 12.0 years

25 - 30 Lacs

Hyderabad, Bengaluru

Hybrid

Excellent communication and interpersonal skills, with the ability to explain technical concepts to non-technical stakeholders. Azure certifications such as Azure Solutions Architect Expert (AZ-30x) or equivalent certifications are preferred. Knowledge of hybrid cloud architecture and migration techniques. Mandatory Skills: Microsoft Fabric, Azure Data Factory, Azure Synapse Analytics, Azure SQL DB, Azure Functions, Azure Cosmos DB. Nice to Have: .NET experience, Azure AI skills.

Posted 1 month ago

Apply

8.0 - 10.0 years

11 - 18 Lacs

Pune

Work from Office

Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. - Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. - Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.
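To make the pipeline responsibilities above concrete, here is a minimal, illustrative PySpark sketch of the kind of notebook step a Fabric data engineer might write: load a raw CSV from the Lakehouse, apply basic cleansing, and save the result as a Delta table. The file path, table name, and column names are hypothetical. In a Microsoft Fabric notebook the `spark` session and Delta support are provided by the runtime; the session is created explicitly here only so the sketch is self-contained.

```python
# Illustrative bronze-to-silver cleansing step (hypothetical paths and table names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Fabric notebook, `spark` already exists; this line only matters when
# running outside Fabric (where Delta support must also be configured).
spark = SparkSession.builder.appName("bronze_to_silver_orders").getOrCreate()

raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("Files/raw/orders.csv")  # hypothetical Lakehouse file path
)

silver = (
    raw.dropDuplicates(["order_id"])                      # assumed business key
       .filter(F.col("order_id").isNotNull())             # drop rows without a key
       .withColumn("order_date", F.to_date("order_date")) # normalise the date column
       .withColumn("ingested_at", F.current_timestamp())  # audit column
)

# Delta is the default table format in Fabric Lakehouses.
silver.write.mode("overwrite").format("delta").saveAsTable("silver_orders")
```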

Posted 1 month ago

Apply

8.0 - 10.0 years

11 - 18 Lacs

Mumbai

Work from Office

Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. - Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. - Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.

Posted 1 month ago

Apply

8.0 - 10.0 years

11 - 18 Lacs

Jaipur

Work from Office

Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. - Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. - Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.

Posted 1 month ago

Apply

8.0 - 10.0 years

11 - 18 Lacs

Kolkata

Work from Office

Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. - Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. - Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.

Posted 1 month ago

Apply

8.0 - 10.0 years

11 - 18 Lacs

Ahmedabad

Work from Office

Company Overview : Zorba Consulting India is a leading consultancy firm focused on delivering innovative solutions and strategies to enhance business performance. With a commitment to excellence, we prioritize collaboration, integrity, and customer-centric values in our operations. Our mission is to empower organizations by transforming data into actionable insights and enabling data-driven decision-making. We are dedicated to fostering a culture of continuous improvement and supporting our team members' professional development. Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. - Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. - Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.

Posted 1 month ago

Apply

8.0 - 10.0 years

11 - 18 Lacs

Chennai

Work from Office

Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. - Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. - Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.

Posted 1 month ago

Apply

8.0 - 10.0 years

11 - 18 Lacs

Hyderabad

Work from Office

Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. - Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. - Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.

Posted 1 month ago

Apply

8.0 - 10.0 years

11 - 18 Lacs

Surat

Work from Office

Role Responsibilities : - Design and implement data pipelines using MS Fabric. - Develop data models to support business intelligence and analytics. - Manage and optimize ETL processes for data extraction, transformation, and loading. - Collaborate with cross-functional teams to gather and define data requirements. - Ensure data quality and integrity in all data processes. - Implement best practices for data management, storage, and processing. - Conduct performance tuning for data storage and retrieval for enhanced efficiency. - Generate and maintain documentation for data architecture and data flow. - Participate in troubleshooting data-related issues and implement solutions. - Monitor and optimize cloud-based solutions for scalability and resource efficiency. - Evaluate emerging technologies and tools for potential incorporation in projects. - Assist in designing data governance frameworks and policies. - Provide technical guidance and support to junior data engineers. - Participate in code reviews and ensure adherence to coding standards. - Stay updated with industry trends and best practices in data engineering. Qualifications : - 8+ years of experience in data engineering roles. - Strong expertise in MS Fabric and related technologies. - Proficiency in SQL and relational database management systems. - Experience with data warehousing solutions and data modeling. - Hands-on experience in ETL tools and processes. - Knowledge of cloud computing platforms (Azure, AWS, GCP). - Familiarity with Python or similar programming languages. - Ability to communicate complex concepts clearly to non-technical stakeholders. - Experience in implementing data quality measures and data governance. - Strong problem-solving skills and attention to detail. - Ability to work independently in a remote environment. - Experience with data visualization tools is a plus. - Excellent analytical and organizational skills. - Bachelor's degree in Computer Science, Engineering, or related field. - Experience in Agile methodologies and project management.

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Noida

Work from Office

We are seeking a skilled and detail-oriented Database and Report Developer with 4-5 years of experience in designing, developing, and maintaining data solutions and business intelligence reports. The ideal candidate should have hands-on experience with Microsoft Fabric, SAP BusinessObjects (BOBJ), Power BI, and SQL. You will work closely with business stakeholders to understand reporting needs, design ETL/data models, and deliver actionable insights. Key Responsibilities: Develop and maintain reports and dashboards using Power BI, SAP BOBJ, and Microsoft Fabric. Design, optimize, and manage SQL queries, stored procedures, and views for reporting and analytics. Collaborate with data engineers and analysts to build and optimize data models and pipelines. Translate business requirements into technical specifications and reporting solutions. Perform data validation, quality checks, and performance tuning of reports. Work with large data sets across multiple sources to generate insights and trends. Provide support and troubleshooting for reporting tools and dashboards. Maintain data security and access controls across reporting platforms. Document development processes, data flows, and report definitions. Required Skills and Qualifications: 4-5 years of experience in data reporting and database development. Strong experience with Power BI: DAX, Power Query, report design, data modeling. Proficiency in SAP BusinessObjects (BOBJ): Web Intelligence (WebI), Universe Designer. Hands-on experience with Microsoft Fabric (OneLake, Dataflows, Pipelines) is a plus. Strong command of SQL for data extraction, transformation, and analysis. Experience working with relational databases such as SQL Server, Azure SQL, or Oracle. Familiarity with ETL processes and data warehouse concepts. Ability to communicate effectively with both technical and non-technical stakeholders. Preferred Qualifications: Experience with cloud-based data platforms (e.g., Azure Synapse, Azure Data Factory). Knowledge of performance tuning and best practices in Power BI/BOBJ. Understanding of data governance and data security principles. Education: Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
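Since this role combines SQL development with report validation, the sketch below shows one common pattern: running a parameterized query against Azure SQL with pyodbc and applying a simple sanity check before the data is surfaced in Power BI or BOBJ. The server, database, table, and column names are placeholders, and the validation rules are illustrative assumptions rather than a prescribed standard.

```python
# Illustrative report-feed query with a basic validation check.
# Connection details, table and column names are placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"   # hypothetical server
    "DATABASE=ReportingDB;"                    # hypothetical database
    "Authentication=ActiveDirectoryInteractive;"
)

query = """
SELECT region, SUM(net_amount) AS total_sales
FROM dbo.fact_sales
WHERE sale_date >= ? AND sale_date < ?
GROUP BY region;
"""

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    cursor.execute(query, ("2024-01-01", "2024-02-01"))  # parameterized date window
    rows = cursor.fetchall()

# Simple validation before publishing: the feed should never be empty,
# and totals should not be negative for this (hypothetical) measure.
assert rows, "Report feed returned no rows - check source tables or filters"
for region, total_sales in rows:
    assert total_sales >= 0, f"Negative total for {region}: {total_sales}"
    print(f"{region}: {total_sales}")
```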

Posted 1 month ago

Apply

3.0 - 7.0 years

5 - 10 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid

Role & Responsibilities Job Description: We are seeking a skilled and experienced Microsoft Fabric Engineer to join our data engineering team. The ideal candidate will have a strong background in designing, developing, and maintaining data solutions using Microsoft Fabric, including experience across key workloads such as Data Engineering, Data Factory, Data Science, Real-Time Analytics, and Power BI. The role requires a deep understanding of Synapse Data Warehouse, OneLake, Notebooks, Lakehouse architecture, and Power BI integration within the Microsoft ecosystem. Key Responsibilities: Design and implement scalable and secure data solutions using Microsoft Fabric. Build and maintain data pipelines using Dataflows Gen2 and Data Factory. Work with Lakehouse architecture and manage datasets in OneLake. Develop notebooks (PySpark or T-SQL) for data transformation and processing. Collaborate with data analysts to create interactive dashboards and reports using Power BI (within Fabric). Leverage Synapse Data Warehouse and KQL databases for structured and real-time analytics. Monitor and optimize the performance of data pipelines and queries. Ensure adherence to data quality, security, and governance practices. Stay current with Microsoft Fabric updates and the roadmap, recommending enhancements. Required Skills: 3+ years of hands-on experience with Microsoft Fabric or similar tools in the Microsoft data stack. Strong proficiency with: Data Factory (Fabric), Synapse Data Warehouse / SQL analytics endpoints, Power BI integration and DAX, Notebooks (PySpark, T-SQL), Lakehouse and OneLake. Understanding of data modeling, ETL/ELT processes, and real-time data streaming. Experience with KQL (Kusto Query Language) is a plus. Familiarity with Microsoft Purview, Azure Data Lake, or Azure Synapse Analytics is advantageous. Qualifications: Microsoft Fabric, OneLake, Data Factory, Data Lake, Data Mesh.
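For context on the Lakehouse and Power BI integration workloads listed above, here is a minimal PySpark sketch of a silver-to-gold aggregation that a Fabric notebook might produce for reporting. The table and column names are assumptions; in Fabric the `spark` session is supplied by the notebook runtime, and the resulting Delta table can then be consumed from the SQL analytics endpoint or by Power BI reports built on the Lakehouse.

```python
# Illustrative silver-to-gold aggregation for reporting (assumed table/column names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gold_daily_sales").getOrCreate()  # provided in Fabric

silver = spark.table("silver_orders")  # hypothetical cleansed table

gold = (
    silver.groupBy(F.to_date("order_date").alias("order_day"), "region")
          .agg(
              F.countDistinct("order_id").alias("order_count"),
              F.sum("net_amount").alias("total_sales"),
          )
          .orderBy("order_day", "region")
)

# Writing a Delta table makes the aggregate available to the SQL analytics
# endpoint and to Power BI reports built on the Lakehouse.
gold.write.mode("overwrite").format("delta").saveAsTable("gold_daily_sales")
```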

Posted 1 month ago

Apply

12.0 - 22.0 years

25 - 40 Lacs

Bangalore Rural, Bengaluru

Work from Office

Role & responsibilities Requirements: Data Modeling (Conceptual, Logical, Physical) - minimum 5 years; Database Technologies (SQL Server, Oracle, PostgreSQL, NoSQL) - minimum 5 years; Cloud Platforms (AWS, Azure, GCP) - minimum 3 years; ETL Tools (Informatica, Talend, Apache NiFi) - minimum 3 years; Big Data Technologies (Hadoop, Spark, Kafka) - minimum 5 years; Data Governance & Compliance (GDPR, HIPAA) - minimum 3 years; Master Data Management (MDM) - minimum 3 years; Data Warehousing (Snowflake, Redshift, BigQuery) - minimum 3 years; API Integration & Data Pipelines - good to have; Performance Tuning & Optimization - minimum 3 years; Business Intelligence (Power BI, Tableau) - minimum 3 years. Job Description: We are seeking experienced Data Architects to design and implement enterprise data solutions, ensuring data governance, quality, and advanced analytics capabilities. The ideal candidate will have expertise in defining data policies, managing metadata, and leading data migrations from legacy systems to Microsoft Fabric/Databricks/ . Experience and deep knowledge of at least one of these three platforms is critical. Additionally, they will play a key role in identifying use cases for advanced analytics and developing machine learning models to drive business insights. Key Responsibilities: 1. Data Governance & Management: Establish and maintain a Data Usage Hierarchy to ensure structured data access. Define data policies, standards, and governance frameworks to ensure consistency and compliance. Implement Data Quality Management practices to improve accuracy, completeness, and reliability. Oversee Metadata and Master Data Management (MDM) to enable seamless data integration across platforms. 2. Data Architecture & Migration: Lead the migration of data systems from legacy infrastructure to Microsoft Fabric. Design scalable, high-performance data architectures that support business intelligence and analytics. Collaborate with IT and engineering teams to ensure efficient data pipeline development. 3. Advanced Analytics & Machine Learning: Identify and define use cases for advanced analytics that align with business objectives. Design and develop machine learning models to drive data-driven decision-making. Work with data scientists to operationalize ML models and ensure real-world applicability. Required Qualifications: Proven experience as a Data Architect or similar role in data management and analytics. Strong knowledge of data governance frameworks, data quality management, and metadata management. Hands-on experience with Microsoft Fabric and data migration from legacy systems. Expertise in advanced analytics, machine learning models, and AI-driven insights. Familiarity with data modelling, ETL processes, and cloud-based data solutions (Azure, AWS, or GCP). Strong communication skills with the ability to translate complex data concepts into business insights. Preferred candidate profile: Immediate joiner.
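As a small illustration of the Data Quality Management responsibilities described above, the pandas sketch below computes a few common quality metrics (completeness, key uniqueness, value validity) for a table before migration. The column names, sample data, and validity rule are hypothetical; a production governance framework would typically rely on a dedicated tool or library rather than an ad-hoc script like this.

```python
# Illustrative data-quality profiling for a customer table (hypothetical schema).
import pandas as pd

def quality_report(df: pd.DataFrame, key_col: str) -> dict:
    """Return simple completeness, uniqueness, and validity metrics."""
    return {
        "row_count": len(df),
        "null_pct_per_column": (df.isna().mean() * 100).round(2).to_dict(),
        "duplicate_keys": int(df[key_col].duplicated().sum()),
        # Validity example: ages must fall in a plausible range (assumed rule).
        "invalid_age_rows": int((~df["age"].between(0, 120)).sum()),
    }

if __name__ == "__main__":
    customers = pd.DataFrame(
        {
            "customer_id": [1, 2, 2, 4],
            "age": [34, None, 151, 28],
            "email": ["a@x.com", None, "c@x.com", "d@x.com"],
        }
    )
    print(quality_report(customers, key_col="customer_id"))
```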

Posted 1 month ago

Apply

7 - 12 years

27 - 40 Lacs

Pune

Work from Office

Hi, greetings for the season! We are hiring for a Data Engineer III role with a global IT services and IT consulting company. KEY DUTIES: Work alongside senior team members to define and implement migration strategies. Creation of Data Lakes, migrating SQL databases to the Data Lake, and ELT/ETL transformations. Design, construct, install, test and maintain data management systems. Ensure all systems meet business and performance requirements. Create and maintain optimal data pipeline architecture. Assemble large, complex datasets that meet functional and non-functional business requirements. Develop and implement data flows to connect operational systems, data for analytics, and BI (Business Intelligence) systems. Design, implement and optimize data models on Azure data platforms. Implement effective metrics and monitoring processes. Drive the collection, cleaning, processing, and analysis of new and existing data sources. Work with data and analytics experts to strive for greater functionality in data systems. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability. Ensure seamless integration with Power BI reports; exposure to Tableau. Other duties as assigned. BASIC QUALIFICATIONS: 7+ years of experience in data engineering with a focus on cloud data platforms. Experience with data migrations and CRMs such as Salesforce and MS Dynamics 365 and their corresponding data models. Extensive experience with T-SQL, including query optimization, stored procedures, and performance tuning for efficient data processing. Hands-on experience with Azure Data Lake, Azure Data Factory, Azure SQL Database. Hands-on experience building and optimizing data ingestion, transformation, and processing workflows. Strong understanding of data governance, security, and access controls in an Azure environment. Hands-on experience with Power BI and the ability to modify DAX measures and data models to align with data lake integration. Experience with Tableau is an asset. Strong analytical and problem-solving skills to troubleshoot data ingestion and transformation issues. Experience in the creation and maintenance of Disaster Recovery plans. Experience with index maintenance. Experience with Microsoft Fabric is an asset. Ability to work independently without supervision and to work effectively with a team.
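To ground the "migrating SQL databases to the Data Lake" duty above, here is a hedged Python sketch of one common landing pattern: stream a table out of Azure SQL in chunks with pandas/pyodbc and write each chunk as a Parquet file into a lake folder, which ADF, Spark, or Fabric can then pick up. The connection details, table name, watermark column, and output path are placeholders; in practice this step is often handled by Azure Data Factory or a Spark notebook rather than a standalone script.

```python
# Illustrative chunked extract from Azure SQL to Parquet files in a lake folder.
# Connection string, table, watermark column and output path are placeholders;
# requires pyodbc and pyarrow (for DataFrame.to_parquet).
from pathlib import Path

import pandas as pd
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"  # hypothetical
    "DATABASE=SalesDB;"                       # hypothetical
    "Authentication=ActiveDirectoryInteractive;"
)
OUTPUT_DIR = Path("/lake/raw/sales_orders")   # hypothetical mounted lake path
WATERMARK = "2024-01-01"                       # last successfully loaded date (assumed)

# For a real pipeline, pass the watermark as a query parameter instead of
# formatting it into the SQL string.
query = f"SELECT * FROM dbo.sales_orders WHERE modified_date >= '{WATERMARK}'"

OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
with pyodbc.connect(CONN_STR) as conn:
    # chunksize makes read_sql yield DataFrames instead of loading everything at once.
    for i, chunk in enumerate(pd.read_sql(query, conn, chunksize=50_000)):
        chunk.to_parquet(OUTPUT_DIR / f"part-{i:05d}.parquet", index=False)
        print(f"wrote part-{i:05d} with {len(chunk)} rows")
```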

Posted 1 month ago

Apply

7 - 12 years

9 - 14 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office

Required Qualification: B.Tech / MCA / equivalent in Computer Science, IT, or Engineering. Primary Skills: Azure Data Factory (ADF) - data pipeline creation, data flow management; Power BI - report & dashboard development, DAX, data modeling; SQL - strong experience with Azure SQL, SQL Server, query optimization; ETL Processes - design and implementation of scalable ETL pipelines; Cloud Platforms - Microsoft Azure (Azure Data Lake, Azure Blob Storage); Programming - Python and SQL for data transformation; Data Warehousing Concepts - hands-on understanding of data architecture; Performance Tuning - optimizing data pipelines for reliability and efficiency. Preferred Skills: Azure Databricks - data engineering in distributed environments; Microsoft Fabric - exposure to the end-to-end analytics platform; Data Modeling - best practices in Power BI and enterprise data environments; Cloud Ecosystem Knowledge - broader understanding of Azure services. What We Need: Educational background - B.Tech / MCA / equivalent in Computer Science, Information Technology, or a related field; 7+ years of hands-on experience in Data Engineering / BI.

Posted 1 month ago

Apply

5 - 10 years

9 - 18 Lacs

Bengaluru

Work from Office

We are seeking a highly skilled Power BI Expert with over 5 years of experience in business intelligence and data analytics. The ideal candidate will have expertise in Azure, Data Factory, Microsoft Fabric, and Data Warehousing. Required Candidate Profile: Experience with Power BI, Azure, Data Warehousing, and related technologies. Proficiency in DAX, Power Query, SQL, and data visualization best practices. Degree in Computer Science, Data Analytics.

Posted 1 month ago

Apply

5 - 10 years

10 - 18 Lacs

Bengaluru

Work from Office

5+ years of experience in business intelligence and data analytics. Highly skilled Power BI expert in data management. Required Candidate Profile: Strong background in Azure, Data Factory, Microsoft Fabric, and Data Warehousing, enabling the creation of robust data pipelines and effective data visualizations; proficiency in DAX and Power Query (M language), supporting decision makers.

Posted 1 month ago

Apply

- 2 years

3 - 8 Lacs

Lucknow

Hybrid

Develop and maintain scalable data pipelines. Collaborate with data scientists and analysts to support business needs. Work with cloud platforms like AWS, Azure, or Google Cloud. Work effectively with cross-functional teams. Data Modelling.

Posted 1 month ago

Apply

10 - 20 years

20 - 35 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office

Job Summary: As a Solution Architect, you will collaborate with our sales, presales and COE teams to provide technical expertise and support throughout the new business acquisition process. You will play a crucial role in understanding customer requirements, presenting our solutions, and demonstrating the value of our products. You thrive in high-pressure environments, maintaining a positive outlook and understanding that career growth is a journey that requires making strategic choices. You possess good communication skills, both written and verbal, enabling you to convey complex technical concepts clearly and effectively. You are a team player and a customer-focused, self-motivated, responsible individual who can work under pressure with a positive attitude. You must have experience in managing and handling RFPs/RFIs, client demos and presentations, and converting opportunities into winning bids. You possess a strong work ethic, a positive attitude, and enthusiasm to embrace new challenges. You can multi-task and prioritize (good time management skills) and are willing to learn. You should be able to work independently with little or no supervision. You should be process-oriented, have a methodical approach, and demonstrate a quality-first mindset. The ability to convert clients' business challenges and priorities into winning proposals/bids through excellence in technical solutioning will be the key performance indicator for this role. What you'll do: Architecture & Design: Develop high-level architecture designs for scalable, secure, and robust solutions. Technology Evaluation: Select appropriate technologies, frameworks, and platforms for business needs. Cloud & Infrastructure: Design cloud-native, hybrid, or on-premises solutions using AWS, Azure, or GCP. Integration: Ensure seamless integration between various enterprise applications, APIs, and third-party services. Design and develop scalable, secure, and performant data architectures on Microsoft Azure and/or new-generation analytics platforms like MS Fabric. Translate business needs into technical solutions by designing secure, scalable, and performant data architectures on cloud platforms. Select and recommend appropriate data services (e.g., Fabric, Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, Power BI, etc.) to meet specific data storage, processing, and analytics needs. Develop and recommend data models that optimize data access and querying. Design and implement data pipelines for efficient data extraction, transformation, and loading (ETL/ELT) processes. Ability to understand conceptual/logical/physical data modelling. Choose and implement appropriate data storage, processing, and analytics services based on specific data needs (e.g., data lakes, data warehouses, data pipelines). Understand and recommend data governance practices, including data lineage tracking, access control, and data quality monitoring. What you will bring: 10+ years of working in data analytics and AI technologies from consulting, implementation and design perspectives. Certifications in data engineering, analytics, cloud, or AI will be a distinct advantage. A Bachelor's in engineering/technology or an MCA from a reputed college is a must. Prior experience working as a solution architect during the presales cycle will be an advantage. Soft Skills: communication skills, presentation skills, flexible and hard-working. Technical Skills: knowledge of presales processes, basic understanding of business analytics and AI, high IQ and EQ. Why join us?
Work with a passionate and innovative team in a fast-paced, growth-oriented environment. Gain hands-on experience in content marketing with exposure to real-world projects. Opportunity to learn from experienced professionals and enhance your marketing skills. Contribute to exciting initiatives and make an impact from day one. Competitive stipend and potential for growth within the company. Recognized for excellence in data and AI solutions with industry awards and accolades. Employee Benefits Culture: Open Door Policy: Encourages open communication and accessibility to management. Open Office Floor Plan: Fosters a collaborative and interactive work environment. Flexible Working Hours: Allows employees to have flexibility in their work schedules. Employee Referral Bonus: Rewards employees for referring qualified candidates. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback. Inclusivity and Diversity: Hiring practices that promote diversity: Ensures a diverse and inclusive workforce. Mandatory POSH training: Promotes a safe and respectful work environment. Health Insurance and Wellness Benefits: GMC and Term Insurance: Offers medical coverage and financial protection. Health Insurance: Provides coverage for medical expenses. Disability Insurance: Offers financial support in case of disability. Child Care & Parental Leave Benefits: Company-sponsored family events: Creates opportunities for employees and their families to bond. Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child. Family Medical Leave: Offers leave for employees to take care of family members' medical needs. Perks and Time-Off Benefits: Company-sponsored outings: Organizes recreational activities for employees. Gratuity: Provides a monetary benefit as a token of appreciation. Provident Fund: Helps employees save for retirement. Generous PTO: Offers more than the industry standard for paid time off. Paid sick days: Allows employees to take paid time off when they are unwell. Paid holidays: Gives employees paid time off for designated holidays. Bereavement Leave: Provides time off for employees to grieve the loss of a loved one. Professional Development Benefits: L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development. Mentorship Program: Offers guidance and support from experienced professionals. Job Training: Provides training to enhance job-related skills. Professional Certification Reimbursements: Assists employees in obtaining professional certifications. Promote from Within: Encourages internal growth and advancement opportunities.

Posted 1 month ago

Apply
Page 3 of 3 results

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
