25 MS Fabric Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

0 - 0 Lacs

pune, maharashtra

On-site

You will be responsible for architecting data warehousing and business intelligence solutions to address cross-functional business challenges. This involves interacting with business stakeholders to gather requirements and deliver comprehensive Data Engineering, Data Warehousing, and analytics solutions, and collaborating with other technology teams to extract, transform, and load data from diverse sources.

You should have a minimum of 5-8 years of end-to-end Data Engineering development experience, preferably across industries such as Retail, FMCG, Manufacturing, Finance, and Oil & Gas. Experience in functional domains like Sales, Procurement, Cost Control, Business Development, and Finance is desirable. You are expected to have 3 to 10 years of experience in data engineering projects using Azure or AWS services, with hands-on expertise in data transformation, processing, and migration using tools such as Azure Data Lake Storage, Azure Data Factory, Databricks, AWS Glue, Redshift, and Athena. Familiarity with MS Fabric and its components is advantageous, along with experience working with source/target systems such as Oracle Database, SQL Server, Azure Data Lake Storage, and ERP, CRM, and SCM systems. Proficiency in reading data from sources via APIs/web services and writing data to target systems via APIs is essential. You should also have experience in data cleanup, data cleansing, and optimization tasks, including working with non-structured data sets in Azure. Knowledge of analytics tools like Power BI and Azure Analysis Services, as well as exposure to private and public cloud architectures, is beneficial. Excellent written and verbal communication skills are crucial for this role.

Ideally, you hold an M.Tech / B.E. / B.Tech (Computer Science, Information Systems, IT) / MCA / MCS degree. Key requirements include expertise in Azure Data Factory, Python, PySpark, Synapse Analytics, Azure Function Apps, Azure Databricks, AWS Glue, Athena, Redshift, and Databricks PySpark. Exposure to integration with applications and systems such as ERP, CRM, SCM, and web apps using APIs, across cloud and on-premise systems, databases, and file systems is expected. The role requires a minimum of 3 full-cycle data engineering implementations (5-10 years of experience), with a focus on building data warehouses and implementing data models. Consulting industry exposure is mandatory, along with strong verbal and written communication skills.

Your primary skills should encompass Data Engineering development, cloud engineering with Azure or AWS, data warehousing and BI solutions architecture, programming (Python, PySpark), data integration across various systems, consulting experience, ETL and data transformation, and knowledge of cloud architecture. Additionally, familiarity with MS Fabric, handling non-structured data, data cleanup and optimization, APIs/web services, data visualization, and industry and functional knowledge will be advantageous. The compensation package ranges from INR 12-28 LPA, subject to the candidate's performance and experience level.
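For illustration, a minimal PySpark sketch of the cleanup-and-load work this role describes, of the kind run in a Databricks or Fabric notebook; the storage paths, container names, and columns are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-cleanup").getOrCreate()

# Read raw extracts landed in the data lake (hypothetical path).
raw = spark.read.option("header", True).csv(
    "abfss://raw@examplelake.dfs.core.windows.net/sales/*.csv"
)

# Basic cleansing: trim keys, cast amounts, drop duplicates and null keys.
clean = (
    raw.withColumn("customer_id", F.trim(F.col("customer_id")))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
)

# Write curated output as Delta for downstream warehouse and BI models.
clean.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplelake.dfs.core.windows.net/sales"
)
```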

Posted 3 days ago

Apply

6.0 - 9.0 years

18 - 25 Lacs

Bengaluru

Hybrid

About the Role
We are seeking a BI Architect to advise the BI Lead of a global CPG organization and architect an intelligent, scalable Business Intelligence ecosystem. This includes an enterprise-wide KPI dashboard suite augmented by a GenAI-driven natural language interface for insight discovery. The ideal candidate will be responsible for end-to-end architecture: from scalable data models and dashboards to a conversational interface powered by Retrieval-Augmented Generation (RAG) and/or Knowledge Graphs. The solution must synthesize internal BI data with external (web-scraped and competitor) data to deliver intelligent, context-rich insights.

Key Responsibilities
• Architect BI Stack: Design and oversee a scalable and performant BI platform that serves as a single source of truth for key business metrics across functions (Sales, Marketing, Supply Chain, Finance, etc.).
• Advise BI Lead: Act as a technical thought partner to the BI Lead, aligning architecture decisions with long-term strategy and business priorities.
• Design GenAI Layer: Architect a GenAI-powered natural language interface on top of BI dashboards that lets business users query KPIs, trends, and anomalies conversationally.
• RAG/Graph Approach: Select and implement appropriate architectures (e.g., RAG using vector stores, Knowledge Graphs) to support intelligent, context-aware insights (see the sketch after this posting).
• External Data Integration: Build mechanisms to ingest and structure data from public sources (e.g., competitor websites, industry reports, social sentiment) to augment internal insights.
• Security & Governance: Ensure all layers (BI + GenAI) adhere to enterprise data governance, security, and compliance standards.
• Cross-functional Collaboration: Work closely with Data Engineering, Analytics, and Product teams to ensure seamless integration and operationalization.

Qualifications
• 6-9 years of experience in BI architecture and analytics platforms, with at least 2 years working on GenAI, RAG, or LLM-based solutions.
• Strong expertise in BI tools (e.g., Power BI, Tableau, Looker) and data modeling.
• Experience with GenAI frameworks (e.g., LangChain, LlamaIndex, Semantic Kernel) and vector databases (e.g., Pinecone, FAISS, Weaviate).
• Knowledge of graph-based data models and tools (e.g., Neo4j, RDF, SPARQL) is a plus.
• Proficiency in Python or a relevant scripting language for pipeline orchestration and AI integration.
• Familiarity with web scraping and structuring external/third-party datasets.
• Prior experience in the CPG domain or large-scale KPI dashboarding preferred.
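As a rough illustration of the RAG retrieval step named above, here is a toy Python sketch: embed KPI snippets, rank them by similarity to a question, and assemble a grounded prompt. The embed() function is a stand-in; a production system would use a real embedding model and a vector store such as FAISS or Pinecone.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    # Stand-in embedding: hash characters into a fixed-size unit vector (demo only).
    vec = np.zeros(64)
    for i, ch in enumerate(text.lower()):
        vec[(i + ord(ch)) % 64] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)

kpi_snippets = [
    "Q3 net revenue grew 4.2% vs prior year, led by beverages.",
    "Supply chain fill rate fell to 93.1% in September due to port delays.",
    "Marketing ROI for the summer campaign was 2.8x across EU markets.",
]
index = np.stack([embed(s) for s in kpi_snippets])

question = "How did revenue trend last quarter?"
scores = index @ embed(question)              # cosine similarity (unit vectors)
top = [kpi_snippets[i] for i in scores.argsort()[::-1][:2]]

# The retrieved context would be sent to the LLM along with the question.
prompt = "Answer using only this context:\n" + "\n".join(top) + f"\n\nQ: {question}"
print(prompt)
```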

Posted 4 days ago

Apply

2.0 - 6.0 years

0 - 0 Lacs

karnataka

On-site

As a Power BI Developer, you will be an integral part of our dynamic team, contributing your expertise to design, develop, and implement advanced Power BI solutions that facilitate data-driven decision-making. Your role will involve close collaboration with business stakeholders to grasp their requirements and translate them into visually appealing and high-performance Power BI reports.

Your responsibilities will span several key areas, including data modeling and analysis. You will create robust data models, use advanced DAX for complex calculations, and transform and clean data using Power Query. You will also develop interactive Power BI reports with diverse visualizations, optimize report performance, and enforce data access control through Row-Level Security (RLS). Furthermore, you will oversee Power BI Service administration, managing capacities, licenses, and deployment strategies, while integrating Power BI with other Microsoft tools for enhanced automation and data processing. Your expertise in cloud platforms like MS Fabric, Data Factory, and Data Lake will be crucial in optimizing data pipelines and scalability.

In addition to your technical responsibilities, you will collaborate with stakeholders to deliver actionable insights, mentor junior team members on best practices, and provide technical leadership by ensuring adherence to standards and deploying reports to production environments.

To qualify for this role, you should have 2 to 6 years of hands-on experience with Power BI and related technologies, with proficiency in data modeling, DAX, Power Query, visualization techniques, and SQL. Experience in ETL processes and cloud platforms, and strong problem-solving abilities, are essential. Excellent communication skills and the ability to work both independently and collaboratively are also required. Preferred qualifications include experience with R or Python for custom visual development and certification in Power BI or related technologies.

Please note that this position requires working at our (South) Bangalore office for at least 4 out of 5 days, with no remote work option available. Local candidates are preferred, and relocation assistance will not be provided. This is a full-time position based in Bangalore (South), offering a salary range of INR 500,000-1,200,000 per year. If you meet the qualifications and are eager to contribute to our team, we encourage you to apply before the deadline of April 15, 2025.
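As one concrete example of the Power BI Service automation this role mentions, the sketch below queues a dataset refresh through the Power BI REST API. The dataset ID and bearer token are placeholders; acquiring the Azure AD token (e.g., via MSAL) is omitted.

```python
import requests

DATASET_ID = "00000000-0000-0000-0000-000000000000"   # placeholder
TOKEN = "<azure-ad-access-token>"                      # placeholder

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()   # 202 Accepted means the refresh was queued
print("Refresh queued:", resp.status_code)
```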

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

haryana

On-site

As a Data Engineering Specialist, you will be responsible for assessing, capturing, and translating complex business issues into structured technical tasks for the data engineering team. This includes designing, building, launching, optimizing, and extending full-stack data and business intelligence solutions. Your role will involve supporting the build of big data environments, focusing on improving data pipelines and data quality, and working with stakeholders to meet business needs.

You will create data access tools for the analytics and data science team, conduct code reviews, assist other developers, and train team members as required. Additionally, you will ensure that developed systems comply with industry standards and best practices while meeting project requirements.

To excel in this role, you should possess a Bachelor's degree in computer science engineering or equivalent, or relevant experience. Certification in cloud technologies, especially Azure, would be beneficial. You should have 2-3+ years of development experience in building and maintaining ETL/ELT pipelines across various sources and operational programming tasks. Experience with Apache data projects or cloud platform equivalents, and proficiency in programming languages like Python, Scala, R, Java, Golang, Kotlin, C, or C++, is required.

Your work will involve collaborating closely with data scientists, machine learning engineers, and stakeholders to understand requirements and develop data-driven solutions. Troubleshooting, debugging, and resolving issues within generative AI system development, as well as documenting processes, specifications, and training procedures, will be part of your responsibilities.

In summary, this role requires a strong background in data engineering, proficiency in cloud technologies, experience with data projects and programming languages, and the ability to collaborate effectively with various stakeholders to deliver high-quality data solutions.

Posted 1 week ago

Apply

7.0 - 12.0 years

12 - 20 Lacs

Bengaluru

Hybrid

Microsoft Fabric & Azure Expertise
Design and implement comprehensive data solutions using Microsoft Fabric, including Data Pipelines, Lakehouses, and Real-Time Analytics. Build and manage scalable OLAP architectures, including data layer design, dimensional modeling, and semantic models. Develop and optimize data pipelines within Fabric Workspaces for efficient ETL/ELT processing. Lead implementation of data governance practices, including IAM policies, RBAC, RLS, data masking, and encryption. Manage and optimize Fabric resources, including Fabric SKUs, OneLake, and Workspace configurations. Set up and manage Power BI apps integrated with Fabric for reporting and dashboard distribution. Develop and maintain Fabric Notebooks using PySpark, Python, and SQL for advanced data engineering and analytics use cases. Oversee Azure services integration, such as App Registrations, IAM, and other platform services, to ensure secure and efficient operation.

Python Development
Build robust, scalable Python scripts and applications with a strong focus on performance, reliability, and maintainability. Integrate external systems using RESTful APIs and automate workflows through Python-based orchestration. Use advanced Python libraries such as pandas and numpy for data manipulation and transformation. Develop database integration scripts to connect with relational databases (e.g., SQL Server, MySQL, PostgreSQL, Oracle) and execute complex SQL and PL/SQL queries. Handle and transform complex JSON data structures for ingestion and processing (see the sketch below).

Collaboration & Leadership
Work closely with cross-functional stakeholders, including data scientists, analysts, product managers, and business leaders, to understand requirements and deliver scalable solutions. Translate business needs into technical specifications and implement solutions that align with architectural best practices. Provide technical mentorship and guidance to junior team members and contribute to architectural decision-making.

Qualifications
Experience in data engineering, analytics, or platform engineering roles. Proven expertise in Microsoft Fabric and Azure Data Services. Strong command of Python programming and experience working with large-scale data systems. Experience designing and implementing OLAP systems, including semantic models and multidimensional cubes. Solid understanding of data governance, security, and access control in enterprise environments. Exceptional problem-solving, communication, and collaboration skills.

Regular Tasks Required:
Build and maintain MS Fabric data import, standardization, and governance procedures. Write effective Python code to tackle complex issues, using your business sense and analytical abilities to glean valuable insights from public databases. Communicate clearly with the team and help the organization realize its objectives. Clearly express the reasoning and logic when writing code. Fix bugs in the code and create thorough documentation. Use your data analysis skills to develop and respond to important business queries using available datasets. Communicate effectively with the team to understand needs and deliver results.

Useful but non-mandatory skills:
Hands-on experience with Generative AI tools, including LangChain, RAG, and vector databases. Experience with web scraping, browser automation, and workflow automation. Proficiency in traditional ML libraries (scikit-learn, TensorFlow, PyTorch, XGBoost).

Experience and Educational Requirements:
Bachelor's in Engineering or Master's in Computer Science (or equivalent experience). At least 2+ years of relevant experience as a data scientist. 2+ years of data analysis experience and a desire to have a significant impact on the field of artificial intelligence. 2+ years of experience working extensively with Python programming. Experience with MS Fabric, OneLake, and the Microsoft data analytics stack. Experience with PySpark and Spark SQL. Extensive experience working with data science and analysis. Familiarity with SQL, REST APIs, Tableau, and related technologies is desirable. Excellent communication abilities to work successfully with stakeholders and researchers. Strong data analytic abilities and business sense are required to draw the appropriate conclusions from the dataset, respond to those conclusions, and clearly convey the key findings. Excellent problem-solving and analytical skills. Fluent conversational and written English communication skills.

Physical Requirements:
Normal, corrective vision range; ability to see color and to distinguish letters, numbers, and symbols. Frequently required to sit, stand, walk, talk, hear, bend, and reach. Ability to reach with hands and arms. Occasionally lift and/or move up to 30 lbs.
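As a small sketch of the "complex JSON" handling this posting calls for, the snippet below flattens a nested API payload into a tabular frame with pandas; the payload shape is invented for illustration.

```python
import pandas as pd

payload = [
    {
        "order_id": 101,
        "customer": {"id": "C-9", "region": "EMEA"},
        "lines": [
            {"sku": "A1", "qty": 2, "price": 9.5},
            {"sku": "B7", "qty": 1, "price": 20.0},
        ],
    }
]

# One row per order line, with order- and customer-level fields repeated.
lines = pd.json_normalize(
    payload,
    record_path="lines",
    meta=["order_id", ["customer", "id"], ["customer", "region"]],
)
lines["line_total"] = lines["qty"] * lines["price"]
print(lines)
```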

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

You should have a minimum of 7 years of experience in data warehouse/lakehouse programming and should have successfully implemented at least 2 end-to-end data warehouse/data lake projects. Additionally, you should have experience implementing at least 1 Azure data warehouse/lakehouse project end-to-end, converting business requirements into concept/technical specifications, and collaborating with source-system experts to finalize ETL and analytics design. You will also be responsible for supporting data modelers and developers in the design and development of ETLs, and for creating activity plans with timelines based on agreed concepts.

Your technical expertise should include a strong background with Microsoft Azure components such as Azure Data Factory, Azure Synapse, Azure SQL Database, Azure Key Vault, MS Fabric, Azure DevOps (ADO), and Virtual Networks (VNets). You should also have expertise in Medallion Architecture for lakehouses and data modeling in the Gold layer, along with a solid understanding of data warehouse design principles like star schema, snowflake schema, and data partitioning. Proficiency in MS SQL database packages, stored procedures, functions, triggers, and data transformation activities using SQL is required, as well as knowledge of SQL*Loader, Data Pump, and import/export utilities. Experience with data visualization or BI tools like Tableau and Power BI, capacity planning, environment management, and performance tuning, and familiarity with cloud cloning/copying processes within Azure, will be essential for this role. Knowledge of green computing principles and optimizing cloud resources for cost and environmental efficiency is also desired.

You should possess excellent interpersonal and communication skills to collaborate effectively with technical and non-technical teams, communicate complex concepts, and influence key stakeholders. Analyzing demands and contributing to cost/benefit analysis and estimation are also part of the responsibilities. Preferred qualifications include certifications such as Azure Solutions Architect Expert or Azure Data Engineer Associate.

Skills required for this role include database management, Tableau, Power BI, ETL processes, Azure SQL Database, Medallion Architecture, Azure services, data visualization, data warehouse design, and Microsoft Azure technologies.
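To make the Medallion/star-schema requirement concrete, here is a hedged PySpark sketch promoting a silver table into a gold-layer dimension and fact; the table and column names are hypothetical Delta tables, not this employer's actual schema.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gold-build").getOrCreate()

silver = spark.read.table("silver.sales_orders")   # hypothetical silver table

# Dimension: one row per customer (star-schema dimension).
dim_customer = (
    silver.select("customer_id", "customer_name", "region")
          .dropDuplicates(["customer_id"])
)
dim_customer.write.format("delta").mode("overwrite").saveAsTable("gold.dim_customer")

# Fact: daily sales grain, keyed to the customer dimension.
fact_sales = (
    silver.groupBy("order_date", "customer_id")
          .agg(F.sum("amount").alias("sales_amount"),
               F.count("order_id").alias("order_count"))
)
fact_sales.write.format("delta").mode("overwrite").saveAsTable("gold.fact_sales_daily")
```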

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a Solution Architect at Kanerika, you will collaborate with our sales, presales, and COE teams to provide technical expertise and support throughout the new business acquisition process. Your role will involve understanding customer requirements, presenting our solutions, and demonstrating the value of our products. In this high-pressure environment, maintaining a positive outlook and making strategic choices for career growth are essential. Your excellent communication skills, both written and verbal, will enable you to convey complex technical concepts clearly and effectively. Being a team player who is customer-focused, self-motivated, and responsible, and who can work under pressure with a positive attitude, is crucial for success in this role.

Experience in managing and handling RFPs/RFIs, client demos and presentations, and converting opportunities into winning bids is required. A strong work ethic, positive attitude, and enthusiasm for new challenges are key qualities. You should be able to multitask, prioritize, and demonstrate good time management skills, as well as work independently with minimal supervision. A process-oriented and methodical approach with a quality-first mindset will be beneficial. The ability to convert a client's business challenges and priorities into winning proposals through excellence in technical solutions will be the key performance indicator for this role.

Your responsibilities will include developing high-level architecture designs for scalable, secure, and robust solutions; selecting appropriate technologies, frameworks, and platforms for business needs; and designing cloud-native, hybrid, or on-premises solutions using AWS, Azure, or GCP. You will also ensure seamless integration between various enterprise applications, APIs, and third-party services, and design and develop scalable, secure, and performant data architectures on Microsoft Azure and/or new-generation analytics platforms.

To excel in this role, you should have at least 10 years of experience working in data analytics and AI technologies from consulting, implementation, and design perspectives. Certifications in data engineering, analytics, cloud, and AI will be advantageous. A Bachelor's in engineering/technology or an MCA from a reputed college is a must, along with prior experience working as a solution architect during the presales cycle. Soft skills such as communication, presentation, flexibility, and diligence are essential. Knowledge of presales processes and a basic understanding of business analytics and AI will also benefit you in this role.

Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you'll get while working for Kanerika.

Posted 1 week ago

Apply

6.0 - 8.0 years

1 - 6 Lacs

Noida

Work from Office

Urgent Hiring: Microsoft Fabric Cloud Architect | 6-8 years | Noida | Immediate to 30 days notice. Skills: Azure Cloud, MS Fabric, PySpark, DAX, Python, Azure Synapse, ADF, Databricks, ETL Pipelines.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

nagpur, maharashtra

On-site

You have a great opportunity to join Harrier Information Systems as a Sr. Power BI Consultant in Nagpur. With 4 to 7 years of experience, you will be responsible for designing and developing Business Intelligence dashboards and reports using Power BI. Your role will involve working with Power BI/MS Fabric, DAX, data models, Power Query, and Power BI Service.

As a Sr. Power BI Consultant, you should excel in data transformations, data modeling, and data visualization layers. Performance tuning of existing Power BI dashboards and analyzing and proposing improvements to data model design are key aspects of this role. Strong SQL skills for querying and data manipulation are required, along with experience in ETL processes and tools. Knowledge of data warehousing concepts and data modeling, and experience with data sources such as Teradata and MS SQL, are desirable.

Your compensation will be commensurate with your technology skills, communication skills, and problem-solving attitude. If you are interested in this opportunity, please send your updated resume with current and expected CTC and notice period to mohan.bangde@harriersys.com or career@harriersys.com. We look forward to hearing from you soon.

Mohan Bangde
Harrier Information Systems Pvt. Ltd.
Mobile: +91-997-541-2556
Website: https://www.harriersys.com
Locations: India | UK | USA

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

navi mumbai, maharashtra

On-site

Arcadis is the world's leading company delivering sustainable design, engineering, and consultancy solutions for natural and built assets. We are more than 36,000 people, in over 70 countries, dedicated to improving quality of life. Everyone has an important role to play. With the power of many curious minds, together we can solve the world's most complex challenges and deliver more impact together.

Individual Accountabilities

Collaboration
Collaborates with domain architects in the DSS, OEA, EUS, and HaN towers and, if appropriate, the respective business stakeholders in architecting data solutions for their data service needs. Collaborates with the Data Engineering and Data Software Engineering teams to effectively communicate the data architecture to be implemented. Contributes to prototype or proof-of-concept efforts. Collaborates with the InfoSec organization to understand corporate security policies and how they apply to data solutions. Collaborates with the Legal and Data Privacy organization to understand the latest policies so they may be incorporated into every data architecture solution. Suggests architecture design with the Ontologies and MDM teams.

Technical Skills & Design
Significant experience working with structured and unstructured data at scale, and comfort with a variety of different stores (key-value, document, columnar, etc.) as well as traditional RDBMS and data warehouses. Deep understanding of modern data services in leading cloud environments, and able to select and assemble data services with maximum cost efficiency while meeting business requirements of speed, continuity, and data integrity. Creates data architecture artifacts such as architecture diagrams, data models, and design documents. Guides domain architects on the value of a modern data and analytics platform. Researches, designs, tests, and evaluates new technologies, platforms, and third-party products. Working experience with Azure Cloud, Data Mesh, MS Fabric, Ontologies, MDM, IoT, BI solutions, and AI would be a great asset. Expert troubleshooting skills and experience.

Leadership
Mentors aspiring data architects, typically operating in data engineering and software engineering roles.

Key Shared Accountabilities
Leads medium to large data services projects. Provides technical partnership to product owners. Shares stewardship, with domain architects, of the Arcadis data ecosystem. Actively participates in the Arcadis Tech Architect community.

Key Profile Requirements
Minimum of 7 years of experience in designing and implementing modern solutions as part of a variety of data ingestion and transformation pipelines. Minimum of 5 years of experience with best-practice design principles and approaches for a range of application styles and technologies to help guide and steer decisions. Experience working in large-scale development and cloud environments.

Why Arcadis
We can only achieve our goals when everyone is empowered to be their best. We believe everyone's contribution matters. It's why we are pioneering a skills-based approach, where you can harness your unique experience and expertise to carve your career path and maximize the impact we can make together. You'll do meaningful work, and no matter what role, you'll be helping to deliver sustainable solutions for a more prosperous planet. Make your mark, on your career, your colleagues, your clients, your life and the world around you. Together, we can create a lasting legacy. Join Arcadis. Create a Legacy.

Our Commitment to Equality, Diversity, Inclusion & Belonging
We want you to be able to bring your best self to work every day, which is why we take equality and inclusion seriously and hold ourselves to account for our actions. Our ambition is to be an employer of choice and provide a great place to work for all our people.

Posted 3 weeks ago

Apply

2.0 - 5.0 years

0 - 3 Lacs

Jaipur

Work from Office

Job Role: Data Engineer
Job Location: Jaipur
Job Type: Permanent
Experience Required: 2-5 years

As a Data Engineer, you will play a critical role in designing, developing, and maintaining our data pipelines and infrastructure. You will work closely with our data scientists, analysts, and other stakeholders to ensure data is accurate, timely, and accessible. Your contributions will directly impact our data-driven decision-making and support our growth.

Key Responsibilities:
Data Pipeline Development: Design, develop, and implement data pipelines using Azure Data Factory and Databricks to support the ingestion, transformation, and movement of data.
ETL Processes: Develop and optimize ETL (Extract, Transform, Load) processes to ensure efficient data flow and transformation.
Data Lake Management: Develop and maintain Azure Data Lake solutions, ensuring efficient storage and retrieval of large datasets.
Data Warehousing: Work with Azure Synapse Analytics to build and manage scalable data warehousing solutions that enable advanced analytics and reporting.
Data Integration: Integrate various data sources into MS Fabric, ensuring data consistency, quality, and accessibility across different platforms.
Performance Optimization: Optimize data processing workflows and storage solutions to improve performance and reduce costs.
Database Management: Manage and optimize databases (SQL and NoSQL) to support high-performance queries and data storage requirements.
Data Quality: Implement data quality checks and monitoring to ensure accuracy and consistency of data (a small example follows below).
Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver actionable insights.
Documentation: Create and maintain comprehensive documentation for data processes, pipelines, infrastructure, architecture, and best practices.
Troubleshooting and Support: Identify and resolve issues in data pipelines, data lakes, and warehousing solutions, providing timely support and maintenance.

Qualifications:
Experience: 2-4 years of experience in data engineering or a related field.
Technical Skills:
Proficiency with Azure Data Factory, Azure Synapse Analytics, Databricks, and Azure Data Lake.
Experience with Microsoft Fabric is a plus.
Strong SQL skills and experience with data warehousing (DWH) concepts.
Knowledge of data modeling, ETL processes, and data integration.
Experience with relational databases (e.g., MS SQL, PostgreSQL, MySQL).
Hands-on experience with ETL tools and frameworks (e.g., Apache Airflow, Talend).
Knowledge of big data technologies (e.g., Hadoop, Spark) is a plus.
Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and associated data services (e.g., S3, Redshift, BigQuery).
Familiarity with data visualization tools (e.g., Power BI) and experience with programming languages such as Python, Java, or Scala.
Experience with schema design and dimensional data modeling.
Analytical Skills: Strong problem-solving abilities and attention to detail.
Communication: Excellent verbal and written communication skills, with the ability to explain technical concepts to non-technical stakeholders.
Education: Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field. Advanced degrees or certifications are a plus.

Thanks & Regards,
Sulabh Tailang
HR - Talent Acquisition Manager | Celebal Technologies | +91-9448844746
Sulabh.tailang@celebaltech.com | LinkedIn: sulabhtailang | Twitter: Ersulabh
Website: www.celebaltech.com
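The data-quality responsibility above might look like the following minimal sketch: key-uniqueness, null-rate, and row-count drift checks before a dataset is published. Thresholds and column names are illustrative only.

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame, key: str, expected_rows: int) -> list:
    failures = []
    if df[key].duplicated().any():
        failures.append(f"duplicate values in key column '{key}'")
    null_rate = df[key].isna().mean()
    if null_rate > 0.0:
        failures.append(f"null rate {null_rate:.1%} in key column '{key}'")
    drift = abs(len(df) - expected_rows) / max(expected_rows, 1)
    if drift > 0.10:   # tolerate +/- 10% row-count drift
        failures.append(f"row count {len(df)} deviates {drift:.0%} from expected")
    return failures

# Toy frame with a deliberate duplicate key to show a failing check.
df = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, 5.5, 5.5]})
print(run_quality_checks(df, key="order_id", expected_rows=3))
```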

Posted 3 weeks ago

Apply

8.0 - 13.0 years

8 - 17 Lacs

Chennai

Remote

MS Fabric (Data Lake, OneLake, Lakehouse, Warehouse, Real-Time Analytics) and integration with Power BI, Synapse, and Azure Data Factory. DevOps knowledge and team-leading experience required.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

20 - 25 Lacs

Mohali, Pune, Bengaluru

Hybrid

Required Skills and Experience:
• Collaborate with cross-functional teams, including data analysts, data scientists, and business stakeholders, to understand their data requirements and deliver effective solutions.
• Leverage Fabric Lakehouse for data storage, governance, and processing to support Power BI and automation initiatives.
• Expertise in data modeling, with a strong focus on data warehouse and lakehouse design.
• Design and implement data models, warehouses, and databases using MS Fabric, Azure Synapse Analytics, Azure Data Lake Storage, and other Azure services.
• Develop ETL (Extract, Transform, Load) processes using SQL Server Integration Services (SSIS), Azure Synapse Pipelines, or similar tools to prepare data for analysis and reporting.
• Implement data quality checks and governance practices to ensure accuracy, consistency, and security of data assets.
• Monitor and optimize data pipelines and workflows for performance, scalability, and cost efficiency, utilizing Microsoft Fabric for real-time analytics and AI-powered workloads.
• Strong proficiency in Business Intelligence (BI) tools such as Power BI, Tableau, and other analytics platforms.
• Experience with data integration and ETL tools like Azure Data Factory.
• Proven expertise in Microsoft Fabric or similar data platforms.
• In-depth knowledge of the Azure Cloud Platform, particularly in data warehousing and storage solutions.

Posted 1 month ago

Apply

5.0 - 7.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Role Overview
We are seeking an experienced .NET Backend Developer with strong Azure Data Engineering skills to join our growing team in Hyderabad. You will work closely with cross-functional teams to build scalable backend systems, modern APIs, and data pipelines using cutting-edge tools like Azure Databricks and MS Fabric.

Technical Skills (Must-Have)
Strong hands-on experience in C#, SQL Server, and OOP concepts
Proficiency with .NET Core, ASP.NET Core, Web API, Entity Framework (v6 or above)
Strong understanding of Microservices Architecture
Experience with Azure Cloud technologies, including Data Engineering, Azure Databricks, MS Fabric, Azure SQL, Blob Storage, etc.
Experience with Snowflake or similar cloud data platforms
Experience working with NoSQL databases
Skilled in database performance tuning and design patterns
Working knowledge of Agile methodologies
Ability to write reusable libraries and modular, maintainable code
Excellent verbal and written communication skills (especially with US counterparts)
Strong troubleshooting and debugging skills

Nice-to-Have Skills
Experience with Angular, MongoDB, NPM
Familiarity with Azure DevOps CI/CD pipelines for build and release configuration
Self-starter attitude with strong analytical and problem-solving abilities
Willingness to work extra hours when needed to meet tight deadlines

Why Join Us
Work with a passionate, high-performing team
Opportunity to grow your technical and leadership skills in a dynamic environment
Be part of global digital transformation initiatives with top-tier clients
Exposure to real-world enterprise data systems
Opportunity to work on cutting-edge Azure and cloud technologies
Performance-based growth and internal mobility opportunities

Posted 1 month ago

Apply

7.0 - 12.0 years

8 - 14 Lacs

Bengaluru

Work from Office

Individual Accountabilities

Collaboration
Collaborates with domain architects in the DSS, OEA, EUS, and HaN towers and, if appropriate, the respective business stakeholders in architecting data solutions for their data service needs. Collaborates with the Data Engineering and Data Software Engineering teams to effectively communicate the data architecture to be implemented. Contributes to prototype or proof-of-concept efforts. Collaborates with the InfoSec organization to understand corporate security policies and how they apply to data solutions. Collaborates with the Legal and Data Privacy organization to understand the latest policies so they may be incorporated into every data architecture solution. Suggests architecture design with the Ontologies and MDM teams.

Technical Skills & Design
Significant experience working with structured and unstructured data at scale, and comfort with a variety of different stores (key-value, document, columnar, etc.) as well as traditional RDBMS and data warehouses. Deep understanding of modern data services in leading cloud environments, and able to select and assemble data services with maximum cost efficiency while meeting business requirements of speed, continuity, and data integrity. Creates data architecture artifacts such as architecture diagrams, data models, and design documents. Guides domain architects on the value of a modern data and analytics platform. Researches, designs, tests, and evaluates new technologies, platforms, and third-party products. Working experience with Azure Cloud, Data Mesh, MS Fabric, Ontologies, MDM, IoT, BI solutions, and AI would be a great asset. Expert troubleshooting skills and experience.

Leadership
Mentors aspiring data architects, typically operating in data engineering and software engineering roles.

Key Shared Accountabilities
Leads medium to large data services projects. Provides technical partnership to product owners. Shares stewardship, with domain architects, of the Arcadis data ecosystem. Actively participates in the Arcadis Tech Architect community.

Key Profile Requirements
Minimum of 7 years of experience in designing and implementing modern solutions as part of a variety of data ingestion and transformation pipelines. Minimum of 5 years of experience with best-practice design principles and approaches for a range of application styles and technologies to help guide and steer decisions. Experience working in large-scale development and cloud environments.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

We Are Hiring: Senior .NET Backend Developer with Azure Data Engineering Experience
Job Location: Hyderabad, India
Work Mode: Onsite Only
Experience: Minimum 6+ Years
Qualification: B.Tech, B.E, MCA, M.Tech

Role Overview
We are seeking an experienced .NET Backend Developer with strong Azure Data Engineering skills to join our growing team in Hyderabad. You will work closely with cross-functional teams to build scalable backend systems, modern APIs, and data pipelines using cutting-edge tools like Azure Databricks and MS Fabric.

Technical Skills (Must-Have)
Strong hands-on experience in C#, SQL Server, and OOP concepts
Proficiency with .NET Core, ASP.NET Core, Web API, Entity Framework (v6 or above)
Strong understanding of Microservices Architecture
Experience with Azure Cloud technologies, including Data Engineering, Azure Databricks, MS Fabric, Azure SQL, Blob Storage, etc.
Experience with Snowflake or similar cloud data platforms
Experience working with NoSQL databases
Skilled in database performance tuning and design patterns
Working knowledge of Agile methodologies
Ability to write reusable libraries and modular, maintainable code
Excellent verbal and written communication skills (especially with US counterparts)
Strong troubleshooting and debugging skills

Nice-to-Have Skills
Experience with Angular, MongoDB, NPM
Familiarity with Azure DevOps CI/CD pipelines for build and release configuration
Self-starter attitude with strong analytical and problem-solving abilities
Willingness to work extra hours when needed to meet tight deadlines

Why Join Us
Work with a passionate, high-performing team
Opportunity to grow your technical and leadership skills in a dynamic environment
Be part of global digital transformation initiatives with top-tier clients
Exposure to real-world enterprise data systems
Opportunity to work on cutting-edge Azure and cloud technologies
Performance-based growth and internal mobility opportunities

Tags: DotNetDeveloper, BackendDeveloper, AzureDataEngineering, Databricks, MSFabric, Snowflake, Microservices, CSharpJobs, HyderabadJobs, FullTimeJob, HiringNow, EntityFramework, ASPNetCore, CloudEngineering, SQLJobs, DevOps, DotNetCore, BackendJobs, SuzvaCareers, DataPlatformDeveloper, SoftwareJobsIndia

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Design, develop, and maintain cloud infrastructure using Azure and MS Fabric: Architect and implement cloud solutions leveraging Microsoft Azure services and MS Fabric. Ensure the infrastructure supports scalability, reliability, performance, and cost efficiency.

Integrate containerization and orchestration technologies: Utilize Kubernetes and Docker for containerization and orchestration. Manage and optimize Azure Kubernetes Service (AKS) deployments.

Implement DevOps practices and automation: Develop CI/CD pipelines to automate code deployment and infrastructure provisioning. Use automation tools and Terraform to streamline operations and reduce manual intervention.

Collaborate with development teams to build and deploy cloud-native applications: Provide guidance and support for designing and implementing cloud-native applications. Ensure applications are optimized for cloud environments.

Monitor, troubleshoot, and optimize cloud infrastructure: Implement monitoring and alerting systems to ensure infrastructure health. Optimize resource usage and performance to reduce costs and improve efficiency. Develop cost optimization strategies for efficient use of Azure resources. Troubleshoot and resolve issues quickly to minimize impact on users. Ensure high availability and uptime of applications.

Enhance system security and compliance: Implement security best practices and ensure compliance with industry standards. Perform regular security assessments and audits.

Education
University background: Bachelor's/Master's degree in computer science and information systems or related engineering.

Behavioral Competencies
Outstanding technical leader with proven hands-on experience in configuration and deployment of DevOps toward successful delivery. Innovative and aligned to new product development technologies and methods. Excellent communication skills, able to guide, influence, and convince others in a matrix organization. Demonstrated teamwork and collaboration in a professional setting. Proven capabilities with worldwide teams. Prior experience working with European customers is preferable but not mandatory.

Requirements
5 to 10 years in IT and/or digital companies or startups. Knowledge of Ansible. Extensive knowledge of cloud technologies, particularly Microsoft Azure and MS Fabric. Proven experience with containerization and orchestration tools such as Kubernetes and Docker. Experience with Azure Kubernetes Service (AKS), Terraform, and DevOps practices. Strong automation skills, including scripting and using automation tools. Proven track record in designing and implementing cloud infrastructure. Experience in optimizing cloud resource usage and performance. Proven experience in Azure cost optimization strategies. Proven experience ensuring uptime of applications and rapid troubleshooting in case of failures. Strong understanding of security best practices and compliance standards. Proven experience providing technical guidance to teams. Proven experience in managing customer expectations. Proven track record of driving decisions collaboratively, resolving conflicts, and ensuring follow-through. Extensive knowledge of software development and system operations. Proven experience in designing stable solutions, testing, and debugging. Demonstrated technical guidance with worldwide teams. Proficient in English; proficiency in French is a plus.

Performance Measurements
On-Time Delivery (OTD). Infrastructure reliability and availability. Cost optimization and efficiency. Application uptime and failure resolution.

Posted 1 month ago

Apply

4.0 - 6.0 years

4 - 6 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Generate CPG business insights and reports using Excel, PowerPoint, and Power BI. Develop and maintain Power BI reports, dashboards, and visualizations that provide meaningful insights to stakeholders. Create comprehensive content and presentations to support business decisions and strategies. Extract and analyze data from NIQ, Circana, Spins, and other relevant sources. Work with cross-functional teams to develop and implement data-driven solutions, including data visualizations, reports, and dashboards. Manage analytics projects and work streams, and build dashboards and reports. Provide expert-level support to stakeholders on analytics and data visualization. Present findings and recommendations to stakeholders in a clear and concise manner.

Required Education: Bachelor's degree
Preferred Education: Master's degree

Required Technical and Professional Expertise
B.Tech, Bachelor's, or Master's degree in Computer Science, Science, or relevant education. 4-6 years of experience in data analysis or a related field. Proficiency in MS Fabric, Power BI, SQL, and Excel, and experience with NIQ, Circana, and Spins.

Preferred Technical and Professional Experience
Strong analytical skills and attention to detail. Excellent communication and presentation abilities. Ability to manage multiple tasks and meet deadlines. Experience in the CPG industry is preferred.

Posted 1 month ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Design, develop, and implement data solutions using Microsoft Fabric, including data pipelines, data warehouses, and data marts. Develop data pipelines, data transformations, and data workflows using Microsoft Fabric.

Posted 1 month ago

Apply

3.0 - 8.0 years

9 - 19 Lacs

Bengaluru, Delhi / NCR

Work from Office

Key Responsibilities:
Lead the implementation and optimization of Microsoft Purview across the client's data estate on MS Fabric/Azure Cloud Platform (e.g., ADF or Databricks). Define and enforce data governance policies, data classification, sensitivity labeling, and data lineage to ensure readiness for GenAI use cases. Collaborate with data engineers, architects, and AI/ML teams to ensure data discoverability, compliance, and ethical AI readiness. Design and implement data cataloging strategies to support GenAI model training and inference. Provide guidance on data access controls, privacy, and regulatory compliance (e.g., GDPR, HIPAA). Conduct workshops and training sessions for client stakeholders on Purview capabilities and best practices. Monitor and report on data governance KPIs and GenAI readiness metrics.

Required Skills & Qualifications:
Proven experience as a Microsoft Purview SME in enterprise environments. Strong knowledge of Microsoft Fabric, OneLake, and Synapse Data Engineering. Experience with data governance frameworks and metadata management. Hands-on experience with data classification, sensitivity labels, and data lineage tracking. Understanding of compliance standards and data privacy regulations. Excellent communication and stakeholder management skills.

Preferred Qualifications:
Microsoft certifications in Azure Data, Purview, or Security & Compliance. Experience working with Azure OpenAI, Copilot integrations, or other GenAI platforms. Background in data science, AI ethics, or ML operations is a plus.

Posted 2 months ago

Apply

2.0 - 5.0 years

3 - 8 Lacs

Bengaluru

Work from Office

Job Title: Power BI Developer
Experience: 2-3 Years
Location: Bangalore - Indiranagar (Work from Office Only)
Employment Type: Full-Time

Job Description:
We are looking for a Power BI Developer with 2-3 years of hands-on experience in designing and developing BI reports and dashboards using Power BI. Candidates with experience in Microsoft Fabric will be given preference. Strong communication skills are essential, as the role involves close collaboration with cross-functional teams.

Key Responsibilities:
Develop, design, and maintain interactive dashboards and reports in Power BI. Work closely with stakeholders to gather requirements and translate them into effective data visualizations. Optimize data models for performance and usability. Implement row-level security and data governance best practices. Stay updated with Power BI and MS Fabric capabilities and best practices.

Requirements:
2-3 years of hands-on Power BI development experience. Familiarity with Power Query, DAX, and data modeling techniques. Experience in Microsoft Fabric is a plus. Strong analytical and problem-solving skills. Excellent verbal and written communication skills.

Interested candidates, kindly share your CV and the details below to usha.sundar@adecco.com:
1) Present CTC (Fixed + VP)
2) Expected CTC
3) No. of years' experience
4) Notice period
5) Offer in hand
6) Reason for change
7) Present location

Posted 2 months ago

Apply

3.0 - 4.0 years

5 - 6 Lacs

Hyderabad

Work from Office

Overview
This role serves as an Associate Analyst on the GTM Data Analytics COE project development team. The role is a go-to resource for building and maintaining the key reports, data pipelines, and advanced analytics necessary to bring insights to light for senior leaders and sector and field end users.

Responsibilities
The COE's core competencies are a mastery of data visualization, data engineering, data transformation, and predictive and prescriptive analytics. Enhance data discovery, processes, testing, and data acquisition from multiple platforms. Apply detailed knowledge of PepsiCo's applications for root-cause problem-solving. Ensure compliance with PepsiCo IT governance rules and design best practices. Participate in project planning with stakeholders to analyze business opportunities and define end-to-end processes. Translate operational requirements into actionable data presentations. Support data recovery and integrity issue resolution between business and PepsiCo IT. Provide performance reporting for the GTM function, including ad-hoc requests using internal shipment data systems. Develop on-demand reports and scorecards for improved agility and visualization. Collate and analyze large data sets to extract meaningful insights on performance trends and opportunities. Present insights and recommendations to the GTM Leadership team regularly. Manage expectations through effective communication with headquarters partners. Ensure timely and accurate data delivery per service level agreements (SLAs). Collaborate across functions to gather insights for action-oriented analysis. Identify and act on opportunities to improve work delivery. Implement process improvements, reporting standardization, and optimal technology use. Foster an inclusive and collaborative environment. Provide baseline support for monitoring SPA mailboxes, work intake, and other ad-hoc requests and queries.

Qualifications
Undergraduate degree in Business or related technology. 3-4 years of working experience in Power BI. 1-2 years of working experience in SQL and Python.

Preferred Qualifications
Information technology or analytics experience is a plus. Familiarity with Power BI/Tableau, Python, SQL, Teradata, Azure, and MS Fabric. Requires analytical, critical-thinking, and problem-solving skills, as well as great attention to detail. Strong time management skills and the ability to multitask, set priorities, and plan.

Posted 2 months ago

Apply

4.0 - 9.0 years

0 - 25 Lacs

Hyderabad, Pune, Greater Noida

Work from Office

Roles and Responsibilities:
Design, develop, test, deploy, and maintain large-scale data pipelines using Azure Data Factory (ADF) to integrate various data sources into a centralized platform. Collaborate with cross-functional teams to gather requirements for data integrations and ensure seamless delivery of high-quality solutions. Develop complex SQL queries to extract insights from large datasets stored in relational databases such as PostgreSQL or MySQL (see the sketch below). Troubleshoot issues related to data pipeline failures, identify root causes, and implement fixes to prevent future occurrences.

Job Requirements:
4-9 years of experience in designing and developing data integration solutions using ADF or similar tools like Informatica PowerCenter or Talend Open Studio. Strong understanding of Microsoft Azure services, including storage options (e.g., Blob Storage), compute resources (e.g., Virtual Machines), and networking concepts (e.g., VPN). Proficiency in writing complex SQL queries against large datasets stored in relational databases such as PostgreSQL or MySQL.
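For flavor, a hedged example of the "complex SQL" extraction this posting describes: a window-function query ranking customers by monthly spend in PostgreSQL, run via psycopg2. Connection details and table names are placeholders.

```python
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="analytics",
                        user="etl_user", password="<secret>")   # placeholders
query = """
    SELECT month, customer_id, total_spend,
           RANK() OVER (PARTITION BY month ORDER BY total_spend DESC) AS spend_rank
    FROM (
        SELECT date_trunc('month', order_date) AS month,
               customer_id,
               SUM(amount) AS total_spend
        FROM sales.orders
        GROUP BY 1, 2
    ) monthly
    ORDER BY month, spend_rank;
"""
with conn, conn.cursor() as cur:
    cur.execute(query)
    for row in cur.fetchmany(10):   # inspect the top of the result set
        print(row)
conn.close()
```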

Posted 2 months ago

Apply

8 - 12 years

19 - 30 Lacs

Pune, Bengaluru

Work from Office

About Position:
We at Persistent are looking for a Data Engineering Lead with experience in MS Fabric, SQL, and Python, along with knowledge of data extraction and ETL processes.

Role: Data Engineering Lead
Location: Pune, Bangalore
Experience: 8+ years
Job Type: Full-Time Employment

What You'll Do:
Work with business to understand business requirements and translate them into low-level design. Design and implement robust, fault-tolerant, scalable, and secure data pipelines using PySpark notebooks in MS Fabric. Review code of peers and mentor junior team members. Participate in sprint planning and other agile ceremonies. Drive automation and efficiency in data ingestion, data movement, and data access workflows. Contribute ideas to help ensure that required standards and processes are in place, and actively look for opportunities to enhance standards and improve process efficiency.

Expertise You'll Bring:
Around 8 to 12 years of experience, with at least 1 year in MS Fabric and Azure cloud.
Leadership: Ability to lead and mentor junior data engineers and help with planning and estimations.
Data migration: Experience migrating and re-modeling large enterprise data from legacy warehouses to a Lakehouse (Delta Lake) on MS Fabric or Databricks.
Strong data engineering skills: Proficiency in data extraction, transformation, and loading (ETL) processes, data modeling, and database management, plus experience setting up pipelines using notebooks and ADF and setting up monitoring and alert notifications.
Experience with data lake technologies: MS Fabric, Azure, Databricks, Python, an orchestration tool like Apache Airflow or Azure Data Factory, and Azure Synapse, along with stored procedures and Azure Data Lake Storage.
Data integration knowledge: Familiarity with data integration techniques, including batch processing, streaming, real-time data ingestion, auto-loader, change data capture, and creation of fact and dimension tables.
Programming skills: Proficiency in SQL, Python, and PySpark for data manipulation and transformation. DP-700 certification is preferred.

Benefits:
Competitive salary and benefits package. Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications. Opportunity to work with cutting-edge technologies. Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards. Annual health check-ups. Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents.

Inclusive Environment:
Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.

Our company fosters a values-driven and people-centric work environment that enables our employees to: accelerate growth, both professionally and personally; impact the world in powerful, positive ways, using the latest technologies; enjoy collaborative innovation, with diversity and work-life wellbeing at the core; and unlock global opportunities to work and learn with the industry's best. Let's unleash your full potential at Persistent.

"Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."

Posted 2 months ago

Apply