104 Synapse Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

9.0 - 13.0 years

0 Lacs

chennai, tamil nadu

On-site

You are an experienced Data Engineering Manager responsible for leading a team of 10+ engineers in Chennai, Tamil Nadu, India. Your primary role is to build scalable data marts and Power BI dashboards to measure marketing campaign performance. Your deep expertise in Azure, Microsoft Fabric, and Power BI, combined with strong leadership skills, enables you to drive data initiatives that facilitate data-driven decision-making for the marketing team.

Your key responsibilities include managing and mentoring the data engineering and BI developer team, overseeing the design and implementation of scalable data marts and pipelines, and leading the development of insightful Power BI dashboards. You collaborate closely with marketing and business stakeholders to gather requirements, align on metrics, and deliver actionable insights. Additionally, you lead project planning, prioritize analytics projects, and ensure timely and high-impact outcomes using Agile methodologies. You are accountable for ensuring data accuracy, lineage, and compliance through robust validation, monitoring, and governance practices, and you promote the adoption of modern Azure/Microsoft Fabric capabilities and industry best practices in data engineering and BI. Cost and resource management are also part of your responsibilities: you optimize infrastructure and licensing costs and manage external vendors or contractors if needed.

Your expertise in Microsoft Fabric, Power BI, Azure (Data Lake, Synapse, Data Factory, Azure Functions), data modeling, data pipeline development, SQL, and marketing analytics is crucial for success in this role. Proficiency in Agile project management, data governance, data quality monitoring, Git, stakeholder management, and performance optimization is also required.

This permanent position requires 9 to 12 years of experience in the Data Engineering domain. If you are passionate about driving data initiatives, leading a team of engineers, and collaborating with stakeholders to deliver impactful analytics solutions, this role offers an exciting opportunity to make a significant impact in the marketing analytics space at the Chennai location.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

haryana

On-site

You will be responsible for developing applications using Microsoft and web development technologies such as ASP.NET, C#, MVC, Web Forms, Angular, SQL Server, T-SQL, and microservices. Your expertise in big data technologies like Hadoop, Spark, Hive, Python, and Databricks will be crucial for this role.

With a Bachelor's degree in Computer Science or equivalent experience through higher education, you should have at least 8 years of experience in Data Engineering and/or Software Engineering. Your strong coding skills, along with knowledge of infrastructure as code and automating production data and ML pipelines, will be highly valued. You should be proficient in on-prem to cloud migration, particularly to Azure, and have hands-on experience with Azure PaaS offerings such as Synapse, ADLS, Databricks, Event Hubs, Cosmos DB, and Azure ML. Experience in building, governing, and scaling data warehouses, lakes, and lakehouses is essential for this role.

Your expertise in developing and tuning stored procedures and T-SQL scripting in SQL Server, along with familiarity with various .NET development tools and products, will contribute significantly to the success of the projects. You should be adept with the agile software development lifecycle and DevOps principles to ensure efficient project delivery.
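As a hedged illustration of the stored-procedure work this posting describes, here is a minimal Python sketch that executes a parameterized T-SQL procedure with pyodbc; the server, database, and procedure names are invented for the example:

```python
# Hypothetical example: calling a parameterized T-SQL stored procedure from Python.
# Server, database, and procedure names are illustrative assumptions, not from the posting.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=SalesDW;"
    "Authentication=ActiveDirectoryInteractive;"
)
cursor = conn.cursor()

# EXEC with ? placeholders keeps the call plan-cache friendly and injection-safe.
cursor.execute(
    "EXEC dbo.usp_LoadDailySales @LoadDate = ?, @Region = ?",
    ("2024-01-31", "APAC"),
)
conn.commit()
conn.close()
```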

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

As a Developer (Japanese-speaking), you will be responsible for supporting projects in the Japan region by utilizing your technical expertise and Japanese language proficiency. This hands-on role will primarily focus on backend development, data engineering, and cloud technologies. Candidates with prior experience working in Japan, or those looking to relocate from Japan, are highly desired.

Your key responsibilities will include designing and developing ETL/ELT pipelines using Azure or equivalent cloud platforms, collaborating with Japanese-speaking stakeholders and internal teams, working with Azure Data Factory, Synapse, Data Lake, and Power BI for data integration and reporting, and participating in technical discussions, requirement gathering, and solution design. You will also be expected to ensure timely delivery of project milestones while upholding code quality and documentation standards.

To excel in this role, you should possess 3-5 years of experience in data engineering or backend development; proficiency in SQL, Python, and ETL/ELT processes; hands-on experience with Azure Data Factory, Synapse, Data Lake, and Power BI; a strong understanding of cloud architecture (preferably Azure, though AWS/GCP are acceptable); and at least JLPT N3 certification, with N2 or N1 preferred, to communicate effectively in Japanese. A Bachelor's degree in Computer Science, Engineering, or a related field is also required.

Preferred candidates include individuals who have worked in Japan for at least 2 years and are now relocating to India, or those currently based in Japan and planning to relocate within a month. While Bangalore is the preferred location, Kochi is also acceptable for relocation. Candidates from other regions, such as Noida or Gurgaon, will not be considered unless relocation to the specified locations is confirmed.

The interview process will consist of a technical evaluation conducted by Suresh Varghese in the first round, followed by a Japanese language proficiency assessment. At least one round must be conducted face-to-face for shortlisted candidates to assess their suitability for the role.

Posted 1 week ago

Apply

4.0 - 7.0 years

7 - 17 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Work from Office

Key Responsibilities:
- Requirement gathering and analysis
- Experience with different databases like Synapse, SQL DB, Snowflake, etc.
- Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse
- Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage
- Implement data security and governance measures
- Monitor and optimize data pipelines for performance and efficiency
- Troubleshoot and resolve data engineering issues
- Provide optimized solutions for data engineering problems
- Ability to work with a variety of sources: relational databases, APIs, file systems, real-time streams, CDC, etc.
- Strong knowledge of Databricks and Delta tables (see the sketch after this list)

Required Skills:
- 4-8 years of experience in Data Engineering or related roles
- Hands-on experience in Azure Databricks, ADF, or Synapse Analytics
- Proficiency in Python for data processing and scripting
- Strong command of SQL: writing complex queries, performance tuning, etc.
- Experience working with Azure Data Lake Storage and data warehouse concepts (e.g., dimensional modeling, star/snowflake schemas)
- Understanding of CI/CD practices in a data engineering context
- Excellent problem-solving and communication skills

Good to Have:
- Hands-on experience in Microsoft Fabric, Logic Apps, and Azure OpenAI basics
- Experience with Delta Lake, Power BI, or Azure DevOps
- Knowledge of Spark, Scala, or other distributed processing frameworks
- Exposure to BI tools like Power BI, Tableau, or Looker
- Familiarity with data security and compliance in the cloud
- Experience in leading a development team
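Given the emphasis on Databricks and Delta tables above, here is a hedged PySpark sketch of a Delta merge (upsert); the storage path, table name, and key column are assumptions, not taken from the listing:

```python
# A minimal Delta Lake upsert sketch for Databricks; table and column names are assumptions.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# New or changed rows landed in the lake by an upstream ADF/ingestion step.
updates = spark.read.parquet("abfss://raw@mylake.dfs.core.windows.net/orders/")

# Merge into the governed Delta table: update matches, insert new keys.
target = DeltaTable.forName(spark, "analytics.orders")
(
    target.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```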

Posted 1 week ago

Apply

7.0 - 12.0 years

17 - 27 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Work from Office

Key Responsibilities:
- Requirement gathering and analysis
- Design of the data architecture and data model to ingest data
- Experience with different databases like Synapse, SQL DB, Snowflake, etc.
- Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse
- Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage
- Implement data security and governance measures
- Monitor and optimize data pipelines for performance and efficiency
- Troubleshoot and resolve data engineering issues
- Hands-on experience with Azure Functions and other components such as real-time streaming (see the sketch after this list)
- Oversee Azure billing processes, conducting analyses to ensure cost-effectiveness and efficiency in data operations
- Provide optimized solutions for data engineering problems
- Ability to work with a variety of sources: relational databases, APIs, file systems, real-time streams, CDC, etc.
- Strong knowledge of Databricks and Delta tables
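For the Azure Functions item above, a minimal hedged sketch of an HTTP-triggered function in Python (v1 programming model; a real deployment also needs its function.json binding, and the payload handling here is purely illustrative):

```python
# A minimal HTTP-triggered Azure Function sketch (Python v1 model); names are illustrative.
import json
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Validate the incoming JSON payload; a real pipeline would push it to a queue or Delta table.
    try:
        payload = req.get_json()
    except ValueError:
        return func.HttpResponse("Invalid JSON", status_code=400)
    return func.HttpResponse(
        json.dumps({"received": payload}), mimetype="application/json"
    )
```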

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

Diageo's ambition is to be one of the best performing, most trusted, and respected consumer products companies in the world. The strategy is to support premiumisation in developed and emerging countries by offering a broad portfolio across different consumer occasions and price points. This approach also plays a crucial role in shaping responsible drinking trends in markets where international premium spirits are an emerging category.

As a member of Diageo's Analytics & Insights team, you will be instrumental in designing, developing, and implementing analytics products to drive the company's competitive advantage and facilitate data-driven decisions. Your role will involve advancing the sophistication of analytics throughout Diageo, serving as a data evangelist to empower stakeholders, identifying meaningful insights from vast data sources, and communicating findings to drive growth, enhance consumer experiences, and optimize business processes. While the role does not entail budget ownership, understanding architecture resource costs is necessary. You will support global initiatives and functions across various markets, working closely with key stakeholders to create possibilities, foster conditions for success, promote personal and professional growth, and maintain authenticity in all interactions.

The purpose of the role includes owning and developing a domain-specific data visualization product portfolio, ensuring compliance with technological and business priorities, and contributing to the end-to-end build of analytics products meeting enterprise standards. You will lead agile teams in developing robust BI solutions, provide technical guidance, oversee data flow, and collaborate with internal and external partners to deliver innovative solutions. Your top accountabilities will involve technical leadership in analytics product builds, optimization of data visualization architecture, BAU support, and feedback to enhance data model standards. Business acumen is essential, particularly in working with marketing data and building relationships with stakeholders to drive data-led innovation.

Required qualifications include multiple years of experience in BI solution development, a bachelor's degree in a relevant field, hands-on experience as a lead developer, proficiency in DAX and M, knowledge of Azure architecture, and expertise in data acquisition and processing. Additionally, experience with the Azure platform, technical documentation, DevOps solutions, and Agile methodologies, plus a willingness to deepen solution architecture skills, are vital. Experience with structured and unstructured datasets, design collaboration, user experience best practices, and visualization trends is advantageous. A dynamic personality, proficiency in English, and excellent communication skills are key for success in this role.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

You will join our data engineering and business intelligence team as an SSIS (SQL Server Integration Services) and SSAS (SQL Server Analysis Services) Developer. Your primary responsibilities will include designing, developing, deploying, and maintaining SSIS packages for ETL processes and managing SSAS cubes for advanced analytics and reporting. Collaboration with business analysts, data architects, and stakeholders to grasp data requirements will be essential, and you will optimize existing ETL processes for improved performance, scalability, and reliability. Additional tasks include creating and maintaining technical documentation, monitoring ETL workflows, troubleshooting issues, implementing data quality checks, and performing data validation and unit testing. Integrating SSIS/SSAS with reporting tools like Power BI and Excel, and participating in code reviews, sprint planning, and agile development, are also part of your responsibilities.

A Bachelor's degree in Computer Science, Information Systems, or a related field, along with at least 3 years of hands-on experience with SSIS and SSAS, is required. Strong proficiency in SQL Server and T-SQL and in building both multidimensional and tabular SSAS models is necessary, as is a deep understanding of data warehousing concepts, star/snowflake schemas, ETL best practices, and performance tuning in SSIS and SSAS. Proficiency in data visualization tools such as Power BI or Excel (PivotTables) is preferred. Experience with Azure Data Factory, Synapse, or other cloud-based data services; exposure to DevOps CI/CD pipelines for SSIS/SSAS deployments; familiarity with the MDX and DAX query languages; and certification in the Microsoft SQL Server BI stack will be advantageous. Strong analytical and problem-solving skills, effective communication and collaboration abilities, and the capacity to work independently while managing multiple tasks are the qualities we are looking for in the ideal candidate.
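Since the posting stresses data quality checks and validation/unit testing around ETL loads, here is a small hedged sketch of such checks in Python with pandas; the file path, columns, and rules are assumptions:

```python
# A minimal post-ETL validation sketch with pandas; paths, columns, and rules are assumptions.
import pandas as pd

df = pd.read_parquet("staging/fact_sales.parquet")

# Each check mirrors a rule a validation step might enforce before publishing to the cube.
checks = {
    "no_null_keys": df["customer_id"].notna().all(),
    "positive_amounts": (df["net_amount"] > 0).all(),
    "unique_order_lines": not df.duplicated(["order_id", "line_no"]).any(),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
```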

Posted 1 week ago

Apply

4.0 - 7.0 years

15 - 27 Lacs

Bengaluru

Hybrid

Key Responsibilities:
- Design, develop, and maintain interactive dashboards and reports in Power BI.
- Utilize Microsoft Fabric (including OneLake, Lakehouse, Dataflows Gen2, and Pipelines) to build scalable data solutions.
- Integrate data from multiple sources using Fabric Data Factory pipelines, Synapse Real-Time Analytics, and Power Query.
- Implement and optimize data models, measures (DAX), and ETL processes.
- Collaborate with data engineers, analysts, and stakeholders to understand data needs and deliver actionable insights.
- Ensure data governance, security, and compliance using Microsoft Purview and Fabric's built-in governance tools.
- Perform performance tuning, dataset optimization, and report deployment across workspaces (see the sketch after this list).
- Document technical solutions and provide user training/support when necessary.

Good to Have:
- Microsoft Certified: Fabric Analytics Engineer or Power BI Data Analyst Associate.
- Knowledge of Azure Data Services (Data Factory, Synapse, Azure SQL).
- Experience with Row-Level Security (RLS) and large dataset optimization in Power BI.
- Familiarity with GitHub or Azure DevOps for version control.
- Exposure to real-time streaming data and KQL (Kusto) queries.

Job Requirements:
- Strong experience with Power BI, including DAX, Power Query, and Fabric.
- Proficiency in SQL and data modeling techniques.
- Experience with Azure services (e.g., Synapse, Data Factory).
- Ability to optimize Power BI reports for performance.
- Excellent communication and problem-solving skills.
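As a hedged companion to the refresh and deployment duties above, a minimal Python sketch that queues a dataset refresh through the documented Power BI REST API /refreshes route; the workspace and dataset IDs and the token acquisition are assumptions:

```python
# A hedged sketch of triggering a Power BI dataset refresh via the REST API.
# The /refreshes route is documented; IDs and the token source are assumptions.
import requests

token = "<AAD access token with Dataset.ReadWrite.All>"  # assumed to be acquired elsewhere
group_id = "<workspace-id>"
dataset_id = "<dataset-id>"

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/refreshes",
    headers={"Authorization": f"Bearer {token}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
```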

Posted 2 weeks ago

Apply

10.0 - 15.0 years

30 - 45 Lacs

Hyderabad

Hybrid

Job Title: IT - Lead Architect (Architect AI)
Years of Experience: 10-15 years
Mandatory Skills: Data Architecture, Team Leadership, AI/ML Expertise, Azure, SAP
Good to Have: Visualization, Python

Key Responsibilities:
- Lead a team of architects and engineers focused on strategic Azure architecture and AI projects.
- Develop and maintain the company's data architecture strategy and lead design/architecture validation reviews.
- Drive the adoption of new AI/ML technologies and assess their impact on data strategy.
- Architect scalable data flows, storage, and analytics platforms, ensuring secure and cost-effective solutions.
- Establish data governance frameworks and promote best practices for data quality.
- Act as a technical advisor on complex data projects and collaborate with stakeholders.
- Work with technologies including SQL, Synapse, Databricks, Power BI, Fabric, Python, SQL Server, and NoSQL.

Required Qualifications & Experience:
- Bachelor's or Master's degree in Computer Science or a related field.
- At least 5 years in a leadership role in data architecture.
- Expert in Azure, Databricks, and Synapse.
- Proven experience leading technical teams and strategic projects, specifically designing and implementing AI solutions within data architectures.
- Deep knowledge of cloud data platforms (Azure, Fabric, Databricks, AWS), data modeling, ETL/ELT, big data, relational/NoSQL databases, and data security.
- 5 years of experience in AI model design and deployment.
- Strong experience in solution architecture.
- Excellent communication, stakeholder management, and problem-solving skills.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Hybrid

Job Title: IT - Lead Engineer/Architect, Azure Lake
Years of Experience: 8-10 years
Mandatory Skills: Azure, Data Lake, Databricks, SAP BW

Key Responsibilities:
- Lead the development and maintenance of the data architecture strategy, including design and architecture validation reviews with all stakeholders.
- Architect scalable data flows, storage, and analytics platforms in cloud/hybrid environments, ensuring secure, high-performing, and cost-effective solutions.
- Establish comprehensive data governance frameworks and promote best practices for data quality and enterprise compliance.
- Act as a technical leader on complex data projects and drive the adoption of new technologies, including AI/ML.
- Collaborate extensively with business stakeholders to translate needs into architectural solutions and define project scope.
- Support a wide range of data lake and lakehouse technologies (SQL, Synapse, Databricks, Power BI, Fabric).

Required Qualifications & Experience:
- Bachelor's or Master's degree in Computer Science or a related field.
- At least 3 years in a leadership role in data architecture.
- Proven ability leading architecture/AI/ML projects from conception to deployment.
- Deep knowledge of cloud data platforms (Microsoft Azure, Fabric, Databricks), data modeling, ETL/ELT, big data, relational/NoSQL databases, and data security.
- Experience in designing and implementing AI solutions within cloud architecture.
- 3 years as a project lead on large-scale projects.
- 5 years of development experience with Azure, Synapse, and Databricks.
- Excellent communication and stakeholder management skills.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

7 - 17 Lacs

Hyderabad

Hybrid

Job Title: IT - Senior Engineer, Azure Lake
Years of Experience: 4-6 years
Mandatory Skills: Azure, Data Lake, SAP BW, Power BI, Tableau

Key Responsibilities:
- Develop and maintain the data architecture strategy, including design and architecture validation reviews.
- Architect scalable data flows, storage, and analytics platforms in cloud/hybrid environments, ensuring secure and cost-effective solutions.
- Establish and enforce data governance frameworks, promoting data quality and compliance.
- Act as a technical advisor on complex data projects and collaborate with stakeholders on project scope and planning.
- Drive adoption of new technologies, conduct technology watch, and define standards for data management.
- Develop using SQL, Synapse, Databricks, Power BI, and Fabric.

Required Qualifications & Experience:
- Bachelor's or Master's degree in Computer Science or a related field.
- Experience in data architecture, with at least 3 years in a leadership role.
- Deep knowledge of Azure/AWS, Databricks, Synapse, and other cloud data platforms.
- Understanding of SAP technologies (SAP BW, SAP Datasphere, HANA, S/4, ECC) and visualization tools (Power BI, Tableau).
- Understanding of data modeling, ETL/ELT, big data, relational/NoSQL databases, and data security.
- Experience with AI/ML and familiarity with data mesh/fabric concepts.
- 5 years in back-end/full-stack development on large-scale projects with Azure Synapse/Databricks.

Posted 2 weeks ago

Apply

12.0 - 20.0 years

25 - 40 Lacs

Hyderabad

Work from Office

Minimum 10 years in IT project/program management, with hands-on experience in tools like JIRA, Excel, MS Project, and Planisware. Strong in data platform implementation (Snowflake/Redshift), ETL/ELT, scalable architecture, and business-aligned solutions.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

haryana

On-site

You will be responsible for leading the design and implementation of an Azure-based digital and AI platform that facilitates scalable and secure product delivery across IT and OT domains. In collaboration with the Enterprise Architect, you will shape the platform architecture to ensure alignment with the overall digital ecosystem. Your role will involve integrating OT sensor data from PLCs, SCADA, and IoT devices into a centralized and governed Lakehouse environment, bridging plant-floor operations with cloud innovation.

Key Responsibilities:
- Architect and implement the Azure digital platform utilizing IoT Hub, IoT Edge, Synapse, Databricks, and Purview.
- Work closely with the Enterprise Architect to ensure that platform capabilities align with the broader enterprise architecture and digital roadmap.
- Design data ingestion flows and edge-to-cloud integration from OT systems such as SCADA, PLC, MQTT, and OPC-UA (see the sketch after this list).
- Establish platform standards for data ingestion, transformation (Bronze, Silver, Gold), and downstream AI/BI consumption.
- Ensure security, governance, and compliance in accordance with standards like ISA-95 and the Purdue Model.
- Lead the technical validation of platform components and provide guidance on platform scaling across global sites.
- Implement microservices architecture patterns using containers (Docker) and orchestration (Kubernetes) to enhance platform modularity and scalability.

Requirements:
- A minimum of 8 years of experience in architecture or platform engineering roles.
- Demonstrated hands-on expertise with Azure services including Data Lake, Synapse, Databricks, IoT Edge, and IoT Hub.
- Deep understanding of industrial data protocols such as OPC-UA, MQTT, and Modbus.
- Proven track record of designing IT/OT integration solutions in manufacturing environments.
- Familiarity with Medallion architecture, time-series data, and Azure security best practices.
- TOGAF or Azure Solutions Architect certification is mandatory for this role.
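For the edge-to-cloud ingestion the posting describes, here is a minimal, hedged Python sketch that subscribes to plant-floor MQTT telemetry and lands raw events in a Bronze-layer file; the broker address, topic, and payload shape are assumptions (a production design would typically run as an IoT Edge module feeding IoT Hub):

```python
# A minimal edge ingestion sketch: subscribing to plant-floor MQTT telemetry.
# Broker, topic, and payload fields are assumptions, not from the posting.
import json
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    # Land raw events in the Bronze layer before any transformation.
    with open("bronze/plc_telemetry.jsonl", "a") as f:
        f.write(json.dumps({"topic": msg.topic, **reading}) + "\n")

client = mqtt.Client()
client.on_message = on_message
client.connect("plant-broker.local", 1883)
client.subscribe("factory/line1/plc/#")
client.loop_forever()
```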

Posted 2 weeks ago

Apply

5.0 - 10.0 years

16 - 30 Lacs

Bengaluru

Hybrid

CBS - National IT - Senior Associate - .NET Full Stack (Bangalore)

Job Duties:
- Be part of the technical team developing and maintaining web and desktop applications, support issues, and ensure an overlap of time zones for supporting analytics and web applications.
- Upgrade application development software frameworks, support business administration activities, and implement BDO USA security policy, processes, and technologies.
- Demonstrate proficiency in Agile software development and delivery with a focus on automation.
- Show expertise in web application development and service-oriented application design.
- Possess proven experience as a Full Stack Developer or similar role, with experience developing desktop, web, and mobile applications.
- Work on highly distributed and scalable system architecture.
- Design, code, debug, test, and develop features with quality, maintainability, performance, and security in mind.
- Work with a focus on customers' requirements, considering current and future needs when designing and implementing features.
- Manage the site design and development life cycle, including budgeting and milestone management.
- Carry out routine systems testing to detect and resolve bugs, coding errors, and technical issues.
- Have knowledge of multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, XML, jQuery), back-end languages (e.g., .NET Core, Entity Framework, ASP.NET C#, Python, R), and JavaScript frameworks (e.g., Angular, React, Node.js).
- Be familiar with databases (e.g., MSSQL, MySQL, MongoDB), Azure services, and UI/UX design.
- Maintain familiarity with Microsoft development best practices, Azure ML, Databricks, Synapse, and Fabric.
- Exhibit excellent communication and teamwork skills, great attention to detail, and proven organizational skills.

Qualifications, Knowledge, Skills and Abilities:
- Education: A bachelor's or master's degree in computer science, computer/electrical engineering, or equivalent.
- Experience: Minimum 5-10 years of hands-on experience in software development.
- Software: Microsoft .NET technology is primary. Experience with multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, XML, jQuery), back-end languages (e.g., .NET Core, Entity Framework, ASP.NET C#, Python, R), and JavaScript frameworks (e.g., Angular, React, Node.js). Azure/AWS, SaaS/PaaS/IaaS. SQL and NoSQL databases (MSSQL, MongoDB, PostgreSQL, etc.). Distributed caching (NCache, Redis, Memcached, etc.). Distributed message queues (RabbitMQ/Kafka). C#/Java/Ruby/Node.js/Python.
- Other Knowledge, Skills & Abilities: Familiarity with Microsoft development best practices, Azure ML, Databricks, Synapse, MS Blazor, and Fabric.
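One pattern the distributed-caching requirement above implies is cache-aside with Redis. A short hedged sketch follows (shown in Python for brevity, though the role's primary stack is .NET); the key scheme, TTL, and data-access stub are assumptions:

```python
# A cache-aside sketch with Redis; key names, TTL, and the DB stub are assumptions.
import json
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def fetch_from_database(user_id: int) -> dict:
    # Stand-in for the real data access layer.
    return {"id": user_id, "name": "example"}

def get_user_profile(user_id: int) -> dict:
    key = f"user:{user_id}"
    cached = r.get(key)
    if cached:
        return json.loads(cached)          # cache hit
    profile = fetch_from_database(user_id)  # cache miss: go to the source
    r.setex(key, 300, json.dumps(profile))  # cache for 5 minutes
    return profile
```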

Posted 2 weeks ago

Apply

10.0 - 15.0 years

20 - 32 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Skills: Python, SQL, PySpark, Azure Databricks, Data Pipelines

- SQL: Strong T-SQL skills, stored procedure troubleshooting and development, schema management, data issue analysis, and query performance analysis.
- Python: Intermediate development knowledge; skilled with data frames, the pandas library, Parquet file management, and deployment to the cloud (see the sketch after this list).
- Databricks: PySpark and data frames, Azure Databricks notebook management and troubleshooting, Azure Databricks architecture.
- Azure Data Factory (ADF)/Synapse/Data Explorer: Data pipeline design and troubleshooting, Azure linked services management, and knowledge of data ETL activities.
- Azure ML knowledge and troubleshooting.
- Azure DevOps/GitHub PR management.
- Kusto server and KQL are nice to have.
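To ground the pandas/Parquet line above, a minimal hedged sketch of a read-aggregate-write step; the paths and column names are invented for the example:

```python
# A small sketch of pandas/Parquet work; paths and columns are assumptions.
import pandas as pd

df = pd.read_parquet("landing/events.parquet")

# Derive a date column and roll events up to a daily grain.
daily = (
    df.assign(event_date=pd.to_datetime(df["event_ts"]).dt.date)
      .groupby(["event_date", "event_type"], as_index=False)
      .agg(events=("event_id", "count"))
)

daily.to_parquet("curated/daily_events.parquet", index=False)
```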

Posted 2 weeks ago

Apply

4.0 - 5.0 years

8 - 14 Lacs

Delhi, India

On-site

We are seeking a highly skilled Senior Power BI Administrator with 4-5 years of experience to manage, monitor, and optimize enterprise-level Power BI environments. The ideal candidate will have in-depth knowledge of Power BI architecture, governance, security, and performance optimization, and will work closely with BI developers, data engineers, and business users to ensure a reliable and secure reporting ecosystem.

Key Responsibilities:
- Administer and manage the Power BI service, including workspace management, dataset refreshes, dataflows, and gateways.
- Define and enforce Power BI governance policies around workspace usage, data security, content lifecycle, and sharing permissions.
- Monitor and optimize Power BI performance, including dataset size, DAX query performance, and report load times.
- Set up and manage on-premises data gateways, ensuring high availability and secure connectivity to on-prem data sources.
- Handle Power BI licensing, capacity planning, and usage analytics using Power BI Admin APIs and audit logs (see the sketch after this list).
- Support Power BI Premium/Embedded configuration, monitoring, and troubleshooting.
- Collaborate with BI developers to promote best practices in report development, publishing, and dataset design.
- Maintain documentation on administration procedures, governance frameworks, and change management processes.
- Work with security and compliance teams to ensure data privacy and adherence to organizational standards.
- Provide Tier 2/3 support for Power BI issues and escalate platform-level issues to Microsoft when needed.

Required Skills & Qualifications:
- 4-5 years of experience working with Power BI, including administration and platform management.
- Strong knowledge of Power BI service architecture, including datasets, workspaces, apps, gateways, and deployment pipelines.
- Experience with Power BI Admin APIs, PowerShell scripts, or Microsoft Fabric features.
- Understanding of row-level security (RLS) and data access management.
- Experience with on-premises data gateway setup, configuration, and troubleshooting.
- Proficiency in DAX, Power Query (M), and performance tuning.
- Familiarity with Azure Active Directory, Microsoft 365 Admin Center, and Power Platform Admin Center.
- Strong communication, documentation, and stakeholder management skills.

Preferred Skills (Good to Have):
- Experience with Power BI Premium, the Capacity Metrics App, or Fabric capacity management.
- Knowledge of Azure Data Services (e.g., Synapse, Azure SQL DB, Data Factory).
- Understanding of data governance tools like Purview or Informatica.
- Microsoft certifications (e.g., DA-100/PL-300, Power BI Data Analyst Associate).
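As a hedged illustration of working with the Power BI Admin APIs named above, a short Python sketch that lists tenant workspaces via the documented admin/groups endpoint; acquiring a token with tenant-level read permissions is assumed and left out:

```python
# A hedged sketch of tenant-wide workspace inventory via the Power BI admin API.
# Requires an AAD token with tenant read permissions; token acquisition is assumed.
import requests

token = "<AAD access token>"
resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/admin/groups",
    headers={"Authorization": f"Bearer {token}"},
    params={"$top": 100, "$expand": "users"},  # $top is required on this route
)
resp.raise_for_status()
for ws in resp.json()["value"]:
    print(ws["id"], ws.get("name"), ws.get("state"))
```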

Posted 2 weeks ago

Apply

4.0 - 5.0 years

3 - 10 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a highly skilled Senior Power BI Administrator with 4-5 years of experience to manage, monitor, and optimize enterprise-level Power BI environments. The ideal candidate will have in-depth knowledge of Power BI architecture, governance, security, and performance optimization, and will work closely with BI developers, data engineers, and business users to ensure a reliable and secure reporting ecosystem.

Key Responsibilities:
- Administer and manage the Power BI service, including workspace management, dataset refreshes, dataflows, and gateways.
- Define and enforce Power BI governance policies around workspace usage, data security, content lifecycle, and sharing permissions.
- Monitor and optimize Power BI performance, including dataset size, DAX query performance, and report load times.
- Set up and manage on-premises data gateways, ensuring high availability and secure connectivity to on-prem data sources.
- Handle Power BI licensing, capacity planning, and usage analytics using Power BI Admin APIs and audit logs.
- Support Power BI Premium/Embedded configuration, monitoring, and troubleshooting.
- Collaborate with BI developers to promote best practices in report development, publishing, and dataset design.
- Maintain documentation on administration procedures, governance frameworks, and change management processes.
- Work with security and compliance teams to ensure data privacy and adherence to organizational standards.
- Provide Tier 2/3 support for Power BI issues and escalate platform-level issues to Microsoft when needed.

Required Skills & Qualifications:
- 4-5 years of experience working with Power BI, including administration and platform management.
- Strong knowledge of Power BI service architecture, including datasets, workspaces, apps, gateways, and deployment pipelines.
- Experience with Power BI Admin APIs, PowerShell scripts, or Microsoft Fabric features.
- Understanding of row-level security (RLS) and data access management.
- Experience with on-premises data gateway setup, configuration, and troubleshooting.
- Proficiency in DAX, Power Query (M), and performance tuning.
- Familiarity with Azure Active Directory, Microsoft 365 Admin Center, and Power Platform Admin Center.
- Strong communication, documentation, and stakeholder management skills.

Preferred Skills (Good to Have):
- Experience with Power BI Premium, the Capacity Metrics App, or Fabric capacity management.
- Knowledge of Azure Data Services (e.g., Synapse, Azure SQL DB, Data Factory).
- Understanding of data governance tools like Purview or Informatica.
- Microsoft certifications (e.g., DA-100/PL-300, Power BI Data Analyst Associate).

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

The project duration for this role is 6 months with a monthly rate of 1.60 Lac. The ideal candidate should possess 4-7 years of experience; the work location is Bangalore with a hybrid setup.

Key Responsibilities:
- Demonstrated strong proficiency in Python, LLMs, LangChain, prompt engineering, and related Gen AI technologies (see the sketch after this list).
- Proficiency in working with Azure Databricks.
- Strong analytical skills, problem-solving capabilities, and effective stakeholder communication.
- A solid understanding of data governance frameworks, compliance requirements, and internal controls.
- Hands-on experience in data quality rule development, profiling, and implementation.
- Familiarity with Azure Data Services such as Data Lake, Synapse, and Blob Storage.

Preferred Qualifications:
- Previous experience in supporting AI/ML pipelines, particularly with GenAI or LLM-based models.
- Proficiency in Python, PySpark, and SQL, and knowledge of Delta Lake architecture.
- Hands-on experience with Azure Data Lake, Azure Data Factory, and Azure Synapse Analytics.
- Prior experience in data engineering, with strong expertise in Databricks.
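For the prompt-engineering skill named above, a minimal hedged sketch using the OpenAI Python SDK; the posting mentions LLMs and LangChain generally, so treat this plain-SDK call, the model name, and the data-quality framing as illustrative assumptions:

```python
# A minimal prompt-engineering sketch with the OpenAI SDK; the model name and the
# data-quality use case are assumptions, not part of the posting.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You draft data-quality rules as concise SQL predicates."},
        {"role": "user", "content": "Propose null and range checks for a column `order_amount`."},
    ],
)
print(resp.choices[0].message.content)
```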

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

The project is expected to last for 6 months with a monthly rate of 1.60 Lac. The ideal candidate should have 4-7 years of experience, and the work location will be Bangalore with hybrid working options available.

As a candidate, you are required to have strong proficiency in Python, LLMs, LangChain, prompt engineering, and related Gen AI technologies. Additionally, you should have proficiency with Azure Databricks and possess strong analytical, problem-solving, and stakeholder communication skills. A solid understanding of data governance frameworks, compliance, and internal controls is essential. Your experience should include data quality rule development, profiling, and implementation, as well as familiarity with Azure Data Services such as Data Lake, Synapse, and Blob Storage.

Preferred qualifications for this role include experience in supporting AI/ML pipelines, particularly with GenAI or LLM-based models. Proficiency in Python, PySpark, SQL, and Delta Lake architecture is desired, along with hands-on experience in Azure Data Lake, Azure Data Factory, and Azure Synapse Analytics. A background in data engineering with strong expertise in Databricks would be beneficial for this position.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Senior Power BI Developer at Magna, you will play a crucial role in interpreting business needs and translating them into impactful Power BI reports and data insights products. Your responsibilities will include designing, developing, integrating, and maintaining business systems through cubes, ad-hoc reports, and dashboards using cutting-edge technologies like Microsoft Fabric and Databricks. You will collaborate closely with a diverse international team spanning Europe, North America, and Asia.

Your major responsibilities will involve working closely with business analysts and stakeholders to understand data visualization requirements and develop effective BI solutions. You will utilize your expertise in DAX to create calculated measures, columns, and tables that enhance data analysis capabilities within Power BI models. Additionally, you will optimize ETL processes using tools like Power Query, SQL, Databricks, and MS Fabric to ensure accurate and consistent data integration from various sources.

In this role, you will implement best practices for data modeling, performance optimization, and data governance within Power BI projects. You will collaborate with database administrators and data engineers to maintain seamless data flow and integrity. Furthermore, you will identify and address performance bottlenecks, optimize queries and data models, and implement security measures to safeguard data confidentiality.

To excel in this position, you must stay updated on Power BI advancements and industry trends, continuously seeking optimized solutions and technologies to enhance Magna's Power BI processes. Additionally, you will provide training sessions and technical support to end users, enabling self-service analytics and maximizing Power BI utilization. You will also support junior team members and collaborate with cross-functional teams to identify data-driven insights for strategic decision-making processes.

To qualify for this role, you should have a university degree and more than 3 years of work-related experience in developing Business Intelligence solutions based on Microsoft Tabular models, including Power BI visualization and complex DAX expressions. Strong SQL coding skills and experience in data modeling, ETL processes, data warehouse concepts, and the Microsoft BI stack are essential. Knowledge of programming languages like Python or C# is a plus, along with excellent English language skills, analytical abilities, and effective communication skills.

This position may require working in the second or third shift, starting at 4:30 PM India time or later, with 10-25% regular travel. Magna offers a dynamic work environment within a global team, along with professional development opportunities and fair treatment for employees. Competitive salary and attractive benefits are provided based on skills and experience, reflecting market conditions. Join us at Magna to contribute to innovative mobility solutions and advance your career in the automotive technology industry.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

The Data Quality Monitoring Lead plays a crucial role in ensuring the accuracy, reliability, and integrity of data across various systems and platforms. You will lead an offshore team, establish robust data quality monitoring frameworks, and collaborate with cross-functional stakeholders to address data-related challenges effectively.

Your responsibilities will include overseeing real-time monitoring of data pipelines, dashboards, and logs using tools like Log Analytics, KQL queries, and Azure Monitoring to detect anomalies promptly. You will configure alerting mechanisms for timely notification of potential data discrepancies and collaborate with support teams to investigate and resolve system-related issues impacting data quality. Additionally, you will lead the team in identifying and categorizing data quality issues, perform root cause analysis to determine underlying causes, and collaborate with system support teams and data stewards to implement corrective measures. Developing strategies for rectifying data quality issues, designing monitoring tools, and conducting cross-system data analysis will also be part of your role.

Moreover, you will evaluate existing data monitoring processes, refine monitoring tools, and promote best practices in data quality monitoring to ensure standardization across all data-related activities. You will also lead and mentor an offshore team, develop a centralized knowledge base, and serve as the primary liaison between the offshore team and the Lockton Data Quality Lead.

In terms of technical skills, proficiency in data monitoring tools like Log Analytics, KQL, Azure Monitoring, and Power BI; a strong command of SQL; experience in automation scripting using Python; familiarity with Azure services; and an understanding of data flows involving Mulesoft and Salesforce platforms are required. Experience with Azure DevOps for issue tracking and version control is preferred. This role requires a proactive, detail-oriented individual with strong leadership and communication skills, along with a solid technical background in data monitoring, analytics, database querying, automation scripting, and Azure services.
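As a hedged sketch of the KQL-based monitoring described above, a small Python example using the azure-monitor-query SDK to flag failing pipelines; the workspace ID, log table, and failure threshold are assumptions:

```python
# A hedged monitoring sketch: running a KQL query against Log Analytics with the
# azure-monitor-query SDK. Workspace ID, table, and threshold are assumptions.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Flag pipelines with more than 3 failed activity runs in the window.
query = """
ADFActivityRun
| where Status == 'Failed'
| summarize failures = count() by PipelineName
| where failures > 3
"""

result = client.query_workspace("<workspace-id>", query, timespan=timedelta(hours=1))
for table in result.tables:
    for row in table.rows:
        print("Alert:", row)
```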

Posted 2 weeks ago

Apply

15.0 - 21.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Data Architect with over 15 years of experience, your primary responsibility will be to lead the design and implementation of scalable, secure, and high-performing data architectures. You will collaborate with business, engineering, and product teams to develop robust data solutions that support business intelligence, analytics, and AI initiatives.

Your key responsibilities will include designing and implementing enterprise-grade data architectures using cloud platforms such as AWS, Azure, or GCP. You will lead the definition of data architecture standards, guidelines, and best practices while architecting scalable data solutions like data lakes, data warehouses, and real-time streaming platforms. Collaborating with data engineers, analysts, and data scientists, you will ensure optimal solutions are delivered based on data requirements. In addition, you will oversee data modeling activities encompassing conceptual, logical, and physical data models. It will be your duty to ensure data security, privacy, and compliance with relevant regulations like GDPR and HIPAA. Defining and implementing data governance strategies alongside stakeholders and evaluating data-related tools and technologies are also integral parts of your role.

To excel in this position, you should possess at least 15 years of experience in data architecture, data engineering, or database development. Strong experience in architecting data solutions on major cloud platforms like AWS, Azure, or GCP is essential. Proficiency in data management principles, data modeling, ETL/ELT pipelines, and modern data platforms/tools such as Snowflake, Databricks, and Apache Spark is required. Familiarity with programming languages like Python, SQL, or Java, as well as real-time data processing frameworks like Kafka, Kinesis, or Azure Event Hub, will be beneficial. Moreover, experience in implementing data governance, data cataloging, and data quality frameworks is important. Knowledge of DevOps practices, CI/CD pipelines for data, and Infrastructure as Code (IaC) is a plus. Excellent problem-solving, communication, and stakeholder management skills are necessary for this role. A Bachelor's or Master's degree in Computer Science, Information Technology, or a related field is preferred, along with certifications like Cloud Architect or Data Architect (AWS/Azure/GCP).

Join us at Infogain, a human-centered digital platform and software engineering company, where you will have the opportunity to work on cutting-edge data and AI projects in a collaborative and inclusive work environment. Experience competitive compensation and benefits while contributing to experience-led transformation for our clients in various industries.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be responsible for designing, developing, and implementing data-centric software solutions using various technologies. This includes conducting code reviews, recommending best coding practices, and providing effort estimates for the proposed solutions. Additionally, you will design audit business-centric software solutions and maintain comprehensive documentation for all proposed solutions.

As a key member of the team, you will lead architecture and design efforts for product development and application development for relevant use cases. You will provide guidance and support to team members and clients, implementing best practices of data engineering and architectural solution design, development, testing, and documentation. Your role will require you to participate in team meetings, brainstorming sessions, and project planning activities. It is essential to stay up to date with the latest advancements in the data engineering area to drive innovation and maintain a competitive edge, and to stay hands-on with the design, development, and validation of systems and models deployed. Collaboration with audit professionals to understand business, regulatory, and risk requirements, as well as key alignment considerations for audit, is a crucial aspect of the role, as is driving efforts in the data engineering and architecture practice area.

In terms of mandatory technical and functional skills, you should have a deep understanding of RDBMSs (MS SQL Server, Oracle, etc.), strong programming skills in T-SQL, and proven experience in ETL and reporting (MSBI stack/Cognos/Informatica, etc.). Additionally, experience with cloud-centric databases (Azure SQL/AWS RDS) and Azure Data Factory (ADF), data warehousing skills using Synapse/Redshift, understanding and implementation experience of data lakes, and experience in large-scale data processing/ingestion using Databricks APIs, Lakehouse, etc., are required. Knowledge of MPP databases like Snowflake/Postgres-XL is also essential.

Preferred technical and functional skills include an understanding of financial accounting, experience with NoSQL using MongoDB/Cosmos DB, Python coding experience, and an aptitude for emerging data platform technologies like MS Azure Fabric. Key behavioral attributes required for this role include strong analytical, problem-solving, and critical-thinking skills; excellent collaboration skills and the ability to work effectively in a team-oriented environment; excellent written and verbal communication skills; and the willingness to learn new technologies and work with them.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

You are an experienced Senior QA Specialist being sought to join a dynamic team for a critical AWS-to-GCP migration project. Your primary responsibility will involve rigorous testing of data pipelines and data integrity in the GCP cloud to ensure seamless reporting and analytics capabilities.

Your key responsibilities will include designing and executing test plans to validate data pipelines re-engineered from AWS to GCP, ensuring data integrity and accuracy. You will work closely with data engineering teams to understand AVRO, ORC, and Parquet file structures in AWS S3, and analyze the data in the external tables created in Athena that are used for reporting. It will be essential to ensure that the schema and data in BigQuery match those in Athena to support reporting in Power BI. Additionally, you will be required to test and validate Spark pipelines and other big data workflows in GCP. Documenting all test results and collaborating with development teams to resolve discrepancies will also be part of your responsibilities, as will supporting UAT business users during UAT testing.

To excel in this role, you should possess proven experience in QA testing within a big data DW/BI ecosystem. Strong familiarity with cloud platforms such as AWS, GCP, or Azure, with hands-on experience in at least one, is necessary. Deep knowledge of data warehousing solutions like BigQuery, Redshift, Synapse, or Snowflake is essential, as is expertise in testing data pipelines and understanding file formats like Avro and Parquet. Experience with reporting tools such as Power BI or similar is preferred. Excellent problem-solving skills, the ability to work independently, strong communication skills, and the ability to collaborate effectively across teams round out the profile.
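A hedged sketch of one such cross-cloud validation, comparing row counts for a single table between Athena and BigQuery using boto3 and the BigQuery client; the database, table, and result-bucket names are assumptions:

```python
# A hedged cross-cloud validation sketch: comparing row counts for one table between
# Athena (AWS) and BigQuery (GCP). Database, table, and output bucket are assumptions.
import time
import boto3
from google.cloud import bigquery

def athena_count(database: str, table: str) -> int:
    athena = boto3.client("athena")
    qid = athena.start_query_execution(
        QueryString=f"SELECT COUNT(*) FROM {table}",
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )["QueryExecutionId"]
    # Poll until the query leaves the queued/running states.
    while athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"] in ("QUEUED", "RUNNING"):
        time.sleep(2)
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    return int(rows[1]["Data"][0]["VarCharValue"])  # row 0 is the header row

def bq_count(table: str) -> int:
    client = bigquery.Client()
    return next(iter(client.query(f"SELECT COUNT(*) FROM `{table}`").result()))[0]

src, dst = athena_count("analytics", "orders"), bq_count("project.dataset.orders")
assert src == dst, f"Row count mismatch: Athena={src}, BigQuery={dst}"
```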

Posted 2 weeks ago

Apply

2.0 - 3.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Detailed job description - Skill Set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently; a good team player
- Flexible to work PST hours (overlap for some hours)
- Past development experience for the Cisco client is preferred

Posted 3 weeks ago

Apply