
198 Synapse Jobs - Page 6

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Hybrid

Job Title: IT - Lead Engineer/Architect, Azure Lake
Years of Experience: 8-10 Years
Mandatory Skills: Azure, Data Lake, Databricks, SAP BW

Key Responsibilities:
- Lead the development and maintenance of the data architecture strategy, including design and architecture validation reviews with all stakeholders.
- Architect scalable data flows, storage, and analytics platforms in cloud/hybrid environments, ensuring secure, high-performing, and cost-effective solutions.
- Establish comprehensive data governance frameworks and promote best practices for data quality and enterprise compliance.
- Act as a technical leader on complex data projects and drive the adoption of new technologies, including AI/ML.
- Collaborate extensively with business stakeholders to translate needs into architectural solutions and define project scope.
- Support a wide range of data lake and lakehouse technologies (SQL, Synapse, Databricks, Power BI, Fabric).

Required Qualifications & Experience:
- Bachelor's or Master's degree in Computer Science or a related field.
- At least 3 years in a leadership role in data architecture.
- Proven ability to lead architecture/AI/ML projects from conception to deployment.
- Deep knowledge of cloud data platforms (Microsoft Azure, Fabric, Databricks), data modeling, ETL/ELT, big data, relational/NoSQL databases, and data security.
- Experience designing and implementing AI solutions within cloud architecture.
- 3 years as a project lead on large-scale projects.
- 5 years of development experience with Azure, Synapse, and Databricks.
- Excellent communication and stakeholder management skills.

Posted 1 month ago

Apply

4.0 - 8.0 years

7 - 17 Lacs

Hyderabad

Hybrid

Job Title: IT - Senior Engineer, Azure Lake
Years of Experience: 4-6 Years
Mandatory Skills: Azure, Data Lake, SAP BW, Power BI, Tableau

Key Responsibilities:
- Develop and maintain the data architecture strategy, including design and architecture validation reviews.
- Architect scalable data flows, storage, and analytics platforms in cloud/hybrid environments, ensuring secure and cost-effective solutions.
- Establish and enforce data governance frameworks, promoting data quality and compliance.
- Act as a technical advisor on complex data projects and collaborate with stakeholders on project scope and planning.
- Drive adoption of new technologies, conduct technology watch, and define standards for data management.
- Develop using SQL, Synapse, Databricks, Power BI, and Fabric.

Required Qualifications & Experience:
- Bachelor's or Master's degree in Computer Science or a related field.
- Experience in data architecture, with at least 3 years in a leadership role.
- Deep knowledge of Azure/AWS, Databricks, Synapse, and other cloud data platforms.
- Understanding of SAP technologies (SAP BW, SAP DataSphere, HANA, S/4, ECC) and visualization tools (Power BI, Tableau).
- Understanding of data modeling, ETL/ELT, big data, relational/NoSQL databases, and data security.
- Experience with AI/ML and familiarity with data mesh/fabric concepts.
- 5 years in back-end/full-stack development on large-scale projects with Azure Synapse/Databricks.

Posted 1 month ago

Apply

12.0 - 20.0 years

25 - 40 Lacs

Hyderabad

Work from Office

Minimum 10 years in IT project/program management, with hands-on experience in tools such as JIRA, Excel, MS Project, and Planisware. Strong in data platform implementation (Snowflake/Redshift), ETL/ELT, scalable architecture, and business-aligned solutions.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

You will be responsible for leading the design and implementation of an Azure-based digital and AI platform that enables scalable, secure product delivery across IT and OT domains. In collaboration with the Enterprise Architect, you will shape the platform architecture to ensure alignment with the overall digital ecosystem. Your role will involve integrating OT sensor data from PLCs, SCADA, and IoT devices into a centralized, governed Lakehouse environment, bridging plant-floor operations with cloud innovation.

Key Responsibilities:
- Architect and implement the Azure digital platform using IoT Hub, IoT Edge, Synapse, Databricks, and Purview.
- Work closely with the Enterprise Architect to ensure platform capabilities align with the broader enterprise architecture and digital roadmap.
- Design data ingestion flows and edge-to-cloud integration from OT systems such as SCADA, PLC, MQTT, and OPC-UA.
- Establish platform standards for data ingestion, transformation (Bronze, Silver, Gold), and downstream AI/BI consumption.
- Ensure security, governance, and compliance in accordance with standards such as ISA-95 and the Purdue Model.
- Lead the technical validation of platform components and guide platform scaling across global sites.
- Implement microservices architecture patterns using containers (Docker) and orchestration (Kubernetes) to enhance platform modularity and scalability.

Requirements:
- A minimum of 8 years of experience in architecture or platform engineering roles.
- Demonstrated hands-on expertise with Azure services including Data Lake, Synapse, Databricks, IoT Edge, and IoT Hub.
- Deep understanding of industrial data protocols such as OPC-UA, MQTT, and Modbus.
- Proven track record of designing IT/OT integration solutions in manufacturing environments.
- Familiarity with Medallion architecture, time-series data, and Azure security best practices.
- TOGAF or Azure Solutions Architect certification is mandatory for this role.
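
For illustration, a minimal PySpark sketch of the Bronze-to-Silver transformation step this posting describes, assuming a Delta-based medallion layout on Databricks; the paths, table layout, and column names are hypothetical:

```python
# Minimal Bronze -> Silver promotion for OT telemetry (PySpark on Databricks).
# Paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw MQTT/OPC-UA telemetry landed as-is.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/ot_telemetry")

# Silver: typed, quality-filtered, deduplicated records.
silver = (
    bronze
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("value", F.col("value").cast("double"))
    .filter(F.col("value").isNotNull())           # basic quality gate
    .dropDuplicates(["device_id", "event_ts"])    # idempotent re-ingestion
)

(silver.write.format("delta")
    .mode("append")
    .partitionBy("device_id")
    .save("/mnt/lake/silver/ot_telemetry"))
```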

Posted 2 months ago

Apply

5.0 - 10.0 years

16 - 30 Lacs

Bengaluru

Hybrid

CBS - National IT - Senior Associate - .Net Full Stack (Bangalore)

Job Duties:
- Be part of the technical team developing and maintaining web and desktop applications, supporting issues, and ensuring an overlap of time zones for supporting analytics and web applications.
- Upgrade application development software frameworks, support business administration activities, and implement BDO USA security policy, processes, and technologies.
- Demonstrate proficiency in Agile software development and delivery with a focus on automation.
- Show expertise in web application development and service-oriented application design.
- Possess proven experience as a Full Stack Developer or in a similar role, with experience developing desktop, web, and mobile applications.
- Work on highly distributed and scalable system architecture.
- Design, code, debug, test, and develop features with quality, maintainability, performance, and security in mind.
- Work with a focus on customers' requirements, considering current and future needs when designing and implementing features.
- Manage the site design and development life cycle, including budgeting and milestone management.
- Carry out routine systems testing to detect and resolve bugs, coding errors, and technical issues.
- Have knowledge of multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, XML, jQuery), back-end languages (e.g., .NET Core, Entity Framework, ASP.NET C#, Python, R), and JavaScript frameworks (e.g., Angular, React, Node.js).
- Be familiar with databases (e.g., MSSQL, MySQL, MongoDB), Azure services, and UI/UX design.
- Maintain familiarity with Microsoft development best practices, Azure ML, Databricks, Synapse, and Fabric.
- Exhibit excellent communication and teamwork skills, great attention to detail, and proven organizational skills.

Qualifications, Knowledge, Skills and Abilities:
- Education: A bachelor's or master's degree in computer science, computer/electrical engineering, or equivalent.
- Experience: Minimum 5-10 years of hands-on experience in software development.
- Software: Microsoft .NET technology is primary. Experience with multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, XML, jQuery), back-end languages (e.g., .NET Core, Entity Framework, ASP.NET C#, Python, R), and JavaScript frameworks (e.g., Angular, React, Node.js); Azure/AWS, SaaS/PaaS/IaaS; SQL and NoSQL databases (MSSQL, MongoDB, PostgreSQL, etc.); distributed caching (NCache, Redis, Memcached, etc.); distributed message queues (RabbitMQ/Kafka); C#/Java/Ruby/Node.js/Python.
- Other Knowledge, Skills & Abilities: Familiarity with Microsoft development best practices, Azure ML, Databricks, Synapse, MS Blazor, and Fabric.

Posted 2 months ago

Apply

10.0 - 15.0 years

20 - 32 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Skills: Python, SQL, PySpark, Azure Databricks, Data Pipelines

- SQL: Strong T-SQL skills - stored procedure troubleshooting and development, schema management, data issue analysis, and query performance analysis.
- Python: Intermediate development knowledge - skilled with data frames, the pandas library, Parquet file management, and deployment to the cloud.
- Databricks: PySpark and data frames, Azure Databricks notebook management and troubleshooting, Azure Databricks architecture.
- Azure Data Factory (ADF)/Synapse/Data Explorer: Data pipeline design and troubleshooting, Azure linked service management, and knowledge of data ETL activities.
- Azure ML knowledge and troubleshooting.
- Azure DevOps/GitHub PR management.
- Kusto server and KQL are nice to have.
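
As an illustration of the data-frame and Parquet skills listed above, a minimal PySpark/pandas sketch; the path and column names are hypothetical placeholders:

```python
# Read Parquet with PySpark, profile null rates, and hand a slice to pandas.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("parquet-check").getOrCreate()

orders = spark.read.parquet("/mnt/data/orders/")  # Parquet managed in the lake

# Data issue analysis: null count per column feeding a failing report.
null_counts = orders.select(
    *[F.sum(F.col(c).isNull().cast("int")).alias(c) for c in orders.columns]
)
null_counts.show()

# Hand a small sample to pandas for ad-hoc inspection.
sample_pdf = orders.limit(1000).toPandas()
print(sample_pdf.describe())
```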

Posted 2 months ago

Apply

4.0 - 5.0 years

8 - 14 Lacs

Delhi, India

On-site

We are seeking a highly skilled Senior Power BI Administrator with 4-5 years of experience to manage, monitor, and optimize enterprise-level Power BI environments. The ideal candidate will have in-depth knowledge of Power BI architecture, governance, security, and performance optimization, and will work closely with BI developers, data engineers, and business users to ensure a reliable and secure reporting ecosystem.

Key Responsibilities:
- Administer and manage the Power BI service, including workspace management, dataset refreshes, dataflows, and gateways.
- Define and enforce Power BI governance policies around workspace usage, data security, content lifecycle, and sharing permissions.
- Monitor and optimize Power BI performance, including dataset size, DAX query performance, and report load times.
- Set up and manage on-premises data gateways, ensuring high availability and secure connectivity to on-prem data sources.
- Handle Power BI licensing, capacity planning, and usage analytics using the Power BI Admin APIs and audit logs.
- Support Power BI Premium/Embedded configuration, monitoring, and troubleshooting.
- Collaborate with BI developers to promote best practices in report development, publishing, and dataset design.
- Maintain documentation on administration procedures, governance frameworks, and change management processes.
- Work with security and compliance teams to ensure data privacy and adherence to organizational standards.
- Provide Tier 2/3 support for Power BI issues and escalate platform-level issues to Microsoft when needed.

Required Skills & Qualifications:
- 4-5 years of experience working with Power BI, including administration and platform management.
- Strong knowledge of Power BI service architecture, including datasets, workspaces, apps, gateways, and deployment pipelines.
- Experience with the Power BI Admin APIs, PowerShell scripts, or Microsoft Fabric features.
- Understanding of row-level security (RLS) and data access management.
- Experience with on-premises data gateway setup, configuration, and troubleshooting.
- Proficiency in DAX, Power Query (M), and performance tuning.
- Familiarity with Azure Active Directory, Microsoft 365 Admin Center, and Power Platform Admin Center.
- Strong communication, documentation, and stakeholder management skills.

Preferred Skills (Good to Have):
- Experience with Power BI Premium, the Capacity Metrics App, or Fabric capacity management.
- Knowledge of Azure Data Services (e.g., Synapse, Azure SQL DB, Data Factory).
- Understanding of data governance tools like Purview or Informatica.
- Microsoft certifications (e.g., DA-100 / PL-300, Power BI Data Analyst Associate).
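
For context on the Admin APIs mentioned above, a minimal Python sketch that lists tenant workspaces via the Power BI admin REST endpoint (GetGroupsAsAdmin); token acquisition is assumed to happen elsewhere (e.g., via MSAL or a service principal), and the placeholder token is hypothetical:

```python
# List workspaces tenant-wide via the Power BI Admin REST API.
# Requires an Azure AD access token with admin (Tenant.Read.All) scope.
import requests

ACCESS_TOKEN = "<aad-access-token>"  # placeholder - acquire via MSAL/service principal

resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/admin/groups",
    params={"$top": 100},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

for ws in resp.json().get("value", []):
    print(ws.get("id"), ws.get("name"), ws.get("state"))
```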

Posted 2 months ago

Apply

4.0 - 5.0 years

3 - 10 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a highly skilled Senior Power BI Administrator with 4-5 years of experience to manage, monitor, and optimize enterprise-level Power BI environments. The ideal candidate will have in-depth knowledge of Power BI architecture, governance, security, and performance optimization, and will work closely with BI developers, data engineers, and business users to ensure a reliable and secure reporting ecosystem.

Key Responsibilities:
- Administer and manage the Power BI service, including workspace management, dataset refreshes, dataflows, and gateways.
- Define and enforce Power BI governance policies around workspace usage, data security, content lifecycle, and sharing permissions.
- Monitor and optimize Power BI performance, including dataset size, DAX query performance, and report load times.
- Set up and manage on-premises data gateways, ensuring high availability and secure connectivity to on-prem data sources.
- Handle Power BI licensing, capacity planning, and usage analytics using the Power BI Admin APIs and audit logs.
- Support Power BI Premium/Embedded configuration, monitoring, and troubleshooting.
- Collaborate with BI developers to promote best practices in report development, publishing, and dataset design.
- Maintain documentation on administration procedures, governance frameworks, and change management processes.
- Work with security and compliance teams to ensure data privacy and adherence to organizational standards.
- Provide Tier 2/3 support for Power BI issues and escalate platform-level issues to Microsoft when needed.

Required Skills & Qualifications:
- 4-5 years of experience working with Power BI, including administration and platform management.
- Strong knowledge of Power BI service architecture, including datasets, workspaces, apps, gateways, and deployment pipelines.
- Experience with the Power BI Admin APIs, PowerShell scripts, or Microsoft Fabric features.
- Understanding of row-level security (RLS) and data access management.
- Experience with on-premises data gateway setup, configuration, and troubleshooting.
- Proficiency in DAX, Power Query (M), and performance tuning.
- Familiarity with Azure Active Directory, Microsoft 365 Admin Center, and Power Platform Admin Center.
- Strong communication, documentation, and stakeholder management skills.

Preferred Skills (Good to Have):
- Experience with Power BI Premium, the Capacity Metrics App, or Fabric capacity management.
- Knowledge of Azure Data Services (e.g., Synapse, Azure SQL DB, Data Factory).
- Understanding of data governance tools like Purview or Informatica.
- Microsoft certifications (e.g., DA-100 / PL-300, Power BI Data Analyst Associate).

Posted 2 months ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

The project duration for this role is 6 months, with a monthly rate of 1.60 Lac. The ideal candidate should have 4-7 years of experience; the work location is Bangalore with a hybrid setup.

Key Responsibilities:
- Demonstrated strong proficiency in Python, LLMs, LangChain, prompt engineering, and related Gen AI technologies.
- Proficiency in working with Azure Databricks.
- Strong analytical skills, problem-solving capabilities, and effective stakeholder communication.
- A solid understanding of data governance frameworks, compliance requirements, and internal controls.
- Hands-on experience in data quality rule development, profiling, and implementation.
- Familiarity with Azure Data Services such as Data Lake, Synapse, and Blob Storage.

Preferred Qualifications:
- Previous experience supporting AI/ML pipelines, particularly with GenAI or LLM-based models.
- Proficiency in Python, PySpark, and SQL, and knowledge of Delta Lake architecture.
- Hands-on experience with Azure Data Lake, Azure Data Factory, and Azure Synapse Analytics.
- Prior experience in data engineering, with strong expertise in Databricks.
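
As a hedged illustration of the data quality rule development and profiling this posting asks for, a minimal PySpark sketch; the table path, columns, rule, and quarantine location are all hypothetical:

```python
# Profile a column, then enforce a data quality rule that quarantines
# failing rows. Assumes a Databricks/Delta environment.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.format("delta").load("/mnt/lake/silver/customers")

# Profiling: null rate and distinct count for key columns.
df.agg(
    F.mean(F.col("email").isNull().cast("int")).alias("email_null_rate"),
    F.countDistinct("customer_id").alias("distinct_customer_ids"),
).show()

# Rule: customer_id must be non-null and email must match a basic pattern.
rule = F.col("customer_id").isNotNull() & F.col("email").rlike(r"^[^@]+@[^@]+$")
passed, failed = df.filter(rule), df.filter(~rule)

failed.write.format("delta").mode("append").save("/mnt/lake/quarantine/customers")
print(f"passed={passed.count()}, quarantined={failed.count()}")
```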

Posted 2 months ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

The project is expected to last 6 months, with a monthly rate of 1.60 Lac. The ideal candidate should have 4-7 years of experience; the work location is Bangalore, with hybrid working options available. You are required to have strong proficiency in Python, LLMs, LangChain, prompt engineering, and related Gen AI technologies. Additionally, you should be proficient with Azure Databricks and possess strong analytical, problem-solving, and stakeholder communication skills. A solid understanding of data governance frameworks, compliance, and internal controls is essential. Your experience should include data quality rule development, profiling, and implementation, as well as familiarity with Azure Data Services such as Data Lake, Synapse, and Blob Storage. Preferred qualifications include experience supporting AI/ML pipelines, particularly with GenAI or LLM-based models. Proficiency in Python, PySpark, SQL, and Delta Lake architecture is desired, along with hands-on experience in Azure Data Lake, Azure Data Factory, and Azure Synapse Analytics. A background in data engineering with strong expertise in Databricks would be beneficial for this position.

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Senior Power BI Developer at Magna, you will play a crucial role in interpreting business needs and translating them into impactful Power BI reports and data insights products. Your responsibilities will include designing, developing, integrating, and maintaining business systems through cubes, ad-hoc reports, and dashboards using cutting-edge technologies like Microsoft Fabric and Databricks. You will collaborate closely with a diverse international team spanning Europe, North America, and Asia.

Your major responsibilities will involve working closely with business analysts and stakeholders to understand data visualization requirements and develop effective BI solutions. You will utilize your expertise in DAX to create calculated measures, columns, and tables that enhance data analysis capabilities within Power BI models. Additionally, you will optimize ETL processes using tools like Power Query, SQL, Databricks, and MS Fabric to ensure accurate and consistent data integration from various sources.

In this role, you will implement best practices for data modeling, performance optimization, and data governance within Power BI projects. You will collaborate with database administrators and data engineers to maintain seamless data flow and integrity. Furthermore, you will identify and address performance bottlenecks, optimize queries and data models, and implement security measures to safeguard data confidentiality.

To excel in this position, you must stay updated on Power BI advancements and industry trends, continuously seeking optimized solutions and technologies to enhance Magna's Power BI processes. You will provide training sessions and technical support to end users, enabling self-service analytics and maximizing Power BI utilization. You will also support junior team members and collaborate with cross-functional teams to identify data-driven insights for strategic decision-making.

To qualify for this role, you should have a university degree and more than 3 years of work-related experience developing Business Intelligence solutions based on Microsoft Tabular models, including Power BI visualization and complex DAX expressions. Strong SQL coding skills, experience with data modeling, ETL processes, and Data Warehouse concepts, and proficiency in the Microsoft BI stack are essential. Knowledge of programming languages like Python or C# is a plus, along with excellent English language skills, analytical abilities, and effective communication skills.

This position may require working a second or third shift, starting at 4:30 PM or later India time, with 10-25% regular travel. Magna offers a dynamic work environment within a global team, along with professional development opportunities and fair treatment for employees. Competitive salary and attractive benefits are provided based on skills and experience, reflecting market conditions. Join us at Magna to contribute to innovative mobility solutions and advance your career in the automotive technology industry.

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Data Quality Monitoring Lead plays a crucial role in ensuring the accuracy, reliability, and integrity of data across systems and platforms. You will lead an offshore team, establish robust data quality monitoring frameworks, and collaborate with cross-functional stakeholders to address data-related challenges effectively.

Your responsibilities will include overseeing real-time monitoring of data pipelines, dashboards, and logs using tools like Log Analytics, KQL queries, and Azure Monitor to detect anomalies promptly. You will configure alerting mechanisms for timely notification of potential data discrepancies and collaborate with support teams to investigate and resolve system-related issues impacting data quality.

Additionally, you will lead the team in identifying and categorizing data quality issues, perform root cause analysis to determine underlying causes, and collaborate with system support teams and data stewards to implement corrective measures. Developing strategies for rectifying data quality issues, designing monitoring tools, and conducting cross-system data analysis will also be part of your role.

Moreover, you will evaluate existing data monitoring processes, refine monitoring tools, and promote best practices in data quality monitoring to ensure standardization across all data-related activities. You will also lead and mentor an offshore team, develop a centralized knowledge base, and serve as the primary liaison between the offshore team and the Lockton Data Quality Lead.

In terms of technical skills, proficiency with data monitoring tools (Log Analytics, KQL, Azure Monitor, Power BI), a strong command of SQL, experience with automation scripting in Python, familiarity with Azure services, and an understanding of data flows involving the MuleSoft and Salesforce platforms are required. Experience with Azure DevOps for issue tracking and version control is preferred.

This role requires a proactive, detail-oriented individual with strong leadership and communication skills, along with a solid technical background in data monitoring, analytics, database querying, automation scripting, and Azure services.
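
For illustration of the Log Analytics/KQL monitoring described above, a minimal sketch using the azure-monitor-query SDK; the workspace ID and the query's table and thresholds are hypothetical, and result handling may vary slightly by SDK version:

```python
# Run a KQL query against a Log Analytics workspace to surface pipelines
# with repeated failures - the kind of anomaly detection this role covers.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

KQL = """
ADFPipelineRun
| where Status == 'Failed'
| summarize failures = count() by PipelineName, bin(TimeGenerated, 1h)
| where failures > 3
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",  # placeholder
    query=KQL,
    timespan=timedelta(days=1),
)

for table in response.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))
```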

Posted 2 months ago

Apply

15.0 - 21.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Data Architect with over 15 years of experience, your primary responsibility will be to lead the design and implementation of scalable, secure, and high-performing data architectures. You will collaborate with business, engineering, and product teams to develop robust data solutions that support business intelligence, analytics, and AI initiatives.

Your key responsibilities will include designing and implementing enterprise-grade data architectures using cloud platforms such as AWS, Azure, or GCP. You will lead the definition of data architecture standards, guidelines, and best practices while architecting scalable data solutions such as data lakes, data warehouses, and real-time streaming platforms. Collaborating with data engineers, analysts, and data scientists, you will ensure optimal solutions are delivered for the data requirements at hand.

In addition, you will oversee data modeling activities encompassing conceptual, logical, and physical data models. It will be your duty to ensure data security, privacy, and compliance with relevant regulations such as GDPR and HIPAA. Defining and implementing data governance strategies alongside stakeholders and evaluating data-related tools and technologies are also integral parts of your role.

To excel in this position, you should possess at least 15 years of experience in data architecture, data engineering, or database development. Strong experience architecting data solutions on major cloud platforms (AWS, Azure, or GCP) is essential. Proficiency in data management principles, data modeling, ETL/ELT pipelines, and modern data platforms and tools such as Snowflake, Databricks, and Apache Spark is required. Familiarity with programming languages like Python, SQL, or Java, as well as real-time data processing frameworks such as Kafka, Kinesis, or Azure Event Hubs, will be beneficial.

Moreover, experience implementing data governance, data cataloging, and data quality frameworks is important. Knowledge of DevOps practices, CI/CD pipelines for data, and Infrastructure as Code (IaC) is a plus. Excellent problem-solving, communication, and stakeholder management skills are necessary. A Bachelor's or Master's degree in Computer Science, Information Technology, or a related field is preferred, along with certifications such as Cloud Architect or Data Architect (AWS/Azure/GCP).

Join us at Infogain, a human-centered digital platform and software engineering company, where you will have the opportunity to work on cutting-edge data and AI projects in a collaborative and inclusive work environment. Enjoy competitive compensation and benefits while contributing to experience-led transformation for our clients across industries.

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be responsible for designing, developing, and implementing data-centric software solutions using various technologies. This includes conducting code reviews, recommending best coding practices, and providing effort estimates for proposed solutions. Additionally, you will design audit business-centric software solutions and maintain comprehensive documentation for all proposed solutions.

As a key member of the team, you will lead architecture and design efforts for product development and application development for relevant use cases. You will provide guidance and support to team members and clients, implementing best practices in data engineering and architectural solution design, development, testing, and documentation. Your role will require you to participate in team meetings, brainstorming sessions, and project planning activities. It is essential to stay up to date with the latest advancements in data engineering to drive innovation and maintain a competitive edge, and to stay hands-on with the design, development, and validation of deployed systems and models.

Collaboration with audit professionals to understand business, regulatory, and risk requirements, as well as key alignment considerations for audit, is a crucial aspect of the role, as is driving efforts in the data engineering and architecture practice area.

Mandatory technical and functional skills: a deep understanding of RDBMSs (MS SQL Server, Oracle, etc.), strong programming skills in T-SQL, and proven experience in ETL and reporting (MSBI stack/Cognos/Informatica, etc.); experience with cloud-centric databases (Azure SQL/AWS RDS) and Azure Data Factory (ADF); data warehousing skills using Synapse/Redshift; understanding of and implementation experience with data lakes; experience in large-scale data processing/ingestion using Databricks APIs, Lakehouse, etc.; and knowledge of MPP databases like Snowflake/Postgres-XL.

Preferred technical and functional skills: an understanding of financial accounting, experience with NoSQL using MongoDB/Cosmos DB, Python coding experience, and an aptitude for emerging data platform technologies like MS Azure Fabric.

Key behavioral attributes: strong analytical, problem-solving, and critical-thinking skills; excellent collaboration skills and the ability to work effectively in a team-oriented environment; excellent written and verbal communication skills; and a willingness to learn and work on new technologies.

Posted 2 months ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

You are an experienced Senior QA Specialist joining a dynamic team for a critical AWS-to-GCP migration project. Your primary responsibility will be rigorous testing of data pipelines and data integrity in GCP to ensure seamless reporting and analytics capabilities.

Key responsibilities include designing and executing test plans to validate data pipelines re-engineered from AWS to GCP, ensuring data integrity and accuracy. You will work closely with data engineering teams to understand AVRO, ORC, and Parquet file structures in AWS S3, and analyze the data in the external tables created in Athena that are used for reporting. It is essential to ensure that the schema and data in BigQuery match Athena to support reporting in Power BI. Additionally, you will test and validate Spark pipelines and other big data workflows in GCP, document all test results, collaborate with development teams to resolve discrepancies, and support UAT business users during UAT testing.

To excel in this role, you should have proven QA testing experience within a big data DW/BI ecosystem. Strong familiarity with cloud platforms such as AWS, GCP, or Azure, with hands-on experience in at least one, is necessary. Deep knowledge of data warehousing solutions like BigQuery, Redshift, Synapse, or Snowflake is essential, as is expertise in testing data pipelines and understanding file formats like Avro and Parquet. Experience with reporting tools such as Power BI is preferred. Excellent problem-solving skills and the ability to work independently will be valuable, along with strong communication skills and the ability to collaborate effectively across teams.
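
As a hedged sketch of one check this role implies - comparing row counts for the same table in Athena and BigQuery - the Python example below uses boto3 and google-cloud-bigquery; the table, database, project, and S3 results bucket are hypothetical, and credentials are assumed to be configured in the environment:

```python
# Compare row counts for one table across Athena (AWS) and BigQuery (GCP).
import time

import boto3
from google.cloud import bigquery

TABLE = "sales_orders"  # hypothetical table name

# --- Athena side ---
athena = boto3.client("athena", region_name="us-east-1")
qid = athena.start_query_execution(
    QueryString=f"SELECT COUNT(*) FROM {TABLE}",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)["QueryExecutionId"]

while athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"] in ("QUEUED", "RUNNING"):
    time.sleep(2)

rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
athena_count = int(rows[1]["Data"][0]["VarCharValue"])  # rows[0] is the header

# --- BigQuery side ---
bq = bigquery.Client()
bq_count = next(iter(bq.query(f"SELECT COUNT(*) FROM `my_project.analytics.{TABLE}`").result()))[0]

print(f"Athena={athena_count}, BigQuery={bq_count}, match={athena_count == bq_count}")
```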

Posted 2 months ago

Apply

2.0 - 3.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Detailed job description - Skill Set: Technically strong and hands-on; self-driven; good client communication skills; able to work independently and a good team player; flexible to work PST hours (overlap for some hours). Past development experience for the Cisco client is preferred.

Posted 2 months ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Chennai

Hybrid

Job Title: Product Owner / Subject Matter Expert (AI & Data)
Experience Required: 10+ years
Location: The selected candidate is required to work onsite for the initial 1-to-3-month project training and execution period at either our Kovilpatti or Chennai location, which will be confirmed during onboarding. After the initial period, remote work opportunities will be offered.

Job Description:
The Product Owner / Subject Matter Expert (AI & Data) will lead the definition, prioritization, and successful delivery of intelligent, data-driven products by aligning business needs with AI/ML and data platform capabilities. Acting as a bridge between stakeholders, data engineering teams, and AI developers, this role ensures that business goals are translated into actionable technical requirements. The candidate will manage product backlogs, define epics and features, and guide cross-functional teams throughout the product development lifecycle. They will play a crucial role in driving innovation, ensuring data governance, and realizing value through AI-enhanced digital solutions.

Key Responsibilities:
- Define and manage the product roadmap across AI and data domains based on business strategy and stakeholder input.
- Translate business needs into technical requirements, user stories, and use cases for AI and data-driven applications.
- Collaborate with data scientists, AI engineers, and data engineers to prioritize features, define MVPs, and validate solution feasibility.
- Lead backlog refinement, sprint planning, and iteration reviews across multidisciplinary teams.
- Drive the adoption of AI models (e.g., LLMs, classification, prediction, recommendation) and data pipelines that support operational goals.
- Ensure inclusion of data governance, lineage, and compliance requirements in product development.
- Engage with business units to define KPIs and success metrics for AI and analytics products.
- Document product artifacts such as PRDs, feature definitions, data mappings, model selection criteria, and risk registers.
- Facilitate workshops, stakeholder demos, and solution walkthroughs to ensure ongoing alignment.
- Support responsible AI practices and secure data sharing standards.

Technical Skills:
- Product Management Tools: Azure DevOps, Jira, Confluence
- AI/ML Concepts: LLMs, NLP, predictive analytics, computer vision, generative AI
- AI Tools: OpenAI, Azure OpenAI, MLflow, LangChain, prompt engineering
- Data Platforms: Azure Data Factory, Databricks, Synapse Analytics, Purview, SQL, NoSQL
- Data Governance: Metadata management, data lineage, PII handling, classification standards
- Documentation: PRDs, data dictionaries, process flows, KPI dashboards
- Methodologies: Agile/Scrum, backlog management, MVP delivery

Qualification:
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
- Preferred certifications: Microsoft Certified (Azure AI Engineer Associate / Azure Data Fundamentals / Azure Data Engineer Associate).
- 10+ years of experience in product ownership, business analysis, or solution delivery in AI and data-centric environments.
- Proven success in delivering AI-enabled products and scalable data platforms.
- Strong communication, stakeholder facilitation, and technical documentation skills.

Posted 2 months ago

Apply

4.0 - 8.0 years

5 - 15 Lacs

Chennai, Delhi / NCR, Mumbai (All Areas)

Hybrid

Job Title: Data Engineer - Azure (Databricks / ADF / Synapse), with a strong emphasis on Python, SQL, Data Lake, and Data Warehouse
Experience: 4 to 8 Years
Location: Pan India
Employment Type: Full-Time
Notice Period: Immediate to 30 Days

Job Summary:
We are looking for a skilled and experienced Data Engineer with 4 to 8 years of experience building scalable data solutions on the Microsoft Azure ecosystem. The ideal candidate must have strong hands-on experience with Azure Databricks, Azure Data Factory (ADF), or Azure Synapse Analytics, along with Python and SQL expertise. Familiarity with Data Lake and Data Warehouse concepts and end-to-end data pipelines is essential.

Key Responsibilities:
- Requirement gathering and analysis.
- Work with different databases such as Synapse, SQL DB, and Snowflake.
- Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse.
- Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases.
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage.
- Implement data security and governance measures.
- Monitor and optimize data pipelines for performance and efficiency.
- Troubleshoot and resolve data engineering issues.
- Provide optimized solutions for any data engineering problem.
- Work with a variety of sources: relational databases, APIs, file systems, real-time streams, CDC, etc.
- Strong knowledge of Databricks and Delta tables.

Required Skills:
- 4-8 years of experience in data engineering or related roles.
- Hands-on experience with Azure Databricks, ADF, or Synapse Analytics.
- Proficiency in Python for data processing and scripting.
- Strong command of SQL: writing complex queries, performance tuning, etc.
- Experience with Azure Data Lake Storage and data warehouse concepts (e.g., dimensional modeling, star/snowflake schemas).
- Understanding of CI/CD practices in a data engineering context.
- Excellent problem-solving and communication skills.

Good to Have:
- Experience with Delta Lake, Power BI, or Azure DevOps.
- Knowledge of Spark, Scala, or other distributed processing frameworks.
- Exposure to BI tools such as Power BI, Tableau, or Looker.
- Familiarity with data security and compliance in the cloud.
- Experience leading a development team.
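
For illustration of the Delta tables and CDC skills the posting calls out, a minimal PySpark sketch of a CDC upsert using the Delta Lake MERGE API; paths and key columns are hypothetical:

```python
# Upsert a CDC batch into a Delta table on Databricks.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet("/mnt/landing/customers_cdc/")  # latest CDC batch
target = DeltaTable.forPath(spark, "/mnt/lake/silver/customers")

(target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```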

Posted 2 months ago

Apply

8.0 - 10.0 years

5 - 15 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid

Job Title: Azure Data Architect
Experience: 8 to 10 years
Location: Pan India
Employment Type: Full-Time
Notice Period: Immediate to 30 days
Technology: SQL, ADF, ADLS, Synapse, PySpark, Databricks, data modelling

Key Responsibilities:
- Requirement gathering and analysis.
- Design the data architecture and data model to ingest data.
- Work with different databases such as Synapse, SQL DB, and Snowflake.
- Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse.
- Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases.
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage.
- Implement data security and governance measures.
- Monitor and optimize data pipelines for performance and efficiency.
- Troubleshoot and resolve data engineering issues.
- Hands-on experience with Azure Functions and other components such as real-time streaming.
- Oversee Azure billing processes, conducting analyses to ensure cost-effectiveness and efficiency in data operations.
- Provide optimized solutions for any data engineering problem.
- Work with a variety of sources: relational databases, APIs, file systems, real-time streams, CDC, etc.
- Strong knowledge of Databricks and Delta tables.

Posted 2 months ago

Apply

6.0 - 11.0 years

12 - 22 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Warm welcome from SP Staffing Services! We are reaching out regarding a permanent opportunity.

Job Description:
Experience: 6-12 years
Location: Pan India
Skill: Azure Data Factory/SSIS

If interested, share your resume to sangeetha.spstaffing@gmail.com with the details below inline:
- Full name as per PAN:
- Mobile no:
- Alternate/WhatsApp no:
- Total experience:
- Relevant experience in Data Factory:
- Relevant experience in Synapse:
- Relevant experience in SSIS:
- Relevant experience in Python/PySpark:
- Current CTC:
- Expected CTC:
- Notice period (official):
- Notice period (negotiable)/reason:
- Date of birth:
- PAN number:
- Reason for job change:
- Offer in pipeline (current status):
- Availability for a virtual interview on weekdays between 10 AM and 4 PM (please mention a time):
- Current residential location:
- Preferred job location:
- Is your educational percentage in 10th, 12th, and UG all above 50%?
- Do you have any gaps in your education or career? If so, please mention the duration in months/years:

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Pune

Hybrid

Azure Data Engineer
Remote / Pune (Hybrid) | Full-time, Permanent
Company: Academian

Job Description:
We are seeking a skilled Data Engineer with strong experience in Microsoft Azure cloud services to design, build, and maintain robust data pipelines and architectures. In this role, you will design, implement, and maintain our data infrastructure, ensuring efficient data processing and availability throughout the organization.

Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines in Azure.
- Work with Azure services such as Azure Data Factory, Azure Data Lake, Synapse Analytics, Azure SQL, and Databricks.
- Implement and optimize data storage and retrieval solutions in the cloud.
- Ensure data quality, consistency, and governance through robust validation and monitoring.
- Develop and manage CI/CD pipelines for data workflows using tools like Azure DevOps.
- Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.
- Support and troubleshoot data issues and ensure high availability of the data infrastructure.
- Follow best practices in data security, privacy, and compliance.
- Develop and maintain data architectures (data lakes, data warehouses).
- Integrate data from a wide variety of sources (APIs, logs, third-party platforms).
- Monitor data workflows and troubleshoot data-related issues.

Required Skills & Experience:
- Bachelor's degree in computer science, information technology, or a related field.
- 5+ years of experience in data engineering or a similar role.
- Strong hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse Analytics, and Databricks.
- Proficiency in SQL, Python, and PySpark.
- Experience with data modeling, schema design, and data warehousing.
- Familiarity with CI/CD processes, version control (e.g., Git), and deployment in Azure DevOps.
- Knowledge of data governance tools and practices (e.g., Azure Purview, RBAC).
- Strong SQL skills and experience with relational databases.
- Proficiency with Apache Kafka and streaming data architectures.
- Knowledge of ETL tools and processes.
- Familiarity with DW-BI tools, Power BI.
- Strong knowledge of database systems (PostgreSQL, MySQL, NoSQL).
- Understanding of distributed systems like Kafka or MSK.

Preferred Skills:
- Experience with data visualization tools.
- Experience with NoSQL databases.
- Understanding of machine learning pipelines and workflows.

Regards,
Manisha Koul
mkoul@academian.com
www.linkedin.com/in/koul-manisha
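
The posting above lists Apache Kafka and streaming data architectures; as a hedged illustration, a minimal Spark Structured Streaming sketch that lands a Kafka topic in the lake. The broker, topic, and paths are hypothetical, and the spark-sql-kafka connector must be available on the cluster:

```python
# Stream a Kafka topic into a Bronze Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

events = (spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .option("startingOffsets", "latest")
    .load()
    .select(F.col("value").cast("string").alias("payload"),
            F.col("timestamp").alias("event_ts")))

query = (events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/lake/_checkpoints/clickstream")
    .start("/mnt/lake/bronze/clickstream"))

query.awaitTermination()
```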

Posted 2 months ago

Apply

10.0 - 12.0 years

25 - 30 Lacs

Noida, Hyderabad

Work from Office

We’re hiring an Azure Data Architect with 10+ years of experience designing end-to-end data solutions using ADF, Synapse, Databricks, Data Lake, and Python/SQL.

Posted 2 months ago

Apply

9.0 - 14.0 years

25 - 35 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role & Responsibilities

Job Overview:
We are looking for a Senior Data Engineer with strong expertise in SQL, Python, Azure Synapse, Azure Data Factory, Snowflake, and Databricks. The ideal candidate should have a solid understanding of SQL (DDL, DML, query optimization) and ETL pipelines, while demonstrating a learning mindset to adapt to evolving technologies.

Key Responsibilities:
- Collaborate with business and IT stakeholders to define business and functional requirements for data solutions.
- Design and implement scalable ETL/ELT pipelines using Azure Data Factory, Databricks, and Snowflake.
- Develop detailed technical designs, data flow diagrams, and the future-state data architecture.
- Evangelize modern data modelling practices, including entity-relationship models, star schemas, and the Kimball methodology.
- Ensure data governance, quality, and validation by working closely with quality engineering teams.
- Write, optimize, and troubleshoot complex SQL queries, including DDL, DML, and performance tuning.
- Work with Azure Synapse, Azure Data Lake, and Snowflake for large-scale data processing.
- Implement DevOps and CI/CD best practices for automated data pipeline deployments.
- Support real-time streaming data processing with Spark, Kafka, or similar technologies.
- Provide technical mentorship and guide team members on best practices in SQL, ETL, and cloud data solutions.
- Stay up to date with emerging cloud and data engineering technologies and demonstrate a continuous-learning mindset.

Posted 2 months ago

Apply

3.0 - 6.0 years

4 - 7 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office

We're Hiring: Data Governance Developer (Microsoft Purview)
Locations: Hyderabad / Indore / Ahmedabad (Work from Office)

Role Overview:
As a Data Governance Developer at Kanerika, you will be responsible for developing and managing robust metadata, lineage, and compliance frameworks using Microsoft Purview and other leading tools. You'll work closely with engineering and business teams to ensure data integrity, regulatory compliance, and operational transparency.

Key Responsibilities:
- Set up and manage Microsoft Purview: accounts, collections, RBAC, and policies.
- Integrate Purview with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake.
- Schedule and monitor metadata scanning, classification, and lineage-tracking jobs.
- Build ingestion workflows for technical, business, and operational metadata.
- Tag, enrich, and organize assets with glossary terms and metadata.
- Automate lineage, glossary, and scanning processes via REST APIs, PowerShell, ADF, and Logic Apps.
- Design and enforce classification rules for PII, PCI, and PHI.
- Collaborate with domain owners on glossary and metadata quality governance.
- Generate compliance dashboards and lineage maps in Power BI.

Tools & Technologies:
- Governance platforms: Microsoft Purview, Collibra, Atlan, Informatica Axon, IBM IG Catalog
- Integration tools: Azure Data Factory, dbt, Talend
- Automation & scripting: PowerShell, Azure Functions, Logic Apps, REST APIs
- Compliance areas in Purview: sensitivity labels, policy management, auto-labeling; Data Loss Prevention (DLP), insider risk management, records management; Compliance Manager, lifecycle management, eDiscovery, audit; DSPM, information barriers, Unified Catalog

Required Qualifications:
- 4-6 years of experience in data governance / data management.
- Hands-on with Microsoft Purview, especially lineage and classification workflows.
- Strong understanding of metadata management, glossary governance, and data classification.
- Familiarity with Azure Data Factory, dbt, and Talend.
- Working knowledge of data compliance regulations: GDPR, CCPA, SOX, HIPAA.
- Strong communication skills to collaborate across technical and non-technical teams.

Apply by sharing your resume with your current CTC, expected CTC, notice period, and preferred location.
Email your profile to: navaneetha@suzva.com
Contact: +91 90329 56160
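
As a hedged sketch of the scanning automation via REST APIs described above, the following Python snippet submits a Purview scan run; the account, data source, and scan names are hypothetical, and the api-version should be verified against the current Purview Scanning API documentation:

```python
# Trigger a Microsoft Purview scan run over the scanning data-plane REST API.
import uuid

import requests
from azure.identity import DefaultAzureCredential

ACCOUNT = "my-purview-account"      # placeholder
DATASOURCE = "AdlsGen2-datalake"    # placeholder
SCAN = "weekly-full-scan"           # placeholder

token = DefaultAzureCredential().get_token("https://purview.azure.net/.default").token

run_id = str(uuid.uuid4())
resp = requests.put(
    f"https://{ACCOUNT}.purview.azure.com/scan/datasources/{DATASOURCE}"
    f"/scans/{SCAN}/runs/{run_id}",
    params={"api-version": "2022-02-01-preview"},  # verify against current docs
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()
print("Scan run submitted:", run_id)
```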

Posted 2 months ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Lead Data Engineer - Data Management

Company Overview:
Accordion works at the intersection of sponsors and management teams throughout every stage of the investment lifecycle, providing hands-on, execution-focused support to elevate data and analytics capabilities. So, what does it mean to work at Accordion? It means joining 1,000+ analytics, data science, finance & technology experts in a high-growth, agile, and entrepreneurial environment while transforming how portfolio companies drive value. It also means making your mark on Accordion's future by embracing a culture rooted in collaboration and a firm-wide commitment to building something great, together. Headquartered in New York City with 10 offices worldwide, Accordion invites you to join our journey.

Data & Analytics (Accordion | Data & Analytics):
Accordion's Data & Analytics (D&A) team delivers cutting-edge, intelligent solutions to a global clientele, leveraging a blend of domain knowledge, sophisticated technology tools, and deep analytics capabilities to tackle complex business challenges. We partner with Private Equity clients and their portfolio companies across diverse sectors, including Retail, CPG, Healthcare, Media & Entertainment, Technology, and Logistics. The D&A team delivers data and analytical solutions designed to streamline reporting capabilities and enhance business insights across vast and complex data sets spanning sales, operations, marketing, pricing, customer strategies, and more.

Location: Hyderabad

Role Overview:
Accordion is looking for a Lead Data Engineer who will be responsible for the design, development, configuration/deployment, and maintenance of the technology stack above. The role requires an in-depth understanding of the various tools and technologies in this domain to design and implement robust, scalable solutions that address current and future client requirements at optimal cost. The Lead Data Engineer should be able to evaluate existing architectures and recommend ways to upgrade and improve their performance, for both on-premises and cloud-based solutions. A successful Lead Data Engineer possesses strong working business knowledge and familiarity with multiple tools and techniques, along with industry standards and best practices in Business Intelligence and Data Warehousing environments, as well as strong organizational, critical thinking, and communication skills.

What you will do:
- Partner with clients to understand their business and create comprehensive business requirements.
- Develop an end-to-end Business Intelligence framework based on those requirements, including recommending an appropriate architecture (on-premises or cloud), analytics, and reporting.
- Work closely with the business and technology teams to guide solution development and implementation.
- Work closely with the business teams on methodologies for developing KPIs and metrics.
- Work with the Project Manager to develop and execute project plans within the assigned schedule and timeline.
- Develop standard reports and functional dashboards based on business requirements.
- Conduct training programs and knowledge transfer sessions for junior developers when needed.
- Recommend improvements to provide optimal reporting solutions.
- Bring curiosity to learn new tools and technologies and provide forward-looking solutions for clients.

Ideally, you have:
- An undergraduate degree (B.E/B.Tech.); tier-1/tier-2 colleges preferred.
- More than 5 years of experience in a related field.
- Proven expertise in SSIS, SSAS, and SSRS (the MSBI suite).
- In-depth knowledge of databases (SQL Server, MySQL, Oracle, etc.) and a data warehouse (any one of Azure Synapse, AWS Redshift, Google BigQuery, Snowflake, etc.).
- In-depth knowledge of business intelligence tools (any one of Power BI, Tableau, Qlik, DOMO, Looker, etc.).
- A good understanding of Azure or AWS: Azure (Data Factory & Pipelines, SQL Database & Managed Instances, DevOps, Logic Apps, Analysis Services) or AWS (Glue, Aurora Database, DynamoDB, Redshift, QuickSight).
- A proven ability to take initiative and be innovative.
- An analytical mind with a problem-solving attitude.

Why explore a career at Accordion:
- High-growth environment: semi-annual performance management and promotion cycles, coupled with a strong meritocratic culture, enable a fast track to leadership responsibility.
- Cross-domain exposure: interesting and challenging work streams across industries and domains that always keep you excited, motivated, and on your toes.
- Entrepreneurial environment: intellectual freedom to make decisions and own them. We expect you to spread your wings and assume larger responsibilities.
- Fun culture and peer group: a non-bureaucratic, fun working environment with a strong peer group that will challenge you and accelerate your learning curve.

Other benefits for full-time employees:
- Health and wellness programs, including employee health insurance covering immediate family members and parents, term life insurance, free health camps, discounted health services (including vision and dental) for employees and family members, and free doctor and counsellor consultations.
- Corporate meal card options for ease of use and tax benefits.
- Team lunches, company-sponsored team outings, and celebrations.
- Cab reimbursement for women employees beyond a certain time of day.
- A robust leave policy to support work-life balance, including a specially designed leave structure to support women employees for maternity and related requests.
- A reward and recognition platform to celebrate professional and personal milestones.
- A positive and transparent work environment, with various employee engagement and benefit initiatives to support personal and professional learning and development.

Posted 2 months ago

Apply