
20 Purview Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have over 10 years of experience in data architecture, data engineering, or related roles. Your expertise should include designing and implementing enterprise-level data solutions with a hands-on technical approach, and you should have a proven track record of managing client relationships and leading technical teams.

In terms of technical skills, you must be well-versed in data modeling, data warehousing, and database design, including both relational and NoSQL databases. Strong proficiency in data engineering is required, including experience with ETL tools, data integration frameworks, and big data technologies. Hands-on experience with the Google Cloud data platform and modern data processing frameworks is crucial, as is familiarity with scripting and programming languages like Python and SQL for hands-on development and troubleshooting. Experience with data governance frameworks and solutions such as Informatica, Collibra, or Purview will be a plus.

Soft skills required for this role include exceptional client management and communication skills to confidently interact with both technical and non-technical stakeholders, proven team management and leadership abilities (including mentoring, coaching, and project management), strong analytical and problem-solving skills with a proactive, detail-oriented approach, and the ability to work collaboratively in a fast-paced, dynamic environment while driving multiple projects to completion.

Preferred certifications for this position include Professional Cloud Architect (GCP), Data Architect, Certified Data Management Professional (CDMP), or similar credentials.

Posted 1 day ago

Apply

2.0 - 9.0 years

0 Lacs

Karnataka

On-site

We are seeking Data Architects (Senior and Principal levels) to join our team. The role combines hands-on contribution, customer engagement, and technical team management.

As a Data Architect, your responsibilities will include designing, architecting, deploying, and maintaining solutions on the MS Azure platform using various Cloud & Big Data technologies. You will manage the full life cycle of Data Lake / Big Data solutions, from requirement gathering and analysis through platform selection, architecture design, and deployment. You will implement scalable solutions on the cloud and collaborate with a team of business domain experts, data scientists, and application developers to develop Big Data solutions. You will also be expected to explore and learn new technologies for creative problem solving and to mentor a team of Data Engineers.

The ideal candidate should possess strong hands-on experience implementing Data Lakes with technologies such as Azure Data Factory (ADF), ADLS, Databricks, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview. Experience with big data technologies such as Hadoop (CDH or HDP), Spark, Airflow, NiFi, Kafka, Hive, HBase, MongoDB, Neo4J, Elastic Search, Impala, and Sqoop is required. Proficiency in programming and debugging in Python and Scala/Java is essential, with experience building REST services considered beneficial. Candidates should also have experience supporting BI and Data Science teams in consuming data in a secure and governed manner, along with a good understanding of CI/CD with Git and Jenkins / Azure DevOps. Experience setting up cloud-computing infrastructure solutions, hands-on exposure to NoSQL databases, and data modelling in Hive are all highly valued.

Applicants should have a minimum of 9 years of technical experience, with at least 5 years on MS Azure and 2 years on Hadoop (CDH/HDP).

Posted 3 days ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

Job Description: Coders Brain Technology Pvt. Ltd. is seeking a skilled API Developer to join our team. As an API Developer, you will be responsible for developing and maintaining APIs using React and .Net, creating and managing SQL databases, and implementing front-end designs and user interfaces. You will play a crucial role in delivering innovative solutions to our clients.

We are looking for a candidate with a minimum of 2 years of experience as an API Developer. The ideal candidate should have strong proficiency in React, SQL, and API development with .Net, along with prior experience in front-end development.

If you are passionate about technology and have a strong background in API development, we encourage you to apply for this exciting opportunity at Coders Brain Technology Pvt. Ltd. Join our dedicated team of professionals and contribute to delivering cutting-edge solutions to our clients.

Posted 4 days ago

Apply

1.0 - 2.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Job Role: DLP Analyst (Microsoft Purview)
Experience: 1 to 3 years
Key Skills: DLP implementation, writing policies, onboarding, configuration, data classification
Notice Period: 0 to 15 days (must)
Should be willing to work in rotational shifts.
Office Address: Cyber Towers, Quadrant 3, 3rd floor, Madhapur, Hyderabad - 500081

Job Overview:
Develop and implement data loss prevention strategies, policies, and procedures to protect sensitive data from unauthorized access, disclosure, or loss. Collaborate with cross-functional teams to identify potential vulnerabilities, risks, and gaps in existing data protection measures, and provide recommendations for improvement. Design and configure DLP solutions and tools to monitor, detect, and prevent data breaches or leaks across various platforms and endpoints. Conduct regular assessments and audits to evaluate the effectiveness of data loss prevention controls and ensure compliance with applicable regulations and industry standards. Collaborate with internal stakeholders to raise awareness and educate employees on data protection best practices, policies, and procedures. Stay updated on emerging threats, trends, and technologies in the field of data security and loss prevention, and provide recommendations for proactive measures. Participate in the evaluation, selection, and implementation of new data protection technologies and tools. Prepare comprehensive reports and presentations for management, highlighting key findings, recommendations, and metrics related to data loss prevention initiatives. Prepare and maintain Standard Operating Procedures (SOPs) related to DLP, ensuring they are up to date and accessible to all relevant stakeholders. Develop and maintain a Responsibility Assignment Matrix (RACI) to clearly define roles and responsibilities for DLP initiatives, including incident response, policy enforcement, and employee training.

Skills:
Strong understanding of data security concepts, regulatory requirements (e.g., GDPR, HIPAA), and industry best practices. Experience designing and implementing data loss prevention strategies, policies, and procedures in a corporate environment. Proficiency in configuring and managing DLP technologies such as data classification, data discovery, data loss monitoring, and incident response. Familiarity with network protocols, security technologies (e.g., firewalls, intrusion detection systems), and encryption methods.
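To illustrate the kind of content-detection logic a DLP classification rule encodes, here is a minimal Python sketch of regex-based matching for a few common sensitive-data patterns. The patterns and labels are illustrative assumptions only; real Microsoft Purview policies are built from sensitive information types configured in the compliance portal, with confidence levels and proximity checks rather than bare regexes.

```python
import re

# Hypothetical sensitive-data patterns for illustration; production DLP rules
# typically add checksums, keywords, and proximity conditions to reduce false positives.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "india_pan": re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),  # PAN format: ABCDE1234F
}

def classify(text: str) -> list[str]:
    """Return the labels of every pattern that matches the text."""
    return [label for label, rx in PATTERNS.items() if rx.search(text)]

if __name__ == "__main__":
    sample = "Contact jane.doe@example.com, card 4111 1111 1111 1111, PAN ABCDE1234F"
    print(classify(sample))  # ['credit_card', 'email', 'india_pan']
```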

Posted 6 days ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

We are looking to hire a Technical Presales Engineer specializing in Microsoft Security products such as MDE, MDO, Sentinel, Purview, Intune, and Entra ID. The ideal candidate should have a strong understanding of cybersecurity principles and be proficient in using Microsoft security tools. As a Presales Engineer, you will be responsible for providing technical expertise to support sales, designing customized security solutions, conducting product demonstrations, leading Proof of Concepts (PoCs), and serving as a trusted advisor to our customers.

To excel in this role, you should possess excellent communication and presentation skills, along with a Bachelor's degree in Computer Science, Information Technology, or Cybersecurity. Relevant Microsoft certifications will be considered a plus. If you are passionate about cybersecurity and have experience in the field, reach out to us at careers@skysecure.ai to explore this exciting opportunity further.

Posted 1 week ago

Apply

6.0 - 11.0 years

20 - 35 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid

Shift timings: 2 PM - 11 PM

Primary skills: Azure Security (Defender, Sentinel, Identity, Endpoint, etc.)
Secondary skills: Azure Infrastructure, Office 365 collaboration workloads

Required Skills & Experience:
Technical Expertise: Strong understanding of Azure security offerings, including but not limited to Microsoft Defender for Cloud / Endpoint / Identity, Microsoft Sentinel (SIEM/SOAR), and Microsoft Entra (Identity Governance, Conditional Access). Hands-on experience with cloud security assessments, PoC deployments, and client workshops. Familiarity with Zero Trust architecture and related best practices.
Professional Experience: 5+ years in IT security roles, with 2+ years focused on Azure or cloud security. Proven track record of leading technical engagements independently.
Soft Skills: Excellent communication and presentation skills. Ability to articulate technical concepts to both technical and business audiences. Self-starter who thrives in a fast-paced, client-facing environment.
Preferred Qualifications: Microsoft certifications (e.g., SC-100, AZ-500, SC-200). Experience working with Microsoft partners or within funded engagement programs. Exposure to regulatory compliance frameworks (e.g., ISO, NIST, GDPR).

Key Responsibilities:
Client Engagements: Conduct security assessments and discovery workshops to understand client environments, security gaps, and cloud readiness. Deliver technical Proof of Concepts (PoCs) and hands-on demonstrations of Microsoft Azure security solutions. Host and facilitate technical workshops on Zero Trust, Microsoft Defender, Sentinel, Entra, and related technologies. Provide technology walkthroughs, highlight use cases, and share practical experience to illustrate business value.
Solution Design & Implementation: Design and recommend secure architectures and configurations using Azure-native tools and services. Collaborate on solution development, documentation, and client readiness for security modernization.
Internal & Cross-Functional Collaboration: Work closely with Sales, PreSales, and regional delivery teams to align on customer needs, technical strategy, and success metrics. Contribute to proposal development and client presentations from a technical security standpoint.
Thought Leadership & Enablement: Stay updated on Azure security advancements and share knowledge internally and with clients. Support internal enablement sessions and mentor junior team members, where applicable.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

We are looking for an experienced Data Governance Architect with deep expertise in Alation and Azure cloud platforms. This role involves partnering with senior stakeholders to define and champion an enterprise data catalog and dictionary strategy, and overseeing the entire lifecycle of the data catalog, from establishing metadata standards and initial MVPs to executing full-scale enterprise rollouts.

You should have at least 10 years of experience in data governance and proven expertise with the Alation tool on the Azure platform; an understanding of the Snowflake platform is also required. Additionally, you should have proven expertise in at least two areas such as Data Governance Operating Models, Metadata Management, Data Cataloging, Data Lineage, or Data Quality. A deep understanding of governance frameworks such as DAMA or DCAM, with practical implementation experience, is essential.

In this role, you will be responsible for assessing current cataloging and dictionary capabilities, identifying gaps, and developing actionable roadmaps to enrich metadata quality, accelerate catalog population, and drive adoption. You will identify the different data personas using the data catalog and design persona-specific playbooks to promote adoption. Your responsibilities will include designing, deploying, and managing scalable data catalog and dictionary solutions using platforms like Alation; familiarity with leading data governance tools such as Collibra and Purview will be beneficial. You will oversee the entire lifecycle of the data catalog, from establishing metadata standards and initial MVPs to executing full-scale enterprise rollouts.

Furthermore, you will define the architecture and best practices for metadata management to ensure consistency, scalability, and sustainability of the catalog and dictionary. You will identify and catalog critical data elements by capturing clear business terms, glossaries, KPIs, lineage, and persona-specific guides to build a trusted, comprehensive data dictionary. Developing and enforcing policies to maintain metadata quality, manage access, and protect sensitive information within the catalog will be part of your responsibilities. You will implement robust processes for catalog population, including automated metadata ingestion leveraging APIs, glossary management, lineage tracking, and data classification. You will also develop a workflow management approach to notify stewards of changes to certified catalog content, and create reusable frameworks and templates for data definitions and best practices to streamline catalog adoption across teams.
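As a rough illustration of the "automated metadata ingestion" responsibility above, the Python sketch below pushes one table's business metadata to a generic catalog REST endpoint. The endpoint path, payload shape, and token variable are hypothetical placeholders, not Alation's or Purview's actual API; a real integration would use the vendor's documented SDK or REST interface.

```python
import os
import requests

# Hypothetical catalog endpoint and auth token; replace with the vendor's real API.
CATALOG_URL = os.environ.get("CATALOG_URL", "https://catalog.example.com/api/v1/assets")
TOKEN = os.environ.get("CATALOG_TOKEN", "<api-token>")

def publish_table_metadata(schema: str, table: str, description: str, columns: list[dict]) -> None:
    """Upsert one table's business metadata into the catalog."""
    payload = {
        "qualified_name": f"{schema}.{table}",
        "description": description,
        "columns": columns,  # e.g., [{"name": "customer_id", "description": "Natural key"}]
        "classification": "internal",
    }
    resp = requests.post(
        CATALOG_URL,
        json=payload,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    publish_table_metadata(
        "sales",
        "orders",
        "Confirmed customer orders, refreshed nightly",
        [{"name": "order_id", "description": "Surrogate key"}],
    )
```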

Posted 1 week ago

Apply

2.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Tiger Analytics is a global AI and analytics consulting firm with a team of over 2800 professionals focused on using data and technology to solve complex problems that impact millions of lives worldwide. Our culture is centered around expertise, respect, and a team-first mindset. Headquartered in Silicon Valley, we have delivery centers globally and offices in various cities across India, the US, UK, Canada, and Singapore, along with a significant remote workforce. Tiger Analytics is certified as a Great Place to Work. Joining our team means being at the forefront of the AI revolution, working with innovative teams that push boundaries and create inspiring solutions.

We are currently looking for an Azure Big Data Engineer to join our team in Chennai, Hyderabad, or Bangalore. As a Big Data Engineer (Azure), you will be responsible for building and implementing analytics solutions and platforms on Microsoft Azure using a range of Open Source, Big Data, and Cloud technologies. Your typical day might involve designing and building scalable data ingestion pipelines, processing structured and unstructured data, orchestrating pipelines, collaborating with teams and stakeholders, and making critical tech-related decisions.

To be successful in this role, we expect you to have 4 to 9 years of total IT experience, with at least 2 years in big data engineering and Microsoft Azure. You should be proficient in technologies such as Azure Data Factory (ADF), PySpark, Databricks, ADLS, Azure SQL Database, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview. Strong coding skills in SQL and Python or Scala/Java are essential, as is experience with big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4J, and Elastic Search. Knowledge of file formats such as Delta Lake, Avro, Parquet, JSON, and CSV is also required. Ideally, you should have experience building REST APIs, working on Data Lake or Lakehouse projects, supporting BI and Data Science teams, and following Agile and DevOps processes. Certifications like Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Developer (DE) would be a valuable addition to your profile.

At Tiger Analytics, we value diversity and inclusivity, and we encourage individuals with different skills and qualities to apply, even if they do not meet all the criteria for the role. We are committed to providing equal opportunities and fostering a culture of listening, trust, respect, and growth. Please note that the job designation and compensation will be based on your expertise and experience, and our compensation packages are competitive within the industry. If you are passionate about leveraging data and technology to drive impactful solutions, we would love to stay connected with you.
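For context on the "scalable data ingestion pipelines" mentioned above, here is a minimal PySpark sketch of a batch ingestion step that lands raw CSV files into a Delta table. The paths, schema, and table names are illustrative assumptions; a production pipeline on Azure would typically read from ADLS Gen2 and be orchestrated by ADF or Databricks Workflows.

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative paths; in Azure these would usually be ADLS Gen2 URIs (abfss://...).
RAW_PATH = "/mnt/raw/orders/*.csv"
BRONZE_TABLE = "bronze.orders"

spark = SparkSession.builder.appName("orders-ingestion").getOrCreate()

# Read raw CSVs, stamp each record with its source file and load time,
# then append to a Delta table for downstream processing.
df = (
    spark.read.option("header", "true").csv(RAW_PATH)
    .withColumn("_source_file", F.input_file_name())
    .withColumn("_ingested_at", F.current_timestamp())
)

df.write.format("delta").mode("append").saveAsTable(BRONZE_TABLE)
```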

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Maharashtra

On-site

As a Cloud & AI Solution Engineer at Microsoft, you will be part of a dynamic team that is at the forefront of innovation in databases and analytics. Your role will involve working on cutting-edge projects that leverage the latest technologies to drive meaningful impact for commercial customers. If you are insatiably curious and deeply passionate about tackling complex challenges in the era of AI, this is the perfect opportunity for you.

In this role, you will play a pivotal part in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack. You will collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics. Your responsibilities will include hands-on engagements such as Proof of Concepts, hackathons, and architecture workshops to guide customers through secure, scalable solution design and accelerate database and analytics migration into their deployment workflows.

To excel in this position, you should have at least 10 years of technical pre-sales or technical consulting experience, or a Bachelor's/Master's degree in Computer Science or a related field with 4+ years of technical pre-sales experience. You should be an expert in Azure Databases (SQL DB, Cosmos DB, PostgreSQL) and Azure Analytics (Fabric, Azure Databricks, Purview), as well as competitors in the data warehouse, data lake, big data, and analytics space. You should also have experience with cloud and hybrid infrastructure, architecture design, migrations, and technology management.

As a trusted technical advisor, you will guide customers through solution design, influence technical decisions, and help them modernize their data platform to realize the full value of Microsoft's platform. You will drive technical sales, lead hands-on engagements, build trusted relationships with platform leads, and maintain deep expertise in Microsoft's Analytics portfolio and Azure Databases.

By joining our team, you will have the opportunity to accelerate your career growth, develop deep business acumen, and hone your technical skills. You will be part of a collaborative and creative team that thrives on continuous learning and flexible work opportunities. If you are ready to take on this exciting challenge and be part of a team that is shaping the future of cloud Database & Analytics, we invite you to apply and join us on this journey.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

We are seeking an experienced Data Governance Architect with specialized knowledge of Alation and Azure cloud platforms. In this role, you will collaborate with senior stakeholders to establish and advocate for an enterprise data catalog and dictionary strategy. Your responsibilities will encompass the complete data catalog lifecycle, from defining metadata standards and initial MVPs to executing large-scale enterprise rollouts.

To qualify for this position, you should have over 10 years of experience in data governance and demonstrate proficiency with the Alation tool on the Azure platform; familiarity with the Snowflake platform is also required. Expertise in at least two of the following areas is essential: Data Governance Operating Models, Metadata Management, Data Cataloging, Data Lineage, or Data Quality. A deep understanding of governance frameworks like DAMA or DCAM, along with practical implementation experience, is crucial. You must also possess strong capabilities in conducting maturity assessments, gap analyses, and delivering strategic roadmaps, as well as excellent communication skills for articulating complex topics clearly and producing precise documentation.

Key Responsibilities:
- Evaluate existing cataloging and dictionary capabilities, identify gaps, and create roadmaps to enhance metadata quality, speed up catalog population, and foster adoption.
- Recognize the various data personas using the data catalog and develop persona-specific playbooks to encourage adoption.
- Plan, implement, and supervise scalable data catalog and dictionary solutions using platforms such as Alation.
- Understand leading data governance tools like Collibra and Purview.
- Supervise the entire data catalog lifecycle, including setting metadata standards, developing initial MVPs, and executing large-scale enterprise rollouts.
- Define architecture and best practices for metadata management to ensure catalog and dictionary consistency, scalability, and sustainability.
- Identify and categorize critical data elements by documenting clear business terms, glossaries, KPIs, lineage, and persona-specific guides to construct a reliable data dictionary.
- Establish and enforce policies to uphold metadata quality, regulate access, and safeguard sensitive information within the catalog.
- Implement robust processes for catalog population through automated metadata ingestion, API utilization, glossary management, lineage tracking, and data classification.
- Create a workflow management approach to notify stewards of changes to certified catalog content.
- Develop reusable frameworks and templates for data definitions and best practices to streamline catalog adoption across teams.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

You will be responsible for leading the design and implementation of an Azure-based digital and AI platform that facilitates scalable and secure product delivery across IT and OT domains. In collaboration with the Enterprise Architect, you will shape the platform architecture to ensure alignment with the overall digital ecosystem. Your role will involve integrating OT sensor data from PLCs, SCADA, and IoT devices into a centralized and governed Lakehouse environment, bridging plant-floor operations with cloud innovation.

Key Responsibilities:
- Architect and implement the Azure digital platform utilizing IoT Hub, IoT Edge, Synapse, Databricks, and Purview.
- Work closely with the Enterprise Architect to ensure that platform capabilities align with the broader enterprise architecture and digital roadmap.
- Design data ingestion flows and edge-to-cloud integration from OT systems such as SCADA, PLC, MQTT, and OPC-UA.
- Establish platform standards for data ingestion, transformation (Bronze, Silver, Gold), and downstream AI/BI consumption.
- Ensure security, governance, and compliance in accordance with standards like ISA-95 and the Purdue Model.
- Lead the technical validation of platform components and provide guidance on platform scaling across global sites.
- Implement microservices architecture patterns using containers (Docker) and orchestration (Kubernetes) to enhance platform modularity and scalability.

Requirements:
- A minimum of 8 years of experience in architecture or platform engineering roles.
- Demonstrated hands-on expertise with Azure services including Data Lake, Synapse, Databricks, IoT Edge, and IoT Hub.
- Deep understanding of industrial data protocols such as OPC-UA, MQTT, and Modbus.
- Proven track record of designing IT/OT integration solutions in manufacturing environments.
- Familiarity with Medallion architecture, time-series data, and Azure security best practices.
- TOGAF or Azure Solutions Architect certification is mandatory for this role.
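As a rough sketch of the Bronze/Silver transformation standard referenced above, the PySpark snippet below cleans raw OT sensor readings landed in a Bronze Delta table and writes a curated Silver table. The table names, columns, and quality rules are assumptions for illustration, not the employer's actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ot-bronze-to-silver").getOrCreate()

# Bronze: raw sensor messages as ingested from MQTT/OPC-UA via IoT Hub (assumed schema:
# device_id, tag, value as string, event_ts as string). Silver: typed, de-duplicated records.
bronze = spark.table("bronze.ot_sensor_readings")

silver = (
    bronze
    .withColumn("value", F.col("value").cast("double"))
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("value").isNotNull() & F.col("event_ts").isNotNull())
    .dropDuplicates(["device_id", "tag", "event_ts"])
)

(
    silver.write.format("delta")
    .mode("overwrite")
    .partitionBy("device_id")
    .saveAsTable("silver.ot_sensor_readings")
)
```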

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 12 Lacs

Gurgaon, Haryana, India

On-site

Insight Direct India is looking for a proactive and client-focused Consultant II - Purview to join our team. In this role, you'll be instrumental in delivering on client visions through meticulous process management, development, and lifecycle guidance. You'll work directly with stakeholders and team members, defining, designing, and delivering actionable insights within an agile environment, with a specific focus on data security, governance, and Microsoft technologies.

As a Consultant II, you will:
- Engage with potential clients to understand their needs and propose relevant workshops tailored to their specific requirements, industry demands, and their current level of adoption and understanding of Security, Copilot, or Azure.
- Maintain up-to-date knowledge of Microsoft funding programs, including understanding workshop entry requirements and staying informed about ongoing changes to deliverables.
- Maintain up-to-date knowledge of Microsoft technologies (M365, Security, Copilot), industry trends, and the roadmap of upcoming feature updates.
- Follow the relevant paperwork process once an eligible workshop is identified, including progressing opportunities in Partner Centre, Salesforce, and Qorus.
- Demonstrate strong objection-handling skills, particularly for Copilot, to address clients' concerns and misconceptions and persuasively advocate for Insight's proposed approach.
- Develop a strategic mindset by understanding the client's overall roadmap and identifying how different workshops can support their long-term goals, being curious about their motivations and how to unlock further opportunities.
- Communicate effectively across different levels, with both technical personnel and executives.

What We're Looking For:
- Experience: 1-3 years of experience in pre-sales activities or training sessions in Data Security, Governance, M365, Purview, or other relevant Microsoft technologies.
- Business Acumen: Strong understanding of how pre-sales activities drive business for Insight.
- Communication Skills: Excellent communication, presentation, and instructional skills.
- Teamwork & Independence: Ability to work independently and as part of a team.
- Travel: Willingness to travel as needed.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Greetings from Teknikoz! With over 3 years of experience, you will be responsible for maintaining and improving data security controls, managing security policies using security tools, and monitoring auditing and logging on all cloud databases. Additionally, you will oversee the day-to-day governance of BigID, Purview, and Palo Alto, including configuration tuning, policy management, and defining execution criteria. You will collaborate with security teams to optimize control systems based on business needs, work on integrations using APIs and connectors, and provide daily support for security controls. Your role will also involve supporting configuration, rules, and policies across the enterprise and assisting in security incident response alongside enterprise CSIRT & SOC teams.

The ideal candidate will bring to the table:
- Working knowledge of BigID, Palo Alto, and Purview for over 3 years
- Proficiency in SaaS technology & services
- Strong grasp of data classification concepts and best practices
- Expertise in at least one major cloud provider (AWS, Azure, GCP)
- Ability to document security governance processes and procedures in the team run book
- Strong communication and collaboration skills, with the capability to work effectively across multiple teams
- Customer/client-focused approach, ensuring superior customer experience and long-term relationships
- Strategic thinking, sound judgment, and the ability to balance short- and long-term risk decisions
- Proactive, self-motivated, and capable of working independently, while being comfortable with challenge and escalation when necessary

If you are ready to take on this challenging yet rewarding role, we look forward to receiving your application!

Posted 2 weeks ago

Apply

8.0 - 10.0 years

10 - 20 Lacs

Pune

Remote

Job Summary: We are seeking an experienced Azure Data Governance Specialist to design, implement, and manage data governance frameworks and infrastructure across Azure-based platforms. The ideal candidate will ensure enterprise data is high-quality, secure, compliant, and aligned with business and regulatory requirements. This role combines deep technical expertise in Azure with a strong understanding of data governance principles, MDM, and data quality management.

Key Responsibilities:
- Data Governance & Compliance: Design and enforce data governance policies, standards, and frameworks aligned with enterprise objectives and compliance requirements (e.g., GDPR, HIPAA).
- Master Data Management (MDM): Implement and manage MDM strategies and solutions within the Azure ecosystem to ensure consistency, accuracy, and accountability of key business data.
- Azure Data Architecture: Develop and maintain scalable data architecture on Azure (e.g., Azure Data Lake, Synapse, Purview, Alation, Anomalo) to support governance needs.
- Tooling & Automation: Deploy and manage Azure-native data governance tools such as Azure Purview, Microsoft Fabric, and Data Factory to classify, catalog, and monitor data assets, including third-party tools like Alation.
- Data Quality (DQ): Lead and contribute to Data Quality forums, establish DQ metrics, and integrate DQ checks and dashboards within Azure platforms.
- Security & Access Management: Collaborate with security teams to implement data security measures, role-based access controls, and data encryption in accordance with Azure best practices.
- Technical Leadership: Guide teams in best practices for designing data pipelines, metadata management, and lineage tracking with Azure tooling.
- Continuous Improvement: Drive improvements in data management processes and tooling to enhance governance efficiency and compliance posture.
- Mentorship & Collaboration: Provide technical mentorship to data engineers and analysts, promoting data stewardship and governance awareness across the organization.

Qualifications:
- Education: Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience: 8+ years of experience in data infrastructure and governance, with 3+ years focused on Azure data services and tools.
- Technical Skills: Proficiency with data governance tools (Alation, Purview, Synapse, Data Factory, Azure SQL, etc.); strong understanding of data modeling (conceptual, logical, and physical models); experience with programming languages such as Python, C#, or Java; in-depth knowledge of SQL and metadata management.
- Leadership: Proven experience leading or influencing cross-functional teams in data governance and architecture initiatives.
- Certifications (preferred): Azure Data Engineer Associate, Azure Solutions Architect Expert, or Purview-related certifications.
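To make the "DQ checks and dashboards" responsibility more concrete, here is a minimal PySpark sketch that computes a few common data-quality metrics for a table. The table name, columns, and thresholds are illustrative assumptions; a real implementation might instead use a framework such as Great Expectations or Purview's data quality features.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Illustrative target table and rules; real rules would come from a governed DQ catalog.
df = spark.table("silver.customers")
total = df.count()

metrics = {
    # Completeness: share of rows with a non-null email address.
    "email_completeness": df.filter(F.col("email").isNotNull()).count() / total,
    # Uniqueness: share of distinct customer_id values (1.0 means no duplicates).
    "customer_id_uniqueness": df.select("customer_id").distinct().count() / total,
    # Validity: share of rows whose country code is exactly two characters.
    "country_code_validity": df.filter(F.length("country_code") == 2).count() / total,
}

failures = {name: score for name, score in metrics.items() if score < 0.95}
print(metrics)
if failures:
    raise ValueError(f"Data-quality thresholds breached: {failures}")
```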

Posted 2 weeks ago

Apply

2.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Tiger Analytics is a global AI and analytics consulting firm at the forefront of solving complex problems using data and technology. With a team of over 2800 experts spread across the globe, we are dedicated to making a positive impact on the lives of millions worldwide. Our culture is built on expertise, respect, and collaboration, with a focus on teamwork. While our headquarters are in Silicon Valley, we have delivery centers and offices in various cities in India, the US, UK, Canada, and Singapore, as well as a significant remote workforce.

As an Azure Big Data Engineer at Tiger Analytics, you will be part of a dynamic team that is driving an AI revolution. Your typical day will involve working on a variety of analytics solutions and platforms, including data lakes, modern data platforms, and data fabric solutions, using Open Source, Big Data, and Cloud technologies on Microsoft Azure. Your responsibilities may include designing and building scalable data ingestion pipelines, executing high-performance data processing, orchestrating pipelines, designing exception-handling mechanisms, and collaborating with cross-functional teams to bring analytical solutions to life.

To excel in this role, we expect you to have 4 to 9 years of total IT experience, with at least 2 years in big data engineering and Microsoft Azure. You should be well-versed in technologies such as Azure Data Factory, PySpark, Databricks, Azure SQL Database, Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview. Your passion for writing high-quality, scalable code and your ability to collaborate effectively with stakeholders are essential for success in this role. Experience with big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, and Neo4J, as well as knowledge of different file formats and REST API design, will be advantageous.

At Tiger Analytics, we value diversity and inclusivity, and we encourage individuals with varying skills and backgrounds to apply. We are committed to providing equal opportunities for all our employees and fostering a culture of trust, respect, and growth. Your compensation package will be competitive and aligned with your expertise and experience. If you are looking to be part of a forward-thinking team that is pushing the boundaries of what is possible in AI and analytics, we invite you to join us at Tiger Analytics and be a part of our exciting journey towards building innovative solutions that inspire and energize.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

14 - 24 Lacs

Hyderabad, Chennai

Hybrid

Background in data analysis. Experience training Artificial Intelligence models, with a special focus on prompt engineering. Two years of related work experience in Data Loss Prevention (DLP) and/or cybersecurity; for DLP, experience providing recommendations for tuning content detection rules to improve accuracy is a plus. Able to learn and apply new concepts quickly, with proven analytical and problem-solving abilities and strong communication skills.

Responsibilities:
- Train Machine Learning (AI) models in answering Data Loss Prevention topics.
- Be an advocate for users of Artificial Intelligence (AI): understand and give prompts from the user's perspective.
- Generate metrics for AI responses.
- Evaluate AI responses and generate comprehensive feedback about the prompt and the responses.
- Document and articulate the AI feedback into guide books.
- Communicate the AI feedback across stakeholders and drive continuous improvements.

Candidates should have domain knowledge in using the DLP module of compliance tools such as MS Purview, Proofpoint, Forcepoint, Symantec DLP, or others.
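For the "generate metrics for AI responses" responsibility above, here is a minimal, hypothetical Python sketch of a rubric-style scorer. The rubric keywords and scoring scheme are assumptions purely for illustration, not the employer's evaluation method.

```python
# Hypothetical rubric: each DLP prompt lists the concepts a good answer should mention.
RUBRIC = {
    "How do I tune a credit-card detection rule that fires on order numbers?":
        ["confidence level", "proximity", "keyword", "false positive"],
    "What actions can a DLP policy take on an outbound email?":
        ["block", "encrypt", "notify", "audit"],
}

def score_response(prompt: str, response: str) -> dict:
    """Score one AI response as the fraction of expected rubric concepts it covers."""
    expected = RUBRIC[prompt]
    text = response.lower()
    hits = [concept for concept in expected if concept in text]
    return {
        "prompt": prompt,
        "coverage": len(hits) / len(expected),
        "missing": [c for c in expected if c not in hits],
    }

if __name__ == "__main__":
    result = score_response(
        "What actions can a DLP policy take on an outbound email?",
        "The policy can block the message, notify the sender, and write an audit record.",
    )
    print(result)  # coverage 0.75, missing ['encrypt']
```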

Posted 1 month ago

Apply

2.0 - 4.0 years

0 - 2 Lacs

Hyderabad, Bangalore Rural, Bengaluru

Work from Office

Hi, we are hiring a Data Discovery and Classification Engineer for the Bangalore location. Please find the details of the opportunity below.

Job Description Summary: We are seeking a skilled Data Discovery & Classification Engineer to join our data management team. The ideal candidate will be responsible for identifying, classifying, and protecting sensitive data across the organization, ensuring compliance with data protection regulations and internal policies.

Key Responsibilities:
1. Onboarding Data Sources: Work with IT and data teams to integrate new data sources into the existing data management platform (e.g., Teams, OneDrive, file storage, APIs). Monitor data source health and connectivity and resolve any ingestion issues. Automate inventory updates where possible using scripts or tools. Create and maintain detailed documentation for each data source record, including metadata and data lineage, along with the onboarding process.
2. Maintaining the Inventory of Data Sources: Maintain an up-to-date inventory of all data sources, ensuring each source is accurately cataloged and classified. Regularly monitor data sources for changes or updates, ensuring the inventory reflects the current state of data assets. Generate reports on the status and progress of data sources, highlighting any issues or areas for improvement.
3. Data Classification: Create and maintain classification categories. Apply appropriate tags and labels to data to facilitate easy retrieval and management. Adjust classification rules based on changing business needs and policies.
4. Metrics and Reporting: Develop dashboards and reports to track onboarding status, classification accuracy, and data inventory health. Track key performance indicators (KPIs) such as the number of data sources onboarded and the percentage of classified vs. unclassified data.
5. Automation and Process Improvement: Identify opportunities to automate data onboarding and classification processes. Create and maintain scripts to automate data extraction and classification. Propose and implement improvements to data ingestion pipelines and workflows.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Discovery & Classification Engineer or in a similar role.
- Strong knowledge of data discovery and classification tools and technologies (e.g., Congruity 360, Microsoft Purview, BigID).
- Excellent analytical and problem-solving skills.
- Ability to work independently and as part of a team.
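As an illustration of the inventory-automation and KPI-tracking duties above, here is a small Python sketch that reads a hypothetical data-source inventory file and reports the classified-vs-unclassified split. The file name, columns, and labels are assumptions for illustration only.

```python
import csv
from collections import Counter

INVENTORY_FILE = "data_source_inventory.csv"  # assumed columns: source_name, owner, classification

def inventory_kpis(path: str) -> dict:
    """Compute simple KPIs: total sources and the percentage classified vs. unclassified."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    labels = Counter(
        (row.get("classification") or "unclassified").strip().lower() for row in rows
    )
    total = len(rows)
    classified = total - labels.get("unclassified", 0)
    return {
        "total_sources": total,
        "pct_classified": round(100 * classified / total, 1) if total else 0.0,
        "by_label": dict(labels),
    }

if __name__ == "__main__":
    print(inventory_kpis(INVENTORY_FILE))
```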

Posted 1 month ago

Apply

0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Senior Data Engineer (Databricks, Azure & Mosaic AI).

Role Summary: We are seeking a Senior Data Engineer with extensive expertise in Data & Analytics platform modernization using Databricks, Azure, and Mosaic AI. This role will focus on designing and optimizing cloud-based data architectures, leveraging AI-driven automation to enhance data pipelines, governance, and processing at scale.

Key Responsibilities:
- Architect and modernize Data & Analytics platforms using Databricks on Azure.
- Design and optimize Lakehouse architectures integrating Azure Data Lake, Databricks Delta Lake, and Synapse Analytics.
- Implement Mosaic AI for AI-driven automation, predictive analytics, and intelligent data engineering solutions.
- Lead the migration of legacy data platforms to a modern cloud-native Data & AI ecosystem.
- Develop high-performance ETL pipelines, integrating Databricks with Azure services such as Data Factory, Synapse, and Purview.
- Utilize MLflow and Mosaic AI for AI-enhanced data processing and decision-making.
- Establish data governance, security, lineage tracking, and metadata management across modern data platforms.
- Work collaboratively with business leaders, data scientists, and engineers to drive innovation.
- Stay at the forefront of emerging trends in AI-powered data engineering and modernization strategies.

Qualifications we seek in you! Minimum Qualifications:
- Experience in Data Engineering, Cloud Platforms, and AI-driven automation.
- Expertise in Databricks (Apache Spark, Delta Lake, MLflow) and Azure (Data Lake, Synapse, ADF, Purview).
- Strong experience with Mosaic AI for AI-powered data engineering and automation.
- Advanced proficiency in SQL, Python, and Scala for big data processing.
- Experience in modernizing Data & Analytics platforms, migrating from on-prem to cloud.
- Knowledge of Data Lineage, Observability, and AI-driven Data Governance frameworks.
- Familiarity with Vector Databases and Retrieval-Augmented Generation (RAG) architectures for AI-powered data analytics.
- Strong leadership, problem-solving, and stakeholder management skills.

Preferred Skills:
- Experience with Knowledge Graphs (Neo4J, TigerGraph) for data structuring.
- Exposure to Kubernetes, Terraform, and CI/CD for scalable cloud deployments.
- Background in streaming technologies (Kafka, Spark Streaming, Kinesis).

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 1 month ago

Apply

5.0 - 10.0 years

18 - 22 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid

Shift timings: 2 PM - 11 PM

Primary skills: Azure Security (Defender, Sentinel, Identity, Endpoint, etc.)
Secondary skills: Azure Infrastructure, Office 365 collaboration workloads

Required Skills & Experience:
Technical Expertise: Strong understanding of Azure security offerings, including but not limited to Microsoft Defender for Cloud / Endpoint / Identity, Microsoft Sentinel (SIEM/SOAR), and Microsoft Entra (Identity Governance, Conditional Access). Hands-on experience with cloud security assessments, PoC deployments, and client workshops. Familiarity with Zero Trust architecture and related best practices.
Professional Experience: 5+ years in IT security roles, with 2+ years focused on Azure or cloud security. Proven track record of leading technical engagements independently.
Soft Skills: Excellent communication and presentation skills. Ability to articulate technical concepts to both technical and business audiences. Self-starter who thrives in a fast-paced, client-facing environment.
Preferred Qualifications: Microsoft certifications (e.g., SC-100, AZ-500, SC-200). Experience working with Microsoft partners or within funded engagement programs. Exposure to regulatory compliance frameworks (e.g., ISO, NIST, GDPR).

Key Responsibilities:
Client Engagements: Conduct security assessments and discovery workshops to understand client environments, security gaps, and cloud readiness. Deliver technical Proof of Concepts (PoCs) and hands-on demonstrations of Microsoft Azure security solutions. Host and facilitate technical workshops on Zero Trust, Microsoft Defender, Sentinel, Entra, and related technologies. Provide technology walkthroughs, highlight use cases, and share practical experience to illustrate business value.
Solution Design & Implementation: Design and recommend secure architectures and configurations using Azure-native tools and services. Collaborate on solution development, documentation, and client readiness for security modernization.
Internal & Cross-Functional Collaboration: Work closely with Sales, PreSales, and regional delivery teams to align on customer needs, technical strategy, and success metrics. Contribute to proposal development and client presentations from a technical security standpoint.
Thought Leadership & Enablement: Stay updated on Azure security advancements and share knowledge internally and with clients. Support internal enablement sessions and mentor junior team members, where applicable.

Posted 2 months ago

Apply

8.0 - 10.0 years

13 - 15 Lacs

Pune

Work from Office

We are seeking a hands-on Lead Data Engineer to drive the design and delivery of scalable, secure data platforms on Google Cloud Platform (GCP). In this role you will own architectural decisions, guide service selection, and embed best practices across data engineering, security, and performance disciplines. You will partner with data modelers, analysts, security teams, and product owners to ensure our pipelines and datasets serve analytical, operational, and AI/ML workloads with reliability and cost efficiency. Familiarity with Microsoft Azure data services (Data Factory, Databricks, Synapse, Fabric) is valuable, as many existing workloads will transition from Azure to GCP.

Key Responsibilities:
- Lead end-to-end development of high-throughput, low-latency data pipelines and lakehouse solutions on GCP (BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Composer, Dataplex, etc.).
- Define reference architectures and technology standards for data ingestion, transformation, and storage.
- Drive service-selection trade-offs (cost, performance, scalability, and security) across streaming and batch workloads.
- Conduct design reviews and performance-tuning sessions; ensure adherence to partitioning, clustering, and query-optimization standards in BigQuery.
- Contribute to the long-term cloud data strategy, evaluating emerging GCP features and multi-cloud patterns (Azure Synapse, Data Factory, Purview, etc.) for future adoption.
- Lead code reviews and oversee the development activities delegated to Data Engineers.
- Implement best practices recommended by Google Cloud.
- Provide effort estimates for data engineering activities.
- Participate in discussions to migrate existing Azure workloads to GCP and provide solutions to migrate the workloads for selected data pipelines.

Must-Have Skills:
- 8-10 years in data engineering, with 3+ years leading teams or projects on GCP.
- Expert in GCP data services (BigQuery, Dataflow/Apache Beam, Dataproc/Spark, Pub/Sub, Cloud Storage) and orchestration with Cloud Composer or Airflow.
- Proven track record designing and optimizing large-scale ETL/ELT pipelines (streaming + batch).
- Strong fluency in SQL and one major programming language (Python, Java, or Scala).
- Deep understanding of data lake / lakehouse architectures, dimensional and data-vault modeling, and data governance frameworks.
- Excellent communication and stakeholder-management skills; able to translate complex technical topics to non-technical audiences.

Nice-to-Have Skills:
- Hands-on experience with Microsoft Azure data services (Azure Synapse Analytics, Data Factory, Event Hub, Purview).
- Experience integrating ML pipelines (Vertex AI, Dataproc ML) or real-time analytics (BigQuery BI Engine, Looker).
- Familiarity with open-source observability stacks (Prometheus, Grafana) and FinOps tooling for cloud cost optimization.

Preferred Certifications:
- Google Professional Data Engineer (strongly preferred) or Google Professional Cloud Architect
- Microsoft Certified: Azure Data Engineer Associate (nice to have)

Education: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related technical field. Equivalent professional experience will be considered.
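To ground the BigQuery partitioning and clustering standards mentioned above, here is a small Python sketch that creates a date-partitioned, clustered table with the google-cloud-bigquery client. The project, dataset, and column names are illustrative assumptions only.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

# Illustrative DDL: partition by event date and cluster by customer to prune scans
# for the most common filter patterns (date range plus customer lookups).
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.orders`
(
  order_id STRING,
  customer_id STRING,
  amount NUMERIC,
  event_ts TIMESTAMP
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id
"""

client.query(ddl).result()  # waits for the DDL job to finish
```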

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
