
13 Lineage Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

10.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About the Company
Brillio Technologies is a forward-thinking company dedicated to leveraging data to drive innovation and deliver impactful solutions. Our mission is to empower businesses through data-centric strategies, fostering a culture of collaboration and excellence.

About the Role
We're looking for a Senior Product & Delivery Manager with a strong background in data-driven products, agile development, and end-to-end delivery management. This is a high-impact role driving product innovation and execution across complex technology and business landscapes. If you're passionate about building data-centric solutions and love working across cross-functional teams, this is your opportunity.

Responsibilities
- Lead the full project and delivery lifecycle: requirements, scope, planning, budgeting, resourcing, and stakeholder engagement
- Drive product management activities including backlog grooming, use-case capture, and roadmap definition
- Apply Design Thinking, user journey mapping, and value stream analysis to shape new features
- Collaborate closely with tech and business teams to align development with strategy
- Champion Agile/Scrum methodologies for effective MVP and product rollout
- Define and track KPIs and customer value metrics for data-driven prioritization
- Guide execution of data initiatives: lineage, metadata management, data catalogs, data governance, and more
- Bridge business needs and technical solutions with clarity and confidence

Qualifications
- 10+ years in product and delivery management in tech-centric environments
- 6+ years in hands-on product management and data/business analysis within Agile settings
- Deep experience with data architectures and governance tools (lineage, metadata, catalogs, taxonomies)
- Strong technical acumen and proven program management skills
- Confident communicator, equally comfortable with dev teams and senior stakeholders
- Ability to drive UX and data visualization solutions that empower users
- Creative, analytical, and proactive mindset
- Domain knowledge in Banking or Capital Markets
- Familiarity with regulations like BCBS 239, IFRS, CCAR
- Experience in knowledge graph development or information management

Required Skills
- Strong background in data-driven products
- Agile development experience
- End-to-end delivery management

Preferred Skills
- Experience with data architectures and governance tools
- Domain knowledge in Banking or Capital Markets
- Familiarity with regulations like BCBS 239, IFRS, CCAR

Full-time | Senior Level

Equal Opportunity Statement
Brillio Technologies is committed to diversity and inclusivity in the workplace. We encourage applications from individuals of all backgrounds and experiences.

Posted 1 day ago

Apply

10.0 - 15.0 years

0 Lacs

Haryana

On-site

The Legal Analytics Lead (Vice President) will be a part of AIM, based out of Gurugram, reporting to the Legal Analytics Head (Senior Vice President) leading the team. You will lead a team of information management experts and data engineers responsible for building the Data Strategy by identifying all relevant product processors and creating the Data Lake, Data Pipeline, Governance & Reporting. Your role will involve driving quality, reliability, and usability of all work products, as well as evaluating and refining methods and procedures for obtaining data to ensure validity, applicability, efficiency, and accuracy. It is essential to ensure proper documentation and traceability of all project work and to respond in a timely manner to internal and external reviews. As the Data/Information Management Sr. Manager, you will achieve results through the management of professional teams and departments, integrating subject matter and industry expertise within a defined area. You will contribute to standards around which others will operate, requiring an in-depth understanding of how areas collectively integrate within the sub-function, and will coordinate and contribute to the objectives of the entire function. Basic commercial awareness is necessary, along with developed communication and diplomacy skills to guide, influence, and convince others, including colleagues in other areas and occasional external customers. Your responsibilities will include ensuring the volume, quality, timeliness, and delivery of end results of an area, and you may also have responsibility for planning, budgeting, and policy formulation within your area of expertise. You will be involved in short-term planning and resource planning, with full management responsibility for a team, including people management, budgeting, and planning: performance evaluation, compensation, hiring, disciplinary actions, terminations, and budget approval.
Your primary responsibilities will involve supporting Business Execution activities of the Chief Operating Office by implementing data engineering solutions to manage banking operations. You will establish monitoring routines, scorecards, and escalation workflows, overseeing Data Strategy, Smart Automation, Insight Generation, Data Quality, and Reporting activities using proven analytical techniques. It will be your responsibility to document data requirements, data collection, processing, cleaning, process automation, optimization, and data visualization techniques. You will enable proactive issue detection, escalation workflows, and alignment with firmwide data-related policies, implementing a governance framework with clear stewardship roles and data quality controls. You will also interface between business and technology partners for digitizing data collection, including performance data generation and validation rules for banking operations. In this role, you will work with large and complex data sets (both internal and external data) to evaluate, recommend, and support the implementation of business strategies, such as a centralized data repository with standardized definitions and scalable data pipes. You will identify and compile data sets using a variety of tools (e.g., SQL, Access) to help predict, improve, and measure the success of key business outcomes, implementing rule-based Data Quality checks across critical data points, automating alerts for breaks, and publishing periodic quality reports. You will develop and execute the analytics strategy for Data Ingestion and Reporting/Insights Centralization, ensuring consistency, lineage tracking, and audit readiness across legal reporting. As the ideal candidate, you will have 10-15 years of experience in Business Transformation Solution Design roles with proficiency in tools/technologies like SQL, SAS, Python, PySpark, Tableau, Xceptor, Appian, JIRA, SharePoint, etc.
Strong understanding of Data Transformation, Data Modeling, Data Strategy, Data Architecture, Data Tracing & Lineage, Scalable Data Flow Design, Standardization, Platform Integration, and Smart Automation is essential. You should also have expertise in database performance tuning and optimization for data enrichment and integration, reporting, and dashboarding. A Bachelor's or Master's degree in STEM is required, with a Master's degree preferred. Additionally, you should have a strong capability to influence business outcomes and decisions in collaboration with AIM leadership and business stakeholders, demonstrating thought leadership in partner meetings while leading from the front to drive innovative solutions with excellent stakeholder management. Your excellent verbal and written communication skills will enable you to communicate seamlessly across team members, stakeholders, and cross-functional teams.
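The rule-based data quality checks and break alerts this posting describes can be sketched in plain Python. This is a minimal illustration only: the field names, rules, and thresholds are hypothetical, and a production setup would typically run such rules inside SQL or PySpark jobs.

```python
# Minimal rule-based data quality check: each rule is a predicate over a
# record; failures ("breaks") are collected for alerting and reporting.
# Field names and rules are hypothetical examples, not the employer's schema.

RULES = {
    "account_id_present": lambda r: bool(r.get("account_id")),
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
    "currency_is_iso3": lambda r: len(str(r.get("currency", ""))) == 3,
}

def run_checks(records):
    """Apply every rule to every record and return the list of breaks."""
    breaks = []
    for i, record in enumerate(records):
        for name, rule in RULES.items():
            if not rule(record):
                breaks.append({"row": i, "rule": name})
    return breaks

def quality_report(records):
    """Summarize break counts per rule for a periodic quality report."""
    breaks = run_checks(records)
    report = {name: 0 for name in RULES}
    for b in breaks:
        report[b["rule"]] += 1
    return {"rows": len(records), "breaks": len(breaks), "by_rule": report}
```

A scheduler would run `quality_report` on each batch and raise an alert whenever the break count for a critical rule exceeds a threshold.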

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

As an Integration Developer at our organization, you will be an integral part of the data engineering and integration team. Your primary responsibility will be to design, develop, and maintain ETL/ELT pipelines using tools such as SSIS and Azure Data Factory (ADF). You will play a crucial role in building reliable data integration pipelines that support various business functions including business intelligence, analytics, and operational workloads. Your key responsibilities will include implementing robust data workflows to ingest, transform, and store structured and unstructured data in Azure Data Lakehouse, integrating cloud and on-premise systems for high performance and data quality, and actively participating in architectural discussions to contribute to the design of data integration frameworks on the Azure platform. Collaboration with data analysts, data scientists, and business users will be essential to understand data requirements and deliver scalable solutions. Monitoring and optimizing data pipelines for performance, reliability, and cost-efficiency will also be part of your responsibilities. To excel in this role, you are required to have a Bachelor's degree in Computer Science, Engineering, or a related field, along with at least 5 years of experience in data integration or ETL development roles. Strong expertise in SSIS, Azure Data Factory, Azure Data Lake, Delta Lake, and Azure Storage (Blob/ADLS Gen2) is necessary. Proficiency in SQL, familiarity with Azure Fabric concepts, and problem-solving skills are also essential. Preferred qualifications include experience with DataOps or CI/CD pipelines, exposure to Databricks, Spark, or Python for data transformation, and knowledge of data governance tools. Possessing Microsoft Azure certifications such as DP-203 or AZ-204 would be considered a plus. This is a full-time position with benefits such as food provided and Provident Fund. The work location is in-person. 
Join us as an Integration Developer and be part of a dynamic team dedicated to creating scalable and resilient data integration solutions on the Azure platform.

Posted 5 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

NTT DATA is looking for a Databricks Engineer to join their team in Bangalore, Karnataka, India. As a Databricks Engineer, you will be responsible for various tasks related to data extraction, ETL pipeline modernization, job design, development, automation, metadata management, documentation, testing, collaboration, performance tuning, security, governance, and compliance. Your primary job duties will include extracting and analyzing data from SQL Server and Teradata sources, translating legacy SQL/DataStage transformations into Databricks-native code, building and orchestrating jobs within Databricks using tools like Databricks Workflows, Delta Lake, and Auto Loader, generating and maintaining data flow diagrams and job documentation, designing and executing unit tests and integration tests for data pipelines, optimizing data ingestion and transformation for performance and cost efficiency, ensuring compliance with data governance policies, and implementing access control via Unity Catalog. To be successful in this role, you must have a strong understanding of ETL/ELT principles and data pipelines, proficiency with Databricks platform and PySpark or Spark SQL, advanced SQL skills, familiarity with Teradata and SQL Server environments, ability to read and understand data models, schemas, and ERDs, basic proficiency with Git for code versioning, ability to write and validate unit/integration tests, strong communication skills, and an awareness of security and governance principles. NTT DATA is a global innovator of business and technology services, serving 75% of the Fortune Global 100. They are committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, NTT DATA has diverse experts in more than 50 countries and a robust partner ecosystem. 
Their services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a leading provider of digital and AI infrastructure and is part of the NTT Group, investing in R&D to help organizations and society move confidently into the digital future. Visit us at us.nttdata.com.
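The unit testing this posting asks for is easiest when transformation logic is isolated as a pure function, so it can be asserted on locally before being wired into a Databricks job. A hedged sketch under that assumption, with invented column names (the same logic would later be expressed in PySpark or Spark SQL):

```python
# Isolate a transformation as a pure function over plain dicts so it can be
# unit-tested without a Spark cluster. Column names are hypothetical and not
# taken from any real Teradata or SQL Server schema.

def normalize_customer(row):
    """Trim whitespace, upper-case names, and derive a full_name column."""
    first = (row.get("first_name") or "").strip()
    last = (row.get("last_name") or "").strip()
    return {
        "first_name": first.upper(),
        "last_name": last.upper(),
        "full_name": f"{first} {last}".strip().upper(),
    }

def transform(rows):
    """Apply the row-level transformation across a batch, dropping rows
    that carry no name information at all."""
    return [normalize_customer(r)
            for r in rows
            if r.get("first_name") or r.get("last_name")]
```

Integration tests would then exercise the same function through the real pipeline against sample Delta tables.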

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be the Platform Services Lead for Data Platform and Standards at our company, taking on the responsibility for managing end-to-end service delivery and platform operations for core data governance technologies. This includes overseeing Data Quality, Catalogue, Privacy, Lineage, and Retention services. Your role will involve defining and implementing a service resilience strategy, covering aspects such as monitoring, alerting, capacity management, disaster recovery, and failover design. As the Platform Services Lead, you will be tasked with establishing and enforcing SLAs, KPIs, and operational performance metrics across the platform estate. Collaboration with Engineering, IT Service Owners (ITSO), and Cybersecurity teams will be essential to embed observability, DevSecOps, and compliance practices within the platform. Driving the adoption of self-healing mechanisms, automated remediation processes, and infrastructure-as-code practices will be part of your responsibilities to enhance uptime and reduce operational overhead. Additionally, you will lead incident and problem management processes, which includes conducting root cause analysis, managing stakeholder communications, and implementing corrective actions as needed. Ensuring platform change management and maintaining environment stability in alignment with regulatory and audit requirements will also fall under your purview. This role requires a seasoned professional with a strong background in platform services and data governance technologies, as well as a proactive approach to driving operational excellence.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a DataOps Engineer, you will be responsible for designing and maintaining scalable ML model deployment infrastructure using Kubernetes and Docker. Your role will involve implementing CI/CD pipelines for ML workflows, ensuring security best practices are followed, and setting up monitoring tools to track system health, model performance, and data pipeline issues. You will collaborate with cross-functional teams to streamline the end-to-end lifecycle of data products and identify performance bottlenecks and data reliability issues in the ML infrastructure. To excel in this role, you should have strong experience with Kubernetes and Docker for containerization and orchestration, hands-on experience in ML model deployment in production environments, and proficiency with orchestration tools like Airflow or Luigi. Familiarity with monitoring tools such as Prometheus, Grafana, or the ELK Stack, and knowledge of security protocols, CI/CD pipelines, and DevOps practices in a data/ML environment are essential. Exposure to cloud platforms like AWS, GCP, or Azure is preferred. Additionally, experience with MLflow, Seldon, or Kubeflow, knowledge of data governance, lineage, and compliance standards, and an understanding of data pipelines and streaming frameworks would be advantageous in this role.
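The monitoring responsibility described above (tracking system health and model performance) largely reduces to computing metrics over a sliding window and alerting on thresholds. Prometheus and Grafana do this at scale; the core idea can be sketched in a few lines, with the window size and threshold chosen purely for illustration:

```python
from collections import deque

# Sliding-window error-rate monitor: record request outcomes and alert when
# the recent failure rate crosses a threshold. Window size and threshold are
# illustrative; Prometheus alert rules play this role in production.

class ErrorRateMonitor:
    def __init__(self, window=100, threshold=0.05):
        self.window = deque(maxlen=window)  # oldest outcomes evicted first
        self.threshold = threshold

    def record(self, success):
        """Record one request/prediction outcome."""
        self.window.append(bool(success))

    def error_rate(self):
        """Fraction of failures in the current window."""
        if not self.window:
            return 0.0
        return self.window.count(False) / len(self.window)

    def should_alert(self):
        return self.error_rate() > self.threshold
```

In a real deployment the `record` calls would be driven by model-serving request hooks, and `should_alert` would feed an escalation workflow rather than being polled directly.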

Posted 3 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

The ideal candidate for this role should have 3 to 8 years of experience in Data Governance/Cloud Data. As the Lead for a PoC of OCI Data Catalog, you will be responsible for designing, configuring, and demonstrating catalog capabilities. You will also need to integrate with Oracle DB, Object Storage, and on-prem sources while showcasing metadata, lineage, classification, and stewardship. Additionally, the role requires exporting metadata to the Marketplace App, documenting outcomes, and training client teams. The candidate should possess hands-on experience with OCI Data Catalog and be proficient in metadata, lineage, and classification. Strong communication and documentation skills are essential for effectively fulfilling the responsibilities of this position.
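Lineage, which recurs throughout these postings, is at heart a directed graph from derived datasets back to their sources; catalog tools such as OCI Data Catalog harvest it automatically, but the traversal itself can be sketched directly. The dataset names below are made up for illustration:

```python
# Lineage as a directed graph: edges point from a derived dataset back to
# its inputs; upstream() walks the graph to answer "where did this data
# come from?". Dataset names are hypothetical.

LINEAGE = {
    "report.kpi_dashboard": ["mart.sales_summary"],
    "mart.sales_summary": ["staging.orders", "staging.customers"],
    "staging.orders": ["raw.orders"],
    "staging.customers": ["raw.customers"],
}

def upstream(dataset, graph=LINEAGE):
    """Return every transitive upstream source of a dataset."""
    seen = set()
    stack = [dataset]
    while stack:
        node = stack.pop()
        for parent in graph.get(node, []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen
```

The same traversal run in the opposite direction gives impact analysis: which reports break if a raw source changes.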

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

You are a Senior Data Engineer with expertise in constructing scalable data pipelines utilizing Microsoft Fabric. Your primary responsibilities will involve developing and managing data pipelines through Microsoft Fabric Data Factory and OneLake. You will be tasked with designing and creating ingestion and transformation pipelines for both structured and unstructured data. It will be your responsibility to establish frameworks for metadata tagging, version control, and batch tracking to ensure the security, quality, and compliance of data pipelines. Additionally, you will play a crucial role in contributing to CI/CD integration, observability, and documentation. Collaboration with data architects and analysts will be essential to align with business requirements effectively. To qualify for this role, you should possess at least 6 years of experience in data engineering, with a minimum of 2 years of hands-on experience working on Microsoft Fabric or Azure Data services. Proficiency in tools like Azure Data Factory, Fabric, Databricks, or Synapse is required. Strong SQL and data processing skills (such as PySpark and Python) are essential. Previous experience with data cataloging, lineage, and governance frameworks will be beneficial for this position.
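The batch-tracking framework mentioned above usually boils down to stamping each ingestion run with an ID, timestamps, and a status so that failures can be audited and reruns targeted. A minimal sketch follows; the schema is invented for illustration and is not Microsoft Fabric's API:

```python
import uuid
from datetime import datetime, timezone

# Minimal batch-tracking ledger: each pipeline run is stamped with an ID,
# timestamps, and a status. The schema is a hypothetical illustration;
# in practice this state would live in a control table, not in memory.

class BatchTracker:
    def __init__(self):
        self.runs = {}

    def start(self, pipeline):
        """Open a run record and return its ID."""
        run_id = str(uuid.uuid4())
        self.runs[run_id] = {
            "pipeline": pipeline,
            "started_at": datetime.now(timezone.utc),
            "status": "running",
            "rows_loaded": 0,
        }
        return run_id

    def finish(self, run_id, rows_loaded):
        """Mark a run as succeeded with its row count."""
        self.runs[run_id].update(
            status="succeeded",
            rows_loaded=rows_loaded,
            finished_at=datetime.now(timezone.utc),
        )

    def fail(self, run_id, error):
        """Mark a run as failed, preserving the error for audit."""
        self.runs[run_id].update(
            status="failed",
            error=str(error),
            finished_at=datetime.now(timezone.utc),
        )
```

Downstream consumers can then filter on the latest succeeded run per pipeline instead of trusting whatever data happens to be present.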

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

At PwC, our team in managed services focuses on providing a variety of outsourced solutions and supporting clients across multiple functions. We help organizations streamline their operations, reduce costs, and enhance efficiency by managing key processes and functions on their behalf. Our team is skilled in project management, technology, and process optimization, ensuring the delivery of high-quality services to our clients. Those in managed service management and strategy at PwC are responsible for transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and services. As a member of our team, you will build meaningful client relationships and learn how to manage and inspire others. You will navigate complex situations, develop your personal brand, deepen your technical expertise, and leverage your strengths. Anticipating the needs of your teams and clients, you will deliver quality results. Embracing ambiguity, you will be comfortable when the path forward is unclear, asking questions and using such moments as opportunities for growth. 
Required skills, knowledge, and experiences for this role include but are not limited to:
- Responding effectively to diverse perspectives, needs, and feelings of others
- Using a broad range of tools, methodologies, and techniques to generate new ideas and solve problems
- Applying critical thinking to break down complex concepts
- Understanding the broader objectives of your project or role and how your work aligns with the overall strategy
- Developing a deeper understanding of the business context and its changing dynamics
- Using reflection to enhance self-awareness, strengths, and development areas
- Interpreting data to derive insights and recommendations
- Upholding and reinforcing professional and technical standards, along with the Firm's code of conduct and independence requirements

As a Senior Associate, you will work collaboratively with a team of problem solvers, addressing complex business issues from strategy to execution through Data, Analytics & Insights skills. Your responsibilities at this level include:
- Using feedback and reflection to enhance self-awareness, personal strengths, and address development areas
- Demonstrating critical thinking and the ability to structure unstructured problems
- Reviewing deliverables for quality, accuracy, and relevance
- Adhering to SLAs, incident management, change management, and problem management
- Leveraging tools effectively in different situations and explaining the rationale behind the choices
- Seeking opportunities for exposure to diverse situations, environments, and perspectives
- Communicating straightforwardly and structurally to influence and connect with others
- Demonstrating leadership by engaging directly with clients and leading engagements
- Collaborating in a team environment with client interactions, workstream management, and cross-team cooperation
- Contributing to cross-competency work and Center of Excellence activities
- Managing escalations and risks effectively

Position Requirements:
- Primary Skills: Tableau, Visualization, Excel
- Secondary Skills: Power BI, Cognos, Qlik, SQL, Python, Advanced Excel, Excel Macros

BI Engineer Role:
- Minimum 5 years of hands-on experience in building advanced data analytics
- Minimum 5 years of hands-on experience in delivering managed data and analytics programs
- Extensive experience in developing scalable, repeatable, and secure data structures and pipelines
- Proficiency in industry tools like Python, SQL, and Spark for data analytics
- Experience in building data governance solutions using leading tools
- Knowledge of data consumption patterns and BI tools like Tableau, Qlik Sense, and Power BI
- Strong communication, problem-solving, quantitative, and analytical abilities

Certifications in Tableau and other BI tools are advantageous, along with certifications in any cloud platform.

In our Managed Services - Data, Analytics & Insights team at PwC, we focus on collaborating with clients to leverage technology and human expertise, delivering simple yet powerful solutions. Our goal is to enable clients to focus on their core business while trusting us as their IT partner. We are driven by the passion to enhance our clients' capabilities every day. Within our Managed Services platform, we offer integrated services grounded in industry experience and powered by top talent. Our team of global professionals, combined with cutting-edge technology, ensures effective outcomes that add value to our clients' enterprises. Through a consultative approach, we enable transformational journeys that drive sustained client outcomes, allowing clients to focus on accelerating their priorities and optimizing their operations. As a member of our Data, Analytics & Insights Managed Service team, you will contribute to critical Application Evolution Service offerings, help desk support, enhancement and optimization projects, and strategic roadmap development. Your role will involve technical expertise and relationship management to support customer engagements effectively.

Posted 1 month ago

Apply

3.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As the Lead for a PoC of OCI Data Catalog, you will be responsible for designing, configuring, and demonstrating catalog capabilities. Your role will involve integrating with Oracle DB, Object Storage, and on-prem sources to showcase metadata, lineage, classification, and stewardship. Additionally, you will be expected to export metadata to the Marketplace App, document outcomes, and provide training to client teams. The ideal candidate should have a minimum of 3 years of experience in Data Governance and Cloud Data, with hands-on experience in OCI Data Catalog. You should possess a strong understanding of metadata, lineage, and classification concepts. Excellent communication and documentation skills are essential for this role to effectively interact with stakeholders and produce relevant documentation.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Principal Data Engineer at Skillsoft, you will play a crucial role in driving the advancement of Enterprise data infrastructure by designing and implementing the logic and structure for how data is set up, cleansed, and stored for organizational usage. You will be responsible for developing a Knowledge Management strategy to support Skillsoft's analytical objectives across various business areas. Your role will involve building robust systems and reusable code modules to solve problems, working with the latest open-source tools and platforms to build data products, and collaborating with Product Owners and cross-functional teams in an agile environment. Additionally, you will champion the standardization of processes for data elements used in analysis, establish forward-looking data and technology objectives, manage a small team through project deliveries, and design rich data visualizations and interactive tools to communicate complex ideas to stakeholders. Furthermore, you will evangelize the Enterprise Data Strategy & Execution Team mission, identify opportunities to influence decision-making with supporting data and analysis, and seek additional data resources that align with strategic objectives. To qualify for this role, you should possess a degree in Data Engineering, Information Technology, CIS, CS, or related field, along with 7+ years of experience in Data Engineering/Data Management. You should have expertise in building cloud data applications, cloud computing, data engineering/analysis programming languages, and SQL Server. Proficiency in data architecture, data modeling, and experience with technology stacks for Metadata Management, Data Governance, and Data Quality are essential. Additionally, experience in working cross-functionally across an enterprise organization and an Agile methodology environment is preferred. Your strong business acumen, analytical skills, technical abilities, and problem-solving skills will be critical in this role. 
Experience with app and web analytics data, CRM, and ERP systems data is a plus. Join us at Skillsoft and be part of our mission to democratize learning and help individuals unleash their edge. If you find this opportunity intriguing, we encourage you to apply and be a part of our team dedicated to leadership, learning, and success at Skillsoft. Thank you for considering this role.

Posted 1 month ago

Apply

7.0 - 12.0 years

12 - 22 Lacs

Chennai

Hybrid

About the Company: Papigen is a fast-growing global technology services company, delivering innovative digital solutions through deep industry experience and cutting-edge expertise. We specialize in technology transformation, enterprise modernization, and dynamic areas like Cloud, Big Data, Java, React, DevOps, and more. Our client-centric approach combines consulting, engineering, and data science to help businesses evolve and scale efficiently.

About the Role: We are seeking a Senior Data Engineer to join our growing cloud data team. In this role, you will design and implement scalable data pipelines and ETL processes using Azure Databricks, Azure Data Factory, PySpark, and Spark SQL. You'll work with cross-functional teams to develop high-quality, secure, and efficient data solutions in a data lakehouse architecture on Azure.

Key Responsibilities:
- Design, develop, and optimize scalable data pipelines using Databricks, ADF, PySpark, Spark SQL, and Python
- Build robust ETL workflows to transform and load data into a lakehouse architecture on Azure
- Ensure data quality, security, and compliance with data governance and privacy standards
- Collaborate with stakeholders to gather business requirements and deliver technical data solutions
- Create and maintain technical documentation for workflows, architecture, and data models
- Work within an Agile environment and track tasks using tools like Azure DevOps

Required Skills & Experience:
- 8+ years of experience in data engineering and enterprise data platform development
- Proven expertise in Azure Databricks, Azure Data Factory, PySpark, and Spark SQL
- Strong understanding of Data Warehouses, Data Marts, and Operational Data Stores
- Proficiency in writing complex SQL/PL-SQL queries and understanding data models and data lineage
- Knowledge of data management best practices: data quality, lineage, metadata, reference/master data
- Experience working in Agile teams with tools like Azure DevOps
- Strong problem-solving skills, attention to detail, and the ability to multi-task effectively
- Excellent communication skills for interacting with both technical and business teams

Benefits and Perks:
- Opportunity to work with leading global clients
- Exposure to modern technology stacks and tools
- Supportive and collaborative team environment
- Continuous learning and career development opportunities

Posted 1 month ago

Apply

7.0 - 12.0 years

25 - 27 Lacs

Noida

Hybrid

Job Title: AVP Data Analyst
Location: Noida
Package up to 28 LPA

Key Responsibilities:
- Investigate and resolve data quality, lineage, and control issues
- Design and build scalable data pipelines to automate data flow and processing
- Conduct data cleansing, transformation, and validation activities
- Apply machine learning and AI to solve complex operational problems
- Oversee data governance policies and ensure adherence to regulatory frameworks like BCBS 239
- Collaborate with stakeholders across Risk, Compliance, Technology, and Business to deliver actionable insights
- Document and report on data issues and improvements, supporting audit and control frameworks

Required Skills & Qualifications:
- Proven experience in Data Management, Data Governance, and Data Quality within a financial services environment
- Strong understanding of risk and control frameworks, including enterprise risk and operational risk
- Proficiency in tools and technologies such as SQL, Python, Tableau, Tableau Prep, Power Apps, and Alteryx
- Familiarity with structured and unstructured data analytics
- Excellent stakeholder management, communication, and presentation skills
- Strong analytical mindset with the ability to work independently and lead data initiatives
- Working knowledge of financial crime and fraud data domains

For more details, call Kanika on 9953939776 or email your resume to Kanika@manningconsulting.in

Posted 2 months ago

Apply
