7.0 - 10.0 years
13 - 17 Lacs
Bengaluru
Work from Office
- Develop, implement, and maintain data governance policies, standards, and processes across the organization
- Ensure compliance with data privacy and security policies, including adherence to relevant legal and regulatory frameworks (e.g., GDPR, CCPA)
- Identify opportunities for automation within data governance and data pipeline processes, driving efficiency and scalability
- Define, maintain, and manage a data dictionary, business glossary, and metadata repository
- Collaborate with data owners and stewards to ensure accountability and enforce data standardization practices
- Lead data quality management initiatives; define and monitor KPIs related to data accuracy, completeness, and consistency
- Monitor enterprise-wide data usage to ensure alignment with governance frameworks and data access policies
- Design, develop, and optimize ETL/ELT pipelines and data workflows using modern data engineering tools
- Support audit and compliance reporting needs by providing well-governed and traceable data pipelines

Required Skills & Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Information Systems, Data Management, or a related field
- Experience: Minimum 6 years in data governance, data engineering, or data management roles
- Strong expertise in data governance tools and familiarity with data cataloging
- Hands-on experience with data integration/analytics platforms (e.g., Alteryx, Redpoint, Talend, or similar tools)
- Proven experience working with Salesforce CRM and understanding of its data structure and governance needs
- Solid knowledge of SQL, data warehousing concepts, and data lifecycle management
- Familiarity with cloud data platforms (e.g., AWS, Azure, GCP) is a plus
- Excellent problem-solving, documentation, and cross-functional stakeholder management skills
- Strong understanding of data privacy regulations, master data management (MDM), and data lineage tracking

Preferred (Nice to Have):
- Experience with data governance frameworks like DAMA-DMBOK
- Knowledge of data security frameworks and access control models
- Background in business process improvement or Lean/Agile methodologies
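The data-quality KPIs this role would define (accuracy, completeness, consistency) can be sketched in a few lines. This is a minimal illustration, not any specific tool's API; the record fields and allowed values are invented for the example.

```python
# Hypothetical sketch of two data-quality KPIs: completeness (field is
# populated) and consistency (value drawn from an agreed reference set).

def completeness(records, field):
    """Share of records where `field` is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records) if records else 0.0

def consistency(records, field, allowed):
    """Share of records whose `field` value is in the allowed set."""
    ok = sum(1 for r in records if r.get(field) in allowed)
    return ok / len(records) if records else 0.0

customers = [
    {"id": 1, "country": "IN", "email": "a@x.com"},
    {"id": 2, "country": "XX", "email": ""},        # fails both checks
    {"id": 3, "country": "US", "email": "c@x.com"},
]
print(completeness(customers, "email"))             # 2 of 3 filled
print(consistency(customers, "country", {"IN", "US"}))
```

In practice these checks would run inside the ETL/ELT pipelines the posting mentions, with results published as monitored KPIs.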
7.0 - 11.0 years
2 - 6 Lacs
Mumbai
Work from Office
About The Role
Skill required: Delivery - Search Engine Optimization (SEO)
Designation: I&F Decision Science Practitioner Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 Years

What would you do?
Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com

We're Accenture Marketing Operations, the global managed services arm of Accenture Interactive. We sit in the Operations Business to take advantage of the industrialized run capabilities, leveraging investments from Accenture Operations. Our quest is to activate the best experiences on the planet by driving value across every customer interaction to maximize marketing performance. We combine deep functional and technical expertise to future-proof our clients' business while accelerating time-to-market and operating efficiently at scale. We are digital professionals committed to providing innovative, end-to-end customer experience solutions focusing on operating marketing models that help businesses transform and excel in the new world, with an ecosystem that empowers our clients to implement the changes necessary to support the transformation of their businesses.

Develop an organic search engine optimization strategy based on client goals and objectives, defining keywords and priority content, and ensuring web/mobile content meets marketing needs.

What are we looking for?
- Develop and implement SEO best practices, including technical, on-page, and off-page optimization, to enhance organic search rankings and website traffic
- Analyze SEO metrics (rankings, traffic, CTR) and CRM data (customer interactions, conversions, engagement) using tools like Google Analytics (GA4), Search Console, and CRM dashboards
- Conduct keyword research, metadata optimization, and content strategy development to align with search intent and improve visibility
- Work closely with marketing, sales, and analytics teams to integrate SEO and CRM strategies for a unified customer acquisition and retention approach
- Stay updated with Google algorithm updates, evolving SEO trends, GDPR/CCPA regulations, and industry best practices to ensure continued business growth and compliance
- Experience in data warehousing platforms and techniques such as Snowflake, GCP, Databricks, SQL
- Industry experience: Beauty, CPG, Retail
- Python

Roles and Responsibilities:
- Understand the strategic direction set by senior management as it relates to team goals
- Perform keyword research, content optimization, technical SEO improvements, and performance reporting to enhance organic search visibility
- Prepare actionable insights from SEO tools
- In this role you are required to analyze and solve increasingly complex problems
- Your day-to-day interactions are with peers within Accenture; you are likely to have some interaction with clients and/or Accenture management
- You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments
- Decisions that are made by you impact your own work and may impact the work of others
- In this role you would be an individual contributor and/or oversee a small work effort and/or team

Qualification: Any Graduation
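The SEO-metrics analysis described above often starts with aggregating click-through rate per query. A minimal sketch, assuming Search-Console-style rows with `query`, `clicks`, and `impressions` columns (the field names and numbers are illustrative, not from any real export):

```python
# Aggregate CTR per query from click/impression rows.
from collections import defaultdict

rows = [
    {"query": "data catalog", "clicks": 40, "impressions": 1000},
    {"query": "data catalog", "clicks": 10, "impressions": 250},
    {"query": "mdm tools",    "clicks": 5,  "impressions": 500},
]

totals = defaultdict(lambda: {"clicks": 0, "impressions": 0})
for r in rows:
    totals[r["query"]]["clicks"] += r["clicks"]
    totals[r["query"]]["impressions"] += r["impressions"]

# CTR = total clicks / total impressions, per query
ctr = {q: t["clicks"] / t["impressions"] for q, t in totals.items()}
print(ctr)
```

A real workflow would pull the rows from GA4 or the Search Console export rather than hard-coding them.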
7.0 - 12.0 years
20 - 25 Lacs
Mumbai
Work from Office
The Metadata Governance Capability Lead is a critical role supporting Data Governance evolution, especially as it relates to Enterprise Data Catalog capabilities. Partnering with Business Data Owners, Stewards and Custodians, DMS, D&A and other MDS teams, they will operationalize, monitor and evolve Data Governance practices, ensuring that Metadata Cataloging capabilities are adopted consistently across the organization and that enterprise data becomes Findable, Accessible, Interoperable, and Reusable (FAIR).

How you will contribute
You will manage the employee services community in the service center and onshore in countries, and manage the relationships between delivery teams and internal customers. To excel in this role, you will report on service-center performance and lead reviews with the functional leadership team and key stakeholders. In addition, you will contribute to the global service management agenda and integrate our global scale using simplified and standardized processes and technologies. As part of this job, you will oversee the employee services center's annual budget; handle local change requests and follow the global employee services governance framework to keep local documentation updated and compliant; manage suppliers delivering services for the processes; and identify service improvements to continuously improve the quality of the service provided. As a people leader, you will lead, coach, motivate, train and inspire direct reports to deliver the employee services agenda.

What you will bring
- In partnership with Data Governance Engagement, Data Domains, Data Products and other D&A teams, define and lead execution of the enterprise Data Governance Strategy, leveraging and enhancing the Governance Framework.
- Advance the evolution of Data Governance by leveraging the Data Governance Framework, focusing on delivery and completion of the enterprise data catalog and data solutions, and ensuring that the Data Governance team adheres to Governance standards and policies.
- Oversee and co-own the enterprise data catalog operating model in alignment with strategic business priorities.
- Oversee and own the development of Metadata Governance, Management and Data Certification standards, rules, principles, definitions, taxonomies and metrics for the Data Catalog; ensure their promotion, communication, socialization, and execution.
- Define, measure and report catalog adoption and Metadata Governance KPIs.
- Lead continuous improvement of the Metadata cataloging process to ensure its success.
- Advance Data Governance solutions for data catalog, data lineage, data quality and others to enable a sustainable and measurable Data Governance practice, in partnership with other teams in D&A and MDS.
- Develop and co-own a roadmap in the designated area and manage future demands of the Metadata Governance capabilities organization in this space.
- Manage and advance enterprise information management principles, policies and programs for ownership, stewardship, and custodianship of data and analytics.
- Ensure that Data Cataloging practices enhance data product usability, interoperability, and value realization, promoting FAIR data principles.
- Drive data catalog adoption with key users from the business and the MDS function; promote data catalog usage for data citizens.
- Ensure a harmonized and aligned approach to Metadata Governance and Management in collaboration with other teams (D&A, Architects, Product and Platform teams).
- Partnering with Data Governance Engagement, MDS and functional business teams, oversee sessions to identify and document critical data sets, lineage, business rules, and scorecards required for Domain and Product Data Governance in the Data Catalog; coordinate onboarding of new Data Governance bodies (training, provisioning).
- Foster the Metadata Governance community and collaboration by integrating feedback from MDS and business regional and functional teams, preventing functional silos.
- Serve as a business consultant on business problems; identify data needs and formulate data solutions, create practical development roadmaps, and present recommendations to stakeholders at all levels of the business and MDS.

Knowledge, Experience and Education
- 7+ years of Data Management / Data Domains experience in IT and business functions, ideally in the consumer goods industry
- Strategic view of how data management enables a high-performing business, a broad understanding of business processes, and a strong understanding of the business objectives of data domains
- Understanding and experience of data tools (Data Catalog, Lineage and Cloud Solutions, and Data Quality tools such as Collibra, Ataccama, Informatica, Alation, DEMS, among others) and of Master Data Domains and MDM solutions
- Expert knowledge of Data Quality and Governance standards, coupled with experience systemizing them in IT Governance solutions
- Familiar with operating in large and complex multi-national organizations to create alignment on strategic priorities across a diverse range of internal and external stakeholders
- Demonstrated knowledge of Data Governance strategies and processes; ability to execute sustainable data management strategies
- Demonstrated ability to influence and drive change across organizations, and to deal with high levels of ambiguity
- Understanding of database, data warehouse and data modeling concepts
- DAMA certifications and SAP S/4 knowledge a plus

Personal Skills and Characteristics
- Analytical, conceptual, strategic, and forward-thinking leadership and management skills
- Strong communication and interpersonal skills
- Passion for product development
- Ability to translate business needs into potential system solutions
- Analytical mindset with the ability to grasp complex data sets and to identify patterns or critical data elements
- Strong work ethic; ability to work at an abstract level and gain consensus
- Demonstrated ability to collaborate globally, and to work in a team environment while remaining independently productive
- Persistent and resilient in the pursuit of objectives; driven and results-oriented; holds self and others accountable
- Experience and interest in leading, influencing, coaching and mentoring others; ability to handle conflict resolution and remediation of escalated issues
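A catalog-adoption KPI of the kind this role would define and report can be sketched simply: the share of known data assets that are fully governed in the catalog. The required fields (owner, definition, lineage) and asset names are assumptions for illustration, not from any particular catalog product.

```python
# Hypothetical catalog-adoption KPI: an asset counts as "governed" only if
# it has an owner, a business definition, and documented lineage.

REQUIRED = ("owner", "definition", "lineage")

def adoption_rate(assets):
    governed = sum(1 for a in assets if all(a.get(k) for k in REQUIRED))
    return governed / len(assets) if assets else 0.0

assets = [
    {"name": "sales.orders",  "owner": "ops", "definition": "Order lines", "lineage": True},
    {"name": "crm.contacts",  "owner": "mkt", "definition": "",            "lineage": True},
]
print(adoption_rate(assets))  # one of two assets is fully governed
```

Tracked over time, such a metric gives the "define, measure and report catalog adoption" loop something concrete to trend.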
2.0 - 5.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Work within agile multi-skilled teams to create world-class products that serve our customers' needs.
- Perform the elicitation and analysis of business change, functional and non-functional requirements across a range of stakeholders; work with cross-asset IT teams to interface those requirements and deliver a working reconciliation solution.
- Own and produce relevant analysis and modeling artefacts that enable development teams to develop working products.
- Understand the "user" journey end to end, which goes beyond the "system".
- Create/enhance physical data models adhering to the agreed standards, to fulfil both business as well as technical requirements and expectations.
- Undertake metadata analysis, including but not limited to naming of physical tables and columns, definitions, and appropriate data types and lengths.
- Create and maintain the target data models.

Requirements
- Strong Data Analysis and/or Data Modeling experience.
- Strong financial domain and data analysis skills, with experience covering requirement gathering, elicitation, gap analysis, data analysis, effort estimation, and reviews; ability to translate high-level functional data or business requirements into a technical solution, database design, and data mapping.
- Comprehensive understanding of Physical Data Modeling; create and deliver high-quality data models following agreed data governance and standards.
- Maintain quality metadata and data-related artefacts that are accurate, complete, consistent, unambiguous, reliable, accessible, traceable, and valid.
- Individual contributor with a good understanding of the SDLC and Agile methodologies.
- A team player with a self-starter approach and a sense of ownership; a problem solver with a solution-seeking approach and the ability to work in a fast-paced, continuously changing environment.
- Excellent communication and stakeholder management skills; capable of building rapport and relationships.
- Act as a liaison between business and technical teams to bridge gaps and help both deliver projects successfully.

Other Skills & Tools: SQL, MS Office tools, GCP BigQuery, Erwin (preferable), Visual Paradigm (preferable).
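The metadata-analysis task above (naming physical tables and columns, choosing data types) can be illustrated end to end: create a table per an assumed snake_case naming standard, then read its metadata back. SQLite stands in for the actual warehouse (BigQuery in this posting); the table and column names are invented.

```python
# Create a physical table per an assumed naming standard and inspect its
# metadata, the kind of table/column analysis the role describes.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE recon_trade_detail (
        trade_id      TEXT NOT NULL,  -- business key
        trade_dt      TEXT NOT NULL,  -- ISO-8601 trade date
        notional_amt  REAL,           -- signed notional amount
        ccy_cd        TEXT            -- ISO 4217 currency code
    )
""")

# PRAGMA table_info returns (cid, name, type, notnull, default, pk) per column
cols = [(row[1], row[2]) for row in conn.execute("PRAGMA table_info(recon_trade_detail)")]
print(cols)
```

In a real engagement the declared names, types, and lengths would come from the agreed modeling standards and be documented in the data dictionary.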
12.0 - 13.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Work within agile multi-skilled teams to create world-class products that serve our customers' needs.
- Perform the elicitation and analysis of business change, functional and non-functional requirements across a range of stakeholders; work with cross-asset IT teams to interface those requirements and deliver a working reconciliation solution.
- Own and produce relevant analysis and modeling artefacts that enable development teams to develop working products.
- Understand the "user" journey end to end, which goes beyond the "system".
- Create/enhance physical data models adhering to the agreed standards, to fulfil both business as well as technical requirements and expectations.
- Undertake metadata analysis, including but not limited to naming of physical tables and columns, definitions, and appropriate data types and lengths.
- Create and maintain the target data models.

Requirements
- Strong Data Analysis and/or Data Modeling experience.
- Strong financial domain and data analysis skills, with experience covering requirement gathering, elicitation, gap analysis, data analysis, effort estimation, and reviews; ability to translate high-level functional data or business requirements into a technical solution, database design, and data mapping.
- Comprehensive understanding of Physical Data Modeling; create and deliver high-quality data models following agreed data governance and standards.
- Maintain quality metadata and data-related artefacts that are accurate, complete, consistent, unambiguous, reliable, accessible, traceable, and valid.
- Individual contributor with a good understanding of the SDLC and Agile methodologies.
- A team player with a self-starter approach and a sense of ownership; a problem solver with a solution-seeking approach and the ability to work in a fast-paced, continuously changing environment.
- Excellent communication and stakeholder management skills; capable of building rapport and relationships.
- Act as a liaison between business and technical teams to bridge gaps and help both deliver projects successfully.

Other Skills & Tools: SQL, MS Office tools, GCP BigQuery, Erwin (preferable), Visual Paradigm (preferable).
1.0 - 5.0 years
17 - 19 Lacs
Bengaluru
Work from Office
We are seeking a Data Engineer I who will contribute to a GenAI-powered insights assistant initiative by developing and maintaining performant ETL pipelines and semantic layers for WW FBA metrics. Your work ensures the assistant's text-to-SQL capabilities deliver accurate, up-to-date structured data insights supporting natural-language query responses.
- Develop and maintain ETL pipelines in Spark to transform and load FBA metrics into the Data Lakehouse.
- Optimize SQL queries for fast, cost-efficient access by AI systems.
- Support dbt semantic models and automate metadata enrichment with Glue.
- Write automated tests to ensure data quality and freshness.
- Build pipelines that add business-context metadata to help AI generate accurate SQL queries.

Qualifications:
- 1+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage
- Proficiency in dbt, Airflow/MWAA, AWS Glue, Kinesis
- Experience building semantic layers or BI models
- Familiarity with prompt-driven SQL generation and AI-assisted query validation
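The semantic-layer idea above (business-context metadata that helps an AI generate accurate SQL) can be sketched as metric definitions kept as data, from which queries are rendered. The metric, table, and column names are invented for illustration; this is not dbt's actual model format.

```python
# Hypothetical semantic layer: each business metric records the table,
# aggregation expression, and time grain it is computed over, so a
# text-to-SQL system can render a correct query instead of guessing.

SEMANTIC_LAYER = {
    "inbound_units": {
        "table": "fba_inbound_daily",
        "expr": "SUM(units_received)",
        "grain": "ship_date",
    },
}

def render_sql(metric, start, end):
    m = SEMANTIC_LAYER[metric]
    return (f"SELECT {m['grain']}, {m['expr']} AS {metric} "
            f"FROM {m['table']} "
            f"WHERE {m['grain']} BETWEEN '{start}' AND '{end}' "
            f"GROUP BY {m['grain']}")

sql = render_sql("inbound_units", "2024-01-01", "2024-01-31")
print(sql)
```

The point of the design is that the AI only picks the metric and date range; the governed definition supplies the rest.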
1.0 - 5.0 years
12 - 16 Lacs
Bengaluru
Work from Office
- Develop metadata pipelines to tag documents with freshness, ownership, and other context for better filtering.
- Implement caching and multi-region replication to reduce query latency.
- Monitor data retrieval accuracy and log source citations to improve AI trustworthiness.
- Automate ingestion and embedding generation for unstructured data into vector databases like Zilliz, Pinecone, or OpenSearch.

Qualifications:
- 1+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage
- Strong expertise in AWS Glue, Redshift, Kinesis/MSK, Lambda
- Hands-on with data contracts, lineage tracking, and automated QA
- Familiarity with multi-modal data ingestion (structured + unstructured)
- Experience operationalizing cross-region replication and caching strategies
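The metadata-tagging step above can be sketched as enriching each document with freshness and ownership tags before indexing, so retrieval can filter out stale sources. The 90-day threshold and the field names are assumptions for illustration only.

```python
# Hypothetical enrichment step: tag a document with a freshness label
# (based on its last-updated date) and an owner before it is indexed.
from datetime import date

FRESH_DAYS = 90  # assumed staleness threshold

def tag_document(doc, today):
    age = (today - doc["updated"]).days
    return {
        **doc,
        "freshness": "fresh" if age <= FRESH_DAYS else "stale",
        "owner": doc.get("owner", "unassigned"),
    }

doc = {"id": "kb-123", "updated": date(2024, 1, 1)}
tagged = tag_document(doc, today=date(2024, 2, 1))
print(tagged["freshness"], tagged["owner"])
```

Downstream, a retriever would prefer (or require) `freshness == "fresh"` and could surface the owner when a stale answer needs correcting.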
1.0 - 5.0 years
17 - 19 Lacs
Bengaluru
Work from Office
We are seeking a Data Engineer I who will support a GenAI-powered insights assistant initiative by building and scaling ingestion and embedding pipelines for unstructured WW FBA knowledge bases. Your role ensures the retrieval-augmented generation system accesses fresh, relevant document embeddings to enhance AI-driven insights and user query satisfaction.
- Build batch and streaming data pipelines using Spark and AWS streaming services.
- Implement automated checks to ensure data consistency across different data types.
- Define and maintain data contracts with source teams to keep schemas consistent.
- Develop cross-domain metadata services linking structured and unstructured data catalogs.
- Create APIs and event-driven workflows integrating AI insights with business tools.
- Monitor pipeline health, costs, and SLA adherence.

Qualifications:
- 1+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage
- Familiarity with RAG (Retrieval-Augmented Generation) principles
- AWS experience: Lambda, S3, SageMaker, Bedrock Knowledge Bases
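The data-contract idea above can be sketched as a schema agreed with the source team, validated against each incoming batch before loading. The contract fields and example records are hypothetical.

```python
# Hypothetical data-contract check: each field must be present and of the
# agreed type; violations are reported instead of silently loaded.

CONTRACT = {"order_id": str, "quantity": int, "sku": str}

def violations(record):
    out = []
    for field, typ in CONTRACT.items():
        if field not in record:
            out.append(f"missing:{field}")
        elif not isinstance(record[field], typ):
            out.append(f"type:{field}")
    return out

good = {"order_id": "A1", "quantity": 2, "sku": "X-9"}
bad  = {"order_id": "A2", "quantity": "2"}   # wrong type, missing field
print(violations(good))
print(violations(bad))
```

In a pipeline, a non-empty violation list would quarantine the batch and notify the owning source team, which is what keeps schemas consistent over time.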
10.0 - 14.0 years
8 - 11 Lacs
Bengaluru
Work from Office
As a Business Data Steward, you will be responsible for ensuring the quality and integrity of clinical operations data to enable better planning and monitoring of GSK's clinical trials. Your primary focus will be on specific data domains, with responsibilities including:
- Data Definition and Metadata Cataloging: Ensure clear definitions and metadata documentation for assigned data domains.
- Data Standards and Governance: Ensure data meets required standards as per defined data governance policies. Define data classification, security, use, and quality standards to adhere to compliance policies.
- Data Quality Assurance: Establish data quality checks and resolve issues by collaborating with Product Owners and data source SMEs. Triage and resolve user queries regarding data domains.
- Data Flow and Lineage: Understand and document data flow and lineage to ensure transparency and traceability across systems and processes.
- Issue Resolution and Collaboration: Resolve data quality issues in a matrix organization through cross-functional collaboration. Support third-party activity monitoring related to data management.
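The data-flow and lineage documentation responsibility above amounts to recording which upstream sources feed each dataset, then being able to trace any dataset back to its origins. A minimal sketch; the system names (CTMS, EDC) are common clinical-operations systems used here purely as illustrative placeholders.

```python
# Hypothetical lineage record: each dataset maps to its direct upstream
# sources; upstream() walks the graph transitively.

LINEAGE = {
    "trial_dashboard": ["ctms_extract", "edc_extract"],
    "ctms_extract": ["ctms"],
    "edc_extract": ["edc"],
}

def upstream(dataset):
    """All transitive upstream sources of a dataset, in discovery order."""
    seen = []
    for parent in LINEAGE.get(dataset, []):
        seen.append(parent)
        seen.extend(upstream(parent))
    return seen

print(upstream("trial_dashboard"))
```

Even a simple record like this makes traceability questions ("where did this trial metric come from?") answerable instead of tribal knowledge.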
5.0 - 7.0 years
6 - 10 Lacs
Gurugram
Work from Office
We are part of the Digital-IT team, established 17 years ago in Gurgaon, India, to provide technology support and roll out digital initiatives to 60-plus global offices. Digital-IT has six key pillars: Collaboration Technology; Functional Technology; Digital Technology; Security & Architecture; Infrastructure & Services; and Digital Success. These pillars support the business and take the lead on digital transformation initiatives, with a total strength of 150+ team members across the globe.

Requirements
Job Summary: We are seeking a skilled and motivated Data Engineer with strong proficiency in Python and a solid foundation in
2.0 - 5.0 years
6 - 11 Lacs
Bengaluru
Work from Office
We are currently seeking a Salesforce Data Cloud Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN). We are looking for a talented Salesforce Data Cloud Developer to design, build and support world-class Salesforce applications for our CRM-based product development. As a Salesforce Data Cloud Developer, you will be part of an agile (SAFe) team, participate in all SAFe ceremonies, and contribute daily to a high-quality implementation. You will work on end-to-end Data Cloud design and delivery, including data ingestion, modeling, identity resolution, Calculated Insights, segmentation, activations, and governance. You will actively help define, estimate, design, and implement required user stories/features; participate in daily stand-ups and full-day events; and continuously improve the product to maximize value. You will be part of a multidisciplinary team in a high-demand yet highly collaborative environment. Your role will also include troubleshooting and resolving defects, ensuring data quality and performance, documenting solutions, and supporting release and operational readiness (monitoring, alerting, and CI/CD).

Job Title: Salesforce Data Cloud Developer
Education: Bachelor's Degree (Computer Science) or equivalent
Years of Experience: 4 to 6 years; minimum 2 years with Salesforce Data Cloud

Job Responsibilities:
- Configure core Data Cloud components: set up and manage Data Streams, Data Lake Objects (DLOs), Data Model Objects (DMOs), Calculated Insights, and Identity Resolution rules.
- Develop and maintain data ingestion pipelines and integrations, with robust monitoring and error handling.
- Perform data profiling and metadata management; assess data quality and completeness.
- Maintain the data dictionary and related metadata documentation.
- Support DevOps practices, including CI/CD and release readiness.
- Contribute to testing and quality assurance.

Technical Skills:
- Bachelor's degree in computer science or software engineering.
- At least 5 years of Salesforce experience.
- At least 2 years of relevant working experience with Salesforce Data Cloud development.
- Good understanding of when to use declarative tools, declarative in combination with non-declarative tools, or non-declarative tools only, and the ability to advocate for the preferred tooling.

Soft Skills:
- Excellent verbal and written communication skills in English
- Experience in customer-facing projects
- Excellent team player with good lead capabilities
- Good decision-making capabilities
- Good presentation and reporting skills
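Identity resolution, mentioned in the responsibilities above, is configured declaratively in Data Cloud rather than coded; the toy sketch below only illustrates the underlying idea (match rules that unify records into one profile) and is not Salesforce's API. The rule here, exact email match OR normalized name plus phone match, and all field names are invented.

```python
# Toy illustration of an identity-resolution match rule: two records refer
# to the same person if their emails match exactly, or if their normalized
# names and phone numbers both match.

def normalize(s):
    """Lowercase and strip non-alphanumeric characters for fuzzy comparison."""
    return "".join(ch for ch in s.lower() if ch.isalnum())

def is_match(a, b):
    email_a, email_b = a.get("email") or "", b.get("email") or ""
    if email_a and email_a.lower() == email_b.lower():
        return True
    return (normalize(a.get("name", "")) == normalize(b.get("name", ""))
            and a.get("phone") == b.get("phone"))

r1 = {"name": "Asha Rao", "email": "asha@x.com", "phone": "999"}
r2 = {"name": "asha rao", "email": "", "phone": "999"}
print(is_match(r1, r2))
```

In Data Cloud the equivalent logic lives in Identity Resolution rulesets; matched records are then reconciled into a unified profile.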
4.0 - 7.0 years
7 - 8 Lacs
Noida
Work from Office
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Responsible for end-to-end development activities in Informatica MDM.
- Experience in developing multidomain MDM solutions.
- Expertise in working with Banking industry master data.
- Proficient in Informatica MDM configurations: data modeling, user group and privilege setup, hierarchy management, workflow setup, etc.
- Reference data management in Informatica MDM.
- Expertise in Data Quality (DQ) rules, match and merge strategies, golden record survivorship rules, and match tuning.
- Ability to configure the Informatica MDM user interface using the Provisioning Tool.
- Good knowledge of integration endpoints (outbound and inbound integration endpoints using SOAP, REST, and other upstream/downstream systems).
- Extend support to Data Stewards to improve business processes.
- Assist in MDM support activities.
- Implement and support data loading to the Informatica MDM application.
- Customize MDM solutions, creating master data management solutions using agile methodologies.
- Strong understanding of MDM concepts such as integration patterns, trust, survivorship, publishing, and Change Data Capture (CDC).
- Capable of working on new developments by understanding the data model, making changes to the repository as needed, onboarding source data, and handling DQ configurations and enrichment in MDM.

Mandatory skill sets: Data Modeling, User Group & Privilege Setup, Hierarchy Management, Workflow Setup, Data Quality

Preferred skill sets:
- Exposure to other MDM tools, such as IBM MDM, Reltio, or STIBO.
- Experience working with business, technology, and process workstreams for master data identification, definition, MDM requirements, architecture patterns, and setting up MDM platforms across the organization.
- Understanding of business domains, data definitions, and technical and operational metadata, with the ability to document and maintain a data dictionary and data quality definitions at both concept and data element levels.
- Certification in Informatica MDM or other MDM tools is a plus.
- Good knowledge of industries such as BFSI, Life Sciences, or Manufacturing, and related products, is preferred.

Years of experience required: 4-7 years
Education qualification: B.Tech/MBA/MCA
Degrees/Field of Study required: MBA (Master of Business Administration), Bachelor of Technology
Required Skills: Data Modeling
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 21 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship: No
Government Clearance Required: No
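The survivorship concept in the responsibilities above (building a golden record from multiple sources) is configured declaratively in Informatica MDM via trust scores; the sketch below only illustrates the idea with a simplified "most trusted source wins per attribute" rule. The source names and trust rankings are invented.

```python
# Toy survivorship rule: for each attribute, keep the value supplied by the
# most trusted source that actually provides it. Not Informatica's API.

TRUST = {"crm": 3, "billing": 2, "legacy": 1}  # higher = more trusted (assumed)

def golden_record(records):
    golden = {}
    # Iterate sources in ascending trust so more trusted values overwrite
    for rec in sorted(records, key=lambda r: TRUST[r["source"]]):
        for key, value in rec.items():
            if key != "source" and value not in (None, ""):
                golden[key] = value
    return golden

records = [
    {"source": "legacy", "name": "A. Rao",   "city": "Pune"},
    {"source": "crm",    "name": "Asha Rao", "city": ""},     # no city supplied
]
print(golden_record(records))
```

Note how the golden record mixes attributes: the name survives from the more trusted CRM source, while the city survives from the legacy source because CRM left it blank.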
2.0 - 5.0 years
13 - 15 Lacs
bengaluru
Work from Office
Should have good analytical skills and experience in writing SQL queries. Should have hands-on experience in developing Spark applications using the DataFrame API and PySpark SQL in Databricks for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data using Python to uncover insights into customer usage patterns. Should have hands-on experience extracting, transforming, and loading data from source systems (on-prem) to Azure data storage services using a combination of Azure Data Factory, T-SQL, and Spark SQL, including a metadata-driven ingestion pipeline framework built with Azure Data Factory. Should have real-world experience implementing Spark optimization techniques and adaptive query execution configuration in Databricks. Involved in conducting code reviews and sharing results with developers. Good understanding of and real-world architecture experience on Azure cloud (ADLS, ADFS, ADF). Knowledge of using GitHub for code versioning and the DevOps deployment process. Consume data and generate reports using Power BI. Good to have exposure to the USA Healthcare Insurance domain.
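For context, a "metadata-driven ingestion pipeline framework" generally means pipeline behavior is driven by a control table rather than hard-coded per source. A minimal Python sketch of the idea (the control-table fields and table names are hypothetical, not from the posting; in practice this metadata would be read by an ADF or Databricks pipeline):

```python
# Sketch: generate per-table copy tasks from an ingestion-control metadata list.
# All source/target names below are invented for illustration.

CONTROL = [
    {"source": "sales.orders", "format": "csv",     "target": "bronze/orders"},
    {"source": "sales.items",  "format": "parquet", "target": "bronze/items"},
]

def build_copy_tasks(control):
    """Turn control-table rows into declarative copy-task definitions."""
    tasks = []
    for row in control:
        tasks.append({
            "name": f"copy_{row['source'].replace('.', '_')}",
            "read": {"table": row["source"], "format": row["format"]},
            "write": {"path": row["target"], "format": "delta"},
        })
    return tasks

for t in build_copy_tasks(CONTROL):
    print(t["name"], "->", t["write"]["path"])
```

Adding a new source then becomes a metadata row, not new pipeline code, which is the scalability benefit the posting alludes to.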
8.0 - 12.0 years
10 - 14 Lacs
bengaluru
Work from Office
About Bhavin: Bhavin Turakhia is a serial entrepreneur and has founded multiple successful companies. His companies are unique in several ways: (1) all bootstrapped and self-funded, (2) with an intense focus on profitability, (3) delivering high ROCE, (4) serving global markets, (5) in enduring categories, (6) with the majority stake still owned by Bhavin. About Bhavin's Businesses: Zeta is a next-gen banking tech company that empowers banks and fintechs to launch banking products for the future. Its flagship processing platform, Zeta Tachyon, is the industry's first modern, cloud-native, and fully API-enabled stack that brings together issuance, processing, lending, core banking, fraud & risk, and many more capabilities as a single-vendor stack. Zeta is actively working with the largest banks and fintechs in multiple global markets, transforming customer experience for multi-million card portfolios. Zeta has over 1,700 employees, with over 70% of roles in R&D, across locations in the US, EMEA, and Asia. Zeta has raised $400 million at a $2 billion valuation from Softbank, UHG, Mastercard, and other investors. Titan is the first customer-centric email suite created specifically for professionals and small business owners, with features designed to enable deeper, more meaningful relationships with customers. Available through leading web hosts, site builders, and domain registrars, Titan provides the dynamic tools needed to effectively build customer relationships over email. In 2021, Titan received a $30M investment from WordPress, valuing Titan at $300M. Radix is one of the world's largest domain registries and the owner of the most premium top-level domain extensions, including .store, .inline, .tech, .online, .website, .site, .space, etc. Radix is profitable, lean, and was valued at over $900 million. (Note: The above is not a complete list.) About the Company: A stealth-mode Bangalore-based fintech startup building from India for the world.
We are committed to delivering cutting-edge personal finance management solutions and a robust marketplace for banking and wealth products. With a technology-first approach, we aim to empower consumers to manage their finances effectively and improve financial well-being across geographies, including India, the UK, and the USA. The selected candidate will work closely with the marketing functions across the broader group, ensuring alignment, synergy, and consistency in brand voice, digital strategy, and execution across entities. About the role: You will own mobile user acquisition and early-lifecycle growth across the US, UK, and India markets. The mandate: plan, launch, and scale app-install and activation campaigns across Google, Apple, Meta, programmatic (DV360/DSPs), and affiliate/CPI networks; build a privacy-ready measurement foundation; and run a disciplined test-and-learn program that improves activation, retention, and LTV. You'll be both strategist and operator: comfortable in the platforms, in the data, and in the details that drive compounding results. Job Location: Bangalore, India (100% on-site). Responsibilities: Acquisition & Channel Operations. Run multi-channel user acquisition end-to-end: execute across Google UAC, Apple Search Ads, Meta AAC, DV360/DSPs, and affiliate/CPI, covering goals, event mapping, keywords/match types, creative/asset feeds, placements, audience expansion, and learning-phase control. Own outcomes & pacing: hands-on management of platforms to hit install volume/quality and efficiency (CPI/CAC, CPQA, pLTV) targets, rebalance budgets to marginal ROI, and publish timely, accurate, and consistent reports. Measurement, Attribution & Data Integrity. MMP ownership (AppsFlyer): own event taxonomy, postbacks, SRN/partner configs, Protect360 rules, and rigorous data QA. iOS & Android specifics: design SKAN 4 conversion mapping, implement Play Install Referrer, and prepare for Android Privacy Sandbox attribution.
Source-of-truth dashboards: maintain SQL/BI cohort, funnel, retention, and ROAS/pLTV dashboards with data as the single source of truth. Lifecycle, Retention & Re-engagement. MoEngage collaboration: build high-intent segments and journeys to raise D0/D7/D30 retention; align CRM nudges with paid retargeting across drop-offs. Re-engagement: sync audiences and exclusions for retargeting and quantify incremental lift where feasible. Creative, ASO & Messaging. Creative engine: drive a weekly creative cadence with naming taxonomies and asset libraries across hooks, formats, and durations. ASO partnership: feed paid-search learnings into ASO metadata/keywords and align store creatives to winning ad narratives. Skills & Attributes: Deep, hands-on expertise with Google App Campaigns, Apple Search Ads, Meta app campaigns, DSPs, and affiliate/CPI; can launch, optimize, and scale across iOS and Android. Comfortable with AppsFlyer, SKAN 4, and Play Install Referrer; can design clean event schemas and resolve attribution issues. Understanding of cohorts and funnels, tracks CAC vs. LTV, and can work confidently in spreadsheets. Can navigate SQL/BI dashboards to get answers and insights that matter. Experimentation mindset, with the ability to write clear hypotheses, size tests properly, choose A/B vs. holdout when needed, and turn outcomes into repeatable playbooks. Strong understanding of technical and non-technical ASO; uses paid-search learnings to refine keywords and metadata; ensures messaging consistency in the stores. Experience with CRM platforms along with the proven ability to improve activation and early retention; can build practical retargeting and exclusion audiences. Bias to action, high accountability for targets, and the ability to scale what works while sunsetting what doesn't. Excellent communication skills with the ability to manage cross-functional stakeholders and workflows efficiently.
Experience & Qualifications: 6-8+ years in mobile growth/performance marketing for B2C apps, with direct hands-on work across Google UAC, Apple Search Ads, Meta, DSPs, and affiliate/CPI networks. Experience driving acquisition and growth for fintech or consumer tech products. Practical knowledge of AppsFlyer (or Adjust/Branch/Kochava), Firebase (Analytics, A/B testing, Remote Config), and MoEngage (segments, journeys). Proven track record of improving funnels from installs to registration/KYC, first action, and D1/D7/D30 retention by combining paid and lifecycle marketing. Strong data skills: advanced Excel/Sheets, ability to read SQL dashboards, and familiarity with BI tools like Looker, Tableau, or Power BI. Working knowledge of SKAN 4, Play Install Referrer, deep linking, web-to-app, and upcoming Privacy Sandbox changes. Exposure to fintech/regulatory requirements, execution in U.S. markets, ASO collaboration, and techniques like incrementality testing or MMM. PLEASE NOTE: While the job appears to be with Zeta, this role is for Bhavin Turakhia's new fintech startup in stealth mode.
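The efficiency targets this role tracks (CPI, CAC, ROAS) reduce to simple cohort arithmetic. A minimal Python sketch, with all figures invented for illustration:

```python
# Illustrative cohort-efficiency math for app campaigns (all numbers hypothetical).

def cpi(spend: float, installs: int) -> float:
    """Cost per install."""
    return spend / installs

def cac(spend: float, activations: int) -> float:
    """Cost per activated user (e.g. completed registration/KYC)."""
    return spend / activations

def roas(revenue: float, spend: float) -> float:
    """Return on ad spend for a cohort window (e.g. D30 revenue over spend)."""
    return revenue / spend

# Example cohort: $5,000 spend, 2,000 installs, 500 activations, $6,500 D30 revenue.
spend, installs, activations, revenue = 5_000.0, 2_000, 500, 6_500.0
print(f"CPI  = ${cpi(spend, installs):.2f}")     # $2.50 per install
print(f"CAC  = ${cac(spend, activations):.2f}")  # $10.00 per activation
print(f"ROAS = {roas(revenue, spend):.2f}x")     # 1.30x
```

In practice these metrics are computed per channel and per cohort window so budget can be rebalanced toward marginal ROI, as the responsibilities above describe.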
9.0 - 14.0 years
22 - 27 Lacs
hyderabad, bengaluru
Work from Office
3+ years of experience in Azure Databricks with PySpark. 2+ years of experience in Databricks Workflows & Unity Catalog. 2+ years of experience in ADF (Azure Data Factory). 2+ years of experience in ADLS Gen2. 2+ years of experience in Azure SQL. 3+ years of experience in the Azure cloud platform. 2+ years of experience in Python programming & package builds. Hands-on experience in designing and building scalable data pipelines using Databricks with PySpark, supporting batch and near-real-time ingestion, transformation, and processing. Ability to optimize Spark jobs and manage large-scale data processing using RDD/DataFrame APIs. Demonstrated expertise in partitioning strategies, file format optimization (Parquet/Delta), and Spark SQL tuning. Skilled in governing and managing data access for an Azure data lakehouse with Unity Catalog. Experience in configuring data permissions, object lineage, and access policies with Unity Catalog. Understanding of integrating Unity Catalog with Azure AD, external metastores, and audit trails. Experience in building efficient orchestration solutions using Azure Data Factory and Databricks Workflows. Ability to design modular, reusable workflows using tasks, triggers, and dependencies. Skilled in using dynamic expressions, parameterized pipelines, custom activities, and triggers. Familiarity with integration runtime configurations, pipeline performance tuning, and error handling strategies. Good experience in implementing secure, hierarchical namespace-based data lake storage for structured/semi-structured data, aligned to bronze-silver-gold layers with ADLS Gen2. Hands-on experience with lifecycle policies, access control (RBAC/ACLs), and folder-level security. Understanding of best practices in file partitioning, retention management, and storage performance optimization.
Capable of developing T-SQL queries and stored procedures and managing metadata layers on Azure SQL. Comprehensive experience working across the Azure cloud ecosystem, including networking, security, monitoring, and cost management relevant to data engineering workloads. Experience in writing modular, testable Python code used in data transformations, utility functions, and packaging reusable components. Familiarity with Python environments, dependency management (pip/Poetry/Conda), and packaging libraries. Ability to write unit tests using PyTest/unittest and integrate them with CI/CD pipelines. Able to prepare design documents for development, adopting code analyzers and unit testing frameworks.
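The Unity Catalog access-policy work described above is ultimately expressed as SQL GRANT statements on securables (catalogs, schemas, tables). A hedged Python sketch that renders such statements from a simple policy map — the catalog, schema, and group names are hypothetical; the privileges shown (USE CATALOG, USE SCHEMA, SELECT) are standard Unity Catalog privileges:

```python
# Sketch: render Unity Catalog GRANT statements from an access-policy mapping.
# Group, catalog, and schema names are made up for illustration.

POLICY = {
    "analysts": {
        "catalog": "main",
        "schema": "gold",
        "privileges": ["SELECT"],
    },
}

def render_grants(policy):
    """Build the GRANT statements needed for each group's read path."""
    stmts = []
    for group, rule in policy.items():
        cat, sch = rule["catalog"], rule["schema"]
        stmts.append(f"GRANT USE CATALOG ON CATALOG {cat} TO `{group}`;")
        stmts.append(f"GRANT USE SCHEMA ON SCHEMA {cat}.{sch} TO `{group}`;")
        for priv in rule["privileges"]:
            stmts.append(f"GRANT {priv} ON SCHEMA {cat}.{sch} TO `{group}`;")
    return stmts

print("\n".join(render_grants(POLICY)))
```

Keeping access policy in a declarative map like this makes grants reviewable and repeatable, which supports the lineage and audit-trail requirements in the posting.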
5.0 - 7.0 years
45 - 50 Lacs
hyderabad
Work from Office
If you are looking for a unique opportunity to have meaningful impact on a large scale at a leading financial services firm, then this is your chance. As a Data Scientist Lead within the Consumer and Community Banking (CCB) Data & Analytics Data Strategy and Product organization, you will be an integral part of the External Data Team, reporting directly to the External Data Product Owner. You will work to establish and maintain strong partnerships with data suppliers, ensuring that JPMC's data needs are fulfilled with high-quality and dependable sources. Your role will involve collaborating with suppliers to modernize metadata and facilitate cloud-based data delivery and publishing. Additionally, you will support the External Data Solutions Lab team, which is tasked with sourcing, evaluating, and onboarding external datasets, models, and capabilities to address existing data gaps and align with strategic priorities for CCB lines of business. This position offers the opportunity to engage with both internal business stakeholders and external data suppliers, contributing to the broader external data strategy. Job Responsibilities: Assisting in the management of external data cloud publishing requests, ensuring processes that prioritize viability, quality, accuracy, and usability. Modernizing the delivery and availability of external data for internal analysts and modelers in collaboration with an engineering team that supports cloud-to-cloud data publishing. Serving as a technical liaison by facilitating effective communication and understanding between the internal product team, engineering team, and data users. Assisting in the design and implementation of processes for extracting, transforming, loading, and refining metadata from a legacy semantic repository into a contemporary data discovery solution. Clearly communicating proposed scope and status of data management and publishing requests to drive a superior customer experience for all stakeholders.
Collaborating with internal teams to refine a large language model (LLM) solution that enhances metadata discovery for external data products. Required qualifications, skills, and capabilities: Bachelor's degree with 5+ years of experience in data & analytics or equivalent. Experience in financial services, data and analytics, technology, or a relevant start-up environment. Experience or familiarity with the fundamentals of cloud computing and providers (AWS, Snowflake, Azure, GCP). Outstanding analytical and problem-solving skills (problem structuring, driving to an answer, having a point of view), as well as a determination to become a subject matter expert within a respective field. Ability to understand and communicate business strategy with both internal stakeholders and external suppliers, adjusting communication style based on the audience. Strong project management skills with the ability to manage multiple projects and prioritize work within a multi-disciplinary team setting. Demonstrates detail-oriented work, ensuring accuracy and consistency over time. Ability to work effectively in cross-functional teams, ensuring project progress and alignment with organizational goals. Able to manage and own a work plan autonomously. Preferred qualifications, skills, and capabilities: Experience in building data pipelines and/or data management (preferred). Experience working with external suppliers and sourcing external data (preferred). Technical experience in Apache Spark and programming languages such as Python and Java (preferred). Familiarity or experience with GenAI solutions and how they can be effectively used as emerging solutions within financial services (preferred).
5.0 - 10.0 years
7 - 12 Lacs
noida
Work from Office
We are currently seeking a Systems Engineering Specialist Advisor to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Role Overview: The NTT DATA Services Security organization is looking for a talented, security-oriented Systems Engineering Specialist Advisor with strong Active Directory/Azure AD/identity skills. This role will be part of a larger security team dedicated to supporting, troubleshooting, and upgrading Active Directory, Azure AD, and related identity technologies. Role Responsibilities: Active Directory design, architecture solutions, and integration with platforms & applications. Develop an architecture of directory solutions for Windows, Unix, and related platforms. Experience in consolidations of multiple forests and domains, and demonstrated understanding of user accounts, machine accounts, and GPOs. Understand the requirement and create a migration plan for any services, e.g., DNS, DHCP, and Certificate Services (PKI). Analyze the requirement and design a solution to fulfil it with zero impact to other platforms. Develop PowerShell scripts with AD modules, or VB.NET, based on the requirements. Manage Azure Active Directory design, architect solutions, integration with platforms & applications, and the AD connector to Azure. Audit the security logs and integrate with SIEM. Conduct POCs with multiple vendors for AD solutions and prepare detailed test cases.
Create a clear recommendation document with pros and cons for senior management. Vulnerability assessment and management related to Active Directory, DNS & Windows platforms. Active Directory consolidations, including application integration, working with application teams. Recommend security best practices to achieve stated business objectives, advise on risk assumptions for any variances granted, and provide alternatives to achieve desired end results. Required Qualifications: Minimum 5 years of relevant experience in architecting, designing, and migrating Active Directory, Azure AD, Windows & endpoints. Strong demonstrated experience with the Active Directory Migration Tool or equivalent and consolidation of global forests and domains. Hands-on experience in successful consolidation of AD forests and domains. Good knowledge of Microsoft Identity Manager (MIM) 2016 or above; earlier versions would be a plus. Must have hands-on experience working on Azure AD (Azure Active Directory). Extensive experience working as an Azure admin for enterprise Active Directory setup and maintenance. Strong experience in AD trusts (two-way and one-way trusts) and deep knowledge of Active Directory schemas and metadata. Strong knowledge of Azure AD identity management & integration with on-premises environments. Strong knowledge of Azure Active Directory technologies, including authentication models, federation, Multi-Factor Authentication (MFA), conditional access policies, and other relevant capabilities.
Knowledge of best practices in AD/Azure privileged access management and modern AD/Azure secured administration practices. Strong PowerShell scripting. Strong knowledge of IAM disciplines like PIM and privileged administrative accounts; PAM solutions such as CyberArk. Good knowledge of ADFS and Azure AD sync connectors. Strong familiarity with Active Directory-integrated DNS, partitions, Infoblox & DHCP systems, and migration of services from Active Directory to any platform. Demonstrated knowledge and experience in AD assessment in terms of OU delegation, GPOs, permissions, etc. Expertise in Active Directory versions 2003, 2008 R2, 2012 R2, 2016, and 2019, and Azure Active Directory. Good knowledge and hands-on experience in setting up a lab based on the solution requirements. Demonstrated working knowledge and hands-on experience in AD disaster recovery and replication issue resolution, using tools such as repadmin. Demonstrated experience in writing and applying GPOs, especially related to domain consolidations. Good knowledge of Active Directory & Windows audit logs and levels, and SIEM integration. Good knowledge of networking and firewalls, including host firewalls, DNS, DHCP, DFS & network load balancers, and Secure Global Directory or Secure LDAP. Good knowledge of cryptography: certificates, PKI, symmetric and asymmetric keys, encryption & hash algorithms. Good knowledge of AD authentication protocols: Kerberos, NTLM, LDAP, LDAPS & LDAP StartTLS. Good knowledge of network log capture & analysis of packet captures through tools such as Wireshark, TShark, Microsoft Network Monitor, etc. Good knowledge of application integration with LDAP & Kerberos, i.e., keytab, krb5, etc. Good knowledge of AD migration tools like ADMT, Quest, etc.; knowledge of AD trusts, forest and domain tree structures, sites, DNS, GPOs, OUs, FRS, and DFSR.
Good knowledge of Identity & Access Management tools like FIM, MIM, OIM, Quest, etc. Exposure to SAML, OAuth, OpenID, and other security/IAM-related standards. Strong hands-on familiarity with host-based security solutions, forensic & investigation agents, compliance scanning and reporting, and hardening Active Directory. Knowledge of single sign-on, federation, Active Directory/LDAP, Kerberos/NTLM authentication & integrated Windows authentication. Good knowledge of identity management, role-based access control, attribute-based access control & entitlement management. Good knowledge of PowerShell scripting with AD modules, or VB.NET, and the ability to write scripts based on the requirement. Excellent communication skills, especially verbal and written. Good documentation skills to write design & configuration documents with version control. Excellent interpersonal skills and the ability to work as part of a team. Home office for remote work. Ability to work some weekends and late nights performing approved changes. ITIL V3 or later experience; experience in writing change requests and attending Change Advisory Board (CAB) meetings. Experience with security controls and compliance.
Posted 1 hour ago
5.0 - 10.0 years
7 - 11 Lacs
noida
Work from Office
We are currently seeking a Systems Engineering Advisor to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Role Overview: The NTT DATA Services Security organization is looking for a talented, security-oriented Systems Engineering Advisor with strong Active Directory/Azure AD/identity skills. This role will be part of a larger security team dedicated to supporting, troubleshooting, and upgrading Active Directory, Azure AD, and related identity technologies. Role Responsibilities: Active Directory design, architecture solutions, and integration with platforms & applications. Develop an architecture of directory solutions for Windows, Unix, and related platforms. Experience in consolidations of multiple forests and domains, and demonstrated understanding of user accounts, machine accounts, and GPOs. Understand the requirement and create a migration plan for any services, e.g., DNS, DHCP, and Certificate Services (PKI). Analyze the requirement and design a solution to fulfil it with zero impact to other platforms. Develop PowerShell scripts with AD modules, or VB.NET, based on the requirements. Manage Azure Active Directory design, architect solutions, integration with platforms & applications, and the AD connector to Azure. Audit the security logs and integrate with SIEM. Conduct POCs with multiple vendors for AD solutions and prepare detailed test cases.
Create a clear recommendation document with pros and cons for senior management. Vulnerability assessment and management related to Active Directory, DNS & Windows platforms. Active Directory consolidations, including application integration, working with application teams. Recommend security best practices to achieve stated business objectives, advise on risk assumptions for any variances granted, and provide alternatives to achieve desired end results. Required Qualifications: Minimum 5 years of relevant experience in architecting, designing, and migrating Active Directory, Azure AD, Windows & endpoints. Strong demonstrated experience with the Active Directory Migration Tool or equivalent and consolidation of global forests and domains. Hands-on experience in successful consolidation of AD forests and domains. Good knowledge of Microsoft Identity Manager (MIM) 2016 or above; earlier versions would be a plus. Must have hands-on experience working on Azure AD (Azure Active Directory). Extensive experience working as an Azure admin for enterprise Active Directory setup and maintenance. Strong experience in AD trusts (two-way and one-way trusts) and deep knowledge of Active Directory schemas and metadata. Strong knowledge of Azure AD identity management & integration with on-premises environments. Strong knowledge of Azure Active Directory technologies, including authentication models, federation, Multi-Factor Authentication (MFA), conditional access policies, and other relevant capabilities.
Knowledge of best practices in AD/Azure privileged access management and modern AD/Azure secured administration practices. Strong PowerShell scripting. Strong knowledge of IAM disciplines like PIM and privileged administrative accounts; PAM solutions such as CyberArk. Good knowledge of ADFS and Azure AD sync connectors. Strong familiarity with Active Directory-integrated DNS, partitions, Infoblox & DHCP systems, and migration of services from Active Directory to any platform. Demonstrated knowledge and experience in AD assessment in terms of OU delegation, GPOs, permissions, etc. Expertise in Active Directory versions 2003, 2008 R2, 2012 R2, 2016, and 2019, and Azure Active Directory. Good knowledge and hands-on experience in setting up a lab based on the solution requirements. Demonstrated working knowledge and hands-on experience in AD disaster recovery and replication issue resolution, using tools such as repadmin. Demonstrated experience in writing and applying GPOs, especially related to domain consolidations. Good knowledge of Active Directory & Windows audit logs and levels, and SIEM integration. Good knowledge of networking and firewalls, including host firewalls, DNS, DHCP, DFS & network load balancers, and Secure Global Directory or Secure LDAP. Good knowledge of cryptography: certificates, PKI, symmetric and asymmetric keys, encryption & hash algorithms. Good knowledge of AD authentication protocols: Kerberos, NTLM, LDAP, LDAPS & LDAP StartTLS. Good knowledge of network log capture & analysis of packet captures through tools such as Wireshark, TShark, Microsoft Network Monitor, etc. Good knowledge of application integration with LDAP & Kerberos, i.e., keytab, krb5, etc. Good knowledge of AD migration tools like ADMT, Quest, etc.; knowledge of AD trusts, forest and domain tree structures, sites, DNS, GPOs, OUs, FRS, and DFSR.
Good knowledge of Identity & Access Management tools like FIM, MIM, OIM, Quest, etc. Exposure to SAML, OAuth, OpenID, and other security/IAM-related standards. Strong hands-on familiarity with host-based security solutions, forensic & investigation agents, compliance scanning and reporting, and hardening Active Directory. Knowledge of single sign-on, federation, Active Directory/LDAP, Kerberos/NTLM authentication & integrated Windows authentication. Good knowledge of identity management, role-based access control, attribute-based access control & entitlement management. Good knowledge of PowerShell scripting with AD modules, or VB.NET, and the ability to write scripts based on the requirement. Excellent communication skills, especially verbal and written. Good documentation skills to write design & configuration documents with version control. Excellent interpersonal skills and the ability to work as part of a team. Home office for remote work. Ability to work some weekends and late nights performing approved changes. ITIL V3 or later experience; experience in writing change requests and attending Change Advisory Board (CAB) meetings. Experience with security controls and compliance.
Posted 1 hour ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As a Product & Manufacturer Data Validation Specialist, your role will involve validating manufacturer information and ensuring accurate product taxonomy classification for industrial parts. You will work with AI-generated data, Excel workbooks, and online resources to enhance the integrity of product and supplier databases. Here are the key responsibilities: - Manufacturer Data Validation: - Review manufacturer names extracted via AI-assisted web search and assess accuracy. - Conduct manual research to confirm manufacturer identities and correct discrepancies. - Update Excel workbooks with verified and standardized manufacturer data. - Flag ambiguous or unverifiable entries for further investigation. - Provide feedback to improve AI data extraction and enrichment workflows. - Product Taxonomy Validation: - Validate and refine product taxonomy for industrial parts, ensuring alignment with internal classification standards. - Review AI-generated or vendor-supplied categorizations and correct misclassifications. - Research product attributes to determine appropriate category placement. - Document taxonomy rules, exceptions, and updates for reference and training. - Support taxonomy mapping for new product onboarding and catalog expansion. Qualifications Required: - Strong proficiency in Microsoft Excel, including formulas, filters, and data management tools. - Excellent research and analytical skills with a high attention to detail. - Experience working with AI-generated data or web scraping tools is a plus. - Familiarity with manufacturer databases, industrial parts, or technical product data. - Understanding of product taxonomy, classification systems, and metadata. - Ability to work independently and manage multiple tasks effectively. - Strong communication skills for cross-functional collaboration.
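The name-standardization and flagging steps described above can be automated in part. A minimal Python sketch of the idea — the legal-suffix list and placeholder values are illustrative assumptions, not the employer's actual validation standard:

```python
# Sketch: normalize manufacturer names and flag entries needing manual review.
# The suffix set and placeholder values below are invented for illustration.
import re

LEGAL_SUFFIXES = {"inc", "inc.", "llc", "ltd", "ltd.", "corp", "corp.", "gmbh", "co", "co."}

def standardize(name: str) -> str:
    """Trim whitespace, collapse internal spaces, drop trailing legal suffixes, title-case."""
    tokens = re.sub(r"\s+", " ", name.strip()).split(" ")
    while tokens and tokens[-1].lower() in LEGAL_SUFFIXES:
        tokens.pop()
    return " ".join(t.capitalize() for t in tokens)

def needs_review(name: str) -> bool:
    """Flag blank, single-character, or placeholder-looking names for manual research."""
    cleaned = standardize(name)
    return len(cleaned) < 2 or cleaned.lower() in {"unknown", "n/a", "none"}

print(standardize("  ACME   Fasteners Inc. "))  # Acme Fasteners
print(needs_review("N/A"))                      # True
```

Rules like these catch only the mechanical cases; confirming manufacturer identity still requires the manual research the role describes.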
Posted 1 day ago
3.0 - 6.0 years
6 - 10 Lacs
hyderabad, gurugram, chennai
Work from Office
Develop & Optimize Data Pipelines: Build, test, and maintain ETL/ELT data pipelines using Azure Databricks & Apache Spark (PySpark). Optimize performance and cost-efficiency of Spark jobs. Ensure data quality through validation, monitoring, and alerting mechanisms. Understand cluster types, configuration, and use cases for serverless. Implement Unity Catalog for Data Governance: Design and enforce access control policies using Unity Catalog. Manage data lineage, auditing, and metadata governance. Enable secure data sharing across teams and external stakeholders. Integrate with Cloud Data Platforms: Work with Azure Data Lake Storage, Azure Blob Storage, and Azure Event Hubs to integrate Databricks with cloud-based data lakes, data warehouses, and event streams. Implement Delta Lake for scalable, ACID-compliant storage. Automate & Orchestrate Workflows: Develop CI/CD pipelines for data workflows using Azure Databricks Workflows or Azure Data Factory. Monitor and troubleshoot failures in job execution and cluster performance. Collaborate with Stakeholders: Work with data analysts, scientists, and business teams to understand requirements. Translate business needs into scalable data engineering solutions. API Expertise: Ability to pull data from a wide variety of APIs using different strategies and methods. Required Skills & Experience: Azure Databricks & Apache Spark (PySpark): strong experience in building distributed data pipelines. Python: proficiency in writing optimized and maintainable Python code for data engineering. Unity Catalog: hands-on experience implementing data governance, access controls, and lineage tracking. SQL: strong knowledge of SQL for data transformations and optimizations. Delta Lake: understanding of time travel, schema evolution, and performance tuning. Workflow Orchestration: experience with Azure Databricks Jobs or Azure Data Factory. CI/CD & Infrastructure as Code (IaC): familiarity with the Databricks CLI, Databricks Asset Bundles (DABs), and DevOps principles.
Security & Compliance: Knowledge of IAM, role-based access control (RBAC), and encryption. Preferred Qualifications: Experience with MLflow for model tracking & deployment in Databricks. Familiarity with streaming technologies (Kafka, Delta Live Tables, Azure Event Hubs, Azure Event Grid). Hands-on experience with dbt (Data Build Tool) for modular ETL development. Certification in Databricks or Azure is a plus. Experience with Azure Databricks lakehouse connectors for Salesforce and SQL Server. Experience with Azure Synapse Link for Dynamics and Dataverse. Familiarity with other data pipeline strategies, like Azure Functions, Fabric, ADF, etc. Soft Skills: Strong problem-solving and debugging skills. Ability to work independently and in teams. Excellent communication and documentation skills.
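The "validation, monitoring, and alerting" responsibility above usually begins with rule-based checks on each batch. A minimal pure-Python sketch of the pattern — column names and rules are invented for illustration; in Databricks the same idea is applied to DataFrames rather than lists of dicts:

```python
# Sketch: rule-based batch validation producing a list of violations.
# Columns and rules below are hypothetical examples.
rows = [
    {"order_id": 1,    "amount": 25.0},
    {"order_id": None, "amount": -3.0},
]

RULES = [
    ("order_id_not_null",  lambda r: r["order_id"] is not None),
    ("amount_non_negative", lambda r: r["amount"] >= 0),
]

def validate(rows, rules):
    """Return (rule_name, row_index) pairs for every failed check."""
    violations = []
    for i, row in enumerate(rows):
        for name, check in rules:
            if not check(row):
                violations.append((name, i))
    return violations

bad = validate(rows, RULES)
print(bad)  # [('order_id_not_null', 1), ('amount_non_negative', 1)]
```

A non-empty violation list is what would feed the monitoring and alerting mechanisms the posting mentions (e.g. failing the job or paging on-call).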
Posted 2 days ago
2.0 - 7.0 years
4 - 9 Lacs
bengaluru
Work from Office
Role Overview: We are looking for a detail-oriented and collaborative Sitecore Specialist to support the transition to Sitecore XM Cloud, manage ongoing content updates, and ensure SEO best practices are implemented across our digital platforms. This role will work closely with marketing, IT, and external partners to ensure a smooth migration and maintain a consistent, optimized web presence. Key Responsibilities: Sitecore XM Cloud Transition Support: Coordinate with internal teams and vendors to support the migration to Sitecore XM Cloud. Assist in content mapping, quality checks, and publishing during the transition. Track progress and flag issues related to content structure, formatting, or user experience. Sitecore Maintenance: Manage routine content updates and page publishing within Sitecore. Ensure brand consistency and accuracy across all web pages. Collaborate with business units to gather and implement content changes. SEO & Web Optimization: Apply basic SEO best practices (metadata, headings, alt text, internal linking). Monitor page performance and suggest improvements based on analytics. Coordinate with SEO specialists or agencies for deeper optimization efforts. Qualifications: Bachelor's degree in Marketing, Communications, or a related field. 2+ years of experience in digital content management or website coordination. Familiarity with CMS platforms (Sitecore experience preferred). Basic understanding of SEO principles and web analytics. Strong understanding of HTML and CSS. Strong organizational and communication skills. Attention to detail and ability to manage multiple tasks. Preferred Skills: Experience working with cross-functional teams. Exposure to cloud-based CMS platforms or digital transformation projects. Familiarity with tools like Google Analytics and Ahrefs.
Posted 2 days ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
About Netskope: Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope. About the position: Netskope is looking for an experienced Salesforce Developer to join our Go-To-Market (GTM) Systems Team. This role will be responsible for supporting Netskope's Commercial Salesforce org, from solution design to development of programmatic solutions that support key business initiatives. The ideal candidate will have a strong understanding of Salesforce programmatic development, as well as deep expertise in leading and executing technical solution design within a complex Salesforce environment. Responsibilities: Provide technical expertise and develop scalable solutions to support Netskope's Commercial Salesforce org. Work cross-functionally with key GTM stakeholders to scope, design, and execute technical solutions within Salesforce to support key initiatives. Develop flows, Apex, and Lightning Web Components to support solution design. Work closely with other Salesforce Administrators and Developers to support Sales Cloud, Service Cloud, and CPQ within our Commercial org. Work closely with the development team to define and enforce strict change management policies. Optimize our Salesforce environments by identifying and eliminating technical debt. Create and maintain code documentation.
Proactively monitor for and troubleshoot technical issues. Requirements: 3+ years of experience administering Salesforce at an enterprise SaaS company. Salesforce Certified Platform Developer. Experience working in a complex Salesforce environment with multiple clouds, including CPQ and Service Cloud. Experience with iPaaS solutions such as Workato, Zapier, or MuleSoft. Hands-on experience with technical solution design and development using flows, Apex, and Lightning Web Components. Familiarity with integrating and managing third-party applications with Salesforce. Experience with the end-to-end development process, including sprint cycles, request evaluation, estimating level of effort, deployment using the Metadata API/change sets, code coverage, and post-mortems. Familiarity with Salesforce architecture and hands-on knowledge of Salesforce APIs. Education: Bachelor's degree in a related field. #LI-TD1
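The Salesforce API familiarity called for above typically starts with the REST query resource, which takes a URL-encoded SOQL string. A small sketch that builds the request URL only (no request is sent; the instance host and API version are placeholders, and real calls additionally need an OAuth bearer token in the Authorization header):

```python
# Sketch of building a Salesforce REST API query URL. The
# /services/data/vXX.X/query endpoint accepts a SOQL string in the
# q parameter. Host and version below are assumed, not real.
from urllib.parse import urlencode

API_VERSION = "v59.0"  # assumed; use a version your org supports

def soql_query_url(instance_host, soql):
    """Return the REST endpoint URL for a SOQL query (URL only)."""
    return (f"https://{instance_host}/services/data/{API_VERSION}/query?"
            + urlencode({"q": soql}))

url = soql_query_url(
    "example.my.salesforce.com",
    "SELECT Id, Name FROM Account WHERE Industry = 'Security' LIMIT 5",
)
print(url)
```

An iPaaS tool such as Workato or MuleSoft wraps the same endpoints behind connectors, but understanding the underlying URL and SOQL shape helps when debugging integrations.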
Posted 2 days ago
3.0 - 7.0 years
5 - 9 Lacs
Gurugram
Work from Office
EY-Consulting Oracle Analytics Cloud Staff Consultant: We're looking for a Staff Consultant with expertise in BI and ETL tools, especially OAC/OBIEE/Power BI/ODI/Informatica, to join the EA group of our Consulting team. This is a fantastic opportunity to be part of a leading firm while being instrumental in the growth of a new service offering. We are seeking an experienced and motivated engineer with a strong track record in OAC/OBIEE/Power BI, business analytics, and data warehousing to join our team, providing deep technical expertise in analytics, business intelligence, data warehousing, ETL, and the power utility sector. Our engineers work closely with external clients, presales, other architects, and internal teams to design, build, and enable solutions on different analytics platforms. This role demands a highly technical, extremely hands-on cloud engineer who will work closely with our EY Partners and external clients to develop new business as well as drive other initiatives on Oracle Analytics, ETL, and data warehousing. The ideal candidate must have a good understanding of the value of data and analytics, along with proven experience in delivering solutions to different lines of business and technical leadership. Taking a top-down approach, the candidate will engage with customers to discover business problems and goals, and develop solutions using different cloud services. Your key responsibilities: Expertise in Oracle's analytics offerings: Oracle Analytics Cloud, Data Visualization, OBIEE, and Fusion Analytics for Warehouse. Strong design skills in BI and ETL across various scenarios. Experience in other BI tools like Power BI or Tableau is preferred. Solution design skills to provide expertise and guide customers for specific needs. Be extremely hands-on with analytics and data warehousing report/solution development. Deliver PoCs tailored to customers' needs. Run and deliver customer hands-on workshops.
Interact with all roles at the customer, including executives, architects, technical staff, and business representatives. Building effective relationships, customer focus, effective communication, and coaching. Knowledge of FDI is an advantage. Some experience with ETL processes and ETL tools like ODI or Informatica. Skills and attributes for success: Primary focus on developing customer solutions using Oracle's analytics offerings: Oracle Analytics Cloud, Data Visualization, Fusion Analytics for Warehouse, OBIEE, OBIA, etc. Experience in other BI tools like Power BI or Tableau is preferred. Must have extensive hands-on, end-to-end implementation experience using OAC/OBIEE and BI Publisher. Knowledge of Oracle BI Repository (RPD) development. Experience configuring OBIEE/OAC security (authentication and authorization; object-level and data-level security) as well as tuning reports. Experience working with session and repository variables and initialization blocks to streamline administrative tasks and modify metadata content dynamically. Experience with report performance optimization. Experience developing dimensional hierarchies and adding multiple sources to business model objects. Solid knowledge of data extraction using SQL. Good knowledge of Oracle applications (Oracle E-Business Suite or Oracle ERP, Oracle HCM, the Oracle Cloud SaaS offering) is preferable. Deep knowledge of databases, cloud concepts, Autonomous Data Warehouse (ADW), and data integration tools such as ODI, Informatica, etc. is an added advantage. To qualify for the role, you must have: 3-7 years of data warehousing and business intelligence project experience. 2-5 years of project experience with OBIEE/Power BI or other BI tools. At least 2 years of OAC implementation experience. Experience working with Oracle ERP data.
Ideally, you'll also have: Experience engaging with business partners and IT to understand requirements from various parts of an organization and drive the design, programming execution, and UAT for future-state capabilities within the platform. The ability to work in a fast-paced and dynamic environment while managing multiple projects and strict deadlines. A good understanding of outsourcing and offshoring, and of building win/win strategies and contracts with suppliers. Experience with other data visualization tools like Power BI or Tableau would be a plus. Experience or good knowledge of Oracle applications such as Oracle CC&B, Oracle MDM, etc. Integration development to/from other systems. Consulting experience, including assessments and implementations. Documenting requirements and processes (e.g., process flows). Working collaboratively in a team environment. Excellent oral and written communication skills. Strong analytical and problem-solving skills. B.E./B.Tech./Master's degree required. At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
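The SQL data-extraction skill this role asks for is, at its core, star-schema work: joining fact tables to dimensions and aggregating. A minimal sketch, with sqlite3 standing in for the warehouse (ADW, Oracle, etc.) and hypothetical table names:

```python
# Minimal star-schema extraction: a fact table joined to a dimension
# and aggregated -- the bread-and-butter SQL behind BI reports.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, region_name TEXT);
CREATE TABLE fct_sales  (sale_id INTEGER PRIMARY KEY,
                         region_id INTEGER REFERENCES dim_region(region_id),
                         amount REAL);
INSERT INTO dim_region VALUES (1, 'North'), (2, 'South');
INSERT INTO fct_sales VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
""")

rows = con.execute("""
    SELECT d.region_name, SUM(f.amount) AS total
    FROM fct_sales f
    JOIN dim_region d ON d.region_id = f.region_id
    GROUP BY d.region_name
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('North', 150.0), ('South', 75.0)]
```

In an OBIEE/OAC deployment, the RPD's business model encodes exactly these join paths and aggregation rules, so the generated physical SQL follows the same pattern.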
Posted 2 days ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Enterprise Systems Technical Specialist (Primavera) R0145787 | Hybrid | Bengaluru, Karnataka, India | Full time. Role Summary: The Enterprise Systems Technical Specialist will provide expert-level technical support, system configuration, and integration management for Oracle Primavera P6. This role will support infrastructure readiness, data governance, and automation across project planning and engineering document management systems. The ideal candidate will bridge project controls with engineering systems to drive consistency, reliability, and cross-system optimization. Key Responsibilities: Administer and configure Primavera P6 EPPM (environments, global data, layouts, roles, and security). Support Primavera integrations with enterprise systems via APIs and middleware. Create and maintain activity codes, filters, templates, and dashboards. Troubleshoot data sync, baseline, and performance issues. Ensure compliance with corporate IT policy, patching, and change control. Author and maintain user documentation and process guides. Knowledge, Skills, and Abilities: Strong documentation and communication skills with an ability to train diverse user communities. Analytical thinker and strategic collaborator. Self-motivated and adaptable in evolving environments. Skilled at building cross-functional relationships between business, engineering, and IT stakeholders. Effective communicator, capable of training users and managing system vendors. Minimum Qualifications: 5 years of hands-on experience administering Primavera P6 EPPM in a technical capacity. Advanced understanding of project controls and document control principles (WBS, CPM, metadata configuration, workflows). Experience managing databases (Oracle, SQL Server), integrations (APIs, PowerShell), and middleware. Demonstrated experience with application infrastructure, environments, and workflows. Familiarity with ITIL frameworks and Service Desk ticketing systems.
Preferred Qualifications: Experience with Primavera integration or workflow alignment. Background supporting DIBCAC High or cybersecurity-sensitive environments. Knowledge of Bentley Cloud-hosted platforms. Familiarity with SAML, LDAP, or Active Directory for authentication. Certifications in Oracle Primavera are a plus. Amentum is proud to be an Equal Opportunity Employer. Our hiring practices provide equal opportunity for employment without regard to race, sex, sexual orientation, pregnancy (including pregnancy, childbirth, breastfeeding, or medical conditions related to pregnancy, childbirth, or breastfeeding), age, ancestry, United States military or veteran status, color, religion, creed, marital or domestic partner status, medical condition, genetic information, national origin, citizenship status, low-income status, or mental or physical disability so long as the essential functions of the job can be performed with or without reasonable accommodation, or any other protected category under federal, state, or local law. Learn more about your rights under Federal laws and supplemental language at Labor Laws Posters.
Posted 2 days ago
Metadata roles are in high demand in India, with many companies looking for professionals who can manage and analyze data effectively. In this article, we will explore the metadata job market in India, including top hiring locations, salary ranges, career progression, related skills, and common interview questions.
These cities are known for their thriving tech sectors and offer numerous opportunities for metadata professionals.
The average salary range for metadata professionals in India varies by experience level:
- Entry-level: ₹3-6 lakhs per annum
- Mid-level: ₹6-12 lakhs per annum
- Experienced: ₹12-20 lakhs per annum
Salaries may vary based on the company, location, and specific job responsibilities.
In the metadata field, a career typically progresses as follows:
- Metadata Analyst
- Metadata Specialist
- Metadata Manager
- Metadata Architect
As professionals gain experience and expertise, they can move into more senior roles with increased responsibilities.
In addition to metadata management, professionals in this field are often expected to have skills in:
- Data analysis
- Database management
- Data modeling
- Information governance
Having a combination of these skills can make job seekers more attractive to potential employers.
As you explore metadata jobs in India, remember to showcase your skills and experience confidently during interviews. By preparing thoroughly and demonstrating your expertise in metadata management, you can increase your chances of securing a rewarding career in this field. Good luck with your job search!