6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Modeler at our company, you will play a crucial role in designing conceptual, logical, and physical models for Azure Databricks and Azure Data Lake to support structured, semi-structured, and unstructured data. Your responsibilities will include:
- Utilizing your 6+ years of experience in data modeling, with a preference for insurance industry datasets such as policies, claims, customer, or actuarial data.
- Demonstrating advanced skills in data modeling tools like Erwin, ER/Studio, PowerDesigner, or Microsoft Visio, and version control using GitHub.
- Applying a deep understanding of relational, dimensional, and data lake modeling techniques optimized for Databricks/Spark-based processing.
- Modeling and documenting metadata, reference data, and master data with Informatica to support robust data governance and quality.
- Utilizing strong SQL and Spark skills for data profiling, validation, and prototyping in Databricks environments.
- Ensuring compliance with regulatory requirements for insurance data, such as IFRS 17 and Solvency II.

Regarding the company, Virtusa values teamwork, quality of life, and professional development. With a global team of 27,000 professionals, we are dedicated to providing exciting projects, opportunities, and work with cutting-edge technologies to support your career growth. At Virtusa, we foster a collaborative team environment that encourages new ideas and excellence. Join us to unleash your potential and contribute to our dynamic work culture.
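The data profiling skills listed above can be sketched in a minimal way — plain Python standing in for a Spark/Databricks notebook, with a hypothetical `policies` sample (the rows and column names are illustrative, not a real insurance dataset):

```python
# Minimal data-profiling sketch: null and distinct counts per column.
# Plain Python stands in for Spark/Databricks; the `policies` rows are
# illustrative, not a real insurance dataset.

def profile(rows):
    """Return {column: {"nulls": int, "distinct": int}} for a list of dicts."""
    columns = {key for row in rows for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in rows]
        report[col] = {
            "nulls": sum(v is None for v in values),
            "distinct": len({v for v in values if v is not None}),
        }
    return report

policies = [
    {"policy_id": "P-1", "premium": 1200.0, "region": "south"},
    {"policy_id": "P-2", "premium": None,   "region": "south"},
    {"policy_id": "P-3", "premium": 950.0,  "region": "north"},
]

print(profile(policies))  # premium shows 1 null and 2 distinct values
```

In a real engagement the same counts would come from Spark aggregations over full tables; the shape of the output (per-column null and cardinality stats) is what feeds validation rules.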
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You will play a critical role as a Customer Data Platform (CDP) Operations Specialist at Amgen, enabling personalized, data-driven customer engagement across the omnichannel ecosystem. Your responsibilities will include:
- Building and managing audience segmentation and activation using Salesforce Customer Data Platform (CDP), Data Cloud, to enable personalized, data-driven campaigns.
- Configuring and validating data source trust, identity resolution, and profile stitching rules within Salesforce Data Cloud to ensure accurate and unified customer profiles.
- Collaborating with data engineering and analytics teams to maintain scalable, reliable data pipelines feeding into CDP.
- Implementing audience governance frameworks to ensure data privacy, compliance (e.g., CAN-SPAM, CCPA, GDPR), and consistency in audience usage.
- Supporting campaign orchestration by defining entry criteria, decision splits, and activation logic tied to unified CDP segments.
- Conducting QA and UAT of CDP audiences and integrations to validate accuracy, coverage, and expected campaign performance.
- Monitoring segment performance, activation trends, and data flows, providing insights and recommendations for optimization.
- Staying current on Salesforce CDP enhancements, features, and best practices, contributing to continuous improvement and knowledge sharing across the organization.
- Managing Data Lake Object (DLO)-to-Data Model Object (DMO) mapping to ensure seamless alignment between ingested data and the business-ready canonical model.
- Supporting new data ingestions in collaboration with the DTI team, ensuring data integrity and scalability.
- Overseeing data transformation monitoring and maintenance within CDP, proactively addressing errors or inefficiencies.
- Driving identity resolution processes to unify customer records across systems.
- Ensuring the DMO layer is business-ready by validating that fields, objects, and naming conventions are meaningful to business users.
- Partnering with the Campaign Execution team to resolve technical challenges in segmentation and activation, ensuring smooth downstream execution.
- Facilitating cross-functional issue resolution, coordinating between business, data engineering, and tech teams to address blockers and ensure timely delivery.

Qualifications:
- Master's degree with 5+ years of relevant experience in Marketing Technology, CRM, Data Analytics, Computer Science, or a related field; OR
- Bachelor's degree with 5+ years of relevant experience.
- Strong understanding of digital compliance, web accessibility, and regulated industry standards.
- Excellent communication, stakeholder management, and project delivery skills.

Preferred Experience:
- Pharmaceutical or healthcare sector experience.
- Familiarity with Agile methodologies and sprint-based delivery.
- Working knowledge of Salesforce Marketing Cloud or other CRM platforms.
- Technical understanding of web architectures, APIs, and performance optimization.

Functional Skills:
Must-Have Skills:
- Hands-on experience with Salesforce CDP (Data Cloud) for segmentation, audience management, and activation.
- Familiarity with Salesforce Marketing Cloud (Journey Builder, Automation Studio, Email Studio, Mobile Studio, Advertising Studio).
- Proficiency in SQL/SOQL and understanding of data structures in CRM and CDP environments.
- Knowledge of data privacy, governance, and compliance standards in the U.S. (e.g., CAN-SPAM, CCPA, GDPR).

Good-to-Have Skills:
- Exposure to identity resolution concepts and methodologies.
- Experience integrating CDP segments into multichannel orchestration tools.
- Familiarity with marketing analytics tools and data visualization platforms.
- Experience working in agile methodologies and sprint planning processes.

Professional Certifications:
- Salesforce Data Cloud Consultant certification (preferred).
- Salesforce Marketing Cloud certifications, e.g., Marketing Cloud Email Specialist, Marketing Cloud Consultant, Marketing Cloud Developer (preferred).

Soft Skills:
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication and collaboration skills; able to partner with technical and non-technical stakeholders.
- Effective organizational skills with the ability to manage multiple initiatives simultaneously.
- Demonstrated ability to thrive in a fast-paced, agile environment.
- Strong presentation skills to communicate CDP insights and operational results to stakeholders.
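Identity resolution of the kind described above — unifying customer records across systems — can be sketched with a simple union-find keyed on a shared identifier. This is an illustrative toy (hypothetical field names and matching only on email), not Salesforce Data Cloud's actual stitching rules:

```python
# Illustrative identity-resolution sketch: records from different source
# systems are merged into one profile when they share an email address.
# Field names are hypothetical; real CDP stitching rules are far richer.

def resolve_identities(records):
    """Group records into unified profiles by shared email (union-find)."""
    parent = {}

    def find(x):
        while parent.setdefault(x, x) != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    seen_email = {}
    for i, rec in enumerate(records):
        email = rec.get("email")
        if email in seen_email:
            union(i, seen_email[email])
        elif email is not None:
            seen_email[email] = i

    profiles = {}
    for i in range(len(records)):
        profiles.setdefault(find(i), []).append(records[i])
    return list(profiles.values())

crm = {"source": "crm", "email": "a@x.com", "name": "Ana"}
web = {"source": "web", "email": "a@x.com", "name": "A. Silva"}
erp = {"source": "erp", "email": "b@x.com", "name": "Ben"}
print(resolve_identities([crm, web, erp]))  # two unified profiles
```

Production stitching adds fuzzy matching, match confidence, and precedence rules per source system; the grouping step itself stays the same shape.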
Posted 1 day ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
You are a strategic thinker passionate about driving solutions in business architecture. You have found the right team. As a Banking Book Product Owner Analyst in our Firmwide Finance Business Architecture (FFBA) team, you will spend each day defining, refining, and delivering set goals for our firm. You will partner with stakeholders across various lines of business and subject matter experts to understand products, data, source system flows, and business requirements related to Finance and Risk applications and infrastructure.

**Key Responsibilities:**
- Utilize the Agile framework to write business requirements in the form of user stories to enhance data, test execution, reporting automation, and digital analytics toolsets.
- Engage with development teams to translate business needs into technical specifications, ensuring acceptance criteria are met.
- Drive adherence to product and Release Management standards and operating models.
- Manage the release plan, including scope, milestones, sourcing requirements, test strategy, execution, and stakeholder activities.
- Collaborate with lines of business to understand products, data capture methods, and strategic data sourcing into a cloud-based big data architecture.
- Identify and implement solutions for business process improvements, creating supporting documentation and enhancing the end-user experience.
- Collaborate with implementation leads, release managers, project managers, and data SMEs to align data and system flows with Finance and Risk applications.
- Oversee the entire Software Development Life Cycle (SDLC) from requirements gathering to testing and deployment, ensuring seamless integration and execution.

**Qualifications Required:**
- Bachelor's degree with 2+ years of experience in Project Management or Product Ownership, with a focus on process re-engineering.
- Proven experience as a Product Owner with a strong understanding of agile principles and delivering complex programs.
- Strong analytical and problem-solving abilities, with the capacity to quickly assimilate business and technical knowledge.
- Experience in Finance, Risk, or Operations as a Product Lead.
- Familiarity with Traditional Credit Products and Liquidity and Credit reporting data.
- Highly responsible, detail-oriented, and able to work to tight deadlines.
- Excellent written and verbal communication skills, with the ability to articulate complex concepts to diverse audiences.
- Strong organizational abilities to manage multiple work streams concurrently, maintaining sound judgment and a risk mindset.
- Solid understanding of financial and regulatory reporting processes.
- Energetic, adaptable, self-motivated, and effective under pressure.
- Basic knowledge of cloud technologies (e.g., AWS).

Preferred qualifications, capabilities, and skills include knowledge of JIRA, SQL, the Microsoft suite of applications, Databricks, and data visualization/analytical tools (Tableau, Alteryx, Python). Additionally, knowledge and experience of Traditional Credit Products (Loans, Deposits, Cash, etc.) and Trading Products (Derivatives and Securities) would be a plus.
Posted 1 day ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
Role Overview: You will be responsible for installing, configuring, and managing the Kafka cluster, Elasticsearch cluster, and MongoDB database. Your primary duties will involve monitoring and troubleshooting issues in these clusters, optimizing them for performance and scalability, implementing data backup and recovery strategies for MongoDB, and patching servers to scale out the Kafka and Elasticsearch clusters. Your strong problem-solving and troubleshooting skills will be crucial for diagnosing and resolving issues efficiently. Additionally, you will need a working knowledge of Unix/Linux operating systems and a combination of technical expertise, analytical abilities, and communication skills for effective management.

Key Responsibilities:
- Install, configure, and manage the Kafka cluster, Elasticsearch cluster, and MongoDB database
- Monitor and troubleshoot issues in Kafka, Elasticsearch, and MongoDB
- Optimize Kafka, Elasticsearch, and MongoDB for performance and scalability
- Implement data backup and recovery strategies for MongoDB
- Patch servers to scale out the Kafka and Elasticsearch clusters and run predefined scripts

Qualifications Required:
- Bachelor's or Master's degree with 4-7 years of professional experience
- Understanding of data governance, compliance, and security best practices
- Proficiency in data querying and manipulation
- Strong problem-solving and troubleshooting skills
- Excellent communication and collaboration skills
- Ability to communicate effectively internally and with customers
- Experience in monitoring and optimizing database performance, analyzing execution plans, and adjusting database configuration parameters
- Experience implementing and maintaining database security measures, including user access controls, role-based privileges, and encryption
- Ability to diagnose and resolve database-related issues such as performance bottlenecks, connectivity problems, and data corruption
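A first-pass liveness check for the cluster nodes mentioned above can be as simple as probing each service port. The hosts and ports below are placeholders; real monitoring would query the services themselves (Kafka's admin API, Elasticsearch's `_cluster/health`, MongoDB's `serverStatus`):

```python
# Minimal liveness probe: attempt a TCP connection to each node's port.
# Hosts/ports are placeholders; production monitoring should query the
# services directly rather than just checking port reachability.
import socket

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

nodes = {
    "kafka-1": ("127.0.0.1", 9092),        # default Kafka broker port
    "elasticsearch-1": ("127.0.0.1", 9200),  # default ES HTTP port
    "mongodb-1": ("127.0.0.1", 27017),       # default MongoDB port
}

for name, (host, port) in nodes.items():
    print(f"{name}: {'UP' if port_open(host, port) else 'DOWN'}")
```

A script like this is only a starting point for the troubleshooting duties described: it tells you a node is unreachable, not why, which is where execution plans, broker logs, and cluster health APIs come in.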
Posted 1 day ago
5.0 - 10.0 years
0 Lacs
karnataka
On-site
As a Master Data Management (MDM) Architect at Fortrea, your role will involve shaping the future of the global data ecosystem. You will be a strategic data leader with a passion for data governance, canonical modeling, and MDM platform architecture. This high-impact role is ideal for someone with deep data strategy experience and a passion for building future-proof platforms.

**Key Responsibilities:**
- Support the Product Owner, Data Quality, Governance, and MDM in defining the long-term technology vision for MDM, and interact with appropriate vendors and third parties
- Act as the ARB representative for the MDM Platform, serving as the initial point of contact for inquiries related to the impact of specific projects on the state of the MDM platform
- Keep up to date with all key decisions related to canonical data modeling for master data
- Support the Head of Data Platform Architecture in properly modeling Enterprise Data assets and defining, documenting, and maintaining all architectural standards, policies, and best practices for the MDM platform
- Lead the design of all matching rules for MDM
- Act as the MDM consultant for cross-program and cross-project technology strategies
- Define the enterprise MDM Platform technology vision tailored to Fortrea business needs, documenting that vision in multiple ways for different stakeholders and audiences
- Stay informed about the latest trends in the field of MDM through formal and informal learning, industry group participation, and conference attendance
- Collaborate with other architects in the data space to ensure a coherent approach to data architecture for Fortrea
- Advocate for the agreed-upon data technology choices of the enterprise
- Build relationships and work collaboratively with individuals within the enterprise, external vendors, and customers
- Informally lead, manage, and motivate team members to support objectives and mission, and achieve set goals
- Maintain an up-to-date repository of all technical debt in your area
- Tackle problems creatively using colleague networks and publicly available solutions, work independently in a matrix environment, and manage multiple tasks simultaneously
- Work in a fast-paced, team-oriented, and self-directed entrepreneurial environment
- Think strategically, communicate a vision, and drive results
- Be detail-oriented and highly organized, with strong business and technical acumen
- Possess excellent verbal and written communication skills, an innovative approach to problem-solving, and a strong work ethic
- Coordinate with other enterprise governance teams to ensure compliance with standards and regulations
- Perform any other duties as needed or assigned

**Qualifications:**
- Bachelor's degree in Computer Science, Information Technology, or a related field

**Experience:**
- At least 5 years of experience as a Data Platform leader
- At least 10 years of IT leadership in the data strategy space
- Proficiency with Profisee, Informatica MDM, or Reltio
- Experience with cloud platforms like Azure (preferred), AWS, or Google Cloud

**Additional Company Details:**
Fortrea is a remote work environment. If you believe you have the required qualifications and experience to excel in this role, we encourage you to apply and be a part of shaping the future of our global data ecosystem at Fortrea.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Backend Developer with 3 to 5 years of experience in full backend development, your role will involve the following key responsibilities:
- Designing and developing scalable, reusable APIs that meet evolving business needs, emphasizing security, performance, and maintainability. This includes a deep understanding of RESTful API standards and microservices architecture.
- Proficiency in backend programming languages such as Python and JavaScript is required, along with experience in API testing tools like Postman and Apigee, and API management platforms like APIM Snaplogic/MuleSoft.
- Strong knowledge of cloud services such as AWS, Azure, and Google Cloud, and associated deployment strategies, is essential.
- Familiarity with API security measures, including authentication (OAuth, JWT) and authorization protocols, is necessary.
- Managing code repositories and deployment pipelines using Git or GitLab with tools like IntelliJ.
- Designing and implementing backend solutions by leveraging Snaplogic pipelines, Neo4j for knowledge graph integration, and Kafka for real-time data streaming.
- Experience with API lifecycle management and familiarity with MCP layer architecture for AI enablement is highly desirable.
- Overseeing the deployment process and ensuring smooth integration with existing systems, working closely with system owners to align API implementations with core system capabilities and constraints.
- Collaborating with business analysts to translate requirements into actionable backend solutions.
- Assisting in the creation and maintenance of comprehensive documentation, including API specifications, user guides, and system integration details.
- Advocating for and implementing best practices in code reusability and system integration.
- Facilitating communication between the API development team and system owners to resolve technical issues and align development objectives.

Required Qualifications:
- Proficiency in programming languages such as Python, Java, or .NET.
- Experience with API security measures, authentication, and authorization protocols.
- Strong knowledge of RESTful API standards and best practices.
- Familiarity with cloud services (AWS, Azure, Google Cloud) and API management and testing tools (e.g., Snaplogic, MuleSoft, Azure, Postman, Apigee).
- Experience with modular architecture or composable systems.
- Understanding of data governance and API cataloging.

Preferred Qualifications:
- Experience working in AI/ML-enabled environments.
- Exposure to enterprise integration strategies or customer-centric platforms.
- Familiarity with agile development methodologies.
- Knowledge of change and release management.
- Ability to understand multi-tier architecture.
- Collaborating with cross-functional team members to refine and optimize application functionality and design.
- Participating actively in team planning, implementation, and review sessions to align project goals and development priorities.
- Assisting in the creation and maintenance of comprehensive documentation, including API specifications, code references, and user guides to support the organization's modular architecture and integration strategies.
- Strong problem-solving skills, attention to detail, and ability to manage multiple project tasks effectively.
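The JWT authentication named in the requirements above can be illustrated with a minimal HS256 sign/verify using only the standard library. This is an educational sketch of the token mechanics, not a substitute for a vetted library (e.g. PyJWT) with full claim validation:

```python
# Minimal HS256 JWT sign/verify using only the standard library.
# Educational sketch of the token format; real services should use a
# vetted library and validate claims (exp, aud, iss) as well.
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> bytes:
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def sign(payload: dict, secret: bytes) -> str:
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = header + b"." + body
    sig = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return (signing_input + b"." + sig).decode()

def verify(token: str, secret: bytes) -> bool:
    head, body, sig = token.encode().split(b".")
    expected = _b64url(hmac.new(secret, head + b"." + body,
                                hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

token = sign({"sub": "user-42", "scope": "read"}, b"demo-secret")
print(verify(token, b"demo-secret"))   # True
print(verify(token, b"wrong-secret"))  # False
```

The constant-time comparison (`hmac.compare_digest`) matters: comparing signatures with `==` leaks timing information an attacker can exploit.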
Posted 1 day ago
15.0 - 19.0 years
0 Lacs
karnataka
On-site
As a Digital Transformation Consultant at HARMAN Automotive, your role is crucial in defining and executing the digital transformation strategy across PLM, Quality, CAD, and Manufacturing solutions. Your responsibilities include:
- Developing and implementing a comprehensive digital transformation strategy aligned with the company's objectives.
- Identifying and evaluating emerging technologies like IoT, AI, machine learning, cloud computing, and digital twins for operational impact assessment.
- Providing thought leadership and guidance on digital transformation best practices.
- Creating roadmaps and business cases for senior management review.
- Analyzing and optimizing existing workflows for improvements through digital technology application.
- Leading the selection, implementation, and integration of new digital tools and platforms.
- Ensuring seamless data flow and integration across systems and departments.
- Establishing and maintaining data governance and security standards.
- Driving change management initiatives for smooth adoption of new technologies.
- Collaborating with cross-functional teams to achieve digital transformation goals.
- Providing training and support to employees on new digital tools.
- Establishing KPIs to measure the success of digital transformation initiatives.
- Monitoring progress, addressing challenges, and providing insights for continuous improvement.
- Managing budget and resources effectively.

To be successful in this role, you should have:
- 15+ years of experience in PLM, Quality, CAD, and/or Manufacturing, with a track record of leading digital transformation.
- Deep understanding of PLM, Quality, CAD, and manufacturing processes and technologies.
- Strong knowledge of emerging digital technologies in manufacturing.
- Strong analytical, problem-solving, communication, and interpersonal skills.
- Ability to lead and motivate cross-functional teams.
- Experience with Industry 4.0 concepts.
- Ability to work independently and collaboratively.

Bonus points if you have:
- Bachelor's or Master's degree in Engineering, Computer Science, or a related field.
- Experience with statistical analysis tools, programming languages, AI/ML implementations, digital twin development, and relevant certifications.

In addition to the above, you should be willing to travel for client engagements and work in the Bangalore office. HARMAN Automotive offers a range of benefits including a flexible work environment, employee discounts, extensive training opportunities, wellness benefits, tuition reimbursement, recognition programs, and an inclusive work environment for professional and personal development.
Posted 1 day ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
Role Overview: You will be part of the enterprise data office and product solution team, focusing on ensuring accurate, timely, and fit-for-purpose data for business, risk management, and regulatory reporting requirements. Your average day will involve collaborating with various teams to understand Markets products processing in the regulatory reporting data flow and documenting product data flows within enterprise systems.

Key Responsibilities:
- Understand derivatives data flows within Citi for Equities, FX, IRS, Fixed Income, Commodities, etc.
- Conduct data analysis for derivatives products across systems to ensure target-state adoption and resolve data gaps/issues.
- Lead the assessment of end-to-end data flows for all data elements used in regulatory reports.
- Document current-state and target-state data mapping and produce a gap assessment.
- Collaborate with the business to identify critical data elements, define standards and quality expectations, and prioritize remediation of data issues.
- Identify strategic sources for critical data elements.
- Design and implement data governance controls, including data quality rules and data reconciliation.
- Develop systematic solutions for the elimination of manual processes/adjustments and remediation of tactical solutions.
- Prepare detailed requirement specifications containing calculations, data transformations, and aggregation logic.
- Perform functional testing and data validations.

Qualifications Required:
- 6+ years of combined experience in the banking and financial services industry, information technology, and/or data controls and governance.
- Preferably an Engineering graduate with a postgraduate degree in Finance.
- Extensive experience in capital markets business and processes.
- Deep understanding of derivative products such as Equities, FX, IRS, Commodities, etc.
- Strong data analysis skills using Excel, SQL, Python, PySpark, etc.
- Experience with data management processes, tools, and applications, including process mapping and lineage toolsets.
- Ability to identify and solve problems throughout the product development process.
- Strong knowledge of structured/unstructured databases, data modeling, data management, rapid/iterative development methodologies, and data governance tools.
- Strong understanding of data governance issues, policies, regulatory requirements, and industry information affecting the business environment.
- Demonstrated stakeholder management skills.
- Excellent communication skills for gathering requirements, expressing needs, and developing clear documentation.
- Excellent presentation skills, business and technical writing, and verbal communication skills to support decision-making and actions.
- Excellent problem-solving and critical thinking skills.
- Self-motivated and able to determine priorities dynamically.
- Data visualization skills to create visual representations of data models and provide input to UX/UI teams.
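The data reconciliation controls described above can be sketched as a key-level comparison between a source system and the regulatory reporting feed. The trade records and field names below are hypothetical:

```python
# Illustrative source-vs-target reconciliation for a data control:
# flag trades missing from the target feed and value mismatches (breaks).
# Records and field names are hypothetical.

def reconcile(source, target, key="trade_id", field="notional"):
    src = {row[key]: row for row in source}
    tgt = {row[key]: row for row in target}
    missing = sorted(set(src) - set(tgt))
    breaks = sorted(
        k for k in set(src) & set(tgt)
        if src[k][field] != tgt[k][field]
    )
    return {"missing_in_target": missing, "value_breaks": breaks}

source_trades = [
    {"trade_id": "T1", "notional": 1_000_000},
    {"trade_id": "T2", "notional": 250_000},
    {"trade_id": "T3", "notional": 75_000},
]
target_feed = [
    {"trade_id": "T1", "notional": 1_000_000},
    {"trade_id": "T2", "notional": 260_000},  # mismatch vs source
]

print(reconcile(source_trades, target_feed))
# {'missing_in_target': ['T3'], 'value_breaks': ['T2']}
```

At scale the same comparison runs as SQL or PySpark joins on the critical data elements, with tolerance thresholds for numeric fields; the control's output (missing keys and breaks) feeds the remediation workflow.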
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
vellore, tamil nadu
On-site
Role Overview
As a Data Management Specialist in this full-time remote role, you will be responsible for the day-to-day management of data assets. This includes ensuring data governance, maintaining data quality, and overseeing master data management. Your role will also involve analyzing data to support business decisions and developing and implementing data management practices and policies.

Key Responsibilities
- Ensure data governance and maintain data quality
- Oversee master data management
- Analyze data to support business decisions
- Develop and implement data management practices and policies

Qualifications Required
- Experience in Data Governance and Data Quality
- Proficiency in Data Management and Master Data Management
- Strong analytical skills
- Excellent problem-solving skills and attention to detail
- Ability to work independently and remotely
- Experience in the HR industry is a plus
- Bachelor's degree in Information Management, Data Science, or a related field
Posted 1 day ago
8.0 - 12.0 years
0 Lacs
maharashtra
On-site
Role Overview: As the Director - Head of ISPL CIB IT Architecture at BNP Paribas, your primary responsibility is to proactively define, maintain, and evolve the organization's technology architecture strategy. You will lead a team of architects to design scalable, secure, stable, and cost-effective enterprise solutions across the application, data, and infrastructure layers. Your role will require technical expertise, strategic vision, and strong leadership to guide technology decisions that enable innovation at scale and long-term business value.

Key Responsibilities:
- Define the policy and objectives of your scope based on the entity's strategic priorities.
- Evaluate emerging technologies and market trends to deliver competitive advantage.
- Lead and organize your team to ensure work is completed efficiently.
- Present, explain, and adapt objectives set for your scope.
- Coordinate and monitor all activities under your responsibility.
- Promote teamwork, co-construction, and cross-functionality.
- Guarantee that methods are applied within your scope.
- Develop your employees' skills to ensure employability.
- Participate in the management of disciplinary issues.
- Manage available human resources to achieve expected results in terms of quality, costs, and deadlines.
- Assess and manage IT and cyber security risks within your scope of responsibility.
- Apply and promote the Group's IT governance rules, including those related to IT and cyber security risks.
- Communicate any organizational, scope, or operational changes impacting risk profiles to your immediate supervisor.

Qualifications Required:
- Proven track record of designing and delivering large-scale, complex enterprise systems.
- Expertise across architecture domains (applications, data, infrastructure, and security).
- Strong knowledge of cloud architectures.
- Experience with APIs, microservices, etc.
- Understanding of data architecture (data lakes, data governance).

Please note that the above qualifications and responsibilities are indicative and not exhaustive. The successful candidate may be required to perform additional duties as and when necessary as part of their role at BNP Paribas.
Posted 1 day ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a Data Engineer at our company, you will be responsible for designing, developing, and maintaining scalable data pipelines using Snowflake. You will optimize SQL queries and data models for performance and efficiency, ensuring smooth data operations. Implementing data security and governance best practices within Snowflake will be a key aspect of your role to maintain data integrity.

Your key responsibilities will include:
- Designing, developing, and maintaining scalable data pipelines using Snowflake
- Optimizing SQL queries and data models for performance and efficiency
- Implementing data security and governance best practices within Snowflake
- Collaborating with data scientists and analysts to support their data needs
- Troubleshooting and resolving data-related issues in a timely manner
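An idempotent load step of the kind such pipelines rely on can be sketched with SQLite's upsert as a stand-in for Snowflake's `MERGE`. The table and column names are illustrative:

```python
# Idempotent pipeline load step: upsert staged rows into a target table,
# so re-running a batch never duplicates records. SQLite's ON CONFLICT
# clause stands in for Snowflake's MERGE; the schema is illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customers (
        customer_id TEXT PRIMARY KEY,
        email       TEXT,
        updated_at  TEXT
    )
""")

def upsert(rows):
    """Insert new rows; update existing ones keyed on customer_id."""
    conn.executemany("""
        INSERT INTO customers (customer_id, email, updated_at)
        VALUES (:customer_id, :email, :updated_at)
        ON CONFLICT (customer_id) DO UPDATE SET
            email      = excluded.email,
            updated_at = excluded.updated_at
    """, rows)
    conn.commit()

upsert([{"customer_id": "C1", "email": "old@x.com", "updated_at": "2024-01-01"}])
upsert([{"customer_id": "C1", "email": "new@x.com", "updated_at": "2024-02-01"},
        {"customer_id": "C2", "email": "c2@x.com",  "updated_at": "2024-02-01"}])

print(conn.execute(
    "SELECT customer_id, email FROM customers ORDER BY customer_id").fetchall())
# [('C1', 'new@x.com'), ('C2', 'c2@x.com')]
```

The design point carries over directly: a pipeline whose load step is keyed and idempotent can be safely retried after failures, which is a large part of "ensuring smooth data operations".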
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Role Overview: You are accountable for developing, expanding, and optimizing Data Management Architecture, Design & Implementation under Singtel Data Platform & Management. Your responsibilities include designing, developing, and implementing data governance and management solutions, covering data quality, privacy, protection, and the associated control technology solutions, according to industry best practices. You will review, evaluate, and implement Data Management standards, primarily Data Classification and Data Retention, across systems. Additionally, you will design, develop, and implement Automated Data Discovery rules to identify the presence of PII attributes. Your role also involves driving development, optimization, testing, and tooling to enhance overall data control management, including security, data privacy, protection, and data quality. You will review, analyze, benchmark, and approve solution designs from product companies, internal teams, and vendors to ensure alignment with the data landscape, big data architecture guidelines, and roadmap.

Key Responsibilities:
- Design, develop, and implement data governance and management solutions
- Review, evaluate, and implement Data Management standards
- Develop Automated Data Discovery rules to identify PII attributes
- Drive development, optimization, testing, and tooling for data control management
- Review, analyze, benchmark, and approve solution designs

Qualifications Required:
- Diploma in Data Analytics, Data Engineering, IT, Computer Science, Software Engineering, or equivalent
- Exposure to Data Management and Big Data concepts
- Knowledge and experience in Data Management, Data Integration, and Data Quality products
- Technical skills in Informatica CDGC, Collibra, Alation, Informatica Data Quality, Data Privacy Management, and Azure Databricks
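The automated data discovery rules described above can be sketched as pattern checks over sampled column values. The regexes here are deliberately rough and illustrative; production discovery tools (such as the Informatica products named) combine patterns, dictionaries, and confidence scoring:

```python
# Rough automated-discovery sketch: flag columns whose sampled values
# look like PII. Patterns are intentionally simple and illustrative;
# real discovery rules add dictionaries, checksums, and scoring.
import re

PII_PATTERNS = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "phone": re.compile(r"^\+?\d[\d\s-]{7,14}$"),
}

def discover_pii(columns, threshold=0.6):
    """Return {column: pii_type} when enough sampled values match a pattern."""
    flagged = {}
    for name, samples in columns.items():
        values = [v for v in samples if v]
        for pii_type, pattern in PII_PATTERNS.items():
            hits = sum(bool(pattern.match(str(v))) for v in values)
            if values and hits / len(values) >= threshold:
                flagged[name] = pii_type
                break
    return flagged

sample = {
    "contact": ["ana@x.com", "ben@y.org", "not-an-email"],
    "msisdn":  ["+65 8123 4567", "91234567", ""],
    "segment": ["gold", "silver", "gold"],
}
print(discover_pii(sample))
# {'contact': 'email', 'msisdn': 'phone'}
```

The match-rate threshold is the key tuning knob: too low and free-text columns get flagged on incidental matches, too high and sparsely populated PII columns slip through.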
Posted 2 days ago
12.0 - 16.0 years
0 Lacs
karnataka
On-site
Role Overview: As a Data Engineer, Senior Consultant for the Asia Pacific Region based out of Bangalore, you will bring deep expertise in data architecture, big data/data warehousing, and the ability to build large-scale data processing systems using the latest database and data processing technologies. Your role will be crucial in enabling VISA's clients in the region with foundational data capabilities needed to scale their data ecosystem for next-generation portfolio optimization and hyper-personalized marketing, especially within the BFSI space. You will work closely with VISA's market teams in AP, acting as a bridge between end-users and technology colleagues in Bangalore and the US to influence the development of global capabilities while providing local tools and technologies as required. Your consulting, communication, and presentation skills will be essential for collaborating with clients and internal cross-functional team members at various levels. You will also work on strategic client projects using VISA data, requiring proficiency in hands-on detailed design and coding using big data technologies. Key Responsibilities: - Act as a trusted advisor to VISA's clients, offering strategic guidance on designing and implementing scalable data architectures for advanced analytics and marketing use cases. - Collaborate with senior management, business units, and IT teams to gather requirements, align data strategies, and ensure successful adoption of solutions. - Integrate diverse data sources in batch and real-time to create a consolidated view such as a single customer view. - Design, develop, and deploy robust data platforms and pipelines leveraging technologies like Hadoop, Spark, modern ETL frameworks, and APIs. - Ensure data solutions adhere to client-specific governance and regulatory requirements related to data privacy, security, and quality. 
- Design target platform components, data flow architecture, and capacity requirements for scalable data architecture implementation. - Develop and deliver training materials, documentation, and workshops to upskill client teams and promote data best practices. - Review scripts for best practices, educate the user base, and build training assets for beginner and intermediate users. Qualifications: - Bachelor's degree or higher in Computer Science, Engineering, or a related field. - 12+ years of progressive experience in data advisory, data architecture & governance, and data engineering roles. - Good understanding of Banking and Financial Services domains, with familiarity in enterprise analytics data assets. - Experience in client consulting on data architecture and engineering solutions, translating business needs into technical requirements. - Expertise in distributed data architecture, modern BI tools, and frameworks/packages used for Generative AI and machine learning model development. - Strong resource planning, project management, and delivery skills, with a track record of successfully leading or contributing to large-scale data initiatives.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
coimbatore, tamil nadu
On-site
As a NetSuite Analytics Developer & Data Warehousing expert at Mindspark Technologies, your role will involve designing, building, and optimizing NetSuite analytics solutions and enterprise data warehouses. You will be leveraging NetSuite's SuiteAnalytics tools along with external data warehousing platforms such as Oracle Analytics Warehouse, Google Cloud Platform (GCP), and Snowflake to deliver scalable, data-driven insights through advanced visualizations and reporting across the organization. Key Responsibilities: - Design, develop, and maintain SuiteAnalytics reports, saved searches, and dashboards within NetSuite to meet evolving business needs. - Build and optimize data pipelines and ETL processes to integrate NetSuite data into enterprise data warehouses (e.g., Oracle Analytics Warehouse, Snowflake, BigQuery). - Develop data models, schemas, and maintain data marts to support business intelligence and analytical requirements. - Implement advanced visualizations and reports using tools such as Tableau, Power BI, or Looker, ensuring high performance and usability. - Collaborate with business stakeholders to gather requirements and translate them into effective technical solutions. - Monitor, troubleshoot, and optimize data flow and reporting performance. - Ensure data governance, security, and quality standards are upheld across analytics and reporting systems. - Provide documentation, training, and support to end-users on analytics solutions. Qualifications Required: - Bachelor's degree in Computer Science, Information Systems, or related field. - 5+ years of experience working with NetSuite, including SuiteAnalytics (saved searches, datasets, workbooks). - Strong expertise in data warehousing concepts, ETL processes, and data modeling. - Hands-on experience with external data warehouse platforms such as Oracle Analytics Warehouse, GCP (BigQuery), or Snowflake. - Proficient in SQL and performance optimization of complex queries. 
- Experience with BI and visualization tools such as Tableau, Power BI, or Looker. - Understanding of data governance, compliance, and best practices in data security.
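As an illustration of the "performance optimization of complex queries" skill this posting asks for — a minimal sketch using Python's built-in sqlite3 and an invented `transactions` table (not NetSuite's actual schema): adding an index changes the query plan from a full table scan to an index search.

```python
import sqlite3

# Toy example: show how an index changes the query plan.
# Table and column names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (id INTEGER, account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [(i, f"acct-{i % 100}", i * 1.5) for i in range(10_000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM transactions WHERE account = 'acct-7'"
print(plan(query))  # full table scan before indexing
conn.execute("CREATE INDEX idx_account ON transactions(account)")
print(plan(query))  # now an index search
```

The same habit — reading the plan before and after a change — carries over to BigQuery's query execution details or Snowflake's query profile, though each engine exposes it differently.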
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
As the Vice President of Engineering at Teradata, you will lead the India-based software development organization for the AI Platform Group. Your primary responsibility will be to execute the product roadmap for key technologies such as Vector Store, Agent platform, Apps, user experience, and AI/ML-driven use cases at scale. Success in this role will involve building a world-class engineering culture, attracting and retaining top technical talent, accelerating product delivery, and driving innovation to bring measurable value to customers. Key Responsibilities: - Lead a team of 150+ engineers to help customers achieve outcomes with Data and AI - Partner closely with Product Management, Product Operations, Security, Customer Success, and Executive Leadership - Implement and scale Agile and DevSecOps methodologies - Drive the development of agentic AI and AI at scale in a hybrid cloud environment - Modernize legacy architectures into service-based systems using CI/CD and automation Qualifications Required: - 10+ years of senior leadership experience in product development, engineering, or technology within enterprise software product companies - 3+ years in a VP Product or equivalent role managing large-scale technical teams in a growth market - Experience with cloud platforms, Kubernetes, containerization, and microservices-based architectures - Knowledge of data harmonization, data analytics for AI, and modern data stack technologies - Strong background in enterprise security, data governance, and API-first design - Master's degree in Engineering or Computer Science, or an MBA, preferred Teradata is a company that believes in empowering people with better information through their cloud analytics and data platform for AI. They aim to uplift and empower customers to make better decisions by providing harmonized data, trusted AI, and faster innovation.
Trusted by the world's top companies, Teradata helps improve business performance, enrich customer experiences, and integrate data across the enterprise.
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As an IT Business Intermediate Analyst in Wholesale Banking at Citigroup, your role involves acting as a liaison between business users and technologists to facilitate the exchange of information in a clear and concise manner. Your primary objective is to contribute to the continuous exploration of business performance and drive business planning. **Responsibilities:** - Formulate and define systems scope and project objectives through research activities, providing guidance to new or lower-level analysts - Analyze business client needs, document requirements, new technology, and derive test cases - Define and analyze enhancements, assist in redesigning business processes, and process automation - Prepare reports, metrics, and presentations, facilitating the exchange of ideas/information between business units and IT - Identify risks and consider business implications of technology applications, ensuring compliance with laws, rules, and regulations - Test systems to ensure project requirements are met, identifying and resolving system problems - Translate business requirements into technical solutions, applying acquired technical experience and precedent - Assess risks in business decisions, safeguarding Citigroup, its clients, and assets **Qualifications:** - 6+ years of relevant experience - Experience in Wholesale Banking, Data Governance, Data Standards Adoption, Data Adjustments reductions, and Regulatory Reporting requirements - Proficiency in data analysis using intermediate/advanced Microsoft Office Suite skills and SQL - Knowledge of applicable business systems, industry standards, and data querying tools - Demonstrated analytical, organizational, presentation skills - Ability to manage diverse project portfolios independently or in a team - Passion for research, ideation, and exploration with clear and concise communication skills - Bachelor's degree/University degree or equivalent experience This job description offers an overview of the responsibilities 
and qualifications required. Please note that other job-related duties may be assigned as necessary.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer specializing in PySpark and SQL at Barclays, your main role will involve spearheading the evolution of the digital landscape, driving innovation and excellence within the company. You will be responsible for harnessing cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences. Working as part of a team of developers, your primary focus will be delivering a technology stack, using your strong analytical and problem-solving skills to understand business requirements and deliver quality solutions. Key Responsibilities: - Hands-on experience in PySpark with a strong knowledge of DataFrames, RDDs, and SparkSQL. - Proficiency in PySpark performance optimization techniques. - Development, testing, and maintenance of applications on AWS Cloud. - Strong grasp of the AWS Data Analytics Technology Stack, including Glue, S3, Lambda, Lake Formation, and Athena. - Design and implementation of scalable and efficient data transformation/storage solutions using open table formats such as Delta, Iceberg, and Hudi. - Experience in using DBT (Data Build Tool) with Snowflake/Athena/Glue for ELT pipeline development. - Proficiency in writing advanced SQL and PL/SQL programs. - Building reusable components using Snowflake and AWS tools/technology. - Project implementation experience in at least two major projects. - Exposure to data governance or lineage tools such as Immuta and Alation. - Knowledge of orchestration tools such as Apache Airflow or Snowflake Tasks. - Familiarity with the Ab Initio ETL tool is a plus. Qualifications Required: - Ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components. - Understanding of infrastructure setup and the ability to provide solutions individually or with teams. - Good knowledge of Data Marts and Data Warehousing concepts. - Possess good analytical and interpersonal skills.
- Implementation of a cloud-based enterprise data warehouse across multiple data platforms, including Snowflake and NoSQL environments, to build a data movement strategy. In this role, based out of Pune, your main purpose will be to build and maintain systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, to ensure the accuracy, accessibility, and security of all data. As a Data Engineer at Barclays, you will be accountable for: - Building and maintaining data architecture pipelines for durable, complete, and consistent data transfer and processing. - Designing and implementing data warehouses and data lakes that manage appropriate data volumes and velocity while adhering to required security measures. - Developing processing and analysis algorithms suitable for the intended data complexity and volumes. - Collaborating with data scientists to build and deploy machine learning models. As part of your analyst expectations, you will be required to perform activities in a timely manner and to a consistently high standard, driving continuous improvement. You will need in-depth technical knowledge and experience in your area of expertise, leading and supervising a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. Additionally, you will be expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and demonstrate the Barclays Mindset of Empower, Challenge, and Drive.
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Data Analyst in the Solution Design team at Barclays, your role involves supporting the organization in defining and designing technology and business solutions to meet organizational goals. This includes requirements gathering, data analysis, data architecture, system integration, and delivering scalable, high-quality designs aligned with both business and technical needs. Key Responsibilities: - Deliver large-scale change in complex environments, acting as a thought leader in requirements documentation and workshop facilitation to gather, clarify, and communicate business needs effectively. - Utilize strong data analysis and data modeling skills to perform data validations, anomaly detection, and make sense of large volumes of data to support decision-making. - Demonstrate advanced SQL proficiency for querying, joining, and transforming data to extract actionable insights, along with experience in data visualization tools such as Tableau, Qlik, and Business Objects. - Act as an effective communicator, translating complex technical concepts into clear, accessible language for diverse audiences, and liaising between business stakeholders and technical teams to achieve a mutual understanding of data interpretations, requirements definition, and solution designs. - Apply experience in Banking and Financial services, particularly in wholesale credit risk, and implement data governance standards including metadata management, lineage, and stewardship. Qualifications Required: - Experience in Python data analysis and associated visualization tools. - Familiarity with external data vendors for sourcing and integrating company financials and third-party datasets. - Experience with wholesale credit risk internal ratings-based (IRB) models and regulatory frameworks. In this role based in Chennai/Pune, you will be responsible for implementing data quality processes and procedures to ensure reliable and trustworthy data. 
Your tasks will include investigating and analyzing data issues related to quality, lineage, controls, and authoritative source identification; executing data cleansing and transformation tasks; designing and building data pipelines; and applying advanced analytical techniques such as machine learning and AI to solve complex business problems. Additionally, you will document data quality findings and recommendations for improvement. As a Vice President, you are expected to contribute to strategy, drive requirements, and make recommendations for change. You will manage resources, budgets, and policies, deliver continuous improvements, and escalate breaches of policies and procedures. If you have leadership responsibilities, you will demonstrate leadership behaviours focused on creating an environment for colleagues to thrive and deliver to an excellent standard. All colleagues at Barclays are expected to uphold the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as demonstrate the Barclays Mindset of Empower, Challenge, and Drive in their behaviour.
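The "data validations, anomaly detection" skill this posting describes can be sketched in miniature with a z-score outlier check in plain Python. The figures below are invented, and the 2-standard-deviation cut-off is a common rule of thumb rather than any bank's standard:

```python
import statistics

def zscore_outliers(values, threshold=2.0):
    # Flag values more than `threshold` sample standard deviations from the mean.
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > threshold]

# Invented daily figures with one suspicious spike.
daily = [100, 102, 98, 101, 99, 103, 97, 100, 500]
print(zscore_outliers(daily))  # the 500 stands out
```

In practice the same check runs as a SQL window function or a scheduled data-quality rule over governed datasets; a single extreme value also inflates the standard deviation, which is why robust variants (median absolute deviation) are often preferred at scale.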
Posted 2 days ago
11.0 - 18.0 years
40 - 70 Lacs
gurugram, delhi / ncr
Hybrid
Full-time role with a global MNC in e-commerce logistics. As we scale and grow, data will be our backbone to drive efficiency initiatives in our ecosystem. We are looking for a Head of Product for Data who will be responsible for managing three pillars: Data Analytics & BI, Data Governance, and Data AI. A DAY IN THE LIFE As a startup, you can expect your days to be pretty varied. Multitasking is normal, and sometimes your skills or natural talents will be leveraged to support other business priorities. That said, the bulk of your working hours should involve you having to: Work with group stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions and identify business strategies Assess the effectiveness and accuracy of new data sources and data-gathering techniques Develop predictive models using Machine Learning, Natural Language Processing (OCR, information extraction), and LLMs to build Data Products that directly impact key organization metrics Research algorithm improvements for higher performance, accuracy, and optimality. Develop processes and tools to monitor and analyze model performance and data accuracy.
Work on building the right data governance practice as we scale in our journey from a startup to a more enterprise architecture. To ensure seamless onboarding with a dynamic team, you as a leader are expected to: Roll up your sleeves as needed, and never delegate work that you would not be willing to do yourself Do what is needed to get things done, believing speed is more important than anything else to effect change Over-communicate, particularly as the team is quite autonomous Take care of our staff, and treat them as you would want to be treated Be detail-oriented, able to operate in an environment of chaos, identifying the right opportunities Requirements Experience working in Data Products and building and scaling data teams to support organization growth Experience using ML frameworks such as TensorFlow, PyTorch, or scikit-learn Experience with Google Cloud Platform products and services such as Vision API, Recommendations API, and Cloud Natural Language In-depth knowledge of AI techniques, their real-world advantages and drawbacks, and the ability to prescribe and implement feasible and appropriate conventional/AI techniques that serve as solutions to problems Good knowledge of code repositories such as GitLab and Bitbucket, and of CI/CD processes Strong problem-solving skills with an emphasis on business solution product development Excellent written and verbal communication skills for coordinating across teams A great team player who is also able to work independently with minimal supervision
Posted 2 days ago
5.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
Role Overview: As a Solution Architect at Deutsche Bank in Pune, India, in the role of Assistant Vice President, you will be responsible for designing secure, scalable, and cost-effective solutions. Your key responsibilities will include reviewing existing architectures, defining cloud migration strategies, and enabling data design principles across enterprise platforms and systems. You will work on proof of concepts and minimum viable products to evaluate new technologies and solution approaches. Key Responsibilities: - Review, assess, and enhance existing architectures to identify security and performance gaps. - Design and document new solution architectures, ensuring they are secure, scalable, and cost-effective. - Provide end-to-end solution options with clear trade-off analysis for business and IT stakeholders. - Lead cloud migration assessments, define migration roadmaps, and design hybrid/multi-cloud solutions. - Embed security-by-design principles into all solution and data architectures. - Conduct threat modeling and define countermeasures for identified risks. - Define secure patterns for API, data exchange, and application integration. - Work with DevSecOps teams to ensure continuous compliance in CI/CD pipelines. - Define data models, data flows, and integration strategies for enterprise systems. - Ensure data security, governance, lineage, and quality are built into architectures. - Design solutions to handle structured and unstructured data across platforms. - Work with analytics teams to enable secure and scalable data platforms (DWH, Data Lakes, BI tools). - Support implementation of data privacy regulations (GDPR, HIPAA, etc.) in solution designs. - Design cloud-native solutions leveraging AWS, Azure, or GCP services. - Define migration patterns (rehost, refactor, replatform, etc.) for legacy applications and databases. - Ensure secure data migration strategies, including encryption, backup, and failover planning. 
- Act as a trusted advisor to business and IT leaders on secure and data-driven design choices. - Participate in architecture review boards to approve designs and ensure compliance with enterprise standards. - Provide solution recommendations and alternatives to align IT capabilities with business goals. - Mentor junior architects and technical teams on secure solution and data design practices. - Create and streamline the process of application onboarding, ensuring alignment with enterprise standards. Qualifications Required: - Bachelor's or Master's degree in Computer Science, Information Security, Data Engineering, or a related field. - 10+ years in IT with at least 5 years in solution architecture, including significant security and data architecture responsibilities. - Deep knowledge of architecture and design principles, algorithms, and data structures for both on-prem and cloud-native solutions. - Strong background in cloud platforms (AWS, Azure, GCP) and cloud migration strategies. - Expertise in IAM, PKI, encryption, network security, API security, and DevSecOps. - Hands-on experience in data modeling, ETL, data lakes/warehouses, and BI platforms. - Familiarity with data governance frameworks, metadata management, and master data management (MDM). - Knowledge of compliance frameworks (GDPR, HIPAA, PCI-DSS, ISO 27001). - Preferred certifications: TOGAF, AWS/GCP/Azure Solution Architect. - Soft skills including strong communication, the ability to influence stakeholders, and a proven track record of simplifying complex designs. Additional Details: Deutsche Bank promotes a positive, fair, and inclusive work environment where continuous learning, coaching, and support from experts are provided to aid career progression. The culture fosters empowerment, responsibility, commercial thinking, initiative-taking, and collaborative work. Visit the company website for further information: [Deutsche Bank Website](https://www.db.com/company/company.htm).
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
Role Overview: As the Data Protection & Regulatory Security Compliance Advisor at Schneider Electric, you will be a key expert resource responsible for supporting regional regulatory compliance, guiding data strategy, and safeguarding data security across the organization. Your role will involve translating business needs into actionable data privacy controls and guidance to ensure that all data handling processes are robust, secure, and fully compliant with relevant regulations. You will collaborate closely with legal and technical teams, acting as a strategic consultant to interpret and implement regulatory standards, fostering a unified approach to data protection. Additionally, you will play a vital role in developing comprehensive training programs to educate staff on data privacy and security best practices, building awareness and a strong culture of security throughout the organization. Key Responsibilities: - Serve as the subject matter expert on implementing security controls to safeguard data from unauthorized access, breaches, and other risks. Collaborate closely with IT and cybersecurity teams to ensure alignment with organizational security policies. - Advise the organization on adhering to regional and international data protection regulations. Work with legal and technical teams to interpret laws and standards for accurate and effective implementation. - Translate business needs into actionable data protection directives and controls. Contribute to the development and execution of a comprehensive, resilient data security strategy. - Design and deliver training programs to educate staff on data protection best practices, fostering a culture of compliance and accountability. - Assist with the deployment and monitoring of data protection controls, conduct audits to identify vulnerabilities, and mitigate risks. - Develop and maintain clear, comprehensive documentation of data protection policies and procedures for transparency and traceability. 
- Define and update an incident response plan for swift and effective handling of data security incidents. - Conduct regular risk assessments, identify vulnerabilities, and recommend strategies to mitigate data security risks. - Collaborate with internal and external stakeholders to align data governance requirements and shape a unified strategy for global compliance. Qualifications: - Bachelor's or Master's degree in Information Security, Computer Science, Law, Business, or a related field. - 8+ years of experience in data protection, privacy, or information security roles, preferably within a regulated environment. - Professional certifications such as CIPP/E, CIPM, CIPT, CISSP, or equivalent are highly desirable. - Demonstrated understanding of major data privacy regulations and industry standards. - Proven track record in developing, implementing, or managing security policies, controls, and incident response plans. - Strong communication skills with the ability to convey technical or legal concepts to various audiences. Additional Details: Schneider Electric is committed to inclusivity and caring, providing equitable opportunities to all employees. The company upholds the highest standards of ethics and compliance, believing in the value of trust as a foundational principle. Inclusion is a core value, and diversity is celebrated as a strength that makes the company stronger. Schneider Electric's Trust Charter demonstrates its commitment to ethics, safety, sustainability, quality, and cybersecurity in all aspects of its business operations.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As an AI Architect at NiCE, you will play a crucial role in building scalable AI systems and infrastructure to handle large datasets, ensuring performance, reliability, and maintainability. You will lead the development of secure and compliant data and machine learning pipelines, aligning with data governance and regulatory standards. Moreover, you will design, develop, and implement AI models and algorithms to solve real-world business problems while mentoring team members on AI technologies, best practices, and system architecture. Key Responsibilities: - Build scalable AI systems and infrastructure capable of handling large datasets with a focus on performance, reliability, and maintainability. - Lead the development of secure, compliant data and machine learning pipelines, ensuring alignment with data governance and regulatory standards. - Design, develop, and implement AI models and algorithms to solve real-world business problems, contributing to proof-of-concept and production-grade solutions. - Mentor and guide team members on AI technologies, best practices, and system architecture. - Collaborate with cross-functional stakeholders to identify opportunities for AI-driven innovation and translate business requirements into technical solutions. - Establish and promote ethical and responsible AI practices across the organization. - Take ownership of strategic decisions related to AI deployment, architecture, and lifecycle management. - Conduct research and implement appropriate machine learning algorithms, including Retrieval-Augmented Generation (RAG) techniques and integration with Vector Databases. - Develop and maintain AI applications using modern frameworks and run experiments to evaluate and improve model performance. - Define and implement AI project Software Development Lifecycle (SDLC) processes, including versioning, testing, validation, and monitoring strategies. 
- Ensure AI systems are secure, scalable, and aligned with the company's business strategy and compliance requirements. - Stay current with advancements in AI, machine learning, and data engineering to enhance system capabilities continuously. Qualifications Required: - Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field. - Proven experience as an AI Architect or in a similar senior AI/ML role, with a track record of deploying multiple AI solutions in production. - Strong expertise in AI/ML technologies, including experience with RAG architectures and Vector Databases. - Proficiency in cloud platforms such as Azure or AWS, with hands-on experience in deploying enterprise-grade AI solutions. - Hands-on experience in securely integrating AI solutions across cross-functional teams, with expertise in Azure cloud security best practices and implementation. - Solid understanding of building and managing secure, scalable data and ML pipelines, with knowledge of data security governance and compliance. - Proficiency in programming languages such as Python, .NET, or similar, and familiarity with AI/ML frameworks and libraries. - Experience with AI project SDLC, including model versioning, CI/CD for ML, and AI testing strategies. - Familiarity with DevOps and Data Engineering tools and practices. - Strong analytical and problem-solving skills, with the ability to work collaboratively in a fast-paced environment. - Excellent communication skills to convey complex technical concepts to technical and non-technical stakeholders. - Commitment to continuous learning and staying current with the latest AI trends and technologies. Join NiCE, a global company known for market disruption and innovation in AI, cloud, and digital domains. Embrace the NiCE-FLEX hybrid model for maximum flexibility and endless internal career opportunities across various roles and locations.
If you are passionate, innovative, and ready to raise the bar, NiCE is where you belong!
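The RAG responsibilities this posting names reduce, at their core, to ranking stored embeddings by similarity to a query embedding and feeding the top matches into a prompt. A minimal pure-Python sketch — the documents and 3-dimensional vectors below are invented placeholders; a real system would use a trained embedding model and a vector database:

```python
import math

def cosine(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|).
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical mini "vector store": (document, embedding) pairs.
store = [
    ("refund policy", [0.9, 0.1, 0.0]),
    ("shipping times", [0.1, 0.8, 0.3]),
    ("password reset", [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=2):
    # Rank documents by similarity to the query; the top-k would feed the LLM prompt.
    ranked = sorted(store, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

print(retrieve([0.85, 0.15, 0.05]))  # nearest document: "refund policy"
```

Production vector databases replace the linear scan with approximate nearest-neighbor indexes (HNSW, IVF), but the ranking idea is the same.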
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
Role Overview: ZS is seeking a motivated and detail-oriented Information Protection Analyst to join the Data Governance team. In this role, you will be responsible for supporting the implementation and ongoing management of the data governance framework. Your primary focus will be on operationalizing governance policies, maintaining data quality, supporting data classification and discovery efforts, and ensuring that data is well-managed, secure, and compliant. As a Data Governance Analyst, you will play a crucial role in data stewardship, monitoring governance processes, and assisting with tool administration to promote trusted and governed data across the organization. Key Responsibilities: - Support the implementation and enforcement of data governance policies, standards, and procedures. - Assist in data discovery, classification, and cataloging efforts using data governance and DSPM tools. - Monitor and report on data quality metrics, compliance adherence, and data stewardship activities. - Maintain and update data inventories, metadata repositories, and lineage documentation. - Collaborate with data owners, stewards, and IT teams to address data governance issues and remediation. - Participate in data risk assessments, control testing, and regulatory compliance audits. - Manage governance tool configurations, user access, and workflow automation. - Provide training and support to data steward teams and end-users on governance principles and tool usage. - Prepare regular data governance status reports and dashboards for leadership. Qualifications Required: - Bachelor's degree in Data Management, Computer Science, Information Systems, Business, or related field. - 2+ years of experience in data governance, data quality, data management, or related operational roles. - Familiarity with data governance frameworks, data cataloging, or metadata management concepts. 
- Hands-on experience with data governance, catalog, or data quality tools (e.g., Collibra, Informatica, Alation, Talend, BigID). - Understanding of data privacy regulations (such as GDPR, CCPA, HIPAA) and data security basics. - Strong analytical skills with attention to detail and problem-solving capabilities. - Excellent communication skills with the ability to work collaboratively across teams. - Familiarity with cloud platforms (AWS, Azure, GCP) and data architecture is a plus. Additional Company Details: ZS offers a comprehensive total rewards package that includes health and well-being, financial planning, annual leave, personal growth, and professional development. The company's skills development programs, career progression options, internal mobility paths, and collaborative culture empower individuals to thrive within the organization and as part of a global team. ZS promotes a flexible and connected way of working, allowing employees to combine work from home and on-site presence at clients/ZS offices for the majority of the week. The company values face-to-face connections for fostering innovation and maintaining its culture. If you are eager to grow, contribute, and bring your unique self to the work environment at ZS, they encourage you to apply. ZS is an equal opportunity employer committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.
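To make "monitor and report on data quality metrics" concrete — a minimal sketch of one common check, column completeness, over an invented record set. In practice this logic lives inside governance tools such as Collibra or Informatica rather than hand-rolled scripts, and the 90% threshold here is purely illustrative:

```python
# Completeness: fraction of records with a non-null value for a given field.
records = [
    {"id": 1, "email": "a@example.com", "country": "DE"},
    {"id": 2, "email": None,            "country": "FR"},
    {"id": 3, "email": "c@example.com", "country": None},
    {"id": 4, "email": "d@example.com", "country": "US"},
]

def completeness(rows, field):
    present = sum(1 for r in rows if r.get(field) is not None)
    return present / len(rows)

# Flag any field that falls below an agreed threshold from the DQ standard.
THRESHOLD = 0.9
for field in ("email", "country"):
    score = completeness(records, field)
    status = "OK" if score >= THRESHOLD else "BREACH"
    print(f"{field}: {score:.0%} {status}")
```

Other standard dimensions — validity, uniqueness, timeliness — follow the same pattern: a per-field score compared against a threshold agreed with data stewards.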
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
Role Overview: As a Collibra Specialist at NTT DATA, your main responsibility will be to produce data mapping documents, import various glossaries and CDEs into Collibra, establish lineage from Glossary to CDM to LDM, and configure lineage visualizations, glossary workflows, and governance processes in Collibra.
Key Responsibilities:
- Produce data mapping documents covering the Glossary, CDM, and LDM.
- Import the Business Glossary, sub-domain glossaries, and CDEs into Collibra.
- Import mapping documents into Collibra and establish lineage from Glossary to CDM to LDM.
- Configure lineage visualizations, glossary workflows, and governance processes in Collibra.
Qualifications Required:
- Minimum 5-7 years of experience in data governance/metadata management.
- At least 3 years of hands-on experience with Collibra implementation (glossary, lineage, workflows, metadata ingestion).
- Proficiency in metadata ingestion and mapping automation.
- Ability to script/transform mapping templates into Collibra-ingestable formats.
- Knowledge of ERwin/Foundry integration with Collibra.
- Strong analytical and problem-solving skills to support lineage accuracy.
Please note that you will be required to be available until 1:30 AM IST to accommodate shift timings. NTT DATA is a $30 billion trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. As a Collibra Specialist, you will be part of a diverse team of experts across more than 50 countries, working to help clients innovate, optimize, and transform for long-term success. NTT DATA is committed to investing in research and development to support organizations and society in confidently moving into the digital future. Visit us at us.nttdata.com.
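The "script/transform mapping templates into Collibra-ingestable formats" requirement above can be sketched as a small reshaping step. This is a minimal, stdlib-only illustration: the input columns (`term`, `definition`, `cdm_entity`), the community/domain names, and the output headers are all invented for the example, since real Collibra import templates are defined by each implementation's operating model.

```python
import csv
import io

# Hypothetical mapping template as it might be exported from a spreadsheet;
# column names are illustrative, not a Collibra standard.
source = io.StringIO(
    "term,definition,cdm_entity\n"
    "Policy Number,Unique policy identifier,Policy\n"
    "Claim Amount,Settled claim value,Claim\n"
)

def to_import_rows(mapping_csv, community, domain):
    """Reshape a flat mapping template into rows for a bulk-import file."""
    rows = []
    for rec in csv.DictReader(mapping_csv):
        rows.append({
            "Community": community,          # target Collibra community (assumed)
            "Domain": domain,                # target domain, e.g. a glossary (assumed)
            "Name": rec["term"],
            "Definition": rec["definition"],
            "Maps To": rec["cdm_entity"],    # lineage hook toward the CDM (assumed)
        })
    return rows

rows = to_import_rows(source, "Insurance", "Business Glossary")
print(rows[0]["Name"])  # Policy Number
```

In practice the output would be written back out as CSV or Excel and loaded through Collibra's import tooling; the value of scripting the step is that the same transform can be re-run whenever the mapping documents change.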
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
Role Overview: As a Data Engineer at Infineon, you will be a technical expert responsible for designing and developing AI data pipelines to manage large unstructured and structured datasets, with a focus on GenAI RAG/Agent solutions. You will work closely with data scientists and domain experts to ensure efficient data processing and develop solutions using agile development processes. Additionally, you will support the operation of data pipelines by troubleshooting, bug fixing, and implementing change requests.
Key Responsibilities:
- Work closely with data scientists and domain experts to design and develop AI data pipelines using an agile development process.
- Develop pipelines for ingesting and processing large unstructured and structured datasets from various sources, ensuring efficient data processing.
- Develop BIA solutions using a defined framework for data modeling, data profiling, and data extraction, transformation & loading (ETL).
- Design and review functional and technical design specifications in sync with the overall solution platforms and architecture.
- Provide data and information in the form of reports, dashboards, scorecards, and data storytelling using visualization tools such as Business Objects and Tableau.
- Troubleshoot, fix bugs, and implement change requests to ensure the continuous operation of data pipelines.
Qualifications Required:
- Master's or Bachelor's degree in Computer Science, Mathematics, Statistics, or equivalent.
- Minimum of 3 years of relevant work experience in data engineering, with technical knowledge of databases, BI tools, SQL, OLAP, and ETL.
- Proficiency in RDBMS, specifically Oracle PL/SQL.
- Extensive hands-on experience in designing and implementing data pipelines, with proficiency in handling unstructured data formats, databases, and big data platforms.
- Experience with front-end reporting, dashboard, and data exploration tools such as Tableau.
- Strong background in software engineering and development cycles, with proficiency in scripting languages, particularly Python.
- Understanding of and experience with the Kubernetes/OpenShift platform.
- Knowledge of data management, data governance, and data security practices.
- Highly motivated, structured, and methodical, with a high degree of self-initiative.
- Team player with good cross-cultural skills to work in an international team.
- Customer- and result-oriented.
Additional Details of the Company: Infineon is a global leader in semiconductor solutions for power systems and IoT, driving decarbonization and digitalization. The company enables innovative solutions for green and efficient energy, clean and safe mobility, and smart and secure IoT. Infineon values diversity and inclusion, offering a working environment characterized by trust, openness, respect, and tolerance. The company is committed to providing equal opportunities to all applicants and employees based on their experience and skills. Join Infineon in creating a better, safer, and greener future!
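The data-profiling duty named in this posting can be expressed in plain SQL, whatever the target RDBMS. Below is a minimal, stdlib-only sketch using SQLite as a stand-in; the table name, columns, and sample values are invented for illustration, and on an Oracle or Spark platform the same queries would run against the real staged tables.

```python
import sqlite3

# Hypothetical staged table standing in for a real dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (policy_id TEXT, region TEXT, premium REAL)")
conn.executemany(
    "INSERT INTO policies VALUES (?, ?, ?)",
    [("P1", "south", 1200.0), ("P2", "south", None),
     ("P3", None, 900.0), ("P3", "north", 900.0)],
)

def profile(conn, table, columns):
    """Basic per-column quality metrics: row count, null count, distinct count."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    report = {}
    for col in columns:
        nulls = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL").fetchone()[0]
        distinct = conn.execute(
            f"SELECT COUNT(DISTINCT {col}) FROM {table}").fetchone()[0]
        report[col] = {"rows": total, "nulls": nulls, "distinct": distinct}
    return report

report = profile(conn, "policies", ["policy_id", "region", "premium"])
print(report["region"])  # {'rows': 4, 'nulls': 1, 'distinct': 2}
```

A profile like this is typically the first validation gate in a pipeline: unexpected null or distinct counts (here, the duplicated `P3` policy_id would surface as distinct < rows) flag upstream data issues before transformation and loading proceed.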
Posted 2 days ago
The data governance job market in India is thriving with opportunities for skilled professionals in this field. As organizations continue to recognize the importance of managing and protecting their data assets, the demand for data governance experts is on the rise. Job seekers in India can explore a variety of roles in data governance across different industries and sectors.
India's major tech hubs are known for their vibrant job markets and have a high demand for data governance professionals.
The salary range for data governance professionals in India varies based on experience and location. Entry-level positions may start at around INR 5-8 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
A typical career path in data governance may start with roles such as Data Analyst or Data Quality Analyst, progressing to roles like Data Governance Specialist, Data Governance Manager, and eventually to Chief Data Officer or Data Governance Lead.
In addition to expertise in data governance, professionals in this field are often expected to have skills in data management, data analysis, compliance regulations, and communication.
As you prepare for data governance roles in India, remember to showcase your expertise in this field, stay updated with industry trends, and demonstrate your ability to drive successful data governance initiatives. With the right skills and preparation, you can confidently apply for exciting opportunities in the data governance job market in India. Good luck!