
1380 Data Governance Jobs - Page 42

JobPe aggregates listings for easy application access; you apply directly on the original job portal.

12.0 - 15.0 years

10 - 14 Lacs

Kolkata

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: Stibo Product Master Data Management
Good-to-Have Skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that solutions are effectively implemented across multiple teams, while maintaining a focus on quality and efficiency in application delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with strategic goals.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Stibo Product Master Data Management.
- Strong understanding of data governance principles and practices.
- Experience with application lifecycle management tools.
- Ability to analyze and optimize application performance.
- Familiarity with integration techniques for master data management.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Stibo Product Master Data Management.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Chennai

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: SAP Master Data Governance (MDG) Tool
Good-to-Have Skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and with stakeholders.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the application development process effectively.
- Ensure seamless communication within the team and with stakeholders.
- Provide guidance and mentorship to team members.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in the SAP Master Data Governance (MDG) Tool.
- Strong understanding of data governance principles.
- Experience in configuring and customizing the SAP MDG Tool.
- Knowledge of SAP data models and structures.
- Hands-on experience in data quality management.
- Experience in leading application development projects.

Additional Information:
- The candidate should have a minimum of 12 years of experience with the SAP Master Data Governance (MDG) Tool.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Bengaluru

Hybrid


Location: Bengaluru (Hybrid) / Remote
Job Type: Full-time
Experience Required: 5+ years
Notice Period: Immediate to 30 days

Role Overview: As a Collibra Expert, you will be responsible for implementing, maintaining, and optimizing the Collibra Data Governance Platform to ensure data quality, governance, and lineage across the organization. You will partner with cross-functional teams to develop data management strategies and integrate Collibra solutions with Google Cloud Platform (GCP) to create a robust, scalable, and efficient data governance framework for the retail domain.

Key Responsibilities:
- Data Governance Management: Design, implement, and manage the Collibra Data Governance Platform for data cataloging, data quality, and data lineage within the retail domain.
- Collibra Expertise: Utilize Collibra for metadata management, data quality monitoring, policy enforcement, and data stewardship across various business units.
- Data Cataloging: Lead the implementation and continuous improvement of data cataloging processes to enable a centralized, user-friendly view of the organization's data assets.
- Data Quality Management: Collaborate with business and technical teams to ensure that data is high-quality, accessible, and actionable. Define data quality rules and KPIs to monitor data accuracy, completeness, consistency, and timeliness.
- Data Lineage Implementation: Build and maintain comprehensive data lineage models to visualize the flow of data from source to consumption, ensuring compliance with data governance standards.
- GCP Integration: Architect and implement seamless integrations between Collibra and GCP tools such as BigQuery, Dataflow, and Cloud Storage, ensuring data governance policies are enforced in the cloud environment (see the sketch after this listing).
- Collaboration & Stakeholder Management: Collaborate with Data Engineers, Analysts, Business Intelligence teams, and leadership to define and implement data governance best practices and standards.
- Training & Support: Provide ongoing training and support to business users and technical teams on data governance practices, Collibra platform usage, and GCP-based solutions.
- Compliance & Security: Ensure data governance initiatives comply with internal policies, industry standards, and regulations (e.g., GDPR, CCPA).

Key Requirements:
- Proven Expertise in Collibra: Hands-on experience implementing and managing the Collibra Data Governance Platform (cataloging, lineage, data quality).
- Google Cloud Platform (GCP) Proficiency: Strong experience with GCP tools (BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.) and integrating them with Collibra for seamless data governance.
- Data Quality and Lineage Expertise: In-depth knowledge of data quality frameworks, metadata management, and data lineage implementation.
- Retail Industry Experience: Prior experience in data governance within the retail or eCommerce domain is a plus.
- Technical Skills: Strong understanding of cloud data architecture and best practices for managing data at scale in the cloud (preferably in GCP).
- Problem-Solving and Analytical Skills: Ability to analyze complex data governance issues and find practical solutions to ensure high-quality data management across the organization.
- Excellent Communication Skills: Ability to communicate effectively with both technical and non-technical stakeholders to advocate for data governance best practices.
- Certifications: Relevant certifications in Collibra, Google Cloud, or Data Governance are highly desirable.

Education & Experience:
- Bachelor's degree (B.Tech/BE) mandatory; Master's optional.
- 5+ years of experience in Data Governance, with at least 3 years of specialized experience in Collibra and GCP.
- Experience working with data teams in a retail environment is a plus.
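To make the GCP-integration requirement concrete, here is a minimal sketch of harvesting BigQuery table metadata for registration in a governance catalog. It assumes the google-cloud-bigquery package, application-default credentials, and a hypothetical project ID; pushing the entries into Collibra is left as a comment, since the import mechanism varies by deployment.

```python
# Minimal sketch: harvest BigQuery table metadata for catalog registration.
# Assumes google-cloud-bigquery is installed and application-default
# credentials are configured; "retail-analytics-prod" is a hypothetical project.
from google.cloud import bigquery

client = bigquery.Client(project="retail-analytics-prod")

for dataset in client.list_datasets():
    for table_item in client.list_tables(dataset):
        table = client.get_table(table_item.reference)
        # Collect the fields a catalog entry typically needs:
        # fully qualified name, description, and column-level schema.
        entry = {
            "asset": f"{table.project}.{table.dataset_id}.{table.table_id}",
            "description": table.description or "",
            "columns": [(f.name, f.field_type) for f in table.schema],
        }
        print(entry)  # in practice, push to the governance catalog instead
```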

Posted 1 month ago

Apply

6.0 - 10.0 years

27 - 42 Lacs

Chennai

Work from Office


Job Summary: We are seeking a highly skilled Sr. Developer with 6 to 10 years of experience specializing in Reltio MDM. The ideal candidate will work in a hybrid model with day shifts. This role does not require travel. The candidate will contribute to the company's mission by developing and maintaining high-quality MDM solutions that drive business success and societal impact.

Responsibilities:
- Develop and maintain Reltio MDM solutions to ensure data quality and integrity (see the sketch after this listing).
- Collaborate with cross-functional teams to gather and analyze business requirements.
- Design and implement data models and workflows in Reltio MDM.
- Provide technical expertise and support for Reltio MDM configurations and customizations.
- Conduct performance tuning and optimization of Reltio MDM applications.
- Ensure compliance with data governance and security policies.
- Troubleshoot and resolve issues related to Reltio MDM.
- Create and maintain technical documentation for Reltio MDM solutions.
- Participate in code reviews and provide constructive feedback to team members.
- Stay updated with the latest trends and best practices in MDM and data management.
- Contribute to the continuous improvement of development processes and methodologies.
- Mentor junior developers and provide guidance on best practices.
- Collaborate with stakeholders to ensure successful project delivery.

Qualifications:
- Strong expertise in Reltio MDM and data management.
- Solid understanding of data modeling and data integration techniques.
- Proficiency in performance tuning and optimization.
- Experience in troubleshooting and resolving technical issues.
- Excellent communication and collaboration skills.
- Strong attention to detail and a commitment to quality.
- Ability to work independently and as part of a team.
- Proactive approach to learning and staying current with industry trends.
- Bachelor's degree in Computer Science or a related field.
- Experience with Agile development methodologies.
- Ability to mentor and guide junior team members.
- Strong problem-solving skills.
- Commitment to delivering high-quality solutions that meet business needs.

Certifications Required: N
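As a rough illustration of day-to-day MDM work, the hedged sketch below fetches an entity over a REST-style API and applies a simple completeness rule. The base URL, endpoint path, and auth header are hypothetical placeholders, not Reltio's documented contract.

```python
# Hedged sketch: fetch an MDM entity over a REST-style API and check a
# simple data-quality rule. The base URL, endpoint path, and auth header
# are hypothetical placeholders, not a documented vendor contract.
import requests

BASE_URL = "https://example-env.mdm.example.com/api/tenant123"  # hypothetical
HEADERS = {"Authorization": "Bearer <access-token>"}  # token acquisition omitted

def fetch_entity(entity_id: str) -> dict:
    resp = requests.get(f"{BASE_URL}/entities/{entity_id}",
                        headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()

def missing_required(entity: dict, required: list[str]) -> list[str]:
    # Flag required attributes that are absent or empty.
    attrs = entity.get("attributes", {})
    return [name for name in required if not attrs.get(name)]

if __name__ == "__main__":
    entity = fetch_entity("0001")
    print(missing_required(entity, ["Name", "Country", "TaxId"]))
```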

Posted 1 month ago

Apply

1.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office


We are looking for an experienced and highly skilled Data Architect with 5-7 years of expertise in Data Architecture, Data Governance, and Data Modeling to join our growing team. This role involves designing and implementing comprehensive data solutions, ensuring data quality, consistency, and governance across the organization. You will collaborate with various stakeholders to create robust data architectures, enforce data governance policies, and develop high-quality data models that support business intelligence and analytics.

Responsibilities:
- Design and implement enterprise-level data architectures to support business operations, analytics, and reporting needs.
- Ensure scalability, reliability, and security in data architecture solutions, optimizing for performance and efficiency.
- Work with data engineering teams to select the right data platforms, technologies, and methodologies for system design and development.
- Oversee the integration of various data sources into a centralized data repository, ensuring efficient data flow across systems.
- Develop and implement data governance frameworks and policies to ensure high-quality data management across the organization.
- Define and enforce standards for data security, privacy, and compliance (e.g., GDPR, CCPA).
- Ensure data ownership, stewardship, and accountability are clearly defined.
- Monitor data usage and quality, identifying areas for improvement and ensuring data consistency across systems.
- Collaborate with business and IT teams to ensure data is trustworthy, accessible, and compliant with organizational policies.
- Create and maintain logical, physical, and conceptual data models for transactional and analytical databases (see the sketch after this listing).
- Develop complex data models that reflect business processes, ensuring alignment between technical and business needs.
- Use industry-standard methodologies to ensure data models are scalable, flexible, and efficient.
- Collaborate with cross-functional teams to validate data models and ensure alignment with business requirements and data strategy.
- Document and maintain metadata, schema definitions, and data dictionaries to ensure consistency and clarity.
- Design, implement, and optimize data architectures to support data integration, data analytics, and business intelligence.
- Lead efforts to establish data governance processes, policies, and practices across the organization.
- Implement data quality and consistency frameworks to ensure the integrity of data across all platforms and systems.
- Work with stakeholders to gather and analyze business requirements and translate them into effective data architecture and modeling solutions.
- Provide expertise in selecting the appropriate tools and technologies for data storage, retrieval, and integration.
- Collaborate with data engineers, data analysts, and other teams to ensure effective data architecture implementation and governance.
- Document data models, architecture designs, and governance frameworks to maintain clarity and alignment with organizational objectives.
- Keep up to date with emerging data management trends and technologies, recommending innovative solutions to improve data quality and management.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field (Master's degree preferred).
- 5-6 years of proven experience in Data Architecture, Data Governance, and Data Modeling.
- Strong hands-on experience with data modeling tools such as Erwin, IBM InfoSphere, Microsoft Visio, or similar tools.
- In-depth knowledge of data governance frameworks, including best practices and standards (e.g., DAMA-DMBOK).
- Proficiency in data architecture principles, including designing scalable and high-performance data storage systems.
- Experience with SQL and NoSQL databases, data warehousing, ETL processes, and cloud-based platforms (AWS, Azure, or Google Cloud).
- Strong understanding of data privacy, security, and compliance standards (e.g., GDPR, HIPAA, CCPA).
- Familiarity with big data technologies like Hadoop, Spark, or Kafka is a plus.
- Excellent communication skills to present complex data architecture concepts to both technical and non-technical stakeholders.
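To ground the data-modeling expectation, here is a minimal star-schema sketch (one fact table, two dimensions) using Python's built-in sqlite3 driver so it runs anywhere; the table and column names are invented for illustration.

```python
# Minimal star-schema sketch: a sales fact table keyed to two dimensions.
# Uses the standard-library sqlite3 driver; names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,   -- e.g. 20240115
    full_date  TEXT NOT NULL,
    month      INTEGER NOT NULL,
    year       INTEGER NOT NULL
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    sku         TEXT NOT NULL UNIQUE,
    category    TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    quantity    INTEGER NOT NULL,
    revenue     REAL NOT NULL
);
""")

# A typical analytical query joins the fact table to its dimensions:
rows = conn.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
""").fetchall()
print(rows)  # empty until the tables are loaded
```

The same shape carries over to a warehouse: conformed dimension keys on the fact table keep analytical joins cheap and the model easy to extend.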

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Job Opening: Senior Data Engineer (Remote, Contract - 6 Months)
Remote | Contract Duration: 6 Months | Experience: 6-8 Years

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities:
- Build scalable ETL pipelines and implement robust data solutions in Azure (see the sketch after this listing).
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills:
- Experience: 6+ years in Data Engineering
- Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
- Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
- Agile, SDLC, Containerization (Docker), clean coding practices

Good-to-Have Skills:
- Event Hubs, Logic Apps
- Power BI
- Strong logic building and competitive programming background

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
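For flavor, a minimal PySpark sketch of the kind of ADLS-to-Delta pipeline step described above. The storage paths and column names are placeholders, and authentication to ADLS Gen2 is assumed to be configured on the cluster (for example via a Key Vault-backed secret scope).

```python
# Minimal PySpark sketch of one ETL step: read raw CSV from ADLS Gen2,
# apply a basic cleansing transform, and write a Delta table.
# Paths and column names are placeholders; ADLS auth is assumed to be
# configured on the cluster.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

clean = (raw
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("amount", F.col("amount").cast("double"))
         .dropDuplicates(["order_id"])           # basic data-quality step
         .filter(F.col("amount").isNotNull()))

(clean.write
      .format("delta")
      .mode("overwrite")
      .save("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```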

Posted 1 month ago

Apply

2.0 - 5.0 years

8 - 13 Lacs

Gurugram

Work from Office


About This Role

Position Overview: BlackRock is seeking a highly skilled and motivated Associate to support its growing and dynamic data stewardship function. In this role, you will be responsible for ensuring that data products are accurately defined, compliant with data governance policies, and delivered effectively to meet business needs. You will work closely with cross-functional teams, including business stakeholders, technical teams, and external vendors, to manage the full lifecycle of data products from ideation to delivery. The ideal candidate will have at least 4 years of experience in data stewardship, data governance, and data modeling, and will thrive in a fast-paced, results-driven environment.

Key Responsibilities: As an Associate Data Steward, your responsibilities will span several key areas:
- Business & Strategic Acumen: You will collaborate closely with business units to understand evolving data requirements and align data products to meet strategic goals and objectives. You will ensure that data products support various use cases, such as operational efficiencies, risk management, and commercial applications, while defining success criteria for data offerings in collaboration with key stakeholders.
- Data Governance & Quality: A core aspect of this role is managing data quality through the application of robust data governance controls. You will be responsible for monitoring data health, implementing data quality metrics, and ensuring that data products meet established standards for accuracy, completeness, and consistency. Regular assessments of data sources and processes will be part of your ongoing responsibilities to identify deficiencies and opportunities for improvement.
- Data Product Lifecycle Management: You will support the full delivery lifecycle of data products, from ideation to release. This includes working with cross-functional teams, such as product managers, engineers, and business stakeholders, to plan, design, and deliver data products. In addition, you will contribute to the design and creation of conceptual, logical, and physical data models to ensure that data products meet business requirements.
- Requirements Gathering & Documentation: You will be actively involved in gathering, defining, and documenting business requirements for data products. This includes translating business needs into detailed data requirements and user stories for development teams. You will work to break down complex data problems into manageable tasks, ensuring alignment between technical and business requirements.
- Testing & Quality Assurance: During the testing phase of data product development, you will collaborate with engineering and quality assurance teams to validate that data is accurately extracted, transformed, and loaded. Ensuring that data governance controls are applied during testing is also part of your role, and you will help resolve any issues that arise.
- Vendor & Stakeholder Management: You will manage relationships with external data vendors to ensure that data feeds meet business requirements and quality standards. Additionally, you will work with both internal and external stakeholders to ensure that data products align with organizational goals and address customer needs. Regular engagement with stakeholders will be key to soliciting feedback on data products and identifying opportunities for enhancement.
- Data Stewardship Support: In addition to data management, you will provide Level 3 support for complex data-related inquiries and issues. You will proactively identify data challenges and offer data-driven solutions to meet business objectives. You will also participate in data governance initiatives, helping to define and implement best practices for data stewardship across the organization.
- Collaboration & Communication: You will communicate effectively with both technical and non-technical teams, ensuring that complex data concepts are conveyed clearly. Your collaboration with internal and external teams will ensure that data solutions align with business goals and industry best practices. You will be expected to work in an agile environment, managing multiple priorities to ensure efficient and timely data product delivery.

Qualifications & Requirements: The ideal candidate will possess the following qualifications:

Experience:
- At least 4 years of experience in data stewardship, data governance, or a related field.
- Experience in the financial services industry is a plus, but not required.
- A strong background in data modeling (logical, conceptual, physical), data governance, and data quality management is essential.

Technical Skills:
- Proficiency in data management tools and technologies such as SQL, Unix, and Tableau.
- Familiarity with data governance platforms (e.g., Aha!, ServiceNow, Erwin Data Modeling, DataHub) and methodologies for data management and quality assurance is preferred.
- Knowledge of databases (relational, NoSQL, graph) and cloud-based data platforms (e.g., Snowflake) is also beneficial.

Business & Communication Skills:
- Strong business acumen and the ability to align data products with both organizational and client needs.
- Ability to effectively communicate complex technical concepts to both technical and non-technical stakeholders.
- Strong organizational skills and the ability to manage multiple tasks and priorities in an agile environment.

Our Benefits: To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.

Our hybrid work model: BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.

About BlackRock: At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment, the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.

For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: linkedin.com/company/blackrock

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.

Posted 1 month ago

Apply

3.0 - 8.0 years

16 - 18 Lacs

Hyderabad

Work from Office


We are hiring a Data Management Specialist (Level 2) for a US-based IT company in Hyderabad.

Job Title: Data Management Specialist - Level 2
Location: Hyderabad
Experience: 3+ Years
CTC: 16 LPA - 18 LPA
Working shift: Day shift

We are seeking a Level 2 Data Management Specialist to join our data team and support the development, maintenance, and optimization of data pipelines and cloud-based data platforms. The ideal candidate will have hands-on experience with Snowflake, along with a solid foundation in SQL, data integration, and cloud data technologies. As a mid-level contributor, this position will collaborate closely with senior data engineers and business analysts to deliver reliable, high-quality data solutions for reporting, analytics, and operational needs. You will help develop scalable data workflows, resolve data quality issues, and ensure compliance with data governance practices.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines using Snowflake and SQL-based transformation logic (see the sketch after this listing).
- Assist in developing and optimizing data models to support reporting and business intelligence efforts.
- Write efficient SQL queries for data extraction, transformation, and analysis.
- Collaborate with cross-functional teams to gather data requirements and implement dependable data solutions.
- Support data quality checks and validation procedures to ensure data integrity and consistency.
- Contribute to data integration tasks across various sources, including relational databases and cloud storage.
- Document technical workflows, data definitions, and transformation logic for reference and compliance.
- Monitor the performance of data processes and help troubleshoot workflow issues.

Required Skills & Qualifications:
- 2-4 years of experience in data engineering or data management roles.
- Proficiency in Snowflake for data development or analytics.
- Strong SQL skills and a solid grasp of relational database concepts.
- Familiarity with ETL/ELT tools such as Informatica, Talend, or dbt.
- Basic understanding of cloud platforms like AWS, Azure, or GCP.
- Knowledge of data modeling techniques (e.g., star and snowflake schemas).
- Excellent attention to detail, strong analytical thinking, and problem-solving skills.
- Effective team player with the ability to clearly communicate technical concepts.

Preferred Skills:
- Exposure to data governance or data quality frameworks.
- Experience working in the banking or financial services industry.
- Basic scripting skills in Python or Shell.
- Familiarity with Agile/Scrum methodologies.
- Experience using Git or other version control tools.

For further assistance, contact/WhatsApp: 9354909521 / 9354909512, or write to pankhuri@gist.org.in
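As a sketch of the Snowflake-plus-SQL day-to-day, the example below runs a generic upsert (MERGE) from Python using the snowflake-connector-python package. The connection parameters and table names are placeholders; the MERGE is a common ELT pattern, not a specific pipeline from this posting.

```python
# Minimal sketch: run a generic upsert (MERGE) in Snowflake from Python.
# Connection parameters are placeholders; requires snowflake-connector-python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",   # placeholder
    user="example_user",         # placeholder
    password="***",              # prefer key-pair auth or SSO in practice
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)

MERGE_SQL = """
MERGE INTO curated.customers AS tgt
USING staging.customers_delta AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET tgt.email = src.email, tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
  VALUES (src.customer_id, src.email, src.updated_at)
"""

cur = conn.cursor()
try:
    cur.execute(MERGE_SQL)
    print(cur.fetchone())  # MERGE reports rows inserted/updated
finally:
    cur.close()
    conn.close()
```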

Posted 1 month ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

Pune, Bengaluru

Hybrid


Job Role & Responsibilities:
- Understand operational needs by collaborating with specialized teams.
- Support key business operations; this involves architecting data flow and data lineage and building data systems.
- Help identify and deploy enterprise data best practices such as data scoping, metadata standardization, lineage, data deduplication, mapping and transformation, and business validations.
- Own development and management of data glossaries and data-owner matrices to establish enterprise data standards pertaining to the use of critical data.
- Assist with deploying a data issue capture and resolution process.
- Engage with key business stakeholders to assist with establishing fundamental data governance processes.
- Create, prepare, and standardize data quality reports for internal analysis (see the sketch after this listing).
- Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards.

Technical Skills, Experience & Qualifications Required:
- 4-10 years of experience in Data Governance.
- Proficient in data management, with an understanding of data model frameworks and practical knowledge of MDM.
- Hands-on experience with the Collibra tool, including its use as a data catalog.
- Hands-on experience with data governance and data quality aspects; working knowledge of Python.
- Understanding of cloud services (Azure).
- Good communication and interpersonal skills.
- Bachelor's degree in Computer Science or a related field.

Soft Skills and Competencies:
- Good communication skills to coordinate between business stakeholders and engineers.
- Strong results-orientation and time management.
- True team player who is comfortable working in a global team.
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
- Autonomy, curiosity, and innovation capability.
- Comfortable working in a multidisciplinary team within a fast-paced environment.

* Only immediate joiners will be preferred.
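One small illustration of the standardized data quality reporting mentioned above: completeness and uniqueness metrics computed with pandas. The column names, sample records, and choice of metrics are invented for the example.

```python
# Minimal data-quality report: completeness and uniqueness per column.
# Column names and sample records are invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4, None],
    "email": ["a@x.com", None, "b@x.com", "c@x.com", "d@x.com"],
})

report = pd.DataFrame({
    "completeness": df.notna().mean(),     # share of non-null values
    "uniqueness": df.nunique() / len(df),  # distinct values / total rows
})
print(report.round(2))
```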

Posted 1 month ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Bengaluru

Work from Office


Good-to-have skills: Cloud, SQL, data analysis.
Location: Pune - Kharadi - WFO - 3 days/week.

Job Description: We are seeking a highly skilled and experienced Python Lead to join our team. The ideal candidate will have strong expertise in Python coding and development, along with good-to-have skills in cloud technologies, SQL, and data analysis.

Key Responsibilities:
- Lead the development of high-quality, scalable, and robust Python applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Develop RESTful applications using frameworks like Flask, Django, or FastAPI (see the sketch after this listing).
- Utilize Databricks, PySpark SQL, and strong data analysis skills to drive data solutions.
- Implement and manage modern data solutions using Azure Data Factory, Data Lake, and Databricks.

Mandatory Skills:
- Proven experience with cloud platforms (e.g., AWS).
- Strong proficiency in Python, PySpark, and R, and familiarity with additional programming languages such as C++, Rust, or Java.
- Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL.
- Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure).
- Knowledge of data governance and GxP data contexts; familiarity with the Pharma value chain is a plus.

Good-to-Have Skills:
- Experience with modern data solutions via Azure.
- Knowledge of principles summarized in the Microsoft Cloud Adoption Framework.
- Additional expertise in SQL and data analysis.

Educational Qualifications: Bachelor's/Master's degree or equivalent with a focus on software engineering.

If you are a passionate Python developer with a knack for cloud technologies and data analysis, we would love to hear from you. Join us in driving innovation and building cutting-edge solutions!
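Given the FastAPI mention, a minimal RESTful endpoint sketch follows; the route, model, and in-memory store are illustrative only.

```python
# Minimal FastAPI sketch of a RESTful endpoint; route names and the
# response model are illustrative. Run with: uvicorn app:app --reload
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    id: int
    name: str

ITEMS = {1: Item(id=1, name="example")}  # stand-in for a real data store

@app.get("/items/{item_id}", response_model=Item)
def read_item(item_id: int) -> Item:
    if item_id not in ITEMS:
        raise HTTPException(status_code=404, detail="Item not found")
    return ITEMS[item_id]
```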

Posted 1 month ago

Apply

2.0 - 7.0 years

45 - 50 Lacs

Pune

Work from Office


We are looking for a highly driven and technically skilled Software Engineer to lead the integration of various Content Management Systems with AWS Knowledge Hub, enabling advanced Retrieval-Augmented Generation (RAG) search across heterogeneous customer data without requiring data duplication. This role will also be responsible for expanding the scope of Knowledge Hub to support non-traditional knowledge items and enhance customer self-service capabilities. You will work at the intersection of AI, search infrastructure, and developer experience to make enterprise knowledge instantly accessible, actionable, and AI-ready.

How will you make an impact?
- Integrate CMS with AWS Knowledge Hub to allow seamless RAG-based search across diverse data types, eliminating the need to copy data into Knowledge Hub instances (a toy RAG retrieval sketch follows this listing).
- Extend Knowledge Hub capabilities to ingest and index non-knowledge assets, including structured data, documents, tickets, logs, and other enterprise sources.
- Build secure, scalable connectors to read directly from customer-maintained indices and data repositories.
- Enable self-service capabilities for customers to manage content sources using AppFlow and Tray.ai, configure ingestion rules, and set up search parameters independently.
- Collaborate with the NLP/AI team to optimize relevance and performance for RAG search pipelines.
- Work closely with product and UX teams to design intuitive, powerful experiences around self-service data onboarding and search configuration.
- Implement data governance, access control, and observability features to ensure enterprise readiness.

Have you got what it takes?
- Proven experience with search infrastructure, RAG pipelines, and LLM-based applications.
- 2+ years of hands-on experience with AWS Knowledge Hub, AppFlow, Tray.ai, or equivalent cloud-based indexing/search platforms.
- Strong backend development skills (Python, TypeScript/Node.js, .NET/Java) and familiarity with building and consuming REST APIs.
- Infrastructure as Code (IaC) knowledge, e.g., AWS CloudFormation and CDK.
- Deep understanding of data ingestion pipelines, index management, and search query optimization.
- Experience working with unstructured and semi-structured data in real-world enterprise settings.
- Ability to design for scale, security, and multi-tenant environments.
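To ground the RAG terminology, here is a toy retrieval step: rank documents by cosine similarity to the query, then assemble a grounded prompt. The bag-of-words embedding stands in for a real embedding model, and the document list stands in for a managed index such as the one Knowledge Hub would maintain.

```python
# Toy retrieval step behind RAG search: rank documents by cosine similarity
# to the query, then assemble a grounded prompt. The bag-of-words embedding
# is a stand-in for a real embedding model and a managed index.
import numpy as np

def embed(text: str, vocab: list[str]) -> np.ndarray:
    words = [w.strip(".,?").lower() for w in text.split()]
    v = np.array([words.count(w) for w in vocab], dtype=float)
    n = np.linalg.norm(v)
    return v / n if n else v

docs = ["Reset your password from the login page.",
        "Invoices are generated on the first of each month.",
        "Data retention defaults to 90 days."]
vocab = sorted({w.strip(".,?").lower() for d in docs for w in d.split()})
doc_vecs = np.stack([embed(d, vocab) for d in docs])

query = "How do I reset my password?"
scores = doc_vecs @ embed(query, vocab)         # cosine similarity (unit vectors)
top_docs = [docs[i] for i in np.argsort(scores)[::-1][:2]]

prompt = "Answer using only this context:\n" + "\n".join(top_docs) + f"\n\nQ: {query}"
print(prompt)  # in a real pipeline this prompt goes to the LLM
```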

Posted 1 month ago

Apply

8.0 - 9.0 years

10 - 11 Lacs

Pune

Work from Office


We are looking for a highly driven and technically skilled Software Engineer to lead the integration of various Content Management Systems with AWS Knowledge Hub, enabling advanced Retrieval-Augmented Generation (RAG) search across heterogeneous customer data without requiring data duplication. This role will also be responsible for expanding the scope of Knowledge Hub to support non-traditional knowledge items and enhance customer self-service capabilities. You will work at the intersection of AI, search infrastructure, and developer experience to make enterprise knowledge instantly accessible, actionable, and AI-ready.

How will you make an impact?
- Integrate CMS with AWS Knowledge Hub to allow seamless RAG-based search across diverse data types, eliminating the need to copy data into Knowledge Hub instances.
- Extend Knowledge Hub capabilities to ingest and index non-knowledge assets, including structured data, documents, tickets, logs, and other enterprise sources.
- Build secure, scalable connectors to read directly from customer-maintained indices and data repositories.
- Enable self-service capabilities for customers to manage content sources using AppFlow and Tray.ai, configure ingestion rules, and set up search parameters independently.
- Collaborate with the NLP/AI team to optimize relevance and performance for RAG search pipelines.
- Work closely with product and UX teams to design intuitive, powerful experiences around self-service data onboarding and search configuration.
- Implement data governance, access control, and observability features to ensure enterprise readiness.

Have you got what it takes?
- Proven experience with search infrastructure, RAG pipelines, and LLM-based applications.
- 8+ years of hands-on experience with AWS Knowledge Hub, AppFlow, Tray.ai, or equivalent cloud-based indexing/search platforms.
- Strong backend development skills (Python, TypeScript/Node.js, .NET/Java) and familiarity with building and consuming REST APIs.
- Infrastructure as Code (IaC) knowledge, e.g., AWS CloudFormation and CDK.
- Deep understanding of data ingestion pipelines, index management, and search query optimization.
- Experience working with unstructured and semi-structured data in real-world enterprise settings.
- Ability to design for scale, security, and multi-tenant environments.

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Noida

Work from Office


Become part of Barclays' Assistant Vice President - Impairment, UK Cards Portfolio Analytics team. At Barclays, we don't just anticipate the future - we're creating it. As part of this role, the candidate will be required to embed a control functionality by building and leading the development of the output for the team, helping colleagues demonstrate analytical and technical skills as well as knowledge of the fundamentals of retail credit risk management, particularly across impairment management. The colleague will also need to demonstrate sound judgement in collaboration with the wider team and management.

To be successful in this role, you should:
- Provide commentary for multiple forums.
- Own IFRS 9 risk models and manage their entire lifecycle, from data governance through development, implementation, and monitoring.
- Report findings and observations on IFRS 9 risk models.
- Develop Post Model Adjustments (PMAs) to address model inaccuracy and underperformance (a toy ECL/PMA calculation follows this listing).
- Review model monitoring reports to assess drivers of model underperformance and liaise with modelling teams.
- Design and implement tactical and strategic remediation.
- Support production of commentary packs and decks for multiple forums and the group impairment committee.

Some other highly valued skills include:
- Review and challenge of IFRS 9 impairment models (both SPOT and Forecasting).
- Producing annual and monthly forecasts for IFRS 9.
- Maintaining management information on impairment metrics, e.g. stock coverage.
- Working knowledge of key regulatory requirements for IFRS 9, applied to existing processes and reporting.
- Presenting and communicating results to management and other stakeholders.
- Facilitating a culture of decision making through provision of robust and accurate analyses.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in our Noida office.

Purpose of the role: To evaluate and assess the potential impairment of financial assets, ensuring that the bank's financial statements accurately reflect the economic value of its assets.

Accountabilities:
- Identification of potential impairment triggers, analysis of relevant financial and non-financial information to assess the potential for impairment of financial assets, and application of quantitative and qualitative impairment tests to determine whether an asset is considered impaired.
- Assessment of the impairment loss for an asset by identification of the right valuation method, assessment of its fair value, and documentation of the process.
- Calculation of the impairment provision to reflect the impairment loss, and preparation of clear and accurate impairment disclosures for financial statements.
- Management of the performance of impaired assets and reassessment of their impairment status on a regular basis.

Assistant Vice President Expectations:
- Advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions.
- Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraising performance relative to objectives and determining reward outcomes.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others.
- For an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identify the need for the inclusion of other areas of specialisation to complete assignments, and identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes.
- Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues.
- Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda.
- Take ownership for managing risk and strengthening controls in relation to the work done.
- Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function.
- Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy.
- Engage in complex analysis of data from multiple sources of information, internal and external (such as procedures and practices in other areas, teams, companies, etc.), to solve problems creatively and effectively.
- Communicate complex information, which could include sensitive information or information that is difficult to communicate because of its content or its audience.
- Influence or convince stakeholders to achieve outcomes.
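Since the role centres on IFRS 9 impairment, a toy expected-credit-loss calculation may help orient readers: ECL is commonly computed as PD x LGD x EAD, and a post-model adjustment (PMA) is applied as a management overlay when monitoring shows the model is off. All figures below are invented.

```python
# Toy IFRS 9 expected-credit-loss calculation: ECL = PD * LGD * EAD,
# with a post-model adjustment (PMA) applied as a management overlay.
# All figures are invented for illustration.

accounts = [
    # (probability of default, loss given default, exposure at default)
    (0.02, 0.45, 10_000.0),
    (0.10, 0.60, 5_000.0),
]

model_ecl = sum(pd_ * lgd * ead for pd_, lgd, ead in accounts)

# A PMA scales or shifts the modelled number when monitoring shows the
# model is under-predicting, e.g. a +5% overlay:
pma_overlay = 0.05
reported_ecl = model_ecl * (1 + pma_overlay)

print(f"model ECL = {model_ecl:,.2f}, reported ECL = {reported_ecl:,.2f}")
# model ECL = 390.00, reported ECL = 409.50
```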

Posted 1 month ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office


About Us: Narwal, with its Global Delivery Model, strategically expands its reach across North America, the United Kingdom, and an offshore development centre in India. Delivering cutting-edge AI, data, and quality engineering solutions and consistently surpassing expectations, Narwal has achieved remarkable triple-digit growth rates year after year, earning accolades such as Inc. 5000, Best IT Services Company, Best Data Technology Company, and Partner of the Year with Tricentis.

Our Vision: To be an expert in AI, data, and quality engineering transformations, bold in our thinking, and authentic in our relationships.

Key Skills:
- O2C (Order to Cash) / Sales and Distribution
- P2P (Procure to Pay)
- S2D (Schedule to Delivery)
- Optional: R2R (Record to Report), GTS (Global Trade Services), and MDG (Master Data Governance)

Additional Qualifications:
- At least 5 years of experience working on relevant SAP projects.
- Hands-on experience with SAP manual testing and execution of test cases.
- 7 to 12 years of professional experience (flexible).

Qualifications: B.E / B.Tech / MCA

Why Narwal?
- Opportunity to shape the future of a rapidly growing company.
- Competitive salary and benefits package.
- A supportive and inclusive company culture.
- Collaborative and innovative work environment.
- Access to professional development and growth opportunities.
- Certified as a Great Place to Work.
- Fully remote organization.

Narwal is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. For more information, please visit: https://www.narwalinc.com/

Posted 1 month ago

Apply

5.0 - 10.0 years

3 - 6 Lacs

Hyderabad

Work from Office


Detailed Job Description - Skill Set: Strong Data Engineer with 5+ years of experience in Databricks, PySpark, and SQL.

- Design and develop data processing pipelines and analytics solutions using Databricks.
- Architect scalable and efficient data models and storage solutions on the Databricks platform.
- Collaborate with architects and other teams to migrate the current solution to Databricks.
- Optimize performance and reliability of Databricks clusters and jobs to meet SLAs and business requirements (a maintenance sketch follows this listing).
- Use best practices for data governance, security, and compliance on the Databricks platform.
- Mentor junior engineers and provide technical guidance.
- Stay current with emerging technologies and trends in data engineering and analytics to drive continuous improvement.
- Strong understanding of distributed computing principles and experience with big data technologies such as Apache Spark.
- Experience with cloud platforms such as AWS, Azure, or GCP, and their associated data services.
- Proven track record of delivering scalable and reliable data solutions in a fast-paced environment.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills with the ability to work effectively in cross-functional teams.
- Good to have: experience with containerization technologies such as Docker and Kubernetes.
- Knowledge of DevOps practices for automated deployment and monitoring of data pipelines.

Mandatory Skills: Strong Data Engineer with 5+ years of experience in Databricks, PySpark, and SQL, covering the responsibilities listed above.
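One concrete instance of the optimization duty above: on Databricks, Delta tables are routinely compacted and co-located from PySpark. The statements below are standard Delta Lake maintenance commands; the table and column names are placeholders.

```python
# Sketch: routine Delta table maintenance on Databricks from PySpark.
# OPTIMIZE compacts small files; ZORDER BY co-locates rows on a filter
# column to cut scan time. Table and column names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

spark.sql("OPTIMIZE sales.orders ZORDER BY (customer_id)")

# Remove files no longer referenced by the table (default retention applies).
spark.sql("VACUUM sales.orders")

# Inspect the result of the maintenance run.
spark.sql("DESCRIBE HISTORY sales.orders").show(5, truncate=False)
```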

Posted 1 month ago

Apply

12.0 - 17.0 years

25 - 30 Lacs

Hyderabad

Work from Office


As a Senior Data Engineer, you will be responsible for designing and implementing complex data pipelines and analytics solutions to support key decision-making business processes in our Property & Casualty business domain. You will gain exposure to a project that is leveraging cutting-edge technology that applies Big Data and Machine Learning to solve new and emerging problems for Swiss Re Property & Casualty. You will be expected to take end-to-end ownership of deliverables, gaining a full understanding of the Property & Casualty data and the business logic required to deliver analytics solutions.

Key responsibilities include:
- Work closely with Product Owners and Architects to understand requirements, formulate solutions, and evaluate the implementation effort.
- Design, develop, and maintain scalable data transformation pipelines (a Foundry transform sketch follows this listing).
- Data model design and data architecture implementation.
- Work with the Palantir platform for implementation.
- Evaluate new capabilities of the analytics platform, develop prototypes, and assist in drawing up a single source of truth about our application landscape.
- Collaborate within a global development team to design and deliver solutions.
- Assist stakeholders with data-related functional and technical issues.
- Work with a data governance platform for data management and stewardship.

About the Team: This position is part of the Property & Casualty Data Integration and Analytics project within the Reinsurance Data office team under Data & Foundation. We are part of a global strategic initiative to make better use of our Property & Casualty data and to enhance our ability to make data-driven decisions across the Property & Casualty reinsurance value chain.

About You: You enjoy the challenge of solving complex big data analytics problems using state-of-the-art technologies as part of a growing global team of data engineering professionals. You are a self-starter with strong problem-solving skills, capable of owning and implementing solutions from start to finish.

Key qualifications include:
- Bachelor's degree or equivalent in Computer Science, Data Science, or a similar discipline.
- At least 12 years of experience working with large-scale software systems.
- At least 6 years of experience in PySpark, and proficiency in designing large-scale data engineering solutions.
- Minimum of 2 years of experience with Palantir Foundry, including familiarity with tools such as code repositories and Workshop.
- Proficient in SQL (Spark SQL preferred).
- Experience working with large data sets on enterprise data platforms and distributed computing (Spark/Hive/Hadoop preferred).
- Experience with TypeScript/JavaScript/HTML/CSS is a plus.
- Knowledge of data management fundamentals and data warehousing principles.
- Demonstrated strength in data modelling, ETL, and storage/Data Lake development.
- Experience with Scrum/Agile development methodologies.
- Knowledge of the Insurance domain, financial industry, or the Finance function in other industries is a strong plus.
- Experienced in working with a diverse, multi-location team of internal and external professionals.
- Strong analytical and problem-solving skills.
- Self-starter with a positive attitude and a willingness to learn.
- Ability to manage own workload; self-directed.
- Ability and enthusiasm to work in a global and multicultural environment.
- Strong interpersonal and communication skills, demonstrating a clear and articulate standard of written and verbal communication in complex environments.

About Swiss Re: If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.

Reference Code: 133975
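For the Palantir Foundry requirement, here is a minimal transform sketch in the style of Foundry code repositories, using the transforms.api decorators. The dataset paths and column names are hypothetical.

```python
# Minimal Palantir Foundry transform sketch (code repositories style).
# transforms.api is Foundry's transform library; the dataset paths and
# column names are hypothetical.
from pyspark.sql import functions as F
from transforms.api import transform_df, Input, Output

@transform_df(
    Output("/PnC/curated/claims_clean"),   # hypothetical output path
    claims=Input("/PnC/raw/claims"),       # hypothetical input path
)
def clean_claims(claims):
    # Standardize a raw claims feed into an analytics-ready table.
    return (claims
            .filter(F.col("claim_amount") > 0)
            .withColumn("loss_year", F.year("loss_date"))
            .dropDuplicates(["claim_id"]))
```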

Posted 1 month ago

Apply

12.0 - 17.0 years

25 - 30 Lacs

Bengaluru

Work from Office


As a Senior Data Engineer, you will be responsible for designing and implementing complex data pipelines and analytics solutions to support key decision-making business processes in our Property & Casualty business domain. You will gain exposure to a project that is leveraging cutting-edge technology that applies Big Data and Machine Learning to solve new and emerging problems for Swiss Re Property & Casualty. You will be expected to take end-to-end ownership of deliverables, gaining a full understanding of the Property & Casualty data and the business logic required to deliver analytics solutions.

Key responsibilities include:
- Work closely with Product Owners and Architects to understand requirements, formulate solutions, and evaluate the implementation effort.
- Design, develop, and maintain scalable data transformation pipelines.
- Data model design and data architecture implementation.
- Work with the Palantir platform for implementation.
- Evaluate new capabilities of the analytics platform, develop prototypes, and assist in drawing up a single source of truth about our application landscape.
- Collaborate within a global development team to design and deliver solutions.
- Assist stakeholders with data-related functional and technical issues.
- Work with a data governance platform for data management and stewardship.

About the Team: This position is part of the Property & Casualty Data Integration and Analytics project within the Reinsurance Data office team under Data & Foundation. We are part of a global strategic initiative to make better use of our Property & Casualty data and to enhance our ability to make data-driven decisions across the Property & Casualty reinsurance value chain.

About You: You enjoy the challenge of solving complex big data analytics problems using state-of-the-art technologies as part of a growing global team of data engineering professionals. You are a self-starter with strong problem-solving skills, capable of owning and implementing solutions from start to finish.

Key qualifications include:
- Bachelor's degree or equivalent in Computer Science, Data Science, or a similar discipline.
- At least 12 years of experience working with large-scale software systems.
- At least 6 years of experience in PySpark, and proficiency in designing large-scale data engineering solutions.
- Minimum of 2 years of experience with Palantir Foundry, including familiarity with tools such as code repositories and Workshop.
- Proficient in SQL (Spark SQL preferred).
- Experience working with large data sets on enterprise data platforms and distributed computing (Spark/Hive/Hadoop preferred).
- Experience with TypeScript/JavaScript/HTML/CSS is a plus.
- Knowledge of data management fundamentals and data warehousing principles.
- Demonstrated strength in data modelling, ETL, and storage/Data Lake development.
- Experience with Scrum/Agile development methodologies.
- Knowledge of the Insurance domain, financial industry, or the Finance function in other industries is a strong plus.
- Experienced in working with a diverse, multi-location team of internal and external professionals.
- Strong analytical and problem-solving skills.
- Self-starter with a positive attitude and a willingness to learn.
- Ability to manage own workload; self-directed.
- Ability and enthusiasm to work in a global and multicultural environment.
- Strong interpersonal and communication skills, demonstrating a clear and articulate standard of written and verbal communication in complex environments.

About Swiss Re: If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.

Reference Code: 133972

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Hyderabad

Work from Office


Job Description: As a Sr Data Engineer, your role is to spearhead the data engineering teams and elevate the team to the next level! You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles, deploying Agile if possible, and create the foundations for good data stewardship with our new data products! You will also set up a solid code framework that needs to be built to purpose yet have enough flexibility to adapt to new business use cases, a tough but rewarding challenge!

Responsibilities:
- Collaborate with several stakeholders to deeply understand the needs of data practitioners to deliver at scale.
- Lead Data Engineers to define, build, and maintain the Data Platform.
- Work on building a Data Lake in Azure Fabric, processing data from multiple sources.
- Migrate the existing data store from Azure Synapse to Azure Fabric.
- Implement data governance and access control.
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards.
- Present technical solutions, capabilities, considerations, and features in business terms.
- Effectively communicate status, issues, and risks in a precise and timely manner.
- Further develop critical initiatives, such as Data Discovery, Data Lineage, and Data Quality.
- Lead the team and mentor junior resources; help your team members grow in their role and achieve their career aspirations.
- Build data systems, pipelines, analytical tools, and programs.
- Conduct complex data analysis and report on results.

Qualifications:
- 7+ years of experience as a data engineer or in a similar role in Azure Synapse or ADF, or relevant experience in Azure Fabric.
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field.

Posted 1 month ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Hyderabad

Work from Office


Job Description - Data Engineer

We are seeking a highly skilled Data Engineer with extensive experience in Snowflake, Data Build Tool (dbt), SnapLogic, SQL Server, PostgreSQL, Azure Data Factory, and other ETL tools. The ideal candidate will have a strong ability to optimize SQL queries and a good working knowledge of Python. A positive attitude and excellent teamwork skills are essential.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and ETL tools (a dbt model sketch follows this listing).
- SQL Optimization: Write and optimize complex SQL queries to ensure high performance and efficiency.
- Data Integration: Integrate data from various sources, ensuring consistency, accuracy, and reliability.
- Database Management: Manage and maintain SQL Server and PostgreSQL databases.
- ETL Processes: Develop and manage ETL processes to support data warehousing and analytics.
- Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions.
- Documentation: Maintain comprehensive documentation of data models, data flows, and ETL processes.
- Troubleshooting: Identify and resolve data-related issues and discrepancies.
- Python Scripting: Utilize Python for data manipulation, automation, and integration tasks.

Technical Skills:
- Proficiency in Snowflake, dbt, SnapLogic, SQL Server, PostgreSQL, and Azure Data Factory.
- Strong SQL skills with the ability to write and optimize complex queries.
- Knowledge of Python for data manipulation and automation.
- Knowledge of data governance frameworks and best practices.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Positive attitude and ability to work well in a team environment.

Certifications: Relevant certifications (e.g., Snowflake, Azure) are a plus.
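Given the dbt requirement, a minimal dbt Python model sketch follows (Python models are supported on Snowflake via Snowpark from dbt 1.3 onward); the upstream model name and its columns are invented. An equivalent SQL model would express the same logic as a SELECT with a GROUP BY.

```python
# Minimal dbt Python model sketch (e.g. models/daily_revenue.py).
# Assumes dbt >= 1.3 running on Snowflake, where dbt.ref() returns a
# Snowpark DataFrame; "stg_orders" and its columns are invented.
import snowflake.snowpark.functions as F

def model(dbt, session):
    dbt.config(materialized="table")
    orders = dbt.ref("stg_orders")          # upstream staging model
    return (orders
            .filter(F.col("status") == "complete")
            .group_by("order_date")
            .agg(F.sum("amount").alias("revenue")))
```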

Posted 1 month ago

Apply

11.0 - 17.0 years

15 - 20 Lacs

Pune

Work from Office

Naukri logo

TA/SME - M365 Suite / Security
M365 Suite Management: Oversee the administration, configuration, and optimization of Microsoft 365 applications, including Exchange Online, SharePoint, Teams, and OneDrive.
Data Governance: Develop and implement data governance policies and procedures to ensure data integrity, security, and compliance across the organization.
Microsoft Purview Expertise: Utilize Microsoft Purview to manage and protect sensitive data, ensuring compliance with regulatory requirements and internal policies.
Design and Implementation: Design and implement solutions that leverage M365 and Purview capabilities to meet business needs, including data classification, labeling, and retention policies.
Collaboration: Work closely with IT, security, and business teams to understand requirements and deliver effective solutions.
Training and Support: Provide training and support to end-users and IT staff on M365 and Purview functionalities and best practices.
Continuous Improvement: Stay up to date with the latest M365 and Purview features and updates, and continuously seek opportunities to enhance the organization's use of these tools.

Qualifications:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience: Minimum of 11 years of experience managing Microsoft 365 environments, with a focus on data governance and Microsoft Purview.
Certifications: Relevant certifications such as Microsoft Certified: Security, Compliance, and Identity Fundamentals; Microsoft Certified: Information Protection Administrator Associate; or similar.
Skills:
- Strong understanding of M365 applications and services.
- Expertise in data governance, data protection, and compliance.
- Proficiency in Microsoft Purview and its capabilities.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to design and implement effective solutions.
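To make the classification and retention work above concrete, a small policy-as-code sketch in Python; this is a hypothetical internal representation for review and documentation, not a Microsoft Purview API, and every label, record class and retention period shown is an assumption:

    # Hypothetical sensitivity-label catalogue, mirroring what would be
    # configured as labels in Microsoft Purview.
    SENSITIVITY_LABELS = {
        "Public":              {"encrypt": False, "watermark": False},
        "Internal":            {"encrypt": False, "watermark": True},
        "Confidential":        {"encrypt": True,  "watermark": True},
        "Highly Confidential": {"encrypt": True,  "watermark": True},
    }

    # Hypothetical retention policies keyed by record class
    RETENTION_POLICIES = {
        "finance-records": {"label": "Confidential",        "retain_years": 7,  "action": "review"},
        "hr-records":      {"label": "Highly Confidential", "retain_years": 10, "action": "delete"},
        "general-mail":    {"label": "Internal",            "retain_years": 3,  "action": "delete"},
    }

    def validate(policies, labels):
        """Fail fast if a retention policy references an undefined label."""
        for name, policy in policies.items():
            if policy["label"] not in labels:
                raise ValueError(f"{name}: unknown sensitivity label {policy['label']}")

    validate(RETENTION_POLICIES, SENSITIVITY_LABELS)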

Posted 1 month ago

Apply

10.0 - 15.0 years

20 - 25 Lacs

Hubli, Mangaluru, Mysuru

Work from Office

Naukri logo

Job Description
Core Skill Summary: Experienced Salesforce Administrator with 10+ years in IT, strong SOX/SOD compliance expertise, audit support, and cross-functional collaboration skills.
- Experience level: 10+ years in IT / Technology / Hi-Tech consulting, with a functional background.
- Ability to communicate effectively and interact with cross-functional teams.
- Proven experience as a Salesforce Administrator with a focus on SOX and SOD compliance.
- Salesforce Administrator certification is highly desirable.
- Strong understanding of Salesforce security controls, data governance, and audit trails.
- Experience in identifying and resolving SOD conflicts in user roles.
- Understand the SOX process flows, narratives, and risk and control matrices of business processes and information technology platforms relevant to financial reporting.
- Conduct or participate in walkthroughs and other meetings with process and control owners to ensure SOX testing plans are properly created to address financial reporting risks identified by management.
- Work with control owners to resolve any potential issues before formal audits.
- Partner with Internal and External Audit to ensure that IT controls meet expectations and appropriately address risk.
- Basic knowledge of Order Management processes (Order to Cash).
- Must have configuration knowledge of the Salesforce platform, such as workflows, flows, and basic Salesforce configuration.
- Good at documenting and producing functional flow diagrams.
- Translate requirements to the technical team.
- Self-starter, motivated, well-organized.
- Good listener, willing to learn new things, and able to thrive in a fast-paced team environment.
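An illustrative Python sketch of the SOD-conflict check described above; the duty names, conflict matrix and user assignments are hypothetical, and a real check would derive the assignments from Salesforce profile and permission-set exports:

    # Hypothetical SOD conflict matrix: pairs of duties one user must not hold
    SOD_CONFLICTS = [
        ("Create Vendor", "Approve Payment"),
        ("Enter Order", "Approve Credit Memo"),
        ("Modify Price Book", "Approve Discount"),
    ]

    # Hypothetical user -> duties mapping, e.g. built from profile exports
    user_duties = {
        "asmith": {"Create Vendor", "Approve Payment"},
        "bjones": {"Enter Order"},
    }

    def find_sod_violations(duties_by_user, conflicts):
        """Return (user, duty_a, duty_b) for every conflicting pair a user holds."""
        violations = []
        for user, duties in duties_by_user.items():
            for a, b in conflicts:
                if a in duties and b in duties:
                    violations.append((user, a, b))
        return violations

    for user, a, b in find_sod_violations(user_duties, SOD_CONFLICTS):
        print(f"SOD violation: {user} holds both '{a}' and '{b}'")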

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Pune

Work from Office

Naukri logo

So, what's the role all about?
We are looking for a highly driven and technically skilled Software Engineer to lead the integration of various Content Management Systems with AWS Knowledge Hub, enabling advanced Retrieval-Augmented Generation (RAG) search across heterogeneous customer data without requiring data duplication. This role will also be responsible for expanding the scope of Knowledge Hub to support non-traditional knowledge items and enhance customer self-service capabilities. You will work at the intersection of AI, search infrastructure, and developer experience to make enterprise knowledge instantly accessible, actionable, and AI-ready.

How will you make an impact?
- Integrate CMS platforms with AWS Knowledge Hub to allow seamless RAG-based search across diverse data types, eliminating the need to copy data into Knowledge Hub instances.
- Extend Knowledge Hub capabilities to ingest and index non-knowledge assets, including structured data, documents, tickets, logs, and other enterprise sources.
- Build secure, scalable connectors that read directly from customer-maintained indices and data repositories.
- Enable self-service capabilities for customers to manage content sources using AppFlow and Tray.ai, configure ingestion rules, and set up search parameters independently.
- Collaborate with the NLP/AI team to optimize relevance and performance for RAG search pipelines.
- Work closely with product and UX teams to design intuitive, powerful experiences around self-service data onboarding and search configuration.
- Implement data governance, access control, and observability features to ensure enterprise readiness.

Have you got what it takes?
- Proven experience with search infrastructure, RAG pipelines, and LLM-based applications.
- 5+ years of hands-on experience with AWS Knowledge Hub, AppFlow, Tray.ai, or equivalent cloud-based indexing/search platforms.
- Strong backend development skills (Python, TypeScript/Node.js, .NET/Java) and familiarity with building and consuming REST APIs.
- Infrastructure as Code (IaC) knowledge, such as AWS CloudFormation or the CDK.
- Deep understanding of data ingestion pipelines, index management, and search query optimization.
- Experience working with unstructured and semi-structured data in real-world enterprise settings.
- Ability to design for scale, security, and multi-tenant environments.

What's in it for you?
Enjoy NICE-FLEX!

Reporting into: Tech Manager, Engineering, CX
Role Type: Individual Contributor
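A minimal sketch of the duplication-free RAG pattern this role describes: passages are retrieved directly from a customer-maintained index at query time and assembled into a prompt, so nothing is copied into the RAG system. The SearchConnector class, endpoint and response shape are hypothetical placeholders, not a documented AWS Knowledge Hub API:

    from dataclasses import dataclass
    import requests

    @dataclass
    class Passage:
        source_id: str
        text: str
        score: float

    class SearchConnector:
        """Hypothetical connector that queries a customer-owned index in place."""

        def __init__(self, endpoint: str, api_key: str):
            self.endpoint = endpoint
            self.api_key = api_key

        def search(self, query: str, top_k: int = 5) -> list[Passage]:
            resp = requests.post(
                f"{self.endpoint}/search",  # hypothetical endpoint
                headers={"Authorization": f"Bearer {self.api_key}"},
                json={"query": query, "top_k": top_k},
                timeout=10,
            )
            resp.raise_for_status()
            return [Passage(h["id"], h["text"], h["score"]) for h in resp.json()["hits"]]

    def build_prompt(question: str, passages: list[Passage]) -> str:
        # Context is assembled at query time and handed to whichever LLM
        # the pipeline uses; no documents are stored in the RAG layer.
        context = "\n\n".join(p.text for p in passages)
        return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"

    connector = SearchConnector("https://customer-index.example.com", api_key="...")
    prompt = build_prompt("What is our refund policy?", connector.search("refund policy"))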

Posted 1 month ago

Apply

7.0 - 11.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Naukri logo

As a Sr Data Engineer, you will spearhead the data engineering teams and elevate them to the next level! You will be responsible for laying out the architecture of the new project and selecting the tech stack associated with it. You will plan development cycles, deploying Agile where possible, and create the foundations for good data stewardship with our new data products. You will also set up a solid code framework that is built to purpose yet flexible enough to adapt to new business use cases: a tough but rewarding challenge!

Responsibilities
- Collaborate with stakeholders to deeply understand the needs of data practitioners and deliver at scale
- Lead Data Engineers to define, build and maintain the Data Platform
- Build a Data Lake in Azure Fabric, processing data from multiple sources
- Migrate the existing data store from Azure Synapse to Azure Fabric
- Implement data governance and access control
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards
- Present technical solutions, capabilities, considerations, and features in business terms
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives such as Data Discovery, Data Lineage and Data Quality
- Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations
- Build data systems, pipelines, analytical tools and programs
- Conduct complex data analysis and report on results

Qualifications
- 7+ years of experience as a data engineer or in a similar role with Azure Synapse, ADF, or relevant experience in Azure Fabric
- Degree in Computer Science, Data Science, Mathematics, IT, or similar field
- Must have experience e

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Manage, configure, and support Apple macOS and iOS devices using JAMF Pro.
- Develop and maintain policies, configuration profiles, and custom scripts to automate device management.
- Collaborate with IT security and infrastructure teams to ensure compliance and security of Apple endpoints.
- Lead JAMF deployment, patch management, and software distribution across the organization.
- Troubleshoot JAMF-related issues and provide Tier 2/3 support as needed.
- Document configurations, policies, and procedures related to JAMF administration.
- Participate in new projects, upgrades, and system integrations involving Apple technologies.

Requirements:
- 5+ years of hands-on experience with JAMF Pro (6+ years preferred).
- Mandatory certification: JAMF 300 (Certified Tech) or JAMF 400 (Certified Admin). Note: certifications below JAMF 300 will not be considered.
- Strong knowledge of Apple device lifecycle management.
- Proficiency in scripting languages such as Bash, Python, or AppleScript.
- Familiarity with macOS deployment tools and MDM frameworks.
- Experience with enterprise infrastructure (e.g., Active Directory, SSO, VPN).
- Excellent communication and documentation skills.
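For a flavour of the scripting side of this role, a minimal Python sketch against the Jamf Pro REST API, assuming its token-auth and computers-inventory endpoints; the host, credentials and response field names are placeholders and may differ by Jamf Pro version:

    import requests

    JAMF_URL = "https://example.jamfcloud.com"  # placeholder host

    # Exchange basic credentials for a short-lived bearer token
    # (assumed Jamf Pro token endpoint; verify against your version's docs)
    token = requests.post(
        f"{JAMF_URL}/api/v1/auth/token",
        auth=("api_user", "api_password"),
        timeout=10,
    ).json()["token"]

    # List computer inventory records, one page at a time
    resp = requests.get(
        f"{JAMF_URL}/api/v1/computers-inventory",
        headers={"Authorization": f"Bearer {token}"},
        params={"page": 0, "page-size": 100},
        timeout=10,
    )
    resp.raise_for_status()

    # Field names below are assumptions about the inventory payload shape
    for computer in resp.json().get("results", []):
        general = computer.get("general", {})
        print(general.get("name"), general.get("lastContactTime"))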

Posted 1 month ago

Apply

5.0 - 10.0 years

14 - 19 Lacs

Bengaluru

Work from Office

Naukri logo

Are you passionate about transforming data into actionable insights? Join our team at Infineon Technologies as a Staff Engineer in Data Engineering & Analytics! In this role, you'll be at the forefront of harnessing the power of data to drive innovation and efficiency. Collaborate with experts, design robust data ecosystems, and support digitalization projects. If you have a strong background in data engineering and database concepts and a flair for turning complex business needs into solutions, we want to hear from you. Elevate your career with us and be part of shaping the future!

Job Description
In your new role you will:
- Identify and understand the different needs and requirements of consumers and data providers (e.g. transaction processing, data warehousing, big data, AI/ML) and translate business digitalization needs into technical system requirements.
- Team up with our domain, IT and process experts to assess the status quo, capture the full value of our data, and derive target data ecosystems based on business needs.
- Design, build, deploy and maintain scalable and reliable data assets, pipelines and architectures.
- Team up with domain, IT and process experts, and especially with your key users, to validate the effectiveness and efficiency of the designed data solutions and contribute to their continuous improvement and future-proofing.
- Support data governance (data catalogue, data lineage, metadata, data quality, roles and responsibilities) and enable analytics use cases with a focus on data harmonization, connection and visualization.
- Drive and/or contribute to digitalization projects in cross-functional coordination with IT and business counterparts (e.g. data scientists, domain experts, process owners).
- Act as first point of contact for data solutions in the ATV QM organization, consulting and guiding stakeholders to leverage the full value of data, and cascading knowledge of industry trends and technology roadmaps for the major market players (guidelines, principles, frameworks, industry standards and best practice, upcoming innovation, new features and technologies).

Your Profile
You are best equipped for this task if you have:
- A degree in Information Technology, Business Informatics, Computer Science or a related field of studies.
- At least 5 years of relevant work experience in Data Engineering and/or Analytics, with a strong data engineering focus.
- The ability to translate complex business needs into concrete actions.
- Excellent expertise in database concepts (e.g. DWH, Hadoop/big data, OLAP) and related query languages (e.g. SQL, Scala, Java, MDX).
- Expertise in data virtualization (e.g. Denodo).
- Working knowledge of the latest toolsets for data analytics, reporting and data visualization (e.g. Tableau, SAP BO); Python, R and Spark are a plus.
- Ability to work both independently and within a team.

We are on a journey to create the best Infineon for everyone.
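For a flavour of the data quality side of the governance work above, a minimal pandas sketch of rule-based checks; the dataset, column names and thresholds are hypothetical:

    import pandas as pd

    # Hypothetical extract from a harmonized lot-tracking table
    df = pd.DataFrame({
        "lot_id":    ["L001", "L002", None, "L004"],
        "yield_pct": [98.2, 101.5, 97.0, 96.4],
        "site":      ["Dresden", "Kulim", "Dresden", None],
    })

    # Rule-based quality checks: completeness and valid value ranges
    checks = {
        "lot_id not null":   df["lot_id"].notna().all(),
        "site not null":     df["site"].notna().all(),
        "yield in [0, 100]": df["yield_pct"].between(0, 100).all(),
    }

    failed = [name for name, ok in checks.items() if not ok]
    if failed:
        # In a real pipeline this result would feed a data quality dashboard
        print("Data quality violations:", ", ".join(failed))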

Posted 1 month ago

Apply