
1380 Data Governance Jobs - Page 43

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

12 - 22 Lacs

Bengaluru

Work from Office

Naukri logo

Your Responsibilities:
- Designing and implementing scalable and reliable data pipelines on the Azure platform
- Developing and maintaining data integration solutions using Azure Data Factory, Azure Databricks, and other Azure services
- Ensuring data quality and integrity by implementing best practices in data collection, processing, and storage
- Collaborating with data scientists, data analysts, and other stakeholders to understand their data needs and deliver actionable insights
- Managing and optimizing Azure data storage solutions such as Azure SQL Database, Azure Data Lake, and Azure Cosmos DB
- Monitoring the performance of data pipelines and implementing strategies for continuous improvement
- Developing and maintaining ETL processes to support data warehousing and analytics
- Implementing best practices for data governance, security, and compliance
- Staying up to date with the latest industry trends and technologies to continuously improve data engineering practices and methodologies
- Living Hitachi Energy's core values of safety and integrity, which means taking responsibility for your own actions while caring for your colleagues and the business

Your Background:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 8+ years of experience in data engineering, with a focus on Azure data services
- Relevant certifications in Azure data services or cloud computing are an added advantage
- Proficiency in programming and scripting languages such as Python, SQL, or Scala
- Experience with Azure data services, including Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake
- Strong understanding of data modeling, ETL processes, and data warehousing concepts
- Experience with big data technologies such as Hadoop and Spark
- Knowledge of data governance, security, and compliance best practices
- Familiarity with monitoring and logging tools such as Azure Monitor and Log Analytics
- Strong problem-solving and troubleshooting skills
- Excellent communication and collaboration skills to work effectively with cross-functional teams
- Strong attention to detail and organizational skills
- Ability to articulate and present ideas to senior management
- Problem-solving mindset with the ability to work independently and as part of a team
- Eagerness to learn and enhance knowledge unassisted
- Strong networking skills and global orientation
- Ability to coach and mentor team members
- Effective collaboration with internal and external stakeholders
- Adaptability to manage and lead transformational projects
- Proficiency in both spoken and written English
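A minimal PySpark sketch of the kind of Azure Databricks ETL step this posting describes; the ADLS paths and column names are illustrative assumptions, not details from the listing.

```python
# Minimal PySpark sketch: read a raw CSV drop from ADLS, apply basic cleaning,
# and persist the result as Delta for downstream analytics.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"        # hypothetical ADLS container
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/orders/"

orders = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv(raw_path)
)

# Basic data-quality step: drop rows missing the key and deduplicate on order_id
cleaned = (
    orders.dropna(subset=["order_id"])
          .dropDuplicates(["order_id"])
          .withColumn("ingested_at", F.current_timestamp())
)

# Persist as Delta so downstream Synapse / Power BI workloads can query it
cleaned.write.format("delta").mode("overwrite").save(curated_path)
```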

Posted 1 month ago

Apply

3.0 - 5.0 years

15 - 25 Lacs

Kochi

Work from Office

Naukri logo

We’re looking for candidates with any one of the below skill combinations who can attend an in-person interview on 31st May 2025 at our Kochi and Chennai locations. This is a scheduled drive, and eligible candidates will receive a call letter.

Experience range: Only 3 to 6 years
Location: Kochi & Chennai
Date of Interview: 31st May 2025

Skill combinations:
- Java, Selenium
- Java, Python, Selenium
- Java, Selenium, Rest Assured
- Playwright – Cypress (SDET)

Job Summary
We are seeking a QA Automation Engineer with 3 to 5 years of experience in Rest Assured (Java) and Selenium. The ideal candidate will have a strong background in testing and quality assurance with a focus on ensuring the highest standards of product quality. Experience in data governance is a plus. This is a hybrid role with day shifts and no travel required.

Responsibilities
- Conduct thorough testing of products using Rest Assured (Java) and Selenium to ensure quality and functionality.
- Develop and execute test plans, test cases, and test scripts to identify defects and ensure product quality.
- Collaborate with development teams to understand product requirements and provide feedback on testability and quality.
- Perform regression testing to ensure that new features do not negatively impact existing functionality.
- Analyze test results and report defects using appropriate tools and methodologies.
- Work closely with the development team to troubleshoot and resolve issues in a timely manner.
- Provide detailed documentation of test processes, results, and defects for future reference.
- Participate in code reviews and provide feedback on testability and quality.
- Ensure compliance with industry standards and best practices in testing and quality assurance.
- Stay updated with the latest testing tools, techniques, and trends to continuously improve testing processes.
- Communicate effectively with team members and stakeholders to ensure alignment on quality goals and objectives.
- Contribute to the continuous improvement of testing processes and methodologies.
- Support the development and maintenance of automated testing frameworks and tools.

Qualifications
- Possess strong technical skills in Rest Assured (Java) and Selenium.
- Have a solid understanding of testing methodologies and best practices.
- Demonstrate experience in developing and executing test plans and test cases.
- Show proficiency in analyzing test results and reporting defects.
- Exhibit excellent communication and collaboration skills.
- Have experience in data governance (nice to have).
- Be familiar with automated testing frameworks and tools.
- Display a commitment to continuous improvement and learning.
- Have a keen eye for detail and a passion for quality.
- Be able to work effectively in a hybrid work model.
- Demonstrate the ability to work independently and as part of a team.
- Show a proactive approach to problem-solving and troubleshooting.
- Have a minimum of 3 years and a maximum of 5 years of relevant experience.

Posted 1 month ago

Apply

1.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: Data Engineer
Experience: 5-8 Years
Location: Delhi, Pune, Bangalore (Hyderabad & Chennai also acceptable)
Time Zone: Aligned with UK time zone
Notice Period: Immediate joiners only

Role Overview: We are seeking experienced Data Engineers to design, develop, and optimize large-scale data processing systems. You will play a key role in building scalable, efficient, and reliable data pipelines in a cloud-native environment, leveraging your expertise in GCP, BigQuery, Dataflow, Dataproc, and more.

Key Responsibilities:
- Design, build, and manage scalable and reliable data pipelines for real-time and batch processing.
- Implement robust data processing solutions using GCP services and open-source technologies.
- Create efficient data models and write high-performance analytics queries.
- Optimize pipelines for performance, scalability, and cost-efficiency.
- Collaborate with data scientists, analysts, and engineering teams to ensure smooth data integration and transformation.
- Maintain high data quality, enforce validation rules, and set up monitoring and alerting.
- Participate in code reviews and deployment activities, and provide production support.

Technical Skills Required:
- Cloud Platforms: GCP (Google Cloud Platform) – mandatory
- Key GCP Services: Dataproc, BigQuery, Dataflow
- Programming Languages: Python, Java, PySpark
- Data Engineering Concepts: Data ingestion, Change Data Capture (CDC), ETL/ELT pipeline design
- Strong understanding of distributed computing, data structures, and performance tuning

Required Qualifications & Attributes:
- 5-8 years of hands-on experience in data engineering roles
- Proficiency in building and optimizing distributed data pipelines
- Solid grasp of data governance and security best practices in cloud environments
- Strong analytical and problem-solving skills
- Effective verbal and written communication skills
- Proven ability to work independently and in cross-functional teams
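A minimal Python sketch of a batch analytics step on the GCP stack this role names: run a BigQuery query and land the result in a reporting table. The project, dataset, and table names are assumptions.

```python
# Minimal sketch: run an aggregation in BigQuery and write it to a reporting table.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")   # assumes default credentials

destination = bigquery.TableReference.from_string("example-project.reporting.daily_orders")
job_config = bigquery.QueryJobConfig(
    destination=destination,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

sql = """
    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM `example-project.raw.orders`
    GROUP BY order_date
"""

query_job = client.query(sql, job_config=job_config)  # starts the job
query_job.result()                                    # waits for completion
print(f"Loaded reporting table in job {query_job.job_id}")
```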

Posted 1 month ago

Apply

1.0 - 5.0 years

11 - 15 Lacs

Pune

Work from Office

Naukri logo

Role: Data QA Lead
Experience Required: 8+ Years
Location: India/Remote

Company Overview
At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.

The Data Quality Analyst is responsible for ensuring the quality, accuracy, and consistency of data within the Customer and Loan Master Data API solution. This role will work closely with data owners, data modelers, and developers to identify and resolve data quality issues.

Responsibilities:
- Profile data to identify data quality issues.
- Define and implement data quality rules and standards.
- Perform data cleansing and data validation; track data quality metrics and report on data quality issues.
- Work with data owners to resolve data quality issues.
- Monitor data quality trends and identify areas for improvement.
- Collaborate with data modelers to ensure data models support data quality requirements.
- Collaborate with developers to implement data quality solutions.
- Participate in data governance activities to ensure data quality and compliance.
- Stay up to date with emerging data quality technologies and trends.

Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data quality assurance.
- SharePoint testing / FTP or file data migration testing (minimum experience testing migrations of 5 to 10 lakh files).
- Strong understanding of data quality principles and techniques.
- Experience with data profiling tools.
- Proficiency in SQL and data analysis.
- Knowledge of data governance principles and practices.
- Excellent communication, collaboration, and analytical skills.
- Ability to work independently and as part of a team.
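A minimal pandas sketch of the kind of data-profiling check this role describes; the file name, key column, and thresholds are assumptions.

```python
# Minimal data-quality profile: row count, duplicate keys, and per-column null rates.
import pandas as pd

df = pd.read_csv("customer_master.csv")   # hypothetical extract under test

report = {
    "row_count": len(df),
    "duplicate_keys": int(df["customer_id"].duplicated().sum()),
    "null_rate_per_column": df.isna().mean().round(3).to_dict(),
}

# Simple rule: fail the check if any column is more than 5% null or keys repeat
failed = report["duplicate_keys"] > 0 or any(
    rate > 0.05 for rate in report["null_rate_per_column"].values()
)

print(report)
print("Data quality check:", "FAILED" if failed else "PASSED")
```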

Posted 1 month ago

Apply

3.0 - 6.0 years

7 - 12 Lacs

Lucknow

Work from Office

Naukri logo

We are seeking an experienced and innovative AI Solution Engineer specializing in foundation models and large language models. In this role, you will be responsible for architecting and delivering AI and automation solutions using cutting-edge technologies, with a strong focus on foundation models and large language models.

In this role, you will be responsible for:
- Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations.
- Customer Engagement and Support: Act as a technical point of contact for customers, addressing their questions, concerns, and feedback. Provide technical support during the solution deployment phase and offer guidance on AI-related best practices and use cases.
- Industry Trends and Innovation: Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, or a related field.
- 4+ years of experience in designing and delivering AI solutions, with a focus on foundation models, large language models, or similar technologies. Experience in natural language processing (NLP) and text analytics is highly desirable.
- Technical skills: Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, or Hugging Face. Understanding of libraries such as scikit-learn, Pandas, Matplotlib, etc. Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and related services is a plus.
- Solutioning experience: Experience in solution architecture and design, translating business requirements into technical specifications, and developing scalable and robust AI solutions.
- Business acumen: Ability to understand customer needs and business objectives. Experience in working closely with customers and translating their requirements into effective AI solutions.
- Soft skills: Excellent interpersonal and communication skills. Engage with stakeholders for analysis and implementation. Commitment to continuous learning and staying updated with advancements in the field of AI.
- Growth mindset: Demonstrate a growth mindset to understand clients’ business processes and challenges.

Preferred technical and professional experience:
- Experience in the full AI project lifecycle, from research and prototyping to deployment in production environments.
- Familiarity with Agile development methodologies.
- Experience with AI and/or data governance.
- Experience with building business automation or digital labor solutions.
- Experience with building customer care solutions/digital assistants.
- Experience with Red Hat OpenShift.
- Experience with Kubernetes.
- Experience with vector DBs or open file formats like Parquet, Avro, or ORC.
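A minimal sketch of a foundation-model POC using the Hugging Face pipeline API mentioned above; the model choice and prompt are assumptions, not details from the posting.

```python
# Minimal text-generation POC with a small, locally runnable causal LM.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # any available causal LM works

prompt = "Summarize the key risks in this customer support transcript:"
outputs = generator(prompt, max_new_tokens=60, num_return_sequences=1)

print(outputs[0]["generated_text"])
```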

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Kolkata

Work from Office

Naukri logo

Subject matter experts in Marketing and Comms provide business stakeholders with specialized advice on their subjects, and act as an advisor leveraging a specific MC expertise. She/he is a person with in-depth, unique knowledge and expertise on a specific subject or in a particular industry, e.g. digital marketing, internal comms, telecom, etc.

- Experience working in a corporate enterprise environment, particularly with marketing or creative assets.
- Knowledge of UX/UI principles and best practices related to digital asset accessibility.
- Familiarity with project management tools (e.g., Jira, Monday) for tracking tasks and communicating progress.

Primary Skills:
- Familiarity with digital asset management (DAM) systems and software, and an understanding of various digital file formats and their specifications.
- Exceptional ability to identify inconsistencies, errors, and deviations from standards in visual assets and metadata.

Secondary Skills:
- Ability to work effectively with different departments and external partners, understanding their needs and ensuring the DAM meets those requirements.
- Proven ability to conduct thorough quality checks on visual assets, identifying potential errors or inconsistencies related to file integrity, technical specifications, and brand guidelines.
- Understanding of common issues in artwork and image production (e.g., flattened files, missing elements, resolution problems).

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Kolkata

Work from Office

Naukri logo

Subject matter experts in Marketing and Comms provide business stakeholders with specialized advice on their subjects, and act as an advisor leveraging a specific MC expertise. She/he is a person with in-depth, unique knowledge and expertise on a specific subject or in a particular industry, e.g. digital marketing, internal comms, telecom, etc.

- Familiarity with project management tools (e.g., Jira, Monday) for tracking tasks and communicating progress.
- Desired background or studies in Marketing, Design, Digital Advertising or Creatives, Artwork Production, or Printed Materials.
- Deep understanding of brand guidelines and the importance of maintaining brand consistency across all digital assets.
- Ability to identify inefficiencies, analyze data, and implement solutions to improve DAM processes and workflows.

Primary Skills:
- Proven track record of successfully managing large and diverse digital asset libraries, with a deep understanding of DAM principles, metadata best practices, taxonomy development, and workflow optimization.
- Demonstrated ability to build strong relationships, effectively communicate with diverse stakeholders at all levels, and influence cross-functional teams.

Secondary Skills:
- Proven ability to conduct thorough quality checks on visual assets, identifying potential errors or inconsistencies related to file integrity, technical specifications, and brand guidelines.
- Understanding of common issues in artwork and image production (e.g., flattened files, missing elements, resolution problems).
- Knowledge of AI tools for asset management, such as automatic tagging and content discovery.
- Organizing, storing, and retrieving media, managing digital rights and permissions, and ensuring accurate metadata.
- A strong understanding of digital asset management practices, metadata standards, and stakeholder management is essential.

Posted 1 month ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Pune

Work from Office

Naukri logo

So, what's the role all about?
We are looking for a highly driven and technically skilled Software Engineer to lead the integration of various Content Management Systems with AWS Knowledge Hub, enabling advanced Retrieval-Augmented Generation (RAG) search across heterogeneous customer data without requiring data duplication. This role will also be responsible for expanding the scope of Knowledge Hub to support non-traditional knowledge items and enhance customer self-service capabilities. You will work at the intersection of AI, search infrastructure, and developer experience to make enterprise knowledge instantly accessible, actionable, and AI-ready.

How will you make an impact?
- Integrate CMS with AWS Knowledge Hub to allow seamless RAG-based search across diverse data types, eliminating the need to copy data into Knowledge Hub instances.
- Extend Knowledge Hub capabilities to ingest and index non-knowledge assets, including structured data, documents, tickets, logs, and other enterprise sources.
- Build secure, scalable connectors to read directly from customer-maintained indices and data repositories.
- Enable self-service capabilities for customers to manage content sources using AppFlow or Tray.ai, configure ingestion rules, and set up search parameters independently.
- Collaborate with the NLP/AI team to optimize relevance and performance for RAG search pipelines.
- Work closely with product and UX teams to design intuitive, powerful experiences around self-service data onboarding and search configuration.
- Implement data governance, access control, and observability features to ensure enterprise readiness.

Have you got what it takes?
- Proven experience with search infrastructure, RAG pipelines, and LLM-based applications.
- 2+ years of hands-on experience with AWS Knowledge Hub, AppFlow, Tray.ai, or equivalent cloud-based indexing/search platforms.
- Strong backend development skills (Python, TypeScript/Node.js, .NET/Java) and familiarity with building and consuming REST APIs.
- Knowledge of Infrastructure as Code (IaC) services such as AWS CloudFormation and CDK.
- Deep understanding of data ingestion pipelines, index management, and search query optimization.
- Experience working with unstructured and semi-structured data in real-world enterprise settings.
- Ability to design for scale, security, and multi-tenant environments.

What's in it for you? Enjoy NICE-FLEX!

Reporting into: Tech Manager, Engineering, CX
Role Type: Individual Contributor

About NICE
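A vendor-neutral sketch of the retrieval step in a RAG pipeline like the one this role describes; the toy embedding function and documents are assumptions standing in for a real embedding model and index (no AWS Knowledge Hub API is used).

```python
# Minimal RAG retrieval: embed documents and a query, rank by cosine similarity,
# and build a grounded prompt from the top matches.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder embedding: hash characters into a fixed-size unit vector.
    vec = np.zeros(64)
    for i, ch in enumerate(text.lower()):
        vec[i % 64] += ord(ch)
    return vec / (np.linalg.norm(vec) + 1e-9)

documents = [
    "Refunds are processed within 5 business days.",
    "Connectors can read directly from customer-maintained indices.",
    "Access control is enforced per tenant.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = doc_vectors @ embed(query)          # cosine similarity (unit-norm vectors)
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

context = retrieve("How are refunds handled?")
prompt = "Answer using only this context:\n" + "\n".join(context)
print(prompt)
```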

Posted 1 month ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Pune

Work from Office

Naukri logo

So, what's the role all about?
We are looking for a highly driven and technically skilled Software Engineer to lead the integration of various Content Management Systems with AWS Knowledge Hub, enabling advanced Retrieval-Augmented Generation (RAG) search across heterogeneous customer data without requiring data duplication. This role will also be responsible for expanding the scope of Knowledge Hub to support non-traditional knowledge items and enhance customer self-service capabilities. You will work at the intersection of AI, search infrastructure, and developer experience to make enterprise knowledge instantly accessible, actionable, and AI-ready.

How will you make an impact?
- Integrate CMS with AWS Knowledge Hub to allow seamless RAG-based search across diverse data types, eliminating the need to copy data into Knowledge Hub instances.
- Extend Knowledge Hub capabilities to ingest and index non-knowledge assets, including structured data, documents, tickets, logs, and other enterprise sources.
- Build secure, scalable connectors to read directly from customer-maintained indices and data repositories.
- Enable self-service capabilities for customers to manage content sources using AppFlow or Tray.ai, configure ingestion rules, and set up search parameters independently.
- Collaborate with the NLP/AI team to optimize relevance and performance for RAG search pipelines.
- Work closely with product and UX teams to design intuitive, powerful experiences around self-service data onboarding and search configuration.
- Implement data governance, access control, and observability features to ensure enterprise readiness.

Have you got what it takes?
- Proven experience with search infrastructure, RAG pipelines, and LLM-based applications.
- 8+ years of hands-on experience with AWS Knowledge Hub, AppFlow, Tray.ai, or equivalent cloud-based indexing/search platforms.
- Strong backend development skills (Python, TypeScript/Node.js, .NET/Java) and familiarity with building and consuming REST APIs.
- Knowledge of Infrastructure as Code (IaC) services such as AWS CloudFormation and CDK.
- Deep understanding of data ingestion pipelines, index management, and search query optimization.
- Experience working with unstructured and semi-structured data in real-world enterprise settings.
- Ability to design for scale, security, and multi-tenant environments.

What's in it for you? Enjoy NICE-FLEX!

Reporting into: Tech Manager, Engineering, CX
Role Type: Individual Contributor

About NICE

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Pune

Work from Office

Naukri logo

So, what's the role all about?
We are looking for a highly driven and technically skilled Software Engineer to lead the integration of various Content Management Systems with AWS Knowledge Hub, enabling advanced Retrieval-Augmented Generation (RAG) search across heterogeneous customer data without requiring data duplication. This role will also be responsible for expanding the scope of Knowledge Hub to support non-traditional knowledge items and enhance customer self-service capabilities. You will work at the intersection of AI, search infrastructure, and developer experience to make enterprise knowledge instantly accessible, actionable, and AI-ready.

How will you make an impact?
- Integrate CMS with AWS Knowledge Hub to allow seamless RAG-based search across diverse data types, eliminating the need to copy data into Knowledge Hub instances.
- Extend Knowledge Hub capabilities to ingest and index non-knowledge assets, including structured data, documents, tickets, logs, and other enterprise sources.
- Build secure, scalable connectors to read directly from customer-maintained indices and data repositories.
- Enable self-service capabilities for customers to manage content sources using AppFlow or Tray.ai, configure ingestion rules, and set up search parameters independently.
- Collaborate with the NLP/AI team to optimize relevance and performance for RAG search pipelines.
- Work closely with product and UX teams to design intuitive, powerful experiences around self-service data onboarding and search configuration.
- Implement data governance, access control, and observability features to ensure enterprise readiness.

Have you got what it takes?
- Proven experience with search infrastructure, RAG pipelines, and LLM-based applications.
- 5+ years of hands-on experience with AWS Knowledge Hub, AppFlow, Tray.ai, or equivalent cloud-based indexing/search platforms.
- Strong backend development skills (Python, TypeScript/Node.js, .NET/Java) and familiarity with building and consuming REST APIs.
- Knowledge of Infrastructure as Code (IaC) services such as AWS CloudFormation and CDK.
- Deep understanding of data ingestion pipelines, index management, and search query optimization.
- Experience working with unstructured and semi-structured data in real-world enterprise settings.
- Ability to design for scale, security, and multi-tenant environments.

What's in it for you? Enjoy NICE-FLEX!

Reporting into: Tech Manager, Engineering, CX
Role Type: Individual Contributor

About NICE

Posted 1 month ago

Apply

25.0 - 30.0 years

50 - 100 Lacs

Pune

Work from Office

Naukri logo

mLogica is seeking a visionary and strategic Chief Data Officer (CDO) to lead data management and governance initiatives, particularly focusing on the critical domains of Property Registration, Marriage Registration, and related governmental processes. The CDO will be responsible for developing and executing a comprehensive data strategy that optimizes data acquisition, analysis, governance, and management across both digital and offline platforms. The overall goal is to utilize data to enhance customer experience, promote efficiency, and increase revenue through the monetization of data. This role requires a strong leader with exceptional expertise in data management, governance, and technology, coupled with an understanding of the government land, urban planning, or real estate sectors.

Responsibilities

Strategic Data Leadership:
- Define and implement a comprehensive data management strategy, framework, and methodologies for acquiring, analyzing, governing, and managing data related to Property Registration, Marriage Registration, and other relevant government processes in both digital and offline environments.
- Identify and evaluate global technology trends in data acquisition, management, and distribution, providing insightful recommendations to the steering/leadership committee.
- Reduce costs and eliminate redundancies stemming from disparate data and technology programs across various departments.
- Improve customer experience and promote efficiencies through the use of data.
- Leverage the organization's data to build data monetization programs.

Data Governance and Compliance:
- Consult with process/data owners and data stewards to ensure the effective implementation of data policies and frameworks.
- Analyze and address existing data issues and challenges, including generating insights from all available data sources, including scanned documents, and provide clear guidance for resolution.
- Collaborate closely with the Compliance Officer to understand and address any non-compliance with data policies and regulations.

Stakeholder Management and Leadership:
- Exercise strong leadership capabilities, effectively managing stakeholders and demonstrating the ability to defend strategic data management and governance decisions.
- Foster a data-driven culture and promote data literacy across the organization.

Project Management and Execution:
- Strategize and implement complex data management and governance projects, ensuring successful delivery and alignment with organizational goals.

Qualifications:
- B.E./B.Tech. in Computer Science/Information Technology/Electronics and Communications, with an MBA preferred.
- 15+ years of extensive experience in data management and governance roles.
- Understanding of modern technologies and techniques around data acquisition, processing, management, delivery, and utilization.
- Domain knowledge of Government Land Departments, Urban Planning Departments, or the Real Estate Industry.
- Ability to operate independently and with a lean team.
- Detail-oriented and hands-on leadership style a must.
- Proven track record of strategizing and executing successful data management and governance projects.
- Strong understanding of data management frameworks such as DAMA, DCAM, or equivalent certifications preferred.
- Demonstrated ability to analyze complex data issues and provide effective solutions.
- Previous experience in data monetization preferred.
- Experience with Big Data and AI/ML technologies a plus.
- Excellent communication, presentation, and stakeholder management skills.
- Ability to understand and solve problems associated with scanned documents and extracting data from them.

Posted 1 month ago

Apply

25.0 - 30.0 years

50 - 100 Lacs

Mumbai

Work from Office

Naukri logo

mLogica is seeking a visionary and strategic Chief Data Officer (CDO) to lead data management and governance initiatives, particularly focusing on the critical domains of Property Registration, Marriage Registration, and related governmental processes. The CDO will be responsible for developing and executing a comprehensive data strategy that optimizes data acquisition, analysis, governance, and management across both digital and offline platforms. The overall goal is to utilize data to enhance customer experience, promote efficiency, and increase revenue through the monetization of data. This role requires a strong leader with exceptional expertise in data management, governance, and technology, coupled with an understanding of the government land, urban planning, or real estate sectors.

Responsibilities

Strategic Data Leadership:
- Define and implement a comprehensive data management strategy, framework, and methodologies for acquiring, analyzing, governing, and managing data related to Property Registration, Marriage Registration, and other relevant government processes in both digital and offline environments.
- Identify and evaluate global technology trends in data acquisition, management, and distribution, providing insightful recommendations to the steering/leadership committee.
- Reduce costs and eliminate redundancies stemming from disparate data and technology programs across various departments.
- Improve customer experience and promote efficiencies through the use of data.
- Leverage the organization's data to build data monetization programs.

Data Governance and Compliance:
- Consult with process/data owners and data stewards to ensure the effective implementation of data policies and frameworks.
- Analyze and address existing data issues and challenges, including generating insights from all available data sources, including scanned documents, and provide clear guidance for resolution.
- Collaborate closely with the Compliance Officer to understand and address any non-compliance with data policies and regulations.

Stakeholder Management and Leadership:
- Exercise strong leadership capabilities, effectively managing stakeholders and demonstrating the ability to defend strategic data management and governance decisions.
- Foster a data-driven culture and promote data literacy across the organization.

Project Management and Execution:
- Strategize and implement complex data management and governance projects, ensuring successful delivery and alignment with organizational goals.

Qualifications:
- B.E./B.Tech. in Computer Science/Information Technology/Electronics and Communications, with an MBA preferred.
- 15+ years of extensive experience in data management and governance roles.
- Understanding of modern technologies and techniques around data acquisition, processing, management, delivery, and utilization.
- Domain knowledge of Government Land Departments, Urban Planning Departments, or the Real Estate Industry.
- Ability to operate independently and with a lean team.
- Detail-oriented and hands-on leadership style a must.
- Proven track record of strategizing and executing successful data management and governance projects.
- Strong understanding of data management frameworks such as DAMA, DCAM, or equivalent certifications preferred.
- Demonstrated ability to analyze complex data issues and provide effective solutions.
- Previous experience in data monetization preferred.
- Experience with Big Data and AI/ML technologies a plus.
- Excellent communication, presentation, and stakeholder management skills.
- Ability to understand and solve problems associated with scanned documents and extracting data from them.

Posted 1 month ago

Apply

6.0 - 8.0 years

8 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Naukri logo

Job Opening: Senior Data Engineer (Remote, Contract – 6 Months)
Remote | Contract Duration: 6 Months | Experience: 6-8 Years

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

Key Responsibilities
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

Must-Have Skills
- Experience: 6+ years in data engineering
- Tech stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
- Core expertise: data warehousing, ETL, data pipelines, data modelling, data governance
- Agile, SDLC, containerization (Docker), clean coding practices

Good-to-Have Skills
- Event Hubs, Logic Apps
- Power BI
- Strong logic building and competitive programming background

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 1 month ago

Apply

7.0 - 10.0 years

2 - 6 Lacs

Pune

Work from Office

Naukri logo

Responsibilities : - Design, develop, and deploy data pipelines using Databricks, including data ingestion, transformation, and loading (ETL) processes. - Develop and maintain high-quality, scalable, and maintainable Databricks notebooks using Python. - Work with Delta Lake and other advanced features. - Leverage Unity Catalog for data governance, access control, and data discovery. - Develop and optimize data pipelines for performance and cost-effectiveness. - Integrate with various data sources, including but not limited to databases and cloud storage (Azure Blob Storage, ADLS, Synapse), and APIs. - Experience working with Parquet files for data storage and processing. - Experience with data integration from Azure Data Factory, Azure Data Lake, and other relevant Azure services. - Perform data quality checks and validation to ensure data accuracy and integrity. - Troubleshoot and resolve data pipeline issues effectively. - Collaborate with data analysts, business analysts, and business stakeholders to understand their data needs and translate them into technical solutions. - Participate in code reviews and contribute to best practices within the team.
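A minimal sketch of an incremental Delta Lake upsert of the kind these responsibilities describe; the table paths and join key are assumptions, and a Databricks runtime (or delta-spark) is assumed to be available.

```python
# Minimal incremental upsert: merge a new batch of records into an existing Delta table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet("/mnt/landing/customers/")        # hypothetical new batch
target = DeltaTable.forPath(spark, "/mnt/curated/customers/")  # existing Delta table

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()      # update existing customers
    .whenNotMatchedInsertAll()   # insert new customers
    .execute()
)
```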

Posted 1 month ago

Apply

10.0 - 15.0 years

20 - 25 Lacs

Pune

Work from Office

Naukri logo

As an Engineering Data Manager, your days will be varied and engaging in our global Engineering Planning and Reporting team. A typical day might include a standup meeting with your team, connecting with colleagues across the organization to understand requirements, focus time developing new Power BI reports, developing an endpoint to exchange data between systems, or improving our reporting service. Yours will be a relevant role in a team growing to accommodate the digitalization demands of Compression Engineering and Product Management.

How You'll Make an Impact
- Design, develop, and maintain data infrastructure for a global Engineering organization.
- Collaborate with data and process owners to optimize data extraction, transformation, and analysis, including enhancing services and creating endpoints for data exchange where required.
- Enhance the Engineering data service with automated routines to fetch and store data for reporting.
- Support project execution teams in manual data transportation between Engineering systems while meeting deadlines.
- Collaborate with stakeholders to capture requirements and translate them into effective data visualizations in Power BI.
- Implement data governance practices, including data access controls, role-level security, and data lineage documentation.
- Maintain documentation for reports, data sources, and data transformation processes.
- Conduct user testing and gather feedback to continuously improve the user experience.

What You Bring
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- 10+ years of experience with business intelligence, data modeling, data warehousing, and ETL processes.
- Demonstrable experience as a Power BI Developer, including DAX and Power Query.
- Solid understanding of UI/UX principles and standard methodologies.
- Excellent problem-solving skills, attention to detail, and effective communication and collaboration skills.
- Preferred: familiarity with Agile development methodologies and software (e.g., Jira) and Engineering processes and systems (e.g., SAP, Teamcenter).

Rewards/Benefits
- Employees are eligible for remote working arrangements up to 2 days per week.
- All employees are automatically covered under the medical insurance, with a company-paid considerable family floater cover for the employee, spouse, and 2 dependent children up to 25 years of age.
- Siemens Energy provides an option to opt for a Meal Card, as per the terms and conditions prescribed in the company policy, as a part of CTC and a tax-saving measure.
- Flexi Pay empowers employees with the choice to customize the amount in some of the salary components within a defined range, thereby optimizing the tax benefits. Accordingly, each employee is empowered to decide on the best possible net income out of the same fixed individual base pay on a monthly basis.
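A minimal FastAPI sketch of a data-exchange endpoint like the one mentioned in the typical-day description above; the payload model and in-memory store are assumptions, and a real service would persist to the engineering data store.

```python
# Minimal data-exchange API: accept engineering records and list what has been stored.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
_records: list[dict] = []   # stand-in for a real database

class EngineeringRecord(BaseModel):
    item_id: str
    system: str   # e.g. "Teamcenter" or "SAP" (hypothetical source tags)
    status: str

@app.post("/records")
def upsert_record(record: EngineeringRecord) -> dict:
    _records.append(record.model_dump())   # Pydantic v2 assumed
    return {"stored": len(_records)}

@app.get("/records")
def list_records() -> list[dict]:
    return _records
```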

Posted 1 month ago

Apply

9.0 - 13.0 years

45 - 50 Lacs

Pune

Work from Office

Naukri logo

The Mastercard Data Governance & Regulation team has an exciting opportunity for a Manager of Software Engineering to enhance and modernize our services. This position will be key to growing a global technology platform, operating at scale, requiring focus on performance, security, and reliability.
- Do you want to positively influence the experience of millions of customers?
- Do you like to get involved in the creation and execution of strategic initiatives centered around digital payments?
- Do you look forward to developing and engaging with high-performing, diverse teams around the globe?
- Do you like to own and be accountable for highly visible, strategically important teams?

Role:
- Managing multiple scrum teams of software developers and testers to develop quality software solutions in a timely and cost-effective manner.
- Successfully lead definition, development, and delivery of major cross-department initiatives with broad scope and long-term business implications.
- Provide technical leadership and direction to software development teams in the development of Java, microservices, REST APIs, and event-based applications and platforms.
- Work closely with product and architecture teams on product definition, technical design, and overall execution for the team.
- Ensure the project or effort is adequately staffed, trained, and managed. Ensure personnel have appropriate skills and behaviors, and effectively communicate performance results as necessary, managing each effort within approved manpower and budget guidelines.
- Automate and simplify all aspects of software delivery and development by actively evangelizing the need to automate and simplify where needed.
- Own complex problems having dependencies across services and facilitate cross-functional team interactions to drive resolution.
- Define, design, and develop procedures and solutions at a service level to meet the business requirements/enhancements.
- Drive prioritization decisions and trade-offs in working with product partners.
- Drive a blameless postmortems culture to identify root causes of incidents and implement learnings.
- Lead by example with hands-on approaches to demonstrate engineering excellence.

All About You:
- Overall career experience of 9-13 years in Technology / Java Development.
- Experience in team management required; managed a team of 4-5 members or more.
- Hands-on experience in designing solutions and full stack development in modern technologies for large enterprise technology platforms and systems.
- Strong knowledge of software development principles, design patterns, and best practices.
- Specific expertise in Java, Spring Boot, microservices, REST APIs, Kafka, Oracle, test automation and its frameworks, and SQL and NoSQL databases.
- Experience designing and implementing solutions focusing on the non-functional concerns: performance, security, scalability, availability, extensibility, resiliency.
- Operate with urgency, fairness, and decency to address challenges and solve for new opportunities.
- Experience collaborating with cross-functional teams, including product management, BizOps, TechOps, and customer experience.
- Strong communication skills, both verbal and written, with strong relationship-building, collaborative, and organizational skills.
- Experienced in Agile methodologies of software development and SDLC practices.
- Able to interact with product and business stakeholders independently.
- Have strong decision-making skills; lead retrospection and continually improve as a result.
- Enthusiastic, ambitious, and confident.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

The Data Engineer will serve as a technical expert in designing and developing AI data pipelines to manage both large unstructured and structured datasets, with a focus on building data pipelines for enterprise AI solutions.

Job Description
- Work closely with data scientists and domain experts to design and develop AI data pipelines using an agile development process.
- Develop pipelines for ingesting and processing large unstructured and structured datasets from a variety of sources, with a specific emphasis on creating solutions for AI use cases to ensure efficient and effective data processing.
- Work efficiently with structured and unstructured data sources.
- Work with cloud technologies such as AWS to design and implement scalable data architectures.
- Support the operation of the data pipelines, which involves troubleshooting and bug fixing, as well as implementing change requests to ensure that the data pipelines continue to meet user requirements.

Your Profile
- Master's or Bachelor's degree in Computer Science/Mathematics/Statistics or equivalent.
- Minimum of 3 years of relevant work experience in data engineering.
- Extensive hands-on experience in conceptualizing, designing, and implementing data pipelines.
- Proficiency in handling structured data, unstructured data formats (e.g., PPT, PDF, Docx), and databases (RDBMS, Oracle/PL SQL, MySQL, NoSQL such as Elasticsearch, MongoDB, Neo4j, Ceph), and familiarity with big data platforms (HDFS, Spark, Impala).
- Experience in working with AWS technologies focusing on building scalable data pipelines.
- Strong background in software engineering development cycles (CI/CD) with proficiency in scripting languages, particularly Python.
- Good understanding of and experience with the Kubernetes / OpenShift platform.
- Front-end reporting dashboards and data exploration tools: Tableau.
- Good understanding of data management, data governance, and data security practices.
- Highly motivated, structured, and methodical with a high degree of self-initiative.
- Team player with good cross-cultural skills to work in an international team.
- Customer- and results-oriented.
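A minimal sketch of ingesting and querying unstructured documents in Elasticsearch, one of the stores named above; the host, index name, and sample documents are assumptions.

```python
# Minimal ingestion plus full-text query against an Elasticsearch index.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")   # hypothetical cluster endpoint

docs = [
    {"title": "Maintenance report Q1", "body": "Pump vibration exceeded the threshold..."},
    {"title": "Design review notes", "body": "Agreed to revise the impeller specification..."},
]

# Index each document with a stable id
for i, doc in enumerate(docs):
    es.index(index="engineering-docs", id=str(i), document=doc)

# Simple full-text search over the ingested documents
hits = es.search(index="engineering-docs", query={"match": {"body": "vibration"}})
for hit in hits["hits"]["hits"]:
    print(hit["_source"]["title"], hit["_score"])
```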

Posted 1 month ago

Apply

13.0 - 15.0 years

20 - 27 Lacs

Bengaluru

Work from Office

Naukri logo

We are seeking a highly skilled and experienced Big Data Platform Architect to join our team. The ideal candidate will have a proven track record in building large shared platforms and hybrid cloud platforms. This role requires a visionary leader who can develop the vision and roadmap for our Data Platform, collaborate with business and product teams to establish and drive consensus, and deliver as per the roadmap.

Key Responsibilities:
- Develop and implement the vision and roadmap for the Data Platform, ensuring alignment with the company's strategic goals.
- Design and build large shared platforms and hybrid cloud platforms that are scalable, reliable, and secure.
- Collaborate with business and product teams to establish and drive consensus on platform strategies and initiatives.
- Lead the architecture and design of data solutions, ensuring they meet the needs of the business and support the company's strategic goals.
- Provide technical leadership and guidance to development teams, ensuring best practices are followed.
- Stay up to date with the latest industry trends and technologies to ensure the Data Platform remains cutting-edge.
- Oversee the implementation of data governance and security best practices.
- Foster a culture of innovation and continuous improvement within the team.
- Ensure the Data Platform supports the company's data-driven decision-making processes.

Basic Qualifications:
- 13+ years of relevant work experience with a Bachelor's degree or an advanced degree (e.g., Master's, MBA, JD, MD), OR 15+ years of relevant work experience.

Preferred Qualifications:
- Proven experience in building large shared platforms and hybrid cloud platforms.
- Strong expertise in Big Data technologies and architectures.
- Technologist with excellent leadership and communication skills.
- Ability to collaborate effectively with cross-functional teams.
- Strong problem-solving and analytical skills.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Experience in a similar role within a large organization.
- Knowledge of data governance and security best practices.
- Familiarity with machine learning and AI technologies.
- Experience with data-driven decision-making processes.
- Strong understanding of data privacy regulations and compliance requirements.

Posted 1 month ago

Apply

3.0 - 7.0 years

8 - 12 Lacs

Gurugram

Work from Office

Naukri logo

I. Job Summary This senior level analyst position is responsible for the configuration and support of software application systems within the People Organization. As part of the HR Technology team, this role provides complex analytical and consultative support delivering HR processes. Generally, provides technical input for Digital/vendor support. II. Essential Duties and Responsibilities To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. Other duties may be assigned. Reviews open cases/issues and troubleshoots or assigns to other tech team members for resolution. Coordinates with vendor regarding required support by opening tickets, escalating, and following up as required. Continuously develops advanced knowledge of assigned application(s) utilizing vendor websites, user groups and training to effectively utilize system capabilities. Mentors other team members and is recognized as an expert. Consults on requested configuration changes, partnering with the team to determine best options for design decisions based on documented requirements, current configuration, and downstream impacts. Documents final specs and configures application. Analyzes impact of configuration from foundational structures to Fast Formulas and extensions to HR service delivery and downstream systems and integrations. Leads meetings to resolve priority issues. Ensures data integrity and governance by supporting large/complex data imports and extracts and validating accuracy through reporting and queries. Supports the development and maintenance of integrations/file transfers. Provides analysis and consultation on the implementation of new software application products or new modules in existing applications . Provides experienced support for integrations, reports, and large data imports/extractions. Plans for and supports migrations, releases, upgrades and/or patches - mitigating risk/downstream impacts. Developing knowledge and expertise on extensions and Fast Formulas. Engages Digital and vendor for support as necessary. Executes unit, integration and acceptance testing. May support functional team with required screen shots and system steps for testing and change management. May be responsible for configuring and delivering moderate to complex reports and queries utilizing delivered software. Follows established data governance. Documents all configuration. III. Supervisory Responsibilities No formal supervisory responsibilities in this position. Provides informal assistance such as technical guidance and/or training to coworkers. May lead project teams and/or plan and supervise assignments of lower level employees. IV. Qualifications The requirements listed below are representative of the qualifications necessary to perform the job. A. Education and Experience Education: bachelors Degree (accredited), or in lieu of degree, High School Diploma or GED (accredited) and four (4) years of relevant work experience. Experience: Eight (8) years of previous experience (in addition to education requirement). B. Certificates, Licenses, Registrations or Other Requirements None required. C. Other Knowledge, Skills or Abilities Required Database queries/extracts using calculations, formulas, and complex commands. Extensive experience evaluating requirements and specs for development, testing and deployment. Hands on configuration of application(s), evaluating impact and supporting releases, patches, upgrades and enhancements. V. 
Work Environment: Listed below are key points regarding environmental demands and work environment of the job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of the job. Required to use motor coordination with finger dexterity (such as keyboarding, machine operation, etc.) most of the work day; required to exert physical effort in handling objects less than 30 pounds rarely; required to be exposed to physical occupational risks (such as cuts, burns, exposure to toxic chemicals, etc.) rarely; required to be exposed to a physical environment which involves dirt, odors, noise, weather extremes, or similar elements rarely. Normal setting for this job is: office setting. Must be available to work standard business hours, as well as be available to work non-standard hours in case of emergency (natural disasters, power outages, etc.). May need to attend after-hours calls with the offshore team. Benefits: At Waste Management, each eligible employee receives a competitive total compensation package including Medical, Dental, Vision, Life Insurance, and Short Term Disability, as well as a Stock Purchase Plan, company match on 401K, and more! Our employees also receive Paid Vacation, Holidays, and Personal Days. Please note that benefits may vary by site.

Posted 1 month ago

Apply

9.0 - 13.0 years

20 - 25 Lacs

Pune

Work from Office

Naukri logo

Our Purpose
Title and Summary: Manager, Software Engineering (Java Full Stack)

Overview: The Mastercard Data Governance & Regulation team has an exciting opportunity for a Manager of Software Engineering to enhance and modernize our services. This position will be key to growing a global technology platform, operating at scale, requiring focus on performance, security, and reliability.
- Do you want to positively influence the experience of millions of customers?
- Do you like to get involved in the creation and execution of strategic initiatives centered around digital payments?
- Do you look forward to developing and engaging with high-performing, diverse teams around the globe?
- Do you like to own and be accountable for highly visible strategically important teams?

Role:
- Managing multiple scrum teams of software developers and testers to develop quality software solutions in a timely and cost-effective manner.
- Successfully lead definition, development, and delivery of major cross-department initiatives with broad scope and long-term business implications.
- Provide technical leadership and direction to software development teams in the development of Java, microservices, REST APIs, and event-based applications and platforms.
- Work closely with product and architecture teams on product definition, technical design, and overall execution for the team.
- Ensure the project or effort is adequately staffed, trained, and managed. Ensure personnel have appropriate skills and behaviors, and effectively communicate performance results as necessary, managing each effort within approved manpower and budget guidelines.
- Automate and simplify all aspects of software delivery and development by actively evangelizing the need to automate and simplify where needed.
- Own complex problems having dependencies across services and facilitate cross-functional team interactions to drive resolution.
- Define, design, and develop procedures and solutions at a service level to meet the business requirements/enhancements.
- Drive prioritization decisions and trade-offs in working with product partners.
- Drive a blameless postmortems culture to identify root causes of incidents and implement learnings.
- Lead by example with hands-on approaches to demonstrate engineering excellence.

All About You:
- Overall career experience of 9-13 years in Technology / Java Development.
- Experience in team management required; managed a team of 4-5 members or more.
- Hands-on experience in designing solutions and full stack development in modern technologies for large enterprise technology platforms and systems.
- Strong knowledge of software development principles, design patterns, and best practices.
- Specific expertise in Java, Spring Boot, microservices, REST APIs, Kafka, Oracle, test automation and its frameworks, and SQL and NoSQL databases.
- Experience designing and implementing solutions focusing on the non-functional concerns: performance, security, scalability, availability, extensibility, resiliency.
- Operate with urgency, fairness, and decency to address challenges and solve for new opportunities.
- Experience collaborating with cross-functional teams, including product management, BizOps, TechOps, and customer experience.
- Strong communication skills, both verbal and written, with strong relationship-building, collaborative, and organizational skills.
- Experienced in Agile methodologies of software development and SDLC practices.
- Able to interact with product and business stakeholders independently.
- Have strong decision-making skills; lead retrospection and continually improve as a result.
- Enthusiastic, ambitious, and confident.

Posted 1 month ago

Apply

5.0 - 6.0 years

11 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

Proven experience as a Snowflake Developer or in a similar data engineering role. Strong proficiency in Snowflake architecture, data modeling, and performance tuning. Expertise in SQL and experience working with large datasets and complex queries. Experience with ETL/ELT tools and frameworks (e.g., Talend, Apache NiFi, or custom solutions). Familiarity with cloud technologies, especially AWS and Azure, in a data engineering context. Understanding of data integration, data warehousing concepts, and BI tools. Experience with data governance, security, and privacy protocols.

EXPERIENCE: 4.5-6 Years
SKILLS
Primary Skill: Data Engineering
Sub Skill(s): Data Engineering
Additional Skill(s): Data Modeling, Data Warehouse, ETL, Data Architecture, Databricks, Snowflake, Azure Data Factory, Talend, SQL
Data Engineering
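A minimal sketch of querying Snowflake from Python, reflecting the kind of Snowflake development work described above; the account, credentials, warehouse, and table names are assumptions.

```python
# Minimal Snowflake query: connect, run an aggregation, and print the results.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",
    user="EXAMPLE_USER",
    password="***",            # in practice use key-pair auth or a secrets manager
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    cur.execute(
        "SELECT order_date, SUM(amount) FROM orders GROUP BY order_date ORDER BY order_date"
    )
    for order_date, total in cur.fetchall():
        print(order_date, total)
finally:
    conn.close()
```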

Posted 1 month ago

Apply

8.0 - 11.0 years

32 - 40 Lacs

Bengaluru

Work from Office

Naukri logo

Proven experience as a Snowflake Developer or in a similar data engineering role. Strong proficiency in Snowflake architecture, data modeling, and performance tuning. Expertise in SQL and experience working with large datasets and complex queries. Experience with ETL/ELT tools and frameworks (e.g., Talend, Apache NiFi, or custom solutions). Familiarity with cloud technologies, especially AWS and Azure, in a data engineering context. Understanding of data integration, data warehousing concepts, and BI tools. Experience with data governance, security, and privacy protocols.

EXPERIENCE: 8-11 Years
SKILLS
Primary Skill: Data Engineering
Sub Skill(s): Data Engineering
Additional Skill(s): Data Modeling, Data Warehouse, ETL, Data Architecture, Databricks, Snowflake, Azure Data Factory, Talend, SQL
Data Engineering

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Chennai

Work from Office

Naukri logo

Lead Generative AI / Machine Learning / Artificial Intelligence
Job Description
Role: Lead - Generative AI / Machine Learning / Artificial Intelligence
Experience: 5 - 10 years
Job Location: Chennai
About OJ Commerce
OJ Commerce (OJC), a rapidly expanding and profitable online retailer, is headquartered in Florida, USA, with a fully functional office in Chennai, India. We deliver exceptional value to our customers by harnessing cutting-edge technology, fostering innovation, and establishing strategic brand partnerships to enable a seamless, enjoyable shopping experience featuring high-quality products at unbeatable prices. Our advanced, data-driven system streamlines operations with minimal human intervention. Our extensive product portfolio encompasses over a million SKUs from more than 2,500 brands across eight primary categories. This role involves driving the marketing efforts for a product line with a compound annual growth rate (CAGR) of 70% to achieve the targeted revenue run rate of US$ 200 million in three years. As we continue to forge new partner relationships, our flagship website has rapidly emerged as a top-performing e-commerce channel, catering to millions of customers annually.
Job Summary: We are seeking a highly experienced and motivated Lead Engineer with deep expertise in Generative AI, Machine Learning, and Artificial Intelligence. The ideal candidate will drive the design, development, and deployment of AI/ML solutions, lead technical teams, and collaborate with cross-functional stakeholders to solve real-world problems using cutting-edge technologies.
Key Responsibilities:
Lead and mentor a team of AI/ML engineers and researchers.
Architect and deploy scalable Generative AI solutions (e.g., LLMs, diffusion models, transformers).
Design and implement end-to-end machine learning pipelines, including data preprocessing, model training, validation, and deployment.
Work with large-scale datasets to develop, fine-tune, and evaluate models.
Collaborate with product, design, and engineering teams to integrate AI capabilities into applications.
Stay current with the latest research, tools, and trends in AI/ML, and recommend adoption where applicable.
Conduct technical reviews and provide guidance on best practices in AI model development.
Ensure ethical, secure, and responsible AI development.
Must-Have Skills & Experience:
5-10 years of hands-on experience in AI/ML, including leadership responsibilities.
Strong understanding of deep learning, NLP, computer vision, or generative models.
Proven experience with Generative AI frameworks (e.g., OpenAI, Hugging Face, LangChain, Diffusers).
Proficient in ML libraries and tools such as TensorFlow, PyTorch, scikit-learn, Keras, etc.
Solid programming skills in Python; experience with other languages like Java, C++, or Go is a plus.
Familiarity with cloud platforms (AWS, GCP, Azure) and MLOps practices.
Experience deploying models in production environments (APIs, microservices, containers).
Strong problem-solving, communication, and analytical skills.
Nice-to-Have:
Research contributions or publications in AI/ML conferences or journals.
Experience working with multi-modal models (text + image + audio).
Knowledge of Reinforcement Learning, Prompt Engineering, or AutoML.
Exposure to data governance, model explainability, and AI ethics.
What We Offer: Competitive salary, medical benefits/accident cover, flexible office working hours, and a fast-paced start-up environment.
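The posting notes Java as a plus, so here is a minimal Java sketch of how generative AI capabilities are often integrated into an application: calling an OpenAI-style chat completions HTTP endpoint with the standard java.net.http client. The endpoint URL, model name, and prompt are assumptions for illustration; production code would parse the JSON response with a proper library and handle errors.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ProductCopyGenerator {
    public static void main(String[] args) throws Exception {
        // Assumed OpenAI-style chat completions endpoint and model name; both are illustrative.
        String endpoint = "https://api.openai.com/v1/chat/completions";
        String body = """
                {
                  "model": "gpt-4o-mini",
                  "messages": [
                    {"role": "user",
                     "content": "Write a two-sentence product description for a standing desk."}
                  ]
                }
                """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(endpoint))
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer " + System.getenv("OPENAI_API_KEY"))
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        // Print the raw JSON response; a real service would map it to typed objects.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```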

Posted 1 month ago

Apply

0.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

FERMÀT enables eCommerce brands to transform clicks into conversions with highly personalized, 1:1 dynamic shopping experiences. We've raised $30M+ to date and are backed by Bain Capital Ventures, Greylock, QED, and other top angels and commerce investors. Located in SF, Austin, NYC, and Bangalore, we're looking to expand our 70+ person team to build the future of eCommerce! After announcing our $17M Series A, FERMÀT is one of the fastest growing companies at this stage in the US.
FERMÀT is the leading AI-native funnel management platform built for e-commerce marketers. We empower brands to create and manage delightful customer experiences across multiple channels in minutes. Our platform helps businesses transform their digital presence through intelligent, data-driven funnel creation that strengthens customer acquisition and drives measurable results. With FERMÀT, e-commerce teams can rapidly build, test, and iterate on their customer journey while maintaining brand consistency across every touchpoint.
About the Role: As a Senior Software Engineer on our Data Platform team, you'll have a transformative impact on FERMÀT's ability to deliver powerful data insights that drive business decisions. You'll architect, build, and scale our data infrastructure at a critical inflection point in our growth journey, as we expand our agency accelerator program and onboard larger enterprise clients. Your expertise will power both customer-facing analytics and internal reporting capabilities that form the backbone of our decision-making process. This role sits at the intersection of data engineering and business impact - you'll work closely with teams across the organization to understand their data needs and translate them into robust, scalable data pipelines and OLAP solutions. You'll have significant autonomy to shape our data architecture, implement best practices for data governance, and mentor other engineers as we build a world-class data platform. If you're energized by transforming complex data challenges into elegant solutions that drive real business outcomes, this is an exceptional opportunity to leave your mark on a rapidly growing company.
Responsibilities:
Own the data platform at FERMÀT, powering key customer-facing and internal dashboards
Build and maintain the OLAP stack and data pipelines that support reporting products
Take responsibility for data QA, testing, and debugging, answering critical questions
Influence company-wide architecture and technology decisions, setting trends
Tackle challenging technical problems and expand your engineering skillset
Lead mission-critical projects, delivering end-to-end data ingestion capabilities with high quality
Collaborate closely with Sales, Product Management, and Operations teams
Handle customer escalations and resolve data-related issues
Mentor and guide other engineers on the team, fostering growth and development
Review and provide feedback on technical specifications
Requirements:
Proficient experience building software products, with a focus on data platforms
Proven experience architecting and developing robust data platforms
Expertise in dbt and writing complex data transformations in SQL and other programming languages
Strong knowledge of data warehousing concepts, including building custom ETL integrations, snapshots, indexing, and partitioning
Excellent cross-functional collaborator, able to explain technical concepts to non-technical partners
Startup experience (with companies
Strong written and verbal communication skills, with the ability to discuss and debate strategic engineering and product decisions
Track record of building scalable, error-tolerant, and easily debuggable products
Confident in making informed technology choices and advocating for the right tools for the job
Driven to deliver secure, well-tested, and high-performing features and improvements
Thrive in a product-focused startup environment, passionate about enhancing customer experience
Knowledge of Data Security and Governance highly desired
Tech stack: Golang, Fivetran, BigQuery, Next.js, React, TypeScript, Postgres, Google Cloud, GraphQL on Hasura
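As a small illustration of the BigQuery-backed reporting work described above, the sketch below runs an aggregate query with the Google Cloud BigQuery Java client, assuming the google-cloud-bigquery library and application-default credentials are set up; the project, dataset, table, and column names are hypothetical.

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

public class FunnelSessionsReport {
    public static void main(String[] args) throws InterruptedException {
        // Uses application-default credentials and the default GCP project.
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        // Hypothetical project, dataset, and table names.
        String sql = "SELECT event_date, COUNT(*) AS sessions "
                   + "FROM `my_project.analytics.funnel_events` "
                   + "GROUP BY event_date "
                   + "ORDER BY event_date";

        QueryJobConfiguration config = QueryJobConfiguration.newBuilder(sql).build();
        TableResult result = bigquery.query(config);

        // Print one row per day with its session count.
        result.iterateAll().forEach(row ->
                System.out.println(row.get("event_date").getStringValue()
                        + " -> " + row.get("sessions").getLongValue()));
    }
}
```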

Posted 1 month ago

Apply

0.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Naukri logo

Should have worked on implementation projects involving the SAP MDG solution for any of the following: Customer, Vendor, Material, Financial masters, etc.
Must have experience in SAP MDG projects, performing MDG configurations and customizations in areas such as Data Modeling, UI Modeling, Process Modeling, Data Replication Framework, Key Mapping, Validations and Derivations, and BRF+.
Experience in ABAP object-oriented programming, Master Data Governance (MDG), ABAP Floorplan Manager (FPM), ABAP Web Dynpro, ABAP Workflow, standard data model enhancement, and custom data model creation.
Experience in MDG UI configuration, FPM UI enhancement, context-based adaptation, customizing, and configuration, along with knowledge of Web Dynpro and Floorplan Manager.
Should have experience in process modeling configuration (Business Activity, Change Request type, Workflow, Rule-Based Workflow with BRF+).
Ability to work on migration of existing master data into the MDG hub, and on validations and derivations using BAdIs and BRF+.
Should have worked on the Data Replication Framework (DRF), data import using DIF, SOA services, ALE configuration, and key value mapping; knowledge of integrating MDG with SAP Information Steward and SAP Data Services for data remediation and data validation, respectively.
Good knowledge of BOL and GenIL programming.
Fair understanding of the data migration process for any master object such as Material, Customer, Vendor, and Finance.
Worked on SAP MDG custom objects to build solutions for client requirements.
Experience in supporting the UAT phase and go-live period of an MDG implementation project.
Establish clear expectations regarding status reporting, team communication, and deliverable management.
Ability to work independently and collaboratively.
Dedicated and committed to project goals, with a positive attitude towards work, team spirit, and a proactive, problem-solving orientation.

Posted 1 month ago

Apply