
2325 Data Quality Jobs - Page 7

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in developing and maintaining the data platform components, contributing to the overall success of the project.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data platform components.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Coimbatore

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: Python, PySpark, Apache Spark
Minimum Experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements, using the Databricks Unified Data Analytics Platform to develop efficient and effective applications. Your typical day will involve collaborating with the team, analyzing business requirements, designing application solutions, and configuring applications to meet the needs of the organization. You will also troubleshoot and resolve any application issues that arise, ensuring the smooth functioning of the applications.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Design and build applications using the Databricks Unified Data Analytics Platform.
- Configure applications to meet business process and application requirements.
- Analyze business requirements and translate them into application solutions.
- Troubleshoot and resolve application issues.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Good to have: experience with Python, PySpark, and Apache Spark.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Kolkata

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Google BigQuery
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Roles & Responsibilities:
1. Design, create, code, and support a variety of data pipelines and models on GCP cloud technology.
2. Strong hands-on exposure to GCP services such as BigQuery and Composer.
3. Partner with business/data analysts, architects, and other key project stakeholders to deliver data requirements.
4. Develop data integration and ETL (extract, transform, load) processes.
5. Support existing data warehouses and related pipelines.
6. Ensure data quality, security, and compliance.
7. Optimize data processing and storage efficiency; troubleshoot issues in the data space.
8. Seek to learn new skills and tools used in the data space (e.g., dbt, Monte Carlo).
9. Excellent communication skills, verbal and written, and excellent analytical skills with an Agile mindset.
10. Strong attention to detail and delivery accuracy.
11. Self-motivated team player with the ability to overcome challenges and achieve desired results.
12. Work effectively in a globally distributed environment.
13. Should be ready to work in shift B, i.e., 12:30 pm to 10:30 pm.
14. Should be ready to work as an individual contributor.

Skill Proficiency Expectations:
- Expert: data storage, BigQuery, SQL, Composer, data warehousing concepts
- Intermediate: Python
- Basic/preferred: DB, Kafka, Pub/Sub

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions.
- This position is based at our Mumbai office.
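The ETL responsibilities listed above can be sketched as a minimal extract-transform-load pass in plain Python. This is illustrative only and not part of the posting; the CSV layout, field names, and quality rule are assumptions:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize values and drop rows that fail a quality check."""
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # data-quality rule: skip rows with unparseable amounts
        out.append({"customer": row["customer"].strip().lower(), "amount": amount})
    return out

def load(rows: list[dict], warehouse: dict) -> None:
    """Load: aggregate per customer into a dict standing in for a warehouse table."""
    for row in rows:
        warehouse[row["customer"]] = warehouse.get(row["customer"], 0.0) + row["amount"]

raw = "customer,amount\nAcme ,10.5\nacme,4.5\nBeta,oops\n"
warehouse: dict = {}
load(transform(extract(raw)), warehouse)
print(warehouse)  # {'acme': 15.0}
```

In a real pipeline of the kind described, the extract would read from Cloud Storage, the load would target BigQuery, and Composer would orchestrate the steps.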

Posted 2 days ago

Apply

7.0 - 12.0 years

18 - 22 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-Have Skills: GCP Dataflow
Good-to-Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: BTech

Summary: As a Data Platform Architect, you will architect the data platform blueprint and implement the design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your day will involve designing and implementing data platform components and ensuring seamless integration across systems and data models.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead data platform architecture design and implementation.
- Ensure seamless integration between data platform components.
- Provide guidance and support to Integration Architects and Data Architects.

Professional & Technical Skills:
- Must have: proficiency in GCP Dataflow.
- Strong understanding of cloud data architecture.
- Experience with data modeling and data integration.
- Hands-on experience with data platform implementation.
- Knowledge of data governance and security practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in GCP Dataflow.
- This position is based at our Bengaluru office.
- A BTech degree is required.

Posted 2 days ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Chennai

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: SAP Sales and Distribution (SD)
Good-to-Have Skills: NA
Minimum Experience: 12 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements, ensuring that applications are developed according to the specified requirements and standards. Your typical day will involve collaborating with the team, making team decisions, and engaging with multiple teams to contribute to key decisions. You will also provide solutions to problems that apply across multiple teams, showcasing your expertise and problem-solving skills.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems that apply across multiple teams.
- Ensure applications are developed according to the specified requirements and standards.
- Design and build applications that meet business process and application requirements.
- Configure applications to ensure they function properly.

Professional & Technical Skills:
- Must have: proficiency in SAP Sales and Distribution (SD).
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP Sales and Distribution (SD).
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Informatica MDM
Good-to-Have Skills: NA
Minimum Experience: 2 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that solutions align with business objectives. You will also test and troubleshoot to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Perform independently and grow into an SME.
- Participate actively in team discussions.
- Contribute to solutions for work-related problems.
- Assist in documenting application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must have: proficiency in Informatica MDM.
- Strong understanding of data integration and data quality processes.
- Experience with data modeling and database design.
- Familiarity with ETL processes and tools.
- Ability to troubleshoot and resolve application issues effectively.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Informatica MDM.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Chennai

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in developing and maintaining the data platform components, contributing to the overall success of the project.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data platform components.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Chennai office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Hyderabad

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in developing and maintaining the data platform components, contributing to the overall success of the project.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data platform components.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components, and collaborate with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your typical day will involve working on the data platform blueprint and design, collaborating with architects, and ensuring seamless integration between systems and data models.

Roles & Responsibilities:
- Perform independently and grow into an SME.
- Participate actively in team discussions.
- Contribute to solutions for work-related problems.
- Assist with the data platform blueprint and design.
- Collaborate with Integration Architects and Data Architects.
- Ensure cohesive integration between systems and data models.
- Implement data platform components.
- Troubleshoot and resolve data platform issues.

Professional & Technical Skills:
- Must have: proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Mumbai

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: Duck Creek Claims
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact.
- Manage the team and ensure successful project delivery.

Professional & Technical Skills:
- Must have: proficiency in Duck Creek Claims.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Duck Creek Claims.
- This position is based in Mumbai.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Ahmedabad

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: SAP for Retail Store Operations
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact.
- Manage the team and ensure successful project delivery.

Professional & Technical Skills:
- Must have: proficiency in SAP for Retail Store Operations.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP for Retail Store Operations.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-Have Skills: Microsoft Azure Data Services
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing; create data pipelines; ensure data quality; and implement ETL processes to migrate and deploy data across systems. You will be involved in the end-to-end data management process.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead data architecture design and implementation.
- Optimize data delivery and redesign infrastructure for greater scalability.
- Implement data security and privacy measures.
- Collaborate with data scientists and analysts to understand data needs.

Professional & Technical Skills:
- Must have: proficiency in Microsoft Azure Data Services.
- Good to have: experience with Azure Machine Learning.
- Strong understanding of cloud-based data solutions.
- Experience with data warehousing and data lakes.
- Knowledge of SQL and NoSQL databases.
- Hands-on experience with data integration tools.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Coimbatore

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: SAP Sales and Distribution (SD)
Good-to-Have Skills: SAP MM (Materials Management)
Minimum Experience: 7.5 years
Educational Qualification: minimum of 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact.
- Manage the team and ensure successful project delivery.

Professional & Technical Skills:
- Must have: proficiency in SAP Sales and Distribution (SD).
- Good to have: experience with SAP MM (Materials Management).
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP Sales and Distribution (SD).
- This position is based at our Noida office.
- A minimum of 15 years of full-time education is required.

Posted 2 days ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Mumbai

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: PeopleSoft PeopleTools
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact.
- Manage the team and ensure successful project delivery.

Professional & Technical Skills:
- Must have: proficiency in PeopleSoft PeopleTools.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 5 years of experience in PeopleSoft PeopleTools.
- This position is based in Mumbai.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

5.0 - 10.0 years

35 - 45 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid


Write specifications for Master Data Management (MDM) builds. Create requirements, including rules of survivorship, for migrating data to Markit EDM. Support the implementation of data governance and support testing. Develop data quality reports for the data warehouse.

Required Candidate Profile:
- 5+ years of experience documenting data management requirements
- Experience writing technical specifications for MDM builds
- Familiarity with enterprise data warehouses
- Knowledge of data governance
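A "rule of survivorship" of the kind this role would specify can be sketched in a few lines of Python. This is a generic illustration, not part of the posting; the field names and the "most recently updated value wins" rule are assumptions:

```python
from datetime import date

def survive(records: list[dict]) -> dict:
    """Merge duplicate master-data records into one golden record:
    for each field, keep the value from the most recently updated
    record that actually has one (a 'most recent wins' rule)."""
    golden: dict = {}
    for rec in sorted(records, key=lambda r: r["updated"]):  # oldest first
        for field, value in rec.items():
            if field != "updated" and value:
                golden[field] = value  # later records overwrite earlier ones
    return golden

dupes = [
    {"name": "ACME Corp", "phone": "", "updated": date(2023, 1, 1)},
    {"name": "Acme Corporation", "phone": "555-0100", "updated": date(2024, 6, 1)},
]
print(survive(dupes))  # {'name': 'Acme Corporation', 'phone': '555-0100'}
```

Real MDM platforms support richer rules (longest value, most trusted source, frequency), but a specification document typically pins down exactly this kind of per-field precedence.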

Posted 2 days ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Mumbai

Work from Office


Summary : We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company. Responsibilities : Ontology Development : - Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards. - Collaborate with domain experts to capture and formalize domain knowledge into ontological structures. - Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes. Data Modeling : - Design and implement semantic and syntactic data models that adhere to ontological principles. - Create data models that are scalable, flexible, and adaptable to changing business needs. - Integrate data models with existing data infrastructure and applications. Knowledge Graph Implementation : - Design and build knowledge graphs based on ontologies and data models. - Develop algorithms and tools for knowledge graph population, enrichment, and maintenance. - Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems. Data Quality And Governance : - Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs. - Define and implement data governance processes and standards for ontology development and maintenance. 
Collaboration And Communication :
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications :

Education :
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience :
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills :
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
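To make the knowledge-graph work in this listing concrete, here is a toy sketch of storing and querying subject-predicate-object triples in pure Python. The entities and predicates are invented; production work would use an RDF triple store and SPARQL, as the listing itself notes:

```python
from collections import defaultdict

class TripleStore:
    """A toy in-memory store of (subject, predicate, object) triples."""
    def __init__(self):
        self.triples = set()
        self.by_predicate = defaultdict(set)  # predicate -> {(subject, object)}

    def add(self, s, p, o):
        self.triples.add((s, p, o))
        self.by_predicate[p].add((s, o))

    def objects(self, subject, predicate):
        """All objects linked from `subject` via `predicate`."""
        return {o for (s, o) in self.by_predicate[predicate] if s == subject}

# Hypothetical business entities, loosely in the spirit of BFO/CCO-style modeling
g = TripleStore()
g.add("Order-42", "placed_by", "Customer-7")
g.add("Order-42", "contains", "Product-SKU-1")
g.add("Customer-7", "located_in", "Mumbai")

print(g.objects("Order-42", "contains"))  # {'Product-SKU-1'}
```

The same triple pattern is what RDF formalizes, and what SPARQL queries traverse at scale.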

Posted 2 days ago


7.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Remote


Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
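As a small illustration of the SQL window-function skills this listing calls out: the common "keep the latest row per key" pattern (`ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ... DESC) = 1` in Snowflake SQL) can be sketched in plain Python. The table, columns, and records below are hypothetical:

```python
from operator import itemgetter

rows = [  # hypothetical customer records with ISO-8601 load timestamps
    {"customer_id": 1, "city": "Pune",      "loaded_at": "2024-01-01"},
    {"customer_id": 1, "city": "Mumbai",    "loaded_at": "2024-03-01"},
    {"customer_id": 2, "city": "Bengaluru", "loaded_at": "2024-02-15"},
]

def latest_per_key(rows, key, order_by):
    """Emulates ROW_NUMBER() OVER (PARTITION BY key ORDER BY order_by DESC) = 1:
    keep only the most recent record for each key value."""
    best = {}
    for r in rows:
        k = r[key]
        # ISO-8601 date strings compare correctly as plain strings.
        if k not in best or r[order_by] > best[k][order_by]:
            best[k] = r
    return sorted(best.values(), key=itemgetter(key))

print(latest_per_key(rows, "customer_id", "loaded_at"))
```

In a warehouse you would express this declaratively in SQL; the sketch just shows the semantics of the window-function pattern.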

Posted 2 days ago


8.0 - 12.0 years

12 - 16 Lacs

Hyderabad

Work from Office


At Storable, we're on a mission to power the future of storage. Our innovative platform helps businesses manage, track, and grow their self-storage operations, and we're looking for a Data Manager to join our data-driven team. Storable is committed to leveraging cutting-edge technologies to improve the efficiency, accessibility, and insights derived from data, empowering our team to make smarter decisions and foster impactful growth. As a Data Manager, you will play a pivotal role in overseeing and shaping our data operations, ensuring that our data is organized, accessible, and effectively managed across the organization. You will lead a talented team, work closely with cross-functional teams, and drive the development of strategies to enhance data quality, availability, and security.

Key Responsibilities :
- Lead Data Management Strategy: Define and execute the data management vision, strategy, and best practices, ensuring alignment with Storable's business goals and objectives.
- Oversee Data Pipelines: Design, implement, and maintain scalable data pipelines using industry-standard tools to efficiently process and manage large-scale datasets.
- Ensure Data Quality & Governance: Implement data governance policies and frameworks to ensure data accuracy, consistency, and compliance across the organization.
- Manage Cross-Functional Collaboration: Partner with engineering, product, and business teams to make data accessible and actionable, and ensure it drives informed decision-making.
- Optimize Data Infrastructure: Leverage modern data tools and platforms (e.g., AWS, Apache Airflow, Apache Iceberg) to create an efficient, reliable, and scalable data infrastructure.
- Monitor & Improve Performance: Proactively monitor data processes and workflows, troubleshoot issues, and optimize performance to ensure high reliability and data integrity.
- Mentorship & Leadership: Lead and develop a team of data engineers and analysts, fostering a collaborative environment where innovation and continuous improvement are valued.

Qualifications :
- Proven Expertise in Data Management: Significant experience in managing data infrastructure, data governance, and optimizing data pipelines at scale.
- Technical Proficiency: Strong hands-on experience with data tools and platforms such as Apache Airflow, Apache Iceberg, and AWS services (S3, Lambda, Redshift, Glue).
- Data Pipeline Mastery: Familiarity with designing, implementing, and optimizing data pipelines and workflows in Python or other languages for data processing.
- Experience with Data Governance: Solid understanding of data privacy, quality control, and governance best practices.
- Leadership Skills: Ability to lead and mentor teams, influence stakeholders, and drive data initiatives across the organization.
- Analytical Mindset: Strong problem-solving abilities and a data-driven approach to improving business operations.
- Excellent Communication: Ability to communicate complex data concepts to both technical and non-technical stakeholders effectively.

Bonus Points :
- Experience with visualization tools (e.g., Looker, Tableau) and reporting frameworks to provide actionable insights.

Why Storable :
- Cutting-Edge Technology: Work with the latest tools and technologies to solve complex data challenges.
- Impactful Work: Join a dynamic and growing company where your work directly contributes to shaping the future of the storage industry.
- Collaborative Culture: Be part of a forward-thinking, inclusive environment where innovation and teamwork are at the core of everything we do.
- Career Growth: We believe in continuous learning and provide ample opportunities for personal and professional development.
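The "monitor data processes and workflows" responsibility above usually begins with a simple freshness check against an SLA. A minimal sketch (the table name and lag threshold are hypothetical):

```python
from datetime import datetime, timedelta, timezone

def is_fresh(last_loaded_at: datetime, max_lag: timedelta) -> bool:
    """True if the dataset was loaded within the allowed lag window."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_lag

# Hypothetical SLA: a facility_occupancy table must be at most 6 hours stale.
last_load = datetime.now(timezone.utc) - timedelta(hours=2)
print(is_fresh(last_load, max_lag=timedelta(hours=6)))  # True
```

Orchestrators like Airflow wrap this idea in sensors and SLA callbacks; the check itself stays this simple.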

Posted 2 days ago


1.0 - 5.0 years

8 - 12 Lacs

Pune

Work from Office


Our Purpose : Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary : Moon#169 - Senior Data Engineer

Who is Mastercard : Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.

Overview : Ethoca, a Mastercard company, is seeking a Senior Data Engineer to join our team in Pune, India to drive data enablement and explore big data solutions within our technology landscape. The role is visible and critical as part of a high-performing team - it will appeal to you if you have an effective combination of domain knowledge, relevant experience and the ability to execute on the details. You will bring cutting-edge software and full stack development skills, with advanced knowledge of cloud and data lake technologies, while working with massive data volumes. You will own this - our teams are small, agile and focused on the needs of the high-growth fintech marketplace. You will be working across functional teams within Ethoca and Mastercard to deliver on our cloud strategy. We are committed to making our systems resilient and responsive, yet easily maintainable, on cloud.

Key Responsibilities :
- Design, develop, and optimize batch and real-time data pipelines using Snowflake, Snowpark, Python, and PySpark.
- Build data transformation workflows using dbt, with a strong focus on Test-Driven Development (TDD) and modular design.
- Implement and manage CI/CD pipelines using GitLab and Jenkins, enabling automated testing, deployment, and monitoring of data workflows.
- Deploy and manage Snowflake objects using Schema Change, ensuring controlled, auditable, and repeatable releases across environments.
- Administer and optimize the Snowflake platform, handling performance tuning, access management, cost control, and platform scalability.
- Drive DataOps practices by integrating testing, monitoring, versioning, and collaboration into every phase of the data pipeline lifecycle.
- Build scalable and reusable data models that support business analytics and dashboarding in Power BI.
- Develop and support real-time data streaming pipelines (e.g., using Kafka, Spark Structured Streaming) for near-instant data availability.
- Establish and implement data observability practices, including monitoring data quality, freshness, lineage, and anomaly detection across the platform.
- Plan and own deployments, migrations, and upgrades across data platforms and pipelines to minimize service impacts, including developing and executing mitigation plans.
- Collaborate with stakeholders to understand data requirements and deliver reliable, high-impact data solutions.
- Document pipeline architecture, processes, and standards, promoting consistency and transparency across the team.
- Apply exceptional problem-solving and analytical skills to troubleshoot complex data and system issues.
- Demonstrate excellent written and verbal communication skills when collaborating across technical and non-technical teams.

Required Qualifications :
- Tenured in the fields of Computer Science/Engineering or Software Engineering.
- Bachelor's degree in computer science or a related technical field, including programming.
- Deep hands-on experience with Snowflake (including administration), Snowpark, and Python.
- Strong background in PySpark and distributed data processing.
- Proven track record using dbt for building robust, testable data transformation workflows following TDD.
- Familiarity with Schema Change for Snowflake object deployment and version control.
- Proficient in CI/CD tooling, especially GitLab and Jenkins, with a focus on automation and DataOps.
- Experience with real-time data processing and streaming pipelines.
- Strong grasp of cloud-based database infrastructure (AWS, Azure, or GCP).
- Skilled in developing insightful dashboards and scalable data models using Power BI.
- Expert in SQL development and performance optimization.
- Demonstrated success in building and maintaining data observability tools and frameworks.
- Proven ability to plan and execute deployments, upgrades, and migrations with minimal disruption to operations.
- Strong communication, collaboration, and analytical thinking across technical and non-technical stakeholders.
- Ideally, you have experience in banking, e-commerce, credit cards or payment processing, and exposure to both SaaS and premises-based architectures. In addition, you have a post-secondary degree in computer science, mathematics, or another quantitative science.

Corporate Security Responsibility : All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
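The data observability practices this role calls for often include anomaly detection on pipeline metrics such as daily row counts. A minimal z-score sketch using only the standard library (the metric series is invented, and a real platform would use purpose-built observability tooling):

```python
import statistics

def is_anomalous(history, latest, z_threshold=3.0):
    """Flag `latest` if it deviates from the historical mean
    by more than `z_threshold` standard deviations."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > z_threshold

daily_row_counts = [1000, 1020, 980, 1010, 995, 1005]  # hypothetical ingest volumes
print(is_anomalous(daily_row_counts, 1008))  # an ordinary day
print(is_anomalous(daily_row_counts, 150))   # likely a broken upstream feed
```

In practice the history window slides and the threshold is tuned per metric, but the z-score test is a common baseline before anything more sophisticated.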

Posted 2 days ago


5.0 - 10.0 years

7 - 9 Lacs

Pune

Work from Office


Our Purpose : Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary : Lead Software Engineer

Job Summary : As a Lead Software Engineer focused on Data Quality, you will lead the design, development, and deployment of scalable data quality frameworks and pipelines. You will work closely with data engineers, analysts, and business stakeholders to build robust solutions that validate, monitor, and improve data quality across large-scale distributed systems.

Key Responsibilities :
- Lead the design and implementation of data quality frameworks and automated validation pipelines using Python, Apache Spark, and Hadoop ecosystem tools.
- Develop, deploy, and maintain scalable ETL/ELT workflows using Apache Airflow and Apache NiFi to ensure seamless data ingestion, transformation, and quality checks.
- Collaborate with cross-functional teams to understand data quality requirements and translate them into technical solutions.
- Define and enforce data quality standards, rules, and monitoring processes.
- Perform root cause analysis on data quality issues and implement effective fixes and enhancements.
- Mentor and guide junior engineers, conducting code reviews and fostering best practices.
- Continuously evaluate and integrate new tools and technologies to enhance data quality capabilities.
- Ensure high code quality, performance, and reliability in all data processing pipelines.
- Create comprehensive documentation and reports on data quality metrics and system architecture.

Required Skills & Experience :
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field, with data engineering experience.
- 5+ years of professional experience in software development, with at least 2 years in a lead or senior engineering role.
- Strong proficiency in Python programming and experience building data processing applications.
- Hands-on expertise with Apache Spark and Hadoop for big data processing.
- Solid experience with workflow orchestration tools like Apache Airflow.
- Experience designing and managing data ingestion and integration pipelines with Apache NiFi.
- Understanding of data quality automation, CI/CD, Jenkins, Oracle, Power BI, and Splunk.
- Deep understanding of data quality concepts, data validation techniques, and distributed data systems.
- Strong problem-solving skills and the ability to lead technical discussions.
- Experience with cloud platforms (AWS, GCP, or Azure) is a plus.
- Excellent communication and collaboration skills.

Corporate Security Responsibility : All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
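The validation pipelines this listing describes generally reduce to named rule functions applied per record, with failure counts reported per rule. A minimal, framework-free sketch (the rules and fields are illustrative examples, not Mastercard's actual checks; real pipelines would run this logic in Spark at scale):

```python
def not_null(field):
    """Rule: the field must be populated."""
    return lambda row: row.get(field) is not None

def in_range(field, lo, hi):
    """Rule: the field must be populated and fall within [lo, hi]."""
    return lambda row: row.get(field) is not None and lo <= row[field] <= hi

RULES = {  # hypothetical data-quality rules for a transactions feed
    "txn_id_present": not_null("txn_id"),
    "amount_sane": in_range("amount", 0.01, 1_000_000),
}

def validate(rows):
    """Return per-rule failure counts across a batch of records."""
    failures = {name: 0 for name in RULES}
    for row in rows:
        for name, rule in RULES.items():
            if not rule(row):
                failures[name] += 1
    return failures

batch = [{"txn_id": "a1", "amount": 25.0}, {"txn_id": None, "amount": -5}]
print(validate(batch))  # {'txn_id_present': 1, 'amount_sane': 1}
```

Keeping rules as small named functions makes them easy to unit-test, which is what lets such checks live in a CI/CD pipeline.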

Posted 2 days ago


4.0 - 9.0 years

13 - 17 Lacs

Mumbai

Work from Office


Why Join Us : At Mahindra Group, we provide an exciting and inclusive work environment that values collaboration, agility, and boldness. At Mahindra Finance, we are all about recognizing the potential in people and empowering them in every way possible. We make every possible effort to create and maintain a milieu which is highly conducive to their growth. Our people are never short of challenges and cross-functional opportunities to help them expand their horizons and learn in a holistic way. At Mahindra, Together We Rise!

Data & Analytics at Mahindra Finance : The Lead - Emerging Business Analytics will play a crucial role in maintaining the company's competitive edge in the NBFC space. This role will drive operational and new initiatives for these businesses and own the key business KPIs. It contributes to the company's overall business by driving innovation, improving decision-making processes, and enhancing financial products and services through the application of data-based insights and ML.

What You'll Do :
- Collaborate with stakeholders across business, technology, and analytics teams to capture requirements and translate them into actionable tasks and project scopes.
- Conduct detailed exploratory analysis to generate insights that can be integrated into our OneApp for hyper-personalization.
- Extract customer insights from both internal and external data sources to inform business actions that drive revenue growth.
- Utilize data generated by our digital assets, such as our app and website, to build analytical solutions that enhance customer experience and address user drop-offs.
- Design and execute large-scale A/B experiments to validate improvements in key business KPIs, and develop a long-term roadmap for the features being tested.
- Work in close collaboration with the Machine Learning/AI team to integrate their model outputs into our digital app and website, and distribute the same to offline channels such as branches and call centres.

Key Challenges :
- Understanding Complex Business Processes: You will need to understand and map out complex business processes. This can be challenging, given that the role demands working with multiple departments and functions.
- Bridging the Gap between IT and Business: Act as a bridge between the IT department, business units, and the analytics team. This can be challenging, as it requires a deep understanding of both technical and business aspects.
- Keeping Up with Technological Advances: The field of analytics is constantly evolving with new technologies and methodologies. Keeping up with these changes and learning new tools and techniques are important.
- Stakeholder Management: Managing expectations and maintaining clear communication with various stakeholders can be difficult. Stakeholders may have different priorities, and it can be a challenge to balance these and keep everyone informed.
- Data Quality and Management: Ensuring the quality and integrity of data used for analysis can be a challenge. Poor-quality data can lead to inaccurate analysis and decision-making.
- Change Management: Implementing new systems or processes can be met with resistance from employees. Managing this change and ensuring a smooth transition is important.

What We Value :
- Working knowledge of the data science toolbox: Python, SQL, Jupyter Notebook, Azure/AWS cloud; experience with machine learning is a plus.
- Deep understanding of the HDIEP framework - Hypothesis, Data Sourcing, EDA, Insights, and Presentation.
- High degree of emotional intelligence in managing relationships with internal clients/stakeholders.
- Act as the translator between the analytics, IT, and business teams to effectively deliver business outcomes.
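The large-scale A/B experimentation mentioned in this listing typically rests on a two-proportion z-test for conversion rates. A standard-library-only sketch (the conversion numbers are invented):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates,
    using the pooled-proportion standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical: variant B of an app screen converts 5.5% vs the control's 5.0%.
z = two_proportion_z(conv_a=500, n_a=10_000, conv_b=550, n_b=10_000)
print(round(z, 2), "significant at 95%:", abs(z) > 1.96)
```

With these made-up numbers the lift is not significant at the 95% level, which is exactly why sample-size planning precedes such experiments.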

Posted 2 days ago


5.0 - 10.0 years

15 - 20 Lacs

Pune

Work from Office


Lead - IT Architecture
Job Locations: IN-Pune | Requisition ID: 2025-13105 | Category: Information Technology | Position Type: Experienced Professional

Overview : Lead Engineer - this is a technology lead role within the Global Customer Data Department at StoneX. This role will actively collaborate with other technology leads on the delivery of ongoing data projects. As a contributing senior data engineer and tech lead, the role will assist in architecting, designing and implementing components within our cloud data platform for ongoing operations and new data projects.

Responsibilities :
- Contribute to and lead team deliverables within a defined project scope, with attention to detail for both data and technical aspects.
- Understand system protocols, how systems operate, and how data flows; stay aware of current and emerging technology tools and their benefits.
- Understand the building blocks, interactions, dependencies, and tools required to complete data transformation, data storage and data distribution.
- Engage with other team leads to implement the data management program with high quality.
- Maintain a strong focus on innovation and enablement; contribute to designs that implement data pipelines for structured and unstructured data and deliver effective and efficient technical solutions.
- Understand capital markets business problems, and study and transform data as needed.
- In collaboration with other leads, define key data domains and attributes to help standardize the data model and drive data quality across the global enterprise.
- Contribute toward improving the data quality of StoneX strategic data assets.
- Oversee the day-to-day operations of the junior data team members; participate in scrum and sprint rituals.

Qualifications :
- 5+ years of experience in Data Architecture, Data Engineering and/or Production Support across large enterprises.
- 3 years of hands-on Data Engineering or Software Development experience in the capital markets / trading industry.
- Strong understanding of enterprise architecture patterns, Object-Oriented and Service-Oriented principles, design patterns, and industry best practices.
- Experience leading a small team of highly qualified technical performers is desired.
- Experience facilitating discussions and resolving issues across a diverse set of cross-functional business and IT stakeholders.
- Foundational knowledge of data structures, algorithms, and designing for performance.
- Proficiency in Python and working knowledge of PySpark; experience with Databricks desired.
- Experience with a database technology like MSSQL, Postgres or MySQL, and with a key-value or document database like MongoDB, DynamoDB, or Cassandra.
- Excellent communication skills and the ability to work with the business to extract critical concepts and transform them into technical task items.
- Ability to work and lead in an Agile methodology environment.

Posted 2 days ago


5.0 - 10.0 years

16 - 20 Lacs

Hyderabad

Work from Office


What You Will Do : As a Data Governance Architect at Kanerika, you will play a pivotal role in shaping and executing the enterprise data governance strategy. Your responsibilities include:

1. Strategy, Framework, and Governance Operating Model
- Develop and maintain enterprise-wide data governance strategies, standards, and policies.
- Align governance practices with business goals like regulatory compliance and analytics readiness.
- Define roles and responsibilities within the governance operating model.
- Drive governance maturity assessments and lead change management initiatives.

2. Stakeholder Alignment & Organizational Enablement
- Collaborate across IT, legal, business, and compliance teams to align governance priorities.
- Define stewardship models and create enablement, training, and communication programs.
- Conduct onboarding sessions and workshops to promote governance awareness.

3. Architecture Design for Data Governance Platforms
- Design scalable and modular data governance architecture.
- Evaluate tools like Microsoft Purview, Collibra, Alation, BigID, Informatica.
- Ensure integration with metadata, privacy, quality, and policy systems.

4. Microsoft Purview Solution Architecture
- Lead end-to-end implementation and management of Microsoft Purview.
- Configure RBAC, collections, metadata scanning, business glossary, and classification rules.
- Implement sensitivity labels, insider risk controls, retention, data map, and audit dashboards.

5. Metadata, Lineage & Glossary Management
- Architect metadata repositories and ingestion workflows.
- Ensure end-to-end lineage (ADF → Synapse → Power BI).
- Define governance over the business glossary and approval workflows.

6. Data Classification, Access & Policy Management
- Define and enforce rules for data classification, access, retention, and sharing.
- Align with GDPR, HIPAA, CCPA, and SOX regulations.
- Use Microsoft Purview and MIP for policy enforcement automation.

7. Data Quality Governance
- Define KPIs, validation rules, and remediation workflows for enterprise data quality.
- Design scalable quality frameworks integrated into data pipelines.

8. Compliance, Risk, and Audit Oversight
- Identify risks and define standards for compliance reporting and audits.
- Configure usage analytics, alerts, and dashboards for policy enforcement.

9. Automation & Integration
- Automate governance processes using PowerShell, Azure Functions, Logic Apps, and REST APIs.
- Integrate governance tools with Azure Monitor, Synapse Link, Power BI, and third-party platforms.

Requirements :
- 5+ years in data governance and management.
- Expertise in Microsoft Purview, Informatica, and related platforms.
- Experience leading end-to-end governance initiatives.
- Strong understanding of metadata, lineage, policy management, and compliance regulations.
- Hands-on skills in Azure Data Factory, REST APIs, PowerShell, and governance architecture.
- Familiarity with Agile methodologies and stakeholder communication.

Benefits :

1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.

2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.

3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.

4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Create opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.

5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.

6. Professional Development Benefits:
- L&D with FLEX - Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
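The Data Quality Governance responsibilities in this listing mention KPIs and validation rules; completeness and validity ratios are the usual starting KPIs. A minimal sketch with made-up records (any real framework, e.g. within Purview or a pipeline, would compute these against live datasets):

```python
def completeness(rows, field):
    """KPI: share of records where `field` is populated."""
    if not rows:
        return 1.0
    return sum(1 for r in rows if r.get(field) not in (None, "")) / len(rows)

def validity(rows, field, predicate):
    """KPI: share of populated values that satisfy `predicate`."""
    values = [r[field] for r in rows if r.get(field) not in (None, "")]
    if not values:
        return 1.0
    return sum(1 for v in values if predicate(v)) / len(values)

customers = [  # hypothetical records
    {"email": "a@example.com"},
    {"email": ""},
    {"email": "not-an-email"},
    {"email": "b@example.com"},
]
print(completeness(customers, "email"))                  # 0.75
print(validity(customers, "email", lambda v: "@" in v))  # 2 of 3 populated values pass
```

Tracking these ratios over time, with thresholds that trigger remediation workflows, is the core of the KPI-driven quality governance the role describes.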

Posted 2 days ago


5.0 - 10.0 years

7 - 9 Lacs

Bengaluru

Work from Office


Help empower our global customers to connect to culture through their passions.

Technology at StockX : Our Technology Team is on a mission to build the next generation e-commerce platform for the next generation customer. We build world-class, innovative experiences and products that give our users access to the world's most-coveted products and unlock economic opportunity by turning reselling into a business for anyone. Our team uses cutting-edge technologies that handle massive scale globally. We're an internet-native, cloud-native company from day 1 - you won't find legacy technology here. If you're a curious leader who loves solving problems, wearing multiple hats, and learning new things, join us!

About the role : As a Senior Data Engineer, you will be empowered to leverage data to drive amazing customer experiences and business results. You will own the end-to-end development of data engineering solutions to support the analytical needs of the business. The ideal candidate will be passionate about working with disparate datasets and will be someone who loves to bring data together to answer business questions at speed. You should have deep expertise in the creation and management of datasets and the proven ability to translate the data into meaningful insights through collaboration with analysts, data scientists and business stakeholders.

What you'll do :
- Design and build mission-critical data pipelines with a highly scalable distributed architecture - including data ingestion (streaming, events and batch), data integration, and data curation.
- Help continually improve ongoing reporting and analysis processes, simplifying self-service support for business stakeholders.
- Build and support reusable frameworks to ingest, integrate, and provision data.
- Automate the end-to-end data pipeline with metadata, data quality checks, and audit.
- Build and support a big data platform on the cloud.
- Define and implement automation of jobs and testing.
- Optimize the data pipeline to support ML workloads and use cases.
- Support mission-critical applications and near-real-time data needs from the data platform.
- Capture and publish metadata and new data to subscribed users.
- Work collaboratively with business analysts, product managers, and data scientists, as well as business partners, and actively participate in design thinking sessions.
- Participate in design and code reviews.
- Motivate, coach, and serve as a role model and mentor for other development team members that leverage the platform.

About you :
- Minimum of 5 years' experience in data warehouse / data lake technical architecture.
- 3+ years of experience using programming languages (Python / Scala / Java / C#) to build data pipelines.
- Minimum 3 years with Big Data and Big Data tools in one or more of the following: batch processing (e.g., Hadoop distributions, Spark), real-time processing (e.g., Kafka, Flink/Spark Streaming).
- Minimum of 2 years' experience with AWS or engineering in other cloud environments.
- Experience with database architecture/schema design.
- Strong familiarity with batch processing and workflow tools such as Airflow and NiFi.
- Ability to work independently with business partners and management to understand their needs and exceed expectations in delivering tools/solutions.
- Strong interpersonal, verbal and written communication skills, and the ability to present complex technical/analytical concepts to an executive audience.
- Strong business mindset with customer obsession; ability to collaborate with business partners to identify needs and opportunities for improved data management and delivery.
- Experience providing technical leadership and mentoring other engineers on best practices in data engineering.
- Bachelor's degree in Computer Science or a related technical field. Nice to have: Master's in Computer Science or a related quantitative field.

About StockX : StockX is proud to be a Detroit-based technology leader focused on the large and growing online market for sneakers, apparel, accessories, electronics, collectibles, trading cards, and more. StockX's powerful platform connects buyers and sellers of high-demand consumer goods from around the world using dynamic pricing mechanics. This approach affords access and market visibility powered by real-time data that empowers buyers and sellers to determine and transact based on market value. The StockX platform features hundreds of brands across verticals including Jordan Brand, adidas, Nike, Supreme, BAPE, Off-White, Louis Vuitton, Gucci; collectibles from brands including LEGO, KAWS, Bearbrick, and Pop Mart; and electronics from industry-leading manufacturers Sony, Microsoft, Meta, and Apple. Launched in 2016, StockX employs 1,000 people across offices and verification centers around the world. Learn more at www.stockx.com. We are an equal opportunity employer and value diversity at our company.
We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. This job description is intended to convey information essential to understanding the scope of the job and the general nature and level of work performed by job holders within this job. However, this job description is not intended to be an exhaustive list of qualifications, skills, efforts, duties, responsibilities or working conditions associated with the position. StockX reserves the right to amend this job description at any time. StockX may utilize AI to rank job applicant submissions against the position requirements to assist in determining candidate alignment.
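The role above calls for automating pipelines with data quality checks and audits. As a rough illustration only (this is not StockX's actual stack, and all field names such as order_id, price, and ts are hypothetical), a batch ingestion step might gate rows on completeness, validity, and timestamp checks before publishing them downstream:

```python
# Hypothetical sketch of row-level data quality checks in a batch pipeline.
# Field names and rules are illustrative assumptions, not a real schema.

from datetime import datetime

def run_quality_checks(rows, required_fields=("order_id", "price", "ts")):
    """Split a batch into valid rows and rejected rows with reasons."""
    valid, rejected = [], []
    for row in rows:
        errors = []
        # Completeness: every required field must be present and non-null
        for field in required_fields:
            if row.get(field) is None:
                errors.append(f"missing {field}")
        # Validity: price must be a positive number
        price = row.get("price")
        if price is not None and (not isinstance(price, (int, float)) or price <= 0):
            errors.append("non-positive price")
        # Timeliness: timestamp must parse as ISO-8601
        ts = row.get("ts")
        if ts is not None:
            try:
                datetime.fromisoformat(ts)
            except (TypeError, ValueError):
                errors.append("bad timestamp")
        (rejected if errors else valid).append({**row, "errors": errors})
    return valid, rejected

batch = [
    {"order_id": 1, "price": 120.0, "ts": "2024-05-01T10:00:00"},
    {"order_id": 2, "price": -5.0, "ts": "2024-05-01T10:01:00"},
    {"order_id": 3, "price": 99.0, "ts": None},
]
valid, rejected = run_quality_checks(batch)
print(len(valid), len(rejected))  # 1 valid row, 2 rejected with reasons
```

In a real pipeline the rejected rows and their reasons would typically be written to an audit table rather than discarded, which is what makes the audit requirement above tractable.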

Posted 2 days ago


2.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office


Who we are
We're a leading, global security authority that's disrupting our own category. Our encryption is trusted by the major ecommerce brands, the world's largest companies, the major cloud providers, entire country financial systems, entire internets of things, and even down to the little things like surgically embedded pacemakers. We help companies put trust - an abstract idea - to work. That's digital trust for the real world.

Job summary
The DigiCert ONE CA team is looking for a knowledgeable Senior QA Automation Engineer to join our agile cross-functional team to build the future of PKI and security management.

What you will do
- Collaborate with an agile cross-functional engineering team in developing and implementing QA testing strategy
- Determine the appropriate balance of manual and automated tests, as well as the types of tests (UI, API, functional/performance/load, etc.)
- Create processes to validate data quality, including well-structured test plans and test cases
- Improve and build upon our test suite, test guidelines, and testing culture
- Review and test your teammates' pull requests
- Develop a deep understanding of products, architecture, and systems
- Analyze, troubleshoot, and debug product defects and provide timely solutions to customer issues

What you will have
- 2+ years of industry experience; QA automation experience with Python preferred
- Experience with Selenium WebDriver, Postman, or other API testing frameworks and technologies
- Familiarity with QA automation architecture, methodologies, processes, and tools
- Familiarity with SQL and relational databases
- Ability to research and learn new technologies
- Strong interpersonal communication skills - must enjoy working and collaborating with a team

Nice to have
- Understanding of SSL/TLS, PKI, and other security-related technologies
- Bachelor's degree in Computer Science, Information Systems, etc., or equivalent years of experience in the industry
- Technologies we use (any experience is a bonus): Docker and Kubernetes; Git and GitHub; Go; React; Jenkins or other CI/CD tools

Benefits
- Generous time off policies
- Top shelf benefits
- Education, wellness, and lifestyle support
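The QA role above centers on automated validation of product behavior. Purely as a hypothetical sketch (not DigiCert's codebase; the record format and policy rules below are invented for illustration), a small automated check over certificate metadata might assert expiry and key-strength policies the way a test suite would:

```python
# Hypothetical QA check for certificate metadata records.
# The dict schema and policy thresholds are illustrative assumptions.

from datetime import datetime, timedelta, timezone

def check_certificate(cert: dict) -> list:
    """Return a list of QA findings for one certificate record."""
    findings = []
    # Subject must be non-empty
    if not cert.get("subject"):
        findings.append("empty subject")
    # Expiry: not_after must lie in the future
    not_after = datetime.fromisoformat(cert["not_after"])
    if not_after <= datetime.now(timezone.utc):
        findings.append("expired")
    # Key-size policy: flag RSA keys shorter than 2048 bits
    if cert.get("key_type") == "RSA" and cert.get("key_bits", 0) < 2048:
        findings.append("weak RSA key")
    return findings

good = {
    "subject": "CN=example.com",
    "not_after": (datetime.now(timezone.utc) + timedelta(days=90)).isoformat(),
    "key_type": "RSA",
    "key_bits": 2048,
}
bad = {
    "subject": "",
    "not_after": (datetime.now(timezone.utc) - timedelta(days=1)).isoformat(),
    "key_type": "RSA",
    "key_bits": 1024,
}
print(check_certificate(good))  # []
print(check_certificate(bad))   # ['empty subject', 'expired', 'weak RSA key']
```

In practice, checks like these would be wrapped in a test framework (e.g. pytest) and run in CI against responses from the product's API rather than hand-built records.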

Posted 2 days ago
