5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Hyderabad. You will play a crucial role in the development and implementation of innovative solutions.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the application development process
- Conduct code reviews and ensure coding standards are met
- Implement best practices for application development

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform
- Strong understanding of data analytics and data processing
- Experience with cloud-based data platforms
- Knowledge of data modeling and database design
- Hands-on experience with data integration and ETL processes

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Hyderabad office
- A 15 years full-time education is required
Posted 13 hours ago
15.0 - 20.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to enhance the overall data architecture and platform functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms
- Assist in the documentation of data architecture and integration processes

Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform
- Strong understanding of data integration techniques and methodologies
- Experience with cloud-based data solutions and architectures
- Familiarity with data modeling concepts and practices
- Ability to troubleshoot and optimize data workflows

Additional Information:
- The candidate should have a minimum of 2 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- A 15 years full-time education is required
Posted 13 hours ago
5.0 - 10.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: SAP Master Data Governance (MDG) Tool
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact
- Manage the team and ensure successful project delivery

Professional & Technical Skills:
- Must-have skills: technical proficiency in the SAP Master Data Governance (MDG) Tool
- Strong understanding of data governance principles and best practices
- Experience in implementing and configuring the SAP MDG Tool
- Knowledge of data modeling and data integration concepts
- Hands-on experience in data quality management and data cleansing techniques

Additional Information:
- The candidate should have a minimum of 5 years of experience in the SAP Master Data Governance MDG Tool
- This position is based in Mumbai
- A 15 years full-time education is required
Posted 13 hours ago
3.0 - 8.0 years
5 - 9 Lacs
Gurugram
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Professional & Technical Skills:
- Must-have skills: proficiency in Informatica PowerCenter and Oracle Procedural Language Extensions to SQL (PL/SQL)
- Strong understanding of ETL processes and data integration
- Experience in developing complex data mappings and transformations
- Knowledge of data warehousing concepts and best practices
- Hands-on experience in performance tuning and optimization of ETL processes

Roles & Responsibilities:
- Strong experience in SQL, PL/SQL, Informatica PowerCenter, Unix shell scripting, Oracle 12c database optimization and scheduling tools, along with performance-improvement techniques across the ETL technology stack, with hands-on abilities (6+ years)
- Good data modelling skills: entity-relationship as well as dimensional modelling, designing the various layers of a data warehouse, etc. (at least 2 years)
- Data analysis, source-system analysis and data mapping to the target data model (2 to 4 years)
- Extensive knowledge of data integration architecture, ETL and data warehousing, especially large volumes with complex integrations (3+ years)
- Good knowledge of reporting tools such as OAS, with an understanding of different types of reporting, storytelling using visualization, etc.
- Good knowledge of emerging alternative tools/technologies such as Python, Spring Integration, Big Data technologies, etc.
- Good knowledge of architectural and design patterns
- Oversee and review application design and development to ensure it meets both the technical constraints of the architecture and the business objectives in terms of reliability, scalability and serviceability
- Ability to identify and define non-functional requirements and design systems to meet them
- Knowledge of application architecture to ensure performance and scalability, including the identification and alleviation of bottlenecks
- Excellent verbal and written communication and problem-solving skills

Additional Information:
- The candidate should have a minimum of 4 years of experience
- This position is based at our Gurugram office; visiting the client office twice a week is a must
- A 15 years full-time education is required
Posted 13 hours ago
15.0 - 20.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions, providing insights and solutions to enhance application performance and user experience. Your role will require you to stay updated with the latest technologies and methodologies to ensure the applications are built using best practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Facilitate knowledge-sharing sessions to enhance team capabilities
- Mentor junior team members to foster their professional growth

Professional & Technical Skills:
- Must-have skills: proficiency in PySpark
- Strong understanding of data processing frameworks and distributed computing
- Experience with data integration and ETL processes
- Familiarity with cloud platforms and services related to application deployment
- Ability to troubleshoot and optimize application performance

Additional Information:
- The candidate should have a minimum of 2 years of experience in PySpark
- This position is based at our Pune office
- A 15 years full-time education is required
Posted 13 hours ago
15.0 - 20.0 years
5 - 9 Lacs
Navi Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Facilitate knowledge-sharing sessions to enhance team capabilities
- Monitor project progress and ensure timely delivery of application features

Professional & Technical Skills:
- Must-have skills: proficiency in SAP BusinessObjects Data Services
- Strong understanding of data integration and transformation processes
- Experience with ETL (Extract, Transform, Load) methodologies
- Familiarity with database management systems and SQL
- Ability to troubleshoot and resolve application issues effectively

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services
- This position is based at our Mumbai office
- A 15 years full-time education is required
Posted 13 hours ago
6.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Department: Platform Engineering
Role Description: This is a remote contract role for a Data Engineer (Ontology, 5+ yrs) at Zorba AI. We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:
Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes
Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles
- Create data models that are scalable, flexible, and adaptable to changing business needs
- Integrate data models with existing data infrastructure and applications
Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems
Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs
- Define and implement data governance processes and standards for ontology development and maintenance
Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions
- Communicate complex technical concepts clearly and effectively to diverse audiences

Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Science, or a related field
Experience:
- 5+ years of experience in data engineering or a related role
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL
- Proficiency in Python, SQL, and other programming languages used for data engineering
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus

Desired Skills:
- Familiarity with machine learning and natural language processing techniques
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP)
- Experience with Databricks technologies, including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon
- Strong problem-solving and analytical skills
- Excellent communication and interpersonal skills
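The ontology and knowledge-graph work this posting describes centers on subject-predicate-object triples, the representation underlying RDF. As a minimal illustration only (plain Python rather than a real triple store, and the entity names are invented for the example, not taken from the posting):

```python
# Minimal in-memory triple store: each fact is a (subject, predicate, object)
# triple, the shape RDF graphs use. All names here are hypothetical examples.
triples = {
    ("Pump_17", "is_a", "Artifact"),       # a BFO-style class assertion
    ("Pump_17", "located_in", "Plant_A"),
    ("Plant_A", "is_a", "Site"),
}

def query(s=None, p=None, o=None):
    """Return triples matching a pattern; None is a wildcard, like a SPARQL variable."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "What class is Pump_17?" -- roughly SELECT ?o WHERE { :Pump_17 :is_a ?o }
print(query(s="Pump_17", p="is_a"))  # [('Pump_17', 'is_a', 'Artifact')]
```

A production system would use a triple store or graph database (the posting names GraphDB, Stardog, TigerGraph, JanusGraph) rather than a Python set, but the query-by-pattern idea is the same.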
Posted 15 hours ago
6.0 - 8.0 years
9 - 13 Lacs
Mumbai
Work from Office
Job Title: Sr. Data Engineer - Ontology & Knowledge Graph Specialist
Department: Platform Engineering
Role Description: This is a remote contract role for a Data Engineer (Ontology, 5+ yrs) at Zorba AI. We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:
Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes
Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles
- Create data models that are scalable, flexible, and adaptable to changing business needs
- Integrate data models with existing data infrastructure and applications
Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems
Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs
- Define and implement data governance processes and standards for ontology development and maintenance
Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions
- Communicate complex technical concepts clearly and effectively to diverse audiences

Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Science, or a related field
Experience:
- 5+ years of experience in data engineering or a related role
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL
- Proficiency in Python, SQL, and other programming languages used for data engineering
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus

Desired Skills:
- Familiarity with machine learning and natural language processing techniques
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP)
- Experience with Databricks technologies, including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon
- Strong problem-solving and analytical skills
- Excellent communication and interpersonal skills
Posted 15 hours ago
8.0 - 13.0 years
5 - 10 Lacs
Pune, Chennai, Bengaluru
Work from Office
Location: Bangalore, Chennai, Delhi, Pune

Primary Roles and Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate data pipelines in a scheduler via Airflow

Skills and Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python and Spark (PySpark)
- Must have experience with the AWS/Azure stack
- Desirable to have ETL with batch and streaming (Kinesis)
- Experience in building ETL/data warehouse transformation processes
- Experience with Apache Kafka for streaming and event-based data
- Experience with other open-source big data products, e.g. Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
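The star-schema dimensional modelling this posting asks for comes down to fact rows holding keys into dimension tables, which BI queries join and aggregate. A toy sketch in plain Python (table and column names are invented for illustration; in the posting's stack these would be Delta or SQL tables):

```python
# Toy star schema: a sales fact table keyed into product and date dimensions.
# All names are hypothetical, not from any real warehouse.
dim_product = {1: {"name": "Widget", "category": "Hardware"},
               2: {"name": "Gadget", "category": "Hardware"}}
dim_date = {20240101: {"year": 2024, "quarter": "Q1"}}

fact_sales = [
    {"date_key": 20240101, "product_key": 1, "amount": 120.0},
    {"date_key": 20240101, "product_key": 2, "amount": 80.0},
]

# Aggregate revenue by product category -- the join/group-by a BI layer
# would issue as SQL against the star schema.
revenue_by_category = {}
for row in fact_sales:
    category = dim_product[row["product_key"]]["category"]
    revenue_by_category[category] = revenue_by_category.get(category, 0.0) + row["amount"]

print(revenue_by_category)  # {'Hardware': 200.0}
```

A snowflake schema differs only in that the dimensions themselves are further normalized (e.g. category split out of `dim_product` into its own table).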
Posted 15 hours ago
8.0 - 13.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Strong grasp of O&G industry workflows and the regulatory environment. Experience in automation using Python, GenAI, AIOps, etc. Experience with data integration, warehousing, and big data technologies. Experience with Docker and Kubernetes, CI/CD pipelines, and automated testing frameworks.
Posted 16 hours ago
7.0 - 9.0 years
15 - 30 Lacs
Thiruvananthapuram
Work from Office
Job Title: Senior Data Associate - Cloud Data Engineering
Experience: 7+ Years
Employment Type: Full-Time
Industry: Information Technology / Data Engineering / Cloud Platforms

Job Summary: We are seeking a highly skilled and experienced Senior Data Associate to join our data engineering team. The ideal candidate will have a strong background in cloud data platforms, big data processing, and enterprise data systems, with hands-on experience across both the AWS and Azure ecosystems. This role involves building and optimizing data pipelines, managing large-scale data lakes and warehouses, and enabling advanced analytics and reporting.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using AWS Glue, PySpark, and Azure Data Factory
- Work with AWS Redshift, Athena, Azure Synapse, and Databricks to support data warehousing and analytics solutions
- Integrate and manage data across MongoDB, Oracle, and cloud-native storage such as Azure Data Lake and S3
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality datasets
- Implement data quality checks, monitoring, and governance practices
- Optimize data workflows for performance, scalability, and cost-efficiency
- Support data migration and modernization initiatives across cloud platforms
- Document data flows, architecture, and technical specifications

Required Skills & Qualifications:
- 7+ years of experience in data engineering, data integration, or related roles
- Strong hands-on experience with AWS Redshift, Athena, Glue and S3; Azure Data Lake, Synapse Analytics and Databricks; PySpark for distributed data processing; and MongoDB and Oracle databases
- Proficiency in SQL, Python, and data modeling
- Experience with ETL/ELT design and implementation
- Familiarity with data governance, security, and compliance standards
- Strong problem-solving and communication skills

Preferred Qualifications:
- Certifications in AWS (e.g., Data Analytics Specialty) or Azure (e.g., Azure Data Engineer Associate)
- Experience with CI/CD pipelines and DevOps for data workflows
- Knowledge of data cataloging tools (e.g., AWS Glue Data Catalog, Azure Purview)
- Exposure to real-time data processing and streaming technologies

Required Skills: Azure, AWS Redshift, Athena, Azure Data Lake
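The "data quality checks" this posting lists are typically simple rule assertions run against each batch before it lands in the warehouse. A minimal sketch in plain Python (the rules and field names are invented for illustration; real pipelines would express these in a framework or in PySpark):

```python
# Minimal batch data-quality gate: validate each record against simple rules
# and partition the batch into clean rows and rejects. Field names are
# hypothetical examples, not from any real schema.
def check_record(rec):
    """Return a list of rule violations for one record (empty list = clean)."""
    errors = []
    if rec.get("id") is None:
        errors.append("missing id")            # completeness rule
    if rec.get("amount", -1) < 0:
        errors.append("amount must be >= 0")   # validity rule
    return errors

batch = [
    {"id": 1, "amount": 10.5},
    {"id": None, "amount": 3.0},   # fails completeness
    {"id": 3, "amount": -2.0},     # fails validity
]

clean = [r for r in batch if not check_record(r)]
rejects = [(r, check_record(r)) for r in batch if check_record(r)]

print(len(clean), len(rejects))  # 1 2
```

The same pattern scales up: route `clean` onward, quarantine `rejects` with their error lists, and emit counts as monitoring metrics.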
Posted 16 hours ago
7.0 - 9.0 years
5 - 5 Lacs
Thiruvananthapuram
Work from Office
Job Title: Azure Infrastructure Consultant - Cloud & Data Integration
Experience: 8+ Years
Employment Type: Full-Time
Industry: Information Technology / Cloud Infrastructure / Data Engineering

Job Summary: We are looking for a seasoned Azure Infrastructure Consultant with a strong foundation in cloud infrastructure, data integration, and real-time data processing. The ideal candidate will have hands-on experience across the Azure and AWS platforms, with deep knowledge of Apache NiFi, Kafka, AWS Glue, and PySpark. This role involves designing and implementing secure, scalable, and high-performance cloud infrastructure and data pipelines.

Key Responsibilities:
- Design and implement Azure-based infrastructure solutions, ensuring scalability, security, and performance
- Lead hybrid cloud integration projects involving Azure and AWS services
- Develop and manage ETL/ELT pipelines using AWS Glue, Apache NiFi, and PySpark
- Architect and support real-time data streaming solutions using Apache Kafka
- Collaborate with cross-functional teams to gather requirements and deliver infrastructure and data solutions
- Implement infrastructure automation using tools like Terraform, ARM templates, or Bicep
- Monitor and optimize cloud infrastructure and data workflows for cost and performance
- Ensure compliance with security and governance standards across cloud environments

Required Skills & Qualifications:
- 8+ years of experience in IT infrastructure and cloud consulting
- Strong hands-on experience with Azure IaaS/PaaS (VMs, VNets, Azure AD, App Services, etc.); AWS services including Glue, S3 and Lambda; Apache NiFi for data ingestion and flow management; Apache Kafka for real-time data streaming; and PySpark for distributed data processing
- Proficiency in scripting (PowerShell, Python) and Infrastructure as Code (IaC)
- Solid understanding of networking, security, and identity management in cloud environments
- Strong communication and client-facing skills

Preferred Qualifications:
- Azure or AWS certifications (e.g., Azure Solutions Architect, AWS Data Analytics Specialty)
- Experience with CI/CD pipelines and DevOps practices
- Familiarity with containerization (Docker, Kubernetes) and orchestration
- Exposure to data governance tools and frameworks

Required Skills: Azure, Microsoft Azure, Azure PaaS, AWS Glue
Posted 16 hours ago
7.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Job Summary: We are seeking an experienced Senior Oracle Data Integrator (ODI) Developer with over 3 years of experience. The ideal candidate will have a strong background in data integration and transformation projects, with previous experience in Oracle Business Intelligence Enterprise Edition (OBIEE). This role involves designing, implementing, and managing advanced ODI solutions to meet complex business requirements.

Key Responsibilities:
- Design and Development: architect, develop, and optimize ODI solutions, including mappings, packages, procedures, and scenarios for ETL processes
- Implementation: lead the implementation and deployment of ODI projects, ensuring they meet design specifications and business requirements
- Data Integration: execute data integration tasks to consolidate data from various sources into a unified structure
- OBIEE Integration: leverage experience with OBIEE to ensure effective integration and data flow between ODI and OBIEE
- Optimization: continuously optimize ODI processes for performance and efficiency, ensuring minimal downtime and high availability
- Collaboration: collaborate with business analysts, data scientists, and other stakeholders to understand data requirements and deliver effective solutions
- Documentation: create and maintain comprehensive documentation for ODI solutions, including design specifications, test plans, and user manuals
- Support: provide advanced support and troubleshooting for ODI solutions, resolving complex issues in a timely manner
- Best Practices: implement and promote best practices in ODI development and data integration to ensure data integrity and quality
- Mentorship: mentor and guide junior developers, fostering a culture of continuous learning and improvement

Required Skills and Experience:
- ODI Expertise: minimum of 4-5 years of experience with Oracle Data Integrator (ODI)
- Data Integration: deep understanding of data integration and ETL processes
- OBIEE Experience: extensive experience with Oracle Business Intelligence Enterprise Edition (OBIEE)
- Technical Proficiency: proficient in SQL, PL/SQL, and other relevant programming languages
- Analytical Skills: excellent analytical and problem-solving skills with the ability to handle complex data challenges
- Communication: strong verbal and written communication skills, with the ability to explain complex technical concepts to non-technical stakeholders
- Team Collaboration: proven ability to work effectively in a collaborative team environment
- Leadership: demonstrated leadership skills with the ability to mentor and guide junior developers
- Adaptability: flexible and able to manage multiple priorities in a fast-paced environment
- Immediate Availability: preference for candidates who can join immediately or within a short notice period

Education:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field

Preferred Qualifications:
- Relevant certifications in ODI and OBIEE
- Enterprise Project Experience: extensive experience working on large-scale enterprise projects

If you are a seasoned ODI Developer with a passion for data integration and a solid background in OBIEE, we would love to hear from you. Join our team and contribute to leveraging data to drive business success.
Posted 16 hours ago
6.0 - 8.0 years
9 - 13 Lacs
Kolkata
Remote
Role Description: This is a remote contract role for a Data Engineer (Ontology, 5+ yrs) at Zorba AI. We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:
Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes
Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles
- Create data models that are scalable, flexible, and adaptable to changing business needs
- Integrate data models with existing data infrastructure and applications
Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems
Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs
- Define and implement data governance processes and standards for ontology development and maintenance
Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions
- Communicate complex technical concepts clearly and effectively to diverse audiences

Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Science, or a related field
Experience:
- 5+ years of experience in data engineering or a related role
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL
- Proficiency in Python, SQL, and other programming languages used for data engineering
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus

Desired Skills:
- Familiarity with machine learning and natural language processing techniques
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP)
- Experience with Databricks technologies, including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon
- Strong problem-solving and analytical skills
- Excellent communication and interpersonal skills
Posted 17 hours ago
5.0 - 8.0 years
20 - 25 Lacs
Mohali, Pune
Work from Office
Experience with Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, Azure Storage, SQL, Git, CI/CD, Azure DevOps, RESTful APIs, Data APIs, event-driven architecture, and data governance, lineage, security, and privacy best practices. Immediate joiners only.
Required Candidate profile : Data Warehousing, Data Lake, Azure Cloud Services, Azure DevOps; ETL (SSIS, ADF, Synapse); SQL Server, Azure SQL; data transformation, modelling, and integration; Microsoft Certified: Azure Data Engineer.
Posted 1 day ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Duck Creek Policy
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to guarantee the quality and functionality of the applications you create, while continuously seeking ways to enhance existing systems and processes.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure adherence to best practices and standards.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Duck Creek Policy.
- Strong understanding of application development methodologies.
- Experience with software testing and debugging techniques.
- Familiarity with database management and data integration.
- Ability to work collaboratively in a team-oriented environment.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Duck Creek Policy.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification : 15 years full time education
Posted 1 day ago
6.0 - 8.0 years
9 - 13 Lacs
Chennai
Work from Office
This is a remote contract role for a Data Engineer (Ontology, 5+ years) at Zorba AI. We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and the Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.
Responsibilities :
Ontology Development :
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.
Data Modeling :
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.
Knowledge Graph Implementation :
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.
Data Quality and Governance :
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.
Collaboration and Communication :
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.
Qualifications :
Education : Bachelor's or master's degree in Computer Science, Data Science, or a related field.
Experience : 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.
Desired Skills :
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies, including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
Posted 1 day ago
5.0 - 10.0 years
6 - 10 Lacs
Mumbai
Work from Office
Job Title : CDP Consultant, Strategy & Consulting Global Network Song
Management Level : 09 - Consultant
Location : Mumbai, Bangalore, Gurgaon, Hyderabad, Pune
Must have skills : Customer Data Platform, Adobe Experience Platform, Digital Audience Management
Good to have skills : Salesforce Marketing Cloud, Tealium
Job Summary : As a Digital Marketing Platform Consultant, you will be responsible for designing, building, and implementing strategies to enhance business performance using Customer Data Platforms and other digital marketing tools. Your typical day will involve working with clients to solve customer-facing challenges in sales, service, and marketing, and developing solutions to meet those requirements.
Roles & Responsibilities :
- Conduct assessments of clients' current data management and Customer Data Platform (CDP) capabilities.
- Provide recommendations on optimizing the current stack and develop use cases to improve utilization.
- Gather business requirements, then strategize and build the client's digital marketing data management program so that it delivers high-performance profiles for segmentation, personalization, and activation across digital channels.
- Identify different data sources and approaches to enrich customer profiles/segments in the CDP, and define use cases across multiple channels/destinations.
- Design the data cleaning, transformation, and integration approach from various data sources into the CDP.
- Evaluate and recommend new ways of managing data through the CDP and how it connects into the broader marketing technology landscape (campaign management and insight generation).
- Work as a business analyst or functional consultant as part of CDP implementations.
- Plan, design, implement, and integrate CDP solutions such as Adobe Experience Platform Real-Time CDP, and work with partners to leverage the data management capabilities these solutions offer.
- Understand and convey the impact of regulations such as CCPA and GDPR, as well as the client's own internal or industry regulations, on data gathering and management. Guide clients toward privacy by design and consent management for customer profile activation across digital marketing channels.
- Understand and design integrations (including API-based and file-based integrations) between tools/platforms of different vendors, MarTech/AdTech components, and data sources.
- Stay abreast of the MarTech landscape and engage ecosystem players/vendors to ensure best-in-class digital audience management capabilities for clients. Propose and conduct pilots as needed.
Professional & Technical Skills :
- Must Have Skills: Proficiency in Customer Data Platform, Adobe Experience Platform, Digital Audience Management.
- Strong understanding of marketing processes, data management, data strategy, data integration, marketing and advertising technologies, and their integrations with other technologies.
- Experience in developing and maintaining digital marketing data management programs.
- Experience in debugging and troubleshooting digital marketing platforms.
- Experience working with marketing technology solutions such as Salesforce Marketing Cloud and Tealium.
Additional Information : The ideal candidate will possess a strong educational background in business administration or a related field, along with a proven track record of delivering impactful solutions using digital marketing platforms. This position is based at our Mumbai, Bangalore, Gurgaon, Hyderabad, or Pune office.
About Our Company | Accenture
Qualification
Experience : Minimum 5 years of experience is required.
Educational Qualification : Master of Business Administration / Post Graduate Diploma in Management; marketing platform certifications such as Adobe Experience Platform, Salesforce Marketing Cloud, Tealium.
Posted 1 day ago
3.0 - 5.0 years
15 - 25 Lacs
Chennai
Work from Office
Professional with 3 to 5 years of experience in geospatial analysis, focusing on Networx - Pricer and Networx - Facets Pricer. The candidate will work in a hybrid model, primarily during day shifts. This role involves analyzing spatial data to support strategic decision-making and enhance operational efficiency.
Experience : 3 - 5 years
Technical Skills : Networx - Pricer, Networx - Facets Pricer, NetworX
Responsibilities:
- Analyze geospatial data to identify patterns and trends that can inform strategic decisions.
- Collaborate with cross-functional teams to integrate geospatial insights into business processes.
- Develop and maintain geospatial databases to ensure data accuracy and accessibility.
- Utilize Networx - Pricer and Networx - Facets Pricer to optimize pricing strategies and improve cost efficiency.
- Create detailed reports and visualizations to communicate geospatial findings to stakeholders.
- Provide technical support and training to team members on geospatial tools and methodologies.
- Implement best practices for data management and analysis to enhance data quality and reliability.
- Conduct regular audits of geospatial data to ensure compliance with industry standards.
- Stay updated with the latest advancements in geospatial technology and incorporate them into existing processes.
- Collaborate with IT teams to ensure seamless integration of geospatial systems with other enterprise applications.
- Monitor and evaluate the performance of geospatial solutions to identify areas for improvement.
- Support the development of new geospatial products and services to meet evolving business needs.
- Contribute to the company's sustainability initiatives by leveraging geospatial data to optimize resource allocation.
Qualifications:
- Strong analytical skills, with experience in Networx - Pricer and Networx - Facets Pricer.
- Proficiency in geospatial analysis and data visualization techniques.
- A solid understanding of database management and data integration processes.
- Excellent communication skills to effectively convey complex geospatial concepts.
- Ability to work collaboratively in a hybrid work environment.
- A proactive approach to problem-solving and continuous improvement.
Posted 1 day ago
7.0 - 12.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Adobe Experience Platform (AEP)
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As an Application Developer, you will design, build, and configure applications to meet business process and application requirements in a dynamic work environment.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the application development process.
- Implement innovative solutions to enhance application functionality.
- Conduct regular code reviews and ensure coding standards are met.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Adobe Experience Platform (AEP).
- Strong understanding of data integration and API development.
- Experience with cloud-based application development.
- Hands-on experience with front-end and back-end development.
- Knowledge of Agile methodologies for software development.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Adobe Experience Platform (AEP).
- This position is based at our Pune office.
- 15 years of full-time education is required.
Qualification : 15 years full time education
Posted 1 day ago
7.0 - 12.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Adobe Experience Platform (AEP)
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions and ensuring seamless application functionality.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the team in implementing cutting-edge technologies.
- Conduct regular code reviews and provide constructive feedback.
- Stay updated on industry trends and best practices.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Adobe Experience Platform (AEP).
- Strong understanding of data integration and data modeling.
- Experience in developing scalable and secure applications.
- Knowledge of cloud technologies and services.
- Hands-on experience with API development and integration.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Adobe Experience Platform (AEP).
- This position is based at our Pune office.
- 15 years of full-time education is required.
Qualification : 15 years full time education
Posted 1 day ago
15.0 - 25.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : SAP Sales and Distribution (SD)
Good to have skills : NA
Minimum 15 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring that the applications are aligned with the needs of the organization and contribute to its overall success. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, designing and implementing solutions, and ensuring the quality and performance of the applications.
Roles & Responsibilities:
- Expected to be an SME with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Collaborate with business stakeholders to understand their requirements and translate them into technical specifications.
- Design, develop, and test applications based on the defined requirements.
- Ensure the applications are scalable, reliable, and secure.
- Troubleshoot and debug issues in the applications and provide timely resolutions.
- Collaborate with cross-functional teams to integrate applications with other systems.
- Stay updated with the latest industry trends and technologies to continuously improve the applications.
- Provide technical guidance and mentorship to junior developers.
Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP Sales and Distribution (SD).
- Strong understanding of business processes and application requirements.
- Experience in designing and building applications using SAP Sales and Distribution (SD).
- Knowledge of SAP modules and their integration with other systems.
- Experience in troubleshooting and resolving technical issues in SAP Sales and Distribution (SD).
- Good To Have Skills: Experience with the SAP ABAP programming language.
- Experience in SAP implementation projects.
- Knowledge of SAP Fiori and UI5 development.
- Experience in SAP S/4HANA migration.
- Solid grasp of data migration and data integration techniques.
Additional Information:
- The candidate should have a minimum of 15 years of experience in SAP Sales and Distribution (SD).
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Qualification : 15 years full time education
Posted 1 day ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Ab Initio
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct regular knowledge-sharing sessions within the team.
- Stay updated on the latest industry trends and technologies.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes.
- Experience with data warehousing concepts.
- Hands-on experience with data integration tools.
- Knowledge of data quality and data governance principles.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification : 15 years full time education
Posted 1 day ago
5.0 - 10.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role : Software Development Engineer
Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills : SAP BusinessObjects Data Services
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Software Development Engineer, you will analyze, design, code, and test multiple components of application code across one or more clients. You will also perform maintenance, enhancement, and/or development work throughout the day.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead team meetings to discuss progress and challenges.
- Mentor junior team members to enhance their skills.
- Stay updated on industry trends and technologies to suggest improvements.
Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP BusinessObjects Data Services.
- Strong understanding of ETL processes.
- Experience with data integration and data quality management.
- Hands-on experience in data modeling and database design.
- Knowledge of SAP systems and their integration with other platforms.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification : 15 years full time education
Posted 1 day ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Microsoft Azure Databricks
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Microsoft Azure Databricks.
- Strong understanding of cloud computing concepts and services.
- Experience with application development frameworks and methodologies.
- Familiarity with data integration and ETL processes.
- Ability to troubleshoot and optimize application performance.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification : 15 years full time education
Posted 1 day ago