10.0 - 17.0 years
0 Lacs
bengaluru
Work from Office
Job Description: Complete data ingestion, data pipelines, data lineage, data quality, data warehouse, data governance, and data reconciliation. Essential Skills: Must have data architect experience and knowledge. Data Architect with over 10 years of hands-on experience in designing, developing, and managing large-scale data solutions. Proven expertise in building and optimizing ETL pipelines. Strong in data preprocessing and enhancing data quality. Extracting events and processing large datasets (5 billion+ records) within a Spark-Hadoop cluster. Automated data processing tasks for a DaaS (Data as a Service) project, streamlining workflow efficiency. Configured file and client setups, ensuring...
Posted 2 days ago
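The data reconciliation work this listing asks for can be illustrated with a minimal sketch. This is plain Python over hypothetical record sets and keys, not any specific tool from the listing:

```python
# Minimal data-reconciliation sketch: compare a source extract against a
# warehouse load by key, flagging missing and mismatched records.
# All record shapes, keys, and amounts are hypothetical.

def reconcile(source, target, key="id"):
    """Return ids missing from target and ids whose payloads differ."""
    tgt = {row[key]: row for row in target}
    missing, mismatched = [], []
    for row in source:
        other = tgt.get(row[key])
        if other is None:
            missing.append(row[key])
        elif row != other:
            mismatched.append(row[key])
    return missing, mismatched

source = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}, {"id": 3, "amount": 75}]
target = [{"id": 1, "amount": 100}, {"id": 2, "amount": 200}]

missing, mismatched = reconcile(source, target)
print(missing)      # [3]  -> present in source, absent from target
print(mismatched)   # [2]  -> loaded, but the payload differs
```

A production job would do the same comparison at scale (e.g. as a join in Spark or SQL), but the control flow is the same: index one side by key, then classify each record on the other side.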
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
In this role as a Data Steward at Novartis in Hyderabad, you will ensure data curation by applying appropriate standard Quality & Security checks. You will help onboard data assets to the FAIRification process by collaborating with relevant stakeholders and systems to identify data and metadata. Your responsibilities will also include assisting in the creation of Uniform Resource Identifiers (URIs) for data and converting existing models or creating new models into semantic models following any UML standard. You will work towards aligning PLS data services to fully support the Data Strategy and the DLC products/assets. Additionally, you will be responsible for ensuring compliance with Good M...
Posted 3 weeks ago
12.0 - 16.0 years
0 Lacs
telangana
On-site
As a MongoDB developer at the company, your role will involve the following responsibilities: - Design, develop, and manage scalable database solutions using MongoDB - Write robust, effective, and scalable queries and operations for MongoDB-based applications - Integrate third-party services, tools, and APIs with MongoDB for data management and processing - Collaborate with developers, data engineers, and stakeholders to ensure seamless integration of MongoDB with applications and systems - Run unit, integration, and performance tests to ensure the stability and functionality of MongoDB implementations - Conduct code and database reviews, ensuring adherence to security, scalability, and best...
Posted 3 weeks ago
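The "robust, effective, and scalable queries" this MongoDB role describes are often aggregation pipelines. A minimal sketch of one, built as plain Python dicts in the shape pymongo would accept (collection and field names are hypothetical, and no MongoDB server is assumed here, so the pipeline is only constructed and inspected):

```python
# Sketch of a MongoDB aggregation pipeline as it might be passed to
# db.orders.aggregate(pipeline) via pymongo. The "orders" collection and
# its fields (status, customer_id, amount) are hypothetical examples.

pipeline = [
    {"$match": {"status": "shipped"}},                 # filter documents first
    {"$group": {"_id": "$customer_id",                 # group by customer
                "total": {"$sum": "$amount"}}},        # sum order amounts
    {"$sort": {"total": -1}},                          # largest spenders first
    {"$limit": 10},                                    # top 10 only
]

# Stage names in order -- a quick structural check useful in code review.
stages = [next(iter(stage)) for stage in pipeline]
print(stages)  # ['$match', '$group', '$sort', '$limit']
```

Putting `$match` before `$group` matters for scalability: it lets MongoDB use indexes and reduces the number of documents the grouping stage must process.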
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
You will play a crucial role in designing, developing, and maintaining cloud-based data infrastructure to support BFSI customers at LUMIQ. Your responsibilities will include: - Designing, developing, and implementing data pipelines, ETL processes, and data integration solutions. - Collaborating with cross-functional teams to design scalable data models and architectures aligned with BFSI industry needs. - Optimizing data storage, processing, and retrieval for maximum performance and cost-efficiency in cloud environments. - Implementing data security and compliance measures to protect sensitive BFSI data integrity. - Working closely with data scientists and analysts to enable seamless access ...
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Technical Project Delivery Specialist, your role involves being accountable for delivering the product architectural strategy and vision within the data team. You will also be responsible for delivering IT management solutions to realize business benefits. Here are the key responsibilities associated with this role: - Understand customer needs and collaborate closely with Business IT functions to validate the value proposition and fit to requirements of products and services. - Develop hands-on, in-depth knowledge of competitive products and maintain technical analysis of their strengths and weaknesses. Stay updated on relevant technologies and their potential impact on ongoing innovati...
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As an experienced professional with 3+ years of experience in building and operating production-grade applications and services across the stack, you will be responsible for the following key responsibilities: - Demonstrating strong programming skills in Python and/or Scala and SQL to write modular, testable, and well-documented code for batch and streaming workloads. - Hands-on experience with modern data engineering stacks including: - Distributed processing using Apache Spark (Databricks preferred), PySpark/Scala. - Orchestration with Azure Data Factory or Apache Airflow; implementing event-driven patterns with Azure Functions/Logic Apps. - Working with storage & formats such as Delta Lak...
Posted 1 month ago
4.0 - 8.0 years
15 - 25 Lacs
hyderabad, gurugram, bengaluru
Work from Office
Vulnerability Response (VR) module Must have: * Integrations: Implement and manage integrations between ServiceNow and third-party vulnerability scanners, such as Qualys, Tenable, and Rapid7, to ensure accurate ingestion of vulnerability data. * Automation: Automate VR processes for triage, prioritization, assignment, and remediation tracking using ServiceNow Flow Designer, Orchestration, and scripting. * CMDB and CI Correlation: Configure and maintain correlation rules between vulnerable items and Configuration Items (CIs) in the Configuration Management Database (CMDB).
Posted 1 month ago
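The CMDB correlation this listing describes amounts to joining vulnerable items to configuration items on a shared attribute. A minimal illustration in plain Python (this is not the ServiceNow API; all records, field names, and CVE ids are hypothetical):

```python
# Illustrative correlation of vulnerable items to configuration items (CIs)
# by IP address, mimicking the CMDB correlation rules the listing describes.
# Plain Python, not ServiceNow scripting; every record here is made up.

cis = [
    {"sys_id": "ci1", "ip": "10.0.0.5", "name": "web-server-01"},
    {"sys_id": "ci2", "ip": "10.0.0.9", "name": "db-server-01"},
]
vulnerable_items = [
    {"vuln": "CVE-2024-0001", "ip": "10.0.0.5"},
    {"vuln": "CVE-2024-0002", "ip": "10.0.0.7"},   # no matching CI
]

# Index CIs by the correlation key, then annotate each vulnerable item.
ci_by_ip = {ci["ip"]: ci["sys_id"] for ci in cis}
correlated = [
    {**vi, "ci": ci_by_ip.get(vi["ip"])}  # None when no CI matches
    for vi in vulnerable_items
]
print([c["ci"] for c in correlated])  # ['ci1', None]
```

Unmatched items (the `None` case) are exactly the records a real VR workflow would route to a manual-triage queue, since remediation cannot be assigned without an owner CI.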
3.0 - 7.0 years
0 Lacs
ahmedabad, gujarat
On-site
As an Adobe RTCDP and Adobe Target Expert, your role will involve designing, configuring, and managing the RTCDP solution and implementing personalization activities on the web experience. You should deeply understand RTCDP principles and technologies, with a focus on practical implementation to deliver successful outcomes. Your expertise will be crucial in leveraging the platform to drive value for the business through data-driven insights and optimized customer journeys. Your key responsibilities will include: - Designing and implementing RTCDP solutions, including data schema creation, identity resolution, audience segmentation, and activation. - Ingesting and transforming data from vario...
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
Role Overview: As an experienced professional with over 10 years of total experience, you are seeking a new opportunity to showcase your expertise in Python, LLM integration, and GenAI development workflows. Your strong knowledge of prompt engineering, vector databases, retrieval-augmented generation (RAG), and toolchains sets you apart. You have hands-on experience with Agentic AI and OpenAI, as well as agent frameworks like LangChain Agents, Semantic Kernel, Haystack, or custom-built agents. Key Responsibilities: - Lead LLM/GenAI projects from Proof-of-Concept to Production - Understand clients' business use cases and technical requirements to create elegant technical designs - Map decisi...
Posted 1 month ago
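The retrieval step of the RAG workflow this listing names can be sketched in a few lines: rank stored chunks by cosine similarity to a query embedding. The vectors and document ids below are toy values; a real system would use an embedding model and a vector database rather than hand-written coordinates:

```python
# Minimal RAG retrieval sketch: rank stored chunks by cosine similarity
# to a query vector. All vectors and document ids are hypothetical.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

store = {                       # stand-in for a vector database
    "doc_a": [0.9, 0.1, 0.0],
    "doc_b": [0.1, 0.9, 0.0],
    "doc_c": [0.7, 0.3, 0.0],
}

def retrieve(query_vec, k=2):
    """Return the ids of the k chunks most similar to the query."""
    ranked = sorted(store, key=lambda d: cosine(query_vec, store[d]), reverse=True)
    return ranked[:k]

print(retrieve([1.0, 0.0, 0.0]))  # ['doc_a', 'doc_c']
```

The retrieved chunk texts would then be placed into the LLM prompt as context, which is the "augmented generation" half of RAG.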
3.0 - 8.0 years
6 - 14 Lacs
gurugram
Work from Office
My LinkedIn: linkedin.com/in/yashsharma1608. Contract period: 6-12 months. Payroll will be ASV Consulting (my company); the client will be disclosed after round 1. Job location: Gurgaon, onsite (WFO). Budget: up to 1 LPA/month, depending on last relevant hike. Experience: 3+ years. About the Role: We are looking for a skilled AWS Data Engineer with strong hands-on experience in AWS Glue and AWS analytics services. The candidate will be responsible for designing, building, and optimizing scalable data pipelines and ETL processes that support advanced analytics and business intelligence requirements. Key Responsibilities: Design and develop ETL pipelines using AWS Glue, PySpark, and AWS services (Lambda, S...
Posted 2 months ago
7.0 - 12.0 years
15 - 30 Lacs
noida
Work from Office
5+ years of relevant experience in Scala/Python (PySpark), distributed databases, and Kafka, with solid hands-on multi-threading, functional programming, etc. A good understanding of CS fundamentals, data structures, algorithms, and problem solving. Professional hands-on experience in SQL and query optimization. Experience in building frameworks for data ingestion and consumption patterns. Expertise with GCP cloud and GCP data processing tools, platforms, and technologies such as GCS, Dataproc, DPaaS, BigQuery, Hive, etc. Keywords: AWS, Glue, DevOps, PySpark, ETL, pipeline, AI, Python
Posted 2 months ago
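The multi-threading and ingestion-framework experience this listing asks for can be sketched with Python's standard library. The partition reader below is a hypothetical stand-in for an I/O-bound source such as a Kafka partition or GCS object:

```python
# Sketch of a multi-threaded ingestion pattern: fetch several hypothetical
# partitions concurrently, then flatten the batches into one record list.
from concurrent.futures import ThreadPoolExecutor

def fetch_partition(part_id):
    """Stand-in for an I/O-bound read (e.g. a Kafka partition or GCS object)."""
    return [f"record-{part_id}-{i}" for i in range(3)]

def ingest(partition_ids, workers=4):
    """Read partitions in parallel; map() preserves per-partition order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        batches = pool.map(fetch_partition, partition_ids)
    return [rec for batch in batches for rec in batch]

records = ingest([0, 1])
print(len(records))  # 6
```

Threads suit this pattern because the work is I/O-bound; for CPU-bound transformation at the scale the listing mentions, the same fan-out would instead be delegated to Spark executors.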
7.0 - 12.0 years
15 - 30 Lacs
chennai
Work from Office
5+ years of relevant experience in Scala/Python (PySpark), distributed databases, and Kafka, with solid hands-on multi-threading, functional programming, etc. A good understanding of CS fundamentals, data structures, algorithms, and problem solving. Professional hands-on experience in SQL and query optimization. Experience in building frameworks for data ingestion and consumption patterns. Expertise with GCP cloud and GCP data processing tools, platforms, and technologies such as GCS, Dataproc, DPaaS, BigQuery, Hive, etc. Keywords: AWS, Glue, DevOps, PySpark, ETL, pipeline, AI, Python
Posted 2 months ago
7.0 - 12.0 years
15 - 30 Lacs
bengaluru
Work from Office
5+ years of relevant experience in Scala/Python (PySpark), distributed databases, and Kafka, with solid hands-on multi-threading, functional programming, etc. A good understanding of CS fundamentals, data structures, algorithms, and problem solving. Professional hands-on experience in SQL and query optimization. Experience in building frameworks for data ingestion and consumption patterns. Expertise with GCP cloud and GCP data processing tools, platforms, and technologies such as GCS, Dataproc, DPaaS, BigQuery, Hive, etc. Keywords: AWS, Glue, DevOps, PySpark, ETL, pipeline, AI, Python
Posted 2 months ago
2.0 - 7.0 years
0 - 0 Lacs
bangalore, noida, chennai
On-site
Adobe AEP Developer Adobe AEP Implementation Key Responsibilities Work with development, deployment, and production areas of the business to craft new solutions using Adobe Experience Platform, Adobe Analytics, Audience Manager, and Data Workbench. Interface directly with internal teams to address and manage client requests and communicate status in person, via phone, and/or email. Be able to understand, customize, and optimize the entire customer journey and data management process, including data management solutions for ingestion of customer data from 1st-, 2nd-, and 3rd-party data sources. Capable of developing reports, dashboards, and metrics analysis to deliver actiona...
Posted 2 months ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
We are seeking a highly skilled and experienced Over-the-Top (OTT) Subject Matter Expert (SME) to join our dynamic team. As the OTT SME, you will provide strategic and technical leadership across all aspects of our OTT video platform to ensure reliable and high-quality content delivery to our audience. The ideal candidate for this role will possess in-depth expertise in OTT technologies, a strong understanding of industry trends, and a track record of applying best practices to enhance platform performance and drive innovation. You should hold a Bachelor's degree in Computer Science, Engineering, or a related field, along with at least 7 years of hands-on experience in OTT video streaming. Y...
Posted 3 months ago
6.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Hello, greetings from ZettaMine!! Hiring for Data Engineer. Exp: 6 to 10 years. Location: Bangalore. Looking for immediate joiners only. Job description. Skills required: Python, SQL, ETL, Azure Data Services (ADF, Synapse), Databricks, data pipelines, ingestion (batch & streaming), data migration. Strong in data modelling, transformation, and performance tuning. Interested candidates can share an updated CV at [HIDDEN TEXT]. Thanks & Regards, Afreen
Posted 3 months ago
4.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Developer specializing in data modeling and ingestion, you will be responsible for designing, developing, and managing scalable database solutions using MongoDB. Your role will involve writing robust, effective, and scalable queries and operations for MongoDB-based applications. Additionally, you will integrate third-party services, tools, and APIs with MongoDB for data management and processing. Collaboration will be a key aspect of your job as you work closely with developers, data engineers, and stakeholders to ensure the seamless integration of MongoDB with applications and systems. You will also be involved in running unit, integration, and performance tests to guarantee the stabil...
Posted 3 months ago
5.0 - 9.0 years
0 - 17 Lacs
Hyderabad
Work from Office
Experience: 3-5 years of prior Product Management experience working with data warehouses, clouds, or AdTech platforms. Bachelor's or Master's degree in Computer Science, Information Systems, Business, or a related field. Ability to lead demonstrations for technical and non-technical audiences. Deep understanding of APIs and how they work/operate. Proven ability to create product artifacts, including product requirement documents (PRDs), epics, story mapping, OKRs, etc. High-level understanding of the Product Development Lifecycle (PDLC). Basic understanding of coding and software development. Excellent attention to detail. Excellent written and verbal communication skills. Type S(tartup) persona...
Posted 3 months ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Pune, Gurugram
Hybrid
Role: Data Engineer. Experience: 5+ years. Location: Pune, Gurgaon & Bangalore (hybrid). Shift time: 12:00 PM - 10:00 PM. Must have: Experience working in AWS, Redshift, Python. Prior exposure to data ingestion and curation work (such as working with a data lakehouse). Knowledge of SQL for the purpose of data analysis/investigation. Help and support the Data Product Owner to manage and deliver on the product technical roadmap. Ability to digest and understand what the data is, how it is derived, the meaning/context around the data itself, and how the data fits into the NFL's data model. Working knowledge of Confluence and JIRA. Good to have: Master's degree in computer science, statistics, or a related discipline. 5+ year...
Posted 3 months ago
8.0 - 12.0 years
18 - 27 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Job Description: Design, implement, and maintain data pipelines and data integration solutions using Azure Synapse. Develop and optimize data models and data storage solutions on Azure. Collaborate with data scientists and analysts to implement data processing and data transformation tasks. Ensure data quality and integrity through data validation and cleansing methodologies. Monitor and troubleshoot data pipelines to identify and resolve performance issues. Collaborate with cross-functional teams to understand and prioritize data requirements. Stay up-to-date with the latest trends and technologies in data engineering and Azure services. Skills & Qualifications: Bachelor's degree in IT, compute...
Posted 4 months ago
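The validation-and-cleansing responsibility in this listing can be sketched with a small pure-Python pass; in practice the same rules would run inside a Synapse or Spark job. Field names and rejection rules below are hypothetical:

```python
# Minimal data-validation/cleansing pass: drop rows with missing keys or
# empty measures, normalise types, and count rejects for monitoring.
# The "id"/"amount" schema and the rules are hypothetical examples.

def cleanse(rows):
    """Return (clean_rows, rejected_count)."""
    clean, rejected = [], 0
    for row in rows:
        if row.get("id") is None or row.get("amount") in (None, ""):
            rejected += 1          # quarantine instead of silently dropping
            continue
        clean.append({"id": int(row["id"]), "amount": float(row["amount"])})
    return clean, rejected

raw = [
    {"id": "1", "amount": "10.5"},
    {"id": None, "amount": "3.0"},   # missing key  -> rejected
    {"id": "2", "amount": ""},       # empty measure -> rejected
]
clean, rejected = cleanse(raw)
print(len(clean), rejected)  # 1 2
```

Emitting the reject count alongside the clean batch is what makes pipeline monitoring possible: a sudden spike in rejects is usually the first visible symptom of an upstream schema change.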
5.0 - 10.0 years
8 - 18 Lacs
Bengaluru
Work from Office
Data Engineers/Analysts who can create the data models for Apromore ingestion
Posted 4 months ago
5.0 - 10.0 years
12 - 22 Lacs
Bengaluru
Remote
Role & responsibilities Technical Capability: Foundry Certified (Data Engineering), Foundry Certified (Foundational), Microsoft Certified (Azure AI Fundamentals), Microsoft Certified: Azure Fundamentals, Microsoft Certified: Azure Data Engineer Associate. Ontology Manager, Pipeline Builder, Data Lineage, Object Explorer, SQL, Python & Scala. Good knowledge of Azure cloud, ADF & Databricks, Spark (PySpark & Scala Spark). Troubleshooting jobs & finding the root cause of issues. Advanced ETL pipeline design for data ingestion & egress for batch data. Experience: 4+ years. Soft Skills: Good communication skills. Good documentation skills for drafting problem definition and solutio...
Posted 5 months ago