9690 Big Data Jobs - Page 16

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

6.0 - 8.0 years

13 - 17 Lacs

bengaluru

Work from Office

Job Description: About Diageo: Diageo is the world's leading premium drinks company with an outstanding collection of brands, such as Johnnie Walker, Smirnoff, Baileys, Captain Morgan, Tanqueray and Guinness. With over 200 brands in 180 countries and a global network of daring individuals, our teams blend a diverse range of experience, knowledge and skills. We connect customers and consumers to our iconic products and build innovative experiences that bring people together to celebrate life. Who we are: The Digital & Technology vision is to lead the way in the transformation of Diageo's business capabilities, leveraging data & digital technology to build a competitive edge in the marketplace. About...

Posted 4 days ago

8.0 - 13.0 years

25 - 30 Lacs

bengaluru

Work from Office

Who we are: About Stripe. About the team: The Batch Compute team at Stripe manages the infrastructure, tooling and systems behind running batch processing systems at Stripe, which are currently powered by Hadoop and Spark. Batch processing systems power several core asynchronous workflows at Stripe and operate at significant scale. We're looking for a Software Engineer with experience designing, building and maintaining high-scale, distributed systems. You will work with a team that is in charge of the core infrastructure used by the product teams to build and operate batch processing jobs. You will have an opportunity to play a hands-on role in significantly rearchitecting our current infrastru...
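
As a rough illustration of the kind of batch workload such a team operates, here is a minimal PySpark batch job; the paths, schema and aggregation are hypothetical and not taken from the posting.

```python
# Minimal PySpark batch job sketch (illustrative only; paths and schema are hypothetical).
from pyspark.sql import SparkSession, functions as F

def run_batch(input_path: str, output_path: str) -> None:
    spark = SparkSession.builder.appName("daily-batch-aggregation").getOrCreate()

    # Read raw event data (assumed Parquet layout).
    events = spark.read.parquet(input_path)

    # Aggregate per account per day -- a typical asynchronous batch workload.
    daily = (
        events
        .withColumn("day", F.to_date("event_ts"))
        .groupBy("account_id", "day")
        .agg(F.count("*").alias("event_count"),
             F.sum("amount").alias("total_amount"))
    )

    daily.write.mode("overwrite").partitionBy("day").parquet(output_path)
    spark.stop()

if __name__ == "__main__":
    run_batch("s3://example-bucket/events/", "s3://example-bucket/daily_agg/")
```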

Posted 4 days ago

6.0 - 9.0 years

20 - 27 Lacs

kochi, thiruvananthapuram

Work from Office

Key Responsibilities: Design, develop, and optimize ETL pipelines using PySpark on Google Cloud Platform (GCP). Work with BigQuery, Cloud Dataflow, Cloud Composer (Apache Airflow), and Cloud Storage for data transformation and orchestration. Develop and optimize Spark-based ETL processes for large-scale data processing. Implement best practices for data governance, security, and monitoring in a cloud environment. Collaborate with data engineers, analysts, and business stakeholders to understand data requirements. Troubleshoot performance bottlenecks and optimize Spark jobs for efficient execution. Automate data workflows using Apache Airflow or Cloud Composer. Ensure data quality, valid...
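
A minimal sketch of one such PySpark ETL step on GCP, assuming the spark-bigquery-connector is available (for example on Dataproc); the project, dataset, table and bucket names are placeholders, not details from the posting.

```python
# Illustrative PySpark ETL step on GCP (assumes the spark-bigquery-connector is available,
# e.g. on Dataproc; project, dataset and bucket names are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gcp-etl-example").getOrCreate()

# Extract: read a source table from BigQuery.
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales.orders")
    .load()
)

# Transform: basic cleansing and enrichment.
clean = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .withColumn("order_month", F.date_trunc("month", F.col("order_ts")))
)

# Load: write the result back to BigQuery via a staging GCS bucket.
(
    clean.write.format("bigquery")
    .option("table", "my-project.analytics.orders_clean")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```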

Posted 4 days ago

5.0 - 8.0 years

10 - 14 Lacs

pune

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Do: Oversee and support the process by reviewing daily transactions on performance parameters; review the performance dashboard and the scores for the team; support the team in improving performance parameters by providing technical support and process guidance; record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions; ensure standard processes and procedures are followed to resolve all client queries; resolve cl...

Posted 4 days ago

5.0 - 8.0 years

8 - 12 Lacs

hyderabad

Work from Office

Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Do: Oversee and support the process by reviewing daily transactions on performance parameters; review the performance dashboard and the scores for the team; support the team in improving performance parameters by providing technical support and process guidance; record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions; ensure standard processes and procedures are followed to resolve all client queries; resolve cl...

Posted 4 days ago

5.0 - 10.0 years

22 - 27 Lacs

bengaluru

Work from Office

We are seeking an AI Builder for our TAI initiative to develop a conversational bot using a RAG framework and LLMs, develop new models, and integrate GenAI to analyze the data. Positions in this function produce innovative solutions driven by exploratory data analysis from unstructured, diverse datasets typically measured in gigabytes or larger. Applies knowledge of statistics, machine learning, programming, data modeling, simulation, and advanced mathematics to recognize patterns, identify opportunities, pose business questions, and make valuable discoveries leading to prototype development and product improvement. Uses a flexible, analytical approach to design, develop, and evaluate predictive...
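
For orientation only, a toy sketch of the retrieval step behind a RAG-style bot; it uses TF-IDF similarity instead of real embeddings, and `call_llm` is a hypothetical placeholder rather than any specific LLM API.

```python
# Toy sketch of the retrieval step in a RAG-style bot (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Refunds are processed within 5 business days.",
    "Premium accounts include priority support.",
    "Passwords can be reset from the account settings page.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the question."""
    q_vec = vectorizer.transform([question])
    scores = cosine_similarity(q_vec, doc_matrix)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

def call_llm(prompt: str) -> str:
    # Placeholder for a real LLM client call (hypothetical).
    return f"[LLM answer grounded in]: {prompt[:80]}..."

def answer(question: str) -> str:
    # Ground the generation step in the retrieved context.
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return call_llm(prompt)

print(answer("How long do refunds take?"))
```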

Posted 4 days ago

9.0 - 12.0 years

5 - 5 Lacs

thiruvananthapuram

Work from Office

Job Description Job Summary: We are looking for a proactive and technically strong Lead Gen AI Engineer (10+ years' experience) to lead a high-performing offshore team working on Gen AI solutions in the real estate domain. This role requires close collaboration with the Gen AI Architect and direct accountability for delivery, technical execution, and mentoring the team across various Gen AI initiatives. Key Responsibilities: - Lead the offshore team in developing, testing, and deploying Gen AI and NLP-based applications. - Collaborate closely with the Lead Architect to translate high-level designs into actionable workstreams. - Develop and guide the team on best practices in GPT-based development...

Posted 4 days ago

3.0 - 7.0 years

4 - 7 Lacs

hyderabad

Work from Office

Role Description: As a Senior Data Engineer at Amgen, you will be responsible for managing and optimizing the company's data infrastructure and architecture. You will design and implement data pipelines, develop data models, perform data integration, and ensure data quality and governance. Your expertise in data engineering, big data technologies, and data manipulation will contribute to the effective storage, processing, and utilization of large-scale data sets. Key Responsibilities: Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Engineering for Biotech or Pharma functional knowledg...

Posted 4 days ago

7.0 - 8.0 years

10 - 15 Lacs

ahmedabad

Remote

Contract Duration: 4 Months (Extendable based on Performance) Job Timings: India Evening Shift (till 11:30 PM IST) Job Description: We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities: - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, ...
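
A minimal sketch of the kind of pipeline step described, assuming a Databricks runtime with Delta Lake; the storage path and table names are placeholders.

```python
# Illustrative Databricks pipeline step (assumes a Databricks runtime with Delta Lake;
# path and table names are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

# Ingest raw data landed on cloud storage (path is a placeholder).
raw = spark.read.json("s3://example-landing-zone/transactions/")

# Standardize and deduplicate before publishing to the curated layer.
curated = (
    raw
    .withColumn("ingest_date", F.current_date())
    .dropDuplicates(["transaction_id"])
)

# Write as a Delta table so downstream jobs get ACID guarantees and time travel.
(
    curated.write.format("delta")
    .mode("append")
    .partitionBy("ingest_date")
    .saveAsTable("curated.transactions")
)
```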

Posted 4 days ago

7.0 - 8.0 years

10 - 15 Lacs

thane

Remote

Contract Duration: 4 Months (Extendable based on Performance) Job Timings: India Evening Shift (till 11:30 PM IST) Job Description: We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities: - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, ...

Posted 4 days ago

7.0 - 8.0 years

10 - 15 Lacs

mumbai

Remote

Contract Duration: 4 Months (Extendable based on Performance) Job Timings: India Evening Shift (till 11:30 PM IST) Job Description: We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities: - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productionize and deploy Big Data platforms and applications across multi-cloud environments (AWS, Azure, ...

Posted 4 days ago

0.0 - 2.0 years

3 - 5 Lacs

hyderabad

Work from Office

The role is responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. This role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes. Roles & Responsibilities: Analyze complex datasets (pricing, rebates, provider contracting, market access) using Excel, SQL, and Databricks to create reports, dashboards, and analytic...

Posted 4 days ago

4.0 - 9.0 years

5 - 15 Lacs

chennai

Hybrid

Immediate opening for "Python & Big Data Engineering". Job Title: Python & Big Data Engineer. Job Location: Chennai (Hybrid). Experience Required: 4-9 Years. Job Requirement: Strong proficiency in Python with OOP and functional programming experience. Experience building multi-module, package-structured applications. Expertise in unit testing using pytest / unittest. Experience managing environments using venv / conda / pyenv. Proficient in SQL for complex data querying & performance optimization. Hands-on experience with Teradata for data extraction & transformation. (Highly Preferred) Working knowledge of Hadoop ecosystem components. PySpark / Apache Spark experience for distributed data p...
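
A short sketch of the unit-testing style the requirement points at, using pytest against a small PySpark transform; the function and column names are illustrative only.

```python
# Sketch of a pytest-style unit test for a small PySpark transformation
# (function and column names are illustrative).
import pytest
from pyspark.sql import SparkSession, functions as F

def add_total(df):
    """Example transform under test: total = quantity * unit_price."""
    return df.withColumn("total", F.col("quantity") * F.col("unit_price"))

@pytest.fixture(scope="session")
def spark():
    # Local Spark session so tests run without a cluster.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

def test_add_total(spark):
    df = spark.createDataFrame([(2, 5.0), (3, 1.5)], ["quantity", "unit_price"])
    result = {r["quantity"]: r["total"] for r in add_total(df).collect()}
    assert result == {2: 10.0, 3: 4.5}
```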

Posted 4 days ago

8.0 - 13.0 years

6 - 24 Lacs

bengaluru

Work from Office

Responsibilities: Design, develop & maintain data pipelines using Azure Data Factory & SQL. Optimize performance through data modeling & query optimization. Scala and Gradle experience is mandatory; please do not apply without it.

Posted 4 days ago

2.0 - 5.0 years

4 - 7 Lacs

karnataka

Work from Office

Description: The employee must have experience in Confluent Kafka configuration and orchestration of containers using Kubernetes/OpenShift as an orchestration tool. Administration of infrastructure for Tier 1 applications. In-depth knowledge of Kafka administration, with the skills to manage Kafka clusters in production environments. Release Management, Jenkins, Git, Ansible, Linux, AWS/Azure/GCP cloud, Docker, Kubernetes/OpenShift, Tomcat, and shell scripting. Good knowledge of virtualization and container technology like Docker. Experience working with various orchestration platforms, both on-premise and on cloud servers. Designing and deploying applications utilizing Azure/GCP/AWS sta...
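
A minimal producer sketch using the confluent-kafka Python client, for orientation on the Confluent Kafka side of the role; the broker address, topic and settings are placeholders, and a production cluster would normally add SASL/SSL configuration.

```python
# Minimal Confluent Kafka producer sketch using the confluent-kafka Python client
# (broker address, topic and settings are placeholders).
from confluent_kafka import Producer

conf = {
    "bootstrap.servers": "broker-1.example.com:9092",
    "client.id": "example-producer",
    "acks": "all",  # wait for full in-sync-replica acknowledgement
}
producer = Producer(conf)

def on_delivery(err, msg):
    # Called asynchronously once the broker confirms (or rejects) the message.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}] @ offset {msg.offset()}")

producer.produce("orders", key="order-123", value='{"amount": 42}', callback=on_delivery)
producer.poll(0)   # serve delivery callbacks
producer.flush()   # block until all queued messages are delivered
```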

Posted 4 days ago

5.0 - 7.0 years

7 - 9 Lacs

mumbai, hyderabad

Work from Office

We are looking for an Oracle DBA L3 Admin with experience in GoldenGate for our projects based out of Navi Mumbai - Belapur. Oracle DBA L3 Admin (GoldenGate is a must). Exp: 8-12 Yrs. Location: Navi Mumbai. Job Criteria: Oracle DBA L3 Admin (GoldenGate is a must). Educational Qualification: BE/B.Tech (any branch)/MCA/M.Tech/MSC IT/MSC CS. Work Location: Navi Mumbai, Belapur. 24/7 Support Environment. Banking industry experience (Good to have). Documents required for BV purposes (must have EPFO service history records with supporting documents for the past 5 years). Roles & Responsibilities: Ability to work in a 24x7 production environment and provide day-to-day support on an on-call basis. Expertise in Setup, Inst...

Posted 4 days ago

2.0 - 5.0 years

4 - 7 Lacs

maharashtra

Work from Office

Description: Objective: Build and maintain data pipelines and infrastructure to support data storage and processing. Key Responsibilities: - Design, develop and optimize data pipelines using Azure Databricks. - Ensure data reliability, scalability and quality. - Integrate data sources into a unified platform. - Monitor and troubleshoot data workflows. Skills Required: - Expertise in Azure Databricks, Spark and big data tools. - Strong programming skills (e.g. Python, Scala, SQL). - Knowledge of ETL processes and cloud technologies. - Familiarity with CI/CD practices for data pipelines. Named Job Posting? (if Yes - needs to be approved by SCSC). Additional Details: Global Grade: C. Level: To Be Defined. Named J...

Posted 4 days ago

2.0 - 5.0 years

4 - 7 Lacs

maharashtra

Work from Office

Enhance, optimize and maintain existing data ingestion, transformation and extraction pipelines and assets built for reporting and analytics on Cloud (GCP + BigQuery + Confluent Cloud) and Big Data (Cloudera and Confluent Platform) platforms. Work with the Product Owner to understand the priorities and OKRs for the quarter and gather detailed requirements from the initiative owners or program sponsor as per the Epics planned to be delivered in the quarter. Build new and optimized data pipelines and assets to meet the end-user requirements. The data pipelines must adhere to all the architecture, design and engineering principles. Design the data pipelines and assets to meet non-functional require...
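
A small sketch of one possible BigQuery transformation step, using the google-cloud-bigquery client with Application Default Credentials assumed; project, dataset and table names are placeholders.

```python
# Illustrative BigQuery transformation step using the google-cloud-bigquery client
# (project, dataset and table names are placeholders; credentials are assumed to be
# provided via Application Default Credentials).
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

sql = """
    SELECT customer_id,
           DATE(event_ts) AS event_date,
           COUNT(*)       AS events
    FROM `my-project.raw.events`
    GROUP BY customer_id, event_date
"""

# Write the aggregated result to a reporting table, replacing its previous contents.
job_config = bigquery.QueryJobConfig(
    destination="my-project.reporting.daily_events",
    write_disposition="WRITE_TRUNCATE",
)
client.query(sql, job_config=job_config).result()  # waits for the job to finish
```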

Posted 4 days ago

3.0 - 6.0 years

4 - 8 Lacs

andhra pradesh

Work from Office

Description Job Summary Role Value Proposition: The MetLife Data & Analytics organization is the team of expert technologists responsible for building big data platforms and data services with innovative technologies to enable MetLife businesses to generate insights and value for its customers. The team is the center of excellence in data engineering at MetLife and plays a key role in data enablement through multiple data stores supporting different kinds of analytical use cases, to be able to derive predictive, prescriptive and descriptive insights. The Azure Data Engineer III serves as a big data development expert within the data analytics engineering organization of MetLife Data & Analytics. This p...

Posted 4 days ago

2.0 - 6.0 years

5 - 9 Lacs

uttar pradesh

Work from Office

Location: Hyd / Pune / Bengaluru. JD 1: ETL, PySpark. Experience: 4-8 years. Proven experience as a development data engineer or similar role, with an ETL background. Experience with data integration / ETL best practices and data quality principles. Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing. By going over the User Stories, build the comprehensive code base and business rules for testing and validation of the data. Knowledge of continuous integration and continuous deployment (CI/CD) pipelines. Familiarity with Agile/Scrum development methodologies. Excellent analytical and problem-solving skills. Strong communi...

Posted 4 days ago

5.0 - 9.0 years

9 - 13 Lacs

karnataka

Work from Office

Description Key job responsibilities include, but are not limited to: Become involved with day-to-day operations in Production Control. Understand the workflows and be able to troubleshoot issues and define new workflows in PCS and Control-M. Design Airflow infrastructure according to industry best practices. Work with System Administrators, Platform Engineers, Network, Firewall and Monitoring teams on the implementation of the infrastructure. Document the system's specifics, and define and implement administrative controls and policies for Airflow. Work with the development teams to identify the groups of jobs to migrate to Airflow. Assist the development teams with the migrations: define DAGs,...
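
A minimal Airflow DAG sketch (assuming Airflow 2.4+) of the shape a migrated Control-M job flow might take; the DAG id, schedule and bash commands are placeholders.

```python
# Minimal Airflow DAG sketch of the kind a Control-M job flow might be migrated to
# (DAG id, schedule and commands are placeholders; assumes Airflow 2.4+).
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_settlement",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",   # 02:00 daily, mirroring a typical nightly batch window
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo 'run extract step'")
    load = BashOperator(task_id="load", bash_command="echo 'run load step'")

    extract >> load  # enforce ordering, as the former job flow did
```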

Posted 4 days ago

4.0 - 5.0 years

3 - 6 Lacs

tamil nadu

Work from Office

Description The position involves working within our CX Program Implementation team with a focus on building the surveys and reporting. Your day would include working with the business analyst to understand CX requirements shared by the stakeholders and working with the implementation team to develop solutions. The role requires a creative thinker who can solve problems through innovative technology solutions while supporting the development of customer experience programs. Responsibilities: Configure the Medallia surveys and reporting setup as per the requirements shared by the Business Analyst. Build Auto Importers for the new survey programs. Conduct file inflow reviews to understand th...

Posted 4 days ago

6.0 - 10.0 years

11 - 15 Lacs

uttar pradesh

Work from Office

Data Modeling: They create data models using Entity Relationship (ER) diagrams to define how data entities interact and relate to each other. This ensures data organization and accessibility. Data Architecture Design: They design and implement the technical architecture for data storage, processing, and retrieval. This includes data lakes, data warehouses, and data pipelines. Data Governance: Data architects establish data governance policies and procedures to ensure data quality, security, and compliance with regulations. Skills: Strong understanding of data modeling concepts (ER diagrams). Knowledge of database technologies (SQL, NoSQL). Expertise in data warehousing and data lakes. Experience with ...

Posted 4 days ago

4.0 - 8.0 years

7 - 10 Lacs

hyderabad, pune, bengaluru

Work from Office

ETL, PySpark. Experience: 4-8 years. Proven experience as a development data engineer or similar role, with an ETL background. Experience with data integration / ETL best practices and data quality principles. Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing. By going over the User Stories, build the comprehensive code base and business rules for testing and validation of the data. Knowledge of continuous integration and continuous deployment (CI/CD) pipelines. Familiarity with Agile/Scrum development methodologies. Excellent analytical and problem-solving skills. Strong communication and collaboration skills. ...
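
A brief sketch of a PySpark data-quality validation step of the kind such testing might include; the table path, column names and checks are illustrative assumptions.

```python
# Illustrative PySpark data-quality validation step (path, column names and checks
# are placeholders; real rules would come from the business requirements).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://example-bucket/curated/orders/")

total = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()
duplicate_ids = total - df.dropDuplicates(["order_id"]).count()

checks = {
    "no_null_order_ids": null_ids == 0,
    "no_duplicate_order_ids": duplicate_ids == 0,
    "row_count_positive": total > 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Failing fast lets the orchestrator mark the pipeline run as unsuccessful.
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed.")
```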

Posted 4 days ago

2.0 - 7.0 years

4 - 9 Lacs

hyderabad

Work from Office

JR REQ --- Data Engineer (PySpark, Big Data) --- 4 to 8 years --- Hyd --- hemanth.karanam@tcs.com --- TCS C2H --- 900000

Posted 4 days ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies