12,278 Big Data Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

15 - 25 Lacs

Bengaluru

Work from Office

AIA - Pune. Job Summary: We are seeking a skilled Developer with 4 to 8 years of experience to join our team in a hybrid work model. The ideal candidate will have expertise in Dataplex and Data Catalog, contributing to our data management and analytics projects. This role involves working in a collaborative environment to enhance data solutions and drive business insights. Responsibilities: Develop and implement data management solutions using Dataplex to optimize data storage and retrieval processes. Collaborate with cross-functional teams to integrate Data Catalog for efficient data discovery and governance. Ensure data quality and consistency across various platforms by implementing robust data...

Posted 15 hours ago

AI Match Score
Apply

4.0 - 8.0 years

27 - 42 Lacs

Hyderabad

Work from Office

AIA Weekend Drive, 10 Jan 2026 - Face-to-Face Interview. Skill: Big Data Testing. Interview Location: Chennai. Experience Range: 6 to 8 Years. Mode of Interview: Face to Face (Mandatory). 1. Job Title: Test Lead. 2. Job Summary: As a Test Lead, you will be responsible for overseeing the testing processes in a hybrid work model, ensuring the quality and reliability of data pipelines across various cloud platforms. With a focus on automation and validation, you will leverage your expertise in data testing and management to enhance the efficiency of our data operations. Your role will be pivotal in ensuring seamless data integration and validation, contributing to the overall success of our data-driv...

Posted 15 hours ago

AI Match Score
Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, All India

On-site

As a Data Scientist at our organization, you will be applying your expertise in artificial intelligence by utilizing machine learning, data mining, and information retrieval to design, prototype, and construct advanced analytics engines and services for next-generation applications. Your role will involve collaborating with business partners to define technical problem statements and hypotheses, developing analytical models that align with business decisions, and integrating them into data products or tools with a cross-functional team. Your responsibilities will include: - Collaborating with business partners to devise innovative solutions using cutting-edge techniques and tools - Effective...

Posted 16 hours ago

AI Match Score
Apply

4.0 - 8.0 years

0 Lacs

Pune, All India

On-site

As a data modeler with hands-on Snowflake experience at Bridgenext, your role will involve designing, implementing, and documenting data architecture and data modeling solutions using Azure SQL and Snowflake databases. You will be responsible for developing conceptual, logical, and physical data models, implementing operational data stores, data marts, and data lakes, and optimizing data query performance through best practices. Key Responsibilities: - Design, implement, and document data architecture and data modeling solutions using Azure SQL and Snowflake databases - Develop conceptual, logical, and physical data models - Implement operational data stores, data marts, and data lakes - Opt...

Posted 16 hours ago

AI Match Score
Apply

2.0 - 6.0 years

0 Lacs

Noida, All India

On-site

As an Analytics professional at the company, your role will involve evangelizing and demonstrating the value and impact of analytics for informed business decision-making. You will be responsible for developing and deploying analytical solutions, as well as providing data-driven insights to business stakeholders in order to understand and solve various lending business nuances. Key Responsibilities: - Utilize your 2-5 years of experience in the analytics domain, specifically in analytics consulting and/or the BFSI industry - Possess a Bachelor's or Master's degree in statistics, economics, engineering, math, or a relevant quantitative discipline with strong academic performance - Demonstrate stron...

Posted 16 hours ago

AI Match Score
Apply

3.0 - 7.0 years

0 Lacs

All India, Gurugram

On-site

As a Data Scientist, you will be responsible for analyzing and interpreting data to identify trends and insights. You will develop and implement data-driven solutions to address business challenges. Your role will involve collecting, cleaning, and transforming data from various sources, including big data and unstructured data. Additionally, you will build and train machine learning models for prediction, classification, and other tasks. It is essential to communicate your findings and recommendations clearly and concisely to stakeholders. Staying up-to-date with the latest trends and technologies in data science is crucial in this role. You should also be able to perform Exploratory Data An...

Posted 16 hours ago

AI Match Score
Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, All India

On-site

Role Overview: As a Big Data QA Engineer with 4+ years of experience in batch testing and data validation, you will play a crucial role in ensuring the quality and reliability of large-scale data pipelines. Your responsibilities will include designing and executing both automated and manual tests, validating data transformations, and maintaining Python-based test automation frameworks. Key Responsibilities: - Design and execute automated and manual tests to validate data transformations - Ensure the quality and reliability of large-scale data pipelines - Maintain Python-based test automation frameworks - Collaborate closely with data engineers in an Agile environment - Analyze logs and troub...

Posted 17 hours ago

AI Match Score
Apply

6.0 - 9.0 years

27 - 42 Lacs

Chennai

Work from Office

AIA Weekend Drive, 10 Jan 2026 - Face-to-Face Interview. Skill: ETL Validation. Interview Location: Chennai & Coimbatore. Experience Range: 4 to 12 Years. Mode of Interview: Face to Face (Mandatory). Key Responsibilities: Design, develop, and execute ETL test cases to validate data extraction, transformation, and loading processes. Perform data validation and reconciliation between source and target systems. Conduct functional, regression, integration, and performance testing of ETL workflows. Collaborate with data engineers, developers, and business analysts to understand requirements and ensure test coverage. Identify, document, and track defects using issue-tracking tools (e.g., JIRA, Bugzil...

Posted 17 hours ago

AI Match Score
Apply

4.0 - 9.0 years

15 - 30 Lacs

Bengaluru

Work from Office

Job Description Location: Bangalore. Job Summary: We are seeking a Python PySpark Developer responsible for designing, developing, and optimizing big data applications using Python and Apache Spark for large-scale data processing, ETL, and analytics. The role requires skills in Spark SQL, data engineering, cloud platforms (AWS/Azure/GCP), and collaboration to build scalable, efficient data pipelines. Key Responsibilities: Data Pipeline Development: Build and maintain scalable ETL/ELT pipelines using PySpark. Performance Optimization: Tune Spark applications, optimize queries, and improve processing efficiency. Data Transformation: Clean, transform, and analyze large datasets. Collaboration: Work...

Posted 17 hours ago

AI Match Score
Apply

4.0 - 9.0 years

0 - 3 Lacs

Kolkata, Pune, Chennai

Work from Office

Important: Review & edit the JD shared by the client. Refrain from publishing the JD as-is from client mail/job portal. Customize the JD and make it simple & understandable. Refrain from entering the client name in the JD. The JD is considered incomplete without the RGS ID, billing rate/salary range, and pre-screening questions. Fields highlighted in red are mandatory. TCS Profile: Tata Consultancy Services is an IT services, consulting and business solutions organization that delivers real results to global business, ensuring a level of certainty no other firm can match. TCS offers a consulting-led, integrated portfolio of IT, BPO, infrastructure, engineering and assurance services. This is delivered through its unique ...

Posted 18 hours ago

AI Match Score
Apply

5.0 - 9.0 years

5 - 12 Lacs

Hyderabad

Work from Office

Skill/Competency Requirements: Need to have experience in Big Data technologies and support projects. Able to quickly analyze complex SQL queries, grasp the underlying functionality, and provide RCA. Well experienced in working with Hive scripts. Experienced in working on Big Data technologies like Hive, Hue, Oozie. Experienced in working on L2/L3 support.

Posted 19 hours ago

AI Match Score
Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

years of total experience in data engineering or big data development. 2-3 years of hands-on experience with Apache Spark. Strong programming skills in PySpark, Python, and Scala. 2+ years of experience in Scala backend development. Proficient in Scala, including both object-oriented and functional programming concepts. Deep understanding and application of advanced functional programming concepts like category theory, monads, applicatives, and type classes. Hands-on knowledge of Scala Typelevel libraries like Cats, Shapeless, and others used for building applications with strong typing and efficient concurrency. Solid understanding of data lakes, lakehouses, and Delta Lake concepts. Experience in SQL d...

Posted 19 hours ago

AI Match Score
Apply

3.0 - 5.0 years

17 - 19 Lacs

Hyderabad, Bengaluru

Hybrid

Experience: 3+ years of experience in combined Data Engineering and Data Analyst roles. Technical Skills: Proficiency in SQL, Python, and big data technologies (PySpark, Hive, Hadoop). Strong understanding of data pipelines. Familiarity with data visualization tools. Communication Skills: Ability to communicate complex technical concepts. Strong collaborative and team-oriented mindset.

Posted 19 hours ago

AI Match Score
Apply

3.0 - 8.0 years

5 - 10 Lacs

Ahmedabad

Work from Office

Azure Data Factory: - Develop Azure Data Factory objects - ADF pipelines, configuration, parameters, variables, Integration Services runtime - Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup, etc.) and Data Flows - ADF data ingestion and integration with other services. Azure Databricks: - Experience in Big Data components such as Kafka, Spark SQL, DataFrames, Hive DB, etc. implemented using Azure Databricks would be preferred - Azure Databricks integration with other services - Read and write data in Azure Databricks - Best practices in Azure Databricks. Synapse Analytics: - Import data into Azure Synapse Analytics with and without using PolyBase - Implement a Data Warehouse with Azure...

Posted 20 hours ago

AI Match Score
Apply

3.0 - 8.0 years

3 - 7 Lacs

Patna

Work from Office

Azure Data Factory: - Develop Azure Data Factory objects - ADF pipelines, configuration, parameters, variables, Integration Services runtime - Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup, etc.) and Data Flows - ADF data ingestion and integration with other services. Azure Databricks: - Experience in Big Data components such as Kafka, Spark SQL, DataFrames, Hive DB, etc. implemented using Azure Databricks would be preferred - Azure Databricks integration with other services - Read and write data in Azure Databricks - Best practices in Azure Databricks. Synapse Analytics: - Import data into Azure Synapse Analytics with and without using PolyBase - Implement a Data Warehouse with Azure...

Posted 20 hours ago

AI Match Score
Apply

3.0 - 8.0 years

3 - 7 Lacs

Nashik

Work from Office

Azure Data Factory: - Develop Azure Data Factory objects - ADF pipelines, configuration, parameters, variables, Integration Services runtime - Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup, etc.) and Data Flows - ADF data ingestion and integration with other services. Azure Databricks: - Experience in Big Data components such as Kafka, Spark SQL, DataFrames, Hive DB, etc. implemented using Azure Databricks would be preferred - Azure Databricks integration with other services - Read and write data in Azure Databricks - Best practices in Azure Databricks. Synapse Analytics: - Import data into Azure Synapse Analytics with and without using PolyBase - Implement a Data Warehouse with Azure...

Posted 20 hours ago

AI Match Score
Apply

3.0 - 8.0 years

3 - 7 Lacs

Chennai

Work from Office

Azure Data Factory: - Develop Azure Data Factory objects - ADF pipelines, configuration, parameters, variables, Integration Services runtime - Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup, etc.) and Data Flows - ADF data ingestion and integration with other services. Azure Databricks: - Experience in Big Data components such as Kafka, Spark SQL, DataFrames, Hive DB, etc. implemented using Azure Databricks would be preferred - Azure Databricks integration with other services - Read and write data in Azure Databricks - Best practices in Azure Databricks. Synapse Analytics: - Import data into Azure Synapse Analytics with and without using PolyBase - Implement a Data Warehouse with Azure Synapse Analytics -...

Posted 20 hours ago

AI Match Score
Apply

3.0 - 8.0 years

3 - 7 Lacs

Lucknow

Work from Office

Azure Data Factory: - Develop Azure Data Factory objects - ADF pipelines, configuration, parameters, variables, Integration Services runtime - Hands-on knowledge of ADF activities (such as Copy, Stored Procedure, Lookup, etc.) and Data Flows - ADF data ingestion and integration with other services. Azure Databricks: - Experience in Big Data components such as Kafka, Spark SQL, DataFrames, Hive DB, etc. implemented using Azure Databricks would be preferred - Azure Databricks integration with other services - Read and write data in Azure Databricks - Best practices in Azure Databricks. Synapse Analytics: - Import data into Azure Synapse Analytics with and without using PolyBase - Implement a Data Warehouse with Azure...

Posted 20 hours ago

AI Match Score
Apply

7.0 - 10.0 years

6 - 9 Lacs

Pune

Work from Office

Job Position: GCP Data Engineer. Experience: 6+ years as a GCP Data Engineer. Notice Period: Immediate to 15 days only. JD: We are seeking a skilled Data Engineer to design, build, and maintain data pipelines and data models that support analytical and business intelligence needs. The ideal candidate will have hands-on experience with Python or SQL, Google Cloud Platform (GCP), and a strong understanding of data management, quality, and security best practices. Key Responsibilities: - Build and maintain moderately complex data pipelines, ensuring data flow, transformation, and usability for analytical projects. - Design and implement data models, optimizing for performan...

Posted 20 hours ago

AI Match Score
Apply

7.0 - 8.0 years

10 - 14 Lacs

Pune

Remote

Job Title : Databricks Tech Lead (Contract) Contract Duration : 4 Months (Extendable based on Performance) Job Location : Remote Job Timings : India Evening Shift (till 11 : 30 PM IST) Experience Required : 7+ Years Job Description : We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities : - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productio...

Posted 20 hours ago

AI Match Score
Apply

7.0 - 10.0 years

6 - 9 Lacs

Kochi (Cochin)

Work from Office

Job Position: GCP Data Engineer. Experience: 6+ years as a GCP Data Engineer. Location: Chennai, Pune, Trivandrum, Kochi. Notice Period: Immediate to 15 days only. JD: We are seeking a skilled Data Engineer to design, build, and maintain data pipelines and data models that support analytical and business intelligence needs. The ideal candidate will have hands-on experience with Python or SQL, Google Cloud Platform (GCP), and a strong understanding of data management, quality, and security best practices. Key Responsibilities: - Build and maintain moderately complex data pipelines, ensuring data flow, transformation, and usability for analytical projects. - Design and imp...

Posted 20 hours ago

AI Match Score
Apply

7.0 - 8.0 years

10 - 14 Lacs

Chandigarh

Work from Office

Job Title : Databricks Tech Lead (Contract) Contract Duration : 4 Months (Extendable based on Performance) Job Location : Remote Job Timings : India Evening Shift (till 11 : 30 PM IST) Experience Required : 7+ Years Job Description : We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities : - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productio...

Posted 20 hours ago

AI Match Score
Apply

7.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Job Title : Databricks Tech Lead (Contract) Contract Duration : 4 Months (Extendable based on Performance) Job Location : Remote Job Timings : India Evening Shift (till 11 : 30 PM IST) Experience Required : 7+ Years Job Description : We are looking for an experienced Databricks Tech Lead to join our team on a 4-month extendable contract. The ideal candidate will bring deep expertise in data engineering, big data platforms, and cloud-based data warehouse solutions, with the ability to work in a fast-paced remote environment. Key Responsibilities : - Lead the design, optimization, and management of large-scale data pipelines using Databricks, Spark (PySpark), and AWS data services. - Productio...

Posted 20 hours ago

AI Match Score
Apply

8.0 - 13.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering. Service Line: Infosys Quality Engineering. Responsibilities: Generic role expectations - Should have experience in building best-fit architectural solutions in the form of frameworks/tools in the identified technology area. Should be conversant with architecture principles and practices (architecture frameworks and methodology, architecture patterns, architecting for QoS - performance, scalability, maintainability, reusability). Should be able to provide technical leadership and strategic direction for testing. Should be able to create technology differentiation by contributing to new proposals/pursuits. Preferably should be aware of industry trends in some emerging ...

Posted 20 hours ago

AI Match Score
Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering. Service Line: Infosys Quality Engineering. Responsibilities: Should have experience in building best-fit architectural solutions in the form of frameworks/tools in the identified technology area. Should be conversant with architecture principles and practices (architecture frameworks and methodology, architecture patterns, architecting for QoS - performance, scalability, maintainability, reusability). Should be able to provide technical leadership and strategic direction for testing. Should be able to create technology differentiation by contributing to new proposals/pursuits. Preferably should be aware of industry trends in some emerging technology areas (blockchain...

Posted 20 hours ago

AI Match Score
Apply

Exploring Big Data Jobs in India

The big data job market in India is booming. With businesses placing ever more weight on data-driven decision making, demand for big data professionals continues to rise, and job seekers looking to start a career in big data in India have plenty of opportunities to choose from.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi

These cities are known for their thriving tech industries and are hotspots for big data job openings.

Average Salary Range

The average salary range for big data professionals in India varies based on experience levels. Entry-level positions can expect to earn between INR 4-8 lakhs per annum, while experienced professionals can command salaries ranging from INR 12-20 lakhs per annum.

Career Path

In the field of big data, a typical career path can progress from roles such as Junior Data Analyst or Data Engineer to Senior Data Scientist or Big Data Architect. As professionals gain more experience and expertise, they can move on to roles like Data Science Manager or Chief Data Officer.

Related Skills

In addition to expertise in big data technologies like Hadoop, Spark, or NoSQL databases, professionals in this field are often expected to have skills in programming languages like Python or R, data visualization tools like Tableau or Power BI, and a strong understanding of statistics and machine learning algorithms.
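
For illustration, the sketch below shows a typical small PySpark task that combines several of these skills: loading raw data, applying a basic quality filter, and aggregating with Spark SQL functions. It is not taken from any posting on this page; the file name and column names (loans.csv, city, amount) are purely hypothetical.

```python
# Minimal PySpark sketch: load, filter, and aggregate a hypothetical dataset.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("skills_example").getOrCreate()

# Hypothetical input: loan applications with "city" and "amount" columns.
df = spark.read.csv("loans.csv", header=True, inferSchema=True)

summary = (
    df.filter(F.col("amount") > 0)                  # basic data-quality check
      .groupBy("city")                              # aggregate per city
      .agg(F.count("*").alias("applications"),
           F.avg("amount").alias("avg_amount"))
      .orderBy(F.desc("applications"))
)

summary.show()   # this output could feed a Tableau or Power BI dashboard
spark.stop()
```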

Interview Questions

  • What is the difference between structured and unstructured data? (basic)
  • Explain the MapReduce framework (a word-count sketch illustrating the pattern follows this list). (medium)
  • How would you handle missing data in a dataset? (medium)
  • What is the purpose of Apache Hive in the Hadoop ecosystem? (medium)
  • Can you explain the concept of dimensionality reduction? (medium)
  • What is the difference between supervised and unsupervised learning? (basic)
  • How does regularization help in preventing overfitting in machine learning models? (advanced)
  • Explain the concept of bias-variance tradeoff. (medium)
  • What is the curse of dimensionality? (advanced)
  • How would you handle imbalanced classes in a classification problem? (medium)
  • What is the difference between batch processing and real-time processing in big data? (basic)
  • What is the difference between bagging and boosting in machine learning? (medium)
  • Explain the concept of collaborative filtering. (medium)
  • How do you assess the performance of a machine learning model? (basic)
  • Can you explain the difference between L1 and L2 regularization? (medium)
  • What is the purpose of Apache Kafka in a big data ecosystem? (medium)
  • How does gradient descent work in machine learning optimization? (medium)
  • What is the difference between data profiling and data mining? (basic)
  • Explain the concept of feature engineering in machine learning. (medium)
  • How do you handle outliers in a dataset? (medium)
  • What is the role of a data engineer in a big data project? (basic)
  • How would you evaluate the significance of a variable in a regression model? (medium)
  • What is the importance of cross-validation in machine learning? (medium)
  • Can you explain the concept of ensemble learning? (medium)
  • How would you approach a big data project from data collection to model deployment? (advanced)
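
As a hedged illustration of the MapReduce question above, the sketch below implements the classic word count with PySpark's RDD API: the flatMap/map steps play the "map" role and reduceByKey plays the "reduce" role. The input path input.txt is hypothetical.

```python
# Minimal MapReduce-style word count using PySpark's RDD API.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount_example").getOrCreate()
sc = spark.sparkContext

counts = (
    sc.textFile("input.txt")                    # hypothetical input file
      .flatMap(lambda line: line.split())       # map: split each line into words
      .map(lambda word: (word, 1))              # map: emit (word, 1) pairs
      .reduceByKey(lambda a, b: a + b)          # reduce: sum counts per word
)

print(counts.take(10))                          # inspect a few (word, count) pairs
spark.stop()
```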

Conclusion

As you explore big data job opportunities in India, remember to equip yourself with the necessary skills and knowledge to stand out in the competitive job market. Prepare well for interviews, showcase your expertise, and apply confidently to secure your dream job in the exciting field of big data. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
