4.0 - 9.0 years
5 - 9 Lacs
bengaluru
Work from Office
When you join Honeywell, you become a member of our global team of thinkers, innovators, dreamers, and doers who make the things that make the future. That means changing the way we fly, fueling jets in an eco-friendly way, keeping buildings smart and safe, and even making it possible to breathe on Mars. Working at Honeywell isn't just about developing cool things. That's why all our employees enjoy access to dynamic career opportunities across different fields and industries. We offer amazing opportunities for career growth with a world-class team of diverse experts. Are you ready to help us make the future? Join a team that is elevating our strategy to drive advanced analytics and visualizat...
Posted 1 day ago
1.0 - 5.0 years
3 - 7 Lacs
rajkot
Work from Office
NEX Softsys is looking for a Hadoop Developer to join our dynamic team and embark on a rewarding career journey. A Hadoop Developer is responsible for designing, developing, and maintaining big data solutions using Apache Hadoop. Key responsibilities include: 1. Designing and developing scalable, efficient, and reliable data processing pipelines using Hadoop and related technologies such as MapReduce and Hive. 2. Writing and executing MapReduce jobs to process large datasets stored in the Hadoop Distributed File System (HDFS). 3. Collaborating with stakeholders to understand their data processing requirements and develop solutions that meet their needs. 4. Integrating Hadoop with other data stora...
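As a rough illustration of the MapReduce work described in this posting, below is a minimal Hadoop Streaming word-count job in Python; the HDFS input/output paths and the streaming-jar invocation in the comments are placeholders, not details from the listing.

```python
#!/usr/bin/env python3
# Minimal Hadoop Streaming mapper/reducer sketch (word count) -- illustrative only.
# Example invocation (jar location and HDFS paths are assumptions, adjust for your cluster):
#   hadoop jar hadoop-streaming.jar \
#     -input /data/raw/logs -output /data/out/wordcount \
#     -mapper "python3 wordcount.py map" -reducer "python3 wordcount.py reduce"
import sys

def mapper():
    # Emit (word, 1) pairs, one per line, tab-separated.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Hadoop delivers reducer input sorted by key, so we can sum counts per word in one pass.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```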
Posted 1 day ago
0.0 - 5.0 years
3 - 6 Lacs
rajkot
Work from Office
NEX Softsys is looking for a Big Data Developer to join our dynamic team and embark on a rewarding career journey. Design, develop, and maintain big data solutions to meet business requirements and support data-driven decision making. Work with stakeholders to understand their data needs and determine how best to use big data technologies to meet those needs. Design and implement scalable, high-performance big data architectures, using technologies such as Hadoop, Spark, and NoSQL databases. Extract, transform, and load large data sets into a big data platform for analysis and reporting. Write complex SQL queries and develop custom scripts to process big data. Collaborate with data scientists, data ...
Posted 1 day ago
4.0 - 6.0 years
8 - 13 Lacs
chennai
Work from Office
Role Description: As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security tea...
Posted 1 day ago
4.0 - 6.0 years
7 - 12 Lacs
chennai
Work from Office
Role Description: As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security tea...
Posted 1 day ago
7.0 - 12.0 years
7 - 11 Lacs
pune
Work from Office
7+ years of experience in Big Data with strong expertise in Spark and Scala. Mandatory skills: Big Data, primarily Spark and Scala; strong knowledge of HDFS, Hive, and Impala, with knowledge of Unix, Oracle, and Autosys. Good to have: Agile methodology and banking expertise. Strong communication skills. Not limited to Spark batch; Spark streaming experience is needed. NoSQL DB experience: HBase/Mongo/Couchbase.
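For context on the Spark streaming requirement above, here is a minimal Structured Streaming sketch. It is written in PySpark for consistency with the other examples on this page (the role itself asks for Scala), and the Kafka broker, topic, and connector package are assumptions rather than details from the posting.

```python
# Minimal Spark Structured Streaming sketch: windowed event counts from Kafka.
# Assumes the spark-sql-kafka connector package is available on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window, count

spark = SparkSession.builder.appName("events-streaming").getOrCreate()

# Read a stream of events from Kafka (broker and topic are placeholders).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Count events per 1-minute window, keyed by the Kafka message key.
counts = (
    events.selectExpr("CAST(key AS STRING) AS key", "timestamp")
    .groupBy(window(col("timestamp"), "1 minute"), col("key"))
    .agg(count("*").alias("events"))
)

# Write the running aggregation to the console; a real job would target HBase/Hive sinks.
query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```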
Posted 2 days ago
4.0 - 6.0 years
9 - 13 Lacs
chennai
Work from Office
As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security team to ensure that th...
Posted 2 days ago
4.0 - 6.0 years
7 - 12 Lacs
chennai
Work from Office
Role Description: As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security team ...
Posted 2 days ago
5.0 - 9.0 years
12 - 17 Lacs
noida
Work from Office
Hive/Python/Spark: hands-on technical data processing. Database SQL knowledge for data retrieval and transformation queries such as joins (full, left, right), ranking, and group by; Database SQL/Oracle. Good communication skills. Additional skills that would be an added advantage: GitHub, Jenkins, shell scripting, large-scale data processing model implementation, and performance tracking. Mandatory competencies: Big Data - Hive, PySpark, Spark; Data Science and Machine Learning - Data Analyst; Database - Oracle PL/SQL packages; Programming Language - Python, Python Shell; Behavioural - communication and collaboration.
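The transformation queries called out above (full/left/right joins, ranking, group by) might look roughly like the following Spark SQL sketch; the customers/orders tables and their columns are invented for illustration, and Hive support is assumed to be enabled on the session.

```python
# Illustrative Spark SQL transformation: left join + group by + ranking window.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-transforms").enableHiveSupport().getOrCreate()

result = spark.sql("""
    SELECT c.customer_id,
           c.region,
           SUM(o.amount)                                AS total_amount,
           RANK() OVER (PARTITION BY c.region
                        ORDER BY SUM(o.amount) DESC)    AS region_rank
    FROM customers c
    LEFT JOIN orders o
           ON o.customer_id = c.customer_id
    GROUP BY c.customer_id, c.region
""")
result.show()
```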
Posted 2 days ago
2.0 - 7.0 years
5 - 6 Lacs
hyderabad
Work from Office
Data Engineer - Helical IT Solutions Pvt Ltd (Full Time). One Stop Destination for All your BI, DW, Big Data Needs. About Us: Helical IT, based out of Hyderabad, is a software company that specializes in Open Source Data Warehousing & Business Intelligence, servicing clients in various domains like Manufacturing, HR, Energy, Insurance, Social Media Analytics, E-commerce, Travel, etc. Demonstrated strength in data modeling, ETL development, and data warehousing. Hands-on experience using big data technologies (Hadoop, Hive, HBase, Spark, etc.); Apache Spark and PySpark are mandatory. Hands-on experience using Spark SQL. Hands-on experience with programming languag...
Posted 2 days ago
6.0 - 8.0 years
8 - 10 Lacs
maharashtra
Work from Office
6+ years of overall IT experience in Telecom OSS, especially in the Assurance domain (solution, design, and implementation). Strong knowledge of the Telecom OSS domain, with excellent experience in ServiceNow for Assurance. Knowledge and experience of Big Data, Data Lake solutions, Kafka, and Hadoop/Hive. Experience in Python (PySpark) is essential. Implementation experience in continuous integration and delivery philosophies and practices, specifically Docker, Git, and Jenkins. Self-driven and highly motivated candidate for a client-facing role in a challenging environment.
Posted 2 days ago
2.0 - 5.0 years
4 - 7 Lacs
maharashtra
Work from Office
Hands-on with advanced SQL, Python, etc. Hands-on in data profiling. Hands-on in working on cloud platforms like Azure and cloud DWs like Databricks. Hands-on experience with scheduling tools like Airflow, Control-M, etc. Knowledgeable in Big Data tools like Spark (Python/Scala), Hive, Impala, Hue, and storage (e.g. HDFS, HBase). Knowledgeable in CI/CD processes: Bitbucket/GitHub, Jenkins, Nexus, etc. Knowledgeable in managing structured and unstructured data types.
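As a hedged sketch of the scheduling-tool requirement, here is a minimal Airflow DAG that runs a Spark job daily; the DAG id, schedule, and spark-submit command are placeholders rather than anything specified in the posting.

```python
# Minimal Airflow DAG: schedule one spark-submit run per day.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_profiling_job",          # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",         # run daily at 02:00
    catchup=False,
) as dag:
    # Submit a (hypothetical) PySpark profiling job to the cluster.
    profile = BashOperator(
        task_id="run_profiling",
        bash_command="spark-submit --master yarn /opt/jobs/profile_tables.py",
    )
```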
Posted 2 days ago
2.0 - 7.0 years
4 - 9 Lacs
andhra pradesh
Work from Office
JD: 7+ years of hands-on experience in Python, especially dealing with Pandas and NumPy. Good hands-on experience in Spark, PySpark, and Spark SQL. Hands-on experience in Databricks: Unity Catalog, Delta Lake, Lakehouse Platform, Medallion Architecture, Azure Data Factory, ADLS. Experience in dealing with the Parquet and JSON file formats. Knowledge of Snowflake.
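A rough sketch of the Parquet/JSON-to-Delta flow implied by this stack is shown below; the ADLS paths, column names, and Medallion layer names are illustrative, and writing Delta assumes a Databricks runtime or a cluster with delta-spark configured.

```python
# Bronze-to-silver sketch: read Parquet and JSON, clean and join, write a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Bronze layer: raw Parquet and JSON landed in ADLS (placeholder paths).
orders = spark.read.parquet("abfss://raw@account.dfs.core.windows.net/orders/")
events = spark.read.json("abfss://raw@account.dfs.core.windows.net/events/")

# Silver layer: light cleanup and a join, Medallion-style (column names are placeholders).
silver = (
    orders.join(events, "order_id", "left")
          .withColumn("order_date", to_date(col("order_ts")))
          .filter(col("amount") > 0)
)

silver.write.format("delta").mode("overwrite").partitionBy("order_date") \
      .save("abfss://silver@account.dfs.core.windows.net/orders_enriched/")
```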
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Level Java Developer at YASH Technologies, you will play a crucial role in implementing business requirements and addressing challenges while working in a fast-paced, quality-oriented team environment. Your key responsibilities will include: - Investigating and understanding business requirements to implement them effectively - Addressing issues, expanding current functionality, and implementing new features - Task scoping, estimation, and prioritization - Collaborating with business analysts and SMEs to devise creative solutions - Working with testers to create test plans and tooling - Providing production and operations support - Participating in development discussions and cod...
Posted 3 days ago
5.0 - 10.0 years
25 - 30 Lacs
bengaluru
Work from Office
Business Area: Product Mgmt. Seniority Level: Mid-Senior level Job Description: At Cloudera, we empower people to transform complex data into clear and actionable insights. With as much data under management as the hyperscalers, we're the preferred data partner for the top companies in almost every industry. Powered by the relentless innovation of the open source community, Cloudera advances digital transformation for the world's largest enterprises. Cloudera is searching for a Senior Product Manager (individual contributor) to own the vision, strategy, and roadmap for our mission-critical Cloudera Operational Database (COD) service. This service is built on the robust foundation of Apache HB...
Posted 5 days ago
5.0 - 10.0 years
50 - 65 Lacs
bengaluru
Work from Office
Required skills/experience: Bachelor's degree in Computer Science (or a related discipline) as well as work experience of 5+ years. Strong computer science fundamentals in algorithms, data structures, storage technologies, distributed computing, operating systems, etc. Experience in designing and implementing scalable solutions in a large-scale distributed environment. Robust and defensive coding skills using Java/Golang or any other object-oriented programming language. Strong knowledge of RDBMS and any other NoSQL database technologies (Cassandra, HBase, MongoDB, Dynamo, etc.). Who you are: Design and develop robust services in coordination with front-end developers, ensuring the production ...
Posted 5 days ago
6.0 - 11.0 years
8 - 13 Lacs
hyderabad
Work from Office
We are looking for a highly motivated and detail-oriented Catastrophe Data Analyst to join our team at Swiss Re. The ideal candidate should have 0 to 7 years of experience in the field. Roles and Responsibilities: Analyze and interpret catastrophe data to identify trends and patterns. Develop and maintain databases for storing and managing catastrophe data. Collaborate with cross-functional teams to design and implement new catastrophe models. Conduct research and stay updated on industry developments and advancements in catastrophe modeling. Provide insights and recommendations to stakeholders based on analysis results. Ensure data quality and integrity by implementing data validation and veri...
Posted 5 days ago
6.0 - 11.0 years
2 - 5 Lacs
bengaluru
Work from Office
We are seeking a talented and experienced Databricks Engineer to join our innovative team in Bengaluru, India. In this role, you will be responsible for designing, implementing, and optimising data pipelines and analytics solutions using the Databricks platform. You will collaborate closely with cross-functional teams to deliver high-quality, scalable data solutions that drive business insights and decision-making. Develop and optimise ETL processes using Databricks and Apache Spark. Design and implement efficient data models and schemas for optimal data storage and retrieval. Collaborate with data scientists and analysts to build and deploy machine learning models. Ensure data quality, consist...
Posted 5 days ago
0.0 - 6.0 years
11 - 12 Lacs
gurugram
Work from Office
Design, develop, and maintain scalable ETL/ELT data pipelines on GCP (BigQuery, Dataflow, Cloud Composer, Pub/Sub, etc.). Build and optimize data models and data marts in BigQuery for analytical and reporting use cases. Ensure data quality, integrity, and security across the data lifecycle. Implement data transformation logic using SQL, Python, and Cloud Dataflow/Dataproc. Collaborate with business and analytics teams to understand data requirements and deliver efficient solutions. Automate workflows and orchestrate pipelines using Cloud Composer (Airflow). Monitor and optimize BigQuery performance and manage cost efficiency. Support CI/CD deployment processes and maintain version control using Git ...
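One of the transformation steps described above might look like this small google-cloud-bigquery sketch; the project, dataset, and table names are placeholders, and in practice the step would typically be wrapped in a Cloud Composer (Airflow) task.

```python
# Run a BigQuery transformation query and materialize the result into a mart table.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")   # placeholder project

job_config = bigquery.QueryJobConfig(
    destination="my-analytics-project.marts.daily_revenue",  # placeholder destination table
    write_disposition="WRITE_TRUNCATE",
)

sql = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `my-analytics-project.staging.orders`
    GROUP BY order_date
"""

# Run the query and wait for completion; result() raises if the job fails.
client.query(sql, job_config=job_config).result()
```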
Posted 5 days ago
6.0 - 11.0 years
14 - 17 Lacs
mumbai
Work from Office
Your role and responsibilities: As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact. Responsibilities: Responsible for managing end-to-end feature development and resolving challenges faced in impl...
Posted 5 days ago
4.0 - 9.0 years
12 - 16 Lacs
pune
Work from Office
Your role and responsibilities: As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. Responsibilities: Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases. Process the data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies for various use cases built on the platform...
Posted 5 days ago
5.0 - 10.0 years
3 - 6 Lacs
kolkata, hyderabad, bengaluru
Work from Office
We are looking for skilled Hadoop Developers with 5-10 years of experience to join our team in Bangalore, Kolkata, Hyderabad, and Pune. The ideal candidate should have strong proficiency in Hadoop, Scala, Spark, and SQL. Roles and Responsibilities: Design, develop, and implement scalable data processing systems using Hadoop. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain large-scale data pipelines using Spark and Scala. Troubleshoot and resolve complex technical issues related to Hadoop. Participate in code reviews and ensure high-quality code standards. Stay updated with the latest trends and technologies in Hadoop development. Job...
Posted 5 days ago
7.0 - 10.0 years
2 - 5 Lacs
bhubaneswar, chennai, bengaluru
Work from Office
We are looking for a skilled Hadoop Admin with 7 to 12 years of experience, located in Hyderabad, Bangalore, Chennai, and Bhubaneshwar. The ideal candidate will have hands-on experience in building EKS clusters, Spark on EKS, AWS (EMR, S3, RDS), Python programming skills, shell scripting, and any CI/CD tooling. Roles and Responsibilities: Manage and maintain large-scale Hadoop clusters for high availability and performance. Design and implement scalable data processing pipelines using Spark and Airflow instances. Integrate Ranger and Hive Metastore services in EKS for secure data storage and access. Collaborate with cross-functional teams to identify and resolve technical issues. Develop and impleme...
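As an illustrative sketch of the EMR side of this role, the snippet below submits a Spark step to an existing cluster with boto3; the cluster id, region, and S3 script path are assumptions, not details from the posting.

```python
# Submit a spark-submit step to a running EMR cluster via boto3.
import boto3

emr = boto3.client("emr", region_name="ap-south-1")   # placeholder region

response = emr.add_job_flow_steps(
    JobFlowId="j-XXXXXXXXXXXXX",                       # placeholder cluster id
    Steps=[{
        "Name": "nightly-aggregation",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            # command-runner.jar is EMR's standard wrapper for running cluster commands.
            "Jar": "command-runner.jar",
            "Args": [
                "spark-submit", "--deploy-mode", "cluster",
                "s3://my-bucket/jobs/aggregate.py",    # placeholder job script
            ],
        },
    }],
)
print("Submitted step:", response["StepIds"][0])
```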
Posted 5 days ago
4.0 - 6.0 years
7 - 12 Lacs
chennai
Work from Office
As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security team to ensure that the GC...
Posted 5 days ago
HBase is a distributed, scalable, and NoSQL database that is commonly used in big data applications. As the demand for big data solutions continues to grow, so does the demand for professionals with HBase skills in India. Job seekers looking to explore opportunities in this field can find a variety of roles across different industries and sectors.
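For readers new to HBase, the sketch below shows basic reads and writes from Python via the happybase client (which goes through the HBase Thrift gateway); the host, table, and column family names are placeholders.

```python
# Minimal HBase read/write sketch using the happybase client.
import happybase

connection = happybase.Connection("hbase-thrift-host")   # placeholder Thrift host
table = connection.table("user_events")                  # placeholder table

# Write one row: HBase stores bytes keyed by (row key, column family:qualifier).
table.put(b"user42#2024-01-01", {b"cf:event": b"login", b"cf:device": b"mobile"})

# Point read by row key.
print(table.row(b"user42#2024-01-01"))

# Range scan over a row-key prefix.
for key, data in table.scan(row_prefix=b"user42#"):
    print(key, data)

connection.close()
```

Row keys are typically designed so that related records sort together, which is what makes the prefix scan above efficient.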
Major IT hubs such as Bengaluru, Hyderabad, Chennai, and Pune, which feature prominently in the listings above, are known for their strong presence in the IT industry and are actively hiring professionals with HBase skills.
The salary range for HBase professionals in India can vary based on experience and location. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
In the HBase domain, a typical career progression may look like: - Junior HBase Developer - HBase Developer - Senior HBase Developer - HBase Architect - HBase Administrator - HBase Consultant - HBase Team Lead
In addition to HBase expertise, professionals in this field are often expected to have knowledge of: - Apache Hadoop - Apache Spark - Data Modeling - Java programming - Database design - Linux/Unix
As you prepare for HBase job opportunities in India, make sure to brush up on your technical skills, practice coding exercises, and be ready to showcase your expertise in interviews. With the right preparation and confidence, you can land a rewarding career in the exciting field of HBase. Good luck!