
2 Hive Programming Jobs

JobPe aggregates results for easy access, but you apply directly on the job portal itself.

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

You have experience with Azure Databricks, Azure Data Factory, and Azure data components such as Azure SQL Database, Azure SQL Data Warehouse, and Synapse Analytics. You are proficient in Python, PySpark, Scala, and Hive programming, and you have hands-on experience with Azure Databricks (ADB) and with building CI/CD pipelines in data environments. Your primary skills are ADF (Azure Data Factory) or ADB (Azure Databricks). You also possess excellent verbal and written communication skills and strong interpersonal skills, and you are capable of working both independently and collaboratively within a team. In short, you are a skilled professional with expertise in Azure data services and related tools, proficient in several programming languages, with experience building data and CI/CD pipelines in Azure data environments; your strong communication and analytical skills make you a valuable asset to any team.
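The "Hive programming" this listing asks for typically means writing HiveQL against partitioned tables, often submitted from a PySpark job via `spark.sql(...)`. As a minimal, standard-library-only sketch of that pattern (the table and column names `sales_raw`, `sales_daily`, `product_id`, `amount`, and `dt` are hypothetical examples, not taken from the posting):

```python
# Toy helper that renders the kind of HiveQL a Databricks/ADF pipeline step
# would submit with spark.sql(...). All table/column names are illustrative.

def daily_rollup_hql(src: str, dst: str, dt: str) -> str:
    """Build an INSERT OVERWRITE into one date partition of a Hive table."""
    return (
        f"INSERT OVERWRITE TABLE {dst} PARTITION (dt = '{dt}') "
        f"SELECT product_id, SUM(amount) AS total_amount "
        f"FROM {src} WHERE dt = '{dt}' "
        f"GROUP BY product_id"
    )

if __name__ == "__main__":
    print(daily_rollup_hql("sales_raw", "sales_daily", "2024-01-01"))
```

In a real Azure Databricks pipeline, a string like this would be executed against a Hive metastore table, with the partition date supplied by the orchestrator (e.g. an ADF trigger).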

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

You have experience with Azure Databricks, Azure Data Factory, and Azure data components such as Azure SQL Database, Azure SQL Data Warehouse, and Synapse Analytics. You are proficient in Python, PySpark, Scala, and Hive programming, and you have experience building CI/CD pipelines in data environments. Your primary skills are ADF (Azure Data Factory) or ADB (Azure Databricks). You also possess excellent verbal and written communication skills and can work both independently and within a team.

For the AWS Data Engineer role, you should have at least 2 years of experience on the AWS Cloud platform with strong knowledge of Python. Knowledge of AWS services such as S3, Glue, API Gateway, Glue Crawler, Athena, Lambda, DynamoDB, and Redshift is advantageous. Experience with streaming technologies, particularly Kafka, is essential, as are familiarity with SQL, good analytical skills, experience on Linux platforms, and an understanding of the trade-offs and cost impact of the AWS services being used. Strong communication skills are also necessary for this role.

For the GCP Data Engineer position, you must have a minimum of 4 years' experience in GCP data engineering, with strong data engineering skills in Java, Python, or Spark on Google Cloud. Experience handling big data, Agile methodologies, ETL and ELT skills, data movement, and data processing are essential; certification as a Professional Google Cloud Data Engineer is an added advantage. Proven analytical skills, a problem-solving attitude, and the ability to function effectively in a cross-team environment are also required. Your primary skills in this role include GCP data engineering; programming in Python, Java, or PySpark on GCP; experience with GCS (Cloud Storage), Composer (Airflow), and BigQuery; and experience building data pipelines using these tools.
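The "building data pipelines" skill common to all three roles comes down to the extract–transform–load pattern. A toy, standard-library-only sketch of one such step (a real pipeline would run this in Spark, Glue, or BigQuery; the record fields `user` and `amount` are invented for illustration):

```python
import json
from collections import defaultdict

# Toy ETL step: parse raw JSON event lines and aggregate per-user totals.
# Field names ("user", "amount") are illustrative, not from the posting.

def transform(raw_lines):
    """Extract JSON records, sum amounts per user, return sorted rows."""
    totals = defaultdict(float)
    for line in raw_lines:
        rec = json.loads(line)                # extract
        totals[rec["user"]] += rec["amount"]  # transform (aggregate)
    return sorted(totals.items())             # load-ready rows

if __name__ == "__main__":
    raw = [
        '{"user": "a", "amount": 3.0}',
        '{"user": "b", "amount": 1.5}',
        '{"user": "a", "amount": 2.0}',
    ]
    print(transform(raw))  # [('a', 5.0), ('b', 1.5)]
```

The same shape recurs at scale: extraction becomes reading from S3 or GCS, the aggregation becomes a Spark or SQL `GROUP BY`, and the load target becomes Redshift or BigQuery.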

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies