Experience: 2.0 - 6.0 years
Salary: 0 Lacs
Location: Karnataka
Work mode: On-site
You have experience with Azure Databricks, Azure Data Factory, and Azure data components such as Azure SQL Database, Azure SQL Data Warehouse, and Synapse Analytics. You are proficient in Python, PySpark, Scala, and Hive programming. In addition, you have hands-on experience with Azure Databricks (ADB) and with building CI/CD pipelines in data environments. Your primary skills are ADF (Azure Data Factory) or ADB (Azure Databricks). You also possess excellent verbal and written communication skills, strong interpersonal skills, and the ability to work both independently and collaboratively within a team. In summary, you are a skilled professional with expertise in Azure data services and related tools, proficient in several programming languages, with experience building data pipelines and CI/CD pipelines in Azure data environments. Your strong communication and analytical skills make you a valuable asset to any team.
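Purely as an illustrative sketch of the kind of work this posting describes (not part of the original listing), the following PySpark job shows a minimal Azure Databricks-style transformation: it reads raw JSON events from an assumed ADLS Gen2 path, builds a daily aggregate, and writes a Delta table. The storage account, container, column names, and table name are all hypothetical.

```python
# Illustrative PySpark sketch for an Azure Databricks job.
# Storage account, container, paths, columns, and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_aggregate").getOrCreate()

# Read raw order events from an assumed ADLS Gen2 location.
raw = spark.read.format("json").load(
    "abfss://raw@examplestorageacct.dfs.core.windows.net/orders/"
)

# Basic cleansing plus a daily aggregate per customer.
daily = (
    raw.withColumn("order_date", F.to_date("order_ts"))
       .groupBy("customer_id", "order_date")
       .agg(
           F.count("*").alias("order_count"),
           F.sum("amount").alias("total_amount"),
       )
)

# Persist as a Delta table for downstream ADF / Synapse consumption.
daily.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_daily")
```

In practice, a job like this would typically be triggered from Azure Data Factory or a Databricks workflow, with the CI/CD pipeline the posting mentions deploying the notebook or package to the workspace.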
Posted 2 weeks ago
Experience: 2.0 - 6.0 years
Salary: 0 Lacs
Location: Karnataka
Work mode: On-site
You have experience with Azure Databricks, Azure Data Factory, and Azure data components such as Azure SQL Database, Azure SQL Data Warehouse, and Synapse Analytics. You are proficient in Python, PySpark, Scala, and Hive programming, and you have experience building CI/CD pipelines in data environments. Your primary skills are ADF (Azure Data Factory) or ADB (Azure Databricks). You also possess excellent verbal and written communication skills and the ability to work both independently and within a team.

For the AWS Data Engineer role, you should have at least 2 years of experience on the AWS Cloud platform with strong knowledge of Python. Knowledge of AWS services such as S3, Glue, API Gateway, Glue crawlers, Athena, Lambda, DynamoDB, and Redshift is advantageous. Experience with streaming technologies, particularly Kafka, is essential. Familiarity with SQL, good analytical skills, experience working on Linux platforms, an understanding of the trade-offs and cost impact of the AWS services used, and strong communication skills are also required.

For the GCP Data Engineer role, you must have a minimum of 4 years' experience in GCP data engineering, with strong data engineering skills using Java, Python, or Spark on Google Cloud. Experience handling big data, Agile methodologies, ETL and ELT, data movement, and data processing is essential. Certification as a Professional Google Cloud Data Engineer is an added advantage, as are proven analytical skills, a problem-solving attitude, and the ability to work effectively across teams. Your primary skills for this role include GCP data engineering; programming in Python, Java, or PySpark on GCP; experience with GCS (Cloud Storage), Composer (Airflow), and BigQuery; and experience building data pipelines using these tools.
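As a hedged illustration of the GCS / Composer (Airflow) / BigQuery combination named above (not part of the original listing), here is a minimal Cloud Composer DAG that loads date-partitioned CSV files from a Cloud Storage bucket into a BigQuery table. The bucket, project, dataset, and table names are hypothetical placeholders, and exact parameters may vary with the Airflow Google provider version in use.

```python
# Illustrative Cloud Composer (Airflow) DAG: load daily CSV files from GCS into BigQuery.
# Bucket, project, dataset, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # assumes Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="example-raw-bucket",                   # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.csv"],      # files partitioned by run date
        destination_project_dataset_table="example_project.analytics.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```

A load-operator DAG like this is only one option; heavier transformations would more likely run as a Spark or Dataflow job triggered from the same DAG, with BigQuery as the serving layer.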
Posted 1 month ago