Posted: 2 weeks ago
Work from Office
Full Time
1. Data engineer with 6+ years of hands-on experience working on big data platforms.
2. Experience building and optimizing big data pipelines and data sets, from data ingestion through processing to data visualization.
3. Good experience writing and optimizing Spark jobs, Spark SQL, etc. Should have worked on both batch and streaming data processing.
4. Good experience in at least one programming language, Scala or Python; Python preferred.
5. Experience writing and optimizing complex Hive and SQL queries to process huge data volumes; good with UDFs, tables, joins, views, etc.
6. Experience using Kafka or any other message broker.
7. Configuring, monitoring, and scheduling jobs using Oozie and/or Airflow.
8. Processing streaming data directly from Kafka using Spark jobs; experience in Spark Streaming is a must (see the sketch after this list).
9. Should be able to handle different file formats (ORC, Avro, and Parquet) as well as unstructured data.
10. Should have experience with at least one NoSQL data store, e.g. Amazon S3.
11. Should have worked with at least one data warehouse tool, such as AWS Redshift, Snowflake, or BigQuery.
12. Work experience on at least one cloud: AWS, GCP, or Azure.
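For illustration only, a minimal sketch of the kind of Kafka-to-Spark streaming job referenced in item 8 above, written in Python since the posting prefers it. The broker address, topic name, event schema, and output paths are hypothetical placeholders, not part of the role description.

```python
# Minimal sketch: read a Kafka topic with Spark Structured Streaming and land it as Parquet.
# Broker, topic, schema, and paths below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, LongType

spark = (SparkSession.builder
         .appName("kafka-to-parquet")
         .getOrCreate())

# Hypothetical schema for the JSON payload carried in each Kafka message.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("ts", LongType()),
])

# Read the stream directly from Kafka (the spark-sql-kafka package must be on the classpath).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "events")                      # placeholder topic
       .load())

# Kafka values arrive as bytes; cast to string and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), event_schema).alias("e"))
          .select("e.*"))

# Write the parsed stream as Parquet, with a checkpoint location for fault tolerance.
query = (events.writeStream
         .format("parquet")
         .option("path", "s3a://bucket/events/")            # placeholder output path
         .option("checkpointLocation", "s3a://bucket/chk/") # placeholder checkpoint path
         .outputMode("append")
         .start())

query.awaitTermination()
```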
1. Experience in AWS cloud services like EMR, S3, Redshift, EKS/ECS, etc.
2. Experience in GCP cloud services like Dataproc, Google Cloud Storage, etc.
3. Experience working with huge big data clusters with millions of records.
4. Experience working with the ELK stack, especially Elasticsearch (a query sketch follows this list).
5. Experience in Hadoop MapReduce, Apache Flink, Kubernetes, etc.
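As a small illustration of the Elasticsearch experience mentioned in item 4 above, a minimal query sketch using the official Python client. The host, index name, and field names are hypothetical placeholders.

```python
# Minimal sketch: query Elasticsearch for recent error-level log events.
# Host, index, and field names are hypothetical placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder host

# Fetch up to five ERROR-level documents from the last hour in a hypothetical index.
resp = es.search(
    index="app-logs",
    query={
        "bool": {
            "must": [{"match": {"level": "ERROR"}}],
            "filter": [{"range": {"@timestamp": {"gte": "now-1h"}}}],
        }
    },
    size=5,
)

for hit in resp["hits"]["hits"]:
    print(hit["_source"])
```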
GlobalLogic
8.0 - 14.0 Lacs P.A.