Posted: 3 weeks ago
Work from Office
Full Time
Key Responsibilities

Python & PySpark:
- Writing efficient ETL (Extract, Transform, Load) pipelines (see the first sketch after this list).
- Implementing data transformations using PySpark DataFrames and RDDs.
- Optimizing Spark jobs for performance and scalability.

Apache Spark:
- Managing distributed data processing.
- Implementing batch and streaming data processing (see the streaming sketch below).
- Tuning Spark configurations for efficient resource utilization.

Unix Shell Scripting:
- Automating data workflows and job scheduling.
- Writing shell scripts for file management and log processing.
- Managing cron jobs for scheduled tasks.

Google Cloud Platform (GCP) & BigQuery:
- Designing data warehouse solutions using BigQuery.
- Writing optimized SQL queries for analytics.
- Integrating Spark with BigQuery for large-scale data processing (see the BigQuery sketch below).
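To illustrate the first responsibility area, here is a minimal PySpark ETL sketch: extract from CSV, transform with the DataFrame API, load to partitioned Parquet. The file paths, column names, and schema are hypothetical placeholders, not details from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw CSV with a header row (path is a placeholder).
raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Transform: cast types, drop invalid rows, derive a date column.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write date-partitioned Parquet for downstream analytics.
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "/data/curated/orders"
)

spark.stop()
```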
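For the streaming and tuning responsibilities, a Structured Streaming sketch follows. The Kafka broker, topic, checkpoint path, and the specific configuration values are illustrative assumptions; real settings would be sized to the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("stream-sketch")
    # Kafka source connector; the version here is an assumption and
    # must match your Spark build.
    .config("spark.jars.packages",
            "org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.0")
    # Illustrative tuning: align shuffle partitions with cluster cores.
    .config("spark.sql.shuffle.partitions", "64")
    .config("spark.executor.memory", "4g")
    .getOrCreate()
)

# Read events from a hypothetical Kafka topic; cap per-trigger intake
# so micro-batches stay small and predictable.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .option("maxOffsetsPerTrigger", "10000")
    .load()
)

# Count events per key in one-minute windows, bounding state with a
# watermark on the Kafka message timestamp.
counts = (
    events.selectExpr("CAST(value AS STRING) AS key", "timestamp")
    .withWatermark("timestamp", "5 minutes")
    .groupBy(F.window("timestamp", "1 minute"), "key")
    .count()
)

query = (
    counts.writeStream.outputMode("update")
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/events")
    .start()
)
query.awaitTermination()
```

The checkpoint location is what lets the job restart without reprocessing offsets; in production it would point at durable storage rather than /tmp.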
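Finally, a sketch of the Spark-to-BigQuery integration, assuming the open-source spark-bigquery connector; the connector version, project, dataset, table, and staging bucket names are all hypothetical.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("bq-sketch")
    # spark-bigquery connector; version is an assumption.
    .config("spark.jars.packages",
            "com.google.cloud.spark:"
            "spark-bigquery-with-dependencies_2.12:0.36.1")
    .getOrCreate()
)

# Read a BigQuery table into a Spark DataFrame (table is a placeholder).
sales = (
    spark.read.format("bigquery")
    .option("table", "my-project.analytics.sales")
    .load()
)

# Aggregate at scale in Spark, then write the result back to BigQuery.
daily = sales.groupBy("sale_date").sum("amount")

(
    daily.write.format("bigquery")
    .option("table", "my-project.analytics.daily_sales")
    # Indirect writes stage data through GCS; bucket is a placeholder.
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)

spark.stop()
```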
Allegis Group