Posted: 2 months ago
Work from Office
Full Time
* Building and optimizing big data pipelines, architectures, and datasets to handle large-scale data
* Enhancing infrastructure for scalability, automation, and data delivery improvements
* Developing real-time and batch processing solutions using Kafka, Spark, and Airflow (see the illustrative sketch below)
* Ensuring data governance, security compliance, and high availability
* Collaborating with product, business, and analytics teams to support data needs

Tech Stack:
* Big Data Tools: Spark, Kafka, Databricks (Delta Tables), ScyllaDB, Redshift
* Data Pipelines & Workflow: Airflow, EMR, Glue, Athena
* Programming: Java, Scala, Python
* Cloud & Storage: AWS
* Databases: SQL, NoSQL (ScyllaDB, OpenSearch)
* Backend: Spring Boot

What we expect you will bring to the table:
1. Cutting-Edge Technology & Scale
At Gameskraft, you will be working on some of the most advanced big data technologies, including Databricks Delta Tables, ScyllaDB, Spark, Kafka, Airflow, and Spring Boot.
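For context only, a minimal sketch of the kind of real-time pipeline described above: Spark Structured Streaming in Scala consuming a Kafka topic and appending the raw events to a Delta table. The broker address, topic name, and storage paths are hypothetical placeholders, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession

object GameEventsStream {
  def main(args: Array[String]): Unit = {
    // Assumes the delta-spark and spark-sql-kafka connector packages are on the classpath.
    val spark = SparkSession.builder()
      .appName("game-events-stream")
      .getOrCreate()

    // Read raw events from a Kafka topic; broker and topic names are placeholders.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "game-events")
      .load()

    // Kafka delivers key/value as binary; cast the payload to string for downstream parsing.
    val events = raw.selectExpr("CAST(value AS STRING) AS json", "timestamp")

    // Append into a Delta table; the checkpoint location is what makes the stream
    // restartable with exactly-once writes. Paths are hypothetical.
    events.writeStream
      .format("delta")
      .option("checkpointLocation", "s3://example-bucket/checkpoints/game-events")
      .outputMode("append")
      .start("s3://example-bucket/delta/game-events")
      .awaitTermination()
  }
}
```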