
Spark Scala+AWS+SQL Developer - GO/JC/21445/2025

0 - 2 years

0 Lacs

Posted: 1 month ago | Platform: LinkedIn


Work Mode

On-site

Job Type

Full Time

Job Description

Spark Scala+AWS+SQL Developer (SA/M)

A Spark Scala+AWS+SQL Developer is responsible for building and maintaining distributed data processing systems using Apache Spark and Scala, leveraging AWS cloud services for scalable, efficient data solutions. The role involves developing ETL/ELT pipelines, optimizing Spark jobs, and crafting complex SQL queries for data transformation and analysis. Collaborating with teams, ensuring data quality, and following coding best practices are essential aspects of the role.

Core skills include:

• Proficiency in Apache Spark and Scala programming.
• Expertise in SQL for database management and optimization.
• Experience with AWS services such as S3, EMR, Glue, and Redshift.
• Knowledge of data warehousing, data lakes, and big data tools.

The position suits candidates who are passionate about data engineering and want to work in dynamic, cloud-based environments.

Key Responsibilities:

• Data Pipeline Development
• Cloud-based Solutions
• Data Processing & Transformation
• Performance Optimization
• Collaboration & Communication
• Data Quality & Security
• Continuous Improvement

Skills and Knowledge:

1. Apache Spark:
   • Proficiency in creating distributed data processing pipelines.
   • Hands-on experience with Spark components such as RDDs, DataFrames, Datasets, and Spark Streaming.
2. Scala Programming:
   • Expertise in Scala for developing Spark applications.
   • Knowledge of functional programming concepts.
3. AWS Services:
   • Familiarity with key AWS tools such as S3, EMR, Glue, Lambda, Redshift, and Athena.
   • Ability to design, deploy, and manage cloud-based solutions.
4. SQL Expertise:
   • Ability to write complex SQL queries for data extraction, transformation, and reporting.
   • Experience in query optimization and database performance tuning.
5. Data Engineering:
   • Skills in building ETL/ELT pipelines for seamless data flow.
   • Understanding of data lakes, data warehousing, and data modeling.
6. Big Data Ecosystem:
   • Knowledge of Hadoop, Kafka, and other big data tools (optional but beneficial).
7. Version Control and CI/CD:
   • Proficiency in Git for version control.
   • Experience with continuous integration and deployment pipelines.
8. Performance Tuning:
   • Expertise in optimizing Spark jobs and SQL queries for efficiency.

Soft Skills:

• Strong problem-solving abilities.
• Effective communication and collaboration skills.
• Attention to detail and adherence to coding best practices.

Domain Knowledge:

• Familiarity with data governance and security protocols.
• Understanding of business intelligence and analytics requirements.

Skills Required

Role: Spark Scala+AWS+SQL Developer
Industry Type: IT/Computers - Software
Functional Area: IT-Software
Required Education: Any Graduate - B.Tech
Employment Type: Full Time, Permanent

Key Skills: Apache Spark, Scala Programming, SQL Expertise, AWS Services, ETL/ELT Pipelines

Other Information

Job Code: GO/JC/21445/2025
Recruiter Name: SPriya
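To illustrate the functional programming style the skills list above refers to, here is a minimal sketch in plain Scala collections (no Spark dependency, since Spark needs a cluster runtime); the same filter/groupBy/aggregate chain translates almost directly to Spark's RDD and Dataset APIs. All names (`Order`, `totalsAbove`) are illustrative, not from the posting.

```scala
// Hypothetical record type for a toy ETL-style transformation.
case class Order(customer: String, amount: Double)

object EtlSketch {
  // Total spend per customer, keeping only totals at or above a threshold.
  // Each step mirrors a common Spark transformation.
  def totalsAbove(orders: Seq[Order], threshold: Double): Map[String, Double] =
    orders
      .filter(_.amount > 0)                              // drop invalid rows
      .groupBy(_.customer)                               // group by key (a shuffle in Spark)
      .map { case (c, os) => c -> os.map(_.amount).sum } // aggregate per key
      .filter { case (_, total) => total >= threshold }  // post-aggregation filter

  def main(args: Array[String]): Unit = {
    val orders = Seq(Order("a", 10.0), Order("a", 5.0), Order("b", 2.0))
    println(totalsAbove(orders, 10.0)) // Map(a -> 15.0)
  }
}
```

In Spark the same logic would typically be expressed on a `Dataset[Order]` with `groupBy`/`agg`, which is the kind of translation interviewers for a role like this often probe.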

Golden Opportunities
