Posted: 2 weeks ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description

 
  • Study the existing technology landscape, understand the current data integration framework, and perform impact assessments for new requirements.
  • Develop Spark jobs in Scala for new project requirements.
  • Enhance existing Spark jobs for ongoing product enhancements.
  • Carry out performance tuning and stress testing of Spark jobs.
  • Create data pipelines for new or enhanced Spark jobs using AWS Lambda or Apache Airflow (see the orchestration sketch after this list).
  • Own the database design process: logical design, physical design, star schemas, snowflake schemas, etc.
  • Analyze data processing, integration, modelling, and reporting requirements, and define data loading strategies based on volume, data types, frequency, and analytics specifications.
  • Ensure an optimal balance between cost and performance.
  • Maintain project documentation and adhere to quality guidelines and schedules.
  • Work hand in hand with the project manager for successful project delivery and assist with estimation, scoping, and scheduling.
  • Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the cloud architecture; resolve difficult design and development issues.
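
The orchestration work described above could look like the following minimal Apache Airflow DAG sketch, which submits a Spark job on a daily schedule. This is an illustrative example only; the DAG id, application path, and connection id are hypothetical placeholders, not something specified by this posting.

    # Minimal Airflow DAG sketch: schedule and submit a Spark job daily.
    # All names (dag_id, application path, conn_id) are illustrative placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

    with DAG(
        dag_id="daily_ingest_pipeline",      # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        run_spark_job = SparkSubmitOperator(
            task_id="run_ingest_job",
            application="s3://example-bucket/jobs/ingest_job.py",  # placeholder path
            conn_id="spark_default",
        )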

Work and Technical Experience:

Must-Have:

  • Overall 7-9 years of IT experience
  • 4+ years on AWS-related projects
  • Associate-level and Professional-level AWS certifications are good to have
  • In-depth knowledge of the following AWS services: S3, EC2, EMR, Serverless, Athena, AWS Glue, Lambda, Step Functions
  • Cloud databases (must have): AWS Aurora, SingleStore, Redshift, Snowflake
  • Big data (must have): Hadoop, Hive, Spark, YARN
  • Programming languages (must have): Scala, Python, shell scripting, PySpark
  • Operating systems (must have): any flavor of Linux, Windows
  • Very strong SQL skills (must have)
  • Orchestration tools (must have): Apache Airflow
  • Expertise in developing ETL workflows with complex transformations such as slowly changing dimensions (SCD), deduplication, and aggregation (see the PySpark sketch after this list)
  • Thorough conceptual understanding of AWS VPCs, subnets, security groups, and route tables
  • Quick self-learner, ready to adapt to new AWS services or big data technologies as required
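
As a rough illustration of the transformation expertise listed above, here is a minimal PySpark sketch of deduplication followed by aggregation. The table paths and column names are assumed for the example; nothing in the posting prescribes this exact code.

    # Minimal PySpark sketch: deduplicate raw records, then aggregate per customer.
    # Table paths and column names are hypothetical examples.
    from pyspark.sql import SparkSession, functions as F, Window

    spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

    orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # placeholder path

    # Deduplicate: keep only the latest record per order_id based on updated_at.
    w = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
    deduped = (
        orders.withColumn("rn", F.row_number().over(w))
              .filter(F.col("rn") == 1)
              .drop("rn")
    )

    # Aggregate: order totals and counts per customer.
    summary = deduped.groupBy("customer_id").agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )

    summary.write.mode("overwrite").parquet("s3://example-bucket/curated/order_summary/")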

Qualifications:

  • Bachelor's degree in computer science, engineering, or a related field (Master's degree is a plus)
  • Demonstrated continued learning through one or more technical certifications or related methods
  • 5+ years of IT experience; minimum 4 years of experience on cloud-related projects

Qualities:

  • Strong technical knowledge and experience
  • Capable of deep dives and research across various technical fields
  • Self-motivated and focused on delivering outcomes for a fast-growing team and firm
  • Able to communicate persuasively through consulting, writing, speaking, and client presentations
  • Able to work in a self-organized and cross-functional team
  • Able to iterate based on new information, peer reviews, and feedback
  • Prior experience working at a large media company would be an added advantage

InfoCepts

Business Consulting and Services

Washington DC, VA
