EY-GDS Consulting-AI and DATA-AWS DBX-Senior

10 - 17 years

10 - 17 Lacs

Posted: 1 day ago | Platform: Foundit


Work Mode: On-site

Job Type: Full Time

Job Description

Your key responsibilities

  • Proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM, and Insurance. Activities include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data.
  • Work with clients to convert business problems/challenges into technical solutions, taking security, performance, scalability, etc. into account. [3-7 years]
  • Understand current- and future-state enterprise architecture.
  • Contribute to various technical streams during project implementation.
  • Provide product- and design-level technical best practices.
  • Interact with senior client technology leaders; understand their business goals; and create, architect, propose, develop, and deliver technology solutions.
  • Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
  • Recommend design alternatives for the data ingestion, processing, and provisioning layers.
  • Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
  • Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.
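Purely as an illustration of the micro-batch ingestion pattern named in the last two bullets (this sketch is not part of the role description): a framework-free Python outline in which an in-memory queue stands in for a Kafka topic, and the function and field names are hypothetical.

```python
import json
import queue

def micro_batch_ingest(source: queue.Queue, sink: list, batch_size: int = 3) -> int:
    """Drain up to `batch_size` raw messages from the source, parse them,
    and append the parsed records to the sink. Returns the number ingested.
    In a real pipeline the source would be a Kafka topic and the sink a
    distributed store; here both are in-memory stand-ins."""
    ingested = 0
    while ingested < batch_size:
        try:
            raw = source.get_nowait()
        except queue.Empty:
            break  # no more messages in this micro-batch window
        record = json.loads(raw)          # parse the raw event
        record["valid"] = "id" in record  # minimal validation step
        sink.append(record)
        ingested += 1
    return ingested

# Simulate a live source with three JSON events.
topic = queue.Queue()
for i in range(3):
    topic.put(json.dumps({"id": i, "amount": i * 10}))

store: list = []
count = micro_batch_ingest(topic, store)
print(count)               # 3
print(store[1]["amount"])  # 10
```

Spark Streaming and Kafka apply the same loop shape at scale: poll a bounded batch, transform and validate, then commit to the sink.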

Tech Stack

AWS

  • Experience building on AWS using S3, EC2, Redshift, Glue, EMR, DynamoDB, Lambda, QuickSight, etc.
  • Experience in PySpark/Spark/Scala.
  • Experience using software version control tools (Git, Jenkins, Apache Subversion).
  • AWS certifications or other related professional technical certifications.
  • Experience with cloud or on-premises middleware and other enterprise integration technologies.
  • Experience in writing MapReduce and/or Spark jobs.
  • Demonstrated strength in architecting data warehouse solutions and integrating technical components.
  • Good analytical skills with excellent knowledge of SQL.
  • 3+ years of work experience with very large data warehousing environments.
  • Excellent communication skills, both written and verbal.
  • 3+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools.
  • 3+ years of experience with data modelling concepts.
  • 3+ years of Python and/or Java development experience.
  • 3+ years of experience in Big Data stack environments (EMR, Hadoop, MapReduce, Hive).
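For illustration only (assumed, not taken from the posting): the Hive-style partition layout that Glue, EMR, and similar catalog-driven tools conventionally read can be sketched in plain Python. The directory names, column names, and file naming below are hypothetical.

```python
import csv
import tempfile
from pathlib import Path

def write_partitioned(records, root: Path, partition_key: str) -> list[Path]:
    """Write records into Hive-style partition directories,
    e.g. root/country=IN/part-0.csv, one file per partition value."""
    groups: dict[str, list[dict]] = {}
    for rec in records:
        groups.setdefault(str(rec[partition_key]), []).append(rec)
    written = []
    for value, rows in groups.items():
        part_dir = root / f"{partition_key}={value}"
        part_dir.mkdir(parents=True, exist_ok=True)
        out = part_dir / "part-0.csv"
        with out.open("w", newline="") as fh:
            writer = csv.DictWriter(fh, fieldnames=list(rows[0]))
            writer.writeheader()
            writer.writerows(rows)
        written.append(out)
    return written

root = Path(tempfile.mkdtemp())
files = write_partitioned(
    [{"country": "IN", "amount": 5}, {"country": "US", "amount": 7}],
    root,
    "country",
)
print(sorted(p.parent.name for p in files))  # ['country=IN', 'country=US']
```

Encoding the partition value in the path lets query engines prune entire directories instead of scanning every file, which is the point of this convention.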

Skills and attributes for success

  • Experience architecting highly scalable solutions on AWS.
  • Strong understanding of and familiarity with all AWS/GCP/Big Data ecosystem components.
  • Strong understanding of underlying AWS/GCP architectural concepts and distributed computing paradigms.
  • Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
  • Hands-on experience with major components such as cloud ETL tools, Spark, and Databricks.
  • Experience working with NoSQL in at least one of the data stores: HBase, Cassandra, MongoDB.
  • Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
  • Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
  • Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
  • Good knowledge of Apache Kafka and Apache Flume.
  • Experience in enterprise-grade solution implementations.
  • Experience in performance benchmarking of enterprise applications.
  • Experience in data security (in motion and at rest).
  • Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have

  • Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
  • Excellent communicator (written and verbal, formal and informal).
  • Ability to multi-task under pressure and work independently with minimal supervision.
  • Team player who enjoys working in a cooperative and collaborative team environment.
  • Adaptable to new technologies and standards.
  • Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
  • Responsible for evaluating technical risks and mapping out mitigation strategies.
  • Experience in data security (in motion and at rest).
  • Experience in performance benchmarking of enterprise applications.
  • Working knowledge of any of the cloud platforms: AWS, Azure, or GCP.
  • Excellent business communication, consulting, and quality-process skills.
  • Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains.
  • Minimum 10 years of industry experience.

Ideally, you'll also have

  • Strong project management skills
  • Client management skills
  • Solutioning skills
