GDS Consulting - AI and DATA - Azure Architect - Manager

Experience: 10 - 12 years

Salary: 10 - 12 Lacs

Posted: 1 week ago | Platform: Foundit


Skills Required

Work Mode: On-site

Job Type: Full Time

Job Description

EY GDS Data and Analytics (D&A) Cloud Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with data and technology. We dive deep into data to extract the greatest value and discover opportunities across key businesses and functions such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity

We're looking for Managers (GTM + Cloud/Big Data Architects) with a strong understanding of technology and data and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your key responsibilities

Proven experience in driving Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations across BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data.

Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc. [10-15 years]

Understand the current and future-state enterprise architecture.

Contribute to various technical streams during project implementation.

Provide product- and design-level technical best practices.

Interact with senior client technology leaders to understand their business goals, and create, architect, propose, develop, and deliver technology solutions.

Define and develop client-specific best practices around data management within a Hadoop or cloud environment.

Recommend design alternatives for data ingestion, processing and provisioning layers

Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark (a batch ingestion sketch follows this list).

Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.
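
For illustration only, here is a minimal PySpark sketch of the kind of batch ingestion job described above. The landing path, key column, and Hive database/table names are hypothetical placeholders, not details taken from this role.

```python
from pyspark.sql import SparkSession, functions as F

# Sketch of a batch ingestion job; all paths, columns, and table names are hypothetical.
spark = (
    SparkSession.builder
    .appName("batch-ingestion-sketch")
    .enableHiveSupport()  # assumes a Hive metastore is available to the cluster
    .getOrCreate()
)

# Read a large raw data set landed as CSV files.
raw = (
    spark.read
    .option("header", "true")
    .csv("/data/landing/transactions/")  # hypothetical landing path
)

# Light standardisation before provisioning to the curated layer.
cleaned = (
    raw.withColumn("ingest_date", F.current_date())
       .dropDuplicates(["transaction_id"])  # hypothetical business key
)

# Append into a date-partitioned Hive table (database and table are hypothetical).
(
    cleaned.write
    .mode("append")
    .partitionBy("ingest_date")
    .saveAsTable("curated.transactions")
)
```

In practice a job like this would be packaged and launched with spark-submit and scheduled by an orchestrator such as Oozie or Airflow.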

Skills and attributes for success

Experience as an architect designing highly scalable solutions on Azure, AWS, and GCP.

Strong understanding of and familiarity with Azure/AWS/GCP/Big Data ecosystem components

Strong understanding of underlying Azure/AWS/GCP Architectural concepts and distributed computing paradigms

Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming

Hands-on experience with major components such as cloud ETL services, Spark, and Databricks

Experience working with NoSQL in at least one of these data stores: HBase, Cassandra, MongoDB

Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions (a streaming sketch follows this list)

Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.

Strong understanding of underlying Hadoop Architectural concepts and distributed computing paradigms

Good knowledge of Apache Kafka and Apache Flume

Experience in enterprise-grade solution implementations.

Experience in performance benchmarking of enterprise applications

Experience in data security (in transit and at rest)

Strong UNIX operating system concepts and shell scripting knowledge
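
To complement the Spark and Kafka integration point above, here is a minimal Structured Streaming sketch. The broker address, topic, payload schema, and output locations are hypothetical; Spark distributes consumption across the topic's Kafka partitions, and several such jobs can subscribe to different topics or partition sets.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Sketch of a real-time ingestion job; requires the spark-sql-kafka connector on the classpath.
spark = SparkSession.builder.appName("kafka-streaming-sketch").getOrCreate()

# Hypothetical JSON payload schema for the incoming events.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", StringType()),
])

# Subscribe to a topic; Spark maps its Kafka partitions to parallel tasks.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "live-events")                # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers the payload as bytes; cast to string and parse with the schema above.
parsed = (
    events.select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*")
)

# Land the parsed stream as Parquet micro-batches (paths are hypothetical).
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/data/streaming/live_events/")
    .option("checkpointLocation", "/checkpoints/live_events/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```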

To qualify for the role, you must have

Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.

Excellent communicator (written and verbal, formal and informal).

Ability to multi-task under pressure and work independently with minimal supervision.

Strong verbal and written communication skills.

Must be a team player and enjoy working in a cooperative and collaborative team environment.

Adaptable to new technologies and standards.

Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support

Responsible for evaluating technical risks and mapping out mitigation strategies

Working knowledge of at least one cloud platform: AWS, Azure, or GCP

Excellent business communication, consulting, and quality-process skills

Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth & Asset Management, or Insurance domains.

Minimum 7 years of hands-on experience in one or more of the above areas.

Minimum 10 years of industry experience.

Ideally, you'll also have

Strong project management skills

Client management skills

Solutioning skills

What we look for

People with technical experience and enthusiasm to learn new things in this fast-moving environment
