Manager (AI Hub GTS)

Experience: 7 - 9 years

Posted: 1 week ago | Platform: Foundit


Work Mode: On-site

Job Type: Full Time

Job Description

Roles & Responsibilities
  • Lead the design, development, and implementation of scalable data engineering solutions using platforms such as Databricks and Microsoft Fabric, aligning with enterprise architecture and business goals.
  • Own the architectural vision for product and application development, ensuring alignment with organizational strategy and technical standards.
  • Drive innovation by evaluating emerging technologies and integrating them into solution roadmaps.
  • Establish and enforce coding standards and best practices across teams through structured code reviews and technical mentoring.
  • Oversee the estimation process for solution development efforts, ensuring accuracy and alignment with delivery timelines and resource planning.
  • Ensure comprehensive documentation of solutions, including technical specifications, testing protocols, and datasets, to support maintainability and audit readiness.
  • Provide technical leadership and guidance to cross-functional teams, fostering a culture of excellence and continuous improvement.
  • Collaborate with audit professionals and business stakeholders to understand regulatory, risk, and operational requirements, ensuring solutions are compliant and value-driven.
  • Facilitate knowledge sharing through team meetings, brainstorming sessions, and technical workshops.
  • Champion best practices in data engineering, architecture design, testing, and documentation to ensure high-quality deliverables.
  • Stay hands-on with critical aspects of system and model design, development, and validation to ensure robustness and scalability.
  • Monitor and optimize performance of deployed systems, proactively identifying areas for improvement.
  • Lead initiatives within the Data Engineering and Architecture practice area, contributing to capability building, asset development, and strategic growth.
  • Stay abreast of industry trends and advancements to maintain a competitive edge and drive continuous innovation.

Mandatory Technical & Functional Skills
  • Strong understanding of MPP databases and RDBMS fundamentals
  • Hands-on experience with cloud platforms (SaaS/PaaS), preferably Azure
  • Expertise in cloud databases and data warehouses, e.g. Azure SQL, Synapse, etc.
  • Working knowledge of NoSQL databases, e.g. MongoDB, Cassandra, Redis
  • In-depth knowledge of the Spark ecosystem and APIs
  • Exposure to Databricks and PySpark (a brief illustrative sketch follows this list)
  • Clear understanding of data lakes and data lakehouses
  • Good understanding of Unity Catalog, Delta Live Tables, MLflow, etc.
  • Exposure to streaming
  • Solid knowledge of building data pipelines using ADF, Synapse, Glue, etc.
  • Familiarity with event-driven designs and messaging using Service Bus and Event Grid
  • Exposure to serverless orchestrators, e.g. Logic Apps, Function Apps, Airflow, etc.
  • Familiarity with CI/CD using GitHub Actions or Azure DevOps
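
The Databricks/PySpark items above refer, in practice, to work like the following minimal sketch. It is not part of the original posting: the landing path, column names, and target table are hypothetical, and a Databricks runtime (or local Spark with the delta-spark package) is assumed.

    from pyspark.sql import SparkSession, functions as F

    # On Databricks a SparkSession is already available as `spark`;
    # locally, Delta table support additionally requires delta-spark.
    spark = SparkSession.builder.appName("ingest-demo").getOrCreate()

    # Read raw records from a hypothetical landing zone in the data lake.
    raw = spark.read.json("/mnt/landing/transactions/")

    # Light cleansing: de-duplicate and normalise the event timestamp.
    cleaned = (
        raw.dropDuplicates(["transaction_id"])
           .withColumn("posted_at", F.to_timestamp("posted_at"))
    )

    # Persist as a Delta table for downstream Synapse / reporting layers
    # (assumes the target database, e.g. `bronze`, already exists).
    cleaned.write.format("delta").mode("overwrite").saveAsTable("bronze.transactions")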

Preferred Technical & Functional Skills

  • Backend frameworks: FastAPI, Django (a minimal FastAPI example follows this list)
  • Machine learning frameworks: TensorFlow, PyTorch
  • REST APIs
  • Experience working with frameworks such as LangChain, LlamaIndex, LlamaParse, LlamaCloud, Semantic Kernel, etc.
  • Certifications: relevant certifications such as Microsoft Certified AI-102, DP-700, DP-900, or AWS certifications
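
A minimal sketch of the FastAPI-style backend work referenced above. It is not from the original posting: the endpoint and model are hypothetical, and FastAPI plus uvicorn are assumed to be installed.

    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI(title="data-service-demo")

    class HealthStatus(BaseModel):
        status: str

    @app.get("/health", response_model=HealthStatus)
    def health() -> HealthStatus:
        # Liveness probe of the kind a data/ML service backend would expose.
        return HealthStatus(status="ok")

    # Run locally with: uvicorn main:app --reload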

Key Behavioral Attributes/Requirements
  • Strong analytical, problem-solving, and critical-thinking skills
  • Excellent collaboration skills, with the ability to work effectively in a team-oriented environment
  • Excellent written and verbal communication skills, with the ability to present complex technical concepts to non-technical audiences
  • Willingness to learn new technologies and work on them

Qualifications

This role is for you if you have the following:

Educational Qualifications

  • Minimum qualification required: B.Tech in Computer Science / M.Tech / MCA (full-time education)

Work Experience

  • 7-9 years of experience in designing and developing data-centric applications using various tools and technologies, e.g. databases, reporting, ETL, NoSQL, etc.
  • 5+ years of experience in designing and architecting solutions using Microsoft data technologies such as ADF/Synapse
  • Relevant data professional certifications (Databricks, AWS, GCP, or Azure)
#KGS

Company: KPMG India
Industry: Professional Services
Location: Pune
