Module Lead - Data Fabric

Experience: 6 - 8 years

Salary: 3 - 10 Lacs

Posted: 3 months ago | Platform: Foundit

Skills Required

Airflow, Cloud Platforms (GCP/AWS), ETL pipeline design

Work Mode

On-site

Job Type

Full Time

Job Description

As a Module Lead in the Data Fabric POD, you will be responsible for producing and implementing functional software solutions. You will work with upper management to define software requirements and take the lead on operational and technical projects. You will work on a data management and science platform that provides Data as a Service (DaaS) and Insights as a Service (IaaS) to internal employees and external stakeholders. You are eager to learn, technology-agnostic, and love working with data and drawing insights from it. You have excellent organization and problem-solving skills and are looking to build the tools of the future. You have exceptional communication and leadership skills and the ability to make quick decisions.

Educational Qualifications: B.Tech/B.E. in Computers

Responsibilities:

- Breaking down work and orchestrating the development of components for each sprint.
- Identifying risks and forming contingency plans to mitigate them.
- Liaising with team members, management, and clients to ensure projects are completed to standard.
- Inventing new approaches to detecting existing fraud, staying ahead of the game by predicting future fraud techniques, and building solutions to prevent them.
- Developing zero-defect software that is secure, instrumented, and resilient.
- Creating design artifacts before implementation.
- Developing test cases before or in parallel with implementation.
- Ensuring the software developed passes static code analysis and performance and load tests.
- Developing the various kinds of components (UI components, APIs, business components, image processing, etc.) that define the IDfy platforms and drive cutting-edge fraud detection and analytics.
- Developing software using Agile methodology and tools that support it.

Skills Required: Airflow, ETL, ETL pipeline design, Spark, Hadoop, Hive, System Architecture

Requirements:

- Know-how of Apache Beam, ClickHouse, Grafana, InfluxDB, Elixir, BigQuery, and Logstash.
- An understanding of product development methodologies.
- A strong understanding of relational databases, especially SQL, and hands-on experience with OLAP.
- Experience in creating data ingestion and ETL (Extract, Transform & Load) pipelines; Apache Beam or Apache Airflow experience is good to have (see the sketches after this list).
- Strong design skills in defining API data contracts, OOAD, microservices, and data models.
- Experience with time-series databases (we use InfluxDB) and alerting/anomaly-detection frameworks.
- Visualization layers: Metabase, Power BI, Tableau.
- Experience developing software in the cloud, such as GCP/AWS.
- A passion for exploring new technologies and expressing yourself through technical blogs.
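For candidates new to the stack above, here is a minimal sketch of the kind of ETL pipeline the role describes, written against Apache Airflow 2.x's TaskFlow API. The DAG name, schedule, and records are hypothetical illustrations, not IDfy's actual pipeline.

```python
# Minimal ETL sketch using Airflow 2.x's TaskFlow API.
# All names and data are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[dict]:
        # Extract: pull raw records from a source system (stubbed here).
        return [{"user_id": 1, "event": "login", "latency_ms": 120.0}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Transform: normalise/enrich records before loading.
        return [{**r, "processed_at": datetime.utcnow().isoformat()} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # Load: write to the warehouse (e.g. BigQuery via a provider hook).
        print(f"loading {len(records)} records")

    load(transform(extract()))


example_etl()
```

Likewise, the time-series requirement usually means querying metrics and alerting on them. Below is a rough sketch using the official influxdb-client package for InfluxDB 2.x; the URL, token, org, bucket, measurement name, and threshold are all made-up placeholders.

```python
# Naive latency alert against InfluxDB 2.x via the influxdb-client package.
# Connection details, bucket, measurement, and threshold are assumptions.
from influxdb_client import InfluxDBClient

FLUX = '''
from(bucket: "metrics")
  |> range(start: -15m)
  |> filter(fn: (r) => r._measurement == "api_latency_ms")
  |> mean()
'''

with InfluxDBClient(url="http://localhost:8086", token="TOKEN", org="demo") as client:
    for table in client.query_api().query(FLUX):
        for record in table.records:
            # Flag when mean latency over the window breaches a fixed threshold.
            if record.get_value() > 500:
                print(f"ALERT: mean latency {record.get_value():.1f} ms over 15m")
```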

IDfy

Software Development

Mumbai, Maharashtra
