
1 Code Verification Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

Experience: 3.0 - 5.0 years

Salary: 25 - 40 Lacs

Location: Bengaluru

Work mode: Hybrid

The Modern Data Engineer is responsible for designing, implementing, and maintaining scalable data architectures using cloud technologies, primarily on AWS, to support the next evolutionary stage of the Investment Process. They build robust data pipelines, optimize data storage and access patterns, and ensure data quality while collaborating across engineering teams to deliver high-value data products.

Key Responsibilities
• Implement and maintain data pipelines for ingestion, transformation, and delivery
• Ensure data quality through validation and monitoring processes
• Collaborate with senior engineers to design scalable data solutions
• Work with business analysts to understand and implement data requirements
• Optimize data models and queries for performance and efficiency
• Follow engineering best practices and contribute to team standards
• Participate in code reviews and knowledge sharing activities
• Implement data security controls and access policies
• Troubleshoot and resolve data pipeline issues

Core Technical Skills
• Cloud Platforms: Proficient with cloud-based data platforms (Snowflake, data lakehouse architecture)
• AWS Ecosystem: Strong knowledge of AWS services including Lambda, Glue, and S3
• Streaming Architecture: Understanding of event-based or streaming data concepts using Kafka
• Programming: Strong proficiency in Python and SQL
• DevOps: Experience with CI/CD pipelines and infrastructure as code (Terraform)
• Data Security: Knowledge of implementing basic data access controls
• Database Systems: Experience with RDBMS (Oracle, Postgres, MSSQL) and exposure to NoSQL databases
• Data Integration: Understanding of data integration patterns and techniques
• Orchestration: Experience using workflow tools (Airflow, Control-M, etc.)
• Engineering Practices: Experience with GitHub, code verification, and validation
• Domain Knowledge: Basic knowledge of investment management industry concepts
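The pipeline, data-quality, and orchestration items above describe a common ingest -> validate -> load pattern. Below is a minimal sketch of that pattern, assuming Airflow 2.x with the PythonOperator; the DAG name, task names, and sample fields are hypothetical and not part of this posting.

# Minimal Airflow 2.x sketch of the ingest -> validate -> load pattern
# described above. DAG, task, and field names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest():
    # Placeholder: pull raw records from an upstream source (e.g. files on S3).
    return [{"isin": "XS0000000000", "price": 101.25}]


def validate(ti):
    # Basic data-quality gate: fail the task if required fields are missing.
    rows = ti.xcom_pull(task_ids="ingest")
    bad = [r for r in rows if r.get("price") is None]
    if bad:
        raise ValueError(f"{len(bad)} rows failed validation")
    return rows


def load(ti):
    # Placeholder: write validated rows to the warehouse (e.g. Snowflake).
    rows = ti.xcom_pull(task_ids="validate")
    print(f"loaded {len(rows)} rows")


with DAG(
    dag_id="positions_daily",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    load_task = PythonOperator(task_id="load", python_callable=load)

    ingest_task >> validate_task >> load_task

The same three-step shape applies whether the tasks invoke Glue jobs, Lambda functions, or Snowflake queries; the orchestrator only schedules, sequences, and retries them.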

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Download Now
