Consultant Engineer for Liquidity Program - India, Gurugram

Posted: 4 days ago | Platform: LinkedIn

Work Mode: On-site
Job Type: Contractual

Job Description

Key points about the role:
  • We seek a mid-level engineer to design and build Liquidity calculations on our bespoke Data Calculation Platform (DCP), based on documented business requirements.
  • The front end of the DCP is Dataiku, but prior experience with Dataiku is not necessary if you have worked as a data engineer.
  • Liquidity experience is not necessary, but experience designing and building to business requirements would be helpful.
  • The role requires working three days per week in the Gurugram office.
  • You will work as part of a team located in both Sydney and Gurugram; the reporting manager is based in Gurugram and project leadership in Sydney.
You will be an integral part of a dynamic team of business and technology experts, collaborating with a network of technologists across our programme. Your role will involve working within a dedicated squad to ingest data from producers and implement essential liquidity calculations, all within a cutting-edge data platform. We place a strong emphasis on delivering a high-performing, robust, and stable platform that meets the business needs of both internal and external stakeholders.

In this role, you will bring in-depth knowledge of big data technologies and a strong desire to work in a DevOps environment, where you will have end-to-end accountability for designing, developing, testing, deploying, and supporting your data assets. You will also create templates, implementation methods, and standards to ensure consistency and quality. You will manage deadlines, articulate technical challenges and solutions, and contribute to the development of improved processes and practices. A growth mindset, a passion for learning, and the ability to adapt quickly to innovative technologies will be essential to your success in this role.
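
For a purely illustrative flavour of this kind of work, the sketch below shows a minimal PySpark aggregation that turns ingested cash-flow records into a simplified coverage-style liquidity metric. None of it comes from the role description: the table, column names, and the metric formula are hypothetical placeholders, and the actual calculations built on the DCP will differ.

```python
# Illustrative sketch only: hypothetical data and a simplified coverage-style metric,
# not the actual DCP liquidity calculations.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("liquidity-sketch").getOrCreate()

# Hypothetical ingested cash-flow records; in practice these would be read from
# big data storage (e.g. Parquet) rather than created inline.
cash_flows = spark.createDataFrame(
    [
        ("2024-06-30", "HQLA", 120.0),
        ("2024-06-30", "OUTFLOW", 80.0),
        ("2024-06-30", "INFLOW", 20.0),
    ],
    ["as_of_date", "category", "amount"],
)

# Aggregate by reporting date and derive a simple ratio of liquid assets to net outflows,
# capping inflows at 75% of outflows (a simplification of the usual LCR treatment).
summary = (
    cash_flows.groupBy("as_of_date")
    .pivot("category", ["HQLA", "OUTFLOW", "INFLOW"])
    .agg(F.sum("amount"))
    .withColumn(
        "net_outflow",
        F.col("OUTFLOW") - F.least(F.col("INFLOW"), F.col("OUTFLOW") * 0.75),
    )
    .withColumn("coverage_ratio", F.col("HQLA") / F.col("net_outflow"))
)

summary.show()
```

In practice the inputs would come from producer feeds on the platform and the outputs would feed downstream reporting; the sketch is only meant to illustrate the Spark and Python skill set described below.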

What You Offer

  • Experience in Big Data technologies, specifically Spark, Python, Hive, SQL, Presto (or other query engines), big data storage formats (e.g., Parquet), orchestration tools (e.g., Apache Airflow), and version control (e.g., Bitbucket); a small illustrative orchestration sketch follows this list.
  • Proficiency in developing configuration-based ETL pipelines and user-interface driven tools to optimize data processes and calculations (e.g., Dataiku).
  • Experience in analysing business requirements and designing solutions, including data models, data pipelines, and calculations, as well as presenting solution options and recommendations.
  • Experience working in a cloud-based environment (ideally AWS), with a solid understanding of cloud computing concepts (EC2, S3), Linux, and containerization technologies (Docker and Kubernetes).
  • A background in solution delivery within the finance or treasury business domains, particularly in areas such as Liquidity or Capital, is advantageous.
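
As a minimal sketch of the orchestration skills mentioned above (assuming Apache Airflow 2.x; the DAG id, task names, and callables are hypothetical placeholders, not part of the actual platform):

```python
# Illustrative sketch only: a minimal Airflow 2.x DAG that runs a calculation step
# after an ingestion step. All ids and callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_cash_flows():
    # Placeholder: in practice this would land producer data in big data storage (e.g. Parquet on S3).
    print("ingesting cash flows")


def run_liquidity_calc():
    # Placeholder: in practice this would trigger the Spark/Dataiku calculation job.
    print("running liquidity calculation")


with DAG(
    dag_id="liquidity_pipeline_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # `schedule` requires Airflow 2.4+; older versions use `schedule_interval`
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_cash_flows", python_callable=ingest_cash_flows)
    calculate = PythonOperator(task_id="run_liquidity_calc", python_callable=run_liquidity_calc)

    # The calculation runs only after ingestion succeeds.
    ingest >> calculate
```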
When applying, please provide your "Desired Pay" as CTC in LPA.
