Posted: 17 hours ago
Hybrid | Full Time
Title: Senior Data Engineer
Job Level: 4
Job Code:
Dept: R&D
Sub-Dept: R&D DW-AI
Division (P&L): Product - DW
Location: India

About InvestCloud

InvestCloud is at the forefront of wealth technology, offering innovative solutions that redefine how the financial services industry operates. With a global presence and a client-first approach, we specialize in digital transformations powered by our flexible, modular technology.

About the Team

As a Senior Data Engineer, you will join the newly formed AI, Data & Analytics team, whose mission is to drive increased value from the data InvestCloud captures and enable a smarter financial future for our clients through enhanced intelligence. Ensuring we have fit-for-purpose, modern capabilities is a key goal for the team.

This is a unique opportunity to shape the architecture and technical execution patterns of a greenfield ecosystem and create a next-generation advisor and client experience. We're building for scale: much of what we design and implement today will be the technology and infrastructure that serves thousands of clients and petabyte-level volumes of data.

The core stack we use and are building on:
- AWS as our cloud provider
- Oracle as our legacy data warehouse
- Snowflake as our next-gen data warehouse
- Mage AI for data ingestion and processing
- Kafka as our message bus
- Terraform for building infrastructure

Key Responsibilities

- Work as part of the team to build reliable, scalable pipelines and capabilities across the platform, and monitor and support the capabilities we offer.
- Construct complex architectures that tie multiple services, SaaS tooling, and third-party data together, leveraging a strong understanding of a cloud-based stack.
- Contribute to the technical strategy of the team and its execution through prioritization and delivery management.
- Set high standards across documentation, testing, resiliency, monitoring, and code quality, and enforce these standards by holding your team accountable.
- Drive towards efficiency: lower our cloud spend, tackle our tech debt, and look for ways to simplify code, infrastructure, and data models across the platform.
- Write well-rounded, reusable, and documented code that captures the essential nature of the solution.
- Inspire, teach, and guide your fellow team members; lead design sessions, participate in code reviews, and take ownership of operational processes.
- Promote data quality, governance, and security as first-class citizens of the platform, complying with relevant regulations.

Required Skills

- You have at least 6 years of relevant professional experience in Data Engineering.
- You've participated in shaping the architecture of a mature cloud data platform (AWS, GCP, Azure), designed for different consumer types and providing quantifiable business value.
- You have hands-on experience building resilient batch (Airflow, Fivetran, Mage AI, Airbyte) and streaming (Kafka, Kinesis, Flink, Spark) data pipelines at scale (> 1 TB/day).
- You have designed and implemented performant, reusable, and scalable data models in a cloud data warehouse (dbt, BigQuery, Snowflake) and have working experience with legacy ecosystems (Oracle, Postgres).
- You've run PoCs, planned large migrations of data and code, and participated in roadmap planning multiple times before.
- You can build and maintain your own infrastructure through IaC (Terraform, OpenTofu, Ansible), containerization (Docker), and CI/CD (Jenkins, GitHub Actions); you rely on DevOps expertise when needed.
- You champion operational procedures, from data and infrastructure observability (Monte Carlo, Datadog, Prometheus) to alerting and incident management (PagerDuty, incident.io).
- You are highly proficient in SQL and Python and are confident applying them across data engineering tasks.
- You are a strong communicator and collaborator, able to engage with both technical and non-technical teams. You listen actively and contribute constructively.
- You're AI-proficient or AI-curious, eager to explore how emerging technologies can enhance your and your team's productivity. You have experimented with Machine Learning frameworks (TensorFlow, PyTorch, Scikit-learn) and LLM frameworks (e.g. LangChain).

If you don't meet every requirement but believe you'd thrive in this role, we'd still love to hear from you. We're always keen to speak to people who connect with our mission and values.
Location and Travel
The ideal candidate will be expected to work from the office on a regular basis (3 days minimum per week). Occasional travel may be required.

Compensation

The salary range will be determined based on experience, skills, and geographic location.

Equal Opportunity Employer

InvestCloud is committed to fostering an inclusive workplace and welcomes applicants from all backgrounds.