Pay and Benefits:
- Competitive compensation, including base pay and annual incentive
- Comprehensive health and life insurance and well-being benefits, based on location
- Pension / Retirement benefits
- Paid Time Off, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being
- DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays and a third day unique to each team or employee).
The impact you will have in this role:
Being a member of the Data Services Platform Delivery team means being part of a technology team with a rich, diverse skill set and a phenomenal, hard-working, committed group of colleagues. Whether a project calls for Snowflake, Java, the Spring suite, Python, data analytics, Unix, cloud computing, or database skills, we collaborate and help each other achieve the common goal. We are embarking on an incredible multi-year Data Transformation journey, and we are looking for best-of-breed software engineers to join us.
We're looking for a passionate engineer to help design and build platforms that power the next generation of data products.
In this role you will be responsible for building platforms for next-generation data products. You'll work within the Data Platform Squad to develop secure, resilient, scalable solutions in Snowflake, Java, or Python, delivered to the marketplace via multiple delivery mechanisms. The solution will be built with the latest cloud tools and industry standards. This role offers strong opportunities for growth driven by your performance and contributions to our strategic goals.
Qualifications:
- Minimum 10 years of related experience
- Bachelor's degree (preferred) or equivalent experience
Primary Responsibilities:
- Act as a technical expert on the development of one or more applications, including designing and developing robust, scalable platforms that transform data into a useful format for analysis, enhance data flow, and enable efficient consumption and analysis of data.
- Partner with enterprise teams to identify and deploy efficient hosting environments.
- Research and evaluate technical solutions consistent with DTCC technology standards.
- Contribute expertise to the design of components or individual programs and participate in the unit and functional testing.
- Collaborate with teams across the software development lifecycle, including those responsible for testing, troubleshooting, operations and production support.
- Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately.
- Write complex, performance-optimized SQL queries against Snowflake.
- Convert logical data models to physical data models, DDL, roles, and views, and enhance them as required.
- Participate in daily scrums, project related meetings, backlog grooming, sprint planning and retrospective sessions.
- Ensure operational readiness of the services and meet the commitments to our customers regarding reliability, availability, and performance.
- Be responsible for the technical quality of projects by ensuring that key technical procedures, standards, quality-control mechanisms, and tools are properly used, including performing root-cause analyses for technical problems and conducting quality reviews.
- Work across functions and across teams: we don't only work on the code we own; we partner with others on the successful delivery of data products every day.
Talents Needed for Success:
We recognize that expertise in software development can be gained through many different paths. Below are the key skills we value for this role; not all are required, but the ones you bring should be demonstrated at an exceptional level to succeed in this position.
- Application development in Java and related technologies (Java, J2EE, Spring Boot/Batch/Core/MVC/JDBC, JUnit, AWS SDKs) and/or Python (Polars/Pandas, Snowpark, NumPy, SciPy, AWS SDKs, pytest), with static analyzers (Sonar/Fortify) and gating for code quality
- Hands-on experience with database architecture, import, export, performance techniques, data modeling, database table design, and writing complex SQL queries
- Solid understanding of Unix/Linux, including shell scripting, Perl, and/or Python
- Solid understanding of Agile, CI/CD, and DevOps practices and tools such as Maven, Jenkins, Nexus, Fortify, and Liquibase
- Exposure to design and architecture is a plus
- Demonstrates strong analytical and interpersonal skills
- Experienced in working with a geographically separated (onshore + offshore) team
- Must understand the Agile development process and be committed to delivering assignments as planned and agreed.
- Ability to collaborate effectively with other developers and co-workers including distributed team members.
- Strong communication skills and a desire to learn and contribute; a self-starter and phenomenal teammate.
Nice to have:
- Proven background in database concepts: data management, governance, modeling, and development.
- Snowflake architecture, SnowSQL, Snowpark, Snowpipe, Tasks, Streams, Dynamic Tables, Time Travel, the query optimizer, data sharing, and stored procedures.
- Design patterns in Java/Python and cloud design patterns
- Time Series Analysis for financial data
- Experience with BI tools such as QuickSight, Looker, or Power BI is a plus.
- Familiarity with container technologies like Docker, Kubernetes, OpenShift will be a plus.
- Proven understanding of Agile, CI/CD, and DevOps practices and tools.
- AWS experience
- Excellent oral and written English
Actual salary is determined based on the role, location, individual experience, skills, and other considerations. Please contact us to request accommodation.