We are looking for an Engineer with extensive experience (8-10 years) in SQL Server Database Development, SSIS (SQL Server Integration Services), and SSRS (SQL Server Reporting Services). This role requires a proactive individual who can work independently and manage multiple tasks efficiently.

Key Responsibilities:

SQL Server Database Development:
- Design, develop, and maintain SQL Server databases.
- Create and optimize stored procedures, functions, views, and triggers.
- Ensure database performance, security, and reliability.
- Perform data modeling and database schema design.

SSIS Development:
- Design and develop ETL (Extract, Transform, Load) processes using SSIS.
- Build data transformations, including importing data from various sources and moving data between databases.
- Debug and optimize SSIS packages to ensure efficient data movement.
- Implement and maintain database objects and security.

SSRS Development:
- Design, develop, and maintain reports using SSRS.
- Create complex reports, including dashboards and scorecards, to meet business requirements.
- Ensure reports are optimized for performance and accuracy.
- Collaborate with business users to gather reporting requirements and provide technical solutions.

General Responsibilities:
- Work closely with business analysts, stakeholders, and other team members to understand data requirements.
- Provide technical guidance and support to other developers.
- Participate in code reviews and ensure adherence to best practices.
- Stay updated with the latest industry trends and technologies related to SQL Server, SSIS, and SSRS.

Qualifications:
- Bachelor's degree.
- Proven experience as a SQL Server Database Developer, SSIS Developer, and SSRS Developer.
- Strong knowledge of T-SQL, data modeling, and database design.
- Proficiency in developing and optimizing SSIS packages and SSRS reports.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and manage multiple tasks effectively.
Consultant Engineer for Liquidity Program - India, Gurugram

Key points about the role:
- We seek a mid-level engineer to design and build Liquidity calculations using our bespoke Data Calculation Platform (DCP), based on documented business requirements.
- The front end of the DCP is Dataiku, but prior experience with Dataiku is not necessary for candidates with data engineering experience.
- Liquidity experience is not necessary, but experience designing and building to business requirements would be helpful.
- There is a requirement to work 3 days in the office in Gurugram.
- You will work as part of a team located in both Sydney and Gurugram. The reporting manager will be based in Gurugram, with project leadership in Sydney.

The Full Job Description Reads As Follows:

You will be an integral part of a dynamic team of business and technology experts, collaborating with a network of technologists across our programme. Your role will involve working within a dedicated squad to ingest data from producers and implement essential liquidity calculations, all within a cutting-edge data platform. We place a strong emphasis on delivering a high-performing, robust, and stable platform that meets the business needs of both internal and external stakeholders.

In this role, you will bring in-depth knowledge of big data technologies and a strong desire to work in a DevOps environment, where you will have end-to-end accountability for designing, developing, testing, deploying, and supporting your data assets. Additionally, you will create templates, implementation methods, and standards to ensure consistency and quality. You will manage deadlines, articulate technical challenges and solutions, and contribute to the development of improved processes and practices. A growth mindset, a passion for learning, and the ability to quickly adapt to innovative technologies will be essential to your success in this role.
What You Offer:
- Experience in big data technologies, specifically Spark, Python, Hive, SQL, Presto (or other query engines), big data storage formats (e.g., Parquet), orchestration tools (e.g., Apache Airflow), and version control (e.g., Bitbucket).
- Proficiency in developing configuration-based ETL pipelines and user-interface-driven tools to optimize data processes and calculations (e.g., Dataiku).
- Experience analysing business requirements and designing solutions, including data models, data pipelines, and calculations, as well as presenting solution options and recommendations.
- Experience working in a cloud-based environment (ideally AWS), with a solid understanding of cloud computing concepts (EC2, S3), Linux, and containerization technologies (Docker and Kubernetes).
- A background in solution delivery within the finance or treasury business domains, particularly in areas such as Liquidity or Capital, is advantageous.

When applying, please provide "Desired Pay" as LPA CTC.
Role Overview:

As a Consultant Engineer for the Liquidity Program in Gurugram, India, you will play a crucial role in designing and building liquidity calculations using the bespoke Data Calculation Platform (DCP) based on documented business requirements. You will be part of a team located in both Sydney and Gurugram, reporting to a manager in Gurugram with project leadership in Sydney. Your primary focus will be ingesting data from producers and implementing essential liquidity calculations within a cutting-edge data platform, ensuring high performance, robustness, and stability to meet the business needs of internal and external stakeholders.

Key Responsibilities:
- Utilize your expertise in big data technologies such as Spark, Python, Hive, SQL, Presto, and big data storage formats like Parquet to develop, test, deploy, and support data assets.
- Develop configuration-based ETL pipelines and user-interface-driven tools, using tools such as Dataiku, to optimize data processes and calculations.
- Analyze business requirements; design data models, data pipelines, and calculations; and present solution options and recommendations.
- Work in a cloud-based environment, preferably AWS, with knowledge of cloud computing concepts, Linux, and containerization technologies.
- Draw on your background in solution delivery within the finance or treasury domains, particularly in areas like Liquidity or Capital.

Qualifications Required:
- Experience in big data technologies, configuration-based ETL pipelines, and user-interface-driven tools.
- Proficiency in analyzing business requirements and designing data models, pipelines, and calculations.
- Familiarity with cloud-based environments, cloud computing concepts, and containerization technologies.
- A background in solution delivery within finance or treasury business domains, especially in Liquidity or Capital, is beneficial.