Posted: 2 weeks ago
Work from Office
Full Time
Job Description

Value Proposition
Responsible for designing and building data pipelines for enterprise data through ETL/ELT processes. Develop and maintain large-scale data platforms, data lakes, and cloud solutions.

Job Details
Position Title: Data Engineer II
Career Level: P2
Job Category: Senior Associate
Role Type: Hybrid
Job Location: Bengaluru

About the Team:
The data engineering team is a community of dedicated professionals committed to designing, building, and maintaining data platform solutions for the organization.

Impact (Job Summary/Why this Role Matters)
The enterprise data warehouse supports several critical business functions for the bank, including Regulatory Reporting, Finance, Risk Steering, and Customer 360. This role is vital for building and maintaining the enterprise data platform and data processes, and for supporting business objectives. Our values of inclusivity, transparency, and excellence drive everything we do. Join us and make a meaningful impact on the organization.

Key Deliverables (Duties and Responsibilities)
Build and maintain the data platform that supports data integrations for the Enterprise Data Warehouse, Operational Data Store, and Data Marts, with appropriate data access, data security, data privacy, and data governance.
Create data ingestion pipelines in data warehouses and other large-scale data platforms.
Create data ingestion pipelines for a variety of sources: files (flat, delimited, Excel), databases, APIs (with Apigee integration), and SharePoint.
Build reusable data pipelines and frameworks using Python.
Create scheduled as well as trigger-based ingestion patterns using scheduling tools.
Create performance-optimized DDLs for row-based or columnar databases such as Oracle, Postgres, and Netezza, per the Logical Data Model.
Performance-tune complex data pipelines and SQL queries.
Perform impact analysis of proposed changes on existing architecture, capabilities, system priorities, and technology solutions.
Work in an Agile framework: participate in agile ceremonies and coordinate with the scrum master, tech lead, and Product Owner on sprint planning, backlog creation, refinement, demos, and retrospectives.
Work with Product Owners to understand PI goals, PI planning, requirement clarification, and delivery coordination.
Provide technical support for production incidents and failures.
Work with global technology teams across different time zones (primarily US) to deliver timely business value.

Skills and Qualifications (Functional and Technical Skills)

Functional Skills:
Experience: 5+ years of experience, including 3+ years relevant to Snowflake.
Team Player: Support peers, team, and department management.
Communication: Excellent verbal, written, and interpersonal communication skills.
Problem Solving: Excellent problem-solving skills, incident management, root cause analysis, and proactive solutions to improve quality.
Partnership and Collaboration: Develop and maintain partnerships with business and IT stakeholders.
Attention to Detail: Ensure accuracy and thoroughness in all tasks.

Technical/Business Skills:
Data Engineering: Experience in designing and building data warehouses and data lakes. Good knowledge of data warehouse principles and concepts. Technical expertise working in large-scale data warehousing applications and databases such as Oracle, Netezza, Teradata, and SQL Server. Experience with public cloud-based data platforms, especially Snowflake and AWS.
Data Integration Skills: Expertise in the design and development of complex data pipeline solutions using industry-leading ETL tools such as SAP BusinessObjects Data Services (BODS), Informatica Intelligent Cloud Services (IICS), and IBM DataStage. Knowledge of ELT tools such as dbt, Fivetran, and AWS Glue. Expert in SQL, with development experience in at least one scripting language (e.g., Python), and adept at tracing and resolving data integrity issues.
Data Modeling: Knowledge of logical and physical data models using relational or dimensional modeling practices, and of high-volume ETL/ELT processes. Performance tuning of data pipelines and database objects to deliver optimal performance.
Experience with GitLab version control and CI/CD processes.
Experience working in the financial industry is a plus.

Relationships & Collaboration
Reports to: Associate Director - Data Engineering
Partners: Senior leaders and cross-functional teams
Collaborates with: A team of Data Engineering associates

Accessibility Needs
We are committed to providing an inclusive and accessible hiring process. If you require accommodations at any stage (e.g., application, interviews, onboarding), please let us know, and we will work with you to ensure a seamless experience.
FC Global Services