10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As a Data Engineer (ETL, Big Data, Hadoop, Spark, GCP) at Assistant Vice President level, located in Pune, India, you will be responsible for developing and delivering engineering solutions that achieve business objectives. You are expected to have a strong understanding of the engineering principles critical to the bank and to be skilled in root cause analysis, addressing enhancements and fixes to improve product reliability and resiliency. Working independently on medium to large projects with strict deadlines, you will collaborate in a cross-application technical environment, demonstrating a solid hands-on development track record within an agile methodology. The role involves collaborating with a globally dispersed team and is integral to the development of the Compliance tech internal team in India, delivering enhancements in compliance tech capabilities to meet regulatory commitments.

Your key responsibilities include analyzing data sets; designing and coding stable, scalable data ingestion workflows and integrating them with existing workflows; and developing analytics algorithms on the ingested data. You will also work on data sourcing in Hadoop and GCP, owning unit testing, UAT deployment, end-user sign-off, and production go-live. Root cause analysis skills will be essential for identifying bugs and issues and for supporting the production support and release management teams. You will operate in an agile scrum team and ensure that new code is thoroughly tested at both the unit and system levels.

To excel in this role, you should have over 10 years of coding experience with reputable organizations, hands-on experience with Bitbucket and CI/CD pipelines, and proficiency in Hadoop, Python, Spark, SQL, Unix, and Hive. A basic understanding of on-prem and GCP data security is required, along with hands-on development experience on large ETL/big data systems (GCP experience is a plus).
Familiarity with GCP services such as Cloud Build, Artifact Registry, Cloud DNS, and Cloud Load Balancing, along with Dataflow, Cloud Composer, Cloud Storage, and Dataproc, is essential. Knowledge of data quality dimensions and data visualization is beneficial.

You will receive comprehensive support, including training and development opportunities, coaching from experts on your team, and a culture of continuous learning to facilitate your career progression. The company fosters a collaborative and inclusive work environment, empowering employees to excel together every day. As part of Deutsche Bank Group, we welcome applications from all individuals and promote a positive and fair workplace culture. For further details about our company and teams, please visit our website: https://www.db.com/company/company.htm
Posted 3 weeks ago