10.0 - 15.0 years
12 - 17 Lacs
Pune
Work from Office
Job Title: Data Engineer, AS
Location: Pune, India

Role Description
The role is a Data Engineer who manages the databases, data warehouses, and data marts within the Asset Management Fund Accounting/Investment business. The candidate has a good understanding of the BaFin/FCA/Bundesbank regulatory reporting framework along with reporting experience, and can handle the day-to-day BAU of several data warehouse and database applications as part of the ADC team. The role performs data analysis and resolves production issues across the complex inflows and outflows of Asset Management key figures, liaises with external and internal stakeholders to prioritize and independently handle technical road map compliance and production stability, and partners with the business on enhancements for critical regulatory projects. It also helps reduce open audit findings by resolving the underlying audit action items within the stipulated closure deadlines. The candidate has a versatile database technology background, can scale support to regulatory and financial audits, is flexible to learn other relevant technologies and perform as per project need, and is self-driven, committed, process oriented, and able to handle challenging situations.

Added advantage:
- Product knowledge: Eagle PACE, Eagle Access.
- Technology working experience: Kafka, StreamSets, Java Spring Boot, Fabric/OCP, microservices-based architecture.

Your key responsibilities
- 10-15 years of IT development experience.
- A few years of experience with DWH, OLTP, and ETL, specifically Informatica, Oracle, and SQL Server.
- Experience integrating Kafka and MQ for streaming data ingestion (see the sketch after this list).
- Good hands-on experience with performance tuning and optimization.
- Experience with cloud technologies, with a focus on GCP.
- Experience working within Agile Scrum teams.
- Exposure to DevOps activities, build, and deployment on the data engineering side.
- Manage BAU/production stability by closing incident tickets and delivering stable, reliable, permanent remediations across Asset Management data platforms.
- Perform and participate in platform migration, Shared Services Informatica IICS cloud-based setup, and onboarding of applications.
- Migrate on-premises PowerCenter-based workflows into IICS-based Cloud Informatica IDMS cloud orchestrations.
- Help standardize the database platforms so that data management can be applied at scale.
- Ability to schedule jobs and to monitor and correct issues related to shell scripts, Oracle procedure calls, file transfers, etc.
- Knowledge of integration using shell scripts, plugins, and API-based connections with other infrastructure services.
- Document technical solutions and build KOPs for the monitoring L2 team.
- Knowledge of Linux, Unix, and Windows environments.
- Exposure to automation and programming through UNIX shell scripting.
- Analytical skill to understand user requirements, procedures, and problems in order to automate processing or improve performance.
- Work with business users/analysts to understand requirements and transform them into deliverables.
- Programming in Oracle PL/SQL.
- Perform unit/system testing and provide support to the quality assurance and release management teams.
- Ensure process adherence for the development activity.
- Work in an Agile methodology where development teams may also support production support teams for L2/L3 activities.
- Flexible to learn other relevant technologies and perform as per project need.
- Good knowledge of and experience working with different automation tooling in IT operations.
- Very good communication and coordination skills; a very good team player.
- Very good analytical and problem-solving skills.
- Self-driven, committed, process oriented, and able to handle challenging situations.
- Open to taking up additional responsibilities as required by the organization.
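The Kafka/MQ streaming-ingestion responsibility above is the kind of task a short script can make concrete. Below is a minimal sketch, assuming a hypothetical topic (am.positions.raw), staging table (stg_positions), and column names that are not taken from the posting; it uses the kafka-python and python-oracledb client libraries to micro-batch messages into Oracle and commits Kafka offsets only after the rows are persisted.

```python
# Minimal sketch: consume JSON position records from a Kafka topic and land them
# in an Oracle staging table. Topic, table, and column names are illustrative
# assumptions, not taken from the posting.
import json
import os

import oracledb                      # pip install oracledb
from kafka import KafkaConsumer      # pip install kafka-python

TOPIC = "am.positions.raw"           # hypothetical topic name
BATCH_SIZE = 500

consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=["kafka:9092"],
    group_id="am-staging-loader",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    enable_auto_commit=False,        # commit offsets manually, after the DB commit
    auto_offset_reset="earliest",
)

conn = oracledb.connect(
    user="stg_user",
    password=os.environ["STG_DB_PASSWORD"],
    dsn="dbhost/AMPDB",              # hypothetical host/service name
)
cur = conn.cursor()
insert_sql = (
    "INSERT INTO stg_positions (fund_id, asset_id, position_value, as_of_date) "
    "VALUES (:1, :2, :3, TO_DATE(:4, 'YYYY-MM-DD'))"
)

batch = []
for msg in consumer:
    rec = msg.value
    batch.append((rec["fund_id"], rec["asset_id"], rec["value"], rec["as_of"]))
    if len(batch) >= BATCH_SIZE:
        cur.executemany(insert_sql, batch)   # bulk-insert the micro-batch
        conn.commit()                        # persist rows first...
        consumer.commit()                    # ...then advance the Kafka offsets
        batch.clear()
```

Committing the database transaction before the consumer offsets trades the risk of occasional duplicate rows (handled downstream, for example by a staging-table key) for at-least-once delivery rather than data loss.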
Your skills and experience
- A fluent communicator with experience at large banks and financial groups.
- At least 10+ years of technology and domain experience handling Asset Management enterprise data platforms.
- Has performed at least a couple of migrations of Oracle databases and hosting clusters, with sound knowledge of the RDBMS.
- Good experience with SCD (types 1-4) and rank-based ETL topology, and has extensively developed Informatica workflows.
- Good skills in managing Oracle 12c, Exadata platforms, Data Guard, and HPE clusters.
- Good skills in Unix, Linux Bash, and Python scripting.
- Good knowledge of data repositories: NFS, DFS, and cloud-based (SSD, PSD, RSD, GCB).
- Good knowledge of Asset Management back-office and front-office fund accounting and the key asset flows supporting ledgers and financial AuM.
- Good experience in TIBCO design, development, and administration.
- Strong experience with iPaaS (CloudHub) and cloud hosting (AWS, GCP).
- Proficiency in various integration mechanisms such as point-to-point and pub/sub.
- Capable of developing and incorporating integration solutions with IBM MQ in Cloud Pak based environments.
- Has worked extensively in a Scrum environment with active involvement in daily stand-ups, scrum meetings, and reviews.
- Experience in developing and deploying microservices architectures using Spring Boot, Spring Batch, Spring Data, or similar frameworks.
- Strong knowledge of RESTful web services and related technologies such as JSON, Swagger, and XML.
- Experience in BW 5.x, TIBCO EMS, BW 6.x, web services, and SOA, with the following suite of products:
  a) Rendezvous 64-bit 8.4.4
  b) TIBCO TRA Suite 5.10.0
  c) TIBCO BusinessWorks 5.13.0
  d) TIBCO ActiveMatrix Adapter 7.2.0 for Database
  e) TIBCO ActiveMatrix Adapter 7.0 for Files (SFTP)
  f) TIBCO BW Plugin for SFTP 1.1.0 HF02
  g) TIBCO BW Plugin for MQ 7.7.0
  h) IBM WebSphere MQ Client 8.0.0.5
  i) JRE dependency
- Established adapter/API-based connectivity to REST and SOAP web services and transports such as HTTP, JMS, Web Services Listener, HTTP Client, and Web Services SOAP Client.
- Designed technical/integration architectures, including development, runtime, and operations architectures, automating business process models and cloud-based services.
- Strong experience in the testing and implementation phases of the software development life cycle.
- Good experience in the telecom and banking domains.
- A self-motivated quick learner willing to adapt to new challenges and technologies.
- Experience with Informatica Intelligent Cloud Services (IICS).
- Support integration configurations with iPaaS through connected apps, APIs, microservices, and web services.
- Migration experience moving workflows/repositories from on-premises Informatica PowerCenter (9.x) to IICS on GCP or AWS.
- Experience with shell scripting/Python, web services, XML, JSON, and real-time data integration using IICS CAI (Cloud Application Integration) in building ETL pipelines, data factories, data flows, and data replication.
- Experienced in creating, monitoring, debugging, and publishing processes in the Informatica Cloud Real Time (ICRT) process designer/console using service connectors such as REST, JDBC, and File.
- Deliver hybrid, heterogeneous integrations of cloud and on-premises systems using Informatica Data Integration Services.
- Hands-on with creating resources in Catalog Admin and onboarding various types of data sources and relational databases (SQL Server, Oracle, PostgreSQL) over SQL, ODBC, and JDBC/API connections.
- Experienced with SCD type 1-6 mappings, performing ETL data transformations on IICS using Update Strategy, Lookup, Stored Procedure, Router, Filter, Sequence Generator, Joiner, Aggregator, and Expression transformations (see the sketch after this list).
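The SCD mappings described above implement a standard slowly changing dimension pattern. The sketch below is not the Informatica mapping itself but a plain-Python illustration of the SCD Type 2 rule those transformations encode: when a tracked attribute changes, the current dimension row is expired and a new current row is inserted. The column names (fund_id, manager, eff_from, eff_to, is_current) are hypothetical.

```python
# Plain-Python illustration of the SCD Type 2 pattern (not an Informatica artifact).
# Column/attribute names are hypothetical.
from dataclasses import dataclass, replace
from datetime import date
from typing import Optional

@dataclass
class DimRow:
    fund_id: str
    manager: str                 # tracked attribute
    eff_from: date
    eff_to: Optional[date]       # None = open-ended (current version)
    is_current: bool

def apply_scd2(dim: list[DimRow], incoming: dict, load_date: date) -> list[DimRow]:
    """Apply one incoming source record to the dimension using SCD Type 2 rules."""
    current = next(
        (r for r in dim if r.fund_id == incoming["fund_id"] and r.is_current), None
    )
    if current is None:
        # New business key: insert the first (current) version.
        dim.append(DimRow(incoming["fund_id"], incoming["manager"], load_date, None, True))
    elif current.manager != incoming["manager"]:
        # Tracked attribute changed: expire the old version, open a new one.
        idx = dim.index(current)
        dim[idx] = replace(current, eff_to=load_date, is_current=False)
        dim.append(DimRow(incoming["fund_id"], incoming["manager"], load_date, None, True))
    # Unchanged records fall through untouched; Type 2 keeps history only on change.
    return dim

# Example: the manager of fund F001 changes, leaving one expired and one current row.
dim = [DimRow("F001", "A. Meyer", date(2023, 1, 1), None, True)]
apply_scd2(dim, {"fund_id": "F001", "manager": "B. Kaur"}, date(2025, 6, 1))
```

In an IICS mapping the same decision is typically made with a Lookup against the dimension, a Router splitting new/changed/unchanged rows, and an Update Strategy marking rows for insert or update.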
Posted 1 week ago
1.0 - 6.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Walk-in Drive for the Technical Service Department (TSD), MSN Laboratories R&D Centre, Pashamylaram, on Friday, 27-06-2025.

Role & responsibilities
- Knowledge of process engineering activities from lab development to commercial execution.
- Process engineering calculations for scale-up (a sketch follows this posting).
- Preparation of technology transfer documentation such as PFDs, P&IDs, and equipment suitability assessments.
- Supporting the research team in generating safety data.
- Knowledge of HAZOP/HIRA.
- Conducting simulation experiments in the lab.
- Batch monitoring during manufacturing to ensure smooth scale-up.
- Preparation of campaign reports after project completion for knowledge transfer.

Department: TSD
Qualification: B.Tech Chemical Engineering
Designation: Executive/Sr. Executive
Job Location: MSN R&D Centre
Work Location: MSN R&D Centre, Pashamylaram, Hyderabad
Interview Venue: MSN Laboratories Pvt. Ltd., R&D Centre, Pashamylaram (V), Patancheru (M), Sangareddy (Dt.), Telangana, on Friday, 27-06-2025.
Please share your CV with dinesh.baratam@msnlabs.com, subject "TSD Profile".
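The "process engineering calculations for scale-up" mentioned above cover many methods; as one hedged illustration, the sketch below applies a common rule for geometrically similar stirred vessels in the turbulent regime, scaling impeller speed at constant power per unit volume (P is proportional to N^3 * D^5 and V to D^3, so N2 = N1 * (D1/D2)^(2/3)). The impeller sizes and lab speed are hypothetical, not taken from the posting.

```python
# Hedged example of a scale-up calculation: impeller speed at constant power per
# unit volume (P/V) for geometrically similar stirred vessels, turbulent regime.
# All numbers below are hypothetical.

def scaled_impeller_speed(n_lab_rpm: float, d_lab_m: float, d_plant_m: float) -> float:
    """Plant-scale impeller speed keeping P/V constant: N2 = N1 * (D1/D2)**(2/3)."""
    return n_lab_rpm * (d_lab_m / d_plant_m) ** (2.0 / 3.0)

# Example: a 0.05 m lab impeller at 400 rpm scaled to a 0.8 m plant impeller.
print(round(scaled_impeller_speed(400.0, 0.05, 0.8), 1), "rpm")   # ~63.0 rpm
```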
Posted 2 months ago