datavruti

2 Job openings at datavruti
Java Developer (Java 17 and Spring Batch) - Financial Services - Mumbai (WFO), Maharashtra | 4-8 years | INR: Not disclosed | On-site | Full Time

Hiring for: An emerging fintech specializing in financial services and fund management solutions.
Role: Java Developer (BFSI/Financial Services/Fund Accounting/Fintech)
Position: Java Developer / Software Engineer / Sr. Software Engineer
Experience: 4 to 6 years
Location: Mumbai (Kurla West - WFO)

Job Description:
- Design, develop, and maintain high-performance Java applications using Java 17, Spring Boot, and Spring Batch, tailored for the financial services sector.
- Implement security features using Spring Security to safeguard applications and user data.
- Utilize the Java Collections API and demonstrate a solid understanding of data structures to enhance application performance.
- Develop and manage data access layers using Hibernate ORM and JPA, ensuring efficient data handling and persistence.
- Work with MongoDB for database management, ensuring data integrity and optimal performance in financial applications.
- Apply design patterns such as Singleton, Strategy, Factory, Chain of Responsibility, Decorator, DAO, chunk processing, and thread pools, and develop Java microservices that are modular, reusable, and aligned with business requirements.
- Experience running Java microservices on Kubernetes; understanding of Docker and containerization.
- Understanding of CI/CD processes within Agile development methodologies for streamlined deployment and integration.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Knowledge of front-end technologies (e.g., HTML, CSS, JavaScript) is a plus.
- Conduct code reviews, unit testing, and integration testing to ensure high-quality deliverables.
- Troubleshoot and resolve issues in existing applications, providing timely support and enhancements.
- Stay updated with the latest industry trends and technologies to continuously improve development processes.

Thanks,
Jilian
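The listing above leans on classic design patterns (Singleton, Strategy, Factory, etc.). As a hedged illustration of the kind of code such a role involves, here is a minimal Strategy-pattern sketch in plain Java; the class and fee names (FeeDemo, FlatFee, TieredFee) are invented for this example and are not from the posting:

```java
// Minimal Strategy-pattern sketch: fee calculation policies as
// interchangeable strategies. All names here are hypothetical.
public class FeeDemo {

    // Strategy interface: each fee policy implements one method.
    interface FeeStrategy {
        double feeOn(double amount);
    }

    // Concrete strategy: flat percentage fee.
    static class FlatFee implements FeeStrategy {
        private final double rate;
        FlatFee(double rate) { this.rate = rate; }
        public double feeOn(double amount) { return amount * rate; }
    }

    // Concrete strategy: tiered fee (2% up to 10,000, then 1% above).
    static class TieredFee implements FeeStrategy {
        public double feeOn(double amount) {
            return amount <= 10_000
                    ? amount * 0.02
                    : 10_000 * 0.02 + (amount - 10_000) * 0.01;
        }
    }

    // Context: delegates to whichever strategy it is handed,
    // so new fee policies need no changes here.
    static double settle(double amount, FeeStrategy strategy) {
        return strategy.feeOn(amount);
    }

    public static void main(String[] args) {
        System.out.println(settle(5_000, new FlatFee(0.015))); // 75.0
        System.out.println(settle(20_000, new TieredFee()));   // 300.0
    }
}
```

The same shape carries over to the other patterns the posting names: a Factory would choose which FeeStrategy to construct, and a DAO would isolate the persistence calls behind a similar interface.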

Senior Tech Lead - Data Engineering & Infrastructure - Navi Mumbai, Maharashtra, India | 10+ years | Salary not disclosed | On-site | Full Time

Hiring for: A global digital health leader delivering trusted medical resources to millions daily.
Role: Senior Tech Lead - Data Engineering & Infrastructure
Experience: 10+ years
Location: Ghansoli, Navi Mumbai
Type: Hybrid
Shift: 2pm to 11pm IST
Salary: Based on experience and fitment

Roles & Responsibilities:
- Design and manage scalable data architectures that meet business needs and performance requirements.
- Lead the implementation of data storage solutions, such as data warehouses and data lakes, across hybrid and cloud-based environments (AWS, Azure, or GCP).
- Develop and enforce data governance, quality, and security standards to protect sensitive data and ensure compliance.
- Monitor and troubleshoot data infrastructure and pipeline issues to ensure high availability and reliability.
- Architect, design, develop, and support multiple data engineering projects with heterogeneous data sources; produce/consume data to/from messaging queues like Kafka; push/pull data to/from REST APIs.
- Support and enhance the in-house-built Data Integration Framework, Data Replication Framework, and Data Profiling & Reconciliation Framework.
- Support, enhance, and lead data engineering initiatives, infrastructure, and architecture.
- Establish best-practice standards with third-party data providers.
- Translate complex technical subjects into terms that both technical and non-technical audiences can understand.
- Provide direction and guidance to a team of 2 or more, solving problems and resolving conflicts.

Position Requirements:
- 7+ years of experience using data integration tools: Pentaho, Talend, or other ETL/ELT tools.
- 7+ years of experience with traditional databases like Postgres, MSSQL, Oracle.
- 4+ years of experience with MPP databases like Vertica, Google BigQuery, Amazon Redshift.
- 3+ years of experience creating entity-relationship and dimensional data models.
- 2+ years of experience with scheduler/orchestration tools like Control-M, AutoSys, Airflow, JAMS.
- 2+ years of experience leading a team of 2 or more.
- 2+ years of proficiency in at least one programming language, such as Python or Java.
- 2+ years of familiarity with cloud-based data storage and processing services, big data technologies (e.g., Spark, Hadoop), and containerization (e.g., Docker, Kubernetes).
- Experience with scripting languages like Shell.
- Demonstrated experience with cloud infrastructure services (e.g., AWS, Azure, GCP).
- Strong implementation experience with ETL/ELT strategies - determining where the heavy lifting of data happens using push-down or push-up methods.
- Strong infrastructure architecture/administration knowledge of ELT/ETL tools - memory management, resource management, user/role management, performance improvement.
- Strong implementation or working experience with any code versioning tool.
- Strong communication and documentation skills.
- Experience collaborating with multiple business units such as Finance, Marketing, Sales.
- Experience working in an Agile delivery model.

Desirable:
- Experience with data visualization tools like Tableau, Pentaho BA tools.
- Experience with the Hadoop ecosystem - programming or working with key data components such as Hive, Spark, and Sqoop, moving and processing terabyte-scale data.
- Digital marketing/web analytics or business intelligence experience is a plus.
- Understanding of the ad stack and its data (ad servers, DSM, programmatic, DMPs, etc.).
- Knowledge of scripting languages such as Perl or Python.
- Experience in a Linux environment is preferred but not mandatory.