8.0 - 12.0 years
10 - 20 Lacs
Chennai
Hybrid
Hi [Candidate Name],

We are hiring for a Data Engineering role with a leading organization working on cutting-edge cloud and data solutions. If you're an experienced professional looking for your next challenge, this could be a great fit!

Key Skills Required:
- Strong experience in Data Engineering and cloud data pipelines
- Proficiency in at least three of: Java, Python, Spark, Scala, SQL
- Hands-on with tools like Google BigQuery, Apache Kafka, Airflow, GCP Pub/Sub
- Knowledge of microservices architecture, REST APIs, and DevOps tools (Docker, GitHub Actions, Terraform)
- Exposure to relational databases: MySQL, PostgreSQL, SQL Server
- Prior experience in an onshore/offshore model is a plus

If this sounds like a match for your profile, reply with your updated resume or apply directly. Looking forward to connecting!

Best regards,
Mahesh Babu M
Senior Executive - Recruitment
maheshbabu.muthukannan@sacha.solutions
Posted 2 weeks ago
10.0 - 15.0 years
12 - 16 Lacs
Pune, Bengaluru
Work from Office
We are seeking a talented and experienced Kafka Architect with experience migrating to Google Cloud Platform (GCP) to join our team. As a Kafka Architect, you will be responsible for designing, implementing, and managing our Kafka infrastructure to support our data processing and messaging needs, while also leading the migration of our Kafka ecosystem to GCP. You will work closely with our engineering and data teams to ensure seamless integration and optimal performance of Kafka on GCP.

Responsibilities:
- Discovery, analysis, planning, design, and implementation of Kafka deployments on GKE, with a specific focus on migrating Kafka from AWS to GCP.
- Design, architect, and implement scalable, high-performance Kafka architectures and clusters to meet our data processing and messaging requirements.
- Lead the migration of our Kafka infrastructure from on-premises or other cloud platforms to GCP.
- Conduct thorough discovery and analysis of existing Kafka deployments on AWS.
- Develop and implement best practices for Kafka deployment, configuration, and monitoring on GCP.
- Develop a comprehensive migration strategy for moving Kafka from AWS to GCP.
- Collaborate with engineering and data teams to integrate Kafka into our existing systems and applications on GCP.
- Optimize Kafka performance and scalability on GCP to handle large volumes of data and high throughput.
- Plan and execute the migration, ensuring minimal downtime and data integrity.
- Test and validate the migrated Kafka environment to ensure it meets performance and reliability standards.
- Ensure Kafka security on GCP by implementing authentication, authorization, and encryption mechanisms (see the configuration sketch below).
- Troubleshoot and resolve issues related to Kafka infrastructure and applications on GCP.
- Ensure seamless data flow between Kafka and other data sources/sinks.
- Implement monitoring and alerting mechanisms to ensure the health and performance of Kafka clusters.
- Stay up to date with Kafka developments and GCP services to recommend and implement new features and improvements.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred).
- Proven experience as a Kafka Architect or in a similar role, with a minimum of [5] years of experience.
- In-depth knowledge of Apache Kafka architecture, internals, and ecosystem components, including Kafka Connect, Kafka Streams, and KSQL.
- Proficiency in scripting and automation for Kafka management and migration.
- Hands-on experience with Kafka administration, including cluster setup, configuration, and tuning.
- Proficiency in Kafka APIs, including Producer, Consumer, Streams, and Connect.
- Strong programming skills in Java, Scala, or Python.
- Experience with Kafka monitoring and management tools such as Confluent Control Center, Kafka Manager, or similar.
- Solid understanding of distributed systems, data pipelines, and stream processing.
- Experience leading migration projects to GCP, including migrating Kafka workloads.
- Familiarity with GCP services such as Google Kubernetes Engine (GKE), Google Cloud Storage, Google Cloud Pub/Sub, and BigQuery.
- Excellent communication and collaboration skills.
- Ability to work independently and manage multiple tasks in a fast-paced environment.
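For the security responsibility above, here is a hedged sketch of what client-side Kafka security configuration typically looks like, using standard Apache Kafka client properties. The posting does not state which mechanism the brokers use, so the SASL/PLAIN mechanism, broker address, and credentials below are assumptions, not details from the role:

```java
import java.util.Properties;

// Illustrative client-side security settings for a Kafka producer/consumer.
// Broker address and credentials are placeholders.
public class SecureKafkaClientConfig {
    public static Properties secureProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.internal:9093");
        // Encryption in transit plus SASL authentication over TLS.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<CLIENT_USER>\" password=\"<CLIENT_SECRET>\";");
        // Authorization is enforced broker-side via ACLs bound to this principal.
        return props;
    }
}
```

Pass these properties into any KafkaProducer or KafkaConsumer constructor; the same principal can then be granted topic-level ACLs on the broker to cover the authorization part of the requirement.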
Posted 2 weeks ago
2.0 - 4.0 years
4 - 6 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Overview: We are looking for a skilled and results-driven Java Developer with hands-on experience in Spring Boot, Apache Flink, and Kafka. The ideal candidate will be responsible for building and maintaining high-performance backend services and real-time data streaming applications. This position is open for immediate joiners and can be based either onsite in Chennai or remotely (WFH), depending on candidate preference.

Key Responsibilities:
- Develop and maintain scalable backend systems using Java and Spring Boot.
- Design and implement real-time data streaming applications using Apache Flink and Kafka (a minimal sketch appears below).
- Build and manage microservices and integrate them with APIs and messaging systems.
- Collaborate with cross-functional teams to define, design, and deliver new features.
- Ensure code quality, performance, and security in a production environment.
- Participate in debugging, troubleshooting, and performance tuning of applications.

Must-Have Skills:
- Minimum 5 years of experience in Java development with Spring Boot.
- Strong hands-on experience with Apache Flink for stream processing.
- Proficient in Apache Kafka and event-driven architecture.
- Solid understanding of RESTful services and microservices architecture.
- Strong problem-solving and debugging skills.

Preferred Skills:
- Experience with Docker, Kubernetes, or other cloud-native technologies.
- Familiarity with CI/CD tools and deployment automation.

Eligibility:
- Must be available to join immediately.
- Open to candidates based onsite in Chennai or working remotely (WFH).
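For illustration only, here is a minimal, self-contained sketch of the kind of Flink streaming application this role describes, using the standard DataStream API. The bounded sample input, the uppercase transformation, and the job name are hypothetical stand-ins; a production pipeline would typically read from a Kafka source instead:

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FlinkStreamingSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Bounded elements keep the sketch runnable without a broker;
        // in a real deployment this would be a Kafka-backed source.
        DataStream<String> events = env.fromElements("order:created", "order:paid");

        // A trivial per-record transformation standing in for real business logic.
        events.map(value -> value.toUpperCase())
              .print();

        env.execute("demo-streaming-job");
    }
}
```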
Posted 3 weeks ago
3 - 6 years
20 - 27 Lacs
Pune
Remote
Data Acquisition & Web Application Developer

Experience: 3 - 6 years
Salary: USD 1,851 - 2,962 / month
Preferred Notice Period: Within 30 days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Remote
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)

Must-have skills: APIs, data acquisition, web scraping, Agile, Python
Good-to-have skills: analytics, monitoring, stream processing, web application deployment, Node.js

GPRO Ltd (one of Uplers' clients) is looking for a Data Acquisition & Web Application Developer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

About the Project: We are seeking a skilled full-stack developer to build a specialised web application designed to aggregate and present public information on individuals, such as company executives and leaders. This tool will serve as a comprehensive profile generator, pulling data from diverse online sources including news outlets, social media, and other platforms. The primary goal is to provide users with a centralised, easily navigable view of a person's online presence, latest news, and public information.

Project Overview: The core of this project involves developing a robust data acquisition layer capable of scraping and integrating information from various online sources. This data will then be presented through a user-friendly web interface. The application should allow users to input a person's name and receive an aggregated view of relevant public data.

Key Responsibilities:
- Develop and implement the data acquisition layer: design and build systems to scrape and collect data from specified sources, including news websites (e.g., Bloomberg.com, Reuters, BBC.com, Financial Times), social media (e.g., X, LinkedIn), and media platforms (e.g., YouTube, podcasts). An illustrative sketch appears after this posting.
- Integrate with APIs: utilize official APIs (e.g., Bloomberg data, Reuters, Financial Times, Google Finance) where available and prioritized. Evaluate and integrate with third-party scraping APIs (e.g., Apify, Oxylabs, SerpApi) as necessary, considering associated risks and subscription models.
- Handle a hybrid approach: implement a strategy that leverages licensed APIs for premium sources while potentially using third-party scrapers for others, being mindful of terms of service and legal/ethical considerations. Direct scraping of highly protected sites like Bloomberg, Reuters, and FT should be avoided or approached with extreme caution using third-party services.
- Design data storage and indexing: determine appropriate data storage solutions, considering the volume of data and its relevance over time. Implement indexing and caching mechanisms to ensure efficient search and retrieval of information, supporting near real-time data presentation.
- Develop the web application front-end: build a basic, functional front-end interface similar to the provided examples ("Opening Screen", "Person profile"), displaying the aggregated information clearly.
- Implement user functionality: enable users to input a person's name for searching, sort displayed outputs by date, click through links to access the original source of information, and navigate to a new search easily (e.g., via a tab).
- Consider stream processing: evaluate and potentially implement stream processing techniques for handling near real-time data acquisition and updates.
- Ensure scalability: design the application to support a specified level of concurrent searches (estimated at 200 for the initial phase).
- Build a business informational layer: develop a component that tracks the usage of different data services (APIs, scrapers) for monitoring costs and informing future scaling decisions.
- Technical documentation: provide clear documentation for the developed system, including data flows, API integrations, and deployment notes.

Required Skills and Experience:
- Proven experience in web scraping and data acquisition from diverse online sources.
- Strong proficiency in developing with APIs, including handling different authentication methods and data formats.
- Experience with relevant programming languages and frameworks for web development and data processing (e.g., Python, Node.js).
- Knowledge of database design and data storage solutions.
- Familiarity with indexing and caching strategies for search applications.
- Understanding of potential challenges in web scraping (e.g., anti-scraping measures, terms of service).
- Experience in building basic web application front-ends.
- Ability to consider scalability and performance in system design.
- Strong problem-solving skills and the ability to work independently or as part of a small team.
- Experience working with foreign (Western-based) startups and clients.
- Ability to work in agile environments and to pivot fast.

Desirable Skills:
- Experience with stream processing technologies.
- Familiarity with deploying and managing web applications (though infrastructure design is flexible).
- Experience with monitoring and analytics for application usage.

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meet the client for the interview!

About Our Client: A web app aggregating real-time info on individuals for financial services professionals.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their career. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
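As a rough, hypothetical sketch of the API-first acquisition layer described above (fetch a person's public data from a licensed or third-party API, then cache the raw response for later indexing): the endpoint URL, bearer token, and in-memory cache are all illustrative assumptions, and the sketch is in Java for consistency with the other examples on this page, even though the posting itself suggests Python or Node.js:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ProfileAcquisitionSketch {
    private static final HttpClient CLIENT = HttpClient.newHttpClient();
    // Simple in-memory cache standing in for the indexing/caching layer.
    private static final Map<String, String> CACHE = new ConcurrentHashMap<>();

    static String fetchProfile(String personName) {
        return CACHE.computeIfAbsent(personName, name -> {
            try {
                // Hypothetical licensed-API endpoint and credential.
                HttpRequest request = HttpRequest.newBuilder(URI.create(
                        "https://api.example-news.com/v1/search?q="
                        + URLEncoder.encode(name, StandardCharsets.UTF_8)))
                        .header("Authorization", "Bearer <API_TOKEN>") // placeholder
                        .build();
                HttpResponse<String> response =
                        CLIENT.send(request, HttpResponse.BodyHandlers.ofString());
                return response.body();
            } catch (Exception e) {
                throw new RuntimeException("acquisition failed for " + name, e);
            }
        });
    }

    public static void main(String[] args) {
        System.out.println(fetchProfile("Jane Doe"));
    }
}
```

The cache-through pattern shown here is one way to keep repeat searches cheap while the real system would persist and index responses in a proper datastore.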
Posted 1 month ago
8 - 13 years
25 - 30 Lacs
Bengaluru
Work from Office
Education: A Bachelor's degree in Computer Science, Engineering (B.Tech/BE), or a related field such as MCA (Master of Computer Applications) is required for this role.

Experience: 8+ years in data engineering with a focus on building scalable and reliable data infrastructure.

Skills:
- Languages: Proficiency in Java, Python, or Scala.
- Domain: Prior experience in Oil & Gas, Titles & Leases, or Financial Services is a must-have.
- Databases: Expertise in relational and NoSQL databases such as PostgreSQL, MongoDB, Redis, and Elasticsearch.
- Data Pipelines: Strong experience in designing and implementing ETL/ELT pipelines for large datasets.
- Tools: Hands-on experience with Databricks, Spark, and cloud platforms.
- Data Lakehouse: Expertise in data modeling, designing data lakehouses, and building data pipelines.
- Modern Data Stack: Familiarity with the modern data stack and data governance practices.
- Data Orchestration: Proficient with data orchestration and workflow tools.
- Data Modeling: Proficient in modeling and building data architectures for high-throughput environments.
- Stream Processing: Extensive experience with stream processing technologies such as Apache Kafka.
- Distributed Systems: Strong understanding of distributed systems, scalability, and availability.
- DevOps: Familiarity with DevOps practices and continuous integration/continuous deployment (CI/CD).
- Problem-Solving: Strong problem-solving skills with a focus on scalable data infrastructure.

Key Responsibilities: This is a role with high expectations of hands-on design and development.
- Design and develop systems for ingestion, persistence, consumption, ETL/ELT, and versioning for different data types (e.g., relational, document, geospatial, graph, time series) in transactional and analytical patterns.
- Drive the development of applications for data extraction, especially from formats like TIFF and PDF, including OCR and data classification/categorization.
- Analyze and improve the efficiency, scalability, and reliability of our data infrastructure.
- Assist in the design and implementation of robust ETL/ELT pipelines for processing large volumes of data.
- Collaborate with cross-functional scrum teams to respond quickly and effectively to business needs.
- Work closely with data scientists and analysts to define data requirements and develop comprehensive data solutions.
- Implement data quality checks and monitoring to ensure data integrity and reliability across all systems.
- Develop and maintain data models, schemas, and documentation to support data-driven decision-making.
- Manage and scale data infrastructure on cloud platforms, leveraging cloud-native tools and services.

Benefits:
- Salary: competitive and aligned with local standards.
- Performance bonus: according to company policy.
- Medical insurance and group term life insurance.
- Continuous learning and development.
- 10 recognized public holidays.
- Parental leave.
Posted 1 month ago
4 - 6 years
15 - 22 Lacs
Gurugram
Hybrid
The Job
We are looking for a Sr. Data Engineer responsible for designing, developing, and supporting real-time core data products for TechOps applications. You will work with various teams to understand business requirements, reverse engineer existing data products, and build state-of-the-art, performant data pipelines. AWS is the cloud of choice for these pipelines, so a solid understanding of and experience in architecting, developing, and maintaining real-time data pipelines in AWS is highly desired.
- Design, architect, and develop data products that provide real-time core data for applications.
- Provide production support and operational optimisation for data products, including incident and on-call support, performance optimisation, high availability, and disaster recovery.
- Understand business requirements by interacting with business users and/or reverse engineering existing legacy data products.
- Mentor and train junior team members, sharing architecture, design, and development knowledge of data products and standards.
- Maintain a good understanding and working knowledge of distributed databases and pipelines.

Your Profile
An ideal candidate will have:
- 4+ years of experience in real-time streaming, along with hands-on experience in Spark, Kafka, Apache Flink, Java, big data technologies, AWS, and MSK (Managed Streaming for Apache Kafka).
- Experience with AWS distributed database technologies, including Managed Streaming for Apache Kafka, Managed Apache Flink, DynamoDB, S3, and Lambda.
- Experience designing and developing real-time data products with Apache Flink (Scala experience can be considered).
- Experience with Python and PySpark.
- SQL code development.
- AWS Solutions Architecture experience for data products (required).
- Ability to manage and troubleshoot real-time data pipelines in the AWS Cloud.
- Experience with high availability and disaster recovery solutions for real-time data streaming.
- Excellent analytical, problem-solving, and communication skills.
- Self-motivation and the ability to work independently.
- Ability to understand existing SQL, code, and user requirements, and translate them into modernized data products.
Posted 1 month ago
11 - 16 years
40 - 45 Lacs
Pune
Work from Office
About The Role
Job Title: IT Architect Specialist, AVP
Location: Pune, India

Role Description
This role is for a senior business functional analyst for Group Architecture, and will be instrumental in establishing and maintaining bank-wide data policies, principles, standards, and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers to align the target data architecture with the enterprise data architecture principles and to apply agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered, and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your key responsibilities
- Data Architecture: Work closely with stakeholders to understand their data needs, break out business requirements into implementable building blocks, and design the solution's target architecture.
- GCP Data Architecture & Migration: Strong working experience with GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...), along with an appropriate GCP architecture-level certification. Experience in handling hybrid architectures and patterns addressing non-functional requirements such as data residency, compliance (e.g., GDPR), and security and access control. Experience in developing reusable components and reference architectures using IaC (Infrastructure as Code) platforms such as Terraform.
- Data Mesh: Proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership. The candidate must have good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value.
- Data Management Tools: Assess various tools and solutions comprising data governance capabilities such as data catalogues, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in the development of the medium- to long-term target state of the technologies within the data governance domain.
- Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions.

Your skills and experience
- Extensive experience in data architecture within Financial Services.
- Strong technical knowledge of data integration patterns, batch and stream processing, data lake/data lakehouse/data warehouse/data mart, caching patterns, and policy-based fine-grained data access.
- Proven experience working on data management principles, data governance, data quality, data lineage, and data integration, with a focus on Data Mesh.
- Knowledge of data modelling concepts such as dimensional modelling and 3NF, and experience with systematic, structured review of data models to enforce conformance to standards.
- High-level understanding of data management solutions, e.g. Collibra, Informatica Data Governance, etc.
- Proficiency at data modelling and experience with different data modelling tools.
- Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingest.
- Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions.

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 2 months ago
2 - 4 years
4 - 7 Lacs
Bengaluru
Work from Office
Responsibilities:
- Develop, maintain, and optimize backend systems using Java and Spring Boot
- Design and implement scalable solutions based on microservices architecture
- Integrate and manage messaging systems, particularly Apache Kafka
- Enhance system security using Spring Security, SSO, and Auth0
- Collaborate with cross-functional teams to deliver high-quality solutions
- Ensure code quality, testing, and deployment standards are met

Requirements:
- Proven experience as a Backend Developer with expertise in Java
- Strong knowledge of Spring Boot and its ecosystem
- Hands-on experience with microservices architecture
- Proficiency in Apache Kafka for messaging and stream processing
- Experience with security frameworks and protocols: Spring Security, Single Sign-On (SSO), Auth0, or similar
- Familiarity with CI/CD processes and tools for efficient deployment
Posted 2 months ago
9 - 14 years
11 - 16 Lacs
Hyderabad
Work from Office
OVERVIEW: To succeed in this challenging journey, we have set up multiple co-located teams across the globe (Hyderabad, US, Europe), embracing the scaled agile framework and a microservices approach combined with the DevOps model. We have passionate engineers working full time on this new platform in Hyderabad, and it's only the beginning. You will get a chance to work with brilliant people and some of the best development and design teams, in addition to working with cutting-edge technologies such as React, Java/Node.js, Docker, Kubernetes, and AWS. We are looking for exceptional Java/Node-based full stack developers to join our team. You will work alongside the Architect and DevOps teams to form a fully autonomous development squad and be in charge of a part of the product. Overall, the Principal Software Engineer role at Skillsoft offers a challenging and rewarding opportunity for individuals who are passionate about technology, learning, and making a difference in the world.

OPPORTUNITY HIGHLIGHTS:
- Technical leadership: As a Principal Software Engineer, you will be responsible for technical leadership, providing guidance and mentoring to other team members, and ensuring that projects are completed on time and to the highest standards.
- Cutting-edge technology: Skillsoft is a technology-driven company that is constantly exploring new technologies to enhance the learning experience for its customers. As a Principal Software Engineer, you will have the opportunity to work with cutting-edge technology and help drive innovation.
- Agile environment: Skillsoft follows agile methodologies, which means that you will be part of a fast-paced, collaborative environment where you will have the opportunity to work on multiple projects simultaneously.
- Career growth: Skillsoft is committed to helping its employees grow their careers. As a Principal Software Engineer, you will have access to a wide range of learning and development opportunities, including training programs, conferences, and mentorship.
- Impactful work: Skillsoft's mission is to empower people through learning, and as a Principal Software Engineer, you will be a key contributor to achieving this mission. You will have the opportunity to work on products and features that have a significant impact on the learning and development of individuals and organizations worldwide.

SKILLS & QUALIFICATIONS:
- Minimum 9+ years of software engineering experience developing cloud-based enterprise solutions.
- Proficient in programming languages (Java, JavaScript, HTML5, CSS).
- Proficient in JavaScript frameworks (Node.js, React, Redux, Angular, Express.js).
- Proficient with frameworks (Spring Boot, stream processing).
- Strong knowledge of REST APIs, web services, and SAML integrations.
- Proficient in working with databases, preferably Postgres.
- Experienced with DevOps tools (Docker, Kubernetes, Ansible, AWS).
- Experience with code versioning tools, preferably Git (GitHub, GitLab, etc.) and the feature-branch workflow.
- Working experience with Kafka and RabbitMQ (message queue systems).
- Sound knowledge of design principles and design patterns.
- Strong problem-solving and analytical skills, and an understanding of various data structures and algorithms.
- Must know how to code applications on Unix/Linux-based systems.
- Experience with build automation tools like Maven, Gradle, NPM, WebPack, and Grunt.
- Sound troubleshooting skills to address code bugs, performance issues, and environment issues that may arise.
- Good understanding of the common security concerns of high-volume, publicly exposed systems.
- Experience working in an Agile/Scrum environment.
- Strong analytical skills and the ability to understand complexities and how components connect and relate to each other.

OUR VALUES
WE ARE PASSIONATELY COMMITTED TO LEADERSHIP, LEARNING, AND SUCCESS. WE EMBRACE EVERY OPPORTUNITY TO SERVE OUR CUSTOMERS AND EACH OTHER AS: ONE TEAM. OPEN AND RESPECTFUL. CURIOUS. READY. TRUE.

MORE ABOUT SKILLSOFT:
Skillsoft delivers online learning, training, and talent solutions to help organizations unleash their edge. Leveraging immersive, engaging content, Skillsoft enables organizations to unlock the potential in their best assets (their people) and build teams with the skills they need for success. Empowering 36 million learners and counting, Skillsoft democratizes learning through an intelligent learning experience and a customized, learner-centric approach to skills development, with resources for Leadership Development, Business Skills, Technology & Development, Digital Transformation, and Compliance. Skillsoft is partner to thousands of leading global organizations, including many Fortune 500 companies. The company features award-winning systems that support learning, performance, and success, including Skillsoft learning content and the Percipio intelligent learning experience platform, which offers measurable impact across the entire employee lifecycle. Learn more at .

Thank you for taking the time to learn more about us. If this opportunity intrigues you, we would love for you to apply!

NOTE TO EMPLOYMENT AGENCIES: We value the partnerships we have built with our preferred vendors. Skillsoft does not accept unsolicited resumes from employment agencies. All resumes submitted by employment agencies directly to any Skillsoft employee or hiring manager in any form without a signed Skillsoft Employment Agency Agreement on file and a search engagement for that position will be deemed unsolicited in nature. No fee will be paid in the event the candidate is subsequently hired as a result of the referral or through other means. Skillsoft is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information, and other legally protected categories.
Posted 2 months ago
3 - 7 years
5 - 9 Lacs
Hyderabad
Work from Office
OVERVIEW: To succeed in this challenging journey, we have set up multiple co-located teams across the globe (Hyderabad, US, Europe), embracing the scaled agile framework and a microservices approach combined with the DevOps model. We have passionate engineers working full time on this new platform in Hyderabad, and it's only the beginning. You will get a chance to work with brilliant people and some of the best development and design teams, in addition to working with cutting-edge technologies such as React, Java/Node.js, Docker, Kubernetes, and AWS. We are looking for exceptional Java/Node-based full stack developers to join our team. You will work alongside the Architect and DevOps teams to form a fully autonomous development squad and be in charge of a part of the product. Overall, as a Full Stack Software Engineer II at Skillsoft, you will be responsible for designing, developing, and maintaining high-quality software products, and contributing to the overall success of the company.

OPPORTUNITY HIGHLIGHTS:
- Designing and developing software products: You will be responsible for designing, developing, and maintaining software products using Java, Node.js, and React in a microservices architecture hosted on the AWS cloud.
- Collaborating with cross-functional teams: You will work closely with cross-functional teams such as product management, user experience, DevOps, and CloudOps to deliver high-quality software products.
- Participating in agile development practices: Skillsoft follows agile development practices, and as a Full Stack Software Engineer II, you will be responsible for participating in agile ceremonies such as grooming, stand-ups, sprint planning, and retrospectives.
- Staying up to date with emerging technologies: You will be expected to stay up to date with emerging technologies and industry trends, and apply this knowledge to improve software development practices at Skillsoft.

SKILLS & QUALIFICATIONS:
- Minimum 3+ years of software engineering experience developing cloud-based enterprise solutions.
- Experience in programming languages (Java, JavaScript, HTML5, CSS).
- Experience in JavaScript frameworks (Node.js, React, Redux, Angular, Express.js).
- Experience with frameworks (Spring Boot, stream processing).
- Experience working with REST APIs, web services, and SAML integrations.
- Experience working with databases, preferably Postgres.
- Good knowledge of DevOps tools (Docker, Kubernetes, Ansible, AWS).
- Experience with code versioning tools, preferably Git (GitHub, GitLab, etc.) and the feature-branch workflow.
- Knowledge of Kafka and RabbitMQ (message queue systems).
- Understanding of various data structures and algorithms.
- Must know how to code applications on Unix/Linux-based systems.
- Experience working in an Agile/Scrum environment.
- Strong analytical skills and the ability to understand complexities and how components connect and relate to each other.

OUR VALUES
WE ARE PASSIONATELY COMMITTED TO LEADERSHIP, LEARNING, AND SUCCESS. WE EMBRACE EVERY OPPORTUNITY TO SERVE OUR CUSTOMERS AND EACH OTHER AS: ONE TEAM. OPEN AND RESPECTFUL. CURIOUS. READY. TRUE.

MORE ABOUT SKILLSOFT:
Skillsoft delivers online learning, training, and talent solutions to help organizations unleash their edge. Leveraging immersive, engaging content, Skillsoft enables organizations to unlock the potential in their best assets (their people) and build teams with the skills they need for success. Empowering 36 million learners and counting, Skillsoft democratizes learning through an intelligent learning experience and a customized, learner-centric approach to skills development, with resources for Leadership Development, Business Skills, Technology & Development, Digital Transformation, and Compliance. Skillsoft is partner to thousands of leading global organizations, including many Fortune 500 companies. The company features award-winning systems that support learning, performance, and success, including Skillsoft learning content and the Percipio intelligent learning experience platform, which offers measurable impact across the entire employee lifecycle. Learn more at .

Thank you for taking the time to learn more about us. If this opportunity intrigues you, we would love for you to apply!

NOTE TO EMPLOYMENT AGENCIES: We value the partnerships we have built with our preferred vendors. Skillsoft does not accept unsolicited resumes from employment agencies. All resumes submitted by employment agencies directly to any Skillsoft employee or hiring manager in any form without a signed Skillsoft Employment Agency Agreement on file and a search engagement for that position will be deemed unsolicited in nature. No fee will be paid in the event the candidate is subsequently hired as a result of the referral or through other means. Skillsoft is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information, and other legally protected categories.
Posted 2 months ago
8 - 12 years
1 - 3 Lacs
Bengaluru
Hybrid
Expertise with Java extends to building RESTful APIs, working with microservices, and integrating with databases, which are key components of enterprise-grade applications. Java's strong ecosystem includes frameworks like Spring Boot, which simplifies the development of robust and scalable backend systems.

Kafka is a distributed event streaming platform that enables real-time data streaming, processing, and storage, and it plays a crucial role in building scalable, fault-tolerant, and high-throughput systems. In your experience, Kafka has been used as part of microservices and data streaming applications, highlighting its importance in handling real-time data flow. Kafka operates as a message broker, where data is published to topics and consumers subscribe to those topics to process the data. This allows for the decoupling of services, enabling systems to operate asynchronously.

Preferred Skill | # of Years | Highlight of Experience
Stream processing | 6 | This process is key in handling large-scale, high-throughput data in real-time systems, such as in the Notification microservice. Streaming is beneficial in scenarios where low-latency processing and quick insights are required. Technologies like Apache Kafka are commonly used for data streaming because they allow the streaming of data in a fault-tolerant and distributed manner. Data is ingested from various sources, processed in real-time, and then made available to consumers or other systems for further use.
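To make the publish/subscribe decoupling described above concrete, here is a minimal sketch using the plain Apache Kafka client API. The broker address, topic name, group id, and key/value strings are illustrative placeholders, not details from the description:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class NotificationPubSubSketch {
    public static void main(String[] args) {
        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        producerProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Producer side: a service publishes an event to a topic and moves on;
        // it does not know or care which services will consume the event.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("notifications", "user-42", "order-shipped"));
        }

        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");
        consumerProps.put("group.id", "notification-service");
        consumerProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        consumerProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        // Consumer side: a separate service subscribes to the topic and
        // processes events asynchronously, at its own pace.
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(List.of("notifications"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("key=%s value=%s%n", record.key(), record.value());
            }
        }
    }
}
```

Because the producer only knows the topic, either side can be scaled or replaced independently, which is exactly the decoupling the description refers to.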
Posted 3 months ago
7 - 12 years
20 - 35 Lacs
Bengaluru
Hybrid
Role: Java Developer
Experience: 6+ years
Location: Bangalore

Duties:
- Develop a stream processing application, add features, and provide support
- Help the team with production issues and compliance issues
- Test and document software
- Operational tasks

Required Skills:
- Good fluency in building Java applications
- Experience and expertise in building stream processing applications (any one of Kafka Streams, Flink, Beam, etc.)
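As one example of the stream processing frameworks the posting lists, here is a minimal Kafka Streams topology sketch; the application id, broker address, topic names, and the uppercase transformation are assumed placeholders:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class StreamProcessingSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "demo-stream-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read from an input topic, transform each record, write to an output topic.
        KStream<String, String> source = builder.stream("input-events");
        source.mapValues(value -> value.toUpperCase())
              .to("processed-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // Close the topology cleanly on JVM shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

A Flink or Beam pipeline would express the same read-transform-write shape with its own runner and source/sink connectors.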
Posted 3 months ago