
2 MSK Kafka Jobs

Set Up a Job Alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer at Everstream Analytics, you will be a key player in the development and maintenance of our data infrastructure. Working alongside a team of skilled engineers, you will be responsible for designing, developing, and optimizing Python-based data pipelines and products that support our cloud-native data platform. Your expertise will be crucial in utilizing various AWS services such as MSK (Kafka), Lambda, Glue, Spark, Athena, Lake Formation, Redshift, S3, and RDS to ensure the scalability, reliability, and efficiency of our data infrastructure.

Your responsibilities will include architecting, developing, and owning data pipelines that manage large volumes of data from multiple sources while maintaining data quality, integrity, and availability. You will leverage your expertise in AWS data services to create scalable and cost-effective data solutions. Experience with relational databases like PostgreSQL on RDS, graph databases like Neo4j, stream processing tools such as Apache Kafka and Apache Spark, and proficiency in Python development will be essential for success in this role.

Collaboration with Product Management, Data Science, and leadership teams to understand data requirements and deliver solutions that meet business needs will be a key aspect of your role. Additionally, you will be responsible for monitoring and optimizing data pipelines for scalability and efficiency, maintaining documentation for data engineering processes, and providing leadership within the data engineering team.

The ideal candidate will have proven experience in designing and building cloud-native data platforms in a SaaS or PaaS environment, along with proficiency in AWS services, relational and graph database technologies, distributed system design, data warehousing, and stream processing. Strong programming skills in Python, problem-solving abilities, and the capability to work collaboratively with cross-functional teams are crucial. A degree in Computer Science, Data Engineering, or a related field, or equivalent experience, is preferred.

This position is based at the Everstream Analytics office in Koregaon Park, Pune. Everstream Analytics, a company focused on revolutionizing the supply chain industry with disruptive technology, offers a dynamic work environment where growth and innovation are encouraged. If you are passionate about driving change and want to be part of a team that values resiliency, responsiveness, and critical thinking, consider joining Everstream Analytics to advance your career. Learn more about Everstream Analytics at www.everstream.ai.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As an API Developer, you will collaborate with Technology Delivery Managers, Business Units, Enterprise/Solution Architects, and vendor partners to implement API solutions that address critical business challenges. Your responsibilities will include building and maintaining integrations for various on-premises and cloud systems, understanding business requirements, working with end users, and deploying integrations effectively.

You should have a minimum of 2 years of experience in Microservices architecture and Java, with a strong grasp of design patterns and the Spring Boot framework. Your ability to work collaboratively with team members, establish positive working relationships, and design, build, and deploy APIs to meet business needs will be crucial. Your commitment to ensuring business satisfaction and agility, coupled with a strong work ethic, passion for the role, positive attitude, and willingness to learn, will contribute significantly to your success in this role. Effective communication with the tech lead to comprehend requirements thoroughly and promptly address any blockers will be essential.

In this position, you will handle programming and software development tasks such as requirement gathering, bug fixing, testing, documentation, and implementation. Operating within an agile environment, you will be responsible for delivering high-quality solutions and ensuring the implementation of Security, Logging, Auditing, Policy Management, and Performance Monitoring.

Moreover, your familiarity with relational databases (e.g., Oracle), non-relational databases (e.g., MongoDB), MSK Kafka, Docker, Kubernetes, and CI/CD technologies (Jenkins, GitHub, Maven) will be beneficial in fulfilling the job requirements. If you are excited about this opportunity and possess the required skills and experience, we encourage you to share your CV with us today at kishori@bwbsol.com.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies