
9 Eventhub Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

You will join as a Senior Software Development Engineer focused on backend development in Golang, based at our Bangalore (HSR Layout) office. You should have at least 4 years of experience as a Backend Developer with strong analytical and algorithmic skills, including a minimum of 1 year of hands-on GoLang experience, along with knowledge of RDBMS and NoSQL datastores. Familiarity with a major public cloud provider such as AWS, GCP, or Azure is essential, as is a clear understanding of Git and a preference for good programming practices including TDD. In this role, you will own one or more services end to end, drive the overall project and team to ensure timely delivery, and mentor and guide the team on development best practices. Contributions to open source projects, experience with UI frameworks like React, and work with message queues such as Kafka or EventHub are desirable. Mentoring or team-management experience, strong verbal and written communication skills, exposure to multiple cloud providers, and a valid US business visa would all be advantageous for this position.

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Network Operations Center (NOC) Analyst at Inspire Brands, you will oversee all technology aspects of the organization. You will act as the primary technology expert for the NOC team, detecting and resolving production issues before they affect large-scale operations. You will ensure that the services provided by the Inspire Digital Platform (IDP) meet user needs for reliability, uptime, and continuous improvement, and you will help deliver an outstanding customer experience by establishing service level agreements that align with the business model.

On the technical side, you will build and monitor dashboards that surface application, infrastructure, and potential security incidents, and provide operational support for multiple large, distributed software applications. Deep troubleshooting skills are essential to improving availability, performance, and security and maintaining 24/7 operational readiness. You will conduct thorough postmortems on production incidents to assess business impact and capture learnings for the Engineering team.

You will also create dashboards and alerts for monitoring the platform, define key metrics and service level indicators, and ensure the collection of relevant metric data so that actionable alerts reach the responsible teams. The role includes participating in the 24/7 on-call rotation, automating tasks to streamline application deployment and third-party tool integration, analyzing major incidents and collaborating with other teams to find permanent solutions, and establishing and publishing regular KPIs and metrics for measuring performance, stability, and customer satisfaction.

You should hold a 4-year degree in Computer Science, Information Technology, or a related field, and have a minimum of 5 years of experience in a production support role supporting large-scale SaaS B2C or B2B cloud platforms, with a strong background in problem-solving and troubleshooting. You should also bring knowledge of technologies such as Java, TypeScript, Python, Azure cloud services, monitoring tools like Splunk and Prometheus, containers, Kubernetes, Helm, cloud networking, and firewalls. Overall, this role requires strong technical expertise, effective communication skills, and a proactive approach to keeping Inspire Brands' technology infrastructure running smoothly.
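For context on the SLI and alerting work this posting describes, here is a minimal, hypothetical Python sketch using the prometheus_client library; every metric, label, and service name is illustrative and not taken from the role.

```python
# Hypothetical sketch: expose availability and latency SLIs with prometheus_client.
# Metric names, labels, and the "checkout-api" service are illustrative only.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter(
    "http_requests_total", "Total HTTP requests", ["service", "outcome"]
)
LATENCY = Histogram(
    "http_request_duration_seconds", "Request latency in seconds", ["service"]
)


def record_request(service: str, duration_s: float, ok: bool) -> None:
    """Record one request so availability and latency SLIs can be derived."""
    REQUESTS.labels(service=service, outcome="success" if ok else "error").inc()
    LATENCY.labels(service=service).observe(duration_s)


if __name__ == "__main__":
    start_http_server(9100)  # Prometheus scrapes metrics from :9100/metrics
    while True:  # simulate traffic so dashboards and alerts have data
        record_request("checkout-api", random.uniform(0.05, 0.5), random.random() > 0.01)
        time.sleep(1)
```

Alert rules and dashboards would then be derived from these series (for example, error-rate and latency-percentile queries) in whatever monitoring stack the team runs.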

Posted 4 days ago

Apply

8.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

This job is with Kyndryl, an inclusive employer and a member of myGwork, the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward, always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
As a Data Engineer, you will leverage your expertise in Databricks, big data platforms, and modern data engineering practices to develop scalable data solutions for our clients. Candidates with healthcare experience, particularly with EPIC systems, are strongly encouraged to apply. The work includes creating data pipelines, integrating data from various sources, and implementing data security and privacy measures. The Data Engineer will also be responsible for monitoring and troubleshooting data flows and optimizing data storage and processing for performance and cost efficiency.

Responsibilities
Develop data ingestion, data processing and analytical pipelines for big data, relational databases and data warehouse solutions. Design and implement data pipelines and ETL/ELT processes using Databricks, Apache Spark, and related tools. Collaborate with business stakeholders, analysts, and data scientists to deliver accessible, high-quality data solutions. Provide guidance on cloud migration strategies and data architecture patterns such as Lakehouse and Data Mesh. Provide pros/cons and migration considerations for private and public cloud architectures. Provide technical expertise in troubleshooting, debugging, and resolving complex data and system issues. Create and maintain technical documentation, including system diagrams, deployment procedures, and troubleshooting guides. Work with data governance, data security and data privacy tooling (Unity Catalog or Purview).

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important, you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused, someone who prioritizes customer success in their work. And finally, you're open and borderless, naturally inclusive in how you work with others.

Required Technical and Professional Experience
3+ years of consulting or client service delivery experience on Azure. Graduate/Postgraduate in computer science, computer engineering, or equivalent, with a minimum of 8 years of experience in the IT industry. 3+ years of experience developing data ingestion, data processing and analytical pipelines for big data, relational databases such as SQL Server, and data warehouse solutions such as Azure Synapse. Extensive hands-on experience implementing data ingestion, ETL and data processing. Hands-on experience with Big Data technologies such as Java, Python, SQL, ADLS/Blob, PySpark and Spark SQL, Databricks, HDInsight, and live streaming technologies such as EventHub. Experience with cloud-based database technologies (Azure PaaS DB, AWS RDS and NoSQL). Cloud migration methodologies and processes, including tools like Azure Data Factory, Data Migration Service, etc. Experience with monitoring and diagnostic tools (SQL Profiler, Extended Events, etc.). Expertise in data mining, data storage and Extract-Transform-Load (ETL) processes. Experience with relational databases and expertise in writing and optimizing T-SQL queries and stored procedures. Experience using Big Data file formats and compression techniques. Experience with developer tools such as Azure DevOps, Visual Studio Team Server, Git, Jenkins, etc. Experience with private and public cloud architectures, pros/cons, and migration considerations. Excellent problem-solving, analytical, and critical thinking skills. Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail. Communication skills: able to communicate with both technical and non-technical stakeholders and derive technical requirements with them.

Preferred Technical and Professional Experience
Cloud platform certification, e.g., Microsoft Certified: Azure Data Engineer Associate (DP-700), AWS Certified Data Analytics - Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer. Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization. Experience working with EPIC healthcare systems (e.g., Clarity, Caboodle). Databricks certifications (e.g., Databricks Certified Data Engineer Associate or Professional). Knowledge of GenAI tools, Microsoft Fabric, or Microsoft Copilot. Familiarity with healthcare data standards and compliance (e.g., HIPAA, GDPR). Experience with DevSecOps and CI/CD deployments. Experience in NoSQL database design. Knowledge of GenAI fundamentals and the industry use cases they support. Hands-on experience with Delta Lake and Delta Tables within the Databricks environment for building scalable and reliable data pipelines.

Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you - and everyone next to you - the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter - wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone who works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
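As an illustration of the EventHub-to-Delta streaming work this role references (not Kyndryl's actual pipeline), here is a minimal PySpark sketch that reads from an Event Hub through its Kafka-compatible endpoint and appends to a Delta table; the namespace, hub, checkpoint path, and table names are placeholders.

```python
# Sketch only: stream Event Hub messages into a Delta table on Databricks via the
# Kafka-compatible endpoint. Namespace, hub, secret, path, and table names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("eventhub-to-delta").getOrCreate()

EH_NAMESPACE = "mynamespace"                  # placeholder Event Hubs namespace
EH_NAME = "telemetry"                         # placeholder hub, exposed as a Kafka topic
CONN_STR = "<event-hubs-connection-string>"   # keep in a secret scope, not in code

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", f"{EH_NAMESPACE}.servicebus.windows.net:9093")
    .option("subscribe", EH_NAME)
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    # Databricks ships a shaded Kafka client; outside Databricks drop the
    # "kafkashaded." prefix from the login module class name.
    .option(
        "kafka.sasl.jaas.config",
        "kafkashaded.org.apache.kafka.common.security.plain.PlainLoginModule required "
        f'username="$ConnectionString" password="{CONN_STR}";',
    )
    .load()
)

events = raw.select(col("value").cast("string").alias("body"), col("timestamp"))

(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/telemetry")  # placeholder path
    .outputMode("append")
    .toTable("bronze.telemetry_events")                          # placeholder table
)
```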

Posted 5 days ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

Experience in designing Azure IPaaS solutions with a focus on re-usability and loosely coupled architecture. Capable of leading a larger team and ensuring seamless delivery with zero issues. Strong technical expertise in Azure IPaaS components including LogicApps, FunctionApps, APIM, ServiceBus, EventHub, EventGrid, ADF, and KeyVaults. Proficient in creating and maintaining automated build and release pipelines (DevOps CI/CD). Hands-on experience in designing and developing microservices-based architecture. Sound knowledge of Azure IaaS, PaaS, and SaaS.
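To illustrate the EventHub piece of this integration stack, a minimal consumer sketch using the azure-eventhub SDK (v5) is shown below; the connection details and handler are placeholders, and in a real Azure IPaaS design the equivalent trigger would often live in a Function App or Logic App instead.

```python
# Sketch only: consume events with the azure-eventhub SDK (v5). The connection
# string, hub name, and handler are placeholders; checkpoints persist only if a
# checkpoint store (e.g. Azure Blob) is configured on the client.
from azure.eventhub import EventHubConsumerClient

CONN_STR = "<event-hubs-connection-string>"   # typically resolved from Key Vault
EVENTHUB_NAME = "orders"                      # placeholder hub name


def on_event(partition_context, event):
    # Handle one event, then record progress for this partition.
    print(f"partition {partition_context.partition_id}: {event.body_as_str()}")
    partition_context.update_checkpoint(event)


client = EventHubConsumerClient.from_connection_string(
    CONN_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
)

with client:
    # Blocks and dispatches incoming events to on_event; "-1" starts from the
    # beginning of each partition.
    client.receive(on_event=on_event, starting_position="-1")
```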

Posted 1 week ago

Apply

7.0 - 10.0 years

16 - 25 Lacs

Bengaluru

Remote

Role: Lead/Senior Python + Azure Developer
Total experience: 7+ years
Notice period: Immediate
Mandatory skills: Python, Azure services (Azure Functions, Blob, Queue, EventHub, Key Vault, Cosmos DB, Azure Event Grid / Service Bus, etc.), MSSQL, EKS, Azure Container Instances, Agile, Jira

Job Purpose (both Onsite / Offshore): We are seeking a Lead Python + Azure Developer to join our dynamic team. The ideal candidate will have a strong background in Python programming and a sound understanding of web application development, with a focus on utilizing Azure services to build scalable and efficient solutions. You will be responsible for delivering senior-level, innovative, compelling, coherent software solutions for our consumer, internal operations, and value chain constituents across a wide variety of enterprise applications through the creation of discrete business services and their supporting components.

Job Description / Duties and Responsibilities: Design, develop and deliver solutions that meet business line and enterprise requirements. Lead a team of Python developers, providing technical guidance, mentorship, and support in project execution. Participate in rapid prototyping and POC development efforts. Advance the overall enterprise technical architecture and implementation best practices. Assist in efforts to develop and refine functional and non-functional requirements. Participate in iteration and release planning. Perform functional and non-functional testing. Demonstrate knowledge of, adherence to, monitoring of, and responsibility for compliance with state and federal regulations and laws as they pertain to this position. Produce high-quality, properly functioning deliverables the first time and deliver work according to established deadlines. Estimate tasks with a level of granularity and accuracy commensurate with the information provided. Work collaboratively in a small team and excel in a rapid-iteration environment with short turnaround times. Deal positively with high levels of uncertainty, ambiguity, and shifting priorities, and accept a wide variety of tasks, pitching in wherever needed. Constructively present, discuss, and debate alternatives, and take shared ownership of the product. Communicate effectively both verbally and in writing, take direction from team leads and upper management, and be able to work with little to no supervision.

Skills and Competencies: Architect, design, and implement high-performance, scalable Python back-end applications. Proficiency in Python for developing backend services and APIs. Experience with web frameworks such as FastAPI, Flask, or Django for building RESTful APIs. Experience with Azure services such as Azure Functions, Blob, Queue, EventHub, Key Vault, Cosmos DB, Azure Event Grid / Service Bus, EKS, and Azure Container Instances. Knowledge of implementing authentication and authorization mechanisms using AWS Cognito and other relevant services. Good understanding of databases including PostgreSQL, MongoDB, AWS Aurora, and DynamoDB. Experience with automated CI/CD implementation using Terraform is required. Deep understanding of one or more source/version control systems (Git/Bitbucket), including developing branching and merging strategies. Working understanding of Web APIs, REST, JSON, etc., and of unit test creation. A Bachelor's degree is required, and/or a minimum of four (5)+ years of related work experience. Adherence to Information Security Management policies and procedures is expected.
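As a rough sketch of the Python + Azure backend work described above, the example below exposes a FastAPI endpoint that publishes a payload to an Event Hub with the azure-eventhub async client; the route, model, hub name, and environment variable are assumptions, not requirements from the posting.

```python
# Sketch only: a FastAPI route that publishes to an Event Hub with the async
# azure-eventhub client. Route, model, hub name, and env var are assumptions.
import json
import os

from azure.eventhub import EventData
from azure.eventhub.aio import EventHubProducerClient
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Order(BaseModel):
    order_id: str
    amount: float


@app.post("/orders")
async def publish_order(order: Order) -> dict:
    # A per-request client keeps the sketch simple; a long-lived client (and a
    # Key Vault-backed connection string) would be preferable in production.
    producer = EventHubProducerClient.from_connection_string(
        os.environ["EVENTHUB_CONN_STR"], eventhub_name="orders"
    )
    async with producer:
        batch = await producer.create_batch()
        # model_dump() is pydantic v2; use .dict() on pydantic v1.
        batch.add(EventData(json.dumps(order.model_dump())))
        await producer.send_batch(batch)
    return {"status": "queued", "order_id": order.order_id}
```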

Posted 1 week ago

Apply

3.0 - 6.0 years

3 - 7 Lacs

Hyderabad, Telangana, India

On-site

3-5 years of relevant experience in Dell Boomi, including designing, developing, and implementing integration solutions with Boomi, as well as managing, monitoring, troubleshooting, and supporting existing Dell Boomi integrations and the platform. Experience with other scripting/programming languages and databases is expected. Excellent communication skills; must be able to work independently and interact with stakeholders directly. Mandatory skills: Integration Engineer - Dell Boomi. Desired/secondary skills: Apache Airflow, NiFi, EventHub.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Bengaluru, Karnataka, India

On-site

Bullet-Pointed JD:
• Design, architect, and develop solutions using cloud big data technology to ingest, process, and analyze large, disparate data sets
• Develop systems to ingest, cleanse, and normalize datasets and build pipelines from various sources, structuring previously unstructured data
• Collaborate with internal teams and external professionals to gather requirements and identify data development opportunities
• Understand and map data flow across applications like CRM, Broker & Sales tools, Finance, HR, etc.
• Unify, enrich, and analyze diverse data to generate insights and business opportunities
• Design and develop data management and persistence solutions using relational and non-relational databases
• Create POCs to validate solution proposals and support migration initiatives
• Build data lake solutions to store structured and unstructured data from multiple sources and guide teams in adopting modern tech platforms
• Follow CI/CD processes and best practices in development to strengthen data engineering discipline
• Mentor team members and contribute to overall organizational growth

What we are looking for:
• 6+ years of experience and a bachelor's degree in Information Science, Computer Science, Mathematics, Statistics, or a related quantitative field
• Hands-on engineer with curiosity for technology and adaptability to evolving tech landscapes
• Understanding of Cloud Computing (AWS, Azure preferred), Microservices, Streaming Technologies, Network, and Security
• 3+ years of development experience using Python-Spark, Spark Streaming, Azure SQL Server, Cosmos DB/MongoDB, Azure Event Hubs, Azure Data Lake Storage, Azure Search
• Design and develop data management and persistence solutions with a focus on enhancing data processing capabilities
• Build, test, and improve data curation pipelines integrating data from DBMS, file systems, APIs, and streaming systems for KPI and metric development
• Maintain platform health, monitor workloads, and act as SME for assigned applications in collaboration with Infrastructure Engineering teams
• Team player with a self-motivated, reliable, and disciplined work ethic, capable of managing multiple projects
• 3+ years of experience with source code control systems and CI/CD tools
• Independent and capable of managing, prioritizing, and leading workload efficiently
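To illustrate the Cosmos DB persistence mentioned in this role, here is a minimal sketch using the azure-cosmos SDK (v4) to upsert curated KPI documents; the database, container, partition key, and field names are placeholders.

```python
# Sketch only: upsert curated KPI documents into Cosmos DB with azure-cosmos (v4).
# Endpoint, key, database, container, and field names are placeholders.
import os

from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(url=os.environ["COSMOS_URL"], credential=os.environ["COSMOS_KEY"])
database = client.create_database_if_not_exists("analytics")
container = database.create_container_if_not_exists(
    id="kpis", partition_key=PartitionKey(path="/region")
)


def upsert_kpi(kpi_id: str, region: str, metric: str, value: float) -> None:
    """Insert or update one KPI document, keyed by id within its region partition."""
    container.upsert_item(
        {"id": kpi_id, "region": region, "metric": metric, "value": value}
    )


upsert_kpi("2024-06-revenue-emea", "emea", "monthly_revenue", 1250000.0)
```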

Posted 1 month ago

Apply

4.0 - 8.0 years

25 - 27 Lacs

Bengaluru

Hybrid

Job Summary: We are looking for a highly skilled Azure Data Engineer with experience in building and managing scalable data pipelines using Azure Data Factory, Synapse, and Databricks. The ideal candidate should be proficient in big data tools and Azure services, with strong programming knowledge and a solid understanding of data architecture and cloud platforms.

Key Responsibilities:
• Design and deliver robust data pipelines using Azure-native tools
• Work with Azure services like ADLS, Azure SQL DB, Cosmos DB, and Synapse
• Develop ETL/ELT solutions and collaborate in cloud-native architecture discussions
• Support real-time and batch data processing using tools like Kafka, Spark, and Stream Analytics
• Partner with global teams to develop high-performing, secure, and scalable solutions

Required Skills:
• 4 to 7 years of experience in Data Engineering and the Azure platform
• Expertise in Azure Data Factory, Synapse, Databricks, Stream Analytics, Power BI
• Hands-on with Python, Scala, SQL, C#, Java and big data tools like Spark, Hive, Kafka, EventHub
• Experience with distributed systems, data governance, and large-scale data environments

Apply now to join a cutting-edge data engineering team enabling innovation through Azure cloud solutions.

Posted 1 month ago

Apply

4.0 - 6.0 years

9 - 19 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

JOB DESCRIPTION:
• Strong experience in Azure Data Factory, Databricks, EventHub, Python, PySpark, Azure Synapse, and SQL
• Azure DevOps experience to deploy the ADF pipelines
• Knowledge/experience with the Azure cloud stack

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
