7 Streaming Data Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The job is located in Chennai, Tamil Nadu, India with Hitachi Energy India Development Centre (IDC). Part of the Engineering & Science profession, the role is full-time and not remote. The primary focus of the India Development Centre is research and development: around 500 R&D engineers, specialists, and experts are dedicated to creating and sustaining digital solutions, new products, and technology. The centre collaborates with Hitachi Energy's R&D and Research centres across more than 15 locations in 12 countries. Hitachi Energy's mission is to advance the world's energy system to be more sustainable, flexible, and secure while considering social, environmental, and economic aspects. The company has a strong global presence, with installations in over 140 countries.

Your responsibilities include:
- Meeting milestones and deadlines while staying on scope
- Providing suggestions for improvements and being open to new ideas
- Collaborating with a diverse team across different time zones
- Enhancing processes for continuous integration, deployment, testing, and release management
- Ensuring the highest standards of security
- Developing, maintaining, and supporting Azure infrastructure and system software components
- Providing guidance to developers on building solutions using Azure technologies
- Owning the overall architecture in Azure
- Ensuring application performance, uptime, and scalability
- Leading CI/CD process design and implementation
- Defining best practices for application deployment and infrastructure maintenance
- Monitoring and reporting on compute/storage costs (see the sketch below)
- Managing deployment of a .NET microservices-based solution
- Upholding Hitachi Energy's core values of safety and integrity

Your background should ideally include:
- 3+ years of experience in Azure DevOps, CI/CD, configuration management, and test automation
- 2+ years of experience with Azure technologies such as IaC, ARM, YAML, Azure PaaS, Azure Active Directory, Kubernetes, and Application Insights
- Proficiency in Bash scripting
- Hands-on experience with Azure components and services
- Building and maintaining large-scale SaaS solutions
- Familiarity with SQL, PostgreSQL, NoSQL, and Redis databases
- Expertise in infrastructure-as-code automation and monitoring
- Understanding of security concepts and best practices
- Experience with deployment tools like Helm charts and docker-compose
- Proficiency in at least one programming language (e.g., Python, C#)
- Experience with system management in a Linux environment
- Knowledge of logging & visualization tools like the ELK stack, Prometheus, and Grafana
- Experience with Azure Data Factory, WAF, streaming data, and big data/analytics

Proficiency in spoken and written English is essential for this role. If you have a disability and require accommodations during the job application process, you can request reasonable accommodations through Hitachi Energy's website by completing a general inquiry form. This assistance is specifically for individuals with disabilities needing accessibility support during the application process.
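One of the bullets above covers monitoring and reporting on compute/storage costs for Azure infrastructure. As a hedged, minimal sketch only (not part of the posting): a short Python inventory pass with the azure-identity and azure-mgmt-resource SDKs, grouping resources by resource group and an assumed "owner" tag so costs can be attributed to teams. The subscription environment variable and tag name are illustrative assumptions.

    # Minimal sketch: inventory Azure resources per resource group so compute/storage
    # costs can be attributed to owning teams. Assumes `pip install azure-identity
    # azure-mgmt-resource`, an AZURE_SUBSCRIPTION_ID environment variable, and an
    # "owner" tag convention -- all illustrative assumptions.
    import os
    from collections import Counter

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.resource import ResourceManagementClient

    credential = DefaultAzureCredential()  # picks up env vars, managed identity, or az login
    client = ResourceManagementClient(credential, os.environ["AZURE_SUBSCRIPTION_ID"])

    for group in client.resource_groups.list():
        # Count resource types inside each group to spot the heaviest consumers.
        types = Counter(r.type for r in client.resources.list_by_resource_group(group.name))
        owner = (group.tags or {}).get("owner", "untagged")
        print(f"{group.name} (owner={owner}): {dict(types)}")

In practice the output would feed a cost dashboard or report rather than stdout, but the pattern (credential, management client, iterate, aggregate) is the same.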

Posted 2 weeks ago

Apply

3.0 - 5.0 years

3 - 12 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities:

Data Management and Governance:
- Oversee the data management strategy for financial services within the global network to ensure data quality, consistency, and compliance across multiple regions.
- Implement data governance policies to ensure adherence to regulatory requirements (e.g., GDPR, CCPA, SOX) and best practices.
- Develop data models and workflows that meet business requirements while ensuring data security and privacy.

AI and Machine Learning Implementation:
- Work closely with AI/ML teams to build intelligent data systems, utilizing artificial intelligence and machine learning techniques to improve data quality, predictive analytics, and decision-making processes.
- Implement machine learning algorithms to automate data categorization, anomaly detection, and risk assessments within financial data systems (see the sketch after this listing).
- Integrate AI-powered tools with data lakes and cloud storage systems to enhance the analysis of large datasets in real time.

Data Integration and Architecture:
- Design and implement data integration frameworks that connect diverse data sources (e.g., transactional data, customer insights, financial records) across the S&C Global Network.
- Create an enterprise data architecture that supports efficient, scalable, and secure data flows across platforms (on-premise and cloud-based).
- Ensure seamless data pipeline integration using technologies such as ETL (Extract, Transform, Load), APIs, and real-time data streaming.

Data Analytics and Reporting:
- Collaborate with data analysts and business intelligence teams to develop insights that drive business decisions within financial services.
- Provide actionable data insights through AI-driven analytics, enabling stakeholders to make informed decisions.
- Utilize data visualization tools (e.g., Power BI, Tableau) to create reports and dashboards for senior leadership, identifying trends, patterns, and potential risks.

Financial Services (FS) Data Strategy:
- Develop and maintain a strategic roadmap for financial services data across the S&C Global Network, aligning with business goals and IT objectives.
- Work with cross-functional teams to improve the data lifecycle management process, from data collection and storage to analysis and reporting.
- Enhance the ability to track and manage financial data systems, ensuring accuracy and completeness in reporting, auditing, and regulatory compliance.

Collaboration and Stakeholder Management:
- Work closely with business stakeholders, including senior leaders, to understand requirements and propose solutions that address data challenges within global financial networks.
- Lead cross-functional teams to implement data-driven solutions, fostering collaboration between business, technology, and data teams.
- Provide training and support to teams on the effective use of AI-powered data management tools and data governance processes.

Continuous Improvement and Innovation:
- Stay current with AI trends, data management best practices, and innovations in financial services data technologies.
- Propose and implement new technologies or processes that can improve data management efficiency, scalability, or automation.
- Develop a culture of continuous improvement in data quality and AI-powered decision-making within the organization.

Requirements:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, Information Technology, Financial Services, or related fields.
- Relevant certifications in AI/ML, Data Management, or Financial Services are a plus.

Experience:
- 5+ years of experience in data management, AI/ML applications, and financial services.
- Proven track record of managing large-scale data projects within global networks.
- Hands-on experience with AI-powered data systems, predictive analytics, and data pipelines.
- Expertise in financial data management, including compliance, reporting, and risk management processes.

Technical Skills:
- Strong knowledge of data integration tools (e.g., ETL, API integration, streaming data).
- Experience with AI frameworks (e.g., TensorFlow, PyTorch, Azure AI).
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud for data storage and computing.
- Advanced proficiency in SQL and experience with big data technologies (e.g., Hadoop, Spark, Databricks).
- Proficiency in data visualization tools such as Power BI, Tableau, or similar.
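The anomaly detection and risk assessment responsibilities above are often prototyped with standard Python tooling. The following is a minimal, hedged sketch only; the synthetic transactions table, its column names, and the 1% contamination rate are illustrative assumptions, not details from the posting.

    # Minimal sketch: flag unusual transactions with scikit-learn's IsolationForest.
    # The synthetic data, column names, and contamination rate are illustrative only.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(42)
    transactions = pd.DataFrame({
        "amount": rng.lognormal(mean=4.0, sigma=1.0, size=10_000),
        "hour_of_day": rng.integers(0, 24, size=10_000),
        "merchant_risk_score": rng.random(10_000),
    })

    model = IsolationForest(contamination=0.01, random_state=0)
    # fit_predict returns -1 for points the model treats as outliers, 1 otherwise.
    transactions["is_anomaly"] = model.fit_predict(transactions) == -1

    print(transactions["is_anomaly"].sum(), "transactions flagged for review")
    print(transactions[transactions["is_anomaly"]].head())

In a real financial-services setting the flagged rows would feed a review workflow, and the features would come from governed, compliant data sources rather than synthetic data.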

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 - 3 Lacs

Hyderabad

Work from Office

Software Real-Time Data Engineer III - Streaming Data

Position Responsibilities Summary:
- Configure and validate infrastructure related to our internal SaaS Real-Time Data Streaming Platform in Azure.
- Develop and validate the streaming pipeline and data models.
- Enrich data with Azure Stream Analytics.
- Use machine learning techniques such as classification and regression, and develop systems to support applying AI/ML techniques.
- Provide support to our development teams and work with our Data Organization and Application Architects.
- Documentation: write knowledge articles and other documentation.

General Responsibilities:
- Serve as a regional point of contact in our Data Streaming team, together with members in other regions.
- Work in close coordination with the architecture, development, data, and operations teams so that the platform functions as expected.
- Support the development of the streaming pipeline and data models (an illustrative consumer sketch follows this listing).
- Look for opportunities to improve our end-to-end stream through architecture and technology-based changes.

Education and/or Requirements:
- A minimum of an associate or bachelor's degree in computer science and proven work experience.
- Proven excellent communication and interpersonal skills.
- Strong organizational acumen.

Skill Requirements:
- Works constructively and collaboratively with other team members and across organizations to accomplish organizational goals and objectives.
- Applies methodologies, processes, and tools to enhance work effectiveness and to implement process improvements.
- Leverages experience and acquires and shares new skills and knowledge to enhance organizational capability and individual competence.
- Fosters teamwork by identifying and removing obstacles to ensure organizational results are achieved.
- Proven affinity with Microsoft Azure Cloud, specifically Azure Event Hubs and Azure Stream Analytics.
- Knowledge of and experience with Apache Flink or the Kafka Streams API, Apache Spark, Apache NiFi, and Azure HDInsight are a plus.
- Proven ability to develop in Java; alternatives such as Scala or Kotlin are a plus.
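The role centres on consuming and enriching real-time streams built on Azure Event Hubs, with Java as the primary language. Purely as an illustrative sketch of the consumption pattern (shown here in Python with the azure-eventhub SDK; the connection string, hub name, and the "severity" enrichment rule are invented placeholders):

    # Minimal sketch: consume events from an Azure Event Hub and apply a simple
    # enrichment before handing records downstream. Connection details, the hub
    # name, and the "severity" field are placeholders, not details from the posting.
    import json
    import os

    from azure.eventhub import EventHubConsumerClient

    def on_event(partition_context, event):
        record = json.loads(event.body_as_str())
        # Hypothetical enrichment: tag the record with its partition and a derived flag.
        record["partition_id"] = partition_context.partition_id
        record["is_priority"] = record.get("severity", 0) >= 3
        print(record)  # in practice: forward to Stream Analytics, a topic, or storage

    client = EventHubConsumerClient.from_connection_string(
        conn_str=os.environ["EVENTHUB_CONNECTION_STR"],
        consumer_group="$Default",
        eventhub_name="telemetry",  # placeholder hub name
    )

    with client:
        # starting_position="-1" reads each partition from the beginning.
        client.receive(on_event=on_event, starting_position="-1")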

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The role at Hitachi Energy India Development Centre (IDC) in Chennai offers you the opportunity to be part of a dedicated team of over 500 R&D engineers, specialists, and experts focused on creating innovative digital solutions, new products, and cutting-edge technology. As part of the IDC team, you will collaborate with R&D and Research centres across more than 15 locations globally, contributing to the advancement of the world's energy system towards sustainability, flexibility, and security.

Your primary responsibilities in this role include staying on track to meet project milestones and deadlines, actively suggesting and implementing process improvements, collaborating with a diverse team across different time zones, and enhancing processes related to continuous integration, deployment, testing, and release management. You will play a crucial role in developing, maintaining, and supporting Azure infrastructure and system software components, providing guidance on Azure technology components, ensuring application performance, uptime, and scalability, and leading CI/CD process design and implementation.

To excel in this position, you should possess at least 3 years of experience in Azure DevOps, CI/CD, configuration management, and test automation, along with expertise in Azure PaaS, Azure Active Directory, Kubernetes, and Application Insights. Additionally, you should have hands-on experience with infrastructure-as-code automation, database management, system monitoring, security practices, containerization, and Linux system administration. Proficiency in at least one programming language, strong communication skills in English, and a commitment to Hitachi Energy's core values of safety and integrity are essential for success in this role.

If you are a qualified individual with a disability and require accommodations during the job application process, you can request reasonable accommodations through our website. Please provide specific details about your needs to receive the necessary support. This opportunity is tailored for individuals seeking accessibility assistance, and inquiries for other purposes may not receive a response.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 15 Lacs

Bengaluru

Work from Office

Urgent hiring: Azure Data Engineer with a leading management consulting company, Bangalore location.

- Strong expertise in Databricks & PySpark for batch processing and live (streaming) data sources (a minimal sketch follows this listing).
- 4+ years of relevant experience in Databricks & PySpark/Scala.
- 7+ years of total experience.
- Good data modelling and design skills.
- Has worked on real data challenges and handled high volume, velocity, and variety of data.
- Excellent analytical & problem-solving skills, willingness to take ownership and resolve technical challenges.
- Contributes to community-building initiatives like CoE, CoP.

CTC: hike considered on current/last drawn pay.
Apply: rohita.robert@adecco.com

Mandatory skills:
- Azure - Master
- ELT - Skill
- Data Modeling - Skill
- Data Integration & Ingestion - Skill
- Data Manipulation and Processing - Skill
- GitHub Actions, Azure DevOps - Skill
- Data Factory, Databricks, SQL DB, Synapse, Stream Analytics, Glue, Airflow, Kinesis, Redshift, SonarQube, PyTest - Skill
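Since the ad leads with Databricks and PySpark over batch and streaming sources, here is a minimal, hedged sketch of the kind of Structured Streaming job it implies; the Kafka broker, topic, schema, and storage paths are placeholders, not details from the ad.

    # Minimal sketch: read a JSON event stream from Kafka with PySpark Structured
    # Streaming and land it as a Delta table. Endpoints, schema, and paths are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

    spark = SparkSession.builder.appName("streaming-ingest").getOrCreate()

    schema = StructType([
        StructField("device_id", StringType()),
        StructField("event_time", TimestampType()),
        StructField("reading", DoubleType()),
    ])

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
           .option("subscribe", "sensor-events")               # placeholder topic
           .load())

    events = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
                 .select("e.*"))

    query = (events.writeStream
             .format("delta")
             .option("checkpointLocation", "/mnt/checkpoints/sensor-events")  # placeholder path
             .outputMode("append")
             .start("/mnt/bronze/sensor_events"))                            # placeholder path

    query.awaitTermination()

Stream recovery hinges on the checkpoint location; on Databricks the sink would typically be a managed table or workspace storage path rather than the placeholder shown here.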

Posted 1 month ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Bengaluru

Work from Office

Position: Senior Azure Data Engineer (immediate joiners only)
Location: Bangalore
Mode of Work: Work from Office
Experience: 7 years of relevant experience
Job Type: Full Time (On Roll)

Job Description - Roles and Responsibilities:
The Data Engineer will work on data engineering projects for various business units, focusing on the delivery of complex data management solutions by leveraging industry best practices. They work with the project team to build the most efficient data pipelines and data management solutions that make data easily available for consuming applications and analytical solutions. A Data Engineer is expected to possess strong technical skills.

Key Characteristics:
- Technology champion who constantly pursues skill enhancement and has an inherent curiosity to understand work from multiple dimensions.
- Interest and passion in Big Data technologies, and appreciates the value that an effective data management solution brings.
- Has worked on real data challenges and handled high volume, velocity, and variety of data.
- Excellent analytical & problem-solving skills, willingness to take ownership and resolve technical challenges.
- Contributes to community-building initiatives like CoE, CoP.

Mandatory skills:
- Azure - Master
- ELT - Skill
- Data Modeling - Skill
- Data Integration & Ingestion - Skill
- Data Manipulation and Processing - Skill
- GitHub Actions, Azure DevOps - Skill
- Data Factory, Databricks, SQL DB, Synapse, Stream Analytics, Glue, Airflow, Kinesis, Redshift, SonarQube, PyTest - Skill

Optional skills:
- Experience in project management, running a scrum team.
- Experience working with BPC, Planning.
- Exposure to working with an external technical ecosystem.
- MkDocs documentation.

Interested candidates, kindly share your CV and the details below to usha.sundar@adecco.com:
1) Present CTC (Fixed + VP)
2) Expected CTC
3) No. of years of experience
4) Notice Period
5) Offer in hand
6) Reason for change
7) Present location

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Data Analyst
Location: Bangalore
Experience: 8-15 years
Type: Full-time

Role Overview:
We are seeking a skilled Data Analyst to support our platform powering operational intelligence across airports and similar sectors. The ideal candidate will have experience working with time-series datasets and operational information to uncover trends, anomalies, and actionable insights. The role works closely with data engineers, ML teams, and domain experts to turn raw data into meaningful intelligence for business and operations stakeholders.

Key Responsibilities:
- Analyze time-series and sensor data from various sources.
- Develop and maintain dashboards, reports, and visualizations to communicate key metrics and trends.
- Correlate data from multiple systems (vision, weather, flight schedules, etc.) to provide holistic insights.
- Collaborate with AI/ML teams to support model validation and interpret AI-driven alerts (e.g., anomalies, intrusion detection).
- Prepare and clean datasets for analysis and modeling; ensure data quality and consistency.
- Work with stakeholders to understand reporting needs and deliver business-oriented outputs.

Qualifications & Required Skills:
- Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Engineering, or a related field.
- 5+ years of experience in a data analyst role, ideally in a technical/industrial domain.
- Strong SQL skills and proficiency with BI/reporting tools (e.g., Power BI, Tableau, Grafana).
- Hands-on experience analyzing structured and semi-structured data (JSON, CSV, time-series).
- Proficiency in Python or R for data manipulation and exploratory analysis.
- Understanding of time-series databases or streaming data (e.g., InfluxDB, Kafka, Kinesis).
- Solid grasp of statistical analysis and anomaly detection methods (a minimal sketch follows this listing).
- Experience working with data from industrial systems or large-scale physical infrastructure.

Good-to-Have Skills:
- Domain experience in airports, smart infrastructure, transportation, or logistics.
- Familiarity with data platforms (Snowflake, BigQuery, or custom-built using open source).
- Exposure to tools like Airflow, Jupyter Notebooks, and data quality frameworks.
- Basic understanding of AI/ML workflows and data preparation requirements.
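For the time-series and anomaly-detection work described above, a rolling z-score pass in pandas is a common starting point. The sketch below is illustrative only; the file name, column names, window, and 3-sigma threshold are assumptions, not requirements from the posting.

    # Minimal sketch: flag anomalies in a sensor time series with a rolling z-score.
    # The CSV layout, column names, window, and threshold are illustrative assumptions.
    import pandas as pd

    df = pd.read_csv("sensor_readings.csv", parse_dates=["timestamp"])
    df = df.set_index("timestamp").sort_index()

    window = "1h"                                   # rolling window over the time index
    rolling = df["value"].rolling(window)
    df["zscore"] = (df["value"] - rolling.mean()) / rolling.std()
    df["is_anomaly"] = df["zscore"].abs() > 3       # simple 3-sigma rule

    print(df[df["is_anomaly"]].head())

More robust methods (seasonal decomposition, model-based detectors, or the AI-driven alerts mentioned above) would follow the same shape: compute an expected value, measure deviation, and flag outliers for review.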

Posted 1 month ago

Apply
Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies