Home
Jobs
Companies
Resume

42 Streaming Jobs

Filter
Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

12.0 - 15.0 years

35 - 60 Lacs

Chennai

Work from Office

Naukri logo

AWS Solution Architect

Requirements:
- Experience driving enterprise architecture for large commercial customers
- Experience in healthcare enterprise transformation
- Prior experience architecting cloud-first applications
- Experience leading a customer through a migration journey and proposing competing views to drive a mutual solution
- Knowledge of cloud architecture concepts, application deployment, and data migration
- Ability to design high-availability applications on AWS across Availability Zones and Regions
- Ability to design applications on AWS that take advantage of disaster recovery design guidelines

Responsibilities:
- Design, implement, and maintain streaming solutions using Amazon Managed Streaming for Apache Kafka (MSK)
- Monitor and manage Kafka clusters to ensure optimal performance, scalability, and uptime
- Configure and fine-tune MSK clusters, including partitioning strategies, replication, and retention policies
- Analyze and optimize the performance of Kafka clusters and streaming pipelines to meet high-throughput, low-latency requirements
- Design and implement data integration solutions to stream data between various sources and targets using MSK
- Lead data transformation and enrichment processes to ensure data quality and consistency in streaming applications

Mandatory Technical Skillset:
- AWS architectural concepts: designing, implementing, and managing cloud infrastructure
- AWS services (EC2, S3, VPC, Lambda, ELB, Route 53, Glue, RDS, DynamoDB, Postgres, Aurora, API Gateway, CloudFormation, etc.)
- Kafka / Amazon MSK

Domain Experience:
- Healthcare domain experience is required; Blues experience is preferred

Location: Pan India
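MSK cluster tuning like the partitioning and retention work above usually starts from throughput arithmetic. A minimal sketch of that sizing logic (the function names and the 1.5x headroom factor are illustrative assumptions, not from the posting):

```python
import math

def suggest_partitions(target_mb_s: float,
                       producer_mb_s: float,
                       consumer_mb_s: float,
                       headroom: float = 1.5) -> int:
    """Common sizing heuristic: enough partitions that neither the
    producer side nor the consumer side becomes the bottleneck,
    plus some headroom for growth."""
    needed = max(target_mb_s / producer_mb_s, target_mb_s / consumer_mb_s)
    return max(1, math.ceil(needed * headroom))

def retention_ms(days: float) -> int:
    """Translate a retention policy in days to Kafka's retention.ms unit."""
    return int(days * 24 * 60 * 60 * 1000)
```

For example, a 100 MB/s target with 10 MB/s per producer-side partition and 20 MB/s per consumer-side partition suggests 15 partitions; a 7-day policy maps to `retention.ms=604800000`.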

Posted 2 days ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Ambarnath

Work from Office


We're hiring a React Native Developer to build an OTT platform with real-time chat and audio/video calls for our React Native social app. Skills: React Native, Node.js, WebRTC, AWS, video streaming. Experience with social or streaming apps is a plus!

Posted 5 days ago

Apply

13.0 - 20.0 years

40 - 45 Lacs

Bengaluru

Work from Office


Principal Architect - Platform & Application Architect

Title: Principal Architect
Location: Onsite, Bangalore
Experience: 15+ years in software and data platform architecture and technology strategy, including 5+ years in architectural leadership roles
Education: Bachelor's/Master's in CS, Engineering, or a related field

Role Overview:
We are seeking a Platform & Application Architect to lead the design and implementation of a next-generation, multi-domain data platform and its ecosystem of applications. In this strategic and hands-on role, you will define the overall architecture, select and evolve the technology stack, and establish best practices for governance, scalability, and performance. Your responsibilities span the full data lifecycle (ingestion, processing, storage, and analytics) while ensuring the platform adapts to diverse and evolving customer needs. This role requires close collaboration with product and business teams to translate strategy into actionable, high-impact platforms and products.

Key Responsibilities:
1. Architecture & Strategy
- Design the end-to-end architecture for an on-prem/hybrid data platform (data lake/lakehouse, data warehouse, streaming, and analytics components).
- Define and document data blueprints, data domain models, and architectural standards.
- Lead build-vs-buy evaluations for platform components and recommend best-fit tools and technologies.
2. Data Ingestion & Processing
- Architect batch and real-time ingestion pipelines using tools like Kafka, Apache NiFi, Flink, or Airbyte.
- Oversee scalable ETL/ELT processes and orchestrators (Airflow, dbt, Dagster).
- Support diverse data sources: IoT, operational databases, APIs, flat files, unstructured data.
3. Storage & Modeling
- Define strategies for data storage and partitioning (data lakes, warehouses, Delta Lake, Iceberg, or Hudi).
- Develop efficient data strategies for both OLAP and OLTP workloads.
- Guide schema evolution, data versioning, and performance tuning.
4. Governance, Security, and Compliance
- Establish data governance, cataloging, and lineage-tracking frameworks.
- Implement access controls, encryption, and audit trails to ensure compliance with DPDPA, GDPR, HIPAA, etc.
- Promote standardization and best practices across business units.
5. Platform Engineering & DevOps
- Collaborate with infrastructure and DevOps teams to define CI/CD, monitoring, and DataOps pipelines.
- Ensure observability, reliability, and cost efficiency of the platform.
- Define SLAs, capacity planning, and disaster recovery plans.
6. Collaboration & Mentorship
- Work closely with data engineers, scientists, analysts, and product owners to align platform capabilities with business goals.
- Mentor teams on architecture principles, technology choices, and operational excellence.

Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 12+ years of experience in software engineering, including 5+ years in architectural leadership roles.
- Proven expertise in designing and scaling distributed systems, microservices, APIs, and event-driven architectures using Java, Python, or Node.js.
- Strong hands-on experience building scalable data platforms in on-premise, hybrid, or cloud environments.
- Deep knowledge of modern data lake and warehouse technologies (e.g., Snowflake, BigQuery, Redshift) and table formats like Delta Lake or Iceberg.
- Familiarity with data mesh, data fabric, and lakehouse paradigms.
- Strong understanding of system reliability, observability, DevSecOps practices, and platform engineering principles.
- Demonstrated success leading large-scale architectural initiatives across enterprise-grade or consumer-facing platforms.
- Excellent communication, documentation, and presentation skills, with the ability to simplify complex concepts and influence at executive levels.
- Certifications such as TOGAF or AWS Solutions Architect (Professional) and experience in regulated domains (e.g., finance, healthcare, aviation) are desirable.
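The schema evolution and data versioning duties above can be illustrated with a toy additive-only schema merge, the default compatibility rule enforced by table formats like Delta Lake and Iceberg (a sketch; `evolve_schema` is a hypothetical helper, not a library API):

```python
def evolve_schema(current: dict, incoming: dict) -> dict:
    """Merge an incoming record schema (field -> type) into the current one.

    Additive evolution only: new fields are accepted, but changing the
    type of an existing field is rejected, mirroring the compatibility
    checks table formats apply before a schema change is committed."""
    merged = dict(current)
    for field, dtype in incoming.items():
        if field in merged and merged[field] != dtype:
            raise TypeError(f"incompatible change for {field}: "
                            f"{merged[field]} -> {dtype}")
        merged[field] = dtype
    return merged
```

Adding a nullable `email` column passes; retyping `id` from long to string raises, which is the moment a real platform would force a new table version.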

Posted 5 days ago

Apply

6.0 - 10.0 years

27 - 42 Lacs

Pune

Work from Office


Job Summary:
Cognizant is looking for an Oracle and PostgreSQL DBA.
- Bachelor's degree in Computer Science or an equivalent degree.
- Minimum 5 years of experience as an Oracle and Postgres database administrator.
- Hands-on experience with Oracle and Postgres required.
- Hands-on experience with Oracle Data Guard and Postgres streaming replication.
- Hands-on experience with Oracle RAC environment setup.
- In-depth knowledge of Oracle and Postgres offline and online backup and restore mechanisms.
- Experience in supporting production environments.

Responsibilities:
This role is for a database administrator who will support highly distributed Oracle and Postgres environments, and who understands the benefits and limitations of clustering, log shipping, streaming replication, and cluster availability to meet business needs. The DBA must have hands-on Exadata experience and is responsible for setting up Oracle RAC cluster environments and for upgrading, patching, and migrating databases on Linux platforms. The DBA must be able to troubleshoot time-sensitive production issues promptly. Database monitoring tools are used to send alerts and must be fine-tuned to reduce false positives. The DBA will respond to failure alerts, emails, and escalation phone calls to fix production systems. Additionally, the DBA will serve as a subject matter expert for the configuration, implementation, and maintenance of database processes and data access methods used in applications, and will create and manage physical Oracle Data Guard and Postgres replication setups. This role works in coordination with teams of infrastructure engineers, as well as with development and support teams, to optimize application and database interfaces. The DBA will be responsible for tuning degraded database performance, ensuring optimization at all levels while adjusting statistics during performance tuning. The candidate should have experience setting up PostgreSQL cluster environments. The position provides 2nd/3rd-level support for the Oracle and Postgres database environments, so superior customer service is a key aspect of this role.
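Monitoring Postgres streaming replication, as described above, usually comes down to comparing WAL positions. A sketch of the arithmetic behind `pg_wal_lsn_diff` (illustrative; in production you would read the LSNs from `pg_stat_replication` rather than hard-coding them):

```python
def parse_lsn(lsn: str) -> int:
    """Convert a PostgreSQL WAL location like '16/B374D848' to a byte
    offset: the part before the slash is the high 32 bits, the part
    after it is the low 32 bits, both hexadecimal."""
    high, low = lsn.split("/")
    return (int(high, 16) << 32) | int(low, 16)

def replication_lag_bytes(primary_lsn: str, replica_lsn: str) -> int:
    """Bytes of WAL the standby still has to receive/replay,
    the same quantity pg_wal_lsn_diff() reports on the server."""
    return parse_lsn(primary_lsn) - parse_lsn(replica_lsn)
```

A lag that grows steadily across monitoring samples, rather than the absolute number, is the usual alerting signal.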

Posted 6 days ago

Apply

7.0 - 12.0 years

18 - 33 Lacs

Navi Mumbai

Work from Office


About Us:
Celebal Technologies is a leading solutions and services company in the fields of Data Science, Big Data, Enterprise Cloud & Automation. We are at the forefront of leveraging cutting-edge technologies to drive innovation and enhance our business processes. As part of our commitment to staying ahead in the industry, we are seeking a talented and experienced Data & AI Engineer with strong Azure cloud competencies to join our dynamic team.

Job Summary:
We are looking for a highly skilled Azure Data Engineer with a strong background in real-time and batch data ingestion and big data processing, particularly using Kafka and Databricks. The ideal candidate will have a deep understanding of streaming architectures, Medallion data models, and performance optimization techniques in cloud environments. This role requires hands-on technical expertise, including live coding during the interview process.

Key Responsibilities:
- Design and implement streaming data pipelines integrating Kafka with Databricks using Structured Streaming.
- Architect and maintain a Medallion Architecture with well-defined Bronze, Silver, and Gold layers.
- Implement efficient ingestion using Databricks Autoloader for high-throughput data loads.
- Work with large volumes of structured and unstructured data, ensuring high availability and performance.
- Apply performance-tuning techniques such as partitioning, caching, and cluster resource optimization.
- Collaborate with cross-functional teams (data scientists, analysts, business users) to build robust data solutions.
- Establish best practices for code versioning, deployment automation, and data governance.

Required Technical Skills:
- Strong expertise in Azure Databricks and Spark Structured Streaming: processing modes, output modes (append, update, complete), checkpointing, and state management.
- Experience with Kafka integration for real-time data pipelines.
- Deep understanding of Medallion Architecture.
- Proficiency with Databricks Autoloader and schema evolution.
- Deep understanding of Unity Catalog and foreign catalogs.
- Strong knowledge of Spark SQL, Delta Lake, and DataFrames.
- Expertise in performance tuning (query optimization, cluster configuration, caching strategies).
- Data management strategies; governance and access management.
- Data modelling and data warehousing concepts; Databricks as a platform.
- Solid understanding of window functions.
- Proven experience in merge/upsert logic, implementing SCD Type 1 and Type 2, and handling CDC (Change Data Capture) scenarios.
- Industry expertise in at least one of Retail, Telecom, or Energy.
- Real-time use case execution.
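The SCD Type 2 experience asked for above boils down to closing the current row for a key and opening a new one. A plain-Python sketch of that logic (in Databricks this would normally be a Delta `MERGE`; the `scd2_upsert` helper below is illustrative only):

```python
from datetime import date

def scd2_upsert(history: list, key: str, attrs: dict, as_of: date) -> list:
    """Apply one change to a Type 2 slowly changing dimension.

    The open row for the key (end_date is None) is closed out as of the
    change date and a new open row is appended, so the full attribute
    history is preserved instead of being overwritten."""
    out = []
    for row in history:
        if row["key"] == key and row["end_date"] is None:
            if row["attrs"] == attrs:
                return history  # no actual change: keep history as-is
            row = {**row, "end_date": as_of}  # close the current version
        out.append(row)
    out.append({"key": key, "attrs": attrs,
                "start_date": as_of, "end_date": None})
    return out
```

Updating a customer's city twice leaves two rows: the old one closed with an end date, the new one open, which is exactly what a point-in-time join needs.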

Posted 1 week ago

Apply

7.0 - 12.0 years

18 - 33 Lacs

Navi Mumbai

Work from Office


About Us:
Celebal Technologies is a leading solutions and services company in the fields of Data Science, Big Data, Enterprise Cloud & Automation. We are at the forefront of leveraging cutting-edge technologies to drive innovation and enhance our business processes. As part of our commitment to staying ahead in the industry, we are seeking a talented and experienced Data & AI Engineer with strong Azure cloud competencies to join our dynamic team.

Job Summary:
We are looking for a highly skilled Azure Data Engineer with a strong background in real-time and batch data ingestion and big data processing, particularly using Kafka and Databricks. The ideal candidate will have a deep understanding of streaming architectures, Medallion data models, and performance optimization techniques in cloud environments. This role requires hands-on technical expertise, including live coding during the interview process.

Key Responsibilities:
- Design and implement streaming data pipelines integrating Kafka with Databricks using Structured Streaming.
- Architect and maintain a Medallion Architecture with well-defined Bronze, Silver, and Gold layers.
- Implement efficient ingestion using Databricks Autoloader for high-throughput data loads.
- Work with large volumes of structured and unstructured data, ensuring high availability and performance.
- Apply performance-tuning techniques such as partitioning, caching, and cluster resource optimization.
- Collaborate with cross-functional teams (data scientists, analysts, business users) to build robust data solutions.
- Establish best practices for code versioning, deployment automation, and data governance.

Required Technical Skills:
- Strong expertise in Azure Databricks and Spark Structured Streaming: processing modes, output modes (append, update, complete), checkpointing, and state management.
- Experience with Kafka integration for real-time data pipelines.
- Deep understanding of Medallion Architecture.
- Proficiency with Databricks Autoloader and schema evolution.
- Deep understanding of Unity Catalog and foreign catalogs.
- Strong knowledge of Spark SQL, Delta Lake, and DataFrames.
- Expertise in performance tuning (query optimization, cluster configuration, caching strategies).
- Data management strategies; governance and access management.
- Data modelling and data warehousing concepts; Databricks as a platform.
- Solid understanding of window functions.
- Proven experience in merge/upsert logic, implementing SCD Type 1 and Type 2, and handling CDC (Change Data Capture) scenarios.
- Industry expertise in at least one of Retail, Telecom, or Energy.
- Real-time use case execution.
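The CDC-handling requirement above can be sketched as replaying keyed insert/update/delete events with last-write-wins semantics (a plain-Python stand-in for a Delta `MERGE` sink; the event shape is an illustrative assumption):

```python
def apply_cdc(table: dict, events: list) -> dict:
    """Replay a stream of CDC events into a dict acting as the target table.

    Each event carries an op ('insert', 'update', or 'delete') and a key;
    later events win, which is the usual last-write-wins semantics of a
    merge/upsert sink."""
    for ev in events:
        op, key = ev["op"], ev["key"]
        if op in ("insert", "update"):
            table[key] = ev["row"]
        elif op == "delete":
            table.pop(key, None)  # tolerate deletes for unseen keys
        else:
            raise ValueError(f"unknown op: {op}")
    return table
```

The same ordering guarantee that makes this correct is why CDC pipelines care so much about per-key partitioning in Kafka.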

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Mumbai

Work from Office


Job Summary:
We are looking for a highly skilled Azure Data Engineer with a strong background in real-time and batch data ingestion and big data processing, particularly using Kafka and Databricks. The ideal candidate will have a deep understanding of streaming architectures, Medallion data models, and performance optimization techniques in cloud environments. This role requires hands-on technical expertise, including live coding during the interview process.

Key Responsibilities:
- Design and implement streaming data pipelines integrating Kafka with Databricks using Structured Streaming.
- Architect and maintain a Medallion Architecture with well-defined Bronze, Silver, and Gold layers.
- Implement efficient ingestion using Databricks Autoloader for high-throughput data loads.
- Work with large volumes of structured and unstructured data, ensuring high availability and performance.
- Apply performance-tuning techniques such as partitioning, caching, and cluster resource optimization.
- Collaborate with cross-functional teams (data scientists, analysts, business users) to build robust data solutions.
- Establish best practices for code versioning, deployment automation, and data governance.

Required Technical Skills:
- Strong expertise in Azure Databricks and Spark Structured Streaming: processing modes, output modes (append, update, complete), checkpointing, and state management.
- Experience with Kafka integration for real-time data pipelines.
- Deep understanding of Medallion Architecture.
- Proficiency with Databricks Autoloader and schema evolution.
- Deep understanding of Unity Catalog and foreign catalogs.
- Strong knowledge of Spark SQL, Delta Lake, and DataFrames.
- Expertise in performance tuning (query optimization, cluster configuration, caching strategies).
- Data management strategies; governance and access management.
- Data modelling and data warehousing concepts; Databricks as a platform.
- Solid understanding of window functions.
- Proven experience in merge/upsert logic, implementing SCD Type 1 and Type 2, and handling CDC (Change Data Capture) scenarios.
- Industry expertise in at least one of Retail, Telecom, or Energy.
- Real-time use case execution.

Location: Mumbai
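The window-function knowledge listed above often shows up in practice as "latest row per key" deduplication. A plain-Python analogue of `ROW_NUMBER() OVER (PARTITION BY key ORDER BY ts DESC) = 1` (an illustrative sketch, not a Spark API):

```python
def latest_per_key(rows: list, key: str, order_by: str) -> list:
    """Keep only the most recent row per key value.

    Single pass: remember the best row seen so far for each key,
    then return them in key order for a deterministic result."""
    best = {}
    for row in rows:
        k = row[key]
        if k not in best or row[order_by] > best[k][order_by]:
            best[k] = row
    return sorted(best.values(), key=lambda r: r[key])
```

In Spark the same result comes from a window plus a rank filter; the dict-based version above is just the O(n) idea behind it.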

Posted 2 weeks ago

Apply

4.0 - 8.0 years

27 - 42 Lacs

Hyderabad

Work from Office


Job Summary:
We are looking for an experienced Infra Dev Specialist with 4 to 8 years of experience to join our team. The ideal candidate will have expertise in KSQL, Kafka Schema Registry, Kafka Connect, and Kafka. This role involves working in a hybrid model with day shifts and does not require travel. The candidate will play a crucial role in developing and maintaining our infrastructure to ensure seamless data flow and integration.

Responsibilities:
- Develop and maintain infrastructure solutions using KSQL, Kafka Schema Registry, Kafka Connect, and Kafka.
- Oversee the implementation of data streaming and integration solutions to ensure high availability and performance.
- Provide technical support and troubleshooting for Kafka-related issues to minimize downtime and ensure data integrity.
- Collaborate with cross-functional teams to design and implement scalable and reliable data pipelines.
- Monitor and optimize the performance of Kafka clusters to meet the demands of the business.
- Ensure compliance with security and data governance policies while managing Kafka infrastructure.
- Implement best practices for data streaming and integration to enhance system efficiency.
- Conduct regular reviews and updates of the infrastructure to align with evolving business needs.
- Provide training and support to team members on Kafka-related technologies and best practices.
- Develop and maintain documentation for infrastructure processes and configurations.
- Participate in code reviews and contribute to the continuous improvement of the development process.
- Stay updated on the latest trends and advancements in Kafka and related technologies.
- Contribute to the overall success of the team by delivering high-quality infrastructure solutions.

Qualifications:
- Strong experience with KSQL, Kafka Schema Registry, Kafka Connect, and Kafka.
- Solid understanding of data streaming and integration concepts.
- Proven track record of troubleshooting and resolving Kafka-related issues.
- Expertise in designing and implementing scalable data pipelines.
- Knowledge of security and data governance practices for managing Kafka infrastructure.
- Proficiency in monitoring and optimizing Kafka cluster performance.
- Experience providing technical support and training to team members.
- Skilled in developing and maintaining infrastructure documentation.
- Awareness of the latest trends in Kafka and related technologies.
- Excellent communication and collaboration skills.
- Proactive approach to problem-solving and continuous improvement.
- Ability to work effectively in a hybrid work model.
- Commitment to delivering high-quality infrastructure solutions.

Certifications Required: Certified Apache Kafka Developer
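Kafka Connect work like the above is driven by per-connector JSON configs submitted to the Connect REST API. A sketch of a JDBC source connector config (the connector class and `mode` settings are from Confluent's JDBC connector; host, database, column, and topic names are placeholders):

```json
{
  "name": "orders-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db-host:5432/orders",
    "mode": "timestamp+incrementing",
    "timestamp.column.name": "updated_at",
    "incrementing.column.name": "id",
    "topic.prefix": "pg-",
    "tasks.max": "1"
  }
}
```

`timestamp+incrementing` mode combines a last-modified column with a monotonic id so updates and inserts are both captured without double-reading rows.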

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 7 Lacs

Mumbai Suburban, Thane, Mumbai (All Areas)

Work from Office


Bachelor's degree in Broadcasting, Electrical Engineering, or a related field. Proven experience as a Broadcasting Engineer or in a similar role, preferably in a broadcasting studio, media production company, or live-event environment.

Required Candidate Profile: Design, install, configure, and maintain broadcasting equipment and systems, including cameras, microphones, lighting, and audio/video mixers.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

10 - 15 Lacs

Chennai

Remote


Senior Flutter Developer
Experience: 4-7 years
Salary: INR 10-15 Lacs per annum
Preferred Notice Period: Within 30 days
Shift: 7:00 AM to 4:00 PM IST
Opportunity Type: Remote
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)
Must-have skills: audio processing, Dart, Firebase services, Flutter, mobile app architecture patterns, real-time synchronization, streaming, WebSockets
Good-to-have skills: gamification, language learning, mobile development, music applications, speech recognition

Lingotune AI (one of Uplers' clients) is looking for a Senior Flutter Developer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

About Lingotune: We are revolutionizing language learning by combining music and education, making it as addictive as Spotify. We're a VC-backed startup with founders from Google, McKinsey, and YC-backed startups. We are a passionate team that thrives at the intersection of languages, music, and tech.

The Role: We're seeking a Senior Flutter Developer to build our mobile app, which combines music, language learning, and gamification. You'll work directly with our founding team and have significant ownership over our mobile experience.

What you'll do:
- Build and scale our Flutter-based mobile application
- Implement karaoke-style features with real-time lyrics synchronization
- Develop gamification mechanics similar to popular language learning apps
- Work with our in-house design team to create pixel-perfect UIs
- Implement audio processing and streaming features
- Handle real-time synchronization for lyrics and audio
- Optimize app performance and battery consumption
- Work with our AI/ML team to implement personalized learning features

What we're looking for:
- 4+ years of professional mobile development experience
- 2+ years of experience with Flutter and Dart
- Strong understanding of mobile app architecture patterns
- Experience with audio processing and streaming
- Understanding of real-time synchronization techniques
- Experience with state management in Flutter
- Knowledge of mobile app performance optimization

Nice to have:
- Experience with music-related applications
- Knowledge of speech recognition and processing
- Experience with gamification mechanics
- Understanding of language learning platforms
- Passion for music and language learning

Tech Stack: Flutter & Dart; Provider/Bloc for state management; RESTful APIs and WebSocket; Firebase services; audio processing and streaming; real-time synchronization

What we offer:
- Competitive salary and equity package
- Remote-first work environment
- Direct impact on product decisions
- Work with experienced founders from Google, McKinsey, and Y Combinator
- Opportunity to shape the future of language education
- Access to premium music and language learning resources

How to Apply: Send your resume and a brief introduction to careers@lingotune.ai. Include links to your GitHub profile and any relevant projects, especially those related to music or language learning applications.

Location: Remote (with occasional team meetups)

Equal Opportunity: Lingotune is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Engagement Type: Full-time direct hire on Lingotune AI payroll
Job Type: Permanent
Working Time: 7:00 AM to 4:00 PM IST
Interview Process: 3 rounds

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of being shortlisted and meet the client for the interview!

About Our Client: Lingotune makes language learning as easy as a catchy chorus in your head. Imagine Spotify and Duolingo had a baby - that's us.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help talent find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!

Posted 2 weeks ago

Apply

6.0 - 11.0 years

4 - 9 Lacs

Bengaluru

Work from Office


SUMMARY
Job Role: Apache Kafka Admin
Experience: 6+ years
Location: Pune (preferred), Bangalore, Mumbai
Must-Have: 6+ years of relevant experience with Apache Kafka

Job Description:
We are seeking a highly skilled and experienced Senior Kafka Administrator to join our team. The ideal candidate will have 6-9 years of hands-on experience managing and optimizing Apache Kafka environments. As a Senior Kafka Administrator, you will play a critical role in designing, implementing, and maintaining Kafka clusters to support our organization's real-time data streaming and event-driven architecture initiatives.

Responsibilities:
- Design, deploy, and manage Apache Kafka clusters, including installation, configuration, and optimization of Kafka brokers, topics, and partitions.
- Monitor Kafka cluster health, performance, and throughput metrics, and implement proactive measures to ensure optimal performance and reliability.
- Troubleshoot and resolve issues related to Kafka message delivery, replication, and data consistency.
- Implement and manage Kafka security mechanisms, including SSL/TLS encryption, authentication, authorization, and ACLs.
- Configure and manage Kafka Connect connectors for integrating Kafka with various data sources and sinks.
- Collaborate with development teams to design and implement Kafka producers and consumers for building real-time data pipelines and streaming applications.
- Develop and maintain automation scripts and tools for Kafka cluster provisioning, deployment, and management.
- Implement backup, recovery, and disaster recovery strategies for Kafka clusters to ensure data durability and availability.
- Stay up to date with the latest Kafka features, best practices, and industry trends, and provide recommendations for optimizing our Kafka infrastructure.

Requirements:
- 6-9 years of experience as a Kafka Administrator or in a similar role, with a proven track record of managing Apache Kafka clusters in production environments.
- In-depth knowledge of Kafka architecture, components, and concepts, including brokers, topics, partitions, replication, and consumer groups.
- Hands-on experience with Kafka administration tasks, such as cluster setup, configuration, performance tuning, and monitoring.
- Experience with Kafka ecosystem tools and technologies, such as Kafka Connect, Kafka Streams, and Confluent Platform.
- Proficiency in scripting languages such as Python, Bash, or Java.
- Strong understanding of distributed systems, networking, and Linux operating systems.
- Excellent problem-solving and troubleshooting skills, with the ability to diagnose and resolve complex technical issues.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams and communicate technical concepts to non-technical stakeholders.
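Day-to-day Kafka administration of the kind described above leans on tooling such as `kafka-consumer-groups.sh --describe`. A small sketch that sums the LAG column from that output for alerting scripts (the five-column layout assumed below is illustrative; real output also carries GROUP/CONSUMER-ID columns depending on version):

```python
def total_lag(describe_output: str) -> int:
    """Sum the LAG column from consumer-group describe output.

    Assumes the column order TOPIC PARTITION CURRENT-OFFSET
    LOG-END-OFFSET LAG, with a header on the first line; non-numeric
    LAG values (e.g. '-' for unassigned partitions) are skipped."""
    lag = 0
    for line in describe_output.strip().splitlines()[1:]:
        fields = line.split()
        if len(fields) >= 5 and fields[4].isdigit():
            lag += int(fields[4])
    return lag
```

Summed lag per group, tracked over time, separates a transient spike from a stuck consumer.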

Posted 3 weeks ago

Apply

4.0 - 7.0 years

18 - 20 Lacs

Bengaluru

Work from Office


• Must have: Hands-on experience in Spark with Scala or Spark with Java.
• Must have: Experience with performance tuning, DAG optimization, memory management, and streaming and batch pipelines.

Posted 3 weeks ago

Apply


1 - 3 years

0 - 3 Lacs

Gurgaon

Work from Office


Practical knowledge of video conferencing codecs (Cisco) and bridging equipment or equivalent.
Hands-on experience with the following AV products:
- Cisco TelePresence equipment
- Control and switching systems: Crestron, Extron, Lightware
- Programming: AMX, Crestron
- Audio systems: Shure, Beyerdynamic, Biamp, etc.
- Video displays and video walls: Samsung, LG
- Projectors: Christie, Panasonic, Epson

Note: We are hiring candidates with 1 to 2.5 years of experience. Graduates interested in working in Gurgaon, please share your updated resume with divya.arjun@microland.com, including the details below:
- Total Experience:
- Relevant Experience:
- Current CTC:
- Expected CTC:
- Notice Period:

Posted 2 months ago

Apply

8 - 11 years

15 - 30 Lacs

Pune, Bengaluru, Gurgaon

Work from Office


Job Position: Senior GCP Data Engineer
Experience: 8+ years
Must-have: GCP, BigQuery, Composer, Pub/Sub, Airflow, Python, SQL, Streaming
Locations: Gurgaon, Pune, Hyderabad, Bangalore, Jaipur, Chennai, Bhopal
Work Mode: Hybrid
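Pub/Sub, listed as a must-have above, delivers messages at least once, so streaming consumers are typically written idempotently. A minimal dedupe-by-message-id sketch of that pattern (illustrative, independent of the actual client library):

```python
def dedupe_stream(messages, seen=None):
    """Drop redelivered messages by id.

    'seen' is the set of already-processed ids; in a real pipeline it
    would be durable state (e.g. a BigQuery MERGE key or Dataflow
    stateful DoFn), not an in-memory set."""
    seen = set() if seen is None else seen
    out = []
    for msg in messages:
        if msg["id"] not in seen:
            seen.add(msg["id"])
            out.append(msg)
    return out
```

Passing the same `seen` set across batches extends the guarantee beyond a single pull, which is the part the durable-state caveat in the comment is about.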

Posted 2 months ago

Apply

5 - 8 years

15 - 25 Lacs

Pune, Bengaluru, Gurgaon

Hybrid


Job Role: GCP Data Engineer Experience: 5+ years Must-have: GCP, BigQuery, Composer, Pub/Sub, Airflow, Python, SQL, Streaming Locations: Pune, Hyderabad, Bhopal, Gurugram, Jaipur, Bangalore, Chennai

Posted 2 months ago

Apply

3 - 6 years

10 - 12 Lacs

Bengaluru

Remote


Job Title: Data Engineer (3-5 Years Experience) Location: Hybrid (mostly remote) Employment Type: Full-Time/Contract Job Summary: We are seeking a highly skilled Data Engineer with 3-5 years of experience in designing, building, and optimizing data pipelines and cloud-based data solutions. The ideal candidate should have in-depth expertise in one of the following primary technologies: AWS, Azure, Databricks, or Snowflake, while being familiar with other cloud and big data platforms. This role requires strong ETL development skills, data modeling knowledge, and a deep understanding of cloud-based data engineering best practices. Key Responsibilities: Design, develop, and optimize data ingestion and transformation pipelines using ETL/ELT frameworks. Implement big data processing solutions using Databricks (PySpark, Scala), Snowflake, AWS, or Azure. Develop and maintain structured and semi-structured data models, ensuring efficient data warehousing and analytics. Optimize data pipeline performance, handling large-scale datasets with efficiency. Ensure data governance, security, and compliance in cloud-based environments. Work with cloud-based storage, compute, and database services (e.g., AWS Redshift, Azure Synapse, Snowflake, Databricks Delta Lake). Collaborate with analysts, data scientists, and business teams to support reporting, analytics, and machine learning initiatives. Troubleshoot and resolve data pipeline and infrastructure issues. Implement monitoring and alerting mechanisms to maintain data pipeline health. Required Skills & Qualifications: 3-5 years of experience in data engineering, cloud computing, and ETL development. Primary expertise in one of the following: Databricks: Strong experience in Apache Spark, PySpark, Scala, and performance optimization. Snowflake: Deep knowledge of Snowflake architecture, SQL optimization, data sharing, and security best practices.
AWS: Experience with AWS Glue, Redshift, S3, Lambda, and Step Functions for data workflows. Azure: Strong expertise in Azure Synapse, Data Factory, ADLS, and Databricks on Azure. Proficiency in SQL, Python, and data modeling techniques. Hands-on experience with structured and semi-structured data formats (JSON, Avro, Parquet, XML). Experience with data orchestration tools (Airflow, DBT, Prefect). Strong understanding of data security, governance, and compliance best practices. Excellent problem-solving and troubleshooting skills in big data environments. Preferred Qualifications: Experience with CI/CD pipelines for data engineering. Familiarity with real-time data streaming using Kafka, Kinesis, or Spark Streaming. Exposure to machine learning workflows and MLOps best practices. Knowledge of BI and visualization tools like Power BI, Looker, or Tableau.
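The semi-structured ETL work described above can be sketched in plain Python as a stand-in for what a PySpark or Snowflake pipeline would express. The field names and the dead-letter pattern here are assumptions for illustration only:

```python
# Hedged sketch of a small ETL step over semi-structured (JSON) records:
# parse, normalize, and route unparseable rows to a dead-letter list.
import json

RAW = [
    '{"user_id": "u1", "amount": "42.50", "currency": "usd"}',
    '{"user_id": "u2", "amount": "oops", "currency": "EUR"}',   # bad record
    '{"user_id": "u3", "amount": "10", "currency": "USD"}',
]

def extract_transform(lines):
    """Parse JSON lines, normalize fields, and collect bad rows separately."""
    clean, dead_letter = [], []
    for line in lines:
        rec = json.loads(line)
        try:
            clean.append({
                "user_id": rec["user_id"],
                "amount": float(rec["amount"]),       # cast; may raise ValueError
                "currency": rec["currency"].upper(),  # normalize casing
            })
        except (KeyError, ValueError):
            dead_letter.append(rec)  # keep bad rows for later inspection
    return clean, dead_letter

if __name__ == "__main__":
    ok, bad = extract_transform(RAW)
    print(len(ok), "clean,", len(bad), "dead-lettered")
```

In a real pipeline the dead-letter list would land in its own table or bucket so data quality issues are visible rather than silently dropped.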

Posted 2 months ago

Apply

5 - 9 years

25 - 35 Lacs

Nagpur

Work from Office


Dear Candidate, We are hiring for the Big Data Developer role at HCLTech, Nagpur. Please see the job description below and revert with your updated CV in case you find it suitable. Grade/Role/Salary: As per relevant experience and last drawn CTC; to be discussed during the interview. Position: Big Data Developer Experience: 5-9 years Skills: Spark, Scala, Hive/SQL, Streaming Location: Nagpur, MH Responsibilities: Preferred skillset: Spark, Scala, Linux-based Hadoop ecosystem (HDFS, Impala, Hive, HBase, etc.), SQL, Jenkins, and Unix commands. Experience in big data technologies; real-time data processing platform experience (Spark Streaming with Kafka) in Cloudera would be an advantage. Consistently demonstrates clear and concise written and verbal communication. Ability to multi-task and provide weekend support for production releases. Hands-on experience with Unix commands. Strong foundation in computer science fundamentals: data structures, algorithms, and coding. Experienced in performance optimization techniques.
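The Spark Streaming work mentioned above centers on windowed aggregation over Kafka micro-batches. A pure-Python sketch of a tumbling-window count (window size, timestamps, and event shape are all assumed for illustration, not a Spark API):

```python
# Illustrative tumbling-window aggregation: count events per key within
# fixed, non-overlapping time windows, as a streaming job would per batch.
from collections import defaultdict

def tumbling_counts(events, window_sec):
    """Count (window_start, key) occurrences for a list of (ts, key) events."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_sec) * window_sec  # bucket the timestamp
        counts[(window_start, key)] += 1
    return dict(counts)

if __name__ == "__main__":
    events = [(0, "a"), (3, "a"), (5, "b"), (12, "a")]
    # Two 10-second windows: [0, 10) and [10, 20)
    print(tumbling_counts(events, 10))
```

A real Spark Streaming job adds state management, watermarking for late data, and checkpointing on top of this core bucketing idea.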

Posted 2 months ago

Apply

6 - 11 years

12 - 22 Lacs

Bengaluru

Work from Office


Job Responsibilities: The Tech Lead - Platform Delivery will act as the main technical interface with assigned customer and OTT platform accounts to ensure that channel deliveries happen on time and with quality. Be a partner to the Project Manager/Service Manager and provide technical leadership during customer/platform calls. Ensure that Dynamic Ad Insertion configurations are proper and parameters are passing through. Work with platforms and internal teams on any change requests regarding monetization. Understand customer/platform requirements and advise how to use Amagi's products and services to fulfil those requirements. Work closely with Implementation Engineers to deploy and configure software solutions and ensure the configurations and deployment are proper. Build trusting relationships with customers and partners. Adopt quality as a mindset; ensure everything being sent or delivered to the customer is tested and works as expected. Obtain an understanding of Amagi products and services and the delivery specifications of various OTT platforms, including stream delivery, monetization, and Electronic Programme Guide (EPG) specifications. Guide, train, and mentor Implementation Engineers. Work with the Engineering team to understand new features and deploy new customer requirements which are otherwise not supported by the product. Be available to work in the US shift. Job Requirements: Our ideal candidate has at least 6 years of experience in the media technologies sector, with at least 2 of them in external customer-facing positions. We are growing fast and we believe that people need to be together to collaborate and accelerate that growth, so this position does not allow telecommuting. The candidate has to be willing to work from the Amagi headquarters. As most of our customers are in the US and Europe, the candidate is expected to work night hours on a rotational basis.
Required Skills: Expertise in stream delivery over internet technologies: SRT, Zixi, HLS, DASH. Expertise in AWS media services: MediaLive, MediaStore, MediaTailor, etc. Expertise in delivering media through various CDNs: Akamai, CloudFront, Limelight, etc. Expertise in server-side ad insertion technologies and knowledge of how ad querying using VAST works. Knowledge of how EPG works. Knowledge of various video codecs and standards. Fundamental knowledge of how playout systems work. Good interpersonal skills. Good communication skills. Certifications: AWS Elemental. Domain Knowledge: Domain knowledge in media technologies like OTT, server-side ad insertion, video on demand, and digital broadcast is required.

Posted 2 months ago

Apply

3 - 8 years

12 - 22 Lacs

Bengaluru

Work from Office


Location: Bangalore Experience: 3 to 9 years Notice Period: 0-30 days Must-Have Experience: Coding experience in C, C++ (C++11 is an added advantage). Technology: good knowledge of audio, video, image, camera, and streaming concepts. Should have knowledge of container formats, parsers, decoders, and encoders. Good knowledge of multimedia frameworks (Android/GStreamer). Exposure to the HAL layer. Good to have: peripheral bring-up; knowledge of the camera pipeline; experience with computer graphics and GPU programming (e.g., HLSL, GLSL, or similar); DRM concepts. Operating systems: Firefox OS, KaiOS, Android, etc. Must-know concepts: should have worked in multi-threaded environments; understanding of IPCs, HIDL, AIDL; basic understanding of design concepts; understanding of web development best practices (e.g., performance, memory); common repository skills with git or Bitbucket (rebase, merge, branch, etc.).

Posted 2 months ago

Apply

5 - 8 years

20 - 35 Lacs

Bengaluru

Work from Office


Senior Data Engineer Job Description: We are seeking a Senior Data Engineer with a minimum of 5 years of experience in data architecture, data engineering, or related fields. The ideal candidate will have a proven track record of leading data projects, mentoring junior team members, and innovating to solve complex business problems. This role requires hands-on expertise in cloud platforms (Azure or GCP), programming, and designing data pipelines, as well as experience with data governance and CI/CD processes. Required Skills and Qualifications: Minimum of 5 years of experience in data architecture, data engineering, or related fields. Strong proficiency in Python and/or Java programming languages. Hands-on experience with GCP or Azure cloud platforms. Knowledge of data governance tools such as Azure Purview and Collibra. Expertise in designing and implementing batch, streaming, and event-driven data pipelines. In-depth knowledge of data lake and data warehouse architectures, particularly Delta Lake and BigQuery. Experience working in Agile or Scrum environments, with a focus on iterative delivery and adapting to evolving requirements. Excellent problem-solving, analytical, and critical thinking skills. Strong attention to detail with an emphasis on data accuracy and process integrity. Soft Skills: Strong communication skills, with the ability to explain technical concepts to non-technical stakeholders. Collaborative mindset and ability to work effectively within cross-functional teams. Ability to mentor and guide junior team members. If you are passionate about data engineering and ready to make a significant impact, we encourage you to apply for this exciting opportunity! Roles and Responsibilities Key Responsibilities: Lead the design, development, and optimization of data pipelines using batch, streaming, and event-driven architectures (e.g., Azure Event Hubs, GCP Pub/Sub, Apache Kafka). 
Architect and implement data lake and data warehouse solutions, including Delta Lake and BigQuery. Implement and manage tools for data cataloging, metadata management, and access control (e.g., Azure Purview, Collibra). Develop, deploy, and manage CI/CD pipelines using tools like Azure DevOps or Jenkins. Strategize and deliver data-driven solutions to solve complex business challenges. Mentor junior team members and provide leadership on data engineering best practices. Collaborate with cross-functional teams and communicate effectively with stakeholders. Ensure compliance with data privacy regulations (e.g., GDPR, CCPA) and related industry standards.
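The event-driven pipelines listed above share one core pattern: events land on a queue, a handler transforms or enriches each one, and results flow downstream. A minimal in-memory sketch of that pattern (all names hypothetical, standing in for an Event Hubs, Pub/Sub, or Kafka consumer loop):

```python
# Illustrative event-driven pipeline stage: drain a queue through a handler.
import queue

def run_stage(events, handler):
    """Feed events into an in-memory queue, process each, collect results."""
    inbox = queue.Queue()
    for e in events:
        inbox.put(e)
    out = []
    while not inbox.empty():
        out.append(handler(inbox.get()))  # enrich/transform one event at a time
    return out

if __name__ == "__main__":
    enrich = lambda e: {**e, "source": "orders"}  # attach pipeline metadata
    print(run_stage([{"id": 1}, {"id": 2}], enrich))
```

Production systems replace the in-memory queue with a durable broker and add acknowledgements, retries, and dead-lettering, but the stage shape is the same.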

Posted 2 months ago

Apply

8 - 12 years

30 - 45 Lacs

Bengaluru

Work from Office


As a Data Platform Architect , you will play a pivotal role in the design, implementation, and optimization of scalable data architecture and systems. You will be responsible for creating data-driven solutions that enable effective data storage, integration, processing, and analytics. You will collaborate closely with data engineers, data scientists, and business stakeholders to build a robust data infrastructure that supports the organization’s data strategy. Qualifications: Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field. Proven experience (typically 7+ years) in designing and implementing complex data architectures and platforms. Expertise in cloud platforms (AWS, Azure, GCP) and their data services (e.g., Amazon Redshift, Azure Synapse, BigQuery). Strong knowledge of data integration, ETL/ELT tools, and data pipelines. Hands-on experience with big data technologies (e.g., Hadoop, Spark, Kafka). Proficiency in database technologies (SQL, NoSQL, columnar, graph, etc.). Familiarity with data modeling techniques and practices. Strong understanding of data security, governance, and privacy practices. Experience with data visualization and reporting tools (e.g., Power BI, Tableau). Excellent problem-solving, analytical, and troubleshooting skills. Strong communication and collaboration skills with technical and non-technical teams. Preferred Skills: Certifications in cloud platforms (e.g., AWS Certified Solutions Architect, Microsoft Certified: Azure Data Engineer Associate). Experience with containerized environments (e.g., Docker, Kubernetes). Familiarity with DevOps practices and CI/CD pipelines. Knowledge of machine learning and artificial intelligence applications in data platforms. Experience with Agile methodologies and project management tools. 
Roles and Responsibilities Key Responsibilities: Architect Data Solutions: Design and implement high-performance, scalable data platforms that support large-scale data ingestion, transformation, and analytics. Data Modeling: Develop and maintain data models and databases, ensuring they align with business requirements and are optimized for performance and usability. Cloud Infrastructure: Lead the adoption and implementation of cloud-based data platforms (e.g., AWS, Azure, Google Cloud) for data storage, processing, and analysis. Integration: Oversee the integration of disparate data sources and ensure seamless data flow across different systems (ETL/ELT processes). Performance Optimization: Continuously monitor, assess, and optimize the data platform architecture to ensure high performance, reliability, and scalability. Data Security & Governance: Implement best practices for data governance, privacy, and security to ensure compliance with internal policies and regulations. Collaboration: Work with cross-functional teams to understand data requirements, provide technical guidance, and deliver data solutions that meet business needs. Innovation & Research: Stay current with emerging data technologies and trends to drive innovation and improvements to the data platform. Documentation & Reporting: Maintain comprehensive documentation of data architecture, processes, and best practices for stakeholders and team members.

Posted 2 months ago

Apply

5 - 8 years

0 - 1 Lacs

Noida

Remote


Company Description CrowdApps Technologies is a Web & Mobile Apps development company based in Gurugram. We provide customized software solutions to businesses, focusing on delivering high-quality, secure, scalable, and innovative digital products. Our services include Web Development, Mobile Application Development, Digital Marketing, and Cyber Security. We have also excelled in developing Healthcare/Medical applications. At CrowdApps, we strive for perfection and aim to build life-long professional relationships with our clients. Role Description This is a full-time Back End Developer role at CrowdApps Technologies. The Back End Developer will be responsible for day-to-day tasks related to back-end web development and software development. The role is located in Gurugram, with flexibility for some remote work. As a Backend Developer (Node.js) you will be responsible for developing, testing, and maintaining software applications. You will work closely with other developers and product owners to design and develop high-quality software solutions that meet the needs of our customers. Must-Have Skills (apply only if you have expertise in these tools): ORM (Drizzle, Prisma) and an understanding of SQL and queries; expert knowledge of Postgres; TypeScript; immutability vs. mutability; AWS Lambda; SNS (bonus); basic understanding of networks (private + public); security groups; AWS S3 + CloudFront; Express.js (in TypeScript); Passport auth - RSA; push notifications - Firebase. Key Responsibilities: 5+ years' experience as a Senior Node.js Developer with a strong portfolio of successful enterprise-level projects. Working knowledge of Node.js on Express.js frameworks with strong proficiency in JavaScript and TypeScript. Must have worked on AWS. Design and develop APIs and scalable microservices for the cloud platform. Understanding of modularization and knowledge of WebSockets, webhooks, and API management. Knowledge of working with NoSQL/SQL/MongoDB.
Solid understanding of web technologies: JSON, HTTP, RESTful APIs. Understanding of the nature of asynchronous programming and its quirks and workarounds. Knowledge of user authentication and authorization between multiple systems, servers, and environments. Understanding of fundamental design principles behind a scalable application. Familiarity with unit testing and debugging. Proficient understanding of versioning tools and repositories like Git & GitLab. Understanding of CI/CD pipelines and implementation. A team player with excellent communication skills. Qualifications Bachelor's degree in Computer Science, Software Engineering, or a related field. Perks and benefits 1. Only 5 days a week working 2. Salary on time 3. 10+ festival holidays 4. 15+ additional leaves 5. Yearly office trips 6. Excellent company culture 7. Mediclaim benefits

Posted 2 months ago

Apply

6 - 10 years

15 - 20 Lacs

Pune, Delhi NCR, Trivandrum

Work from Office


6+ years of experience in the design, development, implementation, and maintenance of scalable and efficient data pipelines using the Azure Databricks platform. Experience in Databricks Unity Catalog, SQL, PySpark, Python, ETL/ELT, and streaming technologies. Required Candidate profile: Performance tuning and optimization of Spark jobs on Azure Databricks; troubleshoot and resolve issues related to data pipelines and clusters; use Databricks to assemble large, complex data sets.

Posted 2 months ago

Apply

5 - 10 years

35 - 50 Lacs

Chennai

Hybrid


We are looking for someone with: Strong and demonstrable problem-solving ability. Comfortable with self-management and on-the-job learning. Ability to share knowledge across team(s). Demonstrable initiative and logical thinking. Passion for emerging technologies and self-development. Strong computer science fundamentals. Collaborative work ethic. Strong problem-solving and analytical skills. Excellent communication skills. Knowledge of applying object-oriented and functional programming styles to real-world problems. Ideally (but not restrictive) you should have: Hands-on experience (5+ years) using Scala and/or Java. Knowledge of continuous integration and continuous delivery. Knowledge of microservice architecture. Working experience with TDD & BDD. Experience building REST APIs. Experience working with Docker. General knowledge of agile software development concepts and processes. Proficient understanding of code versioning tools, such as Git. Working experience with Jira and Confluence. Nice to haves: Special interest in functional programming. Knowledge of the Reactive Manifesto. Knowledge of streaming data. Experience with Akka, Play Framework, or Spring. Experience working with Kafka. Knowledge of NoSQL. Cloud-based development with AWS, Microsoft Azure, Google Cloud, etc. Commercial exposure to the ELK stack.

Posted 2 months ago

Apply