
523 Serialization Jobs

JobPe aggregates listings for easy access, but you apply on the original job portal directly.

1.0 years

0 Lacs

India

On-site

Job description:

Responsibilities and Duties:
- Understand and analyze technical requirements.
- Assist in developing AI models, algorithms, and solutions.
- Collaborate with senior developers and data scientists on projects.
- Contribute to end-to-end software development processes.
- Participate in code reviews and testing to ensure quality and functionality.
- Help with deployment and integration of Python-based applications and AI models.

Key Skills:
- Strong knowledge of Python, preferably with frameworks like Flask or Django.
- Familiarity with AI concepts such as Machine Learning, Deep Learning, and Natural Language Processing (NLP).
- Knowledge of AI libraries such as TensorFlow, Keras, PyTorch, or Scikit-Learn.
- Experience with SQL or NoSQL databases (e.g., MySQL, MongoDB).
- Basic understanding of RESTful APIs and data serialization formats like JSON and XML (a minimal Python sketch follows the listing).
- Familiarity with version control systems like Git.
- Understanding of OOP (Object-Oriented Programming) concepts.
- Ability to work with cloud platforms (AWS, Google Cloud, etc.) is a plus.

Required Experience and Qualifications:
- Strong interest in Python development and AI/ML technologies.
- Basic knowledge of data structures and algorithms.
- Familiarity with web technologies (HTML5, CSS, JavaScript) is a plus.
- Excellent problem-solving skills and attention to detail.
- Ability to work in a fast-paced environment and meet deadlines.
- Strong communication skills and ability to work in a team.
- Ability to research and self-learn new AI-related tools and technologies.

Preferred Qualifications:
- Experience with data visualization tools like Matplotlib, Seaborn, or Tableau.
- Knowledge of AI ethics and the responsible use of AI is a bonus.
- Previous internships or academic projects in AI, machine learning, or data science.

Job Type: Full-time Internship (Fresher). Contract length: 6 months. Schedule: Day shift. Location Type: In-person. Work Location: Govindpuri, Gwalior, Madhya Pradesh (reliable commute or planned relocation required). Education: Bachelor's (Preferred). Experience: Python: 1 year (Preferred); AI/ML: 1 year (Preferred). Language: English (Preferred). Speak with the employer: +91 9425151787
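The posting's mention of data serialization formats like JSON and XML can be illustrated with the Python standard library alone. A minimal sketch, assuming a hypothetical model-metadata record; the field names are illustrative, not from the posting:

```python
import json
import xml.etree.ElementTree as ET

# A hypothetical record describing a trained model (illustrative fields only).
record = {"model": "sentiment-clf", "version": 3, "accuracy": 0.91}

# JSON round-trip: serialize to text, then parse back to a dict.
payload = json.dumps(record)
assert json.loads(payload) == record

# The same record as XML, built with the standard library's ElementTree.
root = ET.Element("model", name=record["model"], version=str(record["version"]))
ET.SubElement(root, "accuracy").text = str(record["accuracy"])

print(payload)
print(ET.tostring(root, encoding="unicode"))
```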

Posted 11 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

Your Role and Responsibilities
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. You'll have the opportunity to work with the latest technologies, ensuring the applications delivered are high performing, highly available, responsive, and maintainable. Your primary responsibilities include:
- Analytical problem-solving and solution enhancement: analyze, validate, and propose improvements for existing failures, with the support of the architect and technical leader.
- Comprehensive engagement across process phases: involvement in every step of the process, from design and development through testing, release changes, and troubleshooting where necessary, providing great customer service.
- Strategic stakeholder engagement and innovative coding solutions: drive key discussions with your stakeholders and analyze the current landscape for opportunities to operate and code creative solutions.

Preferred Education: Master's Degree

Required Technical and Professional Expertise
- BE/B.Tech in any stream, M.Sc. (Computer Science/IT), or M.C.A., with a minimum of 3-5 years of software development experience.
- Working knowledge of Java 8 or higher.
- Proven working experience with both Spring and Hibernate (Spring 4 or above).
- Strong grasp of OOP concepts. Should be proficient in exception handling (checked and unchecked), collections (List, Map, Set), abstract classes and interfaces, constructors, file I/O and serialization, access specifiers, generics, Java keywords (static, final, volatile, synchronized, transient), the JVM and memory management, multithreading and synchronization, and JSP (Java Server Pages)/Servlets (a brief sketch of the serialization-with-transient idea follows the listing).

Preferred Technical and Professional Experience
- Conversant with build tools like Ant and Maven, and with Git.
- Experience with design patterns and their appropriate usage.
- Proven work experience with Spring Core, Spring ORM, Spring DAO, Spring AOP, and Hibernate would be an added advantage.
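The Java skills above pair serialization with the transient keyword, which excludes a field from an object's serialized form. The posting is Java-centric, but to keep all examples on this page in one language, here is the rough Python analogy: in a pickle round-trip, __getstate__ plays the role of transient. The class and fields are hypothetical:

```python
import pickle

class Session:
    """Hypothetical session object; the cache field should not be persisted."""
    def __init__(self, user, cache=None):
        self.user = user
        self.cache = cache or {}   # analogous to a Java 'transient' field

    def __getstate__(self):
        # Drop the non-persistent field from the serialized form.
        state = self.__dict__.copy()
        del state["cache"]
        return state

    def __setstate__(self, state):
        # Restore persisted fields and re-initialize the excluded one.
        self.__dict__.update(state)
        self.cache = {}

restored = pickle.loads(pickle.dumps(Session("alice", cache={"hot": 1})))
print(restored.user, restored.cache)  # -> alice {}
```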

Posted 12 hours ago

Apply

0.0 - 1.0 years

0 Lacs

Govindpuri, Gwalior, Madhya Pradesh

On-site

Job description: identical to the Python/AI internship listing above (same employer, schedule, and contact number; see that listing for full details).

Posted 16 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Senior Developer at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions. You'll be working on complex technical problems that require detailed analysis, in conjunction with fellow engineers, business analysts, and business stakeholders.

To be successful as a Senior Developer you should have experience with:
- Designing and developing distributed data processing pipelines using Scala and Spark (a minimal pipeline sketch appears after this listing).
- Optimizing Spark jobs for performance and scalability.
- Integrating with data sources like Kafka, HDFS, S3, Hive, and relational databases.
- Collaborating with data engineers, analysts, and business stakeholders to understand data requirements.
- Implementing data transformation, cleansing, and enrichment logic.
- Ensuring data quality, lineage, and governance.
- Participating in code reviews, unit testing, and deployment processes.

Some other highly valued skills include:
- Strong proficiency in Scala, especially functional programming paradigms.
- Hands-on experience with Apache Spark (RDDs, DataFrames, Datasets).
- Expertise with Spark batch processing.
- Knowledge of Big Data ecosystems: Hadoop, Hive, Impala, Kafka.
- Experience with data serialization formats like Parquet and Avro.
- Understanding of performance tuning in Spark (e.g., partitioning, caching, shuffling).
- Proficiency in SQL and data modeling.
- Familiarity with CI/CD tools, version control (Git), and containerization (Docker/Kubernetes).
- Familiarity with the AWS toolset is an added advantage.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role:
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations:
- Perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
- Requires in-depth technical knowledge and experience in the assigned area of expertise, with a thorough understanding of the underlying principles and concepts within that area.
- Lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. For an individual contributor, develop technical expertise in the work area, acting as an advisor where appropriate.
- Will have an impact on the work of related teams within the area; partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities; escalate breaches of policies/procedures appropriately; take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise; take ownership for managing risk and strengthening controls in relation to the work you own or contribute to; deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services, and processes within the function; demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function.
- Make evaluative judgements based on the analysis of factual information, paying attention to detail; resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information; act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
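For the Scala-and-Spark pipeline work described above, here is a minimal PySpark sketch (Python is the single example language used on this page; the posting itself targets Scala). The paths and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("flows-batch").getOrCreate()

# Read raw events (hypothetical path and schema).
raw = spark.read.parquet("s3a://example-bucket/raw/events/")

# Transform: basic cleansing and enrichment, as the listing describes.
curated = (
    raw.filter(F.col("amount") > 0)                      # drop invalid rows
       .withColumn("event_date", F.to_date("event_ts"))  # derive a partition key
       .dropDuplicates(["event_id"])                     # idempotent re-runs
)

# Write partitioned Parquet so downstream reads can prune by date.
(curated.write
        .partitionBy("event_date")
        .mode("overwrite")
        .parquet("s3a://example-bucket/curated/events/"))
```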

Posted 23 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Senior Software Engineer at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions. You'll be working on complex technical problems that require detailed analysis, in conjunction with fellow engineers, business analysts, and business stakeholders.

To be successful as a Senior Software Engineer you should have experience with:
- Strong proficiency in Scala, especially functional programming paradigms.
- Hands-on experience with Apache Spark (RDDs, DataFrames, Datasets).
- Expertise with Spark batch processing.
- Knowledge of Big Data ecosystems: Hadoop, Hive, Impala, Kafka.
- Experience with data serialization formats like Parquet and Avro.
- Understanding of performance tuning in Spark, e.g., partitioning, caching, shuffling (a short sketch of these knobs follows the listing).
- Proficiency in SQL and data modeling.
- Familiarity with CI/CD tools, version control (Git), and containerization (Docker/Kubernetes).
- Familiarity with the AWS toolset is an added advantage.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune. The purpose of the role, accountabilities, analyst expectations, and the Barclays Values and Mindset statements are identical to the Senior Developer listing above.
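The tuning topics named above (partitioning, caching, shuffling) can be sketched briefly in PySpark; the numbers, path, and column names are illustrative, and good values depend on data volume and cluster size:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

# Shuffling: the shuffle partition count defaults to 200; right-size it per job.
spark.conf.set("spark.sql.shuffle.partitions", "64")  # illustrative value

df = spark.read.parquet("s3a://example-bucket/curated/events/")  # hypothetical path

# Caching: persist a DataFrame that several downstream actions will reuse.
frequent = df.filter(F.col("event_date") >= "2024-01-01").cache()
frequent.count()  # first action materializes the cache

# Partitioning: repartition by the join/group key to reduce skewed shuffles.
by_user = frequent.repartition(64, "user_id")
print(by_user.groupBy("user_id").count().limit(5).collect())
```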

Posted 23 hours ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About the Company
Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients' businesses through designing the products and services their customers truly value.

About the Role
We at Publicis Sapient are looking for a Senior Associate Level 1 (Core Java + Microservices) to join our team of bright thinkers and doers. You will translate our clients' business problems into innovative technology solutions by creating and owning the technical vision of the project and ensuring that the vision is achieved with a high level of quality. You are a high-performance engineer expected to work in a product squad and deliver solutions for a medium to large-scale client.

Responsibilities
- Responsible for programming and working with the design team and clients to create the needed artifacts.
- Understands the client's business domain and has been part of projects with a Digital Business Transformation (DBT) opportunity.
- Combine your technical expertise and problem-solving passion to work closely with clients, turning complex ideas into end-to-end solutions that transform our clients' business.
- Constantly innovate and evaluate emerging technologies and methods to provide scalable and elegant solutions that help clients achieve their business goals.
- Responsible for choosing the needed technology stack based on the functional and non-functional requirements, and on other factors like client drivers, environment, and feasibility.

Qualifications
- Experience range: 5-7 years.
- Experience in developing microservices in Spring Boot.
- Experience with the security, transaction, idempotency, log tracing, distributed caching, monitoring, and containerization requirements of microservices.
- Experience in developing high-cohesion, loosely-coupled microservices, with hands-on experience in microservices architecture.
- Excellent acumen in data structures, algorithms, problem-solving, and logical/analytical skills.
- Thorough understanding of OOPS concepts, design principles, and implementation of different types of design patterns.
- Sound understanding of concepts like exception handling, serialization/deserialization, and immutability (a short sketch combining the latter two follows the listing).
- Good fundamental knowledge of Enums, Collections, Annotations, Generics, Autoboxing, etc.
- Experience with multithreading, concurrent packages, and concurrent APIs.
- Basic understanding of Java Memory Management (JMM), including garbage collection concepts.
- Experience with RDBMS or NoSQL databases and writing SQL queries (joins, group by, aggregate functions, etc.).
- Hands-on experience with message brokers like Kafka or others.
- Hands-on experience in creating RESTful web services and consuming web services.
- Hands-on experience with Spring Cloud/Spring Boot.
- Hands-on experience with any of the logging frameworks (SLF4J/Logback/Log4j).
- Experience in writing JUnit test cases using Mockito/PowerMock frameworks.
- Practical experience with Maven/Gradle and knowledge of version control systems like Git/SVN.
- Hands-on experience in cloud deployment/development.

Great to Have
- Any cloud certification.
- Distributed computing and building scalable systems.
- UX areas like ReactJS, NodeJS, WebFlux, etc.

Education
Bachelor's/Master's Degree in Computer Engineering, Computer Science, or a related field.

Pay Range and Compensation Package
- 18 paid holidays throughout the year.
- Generous parental leave and new parent transition program.
- Flexible work arrangements.
- Employee Assistance Programs to help you in wellness and well-being.

Equal Opportunity Statement: Gender-Neutral Policy
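The qualifications pair serialization/deserialization with immutability; the posting is Java-flavored, but the combination is easy to sketch in Python (the single example language on this page). A minimal sketch with a hypothetical order record:

```python
import json
from dataclasses import dataclass, asdict

@dataclass(frozen=True)  # frozen=True makes instances immutable, akin to a Java record
class Order:
    order_id: str
    amount_cents: int

order = Order("ord-42", 1999)

# Serialize: dataclass -> dict -> JSON text.
wire = json.dumps(asdict(order))

# Deserialize: JSON text -> dict -> a new immutable instance.
restored = Order(**json.loads(wire))
assert restored == order

# Immutability: attempting to mutate raises FrozenInstanceError.
try:
    restored.amount_cents = 0
except Exception as err:
    print(type(err).__name__)  # -> FrozenInstanceError
```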

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

💼 Job Title: Kafka Developer
👨‍💻 Job Type: Full-time
📍 Location: Pune
💼 Work regime: Hybrid
🔥 Keywords: Kafka, Apache Kafka, Kafka Connect, Kafka Streams, Schema Registry

Position Overview:
We are looking for a Kafka Developer to design and implement real-time data ingestion pipelines using Apache Kafka. The role involves integrating with upstream flow record sources, transforming and validating data, and streaming it into a centralized data lake for analytics and operational intelligence.

Key Responsibilities:
- Develop Kafka producers to ingest flow records from upstream systems such as flow record exporters (e.g., IPFIX-compatible probes); a minimal producer sketch follows the listing.
- Build Kafka consumers to stream data into Spark Structured Streaming jobs and downstream data lakes.
- Define and manage Kafka topic schemas using Avro and Schema Registry for schema evolution.
- Implement message serialization, transformation, enrichment, and validation logic within the streaming pipeline.
- Ensure exactly-once processing, checkpointing, and fault tolerance in streaming jobs.
- Integrate with downstream systems such as HDFS or Parquet-based data lakes, ensuring compatibility with ingestion standards.
- Collaborate with Kafka administrators to align topic configurations, retention policies, and security protocols.
- Participate in code reviews, unit testing, and performance tuning to ensure high-quality deliverables.
- Document pipeline architecture, data flow logic, and operational procedures for handover and support.

Required Skills & Qualifications:
- Proven experience in developing Kafka producers and consumers for real-time data ingestion pipelines.
- Strong hands-on expertise in Apache Kafka, Kafka Connect, Kafka Streams, and Schema Registry.
- Proficiency in Apache Spark (Structured Streaming) for real-time data transformation and enrichment.
- Solid understanding of IPFIX, NetFlow, and network flow data formats; experience integrating with nProbe Cento is a plus.
- Experience with Avro, JSON, or Protobuf for message serialization and schema evolution.
- Familiarity with Cloudera Data Platform components such as HDFS, Hive, YARN, and Knox.
- Experience integrating Kafka pipelines with data lakes or warehouses using Parquet or Delta formats.
- Strong programming skills in Scala, Java, or Python for stream processing and data engineering tasks.
- Knowledge of Kafka security protocols including TLS/SSL, Kerberos, and access control via Apache Ranger.
- Experience with monitoring and logging tools such as Prometheus, Grafana, and Splunk.
- Understanding of CI/CD pipelines, Git-based workflows, and containerization (Docker/Kubernetes).

A little about us:
Innova Solutions is a diverse and award-winning global technology services partner. We provide our clients with strategic technology, talent, and business transformation solutions, enabling them to be leaders in their field. Founded in 1998 and headquartered in Atlanta (Duluth), Georgia, Innova employs over 50,000 professionals worldwide, with annual revenue approaching $3.0B. It delivers strategic technology and business transformation solutions globally, operates through global delivery centers across North America, Asia, and Europe, and provides services for data center migration and workload development for cloud service providers.

Awardee of prestigious recognitions including:
- Women's Choice Awards – Best Companies to Work for Women & Millennials, 2024
- Forbes – America's Best Temporary Staffing and Best Professional Recruiting Firms, 2023
- American Best in Business, Globee Awards – Healthcare Vulnerability Technology Solutions, 2023
- Global Health & Pharma – Best Full Service Workforce Lifecycle Management Enterprise, 2023
- Three SBU Leadership in Business Awards
- Stevie International Business Awards – Denials Remediation Healthcare Technology Solutions, 2023
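As a companion to the producer responsibilities above, here is a minimal Kafka producer sketch using the kafka-python client (one of several Python clients; the posting does not mandate a specific one). The broker address, topic name, and record fields are hypothetical:

```python
import json
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",          # hypothetical broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    acks="all",                                  # wait for full ISR acknowledgement
)

# A hypothetical flow record, as might arrive from an IPFIX-compatible probe.
flow = {"src_ip": "10.0.0.1", "dst_ip": "10.0.0.2", "bytes": 4096, "proto": "tcp"}

# Key by source IP so records for one host land in one partition (preserves order).
producer.send("flow-records", key=b"10.0.0.1", value=flow)
producer.flush()  # block until buffered records are delivered
```

In a production pipeline of the kind the listing describes, the JSON value_serializer would typically be replaced by Avro serialization backed by a Schema Registry.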

Posted 1 day ago

Apply

13.0 years

0 Lacs

Kochi, Kerala, India

Remote

Experience: 13+ years | Salary: Confidential (based on experience) | Shift: (GMT+05:30) Asia/Kolkata (IST) | Opportunity Type: Remote | Placement Type: Full-time permanent position
(Note: This is a requirement for one of Uplers' clients - Netskope)

Must-have skills: gRPC, Protocol Buffers, Avro, storage systems

About The Role
Please note, this team is hiring across all levels, and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data, and cloud computing. We are responsible for providing ultra-low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We're looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open-source data technologies.

What's In It For You
- You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics.
- Your contributions will have a major impact on our global customer base and across the industry through our market-leading products.
- You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills.

What You Will Be Doing
- Building the next-generation data pipeline for near real-time ingestion of security insights and intelligence data.
- Partnering with industry experts in security and big data, and with product and engineering teams, to conceptualize, design, and build innovative solutions for hard problems on behalf of our customers.
- Evaluating open-source technologies to find the best fit for our needs, and also contributing to some of them!
- Helping other teams architect their systems on top of the data platform and influencing their architecture.

Required Skills And Experience
- Expertise in the architecture and design of highly scalable, efficient, and fault-tolerant data pipelines for near real-time and real-time processing.
- Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions.
- Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations.
- Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments; a small Avro round-trip sketch follows the listing.
- Deep understanding of distributed object storage systems like S3 and GCS, including their architectures, consistency models, and scaling properties.
- Deep understanding of data formats like Parquet and Iceberg: optimized partitioning, sorting, compression, and read performance.
- Expert-level proficiency in Golang, Java, or similar languages, with a strong understanding of concurrency, distributed computing, and system-level optimizations.
- Cloud-native data infrastructure experience with AWS, GCP, etc. is a huge plus.
- Proven ability to influence technical direction and communicate with clarity.
- Willingness to work with a globally distributed team in different time zones.

Education
BSCS or equivalent required; MSCS or equivalent strongly preferred.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
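The must-have skills above center on serialization formats such as Avro. A minimal round-trip sketch using the fastavro library (chosen for illustration and to stay in this page's single example language; the role itself leans toward Golang or Java). The schema and record are hypothetical:

```python
import io
from fastavro import parse_schema, reader, writer  # pip install fastavro

# A hypothetical schema for a security-flow record.
schema = parse_schema({
    "type": "record",
    "name": "FlowEvent",
    "fields": [
        {"name": "src_ip", "type": "string"},
        {"name": "bytes", "type": "long"},
        {"name": "threat_score", "type": ["null", "double"], "default": None},
    ],
})

records = [{"src_ip": "10.0.0.1", "bytes": 4096, "threat_score": 0.87}]

# Serialize to the Avro object container format (the schema travels with the data,
# which is what enables the schema evolution the listing mentions).
buf = io.BytesIO()
writer(buf, schema, records)

# Deserialize: the reader recovers the schema from the container header.
buf.seek(0)
print(list(reader(buf)))
```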

Posted 1 day ago

Apply

13.0 years

0 Lacs

Greater Bhopal Area

Remote

Job description: identical to the Netskope Data Platform listing above (via Uplers); only the location differs.

Posted 1 day ago

Apply

13.0 years

0 Lacs

Visakhapatnam, Andhra Pradesh, India

Remote

Job description: identical to the Netskope Data Platform listing above (via Uplers); only the location differs.

Posted 1 day ago

Apply

13.0 years

0 Lacs

Indore, Madhya Pradesh, India

Remote

Job description: identical to the Netskope Data Platform listing above (via Uplers); only the location differs.

Posted 1 day ago

Apply

13.0 years

0 Lacs

Dehradun, Uttarakhand, India

Remote

Job description: identical to the Netskope Data Platform listing above (via Uplers); only the location differs.

Posted 1 day ago

Apply

13.0 years

0 Lacs

Mysore, Karnataka, India

Remote

Job description: identical to the Netskope Data Platform listing above (via Uplers); only the location differs.

Posted 1 day ago

Apply

13.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

Remote

Experience: 13.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position
(Note: This is a requirement for one of Uplers' clients - Netskope)

What do you need for this opportunity?

Must-have skills: gRPC, Protocol Buffers, Avro, storage systems

Netskope is looking for:

About The Role

Please note: this team is hiring across all levels, and candidates are individually assessed and leveled based on their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data, and cloud computing. We are responsible for providing ultra-low-latency access to global security insights and intelligence data for our customers, enabling them to act in near real time. We're looking for a seasoned engineer to help us build next-generation data pipelines that provide near-real-time ingestion of security insights and intelligence data using cloud and open-source data technologies.

What's In It For You

  • You will be part of a growing team of renowned industry experts in the exciting space of data and cloud analytics.
  • Your contributions will have a major impact on our global customer base and across the industry through our market-leading products.
  • You will solve complex, interesting challenges and improve the depth and breadth of your technical and business skills.

What You Will Be Doing

  • Building next-generation data pipelines for near-real-time ingestion of security insights and intelligence data.
  • Partnering with industry experts in security and big data, and with product and engineering teams, to conceptualize, design, and build innovative solutions to hard problems on behalf of our customers.
  • Evaluating open-source technologies to find the best fit for our needs, and contributing to some of them!
  • Helping other teams architect their systems on top of the data platform and influencing their architecture.

Required Skills And Experience

  • Expertise in the architecture and design of highly scalable, efficient, fault-tolerant data pipelines for near-real-time and real-time processing.
  • Strong diagnostic and problem-solving skills, with a demonstrated ability to distill complex problems into elegant solutions.
  • Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations.
  • Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments.
  • Deep understanding of distributed object storage systems such as S3 and GCS, including their architectures, consistency models, and scaling properties.
  • Deep understanding of data formats such as Parquet and Iceberg, including optimized partitioning, sorting, compression, and read performance.
  • Expert-level proficiency in Golang, Java, or similar languages, with a strong understanding of concurrency, distributed computing, and system-level optimizations.
  • Cloud-native data infrastructure experience with AWS, GCP, etc. is a big plus.
  • Proven ability to influence technical direction and communicate with clarity.
  • Willingness to work with a globally distributed team across time zones.

Education

BSCS or equivalent required; MSCS or equivalent strongly preferred.

How to apply for this opportunity?

Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers, and we will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
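
For readers unfamiliar with the formats this role names, here is a minimal, hypothetical sketch of Avro binary serialization in Java. It assumes the Apache Avro library (org.apache.avro) is on the classpath; the "Event" schema and its field names are invented for illustration, not taken from the posting.

```java
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class AvroDemo {
    public static void main(String[] args) throws Exception {
        // An invented schema for a security-event record.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
            + "{\"name\":\"id\",\"type\":\"long\"},"
            + "{\"name\":\"message\",\"type\":\"string\"}]}");

        GenericRecord event = new GenericData.Record(schema);
        event.put("id", 42L);
        event.put("message", "login from new device");

        // Encode the record to compact Avro binary bytes.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(event, encoder);
        encoder.flush();

        // Decode the bytes back into a record using the same schema.
        BinaryDecoder decoder =
            DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        GenericRecord decoded =
            new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
        System.out.println(decoded.get("id") + ": " + decoded.get("message"));
    }
}
```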

Posted 1 day ago

Apply


Exploring Serialization Jobs in India

Serialization is a key skill in the technology industry, especially in India, where demand for professionals with this expertise is growing. Serialization is the process of converting complex data structures or objects into a format that can be easily stored or transmitted, and reconstructed later. Job seekers skilled in serialization have a wide range of opportunities in India's job market.
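
As a concrete illustration of that definition, here is a minimal, hypothetical sketch using Java's built-in java.io serialization: an object is written to a byte stream and reconstructed from it. The User class and its fields are invented for the example.

```java
import java.io.*;

public class SerializationDemo {
    // A class must implement Serializable to be eligible for serialization.
    static class User implements Serializable {
        private static final long serialVersionUID = 1L;
        String name;
        int age;
        User(String name, int age) { this.name = name; this.age = age; }
    }

    public static void main(String[] args) throws Exception {
        User original = new User("Asha", 30);

        // Serialize: object -> bytes (could equally be a file or a socket).
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(original);
        }

        // Deserialize: bytes -> a reconstructed object.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            User copy = (User) in.readObject();
            System.out.println(copy.name + ", " + copy.age); // Asha, 30
        }
    }
}
```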

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their vibrant tech scenes and have a high demand for professionals with serialization skills.

Average Salary Range

Salaries for serialization professionals in India vary with experience. Entry-level professionals can expect to earn between INR 4 and 6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.

Career Path

A typical career path in serialization runs from Junior Developer to Senior Developer to Tech Lead, eventually progressing to roles such as Architect or Manager.

Related Skills

In addition to serialization itself, professionals in this field are typically expected to know data structures and algorithms, be comfortable in programming languages such as Java or Python, and have experience working with databases.

Interview Questions

  • What is serialization and why is it important? (basic)
  • Explain the difference between serialization and deserialization. (basic)
  • How does serialization work in Java? (medium)
  • What are the different types of serialization in .NET? (medium)
  • Can you explain the serialization process in Python? (medium)
  • What are the advantages and disadvantages of serialization? (medium)
  • How can you customize the serialization process in C#? (advanced)
  • Explain the concept of version-tolerant serialization. (advanced)
  • What is JSON serialization and how is it different from XML serialization? (medium)
  • How does serialization help in data transfer between applications? (basic)
  • What are some best practices for serialization in distributed systems? (advanced)
  • Can you explain the concept of binary serialization? (medium)
  • How do you handle circular references during serialization? (medium)
  • What is the role of serialization in microservices architecture? (medium)
  • How can you optimize serialization performance in a high-traffic system? (advanced)
  • Explain the serialization protocols used in RESTful services. (advanced)
  • How do you handle backward compatibility in serialization? (advanced)
  • What are some common pitfalls to avoid in serialization? (medium)
  • How does serialization differ in a multi-threaded environment? (advanced)
  • Can you explain the concept of transient fields in serialization? (basic; a sketch after this list illustrates this)
  • How do you secure data during serialization and deserialization? (advanced)
  • What are the challenges of serialization in cloud computing? (advanced)
  • How do you handle null values during serialization? (basic)
  • Explain the role of serialization frameworks in modern software development. (medium)
  • How do you ensure data integrity during the serialization process? (advanced)
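
As a worked answer to a few of these questions, here is a minimal, hypothetical Java sketch showing a transient field (skipped by default serialization) together with the custom writeObject/readObject hooks that let a class control its own stream format. The Session class, its fields, and the placeholder value are all invented for the example.

```java
import java.io.*;

// 'password' is transient, so default serialization skips it;
// the private writeObject/readObject hooks customize the stream.
public class Session implements Serializable {
    private static final long serialVersionUID = 2L; // aids version tolerance

    private String user;
    private transient String password; // never written by default mechanism

    public Session(String user, String password) {
        this.user = user;
        this.password = password;
    }

    // Invoked by ObjectOutputStream in place of the default mechanism.
    private void writeObject(ObjectOutputStream out) throws IOException {
        out.defaultWriteObject();      // writes 'user', skips 'password'
        out.writeObject("<redacted>"); // write a safe placeholder instead
    }

    // Invoked by ObjectInputStream when the object is read back.
    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();                    // restores 'user'
        this.password = (String) in.readObject(); // restores the placeholder
    }

    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Session("asha", "s3cret"));
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            Session s = (Session) in.readObject();
            System.out.println(s.user + " / " + s.password); // asha / <redacted>
        }
    }
}
```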

Closing Remark

In conclusion, serialization is a valuable skill to have in the technology industry, and job seekers in India can find numerous opportunities in this field. By preparing thoroughly and showcasing your expertise in serialization, you can confidently apply for roles and advance your career in this dynamic industry. Good luck!
