13.0 years
0 Lacs
Visakhapatnam, Andhra Pradesh, India
Remote
Experience: 13.00+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position
(Note: This is a requirement for one of Uplers' clients, Netskope.)

Must-have skills: gRPC, Protocol Buffers, Avro, storage systems

About the Role
Please note: this team is hiring across all levels, and candidates are individually assessed and appropriately leveled based on their skills and experience.
The Data Platform team is uniquely positioned at the intersection of security, big data, and cloud computing. We are responsible for providing ultra-low-latency access to global security insights and intelligence data for our customers, enabling them to act in near real time. We are looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open-source data technologies.

What's in It for You
- You will be part of a growing team of renowned industry experts in the exciting space of data and cloud analytics.
- Your contributions will have a major impact on our global customer base, and across the industry, through our market-leading products.
- You will solve complex, interesting challenges and improve the depth and breadth of your technical and business skills.

What You Will Be Doing
- Building the next-generation data pipeline for near real-time ingestion of security insights and intelligence data.
- Partnering with industry experts in security and big data, and with product and engineering teams, to conceptualize, design, and build innovative solutions to hard problems on behalf of our customers.
- Evaluating open-source technologies to find the best fit for our needs, and contributing to some of them!
- Helping other teams architect their systems on top of the data platform and influencing their architecture.
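The near real-time ingestion pattern described above can be sketched as a small micro-batching loop. This is an illustrative example only; the function and field names are assumptions for this sketch, not Netskope's actual pipeline:

```python
import time

def micro_batch(events, max_batch=3, max_wait_s=0.5):
    """Group an event stream into small batches for near real-time ingestion.

    A batch is flushed when it reaches max_batch events or when max_wait_s
    elapses, whichever comes first. Illustrative sketch only.
    """
    batches, buf = [], []
    deadline = time.monotonic() + max_wait_s
    for ev in events:
        buf.append(ev)
        if len(buf) >= max_batch or time.monotonic() >= deadline:
            batches.append(buf)
            buf = []
            deadline = time.monotonic() + max_wait_s
    if buf:
        batches.append(buf)  # flush any trailing partial batch
    return batches

events = [{"id": i, "kind": "security_insight"} for i in range(7)]
print([len(b) for b in micro_batch(events)])  # -> [3, 3, 1]
```

Real pipelines bound both batch size and wall-clock delay this way so that latency stays low even when event volume is bursty or sparse.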
Required Skills and Experience
- Expertise in the architecture and design of highly scalable, efficient, fault-tolerant data pipelines for near real-time and real-time processing.
- Strong diagnostic and problem-solving skills, with a demonstrated ability to distill complex problems into elegant solutions.
- Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations.
- Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments.
- Deep understanding of distributed object storage systems such as S3 and GCS, including their architectures, consistency models, and scaling properties.
- Deep understanding of data formats such as Parquet and Iceberg, including optimized partitioning, sorting, compression, and read performance.
- Expert-level proficiency in Go, Java, or similar languages, with a strong understanding of concurrency, distributed computing, and system-level optimizations.
- Cloud-native data infrastructure experience with AWS, GCP, etc. is a huge plus.
- Proven ability to influence technical direction and communicate with clarity.
- Willingness to work with a globally distributed team across different time zones.

Education: BSCS or equivalent required; MSCS or equivalent strongly preferred.

How to Apply
Step 1: Click Apply, then register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talent find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you through any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal beyond this one; depending on the assessments you clear, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
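As background on the serialization and wire-protocol skills called out in this listing (gRPC, Protocol Buffers, HTTP/2), here is a minimal Python sketch of length-prefixed message framing, the general idea behind binary wire formats such as gRPC's message framing. This is not a real protocol implementation, and the names are illustrative:

```python
import struct

def frame(payload: bytes) -> bytes:
    """Prefix a payload with its 4-byte big-endian length.

    A simplified stand-in for length-prefixed framing as used by binary
    wire formats; not any real protocol.
    """
    return struct.pack(">I", len(payload)) + payload

def deframe(stream: bytes):
    """Split a byte stream of concatenated frames back into payloads."""
    out, i = [], 0
    while i < len(stream):
        (n,) = struct.unpack_from(">I", stream, i)
        out.append(stream[i + 4 : i + 4 + n])
        i += 4 + n
    return out

wire = frame(b"hello") + frame(b"world!")
print(deframe(wire))  # -> [b'hello', b'world!']
```

Length prefixes let a receiver recover message boundaries from a raw byte stream, which is why streaming RPC systems frame each serialized message this way before writing it to the transport.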
Posted 1 day ago
13.0 years
0 Lacs
Indore, Madhya Pradesh, India
Remote
Posted 1 day ago
13.0 years
0 Lacs
Dehradun, Uttarakhand, India
Remote
Posted 1 day ago
13.0 years
0 Lacs
Mysore, Karnataka, India
Remote
Posted 1 day ago
13.0 years
0 Lacs
Thiruvananthapuram, Kerala, India
Remote
Posted 1 day ago
13.0 years
0 Lacs
Vijayawada, Andhra Pradesh, India
Remote
Posted 1 day ago
13.0 years
0 Lacs
Patna, Bihar, India
Remote
Posted 1 day ago
13.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience
- Expertise in architecting and designing highly scalable, efficient, fault-tolerant data pipelines for near real-time and real-time processing.
- Strong diagnostic and problem-solving skills, with a demonstrated ability to simplify complex problems into elegant solutions.
- Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations.
- Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments.
- Deep understanding of distributed object storage systems such as S3 and GCS, including their architectures, consistency models, and scaling properties.
- Deep understanding of data formats such as Parquet and Iceberg, including optimized partitioning, sorting, compression, and read performance.
- Expert-level proficiency in Golang, Java, or similar languages, with a strong understanding of concurrency, distributed computing, and system-level optimizations.
- Cloud-native data infrastructure experience with AWS, GCP, etc. is a huge plus.
- Proven ability to influence technical direction and communicate with clarity.
- Willingness to work with a globally distributed team across different time zones.

Education
BSCS or equivalent required; MSCS or equivalent strongly preferred.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Ghaziabad, Uttar Pradesh, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Agra, Uttar Pradesh, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Faridabad, Haryana, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Vellore, Tamil Nadu, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Madurai, Tamil Nadu, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience
- Expertise in the architecture and design of highly scalable, efficient, and fault-tolerant data pipelines for near-real-time and real-time processing.
- Strong diagnostic and problem-solving skills, with a demonstrated ability to simplify complex problems into elegant solutions.
- Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations.
- Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments.
- Deep understanding of distributed object storage systems such as S3 and GCS, including their architectures, consistency models, and scaling properties.
- Deep understanding of data formats such as Parquet and Iceberg, including optimized partitioning, sorting, compression, and read performance.
- Expert-level proficiency in Go (Golang), Java, or similar languages, with a strong understanding of concurrency, distributed computing, and system-level optimizations.
- Cloud-native data infrastructure experience with AWS, GCP, etc. is a big plus.
- Proven ability to influence technical direction and communicate with clarity.
- Willingness to work with a globally distributed team across time zones.

Education
BSCS or equivalent required; MSCS or equivalent strongly preferred.

How to apply for this opportunity
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for the interview.

About Uplers
Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
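The gRPC streaming experience this role asks for usually centers on server-side streaming RPCs over HTTP/2. As a hedged sketch (the service, package, and message names below are purely illustrative and not from Netskope's systems), an IDL for near-real-time delivery of intelligence events might look like:

```protobuf
syntax = "proto3";

package intel.v1;

// Illustrative event record; the fields are examples only.
message IntelEvent {
  string id = 1;
  int64 observed_at_unix_ms = 2;
  string category = 3;
  bytes payload = 4; // e.g., an Avro- or Protobuf-encoded detail blob
}

message SubscribeRequest {
  // A resume token lets a client continue from its last acknowledged event.
  string resume_token = 1;
}

// Server-side streaming: one request opens a long-lived stream of events,
// which is a common way to expose near-real-time feeds over gRPC.
service IntelFeed {
  rpc Subscribe(SubscribeRequest) returns (stream IntelEvent);
}
```

A Go or Java client generated from such a definition reads events off the stream as they arrive, with HTTP/2 flow control providing backpressure between server and consumer.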
Posted 1 day ago
13.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Surat, Gujarat, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Jaipur, Rajasthan, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Chandigarh, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Kolkata, West Bengal, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Guwahati, Assam, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Bhubaneswar, Odisha, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience
- Expertise in architecting and designing highly scalable, efficient, fault-tolerant data pipelines for near-real-time and real-time processing.
- Strong diagnostic and problem-solving skills, with a demonstrated ability to distill complex problems into elegant solutions.
- Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations.
- Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments.
- Deep understanding of distributed object storage systems such as S3 and GCS, including their architectures, consistency models, and scaling properties.
- Deep understanding of data formats such as Parquet and Iceberg, including optimized partitioning, sorting, compression, and read performance.
- Expert-level proficiency in Golang, Java, or similar languages, with a strong understanding of concurrency, distributed computing, and system-level optimizations.
- Cloud-native data infrastructure experience with AWS, GCP, etc. is a huge plus.
- Proven ability to influence technical direction and communicate with clarity.
- Willingness to work with a globally distributed team across time zones.

Education
BSCS or equivalent required; MSCS or equivalent strongly preferred.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
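For candidates unfamiliar with the near-real-time ingestion pattern this role centers on, the core idea is micro-batching: incoming events are grouped into small batches, flushed either when a size threshold is reached (throughput) or when a time interval elapses (latency bound). The sketch below is purely illustrative and uses only the Python standard library; the function name `ingest` and its parameters are invented for this example and are not part of Netskope's or Uplers' actual systems.

```python
import queue
import threading

def ingest(events, batch_size=3, flush_interval=0.05):
    """Group incoming events into micro-batches for near-real-time processing.

    A batch is emitted when it reaches `batch_size` records, or when
    `flush_interval` seconds pass without the batch filling up.
    """
    q = queue.Queue()
    batches = []

    def producer():
        # Simulate an upstream event source feeding the pipeline.
        for e in events:
            q.put(e)
        q.put(None)  # sentinel: no more events

    threading.Thread(target=producer, daemon=True).start()

    batch = []
    while True:
        try:
            item = q.get(timeout=flush_interval)
        except queue.Empty:
            # Time-based flush: emit a partial batch to bound latency.
            if batch:
                batches.append(batch)
                batch = []
            continue
        if item is None:
            break
        batch.append(item)
        if len(batch) >= batch_size:
            # Size-based flush: emit a full batch for throughput.
            batches.append(batch)
            batch = []
    if batch:
        batches.append(batch)  # flush the final partial batch
    return batches

print(ingest(range(7)))  # → [[0, 1, 2], [3, 4, 5], [6]]
```

Tuning `batch_size` up favors throughput (fewer, larger writes downstream), while tuning `flush_interval` down favors latency; production pipelines make the same trade-off at much larger scale.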