Experience: 13+ years | Salary: Confidential | Remote
Listed for multiple Indian locations: Chandigarh; Kolkata, West Bengal; Guwahati, Assam; Bhubaneswar, Odisha; Cuttack, Odisha; Ranchi, Jharkhand; Raipur, Chhattisgarh; Amritsar, Punjab
Experience: 13+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time, permanent position
(Note: This is a requirement for one of Uplers' clients, Netskope.)

Must-have skills: gRPC, Protocol Buffers, Avro, storage systems

About the Role

Please note: this team is hiring across all levels, and candidates are individually assessed and leveled based on their skills and experience.

The Data Platform team is uniquely positioned at the intersection of security, big data, and cloud computing. We are responsible for providing ultra-low-latency access to global security insights and intelligence data for our customers, enabling them to act in near real time. We are looking for a seasoned engineer to help us build next-generation data pipelines that provide near-real-time ingestion of security insights and intelligence data using cloud and open-source data technologies.

What's in It for You
- You will be part of a growing team of renowned industry experts in the exciting space of data and cloud analytics.
- Your contributions will have a major impact on our global customer base and across the industry through our market-leading products.
- You will solve complex, interesting challenges and improve the depth and breadth of your technical and business skills.

What You Will Be Doing
- Building next-generation data pipelines for near-real-time ingestion of security insights and intelligence data.
- Partnering with industry experts in security and big data, and with product and engineering teams, to conceptualize, design, and build innovative solutions to hard problems on behalf of our customers.
- Evaluating open-source technologies to find the best fit for our needs, and contributing to some of them.
- Helping other teams architect their systems on top of the data platform and influencing their architecture.

Required Skills and Experience
- Expertise in the architecture and design of highly scalable, efficient, fault-tolerant data pipelines for near-real-time and real-time processing.
- Strong diagnostic and problem-solving skills, with a demonstrated ability to distill complex problems into elegant solutions.
- Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations.
- Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments.
- Deep understanding of distributed object storage systems such as S3 and GCS, including their architectures, consistency models, and scaling properties.
- Deep understanding of data formats such as Parquet and Iceberg, including optimized partitioning, sorting, compression, and read performance.
- Expert-level proficiency in Go, Java, or similar languages, with a strong understanding of concurrency, distributed computing, and system-level optimizations.
- Cloud-native data infrastructure experience with AWS, GCP, etc. is a big plus.
- Proven ability to influence technical direction and communicate with clarity.
- Willingness to work with a globally distributed team across time zones.

Education: BS in Computer Science or equivalent required; MS in Computer Science or equivalent strongly preferred.

How to Apply
Step 1: Click "Apply" and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of being shortlisted and meeting the client for the interview.

About Uplers
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talent find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
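The serialization formats named in the required skills (Protocol Buffers, Avro) both rely on compact variable-length integer encodings on the wire. As a small illustration of the kind of low-level knowledge the role calls for, here is a minimal sketch of the base-128 "varint" encoding that Protocol Buffers uses for integer fields: seven payload bits per byte, least-significant group first, with the high bit set on every byte except the last. This is an educational toy, not the official protobuf library.

```python
def encode_varint(n: int) -> bytes:
    """Encode a non-negative integer as a protobuf-style base-128 varint."""
    out = bytearray()
    while True:
        b = n & 0x7F          # take the low 7 bits
        n >>= 7
        if n:                 # more bits remain: set the continuation bit
            out.append(b | 0x80)
        else:                 # final byte: continuation bit clear
            out.append(b)
            return bytes(out)

def decode_varint(data: bytes) -> tuple[int, int]:
    """Decode a varint; return (value, number_of_bytes_consumed)."""
    result = 0
    shift = 0
    for i, b in enumerate(data):
        result |= (b & 0x7F) << shift   # accumulate 7 bits per byte
        if not (b & 0x80):              # continuation bit clear: done
            return result, i + 1
        shift += 7
    raise ValueError("truncated varint")

# 300 = 0b1_0010_1100 encodes as two bytes: 0xAC 0x02
assert encode_varint(300) == b"\xac\x02"
assert decode_varint(b"\xac\x02") == (300, 2)
```

Avro's binary encoding uses the same base-128 scheme combined with zig-zag mapping for signed integers, which is why fluency with one format transfers readily to the other.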
Posted 1 day ago
13.0 years
0 Lacs
Kolkata, West Bengal, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Guwahati, Assam, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Bhubaneswar, Odisha, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Cuttack, Odisha, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Ranchi, Jharkhand, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Raipur, Chhattisgarh, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. 
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Amritsar, Punjab, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience
- Expertise in the architecture and design of highly scalable, efficient, and fault-tolerant data pipelines for near real-time and real-time processing.
- Strong diagnostic and problem-solving skills, with a demonstrated ability to simplify complex problems into elegant solutions.
- Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations.
- Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments.
- Deep understanding of distributed object storage systems such as S3 and GCS, including their architectures, consistency models, and scaling properties.
- Deep understanding of data formats such as Parquet and Iceberg, including optimized partitioning, sorting, compression, and read performance.
- Expert-level proficiency in Golang, Java, or similar languages, with a strong understanding of concurrency, distributed computing, and system-level optimizations.
- Cloud-native data infrastructure experience with AWS, GCP, etc. is a huge plus.
- Proven ability to influence technical direction and communicate with clarity.
- Willingness to work with a globally distributed team across time zones.

Education
BSCS or equivalent required; MSCS or equivalent strongly preferred.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
The same Netskope posting is also listed for: Jamshedpur (Jharkhand), Thane (Maharashtra), Greater Lucknow Area, Nagpur (Maharashtra), Nashik (Maharashtra), and Kanpur (Uttar Pradesh); all remote, posted 1 day ago.
3.5 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Responsibilities
- Writing testable and efficient code.
- Designing and implementing low-latency, high-availability, and performant applications.
- Implementing security and data protection.
- Implementing business logic and developing APIs and services.
- Building reusable code and libraries for future use.

Requirements
- 3.5 years' experience in development.
- Hands-on experience in back-end development with Go/Node.js.
- Knowledge of Node.js frameworks such as Restify.
- Good understanding of server-side templating languages.
- Basic understanding of front-end technologies, such as HTML5 and CSS3.
- Expertise with Linux-based systems.
- Proficient understanding of code versioning tools, such as Git.
- Experience with any of the cloud-based platforms and tooling: AWS, GCP, Docker, and Kubernetes.
- Independent, resourceful, analytical, and able to solve problems effectively.
- Ability to be flexible, agile, and thrive in chaos.
- Excellent oral and written communication skills.

This job was posted by Akash R from GoKwik.
Posted 1 day ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
NVIDIA GeForce Now is revolutionizing the distribution of video games by streaming high-end titles from the cloud to users around the world. Do you want to build a creative, multi-tenant, state-of-the-art platform to lead the business of these games for developers? NVIDIA's Developer Services software team builds the technology engine that helps developers build, distribute, monetize, and analyze their games on the GeForce Now service. We are passionate about building engaging websites, highly scalable services, and interactive tools and applications in a dynamic environment. NVIDIA is looking for best-in-class software development engineers to join our outstanding Networking Driver engineering team, developing drivers, protocols, and applications that deliver high throughput and the lowest latency with low CPU utilization! Come and take a significant part in architecting, crafting, developing, and verifying innovative solutions. Enjoy working in a relevant, growing, and highly professional environment where you make a huge impact in a technology-focused company.

What You Will Be Doing
- Architect and craft solutions for real-world networking problems.
- Daily work involves all aspects of development: architecture, design, coding, production, and verification.
- Work flexibly with cross-functional teams.
- Implement and develop applications and solutions using innovative, pioneering distributed technologies to ensure scalability, reliability, and efficiency.
- Take on complex system-level optimization and resource-utilization challenges.

What We Need To See
- BS/MS in Computer Science/Engineering or equivalent experience.
- 5+ years of relevant work experience.
- Strong coding, debugging, and performance-tuning skills.
- Experience in C.
- Good knowledge of standard Ethernet.
- Good knowledge of network protocols.
- Proficiency in English and clear communication.

Ways To Stand Out From The Crowd
- Knowledge of TCP/IP protocols and the networking stack.
- Knowledge of QNX/Linux drivers.
- Experience working with hardware teams.
- Experience in silicon bring-up activities.

NVIDIA is widely considered to be one of the technology world's most desirable employers. We have some of the most forward-thinking and hardworking people in the world working for us. Are you a creative and autonomous engineer who loves a challenge? Are you ready to become the engineer you always wanted to be? Come and join the best networking team! JR1999161
Posted 1 day ago
13.0 years
0 Lacs
India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. 
Required Skills And Experience
- Expertise in architecture and design of highly scalable, efficient, and fault-tolerant data pipelines for near-real-time and real-time processing.
- Strong diagnostic and problem-solving skills, with a demonstrated ability to distill complex problems into elegant solutions.
- Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations.
- Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments.
- Deep understanding of distributed object storage systems such as S3 and GCS, including their architectures, consistency models, and scaling properties.
- Deep understanding of data formats such as Parquet and Iceberg, including optimized partitioning, sorting, compression, and read performance.
- Expert-level proficiency in Golang, Java, or similar languages, with a strong understanding of concurrency, distributed computing, and system-level optimizations.
- Cloud-native data infrastructure experience with AWS, GCP, etc. is a huge plus.
- Proven ability to influence technical direction and communicate with clarity.
- Willingness to work with a globally distributed team in different time zones.

Education
BSCS or equivalent required; MSCS or equivalent strongly preferred.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

About Uplers:
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal.
Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
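The Netskope posting above lists compact serialization formats (Protocol Buffers, Avro) as must-have skills. A stdlib-only sketch of why schema-based binary encodings matter for high-volume pipelines, using `struct` as a stand-in for a real encoder (the event fields and layout here are illustrative, not an actual Netskope schema):

```python
import json
import struct

# A hypothetical security event: timestamp in ms, severity, and a source IP.
event = {"ts": 1_700_000_000_123, "severity": 3, "src_ip": "10.0.0.7"}

# Text encoding: self-describing but verbose, since field names and digits
# travel with every message.
as_json = json.dumps(event).encode("utf-8")

# Binary encoding against a fixed schema, which is the core idea behind
# Protocol Buffers and Avro (real formats add varints, field tags, and
# schema evolution on top of this): 8-byte timestamp, 1-byte severity,
# 4-byte packed IP.
ip_bytes = bytes(int(octet) for octet in event["src_ip"].split("."))
as_binary = struct.pack(">QB4s", event["ts"], event["severity"], ip_bytes)

print(len(as_json), len(as_binary))  # the binary form is a fraction of the JSON size
```

At ingestion rates of millions of events per second, that size difference compounds directly into network, storage, and CPU savings, which is why schema-based formats dominate in this space.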
Posted 1 day ago
3.0 - 8.0 years
0 Lacs
India
On-site
Role description: This role is with one of our prominent portfolio companies.

About Us
We are a San Francisco-based startup building next-generation Voice AI products that redefine how humans interact with machines, from smart voice assistants to automated customer conversations, voice-driven tools, and more. We are at the intersection of speech technology, large language models, and real-time systems, backed by leading investors and supported by domain experts. We're now building our founding engineering team in India to shape the core product experience.

What You'll Do
- Build and deploy real-time voice-based AI applications using ASR (Automatic Speech Recognition), TTS (Text-to-Speech), and LLMs.
- Work on latency-sensitive systems to enable near-real-time conversations.
- Design and implement prompt-chaining, memory, and tool integration for LLM-powered voice agents.
- Set up and manage scalable infrastructure for voice/audio processing and AI model serving.
- Work closely with the founding team on product shaping, roadmap planning, and technical strategy.
- Continuously experiment with and evaluate new models, APIs, and speech/LLM techniques.

Who You Are
- 3-8 years of experience in AI/ML, deep learning, or backend-heavy engineering roles.
- Solid hands-on experience with speech technologies (ASR, TTS, diarization, etc.).
- Comfortable working with Python, PyTorch, Hugging Face, OpenAI, or similar frameworks.
- Experience deploying real-time systems (Docker, Kubernetes, AWS/GCP).
- Strong problem-solving skills with a product-first mindset.
- Self-starter who thrives in high-ownership, fast-paced environments.
- Excellent written and verbal communication.
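The "prompt-chaining, memory, and tool integration" responsibility above can be sketched as a minimal agent loop. Everything here is hypothetical scaffolding: `call_llm` is stubbed with canned replies standing in for a real ASR-to-LLM-to-TTS pipeline, and the `TOOL:` reply convention is an invented routing format, not any particular vendor's API:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class VoiceAgent:
    tools: dict[str, Callable[[str], str]]        # tool integration
    memory: list[str] = field(default_factory=list)  # conversational memory

    def call_llm(self, prompt: str) -> str:
        # Stub: a real agent would send the transcript plus memory to a model.
        if "weather" in prompt:
            return "TOOL:weather:Pune"
        return "Okay, noted."

    def handle(self, utterance: str) -> str:
        self.memory.append(utterance)
        reply = self.call_llm(utterance)
        if reply.startswith("TOOL:"):             # model asked for a tool call
            _, name, arg = reply.split(":", 2)
            reply = self.tools[name](arg)
        self.memory.append(reply)
        return reply

agent = VoiceAgent(tools={"weather": lambda city: f"22°C in {city}"})
print(agent.handle("what's the weather?"))  # routed through the weather tool
```

The real engineering challenge the posting describes is doing this loop under tight latency budgets while audio is still streaming in.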
Posted 1 day ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

What We Do
At Goldman Sachs, we connect people, capital and ideas to help solve problems for our clients. We are a leading global financial services firm providing investment banking, securities and investment management services to a substantial and diversified client base that includes corporations, financial institutions, governments and individuals.

Global Banking & Markets
Our core value is building strong relationships with our institutional clients, which include corporations, financial service providers, and fund managers. We help them buy and sell financial products on exchanges around the world, raise funding, and manage risk. This is a dynamic, entrepreneurial team with a passion for the markets, with individuals who thrive in fast-paced, changing environments and are energized by a bustling trading floor.

What We Do
Engineers in the Systematic Market Making (SMM) team play an integral role on the trading floor. We develop and employ automated trading strategies for the firm and its clients. We build complex parallel computing architectures, electronic trading tools and models to help us explain market behavior and predict price movement. Throughout the Global Banking and Markets Division (GBM), eTrading Engineers are using quantitative and technological techniques to solve complex business problems. As an eTrading Engineer, you will be building the foundational technologies to run those algorithms on markets around the world, and to enable the research and analysis that support them. We are looking for developers who are interested in applying leading-edge technologies to solve problems in electronic trading. In a team of energetic, self-motivated individuals, we need someone who can take the initiative at any stage of the software cycle, from inception, through development, to release and support. This role also interacts with a variety of other engineering, trading and sales teams.
The structure is flat, and the successful candidate will be able to manage their time to have maximum impact.

Your Impact
You will be working on a team focused on electronic market making and execution. You will work with other engineers and traders to improve all aspects of price-making, risk management and execution. You'll do this with a keen eye on performance, guided by a robust measurement framework and lots of experimentation. You will have an opportunity to develop a deep understanding of how GS interacts with some of the most dynamic and liquid markets in the world.

Responsibilities And Qualifications

Principal Responsibilities
- Analyze trading system performance and identify areas for improvement.
- Generate ideas for system enhancements that drive commercial performance.
- Implement, test and deploy these ideas.
- Improve the safety and reliability of trading systems.
- Work constructively in collaboration with other team members.
- Manage work to balance the short-term needs of the business with strategic enhancements.

Experience/Skills
- Strong academic background in Computer Science or an analytical field such as Mathematics, Physics, Engineering, etc.
- Strong software engineering background.
- Proven ability to analyze data and draw useful commercial conclusions.
- Good communication skills.

Experience Of The Following Would Be Advantageous
- Securities/trading experience.
- Multi-threaded/concurrent programming.
- Java/C++ performance tuning.
- Low-latency systems, including messaging, network protocols, network I/O in Java, C/C++, JNI.
- Hardware stack and hardware architecture from a latency perspective.
- Knowing your way around a Linux terminal.

Goldman Sachs Engineering Culture
At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Founded in 1869, we are a leading global investment banking, securities and investment management firm. Headquartered in New York, we maintain offices around the world.
We believe who you are makes you better at what you do. We're committed to fostering and advancing diversity and inclusion in our own workplace and beyond by ensuring every individual within our firm has a number of opportunities to grow professionally and personally, from our training and development opportunities and firmwide networks to benefits, wellness and personal finance offerings and mindfulness programs. Learn more about our culture, benefits, and people at GS.com/careers. We're committed to finding reasonable accommodations for candidates with special needs or disabilities during our recruiting process. Learn more: https://www.goldmansachs.com/careers/footer/disability-statement.html

© The Goldman Sachs Group, Inc., 2025. All rights reserved. Goldman Sachs is an equal employment/affirmative action employer: Female/Minority/Disability/Veteran/Sexual Orientation/Gender Identity.
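The Goldman posting above emphasizes performance work "guided by a robust measurement framework and lots of experimentation". A stdlib-only sketch of that idea: time a critical code path many times and report tail latency, since low-latency trading systems care about p99 far more than the mean. The helper name and the timed workload are illustrative:

```python
import statistics
import time

def measure_ns(fn, runs: int = 1000) -> dict[str, float]:
    """Time fn() repeatedly and summarize latency in nanoseconds."""
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter_ns()
        fn()
        samples.append(time.perf_counter_ns() - t0)
    q = statistics.quantiles(samples, n=100)  # 99 cut points: q[49]~p50, q[98]~p99
    return {"p50": q[49], "p99": q[98], "max": max(samples)}

stats = measure_ns(lambda: sum(range(100)))
print(stats)  # a p99 far above p50 reveals jitter worth investigating
```

In a real trading system the same measurements would come from hardware timestamps on the wire rather than in-process timers, but the p50-versus-p99 discipline is identical.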
Posted 1 day ago
3.0 - 5.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Job description:

Role Purpose
The purpose of this role is to design, develop and troubleshoot solutions/designs/models/simulations on various software as per the client's/project requirements.

Do
1. Design and develop solutions as per client's specifications
- Work on different software like CAD and CAE to develop appropriate models as per the project plan/customer requirements.
- Test the prototype and designs produced on the software and check all the boundary conditions (impact analysis, stress analysis, etc.).
- Produce specifications and determine operational feasibility by integrating software components into a fully functional software system.
- Create a prototype as per the engineering drawings, with an outline CAD model prepared.
- Perform failure mode and effects analysis (FMEA) for any new requirements received from the client.
- Provide optimized solutions to the client by running simulations in a virtual environment.
- Ensure software is updated with the latest features to make it cost-effective for the client.
- Enhance applications/solutions by identifying opportunities for improvement, making recommendations, and designing and implementing systems.
- Follow industry-standard operating procedures for various processes and systems as per the client requirements while modeling a solution on the software.

2. Provide customer support and problem solving from time to time
- Perform defect fixes raised by the client or software integration team while solving the tickets raised.
- Develop software verification plans and quality assurance procedures for the customer.
- Troubleshoot, debug and upgrade existing systems on time, with minimum latency and maximum efficiency.
- Deploy programs and evaluate user feedback for adequate resolution with customer satisfaction.
- Comply with project plans and industry standards.

3. Ensure reporting & documentation for the client
- Ensure weekly and monthly status reports for the clients as per requirements.
- Maintain documents and create a repository of all design changes, recommendations, etc.
- Maintain time-sheets for the clients.
- Provide written knowledge transfer/history of the project.

Deliver
No. | Performance Parameter | Measure
1. | Design and develop solutions | Adherence to project plan/schedule; 100% error-free onboarding & implementation; throughput %
2. | Quality & CSAT | On-time delivery; minimum corrections; first-time-right; no major defects post production; 100% compliance with the bi-directional traceability matrix; completion of assigned certifications for skill upgradation
3. | MIS & Reporting | 100% on-time MIS & report generation

Mandatory Skills: Embedded Java.
Experience: 3-5 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 day ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
mthree is seeking a Java Developer to join a highly regarded multinational investment bank and financial services company.

Job Description:
Role: Java Developer
Team: Payment Gateway
Location: Pune (hybrid model with 2-3 days per week in the office)

Key Responsibilities
- Develop and maintain applications: Design, develop, and maintain server-side applications using Java 8 to ensure high performance and responsiveness to requests from the front end.
- Scalability solutions: Architect and implement scalable solutions for client risk management, ensuring the system can handle large volumes of transactions and data.
- Data streaming and caching: Utilize Kafka or Redis for efficient data streaming and caching, ensuring real-time data processing and low-latency access.
- Multithreading and synchronization: Implement multithreading and synchronization techniques to enhance application performance and ensure thread safety.
- Microservices development: Develop and deploy microservices using Spring Boot, ensuring modularity and ease of maintenance.
- Design patterns: Apply design patterns to solve complex software design problems, ensuring code reusability and maintainability.
- Linux optimization: Ensure applications are optimized for Linux environments, including performance tuning and troubleshooting.
- Collaboration: Collaborate with cross-functional teams, including front-end developers, QA engineers, and product managers, to define, design, and ship new features.
- Troubleshooting: Troubleshoot and resolve production issues, ensuring minimal downtime and optimal performance.

Requirements:
- Educational background: Bachelor's degree in Computer Science, Engineering, or a related field.
- Programming expertise: Proven experience (circa 2-5 years) in Java 8+ programming, with a strong understanding of object-oriented principles and design.
- Data technologies: Understanding of Kafka or Redis (or a similar cache), including setup, configuration, and optimization.
- Concurrency: Experience with multithreading and synchronization, ensuring efficient and safe execution of concurrent processes.
- Frameworks: Proficiency in Spring Boot, including developing RESTful APIs and integrating with other services.
- Design patterns: Familiarity with design patterns and their application in solving software design problems.
- Operating systems: Solid understanding of Linux operating systems, including shell scripting and system administration.
- Problem-solving: Excellent problem-solving skills and attention to detail, with the ability to debug and optimize code.
- Communication: Strong communication and teamwork skills, with the ability to work effectively in a collaborative environment.

Preferred Qualifications:
- Industry experience: Experience in the financial services industry is a plus.
- Additional skills: Knowledge of other programming languages and technologies, such as Python or Scala.
- DevOps practices: Familiarity with DevOps practices and tools, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes).
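The multithreading-and-synchronization requirement above is language-agnostic; sketched here in Python for brevity (in the posting's Java 8 world, the same protection would come from `synchronized` blocks or `AtomicLong`). Without the lock, concurrent read-modify-write increments can lose updates:

```python
import threading

class SafeCounter:
    """Counter whose increments are safe under concurrent access."""

    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:      # critical section: read-modify-write is now atomic
            self.value += 1

counter = SafeCounter()
threads = [
    threading.Thread(target=lambda: [counter.increment() for _ in range(10_000)])
    for _ in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # 80000: no updates lost with the lock held
```

In a payment gateway, the equivalent shared state would be balances or risk limits, where a lost update is not a flaky test but a financial error, hence the posting's emphasis on thread safety.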
Posted 1 day ago
3.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
We are seeking talented, driven individuals with a passion for tackling complex challenges and a strong desire to learn. As part of our engineering team, you will play a key role in orchestrating, deploying, and maintaining scalable and efficient applications. To excel in this role, you'll need experience in developing server-side logic and working on back-end solutions. Join us and be part of a team that thrives on innovation and impact!

Responsibilities
- Write reusable, testable, and efficient code.
- Design and implement low-latency, high-availability, and performant applications.
- Design and create RESTful APIs for internal and partner consumption.
- Implement security and data protection.
- Debug code on the platform (written by self or others) to find the root cause of any ongoing issues and rectify them.
- Optimize database queries, and design and implement scalable database schemas that represent and support business processes.
- Implement web applications in Python, SQL, JavaScript, HTML, and CSS.
- Provide technical leadership to teammates through coaching and mentorship.
- Delegate tasks and set deadlines.
- Monitor team performance and report on performance.
- Collaborate with other software developers and business analysts to plan, design and develop applications.
- Maintain client relationships and ensure company deliverables meet the highest expectations of the client.

Qualifications & Skills

Mandatory
- 3+ years of experience in Django/Flask.
- Solid database skills in relational databases.
- Knowledge of how to build and use RESTful APIs.
- Strong knowledge of version control.
- Hands-on experience working on Linux systems.
- Familiarity with ORM (Object Relational Mapper) libraries; experience with SQLAlchemy is a plus.
- Knowledge of Redis.
- Strong understanding of peer review best practices.
- Hands-on experience with deployment processes.

Good to Have
- Proficiency in AWS, Azure, or GCP (any one).
- Experience with Docker.
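The database query optimization responsibility above can be made concrete with stdlib `sqlite3`: adding an index turns a full table scan into an index lookup, which `EXPLAIN QUERY PLAN` makes visible. The table and column names here are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = ?"

# Without an index, the planner must scan every row.
before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]

# With an index on the filtered column, it can jump straight to matches.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]

print(before)  # a full scan of orders
print(after)   # a search using idx_orders_customer
```

The same habit of reading the query plan before and after a schema change carries over directly to PostgreSQL or MySQL (`EXPLAIN ANALYZE`), where it underpins the Django/Flask performance work this role describes.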
Posted 1 day ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
The Core Shopping Experience team builds products and services that solve unique customer needs in Amazon's fastest growing marketplace. We are a multi-billion dollar business with huge potential to grow in a trillion dollar market. Our engineers own the complete consumer experience for Amazon India, work on a wide range of technologies (including AWS and Android) and build and operate highly scalable, low-latency, mobile-first products and services. We are solving last-mile engineering challenges for the next set of customers who first experience Amazon on their mobile phones. If you are looking for an opportunity to build creative technology solutions that positively impact hundreds of millions of international customers, and relish large ownership and diverse technologies, join our team today! You will be instrumental in shaping the product direction and will be actively involved in defining key product features that impact the business. You will work with Principal Engineers at Amazon to evolve the design and architecture of the products owned by this team. You will be responsible for setting up and holding a high software quality bar besides providing technical direction to a highly technical team of Software Engineers. As part of this team you will work to ensure Amazon.in is fast and has the best shopping experience. It's a great opportunity to develop and enhance experiences for mobile devices first. You will get the opportunity to work on the Amazon Mobile Shopping App and almost all key pages on the retail website, building features and improving business metrics. You will also contribute to reducing latency for customers by reducing the bytes on the wire and adapting the UX based on network bandwidth. You will be part of a team that obsesses about the performance of our customers' experience and enjoys the flexibility to pursue what makes sense.
Come enjoy an exploratory and research-oriented team working in a fast-paced environment, always eager to take on big challenges.

Position Responsibilities
- Work closely with senior engineers to test applications that impact the Amazon.in business, with an emphasis on Mobile, Payments, and e-Commerce website development.
- Own the quality of an integral piece of a system or application.
- Manage and execute against project plans and delivery commitments.
- Assist directly and indirectly in the continual hiring and development of technical talent.
- Create and execute appropriate quality plans, project plans, test strategies and processes for development activities in concert with business and project management efforts.

Basic Qualifications
- 2+ years of quality assurance engineering experience
- Experience in automation testing
- Experience in manual testing
- Experience in UI and API automation testing (Selenium/SOAPUI)

Preferred Qualifications
- Experience in API & mobile testing
- Experience designing and planning test conditions, test scripts, and test data sets to ensure appropriate and adequate coverage and control

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: ADCI - Karnataka
Job ID: A3031389
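The "test conditions, test scripts, and test data sets" qualification above is the data-driven testing pattern, which stdlib `unittest` supports via `subTest`: each case in the data set runs independently, so one failure does not hide the rest. The function under test and its cases are stand-ins, not Amazon code:

```python
import unittest

def shipping_fee(order_total: float) -> float:
    """Hypothetical rule under test: orders of 500 or more ship free."""
    return 0.0 if order_total >= 500 else 40.0

class ShippingFeeTest(unittest.TestCase):
    CASES = [          # the test data set: (order_total, expected_fee)
        (499.99, 40.0),   # just under the threshold
        (500.00, 0.0),    # boundary condition
        (0.0, 40.0),      # degenerate order
    ]

    def test_fee_table(self):
        for total, expected in self.CASES:
            with self.subTest(total=total):   # each case reported separately
                self.assertEqual(shipping_fee(total), expected)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(ShippingFeeTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.testsRun, len(result.failures))
```

The same table-driven shape scales from unit tests up to the Selenium/SOAPUI UI and API suites the posting mentions, where the data set becomes device profiles or API payloads.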
Posted 1 day ago
4.0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
About Us
We are a global leader in food & beverage ingredients. Pioneers at heart, we operate at the forefront of consumer trends to provide food & beverage manufacturers with products and ingredients that will delight their consumers. Making a positive impact on people and planet is all part of the delight. With a deep-rooted presence in the countries where our ingredients are grown, we are closer to farmers, enabling better quality and more reliable, traceable and transparent supply. Supplying products and ingredients at scale is just the start. We add value through our unique, complementary portfolio of natural, delicious and nutritious products. With our fresh thinking, we help our customers unleash the sensory and functional attributes of cocoa, coffee, dairy, nuts and spices so they can create naturally good food & beverage products that meet consumer expectations. And whoever we're with, whatever we're doing, we always make it real.

Introduction
At ofi, we are at the forefront of harnessing cutting-edge technology to revolutionize our operations. We aim to leverage machine learning and artificial intelligence to drive transformative business outcomes and create value for our clients. We are committed to a culture of innovation, diversity, and continuous improvement, where every team member can contribute and thrive. As an ML Engineer, you will be crucial in developing advanced algorithms and models to tackle complex problems. Your expertise will drive the deployment and upkeep of intelligent systems that enhance our products and services. You will work within a collaborative environment, leveraging data and machine learning to influence business strategies and improve operational efficiency.

Key Deliverables
- Deliver end-to-end ML solutions: Architect and implement state-of-the-art models (classification, regression, clustering, reinforcement learning) precisely tuned to solve high-value business problems.
- Engineer data & experimentation pipelines at scale: Build reliable, self-service pipelines for ingesting, cleaning, transforming, and aggregating data, and orchestrate rigorous offline/online experiments (cross-validation, A/B tests) to benchmark accuracy, latency, and resource cost.
- Embed ML seamlessly into products: Partner with data scientists, backend/frontend engineers, and designers to wire models into production services and user experiences, ensuring low-friction integration and measurable product impact.
- Operate, monitor, and evolve models in production: Own the DevOps stack (automated CI/CD, containerization, and cloud deployment) and run real-time monitoring to detect drift, performance degradation, and anomalies, triggering retraining or rollback as needed.
- Uphold engineering excellence & knowledge sharing: Enforce rigorous code quality, version control, testing, and documentation; lead code reviews and mentoring sessions that raise the team's ML craftsmanship.
- Safeguard data privacy, security, and compliance: Design models and pipelines that meet regulatory requirements, apply robust access controls and encryption, and audit usage to ensure ethical and secure handling of sensitive data.

Qualifications & Skills
- Formal grounding in computing & AI: Bachelor's/Master's in Computer Science, Data Science, or a related quantitative field.
- Proven production experience: 4+ years shipping, deploying, and maintaining machine-learning models at scale, with a track record of solving complex, real-world problems.
- End-to-end technical toolkit: Python (Pandas, NumPy), ML frameworks (TensorFlow, PyTorch, scikit-learn), databases (SQL & NoSQL), and big-data stacks (Spark, Hadoop).
- MLOps & cloud deployment mastery: Containerization (Docker, Kubernetes), CI/CD pipelines, and monitoring workflows that keep models reliable and reproducible in production.
- Deep applied-ML expertise: Supervised and unsupervised learning, NLP, computer vision, and time-series analysis, plus strong model-evaluation and feature-engineering skills.
- Collaboration & communication strength: Clear communicator and effective team player who can translate business goals into technical solutions and articulate results to diverse stakeholders.

ofi is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, nationality, disability, protected veteran status, sexual orientation, gender identity, gender expression, genetic information, or any other characteristic protected by law. Applicants are requested to complete all required steps in the application process, including providing a resume/CV, in order to be considered for open roles.
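The offline experimentation mentioned in the ofi deliverables (cross-validation to benchmark accuracy) can be shown with a stdlib-only sketch: k-fold cross-validation of a trivial mean-predictor baseline. A real pipeline would use scikit-learn's `KFold` and actual models, but the mechanics of holding out each fold are the same:

```python
import statistics

def k_fold_mae(ys: list[float], k: int = 5) -> float:
    """Mean absolute error of a mean-predictor baseline under k-fold CV."""
    fold_size = len(ys) // k
    errors = []
    for i in range(k):
        test = ys[i * fold_size:(i + 1) * fold_size]           # held-out fold
        train = ys[:i * fold_size] + ys[(i + 1) * fold_size:]  # remaining data
        prediction = statistics.mean(train)                    # "model" fit on train
        errors.extend(abs(y - prediction) for y in test)
    return statistics.mean(errors)

data = [float(x % 7) for x in range(100)]  # toy target values
print(round(k_fold_mae(data), 3))          # the baseline error any real model must beat
```

Establishing a cheap baseline like this before training anything sophisticated is a standard guard against shipping a complex model that performs no better than the mean.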
Posted 1 day ago
5.0 years
0 Lacs
India
Remote
Skyflow is a data privacy vault company built to radically simplify how companies isolate, protect, and govern their customers' most sensitive data. With its global network of data privacy vaults, Skyflow is also a comprehensive solution for companies around the world looking to meet complex data localization requirements. Skyflow currently supports a diverse customer base that spans verticals like fintech, retail, travel, and healthtech. Skyflow is headquartered in Palo Alto, California and was founded in 2019. For more information, visit www.skyflow.com or follow on X and LinkedIn.

About The Role
As a Backend Software Engineer you will be responsible for developing a state-of-the-art SaaS solution that enables enterprises to govern and protect their sensitive data. You will contribute to performance engineering efforts and ensure low-latency and high-throughput transactions at scale. You will participate in and be responsible for enforcing best practices in software quality, security, testing and documentation. We know great software engineers come from diverse backgrounds, so no single individual may have all the desired skills on day one. But if you are the kind of software engineer who would have loved to engineer solutions for the Stripe or Twilio APIs, the Slack or Zendesk apps, or the Snowflake or MongoDB platforms, we want to talk to you.

You Have
- 5+ years of experience in software development.
- Proficiency in one or more programming languages such as Go (preferred), Java, C, C++, Python.
- Experience in performance engineering: developing high-throughput, low-latency systems.
- Deep understanding of algorithms, data structures, scalability, and distributed systems.
- Privacy, authorization/authentication engineering is a huge plus.
- Experience with continuous integration, writing testable code, and test-driven development.
- Proven track record of delivering cloud-native distributed SaaS platforms at scale, with meaningful adoption.
- Traits such as being a fast learner, adaptability to a changing landscape, and, most importantly, a strong belief in being hands-on.

You Will
- Design and develop Privacy APIs and backend infrastructure to support large-scale data and privacy workflows.
- Contribute to performance engineering efforts and ensure low-latency, high-throughput transactions at scale.
- Participate in building and implementing effective test strategies, and develop software with high agility and zero downtime.
- Collaborate with security and privacy engineers to deliver state-of-the-art privacy solutions.

Benefits
- Work-from-home expense allowance
- Excellent health insurance options
- Very generous PTO
- Flexible hours
- Generous equity

At Skyflow, we believe that diverse teams are the strongest teams. We invite applicants of all genders, races, ethnicities, nationalities, ages, religions, sexual orientations, disability statuses, educational experiences, family situations, and socio-economic backgrounds.
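The core idea behind a data privacy vault like the one Skyflow describes can be sketched in a few lines: sensitive values are swapped for random tokens, and only the vault can map a token back. This is a hedged, stdlib-only illustration of tokenization; the class and method names are hypothetical, and a production vault adds encryption at rest, access policies, and audit logging:

```python
import secrets

class Vault:
    """Toy tokenization vault: tokens out, plaintext stays inside."""

    def __init__(self):
        self._store: dict[str, str] = {}  # token -> plaintext, held server-side

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)  # random: reveals nothing about value
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # In a real vault this is the governed, authorized, audited path.
        return self._store[token]

vault = Vault()
token = vault.tokenize("4111-1111-1111-1111")
print(token)                    # safe to store in downstream databases and logs
print(vault.detokenize(token))  # the original value, reachable only via the vault
```

Because downstream systems only ever hold tokens, a breach of those systems exposes nothing sensitive, and data localization reduces to keeping the vault itself in the required region.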
Posted 1 day ago