Experience: 13+ years
Salary: Confidential (based on experience)
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position
Location: Jaipur, Rajasthan, India (the identical posting is also listed for Chandigarh, Kolkata, Guwahati, Bhubaneswar, Cuttack, Ranchi, Raipur, Amritsar, Jamshedpur, Thane, the Greater Lucknow Area, Nagpur, Nashik, and Kanpur)

(Note: This is a requirement for one of Uplers' clients - Netskope.)

Must-have skills: gRPC, Protocol Buffers, Avro, storage systems

About The Role
Please note: this team is hiring across all levels, and candidates are individually assessed and leveled according to their skills and experience.

The Data Platform team is uniquely positioned at the intersection of security, big data, and cloud computing. We are responsible for providing ultra-low-latency access to global security insights and intelligence data for our customers, enabling them to act in near real time. We are looking for a seasoned engineer to help us build next-generation data pipelines that provide near-real-time ingestion of security insights and intelligence data using cloud and open-source data technologies.

What's In It For You
- You will be part of a growing team of renowned industry experts in the exciting space of data and cloud analytics.
- Your contributions will have a major impact on our global customer base, and across the industry, through our market-leading products.
- You will solve complex, interesting challenges and broaden and deepen your technical and business skills.

What You Will Be Doing
- Building the next-generation data pipeline for near-real-time ingestion of security insights and intelligence data.
- Partnering with industry experts in security and big data, and with product and engineering teams, to conceptualize, design, and build innovative solutions to hard problems on behalf of our customers.
- Evaluating open-source technologies to find the best fit for our needs, and contributing to some of them.
- Helping other teams architect their systems on top of the data platform and influencing their architecture.

Required Skills And Experience
- Expertise in the architecture and design of highly scalable, efficient, fault-tolerant data pipelines for near-real-time and real-time processing.
- Strong diagnostic and problem-solving skills, with a demonstrated ability to distill complex problems into elegant solutions.
- Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimization.
- Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments.
- Deep understanding of distributed object storage systems such as S3 and GCS, including their architectures, consistency models, and scaling properties.
- Deep understanding of data formats such as Parquet and Iceberg, including optimized partitioning, sorting, compression, and read performance.
- Expert-level proficiency in Go, Java, or a similar language, with a strong understanding of concurrency, distributed computing, and system-level optimization.
- Cloud-native data infrastructure experience with AWS, GCP, etc. is a big plus.
- Proven ability to influence technical direction and communicate with clarity.
- Willingness to work with a globally distributed team across time zones.

Education
BS in Computer Science or equivalent required; MS in Computer Science or equivalent strongly preferred.

How To Apply
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of being shortlisted and meet the client for the interview.

About Uplers
Our goal is to make hiring reliable, simple, and fast. Our role is to help all our talent find and apply for relevant contractual onsite opportunities and progress in their careers. We will support you with any grievances or challenges you may face during the engagement. (Note: there are many more opportunities on the portal beyond this one; depending on the assessments you clear, you can apply for those as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
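For candidates who want a concrete sense of the work, here is a minimal, illustrative Go sketch of the concurrency patterns the requirements above point at: fan-out decoding workers plus size- and time-based batching ahead of an object-store flush. This is hypothetical and not Netskope code; every name in it is invented for illustration, and the real pipeline would decode Protocol Buffers or Avro and write Parquet to S3/GCS where this sketch only prints.

```go
// Illustrative only: a toy near-real-time ingestion pipeline in Go.
// All names are hypothetical; this is not Netskope's implementation.
package main

import (
	"fmt"
	"sync"
	"time"
)

// event stands in for a decoded security-insight record (in a real
// pipeline this would be a Protocol Buffers or Avro message).
type event struct {
	ID      int
	Payload string
}

// batcher groups events by size or age before a (simulated) flush,
// trading a little latency for far fewer downstream writes.
func batcher(in <-chan event, maxSize int, maxAge time.Duration, flush func([]event)) {
	var buf []event
	timer := time.NewTimer(maxAge)
	defer timer.Stop()
	for {
		select {
		case ev, ok := <-in:
			if !ok { // upstream closed: flush the remainder and exit
				if len(buf) > 0 {
					flush(buf)
				}
				return
			}
			buf = append(buf, ev)
			if len(buf) >= maxSize { // size-triggered flush
				flush(buf)
				buf = nil
			}
		case <-timer.C: // age-triggered flush
			if len(buf) > 0 {
				flush(buf)
				buf = nil
			}
			timer.Reset(maxAge)
		}
	}
}

func main() {
	raw := make(chan event)
	parsed := make(chan event)

	// Fan out parsing across workers, as a real pipeline would for
	// CPU-bound deserialization.
	var wg sync.WaitGroup
	for w := 0; w < 4; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for ev := range raw {
				ev.Payload = "parsed:" + ev.Payload // stand-in for decode work
				parsed <- ev
			}
		}()
	}
	go func() { wg.Wait(); close(parsed) }()

	done := make(chan struct{})
	go func() {
		defer close(done)
		batcher(parsed, 10, 200*time.Millisecond, func(b []event) {
			fmt.Printf("flushed batch of %d events\n", len(b)) // e.g. a Parquet write to S3/GCS
		})
	}()

	for i := 0; i < 25; i++ {
		raw <- event{ID: i, Payload: fmt.Sprintf("e%d", i)}
	}
	close(raw)
	<-done
}
```

The batching trade-off shown here (flush at 10 events or every 200 ms, whichever comes first) is the usual lever for balancing ingestion latency against write amplification on object stores such as S3 or GCS.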
Posted 1 day ago
13.0 years
0 Lacs
Chandigarh, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Kolkata, West Bengal, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Guwahati, Assam, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Bhubaneswar, Odisha, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Cuttack, Odisha, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Ranchi, Jharkhand, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Raipur, Chhattisgarh, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Amritsar, Punjab, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Jamshedpur, Jharkhand, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Thane, Maharashtra, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Greater Lucknow Area
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
13.0 years
0 Lacs
Nagpur, Maharashtra, India
Remote
Experience : 13.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: gRPC, Protocol Buffers, Avro, storage systems Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Platform team is uniquely positioned at the intersection of security, big data and cloud computing. We are responsible for providing ultra low-latency access to global security insights and intelligence data to our customers, and enabling them to act in near real-time. We’re looking for a seasoned engineer to help us build next-generation data pipelines that provide near real-time ingestion of security insights and intelligence data using cloud and open source data technologies. What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What you will be doing Building next generation data pipeline for near real-time ingestion of security insights and intelligence data Partnering with industry experts in security and big data, product and engineering teams to conceptualize, design and build innovative solutions for hard problems on behalf of our customers. Evaluating open source technologies to find the best fit for our needs and also contributing to some of them! Helping other teams architect their systems on top of the data platform and influencing their architecture. Required Skills And Experience Expertise in architecture and design of highly scalable, efficient and fault-tolerant data pipelines for near real-time and real-time processing Strong diagnostic and problem-solving skills with a demonstrated ability to invent and simplify complex problems into elegant solutions. Extensive experience designing high-throughput, low-latency data services using gRPC streaming and performance optimizations Solid grasp of serialization formats (e.g., Protocol Buffers, Avro), network protocols (e.g., TCP/IP, HTTP/2), and security considerations (e.g., TLS, authentication, authorization) in distributed environments. Deep understanding of distributed object storage systems like S3, GCS etc, including their architectures, consistency models, and scaling properties. Deep understanding of data formats like Parquet, Iceberg etc, optimized partitioning, sorting, compression, and read performance. Expert level proficiency in Golang, Java, or similar languages with strong understanding of concurrency, distributed computing and system-level optimizations Cloud-native data infrastructure experience with AWS, GCP etc is a huge plus Proven ability to influence technical direction and communicate with clarity. Willingness to work with a globally distributed team in different time zones. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. 
Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 day ago
This Netskope Data Platform role (Remote, 13.0+ years, via Uplers) is also open for candidates based in Nashik, Maharashtra; Kanpur, Uttar Pradesh; and anywhere in India.
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description: We are seeking a highly skilled Senior Cloud Engineer with extensive experience in software development and DevOps to lead our efforts in automating cloud infrastructure. The ideal candidate will focus on building automation for Cloud Landing Zones and collaborate with cross-functional teams to ensure the successful implementation and maintenance of cloud solutions.
Responsibilities:
• Design, implement, and maintain scalable and efficient cloud-based solutions on Azure.
• Lead initiatives to automate cloud infrastructure.
• Collaborate with teams to integrate best practices in development, code quality, and automation.
• Guide and mentor development teams, providing expertise in DevOps and automation practices.
• Contribute to the design and implementation of cloud applications using serverless architectures, Kubernetes, and event-driven patterns.
• Develop and maintain CI/CD pipelines to streamline deployments, utilizing GitOps methodologies.
• Apply security best practices to design and implement secure authentication and authorization mechanisms.
• Monitor and optimize the performance, scalability, and reliability of cloud applications.
• Stay updated on the latest cloud technologies and development trends, applying new tools and frameworks as needed.
• Ensure software systems meet functional and non-functional requirements while adhering to best practices in software design, testing, and security.
• Foster continuous improvement by sharing knowledge, conducting team reviews, and mentoring junior developers.
Requirements:
• Proven experience as a cloud engineer or in a similar role, with a strong focus on Azure (AWS is a plus).
• Solid experience in software development and DevOps practices.
• Expertise in Azure infrastructure automation.
• Proficiency in programming languages such as Python, Golang, or JavaScript.
• Experience with serverless architectures, Kubernetes, and event-driven patterns.
• Knowledge of CI/CD pipelines and GitOps methodologies.
• Strong understanding of cloud security best practices.
• Excellent problem-solving skills and the ability to work collaboratively in a team environment.
• Strong communication skills and the ability to convey complex technical concepts to non-technical stakeholders.
Preferred Qualifications:
• Experience in designing and working with NoSQL databases such as DynamoDB.
• Experience in leading and mentoring development teams.
• Expertise in software architecture, development, and systems testing with a strong focus on cloud technologies.
• Strong technical guidance and decision-making abilities to shape solutions and enforce development best practices.
• Proficiency in applying quality gates, including code reviews, pair programming, and team review meetings.
• Experience in code management and release processes, with familiarity with monorepo and multirepo strategies.
• Solid understanding of functional programming principles, including list/map/reduce/compose techniques and familiarity with monads (see the sketch after this listing).
• Knowledge of the SDLC, and adherence to DRY, KISS, and SOLID design principles.
• Proficiency in managing security protocols such as ABAC, RBAC, JWT, SAML, AAD, and OIDC for authentication and authorization.
• Expertise in event-driven architecture, including queues, streams, batches, and pub/sub systems.
• Strong understanding of scalability, concurrency, and distributed systems.
• Experience with cloud networking and proxies.
• Expertise in CI/CD pipelines, GitFlow, and GitOps frameworks like Flux and ArgoCD.
• Polyglot programmer with expert-level proficiency in at least two languages (e.g., Python, TypeScript, GoLang).
• Experience operating Kubernetes clusters from a developer's perspective, including custom CRDs, operators, and controllers.
• Experience building serverless cloud applications.
• Strong team player with the ability to communicate and collaborate well in a fast-paced, collaborative environment.
• Proficiency in using GitHub for version control, code reviews, and collaborative development.
• Experience working in agile teams, participating in sprints, and collaborating effectively in cross-functional teams.
• Familiarity with basic AI tools is considered an advantage.
Allianz Group is one of the most trusted insurance and asset management companies in the world. Caring for our employees, their ambitions, dreams and challenges, is what makes us a unique employer. Together we can build an environment where everyone feels empowered and has the confidence to explore, to grow and to shape a better future for our customers and the world around us. We at Allianz believe in a diverse and inclusive workforce and are proud to be an equal opportunity employer. We encourage you to bring your whole self to work, no matter where you are from, what you look like, who you love or what you believe in. We therefore welcome applications regardless of ethnicity or cultural background, age, gender, nationality, religion, disability or sexual orientation. Join us. Let's care for tomorrow.
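The functional-programming qualification above names list/map/reduce/compose techniques. Since the role lists Golang among its languages, here is a minimal sketch of those combinators using Go generics; `Map`, `Reduce`, and `Compose` are illustrative helper names, not any particular library's API:

```go
package main

import "fmt"

// Map applies f to every element of in, returning a new slice.
func Map[T, U any](in []T, f func(T) U) []U {
	out := make([]U, 0, len(in))
	for _, v := range in {
		out = append(out, f(v))
	}
	return out
}

// Reduce folds in into a single value, starting from init.
func Reduce[T, U any](in []T, init U, f func(U, T) U) U {
	acc := init
	for _, v := range in {
		acc = f(acc, v)
	}
	return acc
}

// Compose returns g∘f: the result of f fed into g.
func Compose[A, B, C any](f func(A) B, g func(B) C) func(A) C {
	return func(a A) C { return g(f(a)) }
}

func main() {
	double := func(x int) int { return x * 2 }
	label := func(x int) string { return fmt.Sprintf("#%d", x) }
	labelDouble := Compose(double, label)

	nums := []int{1, 2, 3, 4}
	labels := Map(nums, labelDouble)
	total := Reduce(nums, 0, func(acc, x int) int { return acc + x })
	fmt.Println(labels, total) // [#2 #4 #6 #8] 10
}
```

Go's standard library does not ship Map/Reduce-style helpers, so teams typically hand-roll small generic utilities like these.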
Posted 1 day ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
mthree is seeking a Java Developer to join a highly regarded multinational investment bank and financial services company.
Job Description:
Role: Java Developer
Team: Payment Gateway
Location: Pune (hybrid model with 2-3 days per week in the office)
Key Responsibilities:
• Develop and Maintain Applications: Design, develop, and maintain server-side applications using Java 8 to ensure high performance and responsiveness to requests from the front end.
• Scalability Solutions: Architect and implement scalable solutions for client risk management, ensuring the system can handle large volumes of transactions and data.
• Data Streaming and Caching: Utilize Kafka or Redis for efficient data streaming and caching, ensuring real-time data processing and low-latency access.
• Multithreading and Synchronization: Implement multithreading and synchronization techniques to enhance application performance and ensure thread safety (see the worker-pool sketch after this listing).
• Microservices Development: Develop and deploy microservices using Spring Boot, ensuring modularity and ease of maintenance.
• Design Patterns: Apply design patterns to solve complex software design problems, ensuring code reusability and maintainability.
• Linux Optimization: Ensure applications are optimized for Linux environments, including performance tuning and troubleshooting.
• Collaboration: Collaborate with cross-functional teams, including front-end developers, QA engineers, and product managers, to define, design, and ship new features.
• Troubleshooting: Troubleshoot and resolve production issues, ensuring minimal downtime and optimal performance.
Requirements:
• Educational Background: Bachelor's degree in computer science, engineering, or a related field.
• Programming Expertise: Proven experience (circa 2-5 years) in Java 8+ programming, with a strong understanding of object-oriented principles and design.
• Data Technologies: Understanding of Kafka or Redis (or a similar cache), including setup, configuration, and optimization.
• Concurrency: Experience with multithreading and synchronization, ensuring efficient and safe execution of concurrent processes.
• Frameworks: Proficiency in Spring Boot, including developing RESTful APIs and integrating with other services.
• Design Patterns: Familiarity with design patterns and their application in solving software design problems.
• Operating Systems: Solid understanding of Linux operating systems, including shell scripting and system administration.
• Problem-Solving: Excellent problem-solving skills and attention to detail, with the ability to debug and optimize code.
• Communication: Strong communication and teamwork skills, with the ability to work effectively in a collaborative environment.
Preferred Qualifications:
• Industry Experience: Experience in the financial services industry is a plus.
• Additional Skills: Knowledge of other programming languages and technologies, such as Python or Scala.
• DevOps Practices: Familiarity with DevOps practices and tools, including CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes).
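The role itself is Java 8-centric; purely as an illustration of the multithreading-and-synchronization responsibility above, here is a minimal worker-pool sketch in Go, where `process` is a hypothetical stand-in for per-transaction work and the pool size is arbitrary:

```go
package main

import (
	"fmt"
	"sync"
)

// process stands in for per-transaction work (parsing, risk checks, etc.).
func process(txID int) string {
	return fmt.Sprintf("tx-%d: ok", txID)
}

func main() {
	const workers = 4
	jobs := make(chan int)
	results := make(chan string)

	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for id := range jobs {
				results <- process(id)
			}
		}()
	}

	// Close results once every worker has finished.
	go func() {
		wg.Wait()
		close(results)
	}()

	// Feed a batch of transaction IDs, then signal no more work.
	go func() {
		for id := 1; id <= 10; id++ {
			jobs <- id
		}
		close(jobs)
	}()

	for r := range results {
		fmt.Println(r)
	}
}
```

The same shape maps onto Java via an `ExecutorService` with a fixed thread pool and a blocking queue.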
Posted 1 day ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Responsibilities:
• Successfully and independently deliver large-size projects, including scoping, planning, design, development, testing, rollout, and maintenance.
• Write clean, concise, modular, and well-tested code.
• Review code from junior engineers and provide constant and constructive feedback.
• Contribute to building and maintaining documentation related to the team's projects.
• Create high-quality, loosely coupled, reliable, and extensible technical designs. Actively understand trade-offs between different designs and apply the solution suited to the situation and requirements.
• Participate in the team's on-call rotation and lead the troubleshooting and resolution of any issues related to the services, work sub-streams, and products owned by your team.
• Constantly improve the health and quality of the services and code you work on, through set practices and new initiatives.
• Lead cross-team collaborations for the projects you work on.
• Support hiring and onboarding activities, coach and develop junior members of your team, and contribute to knowledge sharing.
Must-Have Qualifications and Experience:
• 4-6 years of hands-on experience in designing, developing, testing, and deploying small to mid-scale applications in any language or stack.
• 2+ years of recent and active software development experience.
• Good understanding of Golang. Able to use Go concurrency patterns and contribute to building reusable Go components (see the fan-out/fan-in sketch after this listing).
• Strong experience in designing loosely coupled, reliable, and extensible distributed services.
• Great understanding of clean architecture, S.O.L.I.D. principles, and event-driven architecture.
• Experience with message-broker services like SQS, Kafka, etc.
• Strong data-modeling experience in relational databases.
• Strong cross-team collaboration and communication skills.
• Self-driven, with a passion for learning new things quickly, solving challenging problems, and getting better with support from your manager.
Nice to Have:
• A bachelor's degree in computer science, information technology, or equivalent education.
• Experience with NoSQL databases.
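As a concrete example of the Go concurrency patterns this listing asks for, here is a minimal fan-out/fan-in pipeline sketch: one producer, two parallel `square` stages sharing the input channel, and a `merge` step that collects the results (all names are illustrative):

```go
package main

import (
	"fmt"
	"sync"
)

// generate emits values on a channel and closes it when done.
func generate(nums ...int) <-chan int {
	out := make(chan int)
	go func() {
		defer close(out)
		for _, n := range nums {
			out <- n
		}
	}()
	return out
}

// square is one pipeline stage; several copies can run against the same input.
func square(in <-chan int) <-chan int {
	out := make(chan int)
	go func() {
		defer close(out)
		for n := range in {
			out <- n * n
		}
	}()
	return out
}

// merge fans multiple result channels back into one (fan-in).
func merge(chans ...<-chan int) <-chan int {
	out := make(chan int)
	var wg sync.WaitGroup
	for _, c := range chans {
		wg.Add(1)
		go func(c <-chan int) {
			defer wg.Done()
			for n := range c {
				out <- n
			}
		}(c)
	}
	go func() {
		wg.Wait()
		close(out)
	}()
	return out
}

func main() {
	in := generate(1, 2, 3, 4, 5)
	// Fan out: two square stages pull from the same input channel.
	a, b := square(in), square(in)
	for v := range merge(a, b) {
		fmt.Println(v)
	}
}
```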
Posted 1 day ago
1.0 - 3.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Key Responsibility Areas:
• You will be the engine that drives the engineering productivity of the team. You will own either a product or a feature development end to end: creating a solution, implementing it, ensuring exemplary quality, and supporting the deployment and ongoing support work in the live environment.
• You will be responsible for the performance, security, scalability, reusability, and quality of the applications under you. You will collaborate with senior engineering managers/architects to refine and improve your solutions. You will cascade the same culture across the team by supporting and mentoring team members, helping build truly scalable applications.
• You will collaborate closely with product managers, designers, and business stakeholders to achieve predictable results at a rapid pace.
• You will handle competing priorities and chart the best course of action for yourself and your team members, ensuring blockers are removed.
• You will be a key person in ensuring that the agile processes created by the team are adhered to, and you will actively contribute to the improvement of these processes. You will be responsible for estimating your tasks and ensuring that promised timelines are met.
• You will help attract and evaluate tech talent to build the engineering brand of the organization.
Desired Background & Skills:
• 1-3 years of experience as an engineer at top-tier digital product companies across consumer or SMB.
• Passion to work in an exciting, fast-paced environment. A conscientious, curious, hard-working individual who craves accountability and loves to solve complex problems.
• Knowledge of parallel processing using reactive and asynchronous programming is a must. You should have worked with coroutines, goroutines, high concurrency, or optimizing the number of threads for processing on your backend application server (see the bounded-concurrency sketch after this listing).
• Extremely proficient at writing performant, scalable, production-ready code in Kotlin, Java (8 or above, comfortable with concepts like lambdas, streams, and multithreading), Elixir, Rust, or Golang. You should be familiar with static code analysis, code coverage, and code reviews.
• Experience with NoSQL DBs like MongoDB, Cassandra, Redis, or Aerospike is required. Knowledge of an RDBMS like Postgres or MySQL is also required. You should have worked on query tuning and database optimization using composite and partial indexes, know how to optimize for both the write and read paths, and have compared various databases and made a choice between them based on the use case.
• Must have taken a keen interest in technology trends outside your own work; you should know at least two cutting-edge technologies beyond it.
• Understanding of HTML/CSS/Angular/ReactJS front-end technologies is a plus.
• Experience building applications integrated with CI/CD pipelines (automated builds, tests, and deployments) is mandatory.
• Exposure to microservices, with a good understanding of containers, Kubernetes, logging, alerting, monitoring, and inter-service communication. You should know how to handle authentication and authorization around REST APIs and how to manage multiple versions of them.
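On the "optimized number of threads" point: a common idiom is to cap in-flight work with a semaphore instead of spawning unbounded concurrency. A minimal Go sketch, where the limit of 3 and the request loop are illustrative:

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	const maxInFlight = 3 // a tuned bound on concurrency, rather than one goroutine per request
	sem := make(chan struct{}, maxInFlight)
	var wg sync.WaitGroup

	for req := 1; req <= 10; req++ {
		wg.Add(1)
		sem <- struct{}{} // acquire a slot; blocks while maxInFlight tasks are running
		go func(id int) {
			defer wg.Done()
			defer func() { <-sem }() // release the slot when done
			fmt.Printf("handling request %d\n", id)
		}(req)
	}
	wg.Wait()
}
```

The same idea appears as coroutine dispatchers with bounded parallelism in Kotlin or fixed-size thread pools in Java.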
Posted 1 day ago
3.0 - 6.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job hiring for LXME (Laxmi)
Job Title: Senior Backend Developer
Company: LXME
Location: Mumbai (onsite role)
Must have:
a. 3-6 years of work experience
b. Strong background in Java or GoLang
Tech Stack:
Languages: Golang, Java
Cloud: AWS (S3, EC2, ECS, RDS, Lambda, API Gateway, SQS, SNS)
Databases: PostgreSQL, Redis
DevOps & Infra: Docker, ECS, Terraform, Bitbucket Pipelines
Monitoring Tools: New Relic, AWS CloudWatch
Responsibilities:
• Design, build, and maintain scalable and high-performing backend services using GoLang or Java (see the REST-endpoint sketch after this listing)
• Drive end-to-end architecture, design, and implementation of backend systems
• Champion clean-code practices, robust system design, and performance optimization
• Mentor and guide new joiners to bring them up to speed
• Collaborate closely with cross-functional teams including Product, QA, and DevOps
• Set up and manage CI/CD pipelines, infrastructure-as-code, and deployment workflows
• Monitor and enhance application performance, system reliability, and latency
• Implement comprehensive API and infrastructure monitoring, alerting, and logging
• Work with both SQL and NoSQL databases to optimize data storage and access
• Influence and shape engineering best practices, standards, and team processes
Requirements:
• 3-6 years of hands-on backend development experience using Golang or Java
• Deep understanding of RESTful APIs, system design, and microservices architecture
• Experience with AWS, GCP, or Azure cloud services and container-based deployments
• Experience with CI/CD tools, Git workflows, and infrastructure automation
• Willingness to learn from senior engineers, take feedback well, and work towards continuous improvement
• Experience with or knowledge of database design, query tuning, and caching strategies
• A mindset focused on automation, efficiency, and scalability
• Proven debugging and performance-tuning skills
• Excellent written and verbal communication skills and strong documentation habits
Nice to Have:
• Background in fintech, payments, or investment platforms
• Experience with advanced concurrency and performance optimizations
• Familiarity with event-driven architectures and message brokers (Kafka, RabbitMQ)
• Knowledge of security best practices in backend development
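Both LXME roles center on backend services in Go or Java. As a minimal illustration of the RESTful-API requirement, here is a Go `net/http` endpoint serving a JSON resource; the `Account` type and route are hypothetical:

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// Account is a hypothetical resource; the fields are illustrative only.
type Account struct {
	ID      string  `json:"id"`
	Balance float64 `json:"balance"`
}

// getAccount handles GET requests and writes the resource as JSON.
func getAccount(w http.ResponseWriter, r *http.Request) {
	if r.Method != http.MethodGet {
		http.Error(w, "method not allowed", http.StatusMethodNotAllowed)
		return
	}
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(Account{ID: "acct-1", Balance: 42.50})
}

func main() {
	http.HandleFunc("/accounts/acct-1", getAccount)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

In production such a handler would sit behind the monitoring and CI/CD tooling the posting lists (New Relic, CloudWatch, Bitbucket Pipelines).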
Posted 1 day ago
1.0 - 3.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job hiring for LXME (Laxmi)
Job Title: Junior Backend Developer
Company: LXME
Location: Mumbai (onsite role)
Experience: 1-3 years
Must have:
a. 1-3 years of work experience
b. Strong background in Java or GoLang
Tech Stack:
Languages: Golang, Java
Cloud: AWS (S3, EC2, ECS, RDS, Lambda, API Gateway, SQS, SNS)
Databases: PostgreSQL, Redis
DevOps & Infra: Docker, ECS, Terraform, Bitbucket Pipelines
Monitoring Tools: New Relic, AWS CloudWatch
Responsibilities:
• Design, build, and maintain scalable and high-performing backend services using GoLang or Java
• Drive end-to-end architecture, design, and implementation of backend systems
• Champion clean-code practices, robust system design, and performance optimization
• Collaborate closely with cross-functional teams including Product, QA, and DevOps
• Set up and manage CI/CD pipelines, infrastructure-as-code, and deployment workflows
• Monitor and enhance application performance, system reliability, and latency
• Implement comprehensive API and infrastructure monitoring, alerting, and logging
• Work with both SQL and NoSQL databases to optimize data storage and access
• Influence and shape engineering best practices, standards, and team processes
Requirements:
• 1-3 years of hands-on backend development experience using Golang or Java
• Deep understanding of RESTful APIs, system design, and microservices architecture
• Experience with AWS, GCP, or Azure cloud services and container-based deployments
• Experience with CI/CD tools, Git workflows, and infrastructure automation
• Willingness to learn from senior engineers, take feedback well, and work towards continuous improvement
• Experience with or knowledge of database design, query tuning, and caching strategies
• A mindset focused on automation, efficiency, and scalability
• Proven debugging and performance-tuning skills
• Excellent written and verbal communication skills and strong documentation habits
Nice to Have:
• Background in fintech, payments, or investment platforms
• Experience with advanced concurrency and performance optimizations
• Familiarity with event-driven architectures and message brokers (Kafka, RabbitMQ)
• Knowledge of security best practices in backend development
Posted 1 day ago
3.0 years
0 Lacs
India
On-site
Experience Level: Software Engineer, 3-5 years of relevant experience in data engineering.
About Forage AI: Forage AI is a pioneering AI-powered data extraction and automation company that transforms complex, unstructured web and document data into clean, structured intelligence. Our platform combines web crawling, NLP, LLMs, and agentic AI to deliver highly accurate firmographic and enterprise insights across numerous domains. Trusted by global clients in finance, real estate, and healthcare, Forage AI enables businesses to automate workflows, reduce manual rework, and access high-quality data at scale.
About the Role: We are seeking a talented and growth-minded Software Engineer who is passionate about building, scaling, and modernizing impactful business systems. In this full-time position, you'll work on a mix of new feature development, system enhancements, and platform-modernization initiatives, while ensuring the stability and reliability of our core products. You'll have the chance to experiment with and implement GenAI-driven approaches, innovations, and new strategies as we continue to evolve our solutions. We value engineers who express their best technical talent, contribute innovative ideas to overall product development, and propose creative automation solutions across our engineering efforts. This is an opportunity to propose and lead technical improvements, contribute to architectural evolution, and make your mark on products used by real customers. Structured onboarding and domain training will be provided to set you up for success.
Key Responsibilities:
• Design and develop new features and enhancements for our core platforms and services.
• Contribute to the modernization and refactoring of existing systems and code bases.
• Propose, experiment with, and implement GenAI-based innovations, tools, or workflows to solve business problems and create new value.
• Drive technical improvements, suggest and lead platform upgrades, and champion engineering best practices.
• Investigate, debug, and resolve production issues to deliver robust solutions that improve performance and maintainability.
• Collaborate with product, engineering, and business teams to deliver impactful end-to-end solutions.
• Help maintain the reliability, scalability, and security of our systems as we evolve our platform.
Technical Skills & Requirements:
• Python: Strong hands-on experience with core Python and its standard libraries.
• Web frameworks: Experience developing backend APIs and services using FastAPI/Flask.
• RabbitMQ: Proficiency in implementing message queues and task distribution.
• Docker & Docker Compose: Experience containerizing applications and managing services.
• Web scraping: Demonstrated expertise with web-scraping tools and techniques.
• Playwright & Selenium: Practical experience with browser-automation frameworks.
• PostgreSQL & SQLAlchemy: Proficient in SQL and ORM-based data access.
• MongoDB & PyMongo: Experience with NoSQL data using PyMongo.
• Redis: Practical use for caching and lightweight messaging.
• AWS: Hands-on experience with services such as S3, Secrets Manager, and Auto Scaling Groups (using boto3).
• Linux/Unix: Comfort with command-line operations and scripting.
• Git: Proficient in version control and team collaboration.
• Concurrency: Experience with multithreading and asynchronous programming in Python.
• API integrations: Familiarity with integrating third-party APIs, including authentication, data handling, and rate limiting (see the rate-limiting sketch after this listing).
Good to have: Exposure to GenAI/LLMs (OpenAI, HuggingFace, etc.), prompt engineering, or integrating GenAI-powered features into products.
Other Infrastructure Requirements: Since this is a completely work-from-home position, you will also require the following:
● High-speed internet connectivity for video calls and efficient work.
● A capable business-grade computer (e.g., modern processor, 8 GB+ of RAM, and no other obstacles to uninterrupted, efficient work).
● Headphones with clear audio quality.
● A stable power connection and backups in case of internet/power failure.
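The stack here is Python, but the rate-limiting pattern the requirements mention transfers directly across languages. As a language-neutral illustration, a Go sketch that paces outbound third-party API calls with a ticker; the URLs and the 2-requests-per-second cap are hypothetical:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// Hypothetical endpoints; most third-party APIs publish a requests-per-second cap.
	urls := []string{
		"https://api.example.com/a",
		"https://api.example.com/b",
		"https://api.example.com/c",
	}

	limiter := time.NewTicker(500 * time.Millisecond) // at most 2 requests per second
	defer limiter.Stop()

	client := &http.Client{Timeout: 10 * time.Second}
	for _, u := range urls {
		<-limiter.C // wait for the next tick before each call
		resp, err := client.Get(u)
		if err != nil {
			fmt.Println("request failed:", err)
			continue
		}
		fmt.Println(u, "->", resp.Status)
		resp.Body.Close()
	}
}
```

In Python the equivalent is usually an asyncio semaphore or a token-bucket helper around the HTTP client.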
Posted 1 day ago
12.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Objective of the Role: As an Engineering Manager, you serve as the technical anchor for an engineering team. You create, own, and are responsible for the application architecture that best serves a product in its functional and non-functional needs.
You will:
• Lead, coach, and mentor a team of 3-4 leads and 25-35 top-notch engineers so they all learn, grow, and succeed; bring strong people-leadership, mentoring, and communication skills.
• Plan and prioritize work for your team, including collaborating with cross-border teams.
• Provide oversight and accountability for technical decisions.
• Create an inclusive environment that attracts and retains high-performing engineers.
• Collaborate with developers, program managers, QA, and DevOps engineers in an agile development environment.
• Constantly learn and grow as an engineer and an engineering leader.
• Demonstrate a passion for customers and technology.
You must have:
• 12+ years of software development experience in a fast-paced product organisation.
• Experience leading product engineering and platform teams.
• Hands-on experience building microservices architectures, applying design patterns, and developing frameworks.
• Experience developing and leading teams building solutions with cloud technologies (AWS and/or Azure).
• Extensive experience using open-source technologies: proficiency in any of the languages (preferably Java), front-end technologies (React/Angular), NoSQL databases (MongoDB, Cassandra), Elasticsearch, caching (Redis, Aerospike), and using containers, e.g., Docker at scale (Kubernetes, EKS, etc.).
• Good exposure to development tools: Git, Jenkins, code-review tools, and introducing best coding practices.
• Strong hands-on experience enabling a CI/CD pipeline, canary deployments, and blue-green deployments in production (see the traffic-split sketch after this listing).
• Work with the support team to define how applications are supported in production, including system performance and monitoring; strong database skills.
• The ability to develop and maintain strong relationships with both internal and external customers.
• A consistent track record of solution delivery and quality improvements; a champion for agile scrum adoption.
Big pluses if you:
• Have a strong focus on business outcomes.
• Are comfortable with collaboration, open communication, and reaching across functional borders.
• Are self-motivated and can get things done.
• Have the ability to communicate and defend your ideas clearly.
• Have strong knowledge of threading, concurrency, scaling, and high availability.
• Have a desire to build products that users love.
• Stay current with new and evolving technologies via formal training and self-directed education.
Please include whatever info you believe is relevant: resume, LinkedIn profile, GitHub profile, etc.
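Canary and blue-green deployments are usually handled by the platform (load balancer, service mesh, or ingress controller), but the core mechanic is a weighted traffic split. A minimal Go sketch using `httputil.ReverseProxy`, where the two backend addresses and the 5% weight are hypothetical:

```go
package main

import (
	"log"
	"math/rand"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	// Hypothetical backends: the current release and the canary build.
	stable, _ := url.Parse("http://localhost:9000")
	canary, _ := url.Parse("http://localhost:9001")
	stableProxy := httputil.NewSingleHostReverseProxy(stable)
	canaryProxy := httputil.NewSingleHostReverseProxy(canary)

	const canaryWeight = 0.05 // route roughly 5% of requests to the canary

	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		if rand.Float64() < canaryWeight {
			canaryProxy.ServeHTTP(w, r)
			return
		}
		stableProxy.ServeHTTP(w, r)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Promoting the canary then amounts to raising the weight to 1.0, while blue-green is the degenerate case of flipping it between 0 and 1 atomically.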
Posted 1 day ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
At Publicis Sapient, we're at the forefront of revolutionizing the future of product engineering with state-of-the-art, scalable innovations. If you're an Associate Software Development Engineer seeking your next transformative challenge, we have an incredible opportunity for you: our team utilizes advanced artificial intelligence and machine learning methodologies to design and implement intelligent, adaptive solutions that tackle complex real-world challenges.
Your Impact:
• You will work in the spirit of agile and a product-engineering mindset: delivering the sprint outcomes, iteratively and incrementally, following the agile ceremonies.
• You are expected to write clean, modular, production-ready code and take it through the production and post-production lifecycle.
• You will groom the stories functionally and help define the acceptance criteria (functional and non-functional/NFRs).
• You will have the breadth of concepts, tools, and technologies to address NFRs like security, performance, reliability, and maintainability, and understand the need for trade-offs.
• You will bring in expertise to optimize and make the relevant design decisions (considering trade-offs) at the module/component level.
• You will manage the product lifecycle from requirements gathering and feasibility analysis through high-level and low-level design, development, user acceptance testing (UAT), and staging deployment.
Qualifications
Your Skills & Experience:
• You have 2+ years of professional experience building large-scale, large-volume services and distributed apps, taking them through production and post-production life cycles.
• You use more than one programming language, with expertise in at least one, covering concepts such as memory management, GC, templates/generics, closures, multithreading, sync/async, and blocking/non-blocking execution styles.
• You practice imperative and functional programming styles.
• You are aware of cloud platforms like AWS, GCP, Azure, etc.
• You are a problem solver, choosing the relevant data structures and algorithms with an eye on time and space complexity.
• You apply SOLID and DRY design principles and design patterns, and practice clean code.
• You are an expert at string manipulation, date/time arithmetic, collections, and generics.
• You practice and guide on handling failures: error management and exception handling (see the error-wrapping sketch after this listing).
• You build reliable and high-performance apps leveraging eventing, streaming, concurrency, multithreading, and synchronization libraries and frameworks.
• You develop web apps using HTML, CSS, JavaScript, and relevant frameworks (Angular, React, Vue).
• You design and build microservices from the ground up, considering all NFRs and applying DDD and bounded contexts.
• You use one or more databases (RDBMS or NoSQL) based on your needs.
• You deploy to production, troubleshoot problems, and provide live support.
• You understand the significance of security aspects and compliance with data, code, and application security policies; you write secure code to prevent known vulnerabilities; you understand HTTPS/TLS, symmetric/asymmetric cryptography, and certificates.
• You use one or more web application frameworks: Spring, Spring Boot, or Micronaut (Java); Flask or Django (Python); Express, Meteor, or Koa (Node); ASP.NET MVC, WebApi, or Nancy (.NET).
• You use one or more messaging platforms (e.g., JMS/RabbitMQ/Kafka/Tibco/Camel).
• You use mocks and stubs and related frameworks (Moq).
• You use logging frameworks like Log4j, NLog, etc.
• You use build tools like MSBuild, Maven, Gradle, Gulp, etc.
• You understand and use containers and virtualization.
• You use proactive monitoring, alerting, and dashboards.
• You use logging/monitoring solutions (Splunk, ELK, Grafana).
Additional Information
Set Yourself Apart With:
• You understand infra as code (cattle over pets).
• You understand reactive programming concepts and actor models, and use RxJava, Spring Reactor, Akka, Play, etc.
• You are able to set up a CI/CD pipeline infrastructure and stack from the ground up.
• You are able to articulate the pros and cons of designs and trade-offs.
• You are aware of distributed tracing, debugging, and troubleshooting.
• You are aware of side-car and service-mesh usage along with microservices.
• You are aware of distributed and cloud design patterns and architectural styles.
• You are aware of gateways, load balancers, CDNs, and edge caching.
• You are aware of Gherkin and Cucumber for BDD automation.
• You are aware of performance-testing tools like JMeter and Gatling.
• You are aware of one search solution like Elasticsearch, SOLR, or Endeca.
• You are aware of one distributed caching solution like Redis, Memcached, etc.
• You are aware of a rules engine like Drools, Easy Rules, etc.
Benefits of Working Here:
• Gender-neutral policy.
• 18 paid holidays throughout the year.
• Generous parental leave and a new-parent transition program.
• Flexible work arrangements.
• Employee Assistance Programs to help you with wellness and well-being.
A Tip From The Hiring Manager: Software Development Engineers (ASDE-2) are bright, talented, and motivated young minds with strong technical skills, developing software applications and services that make life easier for customers. The ASDE-2 is expected to work with an agile team to develop, test, and maintain digital business applications.
Company Description: Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting, and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of the next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting, and customer obsession to accelerate our clients' businesses through designing the products and services their customers truly value.
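On the error-management point: in Go, the counterpart to exception handling is explicit error wrapping and inspection. A minimal sketch, where `ErrNotFound` and `loadUser` are hypothetical names:

```go
package main

import (
	"errors"
	"fmt"
)

// ErrNotFound is a sentinel error that callers can test for with errors.Is.
var ErrNotFound = errors.New("record not found")

// loadUser is a hypothetical lookup that wraps the sentinel with call-site context.
func loadUser(id string) error {
	// ...imagine a datastore call here...
	return fmt.Errorf("loadUser(%q): %w", id, ErrNotFound)
}

func main() {
	err := loadUser("u-42")
	if errors.Is(err, ErrNotFound) {
		fmt.Println("fallback path:", err) // expected failure: handle gracefully
		return
	}
	fmt.Println("unexpected failure:", err)
}
```

Wrapping with `%w` preserves the original error for `errors.Is`/`errors.As` while adding context, much like exception chaining in Java or Python.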
Posted 1 day ago