
62 Low Latency Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

10.0 - 14.0 years

11 - 16 Lacs

Hyderabad, Bengaluru

Work from Office

The Oracle Cloud Infrastructure (OCI) Security Services organization presents a rare opportunity to contribute to the development of next-generation, AI-driven cybersecurity solutions at cloud scale. This effort centers on ingesting and processing massive volumes of telemetry and security event data across OCI, leveraging advanced techniques including generative AI (GenAI), large language models (LLMs), and machine learning (ML) to build intelligent detection, response, and mitigation systems. The goal is to deliver autonomous, adaptive security capabilities that protect OCI, Oracle, and our global customer base against evolving threat landscapes. You will assist in the development of short-, medium-, and long-term plans to achieve strategic objectives. The position regularly interacts across functional areas with senior engineering management or executives to ensure that objectives are met. You will have the ability to influence thinking or gain acceptance of others in sensitive situations. Strong written and verbal communication skills are required, along with attention to detail and strong leadership skills.

Why Join OCI? Impact billions of users by securing one of the world's largest cloud platforms. Work alongside industry leaders in AI, distributed systems, and cybersecurity. Competitive compensation, flexible work models, and career growth in cutting-edge technologies.

Qualifications: 10+ years of experience as a software engineer in applications and distributed systems development. Strong experience in building high-throughput/low-latency and/or big data systems. 3+ years of experience in an engineering management role. Experience with at least one modern language such as Java, Python, or C#. Excellent organizational, verbal, and written communication skills. Experience driving hiring, onboarding new engineers, and ongoing performance management.

Preferred Qualifications: Knowledge of ML/AI fundamentals. Experience with high-volume data with low-latency query capabilities. End-to-end ownership of services. Strong technical foundations.

Responsibilities: Setting the technical direction for the architecture in collaboration with other senior technical resources. Developing strategic plans to accomplish technical as well as business objectives in partnership with leaders across various Oracle business groups and customers. Driving innovation through advancement of OCI data services. Management responsibility over multiple software engineering teams. Building high-performing engineering teams through hiring and coaching. Establishing and implementing engineering strategies that align with applicable standards and regulatory practices.

Posted 1 day ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Hyderabad

Work from Office

About The Role
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes, and tools to support a client, project, or entity.
Must have skills: Python (Programming Language)
Good to have skills: MySQL, Data Engineering
Minimum 3 year(s) of experience is required
Educational Qualification: BTECH

Summary: In this role, you will work to increase the domain data coverage and adoption of the Data Platform by promoting a connected user experience through data. You will increase data literacy and trust by leading our Data Governance and Master Data Management initiatives. You will contribute to the vision and roadmap of self-serve capabilities through the Data Platform. The senior data engineer develops data pipelines extracting and transforming data as governed assets into the data platform; improves system quality by identifying issues and common patterns and developing standard operating procedures; and enhances applications by identifying opportunities for improvement, making recommendations, and designing and implementing systems.

Roles and responsibilities:
(MUST HAVE) Extensive experience with cloud data warehouses like Snowflake and AWS Athena, and SQL databases like PostgreSQL and MS SQL Server. Experience with NoSQL databases like AWS DynamoDB and Azure Cosmos is a plus.
(MUST HAVE) Solid experience and clear understanding of DBT.
(MUST HAVE) Experience working with AWS and/or Azure CI/CD DevOps technologies, and extensive debugging experience.
Good understanding of data modeling, ETL, data curation, and big data performance tuning.
Experience with data ingestion tools like Fivetran is a big plus.
Experience with Data Quality and Observability tools like Monte Carlo is a big plus.
Experience working and integrating with an Event Bus like Pulsar is a big plus.
Experience integrating with a Data Catalog like Atlan is a big plus.
Experience with Business Intelligence tools like PowerBI is a plus.
An understanding of unit testing, test-driven development, functional testing, and performance testing.
Knowledge of at least one shell scripting language.
Ability to network with key contacts outside own area of expertise.
Must possess strong interpersonal, organizational, presentation, and facilitation skills.
Must be results-oriented and customer-focused.
Must possess good organizational skills.
Technical experience & professional attributes:
Prepare technical design specifications based on functional requirements and analysis documents.
Implement, test, maintain, and support software based on technical design specifications.
Improve system quality by identifying issues and common patterns and developing standard operating procedures.
Enhance applications by identifying opportunities for improvement, making recommendations, and designing and implementing systems.
Maintain and improve existing codebases and peer review code changes.
Liaise with colleagues to implement technical designs.
Investigate and use new technologies where relevant.
Provide written knowledge transfer material.
Review functional requirements, analysis, and design documents and provide feedback.
Assist customer support with technical problems and questions.
Ability to work independently with wide latitude for independent decision making.
Experience in leading the work of others and mentoring less experienced developers in the context of a project is a plus.
Ability to listen to and understand information and communicate the same.
Participate in architecture and code reviews.
Lead or participate in other projects or duties as the need arises.

Education qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field, or an equivalent combination of education/experience. Master's degree is a plus. 5 years or more of extensive experience developing mission-critical and low-latency solutions. At least 3 years of experience with developing and debugging distributed systems and data pipelines in the cloud.

Additional Information: The Winning Way behaviors that all employees need in order to meet the expectations of each other, our customers, and our partners. Communicate with Clarity - Be clear, concise, and actionable. Be relentlessly constructive. Seek and provide meaningful feedback. Act with Urgency - Adopt an agile mentality - frequent iterations, improved speed, resilience. 80/20 rule: better is the enemy of done. Don't spend hours when minutes are enough. Work with Purpose - Exhibit a "We Can" mindset. Results outweigh effort. Everyone understands how their role contributes. Set aside personal objectives for team results. Drive to Decision - Cut the swirl with defined deadlines and decision points. Be clear on individual accountability and decision authority. Guided by a commitment to and accountability for customer outcomes. Own the Outcome - Defined milestones, commitments, and intended results. Assess your work in context; if you're unsure, ask. Demonstrate unwavering support for decisions.

You will be working with a Trusted Tax Technology Leader committed to delivering reliable and innovative solutions.

COMMENTS: The above statements are intended to describe the general nature and level of work being performed by individuals in this position. Other functions may be assigned, and management retains the right to add or change the duties at any time.

Qualification: BTECH

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a senior software engineer in the APAC Equities Electronic Trading team within the Citi Equities Technology organization, you will play a crucial role in designing, developing, and delivering Citi's next-generation low-latency execution platform in the APAC region. This platform encompasses critical components such as Algorithmic Trading, Smart Order Routing (SOR), Client and Exchange Connectivity, and high-performance market data processing. In this role, you will collaborate with a team of developers and work closely with the product development team, other technology teams, production support, and quality assurance. It is imperative to maintain close alignment with the global strategy to ensure the success of this platform.

Your responsibilities will include core development tasks such as designing, developing, and maintaining the high-performance, low-latency electronic execution platform. You will also collaborate with traders, quant researchers, and clients to understand their requirements and translate them into innovative product features and enhancements. Additionally, you will continuously enhance testing frameworks, development tools, and environments while championing Agile development practices and Continuous Integration/Continuous Delivery (CI/CD) processes. Furthermore, you will be involved in building and maintaining common solutions for trading platform monitoring, trade reconciliation, application recovery, and other essential support functions.

To qualify for this role, you should have 10+ years of experience with a strong technical background and expertise in Java and low-latency development. Proven experience in developing automated trading platforms, and knowledge of Python, automated testing techniques, Agile methodologies, and Continuous Integration processes are essential. You should possess the ability to prioritize multiple tasks, set goals, and meet deadlines. Strong communication skills are necessary, with a proven ability to present and grasp complex concepts in a multicultural environment. A Bachelor's degree/University degree or equivalent experience is required for this position.

If you believe you need a reasonable accommodation to use the search tools or apply for a career opportunity due to a disability, please review Accessibility at Citi. Additionally, you can view Citi's EEO Policy Statement and the Know Your Rights poster for more information.
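Platforms like the one described here (market data processing, order routing) typically lean on allocation-free, lock-free hand-off between threads. As a hedged illustration only, and not part of Citi's actual codebase, the sketch below shows a minimal single-producer/single-consumer ring buffer in Java; the class name and capacity handling are assumptions made for the example.

```java
import java.util.concurrent.atomic.AtomicLong;

/**
 * Illustrative only: a minimal single-producer/single-consumer ring buffer.
 * Sized to a power of two so index wrapping is a cheap bit-mask; one writer
 * and one reader means no locks on the hot path.
 */
public final class SpscRingBuffer<E> {
    private final Object[] buffer;
    private final int mask;
    private final AtomicLong head = new AtomicLong(); // next slot to read
    private final AtomicLong tail = new AtomicLong(); // next slot to write

    public SpscRingBuffer(int capacityPowerOfTwo) {
        if (Integer.bitCount(capacityPowerOfTwo) != 1) {
            throw new IllegalArgumentException("capacity must be a power of two");
        }
        this.buffer = new Object[capacityPowerOfTwo];
        this.mask = capacityPowerOfTwo - 1;
    }

    /** Producer thread only: returns false instead of blocking when full. */
    public boolean offer(E e) {
        long t = tail.get();
        if (t - head.get() == buffer.length) {
            return false; // full; caller decides whether to spin, drop, or back off
        }
        buffer[(int) (t & mask)] = e;
        tail.lazySet(t + 1); // ordered publish without a full fence per write
        return true;
    }

    /** Consumer thread only: returns null when empty. */
    @SuppressWarnings("unchecked")
    public E poll() {
        long h = head.get();
        if (h == tail.get()) {
            return null; // empty
        }
        int idx = (int) (h & mask);
        E e = (E) buffer[idx];
        buffer[idx] = null; // don't retain references to consumed events
        head.lazySet(h + 1);
        return e;
    }
}
```

A design like this trades generality for predictability: restricting the queue to exactly one writer and one reader removes lock contention from the hot path.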

Posted 1 week ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Hyderabad

Work from Office

About The Role
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes, and tools to support a client, project, or entity.
Must have skills: Data Engineering
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: BTECH

Summary: As a lead engineer in this role, you will work to increase the domain data coverage and adoption of the Data Platform by promoting a connected user experience through data. You will increase data literacy and trust by leading our Data Governance and Master Data Management initiatives. You will contribute to the vision and roadmap of self-serve capabilities through the Data Platform. The senior data engineer develops data pipelines extracting and transforming data as governed assets into the data platform; improves system quality by identifying issues and common patterns and developing standard operating procedures; and enhances applications by identifying opportunities for improvement, making recommendations, and designing and implementing systems.

Roles and responsibilities:
(MUST HAVE) Extensive experience with cloud data warehouses like Snowflake and AWS Athena, and SQL databases like PostgreSQL and MS SQL Server. Experience with NoSQL databases like AWS DynamoDB and Azure Cosmos is a plus.
(MUST HAVE) Solid experience and clear understanding of DBT.
(MUST HAVE) Experience working with AWS and/or Azure CI/CD DevOps technologies, and extensive debugging experience.
Good understanding of data modeling, ETL, data curation, and big data performance tuning.
Experience with data ingestion tools like Fivetran is a big plus.
Experience with Data Quality and Observability tools like Monte Carlo is a big plus.
Experience working and integrating with an Event Bus like Pulsar is a big plus.
Experience integrating with a Data Catalog like Atlan is a big plus.
Experience with Business Intelligence tools like PowerBI is a plus.
An understanding of unit testing, test-driven development, functional testing, and performance testing.
Knowledge of at least one shell scripting language.
Ability to network with key contacts outside own area of expertise.
Must possess strong interpersonal, organizational, presentation, and facilitation skills.
Must be results-oriented and customer-focused.
Must possess good organizational skills.

Technical experience & professional attributes:
Prepare technical design specifications based on functional requirements and analysis documents.
Implement, test, maintain, and support software based on technical design specifications.
Improve system quality by identifying issues and common patterns and developing standard operating procedures.
Enhance applications by identifying opportunities for improvement, making recommendations, and designing and implementing systems.
Maintain and improve existing codebases and peer review code changes.
Liaise with colleagues to implement technical designs.
Investigate and use new technologies where relevant.
Provide written knowledge transfer material.
Review functional requirements, analysis, and design documents and provide feedback.
Assist customer support with technical problems and questions.
Ability to work independently with wide latitude for independent decision making.
Experience in leading the work of others and mentoring less experienced developers in the context of a project is a plus.
Ability to listen to and understand information and communicate the same.
Participate in architecture and code reviews.
Lead or participate in other projects or duties as the need arises.

Education qualifications: Bachelor's degree in Computer Science, Information Systems, or a related field, or an equivalent combination of education/experience. Master's degree is a plus. 6 years or more of extensive experience developing mission-critical and low-latency solutions. At least 3 years of experience with developing and debugging distributed systems and data pipelines in the cloud.

Additional Information: The Winning Way behaviors that all employees need in order to meet the expectations of each other, our customers, and our partners. Communicate with Clarity - Be clear, concise, and actionable. Be relentlessly constructive. Seek and provide meaningful feedback. Act with Urgency - Adopt an agile mentality - frequent iterations, improved speed, resilience. 80/20 rule: better is the enemy of done. Don't spend hours when minutes are enough. Work with Purpose - Exhibit a "We Can" mindset. Results outweigh effort. Everyone understands how their role contributes. Set aside personal objectives for team results. Drive to Decision - Cut the swirl with defined deadlines and decision points. Be clear on individual accountability and decision authority. Guided by a commitment to and accountability for customer outcomes. Own the Outcome - Defined milestones, commitments, and intended results. Assess your work in context; if you're unsure, ask. Demonstrate unwavering support for decisions.

COMMENTS: The above statements are intended to describe the general nature and level of work being performed by individuals in this position. Other functions may be assigned, and management retains the right to add or change the duties at any time.

You will be working with a Trusted Tax Technology Leader committed to delivering reliable and innovative solutions.

Qualification: BTECH

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Pune

Hybrid

Seeking a Senior Java Low Latency Developer with expertise in multi-threading, concurrency, and performance optimization. Proficient in Java, data structures, algorithms, and low-latency systems for high-performance applications.

Required candidate profile: Senior Java Low Latency Developer with expertise in multi-threading, concurrency, and optimization. Strong in Java, algorithms, and low-latency systems for high-performance applications.
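Roles like this usually probe how a candidate keeps the hot path free of garbage-collection pauses. Purely as an illustrative sketch under that assumption (the Order fields and pool size are invented for the example, not taken from the posting), here is a tiny object pool that pre-allocates mutable objects and recycles them instead of allocating per message:

```java
import java.util.ArrayDeque;

/**
 * Illustrative only: a tiny pool that recycles mutable Order objects so the
 * hot path allocates nothing and the GC has nothing new to collect.
 * Intended for use from a single processing thread.
 */
final class OrderPool {
    static final class Order {
        long id;
        long priceTicks;
        int quantity;

        void reset() { id = 0; priceTicks = 0; quantity = 0; }
    }

    private final ArrayDeque<Order> free = new ArrayDeque<>();

    OrderPool(int size) {
        for (int i = 0; i < size; i++) {
            free.push(new Order()); // all allocation happens up front
        }
    }

    Order acquire() {
        Order o = free.poll();
        return o != null ? o : new Order(); // allocate only if the pool is exhausted
    }

    void release(Order o) {
        o.reset();   // clear state before reuse
        free.push(o);
    }
}
```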

Posted 1 week ago

Apply

10.0 - 15.0 years

20 - 35 Lacs

Hyderabad, Bengaluru

Hybrid

We're Hiring: PRODUCT ARCHITECT (Manager Level). Location: Bangalore / Hyderabad. Experience: 10-14 Years. Are you an expert in .NET Core, Azure, SQL, Application Support, and Software Architecture? Do you thrive in designing scalable systems, leading complex projects, and driving product innovation? We're looking for a Product Architect with: a strong foundation in software engineering & architecture; hands-on experience with BCD, DFD, UML diagrams, OOP/OOD, data structures & algorithms; and excellent communication & leadership skills. Be part of a team that values passion, empathy, and technical excellence. Apply now or refer someone you know!

Posted 2 weeks ago

Apply

7.0 - 12.0 years

35 - 60 Lacs

Pune

Hybrid

About the Role: PubMatic is looking for engineers who can design and implement next-generation, highly scalable and low-latency ad server features at a scale of 200 billion+ requests per day in our Ad Server. If you get excited about building applications and architecture to handle hundreds of billions of requests per day, managing millions of requests per second, with a creative and fast-paced work environment, competitive pay, great incentives, a culture of teamwork, smart and friendly colleagues, and plenty of opportunity to grow in your career, then you should consider applying for this position.

What You'll Do: Research, learn, design, and build highly reliable, available, and scalable platforms. Use best practices for software development and documentation, assure designs meet requirements, and deliver high-quality work. Demonstrate the ability to self-direct and work independently. Demonstrate work ownership and focus to deliver on time. Be the owner of one or more functionality modules and the point of contact for them. Perform code and design reviews for code implemented by peers or as per the code review process. Work with teams to achieve desired goals. Demonstrate timely and excellent verbal and written communication skills. Be willing to go the extra mile to achieve greater results.

We'd Love for You to Have: Seven plus years of development experience in C/C++ in a Linux/UNIX environment. Experience with the Go language is good to have. Proficiency in the implementation of algorithms and the use of advanced data structures to solve problems in computing. A solid knowledge of the principles of computer science is desired. Good experience in software design and architecture. Good experience in building complex and scalable solutions. Ability to find optimal solutions and innovative ideas. Excellent problem-solving skills. Ability to use generative AI-based tools and IDEs for getting work done, an understanding of different models at a basic level, and prompt engineering basics. Knowledge of OS internals and working experience in system programming (multi-threading, multi-processing, memory management). Ability to troubleshoot any issues with existing features live on production. Ability to write clean, modular, and loosely coupled code. Ability to understand end-to-end product functionality. Working knowledge of scripting in Perl/Python/Shell. Working experience with databases, preferably MySQL. Excellent interpersonal, written, and verbal communication skills. Proficiency in AI-assisted coding, automation, prompt engineering, and an understanding of the strengths and limitations of LLM-generated code is a strong plus. Should have a bachelor's degree in Engineering (CS/IT) or an equivalent degree from well-known institutes/universities.

AI-Enabled Engineering Mindset: We value engineers who actively leverage Generative AI tools and IDEs (e.g. GitHub Copilot, ChatGPT, Claude, Cursor, Windsurf, etc.) to accelerate development, improve code quality, automate repetitive tasks, and enhance documentation and debugging workflows. Engineers who demonstrate AI-first thinking, using these tools to drive faster experimentation, ideation, and technical execution, will thrive in our high-scale, performance-critical environment.

Apply at the link below: https://pubmatic.com/job/?gh_jid=4612185008

Posted 2 weeks ago

Apply

13.0 - 23.0 years

45 - 70 Lacs

Pune

Hybrid

About the Role: PubMatic is looking for engineers who can design and implement next-generation, highly scalable and low-latency ad server features at a scale of 200 billion+ requests per day in our Ad Server. If you get excited about building applications and architecture to handle hundreds of billions of requests per day, managing millions of requests per second, with a creative and fast-paced work environment, competitive pay, great incentives, a culture of teamwork, smart and friendly colleagues, and plenty of opportunity to grow in your career, then you should consider applying for this position.

What You'll Do: Lead a team of engineers at various levels with varied skills and experience to solve complex problems, devise solutions, and motivate/mentor the team to reach the goals of a project/product. Develop a plan by which team members can reach the project goal. Seamlessly coordinate with engineering and product leads to understand the business needs and requirements of the product/platform. Research, learn, design, and build highly reliable, scalable, and low-latency platforms. Bring experience in the development of highly scalable, reliable, low-latency, distributed backend platforms and services. Research emerging technologies and prepare POCs to evaluate them to enhance and upgrade our services. Understand the architecture and designs and come up with innovations and optimizations. Collaborate with other teams to audit and ensure the stability of the AdServer platform. Provide timely review of major designs and architecture changes by the AdServer development team. Collaborate with Data Center and DevOps teams for better planning of new tech, software optimizations, stability improvements, and other optimizations of the overall platform. Use best practices for software development and documentation, assure designs meet requirements, and deliver high-quality work. Apply GenAI in various phases of product development. Demonstrate ownership, the ability to self-direct, work independently, and focus to deliver on time.

Who You Are: Ten plus years of development experience in C/C++ in a Linux/UNIX environment. Experience with the Go language is good to have. Proficiency in the implementation of algorithms and the use of advanced data structures to solve problems in computing. A solid knowledge of the principles of computer science is desired. Good experience in software design and architecture. Good experience in building complex and scalable solutions. Ability to find optimal solutions and innovative ideas. Excellent problem-solving skills. Ability to use generative AI-based tools and IDEs for getting work done, an understanding of different models at a basic level, and prompt engineering basics. Knowledge of OS internals and working experience in system programming (multi-threading, multi-processing, memory management). Ability to troubleshoot any issues with existing features live on production. Ability to write clean, modular, and loosely coupled code. Ability to understand end-to-end product functionality. Working knowledge of scripting in Perl/Python/Shell. Working experience with databases, preferably MySQL. Excellent interpersonal, written, and verbal communication skills. Proficiency in AI-assisted coding, automation, prompt engineering, and an understanding of the strengths and limitations of LLM-generated code is a strong plus. Should have a bachelor's degree in Engineering (CS/IT) or an equivalent degree from well-known institutes/universities.
AI-Enabled Engineering Mindset: We value engineers who actively leverage Generative AI tools and IDEs (e.g. GitHub Copilot, ChatGPT, Claude, Cursor, Windsurf, etc.) to accelerate development, improve code quality, automate repetitive tasks, and enhance documentation and debugging workflows. Engineers who demonstrate AI-first thinking, using these tools to drive faster experimentation, ideation, and technical execution, will thrive in our high-scale, performance-critical environment. Apply at the link below: https://pubmatic.com/job/?gh_jid=4672160008

Posted 2 weeks ago

Apply

3.0 - 6.0 years

37 - 40 Lacs

Noida

Work from Office

Role Summary: The ideal candidate will design and execute profitable trading strategies, leveraging statistical and quantitative methods. You will work closely with related teams and management to refine trading systems and maximize performance across diverse markets.

Key Responsibilities: Design profitable low/high-frequency options market-making strategies. Develop advanced algorithms to efficiently quote and trade options in high-frequency environments. Analyze large datasets to identify profitable trading opportunities. Optimize strategies for consistent profitability. Actively monitor and execute trading strategies across multiple markets and asset classes. Manage and mitigate risks associated with trading activities. Strategy development and optimization: collaborate with the team to create, test, and deploy trading strategies; evaluate the performance of strategies and provide actionable insights; analyze market patterns and identify potential trading opportunities; conduct research on new trading opportunities using statistical and machine learning techniques; stay updated on market trends, trading technologies, and regulatory changes.

Qualifications & Experience: Bachelor's or Master's degree in Finance, Mathematics, Economics, Computer Science, or a related field. Minimum 3 years in quantitative research, statistical analysis, or related fields.

Preferred Skills: Experience in high-frequency trading, options market-making, and managing PnL effectively. Experience with time-series analysis and predictive modeling. Familiarity with low-latency trading infrastructure. Strong understanding of financial instruments and derivatives. Knowledge of risk management and portfolio optimization techniques. Proven ability to develop and implement advanced trading algorithms. Proficiency in Python and R for statistical and data analysis. Familiarity with C++ for performance optimization (preferred). Knowledge of Linux/Unix operating systems. Ability to analyze large datasets to uncover trading opportunities and market inefficiencies. Strong quantitative aptitude and analytical skills. Strong interpersonal and collaboration skills to work within a multidisciplinary team. Effective communication and decision-making abilities under pressure. Experience applying machine learning techniques in financial markets.

What We Offer: A competitive compensation structure based on performance. Access to cutting-edge technology and infrastructure. A collaborative environment with experienced professionals. Opportunities for growth and continuous learning.

Application Process: Interested candidates are encouraged to submit their resume and a cover letter to jobs@stokhos.in.

Posted 2 weeks ago

Apply

11.0 - 15.0 years

12 - 17 Lacs

Mumbai, Pune, Chennai

Work from Office

Responsibility: Modernize the Trade Processing capabilities of the Trade, Tax, and Asset Servicing Platform by migrating from legacy systems to a Java-based cloud application.
Primary Skills: Java, Spring Boot, Kafka, Kubernetes, Oracle, AWS EKS, Lambda, DynamoDB
Secondary Skills: VPC configurations, monitoring through AWS CloudWatch, Cucumber Serenity framework

Job Description: A highly experienced Java Cloud Engineer with 11-15 years of expertise and excellent communication skills. Must have a strong background and hands-on experience in designing and developing high Transactions Per Second (TPS), low-latency business applications using Java, Spring Boot, Kafka, and EKS. Must have proficiency in Java, Spring Boot, Kafka, Kubernetes, Oracle, and AWS EKS, as well as familiarity with managed services like Lambda and DynamoDB. Experience with Kafka to enable the development of event-driven applications that handle high TPS traffic with low latency. Experience building high-frequency, resilient transactional processing applications on a public cloud platform (AWS), achieving low latency, high scalability, and cost savings for the firm. Must be proficient in CI/CD practices and container-based development, and have strong communication skills to drive and participate in meaningful discussions. Experience with industry best practices for designing and developing high-TPS, low-latency business applications using Java, Spring Boot, Kafka, and AWS EKS. Experience with containerization (Docker) and microservices architecture. Experience with industry-standard security and compliance requirements for financial services, including data encryption, secure access controls, regular security audits, and compliance with regulations such as GDPR, CCPA, and PCI-DSS; additionally, AWS security best practices should be followed, including the use of IAM roles and policies, VPC configurations, and monitoring through AWS CloudWatch. Experience with automated, functional, and regression testing using Java and the Cucumber Serenity framework.
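For the event-driven, high-TPS work described, producer tuning matters as much as code structure. The following is a hedged sketch of a Kafka producer configured to favor latency over batching; the broker address, topic name, and payload are placeholders, and the exact settings would depend on the platform's durability requirements:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

/** Illustrative only: a Kafka producer tuned toward latency rather than throughput. */
public final class TradeEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.LINGER_MS_CONFIG, 0);             // send immediately, do not wait to batch
        props.put(ProducerConfig.ACKS_CONFIG, "1");                // leader ack only: lower latency, weaker durability
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "none"); // skip compression CPU on small messages

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("trade-events", "trade-42", "{\"qty\":100}"); // placeholder topic/payload
            // Asynchronous send; the callback runs on the producer's I/O thread.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                }
            });
            producer.flush();
        }
    }
}
```

Choosing acks=1 and linger.ms=0 is a latency-leaning trade-off; a production configuration might well prefer acks=all with idempotence for stronger delivery guarantees.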

Posted 2 weeks ago

Apply

8.0 - 13.0 years

25 - 40 Lacs

Bengaluru

Work from Office

Essential Responsibilities: As a Senior Software Engineer, your responsibilities will include: building, refining, tuning, and maintaining our real-time and batch data infrastructure; daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, FastAPI, etc.; maintaining data quality and accuracy across production data systems; working with Data Analysts to develop ETL processes for analysis and reporting; working with Product Managers to design and build data products; working with our DevOps team to scale and optimize our data infrastructure; participating in architecture discussions, influencing the road map, and taking ownership and responsibility over new projects; and participating in on-call rotation in your respective time zone (be available by phone or email in case something goes wrong).

Desired Characteristics: Minimum 8 years of software engineering experience. An undergraduate degree in Computer Science (or a related field) from a university where the primary language of instruction is English is strongly desired. 2+ years of experience/fluency in Python. Proficient with relational databases and advanced SQL. Expert in the usage of services like Spark and Hive. Experience working with container-based solutions is a plus. Experience with any scheduler such as Apache Airflow, Apache Luigi, Chronos, etc. Experience using cloud services (AWS) at scale. Proven long-term experience and enthusiasm for distributed data processing at scale, and eagerness to learn new things. Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle from inception to production and monitoring. Experience in the Advertising Attribution domain is a plus. Experience in agile software development processes. Excellent interpersonal and communication skills.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Bengaluru

Work from Office

Lead Software Engineer - Backend. We're seeking a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include: developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, environmental, and transactional signals to power people-based marketing; ingesting vast amounts of identity and event data from our customers and partners; facilitating data transfers across systems; ensuring the integrity and health of our datasets; and much more. As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities: As a Lead Software Engineer, your responsibilities will include: building, refining, tuning, and maintaining our real-time and batch data infrastructure; daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc.; maintaining data quality and accuracy across production data systems; working with Data Engineers to optimize data models and workflows; working with Data Analysts to develop ETL processes for analysis and reporting; working with Product Managers to design and build data products; working with our DevOps team to scale and optimize our data infrastructure; participating in architecture discussions, influencing the road map, and taking ownership and responsibility over new projects; and participating in on-call rotation in your respective time zone (be available by phone or email in case something goes wrong).

Desired Characteristics: Minimum 5 years of software engineering experience. Proven long-term experience and enthusiasm for distributed data processing at scale, and eagerness to learn new things. Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle from inception to production and monitoring. Fluency in Python or solid experience in Scala or Java. Proficient with relational databases and advanced SQL. Expert in the usage of services like Spark and Hive. Experience with web frameworks such as Flask and Django. Experience with any scheduler such as Apache Airflow, Apache Luigi, Chronos, etc. Experience using cloud services (AWS) at scale. Experience in agile software development processes. Excellent interpersonal and communication skills.

Nice to have: Experience with large-scale/multi-tenant distributed systems. Experience with columnar/NoSQL databases such as Vertica, Snowflake, HBase, Scylla, Couchbase. Experience with real-time streaming frameworks like Flink and Storm. Experience with open table formats such as Iceberg, Hudi, or Delta Lake.
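The listing names Spark for batch processing and accepts Java alongside Python and Scala, so here is a minimal, hypothetical batch aggregation using Spark's Java API. The input and output paths and column names are invented for the example and are not Zeta's actual schema:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;

/** Illustrative only: count events per day and type from a Parquet dataset. */
public final class DailyEventCounts {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("daily-event-counts")
                .getOrCreate();

        // Placeholder input location and columns.
        Dataset<Row> events = spark.read().parquet("s3a://example-bucket/events/");

        Dataset<Row> counts = events
                .filter(col("event_type").isNotNull())          // drop malformed rows
                .groupBy(col("event_date"), col("event_type"))
                .count();                                        // adds a "count" column per group

        counts.write().mode("overwrite").parquet("s3a://example-bucket/daily_counts/");
        spark.stop();
    }
}
```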

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Pune

Hybrid

Role & responsibilities: Experience in developing large-scale, complex event-driven/reactive systems involving fault-tolerant, globally distributed processes with high-frequency message/event workflows. Experience in Java/C++ building high-performance systems involving concurrency and networking protocols, with strong knowledge of data structures and algorithms. Experience in building ultra-low-latency and/or high-throughput systems in financial markets or big data analytics. Focused on delivering quality solutions following Agile methodologies and test-driven development. Knowledge of CPU/GPU architecture, memory management (shared memory, memory-mapped files), and networking protocols (TCP/UDP), with an understanding of Linux internals. Candidates with Java experience should have knowledge of GC, JNI, Java Unsafe, JNR-FFI, etc. Knowledge of Aeron (preferred). Comfortable using Python/Kotlin/Shell script for tooling. A keen learner who enjoys a challenge and collaborates naturally to take ownership of complex business deliveries. At least one degree in Computer Science, Engineering, Physics, or Mathematics.
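Shared memory and memory-mapped files are called out explicitly here, so as an illustrative sketch only (the file path and single-long layout are assumptions, and a real transport such as Aeron adds framing and memory-ordering guarantees on top), this shows the basic mechanism of publishing a value through a memory-mapped file in Java:

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

/**
 * Illustrative only: bump a sequence number stored in a memory-mapped file.
 * Any process mapping the same file sees the update without a system call
 * per read or write, which is the core of shared-memory IPC transports.
 */
public final class MappedSequenceDemo {
    public static void main(String[] args) throws IOException {
        Path path = Path.of("/tmp/seq.dat"); // placeholder path
        try (FileChannel channel = FileChannel.open(path,
                StandardOpenOption.CREATE, StandardOpenOption.READ, StandardOpenOption.WRITE)) {
            // Map 8 bytes of the file directly into this process's address space.
            MappedByteBuffer buf = channel.map(FileChannel.MapMode.READ_WRITE, 0, Long.BYTES);

            long next = buf.getLong(0) + 1; // read the current value, bump it
            buf.putLong(0, next);           // visible to other processes mapping the same file
            System.out.println("published sequence " + next);
        }
    }
}
```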

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an Emerging Tech Specialist at our Applied Research center, you will focus on applying scientific and technical research to solve practical problems and develop new products or processes within a specific industry or field. Your responsibilities will include conducting research, analyzing data, and developing solutions that can be implemented in real-world settings. You will define the research agenda and collaborate with academia and partners to conduct applied research. Additionally, you will be responsible for building tech playbooks that can be utilized by product and implementation teams.

Your key responsibilities will involve researching emerging tech trends, the ecosystem of players, use cases, and their impact on client businesses. This will include scanning and curating startups, universities, and tech partnerships to create an innovation ecosystem. You will rapidly design and develop Proof of Concepts (PoCs) in emerging tech areas, sharing design specifications with team members, integrating, and testing components. Furthermore, you will contribute to thought leadership by developing showcases that demonstrate the application of emerging technologies in a business context. As part of the Applied Research Center activities, you will contribute to the design, development, testing, and implementation of proof of concepts in emerging tech areas. You will also be involved in problem definition and requirements analysis, developing reusable components, and ensuring compliance with coding standards and secure coding guidelines. Your role will also include innovation consulting, where you will understand client requirements and implement solutions using emerging tech expertise. Additionally, you will be responsible for talent management, mentoring the team to acquire identified emerging tech skills, and participating in demo sessions and hackathons. You will work with startups to provide innovative solutions to client problems and enhance our offerings.

In terms of technical requirements, you should have expertise in various emerging areas such as Advanced AI, New Interaction Models, Platforms and Protocols, Cybersecurity, Quantum, Autonomous Machines, and Emerging Research areas like Brain AGI and Space Semicon. You will also need advanced theoretical knowledge, experimental design expertise, data analysis skills, prototype development capabilities, and research tool proficiency. Preferred skills include experience in User Experience Design, Artificial Intelligence, Cybersecurity Competency Management, Machine Learning, Robotics Algorithms, and X Reality (XR) technologies. Soft skills like a collaborative mindset, communication skills, a problem-solving approach, intellectual curiosity, and commercial awareness will be beneficial for this role.

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Lead Software Engineer - Backend. We're seeking a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include: developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, environmental, and transactional signals to power people-based marketing; ingesting vast amounts of identity and event data from our customers and partners; facilitating data transfers across systems; ensuring the integrity and health of our datasets; and much more. As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities: As a Lead Software Engineer, your responsibilities will include: building, refining, tuning, and maintaining our real-time and batch data infrastructure; daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc.; maintaining data quality and accuracy across production data systems; working with Data Engineers to optimize data models and workflows; working with Data Analysts to develop ETL processes for analysis and reporting; working with Product Managers to design and build data products; working with our DevOps team to scale and optimize our data infrastructure; participating in architecture discussions, influencing the road map, and taking ownership and responsibility over new projects; and participating in on-call rotation in your respective time zone (be available by phone or email in case something goes wrong).

Desired Characteristics: Minimum 5 years of software engineering experience. Proven long-term experience and enthusiasm for distributed data processing at scale, and eagerness to learn new things. Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle from inception to production and monitoring. Fluency in Python or solid experience in Scala or Java. Proficient with relational databases and advanced SQL. Expert in the usage of services like Spark and Hive. Experience with web frameworks such as Flask and Django. Experience with any scheduler such as Apache Airflow, Apache Luigi, Chronos, etc. Experience using cloud services (AWS) at scale. Experience in agile software development processes. Excellent interpersonal and communication skills.

Nice to have: Experience with large-scale/multi-tenant distributed systems. Experience with columnar/NoSQL databases such as Vertica, Snowflake, HBase, Scylla, Couchbase. Experience with real-time streaming frameworks like Flink and Storm. Experience with open table formats such as Iceberg, Hudi, or Delta Lake.

Posted 1 month ago

Apply

8.0 - 13.0 years

7 - 11 Lacs

Bengaluru

Work from Office

As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities: As a Senior Software Engineer, your responsibilities will include: building, refining, tuning, and maintaining our real-time and batch data infrastructure; daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, FastAPI, etc.; maintaining data quality and accuracy across production data systems; working with Data Analysts to develop ETL processes for analysis and reporting; working with Product Managers to design and build data products; working with our DevOps team to scale and optimize our data infrastructure; participating in architecture discussions, influencing the road map, and taking ownership and responsibility over new projects; and participating in on-call rotation in your respective time zone (be available by phone or email in case something goes wrong).

Desired Characteristics: Minimum 8 years of software engineering experience. An undergraduate degree in Computer Science (or a related field) from a university where the primary language of instruction is English is strongly desired. 2+ years of experience/fluency in Python. Proficient with relational databases and advanced SQL. Expert in the usage of services like Spark and Hive. Experience working with container-based solutions is a plus. Experience with any scheduler such as Apache Airflow, Apache Luigi, Chronos, etc. Experience using cloud services (AWS) at scale. Proven long-term experience and enthusiasm for distributed data processing at scale, and eagerness to learn new things. Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle from inception to production and monitoring. Experience in the Advertising Attribution domain is a plus. Experience in agile software development processes. Excellent interpersonal and communication skills.

Posted 1 month ago

Apply

8.0 - 13.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Essential Responsibilities: As a Senior Software Engineer, your responsibilities will include: building, refining, tuning, and maintaining our real-time and batch data infrastructure; daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, FastAPI, etc.; maintaining data quality and accuracy across production data systems; working with Data Analysts to develop ETL processes for analysis and reporting; working with Product Managers to design and build data products; working with our DevOps team to scale and optimize our data infrastructure; participating in architecture discussions, influencing the road map, and taking ownership and responsibility over new projects; and participating in on-call rotation in your respective time zone (be available by phone or email in case something goes wrong).

Desired Characteristics: Minimum 8 years of software engineering experience. An undergraduate degree in Computer Science (or a related field) from a university where the primary language of instruction is English is strongly desired. 2+ years of experience/fluency in Python. Proficient with relational databases and advanced SQL. Expert in the usage of services like Spark and Hive. Experience working with container-based solutions is a plus. Experience with any scheduler such as Apache Airflow, Apache Luigi, Chronos, etc. Experience using cloud services (AWS) at scale. Proven long-term experience and enthusiasm for distributed data processing at scale, and eagerness to learn new things. Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle from inception to production and monitoring. Experience in the Advertising Attribution domain is a plus. Experience in agile software development processes. Excellent interpersonal and communication skills.

Posted 1 month ago

Apply

8.0 - 13.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Essential Responsibilities: As a Senior Software Engineer, your responsibilities will include: building, refining, tuning, and maintaining our real-time and batch data infrastructure; daily use of technologies such as Python, Spark, Airflow, Snowflake, Hive, FastAPI, etc.; maintaining data quality and accuracy across production data systems; working with Data Analysts to develop ETL processes for analysis and reporting; working with Product Managers to design and build data products; working with our DevOps team to scale and optimize our data infrastructure; participating in architecture discussions, influencing the road map, and taking ownership and responsibility over new projects; and participating in on-call rotation in your respective time zone (be available by phone or email in case something goes wrong).

Desired Characteristics: Minimum 8 years of software engineering experience. An undergraduate degree in Computer Science (or a related field) from a university where the primary language of instruction is English is strongly desired. 2+ years of experience/fluency in Python. Proficient with relational databases and advanced SQL. Expert in the usage of services like Spark and Hive. Experience working with container-based solutions is a plus. Experience with any scheduler such as Apache Airflow, Apache Luigi, Chronos, etc. Experience using cloud services (AWS) at scale. Proven long-term experience and enthusiasm for distributed data processing at scale, and eagerness to learn new things. Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle from inception to production and monitoring. Experience in the Advertising Attribution domain is a plus. Experience in agile software development processes. Excellent interpersonal and communication skills.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Senior Software Engineer - Data. We're seeking a Senior Software Engineer or a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include: developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, location, and transactional signals to power people-based marketing; ingesting vast amounts of identity and event data from our customers and partners; facilitating data transfers across systems; ensuring the integrity and health of our datasets; and much more. As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities: As a Senior Software Engineer or a Lead Software Engineer, your responsibilities will include: building, refining, tuning, and maintaining our real-time and batch data infrastructure; daily use of technologies such as Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc.; maintaining data quality and accuracy across production data systems; working with Data Engineers to optimize data models and workflows; working with Data Analysts to develop ETL processes for analysis and reporting; working with Product Managers to design and build data products; working with our DevOps team to scale and optimize our data infrastructure; participating in architecture discussions, influencing the road map, and taking ownership and responsibility over new projects; and participating in on-call rotation in your respective time zone (be available by phone or email in case something goes wrong).

Desired Characteristics: Minimum 5-10 years of software engineering experience. Proven long-term experience and enthusiasm for distributed data processing at scale, and eagerness to learn new things. Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle from inception to production and monitoring. Fluency in Python or solid experience in Scala or Java. Proficient with relational databases and advanced SQL. Expert in the usage of services like Spark and Hive. Experience with web frameworks such as Flask and Django. Experience with any scheduler such as Apache Airflow, Apache Luigi, Chronos, etc. Experience in Kafka or any other stream message processing solution. Experience using cloud services (AWS) at scale. Experience in agile software development processes. Excellent interpersonal and communication skills.

Nice to have: Experience with large-scale/multi-tenant distributed systems. Experience with columnar/NoSQL databases such as Vertica, Snowflake, HBase, Scylla, Couchbase. Experience with real-time streaming frameworks like Flink and Storm. Experience with open table formats such as Iceberg, Hudi, or Delta Lake.

Posted 2 months ago

Apply

7.0 - 12.0 years

12 - 16 Lacs

Bengaluru

Work from Office

We are looking for lead or principal software engineers to join our Data Cloud team. Our Data Cloud team is responsible for the Zeta Identity Graph platform, which captures billions of behavioural, demographic, environmental, and transactional signals for people-based marketing. As part of this team, the data engineer will be designing and growing our existing data infrastructure to democratize data access, enable complex data analyses, and automate optimization workflows for business and marketing operations.

Job Description: Essential Responsibilities: As a Lead or Principal Data Engineer, your responsibilities will include: building, refining, tuning, and maintaining our real-time and batch data infrastructure; daily use of technologies such as HDFS, Spark, Snowflake, Hive, HBase, Scylla, Django, FastAPI, etc.; maintaining data quality and accuracy across production data systems; working with Data Engineers to optimize data models and workflows; working with Data Analysts to develop ETL processes for analysis and reporting; working with Product Managers to design and build data products; working with our DevOps team to scale and optimize our data infrastructure; participating in architecture discussions, influencing the road map, and taking ownership and responsibility over new projects; and participating in a 24/7 on-call rotation (be available by phone or email in case something goes wrong).

Desired Characteristics: Minimum 7 years of software engineering experience. Proven long-term experience and enthusiasm for distributed data processing at scale, and eagerness to learn new things. Expertise in designing and architecting distributed, low-latency, and scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle from inception to production and monitoring. Fluency in Python or solid experience in Scala or Java. Proficient with relational databases and advanced SQL. Expert in the usage of services like Spark, HDFS, Hive, and HBase. Experience with any scheduler such as Apache Airflow, Apache Luigi, Chronos, etc. Experience using cloud services (AWS) at scale. Experience in agile software development processes. Excellent interpersonal and communication skills.

Nice to have: Experience with large-scale/multi-tenant distributed systems. Experience with columnar/NoSQL databases such as Vertica, Snowflake, HBase, Scylla, Couchbase. Experience with real-time streaming frameworks like Flink and Storm. Experience with web frameworks such as Flask and Django.

Posted 2 months ago

Apply

10.0 - 20.0 years

90 - 100 Lacs

Bengaluru

Hybrid

Roles and Responsibilities: Develop a comprehensive automation strategy aligned with organizational goals. Evaluate and select automation tools, integrating them into the existing framework. Automate network provisioning and configuration, from defining complex MPLS and QoS logic to configuring VLANs using intent-based automation. Define performance and monitoring metrics for automation pipelines. Ensure automation solutions are scalable and optimized to support growing network demands. Work with the Service Owner to document processes and provide training for network administrators and engineers. Continuously evaluate and enhance automation solutions to align with evolving business needs. Scripting and programming knowledge in Java, .NET, Terraform, Python, YAML, Ansible, and API development (hands-on coding is not required). Understanding of network concepts is beneficial. Gain deep knowledge of the organization's technology landscape and leverage it to create strategic linkages. Lead teams by driving motivation, retention, and performance management while fostering an inclusive culture. Develop learning and development plans for teams and mentor junior professionals. Act as the go-to expert in their domain and expand knowledge across related domains. Align technology/domain strategy with business objectives and help implement strategic initiatives. Lead the delivery of complex and high-impact projects, ensuring quality outcomes. Work closely with other domain architects to break solutions into roadmaps and Epics. Guide the design and planning of Epics, translating them into actionable development journeys. Modify existing patterns to develop robust and scalable solutions. Represent the organization professionally while building strong relationships with internal and external stakeholders. Drive collaboration across teams and communicate complex concepts effectively.

Preferred candidate profile: Strong software development and delivery experience. Expertise in multiple programming languages and third-party software products (proficiency in at least two is required). Experience in multi-application management and integrating diverse technologies. Proven ability in stakeholder management, team leadership, and multi-group coordination.

Posted 2 months ago

Apply

10.0 - 20.0 years

85 - 100 Lacs

Bengaluru

Hybrid

Develop a comprehensive automation strategy aligned with organizational goals.
Evaluate and select automation tools, integrating them into the existing framework.
Automate network provisioning and configuration, from defining complex MPLS and QoS logic to configuring VLANs using intent-based automation.
Define performance and monitoring metrics for automation pipelines.
Ensure automation solutions are scalable and optimized to support growing network demands.
Work with the Service Owner to document processes and provide training for network administrators and engineers.
Continuously evaluate and enhance automation solutions to align with evolving business needs.
Scripting and programming knowledge in Java, .NET, Terraform, Python, YAML, Ansible, and API development (hands-on coding is not required).
Understanding of network concepts is beneficial.
Gain deep knowledge of the client's technology landscape and leverage it to create strategic linkages.
Lead teams by driving motivation, retention, and performance management while fostering an inclusive culture.
Develop learning and development plans for teams and mentor junior professionals.
Act as the go-to expert in their domain and expand knowledge across related domains.
Align technology/domain strategy with business objectives and help implement strategic initiatives.
Lead the delivery of complex and high-impact projects, ensuring quality outcomes.
Work closely with other domain architects to break solutions into roadmaps and Epics.
Guide the design and planning of Epics, translating them into actionable development journeys.
Modify existing patterns to develop robust and scalable solutions.
Represent our client professionally while building strong relationships with internal and external stakeholders.
Drive collaboration across teams and communicate complex concepts effectively.

Preferred candidate profile

Posted 2 months ago

Apply

10.0 - 16.0 years

40 - 85 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Principal Engineer
Experience: 10-16 years. Salary: Competitive. Preferred Notice Period: Within 30 Days. Shift: 9:30 AM to 6:30 PM IST. Opportunity Type: Hybrid - Bengaluru. Placement Type: Permanent. (*Note: This is a requirement for one of Uplers' clients.)
Must-have skills: Python OR Scala OR Java, AND Distributed Systems OR reduced latency OR Akka.

Netskope (one of Uplers' clients) is looking for a Principal Engineer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.

Role Overview

About the team: We are a distributed team of passionate engineers dedicated to continuous learning and building impactful products. Our core product is built from the ground up in Scala, leveraging the Lightbend stack (Play/Akka).

About Data Security Posture Management (DSPM): DSPM is designed to provide comprehensive data visibility and contextual security for the modern AI-driven enterprise. Our platform automatically discovers, classifies, and secures sensitive data across diverse environments including AWS, Azure, Google Cloud, Oracle Cloud, and on-premises infrastructure. DSPM is critical in empowering CISOs and security teams to enforce secure AI practices, reduce compliance risk, and maintain continuous data protection at scale.

As a key member of the DSPM team, you will contribute to developing innovative and scalable systems designed to protect the exponentially increasing volume of enterprise data. Our platform continuously maps sensitive data across all major cloud providers and on-prem environments, automatically detecting and classifying sensitive and regulated data types such as PII, PHI, financial, and healthcare information (a toy classification sketch follows below). It flags data at risk of exposure, exfiltration, or misuse and helps prioritize issues based on sensitivity, exposure, and business impact.

What you'll do:
Drive the enhancement of Data Security Posture Management (DSPM) capabilities by enabling the detection of sensitive or risky data used in (but not limited to) training private LLMs or accessed by public LLMs.
Improve the DSPM platform to extend support to all major cloud infrastructures, on-prem deployments, and any new upcoming technologies.
Provide technical leadership in all phases of a project, from discovery and planning through implementation and delivery.
Contribute to product development: understand customer requirements and work with the product team to design, plan, and implement features.
Support customers by investigating and fixing production issues.
Help us improve our processes and make the right tradeoffs between agility, cost, and reliability.
Collaborate with teams across geographies.

What you bring to the table:
10+ years of software development experience with enterprise-grade software.
Must have experience in building scalable, high-performance cloud services.
Expert coding skills in Scala or Java.
Development on cloud platforms, including AWS.
Deep knowledge of databases and data warehouses (OLTP, OLAP).
Analytical and troubleshooting skills.
Experience working with Docker and Kubernetes.
Ability to multitask and wear many hats in a fast-paced environment: this week you might lead the design of a new feature; next week you are fixing a critical bug or improving our CI infrastructure.
Cybersecurity experience and adversarial thinking are a plus.
Expertise in building REST APIs.
Experience leveraging cloud-based AI services (e.g., AWS Bedrock, SageMaker), vector databases, and Retrieval Augmented Generation (RAG) architectures is a plus.

Education: Bachelor's degree in Computer Science; Master's degree strongly preferred.

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meet the client for the interview.

About Our Client: Netskope, a global SASE leader, helps organizations apply zero trust principles and AI/ML innovations to protect data and defend against cyber threats. Fast and easy to use, the Netskope platform provides optimized access and real-time security for people, devices, and data anywhere they go. Netskope helps customers reduce risk, accelerate performance, and get unrivaled visibility into any cloud, web, and private application activity.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
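To make the data-classification work described above more concrete, here is a toy, hypothetical Python sketch that flags a couple of PII-like patterns in a record using regular expressions. Real DSPM classification is far more sophisticated; the patterns, field names, and sample data below are illustrative assumptions only.

```python
# A toy, regex-based sketch of flagging PII-like values in records.
# Real DSPM classifiers use far richer detection; this is only illustrative.
import re

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}


def classify(record: dict) -> dict:
    """Return the PII categories detected in each field of a record."""
    findings = {}
    for field, value in record.items():
        hits = [label for label, rx in PATTERNS.items() if rx.search(str(value))]
        if hits:
            findings[field] = hits
    return findings


if __name__ == "__main__":
    sample = {"contact": "jane.doe@example.com", "note": "SSN 123-45-6789 on file"}
    print(classify(sample))  # {'contact': ['email'], 'note': ['us_ssn']}
```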

Posted 2 months ago

Apply
