
14 Pulsar Jobs

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 6.0 years

0 Lacs

Ranchi, Jharkhand

On-site

As a Middle Python Developer, you will join our international team to contribute to the development of our products. We are seeking individuals with a high energy level and a passion for continuous learning, who balance work and personal life while delivering the productivity that directly impacts the success of our clients.

Your responsibilities will include:
- Writing well-designed, testable, and efficient code, along with unit tests for each module.
- Coding full applications with effective error management.
- Working with RESTful and gRPC-based microservices and pub/sub messaging.
- Implementing self-contained User Stories deployable via Kubernetes.
- Participating in code reviews, exploring new technologies, and suggesting technical improvements.
- Designing and developing messaging-based applications for URLLC services through Event-Queues, Event-Audit, and Caching.

To be successful in this role, you should have:
- At least 2 years of development experience using the latest frameworks.
- Fluency in Python (2.6+ and 3.3+) and proficiency in Linux.
- Experience with frameworks such as Flask/Django, Bottle, uWSGI, Nginx, Jenkins/CI, etc.
- Knowledge of rapid prototyping, RESTful and gRPC-based microservices, and pub/sub messaging.
- Familiarity with technologies like API Gateway Kong, Apigee, Firebase, OAUTH, 2-MFA, JWT, etc.
- Experience with data storage tools, including RDBMS (Oracle, MySQL Server, MariaDB), Kafka, Pulsar, Redis Streams, and ORMs (SQLAlchemy, Mongoose, JPA, etc).
- Intermediate English skills.

The following will be considered a plus:
- Experience with containers and cloud PaaS (K8S, Istio, Envoy, Helm, Azure, etc), Docker, and CI/CD.
- Experience developing instrumentation and using ASYNCIO functions in TRY/CATCH blocks with error tracepoints.
- Expertise in building microservice architectures.
- A software engineering degree or equivalent.
- Familiarity with Agile methodologies.

In return, we offer a competitive salary based on your experience, opportunities for career growth, a flexible work schedule, minimal bureaucracy, professional skills development programs, paid sick leaves, vacation days, and corporate events. You will also have the possibility to work remotely.
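The "ASYNCIO function in TRY/CATCH block with Error Tracepoints" requirement can be sketched in a few lines of Python. Everything below (the `fetch_order` name, the tracepoint log format) is an illustrative assumption, not code from the employer:

```python
import asyncio
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("orders")

async def fetch_order(order_id: int) -> dict:
    """Hypothetical I/O-bound coroutine; sleeps to simulate a network call."""
    await asyncio.sleep(0.01)
    if order_id < 0:
        raise ValueError(f"invalid order id: {order_id}")
    return {"id": order_id, "status": "ok"}

async def fetch_order_safe(order_id: int) -> dict:
    """Wrap the coroutine in try/except and emit an error tracepoint on failure."""
    try:
        return await fetch_order(order_id)
    except ValueError as exc:
        # Error tracepoint: a structured log entry instead of an unhandled crash.
        log.error("tracepoint=fetch_order_failed order_id=%s err=%s", order_id, exc)
        return {"id": order_id, "status": "error"}

async def main() -> list:
    # Failures in one coroutine do not abort the batch.
    return await asyncio.gather(fetch_order_safe(1), fetch_order_safe(-1))

results = asyncio.run(main())
print(results)
```

The point of the pattern is that the error is converted into a logged, inspectable event rather than propagating out of the event loop.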

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Kochi, Kerala

On-site

As a Senior JAVA Developer, you will be required to effectively communicate technical concepts in English, both verbally and in writing. With over 6 years of commercial JAVA experience, you should demonstrate the ability to write efficient, testable, and maintainable JAVA code following best practices and patterns for the implementation, build, and deployment of JAVA services. Your expertise should extend to the Java ecosystem and related technologies, including Spring Boot, the Spring frameworks, Hibernate, and Maven. Proficiency in Test-Driven Development (TDD) and exposure to Behavior-Driven Development (BDD) are essential.

Familiarity with version control tools like Git, project management tools such as JIRA and Confluence, and continuous integration tools like Jenkins is expected. You should have a solid background in building RESTful services within microservices architectures and working in cloud-based environments, preferably AWS. Knowledge of both NoSQL and relational databases, especially PostgreSQL, is crucial. Experience in developing services using event- or stream-based systems like SQS, Kafka, or Pulsar, along with knowledge of CQRS principles, is desirable. A strong foundation in Computer Science fundamentals and software patterns is necessary for this role. Additionally, experience with AWS services like Lambda, SQS, S3, and Rekognition Face Liveness, as well as familiarity with Camunda BPMN, would be advantageous.

This position requires a senior-level professional with over 10 years of experience, offering a competitive salary ranging from 25 to 40 LPA. If you meet these qualifications and are eager to contribute your skills to a dynamic team, we encourage you to apply for this Senior JAVA Developer role.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Ranchi, Jharkhand

On-site

We are seeking a Senior Python Developer to join our international team and contribute to the development of our products. We are looking for individuals who are enthusiastic, lifelong learners, prioritize work-life balance, and are committed to delivering high productivity to ensure the success of our customers.

Responsibilities:
- Maintain high-quality development standards.
- Collaborate with the Customer Solutions department and Product Manager on the design and creation of User Stories.
- Implement self-contained User Stories that are ready for containerization and deployment via Kubernetes.
- Work with the QA team to ensure test planning and automated test implementations, driving continuous improvement cycles.
- Conduct prototyping and explore new technology areas such as image recognition, robotic controls, high-speed messaging, and K8S orchestration.
- Develop full applications with robust error management, RESTful and gRPC-based microservices, and pub/sub messaging.
- Participate in code reviews, test new concepts, suggest technical improvements, and lead by example in coding techniques.
- Design and develop messaging-based applications for URLLC services via Event-Queues, Event-Audit, and Caching.
- Demonstrate expertise in state machines, workflows, refactoring strategies, error management, and software instrumentation.

Required experience and skills:
- 4+ years of development experience using the latest frameworks.
- Proficiency in Python (Python 2.6+, Python 3.3+).
- Experience with frameworks like Flask/Django, Bottle, uWSGI, Nginx, Jenkins/CI, etc.
- Hands-on experience with rapid prototyping, RESTful and gRPC-based microservices, and pub/sub messaging.
- Ability to create Python backend microservices.
- Familiarity with technologies such as API Gateway Kong, Apigee, Firebase, OAUTH, 2-MFA, JWT, etc.
- Experience with data storage tools like RDBMS (Oracle, MySQL Server, MariaDB), Kafka, Pulsar, Redis Streams, etc.
- Knowledge of ORM tools (SQLAlchemy, Mongoose, JPA, etc).
- Expertise in building microservice architecture.
- Upper-Intermediate English proficiency or above.

Additional qualifications that would be a plus:
- Experience with containers and cloud PaaS technologies like K8S, Istio, Envoy, Helm, Azure, etc.
- Proficiency in Docker and CI/CD.
- Experience in developing instrumentation, using ASYNCIO functions in TRY/CATCH blocks with error tracepoints.
- Experience as a Tech Lead.
- Familiarity with Agile methodologies.

We offer:
- Competitive salary based on your professional experience.
- Career growth opportunities.
- Flexible work schedule.
- Minimal bureaucracy.
- Professional skills development and training programs.
- Paid sick leaves, vacation days, and public holidays.
- Participation in corporate events.
- Remote work possibilities.

Posted 1 week ago

Apply

8.0 - 15.0 years

0 Lacs

Karnataka

On-site

The opportunity at our organization is focused on enabling IT to optimize capacity, proactively detect performance anomalies, and automate operational tasks. We strive to streamline the management, deployment, and scaling of applications with self-service and centralized, role-based IT governance. Additionally, we aim to drive financial accountability and unify security operations with intelligent analysis and regulatory compliance.

In the Nutanix Cloud Manager team, we are dedicated to building the next-generation platform that assists enterprises in modeling, developing, and managing applications. Our goal is to give them the capability to encapsulate not only infrastructure but also the application, its architecture, and its deployment as code.

As part of your role, you will be responsible for delivering a best-in-class user experience to customers and developing robust microservices. You are expected to gain a deep understanding of customer use cases and design innovative solutions to meet customer requirements. Collaboration with team members across the organization, including product managers, designers, and support engineers, is crucial to your success. You will work on defining functionality that is easy to use and intuitive for customers, improving the performance and scalability of backend services, and maintaining uptime to meet service level objectives. Additionally, diagnosing and debugging issues in a microservices and distributed environment will be part of your responsibilities.

To excel in this role, you should bring 8 - 15 years of experience in one of the following programming languages: Go, C++, Java, or Python. Knowledge of TypeScript and familiarity with any server-side language and databases are preferred. A strong understanding of OS internals, concepts of distributed data management, and web-scale systems is essential. Experience in building and managing web-scale applications, proficiency in Linux, and expertise in concurrency patterns, multithreading concepts, and debugging techniques are required. Working experience with storage, networking, virtualization (Nutanix, VMware, KVM), cloud technologies (AWS, Azure, GCP), databases (SQL & NoSQL), and messaging technologies (NATS, Kafka, or Pulsar) will be beneficial.

This role operates in a hybrid capacity, combining the benefits of remote work with in-person collaboration. Most roles require a minimum of 3 days per week in the office, but certain roles or teams may need more frequent in-office presence. Your manager will provide additional team-specific guidance and norms.
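The concurrency patterns and multithreading concepts called out above can be illustrated with a minimal Python worker-pool sketch; the squaring task, worker count, and sentinel convention are illustrative assumptions for the example:

```python
import queue
import threading

tasks: "queue.Queue[int | None]" = queue.Queue()
results: list[int] = []
results_lock = threading.Lock()  # guards the shared results list

def worker() -> None:
    # Pull tasks until the queue delivers the None sentinel.
    while True:
        item = tasks.get()
        if item is None:
            tasks.task_done()
            break
        with results_lock:           # only one thread mutates results at a time
            results.append(item * item)
        tasks.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for n in range(10):
    tasks.put(n)
for _ in threads:
    tasks.put(None)                  # one shutdown sentinel per worker
tasks.join()                         # wait until every put() has been processed
for t in threads:
    t.join()

print(sorted(results))
```

The queue serializes hand-off between producer and workers, while the lock protects the shared list; together they are the two most common ingredients of thread-safe fan-out.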

Posted 2 weeks ago

Apply

3.0 - 8.0 years

15 - 22 Lacs

Gurugram

Remote

The details of the position are:

Position Details:
Job Title: Data Engineer
Client: Yum Brands
Job ID: 1666-1
Location: Remote
Project Duration: 06 months, Contract

Job Description: We are seeking a skilled Data Engineer who is knowledgeable about and loves working with modern data integration frameworks, big data, and cloud technologies. Candidates must also be proficient with data programming languages (e.g., Python and SQL). The Yum! data engineer will build a variety of data pipelines and models to support advanced AI/ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth in our restaurants globally. The candidate will work in our office in Gurgaon, India.

Key Responsibilities
As a data engineer, you will:
• Partner with KFC, Pizza Hut, Taco Bell & Habit Burger to build data pipelines that enable best-in-class restaurant technology solutions.
• Play a key role in our Data Operations team, developing data solutions responsible for driving Yum! growth.
• Design and develop data pipelines, streaming and batch, to move data from point-of-sale, back-of-house, operational platforms, and more to our Global Data Hub.
• Contribute to standardizing and developing a framework to extend these pipelines across brands and markets.
• Develop on the Yum! data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-in-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.).
• Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting, and other data integration points.

Skills and Qualifications:
• Vast background in all things data-related
• AWS platform development experience (EKS, S3, API Gateway, Lambda, etc.)
• Experience with modern ETL tools such as Informatica, Matillion, or DBT; Informatica CDI is a plus
• High level of proficiency with SQL (Snowflake a big plus)
• Proficiency with Python for transforming data and automating tasks
• Experience with Kafka, Pulsar, or other streaming technologies
• Experience orchestrating complex task flows across a variety of technologies
• Bachelor's degree from an accredited institution, or relevant experience
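The streaming-pipeline work described above (Kafka/Pulsar-style event flows feeding aggregates) can be illustrated without any broker. Below is a pure-Python tumbling-window sum; the point-of-sale event shape and the 60-second window are assumptions made for the example:

```python
from collections import defaultdict

# (timestamp_seconds, store_id, amount): hypothetical point-of-sale events
events = [
    (0, "store-1", 10.0),
    (30, "store-1", 5.0),
    (70, "store-2", 7.5),
    (95, "store-1", 2.5),
]

WINDOW = 60  # tumbling window size in seconds

def tumbling_sums(stream):
    """Sum amounts per (window_start, store): the core of a windowed aggregation."""
    sums = defaultdict(float)
    for ts, store, amount in stream:
        # Each event lands in exactly one non-overlapping window.
        window_start = (ts // WINDOW) * WINDOW
        sums[(window_start, store)] += amount
    return dict(sums)

print(tumbling_sums(events))
```

Real frameworks (Spark Structured Streaming, Flink) add watermarks and fault tolerance on top, but the bucketing arithmetic is the same.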

Posted 2 weeks ago

Apply

12.0 - 22.0 years

30 - 45 Lacs

Chennai

Work from Office

Kubernetes, Docker & multi-cloud orchestration tools; AWS, Azure, GCP, and private cloud environments, ensuring compatibility & interoperability; code tools & cloud-neutral deployments; front-end frameworks & back-end development; SQL & NoSQL; CI/CD.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

The Senior DevOps, Platform, and Infra Security Engineer opportunity at FICO's highly modern and innovative analytics and decision platform involves shaping the next generation of security for FICO's Platform. You will address cutting-edge security challenges in a highly automated, complex, cloud- and microservices-driven environment, including design challenges and the continuous delivery of security functionality and features to the FICO platform as well as the AI/ML capabilities used on top of it, as stated by the VP of Engineering.

In this role, you will secure the design of the next-generation FICO Platform, its capabilities, and services. You will provide full-stack security architecture design, from cloud infrastructure to application features, for FICO customers. Collaborating closely with product managers, architects, and developers, you will implement security controls within products. Your responsibilities will also include developing and maintaining Kyverno policies for enforcing security controls in Kubernetes environments, and defining and implementing policy-as-code best practices in collaboration with platform, DevOps, and application teams.

As a Senior DevOps, Platform, and Infra Security Engineer, you will stay updated on emerging threats, Kubernetes security features, and cloud-native security tools. You will define the controls and capabilities required to protect FICO products and environments, build and validate declarative threat models in a continuous and automated manner, and prepare the product for compliance attestations while ensuring adherence to security best practices.

The ideal candidate should have 10+ years of experience in architecture, security reviews, and requirement definition for complex product environments. Strong knowledge and hands-on experience with Kyverno and OPA/Gatekeeper are preferred. Familiarity with industry regulations, frameworks, and practices (e.g., PCI, ISO 27001, NIST) is required. Experience in threat modeling, code reviews, security testing, vulnerability detection, and remediation methods is essential. Hands-on experience with programming languages such as Java and Python, and with securing cloud environments, preferably AWS, is necessary. Moreover, experience deploying and securing containers, container orchestration, and mesh technologies (e.g., EKS, K8S, ISTIO), experience with Crossplane for managing cloud infrastructure declaratively via Kubernetes, and certifications in Kubernetes or cloud security (e.g., CKA, CKAD, CISSP) are desirable. Proficiency with CI/CD tools (e.g., GitHub Actions, GitLab CI, Jenkins, Crossplane) is important. The ability to independently drive transformational security projects across teams and organizations, and experience securing event streaming platforms like Kafka or Pulsar, are valued. Hands-on experience with ML/AI model security, IaC (e.g., Terraform, CloudFormation, Helm), and CI/CD pipelines (e.g., GitHub, Jenkins, JFrog) will be beneficial.

Joining FICO as a Senior DevOps, Platform, and Infra Security Engineer offers you an inclusive culture reflecting core values, the opportunity to make an impact and develop professionally, highly competitive compensation and benefits programs, and an engaging, people-first work environment promoting work/life balance, employee resource groups, and social events to foster interaction and camaraderie.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 - 1 Lacs

Hyderabad, Bengaluru

Hybrid

Role & responsibilities

As a Senior Associate (L1) in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement package solutions, and you will independently drive design discussions to ensure the necessary health of the overall solution.

Your Impact:
- Data ingestion, integration, and transformation
- Data storage and computation frameworks, performance optimizations
- Analytics & visualizations
- Infrastructure & cloud computing
- Data management platforms
- Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
- Build functionality for data analytics, search, and aggregation

Preferred candidate profile:
- Minimum 2 years of experience in Big Data technologies
- Hands-on experience with the Hadoop stack: HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required in building end-to-end data pipelines
- Bachelor's degree and 4 to 6 years of work experience, or any combination of education, training, and/or experience that demonstrates the ability to perform the duties of the position
- Working knowledge of real-time data pipelines is an added advantage
- Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferable
- Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
- Well-versed, working knowledge of data platform-related services on Azure

Set Yourself Apart With:
- Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience
- Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
- Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures
- Performance tuning and optimization of data pipelines
- Cloud data specialty and other related Big Data technology certifications

A Tip from the Hiring Manager: Join the team to sharpen your skills and expand your collaborative methods. Make a direct impact on our clients and their businesses through your work.

Posted 1 month ago

Apply

10.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

FICO (NYSE: FICO) is a leading global analytics software company, helping businesses in 100+ countries make better decisions. Join our world-class team today and fulfill your career potential!

The Opportunity
VP of Engineering.

What You'll Contribute
- Secure the design of the next-generation FICO Platform, its capabilities, and services.
- Provide full-stack security architecture design from cloud infrastructure to application features for FICO customers.
- Work closely with product managers, architects, and developers on implementing the security controls within products.
- Develop and maintain Kyverno policies for enforcing security controls in Kubernetes environments.
- Collaborate with platform, DevOps, and application teams to define and implement policy-as-code best practices.
- Contribute to automation efforts for policy deployment, validation, and reporting.
- Stay current with emerging threats, Kubernetes security features, and cloud-native security tools.
- Define required controls and capabilities for the protection of FICO products and environments.
- Build & validate declarative threat models in a continuous and automated manner.
- Prepare the product for compliance attestations and ensure adherence to security best practices.

What We're Seeking
- 10+ years of experience in architecture, security reviews, and requirement definition for complex product environments.
- Strong knowledge and hands-on experience with Kyverno and OPA/Gatekeeper (optional but a plus).
- Familiarity with industry regulations, frameworks, and practices, for example PCI, ISO 27001, NIST, etc.
- Experience in threat modeling, code reviews, security testing, vulnerability detection, attacker exploit techniques, and methods for their remediation.
- Hands-on experience with programming languages such as Java, Python, etc.
- Experience in deploying services and securing cloud environments, preferably AWS.
- Experience deploying and securing containers, container orchestration, and mesh technologies (such as EKS, K8S, ISTIO).
- Experience with Crossplane to manage cloud infrastructure declaratively via Kubernetes.
- Certifications in Kubernetes or cloud security (e.g., CKA, CKAD, CISSP) are desirable.
- Proficiency with CI/CD tools (e.g., GitHub Actions, GitLab CI, Jenkins, Crossplane).
- Ability to independently drive transformational security projects across teams and organizations.
- Experience with securing event streaming platforms like Kafka or Pulsar.
- Experience with ML/AI model security and adversarial techniques within the analytics domains.
- Hands-on experience with IaC (such as Terraform, CloudFormation, Helm) and with CI/CD pipelines (such as GitHub, Jenkins, JFrog).

Our Offer to You
- An inclusive culture strongly reflecting our core values: Act Like an Owner, Delight Our Customers, and Earn the Respect of Others.
- The opportunity to make an impact and develop professionally by leveraging your unique strengths and participating in valuable learning experiences.
- Highly competitive compensation, benefits, and rewards programs that encourage you to bring your best every day and be recognized for doing so.
- An engaging, people-first work environment offering work/life balance, employee resource groups, and social events to promote interaction and camaraderie.

Why Make a Move to FICO
At FICO, you can develop your career with a leading organization in one of the fastest-growing fields in technology today: Big Data analytics. You'll play a part in our commitment to help businesses use data to improve every choice they make, using advances in artificial intelligence, machine learning, optimization, and much more. FICO makes a real difference in the way businesses operate worldwide:
- Credit Scoring: FICO Scores are used by 90 of the top 100 US lenders.
- Fraud Detection and Security: 4 billion payment cards globally are protected by FICO fraud systems.
- Lending: 3/4 of US mortgages are approved using the FICO Score.

Global trends toward digital transformation have created tremendous demand for FICO's solutions, placing us among the world's top 100 software companies by revenue. We help many of the world's largest banks, insurers, retailers, telecommunications providers, and other firms reach a new level of success. Our success depends on really talented people, just like you, who thrive on the collaboration and innovation nurtured by a diverse and inclusive environment. We'll provide the support you need, while ensuring you have the freedom to develop your skills and grow your career. Join FICO and help change the way business thinks! Learn more about how you can fulfil your potential at

FICO promotes a culture of inclusion and seeks to attract a diverse set of candidates for each job opportunity. We are an equal employment opportunity employer and we're proud to offer employment and advancement opportunities to all candidates without regard to race, color, ancestry, religion, sex, national origin, pregnancy, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. Research has shown that women and candidates from underrepresented communities may not apply for an opportunity if they don't meet all stated qualifications. While our qualifications are clearly related to role success, each candidate's profile is unique and strengths in certain skill and/or experience areas can be equally effective. If you believe you have many, but not necessarily all, of the stated qualifications, we encourage you to apply. Information submitted with your application is subject to the FICO Privacy policy at

Posted 1 month ago

Apply

8.0 - 13.0 years

85 - 90 Lacs

Noida

Work from Office

About the Role
We are looking for a Staff Engineer, Real-time Data Processing, to design and develop highly scalable, low-latency data streaming platforms and processing engines. This role is ideal for engineers who enjoy building core systems and infrastructure that enable mission-critical analytics at scale. You'll work on solving some of the toughest data engineering challenges in healthcare.

A Day in the Life
- Architect, build, and maintain a large-scale real-time data processing platform.
- Collaborate with data scientists, product managers, and engineering teams to define system architecture and design.
- Optimize systems for scalability, reliability, and low-latency performance.
- Implement robust monitoring, alerting, and failover mechanisms to ensure high availability.
- Evaluate and integrate open-source and third-party streaming frameworks.
- Contribute to the overall engineering strategy and promote best practices for stream and event processing.
- Mentor junior engineers and lead technical initiatives.

What You Need
- 8+ years of experience in backend or data engineering roles, with a strong focus on building real-time systems or platforms.
- Hands-on experience with stream processing frameworks like Apache Flink, Apache Kafka Streams, or Apache Spark Streaming.
- Proficiency in Java, Scala, Python, or Go for building high-performance services.
- Strong understanding of distributed systems, event-driven architecture, and microservices.
- Experience with Kafka, Pulsar, or other distributed messaging systems.
- Working knowledge of containerization tools like Docker and orchestration tools like Kubernetes.
- Proficiency in observability tools such as Prometheus, Grafana, and OpenTelemetry.
- Experience with cloud-native architectures and services (AWS, GCP, or Azure).
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
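The event-driven architecture this role centers on can be reduced to a minimal in-process publish/subscribe sketch. The `Bus` class, the topic name, and the event shape below are illustrative assumptions, not any specific framework's API:

```python
from collections import defaultdict
from typing import Callable

class Bus:
    """Minimal in-process pub/sub: events on a topic fan out to every subscriber."""

    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Producers never reference consumers directly; the topic decouples them.
        for handler in self._subs[topic]:
            handler(event)

bus = Bus()
seen: list[dict] = []
bus.subscribe("vitals", seen.append)          # e.g. an alerting consumer
bus.subscribe("vitals", lambda e: None)       # a second, independent consumer
bus.publish("vitals", {"patient": "p1", "hr": 142})
print(seen)
```

Kafka and Pulsar add durable logs, partitions, and consumer offsets on top of this fan-out idea, which is what makes the pattern viable across processes and machines.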

Posted 1 month ago

Apply

10.0 - 12.0 years

30 - 35 Lacs

Chennai

Work from Office

Kubernetes, Docker & multi-cloud orchestration tools; AWS, Azure, GCP, and private cloud environments, ensuring compatibility & interoperability; code tools & cloud-neutral deployments; front-end frameworks & back-end development; SQL & NoSQL; CI/CD.

Posted 2 months ago

Apply

7.0 - 8.0 years

18 - 22 Lacs

Punjab

Work from Office

Key Responsibilities
- Define and develop scalable, high-availability system architectures for enterprise applications.
- Break down functional requirements into technical components.
- Define data models and system interactions.
- Present the technical solutions to be implemented to the client.
- Identify potential challenges and risks, along with mitigation strategies.
- Optimize backend and frontend components for low-latency, high-throughput performance.
- Develop strategies for efficient data processing, storage, and retrieval to handle large datasets.
- Identify and resolve bottlenecks in application logic, database queries, and API response times.
- Architect solutions that support multi-system integration, ensuring seamless data exchange.
- Implement high-performance caching strategies (Redis, Memcached) and database optimizations.
- Work closely with software engineers, DevOps teams, and business stakeholders to align technical solutions with business needs.
- Provide technical mentorship to developers and conduct code and architecture reviews.
- Define best practices and standards for software development, testing, and deployment.

Required Skills & Qualifications
- Backend: PHP (Laravel), Node, Java, Python.
- Frontend: React.js, Angular.
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra, Elasticsearch.
- Messaging & Streaming: Kafka, RabbitMQ, Pulsar.
- Cloud & DevOps: AWS (Lambda, S3, RDS), Kubernetes, Docker.
- Security & Compliance: OAuth 2.0, JWT, GDPR.

Experience In:
- Building enterprise-grade, high-performance applications with large user bases.
- Designing microservices, event-driven architectures, and serverless solutions.
- Previous work on similar enterprise products, including experience handling a team of 20 developers.

Preferred Qualifications
- B.Tech/BE in CSE from a good-tier college/university.
- Previous experience in a similar role, with extensive experience working on complex products.
- Passion for technology and innovation, continuously exploring new trends and contributing to the development of cutting-edge solutions.
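The caching strategies mentioned above (Redis, Memcached) boil down to TTL-based key expiry. Here is a minimal in-memory sketch; the injected clock keeps the demo deterministic, and all names are illustrative rather than any real client library's API:

```python
import time

class TTLCache:
    """Tiny expiring key-value store mirroring Redis-style SET-with-TTL semantics."""

    def __init__(self, ttl_seconds: float, clock=time.monotonic) -> None:
        self.ttl = ttl_seconds
        self.clock = clock
        self._store: dict[str, tuple[float, object]] = {}

    def set(self, key: str, value: object) -> None:
        # Store the value alongside its absolute expiry time.
        self._store[key] = (self.clock() + self.ttl, value)

    def get(self, key: str, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        expires_at, value = entry
        if self.clock() >= expires_at:   # lazy expiry on read
            del self._store[key]
            return default
        return value

# Deterministic demo using a fake clock instead of real elapsed time
t = [0.0]
cache = TTLCache(ttl_seconds=5.0, clock=lambda: t[0])
cache.set("session:42", "alice")
t[0] = 3.0
fresh = cache.get("session:42")                 # within TTL
t[0] = 6.0
stale = cache.get("session:42", default="miss") # past TTL
print(fresh, stale)
```

Production caches layer eviction policies (LRU, LFU) and memory limits on top, but TTL expiry is the core contract application code relies on.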

Posted 2 months ago

Apply

9.0 - 14.0 years

50 - 85 Lacs

Noida

Work from Office

About the Role
We are looking for a Staff Engineer, Real-time Data Processing, to design and develop highly scalable, low-latency data streaming platforms and processing engines. This role is ideal for engineers who enjoy building core systems and infrastructure that enable mission-critical analytics at scale. You'll work on solving some of the toughest data engineering challenges in healthcare.

A Day in the Life
- Architect, build, and maintain a large-scale real-time data processing platform.
- Collaborate with data scientists, product managers, and engineering teams to define system architecture and design.
- Optimize systems for scalability, reliability, and low-latency performance.
- Implement robust monitoring, alerting, and failover mechanisms to ensure high availability.
- Evaluate and integrate open-source and third-party streaming frameworks.
- Contribute to the overall engineering strategy and promote best practices for stream and event processing.
- Mentor junior engineers and lead technical initiatives.

What You Need
- 8+ years of experience in backend or data engineering roles, with a strong focus on building real-time systems or platforms.
- Hands-on experience with stream processing frameworks like Apache Flink, Apache Kafka Streams, or Apache Spark Streaming.
- Proficiency in Java, Scala, Python, or Go for building high-performance services.
- Strong understanding of distributed systems, event-driven architecture, and microservices.
- Experience with Kafka, Pulsar, or other distributed messaging systems.
- Working knowledge of containerization tools like Docker and orchestration tools like Kubernetes.
- Proficiency in observability tools such as Prometheus, Grafana, and OpenTelemetry.
- Experience with cloud-native architectures and services (AWS, GCP, or Azure).
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Posted 2 months ago

Apply

6 - 9 years

8 - 11 Lacs

Bengaluru

Work from Office

What you will be doing...
This role will be responsible for the design and implementation of big data analysis applications to support VZW device technology.
- Design and implement data integration from various data sources.
- Performance-tune, diagnose, and resolve big data and streaming technical issues.
- Design and implement solutions across multiple layers of the technology stack, including data processing, database, web services, network, and user security.
- Assist UI and frontend engineers to ensure the system meets performance goals.
- Perform integration testing and assist QA on large projects.
- Contribute to requirements analysis, project estimation, design, coding, and testing.
- Suggest and implement third-party libraries and software in the product when appropriate.
- Deliver applications that meet the performance and scalability goals of the project.
- Stay on top of the latest techniques in big data and AI/ML.

What we are looking for
You'll need to have:
- Bachelor's degree or four or more years of work experience.
- Experience designing and implementing data applications in production using Java, Python, Scala, etc. on big data platforms.
- Hands-on experience with related/complementary open-source software platforms and languages such as Java, Linux, Spark, Hadoop, Kafka, Pulsar, etc.
- Deep knowledge of information architecture methodologies; enterprise-wide data architecture for designing and building data lakes, data warehouses, and BI reporting databases on a Docker cluster.
- Deep knowledge of distributed data architecture concepts such as caching, map-reduce, and stream processing.
- Solid knowledge of data security & compliance.
- Understanding of data visualization techniques and libraries and their applicability to business.
- Proficiency in the programming languages Java, JavaScript, Python, Scala, and SQL.

Even better if you have one or more of the following:
- Experience in AI development.
- Strong written and verbal communication skills.
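The map-reduce concept named in the requirements can be sketched as the classic word count; the documents and the `combine` helper are illustrative, not part of any framework:

```python
from collections import Counter
from functools import reduce

docs = ["big data big wins", "data moves fast"]

# Map phase: each document becomes a list of (word, 1) pairs.
mapped = [[(word, 1) for word in doc.split()] for doc in docs]

# Reduce phase: merge the per-document pairs into global counts.
def combine(acc: Counter, pairs) -> Counter:
    for word, n in pairs:
        acc[word] += n
    return acc

counts = reduce(combine, mapped, Counter())
print(dict(counts))
```

In a real Hadoop or Spark job the map and reduce phases run on different machines with a shuffle in between, but the per-key merge logic is exactly this.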

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies