
Infibeam Softtech

5 Job openings at Infibeam Softtech
Senior Java Developer | India | 8-12 years | Salary not disclosed | On-site | Full Time

We are looking for a Senior Java Developer to help modernize applications. The candidate must have strong client-facing experience and excellent interpersonal, written, and oral communication skills.

Technical Skills:
- Java 8 (and above), Maven
- Spring, Spring Security, Web Services
- Strong knowledge of and experience with microservices architecture
- Experience building RESTful backend services using Spring Boot
- Experience with relational/NoSQL database management systems such as Elasticsearch, MongoDB, MySQL
- Knowledge of Git, Bitbucket, JIRA, JFrog, Jenkins, Nexus
- Sound knowledge of Docker
- Strong knowledge of Agile methodologies (Scrum, Kanban, JIRA, etc.) and the Atlassian stack
- Deep knowledge of OOP, design patterns, clean code, refactoring, and unit testing
- Exposure to application servers (WebLogic, Tomcat, JBoss) preferred
- Sound knowledge of IDEs such as IntelliJ and Visual Studio Code
- Sound debugging skills
- Sound knowledge of Postman, file handling, Log4j, and threading

Key Responsibilities:
- Design, develop, and test end-to-end web applications using a Java backend and React frontend
- Integrate with backend services and RESTful APIs
- Implement server-side logic in Java using Spring Boot
- Analyze requirements and user stories
- Troubleshoot and analyze issues during development: identify and resolve technical issues and bugs

Experience and Qualifications:
- Bachelor's in Computer Science/Engineering or MCA
- 8-12 years of relevant experience
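The posting asks for RESTful backend services on Java 8+. As a rough, dependency-free sketch of that request/response shape (the posting specifies Spring Boot, which needs external dependencies; this uses only the JDK's built-in HTTP server, and `ItemApi`/`/api/items` are hypothetical names, not from the posting):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.stream.Collectors;

public class ItemApi {

    // Serialize a list of names into a minimal JSON array using Java 8 streams.
    static String toJson(List<String> items) {
        return items.stream()
                .map(s -> "\"" + s + "\"")
                .collect(Collectors.joining(",", "[", "]"));
    }

    public static void main(String[] args) throws Exception {
        // Plain-JDK HTTP endpoint; in Spring Boot this would be a @RestController.
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/api/items", exchange -> {
            byte[] body = toJson(List.of("alpha", "beta"))
                    .getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
    }
}
```

A `GET /api/items` against this server returns the JSON array; the serialization logic is kept in a separate static method so it can be unit tested in isolation, in line with the posting's emphasis on testability.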

GxP Manual Functional Tester | Bengaluru, Karnataka, India | 4 years | Salary not disclosed | On-site | Full Time

Roles & Responsibilities:
- Perform manual testing in GxP-compliant environments, ensuring all standards and compliance requirements are met.
- Execute the full Software Testing Life Cycle (STLC), including Functional Testing, FIT (Factory Integration Testing), UAT (User Acceptance Testing), and Defect Management.
- Design and execute FIT test cases based on business and system requirements.
- Create UAT test scenarios and scripts, working closely with business users.
- Participate in Agile incremental testing, providing continuous feedback during development sprints.
- Work with cross-functional teams, including business analysts, developers, and regulatory teams, to ensure quality and compliance.
- Track, manage, and report defects using Jira or similar test management tools.
- Prepare and maintain clear documentation for test cases, execution results, and defect logs.

Required Skills:
- Minimum 4 years of experience in manual testing within GxP-compliant environments.
- Strong understanding of STLC, Functional Testing, FIT, UAT, and Defect Management.
- Hands-on experience in FIT test case design and execution, and in UAT test design.
- Familiarity with Agile testing methodologies and test management tools such as Jira, and a good understanding of SDLC and Agile practices.
- Strong communication and documentation skills.

Software Engineer Specialist – .NET | India | 7 years | Salary not disclosed | Remote | Full Time

Experience Required: 7+ years

Role Summary:
We are seeking an experienced .NET Tech Lead with strong expertise in building high-performance, scalable, and reliable APIs, including gRPC services. The role requires proficiency in asynchronous programming, queue-based systems, event-driven architecture, and Azure cloud services. The ideal candidate will also have experience with unit, performance, and load testing to ensure the quality and scalability of the system.

Key Responsibilities:
- Design, develop, and optimize RESTful APIs using .NET Core for high-throughput, low-latency applications.
- Implement asynchronous programming patterns to improve performance and scalability in API services.
- Develop solutions based on event-driven architecture to handle real-time, high-volume data flows and asynchronous messaging (e.g., Azure Service Bus, RabbitMQ, Kafka).
- Optimize APIs, gRPC services, and Azure-based systems for high-concurrency scenarios, ensuring they are scalable, secure, and maintainable.
- Perform performance and load testing to evaluate system behavior under heavy usage, ensuring responsiveness and stability, using tools like Azure Load Testing, JMeter, or Gatling.
- Ensure proper logging, monitoring, and observability in distributed systems to detect and resolve issues quickly.
- Participate in code reviews, ensuring best practices for performance, security, and reliability are followed.
- Collaborate in technical discussions and contribute to the overall API, gRPC, and Azure strategy, architecture, and roadmap.

Required Skills and Qualifications:
- 7+ years of experience with the .NET stack (C#, ASP.NET Core, Web API).
- 2+ years of experience with Azure cloud services, including App Services, Azure Functions, Service Bus, and Blob Storage.
- Extensive experience with asynchronous programming patterns such as async/await, Tasks, and parallel programming.
- Experience with message queuing and event-driven systems (e.g., Azure Service Bus, RabbitMQ, Kafka).
- Proficiency with unit testing frameworks (e.g., xUnit, NUnit, MSTest).
- Experience conducting performance and load testing using tools like Azure Load Testing, JMeter, or Gatling.
- Strong understanding of performance optimization techniques such as caching, load balancing, and connection pooling.
- Good experience with Agile methodologies and microservices architecture.

Soft Skills:
- Takes ownership; a proactive problem-solver with a positive, can-do attitude.
- Passionate about development.
- Excellent communication and teamwork skills.
- Ability to work effectively in a remote environment.
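The asynchronous composition this role centers on (async/await over independent I/O-bound calls) is language-neutral. As a minimal sketch of the pattern only (the posting's stack is .NET/C#; this Java `CompletableFuture` version is an analogy, and `fetchUser`/`fetchScore` are hypothetical stand-ins for real service calls):

```java
import java.util.concurrent.CompletableFuture;

public class AsyncSketch {

    // Hypothetical stand-ins for two independent I/O operations.
    static CompletableFuture<String> fetchUser() {
        return CompletableFuture.supplyAsync(() -> "user-42");
    }

    static CompletableFuture<Integer> fetchScore() {
        return CompletableFuture.supplyAsync(() -> 99);
    }

    // Start both calls concurrently and combine the results; nothing blocks
    // until the final join(), which is the composition point that async/await
    // expresses implicitly in C#.
    static String combined() {
        return fetchUser()
                .thenCombine(fetchScore(), (u, s) -> u + ":" + s)
                .join();
    }

    public static void main(String[] args) {
        System.out.println(combined()); // prints "user-42:99"
    }
}
```

Because the two fetches run concurrently rather than sequentially, total latency is bounded by the slower call rather than their sum, which is the throughput benefit the responsibilities above describe.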

Senior Data Engineer – Data Build Tool | India | 8 years | Salary not disclosed | Remote | Full Time

Location: Remote
Employment Type: Full-time
Shift Time: 4:30 PM - 1:30 AM IST
Experience Required: 8+ years in Data Engineering, with strong expertise in DBT

About the Role:
We are seeking a Senior Data Engineer with deep experience in DBT (Data Build Tool) to join our data team. You will be responsible for building scalable and maintainable data pipelines, transforming raw data into actionable insights, and helping shape the future of our data architecture and governance practices.

Key Responsibilities:
- Design, develop, and maintain data pipelines using DBT, SQL, and orchestration tools like Airflow or Prefect
- Collaborate with data analysts, scientists, and stakeholders to understand data needs and deliver clean, well-modeled datasets
- Optimize DBT models for performance and maintainability
- Implement data quality checks, version control, and documentation standards in DBT
- Work with cloud data warehouses like Snowflake, BigQuery, Redshift, or Databricks
- Own and drive best practices around data modeling (Kimball, star/snowflake schemas), transformation layers, and CI/CD for data
- Collaborate with cross-functional teams to integrate data from various sources (APIs, third-party tools, internal services)
- Monitor and troubleshoot data pipelines and ensure timely delivery of data to business stakeholders
- Mentor junior engineers and contribute to team growth and development

Required Skills:
- 7+ years of experience in Data Engineering or related fields
- 4+ years of hands-on experience with DBT (Core or Cloud)
- Strong SQL skills and experience with modular data modeling
- Experience with ELT/ETL pipelines using orchestration tools like Airflow, Dagster, Prefect, or similar
- Solid understanding of data warehouse architecture and performance tuning
- Proficiency with one or more cloud platforms: AWS, GCP, or Azure
- Familiarity with version control (Git), CI/CD pipelines, and testing frameworks in data engineering
- Experience working with structured and semi-structured (JSON, Parquet) data
- Excellent communication and documentation skills

Preferred Qualifications:
- Experience with DataOps practices and monitoring tools
- Familiarity with Python or Scala for data processing
- Exposure to Looker, Tableau, or other BI tools
- Knowledge of data governance, cataloging, or lineage tools (e.g., Great Expectations, Monte Carlo, Atlan)

Senior Data Engineer – Data Build Tool | India | 8-10 years | Salary (INR) not disclosed | Remote | Full Time

Location: Remote
Employment Type: Full-time
Shift Time: 4:30 PM - 1:30 AM IST
Experience Required: 8+ years in Data Engineering, with strong expertise in DBT

About the Role:
We are seeking a Senior Data Engineer with deep experience in DBT (Data Build Tool) to join our data team. You will be responsible for building scalable and maintainable data pipelines, transforming raw data into actionable insights, and helping shape the future of our data architecture and governance practices.

Key Responsibilities:
- Design, develop, and maintain data pipelines using DBT, SQL, and orchestration tools like Airflow or Prefect
- Collaborate with data analysts, scientists, and stakeholders to understand data needs and deliver clean, well-modeled datasets
- Optimize DBT models for performance and maintainability
- Implement data quality checks, version control, and documentation standards in DBT
- Work with cloud data warehouses like Snowflake, BigQuery, Redshift, or Databricks
- Own and drive best practices around data modeling (Kimball, star/snowflake schemas), transformation layers, and CI/CD for data
- Collaborate with cross-functional teams to integrate data from various sources (APIs, third-party tools, internal services)
- Monitor and troubleshoot data pipelines and ensure timely delivery of data to business stakeholders
- Mentor junior engineers and contribute to team growth and development

Required Skills:
- 7+ years of experience in Data Engineering or related fields
- 4+ years of hands-on experience with DBT (Core or Cloud)
- Strong SQL skills and experience with modular data modeling
- Experience with ELT/ETL pipelines using orchestration tools like Airflow, Dagster, Prefect, or similar
- Solid understanding of data warehouse architecture and performance tuning
- Proficiency with one or more cloud platforms: AWS, GCP, or Azure
- Familiarity with version control (Git), CI/CD pipelines, and testing frameworks in data engineering
- Experience working with structured and semi-structured (JSON, Parquet) data
- Excellent communication and documentation skills

Preferred Qualifications:
- Experience with DataOps practices and monitoring tools
- Familiarity with Python or Scala for data processing
- Exposure to Looker, Tableau, or other BI tools
- Knowledge of data governance, cataloging, or lineage tools (e.g., Great Expectations, Monte Carlo, Atlan)