
691 API Gateway Jobs - Page 19

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 6.0 years

17 - 20 Lacs

Hyderabad

Work from Office

About the Role: ZettaMine is looking for a highly experienced and motivated Technical Architect - Integration to lead the design and implementation of complex integration solutions across enterprise systems. In this role, you will work closely with cross-functional teams to define integration strategies, assess platform capabilities, and architect robust, scalable, and secure solutions that connect cloud, on-premises, and hybrid systems. This role demands deep technical acumen, hands-on development expertise, and strong leadership skills to guide integration projects through their full lifecycle.
Key Responsibilities:
- Define and own the integration architecture, including application, data, and process integration strategies.
- Design end-to-end integration solutions using modern platforms such as MuleSoft, Dell Boomi, Azure Logic Apps, Apache Kafka, SAP Integration Suite, etc.
- Develop API-first strategies and reusable services to facilitate system interoperability.
- Translate business requirements into scalable, secure, and high-performance integration designs.
- Implement and manage API gateways, developer portals, and lifecycle management tools.
- Design and deploy REST, SOAP, and GraphQL services, along with associated documentation using OpenAPI/Swagger standards.
- Govern API versioning, throttling, and security policies across internal and external APIs.
- Architect event-driven systems using Apache Kafka, ActiveMQ, RabbitMQ, or similar message brokers.
- Integrate EDI/B2B protocols and third-party systems using middleware solutions.
- Implement data transformation and mapping using tools like XSLT, DataWeave, JSONata, or custom scripts.
- Build integrations across cloud platforms (Azure, AWS, GCP) and SaaS applications.
- Work with containerized microservices using Docker, Kubernetes, and service mesh (e.g., Istio).
- Set up CI/CD pipelines for integration artifacts using Jenkins, GitLab CI, or Azure DevOps.
- Define and enforce data governance, security, and compliance across all integrations.
- Implement monitoring, alerting, and logging using tools like Splunk, ELK, or Azure Monitor.
- Conduct performance tuning, root-cause analysis, and issue resolution in complex integration landscapes.
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience in enterprise integration architecture and middleware platforms.
- Proven experience with one or more major integration platforms:
  1. MuleSoft Anypoint Platform
  2. Dell Boomi
  3. Azure Integration Services (Logic Apps, Functions, API Management)
  4. Apache Kafka
  5. SAP Integration Suite
  6. Informatica Intelligent Cloud Services (IICS)
- Proficiency in programming/scripting languages such as Java, Python, JavaScript, or Groovy.
- Familiarity with DevOps tools, Git repositories, CI/CD, and agile development methodologies.
- Excellent problem-solving skills with strong attention to detail.
- Strong communication skills with the ability to translate complex technical concepts for non-technical stakeholders.
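
For illustration of the API-first, OpenAPI-documented services this role calls for, here is a minimal sketch (not ZettaMine's code) using FastAPI, which generates an OpenAPI 3 document from typed route definitions; the service name and fields are assumptions.

```python
# Minimal API-first sketch: FastAPI generates an OpenAPI 3 document
# (served at /openapi.json) from the typed route definitions below.
# Service name and fields are illustrative, not from the posting.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Order Integration API", version="1.0.0")

class Order(BaseModel):
    order_id: str
    amount: float
    currency: str = "USD"

@app.post("/orders", response_model=Order, summary="Accept an order event")
def create_order(order: Order) -> Order:
    # In a real integration this would publish to a broker (e.g., Kafka)
    # or call a downstream system; here we simply echo the payload.
    return order
```

Running it with `uvicorn orders_api:app` serves the generated spec at /openapi.json and interactive docs at /docs.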

Posted 1 month ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office

Responsibilities : - Participate in team prioritization discussions with Product/Business stakeholders. - Estimate and own delivery tasks (design, dev, test, deployment, configuration, documentation) to meet the business requirements. - Automate build, operate, and run aspects of software. - Drive code/design/process trade-off discussions within their team when required. - Report status and manage risks within their primary application/service. - Drive integration of services focusing on customer journey and experience. - Perform demos/acceptance discussions in interacting with Product owners. - Understands operational and engineering experience, actively works to improve experience and metrics in ownership area. - Develop complete understanding of end-to-end technical architecture and dependency systems. Requirements : - Expert with previous experience in .Net Tech Stack API Development, SQL Server DB, Windows Services, Command-Line execution of a .NET Program. - Familiar with secure coding standards (e.g., OWASP, CWE, SEI CERT) and vulnerability management. - Understands the basic engineering principles used in building and running mission critical software capabilities (security, customer experience, testing, operability, simplification, service-oriented architecture). - Able to perform debugging and troubleshooting to analyze core, heap, thread dumps and remove coding errors. - Understands and implements standard branching (e.g., Gitflow) and peer review practices. - Has skills in test driven and behavior driven development (TDD and BDD) to build just enough code and collaborate on the desired functionality. - Understands internals of operating systems (Windows, Linux) to write interoperable and performant code. - Understands use cases for advanced design patterns (e.g., service-to-worker, MVC, API gateway, intercepting filter, dependency injection, lazy loading, all from the gang of four) to implement efficient code. - Understands and implements Application Programming Interface (API) standards and cataloging to drive API/service adoption and commercialization. - Has skills to author test code with lots of smaller tests followed by few contract tests at service level and fewer journey tests at the integration level (Test Pyramid concept). - Apply tools (e.g., Sonar) and techniques to scan and measure code quality and anti-patterns as part of development activity. - Has skills to collaborate with team and business stakeholders to estimate requirements (e.g., story pointing) and prioritize based on business value. - Has skills to elaborate and estimate non-functional requirements, including security (e.g., data protection, authentication, authorization), regulatory, and performance (SLAs, throughput, transactions per second). - Has skills to orchestrate release workflows and pipelines, and apply standardized pipelines via APIs to achieve CI and CD using industry standard tools (e.g., Jenkins, AWS/Azure pipelines, XL Release, others). - Understands how to build robust tests to minimize defect leakage by performing regression, performance, deployment verification, and release testing. - Has skills to conduct product demos and co-ordinate with product owners to drive product acceptance sign offs.

Posted 1 month ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role: AWS Data Engineer
Location: Hyderabad, Bangalore, Chennai, Mumbai, Pune, Kolkata, Gurgaon
Experience: 4-8 years
Work Mode: Hybrid
Engagement: Contract to Hire
Job Description: We are seeking a skilled AWS Data Engineer to design, develop, and support large-scale data solutions on AWS. The ideal candidate will have hands-on expertise in data engineering, automation, and cloud technologies, enabling data-driven decision-making and operational excellence.
Key Responsibilities:
- Design, develop, and deploy data pipelines and solutions using AWS services such as S3, Glue, Lambda, API Gateway, and SQS.
- Write clean, efficient code using Python, PySpark, and SQL to process and transform data.
- Implement batch job scheduling, manage data dependencies, and ensure reliable data processing workflows.
- Develop and maintain Spark and Airflow jobs for large-scale data processing and orchestration.
- Automate repetitive tasks and build reusable frameworks to enhance efficiency and reduce manual intervention.
- Provide Run/DevOps support, monitor pipelines, and manage the ongoing operation of data services on AWS.
- Ensure high standards for data quality, reliability, and performance.
- Collaborate with data scientists, analysts, and other engineers to support business initiatives.
Must-Have Skills:
- Strong hands-on experience with AWS services: S3, Lambda, Glue, API Gateway, SQS
- Proficiency in Python, PySpark, and SQL
- Experience with batch job scheduling and managing data dependencies
- Strong knowledge of Spark and Airflow for data processing and orchestration
- Solid understanding of DevOps practices and operational support for cloud data services
Good to Have:
- Experience with containerization (Docker, Kubernetes)
- Exposure to monitoring/logging tools (CloudWatch, Datadog, etc.)
- AWS certifications (e.g., Solutions Architect, Data Analytics Specialty)
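
As a rough illustration of the PySpark-on-S3 pipelines this posting describes (not the employer's code), the sketch below reads raw JSON events from S3, aggregates them, and writes partitioned Parquet; the bucket paths and column names are assumptions.

```python
# Hypothetical PySpark batch transform: read raw events from S3,
# filter and aggregate, then write partitioned Parquet back to S3.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders-aggregate").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/orders/2024-01-01/")

daily = (
    raw.filter(F.col("status") == "COMPLETED")
       .groupBy("customer_id", F.to_date("created_at").alias("order_date"))
       .agg(F.sum("amount").alias("total_amount"),
            F.count("*").alias("order_count"))
)

daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/daily_orders/"
)
```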

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Noida

Work from Office

Responsibilities:
- Design, develop, and implement scalable, high-performance backend services using Node.js.
- Take ownership of features end to end, from planning and architecture to coding, testing, deployment, and monitoring.
- Lead system design and architecture discussions for new modules and services.
- Build and maintain RESTful APIs and microservices, with attention to clean interfaces and robust error handling.
- Work with both SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Redis) to support dynamic data models.
- Collaborate closely with front-end developers, DevOps engineers, and product managers to deliver integrated solutions.
- Ensure performance, reliability, and security across all backend services.
- Implement automated tests (unit, integration, and performance) to ensure code quality and system stability.
- Contribute to DevOps practices, including CI/CD pipelines and cloud infrastructure (AWS/GCP).
- Stay hands-on in code and continuously improve performance, code quality, and scalability.
Required Skills & Qualifications:
- 3+ years of strong, hands-on backend development experience using Node.js.
- Strong proficiency in Node.js (Express.js/Nest.js).
- Good hands-on experience in Laravel and CodeIgniter.
- Database knowledge: MySQL, MongoDB.
- Experience in CRM, travel and tour websites, and eCommerce development.
- Strong understanding of REST APIs and third-party integrations.
- Knowledge of application security best practices.
- Familiarity with version control (Git) and deployment processes.
- Ability to manage time efficiently and work independently or in a team.
- Experience with JWT, OAuth, or Passport.js.
- Familiarity with cloud hosting (AWS, Linode, Google Cloud) is a plus.
- Understanding of containerization (Docker) is an advantage.

Posted 1 month ago

Apply

10.0 - 12.0 years

7 - 12 Lacs

Mumbai

Work from Office

About the Role: We are seeking an experienced and highly skilled Senior AWS Engineer with over 10 years of professional experience to join our dynamic and growing team. This is a fully remote position requiring strong expertise in serverless architectures, AWS services, and infrastructure as code. You will play a pivotal role in designing, implementing, and maintaining robust, scalable, and secure cloud solutions.
Key Responsibilities:
- Design & Implementation: Lead the design and implementation of highly scalable, resilient, and cost-effective cloud-native applications leveraging a wide array of AWS services, with a strong focus on serverless architecture and event-driven design.
- AWS Services Expertise: Architect and develop solutions using core AWS services including AWS Lambda, API Gateway, S3, DynamoDB, Step Functions, SQS, AppSync, Amazon Pinpoint, and Cognito.
- Infrastructure as Code (IaC): Develop, maintain, and optimize infrastructure using the AWS CDK (Cloud Development Kit) to ensure consistent, repeatable, and version-controlled deployments. Drive the adoption and implementation of CodePipeline for automated CI/CD.
- Serverless & Event-Driven Design: Champion serverless patterns and event-driven architectures to build highly efficient and decoupled systems.
- Cloud Monitoring & Observability: Implement comprehensive monitoring and observability solutions using CloudWatch Logs, X-Ray, and custom metrics to proactively identify and resolve issues, ensuring optimal application performance and health.
- Security & Compliance: Enforce stringent security best practices, including robust IAM roles and boundaries, PHI/PII tagging, secure configurations with Cognito and KMS, and adherence to HIPAA standards. Implement isolation patterns and fine-grained access control mechanisms.
- Cost Optimization: Proactively identify and implement strategies for AWS cost optimization, including S3 lifecycle policies, leveraging serverless tiers, and strategic service selection (e.g., evaluating Amazon Pinpoint vs. SES based on cost-effectiveness).
- Scalability & Resilience: Design and implement highly scalable and resilient systems incorporating auto-scaling, Dead-Letter Queues (DLQs), retry/backoff mechanisms, and circuit breakers to ensure high availability and fault tolerance.
- CI/CD Pipeline: Contribute to the design and evolution of CI/CD pipelines, ensuring automated, efficient, and reliable software delivery.
- Documentation & Workflow Design: Create clear, concise, and comprehensive technical documentation for architectures, workflows, and operational procedures.
- Cross-Functional Collaboration: Collaborate effectively with cross-functional teams, including developers, QA, and product managers, to deliver high-quality solutions.
- AWS Best Practices: Advocate for and ensure adherence to AWS best practices across all development and operational activities.
Required Skills & Experience:
- 10+ years of hands-on experience as an AWS Engineer or in a similar role.
- Deep expertise in AWS services: Lambda, API Gateway, S3, DynamoDB, Step Functions, SQS, AppSync, CloudWatch Logs, X-Ray, EventBridge, Amazon Pinpoint, Cognito, KMS.
- Proficiency in Infrastructure as Code (IaC) with the AWS CDK; experience with CodePipeline is a significant plus.
- Extensive experience with serverless architecture and event-driven design.
- Strong understanding of cloud monitoring and observability tools: CloudWatch Logs, X-Ray, custom metrics.
- Proven ability to implement and enforce Security & Compliance measures, including IAM roles boundaries, PHI/PII tagging, Cognito, KMS, HIPAA standards, Isolation Pattern, and Access Control. - Demonstrated experience with Cost Optimization techniques (S3 lifecycle policies, serverless tiers, service selection). - Expertise in designing and implementing Scalability & Resilience patterns (auto-scaling, DLQs, retry/backoff, circuit breakers). - Familiarity with CI/CD Pipeline Concepts. - Excellent Documentation & Workflow Design skills. - Exceptional Cross-Functional Collaboration abilities. - Commitment to implementing AWS Best Practices.
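
To make the IaC expectation concrete, here is a hedged sketch using the AWS CDK v2 Python bindings to define a Lambda function fronted by API Gateway; the stack, construct, and asset names are assumptions, not details from the posting.

```python
# Sketch of CDK-based IaC: one Lambda function fronted by a REST API.
from aws_cdk import App, Stack, aws_lambda as _lambda, aws_apigateway as apigw
from constructs import Construct

class ApiStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        handler = _lambda.Function(
            self, "OrdersHandler",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="index.handler",
            code=_lambda.Code.from_asset("lambda"),  # directory containing index.py
        )

        # REST API whose routes proxy to the Lambda function
        apigw.LambdaRestApi(self, "OrdersApi", handler=handler)

app = App()
ApiStack(app, "OrdersApiStack")
app.synth()
```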

Posted 1 month ago

Apply

10.0 - 15.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Senior Full-Stack Developer (Node.js, React, AWS, and GenAI/LLM)
Location: Chennai, Hyderabad, Bangalore
Experience: 10 Years
Work Type: Onsite
Budget: As per market standards
Primary Skills:
- Node.js: 6+ years of hands-on backend development
- JavaScript/HTML/CSS: Strong frontend development capabilities
- ReactJS/VueJS: Working knowledge or project experience preferred
- AWS Serverless Architecture: Mandatory (Lambda, API Gateway, S3)
- LLM Integration / AI Development: Experience with OpenAI and Anthropic APIs
- Prompt Engineering: Context management and token optimization
- SQL/NoSQL Databases: Solid experience with relational and non-relational DBs
- End-to-End Deployment: Deploy, debug, and manage full-stack apps
- Clean Code: Writes clean, maintainable, production-ready code
Secondary Skills:
- Amazon Bedrock: Familiarity is a strong plus
- Web Servers: Experience with Nginx/Apache configuration
- RAG patterns, vector DBs, AI agents: Bonus experience
- Software Engineering Best Practices: Strong design and architecture skills
- CI/CD and DevOps exposure: Beneficial for full pipeline integration
Expectations:
- Own frontend and backend development
- Collaborate closely with engineering and client teams
- Build scalable, secure, and intelligent systems
- Influence architecture and tech stack decisions
- Stay up to date with AI trends and serverless best practices

Posted 1 month ago

Apply

5.0 - 8.0 years

25 - 30 Lacs

Pune

Remote

Will design, develop, document, test, and deploy modern apps leveraging Java, MySQL, CouchDB, React, and REST APIs. Experience with ActiveMQ is a plus. Thrives in an agile environment, contributes to design discussions, and promotes engineering best practices.
Required Candidate profile: Design and implement scalable backend services using Java and RESTful APIs. Develop modern, responsive UIs using Vue.js. Work with MySQL and CouchDB to design and optimize data storage solutions.

Posted 1 month ago

Apply

7.0 - 12.0 years

90 - 100 Lacs

Gurugram

Work from Office

Expected Notice Period: 30 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Hybrid (Gurugram)
Must-have skills required: Experience with media (image/video) workflows, Gen AI models, or visual editing apps; AWS
Looking for: We're hiring a Director of Engineering to lead execution across both of our products: Crop.photo (bulk AI photo editing and banner automation) and Evolphin Zoom (enterprise-grade MAM). This role requires strong technical fluency and product judgment. You're not expected to write code, but you must understand how modern AWS-based SaaS systems are built, own delivery across teams, clarify PRDs/specs, unblock engineers, and make sure what we ship meets the definition of customer product-market fit. You'll work directly with the CTO, taking high-level product requirements and translating them into clear, implementation-ready specs for the team. You'll lead engineering across two teams with very different maturity levels: one startup-fast and AI-driven, the other powering massive enterprise deployments. You'll be expected to enforce delivery timelines and priorities, and build a culture of high ownership and real ICP awareness. This role sits at the intersection of engineering, product, and execution. You won't just manage tickets; you'll guide the team to build things that actually work for customers. That includes asking hard questions when specs are vague, catching broken UX flows before QA does, and holding engineering team members to a standard that reflects the level we need to operate at to achieve product-market fit.
What You'll Do:
- Own engineering delivery across two active products: Crop.photo and Evolphin Zoom
- Translate high-level product inputs/PRDs from the CTO into actionable, implementation-ready specs
- Unblock engineers, challenge weak assumptions, and clarify gaps before they become delays
- Work with leads to ensure velocity stays high and step in when accountability drops
- Balance short-term execution with longer-term platform thinking across very different stacks
- Prioritize real outcomes: features that solve customer problems, not just checkboxes in a PRD sheet
- Occasionally dive into code or architecture reviews when context or escalation demands it
- Help grow the team by identifying strong ICs, spotting performance issues and skill gaps early, and mentoring mid-level contributors
- Operate entirely in modern tooling (Google Docs, Notion, Teams, ChatGPT, GitHub, YouTrack, and Jira) and bring in better tools when needed
- Coordinate with product, QA, and UX design to align expectations and catch misalignment early
What You Must Bring:
- 8+ years of engineering experience, including 2-3 years in a senior engineering management or director-level role
- Strong track record of delivery in B2B SaaS or product-focused startups
- Technical fluency across cloud-first architecture (AWS), APIs, and visual web apps
- 3+ years working with modern AWS infrastructure (EC2, ECS, Lambda, API Gateway, CloudWatch, IAM), enough to understand cost, reliability, and trade-offs
- 2-3 years working with frontend frameworks (React) in products with visual UX or creative/editor-style interfaces
- Familiarity with backend stacks including FastAPI, Python, Java/Spring, and API design, enough to evaluate decisions and guide implementation
- 1+ years of experience working with AI/ML or MLOps pipelines, including deployment of models via FastAPI, Docker, or GPU-based ECS
- Exposure to CI/CD pipelines (AWS CodeBuild/CodeDeploy), containerization (Docker), and monitoring/debugging tools (Datadog, CloudWatch, or similar)
- High product IQ: you know when a feature won't work for the customer, even if it's technically complete
- Strong written communication: you can write specs, help docs, and feedback notes without slowing the team down or depending on the CTO/Founder to fill in the gaps
Nice to Have:
- Experience with media (image/video) workflows, gen AI models, or visual editing apps
- Familiarity with Evolphin's or Crop.photo's tech stack (FastAPI, Docker, React, Java/Spring, DynamoDB, PostgreSQL, etc.)
- Exposure to multi-tenant SaaS systems and cost-optimized scaling on AWS
- Prior role as Staff Engineer or EM in a startup - 40 people.

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 9 Lacs

Kolkata

Work from Office

Seeking a results-driven Python Developer with expertise in API development, AWS services, SQL, and raw queries. Must have a basic grasp of backend architecture and system design principles. A Python Developer Lead plays a crucial role in the software development lifecycle, combining deep technical expertise in Python with strong leadership and project management skills. They are responsible for guiding a team of Python developers and ensuring the delivery of high-quality, scalable, and efficient software solutions.
Job Summary: The Python Developer Lead will be responsible for overseeing the design, development, and deployment of robust, scalable, and performant Python applications. This role requires a blend of hands-on coding, architectural design, team leadership, and cross-functional collaboration. The Lead will mentor junior developers, establish best practices, ensure code quality, and contribute significantly to the overall technical strategy and success of our projects.
Key Responsibilities:
Technical Leadership & Architecture:
- Lead the design and development of complex Python-based systems, ensuring scalability, reliability, and maintainability.
- Define and enforce coding standards, design patterns, and architectural principles across the team.
- Conduct code reviews, provide constructive feedback, and ensure adherence to best practices.
- Stay abreast of emerging technologies, tools, and trends in the Python ecosystem and integrate relevant advancements.
Team Management & Mentorship:
- Manage and mentor a team of Python developers, fostering their technical growth and professional development.
- Assign tasks, monitor progress, and provide guidance to ensure efficient project execution.
- Facilitate knowledge sharing and encourage a collaborative team environment.
- Participate in the hiring process for new team members.
Software Development & Delivery:
- Develop, test, and deploy high-quality, efficient, and well-documented Python code for various applications and services.
- Work with cross-functional teams (Product, UI/UX, QA, DevOps) to translate business requirements into technical specifications and deliver effective solutions.
- Design and implement RESTful APIs, integrate with third-party services, and manage data pipelines.
- Troubleshoot and debug complex issues, ensuring low-latency and high-availability applications.
- Oversee the entire software development lifecycle, from conception to deployment and maintenance.
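
As a small illustration of the "raw queries" skill mentioned above (not the employer's code), the sketch below runs a parameterized SQL query through the standard DB-API pattern; sqlite3 stands in for whichever driver the team actually uses, and the table and column names are assumptions.

```python
# Parameterized raw SQL: the value is bound, never interpolated into the string.
import sqlite3

def orders_for_customer(conn: sqlite3.Connection, customer_id: int) -> list:
    cur = conn.execute(
        "SELECT id, amount, created_at FROM orders "
        "WHERE customer_id = ? ORDER BY created_at DESC",
        (customer_id,),  # bound parameter
    )
    return cur.fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL, created_at TEXT)")
    conn.execute("INSERT INTO orders VALUES (1, 42, 99.5, '2024-01-01')")
    print(orders_for_customer(conn, 42))
```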

Posted 1 month ago

Apply

2.0 - 5.0 years

5 - 8 Lacs

Bengaluru

Work from Office

A Web Developer is responsible for the design and construction of websites, ensuring that sites meet user expectations: they look good, run smoothly, and offer easy access with no loading issues between pages or error messages.
Requirements:
- Experience in creating web applications based on Angular/React with C# .NET, hosted on the Azure cloud.
- Experience in microservice API development.
- Full-stack web development experience using TypeScript, Angular 4+, HTML5, and CSS3.
- Exposure to CI/CD tools, code analysis, test automation, and Azure services (Functions, API Gateway, Compute, AKS) is preferred.
- Strong analytical and debugging skills.

Posted 1 month ago

Apply

1.0 - 3.0 years

2 - 5 Lacs

Bengaluru

Work from Office

A Web Developer is responsible for the design and construction of websites, ensuring that sites meet user expectations: they look good, run smoothly, and offer easy access with no loading issues between pages or error messages.
Requirements:
- Experience in creating web applications based on Angular/React with C# .NET, hosted on the Azure cloud.
- Experience in microservice API development.
- Full-stack web development experience using TypeScript, Angular 4+, HTML5, and CSS3.
- Exposure to CI/CD tools, code analysis, test automation, and Azure services (Functions, API Gateway, Compute, AKS) is preferred.
- Strong analytical and debugging skills.

Posted 1 month ago

Apply

4.0 - 9.0 years

0 - 3 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Dear Candidate,
Season's Greetings! Interested candidates: please share your updated resume along with a photo, PAN card, and PF member service history.
Role: Python Data Engineer
Work Nature: Contract to Hire (3rd-party payroll)
Work Location: Pan India
Total Experience: 4+ years (immediate joiners only)
Email: tuppari.pradeep@firstmeridianglobal.com
Job Description:
- 4+ years of experience in backend development with Python.
- Strong experience with AWS services and cloud architecture.
- Proficiency in developing RESTful APIs and microservices.
- Experience with database technologies such as SQL, PostgreSQL, and NoSQL databases.
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Knowledge of CI/CD pipelines and tools such as Jenkins, GitLab CI, or AWS CodePipeline.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Please share your updated resume to tuppari.pradeep@firstmeridianglobal.com along with the following details:
- Full Name as per Govt proofs
- Contact Number
- Alternate Contact Number
- Email ID
- Date of Birth
- Father's Name
- Total Experience
- Relevant Experience
- PAN Card Number
- Current CTC
- Expected CTC
- Current Work Location
- Preferred Location
- Open for Relocation (Yes/No)
- Current Company Name
- Notice Period
- Mode of Employment (Contract/Permanent)
- If Contract, please provide the payroll company
If you know anyone who could be interested in this profile, feel free to share this posting with them.
Regards,
Pradeep (tuppari.pradeep@firstmeridianglobal.com)

Posted 1 month ago

Apply

8.0 - 15.0 years

11 - 15 Lacs

Mumbai

Work from Office

Education: BE/BCA/B.Tech/B.Sc. IT or any IT graduate from an authorised university.
Experience/Qualifications:
- Excellent written and verbal communication skills in English, high integrity, a strong work ethic, and the ability to empathize with the customer.
- At least 8-15 years of cyber security experience with a large organization, bank, or global IT or consulting firm.
- Strong background in Application Security and the Secure Software Development Lifecycle (SSDLC).
- Experience in threat modelling, application security architecture review, and security testing (SCA, SAST, DAST).
- Exposure to security tools integration in a DevOps architecture.
- Exposure to microservices security and API security.
- Exposure to the evaluation and implementation of application security and testing tools.
- Troubleshooting and problem-solving ability, including analytical thinking and strong attention to detail.
- Good understanding of application security standards such as OWASP, SANS, NIST, etc.
- Good understanding of Security by Design and Privacy by Design.
- Good understanding of compliance requirements for payment and non-payment applications.
- Product and platform security assessment exposure is desirable.
- Understanding of Load Balancers, WAF, CDN, API Gateway, secrets management, etc. is desired.
- Exposure to cloud application (SaaS) security solutions is desirable.
- Good understanding of encryption tools and technologies: SSL, key management, HSM, PKI infrastructure, and secrets management.
- Ability to assess solutions and recommend proactive steps to mitigate network, OS, and application-layer security attacks.
Responsibilities:
- Act as Subject Matter Expert for Application and Product Security.
- Understand business requirements, complexity, and solution architecture, and estimate the scope and effort of SSDLC and cyber security work.
- Drive SSDLC for projects from the initial stage through development and implementation.
- Plan, allocate resources for, and track SSDLC service delivery.
- Conduct threat modelling, application architecture review, SCA, SAST, DAST, and IAST.
- Implement SCA, SAST, DAST, and IAST tools for application security testing.
- Continually learn and enhance skills and processes for service delivery.
- Provide advice on secure coding best practices.
- Conduct application security trainings for the team and developers.
- Manage a small team for Application Security & SSDLC.
- Provide inputs for product and platform security.
- Assess application, product, and platform security as per the scope of the engagement.
- Prepare the application risk summary and register, and trace findings to closure.
- Prepare weekly/monthly service delivery reports and review them with the BU Lead and VH.

Posted 1 month ago

Apply

4.0 - 9.0 years

10 - 13 Lacs

Bengaluru

Remote

Join our team to build scalable full-stack solutions using .NET Core, React, and AWS. You'll own backend APIs, integrate frontend apps, and manage cloud infrastructure, using React for frontend development.
Required Candidate profile: 4+ years in full-stack development with .NET, React, and AWS. Must have built secure APIs, deployed services on AWS, and collaborated across teams. Strong code quality and system design skills required.

Posted 1 month ago

Apply

4.0 - 9.0 years

0 - 3 Lacs

Hyderabad

Hybrid

Urgent hiring for an AWS Data Engineer for client DELOITTE (CTH position).
Hi, Season's Greetings!
Role: AWS Data Engineer
Work Nature: Contract to Hire (3rd-party payroll)
Work Location: Hyderabad
Total Experience: 4+ years (immediate joiners only)
Job Description: AWS Data Engineer
- Hands-on experience with AWS services including S3, Lambda, Glue, API Gateway, and SQL.
- Strong skills in data engineering on AWS, with proficiency in Python, PySpark, and SQL.
- Experience with batch job scheduling and managing data dependencies.
- Knowledge of data processing tools like Spark and Airflow.
- Automate repetitive tasks and build reusable frameworks to improve efficiency.
- Provide Run/DevOps support and manage the ongoing operation of data services.
Please share your updated resume along with the below details:
- Full Name as per Govt proofs
- Contact Number
- Alternate Contact Number
- Email ID
- Date of Birth
- Father's Name
- Total Experience
- Relevant Experience
- PAN Card Number
- Current CTC
- Expected CTC
- Current Work Location
- Preferred Location
- Open for Relocation (Yes/No)
- Current Company Name
- Notice Period
- Mode of Employment (Contract/Permanent)
- If Contract, please provide the payroll company
If you know anyone who could be interested in this profile, feel free to share this posting with them.
Contact: Mounika.t@affluentgs.com / 7661922227
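
For context on the batch scheduling and dependency management this posting asks about, here is a hedged Airflow DAG sketch with two tasks and an explicit dependency; the DAG ID, schedule, and task bodies are placeholders, not the client's pipeline.

```python
# Illustrative Airflow 2.x DAG: a daily extract task followed by a transform
# task, with the ordering dependency expressed explicitly.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("pull raw files from S3 for", context["ds"])

def transform(**context):
    print("run PySpark/Glue transform for", context["ds"])

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # transform runs only after extract succeeds
```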

Posted 1 month ago

Apply

6.0 - 11.0 years

18 - 33 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Greetings from TCS! TCS is hiring for a Kong Developer.
Job Title: Kong Developer
Location: Chennai/Hyderabad/Bangalore/Pune/Kolkata
Experience Range: 6-15 Years
Minimum Qualification: 15 years of full-time education
Must Have:
- Expert-level experience with SOAP and REST web services; RAML, Swagger, and OpenAPI development; and SOA APIs.
- Develop APIs on the Kong API Management platform.
- Develop Kong custom plugins.
- Install, configure, and deploy solutions using Kong.
- Microservices development using Spring Boot.
- Scripting using Python/Bash/shell, etc.
- Experience working on a cloud platform.
Good to Have:
- Knowledge of DevOps tools
- Any other API Management products
- Terraform for cloud infrastructure setup
- Working with test automation suites like Gauge, BDD, etc.
- Kubernetes and Docker knowledge
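
Kong custom plugins themselves are typically written in Lua against the Kong PDK, so the Python sketch below only illustrates driving the Kong Admin API to register a service, a route, and a rate-limiting plugin; the admin URL, names, and limits are assumptions.

```python
# Registering a service, route, and plugin through the Kong Admin API.
import requests

ADMIN = "http://localhost:8001"  # default Kong Admin API port

# 1. Register an upstream service
requests.post(f"{ADMIN}/services",
              json={"name": "orders", "url": "http://orders.internal:8080"}).raise_for_status()

# 2. Expose it on a public path
requests.post(f"{ADMIN}/services/orders/routes",
              json={"name": "orders-route", "paths": ["/orders"]}).raise_for_status()

# 3. Attach a rate-limiting plugin to the service
requests.post(
    f"{ADMIN}/services/orders/plugins",
    json={"name": "rate-limiting", "config": {"minute": 60, "policy": "local"}},
).raise_for_status()
```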

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Pune

Hybrid

GenAI Developer (Immediate Joiner) - Pune (Hinjewadi)
We are seeking experienced GenAI Developers to join our team in Pune. As a key member of our AI innovation group, you will be responsible for designing, architecting, and implementing scalable Generative AI solutions leveraging AWS services. You will lead the transformation of existing LLM-based architectures into modern, cloud-native solutions and collaborate with cross-functional teams to deliver cutting-edge AI applications.
Key Responsibilities:
- Design and architect scalable GenAI solutions using AWS Bedrock and other AWS services.
- Lead the transformation of homegrown LLM-based architecture into a managed cloud-native solution.
- Provide architectural guidance across components: RAG pipelines, LLM orchestration, vector DB integration, inference workflows, and scalable endpoints.
- Collaborate closely with GenAI developers, MLOps, and data engineering teams to ensure alignment on implementation.
- Evaluate, integrate, and benchmark frameworks such as LangChain, Autogen, Haystack, or LlamaIndex.
- Ensure infrastructure and solutions are secure, scalable, and cost-optimized on AWS.
- Act as a technical SME and hands-on contributor for architecture reviews and POCs.
Must-Have Skills:
- Deep expertise with AWS Bedrock, S3, Lambda, SageMaker, API Gateway, DynamoDB/Redshift, etc.
- Proven experience architecting LLM applications with RAG, embeddings, and prompt engineering.
- Hands-on understanding of frameworks like LangChain, LlamaIndex, or Autogen.
- Knowledge of LLMs such as Anthropic Claude, Mistral, Falcon, or custom models.
- Strong understanding of API design, containerization (Docker), and serverless architecture.
- Experience leading cloud-native transformations.
Preferred:
- Experience with CI/CD and DevOps integration for ML/AI pipelines.
- Exposure to AWS, Azure, or GCP GenAI services (bonus).
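
As a hedged illustration of the Bedrock-based building blocks this role works with (not the employer's implementation), the sketch below sends a retrieval-augmented prompt to a Claude model on Amazon Bedrock via boto3; the model ID and request-body format follow the Anthropic-on-Bedrock convention and should be verified against current documentation.

```python
# Minimal Bedrock call: stuff retrieved context into a prompt and invoke a model.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def ask(question: str, context_chunks: list) -> str:
    prompt = ("Answer using only this context:\n"
              + "\n".join(context_chunks)
              + f"\n\nQuestion: {question}")
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    }
    resp = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
        body=json.dumps(body),
    )
    payload = json.loads(resp["body"].read())
    return payload["content"][0]["text"]
```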

Posted 1 month ago

Apply

7.0 - 13.0 years

9 - 15 Lacs

Bengaluru

Work from Office

Location: Bangalore.
About LeadSquared: One of the fastest-growing SaaS unicorn companies in the CRM space, LeadSquared empowers organizations with the power of automation. More than 2,000 customers with 2 lakh+ users across the globe use the LeadSquared platform to automate their sales and marketing processes and run high-velocity sales at scale. We are backed by prominent investors such as Westbridge Capital, Stakeboat Capital, and Gaja Capital, to name a few. We are expanding rapidly, and our 1,300+ strong and still growing workforce is spread across India, the U.S., the Middle East, ASEAN, ANZ, and South Africa. LeadSquared is among the Top 50 fastest-growing tech companies in India as per the Deloitte Fast 50 program, winner of Frost & Sullivan's 2019 Marketing Automation Company of the Year award, among the Top 100 fastest-growing companies in the FT 1000: High-Growth Companies Asia-Pacific, and listed as a Top Rated Product on G2 Crowd, GetApp, and TrustRadius.
Engineering @ LeadSquared: At LeadSquared, we like being up to date with the latest technology and utilizing trending tech stacks to build our product. By joining the engineering team, you get to work first-hand with the latest web and mobile technologies and solve the challenges of scale, performance, security, and cost optimization. Our goal is to build the best SaaS platform for sales execution in the industry, and what better place than LeadSquared for an exciting career?
The Role: The LeadSquared platform and product suite are 100% on the cloud and currently all on AWS. The product suite comprises a large number of applications, services, and APIs built on various open-source and AWS-native tech stacks and deployed across multiple AWS accounts. The role involves leading the mission-critical responsibility of ensuring that all our online services are available, reliable, secure, performant, and running at optimal cost. We firmly believe in a code- and automation-driven approach to site reliability.
Responsibilities:
- Take ownership of release management with effective build and deployment processes by collaborating with development teams.
- Handle infrastructure and configuration management of production systems.
- Be a stakeholder in product scoping, performance enhancement, cost optimization, and architecture discussions with the engineering leaders.
- Automate DevOps functions and take full control of source code repository management with continuous integration.
- Develop a strong understanding of product functionality, customers' use cases, and architecture.
- Prioritize and meet the SLAs for incident and service management, and ensure that projects are managed and delivered on time and with quality.
- Recommend new technologies and tools that automate manual tasks and enable better observability and faster troubleshooting.
- Ensure the team adheres to compliance and company policies with regular audits.
- Motivate, empower, and improve the team's technical skills.
Requirements:
- 13+ years' experience in building, deploying, and scaling software applications on the AWS cloud (preferably in SaaS).
- Deep understanding of observability and cost optimization for all major AWS services: EC2, RDS, Elasticsearch, Redis, SQS, API Gateway, Lambda, etc. AWS certification is a plus.
- Experience in building tools for deployment automation and observability response management for AWS resources; .NET, Python, and CFTs or Terraform are preferred.
- Operational experience in deploying, operating, scaling, and troubleshooting large-scale production systems on the cloud.
- Strong interpersonal communication skills (including listening, speaking, and writing) and the ability to work well in a diverse, team-focused environment with other DevOps and engineering teams.
- Function well in a fast-paced, rapidly changing environment.
- 5+ years' experience in people management.
Why Should You Apply? Fast-paced environment, accelerated growth and rewards, easily approachable management, work with the best minds and industry leaders, and flexible work timings.
Interested? If this role sounds like you, then apply with us! You have plenty of room for growth at LeadSquared.
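
To illustrate the observability and cost-optimization angle of this role (not LeadSquared's tooling), here is a small boto3 sketch that pulls average EC2 CPU utilization from CloudWatch to flag idle capacity; the region, instance ID, and window are assumptions.

```python
# Pull average CPU utilization for an EC2 instance to spot right-sizing candidates.
from datetime import datetime, timedelta, timezone
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")

def avg_cpu(instance_id: str, hours: int = 24) -> float:
    end = datetime.now(timezone.utc)
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=end - timedelta(hours=hours),
        EndTime=end,
        Period=3600,
        Statistics=["Average"],
    )
    points = stats["Datapoints"]
    return sum(p["Average"] for p in points) / len(points) if points else 0.0

if __name__ == "__main__":
    print(avg_cpu("i-0123456789abcdef0"))  # consistently low values suggest idle capacity
```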

Posted 1 month ago

Apply

2.0 - 6.0 years

3 - 6 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

hackajob is collaborating with LexisNexis Risk Solutions to connect them with exceptional tech professionals for this role.
Rust Developer @ IDVerse. Full time, Remote, AU/US.
Join a strong team of passionate engineers and build a world-class platform to fight identity fraud at a global scale. In Rust.
The Position: You will work in close collaboration with our SVP of Architecture, our engineering team, and our product team to:
- Write and re-write hardened, documented, and tested web- and API-based applications.
- Build our growing collection of libraries.
- Define and enforce best practices of an expanding Rust team.
A fair chunk is green-field work (and no, it's not cryptocurrency/blockchain related). Even our front-end applications are written in Rust, using Leptos for the WASM (and Tailwind for CSS). We prefer event-based architectures, cloud (AWS), and serverless. Only the good stuff.
Needed Qualifications: Whilst technical competence is critical, we place great emphasis on passion, communication, and collaboration across the business.
- You have solid experience creating and maintaining web-based and API-based applications (in Rust or not).
- You can demonstrate having built non-trivial Rust projects, ideally web-related.
- You are comfortable with JavaScript/TypeScript.
- You are able to communicate clearly, both in writing and orally, and collaborate effectively with a remote team.
- You understand that documentation is half the battle, and that untested code is broken code.
- You know it takes time to build anything correctly, and you also know how to "get things done" when the situation calls for it.
- You are autonomous, but also know it's better to ask than to guess.
- You are dependable, responsible, and committed.
Nice-to-Haves: It would be even more awesome if you have experience:
- Building front-end WebAssembly applications (with Leptos, maybe?).
- Solving problems with machine learning.
- Developing for/with AWS serverless technologies (API Gateway, Lambda, DynamoDB...).
Location and Time Zone: Our team is globally distributed and fully remote. The higher concentration is based around the Australian/East Asia time zones. For this role, we'll consider any location, but will favour either the American or European time zones.
About Us: IDVerse is a Sydney-based start-up that is a global pioneer in the development of digital identity verification technology. We've built everything from the ground up and have a broad range of blue-chip customers across banking, telecommunications, government, and more. We've perfected the technology locally in Australia and New Zealand, and are quickly expanding into the northern hemisphere. We're still a small team, and take pride in making it smart and inclusive. The position is remote and the work week can be flexible. Remuneration will be competitive and based on experience. We encourage people from all backgrounds and genders to apply to this position. As an early member of the team, you will have a great impact on its future shape.
Instructions on How to Apply: Send an email to devjobs@idverse.com with "Rust Up!" in the title (be exact; we have automated filters that will discard anything else. This is your first test!). Write a few lines about yourself and attach your résumé. Add any link you think will help us assess both your soft and hard skills. If you pique our interest, we'll set up a video call and go from there.

Posted 1 month ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Role Overview: We are seeking a skilled Backend Developer with expertise in TypeScript and AWS to design and implement scalable, event-driven microservices. The ideal candidate will have a strong background in serverless architectures and backend development.
Key Responsibilities:
- Backend Development: Develop and maintain server-side applications using TypeScript and Node.js.
- API Design: Create and manage RESTful APIs adhering to OpenAPI specifications.
- Serverless Architecture: Implement serverless solutions using AWS Lambda, API Gateway, and DynamoDB.
- Event-Driven Systems: Design and build event-driven architectures utilizing AWS SQS and SNS.
- Microservices: Develop microservices that are scalable and maintainable.
- Collaboration: Work closely with frontend developers and other stakeholders to integrate APIs and ensure seamless functionality.
- Code Quality: Write clean, maintainable code and conduct code reviews.
- Continuous Improvement: Stay updated with the latest industry trends and technologies to continuously improve backend systems.
Required Skills & Qualifications:
- Experience: 7-10 years in backend development with a focus on TypeScript and Node.js.
- AWS Expertise: Proficiency in AWS services such as Lambda, API Gateway, DynamoDB, SQS, and SNS.
- API Development: Experience in designing and implementing RESTful APIs.
- Event-Driven Architecture: Familiarity with building event-driven systems using AWS services.
- Microservices: Experience in developing microservices architectures.
- Version Control: Proficiency in using Git for version control.
- CI/CD: Experience with continuous integration and continuous deployment pipelines.
- Collaboration: Strong communication skills and ability to work in a team environment.
Preferred Skills:
- Infrastructure as Code: Experience with tools like Terraform or AWS CloudFormation.
- Containerization: Familiarity with Docker and container orchestration tools.
- Monitoring & Logging: Experience with monitoring and logging tools to ensure system reliability.
- Agile Methodologies: Experience working in Agile development environments.

Posted 1 month ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Bengaluru

Work from Office

We are seeking a highly skilled and motivated System Ops Engineer with a strong background in computer science or statistics and at least 5 years of professional experience. The ideal candidate will possess deep expertise in cloud computing (AWS), data engineering, Big Data applications, AI/ML, and SysOps. A strong technical foundation, proactive mindset, and the ability to work in a fast-paced environment are essential.
Key Responsibilities:
Cloud Expertise:
- Proficient in AWS services including EC2, VPC, Lambda, DynamoDB, API Gateway, EBS, S3, IAM, and more.
- Design, implement, and maintain scalable, secure, and efficient cloud-based solutions.
- Execute optimized configurations for cloud infrastructure and services.
Data Engineering:
- Develop, construct, test, and maintain data architectures, such as databases and processing systems.
- Write efficient Spark and Python code for data processing and manipulation.
- Administer and manage multiple ETL applications, ensuring seamless data flow.
Big Data Applications:
- Lead end-to-end Big Data projects, from design to deployment.
- Monitor and optimize Big Data systems for performance, reliability, and scalability.
AI/ML Applications:
- Hands-on experience developing and deploying AI/ML models, especially in Natural Language Processing (NLP), Computer Vision (CV), and Generative AI (GenAI).
- Collaborate with data scientists to productionize ML models and support ongoing model performance tuning.
DevOps and IaaS:
- Utilize DevOps tools for continuous integration and deployment (CI/CD).
- Design and maintain Infrastructure as a Service (IaaS), ensuring scalability, fault tolerance, and automation.
SysOps Responsibilities:
- Manage servers and network infrastructure to ensure system availability and security.
- Configure and maintain virtual machines and cloud-based system environments.
- Monitor system logs, alerts, and performance metrics.
- Install and update software packages and apply security patches.
- Troubleshoot network connectivity issues and resolve infrastructure problems.
- Implement and enforce security policies, protocols, and procedures.
- Conduct regular data backups and disaster recovery tests.
- Optimize systems for speed, efficiency, and reliability.
- Collaborate with IT and development teams to support the integration of new systems and applications.
Qualifications:
- Bachelor's degree in Computer Science, Statistics, or a related field.
- 5+ years of experience in cloud computing, data engineering, and related technologies.
- In-depth knowledge of AWS services and cloud architecture.
- Strong programming experience in Spark and Python.
- Proven track record in Big Data applications and pipelines.
- Applied experience in AI/ML models, particularly in the NLP, CV, and GenAI domains.
- Skilled in managing and administering ETL tools and workflows.
- Experience with DevOps pipelines, CI/CD tools, and cloud automation.
- Demonstrated experience with SysOps or cloud infrastructure/system operations.
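
As one concrete example of the serverless pieces listed above (not the employer's code), here is a hedged sketch of a Python Lambda handler behind an API Gateway proxy integration that reads an item from DynamoDB; the table name, environment variable, and key schema are assumptions.

```python
# Lambda handler for an API Gateway proxy route like GET /items/{id}.
import json
import os
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "items"))  # assumed env var

def handler(event, context):
    # The proxy integration passes path parameters in the event payload.
    item_id = (event.get("pathParameters") or {}).get("id")
    if not item_id:
        return {"statusCode": 400, "body": json.dumps({"error": "missing id"})}

    resp = table.get_item(Key={"id": item_id})
    item = resp.get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    return {"statusCode": 200, "body": json.dumps(item, default=str)}
```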

Posted 1 month ago

Apply

3.0 - 5.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Job Description: We are seeking a highly skilled and motivated Cloud Data Engineer with a strong background in computer science or statistics, coupled with at least 5 years of professional experience. The ideal candidate will possess a deep understanding of cloud computing, particularly in AWS, and should have a proven track record in data engineering, Big Data applications, and AI/ML applications.
Responsibilities:
Cloud Expertise:
- Proficient in AWS services such as EC2, VPC, Lambda, DynamoDB, API Gateway, EBS, S3, IAM, and more.
- Design, implement, and maintain scalable cloud-based solutions.
- Execute efficient and secure cloud infrastructure configurations.
Data Engineering:
- Develop, construct, test, and maintain architectures such as databases and processing systems.
- Utilize coding skills in Spark and Python for data processing and manipulation.
- Administer multiple ETL applications to ensure seamless data flow.
Big Data Applications:
- Work on end-to-end Big Data application projects, from conception to deployment.
- Optimize and troubleshoot Big Data solutions to ensure high performance.
AI/ML Applications:
- Experience in developing and deploying AI/ML applications based on NLP, CV, and GenAI.
- Collaborate with data scientists to implement machine learning models in production environments.
DevOps and Infrastructure as a Service (IaaS):
- Possess knowledge of and experience with DevOps applications for continuous integration and deployment.
- Set up and maintain infrastructure as a service, ensuring scalability and reliability.
Qualifications:
- Bachelor's degree in Computer Science, Statistics, or a related field.
- 5+ years of professional experience in cloud computing, data engineering, and related fields.
- Proven expertise in AWS services, with a focus on EC2, VPC, Lambda, DynamoDB, API Gateway, EBS, S3, IAM, etc.
- Proficient coding skills in Spark and Python for data processing.
- Hands-on experience with Big Data application projects.
- Experience in AI/ML applications, particularly in NLP, CV, and GenAI.
- Administration experience with multiple ETL applications.
- Knowledge of and experience with DevOps tools and processes.
- Ability to set up and maintain infrastructure as a service.
Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Ability to work effectively in a fast-paced and dynamic team environment.
- Proactive mindset with a commitment to continuous learning and improvement.

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Mangaluru

Work from Office

Job Summary: As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability.
Technical Skills:
- 3+ years of experience in the field of Data Warehousing and BI.
- Experience working with the Snowflake database.
- In-depth knowledge of Data Warehouse concepts.
- Experience in designing, developing, testing, and implementing ETL solutions using enterprise ETL tools.
- Experience with large or partitioned relational databases (Aurora/MySQL/DB2).
- Very strong SQL and data analysis capabilities.
- Familiarity with Billing and Payment data is a plus.
- Agile development (Scrum) experience.
- Other preferred experience includes working with DevOps practices, SaaS, IaaS, code management (CodeCommit, Git), deployment tools (CodeBuild, CodeDeploy, Jenkins, shell scripting), and Continuous Delivery.
- Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation.
Responsibilities:
- Be a key contributor to the design and development of a scalable and cost-effective cloud-based data platform based on a data lake design.
- Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties.
- Build automated pipelines and data services to validate, catalog, aggregate, and transform ingested data.
- Build automated data delivery pipelines and services to integrate data from the data lake into internal and external consuming applications and services.
- Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models.
- Maintain deep, current knowledge and skills in the latest cloud services, features, and best practices.
Who We Are: unifyCX is an emerging Global Business Process Outsourcing company with a strong presence in the U.S., Colombia, Dominican Republic, India, Jamaica, Honduras, and the Philippines. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide. In nearly two decades, unifyCX has grown from a small team to a global organization with staff members all over the world dedicated to supporting our international clientele.
At unifyCX, we leverage advanced AI technologies to elevate the customer experience (CX) and drive operational efficiency for our clients. Our commitment to innovation positions us as a trusted partner, enabling businesses across industries to meet the evolving demands of a global market with agility and precision.
unifyCX is a certified minority-owned business and an EOE employer who welcomes diversity.
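
To ground the Snowflake-plus-AWS work described above, here is a hedged sketch using the official snowflake-connector-python package to COPY Parquet files from an external stage into a table; the account, warehouse, stage, and table names are placeholders, not details from the posting.

```python
# Load curated Parquet from an external stage (pointing at the data lake) into Snowflake.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # placeholder account locator
    user="ETL_SERVICE",
    password="***",                 # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

cur = conn.cursor()
try:
    cur.execute(
        "COPY INTO RAW.ORDERS FROM @lake_stage/orders/ "
        "FILE_FORMAT = (TYPE = PARQUET) MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE"
    )
    cur.execute("SELECT COUNT(*) FROM RAW.ORDERS")
    print("rows loaded:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```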

Posted 1 month ago

Apply

3.0 - 7.0 years

8 - 12 Lacs

Mangaluru

Work from Office

Job Summary: As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability.
Technical Skills:
- 3+ years of experience in the field of Data Warehousing and BI.
- Experience working with the Snowflake database.
- In-depth knowledge of Data Warehouse concepts.
- Experience in designing, developing, testing, and implementing ETL solutions using enterprise ETL tools.
- Experience with large or partitioned relational databases (Aurora/MySQL/DB2).
- Very strong SQL and data analysis capabilities.
- Familiarity with Billing and Payment data is a plus.
- Agile development (Scrum) experience.
- Other preferred experience includes working with DevOps practices, SaaS, IaaS, code management (CodeCommit, Git), deployment tools (CodeBuild, CodeDeploy, Jenkins, shell scripting), and Continuous Delivery.
- Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation.
Responsibilities:
- Be a key contributor to the design and development of a scalable and cost-effective cloud-based data platform based on a data lake design.
- Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties.
- Build automated pipelines and data services to validate, catalog, aggregate, and transform ingested data.
- Build automated data delivery pipelines and services to integrate data from the data lake into internal and external consuming applications and services.
- Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models.
- Maintain deep, current knowledge and skills in the latest cloud services, features, and best practices.
Who We Are: unifyCX is an emerging Global Business Process Outsourcing company with a strong presence in the U.S., Colombia, Dominican Republic, India, Jamaica, Honduras, and the Philippines. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide. In nearly two decades, unifyCX has grown from a small team to a global organization with staff members all over the world dedicated to supporting our international clientele.
At unifyCX, we leverage advanced AI technologies to elevate the customer experience (CX) and drive operational efficiency for our clients. Our commitment to innovation positions us as a trusted partner, enabling businesses across industries to meet the evolving demands of a global market with agility and precision.
unifyCX is a certified minority-owned business and an EOE employer who welcomes diversity.

Posted 1 month ago

Apply

2.0 - 5.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Job Description:
Responsibilities:
- Design, implement, and manage cloud infrastructure using AWS services, including EC2, Lambda, API Gateway, Step Functions, EKS clusters, and Glue.
- Develop and maintain Infrastructure as Code (IaC) using Terraform to ensure consistent and reproducible deployments.
- Set up and optimize CI/CD pipelines using tools such as Azure Pipelines and AWS pipelines to automate software delivery processes.
- Containerize applications using Docker and orchestrate them with Kubernetes for efficient deployment and scaling.
- Write and maintain Python scripts to automate tasks, improve system efficiency, and integrate various tools and services.
- Develop shell scripts for system administration, automation, and troubleshooting.
- Implement and manage monitoring and logging solutions to ensure system health and performance.
- Collaborate with development teams to improve application deployment processes and reduce time-to-market.
- Ensure high availability, scalability, and security of cloud-based systems.
- Troubleshoot and resolve infrastructure and application issues in production environments.
- Implement and maintain backup and disaster recovery solutions.
- Stay up to date with emerging technologies and industry best practices in DevOps and cloud computing.
- Document processes, configurations, and system architectures for knowledge sharing and compliance purposes.
- Mentor junior team members and contribute to the overall growth of the DevOps practice within the organization.
Additional Information: At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity. Diversity, equity and inclusion (tietoevry.com).
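
As an example of the small Python automation scripts this role involves (not Tietoevry's code), the sketch below audits running EC2 instances for a required cost-allocation tag via boto3; the tag key and region are assumptions.

```python
# Audit running EC2 instances for a required tag; extend the pattern to other resources.
import boto3

REQUIRED_TAG = "CostCenter"  # assumed tag key

def untagged_instances(region: str = "eu-west-1") -> list:
    ec2 = boto3.client("ec2", region_name=region)
    missing = []
    paginator = ec2.get_paginator("describe_instances")
    pages = paginator.paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                if REQUIRED_TAG not in tags:
                    missing.append(instance["InstanceId"])
    return missing

if __name__ == "__main__":
    print("instances missing tag:", untagged_instances())
```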

Posted 1 month ago

Apply