Overview

We're looking for a Director of Engineering to lead the engineering team at browserless.io. Core responsibilities are running day-to-day engineering operations, participating in meetings and planning, building the team and managing individual team members, and helping unblock the team. We're all multi-sport athletes and value individuals who can contribute as well as lead.

Who We Are

Browserless.io is a self-funded web automation company founded in 2017. We're building the next generation of web automation, supporting developers and other integrations by making the "hard parts" of web automation simpler. We're non-VC-backed, profitable, and a remote-first team.

About You

We're looking for someone who still loves writing code and can help deliver great products and services at browserless.io, as well as facilitate check-ins across the remote team. You'd be a great fit if you're self-motivated, can rally folks behind a cause, and can actively help deliver software toward that goal. We deeply value a bias toward action and taking ownership of problems. Here's a little bit about the role:

Responsibilities

- Own and be responsible for the quality, deliverability, and timeliness of the engineering team's work.
- Run check-ins, help the team stay focused, and run retrospectives and other scrum events.
- Understand how the platform works, how releases work, and be proactive in improving all aspects of the system.
- Manage software releases, changelogs, etc.
- Help out with individual contributions where possible (we expect 30-50% of your time spent on code, decreasing over time as you learn the system well enough to effectively review PRs).
- Manage performance, hiring, and building a high-performing engineering culture.
- Manage well in an async culture: we have core overlap hours of 8am-10am PST, four days a week (half the team is based in North America and half in Asia).

Required Skills

- Excellent Node.js and TypeScript skills.
- Has worked in browser automation at some level (testing, scraping, Selenium, etc.).
- Familiarity with other popular languages like Python is helpful but not essential.
- Understanding of how containerized technologies like Docker work, and the ability to use them effectively.
- Experience with cloud services like AWS, Google Cloud, or DigitalOcean.
- Prior experience running engineering teams would be a bonus.
Job Title: Java Developer
Experience: 3+ Years
Duration: 4+ Months (extendable based on performance)
Client: Hyperhire
Working Hours: Korean time zone
Employment Type: Contractual

Job Description

We are looking for an experienced Java Developer with 3+ years of hands-on experience to join a project for a Korean client. This is a contractual role with an initial duration of 4 months, extendable based on performance.

Key Responsibilities

- Develop and maintain Java-based applications.
- Collaborate with cross-functional teams in a remote setting.
- Ensure code quality and timely delivery.
- Work according to Korean business hours.

Requirements

- 3+ years of Java development experience.
- Strong understanding of Java, Spring/Spring Boot, and related frameworks.
- Good communication skills and ability to work in a remote team.
- Availability to work in the Korean time zone.

Interview Process

1. Technical assignment
2. Technical discussion
3. Hiring manager round
Client: https://energylifeglobal.com/
Role: Remote (initial contract of 3+ months, extendable to an ongoing engagement)
Working hours: 40 hrs/week, Monday to Friday
Immediate joiners preferred.

Key Responsibilities

1. Data Ingestion and Processing Pipeline: Build and maintain an automated pipeline to ingest and process raw satellite data from sources like the Registry of Open Data on AWS, storing the results in a data lake built on Amazon S3.
2. AI Model Training & Integration Framework: Develop a framework in Python to automate the machine learning lifecycle, including the training, validation, versioning, and storage of models for satellite image classification (segmentation) and thermal risk prediction (using libraries like XGBoost). You will also integrate the UTCI (Universal Thermal Climate Index) calculation logic using the pythermalcomfort library.
3. Real-time Prediction API: Design and build a low-latency REST API using AWS Lambda and API Gateway to serve real-time predictions from the AI models.
4. Frontend Core Feature Implementation: Implement the core features of the interactive dashboard using React and Mapbox GL JS, focusing on the intuitive visualisation of complex geospatial heatmap data.
5. Cloud Infrastructure: Design and construct the entire cloud infrastructure using Infrastructure as Code (IaC) principles with the AWS CDK, ensuring the platform is secure, cost-effective, and scalable.

Required Qualifications

1. A minimum of 3 years of professional software development experience.
2. Deep expertise in either Python-based backend development or React-based frontend development.
3. Essential, in-depth experience designing, building, and deploying services on Amazon Web Services (AWS), with a strong command of Lambda, S3, API Gateway, and IAM.
4. A thorough understanding of REST principles and proven experience in designing and building APIs from scratch.
5. Proficiency in containerising applications using Docker.
6. Demonstrated ability to independently solve complex, open-ended technical challenges.
7. Strong communication skills to articulate technical decisions and collaborate with the team.

Preferred Qualifications

1. Experience building complex, interactive mapping and data visualisation services with Mapbox GL JS or similar libraries.
2. Experience with GIS data processing using libraries like GeoPandas and Rasterio, or familiarity with Google Earth Engine.
3. Experience with ML model serving (MLOps) using frameworks like XGBoost and Scikit-learn.
4. Proven experience with Infrastructure as Code (IaC) tools such as AWS CDK or Terraform.
5. Previous experience in an early-stage start-up, demonstrating the ability to build a product from version 0 to 1.
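For candidates gauging the scope of the real-time prediction API (responsibility 3), a Lambda-plus-API-Gateway endpoint typically reduces to a handler like the one below. This is a minimal sketch, not the client's actual code: the function and field names (predict_thermal_risk, the "features" key) are illustrative assumptions, and the model call is stubbed out where an XGBoost predictor loaded at cold start would sit.

```python
import json


def predict_thermal_risk(features: dict) -> dict:
    """Placeholder for the real model call (e.g. an XGBoost model
    loaded from S3 at cold start). Returns a stubbed prediction."""
    return {"risk_score": 0.0, "utci_category": "no thermal stress"}


def lambda_handler(event, context):
    """Entry point for API Gateway (Lambda proxy integration).

    Expects a JSON request body like {"features": {...}} and returns
    the standard proxy-integration response shape:
    {"statusCode": ..., "body": <JSON string>}.
    """
    try:
        body = json.loads(event.get("body") or "{}")
        features = body["features"]
    except (json.JSONDecodeError, KeyError):
        return {
            "statusCode": 400,
            "body": json.dumps({"error": "expected JSON body with 'features'"}),
        }

    prediction = predict_thermal_risk(features)
    return {"statusCode": 200, "body": json.dumps(prediction)}
```

Because the handler is a plain function taking the proxy-integration event dict, it can be exercised locally without AWS, e.g. `lambda_handler({"body": json.dumps({"features": {"tdb": 30, "rh": 55}})}, None)`.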
Overview

Client: Confidential. We're looking for a product-focused, well-rounded Software Engineer who loves building and shipping impactful features end-to-end. This is a highly hands-on role where you'll write production code, interact with customers, contribute to API documentation, and own full feature lifecycles across our platform. We're looking for someone who thrives on real ownership, cares deeply about product quality, and enjoys thinking about the entire customer journey, from API ergonomics to reliability at scale.

Who We Are

We're a profitable, self-funded browser infrastructure platform company founded in 2017. We power the automation needs of thousands of developers through:

- Our core Browser-as-a-Service (BaaS) platform: scalable, resilient infrastructure for headless Chrome automation, scraping, testing, PDF/screenshot generation, and more.
- BrowserQL (BQL): our declarative GraphQL-based automation language that simplifies complex browser interactions, adds smart waiting, stealth techniques, and human-like behavior, and reduces boilerplate for developers.
- REST APIs: easily scrape content, create PDFs/screenshots, and more.

We are remote-first, profitable, and intentional about building a sustainable, meaningful engineering culture.

About the Role

As a Software Engineer, you'll play a key role in evolving our BaaS platform, expanding BrowserQL's capabilities, and creating the underlying primitives for next-generation autonomous agents. You will:

- Own features end-to-end: design, build, test, document, release.
- Work directly with customers and power users to understand real-world automation and scraping workflows.
- Improve and extend BrowserQL, designing intuitive developer-facing APIs and behaviors.
- Push forward our roadmap to:
  - Advance our scraping product, including performance, reliability, anti-bot tooling, and DX improvements.
  - Build deeper web agent infrastructure, enabling safe, robust, and autonomous browser-based actions.
  - Strengthen and scale our BaaS core, including browser lifecycle management, distributed orchestration, and session handling.
- Contribute to product discussions, roadmap planning, design reviews, and architecture conversations.

This role is ideal for engineers who enjoy both building polished user-facing features and diving deep into distributed systems, scraping pipelines, and browser internals.

Responsibilities

- Develop features across our BaaS platform, BrowserQL API, REST APIs, AI integrations, and agent-related infrastructure.
- Collaborate directly with customers to understand problems and gather actionable feedback.
- Write and maintain API documentation, examples, guides, and release notes.
- Improve performance, reliability, and developer experience across our scraping and automation stack.
- Participate in code reviews, technical discussions, and team rituals.
- Debug complex issues across distributed systems, browsers, automation frameworks, and cloud environments.
- Maintain strong engineering standards in a remote-first environment.

Required Skills

- Excellent Node.js and TypeScript experience.
- Familiarity with Docker and containerized environments.
- Experience with cloud platforms such as AWS, Google Cloud, or DigitalOcean.
- Product-oriented mindset: comfortable talking with customers, owning problems, and iterating quickly.
- Ability to work independently with strong communication in a remote environment.
- 4+ years of full-time programming experience.

Nice to Have

- Strong background in browser automation (Playwright, Puppeteer, or related tools).
- Experience with large-scale scraping systems, defensive/stealth techniques, or anti-bot measures.
- Experience with distributed systems or high-throughput orchestration.
- Experience building with Terraform.
- Experience with developer tooling, CLIs, SDKs, or API design.
- Experience with Python or multi-language environments.
- Interest in building infrastructure for autonomous web agents.

Remote Work & Availability

This is a fully remote role. We ask for 2–3 hours of synchronous overlap with PST mornings, 4 days per week, to support team collaboration, planning, and real-time discussion.