Tactile Signal Processing Engineer (Robotics) — Transduction & Features

Experience: 0 years

Salary: 0 Lacs

Posted: 1 day ago | Platform: LinkedIn


Work Mode: Remote

Job Type: Contractual

Job Description

We are a stealth robotics startup in Palo Alto hiring a remote engineer to own the signal transduction layer of our pipeline. This role converts modality-specific raw signals into a time-synchronized, reproducible feature stream that downstream modeling teams rely on across different hardware devices. We’re looking for a hands-on engineer who can make heterogeneous sensor signals—ranging from magnetic fields to optical deformation—feel comparable through principled estimation and feature extraction.

This role does not require training ML models—the focus is deterministic signal processing and feature extraction. You’ll work with messy, real-world tactile data from robot fingertips and skins (vision-on-elastomer images, magnetic arrays, pressure signals). We provide representative logs and expected outputs; your job is to make the extraction repeatable and fast.

What You’ll Do

  • Own the technical handoff (raw streams → canonical features): Convert noisy, modality-specific signals into a shared feature set. You focus on signal integrity + feature extraction; our Physicist and ML Architect handle constitutive modeling and high-level representations.
  • Build modality adapters: Map raw data (Hall arrays, camera pixels, pressure channels) into a unified feature set using geometry-based transforms, analytic transforms (when applicable), and CV methods like tracking/optical flow.
  • Synchronization & alignment: Resolve clock offsets and sampling jitter between disparate high-rate sources (e.g., 500 Hz+ sensors vs. 1000 Hz ground-truth rigs).
  • Health & quality detectors: Identify saturation, packet loss, and drift; set standardized bitmask flags (Dropout, Saturation, Overheat).
  • Confidence + validity gates: Attach confidence/quality measures using SNR, saturation checks, residuals, and consistency tests.
  • Profile + optimize: Keep the raw→feature path efficient; aim for <5–10 ms per frame on reference hardware.
  • Engineering rigor: Maintain golden logs so identical inputs yield repeatable outputs across runs/versions (with CI regression checks).
  • Own the feature schema lifecycle: Maintain the Stage-1 feature spec, versioning, and backward-compatible changes.
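To make the quality-flag and alignment bullets above concrete, here is a minimal sketch in the posting's own stack (NumPy). The function names, flag bit positions, ADC ceiling, and temperature limit are all illustrative assumptions, not the actual spec:

```python
import numpy as np

# Hypothetical bit positions for the standardized quality bitmask
# named in the posting (Dropout, Saturation, Overheat).
FLAG_DROPOUT    = 1 << 0
FLAG_SATURATION = 1 << 1
FLAG_OVERHEAT   = 1 << 2

def quality_flags(samples, adc_max=4095.0, temp_c=None, temp_limit_c=70.0):
    """Per-frame bitmask for a 1-D raw stream; missing packets arrive as NaN."""
    samples = np.asarray(samples, dtype=float)
    flags = np.zeros(len(samples), dtype=np.uint8)
    flags[np.isnan(samples)] |= FLAG_DROPOUT        # dropped packets
    flags[samples >= adc_max] |= FLAG_SATURATION    # clipped at full scale
    if temp_c is not None:
        flags[np.asarray(temp_c) > temp_limit_c] |= FLAG_OVERHEAT
    return flags

def align_to_reference(t_src, x_src, t_ref):
    """Resample a source stream (e.g., a 500 Hz sensor) onto a reference
    timebase (e.g., a 1000 Hz ground-truth rig) by linear interpolation,
    after clock offsets have been resolved."""
    return np.interp(t_ref, t_src, x_src)
```

In practice the saturation/overheat thresholds would come from per-device calibration, and dropout detection from sequence numbers rather than NaN sentinels; this only shows the bitmask-plus-resampling shape of the stage.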

Who / What You’ll Interface With

  • People: DSP/CV (you), Physicist (Stage-2 kernels), ML engineer/architect (downstream consumers), robotics/firmware support as needed.
  • Data & formats: High-rate sensor logs/timebases (e.g., ROS bags, NPZ, Parquet), plus calibration/ground-truth references when available.
  • Runtime: Linux, CPU-first; accelerate via vectorization/Numba/C++ as needed.
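As an example of the CPU-first, vectorized style the runtime bullet describes, here is a pure-NumPy per-frame feature in the spirit of the centroid/area fields mentioned under deliverables. The function name and threshold are illustrative:

```python
import numpy as np

def pressure_centroid_area(p, threshold=0.0):
    """Pressure-weighted contact centroid (row, col) and contact area
    (taxel count) from a 2-D pressure image, fully vectorized —
    no per-pixel Python loop."""
    p = np.asarray(p, dtype=float)
    mask = p > threshold
    area = int(mask.sum())
    if area == 0:
        return (np.nan, np.nan), 0          # no contact this frame
    w = p * mask                            # zero out sub-threshold taxels
    total = w.sum()
    ys, xs = np.mgrid[0:p.shape[0], 0:p.shape[1]]
    return ((ys * w).sum() / total, (xs * w).sum() / total), area
```

The same loop-free pattern extends to the other per-frame summaries; only when NumPy vectorization runs out of headroom would Numba or C++ enter, per the bullet above.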

Requirements

  • PhD in Electrical Engineering (Signal Processing), Computer Vision, Robotics Perception, Mechanical Engineering, or Applied Mathematics (or equivalent industry experience).
  • Computer vision fundamentals: OpenCV (or equivalent) for real-time feature extraction, tracking, or flow.
  • DSP & estimation: Filtering, noise characterization, state-space methods (Kalman-family) for drift/bias handling.
  • High-rate data handling: Latency accounting, resampling, time alignment.
  • Numerical coding: Expert Python (NumPy/SciPy); C++ for performance/embedded-ready inference is a plus.
  • Reproducibility mindset: Golden datasets, CI checks, and clear debugging playbooks.

Preferred Requirements

  • ROS/ROS2 logging timebases, hardware timestamps, contact-rich robotics logs.
  • IMU/timebase alignment, bias/drift estimation, multi-stream synchronization.
  • Shipping sensor adapters/drivers or SDK-like modules with stable interfaces.
  • Profiling on Linux, SIMD/Numba/C++ acceleration, lightweight deployment constraints.
  • Any tactile-specific exposure (vision-based tactile, magnetic arrays, contact sensors).

Key Deliverables

  • Modality adapter library converting raw modality signals to a shared feature interface.
  • Alignment utilities for time-sync, drift handling, and resampling.
  • Feature interface specification defining units/coordinate conventions and required fields (timestamped records, centroid/area, flow/deformation summaries, shear proxies, slip indicators, quality flags, confidence).
  • Deterministic test suite: CI-ready golden datasets validating repeatability/byte-stability across versions.
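One way to phrase the byte-stability requirement in the last deliverable is a digest over the exact bytes of the feature output; a recorded golden digest then becomes a CI regression check. A minimal sketch (function names are illustrative, not the actual suite):

```python
import hashlib
import numpy as np

def feature_digest(features):
    """SHA-256 over dtype, shape, and the raw bytes of a feature array.
    Any nondeterminism — thread ordering, fast-math flags, version
    drift — changes the digest."""
    arr = np.ascontiguousarray(features)
    h = hashlib.sha256()
    h.update(str(arr.dtype).encode())
    h.update(str(arr.shape).encode())
    h.update(arr.tobytes())
    return h.hexdigest()

def check_golden(features, golden_digest):
    """True iff this run reproduced the recorded golden output
    byte-for-byte."""
    return feature_digest(features) == golden_digest
```

Note this is stricter than `np.allclose`: even a 1e-12 perturbation fails the check, which is the point of "repeatable outputs across runs/versions".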


Contract-to-hire with a clear path to full-time and founding equity for the right fit.
