We’re a stealth robotics startup in Palo Alto hiring an engineer to define and ship a canonical Tactile Tensor and the reference SDK + conformance suite that makes tactile data reproducible, interoperable, and directly usable for robotics perception and foundation-model training. Critical requirement: deterministic, byte-stable serialization + strict versioning, plus tokenization-ready interfaces (tensors → stable token streams) for Transformer-style robotics pipelines—without heavy dependencies.

What you’ll do
- Define the Tactile Tensor: units, coordinate frames, timestamps, shapes, uncertainty, required metadata, and forward/backward compatibility rules.
- Build a lightweight reference SDK (Python and/or C++) that validates, serializes/deserializes, and produces identical outputs across platforms.
- Specify training-grade data contracts: deterministic windowing/patching, normalization/quantization, and token schemas that are stable across sensors and logging setups.
- Ship a public-facing spec + examples + CI conformance tests so external robotics labs/OEMs can implement against it with confidence.
- Architect the tensor representation to ensure physical invariances (e.g., coordinate-frame independence, scale-invariant contact patches) so that policies trained on one robot's geometry generalize to another.

Requirements
- PhD in a relevant field (Robotics, Computer Science, Applied Mathematics, Electrical Engineering, or similar), or 3+ years of equivalent industry experience.
- Excellent software engineering fundamentals (API design, packaging, CI, testing, docs).
- Python and/or C++ proficiency (both ideal).
- Proven ability to design deterministic serialization and conformance tests (identical inputs → identical bytes across platforms; an illustrative sketch follows this posting).
- Experience with high-rate numeric data formats (Arrow/Parquet/Zarr/Protobuf/FlatBuffers or similar).
- Ability to design metadata + lineage for robotics datasets (device ID, calibration artifact ID, robot/config versions, provenance).
- Familiarity with ML data pipelines; ability to define tokenization/embedding conventions for transformer training without bundling full ML stacks.
- Experience designing data schemas that explicitly handle and flag physical sensor artifacts (saturation, dropout, thermal drift, and variable sampling rates) without crashing downstream model inference.

Preferred
- Experience authoring standards/specs, file formats, or widely-used SDKs.
- HPC/embedded/performance background; strong “minimal dependency” philosophy.
- Experience with data integrity/attestation (hashing/signing, provenance chains) for tamper-evident robotics logs.

Key Deliverables
- PDF Spec: Tactile Tensor schema, metadata/lineage rules, determinism + versioning/migration, conformance criteria.
- Reference SDK: lightweight schema objects, validators, deterministic serializer/deserializer, minimal dependencies.
- Dataset Container Spec: reproducible storage + examples (streaming + offline parity; robotics log friendly).
- ML Interfaces: modular tokenization hooks + reference tokenization recipes (windowing/patching + quantization conventions).
- CI Suite: golden files, byte-stability, backward/forward compatibility tests, reference implementations.

Contract-to-hire with a clear path to full-time and founding equity for the right fit.
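For candidates gauging what "deterministic, byte-stable serialization" means in practice, here is a minimal illustrative sketch in Python, not the actual Tactile Tensor schema: a canonical dtype and byte order, a fixed field order, length-prefixed sections, and a SHA-256 digest that a conformance test can pin as a golden value. Field names such as frame_id and timestamp_ns are placeholders.

# Illustrative only: deterministic, byte-stable serialization of one tactile frame
# plus a golden-digest check. Field names (frame_id, timestamp_ns, values) are
# placeholders, not the actual Tactile Tensor schema.
import hashlib
import json
import struct

import numpy as np


def serialize_record(frame_id: str, timestamp_ns: int, values: np.ndarray) -> bytes:
    """Serialize one frame with a fixed field order and an explicit byte layout."""
    # Canonical dtype, byte order, and memory layout: identical logical inputs
    # must yield identical bytes regardless of platform defaults.
    payload = np.ascontiguousarray(values, dtype="<f4").tobytes()
    # Canonical JSON header: sorted keys, no whitespace, explicit separators.
    header = json.dumps(
        {"frame_id": frame_id, "timestamp_ns": timestamp_ns,
         "dtype": "<f4", "shape": list(values.shape)},
        sort_keys=True, separators=(",", ":"),
    ).encode("utf-8")
    # Length-prefix each section (little-endian uint32) so parsing is unambiguous.
    return struct.pack("<I", len(header)) + header + struct.pack("<I", len(payload)) + payload


def golden_digest(blob: bytes) -> str:
    """Digest a conformance test would compare against a stored golden value."""
    return hashlib.sha256(blob).hexdigest()


if __name__ == "__main__":
    frame = np.arange(12, dtype=np.float32).reshape(3, 4)
    blob = serialize_record("sensor-0", 1_700_000_000_000_000_000, frame)
    print(golden_digest(blob))  # byte stability: this hash must not drift across platforms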
This is a startup Contract Role in Palo Alto, CA with opportunity for conversion to Full-Time + Equity.

Summary
We are looking for an engineer with a strong physics background to help us model the 'squishiness' of our soft tactile sensors. Our sensors use silicone elastomers to feel the world—much like a human fingertip—but that flexibility introduces complex behaviors like hysteresis and slip. You will work with real data from our automated test rig to build mathematical models that explain these behaviors, helping us turn raw deformation signals into precise, real-time sensing data.

Key Objectives
- Select and justify hyperelastic/viscoelastic models for elastomeric sensor layers under cyclic loading.
- Model hysteresis, creep, and stress relaxation, including strategies for numerical compensation.
- Define contact and friction models (normal force vs. indentation, stick–slip, velocity and load dependence).
- Incorporate temperature and rate dependence of stiffness and friction into the models.
- Propose signal transduction models mapping deformation fields to raw sensor outputs for common modalities (e.g., Hall-effect arrays, optical/vision-based tactile skins).
- Define parameter fitting procedures using force–displacement–time data (which parts of curves to use, cost functions, validation checks).
- Deliver computationally usable formulations plus a Python/Jupyter Notebook that demonstrates the models on sample data (an illustrative viscoelastic sketch follows this posting).

Main Deliverables
- Technical modeling specification covering:
  - Constitutive laws, friction/contact laws, and transduction relationships.
  - Assumptions, variable definitions, units, parameter ranges, and example fits.
- Python/Jupyter Notebook that:
  - Takes sample strain/deformation and temperature/rate histories as input.
  - Outputs predicted stress/force trajectories and example sensor signal trajectories.
  - Is numerically stable and easy to translate into production code.

Requirements
- PhD in a relevant field (Physics, Mechanical Engineering, Materials Science, or similar) or 3+ years of related industry experience.
- Strong background in viscoelasticity and hyperelasticity of elastomers.
- Experience with contact mechanics and tribology (friction, stick–slip).
- Comfortable producing closed-form/algorithmic models for real-time computation (Python/C++), not just FEA.
- Able to document models clearly and provide a clean, well-commented Python notebook.
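For illustration only, here is a minimal sketch of the kind of closed-form, real-time-friendly formulation this role would deliver: a one-term Prony series (standard linear solid) driven by a cyclic strain history, which reproduces a hysteresis loop. The parameter values and loading profile are placeholders, not fitted sensor data.

# Illustrative only: one-term Prony-series (standard linear solid) viscoelastic model
# under cyclic strain. Parameter values (e_inf, e_1, tau_1) are placeholders.
import numpy as np


def sls_stress(strain: np.ndarray, dt: float, e_inf: float, e_1: float, tau_1: float) -> np.ndarray:
    """sigma = E_inf*eps + E_1*h, with internal variable dh/dt = deps/dt - h/tau_1."""
    stress = np.zeros_like(strain)
    h = 0.0
    for k in range(1, len(strain)):
        deps = strain[k] - strain[k - 1]
        # Semi-implicit (backward-Euler) update keeps the relaxation term stable.
        h = (h + deps) / (1.0 + dt / tau_1)
        stress[k] = e_inf * strain[k] + e_1 * h
    return stress


if __name__ == "__main__":
    dt = 1e-3
    t = np.arange(0.0, 4.0, dt)
    strain = 0.05 * np.sin(2.0 * np.pi * 1.0 * t)      # 1 Hz cyclic loading
    stress = sls_stress(strain, dt, e_inf=0.3e6, e_1=0.2e6, tau_1=0.15)
    # Hysteresis shows up as nonzero dissipated energy per cycle (area of the loop).
    last = slice(len(t) - int(1.0 / dt), len(t))
    area = np.sum(0.5 * (stress[last][1:] + stress[last][:-1]) * np.diff(strain[last]))
    print(f"dissipated energy density per cycle ~ {area:.1f} J/m^3")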
We are a stealth robotics startup in Palo Alto hiring a remote engineer to own the signal transduction layer of our pipeline. This role converts modality-specific raw signals into a time-synchronized, reproducible feature stream that downstream modeling teams rely on across different hardware devices. We’re looking for a hands-on engineer who can make heterogeneous sensor signals—ranging from magnetic fields to optical deformation—feel comparable through principled estimation and feature extraction. This role does not require training ML models—the focus is deterministic signal processing and feature extraction. You’ll work with messy, real-world tactile data from robot fingertips and skins (vision-on-elastomer images, magnetic arrays, pressure signals). We provide representative logs and expected outputs; your job is to make the extraction repeatable and fast.

What You’ll Do
- Own the technical handoff (raw streams → canonical features): Convert noisy, modality-specific signals into a shared feature set. You focus on signal integrity + feature extraction; our Physicist and ML Architect handle constitutive modeling and high-level representations.
- Build modality adapters: Map raw data (Hall arrays, camera pixels, pressure channels) into a unified feature set using geometry-based transforms, analytic transforms (when applicable), and CV methods like tracking/optical flow.
- Synchronization & alignment: Resolve clock offsets and sampling jitter between disparate high-rate sources (e.g., 500Hz+ sensors vs. 1000Hz ground-truth rigs); an illustrative alignment sketch follows the Requirements below.
- Health & quality detectors: Identify saturation, packet loss, and drift; set standardized bitmask flags (Dropout, Saturation, Overheat).
- Confidence + validity gates: Attach confidence/quality measures using SNR, saturation checks, residuals, and consistency tests.
- Profile + optimize: Keep the raw→feature path efficient; aim for <5–10ms per frame on reference hardware.
- Engineering rigor: Maintain golden logs so identical inputs yield repeatable outputs across runs/versions (with CI regression checks).
- Own the feature schema lifecycle: Maintain the Stage-1 feature spec, versioning, and backward-compatible changes.

Who / What You’ll Interface With
- People: DSP/CV (you), Physicist (Stage-2 kernels), ML engineer/architect (downstream consumers), robotics/firmware support as needed.
- Data & formats: High-rate sensor logs/timebases (e.g., ROS bags, NPZ, Parquet), plus calibration/ground-truth references when available.
- Runtime: Linux, CPU-first; accelerate via vectorization/Numba/C++ as needed.

Requirements
- PhD in Electrical Engineering (Signal Processing), Computer Vision, Robotics Perception, Mechanical Engineering, or Applied Mathematics (or equivalent industry experience).
- Computer vision fundamentals: OpenCV (or equivalent) for real-time feature extraction, tracking, or flow.
- DSP & estimation: Filtering, noise characterization, state-space methods (Kalman-family) for drift/bias handling.
- High-rate data handling: Latency accounting, resampling, time alignment.
- Numerical coding: Expert Python (NumPy/SciPy); C++ for performance/embedded-ready inference is a plus.
- Reproducibility mindset: Golden datasets, CI checks, and clear debugging playbooks.
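To make the synchronization and resampling expectation concrete, here is the minimal sketch referenced above (NumPy only). It estimates a constant clock offset between two streams by cross-correlating them on a shared grid, then resamples the sensor stream onto the reference timebase; the stream names, rates, and constant-offset assumption are illustrative, not part of our actual pipeline.

# Illustrative only: constant clock-offset estimation + resampling between two
# timestamped streams (e.g., a ~500 Hz sensor vs. a 1000 Hz ground-truth rig).
import numpy as np


def estimate_offset_s(t_ref, x_ref, t_sen, x_sen, fs_common=2000.0):
    """Offset (s) to add to sensor timestamps so shared events line up with the
    reference timebase, found by cross-correlation on a common uniform grid."""
    t0, t1 = max(t_ref[0], t_sen[0]), min(t_ref[-1], t_sen[-1])
    grid = np.arange(t0, t1, 1.0 / fs_common)
    a = np.interp(grid, t_ref, x_ref)
    b = np.interp(grid, t_sen, x_sen)
    a = (a - a.mean()) / (a.std() + 1e-12)   # normalize so amplitude differences
    b = (b - b.mean()) / (b.std() + 1e-12)   # do not bias the correlation peak
    lag = np.argmax(np.correlate(a, b, mode="full")) - (len(b) - 1)
    return lag / fs_common


def align_to_reference(t_ref, t_sen, x_sen, offset_s):
    """Shift the sensor timebase by the estimated offset and resample onto t_ref."""
    return np.interp(t_ref, t_sen + offset_s, x_sen)


if __name__ == "__main__":
    def chirp(t):                                  # non-periodic test signal
        return np.sin(2.0 * np.pi * (3.0 * t + 4.0 * t ** 2))

    true_offset = 0.0123                           # sensor clock runs 12.3 ms behind
    t_ref = np.arange(0.0, 2.0, 1.0 / 1000.0)      # 1000 Hz ground-truth rig
    t_sen = np.arange(0.0, 2.0, 1.0 / 500.0)       # 500 Hz tactile sensor
    x_ref = chirp(t_ref)
    x_sen = chirp(t_sen + true_offset)             # same events, earlier timestamps
    est = estimate_offset_s(t_ref, x_ref, t_sen, x_sen)
    aligned = align_to_reference(t_ref, t_sen, x_sen, est)
    print(f"estimated offset: {est * 1e3:.1f} ms (true {true_offset * 1e3:.1f} ms)")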
Preferred Requirements
- ROS/ROS2 logging timebases, hardware timestamps, contact-rich robotics logs.
- IMU/timebase alignment, bias/drift estimation, multi-stream synchronization.
- Shipping sensor adapters/drivers or SDK-like modules with stable interfaces.
- Profiling on Linux, SIMD/Numba/C++ acceleration, lightweight deployment constraints.
- Any tactile-specific exposure (vision-based tactile, magnetic arrays, contact sensors).

Key Deliverables
- Modality adapter library converting raw modality signals to a shared feature interface.
- Alignment utilities for time-sync, drift handling, and resampling.
- Feature interface specification defining units/coordinate conventions and required fields (timestamped records, centroid/area, flow/deformation summaries, shear proxies, slip indicators, quality flags, confidence).
- Deterministic test suite: CI-ready golden datasets validating repeatability/byte-stability across versions (an illustrative golden-check sketch follows this posting).

Contract-to-hire with a clear path to full-time and founding equity for the right fit.
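For illustration only, here is a minimal sketch of one golden-dataset check such a deterministic test suite could include. The tests/golden layout, the .npz/.json pairing, and the extract_features hook are assumptions made for the example, not an existing interface.

# Illustrative only: a CI golden-digest check for byte stability. The tests/golden
# layout, the .npz/.json pairing, and the extract_features hook are hypothetical.
import hashlib
import json
from pathlib import Path

import numpy as np

GOLDEN_DIR = Path("tests/golden")


def digest_features(features: np.ndarray) -> str:
    """Hash a canonical little-endian float32 byte image of the feature array."""
    return hashlib.sha256(np.ascontiguousarray(features, dtype="<f4").tobytes()).hexdigest()


def check_byte_stability(extract_features) -> None:
    """Re-run extraction on each logged input and require the stored digest, byte for byte."""
    for case in sorted(GOLDEN_DIR.glob("*.npz")):
        raw = np.load(case)["raw"]
        expected = json.loads(case.with_suffix(".json").read_text())["sha256"]
        actual = digest_features(extract_features(raw))
        assert actual == expected, f"{case.name}: {actual} != {expected}"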