Definition

Tactile sensing is the measurement of physical contact properties — force, pressure distribution, texture, temperature, vibration, and slip — at the surfaces where a robot interacts with objects. Just as human fingertips provide the rich contact feedback essential for dexterous manipulation, tactile sensors give robots the ability to detect how firmly they are gripping an object, whether it is slipping, what material it is made of, and how contact forces are distributed across the grasp surface.

Vision alone cannot provide this information. A camera can see that a gripper is touching a glass, but it cannot tell whether the glass is about to slip, whether the grip force is sufficient, or whether the surface is wet. Tactile sensing fills this critical perception gap, and it is increasingly recognized as essential for moving beyond pick-and-place toward truly dexterous manipulation: in-hand reorientation, delicate assembly, fabric handling, and food preparation.

Sensor Types

  • Vision-based (optical) tactile sensors — A deformable elastomer surface sits over an embedded camera. When the sensor contacts an object, the gel deforms, and the camera captures high-resolution images of the deformation pattern. These images encode contact geometry (shape of the contact patch), force distribution, and surface texture. GelSight and DIGIT are the most prominent examples. Vision-based sensors provide the richest data (tactile "images" at 320×240 or higher resolution) but are bulky and require onboard image processing.
  • Resistive sensors — Piezoresistive materials change electrical resistance under pressure. Simple, thin, and inexpensive. Used in pressure-sensitive arrays for robotic skins. Limited spatial resolution and susceptible to drift over time, but excellent for detecting contact/no-contact and approximate force magnitude.
  • Capacitive sensors — Measure changes in capacitance caused by deformation of a dielectric layer between conductive plates. Higher precision than resistive sensors, good for measuring normal and shear forces. Used in commercial sensor arrays like those from Pressure Profile Systems.
  • Piezoelectric sensors — Generate voltage in response to dynamic pressure changes. Excellent for detecting vibration, texture, and slip events (which produce high-frequency force transients). Less suitable for static force measurement since the signal decays. Used for slip detection and texture classification.
  • Barometric (air pressure) sensors — Sealed air chambers with pressure sensors detect deformation when the surface is pressed. Simple, robust, and inexpensive. Used in soft robotic grippers where simplicity and compliance are priorities.
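To make the resistive case concrete, the sketch below converts a raw ADC reading from a single piezoresistive element into an approximate force. The voltage-divider wiring, resistor value, and linear conductance-force model are illustrative assumptions (a common first-order model for force-sensitive resistors), not the specification of any particular sensor:

```python
def adc_to_force(adc_counts, r_ref=10_000.0, v_ref=3.3, adc_max=1023, a=1e-5):
    """Estimate force (N) from one ADC reading of a piezoresistive element.

    Assumes the element sits above a reference resistor r_ref in a voltage
    divider, and that conductance grows roughly linearly with force
    (G = a * F). All constants here are illustrative.
    """
    v_out = adc_counts / adc_max * v_ref            # measured divider voltage
    if v_out <= 0.0 or v_out >= v_ref:
        return 0.0                                  # no contact or saturated
    r_sensor = r_ref * (v_ref - v_out) / v_out      # solve the divider equation
    conductance = 1.0 / r_sensor
    return conductance / a                          # invert G = a * F
```

A mid-scale reading (around 512 counts on a 10-bit ADC) maps to roughly 10 N under these example constants; a real deployment would replace the linear model with a per-sensor calibration curve.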

Key Products

  • GelSight (MIT / GelSight Inc.) — The original vision-based tactile sensor. Uses a reflective coating on the gel surface and controlled illumination to reconstruct 3D surface geometry from camera images. Provides both geometry and force estimation. Multiple commercial versions available (GelSight Mini, GelSight Wedge).
  • DIGIT (Meta AI) — Open-source vision-based tactile sensor designed for research. Compact form factor suitable for multi-finger grippers. Produces RGB tactile images at 60 fps. The most widely used tactile sensor in the robot learning research community.
  • Paxini GEN3 — Commercial tactile sensor array with high spatial resolution. Capacitive sensing technology with flexible form factors. Designed for integration into robotic grippers and prosthetic hands.
  • Contactile — Australian sensor company producing 3-axis force sensors with integrated slip detection. Uses papillae-inspired sensor elements that detect both normal and shear forces, mimicking the structure of human fingertip mechanoreceptors.
  • BioTac (SynTouch) — Multimodal fingertip sensor combining impedance-based pressure sensing, vibration detection, and temperature sensing in a single package. Designed to replicate the sensory capabilities of the human fingertip. Used extensively in dexterous manipulation research.

Applications

Grasp stability and slip detection: The most immediately practical application. Tactile sensors detect the onset of slip (object movement relative to the finger surface) in milliseconds, enabling reactive grip force adjustment. This allows robots to handle fragile objects (eggs, fruit, glassware) by gripping with the minimum force needed — tight enough to hold, gentle enough not to crush.
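A minimal version of this reactive loop can be sketched as follows, assuming the sensor reports normal and shear force at the fingertip. The friction coefficient, detection threshold, safety margin, and force limits are all illustrative:

```python
# Reactive grip controller sketch (all thresholds illustrative).
# Incipient slip is flagged when the shear/normal force ratio approaches the
# friction limit; the grip force command is then raised with a safety margin.

def adjust_grip(f_normal, f_shear, grip_cmd, mu=0.5, margin=1.2,
                f_min=0.2, f_max=20.0):
    """Return an updated grip force command in newtons."""
    if f_normal <= 0.0:
        return f_min                       # lost contact: re-grip lightly
    if f_shear / f_normal > 0.8 * mu:      # nearing the edge of the friction cone
        grip_cmd = f_shear / mu * margin   # normal force needed, plus margin
    return min(max(grip_cmd, f_min), f_max)
```

Running this at the sensor's native rate keeps the applied force near the minimum needed to prevent slip, which is exactly the behavior required for fragile objects.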

Dexterous in-hand manipulation: Rotating, flipping, or repositioning objects within the hand requires continuous contact feedback. Without tactile sensing, the robot cannot detect whether the object is moving as intended or has shifted unexpectedly. Multi-fingered hands (Allegro, LEAP) increasingly integrate tactile sensors for in-hand manipulation research.

Texture and material recognition: Tactile sensors can classify materials (metal, plastic, fabric, wood) by analyzing the contact signature — pressure distribution, vibration spectrum during sliding, and thermal conductivity. This enables robots to adjust manipulation strategies based on object properties without prior knowledge.
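The vibration-spectrum part of this pipeline can be sketched with an FFT and a nearest-centroid classifier. The band edges, sample rate, and class centroids below are illustrative assumptions, not values from any published system:

```python
import numpy as np

# Texture features from a sliding-contact vibration signal: normalized band
# energies of the power spectrum, classified by nearest centroid.

def band_energies(signal, fs=1000.0, bands=((5, 50), (50, 150), (150, 400))):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    feats = [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]
    total = sum(feats) or 1.0
    return np.array(feats) / total          # normalized band-energy features

def classify(features, centroids):
    """Return the label of the nearest centroid (Euclidean distance)."""
    return min(centroids, key=lambda k: np.linalg.norm(features - centroids[k]))
```

In practice the centroids would be fitted from labeled sliding contacts per material, and thermal and pressure features would be appended to the same feature vector.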

Contact-rich assembly: Insertion tasks (peg-in-hole, connector mating, snap-fit assembly) require detecting and responding to contact forces in real time. Force/torque sensors at the wrist provide aggregate force, but tactile sensors at the fingertips provide localized contact information essential for alignment and error correction.

Integration with Policy Learning

Integrating tactile data into learned manipulation policies is an active and rapidly advancing research area. Several approaches exist:

Tactile image as additional observation: For vision-based sensors like DIGIT and GelSight, the tactile image is treated as an additional camera view and processed through a CNN encoder alongside the wrist and overhead camera images. The tactile features are concatenated with visual features before being fed to the policy network (ACT, Diffusion Policy).
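The encoder-per-view pattern can be sketched in PyTorch as below. The layer sizes, feature dimensions, and action dimension are illustrative placeholders, not the configuration of ACT or Diffusion Policy:

```python
import torch
import torch.nn as nn

# Each camera view (including the tactile image) gets its own small CNN
# encoder; the features are concatenated before the policy head.

def make_encoder(out_dim=128):
    return nn.Sequential(
        nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
        nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, out_dim),
    )

class MultiViewPolicy(nn.Module):
    def __init__(self, n_views=3, feat=128, action_dim=7):
        super().__init__()
        self.encoders = nn.ModuleList(make_encoder(feat) for _ in range(n_views))
        self.head = nn.Sequential(nn.Linear(n_views * feat, 256), nn.ReLU(),
                                  nn.Linear(256, action_dim))

    def forward(self, views):                  # views: list of (B, 3, H, W)
        feats = [enc(v) for enc, v in zip(self.encoders, views)]
        return self.head(torch.cat(feats, dim=-1))

# Wrist camera, overhead camera, and the tactile image as the third "view":
policy = MultiViewPolicy()
views = [torch.zeros(2, 3, 64, 64) for _ in range(3)]
actions = policy(views)                        # shape (2, 7)
```

Because the tactile stream enters through the same interface as a camera, an existing vision-only pipeline needs only one extra encoder and a wider policy head.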

Tactile embeddings: Pre-trained tactile encoders (trained on tactile classification or reconstruction tasks) convert raw sensor data into compact feature vectors. These embeddings capture contact-relevant information in a lower-dimensional representation suitable for policy input.

Simulation of touch: Training tactile policies in simulation requires tactile simulators (TACTO for DIGIT, Taxim for GelSight) that render synthetic tactile images from physics simulation contact data. The sim-to-real gap for tactile sensing is significant due to the difficulty of modeling gel deformation and optical properties accurately.
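As a conceptual stand-in for what these simulators produce, the sketch below renders a synthetic gel-indentation depth map from contact points using a Gaussian-bump model. This is only a toy illustration; TACTO and Taxim render full RGB tactile images with calibrated optics and far more faithful deformation models:

```python
import numpy as np

# Toy synthetic "tactile image": a depth map of gel indentation built from
# (row, col, depth) contact points with Gaussian falloff. Illustrative only.

def render_tactile_depth(contacts, shape=(160, 120), radius_px=12.0):
    """contacts: iterable of (row, col, depth_mm) contact points."""
    rows, cols = np.mgrid[0:shape[0], 0:shape[1]]
    img = np.zeros(shape)
    for r, c, depth in contacts:
        bump = depth * np.exp(-((rows - r) ** 2 + (cols - c) ** 2)
                              / (2 * radius_px ** 2))
        img = np.maximum(img, bump)    # gel takes the deepest local indentation
    return img

depth_map = render_tactile_depth([(80, 60, 0.5), (40, 30, 0.2)])
```

The gap between such simplified renderings and real sensor output is precisely the sim-to-real gap noted above.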

Multi-modal fusion: The most effective policies fuse visual, proprioceptive, and tactile observations. Published studies report that adding tactile input improves success rates by 10-30% on contact-rich tasks compared to vision-only policies, with the largest gains on tasks involving deformable objects, thin objects, and in-hand manipulation.

Practical Requirements

Mounting and integration: Tactile sensors must be mechanically integrated into the gripper or hand. Vision-based sensors (DIGIT, GelSight) require mounting space for the camera and gel, adding 10-20mm to fingertip dimensions. Thin-film sensors (resistive, capacitive) can be adhered to existing finger surfaces with minimal form factor change.

Data rate and latency: Slip detection requires sensing at 100+ Hz to catch the onset of slip before the object moves significantly. Vision-based sensors typically operate at 30-60 fps, which is marginal for slip detection. Piezoelectric sensors can capture vibrations at 1000+ Hz.
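A back-of-envelope latency budget makes the comparison concrete. In the worst case a slip event begins just after a sample is taken, so the sensing delay is one full sample period plus processing time; the 5 ms processing figure below is an illustrative assumption:

```python
# Worst-case slip-detection latency: one full sample period plus processing.

def worst_case_latency_ms(rate_hz, processing_ms=5.0):
    return 1000.0 / rate_hz + processing_ms

camera_based = worst_case_latency_ms(60)     # ~21.7 ms: marginal for slip
piezo = worst_case_latency_ms(1000)          # 6.0 ms: comfortably in budget
```

Since slip transients play out over a few milliseconds, the 60 fps figure leaves little room for the grip controller to react before the object has already moved.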

Durability: Tactile sensors experience repeated mechanical contact and must withstand thousands of grasp cycles. Gel-based sensors wear over time and require periodic replacement (typically every 2,000-10,000 grasps). Robust encapsulation and replaceable sensor tips improve practical lifespan.

Calibration: Converting raw sensor outputs to calibrated force values requires per-sensor calibration. For research, relative measurements are often sufficient; for industrial applications, traceable force calibration is needed.
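For the simplest case, per-sensor calibration reduces to fitting a linear map from raw output to reference force readings (for example, from a calibrated load cell). The data below is synthetic and the linear model is an assumption; many sensors need a higher-order or per-taxel fit:

```python
import numpy as np

# Least-squares calibration: fit force = gain * counts + offset from paired
# raw readings and reference forces. Synthetic data for illustration.

raw = np.array([120.0, 340.0, 560.0, 790.0, 1010.0])   # sensor counts
force = np.array([0.5, 1.5, 2.5, 3.5, 4.5])            # reference newtons

gain, offset = np.polyfit(raw, force, 1)

def counts_to_newtons(counts):
    return gain * counts + offset
```

For research use, the fitted relative scale is usually enough; industrial deployments would repeat this against a traceable force reference and store the coefficients per sensor.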

Tactile Sensing at SVRC

SVRC integrates tactile sensing across its manipulation research fleet at both the Mountain View and Allston facilities:

  • Paxini GEN3 integration — SVRC maintains Paxini GEN3 tactile sensor arrays mounted on parallel-jaw and multi-finger grippers. These capacitive arrays provide high-resolution contact maps at 100+ Hz, suitable for slip detection and grasp force optimization. Available on OpenArm 101 and DK1 platforms.
  • DIGIT-equipped grippers — Vision-based DIGIT sensors on ALOHA bimanual systems capture tactile images at 60 fps. Our data platform records synchronized tactile images alongside wrist camera, overhead camera, and proprioceptive streams in LeRobot-compatible formats.
  • Multi-modal policy training — SVRC's data collection pipelines include tactile channels as standard observation modalities. Teams training ACT or Diffusion Policy can incorporate tactile input by adding the tactile image as an additional camera view, with pre-trained tactile encoders available for faster convergence.
  • Sensor selection consulting — Our engineering team helps you choose the right sensor technology for your application: vision-based for research richness, capacitive for industrial durability, piezoelectric for high-speed slip detection. We handle mounting, calibration, and ROS2 driver integration.

See Also

  • Hardware Catalog — Paxini GEN3 sensors, DIGIT sensors, and gripper accessories
  • Data Services — Multi-modal data collection including tactile streams
  • Data Platform — Synchronized storage of tactile, visual, and proprioceptive data
  • Robot Leasing — Access tactile-equipped robot cells for research
  • Repair and Maintenance — Gel tip replacement and sensor recalibration services

Key Papers

  • Yuan, W. et al. (2017). "GelSight: High-Resolution Robot Tactile Sensors for Estimating Geometry and Force." Sensors. The foundational paper on vision-based tactile sensing using GelSight technology.
  • Lambeta, M. et al. (2020). "DIGIT: A Novel Design for a Low-Cost Compact High-Resolution Tactile Sensor with Application to In-Hand Manipulation." IEEE RA-L. Introduces the DIGIT sensor that became the standard for tactile research.
  • Qi, H. et al. (2023). "General In-Hand Object Rotation with Vision and Touch." CoRL 2023. Demonstrates the critical role of tactile sensing for dexterous in-hand manipulation with learned policies.
  • Wang, S. et al. (2022). "TACTO: A Fast, Flexible, and Open-source Simulator for High-Resolution Vision-Based Tactile Sensors." IEEE RA-L. The standard simulator for DIGIT and GelSight sensors in policy training.

Related Terms

  • Force-Torque Sensing — Wrist-mounted sensing that complements fingertip tactile data
  • Grasp Planning — Plans grasps that tactile sensing validates and refines during execution
  • Diffusion Policy — Can incorporate tactile observations as additional input modalities
  • Sim-to-Real Transfer — Tactile sim-to-real requires specialized tactile simulators
  • Point Cloud — Provides geometric context that tactile sensing enriches with contact information

Apply This at SVRC

Silicon Valley Robotics Center integrates tactile sensors across its manipulation research fleet. We offer DIGIT and GelSight-equipped grippers for data collection, tactile-aware policy training pipelines, and consultation on sensor selection for your specific application. Our data platform records synchronized tactile, visual, and proprioceptive data streams for multi-modal policy training.
