Tactile Sensing
Advanced tactile sensing technology now goes beyond simple force detection — enabling robots to read touch, pressure, proximity, friction, and material variations, creating a new dimension of robotic perception. A robot hand that responds as sensitively as a human hand significantly expands precision capabilities in manufacturing, service, and medical robotics.
Multi-Modal Tactile Information
The multi-modal sensors mounted on robot fingers analyze force, pressure distribution (as a matrix), temperature, material, and proximity distance in real time. Rather than detecting mere pressure, the technology reproduces as data the tactile sensations a human feels at the fingertips.
Capabilities include:
• Detection of tiny pressure differences
• Surface texture recognition
• Precise manipulation based on sensor fusion
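A multi-modal reading can be pictured as one structured frame per sample. The sketch below is purely illustrative; the class, field names, and units are assumptions, not the vendor's actual data format or API.

```python
from dataclasses import dataclass

# Hypothetical container for one multi-modal tactile frame.
# Field names and units are illustrative assumptions.
@dataclass
class TactileFrame:
    normal_force_n: float                 # total normal force (N)
    pressure_matrix: list[list[float]]    # per-taxel pressure distribution
    temperature_c: float                  # contact-surface temperature (deg C)
    proximity_mm: float                   # distance to nearest object (mm)

    def peak_pressure(self) -> float:
        """Largest single reading in the pressure matrix."""
        return max(max(row) for row in self.pressure_matrix)

frame = TactileFrame(
    normal_force_n=1.2,
    pressure_matrix=[[0.1, 0.3], [0.5, 0.2]],
    temperature_c=24.5,
    proximity_mm=0.0,
)
print(frame.peak_pressure())  # -> 0.5
```

Bundling all modalities into one timestamped frame is what makes downstream sensor fusion straightforward.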
Ultra-High Sensitivity
Very small force changes are captured instantly. This sensitivity is essential for robots handling fragile objects, thin films, or tiny components.
Highlights:
• Detection of minute forces down to gram levels
• Minimal response delay
• Optimized for precision assembly and experimental robotics
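Gram-level sensitivity can be related to the 0.1 N normal-force resolution quoted in the spec table below. This small sketch (an assumption for illustration, not vendor code) converts a force change in grams to newtons and checks whether it clears that resolution.

```python
# Sketch: is a force change, expressed in grams, above the sensor's
# stated normal-force resolution (0.1 N per the spec table)?
G = 9.81  # standard gravity, m/s^2

def grams_to_newtons(grams: float) -> float:
    return grams / 1000.0 * G

def is_detectable(delta_grams: float, resolution_n: float = 0.1) -> bool:
    return grams_to_newtons(delta_grams) >= resolution_n

print(is_detectable(5))   # ~0.049 N, below the 0.1 N resolution -> False
print(is_detectable(15))  # ~0.147 N, above the resolution -> True
```

So a ~10 g change sits right at the detection threshold of the 0.1 N-resolution models.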
Proximity Perception
Non-contact detection (proximity sensing) allows the tactile sensor to detect an object’s presence about 1–2 cm before contact. This feature helps prevent collisions and enhances safety when humanoid robots interact with people.
Features:
• Non-contact object detection
• Pre-emptive avoidance control
• Precision approach and gentle touch completion
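One common use of pre-contact sensing is to ramp the approach speed down as the finger nears the object. The policy below is a minimal sketch with made-up speeds and a 20 mm sensing range; it is not the vendor's controller.

```python
# Sketch of a pre-contact slowdown policy: approach speed scales down
# linearly inside the proximity-sensing range (assumed 20 mm here),
# so contact is completed at a gentle creep speed. Numbers are illustrative.
def approach_speed(distance_mm: float,
                   max_speed: float = 50.0,      # mm/s outside sensing range
                   sense_range_mm: float = 20.0,
                   creep_speed: float = 2.0) -> float:
    if distance_mm >= sense_range_mm:
        return max_speed
    scaled = max_speed * distance_mm / sense_range_mm
    return max(scaled, creep_speed)  # never drop below a gentle creep

print(approach_speed(30.0))  # outside range -> 50.0
print(approach_speed(10.0))  # halfway in   -> 25.0
print(approach_speed(0.5))   # nearly touching -> 2.0 (creep)
```

Because the slowdown is driven by the tactile sensor itself, no external camera is needed for the final centimeter of the approach.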
Dynamic Tangential Force Sensing
The sensor can detect dynamic friction and shear forces. This enables robots to sense slipping, vibration, and torque in real-time and automatically adjust grip strength.
Benefits include:
• Slip prediction
• Automatic grip adjustment
• Increased safety in dynamic work environments
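A standard way to use tangential sensing for grip control is to watch the tangential-to-normal force ratio against the friction limit. The controller below is a simplified sketch; the friction coefficient, safety margin, and gain are illustrative assumptions, not vendor parameters.

```python
# Sketch of slip-aware grip control: if the tangential/normal force ratio
# approaches the assumed friction coefficient, command a stronger grip.
def adjust_grip(normal_n: float, tangential_n: float,
                mu: float = 0.5,      # assumed friction coefficient
                margin: float = 0.8,  # act before the true slip boundary
                gain: float = 1.25) -> float:
    """Return the commanded normal force for the next control step."""
    if normal_n <= 0:
        return 1.0                    # re-establish a minimal grip
    ratio = tangential_n / normal_n
    if ratio > mu * margin:           # nearing the slip boundary
        return normal_n * gain        # squeeze harder
    return normal_n                   # current grip is sufficient

print(adjust_grip(normal_n=4.0, tangential_n=1.0))  # ratio 0.25 -> keep 4.0
print(adjust_grip(normal_n=4.0, tangential_n=1.8))  # ratio 0.45 -> raise to 5.0
```

Running this in the sensor's control loop is what turns raw shear readings into the automatic grip adjustment described above.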
Sensor Models & Specifications
| Model | Sensing Type | Normal Force Range | Normal Force Resolution | Tangential Resolution | Accuracy | Proximity Sensing Distance |
|---|---|---|---|---|---|---|
| TS-F-A | 3D Force Sensing | 0–20 N | 0.1 N | 0.25 N | ±5 % FS | ≥ 1 cm |
| TS-F-A2 | Matrix Sensing | 0–50 N | 0.1 N | 0.25 N | ±5 % FS | ≥ 1 cm |
| TS-F-B | Multi-modal Sensing | 0–20 N | 0.1 N | 0.25 N | ±5 % FS | ≥ 1 cm |
| TS-F-C | Multi-modal (Nail) | 0–20 N | 0.1 N | 0.25 N | ±5 % FS | ≥ 2 cm |
| TS-E-A | 3D Force Sensing | 0–50 N | 0.1 N | 0.25 N | ±5 % FS | ≥ 1.5 cm |
| TS-E-B | Matrix Sensing | 0–50 N | 0.05 N | 0.25 N | ±5 % FS | ≥ 1.5 cm |
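For integration work it can help to have the table encoded as data. The snippet below copies the figures from the spec table into a lookup and adds a hypothetical helper for shortlisting models by required force range; the helper and its name are illustration, not a vendor tool.

```python
# Spec-table figures encoded as a lookup (values copied from the table above).
SPECS = {
    "TS-F-A":  {"range_n": 20, "resolution_n": 0.10, "proximity_cm": 1.0},
    "TS-F-A2": {"range_n": 50, "resolution_n": 0.10, "proximity_cm": 1.0},
    "TS-F-B":  {"range_n": 20, "resolution_n": 0.10, "proximity_cm": 1.0},
    "TS-F-C":  {"range_n": 20, "resolution_n": 0.10, "proximity_cm": 2.0},
    "TS-E-A":  {"range_n": 50, "resolution_n": 0.10, "proximity_cm": 1.5},
    "TS-E-B":  {"range_n": 50, "resolution_n": 0.05, "proximity_cm": 1.5},
}

def models_for(max_force_n: float) -> list:
    """Models whose normal-force range covers the required force."""
    return sorted(m for m, s in SPECS.items() if s["range_n"] >= max_force_n)

print(models_for(30.0))  # -> ['TS-E-A', 'TS-E-B', 'TS-F-A2']
```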
Vision-Touch Fusion Technology Platform
Introducing a new era of robotic intelligence where robots “see” and “feel” simultaneously. The Vision–Touch Fusion platform integrates camera-based vision information with AI tactile sensors, enabling robots to understand environments much like human perception. This technology dramatically enhances a robot’s ability to identify and manipulate objects — far beyond simple visual recognition or force sensing alone.
Key idea:
Vision + Tactile = ultra-precise object understanding. Camera vision determines object shape, position, and material, while tactile sensing simultaneously analyzes force, pressure, slip, texture, and contact state. Vision alone is vulnerable to lighting changes, shadows, reflections, and occlusion; Vision–Touch Fusion compensates for these limitations and maintains high reliability in any environment.
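One simple way to picture the fusion idea is a confidence-weighted combination: when vision confidence drops (glare, occlusion), the estimate leans on touch. The scalar "estimate" below stands in for a real state vector, and the weighting scheme is an assumed illustration, not the platform's actual algorithm.

```python
# Illustrative confidence-weighted fusion of a vision estimate and a
# tactile estimate. Real systems fuse full state vectors; a scalar is
# used here only to show the weighting principle.
def fuse(vision_est: float, vision_conf: float,
         touch_est: float, touch_conf: float) -> float:
    total = vision_conf + touch_conf
    if total == 0:
        raise ValueError("no usable sensor data")
    return (vision_est * vision_conf + touch_est * touch_conf) / total

# Good lighting: vision dominates the fused estimate (result near 10.5).
print(fuse(10.0, 0.9, 12.0, 0.3))
# Occluded view: tactile dominates (result near 11.8).
print(fuse(10.0, 0.1, 12.0, 0.9))
```

The key property is graceful degradation: as either modality loses confidence, the other takes over instead of the whole estimate failing.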
Simulation Support
Currently (as of November 2025), this is the only tactile sensor registered in NVIDIA Isaac Sim, supporting development, simulation, data generation, training, and digital twin operation for AI-based robots.