Tags: Open Source Community, Robotics, Tactile Sensing
slip_detection_dataset_2021
The dataset contains tactile data collected during autonomous robot pick-and-place tasks. Data were collected with an EZGripper mounted on a Universal Robots UR5 arm, using a uSkin tactile sensor that measures normal and shear forces at 180 Hz. The dataset includes grasp data for three objects (solder spool, brush, screwdriver), each placed in three poses with ten repetitions per pose, totaling 30 grasps per object.
Source: GitHub
Created: Mar 6, 2021
Updated: Dec 28, 2021
Signals: 125 views
Availability: Linked source ready
Overview
Dataset description and usage context
Dataset Overview
Dataset Name
slip_detection_dataset_2021
Data Collection Environment
The dataset contains tactile data collected during autonomous robot grasping and placing tasks.
Data Collection Equipment
- Robot Arm and Gripper: EZGripper mounted on a Universal Robots UR5 arm.
- Tactile Sensor: One finger of the EZGripper was replaced with a dedicated structure containing a uSkin tactile sensor.
- Vision Sensor: Kinect2 positioned above the workbench for autonomous grasp generation.
Tactile Sensor Characteristics
- Sensor Type: uSkin sensor.
- Number of Sensory Points: 18 points arranged in a 3×6 grid.
- Sampling Frequency: 180 Hz.
- Measurement Capability: Each point independently measures applied normal and shear forces.
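The sensor characteristics above imply a natural array layout for each tactile frame: 18 taxels in a 3×6 grid, each reporting three force components. The sketch below illustrates that layout; the array shapes, axis ordering, and the choice of z as the normal axis are assumptions for illustration, not documented properties of the dataset.

```python
import numpy as np

# Assumed layout of one uSkin tactile frame: 3x6 taxel grid, each
# taxel reporting three force components (two shear axes + normal).
SAMPLING_HZ = 180
ROWS, COLS, AXES = 3, 6, 3

frame = np.zeros((ROWS, COLS, AXES))  # one reading per taxel per axis

# Normal-force map across the grid (assuming z is the normal axis):
normal = frame[:, :, 2]

# At 180 Hz, one second of data is 180 frames:
one_second = np.zeros((SAMPLING_HZ, ROWS, COLS, AXES))
assert one_second.shape == (180, 3, 6, 3)
```

Slip-detection pipelines typically operate on short windows of such frames, so the time axis is usually kept as the leading dimension as shown here.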
Dataset Content
- Number of Objects: 3 (solder spool, brush, screwdriver).
- Object Poses: Each object placed in three different poses within a 50 cm × 50 cm workspace.
- Experiment Repetitions: Each pose repeated ten times, yielding 30 grasps per object.
Data Storage Format
Data are stored in HDF5 files, with one file per object.
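Since the data ship as one HDF5 file per object, a first step is to inspect a file's internal layout and then load the arrays of interest. The sketch below uses `h5py`; the dataset key `"tactile"` is a placeholder assumption, so run the inspection helper first to find the real group and dataset names.

```python
import h5py
import numpy as np

def list_contents(path):
    """Print every group/dataset in the HDF5 file with its shape (if any)."""
    with h5py.File(path, "r") as f:
        f.visititems(lambda name, obj: print(name, getattr(obj, "shape", "")))

def load_array(path, key="tactile"):
    """Load one dataset fully into memory as a NumPy array.

    The default key "tactile" is a guess; replace it with a name
    reported by list_contents() for the actual files.
    """
    with h5py.File(path, "r") as f:
        return np.asarray(f[key])
```

For large files, `h5py` also supports slicing a dataset directly (e.g. `f[key][:180]` for the first second at 180 Hz) without loading the whole array.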