
The Penn Haptic Texture Toolkit

Current Researchers: Heather Culbertson, Juan Jose Lopez Delgado


The Penn Haptic Texture Toolkit (HaTT) is a publicly available repository of haptic texture models for use by the research community. The toolkit, available for download here, includes 100 haptic texture and friction models, the recorded data from which the models were made, images of the textures, and the code and methods necessary to render the textures using an impedance-type haptic interface such as a SensAble Phantom Omni. This toolkit was developed to provide haptics researchers with a method by which to compare and validate their texture modeling and rendering methods. Furthermore, the included rendering code allows both researchers and designers to incorporate our textures into their virtual environments, which can lead to a richer experience for the user.

Toolkit Contents

The 100 isotropic and homogeneous textures included in the toolkit are divided across ten material categories (paper, plastic, fabric, tile, carpet, foam, metal, stone, carbon fiber, and wood).

The toolkit includes two recorded data files for each texture. Each data file includes three axes of acceleration, force, and position data recorded for 10 seconds using the methods and hardware described on the Haptography page. The recorded acceleration signal from one of the data files was segmented and used to create a texture model set for each surface. During rendering of virtual textures, these model sets are used to generate synthetic texture vibrations that are dependent on the user's current normal force and scanning speed. The second data file was used to fit a Coulomb friction model for each texture.
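To illustrate the idea of generating synthetic texture vibrations from a fitted model, the sketch below shows an autoregressive (AR) filter excited by white noise. This is a simplified illustration, not the toolkit's code: in HaTT, the AR coefficients and noise variance are interpolated from the model set according to the user's current normal force and scanning speed, whereas here they are fixed, illustrative values (the class name `ARVibrationSynth` and all constants are assumptions).

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <random>
#include <vector>

// Sketch: an AR filter driven by white noise synthesizes a
// vibration (acceleration) signal, one sample per call.
class ARVibrationSynth {
public:
    // coeffs = AR coefficients a_1..a_p; noiseStd = excitation standard deviation.
    ARVibrationSynth(std::vector<double> coeffs, double noiseStd)
        : a_(std::move(coeffs)), hist_(a_.size(), 0.0),
          noise_(0.0, noiseStd), rng_(42) {}

    // Produce the next sample of synthetic texture vibration.
    double next() {
        double y = noise_(rng_);               // white-noise excitation
        for (std::size_t i = 0; i < a_.size(); ++i)
            y += a_[i] * hist_[i];             // AR feedback on past outputs
        if (!hist_.empty()) {
            // Shift history so the newest output is first.
            for (std::size_t i = hist_.size() - 1; i > 0; --i)
                hist_[i] = hist_[i - 1];
            hist_[0] = y;
        }
        return y;
    }

private:
    std::vector<double> a_;     // AR coefficients
    std::vector<double> hist_;  // past outputs y[n-1]..y[n-p]
    std::normal_distribution<double> noise_;
    std::mt19937 rng_;          // fixed seed for reproducibility
};
```

In a real renderer, `next()` would be called once per haptic servo tick, with the coefficients and noise variance updated as the user's force and speed change.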

Our original TexturePad system played the texture vibrations to the user through a voice-coil actuator. However, the open-source goal of this project motivated us to shift to rendering with a SensAble Phantom Omni because of its low cost and wide availability among haptics researchers. During rendering, three separate forces are calculated and displayed to the user: the normal force, the friction force, and the texture force. The friction force is calculated using the friction coefficient determined from the data. The texture force is calculated by scaling the generated texture vibrations by the effective mass of the handle and the user's hand.
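The three-force composition described above can be sketched as follows. This is a hedged illustration under simple assumptions (a penalty-based normal force and signum-style Coulomb friction); the function name, parameter names, and all default constants are illustrative, not the toolkit's actual values.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// The three force components displayed to the user each servo tick.
struct RenderedForces {
    double normal;    // N, spring force pushing the tool out of the surface
    double friction;  // N, Coulomb friction opposing tangential motion
    double texture;   // N, texture vibration scaled by effective mass
};

RenderedForces computeForces(double penetration,       // m, tool depth into surface
                             double tangentialSpeed,   // m/s, scanning speed
                             double vibrationAccel,    // m/s^2, synthetic texture vibration
                             double stiffness = 500.0, // N/m (illustrative)
                             double mu = 0.3,          // friction coefficient fit from data
                             double effectiveMass = 0.1) // kg, handle + hand (illustrative)
{
    RenderedForces f{};
    // Penalty-based normal force: proportional to penetration, never pulls inward.
    f.normal = std::max(0.0, stiffness * penetration);
    // Coulomb friction: magnitude mu * Fn, direction opposing tangential motion.
    double dir = (tangentialSpeed > 0.0) ? -1.0
               : (tangentialSpeed < 0.0) ?  1.0 : 0.0;
    f.friction = dir * mu * f.normal;
    // Texture force: vibration acceleration times the effective mass.
    f.texture = effectiveMass * vibrationAccel;
    return f;
}
```

In an actual impedance-type renderer these components would be summed along the appropriate surface normal and tangent directions before being commanded to the device.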

The rendering software included in the toolkit is based on OpenHaptics 3.0. The toolkit currently includes the files necessary to run the rendering on Linux and was tested on a computer running Ubuntu version 12.04 LTS. We are in the process of adding support for Windows.

Reprinted from Culbertson et al., 2014. Copyright IEEE, 2014.


H. Culbertson, J. J. Lopez Delgado, and K. J. Kuchenbecker. One hundred data-driven haptic texture models and open-source methods for rendering on 3D objects. In Proc. IEEE Haptics Symposium, February 2014.

H. Culbertson, J. J. Lopez Delgado, and K. J. Kuchenbecker. The Penn Haptic Texture Toolkit. Hands-on demonstration to be presented at IEEE Haptics Symposium, February 2014.

This material is based upon work supported by the National Science Foundation under Grant No. 0845670. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.