Integrating Haptics
Lofelt Studio contains a growing number of components for integrating haptics into applications. At the time of this writing, the release version of Studio contains a framework for iOS that allows haptics to be added to iPhone applications. The current beta version also includes an Android library for building Android applications, as well as a haptics plug-in for the Unity game engine. Over time, we will release additional frameworks, libraries, and plug-ins for programming haptics on other platforms, engines, and devices.
The main workflow for integrating haptics is:
- Add your audio samples and haptic clips to your project.
- Use the functions of the libraries, frameworks, or plug-ins to play the haptics at the appropriate times (see the sketch after this list).
- Build and deploy to the target devices.
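To make this workflow concrete, here is a minimal Swift sketch of the second step. Note that `HapticClipPlayer`, `loadClip(named:)`, and `play()` are hypothetical placeholders that only model the pattern of loading a bundled clip and triggering it on an app event; they are not the Studio framework API, which is documented in the platform-specific sections below.

```swift
import Foundation

// Illustrative only: this protocol is a hypothetical stand-in for the
// platform-specific player described in the sections below. It models the
// workflow (load a clip, play it at the right time) and is not the actual
// Studio framework API.
protocol HapticClipPlayer {
    /// Load a haptic clip that was added to the project alongside its audio sample.
    func loadClip(named name: String) throws

    /// Play the previously loaded clip.
    func play()
}

final class FeedbackController {
    private let player: HapticClipPlayer

    init(player: HapticClipPlayer) {
        self.player = player
    }

    // The clip name is assumed to match a haptic clip bundled with the app.
    func prepare() throws {
        try player.loadClip(named: "Explosion")
    }

    // Trigger the haptic at the moment the corresponding game or UI event occurs.
    func onExplosion() {
        player.play()
    }
}
```

The same pattern applies on Android and in Unity: load the clip once, then call the play function at the moment the matching event fires.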
This part of the documentation is split into three sections, each detailing a specific way of integrating haptics into applications:
- If you are writing your own iPhone application, please read the section on Integrating Haptics using the Studio framework for iOS.
- If you are writing your own Android application, please read the section on Integrating Haptics using the Studio library for Android.
- If you are creating a game for iOS or Android phones using the Unity game engine, please read the section on Integrating Haptics using the Studio plug-in for Unity.