Lofelt has developed the Studio framework for iOS to simplify the integration of haptics into applications. The applications you create with this framework will work on iPhones that are compatible with Core Haptics.
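Because playback depends on Core Haptics support, it can be useful to check for it at run time. The sketch below uses Apple's own `CHHapticEngine.capabilitiesForHardware()` API (part of Core Haptics, not the Lofelt framework) to test whether the current device supports haptics:

```swift
import CoreHaptics

// Returns true on devices that support Core Haptics
// (and can therefore play haptics through the Lofelt Studio framework).
func deviceSupportsHaptics() -> Bool {
    CHHapticEngine.capabilitiesForHardware().supportsHaptics
}
```

An app might call this once at launch and fall back to audio-only feedback when it returns `false`.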
The framework provides two mechanisms for creating haptics:
- Playing a pre-authored haptic clip. This method is best when you want a haptic effect to match a sound file used in your app—for example, when you want the haptic effect of a door closing to play simultaneously with the sound of the door closing.
- Converting one of your application’s audio streams into haptics at run time. This method is used to create haptic effects in real time—for example, when you have streaming audio content from a server or when an application’s audio is synthesized at run time.
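As a rough sketch of the first mechanism, the snippet below loads a pre-authored clip from the app bundle and plays it. The initializer and the `load(from:)` and `play()` method names are assumptions for illustration only, not the confirmed Lofelt API; consult the framework's reference documentation for the exact calls.

```swift
import LofeltHaptics

// Hypothetical sketch: play a pre-authored haptic clip alongside its sound.
// Method names below are illustrative assumptions, not the confirmed API.
func playDoorCloseHaptic() throws {
    let haptics = try LofeltHaptics()                  // create the haptics player
    guard let clipURL = Bundle.main.url(forResource: "door-close",
                                        withExtension: "haptic") else {
        return                                         // clip missing from the bundle
    }
    let clipData = try Data(contentsOf: clipURL)       // read the authored clip
    try haptics.load(from: clipData)                   // hand the clip to the framework
    try haptics.play()                                 // start haptic playback
    // Trigger the matching door-close sound here (e.g. via AVAudioPlayer)
    // so the haptic and audio start together.
}
```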
Before jumping ahead to the sections on playing haptic clips or real-time audio-to-haptics conversion, please read the following sections on adding the correct framework to your project and initializing it.