There are two ways to generate haptic effects with Lofelt Studio:
- Pre-rendering: You author small haptic files, known as haptic clips, derived from audio files. With the Studio framework for iOS, you then assign these clips to events in your software for playback.
- Real time: The Studio framework for iOS converts the audio from the application into haptics while the software is running.
“Authoring haptics” refers to the first method: you use the Studio desktop app and the companion Studio mobile app to craft and audition haptic effects, which are stored as haptic clips for use in your application. The rest of this section explains how to use the Studio desktop app and the Studio mobile app to generate these haptic clips.
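To make the pre-rendering workflow concrete, here is a minimal Swift sketch of how an authored clip might be wired to an app event. The class name `LofeltHaptics`, the methods `load(from:)` and `play()`, and the `.haptic` file extension are assumptions made for illustration; consult the Studio framework for iOS reference for the actual API.

```swift
import LofeltHaptics  // Studio framework for iOS (module name assumed)

/// Hypothetical sketch: load a pre-authored haptic clip once,
/// then play it whenever the associated app event fires.
class ExplosionEffect {
    private let haptics = try? LofeltHaptics()

    func prepare() {
        // "explosion.haptic" stands in for a clip authored in the
        // Studio desktop app and bundled with the application.
        guard let url = Bundle.main.url(forResource: "explosion",
                                        withExtension: "haptic"),
              let clip = try? String(contentsOf: url) else { return }
        try? haptics?.load(from: clip)  // method name assumed
    }

    // Call from your event handler, e.g. when a collision occurs.
    func trigger() {
        try? haptics?.play()  // method name assumed
    }
}
```

The point of the sketch is the division of labor: the clip's content is fixed at authoring time, and the app only decides *when* to play it.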
If you want to know more about the real-time method for generating haptics, see the section “Convert an Audio Stream to Haptics in Real Time.”
If you subscribe to the Starter tier of Lofelt Studio, see the section “Online Authoring” to learn how to author haptic clips with the Studio web app.