There are two ways to generate haptic effects with Lofelt Studio:
- Pre-rendering: Author small haptic files, known as haptic clips, derived from audio files. With the Studio frameworks, libraries, and game development plug-ins, you assign the clips to events in your application for playback.
- Real time: The Studio framework for iOS can automatically convert audio from the application into haptics while the application is running.
“Authoring haptics” refers to the first method. You use the Studio desktop app and the companion Studio mobile app to craft and audition haptic effects, which are stored as haptic clips for use in your application. The rest of this section focuses on using these two apps to generate haptic clips.
To learn more about the real-time method for generating haptics on iOS, see the section “Convert an Audio Stream to Haptics in Real Time.”
If you haven’t yet purchased a license or subscription to Lofelt Studio, see the section “Online Authoring” to learn how to author haptic clips with the Studio web app and test them with the Studio mobile app.