Real-time conversion of an audio stream into haptics requires two steps:
- Attach an audio source to the `LofeltHaptics` instance.
- Play the audio source.
The audio that plays from the audio source is converted to haptic data and played on the device in real time.
```swift
func attachAudioSource(_ audioNode: AVAudioNode) throws
```
`attachAudioSource()` accepts audio sources of type `AVAudioNode`.

- `error`: An `NSErrorPointer` that indicates any failure that occurred while attempting to attach the audio source.
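Because `attachAudioSource()` can throw, production code would typically wrap it in `do`/`catch` rather than force-try. A minimal sketch, assuming a `LofeltHaptics` instance and an `AVAudioPlayerNode` already attached to an audio engine (the function and parameter names here are illustrative, not part of the SDK):

```swift
import AVFoundation

// Attaches the player node to the haptics engine, handling any
// attachment failure instead of crashing on a force-try.
func attachHapticsSource(haptics: LofeltHaptics, player: AVAudioPlayerNode) -> Bool {
    do {
        try haptics.attachAudioSource(player)
        return true
    } catch {
        // The thrown error describes why the audio source could not be attached.
        print("Failed to attach audio source: \(error)")
        return false
    }
}
```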
This example attaches an `AVAudioPlayerNode` to `LofeltHaptics` for real-time audio-to-haptics conversion.
```swift
// Set up haptics.
let haptics = try! LofeltHaptics.init()

// Load test audio.
let fileUrl = Bundle.main.url(forResource: "my-audio-file", withExtension: "wav")!
let audioFile = try! AVAudioFile.init(forReading: fileUrl)

// Create audio engine and player.
let audioEngine = AVAudioEngine()
let audioPlayer = AVAudioPlayerNode()

// Connect player to main mixer.
audioEngine.attach(audioPlayer)
let mainMixer = audioEngine.mainMixerNode
audioEngine.connect(audioPlayer, to: mainMixer, format: audioFile.processingFormat)

// Connect player to Lofelt haptics.
try! haptics.attachAudioSource(audioPlayer)

// Start audio engine and play audio.
try! audioEngine.start()
audioPlayer.scheduleFile(audioFile, at: nil, completionHandler: nil)
audioPlayer.play()
```
The Lofelt Studio download includes an example project, `LofeltHapticsExampleRealtime`, in the `sdk/examples/ios` folder that performs real-time audio-to-haptic conversion on iPhone. To run the example, open the project in Xcode, select your iPhone as the build target, and build. Press and release the button on the screen; you will hear the audio play while also feeling the haptic effects. Replace the audio file with another and you will feel different haptics as the new audio plays.