Manual audio mode improvements #758

Merged: 8 commits, Aug 25, 2025
1 change: 1 addition & 0 deletions .changes/fix-manual-audio-remote-audio
@@ -0,0 +1 @@
patch type="fixed" "Remote audio buffer when using manual rendering mode"
2 changes: 1 addition & 1 deletion LiveKitClient.podspec
Original file line number Diff line number Diff line change
@@ -14,7 +14,7 @@ Pod::Spec.new do |spec|

spec.source_files = "Sources/**/*"

spec.dependency("LiveKitWebRTC", "= 137.7151.03")
spec.dependency("LiveKitWebRTC", "= 137.7151.04")
spec.dependency("SwiftProtobuf")
spec.dependency("Logging", "= 1.5.4")
spec.dependency("DequeModule", "= 1.1.4")
2 changes: 1 addition & 1 deletion Package.swift
@@ -19,7 +19,7 @@ let package = Package(
],
dependencies: [
// LK-Prefixed Dynamic WebRTC XCFramework
.package(url: "https://github.com/livekit/webrtc-xcframework.git", exact: "137.7151.03"),
.package(url: "https://github.com/livekit/webrtc-xcframework.git", exact: "137.7151.04"),
.package(url: "https://github.com/apple/swift-protobuf.git", from: "1.29.0"),
.package(url: "https://github.com/apple/swift-log.git", from: "1.6.2"),
.package(url: "https://github.com/apple/swift-collections.git", from: "1.1.0"),
2 changes: 1 addition & 1 deletion [email protected]
@@ -20,7 +20,7 @@ let package = Package(
],
dependencies: [
// LK-Prefixed Dynamic WebRTC XCFramework
.package(url: "https://github.com/livekit/webrtc-xcframework.git", exact: "137.7151.03"),
.package(url: "https://github.com/livekit/webrtc-xcframework.git", exact: "137.7151.04"),
.package(url: "https://github.com/apple/swift-protobuf.git", from: "1.29.0"),
.package(url: "https://github.com/apple/swift-log.git", from: "1.6.2"),
.package(url: "https://github.com/apple/swift-collections.git", from: "1.1.0"),
4 changes: 2 additions & 2 deletions Sources/LiveKit/Audio/Manager/AudioManager.swift
@@ -101,10 +101,10 @@
// Keep this var within State so it's protected by UnfairLock
public var localTracksCount: Int = 0
public var remoteTracksCount: Int = 0
public var customConfigureFunc: ConfigureAudioSessionFunc?

public var sessionConfiguration: AudioSessionConfiguration?

public var trackState: TrackState {

switch (localTracksCount > 0, remoteTracksCount > 0) {
case (true, false): .localOnly
case (false, true): .remoteOnly
@@ -118,13 +118,13 @@
// MARK: - AudioProcessingModule

private lazy var capturePostProcessingDelegateAdapter: AudioCustomProcessingDelegateAdapter = {
let adapter = AudioCustomProcessingDelegateAdapter()
let adapter = AudioCustomProcessingDelegateAdapter(label: "capturePost")
RTC.audioProcessingModule.capturePostProcessingDelegate = adapter
return adapter
}()

private lazy var renderPreProcessingDelegateAdapter: AudioCustomProcessingDelegateAdapter = {
let adapter = AudioCustomProcessingDelegateAdapter()
let adapter = AudioCustomProcessingDelegateAdapter(label: "renderPre")
RTC.audioProcessingModule.renderPreProcessingDelegate = adapter
return adapter
}()
6 changes: 2 additions & 4 deletions Sources/LiveKit/Audio/MixerEngineObserver.swift
@@ -188,10 +188,8 @@ extension MixerEngineObserver {

guard let converter else { return }

converter.convert(from: inputBuffer)
// Copy the converted segment from buffer and schedule it.
let segment = converter.outputBuffer.copySegment()
appNode.scheduleBuffer(segment)
let buffer = converter.convert(from: inputBuffer)
appNode.scheduleBuffer(buffer)

if !appNode.isPlaying {
appNode.play()
7 changes: 5 additions & 2 deletions Sources/LiveKit/Protocols/AudioCustomProcessingDelegate.swift
@@ -46,6 +46,7 @@ public protocol AudioCustomProcessingDelegate: Sendable {
class AudioCustomProcessingDelegateAdapter: MulticastDelegate<AudioRenderer>, @unchecked Sendable, LKRTCAudioCustomProcessingDelegate {
// MARK: - Public

let label: String
var target: AudioCustomProcessingDelegate? { _state.target }

// MARK: - Private
@@ -60,8 +61,10 @@ class AudioCustomProcessingDelegateAdapter: MulticastDelegate<AudioRenderer>, @u
_state.mutate { $0.target = target }
}

init() {
super.init(label: "AudioCustomProcessingDelegateAdapter")
init(label: String) {
self.label = label
super.init(label: "AudioCustomProcessingDelegateAdapter.\(label)")
log("label: \(label)")
}

// MARK: - AudioCustomProcessingDelegate
9 changes: 6 additions & 3 deletions Sources/LiveKit/Support/Audio/AudioConverter.swift
@@ -17,10 +17,11 @@
@preconcurrency import AVFAudio

final class AudioConverter: Sendable {
let converter: AVAudioConverter
let inputFormat: AVAudioFormat
let outputFormat: AVAudioFormat
let outputBuffer: AVAudioPCMBuffer

private let converter: AVAudioConverter
private let outputBuffer: AVAudioPCMBuffer

/// Computes required frame capacity for output buffer.
static func frameCapacity(from inputFormat: AVAudioFormat, to outputFormat: AVAudioFormat, inputFrameCount: AVAudioFrameCount) -> AVAudioFrameCount {
@@ -43,7 +44,7 @@ final class AudioConverter: Sendable {
self.outputFormat = outputFormat
}

func convert(from inputBuffer: AVAudioPCMBuffer) {
func convert(from inputBuffer: AVAudioPCMBuffer) -> AVAudioPCMBuffer {
var error: NSError?
#if swift(>=6.0)
// Won't be accessed concurrently, marking as nonisolated(unsafe) to avoid Atomics.
@@ -61,5 +62,7 @@
bufferFilled = true
return inputBuffer
}

return outputBuffer.copySegment()
}
}
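With this change, `convert(from:)` copies the converted segment internally and returns it, so call sites no longer reach into `outputBuffer` (now `private`). A minimal caller-side sketch of the new API shape; the format pair and frame counts are illustrative assumptions, not values from this PR:

```swift
import AVFAudio

// Hypothetical caller: convert(from:) now returns a ready-to-use
// AVAudioPCMBuffer copy instead of mutating a shared outputBuffer.
let inputFormat = AVAudioFormat(standardFormatWithSampleRate: 48000, channels: 1)!
let outputFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)!

if let converter = AudioConverter(from: inputFormat, to: outputFormat) {
    let input = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: 480)!
    input.frameLength = 480 // silence here; real code fills in samples

    // Before this PR, callers did converter.convert(from: input) and then
    // converter.outputBuffer.copySegment() at every call site.
    let output = converter.convert(from: input)
    // output is an independent copy, safe to schedule on a player node.
    _ = output.frameLength
}
```

Returning the copy from `convert(from:)` keeps the copy-before-schedule step in one place, which is why the `MixerEngineObserver` and `AudioMixRecorder` call sites above each shrink by two lines.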
6 changes: 2 additions & 4 deletions Sources/LiveKit/Support/AudioMixRecorder.swift
@@ -307,10 +307,8 @@ public class AudioMixRecorderSource: Loggable, AudioRenderer, @unchecked Sendabl
}

if let converter {
converter.convert(from: pcmBuffer)
// Copy the converted segment from buffer and schedule it.
let segment = converter.outputBuffer.copySegment()
playerNode.scheduleBuffer(segment, completionHandler: nil)
let buffer = converter.convert(from: pcmBuffer)
playerNode.scheduleBuffer(buffer, completionHandler: nil)
play()
}
}
74 changes: 74 additions & 0 deletions Sources/LiveKit/Support/AudioPlayerRenderer.swift
@@ -0,0 +1,74 @@
/*
* Copyright 2025 LiveKit
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

import AVFAudio

// A helper class that plays out audio; conforms to the AudioRenderer protocol.
public class AudioPlayerRenderer: AudioRenderer, Loggable, @unchecked Sendable {
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()

var oldConverter: AudioConverter?
var outputFormat: AVAudioFormat?

public init() {
engine.attach(playerNode)
}

public func start() async throws {
log("Starting audio engine...")

let format = engine.outputNode.outputFormat(forBus: 0)
outputFormat = format

engine.connect(playerNode, to: engine.mainMixerNode, format: format)

try engine.start()
log("Audio engine started")
}

public func stop() {
log("Stopping audio engine...")

playerNode.stop()
engine.stop()

log("Audio engine stopped")
}

public func render(pcmBuffer: AVAudioPCMBuffer) {
guard let outputFormat, let engine = playerNode.engine, engine.isRunning else { return }

// Create or update the converter if needed
let converter = (oldConverter?.inputFormat == pcmBuffer.format)
? oldConverter
: {
log("Creating converter with format: \(pcmBuffer.format)")
let newConverter = AudioConverter(from: pcmBuffer.format, to: outputFormat)!
self.oldConverter = newConverter
return newConverter
}()

guard let converter else { return }

let buffer = converter.convert(from: pcmBuffer)
playerNode.scheduleBuffer(buffer, completionHandler: nil)

if !playerNode.isPlaying {
playerNode.play()
}
}
}
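The new `AudioPlayerRenderer` can be attached wherever an `AudioRenderer` is accepted, e.g. to play out a remote track while the SDK's audio device runs in manual rendering mode. A hedged usage sketch; `remoteTrack` and the `add(audioRenderer:)`/`remove(audioRenderer:)` attachment points are assumed from the SDK's existing renderer API, not defined in this PR:

```swift
import AVFAudio

// Sketch: manual play-out of a remote audio track via the new renderer.
// Assumes `remoteTrack` is an audio track obtained from a connected Room.
let renderer = AudioPlayerRenderer()
try await renderer.start()               // connects playerNode, starts AVAudioEngine

remoteTrack.add(audioRenderer: renderer) // render(pcmBuffer:) now receives frames

// ... later, when play-out is no longer needed:
remoteTrack.remove(audioRenderer: renderer)
renderer.stop()
```

Note that `render(pcmBuffer:)` lazily rebuilds its `AudioConverter` whenever the incoming buffer format changes, so the renderer tolerates mid-stream format switches without restarting the engine.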
6 changes: 3 additions & 3 deletions Tests/LiveKitTests/AudioConverterTests.swift
@@ -52,9 +52,9 @@ class AudioConverterTests: LKTestCase {
while inputFile.framePosition < inputFile.length {
let framesToRead: UInt32 = min(readFrameCapacity, UInt32(inputFile.length - inputFile.framePosition))
try inputFile.read(into: inputBuffer, frameCount: framesToRead)
converter.convert(from: inputBuffer)
print("Converted \(framesToRead) frames from \(inputFormat.sampleRate) to \(outputFormat.sampleRate), outputFrames: \(converter.outputBuffer.frameLength)")
try outputFile?.write(from: converter.outputBuffer)
let buffer = converter.convert(from: inputBuffer)
print("Converted \(framesToRead) frames from \(inputFormat.sampleRate) to \(outputFormat.sampleRate), outputFrames: \(buffer.frameLength)")
try outputFile?.write(from: buffer)
}

// Close file