WebRTC plugin for Flutter Mobile/Desktop/Web
Sponsored with 💖 by
Enterprise Grade APIs for Feeds, Chat, & Video. Try the Flutter Video tutorial 💬
LiveKit - Open source WebRTC and realtime AI infrastructure
| Feature | Android | iOS | Web | macOS | Windows | Linux | Embedded | Fuchsia |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Audio/Video | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | |
| Data Channel | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | |
| Screen Capture | ✔️ | ✔️(*) | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | |
| Unified-Plan | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | |
| Simulcast | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | |
| MediaRecorder | ✔️ | | | | | | | |
| End to End Encryption | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | |
| Insertable Streams | | | | | | | | |
The `RTCConfiguration` object allows you to specify various parameters for how a peer connection should be established and managed. One key setting for adapting to network conditions is `degradationPreference`.
This setting, of type `RTCDegradationPreference`, hints to the WebRTC engine how to handle situations where network quality degrades and available bandwidth is limited. It helps balance between maintaining frame rate and maintaining resolution for video streams.
The possible values for `RTCDegradationPreference` are:
- `RTCDegradationPreference.disabled`: Video quality will not be intentionally degraded. The WebRTC engine may still adapt, but it won't prioritize frame rate or resolution based on this hint. This can lead to choppy video if bandwidth is insufficient.
- `RTCDegradationPreference.maintainFramerate`: Prioritizes keeping the frame rate smooth. If bandwidth is limited, the resolution will be reduced to maintain the frame rate. Use this if motion smoothness is critical.
- `RTCDegradationPreference.maintainResolution`: Prioritizes keeping the video resolution clear. If bandwidth is limited, the frame rate will be reduced to maintain resolution. Use this if image clarity is more important than smooth motion.
- `RTCDegradationPreference.balanced` (default in native WebRTC, though the default may vary by platform if not explicitly set): Attempts to strike a balance between frame rate and resolution. The WebRTC engine will make trade-offs based on its internal heuristics.
Example Usage:
To set the `degradationPreference`, include it in your `RTCConfiguration` map when creating a peer connection:
```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

// ...

Map<String, dynamic> configuration = {
  'iceServers': [
    {'urls': 'stun:stun.l.google.com:19302'},
  ],
  'sdpSemantics': 'unified-plan', // Or 'plan-b'
  // Add degradationPreference
  'degradationPreference': RTCDegradationPreference.maintainFramerate.toString().split('.').last,
};

// When creating a peer connection:
// RTCPeerConnection pc = await createPeerConnection(configuration, offerSdpConstraints);

// Or if you are using the RTCConfiguration object directly (recommended):
RTCConfiguration rtcConfig = RTCConfiguration(
  iceServers: [RTCIceServer(urls: ['stun:stun.l.google.com:19302'])],
  sdpSemantics: SDPSemantics.UnifiedPlan, // Assuming an enum SDPSemantics exists, or use a string
  degradationPreference: RTCDegradationPreference.maintainFramerate,
);

// The toMap() method of RTCConfiguration handles the correct string conversion for native platforms.
// Map<String, dynamic> configurationMap = rtcConfig.toMap();
// RTCPeerConnection pc = await createPeerConnection(configurationMap);
```
Setting `degradationPreference` can be particularly useful in mobile applications, or in any scenario where network conditions can vary significantly, helping to provide a better user experience during video calls.
`hardwareAcceleration`: `bool?`
- This field in `RTCConfiguration` serves as a hint for enabling or disabling hardware-accelerated video encoding/decoding.
- Default behavior: WebRTC typically attempts to use hardware acceleration by default where available. Setting this to `true` (the default if null on native) aligns with that.
- Disabling: Setting it to `false` suggests a preference for software codecs (see the sketch after this list).
- Note: Actual enforcement depends on platform capabilities and the underlying WebRTC implementation. The native layer in this plugin attempts to honor the hint by selecting software-only video factories if it is set to `false` at the time the PeerConnectionFactory is initialized. Since the factory is often initialized once globally, this hint is typically a global preference for the application's lifetime or until the factory is re-initialized. Changing it in `RTCConfiguration` for an already created factory might not change the active encoder/decoder factories for that specific peer connection.
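Because the hint only takes effect when the native PeerConnectionFactory is first created, it should be supplied with the first peer connection your app opens. A minimal sketch, assuming the `RTCConfiguration`/`toMap()` API described above; the helper name is illustrative:

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

/// Sketch: request software-only video codecs for the app's lifetime by
/// passing the hint with the first peer connection that is created.
Future<RTCPeerConnection> createSoftwareOnlyConnection() async {
  final config = RTCConfiguration(
    iceServers: [RTCIceServer(urls: ['stun:stun.l.google.com:19302'])],
    // Hint: prefer software encoders/decoders. Only effective if the
    // native PeerConnectionFactory has not been initialized yet.
    hardwareAcceleration: false,
  );
  return await createPeerConnection(config.toMap());
}
```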
`allowedIceCandidateTypes`: `List<RTCIceCandidateType>?`
- Allows you to specify which types of ICE candidates should be collected and used. If set, candidates not matching these types will be filtered out at the native level before being sent to the Dart application.
- `RTCIceCandidateType` enum values: `host`, `srflx` (server reflexive), `prflx` (peer reflexive), `relay` (TURN relay).
- Example: `allowedIceCandidateTypes: [RTCIceCandidateType.host, RTCIceCandidateType.relay]`
`allowedIceProtocols`: `List<RTCIceProtocol>?`
- Allows you to filter ICE candidates by their transport protocol.
- `RTCIceProtocol` enum values: `udp`, `tcp`.
- Example: `allowedIceProtocols: [RTCIceProtocol.udp]`
`iceGatheringTimeoutSeconds`: `int?`
- Specifies a timeout in seconds for the ICE gathering process.
- If ICE gathering does not complete (i.e., the state does not become `complete`) within this duration, the gathering process is considered timed out.
- The plugin will then send a null ICE candidate to Dart, signaling the end of candidates, and will ignore further candidates from that gathering cycle.
- Set to `0` or `null` to disable the timeout (relying on WebRTC's default behavior).
- Example: `iceGatheringTimeoutSeconds: 10` (10 seconds)
```dart
// Example demonstrating the new RTCConfiguration fields
RTCConfiguration rtcConfig = RTCConfiguration(
  iceServers: [RTCIceServer(urls: ['stun:stun.l.google.com:19302'])],
  sdpSemantics: SDPSemantics.UnifiedPlan, // Or your desired semantics
  degradationPreference: RTCDegradationPreference.balanced,
  hardwareAcceleration: true, // Hint for hardware acceleration
  allowedIceCandidateTypes: [RTCIceCandidateType.host, RTCIceCandidateType.srflx, RTCIceCandidateType.relay],
  allowedIceProtocols: [RTCIceProtocol.udp],
  iceGatheringTimeoutSeconds: 15,
);

// Map<String, dynamic> configurationMap = rtcConfig.toMap();
// RTCPeerConnection pc = await createPeerConnection(configurationMap);
```
This plugin provides mechanisms to influence codec selection and parameters.
`RTCRtpCodecCapability.profile`: `String?`
- When defining preferred codecs for an `RTCRtpTransceiver` (see below), you can specify a `profile` string within an `RTCRtpCodecCapability` object.
- This `profile` string is appended to the `sdpFmtpLine` attribute of the native `RTCRtpCodecCapability` when passed to the underlying WebRTC engine (e.g., `a=fmtp:...;profile=your_profile_string`). The exact interpretation and validity of the profile string are codec-specific (e.g., for H.264 this might relate to `profile-level-id`).
`RTCRtpCodecParameters.profile`: `String?`
- When RTP parameters are retrieved (e.g., from `RTCRtpSender.getParameters()` or `RTCRtpReceiver.getParameters()`), the `profile` field on `RTCRtpCodecParameters` will be populated if a "profile" key (or a recognized key like "profile-level-id") was found in the native codec's specific parameters map.
- When setting RTP parameters via `RTCRtpSender.setParameters()`, you can include a `codecs` list in the parameters map. Each codec map in this list can have a `profile` string, which will be stored in the native codec's parameter map under the key "profile". (A rough sketch follows this list.)
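For illustration only, a sketch of reading the negotiated profiles from a sender. The method and field names (`getParameters()`, `RTCRtpCodecParameters.profile`) follow the description in this section and may differ between plugin versions:

```dart
// Sketch only: API names follow the description above and may differ
// by plugin version.
Future<void> printNegotiatedProfiles(RTCRtpSender videoSender) async {
  final params = await videoSender.getParameters();
  for (final codec in params.codecs ?? <RTCRtpCodecParameters>[]) {
    // 'profile' is filled from the native codec parameter map, e.g. a
    // recognized 'profile-level-id' entry for H.264.
    print('codec profile: ${codec.profile}');
  }
}
```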
`RTCRtpTransceiverInit.preferredCodecs`: `List<RTCRtpCodecCapability>?`
- When adding a new transceiver using `pc.addTransceiver(trackOrKind, init: rtpTransceiverInit)`, you can provide a list of `RTCRtpCodecCapability` objects in `rtpTransceiverInit.preferredCodecs`.
- This list tells the WebRTC engine your preferred order and configuration for codecs to be negotiated for that transceiver. The actual negotiated codec will depend on the capabilities of both peers.
- Each `RTCRtpCodecCapability` in the list can specify `mimeType`, `clockRate`, `channels`, `sdpFmtpLine`, and the new `profile` field.
Example of setting preferred codecs:
```dart
// Assuming 'pc' is your RTCPeerConnection instance
// and 'videoTrack' is a MediaStreamTrack.
var videoCapabilities = await RTCRtpSender.getCapabilities('video'); // Or RTCRtpReceiver.getCapabilities('video')

// Example: prefer H264 Constrained Baseline, then VP8.
List<RTCRtpCodecCapability> preferredVideoCodecs = [];

// Find an H264 capability and specify a profile
// (example profile-level-id for Constrained Baseline).
final h264Matches = videoCapabilities.codecs
        ?.where((c) => c.mimeType.toLowerCase() == 'video/h264')
        .toList() ??
    [];
if (h264Matches.isNotEmpty) {
  final h264Cap = h264Matches.first;
  // Modify sdpFmtpLine or use profile if your parsing logic handles it.
  // For H264, profile-level-id is standard.
  // This example uses the custom 'profile' field, which is appended to
  // sdpFmtpLine on the native side.
  h264Cap.profile = '42e01f'; // Example profile-level-id for Constrained Baseline
  // Alternatively, manipulate sdpFmtpLine directly if needed:
  // h264Cap.sdpFmtpLine = 'level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e01f';
  preferredVideoCodecs.add(h264Cap);
}

// Find a VP8 capability.
final vp8Matches = videoCapabilities.codecs
        ?.where((c) => c.mimeType.toLowerCase() == 'video/vp8')
        .toList() ??
    [];
if (vp8Matches.isNotEmpty) {
  preferredVideoCodecs.add(vp8Matches.first);
}

if (preferredVideoCodecs.isNotEmpty) {
  await pc.addTransceiver(
    track: videoTrack, // Or kind: RTCRtpMediaType.Video
    init: RTCRtpTransceiverInit(
      direction: TransceiverDirection.SendRecv,
      preferredCodecs: preferredVideoCodecs,
    ),
  );
}
```
Understanding the state and lifecycle of media tracks and streams is crucial for building robust applications.
`MediaStreamTrack.readyState`: `String`
- Indicates the current state of the track. Can be `'live'` or `'ended'`.
`MediaStreamTrack.onEnded`: `Stream<void>`
- A stream that fires an event when the track transitions to the `'ended'` state. This can happen if `stop()` is called, if a remote track is removed by the peer, or potentially if a local device is disconnected or its permissions are revoked (platform-dependent for automatic detection).
`MediaStreamTrack.stop()`: `Future<void>`
- Stops the track, releasing its underlying native resources. The track's `readyState` becomes `'ended'`, and the `onEnded` event is fired.
`MediaStreamTrack.restart(Map<String, dynamic> mediaConstraints)`: `Future<MediaStreamTrack?>`
- This method is available on local tracks (`MediaStreamTrackNative.isLocal == true`).
- It first stops the current track (firing `onEnded`).
- Then, it attempts to re-acquire a new track of the same kind using the provided `mediaConstraints` via `navigator.mediaDevices.getUserMedia()`.
- If successful, it returns a new `MediaStreamTrack` instance. The original track instance remains 'ended'.
- The application is responsible for updating any `RTCRtpSender` to use this new track via `sender.replaceTrack(newTrack)` (see the sketch after this list).
- Returns `null` if re-acquisition fails. Throws `MediaDeviceAcquireError` or one of its subtypes if `getUserMedia` fails with a known error.
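For example, a sketch of recovering an ended local video track and swapping it onto its sender. It assumes `restart()` behaves as described above; the helper name and constraints are illustrative, and `videoSender` is the `RTCRtpSender` currently carrying the track:

```dart
/// Sketch: re-acquire an ended local track and attach the replacement
/// to the sender that was carrying it.
Future<void> recoverEndedTrack(
    MediaStreamTrack endedTrack, RTCRtpSender videoSender) async {
  // Re-acquire a track of the same kind; the original stays 'ended'.
  final newTrack = await endedTrack.restart({
    'video': {'width': 640, 'height': 480},
  });
  if (newTrack != null) {
    // The application must point the sender at the new track itself.
    await videoSender.replaceTrack(newTrack);
  } else {
    print('Could not restart track ${endedTrack.id}');
  }
}
```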
`MediaStream.active`: `bool`
- A getter that returns `true` if the stream has at least one track with `readyState == 'live'`. Returns `false` if all tracks are 'ended' or if the stream has no tracks.
`MediaStream.onActiveStateChanged`: `Stream<bool>`
- A stream that fires `true` when the stream transitions from inactive to active, and `false` when it transitions from active to inactive. This is determined by the `readyState` of its constituent tracks (see the example below).
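A small sketch of watching a stream's activity using these members (the helper name is illustrative):

```dart
/// Sketch: log transitions between active and inactive for a stream.
void watchStreamActivity(MediaStream stream) {
  print('Stream ${stream.id} active: ${stream.active}');
  // Fires true when a track becomes live again, false when the last
  // live track ends.
  stream.onActiveStateChanged.listen((bool isActive) {
    print('Stream ${stream.id} is now ${isActive ? 'active' : 'inactive'}');
  });
}
```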
When using `navigator.mediaDevices.getUserMedia()` or `navigator.mediaDevices.getDisplayMedia()`, more specific exceptions can be caught:
- `PermissionDeniedError`: Thrown if the user denies permission for camera/microphone access, or if OS/browser policies prevent access.
- `NotFoundError`: Thrown if no media tracks of the requested kind are found (e.g., no camera available).
- `MediaDeviceAcquireError`: A more generic error for other issues encountered during device acquisition (e.g., hardware errors, or constraints that cannot be satisfied).
Example: Handling Permission Error
```dart
try {
  final stream = await navigator.mediaDevices.getUserMedia({'audio': true, 'video': true});
  // ... use the stream
} on PermissionDeniedError catch (e) {
  print('Permission denied: ${e.message}');
  // Show UI to the user explaining the need for permissions
} on NotFoundError catch (e) {
  print('Device not found: ${e.message}');
  // Handle missing devices
} catch (e) {
  print('Error acquiring media: $e');
}
```
The `CallQualityManager` is a utility class (found in `package:flutter_webrtc/src/call_quality_manager.dart`) designed to proactively monitor and adapt call quality based on network conditions.
The manager's core logic (`_monitorCallQuality`) periodically performs the following (a simplified sketch of the bitrate adjustment follows this list):
- Bandwidth Estimation: It fetches `availableOutgoingBitrate` from active ICE candidate pair statistics to estimate available send bandwidth.
- Per-Sender Stats Analysis: For each video `RTCRtpSender`, it analyzes WebRTC statistics (`outbound-rtp`) for packet loss, round-trip time (RTT), and jitter.
- Bitrate Adjustment:
  - Downward: If `availableOutgoingBitrate` is significantly lower than the current video sender's `maxBitrate`, the `maxBitrate` is reduced to target a percentage (e.g., 90%) of the estimated available bandwidth.
  - If high packet loss, excessive RTT, or high jitter is detected, the `maxBitrate` is further reduced by configurable factors. These reductions are multiplicative if multiple issues occur.
  - The bitrate will not be reduced below a configurable `minSensibleBitrateBps`.
  - Upward (cautious): If the network quality metrics (packet loss, RTT, jitter) are good and `availableOutgoingBitrate` is significantly higher than the current `maxBitrate`, the `maxBitrate` may be cautiously increased (e.g., by 10%), but capped by a percentage of the `availableOutgoingBitrate`.
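To make the downward adjustment concrete, here is a simplified, illustrative sketch of capping a video sender's `maxBitrate` from an estimated bandwidth figure, using the sender's `parameters`/`encodings`/`setParameters()` API. It is not the manager's actual implementation, and the threshold and factor values are examples only:

```dart
/// Illustrative only: clamp a sender's maxBitrate to ~90% of the
/// estimated available outgoing bandwidth, never below a floor.
Future<void> capSenderBitrate(
    RTCRtpSender videoSender, int availableOutgoingBitrateBps) async {
  const minSensibleBitrateBps = 50000; // 50 kbps floor (example value)
  final params = videoSender.parameters;
  final encodings = params.encodings;
  if (encodings == null || encodings.isEmpty) return;

  final target = (availableOutgoingBitrateBps * 0.9).round();
  final current = encodings.first.maxBitrate ?? target;
  if (target < current) {
    encodings.first.maxBitrate =
        target < minSensibleBitrateBps ? minSensibleBitrateBps : target;
    await videoSender.setParameters(params);
  }
}
```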
The behavior of `CallQualityManager` is controlled by `CallQualityManagerSettings`. You can pass an instance of this class to the `CallQualityManager` constructor.
Key settings include:
- `packetLossThresholdPercent`: e.g., `10.0` (10%)
- `rttThresholdSeconds`: e.g., `0.5` (500 ms)
- `jitterThresholdSeconds`: e.g., `0.03` (30 ms)
- `bweMinDecreaseFactor`, `bweMinIncreaseFactor`, `bweTargetHeadroomFactor`: Control sensitivity to bandwidth estimation.
- `packetLossBitrateFactor`, `rttBitrateFactor`, `jitterBitrateFactor`: Multiplicative factors for bitrate reduction (e.g., `0.8` for a 20% reduction).
- `cautiousIncreaseFactor`: Factor for increasing bitrate (e.g., `1.1` for a 10% increase).
- `minSensibleBitrateBps`: Minimum bitrate to target (e.g., `50000` for 50 kbps).
- `autoRestartLocallyEndedTracks`: `bool` (default `true`); enables the auto-restart policy.
- `defaultAudioRestartConstraints`, `defaultVideoRestartConstraints`: Default constraints used by the auto-restart policy if a local track ends.
Example: Customizing `CallQualityManager`
```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';
// Make sure to import CallQualityManager and CallQualityManagerSettings,
// typically from package:flutter_webrtc/src/call_quality_manager.dart (if not re-exported).

// ... Assuming 'peerConnection' is your RTCPeerConnection instance

final customSettings = CallQualityManagerSettings(
  packetLossThresholdPercent: 15.0, // Be more tolerant of packet loss
  rttThresholdSeconds: 0.7, // Be more tolerant of RTT
  minSensibleBitrateBps: 75000, // Set a higher minimum bitrate
  autoRestartLocallyEndedTracks: true,
  defaultVideoRestartConstraints: {'video': {'width': 640, 'height': 480}}, // Custom restart constraints
);

final callQualityManager = CallQualityManager(peerConnection, customSettings);
callQualityManager.start(); // Default period is 5 seconds

// Listen for track restart events
callQualityManager.onTrackRestarted.listen((MediaStreamTrack newTrack) {
  print('CallQualityManager automatically restarted track: ${newTrack.id}, kind: ${newTrack.kind}');
  // You might want to update UI or other application logic here.
});

// Don't forget to dispose of the manager when the call ends
// callQualityManager.dispose();
```
- If `autoRestartLocallyEndedTracks` is true in the settings, `CallQualityManager` will monitor local tracks associated with active `RTCRtpSender`s.
- If such a track's `onEnded` event fires (signaling it has stopped, possibly unexpectedly), the manager will:
  - Attempt to call `track.restart()` using the `defaultAudioRestartConstraints` or `defaultVideoRestartConstraints` from its settings.
  - If `restart()` successfully returns a `newTrack`, call `sender.replaceTrack(newTrack)` to replace the ended track with the new one on the corresponding `RTCRtpSender`.
  - Emit the `newTrack` on the `CallQualityManager.onTrackRestarted` stream.
This policy helps in automatically recovering from situations where a local media source might be temporarily lost and then re-acquired.
Additional platform/OS support from the community:
- flutter-tizen: https://github.com/flutter-tizen/plugins/tree/master/packages/flutter_webrtc
- flutter-elinux(WIP): sony/flutter-elinux-plugins#7
Add `flutter_webrtc` as a dependency in your `pubspec.yaml` file.
Add the following entry to your Info.plist file, located at `<project root>/ios/Runner/Info.plist`:
```xml
<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) Camera Usage!</string>
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) Microphone Usage!</string>
```
This entry allows your app to access the camera and microphone.
The WebRTC.xcframework compiled after the m104 release no longer supports iOS arm devices, so you need to add `config.build_settings['ONLY_ACTIVE_ARCH'] = 'YES'` to the `ios/Podfile` in your project:

ios/Podfile
```ruby
post_install do |installer|
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)
    target.build_configurations.each do |config|
      # Workaround for https://github.com/flutter/flutter/issues/64502
      config.build_settings['ONLY_ACTIVE_ARCH'] = 'YES' # <= this line
    end
  end
end
```
Ensure the following permissions are present in your Android manifest file, located at `<project root>/android/app/src/main/AndroidManifest.xml`:
```xml
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
```
If you need to use a Bluetooth device, please add:
```xml
<uses-permission android:name="android.permission.BLUETOOTH" android:maxSdkVersion="30" />
<uses-permission android:name="android.permission.BLUETOOTH_ADMIN" android:maxSdkVersion="30" />
```
The Flutter project template adds it, so it may already be there.
You will also need to set your build settings to Java 8, because the official WebRTC jar now uses static methods in the `EglBase` interface. Just add this to your app-level `build.gradle`:
```groovy
android {
    //...
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}
```
If necessary, in the same `build.gradle` you will also need to increase `minSdkVersion` of `defaultConfig` up to `23` (the default Flutter generator currently sets it to `16`).
When you compile a release APK, you also need to set up ProGuard rules; see Setup Proguard Rules.
The project is inseparable from the contributors of the community.
- CloudWebRTC - Original Author
- RainwayApp - Sponsor
- 亢少军 - Sponsor
- ION - Sponsor
- reSipWebRTC - Sponsor
- 沃德米科技-36记手写板 - Sponsor
- 阿斯特网络科技有限公司 - Sponsor
For more examples, please refer to flutter-webrtc-demo.
This project exists thanks to all the people who contribute. [Contribute].
Become a financial contributor and help us sustain our community. [Contribute]
Support this project with your organization. Your logo will show up here with a link to your website. [Contribute]