WebRTC Series (2): Local Call (H5, Android, iOS)
WebRTC Series (1): Introduction
I. Overall Flow
With the WebRTC introduction from the previous post as a foundation, we know how WebRTC works; the next step is to implement that flow in code. The difficulty varies slightly between platforms (in practice I found iOS has the least documentation), but the overall flow is the same.
Q: How many steps does it take to put an elephant in a fridge? A: Three.
- Initialize;
- Make the call and receive the answer;
- Hang up and release resources.
Initialization
- Initialize the PeerConnectionFactory;
- Create an EglBase;
- Create the PeerConnectionFactory;
- Create the audio track;
- Create the video track;
- Initialize the local video rendering view;
- Initialize the remote video rendering view;
- Start local rendering.
Not every platform needs all of these steps; some platforms' WebRTC libraries already handle part of them, and H5 requires the least initialization. The order of some steps may also differ slightly (Android and iOS differ here). Note that improper initialization can leave the picture failing to render, which is hard to debug; later I'll point out the pitfalls I hit.
Making the Call and Receiving the Answer
- A creates a PeerConnection;
- A adds the audio and video tracks to the PeerConnection;
- A creates an offer via the PeerConnection and obtains the sdp;
- A passes the offer sdp to setLocalDescription;
- A sends the offer sdp;
- B receives the offer sdp and creates a PeerConnection;
- B adds the audio and video tracks to the PeerConnection;
- B passes the offer sdp to setRemoteDescription;
- B creates an answer via the PeerConnection and obtains the sdp;
- B passes the answer sdp to setLocalDescription;
- B sends the answer sdp;
- A receives the answer sdp and passes it to setRemoteDescription.
During this exchange, the PeerConnection's onIceCandidate() callback fires; we need to deliver each generated IceCandidate to the other side, and when we receive one we add it via the PeerConnection's addIceCandidate() method.
The PeerConnection's onAddStream() callback also fires during this exchange; we need to attach the MediaStream to the remote view's renderer for display.
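The negotiation order above can be sketched as a loopback simulation in plain JavaScript. This is not real WebRTC: `FakePeer` is a hypothetical stand-in for RTCPeerConnection that only records which step ran, purely to make the offer/answer sequencing explicit.

```javascript
// Hypothetical stand-in for RTCPeerConnection: each method just logs the
// negotiation step it represents, so the required ordering becomes visible.
class FakePeer {
  constructor(name, log) { this.name = name; this.log = log; }
  createOffer()  { this.log.push(this.name + ": createOffer");  return { type: "offer" }; }
  createAnswer() { this.log.push(this.name + ": createAnswer"); return { type: "answer" }; }
  setLocalDescription(sdp)  { this.log.push(this.name + ": setLocal(" + sdp.type + ")"); }
  setRemoteDescription(sdp) { this.log.push(this.name + ": setRemote(" + sdp.type + ")"); }
}

const log = [];
const a = new FakePeer("A", log); // caller
const b = new FakePeer("B", log); // callee

const offer = a.createOffer();    // A creates the offer
a.setLocalDescription(offer);     // A sets it locally
b.setRemoteDescription(offer);    // "signaling": B receives the offer
const answer = b.createAnswer();  // B creates the answer
b.setLocalDescription(answer);    // B sets it locally
a.setRemoteDescription(answer);   // "signaling": A receives the answer

console.log(log.join("\n"));
```

The "signaling" steps are where a real app would ship the sdp to the other peer; in the local demos below they are just direct method calls.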
Hanging Up and Releasing Resources
- Close the PeerConnection;
- Release the remote video rendering view.
Releasing resources is mainly so that the next call can proceed normally.
With the overall flow laid out, we can first build a local demo. I create two PeerConnections: a local PeerConnection acting as user A, and a remote PeerConnection acting as user B. Since most Android and iOS devices can only open one camera at a time, the remote PeerConnection does not capture video; it only renders the remote picture.
II. H5
1. Adding Dependencies
As noted in the previous post, WebRTC is now part of the H5 standard, so implementing it on H5 is easy and requires no extra dependencies.
2. local_demo.html
The H5 implementation is relatively simple; the browser already handles much of the initialization for us. The full code is as follows:
<html>
<head>
<title>Local Demo</title>
<style>
body {
overflow: hidden;
margin: 0px;
padding: 0px;
}
#local_view {
width: 100%;
height: 100%;
}
#remote_view {
width: 9%;
height: 16%;
position: absolute;
top: 10%;
right: 10%;
}
#left {
width: 10%;
height: 5%;
position: absolute;
left: 10%;
top: 10%;
}
.my_button {
width: 100%;
height: 100%;
display: block;
margin-bottom: 10%;
}
</style>
</head>
<body>
<video id="local_view" autoplay controls muted></video>
<video id="remote_view" autoplay controls muted></video>
<div id="left">
<button id="btn_call" class="my_button" onclick="call()">Call</button>
<button id="btn_hang_up" class="my_button" onclick="hangUp()">Hang Up</button>
</div>
</body>
<script type="text/javascript">
let localView = document.getElementById("local_view");
let remoteView = document.getElementById("remote_view");
var localStream;
var localPeerConnection;
var remotePeerConnection;
function call() {
// Create the PeerConnection
localPeerConnection = new RTCPeerConnection();
localPeerConnection.onicecandidate = function (event) {
console.log("onicecandidate--->" + event.candidate);
let iceCandidate = event.candidate;
if (iceCandidate == null) {
return;
}
sendIceCandidate(localPeerConnection, iceCandidate);
}
// Add the audio and video tracks to the PeerConnection
for (let i = 0; localStream != null && i < localStream.getTracks().length; i++) {
const track = localStream.getTracks()[i];
localPeerConnection.addTrack(track, localStream);
}
// Create an offer via the PeerConnection and obtain the sdp
localPeerConnection.createOffer().then(function (sessionDescription) {
console.log("create offer success.");
// Pass the offer sdp to setLocalDescription
localPeerConnection.setLocalDescription(sessionDescription).then(function () {
console.log("set local sdp success.");
// Send the offer sdp
sendOffer(sessionDescription)
})
})
}
function sendOffer(offer) {
receivedOffer(offer);
}
function receivedOffer(offer) {
// Create the PeerConnection
remotePeerConnection = new RTCPeerConnection();
remotePeerConnection.onicecandidate = function (event) {
console.log("onicecandidate--->" + event.candidate);
let iceCandidate = event.candidate;
if (iceCandidate == null) {
return;
}
sendIceCandidate(remotePeerConnection, iceCandidate);
}
remotePeerConnection.ontrack = function (event) {
console.log("remote ontrack--->" + event.streams);
let streams = event.streams;
if (streams && streams.length > 0) {
remoteView.srcObject = streams[0];
}
}
// Pass the offer sdp to setRemoteDescription
remotePeerConnection.setRemoteDescription(offer).then(function () {
console.log("set remote sdp success.");
// Create an answer via the PeerConnection and obtain the sdp
remotePeerConnection.createAnswer().then(function (sessionDescription) {
console.log("create answer success.");
// Pass the answer sdp to setLocalDescription
remotePeerConnection.setLocalDescription(sessionDescription).then(function () {
console.log("set local sdp success.");
// Send the answer sdp
sendAnswer(sessionDescription);
})
})
})
}
function sendAnswer(answer) {
receivedAnswer(answer);
}
function receivedAnswer(answer) {
// Received the answer sdp; pass it to setRemoteDescription
localPeerConnection.setRemoteDescription(answer).then(function () {
console.log("set remote sdp success.");
})
}
function sendIceCandidate(peerConnection, iceCandidate) {
receivedCandidate(peerConnection, iceCandidate);
}
function receivedCandidate(peerConnection, iceCandidate) {
if (peerConnection == localPeerConnection) {
remotePeerConnection.addIceCandidate(iceCandidate);
} else {
localPeerConnection.addIceCandidate(iceCandidate);
}
}
function hangUp() {
if (localPeerConnection != null) {
localPeerConnection.close();
localPeerConnection = null;
}
if (remotePeerConnection != null) {
remotePeerConnection.close();
remotePeerConnection = null;
}
// The remote view was attached via srcObject, so clear srcObject to release it
remoteView.srcObject = null;
}
navigator.mediaDevices.getUserMedia({ audio: true, video: true }).then(function (mediaStream) {
// Initialize PeerConnectionFactory
// Create EglBase
// Create PeerConnectionFactory
// Create the audio track
// Create the video track
localStream = mediaStream;
// Initialize the local video rendering view
// Initialize the remote video rendering view
// Start local rendering
localView.srcObject = mediaStream;
}).catch(function (error) {
console.log("error--->" + error);
})
</script>
</html>
After running it, click Call and the remote view in the top-right corner receives the local video:
III. Android
The Android flow has the most steps, but since I write Android for a living I find it the most rigorous; it may also be because WebRTC is open-sourced by Google that Android material is fairly easy to find.
1. Adding Dependencies
On Android, add the dependency to the app module's build.gradle:
// WebRTC
implementation 'org.webrtc:google-webrtc:1.0.32006'
For well-known reasons within mainland China, downloading the dependency can be slow or even fail, so you may need the Aliyun Maven mirror.
Before Gradle 7.0, add the Aliyun Maven mirror to build.gradle in the project root:
...
allprojects {
repositories {
...
// Aliyun Maven mirror
maven { url 'http://maven.aliyun.com/nexus/content/groups/public/' }
}
}
...
With Gradle 7.0 and later, add the Aliyun Maven mirror to settings.gradle in the project root instead:
...
dependencyResolutionManagement {
...
repositories {
...
// Aliyun Maven mirror
maven {
allowInsecureProtocol = true
url 'http://maven.aliyun.com/nexus/content/groups/public/'
}
}
}
...
2. Permissions
Besides the dependency, Android also needs the network, audio recording, camera, network state, and Wi-Fi state permissions. Since Android 6.0 the recording and camera permissions must additionally be requested at runtime. None of this is WebRTC-specific; anyone who is not new to Android will know how to do it, so I won't go over it here.
<!-- Permission to access the network -->
<uses-permission android:name="android.permission.INTERNET" />
<!-- Permission to record audio -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<!-- Permission to use the camera -->
<uses-permission android:name="android.permission.CAMERA" />
<!-- Permission to read the network state -->
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<!-- Permission to read the Wi-Fi state -->
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
3. Layout
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#FF000000"
android:keepScreenOn="true"
tools:context=".LocalDemoActivity">
<org.webrtc.SurfaceViewRenderer
android:id="@+id/svr_local"
android:layout_width="match_parent"
android:layout_height="0dp"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintDimensionRatio="9:16"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<org.webrtc.SurfaceViewRenderer
android:id="@+id/svr_remote"
android:layout_width="90dp"
android:layout_height="0dp"
android:layout_marginTop="30dp"
android:layout_marginEnd="30dp"
app:layout_constraintDimensionRatio="9:16"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<androidx.appcompat.widget.LinearLayoutCompat
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginStart="30dp"
android:layout_marginTop="30dp"
android:layout_marginEnd="30dp"
android:orientation="vertical"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent">
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/btn_call"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="Call" />
<androidx.appcompat.widget.AppCompatButton
android:id="@+id/btn_hang_up"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="Hang Up" />
</androidx.appcompat.widget.LinearLayoutCompat>
</androidx.constraintlayout.widget.ConstraintLayout>
4.LocalDemoActivity
package com.qinshou.webrtcdemo_android;
import android.content.Context;
import android.os.Bundle;
import android.view.View;
import androidx.appcompat.app.AppCompatActivity;
import org.webrtc.AudioSource;
import org.webrtc.AudioTrack;
import org.webrtc.Camera2Capturer;
import org.webrtc.Camera2Enumerator;
import org.webrtc.CameraEnumerator;
import org.webrtc.DataChannel;
import org.webrtc.DefaultVideoDecoderFactory;
import org.webrtc.DefaultVideoEncoderFactory;
import org.webrtc.EglBase;
import org.webrtc.IceCandidate;
import org.webrtc.MediaConstraints;
import org.webrtc.MediaStream;
import org.webrtc.PeerConnection;
import org.webrtc.PeerConnectionFactory;
import org.webrtc.RtpReceiver;
import org.webrtc.SessionDescription;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.SurfaceViewRenderer;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoDecoderFactory;
import org.webrtc.VideoEncoderFactory;
import org.webrtc.VideoSource;
import org.webrtc.VideoTrack;
import java.util.ArrayList;
import java.util.List;
/**
* Author: MrQinshou
* Email: cqflqinhao@126.com
* Date: 2023/3/21 17:22
* Description: local demo
*/
public class LocalDemoActivity extends AppCompatActivity {
private static final String TAG = LocalDemoActivity.class.getSimpleName();
private static final String AUDIO_TRACK_ID = "ARDAMSa0";
private static final String VIDEO_TRACK_ID = "ARDAMSv0";
private static final List<String> STREAM_IDS = new ArrayList<String>() {{
add("ARDAMS");
}};
private static final String SURFACE_TEXTURE_HELPER_THREAD_NAME = "SurfaceTextureHelperThread";
private static final int WIDTH = 1280;
private static final int HEIGHT = 720;
private static final int FPS = 30;
private EglBase mEglBase;
private PeerConnectionFactory mPeerConnectionFactory;
private VideoCapturer mVideoCapturer;
private AudioTrack mAudioTrack;
private VideoTrack mVideoTrack;
private PeerConnection mLocalPeerConnection;
private PeerConnection mRemotePeerConnection;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_local_demo);
findViewById(R.id.btn_call).setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
call();
}
});
findViewById(R.id.btn_hang_up).setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
hangUp();
}
});
// Initialize PeerConnectionFactory
initPeerConnectionFactory(LocalDemoActivity.this);
// Create EglBase
mEglBase = EglBase.create();
// Create PeerConnectionFactory
mPeerConnectionFactory = createPeerConnectionFactory(mEglBase);
// Create the audio track
mAudioTrack = createAudioTrack(mPeerConnectionFactory);
// Create the video track
mVideoCapturer = createVideoCapturer();
VideoSource videoSource = createVideoSource(mPeerConnectionFactory, mVideoCapturer);
mVideoTrack = createVideoTrack(mPeerConnectionFactory, videoSource);
// Initialize the local video rendering view; this call is essential, without it the screen stays black
SurfaceViewRenderer svrLocal = findViewById(R.id.svr_local);
svrLocal.init(mEglBase.getEglBaseContext(), null);
mVideoTrack.addSink(svrLocal);
// Initialize the remote video rendering view; this call is essential, without it the screen stays black
SurfaceViewRenderer svrRemote = findViewById(R.id.svr_remote);
svrRemote.init(mEglBase.getEglBaseContext(), null);
// Start local rendering
// Create a SurfaceTextureHelper, which represents the thread the camera is initialized on
SurfaceTextureHelper surfaceTextureHelper = SurfaceTextureHelper.create(SURFACE_TEXTURE_HELPER_THREAD_NAME, mEglBase.getEglBaseContext());
// Initialize the video capturer
mVideoCapturer.initialize(surfaceTextureHelper, LocalDemoActivity.this, videoSource.getCapturerObserver());
mVideoCapturer.startCapture(WIDTH, HEIGHT, FPS);
}
@Override
protected void onDestroy() {
super.onDestroy();
if (mEglBase != null) {
mEglBase.release();
mEglBase = null;
}
if (mVideoCapturer != null) {
try {
mVideoCapturer.stopCapture();
} catch (InterruptedException e) {
e.printStackTrace();
}
mVideoCapturer.dispose();
mVideoCapturer = null;
}
if (mAudioTrack != null) {
mAudioTrack.dispose();
mAudioTrack = null;
}
if (mVideoTrack != null) {
mVideoTrack.dispose();
mVideoTrack = null;
}
if (mLocalPeerConnection != null) {
mLocalPeerConnection.close();
mLocalPeerConnection = null;
}
if (mRemotePeerConnection != null) {
mRemotePeerConnection.close();
mRemotePeerConnection = null;
}
SurfaceViewRenderer svrLocal = findViewById(R.id.svr_local);
svrLocal.release();
SurfaceViewRenderer svrRemote = findViewById(R.id.svr_remote);
svrRemote.release();
}
private void initPeerConnectionFactory(Context context) {
PeerConnectionFactory.initialize(PeerConnectionFactory.InitializationOptions.builder(context).createInitializationOptions());
}
private PeerConnectionFactory createPeerConnectionFactory(EglBase eglBase) {
VideoEncoderFactory videoEncoderFactory = new DefaultVideoEncoderFactory(eglBase.getEglBaseContext(), true, true);
VideoDecoderFactory videoDecoderFactory = new DefaultVideoDecoderFactory(eglBase.getEglBaseContext());
return PeerConnectionFactory.builder().setVideoEncoderFactory(videoEncoderFactory).setVideoDecoderFactory(videoDecoderFactory).createPeerConnectionFactory();
}
private AudioTrack createAudioTrack(PeerConnectionFactory peerConnectionFactory) {
AudioSource audioSource = peerConnectionFactory.createAudioSource(new MediaConstraints());
AudioTrack audioTrack = peerConnectionFactory.createAudioTrack(AUDIO_TRACK_ID, audioSource);
audioTrack.setEnabled(true);
return audioTrack;
}
private VideoCapturer createVideoCapturer() {
VideoCapturer videoCapturer = null;
CameraEnumerator cameraEnumerator = new Camera2Enumerator(LocalDemoActivity.this);
for (String deviceName : cameraEnumerator.getDeviceNames()) {
// Front-facing camera
if (cameraEnumerator.isFrontFacing(deviceName)) {
videoCapturer = new Camera2Capturer(LocalDemoActivity.this, deviceName, null);
}
}
return videoCapturer;
}
private VideoSource createVideoSource(PeerConnectionFactory peerConnectionFactory, VideoCapturer videoCapturer) {
// Create the video source
VideoSource videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
return videoSource;
}
private VideoTrack createVideoTrack(PeerConnectionFactory peerConnectionFactory, VideoSource videoSource) {
// Create the video track
VideoTrack videoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
videoTrack.setEnabled(true);
return videoTrack;
}
private void call() {
// Create the PeerConnection
PeerConnection.RTCConfiguration rtcConfiguration = new PeerConnection.RTCConfiguration(new ArrayList<>());
mLocalPeerConnection = mPeerConnectionFactory.createPeerConnection(rtcConfiguration, new PeerConnection.Observer() {
@Override
public void onSignalingChange(PeerConnection.SignalingState signalingState) {
}
@Override
public void onIceConnectionChange(PeerConnection.IceConnectionState iceConnectionState) {
}
@Override
public void onIceConnectionReceivingChange(boolean b) {
}
@Override
public void onIceGatheringChange(PeerConnection.IceGatheringState iceGatheringState) {
}
@Override
public void onIceCandidate(IceCandidate iceCandidate) {
ShowLogUtil.verbose("onIceCandidate--->" + iceCandidate);
sendIceCandidate(mLocalPeerConnection, iceCandidate);
}
@Override
public void onIceCandidatesRemoved(IceCandidate[] iceCandidates) {
}
@Override
public void onAddStream(MediaStream mediaStream) {
}
@Override
public void onRemoveStream(MediaStream mediaStream) {
}
@Override
public void onDataChannel(DataChannel dataChannel) {
}
@Override
public void onRenegotiationNeeded() {
}
@Override
public void onAddTrack(RtpReceiver rtpReceiver, MediaStream[] mediaStreams) {
}
});
// Add the audio and video tracks to the PeerConnection
mLocalPeerConnection.addTrack(mAudioTrack, STREAM_IDS);
mLocalPeerConnection.addTrack(mVideoTrack, STREAM_IDS);
// Create an offer via the PeerConnection and obtain the sdp
MediaConstraints mediaConstraints = new MediaConstraints();
mLocalPeerConnection.createOffer(new MySdpObserver() {
@Override
public void onCreateSuccess(SessionDescription sessionDescription) {
ShowLogUtil.verbose("create offer success.");
// Pass the offer sdp to setLocalDescription
mLocalPeerConnection.setLocalDescription(new MySdpObserver() {
@Override
public void onCreateSuccess(SessionDescription sessionDescription) {
}
@Override
public void onSetSuccess() {
ShowLogUtil.verbose("set local sdp success.");
// Send the offer sdp
sendOffer(sessionDescription);
}
}, sessionDescription);
}
@Override
public void onSetSuccess() {
}
}, mediaConstraints);
}
private void sendOffer(SessionDescription offer) {
receivedOffer(offer);
}
private void receivedOffer(SessionDescription offer) {
// Create the PeerConnection
PeerConnection.RTCConfiguration rtcConfiguration = new PeerConnection.RTCConfiguration(new ArrayList<>());
mRemotePeerConnection = mPeerConnectionFactory.createPeerConnection(rtcConfiguration, new PeerConnection.Observer() {
@Override
public void onSignalingChange(PeerConnection.SignalingState signalingState) {
}
@Override
public void onIceConnectionChange(PeerConnection.IceConnectionState iceConnectionState) {
}
@Override
public void onIceConnectionReceivingChange(boolean b) {
}
@Override
public void onIceGatheringChange(PeerConnection.IceGatheringState iceGatheringState) {
}
@Override
public void onIceCandidate(IceCandidate iceCandidate) {
ShowLogUtil.verbose("onIceCandidate--->" + iceCandidate);
sendIceCandidate(mRemotePeerConnection, iceCandidate);
}
@Override
public void onIceCandidatesRemoved(IceCandidate[] iceCandidates) {
}
@Override
public void onAddStream(MediaStream mediaStream) {
ShowLogUtil.verbose("onAddStream--->" + mediaStream);
if (mediaStream == null || mediaStream.videoTracks == null || mediaStream.videoTracks.isEmpty()) {
return;
}
runOnUiThread(new Runnable() {
@Override
public void run() {
SurfaceViewRenderer svrRemote = findViewById(R.id.svr_remote);
mediaStream.videoTracks.get(0).addSink(svrRemote);
}
});
}
@Override
public void onRemoveStream(MediaStream mediaStream) {
}
@Override
public void onDataChannel(DataChannel dataChannel) {
}
@Override
public void onRenegotiationNeeded() {
}
@Override
public void onAddTrack(RtpReceiver rtpReceiver, MediaStream[] mediaStreams) {
}
});
// Pass the offer sdp to setRemoteDescription
mRemotePeerConnection.setRemoteDescription(new MySdpObserver() {
@Override
public void onCreateSuccess(SessionDescription sessionDescription) {
}
@Override
public void onSetSuccess() {
ShowLogUtil.verbose("set remote sdp success.");
// Create an answer via the PeerConnection and obtain the sdp
MediaConstraints mediaConstraints = new MediaConstraints();
mRemotePeerConnection.createAnswer(new MySdpObserver() {
@Override
public void onCreateSuccess(SessionDescription sessionDescription) {
ShowLogUtil.verbose("create answer success.");
// Pass the answer sdp to setLocalDescription
mRemotePeerConnection.setLocalDescription(new MySdpObserver() {
@Override
public void onCreateSuccess(SessionDescription sessionDescription) {
}
@Override
public void onSetSuccess() {
ShowLogUtil.verbose("set local sdp success.");
// Send the answer sdp
sendAnswer(sessionDescription);
}
}, sessionDescription);
}
@Override
public void onSetSuccess() {
}
}, mediaConstraints);
}
}, offer);
}
private void sendAnswer(SessionDescription answer) {
receivedAnswer(answer);
}
private void receivedAnswer(SessionDescription answer) {
// Received the answer sdp; pass it to setRemoteDescription
mLocalPeerConnection.setRemoteDescription(new MySdpObserver() {
@Override
public void onCreateSuccess(SessionDescription sessionDescription) {
}
@Override
public void onSetSuccess() {
ShowLogUtil.verbose("set remote sdp success.");
}
}, answer);
}
private void sendIceCandidate(PeerConnection peerConnection, IceCandidate iceCandidate) {
receivedCandidate(peerConnection, iceCandidate);
}
private void receivedCandidate(PeerConnection peerConnection, IceCandidate iceCandidate) {
if (peerConnection == mLocalPeerConnection) {
mRemotePeerConnection.addIceCandidate(iceCandidate);
} else {
mLocalPeerConnection.addIceCandidate(iceCandidate);
}
}
private void hangUp() {
// Close the PeerConnection
if (mLocalPeerConnection != null) {
mLocalPeerConnection.close();
mLocalPeerConnection.dispose();
mLocalPeerConnection = null;
}
if (mRemotePeerConnection != null) {
mRemotePeerConnection.close();
mRemotePeerConnection.dispose();
mRemotePeerConnection = null;
}
// Release the remote video rendering view
SurfaceViewRenderer svrRemote = findViewById(R.id.svr_remote);
svrRemote.clearImage();
}
}
Here MySdpObserver is just a custom sub-interface of SdpObserver that provides default implementations of the failure callbacks:
package com.qinshou.webrtcdemo_android;
import org.webrtc.SdpObserver;
/**
* Author: MrQinshou
* Email: cqflqinhao@126.com
* Date: 2023/3/20 18:46
* Description: class description
*/
public interface MySdpObserver extends SdpObserver {
@Override
default void onCreateFailure(String s) {
}
@Override
default void onSetFailure(String s) {
}
}
After running it, tap Call and the remote view in the top-right corner receives the local video:
IV. iOS
Compared with Android, the iOS material is relatively scarce and largely repetitive; I stepped into quite a few pits before getting the demo to run.
1. Adding Dependencies
iOS also needs an extra dependency. Since I am an Android developer I am not familiar with the other iOS dependency mechanisms; the following pulls in WebRTC via CocoaPods:
...
target 'WebRTCDemo-iOS' do
...
pod 'GoogleWebRTC','~>1.1.31999'
end
...
2. Permissions
Besides the dependency, iOS also needs the microphone and camera permissions, which likewise must be requested at runtime. None of this is WebRTC-specific; anyone who is not new to iOS will know how to do it, so I won't go over it here.
<key>NSMicrophoneUsageDescription</key>
<string>Allow the app to use the microphone</string>
<key>NSCameraUsageDescription</key>
<string>Allow the app to use the camera</string>
3.LocalDemoViewController
//
// LocalDemoViewController.swift
// WebRTCDemo
//
// Created by 覃浩 on 2023/3/21.
//
import UIKit
import WebRTC
import SnapKit
class LocalDemoViewController: UIViewController {
private static let AUDIO_TRACK_ID = "ARDAMSa0"
private static let VIDEO_TRACK_ID = "ARDAMSv0"
private static let STREAM_IDS = ["ARDAMS"]
private static let WIDTH = 1280
private static let HEIGHT = 720
private static let FPS = 30
private var localView: RTCEAGLVideoView!
private var remoteView: RTCEAGLVideoView!
private var peerConnectionFactory: RTCPeerConnectionFactory!
private var audioTrack: RTCAudioTrack?
private var videoTrack: RTCVideoTrack?
/**
On iOS the capturer must be stored in a property (not a local variable), otherwise the local picture will not render
*/
private var videoCapturer: RTCVideoCapturer?
/**
On iOS the remote stream must be stored in a property (not a local variable), otherwise the remote picture will not render
*/
private var remoteStream: RTCMediaStream?
private var localPeerConnection: RTCPeerConnection?
private var remotePeerConnection: RTCPeerConnection?
override func viewDidLoad() {
super.viewDidLoad()
// Keep the view within the area below the NavigationBar instead of extending over the whole screen
edgesForExtendedLayout = UIRectEdge()
let btnCall = UIButton()
btnCall.backgroundColor = UIColor.lightGray
btnCall.setTitle("Call", for: .normal)
btnCall.setTitleColor(UIColor.black, for: .normal)
btnCall.addTarget(self, action: #selector(call), for: .touchUpInside)
self.view.addSubview(btnCall)
btnCall.snp.makeConstraints({ maker in
maker.left.equalToSuperview().offset(30)
maker.width.equalTo(60)
maker.height.equalTo(40)
})
let btnHangUp = UIButton()
btnHangUp.backgroundColor = UIColor.lightGray
btnHangUp.setTitle("Hang Up", for: .normal)
btnHangUp.setTitleColor(UIColor.black, for: .normal)
btnHangUp.addTarget(self, action: #selector(hangUp), for: .touchUpInside)
self.view.addSubview(btnHangUp)
btnHangUp.snp.makeConstraints({ maker in
maker.left.equalToSuperview().offset(30)
maker.width.equalTo(60)
maker.height.equalTo(40)
maker.top.equalTo(btnCall.snp.bottom).offset(10)
})
// Initialize PeerConnectionFactory
initPeerConnectionFactory()
// Create EglBase
// Create PeerConnectionFactory
peerConnectionFactory = createPeerConnectionFactory()
// Create the audio track
audioTrack = createAudioTrack(peerConnectionFactory: peerConnectionFactory)
// Create the video track
videoTrack = createVideoTrack(peerConnectionFactory: peerConnectionFactory)
let tuple = createVideoCapturer(videoSource: videoTrack!.source)
let captureDevice = tuple.captureDevice
videoCapturer = tuple.videoCapture
// Initialize the local video rendering view
localView = RTCEAGLVideoView()
localView.delegate = self
self.view.insertSubview(localView,at: 0)
localView.snp.makeConstraints({ maker in
maker.width.equalToSuperview()
maker.height.equalTo(localView.snp.width).multipliedBy(16.0/9.0)
maker.centerY.equalToSuperview()
})
videoTrack?.add(localView!)
// Initialize the remote video rendering view
remoteView = RTCEAGLVideoView()
remoteView.delegate = self
self.view.insertSubview(remoteView, aboveSubview: localView)
remoteView.snp.makeConstraints({ maker in
maker.width.equalTo(90)
maker.height.equalTo(160)
maker.top.equalToSuperview().offset(30)
maker.right.equalToSuperview().offset(-30)
})
// Start local rendering
(videoCapturer as? RTCCameraVideoCapturer)?.startCapture(with: captureDevice!, format: captureDevice!.activeFormat, fps: LocalDemoViewController.FPS)
}
override func viewDidDisappear(_ animated: Bool) {
(videoCapturer as? RTCCameraVideoCapturer)?.stopCapture()
videoCapturer = nil
localPeerConnection?.close()
localPeerConnection = nil
remotePeerConnection?.close()
remotePeerConnection = nil
}
private func initPeerConnectionFactory() {
RTCPeerConnectionFactory.initialize()
}
private func createPeerConnectionFactory() -> RTCPeerConnectionFactory {
var videoEncoderFactory = RTCDefaultVideoEncoderFactory()
var videoDecoderFactory = RTCDefaultVideoDecoderFactory()
if TARGET_OS_SIMULATOR != 0 {
videoEncoderFactory = RTCSimluatorVideoEncoderFactory()
videoDecoderFactory = RTCSimulatorVideoDecoderFactory()
}
return RTCPeerConnectionFactory(encoderFactory: videoEncoderFactory, decoderFactory: videoDecoderFactory)
}
private func createAudioTrack(peerConnectionFactory: RTCPeerConnectionFactory) -> RTCAudioTrack {
let mandatoryConstraints : [String : String] = [:]
let optionalConstraints : [String : String] = [:]
let audioSource = peerConnectionFactory.audioSource(with: RTCMediaConstraints(mandatoryConstraints: mandatoryConstraints, optionalConstraints: optionalConstraints))
let audioTrack = peerConnectionFactory.audioTrack(with: audioSource, trackId: LocalDemoViewController.AUDIO_TRACK_ID)
audioTrack.isEnabled = true
return audioTrack
}
private func createVideoTrack(peerConnectionFactory: RTCPeerConnectionFactory) -> RTCVideoTrack? {
let videoSource = peerConnectionFactory.videoSource()
let videoTrack = peerConnectionFactory.videoTrack(with: videoSource, trackId: LocalDemoViewController.VIDEO_TRACK_ID)
videoTrack.isEnabled = true
return videoTrack
}
private func createVideoCapturer(videoSource: RTCVideoSource) -> (captureDevice: AVCaptureDevice?, videoCapture: RTCVideoCapturer?) {
let videoCapturer = RTCCameraVideoCapturer(delegate: videoSource)
let captureDevices = RTCCameraVideoCapturer.captureDevices()
if (captureDevices.count == 0) {
return (nil, nil)
}
var captureDevice: AVCaptureDevice?
for c in captureDevices {
// Front-facing camera
if (c.position == .front) {
captureDevice = c
break
}
}
if (captureDevice == nil) {
return (nil, nil)
}
return (captureDevice, videoCapturer)
}
@objc private func call() {
// Create the PeerConnection
let rtcConfiguration = RTCConfiguration()
var mandatoryConstraints : [String : String] = [:]
var optionalConstraints : [String : String] = [:]
var mediaConstraints = RTCMediaConstraints(mandatoryConstraints: mandatoryConstraints, optionalConstraints: optionalConstraints)
localPeerConnection = peerConnectionFactory.peerConnection(with: rtcConfiguration, constraints: mediaConstraints, delegate: self)
// Add the audio and video tracks to the PeerConnection
localPeerConnection?.add(audioTrack!, streamIds: LocalDemoViewController.STREAM_IDS)
localPeerConnection?.add(videoTrack!, streamIds: LocalDemoViewController.STREAM_IDS)
// Create an offer via the PeerConnection and obtain the sdp
mandatoryConstraints = [:]
optionalConstraints = [:]
mediaConstraints = RTCMediaConstraints(mandatoryConstraints: mandatoryConstraints, optionalConstraints: optionalConstraints)
localPeerConnection?.offer(for: mediaConstraints, completionHandler: { sessionDescription, error in
ShowLogUtil.verbose("create offer success.")
// Pass the offer sdp to setLocalDescription
self.localPeerConnection?.setLocalDescription(sessionDescription!, completionHandler: { _ in
ShowLogUtil.verbose("set local sdp success.")
// Send the offer sdp
self.sendOffer(offer: sessionDescription!)
})
})
}
private func sendOffer(offer: RTCSessionDescription) {
receivedOffer(offer: offer)
}
private func receivedOffer(offer: RTCSessionDescription) {
// Create the PeerConnection
let rtcConfiguration = RTCConfiguration()
let mandatoryConstraints : [String : String] = [:]
let optionalConstraints : [String : String] = [:]
var mediaConstraints = RTCMediaConstraints(mandatoryConstraints: mandatoryConstraints, optionalConstraints: optionalConstraints)
remotePeerConnection = peerConnectionFactory.peerConnection(with: rtcConfiguration, constraints: mediaConstraints, delegate: self)
// Pass the offer sdp to setRemoteDescription
remotePeerConnection?.setRemoteDescription(offer, completionHandler: { _ in
ShowLogUtil.verbose("set remote sdp success.")
// Create an answer via the PeerConnection and obtain the sdp
let mandatoryConstraints : [String : String] = [:]
let optionalConstraints : [String : String] = [:]
let mediaConstraints = RTCMediaConstraints(mandatoryConstraints: mandatoryConstraints, optionalConstraints: optionalConstraints)
self.remotePeerConnection?.answer(for: mediaConstraints, completionHandler: { sessionDescription, error in
ShowLogUtil.verbose("create answer success.")
// Pass the answer sdp to setLocalDescription
self.remotePeerConnection?.setLocalDescription(sessionDescription!, completionHandler: { _ in
ShowLogUtil.verbose("set local sdp success.")
// Send the answer sdp
self.sendAnswer(answer: sessionDescription!)
})
})
})
}
private func sendAnswer(answer: RTCSessionDescription) {
receivedAnswer(answer: answer)
}
private func receivedAnswer(answer: RTCSessionDescription) {
// Received the answer sdp; pass it to setRemoteDescription
localPeerConnection?.setRemoteDescription(answer, completionHandler: { _ in ShowLogUtil.verbose("set remote sdp success.")
})
}
private func sendIceCandidate(peerConnection: RTCPeerConnection, iceCandidate: RTCIceCandidate) {
receivedCandidate(peerConnection: peerConnection,iceCandidate: iceCandidate)
}
private func receivedCandidate(peerConnection: RTCPeerConnection, iceCandidate: RTCIceCandidate) {
if (peerConnection == localPeerConnection) {
remotePeerConnection?.add(iceCandidate)
} else {
localPeerConnection?.add(iceCandidate)
}
}
@objc private func hangUp() {
// Close the PeerConnection
localPeerConnection?.close()
localPeerConnection = nil
remotePeerConnection?.close()
remotePeerConnection = nil
// Release the remote video rendering view
if let track = remoteStream?.videoTracks.first {
track.remove(remoteView!)
}
}
}
// MARK: - RTCVideoViewDelegate
extension LocalDemoViewController: RTCVideoViewDelegate {
func videoView(_ videoView: RTCVideoRenderer, didChangeVideoSize size: CGSize) {
}
}
// MARK: - RTCPeerConnectionDelegate
extension LocalDemoViewController: RTCPeerConnectionDelegate {
func peerConnection(_ peerConnection: RTCPeerConnection, didChange stateChanged: RTCSignalingState) {
}
func peerConnection(_ peerConnection: RTCPeerConnection, didAdd stream: RTCMediaStream) {
ShowLogUtil.verbose("peerConnection didAdd stream--->\(stream)")
if (peerConnection == self.localPeerConnection) {
} else if (peerConnection == self.remotePeerConnection) {
self.remoteStream = stream
if let track = stream.videoTracks.first {
track.add(remoteView!)
}
if let audioTrack = stream.audioTracks.first{
audioTrack.source.volume = 8
}
}
}
func peerConnection(_ peerConnection: RTCPeerConnection, didRemove stream: RTCMediaStream) {
}
func peerConnectionShouldNegotiate(_ peerConnection: RTCPeerConnection) {
}
func peerConnection(_ peerConnection: RTCPeerConnection, didChange newState: RTCIceConnectionState) {
}
func peerConnection(_ peerConnection: RTCPeerConnection, didChange newState: RTCIceGatheringState) {
}
func peerConnection(_ peerConnection: RTCPeerConnection, didGenerate candidate: RTCIceCandidate) {
ShowLogUtil.verbose("didGenerate candidate--->\(candidate)")
self.sendIceCandidate(peerConnection: peerConnection, iceCandidate: candidate)
}
func peerConnection(_ peerConnection: RTCPeerConnection, didRemove candidates: [RTCIceCandidate]) {
}
func peerConnection(_ peerConnection: RTCPeerConnection, didOpen dataChannel: RTCDataChannel) {
}
}
After running it, tap Call and the remote view in the top-right corner receives the local video:
Please forgive the quality of the GIF; compression distorted it badly. For high-definition versions, see the images in the imgs directory at the demo address at the end.
You may notice that iOS seems to freeze for a moment in the middle. It isn't actually frozen: iOS screen recording can't capture taps, and just as on H5 and Android I tapped Hang Up partway through. After hanging up, the released remote rendering view keeps showing its last frame, so if you want a cleaner effect in practice you can place a black overlay on top of the rendering view and show it on hang-up.
V. Summary
The pitfalls I hit are already noted in the code comments; keep an eye out for them.
Android pitfalls:
- A SurfaceViewRenderer must be initialized with its init() method, otherwise the screen stays black.
iOS pitfalls:
- For the local rendering view, just use RTCEAGLVideoView rather than RTCCameraPreviewView; staying consistent with the remote rendering view keeps the code tidier and makes switching to other video sources easier.
- The local video capturer (RTCVideoCapturer), the video track (RTCVideoTrack), and the remote media stream (RTCMediaStream) must be stored in properties. I don't know why, but if they are declared as local variables the picture fails to render.
- At the demo stage, don't set fancy parameters (DtlsSrtpKeyAgreement, kRTCMediaConstraintsOfferToReceiveAudio, kRTCMediaConstraintsOfferToReceiveVideo, etc.); just use the defaults. The more parameters you set, the harder problems are to diagnose.
From these three local demos you can see real differences: H5 implements WebRTC with very little code, while Android requires more initialization, though perhaps because it is my day job I hit no real pitfalls on Android.
In reality, a call between two peers needs only one PeerConnection on each side. This local demo creates two PeerConnections so we can see results without standing up a signaling server; the remote PeerConnection simulates the far end. Next time we will build a simple signaling server over WebSocket to implement a true peer-to-peer call.