Android WebRTC 1-on-1 Call Example

1. Overview

  WebRTC is an open-source solution for real-time audio/video communication. With it you can easily build 1-to-1, 1-to-many, and many-to-many calling applications, such as audio/video chat, video conferencing, and live streaming. Because it is fully open source and royalty-free, you can ship applications without licensing worries, and if you want to go deeper you can read the source code directly. This post covers the 1-to-1 video call scenario and ends with the core sample code.

2. Call Flow

 1. This post assumes that all readers (myself included) are already familiar with SDP, ICE, STUN, and TURN.

 2. At a high level, the flow consists of three phases:

  1. Media negotiation

  2. Network negotiation

  3. Starting the call

  

  In the steps below, A and B denote client A and client B in a two-party call.

  1. A creates an Offer, sets it as its local media description with setLocalDescription, and sends it to B via the signaling server.

  2. B receives A's media description and applies it with setRemoteDescription.

  3. B creates an Answer, sets it as its own local description with setLocalDescription, and sends B's media description to A via the signaling server.

  4. A receives B's media description and applies it with setRemoteDescription. At this point A and B have exchanged each other's media information.

  5. Once media negotiation completes, both A and B receive the onIceCandidate callback of PeerConnection.Observer. Each side sends its own candidates to the peer via the signaling server, and each peer adds the candidates it receives.

  6. A and B open their own media streams (audio/video), receive the remote media stream through the onAddStream callback of PeerConnection.Observer, and render it.
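The six steps above can be sketched as a plain-Java simulation of the handshake. Note this is only an illustrative model with no org.webrtc dependency: the Peer class and the string "descriptions" are hypothetical stand-ins for real PeerConnection objects, SDP blobs, and a signaling server.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal model of the offer/answer/candidate handshake (steps 1-6 above).
// "Peer" stands in for a real PeerConnection; strings stand in for SDP blobs.
class Peer {
    final String name;
    String localDescription;   // set via "setLocalDescription"
    String remoteDescription;  // set via "setRemoteDescription"
    final List<String> remoteCandidates = new ArrayList<>();

    Peer(String name) { this.name = name; }

    // Step 1 / step 3: create an offer or answer and set it as the local description.
    String createAndSetLocal(String type) {
        localDescription = type + " from " + name;
        return localDescription;
    }

    // Media negotiation is complete once both descriptions are set.
    boolean negotiated() {
        return localDescription != null && remoteDescription != null;
    }
}

public class HandshakeDemo {
    public static void main(String[] args) {
        Peer a = new Peer("A"), b = new Peer("B");

        // 1. A creates an Offer and sets it locally; signaling forwards it to B.
        String offer = a.createAndSetLocal("offer");
        // 2. B applies A's offer as its remote description.
        b.remoteDescription = offer;
        // 3. B creates an Answer, sets it locally, and signals it back to A.
        String answer = b.createAndSetLocal("answer");
        // 4. A applies B's answer as its remote description.
        a.remoteDescription = answer;
        // 5. Both sides exchange ICE candidates via the signaling server.
        a.remoteCandidates.add("candidate from B");
        b.remoteCandidates.add("candidate from A");

        // 6. Negotiation is done on both sides; media can start flowing.
        System.out.println(a.negotiated() && b.negotiated());  // true
    }
}
```

In the real app, steps 1-4 use SessionDescription objects and SdpObserver callbacks rather than plain strings, but the ordering is exactly the one modeled here.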

  

3. Core Source Code

  1. Initialization

 this.context = context;
        peerConnectionMap = new HashMap<>();
        iceServers = new ArrayList<>();
        eglBaseContext = EglBase.create().getEglBaseContext();
        surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", eglBaseContext);
        // Initialize the PeerConnectionFactory
        PeerConnectionFactory.initialize(
                PeerConnectionFactory.InitializationOptions.builder(context).createInitializationOptions());
        // Create a default PeerConnectionFactory.Options
        PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
        // Create the default video encoder factory
        DefaultVideoEncoderFactory defaultVideoEncoderFactory =
                new DefaultVideoEncoderFactory(eglBaseContext, true, true);
        // Create the default video decoder factory
        DefaultVideoDecoderFactory defaultVideoDecoderFactory =
                new DefaultVideoDecoderFactory(eglBaseContext);
        // Build the PeerConnectionFactory
        peerConnectionFactory = PeerConnectionFactory.builder()
                .setOptions(options)
                .setVideoEncoderFactory(defaultVideoEncoderFactory)
                .setVideoDecoderFactory(defaultVideoDecoderFactory)
                // Set the audio device module
                .setAudioDeviceModule(JavaAudioDeviceModule.builder(context).createAudioDeviceModule())
                .createPeerConnectionFactory();
        setIceServers(Config.IP.STUN);
        mediaStream = peerConnectionFactory.createLocalMediaStream("ARDAMS");
        // Initialize the signaling service
        SignalingClient.get().init(this);

  2. Start the local video preview, start audio, and add both local tracks to the media stream

rtcClient.startVideoPreview(this, true, surfaceViewRenderer);
        rtcClient.startLocalAudio();

 

/**
     * Start the local video preview.
     *
     * @param context         application context
     * @param isFront         true for the front camera, false for the back camera
     * @param localViewRender the view used to render the local video
     */
    public void startVideoPreview(Context context, boolean isFront, SurfaceViewRenderer localViewRender) {
        Log.e(TAG, "startVideoPreview----->start");
        // Create the video capturer for the preview
        if (videoCapturer == null) {
            videoCapturer = createVideoCapture(isFront, context);
        }
        if (videoSource == null) {
            Log.e(TAG, "startVideoPreview----->createVideoSource");
            videoSource = createVideoSource(videoCapturer);
        }
        Log.e(TAG, "startVideoPreview----->startCapture");
        videoCapturer.initialize(surfaceTextureHelper, context, videoSource.getCapturerObserver());
        // Capture at 480x640 (portrait), 30 fps
        videoCapturer.startCapture(480, 640, 30);
        if (videoTrack == null) {
            Log.e(TAG, "startVideoPreview----->createVideoTrack");
            videoTrack = createVideoTrack(videoSource);
        }
        addLocalVideoSink(localViewRender);
        // Add the video track to the media stream
        Log.e(TAG, "startVideoPreview----->createMediaStream");
        mediaStream = createMediaStream();
        mediaStream.addTrack(videoTrack);
        Log.e(TAG, "startVideoPreview----->end");
    }

  

/**
     * Add the audio track to the media stream.
     */
    public void startLocalAudio() {
        if (audioSource == null) {
            Log.e(TAG, "startAudio----->createAudioSource");
            audioSource = createAudioSource();
        }
        if (audioTrack == null) {
            Log.e(TAG, "startAudio----->createAudioTrack");
            audioTrack = createAudioTrack();
        }
        // Add the audio track to the media stream
        Log.e(TAG, "startAudio----->addTrack(audioTrack)");
        mediaStream = createMediaStream();
        mediaStream.addTrack(audioTrack);
    }

  

  3. Display the remote audio/video

 @Override
            public void onUserVideoAvailable(String userId, boolean available, VideoTrack videoTrack) {
                if (available && videoTrack != null) {
                    remoteVideoTrack = videoTrack;
                    // Render the remote track into the remote SurfaceViewRenderer
                    remoteVideoTrack.addSink(remoteSurfaceViewRender);
                }
            }

  

@Override
            public void onAddStream(MediaStream mediaStream) {
                super.onAddStream(mediaStream);
                Log.e(TAG, "PeerConnectionAdapter----->onAddStream");
                if (listener != null) {
                    // Remote video track
                    List<VideoTrack> remoteVideoTracks = mediaStream.videoTracks;
                    if (remoteVideoTracks != null && remoteVideoTracks.size() > 0) {
                        listener.onUserVideoAvailable(key, true, remoteVideoTracks.get(0));
                    }
                    // Remote audio track
                    List<AudioTrack> remoteAudioTracks = mediaStream.audioTracks;
                    if (remoteAudioTracks != null && remoteAudioTracks.size() > 0) {
                        listener.onUserAudioAvailable(key, true, remoteAudioTracks.get(0));
                    }
                }
            }
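The onAddStream snippet above relies on an adapter-plus-listener pattern: a PeerConnection.Observer adapter, keyed by the remote user's id, unpacks the incoming stream and forwards its first track to a higher-level listener that the UI implements. A minimal plain-Java sketch of that dispatch follows; the Stream, Listener, and Adapter types here are simplified stand-ins, not the real org.webrtc classes.

```java
import java.util.ArrayList;
import java.util.List;

public class AdapterDemo {
    // Simplified stand-in for org.webrtc.MediaStream: just lists of track ids.
    static class Stream {
        final List<String> videoTracks = new ArrayList<>();
        final List<String> audioTracks = new ArrayList<>();
    }

    // Higher-level callback the UI layer implements (mirrors onUserVideoAvailable).
    interface Listener {
        void onUserVideoAvailable(String userId, boolean available, String track);
    }

    // Mirrors the PeerConnectionAdapter: keyed by the remote user id,
    // it forwards only the first video track of an incoming stream.
    static class Adapter {
        final String key;
        final Listener listener;

        Adapter(String key, Listener listener) {
            this.key = key;
            this.listener = listener;
        }

        void onAddStream(Stream stream) {
            if (listener != null && !stream.videoTracks.isEmpty()) {
                listener.onUserVideoAvailable(key, true, stream.videoTracks.get(0));
            }
        }
    }

    public static void main(String[] args) {
        List<String> received = new ArrayList<>();
        // The UI "listener" just records what it was handed.
        Adapter adapter = new Adapter("userB",
                (userId, available, track) -> received.add(userId + ":" + track));
        Stream stream = new Stream();
        stream.videoTracks.add("video0");
        adapter.onAddStream(stream);
        System.out.println(received);  // [userB:video0]
    }
}
```

Keeping one adapter per remote user (the peerConnectionMap in the init code) is what lets the same listener interface scale beyond 1-to-1 later.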

  

  P.S. This is not a trivial demo; the snippets come from a larger, encapsulated project, so pasting everything here would be far too long. I will upload the corresponding source code to GitHub later; feel free to download it if you need it.

posted on 2021-08-12 18:16 by 飘杨......