Android WebRTC Complete Beginner Tutorial 05: OpenGL Filters

In theory, there are two ways to add a filter to WebRTC video:

  • One way is to apply the filter after the local video has been rendered to the screen, and send the filter type to the remote peer, which applies the same filter after rendering on its side. The filter is signaled separately and is not baked into the video stream (see the DataChannel sketch after this list). The advantage is that the processing is relatively simple; the drawback is that both sides must support the same set of filters.
  • The other way is to apply the filter before the local video is displayed, then render it locally and transmit it to the remote peer, so the filter is baked directly into the video stream. The advantage is that the receiver needs no special handling of the stream; the drawback is that the processing is more involved.
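
For the first approach, the filter type has to be signaled to the remote peer out of band. A minimal sketch of doing that over a WebRTC DataChannel (the channel name, the one-byte message format, and the sendFilterType helper are assumptions for illustration, not from this tutorial):

    // Create a data channel on an existing PeerConnection for filter signaling
    DataChannel filterChannel = peerConnection.createDataChannel("filter", new DataChannel.Init());

    // Send the selected filter id to the remote peer as a one-byte binary message
    void sendFilterType(int filterId) {
        ByteBuffer data = ByteBuffer.wrap(new byte[] { (byte) filterId });
        filterChannel.send(new DataChannel.Buffer(data, /* binary= */ true));
    }

The remote side registers a DataChannel.Observer, reads the id in onMessage, and applies the matching filter to its own rendered view.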

1. WebRTC Flow

A quick survey shows that the second approach is what is generally used. WebRTC provides a hook for it: call videoSource.setVideoProcessor, and the processor sees every frame after capture, before it is displayed locally or transmitted. Do the processing in onFrameCaptured and pass the processed frame to videoSink.

        // Create a texture helper on a dedicated capture thread, sharing the EGL context
        SurfaceTextureHelper surfaceTextureHelper = SurfaceTextureHelper.create("CaptureThread", eglBaseContext);
        // create VideoCapturer (front camera)
        VideoCapturer videoCapturer = createCameraCapturer(true);
        VideoSource videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
        // Hook in our processor: it sees every frame between capture and display/encoding
        videoSource.setVideoProcessor(new EglProcessor());
        videoCapturer.initialize(surfaceTextureHelper, getApplicationContext(), videoSource.getCapturerObserver());
        videoCapturer.startCapture(480, 640, 30);
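
createCameraCapturer is not shown above; a typical implementation based on WebRTC's Camera2Enumerator looks like this (a sketch, assuming the method lives in an Activity so this is a valid Context):

    private VideoCapturer createCameraCapturer(boolean frontFacing) {
        CameraEnumerator enumerator = new Camera2Enumerator(this);
        for (String deviceName : enumerator.getDeviceNames()) {
            // Pick the first camera that matches the requested facing
            if (enumerator.isFrontFacing(deviceName) == frontFacing) {
                VideoCapturer capturer = enumerator.createCapturer(deviceName, null);
                if (capturer != null) {
                    return capturer;
                }
            }
        }
        return null;
    }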



public class EglProcessor implements VideoProcessor {

    private VideoSink videoSink;

    @Override
    public void setSink(VideoSink videoSink) {
        // WebRTC hands us the downstream sink (local renderer + encoder path)
        this.videoSink = videoSink;
    }

    @Override
    public void onCapturerStarted(boolean success) {
    }

    @Override
    public void onCapturerStopped() {
    }

    @Override
    public void onFrameCaptured(VideoFrame videoFrame) {
        VideoFrame newFrame = videoFrame; // process the frame here
        videoSink.onFrame(newFrame);
    }
}
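
To make "process the frame here" concrete before moving to OpenGL, here is a minimal CPU-side sketch of a grayscale filter (an illustration, not this tutorial's OpenGL path): copy the luma plane into a fresh JavaI420Buffer and force both chroma planes to the neutral value 128.

    @Override
    public void onFrameCaptured(VideoFrame videoFrame) {
        VideoFrame.I420Buffer src = videoFrame.getBuffer().toI420();
        int w = src.getWidth();
        int h = src.getHeight();
        JavaI420Buffer dst = JavaI420Buffer.allocate(w, h);

        // Copy the Y plane row by row (source and destination strides may differ)
        ByteBuffer sy = src.getDataY();
        ByteBuffer dy = dst.getDataY();
        byte[] row = new byte[w];
        for (int j = 0; j < h; j++) {
            sy.position(j * src.getStrideY());
            sy.get(row, 0, w);
            dy.position(j * dst.getStrideY());
            dy.put(row, 0, w);
        }

        // 128 means "no chroma", so the output is grayscale
        ByteBuffer du = dst.getDataU();
        ByteBuffer dv = dst.getDataV();
        for (int i = 0; i < du.capacity(); i++) du.put(i, (byte) 128);
        for (int i = 0; i < dv.capacity(); i++) dv.put(i, (byte) 128);

        src.release();
        VideoFrame out = new VideoFrame(dst, videoFrame.getRotation(), videoFrame.getTimestampNs());
        videoSink.onFrame(out);
        out.release(); // sinks retain the frame themselves if they need it longer
    }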

2. OpenGL Flow

Because the frame is processed before it is displayed, there is no GLSurfaceView to draw into, so we first create an EGL environment ourselves. We feed the VideoFrame (YUV format) into YUV shaders that apply the filter effect, read the result back as RGBA with GLES20.glReadPixels, convert it back to YUV, wrap it into a new VideoFrame, and pass that to videoSink.

The EGL drawing flow is covered in my earlier post: https://www.cnblogs.com/rome753/p/17141687.html
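
For reference, this is the kind of fragment shader a YuvShader might use: sample the three planes, convert YUV to RGB (BT.601), and apply an effect, here a blend toward grayscale. The shader source below is a sketch; the actual shader in the project may differ.

    private static final String FRAGMENT_SHADER =
            "precision mediump float;\n" +
            "varying vec2 vTexCoord;\n" +
            "uniform sampler2D uTexY;\n" +
            "uniform sampler2D uTexU;\n" +
            "uniform sampler2D uTexV;\n" +
            "void main() {\n" +
            "    float y = texture2D(uTexY, vTexCoord).r;\n" +
            "    float u = texture2D(uTexU, vTexCoord).r - 0.5;\n" +
            "    float v = texture2D(uTexV, vTexCoord).r - 0.5;\n" +
            "    vec3 rgb = vec3(y + 1.402 * v,\n" +
            "                    y - 0.344 * u - 0.714 * v,\n" +
            "                    y + 1.772 * u);\n" +
            "    // filter effect: blend toward grayscale\n" +
            "    float gray = dot(rgb, vec3(0.299, 0.587, 0.114));\n" +
            "    gl_FragColor = vec4(mix(rgb, vec3(gray), 0.8), 1.0);\n" +
            "}\n";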

        @Override
        public void onFrameCaptured(VideoFrame videoFrame) {
            int w = videoFrame.getBuffer().getWidth();
            int h = videoFrame.getBuffer().getHeight();

            // Convert the captured buffer (often a texture) to an I420 buffer we can read
            VideoFrame.I420Buffer ori = videoFrame.getBuffer().toI420();

            // initEglCore presumably creates the EGL core once; later calls should be no-ops
            eglThread.initEglCore(w, h);
            eglThread.post(() -> {
                yuvRender.onSurfaceCreated(null, null);
                yuvRender.onSurfaceChanged(null, w, h);
                GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
                // Draw the three YUV planes with the filter shader
                yuvRender.getYuvShader().draw(w, h, ori.getDataY(), ori.getDataU(), ori.getDataV());

                // Read the filtered image back as RGBA and convert it to packed I420
                IntBuffer buffer = IntBuffer.allocate(w * h);
                GLES20.glReadPixels(0, 0, w, h, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buffer);
                byte[] yuv = YUVTools.rgb2Yuv420(buffer.array(), w, h);

                // Reuse direct buffers for the three planes across frames
                if (dataY == null) {
                    dataY = ByteBuffer.allocateDirect(w * h);
                }
                dataY.put(yuv, 0, w * h);

                if (dataU == null) {
                    dataU = ByteBuffer.allocateDirect(w * h / 4);
                }
                dataU.put(yuv, w * h, w * h / 4);

                if (dataV == null) {
                    dataV = ByteBuffer.allocateDirect(w * h / 4);
                }
                dataV.put(yuv, w * h + w * h / 4, w * h / 4);

                dataY.position(0);
                dataU.position(0);
                dataV.position(0);
                ori.release();

                // The converted planes are tightly packed, so the strides are w and w/2
                JavaI420Buffer nb = JavaI420Buffer.wrap(w, h, dataY, w, dataU, w / 2, dataV, w / 2, null);
                VideoFrame newFrame = new VideoFrame(nb, videoFrame.getRotation(), videoFrame.getTimestampNs());
                videoSink.onFrame(newFrame);
            });
        }
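
YUVTools.rgb2Yuv420 is not shown in this post; a plausible implementation converts the RGBA ints read back by glReadPixels into a packed I420 array (Y plane, then U, then V, matching the offsets used above). Two caveats are baked into the sketch: on little-endian Android each int holds a pixel as 0xAABBGGRR, and glReadPixels returns the image bottom-up, so a vertical flip may be needed here or in the shader if the output looks inverted.

    public static byte[] rgb2Yuv420(int[] pixels, int w, int h) {
        byte[] yuv = new byte[w * h * 3 / 2];
        int uIndex = w * h;
        int vIndex = w * h + w * h / 4;
        for (int j = 0; j < h; j++) {
            for (int i = 0; i < w; i++) {
                int p = pixels[j * w + i]; // 0xAABBGGRR on little-endian devices
                int r = p & 0xFF;
                int g = (p >> 8) & 0xFF;
                int b = (p >> 16) & 0xFF;
                // integer BT.601 RGB -> YUV
                int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                yuv[j * w + i] = (byte) Math.max(0, Math.min(255, y));
                // 4:2:0 subsampling: one U/V sample per 2x2 block
                if ((j & 1) == 0 && (i & 1) == 0) {
                    int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                    int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                    yuv[uIndex++] = (byte) Math.max(0, Math.min(255, u));
                    yuv[vIndex++] = (byte) Math.max(0, Math.min(255, v));
                }
            }
        }
        return yuv;
    }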

The local preview shows the filter correctly, and the filtered video is transmitted to the remote side as expected.

3. Complete Code

See the step5opengl module of webrtc-android-tutorial.
