WebRTC Notes (3): Audio/Video Synchronization
2020-04-06 21:17 jiayayao

1. RTP timestamp and SeqNo
The RTP timestamp records the sampling instant of the media data and thus describes the inter-frame relationship of the payload;
The RTP SeqNo defines the ordering of RTP packets and thus describes the intra-frame relationship of the media data (one frame may be split across several packets);
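To illustrate the two fields, here is a hypothetical packetizer (not a WebRTC API; `RtpHeader` and `PacketizeFrame` are invented for this sketch): every packet of one frame carries the same timestamp, while the sequence number increments per packet.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Minimal stand-in for the two RTP header fields discussed above.
struct RtpHeader {
  uint16_t seq_no;     // per-packet ordering (intra-frame relationship)
  uint32_t timestamp;  // sampling instant (inter-frame relationship)
};

// Split one frame into num_packets RTP packets: the timestamp is shared,
// the sequence number increases by one per packet.
std::vector<RtpHeader> PacketizeFrame(uint16_t first_seq,
                                      uint32_t frame_timestamp,
                                      int num_packets) {
  std::vector<RtpHeader> packets;
  for (int i = 0; i < num_packets; ++i) {
    packets.push_back({static_cast<uint16_t>(first_seq + i), frame_timestamp});
  }
  return packets;
}
```

The receiver uses SeqNo to detect loss and reorder packets within a frame, and the timestamp to place the reassembled frame on the media timeline.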
2. The RTP timestamp and the NTP timestamp are two representations of the same instant;
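The mapping between the two comes from RTCP Sender Reports, which carry an (NTP time, RTP timestamp) pair for the same instant. A simplified sketch of the conversion (the real WebRTC `RtpToNtpEstimator` additionally estimates clock drift from multiple reports; this function assumes a single report and a fixed clock rate):

```cpp
#include <cassert>
#include <cstdint>

// Map an RTP timestamp to NTP wall-clock milliseconds, given the
// (sr_ntp_ms, sr_rtp_ts) pair from the last RTCP Sender Report.
// clock_rate_hz is the RTP clock: 90000 for video, e.g. 48000 for audio.
int64_t RtpToNtpMs(uint32_t rtp_ts,
                   uint32_t sr_rtp_ts,
                   int64_t sr_ntp_ms,
                   int clock_rate_hz) {
  // Unsigned subtraction handles 32-bit RTP timestamp wrap-around.
  uint32_t diff = rtp_ts - sr_rtp_ts;
  return sr_ntp_ms + static_cast<int64_t>(diff) * 1000 / clock_rate_hz;
}
```

For a 90 kHz video clock, an RTP timestamp 9000 ticks after the Sender Report maps to 100 ms after the report's NTP time. This shared NTP timeline is what lets the synchronizer compare audio and video capture times.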
3. The basic objects of audio/video synchronization are AudioReceiveStream and VideoReceiveStream, both of which inherit from Syncable;
4. The thread responsible for audio/video synchronization is ModuleProcessThread, and the main implementation file is rtp_streams_synchronizer.cc. The RtpStreamsSynchronizer class contains the following members:
- a StreamSynchronization instance;
- audio and video Measurements;
- pointers to the AudioReceiveStream and VideoReceiveStream: syncable_audio_ and syncable_video_;
class RtpStreamsSynchronizer : public Module {
 public:
  explicit RtpStreamsSynchronizer(Syncable* syncable_video);
  void ConfigureSync(Syncable* syncable_audio);
  ......
 private:
  Syncable* syncable_video_;
  Syncable* syncable_audio_ GUARDED_BY(crit_);
  StreamSynchronization::Measurements audio_measurement_ GUARDED_BY(crit_);
  StreamSynchronization::Measurements video_measurement_ GUARDED_BY(crit_);
  ......
};
5. The synchronization procedure:
void RtpStreamsSynchronizer::Process() {
  RTC_DCHECK_RUN_ON(&process_thread_checker_);
  last_sync_time_ = rtc::TimeNanos();

  rtc::CritScope lock(&crit_);
  if (!syncable_audio_) {
    return;
  }
  RTC_DCHECK(sync_.get());

  rtc::Optional<Syncable::Info> audio_info = syncable_audio_->GetInfo();
  if (!audio_info || !UpdateMeasurements(&audio_measurement_, *audio_info)) {
    return;
  }

  int64_t last_video_receive_ms = video_measurement_.latest_receive_time_ms;
  rtc::Optional<Syncable::Info> video_info = syncable_video_->GetInfo();
  if (!video_info || !UpdateMeasurements(&video_measurement_, *video_info)) {
    return;
  }

  if (last_video_receive_ms == video_measurement_.latest_receive_time_ms) {
    // No new video packet has been received since last update.
    return;
  }

  int relative_delay_ms;
  // Calculate how much later or earlier the audio stream is compared to video.
  if (!sync_->ComputeRelativeDelay(audio_measurement_, video_measurement_,
                                   &relative_delay_ms)) {
    return;
  }

  TRACE_COUNTER1("webrtc", "SyncCurrentVideoDelay", video_info->current_delay_ms);
  TRACE_COUNTER1("webrtc", "SyncCurrentAudioDelay", audio_info->current_delay_ms);
  TRACE_COUNTER1("webrtc", "SyncRelativeDelay", relative_delay_ms);

  int target_audio_delay_ms = 0;
  int target_video_delay_ms = video_info->current_delay_ms;
  // Calculate the necessary extra audio delay and desired total video
  // delay to get the streams in sync.
  if (!sync_->ComputeDelays(relative_delay_ms, audio_info->current_delay_ms,
                            &target_audio_delay_ms, &target_video_delay_ms)) {
    return;
  }

  syncable_audio_->SetMinimumPlayoutDelay(target_audio_delay_ms);
  syncable_video_->SetMinimumPlayoutDelay(target_video_delay_ms);
}
SetMinimumPlayoutDelay tells the audio or video stream the minimum number of milliseconds by which every subsequent frame must be delayed before playout, until the value is updated again. By adjusting target_audio_delay_ms and target_video_delay_ms, the playout timing of the audio and video streams is brought into alignment.
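The arithmetic behind ComputeRelativeDelay and ComputeDelays can be sketched as follows. This is a simplified one-shot version for illustration: the real StreamSynchronization smooths adjustments across multiple Process() calls and caps them, and the function names here mirror but do not reproduce the actual class.

```cpp
#include <cassert>
#include <cstdint>

// How much later the video stream arrives end-to-end than the audio stream,
// comparing receive times against capture times on the shared NTP timeline.
// Positive: video lags audio by this many ms.
int ComputeRelativeDelayMs(int64_t audio_capture_ntp_ms,
                           int64_t audio_receive_ms,
                           int64_t video_capture_ntp_ms,
                           int64_t video_receive_ms) {
  return static_cast<int>((video_receive_ms - audio_receive_ms) -
                          (video_capture_ntp_ms - audio_capture_ntp_ms));
}

// Pick new minimum playout delays so that the two playout times coincide:
// whichever stream is ahead gets extra delay equal to the mismatch.
void ComputeTargetDelays(int relative_delay_ms,
                         int current_audio_delay_ms,
                         int current_video_delay_ms,
                         int* target_audio_delay_ms,
                         int* target_video_delay_ms) {
  // Total playout-time mismatch between video and audio.
  int current_diff_ms =
      current_video_delay_ms + relative_delay_ms - current_audio_delay_ms;
  if (current_diff_ms > 0) {
    // Video plays later: hold audio back to match it.
    *target_audio_delay_ms = current_audio_delay_ms + current_diff_ms;
    *target_video_delay_ms = current_video_delay_ms;
  } else {
    // Audio plays later: hold video back to match it.
    *target_audio_delay_ms = current_audio_delay_ms;
    *target_video_delay_ms = current_video_delay_ms - current_diff_ms;
  }
}
```

For example, if both streams were captured at the same NTP instant but video arrives 30 ms later, and the current audio/video delays are 40 ms and 20 ms, the mismatch is 20 + 30 - 40 = 10 ms, so audio's minimum playout delay is raised to 50 ms while video stays at 20 ms.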
References:
https://www.jianshu.com/p/3a4d24a71091?hmsr=toutiao.io&utm_medium=toutiao.io&utm_source=toutiao.io
This post is from cnblogs. Author: jiayayao, email: jiayayao@126.com. Please credit the original link when reposting: https://www.cnblogs.com/jiayayao/p/12649665.html