
Commit

review comments
Cilla-luodan committed Oct 30, 2024
1 parent 2855ed6 commit 40423bd
Showing 11 changed files with 29 additions and 34 deletions.
Original file line number Diff line number Diff line change
@@ -38,7 +38,7 @@
<note id="note_v4j_tbx_jqb" type="attention">
<ul>
<li props="cpp unreal bp unity flutter">This method applies to macOS and Windows only.</li>
<li props="cpp unreal bp mac unity electron flutter">The default sound card on macOS does not support capture. As of v4.5.0, the SDK supports automatic installation of a virtual sound card: the first time you call this method, the SDK automatically installs its built-in, Agora-developed virtual sound card, AgoraALD. Once installation succeeds, the audio route switches to the virtual sound card automatically, and capture through the virtual sound card is also supported.</li>
<li props="cpp unreal bp mac unity electron flutter">The default sound card on macOS does not support capture. As of v4.5.0, the SDK supports automatic installation of a virtual sound card: the first time you call this method, the SDK automatically installs its built-in, Agora-developed virtual sound card, AgoraALD. Once installation succeeds, the audio route switches to the virtual sound card automatically, and the virtual sound card is used for capture.</li>
<li>This method can be called either before or after joining a channel.</li>
<li>If you call <xref keyref="disableAudio"/> to disable the audio module, sound card capture is disabled as well. To enable sound card capture again, call <xref keyref="enableAudio"/> to enable the audio module and then call <apiname keyref="enableLoopbackRecording"/> again.</li>
</ul> </note> </section>
@@ -55,9 +55,7 @@
<parml>
<plentry>
<pt>mediaProjection</pt>
<pd>
<p>The <xref keyref="mediaprojection-link"/> object used to capture the screen video stream.</p>
</pd>
<pd>A <xref keyref="mediaprojection-link"/> object used to capture the screen video stream.</pd>
</plentry>
</parml> </section>
<section id="return_values">
@@ -54,7 +54,7 @@ targetFps:(int)targetFps;</codeblock>
</plentry>
<plentry id="targetfps">
<pt>targetFps</pt>
<pd>The maximum rendering frame rate. The supported values are 1, 7, 10, 15, 24, 30, and 60.
<pd>The maximum rendering frame rate (fps). The supported values are 1, 7, 10, 15, 24, 30, and 60.
<note type="caution">Set this parameter to a rendering frame rate lower than the actual frame rate of the video; otherwise, the setting does not take effect.</note></pd>
</plentry>
</parml> </section>
9 changes: 4 additions & 5 deletions dita/RTC-NG/API/api_irtcengine_startlocalaudiomixer.dita
@@ -30,8 +30,8 @@
<dd>v4.5.0</dd>
</dlentry>
</dl>
<p>This method merges multiple audio streams into one audio stream locally. For example, it can merge the audio captured by the local microphone, the audio from the media player, the audio captured by the sound card, remote audio streams, and remote audio streams obtained over the network into one audio stream, and then publish the merged stream to the channel.<ul>
<li>To mix locally captured audio streams, set publishMixedAudioTrack in <xref keyref="ChannelMediaOptions"/> to true; the mixed audio stream can then be published to the channel.</li>
<p>This method merges multiple audio streams into one audio stream locally. For example, it can merge the audio captured by the local microphone, the audio from the media player, the audio captured by the sound card, and remote audio streams into one audio stream, and then publish the merged stream to the channel.<ul>
<li>To mix locally captured audio streams, set publishMixedAudioTrack in <xref keyref="ChannelMediaOptions"/> to <codeph><ph keyref="true"/></codeph>; the mixed audio stream can then be published to the channel.</li>
<li>To mix remote audio streams, make sure the remote audio streams have been published in the channel and that you have subscribed to them.</li>
</ul></p>
</section>
@@ -40,9 +40,8 @@
<p>You can enable this feature in the following scenarios:
<ul>
<li>Use it together with the local video mixing feature to capture and publish the audio streams associated with the mixed video stream in sync.</li>
<li>In voice chat rooms and game live-streaming scenarios, add an audio source from the media player as background music.</li>
<li>In online education scenarios, add an audio source from the media player as audio courseware.</li>
<li>Dub videos online in real time.</li>
<li>In live-streaming scenarios, a user receives the audio streams in a channel, mixes multiple audio streams locally, and then forwards the mixed stream to other channels.</li>
<li>In education scenarios, a teacher can locally mix the audio of mic-linked interactions with students and then forward the merged audio stream to other channels.</li>
</ul></p>
</section>
<section id="timing" deliveryTarget="details">
2 changes: 1 addition & 1 deletion dita/RTC-NG/API/api_irtcengine_stoplocalaudiomixer.dita
@@ -34,7 +34,7 @@
</section>
<section id="timing" deliveryTarget="details">
<title>When to call</title>
<p>Call this method after <apiname keyref="startLocalAudioMixer"/>.</p>
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
12 changes: 5 additions & 7 deletions dita/RTC-NG/API/class_localaudiomixerconfiguration.dita
@@ -7,10 +7,10 @@
<section id="prototype">
<p outputclass="codeblock">
<codeblock props="android" outputclass="language-java">public class LocalAudioMixerConfiguration {
public ArrayList&lt;MixedAudioStream> mixedAudioStreams;
public ArrayList&lt;MixedAudioStream> audioInputStreams;
public boolean syncWithLocalMic;
public LocalAudioMixerConfiguration() {
mixedAudioStreams = new ArrayList&lt;MixedAudioStream>();
audioInputStreams = new ArrayList&lt;MixedAudioStream>();
syncWithLocalMic = true;
}
public static class MixedAudioStream {
@@ -33,7 +33,7 @@
@end</codeblock>
<codeblock props="cpp unreal" outputclass="language-cpp">struct LocalAudioMixerConfiguration {
unsigned int streamCount;
MixedAudioStream* sourceStreams;
MixedAudioStream* audioInputStreams;
bool syncWithLocalMic;
LocalAudioMixerConfiguration() : streamCount(0), syncWithLocalMic(true) {}
};</codeblock>
@@ -60,12 +60,10 @@
<pd>The number of audio streams to be mixed locally.</pd>
</plentry>
<plentry>
<pt props="android">mixedAudioStreams</pt>
<pt props="apple">audioInputStreams</pt>
<pt props="cpp">sourceStreams</pt>
<pt>audioInputStreams</pt>
<pd>The audio sources to be mixed locally. See <xref keyref="MixedAudioStream"/>.</pd>
</plentry>
<plentry props="cpp">
<plentry>
<pt>syncWithLocalMic</pt>
<pd>Whether the mixed audio stream uses the timestamp of the audio frames captured by the local microphone:
<ul>
8 changes: 4 additions & 4 deletions dita/RTC-NG/API/class_mixedaudiostream.dita
@@ -28,24 +28,24 @@
<codeblock props="cpp unreal" outputclass="language-cpp">struct MixedAudioStream {
AUDIO_SOURCE_TYPE sourceType;
uid_t remoteUserUid;
const char* channelName;
const char* channelId;
track_id_t trackId;
MixedAudioStream(AUDIO_SOURCE_TYPE source)
: sourceType(source),
remoteUserUid(0),
channelName(NULL),
channelId(NULL),
trackId(-1) {}
MixedAudioStream(AUDIO_SOURCE_TYPE source, track_id_t track)
: sourceType(source),
trackId(track) {}
MixedAudioStream(AUDIO_SOURCE_TYPE source, uid_t uid, const char* channel)
: sourceType(source),
remoteUserUid(uid),
channelName(channel) {}
channelId(channel) {}
MixedAudioStream(AUDIO_SOURCE_TYPE source, uid_t uid, const char* channel, track_id_t track)
: sourceType(source),
remoteUserUid(uid),
channelName(channel),
channelId(channel),
trackId(track) {}
};</codeblock>
<codeblock props="bp" outputclass="language-cpp"/>
2 changes: 1 addition & 1 deletion dita/RTC-NG/API/enum_compressionpreference.dita
@@ -17,7 +17,7 @@
<parml>
<plentry>
<pt><ph keyref="PREFER_COMPRESSION_AUTO"/></pt>
<pd>-1: (Default) Automatic mode. The SDK automatically selects <ph keyref="PREFER_LOW_LATENCY"/> or <ph keyref="PREFER_QUALITY"/> based on the video scenario you set, and automatically switches between <ph keyref="PREFER_LOW_LATENCY"/> and <ph keyref="PREFER_QUALITY"/> as the network state changes, to deliver the best user experience.</pd>
<pd>-1: (Default) Automatic mode. The SDK automatically selects <ph keyref="PREFER_LOW_LATENCY"/> or <ph keyref="PREFER_QUALITY"/> based on the video scenario you set, to deliver the best user experience.</pd>
</plentry>
<plentry>
<pt><ph keyref="PREFER_LOW_LATENCY"/></pt>
8 changes: 4 additions & 4 deletions dita/RTC-NG/API/enum_localvideostreamreason.dita
@@ -53,22 +53,22 @@
</plentry>
<plentry props="cpp unreal bp mac unity electron flutter cs">
<pt><ph keyref="LOCAL_VIDEO_STREAM_REASON_DEVICE_DISCONNECTED"/></pt>
<pd props="mac cpp unreal bp electron unity flutter"><ph>9:</ph><ph props="cpp unreal bp unity electron flutter">(macOS and Windows only)</ph><ph>The video capture device currently in use is disconnected (for example, unplugged).</ph></pd>
<pd props="mac cpp unreal bp electron unity flutter"><ph>9:</ph><ph props="cpp unreal bp unity flutter">(macOS and Windows only)</ph><ph>The video capture device currently in use is disconnected (for example, unplugged).</ph></pd>
<pd props="cs">9: Reserved.</pd>
</plentry>
<plentry props="mac cpp unreal bp unity electron flutter cs">
<pt><ph keyref="LOCAL_VIDEO_STREAM_REASON_DEVICE_INVALID_ID"/></pt>
<pd><ph>10:</ph><ph props="cpp unreal bp flutter unity electron">(macOS and Windows only)</ph><ph>The SDK cannot find the video device in the video device list. Check whether the video device ID is valid.</ph></pd>
<pd><ph>10:</ph><ph props="cpp unreal bp flutter unity">(macOS and Windows only)</ph><ph>The SDK cannot find the video device in the video device list. Check whether the video device ID is valid.</ph></pd>
</plentry>
<plentry props="cpp unreal bp mac flutter unity electron cs">
<pt><ph keyref="LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_MINIMIZED"/></pt>
<pd props="cpp unreal bp mac flutter unity electron"><ph>11:</ph><ph props="cpp unreal bp flutter unity electron">(macOS and Windows only)</ph><ph>When calling <xref keyref="startScreenCaptureByWindowId"/> to share a window, the window being shared is minimized. The SDK cannot share a minimized window. Prompt the user to restore the shared window.</ph></pd>
<pd props="cpp unreal bp mac flutter unity electron"><ph>11:</ph><ph props="cpp unreal bp flutter unity">(macOS and Windows only)</ph><ph>When calling <xref keyref="startScreenCaptureByWindowId"/> to share a window, the window being shared is minimized. The SDK cannot share a minimized window. Prompt the user to restore the shared window.</ph></pd>
<pd props="cs">11: Reserved.</pd>
</plentry>
<plentry props="cpp unreal bp mac flutter unity electron cs">
<pt><ph keyref="LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_CLOSED"/></pt>
<pd>
<p><ph>12:</ph><ph props="cpp unreal bp flutter unity electron">(macOS and Windows only)</ph><ph>The window shared by window ID has been closed, or the full-screen window shared by window ID has exited full-screen mode. After the window exits full-screen mode, remote users can no longer see it. To prevent remote users from seeing a black screen, it is recommended that you stop the current sharing session immediately.</ph></p>
<p><ph>12:</ph><ph props="cpp unreal bp flutter unity">(macOS and Windows only)</ph><ph>The window shared by window ID has been closed, or the full-screen window shared by window ID has exited full-screen mode. After the window exits full-screen mode, remote users can no longer see it. To prevent remote users from seeing a black screen, it is recommended that you stop the current sharing session immediately.</ph></p>
<p>Common scenarios in which this error code is reported:
<ul>
<li>The local user closes the shared window.</li>
7 changes: 0 additions & 7 deletions dita/RTC-NG/config/keys-rtc-ng-links-cpp.ditamap
@@ -105,12 +105,5 @@
</keywords>
</topicmeta>
</keydef>
<keydef keys="mediaprojection-link" href="https://developer.android.com/reference/android/media/projection/MediaProjection" scope="external" format="html">
<topicmeta>
<keywords>
<keyword>MediaProjection</keyword>
</keywords>
</topicmeta>
</keydef>
</topichead>
</map>
7 changes: 7 additions & 0 deletions dita/RTC-NG/config/keys-rtc-ng-links.ditamap
@@ -134,5 +134,12 @@
</keywords>
</topicmeta>
</keydef>
<keydef keys="mediaprojection-link" href="https://developer.android.com/reference/android/media/projection/MediaProjection" scope="external" format="html">
<topicmeta>
<keywords>
<keyword>MediaProjection</keyword>
</keywords>
</topicmeta>
</keydef>
</topichead>
</map>
