Commit

Merge branch 'release/rtc-ng/4.5.0' into 4.5.0-ld
Cilla-luodan committed Oct 30, 2024
2 parents 40423bd + 3202885 commit 1f09306
Showing 30 changed files with 1,115 additions and 338 deletions.
@@ -36,7 +36,7 @@
<note type="attention">
<ul>
<li props="cpp unreal bp flutter unity">This method applies to Android and iOS only.</li>
<li id="sequence">This method must be called after the SDK triggers the <xref keyref="onLocalVideoStateChanged"/> callback reporting that the local video state is <apiname keyref="LOCAL_VIDEO_STREAM_STATE_ENCODING"/> (2).</li>
<li id="sequence">This method must be called after the SDK triggers the <xref keyref="onLocalVideoStateChanged"/> callback reporting that the local video state is <apiname keyref="LOCAL_VIDEO_STREAM_STATE_CAPTURING"/> (1).</li>
<li>Before calling <xref keyref="setCameraExposureFactor"/> to adjust the exposure factor, Agora recommends calling this method first to check whether the current camera supports exposure adjustment.</li>
<li>This method queries whether the camera currently in use supports exposure adjustment, that is, the camera specified when calling <xref keyref="setCameraCapturerConfiguration"/>.</li>
</ul></note>
2 changes: 1 addition & 1 deletion dita/RTC-NG/API/api_irtcengine_iscamerazoomsupported.dita
@@ -30,7 +30,7 @@
</section>
<section id="timing" deliveryTarget="details">
<title>Call timing</title>
<p>This method must be called after the SDK triggers the <xref keyref="onLocalVideoStateChanged"/> callback reporting that the local video state is <apiname keyref="LOCAL_VIDEO_STREAM_STATE_ENCODING"/> (2).</p>
<p>This method must be called after the SDK triggers the <xref keyref="onLocalVideoStateChanged"/> callback reporting that the local video state is <apiname keyref="LOCAL_VIDEO_STREAM_STATE_CAPTURING"/> (1).</p>
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
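The timing requirement above (wait for the capturing state before querying camera capabilities) can be sketched as follows. This is a minimal illustrative sketch, not code from this commit: it assumes an already-initialized `RtcEngine` named `engine`, and the Android SDK's `IRtcEngineEventHandler`/`Constants` names; the callback signature may differ slightly between SDK versions.

```java
// Sketch: query camera zoom support only after the local video state
// reaches LOCAL_VIDEO_STREAM_STATE_CAPTURING (1), per the docs above.
IRtcEngineEventHandler handler = new IRtcEngineEventHandler() {
    @Override
    public void onLocalVideoStateChanged(
            Constants.VideoSourceType source, int state, int reason) {
        // State 1 = capturing; only then is the capability query valid.
        if (state == Constants.LOCAL_VIDEO_STREAM_STATE_CAPTURING) {
            boolean zoomSupported = engine.isCameraZoomSupported();
            // Proceed to adjust zoom only if zoomSupported is true.
        }
    }
};
```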
65 changes: 65 additions & 0 deletions dita/RTC-NG/API/api_irtcengine_setexternalremoteeglcontext.dita
@@ -0,0 +1,65 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE reference PUBLIC "-//OASIS//DTD DITA Reference//EN" "reference.dtd">
<reference id="api_irtcengine_setexternalremoteeglcontext">
<title><ph keyref="setExternalRemoteEglContext"/></title>
<shortdesc id="short"><ph id="shortdesc">Sets the EGL context for rendering the remote video stream.</ph></shortdesc>
<prolog>
<metadata>
<keywords>
<indexterm keyref="setExternalRemoteEglContext"/>
</keywords>
</metadata>
</prolog>
<refbody>
<section id="prototype">
<p outputclass="codeblock">
<codeblock props="android" outputclass="language-java">public abstract int setExternalRemoteEglContext(Object eglContext);</codeblock>
<codeblock props="hmos" outputclass="language-arkts"/>
<codeblock props="ios mac" outputclass="language-objectivec"/>
<codeblock props="cpp unreal" outputclass="language-cpp">virtual int setExternalRemoteEglContext(void* eglContext) = 0;</codeblock>
<codeblock props="bp" outputclass="language-cpp"/>
<codeblock props="electron" outputclass="language-typescript"/>
<codeblock props="unity cs" outputclass="language-csharp"/>
<codeblock props="rn" outputclass="language-typescript"/>
<codeblock props="flutter" outputclass="language-dart"/> </p>
</section>
<section id="detailed_desc" deliveryTarget="details" otherprops="no-title">
<dl outputclass="since">
<dlentry props="native">
<dt>Since</dt>
<dd>v4.5.0</dd>
</dlentry>
</dl>
<p>By calling this method, developers can replace the SDK's default remote EGL context, which makes it easier to implement unified EGL context management.</p>
<p>When the engine is destroyed, the SDK automatically releases the EGL context.</p>
<note type="attention" props="cpp unreal bp flutter unity rn">This method applies to Android only.</note>
</section>
<section id="scenario" deliveryTarget="details">
<title>Applicable scenarios</title>
<p>This method is applicable to scenarios where remote video is self-rendered using video data in Texture format.</p>
</section>
<section id="timing" deliveryTarget="details">
<title>Call timing</title>
<p>Call this method before joining a channel.</p>
</section>
<section id="restriction" deliveryTarget="details">
<title>Restrictions</title>
<p>None.</p>
</section>
<section id="parameters" deliveryTarget="details">
<title>Parameters</title>
<parml>
<plentry>
<pt>eglContext</pt>
<pd>The EGL context object used for rendering the remote video stream.</pd>
</plentry>
</parml> </section>
<section id="return_values">
<title><ph keyref="return-section-title"/></title>
<p props="flutter">When the method call succeeds, there is no return value; when it fails, an <xref keyref="AgoraRtcException"/> exception is thrown, which you need to catch and handle. <ph props="cn">See <xref keyref="error-code-link"/> for details and resolution suggestions.</ph></p>
<ul props="native unreal bp electron unity rn cs">
<li>0: The method call succeeds.</li>
<li>&lt; 0: The method call fails. <ph props="cn">See <xref keyref="error-code-link"/> for details and resolution suggestions.</ph></li>
</ul> </section>
</refbody>
</reference>
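The new `setExternalRemoteEglContext` API documented above can be used roughly as follows. This is a hedged sketch, not code from the commit: it assumes an initialized `RtcEngine` named `engine`, a GL pipeline whose current context is obtainable via `EGL14.eglGetCurrentContext()`, and placeholder `token`/`channelName`/`options` values supplied by the app.

```java
import android.opengl.EGL14;
import android.opengl.EGLContext;
import android.util.Log;

// Sketch: share the app's own EGL context with the SDK's remote-video
// rendering path, so both sides use one unified EGL context.
// Per the docs above, this must happen before joining a channel,
// and the SDK releases the context when the engine is destroyed.
EGLContext sharedContext = EGL14.eglGetCurrentContext();
int ret = engine.setExternalRemoteEglContext(sharedContext);
if (ret != 0) {
    Log.w("Agora", "setExternalRemoteEglContext failed: " + ret);
}
// token, channelName, and options are app-supplied placeholders.
engine.joinChannel(token, channelName, 0, options);
```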
3 changes: 2 additions & 1 deletion dita/RTC-NG/API/api_irtcengine_setvideoscenario.dita
@@ -77,7 +77,8 @@
</li>
</ul>
</p>
<p id="1v1"><apiname keyref="APPLICATION_SCENARIO_1V1"/> (2) is suitable for one-to-one video call scenarios. To meet this scenario's requirements for low latency and high image quality, the SDK optimizes its strategies, improving image quality, first-frame rendering, latency on mid-to-low-end devices, and smoothness under weak network conditions.</p>
<p id="1v1"><apiname keyref="APPLICATION_SCENARIO_1V1"/> (2) is suitable for <xref keyref="one-to-one-live"/> scenarios. To meet this scenario's requirements for low latency and high image quality, the SDK optimizes its strategies, improving image quality, first-frame rendering, latency on mid-to-low-end devices, and smoothness under weak network conditions.</p>
<p id="liveshow"><apiname keyref="APPLICATION_SCENARIO_LIVESHOW"/> (3) is suitable for <xref keyref="showroom"/> scenarios. To meet this scenario's high requirements for first-frame rendering time and image clarity, the SDK optimizes its strategies, focusing on improving the first-frame experience and image quality, while also enhancing image quality and smoothness under weak network conditions and on low-end devices.</p>
</pd>
</plentry>
</parml>
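Selecting one of the scenarios described above is a one-line call. This is an illustrative sketch only: it assumes an initialized `RtcEngine` named `engine`, and the Java enum name `Constants.VideoApplicationScenarioType` is an assumption based on the SDK's naming conventions; check the 4.5.0 API reference for the exact signature.

```java
// Sketch: tune the SDK for a one-to-one video call before starting video.
// APPLICATION_SCENARIO_1V1 (2) optimizes latency and image quality for 1v1;
// APPLICATION_SCENARIO_LIVESHOW (3) optimizes first-frame time for showrooms.
engine.setVideoScenario(
    Constants.VideoApplicationScenarioType.APPLICATION_SCENARIO_1V1);
```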
36 changes: 27 additions & 9 deletions dita/RTC-NG/API/class_audiotrackconfig.dita
@@ -7,27 +7,35 @@
<section id="prototype">
<p outputclass="codeblock">
<codeblock props="android" outputclass="language-java">public class AudioTrackConfig {

public boolean enableLocalPlayback;


public boolean enableAudioProcessing;
public AudioTrackConfig() {
this.enableLocalPlayback = true;
this.enableAudioProcessing = false;
}
@Override
public String toString() {
return &quot;AudioTrackConfig{&quot;
+ &quot;enableLocalPlayback=&quot; + enableLocalPlayback + &quot;enableAudioProcessing&quot;
+ enableAudioProcessing + &#x27;}&#x27;;
}
}</codeblock>
}</codeblock>
<codeblock props="hmos" outputclass="language-arkts">export class AudioTrackConfig {

public enableLocalPlayback: boolean = true;
}</codeblock>
<codeblock props="ios mac" outputclass="language-objectivec">NS_SWIFT_NAME(AgoraAudioTrackConfig) __attribute__((visibility("default"))) @interface AgoraAudioTrackConfig : NSObject
<codeblock props="ios mac" outputclass="language-objectivec">NS_SWIFT_NAME(AgoraAudioTrackConfig) __attribute__((visibility(&quot;default&quot;))) @interface AgoraAudioTrackConfig : NSObject
@property (assign, nonatomic) BOOL enableLocalPlayback NS_SWIFT_NAME(enableLocalPlayback);
@property (assign, nonatomic) BOOL enableAudioProcessing NS_SWIFT_NAME(enableAudioProcessing);
@end</codeblock>
<codeblock props="cpp unreal" outputclass="language-cpp">struct AudioTrackConfig {

bool enableLocalPlayback;


AudioTrackConfig()
: enableLocalPlayback(true) {}

bool enableAudioProcessing;
AudioTrackConfig() : enableLocalPlayback(true),enableAudioProcessing(false) {}
};</codeblock>
<codeblock props="bp" outputclass="language-cpp">USTRUCT(BlueprintType)
struct FAudioTrackConfig
@@ -84,6 +92,16 @@ class AudioTrackConfig {
</ul>
</pd>
</plentry>
<plentry>
<pt>enableAudioProcessing</pt>
<pd>Whether to enable the audio processing module:
<ul>
<li><codeph><ph keyref="true"/></codeph>: Enable the audio processing module, applying acoustic echo cancellation (AEC), automatic noise suppression (ANS), and automatic gain control (AGC).</li>
<li><codeph><ph keyref="false"/></codeph>: (Default) Do not enable the audio processing module.</li>
</ul>
<note type="attention">This setting takes effect only for custom audio capture tracks of the <ph keyref="AUDIO_TRACK_DIRECT"/> type.</note>
</pd>
</plentry>
</parml> </section>
</refbody>
</reference>
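The new `enableAudioProcessing` field is only honored for direct custom audio tracks, so a typical configuration pairs it with `createCustomAudioTrack`. A minimal sketch, assuming an initialized `RtcEngine` named `engine` and the Android SDK's `Constants.AudioTrackType` enum:

```java
// Sketch: create a custom audio track of type AUDIO_TRACK_DIRECT with the
// audio processing module (AEC/ANS/AGC) enabled via enableAudioProcessing.
AudioTrackConfig config = new AudioTrackConfig();
config.enableLocalPlayback = false;   // do not play this track back locally
config.enableAudioProcessing = true;  // only effective for AUDIO_TRACK_DIRECT
int trackId = engine.createCustomAudioTrack(
    Constants.AudioTrackType.AUDIO_TRACK_DIRECT, config);
// Use trackId when pushing external audio frames for this track.
```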
75 changes: 47 additions & 28 deletions dita/RTC-NG/API/class_externalvideoframe.dita
@@ -20,7 +20,7 @@
public static final int BUFFER_TYPE_ARRAY = 2;
public static final int BUFFER_TYPE_TEXTURE = 3;
public AgoraVideoFrame() {
format = 10;
format = 10;
timeStamp = 0;
stride = 0;
height = 0;
@@ -37,6 +37,7 @@
rotation = 0;
alphaStitchMode = 0;
}

public int format;
public long timeStamp;
public int stride;
@@ -46,7 +47,6 @@
public float[] transform;
public javax.microedition.khronos.egl.EGLContext eglContext10;
public android.opengl.EGLContext eglContext14;

public byte[] buf;
public int cropLeft;
public int cropTop;
@@ -56,34 +56,43 @@
public int alphaStitchMode;
@Override
public String toString() {
return "AgoraVideoFrame{"
+ "format=" + format + ", timeStamp=" + timeStamp + ", stride=" + stride
+ ", height=" + height + ", textureID=" + textureID
+ ", buf.length=" + (buf != null ? buf.length : 0) + ", cropLeft=" + cropLeft
+ ", cropTop=" + cropTop + ", cropRight=" + cropRight + ", cropBottom=" + cropBottom
+ ", rotation=" + rotation + ", alphaStitchMode=" + alphaStitchMode + '}';
return &quot;AgoraVideoFrame{&quot;
+ &quot;format=&quot; + format + &quot;, timeStamp=&quot; + timeStamp + &quot;, stride=&quot; + stride
+ &quot;, height=&quot; + height + &quot;, textureID=&quot; + textureID
+ &quot;, buf.length=&quot; + (buf != null ? buf.length : 0) + &quot;, cropLeft=&quot; + cropLeft
+ &quot;, cropTop=&quot; + cropTop + &quot;, cropRight=&quot; + cropRight + &quot;, cropBottom=&quot; + cropBottom
+ &quot;, rotation=&quot; + rotation + &quot;, alphaStitchMode=&quot; + alphaStitchMode + &#x27;}&#x27;;
}
}</codeblock>
<codeblock props="hmos" outputclass="language-arkts"></codeblock>
<codeblock props="ios mac" outputclass="language-objectivec">__attribute__((visibility("default"))) @interface AgoraVideoFrame : NSObject
<codeblock props="ios mac" outputclass="language-objectivec">
__attribute__((visibility(&quot;default&quot;))) @interface AgoraVideoFrame : NSObject
@property(assign, nonatomic) NSInteger format;
@property(assign, nonatomic) CMTime time;
@property(assign, nonatomic) int stride DEPRECATED_MSG_ATTRIBUTE("use strideInPixels instead");
@property(assign, nonatomic) int strideInPixels;
@property(assign, nonatomic) int height;

@property(assign, nonatomic) CMTime time;
@property(assign, nonatomic) int stride DEPRECATED_MSG_ATTRIBUTE(&quot;use strideInPixels instead&quot;);

@property(assign, nonatomic) int strideInPixels;
@property(assign, nonatomic) int height;
@property(assign, nonatomic) CVPixelBufferRef _Nullable textureBuf;

@property(strong, nonatomic) IMAGE_CLASS * _Nullable image;
@property(strong, nonatomic) NSData *_Nullable dataBuf;

@property(strong, nonatomic) NSData *_Nullable dataBuf;
@property(strong, nonatomic) NSData *_Nullable alphaBuf;
@property(assign, nonatomic) AgoraAlphaStitchMode alphaStitchMode;
@property(assign, nonatomic) int cropLeft;
@property(assign, nonatomic) int cropTop;
@property(assign, nonatomic) int cropRight;
@property(assign, nonatomic) int cropBottom;
@property(assign, nonatomic) int rotation;

@property(assign, nonatomic) int cropLeft;
@property(assign, nonatomic) int cropTop;
@property(assign, nonatomic) int cropRight;
@property(assign, nonatomic) int cropBottom;
@property(assign, nonatomic) int rotation;
@property(strong, nonatomic) AgoraColorSpace *_Nullable colorSpace;

- (void)fillAlphaData;
@end</codeblock>
<codeblock props="cpp unreal" outputclass="language-cpp">struct ExternalVideoFrame {
<codeblock props="cpp unreal" outputclass="language-cpp">
struct ExternalVideoFrame {
ExternalVideoFrame()
: type(VIDEO_BUFFER_RAW_DATA),
format(VIDEO_PIXEL_DEFAULT),
@@ -99,22 +108,26 @@
eglContext(NULL),
eglType(EGL_CONTEXT10),
textureId(0),
fenceObject(0),
metadataBuffer(NULL),
metadataSize(0),
alphaBuffer(NULL),
fillAlphaBuffer(false),
alphaStitchMode(0),
alphaStitchMode(NO_ALPHA_STITCH),
d3d11Texture2d(NULL),
textureSliceIndex(0){}

enum EGL_CONTEXT_TYPE {
EGL_CONTEXT10 = 0,
EGL_CONTEXT14 = 1,
};

enum VIDEO_BUFFER_TYPE {
VIDEO_BUFFER_RAW_DATA = 1,
VIDEO_BUFFER_ARRAY = 2,
VIDEO_BUFFER_TEXTURE = 3,
};

VIDEO_BUFFER_TYPE type;
VIDEO_PIXEL_FORMAT format;
void* buffer;
@@ -126,17 +139,19 @@
int cropBottom;
int rotation;
long long timestamp;
void *eglContext;
void* eglContext;
EGL_CONTEXT_TYPE eglType;
int textureId;
long long fenceObject;
float matrix[16];
uint8_t* metadataBuffer;
int metadataSize;
uint8_t* alphaBuffer;
bool fillAlphaBuffer;
int alphaStitchMode;
ALPHA_STITCH_MODE alphaStitchMode;
void *d3d11Texture2d;
int textureSliceIndex;
ColorSpace colorSpace;
};</codeblock>
<codeblock props="electron" outputclass="language-typescript">export class ExternalVideoFrame {
type?: VideoBufferType;
@@ -402,9 +417,9 @@ class ExternalVideoFrame {
<pt>transform</pt>
<pd>An additional transform for the Texture frame. This parameter applies only to video data in Texture format.</pd>
</plentry>
<plentry props="android hmos">
<pt>eglContext11</pt>
<pd>EGLContext11. This parameter applies only to video data in Texture format.</pd>
<plentry props="android">
<pt>eglContext10</pt>
<pd>EGLContext10. This parameter applies only to video data in Texture format.</pd>
</plentry>
<plentry props="android hmos">
<pt>eglContext14</pt>
@@ -414,8 +429,8 @@ class ExternalVideoFrame {
<pt>eglContext</pt>
<pd>This parameter applies only to video data in Texture format.
<ul id="ul_fsv_shd_krb" props="cpp unreal bp unity rn electron flutter">
<li>When using the OpenGL interface defined by Khronos (javax.microedition.khronos.egl.*), set eglContext to this field.</li>
<li>When using the OpenGL interface defined by Android (android.opengl.*), set eglContext to this field.</li>
</ul></pd>
</plentry>
<plentry props="cpp unreal bp unity electron rn flutter cs">
Expand Down Expand Up @@ -506,6 +521,10 @@ class ExternalVideoFrame {
<pt props="ios mac">time</pt>
<pd>The timestamp of the incoming video frame, in milliseconds. An incorrect timestamp causes frame loss or audio-video desynchronization.</pd>
</plentry>
<plentry props="apple cpp" conkeyref="VideoFrame/colorspace">
<pt/>
<pd/>
</plentry>
</parml> </section>
</refbody>
</reference>
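The Android `AgoraVideoFrame` fields changed above (textureID, eglContext14, transform) come together when pushing an external Texture-format frame. A hedged sketch, not code from this commit: it assumes an initialized `RtcEngine` named `engine` with the external video source enabled, that `textureId`, `eglContext14`, `transformMatrix`, `width`, and `height` come from the app's own GL pipeline, and the `FORMAT_TEXTURE_OES` constant name is an assumption to verify against the SDK.

```java
// Sketch: push one Texture-format frame using the fields documented above.
AgoraVideoFrame frame = new AgoraVideoFrame();
frame.format = AgoraVideoFrame.FORMAT_TEXTURE_OES; // assumed constant name
frame.textureID = textureId;           // GL texture from the capture pipeline
frame.eglContext14 = eglContext14;     // android.opengl.EGLContext in use
frame.transform = transformMatrix;     // 4x4 texture transform matrix
frame.stride = width;
frame.height = height;
frame.timeStamp = System.currentTimeMillis(); // ms; wrong values drop frames
engine.pushExternalVideoFrame(frame);
```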