refactor(core): replace CodecSurface by the SurfaceProcessor
ThibaultBee committed Nov 18, 2024
1 parent 8adacbe commit 8fd4e0d
Showing 56 changed files with 3,431 additions and 1,219 deletions.
100 changes: 74 additions & 26 deletions README.md
@@ -134,8 +134,8 @@ To simplify integration, StreamPack provides a `PreviewView`.

There are 2 types of streamers:

- Kotlin Coroutine based
- callback based
- Kotlin Coroutine based: streamer APIs use `suspend` functions and `Flow`
- callback based: streamer APIs use callbacks

```kotlin
// For coroutine based
@@ -147,6 +147,9 @@ val streamer = DefaultCameraStreamer(context = requireContext())
4. Configures audio and video settings

```kotlin
// Already instantiated streamer
val streamer = DefaultCameraStreamer(context = requireContext())

val audioConfig = AudioConfig(
startBitrate = 128000,
sampleRate = 44100,
@@ -165,19 +168,26 @@ streamer.configure(audioConfig, videoConfig)
5. Inflates the camera preview with the streamer

```kotlin
// Already instantiated streamer
val streamer = DefaultCameraStreamer(context = requireContext())

/**
 * If the preview is a [PreviewView]
*/
preview.streamer = streamer
/**
 * If the preview is in a SurfaceView, a TextureView, a Surface, etc., you can use:
*/
streamer.startPreview(preview)
```

6. Starts the live streaming

```kotlin
// Already instantiated streamer
val streamer = DefaultCameraStreamer(context = requireContext())


val descriptor =
UriMediaDescriptor("rtmps://serverip:1935/s/streamKey") // For RTMP/RTMPS. Uri also supports SRT url, file, content path,...
/**
@@ -192,6 +202,9 @@ streamer.startStream()
7. Stops and releases the streamer

```kotlin
// Already instantiated streamer
val streamer = DefaultCameraStreamer(context = requireContext())

streamer.stopStream()
streamer.close() // Disconnect from server or close the file
streamer.stopPreview() // The StreamerSurfaceView will automatically stop the preview
@@ -265,50 +278,85 @@ You will also have to declare the `Service`,
</application>
```

## Rotations

To set the `Streamer` orientation, you can use the `targetRotation` setter:

```kotlin
// Already instantiated streamer
val streamer = DefaultCameraStreamer(context = requireContext())

streamer.targetRotation =
Surface.ROTATION_90 // Or Surface.ROTATION_0, Surface.ROTATION_180, Surface.ROTATION_270
```
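`targetRotation` takes one of the `Surface.ROTATION_*` constants, which are indices (0 to 3) rather than degrees. A minimal, self-contained sketch of the constant-to-degrees mapping (the constants are redefined locally with Android's values so the snippet runs outside Android):

```kotlin
// Android's Surface.ROTATION_* constants are indices, not degrees.
// Values mirror android.view.Surface so this runs without the Android SDK.
const val ROTATION_0 = 0
const val ROTATION_90 = 1
const val ROTATION_180 = 2
const val ROTATION_270 = 3

// Maps a rotation constant to its angle in degrees.
fun rotationToDegrees(rotation: Int): Int = when (rotation) {
    ROTATION_0 -> 0
    ROTATION_90 -> 90
    ROTATION_180 -> 180
    ROTATION_270 -> 270
    else -> throw IllegalArgumentException("Unknown rotation: $rotation")
}

fun main() {
    // Setting targetRotation = Surface.ROTATION_90 requests a 90° rotation
    println(rotationToDegrees(ROTATION_90)) // 90
}
```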

StreamPack comes with a `RotationProvider` that fetches and listens to the device rotation: the
`DeviceRotationProvider`.

```kotlin
// Already instantiated streamer
val streamer = DefaultCameraStreamer(context = requireContext())
// The DeviceRotationProvider described above (assuming it takes a Context)
val rotationProvider = DeviceRotationProvider(requireContext())

val listener = object : IRotationProvider.Listener {
override fun onOrientationChanged(rotation: Int) {
streamer.targetRotation = rotation
}
}
rotationProvider.addListener(listener)

// Don't forget to remove the listener when you don't need it anymore
rotationProvider.removeListener(listener)
```

See the `demos` for a complete example.

## Tips

### RTMP or SRT

RTMP and SRT are both live streaming protocols. SRT is a modern UDP-based protocol: it is reliable
and ultra low latency. RTMP is a TCP-based protocol: it is also reliable, but only low latency.
There are already a lot of comparisons over the Internet, so here is a summary:

SRT:

- Ultra low latency (< 1s)
- HEVC support through MPEG-TS

RTMP:

- Low latency (2-3s)
- HEVC not officially supported (the specification has been abandoned by its creator)

So, the main question is: "which protocol to use?"
It is easy: if your server has SRT support, use SRT otherwise use RTMP.
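That rule of thumb can be sketched as a simple scheme check on the stream URL. The helper below is illustrative only and not part of StreamPack (the `UriMediaDescriptor` shown earlier already infers the endpoint from the URI):

```kotlin
// Illustrative only (not a StreamPack API): pick a protocol from the URL scheme,
// preferring SRT when the server supports it.
fun protocolOf(url: String): String = when {
    url.startsWith("srt://") -> "SRT"
    url.startsWith("rtmp://") || url.startsWith("rtmps://") -> "RTMP"
    else -> error("Unsupported scheme in: $url")
}

fun main() {
    println(protocolOf("srt://serverip:9998?streamid=streamKey")) // SRT
    println(protocolOf("rtmps://serverip:1935/s/streamKey"))      // RTMP
}
```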

### Streamers

Let's start with some definitions! `Streamers` are classes that represent a streaming pipeline:
capture, encode, mux and send.
They come in multiple flavours, with different audio and video sources. 3 types of base streamers
are available:

- `DefaultCameraStreamer`: for streaming from camera
- `DefaultScreenRecorderStreamer`: for streaming from screen
- `DefaultAudioOnlyStreamer`: for streaming audio only

Since 3.0.0, the endpoint of a `Streamer` is inferred from the `MediaDescriptor` object passed to
the `open` or `startStream` methods. It is possible to limit the possible endpoints by
implementing your own `DynamicEndpoint.Factory` or by passing an endpoint as the `Streamer`
`endpoint` parameter.

To create a `Streamer` for a new source, create a new `Streamer` class that inherits from
`DefaultStreamer`.

### Get device capabilities

Have you ever wondered: "What are the supported resolutions of my cameras?" or "What are the
supported sample rates of my audio codecs?" `Info` classes are made for this. Each `Streamer`
comes with a specific `Info` object:

```kotlin
val info = streamer.getInfo(MediaDescriptor("rtmps://serverip:1935/s/streamKey"))
```
For static endpoint or an opened dynamic endpoint, you can directly get the info:
3 changes: 2 additions & 1 deletion core/build.gradle.kts
@@ -19,6 +19,8 @@ dependencies {
implementation(libs.androidx.core.ktx)
implementation(libs.androidx.activity)
implementation(libs.kotlinx.coroutines.android)
implementation(libs.androidx.window)
implementation(libs.androidx.concurrent.futures)

testImplementation(libs.androidx.test.rules)
testImplementation(libs.androidx.test.core.ktx)
@@ -28,7 +30,6 @@ dependencies {
testImplementation(libs.kotlinx.coroutines.test)
testImplementation(libs.robolectric)


androidTestImplementation(libs.androidx.test.core.ktx)
androidTestImplementation(libs.androidx.test.rules)
androidTestImplementation(libs.androidx.junit)
@@ -33,12 +33,13 @@ import android.media.MediaFormat
import android.media.MediaFormat.KEY_PRIORITY
import android.os.Build
import android.util.Size
import androidx.annotation.IntRange
import io.github.thibaultbee.streampack.core.internal.encoders.mediacodec.MediaCodecHelper
import io.github.thibaultbee.streampack.core.internal.utils.RotationValue
import io.github.thibaultbee.streampack.core.internal.utils.av.video.DynamicRangeProfile
import io.github.thibaultbee.streampack.core.internal.utils.extensions.isDevicePortrait
import io.github.thibaultbee.streampack.core.internal.utils.extensions.isVideo
import io.github.thibaultbee.streampack.core.internal.utils.extensions.landscapize
import io.github.thibaultbee.streampack.core.internal.utils.extensions.portraitize
import io.github.thibaultbee.streampack.core.internal.utils.extensions.rotateFromNaturalOrientation
import io.github.thibaultbee.streampack.core.internal.utils.extensions.rotationToDegrees
import io.github.thibaultbee.streampack.core.streamers.DefaultStreamer
import java.security.InvalidParameterException
import kotlin.math.roundToInt
@@ -153,20 +154,6 @@ class VideoConfig(
*/
val isHdr by lazy { dynamicRangeProfile != DynamicRangeProfile.sdr }

/**
* Get resolution according to device orientation
*
* @param context activity context
* @return oriented resolution
*/
fun getDeviceOrientedResolution(context: Context): Size {
return if (context.isDevicePortrait) {
resolution.portraitize
} else {
resolution.landscapize
}
}

/**
* Get the media format from the video configuration
*
@@ -348,3 +335,24 @@ class VideoConfig(
}
}

/**
* Rotates video configuration to [rotation] from device natural orientation.
*/
fun VideoConfig.rotateFromNaturalOrientation(context: Context, @RotationValue rotation: Int) =
rotateDegreesFromNaturalOrientation(context, rotation.rotationToDegrees)

/**
 * Rotates video configuration to [rotationDegrees] from device natural orientation.
*/
fun VideoConfig.rotateDegreesFromNaturalOrientation(
context: Context,
@IntRange(from = 0, to = 359) rotationDegrees: Int
): VideoConfig {
val newResolution = resolution.rotateFromNaturalOrientation(context, rotationDegrees)
return if (resolution != newResolution) {
copy(resolution = newResolution)
} else {
this
}
}
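The extension above only produces a new config when the rotation actually changes the resolution. A self-contained sketch of that width/height swap (assumptions: `Resolution` stands in for `android.util.Size`, and the resolution is already expressed in the device's natural orientation, which the real code derives from the `Context`):

```kotlin
// Stand-in for android.util.Size so the sketch runs on the plain JVM.
data class Resolution(val width: Int, val height: Int)

// Rotating by 90° or 270° swaps width and height; 0° and 180° keep them.
// Mirrors the intent of rotateDegreesFromNaturalOrientation above, which
// only returns a new VideoConfig when the resolution actually changes.
fun Resolution.rotateDegrees(rotationDegrees: Int): Resolution {
    require(rotationDegrees in 0..359) { "rotationDegrees must be in [0, 359]" }
    return if (rotationDegrees % 180 == 90) Resolution(height, width) else this
}

fun main() {
    println(Resolution(1920, 1080).rotateDegrees(90))  // Resolution(width=1080, height=1920)
    println(Resolution(1920, 1080).rotateDegrees(180)) // Resolution(width=1920, height=1080)
}
```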

@@ -16,6 +16,7 @@
package io.github.thibaultbee.streampack.core.internal.encoders

import android.view.Surface
import io.github.thibaultbee.streampack.core.data.Config
import io.github.thibaultbee.streampack.core.internal.data.Frame
import io.github.thibaultbee.streampack.core.internal.interfaces.Releaseable
import io.github.thibaultbee.streampack.core.internal.interfaces.SuspendStreamable
@@ -62,6 +63,12 @@ interface IEncoder {

interface IEncoderInternal : SuspendStreamable, Releaseable,
IEncoder {

/**
* The encoder configuration
*/
val config: Config

interface IListener {
/**
 * Called when an encoder has an error.
@@ -99,6 +106,11 @@ interface IEncoderInternal : SuspendStreamable, Releaseable,
*/
interface ISurfaceInput :
IEncoderInput {
/**
* The surface where to write the frame
*/
val surface: Surface?

/**
* The surface update listener
*/
@@ -1,3 +1,18 @@
/*
* Copyright (C) 2024 Thibault B.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package io.github.thibaultbee.streampack.core.internal.encoders.mediacodec

import android.media.MediaCodecInfo
@@ -6,7 +21,6 @@ import android.os.Build
import io.github.thibaultbee.streampack.core.data.AudioConfig
import io.github.thibaultbee.streampack.core.data.Config
import io.github.thibaultbee.streampack.core.data.VideoConfig
import io.github.thibaultbee.streampack.core.internal.orientation.ISourceOrientationProvider

sealed class EncoderConfig<T : Config>(val config: T) {
/**
@@ -39,13 +53,12 @@ sealed class EncoderConfig<T : Config>(val config: T) {

class VideoEncoderConfig(
videoConfig: VideoConfig,
val useSurfaceMode: Boolean = true,
private val orientationProvider: ISourceOrientationProvider? = null
val useSurfaceMode: Boolean = true
) : EncoderConfig<VideoConfig>(
videoConfig
) {
override val isVideo = true

override fun buildFormat(withProfileLevel: Boolean): MediaFormat {
val format = config.getFormat(withProfileLevel)
if (useSurfaceMode) {
@@ -68,31 +81,19 @@ class VideoEncoderConfig(
return format
}

fun orientateFormat(format: MediaFormat) {
orientationProvider?.let {
it.getOrientedSize(config.resolution).apply {
// Override previous format
format.setInteger(MediaFormat.KEY_WIDTH, width)
format.setInteger(MediaFormat.KEY_HEIGHT, height)
}
}
}

override fun equals(other: Any?): Boolean {
if (this === other) return true
if (other !is VideoEncoderConfig) return false

if (!super.equals(other)) return false
if (useSurfaceMode != other.useSurfaceMode) return false
if (orientationProvider != other.orientationProvider) return false

return true
}

override fun hashCode(): Int {
var result = super.hashCode()
result = 31 * result + useSurfaceMode.hashCode()
result = 31 * result + (orientationProvider?.hashCode() ?: 0)
result = 31 * result + isVideo.hashCode()
return result
}
@@ -85,6 +85,7 @@ internal constructor(
}
}

override val config = encoderConfig.config

private val encoderCallback = EncoderCallback()

@@ -128,15 +129,6 @@
}

override fun configure() {
/**
* This is a workaround because few Samsung devices (such as Samsung Galaxy J7 Prime does
* not find any encoder if the width and height are oriented to portrait.
* We defer orientation of width and height to here.
*/
if (encoderConfig is VideoEncoderConfig) {
encoderConfig.orientateFormat(format)
}

try {
/**
* Set encoder callback without handler.
@@ -448,7 +440,8 @@
internal inner class SurfaceInput : IEncoderInternal.ISurfaceInput {
private val obsoleteSurfaces = mutableListOf<Surface>()

private var surface: Surface? = null
override var surface: Surface? = null
private set

override var listener = object : IEncoderInternal.ISurfaceInput.OnSurfaceUpdateListener {}
set(value) {