Commit 971dd33
Update some URLs (#412)
atouchet authored Nov 20, 2023
1 parent b1f6fe0 commit 971dd33
Showing 2 changed files with 4 additions and 4 deletions.
docs/avplayback.md (6 changes: 3 additions & 3 deletions)
@@ -88,13 +88,13 @@ while let Ok(event) = receiver.recv() {
```

## Implementation
-The entry point of the media player implementation is the [ServoMedia.create_player()](https://github.com/servo/media/blob/b64b86b727ade722eaf571e65ff678364b69fc08/servo-media/lib.rs#L34) method that, among others, takes as argument an `IpcSender<PlayerEvent>` to get events from the `Player` instance, and shared references to instances of [VideoFrameRenderer](https://github.com/servo/media/blob/master/player/video.rs#L71) and [AudioFrameRenderer](https://github.com/servo/media/blob/b64b86b727ade722eaf571e65ff678364b69fc08/player/audio.rs#L1) that will receive the video and audio frames respectively as they are produced by the media player.
+The entry point of the media player implementation is the [ServoMedia.create_player()](https://github.com/servo/media/blob/b64b86b727ade722eaf571e65ff678364b69fc08/servo-media/lib.rs#L34) method that, among others, takes as argument an `IpcSender<PlayerEvent>` to get events from the `Player` instance, and shared references to instances of [VideoFrameRenderer](https://github.com/servo/media/blob/main/player/video.rs#L71) and [AudioFrameRenderer](https://github.com/servo/media/blob/b64b86b727ade722eaf571e65ff678364b69fc08/player/audio.rs#L1) that will receive the video and audio frames respectively as they are produced by the media player.
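
To make that event flow concrete, here is a minimal, self-contained sketch of the channel pattern, with `std::sync::mpsc` standing in for `ipc-channel` and a hypothetical two-variant `PlayerEvent` enum (the real servo-media types differ):

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical stand-in for servo-media's PlayerEvent; the real enum
// carries more variants (metadata, video frames, errors, ...).
#[derive(Debug)]
enum PlayerEvent {
    MetadataUpdated(String),
    EndOfStream,
}

fn main() {
    // In servo-media this would be the IpcSender<PlayerEvent> handed
    // to ServoMedia.create_player(); here, a plain in-process channel.
    let (sender, receiver) = mpsc::channel::<PlayerEvent>();

    // The "player" emits events as playback progresses.
    thread::spawn(move || {
        sender
            .send(PlayerEvent::MetadataUpdated("duration=10s".into()))
            .unwrap();
        sender.send(PlayerEvent::EndOfStream).unwrap();
    });

    // The client drains events, as in the receiver loop shown earlier.
    while let Ok(event) = receiver.recv() {
        println!("player event: {:?}", event);
        if matches!(event, PlayerEvent::EndOfStream) {
            break;
        }
    }
}
```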

-Backends are required to implement the [Player](https://github.com/servo/media/blob/master/player/lib.rs#L92) trait, which exposes a basic API to control a/v playback.
+Backends are required to implement the [Player](https://github.com/servo/media/blob/main/player/lib.rs#L92) trait, which exposes a basic API to control a/v playback.

The media player implementation does not deal with fetching the media data from any source. That task is left to the client. The `Player` trait exposes a [push_data()](https://github.com/servo/media/blob/b64b86b727ade722eaf571e65ff678364b69fc08/player/lib.rs#L101) method that gets a buffer of media data. The code that deals with fetching the media data in Servo lives within the [HTMLMediaElement implementation](https://github.com/servo/servo/blob/7bfa9179319d714656e7184e5159ea42595086e5/components/script/dom/htmlmediaelement.rs#L900). As Servo fetches data from the network or from a file, it [feeds](https://github.com/servo/servo/blob/7bfa9179319d714656e7184e5159ea42595086e5/components/script/dom/htmlmediaelement.rs#L2702) the media backend with media buffers. The media player decodes the given media data and builds the audio and/or video frames to be rendered by Servo. In the case of video frames, `servo-media` outputs frames either as raw images, by default, or as GL textures, if hardware acceleration is available. [WebRender](https://github.com/servo/webrender) is responsible for [rendering](https://github.com/servo/servo/blob/b41f5f97f26895f874514ce88cb359d65915738c/components/layout/display_list/builder.rs#L1883) the images that `servo-media` [outputs](https://github.com/servo/servo/blob/7bfa9179319d714656e7184e5159ea42595086e5/components/script/dom/htmlmediaelement.rs#L179).
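
The push-based contract is easier to see in code. Below is a deliberately trimmed, hypothetical version of such a trait together with a client-side feed loop; the real servo-media `Player` trait has different method signatures and many more responsibilities:

```rust
use std::io::{self, Read};

// Hypothetical, trimmed-down player control trait; illustrative only.
trait Player {
    fn play(&mut self);
    fn pause(&mut self);
    fn seek(&mut self, seconds: f64);
    // The client pushes raw media bytes; the player decodes them.
    fn push_data(&mut self, data: Vec<u8>) -> Result<(), ()>;
    fn end_of_stream(&mut self);
}

struct LogPlayer;

impl Player for LogPlayer {
    fn play(&mut self) {}
    fn pause(&mut self) {}
    fn seek(&mut self, _seconds: f64) {}
    fn push_data(&mut self, data: Vec<u8>) -> Result<(), ()> {
        println!("decoding {} bytes", data.len());
        Ok(())
    }
    fn end_of_stream(&mut self) {
        println!("end of stream");
    }
}

// Fetching stays on the client side: read from any source (network,
// file, ...) and forward buffers to the backend, as Servo's
// HTMLMediaElement does.
fn feed<P: Player, R: Read>(player: &mut P, source: &mut R) -> io::Result<()> {
    let mut chunk = [0u8; 32 * 1024];
    loop {
        let n = source.read(&mut chunk)?;
        if n == 0 {
            player.end_of_stream();
            return Ok(());
        }
        // A real client would also react to backpressure signals here.
        let _ = player.push_data(chunk[..n].to_vec());
    }
}

fn main() -> io::Result<()> {
    let mut source = io::Cursor::new(vec![0u8; 100 * 1024]);
    feed(&mut LogPlayer, &mut source)
}
```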

-The [GStreamer](https://github.com/servo/media/blob/master/backends/gstreamer/player.rs#L829) implementation is mostly a wrapper around [GstPlayer](https://gstreamer.freedesktop.org/documentation/player/gstplayer.html?gi-language=c), which is a very convenient media playback API that hides many of the complex GStreamer details. It is built on top of the [playbin](https://gstreamer.freedesktop.org/documentation/playback/playbin3.html?gi-language=c) element, which provides a stand-alone everything-in-one abstraction for an audio and/or video player and that dynamically builds the appropriate decoding pipeline for the given media content.
+The [GStreamer](https://github.com/servo/media/blob/main/backends/gstreamer/player.rs#L776) implementation is mostly a wrapper around [GstPlayer](https://gstreamer.freedesktop.org/documentation/player/gstplayer.html?gi-language=c), which is a very convenient media playback API that hides many of the complex GStreamer details. It is built on top of the [playbin](https://gstreamer.freedesktop.org/documentation/playback/playbin3.html?gi-language=c) element, which provides a stand-alone everything-in-one abstraction for an audio and/or video player and that dynamically builds the appropriate decoding pipeline for the given media content.
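
For readers unfamiliar with `playbin`, the following standalone sketch drives it through the `gstreamer` Rust crate directly (not through servo-media's `GstPlayer`-based wrapper); the URI is a placeholder, error handling is minimal, and exact function paths vary between gstreamer-rs versions:

```rust
use gstreamer as gst;
use gst::prelude::*;

fn main() {
    gst::init().expect("failed to initialize GStreamer");

    // playbin builds the whole decode/render pipeline for the URI itself.
    let pipeline = gst::parse_launch("playbin uri=file:///tmp/example.webm")
        .expect("failed to create playbin");

    pipeline
        .set_state(gst::State::Playing)
        .expect("failed to start playback");

    // Block on the bus until end-of-stream or an error.
    let bus = pipeline.bus().expect("pipeline has no bus");
    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        match msg.view() {
            gst::MessageView::Eos(..) => break,
            gst::MessageView::Error(err) => {
                eprintln!("playback error: {}", err.error());
                break;
            }
            _ => {}
        }
    }

    let _ = pipeline.set_state(gst::State::Null);
}
```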

### Hardware acceleration
TODO
docs/webaudio.md (2 changes: 1 addition & 1 deletion)
@@ -54,7 +54,7 @@ Following the [WebAudio API specification](https://webaudio.github.io/web-audio-

The `control thread` is the thread from which the `AudioContext` is instantiated and from which authors manipulate the audio graph. In Servo's case, this is the [script](https://github.com/servo/servo/blob/594ea14d5bd7b76d09b679fd0454165259ffbe7a/components/script/script_thread.rs#L5) thread.

-The `rendering thread` is where the magic and the actual audio processing happens. This thread keeps an [event loop](https://github.com/servo/media/blob/2610789d1abfbe4443579021113c822ba05f34dc/audio/render_thread.rs#L250) that listens and handles control messages coming from the `control thread` and the audio backend and processes the audio coming from the audio graph in blocks of 128 samples-frames called [render quantums](https://webaudio.github.io/web-audio-api/#render-quantum). For each spin of the loop, the [AudioRenderThread.process](https://github.com/servo/media/blob/2610789d1abfbe4443579021113c822ba05f34dc/audio/render_thread.rs#L233) method is called. This method internally runs a [DFS](https://en.wikipedia.org/wiki/Depth-first_search) traversal on the internal graph calling the [process](https://github.com/servo/media/blob/master/audio/node.rs#L126) method for each node. The resulting chunk of audio data is [pushed](https://github.com/servo/media/blob/2610789d1abfbe4443579021113c822ba05f34dc/audio/render_thread.rs#L337) to the audio sink.
+The `rendering thread` is where the magic and the actual audio processing happens. This thread keeps an [event loop](https://github.com/servo/media/blob/2610789d1abfbe4443579021113c822ba05f34dc/audio/render_thread.rs#L250) that listens and handles control messages coming from the `control thread` and the audio backend and processes the audio coming from the audio graph in blocks of 128 samples-frames called [render quantums](https://webaudio.github.io/web-audio-api/#render-quantum). For each spin of the loop, the [AudioRenderThread.process](https://github.com/servo/media/blob/2610789d1abfbe4443579021113c822ba05f34dc/audio/render_thread.rs#L233) method is called. This method internally runs a [DFS](https://en.wikipedia.org/wiki/Depth-first_search) traversal on the internal graph calling the [process](https://github.com/servo/media/blob/main/audio/node.rs#L126) method for each node. The resulting chunk of audio data is [pushed](https://github.com/servo/media/blob/2610789d1abfbe4443579021113c822ba05f34dc/audio/render_thread.rs#L337) to the audio sink.
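
A heavily simplified model of one spin of that loop, with hypothetical node types (the real graph tracks ports, channel counts, and block references rather than copying buffers):

```rust
const QUANTUM_SIZE: usize = 128; // sample-frames per render quantum

// Hypothetical minimal audio node: one block in, one block out.
trait AudioNode {
    fn process(&mut self, input: &[f32; QUANTUM_SIZE]) -> [f32; QUANTUM_SIZE];
}

struct GainNode {
    gain: f32,
}

impl AudioNode for GainNode {
    fn process(&mut self, input: &[f32; QUANTUM_SIZE]) -> [f32; QUANTUM_SIZE] {
        let mut out = [0.0; QUANTUM_SIZE];
        for (o, i) in out.iter_mut().zip(input) {
            *o = i * self.gain;
        }
        out
    }
}

// One render quantum: visit the nodes in dependency order (a Vec in
// topological order stands in for the DFS over the real graph) and
// push the resulting block to the audio sink.
fn render_quantum(
    nodes: &mut [Box<dyn AudioNode>],
    sink: &mut impl FnMut([f32; QUANTUM_SIZE]),
) {
    let mut block = [0.0f32; QUANTUM_SIZE]; // silence from the source
    for node in nodes.iter_mut() {
        block = node.process(&block);
    }
    sink(block);
}

fn main() {
    let mut nodes: Vec<Box<dyn AudioNode>> = vec![Box::new(GainNode { gain: 0.5 })];
    let mut sink = |block: [f32; QUANTUM_SIZE]| {
        println!("rendered {} sample-frames", block.len());
    };
    // The real rendering thread spins for the lifetime of the AudioContext.
    for _ in 0..3 {
        render_quantum(&mut nodes, &mut sink);
    }
}
```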

### Audio Playback

