
Output Stream for AudioPlayer #10489

Open
DanielBUBU opened this issue Sep 6, 2024 · 8 comments

@DanielBUBU

DanielBUBU commented Sep 6, 2024

Which application or package is this feature request for?

voice

Feature

I am trying to build a site that plays the same music as my Discord bot.

  • when the AudioPlayer pauses, the music on the site pauses (the site has a long keep-alive setting, so that should be fine)
  • the site can handle multiple requests at once
  • the stream served by the site should be playable in VLC

Here's the part of the code I wrote:
BufferingTransform

const { Transform } = require('stream');

const DEFAULT_CAPACITY = 10;

// Buffers up to `capacity` chunks before passing data downstream,
// and throttles throughput by `delay` ms per chunk.
class BufferingTransform extends Transform {
    constructor(options = {}) {
        super(options);

        this.capacity = options.capacity || DEFAULT_CAPACITY;
        this.delay = options.delay || 25;
        this.pending = [];
    }

    get atCapacity() {
        return this.pending.length >= this.capacity;
    }

    _transform(chunk, encoding, cb) {
        // Once the buffer is full, release the oldest chunk downstream.
        if (this.atCapacity) {
            this.push(...this.pending.shift());
        }

        this.pending.push([chunk, encoding]);

        // Delay the callback to throttle how fast upstream can write;
        // guarded in case the method is ever called without a callback.
        if (cb != undefined) {
            setTimeout(cb, this.delay);
        }
    }

    _flush(cb) {
        // Drain whatever is still buffered before the stream ends.
        while (this.pending.length > 0) {
            this.push(...this.pending.shift());
        }

        if (cb != undefined) {
            cb();
        }
    }
}
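
For context, a quick usage sketch of this transform on its own (the file names here are made up):

const fs = require('fs');

// Buffer and throttle a file stream before writing it back out.
const buffered = new BufferingTransform({ capacity: 20, delay: 25 });
fs.createReadStream('./input.pcm')
    .pipe(buffered)
    .pipe(fs.createWriteStream('./output.pcm'));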

In my class for music processing:

this.port = (Number(process.env.PORT) || 4000) + processIndex + 1;

        this.expressApp.get('/', (req, res) => {
            console.log("A new connection was made by a client.");
            const bufferStr = new BufferingTransform();
            res.writeHead(200, {
                //'Content-Type': 'video/mp4',
                //"Content-Length": "*",
            });
            // Debug hooks; uncomment the logs to trace buffer activity.
            bufferStr.on("data", async (data) => {
                //console.log("BufData")
            })
            bufferStr.on("end", async (data) => {
                //console.log("BufEnd")
            })
            res.on('close', () => {
                console.log("Des")
                try {
                    bufferStr.destroy();
                } catch (error) {
                    // The stream may already be destroyed; nothing to do.
                }
            });
            bufferStr.pipe(res);
            this.webAudioStream.pipe(bufferStr);
        })
        // Scan upward for a free port. Note that listen() reports EADDRINUSE
        // asynchronously via the 'error' event, so the try/catch below only
        // covers synchronous failures.
        while (!this.webListenerSuccessFlag && this.port <= 65535) {
            try {
                const port = this.port;
                this.expressApp.listen(port, () => {
                    console.log(`ChildProcess ${processIndex} listening on port ${port}`)
                }).on('connection', function (socket) {
                    // Long keep-alive (3000 * 1000 ms = 50 minutes) so radio
                    // clients can stay connected. Change this as you see fit.
                    socket.setTimeout(3000 * 1000);
                });
                this.webListenerSuccessFlag = true;
            } catch (error) {
                console.log(error);
                this.port++;
            }
        }

this.webAudioStream is a BufferingTransform object.
I want data passed from an AudioPlayer object into this.webAudioStream (so the music stays in sync with Discord).

Ideal solution or implementation

  • Output silence when the player is not in AudioPlayerPlayingState, instead of closing/ending the output stream

Method 1 - Add a pipe function

Pipe to another stream; the AudioPlayer object doesn't close when the stream it pipes into is closed:

AudioPlayerObject.pipe(BufferingTransformStreamETC1)
BufferingTransformStreamETC1.close();
AudioPlayerObject.pipe(BufferingTransformStreamETC2)

Method 2 - Pretend it's a connection

Initialize a VoiceConnection object from a stream, so it can be subscribed/unsubscribed just like a VoiceConnection:

VoiceConnectionObject = joinVoiceChannel({
    stream: BufferingTransform
})

or

VoiceConnectionObject = createFromStream({
    stream: BufferingTransform
})

Alternative solutions or implementations

No response

Other context

My goal is to create a web radio that stays in sync with Discord.
It can be done if I pipe them like this:

sources => web radio => discord audio resource

But the annoying part is that I'd have to maintain more stuff, and the Discord audio might become unstable because of the web radio stage. In a Discord bot project, that becomes a tradeoff between reliability and a new web radio feature for my bot.

@nyapat
Contributor

nyapat commented Sep 6, 2024

your post is a bit vague, but changing how a stream behaves when piped is bad imo (& a breaking change). there are specific sites that can work right now if you create an audio resource from them, and i think that's what you mean by the "web radio" stuff; that's probably the best option

@DanielBUBU
Author

DanielBUBU commented Sep 6, 2024

your post is a bit vague, but changing how a stream behaves when piped is bad imo (& a breaking change). there are specific sites that can work right now if you create an audio resource from them, and i think that's what you mean by the "web radio" stuff; that's probably the best option

Well, my idea is to use the AudioPlayer as a web radio server controller, not just to play a 24/7 stream through the AudioPlayer.
So it should work more like this:

queued sources (audioResources) => single discord audioPlayer <==Subs==> multiple VoiceConnections (and the web radio stream should be added here, working like a VoiceConnection)

The framework above is already done and works, except for the web radio stream part; the example below works right now (sketched in code after the list):

  • Join channel A in guild G1 and subscribe audioPlayer AP1
  • Join channel B in guild G2 and subscribe audioPlayer AP1
  • Channels A and B will have synchronized audio from AP1
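
A minimal sketch of that working part with @discordjs/voice (the channel/guild objects and the track path are placeholders):

const { joinVoiceChannel, createAudioPlayer, createAudioResource } = require('@discordjs/voice');

const AP1 = createAudioPlayer();

// Join channel A in guild G1 and subscribe AP1.
const connectionA = joinVoiceChannel({
    channelId: channelA.id,                       // placeholder
    guildId: guildG1.id,                          // placeholder
    adapterCreator: guildG1.voiceAdapterCreator,  // placeholder
});
connectionA.subscribe(AP1);

// Join channel B in guild G2 and subscribe the same player.
const connectionB = joinVoiceChannel({
    channelId: channelB.id,
    guildId: guildG2.id,
    adapterCreator: guildG2.voiceAdapterCreator,
});
connectionB.subscribe(AP1);

// Both channels now receive synchronized audio from AP1.
AP1.play(createAudioResource('./track.ogg'));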

And I want my web radio stream server to get synchronized audio from AP1 too.
I assumed a VoiceConnection works like a stream, so Method 2 fits this situation better.

Besides, it should be fine to pipe one stream into multiple streams.

My bot can already play 24/7 streams, by the way, in case you misunderstood.
image

The framework below is what I'd have to build if the feature isn't added.
This is the WORST solution for my project, since it requires a lot of controllers to block async functions and to sync audio between the web radio and the single discord audioPlayer:

sources (fs audio stream or similar) => web radio => discord audio resource => single discord audioPlayer <==Subs==> multiple VoiceConnections

@nyapat
Contributor

nyapat commented Sep 6, 2024

what's wrong with piping your input to the site separately from the audioplayer (but at the same time)? does it have to go "through" the audioplayer?

@DanielBUBU
Author

DanielBUBU commented Sep 6, 2024

what's wrong with piping your input to the site separately from the audioplayer (but at the same time)? does it have to go "through" the audioplayer?

Yes, it has to go through the audioplayer first, so my web radio site will have the same audio, in sync with Discord.

In my ideal implementation, it should have nothing to do with audioResource and audioPlayer; only a modified VoiceConnection, or a VoiceConnection extension, would be necessary.

@nyapat
Contributor

nyapat commented Sep 7, 2024

What you've suggested for VoiceConnection doesn't make sense. It doesn't manage streams; that is the job of AudioPlayer.

Reading back, you said the ideal solution was

Output silence when the player is not in AudioPlayerPlayingState, instead of closing/ending the output stream

Have you tried setting the maxMissedFrames behaviour to Infinity?
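
For reference, a minimal sketch of where that option lives (it's part of the behaviors passed to createAudioPlayer):

const { createAudioPlayer, NoSubscriberBehavior } = require('@discordjs/voice');

// Never stop the player for missed frames, and keep playing
// even when no connection is subscribed.
const player = createAudioPlayer({
    behaviors: {
        noSubscriber: NoSubscriberBehavior.Play,
        maxMissedFrames: Infinity,
    },
});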

@DanielBUBU
Author

DanielBUBU commented Sep 7, 2024

What you've suggested for VoiceConnection doesn't make sense. It doesn't manage streams; that is the job of AudioPlayer.

Oh, I didn't know that, so Method 2 is kind of useless now.

Reading back, you said the ideal solution was

Output silence when the player is not in AudioPlayerPlayingState, instead of closing/ending the output stream

Have you tried setting the maxMissedFrames behaviour to infinity?

ye, I did set maxMissedFrames to Infinity, but it doesn't output silent audio data (no green circle around the icon) in the Idle, Paused, or Buffering state (not sure about Buffering, because it's too short).

Besides, there's currently no way to get the data that AudioPlayer has processed and decode it.

@nyapat
Contributor

nyapat commented Sep 7, 2024

ye, I did set maxMissedFrames to Infinity, but it doesn't output silent audio data (no green circle around the icon)

Yeah, silence frames don't set a "green circle around icon"; that would be counterintuitive to actually sending a voice stream. You can get the "data processed by AudioPlayer" by just using it before you pass it into the AudioPlayer. You can synchronise a different stream & the audioplayer by running the same actions on both.

Besides, there's no way to get data that is processed by AudioPlayer and decode it right now.

Sorry, but it doesn't make sense at all to get the output of what you're piping into AudioPlayer; it will literally be the same stream. Discord does not send the audio data back, so you will not know if it's actually received by the channel, & if you want silence frames on your secondary stream you can manually push them.

If you remove the context of discord it also doesn't make sense. You shouldn't rely on one service to send two identical streams of data to two services; just send the stream to both places, each independent of the other.
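
A minimal sketch of that tee approach (source, player, and res are stand-ins for your own decoded stream, audio player, and HTTP response):

const { PassThrough } = require('stream');
const { createAudioResource, StreamType } = require('@discordjs/voice');

// Tee the decoded source: one branch feeds Discord, the other the web radio.
const discordBranch = new PassThrough();
const radioBranch = new PassThrough();
source.pipe(discordBranch);
source.pipe(radioBranch);

// Discord gets its own copy of the bytes...
player.play(createAudioResource(discordBranch, { inputType: StreamType.Arbitrary }));

// ...and the HTTP response gets an identical copy, so both stay in step.
radioBranch.pipe(res);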

@DanielBUBU
Author

If you remove the context of discord it also doesn't make sense. You shouldn't rely on one service to send two identical streams of data to two services; just send the stream to both places, each independent of the other.

I just did an experiment, and I found that the "data" event emitted by the stream (the stream I use to create an audioResource) is in sync with the Discord audio. Here is my working solution:

// Wrap a source stream in an ffmpeg transcode, mirror its output into the
// web radio buffer, and build a Discord audio resource from it.
// `BT` is an optional resume offset in milliseconds.
wrapStreamToResauce(stream, BT = false) {
        try {
            var streamOpt;
            var ffmpeg_audio_stream_C = fluentffmpeg(stream); // fluent-ffmpeg
            var audio_resauce;
            if (BT) {
                console.log("Set BT:" + Math.ceil(BT / 1000));
                ffmpeg_audio_stream_C.seekInput(Math.ceil(BT / 1000));
            }
            ffmpeg_audio_stream_C.toFormat('hls').audioChannels(2).audioFrequency(48000).audioBitrate('1536k');

            ffmpeg_audio_stream_C.on("error", (error) => {
                this.handling_vc_err = true;
                console.log("ffmpegErr" + error);
                if (error.outputStreamError) {
                    if (error.outputStreamError.code == "ERR_STREAM_PREMATURE_CLOSE") {
                        this.clear_status(false, () => {
                            try {
                                //stream.destroy();
                            } catch (error) {
                                console.log(error);
                            }
                            this.playingErrorHandling(audio_resauce, error);
                        });
                        return;
                    }
                }
                this.playingErrorHandling(audio_resauce, error);
            });

            streamOpt = ffmpeg_audio_stream_C.pipe();

            // Mirror every transcoded chunk into the shared web radio buffer;
            // the same bytes also feed the Discord audio resource below.
            streamOpt.on("data", (chunk) => {
                //console.log(chunk.length)
                this.webAudioStream.write(chunk);
            });
            streamOpt.on("end", () => {
                console.log("streamEnd");
            });

            audio_resauce = createAudioResource(
                streamOpt, { inputType: StreamType.Arbitrary, silencePaddingFrames: 10 }
            );

            audio_resauce.metadata = this.queue[this.nowplaying];
            // Proxy the resource so writes to playbackDuration are forwarded
            // to the parent process, letting it track playback progress.
            return new Proxy(audio_resauce, {
                set: (target, key, value) => {
                    //console.log(`${key} set to ${value}`);
                    target[key] = value;
                    if (key == "playbackDuration" && process.send) {
                        process.send(value);
                    }
                    return true;
                }
            });
        } catch (error) {
            console.log("ERRwhenwrap");
            throw error;
        }
    }

Yeah, silence frames don't set a "green circle around icon"; that would be counterintuitive to actually sending a voice stream. You can get the "data processed by AudioPlayer" by just using it before you pass it into the AudioPlayer. You can synchronise a different stream & the audioplayer by running the same actions on both.

Sorry, but it doesn't make sense at all to get the output of what you're piping into AudioPlayer; it will literally be the same stream. Discord does not send the audio data back, so you will not know if it's actually received by the channel, & if you want silence frames on your secondary stream you can manually push them.

This changed my thinking a lot, TY.

This is what it looks like now:
image

2 VLC players and the bot in 2 guilds; all 4 audio outputs are almost perfectly in sync.
The issue can be closed now, I guess.
