
question for embedding your decoder #12

Open
publicocean0 opened this issue Jun 12, 2017 · 8 comments
@publicocean0

Hi, I'd like to embed your decoder in my player, but I have some questions.

Hypothesis:
- I have a WebM stream ... using chunks.
- The demuxer is seekable ... it can seek to a specific position and send out the right frames.

Is the decoder stateless? When I seek, is there no state to reset in the decoder? Nothing related to keyframes, for example?

@brianxautumn
Member

The decoder does keep state. You can only seek to a keyframe, but this is how all decoders work.

@publicocean0
Author

publicocean0 commented Jun 12, 2017

OK, so practically, after a seek I have to call the decode API with compressed frames starting from a keyframe. It seems simple.

@brianxautumn
Member

Yes, this is the case with VP8, you have to seek to the nearest keyframe.
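To make the seek behavior concrete, here is a minimal sketch of keyframe-based seeking. It assumes a hypothetical demuxer that exposes frames as objects with a `pts` (timestamp) and an `isKeyframe` flag, and a hypothetical `decode(frame)` callback; none of these names come from this project's actual API.

```javascript
// Find the index of the nearest keyframe at or before the seek target.
function findSeekPoint(frames, targetPts) {
  let seekIndex = 0;
  for (let i = 0; i < frames.length; i++) {
    if (frames[i].isKeyframe && frames[i].pts <= targetPts) {
      seekIndex = i;
    }
  }
  return seekIndex;
}

// After a seek, feed every frame from that keyframe up to the target into
// the decoder, so its internal state (reference frames) is rebuilt before
// the frame the user actually asked for is displayed.
function seekAndDecode(frames, targetPts, decode) {
  const start = findSeekPoint(frames, targetPts);
  for (let i = start; i < frames.length && frames[i].pts <= targetPts; i++) {
    decode(frames[i]);
  }
}
```

In a real player the frame index would come from the WebM Cues element rather than a linear scan, but the principle is the same: decoding always restarts at a keyframe.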

@publicocean0
Author

publicocean0 commented Jun 13, 2017

Then for the player part ... I think I can just refresh the canvas, which is simple, but I have two doubts:

  • In WebM I can't find a sample rate for the video. What frequency should I use for refreshing the canvas?
  • Based on your experience, is setTimeout every X ms the fastest solution?

@brianxautumn
Member

When you demultiplex it, each frame should come with a timestamp. I don't think a fixed fps will work perfectly. Do you have audio too?

@brianxautumn
Member

I'm not an expert on A/V sync but perhaps @Brion knows

@bvibber
Contributor

bvibber commented Jun 13, 2017

Indeed, there's no single fps value in the WebM structure; each frame's packet carries a presentation timestamp relative to the start time.

Very roughly, a playback loop (such as the one implemented in my ogv.js player) needs to do:

  • demux next frame
  • get the current playback time, something like:
    • playbackTime = audioContext.currentTime if you have audio, or just:
    • playbackTime = Date.now() - startTime, or if you're really fancy and want to handle pauses better:
    • playbackTime = (Date.now() - timeWeLastPlayedAFrame) + ptsOfLastFrame
  • delay until it's time
    • setTimeout(displayFrame, ptsOfCurrentFrame - playbackTime)
  • draw!
  • go back to start
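The steps above can be sketched as a small timestamp-driven loop. This is not ogv.js code; `demuxNextFrame` and `drawFrame` are placeholder callbacks, and timestamps are assumed to be in milliseconds relative to playback start.

```javascript
const startTime = Date.now();

// Step 2: current playback time. With audio you would derive this from
// audioContext.currentTime instead, so video stays synced to the audio clock.
function playbackTimeMs() {
  return Date.now() - startTime;
}

function pump(demuxNextFrame, drawFrame) {
  const frame = demuxNextFrame();               // step 1: demux next frame
  if (!frame) return;                           // end of stream
  const delay = frame.pts - playbackTimeMs();   // step 3: how long to wait
  setTimeout(() => {
    drawFrame(frame);                           // step 4: draw!
    pump(demuxNextFrame, drawFrame);            // step 5: go back to start
  }, Math.max(0, delay));
}
```

Note that the delay is recomputed per frame from the frame's own timestamp, rather than using a fixed interval, which is exactly why a constant-fps setTimeout does not stay in sync.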

@publicocean0
Author

publicocean0 commented Jun 14, 2017

Thanks a lot @Brion, I got the idea. I'm now thinking that in the WebM blocks there is a BlockDuration, so blockDuration / frameCount = time per frame. Answering @brianxautumn: no, I still have to see exactly how to do it. I just saw you implemented an audio stream feeder for this.
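The arithmetic suggested above is just a division; a tiny sketch, with an illustrative function name and example values (not taken from any real stream):

```javascript
// If a WebM block covers a known duration and holds several laced frames,
// each frame's nominal display period is the block duration divided by the
// frame count. This only gives an average; per-frame timestamps, when
// present, are more accurate.
function framePeriodMs(blockDurationMs, frameCount) {
  if (frameCount <= 0) throw new Error("frameCount must be positive");
  return blockDurationMs / frameCount;
}
```

For example, a 132 ms block holding 4 laced frames gives 33 ms per frame, roughly 30 fps.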

The better solution would be to implement an MSE object for browsers that don't support it.
There is also another little problem to consider.
Usually there is a parser/demuxer in the DASH component as well, and MSE may also contain a simplified demuxer + decoder.
So there is redundancy in demuxing twice. I'm wondering whether it's better to find a trick to avoid it (for example, creating a virtual codec 'video/webm codecs="js"' that essentially passes frames through directly after the initialization segment), while still keeping the solution portable.
