pending changes #1

Closed · jerch opened this issue Dec 30, 2021 · 23 comments

jerch (Owner) commented Dec 30, 2021

  • remove private / shared palette separation, instead always carry entries forward
  • think about a less invasive palette reset than RIS/DECSTR (maybe hook into some XTSMGRAPHICS action)
  • properly define and document the default palette as VT340 (lower 16 colors) + ANSI-256 (up to 256) + zeroed (up to 4096), see the sketch after this list
  • remove other default palettes to shrink package size
  • transition to VT340 cursor positioning, remove all others for sixels
  • update tests & docs regarding changes
  • check that the worker loop is properly guarded against malicious sixel data
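For clarity, here is a rough sketch of how such a default palette could be laid out (illustrative only, not the addon's actual code; the 16 VT340 base colors are stand-in values here):

```typescript
// Illustrative sketch of the planned default palette layout, not the addon's code.
// The 16 VT340 base colors are placeholders; the real values are assumed elsewhere.
type RGB = [number, number, number];

const VT340_COLORS: RGB[] = new Array(16).fill([0, 0, 0] as RGB); // placeholder values

function buildDefaultPalette(size = 4096): RGB[] {
  const pal: RGB[] = Array.from({ length: size }, () => [0, 0, 0] as RGB); // zeroed up to 4096
  // 0..15: VT340 defaults
  for (let i = 0; i < 16; i++) pal[i] = VT340_COLORS[i];
  // 16..231: the xterm/ANSI-256 6x6x6 color cube
  const levels = [0, 95, 135, 175, 215, 255];
  for (let i = 0; i < 216; i++) {
    pal[16 + i] = [levels[(i / 36) | 0], levels[((i / 6) | 0) % 6], levels[i % 6]];
  }
  // 232..255: the ANSI-256 grayscale ramp
  for (let i = 0; i < 24; i++) {
    const v = 8 + 10 * i;
    pal[232 + i] = [v, v, v];
  }
  return pal;
}
```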
ghost commented Jan 18, 2022

So what was the story around xtermjs/xterm.js#2503? I was so excited that almost everything was working, and then it seemed they abruptly pulled the plug. Will they never ship sixel? Or might they someday, just not anytime soon?

Is it OK if I point other JS devs looking for sixel this way?

jerch (Owner) commented Jan 20, 2022

I stalled the PR over there mostly due to time constraints. The PR grew really big over quite a long time, which makes it harder and harder to merge. Furthermore, its maintenance is not easy, as it uses many internal private symbols for the buffer and output interaction. In fact, that code belongs in core some day; such heavy lifting should not reside in an addon. But we never made it to that point, mainly because the whole image story is still uncertain ground in terminals and might still see more than one groundbreaking shift. All in all, not good conditions for moving it into core.

Is it OK if I point other JS devs looking for sixel this way?

Sure. The sixel lib still has a few issues in the decoder, but in general it should do its job. The encoder is not yet WASM-optimized, which has no priority on my end (I just added it to have support in both directions).

ghost commented Feb 1, 2022

I'm working on DOOM inside terminals now. Would you be interested in me testing this against xterm.js? Perfectly fine if not; I don't want to create uninteresting work for you. :) If you are interested, is there a recommended way to get xterm.js + xterm-addon-image working with a local shell?

Also:

  • transition to VT340 cursor positioning, remove all others for sixels

Is this about DECSDM? I'm using that now for rendering to the bottom text row (when that location is within 1000x1000 of the top-left, due to xterm limits). If so, there is an odd interaction with transparent-images-over-old-images, as seen here. This behavior currently affects wezterm and DomTerm.

jerch (Owner) commented Feb 1, 2022

@klamonte Sure, feel free to play around with it. This is what you would need:

  • a recent nodejs on your machine (imho v12 or v14 should do) with yarn installed
  • then run the contents of bootstrap.sh
  • switch to the xterm.js repo folder
  • run yarn start
  • open localhost:3000 in your browser (should be a recent Chrome/Firefox/Safari)

DECSDM is correctly supported as far as I have tested it. The cursor positioning currently follows right-or-below semantics (that's more like what xterm does), which is subpar compared to VT340's mode (cursor always at the first image column of the last image row). This needs to change before it can be released.
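For illustration, the two placement rules roughly compare like this (hypothetical helpers, not the addon's API; the right-or-below rule is paraphrased, the VT340 rule is as described above):

```typescript
// Hypothetical illustration of the two cursor placement rules after printing a
// sixel image covering `widthCells` x `heightCells` cells starting at (col, row).
interface Cursor { col: number; row: number; }

// xterm-like "right or below": cursor moves to the right of the image,
// or to the start of the next row when the image reaches the right margin.
// (Paraphrased; the exact xterm behavior is not spelled out here.)
function rightOrBelow(col: number, row: number, widthCells: number, heightCells: number, cols: number): Cursor {
  const endCol = col + widthCells;
  return endCol < cols
    ? { col: endCol, row: row + heightCells - 1 }
    : { col: 0, row: row + heightCells };
}

// VT340 rule: cursor always lands on the first image column of the last image row.
function vt340(col: number, row: number, heightCells: number): Cursor {
  return { col, row: row + heightCells - 1 };
}
```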

Since you are playing around with transparent composition: that's also not yet done in the addon. It is planned for the second alpha version.

ghost commented Feb 2, 2022

I can't get it to work on Debian.

After installing:

nodejs
yarnpkg
webpack
node-express

Then (after aliasing yarn=yarnpkg):

yarn add express-ws

I get:

yarn add v1.22.10
[1/4] Resolving packages...
[2/4] Fetching packages...
info [email protected]: The platform "linux" is incompatible with this module.
info "[email protected]" is an optional dependency and failed compatibility check. Excluding it from installation.
[3/4] Linking dependencies...
[4/4] Building fresh packages...
[1/2] ⠄ node-pty
error /home/autumn/t/xterm-addon-image/xterm.js/node_modules/node-pty: Command failed.
Exit code: 1
Command: node scripts/install.js
Arguments: 
Directory: /home/autumn/t/xterm-addon-image/xterm.js/node_modules/node-pty
Output:
events.js:291
      throw er; // Unhandled 'error' event
      ^

Error: spawn node-gyp ENOENT
    at Process.ChildProcess._handle.onexit (internal/child_process.js:268:19)
    at onErrorNT (internal/child_process.js:470:16)
    at processTicksAndRejections (internal/process/task_queues.js:84:21)
Emitted 'error' event on ChildProcess instance at:
    at Process.ChildProcess._handle.onexit (internal/child_process.js:274:12)
    at onErrorNT (internal/child_process.js:470:16)
    at processTicksAndRejections (internal/process/task_queues.js:84:21) {
  errno: 'ENOENT',
  code: 'ENOENT',
  syscall: 'spawn node-gyp',
  path: 'node-gyp',
  spawnargs: [ 'rebuild' ]
}
info Visit https://yarnpkg.com/en/docs/cli/add for documentation about this comm

Based on the stack trace, something in there thinks it is running on Windows.

Any tips?

jerch (Owner) commented Feb 2, 2022

This looks like node-gyp is missing, hmm. How did you install nodejs? On Ubuntu I never use the node packages from the distro repo, as they are broken in several ways; maybe that's the issue here?
For me the deb packages from https://github.com/nodesource/distributions/blob/master/README.md have always worked (take v14, as node-pty has issues with v16).

ghost commented Feb 2, 2022

Thank you, I used the nodesource version, installed yarn as per https://vitux.com/how-to-install-yarn-node-package-manager-on-debian-11/, and it ran!

vttest looked good. When jexer itself ran, mouse and keyboard input died -- it turns out I wasn't also honoring BEL to end OSC sequences (I use OSC 4 to query the ANSI colors) -- so thanks for exposing my bug. :)
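In case someone else trips over the same thing: OSC sequences may be terminated by either ST (ESC \) or BEL (0x07), so a response parser has to accept both. A minimal sketch (hypothetical helper, not Jexer's actual code):

```typescript
// Hypothetical sketch: extract the payload of the first OSC sequence in `data`,
// accepting both valid terminators - ST (ESC \) and BEL (0x07).
function extractOscPayload(data: string): string | null {
  const start = data.indexOf('\x1b]');
  if (start === -1) return null;
  const candidates = [data.indexOf('\x07', start), data.indexOf('\x1b\\', start)]
    .filter((i) => i !== -1);
  if (candidates.length === 0) return null;   // unterminated so far, keep buffering
  const end = Math.min(...candidates);
  return data.slice(start + 2, end);          // payload without introducer/terminator
}

// e.g. an OSC 4 color reply terminated either way yields the same payload:
// extractOscPayload('\x1b]4;1;rgb:cd00/0000/0000\x07')
// extractOscPayload('\x1b]4;1;rgb:cd00/0000/0000\x1b\\')
```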

Performance wasn't good. Firefox pegged out around 30+ kbytes/sec of sixel. Given the numbers I've seen you post for your decoder, I'm sure that's a Firefox problem and not an xterm.js problem. XtermDOOM was posting 20-30 FPS, and Firefox was pegging both of my measly cores trying to keep up lol.

But for you:

(video attachment: simplescreenrecorder-2022-02-02_10.13.32.mp4)

jerch (Owner) commented Feb 2, 2022

Yes, Firefox is slower in WASM execution than Chrome, but 30 kB/s? That's 500 times lower than I would expect: for Firefox I saw ~22 MB/s, while Chrome was around 41 MB/s in my tests. What's the pixel size of a single frame of the DOOM output?

Are you sure that nothing else is slowing things down? If you care enough, maybe open devtools --> Performance in Chrome and record for a few seconds while sixels are streamed. Then, after stopping, check the "Bottom-Up" tab at the bottom. I am pretty sure that something else is going on here.

ghost commented Feb 2, 2022

Oh, I'm sure something else is going on too. :) I'm sorry if it looked like I was saying your code is slow; I know it's very fast. I just wanted you to see it, and if you want to play with it too, you know where to look.

jerch (Owner) commented Feb 5, 2022

@klamonte Are there instructions anywhere on how to get that DOOM thingy up and running?

ghost commented Feb 5, 2022

@jerch

git clone https://gitlab.com/klamonte/xtermdoom.git
cd xtermdoom
ant jar
cp /path/to/doom1.wad doom1.wad
./run_xterm.bash

Some more discussion (I sort of threadjacked, oops!) at: https://codeberg.org/dnkl/foot/issues/481#issuecomment-363377

jerch (Owner) commented Feb 5, 2022

@klamonte

Oh wow - it's not sixel that is eating the precious CPU time, it's the canvas renderer drawing uncached glyphs. That's weird; somehow the glyph cache is not working for your chars. I suspect that the runtime is also pretty bad without any sixel stream in between. This needs investigation...

Edit:
As a quick fix, try using the webgl renderer instead; that one does not seem to be affected.
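For reference, opting into the webgl renderer in a plain xterm.js embedding is just a matter of loading the addon (a minimal sketch; the demo app wires this up through its own config):

```typescript
// Minimal sketch of switching to the webgl renderer in a plain xterm.js embedding.
// Assumes an element with id "terminal" exists in the page.
import { Terminal } from 'xterm';
import { WebglAddon } from 'xterm-addon-webgl';

const term = new Terminal();
term.open(document.getElementById('terminal')!);
term.loadAddon(new WebglAddon());   // replaces the default canvas renderer
```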

Edit2:
Seems the FPS-limiting factor is the encoding? I get 40 FPS in a small window and 15 FPS maximized (also see the top output):
[screenshot]

Edit3:
It's actually ~50 FPS with a small window (~350x220) if I close the browser devtools.

Edit4:
And last but not least, the numbers for xterm on my machine: 10 FPS maximized, 25 FPS in a small window.

ghost commented Feb 5, 2022

My encoder definitely struggles at any decent frame rate. chafa's and notcurses' encoders are on the order of 100-1000x (not exaggerating) faster. I have a lot more work to do on the DOOM side to clean up CPU usage, and then go through the entire DOOM-->Jexer-->compositing-->xterm image pipeline again.

A couple of things that may help for now:

  • jexer.ECMA48.sixelPaletteSize: at 256 it can use "direct-indexed" encoding, where it just translates from the 8-bit palette to a sixel color number. It was slower for me, but might be different for you.
  • jexer.translucence=false: that will disable the window alpha blending, which won't do much when DOOM is in the foreground (its window is opaque already) but still cuts some CPU.

You can also compare it against running inside wezterm. Jexer uses iTerm2 PNG images for that, which is a lot faster.

BTW those FPS numbers are about what my laptop is getting on xterm itself. So great job with xterm.js! :)

jerch (Owner) commented Feb 5, 2022

@klamonte WezTerm is at ~35 FPS for me.

A couple of things that may help for now: ...

Will try that tomorrow. All in all, this looks really promising, well done 😺

ghost commented Feb 6, 2022

@jerch

That's weird; somehow the glyph cache is not working for your chars. I suspect that the runtime is also pretty bad without any sixel stream in between. This needs investigation...

Jexer is unique in its sixel output strategy. It emits images first, then text, and expects the text to destroy the images. The images are also 1 row high, and I think they might be at most around 10 columns wide. So there is lots of intersecting sixel and text.

Oh, and when multithreading is enabled, the sixel images are produced in whatever order the thread pool gets to them, so there will be lots of "place->sixel->place->sixel->..." and then "place->text->place->text->...".

Don't know if that helps or not. 🤷‍♀️

jerch (Owner) commented Feb 6, 2022

Here are some results:

  • The canvas renderer currently does not cache RGB colors in FG/BG; that's the reason why it bails out to uncached drawing. When I wrote the RGB extension for the buffer, the renderer was not fixed the same way, as it was meant to be removed soon. Well, it is still the default renderer, with an uncached RGB path 😺. Is there an easy way to set different FG/BG colors?
  • jexer.ECMA48.sixelPaletteSize=256 runs worse for me. I have no clue yet why (sixel decoding takes only 130 ms of a 5 s runtime), but the browser jumps to 120% load. My suspicion is the sheer number of messages, which might create very high load for the message handling in the browser itself. The worker also does not help much here, as the messages are small but numerous, which often stalls the parser chain with stack unwinding (that's quite costly). Imho you trigger an edge case here, where the sixel decoding would be better done on the main thread (not tested yet).

Oh, and when multithreading is enabled, the sixel images are produced in whatever order the thread pool gets to them, so there will be lots of "place->sixel->place->sixel->..." and then "place->text->place->text->...".

That's probably the reason why I see such bad browser load with jexer.ECMA48.sixelPaletteSize=256. The context switches for sixel/text are quite expensive due to the internal worker setup (it's kind of a "long line"; I have no cheap memory mapping in place, and SABs are a nightmare to use since Spectre/Meltdown).

I might try to run this with a non-worker-based version (sixel decoding done directly on the main thread), which should remove a lot of the messaging overhead. But that's not a viable solution for xterm.js in general, as it keeps the main thread busy, up to perceivable stalls for bigger images.
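To make the per-message overhead tangible, here is a small stand-alone micro-benchmark sketch (browser only, unrelated to the addon's actual worker protocol): it round-trips many tiny postMessage calls versus one batched payload of the same total size.

```typescript
// Stand-alone micro-benchmark sketch (browser only, not the addon's worker protocol):
// many tiny postMessage calls vs. one batched payload of the same total size.
const workerSrc = `self.onmessage = e => { if (e.data === 'done') self.postMessage('done'); };`;
const worker = new Worker(URL.createObjectURL(new Blob([workerSrc], { type: 'text/javascript' })));

function roundTrip(chunks: Uint8Array[]): Promise<number> {
  return new Promise<number>((resolve) => {
    const start = performance.now();
    worker.onmessage = () => resolve(performance.now() - start);
    for (const chunk of chunks) worker.postMessage(chunk);
    worker.postMessage('done');                 // ask the worker to ack once everything is processed
  });
}

(async () => {
  const tiny = Array.from({ length: 10000 }, () => new Uint8Array(64));
  const batched = [new Uint8Array(10000 * 64)];
  console.log('10000 x 64 B messages:', await roundTrip(tiny), 'ms');
  console.log('1 x 640 kB message:   ', await roundTrip(batched), 'ms');
})();
```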

jerch (Owner) commented Feb 6, 2022

Yes indeed, with a non-worker variant everything runs smoothly with jexer.ECMA48.sixelPaletteSize=256 at 55 FPS. Congrats on finding the bottleneck of the worker approach 🙈

So here are the profiler numbers for that non-worker variant from a 6 s recording:
[profiler screenshot]

It shows that the sixel decoding took only 5% of the runtime (that's the wasm-function[9] entry there). Most time was spent in putImageData; the second entry, Major GC, is also 80% caused by that. In summary, 35% of the runtime is spent on 2D canvas creation, which could be made much faster with a webgl canvas (this won't happen anytime soon, but it is planned for a later version along with SAB usage).
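To illustrate what a webgl canvas would buy here: instead of copying the decoded pixels into a 2D canvas via putImageData on every frame, the RGBA buffer can be uploaded as a texture and drawn GPU-side. A sketch under the assumption of an existing WebGL context (not the addon's renderer):

```typescript
// Sketch only: upload a decoded RGBA pixel buffer as a WebGL texture instead of
// pushing it through putImageData on a 2D canvas. A valid `gl` context is assumed.
function uploadAsTexture(gl: WebGLRenderingContext, pixels: Uint8Array, width: number, height: number): WebGLTexture {
  const tex = gl.createTexture();
  if (!tex) throw new Error('texture allocation failed');
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // Single upload call; drawing then happens GPU-side instead of per-frame CPU copies.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
  // NPOT-safe sampling settings for WebGL1.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  return tex;
}
```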

Edit:
Out of curiosity I disabled sixel decoding - FPS is reported as 55-60, with the Java process at 280% load. Seems that's the fastest my machine can produce the frames (or did you cap it there?).

ghost commented Feb 6, 2022

I think DOOM itself maxes at 60 FPS, as that is the same limit foot found.

You can eliminate the RGB by commenting out the getTheme().setFemme() line in src/xterm/XtermDOOM.java.

ghost commented Feb 6, 2022

The canvas renderer currently does not cache RGB colors in FG/BG; that's the reason why it bails out to uncached drawing. When I wrote the RGB extension for the buffer, the renderer was not fixed the same way, as it was meant to be removed soon. Well, it is still the default renderer, with an uncached RGB path 😺.

I used to cache RGB, and then notcurses-demo crashed me. :-)

https://gitlab.com/klamonte/jexer/-/issues/85

AutumnMeowMeow commented

"subscribe"

(Hi @jerch, this is my new account.)

jerch (Owner) commented Feb 13, 2022

(Hi @jerch, this is my new account.)

WB! And no worries, I have not forgotten about the original topic (yet), but I still have other things to do before I can get back to it. I might even introduce an optional worker setting to work around the issue found above (so people can decide on their own whether to use worker-based or main-thread-based decoding, depending on their integration needs).

jerch (Owner) commented May 3, 2022

Dealt with in several PRs.

jerch (Owner) commented May 3, 2022

@AutumnMeowMeow Just released the first npm version 😄

jerch mentioned this issue Sep 17, 2022