
bug: Code rendering seems broken #4735

Open
intuity-hans opened this issue Feb 25, 2025 · 7 comments
Labels
/desktop, type: bug (Something isn't working)

Comments

@intuity-hans

Jan version

0.5.15

Describe the Bug

Rendering of code seems to be broken. It used to work really well, but code blocks are no longer displayed and their tags even appear escaped.

I have already reinstalled Jan, performed a factory reset, and deleted the user folder.

Tried with:
claude-3-5-haiku-latest
claude-3-5-sonnet-latest
claude-3-7-sonnet-latest

I am running Jan on macOS 15.3.1 on a MacBook Pro with an M3 Pro chip.

[Screenshot attached]

Steps to Reproduce

  1. Download latest version
  2. Prompt "write a basic react component" (an example of the expected output is sketched below)

Not really sure what else to do
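
For reference, this is roughly the kind of output the prompt produces; the component itself is only an illustration, the point is that the fenced code block around it no longer renders:

```tsx
// Illustrative only: a typical answer to "write a basic react component".
// The bug is about how Jan renders the surrounding markdown code fence, not this code.
import React from "react";

type GreetingProps = {
  name: string;
};

// Minimal functional component with a single prop.
export function Greeting({ name }: GreetingProps) {
  return (
    <div>
      <h1>Hello, {name}!</h1>
      <p>This is a basic React component.</p>
    </div>
  );
}

export default Greeting;
```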

Screenshots / Logs

~/jan/logs/app.logs
No such file or directory

What is your OS?

  • MacOS
  • Windows
  • Linux
intuity-hans added the type: bug label on Feb 25, 2025
github-project-automation bot moved this to Investigating in Menlo on Feb 25, 2025
@rk-7474

rk-7474 commented Feb 26, 2025

I think it's a response streaming problem. Turning off streaming in the model inference settings resolves the problem.
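
For anyone wondering what that toggle changes under the hood: conceptually it maps to the `stream` field of the chat-completion request. A rough sketch below (the endpoint, key handling, and response shape are placeholder assumptions for illustration, not Jan's actual code):

```ts
// Rough sketch: what "stream off" means at the request level. Endpoint, key
// handling, and response shape are placeholder assumptions, not Jan's code.
async function askWithoutStreaming(prompt: string): Promise<string> {
  const response = await fetch("https://api.example.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.API_KEY}`,
    },
    body: JSON.stringify({
      model: "claude-3-5-sonnet-latest",
      messages: [{ role: "user", content: prompt }],
      // With stream: false the reply arrives as one complete message instead of
      // incremental chunks, so the markdown code fences reach the renderer intact.
      stream: false,
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}
```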

@pedroscosta

+1 I'm having the same problem

@adfrisealach

+1 for this problem. Turning off streaming does fix the formatting issues until the bug is resolved.

@leanton

leanton commented Mar 12, 2025

+1, I have the same problem. How do I turn off streaming in the model inference settings for now?

@intuity-hans (Author)

@leanton you can do that in the inference settings. However, this setting sometimes seems to get lost when creating a new conversation.

[Screenshot attached]

@sherloach

+1 I'm having the same problem

@martisaw

+1, same problem on Windows with 0.5.15 -> disabling streaming does the trick.
