
Issue search results · repo:NVIDIA/logits-processor-zoo language:Python

5 results

Add a logits processor example that 1. serves the model using the `vllm serve` command and 2. uses the OpenAI client to create responses. Motivation: adding this example helps users who host models with vLLM. The ...
  • maxjeblick
  • Opened 23 days ago
  • #18
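
For context on #18: the request boils down to pairing vLLM's OpenAI-compatible server with the standard `openai` client. Below is a minimal sketch of that pattern; the model name, port, and prompt are placeholder assumptions, and how a processor from this zoo gets attached on the server side is exactly the part the issue asks to document.

```python
# Terminal: start vLLM's OpenAI-compatible server (model name is an example).
#   vllm serve meta-llama/Llama-3.1-8B-Instruct --port 8000

from openai import OpenAI

# vLLM's server speaks the OpenAI API; the api_key value is arbitrary for a local server.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "Explain logits processors in one sentence."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```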

First of all, thank you very much for this package. When I try to run a vLLM logits processor, it does not work for batched inference (it only works for one of the prompts, not all of them). Here is the ...
  • alonsosilvaallende
  • 3 comments
  • Opened on Apr 25
  • #14
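
The batching problem in #14 is triggered when a single processor instance is shared by all prompts in one batched `generate()` call. A sketch of that calling pattern follows, assuming (per the project README) that the vLLM wrappers are importable from `logits_processor_zoo.vllm`, that `GenLengthLogitsProcessor` takes a tokenizer plus a boost factor, and that the installed vLLM still accepts `SamplingParams(logits_processors=...)`.

```python
from vllm import LLM, SamplingParams
from logits_processor_zoo.vllm import GenLengthLogitsProcessor  # class name/signature assumed from the README

llm = LLM(model="Qwen/Qwen2.5-1.5B-Instruct")  # example model

# One processor instance is reused for every prompt in the batch;
# any per-request state kept inside it is what breaks batched decoding.
processor = GenLengthLogitsProcessor(llm.get_tokenizer(), boost_factor=1.0)
params = SamplingParams(max_tokens=64, logits_processors=[processor])

prompts = ["Tell me a joke.", "Summarize the plot of Hamlet in one sentence."]
outputs = llm.generate(prompts, params)  # the issue reports this only takes effect for one prompt
for out in outputs:
    print(out.outputs[0].text)
```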

Would it be possible to answer with `<reasoning> ... </reasoning> <answer> ... </answer>`, where the ... are?
  • NickyDark1
  • 2 comments
  • Opened on Feb 3
  • #7
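
Issue #7 reads as a request for a tag-structured answer format. One building block the zoo appears to offer for this is `ForceLastPhraseLogitsProcessor`, which pushes a fixed phrase into the generation before it ends; the class name and argument order below are assumed from the project README, and the model is an example.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, LogitsProcessorList
from logits_processor_zoo.transformers import ForceLastPhraseLogitsProcessor  # name/signature assumed from the README

model_name = "Qwen/Qwen2.5-1.5B-Instruct"  # example model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16)

# Force the phrase "\n<answer>" to appear before generation ends;
# the <reasoning> part is left to a prompt instruction.
force_answer = ForceLastPhraseLogitsProcessor("\n<answer>", tokenizer)

prompt = "Think inside <reasoning>...</reasoning> tags, then answer: what is 17 * 3?"
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=128,
                     logits_processor=LogitsProcessorList([force_answer]))
print(tokenizer.decode(out[0], skip_special_tokens=True))
```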

I see how to add it into vLLM when calling it from Python, but is there a way to do it when loading the engine to serve via API?
  • accupham
  • 3 comments
  • Opened on Jan 31
  • #6
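
On #6: recent vLLM versions document a server-side route for this, a `--logits-processor-pattern` flag that whitelists processor classes by qualified name, after which a request can pass a `logits_processors` entry in `extra_body`. The flag, the payload shape, and whether a zoo processor (which typically wants a tokenizer at construction time) can be built from these fields are all assumptions to verify against the vLLM docs for the installed version; a rough sketch:

```python
# Server side (flag and pattern are assumptions; check `vllm serve --help` for your version):
#   vllm serve Qwen/Qwen2.5-1.5B-Instruct \
#       --logits-processor-pattern "logits_processor_zoo\.vllm\..*"

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-1.5B-Instruct",
    messages=[{"role": "user", "content": "Which is prime? A) 4 B) 7 C) 9"}],
    extra_body={
        # Qualified name plus constructor kwargs, resolved on the server; the field
        # names and kwargs below are illustrative, not a tested configuration.
        "logits_processors": [
            {
                "qualname": "logits_processor_zoo.vllm.MultipleChoiceLogitsProcessor",
                "kwargs": {"choices": ["A", "B", "C"]},
            }
        ]
    },
)
print(response.choices[0].message.content)
```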

I encountered an issue when trying to integrate logits-processor-zoo into my project. The library requires accelerate<0.27.0,>=0.26.1, but this dependency conflicts with the versions of transformers that ...
  • seroetr
  • 1 comment
  • Opened on Dec 10, 2024
  • #3