From 1aa94cb7349de0b86f0759e6da8d7f38dd7f5fc3 Mon Sep 17 00:00:00 2001
From: Zeeshan Lakhani
Date: Wed, 1 May 2024 11:57:47 -0400
Subject: [PATCH] Update example-llm-workflows/README.md

Co-authored-by: Brian Ginsburg <7957636+bgins@users.noreply.github.com>
Signed-off-by: Zeeshan Lakhani
---
 example-llm-workflows/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/example-llm-workflows/README.md b/example-llm-workflows/README.md
index 10429ac0..de9e856b 100644
--- a/example-llm-workflows/README.md
+++ b/example-llm-workflows/README.md
@@ -323,7 +323,7 @@ Here's an example response provided by the model from the executed workflow:
 curl localhost:3000/run --json @map_reduce.json
 ```
 
-To more applicability encode the [MapReduce][map-reduce] example from the [Crowdforge][crowdforge] paper, I implemented a `prompt_chain` Wasm/Wasi function registered on the Host that takes in a system prompt (e.g. "You are journalist writing about cities."), an input (e.g. an ongoing article), a map step prompt with a `{{text}}` placeholder that is filled in, a reduce step, which folds over (combines) the generated text(s) from the map step, and then the optional LLaMA model stored as a [`gguf`][gguf]. If the optional model path is not provided, the Host will fall back to the default `Meta-Llama-3-8B-Instruct.Q4_0.gguf` model.
+To more applicability encode the [MapReduce][map-reduce] example from the [Crowdforge][crowdforge] paper, I implemented a `prompt_chain` Wasm/WASI function registered on the Host that takes in a system prompt (e.g. "You are journalist writing about cities."), an input (e.g. an ongoing article), a map step prompt with a `{{text}}` placeholder that is filled in, a reduce step, which folds over (combines) the generated text(s) from the map step, and then the optional LLaMA model stored as a [`gguf`][gguf]. If the optional model path is not provided, the Host will fall back to the default `Meta-Llama-3-8B-Instruct.Q4_0.gguf` model.
 
 ```rust
 async fn prompt_chain(
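
The paragraph changed by this patch describes the shape of the host-registered `prompt_chain` function: a system prompt, an input, a map step prompt carrying a `{{text}}` placeholder, a reduce step that folds the map-step generations together, and an optional model path that defaults to `Meta-Llama-3-8B-Instruct.Q4_0.gguf`. The following is a minimal, synchronous Rust sketch of that map/reduce prompt-chaining shape, not the repository's implementation: `run_model` is a hypothetical stand-in for the gguf-backed LLaMA call, the chunking-by-paragraph strategy is assumed, and the real Host function is async and registered over Wasm/WASI.

```rust
// Hypothetical sketch of the map/reduce prompt chain described in the patch
// above; `run_model` stands in for the gguf-backed LLaMA invocation, and the
// real host function is async.
fn run_model(model: &str, system_prompt: &str, prompt: &str) -> String {
    // Stand-in: echo the inputs instead of running inference.
    format!("[{model} | {system_prompt}] {prompt}")
}

fn prompt_chain(
    system_prompt: &str,
    input: &str,
    map_prompt: &str,    // must contain a `{{text}}` placeholder
    reduce_prompt: &str, // combines the map-step generations
    model_path: Option<&str>,
) -> String {
    // Fall back to the default model when no path is provided.
    let model = model_path.unwrap_or("Meta-Llama-3-8B-Instruct.Q4_0.gguf");

    // Map step: fill `{{text}}` once per chunk of the input
    // (paragraph-splitting is an assumption for this sketch).
    let mapped: Vec<String> = input
        .split("\n\n")
        .map(|chunk| run_model(model, system_prompt, &map_prompt.replace("{{text}}", chunk)))
        .collect();

    // Reduce step: fold the generated texts into a single final prompt.
    run_model(
        model,
        system_prompt,
        &format!("{reduce_prompt}\n{}", mapped.join("\n")),
    )
}

fn main() {
    let article = "Pittsburgh has 446 bridges.\n\nIts rivers shaped the city's growth.";
    let summary = prompt_chain(
        "You are a journalist writing about cities.",
        article,
        "Summarize: {{text}}",
        "Combine the summaries into one paragraph:",
        None, // use the default gguf model
    );
    println!("{summary}");
}
```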