Commit
Merge pull request #74 from fractalego/llm-webserving
removing the last rule if executed
fractalego committed Dec 8, 2023
2 parents 4050b55 + 25d05ed commit e10bf56
Showing 2 changed files with 27 additions and 4 deletions.
19 changes: 16 additions & 3 deletions todo.txt
@@ -1,9 +1,22 @@
### TODO

**** make it so the computer does not repeat! reset conversation when the bot repeats itself
* don't use replicas, use a beam decoder where <execute> and <remember> are pushed upwards. (this means no sampling - perhaps there is a better way)
* use sequence_bias in generate() together with epsilon_cutoff
(for example if the <execute> token is not likely its prob should not be increased)

* only one rule at a time!!
* if a rule is executed, it is then consumed
* ALTERNATIVELY increase the number of replicas to 6?

* merge remote and local llm connector. Both should derive from the same class with common functions






/**** make it so the computer does not repeat! reset conversation when the bot repeats itself

/* only one rule at a time!!
/ * if a rule is executed, it is then consumed

* bug: the system kept executing "The bot predicts:"
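The starred TODO item above ("reset conversation when the bot repeats itself") could be handled along these lines; a minimal sketch with made-up names (`Conversation`, `detect_repetition`), using an exact-match check that a real implementation would likely replace with fuzzier matching:

```python
def detect_repetition(history):
    """Return True when the latest bot reply already appeared earlier."""
    return len(history) >= 2 and history[-1] in history[:-1]


class Conversation:
    """Toy conversation buffer that resets itself on repetition."""

    def __init__(self):
        self.history = []

    def add_bot_reply(self, text):
        # Store the reply; if the bot starts repeating itself,
        # wipe the history so the dialogue effectively restarts
        self.history.append(text)
        if detect_repetition(self.history):
            self.history = []
            return True
        return False
```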

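Another TODO proposes using `sequence_bias` in `generate()` together with `epsilon_cutoff`. A conceptual, framework-free sketch of what those two knobs do (the function names here are illustrative; the real Hugging Face implementation operates on logit tensors, not dicts):

```python
def apply_sequence_bias(logprobs, sequence_bias, generated_ids):
    """Add a bias to any candidate token that would complete a biased
    token sequence, given the ids generated so far (conceptual sketch)."""
    biased = dict(logprobs)
    for seq, bias in sequence_bias.items():
        prefix, last = tuple(seq[:-1]), seq[-1]
        # A bias fires when the generated ids end with the sequence prefix;
        # a single-token sequence (empty prefix) always fires.
        n = len(prefix)
        if n == 0 or tuple(generated_ids[-n:]) == prefix:
            if last in biased:
                biased[last] += bias
    return biased


def epsilon_filter(probs, epsilon):
    """Epsilon sampling: drop candidates whose probability is below epsilon,
    so an unlikely token (e.g. <execute>) is never boosted into existence."""
    return {tok: p for tok, p in probs.items() if p >= epsilon}
```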
12 changes: 11 additions & 1 deletion wafl/answerer/dialogue_answerer.py
@@ -69,6 +69,9 @@ async def answer(self, query_text):
continue

if not memories:
if "<execute>" in answer_text:
self._remove_last_rule()

break

facts += "\n" + "\n".join(memories)
@@ -115,7 +118,7 @@ async def _get_relevant_facts(self, query, has_prior_rules):

return facts

-    async def _get_relevant_rules(self, query, max_num_rules=2):
+    async def _get_relevant_rules(self, query, max_num_rules=1):
rules = await self._knowledge.ask_for_rule_backward(
query,
knowledge_name="/",
@@ -182,3 +185,10 @@ async def _run_code(self, to_execute):
result = "unknown"

return result

def _remove_last_rule(self):
"""
remove the last rule from memory if it was executed during the dialogue
"""
self._prior_rules = self._prior_rules[:-1]
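Taken together, the change consumes a rule once it triggers an `<execute>` step. A self-contained sketch of that behaviour (`RuleTracker` and `process_answer` are illustrative names; only `_prior_rules` and `_remove_last_rule` appear in the diff):

```python
class RuleTracker:
    """Stand-in for the answerer's rule bookkeeping (hypothetical class)."""

    def __init__(self, rules):
        self._prior_rules = list(rules)

    def _remove_last_rule(self):
        # Drop the most recently applied rule so it cannot fire again
        self._prior_rules = self._prior_rules[:-1]

    def process_answer(self, answer_text):
        # Mirror of the merged change: a rule whose execution produced
        # an <execute> tag in the answer is consumed after use
        if "<execute>" in answer_text:
            self._remove_last_rule()
        return self._prior_rules
```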
