1) Alwrity does web research to provide factual context before generating content.
2) This seems to work, but repeating the same research with different context does not yield good results.
There are valid reasons for this: short, transient memory and context-window limitations.
3) All LLMs are susceptible to loss of context depending on the volume of data, so they need to be supplemented with more memory. The problem is that the context window, i.e. what an LLM can remember within a prompt session, is limited.
4) It would be worthwhile to check whether embeddings with a local vector DB improve utilization of the context we provide to Alwrity.
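To make point 4 concrete, here is a minimal sketch of the retrieval idea: store researched chunks in a tiny local vector store and pull back only the top-k most relevant ones per prompt, rather than stuffing everything into the limited context window. The "embedding" below is a toy bag-of-words vector and `LocalVectorStore` is a hypothetical stand-in; a real setup would use a sentence-embedding model plus a vector DB such as Chroma or FAISS (all assumptions, not part of Alwrity today).

```python
# Sketch only: toy embedding + cosine-similarity retrieval to illustrate
# supplementing an LLM prompt with the most relevant stored context.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": lowercase token counts (stand-in for a real model).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class LocalVectorStore:
    # Hypothetical in-memory stand-in for a local vector DB.
    def __init__(self):
        self.items = []  # list of (vector, chunk)

    def add(self, chunk: str):
        self.items.append((embed(chunk), chunk))

    def top_k(self, query: str, k: int = 2):
        qv = embed(query)
        ranked = sorted(self.items, key=lambda it: cosine(qv, it[0]), reverse=True)
        return [chunk for _, chunk in ranked[:k]]

store = LocalVectorStore()
store.add("SEO keyword research results for the travel niche")
store.add("Competitor blog outline on budget travel tips")
store.add("Unrelated note about office supplies")

# Only the relevant research chunks would be injected into the prompt.
context = store.top_k("write a blog post on budget travel", k=2)
print(context)
```

The win is that per-prompt context stays small and relevant, so the model's short transient memory is spent on useful material instead of everything Alwrity ever researched.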