The LLM Application 0.3 and 0.3.1 have been released.
Version 0.3 is a major new release with many new features and some breaking changes in the configuration and the REST API, reflecting the beta state of the extension. Version 0.3.1 is a bugfix release that addresses problems we noticed shortly after the 0.3 release.
These are the first versions of the extension that include developments from the WAISE (Wiki AI Search Engine) project, which is funded by NGI Search. The aim of the WAISE project is to develop a chatbot that can answer questions based on content indexed by the LLM Application.
The main new feature of these releases is the ability to provide collections of documents that are indexed by the application and passed to the LLM as context for answering questions. This feature requires the Index for the LLM Application extension to be installed. At the moment, the main use case is indexing content outside XWiki, as the aim of the WAISE project is to provide a generic search appliance that can be embedded in any application; features for indexing existing wiki content will follow in the future.

To support using the chat from an external application, a token-based authentication extension has been published together with an embeddable chat UI. This allows external applications to embed a chat powered by the LLM Application while authenticating the user in XWiki, so that authorization checks can be performed. A sketch of this workflow follows below.
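To make the intended workflow more concrete, here is a minimal sketch of how an external application might add a document to a collection and then ask a question over the REST API. This is an illustration only: the endpoint paths, the collection name, the model name, and the token handling are assumptions, not the extension's actual API; consult the extension documentation for the real routes and payloads.

```python
import requests

# Hypothetical values: adjust to your XWiki instance. The endpoint
# paths below are illustrative, not the extension's documented API.
BASE_URL = "https://wiki.example.com/xwiki/rest"
TOKEN = "my-api-token"  # issued by the token-based authentication extension
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# 1. Add an external document to a collection so the application indexes it.
doc = {
    "id": "handbook-intro",
    "title": "Employee Handbook - Introduction",
    "content": "New employees should complete onboarding within two weeks...",
}
resp = requests.put(
    f"{BASE_URL}/wikis/xwiki/collections/handbook/documents/{doc['id']}",
    headers=HEADERS,
    json=doc,
)
resp.raise_for_status()

# 2. Ask a question. The application retrieves matching indexed content
#    and passes it to the LLM as context before generating an answer.
answer = requests.post(
    f"{BASE_URL}/wikis/xwiki/chat/completions",
    headers=HEADERS,
    json={
        "model": "default",
        "messages": [{"role": "user", "content": "How long does onboarding take?"}],
    },
)
answer.raise_for_status()
# Assuming an OpenAI-style response schema for the chat endpoint.
print(answer.json()["choices"][0]["message"]["content"])
```

Because the request carries a token tied to an XWiki user, the application can apply the usual XWiki authorization checks before answering, which is the point of pairing the embeddable chat UI with the token-based authentication extension.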