| field | value | date |
|---|---|---|
| author | omagdy7 <omar.professional8777@gmail.com> | 2024-03-10 13:54:09 +0200 |
| committer | omagdy7 <omar.professional8777@gmail.com> | 2024-03-10 13:54:09 +0200 |
| commit | 2ef25a3b0ed6992a3ff4dcd06898196f927e0899 (patch) | |
| tree | 682548cd57a1afde9ee00853d9619175e48c2a7a /README.md | |
| parent | e24c87f3b3a8fde7bb4884e443e8bc1b7e9cb806 (diff) | |
| download | ollama-logseq-2ef25a3b0ed6992a3ff4dcd06898196f927e0899.tar.xz, ollama-logseq-2ef25a3b0ed6992a3ff4dcd06898196f927e0899.zip | |
Removed the note on running WSL for ollama to work on Windows (as ollama now supports Windows natively)
Diffstat (limited to 'README.md')
| mode | file | lines changed |
|---|---|---|
| -rw-r--r-- | README.md | 2 |

1 file changed, 0 insertions, 2 deletions
```diff
@@ -6,8 +6,6 @@ A plugin to integrate [ollama](https://github.com/jmorganca/ollama) with [logseq
 - First you will need to setup [ollama](https://github.com/jmorganca/ollama) you can check their github repo for instructions on how to setup ollama
 - That's it once you setup ollama you should be able to use the plugin with no problem
 
-> Note: If you are on windows make sure to open WSL in the background for the model to work properly
-
 # Features
 - The plugin currently has 6 commands
 - Ask Ai -> which is a prompt the AI freely without any context
```
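For context on the removed note: with a native Windows build of ollama, the plugin only needs to reach the local ollama HTTP server, not a WSL instance. The sketch below is illustrative and not taken from this repository's code; it assumes ollama's default local endpoint at `http://localhost:11434`, and the `promptOllama` helper and the `llama2` model name are placeholders.

```typescript
// Minimal sketch (assumptions noted above): calling a locally running ollama
// server from plugin code, the same way regardless of Windows, macOS, or Linux.
// Uses ollama's documented REST endpoint POST /api/generate.

interface OllamaGenerateResponse {
  response: string;
  done: boolean;
}

async function promptOllama(prompt: string, model = "llama2"): Promise<string> {
  // ollama listens on localhost:11434 by default; no WSL layer is involved.
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  if (!res.ok) {
    throw new Error(`ollama request failed: ${res.status} ${res.statusText}`);
  }
  const data = (await res.json()) as OllamaGenerateResponse;
  return data.response;
}

// Example usage, e.g. behind an "Ask Ai"-style command:
// promptOllama("Summarize this page").then(console.log);
```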
