# 🦙 ollama-logseq plugin
A plugin to integrate [ollama](https://github.com/jmorganca/ollama) with [logseq](https://github.com/logseq/logseq).
# Get Started
- First, set up [ollama](https://github.com/jmorganca/ollama) — see their GitHub repo for installation instructions
- That's it! Once ollama is set up, the plugin should work with no further configuration
> Note: If you are on Windows, make sure WSL is running in the background for the model to work properly
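
As a rough sketch, the setup usually boils down to pulling a model and making sure the ollama server is reachable. The model name below is just an example, and the host/port shown is ollama's documented default (which is assumed to match the plugin's default host setting):

```shell
# Download a model so ollama has something to serve (pick any model you like)
ollama pull llama2

# Start the ollama server if it isn't already running
# (the macOS desktop app starts it automatically)
ollama serve

# Verify the server is reachable on ollama's default host/port
curl http://localhost:11434
```

If the `curl` call responds, the plugin should be able to talk to the model.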
# Features
- The plugin currently has 6 commands:
  - Ask AI -> prompt the AI freely, without any context
  - Ask AI with context -> the same as Ask AI, but the model also receives the content of the current page
  - Summarize Page -> summarizes the whole page
  - Summarize Block
  - Create a flash card
  - Divide a todo task into subtasks
- Respects theming
- Context menu commands:
  - Summarize Block
  - Make a flash card
  - Divide task into subtasks
  - Prompt from block
  - Expand block
- A slash command via /ollama
- Button in the toolbar
- Settings for changing the model's host, the model itself, and the shortcut that opens the plugin's command palette
# Demo



# Contribution
If you have any feature suggestions, feel free to open an issue.
> If this plugin helps you, I'd really appreciate your support. You can [buy me a coffee here](https://www.buymeacoffee.com/omagdy).