Discord Ollama Integration

Ollama is an AI model management tool that lets users install and run custom large language models locally. The goal of this project is to create a Discord bot that uses Ollama so you can chat with a local model right on Discord!

Ollama Setup

  • Go to Ollama's Linux download page and run the simple curl command they provide. The command should be curl https://ollama.ai/install.sh | sh.
  • Now run the following commands in separate terminals to test out how it works!
    • In terminal 1 -> ollama serve to set up Ollama
    • In terminal 2 -> ollama run [model name], for example ollama run llama2
      • The models can vary since you can create your own model. You can also browse Ollama's library of models.
    • This can also be done in WSL on Windows machines.
  • You can now interact with the model you just ran (it might take a second to start up); a small request sketch follows this list.
    • Response time varies with processing power!
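
Under the hood, ollama serve exposes a local HTTP API that the Discord bot builds on. The following is a minimal TypeScript sketch of a single request to it, assuming the default port 11434, a pulled llama2 model, and NodeJS 18+ for the built-in fetch; the model name and prompt are placeholders.

    // Minimal sketch: query a locally running Ollama instance over its HTTP API.
    // Assumes `ollama serve` is listening on the default port 11434 and that
    // the llama2 model has already been pulled/run.
    const response = await fetch('http://localhost:11434/api/generate', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
            model: 'llama2',                  // placeholder model name
            prompt: 'Why is the sky blue?',   // placeholder prompt
            stream: false                     // one JSON object instead of a token stream
        })
    })

    const data = await response.json()
    console.log(data.response)                // the model's reply text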

To Run

  • Clone this repo using git clone https://github.com/kevinthedang/discord-ollama.git or just use GitHub Desktop.
  • Run npm install to install the npm packages.
  • You will need a .env file in the root of the project directory containing the bot's token (see the sketch after this list).
    • For example, CLIENT_TOKEN = [Bot Token]
  • Now, you can run the bot with npm run start, which will build and run the compiled TypeScript and run the setup for Ollama.
    • IMPORTANT: This must be run in the WSL/Linux instance to work properly! Using Command Prompt/PowerShell/Git Bash/etc. will not work on Windows (at least in my experience).
    • Refer to the resources below for which NodeJS version to use.
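
For reference, here is a minimal TypeScript sketch of how CLIENT_TOKEN can be read from the .env file and used to log a discord.js client in. It only illustrates the configuration step, not the project's actual entry point, and the intents shown are assumptions.

    // Minimal sketch, not the project's actual entry point: load CLIENT_TOKEN
    // from the .env file and use it to log in a discord.js client.
    import 'dotenv/config'                    // reads .env into process.env
    import { Client, GatewayIntentBits } from 'discord.js'

    const client = new Client({
        intents: [
            GatewayIntentBits.Guilds,         // assumed intents for a chat bot
            GatewayIntentBits.GuildMessages,
            GatewayIntentBits.MessageContent
        ]
    })

    client.once('ready', () => console.log(`Logged in as ${client.user?.tag}`))

    client.login(process.env.CLIENT_TOKEN)    // CLIENT_TOKEN from the .env file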

Resources

  • NodeJS
    • This project uses v20.10.0 (npm 10.2.5). Consider using nvm for multiple NodeJS versions.
      • To run dev in ts-node, using v18.18.2 is recommended. CAUTION: v18.19.0 or lts/hydrogen will not run properly.
      • To run dev with tsx, you can use v20.10.0 or earlier.
    • This project supports any NodeJS version above 16.x.x, since it only uses ESModules.
  • Ollama
  • Discord Developer Portal
  • Discord.js Docs

Acknowledgement

discord-ollama © 2023 by Kevin Dang is licensed under CC BY-NC 4.0
