
ollama+discord

Discord Ollama Integration

Ollama as your Discord AI Assistant


About/Goals

Ollama is an AI model management tool that allows users to install and use custom large language models locally.
The project aims to:

  • Create a Discord bot that utilizes Ollama to chat with users!
    • User Preferences on Chat
    • Message Persistence on Channels and Threads
      • Threads
      • Channels
    • Containerization with Docker
    • Slash Commands Compatible
    • Generated Token Length Handling for responses over 2000 characters
      • Token Length Handling of any message size
    • External WebUI Integration
    • Administrator Role Compatible
  • Allow others to create their own models personalized for their own servers!
    • Documentation on creating your own LLM
    • Documentation on web scraping and cleaning
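One of the goals above is handling generated replies longer than Discord's 2000-character message limit. A minimal sketch of how a long reply could be split into sendable chunks (a hypothetical helper for illustration, not the project's actual implementation):

```typescript
// Discord rejects messages longer than 2000 characters.
const DISCORD_MAX_LENGTH = 2000

// Split a long reply into chunks that each fit within the limit
// (hypothetical helper; the real bot may split on word boundaries instead).
function splitMessage(text: string, maxLength: number = DISCORD_MAX_LENGTH): string[] {
    const chunks: string[] = []
    for (let i = 0; i < text.length; i += maxLength) {
        chunks.push(text.slice(i, i + maxLength))
    }
    return chunks.length > 0 ? chunks : ['']
}
```

Each chunk can then be sent as a separate message, so no content is lost when a model generates a long response.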

Environment Setup

  • Clone this repo with git clone https://github.com/kevinthedang/discord-ollama.git, or use GitHub Desktop.
  • You will need a .env file in the root of the project directory containing the bot's token. A .env.sample is provided as a reference for the required environment variables.
    • For example, CLIENT_TOKEN = [Bot Token]
  • Please refer to the docs for bot setup.
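As a rough sketch, the setup steps above might look like the following (the npm install step is an assumption based on this being a NodeJS project; check .env.sample for the full list of variables):

```shell
git clone https://github.com/kevinthedang/discord-ollama.git
cd discord-ollama
cp .env.sample .env    # then fill in CLIENT_TOKEN with your bot token
npm install
```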

Note

These guides assume you already know how to set up a bot account for Discord. Documentation will be added later.

Resources

  • NodeJS
    • This project uses v20.10.0+ (npm 10.2.5). Consider using nvm to manage multiple NodeJS versions.
      • To run dev with ts-node, v18.18.2 is recommended.
      • To run dev with tsx, you can use v20.10.0 or earlier.
    • This project requires a NodeJS version above 16.x.x, as it only supports ESModules.
  • Ollama
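To illustrate how a bot like this might talk to Ollama, here is a minimal TypeScript sketch using Ollama's local REST API (/api/generate on the default port 11434). The helper names are hypothetical, not the project's actual code:

```typescript
// Request body for Ollama's /api/generate endpoint.
type OllamaRequest = { model: string; prompt: string; stream: boolean }

// Build a non-streaming generate request (hypothetical helper).
function buildOllamaRequest(model: string, prompt: string): OllamaRequest {
    return { model, prompt, stream: false }
}

// Send a prompt to a locally running Ollama server and return the reply text.
// Assumes Ollama's default address; adjust if your server runs elsewhere.
async function askOllama(model: string, prompt: string): Promise<string> {
    const res = await fetch('http://localhost:11434/api/generate', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(buildOllamaRequest(model, prompt)),
    })
    const data = await res.json() as { response: string }
    return data.response
}
```

A Discord message handler could then pass the user's message to askOllama and reply with the result (split into chunks if over 2000 characters).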

Note

For an NVIDIA GPU setup, install the NVIDIA Container Toolkit and runtime, then configure Docker to utilize the NVIDIA driver.
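As a sketch, the GPU configuration described above typically looks like the following, based on NVIDIA's Container Toolkit documentation (install steps and package names vary by distro):

```shell
# Configure Docker to use the NVIDIA runtime
# (after installing the nvidia-container-toolkit package)
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Verify the GPU is visible from inside a container
docker run --rm --gpus all ubuntu nvidia-smi
```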

Caution

v18.X.X or lts/hydrogen will not run properly with npm run dev-mon.

Acknowledgement

discord-ollama © 2023 by Kevin Dang is licensed under CC BY 4.0
