About/Goals
Ollama is an AI model management tool that allows users to install and use custom large language models locally.
The project aims to:
- Create a Discord bot that will utilize Ollama to chat with users!
- User Preferences on Chat
- Message Persistence on Channels and Threads
  - Threads
  - Channels
- Containerization with Docker
- Slash Command Compatibility
- Generated Token Length Handling for >2000
  - Token Length Handling of any message size
- External WebUI Integration
- Administrator Role Compatibility
- Allow others to create their own models personalized for their own servers!
  - Documentation on creating your own LLM
  - Documentation on web scraping and cleaning
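Discord caps a single message at 2000 characters, which is what the ">2000" goal above addresses: model output longer than the limit must be split across several messages. A minimal sketch of that idea (the function name and exact splitting strategy are illustrative, not the project's actual implementation):

```typescript
// Hypothetical helper: split long generated text into Discord-sized chunks.
// Discord rejects messages over 2000 characters, so each chunk stays at or
// under that limit. A real implementation might prefer splitting on newlines
// or sentence boundaries instead of hard character offsets.
export function chunkMessage(content: string, limit: number = 2000): string[] {
  const chunks: string[] = []
  for (let i = 0; i < content.length; i += limit) {
    chunks.push(content.slice(i, i + limit))
  }
  return chunks.length > 0 ? chunks : ['']
}
```

Each chunk can then be sent as a separate follow-up message in the same channel or thread.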
Environment Setup
- Clone this repo using `git clone https://github.com/kevinthedang/discord-ollama.git` or just use GitHub Desktop to clone the repo.
- You will need a `.env` file in the root of the project directory with the bot's token. A `.env.sample` is provided as a reference for the required environment variables.
  - For example, `CLIENT_TOKEN = [Bot Token]`
- Please refer to the docs for bot setup.
  - Local Machine Setup
  - Docker Setup for Servers and Local Machines
    - Local use is not recommended.
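The `.env` file is a plain `KEY = value` list. As an illustration only (the project's actual loader is likely a library such as dotenv, not custom code), a minimal parser for that format might look like:

```typescript
// Illustrative sketch: parse simple `KEY = value` lines into a map.
// Assumes one assignment per line; comments and quoted values are not handled.
export function parseEnv(text: string): Record<string, string> {
  const env: Record<string, string> = {}
  for (const line of text.split('\n')) {
    const eq = line.indexOf('=')
    if (eq === -1) continue // skip lines without an assignment
    env[line.slice(0, eq).trim()] = line.slice(eq + 1).trim()
  }
  return env
}
```

With the sample line above, `parseEnv('CLIENT_TOKEN = [Bot Token]')` yields a map with the `CLIENT_TOKEN` key, which the bot can then use to log in.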
Note
These guides assume you already know how to set up a bot account for Discord. Documentation will be added later.
Resources
- NodeJS
- This project uses `v20.10.0+` (npm `10.2.5`). Consider using nvm for multiple NodeJS versions.
  - To run dev in `ts-node`, using `v18.18.2` is recommended.
  - To run dev with `tsx`, you can use `v20.10.0` or earlier.
- This project supports any NodeJS version above `16.x.x` to only allow ESModules.
- This project uses
- Ollama
Note
For Nvidia GPU setup, install the `nvidia container toolkit/runtime`, then configure it with Docker to utilize the Nvidia driver.
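Once Ollama is running, the bot can talk to it over its local HTTP API (port `11434` by default). A hedged sketch of the request body for Ollama's documented `/api/chat` endpoint; the helper name is hypothetical and how this project actually calls Ollama may differ:

```typescript
// Shape of a chat message as accepted by Ollama's /api/chat endpoint.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant'
  content: string
}

// Build the JSON payload for a non-streaming chat request.
// Field names (model, messages, stream) follow Ollama's REST API docs.
export function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false }
}
```

The resulting object would be POSTed as JSON to `http://localhost:11434/api/chat`, e.g. with `fetch`.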
Caution
`v18.X.X` or `lts/hydrogen` will not run properly for `npm run dev-mon`.
Acknowledgement
discord-ollama © 2023 by Kevin Dang is licensed under CC BY 4.0