## About/Goals
Ollama is an AI model management tool that allows users to install and use custom large language models locally.
The project aims to:
- Create a Discord bot that will utilize Ollama to chat with users!
- User Preferences on Chat
- Message Persistence on Channels and Threads
- Containerization with Docker
- Slash Commands Compatible
- Generated Token Length Handling for >2000 or >6000 characters
- Token Length Handling of any message size
- External WebUI Integration
- Administrator Role Compatible
- Allow others to create their own models personalized for their own servers!
- Documentation on creating your own LLM
- Documentation on web scraping and cleaning
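Discord caps a single message at 2,000 characters, so generated replies above that limit must be split before sending. A minimal sketch of the kind of chunking the token-length goals imply (the function name `splitMessage` is illustrative, not the project's actual API):

```typescript
// Split a long reply into Discord-safe chunks (Discord's hard limit is 2000 chars).
// Prefer breaking on a newline so paragraphs stay readable; hard-cut otherwise.
export function splitMessage(text: string, limit = 2000): string[] {
    if (text.length <= limit) return [text]
    const chunks: string[] = []
    let remaining = text
    while (remaining.length > limit) {
        // try to break at the last newline before the limit
        let cut = remaining.lastIndexOf('\n', limit)
        if (cut <= 0) cut = limit // no newline found: hard cut
        chunks.push(remaining.slice(0, cut))
        remaining = remaining.slice(cut).replace(/^\n/, '')
    }
    if (remaining.length > 0) chunks.push(remaining)
    return chunks
}
```

Each chunk can then be sent as a separate message (or as follow-up messages in a thread).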
## Environment Setup
- Clone this repo using `git clone https://github.com/kevinthedang/discord-ollama.git`, or just use GitHub Desktop to clone the repo.
- You will need a `.env` file in the root of the project directory with the bot's token. A `.env.sample` is provided as a reference for the required environment variables.
  - For example, `CLIENT_TOKEN = [Bot Token]`
- Please refer to the docs for bot setup. NOTE: These guides assume you already know how to set up a bot account for Discord.
  - Local Machine Setup
  - Docker Setup for Servers and Local Machines
    - Local use is not recommended.
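The `.env` file holds plain `KEY = value` lines; a minimal sketch of how the bot could read `CLIENT_TOKEN` at startup (the project likely uses a library such as `dotenv`; this hand-rolled parser is only illustrative):

```typescript
import { readFileSync } from 'node:fs'

// Parse simple `KEY = value` lines from .env-style contents into a map.
// Blank lines and `#` comments are skipped; lines without `=` are ignored.
export function parseEnv(contents: string): Record<string, string> {
    const env: Record<string, string> = {}
    for (const line of contents.split('\n')) {
        const trimmed = line.trim()
        if (!trimmed || trimmed.startsWith('#')) continue
        const eq = trimmed.indexOf('=')
        if (eq === -1) continue
        env[trimmed.slice(0, eq).trim()] = trimmed.slice(eq + 1).trim()
    }
    return env
}

// Usage at startup (path is illustrative):
// const token = parseEnv(readFileSync('.env', 'utf8'))['CLIENT_TOKEN']
```

The parsed token is then passed to the Discord client's login call.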
## Resources
- NodeJS
  - This project uses `v20.10.0+` (npm `10.2.5`). Consider using nvm for multiple NodeJS versions.
    - To run dev in `ts-node`, using `v18.18.2` is recommended. CAUTION: `v18.19.0` or `lts/hydrogen` will not run properly.
    - To run dev with `tsx`, you can use `v20.10.0` or earlier.
  - This project supports any NodeJS version above `16.x.x` to only allow ESModules.
- Ollama
- Ollama Docker Image
  - IMPORTANT: For Nvidia GPU setup, install the `nvidia-container-toolkit`/runtime, then configure it with Docker to utilize the Nvidia driver.
- Discord Developer Portal
- Discord.js Docs
- Setting up Docker (Ubuntu 20.04)
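The Nvidia GPU note above can be expressed in Docker Compose using the documented device-reservation syntax (the service name and image tag are illustrative; the project's actual compose file may differ):

```yaml
services:
  ollama:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

This requires the `nvidia-container-toolkit` to be installed and registered with Docker first; without it, the container falls back to CPU inference.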
## Acknowledgement
discord-ollama © 2023 by Kevin Dang is licensed under CC BY 4.0
