About/Goals
Ollama is an AI model management tool that allows users to install and use custom large language models locally.
The project aims to:
- Create a Discord bot that utilizes Ollama to chat with users! (A rough sketch of this chat flow appears after this list.)
- User Preferences on Chat
- Message Persistence on Channels and Threads
- Threads
- Channels
- Containerization with Docker
- Slash Commands Compatible
- Generated Token Length Handling for responses over 2000 characters
- Token Length Handling of any message size
- User vs. Server Preferences
- Redis Caching
- Administrator Role Compatible
- Multi-User Chat Generation (multiple users chatting at the same time) - this has been built into Ollama since v0.2.1
- Automatic and Manual model pulling through the Discord client
- Allow others to create their own models personalized for their own servers!
- Documentation on creating your own LLM
- Documentation on web scraping and cleaning
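To make the chat and token-length goals above more concrete, here is a minimal sketch (not this project's actual code) of asking a locally running Ollama instance for a reply and splitting it into chunks that fit under Discord's 2000-character message limit. The model name, endpoint constant, and helper names are illustrative assumptions.

```typescript
// Illustrative sketch only (not this project's implementation):
// chat with a local Ollama instance and split long replies so each piece
// fits under Discord's 2000-character message limit.
// Requires Node 18+ for the global fetch API.

const OLLAMA_URL = 'http://localhost:11434'   // Ollama's default local endpoint
const DISCORD_MESSAGE_LIMIT = 2000

interface ChatMessage {
    role: 'system' | 'user' | 'assistant'
    content: string
}

// Request a non-streamed chat completion from a local model (model name is an assumption).
async function askOllama(messages: ChatMessage[], model = 'llama3'): Promise<string> {
    const response = await fetch(`${OLLAMA_URL}/api/chat`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ model, messages, stream: false })
    })
    const data = await response.json() as { message: { content: string } }
    return data.message.content
}

// Split a long reply into pieces that each fit in a single Discord message.
function splitForDiscord(text: string, limit = DISCORD_MESSAGE_LIMIT): string[] {
    const chunks: string[] = []
    for (let i = 0; i < text.length; i += limit) {
        chunks.push(text.slice(i, i + limit))
    }
    return chunks
}
```

In a discord.js handler, each chunk could then be sent as its own message.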
Environment Setup
- Clone this repo using `git clone https://github.com/kevinthedang/discord-ollama.git`, or just use GitHub Desktop to clone the repo.
- You will need a `.env` file in the root of the project directory with the bot's token. A `.env.sample` is provided as a reference for the required environment variables (a minimal sketch of loading this token appears after this list).
  - For example, `CLIENT_TOKEN = [Bot Token]`
- Please refer to the docs for bot setup.
- Creating a Discord App
- Local Machine Setup
- Docker Setup for Servers and Local Machines
- Nvidia is recommended for now, but support for other GPUs should be in development.
- Local use is not recommended.
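As a minimal sketch of the `.env` setup above (assuming `dotenv` and `discord.js` v14 are installed; this is not the project's actual entry point), the token from `CLIENT_TOKEN` can be loaded and used to log a client in:

```typescript
// Minimal sketch: load CLIENT_TOKEN from .env and log a discord.js client in.
// Assumes dotenv and discord.js v14 are installed; not this project's actual entry point.
import 'dotenv/config'                              // reads .env into process.env
import { Client, GatewayIntentBits } from 'discord.js'

const client = new Client({
    intents: [
        GatewayIntentBits.Guilds,
        GatewayIntentBits.GuildMessages,
        GatewayIntentBits.MessageContent            // needed to read message text
    ]
})

client.once('ready', () => {
    console.log(`Logged in as ${client.user?.tag}`)
})

client.login(process.env.CLIENT_TOKEN)              // token from the .env file
```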
Resources
- NodeJS
  - This project runs on `lts/hydrogen`.
  - This project supports any NodeJS version above `16.x.x` to only allow ES Modules.
- Ollama
- Discord.js Docs
- Setting up Docker (Ubuntu 20.04)
Acknowledgement
discord-ollama © 2023 by Kevin Dang is licensed under CC BY 4.0
