About/Goals
Ollama is an AI model management tool that allows users to install and use custom large language models locally.
The project aims to:
- Create a Discord bot that utilizes Ollama to chat with users!
- User Preferences on Chat
- Message Persistence on Channels and Threads
  - Threads
  - Channels
- Containerization with Docker
- Slash Commands Compatible
- Generated Token Length Handling for responses >2000 characters (see the sketch after this list)
- Token Length Handling of any message size
- User vs. Server Preferences
- Redis Caching
- Administrator Role Compatible
- Multi-User Chat Generation (multiple users chatting at the same time) - this was built in from Ollama v0.2.1+
- Automatic and Manual model pulling through the Discord client
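Discord caps a single message at 2,000 characters, so generated replies longer than that must be split before sending. Below is a minimal sketch of that idea; the helper name and constant are illustrative, not the bot's actual implementation.

```ts
// Split a generated reply into pieces that fit under Discord's
// 2,000-character message limit (hypothetical helper, for illustration).
const DISCORD_MESSAGE_LIMIT = 2000

export function splitReply(reply: string, limit: number = DISCORD_MESSAGE_LIMIT): string[] {
    const chunks: string[] = []
    for (let start = 0; start < reply.length; start += limit) {
        chunks.push(reply.slice(start, start + limit))
    }
    return chunks
}

// Usage: send each chunk as its own message, e.g.
// for (const chunk of splitReply(longOllamaResponse)) await channel.send(chunk)
```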
Further, Ollama provides the functionality to utilize custom models or provide context for the top-layer of any model available through the Ollama model library.
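For reference, Ollama exposes a local REST API (port 11434 by default) that a bot can call with any model pulled from the Ollama model library. The sketch below uses a plain fetch call to the /api/chat endpoint; the model name and host are placeholder assumptions, not this project's configuration.

```ts
// Minimal sketch: ask a locally running Ollama server for a chat completion.
// Assumes Ollama is listening on its default port and `llama3` has been pulled.
async function askOllama(prompt: string): Promise<string> {
    const response = await fetch('http://localhost:11434/api/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
            model: 'llama3',
            messages: [{ role: 'user', content: prompt }],
            stream: false
        })
    })
    const data = await response.json()
    return data.message.content
}
```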
Documentation
These are guides to the features and capabilities of this app.
Environment Setup
- Clone this repo using `git clone https://github.com/kevinthedang/discord-ollama.git` or just use GitHub Desktop to clone the repo.
- You will need a `.env` file in the root of the project directory with the bot's token. A `.env.sample` is provided as a reference for the required environment variables (see the sketch after this section).
  - For example, `CLIENT_TOKEN = [Bot Token]`
- Please refer to the docs for bot setup.
- Creating a Discord App
- Local Machine Setup
- Docker Setup for Servers and Local Machines
- Nvidia GPUs are recommended for now; support for other GPUs is in development.
- Local use is not recommended.
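To illustrate how the `CLIENT_TOKEN` from the `.env` file is used, here is a minimal sketch, assuming the environment is loaded with `dotenv` and only the `Guilds` gateway intent; the actual bot requests more intents and registers commands and event handlers on startup.

```ts
import 'dotenv/config'                          // loads CLIENT_TOKEN from .env (assumes dotenv)
import { Client, GatewayIntentBits } from 'discord.js'

// Minimal client: the real bot also wires up slash commands and event handlers.
const client = new Client({ intents: [GatewayIntentBits.Guilds] })

client.once('ready', () => console.log(`Logged in as ${client.user?.tag}`))

client.login(process.env.CLIENT_TOKEN)          // token from the .env file
```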
Resources
- NodeJS
  - This project runs on `lts/jod` and above.
  - This project requires npm version `10.9.0` or above.
- Ollama
- Redis
- Discord.js Docs
- Setting up Docker (Ubuntu 20.04)
Acknowledgement
discord-ollama © 2023 by Kevin Dang is licensed under CC BY 4.0
