This repository was archived by the owner on Aug 15, 2025. It is now read-only.

Commit d64fefa: "Rewrite the README with a focus on DO GenAI Platform"

1 parent 593d0c9

File tree: 1 file changed (+63 −140 lines)

README.md

Lines changed: 63 additions & 140 deletions
# Slack AI Chatbot with DigitalOcean GenAI

This Slack chatbot app template provides a customizable solution for integrating AI-powered conversations into your Slack workspace using DigitalOcean's GenAI Platform. Deploy the app on DigitalOcean App Platform for a fully managed experience.

## Features

* **Interact with the bot** by mentioning it in conversations and threads
* **Send direct messages** to the bot for private interactions
* Use the **`/ask-sailor`** command to communicate with the bot in channels where it hasn't been added
* Use the **`/sailor-summary`** command to have Sailor summarize a Slack thread and DM you an AI-generated summary
* **Utilize a custom function** for integration with Workflow Builder to summarize messages
* **Select your preferred AI model** from the app home to customize responses
* **Seamless integration** with the DigitalOcean GenAI Platform
* **Choice of state storage**:
  * **File-based state store** creates a file in `/data` per user to store preferences
  * **Redis state store** for distributed deployments (recommended for App Platform)
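The slash-command behavior above boils down to a small handler. Here is an illustrative sketch (the function and parameter names are hypothetical, not the template's actual listener API), with Bolt-style `ack`/`respond` callables and the AI call passed in so the flow is easy to follow and test:

```python
def handle_ask_sailor(command, ack, respond, generate):
    """Illustrative `/ask-sailor`-style flow: acknowledge the slash command,
    forward the prompt to an AI backend, and post the reply.

    `ack` and `respond` mimic Bolt's callables; `generate` stands in for
    the provider call (a hypothetical seam, not the template's real API).
    """
    ack()  # Slack requires slash commands to be acknowledged within 3 seconds
    prompt = command.get("text", "").strip()
    if not prompt:
        respond("Ask me something, e.g. `/ask-sailor how do I deploy?`")
        return
    respond(generate(prompt))
```

In the real app, the equivalent handler lives under the project's listeners and calls the configured AI provider instead of a passed-in function.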
## Installation

#### Prerequisites

* A Slack workspace where you have permissions to install apps
* Access to the [DigitalOcean GenAI Platform](https://docs.digitalocean.com/products/genai-platform/)

#### Create a Slack App

1. Open [https://api.slack.com/apps/new](https://api.slack.com/apps/new) and choose "From an app manifest"
2. Choose the workspace you want to install the application to
3. Copy the contents of [manifest.json](./manifest.json) into the text box that says `*Paste your manifest code here*` (within the JSON tab) and click *Next*
4. Review the configuration and click *Create*
5. Click *Install to Workspace* and *Allow* on the screen that follows. You'll be redirected to the App Configuration dashboard.

#### Environment Variables

Before running the app, store these environment variables:

1. From your app's configuration page, go to **OAuth & Permissions** and copy the Bot User OAuth Token (`SLACK_BOT_TOKEN`)
2. From **Basic Information**, create an app-level token with the `connections:write` scope (`SLACK_APP_TOKEN`)
3. Get your DigitalOcean GenAI API key from the DigitalOcean dashboard (`GENAI_API_KEY`)
4. Set your GenAI API URL if using a private agent (`GENAI_API_URL`)

```zsh
# Set environment variables
export SLACK_BOT_TOKEN=<your-bot-token>
export SLACK_APP_TOKEN=<your-app-token>
export GENAI_API_KEY=<your-genai-api-key>
export GENAI_API_URL=<your-genai-api-url> # Optional for private agents, e.g. https://agent-<id>.ondigitalocean.app/api/v1
```
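At startup the app needs these values from the environment. A minimal sketch of reading them (the `load_config` helper is hypothetical, shown only to make the required/optional split explicit):

```python
import os

def load_config():
    """Read the Slack and GenAI settings described above.

    SLACK_BOT_TOKEN, SLACK_APP_TOKEN, and GENAI_API_KEY are required, so a
    missing one raises KeyError early; GENAI_API_URL is optional and only
    needed when pointing at a private GenAI agent endpoint.
    """
    return {
        "slack_bot_token": os.environ["SLACK_BOT_TOKEN"],
        "slack_app_token": os.environ["SLACK_APP_TOKEN"],
        "genai_api_key": os.environ["GENAI_API_KEY"],
        "genai_api_url": os.environ.get("GENAI_API_URL"),  # optional
    }
```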

### Local Development

```zsh
# Clone this project
git clone https://github.com/slack-samples/bolt-python-ai-chatbot.git

# Navigate to the project directory
cd bolt-python-ai-chatbot

# Set up a Python virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Start your local server
python3 app.py
```

### Deploy to DigitalOcean App Platform

This application can be deployed to DigitalOcean App Platform:

1. Fork or clone this repository to your GitHub account
2. In the DigitalOcean control panel, go to App Platform and create a new app
3. Connect your GitHub repository
4. Configure the environment variables (`SLACK_BOT_TOKEN`, `SLACK_APP_TOKEN`, `GENAI_API_KEY`, `GENAI_API_URL`)
5. Optionally add a Redis database component for state storage
6. Deploy the application

## Project Structure

### `/ai` - AI Integration

The `/ai` directory contains the core AI functionality:

* `ai_constants.py`: Defines constants used throughout the AI module
* `/providers/__init__.py`: Contains utility functions for handling API responses and available providers

The GenAI provider enables communication with DigitalOcean's GenAI Platform through an OpenAI-compatible API.
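Because the API is OpenAI-compatible, an OpenAI-style client can talk to a GenAI agent simply by swapping the base URL. A hedged sketch of the request shape (the `build_chat_request` helper is illustrative, and exactly how the agent treats the `model` field is an assumption here):

```python
import json

def build_chat_request(base_url, api_key, user_text):
    """Build an OpenAI-style chat completion request for a GenAI agent.

    The endpoint path and header layout follow the standard OpenAI
    chat-completions convention that the platform advertises compatibility
    with; this helper only assembles the request, it does not send it.
    """
    endpoint = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": "n/a",  # placeholder: agents serve their configured model
        "messages": [{"role": "user", "content": user_text}],
    })
    return endpoint, headers, body
```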

### `/state_store` - User Data Storage

For App Platform deployments, we recommend using the Redis state storage option:

```zsh
# Set Redis connection string
export REDIS_URL=<your-redis-connection-string>
```

Example with DigitalOcean Managed Redis:

```zsh
export REDIS_URL=rediss://default:password@hostname.db.ondigitalocean.com:25061
```
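The choice between the two state stores comes down to whether `REDIS_URL` is set. A small sketch of that selection logic (the function name is hypothetical, not the template's actual code):

```python
import os

def pick_state_store(env=None):
    """Return which state-store backend to use, mirroring the docs above:
    Redis when REDIS_URL is present (shared, safe across multiple app
    instances), otherwise the per-user file store under /data.
    """
    env = os.environ if env is None else env
    return "redis" if env.get("REDIS_URL") else "file"
```

File storage is simplest for a single local process; Redis keeps preferences consistent when App Platform runs more than one container.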
176102

177-
This output should include a forwarding address for `http` and `https` (we'll use `https`). It should look something like the following:
103+
## Alternative AI Providers
178104

179-
```
180-
Forwarding https://3cb89939.ngrok.io -> http://localhost:3000
181-
```
105+
While DigitalOcean GenAI is the primary focus, this template also supports other AI providers:
182106

183-
Navigate to **OAuth & Permissions** in your app configuration and click **Add a Redirect URL**. The redirect URL should be set to your `ngrok` forwarding address with the `slack/oauth_redirect` path appended. For example:
107+
### OpenAI
184108

109+
To use OpenAI models, add your API key:
110+
```zsh
111+
export OPENAI_API_KEY=<your-api-key>
185112
```
### Anthropic

For Anthropic models, configure your API key:

```zsh
export ANTHROPIC_API_KEY=<your-api-key>
```

## Bring Your Own Language Model

You can create a custom provider by extending the base class in `ai/providers/base_api.py` and updating `ai/providers/__init__.py` to include your implementation.
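As a sketch of what such an extension might look like (the method names below are illustrative assumptions, not the actual interface defined in `ai/providers/base_api.py`):

```python
class BaseAPIProvider:
    """Stand-in for ai/providers/base_api.py; the real interface may differ."""

    def set_model(self, model_name):
        raise NotImplementedError

    def generate_response(self, prompt):
        raise NotImplementedError


class EchoProvider(BaseAPIProvider):
    """Toy custom provider: replace the echo with a real API call."""

    MODELS = {"echo-1": {"provider": "Echo", "max_tokens": 512}}

    def __init__(self):
        self.current_model = "echo-1"

    def set_model(self, model_name):
        if model_name not in self.MODELS:
            raise ValueError(f"unknown model: {model_name}")
        self.current_model = model_name

    def generate_response(self, prompt):
        return f"[{self.current_model}] {prompt}"
```

Once registered in `ai/providers/__init__.py`, a provider like this can be offered alongside the built-in ones in the app home model picker.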
