Nano Bots 💎 🤖
An implementation of the Nano Bots specification supporting Anthropic Claude, Cohere Command, Google Gemini, Maritaca AI Sabiá, Mistral AI, Ollama, OpenAI ChatGPT, and others, with the ability to call tools (functions).
TL;DR and Quick Start
gem install nano-bots -v 3.4.0
nb - - eval "hello"
# => Hello! How may I assist you today?
nb - - repl
🤖> Hi, how are you doing?
As an AI language model, I do not experience emotions but I am functioning
well. How can I assist you?
🤖> |
---
meta:
  symbol: 🤖
  name: Nano Bot Name
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: openai
  credentials:
    access-token: ENV/OPENAI_API_KEY
  settings:
    user: ENV/NANO_BOTS_END_USER
    model: gpt-4o
nb gpt.yml - eval "hi"
# => Hello! How can I assist you today?
gem 'nano-bots', '~> 3.4.0'
require 'nano-bots'
bot = NanoBot.new(cartridge: 'gpt.yml')
bot.eval('Hi!') do |content, fragment, finished, meta|
  print fragment unless fragment.nil?
end
# => Hello! How can I assist you today?
- TL;DR and Quick Start
- Usage
  - Command Line
  - Debugging
  - Library
- Setup
  - Anthropic Claude
  - Cohere Command
  - Maritaca AI MariTalk
  - Mistral AI
  - Ollama
  - OpenAI ChatGPT
  - Google Gemini
    - Option 1: API Key (Generative Language API)
    - Option 2: Service Account Credentials File (Vertex AI API)
    - Option 3: Application Default Credentials (Vertex AI API)
    - Custom Project ID
- Cartridges
  - Tools (Functions)
  - Experimental Clojure Support
  - Marketplace
- Security and Privacy
  - Cryptography
  - End-user IDs
  - Decrypting
- Supported Providers
- Docker
  - Anthropic Claude Container
  - Cohere Command Container
  - Maritaca AI MariTalk Container
  - Mistral AI Container
  - Ollama Container
  - OpenAI ChatGPT Container
  - Google Gemini Container
    - Option 1: API Key (Generative Language API) Config
    - Option 2: Service Account Credentials File (Vertex AI API) Config
    - Option 3: Application Default Credentials (Vertex AI API) Config
    - Custom Project ID Config
  - Running the Container
- Development
  - Publish to RubyGems
Usage
Command Line
After installing the gem, the nb binary command will be available for your project or system.
Examples of usage:
nb - - eval "hello"
# => Hello! How may I assist you today?
nb to-en-us-translator.yml - eval "Salut, comment ça va?"
# => Hello, how are you doing?
nb midjourney.yml - eval "happy cyberpunk robot"
# => A cheerful and fun-loving robot is dancing wildly amidst a
# futuristic and lively cityscape. Holographic advertisements
# and vibrant neon colors can be seen in the background.
nb lisp.yml - eval "(+ 1 2)"
# => 3
cat article.txt |
nb to-en-us-translator.yml - eval |
nb summarizer.yml - eval
# -> LLM stands for Large Language Model, which refers to an
# artificial intelligence algorithm capable of processing
# and understanding vast amounts of natural language data,
# allowing it to generate human-like responses and perform
# a range of language-related tasks.
nb - - repl
nb assistant.yml - repl
🤖> Hi, how are you doing?
As an AI language model, I do not experience emotions but I am functioning
well. How can I assist you?
🤖> |
You can exit the REPL by typing exit.
All of the commands above are stateless. If you want to preserve the history of your interactions, replace the - with a state key:
nb assistant.yml your-user eval "Salut, comment ça va?"
nb assistant.yml your-user repl
nb assistant.yml 6ea6c43c42a1c076b1e3c36fa349ac2c eval "Salut, comment ça va?"
nb assistant.yml 6ea6c43c42a1c076b1e3c36fa349ac2c repl
You can use a simple key, such as your username, or a randomly generated one:
require 'securerandom'
SecureRandom.hex # => 6ea6c43c42a1c076b1e3c36fa349ac2c
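You can also derive a stable key from a username, so the same user always resumes the same history. This is just an illustrative pattern; the state_key_for helper below is hypothetical, not part of the gem:

```ruby
require 'digest'

# Hypothetical helper: a deterministic, non-obvious state key per user.
# The same username always yields the same key, so history is preserved.
def state_key_for(username)
  Digest::SHA256.hexdigest(username)[0, 32]
end

key = state_key_for('your-user')
# The key can then be used as the state argument:
#   nb assistant.yml <key> repl
```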
Debugging
nb - - cartridge
nb cartridge.yml - cartridge
nb - STATE-KEY state
nb cartridge.yml STATE-KEY state
Library
To use it as a library:
require 'nano-bots/cli' # Equivalent to the `nb` command.
require 'nano-bots'
NanoBot.cli # Equivalent to the `nb` command.
NanoBot.repl(cartridge: 'cartridge.yml') # Starts a new REPL.
bot = NanoBot.new(cartridge: 'cartridge.yml')
bot = NanoBot.new(
  cartridge: YAML.safe_load(File.read('cartridge.yml'), permitted_classes: [Symbol])
)

bot = NanoBot.new(
  cartridge: { ... } # Parsed Cartridge Hash
)
bot.eval('Hello')
bot.eval('Hello', as: 'eval')
bot.eval('Hello', as: 'repl')
# When stream is enabled and available:
bot.eval('Hi!') do |content, fragment, finished, meta|
  print fragment unless fragment.nil?
end
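The block receives four arguments (content, fragment, finished, meta). A common pattern is to buffer the fragments into the full answer; the sketch below simulates a stream locally rather than calling a provider:

```ruby
# Buffer streamed fragments into the complete answer.
buffer = +''

stream_handler = lambda do |content, fragment, finished, meta|
  buffer << fragment unless fragment.nil?
end

# Simulating the fragments a provider stream might deliver:
[['Hel', false], ['lo!', true]].each do |fragment, finished|
  stream_handler.call(buffer, fragment, finished, {})
end

puts buffer # => Hello!
```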
bot.repl # Starts a new REPL.
NanoBot.repl(cartridge: 'cartridge.yml', state: '6ea6c43c42a1c076b1e3c36fa349ac2c')
bot = NanoBot.new(cartridge: 'cartridge.yml', state: '6ea6c43c42a1c076b1e3c36fa349ac2c')
bot.prompt # => "🤖\u001b[34m> \u001b[0m"
bot.boot
bot.boot(as: 'eval')
bot.boot(as: 'repl')
bot.boot do |content, fragment, finished, meta|
  print fragment unless fragment.nil?
end
Setup
To install the CLI on your system:
gem install nano-bots -v 3.4.0
To use it in a Ruby project as a library, add to your Gemfile:
gem 'nano-bots', '~> 3.4.0'
bundle install
For credentials and configurations, relevant environment variables can be set in your .bashrc, .zshrc, or equivalent files, as well as in your Docker Container or System Environment. Example:
export NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
export NANO_BOTS_END_USER=your-user
# export NANO_BOTS_STATE_PATH=/home/user/.local/state/nano-bots
# export NANO_BOTS_CARTRIDGES_PATH=/home/user/.local/share/nano-bots/cartridges
Alternatively, if your current directory has a .env file with the environment variables, they will be automatically loaded:
NANO_BOTS_ENCRYPTION_PASSWORD=UNSAFE
NANO_BOTS_END_USER=your-user
# NANO_BOTS_STATE_PATH=/home/user/.local/state/nano-bots
# NANO_BOTS_CARTRIDGES_PATH=/home/user/.local/share/nano-bots/cartridges
Anthropic Claude
You can obtain your credentials on the Anthropic Console.
export ANTHROPIC_API_KEY=your-api-key
Alternatively, if your current directory has a .env file with the environment variables, they will be automatically loaded:
ANTHROPIC_API_KEY=your-api-key
Create a cartridge.yml file:
---
meta:
  symbol: 🤖
  name: Nano Bot Name
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: anthropic
  credentials:
    api-key: ENV/ANTHROPIC_API_KEY
  settings:
    model: claude-3-5-sonnet-20240620
    max_tokens: 4096
Read the full specification for Anthropic Claude.
nb cartridge.yml - eval "Hello"
nb cartridge.yml - repl
bot = NanoBot.new(cartridge: 'cartridge.yml')
puts bot.eval('Hello')
Cohere Command
You can obtain your credentials on the Cohere Platform.
export COHERE_API_KEY=your-api-key
Alternatively, if your current directory has a .env file with the environment variables, they will be automatically loaded:
COHERE_API_KEY=your-api-key
Create a cartridge.yml file:
---
meta:
  symbol: 🤖
  name: Nano Bot Name
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: cohere
  credentials:
    api-key: ENV/COHERE_API_KEY
  settings:
    model: command
Read the full specification for Cohere Command.
nb cartridge.yml - eval "Hello"
nb cartridge.yml - repl
bot = NanoBot.new(cartridge: 'cartridge.yml')
puts bot.eval('Hello')
Maritaca AI MariTalk
You can obtain your API key at MariTalk.
Enclose credentials in single quotes when using environment variables to prevent issues with the $ character in the API key:
export MARITACA_API_KEY='123...$a12...'
Alternatively, if your current directory has a .env file with the environment variables, they will be automatically loaded:
MARITACA_API_KEY='123...$a12...'
Create a cartridge.yml file:
---
meta:
  symbol: 🤖
  name: Nano Bot Name
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: maritaca
  credentials:
    api-key: ENV/MARITACA_API_KEY
  settings:
    model: sabia-2-medium
Read the full specification for Maritaca AI MariTalk.
nb cartridge.yml - eval "Hello"
nb cartridge.yml - repl
bot = NanoBot.new(cartridge: 'cartridge.yml')
puts bot.eval('Hello')
Mistral AI
You can obtain your credentials on the Mistral Platform.
export MISTRAL_API_KEY=your-api-key
Alternatively, if your current directory has a .env file with the environment variables, they will be automatically loaded:
MISTRAL_API_KEY=your-api-key
Create a cartridge.yml file:
---
meta:
  symbol: 🤖
  name: Nano Bot Name
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: mistral
  credentials:
    api-key: ENV/MISTRAL_API_KEY
  settings:
    model: mistral-medium-latest
Read the full specification for Mistral AI.
nb cartridge.yml - eval "Hello"
nb cartridge.yml - repl
bot = NanoBot.new(cartridge: 'cartridge.yml')
puts bot.eval('Hello')
Ollama
To install and set up, follow the instructions on the Ollama website.
export OLLAMA_API_ADDRESS=http://localhost:11434
Alternatively, if your current directory has a .env file with the environment variables, they will be automatically loaded:
OLLAMA_API_ADDRESS=http://localhost:11434
Create a cartridge.yml file:
---
meta:
  symbol: 🤖
  name: Nano Bot Name
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: ollama
  credentials:
    address: ENV/OLLAMA_API_ADDRESS
  settings:
    model: llama3
Read the full specification for Ollama.
nb cartridge.yml - eval "Hello"
nb cartridge.yml - repl
bot = NanoBot.new(cartridge: 'cartridge.yml')
puts bot.eval('Hello')
OpenAI ChatGPT
You can obtain your credentials on the OpenAI Platform.
export OPENAI_API_KEY=your-access-token
Alternatively, if your current directory has a .env file with the environment variables, they will be automatically loaded:
OPENAI_API_KEY=your-access-token
Create a cartridge.yml file:
---
meta:
  symbol: 🤖
  name: Nano Bot Name
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: openai
  credentials:
    access-token: ENV/OPENAI_API_KEY
  settings:
    user: ENV/NANO_BOTS_END_USER
    model: gpt-4o
Read the full specification for OpenAI ChatGPT.
nb cartridge.yml - eval "Hello"
nb cartridge.yml - repl
bot = NanoBot.new(cartridge: 'cartridge.yml')
puts bot.eval('Hello')
Google Gemini
Click here to learn how to obtain your credentials.
Option 1: API Key (Generative Language API)
export GOOGLE_API_KEY=your-api-key
Alternatively, if your current directory has a .env file with the environment variables, they will be automatically loaded:
GOOGLE_API_KEY=your-api-key
Create a cartridge.yml file:
---
meta:
  symbol: 🤖
  name: Nano Bot Name
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: google
  credentials:
    service: generative-language-api
    api-key: ENV/GOOGLE_API_KEY
  options:
    model: gemini-pro
Read the full specification for Google Gemini.
nb cartridge.yml - eval "Hello"
nb cartridge.yml - repl
bot = NanoBot.new(cartridge: 'cartridge.yml')
puts bot.eval('Hello')
Option 2: Service Account Credentials File (Vertex AI API)
export GOOGLE_CREDENTIALS_FILE_PATH=google-credentials.json
export GOOGLE_REGION=us-east4
Alternatively, if your current directory has a .env file with the environment variables, they will be automatically loaded:
GOOGLE_CREDENTIALS_FILE_PATH=google-credentials.json
GOOGLE_REGION=us-east4
Create a cartridge.yml file:
---
meta:
  symbol: 🤖
  name: Nano Bot Name
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: google
  credentials:
    service: vertex-ai-api
    file-path: ENV/GOOGLE_CREDENTIALS_FILE_PATH
    region: ENV/GOOGLE_REGION
  options:
    model: gemini-pro
Read the full specification for Google Gemini.
nb cartridge.yml - eval "Hello"
nb cartridge.yml - repl
bot = NanoBot.new(cartridge: 'cartridge.yml')
puts bot.eval('Hello')
Option 3: Application Default Credentials (Vertex AI API)
export GOOGLE_REGION=us-east4
Alternatively, if your current directory has a .env file with the environment variables, they will be automatically loaded:
GOOGLE_REGION=us-east4
Create a cartridge.yml file:
---
meta:
  symbol: 🤖
  name: Nano Bot Name
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: google
  credentials:
    service: vertex-ai-api
    region: ENV/GOOGLE_REGION
  options:
    model: gemini-pro
Read the full specification for Google Gemini.
nb cartridge.yml - eval "Hello"
nb cartridge.yml - repl
bot = NanoBot.new(cartridge: 'cartridge.yml')
puts bot.eval('Hello')
Custom Project ID
If you need to manually set a Google Project ID:
export GOOGLE_PROJECT_ID=your-project-id
Alternatively, if your current directory has a .env file with the environment variables, they will be automatically loaded:
GOOGLE_PROJECT_ID=your-project-id
Add to your cartridge.yml file:
---
provider:
  id: google
  credentials:
    project-id: ENV/GOOGLE_PROJECT_ID
Cartridges
Check the Nano Bots specification to learn more about how to build cartridges.
Try the Nano Bots Clinic (Live Editor) to learn about creating Cartridges.
Here's what a Nano Bot Cartridge looks like:
---
meta:
  symbol: 🤖
  name: Nano Bot Name
  author: Your Name
  version: 1.0.0
  license: CC0-1.0
  description: A helpful assistant.

behaviors:
  interaction:
    directive: You are a helpful assistant.

provider:
  id: openai
  credentials:
    access-token: ENV/OPENAI_API_KEY
  settings:
    user: ENV/NANO_BOTS_END_USER
    model: gpt-4o
Tools (Functions)
Nano Bots can also be powered by Tools (Functions):
---
tools:
  - name: random-number
    description: Generates a random number between 1 and 100.
    fennel: |
      (math.random 1 100)
🤖> please generate a random number
random-number {} [yN] y
random-number {}
59
The randomly generated number is 59.
🤖> |
To successfully use Tools (Functions), you need to specify a provider and a model that support them. As of the writing of this README, the supported providers are OpenAI, with the models gpt-3.5-turbo-1106 and gpt-4o, and Google, with the vertex-ai-api service and the gemini-pro model. Other providers do not yet have support.
Check the Nano Bots specification to learn more about Tools (Functions).
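Tools can also be written in Lua, which, like Fennel, is embedded in this implementation. A sketch of the same tool using a lua: key, assuming the lua: field follows the same pattern as fennel: above:

```yaml
---
tools:
  - name: random-number
    description: Generates a random number between 1 and 100.
    lua: |
      return math.random(1, 100)
```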
Experimental Clojure Support
We are exploring the use of Clojure through Babashka, powered by GraalVM.
The experimental support for Clojure would be similar to Lua and Fennel, using the clojure: key:
---
clojure: |
  (-> (java.time.ZonedDateTime/now)
      (.format (java.time.format.DateTimeFormatter/ofPattern "yyyy-MM-dd HH:mm"))
      (clojure.string/trimr))
Unlike Lua and Fennel, Clojure support is not embedded in this implementation. It relies on the Babashka binary (bb) being available in the environment where the Nano Bot is running.
Here's how to install Babashka:
curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | sudo bash
This is a quick check to ensure that it is available and working:
bb -e '{:hello "world"}'
# => {:hello "world"}
We don't have sandbox support for Clojure; this means that you need to disable sandboxing to be able to run Clojure code, which you do at your own risk:
---
safety:
  functions:
    sandboxed: false
Marketplace
You can explore the Nano Bots Marketplace to discover new Cartridges that can help you.
Security and Privacy
Each provider will have its own security and privacy policies (e.g. OpenAI Policy), so you must consult them to understand their implications.
Cryptography
By default, all states stored in your local disk are encrypted.
To ensure that the encryption is secure, you need to define a password through the NANO_BOTS_ENCRYPTION_PASSWORD environment variable. Otherwise, although the content will be encrypted, anyone would be able to decrypt it without a password.
It's important to note that the content shared with providers, despite being transmitted over secure connections (e.g., HTTPS), will be readable by the provider. This is because providers need to operate on the data, which would not be possible if the content was encrypted beyond HTTPS. So, the data stored locally on your system is encrypted, which does not mean that what you share with providers will not be readable by them.
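For intuition, here is a self-contained sketch of password-based encryption at rest using Ruby's standard openssl library. This is only an illustration of the general idea; it is not the gem's actual encryption scheme:

```ruby
require 'openssl'

# Illustrative only: derive a key from a password, then encrypt with
# AES-256-GCM. NOT the gem's actual scheme.
def encrypt(plaintext, password)
  salt = OpenSSL::Random.random_bytes(16)
  key = OpenSSL::KDF.pbkdf2_hmac(password, salt: salt, iterations: 100_000,
                                 length: 32, hash: 'SHA256')
  cipher = OpenSSL::Cipher.new('aes-256-gcm').encrypt
  cipher.key = key
  iv = cipher.random_iv
  ciphertext = cipher.update(plaintext) + cipher.final
  [salt, iv, cipher.auth_tag, ciphertext]
end

def decrypt(salt, iv, tag, ciphertext, password)
  key = OpenSSL::KDF.pbkdf2_hmac(password, salt: salt, iterations: 100_000,
                                 length: 32, hash: 'SHA256')
  decipher = OpenSSL::Cipher.new('aes-256-gcm').decrypt
  decipher.key = key
  decipher.iv = iv
  decipher.auth_tag = tag
  decipher.update(ciphertext) + decipher.final
end

data = encrypt('state content', 'UNSAFE')
puts decrypt(*data, 'UNSAFE') # => state content
```

Without the password, the ciphertext cannot be decrypted, which is why losing the password means losing the data.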
To ensure that your encryption and password are configured properly, you can run the following command:
nb security
Which should return:
✅ Encryption is enabled and properly working.
   This means that your data is stored in an encrypted format on your disk.

✅ A password is being used for the encrypted content.
   This means that only those who possess the password can decrypt your data.
Alternatively, you can check it at runtime with:
require 'nano-bots'
NanoBot.security.check
# => { encryption: true, password: true }
End-user IDs
A common strategy for deploying Nano Bots to multiple users through APIs or automations is to assign a unique end-user ID to each user. This can be useful if any of your users violate the provider's policy due to abusive behavior. By providing the end-user ID, you can show that, even though the activity originated from your API Key, the actions taken were not your own.
You can define custom end-user identifiers in the following way:
NanoBot.new(environment: { NANO_BOTS_END_USER: 'custom-user-a' })
NanoBot.new(environment: { NANO_BOTS_END_USER: 'custom-user-b' })
Consider that you have the following end-user identifier in your environment:
NANO_BOTS_END_USER=your-name
Or a configuration in your Cartridge:
---
provider:
  id: openai
  settings:
    user: your-name
The requests will be performed as follows:
NanoBot.new(cartridge: '-')
# { user: 'your-name' }
NanoBot.new(cartridge: '-', environment: { NANO_BOTS_END_USER: 'custom-user-a' })
# { user: 'custom-user-a' }
NanoBot.new(cartridge: '-', environment: { NANO_BOTS_END_USER: 'custom-user-b' })
# { user: 'custom-user-b' }
In practice, to enhance privacy, neither your identifier nor your users' identifiers are shared as-is. Instead, they are encrypted before being shared with the provider:
'your-name'
# _O7OjYUESagb46YSeUeSfSMzoO1Yg0BZqpsAkPg4j62SeNYlgwq3kn51Ob2wmIehoA==
'custom-user-a'
# _O7OjYUESagb46YSeUeSfSMzoO1Yg0BZJgIXHCBHyADW-rn4IQr-s2RvP7vym8u5tnzYMIs=
'custom-user-b'
# _O7OjYUESagb46YSeUeSfSMzoO1Yg0BZkjUwCcsh9sVppKvYMhd2qGRvP7vym8u5tnzYMIg=
In this manner, you possess identifiers if required; however, their actual content can only be decrypted by you via your secure password (NANO_BOTS_ENCRYPTION_PASSWORD).
Decrypting
To decrypt your encrypted data, once you have properly configured your password, you can simply run:
require 'nano-bots'
NanoBot.security.decrypt('_O7OjYUESagb46YSeUeSfSMzoO1Yg0BZqpsAkPg4j62SeNYlgwq3kn51Ob2wmIehoA==')
# your-name
NanoBot.security.decrypt('_O7OjYUESagb46YSeUeSfSMzoO1Yg0BZJgIXHCBHyADW-rn4IQr-s2RvP7vym8u5tnzYMIs=')
# custom-user-a
NanoBot.security.decrypt('_O7OjYUESagb46YSeUeSfSMzoO1Yg0BZkjUwCcsh9sVppKvYMhd2qGRvP7vym8u5tnzYMIg=')
# custom-user-b
If you lose your password, you lose your data. It is not possible to recover it at all. For real.
Supported Providers
- Anthropic Claude
- Cohere Command
- Google Gemini
- Maritaca AI MariTalk
- Mistral AI
- Ollama
- OpenAI ChatGPT
01.AI Yi, LMSYS Vicuna, Meta Llama, and WizardLM are open-source models that are supported through Ollama.
Docker
Clone the repository and copy the Docker Compose template:
git clone https://github.com/icebaker/ruby-nano-bots.git
cd ruby-nano-bots
cp docker-compose.example.yml docker-compose.yml
Set your provider credentials and choose your desired path for the cartridge files:
Anthropic Claude Container
---
services:
  nano-bots:
    image: ruby:3.3.3-slim-bookworm
    command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 3.4.0 && bash"
    environment:
      ANTHROPIC_API_KEY: your-api-key
      NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
      NANO_BOTS_END_USER: your-user
    volumes:
      - ./your-cartridges:/root/.local/share/nano-bots/cartridges
      - ./your-state-path:/root/.local/state/nano-bots
Cohere Command Container
---
services:
  nano-bots:
    image: ruby:3.3.3-slim-bookworm
    command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 3.4.0 && bash"
    environment:
      COHERE_API_KEY: your-api-key
      NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
      NANO_BOTS_END_USER: your-user
    volumes:
      - ./your-cartridges:/root/.local/share/nano-bots/cartridges
      - ./your-state-path:/root/.local/state/nano-bots
Maritaca AI MariTalk Container
---
services:
  nano-bots:
    image: ruby:3.3.3-slim-bookworm
    command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 3.4.0 && bash"
    environment:
      MARITACA_API_KEY: your-api-key
      NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
      NANO_BOTS_END_USER: your-user
    volumes:
      - ./your-cartridges:/root/.local/share/nano-bots/cartridges
      - ./your-state-path:/root/.local/state/nano-bots
Mistral AI Container
---
services:
  nano-bots:
    image: ruby:3.3.3-slim-bookworm
    command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 3.4.0 && bash"
    environment:
      MISTRAL_API_KEY: your-api-key
      NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
      NANO_BOTS_END_USER: your-user
    volumes:
      - ./your-cartridges:/root/.local/share/nano-bots/cartridges
      - ./your-state-path:/root/.local/state/nano-bots
Ollama Container
Remember that your localhost is by default inaccessible from inside Docker. You need to either establish inter-container networking, use the host's address, or use the host network, depending on where the Ollama server is running and your preferences.
---
services:
  nano-bots:
    image: ruby:3.3.3-slim-bookworm
    command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 3.4.0 && bash"
    environment:
      OLLAMA_API_ADDRESS: http://localhost:11434
      NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
      NANO_BOTS_END_USER: your-user
    volumes:
      - ./your-cartridges:/root/.local/share/nano-bots/cartridges
      - ./your-state-path:/root/.local/state/nano-bots
    # If you are running the Ollama server on your localhost:
    network_mode: host # WARNING: Be careful, this may be a security risk.
OpenAI ChatGPT Container
---
services:
  nano-bots:
    image: ruby:3.3.3-slim-bookworm
    command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 3.4.0 && bash"
    environment:
      OPENAI_API_KEY: your-access-token
      NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
      NANO_BOTS_END_USER: your-user
    volumes:
      - ./your-cartridges:/root/.local/share/nano-bots/cartridges
      - ./your-state-path:/root/.local/state/nano-bots
Google Gemini Container
Option 1: API Key (Generative Language API) Config
---
services:
  nano-bots:
    image: ruby:3.3.3-slim-bookworm
    command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 3.4.0 && bash"
    environment:
      GOOGLE_API_KEY: your-api-key
      NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
      NANO_BOTS_END_USER: your-user
    volumes:
      - ./your-cartridges:/root/.local/share/nano-bots/cartridges
      - ./your-state-path:/root/.local/state/nano-bots
Option 2: Service Account Credentials File (Vertex AI API) Config
---
services:
  nano-bots:
    image: ruby:3.3.3-slim-bookworm
    command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 3.4.0 && bash"
    environment:
      GOOGLE_CREDENTIALS_FILE_PATH: /root/.config/google-credentials.json
      GOOGLE_REGION: us-east4
      NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
      NANO_BOTS_END_USER: your-user
    volumes:
      - ./google-credentials.json:/root/.config/google-credentials.json
      - ./your-cartridges:/root/.local/share/nano-bots/cartridges
      - ./your-state-path:/root/.local/state/nano-bots
Option 3: Application Default Credentials (Vertex AI API) Config
---
services:
  nano-bots:
    image: ruby:3.3.3-slim-bookworm
    command: sh -c "apt-get update && apt-get install -y --no-install-recommends build-essential libffi-dev libsodium-dev lua5.4-dev curl && curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | bash && gem install nano-bots -v 3.4.0 && bash"
    environment:
      GOOGLE_REGION: us-east4
      NANO_BOTS_ENCRYPTION_PASSWORD: UNSAFE
      NANO_BOTS_END_USER: your-user
    volumes:
      - ./your-cartridges:/root/.local/share/nano-bots/cartridges
      - ./your-state-path:/root/.local/state/nano-bots
Custom Project ID Config
If you need to manually set a Google Project ID:
environment:
  GOOGLE_PROJECT_ID: your-project-id
Running the Container
Enter the container:
docker compose run nano-bots
Start playing:
nb - - eval "hello"
nb - - repl
nb assistant.yml - eval "hello"
nb assistant.yml - repl
You can exit the REPL by typing exit.
Development
bundle
rubocop -A
rspec
bundle exec ruby spec/tasks/run-all-models.rb
bundle exec ruby spec/tasks/run-model.rb spec/data/cartridges/models/openai/gpt-4-turbo.yml
bundle exec ruby spec/tasks/run-model.rb spec/data/cartridges/models/openai/gpt-4-turbo.yml stream
If you face issues upgrading gem versions:
bundle install --full-index
Publish to RubyGems
gem build nano-bots.gemspec
gem signin
gem push nano-bots-3.4.0.gem