Important! Aleph Alpha will discontinue access to the public API as of November 30, 2024.
Aleph Alpha Ruby Client
Bundler
Add this line to your application's Gemfile:
gem "aleph-alpha-ruby"
And then execute:
$ bundle install
Gem install
Or install with:
$ gem install aleph-alpha-ruby
and require with:
require "aleph-alpha"
Usage
- Get your API token from https://app.aleph-alpha.com/profile
Quickstart
For a quick test, you can pass your token directly to a new client:
client = AlephAlpha::Client.new(access_token: "access_token_goes_here")
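You can then make a first request, for example listing the available models (see Models below):
client.models.list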
With Config
AlephAlpha.configure do |config|
config.access_token = ENV.fetch("ALEPH_ALPHA_ACCESS_TOKEN")
end
Then you can create a client like this:
client = AlephAlpha::Client.new
Custom timeout or base URI
The default timeout for any request made through this library is 120 seconds. You can change it by passing a number of seconds as request_timeout
when initializing the client. You can also change the base URI used for all requests.
client = AlephAlpha::Client.new(
access_token: "access_token_goes_here",
uri_base: "https://other-api-endpoint.aleph-alpha.com",
request_timeout: 240
)
or when configuring the gem:
AlephAlpha.configure do |config|
config.access_token = ENV.fetch("ALEPH_ALPHA_ACCESS_TOKEN")
config.uri_base = "https://other-api-endpoint.aleph-alpha.com" # Optional
config.request_timeout = 240 # Optional
end
Models
There are different models that can be used to generate text. For a full list:
client.models.list
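The call returns the parsed JSON response; assuming each entry exposes a "name" field, you could print the available model names like this:
models = client.models.list
models.each { |model| puts model["name"] } # assuming each model entry has a "name" field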
Current API version
client.version
Tokens
Get a list of issued API tokens
client.tokens.list
Create a new API token
client.tokens.create(
parameters: {
description: "token used on my laptop"
}
)
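Assuming the newly created token is returned under a "token" key in the response, you can capture it right away:
response = client.tokens.create(
  parameters: {
    description: "token used on my laptop"
  }
)
new_token = response["token"] # assuming the raw token value is returned under "token"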
Delete an API token
client.tokens.delete(id: 123)
Users
Get settings for own user
client.users.me
Change settings for own user
client.users.settings(
parameters: {
out_of_credits_threshold: 0
}
)
Query Recent Usage
client.users.usage
Tasks
Completion
client.completions(
parameters: {
model: "model name",
prompt: "some text",
maximum_tokens: 64
}
)
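The call returns the parsed response; assuming the usual shape with a "completions" array, the generated text can be read like this:
response = client.completions(
  parameters: {
    model: "model name",
    prompt: "some text",
    maximum_tokens: 64
  }
)
puts response.dig("completions", 0, "completion") # assuming the generated text is nested here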
Embeddings
response = client.embeddings(
parameters: {
model: "luminous-base",
prompt: "An apple a day keeps the doctor away.",
layers: [
0,
1
],
tokens: false,
pooling: [
"max"
]
}
)
# Index 0 corresponds to the word embeddings used as input to the first transformer layer
puts response.dig("embeddings", "layer_0", "max")
# Index 1 corresponds to the hidden state as output by the first transformer layer
puts response.dig("embeddings", "layer_1", "max")
Semantic Embeddings
client.semantic_embeddings(
parameters: {
model: "model name",
prompt: "some text",
representation: "symmetric",
compress_to_size: 128
}
)
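Semantic embeddings are typically used to compare texts. The sketch below assumes the response returns the vector under an "embedding" key; the helper names (embed, cosine_similarity) are only illustrative:
def embed(client, text)
  response = client.semantic_embeddings(
    parameters: {
      model: "model name",
      prompt: text,
      representation: "symmetric",
      compress_to_size: 128
    }
  )
  response["embedding"] # assuming the vector is returned under "embedding"
end

def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

a = embed(client, "An apple a day")
b = embed(client, "keeps the doctor away")
puts cosine_similarity(a, b)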
Evaluate
client.evaluate(
parameters: {
model: "model name",
prompt: "some text",
completion_expected: "another text"
}
)
Explanation
client.explain(
parameters: {
model: "model name",
hosting: "aleph-alpha",
prompt: "some text",
target: "string",
control_factor: 0.1,
contextual_control_threshold: 0,
control_log_additive: true,
postprocessing: "none",
normalize: false,
prompt_granularity: {
type: "token",
delimiter: "string"
},
target_granularity: "complete",
control_token_overlap: "partial"
}
)
Tokenize
client.tokenize(
parameters: {
model: "model name",
prompt: "some text",
tokens: true,
token_ids: true
}
)
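Depending on the tokens and token_ids flags, the response contains the token strings and/or the integer ids; assuming the usual "tokens" and "token_ids" keys:
response = client.tokenize(
  parameters: {
    model: "model name",
    prompt: "some text",
    tokens: true,
    token_ids: true
  }
)
puts response["tokens"].inspect    # token strings, assuming a "tokens" key
puts response["token_ids"].inspect # integer ids, assuming a "token_ids" key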
Detokenize
client.detokenize(
parameters: {
model: "model name",
token_ids: [
556,
48_741,
247,
2983,
28_063,
301,
10_510,
5469,
17
]
}
)
Q&A
client.qa(
parameters: {
query: "some text",
documents: [
{
text: "another text"
}
]
}
)
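Assuming the response contains an "answers" array with "answer" and "score" fields, the top answer can be read like this:
response = client.qa(
  parameters: {
    query: "some text",
    documents: [
      {
        text: "another text"
      }
    ]
  }
)
best = response["answers"]&.first # assuming answers are returned best-first
puts best && best["answer"]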
Summarize
client.summarize(
parameters: {
model: "model name",
document: {
text: "Text about something..."
}
}
)
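Assuming the summary is returned under a "summary" key, it can be printed like this:
response = client.summarize(
  parameters: {
    model: "model name",
    document: {
      text: "Text about something..."
    }
  }
)
puts response["summary"] # assuming the summary text is returned under "summary"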
Contributing
This gem was initially based on ruby-openai: I didn't want to reinvent the wheel and, where possible, kept a similar API for convenience. Bug reports and pull requests are welcome on GitHub at https://github.com/skorth/aleph-alpha-ruby.
License
The gem is available as open source under the terms of the MIT License.