ONNX Runtime Ruby

🔥 ONNX Runtime - the high performance scoring engine for ML models - for Ruby

Check out an example

Installation

Add this line to your application’s Gemfile:

gem "onnxruntime"

Getting Started

Load a model and make predictions

model = OnnxRuntime::Model.new("model.onnx")
model.predict({x: [1, 2, 3]})
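
The result is a hash that maps each output name to its values. A minimal sketch, assuming the model's single output is named y (a hypothetical name here):

result = model.predict({x: [1, 2, 3]})
result["y"] # values for the hypothetical "y" output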

Download pre-trained models from the ONNX Model Zoo

Get inputs

model.inputs

Get outputs

model.outputs
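
Both inputs and outputs are arrays of hashes describing each tensor by name, type, and shape. A sketch of the structure, assuming a single float input named x (exact values depend on the model):

model.inputs
# e.g. [{name: "x", type: "tensor(float)", shape: [1, 3]}]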

Get metadata

model.metadata
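
A sketch of the kind of hash returned, with illustrative values (actual keys and values come from the model file):

model.metadata
# e.g. {custom_metadata_map: {}, description: "", domain: "", graph_name: "main_graph", producer_name: "...", version: 0}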

Load a model from a string or other IO object

io = StringIO.new("...")
model = OnnxRuntime::Model.new(io)
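
For instance, to load a model's raw bytes from disk (File.binread and StringIO are part of Ruby's standard library):

io = StringIO.new(File.binread("model.onnx"))
model = OnnxRuntime::Model.new(io)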

Get specific outputs

model.predict({x: [1, 2, 3]}, output_names: ["label"])

Session Options

OnnxRuntime::Model.new(path_or_io, {
  enable_cpu_mem_arena: true,
  enable_mem_pattern: true,
  enable_profiling: false,
  execution_mode: :sequential,    # :sequential or :parallel
  free_dimension_overrides_by_denotation: nil,
  free_dimension_overrides_by_name: nil,
  graph_optimization_level: nil,  # :none, :basic, :extended, or :all
  inter_op_num_threads: nil,
  intra_op_num_threads: nil,
  log_severity_level: 2,
  log_verbosity_level: 0,
  logid: nil,
  optimized_model_filepath: nil,
  profile_file_prefix: "onnxruntime_profile_",
  session_config_entries: nil
})
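
As a minimal sketch, passing a couple of the options above (values here are illustrative):

model = OnnxRuntime::Model.new("model.onnx", {
  intra_op_num_threads: 1,
  graph_optimization_level: :all
})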

Run Options

model.predict(input_feed, {
  output_names: nil,
  log_severity_level: 2,
  log_verbosity_level: 0,
  logid: nil,
  terminate: false,
  output_type: :ruby       # :ruby or :numo
})
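
As a sketch, predictions can be returned as Numo::NArray objects instead of Ruby arrays (assumes the numo-narray gem is installed):

model.predict({x: [1, 2, 3]}, output_type: :numo)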

Inference Session API

You can also use the Inference Session API, which follows the Python API.

session = OnnxRuntime::InferenceSession.new("model.onnx")
session.run(nil, {x: [1, 2, 3]})
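
The first argument is the list of output names to compute; nil computes all outputs. A sketch with a hypothetical output name:

session.run(["label"], {x: [1, 2, 3]})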

The example models from the Python package are included as well.

OnnxRuntime::Datasets.example("sigmoid.onnx")
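
The call returns the path to the bundled model file, so it can be passed straight to Model.new (a sketch):

model = OnnxRuntime::Model.new(OnnxRuntime::Datasets.example("sigmoid.onnx"))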

GPU Support

Linux and Windows

Download the appropriate GPU release and set:

OnnxRuntime.ffi_lib = "path/to/lib/libonnxruntime.so" # onnxruntime.dll for Windows

and use:

model = OnnxRuntime::Model.new("model.onnx", providers: ["CUDAExecutionProvider"])
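
Providers are tried in priority order, so the CPU provider can be listed as a fallback (a sketch):

model = OnnxRuntime::Model.new("model.onnx", providers: ["CUDAExecutionProvider", "CPUExecutionProvider"])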

Mac

Use:

model = OnnxRuntime::Model.new("model.onnx", providers: ["CoreMLExecutionProvider"])

History

View the changelog

Contributing

Everyone is encouraged to help improve this project. Here are a few ways you can help: report bugs, fix bugs and submit pull requests, write or clarify documentation, and suggest or add new features.

To get started with development and testing:

git clone https://github.com/ankane/onnxruntime-ruby.git
cd onnxruntime-ruby
bundle install
bundle exec rake vendor:all # download the ONNX Runtime shared libraries
bundle exec rake test