ankusa
<img src="https://secure.travis-ci.org/bmuller/ankusa.png?branch=master" alt="Build Status" /> <img src="https://gemnasium.com/bmuller/ankusa.png" alt="Dependency Status" />
Ankusa is a text classifier in Ruby that can use Hadoop's HBase, MongoDB, or Cassandra for storage. Because it uses HBase/Mongo/Cassandra as a backend, the training corpus can be many terabytes in size (in-memory and single-file storage backends also exist for smaller corpora).
Ankusa currently provides both a Naive Bayes and a Kullback-Leibler divergence classifier. It ignores common words (a.k.a. stop words) and stems all others. Additionally, it uses additive smoothing in both classification methods.
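Additive smoothing just means every word count is bumped by a constant before likelihoods are computed, so a word never seen in a class does not zero out the whole score. A minimal sketch of the idea (illustrative only, not ankusa's internal code; the method and argument names are made up):

# Laplace (add-one) smoothing: unseen words get a small, non-zero likelihood
# instead of 0, which would otherwise wipe out the product of likelihoods.
def smoothed_likelihood(word_count, total_words_in_class, vocabulary_size)
  (word_count + 1).to_f / (total_words_in_class + vocabulary_size)
end

smoothed_likelihood(0, 1_000, 5_000)   # => ~0.000167 rather than 0.0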
Installation
Add this line to your application’s Gemfile:
gem 'ankusa'
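Then install, either through Bundler or directly:

bundle install
# or
gem install ankusa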
If you're using the HBase, Cassandra, or MongoDB backend, make sure you also add the corresponding gem to your Gemfile:
gem 'hbaserb'
# or
gem 'cassandra'
# or
gem 'mongo'
If you’re using HBase, make sure the HBase Thrift interface has been started as well.
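The exact command depends on how HBase is installed, but it is usually something along these lines:

# from the HBase installation directory
bin/hbase thrift start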
Basic Usage
Using the naive Bayes classifier:
require 'ankusa'
require 'ankusa/hbase_storage'

# connect to HBase.  Alternatively, just for this test, use in-memory storage with
#   storage = Ankusa::MemoryStorage.new
storage = Ankusa::HBaseStorage.new 'localhost'
c = Ankusa::NaiveBayesClassifier.new storage

# Each of these calls will return a bag-of-words
# hash with stemmed words as keys and counts as values
c.train :spam, "This is some spammy text"
c.train :good, "This is not the bad stuff"

# This will return the most likely class (as a symbol)
puts c.classify "This is some spammy text"

# This will return a Hash with classes as keys and
# membership probabilities as values
puts c.classifications "This is some spammy text"

# If you have a large corpus, the probabilities will
# likely all be 0.  In that case, you must use the log
# likelihood values
puts c.log_likelihoods "This is some spammy text"

# get a list of all classes
puts c.classnames

# close connection
storage.close
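If you just want to try the classifier without a running HBase, the same flow works end to end with the in-memory backend (a quick sketch using only the calls shown above):

require 'ankusa'
require 'ankusa/memory_storage'

storage = Ankusa::MemoryStorage.new
c = Ankusa::NaiveBayesClassifier.new storage
c.train :spam, "This is some spammy text"
c.train :good, "This is not the bad stuff"

# with this toy training set, :spam should come back as the most likely class
puts c.classify "This is some spammy text"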
KL Divergence Classifier
There is a Kullback-Leibler divergence classifier as well. KL divergence is a distance measure (though not a true metric: it is not symmetric and does not satisfy the triangle inequality). The KL classifier simply measures the relative entropy between the text you want to classify and each of the classes. The class with the shortest "distance" is the best class. For an especially large corpus this classifier may be slightly faster, since prior probabilities are never calculated, only likelihoods.
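For reference, the quantity being measured is the standard KL divergence between the word distribution $P$ of the input text and the word distribution $Q$ of a class:

$$D_{\mathrm{KL}}(P \parallel Q) = \sum_{w} P(w) \log \frac{P(w)}{Q(w)}$$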
The API is the same as for the NaiveBayesClassifier, except that when you want the actual numbers you call "distances" rather than "classifications".
require 'ankusa'
require 'ankusa/hbase_storage'

# connect to HBase
storage = Ankusa::HBaseStorage.new 'localhost'
c = Ankusa::KLDivergenceClassifier.new storage

# Each of these calls will return a bag-of-words
# hash with stemmed words as keys and counts as values
c.train :spam, "This is some spammy text"
c.train :good, "This is not the bad stuff"

# This will return the most likely class (as a symbol)
puts c.classify "This is some spammy text"

# This will return a Hash with classes as keys and
# distances >= 0 as values
puts c.distances "This is some spammy text"

# get a list of all classes
puts c.classnames

# close connection
storage.close
Storage Methods
Ankusa has a generalized storage interface that has been implemented for HBase, Cassandra, Mongo, single file, and in-memory storage.
Memory storage can be used when you have a very small corpus.
require 'ankusa/memory_storage'

storage = Ankusa::MemoryStorage.new
FileSystem storage can be used when you have a very small corpus and want to persist the classification results.
require 'ankusa/file_system_storage'

storage = Ankusa::FileSystemStorage.new '/path/to/file'

# Do classification ...

storage.save
The FileSystem storage does NOT save to the filesystem automatically; the #save method must be invoked to persist the results.
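A typical session is therefore train, save, and later reconstruct the storage from the same path. The assumption below that the file is read back in when the storage object is constructed is my reading of the class; check ankusa/file_system_storage.rb if in doubt:

require 'ankusa'
require 'ankusa/file_system_storage'

storage = Ankusa::FileSystemStorage.new '/path/to/file'
c = Ankusa::NaiveBayesClassifier.new storage
c.train :spam, "This is some spammy text"

# nothing is written until you ask for it
storage.save

# later, in another process: point at the same file to reuse the counts
# (assumption: the storage loads the file when constructed)
storage = Ankusa::FileSystemStorage.new '/path/to/file'
c = Ankusa::NaiveBayesClassifier.new storage
puts c.classify "This is some spammy text"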
HBase storage:
require 'ankusa/hbase_storage'

# defaults: host='localhost', port=9090,
#           frequency_tablename="ankusa_word_frequencies",
#           summary_tablename="ankusa_summary"
storage = Ankusa::HBaseStorage.new host, port, frequency_tablename, summary_tablename
For Cassandra storage:
- You will need Cassandra version 0.7.0-rc2 or greater.
- You will need to set a maximum number of classes, since the current implementation of the Ruby Cassandra client doesn't support table scans.
- Prior to using the Cassandra storage you will need to run the following command from the cassandra-cli, as shown below: "create keyspace ankusa with replication_factor = 1". This should be fixed with a new release candidate for Cassandra.
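From the CLI that step looks roughly like this (the connection flags may differ between Cassandra releases):

# start the CLI shipped with Cassandra, e.g.:
#   bin/cassandra-cli -host localhost -port 9160
create keyspace ankusa with replication_factor = 1;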
To use the Cassandra storage class:
require 'ankusa/cassandra_storage'

# defaults: host='127.0.0.1', port=9160, keyspace='ankusa', max_classes=100
storage = Ankusa::CassandraStorage.new host, port, keyspace, max_classes
For MongoDB storage:
require 'ankusa/mongo_db_storage'

# defaults: :host => "localhost", :port => 27017, :db => "ankusa"
# no default username or password
# you can also use the frequency_tablename and summary_tablename options
storage = Ankusa::MongoDbStorage.new :host => "localhost", :port => 27017, :db => "ankusa"
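If your MongoDB instance requires authentication, you can also pass credentials; the :username and :password option names below are an assumption, so check ankusa/mongo_db_storage.rb for the exact keys:

require 'ankusa/mongo_db_storage'

# :username / :password are assumed option names (there are no defaults) --
# verify them against ankusa/mongo_db_storage.rb
storage = Ankusa::MongoDbStorage.new :host => "localhost", :port => 27017,
                                     :db => "ankusa",
                                     :username => "user", :password => "secret"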
Running Tests
You can run the tests for any of the storage methods. For instance, for memory storage:
rake test_memory
For the other methods you will need to edit the file test/config.yml and set the configuration params. Then:
rake test_hbase
# or
rake test_cassandra
# or
rake test_filesystem
# or
rake test_mongo_db