When using Matlab’s mnrfit to train a multinomial logistic regression classifier recently, I found it rather memory-consuming. Specifically, when training a classifier on thousands of samples with tens of thousands of features, it used up the 32GB of RAM on a workstation and forced it to fall back on considerable virtual memory. What should’ve been a CPU-bound problem then became an HDD-bound problem, and I never received my result (it simply took far too long).

After reading about the awesome optimization code minFunc, I decided to implement a classifier on my own. Thanks to minFunc and its examples, I was able to finish this little piece of code and publish it here.

Setup & Usage

You will need to download minFunc from its website and make sure it works. That is, add the minFunc directory to your Matlab path (e.g. addpath('./minFunc')) and run mexAll, which may be required the first time you use it. Then, add all of minFunc’s subdirectories to your Matlab path (e.g. addpath(genpath('./minFunc'))) and you’ll be able to run the following test case.
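Put together, the one-time setup looks like this (the path ./minFunc is an assumption about where you unzipped the package; adjust it to your layout):

```matlab
% One-time minFunc setup; './minFunc' is assumed to be the unzip location.
addpath('./minFunc');            % make mexAll visible on the path
mexAll;                          % compile the mex files (first run only)
addpath(genpath('./minFunc'));   % add all of minFunc's subdirectories
```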

>> x = repmat(eye(3),[10 1]); y = repmat([1;2;3],[10 1]); x = x + 0.1*randn(size(x));
>> model = mnlr_fit(x, y);
>> prediction = mnlr_predict(model, x, 1);
>> sum(prediction == y) / length(y)

ans =

     1

That is, we just trained our 3-way classifier on a small dataset of 30 samples and achieved 100% training accuracy.
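For the curious, the heart of such a classifier is just the softmax negative log-likelihood and its gradient, packaged in the two-output [f, g] form that minFunc expects. The sketch below is illustrative only, not the actual internals of mnlr_fit: the function name softmax_nll and the zero-column parameterization (fixing the last class’s weights at zero to remove the redundant degrees of freedom) are choices made here for the example.

```matlab
% Illustrative sketch of a softmax objective for minFunc (not mnlr_fit's code).
% w : D*(K-1)-by-1 parameter vector; X : N-by-D data; y : N-by-1 labels in 1..K.
function [f, g] = softmax_nll(w, X, y, K)
    [N, D] = size(X);
    W = [reshape(w, D, K-1), zeros(D, 1)];       % class K weights fixed at zero
    Z = X * W;                                   % N-by-K scores
    Z = bsxfun(@minus, Z, max(Z, [], 2));        % subtract row max for stability
    logP = bsxfun(@minus, Z, log(sum(exp(Z), 2)));  % log-probabilities
    idx = sub2ind([N, K], (1:N)', y);            % linear indices of true classes
    f = -sum(logP(idx));                         % negative log-likelihood
    P = exp(logP);                               % N-by-K class probabilities
    T = full(sparse(1:N, y, 1, N, K));           % one-hot target matrix
    G = X' * (P - T);                            % D-by-K gradient
    g = reshape(G(:, 1:K-1), [], 1);             % drop the fixed class-K column
end
```

Handing this to minFunc is then a one-liner: w = minFunc(@softmax_nll, zeros(D*(K-1), 1), [], X, y, K).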


Source code files: