A MATLAB version of our new classification algorithm with guaranteed sensitivity and specificity has been released.
The main feature of this classification algorithm is that it is "self-testing": the sensitivity and specificity of the trained classifier can be tested and certified by a rigorous statistical method, without the need for an independent test set. Therefore, strange as it may seem, if you trust math (and your data are i.i.d.), then you need no validation set: all of your precious data can be used in the training phase!
Moreover, the algorithm allows the user to control the sensitivity-specificity balance by means of two input parameters.
The new algorithm uses a simpler construction than its noble ancestor GEM, which makes it easier to analyse and super-easy to implement.
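To give a flavour of how simple this is, here is a minimal sketch of how a classifier built from labelled balls could be evaluated. The struct layout, the "first containing ball decides" rule, and the default label are illustrative assumptions made for this sketch, not the package's actual data structures.

    % Illustrative only: the representation below is an assumption for
    % this sketch, not the GEM-BALLS package's internal data structure.
    balls(1) = struct('c', [0 0], 'r', 1.0, 'label', 1);   % a labelled ball
    balls(2) = struct('c', [2 2], 'r', 1.5, 'label', 0);   % another one
    default_label = 0;             % label for points outside every ball

    x = [0.5 0.3];                 % a query point
    yhat = default_label;
    for k = 1:numel(balls)
        if norm(x - balls(k).c) <= balls(k).r
            yhat = balls(k).label; % first ball containing x decides
            break
        end
    end
    fprintf('predicted label for x: %d\n', yhat);

Evaluation is this cheap; the statistical guarantees, of course, come from how the balls are chosen during training.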
Try it immediately on your problem, and let us know! :)
DOWNLOAD
GEM-BALLS
By downloading files from this website, you are accepting the following agreement.
We (the licensee) understand that the GEM-BALLS package is supplied "as is", without expressed or implied warranty.
We agree to the following:
- The licensors do not have any obligation to provide any maintenance or consulting help with respect to GEM-BALLS.
- The licensors have no responsibility for the use of classifiers built through GEM-BALLS, nor for the correctness of GEM-BALLS itself.
- We will only use GEM-BALLS for non-profit research purposes. This implies that neither GEM-BALLS nor any part of its code should be used or modified for any commercial software product.
REFERENCE
Please cite the following paper (bibtex) when referring to our algorithm:
"A New Classification Algorithm With Guaranteed Sensitivity and Specificity for Medical Applications", by A. Carè, F.A. Ramponi, M.C. Campi. IEEE Control Systems Letters, vol. 2, no. 3, pp. 393-398, July 2018. (pdf copy here)
(The slides of a presentation at AUTOMATICA.IT 2019 can be found here.)
QUICK START
Type >> help traingemballs at the MATLAB prompt for a general description of the MATLAB functions.
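As a starting point, here is a minimal sketch of a first experiment. Everything about the call at the end (the argument order, the names k0 and k1 for the two balance parameters, the output) is a hypothetical guess for illustration; the actual interface is the one documented by help traingemballs.

    % Hypothetical usage sketch: the signature of traingemballs assumed
    % below is a guess; type "help traingemballs" for the real interface.
    rng(1);                                % reproducible toy data
    X = randn(300, 2);                     % one sample per row
    y = double(X(:,1) + 0.5*X(:,2) > 0);   % labels in {0, 1}

    % Train on ALL the data: no validation split is needed, since the
    % method certifies sensitivity and specificity on its own. k0 and k1
    % stand for the two sensitivity-specificity balance parameters
    % (hypothetical names).
    % classifier = traingemballs(X, y, k0, k1);   % hypothetical call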
Here you can find another MATLAB example where a pool of GEM-BALLS classifiers is built from the same training set. Using many GEM-BALLS classifiers together might considerably improve performance; we are currently researching this (a toy illustration of the majority-voting idea follows the references below):
"A study on majority-voting classifiers
with guarantees on the probability of error"
by A. Carè, M.C. Campi, F.A.
Ramponi, S. Garatti, A.T.J.R. Cobbenhagen
IFAC World Congress 2020 (pdf copy
here)
"Consensus and reliability: the case of two binary classifiers",
by A.T.J.R Cobbenhagen, A. Carè, M.C. Campi, F.A. Ramponi,
W.P.M.H. Heemels.
IFAC Papers-OnLine, vol. 52, no. 20, pp.
73-78, 2019. https://doi.org/10.1016/j.ifacol.2019.12.129 (pdf
copy here)
"Novel bounds on the probability of
misclassification in majority voting: leveraging the majority size",
by A.T.J.R Cobbenhagen, A. Carè, M.C. Campi, F.A. Ramponi,
D.J. Antunes, W.P.M.H. Heemels. IEEE
Control Systems Letters, vol. 5, no. 5, pp. 1513-1518, 2020. https://doi.org/10.1109/LCSYS.2020.3040961 (pdf
copy here)
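As a toy illustration of the majority-voting idea (and not of the exact schemes analysed in the papers above), the following sketch aggregates the 0/1 votes of a pool of classifiers; the three function handles are stand-ins for trained GEM-BALLS classifiers.

    % Toy majority vote over a pool of classifiers. The function handles
    % below are stand-ins for trained GEM-BALLS classifiers.
    pool = { @(x) double(x(1) > 0), ...
             @(x) double(x(1) + x(2) > 0), ...
             @(x) double(x(2) > -0.1) };

    x = [0.3, -0.2];                           % a query point
    votes = cellfun(@(clf) clf(x), pool);      % one 0/1 vote per classifier
    yhat = double(sum(votes) > numel(pool)/2); % strict majority decides
    fprintf('votes: [%s] -> majority label %d\n', num2str(votes), yhat);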
THIRD-PARTY IMPLEMENTATION OF ENSEMBLE GEM-BALLS
Omar Younis has published a classification scheme that combines many GEM-BALLS classifiers. Here you can find the code. Here you can find the documentation.
TRIVIA
Here is an instance of a GEM-BALLS classifier (red=1, white=0) that was trained by Roy Cobbenhagen.
F.A. Ramponi argued that GEM-BALLS classifiers bear similarities to some of Umberto Boccioni's sculptures. Shall we start talking about Boccioni classifiers? (In Italian, "boccioni" also means "big bowls/balls"!)