MERCS

MERCS stands for Multi-directional Ensembles of Regression and Classification treeS. It is a novel machine-learning paradigm under active development at the DTAI lab at KU Leuven.

Installation

Easy via pip:

pip install mercs-mixed

Quickstart

Now that you have installed MERCS, you are ready to fit your first versatile model: see the quickstart.
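
Below is a minimal sketch of what fitting and querying a versatile model could look like. The Mercs class, the nominal_attributes argument, and the q_code query encoding are assumptions based on a typical MERCS workflow, not confirmed API; the quickstart is the authoritative reference.

import numpy as np
from sklearn.datasets import load_iris

from mercs import Mercs  # assumed import path

# One table of attributes; no fixed input/target split is declared up front.
iris = load_iris()
data = np.c_[iris.data, iris.target]

model = Mercs()  # assumed constructor with default settings
model.fit(data, nominal_attributes={4})  # assumption: mark column 4 (the class) as nominal

# Assumed query encoding: 0 = input attribute, 1 = target attribute.
q_code = np.array([0, 0, 0, 0, 1], dtype=np.int8)
y_pred = model.predict(data, q_code=q_code)  # predict the class from the other attributes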

Source code

MERCS is fully open source; see our GitHub repository.

Run/Build locally

To run the project, you need Poetry. Once installed:

  1. Clone the repository.
  2. Run poetry install.
  3. The development environment is ready. You can test it by running pytest.

Publications

MERCS is an active research project, so we periodically publish our findings:

MERCS: Multi-Directional Ensembles of Regression and Classification Trees

Abstract Learning a function f(X) that predicts Y from X is the archetypal Machine Learning (ML) problem. Typically, both sets of attributes (i.e., X,Y) have to be known before a model can be trained. When this is not the case, or when functions f(X) that predict Y from X are needed for varying X and Y, this may introduce significant overhead (separate learning runs for each function). In this paper, we explore the possibility of omitting the specification of X and Y at training time altogether, by learning a multi-directional, or versatile model, which will allow prediction of any Y from any X. Specifically, we introduce a decision tree-based paradigm that generalizes the well-known Random Forests approach to allow for multi-directionality. The result of these efforts is a novel method called MERCS: Multi-directional Ensembles of Regression and Classification treeS. Experiments show the viability of the approach.

Authors Elia Van Wolputte, Evgeniya Korneva, Hendrik Blockeel

Open Access A PDF version can be found at AAAI publications.
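
To make the multi-directionality described in the abstract concrete, the sketch below mimics the "any Y from any X" interface with plain scikit-learn decision trees by training one tree per attribute. This is only an illustration of the problem setting, not the MERCS method itself, which learns a single ensemble whose trees cover many input/target combinations rather than retraining per target.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

# One table of attributes; the last column is nominal (the iris class).
iris = load_iris()
data = np.c_[iris.data, iris.target]
nominal = {data.shape[1] - 1}

# Train one tree per attribute: attribute j is the target, the rest are inputs.
trees = {}
for j in range(data.shape[1]):
    X = np.delete(data, j, axis=1)
    y = data[:, j]
    model = DecisionTreeClassifier() if j in nominal else DecisionTreeRegressor()
    trees[j] = model.fit(X, y)

# Any attribute can now serve as the target, e.g. attribute 0 (sepal length).
query_target = 0
X_query = np.delete(data, query_target, axis=1)
predictions = trees[query_target].predict(X_query)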

People

People involved in this project: