{"id":275,"date":"2017-12-07T08:58:38","date_gmt":"2017-12-07T10:58:38","guid":{"rendered":"http:\/\/web.inf.ufpr.br\/vri\/?page_id=275"},"modified":"2018-04-05T17:22:15","modified_gmt":"2018-04-05T20:22:15","slug":"dynse","status":"publish","type":"page","link":"https:\/\/web.inf.ufpr.br\/vri\/software\/dynse\/","title":{"rendered":"Dynse &#8211; Dynamic Selection Based Drift Handler"},"content":{"rendered":"<h3>What is the Dynse?<\/h3>\n<p>The Dynse (Dynamic Selection Based Drift Handler) is a Framework developed to deal with Concept Drift Problems by means of the Dynamic Classifiers Selection.<\/p>\n<p>The Framework was built in Java using the <a href=\"http:\/\/moa.cms.waikato.ac.nz\/\">MOA<\/a> and <a href=\"http:\/\/www.cs.waikato.ac.nz\/ml\/weka\/\">Weka<\/a> Frameworks.<\/p>\n<h3>Dataset<\/h3>\n<p>We propose the PKLot dataset as concept drift benchmark. The protocol is described <a href=\"http:\/\/web.inf.ufpr.br\/vri\/software\/dynse\/the-pklot-for-concept-drift-scenarios\/\">here<\/a>.<\/p>\n<h3>License<\/h3>\n<p>The Dynse Framework is an open source project issued under the <a href=\"https:\/\/www.gnu.org\/licenses\/\">GNU General Public License 3<\/a>.<\/p>\n<h3>You will Need<\/h3>\n<ol>\n<li><a href=\"https:\/\/java.com\/en\/download\/\" target=\"_blank\" rel=\"noopener\">Java 1.8<\/a><\/li>\n<li><a href=\"https:\/\/www.eclipse.org\/downloads\/?\" target=\"_blank\" rel=\"noopener\">Eclipse Neon IDE<\/a> &#8211; Alternatively you may build using Apache Maven<\/li>\n<li><a href=\"https:\/\/maven.apache.org\/\" target=\"_blank\" rel=\"noopener\">Apache Maven<\/a> &#8211; Necessary even if you are using Eclipse<\/li>\n<\/ol>\n<h3>Get the Project from the repository<\/h3>\n<p>The Project is hosted on the GitHub platform. 
You may download the project using one of the following three options:<\/p>\n<p><b>Clone using an SSH key: <\/b><i>git clone git@github.com:paulorla\/dynse.git<\/i><\/p>\n<p><b>Anonymous clone: <\/b><i>git clone https:\/\/github.com\/paulorla\/dynse.git<\/i><\/p>\n<p><b>Download Zip:<\/b> <a href=\"https:\/\/github.com\/paulorla\/dynse\/archive\/master.zip\"><i>dynse.zip<\/i><\/a><\/p>\n<h3>Import the Project into Eclipse<\/h3>\n<ol>\n<li>Clone dynse into a directory of your choice. See <a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseframework.html#cloneRepository\">how to get the framework from the repository<\/a><\/li>\n<li>A <i>dynse<\/i> directory will be created<\/li>\n<li>Open Eclipse, passing the <i>dynse<\/i> directory (the one that has the folders <i>dynse<\/i> and <i>RemoteSystemsTempFiles<\/i> in it) as the path for the Workspace. If you can&#8217;t change your Workspace, check the <a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseframework.html#faq\">FAQ<\/a>.<\/li>\n<li>Go to <i>File -&gt; Import -&gt; Existing Projects Into Workspace<\/i><\/li>\n<li>In <i>Select Root Directory<\/i>, select <i>Browse<\/i>, and then click <i>OK<\/i> (the root directory is the same one you used to create the workspace)<\/li>\n<li>In <i>Projects<\/i>, the <i>dynse<\/i> project will appear checked.<\/li>\n<li>Click <i>Finish<\/i>. 
It may take a while for Eclipse to import the project and automatically download the project dependencies (through Apache Maven)<\/li>\n<li>Add the <a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseframework.html#sizeOfFag\">SizeOfFag<\/a> path in the Run Configuration.<\/li>\n<\/ol>\n<h3>Build Using Apache Maven (Optional)<\/h3>\n<ol>\n<li>Open a terminal in the <i>dynse<\/i> directory (the directory that has the pom.xml file)<\/li>\n<li>Execute the command: <i>mvn install<\/i><\/li>\n<li>The built jar will be generated in the <i>target<\/i> directory<\/li>\n<\/ol>\n<h3>SizeOfFag<\/h3>\n<p>The <i>sizeofag-1.0.0.jar<\/i> can be found <a href=\"https:\/\/code.google.com\/archive\/p\/sizeofag\/downloads\" target=\"_blank\" rel=\"noopener\"><b>here<\/b><\/a><\/p>\n<p><b>Eclipse<\/b><\/p>\n<p>Right-click on your project in Eclipse <i>-&gt; Run As -&gt; Run Configurations&#8230;<\/i><\/p>\n<p>Go to the <i>Arguments<\/i> tab<\/p>\n<p>In the <i>VM Arguments<\/i> box, insert<\/p>\n<p><i>-javaagent:[PATH_TO_SIZE_OF_FAG]\/sizeofag-1.0.0.jar<\/i><\/p>\n<p>Example<\/p>\n<p>-javaagent:\/home\/user\/.m2\/repository\/com\/googlecode\/sizeofag\/sizeofag\/1.0.0\/sizeofag-1.0.0.jar<\/p>\n<p><b>Command Line<\/b><\/p>\n<p>If you are running the project from the command line (e.g. 
you built the project using Apache Maven), just execute the generated jar as follows:<\/p>\n<p>java -javaagent:[PATH_TO_SIZE_OF_FAG]\/sizeofag-1.0.0.jar -jar [DYNSE_JAR].jar<\/p>\n<p>Example<\/p>\n<p>java -javaagent:\/home\/user\/.m2\/repository\/com\/googlecode\/sizeofag\/sizeofag\/1.0.0\/sizeofag-1.0.0.jar -jar dynse.jar<\/p>\n<h3>Datasets<\/h3>\n<p><a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseFramework\/nebraskaWeather-Norm.arff\" target=\"_blank\" rel=\"noopener\"><b>Nebraska-Norm<\/b><\/a> &#8211; The normalized version of the Nebraska Weather Dataset (the original version can be found <a href=\"http:\/\/users.rowan.edu\/~polikar\/research\/NSE\/\">here<\/a>).<\/p>\n<p><a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseFramework\/checkerboard_data.zip\" target=\"_blank\" rel=\"noopener\"><b>Checkerboard<\/b><\/a> &#8211; The arff-converted files from the checkerboard datasets (the original version can be found <a href=\"http:\/\/users.rowan.edu\/~polikar\/research\/NSE\/\">here<\/a>).<\/p>\n<p><a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseFramework\/gaussian.zip\" target=\"_blank\" rel=\"noopener\"><b>Gaussian<\/b><\/a> &#8211; The arff-converted files from the gaussian datasets (the original version can be found <a href=\"http:\/\/users.rowan.edu\/~polikar\/research\/NSE\/\">here<\/a>).<\/p>\n<p><a href=\"http:\/\/sourceforge.net\/projects\/moa-datastream\/files\/Datasets\/Classification\/covtypeNorm.arff.zip\/download\/\" target=\"_blank\" rel=\"noopener\"><b>Forest Covertype<\/b><\/a> &#8211; The normalized arff of the Forest Covertype dataset (note: this dataset is hosted in the MOA repository).<\/p>\n<p><a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseFramework\/digit-Norm.arff\" target=\"_blank\" rel=\"noopener\"><b>Digit-Norm<\/b><\/a> &#8211; The normalized version of the Digit Dataset (the original version can be found <a 
href=\"https:\/\/archive.ics.uci.edu\/ml\/datasets\/Optical+Recognition+of+Handwritten+Digits\">here<\/a>).<\/p>\n<p><a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseFramework\/digit.arff\" target=\"_blank\" rel=\"noopener\"><b>Digit<\/b><\/a> &#8211; The non-normalized version of the Digit Dataset (in this version the data were just put in the arff format).<\/p>\n<p><a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseFramework\/letters-Norm.arff\" target=\"_blank\" rel=\"noopener\"><b>Letters-Norm<\/b><\/a> &#8211; The normalized version of the Letters Dataset (the original version can be found <a href=\"https:\/\/archive.ics.uci.edu\/ml\/datasets\/letter+recognition\">here<\/a>).<\/p>\n<p><a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseFramework\/letters.arff\" target=\"_blank\" rel=\"noopener\"><b>Letters<\/b><\/a> &#8211; The non-normalized version of the Letters Dataset (in this version the data were just put in the arff format).<\/p>\n<h3>Create a Dynse using the Default Factory<\/h3>\n<p>The project contains a default factory to build a ready to use Dynse Configuration. 
To use the factory, follow the example:<\/p>\n<div>\n<div class=\"syntaxhighlighter nogutter java\">\n<table border=\"0\" cellspacing=\"0\" cellpadding=\"0\">\n<tbody>\n<tr>\n<td class=\"code\">\n<div class=\"container\">\n<div class=\"line number1 index0 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java plain\">AbstractDynseFactory realConceptDriftFactory = <\/code><code class=\"java keyword\">new<\/code> <code class=\"java plain\">RealConceptDriftDynseFactory();<\/code><\/span><\/div>\n<div class=\"line number2 index1 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java plain\">StreamDynse dynse = realConceptDriftFactory.createDefaultDynseKE(<\/code><code class=\"java value\">100<\/code><code class=\"java plain\">);<\/code><\/span><\/div>\n<div class=\"line number3 index2 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<\/div>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/div>\n<\/div>\n<p>The above example creates a configuration of the Dynse framework prepared for real concept drift, using KNORA-Eliminate as the classification engine, where a new classifier is trained for every 100 supervised instances received. 
To create a configuration for virtual concept drifts, follow the example:<\/p>\n<div>\n<div class=\"syntaxhighlighter nogutter java\">\n<table border=\"0\" cellspacing=\"0\" cellpadding=\"0\">\n<tbody>\n<tr>\n<td class=\"code\">\n<div class=\"container\">\n<div class=\"line number1 index0 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java plain\">AbstractDynseFactory virtualConceptDriftFactory = <\/code><code class=\"java keyword\">new<\/code> <code class=\"java plain\">VirtualConceptDriftDynseFactory();<\/code><\/span><\/div>\n<div class=\"line number2 index1 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java plain\">StreamDynse dynse = virtualConceptDriftFactory.createDefaultDynseKE(<\/code><code class=\"java value\">100<\/code><code class=\"java plain\">);<\/code><\/span><\/div>\n<div class=\"line number3 index2 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<\/div>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/div>\n<\/div>\n<p>In both scenarios, the created <i>dynse<\/i> object can be used as any MOA classifier.<\/p>\n<p>The factories are prepared to build a Dynse configuration using any of the implemented Classification Engines (OLA, KNORA-E, A Priori, etc.).<\/p>\n<p>The default configuration is:<\/p>\n<p>Nearest Neighbors: 9 for the KNORA-E (and a slack = 2), and 5 for the other classification engines.<\/p>\n<p>Accuracy Estimation Window Size: <i>4 x the number of train instances<\/i> for real concept drifts, and <i>32 x the number of train instances<\/i> for virtual concept drifts.<\/p>\n<p>Base Classifier: The pool of classifiers is built using Naive Bayes classifiers.<\/p>\n<p>Pruning: Only the latest 25 classifiers are kept in the pool.<\/p>\n<h3>Using the Testbeds<\/h3>\n<p>The project 
includes a series of testbeds using different datasets as <b>examples<\/b>.<\/p>\n<p>The testbeds are available in the package br.ufpr.dynse.testbed<\/p>\n<p>You will need to manually select the test you want in the method <i>executeTests<\/i> of your testbed, and you may need to download the necessary dataset and set the path to it when the dataset is not artificially generated by MOA (e.g., the Digit testbed).<\/p>\n<h3>Creating a custom configuration<\/h3>\n<p>If you do not want to use a default configuration, you can create your own custom configuration of the Dynse framework.<\/p>\n<p>First of all, you will need to define the pruning engine. The original Dynse framework comes with three distinct implementations in the package <i>br.ufpr.dynse.pruningengine<\/i>: <i>AgeBasedPruningEngine<\/i> (removes the oldest classifier), <i>AccuracyBasedPruningEngine<\/i> (removes the worst-performing classifier according to the current accuracy estimation window), and <i>NoPrunePruningEngine<\/i> (keeps all classifiers in the pool). If you want to implement your own pruning engine, check the Section <a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseframework.html#extendPruningEngine\">Creating your own Pruning Engine<\/a><\/p>\n<p>It is also necessary to define a classifier factory, which is responsible for building the base classifiers that will be added to the pool. In the package <i>br.ufpr.dynse.classifier.factory<\/i> you can find a factory for Naive Bayes classifiers (<i>NaiveBayesFactory<\/i>) and one for HoeffdingTree classifiers (<i>HoeffdingTreeFactory<\/i>). Check the Section <a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseframework.html#extendClassifiersFactory\">Creating a Base Classifier Factory<\/a> to see how to build your own base classifier factory.<\/p>\n<p>The next step is to instantiate a Classification Engine. The Dynse framework already has many classification engines implemented. Just choose one in the package <i>br.ufpr.dynse.classificationengine<\/i>. 
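For intuition, the selection rule behind the KNORA-Eliminate engine mentioned above can be sketched independently of the framework. This is a simplified illustration, not the Dynse API: the class <i>KnoraEliminateSketch<\/i>, its method, and the boolean-matrix representation are stand-ins invented for this example.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the KNORA-Eliminate rule (not the Dynse API):
// for each classifier we know whether it labeled each of the k validation
// neighbors of the query correctly. Keep only classifiers correct on all
// k neighbors; if none qualifies, relax the requirement (the "slack")
// to k - 1, k - 2, ... until some classifier is selected.
class KnoraEliminateSketch {

    // correct[c][n] == true if classifier c labeled neighbor n correctly
    static List<Integer> selectCompetent(boolean[][] correct, int k) {
        for (int required = k; required >= 0; required--) {
            List<Integer> selected = new ArrayList<>();
            for (int c = 0; c < correct.length; c++) {
                int hits = 0;
                for (int n = 0; n < k; n++) {
                    if (correct[c][n]) hits++;
                }
                if (hits >= required) selected.add(c);
            }
            if (!selected.isEmpty()) {
                return selected; // this ensemble would then vote on the query
            }
        }
        return new ArrayList<>(); // unreachable: required == 0 accepts everyone
    }
}
```

The decreasing `required` threshold plays the role of the slack parameter: the requirement is progressively relaxed until at least one classifier qualifies.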
When instantiating a classification engine, you will also need to specify the number of neighbors considered to estimate the classifiers&#8217; competence in the current accuracy estimation window. To create your own classification engine, check the Section <a href=\"http:\/\/www.inf.ufpr.br\/prlalmeida\/dynseframework.html#extendClassificationEngine\">Creating your own Classification Engine<\/a>.<\/p>\n<p>Finally, you will need to instantiate a <i>StreamDynse<\/i> (our implementation of the Dynse Framework), passing the required information to the constructor, including the number of samples used to train each classifier, and the size of the accuracy estimation window in batches (note: the instances accumulated before creating a new classifier are considered a batch in our implementation).<\/p>\n<p>See a complete example below:<\/p>\n<div>\n<div class=\"syntaxhighlighter nogutter java\">\n<table border=\"0\" cellspacing=\"0\" cellpadding=\"0\">\n<tbody>\n<tr>\n<td class=\"code\">\n<div class=\"container\">\n<div class=\"line number1 index0 alt2\" style=\"text-align: left\"><code class=\"java keyword\"><span style=\"font-size: 10pt\">import<\/span><\/code><span style=\"font-size: 10pt\"> <code class=\"java plain\">br.ufpr.dynse.classificationengine.IClassificationEngine;<\/code><\/span><\/div>\n<div class=\"line number2 index1 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">import<\/code> <code class=\"java plain\">br.ufpr.dynse.classificationengine.KnoraEliminateClassificationEngine;<\/code><\/span><\/div>\n<div class=\"line number3 index2 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">import<\/code> <code class=\"java plain\">br.ufpr.dynse.classifier.competence.IMultipleClassifiersCompetence;<\/code><\/span><\/div>\n<div class=\"line number4 index3 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">import<\/code> <code class=\"java 
plain\">br.ufpr.dynse.core.StreamDynse;<\/code><\/span><\/div>\n<div class=\"line number5 index4 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">import<\/code> <code class=\"java plain\">br.ufpr.dynse.pruningengine.AgeBasedPruningEngine;<\/code><\/span><\/div>\n<div class=\"line number6 index5 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">import<\/code> <code class=\"java plain\">br.ufpr.dynse.pruningengine.DynseClassifierPruningMetrics;<\/code><\/span><\/div>\n<div class=\"line number7 index6 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">import<\/code> <code class=\"java plain\">br.ufpr.dynse.pruningengine.IPruningEngine;<\/code><\/span><\/div>\n<div class=\"line number8 index7 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\">\u00a0<\/span><\/div>\n<div class=\"line number9 index8 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">private<\/code> <code class=\"java keyword\">static<\/code> <code class=\"java keyword\">final<\/code> <code class=\"java keyword\">int<\/code> <code class=\"java plain\">POOL_SIZE = <\/code><code class=\"java value\">50<\/code><code class=\"java plain\">;<\/code><\/span><\/div>\n<div class=\"line number10 index9 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">private<\/code> <code class=\"java keyword\">static<\/code> <code class=\"java keyword\">final<\/code> <code class=\"java keyword\">int<\/code> <code class=\"java plain\">TRAIN_SIZE = <\/code><code class=\"java value\">200<\/code><code class=\"java plain\">;<\/code><\/span><\/div>\n<div class=\"line number11 index10 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">private<\/code> <code class=\"java keyword\">static<\/code> <code class=\"java keyword\">final<\/code> <code class=\"java 
keyword\">int<\/code> <code class=\"java plain\">ACC_WINDOW_SIZE = <\/code><code class=\"java value\">2<\/code><code class=\"java plain\">;<\/code><\/span><\/div>\n<div class=\"line number12 index11 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\">\u00a0<\/span><\/div>\n<div class=\"line number13 index12 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java comments\">\/\/...<\/code><\/span><\/div>\n<div class=\"line number14 index13 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\">\u00a0<\/span><\/div>\n<div class=\"line number15 index14 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">void<\/code> <code class=\"java plain\">myTestMethod(){<\/code><\/span><\/div>\n<div class=\"line number16 index15 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java comments\">\/\/AgeBasedPruningEngine - remove the oldest classifiers<\/code><\/span><\/div>\n<div class=\"line number17 index16 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">IPruningEngine&lt;DynseClassifierPruningMetrics &gt; prungingEngine = <\/code><code class=\"java keyword\">new<\/code> <code class=\"java plain\">AgeBasedPruningEngine(POOL_SIZE);<\/code><code class=\"java comments\">\/\/pool size set to 50<\/code><\/span><\/div>\n<div class=\"line number18 index17 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">AbstractClassifierFactory classifierFactory = <\/code><code class=\"java keyword\">new<\/code> <code class=\"java plain\">NaiveBayesFactory();<\/code><code class=\"java comments\">\/\/naive bayes as the base learners<\/code><\/span><\/div>\n<div 
class=\"line number19 index18 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">IClassificationEngine &lt;IMultipleClassifiersCompetence&gt; classificationEngine = <\/code><\/span><\/div>\n<div class=\"line number20 index19 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">new<\/code> <code class=\"java plain\">KnoraEliminateClassificationEngine(<\/code><code class=\"java value\">7<\/code><code class=\"java plain\">, <\/code><code class=\"java value\">1<\/code><code class=\"java plain\">);<\/code><code class=\"java comments\">\/\/K-E as the classification engine, 7 neighbors and slack variable = 1<\/code><\/span><\/div>\n<div class=\"line number21 index20 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">StreamDynse dynse = <\/code><code class=\"java keyword\">new<\/code> <code class=\"java plain\">StreamDynse(classifierFactory,<\/code><\/span><\/div>\n<div class=\"line number22 index21 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">TRAIN_SIZE, ACC_WINDOW_SIZE, classificationEngine, prungingEngine);<\/code><code class=\"java comments\">\/\/train a new classifier for every 200 supervised samples<\/code><\/span><\/div>\n<div class=\"line number23 index22 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java comments\">\/\/accuracy estimation window size set to 2 (i.e., 2x200 = 400 latest samples considered)<\/code><\/span><\/div>\n<div class=\"line number24 index23 alt1\" 
style=\"text-align: left\"><span style=\"font-size: 10pt\">\u00a0<\/span><\/div>\n<div class=\"line number25 index24 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java comments\">\/\/...<\/code><\/span><\/div>\n<div class=\"line number26 index25 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java plain\">}<\/code><\/span><\/div>\n<div class=\"line number27 index26 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<\/div>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/div>\n<\/div>\n<p>The instantiated <i>StreamDynse<\/i> can be used as any MOA classifier.<\/p>\n<h3>Extending the Framework<\/h3>\n<p>You may extend the Dynse framework and create your own classification engines, pruning engines, base classifier factories, etc.<\/p>\n<p>All you will need is to extend some classes and implement some interfaces, as explained in the next sections.<\/p>\n<p><b>Create a fork of the Git project and share your implementation with us and other scientists! 
\u00a0 =)<\/b><\/p>\n<h4>Creating your own Pruning Engine<\/h4>\n<p>In order to implement your own pruning engine, you will need to implement the <i>IPruningEngine<\/i> interface.<\/p>\n<div>\n<div class=\"syntaxhighlighter nogutter java\">\n<table border=\"0\" cellspacing=\"0\" cellpadding=\"0\">\n<tbody>\n<tr>\n<td class=\"code\">\n<div class=\"container\">\n<div class=\"line number1 index0 alt2\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">interface<\/code> <code class=\"java plain\">IPruningEngine&lt;T <\/code><code class=\"java keyword\">extends<\/code> <code class=\"java plain\">DynseClassifierPruningMetrics&gt; {<\/code><\/span><\/div>\n<div class=\"line number2 index1 alt1\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java comments\">\/\/returns the classifiers that must be removed from the pool<\/code><\/span><\/div>\n<div class=\"line number3 index2 alt2\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java plain\">List&lt;DynseClassifier&lt;T&gt;&gt; pruneClassifiers(DynseClassifier&lt;T&gt; newClassifier, List&lt;DynseClassifier&lt;T&gt;&gt; currentPool,<\/code><\/span><\/div>\n<div class=\"line number4 index3 alt1\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">List&lt;Instance&gt; accuracyEstimationInstances) <\/code><code class=\"java keyword\">throws<\/code> <code class=\"java plain\">Exception;<\/code><\/span><\/div>\n<div class=\"line number5 index4 alt2\"><span style=\"font-size: 10pt\"><code class=\"java 
spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number6 index5 alt1\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">void<\/code> <code class=\"java plain\">meassureClassifier(DynseClassifier&lt;T&gt; classifier) <\/code><code class=\"java keyword\">throws<\/code> <code class=\"java plain\">Exception;<\/code><\/span><\/div>\n<div class=\"line number7 index6 alt2\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number8 index7 alt1\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">void<\/code> <code class=\"java plain\">getPrunningEngineDescription(StringBuilder out);<\/code><\/span><\/div>\n<div class=\"line number9 index8 alt2\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number10 index9 alt1\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">void<\/code> <code class=\"java plain\">getPrunningEngineShortDescription(StringBuilder out);<\/code><\/span><\/div>\n<div class=\"line number11 index10 alt2\"><span style=\"font-size: 10pt\"><code class=\"java plain\">}<\/code><\/span><\/div>\n<div class=\"line number12 
index11 alt1\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/div>\n<\/div>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/div>\n<\/div>\n<p>Your class must implement <i>IPruningEngine<\/i>, passing a type parameter that extends <i>DynseClassifierPruningMetrics<\/i>, or the class <i>DynseClassifierPruningMetrics<\/i> itself. This class is necessary to define the metrics used in your pruning process (e.g., the classifier&#8217;s age).<\/p>\n<p>The <i>pruneClassifiers<\/i> method takes the newly created classifier, the current pool of classifiers, and the current accuracy estimation window. The method must <b>return a list containing the classifiers that must be pruned<\/b> (this list can be empty). This method <b>should not<\/b> alter the current pool.<\/p>\n<p>The <i>meassureClassifier<\/i> method takes a classifier (often the newest classifier created by the Dynse) and measures it according to the <i>DynseClassifierPruningMetrics<\/i> (e.g., populating the classifier&#8217;s creation time).<\/p>\n<p>The <i>getPrunningEngineDescription<\/i> method just writes a description of the pruning engine in <i>out<\/i>.<\/p>\n<p>The <i>getPrunningEngineShortDescription<\/i> method just writes a short description of the pruning engine in <i>out<\/i>.<\/p>\n<p>To see implementation examples, check the classes <i>AgeBasedPruningEngine<\/i> and <i>AccuracyBasedPruningEngine<\/i> available in the Dynse framework.<\/p>\n<p>Note that it may be necessary to extend <i>AbstractDynse<\/i> in order to generate a Dynse version compatible with your pruning metrics if you are not using the original <i>DynseClassifierPruningMetrics<\/i>, or you may use some casts, which may be unsafe. 
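The pruning contract described above (return what to remove, never mutate the pool) can be sketched with an age-based policy. This is a minimal, self-contained illustration: the generic class <i>AgeBasedPruneSketch<\/i> and its plain types are stand-ins for <i>DynseClassifier<\/i> and its metrics, not the actual Dynse classes.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified sketch of an age-based pruning engine: the pool is assumed
// oldest-first, and everything beyond the newest poolSize entries is
// returned for removal. The current pool itself is never mutated,
// matching the pruneClassifiers contract described above.
class AgeBasedPruneSketch<T> {
    private final int poolSize;

    AgeBasedPruneSketch(int poolSize) {
        this.poolSize = poolSize;
    }

    // Returns the classifiers that must be pruned (possibly an empty list).
    List<T> pruneClassifiers(T newestClassifier, List<T> currentPool) {
        List<T> all = new ArrayList<>(currentPool); // copy: do not touch the pool
        all.add(newestClassifier);                  // newest goes at the end
        int excess = all.size() - poolSize;
        // the 'excess' oldest classifiers are the ones to drop
        return excess > 0 ? new ArrayList<>(all.subList(0, excess))
                          : new ArrayList<>();
    }
}
```

A caller would remove the returned classifiers from its own pool afterwards; keeping the removal decision separate from the removal itself is what makes the engine easy to test and swap.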
Extending <i>AbstractDynse<\/i> is necessary because the <i>StreamDynse<\/i> was implemented to deal only with <i>DynseClassifierPruningMetrics<\/i> due to the <a href=\"https:\/\/docs.oracle.com\/javase\/tutorial\/java\/generics\/bridgeMethods.html\" target=\"_blank\" rel=\"noopener\"><b>type erasure problem<\/b><\/a> in Java.<\/p>\n<h4>Creating a Base Classifier Factory<\/h4>\n<p>You may use any base classifier in the Dynse Framework (e.g., SVM, Naive Bayes, KNN, etc.).<\/p>\n<p>For now, we have <i>Naive Bayes<\/i> (<i>NaiveBayesFactory<\/i>) and <i>HoeffdingTree<\/i> (<i>HoeffdingTreeFactory<\/i>) factories implemented. If you want a factory for a different base classifier, just extend the <i>AbstractClassifierFactory<\/i> class. See below an example of the implementation used in the <i>NaiveBayesFactory<\/i>:<\/p>\n<div>\n<div class=\"syntaxhighlighter nogutter java\">\n<table border=\"0\" cellspacing=\"0\" cellpadding=\"0\">\n<tbody>\n<tr>\n<td class=\"code\">\n<div class=\"container\">\n<div class=\"line number1 index0 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">class<\/code> <code class=\"java plain\">NaiveBayesFactory <\/code><code class=\"java keyword\">extends<\/code> <code class=\"java plain\">AbstractClassifierFactory {<\/code><\/span><\/div>\n<div class=\"line number2 index1 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number3 index2 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">private<\/code> <code class=\"java keyword\">static<\/code> <code class=\"java keyword\">final<\/code> <code class=\"java keyword\">long<\/code> <code class=\"java plain\">serialVersionUID = <\/code><code class=\"java value\">1L<\/code><code class=\"java 
plain\">;<\/code><\/span><\/div>\n<div class=\"line number4 index3 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number5 index4 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java color1\">@Override<\/code><\/span><\/div>\n<div class=\"line number6 index5 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java plain\">Classifier createClassifier() <\/code><code class=\"java keyword\">throws<\/code> <code class=\"java plain\">Exception {<\/code><\/span><\/div>\n<div class=\"line number7 index6 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">NaiveBayes classifier = <\/code><code class=\"java keyword\">new<\/code> <code class=\"java plain\">NaiveBayes();<\/code><\/span><\/div>\n<div class=\"line number8 index7 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">classifier.prepareForUse();<\/code><\/span><\/div>\n<div class=\"line number9 index8 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">return<\/code> <code class=\"java plain\">classifier;<\/code><\/span><\/div>\n<div class=\"line number10 index9 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java 
spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">}<\/code><\/span><\/div>\n<div class=\"line number11 index10 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number12 index11 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java color1\">@Override<\/code><\/span><\/div>\n<div class=\"line number13 index12 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">void<\/code> <code class=\"java plain\">getDescription(StringBuilder out) {<\/code><\/span><\/div>\n<div class=\"line number14 index13 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">out.append(<\/code><code class=\"java string\">\"Naive Bayes Factory\"<\/code><code class=\"java plain\">);<\/code><\/span><\/div>\n<div class=\"line number15 index14 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">}<\/code><\/span><\/div>\n<div class=\"line number16 index15 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number17 index16 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java 
spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java color1\">@Override<\/code><\/span><\/div>\n<div class=\"line number18 index17 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">void<\/code> <code class=\"java plain\">getShortDescription(StringBuilder out) {<\/code><\/span><\/div>\n<div class=\"line number19 index18 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">out.append(<\/code><code class=\"java string\">\"NB\"<\/code><code class=\"java plain\">);<\/code><\/span><\/div>\n<div class=\"line number20 index19 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">}<\/code><\/span><\/div>\n<div class=\"line number21 index20 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java plain\">}<\/code><\/span><\/div>\n<div class=\"line number22 index21 alt1\" style=\"text-align: left\"><code class=\"java spaces\"><span style=\"font-size: 10pt\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/span>\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/div>\n<\/div>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/div>\n<\/div>\n<p>The most important overridden method is <i>createClassifier<\/i>, which must create and <i>prepareForUse<\/i> a base classifier (note that this method does not train the classifier).<\/p>\n<p>The methods <i>getDescription<\/i> and <i>getShortDescription<\/i> just append a full and a short description of the factory, respectively.<\/p>\n<h4>Creating your own Classification Engine<\/h4>\n<p>You may find several classification engines pre-implemented in the package 
<i>br.ufpr.dynse.classificationengine<\/i> (OLA, KNORA-E, A Priori, etc.).<\/p>\n<p>If you want to implement your own classification engine, you must implement the <i>IClassificationEngine<\/i> interface, described below:<\/p>\n<div>\n<div class=\"syntaxhighlighter nogutter java\">\n<table border=\"0\" cellspacing=\"0\" cellpadding=\"0\">\n<tbody>\n<tr>\n<td class=\"code\">\n<div class=\"container\">\n<div class=\"line number1 index0 alt2\" style=\"text-align: left\"><code class=\"java keyword\"><span style=\"font-size: 10pt\">public<\/span><\/code><span style=\"font-size: 10pt\"> <code class=\"java keyword\">interface<\/code> <code class=\"java plain\">IClassificationEngine&lt;U <\/code><code class=\"java keyword\">extends<\/code> <code class=\"java plain\">IMultipleClassifiersCompetence&gt; <\/code><code class=\"java keyword\">extends<\/code> <code class=\"java plain\">Serializable{<\/code><\/span><\/div>\n<div class=\"line number2 index1 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\">\u00a0<\/span><\/div>\n<div class=\"line number3 index2 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">double<\/code><code class=\"java plain\">[] classify(Instance instance, List&lt;DynseClassifier&lt;DynseClassifierPruningMetrics&gt;&gt; availableClassifiers,<\/code><\/span><\/div>\n<div class=\"line number4 index3 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java plain\">Map&lt;Instance, U&gt; competenceMappings, NearestNeighbourSearch nnSearch) <\/code><code class=\"java keyword\">throws<\/code> <code class=\"java plain\">Exception;<\/code><\/span><\/div>\n<div class=\"line number5 index4 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code 
class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number6 index5 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java plain\">NearestNeighbourSearch createNeighborSearchMethod();<\/code><\/span><\/div>\n<div class=\"line number7 index6 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number8 index7 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">boolean<\/code> <code class=\"java plain\">getMapOnlyCorrectClassifiers();<\/code><\/span><\/div>\n<div class=\"line number9 index8 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number10 index9 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java plain\">List&lt;DynseClassifier&lt;DynseClassifierPruningMetrics&gt;&gt; getClassifiersUsedInLastClassification();<\/code><\/span><\/div>\n<div class=\"line number11 index10 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java 
spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number12 index11 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">void<\/code> <code class=\"java plain\">getClassificationEngineDescription(StringBuilder out);<\/code><\/span><\/div>\n<div class=\"line number13 index12 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number14 index13 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">void<\/code> <code class=\"java plain\">getClassificationEngineShortDescription(StringBuilder out);<\/code><\/span><\/div>\n<div class=\"line number15 index14 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<div class=\"line number16 index15 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0<\/code><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">void<\/code> <code class=\"java plain\">reset();<\/code><\/span><\/div>\n<div class=\"line number17 index16 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java 
plain\">}<\/code><\/span><\/div>\n<div class=\"line number18 index17 alt1\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/span><\/div>\n<\/div>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/div>\n<\/div>\n<p>Your class must be parameterized with a type that extends the <i>IMultipleClassifiersCompetence<\/i> interface, or with <i>IMultipleClassifiersCompetence<\/i> itself.<\/p>\n<p>The method <i>classify<\/i> takes the instance to be classified, a list containing all available classifiers in the pool, a mapping of the classifiers&#8217; competences, and a nearest neighbors search object that can be used to find the k-nearest neighbors of the current instance. This method must return a double array representing the classification (i.e., the <i>a posteriori<\/i> probabilities) of the instance.<\/p>\n<p>The method <i>createNeighborSearchMethod<\/i> must return a nearest neighbors search object. 
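As a plain-Java illustration of the <i>classify</i> contract just described (this sketch is ours, not part of the framework; the MOA/Weka types and the competence mapping are omitted, and all names here are hypothetical), the step that combines the selected classifiers' decisions into a single <i>a posteriori</i> array could look like:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch (not framework code): combine the a posteriori
// probability arrays produced by the selected classifiers by majority vote.
// Each classifier votes for its most probable class; the returned array
// holds the normalized vote counts, matching the double[] contract of classify().
public class MajorityVoteSketch {
    public static double[] combineByMajorityVote(List<double[]> perClassifierProbs, int numClasses) {
        double[] votes = new double[numClasses];
        for (double[] probs : perClassifierProbs) {
            int best = 0;
            for (int c = 1; c < numClasses; c++) {
                if (probs[c] > probs[best]) {
                    best = c;
                }
            }
            votes[best] += 1.0; // one vote for the most probable class
        }
        for (int c = 0; c < numClasses; c++) {
            votes[c] /= perClassifierProbs.size(); // normalize to probabilities
        }
        return votes;
    }

    public static void main(String[] args) {
        // Three classifiers, two classes: classes 0, 1 and 0 receive the votes.
        List<double[]> probs = Arrays.asList(
                new double[]{0.9, 0.1},
                new double[]{0.2, 0.8},
                new double[]{0.6, 0.4});
        System.out.println(Arrays.toString(combineByMajorityVote(probs, 2)));
    }
}
```

A real engine would first select the competent classifiers (via the competence mapping and the nearest neighbors search) and only then combine their outputs as above.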
Usually it will be an instance of the <i>moa.classifiers.lazy.neighboursearch.LinearNNSearch<\/i> class.<\/p>\n<p>The method <i>getMapOnlyCorrectClassifiers<\/i> must return true if only the classifiers that correctly classify the instances in the current accuracy estimation window should be mapped in the competence mapping (e.g., a <i>KNORA-E<\/i> based classification engine), or return false if all classifiers should be mapped (e.g., an <i>A Priori<\/i> based classification engine).<\/p>\n<p>The method <i>getClassifiersUsedInLastClassification<\/i> must return a list containing all classifiers used in the classification of the latest test instance received.<\/p>\n<p>The methods <i>getClassificationEngineDescription<\/i> and <i>getClassificationEngineShortDescription<\/i> just append a full and a short description of the classification engine, respectively.<\/p>\n<p>The method <i>reset<\/i> resets the classification engine to its initial state.<\/p>\n<h3>FAQ<\/h3>\n<ol>\n<li>Question: I cannot change the location of my workspace in Eclipse. Answer: Check <a href=\"https:\/\/help.eclipse.org\/neon\/index.jsp?topic=%2Forg.eclipse.platform.doc.user%2Freference%2Fref-workspaceswitch.htm\" target=\"_blank\" rel=\"noopener\">this<\/a> link.<\/li>\n<li>Question: The results generated in the current framework implementation are slightly different from the publications. Answer: In our publications we used an older version of the MOA Framework (2014-11) and classifiers from Weka 3-7-13. In the current implementation we use MOA 2016-04 and its classifiers, which may generate a small difference in the predictions. This difference is caused mainly (but not only) by the normalization of data, since the Weka classifiers and datasets used in previous experiments automatically normalized the information. 
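A per-batch min-max normalization of the kind just mentioned can be sketched in plain Java (our own illustration, not framework code; MOA/Weka instance types are replaced by plain arrays):

```java
// Hypothetical sketch (not framework code): min-max normalize each feature
// of a batch to [0, 1] before training/testing, approximating the automatic
// normalization applied by the older Weka datasets.
public class BatchNormalizeSketch {
    public static double[][] minMaxNormalize(double[][] batch) {
        int n = batch.length, d = batch[0].length;
        double[][] out = new double[n][d];
        for (int j = 0; j < d; j++) {
            double min = Double.POSITIVE_INFINITY, max = Double.NEGATIVE_INFINITY;
            for (double[] row : batch) {
                min = Math.min(min, row[j]);
                max = Math.max(max, row[j]);
            }
            double range = max - min;
            for (int i = 0; i < n; i++) {
                // constant features map to 0 to avoid division by zero
                out[i][j] = (range == 0.0) ? 0.0 : (batch[i][j] - min) / range;
            }
        }
        return out;
    }
}
```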
If you normalize each batch before training\/testing, the results come closer to the published ones. To overcome this, we define Naive Bayes as the base learner in the default DynseFactory, since this base learner is less sensitive to the normalization of data. If you want to use Hoeffding Trees (or any other classifier) as the base learner, change the following line in the AbstractDynseFactory:\n<div>\n<div class=\"syntaxhighlighter nogutter java\">\n<table border=\"0\" cellspacing=\"0\" cellpadding=\"0\">\n<tbody>\n<tr>\n<td class=\"code\">\n<div class=\"container\">\n<div class=\"line number1 index0 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">final<\/code> <code class=\"java plain\">AbstractClassifierFactory classifierFactory = <\/code><code class=\"java keyword\">new<\/code> <code class=\"java plain\">NaiveBayesFactory();<\/code><\/span><\/div>\n<div class=\"line number2 index1 alt1\" style=\"text-align: left\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/div>\n<\/div>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/div>\n<\/div>\n<p><b>To<\/b><\/p>\n<div>\n<div class=\"syntaxhighlighter nogutter java\">\n<table border=\"0\" cellspacing=\"0\" cellpadding=\"0\">\n<tbody>\n<tr>\n<td class=\"code\">\n<div class=\"container\">\n<div class=\"line number1 index0 alt2\" style=\"text-align: left\"><span style=\"font-size: 10pt\"><code class=\"java keyword\">public<\/code> <code class=\"java keyword\">final<\/code> <code class=\"java plain\">AbstractClassifierFactory classifierFactory = <\/code><code class=\"java keyword\">new<\/code> <\/span><code class=\"java plain\"><span style=\"font-size: 10pt\">HoeffdingTreeFactory();<\/span><\/code><\/div>\n<div class=\"line number2 index1 alt1\" style=\"text-align: 
left\"><code class=\"java spaces\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/code><\/div>\n<\/div>\n<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<\/div>\n<\/div>\n<\/li>\n<li>Question: Eclipse does not find the <i>Main<\/i> class. Answer: Open the <i>Main<\/i> class in Eclipse (package <i>br.ufpr.dynse<\/i>) and then run the Project<\/li>\n<\/ol>\n<h3>References<\/h3>\n<ul>\n<li>Almeida, P.,\u00a0Oliveira, L. S., Britto Jr., A., Sabourin, R.,\u00a0<strong>Adapting Dynamic Classifier Selection for Concept Drift<\/strong>, Expert Systems with Applications, 104:67-85, 2018.\u00a0[<a href=\"http:\/\/www.inf.ufpr.br\/lesoliveira\/download\/ESWA2018.pdf\">pdf<\/a>]<\/li>\n<li>Almeida, P., Oliveira, L. S., Britto Jr., A., Sabourin, R., <b>Handling Concept Drifts Using Dynamic Selection of Classifiers<\/b>. IEEE International Conference on Tools with Artificial Intelligence, San Jose, USA, 2016. <a href=\"http:\/\/ieeexplore.ieee.org\/document\/7814713\/\">[pdf]<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>What is the Dynse? The Dynse (Dynamic Selection Based Drift Handler) is a Framework developed to deal with Concept Drift Problems by means of the Dynamic Classifiers Selection. The Framework was built in Java using the MOA and Weka Frameworks. 
<a href=\"https:\/\/web.inf.ufpr.br\/vri\/software\/dynse\/\" class=\"read-more\">Read More &#8230;<\/a><\/p>\n","protected":false},"author":18,"featured_media":0,"parent":224,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-275","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/pages\/275","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/users\/18"}],"replies":[{"embeddable":true,"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/comments?post=275"}],"version-history":[{"count":13,"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/pages\/275\/revisions"}],"predecessor-version":[{"id":354,"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/pages\/275\/revisions\/354"}],"up":[{"embeddable":true,"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/pages\/224"}],"wp:attachment":[{"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/media?parent=275"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}