Projects tagged with classification.
Showing items 1-20 of 83 (page 1 of 5).

JMLR dlib ml 19.11

by davis685 - May 18, 2018, 04:19:52 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 437931 views, 81471 downloads, 0 subscriptions

About: This project is a C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real world problems.

Changes:

This release adds a bunch of new image processing routines as well as many minor usability improvements and bug fixes.


MLweb 1.2

by lauerfab - February 23, 2018, 15:40:27 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 48778 views, 11291 downloads, 0 subscriptions

About: MLweb is an open source project that aims to bring machine learning capabilities into web pages and web applications while keeping all computations on the client side. It includes (i) a JavaScript library for scientific computing within web pages, (ii) a JavaScript library implementing machine learning algorithms for classification, regression, clustering and dimensionality reduction, and (iii) a web application providing a MATLAB-like development environment.

Changes:
  • Add BibTeX entry of the corresponding Neurocomputing paper
  • Create JavaScript modules to avoid global scope pollution in web pages

KeLP 2.2.2

by kelpadmin - February 1, 2018, 00:57:32 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 62654 views, 15491 downloads, 0 subscriptions

About: Kernel-based Learning Platform (KeLP) is a Java framework that supports the implementation of kernel-based learning algorithms, as well as an agile definition of kernel functions over generic data representations, e.g. vectorial data or discrete structures. The framework has been designed to decouple kernel functions and learning algorithms through the definition of specific interfaces: once a new kernel function has been implemented, it can be automatically adopted in all the available kernel-machine algorithms. KeLP includes different online and batch learning algorithms for classification, regression and clustering, as well as several kernel functions, ranging from vector-based to structural kernels. It allows building complex kernel-machine-based systems, leveraging JSON/XML interfaces to instantiate prediction models without writing a single line of code.
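
The decoupling described above (a kernel implemented once, usable by every kernel machine) can be illustrated with a short, hedged sketch. This is plain Python with scikit-learn, not KeLP's Java API; it only mirrors the design idea of passing an independently defined kernel function to a kernel-based learner:

  from sklearn.datasets import make_classification
  from sklearn.svm import SVC

  def polynomial_kernel(X, Y):
      # A custom kernel k(x, z) = (x . z + 1)^2, defined independently of any learner.
      return (X @ Y.T + 1.0) ** 2

  X, y = make_classification(n_samples=200, n_features=10, random_state=0)

  # The learner only sees "a kernel function"; swapping in a different kernel
  # requires no change to the learning algorithm itself.
  clf = SVC(kernel=polynomial_kernel).fit(X, y)
  print("training accuracy:", clf.score(X, y))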

Changes:

In addition to minor improvements and bug fixes, this release includes:

  • The possibility to generate the Compositional GRCT and the Compositional LCT data structures in kelp-input-generator.

  • New metrics for evaluating Classification Tasks.

  • New Tutorial and Unit Tests.

Check out this new version from our repositories. API Javadoc is already available. Your suggestions are very valuable to us, so download and try KeLP 2.2.2!


pycobra regression analysis and ensemble toolkit 0.2.2

by bhargavvader - December 29, 2017, 13:57:46 CET [ Project Homepage BibTeX Download ] 14398 views, 3591 downloads, 0 subscriptions

About: pycobra is a Python library for ensemble learning, which serves as a toolkit for regression, classification, and visualisation. It is scikit-learn compatible and fits into the existing scikit-learn ecosystem.
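
As a hedged illustration of what "scikit-learn compatible" means in practice, the sketch below uses a stand-in scikit-learn regressor inside the usual cross-validation and pipeline tooling; a pycobra estimator following the same fit/predict convention would slot in the same way (see the project documentation for its actual class names):

  import numpy as np
  from sklearn.ensemble import GradientBoostingRegressor
  from sklearn.model_selection import cross_val_score
  from sklearn.pipeline import make_pipeline
  from sklearn.preprocessing import StandardScaler

  rng = np.random.RandomState(0)
  X = rng.uniform(-1, 1, size=(400, 5))
  y = X[:, 0] ** 2 + 0.1 * rng.randn(400)

  # Any estimator that follows the fit/predict convention drops into this tooling unchanged.
  model = make_pipeline(StandardScaler(), GradientBoostingRegressor(random_state=0))
  print(cross_val_score(model, X, y, cv=3))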

Changes:

pycobra is now PEP 8 compliant, has improved tests, and offers more plotting options.


WEKA 3.9.2

by mhall - December 22, 2017, 03:39:19 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 144733 views, 35085 downloads, 0 subscriptions

Rating: 4/5 (based on 6 votes)

About: The Weka workbench contains a collection of visualization tools and algorithms for data analysis and predictive modelling, together with graphical user interfaces for easy access to this [...]

Changes:

This release includes many bug fixes and improvements. Some of these are detailed at

http://jira.pentaho.com/projects/DATAMINING/issues/DATAMINING-771

As usual, for a complete list of changes refer to the changelogs.


JMLR GPML Gaussian Processes for Machine Learning Toolbox 4.1

by hn - November 27, 2017, 19:26:13 CET [ Project Homepage BibTeX Download ] 88142 views, 19680 downloads, 0 subscriptions

Rating: 5/5 (based on 2 votes)

About: The GPML toolbox is a flexible and generic Octave/MATLAB implementation of inference and prediction with Gaussian process models. The toolbox offers exact inference, approximate inference for non-Gaussian likelihoods (Laplace's method, Expectation Propagation, Variational Bayes), and approximate inference for large datasets (FITC, VFE, KISS-GP). A wide range of covariance, likelihood, mean and hyperprior functions makes it possible to create very complex GP models.
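
For readers new to GPs, the exact-inference case that such a toolbox implements for regression with Gaussian noise reduces to the standard predictive equations (general background, not toolbox-specific notation); with K = k(X,X), k_* = k(X, x_*) and noise variance \sigma_n^2:

  \bar{f}_* = \mathbf{k}_*^\top (K + \sigma_n^2 I)^{-1} \mathbf{y}
  \mathbb{V}[f_*] = k(\mathbf{x}_*, \mathbf{x}_*) - \mathbf{k}_*^\top (K + \sigma_n^2 I)^{-1} \mathbf{k}_*
  \log p(\mathbf{y} \mid X) = -\tfrac{1}{2}\, \mathbf{y}^\top (K + \sigma_n^2 I)^{-1} \mathbf{y} - \tfrac{1}{2}\log\lvert K + \sigma_n^2 I \rvert - \tfrac{n}{2}\log 2\pi

The approximate methods listed above (Laplace, EP, VB, FITC, VFE, KISS-GP) replace or approximate these quantities when the likelihood is non-Gaussian or n is large.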

Changes:

Logdet-estimation functionality for grid-based approximate covariances

  • Lanczos subspace estimation

  • Chebyshev polynomial expansion

More generic infEP functionality

  • dense computations and sparse approximations using the same code

  • covering KL inference as a special case of EP

New infKL function contributed by Emtiyaz Khan and Wu Lin

  • Conjugate-Computation Variational Inference algorithm

  • much more scalable than previous versions

Time-series covariance functions on the positive real line

  • covW (i-times integrated) Wiener process covariance

  • covOU (i-times integrated) Ornstein-Uhlenbeck process covariance (contributed by Juan Pablo Carbajal)

  • covULL underdamped linear Langevin process covariance (contributed by Robert MacKay)

  • covFBM Fractional Brownian motion covariance

New covariance functions

  • covWarp implements k(w(x),w(z)) where w is a "warping" function

  • covMatern has been extended to also accept non-integer distance parameters


JMLR Jstacs 2.3

by keili - September 13, 2017, 14:25:38 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 64550 views, 14214 downloads, 0 subscriptions

About: A Java framework for statistical analysis and classification of biological sequences

Changes:

New classes and packages:

  • Jstacs 2.3 is the first release to be accompanied by JstacsFX, a library for building JavaFX-based graphical user interfaces based on JstacsTools
  • new interface MultiThreadedFunction
  • new class LargeSequenceReader for reading large sequence files in chunks
  • new interface QuickScanningSequenceScore
  • new class RegExpValidator for checking String inputs against a regular expression
  • new class IUPACDNAAlphabet

New features and improvements:

  • Alignments may now handle different costs for insert and delete gaps
  • ListResults may now be constructed from Collections of ResultSets
  • Several minor improvements and bugfixes in many classes
  • Improvements of documentation of several classes

KeBABS 1.5.4

by UBod - July 28, 2017, 09:55:04 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 77889 views, 15525 downloads, 0 subscriptions

Rating: 0/5 (based on 1 vote)

About: Kernel-Based Analysis of Biological Sequences

Changes:
  • importing the apcluster package to avoid method clashes
  • improved and completed change history in inst/NEWS and package vignette

NaN toolbox 3.1.2

by schloegl - January 22, 2017, 12:24:59 CET [ Project Homepage BibTeX Download ] 138543 views, 32070 downloads, 0 subscriptions

About: NaN-toolbox is a statistics and machine learning toolbox for handling data with and without missing values.

Changes:

Changes in v.3.1.2
  • improved configuration and build system
  • improved support for more platforms (including Octave 4.2.0)

Changes in v.3.0.3
  • improved compatibility for Octave on Windows

Changes in v.3.0.1
  • fixed packaging for Octave

Changes in v.2.8.5
  • bug fix: trimmean
  • compiler support for gcc-5 and clang
  • fixed typos

For details see the CHANGELOG at http://pub.ist.ac.at/~schloegl/matlab/NaN/CHANGELOG


Java Statistical Analysis Tool 0.0.7

by EdwardRaff - January 15, 2017, 22:21:50 CET [ Project Homepage BibTeX Download ] 14227 views, 3653 downloads, 0 subscriptions

About: General purpose Java Machine Learning library for classification, regression, and clustering.

Changes:

See the GitHub releases tab for change information.


Tools for Regression and Classification 1.0.0

by matloff - October 29, 2016, 08:22:40 CET [ Project Homepage BibTeX Download ] 7005 views, 2325 downloads, 0 subscriptions

About: Toolkit for parametric and nonparametric regression and classification.

Changes:

Initial Announcement on mloss.org.


RLScore 0.7

by aatapa - September 20, 2016, 09:51:25 CET [ Project Homepage BibTeX Download ] 6895 views, 2091 downloads, 0 subscriptions

About: RLScore - regularized least-squares machine learning algorithms package
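
For reference, the regularized least-squares learner that gives the package its name minimizes a squared-error objective with an L2 penalty and has a closed-form solution (standard background, not RLScore-specific notation):

  \min_{\mathbf{w}} \sum_{i=1}^{n} (y_i - \mathbf{w}^\top \mathbf{x}_i)^2 + \lambda \lVert \mathbf{w} \rVert^2
  \quad\Rightarrow\quad \hat{\mathbf{w}} = (X^\top X + \lambda I)^{-1} X^\top \mathbf{y}

In the kernelized (dual) form the coefficients are \hat{\mathbf{a}} = (K + \lambda I)^{-1} \mathbf{y}, with predictions f(\mathbf{x}) = \sum_i \hat{a}_i\, k(\mathbf{x}_i, \mathbf{x}).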

Changes:

Initial Announcement on mloss.org.


slim for matlab 0.2

by ustunb - August 23, 2016, 20:27:00 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 9666 views, 2372 downloads, 0 subscriptions

About: Learn optimized scoring systems using MATLAB and the CPLEX Optimization Studio.

Changes:

Initial Announcement on mloss.org.


DeeBNet, a new object-oriented MATLAB toolbox for Deep Belief Networks 3.2

by keyvanrad - June 26, 2016, 16:19:55 CET [ Project Homepage BibTeX Download ] 36566 views, 8478 downloads, 0 subscriptions

About: Deep architectures are now very popular in machine learning. Deep Belief Networks (DBNs) are deep architectures that use a stack of Restricted Boltzmann Machines (RBMs) to create a powerful generative model from training data. DBNs have many abilities, such as feature extraction and classification, that are used in many applications including image processing, speech processing and text categorization. This toolbox is object oriented and provides the most important abilities needed to implement DBNs. According to experiments conducted on the MNIST (image), ISOLET (speech) and 20 Newsgroups (text) datasets, the toolbox can automatically learn a good representation of the input from unlabeled data, with better discrimination between different classes; on all of these datasets the obtained classification errors are comparable to those of state-of-the-art classifiers. In addition, the toolbox supports different sampling methods (e.g. Gibbs, CD, PCD and our new FEPCD method), different sparsity methods (quadratic, rate distortion and our new normal method), different RBM types (generative and discriminative), GPU-based computation, and more. The toolbox is user-friendly open source software for MATLAB and Octave and is freely available on the project website.
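
Since the description centers on stacks of RBMs trained with sampling methods such as CD and PCD, the standard binary-RBM quantities such trainers work with are worth recalling (general background, not toolbox-specific notation):

  E(\mathbf{v}, \mathbf{h}) = -\mathbf{a}^\top \mathbf{v} - \mathbf{b}^\top \mathbf{h} - \mathbf{v}^\top W \mathbf{h}, \qquad p(\mathbf{v}, \mathbf{h}) \propto e^{-E(\mathbf{v}, \mathbf{h})}
  p(h_j = 1 \mid \mathbf{v}) = \sigma(b_j + \mathbf{v}^\top W_{:,j}), \qquad p(v_i = 1 \mid \mathbf{h}) = \sigma(a_i + W_{i,:}\, \mathbf{h})
  \Delta W \propto \langle \mathbf{v}\mathbf{h}^\top \rangle_{\text{data}} - \langle \mathbf{v}\mathbf{h}^\top \rangle_{\text{model}}

CD-k approximates the model expectation with k steps of Gibbs sampling started at the data, while PCD keeps a persistent chain between updates.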

Changes:

New in toolbox

  • Using GPU in Backpropagation
  • Revision of some demo scripts
  • Function approximation with multiple outputs
  • Feature extraction with GRBM in first layer



JMLR GPstuff 4.7

by avehtari - June 9, 2016, 17:45:15 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 91186 views, 20855 downloads, 0 subscriptions

Rating: 5/5 (based on 1 vote)

About: The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

Changes:

2016-06-09 Version 4.7

Development and release branches available at https://github.com/gpstuff-dev/gpstuff

New features

  • Simple Bayesian Optimization demo

Improvements

  • Improved use of PSIS
  • More options added to gp_monotonic
  • Monotonicity now works for additive covariance functions with selected variables
  • Possibility to use gpcf_squared.m-covariance function with derivative observations/monotonicity
  • Default behaviour made more robust by changing default jitter from 1e-9 to 1e-6
  • LA-LOO uses the cavity method as the default (see Vehtari et al. (2016). Bayesian leave-one-out cross-validation approximations for Gaussian latent variable models. JMLR, accepted for publication)
  • The 'selected variables' option now works better with monotonicity

Bugfixes

  • small error in derivative observation computation fixed
  • several minor bug fixes

AutoWEKA 2.0

by larsko - May 19, 2016, 19:58:41 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 9418 views, 2368 downloads, 0 subscriptions

About: Automatically finds the best model with its best parameter settings for a given classification or regression task.
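
Auto-WEKA searches jointly over WEKA's learners and their hyperparameters using Bayesian optimization (SMAC). As a much-simplified, hedged analogue of that combined algorithm/hyperparameter selection, the sketch below runs an exhaustive scikit-learn grid search over two candidate model families; it only illustrates the idea, not Auto-WEKA's actual mechanism:

  from sklearn.datasets import load_iris
  from sklearn.ensemble import RandomForestClassifier
  from sklearn.model_selection import GridSearchCV
  from sklearn.pipeline import Pipeline
  from sklearn.svm import SVC

  X, y = load_iris(return_X_y=True)

  pipe = Pipeline([("clf", SVC())])  # placeholder step; the grid swaps the estimator
  param_grid = [
      {"clf": [SVC()], "clf__C": [0.1, 1, 10], "clf__kernel": ["linear", "rbf"]},
      {"clf": [RandomForestClassifier(random_state=0)], "clf__n_estimators": [50, 200]},
  ]

  search = GridSearchCV(pipe, param_grid, cv=5).fit(X, y)
  print(search.best_params_, search.best_score_)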

Changes:

Initial Announcement on mloss.org.


Probabilistic Classification Vector Machine 0.22

by fmschleif - November 10, 2015, 13:16:19 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 98869 views, 20023 downloads, 0 subscriptions

About: The PCVM library is a C++/Armadillo implementation of the Probabilistic Classification Vector Machine.

Changes:

30.10.2015: The code has been revised in several places, fixing some errors; different multiclass schemes and HDF5 file support have been added. Speed-ups and memory savings through better handling of intermediate objects.

27.05.2015: MATLAB binding under Windows available. Added a solution file for VS 2013 Express to compile a MATLAB MEX binding. Cannot yet confirm that the code really uses multiple cores under Windows (under Linux it does).

29.04.2015: Added an implementation of the Nystroem-based PCVM, including Nystroem-based singular value decomposition (SVD), eigenvalue decomposition (EVD) and pseudo-inverse calculation (PINV).

22.04.2015: Initial implementation of the PCVM released.


Apache Mahout 0.11.1

by gsingers - November 9, 2015, 16:12:06 CET [ Project Homepage BibTeX Download ] 47205 views, 11215 downloads, 0 subscriptions

About: Apache Mahout is an Apache Software Foundation project with the goal of creating both a community of users and a scalable, Java-based framework consisting of many machine learning algorithm [...]

Changes:

Apache Mahout introduces a new math environment we call Samsara, for its theme of universal renewal. It reflects a fundamental rethinking of how scalable machine learning algorithms are built and customized. Mahout-Samsara is here to help people create their own math while providing some off-the-shelf algorithm implementations. At its core are general linear algebra and statistical operations along with the data structures to support them. You can use it as a library or customize it in Scala with Mahout-specific extensions that look something like R. Mahout-Samsara comes with an interactive shell that runs distributed operations on a Spark cluster. This makes prototyping or task submission much easier and allows users to customize algorithms with a whole new degree of freedom.

Mahout algorithms include many new implementations built for speed on Mahout-Samsara. They run on Spark 1.3+ and some on H2O, which means as much as a 10x speed increase. You’ll find robust matrix decomposition algorithms as well as a Naive Bayes classifier and collaborative filtering. The new spark-itemsimilarity enables the next generation of cooccurrence recommenders that can use entire user click streams and context in making recommendations.


Cognitive Foundry 3.4.2

by Baz - October 30, 2015, 06:53:03 CET [ Project Homepage BibTeX BibTeX for corresponding Paper Download ] 74988 views, 15079 downloads, 0 subscriptions

About: The Cognitive Foundry is a modular Java software library of machine learning components and algorithms designed for research and applications.

Changes:
  • General:
    • Upgraded MTJ to 1.0.3.
  • Common:
    • Added package for hash function computation including Eva, FNV-1a, MD5, Murmur2, Prime, SHA1, SHA2
    • Added callback-based forEach implementations to Vector and InfiniteVector, which can be faster for iterating through some vector types.
    • Optimized DenseVector by removing a layer of indirection.
    • Added method to compute set of percentiles in UnivariateStatisticsUtil and fixed issue with percentile interpolation.
    • Added utility class for enumerating combinations.
    • Adjusted ScalarMap implementation hierarchy.
    • Added method for copying a map to VectorFactory and moved createVectorCapacity up from SparseVectorFactory.
    • Added method for creating square identity matrix to MatrixFactory.
    • Added Random implementation that uses a cached set of values.
  • Learning:
    • Implemented feature hashing.
    • Added factory for random forests.
    • Implemented uniform distribution over integer values.
    • Added Chi-squared similarity.
    • Added KL divergence.
    • Added general conditional probability distribution.
    • Added interfaces for Regression, UnivariateRegression, and MultivariateRegression.
    • Fixed null pointer exception that can happen in K-means with an empty cluster.
    • Fixed name of maxClusters property on AgglomerativeClusterer (was called maxMinDistance).
  • Text:
    • Improvements to LDA Gibbs sampler.

SALSA.jl 0.0.5

by jumutc - September 28, 2015, 17:28:56 CET [ Project Homepage BibTeX Download ] 7644 views, 1927 downloads, 0 subscriptions

About: SALSA (Software lab for Advanced machine Learning with Stochastic Algorithms) is an implementation of well-known stochastic algorithms for machine learning, developed in the high-level technical computing language Julia. The SALSA software package is designed to address challenges in sparse linear modelling and in linear and non-linear Support Vector Machines applied to large data samples, with a user-centric and user-friendly emphasis.
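
The kind of update such stochastic algorithms rely on can be sketched, in a hedged way, with a Pegasos-style stochastic subgradient step for a linear SVM with hinge loss; this is a conceptual Python sketch of the algorithm family, not SALSA's Julia API:

  import numpy as np

  def pegasos(X, y, lam=0.01, n_iter=10000, seed=0):
      """Pegasos-style SGD for a linear SVM; y must be in {-1, +1}."""
      rng = np.random.default_rng(seed)
      n, d = X.shape
      w = np.zeros(d)
      for t in range(1, n_iter + 1):
          i = rng.integers(n)            # pick one random example
          eta = 1.0 / (lam * t)          # decreasing step size
          if y[i] * X[i].dot(w) < 1:     # margin violated: hinge subgradient
              w = (1 - eta * lam) * w + eta * y[i] * X[i]
          else:                          # only the regularizer contributes
              w = (1 - eta * lam) * w
      return w

  # Toy usage: two Gaussian blobs with labels in {-1, +1}
  rng = np.random.default_rng(1)
  X = np.vstack([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
  y = np.hstack([-np.ones(200), np.ones(200)])
  w = pegasos(X, y)
  print("training accuracy:", np.mean(np.sign(X.dot(w)) == y))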

Changes:

Initial Announcement on mloss.org.

