This version upgrades to Java 1.8 and contains several bug fixes. You can download it directly here or from Maven Central through dependency management tools like Ivy and Maven.
* Updated to Java 1.8.
* Added callback-based transform methods to Vector that pass index and value.
* Fixed issue in RandomSubVectorThresholdLearner where feature selection was ignored if no sampling was done.
* Fixed mean and variance computation in StudentTConfidence.
* Added hyperbolic tangent function.
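The index-and-value callbacks work along these lines (an illustrative sketch using a plain array and a hypothetical functional interface, not the Foundry's actual Vector API):

```java
import java.util.Arrays;

public class TransformSketch {

    /** Callback that receives an element's index and value and returns the new value. */
    @FunctionalInterface
    public interface IndexValueTransform {
        double transform(int index, double value);
    }

    /** Applies the callback to every element of the array in place. */
    public static void transform(double[] values, IndexValueTransform f) {
        for (int i = 0; i < values.length; i++) {
            values[i] = f.transform(i, values[i]);
        }
    }

    public static void main(String[] args) {
        double[] v = {1.0, 2.0, 3.0};
        // Scale each element by its index.
        transform(v, (i, x) -> i * x);
        System.out.println(Arrays.toString(v)); // [0.0, 2.0, 6.0]
    }
}
```

Because the callback sees the index as well as the value, position-dependent transforms like the one above need only a single pass and no temporary vector.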
This version contains several new components, performance enhancements, and an upgraded version of MTJ. You can download it directly here or from Maven Central through dependency management tools like Ivy and Maven.
* Upgraded MTJ to 1.0.3.
* Added package for hash function computation, including Eva, FNV-1a, MD5, Murmur2, Prime, SHA1, and SHA2.
* Added callback-based forEach implementations to Vector and InfiniteVector, which can be faster for iterating through some vector types.
* Optimized DenseVector by removing a layer of indirection.
* Added method to compute set of percentiles in UnivariateStatisticsUtil and fixed issue with percentile interpolation.
* Added utility class for enumerating combinations.
* Adjusted ScalarMap implementation hierarchy.
* Added method for copying a map to VectorFactory and moved createVectorCapacity up from SparseVectorFactory.
* Added method for creating square identity matrix to MatrixFactory.
* Added Random implementation that uses a cached set of values.
* Implemented feature hashing.
* Added factory for random forests.
* Implemented uniform distribution over integer values.
* Added Chi-squared similarity.
* Added KL divergence.
* Added general conditional probability distribution.
* Added interfaces for Regression, UnivariateRegression, and MultivariateRegression.
* Fixed a null pointer exception that could happen in K-means with an empty cluster.
* Fixed name of maxClusters property on AgglomerativeClusterer (was called maxMinDistance).
* Improvements to LDA Gibbs sampler.
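To give a flavor of the feature hashing mentioned above, here is a minimal generic sketch of the "hashing trick" (not the Foundry's actual implementation): named features are mapped into a fixed-size vector by hashing, so no feature dictionary needs to be built or stored.

```java
import java.util.Map;

public class FeatureHashingSketch {

    /** Hashes named features into a dense vector of the given dimensionality. */
    public static double[] hashFeatures(Map<String, Double> features, int dimensionality) {
        double[] result = new double[dimensionality];
        for (Map.Entry<String, Double> e : features.entrySet()) {
            int hash = e.getKey().hashCode();
            int index = Math.floorMod(hash, dimensionality);
            // Use one hash bit as a sign so that colliding features tend to
            // cancel rather than systematically inflate an entry.
            double sign = ((hash >>> 31) == 0) ? 1.0 : -1.0;
            result[index] += sign * e.getValue();
        }
        return result;
    }

    public static void main(String[] args) {
        double[] v = hashFeatures(Map.of("word:foundry", 1.0, "word:java", 2.0), 16);
        System.out.println(v.length); // 16
    }
}
```

The vector length is fixed up front, so memory use is independent of vocabulary size, at the cost of occasional hash collisions.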
This version contains mainly upgraded versions of dependencies. It also has some bug fixes, performance improvements, and adds an ALS implementation of Factorization Machines. You can download it directly here or from Maven Central through dependency management tools like Ivy and Maven.
* Updated MTJ to version 1.0.2 and netlib-java to 1.1.2.
* Updated XStream to version 1.4.8.
* Fixed issue in VectorUnionIterator.
* Added Alternating Least Squares (ALS) Factorization Machine training.
* Fixed performance issue in Factorization Machine where the linear component was not making use of sparsity.
* Added utility function to sigmoid unit.
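For reference, the logistic sigmoid and its derivative look like this (a generic sketch of the standard function, not the Foundry's own sigmoid unit class):

```java
public class SigmoidSketch {

    /** Logistic sigmoid: 1 / (1 + e^(-x)), squashing any real input into (0, 1). */
    public static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    /** Derivative of the sigmoid, expressed through its own output: s * (1 - s). */
    public static double sigmoidDerivative(double x) {
        double s = sigmoid(x);
        return s * (1.0 - s);
    }

    public static void main(String[] args) {
        System.out.println(sigmoid(0.0)); // 0.5
    }
}
```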
This version contains an implementation of factorization machines, improvements to statistical distributions, improvements to tree learners, convenience methods, performance improvements, and bug fixes. It also upgrades to Java 1.7. You can download it directly here or from Maven Central through dependency management tools like Ivy and Maven.
* Now requires Java 1.7 or higher.
* Improved compatibility with Java 1.8 functions by removing the CloneableSerializable requirement from many function-style interfaces.
* Common Core:
* Improved iteration speed over sparse MTJ vectors.
* Added utility methods for more stable log(1+x), exp(1-x), log(1 - exp(x)), and log(1 + exp(x)) to LogMath.
* Added method for creating partial permutations to Permutation.
* Added methods for computing standard deviation to UnivariateStatisticsUtil.
* Added increment, decrement, and list view methods to Vector and Matrix.
* Added shorter versions of get and set for Vector and Matrix getElement and setElement.
* Added aliases of dot for dotProduct in VectorSpace.
* Added utility methods for divideByNorm2 to VectorUtil.
* Added a learner for a Factorization Machine using SGD.
* Added an iterative reporter for validation set performance.
* Added new methods to statistical distribution classes to allow for faster
sampling without boxing, in batches, or without creating extra memory.
* Made generics for performance evaluators more permissive.
* ParameterGradientEvaluator changed to not require input, output, and
gradient types to be the same. This allows more sane gradient definitions
for scalar functions.
* Added parameter to enforce a minimum size in a leaf node for decision
tree learning. It is configured through the splitting function.
* Added ability to filter which dimensions to use in the random subspace
and variance tree node splitter.
* Added ReLU, leaky ReLU, and soft plus activation functions for neural networks.
* Added IntegerDistribution interface for distributions over natural numbers.
* Added a method to get the mean of a numeric distribution without boxing.
* Fixed an issue in DefaultDataDistribution that caused the total to be off
when a value was set to less than or equal to 0.
* Added property for rate to GammaDistribution.
* Added method to get standard deviation from a UnivariateGaussian.
* Added clone operations for decision tree classes.
* Fixed issue in TukeyKramerConfidence interval computation.
* Fixed serialization issue with SMO output.
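For the curious, the stable log computations mentioned in the notes above boil down to identities like the following (a sketch using only java.lang.Math, not the Foundry's LogMath itself):

```java
public class StableLogSketch {

    private static final double LOG_TWO = Math.log(2.0);

    /** log(1 + exp(x)) without overflow for large x or precision loss for small x. */
    public static double log1PlusExp(double x) {
        // For large x, exp(x) overflows but log(1 + e^x) is essentially x;
        // otherwise log1p keeps full precision when exp(x) is tiny.
        return (x > 33.0) ? x : Math.log1p(Math.exp(x));
    }

    /** log(1 - exp(x)) for x < 0, avoiding catastrophic cancellation near zero. */
    public static double log1MinusExp(double x) {
        // Near zero, 1 - exp(x) loses digits, so expm1 computes the small
        // difference directly; far from zero, log1p is the accurate form.
        return (x > -LOG_TWO)
                ? Math.log(-Math.expm1(x))
                : Math.log1p(-Math.exp(x));
    }

    public static void main(String[] args) {
        System.out.println(log1PlusExp(1000.0)); // 1000.0, no overflow
    }
}
```

A naive `Math.log(1.0 + Math.exp(1000.0))` would return infinity, which is exactly the failure mode these utilities exist to avoid.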
By popular demand, I’ve converted the development repository to Git and have made it available on GitHub. This includes migrating the source, issues, and wiki as well as setting up the historical releases. The old Trac site has thus been taken offline. We are hoping that this change will make it easier for people to find, use, and contribute to the Foundry in the future.
I’ve added links to the main site for the release and development source repositories that we’ve been using for open source development of the Cognitive Foundry. You can browse the sources online or make a local clone using Mercurial as described on the source page. The release repository contains the source for the latest release, which at this time is still 3.3.2. The development repository has the latest-and-greatest code, though it may not be as stable as the release version.
The current version of the Cognitive Foundry (3.3.0) is now available in the Maven Central repository. Thus, if you use Maven or Ivy as part of your build system, you can easily add the Foundry to your Java projects and get all the goodness of dependency management. Each of the 6 primary jars for Common Core, Common Data, Learning Core, Text Core, Framework Core, and Framework Learning is available, so you can pick and choose the parts you want to use. Future versions of the Foundry will be posted to Maven Central as well.
If you use Maven, then you can add the following dependencies to your pom.xml file for the various parts of the Foundry you want to use, or include all of them:
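The declarations below follow the standard Maven form of the same coordinates used in the Ivy declarations (groupId gov.sandia.foundry, version 3.3.0):

```xml
<dependency>
  <groupId>gov.sandia.foundry</groupId>
  <artifactId>gov-sandia-cognition-common-core</artifactId>
  <version>3.3.0</version>
</dependency>
<dependency>
  <groupId>gov.sandia.foundry</groupId>
  <artifactId>gov-sandia-cognition-common-data</artifactId>
  <version>3.3.0</version>
</dependency>
<dependency>
  <groupId>gov.sandia.foundry</groupId>
  <artifactId>gov-sandia-cognition-learning-core</artifactId>
  <version>3.3.0</version>
</dependency>
<dependency>
  <groupId>gov.sandia.foundry</groupId>
  <artifactId>gov-sandia-cognition-text-core</artifactId>
  <version>3.3.0</version>
</dependency>
<dependency>
  <groupId>gov.sandia.foundry</groupId>
  <artifactId>gov-sandia-cognition-framework-core</artifactId>
  <version>3.3.0</version>
</dependency>
<dependency>
  <groupId>gov.sandia.foundry</groupId>
  <artifactId>gov-sandia-cognition-framework-learning</artifactId>
  <version>3.3.0</version>
</dependency>
```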
If you use Ivy, you can add dependencies using the following declarations in your ivy.xml file:
<dependency org="gov.sandia.foundry" name="gov-sandia-cognition-common-core" rev="3.3.0"/>
<dependency org="gov.sandia.foundry" name="gov-sandia-cognition-common-data" rev="3.3.0"/>
<dependency org="gov.sandia.foundry" name="gov-sandia-cognition-learning-core" rev="3.3.0"/>
<dependency org="gov.sandia.foundry" name="gov-sandia-cognition-text-core" rev="3.3.0"/>
<dependency org="gov.sandia.foundry" name="gov-sandia-cognition-framework-core" rev="3.3.0"/>
<dependency org="gov.sandia.foundry" name="gov-sandia-cognition-framework-learning" rev="3.3.0"/>
Unless you have changed your Ivy resolvers, you should be able to pick these up just by adding the above.
Let us know if you have any questions. Thanks to Andrew for the suggestion.
I set up some forums for this site. Please make use of them to ask questions, provide answers, and share information about the Cognitive Foundry.
Welcome to cognitivefoundry.org, the community site for the Cognitive Foundry. The Cognitive Foundry was created by the Cognitive Systems group at Sandia National Laboratories to be a software platform for building intelligent systems. Started in 2006, it was open sourced in 2010 under a BSD-style license. It is primarily written in Java and has a heavy emphasis on machine learning algorithms.
This site was created to help provide information about the Foundry and to foster the community of Foundry users.