Implements multilayer feedforward ANNs, with fast execution even on systems without a floating point processor
Fast Artificial Neural Network Library (FANN) implements multilayer feedforward networks, both fully connected and sparsely connected. It supports execution in fixed point arithmetic for fast execution on systems with no floating point processor. To overcome the problem of integer overflow, the library calculates the position of the decimal point after training and guarantees that integer overflow cannot occur with that decimal point. FANN is designed to be fast, versatile, and easy to use. Several benchmarks have been executed to test its performance: it is significantly faster than other libraries on systems without a floating point processor, and comparable to other highly optimized libraries on systems with one.
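The decimal-point trick described above can be sketched as follows. This is an illustrative model of the idea, not FANN's actual implementation: the function names, the headroom rule for summing fan-in products, and the 32-bit word size are all assumptions made for the example.

```python
import math

def choose_decimal_point(max_abs_value, int_bits=32, fan_in=16):
    """Pick the number of fractional bits so that no multiply-accumulate
    over `fan_in` inputs can overflow a signed `int_bits` integer.
    (Illustrative rule, not FANN's exact formula.)"""
    # Bits needed for the integer part of the largest trained value.
    integer_bits = max(1, math.ceil(math.log2(max_abs_value + 1)))
    # Extra bits of headroom for summing up to `fan_in` products.
    headroom_bits = math.ceil(math.log2(fan_in))
    # A product of two fixed-point numbers needs 2*frac fractional bits
    # and up to 2*integer_bits integer bits; solve for frac.
    frac = (int_bits - 1 - 2 * integer_bits - headroom_bits) // 2
    return max(frac, 0)

def to_fixed(x, frac):
    """Convert a float to fixed point with `frac` fractional bits."""
    return int(round(x * (1 << frac)))

# After training, suppose the largest absolute weight/activation is 3.7.
frac = choose_decimal_point(max_abs_value=3.7)
a, b = to_fixed(1.25, frac), to_fixed(-0.5, frac)
product = (a * b) >> frac          # fixed-point multiply, rescaled
print(product / (1 << frac))       # -0.625
```

Because the decimal point is chosen once, after training, every forward pass can run entirely in integer arithmetic with a static guarantee against overflow, rather than checking for saturation at run time.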
Documentation
- User guide (HTML): http://fann.sourceforge.net/report/node7.html
- Complete manual (PDF): http://prdownloads.sourceforge.net/fann/fann_doc_complete_1.0.pdf?download
- Complete manual (HTML): http://fann.sourceforge.net/report/report.html
Released on 24 January 2012
10 December 2003
Leaders and contributors
Resources and communication
|Developer|VCS Repository Webview|http://sourceforge.net/cvs/?group_id=93562|
This entry (in part or in whole) was last reviewed on 25 February 2017.
Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.3 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. A copy of the license is included in the page “GNU Free Documentation License”.
The copyright and license notices on this page apply only to the text on this page. Any software, copyright licenses, or other similar notices described in this text have their own copyright notices and licenses, which can usually be found in the distribution or license text itself.