Category/Science/artificial-intelligence

From Free Software Directory
 


artificial-intelligence (33)



Aletheia
In short, Aletheia is software for getting science published and into the hands of everyone, for free. It is a decentralised, distributed database used as a publishing platform for scientific research. But software without people is nothing: Aletheia is also the community of people around it who want to change the world through open access to scientific knowledge. More technically, Aletheia is an Ethereum blockchain application that uses IPFS for decentralised storage. Anyone can upload documents to it or download documents from it, and it also handles the academic peer review process. The application runs on individual PCs, each of which forms part of the IPFS database. This gives an open source platform that cannot be bought out by the large publishers (any derivative works must also be open source) and that should be hard to take down, because the database is spread across the globe in multiple legal jurisdictions. Aletheia is designed to be a resilient platform run transparently by the community rather than by a black-box corporation or editorial board: all users can see the decisions Aletheia is making and, if they wish, have a stake in that decision-making process. Because it is decentralised, it has no key-person risk: should the core group who founded Aletheia disappear, Aletheia will not cease to exist; it will continue to be run by the community. The community moderates content through various mechanisms (peer review, reputation scores, etc.) to ensure quality of content.
Charlemagne
Charlemagne is a genetic programming application that includes both a command-line client and an interactive console mode. It is written in Python and Lisp and is user-extensible to some degree in both languages. It features built-in input-output mapping support and provides the ability to define complex fitness calculations in Lisp or Python.
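As a rough illustration of the kind of fitness calculation such a system evaluates, the short Python sketch below scores a candidate program against a set of input-output pairs; the names are hypothetical and are not Charlemagne's actual API or scripting syntax.

  # Illustrative only: scoring a candidate program against input-output pairs,
  # in the spirit of Charlemagne's input-output mapping support.
  def fitness(program, io_pairs):
      """Total squared error of a candidate program; lower is better."""
      error = 0.0
      for x, expected in io_pairs:
          try:
              error += (program(x) - expected) ** 2
          except (ArithmeticError, ValueError):
              error += 1e9  # heavily penalise candidates that crash on an input
      return error

  # Example: candidates are scored on how well they reproduce f(x) = x^2 + 1
  io_pairs = [(x, x * x + 1) for x in range(-5, 6)]
  print(fitness(lambda x: x * x + 1, io_pairs))   # perfect candidate: 0.0
  print(fitness(lambda x: x * x, io_pairs))       # imperfect candidate: 11.0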
Dbacl
'dbacl' is a digramic Bayesian text classifier. Given some text, it calculates the posterior probabilities that the input resembles one of any number of previously learned document collections. It can be used to sort incoming email into arbitrary categories such as spam, work, and play, or simply to distinguish an English text from a French text. It fully supports international character sets, and uses sophisticated statistical models based on the Maximum Entropy Principle.
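As a rough sketch of what a digram-based Bayesian classifier does (dbacl itself is far more sophisticated, with full international character-set support and maximum-entropy models), the following Python fragment learns two toy categories from character digrams and scores a new text against them:

  # Illustrative sketch of digram-based Bayesian scoring; not dbacl's implementation.
  import math
  from collections import Counter

  def digrams(text):
      return [text[i:i + 2] for i in range(len(text) - 1)]

  def learn(corpus):
      """Return a smoothed digram model for one category."""
      counts = Counter(digrams(corpus))
      return {"counts": counts, "total": sum(counts.values()), "vocab": 256 * 256}

  def log_likelihood(model, text):
      c, t, v = model["counts"], model["total"], model["vocab"]
      # add-one (Laplace) smoothing over a crude digram vocabulary
      return sum(math.log((c[d] + 1) / (t + v)) for d in digrams(text))

  # Learn two toy categories; with equal priors the likelihood decides the posterior.
  models = {"english": learn("the cat sat on the mat"),
            "french":  learn("le chat est sur le tapis")}
  sample = "the dog sat"
  print(max(models, key=lambda cat: log_likelihood(models[cat], sample)))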
Deduce
'Deduce' is an artificial intelligence program which accepts natural language sentences as input. These sentences describe properties of and relationships between objects (for example, "Spot is a dog", "A liquid will evaporate", or "Water does not flow uphill"). The user can then ask questions about that input, which Deduce attempts to answer using deductive reasoning techniques.
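A minimal forward-chaining sketch of this style of deduction is shown below in Python; the triple encoding of facts is hypothetical, since Deduce itself accepts natural language sentences rather than structured input.

  # Facts as (subject, relation, object) triples; one transitivity rule.
  facts = {("Spot", "is_a", "dog"), ("dog", "is_a", "mammal")}

  def infer_transitive(fs):
      """is_a is transitive: X is_a Y and Y is_a Z imply X is_a Z."""
      return {(x, "is_a", z)
              for (x, r1, y) in fs if r1 == "is_a"
              for (y2, r2, z) in fs if r2 == "is_a" and y2 == y}

  # Apply the rule repeatedly until no new facts appear (a fixed point).
  while True:
      new = infer_transitive(facts) - facts
      if not new:
          break
      facts |= new

  print(("Spot", "is_a", "mammal") in facts)  # True: deduced, never stated directly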
Dinrhiw2
The primary aim of dinrhiw is to be a linear algebra and machine learning library. For this reason, dinrhiw implements PCA and neural network code. Currently, the neural network code only supports:
  • Hamiltonian Monte Carlo sampling (HMC) and a simple Bayesian neural network
  • second-order L-BFGS search
  • gradient descent (backpropagation)
It also provides mathematical routines for arbitrary-precision mathematics, Hermite curve interpolation and many other things. A minimal sketch of the gradient-descent option appears below.
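The following NumPy code trains a one-hidden-layer network on a toy regression problem with gradient descent (backpropagation); it only illustrates the technique and is not dinrhiw's own C++ API.

  # Minimal gradient-descent (backpropagation) sketch with NumPy.
  import numpy as np

  rng = np.random.default_rng(0)
  X = rng.uniform(-1, 1, (200, 1))                      # inputs
  y = np.sin(3 * X)                                     # target function to learn

  W1, b1 = rng.normal(0, 0.5, (1, 16)), np.zeros(16)    # hidden layer
  W2, b2 = rng.normal(0, 0.5, (16, 1)), np.zeros(1)     # output layer

  lr = 0.05
  for step in range(2000):
      h = np.tanh(X @ W1 + b1)                          # forward pass
      pred = h @ W2 + b2
      err = pred - y                                    # dLoss/dpred for 0.5*MSE

      # backpropagate the error through both layers
      gW2 = h.T @ err / len(X)
      gb2 = err.mean(axis=0)
      dh = (err @ W2.T) * (1 - h ** 2)                  # tanh' = 1 - tanh^2
      gW1 = X.T @ dh / len(X)
      gb1 = dh.mean(axis=0)

      W2 -= lr * gW2; b2 -= lr * gb2                    # gradient-descent update
      W1 -= lr * gW1; b1 -= lr * gb1

  print("final mean squared error:", float((err ** 2).mean()))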
Discrete Event Calculus Reasoner
The Discrete Event Calculus Reasoner allows a programmer to add common-sense reasoning capabilities to programs. It supports deduction/temporal projection, abduction/planning, postdiction, and model finding. It allows default reasoning about action, change, space, and mental states. It is based on the event calculus, a comprehensive and highly usable logic-based formalism. It helps applications understand the world, make inferences, adapt to unexpected situations, and be more flexible.
FANN
Fast Artificial Neural Network Library (fann) implements multi-layer feedforward networks that support both fully connected and sparsely connected networks. It supports execution in fixed point arithmetic to allow for fast execution on systems with no floating point processor. To overcome the problems of integer overflow, the library calculates a position of the decimal point after training and guarantees that integer overflow cannot occur with this decimal point. FANN is designed to be fast, versatile, and easy to use. Several benchmarks have been executed to test its performance. It is significantly faster than other libraries on systems without a floating point processor, and comparable to other highly optimized libraries on systems with a floating point processor.
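As a plain-Python illustration of the fixed-point idea described above (the general technique, not FANN's actual API), storing values with a fixed decimal-point position means scaling them by a power of two and shifting the extra scale back out after multiplication:

  # Fixed-point arithmetic with a chosen decimal-point position.
  DECIMAL_BITS = 10                  # position of the binary "decimal point"
  SCALE = 1 << DECIMAL_BITS          # a real value x is stored as round(x * SCALE)

  def to_fixed(x):
      return int(round(x * SCALE))

  def from_fixed(i):
      return i / SCALE

  def fixed_mul(a, b):
      # the product of two scaled values carries SCALE twice, so shift one out
      return (a * b) >> DECIMAL_BITS

  w, x = 0.75, -1.5
  fw, fx = to_fixed(w), to_fixed(x)
  print(from_fixed(fixed_mul(fw, fx)))   # ~= w * x = -1.125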
GNOWSYS
GNOWSYS is an acronym for "Gnowledge Networking and Organizing SYStem." It is a web-based, object-oriented database server in which each object is served at a unique URL. GNOWSYS is a tool for constructing and persistently storing a Gnowledge Base (GB). The GB consists of the following three groups of constructor classes (system and temporal classes are under development):
  • Predicate Group: relationType, relation, functionType, function
  • Object Group: metaType, class, object (with provision for declarative, procedural, encapsulated, temporal, etc. classes and objects)
  • Structure Group: systemType, system, flowType, flow, processType, process
GNOWSYS indexes the data and metadata of objects in a catalogue for faster queries. Optionally, the data can remain anywhere on the Internet, with only the metadata staying in the database. Surrogates of procedures (classes, functions, and system calls) can also be installed in the database as special objects. These procedures execute as web services, so users can design applications without writing a program in any programming language: specifying the semantics of a program and mapping its elements to the surrogates of procedures is sufficient for GNOWSYS to test the application design.
Gneural Network
Gneural Network is the GNU package which implements a programmable neural network. The current version, 0.9.1, has the following features:
  • A scripting language is available which allows users to define their own neural network without having to know anything about coding.
  • Advanced programmers can use the methods/routines inside the code for their own purposes.
  • When defining the neurons of a network, it is possible to choose among various discriminant and activation functions, etc.
  • Different methods to train a neural network are available, such as genetic algorithms, multi-scale Monte Carlo optimizers, simulated annealing, and others.
  • Several training methods can run in parallel on clusters.
  • Neural networks can be saved once trained for later use.
  • The code is truly cross-platform since it is entirely developed in C and does not depend on any external library.
The network can now learn tasks defined by the user. An example script defining a simple network that fits a curve is included. We plan to deliver more advanced features soon: we are already working on recurrent networks, and we also plan to implement reinforcement learning techniques and to apply Gneural Network to deep learning applications. We will release the data along with the trained network.
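As an illustration of one of the training methods listed above, the following Python sketch fits a tiny one-hidden-layer network to a curve with simulated annealing; it is illustrative only and is neither Gneural Network's C code nor its scripting language.

  # Simulated-annealing fit of a small neural network to a curve (illustrative).
  import math, random

  random.seed(1)
  xs = [i / 20 for i in range(-20, 21)]
  ys = [math.sin(2 * x) for x in xs]                 # curve to fit

  def predict(params, x):
      # 4 hidden tanh units: params = [w1[0..3], b1[0..3], w2[0..3], b2]
      w1, b1, w2, b2 = params[0:4], params[4:8], params[8:12], params[12]
      return sum(w2[j] * math.tanh(w1[j] * x + b1[j]) for j in range(4)) + b2

  def loss(params):
      return sum((predict(params, x) - y) ** 2 for x, y in zip(xs, ys))

  params = [random.uniform(-1, 1) for _ in range(13)]
  best, temp = loss(params), 1.0
  for step in range(20000):
      candidate = [p + random.gauss(0, 0.1) for p in params]
      cand_loss = loss(candidate)
      # always accept better moves; accept worse ones with a temperature-dependent chance
      if cand_loss < best or random.random() < math.exp((best - cand_loss) / temp):
          params, best = candidate, cand_loss
      temp *= 0.9997                                  # cool down gradually

  print("final fit error:", best)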
INFOTOPO
Programs for Information Topology Data Analysis. INFOTOPO is a program written in Python (compatible with Python 3.4.x), with a graphical interface built using Tkinter [1], plots drawn using Matplotlib [2], calculations made using NumPy [3], and scaffold representations drawn using NetworkX [4]. It computes all the information quantities presented in the study [5], that is, all the usual information functions: the entropy, the joint entropy of k random variables (Hk), the mutual information of k random variables (Ik), and the conditional entropies and mutual informations; it also provides their cohomological (and homotopical) visualisation in the form of information landscapes and information paths, together with an approximation of the minimum information energy complex [5]. It is applicable to any set of empirical data with several trials/repetitions (parameter m), and it can also estimate the undersampling regime, the degree k above which the sample size m is too small to provide good estimates of the information functions [5].

The computational exploration is restricted to the simplicial sublattice of random variables (all subsets of the n random variables) and hence has complexity O(2^n). In this simplicial setting the information functions on the simplicial information structure, that is the joint entropies Hk and mutual informations Ik at all degrees k ≤ n and for every k-tuple, can be estimated exhaustively with a standard personal computer (a laptop with an Intel Core i7-4910MQ CPU @ 2.90GHz × 8) up to k = n = 21 in reasonable time (about 3 hours). Using the expression of the joint entropy and the probabilities obtained by marginalization [5], the joint and marginal entropies of all the variables can be computed; the alternating expression of the n-mutual information then allows a direct evaluation of all these quantities. The definitions, formulas and theorems given in [5] are sufficient to reconstruct the algorithm. A more refined interface will be developed (help welcome), but for the moment the program works as described and requires a minimum of Python knowledge. Please contact pierre.baudot [at] gmail.com for questions, requests, development, etc.

[1] J.W. Shipman. Tkinter reference: a GUI for Python. New Mexico Tech Computer Center, Socorro, New Mexico, 2010.
[2] J.D. Hunter. Matplotlib: a 2D graphics environment. Comput. Sci. Eng., 9:22–30, 2007.
[3] S. Van Der Walt, C. Colbert, and G. Varoquaux. The NumPy array: a structure for efficient numerical computation. Comput. Sci. Eng., 13:22–30, 2011.
[4] A.A. Hagberg, D.A. Schult, and P.J. Swart. Exploring network structure, dynamics, and function using NetworkX. Proceedings of the 7th Python in Science Conference (SciPy2008), Gaël Varoquaux, Travis Vaught, and Jarrod Millman (eds.), Pasadena, CA, USA, pages 11–15, 2008.
[5] M. Tapia, P. Baudot, M. Dufour, C. Formisano-Tréziny, S. Temporal, M. Lasserre, J. Gabert, K. Kobayashi, and J.M. Goaillard. Information topology of gene expression profile in dopaminergic neurons. doi: https://doi.org/10.1101/168740 http://www.biorxiv.org/content/early/2017/07/26/168740
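As a rough NumPy illustration of the quantities INFOTOPO estimates (not its actual code), the sketch below computes the empirical joint entropy Hk for every subset of a few discrete variables and the k-mutual information Ik via the alternating inclusion-exclusion sum:

  # Joint entropies Hk and k-mutual informations Ik for small discrete data.
  import numpy as np
  from itertools import combinations

  rng = np.random.default_rng(0)
  data = rng.integers(0, 2, size=(1000, 3))   # m=1000 trials of n=3 binary variables

  def joint_entropy(columns):
      """Empirical joint entropy (in bits) of the selected columns."""
      _, counts = np.unique(data[:, list(columns)], axis=0, return_counts=True)
      p = counts / counts.sum()
      return float(-(p * np.log2(p)).sum())

  n = data.shape[1]
  H = {S: joint_entropy(S) for k in range(1, n + 1)
       for S in combinations(range(n), k)}

  def mutual_information(S):
      """I_k(S) = sum over non-empty T subset of S of (-1)^(|T|+1) * H(T)."""
      return sum((-1) ** (len(T) + 1) * H[T]
                 for k in range(1, len(S) + 1)
                 for T in combinations(S, k))

  print("H(X0,X1,X2) =", round(H[(0, 1, 2)], 3))
  print("I2(X0;X1)   =", round(mutual_information((0, 1)), 3))
  print("I3(X0;X1;X2)=", round(mutual_information((0, 1, 2)), 3))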

... further results



Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.3 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. A copy of the license is included in the page “GNU Free Documentation License”.

The copyright and license notices on this page only apply to the text on this page. Any software, copyright licenses, or other similar notices described in this text have their own copyright notices and licenses, which can usually be found in the distribution or license text itself.