Abstract

"Artificial neural networks" provide an appealing model of computation. Such networks consist of an interconnection of a number of parallel agents, or "neurons". Each of these receives certain signals as inputs, computes some simple function, and produces a signal as output, which is in turn broadcast to the successive neurons involved in a given computation. Some of the signals originate from outside the network and act as inputs to the whole system, while some of the output signals are communicated back to the environment and are used to encode the end result of the computation. In this dissertation we focus on the "recurrent network" model, in which the underlying graph is not subject to any constraints. We investigate the computational power of neural nets, taking a classical computer science point of view. We characterize the language recognition power of networks in terms of the types of numbers (constants) utilized as weights. From a mathematical viewpoint, it is natural to consider integer, rational, and real numbers. From the standpoint of computer science, natural classes of formal languages are the regular languages, the recursive languages, and "all languages". We establish a precise correspondence between the mathematical and computing choices. Furthermore, when the computation time of the network is constrained to be polynomial in the input size, the classes recognized by the respective networks are: regular, P, and Analog-P, i.e., P/poly. Among other results described in this thesis are a proper hierarchy of networks using Kolmogorov-complexity characterizations, the imposition of space constraints, and a proposed "Church's thesis of analog computing".
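To make the model concrete, the neuron update described above can be sketched in a few lines of Python. This is a minimal illustration, not the thesis's formal definition: the saturated-linear activation is one common choice for this class of models, and the particular weights below are made-up values chosen only so the example runs.

```python
def sigma(x):
    # Saturated-linear activation: clamp the weighted sum to [0, 1].
    return min(1.0, max(0.0, x))

def step(state, inputs, W, V, b):
    """One synchronous update of all neurons: each neuron applies
    sigma to a weighted sum of the current neuron states, the
    external input signals, and a bias term."""
    n = len(state)
    return [
        sigma(sum(W[i][j] * state[j] for j in range(n))
              + sum(V[i][k] * inputs[k] for k in range(len(inputs)))
              + b[i])
        for i in range(n)
    ]

# Tiny illustrative 2-neuron recurrent net with rational weights
# (these particular numbers are hypothetical, for illustration only).
W = [[0.5, 0.25], [0.0, 0.5]]   # neuron-to-neuron weights
V = [[1.0], [0.0]]              # input-to-neuron weights
b = [0.0, 0.1]                  # biases
x = [0.0, 0.0]                  # initial state
for _ in range(3):
    x = step(x, [1.0], W, V, b)
```

Under this reading, choosing the weight entries from the integers, rationals, or reals is exactly the parameter that the abstract relates to regular, recursive, and arbitrary languages.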