Dynamic neural filters (DNFs) are recurrent networks of binary neurons. Under suitable conditions on their synaptic matrix they are known to generate exponentially long cycles. We show that when the synaptic matrix is chosen to be a random orthogonal matrix, the average cycle length becomes close to that of a random map. We then investigate the inversion problem and argue that such a DNF can be used to construct a pseudo-random generator. Subjecting the generator's output to a battery of statistical tests, we demonstrate that the sequences it produces may indeed be regarded as pseudo-random.
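The construction described above can be illustrated with a minimal sketch. The code below assumes a simple threshold update rule for the binary neurons, zero thresholds, and a random orthogonal synaptic matrix drawn via QR decomposition of a Gaussian matrix; the function names (`random_orthogonal`, `dnf_step`, `cycle_length`) and these specific dynamics are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def random_orthogonal(n, rng):
    # QR decomposition of a Gaussian matrix gives a random orthogonal matrix;
    # multiplying columns by the signs of R's diagonal makes the distribution uniform (Haar)
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    return q * np.sign(np.diag(r))

def dnf_step(state, w, theta):
    # illustrative binary-neuron update: s_i <- 1 if sum_j w_ij * s_j > theta_i else 0
    return (w @ state > theta).astype(np.int8)

def cycle_length(w, theta, s0, max_steps=100_000):
    # iterate the dynamics, recording each visited state, until one recurs;
    # the return value is the length of the cycle that was entered
    seen = {}
    s = s0.copy()
    for t in range(max_steps):
        key = s.tobytes()
        if key in seen:
            return t - seen[key]
        seen[key] = t
        s = dnf_step(s, w, theta)
    return None  # no recurrence found within max_steps

rng = np.random.default_rng(0)
n = 12  # small network so the state space (2^12 states) is quickly exhausted
w = random_orthogonal(n, rng)
theta = np.zeros(n)
s0 = rng.integers(0, 2, n).astype(np.int8)
print(cycle_length(w, theta, s0))
```

A pseudo-random generator in the spirit of the abstract would then emit bits derived from the successive states (e.g. one neuron's value per step); the cycle length bounds the period of such a sequence, which is why long average cycles matter.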