Multidimensional scaling (MDS) is the process of transforming a set of points in a high-dimensional space into a lower-dimensional one while preserving the relative distances between pairs of points. Although effective methods have been developed for solving a variety of MDS problems, they generally assume that the vectors in the lower-dimensional space have real-valued components. For some applications, notably the training of neural networks, it is preferable or necessary to obtain vectors in a discrete, binary space. Unfortunately, MDS into a low-dimensional discrete space appears to be a significantly harder problem than MDS into a continuous space. This article introduces and analyzes several methods for performing approximately optimized binary MDS.
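To make the contrast concrete, the following sketch (not one of the article's methods) illustrates the naive baseline: compute a real-valued embedding via classical MDS, then binarize it by sign-thresholding each coordinate. The data, dimensions, and thresholding rule here are illustrative assumptions; measuring how well Hamming distances in the binary codes track the original distances shows what a binary MDS method must improve upon.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))  # 50 illustrative points in a 20-D space

# Pairwise squared Euclidean distances
sq = np.sum(X ** 2, axis=1)
D2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)

# Classical MDS: double-center the squared-distance matrix and
# embed with the top-k eigenvectors of the resulting Gram matrix.
n, k = X.shape[0], 8
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ D2 @ J
w, V = np.linalg.eigh(B)
top = np.argsort(w)[::-1][:k]
Y = V[:, top] * np.sqrt(np.maximum(w[top], 0.0))  # real-valued k-D embedding

# Naive binarization (an assumption, not the article's method):
# threshold each coordinate at zero to get codes in {0,1}^k.
codes = (Y > 0).astype(np.uint8)

# How well do Hamming distances between codes correlate with the
# original Euclidean distances?
ham = (codes[:, None, :] != codes[None, :, :]).sum(axis=-1)
iu = np.triu_indices(n, 1)
r = np.corrcoef(np.sqrt(np.maximum(D2[iu], 0.0)), ham[iu])[0, 1]
```

The correlation `r` is typically well below 1: sign-thresholding discards magnitude information, which is one reason direct optimization over the discrete space, as the article pursues, is both harder and more rewarding.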