1.
Theory Comput Syst ; 65(4): 687-705, 2021.
Article in English | MEDLINE | ID: mdl-34720702

ABSTRACT

Many shared memory algorithms have to deal with the problem of determining whether the value of a shared object has changed in between two successive accesses of that object by a process when the responses from both are the same. Motivated by this problem, we define the signal detection problem, which can be studied on a purely combinatorial level. Consider a system with n + 1 processes consisting of n readers and one signaller. The processes communicate through a shared blackboard that can store a value from a domain of size m. Processes are scheduled by an adversary. When scheduled, a process reads the blackboard, modifies its contents arbitrarily, and, provided it is a reader, returns a Boolean value. A reader must return true if the signaller has taken a step since the reader's preceding step; otherwise it must return false. Intuitively, in a system with n processes, signal detection should require at least n bits of shared information, i.e., m ≥ 2^n. But a proof of this conjecture remains elusive. For the general case, we prove a lower bound of m ≥ n^2. For restricted versions of the problem, where the processes are oblivious or where the signaller must write a fixed sequence of values, we prove a tight lower bound of m ≥ 2^n. We also consider a version of the problem where each reader takes at most two steps. In this case, we prove that m = n + 1 blackboard values are necessary and sufficient.
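The intuition behind the conjectured m ≥ 2^n bound is that one bit per reader suffices from above: the signaller sets every reader's bit, and each reader reports and clears its own bit. A minimal Python sketch of the model and this straightforward n-bit protocol (all names hypothetical; the paper itself is about lower bounds, so this only illustrates why 2^n blackboard values are enough):

```python
import random

class Blackboard:
    """Shared blackboard holding an n-bit vector, one bit per reader.
    This realizes a domain of m = 2^n possible blackboard values."""
    def __init__(self, n):
        # bit i == 1 iff the signaller has stepped since reader i's last step
        self.bits = [0] * n

def signaller_step(bb):
    # When scheduled, the signaller marks every reader as "signalled".
    bb.bits = [1] * len(bb.bits)

def reader_step(bb, i):
    # Reader i reports its own bit, then clears it for the next round.
    saw_signal = bb.bits[i] == 1
    bb.bits[i] = 0
    return saw_signal

def simulate(n, steps, seed=0):
    """Run an adversarial (here: random) schedule and check every reader's
    answer against the ground truth required by the problem definition."""
    rng = random.Random(seed)
    bb = Blackboard(n)
    signalled_since = [False] * n  # ground truth, per reader
    for _ in range(steps):
        p = rng.randrange(n + 1)   # process 0 is the signaller
        if p == 0:
            signaller_step(bb)
            signalled_since = [True] * n
        else:
            i = p - 1
            assert reader_step(bb, i) == signalled_since[i]
            signalled_since[i] = False
    return True
```

Under any schedule, reader i's bit is 1 exactly when the signaller has stepped since reader i last cleared it, so the protocol is correct with 2^n values; the open question discussed in the abstract is whether fewer values can ever suffice in general.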

2.
Proc Natl Acad Sci U S A ; 111(47): 16872-6, 2014 Nov 25.
Article in English | MEDLINE | ID: mdl-25385619

ABSTRACT

Johnson-Lindenstrauss (JL) matrices implemented by sparse random synaptic connections are thought to be a prime candidate for how convergent pathways in the brain compress information. However, to date, there is no complete mathematical support for such implementations given the constraints of real neural tissue. The fact that neurons are either excitatory or inhibitory implies that every JL matrix implementable in this way must be sign consistent (i.e., all entries in a single column must be either all nonnegative or all nonpositive), and the fact that any given neuron connects to a relatively small subset of other neurons implies that the JL matrix should be sparse. We construct sparse JL matrices that are sign consistent and prove that our construction is essentially optimal. Our work answers a mathematical question raised by earlier work, is necessary to justify the existence of JL compression in the brain, and emphasizes that inhibition is crucial if neurons are to perform efficient, correlation-preserving compression.
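To make the sign-consistency constraint concrete, note that a feature-hashing-style sparse matrix with a single signed nonzero per column is trivially sign consistent (each column has at most one nonzero entry) and still preserves squared norms in expectation. A minimal numerical sketch in pure Python; this is an illustration of the constraint, not the paper's construction or its optimality guarantees:

```python
import random

def sign_consistent_sparse_matrix(k, d, rng):
    """Each column j gets exactly one nonzero: a sign s_j placed in a
    random row. With one nonzero per column, every column is sign
    consistent (all entries nonnegative or all nonpositive)."""
    rows = [rng.randrange(k) for _ in range(d)]          # row index of column j's nonzero
    signs = [rng.choice((-1.0, 1.0)) for _ in range(d)]  # sign of column j
    return rows, signs

def embed(rows, signs, k, x):
    """Compute y = Ax for the implicit k x d sparse matrix above."""
    y = [0.0] * k
    for j, xj in enumerate(x):
        y[rows[j]] += signs[j] * xj
    return y

def sq_norm(v):
    return sum(t * t for t in v)

# Empirically check that squared norms are roughly preserved on average.
rng = random.Random(1)
k, d = 64, 512
errs = []
for _ in range(50):
    rows, signs = sign_consistent_sparse_matrix(k, d, rng)
    x = [rng.gauss(0.0, 1.0) for _ in range(d)]
    y = embed(rows, signs, k, x)
    errs.append(abs(sq_norm(y) - sq_norm(x)) / sq_norm(x))
mean_err = sum(errs) / len(errs)
```

For k = 64 the typical relative error of a single sketch is on the order of sqrt(2/k) ≈ 0.18; the paper's contribution is a sign-consistent construction with provably optimal sparsity/distortion trade-offs, which this unbiased-in-expectation sketch does not attempt to capture.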


Subject(s)
Models, Biological; Neurosciences