Results 1 - 12 of 12
1.
PLoS One ; 14(1): e0208555, 2019.
Article in English | MEDLINE | ID: mdl-30608937

ABSTRACT

This article explores how probabilistic programming can be used to simulate quantum correlations in an EPR experimental setting. Probabilistic programs are based on standard probability theory, which cannot produce quantum correlations. To address this limitation, a hypergraph formalism was programmed that expresses both the measurement contexts of the EPR experimental design and their associated constraints. Four contemporary open source probabilistic programming frameworks were used to simulate an EPR experiment in order to shed light on their relative effectiveness along both qualitative and quantitative dimensions. We found that all four probabilistic languages successfully simulated quantum correlations. Detailed analysis revealed that no language was clearly superior across all dimensions; however, the comparison does highlight aspects that can be considered when using probabilistic programs to simulate experiments in quantum physics.
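The core obstacle the abstract names, that standard probability cannot produce quantum correlations, can be illustrated with a short sketch (not from the paper; plain Python, with the CHSH scenario assumed as the EPR setting): enumerating every local deterministic strategy shows the classical CHSH value never exceeds 2, whereas quantum mechanics reaches 2√2.

```python
from itertools import product
from math import sqrt

# Enumerate all local deterministic strategies: Alice assigns +/-1 outcomes
# to her settings a, a'; Bob assigns +/-1 to b, b'. The CHSH expression
#   S = E(a,b) + E(a,b') + E(a',b) - E(a',b')
# is then fixed by those four pre-assigned outcomes.
best = max(a*b + a*bp + ap*b - ap*bp
           for a, ap, b, bp in product((-1, 1), repeat=4))

print(best)           # classical (local hidden variable) bound: 2
print(2 * sqrt(2))    # Tsirelson's quantum bound, ~2.828
```

Any program built on classical randomness alone samples mixtures of these deterministic strategies, so it stays at or below the bound of 2; reproducing the quantum value is what requires the extra constraint machinery the abstract describes.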


Subject(s)
Computer Simulation , Probability , Programming Languages , Quantum Theory , Time Factors
2.
PLoS One ; 13(12): e0208561, 2018.
Article in English | MEDLINE | ID: mdl-30571700

ABSTRACT

Open source software is becoming crucial in the design and testing of quantum algorithms. Many of the tools are backed by major commercial vendors with the goal of making it easier to develop quantum software: this mirrors how well-funded open machine learning frameworks enabled the development of complex models and their execution on equally complex hardware. We review a wide range of open source software for quantum computing, covering all stages of the quantum toolchain from quantum hardware interfaces through quantum compilers to implementations of quantum algorithms, as well as all quantum computing paradigms, including quantum annealing and discrete and continuous-variable gate-model quantum computing. The evaluation of each project covers characteristics such as documentation, licence, the choice of programming language, compliance with norms of software engineering, and the culture of the project. We find that while the diversity of projects is mesmerizing, only a few attract external developers, and even many commercially backed frameworks have shortcomings in software engineering. Based on these observations, we highlight the best practices that could foster a more active community around quantum computing software, one that welcomes newcomers to the field but also ensures high-quality, well-documented code.


Subject(s)
Algorithms , Software
3.
Phys Rev Lett ; 119(19): 190501, 2017 Nov 10.
Article in English | MEDLINE | ID: mdl-29219480

ABSTRACT

Standard projective measurements (PMs) represent a subset of all possible measurements in quantum physics, defined by positive-operator-valued measures. We study what quantum measurements are projective simulable, that is, can be simulated by using projective measurements and classical randomness. We first prove that every measurement on a given quantum system can be realized by classical randomization of projective measurements on the system plus an ancilla of the same dimension. Then, given a general measurement in dimension two or three, we show that deciding whether it is PM simulable can be solved by means of semidefinite programming. We also establish conditions for the simulation of measurements using projective ones valid for any dimension. As an application of our formalism, we improve the range of visibilities for which two-qubit Werner states do not violate any Bell inequality for all measurements. From an implementation point of view, our work provides bounds on the amount of white noise a measurement tolerates before losing any advantage over projective ones.
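A minimal numerical illustration of "classical randomization of projective measurements" (an assumed sketch, not the paper's SDP method or ancilla construction): flipping a classical coin between the qubit Z-basis and X-basis measurements yields a four-outcome measurement whose effects are positive and sum to the identity, i.e. a valid, PM-simulable POVM.

```python
import numpy as np

# Two projective qubit measurements: the Z basis {|0>,|1>} and the X basis {|+>,|->}.
ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus, minus = np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2)
proj = lambda v: np.outer(v, v)

p = 0.5  # probability of performing the Z measurement
# Classical randomization gives the four-outcome POVM
# {p|0><0|, p|1><1|, (1-p)|+><+|, (1-p)|-><->|}.
effects = [p*proj(ket0), p*proj(ket1), (1-p)*proj(plus), (1-p)*proj(minus)]

total = sum(effects)
assert np.allclose(total, np.eye(2))                                   # completeness
assert all(np.min(np.linalg.eigvalsh(E)) >= -1e-12 for E in effects)   # positivity
```

Deciding the converse direction, whether a *given* POVM admits such a decomposition, is the semidefinite-programming question the paper addresses for dimensions two and three.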

4.
Nature ; 549(7671): 195-202, 2017 09 13.
Article in English | MEDLINE | ID: mdl-28905917

ABSTRACT

Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

5.
Phys Rev Lett ; 118(19): 190503, 2017 May 12.
Article in English | MEDLINE | ID: mdl-28548536

ABSTRACT

In supervised learning, an inductive learning algorithm extracts general rules from observed training instances, then the rules are applied to test instances. We show that this splitting of training and application arises naturally, in the classical setting, from a simple independence requirement with a physical interpretation of being nonsignaling. Thus, two seemingly different definitions of inductive learning happen to coincide. This follows from the properties of classical information that break down in the quantum setup. We prove a quantum de Finetti theorem for quantum channels, which shows that in the quantum case, the equivalence holds in the asymptotic setting, that is, for large numbers of test instances. This reveals a natural analogy between classical learning protocols and their quantum counterparts, justifying a similar treatment, and allowing us to inquire about standard elements in computational learning theory, such as structural risk minimization and sample complexity.

6.
Sci Rep ; 7: 45672, 2017 04 19.
Article in English | MEDLINE | ID: mdl-28422093

ABSTRACT

Markov logic networks (MLNs) reconcile two opposing schools in machine learning and artificial intelligence: causal networks, which account for uncertainty extremely well, and first-order logic, which allows for formal deduction. An MLN is essentially a first-order logic template for generating Markov networks. Inference in MLNs is probabilistic, and it is often performed by approximate methods such as Markov chain Monte Carlo (MCMC) Gibbs sampling. An MLN has many regular, symmetric structures that can be exploited both at the first-order level and in the generated Markov network. We analyze the graph structures that are produced by various lifting methods and investigate the extent to which quantum protocols can be used to speed up Gibbs sampling with state preparation and measurement schemes. We review several such approaches, discussing their advantages, their theoretical limitations, and their suitability for implementation. We find that a straightforward application of a recent result yields an exponential speedup over classical heuristics in approximate probabilistic inference, thereby demonstrating another example where advanced quantum resources can potentially prove useful in machine learning.
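As a concrete reference point for the classical baseline, here is a minimal Gibbs sampler on a two-variable binary Markov network (an illustrative sketch; the weights and the tiny network are assumptions, not the paper's MLN setup). Each variable is resampled from its conditional in turn, and the sampled marginal is checked against exact enumeration.

```python
from itertools import product
import numpy as np

rng = np.random.default_rng(0)
w, b1, b2 = 1.2, 0.3, -0.5        # pairwise weight and unary biases (assumed)

def score(x1, x2):                # unnormalized log-potential of the MRF
    return w*x1*x2 + b1*x1 + b2*x2

# Exact marginal P(x1 = 1) by enumerating the four joint states.
Z = sum(np.exp(score(*s)) for s in product((0, 1), repeat=2))
exact = sum(np.exp(score(1, x2)) for x2 in (0, 1)) / Z

# Gibbs sampling: each conditional of a binary variable is a logistic function
# of its neighbors' current values.
x1, x2 = 0, 0
hits, n = 0, 20000
for _ in range(n):
    p1 = 1.0 / (1.0 + np.exp(-(w*x2 + b1)))   # P(x1 = 1 | x2)
    x1 = int(rng.random() < p1)
    p2 = 1.0 / (1.0 + np.exp(-(w*x1 + b2)))   # P(x2 = 1 | x1)
    x2 = int(rng.random() < p2)
    hits += x1

print(abs(hits/n - exact))   # small: sampler agrees with enumeration
```

A ground Markov network produced by an MLN template has the same structure, only with many more variables sharing tied weights; it is that repetitive structure which both lifting methods and the quantum sampling protocols try to exploit.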

7.
Scientometrics ; 110(2): 765-777, 2017.
Article in English | MEDLINE | ID: mdl-28239206

ABSTRACT

Citation and coauthor networks offer an insight into the dynamics of scientific progress. We can also view them as representations of a causal structure, a logical process captured in a graph. From a causal perspective, we can ask questions such as whether authors form groups primarily due to their prior shared interest, or if their favourite topics are 'contagious' and spread through co-authorship. Such networks have been widely studied by the artificial intelligence community, and recently a connection has been made to nonlocal correlations produced by entangled particles in quantum physics: the impact of latent hidden variables can be analyzed by the same algebraic geometric methodology that relies on a sequence of semidefinite programming (SDP) relaxations. Following this trail, we treat our sample coauthor network as a causal graph and, using SDP relaxations, rule out latent homophily (prior shared interest alone) as the sole explanation of the observed pattern of coauthorship. By introducing algebraic geometry to citation studies, we add a new tool to existing methods for the analysis of content-related social influences.

8.
Phys Rev Lett ; 119(4): 040402, 2017 Jul 28.
Article in English | MEDLINE | ID: mdl-29341783

ABSTRACT

Bell inequalities have traditionally been used to demonstrate that quantum theory is nonlocal, in the sense that there exist correlations generated from composite quantum states that cannot be explained by means of local hidden variables. With the advent of device-independent quantum information protocols, Bell inequalities have gained an additional role as certificates of relevant quantum properties. In this work, we consider the problem of designing Bell inequalities that are tailored to detect maximally entangled states. We introduce a class of Bell inequalities valid for an arbitrary number of measurements and results, analytically derive their tight classical, nonsignaling, and quantum bounds, and prove that the quantum bound is attained by maximally entangled states. Our inequalities can therefore find an application in device-independent protocols requiring maximally entangled states.
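For context on the bounds mentioned above, a small numpy sketch (the two-measurement CHSH case is assumed here, not the paper's new inequality class) evaluates the quantum value that a maximally entangled two-qubit state attains at the optimal settings, exceeding the classical bound of 2.

```python
import numpy as np

# Pauli observables and the maximally entangled state |Phi+> = (|00>+|11>)/sqrt(2).
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

def M(theta):
    """Dichotomic observable at angle theta in the Z-X plane of the Bloch sphere."""
    return np.cos(theta)*Z + np.sin(theta)*X

def E(a, b):
    """Correlator <Phi+| A(a) (x) B(b) |Phi+>; equals cos(a - b) for |Phi+>."""
    return phi @ np.kron(M(a), M(b)) @ phi

# Optimal CHSH settings for a maximally entangled state.
a, ap, b, bp = 0.0, np.pi/2, np.pi/4, -np.pi/4
S = E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)
print(S)   # ~2.828, i.e. 2*sqrt(2): the tight quantum (Tsirelson) bound
```

The paper's construction generalizes this picture to an arbitrary number of settings and outcomes while keeping the property that the quantum bound is reached exactly by maximally entangled states.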

9.
Front Psychol ; 7: 1790, 2016.
Article in English | MEDLINE | ID: mdl-27909418

ABSTRACT

Information foraging connects optimal foraging theory in ecology with how humans search for information. The theory suggests that, following an information scent, the information seeker must optimize the tradeoff between exploration (repeated steps in the search space) and exploitation (using the resources encountered). We conjecture that this tradeoff characterizes how a user deals with uncertainty and its two aspects in economic theory, risk and ambiguity. Risk is related to the perceived quality of the actually visited patch of information, and can be reduced by exploiting and understanding the patch to a better extent. Ambiguity, on the other hand, is the opportunity cost of having higher quality patches elsewhere in the search space. The aforementioned tradeoff depends on many attributes, including traits of the user: at the two extreme ends of the spectrum, analytic and wholistic searchers employ entirely different strategies. The former type focuses on exploitation first, interspersed with bouts of exploration, whereas the latter type prefers to explore the search space first and consume later. Our findings from an eye-tracking study of experts' interactions with novel search interfaces in the biomedical domain suggest that user traits of cognitive styles and perceived search task difficulty are significantly correlated with eye gaze and search behavior. We also demonstrate that perceived risk shifts the balance between exploration and exploitation in either type of user, tilting it away from or toward ambiguity minimization. Since the pattern of behavior in information foraging is quintessentially sequential, risk and ambiguity minimization cannot happen simultaneously, leading to a fundamental limit on how good such a tradeoff can be. This in turn connects information seeking with the emergent field of quantum decision theory.

10.
Annu Int Conf IEEE Eng Med Biol Soc ; 2016: 766-769, 2016 Aug.
Article in English | MEDLINE | ID: mdl-28324937

ABSTRACT

Noninvasive measurement of blood pressure by optical methods receives considerable interest, but the complexity of the measurement and the difficulty of adjusting parameters restrict applications. We develop a method for estimating the systolic and diastolic blood pressure using a single-point optical recording of a photoplethysmographic (PPG) signal. The estimation is data-driven: we use automated machine learning algorithms instead of mathematical models. Combining supervised learning with a discrete wavelet transform, the method is insensitive to minor irregularities in the PPG waveform, hence both pulse oximeters and smartphone cameras can record the signal. We evaluate the accuracy of the estimation on 78 samples from 65 subjects (40 male, 25 female, age 29±7) with no history of cardiovascular disease. Using the oximeter-obtained PPG, the estimate has a mean error of 4.9±4.9 mm Hg for systolic and 4.3±3.7 mm Hg for diastolic blood pressure; with the phone-obtained PPG, the corresponding values are 5.1±4.3 mm Hg and 4.6±4.3 mm Hg, taking the A&D UA-767PBT readings as the gold standard. The simplicity of the method encourages ambulatory measurement, and given the ease of sharing the measured data, we expect a shift toward data-oriented approaches that derive insight from ubiquitous mobile devices and yield more accurate machine learning models for monitoring blood pressure.
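The abstract does not spell out the feature pipeline, so the numpy-only sketch below shows the general shape of such a front end (the Haar wavelet, the number of levels, and the energy features are assumptions, not the authors' exact choices): decompose a PPG-like waveform with an orthonormal discrete wavelet transform and hand per-level energies to a supervised regressor.

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar discrete wavelet transform."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass: trend
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass: fluctuation
    return approx, detail

def wavelet_features(signal, levels=3):
    """Per-level detail energies plus the residual trend energy.
    Energy features ignore small waveform irregularities and phase shifts."""
    feats, approx = [], signal
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        feats.append(float(np.sum(detail**2)))
    feats.append(float(np.sum(approx**2)))
    return feats

# Synthetic PPG-like waveform: periodic pulses with a slow baseline drift.
t = np.linspace(0, 8, 1024)
ppg = np.maximum(np.sin(2*np.pi*1.2*t), 0)**2 + 0.1*t
print(wavelet_features(ppg))   # compact feature vector for a regression model
```

Because the Haar transform is orthonormal, the energies partition the signal's total energy exactly, which makes the features well behaved regardless of which device recorded the waveform.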


Subject(s)
Blood Pressure Determination , Blood Pressure , Photoplethysmography , Adult , Female , Humans , Male , Oximetry , Wavelet Analysis , Young Adult
11.
Article in English | MEDLINE | ID: mdl-25570674

ABSTRACT

Photoplethysmogram (PPG) signals acquired by smartphone cameras are weaker than those acquired by dedicated pulse oximeters. Furthermore, the signals have lower sampling rates, have notches in the waveform, and are more severely affected by baseline drift, leading to specific morphological characteristics. This paper introduces a new feature, the inverted triangular area, to address these specific characteristics. The new feature enables real-time adaptive waveform detection using an algorithm of linear time complexity. It can also recognize notches in the waveform and is inherently robust to baseline drift. An implementation of the algorithm on Android is available for free download. We collected data from 24 volunteers and compared our algorithm in peak detection with two competing algorithms designed for PPG signals, Incremental-Merge Segmentation (IMS) and Adaptive Thresholding (ADT). A sensitivity of 98.0% and a positive predictive value of 98.8% were obtained, exceeding the IMS algorithm by 7.7% in sensitivity and the ADT algorithm by 8.3% in positive predictive value. The experimental results confirm the applicability of the proposed method.
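The inverted-triangular-area feature itself is not defined in the abstract, so the sketch below shows only what a generic single-pass, linear-time peak detector for a pulse waveform looks like (the threshold rule and refractory gap are illustrative assumptions, not the paper's method).

```python
import numpy as np

def detect_peaks(sig, min_gap):
    """O(n) single-pass local-maximum detector with a crude amplitude gate
    and a refractory gap to suppress double detections. Baseline sketch only;
    NOT the paper's inverted-triangular-area algorithm."""
    thresh = float(np.mean(sig))
    peaks, last = [], -min_gap
    for i in range(1, len(sig) - 1):
        if (sig[i] > thresh and sig[i] >= sig[i-1] and sig[i] > sig[i+1]
                and i - last >= min_gap):
            peaks.append(i)
            last = i
    return peaks

# Synthetic pulse train: four beats over the window, 100 samples per beat.
t = np.linspace(0, 4, 400, endpoint=False)
sig = np.sin(2*np.pi*t)
print(len(detect_peaks(sig, min_gap=50)))   # one peak per beat: 4
```

A real PPG detector must additionally cope with the notches and baseline drift the abstract describes, which is exactly what the proposed feature is designed to handle adaptively.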


Subject(s)
Algorithms , Cell Phone , Computer Systems , Heart Rate/physiology , Photoplethysmography/methods , Signal Processing, Computer-Assisted , Female , Humans , Male
12.
IEEE Trans Pattern Anal Mach Intell ; 33(10): 2039-50, 2011 Oct.
Article in English | MEDLINE | ID: mdl-21321366

ABSTRACT

Wavelet kernels have been introduced for both support vector regression and classification. Most of these wavelet kernels do not use the inner product of the embedding space, but use wavelets in a similar fashion to radial basis function kernels. Wavelet analysis is typically carried out on data with a temporal or spatial relation between consecutive data points. We argue that it is possible to order the features of a general data set so that consecutive features are statistically related to each other, thus enabling us to interpret the vector representation of an object as a series of equally or randomly spaced observations of a hypothetical continuous signal. By approximating the signal with compactly supported basis functions and employing the inner product of the embedding L2 space, we gain a new family of wavelet kernels. Empirical results show a clear advantage in favor of these kernels.
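A hedged sketch of the construction described above (the Haar basis, two levels, and the toy data are assumptions): treating each ordered feature vector as samples of a signal, transforming it into orthonormal wavelet coefficients, and taking inner products there yields a symmetric positive semidefinite Gram matrix, i.e. a valid kernel.

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal Haar transform: (approximation, detail)."""
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_coeffs(x, levels=2):
    """Stack detail coefficients per level plus the final approximation."""
    out, approx = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        out.append(detail)
    out.append(approx)
    return np.concatenate(out)

def wavelet_kernel(X):
    """Gram matrix of inner products taken in wavelet coefficient space."""
    C = np.array([haar_coeffs(x) for x in X])
    return C @ C.T

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 16))    # 5 objects, 16 ordered features each
K = wavelet_kernel(X)
assert np.allclose(K, K.T)                       # symmetric
assert np.min(np.linalg.eigvalsh(K)) > -1e-10    # positive semidefinite
```

Since the full orthonormal transform preserves inner products, this Gram matrix coincides with the linear kernel; the family becomes genuinely different when, as in the abstract, the signal is approximated by a truncated set of compactly supported basis functions, which smooths the feature representation before the inner product is taken.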
