1.
Rev Mod Phys; 95(1), 2023.
Article in English | MEDLINE | ID: mdl-37051403

ABSTRACT

Arrays of quantum dots (QDs) are a promising candidate system to realize scalable, coupled qubit systems and serve as a fundamental building block for quantum computers. In such semiconductor quantum systems, devices now have tens of individual electrostatic and dynamical voltages that must be carefully set to localize the system into the single-electron regime and to realize good qubit operational performance. The mapping of requisite QD locations and charges to gate voltages presents a challenging classical control problem. With an increasing number of QD qubits, the relevant parameter space grows rapidly enough to make heuristic control infeasible. In recent years, there has been considerable effort to automate device control by combining script-based algorithms with machine learning (ML) techniques. In this Colloquium, a comprehensive overview of the recent progress in the automation of QD device control is presented, with a particular emphasis on silicon- and GaAs-based QDs formed in two-dimensional electron gases. Combining physics-based modeling with modern numerical optimization and ML has proven effective in yielding efficient, scalable control. Further integration of theoretical, computational, and experimental efforts with computer science and ML holds vast potential in advancing semiconductor and other platforms for quantum computing.
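As a concrete illustration of the closed-loop structure that the automation schemes surveyed in this Colloquium share, the Python sketch below wires a stand-in measurement and a stand-in state classifier into a simple stochastic hill-climbing update over gate voltages. Every function here is a placeholder for illustration, not a method from the Colloquium itself.

```python
import numpy as np

def measure(voltages):
    # Stand-in for a device measurement, e.g. a small charge-sensor scan
    # taken around the current gate-voltage operating point.
    return np.sin(voltages).sum()

def state_score(signal):
    # Stand-in for a trained state classifier: higher means closer to the
    # desired single-electron, qubit-relevant regime.
    return -abs(signal - 1.0)

def tune(v0, step=0.05, n_iter=200, seed=0):
    # Generic measure -> classify -> update loop shared by most autotuners.
    rng = np.random.default_rng(seed)
    v, best = v0.copy(), state_score(measure(v0))
    for _ in range(n_iter):
        trial = v + step * rng.standard_normal(v.shape)  # local random move
        score = state_score(measure(trial))
        if score > best:  # keep moves the classifier prefers
            v, best = trial, score
    return v

print(tune(np.zeros(4)))  # four gate voltages, purely illustrative
```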

2.
Article in English | MEDLINE | ID: mdl-36733297

ABSTRACT

Most data in cold-atom experiments comes from images, the analysis of which is limited by our preconceptions of the patterns that could be present in the data. We focus on the well-defined case of detecting dark solitons, which appear as local density depletions in a Bose-Einstein condensate (BEC), using a methodology that is extensible to the general task of pattern recognition in images of cold atoms. Studying soliton dynamics over a wide range of parameters requires the analysis of large datasets, making the existing human-inspection-based methodology a significant bottleneck. Here we describe an automated classification and positioning system for identifying localized excitations in atomic BECs that uses deep convolutional neural networks to eliminate the need for human image examination. Furthermore, we openly publish our labeled dataset of dark solitons, the first of its kind, for further machine learning research.
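A minimal sketch of the kind of convolutional classifier this abstract describes, written in PyTorch. The architecture, the 64 x 64 image size, and the three output classes are illustrative assumptions, not the published network.

```python
import torch
import torch.nn as nn

class SolitonClassifier(nn.Module):
    # Small CNN mapping a single-channel BEC image to class logits
    # (e.g., no excitation / one soliton / other -- assumed labels).
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Two 2x poolings reduce 64x64 inputs to 16x16 with 16 channels.
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(16 * 16 * 16, n_classes))

    def forward(self, x):
        return self.head(self.features(x))

model = SolitonClassifier()
batch = torch.randn(4, 1, 64, 64)   # stand-in for 64x64 absorption images
print(model(batch).shape)           # torch.Size([4, 3])
```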

3.
PRX Quantum; 2(2), 2021 Jun 01.
Article in English | MEDLINE | ID: mdl-36733712

ABSTRACT

Quantum dots (QDs) defined with electrostatic gates are a leading platform for a scalable quantum computing implementation. However, with increasing numbers of qubits, the complexity of the control parameter space also grows. Traditional measurement techniques, relying on complete or near-complete exploration via two-parameter scans (images) of the device response, quickly become impractical with increasing numbers of gates. Here we propose to circumvent this challenge by introducing a measurement technique relying on one-dimensional projections of the device response in the multidimensional parameter space. We use this machine learning approach, dubbed the "ray-based classification (RBC) framework," to implement a classifier for QD states, enabling automated recognition of qubit-relevant parameter regimes. We show that RBC surpasses the 82% accuracy benchmark set by the experimental implementation of image-based classification techniques in prior work, while reducing the number of measurement points needed by up to 70%. The reduction in measurement cost is a significant gain for time-intensive QD measurements and is a step toward the scalability of these devices. We also discuss how the RBC-based optimizer, which tunes the device to a multiqubit regime, performs when tuning in the two-dimensional and three-dimensional parameter spaces defined by the plunger and barrier gates that control the QDs. This work provides experimental validation of both efficient state identification and optimization with machine learning techniques for nontraditional measurements in quantum systems with high-dimensional parameter spaces and time-intensive measurements.
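To make the ray idea concrete, the sketch below builds a "fingerprint" for a point in gate-voltage space from the distance to the first response transition along each of several rays, which a classifier could then consume. This follows the spirit of the RBC construction only; the device response is replaced by a toy function, and the transition criterion is an assumption.

```python
import numpy as np

def ray_fingerprint(point, device_response, n_rays=6, max_len=1.0,
                    n_steps=50, thresh=0.5):
    # Normalized distance to the first response transition along each ray;
    # rays that see no transition within max_len keep the value 1.0.
    angles = np.linspace(0, 2 * np.pi, n_rays, endpoint=False)
    dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # 2D gate space
    fingerprint = np.ones(n_rays)
    base = device_response(point)
    for i, d in enumerate(dirs):
        for s in np.linspace(0, max_len, n_steps)[1:]:
            if abs(device_response(point + s * d) - base) > thresh:
                fingerprint[i] = s / max_len
                break
    return fingerprint

# Toy response with a single 'charge transition' line in gate space.
response = lambda v: float(v[0] + v[1] > 0.4)
print(ray_fingerprint(np.array([0.0, 0.0]), response))
```

Compared with a full two-parameter scan, only n_rays * n_steps points are measured at most, which is where the reduction in measurement cost comes from.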

4.
Phys Rev Appl; 13, 2020.
Article in English | MEDLINE | ID: mdl-33304939

ABSTRACT

The current practice of manually tuning quantum dots (QDs) for qubit operation is a time-consuming procedure that is inherently impractical to scale up for applications. In this work, we report on the in situ implementation of a recently proposed autotuning protocol that combines machine learning (ML) with an optimization routine to navigate the parameter space. In particular, we show that a ML algorithm trained exclusively on simulated data to quantitatively classify the state of a double-QD device can replace human heuristics in the tuning of gate voltages in real devices. We demonstrate active feedback on a functional double-dot device operated at millikelvin temperatures and discuss success rates as a function of the initial conditions and the device performance. Modifications to the training network, fitness function, and optimizer are discussed as a path toward further improvement in the success rate when starting both near and far detuned from the target double-dot range.
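A minimal sketch of such a classifier-in-the-loop optimization, assuming a Nelder-Mead optimizer and a fitness given by the distance between the classifier's state probabilities and the ideal double-dot vector. The classifier below is a stand-in, not the trained network from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def classifier_probs(voltages):
    # Stand-in for a CNN state classifier returning probabilities over
    # [no dot, left dot, right dot, double dot]; a real run would measure
    # a scan around `voltages` and feed it to the trained network.
    target = np.array([-0.4, -0.6])          # assumed double-dot region
    d = np.linalg.norm(voltages - target)
    p_dd = np.exp(-d ** 2)
    rest = (1 - p_dd) / 3
    return np.array([rest, rest, rest, p_dd])

def fitness(voltages):
    # Distance to the ideal double-dot probability vector [0, 0, 0, 1];
    # the optimizer drives this toward zero.
    return np.linalg.norm(classifier_probs(voltages) - np.array([0, 0, 0, 1.0]))

result = minimize(fitness, x0=np.array([-0.1, -0.1]), method="Nelder-Mead")
print(result.x)  # gate voltages the loop settles on
```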

5.
PLoS One; 13(10): e0205844, 2018.
Article in English | MEDLINE | ID: mdl-30332463

ABSTRACT

BACKGROUND: Over the past decade, machine learning techniques have revolutionized how research and science are done, from designing new materials and predicting their properties, to data mining and analysis, to assisting drug discovery, to advancing cybersecurity. Recently, we added to this list by showing how a machine learning algorithm (a so-called learner) combined with an optimization routine can assist experimental efforts to tune semiconductor quantum dot (QD) devices. Among other applications, semiconductor quantum dots are a candidate system for building quantum computers. To employ QDs, one needs to tune the devices into a configuration suitable for quantum computing. While current experiments adjust the control parameters heuristically, such an approach does not scale with the increasing size of the quantum dot arrays required for even near-term quantum computing demonstrations. Establishing a reliable protocol for tuning QD devices that does not rely on the gross-scale heuristics developed by experimentalists is thus of great importance.

MATERIALS AND METHODS: To implement the machine-learning-based approach, we constructed a dataset of simulated QD device characteristics, such as the conductance and the charge-sensor response versus the applied electrostatic gate voltages. The gate voltages are the experimental 'knobs' for tuning the device into useful regimes. Here, we describe the methodology for generating the dataset, as well as its validation in training convolutional neural networks.

RESULTS AND DISCUSSION: From 200 training sets sampled randomly from the full dataset, we show that the learner's accuracy in recognizing the state of a device is ≈96.5% when using either current-based or charge-sensor-based training. The spread in accuracy over our 200 training sets is 0.5% and 1.8% for current- and charge-sensor-based data, respectively. In addition, we introduce a tool that enables other researchers to use this approach for further research: QFlow lite, a Python-based mini-software suite that uses the dataset to train neural networks to recognize the state of a device and differentiate between states in experimental data. This work gives the definitive reference for the new dataset, enabling researchers to use it in their experiments or to develop new machine learning approaches and concepts.


Subject(s): Computational Biology/methods; Machine Learning; Quantum Dots; Algorithms; Automation; Computer Security; Computer Simulation; Data Mining; Databases, Factual; Drug Discovery; Neural Networks, Computer; Programming Languages; Reproducibility of Results; Semiconductors
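The accuracy-spread measurement this abstract reports, training on many randomly sampled training sets and quoting the mean and spread, can be reproduced in miniature with any classifier. The sketch below uses synthetic data and scikit-learn's MLPClassifier purely as stand-ins for the paper's convolutional networks and the QFlow lite dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for the simulated QD dataset: flattened sensor maps
# with labels in {0: no dot, 1: single dot, 2: double dot}.
X = rng.normal(size=(3000, 100))
y = rng.integers(0, 3, size=3000)
X += 0.5 * np.eye(3)[y].repeat(34, axis=1)[:, :100]  # weak class signal

accs = []
for seed in range(20):  # the paper uses 200 training sets; 20 keeps this quick
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, train_size=0.5, random_state=seed)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300,
                        random_state=seed)
    clf.fit(X_tr, y_tr)
    accs.append(clf.score(X_te, y_te))

print(f"mean accuracy {np.mean(accs):.3f}, spread {np.std(accs):.3f}")
```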
6.
Article in English | MEDLINE | ID: mdl-30984895

ABSTRACT

The lack of an engaging pedagogy and the highly competitive atmosphere in introductory science courses tend to discourage students from pursuing science, technology, engineering, and mathematics (STEM) majors. Once in a STEM field, academic and social integration has long been thought to be important for students' persistence, yet it is rarely investigated. In particular, the relative impact of in-class and out-of-class interactions remains an open issue. Here, we demonstrate that, surprisingly, for students whose grades fall in the "middle of the pack," the out-of-class network is the most significant predictor of persistence. To do so, we use logistic regression combined with Akaike's information criterion to assess in- and out-of-class networks, grades, and other factors. For students with grades at the very top (and bottom), final grade, unsurprisingly, is the best predictor of persistence: these students are likely already committed (or simply restricted from continuing), so they persist (or drop out). For intermediate grades, though, only out-of-class closeness, a measure of one's immersion in the network, helps predict persistence. This does not negate the need for in-class ties. However, it suggests that, in this cohort, only students who move past the convenient in-class interactions and form strong bonds outside of class are, or become, committed to their studies. Since many students are lost through attrition, our results suggest practical routes for increasing students' persistence in STEM majors.
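A minimal sketch of the model comparison this abstract describes, using statsmodels to fit logistic regressions and rank them by Akaike's information criterion. The data below are synthetic stand-ins for the study's grade, closeness, and persistence variables.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300

# Synthetic stand-ins: final grade, out-of-class closeness centrality,
# and a binary persistence outcome generated from both.
grade = rng.normal(size=n)
closeness = rng.normal(size=n)
p = 1 / (1 + np.exp(-(0.3 * grade + 1.2 * closeness)))
persist = rng.binomial(1, p)

# Fit candidate models and compare by AIC; lower AIC indicates a better
# trade-off between goodness of fit and model complexity.
candidates = [("grade only", [grade]),
              ("closeness only", [closeness]),
              ("grade + closeness", [grade, closeness])]
for name, cols in candidates:
    X = sm.add_constant(np.column_stack(cols))
    res = sm.Logit(persist, X).fit(disp=0)
    print(f"{name}: AIC = {res.aic:.1f}")
```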
