1.
Neural Netw ; 161: 449-465, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36805261

ABSTRACT

This paper takes a parallel learning approach in continual learning scenarios. We define parallel continual learning as learning a sequence of tasks where the data for the previous tasks, whose distribution may have shifted over time, are also available while learning new tasks. We propose a parallel continual learning method by assigning subnetworks to each task and simultaneously training only the assigned subnetworks on their corresponding tasks. In doing so, some parts of the network will be shared across multiple tasks. This is unlike the existing literature in continual learning, which aims at learning incoming tasks sequentially, under the assumption that the data for the previous tasks have a fixed distribution. Our proposed method offers promise in: (1) Transparency in the network and in the relationships across tasks, by enabling examination of the representations learned by independent and shared subnetworks; (2) Representation generalizability, through sharing and training subnetworks on multiple tasks simultaneously. Our analysis shows that compared to many competing approaches such as continual learning, neural architecture search, and multi-task learning, parallel continual learning is capable of learning more generalizable representations. Also, (3) Parallel continual learning overcomes the common issue of catastrophic forgetting in continual learning algorithms. This is the first effort to train a neural network on multiple tasks and input domains simultaneously in a continual learning scenario. Our code is available at https://github.com/yours-anonym/PaRT.
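The core idea above — assigning each task a subnetwork over a shared parameter pool and updating all assigned subnetworks in the same training iteration, so overlapping parameters receive gradients from multiple tasks — can be illustrated with a toy sketch. This is not the PaRT implementation (see the linked repository for that); the masks, the per-task gradient, and all names here are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of parallel continual learning on a shared parameter
# vector. Each task owns a boolean mask (its subnetwork); masks may overlap,
# and the overlapping parameters are shared across tasks.
n_params = 10
theta = np.zeros(n_params)

# Task 0 is assigned params 0-5, task 1 params 4-9; params 4-5 are shared.
masks = {
    0: np.arange(n_params) < 6,
    1: np.arange(n_params) >= 4,
}

def task_grad(task, theta):
    """Toy per-task gradient: pull parameters toward a task-specific target."""
    target = {0: 1.0, 1: -1.0}[task]
    return theta - target

# Parallel step: every task updates its own subnetwork in the SAME iteration,
# so shared parameters accumulate gradients from all tasks that use them.
lr = 0.1
for step in range(200):
    update = np.zeros(n_params)
    for task, mask in masks.items():
        g = task_grad(task, theta)
        update[mask] += g[mask]
    theta -= lr * update

# Task-exclusive params converge to their task's target (+1 or -1), while the
# shared params settle at a compromise between the two targets (here, 0).
print(theta.round(2))
```

Running the sketch shows the qualitative behavior the abstract describes: exclusive parameters specialize per task, while shared parameters are shaped by both tasks at once rather than being overwritten sequentially.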


Subject(s)
Algorithms , Neural Networks, Computer
2.
Biophys J ; 120(19): 4264-4276, 2021 10 05.
Article in English | MEDLINE | ID: mdl-34087212

ABSTRACT

Many species show a diverse range of sizes; for example, domestic dogs have large variation in body mass. Yet the internal structure of the organism remains similar, i.e., the system scales to organism size. Drosophila melanogaster has been a powerful model system for exploring scaling mechanisms. In the early embryo, gene expression boundaries scale very precisely to embryo length. Later in development, the adult wings grow with remarkable symmetry and scale well with animal size. Yet whether internal organs initially scale to embryo size remains largely unknown. Here, we utilize artificially small Drosophila embryos to explore how three critical internal organs, the heart, hindgut, and ventral nerve cord (VNC), adapt to changes in embryo morphology. We find that the heart scales precisely with embryo length. Intriguingly, reduction in cardiac cell length, rather than number, appears to be important in controlling heart length. The hindgut, which is the first chiral organ to form, displays scaling with embryo size under large-scale changes in the artificially smaller embryos but shows few hallmarks of scaling within wild-type size variation. Finally, the VNC displays only weak scaling behavior; even large changes in embryo geometry result in only small shifts in VNC length. This suggests that the VNC may have an intrinsic minimal length that is largely independent of embryo length. Overall, our work shows that internal organs can adapt to embryo size changes in Drosophila, but the extent to which they scale varies significantly between organs.


Subject(s)
Drosophila Proteins , Drosophila , Animals , Body Patterning , Dogs , Drosophila melanogaster , Embryonic Development