1.
Phys Rev Lett ; 132(6): 063801, 2024 Feb 09.
Article in English | MEDLINE | ID: mdl-38394585

ABSTRACT

Structured light offers wider bandwidths and higher security for communication. However, propagation through complex random media, such as the Earth's atmosphere, typically induces intermodal crosstalk. We show numerically and experimentally that, even in the regime of pronounced intensity fluctuations, the coupling of photonic orbital angular momentum modes is governed by a universal function of a single parameter: the ratio between the random medium's and the beam's transverse correlation lengths.
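
A minimal numerical sketch of the claimed scaling, not the authors' code: it sends a Laguerre-Gauss OAM mode through a single Gaussian-correlated random phase screen and measures how much power stays in the launched mode as the ratio r0/w0 of the screen's to the beam's transverse correlation lengths varies. The grid size, screen model, and phase strength are illustrative assumptions.

```python
import numpy as np

# Grid and beam (all sizes are illustrative).
N, L = 256, 8.0                        # grid points, physical half-width
x = np.linspace(-L, L, N)
X, Y = np.meshgrid(x, x)
R, PHI = np.hypot(X, Y), np.arctan2(Y, X)
w0 = 1.0                               # beam waist ~ beam's transverse correlation length

def lg_mode(ell):
    """Laguerre-Gauss mode (p = 0, charge ell), normalized on the grid."""
    psi = (R / w0) ** abs(ell) * np.exp(-R**2 / w0**2) * np.exp(1j * ell * PHI)
    return psi / np.sqrt(np.sum(np.abs(psi) ** 2))

rng = np.random.default_rng(0)

def phase_screen(r0, rms_rad=2.0):
    """Random phase screen with Gaussian transverse correlation length r0."""
    white = rng.normal(size=(N, N))
    k = np.fft.fftfreq(N, d=2 * L / N)
    KX, KY = np.meshgrid(k, k)
    smooth = np.fft.ifft2(np.fft.fft2(white)
                          * np.exp(-(KX**2 + KY**2) * (np.pi * r0) ** 2)).real
    return np.exp(1j * rms_rad * smooth / smooth.std())

modes = {ell: lg_mode(ell) for ell in range(-3, 4)}
for ratio in (0.25, 1.0, 4.0):         # r0 / w0: screen vs beam correlation length
    out = modes[1] * phase_screen(ratio * w0)   # launch ell = +1 through one screen
    power = {ell: abs(np.vdot(m, out)) ** 2 for ell, m in modes.items()}
    kept = power[1] / sum(power.values())
    print(f"r0/w0 = {ratio:4.2f}: power fraction remaining in ell = +1: {kept:.2f}")
```

Qualitatively, screens that vary on scales much smaller than the beam (small r0/w0) scatter power into neighbouring modes, while smooth screens leave the launched mode largely intact.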

2.
Phys Rev Lett ; 130(7): 073801, 2023 Feb 17.
Article in English | MEDLINE | ID: mdl-36867810

ABSTRACT

We show that instantaneous spatial singular modes of light in a dynamically evolving, turbulent atmosphere offer significantly improved high-fidelity signal transmission compared with standard encoding bases corrected by adaptive optics. Their enhanced stability in stronger turbulence is associated with a subdiffusive algebraic decay of the transmitted power with evolution time.
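
A toy illustration of the idea, under assumptions not taken from the paper (a random Gaussian transmission matrix with a simple drift model): the instantaneous singular modes are the right-singular vectors of the medium's transmission matrix, and launching the leading one typically delivers more transmitted power than a fixed-basis input even as the medium slowly evolves.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64                                   # number of spatial modes (assumed)

def random_T():
    """Random complex Gaussian transmission matrix, entry variance 1/n."""
    return (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2 * n)

T = random_T()
U, s, Vh = np.linalg.svd(T)
singular_in = Vh[0].conj()               # leading right-singular vector at t = 0
fixed_in = np.eye(n)[0]                  # a standard fixed-basis input mode

for step in range(5):
    # Let the medium drift: small random perturbation per time step (assumption).
    T = 0.98 * T + 0.02 * random_T()
    p_sing = np.linalg.norm(T @ singular_in) ** 2
    p_fixed = np.linalg.norm(T @ fixed_in) ** 2
    print(f"t = {step}: power(singular mode) = {p_sing:.3f}, "
          f"power(fixed mode) = {p_fixed:.3f}")
```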

3.
Animals (Basel) ; 11(1)2021 Jan 11.
Article in English | MEDLINE | ID: mdl-33440704

ABSTRACT

Welfare in animal husbandry includes considerations of biology, ethics, ecology, law and economics. These diverse aspects must be translated into common quantifiable parameters and applicable methods to objectively assess welfare in animals. To assist this process in the field of aquaculture, where such methods are largely missing, we developed a model to assess fish welfare. A network of information was created to link needs, i.e., fundamental requirements for welfare, with parameters, i.e., quantifiable aspects of welfare. From this ontology, 80 parameters that are relevant for welfare, have practicable assessment methods, and deliver reliable results were selected and incorporated into a model. The model, named MyFishCheck, evaluates welfare in five distinct modules: farm management, water quality, fish group behaviour, fish external appearance, and fish internal appearance, yielding five individual grades that categorise welfare as critical, poor, acceptable, or good. To facilitate the use of the model, a software application was written. With its adaptability to different fish species, farming systems, regulations and purposes, as well as its user-friendly digital version, MyFishCheck is a next step towards improved fish welfare assessment and provides a basis for ongoing positive developments for the industry, the farmers and the fish.
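
A hypothetical sketch of the modular grading structure described above; all parameter names, scores, and thresholds are invented for illustration and are not MyFishCheck's actual values.

```python
from statistics import mean

# One score in [0, 1] per parameter, grouped by module (all values made up).
MODULES = {
    "farm management":      {"record keeping": 0.9, "stocking density": 0.6},
    "water quality":        {"dissolved oxygen": 0.8, "ammonia": 0.4},
    "fish group behaviour": {"feeding response": 0.7, "swimming pattern": 0.8},
    "external appearance":  {"fin condition": 0.5, "skin lesions": 0.9},
    "internal appearance":  {"liver condition": 0.6, "gill condition": 0.7},
}

def grade(score):
    """Map a module score in [0, 1] to one of the four welfare grades."""
    if score < 0.4:
        return "critical"
    if score < 0.6:
        return "poor"
    if score < 0.8:
        return "acceptable"
    return "good"

# Each module yields its own grade, as the abstract describes.
for module, params in MODULES.items():
    s = mean(params.values())
    print(f"{module:22s} score = {s:.2f} -> {grade(s)}")
```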

4.
J Eye Mov Res ; 11(6)2018 Dec 10.
Article in English | MEDLINE | ID: mdl-33828716

ABSTRACT

For an in-depth, AOI-based analysis of mobile eye tracking data, a preceding gaze assignment step is inevitable. Current solutions such as manual gaze mapping or marker-based approaches are tedious and not suitable for studies in which tangible objects are manipulated. This makes mobile eye tracking studies with several hours of recording difficult to analyse quantitatively. We introduce a new machine-learning-based algorithm, computational Gaze-Object Mapping (cGOM), which automatically maps gaze data onto the respective AOIs. cGOM extends state-of-the-art object detection and segmentation by Mask R-CNN with a gaze mapping feature. The new algorithm's performance is validated against a manual fixation-by-fixation mapping, which is considered ground truth, in terms of true positive rate (TPR), true negative rate (TNR) and efficiency. Using only 72 training images with 264 labelled object representations, cGOM reaches a TPR of approx. 80% and a TNR of 85% compared to the manual mapping. The break-even point is reached at 2 hours of eye tracking recording for the total procedure, or 1 hour when considering human working time only. Since the mapping process runs in real time after training is completed, even hours of eye tracking recording can be evaluated efficiently. (Code and video examples have been made available at: https://gitlab.ethz.ch/pdz/cgom.git).
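
A minimal sketch of the mapping step that cGOM automates, using synthetic masks and fixations (the authors' actual implementation is in the linked repository): once a segmentation network such as Mask R-CNN has produced one boolean mask per detected object, each fixation is assigned to the AOI whose mask contains the gaze point.

```python
import numpy as np

H, W = 480, 640                            # frame size (assumed)

# One boolean mask per detected AOI; regions here are synthetic stand-ins
# for Mask R-CNN output.
masks = {
    "cup":   np.zeros((H, W), dtype=bool),
    "phone": np.zeros((H, W), dtype=bool),
}
masks["cup"][100:200, 100:220] = True
masks["phone"][300:420, 400:560] = True

def map_fixation(x, y):
    """Return the AOI whose mask contains pixel (x, y), else 'background'."""
    if not (0 <= y < H and 0 <= x < W):
        return "off-screen"
    for label, mask in masks.items():
        if mask[y, x]:                     # masks are indexed row (y) first
            return label
    return "background"

fixations = [(150, 160), (500, 350), (10, 10)]   # (x, y) gaze points
for x, y in fixations:
    print(f"fixation at ({x}, {y}) -> {map_fixation(x, y)}")
```

Comparing such automatic assignments with a manual fixation-by-fixation mapping is what yields the TPR and TNR figures reported in the abstract.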
