Results 1 - 9 of 9
1.
Transl Anim Sci ; 6(3): txac082, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35875422

ABSTRACT

Animal behavior is indicative of health status, and changes in behavior can indicate health issues (i.e., illness, stress, or injury). Currently, human observation (HO) is the only method for detecting behavior changes that may indicate problems in group-housed pigs. While HO is effective, it has limitations: it is time consuming, it disrupts natural behaviors, and it cannot be maintained continuously. To address these limitations, a computer vision platform (NUtrack) was developed to identify (ID) and continuously monitor specific behaviors of group-housed pigs on an individual basis. The objectives of this study were to evaluate the capabilities of the NUtrack system and to evaluate changes in behavior patterns over time in group-housed nursery pigs. The NUtrack system was installed above four nursery pens to monitor the behavior of 28 newly weaned pigs during a 42-d nursery period. Pigs were stratified by sex and litter, then randomly assigned to one of two pens (14 pigs/pen) for the first 22 d. On day 23, pigs were split into four pens (7 pigs/pen). To evaluate the NUtrack system's capabilities, 800 video frames containing 11,200 individual observations were randomly selected across the nursery period. Each frame was visually evaluated to verify the NUtrack system's accuracy for ID and classification of behavior. The NUtrack system achieved an overall ID accuracy of 95.6%. ID accuracy was 93.5% during the first 22 d and increased (P < 0.001) to 98.2% for the final 20 d. Of the ID errors, 72.2% were due to mislabeled ID and 27.8% were due to loss of ID. The NUtrack system classified lying, standing, walking, at the feeder (ATF), and at the waterer (ATW) behaviors accurately at rates of 98.7%, 89.7%, 88.5%, 95.6%, and 79.9%, respectively. Behavior data indicated that the time budget for lying, standing, and walking in nursery pigs was 77.7% ± 1.6%, 8.5% ± 1.1%, and 2.9% ± 0.4%, respectively.
In addition, behavior data indicated that nursery pigs spent 9.9% ± 1.7% and 1.0% ± 0.3% of their time ATF and ATW, respectively. Results suggest that the NUtrack system can detect, identify, maintain ID, and classify specific behavior of group-housed nursery pigs for the duration of the 42-d nursery period. Overall, results suggest that, with continued research, the NUtrack system may provide a viable real-time precision livestock tool with the ability to assist producers in monitoring behaviors and potential changes in the behavior of group-housed pigs.
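Time budgets like those reported above follow directly once each video frame carries a per-pig behavior label. A minimal sketch of that aggregation step, using hypothetical labels (this is an illustration, not the NUtrack implementation):

```python
from collections import Counter

def time_budget(frame_labels):
    """Given a sequence of per-frame behavior labels for one pig,
    return the fraction of frames spent in each behavior."""
    counts = Counter(frame_labels)
    total = len(frame_labels)
    return {behavior: n / total for behavior, n in counts.items()}

# Toy example: 10 labeled frames for a single pig.
labels = ["lying"] * 7 + ["standing"] + ["walking"] + ["at_feeder"]
budget = time_budget(labels)
print(budget["lying"])  # 0.7
```

Over a full 42-d period the same computation simply runs over millions of frames per pig, optionally windowed by day to expose changes in the budget over time.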

2.
Sensors (Basel) ; 19(4), 2019 Feb 19.
Article in English | MEDLINE | ID: mdl-30791377

ABSTRACT

Computer vision systems have the potential to provide automated, non-invasive monitoring of livestock animals; however, the lack of public datasets with well-defined targets and evaluation metrics presents a significant challenge for researchers. Consequently, existing solutions often focus on achieving task-specific objectives using relatively small, private datasets. This work introduces a new dataset and method for instance-level detection of multiple pigs in group-housed environments. The method uses a single fully-convolutional neural network to detect the location and orientation of each animal, where both body part locations and pairwise associations are represented in the image space. Accompanying this method is a new dataset containing 2000 annotated images with 24,842 individually annotated pigs from 17 different locations. The proposed method achieves over 99% precision and over 96% recall when detecting pigs in environments previously seen by the network during training. To evaluate the robustness of the trained network, it is also tested on environments and lighting conditions unseen in the training set, where it achieves 91% precision and 67% recall. The dataset is publicly available for download.
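The precision and recall figures quoted above come from matching predicted detections against the annotations and counting hits, spurious boxes, and misses. A minimal sketch of that computation (the counts are invented for illustration):

```python
def precision_recall(true_positives, false_positives, false_negatives):
    """Compute detection precision and recall from raw match counts.

    precision = TP / (TP + FP): fraction of detections that are real pigs.
    recall    = TP / (TP + FN): fraction of real pigs that were detected.
    """
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# Toy counts: 96 correct detections, 1 spurious box, 4 missed pigs.
p, r = precision_recall(96, 1, 4)
print(round(p, 3), round(r, 3))  # 0.99 0.96
```

The held-out-environment numbers (91% precision, 67% recall) are produced the same way, just over images from locations absent from the training set.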

3.
Article in English | MEDLINE | ID: mdl-25570416

ABSTRACT

As a first step toward building a smart home behavioral monitoring system capable of classifying a wide variety of human behavior, a wireless sensor network (WSN) system is presented for localization based on received signal strength indicator (RSSI) measurements. The low-cost, non-intrusive system uses a smart watch worn by the user to broadcast data to the WSN, where the strength of the radio signal is evaluated at each WSN node to localize the user. A method is presented that uses simultaneous localization and mapping (SLAM) for system calibration, providing automated fingerprinting that associates radio signal strength patterns with the user's location within the living space. To improve the accuracy of localization, a novel refinement technique is introduced that takes into account typical movement patterns of people within their homes. Experimental results demonstrate that the system is capable of providing accurate localization results in a typical living space.
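Once a fingerprint map exists (calibration point → RSSI pattern across the WSN nodes), the core localization query reduces to finding the stored pattern nearest to the observed one. A minimal nearest-fingerprint sketch with invented locations and signal strengths (the paper's SLAM-based calibration and movement-pattern refinement are not reproduced here):

```python
import math

def localize(fingerprints, observed):
    """Return the calibration point whose stored RSSI pattern is
    closest (Euclidean distance) to the observed per-node strengths."""
    best_point, best_dist = None, math.inf
    for point, pattern in fingerprints.items():
        d = math.dist(pattern, observed)
        if d < best_dist:
            best_point, best_dist = point, d
    return best_point

# Hypothetical fingerprint map: location -> RSSI (dBm) at 3 WSN nodes.
fp = {
    "kitchen": (-40, -70, -80),
    "bedroom": (-75, -45, -60),
    "living":  (-60, -55, -50),
}
print(localize(fp, (-42, -68, -79)))  # kitchen
```

A refinement stage like the one described would then re-weight candidates using a motion model, e.g. penalizing jumps between non-adjacent rooms.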


Subjects
Behavior , Monitoring, Ambulatory/methods , Wireless Technology , Activities of Daily Living , Algorithms , Calibration , Electronic Data Processing , Humans , Internet , Lasers , Movement , Probability , Signal Processing, Computer-Assisted
4.
Biomed Sci Instrum ; 49: 243-50, 2013.
Article in English | MEDLINE | ID: mdl-23686206

ABSTRACT

Automating documentation of physical activity data (e.g., duration and speed of walking or propelling a wheelchair) into the electronic medical record (EMR) offers promise for improving efficiency of documentation and understanding of best practices in the rehabilitation and home health settings. Commercially available devices which could be used to automate documentation of physical activities are either cumbersome to wear or lack the specificity required to differentiate activities. We have designed a novel system to differentiate and quantify physical activities, using inexpensive accelerometer-based biomechanical sensing and wireless sensor networks, a technology combination that has not been used in a rehabilitation setting to date. As a first step, a feasibility study was performed where 14 healthy young adults (mean age = 22.6 ± 2.5 years, mean height = 173 ± 10.0 cm, mean mass = 70.7 ± 11.3 kg) carried out eight different activities while wearing a biaxial accelerometer sensor. Activities were performed at each participant’s self-selected pace during a single testing session in a controlled environment. Linear discriminant analysis was performed on spectral parameters extracted from the subjects’ accelerometer signals. It is shown that physical activity classification alone results in an average accuracy of 49.5%, but when combined with rule-based constraints using a wireless sensor network with localization capabilities in an in silico simulated room, accuracy improves to 99.3%. When fully implemented, our technology package is expected to improve goal setting, treatment interventions and patient outcomes by enhancing clinicians’ understanding of patients’ physical performance within a day and across the rehabilitation program.
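The pipeline above — spectral features from an accelerometer trace, then a discriminative classifier — can be sketched compactly. Here a nearest-centroid rule stands in for the paper's linear discriminant analysis, and the "walking" and "resting" traces are toy signals, not study data:

```python
import cmath
import math

def spectral_features(signal, k=3):
    """Magnitudes of the first k non-DC DFT bins of an accelerometer trace."""
    n = len(signal)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * f * t / n)
                    for t, x in enumerate(signal)))
            for f in range(1, k + 1)]

def classify(features, centroids):
    """Nearest-centroid assignment in feature space (a simplified
    stand-in for linear discriminant analysis)."""
    return min(centroids,
               key=lambda c: sum((a - b) ** 2
                                 for a, b in zip(features, centroids[c])))

# Toy traces: "walking" is periodic, "resting" is a constant offset.
walking = [math.sin(2 * math.pi * t / 8) for t in range(16)]
resting = [1.0] * 16
centroids = {"walking": spectral_features(walking),
             "resting": spectral_features(resting)}

probe = [0.9 * v for v in walking]  # a slightly attenuated walking trace
print(classify(spectral_features(probe), centroids))  # walking
```

The accuracy jump reported (49.5% → 99.3%) comes from layering room-level localization on top of this: rule-based constraints simply veto activity labels implausible for the wearer's current location.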

5.
Stud Health Technol Inform ; 184: 235-41, 2013.
Article in English | MEDLINE | ID: mdl-23400163

ABSTRACT

The availability of digital stereoscopic video feedback on surgical robotic platforms allows for a variety of enhancements through the application of computer vision. Several of these enhancements, such as augmented reality and semi-automated surgery, benefit significantly from identification of the robotic manipulators within the field of view. A method is presented for the extraction of robotic manipulators from stereoscopic views of the operating field that uses a combination of marker tracking, inverse kinematics, and computer rendering. This method is shown to accurately identify the locations of the manipulators within the views. It is further demonstrated that this method can be used to enhance 3D reconstruction of the operating field and produce augmented views.


Subjects
Connective Tissue/surgery , Imaging, Three-Dimensional/methods , Plastic Surgery Procedures/instrumentation , Robotics/methods , Surgery, Computer-Assisted/methods , User-Computer Interface , Humans
6.
Surg Endosc ; 26(12): 3413-7, 2012 Dec.
Article in English | MEDLINE | ID: mdl-22648119

ABSTRACT

BACKGROUND: Accurate real-time 3D models of the operating field have the potential to enable augmented reality for endoscopic surgery. A new system is proposed to create real-time 3D models of the operating field that uses a custom miniaturized stereoscopic video camera attached to a laparoscope and an image-based reconstruction algorithm implemented on a graphics processing unit (GPU). METHODS: The proposed system was evaluated in a porcine model that approximates the viewing conditions of in vivo surgery. To assess the quality of the models, a synthetic view of the operating field was produced by overlaying a color image on the reconstructed 3D model, and an image rendered from the 3D model was compared with a 2D image captured from the same view. RESULTS: Experiments conducted with an object of known geometry demonstrate that the system produces 3D models accurate to within 1.5 mm. CONCLUSIONS: The ability to produce accurate real-time 3D models of the operating field is a significant advancement toward augmented reality in minimally invasive surgery. An imaging system with this capability will potentially transform surgery by helping novice and expert surgeons alike to delineate variance in internal anatomy accurately.
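The geometric core of image-based stereo reconstruction is converting pixel disparity between the two views into metric depth. A minimal pinhole-stereo sketch with invented rig parameters (not the values of the miniaturized camera described above):

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Pinhole stereo: depth Z = f * B / d, where f is the focal length
    in pixels, B the camera baseline, and d the horizontal disparity."""
    return focal_px * baseline_mm / disparity_px

# Hypothetical rig: 700 px focal length, 5 mm stereo baseline.
z = depth_from_disparity(700, 5.0, 35.0)
print(z)  # 100.0 (mm)
```

A GPU implementation like the one described evaluates this per pixel after matching, producing the dense 3D model onto which the color image is overlaid.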


Subjects
Imaging, Three-Dimensional , Laparoscopy/methods , Animals , Computer Systems , Swine
7.
Stud Health Technol Inform ; 173: 92-6, 2012.
Article in English | MEDLINE | ID: mdl-22356964

ABSTRACT

Robotic surgical platforms require vision feedback systems, which often consist of low-resolution, expensive, single-imager analog cameras. These systems are retooled for 3D display by simply doubling the cameras and outboard control units. Here, a fully-integrated digital stereoscopic video camera employing high-definition sensors and a class-compliant USB video interface is presented. This system can be used with low-cost PC hardware and consumer-level 3D displays for tele-medical surgical applications including military medical support, disaster relief, and space exploration.


Subjects
Depth Perception , Imaging, Three-Dimensional/methods , Robotics , Video-Assisted Surgery/economics , Video-Assisted Surgery/instrumentation , Minimally Invasive Surgical Procedures
8.
Article in English | MEDLINE | ID: mdl-23366406

ABSTRACT

Inexpensive, high-throughput, low maintenance systems for precise temporal and spatial measurement of mouse home cage behavior (including movement, feeding, and drinking) are required to evaluate products from large scale pharmaceutical design and genetic lesion programs. These measurements are also required to interpret results from more focused behavioral assays. We describe the design and validation of a highly-scalable, reliable mouse home cage behavioral monitoring system modeled on a previously described, one-of-a-kind system. Mouse position was determined by solving static equilibrium equations describing the force and torques acting on the system strain gauges; feeding events were detected by a photobeam across the food hopper, and drinking events were detected by a capacitive lick sensor. Validation studies show excellent agreement between mouse position and drinking events measured by the system compared with video-based observation, a gold standard in neuroscience.
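For a point load on a rigid platform supported at known gauge locations, the static force/torque balance reduces to a force-weighted average of the gauge coordinates. A minimal sketch with invented cage dimensions and readings (an illustration of the principle, not the published system's solver):

```python
def mouse_position(gauge_forces, gauge_xy):
    """Solve the planar static equilibrium for a point load: summing
    torques about each axis gives the load position as the
    force-weighted average of the gauge coordinates."""
    total = sum(gauge_forces)
    x = sum(f * gx for f, (gx, gy) in zip(gauge_forces, gauge_xy)) / total
    y = sum(f * gy for f, (gx, gy) in zip(gauge_forces, gauge_xy)) / total
    return x, y

# Hypothetical square cage floor, one gauge per corner (cm, grams-force).
corners = [(0, 0), (30, 0), (30, 30), (0, 30)]
print(mouse_position([10, 10, 10, 10], corners))  # (15.0, 15.0) -> center
```

Equal readings at all four corners place the mouse at the cage center; an excess at one corner pulls the estimate toward that corner.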


Subjects
Actigraphy/instrumentation , Behavior, Animal/physiology , Ecosystem , Housing, Animal , Monitoring, Ambulatory/instrumentation , Photometry/instrumentation , Signal Processing, Computer-Assisted/instrumentation , Animals , Equipment Design , Equipment Failure Analysis , Mice
9.
Stud Health Technol Inform ; 163: 454-60, 2011.
Article in English | MEDLINE | ID: mdl-21335838

ABSTRACT

Motor-based tracking and image-based tracking are considered for three-dimensional in vivo tracking of the arms of a surgical robot during minimally invasive surgery. Accurate tracking is necessary for tele-medical applications and for the future automation of surgical procedures. An experiment is performed to compare the accuracy of the two methods, and results show that the positioning error of image-based tracking is significantly less than that of motor-based tracking.
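Comparing the two tracking methods comes down to measuring positioning error against ground truth, commonly as a root-mean-square error over a set of 3D points. A minimal sketch with invented coordinates (not the experiment's data):

```python
import math

def rmse(estimates, ground_truth):
    """Root-mean-square positioning error over paired 3D points."""
    return math.sqrt(sum(
        sum((e - g) ** 2 for e, g in zip(est, gt))
        for est, gt in zip(estimates, ground_truth)) / len(estimates))

# Hypothetical tracked arm-tip positions vs. ground truth (mm).
truth       = [(0, 0, 0), (10, 0, 0)]
image_based = [(0.5, 0, 0), (10.5, 0, 0)]
motor_based = [(2, 0, 0), (8, 0, 0)]
print(rmse(image_based, truth) < rmse(motor_based, truth))  # True
```

The conclusion quoted above corresponds to the image-based RMSE being significantly smaller, here 0.5 mm versus 2.0 mm in the toy numbers.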


Subjects
Image Interpretation, Computer-Assisted/methods , Imaging, Three-Dimensional/methods , Pattern Recognition, Automated/methods , Photogrammetry/methods , Robotics/methods , Surgery, Computer-Assisted/methods , User-Computer Interface