Results 1 - 11 of 11
1.
Int J Pharm ; 641: 123061, 2023 Jun 25.
Article in English | MEDLINE | ID: mdl-37211237

ABSTRACT

Biorelevant dissolution tests of oral solid dosage forms open the gate to valid in vitro-in vivo predictions (IVIVP). A recently developed apparatus, PhysioCell, allows mimicking the fluid flow and pressure waves occurring in the human fasted stomach. In this work, we used the PhysioCell to perform IVIVP for vortioxetine immediate-release (IR) tablets: the originator (Brintellix) and generic product candidates (VORTIO). The dissolved drug was monitored in the gastric (StressCell) and intestinal (Collection Vessel) compartments that contained biorelevant media. Simulated intermittent gastric stress at 15 min and "housekeeping wave" at 30 min increased the dissolution of Brintellix formulations only. A mechanistic model that best described the observations involved the first-order tablet disintegration with a stress-induced enhancement for Brintellix, dissolution of solid particles in the StressCell, and drug transfer to the Collection Vessel. Then, a semi-mechanistic pharmacokinetic model with dissolution parameters as inputs simulated vortioxetine plasma concentrations in healthy volunteers after single and multiple dosing of Brintellix. Despite different dissolution characteristics, VORTIO provided similar concentration profiles to the originator. In conclusion, PhysioCell dissolution tests, combined with semi-mechanistic IVIVP, can be successfully used to develop IR dosage forms exhibiting gastric stress-related effects.


Subject(s)
Chemistry, Pharmaceutical , Humans , Solubility , Vortioxetine , Workflow , Administration, Oral , Tablets , Drug Liberation
2.
Comput Methods Programs Biomed ; 182: 105052, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31476448

ABSTRACT

BACKGROUND AND OBJECTIVE: People suffer from sleep disorders caused by work-related stress, irregular lifestyles or mental health problems. Therefore, the development of effective tools to diagnose sleep disorders is important. Recently, Information Theory has been exploited to analyze biomedical signals. We propose an efficient method of classifying sleep anomalies by applying entropy-estimating algorithms to encoded ECG signals from patients suffering from Sleep-Related Breathing Disorders (SRBD). METHODS: First, the ECGs were discretized using an encoding method that captures the biosignals' variability; it takes into account oscillations of the ECG measurements around the signal averages. Next, the Lempel-Ziv complexity algorithm (LZ), which measures the rate at which new patterns are generated, was applied to estimate the entropy of the encoded signals. Then, the optimal encoding parameters, which allow normal and abnormal events during sleep to be distinguished with high sensitivity and specificity, were determined numerically. Simultaneously, the subjects' states were identified using the acoustic signal of breathing recorded during the same sleep period. RESULTS: Random sequences show a normalized LZ close to 1, while for more regular sequences it is closer to 0. Our calculations show that SRBD patients have a normalized LZ of around 0.32 (on average), while the control group has a complexity of around 0.85. The results obtained on a public database are similar, i.e. an LZ of around 0.48 for SRBDs and 0.7 for the control group. These results show that signals within the control group are more random, whereas for the SRBD group the ECGs are more deterministic. This finding remained valid both for signals acquired over the whole duration of the experiment and when shorter time intervals were considered. The proposed classifier provided sleep disorder diagnostics with a sensitivity of 93.75% and a specificity of 73.00%. To validate our method, we also considered different variants of the training and testing sets.
In all cases, the optimal encoding parameter, sensitivity and specificity values were similar to the results above. CONCLUSIONS: Our pilot study suggests that an LZ-based algorithm could be used as a clinical tool to classify sleep disorders, since the LZ complexities for SRBD-positive versus healthy individuals show a significant difference. Moreover, normalized LZ complexity changes are related to the snoring level. This study also indicates that the LZ technique is able to detect sleep abnormalities at an early disorder stage.
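As an illustration of the normalized Lempel-Ziv complexity used above, the following is a minimal sketch of the classic LZ76 phrase-counting scheme, not the authors' ECG pipeline; the step of encoding raw ECG measurements into binary symbols is assumed to have been done already:

```python
import math
import random

def lz76_complexity(s):
    """Number of distinct phrases in the LZ76 parsing of the sequence s."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow the current phrase while it already occurs in the preceding text
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def normalized_lz(s):
    """Normalized LZ complexity: near 1 for random binary sequences, near 0 for regular ones."""
    n = len(s)
    return lz76_complexity(s) * math.log2(n) / n

random.seed(1)
noisy = ''.join(random.choice('01') for _ in range(2000))
print(normalized_lz('01' * 1000))   # strictly periodic: close to 0
print(normalized_lz(noisy))         # random: close to 1
```

The contrast between the two printed values mirrors the deterministic-versus-random separation reported for the SRBD and control groups.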


Subject(s)
Electrocardiography/methods , Sleep Wake Disorders/physiopathology , Algorithms , Humans , Signal Processing, Computer-Assisted , Sleep Wake Disorders/classification
3.
Biol Cybern ; 113(4): 453-464, 2019 08.
Article in English | MEDLINE | ID: mdl-31243531

ABSTRACT

To understand how anatomy and physiology allow an organism to perform its function, it is important to know how information transmitted by spikes in the brain is received and encoded. A natural question is whether the spike rate alone encodes the information about a stimulus (rate code), or whether additional information is contained in the temporal pattern of the spikes (temporal code). Here we address this question using data from the cat Lateral Geniculate Nucleus (LGN), the visual portion of the thalamus through which visual information from the retina is communicated to the visual cortex. We analyzed the responses of LGN neurons to spatially homogeneous spots of various sizes with temporally random luminance modulation. We compared the Firing Rate with the Shannon Information Transmission Rate, which quantifies the information contained in the temporal relationships between spikes. We found that the behavior of these two rates can differ quantitatively. This suggests that the energy used for spiking does not translate directly into the amount of information transmitted. We also compared Firing Rates with Information Rates for X-ON and X-OFF cells. We found that for X-ON cells the Firing Rate and Information Rate often behave in completely different ways, while for X-OFF cells the two rates are much more highly correlated. Our results suggest that X-ON cells employ a more efficient "temporal code", while X-OFF cells use a straightforward "rate code", which is more reliable and is correlated with energy consumption.


Subject(s)
Action Potentials/physiology , Geniculate Bodies/cytology , Geniculate Bodies/physiology , Mental Processes/physiology , Neurons/physiology , Animals , Cats , Photic Stimulation/methods , Visual Cortex/cytology , Visual Cortex/physiology , Visual Pathways/cytology , Visual Pathways/physiology
4.
Int J Neural Syst ; 29(8): 1950003, 2019 Oct.
Article in English | MEDLINE | ID: mdl-30841769

ABSTRACT

The nature of neural codes is central to neuroscience. Do neurons encode information through relatively slow changes in the firing rate (rate code) or by the precise timing of every spike (temporal code)? Here we compare the loss of information due to correlations for these two possible neural codes. The essence of Shannon's definition of information is to tie information to uncertainty: the higher the uncertainty of a given event, the more information is conveyed by that event. Correlations can reduce uncertainty, and hence the amount of information, but by how much? In this paper we address this question by directly comparing the information per symbol conveyed by words coming from a binary Markov source (temporal code) with the information per symbol coming from the corresponding Bernoulli source (uncorrelated, rate code). In a previous paper we found that a crucial role in the relation between information transmission rates (ITRs) and firing rates is played by a parameter s, the sum of the transition probabilities from the no-spike state to the spike state and vice versa; the same parameter turns out to play a crucial role here as well. We calculated the maximal and minimal bounds of the quotient of the ITRs for these sources. Next, making use of the entropy grouping axiom, we determined the loss of information in a Markov source compared with the corresponding Bernoulli source for a given word length. Our results show that in the case of correlated signals the loss of information is relatively small, and thus temporal codes, which are more energetically efficient, can effectively replace rate codes. These results were confirmed by experiments.
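The comparison above can be made concrete numerically. The sketch below is an illustration in the spirit of the abstract, not the authors' derivation: the entropy rate of a binary Markov source never exceeds that of the Bernoulli source with the same stationary firing rate, and the two coincide when the parameter s equals 1 (the Markov source is then effectively memoryless):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def markov_rate(p01, p10):
    """Entropy rate of a two-state Markov source; p01 = P(no-spike -> spike),
    p10 = P(spike -> no-spike), so s = p01 + p10."""
    pi1 = p01 / (p01 + p10)          # stationary firing probability
    return (1 - pi1) * h2(p01) + pi1 * h2(p10)

def bernoulli_rate(p01, p10):
    """Entropy of the memoryless source with the same stationary firing rate."""
    return h2(p01 / (p01 + p10))

# Correlated source (s = 0.4): information per symbol is lost...
print(markov_rate(0.1, 0.3), bernoulli_rate(0.1, 0.3))
# ...but none is lost when s = 1.
print(markov_rate(0.3, 0.7), bernoulli_rate(0.3, 0.7))
```

The first pair of numbers differs (the Markov source carries less information per symbol), while the second pair agrees, illustrating the special role of s.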


Subject(s)
Action Potentials , Information Theory , Markov Chains , Models, Neurological , Uncertainty
5.
BMC Neurosci ; 16: 32, 2015 May 19.
Article in English | MEDLINE | ID: mdl-25986973

ABSTRACT

BACKGROUND: Explaining how brain processing can be so fast remains an open problem (van Hemmen JL, Sejnowski T., 2004). Thus, the analysis of neural transmission (Shannon CE, Weaver W., 1963) basically focuses on searching for effective encoding and decoding schemes. According to Shannon's fundamental theorem, mutual information plays a crucial role in characterizing the efficiency of communication channels. It is well known that this efficiency is determined by the channel capacity, which is simply the maximal mutual information between input and output signals. On the other hand, intuitively speaking, when input and output signals are more correlated, the transmission should be more efficient. A natural question thus arises about the relation between mutual information and correlation. We analyze the relation between these quantities using the binary representation of signals, which is the most common approach taken in studying the neuronal processes of the brain. RESULTS: We present binary communication channels for which mutual information and correlation coefficients behave differently, both quantitatively and qualitatively. Despite this difference in behavior, we show that noncorrelation of binary signals implies their independence, in contrast to the case for general types of signals. CONCLUSIONS: Our research shows that mutual information cannot be replaced by sheer correlation. Our results indicate that neuronal encoding has a more complicated nature, which cannot be captured by straightforward correlations between input and output signals, since mutual information takes into account the structure and patterns of the signals.
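The relation between mutual information and correlation for binary signals can be illustrated directly. This is a minimal sketch over a generic 2x2 joint distribution, not the specific channels analyzed in the paper: for binary variables, zero correlation forces the joint distribution to factorize, so mutual information vanishes as well.

```python
import math

def mi_and_corr(joint):
    """Mutual information (bits) and Pearson correlation for binary X, Y,
    given joint[x][y] = P(X=x, Y=y)."""
    px = [joint[0][0] + joint[0][1], joint[1][0] + joint[1][1]]
    py = [joint[0][0] + joint[1][0], joint[0][1] + joint[1][1]]
    mi = sum(joint[x][y] * math.log2(joint[x][y] / (px[x] * py[y]))
             for x in (0, 1) for y in (0, 1) if joint[x][y] > 0)
    cov = joint[1][1] - px[1] * py[1]
    corr = cov / math.sqrt(px[0] * px[1] * py[0] * py[1])
    return mi, corr

print(mi_and_corr([[0.35, 0.15], [0.35, 0.15]]))  # independent: MI = 0, corr = 0
print(mi_and_corr([[0.40, 0.10], [0.10, 0.40]]))  # correlated: MI > 0, corr close to 0.6
```

For non-binary signals no such implication holds, which is why the abstract stresses that it is special to the binary case.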


Subject(s)
Communication , Information Theory , Models, Neurological , Action Potentials , Algorithms , Brain/physiology , Neurons/physiology
6.
Brain Res ; 1536: 135-43, 2013 Nov 06.
Article in English | MEDLINE | ID: mdl-23891793

ABSTRACT

Organisms often evolve as compromises, and many of these compromises can be expressed in terms of energy efficiency. Thus, many authors analyze the energetic costs of processes during information transmission in the brain. In this paper we study the information transmission rate per unit of energy used in a class of ring, brain-inspired neural networks, which we assume to include components such as excitatory and inhibitory neurons and long-range connections. For the neuron model we followed the probabilistic approach proposed by Levy and Baxter (2002), which contains all the essential qualitative mechanisms participating in the transmission process and provides results consistent with physiologically observed values. Our research shows that all of these network components, over a broad range of conditions, significantly improve the information-energetic efficiency. It turned out that inhibitory neurons can improve the information-energetic transmission efficiency by 50%, while long-range connections can improve the efficiency by as much as 70%. We also found that the smallest network is the most effective: doubling the network size can cause as much as a threefold decrease in the information-energetic efficiency. This article is part of a Special Issue entitled Neural Coding 2012.


Subject(s)
Energy Metabolism , Information Theory , Nerve Net/physiology , Neural Networks, Computer , Neurons/physiology
7.
J Sleep Res ; 22(1): 13-21, 2013 Feb.
Article in English | MEDLINE | ID: mdl-22737985

ABSTRACT

Even in the absence of external stimuli there is ongoing activity in the cerebral cortex as a result of recurrent connectivity. This paper attempts to characterize one aspect of this ongoing activity by examining how the information content carried by specific neurons varies as a function of brain state. We recorded from rats chronically implanted with tetrodes in the primary visual cortex during awake and sleep periods. The electroencephalogram and spike trains were recorded during 30-min periods, and 2-4 neurons were isolated per tetrode off-line. All the activity included in the analysis was spontaneous, being recorded from the visual cortex in the absence of visual stimuli. The brain state was determined through a combination of behavioral evaluation, electroencephalogram and electromyogram analysis. Information in the spike trains was determined using Lempel-Ziv Complexity, which was used to estimate the entropy of the neural discharges and thus their information content (Amigó et al. Neural Comput., 2004, 16: 717-736). The information content of the spike trains (range 4-70 bits s(-1)) was evaluated during different brain states and particularly during the transition periods. Transitions toward states of deeper sleep coincided with a decrease in information, while transitions to the awake state resulted in an increase in information. Changes in both directions were of the same magnitude, about 30%. Information in spike trains showed a high temporal correlation between neurons, reinforcing the idea of the impact of the brain state on the information content of spike trains.


Subject(s)
Brain/physiology , Sleep/physiology , Action Potentials/physiology , Animals , Behavior, Animal/physiology , Electroencephalography , Electromyography , Male , Photic Stimulation , Rats , Visual Cortex/physiology , Wakefulness/physiology
8.
Biosystems ; 105(1): 62-72, 2011 Jul.
Article in English | MEDLINE | ID: mdl-21439348

ABSTRACT

There has been growing interest in estimating the information carried by a single neuron, by multiple single units, or by a population of neurons in response to specific stimuli. In this paper, inspired by the article of Levy and Baxter (2002), we analyze the efficiency of neuronal communication by considering dendrosomatic summation as a Shannon-type channel (1948) and by treating unreliable synaptic transmission as part of the dendrosomatic computation. Specifically, we study the Mutual Information between input and output signals for different types of neuronal network architectures by applying efficient entropy estimators. We analyze the influence of the following quantities affecting the transmission abilities of neurons: synaptic failure, activation threshold, firing rate and the type of input source. We observed a number of surprising, non-intuitive effects. It turns out that, especially for lower activation thresholds, significant synaptic noise can lead to as much as a twofold increase in transmission efficiency. Moreover, the efficiency turns out to be a non-monotonic function of the activation threshold. We find a universal threshold value at which a local maximum of the Mutual Information is achieved for most of the neuronal architectures, regardless of the type of source (correlated or non-correlated). Additionally, to reach the global maximum, the optimal firing rates must increase with the threshold. This effect is particularly visible for lower firing rates; for higher firing rates the influence of synaptic noise on transmission efficiency is more advantageous. Noise is an inherent component of communication in biological systems; hence, based on our analysis, we conjecture that the neuronal architecture has been adjusted to make more effective use of this attribute.
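The role of synaptic failure and threshold can be sketched with a toy channel model (the parameters below are hypothetical illustrations, not the architectures studied in the paper): X active synapses drawn from a binomial distribution, each transmitting independently with success probability q = 1 - failure rate, and a unit that fires when at least theta inputs get through.

```python
import math
from math import comb

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p <= 0.0 or p >= 1.0 else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info(n, p, q, theta):
    """I(X; Z) for X ~ Binomial(n, p) active synapses, each succeeding with
    probability q; the neuron fires (Z = 1) iff at least theta inputs arrive."""
    terms, pz1 = [], 0.0
    for x in range(n + 1):
        px = comb(n, x) * p**x * (1 - p)**(n - x)
        # P(Z = 1 | X = x): at least theta of the x active synapses succeed
        pz1_x = sum(comb(x, s) * q**s * (1 - q)**(x - s)
                    for s in range(theta, x + 1))
        terms.append((px, pz1_x))
        pz1 += px * pz1_x
    return h2(pz1) - sum(px * h2(pz1_x) for px, pz1_x in terms)

print(mutual_info(10, 0.5, 1.0, 3))  # fully reliable synapses
print(mutual_info(10, 0.5, 0.7, 3))  # noisy synapses
```

Sweeping q and theta in such a model is one simple way to probe how failure rate and activation threshold jointly shape transmission efficiency.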


Subject(s)
Models, Neurological , Nerve Net/physiology , Synaptic Transmission/physiology , Entropy , Information Theory
9.
Phys Rev Lett ; 93(23): 234101, 2004 Dec 03.
Article in English | MEDLINE | ID: mdl-15601163

ABSTRACT

We propose a definition of a finite-space Lyapunov exponent. For discrete-time dynamical systems, it measures the local (between neighboring points) average spreading of the system. We justify our definition by showing that, for large classes of chaotic maps, the corresponding finite-space Lyapunov exponent approaches the Lyapunov exponent of the chaotic map as M → ∞, where M is the cardinality of the discrete phase space. In analogy with continuous systems, we say the system exhibits pseudochaos if its finite-space Lyapunov exponent tends to a positive number (or to +∞) as M → ∞.

10.
Neural Comput ; 16(4): 717-36, 2004 Apr.
Article in English | MEDLINE | ID: mdl-15025827

ABSTRACT

Normalized Lempel-Ziv complexity, which measures the generation rate of new patterns along a digital sequence, is closely related to such important source properties as entropy and compression ratio, but, in contrast to these, it is a property of individual sequences. In this article, we propose to exploit this concept to estimate (or, at least, to bound from below) the entropy of neural discharges (spike trains). The main advantages of this method include fast convergence of the estimator (as supported by numerical simulation) and the fact that there is no need to know the probability law of the process generating the signal. Furthermore, we present numerical and experimental comparisons of the new method against the standard method based on word frequencies, providing evidence that this new approach is an alternative entropy estimator for binned spike trains.
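The "standard method based on word frequencies" that the LZ estimator is compared against can be sketched as a plug-in estimator over non-overlapping binned words; this is illustrative, not the authors' exact procedure:

```python
import math
import random
from collections import Counter

def word_entropy_rate(s, L):
    """Plug-in estimate of the entropy rate (bits/symbol): empirical entropy of
    non-overlapping L-letter words, divided by L."""
    words = [s[i:i + L] for i in range(0, len(s) - L + 1, L)]
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in Counter(words).values()) / L

random.seed(2)
coin = ''.join(random.choice('01') for _ in range(10000))
print(word_entropy_rate(coin, 4))         # fair coin: close to 1 bit/symbol
print(word_entropy_rate('01' * 5000, 4))  # periodic train: 0 bits/symbol
```

A known weakness of this estimator is that it needs enough data to populate the word histogram, which is one motivation for the LZ-based alternative discussed in the abstract.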


Subject(s)
Action Potentials/physiology , Algorithms , Entropy , Models, Neurological , Neurons/physiology , Animals , Cats , Electric Stimulation , Female , Male , Markov Chains , Photic Stimulation , Signal Processing, Computer-Assisted , Visual Cortex/physiology
11.
J Biol Phys ; 29(1): 39-54, 2003 Mar.
Article in English | MEDLINE | ID: mdl-23345818

ABSTRACT

We apply random field theory to the study of DNA chains, which we assume to be trajectories of a stochastic process. We construct a statistical potential between nucleotides corresponding to the probabilities of those trajectories, which can be obtained from a DNA database containing millions of sequences. It turns out that this potential has an interpretation in terms of quantities naturally arrived at during the study of the evolution of species, i.e. the probabilities of mutations of codons. Making use of recently performed statistical investigations of DNA, we show that this potential has different qualitative properties in the coding and noncoding parts of genes. We apply our model to data for various organisms and obtain good agreement with results presented in the literature. We also argue that coding/noncoding boundaries can correspond to jumps of the potential.
