







17,463 Hits in 3.3 sec

Capacity Bounds for Networks with Correlated Sources and Characterisation of Distributions by Entropies [article]

Satyajit Thakor, Terence Chan, Alex Grant
2016 arXiv   pre-print
dependencies via entropy functions.  ...  The characterisation of correlation or joint distribution via Shannon entropy functions is also applicable to other information measures such as Rényi entropy and Tsallis entropy.  ...  We investigated a more general question: is it feasible to characterise a probability distribution (or source correlation) completely using entropy functions?  ...
arXiv:1607.02822v1 fatcat:6kbvmayxubg43coq476wmsbgpa

Differential Entropy Rate Characterisations of Long Range Dependent Processes [article]

Andrew Feutrill, Matthew Roughan
2021 arXiv   pre-print
A quantity of interest to characterise continuous-valued stochastic processes is the differential entropy rate.  ...  negatively correlated parameterisations.  ...  Hence, we may be able to characterise the behaviour of LRD processes via the entropy rate, which tends to −∞ as the strength of correlations increases.  ...
arXiv:2102.05306v2 fatcat:airxciqwereuxjnnyrynazwgzq

Characterising Probability Distributions via Entropies [article]

Satyajit Thakor, Terence Chan, Alex Grant
2016 arXiv   pre-print
One main challenge in extending linear programming bounds to the case of correlated sources is the difficulty (or impossibility) of characterising arbitrary dependencies via entropy functions.  ...  This paper tackles the problem by addressing how to use entropy functions to characterise correlation among sources.  ...  When X is not binary, the entropy H(X) alone is not sufficient to characterise the probability distribution of X.  ...
arXiv:1602.03618v2 fatcat:g3nlthujujfs7jioccn7rwbxju
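The last snippet's observation is easy to check numerically: for a non-binary alphabet, two different distributions can share the same Shannon entropy, so H(X) alone cannot identify the distribution. A minimal Python sketch (the specific ternary distributions are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.optimize import brentq

def H(p):
    """Shannon entropy (bits) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # convention: 0 * log 0 = 0
    return -(p * np.log2(p)).sum()

p = [0.5, 0.25, 0.25]                # H(p) = 1.5 bits

# Find a structurally different ternary distribution with the same entropy:
# fix the first mass at 0.45 and solve H([0.45, x, 0.55 - x]) = H(p) for x.
x = brentq(lambda x: H([0.45, x, 0.55 - x]) - H(p), 0.28, 0.5)
q = [0.45, x, 0.55 - x]

print(H(p), H(q))                    # equal entropies
print(p, q)                          # different distributions
```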

Triple junctions network as the key pattern for characterisation of grain structure evolution in metals

Siying Zhu, Elijah Borodin, Andrey P. Jivkov
2020 Materials & Design  
With the limited available characterisation data we demonstrate that the proposed descriptor correlates well with the evolution of the microstructure during severe plastic deformation.  ...  the network of triple junctions in copper alloys as the sub-structure that governs continuous dynamic recrystallisation and propose one descriptor of this sub-structure, referred to as the structural entropy  ...  Borodin and Jivkov acknowledge the financial support from EPSRC, UK via grant EP/N026136/1.  ... 
doi:10.1016/j.matdes.2020.109352 fatcat:otogs7gmuzhbrn2pbwl37qiavm

Complexity Measures in Magnetoencephalography: Measuring "Disorder" in Schizophrenia

Matthew J. Brookes, Emma L. Hall, Siân E. Robson, Darren Price, Lena Palaniyappan, Elizabeth B. Liddle, Peter F. Liddle, Stephen E. Robinson, Peter G. Morris, Sam Doesburg
2015 PLoS ONE  
These timecourses are modulated by cognitive tasks, with an increase in local neural processing characterised by localised and transient increases in entropy in the neural signal.  ...  These findings demonstrate potential clinical utility for our method and support a recent hypothesis that schizophrenia can be characterised by abnormalities in the salience network (a well characterised  ...  Entropy signals were compared to time frequency spectra via computation of the Pearson correlation coefficient.  ... 
doi:10.1371/journal.pone.0120991 pmid:25886553 pmcid:PMC4401778 fatcat:hq4ypbwtbzdcjnvbuevvdebks4
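The Pearson comparison mentioned in the last excerpt is simple to prototype: correlate a windowed entropy timecourse against band-limited spectral power from a time-frequency decomposition. Everything below (sampling rate, band edges, random signals) is a placeholder standing in for real MEG data, not the study's pipeline:

```python
import numpy as np
from scipy.signal import spectrogram
from scipy.stats import pearsonr

fs = 600.0                                     # placeholder sampling rate (Hz)
rng = np.random.default_rng(0)
meg = rng.standard_normal(int(60 * fs))        # placeholder 60 s "MEG" trace

# Time-frequency decomposition of the signal.
f, seg_t, Sxx = spectrogram(meg, fs=fs, nperseg=512, noverlap=256)

# Band-limited power timecourse (alpha band chosen purely for illustration).
band = (f >= 8) & (f <= 13)
alpha_power = Sxx[band].mean(axis=0)

# Placeholder "entropy" timecourse sampled at the same segment times; in
# practice this would be the windowed entropy of the neural signal itself.
entropy_ts = rng.standard_normal(seg_t.size)

r, p = pearsonr(entropy_ts, alpha_power)
print(f"Pearson r = {r:.3f}, p = {p:.3g}")
```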

Energy–entropy competition and the effectiveness of stochastic gradient descent in machine learning

Yao Zhang, Andrew M. Saxe, Madhu S. Advani, Alpha A. Lee
2018 Molecular Physics  
Finding parameters that minimise a loss function is at the core of many machine learning methods.  ...  Moreover, we show that the stochasticity in the algorithm has a non-trivial correlation structure which systematically biases it towards wide minima.  ...  Figure 1 shows that the mean entropy of minima found via SGD is larger than the mean entropy found using Langevin dynamics with the same level of noise, confirming our hypothesis that the anisotropy of  ... 
doi:10.1080/00268976.2018.1483535 fatcat:icpumct4gngzxdxbwrbtlcnlyq

Measuring Global Behaviour of Multi-agent Systems from Pair-Wise Mutual Information [chapter]

George Mathews, Hugh Durrant-Whyte, Mikhail Prokopenko
2005 Lecture Notes in Computer Science  
Differential entropy differs from standard entropy since it is defined for continuous probability density functions, i.e.  ...  We intend to show that this may be sufficient for a characterisation of global multi-agent behaviours. Fig. 1 LEFT: Magnitude of the ARL acceleration as a function of agent separation.  ...
doi:10.1007/11554028_81 fatcat:fhgq42k2grfw5krzsy76e2bfom
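The first excerpt's point, that differential entropy is defined for a probability density rather than a probability mass function, can be illustrated with a quick numerical check against the Gaussian closed form (my example, not from the chapter):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

sigma = 2.0

# Differential entropy h(X) = -∫ f(x) ln f(x) dx for a continuous density f.
def integrand(x):
    f = norm.pdf(x, scale=sigma)
    return -f * np.log(f)

h_numeric, _ = quad(integrand, -12 * sigma, 12 * sigma)   # tails are negligible
h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma**2)      # Gaussian closed form

print(h_numeric, h_closed)   # both ≈ 2.112 nats
```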

Anomalous thermal behaviour and diffuse scattering in cadmium cyanide

Chloe Simone Coates, Mia Baise, Arkadiy Simonov, Ben Slater, Andrew Goodwin
2017 Acta Crystallographica Section A: Foundations and Advances  
physical properties and the degree of correlated disorder.  ...  It adopts the interpenetrated diamondoid structure of cubic ice-VII and favours the 'two-in-two-out' cyanide configuration that governs proton disorder in ice, and gives rise to residual entropy of ice  ...  This is an ideal system to probe the relationship between disorder and material properties, and give an indication as to how we might tune the structure, disorder and, in turn, functionality of framework  ... 
doi:10.1107/s2053273317087678 fatcat:u6i737ydxfgs5pr63u6cntfmdm

Three invariants of strange attractors derived through hypergeometric entropy [article]

Keisuke Okamura
2022 arXiv   pre-print
They are the correlation dimension (𝒟) and the correlation entropy (𝒦), both having attracted attention over the past decades, and a new invariant called the correlation concentration (𝒜) introduced  ...  The entropy function is modelled by Kummer's confluent hypergeometric function, which reproduces the known scaling behaviours of 𝒟 and 𝒦 in the "microscopic" limit ρ→∞ while exhibiting a new scaling  ...  New scaling law at macroscopic scale via extended correlation integral: Let us modify the correlation integral of Eq. (2) by replacing the Heaviside function with an exponential function of the form  ...
arXiv:2204.14092v1 fatcat:lal4v6otl5gq5ibspx7l4of3sy
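The modification described in the final excerpt, replacing the Heaviside step in the correlation integral with an exponential kernel, can be sketched as below. The kernel exp(-d/ε) is a placeholder, since the snippet truncates before the paper's exact form, and the Hénon map is only a convenient test attractor:

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_integral(points, eps, kernel="heaviside"):
    """Average pair weight over all pairwise distances of `points`.

    kernel="heaviside" gives the standard Grassberger-Procaccia correlation
    integral; kernel="exponential" uses exp(-d/eps) as an illustrative smooth
    replacement (the paper's exact kernel is cut off in the snippet above).
    """
    d = pdist(points)                        # all pairwise distances
    if kernel == "heaviside":
        weights = (d <= eps).astype(float)   # Theta(eps - d)
    else:
        weights = np.exp(-d / eps)
    return weights.mean()

def henon(n, a=1.4, b=0.3):
    """Generate n points of the Hénon map, dropping an initial transient."""
    x, y, pts = 0.1, 0.1, []
    for _ in range(n + 100):
        x, y = 1 - a * x * x + y, b * x
        pts.append((x, y))
    return np.array(pts[100:])

pts = henon(3000)
for eps in (0.01, 0.05, 0.2):
    print(eps,
          correlation_integral(pts, eps),
          correlation_integral(pts, eps, kernel="exponential"))
```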

Correlation-based characterisation of time-varying dynamical complexity in the Earth's magnetosphere

R. V. Donner, G. Balasis
2013 Nonlinear Processes in Geophysics  
or interact via short-range forces.  ...  Entropy measures (e.g. non-extensive Tsallis entropy, Shannon entropy, block entropy, Kolmogorov entropy, T-complexity and approximate entropy) have been proven effectively applicable for the investigation  ...
doi:10.5194/npg-20-965-2013 fatcat:qwvp56dm3rciji3jcov4svpury
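Of the measures listed in the excerpt, approximate entropy is the most algorithmically self-contained; a compact reference implementation of the standard Pincus definition (generic code, not taken from the paper) is:

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D time series (Pincus, 1991)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                    # common default tolerance

    def phi(m):
        n = len(x) - m + 1
        # Overlapping template vectors of length m.
        emb = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of templates.
        dist = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
        # Fraction of templates within tolerance r (self-matches included).
        c = (dist <= r).mean(axis=1)
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
t = np.arange(1000)
print(approximate_entropy(np.sin(0.1 * t)))            # low: regular signal
print(approximate_entropy(rng.standard_normal(1000)))  # higher: irregular signal
```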

Correlation Effects in the Moshinsky Model

Przemysław Kościk, Anna Okopińska
2013 Few-Body Systems  
We study linear entropies and von Neumann entropies of the bipartitions and compare their behavior with that of the relative correlation energy and of the statistical Kutzelnigg coefficient.  ...  We investigate quantum correlations in the ground state of the Moshinsky model formed by N harmonically interacting particles confined in a harmonic potential.  ...  (a) von Neumann entropy S^(1), (b) relative correlation energy ΔE and (c) the Kutzelnigg coefficient τ as functions of N for g = 0.1, 1, 100  ...
doi:10.1007/s00601-012-0546-4 fatcat:3ezt3f3go5fudasirf4pilwkje
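Both entanglement measures named in the abstract are simple functions of a reduced density matrix, so a generic sketch is short; the diagonal 2×2 matrix below is a toy bipartition, not the Moshinsky-model reduced state:

```python
import numpy as np
from scipy.linalg import eigh

def linear_entropy(rho):
    """Linear entropy S_L = 1 - Tr(rho^2) of a reduced density matrix."""
    return float(1.0 - np.trace(rho @ rho).real)

def von_neumann_entropy(rho):
    """von Neumann entropy S = -Tr(rho ln rho), via the eigenvalues of rho."""
    evals = eigh(rho, eigvals_only=True)
    evals = evals[evals > 1e-12]             # drop numerically zero eigenvalues
    return float(-(evals * np.log(evals)).sum())

# Toy reduced density matrix with Schmidt weights 0.9 and 0.1.
rho = np.diag([0.9, 0.1])
print(linear_entropy(rho))        # 1 - (0.81 + 0.01) = 0.18
print(von_neumann_entropy(rho))   # ≈ 0.325 nats
```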

Structural characterisation of polycrystalline colloidal monolayers in the presence of aspherical impurities

Andrew T Gray, Elizabeth Mould, C Patrick Royall, Ian Williams
2015 Journal of Physics: Condensed Matter  
Here we investigate a quasi-two-dimensional system of colloidal spheres containing a small fraction of aspherical impurities and characterise the resulting polycrystalline monolayer.  ...  Increasing the concentration of impurities leads to an increase in the number of these defects and consequently a reduction in system-wide hexagonal ordering and a corresponding increase in entropy as  ...  The orientational correlation function, g_6(r) = |ψ_6^j ψ_6^k|, where particles j and k are separated by a distance r, characterises spatial correlations in this alignment and is typically used to classify  ...
doi:10.1088/0953-8984/27/19/194108 pmid:25924206 fatcat:tzuacdablbdgrprekg3szunub4
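The bond-orientational quantities quoted in the last excerpt are standard for 2-D monolayers: a local hexatic order parameter ψ₆ per particle and its pair correlation g₆(r), commonly defined as ⟨ψ₆ʲ ψ₆ᵏ*⟩ averaged over pairs at separation r. The sketch below uses a simple distance cutoff for neighbours and a perturbed triangular lattice as test data; the paper's own conventions (e.g. Voronoi neighbours, normalisation) may differ:

```python
import numpy as np
from scipy.spatial import cKDTree

def psi6(points, r_cut):
    """Local hexatic order parameter psi_6 for each particle in 2-D.

    Neighbours are all particles within r_cut (a simplification; Voronoi
    neighbours are the more common choice in the literature).
    """
    tree = cKDTree(points)
    psi = np.zeros(len(points), dtype=complex)
    for i, p in enumerate(points):
        nbrs = [j for j in tree.query_ball_point(p, r_cut) if j != i]
        if nbrs:
            angles = np.arctan2(points[nbrs, 1] - p[1], points[nbrs, 0] - p[0])
            psi[i] = np.exp(6j * angles).mean()
    return psi

def g6(points, psi, r_max, nbins=40):
    """Orientational correlation <psi_6^j psi_6^k*> binned over pair separation r."""
    i, j = np.triu_indices(len(points), k=1)
    r = np.linalg.norm(points[i] - points[j], axis=1)
    corr = (psi[i] * np.conj(psi[j])).real
    bins = np.linspace(0, r_max, nbins + 1)
    idx = np.digitize(r, bins) - 1
    g = np.array([corr[idx == b].mean() if np.any(idx == b) else np.nan
                  for b in range(nbins)])
    return 0.5 * (bins[1:] + bins[:-1]), g

# Perturbed triangular lattice: should retain long-ranged orientational order.
rng = np.random.default_rng(1)
a = 1.0
pts = np.array([[x + 0.5 * (y % 2), y * np.sqrt(3) / 2]
                for y in range(20) for x in range(20)], dtype=float) * a
pts += 0.05 * rng.standard_normal(pts.shape)
psi = psi6(pts, r_cut=1.3 * a)
r, g = g6(pts, psi, r_max=8 * a)
print(np.nanmean(g[:10]), np.nanmean(g[-10:]))   # both stay close to 1
```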

Thermodynamic Formalism in Neuronal Dynamics and Spike Train Statistics

Rodrigo Cofré, Cesar Maldonado, Bruno Cessac
2020 Entropy  
At its core, there is a variational principle that corresponds, in its simplest form, to the Maximum Entropy principle.  ...  Now, Equation (7) characterises the second derivative of the pressure as a time series of correlations, which converge when the correlations decay exponentially.  ...  In contrast to statistical physics, there is no need to define a partition function (the potential is defined via transition probabilities, and is thus normalised).  ...
doi:10.3390/e22111330 pmid:33266513 pmcid:PMC7712217 fatcat:mb22nfmzxrbvbcx342tigdbsky

Functional diversity within protein superfamilies

James Casbon, Mansoor Saqi
2006 Journal of Integrative Bioinformatics  
We observe a negative correlation between the functional entropy of a superfamily and the size of the conserved core.  ...  However, the assignment of function via structure remains difficult as some structures are compatible with a variety of functions.  ...  via structure.  ...
doi:10.1515/jib-2006-46 fatcat:t6f3lpouhrarlflil7ilfygifm

Evolving Spatiotemporal Coordination in a Modular Robotic System [chapter]

Mikhail Prokopenko, Vadim Gerasimov, Ivan Tanev
2006 Lecture Notes in Computer Science  
In particular, the information-theoretic measure of coordination employed in this work estimates the generalized correlation entropy and the generalized excess entropy computed over a multivariate  ...  The entropy  ...  For any given actuator, a simple characterisation of the "regularity" of the time series y is provided by the auto-correlation function.  ...
doi:10.1007/11840541_46 fatcat:lgzlknagubbbrizvyekqvion7i
Showing results 1 — 15 out of 17,463 results