
Differential Entropy Rate Characterisations of Long Range Dependent Processes
release_bsxfmg2n4fgvddgvcleu3yjncy

by Andrew Feutrill, Matthew Roughan

Released as an article.

2021  

Abstract

A quantity of interest for characterising continuous-valued stochastic processes is the differential entropy rate. The rate of convergence of many properties of long range dependent (LRD) processes is slower than might be expected based on the intuition for conventional processes, e.g. Markov processes. Is this also true of the entropy rate? In this paper we consider the properties of the differential entropy rate of stochastic processes that have an autocorrelation function that decays as a power law. We show that power law decaying processes with similar autocorrelation and spectral density functions, Fractional Gaussian Noise and ARFIMA(0,d,0), have different entropic properties, particularly for negatively correlated parameterisations. We then provide an equivalence between the mutual information between past and future and the differential excess entropy for stationary Gaussian processes, showing that the finiteness of this quantity marks the boundary between long and short range dependence. Finally, we analyse the convergence of the conditional entropy to the differential entropy rate and show that for short range dependent processes the rate of convergence is of the order O(n^-1), whereas for long range dependent processes it is slower and depends on the Hurst parameter.
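
For context on the quantities named in the abstract, the block below restates two standard facts in LaTeX: the Kolmogorov–Szegő expression for the differential entropy rate of a stationary Gaussian process, and the power-law autocorrelation condition that defines long range dependence. The notation (f, rho, H, d) is assumed here for illustration and is not taken from the paper itself.

    % Differential entropy rate of a stationary Gaussian process
    % with spectral density f(\lambda) (Kolmogorov--Szeg\H{o} formula):
    h = \tfrac{1}{2}\log(2\pi e)
        + \frac{1}{4\pi}\int_{-\pi}^{\pi}\log f(\lambda)\,\mathrm{d}\lambda .

    % Long range dependence: the autocorrelation decays as a power law,
    %   \rho(k) \sim c\,k^{2H-2}, \qquad \tfrac{1}{2} < H < 1,
    % so that \sum_{k}|\rho(k)| diverges. For ARFIMA(0,d,0) the memory
    % parameter relates to the Hurst parameter via d = H - \tfrac{1}{2}.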

Archived Files and Locations

application/pdf  410.3 kB
file_tzppgoai3rfxrmotq4gllkfoyq
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2021-02-10
Version   v1
Language   en
arXiv  2102.05306v1
Work Entity
access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: 4a46516b-142e-44e3-a311-0137fdf396cb
API URL: JSON