In order to capture the history dependence governing the spontaneous spiking activity of the RGC neuron, we model the spiking probability using two different link models, to further corroborate that our results generalize to models beyond the canonical self-exciting process studied in this paper.
Figure 2 shows 500 samples of the canonical self-exciting process, generated using the history dependence parameter vector shown in Figure 3(a).
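As an illustration of this kind of self-exciting process, the following sketch simulates Bernoulli spiking whose probability at each bin is a logistic link applied to a baseline plus a weighted sum of recent spikes. The decaying kernel `theta`, the baseline `mu`, and the function name are illustrative assumptions, not the parameters used to produce Figures 2 and 3.

```python
import numpy as np

def simulate_self_exciting(n, theta, mu, seed=None):
    """Simulate a Bernoulli self-exciting process: at each time bin the
    spiking probability is a logistic link applied to a baseline mu plus
    the inner product of the history kernel theta with recent spikes."""
    rng = np.random.default_rng(seed)
    p = len(theta)
    spikes = np.zeros(n + p)                 # zero-padded pre-history
    for t in range(p, n + p):
        history = spikes[t - p:t][::-1]      # most recent spike first
        prob = 1.0 / (1.0 + np.exp(-(mu + theta @ history)))
        spikes[t] = float(rng.random() < prob)
    return spikes[p:]

# illustrative parameters (not those of the paper's figures):
# a decaying excitatory kernel and a low baseline firing rate
theta = 0.8 * np.exp(-np.arange(10) / 3.0)
x = simulate_self_exciting(500, theta, mu=-2.0, seed=0)
```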
These models are widely used in neural data analysis and are motivated by continuous-time point processes with history-dependent conditional intensity functions.
Nevertheless, regularized ML estimators show remarkable performance in fitting GLMs to neuronal data with history dependence and highly non-i.i.d. covariates. In this paper, we close this gap by presenting new results on robust estimation of compressible GLMs, relaxing the common assumption of i.i.d. covariates. We present theoretical guarantees that extend those of CS theory and characterize the fundamental trade-offs involved.
This condition ensures that the resulting estimates of (9)-(12) pertain to stable AR processes and, at the same time, can be obtained by convex optimization techniques for which fast solvers exist.
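The stability condition can be checked numerically: an AR(p) model with coefficients a_1, ..., a_p is stable when every root of z^p - a_1 z^(p-1) - ... - a_p lies strictly inside the unit circle. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def is_stable_ar(a):
    """Check stability of the AR(p) model
    x_t = a_1 x_{t-1} + ... + a_p x_{t-p} + e_t:
    stable iff all roots of z^p - a_1 z^{p-1} - ... - a_p
    lie strictly inside the unit circle."""
    a = np.asarray(a, dtype=float)
    roots = np.roots(np.concatenate(([1.0], -a)))
    return bool(np.all(np.abs(roots) < 1.0))
```

For example, `is_stable_ar([0.5, 0.25])` is stable, while `is_stable_ar([1.2])` is not, since its characteristic root is 1.2 > 1.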
In the special case of a white-noise sub-Gaussian process, i.e., a sub-Gaussian i.i.d. sequence, our results recover standard compressed sensing guarantees. Finally, we provide simulation results as well as applications to oil price and traffic data, which reveal that the sparse estimates significantly outperform traditional techniques such as Yule-Walker-based estimators.
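To make the comparison concrete, here is a minimal sketch contrasting a classical Yule-Walker AR fit with an l1-regularized least-squares (LASSO-type) fit solved by plain ISTA. The regularization level, the sparse ground truth, and the function names are illustrative assumptions, not the estimators or datasets evaluated in the paper.

```python
import numpy as np

def yule_walker(x, p):
    """Classical Yule-Walker AR(p) estimate from sample autocovariances."""
    x = x - x.mean()
    n = len(x)
    r = np.array([x[:n - k] @ x[k:] for k in range(p + 1)]) / n
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, r[1:])

def lasso_ar(x, p, lam, iters=2000):
    """Sparse AR(p) estimate: ISTA on min_a 0.5*||y - X a||^2 + lam*||a||_1,
    where row t of X holds the p lagged samples preceding y_t."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    y = x[p:]
    L = np.linalg.norm(X, 2) ** 2            # Lipschitz constant of the gradient
    a = np.zeros(p)
    for _ in range(iters):
        z = a - X.T @ (X @ a - y) / L        # gradient step
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

# illustrative sparse ground truth: only lags 1 and 5 are active
rng = np.random.default_rng(0)
true_a = np.zeros(10)
true_a[0], true_a[4] = 0.5, 0.3
x = np.zeros(3000)
for t in range(10, 3000):
    x[t] = true_a @ x[t - 10:t][::-1] + rng.standard_normal()

a_yw = yule_walker(x, 10)
a_l1 = lasso_ar(x, 10, lam=150.0)
```

The l1 penalty drives most inactive lags to exactly zero, whereas the Yule-Walker estimate spreads small nonzero weight across all lags.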
In general, the ubiquitous long-range dependencies in real-world time series, such as financial data, result in AR model fits with large orders.
Applying these techniques to simulated data as well as real-world datasets of crude oil prices and traffic speeds confirms our predicted theoretical performance gains in terms of estimation accuracy and model selection.
Our results improve over existing sampling complexity requirements in AR estimation using the LASSO, when the sparsity level scales faster than the square root of the model order.
Finally, we provide simulation results which reveal that the sparse estimates of the compressible state-space models significantly outperform the traditional basis pursuit estimator.
In this paper, we consider the problem of estimating state dynamics from noisy observations, where the state transitions are governed by autoregressive models with compressible innovations.
We provide simulation studies as well as an application to spike deconvolution from calcium imaging data, which verify our theoretical results and show significant improvement over existing algorithms.
We perform parameter and state estimation using a dynamic compressed sensing framework and develop an efficient solution consisting of two nested Expectation-Maximization (EM) algorithms.
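The EM idea underlying this framework can be illustrated, in highly simplified form, by a single EM loop for a scalar linear-Gaussian state-space model: the E-step runs a Kalman filter and RTS smoother, and the M-step updates the parameters in closed form from the posterior moments. This sketch omits the compressible-innovation prior and the inner EM of the nested algorithm; all names and parameter values are illustrative.

```python
import numpy as np

def kalman_em(y, a=0.5, q=1.0, r=1.0, iters=100):
    """EM for the scalar model x_t = a*x_{t-1} + w_t, y_t = x_t + v_t,
    with w_t ~ N(0, q) and v_t ~ N(0, r).
    E-step: Kalman filter + RTS smoother; M-step: closed-form updates."""
    n = len(y)
    for _ in range(iters):
        # E-step, forward pass (Kalman filter)
        xf, pf = np.zeros(n), np.zeros(n)
        xp, pp = 0.0, q / max(1.0 - a * a, 1e-6)   # rough stationary prior
        for t in range(n):
            k = pp / (pp + r)                      # Kalman gain
            xf[t] = xp + k * (y[t] - xp)
            pf[t] = (1.0 - k) * pp
            xp, pp = a * xf[t], a * a * pf[t] + q
        # E-step, backward pass (RTS smoother) with lag-one cross-moments
        xs, ps, cs = xf.copy(), pf.copy(), np.zeros(n)
        for t in range(n - 2, -1, -1):
            pp_next = a * a * pf[t] + q
            g = a * pf[t] / pp_next                # smoother gain
            xs[t] = xf[t] + g * (xs[t + 1] - a * xf[t])
            ps[t] = pf[t] + g * g * (ps[t + 1] - pp_next)
            cs[t + 1] = g * ps[t + 1] + xs[t + 1] * xs[t]   # E[x_{t+1} x_t]
        # M-step: closed-form updates of a, q, r from posterior moments
        exx = ps + xs ** 2                          # E[x_t^2]
        a = cs[1:].sum() / exx[:-1].sum()
        q = np.mean(exx[1:] - 2.0 * a * cs[1:] + a * a * exx[:-1])
        r = np.mean((y - xs) ** 2 + ps)
    return a, q, r

# illustrative data: a stable AR(1) state observed in mild noise
rng = np.random.default_rng(1)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()
y = x + 0.3 * rng.standard_normal(2000)
a_hat, q_hat, r_hat = kalman_em(y)
```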
In this paper, we consider linear state-space models with compressible innovations and convergent transition matrices in order to model spatiotemporally sparse transient events.