📄 Abstract
The capability of a novel Kullback-Leibler divergence method within the Kalman filter framework is examined for selecting the input-parameter-state estimation run with the most plausible results. Such identification suffers from the uncertainty of obtaining different results from different initial parameter guesses, and the examined approach addresses this issue using the information gained from the data in going from the prior to the posterior distribution. First, the Kalman filter is run for a number of different initial parameter sets, providing the system input-parameter-state estimation. Second, each resulting posterior distribution is compared to its corresponding initial prior distribution using the Kullback-Leibler divergence. Finally, the run with the smallest Kullback-Leibler divergence is selected as the one with the most plausible results. Importantly, the method is shown to select the better-performing identification in linear, nonlinear, and limited-information applications, providing a powerful tool for system monitoring.
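The selection step described above can be sketched in a few lines. This is an illustrative outline only, not the paper's implementation: it assumes the prior and posterior over the parameters are Gaussian (as in a standard Kalman filter), so the Kullback-Leibler divergence has a closed form; the function names and the `runs` data layout are hypothetical.

```python
import numpy as np

def gaussian_kl(mu_post, cov_post, mu_prior, cov_prior):
    # Closed-form KL(posterior || prior) between two multivariate Gaussians:
    # 0.5 * [tr(S1^-1 S0) + (m1-m0)^T S1^-1 (m1-m0) - k + ln(det S1 / det S0)]
    k = len(mu_post)
    prior_inv = np.linalg.inv(cov_prior)
    diff = mu_prior - mu_post
    _, logdet_prior = np.linalg.slogdet(cov_prior)  # log-det for stability
    _, logdet_post = np.linalg.slogdet(cov_post)
    return 0.5 * (np.trace(prior_inv @ cov_post)
                  + diff @ prior_inv @ diff
                  - k
                  + logdet_prior - logdet_post)

def select_most_plausible(runs):
    # runs: one (mu_post, cov_post, mu_prior, cov_prior) tuple per initial
    # parameter set. Return the index of the run with the smallest
    # prior-to-posterior information gain, i.e. the selected identification.
    return min(range(len(runs)), key=lambda i: gaussian_kl(*runs[i]))
```

In this sketch each Kalman filter run contributes one prior/posterior pair, and the run whose posterior moved least (in KL terms) from its prior is returned as the most plausible identification.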
Key Contributions
This paper proposes a novel method using Kullback-Leibler divergence within the Kalman filter framework to select the most plausible input-parameter-state estimation. It addresses the uncertainty arising from different initial parameter guesses by comparing prior and posterior distributions.
Business Value
Enhances the reliability and accuracy of state estimation in dynamic systems, which is critical for applications like autonomous navigation, industrial process control, and financial modeling.