
Table 3 Two-step particle filtering (TS) algorithm

From: A weighted likelihood criteria for learning importance densities in particle filtering

\(\left [\left \{x^{i}_{n},\, L_{n}^{i},\,w_{n}^{i}\right \}_{i=1}^{M}\right ]\) = TS\(\left [\left \{ x^{i}_{n-1},\,L^{i}_{n-1},\,w_{n-1}^{i} \right \}^{M}_{i=1},\,y_{n}\right ]\)

– Initialize \(x_{0}^{i} \sim p_{0}\) and \(w_{0}^{i} = 1/M\) for i=1,2,⋯,M.

– DO for n=1,2,⋯,T:

– DO for each particle i, i=1,2,⋯,M:

– STEP 1: Construct the EnKF importance sampling density and sample from it:

1. Construct: \([\hat {x}^{i}_{n},\, \hat {P}_{n}^{i}]\) = EnKF\(\left [x^{i}_{n-1},\, L^{i}_{n-1},\,y_{n}\right ]\).

2. Sample: Draw \(x^{*i}_{n} \sim \phi _{d}\left (x\,|\,\hat {x}_{n}^{i},\,\hat {P}_{n}^{i}\right)\) as in (18).

3. Calculate weights \({w}^{*i}_{n} = { w_{n-1}^{i}\frac {p\left (y_{n}|x^{*i}_{n}\right)p\left (x^{*i}_{n}|x^{i}_{n-1}\right)}{q\left (x^{*i}_{n}\,|\,\hat {x}_{n}^{i},\,\hat {P}_{n}^{i}\right)}}\).

– STEP 2: Learn \(p(x_{n}\,|\,y_{1:n})\):

1. Find its estimate, \(\hat {f}_{n}(x)\), based on GMMs and the data \(\left \{\,x_{n}^{*i},\,w_{n}^{*i}\,\right \}_{i=1}^{M}\) from STEP 1.

2. Sample: Draw \((x_{n}^{j},\,L_{n}^{j}) \sim \hat {f}_{n}(x)\).

3. Compute weights: \( w_{n}^{j} = \frac {1}{M}\sum _{i=1}^{M}\,\frac {w_{n-1}^{i}\,p\left (y_{n}\,|\,x_{n}^{j}\right)\,p\left (x_{n}^{j}\,|\,x_{n-1}^{i}\right)}{\hat {f}_{n}\left (x_{n}^{j}\right)} \).

– Propagate: \(\left \{x^{j}_{n},\, L^{j}_{n},\,w_{n}^{j}\right \}^{M}_{j=1}\)
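The two steps above can be sketched in code. The following is a minimal one-dimensional illustration, not the paper's implementation: it assumes a toy linear-Gaussian state-space model, uses a per-particle Kalman update as a stand-in for the EnKF proposal of STEP 1, and replaces the GMM estimate \(\hat{f}_n\) of STEP 2 with a single weighted Gaussian fitted to the STEP 1 particles. All model parameters and helper names (`enkf_proposal`, `ts_step`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy model (NOT from the paper):
#   x_n = a*x_{n-1} + N(0, q),   y_n = x_n + N(0, r)
a, q, r = 0.9, 0.5, 0.4
M, T = 200, 10   # particles, time steps

def gauss(x, m, v):
    """Gaussian density N(x; m, v), vectorized."""
    return np.exp(-0.5 * (x - m) ** 2 / v) / np.sqrt(2 * np.pi * v)

def enkf_proposal(x_prev, y):
    """Per-particle Gaussian proposal (mean, var): a Kalman update of
    the one-step prediction, standing in for the EnKF construction."""
    m_pred, v_pred = a * x_prev, q
    K = v_pred / (v_pred + r)            # Kalman gain (H = 1)
    return m_pred + K * (y - m_pred), (1.0 - K) * v_pred

def ts_step(x_prev, w_prev, y):
    # STEP 1: sample from the importance density and weight
    m, v = enkf_proposal(x_prev, y)
    x_star = m + np.sqrt(v) * rng.standard_normal(M)
    w_star = (w_prev * gauss(y, x_star, r) * gauss(x_star, a * x_prev, q)
              / gauss(x_star, m, v))
    w_star /= w_star.sum()
    # STEP 2: learn p(x_n | y_{1:n}); a single weighted Gaussian stands
    # in for the paper's GMM estimate f_hat, then resample from it
    mu = np.sum(w_star * x_star)
    var = np.sum(w_star * (x_star - mu) ** 2)
    x_new = mu + np.sqrt(var) * rng.standard_normal(M)
    f_hat = gauss(x_new, mu, var)
    # weight: w_n^j = (1/M) * sum_i w_{n-1}^i p(y|x^j) p(x^j|x^i) / f_hat(x^j)
    num = (w_prev[None, :] * gauss(y, x_new, r)[:, None]
           * gauss(x_new[:, None], a * x_prev[None, :], q))
    w_new = num.sum(axis=1) / (M * f_hat)
    w_new /= w_new.sum()
    return x_new, w_new

# simulate observations from the toy model
x_true, ys = 0.0, []
for _ in range(T):
    x_true = a * x_true + np.sqrt(q) * rng.standard_normal()
    ys.append(x_true + np.sqrt(r) * rng.standard_normal())

x = rng.standard_normal(M)       # x_0^i ~ p_0 (standard normal assumed)
w = np.full(M, 1.0 / M)          # w_0^i = 1/M
for y in ys:
    x, w = ts_step(x, w, y)
print(np.sum(w * x))             # posterior mean estimate at n = T
```

The two normalizations keep each weight set a proper distribution; in the paper the second-stage weights additionally use the learned GMM density \(\hat{f}_n\) in the denominator, which the single-Gaussian `f_hat` only approximates here.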