 Research
 Open Access
A novel approach to extracting useful information from noisy TFDs using 2D local entropy measures
EURASIP Journal on Advances in Signal Processing volume 2020, Article number: 18 (2020)
Abstract
The paper proposes a novel approach for extraction of useful information and blind source separation of signal components from noisy data in the time-frequency domain. The method is based on the local Rényi entropy calculated inside adaptive, data-driven 2D regions, the sizes of which are calculated utilizing the improved, relative intersection of confidence intervals (RICI) algorithm. One of the advantages of the proposed technique is that it does not require any prior knowledge of the signal, its components, or noise; rather, the processing is performed on the noisy signal mixtures. Also, it is shown that the method is robust to the selection of time-frequency distributions (TFDs). It has been tested for different signal-to-noise ratios (SNRs), both for synthetic and real-life data. When compared to fixed TFD thresholding, adaptive TFD thresholding based on the RICI rule, and the 1D entropy-based approach, the proposed adaptive method significantly increases classification accuracy (by up to 11.53%) and F1 score (by up to 7.91%). Hence, this adaptive, data-driven, entropy-based technique is an efficient tool for extracting useful information from noisy data in the time-frequency domain.
1 Introduction
Various real-life phenomena produce signals that contain information on the systems of their origin. When their underlying dynamics are analyzed, most of these signals turn out to be nonstationary, meaning that their spectrum is time-varying and exhibits dynamical spectral behavior (e.g., biomedical signals, signals from radars, sonars, seismic activity, audio). In addition, many real-life signals are also multicomponent and may be decomposed into multiple amplitude- and/or frequency-modulated components.
When dealing with signal interpretation, signals are commonly represented in one of two domains, namely the time domain or the frequency domain. In classical representations, the variables representing time and frequency are mutually exclusive. The time-frequency distribution (TFD) of a signal with time-varying frequency content and dynamical spectral behavior allows us to represent the signal jointly in the time and frequency domains and to detect frequency components at each time instant [1]. TFDs are used in various fields, such as nautical studies [2], medicine [3, 4], electrical engineering [5, 6], and image processing [7, 8].
One of the simplest TFDs is the short-time Fourier transform (STFT), proposed by Gabor in 1946, which introduces a moving window and applies the Fourier transform (FT) to the signal inside the window [9]. However, the performance of the STFT is highly dependent on the window size and, according to the Heisenberg uncertainty principle, there exists a compromise between time and frequency resolution (increasing the window size increases frequency resolution and reduces time resolution, and vice versa). This has motivated the development of numerous other high-resolution TFDs, many of which are quadratic. The main shortcoming of the quadratic class of TFDs is the inevitable appearance of cross-terms or interferences caused by the quadratic nature of the TFD (this has led to the development of a wide range of reduced-interference quadratic TFDs).
In nonstationary signal analysis in the time-frequency domain, one of the fundamental problems is measuring the signal information content, both globally and locally (e.g., complexity and the number of signal components). Knowing the information content allows efficient preprocessing and dynamic memory allocation prior to signal feature extraction (e.g., instantaneous frequency and amplitude estimation) in blind source separation, machine learning, automatic classification systems, etc.
A challenging problem in signal analysis is blind source separation, i.e., separating signal components from a noisy mixture without any a priori knowledge of the signal. Some of the algorithms considered standard in solving this problem are the greedy approach [10, 11], the relaxation approach [12, 13], the smoothed approach [14], and the component analysis method [15–17]. A time-frequency approach has been proposed in [18]. There is a variety of different methods, and several new approaches have been studied in the last few years [19–22]. Methods exploring the use of entropy measures in separating the source signal have also been investigated in many studies.
Flandrin et al. [23], in their paper from 1994, gave a detailed discussion of the Rényi information measure for deterministic and random signals. In this study, the authors indicated the general utility of the Rényi entropy measure as a complexity indicator of signals in the time-frequency plane. Extensive research has shown that the most suitable entropy measure for the TFD of a signal is the Rényi entropy [24].
In [25, 26], and later on in [27, 28], the authors present and analyze a method based on the Rényi entropy for blind source separation, as well as an extensive comparison of the proposed method with several different methods. The authors state that the method based on Rényi's entropy should be preferred over other methods. The methods in the mentioned papers are not related to the signal's TFD.
A modification of sparse component analysis based on the time-frequency domain was given in [29]. The blind source separation problem in the time-frequency domain has also been investigated in [30], as well as in [31]: the mixed signals were transformed from the time domain to the time-frequency domain. Both the effectiveness and superiority of the proposed algorithm were verified, but under the assumption that there are several sensors and that there are single-source points. Both methods are dependent on the number of sensors. Other methods for blind source separation based on the mixing matrix that depend on the number of sensors are presented in [32, 33].
A method combining the wavelet transform with time-frequency blind source separation based on the smoothed pseudo-Wigner-Ville distribution is investigated in [34] to extract electroencephalogram characteristic waves, and the result is used to construct a support vector machine. In the paper by Saulig et al. [35], the authors propose an automatic adaptive method for identification and separation of the useful information contained in TFDs. The main idea behind the method is based on the K-means clustering algorithm, which performs a 1D partitioning of the data set. Instead of hard thresholding, the authors use blind separation of useful information from background noise with the local Rényi entropy. The advantage of this approach is that no prior knowledge of the signal is needed. The results show that this method acts as a near-to-optimal automatic hard-threshold selector.
Combining a data-driven method for adaptive Rényi entropy calculation with the relative intersection of confidence intervals (RICI) method could allow the user to extract useful content without needing any information about the signal source. The method could automatically adapt to the data obtained from the signal's TFD. In this paper, we present a method for blind source separation based on the local 2D windowed Rényi entropy of the signal's TFD. The method is self-adaptive in terms of choosing the appropriate window for the entropy calculation. It has been tested on the spectrogram and the reduced interference distribution (RID) based on the Bessel function. Results are obtained for multicomponent signals and compared to both fixed TFD thresholding and RICI-based selection of fixed TFD thresholds without entropy calculations. In addition, a comparison to the recently introduced entropy-based method [35] is performed. The method is adaptive, and no prior knowledge of the signal is required. It can be applied to various multicomponent frequency-modulated signals, both in noisy and noise-free environments. This blind source separation method could potentially be applied to different real-life problems, such as biomedical signals (EEG, ECG, etc.) and seismology (earthquake seismographs). The method's performance remains stable across different TFDs.
The rest of the paper is structured as follows. Section 2.1 provides a brief overview of time-frequency signal representations, starting from the spectrogram and focusing on the RID with the Bessel function. Entropy measures, in particular the Rényi entropy, are defined in Section 2.2. Next, the proposed method is described in Section 2.3, followed by the RICI-based adaptive thresholding procedure given in Section 2.4. Section 3 elaborates in detail on the numerical results achieved by the proposed technique. Finally, conclusions are drawn in Section 4. The nomenclature used in the paper is given in Table 1.
2 Methodology
2.1 Time-frequency distributions
The majority of real-life signals are nonstationary, meaning that their frequency content changes with time. The classic time or frequency representation does not display the dependencies between the two.
TFDs are used to represent the signal's frequency content w.r.t. time, allowing the analyst to see the start and end time of each signal component in the time-frequency domain. Unlike classical representations, a TFD can show whether the signal is monocomponent or multicomponent, which can be hard to determine with spectral analysis.
Two different distributions were used for the algorithm validation, namely the spectrogram and the reduced interference distribution (RID) based on the Bessel function.
2.1.1 The spectrogram
The spectrogram is computed from the signal's time-domain representation as the squared magnitude of the STFT [1, 36, 37]:
\(\mathrm{Sp}_{x}(t,f)=\left|\int_{-\infty}^{+\infty}x(\tau)\,\omega(\tau-t)\,e^{-j2\pi f\tau}\,\mathrm{d}\tau\right|^{2},\)
where x is the analyzed signal and ω is the smoothing window. The spectrogram introduces nonlinearity into the time-frequency representation: the spectrogram of the sum of two signals does not correspond to the sole sum of the spectrograms of the two signals, but presents a third term if the two components share time-frequency supports. Also, the representation depends on the window function ω(t). A smaller window produces better time resolution, while a wider window gives better frequency resolution. In other words, the observation window ω(t) allows localization of the spectrum in time but also smears the spectrum in frequency.
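As an illustration, the spectrogram can be sketched directly from this definition with a sliding window and an FFT (a minimal Python/NumPy sketch; the function and parameter names are illustrative and not taken from the paper):

```python
import numpy as np

def spectrogram(x, win, hop=1, nfft=None):
    """Spectrogram as the squared magnitude of the STFT: slide a
    window over the signal, FFT each windowed segment, square it."""
    nfft = nfft or len(win)
    frames = []
    for start in range(0, len(x) - len(win) + 1, hop):
        seg = x[start:start + len(win)] * win
        frames.append(np.abs(np.fft.fft(seg, nfft)) ** 2)
    return np.array(frames).T   # rows: frequency bins, columns: time

# toy example: a 50 Hz tone sampled at 1 kHz
fs = 1000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 50 * t)
S = spectrogram(x, np.hanning(128), hop=16)
peak_bin = S[:64].sum(axis=1).argmax()   # strongest one-sided bin
print(peak_bin * fs / 128)               # peak frequency near 50 Hz
```

The window length (128 samples here) sets the trade-off discussed above: longer windows sharpen the frequency estimate at the cost of time localization.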
2.1.2 The reduced interference distribution (RID) based on the Bessel function
The RID is a quadratic TFD in which the cross-terms are suppressed w.r.t. the auto-terms. In this paper, the Bessel function of the first kind has been used [38]. The distribution is defined as
where h is the frequency smoothing window and R_{x} represents the kernel
g is the time smoothing window and x^{∗} denotes the complex conjugate of x. The paper compares the results for the simple spectrogram and the high-resolution RID. Note, however, that other quadratic, high-resolution TFDs can also be used with similar performance.
2.2 The Rényi entropy
Entropy measures are most commonly used in the analysis of medical signals such as the EEG, heart-rate variability, blood pressure, and similar.
Entropy estimation is the calculation of the time density of the average information in a stochastic process.
Shannon in [39] introduced the concept of information of a discrete source without memory as a function that quantifies the uncertainty of a random variable at each discrete time. The average of that information is known as the Shannon entropy. The Shannon entropy is restricted to random variables taking discrete values. A discrete random variable s, which can take a finite number M of possible values s_{i}∈{s_{1},…,s_{M}} with corresponding probabilities p_{i}∈{p_{1},…,p_{M}}, has the Shannon entropy defined as
\(H(s)=-\sum_{i=1}^{M}p_{i}\log_{2}(p_{i}).\)
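The Shannon entropy of a discrete distribution can be computed in a few lines (a sketch; zero-probability outcomes are skipped, following the usual 0·log 0 = 0 convention):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                    # 0·log 0 = 0 by convention
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(shannon_entropy([1.0, 0.0]))   # 0.0 bits: no uncertainty
```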
From the Shannon entropy, many other entropy measures have emerged. One of the extensions of the Shannon entropy has been presented by Rényi [40].
The Rényi entropy of order α, where α≥0 and α≠1 [23], is defined as
\(H_{\alpha}(s)=\frac{1}{1-\alpha}\log_{2}\left(\sum_{i=1}^{M}p_{i}^{\alpha}\right).\)
Depending on the chosen α, different entropy measures are obtained. For α=0, the resulting entropy is known as the Hartley entropy. As \(\alpha \to 1\), H(s) tends to the Shannon entropy, while α=2 gives the collision entropy, used in quantum information theory, which bounds the collision probability of the distribution.
When \(\alpha \to +\infty\), the resulting entropy is known as the min-entropy.
When the TFD entropy is calculated, odd integer values are suggested for the parameter α, as the contribution of the cross-terms' oscillatory structures cancels under integration with odd powers [24, 40].
The definition of the Rényi entropy can be extended to continuous random variables by
\(H_{\alpha}(s)=\frac{1}{1-\alpha}\log_{2}\left(\int_{-\infty}^{+\infty}p^{\alpha}(s)\,\mathrm{d}s\right).\)
When applied to a normalized TFD, the Rényi entropy is defined as
\(H_{\alpha}(C)=\frac{1}{1-\alpha}\log_{2}\left(\iint C^{\alpha}(t,f)\,\mathrm{d}t\,\mathrm{d}f\right),\)
where C(t,f) is the normalized TFD of the signal.
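For a discretized TFD, the integral becomes a sum over the time-frequency grid. The following sketch (illustrative names; α=3, an odd order as suggested above) shows the expected behavior: a concentrated distribution has low entropy, a noise-like uniform one has maximal entropy.

```python
import numpy as np

def renyi_entropy_tfd(C, alpha=3):
    """Rényi entropy of order alpha of a discretized TFD C(t, f);
    the distribution is normalized to unit sum first."""
    C = np.asarray(C, dtype=float)
    C = C / C.sum()                           # normalize the TFD
    return np.log2(np.sum(C ** alpha)) / (1 - alpha)

# a TFD concentrated in a single point has minimal entropy ...
peaky = np.zeros((64, 64)); peaky[32, 32] = 1.0
# ... while a uniform (noise-like) TFD has maximal entropy
flat = np.ones((64, 64))
print(renyi_entropy_tfd(peaky))   # 0 bits: fully concentrated
print(renyi_entropy_tfd(flat))    # 12 bits: log2 of 64*64 points
```

Note that quadratic TFDs can take negative values, which is one reason odd α orders are preferred: the oscillating cross-term contributions tend to cancel under the summation.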
2.3 The proposed method
The proposed method, aimed at extracting useful information from noisy signals, relies on the hypothesis that a two-dimensional entropy map could provide a more suitable substrate for a sensitive extraction procedure than classical extraction procedures operating directly on TFDs. After obtaining the TFD of the signal, for each point in the distribution, the local entropy is calculated over square windows with sizes ranging from one sample up to one tenth of the signal size as
The different window sizes are defined as
where
and
The entropy values \(H_{\rho (t,f)}^{\Delta }(t,f)\) for each window size are given as input to the RICI algorithm, which determines the window size for the given point based on the entropy changes. The window chosen by the RICI algorithm, in this case, corresponds to the first inflection point when the entropy values are modeled as a curve, suggesting that a change in entropy behavior has occurred. Here, the change in entropy behavior indicates the point at which noise starts to influence the entropy measure.
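The per-point entropy curve over growing windows can be sketched as follows (an illustrative sketch, not the paper's implementation; names such as `local_renyi_entropies` and the window parametrization by half-width are assumptions). The resulting curve H(Δ) is what the RICI rule receives as input:

```python
import numpy as np

def local_renyi_entropies(C, t, f, max_half, alpha=3):
    """Local Rényi entropies at TFD point (t, f) for a family of
    growing square windows; each window is renormalized before
    the entropy is taken."""
    C = np.asarray(C, dtype=float)
    out = []
    for d in range(1, max_half + 1):
        w = C[max(0, t - d):t + d + 1, max(0, f - d):f + d + 1]
        p = w / w.sum()
        out.append(np.log2(np.sum(p ** alpha)) / (1 - alpha))
    return np.array(out)   # entropy as a function of window size

# around an isolated component, the entropy grows once the window
# starts absorbing the surrounding noise floor
rng = np.random.default_rng(0)
C = 0.01 * rng.random((128, 128))
C[64, 64] = 1.0
H = local_renyi_entropies(C, 64, 64, 12)
print(H[0], H[-1])   # small window: low entropy; large window: higher
```

The knee of this curve, where noise begins to dominate the local estimate, is exactly what the RICI selection described next is meant to locate.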
For every t and f the Rényi entropy \(H_{\rho (t,f)}^{\text {RICI}}(t,f)\) is calculated so that
where
H^{Δ} represents the entropy calculation at the desired point for a specified window size.
The algorithm results are produced by observing the intersection of the confidence intervals of the signal entropy for the given window size with the confidence intervals of the other proposed window sizes. The aim of applying the RICI rule to \(H_{\rho (t,f)}^{\Delta }(t,f)\) is to track the interval in which the change in the growth of the entropy occurs.
After the calculation is performed for every pair of t and f, the optimal entropy map is obtained
where N represents the number of time samples and M the number of frequency bins. The RICI algorithm selects the desired window size for the entropy calculation by tracking the existence and estimating the amount of the intersection of confidence intervals.
In the RICI algorithm, the number of overlapping confidence intervals is calculated to reduce the estimation bias. The method calculates N confidence intervals for each M(n). To produce the function M(n) with a noticeable difference between the signal and the noise entropy, the overlapping of the confidence intervals is calculated, and Δ^{+}(n) denotes the ideal interval, i.e., the last index with the lowest estimation error [41]. The estimation error is calculated as the pointwise mean squared error (MSE)
\(\mathrm{MSE}(n,\Delta)=\omega^{2}(n,\Delta)+\sigma^{2}(n,\Delta),\)
where σ(n,Δ) represents the estimation standard deviation and ω(n,Δ) the estimation bias.
In [42–44], the asymptotic estimation error is shown to satisfy the following properties, where β is a constant that is not signal-dependent
β>1 when Δ>Δ^{+}, and β<1 when Δ<Δ^{+}. The ideal window size Δ^{+} is the one providing the optimal bias-to-variance trade-off, resulting in the best estimate M(n,Δ^{+}).
Every confidence interval is defined by its lower and upper limits
The lower confidence interval limit L(n,Δ) is defined as
\(L(n,\Delta)=M(n,\Delta)-\Gamma\cdot\sigma(n,\Delta),\)
and the upper confidence interval limit U(n,Δ) is defined as
\(U(n,\Delta)=M(n,\Delta)+\Gamma\cdot\sigma(n,\Delta),\)
where Γ is the threshold parameter of the confidence intervals.
The RICI rule, compared to the original intersection of confidence intervals (ICI) rule, additionally tracks the amount of overlap of the confidence intervals, defined as
for Δ=1,2,⋯,L. In order to obtain a value belonging to the finite interval [0,1], O(n,Δ) is divided by the size of the confidence interval D(n,Δ), resulting in R(n,Δ) defined as
For the optimal window width selection by the RICI rule, the previously described procedure can be expressed as
where R_{c} is a chosen threshold [41, 45, 46]. The window width Δ^{+} obtained by the RICI rule is defined as
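The selection step can be sketched as follows. This is a simplified illustration, not the paper's exact rule (the overlap bookkeeping here uses the running intersection of all intervals seen so far; names, Γ=1.5 and R_c=0.85 are assumed values): the largest window whose confidence interval still overlaps sufficiently with the previous ones is kept.

```python
import numpy as np

def rici_select(m, sigma, gamma=1.5, r_c=0.85):
    """Simplified RICI-style window selection. m[k] is the estimate
    for the k-th window size, sigma[k] its standard deviation.
    Returns the index of the largest window whose confidence
    interval still sufficiently overlaps the running intersection."""
    m, sigma = np.asarray(m, float), np.asarray(sigma, float)
    lower = m - gamma * sigma          # L(n, Delta)
    upper = m + gamma * sigma          # U(n, Delta)
    best = 0
    max_lower, min_upper = lower[0], upper[0]
    for k in range(len(m)):
        max_lower = max(max_lower, lower[k])   # running max of lower limits
        min_upper = min(min_upper, upper[k])   # running min of upper limits
        overlap = min_upper - max_lower        # O(n, Delta)
        width = upper[k] - lower[k]            # D(n, Delta)
        if width > 0 and overlap / width >= r_c:
            best = k                           # last "consistent" window
    return best

# estimates that agree for small windows, then drift away
m = np.array([1.0, 1.02, 0.98, 1.01, 1.8, 2.5])
sigma = np.array([0.5, 0.4, 0.3, 0.25, 0.2, 0.15])
print(rici_select(m, sigma))   # selects index 3, before the drift
```

In the proposed method, `m` would be the local entropy curve H(Δ) at a TFD point, so the selected index marks the window size at which noise begins to change the entropy behavior.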
This results in an image of the signal entropy. The flowchart of the algorithm is reported in Fig. 1.
Next, the mask for the original signal is extracted from the previously obtained timefrequency image again by using the RICI thresholding method.
2.4 The RICI thresholding method
To extract a mask from the optimal entropy map, the RICI method is used once again. Namely, the threshold is defined as
For every τ, E(M_{ρ}(t,f,τ)) is calculated; it represents the energy of the distribution when the threshold τ is applied to the entropy map. The energy calculated for every threshold is given as input to the RICI algorithm
With that, the entropy mask is extracted
The next section evaluates the performance of the proposed approach.
3 Results and discussion
3.1 Experimental setup
The method has been tested on four different types of signals, two of which were synthetic. The resulting error map shows the difference between the nonzero elements when the mask of the noise-free signal is subtracted from the mask obtained by the tested method. A correct extraction yields all zeros in the resulting error map, while 1 marks a false negative and −1 a false positive. Two measures were used to evaluate the performance of the proposed method. The first one is accuracy, i.e., the proportion of correctly classified points. In this case, the points where the signal and noise were correctly classified are the 0 elements in the subtraction mask; in the metric calculations, they represent true positives (TP) plus true negatives (TN). True positives (TP) are correctly classified signal points, and true negatives (TN) are correctly classified points where the signal is not present. False negatives (FN) are points where the signal is present but the obtained mask discarded them as noise; their value in the subtraction matrix is 1. False positives (FP) are points where the method misclassified noise as signal; their value in the subtraction matrix is −1. A description of the used points is given in Table 2. Accuracy is then calculated as
\(\mathrm{Accuracy}=\frac{TP+TN}{TP+TN+FP+FN}.\)
As can be seen from the expression above, accuracy is not suitable for unbalanced data sets. In mask extraction, the useful signal occupies only a portion of the whole set. The F1 score is more suitable when there is an uneven class distribution; in this specific case, it is more suitable because the useful signal occupies only a smaller portion of the signal's TFD. The F1 score considers both the precision and the recall of the result, being the harmonic mean of the two:
\(F_{1}=2\cdot\frac{\mathrm{Precision}\cdot\mathrm{Recall}}{\mathrm{Precision}+\mathrm{Recall}},\)
where
\(\mathrm{Precision}=\frac{TP}{TP+FP}\)
and
\(\mathrm{Recall}=\frac{TP}{TP+FN}.\)
Accuracy and F1 score are usually used as metrics for evaluating classification models in machine learning. In this case, they have been used to determine the fit of the obtained mask to the given noise-free signal. FP and FN are the classification counterparts of type I and type II statistical errors. These metrics are used in several papers dealing with image [47, 48] and signal processing [49], such as EEG signals [50, 51]. In addition to numerical results, images of the obtained signal masks are shown in Figs. 3, 4, 5, 6, 7, and 8, where the obtained masks are emphasized in yellow.
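Both metrics follow directly from the four counts on the binary masks (a minimal sketch; the helper name and the 0/1 array encoding are illustrative):

```python
import numpy as np

def mask_metrics(mask, truth):
    """Accuracy and F1 score of an extracted binary mask against
    the noise-free reference mask (0/1 arrays of equal shape)."""
    mask = np.asarray(mask).astype(bool)
    truth = np.asarray(truth).astype(bool)
    tp = np.sum(mask & truth)      # signal points correctly kept
    tn = np.sum(~mask & ~truth)    # noise points correctly rejected
    fp = np.sum(mask & ~truth)     # noise misclassified as signal
    fn = np.sum(~mask & truth)     # signal discarded as noise
    accuracy = (tp + tn) / mask.size
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, f1

truth = np.array([[1, 1, 0, 0], [0, 0, 0, 0]])
mask  = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])   # one FN, one FP
acc, f1 = mask_metrics(mask, truth)
print(acc, f1)   # 0.75 0.5
```

The example makes the imbalance argument concrete: with only two true signal points out of eight, accuracy stays at 0.75 despite the mask missing half of the signal, while the F1 score drops to 0.5.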
3.2 Simulation results
The first tested signal was a combination of three atoms, as shown in Fig. 2. Noise was added with different signal-to-noise ratios (SNRs), and the useful information content extracted from the signals for SNRs of −3 dB and 3 dB is shown in Figs. 3 and 4.
Results are shown in Tables 3 and 4 for the spectrogram and in Tables 5 and 6 for the RIDB distribution. The methods are compared by means of accuracy and F1 score from −3 dB to 10 dB SNR.
The proposed method was compared to the state-of-the-art algorithm based on local entropy in one dimension described in [35], as well as to the RICI thresholding of the TFD and the fixed thresholding of the signal's TFD. The RICI TFD thresholding is performed similarly to the procedure described in Section 2.4, with the only difference that the input to the RICI operator in Eq. 23 is not the energy calculated for different τ of the optimal entropy map, but the energy calculated for different τ of the signal's TFD
The extracted mask is then
A comparison of the results obtained for the signal spectrogram shows that the proposed method outperforms the fixed TFD thresholding, the local entropy-based approach, and the RICI TFD threshold method in most cases.
Figure 3 shows the results obtained for the first synthetic signal with SNR=−3 dB. Figure 3a shows the spectrogram of the noisy signal. Figure 3b and c represent the optimal entropy maps for the spectrogram and the RIDB, respectively. Results for the RICI thresholding are in Fig. 3d for the spectrogram and in Fig. 3e for the RIDB. The result of fixed thresholding is in Fig. 3f.
A comparison of the methods' metrics for the spectrogram is reported in Tables 3 and 4. The fixed thresholding has the highest error, while the proposed method gives results similar to the RICI TFD thresholding. The local entropy-based algorithm does not perform as well as the proposed method. While the proposed method has a higher accuracy by 0.001, the RICI TFD threshold has a slightly higher F1 score. The local entropy-based method performs worse than both the fixed threshold and the proposed method when applied to the spectrogram in this case of SNR=−3 dB.
The proposed method performs far better on the RIDB distribution (Tables 5 and 6). The local entropy-based algorithm does not appear to be suitable for the RIDB distribution. When compared to the proposed method and the RICI TFD threshold, the differences between the methods' measurements are much greater than in the case of the spectrogram. The proposed method has a higher accuracy by 0.102 and a higher F1 score by 0.018 when compared to the RICI TFD thresholding.
The fixed threshold method scores lower than both the proposed method and the RICI TFD threshold for both the spectrogram and the RIDB distribution. The local entropy-based algorithm does not perform as well as the proposed method or the RICI TFD thresholding for low SNR values.
The results obtained for SNR=3 dB are shown in Fig. 4. Figure 4a reports the spectrogram of the noisy signal. The optimal entropy maps for the spectrogram and the RIDB are in Fig. 4b and c. The results of the RICI thresholding are in Fig. 4d for the spectrogram and in Fig. 4e for the RIDB. The result of fixed thresholding is in Fig. 4f.
For the spectrogram (Tables 3 and 4), the RICI TFD threshold achieves the best result, with an F1 score higher than that of the proposed method by 0.066 in the case of the first signal's spectrogram. The local entropy-based method has an accuracy lower than the proposed method by 0.015, but its F1 score is better by 0.011.
The proposed method still gives better results when applied to the RIDB distribution. It outperforms the RICI TFD threshold by 0.055 in accuracy and by 0.054 in F1 score, and the fixed threshold by 0.094 in accuracy and by 0.051 in F1 score. The entropy-based method has a lower accuracy by 0.041 and a lower F1 score by 0.073.
As can be seen, the proposed method produces results similar to the RICI TFD thresholding. Namely, it presents slightly better performance in all cases, except for the first signal when SNR=3 dB (in this case, the F1 score is highest for the RICI thresholding, while the accuracy measure is still higher for the proposed method). Differences in accuracy for the first signal range from 0.001, in the case of the spectrogram, to 0.102 in the case of the RIDB distribution.
The results for the second multicomponent synthetic signal with added noise with SNR= −3 dB are reported in Fig. 5.
Figure 5a reports the spectrogram of the noisy signal. Figure 5b shows the obtained optimal entropy map from the spectrogram, and Fig. 5c represents the obtained optimal entropy map from the RIDB distribution. The map obtained from the RICI threshold is in Fig. 5d for the spectrogram, and in Fig. 5e for the RIDB. The result of fixed thresholding is in Fig. 5f.
The results for the spectrogram are presented in Tables 3 and 4. The accuracy of the proposed method is larger by 0.008 when compared to the RICI TFD threshold, and by 0.019 when compared to the best fixed threshold. The F1 measure of the proposed method is higher by 0.001 than that of the RICI TFD threshold and by 0.005 than the highest F1 score of the fixed threshold. The local entropy-based method, in this case, has the highest accuracy but the lowest F1 score when compared to the proposed method and the RICI TFD threshold.
From Tables 5 and 6, it is visible that, just as in the case of the first signal, the proposed method outperforms the other three. The differences in accuracy are 0.058 and 0.091 with respect to the RICI TFD threshold and the fixed threshold, respectively. Even though the accuracy is higher for the proposed method, the F1 score is in favor of the RICI TFD threshold by 0.081. The local entropy-based method has a slightly higher accuracy than the proposed method, but it also has a very low F1 score. In terms of the measures, the proposed method, in this case, has a higher accuracy than the RICI TFD threshold and a higher F1 score than the entropy-based method.
The results for SNR=3 dB are in Fig. 6. Figure 6a represents the spectrogram of the noisy signal. The optimal entropy map is in Fig. 6b for the spectrogram and in Fig. 6c for the RID. Figure 6d and e report the RICI TFD threshold results for the spectrogram and the RID, while Fig. 6f reports the results of the fixed threshold.
The difference in accuracy between the proposed method and the RICI TFD threshold for the spectrogram is 0.002 (Table 3), while between the proposed method and the fixed threshold it is 0.011. The F1 score (Table 4) is higher for the proposed method compared to all other methods.
When the methods are applied to the RID distribution, the accuracy (Table 5) of the proposed method is higher by 0.058 compared to the RICI TFD threshold and by 0.012 compared to the entropy-based method.
A considerably larger improvement by the proposed method can be observed for the RID distribution, where it exceeds all other approaches. Specifically, compared to the RICI thresholding, the proposed optimal entropy map increases the accuracy for different SNRs by 0.055 to 0.1 in the case of the first signal and by 0.017 to 0.058 in the case of the second signal, i.e., improvements of 5.86 to 11.53% and 1.83 to 6.79%, respectively. Compared to the local entropy-based algorithm, the accuracy increases by 0.041 to 0.062 for the first signal and by up to 0.026 for the second signal, i.e., improvements of 4.03 to 6.65% and up to 2.77%, respectively.
Differences between the results obtained with the spectrogram and the RID distribution are substantial. The RICI thresholding performs considerably better on the signal's spectrogram, regardless of the tested signal. The proposed method performs better on the RID in the case of the first signal, while for the second signal, the results are better for the spectrogram.
The optimal entropy map provides results similar to the local entropy-based method and the RICI TFD threshold when applied to the spectrogram, but it outperforms them when applied to the RID distribution. All three methods are preferable to fixed thresholding.
3.3 Reallife examples
The proposed method has been applied to real-life signals, namely a dolphin sound and a seismic signal.
In Fig. 7, the results obtained by the methods for the first real-life signal are displayed. In Fig. 7a, the RID distribution of the original signal is shown. The optimal entropy maps are in Fig. 7b and c for the spectrogram and the RID distribution. Figure 7d and e present the maps obtained on the same distributions by means of the RICI TFD threshold. The result for the fixed threshold of 30% is in Fig. 7f.
In Fig. 8, the extracted maps for a seismic signal are reported. Figure 8a shows the original signal’s TFD. Figure 8b shows the optimal entropy map extracted from the spectrogram and Fig. 8c shows the optimal entropy map extracted from the RID distribution. Figure 8d and e present the maps obtained from the RICI TFD threshold for the same distributions. The result for the fixed threshold of 5% is reported in Fig. 8f.
It is difficult to draw conclusions in the case of the real-life signals, as numerical results cannot be obtained. Visually, the results of all tested methods are similar in the case of the dolphin sound signal.
For the seismic signal, the largest difference between the obtained signal maps for the different approaches seems to be in the case of the spectrogram, where the optimal map preserves more of the signal. The RID distribution, unlike in the case of the dolphin sound, seems to preserve more of the signal.
4 Conclusion
Here, we introduced a method for blind source separation of signal components and extraction of useful information from noisy TFDs based on a 2D local Rényi entropy. The method uses adaptive windows, the sizes of which are calculated utilizing the RICI rule. One of the advantages of the approach is that it does not require any specific knowledge of the signal or noise. Also, the proposed technique performs well for different TFDs, as shown in the paper for different SNRs. The method has been applied to both synthetic and real-world signals. When compared to fixed TFD thresholding, the adaptive approach in which the RICI rule is applied directly to TFD thresholding, and the current 1D local entropy-based method, the proposed adaptive 2D Rényi entropy approach is shown to significantly increase classification accuracy and F1 score. Hence, the method can be used as an efficient tool for extracting useful information from noisy data in the time-frequency domain. Future work will explore combining the proposed approach with machine learning techniques to yield additional classification improvements.
Availability of data and materials
Please contact the authors for data requests.
Abbreviations
FT: Fourier transform
ICI: Intersection of confidence intervals
MSE: Mean squared error
RID: Reduced interference distribution
RICI: Relative intersection of confidence intervals
STFT: Short-time Fourier transform
SNR: Signal-to-noise ratio
TFD: Time-frequency distribution
References
B. Boashash, Time-Frequency Signal Analysis and Processing: A Comprehensive Reference (Elsevier Academic Press, Australia, 2016).
Z. Hong, W. Qingping, P. Yujian, T. Ning, Y. Naichang, in 2015 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC). A sea corner-reflector jamming identification method based on time-frequency feature (Ningbo, 2015).
P. A. Karthick, G. Venugopal, S. Ramakrishnan, Analysis of surface EMG signals under fatigue and non-fatigue conditions using B-distribution based quadratic time-frequency distribution. J. Mech. Med. Biol. 15(2) (2015).
M. A. Colominas, M. E. S. H. Jomaa, N. Jrad, A. Humeau-Heurtier, P. Van Bogaert, Time-varying time-frequency complexity measures for epileptic EEG data analysis. IEEE Trans. Biomed. Eng. 65(8), 1681–1688 (2018).
M. Noor Muhammad Hamdi, A. Z. Sha’ameri, Time-frequency representation of radar signals using Doppler-lag block searching Wigner-Ville distribution. Adv. Electr. Electron. Eng. 16 (2018).
Z. Wang, Y. Wang, L. Xu, in Communications, Signal Processing, and Systems. CSPS 2017. Lecture Notes in Electrical Engineering. Time-frequency ridge-based parameter estimation for sinusoidal frequency modulation signals (Springer, Singapore, 2019).
A. Mjahad, A. Rosado-Muñoz, J. F. Guerrero-Martínez, M. Bataller-Mompeán, J. V. Francés-Villora, M. K. Dutta, Detection of ventricular fibrillation using the image from time-frequency representation and combined classifiers without feature extraction. Appl. Sci. 8(11) (2018).
Y. Zhao, S. Han, J. Yang, L. Zhang, H. Xu, J. Wang, A novel approach of slope detection combined with Lv’s distribution for airborne SAR imagery of fast moving targets. Remote Sens.10:, 764 (2018).
D. Gabor, Theory of communication. Part 1: The analysis of information. J. Inst. Electr. Eng. Part III Radio Commun. 93, 429–457 (1946).
S. G. Mallat, Z. Zhang, Matching pursuits with time-frequency dictionaries. IEEE Trans. Sig. Process. 41(12), 3397–3415 (1993).
J. A. Tropp, Greed is good: algorithmic results for sparse approximation. IEEE Trans. Inf. Theory. 50(10), 2231–2242 (2004).
S. Chen, D. Donoho, M. Saunders, Atomic decomposition by basis pursuit. SIAM Rev.43(1), 129–159 (2001).
I. F. Gorodnitsky, B. D. Rao, Sparse signal reconstruction from limited data using focuss: a reweighted minimum norm algorithm. IEEE Trans. Sig. Process.45(3), 600–616 (1997).
H. Mohimani, M. Babaie-Zadeh, C. Jutten, A fast approach for overcomplete sparse decomposition based on smoothed ℓ0-norm. IEEE Trans. Sig. Process. 57(1), 289–301 (2009).
J. Wen, H. Liu, S. Zhang, M. Xiao, A new fuzzy KEVD orthogonal complement space clustering method. Neural Comput. Appl.24(1), 147–154 (2014).
E. Eqlimi, B. Makkiabadi, in 2015 23rd European Signal Processing Conference (EUSIPCO). An efficient K-SCA based underdetermined channel identification algorithm for online applications, (2015), pp. 2661–2665.
P. Addabbo, C. Clemente, S. L. Ullo, in 2017 IEEE International Workshop on Metrology for AeroSpace (MetroAeroSpace). Fourier independent component analysis of radar micro-Doppler features, (2017), pp. 45–49.
A. Belouchrani, M. Amin, Blind source separation based on time-frequency signal representations. IEEE Trans. Sig. Process. 46(11), 2888–2897 (1998).
F. Feng, M. Kowalski, Underdetermined reverberant blind source separation: sparse approaches for multiplicative and convolutive narrowband approximation. IEEE/ACM Tran. Audio Speech. Lang. Process.27(2), 442–456 (2019).
T. H. Yi, X. J. Yao, C. X. Qu, H. N. Li, Clustering number determination for sparse component analysis during outputonly modal identification. J. Eng. Mech.145:, 04018122 (2019).
P. Zhou, Y. Yang, S. Chen, Z. Peng, K. Noman, W. Zhang, Parameterized model based blind intrinsic chirp source separation. Digit Sig. Process.83:, 73–82 (2018).
S. Senay, Time-frequency BSS of biosignals. Healthcare Technol. Lett. 5(6), 242–246 (2018).
P. Flandrin, R. G. Baraniuk, O. Michel, in Proc. IEEE Int. Conf. Acoustics, Speech and Signal Processing (ICASSP’94). Time-frequency complexity and information, (1994), pp. 329–332.
R. G. Baraniuk, P. Flandrin, A. J. E. M. Janssen, O. J. J. Michel, Measuring time-frequency information content using the Rényi entropies. IEEE Trans. Inf. Theory 47(4), 1391–1409 (2001).
K. E. Hild, D. Erdogmus, J. Príncipe, Blind source separation using Rényi’s mutual information. IEEE Sig. Process. Lett. 8(6), 174–176 (2001).
D. Erdogmus, K. E. Hild II, J. C. Principe, Blind source separation using Rényi’s α-marginal entropies. Neurocomputing 49(1–4), 25–38 (2002).
K. E. Hild, D. Pinto, D. Erdogmus, J. C. Principe, Convolutive blind source separation by minimizing mutual information between segments of signals. IEEE Trans. Circ. Syst. I Regular Papers 52(10), 2188–2196 (2005).
K. E. Hild II, D. Erdogmus, J. C. Principe, An analysis of entropy estimators for blind source separation. Sig. Process. 86(1), 182–194 (2006).
X. Yao, T. Yi, C. Qu, H. Li, Blind modal identification using limited sensors through modified sparse component analysis by time-frequency method. Comput.-Aided Civil Infrastruct. Eng. 33 (2018).
F. Ye, J. Chen, L. Gao, W. Nie, Q. Sun, A mixing matrix estimation algorithm for the timedelayed mixing model of the underdetermined blind source separation problem. Circ. Syst. Sig. Process., 1–18 (2018).
Q. Guo, G. Ruan, L. Qi, A complexvalued mixing matrix estimation algorithm for underdetermined blind source separation. Circ. Syst. Sig. Process.37(8), 3206–3226 (2018).
F. Ye, J. Chen, L. Gao, W. Nie, Q. Sun, A mixing matrix estimation algorithm for the timedelayed mixing model of the underdetermined blind source separation problem. Circ. Syst. Sig. Process.38:, 1–18 (2018).
Q. Guo, C. Li, R. Guoqing, Mixing matrix estimation of underdetermined blind source separation based on data field and improved FCM clustering. Symmetry 10, 21 (2018).
X. Y. Zhang, W. R. Wang, C. Y. Shen, Y. Sun, L. X. Huang, in Advances in Intelligent Information Hiding and Multimedia Signal Processing, ed. by J. S. Pan, P. W. Tsai, J. Watada, and L. C. Jain. Extraction of EEG components based on time-frequency blind source separation (Springer, Cham, 2018), pp. 3–10.
N. Saulig, Z. Milanovic, C. Ioana, A local entropy-based algorithm for information content extraction from time-frequency distributions of noisy signals. Digit. Sig. Process. 70 (2017).
F. Hlawatsch, G. F. Boudreaux-Bartels, Linear and quadratic time-frequency signal representations. IEEE Sig. Process. Mag. 9(2), 21–67 (1992).
L. Cohen, Time-frequency distributions - a review. Proc. IEEE 77(7), 941–981 (1989).
Z. Guo, L.-G. Durand, H. C. Lee, The time-frequency distributions of nonstationary signals based on a Bessel kernel. IEEE Trans. Sig. Process. 42(7), 1700–1707 (1994).
C. E. Shannon, A mathematical theory of communication. Bell Syst. Tech. J.27(3), 379–423 (1948).
A. Rényi, in Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics. On measures of entropy and information (University of California Press, Berkeley, 1961), pp. 547–561.
J. Lerga, M. Vrankic, V. Sucic, A signal denoising method based on the improved ICI rule. IEEE Sig. Process. Lett.15:, 601–604 (2008).
A. Goldenshluger, A. Nemirovski, On spatial adaptive estimation of nonparametric regression. Math. Methods Stat.6: (1997).
V. Katkovnik, A new method for varying adaptive bandwidth selection. IEEE Trans. Sig. Process.47:, 2567–2571 (1999).
K. Egiazarian, V. Katkovnik, L. Astola, in 2001 IEEE International Conference on Acoustics, Speech, and Signal Processing, Proceedings (Cat. No. 01CH37221), vol. 3. Adaptive window size image denoising based on ICI rule, (2001), pp. 1869–1872.
G. Segon, J. Lerga, V. Sucic, Improved LPA-ICI-based estimators embedded in a signal denoising virtual instrument. Sig. Image Video Process. 11 (2016).
J. Lerga, M. Franušić, V. Sucic, Parameters analysis for the time-varying automatically adjusted LPA based estimators. J. Autom. Control Eng. 2, 203–208 (2014).
G. Blanco, A. J. M. Traina, C. Traina Jr., P. M. Azevedo-Marques, A. E. S. Jorge, D. de Oliveira, M. V. N. Bedo, A superpixel-driven deep learning approach for the analysis of dermatological wounds. Comput. Methods Prog. Biomed. 183, 105079 (2020).
H. Li, H. Li, J. Kang, Y. Feng, J. Xu, Automatic detection of parapapillary atrophy and its association with children myopia. Comput. Methods Prog. Biomed.183:, 105090 (2020).
F. M. Bayer, A. J. Kozakevicius, R. J. Cintra, An iterative wavelet threshold for signal denoising. Sig. Process.162:, 10–20 (2019).
M. Sharma, S. Singh, A. Kumar, R. S. Tan, U. R. Acharya, Automated detection of shockable and non-shockable arrhythmia using novel wavelet-based ECG features. Comput. Biol. Med. 115, 103446 (2019).
J. S. Lee, S. J. Lee, M. Choi, M. Seo, S. W. Kim, QRS detection method based on fully convolutional networks for capacitive electrocardiogram. Expert Syst. Appl.134:, 66–78 (2019).
Funding
This work was fully supported by the Croatian Science Foundation under the projects IP2018013739 and IP2020024358, by the Center for Artificial Intelligence and Cybersecurity, University of Rijeka, by the University of Rijeka under the projects uniritehnic1817 and uniritehnic1815, and by the European Cooperation in Science and Technology (COST) under the project CA17137.
Author information
Authors and Affiliations
Contributions
Conceptualization, A.V., J.L., and N.S.; methodology, A.V. and J.L.; software, A.V.; validation, A.V. and N.S.; investigation, A.V. and J.L.; resources, A.V. and J.L.; data curation, J.L.; writing—original draft preparation, A.V.; writing—review and editing, J.L. and N.S.; supervision, J.L.; project administration, J.L.; funding acquisition, N.S. The author(s) read and approved the final manuscript.
Corresponding author
Ethics declarations
Consent for publication
This research does not contain any individual person’s data in any form (including individual details, images, or videos).
Competing interests
The authors declare that they have no competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Vranković, A., Lerga, J. & Saulig, N. A novel approach to extracting useful information from noisy TFDs using 2D local entropy measures. EURASIP J. Adv. Signal Process. 2020, 18 (2020). https://doi.org/10.1186/s13634-020-00679-2
Keywords
 Rényi entropy
 Time-frequency distributions
 Relative intersection of confidence intervals
 Adaptive thresholding