Abstract
We consider the problem of detecting (testing) Gaussian stochastic sequences (signals) with imprecisely known means and covariance matrices. An alternative is independent identically distributed zero-mean Gaussian random variables with unit variances. For a given false alarm (1st-kind error) probability, the quality of minimax detection is given by the best miss probability (2nd-kind error probability) exponent over a growing observation horizon. We study the maximal set of means and covariance matrices (composite hypothesis) such that its minimax testing can be replaced with testing a single particular pair consisting of a mean and a covariance matrix (simple hypothesis) without degrading the detection exponent. We completely describe this maximal set.
References
Wald, A., Statistical Decision Functions, New York: Wiley, 1950. Translated under the title Statisticheskie reshayushchie funktsii, in Pozitsionnye igry (Positional Games), Moscow: Nauka, 1967, pp. 300–522.
Lehmann, E.L., Testing Statistical Hypotheses, New York: Wiley, 1959. Translated under the title Proverka statisticheskikh gipotez, Moscow: Nauka, 1979.
Poor, H.V., An Introduction to Signal Detection and Estimation, New York: Springer-Verlag, 1994, 2nd ed.
Zhang, W. and Poor, H.V., On Minimax Robust Detection of Stationary Gaussian Signals in White Gaussian Noise, IEEE Trans. Inform. Theory, 2011, vol. 57, no. 6, pp. 3915–3924. https://doi.org/10.1109/TIT.2011.2136210
Burnashev, M.V., On Detection of Gaussian Stochastic Sequences, Probl. Peredachi Inf., 2017, vol. 53, no. 4, pp. 49–68 [Probl. Inf. Transm. (Engl. Transl.), 2017, vol. 53, no. 4, pp. 349–367]. https://doi.org/10.1134/S0032946017040044
Bellman, R., Introduction to Matrix Analysis, New York: McGraw-Hill, 1960. Translated under the title Vvedenie v teoriyu matrits, Moscow: Nauka, 1976.
Horn, R.A. and Johnson, C.R., Matrix Analysis, Cambridge: Cambridge Univ. Press, 1985. Translated under the title Matrichnyi analiz, Moscow: Mir, 1989.
Burnashev, M.V., On Minimax Detection of Gaussian Stochastic Sequences and Gaussian Stationary Signals, Probl. Peredachi Inf., 2021, vol. 57, no. 3, pp. 55–72 [Probl. Inf. Transm. (Engl. Transl.), 2021, vol. 57, no. 3, pp. 248–264]. https://doi.org/10.1134/S0032946021030042
Kullback, S., Information Theory and Statistics, New York: Wiley, 1959. Translated under the title Teoriya informatsii i statistika, Moscow: Nauka, 1967.
Burnashev, M.V., On Stein’s Lemma in Hypotheses Testing in General Non-Asymptotic Case, Stat. Inference Stoch. Process., 2022, Online First article. https://doi.org/10.1007/s11203-022-09278-4
Burnashev, M.V., On the Minimax Detection of an Inaccurately Known Signal in a White Gaussian Noise Background, Teor. Veroyatnost. i Primenen., 1979, vol. 24, no. 1, pp. 106–118 [Theory Probab. Appl. (Engl. Transl.), 1979, vol. 24, no. 1, pp. 107–119]. https://doi.org/10.1137/1124008
Burnashev, M.V., Discrimination of Hypotheses for Gaussian Measures, and a Geometrical Characterization of Gaussian Distribution, Mat. Zametki, 1982, vol. 32, no. 4, pp. 549–556 [Math. Notes (Engl. Transl.), 1982, vol. 32, no. 4, pp. 757–761]. https://doi.org/10.1007/BF01152385
Petrov, V.V., Summy nezavisimykh sluchainykh velichin, Moscow: Nauka, 1972. Translated under the title Sums of Independent Random Variables, Berlin: Springer, 1975.
Funding
Supported in part by the Russian Foundation for Basic Research, project no. 19-01-00364.
Additional information
Translated from Problemy Peredachi Informatsii, 2022, Vol. 58, No. 3, pp. 70–84. https://doi.org/10.31857/S0555292322030068
Appendix: Proof of Lemma 1
Let \(\boldsymbol{\xi}_n\) be a Gaussian random vector with the distribution \(\boldsymbol{\xi}_n \sim{\mathcal{N}}({\bf0},\boldsymbol{I}_n)\), and let \(\boldsymbol{A}_n\) be a symmetric \(n\times n\) matrix with eigenvalues \(\{a_i\}\). Consider the quadratic form \((\boldsymbol{\xi}_n,\boldsymbol{A}_n\boldsymbol{\xi}_n)\). There exists an orthogonal matrix \(\boldsymbol{T}_n\) such that \(\boldsymbol{T}_n'\boldsymbol{A}_n\boldsymbol{T}_n=\boldsymbol{B}_n\), where \(\boldsymbol{B}_n\) is the diagonal matrix with diagonal entries \(\{a_i\}\) [6, Section 4.7]. Since \(\boldsymbol{T}_n\boldsymbol{\xi}_n \sim{\mathcal{N}}({\bf0},\boldsymbol{I}_n)\), the quadratic forms \((\boldsymbol{\xi}_n,\boldsymbol{A}_n\boldsymbol{\xi}_n)\) and \((\boldsymbol{\xi}_n,\boldsymbol{B}_n\boldsymbol{\xi}_n)\) have the same distribution. Therefore, by equation (12) we have
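The diagonalization step can be checked numerically. The sketch below (a minimal illustration, with an arbitrarily chosen symmetric matrix standing in for \(\boldsymbol{A}_n\)) verifies that \(\boldsymbol{A}_n=\boldsymbol{T}_n\boldsymbol{B}_n\boldsymbol{T}_n'\) with \(\boldsymbol{T}_n\) orthogonal, and that for any realization of \(\boldsymbol{\xi}_n\) the two quadratic forms coincide after the rotation \(\boldsymbol{\xi}_n\mapsto\boldsymbol{T}_n'\boldsymbol{\xi}_n\), which is why they share one distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# An arbitrary symmetric matrix standing in for A_n.
M = rng.standard_normal((n, n))
A = (M + M.T) / 2

# Orthogonal diagonalization A = T B T', B = diag(eigenvalues).
a, T = np.linalg.eigh(A)      # columns of T are orthonormal eigenvectors
B = np.diag(a)
assert np.allclose(T.T @ T, np.eye(n))     # T is orthogonal
assert np.allclose(T @ B @ T.T, A)

# For any vector xi, (xi, A xi) = (T' xi, B T' xi); since T' xi is again
# N(0, I_n) when xi ~ N(0, I_n), both forms have the same distribution.
xi = rng.standard_normal(n)
q_A = xi @ A @ xi
q_B = (T.T @ xi) @ B @ (T.T @ xi)
assert np.allclose(q_A, q_B)
```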
where
Introduce the quantity (see (31))
Then by (71), (72), and (17) for \(\alpha_{\mu}\) from (73), we have
where
To estimate \(P_1\) in (75), we use the following result [13, Section III.5.15]: let \(\zeta_1,\ldots,\zeta_n\) be independent random variables with \(\operatorname{\mathbf E}\nolimits\zeta_i=0\), \(i=1,\ldots,n\). Then for any \(1\le p\le 2\) we have
$$\operatorname{\mathbf E}\biggl|\sum_{i=1}^{n}\zeta_i\biggr|^{p}\le 2\sum_{i=1}^{n}\operatorname{\mathbf E}\nolimits|\zeta_i|^{p}.\qquad(76)$$
Therefore, using Chebyshev's inequality and (76) for \(P_1\), we obtain
To estimate \(P_2\) in (74) and (75), note that
and so
Therefore, using the standard bound
we obtain (recall that \(\xi_i \sim \mathcal{N}(0,1)\))
For the condition \(\alpha_{\mu}\le\alpha\) to be satisfied, we choose \(\mu\) so that \(\max\{P_1,P_2\}\le\alpha/2\). For that, by (77) and (78), it suffices to take \(\mu\) satisfying (32).
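The "standard bound" invoked for \(P_2\) is not reproduced in this excerpt; a common candidate is the Gaussian tail estimate \(\operatorname{\mathbf P}(\xi\ge x)\le e^{-x^2/2}/(x\sqrt{2\pi})\) for \(x>0\). The sketch below (assuming this particular bound, which may differ from the one in the full text) verifies it against the exact tail computed from the complementary error function:

```python
import math

def Q(x):
    # Exact standard normal tail P(xi >= x) via the complementary error function.
    return 0.5 * math.erfc(x / math.sqrt(2))

# Candidate "standard bound": Q(x) <= exp(-x^2/2) / (x * sqrt(2*pi)), x > 0.
for x in [0.5, 1.0, 2.0, 4.0, 8.0]:
    bound = math.exp(-x * x / 2) / (x * math.sqrt(2 * math.pi))
    assert Q(x) <= bound
```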
Cite this article
Burnashev, M. On Minimax Detection of Gaussian Stochastic Sequences with Imprecisely Known Means and Covariance Matrices. Probl Inf Transm 58, 265–278 (2022). https://doi.org/10.1134/S0032946022030061