Complex random vector

From Wikipedia, the free encyclopedia

In probability theory and statistics, a complex random vector is typically a tuple of complex-valued random variables, and generally is a random variable taking values in a vector space over the field of complex numbers. If $Z_1,\ldots,Z_n$ are complex-valued random variables, then the n-tuple $\left(Z_1,\ldots,Z_n\right)^T$ is a complex random vector. Complex random vectors can always be considered as pairs of real random vectors: their real and imaginary parts.

Some concepts of real random vectors have a straightforward generalization to complex random vectors, for example the definition of the mean. Other concepts are unique to complex random vectors.

Applications of complex random vectors are found in digital signal processing.
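As a minimal illustrative sketch (not part of the original article), a complex random vector can be represented in NumPy by drawing its real and imaginary parts as real random vectors; the choice of standard normal components here is a hypothetical example distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

# A complex random vector of length n, built from its real and
# imaginary parts (here: independent standard normal components,
# chosen only for illustration).
n = 3
x = rng.standard_normal(n)   # real part
y = rng.standard_normal(n)   # imaginary part
z = x + 1j * y               # one draw of the complex random vector

# The equivalent real random vector of length 2n interleaves each
# component's real and imaginary parts, as in the definition below.
z_real = np.column_stack([z.real, z.imag]).ravel()

print(z.shape, z_real.shape)  # (3,) (6,)
```

The interleaved real vector carries exactly the same information as the complex vector, which is why real-vector results can be transferred to the complex case.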


Definition

A complex random vector $Z = (Z_1,\ldots,Z_n)^T$ on the probability space $(\Omega,\mathcal{F},P)$ is a function $Z \colon \Omega \to \mathbb{C}^n$ such that the vector $(\Re(Z_1),\Im(Z_1),\ldots,\Re(Z_n),\Im(Z_n))^T$ is a real random vector on $(\Omega,\mathcal{F},P)$, where $\Re(z)$ denotes the real part of $z$ and $\Im(z)$ denotes the imaginary part of $z$.[1]: p. 292

Cumulative distribution function

The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form $P(Z \leq 1+3i)$ make no sense. However, expressions of the form $P(\Re(Z) \leq 1, \Im(Z) \leq 3)$ make sense. Therefore, the cumulative distribution function $F_Z \colon \mathbb{C}^n \to [0,1]$ of a random vector $Z = (Z_1,\ldots,Z_n)^T$ is defined as

$F_Z(z) = P(\Re(Z_1) \leq \Re(z_1), \Im(Z_1) \leq \Im(z_1), \ldots, \Re(Z_n) \leq \Re(z_n), \Im(Z_n) \leq \Im(z_n))$  (Eq.1)

where $z = (z_1,\ldots,z_n)^T$.
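The joint condition on real and imaginary parts can be estimated by Monte Carlo; the following sketch uses a scalar circularly symmetric Gaussian as a hypothetical example distribution (not taken from the article):

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo estimate of F_Z(z) = P(Re Z <= Re z, Im Z <= Im z)
# for a scalar complex Gaussian Z with E[|Z|^2] = 1.
N = 200_000
Z = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

def cdf_estimate(z):
    # The event is a joint condition on real and imaginary parts.
    return np.mean((Z.real <= z.real) & (Z.imag <= z.imag))

# At z = 0 the real and imaginary parts are independent N(0, 1/2),
# each with P(<= 0) = 1/2, so F_Z(0) should be close to 1/4.
print(cdf_estimate(0 + 0j))
```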

Expectation

As in the real case the expectation (also called expected value) of a complex random vector is taken component-wise.[1]: p. 293

$\operatorname{E}[Z] = (\operatorname{E}[Z_1],\ldots,\operatorname{E}[Z_n])^T$  (Eq.2)
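Taken over samples, the component-wise expectation is simply the per-component sample mean; a short sketch with an illustrative (hypothetical) mean of $1+2i$:

```python
import numpy as np

rng = np.random.default_rng(2)

# Component-wise expectation of a complex random vector, estimated
# from N draws: the sample mean of each of the n components.
N, n = 100_000, 2
samples = (rng.standard_normal((N, n))
           + 1j * rng.standard_normal((N, n))
           + (1 + 2j))            # shift so the true mean is 1+2j

mean_est = samples.mean(axis=0)   # E[Z] taken component-wise
print(mean_est)                   # each entry close to 1+2j
```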

Covariance matrix and pseudo-covariance matrix

The covariance matrix (also called second central moment) contains the covariances between all pairs of components. The covariance matrix of an $n \times 1$ random vector is an $n \times n$ matrix whose $(i,j)$th element is the covariance between the $i$th and the $j$th random variables.[2]: p. 372  Unlike in the case of real random variables, the covariance between two random variables involves the complex conjugate of one of the two. Thus the covariance matrix is a Hermitian matrix.[1]: p. 293

$K_{ZZ} = \operatorname{E}[(Z - \operatorname{E}[Z])(Z - \operatorname{E}[Z])^H]$  (Eq.3)

The pseudo-covariance matrix (also called relation matrix) is defined by replacing the Hermitian transposition by transposition in the definition above.

$J_{ZZ} = \operatorname{E}[(Z - \operatorname{E}[Z])(Z - \operatorname{E}[Z])^T]$  (Eq.4)

Properties

The covariance matrix is a Hermitian matrix, i.e.[1]: p. 293

$K_{ZZ}^H = K_{ZZ}$.

The pseudo-covariance matrix is a symmetric matrix, i.e.

$J_{ZZ}^T = J_{ZZ}$.

The covariance matrix is a positive semidefinite matrix, i.e.

$a^H K_{ZZ} a \geq 0 \quad \text{for all } a \in \mathbb{C}^n$.
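The three properties can be checked numerically on sample estimates of $K_{ZZ}$ and $J_{ZZ}$; in this sketch the mixing matrix A is a hypothetical example used only to produce a correlated vector:

```python
import numpy as np

rng = np.random.default_rng(3)

# Sample covariance K and pseudo-covariance J of a complex random
# vector Z = A g (g: i.i.d. complex Gaussian), plus a numerical
# check of the Hermitian / symmetric / PSD properties.
N, n = 100_000, 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Z = (rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))) @ A.T

Zc = Z - Z.mean(axis=0)           # centered samples, one per row
K = Zc.T @ Zc.conj() / N          # K_ij ≈ E[(Z_i-μ_i)(Z_j-μ_j)*]
J = Zc.T @ Zc / N                 # J_ij ≈ E[(Z_i-μ_i)(Z_j-μ_j)]

print(np.allclose(K, K.conj().T))             # Hermitian: K = K^H
print(np.allclose(J, J.T))                    # symmetric: J = J^T
print(np.linalg.eigvalsh(K).min() >= -1e-10)  # positive semidefinite
```

The sample versions satisfy the Hermitian and symmetric properties exactly, since they are built from the same outer products as the definitions.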

Covariance matrices of real and imaginary parts

By decomposing the random vector $Z$ into its real part $X = \Re(Z)$ and imaginary part $Y = \Im(Z)$ (i.e. $Z = X + iY$), the pair $(X,Y)$ has a covariance matrix of the form:

$\begin{bmatrix} V_{XX} & V_{XY} \\ V_{YX} & V_{YY} \end{bmatrix}$

where $V_{XX} = \operatorname{Cov}[X,X]$, $V_{XY} = \operatorname{Cov}[X,Y]$, $V_{YX} = \operatorname{Cov}[Y,X]$ and $V_{YY} = \operatorname{Cov}[Y,Y]$.

The matrices $K_{ZZ}$ and $J_{ZZ}$ can be related to the covariance matrices of $X$ and $Y$ via the following expressions:

$K_{ZZ} = V_{XX} + V_{YY} + i(V_{YX} - V_{XY})$
$J_{ZZ} = V_{XX} - V_{YY} + i(V_{YX} + V_{XY})$

Conversely:

$V_{XX} = \tfrac{1}{2}\Re(K_{ZZ} + J_{ZZ})$
$V_{XY} = \tfrac{1}{2}\Im(J_{ZZ} - K_{ZZ})$
$V_{YX} = \tfrac{1}{2}\Im(K_{ZZ} + J_{ZZ})$
$V_{YY} = \tfrac{1}{2}\Re(K_{ZZ} - J_{ZZ})$
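These relations are algebraic identities, so they hold exactly for sample estimates as well; a short NumPy check (the correlated real/imaginary construction is a hypothetical example):

```python
import numpy as np

rng = np.random.default_rng(4)

# Check the relations between K_ZZ, J_ZZ and the covariance blocks
# of the real and imaginary parts, on sample estimates.
N, n = 50_000, 2
X = rng.standard_normal((N, n))
Y = 0.5 * X + rng.standard_normal((N, n))   # correlated real/imag parts
Z = X + 1j * Y

Xc, Yc, Zc = X - X.mean(0), Y - Y.mean(0), Z - Z.mean(0)
Vxx = Xc.T @ Xc / N                         # Cov[X, X]
Vyy = Yc.T @ Yc / N                         # Cov[Y, Y]
Vxy = Xc.T @ Yc / N                         # Cov[X, Y]
Vyx = Yc.T @ Xc / N                         # Cov[Y, X]
K = Zc.T @ Zc.conj() / N                    # covariance matrix
J = Zc.T @ Zc / N                           # pseudo-covariance matrix

print(np.allclose(K, Vxx + Vyy + 1j * (Vyx - Vxy)))   # True
print(np.allclose(J, Vxx - Vyy + 1j * (Vxy + Vyx)))   # True
```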

Cross-covariance matrix and pseudo-cross-covariance matrix

The cross-covariance matrix between two complex random vectors $Z$, $W$ is defined as:

$K_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])(W - \operatorname{E}[W])^H]$  (Eq.5)

And the pseudo-cross-covariance matrix is defined as:

$J_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])(W - \operatorname{E}[W])^T]$  (Eq.6)

Two complex random vectors $Z$ and $W$ are called uncorrelated if

$K_{ZW} = J_{ZW} = 0$.

Independence

Two complex random vectors $Z = (Z_1,\ldots,Z_n)^T$ and $W = (W_1,\ldots,W_m)^T$ are called independent if

$F_{Z,W}(z,w) = F_Z(z) \cdot F_W(w) \quad \text{for all } z, w$  (Eq.7)

where $F_Z(z)$ and $F_W(w)$ denote the cumulative distribution functions of $Z$ and $W$ as defined in Eq.1 and $F_{Z,W}(z,w)$ denotes their joint cumulative distribution function. Independence of $Z$ and $W$ is often denoted by $Z \perp\!\!\!\perp W$. Written component-wise, $Z$ and $W$ are called independent if

$F_{Z_1,\ldots,Z_n,W_1,\ldots,W_m}(z_1,\ldots,z_n,w_1,\ldots,w_m) = F_{Z_1,\ldots,Z_n}(z_1,\ldots,z_n) \cdot F_{W_1,\ldots,W_m}(w_1,\ldots,w_m) \quad \text{for all } z_1,\ldots,z_n,w_1,\ldots,w_m$.

Circular symmetry

A complex random vector $Z$ is called circularly symmetric if for every deterministic $\varphi \in [-\pi,\pi)$ the distribution of $e^{i\varphi}Z$ equals the distribution of $Z$.[3]: pp. 500–501

Properties
  • The expectation of a circularly symmetric complex random vector is either zero or it is not defined.[3]: p. 500 
  • The pseudo-covariance matrix of a circularly symmetric complex random vector is zero.[3]: p. 584 
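Both properties can be observed numerically for the standard example of a circularly symmetric vector, a zero-mean complex Gaussian with i.i.d. real and imaginary parts of equal variance (a sketch, not part of the article):

```python
import numpy as np

rng = np.random.default_rng(5)

# A circularly symmetric complex Gaussian: i.i.d. real and imaginary
# parts of equal variance. Its sample mean and sample pseudo-
# covariance should both be close to zero.
N, n = 200_000, 2
Z = (rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))) / np.sqrt(2)

Zc = Z - Z.mean(axis=0)
J = Zc.T @ Zc / N                     # pseudo-covariance estimate

print(np.abs(Z.mean(axis=0)).max())   # close to 0 (zero expectation)
print(np.abs(J).max())                # close to 0 (zero pseudo-covariance)
```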

Proper complex random vectors

A complex random vector $Z$ is called proper if the following three conditions are all satisfied:[1]: p. 293

  • $\operatorname{E}[Z] = 0$ (zero mean)
  • $\operatorname{E}[Z Z^T] = 0$ (vanishing pseudo-covariance)
  • $\operatorname{E}[Z^H Z] < \infty$ (all components have finite variance)

Two complex random vectors $Z$, $W$ are called jointly proper if the composite random vector $(Z^T, W^T)^T$ is proper.

Properties
  • A complex random vector $Z$ is proper if, and only if, for all (deterministic) vectors $c \in \mathbb{C}^n$ the complex random variable $c^T Z$ is proper.[1]: p. 293
  • Linear transformations of proper complex random vectors are proper, i.e. if $Z$ is a proper random vector with $n$ components and $A$ is a deterministic $m \times n$ matrix, then the complex random vector $AZ$ is also proper.[1]: p. 295
  • Every circularly symmetric complex random vector with finite variance of all its components is proper.[1]: p. 295
  • There are proper complex random vectors that are not circularly symmetric.[1]: p. 504
  • A real random vector is proper if and only if it is constant.
  • Two jointly proper complex random vectors are uncorrelated if and only if their covariance matrix is zero, i.e. if $K_{ZW} = 0$.
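A sample-based test of properness and of its preservation under linear transformations can be sketched as follows; the helper `is_proper` and its tolerance are hypothetical choices for this illustration:

```python
import numpy as np

rng = np.random.default_rng(6)

def is_proper(Z, tol=0.05):
    """Sample test of properness: near-zero mean and near-zero
    pseudo-covariance (finite variance is automatic for samples)."""
    mean_ok = np.abs(Z.mean(axis=0)).max() < tol
    J = Z.T @ Z / len(Z)              # pseudo-covariance estimate
    return mean_ok and np.abs(J).max() < tol

N, n = 200_000, 3
# Circularly symmetric Gaussian: proper.
Z = (rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))) / np.sqrt(2)
# A deterministic 2x3 matrix A (scaled to keep variances moderate).
A = (rng.standard_normal((2, n)) + 1j * rng.standard_normal((2, n))) / np.sqrt(n)

print(is_proper(Z))          # True: circularly symmetric => proper
print(is_proper(Z @ A.T))    # True: AZ stays proper
print(is_proper(Z + 1))      # False: nonzero mean breaks properness
```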

Cauchy-Schwarz inequality

The Cauchy–Schwarz inequality for complex random vectors is

$\left| \operatorname{E}[Z^H W] \right|^2 \leq \operatorname{E}[Z^H Z] \, \operatorname{E}[W^H W]$.
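Since the inequality also holds for the empirical measure, it can be verified directly on sample averages (the correlated construction of $W$ is a hypothetical example):

```python
import numpy as np

rng = np.random.default_rng(7)

# Sample check of |E[Z^H W]|^2 <= E[Z^H Z] * E[W^H W].
N, n = 10_000, 4
Z = rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))
W = 0.3 * Z + rng.standard_normal((N, n)) + 1j * rng.standard_normal((N, n))

# Z^H W per draw, averaged; then the two second-moment factors.
lhs = np.abs(np.mean(np.sum(Z.conj() * W, axis=1))) ** 2
rhs = (np.mean(np.sum(np.abs(Z) ** 2, axis=1))
       * np.mean(np.sum(np.abs(W) ** 2, axis=1)))

print(lhs <= rhs)   # True: the inequality holds exactly for samples
```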

Characteristic function

The characteristic function of a complex random vector $Z$ with $n$ components is a function $\mathbb{C}^n \to \mathbb{C}$ defined by:[1]: p. 295

$\varphi_Z(\omega) = \operatorname{E}\left[ e^{i \Re(\omega^H Z)} \right]$
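The expectation defining the characteristic function can be estimated by Monte Carlo; for a scalar circularly symmetric Gaussian (a hypothetical example) the closed form $e^{-|\omega|^2/4}$ is available for comparison:

```python
import numpy as np

rng = np.random.default_rng(8)

# Monte Carlo estimate of phi_Z(w) = E[exp(i * Re(w^H Z))] for a
# scalar circularly symmetric Gaussian Z with E[|Z|^2] = 1.
N = 500_000
Z = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

def char_fn(w):
    # For scalars, w^H Z reduces to conj(w) * Z.
    return np.mean(np.exp(1j * np.real(np.conj(w) * Z)))

w = 1 + 1j
est = char_fn(w)
exact = np.exp(-np.abs(w) ** 2 / 4)   # known closed form for this Z
print(est, exact)                     # estimate close to exact value
```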

References

  1. Lapidoth, Amos (2009). A Foundation in Digital Communication. Cambridge University Press. ISBN 978-0-521-19395-5.
  2. Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.
  3. Tse, David (2005). Fundamentals of Wireless Communication. Cambridge University Press.
This page was last edited on 27 April 2023, at 01:22
Basis of this page is in Wikipedia. Text is available under the CC BY-SA 3.0 Unported License. Non-text media are available under their specified licenses. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc. WIKI 2 is an independent company and has no affiliation with Wikimedia Foundation.