In functional analysis, a unitary operator is a surjective bounded operator on a Hilbert space that preserves the inner product. Unitary operators are usually taken as operating on a Hilbert space, but the same notion serves to define the concept of isomorphism between Hilbert spaces.
Definition 1. A unitary operator is a bounded linear operator U : H → H on a Hilbert space H that satisfies U*U = UU* = I, where U* is the adjoint of U, and I : H → H is the identity operator.
The weaker condition U*U = I defines an isometry. The other weaker condition, UU* = I, defines a coisometry. Thus a unitary operator is a bounded linear operator that is both an isometry and a coisometry,[1] or, equivalently, a surjective isometry.[2]
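In infinite dimensions the two weaker conditions genuinely differ: the unilateral shift on ℓ², which sends (x₁, x₂, …) to (0, x₁, x₂, …), satisfies U*U = I but not UU* = I, so it is an isometry that is not unitary. As an illustrative finite-dimensional check (not drawn from the cited sources; it assumes NumPy is available), the normalized discrete Fourier transform matrix is unitary, and both defining identities can be verified numerically:

```python
import numpy as np

n = 4
# Normalized DFT matrix: U[j, k] = exp(-2*pi*i*j*k/n) / sqrt(n), which is unitary.
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
U = np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)

I = np.eye(n)
print(np.allclose(U.conj().T @ U, I))  # isometry:   U*U = I  -> True
print(np.allclose(U @ U.conj().T, I))  # coisometry: UU* = I  -> True

# The inner product is preserved: <Ux, Uy> = <x, y>.
rng = np.random.default_rng(0)
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = rng.standard_normal(n) + 1j * rng.standard_normal(n)
print(np.isclose(np.vdot(U @ x, U @ y), np.vdot(x, y)))  # True
```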
An equivalent definition is the following:
Definition 2. A unitary operator is a bounded linear operator U : H → H on a Hilbert space H for which the following hold: U is surjective, and U preserves the inner product of the Hilbert space, that is, ⟨Ux, Uy⟩ = ⟨x, y⟩ for all vectors x and y in H.
The notion of isomorphism in the category of Hilbert spaces is captured if domain and range are allowed to differ in this definition. Isometries preserve Cauchy sequences; hence the completeness property of Hilbert spaces is preserved.[3]
The following, seemingly weaker, definition is also equivalent:
Definition 3. A unitary operator is a bounded linear operator U : H → H on a Hilbert space H for which the following hold: the range of U is dense in H, and U preserves the inner product, that is, ⟨Ux, Uy⟩ = ⟨x, y⟩ for all vectors x and y in H.
To see that definitions 1 and 3 are equivalent, notice that U preserving the inner product implies U is an isometry (thus, a bounded linear operator). An isometry has closed range, so the fact that the range of U is dense ensures that U is surjective and therefore has a bounded inverse U−1. It is clear that U−1 = U*.
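The last identity can be made explicit with a short computation (a sketch under the conventions of Definition 3, with the inner product taken linear in the first argument): for all x and y in H,

\[
\langle x,\, U^{*}U y \rangle \;=\; \langle Ux,\, Uy \rangle \;=\; \langle x,\, y \rangle ,
\]

so U*U = I; multiplying on the right by U−1, which exists by the surjectivity argument above, gives U* = U−1.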
Thus, unitary operators are just automorphisms of Hilbert spaces, i.e., they preserve the structure (the vector space structure, the inner product, and hence the topology) of the space on which they act. The group of all unitary operators from a given Hilbert space H to itself is sometimes referred to as the Hilbert group of H, denoted Hilb(H) or U(H).
The linearity requirement in the definition of a unitary operator can be dropped without changing the meaning, because it can be derived from linearity and positive-definiteness of the scalar product: expanding ‖U(λx) − λUx‖² with the help of ⟨Ux, Uy⟩ = ⟨x, y⟩ shows that it vanishes, so U(λx) = λUx by positive-definiteness. Analogously we obtain U(x + y) = Ux + Uy, and hence U is linear. The homogeneity computation is spelled out below.
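One standard way to carry out the expansion (using only preservation of the inner product and the convention that the scalar product is linear in the first argument and conjugate-linear in the second) is:

\[
\begin{aligned}
\| U(\lambda x) - \lambda Ux \|^{2}
  &= \langle U(\lambda x) - \lambda Ux,\; U(\lambda x) - \lambda Ux \rangle \\
  &= \| U(\lambda x) \|^{2} + |\lambda|^{2} \| Ux \|^{2}
     - \langle U(\lambda x),\, \lambda Ux \rangle
     - \langle \lambda Ux,\, U(\lambda x) \rangle \\
  &= |\lambda|^{2} \| x \|^{2} + |\lambda|^{2} \| x \|^{2}
     - \overline{\lambda}\, \langle \lambda x,\, x \rangle
     - \lambda\, \langle x,\, \lambda x \rangle \\
  &= |\lambda|^{2} \| x \|^{2} + |\lambda|^{2} \| x \|^{2}
     - |\lambda|^{2} \| x \|^{2} - |\lambda|^{2} \| x \|^{2} \\
  &= 0 ,
\end{aligned}
\]

so U(λx) = λUx; the additivity U(x + y) = Ux + Uy follows from the same kind of expansion of ‖U(x + y) − Ux − Uy‖².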
In mathematics, an inner product space is a real vector space or a complex vector space with an operation called an inner product. The inner product of two vectors in the space is a scalar, often denoted with angle brackets such as in ⟨a, b⟩. Inner products allow formal definitions of intuitive geometric notions, such as lengths, angles, and orthogonality of vectors. Inner product spaces generalize Euclidean vector spaces, in which the inner product is the dot product or scalar product of Cartesian coordinates. Inner product spaces of infinite dimension are widely used in functional analysis. Inner product spaces over the field of complex numbers are sometimes referred to as unitary spaces. The first usage of the concept of a vector space with an inner product is due to Giuseppe Peano, in 1898.
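For instance (the standard example, stated with the inner product linear in the first argument), on the complex coordinate space Cⁿ the usual inner product of x = (x₁, …, xₙ) and y = (y₁, …, yₙ) is

\[
\langle x,\, y \rangle \;=\; \sum_{i=1}^{n} x_{i}\, \overline{y_{i}} ,
\]

which reduces to the ordinary dot product when the entries are real.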
The Riesz representation theorem, sometimes called the Riesz–Fréchet representation theorem after Frigyes Riesz and Maurice René Fréchet, establishes an important connection between a Hilbert space and its continuous dual space. If the underlying field is the real numbers, the two are isometrically isomorphic; if the underlying field is the complex numbers, the two are isometrically anti-isomorphic. The (anti-) isomorphism is a particular natural isomorphism.
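In concrete terms (the form in which the theorem is usually stated, again with the inner product linear in the first argument): for every continuous linear functional φ on a Hilbert space H there is a unique vector y in H such that

\[
\varphi(x) \;=\; \langle x,\, y \rangle \qquad \text{for all } x \in H ,
\]

and moreover ‖φ‖ = ‖y‖. In the complex case the correspondence φ ↦ y is conjugate-linear, which is the anti-isomorphism mentioned above.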
In mathematics, weak topology is an alternative term for certain initial topologies, often on topological vector spaces or spaces of linear operators, for instance on a Hilbert space. The term is most commonly used for the initial topology of a topological vector space with respect to its continuous dual. The remainder of this article will deal with this case, which is one of the concepts of functional analysis.
In mathematics, particularly linear algebra and functional analysis, a spectral theorem is a result about when a linear operator or matrix can be diagonalized. This is extremely useful because computations involving a diagonalizable matrix can often be reduced to much simpler computations involving the corresponding diagonal matrix. The concept of diagonalization is relatively straightforward for operators on finite-dimensional vector spaces but requires some modification for operators on infinite-dimensional spaces. In general, the spectral theorem identifies a class of linear operators that can be modeled by multiplication operators, which are as simple as one can hope to find. In more abstract language, the spectral theorem is a statement about commutative C*-algebras. See also spectral theory for a historical perspective.
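As a minimal finite-dimensional illustration (an added sketch, assuming NumPy; not taken from the article), a Hermitian matrix A can be written as A = V diag(w) V* with V unitary and w real, which is exactly the diagonalization the spectral theorem guarantees:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = (B + B.conj().T) / 2               # a Hermitian (self-adjoint) matrix

w, V = np.linalg.eigh(A)               # real eigenvalues w, orthonormal eigenvectors in V
print(np.allclose(V @ np.diag(w) @ V.conj().T, A))  # A = V diag(w) V*  -> True
print(np.allclose(V.conj().T @ V, np.eye(3)))       # V is unitary      -> True
print(np.isrealobj(w))                               # eigenvalues real  -> True
```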
In mathematics, a self-adjoint operator on a complex vector space V with inner product is a linear map A that is its own adjoint. If V is finite-dimensional with a given orthonormal basis, this is equivalent to the condition that the matrix of A is a Hermitian matrix, i.e., equal to its conjugate transpose A∗. By the finite-dimensional spectral theorem, V has an orthonormal basis such that the matrix of A relative to this basis is a diagonal matrix with entries in the real numbers. This article deals with applying generalizations of this concept to operators on Hilbert spaces of arbitrary dimension.
In mathematics, specifically functional analysis, a trace-class operator is a linear operator for which a trace may be defined, such that the trace is a finite number independent of the choice of basis used to compute the trace. This trace of trace-class operators generalizes the trace of matrices studied in linear algebra. All trace-class operators are compact operators.
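In symbols (the usual definition, recalled here for orientation): a bounded operator A on a Hilbert space is trace class when Σₖ ⟨(A*A)^{1/2}eₖ, eₖ⟩ < ∞ for some (equivalently, every) orthonormal basis (eₖ), and its trace is then

\[
\operatorname{Tr}(A) \;=\; \sum_{k} \langle A e_{k},\, e_{k} \rangle ,
\]

an absolutely convergent sum whose value is independent of the chosen orthonormal basis.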
In mathematics, a unitary transformation is a linear isomorphism that preserves the inner product: the inner product of two vectors before the transformation is equal to their inner product after the transformation.
In mathematics, an isometry is a distance-preserving transformation between metric spaces, usually assumed to be bijective. The word isometry is derived from the Ancient Greek: ἴσος isos meaning "equal", and μέτρον metron meaning "measure". If the transformation is from a metric space to itself, it is a kind of geometric transformation known as a motion.
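Explicitly, a map f : X → Y between metric spaces (X, d_X) and (Y, d_Y) is distance-preserving when

\[
d_{Y}\bigl(f(a),\, f(b)\bigr) \;=\; d_{X}(a,\, b) \qquad \text{for all } a, b \in X ,
\]

and it is an isometry in the bijective sense when it is also onto; injectivity is automatic from the displayed identity.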
In functional analysis, a discipline within mathematics, given a C*-algebra A, the Gelfand–Naimark–Segal construction establishes a correspondence between cyclic *-representations of A and certain linear functionals on A. The correspondence is shown by an explicit construction of the *-representation from the state. It is named for Israel Gelfand, Mark Naimark, and Irving Segal.
In mathematics as well as physics, a linear operator A acting on an inner product space is called positive-semidefinite if ⟨Ax, x⟩ is real and ⟨Ax, x⟩ ≥ 0 for every x in D(A), where D(A) is the domain of A. Positive-semidefinite operators are denoted as A ≥ 0. The operator is said to be positive-definite, and written A > 0, if ⟨Ax, x⟩ > 0 for all nonzero x in D(A).
In mathematics, spectral theory is an inclusive term for theories extending the eigenvector and eigenvalue theory of a single square matrix to a much broader theory of the structure of operators in a variety of mathematical spaces. It is a result of studies of linear algebra and the solutions of systems of linear equations and their generalizations. The theory is connected to that of analytic functions because the spectral properties of an operator are related to analytic functions of the spectral parameter.
In mathematics, specifically in operator theory, each linear operator A on an inner product space defines a Hermitian adjoint operator A* on that space according to the rule ⟨Ax, y⟩ = ⟨x, A*y⟩, where ⟨·, ·⟩ denotes the inner product of the space.
In mathematics, operator theory is the study of linear operators on function spaces, beginning with differential operators and integral operators. The operators may be presented abstractly by their characteristics, such as bounded linear operators or closed operators, and consideration may be given to nonlinear operators. The study, which depends heavily on the topology of function spaces, is a branch of functional analysis.
In functional analysis, a branch of mathematics, the Borel functional calculus is a functional calculus, which has particularly broad scope. Thus for instance if T is an operator, applying the squaring function s → s² to T yields the operator T². Using the functional calculus for larger classes of functions, we can for example define rigorously the "square root" of the (negative) Laplacian operator −Δ or the exponential e^{it√−Δ}.
In the mathematical discipline of functional analysis, the concept of a compact operator on Hilbert space is an extension of the concept of a matrix acting on a finite-dimensional vector space; in Hilbert space, compact operators are precisely the closure of finite-rank operators in the topology induced by the operator norm. As such, results from matrix theory can sometimes be extended to compact operators using similar arguments. By contrast, the study of general operators on infinite-dimensional spaces often requires a genuinely different approach.
In mathematics, a dissipative operator is a linear operator A defined on a linear subspace D(A) of a Banach space X, taking values in X, such that for all λ > 0 and all x ∈ D(A), ‖(λI − A)x‖ ≥ λ‖x‖.
In mathematics, a commutation theorem for traces explicitly identifies the commutant of a specific von Neumann algebra acting on a Hilbert space in the presence of a trace.
In mathematics, Hilbert spaces allow the methods of linear algebra and calculus to be generalized from (finite-dimensional) Euclidean vector spaces to spaces that may be infinite-dimensional. Hilbert spaces arise naturally and frequently in mathematics and physics, typically as function spaces. Formally, a Hilbert space is a vector space equipped with an inner product that induces a distance function for which the space is a complete metric space. A Hilbert space is a special case of a Banach space.
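Spelled out (standard relations, stated here for concreteness): the inner product induces a norm and a distance by

\[
\| x \| \;=\; \sqrt{\langle x,\, x \rangle}, \qquad d(x,\, y) \;=\; \| x - y \| ,
\]

and completeness means that every sequence that is Cauchy for this distance converges to a limit in the space.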
In mathematics and more precisely in functional analysis, the Aluthge transformation is an operation defined on the set of bounded operators of a Hilbert space. It was introduced by Ariyadasa Aluthge to study p-hyponormal linear operators.
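For reference (the definition as it is usually given; the notation below is standard but not quoted from the source): writing the polar decomposition T = U|T| with |T| = (T*T)^{1/2} and U a partial isometry, the Aluthge transform of T is

\[
\Delta(T) \;=\; |T|^{1/2}\, U\, |T|^{1/2} .
\]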
This is a glossary of terminology in the mathematical field of functional analysis.