In mathematical analysis, there is a class of Sobolev inequalities relating norms, including those of Sobolev spaces. These are used to prove the Sobolev embedding theorem, giving inclusions between certain Sobolev spaces, and the Rellich–Kondrachov theorem, showing that under slightly stronger conditions some Sobolev spaces are compactly embedded in others.
When Ω is a ball, the above inequality is called a (p,p)-Poincaré inequality; for more general domains Ω, the above is more familiarly known as a Sobolev inequality. The necessity to subtract the average value can be seen by considering constant functions for which the derivative is zero while, without subtracting the average, we can have ...
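In standard notation, the (p,p)-Poincaré inequality described above can be sketched as follows (the constant C depends on p and on the domain Ω; subtracting the mean value u_Ω is what rules out the constant-function counterexample):

```latex
\| u - u_\Omega \|_{L^p(\Omega)} \le C \, \| \nabla u \|_{L^p(\Omega)},
\qquad
u_\Omega = \frac{1}{|\Omega|} \int_\Omega u \, dx .
```

For a nonzero constant function u, the right-hand side vanishes while the left-hand side also vanishes only because the average has been subtracted.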
In mathematics, logarithmic Sobolev inequalities are a class of inequalities involving the norm of a function f, its logarithm, and its gradient. These inequalities were discovered and named by Leonard Gross, who established them in dimension-independent form, [1] [2] in the context of constructive quantum field theory. Similar results were ...
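One commonly cited Gaussian form of the logarithmic Sobolev inequality (a sketch; here γ denotes the standard Gaussian measure on ℝⁿ, and the constant matches the usual normalization, which varies across sources) is:

```latex
\int_{\mathbb{R}^n} f^2 \log\!\frac{f^2}{\|f\|_{L^2(\gamma)}^2} \, d\gamma
\;\le\; 2 \int_{\mathbb{R}^n} |\nabla f|^2 \, d\gamma .
```

The dimension-independence noted above is visible here: no constant in the inequality grows with n.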
In mathematics, and in particular in mathematical analysis, the Gagliardo–Nirenberg interpolation inequality is a result in the theory of Sobolev spaces that relates the $L^p$-norms of different weak derivatives of a function through an interpolation inequality.
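A standard statement of the Gagliardo–Nirenberg inequality can be sketched as follows (the exponents and the admissible range of θ carry side conditions omitted here):

```latex
\| D^{j} u \|_{L^{p}} \;\le\; C \, \| D^{m} u \|_{L^{r}}^{\theta} \, \| u \|_{L^{q}}^{1-\theta},
\qquad
\frac{1}{p} = \frac{j}{n} + \theta\!\left(\frac{1}{r} - \frac{m}{n}\right) + \frac{1-\theta}{q},
\quad \frac{j}{m} \le \theta \le 1 .
```

The inequality interpolates: an intermediate derivative $D^{j}u$ is controlled by a higher derivative $D^{m}u$ and the function itself.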
In mathematical analysis, Trudinger's theorem or the Trudinger inequality (also sometimes called the Moser–Trudinger inequality) is a result of functional analysis on Sobolev spaces. It is named after Neil Trudinger (and Jürgen Moser). It provides an inequality between a certain Sobolev space norm and an Orlicz space norm of a function.
The trace operator can be defined for functions in the Sobolev spaces $W^{1,p}(\Omega)$ with $1 \le p < \infty$; see the section below for possible extensions of the trace to other spaces. Let $\Omega \subset \mathbb{R}^{n}$ for $n \in \mathbb{N}$ be a bounded domain with Lipschitz boundary.
Bennett's inequality, an upper bound on the probability that the sum of independent random variables deviates from its expected value by more than any specified amount. Bhatia–Davis inequality, an upper bound on the variance of any bounded probability distribution.
The Sobolev conjugate of $p$ for $1 \le p < n$, where $n$ is the space dimensionality, is $p^{*} = \frac{pn}{n-p} > p$. This is an important parameter in the Sobolev inequalities.
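The conjugate exponent is simple arithmetic, so it can be checked numerically. A minimal sketch (the function name `sobolev_conjugate` is a hypothetical helper, not from the source):

```python
from fractions import Fraction

def sobolev_conjugate(p, n):
    """Sobolev conjugate p* = p n / (n - p), defined for 1 <= p < n."""
    if not 1 <= p < n:
        raise ValueError("requires 1 <= p < n")
    return Fraction(p * n, n - p)

# For p = 2 in dimension n = 3 (the classical H^1(R^3) embedding),
# the conjugate exponent is 2*3 / (3-2) = 6:
print(sobolev_conjugate(2, 3))  # → 6
```

Using exact rationals avoids floating-point noise and makes the property $p^{*} > p$ easy to verify for any admissible pair.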