Note that Gaussian random variables describe variations in the dimensions of gates (or equivalently gate delays), i.e., $d_u = d_u^0 + \psi_u$, where $d_u^0$ is the nominal dimension of the gate.
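As a minimal illustration of this variation model, the sketch below samples perturbed gate dimensions in Python; the nominal value, standard deviation, and gate count are hypothetical and only stand in for whatever process data would be used in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical values (not from the text): nominal dimension d_u^0 and the
# standard deviation of the Gaussian variation psi_u, assumed zero-mean.
d_nominal = 45.0   # nominal gate dimension d_u^0 (assumed value)
sigma_psi = 1.5    # std. dev. of psi_u (assumed value)
num_gates = 8

# d_u = d_u^0 + psi_u, with psi_u ~ N(0, sigma_psi^2)
psi = rng.normal(loc=0.0, scale=sigma_psi, size=num_gates)
d = d_nominal + psi
print(d)
```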
2.2.2 Compressive sensing
The compressive sensing concepts that enable us to reconstruct a sparse vector
from partial measurements are explained here (see [10,15,24]). A vector is called
s-sparse when it has only s non-zero elements. Assume X is an s-sparse N × 1
vector, and assume Y is given by the following equation
Y = UX + e. (2.2)
Vector X is the unknown sparse vector; U is a known K×N measurement matrix
and e is measurement noise. Note that not only are the values of the non-zero
components of X unknown; it is also not known which components are zero. The vector
Y is our observation (measurement). The goal is to estimate the sparse vector X
using the measurement vector Y. To retrieve the vector X, one might choose a
vector $\hat{X}$ that minimizes $\|Y - U\hat{X}\|_2$. Because of the measurement noise and the small
number of measurements, this procedure usually leads to a non-sparse signal.
However, solving the following optimization problem finds a sparse solution:
$$\min_{X,\,e} \; \|X\|_0 + \alpha \|e\|_2 \quad \text{such that} \quad Y = UX + e. \qquad (2.3)$$
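Since the $\ell_0$ objective in (2.3) is combinatorial, practical solvers typically replace it with an $\ell_1$ surrogate (basis pursuit denoising / LASSO) or a greedy method such as orthogonal matching pursuit. The sketch below, with hypothetical problem sizes and a hypothetical regularization weight, illustrates recovering a sparse X from noisy measurements Y = UX + e using scikit-learn's Lasso as an $\ell_1$ relaxation; it is not the solver used in the text.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)

# Hypothetical sizes: N unknowns, K < N measurements, s non-zero entries.
N, K, s = 200, 60, 5

# s-sparse ground-truth vector X (support and values unknown to the solver).
X_true = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
X_true[support] = rng.normal(size=s)

# Known K x N measurement matrix U and noisy measurements Y = U X + e.
U = rng.normal(size=(K, N)) / np.sqrt(K)
e = 0.01 * rng.normal(size=K)
Y = U @ X_true + e

# l1 relaxation of (2.3): minimize ||Y - U X||_2^2 + lambda * ||X||_1.
# The regularization weight is an assumed value, not taken from the text.
lasso = Lasso(alpha=0.005, fit_intercept=False, max_iter=10000)
lasso.fit(U, Y)
X_hat = lasso.coef_

print("recovered support:", np.flatnonzero(np.abs(X_hat) > 1e-3))
print("true support     :", np.sort(support))
```

Under standard compressive sensing conditions on the measurement matrix U, such $\ell_1$-based relaxations recover the same sparse support as the original $\ell_0$ formulation.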