Search results for: backward ijk version of gaussian elimination

Number of results: 21,179,831

2006
Mukesh Kumar, Douglas A. Miller

A classification strategy which does not require a priori assumptions about the statistical distribution of training pixels in each class is proposed. This method uses an indicator kriging approach in feature space to classify remotely sensed images incorporating both spectral and textural information of bands. Texture information is used only in cases where spectral information is not sufficie...

1993
Marius Junge

arXiv:math/9302206v1 [math.FA] 4 Feb 1993. Comparing Gaussian and Rademacher cotype for operators on the space of continuous functions. Marius Junge. Abstract: We will prove an abstract comparison principle which translates Gaussian cotype into Rademacher cotype conditions and vice versa. More precisely, let 2 < q < ∞ and T : C(K) → F be a linear, continuous operator. 1. T is of Gaussian cotyp...

Journal: :J. Symb. Comput. 2001
Thom Mulders

Sylvester's identity is a well-known identity which can be used to prove that certain Gaussian elimination algorithms are fraction-free. In this paper we will generalize Sylvester's identity and use it to prove that certain random Gaussian elimination algorithms are fraction-free. This can be used to yield fraction-free algorithms for solving Ax = b (x ≥ 0) and for the simplex method in linear pr...
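For context on the technique this abstract builds on: the classical fraction-free scheme derived from Sylvester's identity is the Bareiss algorithm, in which every division is exact, so an integer matrix stays integer throughout elimination. A minimal sketch (not the paper's generalized or randomized variant, and assuming nonzero leading pivots, i.e. no pivoting):

```python
# Fraction-free Gaussian elimination (Bareiss algorithm), based on
# Sylvester's identity: each division by the previous pivot is exact,
# so no fractions appear when the input matrix is integer.
# Assumes all leading pivots are nonzero (no pivoting performed).
def bareiss_det(A):
    """Determinant of a square integer matrix via fraction-free elimination."""
    M = [row[:] for row in A]   # work on a copy
    n = len(M)
    prev = 1                    # previous pivot (starts at 1)
    for k in range(n - 1):
        for i in range(k + 1, n):
            for j in range(k + 1, n):
                # Sylvester-style 2x2 determinant update; division is exact.
                M[i][j] = (M[i][j] * M[k][k] - M[i][k] * M[k][j]) // prev
            M[i][k] = 0
        prev = M[k][k]
    return M[n - 1][n - 1]      # final entry is det(A)

print(bareiss_det([[2, 3, 1], [4, 1, 5], [6, 2, 2]]))
```

Because all intermediate values are themselves minors of the original matrix, entry growth is polynomially bounded, which is the practical appeal of fraction-free elimination over naive rational arithmetic.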

2017
Tibor Heumann

A single-item ascending auction in which agents observe multidimensional Gaussian signals about their valuation of the good is studied. A class of equilibria is constructed in two steps: (i) the private signals of each agent are projected into a one-dimensional equilibrium statistic, and (ii) the equilibrium strategies are constructed “as if” each agent observed only his equilibrium statistic. ...

2008
Leo T. Butler, Boris Levit

Let Θ be a smooth compact oriented manifold without boundary, imbedded in a Euclidean space E, and let γ be a smooth map of Θ into a Riemannian manifold Λ. An unknown state θ ∈ Θ is observed via X = θ + εξ where ε > 0 is a small parameter and ξ is a white Gaussian noise. For a given smooth prior λ on Θ and smooth estimators g(X) of the map γ we derive a second-order asymptotic expansion for the...

2007
W. Hoffmann, K. Potma (Amsterdam)

The use of threshold pivoting with the purpose to reduce fill-in during sparse Gaussian elimination has been generally acknowledged. Here we describe the application of threshold pivoting in dense Gaussian elimination for improving the performance of a parallel implementation. We discuss the effect on the numerical stability and conclude that the consequences are only of minor importance as lon...

2002
Dan Spielman, Matthew Lepinski

To solve
[1 1; 1 0][x1; x2] = [1; 1],
first factor the matrix to get
[1 0; 1 1][1 1; 0 -1][x1; x2] = [1; 1].
Next solve
[1 0; 1 1][y1; y2] = [1; 1]
to get y = (1, 0), and then solve [1 1; 0 -1][x1; x2] = [1; 0] to get x = (1, 0), which is the solution to the original system. When viewed this way, Gaussian elimination is just LU factorization of a matrix followed by some simple substitutions.
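The "LU factorization plus substitutions" view in this snippet can be sketched as runnable code. A minimal illustration without pivoting, on an arbitrary small system (not the authors' implementation):

```python
# Gaussian elimination viewed as LU factorization followed by two
# triangular solves: L y = b (forward substitution), then U x = y
# (back substitution). Minimal sketch, no pivoting.

def lu_factor(A):
    """Doolittle factorization A = L U, with L unit lower triangular."""
    n = len(A)
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    U = [row[:] for row in A]
    for k in range(n):
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]        # elimination multiplier
            L[i][k] = m
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
    return L, U

def forward_sub(L, b):
    y = []
    for i in range(len(b)):
        y.append(b[i] - sum(L[i][j] * y[j] for j in range(i)))
    return y

def back_sub(U, y):
    n = len(y)
    x = [0.0] * n
    for i in reversed(range(n)):
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (y[i] - s) / U[i][i]
    return x

# Solve a small 2x2 system the "factor, then substitute" way.
A = [[1.0, 1.0], [1.0, -1.0]]
b = [1.0, 1.0]
L, U = lu_factor(A)
x = back_sub(U, forward_sub(L, b))
print(x)
```

The elimination steps only ever touch the matrix, never the right-hand side, which is why one factorization can be reused to solve against many right-hand sides cheaply.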

Journal: :IEEE Journal on Selected Areas in Communications 1998
Cheng-Shang Chang

In this paper, we extend the filtering theory in [6] for deterministic traffic regulation and service guarantees to the matrix setting. Such an extension enables us to model telecommunication networks as linear systems with multiple inputs and multiple outputs under the (min, +)-algebra. Analogous to the scalar setting, there is an associated calculus in the matrix setting, including feedback, conc...

2010
Jay B. Simha

Logistic regression enables us to investigate the relationship between a categorical outcome and a set of explanatory variables. The outcome or response can be either dichotomous (yes, no) or ordinal (low, medium, high). For a dichotomous response we are performing standard logistic regression, and for an ordinal response, a model that uses the standard logistic regression formula with feature selection using forw...

Journal: :CoRR 2004
Young Han Kim

The feedback capacity of the stationary Gaussian additive noise channel has been open, except for the case where the noise is white. Here we obtain the closed-form feedback capacity of the first-order moving average additive Gaussian noise channel. Specifically, the channel is given by Yi = Xi +Zi, i = 1, 2, . . . , where the input {Xi} satisfies a power constraint and the noise {Zi} is a first...
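The channel model in this abstract, Yi = Xi + Zi with first-order moving-average Gaussian noise, can be simulated in a few lines. A hedged sketch: the MA(1) form Z_i = U_i + beta * U_{i-1} with i.i.d. standard Gaussian {U_i} is assumed, and both the parameter `beta` and the toy input are illustrative choices, not values from the paper:

```python
# Sketch of the channel Y_i = X_i + Z_i, where {Z_i} is first-order
# moving-average Gaussian noise: Z_i = U_i + beta * U_{i-1},
# with {U_i} i.i.d. N(0, 1). beta is an illustrative parameter.
import random

def ma1_channel(x, beta=0.5, seed=0):
    rng = random.Random(seed)        # seeded for reproducibility
    u_prev = rng.gauss(0.0, 1.0)
    y = []
    for xi in x:
        u = rng.gauss(0.0, 1.0)
        z = u + beta * u_prev        # MA(1) noise sample
        y.append(xi + z)
        u_prev = u
    return y

x = [1.0] * 5                        # a toy input sequence
y = ma1_channel(x)
print(len(y))                        # one output sample per input
```

In the feedback setting studied by the paper, each Xi may additionally depend on the past outputs Y1, ..., Y(i-1); the sketch above only models the forward channel itself.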
