
| Author: | JoJojar Mek |
| Country: | Dominica |
| Language: | English (Spanish) |
| Genre: | Career |
| Published (Last): | 28 September 2019 |
| Pages: | 253 |
| PDF File Size: | 4.47 Mb |
| ePub File Size: | 6.58 Mb |
| ISBN: | 915-2-59483-198-2 |
| Downloads: | 19774 |
| Price: | Free* [*Free Registration Required] |
| Uploader: | Voodootaxe |

On a different track, kernel methods lead to flexible nonlinear models that have proven successful in many different contexts. Nonetheless, a naïve application of kernel methods does not exploit the structural properties possessed by the given tensorial representations. The goal of this work is to go beyond this limitation by introducing non-parametric tensor-based models.

The proposed framework aims at improving the discriminative power of supervised tensor-based models while still exploiting the structural information embodied in the data. We begin by introducing a feature space formed by multilinear functionals.

The latter can be considered the infinite-dimensional analogue of tensors. Successively, we show how to implicitly map input patterns into such a feature space by means of kernels that exploit the algebraic structure of data tensors. The proposed tensorial kernel links to the MLSVD (multilinear singular value decomposition) and features an interesting invariance property; the approach leads to convex optimization and fits into the same primal-dual framework underlying SVM-like algorithms.
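One way a structure-exploiting kernel of this kind can be sketched is shown below: a product kernel whose factors compare the leading left singular subspaces of the mode-n unfoldings (the subspaces computed by the MLSVD). The function names, the truncation rank `r`, and the Gaussian form of the factors are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def unfold(tensor, mode):
    # Mode-n unfolding: the mode-`mode` fibers become the columns of a matrix.
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def factor_kernel(X, Y, mode, r=2, sigma=1.0):
    # Hypothetical Gaussian factor kernel on the r leading left singular
    # subspaces of the mode-n unfoldings (the factors of the MLSVD).
    Ux = np.linalg.svd(unfold(X, mode), full_matrices=False)[0][:, :r]
    Uy = np.linalg.svd(unfold(Y, mode), full_matrices=False)[0][:, :r]
    # Squared chordal (projection) distance between the two subspaces.
    d2 = np.linalg.norm(Ux @ Ux.T - Uy @ Uy.T, "fro") ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def tensorial_kernel(X, Y, r=2, sigma=1.0):
    # Structure-preserving product kernel: one factor kernel per mode.
    return float(np.prod([factor_kernel(X, Y, m, r, sigma) for m in range(X.ndim)]))
```

By construction the kernel evaluates to 1 when both arguments span the same subspaces in every mode, and decays as the per-mode subspaces drift apart.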

The use of these data structures has been advocated in virtue of certain favorable properties. Additionally, tensor representations naturally result from the experiments performed in a number of domains; see Table 1 for some examples. An alternative representation prescribes to flatten the different dimensions, namely to represent the data as high-dimensional vectors. In this way, however, important structure might be lost.
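A small NumPy illustration (not from the paper) of the difference between flattening and mode-n unfolding: flattening collapses all modes into one long vector, while an unfolding keeps one mode explicit. The "video" data here is a toy placeholder.

```python
import numpy as np

# A hypothetical "video": 8 frames of 4x6 pixels, stored as a 3rd-order tensor.
video = np.arange(8 * 4 * 6, dtype=float).reshape(8, 4, 6)

# Flattening collapses all modes: the fact that axis 0 was time and
# axes 1-2 were spatial is no longer encoded anywhere in the shape.
flat = video.reshape(-1)
print(flat.shape)  # (192,)

# Mode-n unfoldings keep one mode explicit, so per-mode structure
# (e.g. temporal correlations along mode 0) remains accessible.
mode0 = video.reshape(8, -1)                    # frames as rows
mode1 = np.moveaxis(video, 1, 0).reshape(4, -1)  # rows of pixels as rows
print(mode0.shape, mode1.shape)  # (8, 24) (4, 48)
```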

Still, a main drawback of tensor-based learning is that it allows the user to construct models which are affine in the data (in a sense that we clarify later) and hence fail in the presence of nonlinearities. Since the feature map is normally chosen to be nonlinear, a linear model in the feature space corresponds to a nonlinear rule in R^I. When input data are Nth-order arrays, nonetheless, a naïve application of kernel methods amounts to performing flattening first, with a consequent loss of potentially useful structural information.
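The classic XOR problem illustrates why the nonlinear feature map matters: no linear rule in R^2 separates the labels, yet a kernel model does. A minimal sketch, assuming a Gaussian kernel and kernel ridge regression as the learning rule (an illustration, not the paper's estimator):

```python
import numpy as np

def rbf(X, Z, sigma=1.0):
    # Gaussian (RBF) kernel matrix: k(x, z) = exp(-||x - z||^2 / (2 sigma^2)).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# XOR: not linearly separable in R^2, but separable in the RBF feature space.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([-1.0, 1.0, 1.0, -1.0])

# Kernel ridge: alpha = (K + lam I)^{-1} y; f(x) = sum_i alpha_i k(x_i, x).
K = rbf(X, X)
alpha = np.linalg.solve(K + 1e-3 * np.eye(4), y)
pred = np.sign(K @ alpha)
print(pred)  # [-1.  1.  1. -1.]
```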

In this paper we elaborate on a possible framework to extend the flexibility of tensor-based models by kernel-based techniques. We make several contributions. We give a constructive definition of the feature space of infinite-dimensional tensors and show the link with the finite-dimensional tensors that are used in multilinear algebra.

However, the latter does not capture the topological structure underlying a number of objects of interest, such as videos. In turn, such objects often admit a very natural tensorial representation. We then introduce a class of structure-preserving product kernels for tensors that fully exploits the tensorial representation.

This relies on the assumption that the latter is useful for the learning task of interest. We study an invariance property fulfilled by the proposed kernels and introduce the concept of congruence sets. We highlight the relevance of this formalism for pattern recognition and explicitly discuss a class of problems that takes advantage of the new similarity measure.
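A toy check of the kind of invariance at stake (an illustration, not the paper's formal statement): a kernel built from subspace projectors cannot distinguish two orthonormal bases of the same subspace, since the projector depends only on the subspace itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two orthonormal bases for the SAME 2-D subspace of R^5: one is obtained
# from the other by an arbitrary 2x2 orthogonal change of basis Q.
A = np.linalg.qr(rng.standard_normal((5, 2)))[0]
Q = np.linalg.qr(rng.standard_normal((2, 2)))[0]
B = A @ Q

# The projector U U^T is basis-independent: B B^T = A Q Q^T A^T = A A^T,
# so any projector-based (chordal) distance between the two is zero.
d = np.linalg.norm(A @ A.T - B @ B.T, "fro")
print(d < 1e-10)  # True
```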

We elaborate on the primal-dual framework used in Support Vector Machines (SVMs) and related algorithms and discuss implications of the tensor-like primal representation. As such, the largest part of the existing approaches relates to unsupervised methods.

The proposed ideas can be extended to higher-order tensors at the price of an even higher computational complexity. Here we consider tensors of any order and elaborate on a different formalism that leads to convex optimization. The approach fits into the same primal-dual framework underlying SVM-like algorithms while exploiting algebraic properties of tensors in a convenient way. In the next Section we introduce the notation and some basic facts about finite-dimensional tensors and spaces of functions admitting a reproducing kernel.
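As a minimal illustration of the primal-dual machinery referenced above, the sketch below solves the dual linear system of an LS-SVM-style model; the system shown is a simplified form and the data are toy values, not the paper's formulation.

```python
import numpy as np

def lssvm_train(K, y, gamma=10.0):
    # Simplified LS-SVM-style dual linear system:
    #   [ 0    1^T           ] [ b ]   [ 0 ]
    #   [ 1    K + I / gamma ] [ a ] = [ y ]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b and dual variables alpha

# Toy 1-D data with a linear kernel K = X X^T.
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
K = X @ X.T
b, alpha = lssvm_train(K, y)
pred = np.sign(K @ alpha + b)
print(pred)  # [-1. -1.  1.  1.]
```

The point of the dual view is that only the Gram matrix K enters the system, so a tensorial kernel can be dropped in without changing the solver.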

In Section 3 we study spaces of infinite-dimensional tensors, which give rise to product kernels. Successively, in Section 4, we introduce a novel family of structure-preserving factor kernels for tensors.

Section 5 is dedicated to the study of an invariance property possessed by the new kernels. Special attention is devoted to the case where input data are temporal or spatial signals represented via Hankel tensors. In Section 6 we then discuss estimation of non-parametric tensor-based models in the framework of primal-dual techniques. Successively, we validate our findings by presenting experimental results in Section 7.
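For a 1-D signal, the Hankel construction mentioned above can be sketched as follows; higher-order Hankel tensors generalize it by indexing entries as s[i + j + k]. The helper name is illustrative.

```python
import numpy as np

def hankel_matrix(signal, rows):
    # Hankel embedding of a 1-D signal: H[i, j] = signal[i + j],
    # so every anti-diagonal is constant.
    cols = len(signal) - rows + 1
    return np.array([[signal[i + j] for j in range(cols)] for i in range(rows)])

s = np.arange(6)  # toy "temporal signal" 0..5
H = hankel_matrix(s, 3)
print(H)
# [[0 1 2 3]
#  [1 2 3 4]
#  [2 3 4 5]]
```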

We end the paper by drawing our concluding remarks in Section 8.

## Notation and background material

We denote scalars by lower-case letters (a, b, c, ...) and vectors by capital letters (A, B, C, ...). We also use lower-case letters i, j in the meaning of indices and, with some abuse of notation, we use I, J to denote the index upper bounds. We write a_i to mean the ith entry of a vector A.

Similarly, we write a_ij to mean the entry with row index i and column index j in a matrix A. Finally, we will often use gothic letters (A, B, C, ...). In this paper we deal with input data (observations) represented as real-valued Nth-order tensors, which we denote by calligraphic letters (A, B, C, ...).

They are higher-order generalizations of vectors (1st-order tensors) and matrices (2nd-order tensors). Scalars can be seen as tensors of order zero.
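In NumPy terms, these index conventions look like the following (a small illustration, not from the paper):

```python
import numpy as np

a = np.arange(3.0)                     # vector (1st-order tensor): entry a_i
A = np.arange(6.0).reshape(2, 3)       # matrix (2nd-order tensor): entry a_ij
T = np.arange(24.0).reshape(2, 3, 4)   # 3rd-order tensor: entry indexed (i1, i2, i3)

print(a[1], A[1, 2], T[1, 2, 3])  # 1.0 5.0 23.0
```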



## Análisis tensorial: teoría y aplicaciones a la geometría y mecánica de los medios continuos



## Análisis Tensorial (Sokolnikoff)

In mathematics, tensor calculus, tensor analysis, or Ricci calculus is an extension of vector calculus to tensor fields (tensors that may vary over a manifold, e.g. in spacetime). Developed by Gregorio Ricci-Curbastro and his student Tullio Levi-Civita, [1] it was used by Albert Einstein to develop his theory of general relativity. Contrasted with the infinitesimal calculus, tensor calculus allows presentation of physics equations in a form that is independent of the choice of coordinates on the manifold. Tensor calculus has many real-life applications in physics and engineering, including elasticity, continuum mechanics, electromagnetism (see mathematical descriptions of the electromagnetic field), general relativity (see mathematics of general relativity), and quantum field theory. Working with Élie Cartan, a main proponent of the exterior calculus, the influential geometer Shiing-Shen Chern summarized the role of tensor calculus. [2]
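The coordinate independence described above rests on the tensor transformation law; as a standard illustration (not taken from this page), the components of a type-(0, 2) tensor transform under a change of coordinates x → x' as

```latex
T'_{\mu\nu} = \frac{\partial x^{\alpha}}{\partial x'^{\mu}}
              \frac{\partial x^{\beta}}{\partial x'^{\nu}} \, T_{\alpha\beta}
```

so an equation such as T_{\mu\nu} = 0, once established in one coordinate system, automatically holds in every other.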