Tensors now play a central role in the representation, mining, analysis, and fusion of multidimensional, multimodal, and heterogeneous big data across numerous fields.
This set of books on Matrices and Tensors in Signal Processing aims to give a self-contained and comprehensive presentation of various concepts and methods, from fundamental algebraic structures to advanced tensor-based applications, including recently developed tensor models and efficient algorithms for dimensionality reduction and parameter estimation. Although its title suggests an orientation toward signal processing, the results presented in this set will also be of use to readers interested in other disciplines.
This first book provides an introduction to matrices and higher-order tensors, built on the structures of vector spaces and tensor spaces. Some standard algebraic structures are described first, with a focus on the Hilbertian approach to signal representation and on function approximation based on Fourier series and orthogonal polynomial series. Matrices and hypermatrices associated with linear, bilinear, and multilinear maps are studied in particular detail, and some basic results are presented for block matrices. The notions of decomposition, rank, eigenvalue, singular value, and unfolding of a tensor are then introduced, emphasizing the similarities and differences between matrices and higher-order tensors.
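As a concrete illustration of the unfolding (matricization) operation mentioned above, the following Python/NumPy sketch flattens a third-order tensor into a matrix along each of its modes. This is not code from the book; the chosen convention (the selected mode indexes the rows, the remaining modes are merged into columns) is one common choice, and the exact column ordering varies between authors.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: bring `mode` to the front, then flatten the
    remaining modes into the columns of a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# A 2 x 3 x 4 third-order tensor with entries 0..23
T = np.arange(24).reshape(2, 3, 4)

# Each unfolding is an ordinary matrix whose rows index one mode
print(unfold(T, 0).shape)  # (2, 12)
print(unfold(T, 1).shape)  # (3, 8)
print(unfold(T, 2).shape)  # (4, 6)
```

The ranks of these three unfoldings are in general different from one another, which is one of the basic differences between matrices and higher-order tensors alluded to in the text: a matrix has a single rank, while a third-order tensor has three mode-n ranks.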
1. Historical Elements of Matrices and Tensors.
2. Algebraic Structures.
3. Banach and Hilbert Spaces – Fourier Series and Orthogonal Polynomials.
4. Matrix Algebra.
5. Partitioned Matrices.
6. Tensor Spaces and Tensors.
Gérard Favier is currently Emeritus Research Director at CNRS, working at the I3S Laboratory in Sophia Antipolis, France. His research interests include nonlinear system modeling and identification, signal processing applications, tensor models with associated algorithms for big data processing, and tensor approaches for MIMO communication systems.