## What is/are Tensor Space?

Tensor Space - The first application of the realization of ${\mathcal U}_{\mathbb Q}(\widehat {\mathfrak {gl}}_{m|n})$ is to determine the action of ${\mathcal U}_{\mathbb Q}(\widehat {\mathfrak {gl}}_{m|n})$ on tensor spaces of the natural representation of $\widehat {\mathfrak {gl}}_{m|n}$.^{[1]}We define the degenerate two-boundary affine Hecke-Clifford algebra $\mathcal{H}_d$, and show it admits a well-defined $\mathfrak{q}(n)$-linear action on the tensor space $M \otimes N \otimes V^{\otimes d}$, where $V$ is the natural module for $\mathfrak{q}(n)$, and $M$, $N$ are arbitrary modules for $\mathfrak{q}(n)$, the Lie superalgebra of Type Q.

^{[2]}In this paper, we propose to seek discriminative representation for multi-dimensional data by learning a structured dictionary in tensor space.

^{[3]}Similar to Rν-TSVM, Rν-TSTM constructs a rough lower margin, a rough upper margin, and a rough boundary in tensor space.

^{[4]}To further improve the discriminative power of representation, we extend the representation to the tensor space while imposing orthogonal constraints on the transformation matrix to effectively reduce feature dimensions.

^{[5]}We show a way to analyze a renormalization group (RG) fixed point in tensor space: write down the tensor RG equation, linearize it around a fixed-point tensor, and diagonalize the resulting linearized RG equation to obtain scaling dimensions.
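The recipe above (write down the RG equation, linearize it around a fixed point, diagonalize the linearized map to read off scaling exponents) can be sketched on a toy one-coupling map. The map, the blocking factor $b = 2$, and the Newton solver below are illustrative assumptions, not the tensor-network RG of the source:

```python
import numpy as np

# Toy one-coupling RG map with a nontrivial fixed point; a scalar stand-in
# for a full tensor RG equation (an assumption for illustration only).
def rg_map(K):
    return 3.0 * K**2 / (1.0 + K**2)

def rg_jacobian(K, eps=1e-6):
    # linearize the RG map around K by a centered finite difference
    return (rg_map(K + eps) - rg_map(K - eps)) / (2.0 * eps)

# Step 1: solve rg_map(K*) = K* by Newton iteration (the nontrivial fixed
# point is unstable, so plain forward iteration would run away from it).
K = 0.5
for _ in range(50):
    K -= (rg_map(K) - K) / (rg_jacobian(K) - 1.0)

# Steps 2-3: diagonalize the (here 1x1) linearized map; each eigenvalue
# lam = b**y yields the scaling exponent y of the corresponding operator.
b = 2.0
lam = np.linalg.eigvals(np.array([[rg_jacobian(K)]]))[0].real
y = np.log(lam) / np.log(b)
```

For a genuine tensor RG the Jacobian is a large matrix acting on tensor space, but the fixed-point/linearize/diagonalize steps are the same.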

^{[6]}Although our results are new already in the matrix case, we decided to present them in tensor space with a reshape operator.

^{[7]}Consider the generalized symmetrizer on the tensor space $U\otimes V^{\otimes m}$, $$ S_{\Lambda}(u\otimes v_1\otimes\cdots\otimes v_m)=\dfrac{1}{|G|}\sum_{\sigma\in G}\Lambda(\sigma)\,u\otimes v_{\sigma^{-1}(1)}\otimes\cdots\otimes v_{\sigma^{-1}(m)} $$ defined by $G$ and $\Lambda$.
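A minimal numerical sketch of this symmetrizer, assuming $G$ is the full symmetric group $S_m$ and $\Lambda$ is a one-dimensional character (trivial or sign); the source allows any $G$ and $\Lambda$, so this is only a special case:

```python
import itertools
import math
import numpy as np

def perm_sign(sigma):
    # sign character Lambda(sigma) = (-1)^{number of inversions}
    s = 1
    for i in range(len(sigma)):
        for j in range(i + 1, len(sigma)):
            if sigma[i] > sigma[j]:
                s = -s
    return s

def generalized_symmetrizer(T, m, Lam):
    """Apply S_Lambda to T, an array of shape (dimU, dimV, ..., dimV),
    axis 0 being the U factor and axes 1..m the V factors. G is taken to
    be the full symmetric group S_m (an assumption for this sketch)."""
    out = np.zeros_like(T, dtype=float)
    for sigma in itertools.permutations(range(m)):
        inv = [0] * m
        for i, s in enumerate(sigma):
            inv[s] = i
        # slot i of the result holds the factor v_{sigma^{-1}(i)}
        out += Lam(sigma) * np.transpose(T, [0] + [inv[i] + 1 for i in range(m)])
    return out / math.factorial(m)

# Example: u in U = R^2, v, w in V = R^2, m = 2
u, v, w = np.array([1.0, 2.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])
T = np.einsum('a,b,c->abc', u, v, w)                    # u (x) v (x) w
sym = generalized_symmetrizer(T, 2, lambda s: 1)        # trivial character
alt = generalized_symmetrizer(T, 2, perm_sign)          # sign character
```

For $m = 2$ the trivial and sign characters give the symmetric and antisymmetric parts of the $V$ factors, which sum back to the original tensor.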

^{[8]}Here, we introduce an enhanced Hilbert embedding-based approach from a cross-covariance operator, termed EHECCO, to map the input Mocap time series to a tensor space built from both 3D skeletal joints and a principal component analysis-based projection.

^{[9]}At the same time, the learning model is extended from vector space to tensor space.

^{[10]}

## support tensor machine

In addition, the weighted support tensor machine is proposed to classify human activities in tensor space while avoiding the outlier sensitivity problem.^{[1]}Although the support tensor machine (STM) has extended the traditional vector-based SVM to tensor space, it fails to deal with multiple classification problems.

^{[2]}In order to process rotating machine faults and identify the information classes in tensor space, the KSTM is then introduced from sets of binary support tensor machine classifiers by the one-against-one parallel strategy.

^{[3]}
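Support tensor machines build a rank-one bilinear classifier $f(X) = u^\top X v + b$ on matrix-shaped samples by alternating over $u$ and $v$. The sketch below substitutes a squared loss (closed-form ridge steps) for the STM's hinge loss so it stays self-contained; it is not the cited methods, only the alternating rank-one idea:

```python
import numpy as np

def ridge_fit(Z, y, lam):
    # closed-form ridge regression with an intercept term
    Z1 = np.hstack([Z, np.ones((len(Z), 1))])
    w = np.linalg.solve(Z1.T @ Z1 + lam * np.eye(Z1.shape[1]), Z1.T @ y)
    return w[:-1], w[-1]

def stm_fit(Xs, y, lam=1e-2, iters=30, seed=0):
    """Alternating scheme for f(X) = u^T X v + b on samples Xs (n, p, q).
    A real STM solves an SVM in each alternation step; squared loss is
    used here instead (an assumption to keep the example dependency-free)."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=Xs.shape[2])
    b = 0.0
    for _ in range(iters):
        u, b = ridge_fit(Xs @ v, y, lam)                         # fix v
        v, b = ridge_fit(np.einsum('nij,i->nj', Xs, u), y, lam)  # fix u
    return u, v, b

# Synthetic sanity check: data with a planted rank-one decision rule.
rng = np.random.default_rng(1)
u0, v0 = rng.normal(size=5), rng.normal(size=4)
Xs = rng.normal(size=(200, 5, 4))
y = np.sign(np.einsum('i,nij,j->n', u0, Xs, v0))
u, v, b = stm_fit(Xs, y)
pred = np.sign(np.einsum('i,nij,j->n', u, Xs, v) + b)
acc = (pred == y).mean()
```

The rank-one constraint on the weight tensor is what distinguishes this from flattening each $X$ into a $pq$-vector and training an ordinary SVM.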

## general linear group

There is a particular tensor unfolding which gives rise to an isomorphism from this tensor space to the general linear group.^{[1]}
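Tensor unfoldings of this kind can be illustrated with the standard mode-$n$ matricization, an invertible linear map from the tensor space to a space of matrices (a generic sketch; the source's particular unfolding is not specified here):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move axis `mode` to the front and flatten the
    rest. It is invertible, so it identifies the tensor space with a
    space of matrices as vector spaces."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    # inverse of unfold: reshape, then move the mode axis back into place
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)

T = np.arange(24.0).reshape(2, 3, 4)
M = unfold(T, 1)          # 3 x 8 matrix
```

Round-tripping through `fold` recovers the original tensor exactly.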

## Order Tensor Space

The notion of the Moore-Penrose inverse of matrices has been extended from matrix space to even-order tensor space with the Einstein product.^{[1]}Unlike most conventional clustering methods which are derived for dealing with matrices, the proposed algorithm performs clustering in a third-order tensor space.
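The Moore-Penrose extension rests on the fact that even-order tensors under the Einstein product behave like matrices after a reshaping. A hedged sketch of that construction (a generic computation, not the cited paper's algorithm):

```python
import numpy as np

def einstein_product(A, B, k=2):
    # Einstein product *_k: contract the last k axes of A
    # with the first k axes of B
    return np.tensordot(A, B, axes=k)

def tensor_pinv(A, k=2):
    """Moore-Penrose inverse of an even-order (2k) tensor with respect
    to the Einstein product, computed through the standard isomorphism
    with matrix space: reshape, pinv, reshape back."""
    rows = int(np.prod(A.shape[:k]))
    cols = int(np.prod(A.shape[k:]))
    M = A.reshape(rows, cols)
    return np.linalg.pinv(M).reshape(A.shape[k:] + A.shape[:k])

rng = np.random.default_rng(0)
A = rng.normal(size=(2, 3, 4, 2))
Ap = tensor_pinv(A)       # shape (4, 2, 2, 3)
```

The Penrose identity $A *_2 A^{+} *_2 A = A$ carries over verbatim from the matrix case.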

^{[2]}To address this issue, in this paper, a novel multiview clustering method is proposed by using t-product in the third-order tensor space.

^{[3]}The proposed receiver exploits a cross-coding approach using a third-order tensor space-time code (TSTC) at the relay, and it does not require a channel reciprocity between uplink and downlink phases, which can be of interest in frequency division duplex relaying systems.

^{[4]}
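The t-product used in these third-order methods multiplies tensors frontal-slice-wise in the Fourier domain along the third mode. A minimal sketch of the standard definition:

```python
import numpy as np

def t_product(A, B):
    """t-product of third-order tensors A (n1 x n2 x n3) and
    B (n2 x n4 x n3): FFT along the third mode, slice-wise matrix
    products, inverse FFT. Equivalent to circular convolution of
    the frontal slices."""
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    Cf = np.einsum('ijk,jlk->ilk', Af, Bf)
    return np.real(np.fft.ifft(Cf, axis=2))

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 4, 5))
B = rng.normal(size=(4, 2, 5))
C = t_product(A, B)       # shape (3, 2, 5)

# Identity tensor of the t-product algebra: identity first frontal
# slice, zeros elsewhere.
I = np.zeros((4, 4, 5))
I[:, :, 0] = np.eye(4)
```

Under this product, third-order tensors form a ring with identity `I`, which is what lets t-SVD-style methods mimic matrix factorizations.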

## Unified Tensor Space

To this end, we propose a kernelized version of tensor-based multiview subspace clustering, referred to as Kt-SVD-MSC, to jointly learn self-representation coefficients in mapped high-dimensional spaces and the correlation among multiple views in a unified tensor space.^{[1]}The functional matrices are then optimized in a unified tensor space to achieve a refinement, such that relevant images are pushed closer together.

^{[2]}

## Dimensional Tensor Space

This article proposes a novel distributed hierarchical tensor deep computation model by condensing the model parameters from a high-dimensional tensor space into a set of low-dimensional subspaces to reduce the bandwidth consumption and storage requirement for federated learning.^{[1]}The author plans to propose a quantum-state-driven framework for language problems and generalize it to a high-dimensional tensor space.

^{[2]}
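One generic way to condense a high-dimensional parameter tensor into a set of low-dimensional subspaces is a truncated higher-order SVD, keeping a few leading directions per mode. This sketch is a stand-in illustrating the compression idea, not the cited hierarchical model:

```python
import numpy as np

def hosvd(T, ranks):
    """Truncated HOSVD: one orthonormal factor matrix per mode, keeping
    ranks[k] leading singular directions in mode k; the core tensor
    lives in the product of the low-dimensional subspaces."""
    factors = []
    for mode, r in enumerate(ranks):
        M = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(M, full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, mode)), 0, mode)
    return core, factors

def reconstruct(core, factors):
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(U, T, axes=(1, mode)), 0, mode)
    return T

rng = np.random.default_rng(0)
T = rng.normal(size=(4, 5, 6))
core, facs = hosvd(T, (2, 2, 2))
n_params = core.size + sum(U.size for U in facs)   # transmitted/stored size
```

Only the small core and the factor matrices need to be communicated, which is where the bandwidth and storage savings come from.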

## tensor space model

To solve this ‘loss of term senses’ problem, we develop a concept-driven deep neural network based upon our semantic tensor space model.^{[1]}The proposed multivariate control chart uses a tensor space model to represent a high-dimensional vector.

^{[2]}
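As a toy illustration of representing a high-dimensional vector in tensor space, one can fold each observation vector into a matrix and work with mode-wise statistics, which need far fewer parameters than a full covariance on the raw vector. This is a generic sketch, not the specific statistic of the cited control chart:

```python
import numpy as np

def fold_vector(x, shape):
    # view a length-(p*q) observation vector as a p x q matrix
    # (a second-order tensor)
    return np.asarray(x).reshape(shape)

rng = np.random.default_rng(0)
p, q = 8, 16
X = rng.normal(size=(100, p * q))                  # 100 high-dim vectors
Xt = np.stack([fold_vector(x, (p, q)) for x in X])

# Mode-wise (row and column) covariances: p*p + q*q parameters instead
# of the (p*q)^2 of a full covariance matrix on the raw vectors.
row_cov = np.einsum('nij,nkj->ik', Xt, Xt) / (100 * q)   # p x p
col_cov = np.einsum('nij,nik->jk', Xt, Xt) / (100 * p)   # q x q
```

This parameter reduction is the usual motivation for moving a monitoring or learning model from vector space to tensor space.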