Title: | ICM Lecture Series: Numerical Multilinear Algebra (Lecture 1: Tensors as Hypermatrices) |
Speaker: | Dr. Lim Lek-Heng, Department of Mathematics, University of California, Berkeley, USA |
Time/Place: | 10:00 - 12:00 FSC1217, Fong Shu Chuen Library, Ho Sin Hang Campus, Hong Kong Baptist University |
Abstract: | We will discuss the various meanings ascribed to the term "tensor" in algebra, analysis, and geometry, as well as in engineering, physics, statistics, and technometrics. Just as linear operators, bilinear forms, and dyads may all be represented by the same matrices, we will see that different types of higher-order tensors may be represented by hypermatrices if we ignore covariance and contravariance. Basic matrix concepts such as norms, rank, and determinants do not, however, have well-known generalizations to hypermatrices --- not unless we look beyond a single field. In fact, the study of tensor norms arose in functional analysis, tensor ranks in computational complexity, and hyperdeterminants in algebraic geometry. We will discuss these and other hypermatrix generalizations of matrix mathematics. |
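As a concrete illustration of the hypermatrix point of view in this abstract, the short sketch below (using NumPy; the array, vectors, and function names are illustrative assumptions, not material from the lecture) stores a 3rd-order tensor as a 2×2×2 array of coordinates and evaluates it as a trilinear form:

```python
import numpy as np

# A 2x2x2 hypermatrix: one coordinate representation of a 3rd-order tensor,
# with covariance and contravariance ignored as in the abstract.
A = np.arange(8, dtype=float).reshape(2, 2, 2)

def trilinear_form(A, x, y, z):
    """Evaluate A(x, y, z) = sum_{i,j,k} a_ijk x_i y_j z_k."""
    return np.einsum('ijk,i,j,k->', A, x, y, z)

x = np.array([1.0, 0.0])   # first standard basis vector
y = np.array([0.0, 1.0])   # second standard basis vector
z = np.array([1.0, 1.0])
value = trilinear_form(A, x, y, z)  # picks out A[0,1,0] + A[0,1,1]
```

The same array `A` could equally well be contracted along only one or two indices, representing different multilinear objects with one set of coordinates, which is the point of the matrix analogy in the abstract.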
Title: | ICM Lecture Series: Numerical Multilinear Algebra (Lecture 2: Multilinear Decompositions) |
Speaker: | Dr. Lim Lek-Heng, Department of Mathematics, University of California, Berkeley, USA |
Time/Place: | 10:00 - 12:00 FSC1217, Fong Shu Chuen Library, Ho Sin Hang Campus, Hong Kong Baptist University |
Abstract: | We will examine two classes of decompositions and their corresponding approximation problems. We will first discuss "secant decompositions," which one may view as r-term decompositions over a dictionary that is a continuously varying manifold or variety. Examples include resolving a tensor into a sum of decomposable tensors, an operator into a sum of Kronecker products, a homogeneous polynomial into a sum of powers of linear forms, a joint probability distribution into a sum of conditional probability distributions, and a multivariate function into a sum of separable functions. For each of these, we will discuss its closely related cousin that takes the form of a "subspace decomposition". We will look at some known results and open problems in the study of such decompositions. |
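The first example in the abstract, resolving a tensor into a sum of decomposable tensors, can be sketched numerically. The alternating scheme below is a higher-order power iteration for the r = 1 secant approximation; it is an illustrative standard heuristic chosen for this note, not necessarily a method from the lecture, and it is not guaranteed to find a global optimum in general.

```python
import numpy as np

def decomposable(u, v, w):
    """A decomposable (rank-one) tensor, the outer product u x v x w."""
    return np.einsum('i,j,k->ijk', u, v, w)

def best_rank_one(T, iters=200):
    """Rank-one approximation of a 3rd-order tensor by alternating
    (higher-order power) iteration: fix two factors, solve for the third."""
    x = np.ones(T.shape[0])
    y = np.ones(T.shape[1])
    z = np.ones(T.shape[2])
    for _ in range(iters):
        x = np.einsum('ijk,j,k->i', T, y, z); x /= np.linalg.norm(x)
        y = np.einsum('ijk,i,k->j', T, x, z); y /= np.linalg.norm(y)
        z = np.einsum('ijk,i,j->k', T, x, y)
    sigma = np.linalg.norm(z)
    return sigma, x, y, z / sigma

# A rank-2 tensor given explicitly as a sum of two decomposable tensors.
T = decomposable(np.array([1.0, 2.0]), np.array([3.0, 1.0]), np.array([1.0, 1.0])) \
    + decomposable(np.array([0.0, 1.0]), np.array([1.0, 0.0]), np.array([2.0, 1.0]))
```

Applied to a tensor that is already decomposable, the iteration recovers it exactly; applied to `T` above it returns the nearest rank-one term of the 2-term decomposition it can reach from its starting point.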
Title: | Implementing the Blocking Gibbs Sampler on a Complex Pedigree |
Speaker: | Dr. Joseph Abraham, Department of Epidemiology and Biostatistics, School of Medicine, Case Western Reserve University, Cleveland, USA |
Time/Place: | 14:30 - 15:30 FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University |
Abstract: | The relevance of Probabilistic Graphical Models for Genetic Analysis will be reviewed, along with some outstanding conceptual problems. A new implementation of the Blocking Gibbs sampler will be discussed with an application to a real life complex pedigree. |
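To fix the basic idea behind the talk's sampler, here is a minimal single-site Gibbs sampler for a bivariate normal with correlation 0.9. A blocking Gibbs sampler, as discussed in the talk, would instead update groups of strongly dependent variables (e.g. within a pedigree) jointly to improve mixing; everything in this sketch, including the target distribution and all names, is an illustrative assumption, not material from the talk.

```python
import numpy as np

# Target: standard bivariate normal with correlation rho.
# Gibbs alternates draws from the two full conditionals:
#   x | y ~ N(rho * y, 1 - rho^2),   y | x ~ N(rho * x, 1 - rho^2).
rho = 0.9
rng = np.random.default_rng(42)
x, y = 0.0, 0.0
samples = []
for _ in range(20000):
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))
    samples.append((x, y))
samples = np.array(samples)[2000:]          # discard burn-in
emp_rho = np.corrcoef(samples.T)[0, 1]      # should approach rho
```

With high correlation the single-site chain mixes slowly, which is exactly the situation where updating a block of variables jointly pays off.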
Title: | ICM Lecture Series: Numerical Multilinear Algebra (Lecture 3: Computations and Applications) |
Speaker: | Dr. Lim Lek-Heng, Department of Mathematics, University of California, Berkeley, USA |
Time/Place: | 10:00 - 12:00 FSC1217, Fong Shu Chuen Library, Ho Sin Hang Campus, Hong Kong Baptist University |
Abstract: | We will be interested in applications of these techniques to data analysis, machine learning, neuroscience, and signal processing. We will present the methods as different ways of generalizing principal components analysis (PCA) using multilinear algebra and the two classes of decompositions. In particular, we will examine a new technique called Principal Cumulant Components Analysis (PCCA) that relies on subspace approximations of cumulants, which are symmetric tensor generalizations of the covariance matrix. We will compute principal cumulant components using a limited memory BFGS algorithm on Grassmannians. This part of the lectures will feature new joint work with Jason Morton of Stanford (PCCA) and Berkant Savas of UT Austin (Grassmannian L-BFGS). |
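For orientation, classical PCA, the order-2 special case that PCCA generalizes, amounts to an eigendecomposition of the covariance matrix (the second cumulant). The sketch below is only that baseline, not the Grassmannian L-BFGS method of the lecture; the synthetic data and variable names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic data whose variance is concentrated along the first coordinate.
X = rng.standard_normal((500, 3)) * np.array([3.0, 1.0, 0.1])

Xc = X - X.mean(axis=0)               # center the data
C = Xc.T @ Xc / (len(X) - 1)          # covariance matrix = second cumulant
evals, evecs = np.linalg.eigh(C)      # eigenvalues in ascending order
components = evecs[:, ::-1]           # principal axes, largest variance first
```

PCCA replaces the covariance matrix with higher-order cumulant tensors and replaces the eigendecomposition with a subspace approximation optimized over a Grassmannian, as described in the abstract.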
Title: | Dimension reduction and variable selection |
Speaker: | Prof. YU Yan, Department of Quantitative Analysis and Operations Management, College of Business, University of Cincinnati, USA |
Time/Place: | 10:30 - 12:00 FSC1217, Fong Shu Chuen Library, HSH Campus, Hong Kong Baptist University |
Abstract: | In this mini workshop, the speakers will discuss recent developments in dimension reduction and variable selection for nonlinear models. |
Title: | ICM Lecture: Graph Helmholtzian and Rank Learning |
Speaker: | Dr. Lim Lek-Heng, Department of Mathematics, University of California, Berkeley, USA |
Time/Place: | 15:30 - 16:30 DLB 618, David Lam Building, Shaw Campus, HKBU |
Abstract: | The graph Helmholtzian is the graph theoretic analogue of the Helmholtz operator or vector Laplacian, in much the same way the graph Laplacian is the analogue of the Laplace operator or scalar Laplacian. We will see that a decomposition associated with the graph Helmholtzian provides a way to learn ranking information from incomplete, imbalanced, and cardinal score-based data. In this framework, an edge flow representing pairwise ranking is orthogonally resolved into a gradient flow (acyclic) that represents the L2-optimal global ranking and a divergence-free flow (cyclic) that quantifies the inconsistencies. If the latter is large, then the data does not admit a statistically meaningful global ranking. A further decomposition of the inconsistent component into a curl flow (locally cyclic) and a harmonic flow (locally acyclic) provides information on the validity of small- and large-scale comparisons of alternatives. This is joint work with Xiaoye Jiang, Yuan Yao, and Yinyu Ye. |
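The least-squares step described above, projecting an edge flow onto gradient flows to obtain the L2-optimal global ranking, can be sketched as follows. This is a hypothetical 4-item example with consistent data, so the cyclic residual vanishes; the edge set, scores, and names are assumptions for illustration, not data from the talk.

```python
import numpy as np

# Comparison graph on 4 items and an edge flow Y[e] = s[j] - s[i]
# for edge e = (i, j): a pure gradient flow generated by true scores s.
edges = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
scores = np.array([3.0, 2.0, 1.0, 0.0])
Y = np.array([scores[j] - scores[i] for i, j in edges])

# Graph gradient (incidence) matrix: (grad s)[e] = s[j] - s[i].
B = np.zeros((len(edges), 4))
for e, (i, j) in enumerate(edges):
    B[e, i], B[e, j] = -1.0, 1.0

# L2-optimal global ranking: least-squares projection onto gradient flows.
s_hat, *_ = np.linalg.lstsq(B, Y, rcond=None)
residual = Y - B @ s_hat   # divergence-free (cyclic) component
```

For real pairwise-comparison data the residual is generally nonzero, and its size, further split into curl and harmonic parts as in the abstract, measures how far the data is from admitting a consistent global ranking.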
Title: | DLS: A Parallel Decomposition Algorithm for Training Multiclass Kernel-based Vector Machines |
Speaker: | Prof. Ya-xiang Yuan, Chinese Academy of Sciences, China |
Time/Place: | 11:30 - 12:30 ACC109, Jockey Club Academic Community Centre, Baptist University Road Campus, HKBU |
Abstract: | In this talk, I will discuss a decomposition method for training Crammer and Singer's multiclass kernel-based vector machine model. A new working set selection rule is proposed, and global convergence of the algorithm based on this rule is established. A projected gradient method is chosen to solve the resulting quadratic subproblem at each iteration, and an efficient projection algorithm is designed by exploiting the structure of the constraints. Parallel strategies are given to utilize the storage and computational resources available on a multiprocessor system. Numerical experiments on benchmark problems demonstrate that good classification accuracy and remarkable time savings can be achieved. |
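As one example of a projection that exploits constraint structure, the sketch below computes the Euclidean projection onto the probability simplex by sorting and thresholding in O(n log n). It illustrates the flavor of structure-exploiting projection inside a projected gradient method, but it is not claimed to be the talk's actual algorithm or constraint set.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex
    {x : x >= 0, sum(x) = 1} via sort + threshold, O(n log n)."""
    u = np.sort(v)[::-1]                  # sort entries in decreasing order
    css = np.cumsum(u)
    # Largest k with u_k * k > (sum of top k) - 1 determines the threshold.
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)
```

Inside a projected gradient loop, each iterate takes a gradient step on the quadratic subproblem and then calls such a projection, so a cheap structured projection is what makes the overall method efficient.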
We organize conferences and workshops every year. We hope to see you at a future event.
Prof. M. Cheng, Dr. Y. S. Hon, Dr. K. F. Lam, Prof. L. Ling, Dr. T. Tong and Prof. L. Zhu have been awarded research grants by the Hong Kong Research Grants Council (RGC). Congratulations!