Title: | A New Algorithm for Minimization without Derivatives |
Speaker: | Prof. M. J. D. Powell, Department of Applied Mathematics and Theoretical Physics, University of Cambridge, UK |
Time/Place: | 11:30 - 12:30 RRS 905 |
Abstract: | The development of this subject during the last 50 years was surveyed by the author in the William Benter Lecture at City University on February 7th. His recent research on unconstrained minimization that provided the NEWUOA algorithm was mentioned briefly. The main ideas of this algorithm with an extension that allows bounds on the variables will be described. Quadratic models are employed that are derived from a small number of interpolation conditions. Some numerical results will show the efficiency that is achieved. |
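The abstract's central idea, a quadratic model built from a small number of interpolation conditions and minimized within a trust region, can be illustrated with a toy sketch. This is not Powell's NEWUOA update (which maintains and revises its interpolation set with far more care); it simply fits a separable quadratic from 2n+1 function samples and takes one trust-region step. All names and parameters below are illustrative.

```python
import numpy as np

def dfo_quadratic_step(f, x0, delta=0.5):
    """One derivative-free model step (an illustrative sketch, not
    Powell's actual NEWUOA formulas): interpolate f at 2n+1 points
    to build a quadratic model with a diagonal Hessian, then
    minimize that model inside a trust region of radius delta."""
    x0 = np.asarray(x0, dtype=float)
    n = x0.size
    f0 = f(x0)
    g = np.empty(n)   # model gradient
    b = np.empty(n)   # model (diagonal) curvature
    for i in range(n):
        e = np.zeros(n); e[i] = delta
        fp, fm = f(x0 + e), f(x0 - e)
        g[i] = (fp - fm) / (2 * delta)
        b[i] = (fp + fm - 2 * f0) / delta**2
    # Minimize the separable model coordinate-wise; fall back to a
    # full step down the gradient where the curvature is not positive.
    s = np.where(b > 0, -g / np.maximum(b, 1e-12), -np.sign(g) * delta)
    return x0 + np.clip(s, -delta, delta)
```

On an exactly quadratic objective the model is exact, so a single step from a nearby point reaches the minimizer; a real method would also shrink or expand `delta` based on how well the model predicted the actual decrease.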
Title: | CMIV Lecture Series: Optimization for Image Processing (Lecture 3) |
Speaker: | Prof. Mila Nikolova, CMLA ENS de Cachan, France |
Time/Place: | 14:30 - 16:30 FSC 1217 |
Title: | Optimal Superconvergent Quadratic and Cubic Spline Collocation Methods |
Speaker: | Prof. Graeme Fairweather, Department of Mathematical and Computer Sciences, Colorado School of Mines, USA |
Time/Place: | 11:30 - 12:30 FSC 1217 |
Abstract: | Quadratic and cubic spline collocation methods are popular techniques for solving boundary value problems for ordinary and partial differential equations and for the spatial discretization of time dependent problems. When used in their basic form, they provide approximations which are no more than second order accurate. In this talk, we provide an overview of recent developments in the derivation of optimal methods. In particular, we describe new methods for elliptic problems in the unit square which are not only of optimal accuracy but possess certain superconvergence properties. Moreover, these methods are constructed so that the collocation equations can be solved using matrix decomposition algorithms (MDAs). MDAs are fast direct methods which employ fast Fourier transforms and require O(N^2 log N) operations on an N x N uniform partition of the unit square. We present the results of numerical experiments which exhibit the expected optimal global convergence rates as well as superconvergence phenomena. |
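The FFT-based structure of a matrix decomposition algorithm can be shown on a simpler stand-in: the five-point finite-difference Poisson system rather than the talk's spline collocation equations. The second-difference matrix with Dirichlet boundary conditions is diagonalized by the discrete sine transform, so the solve reduces to two transforms plus a pointwise division, O(N^2 log N) in total on an N x N grid. The function names below are our own.

```python
import numpy as np

def dst1_ortho(X, axis=0):
    """Orthonormal DST-I along `axis`, built from numpy's FFT of an
    odd extension; this transform is symmetric and its own inverse."""
    X = np.moveaxis(X, axis, 0)
    N = X.shape[0]
    Z = np.zeros((2 * N + 2,) + X.shape[1:])
    Z[1:N + 1] = X
    Z[N + 2:] = -X[::-1]
    out = -np.fft.fft(Z, axis=0).imag[1:N + 1] / 2
    out *= np.sqrt(2.0 / (N + 1))
    return np.moveaxis(out, 0, axis)

def mda_solve(F, h):
    """Solve A U + U A = F with A = tridiag(-1, 2, -1)/h^2
    (the 2-D discrete Poisson equation, Dirichlet boundary) by
    diagonalizing A in the sine basis."""
    N = F.shape[0]
    k = np.arange(1, N + 1)
    lam = 4.0 * np.sin(k * np.pi / (2 * (N + 1))) ** 2 / h ** 2  # eigenvalues of A
    Fhat = dst1_ortho(dst1_ortho(F, axis=0), axis=1)   # O(N^2 log N)
    Uhat = Fhat / (lam[:, None] + lam[None, :])        # O(N^2)
    return dst1_ortho(dst1_ortho(Uhat, axis=0), axis=1)
```

The collocation MDAs in the talk have the same three-phase shape (transform, solve decoupled systems, transform back), with the sine transform replaced by the eigenbasis of the relevant collocation matrices.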
Title: | CMIV Lecture Series: Optimization for Image Processing (Lecture 4) |
Speaker: | Prof. Mila Nikolova, CMLA ENS de Cachan, France |
Time/Place: | 14:30 - 16:30 FSC 1217 |
Title: | Agglomerative Fuzzy K-means Clustering Algorithm with Selection of Number of Clusters |
Speaker: | Mr. Junjie Li, Department of Mathematics, Hong Kong Baptist University, HKSAR, China |
Time/Place: | 14:30 - 15:30 FSC 1217 |
Abstract: | The k-means algorithm is well known for its efficiency in clustering large data sets. Fuzzy versions of the k-means algorithm allow each pattern to have memberships in all clusters rather than a distinct membership in one single cluster. Numerous problems in real-world applications, such as pattern recognition and computer vision, can be tackled effectively by fuzzy k-means algorithms. There are two major issues in the application of k-means-type (non-fuzzy or fuzzy) algorithms in cluster analysis. The first is that the number of clusters k needs to be determined in advance as an input to these algorithms. The second is that k-means-type algorithms are very sensitive to the initial cluster centers. We propose an agglomerative fuzzy k-means clustering algorithm to tackle these two issues. The new algorithm extends the standard fuzzy k-means algorithm by introducing a penalty term into the objective function, making the clustering process insensitive to the initial cluster centers. The new algorithm can produce more consistent clustering results from different sets of initial cluster centers. Combined with cluster validation techniques, it can also determine the number of clusters in a data set. Experimental results have demonstrated the effectiveness of the new algorithm in producing consistent clustering results and determining the correct number of clusters in different data sets, some with overlapping inherent clusters. Further and promising work is also discussed. |
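The mechanism the abstract describes, a fuzzy k-means objective augmented with a penalty term, can be sketched as follows. We use an entropy-style penalty, under which the membership update becomes a closed-form softmax of negative squared distances; this is one common choice and is not claimed to be the exact penalty of the talk. The function and parameter names (`lam`, `n_iter`) are ours.

```python
import numpy as np

def penalized_fuzzy_kmeans(X, k, lam=1.0, n_iter=100, seed=0):
    """Sketch of fuzzy k-means with an entropy penalty added to the
    objective (an assumption, not necessarily the talk's exact term).
    The penalty turns the membership update into a softmax over
    negative squared distances, which softens the dependence on the
    initial cluster centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # squared distances from every point to every center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        # membership update: softmax of -d2 / lam, rows sum to 1
        U = np.exp(-(d2 - d2.min(axis=1, keepdims=True)) / lam)
        U /= U.sum(axis=1, keepdims=True)
        # center update: membership-weighted means
        centers = (U.T @ X) / U.sum(axis=0)[:, None]
    return centers, U
```

Larger `lam` makes memberships fuzzier (clusters can merge, which is where the "agglomerative" behavior and cluster-number selection come in); as `lam` approaches 0 the update degenerates to hard k-means assignments.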
Title: | CMIV Lecture Series: Optimization for Image Processing (Lecture 5) |
Speaker: | Prof. Mila Nikolova, CMLA ENS de Cachan, France |
Time/Place: | 14:30 - 16:30 FSC 1217 |
Title: | A Smoothed Bootstrap Test for Independence Based on Mutual Information |
Speaker: | Mr. Edmond Wu, Department of Statistics and Actuarial Science, The University of Hong Kong, HKSAR, China |
Time/Place: | 11:30 - 12:30 FSC 1217 |
Abstract: | In this talk, we study a computational test for independence of multivariate time series data based on mutual information. We first construct a test between a pair of i.i.d. (over time) data sets and then extend it to high-dimensional and serially dependent time series data. The smoothed bootstrap method is used to estimate the null distribution of mutual information. The experimental results reveal that the proposed bootstrap test performs satisfactorily and can achieve high power even for moderate sample sizes. Furthermore, we adopt the proposed test in independent component analysis (ICA) applications as a crucial verification step to validate whether the 'independent' sources estimated by various ICA algorithms are really independent. |
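The testing recipe in the abstract can be sketched in a few lines: estimate mutual information, then approximate its null distribution under independence by resampling each margin separately with added noise (a smoothed bootstrap). The plug-in histogram estimator, the bandwidth choice, and the function names below are our assumptions, not the talk's exact construction.

```python
import numpy as np

def hist_mi(x, y, bins=8):
    """Plug-in mutual information estimate from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def smoothed_bootstrap_mi_test(x, y, n_boot=200, bandwidth=0.1, seed=0):
    """Sketch of a smoothed-bootstrap independence test: resample x
    and y independently of each other, jittered by Gaussian noise of
    scale `bandwidth` (the smoothing), to draw from the null
    distribution of the MI statistic; the p-value is the upper-tail
    proportion. Details are illustrative, not the speaker's exact method."""
    rng = np.random.default_rng(seed)
    obs = hist_mi(x, y)
    n = len(x)
    null = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(x, n) + bandwidth * rng.standard_normal(n)
        yb = rng.choice(y, n) + bandwidth * rng.standard_normal(n)
        null[b] = hist_mi(xb, yb)
    return obs, float((null >= obs).mean())
```

For strongly dependent pairs the observed MI sits far above the bootstrap null samples, so the p-value is near 0; for independent pairs it falls inside the null distribution.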
Title: | CMIV Lecture Series: Optimization for Image Processing (Lecture 6) |
Speaker: | Prof. Mila Nikolova, CMLA ENS de Cachan, France |
Time/Place: | 14:30 - 16:30 FSC 1217 |
We organize conferences and workshops every year. We hope to see you at a future event.
Prof. M. Cheng, Dr. Y. S. Hon, Dr. K. F. Lam, Prof. L. Ling, Dr. T. Tong and Prof. L. Zhu have been awarded research grants by the Hong Kong Research Grants Council (RGC) — congratulations!