Event Title

#27 - The Effects of the Different Configurations of the Periodic Kernels on Machine Learning Problems

Faculty Mentor

Hong Zhang, Hongjun Su

Proposal Type

Poster

Start Date

2-11-2019 3:20 PM

End Date

2-11-2019 4:30 PM

Location

Cleveland Ballroom

Abstract

Kernels, as similarity measures among study objects, are key components of machine learning algorithms such as the Support Vector Machine (SVM) and the Gaussian Process. The performance of a machine learning system often depends heavily on how effectively the kernel represents the similarity between objects, so capturing special structural information of the data in the kernel can improve system performance. One such special property of data attributes is periodicity. Examples of periodic attributes include the “time of day”, “day of the year”, and “position of a wave”. Applications involving seasonal economic data, environmental attributes, and astronomical data can all be linked naturally to periodic structures.
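As a toy illustration of why periodic structure matters (a minimal Python sketch, not taken from the authors' work), consider measuring the similarity of two hours of the day with a plain RBF kernel on the raw values versus with a commonly used exponential-sine-squared periodic kernel: 23:00 and 01:00 are only two hours apart on the clock, yet they look far apart to the RBF kernel. The function names and parameter values below are illustrative assumptions.

import numpy as np

def rbf_kernel(x, y, length_scale=2.0):
    # Standard squared-exponential (RBF) kernel on the raw attribute values.
    return np.exp(-(x - y) ** 2 / (2.0 * length_scale ** 2))

def periodic_kernel(x, y, period=24.0, length_scale=2.0):
    # A commonly used exponential-sine-squared periodic kernel; this is a
    # generic example, not the family proposed in this research.
    return np.exp(-2.0 * np.sin(np.pi * abs(x - y) / period) ** 2
                  / length_scale ** 2)

hour_a, hour_b = 23.0, 1.0            # two hours apart on the clock
print("RBF similarity:     ", rbf_kernel(hour_a, hour_b))       # essentially 0
print("Periodic similarity:", periodic_kernel(hour_a, hour_b))  # close to 1

The periodic kernel assigns this pair a similarity near 1, while the RBF kernel on the raw values treats them as almost completely dissimilar.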

Recently, Dr. Su and Dr. Zhang proposed a general family of periodic kernels and proved mathematically that all stationary periodic kernels expressible as explicit cosine series belong to this family. The main technique applied in their study is the Fourier transform on locally compact abelian groups, with Bochner’s theorem providing the tool for characterizing the positive definite kernel functions. Using these tools, the family of kernels was completely characterized in terms of Fourier series expansions, and the results were extended to d-dimensional spaces. In this research, we extend that work by studying the effectiveness of different configurations of the proposed kernels and by applying the kernels to real-world data sets to verify their effectiveness against the more commonly used Radial Basis Function (RBF) kernel.
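The following hedged sketch (Python with scikit-learn) shows the kind of comparison described above: a truncated cosine-series periodic kernel, positive definite because its coefficients are nonnegative (by Bochner’s theorem), plugged into an SVM as a custom kernel alongside the standard RBF kernel on a synthetic “time of day” data set. The coefficients, truncation order, and data below are illustrative assumptions, not the specific configurations studied in this research.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic "time of day" data: the class label follows a daily cycle.
hours = rng.uniform(0.0, 24.0, size=(400, 1))
labels = (np.sin(2 * np.pi * hours[:, 0] / 24.0) > 0).astype(int)

def cosine_series_kernel(X, Y, period=24.0, coeffs=(1.0, 0.6, 0.3, 0.1)):
    # Stationary periodic kernel k(x, y) = sum_n a_n * cos(n * w * (x - y)),
    # with w = 2*pi/period and a_n >= 0, which guarantees positive definiteness.
    # The coefficients here are arbitrary illustrative choices.
    w = 2.0 * np.pi / period
    diff = X[:, None, 0] - Y[None, :, 0]          # pairwise differences
    K = np.zeros_like(diff)
    for n, a in enumerate(coeffs):
        K += a * np.cos(n * w * diff)
    return K

svm_periodic = SVC(kernel=cosine_series_kernel)   # custom callable kernel
svm_rbf = SVC(kernel="rbf", gamma="scale")        # baseline RBF kernel

print("cosine-series kernel CV accuracy:",
      cross_val_score(svm_periodic, hours, labels, cv=5).mean())
print("RBF kernel CV accuracy:          ",
      cross_val_score(svm_rbf, hours, labels, cv=5).mean())

In practice, the configurations studied would vary the period, the number of cosine terms, and the coefficient values, and compare the resulting accuracies against the RBF baseline on real-world data sets.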
