Ladder Polynomial Neural Networks
MS Thesis Defense

Abstract
Combining neural networks with polynomial functions is an appealing way to bring favorable theoretical properties to neural network models. Polynomial neural networks restrict the model function to a polynomial, but existing models make it hard to control the order of that polynomial. In this work, we devise the product activation function and use it to construct the Ladder Polynomial Neural Network (LPNN). The LPNN has a feedforward structure and can be trained like other neural networks, and its model function is a polynomial whose order is precisely controlled. We show that this model is related to several other polynomial models and has advantages in optimization and Bayesian learning. In our empirical study, deep LPNN models outperform other polynomial models on a range of regression and classification tasks, narrowing the gap between polynomial models and well-developed feedforward neural network models.
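The abstract does not spell out the layer construction, but the following PyTorch sketch illustrates one way a product-style activation could give exact control of the polynomial order. The specific layer form (an elementwise product between a linear map of the hidden state and a linear map of the raw input), the class names, and all parameters are illustrative assumptions for this sketch, not the thesis's actual definition of the LPNN.

```python
import torch
import torch.nn as nn


class ProductActivationLayer(nn.Module):
    """Hypothetical product-activation layer: a linear map of the previous
    hidden state multiplied elementwise by a linear map of the raw input.
    Each such layer raises the polynomial degree of the model by one."""

    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        self.w = nn.Linear(hidden_dim, hidden_dim)          # acts on previous hidden state
        self.u = nn.Linear(in_dim, hidden_dim, bias=False)  # acts on the raw input x

    def forward(self, h: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # Elementwise product plays the role of the activation function.
        return self.w(h) * self.u(x)


class LadderPolynomialNet(nn.Module):
    """Sketch of a ladder-style polynomial network: a linear first layer
    followed by `depth` product-activation layers and a linear readout."""

    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int, depth: int):
        super().__init__()
        self.first = nn.Linear(in_dim, hidden_dim)  # degree-1 start
        self.layers = nn.ModuleList(
            [ProductActivationLayer(in_dim, hidden_dim) for _ in range(depth)]
        )
        self.readout = nn.Linear(hidden_dim, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.first(x)
        for layer in self.layers:
            h = layer(h, x)
        return self.readout(h)
```

Under these assumptions, a model with `depth` product layers computes a polynomial of degree `depth + 1` in its input, so adding one layer raises the polynomial order by exactly one; this is the kind of precise order control the abstract refers to.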
Please join the meeting in Miner Hall #112.