Analysis and Regularization of Deep Generative Second Order Ordinary Differential Equations

Abstract

Deep generative models aim to learn the processes that are assumed to generate the data. To this end, deep latent variable models use probabilistic frameworks to learn a joint probability distribution over the data and its low-dimensional hidden variables. A challenging task for deep generative models is learning complex probability distributions over sequential data in an unsupervised setting. The Ordinary Differential Equation Variational Auto-Encoder (ODE2VAE) is a deep generative model that aims to learn complex generative distributions of high-dimensional sequential data. The ODE2VAE model uses variational auto-encoders (VAEs) and neural ordinary differential equations (Neural ODEs) to model low-dimensional latent representations and the continuous latent dynamics of those representations, respectively. In this thesis, we explore the effects of the inductive bias in the ODE2VAE model by analyzing the learned dynamic latent representations over three different physical motion datasets. We then reformulate the model for flexible regularization and extend the model architecture to facilitate the learning of varying static features in sequential data. Through the experiments, we uncover the effects of the ODE2VAE model's inductive bias on the learned dynamical representations and demonstrate the model's shortcomings when it is used to model sequences with varying static features.
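The core mechanism referenced in the abstract is a second-order latent ODE: the latent state is split into a position and a velocity, so that the second-order dynamics reduce to a first-order system the solver can integrate. The sketch below illustrates that reduction under stated assumptions; it is not the thesis code. The `LatentDynamics` acceleration network, the plain Euler integrator, and all dimensions are illustrative choices (ODE2VAE itself uses a VAE encoder to produce the initial latent state and a proper adaptive ODE solver).

```python
import torch
import torch.nn as nn

class LatentDynamics(nn.Module):
    """Acceleration network f_theta: maps (position s, velocity v) to dv/dt.
    A hypothetical stand-in for the learned dynamics in a second-order latent ODE."""
    def __init__(self, latent_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.f = nn.Sequential(
            nn.Linear(2 * latent_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, latent_dim),
        )

    def forward(self, s: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
        return self.f(torch.cat([s, v], dim=-1))

def euler_integrate(dynamics, s0, v0, n_steps=100, dt=0.01):
    """Integrate ds/dt = v, dv/dt = f_theta(s, v) with explicit Euler steps.
    This is the standard reduction of a second-order ODE to a first-order system."""
    s, v = s0, v0
    trajectory = [s]
    for _ in range(n_steps):
        a = dynamics(s, v)   # acceleration from the learned network
        s = s + dt * v       # position update
        v = v + dt * a       # velocity update
        trajectory.append(s)
    return torch.stack(trajectory)  # shape: (n_steps + 1, batch, latent_dim)

# Usage: roll out latent trajectories from (assumed) encoded initial states.
dyn = LatentDynamics(latent_dim=8)
s0 = torch.randn(4, 8)  # initial latent positions (in ODE2VAE, from the encoder)
v0 = torch.randn(4, 8)  # initial latent velocities
traj = euler_integrate(dyn, s0, v0)
print(traj.shape)  # torch.Size([101, 4, 8])
```

The position/velocity split is what gives the model its physics-flavored inductive bias: the decoder only ever sees positions, while velocities carry the momentum needed for smooth, physically plausible trajectories.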

Publication
Master’s Thesis
Batuhan Koyuncu
PhD Student

My research interests include interpretable deep generative modeling and its applications in psychiatry.