Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.

Pages

Posts

Variational Autoencoders

9 minute read

Published:

An autoencoder is a type of neural network that learns a lower-dimensional latent representation of input data in an unsupervised manner. The learning task is simple: given an input image, the network tries to reconstruct that image at the output. The loss is measured by the distance between the two images, e.g., MSE loss.
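A minimal sketch of the idea, assuming PyTorch; the 784-dimensional input and 32-dimensional latent code below are illustrative choices, not taken from the post:

    # Rough autoencoder sketch (assumed PyTorch; sizes are illustrative).
    import torch
    import torch.nn as nn

    class AutoEncoder(nn.Module):
        def __init__(self, input_dim=784, latent_dim=32):
            super().__init__()
            # Encoder compresses the input to a lower-dimensional code.
            self.encoder = nn.Sequential(
                nn.Linear(input_dim, 128), nn.ReLU(),
                nn.Linear(128, latent_dim))
            # Decoder reconstructs the input from the latent code.
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 128), nn.ReLU(),
                nn.Linear(128, input_dim), nn.Sigmoid())

        def forward(self, x):
            return self.decoder(self.encoder(x))

    model = AutoEncoder()
    x = torch.rand(16, 784)                     # a batch of flattened images
    loss = nn.functional.mse_loss(model(x), x)  # reconstruction (MSE) loss
    loss.backward()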

Finite-dimensional Approximation of Kernel SVM with Random Fourier Features

7 minute read

Published:

Support vector machines are, in my opinion, the best machine learning algorithm. They generalize well with less risk of overfitting, they scale well to high-dimensional data, the kernel trick makes it possible to efficiently lift the feature space to higher dimensions, and the optimization problem for SVMs is usually a quadratic program that is efficient to solve and has a global minimum. However, there are a few drawbacks to SVMs. The biggest one is that they don’t scale well to large datasets, and the inference time depends on the size of the training set. For a training set of size $m$, where $x \in \mathcal{R}^d$, the time complexity to compute the Gram matrix is $\mathcal{O}(m^2 d)$ and the time complexity for inference is $\mathcal{O}(md)$. This scales poorly with large training sets.
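A sketch of the random Fourier feature remedy the title refers to (Rahimi–Recht), assuming numpy; the feature count D=500 and gamma=1.0 are illustrative choices, not from the post:

    # Explicit D-dim features whose inner products approximate the RBF
    # kernel k(x, y) = exp(-gamma * ||x - y||^2).
    import numpy as np

    def random_fourier_features(X, D=500, gamma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))  # spectral samples
        b = rng.uniform(0, 2 * np.pi, size=D)                  # random phases
        return np.sqrt(2.0 / D) * np.cos(X @ W + b)

    # z(x)^T z(y) ~ k(x, y), so a linear SVM on Z trains on m x D data
    # and predicts in O(Dd) time, independent of the training set size.
    X = np.random.rand(100, 5)
    Z = random_fourier_features(X)
    K_approx = Z @ Z.T
    K_exact = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1))
    print(np.abs(K_approx - K_exact).max())  # small approximation error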

Learning with Invariance using Tangent Distance Kernel

5 minute read

Published:

In kernel SVM, most kernel functions rely on some distance metric. A commonly used distance metric is the Euclidean distance. However, in fields like pattern recognition, the Euclidean distance is not robust to simple transformations that don’t change the class label: rotation, translation, and scaling. For example:
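A toy numpy sketch of this (the 8x8 stroke image below is made up for illustration): a one-pixel translation leaves the content intact, yet in pixel space the image ends up farther from its shifted copy than from a blank image.

    import numpy as np

    img = np.zeros((8, 8))
    img[2:6, 3] = 1.0                   # a vertical stroke
    shifted = np.roll(img, 1, axis=1)   # translate one pixel to the right

    print(np.linalg.norm(img - shifted))             # ~2.83
    print(np.linalg.norm(img - np.zeros_like(img)))  # 2.0: the blank image is closer!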

Learning with Invariances via RKHS

12 minute read

Published:

In the previous post I introduced the concept and properties of Reproducing Kernel Hilbert Spaces (RKHS). In this post, I will introduce how to construct an RKHS from a kernel function and build learning machines that can learn invariances encoded in the form of bounded linear functionals on the constructed RKHS.

Road to RKHS

8 minute read

Published:

Overview

A Reproducing Kernel Hilbert Space (RKHS) is related to the kernel trick in machine learning and plays a significant role in representation learning. In this post, I will summarize the mathematical definition of an RKHS.
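In brief, the defining reproducing property says that point evaluation is an inner product with the kernel: for every $f \in \mathcal{H}$ and every $x$,

$$f(x) = \langle f, k(\cdot, x) \rangle_{\mathcal{H}},$$

so in particular $k(x, y) = \langle k(\cdot, x), k(\cdot, y) \rangle_{\mathcal{H}}$.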

How to create macOS application bundles with Python

less than 1 minute read

Published:

Creating macOS application bundles that appear in your Launchpad and can be docked is extremely easy. Once you have finished coding your Python script, change the .py extension to .command and make it executable:
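A minimal sketch of that step in Python itself (hello.py is a hypothetical script name):

    # Equivalent to `mv hello.py hello.command && chmod +x hello.command`.
    import stat
    from pathlib import Path

    script = Path("hello.py")
    bundle = script.rename(script.with_suffix(".command"))  # .py -> .command
    mode = bundle.stat().st_mode
    bundle.chmod(mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)  # chmod +x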

Portfolio

Publications

Talks

Teaching

CS 252: Introduction to Computer Engineering

Undergraduate CS Course, University of Wisconsin-Madison, Department of Computer Sciences, 2018

Logic components built with transistors, rudimentary Boolean algebra, basic combinational logic design, basic synchronous sequential logic design, basic computer organization and design, introductory machine- and assembly-language programming.

ECE 271: Circuits Laboratory II

Undergraduate ECE Lab, University of Wisconsin-Madison, Department of Electrical and Computer Engineering, 2019

Experiments cover electronic device characteristics, limitations and applications of operational amplifiers, and feedback circuits.