About me
This is a page not in the main menu.
Published:
An autoencoder is a type of neural network that learns a lower-dimensional latent representation of the input data in an unsupervised manner. The learning task is simple: given an input image, the network tries to reconstruct that image at its output. The loss is measured by a distance between the two images, e.g., MSE loss.
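A minimal PyTorch sketch of the idea (the layer sizes and the flattened 28x28 input are illustrative choices, not from the post):

    import torch
    import torch.nn as nn

    # Fully connected autoencoder: the encoder compresses a flattened image
    # into a low-dimensional latent code, the decoder reconstructs the image.
    class Autoencoder(nn.Module):
        def __init__(self, input_dim=784, latent_dim=32):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Linear(input_dim, 128), nn.ReLU(),
                nn.Linear(128, latent_dim),
            )
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 128), nn.ReLU(),
                nn.Linear(128, input_dim), nn.Sigmoid(),
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    model = Autoencoder()
    criterion = nn.MSELoss()  # reconstruction loss between input and output
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    x = torch.rand(64, 784)        # stand-in batch of flattened images
    loss = criterion(model(x), x)  # target is the input itself (unsupervised)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()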
Published:
Support vector machines are, in my opinion, the best machine learning algorithm. They generalize well with less risk of overfitting, they scale well to high-dimensional data, the kernel trick makes it possible to efficiently lift the feature space to higher dimensions, and the optimization problem for SVMs is usually a quadratic program that is efficient to solve and has a global minimum. However, there are a few drawbacks to SVMs. The biggest one is that they don't scale well to large datasets, and the inference time depends on the size of the training set. For a training set of size $m$, where $x \in \mathcal{R}^d$, the time complexity to compute the Gram matrix is $\mathcal{O}(m^2 d)$ and the time complexity for inference is $\mathcal{O}(md)$. This scales poorly with large training sets.
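To make those costs concrete, here is a small NumPy sketch (the RBF kernel and the stand-in dual coefficients are illustrative assumptions, not a trained SVM): forming the Gram matrix touches all $m^2$ pairs over $d$ features, and a single prediction touches all $m$ training points:

    import numpy as np

    def rbf_gram(X, gamma=1.0):
        # K[i, j] = exp(-gamma * ||x_i - x_j||^2); computing all m^2
        # pairwise squared distances over d features is the O(m^2 d) step.
        sq_norms = np.sum(X ** 2, axis=1)
        sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
        return np.exp(-gamma * np.maximum(sq_dists, 0.0))

    def predict(x, X_train, alpha, b, gamma=1.0):
        # Kernel SVM decision value sum_i alpha_i k(x_i, x) + b: each
        # prediction evaluates the kernel against all m training points,
        # hence the O(m d) inference cost.
        k = np.exp(-gamma * np.sum((X_train - x) ** 2, axis=1))
        return alpha @ k + b

    m, d = 500, 20
    X = np.random.randn(m, d)
    K = rbf_gram(X)                      # the O(m^2 d) step
    alpha, b = np.random.randn(m), 0.0   # stand-ins for trained coefficients
    print(predict(X[0], X, alpha, b))    # O(m d) per prediction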
Published:
In kernel SVMs, most kernel functions rely on some distance metric. A commonly used distance metric is the Euclidean distance. However, in fields like pattern recognition, Euclidean distance is not robust to simple transformations that don't change the class label: rotation, translation, and scaling. For example:
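A quick NumPy sketch of the problem, using a synthetic binary image as an illustrative stand-in: translating the pattern by a few pixels leaves its class unchanged but produces a large Euclidean distance:

    import numpy as np

    img = np.zeros((16, 16))
    img[4:12, 6:10] = 1.0  # a simple bar-shaped "pattern"

    shifted = np.roll(img, shift=3, axis=1)  # translate 3 pixels to the right

    # Same pattern, same class label, yet a large Euclidean distance:
    print(np.linalg.norm(img - shifted))  # distinctly nonzero
    print(np.linalg.norm(img - img))      # 0.0 for the identical image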
Published:
In the previous post I introduced the concept and properties of the Reproducing Kernel Hilbert Space (RKHS). In this post, I will show how to construct an RKHS from a kernel function and build learning machines that can learn invariances encoded in the form of bounded linear functionals on the constructed RKHS.
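For orientation, the standard construction takes the span of kernel sections, defines the inner product through the kernel itself, and completes the resulting space to obtain the RKHS:

$$\mathcal{H}_0 = \text{span}\{k(\cdot, x) : x \in \mathcal{X}\}, \qquad \left\langle \sum_i a_i k(\cdot, x_i), \sum_j b_j k(\cdot, y_j) \right\rangle_{\mathcal{H}} = \sum_{i,j} a_i b_j \, k(x_i, y_j)$$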
Published:
The Reproducing Kernel Hilbert Space (RKHS) is related to the kernel trick in machine learning and plays a significant role in representation learning. In this post, I will summarize the mathematical definition of an RKHS.
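For reference, the defining property is that point evaluation is an inner product with a kernel section: for every $f \in \mathcal{H}$ and $x \in \mathcal{X}$,

$$f(x) = \langle f, k(\cdot, x) \rangle_{\mathcal{H}}, \qquad \text{and in particular} \qquad k(x, y) = \langle k(\cdot, x), k(\cdot, y) \rangle_{\mathcal{H}}.$$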
Published:
Creating macOS application bundles that appear in your Launchpad and can be docked is extremely easy. Once you have finished coding your Python script, change the .py extension to .command and make it executable:
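A minimal sketch of that step, assuming the script is named myscript.py (the filename is hypothetical):

    mv myscript.py myscript.command   # .command files are opened by Terminal
    chmod +x myscript.command         # mark the script executable
    # the script's first line should be a shebang, e.g. #!/usr/bin/env python3,
    # so Terminal knows to run it with Python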
Published:
Spyder is a powerful scientific computing environment for the Python programming language. However, to invoke Spyder on macOS, you would need to type:
Published:
This is a tutorial on kernelized ridge regression, written in Python as a Jupyter notebook.
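A compact NumPy sketch of the method the tutorial covers (a hedged companion, not the notebook itself): fit the dual coefficients $\alpha = (K + \lambda I)^{-1} y$ and predict with $f(x) = \sum_i \alpha_i k(x_i, x)$:

    import numpy as np

    def rbf_kernel(A, B, gamma=1.0):
        # K[i, j] = exp(-gamma * ||a_i - b_j||^2)
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-gamma * np.maximum(sq, 0.0))

    def fit(X, y, lam=1e-2, gamma=1.0):
        # dual coefficients alpha = (K + lam * I)^{-1} y
        K = rbf_kernel(X, X, gamma)
        return np.linalg.solve(K + lam * np.eye(len(X)), y)

    def predict(X_test, X_train, alpha, gamma=1.0):
        # f(x) = sum_i alpha_i k(x_i, x) for each test point
        return rbf_kernel(X_test, X_train, gamma) @ alpha

    X = np.linspace(0, 2 * np.pi, 50)[:, None]
    y = np.sin(X).ravel()
    alpha = fit(X, y)
    print(np.abs(predict(X, X, alpha) - y).max())  # small training error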
Me at the UW-Madison 2019 Winter Commencement
Published in arXiv, 2019
This paper discusses the effect of adversarial examples at test time on the adversarial VC dimension of neural networks.
Recommended citation: Cullina, Daniel. (2018). "PAC-learning in the presence of evasion adversaries." NeurIPS. https://zetongqi.github.io/files/1912.08865.pdf
Published:
coming soon
Undergraduate CS Course, University of Wisconsin-Madison, Department of Computer Sciences, 2018
Logic components built with transistors, rudimentary Boolean algebra, basic combinational logic design, basic synchronous sequential logic design, basic computer organization and design, introductory machine- and assembly-language programming.
Undergraduate ECE Lab, University of Wisconsin-Madison, Department of Electrical and Computer Engineering, 2019
Experiments cover electronic device characteristics, limitations and applications of operational amplifiers, and feedback circuits.