
Seeing the bulk: a primer on dimensionality reduction

Tags: datavis, machine-learning, math
If you haven’t seen Interstellar, you might not get the reference in the title. Don’t sweat it: the bulk beings are an advanced civilization able to move outside the familiar 3D space that we are stuck in. A similar plot device was used in Edwin Abbott’s book Flatland. Outside of entertainment, there is a lot of value in attempting to understand high-dimensional spaces. Physics, machine learning, and engineering control systems all make use of higher-dimensional spaces in one way or another. Read more...
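As a taste of the topic, here is a minimal sketch of one classic dimensionality reduction technique, principal component analysis, using NumPy. The toy data and the choice of PCA are illustrative assumptions, not taken from the post:

```python
import numpy as np

# PCA sketch: project high-dimensional points onto the directions
# of greatest variance so we can actually look at them.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))      # 200 points "stuck" in 10D (toy data)

X_centered = X - X.mean(axis=0)     # center each feature at zero
# SVD of the centered data: the rows of Vt are the principal directions
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

k = 2                               # target dimension we can visualize
X_2d = X_centered @ Vt[:k].T        # coordinates in the top-k subspace

print(X_2d.shape)                   # (200, 2)
```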

Word embeddings and semantics

Can a machine understand what a word means? Right now machines routinely correct spelling and grammar, but they are pretty useless when it comes to semantics. Search engines are an exception: they have a rudimentary understanding of what words mean. One way this can work is explored in Tomas Mikolov’s 2013 paper on word embeddings. Word embeddings are mappings from words to vectors, such that the distances between the vectors reflect the semantic similarity of the words. Read more...
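To make the vector picture concrete, here is a toy sketch. The three-dimensional "embeddings" below are made up for the example (real learned embeddings like word2vec’s have hundreds of dimensions), but cosine similarity is a standard way to measure nearness in an embedding space:

```python
import numpy as np

# Hypothetical toy embeddings: each word maps to a vector.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between u and v: values near 1.0 mean similar."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```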

Thing explainer in Emacs

This is my first major mode for Emacs. It was inspired by Randall Munroe’s Thing Explainer and Morten Just’s editor that doomed humanity. The concept is simple: restrict your vocabulary to the 1000 most common words. If you can explain something using this reduced vocabulary, then you really understand the topic. This is a decent test of understanding because it’s easy to learn a word, and even use it in the right context, while still having no idea how it relates to other things. Read more...
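The mode itself is written in Emacs Lisp, but the core check is easy to sketch in Python. The file name common_words.txt below is a hypothetical stand-in for a 1000-most-common-words list (one word per line), not part of the mode:

```python
import re

# Load the allowed vocabulary from a hypothetical word list file.
with open("common_words.txt") as f:
    allowed = {line.strip().lower() for line in f if line.strip()}

def uncommon_words(text):
    """Return the words in `text` that fall outside the allowed list."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    return sorted({w for w in words if w not in allowed})

# Words like "vehicle" would be flagged, pushing you toward "car that
# goes to space" style explanations.
print(uncommon_words("The Saturn V was a heavy lift launch vehicle"))
```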

About

Computer programmer and amateur mathematician. Lives in rainy Portland, Oregon, United States. Read more
