ESP Biography
HAN ALTAETRAN, Stanford Student
Major: Physics
College/Employer: Stanford
Year of Graduation: 2015

Brief Biographical Sketch:
Hey there,

Like you, I find the world fascinating. That's why I've spent the past few years just learning about what makes it so amazing. Physics has been especially interesting in this respect. I reckon I'll spend another few years with this mindset. But while I'm at it, I think I'll work on a few problems that affect people in a global context.

Best,
Han

Past Classes

(Clicking a class title will bring you to the course's section of the corresponding course catalog)

M4053: For the Love of Optimization in Splash Fall 2014 (Nov. 08 - 09, 2014)
Pretty much anything important in life (the stability of the Golden Gate Bridge, the quality of a compressed image, the edibility of a vegan ice cream, your happiness, or the likelihood of winning the lottery on your birthday) can be modeled as a mathematical function of one (or many more!) variables. Say we want to optimize one of these functions. Easy, you say: set the derivative to zero! Duh. But what if we don't even have a formula for the function? What if our function lives in a multidimensional space, and checking every single point where the derivative is zero would take eons? What if all we have to guess the function from is a pile of super noisy data? What if the derivative isn't even computable?
What your calculus teachers have been hiding from you is that there exists an elegant framework for minimizing or maximizing functions, even when we can't describe what the function or its derivative looks like (or whether it even has a derivative). In the age of big data, that's all you need to solve problems ranging from 3D protein folding and predicting the next stock market crash, to filling in a damaged image or audio clip and designing a machine learning model for identifying faces. We'll show you the secrets of this magical optimization framework, and see how to apply it to these awesome problems.
Core ideas: theory of mathematical optimization, optimization algorithms, machine learning, big data, and applications to science and engineering.
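As a taste of the derivative-free optimization idea above, here is a minimal sketch of golden-section search, a classic method that minimizes a unimodal function on an interval using only function evaluations, never a derivative. (The example function and interval are illustrative, not from the course.)

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b] using only function values."""
    gr = (math.sqrt(5) - 1) / 2          # golden ratio conjugate, ~0.618
    c = b - gr * (b - a)                  # two interior probe points
    d = a + gr * (b - a)
    while b - a > tol:
        if f(c) < f(d):                   # minimum lies in [a, d]
            b, d = d, c
            c = b - gr * (b - a)
        else:                             # minimum lies in [c, b]
            a, c = c, d
            d = a + gr * (b - a)
    return (a + b) / 2

# no formula for the derivative needed: only f itself
x_star = golden_section_min(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracketing interval by a constant factor, so the cost is logarithmic in the desired precision, and the same "probe, compare, shrink" idea generalizes to the higher-dimensional, derivative-free methods the blurb alludes to.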
M3720: Music, Math, and Machines in Splash! Spring 2014 (Apr. 12 - 13, 2014)
What is music, really? In this tutorial, we'll talk about the math that goes into music. Specifically, we'll cover signal processing and Fourier transforms and talk about how they can be used to make the sounds used by Skrillex, Deadmau5, Passion Pit, etc.
Afterward, we'll talk about sound design, and you'll get a chance to make these sounds yourself.
We'll also talk about the physics that goes into making string and wind instruments sound so organic.
Depending on your interests, we can also cover other topics, such as simulating echo and choir effects, machine learning algorithms that classify music by genre, or even how to use machines to generate music itself. It's all up to you! :)
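An echo effect of the kind mentioned above can be sketched as a simple delay line: mix a delayed, attenuated copy of the signal back into itself. (The function name and parameters here are illustrative, not from the course.)

```python
import numpy as np

def add_echo(x, sr, delay_s=0.25, decay=0.5):
    """Mix a delayed, attenuated copy of x back into the signal.

    x: 1-D array of samples; sr: sample rate in Hz.
    """
    d = int(delay_s * sr)                 # delay in samples
    y = x.astype(float).copy()
    y[d:] += decay * x[:-d]               # add the faded past signal
    return y

# usage: a 440 Hz sine with a quarter-second echo
sr = 8000
x = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
y = add_echo(x, sr)
```

Feeding the output back through the delay line repeatedly (with decay < 1) gives the familiar train of fading repeats.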
M3158: Signal Processing and Sound Design in Splash! Fall 2013 (Nov. 02 - 03, 2013)
Ever wonder why knowledge of $$\hat{f}(s)=\int_{-\infty}^{\infty}e^{-2\pi i s t}f(t)\,dt$$ tells you how your music sounds without even having to hear it? Find out how the Fourier transform can be used to create electronic music with a computer. We'll use a synthesizer and frequency filter to make some cool sounds.
If you can, bring your laptop, and I'll show you how to get started.
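A minimal sketch of the Fourier-transform idea behind this class: take one second of a pure 50 Hz sine, compute its discrete spectrum, and the magnitude peaks at exactly the 50 Hz bin, so the spectrum really does tell you what you would hear. (The sample rate and test tone are illustrative choices.)

```python
import numpy as np

sr = 1000                              # sample rate: 1000 samples/second
t = np.arange(sr) / sr                 # one second of time points
x = np.sin(2 * np.pi * 50 * t)         # a pure 50 Hz tone

spectrum = np.abs(np.fft.rfft(x))      # magnitude of each frequency bin
peak_hz = int(np.argmax(spectrum))     # 1 Hz per bin for a 1 s signal
```

Zeroing out bins before transforming back with `np.fft.irfft` is exactly the frequency filtering the class plays with.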
M3241: Practical Machine Learning in Splash! Fall 2013 (Nov. 02 - 03, 2013)
Can machines learn? Will they ever achieve a level of sentience that rivals that of humans? These are great questions that we will *not* answer in this class. Instead, we'll lay out some of the foundations for classic machine learning techniques.
Starting with the maximum likelihood approach, we will cover topics such as binary classification, regression, and fitting a mixture of Gaussians, and will show you how to derive their update rules.
We'll end with real world examples, potentially in biology.
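As a small illustration of the maximum-likelihood approach to binary classification described above, here is logistic regression trained by gradient ascent on the log-likelihood; the update rule `(y - p) * x` is the one you would derive in class. (The toy data and learning rate are made up for the example.)

```python
import math

# toy 1-D data: class 0 clusters below zero, class 1 above
X = [-2.0, -1.5, -1.0, 1.0, 1.5, 2.0]
Y = [0, 0, 0, 1, 1, 1]

w, b = 0.0, 0.0
lr = 0.1
for _ in range(500):
    for x, y in zip(X, Y):
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # model's P(y=1 | x)
        # gradient of the log-likelihood with respect to w and b
        w += lr * (y - p) * x
        b += lr * (y - p)

def predict(x):
    """Return the learned probability that x belongs to class 1."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))
```

The same "write down the likelihood, differentiate, ascend" recipe yields the update rules for regression and Gaussian mixtures as well.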
P2776: Neuronal Biophysics in Splash! Spring 2013 (Apr. 13 - 14, 2013)
Neurons serve as fundamental units for computation and communication in the human body. Though they are diverse in morphology and function, most neurons use properties of the cell membrane to transmit information. This seminar will provide an introduction to electrochemically based membrane dynamics and their applications to the modeling of action potential propagation in axons and signal integration in dendrites. This will be done primarily through circuit-based models for transmembrane ion transport in the Hodgkin-Huxley (HH) framework.
Following this, we will discuss extensions of the HH framework that incorporate additional biological features and lead to the complex firing dynamics that we are able to observe experimentally. If time permits, we will talk about the behavior of coupled neurons and their implications for neural networks.
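The circuit-based HH framework described above can be sketched with a forward-Euler integration of the classic squid-axon equations: the membrane is a capacitor in parallel with voltage-gated sodium, potassium, and leak conductances. (Step size, stimulus current, and duration are illustrative choices; the parameters are the standard Hodgkin-Huxley values.)

```python
import math

def hh_peak_voltage(i_ext=10.0, t_max=50.0, dt=0.01):
    """Integrate the Hodgkin-Huxley equations (forward Euler) and
    return the peak membrane voltage. Units: mV, ms, uA/cm^2."""
    c_m = 1.0                                    # membrane capacitance
    g_na, g_k, g_l = 120.0, 36.0, 0.3            # max conductances
    e_na, e_k, e_l = 50.0, -77.0, -54.387        # reversal potentials
    v, m, h, n = -65.0, 0.053, 0.596, 0.317      # resting state
    v_max = v
    for _ in range(int(t_max / dt)):
        # voltage-dependent gating rates (standard HH fits)
        a_m = 0.1 * (v + 40) / (1 - math.exp(-(v + 40) / 10))
        b_m = 4.0 * math.exp(-(v + 65) / 18)
        a_h = 0.07 * math.exp(-(v + 65) / 20)
        b_h = 1.0 / (1 + math.exp(-(v + 35) / 10))
        a_n = 0.01 * (v + 55) / (1 - math.exp(-(v + 55) / 10))
        b_n = 0.125 * math.exp(-(v + 65) / 80)
        # ionic currents in the equivalent circuit
        i_na = g_na * m**3 * h * (v - e_na)
        i_k = g_k * n**4 * (v - e_k)
        i_l = g_l * (v - e_l)
        # Euler step for voltage and the three gating variables
        v += dt * (i_ext - i_na - i_k - i_l) / c_m
        m += dt * (a_m * (1 - m) - b_m * m)
        h += dt * (a_h * (1 - h) - b_h * h)
        n += dt * (a_n * (1 - n) - b_n * n)
        v_max = max(v_max, v)
    return v_max
```

With a sustained 10 uA/cm^2 stimulus the voltage overshoots 0 mV, i.e. the model fires action potentials; swapping in extra currents is exactly the kind of extension the seminar goes on to discuss.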
