I've repeatedly reinvented my research interests, so I must begin with a brief history.
I started under the great Sergey Denisov in approximation theory, especially orthogonal polynomials. We generally applied techniques from harmonic analysis (e.g. steepest-descent methods for Riemann–Hilbert problems, commutator estimates for singular integral operators, and classical martingale and probability theory as practiced in randomized Fourier analysis). My first three papers were in this area, and much more could be written with the work we did. My dissertation can be found here, though I would advise against reading it; Andreas Seeger once told me never to read your dissertation after you've defended it, advice I have followed to the letter.
I left pure mathematics with the idea of getting into machine learning; I felt I needed experience solving real problems in software in order to hone my skills in the field. I also wanted to avoid the issues I felt I would encounter in academia: siloing and pigeonholing, as well as never being incentivized to build truly lean systems. For those reasons I took a job with the Milwaukee Brewers Baseball Club as their first data scientist on the business side.
With my team there, we built systems I am very proud of, and helped lead a technology transition toward in-house software development and maintenance. The future at MBBC is bright.
In October 2018 I moved to Seattle to join the federated learning team, working on technology designed to allow users' data to remain on their devices while enabling massively distributed machine learning systems to be deployed and function in the real world.
Pure Mathematics in Machine Learning 
The insights of pure mathematics have at times been leveraged to great advantage in machine learning; the Wasserstein GAN is a prime example. There are many opportunities to recast machine learning developments in mathematical terms and use that framing to build better systems in practice.
I consider the intersection of harmonic analysis and machine learning to be a particularly fruitful area. 
Automatic Topology Learning

Many of the important advances in machine learning have been fundamentally topological in nature: the topology of the data informs the architecture chosen. However, a human has been in the loop at every step of this process. As we look toward extending machine learning into new domains, how can we remove this human?

Orthogonal Polynomials on the Circle for the Weight w Satisfying Conditions w, w^{-1} ∈ BMO
For a weight w satisfying w, w^{-1} ∈ BMO(𝕋), we prove asymptotics of {Φ_n(e^{iθ}, w)} in L^p[−π, π], 2 ⩽ p < p_0, where {Φ_n(z, w)} are the monic polynomials orthogonal with respect to w on the unit circle 𝕋. Immediate applications include estimates on the uniform norm and asymptotics of the polynomial entropies. Estimates on higher-order commutators between Calderón–Zygmund operators and BMO functions play the key role in the proofs of the main results.
On Schur Parameters in Steklov's Problem 
We study the recursion (aka Schur) parameters of monic polynomials orthogonal on the unit circle with respect to a weight which provides a negative answer to the conjecture of Steklov.
Randomized Verblunsky parameters in Steklov's problem 
We consider randomized Verblunsky parameters for orthogonal polynomials on the unit circle as they relate to the problem of Steklov, bounding the polynomials' uniform norm independently of n. We prove a result that is sharp in a natural sense.
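For readers unfamiliar with these objects: the Verblunsky (equivalently, Schur) parameters determine the monic orthogonal polynomials on the unit circle through the Szegő recursion, Φ_{n+1}(z) = zΦ_n(z) − ᾱ_n Φ_n^*(z), where Φ_n^*(z) = z^n conj(Φ_n(1/z̄)). A minimal sketch in Python (the function name `szego_recursion` is my own, not from the papers):

```python
import numpy as np

def szego_recursion(alphas):
    """Build the monic orthogonal polynomial Phi_n on the unit circle
    from Verblunsky parameters alpha_0, ..., alpha_{n-1} via the
    Szego recursion:
        Phi_{n+1}(z) = z * Phi_n(z) - conj(alpha_n) * Phi_n^*(z),
    where Phi_n^*(z) = z^n * conj(Phi_n(1 / conj(z))) is the reversed
    polynomial (reversed, conjugated coefficients).

    Coefficients are stored lowest degree first; returns the
    coefficient array of Phi_n, which is monic by construction.
    """
    phi = np.array([1.0 + 0j])  # Phi_0 = 1
    for a in alphas:
        phi_star = np.conj(phi[::-1])              # coefficients of Phi_n^*
        z_phi = np.concatenate(([0], phi))         # coefficients of z * Phi_n
        phi = z_phi - np.conj(a) * np.concatenate((phi_star, [0]))
    return phi

# Example: a single parameter alpha_0 = 0.5 gives Phi_1(z) = z - 0.5,
# i.e. coefficients [-0.5, 1].
print(szego_recursion([0.5]))
```

All parameters lie in the open unit disk, |α_n| < 1; the randomized setting of the last paper above draws them from a probability distribution and asks how the polynomials' uniform norms behave.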