Functional data analysis

Every once in a while, I choose some form of “brain candy” to read for work: something outside my field that is simply fun in some sense, because there’s no pressure other than the challenge of reading something new. Prompted by something an editor suggested, I decided to find out what functional data analysis is. Why is an historian reading this? It ties into the net-flow research: what I’m really looking at is a step function (net flow as a function of grade), but one that could be smoothed in various ways, with the smoothed curves then becoming the object of analysis.
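Here’s a rough sketch of that idea in Python, with made-up net-flow numbers by grade (nothing from the actual data, and the bandwidth is arbitrary): build the step function, then smooth it, in this case with a simple Gaussian kernel average.

```python
import numpy as np

# Made-up net-flow values by grade (not the real data): as a function
# of grade this is a step function, constant within each grade.
grades = np.arange(1, 13)                       # grades 1 through 12
net_flow = np.array([5, 3, 1, -2, -4, -1, 0, 2, 6, 4, -3, -5], dtype=float)

# Evaluate the step function on a fine grid of grade values.
grid = np.linspace(1, 12, 400)
step = net_flow[np.searchsorted(grades, grid, side="right") - 1]

# One simple way to smooth it: a Gaussian kernel average over the grid.
def kernel_smooth(x, y, bandwidth=0.75):
    weights = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    return (weights @ y) / weights.sum(axis=1)

smoothed = kernel_smooth(grid, step)
# 'smoothed' is the curve one would then treat as the object of analysis.
```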

When I was a postdoc at Vanderbilt, looking at curriculum-based measurement data for K-12 students, my brain candy was locally weighted regression as a smoothing device (1). Functional data analysis takes smoothing one step further by providing tools that make the smoothed curves themselves differentiable, so the curves and their derivatives become the objects of analysis.
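To make that concrete, here is a small Python sketch (simulated data, arbitrary smoothing parameters, not anything from the actual work): Cleveland’s lowess gives a smoothed curve only as fitted values at the observed points, while a smoothing spline, standing in here for the basis-expansion machinery functional data analysis typically uses, gives a function that can be evaluated anywhere and differentiated.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)

# Simulated noisy observations of some underlying curve.
x = np.linspace(0, 10, 200)
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

# Locally weighted regression (Cleveland's lowess): a smoothed curve,
# but only as fitted values at the observed x's.
fitted = lowess(y, x, frac=0.3)        # returns columns: sorted x, smoothed y

# A smoothing spline is one way to go a step further: the result is a
# function you can evaluate anywhere and differentiate.
spline = UnivariateSpline(x, y, k=4, s=x.size * 0.3 ** 2)
velocity = spline.derivative()         # first derivative, itself a spline

print(spline(2.5), velocity(2.5))      # value and slope of the smooth at x = 2.5
```

Once the smooth is a genuine function rather than a set of fitted points, its rate of change is something you can analyze too, which is much of the appeal.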

Arbitrary? Certainly. But advocates of functional data analysis point out that every analysis assumes a precision that isn’t really there, including analyses of the raw data themselves. That willingness to transform data before analyzing it is a hallmark of John Tukey’s exploratory approach. So, the argument goes, why not make some reasonable assumptions about an underlying function and see what you can make of it?

Reference

1. Cleveland, W. S. (1979). Robust locally weighted regression and smoothing scatterplots. Journal of the American Statistical Association, 74, 829-836.
