Honorary Degrees

Terence Speed

Professor and Head of the Bioinformatics Division, Walter and Eliza Hall Institute of Medical Research, Melbourne, Australia, and Professor, Department of Statistics, University of California, Berkeley

Doctor of Science
Five Hundred Nineteenth Convocation
June 14, 2014

Citation:
Presenter:  Stephen Stigler

Terry Speed, a statistician and statistical geneticist, is a pioneer in the development and application of statistical methods for the analysis of biomedical and genomic data.  His work is considered to exemplify the best of applied statistics in cross-disciplinary research.  Initially a pure mathematician specializing in algebra, he turned to the study of probability and mathematical statistics, and later to genomics. His expertise is in developing novel statistical and computational methods to extract the key signals of interest from the inherently large, complex, noisy datasets that arise from emerging genomics technologies.

He and his collaborators have developed innovative statistical methods for addressing key practical issues in microarray data analysis. These methods have become standards that have been implemented in widely used open-source software, thereby transforming the field.  His earlier work on Markov random fields and log-linear models was similarly influential, laying the groundwork for the modern analysis of graphical models. He has also made major contributions to the understanding of recombination, the fundamental biological process responsible for shuffling genetic material between chromosomes as it passes from generation to generation.  In addition, he has contributed to various aspects of genetic sequence analysis, including transcription factor binding site prediction, as well as the analysis of data from high-throughput sequencing assays, for example, in his recent work with the Cancer Genome Atlas Project.
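
One concrete example of the kind of method described above is quantile normalization (Bolstad, Irizarry, Åstrand, and Speed, Bioinformatics, 2003), which removes array-to-array technical variation by forcing every array to share a common intensity distribution. Below is a minimal sketch, not his group's implementation; the function name and toy data are illustrative.

```python
import numpy as np

def quantile_normalize(X):
    """Quantile-normalize the columns of X (rows = genes, columns = arrays).

    Each array is mapped onto a common reference distribution:
    the average of the sorted columns.
    """
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # rank of each entry within its column
    reference = np.sort(X, axis=0).mean(axis=1)        # average sorted profile across arrays
    return reference[ranks]                            # replace each value by the reference
                                                       # value of the same rank

# Toy example: 5 "genes" on 3 "arrays" measured on very different scales.
rng = np.random.default_rng(0)
X = rng.lognormal(mean=[0.0, 1.0, 2.0], sigma=1.0, size=(5, 3))
Xn = quantile_normalize(X)
print(np.sort(Xn, axis=0))  # after normalization, all columns share the same sorted values
```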


David Donoho

Anne T. and Robert M. Bass Professor in the Humanities and Sciences, and Professor of Statistics, Stanford University

Doctor of Science
Five Hundredth Convocation
October 9, 2009

Presentation Statement: David L. Donoho has pioneered the application of principled methods in mathematical statistics to address a great modern scientific challenge, the problem of "sparsity" in high-dimensional data sets. A data set of a million cases is not large if ten dimensions are recorded for each case and there are a huge number of potential interrelations or interactions among the characteristics. Only a very few of these, unknown in advance, are likely to be important, and in that sense the structure is sparse. Determining ways to reveal that structure is a daunting challenge.

Donoho draws upon classical statistics and crafts both elegant theory and novel algorithms to overcome the dimensional complexity. He devises methods that include the use of wavelets and what he terms "compressed sensing" to recover sparse relationships with a fraction of the number of observations needed by other methods. This methodology is enormously influential in astronomy, genetics, geophysics, signal processing, financial analysis, and medical imaging.
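
The core phenomenon can be demonstrated in a few lines: a signal with only k nonzero coordinates can often be recovered exactly from far fewer random linear measurements than its nominal dimension. The sketch below uses orthogonal matching pursuit, a simple greedy stand-in for the convex l1-minimization programs Donoho analyzed; all sizes and names are illustrative.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x with A @ x = y."""
    m, n = A.shape
    residual, support = y.copy(), []
    for _ in range(k):
        # Add the column most correlated with the current residual...
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        # ...then refit by least squares on the chosen support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x

rng = np.random.default_rng(1)
n, m, k = 200, 50, 5                      # 200-dim signal, 50 measurements, 5 nonzeros
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)  # random Gaussian sensing matrix
y = A @ x_true                            # compressed measurements, m << n
x_hat = omp(A, y, k)
print(np.max(np.abs(x_hat - x_true)))     # ~1e-15: exact recovery, with high probability
```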

At one level this work is only Occam's razor, but the technical complexities are immense and the clarity and mathematical rigor Donoho brings to this area of analysis are nothing short of extraordinary.

Citation: David L. Donoho is a mathematical statistician, and also one of the more influential applied mathematicians of his generation. Building upon the discipline of statistics, Donoho has developed effective new approaches to constructing low dimensional representations for modern high-dimensional data problems. His work provides new insight into some of the most pressing scientific questions of the present day.


Grace Wahba

I. J. Schoenberg-Hilldale Professor of Statistics, Professor of Biostatistics and Medical Informatics, School of Medicine and Public Health, and Professor of Computer Sciences (by courtesy), University of Wisconsin-Madison

Doctor of Science
Four Hundred Ninetieth Convocation
June 8, 2007

Grace Wahba, the I. J. Schoenberg Professor of Statistics at the University of Wisconsin-Madison, represents the very best of the modern synthesis of applied statistical, mathematical, and computational science. Her most influential work has concerned problems in the estimation of curves and surfaces from large, high-dimensional data sets, such as occur frequently in geophysics.

Wahba has introduced the use of reproducing kernel Hilbert spaces in the formulation of nonparametric smoothing problems to reveal general patterns without obscuring local features. Her pioneering methods include the introduction of Generalized Cross-Validation, now a generally adopted approach to making a principled trade-off between smoothness and attention to detail.
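
For a linear smoother with "hat" matrix A(λ), so that the fitted values are A(λ)y, the GCV criterion chooses the smoothing parameter λ to minimize n‖(I − A(λ))y‖² / [tr(I − A(λ))]². Below is a minimal sketch using ridge regression as the smoother, an illustrative stand-in for Wahba's spline setting; the toy data are arbitrary.

```python
import numpy as np

def gcv_score(X, y, lam):
    """GCV(lam) = n * ||(I - A)y||^2 / trace(I - A)^2
    for the ridge smoother A = X (X'X + lam*I)^{-1} X'."""
    n, p = X.shape
    A = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    resid = y - A @ y
    return n * np.sum(resid**2) / (n - np.trace(A))**2

# Toy data: two real effects hidden among ten candidate predictors.
rng = np.random.default_rng(2)
n, p = 100, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:2] = [3.0, -2.0]
y = X @ beta + rng.normal(size=n)

# Trade off smoothness against fit by minimizing the GCV score over lambda.
lams = np.logspace(-3, 3, 25)
best = min(lams, key=lambda lam: gcv_score(X, y, lam))
print(f"GCV-selected lambda: {best:.3g}")
```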

In recent years she and her students have applied these same statistically based theories to a diverse group of classification problems known in computer science as “machine learning.” Many areas of applied science have benefited from Wahba’s research, including satellite imaging, magnetic resonance imaging, meteorology, climatology, and DNA microarrays.

Source: http://chronicle.uchicago.edu/070607/honorarydegrees.shtml


Persi Diaconis

The Mary V. Sunseri Professor of Statistics and Professor of Mathematics, Stanford University

Doctor of Science
Four Hundred Seventy-third Convocation
June 13, 2003

Diaconis, the Mary V. Sunseri Professor of Statistics and Professor of Mathematics at Stanford University, has over the past 20 years had a major influence on the development of probability theory. He and his collaborators have created diverse and recondite mathematical tools for analyzing games of chance and their associated accouterments, such as cards, dice, coins and roulette wheels, and for other statistical investigations.

The mathematical theory of probability was born some 350 years ago, when the Chevalier de Méré brought to the attention of Pierre de Fermat and Blaise Pascal the problem of calculating odds in a dice-rolling game played in certain French casinos; Diaconis' contributions have carried that study far forward.

He is perhaps best known for his discovery with David Bayer that "seven shuffles suffice" to "randomize" a deck of cards. The problem of ascertaining how long a random process must run before reaching equilibrium recurs in almost every area of science where random processes arise. Accordingly, Diaconis' work and ideas have had ramifications throughout the sciences.
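
The Bayer-Diaconis analysis rests on the Gilbert-Shannon-Reeds model of a riffle shuffle: cut the deck at a Binomial(n, 1/2) position, then interleave the two packets, dropping cards with probability proportional to the packets' remaining sizes. They showed the number of shuffles needed grows like (3/2) log2 n, and for a 52-card deck they famously concluded that seven suffice. For a deck small enough to enumerate all orderings, the approach to uniformity can be watched directly; the sketch below is illustrative, with arbitrary deck size and trial count.

```python
import numpy as np
from itertools import permutations
from collections import Counter

def riffle(deck, rng):
    """One Gilbert-Shannon-Reeds shuffle: binomial cut, then interleave with
    probabilities proportional to the remaining packet sizes."""
    cut = rng.binomial(len(deck), 0.5)
    left, right = list(deck[:cut]), list(deck[cut:])
    out = []
    while left or right:
        if rng.random() < len(left) / (len(left) + len(right)):
            out.append(left.pop(0))
        else:
            out.append(right.pop(0))
    return tuple(out)

# Estimate total-variation distance to uniform for a 5-card deck.
# (Monte Carlo noise adds roughly 0.01 to each estimate.)
rng = np.random.default_rng(3)
deck, trials, uniform = tuple(range(5)), 100_000, 1 / 120  # 5! = 120 orderings
for k in range(1, 6):
    counts = Counter()
    for _ in range(trials):
        d = deck
        for _ in range(k):
            d = riffle(d, rng)
        counts[d] += 1
    tv = 0.5 * sum(abs(counts.get(p, 0) / trials - uniform)
                   for p in permutations(deck))
    print(f"{k} shuffles: TV distance to uniform ~ {tv:.3f}")
```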

Before he embarked on a career in mathematics, Diaconis was a professional magician with expertise in card tricks. By bringing the power of abstract mathematical reasoning to bear on the concrete problems of chance that arise from the milieu of gambling, magic and everyday coincidence, Diaconis followed the best and oldest tradition of probability.

Source: http://chronicle.uchicago.edu/030612/hon-degrees.shtml


David Aldous

Professor of Statistics, University of California, Berkeley

Doctor of Science
Four Hundred Sixty-second Convocation
November 2, 2000

David Aldous, a professor of statistics at the University of California, Berkeley, will be awarded a Doctor of Science degree. Aldous’ contributions place him among the world’s leading practitioners of both mathematical probability and the theory of computing.

His 1989 monograph, Probability Approximations via the Poisson Clumping Heuristic, concisely shows how to solve more than 100 challenging and wide-ranging problems in probability theory.

In 1993 he became the first recipient of the Line and Michel Loève International Prize in Probability. The prize recognizes the outstanding contributions of probability researchers under the age of 45.

Source: http://chronicle.uchicago.edu/001102/honorary-degrees.shtml


Bradley Efron

Professor in the Departments of Statistics and Health Research and Policy, Stanford University

Doctor of Science
Four Hundred Thirty-ninth Convocation
June 9, 1995

Efron, widely regarded as the most original and innovative statistician of his generation, will receive the Doctor of Science degree. He is a professor in the department of statistics and the department of health research and policy at Stanford.

Efron is regarded as the inventor of the bootstrap, a form of computer-intensive resampling for solving distributional problems connected with statistical inference, and his article on the topic is among the most influential papers in statistics of the 20th century. His work on the relationship between scientific inference and statistical models has set the research agenda for much of current statistics and has had a significant impact on a wide variety of sciences, from astronomy and paleontology to demography and medicine.
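
The bootstrap idea itself can be stated in a few lines: treat the observed sample as a stand-in for the unknown population, redraw from it with replacement many times, and recompute the statistic on each redraw to approximate its sampling distribution. A minimal percentile-interval sketch (the data and the choice of statistic are illustrative):

```python
import numpy as np

def bootstrap_ci(data, statistic, n_boot=10_000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for statistic(data)."""
    rng = np.random.default_rng(seed)
    n = len(data)
    # Recompute the statistic on n_boot resamples drawn with replacement.
    reps = np.array([statistic(data[rng.integers(0, n, size=n)])
                     for _ in range(n_boot)])
    return tuple(np.quantile(reps, [alpha / 2, 1 - alpha / 2]))

# Skewed data with a small sample: no textbook formula for the median's CI.
rng = np.random.default_rng(4)
sample = rng.exponential(scale=2.0, size=40)
print("sample median:   ", np.median(sample))
print("95% bootstrap CI:", bootstrap_ci(sample, np.median))
```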

A faculty member at Stanford since 1966, Efron received his B.S. in 1960 from Caltech and his M.S. in 1962 and his Ph.D. in 1964 from Stanford. He has received numerous honors and awards, including a MacArthur Fellowship in 1983 and the American Statistical Association's Wilks Medal in 1990. He is a member of the American Academy of Arts and Sciences and the National Academy of Sciences.

Source: http://chronicle.uchicago.edu/950525/honorary.shtml


Ulf Grenander

L. Herbert Ballou University Professor Emeritus, Division of Applied Mathematics, Brown University

Doctor of Science
Four Hundred Thirty-fifth Convocation
June 10, 1994

Citation: Professor Ulf Grenander is an extraordinarily innovative statistician and applied mathematician. He has a rare ability to integrate elements and structures from various fields of statistics and mathematics and thereby to create new paradigms of the most original, profound and esthetic nature. His work in the 1950s was a cornerstone in the development of modern time series analysis. In the 1960s he turned his attention to probabilities on algebraic structures, or probabilities on groups. This work, summarized in his 1963 book, was ahead of its time. Grenander's work in abstract inference, or inference about random processes, spans three decades, from his 1950 Ph.D. thesis to his 1981 book on the topic. The method of sieves, central to non-parametric regression and density estimation, is Grenander's creation. Starting in the early 1970s, he turned his attention to the immensely challenging problems connected with pattern theory, pattern recognition, and image analysis. This was unexplored intellectual territory at the time, and Grenander's clear conceptual framework charted the way for the mathematical discussion of pattern over the next two decades. He continues to set the pace in formulating problems and inventing solutions in the area of pattern analysis.


Charles M. Stein

Professor Emeritus of Statistics, Stanford University

Doctor of Science
Four Hundred Twenty-sixth Convocation, Second Session, Celebrating the Centennial Year
June 12, 1992

Presentation Statement: Professor Charles M. Stein obtained a B.S. degree at the University of Chicago in 1940 and taught at the University from 1951 to 1953, but for most of his career he has been in the Department of Statistics at Stanford University.  Through a series of brilliant discoveries of striking originality, Stein has reoriented the way theoretical statisticians view their field.

Stein's work is concerned with deep problems in the foundations of statistical inference.  Some of this has to do with the attainability of optimum results with statistical data assumed to be generated within a limited class of mechanisms (a parametric family of distributions), and some has to do with the radically different nature of multidimensional problems in inference.  His most remarkable discovery, made in the 1950s, was of a phenomenon that was totally at odds with intuition: if you are faced with a problem involving 3 or more equally uncertain simultaneous estimates, you can improve the result by "shrinking" the separate averages towards a common value, such as zero.  In a prosaic example, if you wish to simultaneously estimate the average wholesale prices of bushels of apples in Wenatchee, Washington, of oranges in Orlando, Florida, and of grapes in Modesto, California, and you will judge their performance by a combined measure of overall accuracy, you can expect to improve (on average) overall performance by decreasing all three separate estimates, despite the fact that the problems are in an important sense unrelated!  Of course some assumptions are involved, but they are surprisingly benign.  When Stein first advanced this, it made many good scholars acutely uncomfortable, and the discovery disturbed a whole community's complacency.  Now, 35 years later, nobody questions the result, and although there are debates about its practical relevance, the phenomenon (variously called the "Stein Paradox" or the "Stein Phenomenon") has had a profound impact on the field.
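
In the textbook Gaussian version of this phenomenon, one observes X ~ N(θ, I) in d ≥ 3 dimensions, and the James-Stein estimator (1 − (d − 2)/‖X‖²)X has uniformly smaller expected total squared error than X itself, whatever θ may be. A small simulation sketch of the three-crops example (the true means here are arbitrary):

```python
import numpy as np

# Three unrelated estimation problems (apples, oranges, grapes), unit noise.
rng = np.random.default_rng(5)
theta = np.array([1.0, 0.5, -0.5])        # true means, unknown in practice
d, trials = 3, 200_000
x = theta + rng.normal(size=(trials, d))  # one noisy observation per problem

# James-Stein: shrink each observation toward 0 by the factor 1 - (d-2)/||x||^2.
shrink = 1 - (d - 2) / np.sum(x**2, axis=1)
js = shrink[:, None] * x

print("total MSE, raw estimates:", np.mean(np.sum((x - theta)**2, axis=1)))
print("total MSE, James-Stein:  ", np.mean(np.sum((js - theta)**2, axis=1)))
# The shrunken estimates win on combined accuracy, as Stein proved.
```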

Stein has made several other discoveries of major importance.  In particular, he discovered new possibilities of attaining asymptotic efficiency with nonparametric estimates (1956), and he invented a radically new way of obtaining normal approximations (1972).  Both of these works have been increasingly influential since they were published.  Charles Stein is a brilliant, dedicated scholar; personally, he is unusually modest, and his prominence in the field is due entirely to the force of his ideas.

Citation: Whose research in mathematical statistics and probability theory has produced a series of brilliant discoveries of remarkable force and originality. His work on decision theory first startled and then inspired an entire community of scholars, changing forever the way statisticians view multidimensional problems of inference, and his novel methods for deriving approximations in probability theory have been adopted internationally.


Erich L. Lehmann

Professor of Statistics, University of California, Berkeley

Doctor of Science
Four Hundred Twenty-third Convocation, Celebrating the Start of the Centennial Year
October 3, 1991

Presentation Statement: Erich Lehmann is a mathematical statistician.  Mathematics flourishes in the detailed exploration of constrained structures with widely accepted rules; statistics flourishes with the flexibility to treat the infinite variety of problems in the real world.  Lehmann’s genius has been his ability to reconcile these divergent goals and enrich both sides.  During his long and distinguished career, Erich Lehmann has played a crucial role in the flourishing of the Neyman/Wald paradigm in theoretical statistics.  Lehmann organized the theory, led in developing new concepts, methods, and results, taught generations of students, and wrote the definitive books on the subject.

The focus of the approach that is so closely associated with Lehmann is the application of decision theory to statistical problems, the construction of a calculus of optimal statistical procedures.  The success of Lehmann and his school has come from the balance they have maintained in creating a system of mathematical structures of sufficient richness to encompass a large range of practical problems, yet sufficiently concise that a true discipline could be constructed around them.

Erich Lehmann has emphasized optimality as the unifying principle for theoretical statisticians.  The optimality principle can be viewed in two ways: Given a criterion, find the best statistical procedure according to the criterion; given a statistical procedure, find criteria for which it is optimal.  Lehmann’s books and papers have developed both views, thus throwing new light on established procedures.  He is (with Henry Scheffé) responsible for the concept of completeness and its use to find optimal estimates; he has explored the concept of unbiased tests of statistical hypotheses and showed that many standard statistical tests are uniformly most powerful unbiased.  He has also played a key role in the development of mathematical theory for nonparametric inference, showing how this amorphous subject could be treated in a rigorous disciplinary framework.

Erich Lehmann’s texts have shaped the paradigm he is associated with.  Not all of the results are his (though many are), but the arrangement, the elegant, seamless presentation, and the coherence of the whole are his in a way that is seldom seen.  To a large degree, Erich Lehmann created the curriculum of the world’s graduate programs in mathematical statistics from midcentury onward.

Citation: Your research on the applications of statistical theory to the construction of a calculus of optimal statistical procedures has helped create and organize modern mathematical statistics. You and your school have succeeded in maintaining a remarkable balance in creating a system of mathematical structures of sufficient richness to encompass a large range of practical problems, yet sufficiently concise that a true discipline could be constructed around them. Your several elegant treatises have guided the curricula of graduate programs in statistics and thus given shape to this discipline, and your teaching has inspired a generation of scholars.


Frederick Mosteller

Professor of Mathematical Statistics, Harvard University

Doctor of Science
Three Hundred Forty-sixth Convocation, In Celebration of the Opening of the Harper Memorial Library College Center
October 26, 1973

Citation: Masterful teacher, distinguished scholar, and imaginative leader, whose exemplary work has enriched and advanced the teaching and the practice of statistical and quantitative inquiry.


John Wilder Tukey

Professor of Mathematics and Statistics, Princeton University, and Associate Executive Director of Research, Bell Telephone Laboratories

Doctor of Science
Three Hundred Twenty-eighth Convocation
June 13, 1969

Citation: Illustrious scientific generalist, who has redirected the development of statistics by his far-reaching contributions to theory and methods, and by his creative use of statistics and data analysis in the pure and applied sciences.


Maurice Stevenson Bartlett

Professor and Head, Department of Statistics, University College, London, England

Doctor of Science
Three Hundred Fourteenth Convocation
June 10, 1966

Citation: Distinguished innovator in statistical theory and application, whose wisdom has guided the balanced development of statistical inference.


Jerzy Neyman

Director of the Statistical Laboratory, and Research Professor in the Institute for Basic Research in Science, University of California, Berkeley

Doctor of Science
Two Hundred Eighty-second Convocation
June 12, 1959

Citation: Illustrious creator and analyst of statistical methods, whose theoretical studies and wise applications of statistics have provided insight into the nature of inductive behavior.


Harold Hotelling

Professor of Statistics, University of North Carolina

Doctor of Laws
Two Hundred Sixty-seventh Convocation, In Commemoration of the Twenty-fifth Anniversary of the Social Science Research Building of the University of Chicago
November 11, 1955

Citation: Foremost contemporary contributor of quantitative methods to the social sciences, who by mathematical analysis has notably advanced our understanding of fundamental problems in economics and in statistics.

Sir Ronald Aylmer Fisher

Arthur Balfour Professor of Genetics, University of Cambridge, England

Doctor of Science
Two Hundred Fifty-first Convocation
June 13, 1952

Citation: The greatest figure in the history of statistics and one of the greatest in the history of scientific method generally, whose creation of the science of experimental design and formulation of the principles of interpreting evidence have opened the way and carried us far toward an understanding of the logic of inductive reasoning.
