Statistics/Operations Research/Math Finance

Estimation of Stochastic Dependence via Kendall's Tau

10/17/2013 - 4:00pm
10/17/2013 - 5:00pm
Speaker: 
Uwe Schmock (TU Vienna)
Abstract: 

We discuss the performance of different estimators of dependence
measures, concentrating on the linear correlation coefficient and the
rank-based measure Kendall's tau. As the estimator of Kendall's tau
is a U-statistic, it is asymptotically normal. We calculate the
asymptotic variance explicitly for the Farlie-Gumbel-Morgenstern
copula, the Clayton copula, the Ali-Mikhail-Haq copula and the
Marshall-Olkin copula. For the Clayton copula, the result is
expressed using a generalized hypergeometric function, which can be
reduced to a more elementary form for specific parameter values of
the copula.

For elliptical distributions there is a unique connection between the
linear correlation coefficient and Kendall's tau. This offers two
ways for estimation: using the standard estimator of the linear
correlation directly or estimating Kendall's tau and transforming it
into a linear correlation estimate. Since both estimators are
asymptotically normal under appropriate moment conditions, we use the
asymptotic variance to compare their performance. For the
uncorrelated t-distribution, using the fact that it is a variance
mixture of normal distributions, we show that the asymptotic variance
equals an integral involving the square of the arctangent and a
hypergeometric function. For all integer-valued degrees of freedom,
involved and tricky integrations lead to a surprisingly simple closed
form for the asymptotic variance. It turns out that especially for
small degrees of freedom the alternative estimation via Kendall's tau
performs much better than the standard estimator. (This is joint work
with Barbara Dengler.)
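
A minimal simulation sketch of the comparison described above (illustrative only, not the speaker's code): it contrasts the sample linear correlation with the Kendall's-tau-based estimate rho = sin(pi*tau/2), which is valid for elliptical distributions, on bivariate t data. The sample size, correlation, degrees of freedom, and number of replications are arbitrary choices.

import numpy as np
from scipy import stats

def simulate_bivariate_t(n, rho, df, rng):
    """Draw n samples from a bivariate t: a variance mixture of normals."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(np.zeros(2), cov, size=n)
    chi2 = rng.chisquare(df, size=n)
    return z * np.sqrt(df / chi2)[:, None]

rng = np.random.default_rng(0)
n, rho, df, reps = 200, 0.3, 5, 500
pearson_est, tau_based_est = [], []
for _ in range(reps):
    x = simulate_bivariate_t(n, rho, df, rng)
    pearson_est.append(stats.pearsonr(x[:, 0], x[:, 1])[0])
    tau = stats.kendalltau(x[:, 0], x[:, 1])[0]
    tau_based_est.append(np.sin(np.pi * tau / 2))  # transform tau into a correlation estimate

print("Pearson estimator:       mean %.3f, sd %.3f" % (np.mean(pearson_est), np.std(pearson_est)))
print("Kendall-based estimator: mean %.3f, sd %.3f" % (np.mean(tau_based_est), np.std(tau_based_est)))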

Where: 
Roberts South 105, CMC

On minimum correlation in construction of multivariate distributions

02/07/2013 - 4:15pm
02/07/2013 - 5:15pm
Speaker: 
Nevena Maric (University of Missouri-St. Louis)
Abstract: 

In this talk we will present an algorithm for exact generation of bivariate samples with pre-specified marginal distributions and a given correlation, based on a mixture of Fréchet–Hoeffding bounds and marginal products. The algorithm can accommodate any theoretically possible correlation coefficient, and it makes explicit the connection between simulation and the minimum correlation attainable for different distribution families. We calculate the minimum correlations in several common distributional examples, including some that have not been examined before. The method can also be extended to the multivariate setting. As an illustration, we provide the details and results of implementing the algorithm for generating three-dimensional negatively and positively correlated Beta random variables, making it the only non-copula algorithm for correlated Beta simulation in dimensions greater than two. Joint work with Vanja Dukic.
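
The following sketch illustrates the mixture idea in the simplest bivariate case (an illustration under stated assumptions, not necessarily the exact algorithm of the talk): because the comonotonic, countermonotonic, and independence couplings all share the same marginals, the correlation of a mixture is linear in the mixture weight, so a weight hitting any attainable target correlation can be solved for once the extreme correlations are known. The Beta marginals and target correlation below are arbitrary.

import numpy as np
from scipy import stats

def sample_with_correlation(Finv, Ginv, r, n, rng, m=200_000):
    """Finv, Ginv: quantile functions of the two marginals; r: target Pearson correlation."""
    u = rng.uniform(size=m)
    # Monte Carlo estimates of the extreme attainable correlations
    r_max = np.corrcoef(Finv(u), Ginv(u))[0, 1]        # comonotonic (upper Frechet-Hoeffding bound)
    r_min = np.corrcoef(Finv(u), Ginv(1 - u))[0, 1]    # countermonotonic (lower bound)
    if not (r_min <= r <= r_max):
        raise ValueError("target correlation not attainable for these marginals")
    w = r / r_max if r >= 0 else r / r_min             # weight on the bound coupling
    u1, u2 = rng.uniform(size=n), rng.uniform(size=n)
    use_bound = rng.uniform(size=n) < w
    v = np.where(use_bound, u1 if r >= 0 else 1 - u1, u2)   # coupled or independent
    return Finv(u1), Ginv(v)

rng = np.random.default_rng(1)
# Example: Beta(2, 5) and Beta(5, 1) marginals with target correlation -0.3
x, y = sample_with_correlation(stats.beta(2, 5).ppf, stats.beta(5, 1).ppf,
                               r=-0.3, n=100_000, rng=rng)
print(np.corrcoef(x, y)[0, 1])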

Where: 
CMC, Roberts South 105

From Multifractional Brownian Motion to Multifractional Stochastic Volatility Models

10/11/2012 - 4:00pm
10/11/2012 - 5:00pm
Speaker: 
Qidi Peng (CGU)
Abstract: 

Hull and White and other authors in mathematical finance introduced the so-called stochastic volatility models to take into account randomness that is specific to volatility (this randomness is due to exogenous arrivals of information). Later, Comte and Renault proposed replacing the Brownian dynamics in these models by a fractional Brownian motion in order to make them more realistic. More recently, using the notion of generalized quadratic variations, Gloter and Hoffmann constructed estimators of the parameters of fractional stochastic volatility models. The goal of this talk is to introduce multifractional Brownian motion and multifractional stochastic volatility models and to extend some results of Gloter and Hoffmann to these new models. Finally, let us mention that multifractional stochastic volatility models make it possible to take into account the fact that the local regularity of financial signals changes from one time to another.
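
As a small illustration of quadratic-variation-based estimation (a hedged sketch, not the Gloter-Hoffmann construction and not a multifractional model), the code below simulates fractional Brownian motion by Cholesky factorization of its covariance and recovers the Hurst exponent from second-order quadratic variations at two scales; the sample size and H are arbitrary.

import numpy as np

def fbm(n, H, rng):
    """Fractional Brownian motion on {1/n, 2/n, ..., 1} via Cholesky factorization."""
    t = np.arange(1, n + 1) / n
    cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
                 - np.abs(t[:, None] - t[None, :]) ** (2 * H))
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

def estimate_H(x):
    """Hurst estimate from second-order quadratic variations at two scales."""
    d1 = x[2:] - 2 * x[1:-1] + x[:-2]            # second-order increments, lag 1
    d2 = x[4::2] - 2 * x[2:-2:2] + x[:-4:2]      # second-order increments, lag 2
    v1, v2 = np.mean(d1 ** 2), np.mean(d2 ** 2)
    return 0.5 * np.log2(v2 / v1)                # E[d^2] scales like (lag)^{2H}

rng = np.random.default_rng(2)
x = fbm(1000, H=0.7, rng=rng)
print(estimate_H(x))   # should be close to 0.7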

Where: 
RN 104 (CMC campus)

Analysis of racial profiling by police

03/08/2012 - 4:15pm
Speaker: 
Greg Ridgeway (RAND Corporation)
Abstract: 

Several studies and high profile incidents around the nation involving police and minorities, such as the July 2009 arrest of Harvard Professor Henry Louis Gates, have brought the issue of racial profiling to national attention. While civil rights issues continue to arise in other areas such as offers of employment, job promotions, and school admissions, the issue of race disparities in traffic stops seems to have garnered much of the attention in recent years. Many communities have asked, and at times the U.S. Department of Justice has required, that law enforcement agencies collect and analyze data on all traffic stops.
Data collection efforts, however, have so far outpaced the development of methods that can isolate the effect of race bias on officers' decisions to stop, cite, or search motorists. In this talk Dr. Ridgeway will first describe a test for detecting race bias in the decision to stop a driver that does not require explicit, external estimates of the driver risk set. Second, he will describe an internal benchmarking methodology for identifying potential problem officers. Lastly, he will describe methods for assessing racial disparities in citations, searches, and stop duration. He will present results from his studies of the Oakland (CA), Cincinnati, and New York City Police Departments.
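
The stop-decision test appears to be the "veil of darkness" approach of the related paper Grogger and Ridgeway (2006) listed below: within the inter-twilight window, where stops occur both in daylight and in darkness at the same clock times, darkness should carry no information about the race of stopped drivers unless race influences the stop decision. A minimal sketch of such a test follows; the data file and column names are hypothetical placeholders.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per stop, with clock time, a darkness indicator,
# and an indicator for whether the stopped driver is a minority.
stops = pd.read_csv("traffic_stops.csv")

# Restrict to the inter-twilight window, where both light conditions occur
# at the same clock times over the course of the year.
window = stops[(stops.clock_time >= stops.earliest_sunset) &
               (stops.clock_time <= stops.latest_sunset)]

# Under the no-bias null, darkness should not predict driver race once clock
# time is controlled for; a significantly lower minority share in darkness
# is evidence consistent with profiling.
model = smf.logit("minority ~ dark + C(clock_hour)", data=window).fit()
print(model.summary())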

Where: 
Davidson Lecture room, Adams Hall, Claremont McKenna College
Misc. Information: 

Greg Ridgeway, Ph.D., is a Senior Statistician and Director of the Safety & Justice Research Program at the RAND Corporation.

Some related papers to the talk:
J. Grogger and G. Ridgeway (2006). “Testing for racial profiling in traffic stops from behind a veil of darkness,” Journal of the American Statistical Association 101(475):878–887. ASA 2007 Outstanding Statistical Application Award
G. Ridgeway (2006). “Assessing the effect of race bias in post-traffic stop outcomes using propensity scores,” Journal of Quantitative Criminology 22(1):1–29.
G. Ridgeway and J.M. MacDonald (2009). “Doubly Robust Internal Benchmarking and False Discovery Rates for Detecting Racial Bias in Police Stops,” Journal of the American Statistical Association 104(486):661–668.

Constructing Network Variables

02/16/2012 - 4:15pm
Speaker: 
Austen Head
Abstract: 

Data often come in the form of values on vertices which are embedded in multiple different networks. For an example from epidemiology, we may know who is infected with a disease currently and who was infected at a previous time point. If we know the networks in which these individuals are embedded (e.g., a friendship network, group memberships, and any number of other networks), we may want to predict along which of these networks the disease propagated most and also which individuals are most at risk for becoming infected. We present a simple method to restructure vertex valued multi-network data into a framework that allows for the use of standard statistical prediction and data exploration tools.
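
A toy sketch of the restructuring described above (illustrative only; the networks, labels, and exposure feature are placeholder choices, not the speaker's construction): each vertex becomes a row of a design matrix, each network contributes a feature such as the number of previously infected neighbors in that network, and standard tools such as logistic regression then apply directly.

import numpy as np
import networkx as nx
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 200
friendship = nx.erdos_renyi_graph(n, 0.05, seed=1)   # toy "friendship" network
membership = nx.erdos_renyi_graph(n, 0.02, seed=2)   # toy "group membership" network
infected_before = rng.random(n) < 0.1                # infection status at the earlier time
infected_now = rng.random(n) < 0.15                  # infection status now (toy labels)

def exposure(graph, status):
    """For each vertex, count neighbors that were infected at the earlier time."""
    return np.array([sum(status[j] for j in graph.neighbors(i)) for i in range(n)])

# One row per vertex, one exposure feature per network
X = np.column_stack([exposure(friendship, infected_before),
                     exposure(membership, infected_before)])
clf = LogisticRegression().fit(X, infected_now)
print(dict(zip(["friendship_exposure", "membership_exposure"], clf.coef_[0])))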

Where: 
Davidson Lecture Hall, Claremont McKenna College
Misc. Information: 

 Austen Head's website:
http://www-stat.stanford.edu/~ahead/
 

Visual Recognition with Humans in the Loop

02/23/2012 - 4:45pm
02/23/2012 - 5:15pm
Speaker: 
Serge Belongie (UC San Diego)
Abstract: 

We present an interactive, hybrid human-computer method for object classification. The method applies to classes of problems that are difficult for most people, but are recognizable by people with the appropriate expertise (e.g., animal species or airplane model recognition). The classification method can be seen as a visual version of the 20 questions game, where questions based on simple visual attributes are posed interactively. The goal is to identify the true class while minimizing the number of questions asked, using the visual content of the image. Incorporating user input drives up recognition accuracy to levels that are good enough for practical applications; at the same time, computer vision reduces the amount of human interaction required. The resulting hybrid system is able to handle difficult, large multi-class problems with tightly-related categories. We introduce a general framework for incorporating almost any off-the-shelf multi-class object recognition algorithm into the visual 20 questions game, and provide methodologies to account for imperfect user responses and unreliable computer vision algorithms. We evaluate the accuracy and computational properties of different computer vision algorithms and the effects of noisy user responses on a dataset of 200 bird species and on the Animals With Attributes dataset. Our results demonstrate the effectiveness and practicality of the hybrid human-computer classification paradigm.

This work is part of the Visipedia project, in collaboration with Steve Branson, Catherine Wah, Florian Schroff, Boris Babenko, Peter Welinder and Pietro Perona.
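
To make the "20 questions" framing concrete, here is a generic sketch (a simplification under stated assumptions, not the authors' model): a computer-vision prior over classes is sharpened by asking about binary attributes, each question is chosen greedily to maximize expected information gain, and user answers are treated as noisy with a fixed error rate.

import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def ask_questions(prior, attr_table, answer_fn, noise=0.1, n_questions=3):
    """prior: class probabilities from a vision model; attr_table[c, a] = 1 if class c has attribute a."""
    post = prior.copy()
    asked = set()
    for _ in range(n_questions):
        gains = []
        for a in range(attr_table.shape[1]):
            if a in asked:
                gains.append(-np.inf)
                continue
            lik_yes = attr_table[:, a] * (1 - noise) + (1 - attr_table[:, a]) * noise
            p_yes = np.sum(post * lik_yes)
            h_yes = entropy(post * lik_yes / p_yes)
            h_no = entropy(post * (1 - lik_yes) / (1 - p_yes))
            gains.append(entropy(post) - p_yes * h_yes - (1 - p_yes) * h_no)  # information gain
        a = int(np.argmax(gains))
        asked.add(a)
        answer = answer_fn(a)                          # pose the question to the user
        lik = attr_table[:, a] * (1 - noise) + (1 - attr_table[:, a]) * noise
        post = post * (lik if answer else 1 - lik)     # Bayesian update with a noisy answer
        post = post / post.sum()
    return post

# Toy usage: four classes, three binary attributes, a slightly informative vision prior,
# and a user who answers truthfully about class 2.
prior = np.array([0.4, 0.3, 0.2, 0.1])
attrs = np.array([[1, 0, 1], [1, 1, 0], [0, 1, 1], [0, 0, 0]], dtype=float)
posterior = ask_questions(prior, attrs, answer_fn=lambda a: bool(attrs[2, a]))
print(posterior)   # probability mass should concentrate on class 2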

Where: 
Davidson Lecture Hall, Claremont McKenna College
Misc. Information: 

Serge Belongie received the B.S. degree (with honor) in Electrical Engineering from the California Institute of Technology in 1995 and the M.S. and Ph.D. degrees in Electrical Engineering and Computer Sciences (EECS) at U.C. Berkeley in 1997 and 2000, respectively. While at Berkeley, his research was supported by a National Science Foundation Graduate Research Fellowship. He is also a co-founder of Digital Persona, Inc., and the principal architect of the Digital Persona fingerprint recognition algorithm. He is currently a Professor in the Computer Science and Engineering Department at U.C. San Diego. His research interests include computer vision and pattern recognition. He is a recipient of the NSF CAREER Award and the Alfred P. Sloan Research Fellowship. In 2004 MIT Technology Review named him to the list of the 100 top young technology innovators in the world (TR100).

Probabilistic Numerical Methods for Fully Nonlinear Parabolic PDEs

04/19/2012 - 4:15pm
04/19/2012 - 5:15pm
Speaker: 
Jianfeng Zhang (USC)
Abstract: 

Motivated by the remarkable work of Fahim, Touzi, and Warin (2010), we introduce a probabilistic numerical method for fully nonlinear parabolic PDEs in this talk. By using a certain trinomial tree instead of Brownian motion, we remove a serious bound constraint imposed in Fahim, Touzi, and Warin (2010). Our scheme works well for high-dimensional PDEs with a diagonally dominant coefficient of the Hessian matrix, and it is fast and stable when the dimension is low (d <= 3). As a special case, our scheme can be applied to solve high-dimensional coupled FBSDEs, especially when the forward diffusion is diagonal. We will show several numerical examples, with dimension up to 12. The talk is based on joint work with Wenjie Guo and Jia Zhuo.
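
A one-dimensional illustration of the trinomial-tree idea (a hedged sketch of a related scheme, not the method of the talk): for the uncertain-volatility equation u_t + sup_{sigma in [s_lo, s_hi]} (sigma^2/2) u_xx = 0 with terminal condition u(T, x) = g(x), each backward step can be read as an expectation on a trinomial tree whose up/down probabilities sigma^2 dt / (2 dx^2) depend on the maximizing sigma at each node, which is what makes the scheme nonlinear; monotonicity requires sigma^2 dt <= dx^2.

import numpy as np

def uncertain_vol_tree(payoff, s_lo, s_hi, T, x_min, x_max, nx=300, nt=400):
    """Backward trinomial-tree induction for u_t + sup_sigma (sigma^2/2) u_xx = 0."""
    x = np.linspace(x_min, x_max, nx)
    dx, dt = x[1] - x[0], T / nt
    assert s_hi ** 2 * dt <= dx ** 2, "increase nt (or decrease nx) to keep the scheme monotone"
    u = payoff(x)
    for _ in range(nt):
        d2 = np.zeros_like(u)
        d2[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx ** 2   # discrete second derivative
        # sup over sigma of (sigma^2/2) * d2 is attained at s_hi where d2 > 0
        # and at s_lo where d2 < 0
        sigma = np.where(d2 > 0, s_hi, s_lo)
        u = u + dt * 0.5 * sigma ** 2 * d2                    # explicit backward-in-time step
    return x, u

# Toy usage: butterfly-like payoff, volatility known only to lie in [0.1, 0.3]
x, u = uncertain_vol_tree(lambda x: np.maximum(1 - np.abs(x), 0.0),
                          s_lo=0.1, s_hi=0.3, T=1.0, x_min=-3.0, x_max=3.0)
print(u[len(u) // 2])   # upper value at x = 0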

Where: 
Davidson Lecture Hall, Claremont McKenna College

Optimal Investment under Model Uncertainty

04/12/2012 - 4:30pm
04/12/2012 - 5:30pm
Speaker: 
Scott Robertson (Carnegie Mellon University)
Abstract: 

Realistic models for stock prices incorporate randomness in future movements. For example, in the binomial model, during each time period the stock price goes either up or down, with the probability of an up movement given by some number p. Once p is fixed, the stock price, while still a random process, has a well-defined model governing its dynamics. In this talk, we will consider the case when there is uncertainty in how to model the dynamics of the underlying asset. Specialized to the binomial model, this means that we do not know p exactly, but rather that we know that p lies in some interval (a,b). In this environment, we wish to invest in a robust manner, so that we do relatively well in all possible models. It will be shown that the existence of an optimal trading strategy, different from just putting all your money in your pocket and doing nothing, is intimately related to the "distance" between the risk-neutral model and the class of acceptable models. Time permitting, results will be extended to the Black-Scholes model in continuous time.
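
A one-period sketch of the binomial setting described above (illustrative only, with arbitrary numbers): the asset moves up by factor u or down by factor d, the up probability is only known to lie in an interval, and the investor maximizes the worst-case expected log utility over the fraction of wealth held in the asset. When the interval contains the risk-neutral up probability (1 - d)/(u - d), doing nothing is worst-case optimal, which echoes the role of the risk-neutral model in the talk.

import numpy as np

def robust_log_utility(u, d, a, b, fractions=np.linspace(-1, 1, 2001)):
    """Fraction of wealth maximizing the min over p in [a, b] of expected log wealth."""
    def worst_case(pi):
        # expected log wealth is linear in p, so the minimum over the interval
        # is attained at one of its endpoints
        values = [p * np.log(1 + pi * (u - 1)) + (1 - p) * np.log(1 + pi * (d - 1))
                  for p in (a, b)]
        return min(values)
    vals = [worst_case(pi) for pi in fractions]
    best = int(np.argmax(vals))
    return fractions[best], vals[best]

# Example: up factor 1.1, down factor 0.95, so the risk-neutral up probability is 1/3,
# which lies inside the uncertainty interval (0.3, 0.5)
pi_star, val = robust_log_utility(u=1.1, d=0.95, a=0.3, b=0.5)
print(pi_star, val)   # pi_star = 0: doing nothing is worst-case optimal here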

Where: 
Third floor Sprague Library, Harvey Mudd College

Maternal-Fetal Genotype Incompatibility as a Risk Factor for Schizophrenia

03/01/2012 - 4:15pm
03/01/2012 - 5:15pm
Speaker: 
Christina Palmer
Abstract: 

Prenatal/obstetric complications are implicated in schizophrenia susceptibility. Some complications may arise from maternal-fetal genotype incompatibility, a term used to describe maternal-fetal genotype combinations that produce an adverse prenatal environment. As will be described, maternal-fetal genotype incompatibility can occur when maternal and fetal genotypes differ from one another, or when maternal and fetal genotypes are too similar to each other. Incompatibility genes for each of these scenarios have been implicated as risk factors for schizophrenia, and a review of maternal-fetal genotype incompatibility studies suggests that schizophrenia susceptibility is increased by maternal-fetal genotype combinations at the RHD, ABO, and HLA-B loci. Maternal-fetal genotype combinations at these loci are hypothesized to have an effect on the maternal immune system during pregnancy, which can affect fetal neurodevelopment and increase schizophrenia susceptibility. During this presentation, data including recent results from a pedigree analysis will be synthesized and the hypothesized biological role of these incompatibility genes in the etiology of schizophrenia will be described.

Where: 
Davidson Lecture Hall, Claremont McKenna College
Misc. Information: 

Professor of Psychiatry and Biobehavioral Sciences, Human Genetics, Institute for Society and Genetics
David Geffen School of Medicine at UCLA

Risk averse capacity control in revenue management

03/29/2012 - 4:15pm
03/29/2012 - 5:15pm
Speaker: 
Christiane Barz (UCLA)
Abstract: 

Traditionally, revenue management models aim at maximizing expected revenue, i.e., a risk-neutral decision-maker is assumed. During the last few years, however, the consideration of revenue risk has gained more and more attention. By failing to suggest mechanisms for reducing unfavorable revenue levels, traditional risk-neutral capacity control models fall short of meeting the needs of a risk-averse planner. This is why we revisit the well-known capacity control problem in revenue management from the perspective of a risk-averse decision-maker. For an expected-utility-maximizing decision-maker, the problem is formulated as a risk-sensitive Markov decision process. Special emphasis is put on the existence of structured optimal policies.
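
A toy sketch of one such risk-sensitive dynamic program (a simplified single-leg instance with exponential utility and arbitrary numbers, not the model of the talk): with utility u(w) = -exp(-gamma w), backward induction can be carried out on W_t(x) = min E[exp(-gamma * future revenue)] over accept/reject decisions, and a request with fare r is accepted when exp(-gamma r) W_{t+1}(x-1) < W_{t+1}(x).

import numpy as np

def risk_sensitive_policy(fares, probs, capacity, horizon, gamma):
    """Backward induction on W[x] = expected exponential disutility of future revenue with x seats left."""
    W = np.ones(capacity + 1)
    accept = np.zeros((horizon, len(fares), capacity + 1), dtype=bool)
    for t in reversed(range(horizon)):
        W_new = W.copy()
        for x in range(capacity + 1):
            total = (1 - sum(probs)) * W[x]                      # no request arrives
            for j, (r, lam) in enumerate(zip(fares, probs)):
                reject = W[x]
                acc = np.exp(-gamma * r) * W[x - 1] if x > 0 else np.inf
                accept[t, j, x] = acc < reject                   # accept iff it lowers disutility
                total += lam * min(acc, reject)
            W_new[x] = total
        W = W_new
    return accept

# Example: two fare classes, 10 seats, 30 decision periods
policy = risk_sensitive_policy(fares=[500, 150], probs=[0.2, 0.5],
                               capacity=10, horizon=30, gamma=0.002)
# With how many seats remaining is a low-fare request accepted in the first period?
print([x for x in range(11) if policy[0, 1, x]])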

Where: 
Davidson Lecture Hall, Claremont McKenna College