On the Construction of Minimax Optimal Nonparametric Tests with Kernel Embedding Methods

Tong Li


Last edited by MARC Bot, December 15, 2022.


Kernel embedding methods have seen a great deal of practical success in nonparametric hypothesis testing in recent years. Yet ever since their first proposal, researchers in this area have faced an unavoidable question: which kernel should be selected, given that the performance of the associated nonparametric tests can vary dramatically across kernels? While kernel selection is usually ad hoc, we ask whether there is a principled way to select a kernel so that the associated nonparametric tests perform well. Because consistency results against fixed alternatives do not tell the full story about the power of the associated tests, we study their statistical performance within the minimax framework. First, focusing on goodness-of-fit tests, our analysis shows that a vanilla version of the kernel embedding based test can be suboptimal, and suggests a simple remedy: moderating the kernel. We prove that the moderated approach yields optimal tests against a wide range of deviations from the null and can also be made adaptive over a large collection of interpolation spaces.

Then, we study the asymptotic properties of goodness-of-fit, homogeneity, and independence tests using Gaussian kernels, arguably the most popular and successful among such tests. Our results provide theoretical justification for this common practice by showing that tests using a Gaussian kernel with an appropriately chosen scaling parameter are minimax optimal against smooth alternatives in all three settings. In addition, our analysis pinpoints the importance of choosing a diverging scaling parameter when using Gaussian kernels, and suggests a data-driven choice of the scaling parameter that yields tests that are optimal, up to an iterated logarithmic factor, over a wide range of smooth alternatives. Numerical experiments further demonstrate the practical merits of our methodology.
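The thesis' moderated and adaptive constructions are not reproduced in this record, but the basic ingredient behind Gaussian-kernel homogeneity tests — an unbiased maximum mean discrepancy (MMD) statistic with a tunable scaling parameter — is standard and can be sketched as follows. This is a minimal illustration, not the author's code; the function names `gaussian_kernel` and `mmd2_unbiased` and the choice `gamma=0.5` are ours.

```python
import numpy as np

def gaussian_kernel(x, y, gamma):
    """Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2); gamma is the scaling parameter."""
    diff = x[:, None, :] - y[None, :, :]          # shape (m, n, d)
    return np.exp(-gamma * np.sum(diff**2, axis=-1))

def mmd2_unbiased(x, y, gamma):
    """Unbiased estimate of the squared MMD between the samples x and y."""
    m, n = len(x), len(y)
    kxx = gaussian_kernel(x, x, gamma)
    kyy = gaussian_kernel(y, y, gamma)
    kxy = gaussian_kernel(x, y, gamma)
    # Drop diagonal terms so the within-sample averages are unbiased.
    sum_xx = (kxx.sum() - np.trace(kxx)) / (m * (m - 1))
    sum_yy = (kyy.sum() - np.trace(kyy)) / (n * (n - 1))
    return sum_xx + sum_yy - 2 * kxy.mean()

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(500, 1))   # sample from P
y = rng.normal(1.0, 1.0, size=(500, 1))   # sample from Q (shifted mean)
z = rng.normal(0.0, 1.0, size=(500, 1))   # second sample from P

print(mmd2_unbiased(x, y, gamma=0.5))     # noticeably positive: P != Q
print(mmd2_unbiased(x, z, gamma=0.5))     # near zero: both samples from P
```

In practice the test rejects when the statistic exceeds a threshold calibrated, e.g., by permutation; how `gamma` should scale with the sample size is exactly the question the diverging scaling parameter results above address.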

Publish Date
Language
English


Book Details


Edition Notes

Department: Statistics.

Thesis advisor: Ming Yuan.

Thesis (Ph.D.)--Columbia University, 2021.

Published in
[New York, N.Y.?]

The Physical Object

Pagination
1 online resource.

ID Numbers

Open Library
OL44027163M
OCLC/WorldCat
1237771699

Source records

marc_columbia MARC record


History

December 15, 2022: Created by MARC Bot (imported new book).