Bootstrap Diagnostics for Irregular Estimators / Isaiah Andrews, Jesse M. Shapiro.

By: Andrews, Isaiah
Contributor(s): Shapiro, Jesse M.
Material type: Text
Series: Working Paper Series (National Bureau of Economic Research) ; no. w32038
Publication details: Cambridge, Mass.: National Bureau of Economic Research, 2024
Description: 1 online resource: illustrations (black and white)
Other classification:
  • C18
  • C44
  • D81
Online resources:
Available additional physical forms:
  • Hardcopy version available to institutional subscribers
Abstract: Empirical researchers frequently rely on normal approximations in order to summarize and communicate uncertainty about their findings to their scientific audience. When such approximations are unreliable, they can lead the audience to make misguided decisions. We propose to measure the failure of the conventional normal approximation for a given estimator by the total variation distance between a bootstrap distribution and the normal distribution parameterized by the point estimate and standard error. For a wide class of decision problems and a class of uninformative priors, we show that a multiple of the total variation distance bounds the mistakes which result from relying on the conventional normal approximation. In a sample of recent empirical articles that use a bootstrap for inference, we find that the conventional normal approximation is often poor. We suggest and illustrate convenient alternative reports for such settings.
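The diagnostic described in the abstract can be illustrated with a short simulation. The sketch below is not the authors' implementation: the data-generating process, the choice of estimator (a sample median), the use of the bootstrap standard error, and the kernel-density approximation of the bootstrap distribution are all illustrative assumptions. It estimates the total variation distance between a nonparametric bootstrap distribution of an estimator and the normal distribution parameterized by the point estimate and standard error.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=2.0, size=200)   # hypothetical sample (assumption)
    theta_hat = np.median(x)                   # hypothetical point estimate (assumption)

    # Nonparametric bootstrap of the estimator.
    B = 2000
    boot = np.array([np.median(rng.choice(x, size=x.size, replace=True))
                     for _ in range(B)])
    se_hat = boot.std(ddof=1)                  # bootstrap standard error (assumption)

    # Approximate TV distance: 0.5 * integral of |f_bootstrap - f_normal|,
    # smoothing the bootstrap draws with a Gaussian kernel density estimate.
    grid = np.linspace(boot.min() - 4 * se_hat, boot.max() + 4 * se_hat, 2000)
    f_boot = stats.gaussian_kde(boot)(grid)
    f_norm = stats.norm.pdf(grid, loc=theta_hat, scale=se_hat)
    tv = 0.5 * np.sum(np.abs(f_boot - f_norm)) * (grid[1] - grid[0])
    print(f"Estimated total variation distance: {tv:.3f}")

Under the interpretation given in the abstract, a value near zero indicates that the conventional normal approximation tracks the bootstrap distribution closely, while larger values flag settings where alternative reports may be warranted.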

January 2024.

System requirements: Adobe [Acrobat] Reader required for PDF files.

Mode of access: World Wide Web.

Print version record
