This problem introduces a technique called the jackknife, originally proposed by Quenouille (1956) for reducing bias. Many nonlinear estimates, including the ratio estimator, have the property that
\(E(\hat{\theta})=\theta+\frac{b_{1}}{n}+\frac{b_{2}}{n^{2}}+\cdots\)
where \(\hat{\theta}\) is an estimate of \(\theta\). The jackknife forms an estimate \(\hat{\theta}_{J}\) whose leading bias term is of order \(n^{-2}\) rather than \(n^{-1}\). Thus, for sufficiently large n, the bias of \(\hat{\theta}_{J}\) is substantially smaller than that of \(\hat{\theta}\). The technique involves splitting the sample into several subsamples, computing the estimate for each subsample, and then combining the several estimates. The sample is split into p groups of size m, where \(n = mp\). For \(j = 1, \ldots, p\), the estimate \(\hat{\theta}_{j}\) is calculated from the \(m(p-1)\) observations left after the jth group has been deleted. From the preceding expression,
\(E\left(\hat{\theta}_{j}\right)=\theta+\frac{b_{1}}{m(p-1)}+\frac{b_{2}}{[m(p-1)]^{2}}+\cdots\)
Now, p “pseudovalues” are defined:
\(V_{j}=p \hat{\theta}-(p-1) \hat{\theta}_{j}\)
The jackknife estimate, \(\hat{\theta}_{J}\), is defined as the average of the pseudovalues:
\(\hat{\theta}_{J}=\frac{1}{p} \sum_{j=1}^{p} V_{j}\)
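As a concrete illustration of the procedure above (a minimal sketch, not part of the original problem, with the common choice \(m = 1\), \(p = n\), so each "group" is a single observation; the names `jackknife` and `biased_var` are illustrative): applying the jackknife to the biased variance estimator \(\frac{1}{n}\sum(X_i - \bar{X})^2\), whose bias is exactly \(-\sigma^2/n\), reproduces the unbiased sample variance.

```python
import numpy as np

def jackknife(data, estimator):
    """Delete-one jackknife (m = 1, p = n).

    Computes theta_hat_j on each leave-one-out sample, forms the
    pseudovalues V_j = p*theta_hat - (p - 1)*theta_hat_j, and returns
    their average, theta_hat_J.
    """
    n = len(data)
    theta_hat = estimator(data)
    theta_j = np.array([estimator(np.delete(data, j)) for j in range(n)])
    pseudovalues = n * theta_hat - (n - 1) * theta_j
    return pseudovalues.mean()

# Plug-in variance estimator (divides by n, so b1 = -sigma^2 here).
def biased_var(x):
    return np.mean((x - np.mean(x)) ** 2)

rng = np.random.default_rng(0)
x = rng.normal(size=20)
print("biased estimate:   ", biased_var(x))
print("jackknife estimate:", jackknife(x, biased_var))
print("unbiased variance: ", np.var(x, ddof=1))
```

For this particular estimator the bias correction is exact: the jackknife estimate coincides with \(\frac{1}{n-1}\sum(X_i - \bar{X})^2\), so the \(b_1/n\) term is removed entirely (here \(b_2 = 0\)).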
Show that the bias of \(\hat{\theta}_{J}\) is of the order \(n^{-2}\).
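One way to see this (a sketch of the standard argument, keeping only terms through \(b_2\)): substitute the two bias expansions into the pseudovalues,

\(E(V_{j}) = pE(\hat{\theta}) - (p-1)E(\hat{\theta}_{j}) = p\left[\theta + \frac{b_{1}}{mp} + \frac{b_{2}}{(mp)^{2}}\right] - (p-1)\left[\theta + \frac{b_{1}}{m(p-1)} + \frac{b_{2}}{[m(p-1)]^{2}}\right] + \cdots\)

The \(b_{1}\) terms cancel, since \(\frac{b_{1}}{m} - \frac{b_{1}}{m} = 0\), leaving

\(E(\hat{\theta}_{J}) = \theta + \frac{b_{2}}{m^{2}}\left(\frac{1}{p} - \frac{1}{p-1}\right) + \cdots = \theta - \frac{b_{2}}{m^{2}p(p-1)} + \cdots\)

and \(m^{2}p(p-1) = n \cdot m(p-1) = n(n-m)\) grows like \(n^{2}\), so the leading bias term of \(\hat{\theta}_{J}\) is of order \(n^{-2}\).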