Random number generation and quasi-Monte Carlo methods
A quasi-Monte Carlo approach to particle simulation of the heat equation
SIAM Journal on Numerical Analysis
On the L2-discrepancy for anchored boxes
Journal of Complexity
Mathematics and Computers in Simulation (IMACS special issue on the Second IMACS Seminar on Monte Carlo Methods)
Variance with alternative scramblings of digital nets
ACM Transactions on Modeling and Computer Simulation (TOMACS)
Improved upper bounds on the star discrepancy of (t, m, s)-nets and (t, s)-sequences
Journal of Complexity
Smoothness and dimension reduction in Quasi-Monte Carlo methods
Mathematical and Computer Modelling: An International Journal
On probabilistic results for the discrepancy of a hybrid-Monte Carlo sequence
Journal of Complexity
Generating low-discrepancy sequences from the normal distribution: Box-Muller or inverse transform?
Mathematical and Computer Modelling: An International Journal
In problems of moderate dimension, the quasi-Monte Carlo method usually provides better estimates than the Monte Carlo method. However, as the dimension of the problem increases, the advantages of the quasi-Monte Carlo method diminish quickly. A remedy for this problem is to use hybrid sequences: sequences that combine pseudorandom and low-discrepancy vectors. In this paper we discuss a particular hybrid sequence called the mixed sequence. We provide improved discrepancy bounds for this sequence and prove a central limit theorem for the corresponding estimator. We also present numerical results comparing the mixed sequence with the Monte Carlo and randomized quasi-Monte Carlo methods.
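To illustrate the idea of a mixed sequence, the following is a minimal sketch (not the paper's implementation): the first k coordinates of each point are taken from a low-discrepancy sequence (here a Halton sequence, built from radical-inverse functions in distinct prime bases) and the remaining s - k coordinates are filled with pseudorandom numbers. The function names `halton` and `mixed_sequence`, and the choice of test integrand, are illustrative assumptions.

```python
import numpy as np

def halton(n, dim):
    """First n points of a Halton sequence in [0,1)^dim (illustrative sketch)."""
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]  # one base per dimension

    def radical_inverse(i, base):
        # Reflect the base-`base` digits of i about the radix point.
        x, f = 0.0, 1.0 / base
        while i > 0:
            x += f * (i % base)
            i //= base
            f /= base
        return x

    return np.array([[radical_inverse(i, primes[d]) for d in range(dim)]
                     for i in range(1, n + 1)])

def mixed_sequence(n, s, k, rng):
    """n points in [0,1)^s: quasi-random in the first k coordinates,
    pseudorandom in the remaining s - k coordinates."""
    quasi = halton(n, k)
    pseudo = rng.random((n, s - k))
    return np.hstack([quasi, pseudo])

# Example: estimate the integral of f(x) = x_1 * ... * x_s over [0,1]^s,
# whose exact value is (1/2)^s.
rng = np.random.default_rng(0)
s, k, n = 6, 3, 4096
pts = mixed_sequence(n, s, k, rng)
est = np.prod(pts, axis=1).mean()
```

Because the pseudorandom coordinates are independent across points, the resulting estimator is a genuine random variable, which is what makes a central limit theorem for it meaningful, in contrast to a purely deterministic quasi-Monte Carlo rule.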