This study is the first to use neural networks in the stochastic simulation of the number of rejected Web pages per search query: the evaluation of search engine quality should involve not only the resulting set of Web pages but also an estimate of the rejected set of Web pages. The iterative RBF neural network developed by Meghabghab and Nasr [1] was adapted to evaluate the number of rejected Web pages on four search engines: Yahoo, Alta Vista, Google, and Northern Light. Nine input variables were selected for the simulation. Whereas typical stochastic simulation meta-modeling uses regression models in Response Surface Methods, the RBF network divides the set of responses to a query into accepted and rejected Web pages. The RBF meta-model was trained on 937 examples drawn from a set of 9,000 different simulation runs over the 9 input variables. Results show that the number of rejected Web pages for a specific set of search queries on these four engines is very high. Moreover, a goodness measure of a search engine for a given set of queries can be designed as a function of the engine's coverage and the normalized age of a new document in the result set for the query. This study concludes that unless search engine designers address the issues of rejected Web pages, indexing, and crawling, the use of the Web as a research tool for academic and educational purposes will remain hindered.
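The abstract names the RBF meta-model but not its architecture or training procedure. The following is a minimal, hypothetical Python sketch of an RBF classifier that labels simulated query responses as accepted or rejected; the Gaussian basis functions, least-squares output weights, random center selection, and the synthetic features and labels are all illustrative assumptions, and the iterative training scheme of Meghabghab and Nasr [1] is not reproduced here.

```python
import numpy as np

def rbf_design_matrix(X, centers, width):
    """Gaussian RBF activations for each (sample, center) pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def train_rbf(X, y, n_centers=20, width=1.0, seed=0):
    """Pick random centers, then fit output weights by least squares."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_centers, replace=False)]
    H = rbf_design_matrix(X, centers, width)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return centers, width, w

def predict(X, centers, width, w, threshold=0.5):
    """1 = accepted page, 0 = rejected page."""
    return (rbf_design_matrix(X, centers, width) @ w >= threshold).astype(int)

# Example: 937 training samples with 9 input variables, mirroring the
# study's setup. Features and labels below are synthetic stand-ins.
X = np.random.default_rng(1).normal(size=(937, 9))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
centers, width, w = train_rbf(X, y)
rejected_count = (predict(X, centers, width, w) == 0).sum()
print("rejected pages in sample:", rejected_count)
```

In this sketch the meta-model plays the role the abstract describes: once trained on simulation runs, it partitions each query's response set into accepted and rejected pages, and the rejected count per query can then be aggregated across engines.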
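The abstract states only the dependencies of the goodness measure, not its concrete form. In hedged notation, with symbol names chosen here for illustration:

```latex
% Illustrative only: the abstract says the goodness measure depends on
% the engine's coverage and the normalized age of a new document in the
% result set, but does not specify the combining function f.
G(e, q) \;=\; f\bigl(\, C(e),\; \bar{a}(q) \,\bigr)
```

Here $C(e)$ stands for the coverage of engine $e$ and $\bar{a}(q)$ for the normalized age of a new document in the result set for query $q$; the form of $f$ is left unspecified, as in the abstract.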