A researcher in the Management Information Systems (MIS) field hopes that his or her research will contribute significantly to the field. Although arguably no study can have a perfect research design, and every study is susceptible to random error and to foreseen or unforeseen bias, the more rigorous a study's methodology, the more likely it is to yield meaningful results. Accordingly, a set of nine attributes, called survey methodological attributes (SMA), has been developed for assessing published surveys, essentially serving as criteria for identifying methodologically sound survey research. This article reports the results of a meta-analysis, using the SMA criteria, of three journals - the Management Information Systems Quarterly (MISQ), the Journal of Management Information Systems (JMIS), and Information Systems Research (ISR) - from their inauguration through 2004, so as to provide insights into the rigor of MIS survey research. We found that, although all three journals publish methodologically sound studies, two recurring problems were (1) failure to perform statistical tests for the effects of non-response error and (2) failure to use multiple data collection methods. These findings indicate a need for greater attention to survey design methods among empirical studies in the MIS field.