In this study, we explored gender differences in answers to constructed-response reading items from a state assessment program. Construct-related and construct-unrelated factors were identified by a content expert after a review of the literature and a pilot study. Four raters were trained to score the identified features on approximately 500 papers evenly divided across two grade levels and between genders. These features included correctness of answer, unrelated answers, inaccurate answers, number of words written, and a measure of syntactic complexity. The papers rated for these features had already been assigned holistic scores by local teachers using a state-provided rubric. The relations among these studied features, holistic scores, objective scores, and gender differences were explored through correlation and regression analyses. The results indicate that number of words written and number of unrelated responses showed significant gender differences, were related to holistic scores, and remained significant even when the other studied variables were statistically controlled. Further research is needed to investigate how these features might influence raters differently for male and female students.
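For readers who want to see the shape of the regression step described above, the sketch below shows one way such an analysis might be set up in Python with pandas and statsmodels. The abstract does not specify the model or the data layout, so the input file and all column names (`holistic_score`, `word_count`, `unrelated_answers`, and so on) are hypothetical placeholders.

```python
# A minimal sketch, assuming a DataFrame of scored papers with hypothetical
# column names; the actual variable coding in the study is not reported here.
import pandas as pd
import statsmodels.formula.api as smf

papers = pd.read_csv("scored_papers.csv")  # hypothetical input file

# Regress holistic score on the studied features, with gender entered as a
# categorical covariate, so each feature's coefficient reflects its relation
# to the holistic score with the other studied variables held constant.
model = smf.ols(
    "holistic_score ~ correctness + unrelated_answers + inaccurate_answers"
    " + word_count + syntactic_complexity + C(gender)",
    data=papers,
).fit()

print(model.summary())  # coefficients, standard errors, and p-values
```

Under this kind of specification, a feature such as word count "remaining significant" corresponds to its coefficient staying reliably nonzero after the other predictors are included in the model.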