Essay writing is a common way to evaluate students' writing skills. Automated Essay Scoring (AES) systems are designed to facilitate the grading task for large numbers of students in schools, universities, and testing companies. This study proposes a rule-based system that evaluates Arabic essays automatically through free-text analysis, with or without predefined model essays. The evaluation criteria cover both surface-level and text-processing aspects, including spelling, punctuation, essay structure, coherence, and essay style. To evaluate our system, we collected unrestricted Arabic essays from native Arabic writers with university-level proficiency. The dataset was manually scored following our rubric, which is based on common criteria for good writing. The results show that the system correctly evaluated around 73% of the essays with respect to their overall score, with performance varying across the individual criteria. This system is intended as a baseline for more advanced AES systems that combine richer features with machine learning techniques in future work.