Classification of human thought is an emerging research field that may allow us to understand human brain functions and further develop advanced brain-computer interface (BCI) systems. In the present study, we introduce a new approach to classifying various mental states from noninvasive electrophysiological recordings of human brain activity. We utilized the full spatial and spectral information contained in electroencephalography (EEG) signals recorded while a subject performed a specific mental task. To this end, the EEG data were converted into a 2D spatiospectral pattern map, each element of which was assigned a value of 1, 0, or -1 according to the degree of event-related synchronization (ERS) or event-related desynchronization (ERD). We evaluated the similarity between a current (input) 2D pattern map and the template pattern maps (database) by taking the inner product of the pattern matrices. The current 2D pattern map was then assigned to the class with the highest similarity value. To verify our approach, eight participants took part in the present study; their EEG data were recorded while they performed four different cognitive imagery tasks. Consistent ERS/ERD patterns were observed more frequently between trials of the same class than between trials of different classes, indicating that these spatiospectral pattern maps can be used to classify different mental states. The classification accuracy was evaluated for each participant, for both the proposed approach and a conventional mental-state classification method based on inter-hemispheric spectral power asymmetry, using leave-one-out cross-validation (LOOCV). An average accuracy of 68.13% ($\pm 9.64\%$) was attained for the proposed method, whereas an average accuracy of 57% ($\pm 5.68\%$) was attained for the conventional method (significance was assessed by a one-tailed paired $t$-test, $p < 0.01$), suggesting that the proposed simple classification approach may be a promising method for discriminating various mental states.
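
The template-matching step described above can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the toy 4-channel-by-5-band pattern maps, and the class labels are assumptions introduced only to show how an inner-product similarity between ERS/ERD pattern matrices could drive the classification.

```python
import numpy as np

def similarity(pattern_a: np.ndarray, pattern_b: np.ndarray) -> float:
    """Element-wise (Frobenius) inner product of two 2D pattern matrices
    whose entries are +1 (ERS), -1 (ERD), or 0 (no significant change)."""
    return float(np.sum(pattern_a * pattern_b))

def classify(input_map: np.ndarray, templates: dict) -> str:
    """Assign the input spatiospectral pattern map to the class whose
    template yields the highest similarity value."""
    return max(templates, key=lambda label: similarity(input_map, templates[label]))

# Toy template pattern maps (rows: channels, columns: frequency bands);
# the values are illustrative, not measured data.
templates = {
    "task_A": np.array([[ 1,  1, 0, -1, -1],
                        [ 1,  0, 0, -1,  0],
                        [ 0, -1, 1,  0,  1],
                        [-1, -1, 0,  1,  1]]),
    "task_B": np.array([[-1, -1, 0,  1,  1],
                        [ 0,  1, 1,  0, -1],
                        [ 1,  0, 0, -1,  0],
                        [ 1,  1, 0, -1, -1]]),
}

# A trial that mostly matches the "task_B" template, with one entry deviating.
trial_map = templates["task_B"].copy()
trial_map[0, 0] = 0
print(classify(trial_map, templates))  # prints "task_B"
```

In a leave-one-out evaluation such as the one reported above, the templates would be rebuilt from all trials except the held-out one before classifying it.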