In programming education, teachers need to monitor and assess their students' progress by investigating the code they write. The code quality of programs written in traditional programming languages can be assessed automatically with tests, verification tools, or linters. In many cases, these approaches rely on some form of manually written formal specification to analyze the given programs. Writing such specifications, however, is hard for teachers, who are often not adequately trained for this task. Furthermore, automated tool support for popular block-based introductory programming languages like Scratch is lacking. Anomaly detection is an approach to automatically identify deviations from common behavior in datasets without the need to write a specification. In this paper, we use anomaly detection to automatically find deviations in Scratch code in a classroom setting, where anomalies can represent erroneous code, alternative solutions, or distinguished work. An evaluation on solutions to different programming tasks demonstrates that anomaly detection can be applied successfully to both tightly specified and open-ended programming tasks.
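To make the general idea concrete, the following minimal sketch illustrates generic anomaly detection over per-solution feature vectors (e.g., counts of blocks, scripts, and loops extracted from Scratch projects) using scikit-learn's IsolationForest. The feature choice, the detector, and all names are illustrative assumptions, not the method evaluated in the paper.

```python
# Illustrative sketch only: flag solutions whose feature vectors deviate
# from the common pattern in a class, using an off-the-shelf detector.
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical feature vectors, one row per student solution
# (e.g., [block count, script count, loop count]).
X = np.array([
    [12, 3, 2],
    [11, 3, 2],
    [13, 4, 2],
    [40, 9, 7],   # an unusually large or structured solution
    [12, 3, 3],
])

detector = IsolationForest(contamination=0.2, random_state=0)
labels = detector.fit_predict(X)   # -1 marks anomalies, 1 marks inliers

for i, label in enumerate(labels):
    if label == -1:
        print(f"Solution {i} deviates from the common pattern")
```

Flagged solutions would still require manual inspection by the teacher, since a deviation may equally indicate a bug, a valid alternative solution, or particularly advanced work.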