KG (Knowledge Generation) and understanding have traditionally been human-centric activities. KE (Knowledge Engineering) and KM (Knowledge Management) have attempted to augment human knowledge on two separate planes: the former deals with machine interpretation of knowledge, while the latter explores interactions in human networks for KG and understanding. However, both remain computer-centric. Crowdsourced HC (Human Computation) has recently harnessed human cognition and memory to generate diverse knowledge streams on specific tasks that are mostly easy for humans to solve but remain challenging for machine algorithms. The literature shows little work on KM frameworks for citizen crowds that gather input from diverse categories of humans, organize that knowledge with respect to tasks and knowledge categories, and recreate new knowledge as a computer-centric activity. In this paper, we present an attempt to create such a framework by implementing a simple solution, called ExamCheck, which focuses on the generation of knowledge, feedback on that knowledge, and the recording of its results in academic settings. Our HC-based solution shows that a structured KM framework can address a complex problem in a context that is important to the participants themselves.