Friday, March 22, 2013

Benefit of Quizzes for Learning

In a recent discussion I was pointed to this article on the benefits of testing as evidence that quizzes have pedagogical value. Given that memory researchers have been finding for many years that testing students improves their retention, I was a little surprised that the NYT presented these results as quite so novel.  For example, Dempster (1996) provides multiple citations of this effect going back as early as the 1920s. I guess that's journalism for you :-)  However, the study being reported showed that simple testing beat drawing concept maps, which is very interesting, although I think the results might be reversed if the students in the concept map condition were practising generating concept maps from memory.

This is an area that I have researched in some detail, particularly as regards vocabulary learning:
Joseph S.R.H., Watanabe Y., Shiung Y.-J., Choi B. & Robbins C. (2009) Key Aspects of Computer Assisted Vocabulary Learning (CAVL): Combined Effects of Media, Sequencing and Task Type. Research and Practice in Technology Enhanced Learning, 4(2), 1-36.
I certainly don't doubt that retrieval practice promotes retention, or more simply, that taking tests helps you remember things.  I think the problem with the study mentioned in the NYT, and those referred to by Dempster, is that what they are really showing is that taking a test is great preparation for doing well on tests.  If the objective is to have students pass tests then yes, let them do lots of tests.

A more important question is: what are the target skills that we are hoping students will acquire?  For example, in my programming classes I am hoping that my students will learn to program.  So should I be setting them written tests, or should I be setting them programming assignments?

In my HCI classes I want my students to learn design skills, to produce good designs, so should I set them written tests on the subject of design, or should I have them do design assignments?

Scott Klemmer's HCI classes have lots of great assignments in which students practise design skills, as well as standalone and media-integrated quizzes.  I don't think there is any doubt as to the value of the assignments, but are quizzes the best way to get students to the point where they can practise the target skills?  Simple quizzes built into the lecture material seem relatively benign; they are commonplace throughout the current slew of MOOC courses, and are arguably good at keeping student focus on the short videos provided.  I have also been impressed with the use of quizzes with voting and discussion as seen in the Berkeley Software Engineering classes.  However, students in MOOC land are watching the videos in a variety of settings, and I wonder if the short quizzes are really worth the effort.

Furthermore, I think longer, harder quizzes such as those in the Berkeley Software Engineering course (both face-to-face and online) can be rather intimidating for many students.  And even for confident students, quizzes can be a fundamentally boring activity.  Of course quizzes give instructors insight into student progress, but they are also potentially demotivating, and one might reasonably ask whether they have any effect on students' ability to perform the relevant target skills.

Given that we are considering courses that will provide at least some material in text, video and other media, the key question to me is whether quizzes integrated into the media are directly beneficial in terms of subsequent performance on target skills.

To operationalize all this we might try to devise a study to test whether it is more valuable for a student to spend a fixed block of time absorbing material passively and then taking a short quiz, or absorbing more material passively, or absorbing material passively and then doing an active assignment, or perhaps spending the whole of that time period on an active assignment.  What might our experimental conditions look like?  These perhaps:
  • 10 mins passive material, 5 mins on quiz
  • 15 mins on passive material
  • 10 mins passive material, 5 mins on active relevant assignment
  • 15 mins on active relevant assignment
And by a relevant assignment, I mean ideally something that is relevant to that individual student, such as a project that they are actually motivated to work on.  It seems like there is a great opportunity for A/B testing in the current MOOCs - give a randomly selected half of the students no quizzes during the lectures, and the other half the integrated quizzes, and compare the quality of outcomes.  Or three conditions perhaps:

A) No Quizzes
B) Simple Quizzes
C) Harder Quizzes
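The random-assignment idea above can be sketched in code. This is a minimal illustration only - the ConditionAssigner class and its method names are my own invention, not part of any MOOC platform. It hashes each student id so that every student lands deterministically, and roughly evenly, in one of the three conditions:

```java
import java.util.EnumMap;
import java.util.Map;

public class ConditionAssigner {
    // The three experimental conditions from the post.
    public enum Condition { NO_QUIZZES, SIMPLE_QUIZZES, HARDER_QUIZZES }

    // Deterministically bucket a student id into one of the conditions,
    // so the same student sees the same condition in every session.
    public static Condition assign(String studentId) {
        int bucket = Math.floorMod(studentId.hashCode(), Condition.values().length);
        return Condition.values()[bucket];
    }

    public static void main(String[] args) {
        // Sanity check: the split over many students should be roughly even.
        Map<Condition, Integer> counts = new EnumMap<>(Condition.class);
        for (int i = 0; i < 9000; i++) {
            counts.merge(assign("student-" + i), 1, Integer::sum);
        }
        System.out.println(counts);
    }
}
```

Hashing the id, rather than storing an assignment table, means a student who drops out and returns weeks later still sees the same condition.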

Maybe, given some chunk of time, say 15 minutes, it's unrealistic to have every student work on an active relevant assignment - maybe, given 15 minutes at the end of the day, all that people can manage is some passive material and then a quick quiz to help them absorb it?

On a more ethereal note, I start to wonder if it is ever very effective to provide information to a student who hasn't expressed a need for it, or at least an interest in it.  For the most effective learning, perhaps we need to manoeuvre the student (or perhaps ourselves?) into a position where they ask a question of their own volition, and then respond with an answer and perhaps a further question.



Dempster, F. N. (1996). Distributing and managing the conditions of encoding and practice. In R. Bjork & E. Bjork (Eds.), Memory (pp. 317-344). San Diego, CA: Academic Press.

Tuesday, March 12, 2013

HyperLocal News in JustInMind

Building on my earlier Balsamiq prototype, here's another version of the HyperLocalNews app using the JustInMind framework.   This gives a much higher fidelity prototype, which to the untrained eye might look more like the real thing.  The danger here is that a client might think you've already built the entire app :-)

Try clicking on the map markers:


Many thanks to the Creative Commons sharing folk for the images that make the news stories more interesting:

http://www.flickr.com/photos/7700821@N06/476498157/
http://www.flickr.com/photos/danshouse/137241182/
http://www.flickr.com/photos/us-pacific-command/7658198802/

Note that this is different from the Balsamiq exported PDF in that it's all HTML, and there's no accidental scrolling up and down.  Getting a good look and feel of course relies on knowing where to position things as they would appear on the appropriate smartphone.

Monday, March 11, 2013

Fest Swing Testing in NetBeans IDE 7.3

I wanted to see if I could get the Fest Swing GUI testing framework running in NetBeans IDE (7.3).   The existing plugin was designed for 6.1, and some of the instructional images were missing.  Once I'd worked out how to install NetBeans IDE plugins, I found it did work with 7.3.  Here's my version getting the sample code running in NetBeans IDE 7.3 beta 2.

1. Choose File --> New Project

2. Choose Samples --> Java --> Anagram Game

3. Right click "Test Libraries" and choose Fest and JUnit 4.10

4.  Choose File --> New File, and select Fest Test from Swing GUI Forms:

5. Give it an appropriate name and put it in the com.toy.anagrams.ui package

6. Add these three lines after the initComponents() method call in the constructor in Anagrams.java, so that Fest can look the components up by name:

        guessedWord.setName("anagramField");
        guessButton.setName("guessButton");
        feedbackLabel.setName("resultLabel");

7.  Uncomment line 26 in the Fest test file you created:

8. Execute the Fest test file as a test, and watch the Anagram game have its text filled in automatically.
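For reference, here's roughly what the finished test looks like once the wiring above is in place. This is a sketch based on the usual Fest Swing JUnit pattern, not the exact file the NetBeans template generates (the class and test method names are my own); the fixture looks up each widget by the names set in step 6:

```java
import org.fest.swing.edt.FailOnThreadViolationRepaintManager;
import org.fest.swing.edt.GuiActionRunner;
import org.fest.swing.edt.GuiQuery;
import org.fest.swing.fixture.FrameFixture;
import org.junit.After;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;

import com.toy.anagrams.ui.Anagrams;

public class AnagramsFestTest {
    private FrameFixture window;

    @BeforeClass
    public static void setUpOnce() {
        // Fail fast if Swing components are ever touched off the EDT.
        FailOnThreadViolationRepaintManager.install();
    }

    @Before
    public void setUp() {
        // Create the frame on the Event Dispatch Thread, as Fest requires.
        Anagrams frame = GuiActionRunner.execute(new GuiQuery<Anagrams>() {
            @Override
            protected Anagrams executeInEDT() {
                return new Anagrams();
            }
        });
        window = new FrameFixture(frame);
        window.show();
    }

    @After
    public void tearDown() {
        // Release the robot and close the window after each test.
        window.cleanUp();
    }

    @Test
    public void shouldShowFeedbackAfterGuessing() {
        // These lookups use the component names set in step 6.
        window.textBox("anagramField").enterText("abstraction");
        window.button("guessButton").click();
        window.label("resultLabel").requireVisible();
    }
}
```

Run it like any other JUnit test. Note the robot drives the real UI, so it needs a display (it won't run headless) and the Fest jars on the test classpath.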