Assessment and Feedback in the Classroom
Source: http://businesssolutions.it/en/assessment-center-2/
This week's reading is based on:
---------------------------------------------------------------------------------
M. Brady, H. Seli & J. Rosenthal (2013) "Metacognition and the influence of polling systems: how do clickers compare with low technology systems?" Educational Technology Research and Development, 61:885–902.

F. Dochy, M. Segers & D. Sluijsmans (1999) "The use of self-, peer and co-assessment in higher education: A review," Studies in Higher Education, 24:3, 331–350, DOI: 10.1080/03075079912331379935
----------------------------------------------------------------------------------
This week in class we covered learner engagement and feedback as part of assessment in the classroom. As one of the discussion leaders, I experimented with in-class polling technologies (namely PollEverywhere) and led a discussion of the merits of clicker technology. This was a useful experience for me personally, given my previous experience with clickers when they were a newer technology.
When I was taking introductory electromagnetism, the professors were experimenting with a new program on campus called
TEAL (Technology Enabled Active Learning), in which clickers were a part of the curriculum and grade. We had the TurningPoint clickers that could take in alphanumeric inputs (shown below).
We had to bring these clickers to every class session for attendance and participation. These clickers had several inconveniences: they were pricey, used in only this one course, their buttons mechanically wore out quickly, and if you forgot yours, you lost points for that day. Of course, students who didn't want to show up to a lecture could just give their clicker to a friend for the day. Perhaps having clickers did help with engagement in the material, but the last thing I remember from this experience is wondering how I could sell off my clicker and recover a decent amount of what I paid for it.
But I think clicker technology has come a long way from that model. We saw in class that new clickers of the TurningPoint type have an improved design. One technology I thought was particularly promising was PollEverywhere. The main advantages of PollEverywhere are:
- you need very little extra hardware or software to use it. Poll creation, administration, and response collection are all mediated through their website or a plug-in (e.g., for Google Slides). Students can submit responses either by visiting a unique URL or by texting.
- a variety of question and response types are possible: multiple choice, clickable images, and free-form text. It can also handle LaTeX notation (a big plus for writing equations).
- the user interface is simple, clean, and easy to use. I found that I spent more time thinking of poll questions than figuring out how to use the tool.
During the course of the class, I wanted to experiment with several things covered in the Brady et al. paper. Brady et al. studied undergraduate psychology students and compared clickers (high-tech) with paddles (low-tech). They postulated that in-class polling motivates students to reflect on their progress toward their learning goals by improving their metacognition (i.e., their awareness of what and how they are learning). Overall, they found clickers are useful for this, particularly for low- to mid-performing students. I found the comparison with low-tech methods interesting, because both provide the same instant feedback. However, clickers offer a way to accurately tally answers, keep a record of them, and avoid the group mentality that comes with polling the class by paddle.
In the experiment, we compared the experience of using PollEverywhere with using American Sign Language to answer which city had the largest population density (answer: Mumbai). We started with American Sign Language, where everyone signed the letter corresponding to their answer. Several interesting observations are worth noting. When the real-time tracking of answers was hidden, I noticed that the majority answered Tokyo. When using sign language, it was harder to distinguish which answer was in the majority. In the third round, when real-time responses were displayed (shown below), there was less of a spread and a clear winner among the possible answers. While we observed the clear influence of group mentality across the three trials, it was interesting when someone remarked that seeing other people submit the answer they were considering made them more confident in it, even if it was incorrect.
PollEverywhere interface for in-class discussion and results from the second round
There were several lessons I learned in doing this exercise:
- Always allow time for technical glitches. The internet is not always reliable for these types of things. Conducting such an in-class poll for a large class may not be feasible with the current network infrastructure of the campus.
- Be careful not to give away the answer before revealing it. I accidentally let slip a hint of the correct answer, and some students picked up on it.
- I think it would be interesting to have someone purposefully submit an incorrect answer, to encourage people to stick with the answer they would have originally chosen. This would be a great way to start a dialogue on why that answer is incorrect, or to go over the thought processes behind the choice.
- Polling on how confident people feel about their answer would also be a good learning experience, because it's telling of what and how they are thinking about the problem.
Overall, despite the not-so-stellar experience I had with clickers before, I would use an in-class polling technology like PollEverywhere in a future class. I like the possibilities it offers to really engage students and to become a tool for starting a conversation about the material. If paired with a learning (and social) activity, such as think-pair-share, the full potential of in-class polling could be realized.