**The Process**

- I set up the event from the dashboard by entering a unique event code, usually the acronym of the event and some rendering of the date.
- I introduce the audience to Knitter, send those with web devices to the URL, and invite them to enter their name and the event code.
- I add that Tweets using the event code as a hash tag will also be captured by Knitter.
- At the end of the event, I return to the dashboard, click off the event, and export the transcript.
- I paste the transcript into my wiki, read it, and comment.
A few days ago I mentioned on Facebook that I had been working, in mad scientist fashion, on my Knitter tool, sometimes called Knitter Chat. Rather than try to fit an adequate explanation into a single status update, I decided to just link to a blog post. But after a quick search, I found that although I have mentioned the tool several times in 2¢ Worth, it seems that I have never really explained it in a single post. So here goes.
I guess it would be most accurate to say that I started working on Knitter at the NECC conference where I first learned about Twitter. I do not recall which NECC it was, but I’m pretty sure that it was the first one with a Bloggers’ Cafe — Atlanta maybe, 2007. Anyway, someone among the gathered bloggers mentioned and described Twitter, and we all responded the only way that anyone responds to their first exposure — “Why would I want to do that?” But after we joined and started Tweeting and reading the conference stream, the value became obvious, and we started talking about the potential of lots of conference attendees Twittering away — and even workshop participants and classroom students.
Of course, having to explain Twitter, get everyone signed in, and befriending each other were obvious barriers to my using it in any systematic way. So I started plotting out a new programming project, something that would mimic Twitter, but perhaps have other features designed for learning environments.
Knitter went through several incarnations, the earlier ones built around various existing chat room scripts that I’d found, downloaded, and integrated into the functionality that I was working toward. But I’m not a JavaScript/Ajax programmer and was unable to make the tools run reliably enough. So I switched to the programming that I knew, and although the result is less “slick,” it works for me — 99% of the time.
This is probably a good place to say that Knitter is not a public tool. It is a personal experiment that I use in many of my presentations. I simply do not need another public tool to support. I can barely keep up with Citation Machine and Class Blogmeister, and that’s with loads of help from Robert Sharp and other users. And there are alternatives now, where there were none when I built CB. Here are some that I usually suggest when asked.
If you know of others, please post a comment. Of those four, I have used and enjoyed Today’s Meet the most. It’s simple and fairly reliable.
There is one feature, however, that I have included in Knitter that I’ve not seen in other backchannel tools — at least the last time I looked. When my presentation is over, I go to a private dashboard and export the transcript of the backchannel in two formats. The first format is one that is coded to be copied and pasted directly into a wiki page. The codes are tuned to PMWiki, an open source wiki engine that I use for my online handouts. But the formatting can be easily adapted for Wikispaces or others.
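Under the hood, an export like this is mostly string formatting. Here is a minimal sketch of what a PMWiki-flavored export might look like; the message fields (`name`, `time`, `text`) are my assumptions for illustration, not Knitter’s actual schema.

```python
def to_pmwiki(messages):
    """Format chat messages as PMWiki markup.

    Hypothetical field names. In PMWiki, a line starting with *
    is a bullet item, and '''...''' renders bold.
    """
    lines = []
    for msg in messages:
        lines.append("* '''%s''' (%s): %s" % (msg["name"], msg["time"], msg["text"]))
    return "\n".join(lines)
```

The same loop could target Wikispaces or MediaWiki instead by swapping out the bullet and bold codes.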
The resulting wiki version of the transcript is then linked to my online handouts so that it becomes available to all participants as a community-generated resource. Since the transcript goes into a wiki, I (and others) can continue the conversation by inserting answers, insights, and corrections directly into the chat. It becomes another way for me to extend that learning experience beyond its place and time. Here is an example from the Georgia State Superintendents’ Association conference.
The other format is a simple text dump of the transcript with most coding stripped out. This text can be pasted into any of a number of word cloud tools to share a visual representation of the conversations. I usually use Wordle, but may give Tagxedo a try.
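Producing that plain-text dump is essentially the reverse operation: strip the wiki codes and keep the words. A small sketch, assuming PMWiki-style codes (bullet markers and bold quotes) in the transcript:

```python
import re

def strip_wiki_codes(wiki_text):
    """Strip PMWiki-style codes, leaving plain text for a word cloud tool."""
    text = re.sub(r"'''(.*?)'''", r"\1", wiki_text)   # remove bold markers
    text = re.sub(r"''(.*?)''", r"\1", text)          # remove italic markers
    text = re.sub(r"^\*+\s*", "", text, flags=re.M)   # drop bullet markers
    return text
```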
With more and more educators using Twitter, I recently used their API to capture Tweets posted with the Knitter event code (see The Process) as the hash tag. This worked far better than I expected.
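The capture side of that boils down to deciding whether a given Tweet carries the event code as a hash tag. A sketch of that check (the function name and exact matching rules are mine, not Knitter’s):

```python
import re

def matches_event(tweet_text, event_code):
    """True if tweet_text contains #event_code as a whole hash tag.

    Case-insensitive, so #NECC11 and #necc11 both match, but a longer
    tag like #necc112 does not.
    """
    pattern = r"(?<!\w)#%s\b" % re.escape(event_code)
    return re.search(pattern, tweet_text, re.IGNORECASE) is not None
```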
The benefits in teaching and learning contexts are many. Here are just a few that I have found to be most important:
- Capable learners are more engaged, because they are getting traction from the ideas by pushing and pulling on each other’s perspectives.
- Important side questions can be answered by knowledgeable and insightful peer participants.
- Participants who are ready for, or in need of, deeper learning can drill down into topics together.
- Participants learn about local experts whom they can contact later.
- The event generates valuable community-generated content.
- I receive invaluable feedback through the transcript, as I learn where I’m hitting the mark and where I am simply not making an idea clear — or where my idea may be wrong.
- I have more opportunities to teach, as I can read through the conversation and insert answers, clarifications, insights, and exploration of new ideas after the event is over and I have left.
Over my recent and greatly welcomed weeks at home, I have been working through another feature that, technically, is not really a part of Knitter. Again, one of the most important benefits of Knitter is the ability to archive and publish the backchannel transcript. So what if I could capture transcripts of backchannel chatter for other events, maybe even events that I am not able to attend? For most of EduCon, I will be working a conference in Christchurch. I’ll make it back for the last day, though I’m not sure how alert I’ll be after flying to NYC from New Zealand, and then taking a midnight train from there to Philly. But what if I had the transcript of the first two days of backchannel to keep me company?
Yesterday, I finished a feature that enables me to schedule a capture of Tweets with a particular hash tag. To test it, I scheduled a one-hour capture based on a recent trend and hash tag, #savelibraries. You can see the transcript here. My plan is to schedule a capture of #educon postings, starting at 10:00 AM on Friday (Jan 28), hoping that the old laptop I’ll have monitoring the conversation doesn’t explode under the load.
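The logic behind a scheduled capture like this can be sketched as a polling loop that remembers the highest Tweet ID it has seen (the Twitter search API calls this `since_id`) so that repeated polls don’t produce duplicates. Everything below is a simplified sketch under my own assumptions: `fetch` stands in for the real API call, and a real version would also authenticate, handle errors, and stop at the scheduled end time.

```python
import time

def capture_hashtag(fetch, hashtag, polls, interval=60):
    """Poll fetch(hashtag, since_id) a fixed number of times and build
    a duplicate-free transcript, ordered by Tweet ID.

    `fetch` is a stand-in for the actual Twitter search request and
    must return a list of {"id": ..., "text": ...} dicts.
    """
    transcript = []
    since_id = 0
    for _ in range(polls):
        # keep only Tweets newer than anything we've already captured
        batch = [t for t in fetch(hashtag, since_id) if t["id"] > since_id]
        if batch:
            batch.sort(key=lambda t: t["id"])
            since_id = batch[-1]["id"]
            transcript.extend(batch)
        time.sleep(interval)
    return transcript
```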