Analyzing the History Department’s CLOs by Year of Instruction (2020)

NB: This is a draft document only. I have shifted the output to another site, because wordpress.com does not allow the presentation of live Voyant outputs.

 

The following windows present ways of “reading” the CLO files for the Departmental Self-Study using Voyant Tools, a text analysis program developed by two Canadian humanists, Geoffrey Rockwell and Stefan Sinclair. Each window is interactive. You can explore the data sets by hovering your mouse over words to see the raw counts, and, if you want to work with Voyant Tools yourself, you can also change the analysis criteria and even the visualization tool.

Even without spending any time learning about Voyant Tools, you should be able to get a sense of the visualizations and their significance quite quickly. For each tool I have tried to link a set of basic instructions (but they are not loading from the Voyant homepage as of 10 June).

The Cirrus (aka word cloud) is the most basic visualization. We used it in the last Self-Study, but I don’t think it is the most useful, because it treats all the data as if they were in one big file.

The ScatterPlot shows how terms from our CLO documents cluster by year of instruction. You might have to view this frame in its own window to make it useful. Once you do so, you’ll probably find it to be much more valuable than a word cloud.
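If you are curious about what lies behind a view like this, the sketch below is a rough, hypothetical stand-in for the general idea (I am not claiming it is exactly what Voyant computes): it builds a term-document matrix from four per-year text files and projects the most frequent terms into two dimensions, so that terms used in similar years land near one another. The file names are placeholders; see the note on data preparation at the end of this section.

```python
# A rough stand-in for the idea behind the ScatterPlot view: project the
# 50 most frequent CLO terms into two dimensions based on how they are
# distributed across the four years of instruction.
from pathlib import Path

from sklearn.decomposition import PCA
from sklearn.feature_extraction.text import CountVectorizer

# Hypothetical file names: one combined plain-text file per year of instruction.
years = ["clo-year1.txt", "clo-year2.txt", "clo-year3.txt", "clo-year4.txt"]
texts = [Path(f).read_text(encoding="utf-8") for f in years]

vectorizer = CountVectorizer(stop_words="english", max_features=50)
matrix = vectorizer.fit_transform(texts)  # rows = years, columns = terms

# Each term becomes a vector of its counts across the four years;
# PCA reduces those vectors to two plotting coordinates.
coords = PCA(n_components=2).fit_transform(matrix.T.toarray())

for term, (x, y) in zip(vectorizer.get_feature_names_out(), coords):
    print(f"{term}\t{x:.2f}\t{y:.2f}")
```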

I find the TermsRadio to be one of the most helpful Voyant visualizations for our purposes. It shows trends in the frequency of keywords by year of instruction. Larger terms near the top of each column are more frequent in that year, while smaller ones near the bottom are less frequent (but still significant enough to appear among the 50 most frequent terms). One of the surprises for me in this visualization is that “analysis” becomes less frequent in our 4th-year CLOs.
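For anyone who wants to check the underlying numbers without opening Voyant, here is a minimal sketch of the kind of counting that a view like TermsRadio ranks: the 50 most frequent terms in each year's combined CLO file. The file names and the (deliberately tiny) stopword list are placeholders, so the counts will not match Voyant's exactly.

```python
# Count the 50 most frequent terms in each year's combined CLO file.
import re
from collections import Counter
from pathlib import Path

# A deliberately small stopword list for illustration; Voyant uses a fuller one.
STOPWORDS = {"the", "and", "of", "to", "in", "a", "an", "for", "will", "on", "with", "be"}

files = ["clo-year1.txt", "clo-year2.txt", "clo-year3.txt", "clo-year4.txt"]
for year, path in enumerate(files, start=1):
    words = re.findall(r"[a-z]+", Path(path).read_text(encoding="utf-8").lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    top = ", ".join(f"{term} ({n})" for term, n in counts.most_common(50))
    print(f"Year {year}: {top}")
```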

Note: If you would like to expand any of the windows, or change tools, you might need a quick introduction to Voyant. For this, I hope the instructions that Danny and I have prepared for our students in 2F90 will serve as a useful guide. See http://brockuhistory.ca/ebooks/hist2f90/workshop-2 and http://brockuhistory.ca/ebooks/hist2f90/advanced-voyant-workshop-1 (but do this only if you want to learn the nuts and bolts of the program). For a more conceptually advanced discussion of text analysis, see https://mitpress.mit.edu/books/hermeneutica (a book by Sinclair and Rockwell), and the companion website (which I can’t find at the moment).

Here is a note about how I (Mike D.) prepared the data set. I first deleted all of the basic, repeated explanations that were the same in each table, and then I combined all the files for each year into a single Word document. I then uploaded the resulting four files to Voyant Tools.
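The combining itself was done by hand in Word, but for anyone who prefers to script it, something like the following sketch would do the same job, assuming the edited CLO files were saved as plain text in one folder per year (the folder and file names are hypothetical). The four resulting files are the ones the examples above read from.

```python
# Combine the edited CLO files for each year of instruction into a single
# plain-text file per year, ready to upload to Voyant Tools.
from pathlib import Path

for year in ["year1", "year2", "year3", "year4"]:
    parts = [p.read_text(encoding="utf-8") for p in sorted(Path(year).glob("*.txt"))]
    Path(f"clo-{year}.txt").write_text("\n\n".join(parts), encoding="utf-8")
```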

Slow Reading Early Modern Texts (by Giulia Forsythe)

[Illustration: Giulia Forsythe, “Pedagogy of Transcription” (Brock University)]

The first iteration of HIST 2F90 is just wrapping up at Brock University. It’s a good time to reflect again online about the course.

One of the challenges of the course has been planning for and then managing the students’ work with primary sources. You can find an earlier post on the subject by clicking here, and we’ll certainly write more about this subject in later posts.

For now we’re sharing Giulia Forsythe’s illustration of the process that we and our students went through in transforming publicly available page scans of four 18th-century books on abolition into machine-readable texts. In the process, students learned how to read texts slowly, both through conventional “slow” reading and through new forms of text mining with Voyant Tools.

Thanks for sharing this, Giulia!

Teamwork and humility

Putting together a course like ours poses many challenges, but for online teaching newbies like us the most daunting has been learning to design courses effectively for a new medium. The experience has taken us out of our comfort zone. Neither of us is a Luddite – indeed, we like to think of ourselves as being on top of some technologies. But it’s not technology per se that is the issue for us; it’s the sheer array of tools one must learn, and the relatively short period of time in which to learn them. Never mind course design; never mind best practices in online forums; never mind making (and editing!) videos; never mind appropriate visual layouts; and for that matter never mind tools and applications like Scripto and TimelineJS. None of these are minor; indeed, they all warrant some discussion here. But the truly daunting challenges are coming now with our planned digital transcription assignment.

Holy Fh!t: Online course design and the Accessibility for Ontarians with Disabilities Act (AODA)

By Mike Driedger

Fun fact: Before about 1800, “s” and “f” looked VERY similar in printed texts but they were clearly distinct letters (sort of like “1”, “l” and “I”, or “O” and “0” today). Here’s an example: In an 18th-century essay Joseph Warton wrote that “the favorite and peculiar pasttime” of Ariel in Shakespeare’s The Tempest is expressed in the following song:

[Image: the passage as printed in the 1782 edition]
From Warton, Essay on the Genius and Writings of Pope, 4th and corrected [!] edn (London, 1782), vol. 1, p. 235. Available through archive.org.
This passage is an example of the kind of text that has given us course-planning fits in the past week. Imagine what a blind student using assistive reading technology would hear when trying to listen to this passage! We don’t have to imagine what the OCR (optical character recognition) technology used by archive.org does with the passage, because this is what you will actually (no joke) find online at archive.org (WARNING: explicit language):

Continue reading

Tailoring assignments to the online environment

Preparing to teach online has encouraged us to make fuller use of the online world by drawing on the dynamic, interactive tools of digital humanities scholarship. For some time now, the digital humanities have been presented as something of a panacea, and the rise of “Big History”, which embraces “big data” and DH tools, seems poised to push the discipline in that direction. Neither of us has ever shared that view, but we have both been adapting gradually to the new possibilities offered by computer tools in research and teaching. Over the years, both of us have built databases from census materials and vital statistics, experimented with alternatives to PowerPoint such as Prezi, taught with online research tools, and organized much of our own research with Zotero. More recently, though, we have begun to explore the potential of tools for text analysis, something that pushes us more fully into the world of Digital History. We hope the course we’re developing will encourage students to engage more fully with the possibilities of research in a digital world, while also giving us an opportunity to share the learning experience with them.

Blog introduction

Last spring, we (Danny Samson and Mike Driedger) began designing a new online history course for Brock University. The basic idea was to create a wholly online course that combined our specialisations in colonial North America (Danny) and early modern Europe (Mike). Thus, we have a second-year course called “Money and Power in the Atlantic World, 1400-1830”; it will be delivered for the first time in fall-winter 2015-16. The reasons for developing the course will be the subject of a later post, but for now we’ll emphasise a desire to try some new things, to engage some new students, and to try our hand at teaching with digital tools.