DHCH 2023, accuracy in visuality and the OpenRefine-Wikidata combo
As the title implies, this will be a larger entry. I’d like to briefly cover my participation at DHCH 2023, the summer school/conference of the Swiss Digital Humanities. While researching emulators and old video games, I also came across an interesting case that points to games I’d surely want to research. Finally, I also want to note down some of my experiences with working with OpenRefine for the task of importing things to Wikidata. That last point might expand into its own note at some point.
I experienced this year’s DHCH much more calmly than last year’s. Probably I was less nervous, and the energy was overall less chaotic and excited1. This year the PhD students gave lightning talks as well as a student-to-student workshop in a world-café format. I liked both very much. Here are my notes for my presentation.
In the workshop, I presented emulators as a crucial but under-researched tool in research on video games. I was able to show the problems surrounding that piece of software, and then we played some games, of course, under the guise of research!
I was also able to attend a master’s thesis presentation by a DH student at the University of Bern. He worked on a rather large corpus of letters and used Transkribus, Voyant Tools and Mallet to gain insights via distant reading. The presentation was very good and gave me valuable insight into why my own attempt with Distant Reading the VICE Source Code resulted in a scrambled mess. That insight can be found under Source Code as Text, and revolves around the ability to sort a corpus, which is tough with source code.
Source code is a highly networked text: it is modularised and references other pieces of itself all the time. This realisation gave me the idea to attempt to visualise this network of a text.
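As a first step towards such a visualisation, the reference network could be extracted mechanically. A minimal sketch, assuming a local checkout of a C codebase (VICE is written in C) and using only the local `#include` directives as edges; the function name and regex are illustrative, not part of any existing tool:

```python
import re
from pathlib import Path

def include_graph(src_dir):
    """Map each .c/.h file name to the local headers it #includes."""
    graph = {}
    for path in Path(src_dir).rglob("*.[ch]"):
        # Match lines like: #include "sid.h" (quotes = local includes)
        targets = re.findall(r'#include\s+"([^"]+)"', path.read_text(errors="ignore"))
        graph[path.name] = sorted(set(targets))
    return graph
```

The resulting adjacency list could then be fed into any graph-drawing tool. Even this crude pass makes the point above visible: unlike a corpus of letters, the “documents” here only make sense through their links to one another.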
Accuracy in the visuality of video games
One of the main tenets of historical research is the need for accuracy. In order to study things of the past, we don’t want mere sketches of them; we want a good and detailed look at their original form. One of the main tenets of video game research is that one has to play the games in order to look at them properly.
The problem arises when these two intersect: we want to study old games, but we no longer have access to the original gaming systems. Then the need for emulation comes up. Emulation is a best-effort approach to enable access to video games of the past. Emulators can’t duplicate the original system in its entirety and must make sacrifices in places. For example, if the emulation of a graphics chip takes too many resources, hacks and workarounds are found to bypass this restriction. Running something is more important than running it accurately.
Generally then, emulation is viewed as flawed in representing the past accurately, but accepted as the only solution for this approach.
Now, for a while in the 80s and 90s, many of the more popular and successful video games were ported to different systems with wildly different graphics capabilities2. I came across this while researching the topic of representation of masculinity in games. Two great examples are:
I can’t formulate my thoughts clearly yet, but there seems to be an interesting tension between the aforementioned problem and design rhetorics, that is, the intentions and affects/effects of the games’ designers. Luckily, we have some games fitting this profile in our corpus:
Because of this tension, I’m very interested in looking at the code of these two games and how these different expressions of visuality, with the same intentions, manifested there.
OpenRefine and Wikidata
I converted this part directly into its own note.