  • Havsfjord

As Ted Underwood points out in his text _Where to Start with Text Mining_: “Quantitative analysis starts to make things easier only when we start working on a scale where it’s impossible for a human reader to hold everything in memory.” Distant reading requires large amounts of data, which can in turn aid qualitative close reading. Text mining is a useful tool when the amount of data is far too great for us to grasp with our brain (or, as Underwood calls it, our ‘wrinkled protein sponge’). I have previously focused primarily on qualitative close reading, not only because it has suited what I work with, but also because large quantities of data seem daunting. Underwood also makes a great point about the context that qualitative close readings require, which is precisely what larger-scale quantitative mining can supply.

As Hadley Wickham underscores, a majority of the time is spent on preparing the data for analysis (1). Wickham continues by pointing out how datasets often ‘break the rules’ of tidy data; very rarely are data sets ready to be analyzed. I instantly saved this list, which breaks down the most common faults with data sets:

‘• Column headers are values, not variable names.

• Multiple variables are stored in one column.

• Variables are stored in both rows and columns.

• Multiple types of observational units are stored in the same table.

• A single observational unit is stored in multiple tables.’ (Wickham, 6)
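Wickham’s own examples are written in R, but the first fault on the list can be sketched just as well in Python with pandas (the data below is invented for illustration, not taken from the paper):

```python
import pandas as pd

# Untidy table: the column headers "1960" and "1970" are *values*
# (decades), not variable names — Wickham's first common fault.
untidy = pd.DataFrame({
    "artist": ["Artist A", "Artist B"],
    "1960": [4, 2],
    "1970": [7, 5],
})

# melt() turns those headers into a proper 'decade' variable,
# giving one observation per row.
tidy = untidy.melt(id_vars="artist", var_name="decade", value_name="count")
print(tidy)
```

After melting, each row holds exactly one observation (artist, decade, count), which is the shape most analysis tools expect.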

I have used Excel and other tools to analyze data sets before, but have always had difficulty structuring the data to get the desired outcome. It has always involved some meddling that is forgotten afterwards and can’t be replicated (which goes back to previous weeks’ readings: write down the process, both to be able to replicate it and to know what to avoid). My focus has primarily been on qualitative analysis, where large data sets have been more of a nuisance. I agreed with the discussions during class around the question of what these large data sets could actually be used for. As Pamela Fletcher and Anne Helmreich, with David Israel and Seth Erickson, argue in their project Local/Global: Mapping Nineteenth-Century London's Art Market: ‘some questions cannot be answered—or even posed—without using larger data sets’. For me and my research, using large data sets is not only about finding and presenting answers, but also about discovering other questions to work on.

At first, it was difficult to think about how I could work with large sets of data myself. One of the projects I have been working on concerns the Canadian influence on Inuit art practice from the 1930s up until today, specifically in Cape Dorset. Qualitative analysis has been done, such as interviews and fieldwork with practicing artists, but there are still many more archival sources that would be incredibly interesting to study further. This would entail archives with global sales records, newspaper articles and governmental records: a vast amount of data to go through. Being able to search for keywords without going through all of the information myself would be a great advantage, and might help in discovering new questions and interesting perspectives to focus on from a more qualitative angle.

Some of the tools we have gone through during this week's class, such as _Voyant_, a platform that enables keyword search and comparison of texts, would be useful for upcoming projects. An example would be a content analysis where keywords are analyzed in relation to time: when certain keywords were used more or less frequently, and so on. From this, a more qualitative discourse analysis could follow, building on the distant reading done with Voyant or a similar tool. This resembles Google's Ngram Viewer, which lets you track the usage of phrases across a corpus of literature and also focus on specific periods of time. These kinds of tools make it easy to get a broader grasp of word usage, and I could see myself using them in a first step of analysis. Important to keep in mind is also what Underwood points out: these kinds of tools may give you the impression that you don’t need to do any programming of your own, given the large body of tools already out there. However, the available tools mainly offer a sense of the scope of what is possible; your own projects will most likely require you to program in order to effectively focus your methodological approach.
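The keyword-over-time counting that Voyant and the Ngram Viewer perform can be sketched in a few lines of plain Python. The mini-corpus below is invented for illustration (a stand-in for real archival sources), but the counting logic is the general idea:

```python
import re
from collections import Counter

# Hypothetical mini-corpus: year -> document text (invented example data)
corpus = {
    1965: "the co-operative sold prints and carvings; prints sold well",
    1975: "carvings dominated sales while prints declined",
    1985: "a revival of prints and drawings in the market",
}

def keyword_counts(corpus, keyword):
    """Count occurrences of a keyword per year, Ngram-Viewer style."""
    counts = {}
    for year, text in corpus.items():
        tokens = re.findall(r"[a-z'-]+", text.lower())
        counts[year] = Counter(tokens)[keyword.lower()]
    return counts

print(keyword_counts(corpus, "prints"))  # → {1965: 2, 1975: 1, 1985: 1}
```

A real project would swap the toy dictionary for texts read from files and would normalize counts by corpus size per year, but the shape of the analysis stays the same.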


Hadley Wickham, “Tidy Data,” Journal of Statistical Software, Submitted.

Pamela Fletcher and Anne Helmreich, with David Israel and Seth Erickson, “Local/Global: Mapping Nineteenth-Century London’s Art Market,” Nineteenth Century Art Worldwide 11:3 (Autumn 2012).

Ted Underwood, “Where to Start with Text Mining,” The Stone and the Shell.


In 2007, the art world and fans of Andy Warhol were shocked by a story uncovered by the Swedish newspaper Expressen: more than a hundred of Andy Warhol's 'Brillo' boxes, sold for thousands of dollars, were fakes made after Warhol's death. The map below, created with Google Maps, is a location-based timeline of the events, from the creation of the Brillo box by James Harvey, to Warhol's version, to the subsequent uncovering of the counterfeit boxes and its repercussions. The layers are divided into time periods: the creation of the Brillo box as a commodity and artwork (–1968); the creation of and first exhibitions with the counterfeit boxes (1990–1993); the fake boxes on the market (1994–2006); and the eventual discovery and aftermath (2007–). I find it really interesting to be able to map out both timelines of ownership and artifacts' travels, which I could see myself using in my own work. One example of its usefulness is the question of ownership of artifacts in the Global North originating from the Global South, which could be further analyzed with the help of map visualizations.


Updated: Sep 23, 2019

This week has been an experience in annotation for YouTube, oral history and a program called Thinglink. When the latter was first shown, it instantly caught my eye: I have worked with StoryMap and ArcGIS previously, and I could immediately think of ideas for how to apply it to my work. I have here used Thinglink for another project of mine, about nature narratives within the Arctic circle, tourism and climate change. Together with other master's students, I travelled to the Arctic circle for a research trip in May 2019, conducting interviews with locals in the tourist industry. We focused primarily on their idea of a sense of place and its meaning in a changing environment, with an emphasis on human-inflicted climate change. Climate change is a phenomenon affecting humans, animals and the environment all over the globe. But certain ecosystems are affected to such an extent that the way of life in these regions has already changed, and will continue to change, radically; the Arctic region is such an ecosystem.

The map below is an aerial view of the territory we covered, with tagged locations for our sketch maps.

Our methodology combined sketch mapping with narrative walks. Sketch maps are freehand-drawn maps, preferably on blank paper or on a map with very few details. With less distraction from predefined spatial features, greater focus and attention can be paid to the places and spatial dimensions that are important to the person drawing. Sketch mapping as a methodological tool is frequently used by behavioural geographers, and is a way of understanding the relations and perceptions between humans and the studied location. Forms of sketch mapping (and also cognitive mapping) have therefore been adopted within environmental studies, because they compel informants to consider their relation to their own surroundings. As a base for our project we used the theoretical concept of a sense of place, a term often used within geographic studies as well as art and literature: what is our emotional space, and what are its connections to the spatial dimensions? We wanted to argue that a sense of place works to find and legitimize local values of importance, instead of the purely economic values used when formulating policies for climate change mitigation. In ArcGIS, we added coordinates of locations connected to the sketch maps, and related them to environmental changes already occurring and to potential futures.
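The coordinate-tagging step can be illustrated with a tiny GeoJSON sketch. This is not our actual ArcGIS data: the coordinates are only approximate for the Jukkasjärvi area and the file name is a hypothetical placeholder, but it shows the general pattern of linking an interview and its sketch map to a point on the map:

```python
import json

# Hypothetical GeoJSON feature linking one sketch-map interview to a point.
# Coordinates are [longitude, latitude] per the GeoJSON convention;
# the values here are only rough placeholders near Jukkasjärvi.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [20.60, 67.85]},
    "properties": {"informant": "Olof", "sketch_map": "olof_sketch.png"},
}
collection = {"type": "FeatureCollection", "features": [feature]}

# Serialize for import into a mapping tool
print(json.dumps(collection, indent=2))
```

A file like this can be loaded directly into most web-mapping tools, which keeps the link between informant, sketch map and location explicit and replicable.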

We decided not to record our interviews, as we were often on the road together or had a herd of 70 husky dogs in the background, but we took extensive notes and photographs connected to coordinates, spatial maps and sketch maps. After this week's readings, in particular Linda Shopes's Making Sense of Oral History, I now in hindsight wish we had been more mindful about collecting these stories as voice or video recordings. The Arctic, with its glaciers, serves as a ‘nature's archive’: it was extractions from glaciers that helped scientists define historical carbon dioxide levels and their increase. It is also one of the regions where change is happening now, and very radically so. Our climate is changing, and the stories of what it used to be and how it felt will be valuable to record: capturing not only the spatial importance, but also the emotional dimension and the sense of place.

The maps below are our sketch maps with location tagged within the maps that are in relation to the aerial map.

Interview and sketch map with Olof, who has lived and worked in Jukkasjärvi for 10 years and works at the ICEHOTEL.

Sketch map from Åsa, who has lived her whole life in the Arctic parts of Sweden and is the owner of Arctic Dogsled Adventure.

Sketch map from talking to Stephanie, who is French, has been working and living in the area for almost 15 years, and has her own business, Husky Voice.

Sketch map from Ida-Maria, who is Sámi and has built her business, SápmiLIFE, around showing tourists her and her family's life as Sámi.


Linda Shopes, “Making Sense of Oral History,” Oral History in the Digital Age.
