2015-2016 Sherman Graduate Fellowship

The Lewis & Ruth Sherman Centre for Digital Scholarship is soliciting applications for a 2015-2016 graduate fellowship in digital scholarship. As digital scholarship (DS) becomes more prominent in academic research, the Sherman Centre’s role is to support members of the McMaster community as they experiment with DS tools and practices and integrate them into both projects and teaching. The term “digital scholarship” encompasses a diverse set of practices throughout the humanities, social sciences, and hard sciences — to learn more about our version of digital scholarship, see our statement on our site.

The ideal candidate for the fellowship will be a graduate student who seeks to help drive the evolution of their discipline by applying the tools and methodologies of digital scholarship to their research. Also critical is a willingness to learn about the collaborative and multidisciplinary nature of digital scholarship and to engage others both within and beyond their discipline in their projects. To create a focused learning experience, we ask applicants to propose a specific digital scholarship project that they will work on over the course of the year. This project could be part of a sandwich thesis or otherwise complementary to the student’s dissertation research.

To see current fellowship projects as well as other work happening around the Sherman Centre, please visit our project page. The 2014 fellows were Mark Belan, Chris Handy, and Jeremy Parsons.

The Sherman Centre for Digital Scholarship received designation as a McMaster University Research Centre in 2012. It is located on the first floor of Mills Library and includes a presentation space, three public high-powered workstations that provide access to a wide range of digital scholarship tools, and office space for researchers. Beyond space, the Sherman Centre offers a range of staff to support research projects, as well as a robust yet flexible technical infrastructure that fellows and other researchers may utilize.

The fellowship runs from September 1, 2015 until August 31, 2016 and offers these benefits:

  • office space in the Lewis & Ruth Sherman Centre for Digital Scholarship
  • technical and project consulting, both from its own staff and from other Library units, e.g. the Lyons New Media Centre or Maps, Data, GIS
  • technical infrastructure
  • a $1,500 stipend

Expectations and deliverables:

  • a presentation for the monthly Sherman Centre Colloquium (20-30 minutes)
  • posts on the Sherman Centre blog (minimum two per term) on project updates and/or related digital scholarship issues. These posts will be revised into a written report at the conclusion of the fellowship that details project outcomes, making specific reference to the role(s) played by the Sherman Centre
  • creation of a visualization of an aspect of their work for display in the Sherman Centre’s multimedia entryway
  • physical presence in the centre and participation in its activities

Eligibility:

  • current or accepted graduate student at McMaster University (open to all faculties)
  • not previously a Sherman fellow

Applicants should submit a letter of intent outlining their project and how it would benefit from the fellowship, along with a CV and a list of three references to Dale Askey, Administrative Director of the Sherman Centre (askeyd@mcmaster.ca).

We will be holding two information sessions for interested graduate students:

  • Thursday, April 30, 1:30-2:00 (following the Sherman Colloquium from 12:30-1:30)
  • Wednesday, May 13, 3:00-3:30

Deadline: Friday, May 22nd


Plotting a Plague Pandemic

Little Hitchhikers

Humans are more than just singular beings; we are “Superorganisms”: vessels for thousands of small life forms that make up our microbiome. We have complex relationships with these resident microbes, ranging from beneficial to parasitic, which are influenced by numerous biosocial factors (diet, environment, genetics, antibiotic use, etc.). Exploring the effect these microorganisms have upon us is a hot topic of research, especially here at McMaster, as it is apparent that the microbiome plays a major role in health and disease, both physical and mental.

But my research doesn’t seek to reinvent health care approaches. Instead, I exploit the human-microbe relationship to tell stories about how humans have migrated and exchanged diseases throughout history. And sometimes, the little microbes that have hitchhiked with us across the globe are even better storytellers than humans themselves.

Human History by Proxy

As early humans dispersed throughout the globe, human populations became geographically separated and diversified. Simultaneously, our microbiomes were co-evolving along with their human hosts, resulting in the distinct geographic distributions of disease we see today. The stomach bug Helicobacter pylori is present in most human populations, and its parallel evolution with humans has been used to reconstruct ancient patterns of human migration, dating all the way back to the dispersal out of Africa (Figure 1). Microbe evolution can also tell us about recent disease dispersal, such as genetically tracking the 2010 Haitian cholera epidemic back to the arrival of UN Nepalese peacekeepers. Bacterial epidemiology can also reveal altered human-environment interactions, as the increased prevalence of zoonotic diseases like malaria and plague can be linked to ecological instability (e.g., extensive deforestation). Microbial evolution serves as a unique and powerful line of evidence, especially when contextual information (such as historical records) is sparse or inaccurate.

Figure 1. Helicobacter pylori global dispersal (Yamaoka 2010).

So why not just sequence human DNA rather than using a microbial proxy? Humans evolve relatively slowly (we have a much longer generation time) and don’t accrue as many DNA mutations within short time frames. In comparison, bacteria replicate extraordinarily quickly, with generation times ranging from 30 minutes to several hours. That means new mutations are occurring constantly, generating new data points for us to assess biological relationships and potentially acquire finer resolution. In my case, I’m interested in both the movement of humans and how they exchanged infectious diseases in the past, so I turn to both bacterial DNA and historical records to reconstruct these processes.

Plague and Phylogeography

For my doctoral research, I’m examining one of humanity’s deadliest and most ancient diseases: plague. This infectious disease is infamous for playing a major role in historical pandemics such as the Roman Plague of Justinian and the Medieval Black Death, with 30-50% of human populations perishing during these outbreaks. While mortality estimates like these have yet to be observed in the modern era, plague is still entrenched in many geographic regions of the world, the most topical being the ongoing Madagascar plague outbreak of 2017.

Plague’s tendency to appear within a population, seemingly out of nowhere, and vanish in an equally mysterious fashion has long intrigued and frustrated researchers. Despite more than a century of comprehensive research, the origins and global routes of plague spread remain obscure. This is in part because plague is an ecologically complex disease: it is most commonly spread via infected rodents and their fleas, but it can also spread directly between humans. The limited explanatory power of current models has also been attributed to the sparseness and ambiguity of historical mortality records, leading to high levels of uncertainty. In response, alternative lines of evidence have been cleverly sought out. Furthermore, current trends promoting the creation of open access digital databases have greatly facilitated cross-disciplinary work and opened up previously inaccessible geographic regions and time periods for exploration.

My Project

My research continues this trend of novel lines of evidence by analyzing the ancient DNA (aDNA) of the plague bacterium in order to reconstruct disease dispersal events in human history. By extracting plague DNA from archaeological remains found in epidemic cemeteries, it is possible to identify key genetic mutations that link related bacterial strains and infer distinct waves of infection. This work encompasses a “phylogeographic” approach, which integrates phylogenetic (evolutionary) relationships with geographical relationships in order to reconstruct the spread of this infectious pathogen. The extraction, sequencing, and evolutionary analysis of plague aDNA is currently being undertaken at the McMaster Ancient DNA Centre.

This project, conducted in collaboration with the Sherman Centre for Digital Scholarship, aims to put the “geography” in the “phylogeography” of plague. Informative geospatial analysis of past pandemics is highly dependent on having strong foundational information on modern pandemics: a foundation that currently does not exist within the plague literature. This is not for lack of data, as over 600 strains of plague have been sequenced and are publicly available through digital databases. This focused project therefore aims to curate, contextualize, and analyze the digital metadata associated with these plague strains. Armed with this comparative data, I will then be able to start exploring hypotheses such as:

  1. Did trade routes and migration events influence the distribution of plague?
  2. Are there ecological zones that correlate with increased prevalence of plague?
  3. Does genetic evidence complement or contradict archival-based models of plague spread?

My objective is to expand our epidemiological knowledge of plague in a way that improves our understanding of the interplay of factors contributing to modern re-emergences, as well as of the historical events that triggered past pandemics. The combination of molecular genetics and geospatial analysis, driven by humanities-focused questions, offers a unique lens through which to reconstruct the fluctuating patterns of human connectivity and ecological interaction that have shaped our relationship with infectious disease.

Project Organization: The Beginning of the End?

When I’m trying to organize a project, my favorite place to start is… at the end. What kind of finished product do I want to end up with? And how might the answers to these questions be visualized and explored? There’s no shortage of geospatial tools to explore disease epidemiology, so I created 5 criteria to assist in program selection:

  1. Statistical Framework – Hypothesis testing is a must, visualization alone is insufficient.
  2. Disparate Data – Incorporate heterogeneous metadata and account for prior information about evolutionary relationships. (Bayesian GIS anyone?)
  3. Standardized Output – Produce an output file that will be recognized by other geospatial and visualization software. Proprietary file formats are to be avoided.
  4. Aesthetics – Try to avoid the Google Maps API (personal preference).
  5. Learning Barriers – Free, open source, reduced learning curve or plentiful training resources.

As expected, no program satisfies all 5 criteria, and thus I will likely use a combination, playing to each tool’s strengths and compensating for its weaknesses. There are a variety of R packages that seem promising, the best candidate being BPEC (Bayesian Phylogeographic and Ecological Clustering) (Figure 2). This tool has powerful analytical potential and can identify meaningful geographic clusters in your data. To go deeper into routes of spread, SpreaD3 is well designed for epidemic source tracking and accepts input files from programs I am already using (Figure 3). With a faint idea of what I need to prepare for downstream, I was better prepared to select appropriate datasets.

Figure 2.  Phylogeographic clustering of frog populations using BPEC (Manolopoulou et al. 2017).

Figure 3. Ebola virus spread in SpreaD3 (Bielejec et al. 2016).

Project Organization: Let’s Get Started

For the previously identified programs, there are three mandatory pieces of information I need for each outbreak record:

  1. DNA sequence data to reconstruct evolutionary relationships.
  2. Geographic location (ideally latitude and longitude).
  3. Collection year (time point calibration).

Accessory variables that would be very interesting to test include host (rodent, human, camel, etc.), but these are rarely made available by submitters.

Based on my review of the plague literature, I was expecting to find about 150 plague genome records, as this number seemed representative of current publications. I began my search by scouring online genome repositories (NCBI, ENA, DDBJ) to identify datasets. To my great surprise, I found over 600 plague genome sequencing projects, which either 1) had been published on but with room to improve quality, 2) had been published on only in a limited descriptive sense, or 3) had no publications associated with them. Despite being overwhelmed by an unexpected amount of data, I’m still very excited by the potential to contribute something new and meaningful with data that is mostly untouched.
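For readers curious how such a repository search can be scripted, here is a minimal sketch using Biopython’s Entrez module. The query term, retrieval cap, and summary fields are illustrative assumptions rather than my exact pipeline:

    # Sketch: count and skim Yersinia pestis records in NCBI's Assembly database.
    # Assumes Biopython is installed; query term and fields are illustrative.
    from Bio import Entrez

    Entrez.email = "your.name@example.com"  # NCBI asks for a contact address

    # Search the Assembly database for plague genome records
    handle = Entrez.esearch(db="assembly",
                            term='"Yersinia pestis"[Organism]',
                            retmax=1000)
    search = Entrez.read(handle)
    handle.close()
    print("Records found:", search["Count"])

    # Pull summary metadata for the first few hits
    for uid in search["IdList"][:5]:
        handle = Entrez.esummary(db="assembly", id=uid, report="full")
        summary = Entrez.read(handle, validate=False)
        doc = summary["DocumentSummarySet"]["DocumentSummary"][0]
        print(doc["AssemblyAccession"], doc["SubmissionDate"])
        handle.close()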

The problem is that this data sits behind a very scary wall: big data science. The actual genetic sequence data is enormous and complex (we’ll save that for another post), and the metadata is messy, with many missing fields. The messiness can in part be cleaned up with tools like OpenRefine, but the missing data means either manually hunting through Supplementary Files or kicking a lot of data out.

I then investigated how to query, download, and parse over 600 files of metadata into a meaningful table I could import into downstream applications. I tested out a number of APIs (Bioconductor, Biopython, SRAdb, MetaSRA) but wound up at least partly dissatisfied with most of them. My current strategy is rather ad hoc: I combine multiple programmatic APIs, web-browser GUIs, and my own Python scripts to build up a geospatial relational database. The result is functional but unwieldy, and rather limited in application beyond my own project (Figure 4). I’m also currently missing about 300 database records, largely because of consistency issues inherent to repositories governed by user submission.
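The SQLite end of that pipeline is conceptually simple, even if the scraping around it is not. A minimal sketch of the kind of table involved (the column names are simplifications of my schema, and the sample row is invented):

    # Sketch: load parsed strain metadata into SQLite.
    # Column names are simplified and the sample row is invented.
    import sqlite3

    conn = sqlite3.connect("plague_metadata.sqlite")
    cur = conn.cursor()

    cur.execute("""
        CREATE TABLE IF NOT EXISTS strains (
            accession       TEXT PRIMARY KEY,
            collection_year INTEGER,   -- time point calibration
            latitude        REAL,      -- geographic location
            longitude       REAL,
            host            TEXT       -- accessory variable, often missing
        )""")

    records = [("SAMN00000001", 1992, 47.2, 102.8, "marmot")]
    cur.executemany("INSERT OR REPLACE INTO strains VALUES (?,?,?,?,?)", records)
    conn.commit()

    # Missing fields surface immediately as NULLs
    cur.execute("SELECT COUNT(*) FROM strains WHERE latitude IS NULL")
    print("Records missing coordinates:", cur.fetchone()[0])
    conn.close()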

Figure 4. One table (among many) in my SQLite relational database.

Next Steps

My next goal is to get the database pipeline to a point where I’m satisfied it has scraped as much information as it can via automation. From there I will move to manually going through associated publications to fill in geographic location and date where I can. At that point, it will be time to start experimenting with geospatial tools to figure out how I’m going to visualize hundreds of years of global disease dispersal in an informative manner. I’m looking forward to geospatial workshops in the new year, and I’ll be showcasing some preliminary maps in my next blog post!


Figure References

Bielejec, F., Baele, G., Vrancken, B., Suchard, M. A., Rambaut, A., Lemey, P. (2016) SpreaD3: interactive visualisation of spatiotemporal history and trait evolutionary processes. Molecular Biology and Evolution. 33 (8): 2167-2169.

Manolopoulou, I., Hille, A., Emerson, B. (2017). BPEC: An R package for Bayesian phylogeographic and ecological clustering. Journal of Statistical Software. arXiv:1604.01617v2.

Yamaoka, Y. (2010). Mechanisms of disease: Helicobacter pylori virulence factors. Nature Reviews Gastroenterology & Hepatology. 7: 629–641.


Developing A Photogrammetry Toolkit For Rapid, Low Cost And High Fidelity 3D Scans

As a current PhD student in the Communications, Cultural Studies and New Media Program at McMaster University, my research revolves around the application of new media to create personal archives for individuals or relatively small communities, groups, and peoples, primarily marginalized populations: ageing populations, people of colour, indigenous peoples, people with accessibility needs, and migrant populations, especially those displaced by climate disaster, armed conflict, and global economics.

These new media archives are rooted in enabling the community itself to accessibly and rapidly generate its own archival content, in response to the inability of traditionally large institutions like museums and government-run organizations to include marginalized people, especially in the face of rapid change caused by climate disaster or armed conflict. The new media forms I intend to include in my research are audio recording, photography, 3D scanning, and 3D printing. For the Sherman Centre Graduate Residency in Digital Scholarship I will mainly be focused on photogrammetry technology. Photogrammetry is a 3D scanning technique in which an object is photographed from multiple angles; these photographs are then compared in a computer program and a full-colour 3D mesh is created. This mesh can be very high resolution, allowing objects to be viewed in 3D on a computer or virtual reality headset, or 3D printed in full colour out of various materials.

My residency project involves developing a toolkit for accessible and rapid 3D scanning and printing, using photogrammetry and prosumer 3D printers, that small community groups and not-for-profit organizations can implement to create personal archives and digital scholarship. This research will culminate in a final working prototype that also functions as a work of speculative design: a “break the glass in case of emergency” photogrammetry scanner that can be used in areas affected by catastrophe, including climate disaster and armed conflict that may lead to mass migration or evacuation, where personal artefacts and heirlooms may not be able to be transported or preserved and need to be documented in three dimensions quickly. This system will be based on low-cost, open-source electronics, mainly Raspberry Pi, with a 3D printed protective housing.
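As a sketch of how the capture side of such a scanner might work, the Raspberry Pi camera module can be driven with a few lines of Python via the standard picamera library. The shot count, resolution, and pause (for rotating the object or turntable between shots) are illustrative assumptions:

    # Sketch: capture a ring of photos for photogrammetry on a Raspberry Pi.
    # Shot count, resolution, and pause between shots are illustrative.
    import time
    from picamera import PiCamera

    SHOTS = 36          # one photo every 10 degrees around the object
    PAUSE_SECONDS = 3   # time to rotate the object between shots

    camera = PiCamera()
    camera.resolution = (3280, 2464)  # full resolution of the v2 camera module

    try:
        for i in range(SHOTS):
            camera.capture('/home/pi/scan/angle_%03d.jpg' % i)
            time.sleep(PAUSE_SECONDS)  # rotate, then take the next shot
    finally:
        camera.close()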

The aforementioned toolkit would include a digital wiki/archive of tutorials on how to use several programs to create the kind of media I will be generating during the residency. This toolkit can be useful for individuals, community groups, and larger institutions looking to enter into photogrammetry-based 3D scanned and 3D printed archives.

During my residency in the Sherman Centre for Digital Scholarship I have begun using Agisoft PhotoScan (http://www.agisoft.com/), a prosumer photogrammetry program featuring some professional capabilities while maintaining user friendliness and a relatively low cost. I have done several rounds of testing with other programs and I will continue to update the blog with my results.

Below are some examples of the process and results of using Agisoft PhotoScan to scan a molcajete and temolote:

Above, the molcajete and temolote set can be seen with the 3 LED lights used to create even lighting. Using even lighting like this seems to have helped generate accurate scans.

Above, the temolote is ready to be photographed from several angles, at a radius of about 4 feet, in a dome pattern. I used an 85mm prime lens at f/22 and 18 megapixels. These settings help to reduce distortions that might make the scan reconstruction difficult.

Above, a view of the Agisoft PhotoScan software, after the images were aligned and a mesh generated.

In the video above, the images taken around the temolote can be seen. These images are processed to create a 3D mesh that is also textured with the photos, giving a full-colour model.

From there, an OBJ or STL 3D file is exported, as depicted above. This image was rendered in Blender3D. This file can then be used for 3D printing.

Printing the scan on an Ultimaker 2+ in natural PLA filament.

The result is a highly detailed 3D print. The settings of the 3D scan file and 3D print file (the G-code sent to the printer) can be fine-tuned to yield greater detail, at the cost of much longer generation and printing times.


Below are some readings you can use to learn more about the subjects of 3D scanning, 3D printing and speculative design:

Morehshin Allahyari

Material Speculation: ISIS (2015-2016)
“Material Speculation: ISIS” is a 3D modeling and 3D printing project focused on the reconstruction of 12 selected (original) artifacts (statues from the Roman period city of Hatra and Assyrian artifacts from Nineveh) that were destroyed by ISIS in 2015. “Material Speculation: ISIS” creates a practical and political possibility for artifact archival, while also proposing 3D printing technology as a tool both for resistance and documentation. It intends to use 3D printing as a process for repairing history and memory.

http://www.morehshin.com/material-speculation-isis/


Near Future Laboratory

At the Near Future Laboratory, our goal is to understand how imaginations and hypotheses become materialized to swerve the present into new, more habitable near future worlds. We work from a variety of conceptual and creative platforms to help explicate context and explore future states, including various calibers of research — from the field to the desk to the lab bench and everything in between.

http://nearfuturelaboratory.com/


Dunne & Raby

Today designers often focus on making technology easy to use, sexy, and consumable. In Speculative Everything, Anthony Dunne and Fiona Raby propose a kind of design that is used as a tool to create not only things but ideas. For them, design is a means of speculating about how things could be—to imagine possible futures. This is not the usual sort of predicting or forecasting, spotting trends and extrapolating; these kinds of predictions have been proven wrong, again and again. Instead, Dunne and Raby pose “what if” questions that are intended to open debate and discussion about the kind of future people want (and do not want).

http://www.dunneandraby.co.uk/content/projects/756/0 http://www.dunneandraby.co.uk/content/projects/75/0


Some Reflections on the Intersection between Conventional and Digital Approaches to Scrolls Research

Over the last seventy years, Dead Sea Scrolls research has carried on in a permanent state of revolution, with new methods, technologies, and bodies of evidence overturning or qualifying old consensuses. To current PhD students like myself, who are dissertating on the Scrolls, many of the recent advances in digital approaches and tools appear to be changing the face of the discipline; however, to seasoned scholars this revolutionary change is nothing new. Scrolls research has always been like Menelaus wresting an oracle from the shape-shifting Proteus—change and adaptation are the norm. The ill-conceived myth of the triumph of digital scholarship over conventional scholarship simply does not apply. The key consideration for early-career Scrolls scholars is how to follow in the footsteps of earlier generations in usefully integrating new tools and approaches without abandoning the conventional. During research that I carried out this summer in Jerusalem on the Thanksgiving Hymns from Qumran (1QHodayota), I frequently found myself combining the old with the new to address pressing research questions.

1QHa is a particularly challenging scroll to study because, unlike many of the rest of the Dead Sea Scrolls, new high-quality images, such as high-resolution multispectral images or RTI (reflectance transformation imaging) images, are not yet available. Even if they were, however, the plates in the Dead Sea Scrolls of the Hebrew University,[1] the Shrine of the Book images, and the plates in the edition of 1QHa in volume 40 of the Discoveries in the Judaean Desert series would still be indispensable.[2]

The older images document the state of the manuscript in the years after its discovery and in the process of its unrolling—a resource that new digital tools or approaches cannot replace. Consequently, I find myself drawing heavily on conventional editions and photographs, even as I am making digital reconstructions of columns in GIMP and rolling them in three-dimensional environments to compare patterns of damage in digital modeling suites like Blender. When creating a reconstruction of a scroll in its rolled state, it is best to use these early images so that any modern shrinkage, decay, or damage are not baked into the model. Thus, even digital Scrolls research is forever anchored to those initial images.
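For readers curious what “rolling” a reconstruction looks like in practice, here is a minimal sketch using Blender’s bundled Python API (bpy). The strip proportions and bend angle are placeholders, not measurements taken from 1QHa:

    # Sketch: curl a flat "sheet" into a partial roll in Blender (bpy).
    # Proportions and bend angle are placeholders, not real scroll data.
    import math
    import bpy

    # A subdivided plane stands in for a reconstructed column image
    bpy.ops.mesh.primitive_plane_add(size=1)
    sheet = bpy.context.active_object
    sheet.scale = (4.0, 0.5, 1.0)  # long, narrow strip

    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.subdivide(number_cuts=50)  # enough geometry to bend smoothly
    bpy.ops.object.mode_set(mode='OBJECT')

    # A Simple Deform modifier set to BEND curls the strip around an axis
    mod = sheet.modifiers.new(name="Roll", type='SIMPLE_DEFORM')
    mod.deform_method = 'BEND'
    mod.deform_axis = 'Z'
    mod.angle = math.radians(720)  # two full turns of the roll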

In addition, when working on problems of material reconstruction, there are questions that cannot be answered relying solely on either editions or digital tools. Scrolls are three-dimensional objects, and certain aspects are not fully captured by existing images; e.g., texture, thickness, shrinkage, light damage, and the extent of delamination. What appear in photos to be patterns of repeating damages, or potential joins between fragments, may be ruled out upon first-hand inspection of the fragments themselves, especially with the help of conservators who are intimately familiar with the physical manuscripts and causes of damage. I found this to be the case when I visited the Shrine of the Book, where 1QHa is archived. Hasia Rimon, a conservator who has worked closely with the Shrine’s manuscripts since 2012, helped me to see and understand the condition of the manuscript and how it has been conserved since its discovery. The same applies to conservators at other institutions that conserve Dead Sea Scrolls, most notably the Israel Antiquities Authority, which is responsible for the vast majority of the Judean Desert manuscripts, including the other Hodayot manuscripts.


Shrine of the Book, Israel Museum. Photo Credit: Author.

Furthermore, a visit to the Shrine of the Book or the IAA is the only way of tapping into the institutional memory of the discovery of the Scrolls and their condition over the course of the last seventy years. For example, anyone who has visited the Shrine of the Book will know of Irene Lewitt’s formidable knowledge of the whereabouts of the Shrine’s scrolls and their photos over the last seventy years—especially that of 1QHa and the other Hebrew University scrolls. A similar knowledge base exists at many of the institutions in Jerusalem with historical ties to the Scrolls, like the Orion Center, the École Biblique, the Rockefeller Museum, and the Albright Institute.

One of the perennial methodological concerns for digital scholarship is how to use new tools and approaches judiciously and in ways that actually advance the field. For Scrolls research, implementing new digital approaches requires a thorough consideration of the conventional resources, tools, and institutional memories to gain new insights. This combination of innovation and convention is nothing new—it is business as usual for Scrolls scholarship in making use of every available means to yield new insights into the Dead Sea Scrolls.

[Expanded from 2017 Newsletter of the Orion Center for the Dead Sea Scrolls]

[1] E. L. Sukenik, The Dead Sea Scrolls of the Hebrew University (Jerusalem: Magnes Press, 1955).

[2] Hartmut Stegemann and Eileen Schuller, DJD 40.

Works Cited

Schuller, Eileen and Hartmut Stegemann. Qumran Cave 1.III: 1QHodayota with Incorporation of 1QHodayotb and 4QHodayota-f. DJD XL. Oxford: Clarendon, 2009.

Sukenik, E. L. The Dead Sea Scrolls of the Hebrew University. Jerusalem: Magnes Press, 1955.
Posted in Blog Tagged with: , ,

Making Uganda’s Intellectual History Digital: Knowledge Preservation and Ethical Considerations

As a historian of Africa and the colonized world, my research continuously pushes me to consider the unequal power relationships that govern the preservation and presentation of knowledge about the past in these places. Asking questions about how history is being done, by whom, and with what sources is necessary for undertaking ethical scholarship. How does the academy’s presentation and consumption of sources shape their historical meaning? How can digital tools be used ethically to develop and enrich our fields of study? Are the Digital Humanities neo-colonial?

My name is Samantha Stevens-Hall and I am a 5th year PhD student in the History Department and a returning Graduate Fellow/Resident at the Sherman Centre this year. I am a historian of Africa, more specifically of the intellectual history of the kingdom of Buganda, the predecessor to modern-day Uganda, in the 19th century during the transition to British colonial rule. My dissertation research took me to several continents over the course of 2 years, during which I visited a variety of libraries and archives. It was my experiences in these archives that brought me to the Digital Humanities. The materials pertinent to doing this intellectual history are not housed within one country, let alone one continent. While Britain, as the metropole, has fairly rich archives on this period, Uganda itself holds mostly fractured and poorly preserved copies, if any at all, of the intellectual works of some of the key figures in the country’s past. And so, I began to consider how to make these materials more available, both within Uganda and elsewhere outside of Britain, so that this history is no longer restricted to those with the monetary and institutional support needed to undertake long-distance fieldwork research.

My dissertation deals with networks of knowledge and knowledge transfer during the period of transition from oral to written culture in Uganda, which coincided with the transition to British colonial rule. I am interested in what happens to knowledge and sources when they are transferred between mediums: from oral to written, typescript to microfilm, catalogued in physical archives to uploaded to the web as digital sources. In the case of the sources used in my dissertation, this means from oral to handwritten in the vernacular, from written to typescript translated into English, and finally to partial digitization in the contemporary period. Digitizing some of these pieces of intellectual history offers the opportunity for discussion about what happens to sources when they are transferred from one medium to another. As much of my thesis deals with themes of translation and the transition from oral to written culture, I am also interested in what happens to the colonial archive and the dissemination of colonial knowledge when sources are made available digitally.

My DH project is an open access digital archive of primary sources and supplementary materials in African intellectual history. This archive would serve as a repository for endangered documentary materials and as an exhibition to curate and display the intellectual history of Uganda. The materials incorporated come from the archival work done for my dissertation; these include biographies of a few key Ugandan intellectuals who are the focus of my dissertation, with appended excerpts from their works. This archive will bring together scattered sources into one easily accessible online resource. Further, it would make a contribution to the DH community through its mandate of decolonizing the archive and attempting to bridge the “digital divide” between the West and Africa in computing access and capabilities.

Last year my proposed project was a prototype digital archive containing documents and other materials pertaining to the history of Uganda in the 19th century, collected during my dissertation research. While this has not changed significantly from last year, the goals of the project have shifted somewhat, and what is realistically possible to complete in 12 months, and the steps necessary to meet these goals, have become much more well-defined. Last year I proposed building an online archive and exhibition structured around three portfolios of Ugandan intellectuals from the period of transition to British colonial rule, during the last decades of the 19th century and the first decades of the 20th. While I am no longer sure that organizing the exhibition biographically makes the most sense, it will definitely be divided into folders arranged along thematic, temporal, or biographical lines, with each containing document files and appended relevant information and analysis. The excerpts would come from materials collected during my archival work over the past three years. Some of these materials have been published and others are from unpublished manuscripts. The excerpts would be selected to show the dynamic character and variety of intellectual activity in Uganda in a way that supports the key argument in my thesis: that these intellectuals were multidimensional figures engaged in a vibrant culture of knowledge exchange and debate over representations of the past. The archive will bring together materials that are now held in disparate and distant archives across several continents and not digitized, which prohibits their study without extensive funding for travel. Creating an Open Access archive would make the materials available much more widely. This will foster new studies of Uganda’s intellectual past from within the country’s own institutions and contribute to both the preservation and dissemination of knowledge about the country’s past.

On a final note, most of the documents I am working with are not easily accessible outside of archives or university libraries in the West. The archives that do house some of these sources in Uganda are in poor condition; if not catalogued and digitized soon, the sources risk complete destruction. Beyond my dissertation work I am deeply interested in the tenuous relationship between history and politics in contemporary Uganda. History is often a “dirty” word, and no national histories are taught in primary or secondary school. The study of history is overshadowed by disciplines with more applicable career skills at the nation’s universities. That being said, there are academics and politicians interested in preserving the region’s history who are willing to undertake the massive project of preserving the archives and turning the tides of public opinion back towards valuing history. My hope is that this project will be a step in the right direction and provide a possible template for future archival repositories, community engagement, and ethical knowledge preservation and dissemination.


Visualizing Climate Change and Environmental Disaster in Ontario

On 10 July 1911, one of the deadliest forest fires in Ontario history ripped through the north, totally destroying the new gold rush community of Porcupine Lake. When it was all over, the blaze had claimed seventy-three lives, burned over half a million acres, and caused millions of dollars in property damage. People burned to death in their homes, suffocated in mine shafts, and drowned while trying to take shelter in the storm-ravaged lakes. The nascent mines, surrounding communities, railways, and other infrastructure were reduced to twisted metal and rubble. The blaze made international headlines and was called by the media the “worst disaster in Ontario history.”[1]

In an era of increasingly regular climate-related natural disasters, the Great Fire of 1911 proves instructive. What human and environmental factors made the fire so catastrophic? How did climate shape the fire and human responses to it? Finally, we all know that the climate is changing, but what does that look like on the local scale? How does 1911 compare to today?

Regular fire cycles are a natural part of Ontario’s northern forest ecology – burns like the Great Fire of 1911 occur at roughly 10-year intervals.[2] Northerners knew about these burn cycles and had a long history of living with fire. The annual report of the Ontario Bureau of Mines recorded burned forests every summer of its tenure in the province, starting in 1891.

Given the regular, predictable nature of northern fire, the second chapter of my dissertation argues that the destructiveness of the Great Fire of 1911 stemmed from a combination of factors. In their hurry to exploit local gold, newcomers had not thought much about fire-proof construction. Adding to this problem, increased population concentration, insect attacks in 1905, and a policy of active fire suppression in Ontario led to the build-up of excess fuel on the landscape.

At the same time, Ontario (along with the rest of North America) entered an especially hot, dry period which peaked in the summer of 1911.

The warm, dry period of the early twentieth century is not something I invented. Fire historian Stephen Pyne lists this early twentieth-century warming as one of the major contributing factors to the devastating forest fires in America in 1910 in his book Year of the Fires: The Story of the Great Fires of 1910. But how did these climate patterns specifically affect Ontario, and can we, as Pyne has done for the United States, connect the Great Fire of 1911 to climatic warming in Ontario during these years?

The government of Canada keeps historic climate data on its website. The data is divided by weather station, and some stations go back further than others. In the case of Porcupine I was immediately hamstrung by the fact that the records only go back to 1922. So let’s be clear: I cannot actually say what was going on climatically in my study area in 1911. In fact, the furthest north I could get was Ottawa. However, at the risk of re-affirming Ottawa’s perceived place at the center of the universe, I can say with confidence that if it was hotter and drier than normal in Ottawa, it was probably hotter and drier than normal in Porcupine – the climates are close enough for the Ottawa data to be useful for seeing broader trends in Ontario climate for the early 20th century.

The data shows that in 1911 Ottawa recorded its hottest-ever days on 3 July, 9 July, and 10 July (the day of the fire), along with well below average rainfall and light snowpacks.[3] I can corroborate this finding with anecdotal evidence. The Globe recorded fifty-eight heat-related deaths and dangerously low city reservoirs on 8 July 1911.[4] Dominion Horticulturalist W. T. Macoun recorded a hot, dry spring (which shortened the flowering season for many blooms) and noted that “July was an extraordinarily hot month, one of the hottest ever experienced.” According to Macoun the mean temperature in July was a scorching 97.8 degrees Fahrenheit (36.5 degrees Celsius), nights remained hot, and rainfall was light.[5]

Just to get a sense of how hot it was in July of 1911, and of what the government climate data looks like, take a look at the screenshots below. The two tables show the temperatures for the first 15 days of July 1911 vs. July 2017 (degrees Celsius).


To look more closely at 1911, but also to think about how climate has changed over the long term, let’s plug the climate data into some visualization software.

Here’s a graph I made with Excel’s PivotTable feature showing the maximum high temperature for each year over the entire period of record.

Now the obvious story here is 2012 (woah), but we can also see a cluster of high-temperature years at the beginning of the 20th century, between 1901 and 1917, which is the hot period I talk about above.
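For anyone who would rather script the pivot than click through Excel, the same yearly maximum is a few lines of pandas. The file name and column headers below are assumptions about the Environment Canada daily CSV download:

    # Sketch: hottest recorded day per year from an Environment Canada daily CSV.
    # File name and column headers are assumptions about the downloaded data.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("ottawa_daily.csv", parse_dates=["Date/Time"])

    # Group daily maxima by year and keep each year's hottest reading
    yearly_max = (df.set_index("Date/Time")["Max Temp (°C)"]
                    .groupby(lambda ts: ts.year)
                    .max())

    yearly_max.plot(title="Hottest recorded day per year, Ottawa")
    plt.ylabel("Max temperature (°C)")
    plt.show()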

Here’s precipitation:

Here we can see that the early 20th century saw low average precipitation, again supporting the idea that Ontario suffered a particularly dry period before the great fire. And if we look ahead to the 21st century, precipitation begins to fluctuate to greater low/high extremes.

And here’s snowfall, this time in Tableau.

That big dip between 1905 and 1915 is 1911, when snowpacks were abnormally light. But again, there is an interesting story starting around 1975 when we can see a real sustained drop in average snowfall.

The three graphs lend weight to my argument that 1911 was a particularly hot, dry year in Ontario, and they show how that period fits into the longer story of our climate.

They also show the gradual effects of climate change since about the mid-twentieth century. Climate change is a gradual, nearly invisible process in our day-to-day lives, but looking at the historic climate data we can see how it has already impacted local patterns in precipitation and temperature in our communities.

Given the changing climate and the increase in extreme weather events in the 21st century – including forest fires – what lessons can we draw from the Great Fire of 1911?

In the aftermath of the catastrophe, the Great Fire of 1911 became memorialized as a transitional moment or a “baptism by fire” for mining in Northern Ontario. Porcupine’s ability to survive and thrive after the fire continues to be cited as a testament to the community’s toughness and tenacity on a difficult frontier. By 1914, Porcupine had gone from a relatively chaotic small-scale gold rush to a large-scale, low-grade, efficient, deep-mining industry on par with the biggest mining enterprises in the world. In my dissertation, I argue that the economic destruction of smaller mining companies allowed bigger syndicates to buy up valuable land and take control of the best deposits after 1911.

How did the industry adapt to the disaster? In order to protect their assets against future calamities, Porcupine assembled forest-fire fighting infrastructure including fire-proof buildings, fire towers, and a full time fire-fighting force.[6]

These measures failed: Porcupine burned again in 1916, experienced a major mine fire in 1928, and has been periodically plagued by bush fires right up to the present.[7] Moreover, by focusing on fire prevention, Porcupine failed to adapt to other environmental problems, including flooding, land scarcity, food and water insecurity, and mine waste disposal – all of which caused significant problems down the road.

This is sort of a depressing take-away. But with a little more research, I think I can dig into this evidence for some examples of successful adaptation to climate change and extreme weather in Ontario. In general, when facing environmental problems, did collaboration with community members produce better outcomes? What was the role of international science in environmental adaptation? What specific characteristics of the successful companies post-1911 helped them to endure the trauma of the fire? There are other anomalies in those long-term graphs that produced less catastrophic histories – what happened during those years that allowed mining to proceed unscathed?

The answers to these sorts of questions could contain lessons that can potentially help Canadian resource communities adapt in the future.

Disclaimer: I am not a statistician, so I welcome any feedback and/or pointing out of my glaring errors.

Further Reading:

Global Historical Climatology Network

Historical Climatology

Climate History Network

Footnotes: 

[1] “Porcupine Disaster Intensifies; Refugees Fleeing from the Scene,” The Globe, 14 July 1911.

[2] I.D. Thompson, A. Perera, and David Euler, Ministry of Natural Resources, Ecology of a Managed Terrestrial Landscape: Patterns and Processes of Forest Landscapes in Ontario (Vancouver: UBC Press, 2000), 41-42.

[3] Government of Canada, “Ottawa data” in Almanac Averages and Extremes, Historical Climate Data, Accessed 18 May 2017, http://climate.weather.gc.ca/.

[4] “Record is now fifty-eight deaths,” The Globe, 8 July 1911.

[5] W.T. Macoun, “Report of the Dominion Horticulturalist,” No. 16, 31 March 1912, in Second Session of the Twelfth Parliament of the Dominion of Canada Session 1912-13 (Sessional Papers) (Vol. 9), 86.

[6] The Porcupine Advance documents a long history of fire protection measures. See “Can New Bush Fires be Prevented?” Porcupine Advance, 9 August 1916; “Town Council Passes Fire Bylaw,” Porcupine Advance, 22 November 1916; “The Heliograph used in Firefighting,” Porcupine Advance, 31 January 1917; “Government Plans for Preventing Fires,” Porcupine Advance, 14 February 1917; “Modern Electric Fire Alarm System,” Porcupine Advance, 19 September 1917; “Getting Ready for Fire Menace,” Porcupine Advance, 5 May 1920; “Cultivated Fields Would Remove Fire Menace,” Porcupine Advance, 11 October 1922; “Mile Fireguard Around Northern Town,” Porcupine Advance, 8 November 1922; “Fire Guard Around Timmins,” Porcupine Advance, 30 May 1923; “New Fire Towers,” Porcupine Advance, 9 January 1930.

[7] “Terrible Fires Sweep Northern Ontario,” Porcupine Advance, 5 August 1916; “Early Bush Fires,” Porcupine Advance, 2 June 1920; “45 Die in Big Fire,” Porcupine Advance, 14 October 1922; “Fire Rings About Towns,” Porcupine Advance, 4 November 1922; “Fire Hazard,” Porcupine Advance, 4 February 1923; “Final Hollinger Fire Report,” Porcupine Advance, 18 October 1928; “Forest Fires Rage,” Porcupine Advance, 1 August 1929.


Putting Health Beliefs on Maps

As an extension of my dissertation, this project stems from a long-standing interest in global health outreach, gender studies, and public health policy. Successful public health policies rely on a deep understanding of the various health beliefs that underpin health behaviors. This pilot project will be an interactive online archive of the health beliefs that underpin the various anti-vaccine movements around the world. The project will draw on Arthur Kleinman’s 1978 anthropological framework of the internal structure of the health care system (Figure 1), which comprises three belief systems that influence our health practice of choice:

  • the “professional sector” represents modern/Western medicine (e.g., hospital care, medical school trained physicians, immunization; concepts used include evidence-based, diagnosis, treatment, prescription, prognosis)
  • the “popular sector” represents contemporary influence (e.g., friends’ and family’s health beliefs, popular celebrity health claims, advertisements of health practices, internet dietitians, naturopaths, homeopaths, including an array of contemporary false claims such as “vaccine toxicity”, “immune overload”, “adrenal fatigue”, etc.)
  • the “folk sector” represents traditional healing and wellness practices (e.g., healing, spiritual healing, natural healing, etc.)

These three belief systems can overlap, complement, and/or conflict with one another – thus, the implementation of successful public health policies depends on a deep anthropological and phenomenological understanding of the lived experiences of the population.

An example of the importance of documenting health beliefs is the case of the low polio vaccine coverage rate in Nigeria, where polio is endemic – there is currently a widespread belief that the polio vaccine is an American bio-weapon that sterilizes the Muslim population as a form of mass genocide. Similarly, in Japan, where cervical cancer remains prevalent, the HPV vaccine coverage rate remains close to 0% due to rumours of HPV-vaccine-induced anaphylaxis promoted by major media, which led the Japanese government to defund the HPV vaccine, citing low public interest.

Over the course of the next 9 months, I will produce an interactive map that documents health beliefs that impede vaccination efforts. A sample webpage is “Healthmap” (Figure 2), and the sources of documents are similar to those of “The Vaccine Confidence Project” (Figure 3). The former website documents infectious diseases and outbreaks through active surveillance (e.g., outbreaks are collected by mining keywords of specific diseases from select media on the web and then added into the registry) and passive surveillance (e.g., outbreaks are “added” by the public, usually health practitioners and Centers for Disease Control professionals, and verified by the web owner). The latter website was established in 2013 to collect news and published articles on vaccine hesitancy. The “Atlas of Vaccine Hesitancy” will help researchers navigate health beliefs prior to implementing any immunization policy. It fills the gap of documenting the health beliefs that underpin vaccine refusal in an easily retrievable way online, and it will allow submissions of new entries from researchers, health professionals, and the public.

This website can be used by public health policymakers, global health researchers, anthropologists, epidemiologists, etc. – it can also be used by members of the public interested in learning about vaccine refusal and vaccine hesitancy. I envision the site to be a portal for knowledge exchange as well as a lens through which practitioners of Western medicine can come to understand that vaccine hesitancy is not simply a deficit of scientific knowledge, but an interpretive construct of beliefs. Users can navigate with filters such as vaccine type, reasons for vaccine hesitancy, and current or past vaccine refusal (e.g., users can type ‘1995’ or ‘1880’ to find the type of vaccine refusal at the selected year/decade). The search will yield a location of vaccine hesitancy, indicated by a “pin”. Clicking on a pin on the map will open a short paragraph on the status of vaccine refusal in the selected region, with links to journal articles, websites, news media, and other archival documents that have documented said vaccine hesitancy.
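To make the pin-and-popup behaviour concrete, here is a minimal sketch of the idea using Python’s folium library; the coordinates and entry text are invented examples, and folium is only one of several tools I am weighing:

    # Sketch: a clickable pin map of vaccine hesitancy entries using folium.
    # The two entries below are invented examples for illustration only.
    import folium

    entries = [
        {"lat": 9.08, "lon": 8.68, "year": 2003, "vaccine": "polio",
         "summary": "Sterilization rumours halted campaigns in parts of "
                    "northern Nigeria."},
        {"lat": 36.2, "lon": 138.3, "year": 2013, "vaccine": "HPV",
         "summary": "Media reports of adverse events preceded Japan's "
                    "suspension of proactive recommendation."},
    ]

    atlas = folium.Map(location=[20, 0], zoom_start=2)
    for e in entries:
        popup = "%s (%d): %s" % (e["vaccine"], e["year"], e["summary"])
        folium.Marker(location=[e["lat"], e["lon"]], popup=popup).add_to(atlas)

    atlas.save("atlas_of_vaccine_hesitancy.html")  # open in any browser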


Figure 2. Health Map of outbreak alerts.


Figure 3. Vaccine Confidence Project.

Here comes the rambling. The system architecture of the “Atlas” should be pretty straightforward: it has a web frontend, a web backend, a data repository, a classification engine, and a crowd-sourced data acquisition API. But here comes the problem – most of the maps I am talking about have million-dollar grants, with teams of 6 people. I am one person, with no research grants (the Sherman Centre does provide grants for travelling and attending conferences for residents, but that’s beside the point). Because I’m not a programmer or a GIS specialist, I’ve spent the past two weeks playing around with map-making software that doesn’t require me to program, hoping to get a sense of which option best suits my needs. We’ll go through the process of how this map is going to be built, for anyone who might be interested.

One of the easiest ways to build a crowd-sourcing map is using Story Maps, developed by ArcGIS. To use Story Maps, you need an online ESRI account. An example of Story Maps is “Faces of Opioid” (see Figure 4): anyone from the public can contribute to the map by clicking “Add Lost Loved One” and uploading a picture. This is the first idea. There is no better way to create maps than ArcGIS: it is powerful and intuitive, has a rich online resource database for case-based learning, and reads and exports all sorts of files (xls, json, csv), including from the cloud (Dropbox, OneDrive, Google Drive). But it’s not free ($100 CAD/yr for students). And I love free. So, I looked around for other map-making tools.


Figure 4. Faces of Opioid

The second way to make maps and populate them is Tableau. This one isn’t free either, but it is for students. There is no better way to make beautiful graphics and charts than with Tableau – I have heard the director of Ryerson University’s Social Media Lab, Anatoliy Gruzd, call it “Excel on Steroids”, and it is pretty much that. There are functions to present your map chronologically, but it’s not built for crowdsourcing. The third way to quickly build a map is Infogram, which allows you to populate the map by yourself – it’s pretty popular with infographic makers, and the interface is easy to use.

There’s also the possibility to build the map using Google My Maps. And after tinkering with it a bit, I found it to be the most customizable for crowd-sourcing and for input from my end into the repository. But I have not completely given up on Story Maps yet.

Before I sign off, I just wanted to share a site called Disease Map, which allows you to pin your chronic or hereditary disease on the map and connect with others who have the same disease online. Figure 5 shows people with anemia who have pinned themselves on the map.


Figure 5. Disease Map

I’ll be returning to show you the constructed Atlas in a few months’ time.


Everything Cold and White

Screenshot from Thunderbird Strike

CN: gendered violence, death

Pain

Collecting data can be painful: lower back spasms, headaches, blurred vision. I take screenshots for hours. Then there is the content of the data— Bell Let’s Talk documents, advertisements for Bell on the Canadian Mental Health Association (CMHA)’s website. The stamp of capitalism bleeding over every page. Twitter threads that make you realize everything is a little bit worse than you were originally told, even though you knew this already, you’ve known since the first time someone grabbed your ass, and the last time someone told you how to cure your anxiety. You must look like a project that needs fixing, an imperfect line of code, always spitting out an error number.

I’ve been thinking a lot about presence lately. In Mad At School (2011), Margaret Price asks: “what does “participation” in a class mean for a student who is undergoing deep depression and cannot get out of bed? Or a student who experiences such severe anxiety, or obsession, that he can barely leave his dorm room or home?” (5-6). My grandmother died this term and I spent two weeks in my sweatpants watching Friends and playing Fable 3. Between grief and everyday depression, I missed every meeting for the digital scholarship graduate conference of which I am a committee member. I posted comments on the shared google document. I emailed suggestions. I have a virtual presence when I cannot be present physically.

Different colours of pain weave across our research.

Settler

Beth LaPensée is an Anishinaabe, Métis, and Irish artist and game designer. Her game Thunderbird Strike won the Best Digital Media Work award at ImagineNATIVE. LaPensée is currently under siege by oil lobbyists who want to rescind her art grant and destroy the game. Why? The game advocates for the removal of oil pipelines. I follow LaPensée on Twitter, and I’ve played Thunderbird Strike (it’s beautiful and powerful— and free! Play it!). Recently, I read Leanne Betasamosake Simpson’s Islands of Decolonial Love. I am currently reading the Michi Saagiig Nishnaabeg scholar’s As We Have Always Done: Indigenous Freedom Through Radical Resistance, and a collection of short stories by Dawn Dumont, a Plains Cree writer, entitled Glass Beads. I’m a third year PhD student in English and cultural studies and I have never read critical/cultural theory written by an Indigenous woman. I work in a colonial university, in an English department that privileges eurocentric texts and written culture over Indigenous literatures and oral cultures.

How do I honour the land on which I work and play, stolen land, and the people it was stolen from? How do I discuss mental health movements in the context of colonialism, and attend to the ways in which both the psychiatric industry and Mad Pride are settler movements, and have been leveraged as weapons of colonialism? And how do we honour Indigenous resistance that has always existed when we talk about mental health and Mad digital activism?

Haudenosaunee writer Alicia Elliot discusses depression, anxiety, and chronic illness in her article “On Being an Ill Writer”: “Is there a way that we can create a space, a language, around illness, that not only robs it of its stigma, but also positions it as a fact of life instead of merely an obstacle to be overcome on some imaginary road to “wellness”?” I want to follow these writers into their truths. Not to steal— we’ve stolen enough (we continue to steal). But to acknowledge the work that sick, disabled, and mentally ill Indigenous women do, resistances that are not colonial, capitalist, anti-Black, violent.

I am starting to understand that the framework of my research, with its settler focus, colonial violence, and Indigenous erasure, is nowhere near good enough.

Winter

The light folds away into origami cranes and the world is dark. The world is dark. Mad Pride Toronto retweets a CBC article entitled “Mentally ill patient died while strapped to bed in locked hospital room.” Everything is a little bit worse. They’ve stopped tweeting about Andrew Loku— there are new deaths to bear witness to. Next year, I will have to do something with this data I am collecting, make sense out of it, untangle it like a child’s skipping rope. As if the act of tweeting as both witness and testimony is simple, something easily categorized and dissected and known. As if we aren’t all entangled in our work, looking up organizations on our own Twitter and Facebook and Instagram pages, sharing and saving and responding and hurting and trying. The footprints of my research are easy to find. It is a textbook crime scene.

Works Cited

Elliott, Alicia. “On Being an Ill Writer.” Open Book. November 13, 2017. Web.

Price, Margaret. Mad at School: Rhetorics of Mental Disability and Academic Life. Ann Arbor: University of Michigan Press, 2011. Print.



Building a Database: African American Women and Racialized Violence in the Postemancipation South

I am a PhD Candidate in the Department of History and a Graduate Resident at the Lewis & Ruth Sherman Centre for Digital Scholarship. I will be posting throughout the year, so I want to take this opportunity to introduce my research and explain how I will be using digital humanities techniques to supplement my dissertation.

My Research:

My dissertation examines how Black women devised a range of informal resistance techniques to contest racialized violence in all of its forms. In the late nineteenth and early twentieth centuries, racialized violence impacted African Americans across the postemancipation South. Generations of African Americans endured the constant threat of individualized and collective incidents of verbal abuse, sexual harassment, and physical assault. It is a false generalization, however, to characterize the Black response in terms of passivity. Black women, in particular, found various ways to resist – theft, sabotage, destruction of property, boycotting, migration – that over time were effective in undermining racialized violence.

To uncover instances of informal resistance, my dissertation draws on three primary sources: the Slave Narrative Collection of the Federal Writers’ Project, the first-person testimony culled from the Joint Select Committee to Inquire into the Condition of Affairs in the Late Insurrectionary States, and the records of the Bureau of Refugees, Freedmen, and Abandoned Lands.

My dissertation builds on previous research. As part of an undergraduate fellowship, I began studying the methods of resistance employed by African Americans against lynching (Rejecting Notions of Passivity: African American Resistance to Lynching in the Southern United States).  Overt methods of resistance were dangerous, as those African Americans who attempted to assert their rights as free citizens frequently became the targets of attack by white Southerners. By adopting clandestine forms of resistance with limited risk of reprisal, African Americans were able to thwart attempts at control through violence.

Building a Database:

Studying racialized violence across the postemancipation South poses certain challenges. Even after restricting the parameters of my dissertation to Georgia, Mississippi, South Carolina, and Texas, the number of documents available is immense. To manage my sources effectively, I intend to create a relational database that will feature data extracted from the testimony of both the victims and witnesses of racialized violence. The goal is to extract data on incidents of racialized violence – the victims and perpetrators, geographic locations, forms of violence, methods of resistance – in order to identify thematic trends. In particular, I am interested in the relationships between specific forms of violence and the methods of resistance employed in response.

For my previous work on African American resistance, I created a database of interviews from the Slave Narrative Collection using Microsoft Excel. Although useful for record-keeping, this flat database had limited ability to reveal relationships across multiple documents: each interview was entered into the spreadsheet without consideration of those around it, and any connections had to be identified manually. A relational database is more sophisticated; because it matches data common to multiple tables, it can surface relationships between documents automatically.
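
To illustrate the idea with a minimal, hypothetical sketch (the table and column names here are invented for demonstration, not drawn from my actual design): when two tables share a common field, a single query can match records across them, something a spreadsheet cannot do without manual cross-referencing.

```sql
-- Hypothetical example: two tables linked by a shared "narrative_id" field.
-- Matching on it connects each reported event to its source narrative.
SELECT n.informant, n.state, e.description
FROM narrative AS n
JOIN event AS e ON e.narrative_id = n.narrative_id
WHERE n.state = 'Georgia';
```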

At this stage, I am just beginning my primary research and testing the functionality of my database design, starting from the initial design that I included as part of my dissertation proposal.

When reading the Slave Narrative Collection, however, I quickly found faults in this design. When I originally designed the database, I foolishly assumed that each document would describe only one incident of racialized violence and/or resistance. But, of course, things are never quite so neat. While many narratives describe only one incident, an equal number describe multiple incidents. With my initial design, this would have required creating multiple entries in the “Connections” table and repeating data, particularly citations. The goal of any good database, though, is to reduce data redundancy and achieve normalization. So I have already redesigned my database once.

I am using MySQL Workbench to build my database.

The biggest change is that I have eliminated the “Connections” table and created two smaller tables: “Document” and “Incident.” The “Document” table tracks all the citation information and the people involved in the creation of each document. I use the “Incident” table to track individual incidents of racialized violence and/or resistance. Now, when I encounter a document that describes multiple incidents, I can just add additional entries to the “Incident” table without having to repeat all the data on the document itself.
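
As a rough sketch of what that structure looks like in SQL (the column names below are placeholders for illustration, not my actual schema), one “Document” row can now be referenced by any number of “Incident” rows, so citation data is stored only once:

```sql
-- Illustrative sketch of the redesigned structure; column names are
-- placeholders. Each incident points back to its source document.
CREATE TABLE document (
    document_id INT AUTO_INCREMENT PRIMARY KEY,
    collection  VARCHAR(100),   -- e.g. Slave Narrative Collection
    citation    VARCHAR(255),
    informant   VARCHAR(100),
    interviewer VARCHAR(100)
);

CREATE TABLE incident (
    incident_id INT AUTO_INCREMENT PRIMARY KEY,
    document_id INT NOT NULL,   -- link back to the source document
    location    VARCHAR(100),
    form_of_violence     VARCHAR(100),
    method_of_resistance VARCHAR(100),
    FOREIGN KEY (document_id) REFERENCES document (document_id)
);

-- The kind of trend query this design enables: which methods of
-- resistance most often accompany which forms of violence?
SELECT form_of_violence, method_of_resistance, COUNT(*) AS occurrences
FROM incident
GROUP BY form_of_violence, method_of_resistance
ORDER BY occurrences DESC;
```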

The current design is working quite well, but so far I have mostly been working with the Slave Narrative Collection and the first-person testimony culled by the Joint Select Committee. Because the records of the Freedmen’s Bureau are housed in the National Archives in Washington, D.C., I have not yet been able to access them. It is likely that my database will require additional tweaks, but for now this is what I am working with.


ERU1, Fall 2017 – Nicole’s Post

By Nicole

I was lucky to be able to participate in the Electronics for the Rest of Us module and learn a unique new set of skills. I was hesitant at first, given my very limited knowledge of electronics, but I soon found that Arduino was far easier than I had imagined. The two-day boot camp introduced some fundamental principles and components of circuitry, coding, and electronic devices using an Arduino board. Not only did we get to learn how to use the platform, we also had the opportunity to develop and apply the logical thinking that is unique to coding and electronic design. Over the span of the module, I was able to see just how powerful the device is, given everything it can do with very rudimentary circuit components.

The creative aspect of the course, where we made our own functioning devices, was the best way to solidify these new skills and start integrating all of the individual functions into one piece of code. My partner and I created a thermometer using a thermistor, with visual and auditory cues triggered at specific temperatures. We connected an LED screen and a sound button, which responded to the thermistor. At high temperatures, the screen turned red and displayed the message “it’s getting hot” while an alarm sound played; at lower temperatures, the screen turned blue, displayed another message, and played the Frozen theme song. Working together and struggling through the design process gave us a deeper understanding of the Arduino components and improved our problem-solving skills with each new challenge.
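
For anyone curious what a project like this involves, here is a simplified, hypothetical sketch of the core idea. The pin numbers, thresholds, and conversion constants are placeholders, and our actual build drove an LED screen and played music, which need their own libraries; this stripped-down version prints to the serial monitor and beeps a piezo buzzer instead.

```cpp
// Simplified thermometer sketch: read a thermistor and trigger cues
// at set temperatures. Pins, thresholds, and constants are illustrative.
const int THERMISTOR_PIN = A0;   // thermistor in a 10k voltage divider
const int BUZZER_PIN     = 8;    // piezo buzzer for the audio cue

// Rough conversion for a 10k NTC thermistor using the simplified
// beta-parameter equation (beta = 3950, T0 = 25 C).
float rawToCelsius(int raw) {
  float resistance = 10000.0 * (1023.0 / raw - 1.0);
  float invT = log(resistance / 10000.0) / 3950.0 + 1.0 / 298.15;
  return 1.0 / invT - 273.15;
}

void setup() {
  Serial.begin(9600);
  pinMode(BUZZER_PIN, OUTPUT);
}

void loop() {
  float tempC = rawToCelsius(analogRead(THERMISTOR_PIN));

  if (tempC > 30.0) {
    Serial.println("it's getting hot");   // shown on the screen in our build
    tone(BUZZER_PIN, 880, 250);           // alarm beep
  } else if (tempC < 15.0) {
    Serial.println("it's getting cold");
    tone(BUZZER_PIN, 440, 250);           // a different cue for cold
  }
  delay(1000);
}
```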

As someone with interests in biology and chemistry, I found the course very refreshing and a great introduction to a whole new field of science and technology, one that is highly applicable in many disciplines. I learned the basics of electronic design and gained confidence with Arduino and its programming language. I’m excited to find future projects where I can apply these skills, and I would highly recommend this opportunity to anyone else interested in doing the same!


ERU1, Fall 2017 – Constantine’s Post

By Constantine

Electronics for the Rest of Us was an enjoyable course that let me get a feel for how electronics work. The course is geared towards people who want to get more experience with electronics as well as those who are completely new to it. The programming involved is very simple and can be learned by new programmers. The instructors were amazing: they were enthusiastic about teaching, quick to help when we made a mistake in our code or schematic, and happy to answer any questions we had about electronics and how things work. We used an Arduino kit, built around a small circuit board that can be programmed fairly easily.

Not only did the course help me understand how circuits work, it also gave me some experience with coding. We had to learn how to read schematics and what the different symbols mean; from there, we created a few circuits by following schematics and were then able to alter the code to make them do what we wanted. Something I found really cool was that we were able to show different colours on an LED based on the temperature in the room: we made blue represent a cold room and red a warm one.
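
Here is roughly what that exercise looks like as an Arduino sketch. It is a hypothetical reconstruction, so the pin numbers and the raw sensor range are placeholders rather than what we actually wired up.

```cpp
// Illustrative sketch: shift an RGB LED from blue (cold) to red (warm)
// based on an analog temperature reading. Values are placeholders.
const int SENSOR_PIN = A0;  // analog temperature sensor
const int RED_PIN    = 9;   // PWM-capable pins driving the LED
const int BLUE_PIN   = 10;

void setup() {
  pinMode(RED_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

void loop() {
  int raw = analogRead(SENSOR_PIN);      // 0-1023 raw reading
  // Map the reading onto a red brightness; blue gets the inverse,
  // so a cold room glows blue and a warm room glows red.
  int red = map(raw, 300, 700, 0, 255);  // placeholder sensor range
  red = constrain(red, 0, 255);
  analogWrite(RED_PIN, red);
  analogWrite(BLUE_PIN, 255 - red);
  delay(500);
}
```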

For the last project, we were given time to create anything we wanted, as long as it met a few requirements. My partner and I created a thermostat that changed colour as the room temperature changed, and that played one song when the room was too hot and a different song when it was too cold. Overall, this course was very educational and something different: we got hands-on experience, which isn’t common in a lot of university courses. I highly recommend it to anyone who loves creating things and wants to expand their knowledge of how electronics work.
