You may remember that last week I provided a short example of using metaphor to give depth and life to some of my otherwise shallow and boring text. I’m not sure I achieved this, but it was a fun exercise for someone like me who likes playing with words. The crucial last step in Sword’s process is to share the reworked sentence with a friend, to see if it actually works and to get feedback on how to make it better.
This week’s readings were focused on Values in Design with Friedman & Nissenbaum (1997), Shilton, Koepfler, & Fleischmann (2014) and Borning & Muller (2012). We were fortunate to have Katie Shilton on hand to talk about her article, and about value sensitive design in general.
Friedman, B., & Nissenbaum, H. (1997). Human values and the design of computer technology. In (pp. 330–347). Cambridge University Press.
Shilton, K., Koepfler, J. A., & Fleischmann, K. R. (2014). How to see values in social computing: Methods for studying values dimensions. In Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing (pp. 426–435). Association for Computing Machinery.
This week we took a break from readings and reviewed each other’s initial research topic proposals. I believe the idea is for these proposals to feed into the work we do in next semester’s seminar, which ultimately leads to the integration paper that culminates our class work and then feeds into our dissertations. It’s not really appropriate for me to share my classmates’ ideas here, but I will say that I was really struck by how varied and interesting they were: methods for studying citizen science, values in design, trauma in information systems. Our discussion was useful because it revealed the degree to which I actually missed the intent behind the proposals. I also got some useful feedback about mine, which mostly brought home that I have yet to express an actual research project!
For the past few weeks (honestly, perhaps even months) I’ve been in the process of writing a piece for MITH about techniques for preserving websites. The idea is for it to be part of a series that Porter Olsen and Trevor Muñoz started on the topic of stewarding digital humanities work on the Web. I’m trying to follow on from Porter’s piece, which focused on the materiality of web servers, or how to work with web server hard drives as objects of preservation and data curation. My contribution, on the other hand, is going to examine the ephemeral nature of the Web: how broken links break the illusion of a World Wide Web, and what we can do about it. Most of the content will be centered on techniques for mitigating this breakage using principles of repair borrowed from Web architecture.
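Repairing broken links starts with knowing which links are actually broken, and what the server is telling you when they are. As a minimal sketch (a hypothetical helper of my own, not code from the piece itself), the HTTP status codes already encode much of Web architecture’s repair vocabulary:

```python
def classify_link(status):
    """Map an HTTP status code to a rough link-preservation category."""
    if 200 <= status < 300:
        return "ok"          # the resource is still there
    if status in (301, 308):
        return "moved"       # permanent redirect: rewrite the link
    if status in (302, 303, 307):
        return "temporary"   # follow it, but keep the original URL
    if status == 410:
        return "gone"        # deliberately removed: try a web archive
    return "broken"          # 404s and server errors: candidates for repair

print(classify_link(301))  # → moved
```

A crawler that records these categories over time gives you a map of where the illusion of the Web is fraying, and where a redirect or an archived copy might patch it.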
This week’s readings included some dire looks at life after the PhD: Kovalik (2013) on how easy it is to slip through the cracks of academia, and Johnson (2014) on the hyper-competitive life of the postdoc. Both were quite sobering. Johnson describes the problem in the health sciences, where reduced government funding has led to situations in which academic research labs are increasingly dependent on cheap labor (postdocs), who do most of the actual science, while faculty jobs are increasingly difficult to find because there are too many postdocs being cranked out to do the research. It was a somewhat frustrating article because while it hinted at how smaller labs could help correct this problem, it really didn’t explain how that could work. Would the problem just be harder to identify if there were lots of smaller labs, rather than fewer large ones? I like to think there is more to this idea of smaller labs that are geared more toward research. Perhaps they are more like projects with longer funding cycles than labs?
This week’s seminar was focused on citizen science. We had three readings: Wiggins & Crowston (2011), Quinn & Bederson (2011), and Eveleigh, Jennett, Blandford, Brohan, & Cox (2014), and we were visited by the author of the first paper, Andrea Wiggins. This class was a lot of fun because prior to talking about the readings we spent an hour walking around the UMD campus looking for birds, and collecting observations with Andrea’s eBird mobile app.
Eveleigh, A., Jennett, C., Blandford, A., Brohan, P., & Cox, A. L. (2014). Designing for dabblers and deterring drop-outs in citizen science. In Proceedings of the 32nd annual ACM conference on human factors in computing systems (pp. 2985–2994). Association for Computing Machinery.
Quinn, A. J., & Bederson, B. B. (2011). Human computation: A survey and taxonomy of a growing field. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1403–1412). Association for Computing Machinery.
Wiggins, A., & Crowston, K. (2011). From conservation to crowdsourcing: A typology of citizen science. In Proceedings of the 44th Hawaii international conference on system sciences (HICSS) (pp. 1–10). IEEE.
This week we focused on information visualization with Niklas Elmqvist from the UMD iSchool. Niklas studies information visualization and human-computer interaction. He joined UMD within the last year, after arriving from Purdue University.
Thank you for inviting me here today to be with you all at MARAC. I’ll admit that I’m more than a bit nervous to be up here. I normally apologize for being a software developer right about now. But I’m not going to do that today…although I guess I just did. I’m not paying software developers any compliments by using them as a scapegoat for my public presentation skills. And the truth is that I’ve seen plenty of software developers give moving and inspiring talks.
This week we dove into some readings about information retrieval. The literature on the topic is pretty vast, so luckily we had Doug Oard on hand to walk us through it. The readings on deck were Liu (2009), Chapelle, Joachims, Radlinski, & Yue (2012) and Sanderson & Croft (2012). The first two of these had some pretty technical, mathematical components that were kind of intimidating and over my head. But the basic gist of both of them was understandable, especially after the context that Oard provided.
Chapelle, O., Joachims, T., Radlinski, F., & Yue, Y. (2012). Large-scale validation and analysis of interleaved search evaluation. ACM Transactions on Information Systems (TOIS), 30(1), 6.
Liu, T.-Y. (2009). Learning to rank for information retrieval. Foundations and Trends in Information Retrieval, 3(3), 225–331.
Sanderson, M., & Croft, W. B. (2012). The history of information retrieval research. Proceedings of the IEEE, 100(Special Centennial Issue), 1444–1451.
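The core idea in Chapelle et al. (2012) turned out to be simpler than the math made it look: mix the results of two rankers into one list, show it to real users, and let their clicks vote for the ranker that contributed the clicked results. Here is a rough sketch of team-draft interleaving, one of the methods the paper validates (this is my own simplification, not code from the paper):

```python
import random

def team_draft_interleave(a, b, seed=0):
    """Merge two rankings team-draft style: the ranker that has placed
    fewer documents picks next, with a coin flip breaking ties. Clicks
    on each team's picks then decide which ranker won the comparison."""
    rng = random.Random(seed)
    merged, teams = [], []
    count = {"A": 0, "B": 0}
    pools = {"A": list(a), "B": list(b)}
    while pools["A"] or pools["B"]:
        if count["A"] < count["B"]:
            team = "A"
        elif count["B"] < count["A"]:
            team = "B"
        else:
            team = rng.choice(["A", "B"])
        pool = pools[team]
        while pool and pool[0] in merged:
            pool.pop(0)                 # skip docs already placed by the other team
        if not pool:
            count[team] = float("inf")  # this ranking is exhausted; the other finishes
            continue
        merged.append(pool.pop(0))
        teams.append(team)
        count[team] += 1
    return merged, teams
```

The comparison statistic is then just the number of clicked documents credited to each team, aggregated over many queries, which is what lets the method be validated at the large scale the paper describes.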
In this week’s class we took a closer look at design methods and prototyping, with readings from Druin (1999), Zimmerman, Forlizzi, & Evenson (2007) and a paper that fellow student Joohee picked out, Buchenau & Suri (2000). In addition to a discussion of the readings, Brenna McNally from the iSchool visited us to demonstrate the Cooperative Inquiry method discussed in the Druin paper.
Buchenau, M., & Suri, J. F. (2000). Experience prototyping. In Proceedings of the 3rd conference on designing interactive systems: Processes, practices, methods, and techniques (pp. 424–433). Association for Computing Machinery.
Druin, A. (1999). Cooperative inquiry: Developing new technologies for children with children. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 592–599). Association for Computing Machinery.
Zimmerman, J., Forlizzi, J., & Evenson, S. (2007). Research through design as a method for interaction design research in HCI. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 493–502). Association for Computing Machinery.