
Mapping & Clouding: Employing Digital Methods I & II

Two workshops will be organized during the Web Science conference around the topic ‘Mapping and Clouding: Employing Digital Methods’. The first (M&C I) focuses on the Issue Crawler; the second (M&C II) focuses on the Lippmannian Device.
M&C I: Thursday June 21, half-day workshop (morning); M&C II: Friday June 22, half-day workshop (morning)

Mapping & Clouding: Employing Digital Methods I

Thursday June 21, Half-day workshop (morning)
The Digital Methods workshop focuses on mapping website networks with the Issue Crawler. The Issue Crawler, online since 2001, is Web network and visualization software that runs in the browser. It consists of crawlers, databases, analysis engines, and visualization modules. The software relies on co-link analysis, a scientometric sampling or network demarcation technique based on citation analysis and adapted for the Web. The user enters a set of URLs into the software; the URLs are crawled, the outlinks are captured, and the co-links are retained, in one, two, or three iterations of the procedure, as the user selects. (Snowball and inter-actor crawling methods are also built into the software.)

The results are analyzed for centrality measures and visualized as a directed graph showing site inter-linkings (nodes and lines with arrows). The graph is output as a scalable vector graphic (SVG), which may also be saved in a variety of other file formats, including PNG and PDF. The online SVG graphic is interactive: the user can click through to the URLs behind the nodes and toggle links and domains on and off. The purpose of the interactivity is to provide a single, graphical space in which users can explore specific inter-linkings between sites (who links to whom, and who does not?) and spend time reading the content of the pages in the network, as an alternative to search engine space (and ranked lists).

The original purpose of the software is ‘issue network’ analysis for social and political theory, and a literature has developed around the software. The Issue Crawler is also employed, methodologically, for dynamic URL sampling, i.e., building out a seed list of URLs for related sites. Among other applications, dynamic URL sampling techniques have been employed in the study of Internet censorship.
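The co-link step described above can be sketched in code. The following is a minimal illustration, not the Issue Crawler’s actual implementation: it assumes each seed site’s outlinks have already been crawled into a dictionary, and it retains only pages that receive links from at least two distinct seeds (the co-linked pages). One iteration of the procedure is shown; further iterations would re-run the function with the co-linked set as new seeds.

```python
# Hypothetical sketch of one iteration of co-link analysis.
# Assumption: outlinks per seed are already captured, e.g. by a crawler.

def colink_iteration(outlinks):
    """Given {seed_url: set_of_outlinked_urls}, return the co-linked set:
    URLs that receive links from two or more distinct seeds."""
    link_counts = {}
    for seed, targets in outlinks.items():
        for target in targets:
            if target == seed:
                continue  # ignore self-links
            link_counts[target] = link_counts.get(target, 0) + 1
    # Retain only pages linked by at least two seeds (co-links).
    return {url for url, count in link_counts.items() if count >= 2}

# Toy example: three seed sites and their captured outlinks.
outlinks = {
    "siteA.org": {"ngo.org", "gov.int", "blogA.net"},
    "siteB.org": {"ngo.org", "gov.int"},
    "siteC.org": {"gov.int", "blogC.net"},
}
print(sorted(colink_iteration(outlinks)))  # ['gov.int', 'ngo.org']
```

Here ngo.org and gov.int survive the demarcation because two or more seeds link to them, while the singly-linked blogs drop out; this is the sense in which co-link analysis narrows a crawl to a network.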

The workshop provides an introduction to the Issue Crawler and its allied tools, including the actor profiler and export features to Gephi and other packages. See http://www.mappingcontroversies.net/Home/PlatformIssueCrawler

Preparatory reading
R. Rogers (2010). “Mapping Public Web Space with the Issuecrawler,” in: Claire Brossard and Bernard Reber (eds.), Digital Cognitive Technologies: Epistemology and Knowledge Society. London: Wiley, 115-126.

R. Rogers and N. Marres (2000). “Landscaping Climate Change: A mapping technique for understanding science & technology debates on the World Wide Web,” Public Understanding of Science, 9(2): 141-163.

Required materials
Please bring your laptop!
Workshop Organizer:
Richard Rogers, PhD is University Professor and holds the Chair in New Media & Digital Culture at the University of Amsterdam. He is Director of Govcom.org, the group responsible for the Issue Crawler and other info-political tools, and of the Digital Methods Initiative, which reworks methods for Internet research. Among other works, Rogers is the author of Information Politics on the Web (MIT Press, 2004), awarded the 2005 best information science book of the year by the American Society for Information Science and Technology (ASIS&T). His latest book, Digital Methods, is forthcoming from MIT Press.

Mapping & Clouding: Employing Digital Methods II

Friday June 22, Half-day workshop (morning)
The Digital Methods workshop concerns itself with clouding the resonance of issue mentions on websites. The workshop concentrates in particular on using and interpreting the Lippmannian Device, the tool developed by Rogers and colleagues in the context of the Mapping Controversies project led by Bruno Latour (http://www.mappingcontroversies.net). It is named after Walter Lippmann (1889-1974), the American writer and columnist, and author of Public Opinion (1922) and The Phantom Public (1925). In particular, the software takes up Lippmann’s call for equipment to ‘test’, by coarse means, an actor’s partisanship.

At the workshop, Rogers will present (at least) four ways to use the Lippmannian Device and will facilitate its use by the workshop participants. Rogers will also introduce additional devices and tools (some 30) developed to date by the Digital Methods Initiative and the Govcom.org Foundation (see http://tools.digitalmethods.net). The workshop also facilitates projects by participants. Participants should bring their laptops and consider bringing along a research question that concerns using online data for social research. For project examples, please see http://www.digitalmethods.net.
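The kind of ‘resonance’ measure the device visualizes can be sketched as a source-by-term count. The following is a hypothetical illustration, not the Lippmannian Device itself: the actual tool queries a search engine per site, whereas this sketch simply counts issue-term mentions in page text assumed to be already fetched. The per-site counts are the raw material for a cloud-style display, where term size reflects mention frequency.

```python
# Hypothetical sketch of issue-mention resonance per source.
# Assumption: page text per site is already fetched; the real device
# instead issues site-restricted search-engine queries per term.
import re

def issue_resonance(pages, terms):
    """pages: {site: page_text}; terms: list of issue terms.
    Returns {site: {term: mention_count}} for cloud-style display."""
    resonance = {}
    for site, text in pages.items():
        lowered = text.lower()
        resonance[site] = {
            term: len(re.findall(r"\b%s\b" % re.escape(term.lower()), lowered))
            for term in terms
        }
    return resonance

# Toy example: two sources, two issue terms.
pages = {
    "ngo.org": "Climate change mitigation and climate adaptation policy.",
    "thinktank.org": "Energy security first; climate concerns second.",
}
print(issue_resonance(pages, ["climate", "energy"]))
```

A reading of the toy output: ‘climate’ resonates twice on ngo.org and once on thinktank.org, while ‘energy’ appears only on thinktank.org; scaled up across many sources, such counts make an actor’s partisanship coarsely visible, in Lippmann’s sense.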

Preparatory reading
R. Rogers (2010). “Internet Research: The Question of Method,” Journal of Information Technology and Politics, 7(2/3): 241-260,
http://www.govcom.org/publications/full_list/rogers_internet_research_question_of_method_2010.pdf.

R. Rogers (2009). The End of the Virtual: Digital Methods, Amsterdam: Amsterdam University Press,
http://govcom.org/publications/full_list/oratie_Rogers_2009_preprint.pdf

Required materials
Please bring your laptop!

Workshop organizer:
Richard Rogers, PhD (see bio under M&C I above).