This paper presents a model of communication based on the conditions of its success and failure. Building on Peters’ metaphor of communication as both a bridge and a chasm, the model depicts cryptography as a drawbridge used to select the audience of a message. The model comprises a set of islands linked by a series of drawbridges, each representing a source of communication’s success or failure, and each of which must be crossed in sequence. The first drawbridge is recognition, in which the most basic source of failed communication is to be unaware that a message is even present. The next is access, in which some form of barrier or lack of authorization keeps one from accessing a message. Next is legibility, the ability to recognize individual symbols, followed by intelligibility, the recognition of coherent patterns, words, and syntax in those symbols. The final two stages of this model concern different levels of meaning. The public meaning of a message is the literal, surface sense intended to be understood without insinuation or ambiguity. The private meaning of a message is either selectively encoded for a specific audience, or else fully interior to our own minds. The descriptive and explanatory power of this model is illustrated through various examples in which communication is secret, secure, and otherwise selective of its audience.
We offer a new model of the sensemaking process for data science and visual analytics. Whereas past sensemaking models have been built on theoretical foundations in cognitivism and positivism, this model adopts interpretivist foundations in order to reframe data sensemaking in humanistic terms. We identify five key principles centered on the concept of schemas: Tacit and Explicit Schemas, Schemas First and Always, Data as a Schematic Artifact, Schematic Multiplicity, and Sensemaking Over Time. Our model uses the analogy of an iceberg, where data is the visible tip of the schema underneath it. The analysis process iteratively refines both the data and its schema in tandem. We compare the roles of schemas in past sensemaking models and draw conceptual distinctions based on a historical review of schemas in different philosophical traditions. We validate the descriptive, predictive, and explanatory power of our model through four analysis scenarios: uncovering data injustice, investigating official data, teaching data wrangling, and producing data mashups.
For the many journalists who use data and computation to report the news, data wrangling is an integral part of their work. Despite an abundance of literature on data wrangling in the context of enterprise data analysis, little is known about the specific operations, processes, and pain points journalists encounter while performing this tedious, time-consuming task. To better understand the needs of this user group, we conduct a technical observation study of 50 public repositories of data and analysis code authored by 33 professional journalists at 26 news organizations. We develop two detailed and cross-cutting taxonomies of data wrangling in computational journalism, one for actions and one for processes. We observe extensive use of multiple tables, a practice that constitutes a notable gap in previous wrangling analyses. We develop a concise, actionable framework for general multi-table data wrangling that includes wrangling operations documented in our taxonomy that lack clear parallels in other work. This framework, the first to incorporate tables as first-class objects, will support future interactive wrangling tools for both computational journalism and general-purpose use. We assess the generative and descriptive power of our framework through discussion of its relationship to our set of taxonomies.
Knowing Together is a collection of sculptures designed to explore collaborative techniques for capturing three-dimensional images. Thirty-five participants collectively created these images by forming circles and passing a camera around. These images were stitched together to form 3D models whose distortions are preserved as artifacts attesting to their creation process, suggesting novel approaches to photogrammetry that do not treat photorealism as their ideal.
Although Walter Benjamin’s “The Work of Art in the Age of Mechanical Reproduction” is a seminal essay in the study of media history, the work itself gives a surprisingly brief account of one of the field’s core subjects: the printing press. Books and literature present only a special case of mechanical reproduction, according to Benjamin, but the implications of this point remain largely unexplored by scholars. The purpose of this essay is to ask why Benjamin would have considered print to be different from, or less historically consequential than, photography and cinema when the revolutionary potential he ascribes to these more recent technologies is also prefigured in his other writings on books and literature. Answering this question helps to create a sharper picture of what matters to Benjamin about new media and also points to figures like Georg Lukacs who influenced Benjamin’s account of technology and art. Ultimately, this line of questioning also raises concerns about the place of the “Work of Art” essay in the study of media history, a field in which the signal error is to treat new media as unprecedented developments.
This report offers a guide to the use and significance of SecureDrop, an in-house system that enables news organizations to communicate securely with anonymous sources and receive documents over the Internet. Through interviews with the technologists who conceived and developed SecureDrop, as well as the journalists presently using it, this report offers a sketch of the concerns that drive the need for such a system, as well as the practices that emerge when a news organization integrates this tool into its newsgathering routines.
Journalism schools have developed solid foundations for teaching shoe-leather reporting techniques, but the practice of data journalism has been largely left out of the mainstream of journalism education, even as the field’s relatively small core of devotees has honed it into a powerful and dynamic area of practice. This report offers a snapshot of the state of data journalism education in the United States and outlines models for both integrating the use of data journalism into existing academic programs and establishing new degrees that specialize in data-driven and computational reporting practices. We interviewed more than 50 journalists, educators, and students, and we evaluated more than 100 journalism programs across the nation. This report features a chapter detailing quantitative findings, such as the number of U.S. journalism programs offering classes in data, computation, and related tech skills. We also include a chapter of qualitative findings in which our interviews and classroom observations offer some color and texture to this picture of the present state of data journalism education and its potential. This report is meant to describe the state of data journalism education, to underline the urgency of incorporating these skills to equip the next generation of reporters, and to offer guidelines for moving journalism schools in this direction.
If you have ever seen an image macro, the chances are you’ve seen Impact, a typeface so commonly used in these memes that it could be called ‘the meme font’. Its ubiquity within image macros is largely overlooked, yet it contributes significantly to their structure, and raises productive questions about creativity and the balance of replication and variation in memes more generally. This article is a brief introduction to Impact – its design and its history – as it relates not only to the practice of typography, but also to the development of standards for operating systems and for the web. Impact, like internet memes themselves, tells a story of both standardization and innovation. This typeface lay largely dormant for decades after its design, but today it ensures its own proliferation through its fixed role in the production of image macros.