Selected Academic Writing
This is me on Google Scholar.
My published work is highly multidisciplinary, spanning media studies, information visualization, critical algorithm studies, cybersecurity, data journalism, design studies, digital arts, gender studies, and human-computer interaction.
“Iceberg Sensemaking: A Process Model for Critical Data Analysis and Visualization” (IEEE Transactions on Visualization and Computer Graphics, 2024).
“Knowing Together: An Experiment in Collaborative Photogrammetry” (ACM SIGGRAPH / Leonardo, 2019).
“Walter Benjamin and the Question of Print in Media History” (Journal of Communication Inquiry, 2017).
“Guide to SecureDrop: An Emerging Platform for Secure and Anonymous Communication in Newsrooms” (Tow Center for Digital Journalism, 2016).
“Teaching Data and Computational Journalism” (Knight Foundation, 2016).
Full Academic Publication List
Berret and Munzner. “Iceberg Sensemaking: A Process Model for Critical Data Analysis and Visualization”
IEEE Transactions on Visualization and Computer Graphics, 2024.
Abstract: We
offer a new model of the sensemaking process for data science and
visual analytics. Whereas past sensemaking models have been built on
theoretical foundations in cognitivism and positivism, this model adopts
interpretivist foundations in order to reframe data sensemaking in
humanistic terms. We identify five key principles centered on the
concept of schemas: Tacit and Explicit Schemas, Schemas First and
Always, Data as a Schematic Artifact, Schematic Multiplicity, and
Sensemaking Over Time. Our model uses the analogy of an iceberg, where
data is the visible tip of the schema underneath it. The analysis
process iteratively refines both the data and its schema in tandem. We
compare the roles of schemas in past sensemaking models and draw
conceptual distinctions based on a historical review of schemas in
different philosophical traditions. We validate the descriptive,
predictive, and explanatory power of our model through four analysis
scenarios: uncovering data injustice, investigating official data,
teaching data wrangling, and producing data mashups.
Berret and Yu. “The Lifespan of Ephemera: Reflections on
Collaborative Art and the Embodiment of Data.”
Book chapter published by Linköping University Press, July 2024.
Abstract: This chapter offers a retrospective
account of a collaborative art project performed at Columbia University
and later exhibited in New York and Los Angeles. The project
showcased a novel imaging technique called collaborative photogrammetry,
which was employed for the first time during a workshop involving 35
participants. Both this workshop and the exhibition that followed were
collaborative efforts with a dedicated group of creative technologists
at Columbia. This project is used as a lens to discuss issues
surrounding collaboration between scholars, artists, and cultural
institutions.
Berret. “The Drawbridge Model of Cryptographic Communication.”
SocArXiv preprint posted March 19, 2024.
Abstract: This article
introduces a theory and model of cryptographic communication that treats conditions
of communication failure as the basis for different types of information
security. Building on the metaphor that communication is a bridge when it
succeeds and a chasm when it fails, cryptography serves as a kind of drawbridge
to limit the audience of a message through selective communication failure.
Different forms of cryptographic mediation are thus framed as arts of privation
that selectively limit and divide an audience. This theoretical framework is
illustrated using a model with a series of drawbridges representing distinct
sources of communication failure, each of which induces distinct limits or
privations depending on the manner and conditions of failure. The model’s descriptive,
explanatory, and diagnostic power is illustrated through various examples in
which distinct forms of communication failure are used to facilitate conditions
of privileged discourse and dialogue by narrowing the dissemination of
communication.
Kasica, Berret, and Munzner. “Dirty Data in the Newsroom: Comparing Data Preparation in Journalism and Data Science.”
ACM Conference on Human Factors in Computing Systems (CHI) paper presented April 25, 2023.
Abstract:
The work involved in gathering, wrangling, cleaning, and otherwise
preparing data for analysis is often the most time-consuming and
tedious aspect of data work. Although many studies describe data
preparation within the context of data science workflows, there
has been little research on data preparation in data journalism.
We address this gap with a hybrid form of thematic analysis that
combines deductive codes derived from existing accounts of data
science workflows and inductive codes arising from an interview
study with 36 professional data journalists. We extend a previous
model of data science work to incorporate detailed activities of
data preparation. We synthesize 60 dirty data issues from 16 taxonomies
on dirty data and our interview data, and we provide a
novel taxonomy to characterize these dirty data issues as discrepancies
between mental models. We also identify four challenges faced
by journalists: diachronic, regional, fragmented, and disparate data
sources.
Berret. “If Communication is a Bridge, Cryptography is a Drawbridge: A Stage-Based Model of Communication Processes.”
International Communication Association (ICA) conference paper presented April 24, 2022.
Abstract: This
paper offers a model of communication based on the conditions of its
success and failure. Building on Peters’ metaphor of communication as
both a bridge and a chasm, the model depicts cryptography as a
drawbridge to selectively choose the audience of a message. The model
forms a set of islands linked by a series of drawbridges, each
representing a source of communication’s success or failure, and each of
which must be passed in sequence. The first drawbridge is recognition,
in which the most basic source of failed communication is to be unaware
that a message is even present. The next is access, in which some form
of barrier or lack of authorization keeps one from accessing a message.
Next is legibility, the ability to recognize individual symbols,
followed by intelligibility, the recognition of coherent patterns,
words, and syntax in those symbols. The final two stages of this model
concern different stages of meaning. The public meaning of a message is
the literal, surface sense intended to be understood without insinuation
or ambiguity. The private meaning of a message is either selectively
encoded for a specific audience, or else fully interior to our own
minds. The descriptive and explanatory power of this model is
illustrated through various examples in which communication is secret,
secure, and otherwise selective of its audience.
Kasica, Berret, and Munzner.
“Table Scraps: An Actionable Framework for Multi-Table Data Wrangling
From An Artifact Study of Computational Journalism.”
IEEE Transactions on Visualization and Computer Graphics (Proc. InfoVis), presented at IEEE VIS, published September 22, 2020.
Abstract: For the many journalists who
use data and computation to report the news, data wrangling is an
integral part of their work. Despite an abundance of literature on data
wrangling in the context of enterprise data analysis, little is known
about the specific operations, processes, and pain points journalists
encounter while performing this tedious, time-consuming task. To better
understand the needs of this user group, we conduct a technical
observation study of 50 public repositories of data and analysis code
authored by 33 professional journalists at 26 news organizations. We
develop two detailed and cross-cutting taxonomies of data wrangling in
computational journalism, for actions and for processes. We observe the
extensive use of multiple tables, a notable gap in previous wrangling
analyses. We develop a concise, actionable framework for general
multi-table data wrangling that includes wrangling operations documented
in our taxonomy that are without clear parallels in other work. This
framework, the first to incorporate tables as first-class objects, will
support future interactive wrangling tools for both computational
journalism and general-purpose use. We assess the generative and
descriptive power of our framework through discussion of its
relationship to our set of taxonomies.
Yu and Berret. “Knowing Together: An Experiment in Collaborative Photogrammetry.”
ACM SIGGRAPH presentation and Leonardo article published September 1, 2019.
Abstract: Knowing
Together is a collection of sculptures designed to explore
collaborative techniques for capturing three-dimensional images.
Thirty-five participants collectively created these images by forming
circles and passing a camera around. These images were stitched together
to form 3D models whose distortions are preserved as artifacts
attesting to their creation process, suggesting novel approaches to
photogrammetry that do not treat photorealism as their ideal quality.
Berret. “Walter Benjamin and the Question of Print in Media History.” Journal of Communication Inquiry.
Journal of Communication Inquiry article published September 29, 2017.
Abstract: Although
Walter Benjamin’s “The Work of Art in the Age of Mechanical
Reproduction” is a seminal essay in the study of media history, the
work itself gives a surprisingly brief account of a core
subject: the printing press. Books and literature present only a
special case of mechanical reproduction, according to Benjamin, but
the implications of this point remain largely unexplored.
The purpose of this essay is to ask why Benjamin would have considered
print to be different or less historically consequential compared to
photography and cinema when the revolutionary potential he ascribes to
these more recent technologies is also prefigured in his other writings
on books and literature. To critically approach this question helps to create a
sharper picture of what matters to Benjamin about new media and also
points to figures like Georg Lukács who influenced Benjamin’s account of
technology and art. Ultimately, this line of questioning also raises
concerns about the place of the “Work of Art” essay in the general study of
media history, a field in which the signal error is to treat new media
as unprecedented developments.
Berret. “Guide to SecureDrop: An Emerging Platform for Secure and Anonymous
Communication in Newsrooms.”
Tow Center for Digital Journalism report published May 12, 2016.
Abstract: This
report offers a guide to the use and significance of SecureDrop, an
in-house system for news organizations to securely communicate with
anonymous sources and receive documents over the Internet. Through
interviews with the technologists who conceived and developed
SecureDrop, as well as the journalists presently using it, this study
offers a novel account of the concerns that drive the need for such a system,
as well as the practices that emerge when a news organization integrates
this tool into its news gathering routines.
Berret and Phillips. “Teaching Data and Computational Journalism.”
Knight Foundation report published March 14, 2016.
Abstract: Journalism
schools have developed solid foundations for teaching shoe-leather
reporting techniques, but the practice of data journalism has been
largely left out of the mainstream of journalism education, even as the
field’s relatively small core of devotees has honed it into a powerful
and dynamic area of practice. This report offers a snapshot of the state
of data journalism education in the United States and outlines models
for both integrating the use of data journalism into existing academic
programs and establishing new degrees that specialize in data-driven and
computational reporting practices. We interviewed more than 50
journalists, educators, and students, and we evaluated more than 100
journalism programs across the nation. This report features a chapter
detailing quantitative findings, such as the number of U.S. journalism
programs offering classes in data, computation, and related tech skills.
We also include a chapter of qualitative findings in which our
interviews and classroom observations offer some color and texture to
this picture of the present state of data journalism education and its
potential. This report is meant to describe the state of data journalism
education, to underline the urgency of incorporating these skills to
equip the next generation of reporters, and to offer guidelines for
moving journalism schools in this direction.
Brideau and Berret. “A Brief Introduction to Impact: ‘The Meme Font’.”
Journal of Visual Culture article published December 19, 2014.
Abstract: If
you have ever seen an image meme on the internet, you’ve
seen Impact, a typeface so commonly used that it
could be called ‘the meme font’. Its ubiquity is
largely overlooked, yet Impact contributes significantly to the common structure of memes,
and examining its history raises productive questions about creativity and the balance of
replication and variation in memes more generally. This article is a
brief introduction to Impact – its design and its history – as it
relates not only to the contemporary practice of digital typography, but also to the
development of standards for operating systems and the web.
Impact, like internet memes themselves, tells a story of both
standardization and innovation. This typeface had lain largely dormant
for decades after its design, but today it ensures its own
proliferation through a fixed role in the production of memes seen everywhere.
Berret. “Sensors and Sensibilia.”
Chapter in Tow Center Report on “Sensor Journalism” published May 2014.
No abstract.