Organizer: Javier Cha, University of Hong Kong, Hong Kong
Chair: Ruth Mostern, University of California, Merced, USA
Discussants: Ruth Mostern, University of California, Merced, USA; Hilde De Weerdt, Leiden Institute of Area Studies, Netherlands

The ubiquitous encounter with digital resources has become a quotidian experience for the modern-day researcher in Asian Studies. The rise of digital humanities provides an opportunity to reflect upon and rethink prior modes of research in this field.
Elijah Meeks and Javier Cha address the role of the computer in the process of knowledge production. Current database management systems are designed to collate discrete data structured in stable ontologies. In historical studies, this assumption inherent in the software tools clashes directly with the real-life inconsistencies revealed in the primary sources. Meeks argues for the necessity of developing tools that accommodate the ambiguity and mutable structures of historical geographic data, such as that produced in imperial China. Cha proposes an alternative strategy that leverages the interdisciplinary character of Asian Studies to bridge the division between the natural and the human sciences. In this approach to digital humanities, the computer serves the function of a tool, rather than an active interpreter, in humanistic inquiry. Following this vein, John Kim sets off from Leibniz’s reading of The Classic of Changes en route to examining the history and meaning of the idea of the digital. His critical reading unpacks the antinomies of man/machine, human/digital, and East/West that are implicitly embedded in humanities computing. Finally, Duncan Paterson argues against such oppositions altogether. An intriguing aspect of digital humanities, according to Paterson, lies in the way contemporary forms of intermediation produce new models for interacting with textual sources.

Historical GIS, especially in the study of East Asia, is rapidly approaching maturity as far as the creation of spatial datasets is concerned. Yet critical engagement with the implicit arguments of data models and ontologies remains almost uniformly superficial, while technical engagement with such data remains mediated and tool-oriented. Using the Digital Gazetteer of the Song Dynasty as a case example, this paper examines the pitfalls of translating historical knowledge found in maps and gazetteers into modern, geolocated data modeled on standards developed to describe vastly dissimilar knowledge. That the primary tools and data models used in historical GIS were designed and built to represent static, universally agreed-upon norms reflecting a modern, post-industrial age needs to be taken into account when digitizing historical data. Reformulating such data to better represent historical and cultural realities requires not only a nuanced vision of modeling data, one that moves distinctly away from relational data models toward more organic structures with fewer implicit assumptions, but also engagement with spatial data as spatial data, both at the ontological level and at the analytical level.
While the interdisciplinary character of Asian Studies is conducive to accommodating humanities computing within the field’s diverse methodological turf, the nomothetic tendencies of computational tools conflict with the pursuit of local knowledge and the diversity of human experience that have been the hallmarks of area studies.
My argument is twofold. First, the successful implementation of computer-assisted humanities projects entails forsaking the reductionist model of classical mechanics and adapting the network models and nonexperimental methods that matured in evolutionary biology. Second, I discuss the practical implications of this conceptual leap in the context of historical studies. One notable virtue of digital humanities is the remarkable capacity of the computer to loop efficiently through, and draw connections between, large sets of information. This data-intensive approach to history, I propose, should retain the function of the computer as that of a tool, though an undeniably powerful one, by which the historian builds models and writes narratives about the past on the basis of what those models reveal. The end objective of history is not the mere reconstruction of models but, as Marc Bloch asserted in the previous century, the hermeneutic dialogue between the concerns in the mind of the present-day researcher and the voices emanating from the records of the past. Examples will be drawn primarily from the ways in which historians have utilized the civil examination rosters and geographic gazetteers of the Choson dynasty (1392-1910), with occasional references to China and France.
As the use of digital technologies and computational methods in the humanities becomes more prevalent, and as the line between the sciences and the humanities continues to be redrawn, many have inquired about the status of the “humanities” in “digital humanities.” This paper, however, seeks to subject the “digital” to a “humanistic” mode of inquiry. What exactly do we mean by “digital”? Does it have a history? And how does the idea of the digital bear upon the foundations of humanistic inquiry: representation, metaphor, and knowledge? When we begin to answer some of these questions, we realize not only the importance of the digital as an object of humanistic knowledge but also its vital place in the history of East-West interactions. Limiting the discussion to a particular subset of digital systems known as binary, the dominant code of modern computing, I will look at how this system has its modern origins in Leibniz’s reading of The Classic of Changes. From this arise questions concerning the nature of binary as a language that is universal yet artificial, non-representational yet informational. At the center of this “linguistic” problem is the reader: it now matters whether the reader is human or machine. This paper is ultimately about how binary allegorizes the antinomies between man and machine, the human and the digital, the subjective and the objective, “East” and “West.”
The impact of information technologies on hermeneutic processes is a topic of academic discourse across humanistic and scientific disciplines. This paper will critically examine the Eurocentric narrative of early “media philosophy” and the anti-phallogocentric perspective of the “post-human subject” in “digital philosophy” in order to question their relevance for contemporary area studies.
The differing approaches of thinkers like Mike Sandbothe, Katherine Hayles, Friedrich Kittler, and Bruno Latour share a tendency to address the effects of intermediation on both reader and text. The intricate questions raised by this multi-causal approach defy reduction to a dualism of technical problems of quantitative data analysis, on the one side, and changing representations of content, on the other. I will argue that dissolving the disciplinary boundaries along which these debates currently take place can promote both quantitative and humanistic research within area studies. Because of the overlapping strategies of contemporary area studies and debates about new media, both fields stand to benefit from a joint discourse.
The resulting hermeneutic frameworks, which are inspired by non-linear narratives, distributed modes of production, and the associated literacy skills, challenge current academic practices. Drawing examples from Chinese historical sources, I will argue that area studies holds much potential for evaluating methodologies developed outside our field.