Research Streams

Ontogenic Analysis 2001-Present

Ontogenic analysis is the process of following a subject through an indexing language over time.  There are many open questions about the power of this method, but a growing number of researchers find it useful.  This work bears on three key areas: preservation metadata, online access tools, and interoperability.  Research in ontogenic analysis has produced a few new constructs useful for evaluating indexing languages over time.  Some are listed below.

My papers on this topic are available here: VersioningTennis.zip

Subject Ontogeny

Subject ontogeny is the life of a subject in an indexing language (e.g., a classification scheme like the DDC).  Examining how a subject is treated over time tells us about the anatomy of an indexing language.  For example, the subject of gypsies has been handled differently in different editions of the DDC.
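One way to make an ontogeny traceable is to record the subject's state in each edition and compare consecutive states.  The sketch below is a minimal illustration in Python; the editions, class numbers, and captions are placeholders, not actual DDC history.

from dataclasses import dataclass

@dataclass
class SchemeState:
    edition: int   # edition of the scheme, e.g., a DDC edition
    number: str    # class number assigned to the subject
    caption: str   # caption wording in that edition

# Hypothetical ontogeny records for a single subject; all values invented.
ontogeny = [
    SchemeState(14, "397", "Outcast races"),
    SchemeState(17, "397", "Nomads"),
    SchemeState(22, "305.9", "People by status"),
]

# Report each point where the subject's treatment changed between editions.
for prev, curr in zip(ontogeny, ontogeny[1:]):
    if prev.number != curr.number:
        print(f"Ed. {prev.edition} to {curr.edition}: moved {prev.number} to {curr.number}")
    if prev.caption != curr.caption:
        print(f"Ed. {prev.edition} to {curr.edition}: recaptioned '{prev.caption}' as '{curr.caption}'")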

Scheme Change

Indexing languages (schemes) change over time.  They do so to stay up to date.  However, there are implications for discoverability when schemes change.  Understanding how schemes change is part of ontogenic analysis and helps designers think about their future users.

Collocative Integrity

If an indexing language changes over time, how does that affect the power of the scheme to collocate?  Is there a threshold below which a scheme becomes useless?
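One way to operationalize this question is to measure how many item pairs that collocated under the old version of a scheme still collocate under the new one.  The following is a minimal sketch, assuming each version has been reduced to a mapping from items to class numbers; all assignments below are invented.

from itertools import combinations

def collocative_integrity(old_assignments, new_assignments):
    """Fraction of item pairs collocated under the old scheme that still
    collocate under the new scheme; both map item -> class number."""
    before = [
        (a, b)
        for a, b in combinations(old_assignments, 2)
        if old_assignments[a] == old_assignments[b]
    ]
    if not before:
        return 1.0  # nothing collocated before, so nothing could be lost
    kept = sum(
        1 for a, b in before
        if new_assignments.get(a) == new_assignments.get(b)
    )
    return kept / len(before)

# Invented assignments: one item is renumbered between editions.
old = {"item1": "397", "item2": "397", "item3": "614"}
new = {"item1": "305.9", "item2": "397", "item3": "614"}
print(collocative_integrity(old, new))  # 0.0 -- the collocated pair is split

With a measure like this in hand, the threshold question becomes empirical: below what value of the ratio does retrieval measurably degrade?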

Semantic Gravity

Linked to collocative integrity, semantic gravity is the weight an outdated class number carries in cataloguing practice.  Libraries will often keep an old number because they believe it helps users.

Coordinate Enunciation

Once we have examined the life of a subject, we want to ask whether the concepts in the indexing language match those in the contemporaneously published literature.  We can now mine HathiTrust data to answer these questions.
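As a rough sketch of this kind of mining, suppose an edition's captions are reduced to a list of strings, and the contemporaneous literature to a bag of tokens (such as word counts derived from HathiTrust's extracted-features data).  The function below reports which captions are attested in the corpus; all data here is invented for illustration.

from collections import Counter

def enunciation_overlap(scheme_captions, corpus_tokens):
    """Share of scheme captions whose words all appear in a
    contemporaneous corpus, plus the attested captions themselves."""
    counts = Counter(t.lower() for t in corpus_tokens)
    attested = [
        c for c in scheme_captions
        if all(counts[w] > 0 for w in c.lower().split())
    ]
    return len(attested) / len(scheme_captions), attested

# Toy data standing in for a scheme edition and a year's literature.
captions = ["eugenics", "outcast races", "social groups"]
tokens = ["eugenics", "heredity", "social", "groups", "improvement"]
ratio, hits = enunciation_overlap(captions, tokens)
print(ratio, hits)  # 0.666..., ['eugenics', 'social groups']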

Structural, Word-Use, and Textual Change

There are three kinds of change that occur through revising an indexing language (scheme).  The first kind is structural change, which affects the semantics of the scheme by changing the relationships that obtain between values in the scheme, e.g., moving eugenics out of biology.  Word-use change affects meaning, but not structure per se.  An example of word-use change is changing “gipsies-outcast races” to “people with status defined by changes in residence.”  Textual change is a change in the semantic relationship between the scheme and the literature it organizes.  For example, in collections that use the DDC you can find both “sanitation of the race” books and “berries, nuts, and seeds” books in the same class.

Descriptive Informatics and Framework Analysis 2005-Present

Descriptive Informatics looks at metadata in the wild, and asks: How do we conceptualize different species of metadata and how diverse are their design requirements and implementations?  We analyze these conceptual constructs of metadata through Framework Analysis.  We ask: How can we conceptualize the differences and similarities that obtain between species of metadata schemes, and how does what we find affect our rubrics for design, use, management, and evaluation of such systems?

Ontomon

One way to describe the similarities and differences between indexing languages is to measure the terms used in them.  We can create visualizations of these measurements to inspect the comparisons.  Such radar graphs are evocative of creatures, and such measurements are not unlike cladistic taxonomy.  So I have coined the phrase ontomon (ontology monster) to characterize these visualizations.
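A minimal sketch of such a radar graph, assuming each indexing language has already been reduced to scores along a handful of term-based dimensions (the dimensions and values below are invented), using Python and matplotlib:

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical measurements of two indexing languages along five
# term-based dimensions; all values are invented for illustration.
dimensions = ["depth", "breadth", "term count", "polysemy", "synonymy"]
schemes = {
    "Scheme A": [0.8, 0.4, 0.9, 0.3, 0.5],
    "Scheme B": [0.5, 0.7, 0.4, 0.6, 0.8],
}

# One angle per dimension; repeat the first point to close each polygon.
angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for name, values in schemes.items():
    vals = values + values[:1]
    ax.plot(angles, vals, label=name)
    ax.fill(angles, vals, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions)
ax.legend()
plt.show()

Overlaying two schemes on the same axes makes the creature-like silhouettes, and the comparison, immediately visible.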

Teleology and Teleonomy in Metadata

We can also compare metadata by investigating the purpose of its creation, maintenance, and use.  So we can compare and contrast metadata for bibliographic control with metadata for the purpose of presuming authenticity.  In fact, we must carry out this kind of analysis in order to do work in ontogenic analysis; otherwise we cannot assume a constant purpose over time.

Framework Analysis

In order to understand the difference between ontologies, thesauri, and other knowledge organization systems, we have developed another analytical technique called framework analysis.  This looks not only at the structure of the system, but also at the work practices that surround it and the discourse that outlines the purpose, rhetoric, and context of the system.  The unit of analysis is thus the information organization framework (IOF), which is larger than the units used in other analyses of KOS.

Ethics and Intentionality in Knowledge Organization 2009-Present

Part of evaluating knowledge organization systems is knowing whether the actions you take are intentional, informed, and in accordance with particular ethical beliefs.  To that end I have looked at some aspects of ethics and intentionality in knowledge organization.  Ethics and intentionality also reinforce issues brought up by teleology, teleonomy, and subject ontogeny.  So I see these three as a family of research interests that help us understand knowledge organization.

Engaged Knowledge Organization (EKO)

If we take knowledge organization to be a craft, then we are assuming it requires skill, time, and care, and results in a work of art.  If we think that this work can do good or harm, we have to be sure that the time and care we take are well intentioned, or engaged.  This line of work asks what it takes for us to understand our intentions in knowledge organization and how, upon reflection, we can act in an engaged way with the work of organizing knowledge.

Metatheory of Indexing and Knowledge Organization

Metatheory, as outlined by Ritzer, can help us better understand theory, be a prelude to future theory development, and serve as an overarching perspective on theory.  Some also say it can serve to evaluate theory.  My work in metatheory is to better understand indexing theory, classification theory, and other regimes of theory so that we might improve the theoretically informed practices of indexing, classification, etc.