DigiCULT 35
10 years for most of the hard sciences', while others might present 'major semiological obstacles'. Here metadata would be vital for semantic operations for at least another 20 years, 'even though some semantic "drilling" on resource level may be feasible in the meantime on the basis of semantic frameworks and most important affordable language engineering resources (lexica, morphological normalization tools etc.)'. However, Gradmann warned that there might be fundamentally inappropriate semantic approaches. As probably 'the' major breakthrough in intelligent systems, he suggested 'a machine that would model "understanding" in a hermeneutical sense, at least to some degree. This machine would to some extent carry out the very operations Weizenbaum's Eliza was only just faking to perform. I expect a "Non-Fake-Eliza" to be operational around 2010.'

Jacques Bogaarts (Nationaal Archief, The Netherlands) envisaged systems that allow users 'to point to a cultural heritage object, the real thing or a digital presentation of it, and as a result get all the contiguous material (objects and information) that is available'. Developing such applications would require better cross-domain 'contiguity models' and algorithms. These would need to take into account that cultural heritage objects are mostly physical things, not digital, but also allow for defining and processing the contiguity of representations of physical things and born-digital objects (including products of imagination, e.g. chimera). Bogaarts and other experts agreed that for such applications new conceptual frameworks (e.g. the CIDOC Conceptual Reference Model) would need to be applied to get much beyond basic models such as online thesauri.
A warning against likely inappropriate uses of semantic frameworks:

'In my vision the decisive gaps are not technical ones: I am rather certain that advances in Semantic Web and language engineering technology will be more than impressive, even though major investments will be needed (and made!) in that respect. The decisive limitation probably will be one of cultural differentiation: the basic requirement for conceptual models to function adequately is a proper recognition of the "semiological divide" that separates hermeneutically based scholarship from empiricist STM science. These two "cultures" have very different assumptions regarding the relation between linguistic signs and "things", between "concepts" and "words". Ignoring this difference would cause a fundamentally inappropriate use of semantic frameworks and ontologies in a hermeneutically driven context!'

Stefan Gradmann (University of Hamburg/RRZ, Germany)
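Bogaarts' 'contiguity' retrieval can be pictured as a typed link graph over heritage objects and their representations. The following minimal sketch (all object names and relations are invented for illustration, not taken from the Nationaal Archief) shows how pointing at one object could yield all contiguous material by a breadth-first traversal:

```python
from collections import deque

# Hypothetical link graph: heritage objects (physical or digital)
# connected by typed relations. Names and relations are invented.
LINKS = {
    "night_watch_painting": [("depicted_in", "night_watch_scan"),
                             ("created_by", "rembrandt")],
    "night_watch_scan": [("described_by", "catalogue_entry_17")],
    "rembrandt": [("subject_of", "biography_1906")],
    "catalogue_entry_17": [],
    "biography_1906": [],
}

def contiguous_material(start):
    """Return every object reachable from `start`, paired with the
    relation that first connected it (a breadth-first sketch)."""
    seen, found = {start}, []
    queue = deque([start])
    while queue:
        obj = queue.popleft()
        for relation, target in LINKS.get(obj, []):
            if target not in seen:
                seen.add(target)
                found.append((relation, target))
                queue.append(target)
    return found
```

A real 'contiguity model' would of course also need to rank and filter this material across domains; the sketch only illustrates the graph-shaped view of contiguity that such models presuppose.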
Browser: Some interesting Semantic Web resources and projects

Currently, we may distinguish between grass-roots cultural heritage developments towards the Semantic Web making use of XML and RDF, and 'the fundamental route' of creating strong reference models and generic, foundational ontologies. Much is expected from the cross-domain reference model CIDOC-CRM, of which version 4.0 was released in March 2004, and which has so far found only limited use in RTD projects (e.g. in the SCULPTEUR project).

For the more immediate future [2005-2009] we may expect a wider use of XML and a slowly increasing uptake of RDF. Currently, tools for semantic annotation, ontology building and 'controlled vocabularies' (e.g. thesauri) on the basis of domain-specific ontologies are 'booming'. 'Merging' the current generation of domain-specific ontologies (for describing heritage objects) with 'middle-layer' and generic top-ontologies may be achievable only in a timeframe of ten years; working 'real-world' applications with reasoning mechanisms (software agents) may not appear until after that.
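The retrieval value of such 'controlled vocabularies' can be shown with a minimal sketch (the terms and hierarchy below are invented, not from any actual thesaurus): narrower terms are pulled in transitively, so a search for a broad concept also matches records indexed under more specific ones.

```python
# Invented thesaurus fragment: each term maps to its narrower terms,
# in the style of a broader/narrower concept hierarchy.
NARROWER = {
    "artwork": ["painting", "sculpture"],
    "painting": ["fresco", "miniature"],
    "sculpture": [],
    "fresco": [],
    "miniature": [],
}

def expand(term):
    """Return the term plus all transitively narrower terms,
    for use in broadening a search query."""
    terms = [term]
    for child in NARROWER.get(term, []):
        terms.extend(expand(child))
    return terms
```

Searching for 'artwork' would then also retrieve records indexed under 'fresco' or 'miniature'; ontology-based systems generalise this idea from one hierarchy to typed, cross-domain relations.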
Some interesting resources and projects include:

Nick Crofts, M. Doerr, T. Gill, S. Stead and M. Stiff (eds): 'Definition of the CIDOC Conceptual Reference Model', March 2004 (version 4.0).

Standard Upper Ontology Working Group (SUO WG), References to Ontologies.

SWAD-Europe: results of an EU-funded project in support of the W3C's Semantic Web Activity through research, demonstrators and outreach (for example, see their thesaurus activities and links).
"Eliza" is an Artificial
Intelligence program
that was created by
MIT scientist Joseph
Weizenbaum in the early
1960`s. Weizenbaum named
it after Eliza Doolittle, the
figure from the musical My
Fair Lady which is based
on Bernard Shaw's classic
Pygmalion. The program's
mission was to attempt to
replicate the conversation
between a psychoanalyst
and a patient.
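The trick Eliza 'was only just faking', in Gradmann's words, is simple keyword matching and reflection. A toy sketch (not Weizenbaum's actual script, and far smaller than it) conveys the principle:

```python
import re

# Toy Eliza-style responder: no understanding, just keyword patterns
# that reflect the user's own words back as a therapist-style question.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
]

def respond(utterance):
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(match.group(1))
    return "Please tell me more."
```

A 'Non-Fake-Eliza' in Gradmann's sense would have to replace this surface pattern matching with an actual model of understanding.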
Cf. the interviews with Janneke van Kersen and Nicola Guarino in DigiCULT Thematic Issue 3: Towards a Semantic Web for Heritage Resources, May 2003, http://www.digicult.