preservation.' Such centres could assist by providing
guidance and support in `a continued diligence and
continuous migration anticipating the future'.
John Poirier (Northwest Territories Archives, Canada), a coordinator of technical services with expertise in photographs, sound recordings and moving images, suggested two measures that could leverage digital preservation. The first was authoritative guidelines that build on best practices; these practices needed to be reviewed, consolidated and published. With an eye on the varying sizes and capabilities of heritage institutions, Poirier stated: `Guidelines should be flexible in terms of enabling institutions with varying levels of resources and skills to obtain best results for resources invested. Considerations would include reasonable quality and above all long-term viability of product.'
Secondly, Poirier called for `a strategy for dealing with changing storage and file format technology'. He insisted that more attention needed to be paid to what smaller institutions require: `much research and planning seems to be geared toward the top echelon of people and institutions involved with preservation. There seems to be a gap in terms of developing approaches that, while not absolutely perfect, will enable the small fry in all their diversity to have a good shot at simple, good quality and affordable preservation methods.' Poirier explained further: `It is (in a way) fine for institutions like the Library of Congress to develop massive proprietary systems. Not so fine for the lesser lights, especially when off-the-shelf approaches could do the job as well if not better. I cite IPTC captioning for image files, adapted for archival purposes via a convention for extra fields, as an example. A segment of RTD ought to be devoted to finding, adapting, and promoting commercially developed solutions.' (A small sketch of such an off-the-shelf approach follows below.)
A participant from a governmental body or agency added further practical measures that could leverage the work in digital preservation: `high-quality and understandable documentation and publications of projects which are available free of charge; advice from practitioners at an affordable price'.
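Poirier does not describe a concrete toolchain for the IPTC approach he mentions. Purely as an illustration, the sketch below uses the ExifTool command-line utility (an assumption; it is not named in the text) to write a caption and to carry extra archival data as repeatable IPTC keywords under a made-up name=value convention.

```python
# Illustrative sketch only: writes an IPTC caption plus extra archival data
# into a JPEG via the ExifTool command-line utility (assumed installed).
# The "accession=" / "provenance=" keyword convention is hypothetical.
import subprocess

def write_iptc_metadata(image_path, caption, extra_fields):
    """Embed a caption and extra archival fields as IPTC metadata."""
    cmd = ["exiftool", "-overwrite_original",
           f"-IPTC:Caption-Abstract={caption}"]
    # Carry the 'extra' archival fields as repeatable IPTC keywords,
    # e.g. "accession=1998/042" -- a local convention, not an IPTC standard.
    for name, value in extra_fields.items():
        cmd.append(f"-IPTC:Keywords+={name}={value}")
    cmd.append(image_path)
    subprocess.run(cmd, check=True)

write_iptc_metadata(
    "scan_0042.jpg",  # hypothetical file name
    caption="Aerial view of the harbour, 1967 (scanned negative).",
    extra_fields={"accession": "1998/042", "provenance": "NWT Archives"},
)
```

Because the metadata travels inside the image file itself, it survives copying and migration between systems without requiring a bespoke database.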
Martin Doerr (ICS-FORTH, Greece) wrote that, generally, `the problems are more social than technical. National agencies should monitor availability of key technologies that may render media unreadable and maintain standards recommendations for formats they monitor. Companies could acquire quality predicates if they guarantee a certain lifetime of their products to a national agency.' With regard to RTD issues, Doerr saw a lack of `systematic investment in migration (mapping) technology'. In the area of such mapping technologies (as described in theme one) he thought that real progress could be achieved by 2010. Doerr noted that `migration is inevitable in long terms'; however, `the solution is there when continuous migration, carrier multiplication and monitoring has become institutionalized'.
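Doerr does not name any particular tools for such institutionalised monitoring. Purely as an illustration of the monitoring side of his argument, the following sketch checks the fixity of stored master files against a previously recorded manifest of SHA-256 checksums; the directory layout and manifest format are assumptions made for the example.

```python
# Minimal fixity-monitoring sketch: verify stored files against a manifest
# of SHA-256 checksums. Paths and manifest format are illustrative only.
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def check_fixity(storage_dir: str, manifest_file: str) -> list[str]:
    """Return a list of problems found when comparing files to the manifest."""
    manifest = json.loads(Path(manifest_file).read_text())  # {relpath: checksum}
    problems = []
    for relpath, recorded in manifest.items():
        target = Path(storage_dir) / relpath
        if not target.exists():
            problems.append(f"MISSING  {relpath}")
        elif sha256_of(target) != recorded:
            problems.append(f"ALTERED  {relpath}")
    return problems

for line in check_fixity("/archive/master_copies", "manifest.json"):
    print(line)
```

Run routinely, and combined with copies on multiple carriers, such a check is one small building block of the institutionalised regime Doerr describes.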
As an area where focused RTD could bring about good results in the medium term, Doerr pointed to `standardized risk management software for decision support in cultural institutions (catastrophes like fire, technology change, carrier aging, connected to valid data from a preservation agency)'. Such software could become available around 2008. Furthermore, he high-
Grid computing
The development of Grid technology is driven by a vision of making massive computing and storage resources available as a service on demand. Practical implementations include, most notably, the open source Globus toolkit and the Unicore project. Standards for Grid computing are being developed by the Global Grid Forum (GGF). In addition to many academic and research institutions, about thirty IT industry players are committed to Globus, involved in the GGF, or both; these include, for example, Microsoft, Hewlett-Packard, IBM, Sun, Dell and BEA. What motivates these companies is the goal of achieving resource virtualisation and a massive scaling of computing infrastructure and services through effective resource sharing: enormous processing power can be harvested `on demand' at relatively low cost. The enabling of application sharing and clustering is considered to be further down the road.
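Grid middleware such as Globus or Unicore is far more elaborate than can be shown here. Purely as a toy stand-in for the underlying idea of drawing on pooled processing power on demand, the sketch below fans a batch of independent jobs out over a local pool of worker processes; it uses none of the actual Grid toolkits, and the job itself is an invented placeholder.

```python
# Toy stand-in for Grid-style resource sharing: fan independent jobs out
# over a pool of worker processes. Real Grid middleware (Globus, Unicore)
# adds scheduling, security and data management far beyond this sketch.
from concurrent.futures import ProcessPoolExecutor

def heavy_job(job_id: int) -> tuple[int, int]:
    """Placeholder for a CPU-heavy task (e.g. image conversion or indexing)."""
    total = 0
    for i in range(1_000_000):
        total = (total + i * job_id) % 2_147_483_647
    return job_id, total

if __name__ == "__main__":
    jobs = range(16)
    with ProcessPoolExecutor() as pool:   # workers scale with available CPUs
        for job_id, result in pool.map(heavy_job, jobs):
            print(f"job {job_id:2d} -> {result}")
```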
Sources:
Brian Carpenter, "What is Grid Computing?" (26 February 2003), http://www.isoc.org/briefings/011/index.html
Ian Foster, C. Kesselman, J. Nick, S. Tuecke, "The Physiology of the Grid: An Open Grid Services Architecture for Distributed Systems Integration" (22 June 2002), http://www.globus.org/research/papers/ogsa.pdf
Globus toolkit, http://www.globus.org
Unicore project, http://www.unicore.de
Global Grid Forum, http://www.gridforum.org
Major European Grid computing projects, 2001-2004: European DataGrid, http://eu-datagrid.web.cern.ch/eu-datagrid/, and EUROGRID, http://www.eurogrid.org