RE: robust hyperlinks could fix broken hyperlinks


From: Jim Whitehead
Date: Tue Mar 07 2000 - 12:07:51 PST


> In a recent academic paper, computer scientists Thomas A. Phelps
> and Robert Wilensky outlined a way to create links among Web pages that
> work even if documents are moved elsewhere. Although researchers have
> tried to tackle the issue before, Internet search experts said the paper
> describes a potentially elegant solution to a widespread and
> long-recognized puzzle.

While this approach is indeed interesting, there are a few things worth
noting about it.

1) The paper you referenced is a UCB Technical Report. While this paper was
produced by two academics, and hence is an "academic paper", it has not gone
through peer review. As a science reporter, you have a responsibility to
assess the sources you use, and to report when these sources may be
questionable (as is the case for any research that has not gone through peer
review).
2) As a result of not going through peer review, this article misses some
directly relevant related work. The paper, "Referential Integrity of Links
in Open Hypermedia Systems", by Hugh Davis, in the Proceedings of
Hypertext'98 (p. 207-216), provides a framework listing a design space of
possible solutions to dangling links. While Davis' paper does not list the
Phelps and Wilensky approach, it does contain a very similar approach used
for matching anchors to their correct locations within documents. Davis'
paper mentions using a within-document search to find an anchor point that
may have moved. The Phelps and Wilensky paper extends this technique to
finding whole documents (and they did do original work in doing so).

3) This same related work problem also affects the Phelps and Wilensky WWW9
paper, "Robust Intra-document Locations", where they talk about robust
within-document anchors. This work in many places duplicates Davis' earlier
work, without attribution (but does introduce a new mechanism that takes
advantage of the tree structure of XML documents). While this paper did go
through peer review for WWW9, it highlights the difference between good and
bad review processes. Had the reviewers been correctly chosen, they would
have known of this related work. Hugh Davis is very well known in the
Hypertext community, and did his dissertation on this subject -- reviewers
of this paper should have been aware of his work.

So, in the end, a mixed message. You should have mentioned that this work
was just a technical report and hadn't been through peer review. But going
through peer review isn't any guarantee either -- you also need to know
which conferences are good and which are bad.

- Jim


This archive was generated by hypermail 2b29 : Tue Mar 07 2000 - 12:09:13 PST