A flexible content repository to enable a peer‐to‐peer‐based wiki
Author(s) - Udo Bartlang, Jörg P. Müller
Publication year - 2010
Publication title - Concurrency and Computation: Practice and Experience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.309
H-Index - 67
eISSN - 1532-0634
pISSN - 1532-0626
DOI - 10.1002/cpe.1465
Subject(s) - computer science, scalability, world wide web, documentation, peer to peer, content management, architecture, search engine indexing, multimedia, database, operating system, art, visual arts
Wikis, as major applications of the Web 2.0, are used for a large number of purposes, such as encyclopedias, project documentation, and coordination, both in open communities and in enterprises. At the application level, users are targeted as both consumers and producers of dynamic content. Yet this peer-to-peer (P2P) principle is not applied at the technical level, which is still dominated by traditional client–server architectures. What is lacking is a generic platform that combines the scalability of the P2P approach with, for example, a wiki's requirements for consistent content management in a highly concurrent environment. This paper presents a flexible content repository system that is intended to close this gap by using a hybrid P2P overlay to support scalable, fault-tolerant, consistent, and efficient data operations for the dynamic content of wikis. On the one hand, the paper introduces the generic, overall architecture of the content repository. On the other hand, it describes the major building blocks that enable P2P data management at the system's persistent storage layer, and how these may be used to implement a P2P-based wiki application: (i) a P2P back-end administers a wiki's actual content resources, and (ii) on top of it, P2P service groups act as indexing groups to implement a wiki's search index. Copyright © 2009 John Wiley & Sons, Ltd.
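The abstract names two building blocks at the persistent storage layer: a P2P back-end that holds the wiki's content resources, and P2P service groups that act as indexing groups for the search index. The following is a minimal, single-process Java sketch of that division of labor, not the authors' actual system: the class and method names (ContentBackend, IndexGroup, put, getLatest, index, search) are hypothetical, and real overlay routing, replication, and consistency handling are deliberately elided.

// Hypothetical illustration of the abstract's two building blocks;
// names and structure are assumptions, not the paper's API.
import java.util.*;

public class P2PWikiSketch {

    // Simulated P2P back-end: maps a content key to its page revisions.
    static class ContentBackend {
        private final Map<String, List<String>> store = new HashMap<>();

        // Store a new revision of a wiki page under its key.
        void put(String pageKey, String revision) {
            store.computeIfAbsent(pageKey, k -> new ArrayList<>()).add(revision);
        }

        // Fetch the latest revision, or null if the page is unknown.
        String getLatest(String pageKey) {
            List<String> revs = store.get(pageKey);
            return (revs == null || revs.isEmpty()) ? null : revs.get(revs.size() - 1);
        }
    }

    // Simulated indexing group: maintains an inverted keyword index.
    static class IndexGroup {
        private final Map<String, Set<String>> inverted = new HashMap<>();

        // Index every whitespace-separated term of a revision.
        void index(String pageKey, String revision) {
            for (String term : revision.toLowerCase().split("\\s+")) {
                inverted.computeIfAbsent(term, t -> new TreeSet<>()).add(pageKey);
            }
        }

        // Return the keys of all pages containing the term.
        Set<String> search(String term) {
            return inverted.getOrDefault(term.toLowerCase(), Collections.emptySet());
        }
    }

    public static void main(String[] args) {
        ContentBackend backend = new ContentBackend();
        IndexGroup index = new IndexGroup();

        // A writing peer publishes a page revision; the indexing group
        // is notified so the search index stays in step with the content.
        String key = "wiki/P2P";
        String rev = "Peer-to-peer overlays enable scalable wikis";
        backend.put(key, rev);
        index.index(key, rev);

        // A reading peer resolves a keyword query to page keys, then
        // fetches the latest revision of each hit from the back-end.
        for (String hit : index.search("scalable")) {
            System.out.println(hit + " -> " + backend.getLatest(hit));
        }
    }
}

The sketch mirrors the separation the abstract describes: content storage and search indexing are distinct components, so in a distributed deployment the back-end and the indexing groups could be placed on different sets of peers.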
