Ferrara, Alfio, and Massimo Parodi. "A Journal on the Web: What We Are Not, What We Do Not Want." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 87-106. ELPUB. Milano, Italy, 2009.

In this paper we discuss the experience of publishing a journal in both a traditional and an electronic edition. In particular, we take into account our journal Doctor Virtualis, devoted to the history of medieval thought, trying to understand what the limitations and problems of electronic publishing are and where the paper edition is really different from the electronic one. Starting from our experience, we then try to better understand what electronic publishing actually is. We critically discuss the usual analogy between traditional publishing and electronic publishing, by proposing a new analogy between the web and the medieval cultural environment. This new analogy helps in understanding some complex processes on the web and in proposing new approaches to transform paper texts into electronic products. To this end, we show how rhetoric plays a crucial role in adapting the text to the medium and how new paradigms for text editing could help in finding a (preliminary) definition of electronic publishing.

Kokabi, Mortaza. "A look at the present status of electronic publishing in Iran at the beginning of the year 2009." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 521-526. ELPUB. Milano, Italy, 2009.

The present and future status of electronic publishing in Iran do not seem very promising. The author, in trying to gather statistics on the subject, found almost none of use. This paper, which attempts to give a picture of electronic publishing in Iran, is structurally based on an article by Zahra Seifkashani (2003) entitled "How the Internet can Influence the Iranian Readers", which discusses the status of electronic publishing in Iran. The problems discussed by Seifkashani, as well as others, are examined here along with various suggestions on how to improve the Iranian electronic publishing scene.

Marcondes, Carlos Henrique, Marília Alvarenga Mendonça, Luciana Reis Malheiros, Leonardo Cruz da Costa, and Gabriela Veras De Moraes. "A publishing system to extract and represent the knowledge content of scientific articles on Health Science in machine-processable format." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 175-186. ELPUB. Milano, Italy, 2009.

Scientific articles published in electronic format are knowledge bases, especially in Medicine. An obstacle to semantic processing of this knowledge by computers is that, despite their digital format, articles are written as text for human reading and processing. A model is proposed for publishing scientific articles electronically both in textual format and in a machine-"understandable" ontology format. Software agents can process the content of an article published according to the model, enabling semantic retrieval, consistency checking and the identification of new discoveries. The model is described and initial steps toward the development of an authoring/publishing system that implements it are reported.

Yurdagül, Ünal, and Yasar Tonta. "An Analysis of the Consortial Use of Electronic Journals in Turkey: The Case of SpringerLink and Wiley InterScience Databases." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 535-542. ELPUB. Milano, Italy, 2009.

One of the most important functions of libraries is to develop and manage an effective collection that best meets the information needs of their users. With the introduction of electronic resources, collection development and management policies have become much more complex and interesting. Using the survey method, 2,770,905 journal articles downloaded by users at consortium member universities from the SpringerLink and Wiley InterScience databases between 2003 and 2007 were evaluated. Article use was analyzed by means of bibliometric laws and core journals were identified. One third of the articles used came from core journals constituting some 2.2% and 4.5% of all journals, respectively. The distribution of article use across journals does not conform to Bradford's Law. No correlation was observed between the frequency of use of core journals and (a) their impact factors, (b) the total number of citations they received, or (c) their half-lives. The findings of this study should be used in collection development and management: frequently used core journals should be retained, while rarely or never used journals should be excluded from the collection. The findings should also be used to negotiate better consortium license agreements.
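The core-journal analysis described in this abstract can be sketched in a few lines: rank journals by download counts and find the smallest top-ranked set accounting for a given share (here one third) of total use. The journal names and counts below are entirely invented for illustration; they are not the study's data.

```python
# Illustrative core-journal identification by cumulative use.
# All journal names and download counts are synthetic.

def core_journals(downloads, share=1/3):
    """Return the smallest set of top-ranked journals whose downloads
    account for at least `share` of total use."""
    ranked = sorted(downloads.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(downloads.values())
    core, cum = [], 0
    for title, count in ranked:
        core.append(title)
        cum += count
        if cum >= share * total:
            break
    return core

# Hypothetical download counts for a small journal set.
use = {"J. Alpha": 400, "J. Beta": 300, "J. Gamma": 250,
       "J. Delta": 200, "J. Epsilon": 150, "J. Zeta": 100}

core = core_journals(use)
print(core)                  # the journals covering one third of all downloads
print(len(core) / len(use))  # fraction of all journals that are "core"
```

With real consortium data, the resulting core fraction is what the authors compare against Bradford's Law.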

Maly, Kurt, Harris Wu, Mohammad Zubair, and Victor Antonov. "Automated Support for a Collaborative System to Organize a Collection using Facets." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 187-203. ELPUB. Milano, Italy, 2009.

We are developing a system that improves access to a large, growing image collection by supporting users to collaboratively build a faceted (multi-perspective) classification schema. For collections that grow in both volume and variety, a major challenge is to evolve the facet schema, and to reclassify existing objects into the modified facet schema. Centrally managed classification systems often find it difficult to adapt to evolving collections. The proposed system allows: (a) users to collaboratively build and maintain a faceted classification, (b) to systematically enrich the user-created facet schema, and (c) to automatically classify documents into an evolving, user-managed facet schema. In this paper, we focus on (c), where we describe the approach to automatically classify documents into an evolving facet schema. We propose a learning-based system that periodically learns from manually classified images, and then classify new images accordingly.
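The "periodically learn from manually classified items, then classify new ones" loop can be sketched as a nearest-centroid classifier over token counts. This is a deliberately minimal stand-in, not the paper's system (which works on images with far richer features); the facet values and documents are hypothetical.

```python
# Minimal sketch of learning facet assignments from labeled examples,
# then classifying new items by nearest centroid. Purely illustrative.
from collections import Counter
import math

def train(labeled):
    """labeled: list of (text, facet_value) pairs. Returns one
    token-count centroid per facet value."""
    centroids = {}
    for text, facet in labeled:
        centroids.setdefault(facet, Counter()).update(text.lower().split())
    return centroids

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify(text, centroids):
    """Assign the facet value whose centroid is most similar to the text."""
    doc = Counter(text.lower().split())
    return max(centroids, key=lambda f: cosine(doc, centroids[f]))

# Hypothetical manually classified examples for one facet.
examples = [("gothic cathedral nave", "architecture"),
            ("portrait oil canvas", "painting"),
            ("stone cathedral spire", "architecture")]
model = train(examples)
print(classify("ruined cathedral tower", model))  # → architecture
```

Retraining periodically on the latest manual classifications is what lets the classifier track an evolving, user-managed schema.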

Ballerio, Stefano. "Automatic Analysis of Electronic Discharge Letters as a Means to Evaluate the Continuity of Information and of Patient Care." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 607-612. ELPUB. Milano, Italy, 2009.

Joint Commission International standard 3.2 on Access to Care and Continuity of Care states that discharge letters should contain information about follow-up instructions of doctors to patients. We developed a text mining system to analyze a collection of 413 discharge letters of heart failure patients and checked their compliance with standard 3.2. We built a domain-specific ontology and a thesaurus and mined the collection with CASOS AutoMap. After validation, the system sensitivity was 0.484; specificity was 0.834; positive predictive value was 0.555; negative predictive value was 0.790. Improving these results requires more powerful natural language processing tools, but text mining seems a promising way to evaluate the continuity of information and of care.
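The four evaluation measures reported above follow directly from a confusion matrix. The counts below are hypothetical (chosen only so the ratios land near the reported values); they are not the paper's actual tally over the 413 letters.

```python
# Standard diagnostic measures from confusion-matrix counts.
# tp/fp/tn/fn values here are invented for illustration.

def diagnostics(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

m = diagnostics(tp=30, fp=24, tn=120, fn=32)
print({k: round(v, 3) for k, v in m.items()})
# → {'sensitivity': 0.484, 'specificity': 0.833, 'ppv': 0.556, 'npv': 0.789}
```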

Strotmann, Andreas, and Dangzhi Zhao. "Bibliometric factor maps for knowledge discovery in digital libraries." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 501-512. ELPUB. Milano, Italy, 2009.

In this paper we describe the architecture of a visual bibliometric browsing plug-in for the growing number of digital libraries that provide cited references in their document metadata, using a simple but effective visualization method for citation network analyses that we recently introduced. Citation-based network analysis methods such as co-citation analysis have long been recognized as effective tools for gaining insight into the intellectual structure of a field through its literature. Visualizations of these networks can help the user get an intuitive, aggregated overview of the field and the interrelationships between documents or authors, which in turn can aid query expansion, search refinement, and exploratory browsing. Our design calls for a visualization of the results of a multivariate factor analysis of a bibliometric similarity matrix calculated from a user's search results and/or from documents closely related to them. This provides the user of a digital library with an interactive map of the literature that the user is interested in, where each visual element aggregates different aspects of the search result (authors and/or subfields). By helping the user see the forest for the trees (i.e., a structured visual landscape of the intellectual domain covered by the user's search and its bibliometric vicinity, rather than a long list of search results), these maps and the relevant links they contain promise to provide a valuable aggregated browsing tool for digital libraries.
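The raw material for such maps is a similarity matrix like the author co-citation counts sketched below: two authors are co-cited whenever they appear together in a citing paper's reference list, and a factor analysis of the resulting matrix is what the plug-in would visualise. The reference lists here are invented.

```python
# Toy author co-citation counting. Real analyses would feed the
# resulting matrix into a multivariate factor analysis; reference
# lists and author names here are fabricated for illustration.
from itertools import combinations
from collections import Counter

def cocitation_matrix(reference_lists):
    """For each unordered author pair, count how many citing papers
    cite both authors."""
    counts = Counter()
    for refs in reference_lists:
        for a, b in combinations(sorted(set(refs)), 2):
            counts[(a, b)] += 1
    return counts

# Each inner list is the set of authors cited by one paper.
papers = [["Garfield", "Small", "White"],
          ["Small", "White"],
          ["Garfield", "Price"]]
cc = cocitation_matrix(papers)
print(cc[("Small", "White")])     # → 2
print(cc[("Garfield", "Price")])  # → 1
```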

Nisheva-Pavlova, Maria, and Pavel Pavlov. "Building a Digital Library with Learning Materials." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 471-484. ELPUB. Milano, Italy, 2009.

The paper discusses some aspects of a project aimed at developing a methodology and proper software tools for building academic digital libraries. A particular functional model of an academic digital library is proposed and analyzed. The emphasis of the presentation falls on solutions to the large set of problems concerning the development of adequate mechanisms for semantics-oriented search in multilingual digital libraries. An ontology-based approach is suggested in order to standardize the semantic annotation of library resources and to facilitate the implementation of the search engine's functionality. A proper subject ontology covering the area of Computer Science has been developed for this purpose. The paper discusses the requirements of the basic types of users of an academic digital library and suggests some relevant solutions.

Barbera, Michele, Claudio Cortese, Romeo Zitarosa, and Emilia Groppo. "Building a Semantic Digital Library for the Municipality of Milan." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 133-154. ELPUB. Milano, Italy, 2009.

In the second half of 2007 the Municipality of Milan decided to co-finance a one-year project proposed by CILEA (Consorzio Interuniversitario Lombardo per l'Elaborazione Automatica) called "Biblioteca Aperta di Milano" (Milan Open Library), or BAMI, aimed at creating an integrated system to make a set of digitized documents from the cultural institutions of Milan available on the Web. To meet the goals of the project, we adopted Semantic Web standards and technologies to build the knowledge base and used a faceted browser for the user interface. Faceted browsing is an exploration technique for structured data sets based on facet theory, which allows users to find information without a priori knowledge of its schema. To store and display the digital documents we used CodeX[ml] and AriannaWeb. Regarding the selection of content, we decided to focus mainly on documents that belong to a specific branch of the city's cultural heritage, those from the 19th century, giving special attention to musical documents. In this paper we present the methodology and the workflow that led us to build an ontology with the aid of a scientific committee of librarians and 19th-century music experts. We also illustrate the use of a dedicated web-based editor with which we populated the ontology. One of the most important objectives of the project was to overcome the limitations of the search engines traditionally used in the library domain (e.g. OPACs) by providing users with new tools for browsing and analysing cultural knowledge. Thus, the paper also focuses on the BAMI user interface, which was built by extending and enhancing Longwell, a faceted RDF browser developed by the Massachusetts Institute of Technology. We then describe the complementarity and flow of information between the three applications (Longwell, CodeX[ml] and AriannaWeb). The paper ends with a discussion of some possible evolutions of the project and of the main difficulties we encountered during development.

Morrison, Heather, Gilbert Bede, Richard Baer, Melissa Belvadi, Michelle Chou, Alison Curtis, Pamela Dent, John Dobson, Faith Jones, David Karpinnen et al. "Connecting Readers with Open Access Resources: The CUFTS Free! Open Access Collections Group." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 461-470. ELPUB. Milano, Italy, 2009.

Libraries play an important role in disseminating knowledge. This paper presents an overview of the work of one library group focused on collecting quality free and open access journals, and illustrates how libraries can be more effective in disseminating knowledge and connecting patrons with needed material by working collaboratively on open access and free collections. Also discussed are the few simple steps publishers can take to facilitate dissemination of journal content through libraries: following standards such as OpenURL and/or DOI to provide article-level linking, and providing title lists for download with the key metadata libraries need to include content in their collections, such as title, ISSN, full-text start date, and journal URL. The CUFTS Free! Open Access Collections Group works collaboratively to connect library patrons with quality open access and free resources, ranging from the international Directory of Open Access Journals to locally developed lists such as Open Access Journals, Open Access Magazines, and Canadian Historic Newspapers. CUFTS is the knowledgebase (journal title lists) of reSearcher, a locally developed open source suite of resources. Through CUFTS, the open access and free journal collections are made available through a link resolving service (GODOT), A-to-Z journal lists, library catalogues and union databases. A file of MARC records for all of the titles is freely available for download, and downloadable spreadsheets are freely available for local collections as well.

Knoll, Adolf, Tomáš Psohlavec, Stanislav Psohlavec, and Zdeněk Uhlíř. "Creation of an International Digital Library of Manuscripts: seamless access to data from heterogeneous resources (ENRICH Project)." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 335-348. ELPUB. Milano, Italy, 2009.

The Czech Manuscriptorium Digital Library has been in operation since 2002. It aggregates data from various cultural institutions in the Czech Republic and from abroad. The EU eContentPlus ENRICH project (Dec. 2007 - Nov. 2009) provides the opportunity to enhance the integration of data from a number of European institutions. Manuscriptorium offers seamless access to these data under a uniform interface, regardless of the geographical location of their physical storage. It supports both harvesting via OAI and the production of Manuscriptorium-compliant data from the beginning, for those who wish to create their digital library as a Manuscriptorium component without building it completely on their own site. The Manuscriptorium team has accumulated a great deal of experience working with partners and users; its current and future development therefore tries to respond to their requirements through actions defined as personalization, both for users/researchers and for contributors.

Baptista, Ana Alice, and Eva M. Méndez. "DC-Social Tagging Workshop." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 17-20. ELPUB. Milano, Italy, 2009.

The so-called Web 2.0 brought a new breadth to the Internet, and a social perspective that seems set to stay. Services such as LinkedIn, Hi5, and Facebook have found a place in our society. People connect to each other through common paths. Meta-APIs such as Google's November 2007 release, OpenSocial, enable social applications to operate across multiple sites and services, providing a way to relate much of this data. In social bookmarking tools (e.g., Connotea, Bibsonomy) and media sharing services (such as YouTube, Flickr, Picasa, Slideshare), people are asked to tag and otherwise annotate and share their resources, inside communities or at a global scale, creating a huge amount of user-generated metadata (tags) with clear value for information discovery. This workshop intends to gather all those interested in such applications and developments, and in their relationship with metadata and practices. The themes of the workshop will be:
- Emerging trends in social tagging.
- Tagging communities and Web-based collaboration.
- Web standards for resource description in collaborative landscapes.
- Vocabulary building from folksonomies (tag ontologies, tag thesauri, etc.).
- Metadata and annotation management.
- Formats for describing communities (FOAF, SIOC, etc.).
- Analysis of online communities (SNA) through folksonomies and tagging systems.
- Other ways of describing information for Web 2.0 (microformats, etc.).
A description of the workshop format: the workshop will include invited talks and presentations, giving a consistent background for discussion. This will be followed by short presentations or position papers submitted by interested researchers, bloggers, etc., which will be evaluated by the Program Committee (see below).
The Call for Presentations will be sent to discussion lists from different perspectives and backgrounds, including the DC Social Tagging Community, the microformats community and other related communities. A discussion on improving communication between (and thus research within) different user-generated metadata communities will be included. This last session of the workshop will be conducted 'BridgeCamp' style. (BridgeCamp grew from the Barcamp movement: workshop participants are offered an essentially unfilled timetable in which to propose questions or offer to give a 5-10 minute presentation answering them, with a further 5-10 minutes for discussion. Several members of the Program Committee have considerable experience with this style of workshop and have found it a valuable addition to a traditional set of presentations, providing a structure more readily oriented to outcomes relevant to the participants than the usual 'breakout sessions'.)

Zhong, Cantao. "Development of Institutional Repositories in Chinese Universities and the Open Access Movement in China." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 527-534. ELPUB. Milano, Italy, 2009.

The number of research articles about Open Access (OA) and Institutional Repositories (IRs) has grown quickly in recent years, yet Chinese universities are moving more slowly than their Western counterparts. There are only a few experimental institutional repositories at present, and no explicit campus-wide open access policies have been proclaimed. This paper describes the status of the OA movement in China, focusing mainly on institutional repositories in Chinese universities. Factors that hinder the development of OA are discussed, and we give some suggestions for constructing IRs in Chinese universities.

Muhammad, Ahsan. "Digital Divide and Digitization Initiatives in Pakistan: A Bird's Eye View." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 515-520. ELPUB. Milano, Italy, 2009.

The process of digitization in the libraries of Pakistan is at a very early stage. Pakistan lags behind developed countries for many reasons, such as lack of funding, computer illiteracy and a shortage of digitization expertise. However, some institutions, such as the Higher Education Commission, the Punjab University Library, the National Library of Pakistan and some private organizations, are carrying out digitization. In the social sciences, digitization is being done to preserve the cultural heritage of manuscripts and other old literature related to the history and culture of Pakistan. There are nearly 150,000 manuscripts in Pakistan in the Arabic, Persian, Urdu, Pashto, Sindhi and Sanskrit languages. Some are held in Pakistani libraries while others are in personal collections. Bibliographies of most of these manuscripts are available. Hence, there is a great need and opportunity to digitize this literature. This paper presents a status report on digitization initiatives in Pakistan.

Tanner, Simon. "Digital Futures: Strategies for the Information Age." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 49. ELPUB. Milano, Italy, 2009.

The challenge of the next decade is to effectively expend our limited resources to continually deliver better information resources through e-publishing mechanisms. This keynote will focus upon the benefits of collaboration and the means by which collaboration is particularly benefiting digital preservation strategies for the e-publishing community. We need to build sustainably, providing publishing opportunities for today without discounting future opportunities to trade, to share and to preserve our collective digital heritage.

Melero, Remedios, María Francisca Abad, Ernest Abadal, and Josep Manuel Rodríguez Gairín. "DULCINEA: Copyright Policies and Type of Access to Spanish Scientific Journals." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 581-586. ELPUB. Milano, Italy, 2009.

DULCINEA is a portal created as part of the objectives of a Spanish national project entitled "Open access to scientific outputs in Spain: current status, open access advocacy and implementation of open access policies". The name Dulcinea was chosen because of the project's relationship to SHERPA/ROMEO, which analyses the copyright policies and self-archiving terms of most international journals but whose databases underrepresent Spanish journals. The aim of Dulcinea is to identify the policies of publishers and Spanish journals towards open access archiving, and to analyze how these policies can affect the re-use of papers and their deposit in subject or institutional repositories. Currently Dulcinea's database contains more than 250 records of Spanish journals, including bibliographic data, access policies, self-archiving policies according to their copyright licences, and a classification of the journals following the SHERPA/ROMEO colour taxonomy.

Dubini, Paola, and Elena Giglia. "Economic sustainability during transition: the case of scholarly publishing." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 239-262. ELPUB. Milano, Italy, 2009.

In recent years, Open Access has received increased attention from scholars and practitioners as an alternative paradigm to traditional journals for the publication and diffusion of scholarly research. The steady increase in the number of successful Open Access journals shows that the model is a viable alternative in terms of both reputation and visibility; recent studies have also demonstrated its cost-effectiveness. However, the analysis of the sustainability of different models for scholarly publishing needs to take into consideration the existence of network externalities and information asymmetries, which generate two-sided markets; the introduction of innovative business models needs to overcome the problem of reaching critical mass in both the readers' and the authors' markets. In this exploratory paper we seek to understand to what extent offering configuration contributes to double market development; we compared twelve peer-reviewed scientific journals selected from different academic disciplines. Within each group we selected a pure Open Access (OA) journal, a journal that converted from Toll Access (TA) to Open Access, a hybrid journal, and a pure traditional TA journal. We mapped the offering characteristics and classified them in terms of accessibility for the reader, visibility for the author and benefits for researchers; we also added information on the pricing scheme of each journal. Results show a pre-eminence of OA titles in each of the three markets, as they took advantage of the possibilities offered by digitization technologies in a faster and more cost-effective way, even though TA journals have been quick in keeping up with the innovative services offered by OA journals; on the other hand, many TA journals still enjoy a significant first-mover advantage and reputation rent which they can leverage to strengthen their offering. Given the asymmetry of the scholarly communication market, competition on the author side is therefore likely to be very strong. The presence in the market of a variety of business models has benefited the research community: services have increased, the refereeing process is becoming more transparent, and high-quality contributions have higher chances of being accessed by wider market segments. We did not find a significant correlation between business model and offering configuration, between price and offering configuration, or between impact factor and offering configuration, although wider access has accelerated the ability of OA journals to gain visibility. As the two business models are likely to be increasingly in direct competition due to scarce financial and reputational resources, we expect that publishers (both OA and TA) will look for specific offering configurations for the different research communities they are targeting. In this transition phase, universities are going to play a key role in orienting the development of the offerings of different publishers.

Moed, Henk F.. "Electronic publishing and bibliometrics." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 51. ELPUB. Milano, Italy, 2009.

This lecture deals with the relationships between electronic publishing and bibliometrics, the quantitative study of scientific-scholarly texts. It gives an overview of the effects of recent trends in electronic publishing upon the availability of bibliographic and bibliometric databases, the indexing of publications, the construction of new bibliometric indicators, and the research topics in bibliometrics and quantitative studies of science. Indeed, electronic publishing constitutes an important topic in bibliometric research. The lecture focuses on the effect of electronic publishing and related factors, including Open Access, upon the citation impact of scientific-scholarly publications, and on the potential of "usage" data, based on full-text downloads from electronic publication archives, as indicators of research quality.

Van Bentum, Maarten, Dennis Vierkant, and Arjan Hogenaar. "Enhanced Scientific Communication by Aggregated Publications Environments (ESCAPE)." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 601-606. ELPUB. Milano, Italy, 2009.

The ESCAPE project aims to extend the existing infrastructure of repositories of scientific publications so that it becomes possible to identify, describe, preserve and present aggregations of related objects (documents, videos, datasets, etc.), not necessarily produced by an individual author or group of authors. To this end, a repository for OAI-ORE resource maps will be developed, as well as an editor for creating and changing resource maps. The repository will be based on Fedora 3.1, reusing its built-in RDF support. The resource map repository forms the basis for the so-called Aggregated Publications Environments (APEs). These APEs provide a resource map editor and a tool for browsing and searching aggregated resources. APEs will be developed for the three research groups involved in this project. For the description of content relations, besides the use of 'Proxy' in OAI-ORE, we introduce the 'Relation Annotation' object, which describes the relationship between two resources.
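The aggregation idea above can be sketched as plain RDF triples: an ORE aggregation pointing at its member resources, plus an extra object describing how two of them relate. The URIs, the annotation vocabulary, and the relation name below are all invented for illustration; only the `ore:aggregates` predicate comes from the OAI-ORE vocabulary, and the 'Relation Annotation' modelling is a guess at the idea, not ESCAPE's actual design.

```python
# Sketch of an OAI-ORE-style aggregation as (subject, predicate, object)
# triples. All URIs and the annotation vocabulary are hypothetical.
ORE = "http://www.openarchives.org/ore/terms/"

aggregation = "http://example.org/aggregations/study-1"
article     = "http://example.org/objects/article.pdf"
dataset     = "http://example.org/objects/data.csv"
annotation  = "http://example.org/annotations/1"

triples = [
    (aggregation, ORE + "aggregates", article),
    (aggregation, ORE + "aggregates", dataset),
    # Hypothetical 'Relation Annotation': an object that describes the
    # relationship between two aggregated resources.
    (annotation, "http://example.org/terms/from", article),
    (annotation, "http://example.org/terms/to", dataset),
    (annotation, "http://example.org/terms/relation", "isEvidencedBy"),
]

# Query: which resources does the aggregation contain?
aggregated = [o for s, p, o in triples if p == ORE + "aggregates"]
print(aggregated)
```

In a real deployment these triples would live in the Fedora repository's RDF store and be serialized as an ORE resource map rather than Python tuples.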

Polydoratou, Panayiota. "Experimenting with the Trial of a Research Data Audit: Some Preliminary Findings about Data Types, Access to Data and Factors for Long Term Preservation." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 291-307. ELPUB. Milano, Italy, 2009.

Developing systems and services for the effective and efficient management of research data, as well as addressing issues around their long-term curation, is an area of increasing activity in UK Higher Education. This paper discusses some preliminary results from a questionnaire survey conducted as part of the trial implementation of the Data Audit Framework Methodology at University College London (UCL). Fifty-seven (57) academic and research staff from five designated departments and an interdisciplinary research centre provided information about the nature of their research and the types of primary research data they produce. The survey explored factors that could impact on access to, use of and preservation of such data. The preliminary results indicate that researchers recognise the potential usefulness of such data for other researchers as well as their long-term value. Retaining primary research data after the end of the funding period and re-using them to initiate further research are practices already acknowledged. However, ownership, copyright and restrictions on access to research data can be hazy areas for academic and research staff and require further investigation, advice and support. The value of primary research data appears to be closely linked to the context within which the data were generated.

Houghton, John. "Exploring the costs and benefits of alternative publishing models." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 207-238. ELPUB. Milano, Italy, 2009.

This paper reports findings from a project undertaken for the Joint Information Systems Committee (JISC) in the UK which explored the Economic Implications of Alternative Scholarly Publishing Models. The aim of the project was to examine the costs and potential benefits of the major emerging models for scholarly publishing, including subscription publishing, open access publishing and self-archiving. To ensure that cost-benefit comparisons can be made, the analysis focuses on self-archiving models that include the certification and quality-control functions necessary for formal scholarly publishing, namely (i) 'Green OA' self-archiving in parallel with subscription publishing and (ii) the deconstructed or overlay journals model, in which self-archiving provides the foundation for overlay journal services. Adopting a formal approach to modelling the process and identifying activity costs, the paper examines scholarly communication life-cycle costs per article. It concludes that different scholarly publishing models can make a material difference to the returns on R&D expenditure as well as to the costs faced by various stakeholders. It seems likely that more open access would have substantial net benefits in the longer term and, while net benefits may be lower during a transitional period, they are likely to be positive for both open access publishing and self-archiving alternatives.

Baratè, Adriano, Goffredo Haus, and Luca Andrea Ludovico. "IEEE 1599: a New Standard for Music Education." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 29-45. ELPUB. Milano, Italy, 2009.

IEEE 1599 is a new multilayer music code whose international standardisation was recently achieved. Its development follows the guidelines of IEEE P1599, "Recommended Practice Dealing With Applications and Representations of Symbolic Music Information Using the XML Language". This project proposes to represent music symbolically in a comprehensive way, opening up new ways to make both music and music-related information available to musicologists and performers on one hand, and to non-practitioners on the other. Its ultimate goal is to provide a highly integrated representation of music, where score, audio, video, and graphical contents can be enjoyed together. In this paper, two different aspects of the matter are discussed: the key features of the standard that make it suitable for both music education and training, and some examples implemented to achieve such goals.

Fernicola, Pablo F. "Incorporating Semantics and Metadata as Part of the Article Authoring Process." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 367-378. ELPUB. Milano, Italy, 2009.

The ongoing shift from print to digital in the delivery of publications and in the consumption of content presents an opportunity to streamline the publishing workflow and to optimize the authoring process with digital content as the primary output, including the capture of semantics and metadata during authoring and their preservation through to the archival copy of the document. In addition to the shift in how content is delivered and consumed, a significant development in the last few years has been the release of new versions of word processors with native file formats based on XML. The use of XML in the authoring file format, combined with extensibility in its content model, will enable a greater level of content semantics and metadata to be expressed directly by authors. The level of interoperability enabled by XML-based word processing file formats will make it possible to preserve the semantics and metadata as documents go through the submission and review process, move through the publishing workflow, and are ultimately archived, likely also in an XML-based format. This article describes the design considerations and possible benefits of the Article Authoring Add-in for Word 2007 to the scholarly publishing community, in particular for workflows focused on the production of documents for digital delivery and consumption, as well as for the XML-based archival of publications. The second Beta release of the add-in is available as a free download, and it is currently being evaluated by the scholarly publishing community, with the involvement of publishers, archives, information repositories, and early adopters. In addition to facilitating the creation of structured documents, and enabling semantics and metadata to be more easily captured during authoring, the add-in provides the ability to open and save files from Word 2007 into the XML format defined by the National Center for Biotechnology Information of the National Library of Medicine.
The add-in extends the file format used by Word 2007, as well as its user interface, to tailor the authoring experience for the different audiences involved in the publishing workflow. As the add-in is adopted across multiple publications, authors will benefit from a consistent baseline experience, simplifying the authoring process and enabling a shift towards emphasising the expression of semantics over presentation by authors.

Cumming, Sioux. "Increasing the visibility of local research: the Journals Online Project at INASP." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 557-564. ELPUB. Milano, Italy, 2009.

The development of the Journals Online project at INASP is explained. The aim of the project is to increase the visibility of the research produced in developing and emerging countries. This has been achieved by creating websites on which the content of local journals is hosted. The visibility of the research was measured by recording the number of journals hosted, articles, full-text articles and visitors; a questionnaire survey of changes in levels of indexing was conducted, and qualitative comments from editors were assembled. It was found that the number of journals and articles on the websites was increasing and the number of article views was high, indicating that the research was being used by researchers from all over the world.

Zapilko, Benjamin, and Maximilian Stempfhuber. "Integrated Retrieval of Research Data and Publications in Digital Libraries." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 613-620. ELPUB. Milano, Italy, 2009.

Digital Libraries currently face the challenge of integrating many different types of research information (e.g. publications, primary data, experts' profiles, institutional profiles, project information etc.), for which to date no general model for knowledge organization and retrieval exists. This causes the problem of structural and semantic heterogeneity due to the wide range of metadata standards, indexing vocabularies and indexing approaches used for different types of information. The research presented focuses on integrating reference data for publications and survey data in the social sciences, but the problems it addresses also exist in other domains. We present a model for the integrated retrieval of factual and textual data which combines the traditional content indexing methods for publications with the newer, but rarely used, ontology-based approaches which seem better suited for representing complex information like that contained in survey data. The benefits of our model are (1) easy re-use of available knowledge organisation systems and (2) reduced effort for domain modelling with ontologies.

Stempfhuber, Maximilian, and Wei Shen. "Integrating Online Publications and Scholarly Discourse in the Context of Digital Libraries." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 349-366. ELPUB. Milano, Italy, 2009.

With the momentum the Open Access movement has gained recently, more and more scientific journals are produced in a digital workflow and published in digital format. Nevertheless, the format most often used for publishing articles on the web is still Adobe PDF, which limits the extent to which readers of an article can interact with online content within their browser environment. This not only separates the formal communication – the article itself – from the informal communication about a publication – the discussion about the article – but also fails to link the different threads of communication which might be going on in parallel and at different places in the scientific community as a whole. In this article we present an AJAX based technology for adding in-context discussion and annotation features to articles published on the web. The approach we took is that of XML based publishing using the ISO standard Open Document Format (ODF) for the production workflow of scholarly journals. The online discussion service we developed is linked into the process of rendering documents into formats suitable for distribution using XSLT technology. Mark-up in the document determines the locations at which readers can later comment on an article directly within the HTML representation generated and displayed in their browsers. Using XSLT for generating HTML representations of articles allows us to provide different designs of the same article in cases where it is published at different web sites in parallel. The discussion service integrates the annotations made to the article at all web sites it is published at, therefore linking online discourse between different communities on the web.
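The in-context annotation mechanism described above (mark-up in the document determining where readers can comment in the generated HTML) can be illustrated with a minimal sketch. This is not the authors' XSLT/ODF pipeline; it uses Python's ElementTree on a toy article, and every element name and id below is hypothetical:

```python
import xml.etree.ElementTree as ET

# A toy XML article: the <p id="..."> mark-up stands in for whatever
# mark-up the real workflow uses to flag commentable locations.
# All element names and ids here are invented for illustration.
article = """<article>
  <title>Sample paper</title>
  <p id="p1">First paragraph.</p>
  <p id="p2">Second paragraph.</p>
</article>"""

def render_html(xml_text):
    """Render the article to HTML, emitting one annotation anchor per
    marked paragraph so an in-context discussion service can target it."""
    root = ET.fromstring(xml_text)
    parts = ["<h1>%s</h1>" % root.findtext("title")]
    for p in root.findall("p"):
        pid = p.get("id")
        parts.append('<p id="%s">%s <a class="annotate" href="#%s">[comment]</a></p>'
                     % (pid, p.text, pid))
    return "\n".join(parts)

html = render_html(article)
print(html)
```

In the real workflow the XSLT stylesheet would emit such anchors wherever the source mark-up flags a commentable location, and the discussion service could then aggregate comments by anchor id across all the sites where the article is published.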

Vanderfeesten, Maurice, and Bas Cordewener. "Knowledge Exchange Workshop on Persistent Identifiers." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 21-24. ELPUB. Milano, Italy, 2009.

Imagine the distant future, where a researcher wants to gain access to an ancient digital publication about research conducted in 2009 on gene technology that made crops resistant to pesticides. A Persistent Identifier found in a more recent publication takes the researcher to the old document. Luckily, the ancestors found a way to implement a strategy and a policy that made access to their documents possible, especially now that the old http-based internet is gone. This workshop is an interactive event where structured discussions take place on how to create a sustainable organisational model and a robust policy in order to build a trusted and reliable information infrastructure, one that is able to locate and redirect someone or something to a knowledge asset that is part of our human heritage. The Knowledge Exchange Persistent Identifier Workgroup is working on a URN:NBN based pilot to demonstrate global resolution with long-term strategies. There are many other Persistent Identifier technologies; in this pilot, however, the policy and strategy part has the major focus. The answers and discussions in the workshop will contribute to this focal point.
The agenda for the workshop is divided into two parts: an introduction (to provide context) and an interactive session (to provide answers in dialogue). The goal of the workshop is to share ideas about user requirements and to provide input to the KE-PID project, which aims to set up a global harmonisation resolution service and, most importantly, a sustainable organisation. We would like to stress that this is not a technical workshop: organisational issues will be tackled.
Agenda:
1. Introduction
* opening and explanation of today's mission
* explanation of Persistent Identifiers
* examples / demo of Persistent Identifiers
2. Dialogue / sharing experiences
* session in groups of 4 persons: find out the user requirements and expectations of PIDs
* reporting back to the plenary session
* plenary prioritisation (finding out the risks and issues to be solved)
* back in groups: find out who could do what, on a national and local scale (roles and responsibilities)
* plenary session: presentation of outcomes

Engelen, Jan. "Marketing Issues related to Commercial and Specialised Audiobooks, including Digital Daily Newspapers." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 621-624. ELPUB. Milano, Italy, 2009.

Audiobooks or talking books are becoming very popular. They used to be recorded only in specialised production centres for use by people with visual impairments, but in recent years many commercial publishers have found an interested public which appreciates listening to these books for leisure. In this contribution, a follow-up to my ELPUB 2008 paper "A new electronic publishing trend: Audiobooks for leisure and studying", I focus on the delicate balance between commercially published books and the role of specialised production centres. Issues of cataloguing metadata for audiobooks are also discussed.

Coto, Rolando, Helena Francke, and Saray Córdoba. "Metadata Usage Tendencies in Latin American Electronic Journals." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 311-334. ELPUB. Milano, Italy, 2009.

The present study investigates the extent to which metadata tags are used in Latin American electronic journals, and whether these journals in fact provide basic information (abstracts, keywords, etc.) that could be tagged as metadata. The authors also studied multilingualism in the marked-up information and in the basic information, particularly the use of English (which can help bring the scientific production of Latin America to a wider audience). In total, 45% of the journals had metadata; the metatags keywords and description were the most commonly used. The inclusion of structured metadata from the Dublin Core Metadata Element Set in the journals was found to be very low, only 13%, and primarily existed in journals from Argentina, Costa Rica, and Brazil. The articles examined did not always include abstracts and keywords (84% and 77% respectively), but in the articles that did have them, English was frequently used (85% in abstracts and 91% in keywords). The title element was found to be used deficiently: only 42% of full-text OA articles had their actual title in the title tag, which can potentially affect visibility in search engine results. In sum, the road to marked-up metadata in all journals is still long, and there are great inconsistencies in how metadata are employed and in their content. The authors conclude that there are signs that support, together with efforts to increase awareness of how metadata can easily be included in a journal’s web site, may result in improved metadata and greater visibility.
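The kind of audit described above can be sketched with Python's standard html.parser; the toy page and tag values below are invented for illustration, not data from the study:

```python
from html.parser import HTMLParser

class MetadataAudit(HTMLParser):
    """Collects <meta> tags and the <title> text from a journal page."""
    def __init__(self):
        super().__init__()
        self.meta = {}
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and "name" in a:
            self.meta[a["name"].lower()] = a.get("content", "")
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# A hypothetical journal page head, with a generic site name in <title>.
page = """<html><head>
<title>Revista X</title>
<meta name="DC.title" content="Metadata usage in electronic journals">
<meta name="keywords" content="metadata, journals">
<meta name="description" content="A study of metadata usage.">
</head><body></body></html>"""

audit = MetadataAudit()
audit.feed(page)

has_dublin_core = any(k.startswith("dc.") for k in audit.meta)
# The deficiency the study measured: the <title> element holds a generic
# site name rather than the article's actual title.
title_matches_article = audit.meta.get("dc.title", "") in audit.title
print(has_dublin_core, audit.title, title_matches_article)
```

Running this on the toy page reports that Dublin Core metadata is present but that the title element does not carry the article title, which is the kind of deficiency the study quantified.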

Sachini, Evi, Victoria Tsoukala, Nikos C. Houssos, Ioanna-Ourania Stathopoulou, Christina-Eleni Paschou, and Aggeliki Paraskevopoulou. "Open Access in the Humanities: a case study of developing three open-access electronic journals in Greece." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 543-556. ELPUB. Milano, Italy, 2009.

The international movement for open access to scientific content, along with advances in information and communication technologies and the Internet, is bringing about revolutionary developments in scholarly publishing and communication: the availability of e-infrastructures supporting the management and exchange of research output in digital format leads to the transformation of existing processes. It allows new ways of collaboration among researchers and facilitates the widespread dissemination of research results. Pioneering applications related to these trends first appeared in scientific fields that inherently have a closer relationship with technology, like the natural sciences, engineering and medicine. However, significant relevant activities in the Humanities are also beginning to emerge worldwide. The present contribution concerns a case study of open access publishing in the Humanities, in particular a project that created freely accessible electronic versions of three pre-existing print-only journals of this subject area, published in Greece but with international participation and perspectives. The paper provides the Greek context in scholarly communication with an emphasis on the Humanities; it elaborates on the goals of the project and the challenges that were encountered and addressed during its implementation. One of the main reported successes of the project was the increased awareness among Greek researchers in the Humanities of the capabilities and potential of modern scholarly communication systems, and the creation of a demand originating from the corresponding research community itself for the continuation and expansion of similar activities in the future.

Cavalli, Nicola. "Overlay Publications: a functional overview of the concept." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 55-68. ELPUB. Milano, Italy, 2009.

This paper tackles the issues of overlay publications (journals, but also edited monographs and books in general). Its two parts are aimed at providing a theoretical understanding of what an overlay publication is and at examining a concrete example of a print-on-demand overlay book, published after the free digital version was released. The first part takes into account the definitions found in the literature and goes back to the functions that every system of scholarly publication must satisfy, as Roosendaal and Geurts (1997) proposed, adopting a value chain perspective of the scholarly communication system. Some examples of overlay journals are then examined in order to clarify which added value an overlay publication can contribute. Building on the first part, the second part of the paper analyzes in depth a first experiment with what can be defined as an overlay book. The case study concerns a short monograph in Italian, in the field of online and social marketing, published first as a free ebook and then in print through a print-on-demand model. The paper ends by summarizing the findings of the case study and providing some insights about overlay publications, trying to clarify what they are and why they are useful.

Wallace, Julia M., and Michael A. Mabe. "PEER: Publishing and the Ecology of European Research." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 439-457. ELPUB. Milano, Italy, 2009.

PEER (Publishing and the Ecology of European Research) is an EC supported collaboration between publishers, repositories and the research community aimed at improving understanding of the effects of the large scale deposit of stage two (accepted) manuscripts in open access repositories (Green Open Access). Through the creation of an observatory with European content from approximately 300 peer-reviewed journals from participating publishers, PEER aims to monitor the effects of systematic archiving over time. Research commissioned from qualified independent teams, addressing author and reader behaviour, article usage at repository and publisher sites, and the economics of publisher-assisted deposit and author self-archiving, will result in a number of outcomes, including evidence-based guidance for the evolution of open access policy and a model of the effects of archiving on the traditional publishing system, and will foster trust and mutual understanding between publishers and the research community for the overall benefit of European research.

Binfield, Peter. "PLoS One: background, future development, and article-level metrics." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 69-86. ELPUB. Milano, Italy, 2009.

PLoS ONE, a peer-reviewed Open Access academic journal published by the Public Library of Science, was founded in 2006 with the intent of reevaluating many of the aspects of the scholarly journal. As a result, PLoS ONE has taken elements of the traditional publishing model for scholarly journals and separated them into those functions that are most effectively carried out before publication (for example, peer review to evaluate whether the article deserves to join the scientific literature) and those that can most effectively be carried out after publication (for example, assessing how impactful the article was once it joined the literature). With this basic premise in place, and using the online tools that are now available, the journal has grown to the extent that in 2009 it will become one of the largest journals in the world (by publication volume). This article overviews the development of the journal to date: how it differs from most other journals and how it engages with its core audiences. In March 2009, the journal (along with other PLoS titles) began a program to place 'article-level metrics' on each publication, and this article outlines how this has been achieved, as well as plans for further development. In conclusion, this article looks forward to the future developments of this transformational journal.

DiFiore, Kenneth Andrew. "Portico Tutorial: A Collaborative Approach to Preserving Scholarly Digital Content." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 27-28. ELPUB. Milano, Italy, 2009.

Teaching and research have become increasingly dependent upon the convenience and enhanced accessibility of electronic scholarly resources. Along with the use of these resources comes the challenge of protecting them for future generations of scholars, researchers, and students. In the past, when print was the predominant medium, the preservation responsibility was linked to ownership and was traditionally a function of the library. In the digital age, however, the library’s responsibility for preservation is uncertain as the link between ownership and access is broken. Furthermore, the scale and complexity of the technology infrastructure, specialized expertise and quality control processes necessary to preserve electronic scholarly resources far exceed the capacity of any individual library or institutional budget. In 2005, with support from the Library of Congress and the Andrew W. Mellon Foundation, JSTOR launched Portico to serve as a “dark archive” where publishers and libraries could collaborate and contribute together to support the long-term preservation of scholarly digital content. Portico has established a robust digital preservation platform that currently comprises over nine million journal articles from hundreds of publishers across the spectrum of commercial houses, university presses, and professional societies. In 2008, the archive expanded its capabilities to include e-books and digitized newspaper collections. The chief beneficiaries of the Portico archive, publishers and academic institutions, provide the primary sources of funding. Sharing the costs broadly across the scholarly community ensures that no single institution must bear the full force of the costs. All libraries supporting Portico will have campus-wide access to archived content if it becomes lost, orphaned or abandoned because of specific trigger events.
In addition, libraries may rely on the archive for post-cancellation access if the publisher has chosen to extend this permission to Portico. Portico's archival approach for e-journals and e-books is focused on the publishers’ source files, the electronic files containing graphics, text, or other material that comprise an electronic journal article, issue, or volume. Portico has chosen migration as its primary long-term archival approach, as part of a managed preservation strategy. This tutorial session will educate attendees about the factors driving libraries to an e-only environment and the growing concerns about the preservation of scholarly e-content. The session will also provide an in-depth description of Portico’s business model and technological approach for providing a trusted archival home for e-scholarship, and explain how its archive can assist scholarly publishing stakeholders in their transition from print to electronic. Specifically, attendees will learn:
- the factors driving libraries and publishers toward the adoption of e-content;
- the importance of digital preservation and the urgent need for action;
- archival strategy trends within the community;
- the significance of separating the access role from the preservation role;
- the economies of scale and reduction of risk through broad cost sharing across the publisher and library community;
- Portico’s economic model and preservation capabilities for e-content;
- the trigger events and post-cancellation scenarios in which libraries access content from Portico, including a demonstration of the archive;
- a description of Portico’s holdings comparison analysis and how it can be used to assist librarians with print collection management;
- a comparison of how Portico and CLOCKSS addressed the recent trigger events.

Romanello, Matteo, Monica Berti, Federico Boschetti, Alison Babeu, and Gregory Crane. "Rethinking Critical Editions of Fragmentary Texts by Ontologies." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 155-174. ELPUB. Milano, Italy, 2009.

This paper discusses the main issues encountered in the design of a domain ontology to represent ancient literary texts that survive only in fragments, i.e. through quotations embedded in other texts. The design approach presented in the paper combines a knowledge domain analysis conducted through semantic spaces with the integration of well-established ontologies and the application of ontology design patterns. After briefly describing the specific meaning of “fragment” in a literary context, the paper gives insights into the main conceptual issues of the ontology design process. Lastly, it outlines the overall architecture of protocols, services and data repositories required to implement a digital edition of fragments based on the proposed ontology.

Mornati, Susanna, and Turid Hedlund. "Rethinking Electronic Publishing: ELPUB 2009 (Preface)." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing. ELPUB. Milano, Italy, 2009.

It is a pleasure for us to present to you, readers, speakers and attendees, these proceedings, consisting of over 40 contributions accepted for presentation at the 13th ELPUB conference. This year the conference was generously hosted by CILEA and the University of Milan in Italy and chaired by Susanna Mornati, CILEA, Italy, and Turid Hedlund, Hanken School of Economics, Helsinki, Finland. It is well known that Internet publishing is continuously changing and taking new forms and models. As players on the market for electronic publishing, we have to be forerunners in shaping the coming models. In line with this, the theme of the conference this year was “Rethinking electronic publishing”, with a subtitle urging us to be innovative in discussing new communication paradigms and related technologies. The focus of ELPUB 2009 is on key issues in e-communications, exploring dissemination channels, business models, technologies, methods and concepts. The three-day event consists of a first day of technical workshops, tutorials and demonstrations; the following two days feature contributed papers and posters examining a broad range of technical, conceptual, policy, and financial aspects of scholarly communication while showcasing significant experiences and lessons learnt. A symposium on openness in the academic environment run by Leslie Chan and Gale Moore, and special keynotes by Henk Moed and Simon Tanner, enrich the programme. The 13th ELPUB conference carries on the tradition of previous conferences held in the United Kingdom (1997 and 2001), Hungary (1998), Sweden (1999), Russia (2000), the Czech Republic (2002), Portugal (2003), Brazil (2004), Belgium (2005), Bulgaria (2006), Austria (2007) and Canada (2008). A conference on electronic publishing naturally offers the collection of papers of earlier conferences, fully archived in a sustainable digital library. The library has also been expanded with a citation index (i.e. a collection of references in the ELPUB papers).
More than 3,200 citations have been collected in over 500 records, starting with the very first ELPUB conference in 1997. Bob Martens has made a great effort in extracting citations from earlier ELPUB papers to analyse the impact of ELPUB papers within the community. Within this collection there appear to be only a relatively small number of citations of other ELPUB papers; the reason for this is under investigation. However, there might be a reason to emphasize the existing archive of earlier papers to the audience: central research questions of today might already have been tackled in earlier conference papers. We are proud to present a variety of contributions from five different continents, thus representing a wide geographical distribution of issues and themes related to electronic publishing. Emerging countries, as well as developed ones, are witnessing the diffusion of innovations in scholarly communication, and ELPUB is an event that makes these trends meet. We hope to contribute to reducing the information divide among countries and continents in the world. Among the topics, economic models hold a significant place in a market where new players are changing the traditional landscape. There is ever-growing interest towards semantics, tagging, web 3.0 and all the techniques which will change the way machines present contents and help with their selection and re-aggregation. Open access models are consolidating their status; digital libraries are merging with journals; new metrics change evaluation parameters; preservation becomes more and more an issue within sustainability. Innovation is the common umbrella under which all papers contribute their effort. This year we introduce a new category of short papers. It is meant as a chance to present a brief update on an interesting situation, to report a success story, or to showcase a significant experience.
We hope that delegates and readers will enjoy this chance to meet a variety of examples and practices to draw inspiration for their initiatives. In order to guarantee the high quality of papers, all submissions (over 80) to the ELPUB conference were peer-reviewed by members of the international Programme Committee and additional peer reviewers. Their contribution and feedback to the authors was valuable and appreciated, and we would like to express our gratitude for their effort and help in the review process and in suggestions for the programme of the conference. As in previous editions, we still offer printed proceedings. In the last few years the irony of printing contributions about electronic publishing has been highlighted, but we are addressing, among others, a conservative academic world, and it seems important to produce a tangible record of the ELPUB intellectual output for research evaluation. We hope you enjoy reading the proceedings. It is also our pleasure to invite delegates and readers to ELPUB 2010, which will take place in Helsinki, Finland. The 14th ELPUB conference will be organised by the Hanken School of Economics; details will be forthcoming at the ELPUB web site. Our thanks go to Nilde de Paoli and Anna Marini for their invaluable support, to Julia Weekes for copyediting, to Angela Corgnale for layout editing, and to Vania Ugé, Roberto Piazzola and many other colleagues at CILEA who made this event possible. A special mention goes to Luigi Traiano at Nuova Cultura in Rome, who made this publication possible, and our deep gratitude to Paola Galimberti at the University of Milan for her scientific advice. Finally, we would like to acknowledge and thank the various sponsors for their generous contributions. We look forward to an interesting and productive conference.

Das, Sudeshna, Mark Goetz, Lisa Girard, and Tim Clark. "Scientific publications on Web 3.0." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 107-129. ELPUB. Milano, Italy, 2009.

The advent of new technologies and paradigms is constantly changing the landscape of scientific publications. The use of online journals is rapidly rising and most researchers prefer online materials to print. The Internet has also given rise to open-access, online-only publications. There are several advantages to such journals; most importantly, the articles published can become a starting point for online community-based discourse on the subject. Researchers, given the right web environment, can discuss the articles published online, and such collaboration on the World Wide Web is the hallmark of Web 2.0. Another emergent trend is Web 3.0, in which the web becomes the medium for data, information and knowledge exchange through the use of shared semantics. We have developed the Science Collaboration Framework (SCF), a lightweight software framework that scientific communities can use to create open-access, online scientific publications. The software uses Web 3.0 technologies (social web, semantic web, text-mining) and thus allows interoperability with other Web 3.0 sites. The software allows communities to publish complex scientific articles, annotate them with controlled vocabularies or ontologies, register research interests of members and conduct discussion forums. The software can integrate with other knowledge repositories and the site knowledge is available as linked data. The software is modular, so different communities can install and enable different features as well as contribute modules back to the main framework, thus creating a software community as well. The first site based on our software, StemBook, an online open access peer-reviewed collection of invited review chapters covering a range of topics related to stem cell biology, went "live" in September 2008.
Several other sites are under development, including a new web community for Parkinson's disease researchers, PD Online, and a re-engineered version of the popular Alzheimer Disease research community Alzforum ( The sites developed on the SCF platform are interoperable with each other and with other sites on the Semantic Web. In this new paradigm, there is a significant reduction in artificial barriers between research disciplines, and a much more dynamic and agile approach to information exchange.

Oppenheim, Charles, and Fytton Rowland. "Scoping Study on Issues Relating to Quality-Control Measures within the Scholarly Communication Process." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 263-290. ELPUB. Milano, Italy, 2009.

Twenty-six authoritative people, covering a range of academic disciplines and roles, were interviewed for their views on quality control of (i) research communications, (ii) data compilations, (iii) digital learning and teaching (L&T) materials in higher education, and (iv) scholarly communications on Web 2.0. Transcripts of the interviews were analysed qualitatively. Little change was expected in peer-review procedures in the next five years. 'Double-blind' peer review is widespread in the social sciences but almost unknown in the sciences. Neither electronic-only publication nor Open Access is expected to impact substantially on quality control of research communications. For L&T materials, a 'caveat emptor' approach is urged upon lecturers and students alike when they contemplate materials found on the Internet by using search engines. Quality of data compilations varies greatly with discipline: resources susceptible to algorithmic check, such as those in genomics, astronomy and crystallography, are reliable, and older scientific databanks maintained by international scientific unions are sound. Those in the social sciences - including Government statistics - may be of doubtful quality. In the humanities, image databanks need good metadata to be usable. Web 2.0 social-networking is popular with the younger generation, but there is doubt whether it supplants formal scholarly communication.

Linde, Peter, Carin Björklund, Jörgen Eriksson, and Aina Svensson. "Self-Archiving in practice: What do the researchers say and is there any pain alleviation?" In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 393-414. ELPUB. Milano, Italy, 2009.

The purpose of the study was to increase self-archiving of scientific articles in Swedish open archives and thus contribute to the dissemination and increased visibility of Swedish research and to a greater impact for the individual researcher. We wanted to find out what obstacles may occur in the self-archiving process and how the database SHERPA/RoMEO functions as support for control of the publishers' conditions. We engaged 40 researchers at 7 Swedish institutes of higher education to self-archive their peer-reviewed journal articles from the last 5 years. The result was that 140 publications were self-archived in the open archives of these universities and university colleges. After the self-archiving was carried out we followed up on the researchers' experiences and viewpoints in the form of oral interviews. We have found several imperfections and problems in the process of self-archiving. These issues are discussed and then we conclude with suggestions for measures to take, which we believe are crucial to making self-archiving generally accepted in the world of research and therefore increasing the dissemination of research results.

Bueno-de-la-Fuente, Gema, Tony Hernández-Pérez, David Rodríguez-Mateos, Eva Méndez-Rodríguez, and Bonifacio Martín-Galán. "Study on the use of metadata for digital learning objects in university institutional repositories." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 587-594. ELPUB. Milano, Italy, 2009.

This study analyzes the actual use of metadata describing the educational resources that some university institutional repositories (IRs) include in their collections. The goal is to test the viability of implementing value-added services by offering educational resources from IRs in addition to those available from learning object repositories (LOR), based on their metadata. We identify and analyze the different metadata models in a sample of university IRs, concentrating on: the use of one or multiple metadata schemas coexisting in the repository; the use of educational metadata schemas and application profiles such as IEEE LOM or DC-Ed; the possible extensions (qualifiers or any kind of refinements) to DC-Simple; the specific metadata elements used to describe educational features (such as audience, type of educational material, learning objectives, etc.); and the values of the metadata elements, especially the use of specific vocabularies for elements of educational interest.

Chan, Leslie, and Gale Moore. "Symposium on the Institutionalisation of Openness in Universities." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 627-629. ELPUB. Milano, Italy, 2009.

A host of initiatives predicated on the notion of “openness” has flourished in recent years. Open Access, Open Educational Resources, Open Source, Open Data and Open Innovation are now familiar terms, and “openness” has been a central theme of the ELPUB conferences in recent years. Each of these activities occupies a distinct intellectual space, yet they appear to share a common set of principles and practices that are collectively transforming the ways knowledge is being produced, shared, consumed, and disseminated. While the effects of these open practices are increasingly observable in the university, awareness of the opportunities and the consequences for both individuals and universities continues to be unevenly distributed across the institutional landscape. At the same time, these ideas and the initiatives associated with them have, for the most part, originated within the academy, and although universities continue to be a primary site of knowledge production, there is a growing tension between traditional academic values of sharing and institutional concerns about income generation and sustainability. Collectively we face a challenge. The limited and uneven uptake both by individuals and by the institutions themselves is of concern. For example, despite the demonstrated advantages of Open Access for scholarly publishing, only a small percentage of faculty members make their publications openly accessible. Open Educational Resources remain a specialised pursuit despite the benefits to faculty and learners. Computer scientists have found it difficult to integrate Open Source into their curriculum, and contributing source code to Open Source projects is rarely recognized in formal academic evaluations for tenure and promotion.
At the same time, universities are more interested in finding ways to commercialise intellectual property resulting from faculty research than in finding ways to maximize the potential of Open Innovation and developing new business models to generate income and accelerate innovation. How do we move this diverse agenda forward in ways that engage faculty, students and staff as well as our institutional leaders? We propose to begin a conversation on institutionalisation, to explore the possibilities and potential challenges of more formal processes, and to consider how our experiences in our own institutions might be shared and subsequently used to leverage action elsewhere. Universities as institutions regularly compare themselves with others using a variety of published data and indicators. How might access to evidence of successful institutionalisation of open practices and processes, as well as the strategies employed in these cases, support members of this community in our local efforts for change? The goal of the Symposium is to explore the idea of the institutionalisation of openness both as a concept and strategically.
- Intellectually, we will explore the nature of openness and what constitutes it, asking whether it is appropriate to describe openness in terms of “dimensions” - e.g., open access, etc. - and how we might characterise the commonalities and differences across these practices.
- Strategically, we will explore ways to move the openness agenda forward, and identify factors that might enhance or retard its adoption by faculty, students, staff and institutions.
The Symposium will take the form of an “unconference”. By using an open participatory process that acknowledges and respects the different origins and traditions represented by the existing communities on the open landscape, our goal is to provide a space for critical discussions in which we can map out both a strategy and a research agenda to help move forward collectively.
There will be no conventional presentations; rather, there will be opportunities to present short position statements, brief reports on ongoing projects, summaries of current research, etc. Contributions should focus on the theme of “openness”, while keeping in mind that the phenomenon of openness is itself subject to debate and interpretation. We suggest the following questions to start the conversation; they are not meant to be exhaustive:
- What constitutes openness?
- Can we use these practices to open up new spaces for social and institutional change? In what ways do open practices call into question traditional notions of authority and power? Where is there support, and why is there resistance?
- Can we leverage examples of successful institutionalisation by working collectively, developing “metrics” and encouraging comparative analysis?
- What are the key research questions for which we need answers?
By the end of the Symposium we hope to have:
- developed a more nuanced understanding of the value of framing openness as a multi-dimensional concept;
- generated a list of key research questions on the institutionalisation of openness;
- identified a group of people interested in exploring these ideas further;
- produced a list of possible themes for a future ELPUB conference.

Pescarmona, Gian Piero, and Elena Giglia. "Targeted knowledge: interaction and rich user experience towards a scholarly communication that “lets”." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 415-438. ELPUB. Milano, Italy, 2009.

All living systems share many properties, including hardly predictable behaviours, due to the differences between individuals and the chaos of natural environments. The reductionist approach to interpreting these phenomena suffers from oversimplification of the factors involved in the quest for universal “scientific” explanations. The validation of scientific paradigms is based on the consensus of leading groups who decide what is true and what is not. This means that events not fitting the official scientific truth - not only conflicting opinions but also conflicting raw data - were never published, which indirectly supported the correctness of the experts' choice. With the advent of Web 2.0 and the freedom of publishing, the number of these non-fitting events has dramatically increased. Yesterday, data were supplied to the reader together with an interpretation; now the reader has to wade through a huge amount of data and opinions in each field. Extracting the information you need from the garbage requires a strategy, and strategy is a science in itself. In the specific case of knowledge, the first step is to define knowledge. The aim of the life sciences, medicine and the social sciences is to modify reality when it is no longer sustainable, whatever that means in each single situation. I have to know how my system works in order to modify it. Knowledge is the information that allows me to succeed in my tasks. Tasks must have an assessable target. All information which is useful, and therefore processed to attain the target, is 'targeted knowledge'. Information can be selected on the basis of its congruence with the internal rules of the system. In Web 2.0 we found the proper tools to test this approach. We implemented a web application - aimed at easier identification of the molecular basis of diseases - structured in Rules, Reports, Items, Pathways and Tools that refer and link to each other.
The use of tags allows and fosters a free and personal use of information to create original knowledge. Users can follow and open innovative paths, each time answering a different question by re-combining the fitting information. Our application is an example of advanced Problem Solving: the patient as a whole, not as a single symptom, has to be understood as a part of the living world (Gaia, with its rules) whose components (Items, Pathways) are described in their multiple roles and connections. The Web allows easy access to information, the program enables the creation of networks, and the Rules drive the selection of information, becoming more and more stable as they evolutionarily adapt to reality; something like DNA, carrying sequences millions of years old through an ever-changing world.

Lembinen, Liisi. "The native language university digital textbook collection pilot project." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 595-599. ELPUB. Milano, Italy, 2009.

Tartu University Library's native-language university digital book collection (on the Ebrary platform) was launched in November 2008. The main purpose of the collection is to provide university students and professors with an alternative way to find necessary study materials in their native language through the Internet. Students do not need to worry about a lack of materials in libraries or bookstores, and professors can be sure that students will come prepared and well equipped. In addition, the option to submit feedback provides professors with a tool to amend previously published titles for a second edition. The long-term goal is to provide professors with a better understanding of, and basis for, their rights and opportunities as authors.

Krottmaier, Harald, Rene Berndt, Sven Havemann, and Tobias Schreck. "The PROBADO-Framework: Content-Based Queries for non-textual Documents." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 485-500. ELPUB. Milano, Italy, 2009.

In this paper we describe the system architecture of PROBADO, a project funded by the German Research Foundation (DFG). Its main goal is to provide a general library infrastructure for dealing with non-textual documents, in particular for content-based searching. PROBADO provides an infrastructure that allows integrating existing data repositories and content-based search engines into one common framework. The system architecture has three layers interconnected by a service-oriented architecture (SOA) currently using SOAP 1.1 as the communication protocol. The layers are: (1) a front-end layer, responsible for providing the user interface, (2) a core layer, responsible for scheduling requests from the interface to different repositories, and (3) a repository wrapper layer, responsible for enabling existing repositories and search engines to interface to the system. The functionality of each layer is described in detail. The general architecture is complemented by a brief introduction to the domain-dependent functionality currently provided.
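The three-layer architecture described above can be sketched in miniature. The class names, in-memory "search", and example repositories below are illustrative assumptions, not the actual PROBADO API; in the real system the layers communicate via SOAP 1.1 rather than direct method calls:

```python
# Hypothetical sketch of PROBADO's three-layer dispatch pattern.
# Layer 3: a wrapper adapting an existing repository/search engine
# to a common interface expected by the core layer.
class RepositoryWrapper:
    def __init__(self, name, documents):
        self.name = name
        self._documents = documents  # stand-in for a real content-based index

    def search(self, query):
        # Trivial substring match standing in for content-based search.
        return [d for d in self._documents if query.lower() in d.lower()]

# Layer 2: the core layer schedules a request across all repositories.
class CoreLayer:
    def __init__(self, wrappers):
        self._wrappers = wrappers

    def dispatch(self, query):
        return {w.name: w.search(query) for w in self._wrappers}

# Layer 1: the front end presents results to the user.
class FrontEnd:
    def __init__(self, core):
        self._core = core

    def query(self, text):
        hits = self._core.dispatch(text)
        return {repo: docs for repo, docs in hits.items() if docs}

music = RepositoryWrapper("music", ["Bach cello suite scan", "Mozart sonata"])
models = RepositoryWrapper("3d-models", ["Gothic cathedral model", "Bach bust model"])
ui = FrontEnd(CoreLayer([music, models]))
print(ui.query("bach"))
# -> {'music': ['Bach cello suite scan'], '3d-models': ['Bach bust model']}
```

The point of the wrapper layer is that each existing repository keeps its own search engine; only the thin adapter has to conform to the shared interface.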

Woodward, Hazel, Lorraine Estelle, Caren Milloy, and Ian Rowlands. "Understanding how Students and Faculty REALLY use E-Books: The UK National E-Books Observatory." In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 381-392. ELPUB. Milano, Italy, 2009.

The E-Books Observatory project of the UK's Joint Information Systems Committee (JISC) aims to provide higher-education students throughout the UK with access to a number of popular textbooks in digital form, free at the point of use, and then to measure their usage of them. The subject areas covered were business studies, media studies, engineering and medicine. Surveys of users were conducted in January 2008 and January 2009 to measure changes in usage as a consequence of the availability of the e-books. Focus groups of users have also been held. Usage of these e-books has also been monitored by deep weblog analysis, which will continue until summer 2009. This is believed to be the largest study of e-book usage yet mounted. Both questionnaire surveys attracted over 20,000 respondents. Preliminary conclusions are given here; they suggest, among other things, that electronic availability of textbooks will not affect sales of the printed books, because print and electronic versions are used in different ways.

Dobreva, Milena, and Nikola Ikonomov. "What Small Projects Producing Digital Resources Need to Know about Digital Preservation?" In Rethinking Electronic Publishing: Innovation in Communication Paradigms and Technologies - Proceedings of the 13th International Conference on Electronic Publishing, 565-577. ELPUB. Milano, Italy, 2009.

Large institutions in the cultural heritage sector are establishing preservation policies and incorporating digital-preservation activities into their everyday practice. But there are numerous small projects that produce digital resources, and the sustainability of their products is quite often not properly addressed. This paper takes three projects as case studies - TEXTE, KT-DigiCult-Bg and the Parallel Archive - and offers scenarios based on these cases. The scenarios represent three typical situations: a project which prepares stand-alone resources; a project which prepares materials for ingest into a larger repository; and a project with highly dynamic content. The risks in all three cases will be analysed, and recommendations appropriate to each specific situation will be drawn. The paper argues for extending existing standards in the field toward the preservation of dynamic resources.