A view on Europeana from the US perspective
Research article
Ricky Erway, Senior Program Officer, OCLC Research, 777 Mariners Island Blvd, Suite 550, San Mateo, CA 94404, USA, erwayr@oclc.org

At the express request of the organisers of the second LIBER/EBLIDA workshop on digitization, Ricky Erway of OCLC provided an outsider’s view on the Europeana project. Erway looks at Europeana from many vantage points: mandate and funding; branding and public relations; learning from others; aggregation; cooperation; content; rights; metadata; technology; access; user feedback; and sustainability – offering valuable advice for the Europeana community in doing so.

Key Words
Europeana; digitisation; The European Library

The organisers of the second LIBER/EBLIDA workshop on digitisation (The Hague, October 19–21) asked Ricky Erway of OCLC to provide a view on Europeana from the US perspective. Erway accepted the invitation with some hesitation, as she was well aware that Europeana is still in its infancy. Her remarks, as reproduced below, were received by her audience as they were intended: as one person’s observations at a particular point in the Europeana timeline. But as she drew on her twenty-year experience in related projects and activities, her observations are well worth the attention of Europeana staff and stakeholders.

First, I will present a sampling of criticism of Europeana, and then I will make some observations and suggestions. A recurring challenge with this assignment was that, after I came up with a particularly good suggestion, my subsequent reading would often reveal that Europeana had already done it or was addressing it in its plans. And while I had a number of problems with the current Europeana portal, I had to remind myself that Europeana has yet to go public with an operational version.

A Sampling of Criticism of Europeana

Being a bit out of the loop, I scoured the web and polled others in the field for reactions to Europeana. Of course there is a lot of praise. Criticism was harder to come by, but here is a sampling.

Regarding the approach:

  1. Europeana is reinventing all sorts of wheels.

  2. Europeana has all the feel of a digital library R&D project.

  3. It is too top-down.

  4. So far it is all smoke and mirrors: participation has been an act of good will, not an ongoing commitment, and the content is gathered by hand-crafted ingestion, with no updates.

  5. Europeana expects users to search there first, rather than go to Google – and it is not up to competing with the big guys at the network level.

  6. Online access to collections is not serving a clear user need. Once the novelty wears off, how many people will routinely log on to a database of individual object and catalogue records?

  7. We need to develop models that put richly contextualized digital content in the channels users are habitually using.

Regarding participation:

  1. There is no place to learn how your institution could get involved.

  2. It is difficult to access the Europeana standards.

  3. There is very uneven contribution; it is mostly from France.

  4. [from an archivist] Unless we are plugged into a national network, it is nearly impossible to get involved, i.e., those who need it most cannot get in.

  5. What is more important in the long term is the local repository and the content enrichment we are getting – regardless of who harvests it.

  6. We depend on web stats and are concerned about sending our users to an aggregator’s website.

Regarding access:

  1. Europeana’s user studies are limited to testing the usability of the prototype.

  2. It is hard to know what is there.

  3. It is easy to get lost.

  4. Communities like Facebook and MySpace already exist and Europeana should interact there, instead of creating its own social networking.

  5. There is a lack of emphasis on search engines. Or Flickr. Or other places where users are.

Regarding sustainability:

  1. Working out the metadata, ingest, front end, search, etc. are minor accomplishments – sustainability is the biggest challenge.

  2. Can a multi-state supported project produce a nimble, scalable, usable resource with adequate value-for-money?

  3. Europeana won’t amount to much unless the national libraries make it central to their mission. At the moment it is marginal.

  4. Europeana’s operating overhead is very expensive, which needs to be addressed before things progress further.

  5. EU funding to build and sustain Europeana is no match for Google as digitizer, aggregator, and search engine.

  6. We are concerned about monetising content. Aggregators should not lay claim to content that is not theirs.

  7. We would prefer a business model based on brand equity and public/private partnership.

  8. An economic model based on EU funding will be fragile if the numbers do not demonstrate value for money and show significant impact.

Some of these criticisms are valid; others reflect a lack of awareness of the work underway to address several of the issues.

Observations and Suggestions

My observations and suggestions are divided into the following topics: Mandate and funding, Branding and public relations, Learning from others, Aggregation, Cooperation, Content, Rights, Metadata, Technology, Access, User feedback, and Sustainability.

Mandate and Funding

The EU has made access to digitized cultural content a strategic priority. It is abundantly clear that Europeana has a political mandate and significant support. In addition to central administration, there are also, through the European Commission eContentplus program, funded efforts to involve others in research and content digitization. The challenge will be in making the transition from a funded project to an ongoing program.

Branding and Public Relations

First off, Europeana has a good name: a brief name that suggests what it is. Catchy one-word names that do not directly relate to the service may work for companies like Microsoft, which has the marketing budget to put behind Bing. It certainly worked for Google. More descriptive titles, like The European Library, fit into sentences so naturally that it is hard to tell that it is the actual name of a service you can use.

Having arrived at a good name, Europeana has been duly vigilant about promoting the brand, but it is also very important to distinguish it from other efforts.

  1. Many people will be stymied as to the difference between The European Library and Europeana. It needs to be clear that The European Library is limited to catalogs of national libraries and that Europeana provides access to digital collections from any European library, archives, or museum.

  2. Others will wonder, if Europeana is good, would not the World Digital Library be better? It needs to be clear that the World Digital Library is very selective, whereas Europeana is a rich resource with depth enough to support serious research.

  3. And even within Europeana – what is Europeana.eu, Europeana v1.0, Europeana Net, and EU Bookshop Digital Library?

It is important that users, funders, and contributors are clear about what Europeana is and is not. For those interested in the Europeana initiative, it would be useful to have a project timeline that lays out the project components, putting the current activities and future plans in a larger context and that decodes things like the maquette, Prototype 1, and the Rhine and the Danube.

The prototype was released on schedule on November 20, 2008 and was quickly taken down because it was unable to handle the load. Having managed to survive the press about the failed launch (can it really be a failure if you have too many users?), it is impressive that Europeana has turned things around so successfully and now gets very positive press. Maybe too much so, as many articles are now making comparisons between Europeana and Google.

The next big public relations question Europeana faces is how to ‘launch’ a project that is already getting tens of thousands of visitors. News releases from headquarters are appropriate in some cases, but often casual outreach from collaborating institutions tells a more compelling story. Having participants speak at conferences will help inspire others. And ensuring that all the collaborating institutions link to the Europeana site will help boost Europeana’s Google ranking.

Communication should also be taking place on the social networks:

  1. There is a Europeana Facebook page with only 54 fans and no posts, reviews, or discussions.

  2. There is a Europeana.eu Facebook page with 996 fans, but you cannot find it by searching Europeana. It has a number of posts, but none since March are more than tangentially about Europeana.

  3. There are four Europeana Facebook groups:

    • ‘Europeana Biblioteca Europea Online’ has 257 members, 3 wall posts (from Jan., Apr., and May), 1 discussion (4 posts from Nov. 2008–Apr. 2009).

    • ‘Europeana’ has 12 members and no posts.

    • ‘We like Europeana’ has 28 members and no posts.

    • ‘Promote Europeana’ has 11 members and no posts.

  4. There does not appear to be an official Twitter presence.

  5. And the Wikipedia entry is very minimal, with more about the launch problems than anything else.

In addition to putting the content where users are, information about Europeana should also be where users are likely to look.

Learning from Others

Europeana should take advantage of the work coming out of centers of competence such as IMPACT, which is sharing expertise and best practices specifically related to text. In the UK, JISC’s Strategic Content Alliance has developed wonderful Timeline functionality in a prototype called CenturyShare. It would be good to start where they left off.

Europeana should look at examples of best practice from around the world, such as the Australian Newspaper Digitisation project’s cutting-edge application of user generated corrections and tags. OCLC offers Metadata Schema Transformation Services where a number of incoming metadata formats can be mapped to a number of outgoing metadata formats. It may be useful as new contributors come on board. Europeana might also incorporate VIAF, the Virtual International Authority File, to normalize names and to help users get all references to a person despite the form of the name the user searches.
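The kind of normalization VIAF enables can be sketched, at its simplest, as a lookup from variant forms of a name to a single canonical identifier, so that a search on any form retrieves all references to the person. The variant forms and the identifier in this sketch are invented for illustration; they are not real VIAF data.

```python
# Illustrative sketch of VIAF-style name normalization: variant forms of
# a name map to one canonical identifier. Variants and the identifier
# below are invented, not taken from the actual VIAF file.

NAME_VARIANTS = {
    "tolstoy, leo": "id:0001",
    "tolstoi, lev nikolaevich": "id:0001",
    "tolstoj, lev": "id:0001",
}

def canonical_id(name):
    """Map a known variant of a name to its canonical identifier."""
    return NAME_VARIANTS.get(name.strip().lower())

# Two different forms of the same name resolve to the same identifier.
assert canonical_id("Tolstoy, Leo") == canonical_id("Tolstoj, Lev")
```

In a real deployment the lookup table would be replaced by calls to the VIAF service itself, but the retrieval benefit is the same: the user's chosen form of the name no longer determines what they find.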

This is a tiny sampling of related efforts from which Europeana might benefit. It is sometimes difficult to learn from others when you do not know what they have done and they do not know what you need. But with so much to be done, it would be nice to adopt the best of what is out there and concentrate the bulk of the effort on unsolved problems.


Aggregation

Europeana’s approach to aggregation is very reasonable: aggregate the metadata, but access the digital objects from the providers’ sites. This allows the provider to brand the content with their own identity and to offer up navigation and context pertinent to the content. It also spares Europeana from having to store all of the digital objects centrally, and the responsibility for preservation remains with the owning institution.

Even more reasonable is the idea to aggregate aggregators, wherever possible:

  • It would be nearly impossible to deal with over 1000 individual institutions, but dealing with 100 contributors is manageable.

  • It minimizes unexpected site outages that strand users at 404 error messages or interfere with harvester updates. Individual sites can be up and down without warning; sub-aggregators should be more reliable.

  • Sub-aggregators can solve some of the interoperability issues at a domain or national level.

  • Rights issues are dealt with at the appropriate level, as well as any necessary access control and authentication.

  • There are fewer contractual relations for Europeana to shepherd and manage.

Preferring to aggregate existing aggregations is a good approach, but it imposes distance between Europeana staff and the individual institutions – which is probably bad for collaboration, but may be good for getting input and consensus on policy.

Europeana needs to establish how it will relate to additional initiatives. It should interact with the UK Collections Trust’s Culture Grid, which aggregates collections and can act as an invisible switch to the likes of BBC, Google, and presumably Europeana.

Europeana should think of itself as a portal rather than the portal. And it should think of itself as a feed to other portals. Aggregation is not just about aggregating; it is also about being aggregated. While Europeana is one of the very few aggregations that could conceivably make it as a destination site, it is honorable that they plan to export into other national and thematic portals. Europeana might also consider making their content available in tourism sites, genealogy sites, and in services such as WorldCat.

Ultimately Europeana should consider whether the collections management, digital asset management and digital rights management should be moved into the cloud. It would allow all enhancements to be shared and anyone could select subsets they want for their portal – or make the entirety available in an alternative manner.

The impact of aggregation on use statistics must be taken into consideration. While having users discover content elsewhere and then link into Europeana will increase usage statistics, having Europeana content accessible elsewhere will decrease visits. The same is true at the institutional level – if their support is based on the level of use of their local portal, Europeana sessions that access the metadata and previews, without linking back to the institution’s portal, may be jeopardizing their case, even if the content is getting more exposure. We need to find better ways to demonstrate impact in a world of aggregated content.


Cooperation

Cooperation can be looked at in two main ways:

  1. The soft, fuzzy kind where you encourage people from different domains, sectors, backgrounds, and languages to work together to achieve a common goal.

  2. And the hard scratchy kind with a bureaucratic, legalistic framework for handling agreements, funding, policy, and governance.

It is a wonder the second does not render the first impossible.

Collaboration comes naturally to many in cultural institutions (except when competing for funds, collection donations, or audience!). In the Libraries, Archives, and Museum Collaboration work that my colleague Guenter Waibel and I are leading, we have identified nine catalysts that enable effective collaboration:

  1. trust in each other

  2. a shared vision

  3. a mandate of some kind

  4. incentives for working together

  5. one or more change agents

  6. some sort of mooring within an institution or organization

  7. resources in both funding and personnel

  8. flexibility on the part of the staff

  9. external catalysts, such as users or competition.

Not all of them are necessary, but the more of them are present, the better the chances of transformative outcomes.

Europeana needs to build a community beyond the director level. In AMICO, the museum directors initiated the project, but the staff working groups on metadata and digitization generated so much buy-in that, when the directors discussed discontinuing AMICO, the staff resisted and were able to convince them there was lasting value in the collaboration. In the Cultural Materials Alliance, directors populated the policy group, but staff made up the working groups on descriptive metadata, digital surrogates, content development and use in the classroom. One of the best outcomes from collaborative efforts is giving staff connections to others who are tackling the same issues.

Having a group of advisors is another way to get good results and buy-in. I suspect that Europeana’s council of content providers and aggregators will play a role like this.

It will be a challenge in Europeana – with so many different groups doing so many different, but related, activities – to keep everything synchronized and to prevent duplication or divergence from the plan. So much overlap and so many interdependencies make the most mundane things like version control very important.

Getting others to successfully collaborate requires talent, patience, and luck. In today’s economy it is not as easy to get people to meet face-to-face. You can make online sharing sites, listservs, and wikis available, but getting them used is another thing. Setting up conference calls can be unduly challenging. Just getting people’s attention can be trying. It requires leadership skills, the right personality, and a topic that really matters to the participants to make it happen, but when it works well, it is well worth the effort.

Then there is the scratchy part of cooperation. A framework is needed to protect the project and its participants and to formalize working relationships. Typically, agreements focus on governance, contribution commitments, and agreed uses of content. In my experience, a great deal of the focus has been on things like how and to whom the aggregator is allowed to make the content accessible, the assurance the contributor needs to provide regarding rights, and any uses the aggregator is allowed or prohibited to make.

So far, the Europeana agreements that I have seen have to do with governance and participation in funded projects. The content, I understand, was contributed under a gentleman’s agreement. It is not ideal to iron out the terms after the fact, but sometimes you have to get underway before you know what points need to be in an agreement. You can, however, build in some flexibility. In the Cultural Materials initiative, we had a contributor agreement that laid out all the anticipated uses, but since the Cultural Materials Alliance wanted to explore a variety of business models, we had a clause saying they would agree to additional uses that the policy committee approved. That way, when we had concrete plans for other uses, participants could opt out rather than opt in, avoiding another round of contract signing, a process that can drag on for longer than many projects last! And for that reason, the contracts should be multiyear agreements that automatically renew, with easy termination clauses for both parties.


Content

Ten million items by 2010 puts Europeana in the big league. This has been possible because Europeana has incorporated a combination of what people have done and are doing anyway with things they are doing for Europeana. People wonder why 47% of the content is from France, but it is because France had already come a long way with culture.fr and there were no equivalent efforts of that scale elsewhere.

With this size of a collection, one has to think about how to characterize it. The AMICO art image library was about a quarter the size of Cultural Materials, but it was used about ten times as much. Why? Because it had focus. It was easy to know who might want to use it and how they might want to use it. Cultural Materials was a bit of a dog’s dinner. A lot of this and a lot of unrelated that. It was hard to describe and hard to promote. An approach Europeana may want to take is to slice and dice the content into various language, subject, and format groups – for particular types of users. This is happening to some extent because of Europeana projects with subjects like travel or biodiversity heritage and by declaring and encouraging areas of interest like cities, social life, and music. But this slicing is something that can also be done after the corpus is assembled and maybe by people very familiar with the appropriate audience.

Some vertical access points can be developed by taking the content developed through a project like the Musical Instrument Museums Online, offering targeted access, but also including subsequent contributions of related items. On the other hand, it might be good to decide and declare if there are things that won’t be included. What about licensed journal articles? Commercial eBook publications? Data sets? Theses and dissertations?

One of the most amusing suggestions I saw in my reading was that Europeana should publish a list of what Europeana has and ask country members to fill in the gaps. Europeana is that list and it is really hard to look for what is not there.


Rights

(I assume that the contributor has determined that they have the right to provide the content to Europeana, so I focus here on the rights issues related to the Europeana service.) The rights issue is especially difficult for Europeana, because of the varieties of content and the varieties of sources. If different sets of content are available for different uses, it may be confusing. Conversely, it would be unfortunate not to include content that only some people can access or that require agreeing to specific terms. A filter for things that are freely available would be a nice feature to offer users.

Some of the rights issues that need to be addressed are:

  1. Is there any limitation on use of metadata records?

  2. Could someone else take the entire set of metadata?

  3. Are previews part of the metadata?

  4. Is download of surrogates permitted – if not, how can you prevent it?

  5. Can any of the content be used outside of the system to promote Europeana?

  6. Is reuse permitted? Is attribution required?

  7. Is dissemination allowed?

  8. Are commercial uses permitted?

  9. Can metadata or digital surrogates be modified or transformed?

  10. If people want permission for a particular use, who do they ask?

  11. And what kinds of reuse is Europeana itself allowed to make?

  12. And don’t forget about issues surrounding trademarks, patents and moral rights.

From an operational point of view, if there are different levels of access to different types of materials, it should be managed at the collection level, not at the item-level. And it is best if the metadata includes an indicator that allows automatic application of different access rules.
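As a purely hypothetical sketch of that indicator, a collection-level rights code carried in each record can drive the access rules automatically; the codes and policy fields below are invented for illustration.

```python
# Hypothetical collection-level rights codes mapped to access policies.
# An unknown or missing code falls back to the most restrictive policy.
ACCESS_POLICIES = {
    "open": {"show_preview": True, "allow_download": True},
    "preview-only": {"show_preview": True, "allow_download": False},
    "restricted": {"show_preview": False, "allow_download": False},
}

def access_for(record):
    """Look up the access policy implied by a record's collection-level
    rights code, defaulting to the most restrictive one."""
    code = record.get("collection_rights", "restricted")
    return ACCESS_POLICIES.get(code, ACCESS_POLICIES["restricted"])

record = {"title": "Etching", "collection_rights": "preview-only"}
policy = access_for(record)  # preview allowed, download blocked
```

Because the rule is applied per collection rather than per item, a contributor settles the question once, and every harvested record inherits the answer.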


Metadata

My theories on metadata are:

  1. We do not need another standard.

  2. People will use standards, but not in standard ways. Surprising choices are made even in using plain old Dublin Core. Having to hunt for or transform data, based on site-specific rules, does not easily scale.

  3. People say they want to be told what to do, but they will not do it, because their situation or collection is unique.

  4. No one likes their own metadata.

  5. Mapping is a mythical grail.

What follows is a gross generalization (to which I have found no exceptions): Librarians want metasearch or federated searching. They do not like their own implementation. They blame the deficiency on metadata mapping. If they just had a better crosswalk, it would be better. So they change their software, retool with better mapping, and they still do not like it.

The reason is that a butterfly specimen has entirely different metadata than a painting of a butterfly. Who is the creator and what is the title or subject of a butterfly specimen? What is the Latin name or habitat of an impressionistic rendition of a butterfly? Just how many fields can be mapped between these two records?

My recommendation is to require a very small set of common elements and allow the rest to aid free-text searching. Europeana’s adoption of OAI-PMH and Dublin Core is a good thing. It precludes the development of yet another approach and adopts one that others may already be using. Requiring some very basic elements makes some advanced searches or filtering possible. If participants are allowed to leave required elements empty, those documents will be undiscoverable. Allowing data beyond what is required will allow for better retrieval, but only through free-text searching. That is pretty much what users do anyway: type words in a box. Google manages to make it work.
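A minimal sketch of this recommendation, with hypothetical field names: a handful of required core elements are enforced, and every other field, however it is labelled, is folded into a free-text index.

```python
REQUIRED = ("identifier", "title", "provider")  # hypothetical core set

def normalize(record):
    """Enforce a small required core; fold all other fields into free text."""
    missing = [k for k in REQUIRED if k not in record]
    if missing:
        raise ValueError(f"record missing required elements: {missing}")
    core = {k: record[k] for k in REQUIRED}
    # Everything else becomes searchable free text, however it was labelled.
    extras = " ".join(str(v) for k, v in record.items() if k not in REQUIRED)
    return {**core, "fulltext": extras}

# A specimen keeps its own rich fields as searchable text, with no mapping.
specimen = {"identifier": "x1", "title": "Monarch butterfly",
            "provider": "Natural History Museum",
            "latin_name": "Danaus plexippus", "habitat": "meadows"}
indexed = normalize(specimen)
```

The butterfly specimen and the butterfly painting each retain their domain-specific fields, but only the tiny shared core has to be mapped.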

User-generated information is intriguing. Access points that users use can be added to the ones we use. And we may get very rich information from experts. But there is a management headache when the data being augmented is in an aggregation. How do you coordinate giving enriched records back to contributors? If you do not or if they do not incorporate them into their catalog, then how do you coordinate updates from the contributor to the records that have been enhanced?


Technology

Europeana has the benefit of building on what others have already done and the luxury of being able to work with today’s tools instead of maintaining yesterday’s tools or transitioning from one to the other. Committing to open source code development through EuropeanaLabs will benefit Europeana and will help all who follow. Europeana is also showing some foresight in considering the effect of today’s choices on future semantic web capabilities.

It is true that Europeana’s development approach has the nature of an R&D project, farming out discrete packages of work, rather than that of a production-oriented environment. But some discrete pieces make sense to farm out, for instance, enabling mobile device interfaces will be developed as part of the EuropeanaConnect work.

When harvesting from remote sites, staying in sync is difficult. Though OAI-PMH provides mechanisms for reporting deletes and incrementally harvesting only new and updated records, those protocol features are rarely applied. The University of Michigan found, for example, that they had to harvest the 23 million records in OAIster from scratch every six months to stay in sync. Once you are doing that, you start to wonder why you are bothering with the rest of the OAI-PMH overhead, and whether some well-designed Google sitemaps might not be just as effective and require less support.
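For illustration, the incremental-harvesting features in question are the protocol's `from` datestamp and resumption tokens on the `ListRecords` verb. This sketch only builds the request URLs; the base URL is a placeholder.

```python
from urllib.parse import urlencode

def list_records_url(base_url, metadata_prefix="oai_dc", from_date=None,
                     resumption_token=None):
    """Build an OAI-PMH ListRecords request, optionally incremental."""
    if resumption_token:
        # Per the protocol, resumptionToken is an exclusive argument.
        params = {"verb": "ListRecords", "resumptionToken": resumption_token}
    else:
        params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
        if from_date:
            # Harvest only records added or changed since this datestamp.
            params["from"] = from_date
    return base_url + "?" + urlencode(params)

url = list_records_url("http://example.org/oai", from_date="2009-06-01")
```

When providers honor the `from` parameter and report deletions, the harvester fetches only the deltas; when they do not, the full-reharvest cycle described above is the only way to stay accurate.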

Another challenge I see on the technology front is a strain on resources due to the success of the current prototype system. Europeana staff have to continue to maintain that system, while developing the V1.0 system.


Access

Accommodating all the European Union languages adds a huge layer of complexity. Europeana has made quite a commitment regarding multilingualism. Often a nod is given to multilingual access, but what is often meant is translating interface buttons (the easy bit). Europeana has to go much further. EuropeanaConnect will deliver thesauri and other resources that will give users multilingual access to the content itself. To enable some degree of browsing names across multiple languages, Europeana might consider making use of resources such as VIAF, the Virtual International Authority File. Fully accommodating search and display for 20–30 languages will be plowing entirely new ground.

The Google Research Corpus will likely create demand for data-mining access to the text in Europeana, or means to improve image content-based retrieval, or other demands not yet imagined. Sometimes users are machines with hungry algorithms.

Much (perhaps too much) credence is given to search engine optimization. It is likely, however, that an initiative that makes its contents available in more frequently visited systems will have a larger pay-off than continued optimization of a lightly visited website with the hopes of rising to the top of Google results. Wikipedia can play this role, if you take a curatorial view of the materials and target some materials to highlight and promote. SEO should not be ignored, and there are some simple things to do, but it is not a silver bullet. (And SEO is counterproductive if you trick someone into coming to your site and they find it inappropriate or impenetrable.) Even a ten-fold increase in a low number of visitors will not look like much in the larger web context. It is better to spend the effort on learning more about the ‘elsewheres’ where discovery happens and make relevant materials available there.

Some access-related observations about Europeana:

  1. The ability to see an item in its context is great, but it is important to make it easy for users to get back (although with the Google model running in their veins, users may not bat an eye when an interface links them to another website).

  2. It is so rewarding for designers to work with images that other formats often get short shrift.

  3. Ranking will be very important in such a large resource.

  4. Filtering by facet is more important than advanced search. Seeing the size of a result set and, at a glance, how many of the results are, for instance, audio resources is much more intuitive than limiting up front. Filtering by language and access and contributor and date is doable and useful. The likelihood of someone searching by title in a resource like this is low, and a title search can be accomplished in a free text search.
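The point about facets can be sketched in a few lines: count the values of a field across a result set, so users see at a glance how many hits each filter would leave. The records and field names here are invented.

```python
from collections import Counter

def facet_counts(results, facet_field):
    """Count facet values across a result set for display after searching,
    rather than forcing users to limit up front."""
    return Counter(r.get(facet_field, "unknown") for r in results)

results = [
    {"title": "Folk song", "format": "audio"},
    {"title": "Portrait", "format": "image"},
    {"title": "Field recording", "format": "audio"},
]
counts = facet_counts(results, "format")  # audio: 2, image: 1
```

The same counting applies equally to language, contributor, date, or access facets.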

User Feedback

User studies should start with the users and their needs, learning how they currently approach their tasks and what they wish they could do. Most often, however, we wait until we have something and then test users on our interface. Europeana has done both. Their definitions of users, of use cases and expectations are very impressive. Usability testing has been done on the demo and prototype systems. More work is planned to learn more about what researchers want and Europeana V1.0 will be tested via focus groups.

Typically, the bulk of early use is tire-kicking. Having heard of a new resource, users just want to find out what is in it and what it does. Often those users do not come back. The most promising sign in Europeana user studies is that a large majority of first-time visitors say they are likely to come back again and the high numbers of repeat visitors corroborate that.

We always have to be prepared to redefine our audience. Some initiatives start out with a narrowly defined audience and are delighted to learn of unanticipated uses by unexpected users. Some think everyone will love their service and then find that there is a very enthusiastic niche market and redirect their efforts. With OCLC’s ArchiveGrid – and with the Australian Newspapers project – we started thinking it was mostly researchers of historical topics and found that genealogists are the biggest users. With American Memory, the Library of Congress saw their audience as their traditional serious researchers, but found that it was eagerly embraced by school kids and teachers and roundly ignored by the university community. User surveys should include non-users, so that we learn of potential new audiences, though non-users’ contact information is far harder to come by.

The ‘My Europeana’ and ‘Communities’ features did not fare well in the user survey, though many say they want that functionality. It could be that fatigue is setting in with setting up profiles, friending, and sharing. I think it is more important to make established social networks interoperate with Europeana than it is to create new ones.

Europeana found that very high ratings were given for the detail and accuracy in the description. This is something to take into consideration if planning to allow user-added data. At RLG and at OCLC, we have asked users about user annotation, rating and tagging and on a number of occasions found that they would not trust that information unless the contributor was someone they thought had credibility. Users may be more interested in making their own lists and sharing them through existing social networks. Setting up friends, groups, or networks for chatting may be seen as irrelevant in Europeana.

It is encouraging to see that Europeana is looking not just at an array of what we traditionally think of as end users, but also at those using APIs and at other aggregators and service providers. Again, it is important to think creatively about other ways to measure Europeana’s impact, because it may be that the places where the most activity occurs are in other portals, services, or social environments.


Sustainability

It is a luxury to know there is funding until 2013, but progress needs to be made in building community among contributors, putting content in the users’ workflow, and developing APIs for reuse. A sustainability plan will only be compelling when Europeana is embedded in these ways.

Components of sustainability may include:

  1. advertising

  2. private sector sponsorship

  3. sharing affiliate income

  4. paid inclusion

  5. image licensing

  6. continued project-based funding from EU, philanthropic, and national sources

  7. commission on sales of in-copyright content

  8. consultation deals

  9. paid referrals from Europeana to revenue-generating sites

  10. creation of web stores for products derived from the content or based on the brand

  11. exhibit services

  12. codified support from member states.

Assuming Europeana can demonstrate significant value, some of the above could supplement but not replace continuing EU support for the ongoing maintenance and development of Europeana. If it costs over €3 million to keep Europeana running, there must be significant central support.

American Memory was a huge initiative at the Library of Congress, supported at first with philanthropic funding and, after the proof of concept, Congressional appropriations. In the end, American Memory became part of a series of larger initiatives to preserve and to provide access to the nation’s patrimony. While it still gets some project-based funding, gradually components are being absorbed into the fabric of the Library.

In trying to make Cultural Materials recover some of its costs, we spun off a site, Trove.net, with a page per item. We tried image licensing. We tried revenue sharing by including our content in another resource. We attempted to get sponsorships. We tried selling ads. Perhaps I need say no more than that Cultural Materials was retired.

In Conclusion

Many challenges lie ahead for Europeana. There is a lot to coordinate and a lot to communicate.

Some of the biggest challenges are related to becoming a mature operation:

  1. operationalizing harvest and update

  2. rights issues and finalizing other policy issues

  3. setting up to let others harvest and reuse Europeana content

  4. long-term funding and sustainability.

Most of the yet-to-be-determined aspects are key components of the current work. But this is also the hard stuff, at which many similar initiatives have failed.
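Operationalising harvest and update, and letting others harvest in turn, typically rests on OAI-PMH, the protocol generally used for metadata aggregation of this kind. As a minimal sketch – the endpoint URL and sample response below are illustrative assumptions, not Europeana specifics – an incremental harvest builds a ListRecords request with a `from` date and reads record identifiers (including deletions) out of the response:

```python
# Sketch of an incremental OAI-PMH harvest. The base URL and the
# canned response are hypothetical examples for illustration only.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

def list_records_url(base_url, metadata_prefix="oai_dc", from_date=None):
    """Build a ListRecords request; 'from' enables incremental updates."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if from_date:
        params["from"] = from_date  # only records changed since this date
    return base_url + "?" + urlencode(params)

def parse_identifiers(response_xml):
    """Extract (identifier, is_deleted) pairs from a ListRecords response."""
    root = ET.fromstring(response_xml)
    out = []
    for header in root.iter(OAI_NS + "header"):
        out.append((header.findtext(OAI_NS + "identifier"),
                    header.get("status") == "deleted"))
    return out

# A tiny canned response stands in for a real HTTP call.
sample = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record><header><identifier>oai:example:1</identifier></header></record>
    <record><header status="deleted"><identifier>oai:example:2</identifier></header></record>
  </ListRecords>
</OAI-PMH>"""

url = list_records_url("http://example.org/oai", from_date="2009-01-01")
print(url)
print(parse_identifiers(sample))
```

Carrying the `status="deleted"` flag through is what makes routine updates workable: an aggregator can retire withdrawn records rather than re-harvesting every collection from scratch.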

The biggest risk seems to me to be biting off more than can be chewed. I would have suggested that Europeana ratchet down its ambitions to a more modest set. The risk is that years go by with many things half-baked and no big successes. It first appeared to me that to succeed it would be important to scale back to the bits that matter the most: encouraging practices that enable efficient harvesting, conquering the language barriers, providing a portal, and enabling subsequent harvesting from Europeana. Some of the other Europeana activities are important to achieving those goals (shared development of open source code, integration of existing terminologies to help with multiple languages, development of APIs…). I would have suggested that Europeana let go of things like user profiles, social spaces, user-generated content, collaborative working, exhibits, color searching, audio and video preservation, and mobile phones until it has achieved those first four goals.

Europeana has been wise to defer some of the fluffy extras like connecting to Google Earth or Second Life or 3D touring. It is very important to differentiate the nice-to-haves from the must-haves and to avoid being distracted by shiny new things. Some of those are things that others can do, once Europeana has provided the necessary framework and content.

What lies ahead is ambitious, but having now digested all the documentation I could find, I’m becoming convinced that Europeana in fact may have the bandwidth to accomplish much, if not all, of what has been laid out. So I guess my admonition is to make sure the goals are prioritized, so that if things start to slide or begin to lose steam, there can be confidence in the most important components.

I think that Europeana has some advantages over similar efforts in its practical technical approach, in the sheer amount of political support it has garnered, and in the breadth and depth of funding, of support, of content, of technical expertise, and of participation. Europeana may succeed where others have failed. There has never been a project of this scale. Widespread interest is demonstrated by the fact that 90,000 people signed up for the Europeana eNews. Europeana is a collective achievement with a huge following.

Websites referred to in the Text

Australian Newspaper Digitisation Project, http://newspapers.nla.gov.au/ndp/del/home

CenturyShare, http://www.k-int.com/projects/c-share

Culture Grid, http://www.collectionstrust.org.uk/culturegrid

eContentplus program, http://ec.europa.eu/information_society/activities/econtentplus/index_en.htm

The European Library, http://search.theeuropeanlibrary.org/portal/en/index.html

Europeana Connect, http://www.europeanaconnect.eu/

Europeana portal, http://europeana.eu/portal/

Europeana brand, http://version1.europeana.eu/web/guest/communication-tools

Google Research Corpus, http://www.googlebooksettlement.com/help/bin/answer.py?hl=en&answer=118704#q51

IMPACT, Improving Access to Text, http://www.impact-project.eu/

Library, Archive and Museum Collaboration, http://www.oclc.org/research/activities/lamsurvey/default.htm

OAIster, http://www.oaister.org/

OCLC metadata schema transformation services, http://www.oclc.org/research/activities/schematrans/default.htm

VIAF, the Virtual International Authority File, http://www.oclc.org/research/activities/viaf/default.htm

WorldCat, http://www.worldcat.org/

The World Digital Library, http://www.wdl.org/