As an open data fan, or as someone who simply wants to learn how to publish data on the Web and distribute it through the Semantic Web, you will face the question: "How do I describe the dataset that I want to publish?" The same question is asked by people who apply for a publicly funded project at the European Commission and need a data management plan. Below we discuss some possibilities that help describe the dataset to be published.
The Semantic Web is still starving for data. To achieve the network effects we expect from the Semantic Web, quality content still needs to be opened up to the Semantic Web world. One of the fields where such an opening to the RDF world should happen is cultural heritage. As works, people, history and references are distributed across various places, archives, libraries and data holders, a Semantic Web approach seems perfectly suited to answering many of the questions involved in making the world's cultural heritage available.
Europeana is one such promising project. Europeana is funded by the European Commission under the eContent+ programme, as part of the i2010 policy. It is a partnership of 100 representatives of heritage and knowledge organisations and IT experts from across Europe. Over the last two years Europeana's prototype has been completed, both technically and in terms of connecting content from various European museums, governmental organisations and art foundations. Two million books, maps, recordings, photographs, archival documents and paintings can already be found at Europeana. This figure is to be raised, with financial support from the European Commission, to 10 million entries by 2010, an effort expected to cost approximately 350 million euros.
Under the lead of Stefan Gradmann (University of Hamburg), semantic technologies are being implemented both within the framework and towards the outside Semantic Web. Although the currently running beta version of Europeana focuses on traditional browsing and search algorithms, an additional semantic Europeana prototype gives some insight into the further development of Europeana into a well-integrated Semantic Web service. So, hopefully, we can expect big content networks to be connected to the LOD cloud soon.
Projects like Europeana will pave the way towards a rich web of data. Hopefully this is not a development that only public institutions pursue: commercial initiatives dealing with cultural heritage (say, Google) should also consider connecting their harvested data to a bigger Semantic Web.
For the past several months the EU Commission and the EU Parliament have been struggling over the so-called "Telecom Package", a legislative initiative promoted by the Commission under heavy advocacy from France. In a nutshell, the Telecom Package contains a very problematic passage meant to strengthen the rights of ISPs to cut off the internet access of individual users if violations of existing or future copyright law are detected. In other words: ISPs would be able to control who gets access to the internet, violating the universal service doctrine, which is a basic cornerstone of democracy.
In its first reading on September 24, 2008 the European Parliament voted against the "Telecom Package", advocating the so-called "Bono Amendment" (named after the French Socialist MEP Guy Bono), which basically states that courts need to be involved in any disconnection procedure. The original passage, quoted in a recent EU Observer article, says:
"No restriction may be imposed on the rights and freedoms of end users … without a prior ruling by the judicial authorities."
This decision has relevant implications for any future development of the internet. While the telcos and the media companies are struggling hard to adapt to the social logic of the internet, searching for new business models and lobbying for regulation in their favour, it is obvious that the existing abundance and innovativeness of the internet is hardly compatible with their notion of making money on the web, which relies basically on restricting access and creating artificial scarcity.
It is also relevant to developments like Linking Open Data, as in an increasingly interconnected and mashed-up world it is getting harder and harder to comply with strict and rigid copyright and usage-rights policies, even when content is published under some sort of commons licence. In this respect it is important to mention that research on the legal problems arising from the automated processing of content released under differing commons licences is still missing (as far as I know; does anybody have a hint for me?). But with the current decision of the European Parliament we can observe a very promising shift towards the notion that the internet is made up of much more than its commercial exploitability, and that any attempt to stifle this notion by imposing unbalanced regulatory restrictions on the rights of users is a major threat not just to the internet as it exists, but to democracy itself.
In this respect, enjoy a great talk by Lawrence Lessig on this topic.
To be precise, it is about "early challenges regarding the Internet of Things".
And it will focus on:
architectures, control of critical infrastructures, emerging applications, security, privacy and data protection, spectrum management, regulations and standards, and broader socio-economic aspects.
Contributions can be sent to email@example.com by 28th November 2008.
Take your chance! Visit their consultation site.
Author: Tassilo Pellegrini