Will Twitter Annotations Jump-Start the Semantic Web?


Among the announcements at Twitter’s first “Chirp” conference for developers this past April was the launch of a new feature called Annotations. But unlike some of the other features that were launched, such as “promoted tweets” or Twitter Places, Annotations aren’t so much a product as a substantial rethinking of the way the service functions on a fundamental level. The changes and extra dimensions they add to Twitter could have a tremendous impact, not just on the social network and the developers and companies who make use of it, but on the way we interact with the web itself.

The new feature will be one of the first large-scale implementations of what the father of the web, Sir Tim Berners-Lee, called the “semantic web.” By that, he meant web technologies that give software and applications a built-in understanding of the relationships between the different elements of the web — that is, everything from web pages to specific pieces of web sites and services. Equipped with these kinds of tools, developers and companies could theoretically create applications and services that allow different pieces of the web to function together and exchange information, and make a whole range of services easier to use and more efficient.

One example used by Berners-Lee is the simple act of getting a cup of coffee with a friend. Instead of having to manage multiple different services or applications — calling or emailing the friend, checking a calendar, looking for a coffee shop nearby, checking a bus schedule — building semantic knowledge into software would allow all of these different applications to talk to each other. You could simply choose a task, such as booking a time in your calendar, and see dates and times that would work for you and your friend, as well as locations and bus routes automatically laid out for you.

While Annotations won’t make this level of integration possible (at least not right away), the underlying principle is the same: additional information, attached to an action, adds meaning to the behavior of users and can be interpreted by software. The feature is expected to launch sometime later this year and will allow developers to attach that additional information to a tweet: a keyword, a string of text, a hyperlink, a geographic location or virtually anything else. These pieces of “metadata” won’t affect the character count of the original tweet, but will be carried along with it through the network and eventually be decoded, aggregated and filtered by a variety of applications or services (or by Twitter itself).
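To make this concrete, here is a minimal sketch of what an annotated tweet might look like, assuming the namespace/key/value structure Twitter proposed for Annotations. The specific namespaces and keys (“place”, “review”, etc.) are illustrative assumptions, not a final spec:

```python
import json

# A hypothetical annotated tweet. The annotation names used here
# ("place", "review") are assumptions for illustration only.
tweet = {
    "text": "Great coffee at Blue Bottle!",  # the 140-character message itself
    "annotations": [
        {"place": {"name": "Blue Bottle", "type": "coffee_shop"}},
        {"review": {"rating": "5"}},
    ],
}

# The annotations ride alongside the tweet as metadata: the character
# count is computed from the text alone, not from the serialized payload.
assert len(tweet["text"]) <= 140
print(json.dumps(tweet["annotations"]))
```

The point of the structure is that a consuming application can look for namespaces it understands (say, “place”) and ignore the rest, which is what allows arbitrary services to layer meaning onto the same tweet.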

In a piece recently published at GigaOM Pro (subscription required), I look at some of the potential applications of Annotations, and what the technology implies about the future of the semantic web. Please check out the full report.


Julien C

As of end of July, Annotations do not seem to be high priority for the Twitter API team anymore.

Might have something to do with the fact that Annotations in Search is going to be hard, maybe harder than anticipated.

StatusNet (an open-source Twitter “clone”), however, has a rapidly evolving Annotations implementation.

nikos lianeris

I don’t believe that Twitter annotations will be the semantic web’s jump start. I mean, yes, annotations will give some kind of semantic meaning to tweets, but this meaning will be given by the user and not by some kind of automated algorithm. So there will be no difference from other tagging systems. In other words, I believe Twitter annotations do not have much in common with the semantic web.

Sean Kollak

Though the idea is brilliant, the difficult thing is that no one wants to make an extra effort. Additional information has to be a spin-off of the original message. At Twick.it we try to realise this by analysing keywords and semantic connections between topics.


Excellent article, Mathew.

I think Twitter’s annotation feature will be a substantive step towards an early implementation of the Semantic Web vision, Sir Tim Berners-Lee’s 1999 “dream for the Web [in which computers] become capable of analyzing all the data on the Web – the content, links, and transactions between people and computers.”

There is a growing Semantic Web meme afoot. Twitter’s as yet undefined annotations feature, Facebook’s Open Graph, employing a partial implementation of RDFa, and Apple’s recent acquisition of Siri will spur a cascade of interest in the exceptional values to be gleaned from implementing Linked Data schema and galvanize further investments in the early practical application of a Semantic Web.

It is all about the metadata!

I suspect LinkedData-empowered Twitter annotations and Facebook Open Graph are going to find powerful synergistic coupling with the product of another tech goliath innovator, Google’s TV platform.

GoogleTV Android developers are going to have a field day employing dynamic Linked Data interpolations of TV/Film metadata, calling related internet content, to deliver highly personalized & contextualized extended viewer engagement opportunities. Semantic Web-enhanced Twitter and Facebook users will be able to link their constellation of content preferences deeply into video episodes and large computational engines will be able to reveal hugely-scaled clustering of people/content preference relationships.

the future

Annotations are a good starting point, but they are eventually going to become a mess unless proper standards and context are introduced.
Knowing Twitter, the annotations model is probably only half thought out and will be hindered by Twitter’s core structure.

Look what happened to the retweet: functionality that evolved naturally among users was replaced with an official retweet function that lost one of the most important elements of the old-style RT, the ability to comment on the content being retweeted. Sure, if your users have to ‘hack’ your platform to perform a common task, it’s no surprise Twitter decided to attempt to clean things up. But it seems Twitter’s underlying structure simply isn’t suitable for a chunk of comment text to go alongside a retweeted tweet. Now we are stuck with two retweeting methods, neither really doing exactly what we want.

Annotations are very, very basic. It’s just a new field that contains a limited amount of serialized data that can add context to @mentions. I’ve seen the Annotations Overview (http://apiwiki.twitter.com/Annotations-Overview) and the Recommended Types – it’s all pretty basic. Nothing semantic about it.

The semantic web creates a relationship between pieces of content (for example, in our case a tweet and a TV show). A tweet can reference a TV show now, thanks to annotations, but the TV show isn’t a single object. If one of the attributes of the TV show changed a week later, that old tweet’s annotations wouldn’t update.
Perhaps this isn’t a major problem, since Twitter seems to have no interest in archived tweets (I don’t think you can even search for tweets older than two weeks).

Making annotations searchable is going to require some intelligent search methods. For example: you can put anything in the annotations field when tweeting, and I’m assuming this serialized data is stored in the database ‘as is’.
If we want to search the database for tweets that reference a certain TV show, we would need to query it with a regex matching the JSON (slow and INSANE), %LIKE% (slow) or FULLTEXT. Maybe the annotations are ‘transformed’ and normalized into more tables when they are tweeted, but even then that would mean joining tables when requesting the data back. And we all know Twitter’s API is getting beaten to death constantly; when annotations are introduced, unless they stay with the current ‘plain text’ approach, I’m not sure how Twitter will handle the additional load.
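The “transform and normalize” approach mentioned above can be sketched briefly. This is a toy illustration using SQLite, assuming the namespace/key/value annotation shape; the table and column names are invented for the example. Annotations are exploded into a separate indexed table at write time, so lookups become equality queries instead of regex or LIKE scans over serialized JSON:

```python
import json
import sqlite3

# Toy schema: raw annotations are kept as-is on the tweet row, and each
# namespace/key/value triple is also exploded into an indexed side table.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE tweets (id INTEGER PRIMARY KEY, text TEXT, raw_annotations TEXT);
    CREATE TABLE annotations (tweet_id INTEGER, namespace TEXT, key TEXT, value TEXT);
    CREATE INDEX idx_ann ON annotations (namespace, key, value);
""")

def store(tweet_id, text, annotations):
    """Store the raw payload, then explode each triple for indexed search."""
    db.execute("INSERT INTO tweets VALUES (?, ?, ?)",
               (tweet_id, text, json.dumps(annotations)))
    for ann in annotations:
        for namespace, attrs in ann.items():
            for key, value in attrs.items():
                db.execute("INSERT INTO annotations VALUES (?, ?, ?, ?)",
                           (tweet_id, namespace, key, value))

store(1, "Watching the finale!", [{"tv_show": {"title": "Lost"}}])
store(2, "Just some text", [])

# Find tweets referencing a given TV show via an indexed equality lookup,
# at the cost of the join the commenter points out.
rows = db.execute("""
    SELECT t.text FROM tweets t
    JOIN annotations a ON a.tweet_id = t.id
    WHERE a.namespace = 'tv_show' AND a.key = 'title' AND a.value = 'Lost'
""").fetchall()
print(rows)
```

The trade-off is exactly the one raised in the comment: writes fan out into extra rows and reads require a join, which is why a service under heavy API load might prefer to keep annotations as opaque text.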

I was under the impression that the semantic web involves microformats (not involved with annotations), RDF (XML describing and connecting resources)… not a simple text field and some recommended types.

I want to be able to reference a Flickr photo in a tweet and have inline data about that photo that can change in the future (not supported in annotations). It should update with new comments from Flickr and the number of views, in real time (absolutely impossible using this basic approach).
Also, when I see the photographer’s username sent through an annotation as {"user":"photoman123"}, what use is that? The user annotation should itself be an annotation describing the user, and, again, should be able to update in real time.
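The distinction being drawn here is between a snapshot and a reference. A linked-data-flavored alternative (purely hypothetical, not part of Twitter’s proposal) would annotate with a stable URI that clients dereference at display time, so the rendered data reflects the resource’s current state. The “flickr_photo” namespace and the fetcher below are invented for illustration:

```python
# A bare string is frozen at tweet time and can never update.
static_annotation = {"user": "photoman123"}

# A hypothetical linked-data style annotation: a dereferenceable
# identifier rather than a snapshot of the data itself.
linked_annotation = {
    "flickr_photo": {
        "uri": "https://www.flickr.com/photos/photoman123/12345",
    }
}

def render(annotation, fetch):
    """Resolve the URI at display time, so views, comments, etc.
    reflect the resource's current state, not its state at tweet time."""
    return fetch(annotation["flickr_photo"]["uri"])

# A stub fetcher standing in for a real HTTP call to Flickr's API.
current = render(linked_annotation, lambda uri: {"uri": uri, "views": 1042})
print(current["views"])
```

The cost of this design is that every render needs a lookup against the external service, which is presumably part of why the proposed Annotations stayed with static payloads.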

As I said earlier, Twitter’s core prevents true semantics, because at the end of the day Twitter was built to support text, not objects.

It’s good to see Twitter’s own http://t.co attempting to replace short URLs with contextual objects (but still using short URLs in the process… good one).

I don’t know; I just predict a lot of mess, and I don’t think we should credit Twitter with kickstarting the semantic web if it’s a tinpot setup that can’t perform (in a sensible and optimised fashion) a fraction of what it should be capable of.

tl;dr – Annotations are basic. The API most likely can’t handle anything more advanced than a new simple text field. Searching annotations will be a confusing nightmare. It will take forever for these “standards” for context types to be implemented into publishing and reading services. Annotations cannot change or update; they do not reference real objects; they are just blocks of static text. ‘Twannotations’ sounds dumb. Sure, we will get a richer browsing experience by being able to see inline information, but none of it is wired up.


Annotations do indeed have some interesting possibilities. I participated in the Annotations hackathon at Twitter HQ about a month back. At that point, search for Annotations wasn’t yet enabled, so all we could do was write annotations and attach them to a tweet. But it’s totally flexible, as you say: you can write anything you want. It has the potential to get really messy, though. It will be interesting to see if Annotations standards are adopted, similar to standard XML layouts for resumes, documents, bills, etc.
