Introducing Mobile Museum

I’m ramping up to launch a sister site to this one. It’s called Mobile Museum and will be a series of semi-structured written interviews with people who have developed, authored or project-managed mobile solutions. Some of these people will be museum people, others won’t…

If you’re interested you can find out more over on http://mobilemuseum.org.uk/ where there’s a link to a signup form. Expected launch date – end September 2011.

QR isn’t an end, it’s a means

QR seems to have taken on a bit of a life of its own over the past few weeks. Not only have I seen far more of the codes in the wild, but there seem to be many more people writing about it, many more news articles – and also (which is nice) – lots of people emailing me to ask how they can “do QR”.

Google Trends graph for "QR"

QR is a great technology. Actually, no – it’s an ok-ish technology. The more important thing is that this awareness drives popularity, which in turn means more people will know what a QR code is and how to use it – and will also become aware of some of its foibles. As with anything, this isn’t about how awesome the technology is. Many, many geeky people will tell you QR is crap – which in some ways it is – but the important things are market penetration, expectation, device support and (most importantly) the content experiences which underlie it.

Underlying the concept of QR though is something rather more important, which I think many people miss in their rush to play with the latest and greatest thing. The important thing is this: QR is a way of poking the digital world into the real world. In a way, QR is simply one technology in a line of technologies that do this. Remember the first time you saw a URL on a piece of print advertising? That was digital poking into real, albeit in a slightly crap way. Then Bluetooth. Now QR.

Ultimately, the concept is the same in each of these cases: put a marker in the real world which allows your audiences to connect with content in the virtual world.

You can be agnostic about the technology with which you do this. This year it might be QR. Next year it might be NFC or AR. The year after – who knows: image recognition / hyper-accurate GPS / whatever. The facts remain the same:

First: People have to have a desire to engage with the marker in the first place. Why would you go to the effort of scanning a QR code with no knowledge of what that code might provide for you? Nina Simon recently blogged about QR Codes and Visitor Motivation, which asks this question. The cost curve – as always – has to balance: the value that your user gets out must be greater than the effort that they have to put in – and (almost more importantly) you have to make this value clear before they scan.

Second: A proportion of people will never take part / have the technology to take part. QR scanning (or – even more so – NFC or whatever the next big thing is) will be a niche activity for the foreseeable future. Bear in mind that not only does your user have to have a QR code reader installed, they also need the right kind of phone, an internet connection at the point of scan AND a contract with their provider that lets them use this connection. These things are becoming more common, but they are by no means a given yet.

Third – and possibly the most important – the content that you deliver should add something significant to their experience. This is tied to the first point. Here’s a banner I snapped when I was in London recently:

UCL zoology QR code

If you scan this you get a link to the UCL Zoology Museum (and ironically, out of shot to the left is the URL that the QR code sends you to..). From a user experience perspective, I bet you 50p I can get my smartphone out, type in the URL and be looking at the relevant content quicker than you can boot up a QR app, scan and open.

In this instance, you do actually end up at a mobile-friendly site and some interesting links to QR technologies in use at UCL – which is fantastic. But the use case and motivation aren’t really articulated in the physical world.

Finally – you can easily put some measures in place to track usage, and use this to inform future activity. Here’s another example, this time from the British Library:

British Library QR

If you follow this link, you’ll find it goes to http://www.bl.uk/sciencefiction. The problem with this is that the URL is the same one being used on the poster, around the web and in all their other marketing. So when it comes to evaluating the use of QR – and whether it has been successful as a means to pull in new visitors – my suspicion is the BL won’t have any idea how to separate out these clicks from any of the others.

The simple solution to this is to use something like bit.ly and create a unique URL which is specifically for this QR code. More advanced techniques might include things like appending a string to the end of the URL (for example www.bl.uk/sciencefiction?source=qr) – or using Google Analytics “campaigns” to track these.

(Note that you could also get even more clever by having separate unique QR codes for separate advertising zones or even for separate posters – imagine the impact of being able to track which posters or areas have been most successful…now that’s cool use of a technology…)
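As a sketch of the “appending a string” approach above: the snippet below builds one Google Analytics campaign-tagged URL per advertising zone, each of which would then get its own QR code. The zone names are made up for illustration; the `utm_source` / `utm_medium` / `utm_campaign` parameters are the standard Google Analytics campaign conventions.

```python
from urllib.parse import urlencode, urlparse, parse_qs

BASE = "http://www.bl.uk/sciencefiction"  # the campaign landing page

def tagged_url(base, source, medium, campaign):
    """Append Google Analytics campaign parameters so that QR scans
    show up separately from ordinary web traffic."""
    params = urlencode({
        "utm_source": source,      # which poster site / zone the scan came from
        "utm_medium": medium,      # "qr" distinguishes scans from normal clicks
        "utm_campaign": campaign,  # the overall marketing push
    })
    return f"{base}?{params}"

# One unique URL per advertising zone - each would get its own QR code
zones = ["kings-cross", "euston", "st-pancras"]  # hypothetical zone names
urls = {zone: tagged_url(BASE, zone, "qr", "sciencefiction") for zone in zones}
```

Each of those URLs resolves to the same page but reports separately in analytics, which is exactly what lets you compare which posters or areas pulled in the most scans.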

Coming back to the beginning of this post – the overriding point here is that QR, and many other technologies similar to it, provide a very exciting way of bringing digital content into the real world. With some upfront thinking, genuinely interesting content can be delivered in this way and users can be encouraged to engage. As ever, though, it isn’t about the technology but about the use, motivation and content which lies behind the technology. These are the things that count.

“Activate the world” (or: what “mobile” really means)

I’m talking at the CETIS conference next week on “Next Generation Content” and as with all my recent talks, I’ve done a mindmap to help me structure my thoughts…

Here’s the basic premise: “mobile” isn’t just “designing for mobile devices” but goes much deeper. We need to start thinking about what mobile means from a user experience perspective, from a privacy perspective and from a product design perspective.

When internet-connected devices (all 5 billion-ish of them) start becoming the norm, how does this change our lives?

Click the image for a bigger version. It helps.

Urban Augmented Reality: Q&A

Some time ago, Jacco Ouwerkerk contacted me having seen the interview I did with the Museum of London. He directed me towards a hugely exciting Augmented Reality application called UAR – “Urban Augmented Reality” which launched in the Netherlands in June 2010.

Here’s what we talked about.

Q: Please introduce yourself, and tell us about your involvement with the project

I’m Jacco Ouwerkerk, interactive concepter at IN10 Communication, a creative agency that creates interactive brand, museum and city communication. I’ve been responsible for developing ‘open museum concepts’ like Urban Augmented Reality (UAR) for the Netherlands Architecture Institute (NAi). The Netherlands Architecture Institute is a museum, archive, library and platform that wants to get people of all ages involved in architecture.

I’ve been working on the UAR project since the start in 2009 and I’m responsible for the concept development, interaction design and the project management.

Q: What is the project / what does it do?

Urban Augmented Reality (UAR) is the world’s first mobile architecture application featuring augmented reality and 3D models. With UAR you can see the past and the future (things that aren’t even there yet..) of the built environment on your iPhone and Android smartphone. The NAi has set itself an incredible challenge by making the Netherlands the first country in the world to have its entire architecture viewable in augmented reality.

Rotterdam is the first city available in UAR. Rotterdam is famous for its modern architecture, but let’s not forget the past. Rotterdam doesn’t have much historical architecture left due to the bombardment during WWII, but UAR makes it all visible again. UAR makes it possible to see alternative designs of buildings in their real environment, or to get a sneak preview of the new Rotterdam Central Station.

Later this year Amsterdam, Utrecht and The Hague will follow.

Q: You chose Augmented Reality as your technology of choice. Can you tell us how you went about making this choice, and why you think it works best for you?

In mid-2009 Ferry Piekart, UAR curator at the NAi, invited us for a brainstorm meeting at the NAi. He wanted us to come up with an idea for how they could be a museum outside the museum walls, because architecture is best experienced in and around the architecture itself. Also, the NAi would be closed for months because of construction activities, so there had to be an alternative way for the NAi to share its enormous collection with the public.

At that moment I was thesis supervisor of Maurice Melchers on the subject of Augmented Reality. We were looking for relevant Augmented Reality concepts that could surpass the more gimmicky concepts that we found on the internet at that time. We concluded that AR could very well be used to view architecture.

In June 2009 we started researching the possibilities of AR on the smartphone in combination with interactive tours. At the same time Layar launched their Augmented Reality browser. We contacted them and soon we started a partnership to develop the Urban Augmented Reality application with 3D models. There was a lot of enthusiasm among the participants in this project: NAi, Layar and IN10.

Q: Tell us about the process you went through to build the app?

Our goal was to develop a stand-alone (native) smartphone application that could be accessed by as many people as possible. We chose to develop an application for the iPhone and the Android platform, in our opinion the two most relevant platforms to launch our application on. After researching the possibilities we concluded that a web-based app would make it easier to show content on both platforms inside the stand-alone (multi-platform) apps. The content is also available in the Layar browser.

Our main goal was to create the best possible user experience: easy to use, with relevant information and optimised mobile content. User experience design for Augmented Reality is new. For the AR view we had to design within the Layar AR possibilities, but we also added some features – for example, a switch between the different stages of AR: past and present. At some point we got the feeling we were finally designing our childhood dream: a time machine!

In the application you get all sorts of extra information about architectural projects, architect biographies, sketches, drawings, environments and an overview of the process of the realisation of the projects. The NAi spent lots of time selecting projects out of the world’s largest architecture collection and preparing texts and images ready for a mobile context.

With UAR we tried to bring the ideas and stories in architecture to life by adding audio tours within themes and special ‘famous’ guides who tell you about the buildings surrounding you. This feature will be available in the upcoming update.

Testing UAR was surreal! We spent hours wandering in the city of Rotterdam looking through mobile phones: sometimes the spots were very crowded. People tend to get a little paranoid when they think you’re pointing a mobile phone in their direction. The technology is new so we had to deal with GPS inaccuracies caused by electricity cables, buses driving by and so on. We also spent a lot of time finding the right angle and geo-code combinations for 3D building positioning.

Q: What provision is there for people who don’t have these phones, and how did you go about making the choice to be selective with your audience?

It’s the first time mobile Augmented Reality has been accessible on this scale. Augmented Reality only works on smartphones with a compass and GPS receiver. We know not everybody has a smartphone and we had several discussions about how we could make UAR accessible to as many people as possible. We have chosen a multi-platform approach where we make the content available in stand-alone apps (for free) and via the Layar browser. The Netherlands Architecture Institute wants to be innovative and decided to start with AR because of the relevance of AR to architecture, and in the belief that in the coming years everybody will have a smartphone.

Q: How successful has the app been?

The app has been downloaded approximately 2,500 times (iPhone/Android). Within the Layar browser UAR has been requested more than 6,500 times.

Q: Can you give us some detail about the technical implementation of the app?

There were a lot of people and parties involved: NAi, IN10 (responsible for the concept, design, project management and CMS), Layar (SDK and browser), Triangle Studios (app development), DPI Animation House (3D models) and the Rotterdam Historical Archives.

Together we worked on making the archive accessible in UAR at all kinds of levels. We used the Content Management System to collect and upload the complete (selection of the) archive of the NAi (materials, texts and 3D models made by DPI).

We imported a great amount of data using Excel! For future releases, editors at the NAi, urban archives and architects will be able to upload and create content directly in the CMS themselves. We’re also going to use all kinds of APIs and connect various collections and archive databases in UAR.

Q: What have you learnt about mobile / AR / developing this kind of thing? What might you do the same / differently in the future?

The biggest challenges we faced and lessons we learned during this process were around mobile multi-platform development, pre-loaded content, database connections/imports/APIs and 3D positioning.

I hope we can get together with other museums and institutes to join forces on mobile development. I’ve seen that there are so many archives and collections that have been digitised. Together we can create strong mobile user experiences.

Q: What have you got planned for the future?

Rotterdam is the first city, to be followed later this year by Amsterdam, Utrecht and The Hague. The rest of the Netherlands will follow in 2011. We’re also planning to add user generated content to the application.

Q: Anything we’ve missed…?

Smartphones with AR, QR and image recognition are just the catalyst of a future where everything will be connected with data. It’s more important than ever to open up and join forces to create beautiful, interactive and meaningful museum environments in and outside the museum.

It’s time to tell data stories to augment the reality of our daily lives.

Streetmuseum: Q&A with Museum of London

Streetmuseum – a rather lovely iPhone app by the Museum of London – launched a few weeks ago, and almost immediately began to cause a bit of a buzz across Twitter and other social networks. It’s hardly surprising that people have responded so positively to it – the app takes the simplicity of the Looking Into the Past Flickr group and combines it with cutting-edge stuff like AR and location-based services (think Layar++) to bring historical London into a modern-day context.

I caught up with Vicky Lee last week and asked her a bunch of questions about the app. Here’s what she had to say:

Q: Please introduce yourself, and tell us about your involvement with the Museum of London iPhone app project

I’m Vicky Lee, Marketing Manager for the Museum of London. As part of the launch campaign for the new Galleries of Modern London I’ve been working with creative agency Brothers and Sisters to develop a free iPhone app – Streetmuseum – that brings the Museum to the streets.

Q: Tell us about the app – what it does, and how you’re hoping people will use it, also about how successful it is being

Streetmuseum uses augmented reality to give you a unique perspective of old and new London. The app guides users to sites across London where over 200 images of the capital, from the Museum of London’s art and photographic collections, can be viewed in-situ, essentially offering you a window through time. If you have a 3GS iPhone these images can be viewed in 2D and also in 3D, as a ghostly overlay on the present day scene. The AR function cannot be offered on 3G iPhones but users can still track the images through their GPS and view them in 2D, with the ability to zoom in and see detail. To engage with as many Londoners as possible, images cover almost all London boroughs. Each image also comes with a little information about the scene to give the user some historical context.

What we bet on from the start was that users would enjoy finding images of the street they live or work on and would be quick to demonstrate this to their friends and colleagues – helping to spread the word about Streetmuseum but also raising the profile of the Museum itself, particularly among young Londoners who we have previously struggled to reach. We hoped that the app would spread virally in this way within days and it certainly seems to have worked as in just over 2 weeks the app has had over 50,000 downloads. It’s just been released in all international iTunes stores so we’re expecting this figure to rocket over the coming weeks.

Q: Why did you choose to build an iPhone app as opposed to something else (Android, web, etc)

When I wrote the brief for a viral campaign to promote the new galleries and reposition the Museum of London, I had no idea we would end up launching an app. I hadn’t for one moment considered that we could afford to develop an app but Brothers and Sisters’ instinct from the start was that this was what we needed to change perceptions about the Museum. As soon as we understood how the concept fitted in with the overall marketing campaign (which also uses images from the Museum’s collections) it was the only option we wanted to pursue.
As with most Museum projects we were limited by budget so it was a case of either iPhone or Android but not both. To launch with maximum impact our feeling was that we had to go out with an iPhone app, therefore benefiting from the positive associations with the Apple brand and securing the interest of the media. We hope now to be able to secure funding to develop an Android version of the app in response to the many requests we have received.

Q: Can you tell us a bit about the financial model? Did you build it in partnership with someone else?

As a free museum reliant on funding, we would not have been able to create this app without collaborating with Brothers and Sisters. The partnership was mutually beneficial, generating media coverage for both parties and new business leads for the agency. Using images from the Museum’s collections meant that all the content was readily available so this kept costs down. Licensing agreements on certain images made it complicated to charge for the app, however it was always our intention to launch this free in order to reach the widest possible audience.

Q: Overall, what have you learnt about the process so far?

Simple works best. We originally planned to include user generated content but dropped this idea to ensure we stuck to our budget and timescale. Ultimately the idea is not that original but its simplicity has made the app an easy sell, both nationally and internationally.
I’d certainly give myself more time in future – we delivered the app in an incredibly short amount of time which gave little opportunity to review how it worked in practice. With more time we could have carried out user testing and refined the concept further to end up with an even slicker product.

Q: What else have you got planned for mobile at the MOL into the future?

We’re keen to keep the momentum going and stay ahead of the field, so, together with Brothers and Sisters, we are already looking at how we can develop this concept further. If we can secure additional funding we’d like to explore different subject areas and tie-in with future exhibitions and gallery redevelopments. Most importantly though we need to build upon what we have already achieved and keep evolving to ensure that any new apps continue to be newsworthy. We are also looking into the possibility of adding more images to the current Streetmuseum app and developing a version for Android phones.

Quality, functionality and openness

It is against an increasingly bitter backdrop of argument between Apple and Adobe (Flash! No Flash! HTML 5! Openness! Closedness! etc…) that I found myself a week ago with a damaged iPhone. An accidental dropping incident from Son1 added a seemingly minor dent just next to the power button, and hey presto – a device I can’t turn off manually.

The poor, bashed-about phone I dropped was a Gen 1 iPhone: almost a retro device by some accounts. Nonetheless, I’ve stuck with it, and life now without an internet-ready mobile is simply not an option for me. It was therefore a rather lucky twist of fate that found a generous friend offering me his brand new Android phone to use for a while.

So I find myself with the latest and greatest Android handset: an HTC Desire. A ten zigabit processor, a gwillion megapixel camera, a ten billion pixel screen, infinite memory. Something like that, anyway. It’s slick, beautiful, thin, light. It has a bright, hi-res screen, a wonderful camera. It is rammed to the hilt with functionality. I’m blown away by having real location capability (remember, my Gen 1 could only find me using cell stuff rather than GPS); I’ve experienced using Layar, Google Sky Maps, other LBS services – properly – for the first time. That openness, that speed, that power. Awesome.

The first night I got back with the Desire, I found myself sitting on the sofa, flicking my way through the Android store, checking Twidroid, browsing the news. And a weird thing happened, something I wasn’t expecting. Like an almost intangible movement in my peripheral vision, I realised that something wasn’t quite right. I was a bit on edge, trying a bit hard, having to think. Night One, I said to myself. Night One with a new and unfamiliar device. No wonder. It’ll be ok tomorrow.

The thing is: the uneasy thought didn’t get better the next day, or the next night, or the night after that.

After a week of using the latest and greatest Android phone, I find myself sitting down on the sofa in the evening and the thing is sitting unused on the top of the piano. Instead I’m – get this - back using the 1st generation iPhone. It’s SIM-less (useless as a phone, but still ok as a device on the WIFI), battered, slow as buggery, and I can’t turn it off, but hey – I’m back.

Now’s the point in time I should make something very clear: I’m not an Apple fanboy. I have a Macbook at home but I spend most of my working life on PCs. In my past I’ve used both, enjoyed both, had different experiences of both. I’m also pretty conflicted about some of the recent moves by Apple. I personally think that the whole anti-Flash thing is a major mistake, in the same way that I think the anti-Flash zealots are making some pretty bold assumptions in saying that HTML5 can replace Flash at this point in time. Frankly, that’s bullshit. I also dislike the pro-app, anti-web thing that they appear to have going on. The web wins: it always will. Apple say they get this but do a bunch of stuff which implies otherwise.

I wanted to love Android. I wanted to embrace openness, turn my back on Apple’s rejection of free markets, join the crowd of developers shouting about this new paradigm.

I can’t.

I’ve tried very hard to articulate to myself why this is the case. It is – certainly – something about usability. To take one of many examples: on Android you apparently have one paradigm for copy and paste in one application, and a different one in another: in the browser you get a reasonable Apple-like magnifier; in Twidroid (for example), you don’t. This, to me, simply isn’t acceptable. Copy and paste is ubiquitous, end of. Stuff like global Google Search is good – very good – but when every move is hampered by subtle but vital compromises in usability, the overall experience becomes stressful, not playful.

The Android store is also, frankly, embarrassing. I tried very hard to find any kind of game or app that came close to the beautiful stuff you see on even the worst of the Apple store. Nothing. The UI of many apps is just terrible, the graphics all a bit 1995. Crashes are frequent, and when they do happen they are peppered with developer-like comments about code and runtimes.

It’s hard – store aside – to fault the Android device from a functionality perspective, and I’ve tried very hard to find ways that I can articulate what exactly is wrong. It is something about playfulness, about the fun of the technology. There is also something about quality. Robert Pirsig says this:

“…the result is rather typical of modern technology, an overall dullness of appearance so depressing that it must be overlaid with a veneer of “style” to make it acceptable…”

I don’t want to get all metaphysical about Apple products: enough people do this already, but the iPhone experience – in a week of living with Android – is much, much closer to the invisible technology that makes for a better and more natural user experience. That’s what has me reaching for an old, broken, semi-retired phone rather than the faster, slicker, by-all-accounts-better model.

Apple stuff comes with a compromise – and make no mistake, I’m as conflicted as the rest of the world about this: the restricted UI, the closed and editorially controlled store, the limits placed by Apple on the devices their OS will run on – these are not “good” things – but they appear, at least in this instance, to be necessary for quality. When Android is forking its way off into infinite loops of differentness, each with pluses and minuses, Apple stays the course – a slow, chugging, proprietary, known experience. It doesn’t feel right, and yet it absolutely does.

When I think about what this means, I worry. As technology people, we should all be concerned about the approaches that Facebook, Google and Apple are taking, and we all know that openness is – or should be – key. But – and I’ve written about this a bit before – usability and ubiquity are the definers for normal, non-geeky people, not openness or functionality. And we need to focus on this and think about what it means when usability comes into conflict with openness, as I believe it does with Android.

So that’s me. I tried. Circumstances mean I’ll be using Android for the next few weeks either way, and I may change my mind. I may find myself on the sofa using the Ferrari of phones rather than the Morris Minor. But somehow, I doubt it.

What’s so great about mobile?

I gave a presentation recently at UK Museums on the Web entitled “The Intertubes Everywhere”. It was a re-working of my Ignite Cardiff talk, with a gentle angle towards cultural heritage. Here are the slides:

[slideshare id=2742484&doc=theintertubeseverywhere-091218044628-phpapp02]

The one-liner for those that don’t have the time to go through the slides is something like this: I believe that although mobile has been held up as THE NEXT BIG THING for some time, we are reaching a kind of “perfect storm” of conditions where it is at last becoming a viable reality for many users and therefore something for institutions to think about, too.

This is as much to do with effective marketing and consciousness raising as it is to do with device or network capability: if you’ve tried buying a mobile phone in the last year or two, you will have been offered mobile internet; if you go to a mobile phone company website today, you’ll see smartphones, dongles and internet on the go on their homepage. It would be very hard to miss this kind of marketing push. Couple this with the radical improvement of mobile content, the beginnings of location-based services and the increasing speeds and capability of a “normal” mobile device, and it seems pretty clear that we’re on the cusp of something pretty big.

If you’re in any doubt, check out slides 25-35 of the presentation that Dan Zambonini and I did at DISH 2009, which have some interesting figures on changing mobile usage. With device replacements happening on average every 14 months, even the old-school phones that don’t support mobile internet won’t be here for much longer.

With this level of exposure, it’s obvious that museums and other cultural heritage institutions are going to be following along and getting excited about mobile, either building iPhone apps or creating mobile versions of their sites.

While it is excellent to see innovation in this field, I’m slightly underwhelmed by some of the mobile offerings starting to appear that seem to be more “because we can” rather than “because we should”, in particular the current trend (and I’m deliberately not giving any examples – you can go find them yourself!) for “mobile collections search”.

It seems to me that the single mantra which should surround any mobile web development project right from the start is something like “never forget: the mobile browsing experience is far, far inferior to the desktop browsing experience”.

Browsing a mobile website is generally not a fun time. You don’t relax when you’re browsing on a mobile; you don’t lose yourself in the content: you’re there in sit forward mode, and you want to do one of two things:

  1. find some information and get out as quickly as you can
  2. use the capability of the “mobile” bit of the experience to do something…well, “mobile”

The first point is a no-brainer, IMO. Consider when and how I might choose to browse a museum website on my mobile. The answer is not “in my living room at home” – if I’m there, I’ll go find my laptop and have a far easier and more pleasurable experience in sit back mode. The answer probably is (and don’t shout at me for being obvious..) when I’m mobile. I’m out and about, wondering what to do at lunchtime, thinking about whether a museum is open or where I can get tickets or how to get there. I’m not on WIFI, and I want the information as quickly and as seamlessly as possible. I don’t want images, I don’t want interaction, I want information. And I want it right now. And – this is the painful bit – I really, really don’t want to browse the collections. Why would I want a second-rate experience of browsing content using a 2″ screen, some clumsy non-mouse interaction touchpoints and a slow connection? And – more to the point – why would I possibly want to stand in the street (being mobile…) and look at museum collections? I don’t*.

* Actually, sometimes I do, provided the mobile experience adds something. And this is where point 2 comes in:

If I can have an experience which augments my real experience rather than just providing a poor quality facsimile of an online experience – then you’re talking about truly putting mobile capability to good use.

So for example – if I’ve got a known location (and this can mean GPS but more likely in our museum context means “I’m standing in front of artefact X and my phone knows that because I’ve keyed in something to tell it this”), then now is the time for the museum to give me additional information about other similar exhibits, let me bookmark that artwork, or share it with my network.

mobile.nmsi.ac.uk - something I knocked out about 5 years ago and still live!

Some of the museum sites we’re starting to see are making use of this capability – check out BlkynMuse on your mobile (and note the immediate emphasis on “where are you on-gallery?”) as a good example; but there also seems to be an increasing number who are simply putting their museum collections online as they are in some kind of mobile format – either a mobile optimised site or (worse) an iPhone application, with none of the context-sensitivity that makes mobile a value-add proposition for end-users.

Much as I’m glad to see innovation in this space, I’d much rather see museums focussing on point 1 above by having mobile-sniffing code on their homepage that redirects to an optimised m.museumsite.com page with visiting information, than putting a huge amount of effort into providing mobile-optimised collections search. At the very least, museums should have an m.*** or mobile.*** subdomain and host a script there to strip out the images and so on. There are many ways to do this – here, for example, is the Museum of London site stripped using a simple PHP script from Phonefier, or see these tips on how to create simple “mobilised” versions of your existing site with zero extra effort.
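The mobile-sniffing bit really can be that simple: check the user agent, redirect if it looks mobile. Here’s a minimal sketch in Python – the hint list and the m.museumsite.com target are illustrative only (real detection databases like WURFL are far more thorough), and the same few lines translate directly into PHP or whatever your site runs on:

```python
# A deliberately crude user-agent sniff: redirect obvious mobile
# browsers to a lightweight m.* subdomain, serve everyone else as-is.
# The substrings below are a sample, not an exhaustive detection list.
MOBILE_HINTS = ("iphone", "android", "blackberry", "symbian",
                "windows ce", "opera mini")

def mobile_redirect_target(user_agent, mobile_url="http://m.museumsite.com/"):
    """Return the mobile URL if the UA looks mobile, else None."""
    ua = (user_agent or "").lower()
    if any(hint in ua for hint in MOBILE_HINTS):
        return mobile_url
    return None

print(mobile_redirect_target("Mozilla/5.0 (iPhone; CPU iPhone OS 3_0)"))
# prints http://m.museumsite.com/
```

In practice you’d wire the non-None result into a 302 redirect and make sure there’s a “view full site” escape hatch for people who want it.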

Once the simple and high-gain win is done, then it’d be great to see some location-specific and innovative approaches to “virtually collecting” or augmenting collections experiences. But the “browse our mobile collections site” without really thinking about the use-case is pretty much saying: “go here on your mobile and you can have an experience which is infinitely worse than the one on your desktop with absolutely no upside”. In other words, no thanks.

What do you think? Has your museum got a mobile site for visitors, or just for collections, or none at all? What mobile apps have you downloaded or accessed that provide museum collections (or other) information? How was it for you?

UPDATE (about 3 minutes after I posted this…): I just realised I utterly neglected to talk about gaming. Which, IMO, is where mobile (and in particular mobile collections) have a huge amount of potential. I think this’ll have to wait for a future post :-)

UK Museums on the Web 2009 – QR in the wild

Last week was the annual UK Museums on the Web conference.

Things were particularly hectic and exciting for me this year for a whole host of reasons:

  • We launched a new MCG website in the week before the conference – this was a full migration to WordPress MU which I’ll write more about shortly;
  • We were working behind the scenes with the excellent Laura Kalbag to develop a new logo and design guidelines for the group which we also needed to get live by conference day;
  • I suggested I build and trial a QR tag demo based on individualised badges (more below..);
  • I was giving a presentation on ubiquitous technology…

Crazy busyness aside, overall this was for me the best UKMW conference yet, for one very simple reason: we’d managed to get a range of speakers from outside the sector. Often, domain-specific conferences have a tendency to focus inwards, and although it is incredibly useful to see projects that are specific to that sector, I think it is as important that everyone keeps an eye on the outside world. This is particularly the case right now, as museums come down off the 2.0 peak and start to ask where the value is and how best to capitalise on an ever-decreasing budget.

Many other people have done a much better job of describing in detail who talked about what at the conference. If you’re interested, see the many things tagged ukmw09 on Google Blog Search. For me, Paul Golding, Andy Ramsden and Denise Drake probably gave the most insightful talks, but actually every presentation was really interesting and led to a fascinating day.

So now to the point of this post: the beta “onetag” system I put in place to allow delegates to use QR tags in a “real-world” scenario.

For those who aren’t familiar with QR codes, I’d suggest spending a brief moment over on Wikipedia, or just go with the one-line description: “barcodes for linking the real and virtual worlds”.

As well as wanting to give people a QR example to play with, I based the idea for the system on a problem which I think needs solving, particularly at conferences: business cards are irritating, wasteful and require re-keying (hence duplication) of details. The idea, therefore, was to give everyone at the conference a personalised badge with the QR code on it, get them on-board prior to the event so that as many as possible had QR code readers installed on their mobiles, and then sit back and watch how this kind of system might be used, or not!

For the badges, I used local print firm Ripe Digital, who are not only incredibly helpful but also have the ability to run what is essentially a large-scale mail-merge: I designed an A6 badge in Adobe Illustrator which had various fields in it which were populated from an Excel spreadsheet of delegates. (Incidentally, we’d made extensive use of Google Docs during the conference for gathering and munging delegate names, and this really paid off in terms of sharing, collaborating and processing delegate information).

I created the actual codes using a fairly nasty mix of Google Charts, DownThemAll and mail-merge (don’t ask) – once I’d got a local folder with all 100 or so QR codes in it, I just referenced those codes in the AI document and asked Ripe to insert each delegate’s specific QR tag at the top right of the printed badge.
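The Google Charts part of that nasty mix boils down to building one image URL per delegate. Here’s a Python sketch of the idea – the badge IDs and the onetag URL scheme are made up for illustration, and note that Google has since deprecated this image-charts QR endpoint, so treat it as historical:

```python
from urllib.parse import urlencode

def qr_chart_url(data, size=150):
    """Build a Google Chart API URL that renders `data` as a QR code.
    cht=qr selects the QR chart type; chs is the image size in pixels."""
    params = {"cht": "qr", "chs": f"{size}x{size}", "chl": data}
    return "https://chart.googleapis.com/chart?" + urlencode(params)

# One chart URL per delegate, keyed by a unique badge ID
# (both the IDs and the target URL scheme are hypothetical):
delegates = {"0042": "http://onetag.example/t/0042"}
urls = {badge: qr_chart_url(target) for badge, target in delegates.items()}
```

From there it’s a batch download of the resulting images (hence DownThemAll) into the folder the Illustrator document references.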

Here’s the front of my badge – note (important, this) that the grey code under the QR tag is also unique to each person, allowing those without QR readers to take part in the demo as well.

The badge, incidentally, also contained sponsor information and outline timings for the day on the back, a detailed description of the timings and speakers on the inside fold and a delegate list on the reverse. The badge was folded from a single sheet of A4 into an A6 wallet hung on a lanyard around people’s necks. The basic premise was to save as much as possible on enormous (mostly unwanted) wads of printed material and focus instead on the key information that delegates are likely to want.

Assuming (not a great assumption, but go with me for now) that someone not only had a reader installed on their mobile but also managed to successfully read the code, here’s what happened:

The very first time a delegate uses the app, they get directed to a mobile-formatted web page which asks them for their PIN (QR number) details – that’s the bit in grey under their glyph:

Delegates only had to do this once (I placed a cookie to keep the logged-in state) – once they had, and on all future taggings, they were redirected to a screen showing them details for the person they had just tagged:

This is only so much use, especially given the name badge itself has all of this detail on it already, so I also built in functionality behind the scenes to email the taggee’s details to the tagger, both as a plain email and with an attached vCard. This means the person who did the tagging can easily add the contact to their address book without having to re-key any of the information. Here’s how the email looks in Outlook:
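The email-plus-vCard step can be sketched like this in Python – the sender address, dictionary fields and vCard contents here are my assumptions for illustration, not the actual onetag code:

```python
from email.message import EmailMessage

def build_vcard(name, email_addr, org=""):
    """Assemble a minimal vCard 3.0 string for the tagged person."""
    return ("BEGIN:VCARD\r\nVERSION:3.0\r\n"
            f"FN:{name}\r\nORG:{org}\r\nEMAIL:{email_addr}\r\n"
            "END:VCARD\r\n")

def tag_notification(tagger_addr, taggee):
    """Email the taggee's details to the tagger, with a vCard attached
    so the contact can be dropped straight into an address book."""
    msg = EmailMessage()
    msg["To"] = tagger_addr
    msg["From"] = "onetag@example.org"   # hypothetical sender address
    msg["Subject"] = f"You tagged {taggee['name']}"
    msg.set_content(f"{taggee['name']} <{taggee['email']}>")
    msg.add_attachment(
        build_vcard(taggee["name"], taggee["email"],
                    taggee.get("org", "")).encode(),
        maintype="text", subtype="vcard", filename="contact.vcf",
    )
    return msg  # hand off to smtplib.SMTP(...).send_message(msg)
```

Because the attachment is a standard `text/vcard` part, most mail clients (Outlook included) offer a one-click “add to contacts” on it.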

And that, basically, is that :-)

So did people use it? And if so, how?

Behind the scenes, I was grabbing some data each time anyone carried out a tagging. The data I intended to capture was: who did the tagging, who they tagged and when. As it happens, and annoyingly, my script failed on the “when” bit. I also realise that in hindsight I really should have captured the user agent for each tagging as well – then I would have some insight into what people used, most common devices, etc. With a fair amount more time (of which I currently have none!) I could probably marry up the server logs with device types, but for now I’ll leave that bit of information to one side.
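The capture itself only needs an append-only log plus a quick tally afterwards – something along these lines (the CSV format and field names are my illustration, not the actual script, and note this version records the “when” and user-agent fields the real one missed):

```python
import csv
import time
from collections import Counter

def log_tagging(path, tagger_id, taggee_id, user_agent=""):
    """Append one tagging event: who tagged whom, when (unix time),
    and with what device."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([tagger_id, taggee_id,
                                int(time.time()), user_agent])

def taggings_per_person(rows):
    """rows: iterable of (tagger, taggee, ts, ua) tuples.
    Returns a Counter of taggings carried out per tagger, which is
    all you need for the distribution and self-tagging analysis."""
    return Counter(row[0] for row in rows)
```

With the user agent logged at tagging time there’d be no need to marry up server logs with device types after the fact.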

The first bit of interesting information is this: there were 81 taggings during the day, which was actually much higher than I’d anticipated.

27 different people used the system (out of around 100 registered delegates).

The people who did tag someone did it, on average, 3 times, although this figure is skewed upwards by one person who tagged 21 people! Here’s how the distribution looks:

Another view on this data shows that a fair number of people also tagged themselves, presumably to familiarise themselves with the software (that’s the visible diagonal line bottom left to top right):

So what did we actually learn from this? First of all, total simplicity from a user perspective is – as always – absolutely key. Here, we had a willing audience who had been given a heads-up to expect to install the software, who definitely would have been “geek skewed” in terms of internet-enabled devices, and who were willing to play; and although I was pleased that lots of people took part, the figures show that this is clearly far from being an “everyone does it” activity.

Secondly, the blocker – again, as always – wasn’t just the technology but the social issues that surrounded it. I saw lots of people tagging, but this wasn’t an “invisible” activity of the type that makes for seamless interaction. People had to stop other people, ask them to hold still, take a photo, wait for the software to catch up, try again when the barcode failed to read, and so on. However hard I tried to make the back-end seamless, QR software just isn’t good enough (yet) to deal with quick shots, moving targets and wobbling hands. In this particular instance (and this is actually the next stage of onetag that I’m going to look at), RFID or SMS based tagging would have been slicker.

Thirdly, although I see business cards as an issue, it isn’t necessarily a problem which is identified as such by everyone. Exchanging a business card is natural; scanning a badge isn’t. So for this to really work, the technology either needs to be invisible (I just wave my reader over your badge – no focussing or waiting or holding still..) OR the win needs to be much more tangible (a tagger gets more information about a taggee, or there is some other incentive to make the connection, etc). Providing more information obviously has privacy issues, and also potentially usability issues; and as the incentive becomes bigger, so – normally – does the complexity of both the system and the explanations underlying that system.

Overall, I was very pleased with how the system worked, and also delighted that so many people took the time to test it out – so thanks to you, whoever you were!

I’m going to be continuing to develop the various onetag systems, and am always up for hearing from you if you’d like me to put something together for your conference or event – just comment or email and I’ll get in touch.