augmented reality – eightbar Raising The Eight Bar Tue, 26 Jul 2016 09:38:58 +0000

Parrot AR.Drone Wed, 08 Sep 2010 20:37:53 +0000

Andy Piper brought his new toy to the lab today. While on a whistle-stop tour of China recently he called in at Hong Kong on the way back, where he picked up one of the Parrot AR.Drones which were released this month.

The AR.Drone is a quadricopter with two video cameras, one mounted in the nose and one downward-facing. The drone acts as an ad-hoc Wi-Fi access point, allowing it to be controlled from any device with Wi-Fi. At the moment Parrot are only shipping a client for the iPhone, but there is an API available, and there is already footage on the net of an Android Nexus One being used to control one. It’s loaded with a bunch of other sensors as well: an accelerometer to help keep it stable, and an ultrasound altimeter to help it maintain altitude over changing ground.
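From what I have read of the published SDK, the API drives the drone with plain-text “AT” commands sent over UDP. Here is a minimal sketch; the drone’s default address, the port number and the AT*REF bit values are my reading of the SDK docs, so treat them as assumptions to check rather than gospel:

```python
import socket

DRONE_IP = "192.168.1.1"  # default address when the drone is the access point (assumed)
AT_PORT = 5556            # UDP port for AT commands per the published SDK (assumed)

# AT*REF controls takeoff/landing; bit 9 of the argument selects takeoff.
# 0x11540000 is the set of 'always on' bits described in the SDK docs (assumed).
REF_BASE = 0x11540000

def at_ref(seq, takeoff):
    """Format an AT*REF command with sequence number `seq`."""
    value = REF_BASE | ((1 << 9) if takeoff else 0)
    return "AT*REF={},{}\r".format(seq, value)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Uncomment with a drone on the network:
# sock.sendto(at_ref(1, True).encode("ascii"), (DRONE_IP, AT_PORT))   # take off
# sock.sendto(at_ref(2, False).encode("ascii"), (DRONE_IP, AT_PORT))  # land
```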

The iPhone interface for flying the drone uses the accelerometer and is a bit tricky to start with, but I think with a little practice it shouldn’t take too long to get the hang of it. The feed from the video cameras is sent back to the handset, giving you a pilot’s-eye view. At the moment none of the software allows you to capture this video, but that’s expected to be added soon. You can also use the camera to play AR games, or have the drone hover and hold station over markers on the floor.

The whole thing runs an embedded Linux build on an ARM chip, and you can even telnet into it. It comes with two chassis: one for outside, and one with protective shrouds around the propellers for use indoors.

I think some very cool stuff should be possible with a platform like this.

Here are two short videos I shot of a few of us having a go with it on the lawn in front of Hursley House.

Augmented reality for Hursley mobiles Fri, 06 Nov 2009 20:17:32 +0000

On Wednesday, Chris Book was kind enough to invite me to join the mobile developer panel at openMIC 3: the third Mobile Innovation Camp.

The theme for the day was location and augmented reality.

A particular highlight was a talk by Paul Golding on Augmented Reality & Augmented Virtuality, covering a variety of topics such as the state of Virtual Worlds today, and the potential of mobile augmented reality apps to move us from a “Thumb Culture” to a camera-led “Third Eye culture”.

A number of mobile augmented reality platforms were discussed, such as Nokia’s MARA research project, the QR-based Insqribe, the real-world / virtual-world mobile mashup platform junaio, and the ‘world browser’ Wikitude.

Another platform that got several mentions, including a developer’s crash course in the afternoon from Richard Spence, was Layar.

I had a quiet afternoon in the office, so I thought I’d give the Layar API a quick try for myself.

Layar is a mobile app for Android and iPhone that lets you display location-based information overlaid on a real-time camera view.

For example, the screen normally shows a viewfinder-like view from your mobile’s camera.

Search for “coffee” and a bunch of markers appear on the view, showing you where the nearest coffee shops are.

As you move the phone around, the markers follow the approximate location of the places they are showing you.

That’s assuming you want to search ‘Google Local’, but that’s not the only option. Location data is provided through “layars”, and there are layars available for location-tagged Wikipedia articles, Flickr photos, brightkite users, and more.

The interesting thing talked about at openMIC was the Layar API, which lets anyone create a new layar with their own information.

So I decided to spend a quiet Friday afternoon in the office creating a Layar for around Hursley. 🙂

This means a phone with the Layar browser installed can browse and search for points of interest around the site.

It was really very easy, so I’ll quickly outline the steps involved.

Step 1 – Get an API key

Visit the Layar developer site and click on ‘Request a developer account’.

Once approved (it took a few hours), you get a developer ID and key.

Step 2 – Define a new layar

Once you’ve got an account, go back to the developer site and fill out the ‘Create Layer’ form, describing the layar you want to create.

There are a ton of options here. These range from the obvious (a name, description and tags shown to users choosing a layar) to neat customisations like uploading custom colour schemes and icons to use as map markers on the phone screen.

I just filled in a brief name and description and left the rest as the default values.

Step 3 – Define a source of Points of Interest (POI)

Layar wants to support real-time location information.

A good example of this is the brightkite layar. Users choosing the brightkite layar see a view overlaid with the current location of their friends or other brightkite users.

To make this possible, you don’t create a Layar by uploading a static database of points of interest.

Instead, you have to provide a web service that Layar can query to get current locations. (When defining the new layar on the dev website, one of the steps is to provide the URL of where you will put this web service.)

The API doc for this web service details the parameters your web service will be invoked with (e.g. the user’s current location) and the format of the response it must return to Layar.

The Layar browser does perform client-side filtering to only show markers relevant to the user’s location. However, if you are writing a Layar like the Wikipedia one, it’d be best not to just return your entire database of POIs, burning the mobile’s battery both in the data transfer and the processing needed to do the filtering.

For my quick first test, I was a bit lazy: I ignored the input location parameters and just returned all of my Hursley POI markers regardless, leaving the mobile to do the filtering.

Parameters are provided as URL variables, and the response format is a pretty straightforward JSON interface.
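To give a feel for the shape of the response, here is a sketch of what a getPOIs-style reply might look like. The field names follow the PHP below, but the exact set of required fields (and the units; as I recall, the early Layar API passed lat/lon as integer microdegrees) should be checked against the API doc:

```json
{
  "errorCode": 0,
  "errorString": "ok",
  "hotspots": [
    {
      "id": "1",
      "title": "Main Reception",
      "attribution": "ETS demo",
      "lat": 51026570,
      "lon": -1397964,
      "distance": 850,
      "actions": [ { "uri": "tel:...", "label": "phone now" } ]
    }
  ]
}
```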

It took no time at all to knock up a quick bit of PHP to return this…

$currentLat = $_GET['lat'];
$currentLon = $_GET['lon'];

// distance() is a small helper (defined elsewhere) that returns the
// distance in metres between two lat/lon points
$place1 = array('actions'=>array(array('uri'=>'',
                                       'label'=>'view on map'),
                                 array('uri'=>'',
                                       'label'=>'phone now')),
                'title'=>'Main Reception',
                'attribution'=>'ETS demo',
                'distance'=>distance($currentLat, $currentLon,
                                     51.026570, -1.397964));

// <snip> ... bunch of other places here ... </snip>

$allPlaces = array('errorCode'=>0,
                   'hotspots'=>array($place1, $place2, $place3,
                                     $place4, $place5, $place6,
                                     $place7, $place8, $place9));

header('Content-type: application/json');
echo json_encode($allPlaces);

Obviously, hard-coding all of the locations is a sloppy way of doing this – in a real system, you’d want the PHP to be retrieving this from a data store of some sort.

There is open source code available from a bunch of places to get you started in doing this – PorPOISe looks like an interesting example.

Interestingly, the mobile client doesn’t calculate the distance from the user to the POI – you have to return that from your web service call. This is the only reason why I didn’t just write a static text file with JSON data for my “web service”, as I had that one dynamic bit needing to be changed.
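Since your web service has to return the distance itself, you need a great-circle distance helper. Here is a sketch of the standard haversine formula (in Python rather than PHP, purely for brevity) returning metres; the same helper would also let you do the server-side radius filtering mentioned earlier:

```python
import math

def distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (haversine)."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# e.g. distance from a user on the site to Main Reception
d = distance(51.030000, -1.400000, 51.026570, -1.397964)
```

With a helper like this, honouring the radius parameter becomes a one-line filter over your list of POIs.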

The other bit worth highlighting is that each POI can include one or more “actions”, which get offered to the user when they click on a point of interest on their mobile. I added a few examples, such as launching a web page or making a phone call, e.g. you can see where Main Reception is on your screen and choose to call Reception if you want.

Step 4 – Test the web service

It took a couple of tries to get my PHP right… mainly because several of the parameters described as optional in the API doc are actually required 🙂

Fortunately, there is a nice test service available on the developer site.

You plug in your developer ID and key, give it a location of where your mobile should be, and click “Load POIs”.

It tells you what is wrong or missing in the JSON response it gets back from your web service. For me that was either a required value that was missing, or a numeric value I was returning as a string. The errors are clear enough to help you fix your service.

Step 5 – Try it for real

The developer site has a link to the installer for the Android app.

This app is a developer version of the normal one available in the Android Market.

(If you already have the normal app installed, you need to uninstall it first. I didn’t, and the developer app didn’t work properly until I uninstalled it and started again.)

The developer app gives you extra options in Settings to provide your developer ID and key. This lets you access your unpublished layar for testing (until approved by Layar, your new layar isn’t available to other users).

(It also lets you override your phone’s current location – useful for testing indoors!)

Once you’ve done that, your new layar shows up in the mobile’s list of “Featured” layars.

(I found that it only showed up if I also changed the Country setting from “Auto” to “GB”… not sure why, although when I defined the layar I did specify that it was a GB layar.)

And that’s it… I had a fun excuse to go wandering around Hursley this afternoon to test it out and collect the screenshots for this post. 🙂

Lecturing MBAs at Babson in Second Life Mon, 16 Feb 2009 15:24:33 +0000

Last night I was invited to speak as a guest to a very diverse class of MBA students and entrepreneurs at Babson (thank you, Linda, for the invite). The conversation of course happened in Second Life, and also happened to be around midnight my time. That in itself is almost routine now, though for a change I was using voice and watching for text questions. What was great was that the subject was not the metaverse itself, though I did throw in some futures like 3D printing and augmented reality. No, the subject was the story of eightbar: the steps to get to the point we are at, and how, despite various things stacked against many of us, we just carried on and did the right thing.
***UPDATE*** @abelniak, who was at the meeting, twittered that he had written this post on his perspective as a member of the class. Once again the power of social media, and the willingness to share and build, is in action.
There were also some honest statements about the risk of being a pioneer, and the fear that a self-organized group can generate in traditional control structures.
I really enjoyed talking to the group, and there were some great questions from some clever minds.
We discussed leadership in particular, and what differs or stays the same in virtual worlds. My general answer is that a good leader will adapt: those true leaders already in traditional places of power have the emotional skills to lead and inspire anywhere. However, the new connected world, and the removal of “local” as a barrier, unleashes the abilities of anyone who wants to lead.
Babson MBA lecture
Anyway, a huge thanks to everyone who came along, thank you for listening and following up. It became clear to me that there is huge value in sharing this story now; it’s constantly evolving for me, epredator, eightbar, metaverses and IBM. At any point in time it has things to learn from and things to share.

A teaser trailer – SHASPA Sun, 15 Feb 2009 19:42:39 +0000

This trailer has just hit YouTube. I am sure more will become clear as this project moves on; for now, just enjoy.

Augmented reality anywhere from MIT Fri, 06 Feb 2009 17:06:43 +0000

Thank you to AnnieOk for pointing me towards the video and articles here on the MIT Fluid Interfaces work that got such a good reception at TED 2009. This is brilliant work. You have to see this, and go to Wired to read the rest of the article.

Projection, mixed with gesture and finger tracking: whilst it looks a little cumbersome, this is showing some very clever things actually working.
What I like about projection (though I do find the personal ways to get an AR experience relevant too) is the potential to share with others, just as it has become common, as I have mentioned before, to see people gathered around an iPhone on the table.
It’s been quite a week for seeing things often talked about actually working.

Mirror Mirror on the wall Wed, 04 Feb 2009 11:46:58 +0000

Over at RedMonk, James Governor has written a very interesting piece on what has happened to the Microsoft ESP platform. Mirror worlds, accurate representations of real things, ideally instrumented by a raft of sensors from the real world, are a very specific, and obvious, use of virtual worlds. After all, pilots already spend a large amount of time training in such environments, and we entrust our lives to them. (It would be interesting to know how much virtual training the Hero of the Hudson had had, re water landings.)
James said of ESP “the single coolest initiative I have seen from Microsoft in the 13 years I have been watching the firm”, but now it appears there is a drop in focus on it.
There are of course lots of other mirror worlds and hybrid mirrors out there, but as yet there is not a good commercial high fidelity toolkit that can be used to build specific mirrors.
Google Earth is clearly the richest in terms of global-level instrumentation, but it is at a much finer-grained and real-time level that we will see the benefits.
I am not sure what we would do with a live, as-is model of the world, accurate and instrumented in every way possible, but as a concept, and seeing the fascination people have for maps, photos and satellite images of their part of the planet, it seems a worthwhile goal to make a true mirror world.
Also, an accurate model of an environment is a base requirement for enhancing the real world with augmented reality systems, like the ones we already have for GPS navigation. Without the accurate map (a digital model of the world), the GPS position is of less use to the average user.
As James also says, though, there is some speculation about the future of the ESP platform. So I guess we will have to wait and see.

Coca Cola Avatars in the real world Mon, 02 Feb 2009 10:58:03 +0000

Whilst we may have missed the advertising-fest that is the US Superbowl, we do at least get to see some of the great ideas courtesy of YouTube. For me the most significant was this one. (Thanks Roo for finding it first 🙂 )

It speaks for itself, but it does have some subtle little niceties. What it does show is a mainstream appreciation that we all have various avatars and visual personas that we engage with anywhere and everywhere: on mobile devices, in coffee shops.
Mainstream appreciation of the adoption of this way of interacting?

Just thinking out loud – Metaverse snapshot Mon, 26 Jan 2009 14:29:09 +0000

I moved offices today, and having a bright new whiteboard I could not leave it clean for long.
It’s not really a mindmap, just some association of thoughts and bits of linkages. I am sure it will alter, but right now this is what was in my head in a mad flurry. The underlying red part is really the substrate of the whole thing. These are just my personal thoughts, linked to some of the things I have seen and been involved with one way and another.

Thoughts on the metaverse
Note: edited to show a smaller version of the board, as it was cropping the important right-hand side for those who did not click through to Flickr. 3D printing FTW, and high-value professional social networks are on there too!

Rivers Run Red’s Retail Planogram in Second Life Wed, 14 Jan 2009 10:38:34 +0000

Thanks again to Malburns for spotting this and tweeting it. Rivers Run Red have released an example of an application layered onto immersive workspaces in Second Life. In this case it is around retail planning and visualization.

This is an example of the next layer of toolsets that we can expect to see across virtual worlds, as those virtual worlds become a platform not just a place.
Producing what-if scenarios, or mirror-world scenarios, needs the ability to simply sketch and examine the possibilities, whether it’s a retail store, a machine room or an intricate business model that cannot normally be visualized.
The exciting thing about this for us here at eightbar is that it brings us a step closer to being able to instrument the model with real live data via publish/subscribe methods such as MQTT. Merging the data from a smart planet into immersive visualizations that can be explored together, not stand-alone, is clearly a direction we have been pushing since even before the 2006 Wimbledon. Hursley is (for those who don’t know) the home of MQ messaging: reliable, publish/subscribe messaging.
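MQTT’s part in that picture is small but crucial: sensors publish to hierarchical topics and the visualization subscribes with wildcards, so neither side needs to know about the other. Here is a sketch of the topic-matching rule at the heart of that (the topic names are invented for illustration; per the MQTT spec, “+” matches exactly one level and “#” matches the remainder):

```python
def topic_matches(pattern, topic):
    """Return True if an MQTT subscription pattern matches a topic name."""
    p_parts = pattern.split("/")
    t_parts = topic.split("/")
    for i, p in enumerate(p_parts):
        if p == "#":                       # multi-level wildcard: matches the rest
            return True
        if i >= len(t_parts):
            return False
        if p != "+" and p != t_parts[i]:   # '+' matches any single level
            return False
    return len(p_parts) == len(t_parts)

# A store visualization might subscribe to every sensor in one shop:
# topic_matches("store/+/sensor/#", "store/hursley/sensor/aisle3/footfall")
```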

An odd comment to make on virtual worlds, Leo Laporte? Tue, 13 Jan 2009 15:13:48 +0000

Andy Piper was telling me that he was listening to a podcast with Leo Laporte in which he basically dissed Second Life as a gimmick and suggested it was not all that. Well, I did have a listen to the end of this podcast, and sure enough both Leo and Amber Macarthur made some throwaway comments about the value of Second Life. Now, to be fair, it was not a rant. In some ways it was about the time and effort required to engage, and they did give a shout-out to the great communities that have formed. Though Leo did use the word gimmick.
Clearly they only said Second Life, not virtual worlds in general, so they may have been reacting to a specific experience and press bubble, but it is an odd position to take for someone with a reputation for transitioning across media. Yes, it can take a little bit of time to engage with people in virtual worlds, but in many ways that is the point.
There is room for text, for twitter, for podcasts (that I sometimes find very time consuming to have to listen to), for virtual world events and for whats next.
Without virtual worlds we have no place to take this further, no mirror worlds, no augmented reality, no 3d printing/rapid fabrication.
It is of course different horses for courses, but I don’t think any of us in any field should consider excluding any of the others, or writing them off out of hand. Text and voice still work, of course.
The journey of discovering good ways to interact with one another is one I think we are all on, so I will just let them off this time 🙂
If they want a metaverse evangelist on the show to explain…. well happy to help.
