Hursley Emerging Tech on the News

Kevin Brown, who also featured in my previous Eightbar post, appears to be increasing his level of fame after appearing on Channel 4 News last night.

Kevin has done a lot of work with HCI (human-computer interfaces) and is leading the way in the Hursley Emerging Technology Services department. He has a huge interest in, and wealth of knowledge on, the topic, but the bleeding-edge HCI device catching people’s attention again at the moment is the brain-reading headset from Emotiv. Kevin has been working with this device for quite some time already, having for example used it with hospital patients, along with a wealth of other uses including driving cars. This gives a good indication of how far ahead of the curve our emerging tech team can be at times.

The Channel 4 News clip focuses on using the headset to drive cars and puts this in the context of Google’s self-driving car too. Here’s the video:

Mad thermostat plan

Something I’ve really wanted to have a go at for a long time is hacking together a smarter heating system. The long process of moving house prevented any progress until now, but I think a few things fell into place today to get the project off the ground. And so a slightly mad thermostat plan was hatched…

The first part of the puzzle is a side effect of getting a solar water panel; to make the most of the solar panel we should only be using the boiler to top up the hot water at the end of the day. (Obviously that’s just theoretical at the moment because it’s pretty much been raining non-stop since we got the solar panel!) Unfortunately the current central heating controller will only turn on the heating if the hot water is on at the same time, which is no help at all, so we really need a new controller to make the most of our zero-carbon supply of hot water. There’s another, purely aesthetic reason to want a new heating controller; the kitchen upgrade got under way this week and the old controller has seen better days.
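As a rough sketch of the top-up rule I’m after (the hour and temperature thresholds here are my own invented placeholders, purely illustrative): only fire the boiler late in the day, and only if the solar panel hasn’t already heated the cylinder enough.

```python
# Illustrative only: a sketch of the hot water "top-up" logic described
# above. The threshold values are invented placeholders, not measured.

SOLAR_TARGET_TEMP = 55   # degrees C we'd like the cylinder to reach
TOP_UP_HOUR = 17         # earliest hour the boiler may top up hot water

def boiler_should_heat_water(hour, cylinder_temp):
    """Only fire the boiler late in the day, and only if the solar
    panel hasn't already done the job."""
    return hour >= TOP_UP_HOUR and cylinder_temp < SOLAR_TARGET_TEMP
```

So a sunny afternoon that gets the cylinder up to temperature means the boiler never fires at all, which is the whole point of the exercise.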

The current kitchen destruction has a bigger part to play though; now is an ideal opportunity to hide cables behind the new cupboards. For a while that didn’t actually seem like it was going to be all that much help, based on where the old thermostat was (hidden behind a door in the living room). I was looking at various programmable thermostats but the existing wiring from the thermostat restricted the options somewhat. The programmable thermostat we had in the old house seemed to work quite well with the existing wiring and controller… as long as the battery was fresh, otherwise it got confused about the temperature. Obviously not ideal for a thermostat, so I was hoping to avoid batteries this time!

Then, while being distracted by the wonky light switches yet again, inspiration struck…

The house hasn’t been constructed with the greatest care in the world, but those switches just could not have been original. The only thing that makes sense is if they were another botched DIY job, and it seemed highly unlikely that anyone would have dropped another cable run down the wall to do it. My hunch, based on the fact that there’s a water cylinder directly above those switches, is that there’s a horizontal cable run between the two. I checked, and… eureka! So now it’s a simple job to put both switches back on the same box, leaving an empty recessed box with a now bare kitchen wall behind it, making it perfect to run a new thermostat cable through the back of the box and round to the boiler! (Well I was pretty excited by this plan at the time.)

The thermostat to finish off this puzzle is a Heatmiser combined programmable thermostat and hot water timer. My theory is that I need the PRT/HW-N thermostat to go in the living room and a PRC powered relay card in place of the old central heating controller. I’m almost certain that the wiring will work with the existing system anyway, but if anyone has any experience/tips/gotchas, please let me know! That programmable thermostat should give me an RS485 interface which, if all goes well, won’t be too difficult to connect to my Nanode – either with a bit of soldering, or with one of these IO shields if I’m feeling lazy! The thing I like about this arrangement is that it should be possible to achieve plenty of automation if all goes well but, if there are any technical hitches, there’s a decent off-the-shelf controller to fall back on.

Update: a quick note while I do some head-scratching over whether the existing wiring from the central heating timer to the junction box in the airing cupboard will allow the heating to run independently of the hot water. If it does, the new thermostat is in place ready to go…

If it doesn’t, the new thermostat will just be a decorative feature while I figure out where I can sneak a new cable upstairs without disturbing the new kitchen! I don’t want to break the heating until I’m sure everything will work, so I’m working off a photo for now…

I’d love to hear from anyone who can decipher that lovely nest of wires! Here’s my theory so far:

The black cable is the valve, and the other two cables that enter with it at the bottom are the pump and cylinder stat. It looks to me like the grey cable should be to turn the hot water off, which seems to be connected to the cylinder stat and a red wire from one of the cables above, which I’m hoping is from the timer. That just seems too easy for this house though, and I’m a bit puzzled by what the connections on the orange wire actually are. Lucky it’s all neatly connected and labelled so I can check the orange wire is connected to the cylinder stat and pump… bother. I guess I’m going to have to wait until Jo’s not looking so I can investigate more thoroughly!

Emerging Technology Services Interviews

The British Computer Society recently came to Hursley to interview some of the members of Emerging Technology Services about some of the work we’ve been doing recently. The results, as ever in ETS, are really interesting, so here is the set of video interviews, reposted for all you Eightbar subscribers out there.

To kick things off we have Bharat Bedi, IBM Master Inventor, talking about his work on the Universal Information Framework. This is an innovative idea that allows secure interactions that could benefit, for example, banks:

Another piece from Bharat Bedi, this time talking about his work on the Living Safe project, which runs in Bolzano, Italy, to help older residents who live by themselves:

Now something a little different from Kevin Brown, IBM Senior Inventor, talking about his work using a mind-reading headset. Here he gets Brian Runciman from the BCS to drive a car with his brain, training him to use a brain-wave-reading headset:

Next up we have Dominic Harries, IBM Emerging Technologies Specialist, talking about some of his work using a multi-user multi-touch surface. Here Dominic is demonstrating the use of a business application on the multi-touch table:

Last, but not least, we have Helen Bowyer, Emerging Technologies Manager, talking about her work on automatic sign language. Helen explains and demonstrates the Say It, Sign It (SiSi) project, which uses an avatar to translate spoken English into sign language.

The original content can be found at


This is my mood (as identified from my facial expressions) over time while watching Never Mind the Buzzcocks.

The green areas are times where I looked happy.

This shows my mood while playing XBox Live. Badly.

The red areas are times where I looked cross.

I smile more while watching comedies than when getting shot in the head. Shocker, eh?

A couple of years ago, I played with the idea of capturing my TV viewing habits and making some visualisations from them. This is a sort of return to that idea in a way.

A webcam lives on the top of our TV, mainly for skype calls. I was thinking that when watching TV, we’re often more or less looking at the webcam. What could it capture?

What about keeping track of how much I smile while watching a comedy, as a way of measuring which comedies I find funnier?

This suggests that, overall, I might’ve found Mock the Week funnier. But this shows my facial expressions while watching Mock the Week.

It seems that, unlike with Buzzcocks, I really enjoyed the beginning bit, then perhaps got a bit less enthusiastic after a bit.

What about The Daily Show with Jon Stewart?

I think the two neutral bits are breaks for adverts.

Or classifying facial expressions by mood and looking for the dominant mood while watching something more serious on TV?

This shows my facial expressions while catching a bit of Newsnight.

On the whole, my expression remained reasonably neutral whilst watching the news, but you can see where I visibly reacted to a few of the news items.

Or looking to see how I react to playing different games on the XBox?

This shows my facial expressions while playing Modern Warfare 3 last night.

Mostly “sad”, as I kept getting shot in the head. With occasional moments where something made me smile or laugh, presumably when something went well.

Compare that with what I looked like while playing Blur (a car racing game).

It seems that I looked a little more aggressive while driving than running around getting shot. For last night, at any rate.

Not just about watching TV

I’m using face recognition to tell my expressions apart from other people in the room. This means there is also a bunch of stuff I could look into around how my expressions change based on who else is in the room, and their expressions.

For example, looking at how much of the time I spend smiling when I’m the only one in the room, compared with when one or both of my kids are in the room.

To be fair, this isn’t a scientific comparison. There are lots of factors here – for example, when the girls are in the room, I’ll probably be doing a different activity (such as playing a game with them or reading a story) to what I would be doing when by myself (typically doing some work on my laptop, or reading). This could be showing how much I smile based on which activity I’m doing. But I thought it was a cute result, anyway.


This isn’t sophisticated stuff.

The webcam is an old, cheap one that only has a maximum resolution of 640×480, and I’m sat at the other end of the room to it. I can’t capture fine facial detail here.

I’m not doing anything complicated with video feeds. I’m just sampling by taking photos at regular intervals. You could reasonably argue that the funniest joke in the world isn’t going to get me to sustain a broad smile for over a minute, so there is a lot being missed here.

And my y-axis is a little suspect. I’m using the percentage level of confidence that the classifier had in identifying the mood. I’m doing this on the assumption that the more confident the classifier was, the stronger or more pronounced my facial expression probably was.

Regardless of all of this, I think the idea is kind of interesting.

How does it work?

The media server under the TV runs Ubuntu, so I had a lot of options. My language-of-choice for quick hacks is Python, so I used pygame to capture stills from the webcam.

For the complicated facial stuff, I’m using web services from

They have a REST API that you upload a photo to, getting back a blob of JSON with information about faces detected in the photo. This includes a guess at the gender, a description of mood from the facial expression, whether the face is smiling, and even an estimated age (often not complimentary!).
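For illustration, here is roughly the shape of the face data the script below picks apart. The field names are taken from how the code accesses the response; the sample values themselves are made up:

```python
# A made-up example of one detected face, shaped like the structures the
# monitoring script accesses (face['attributes'][...]['value'],
# face['uids'], face['tid'], and so on). All values here are invented.
sample_face = {
    "uids": [{"uid": "me@example", "confidence": 97}],
    "tid": "TEMP_F@abc123",
    "recognizable": True,
    "attributes": {
        "smiling": {"value": "true", "confidence": 93},
        "mood": {"value": "happy", "confidence": 71},
        "gender": {"value": "male", "confidence": 88},
    },
}

def attribute_value(face, attribute):
    """Safely pull out an attribute's value, the way the script's
    helper method does; returns None if the attribute is absent."""
    if attribute in face["attributes"]:
        return face["attributes"][attribute]["value"]
    return None
```

The confidence fields alongside each value are what I later use as the y-axis in the graphs.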

I used a Python client library from github to build the requests, so getting this working took no time at all.

There is a face recognition REST API. You can train the system to recognise certain faces. I didn’t write any code to do this, as I don’t need to do it again, so I did this using the API sandbox on the website. I gave it a dozen or so photos with my face in, which seemed to be more than enough for the system to be able to tell me apart from someone else in the room.

My monitoring code puts what it measures about me in one log, and what it measures about anyone else in a second “guest log”.

This is the result of one evening’s playing, so I’ve not really finished with this. I think there is more to do with it, but for what it’s worth, this is what I’ve come up with so far.

The script


# imports for capturing a frame from the webcam
import pygame.camera
import pygame.image

# import for detecting faces in the photo
import face_client

# import for storing data
from pysqlite2 import dbapi2 as sqlite

# miscellaneous imports
from time import strftime, localtime, sleep
import os
import sys

# configuration constants - placeholders to fill in with your own values
DB_FILE_PATH = "facelog.db"
FACE_COM_APIKEY = "your-api-key-here"
FACE_COM_APISECRET = "your-api-secret-here"
DALELANE_FACETAG = "your-registered-face-tag"

class AudienceMonitor():

    # prepare the database where we store the results
    def initialiseDB(self):
        self.connection = sqlite.connect(DB_FILE_PATH, detect_types=sqlite.PARSE_DECLTYPES|sqlite.PARSE_COLNAMES)
        cursor = self.connection.cursor()

        cursor.execute('SELECT name FROM sqlite_master WHERE type="table" AND NAME="facelog" ORDER BY name')
        if not cursor.fetchone():
            cursor.execute('CREATE TABLE facelog(ts timestamp unique default current_timestamp, isSmiling boolean, smilingConfidence int, mood text, moodConfidence int)')

        cursor.execute('SELECT name FROM sqlite_master WHERE type="table" AND NAME="guestlog" ORDER BY name')
        if not cursor.fetchone():
            cursor.execute('CREATE TABLE guestlog(ts timestamp unique default current_timestamp, isSmiling boolean, smilingConfidence int, mood text, moodConfidence int, agemin int, ageminConfidence int, agemax int, agemaxConfidence int, ageest int, ageestConfidence int, gender text, genderConfidence int)')


    # initialise the camera
    def prepareCamera(self):
        # prepare the webcam
        pygame.camera.init()
        self.webcam = pygame.camera.Camera(pygame.camera.list_cameras()[0], (900, 675))
        self.webcam.start()

    # take a single frame and store in the path provided
    def captureFrame(self, filepath):
        # save the picture
        image = self.webcam.get_image(), filepath)

    # gets a string representing the current time to the nearest second
    def getTimestampString(self):
        return strftime("%Y%m%d%H%M%S", localtime())

    # get attribute from face detection response
    def getFaceDetectionAttributeValue(self, face, attribute):
        value = None
        if attribute in face['attributes']:
            value = face['attributes'][attribute]['value']
        return value

    # get confidence from face detection response
    def getFaceDetectionAttributeConfidence(self, face, attribute):
        confidence = None
        if attribute in face['attributes']:
            confidence = face['attributes'][attribute]['confidence']
        return confidence

    # detects faces in the photo at the specified path, and returns info
    def faceDetection(self, photopath):
        client = face_client.FaceClient(FACE_COM_APIKEY, FACE_COM_APISECRET)
        response = client.faces_recognize(DALELANE_FACETAG, file_name=photopath)
        faces = response['photos'][0]['tags']
        for face in faces:
            userid = ""
            faceuseridinfo = face['uids']
            if len(faceuseridinfo) > 0:
                userid = faceuseridinfo[0]['uid']
            smiling = self.getFaceDetectionAttributeValue(face, "smiling")
            smilingConfidence = self.getFaceDetectionAttributeConfidence(face, "smiling")
            mood = self.getFaceDetectionAttributeValue(face, "mood")
            moodConfidence = self.getFaceDetectionAttributeConfidence(face, "mood")
            if userid == DALELANE_FACETAG:
                self.storeResults(smiling, smilingConfidence, mood, moodConfidence)
            else:
                agemin = self.getFaceDetectionAttributeValue(face, "age_min")
                ageminConfidence = self.getFaceDetectionAttributeConfidence(face, "age_min")
                agemax = self.getFaceDetectionAttributeValue(face, "age_max")
                agemaxConfidence = self.getFaceDetectionAttributeConfidence(face, "age_max")
                ageest = self.getFaceDetectionAttributeValue(face, "age_est")
                ageestConfidence = self.getFaceDetectionAttributeConfidence(face, "age_est")
                gender = self.getFaceDetectionAttributeValue(face, "gender")
                genderConfidence = self.getFaceDetectionAttributeConfidence(face, "gender")
                # if the face wasn't recognisable, it might've been me after all, so ignore
                if "tid" in face and face['recognizable'] == True:
                    self.storeGuestResults(smiling, smilingConfidence, mood, moodConfidence, agemin, ageminConfidence, agemax, agemaxConfidence, ageest, ageestConfidence, gender, genderConfidence)
                    print face['tid']

    # stores guest face results in the DB
    def storeGuestResults(self, smiling, smilingConfidence, mood, moodConfidence, agemin, ageminConfidence, agemax, agemaxConfidence, ageest, ageestConfidence, gender, genderConfidence):
        cursor = self.connection.cursor()
        cursor.execute('INSERT INTO guestlog(isSmiling, smilingConfidence, mood, moodConfidence, agemin, ageminConfidence, agemax, agemaxConfidence, ageest, ageestConfidence, gender, genderConfidence) values(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)',
                        (smiling, smilingConfidence, mood, moodConfidence, agemin, ageminConfidence, agemax, agemaxConfidence, ageest, ageestConfidence, gender, genderConfidence))
        self.connection.commit()

    # stores my face results in the DB
    def storeResults(self, smiling, smilingConfidence, mood, moodConfidence):
        cursor = self.connection.cursor()
        cursor.execute('INSERT INTO facelog(isSmiling, smilingConfidence, mood, moodConfidence) values(?, ?, ?, ?)',
                        (smiling, smilingConfidence, mood, moodConfidence))
        self.connection.commit()

monitor = AudienceMonitor()
monitor.initialiseDB()
monitor.prepareCamera()
while True:
    photopath = "data/photo" + monitor.getTimestampString() + ".bmp"
    monitor.captureFrame(photopath)
    try:
        faceresults = monitor.faceDetection(photopath)
    except:
        print "Unexpected error:", sys.exc_info()[0]
    os.remove(photopath)
    sleep(60)  # sample roughly once a minute
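With the log populated, the graphs above can be produced by turning each row into a signed value: positive when the classifier saw a happy expression, negative for a sad or cross one, scaled by its confidence. The sign convention and mood groupings here are my own choice, just for illustration:

```python
# Sketch: turn the facelog table into a signed time series for graphing.
# The mood names and the positive/negative grouping are my own
# assumptions, not part of the original script.
import sqlite3

POSITIVE_MOODS = set(["happy", "surprised"])
NEGATIVE_MOODS = set(["sad", "angry", "disgusted"])

def mood_series(conn):
    """Read the facelog table and return (timestamp, value) pairs,
    where value is the mood confidence, signed by mood."""
    series = []
    for ts, mood, confidence in conn.execute(
            "SELECT ts, mood, moodConfidence FROM facelog ORDER BY ts"):
        if mood in POSITIVE_MOODS:
            series.append((ts, confidence))
        elif mood in NEGATIVE_MOODS:
            series.append((ts, -confidence))
        else:
            series.append((ts, 0))
    return series
```

Feeding the resulting pairs into any plotting tool gives the green-above, red-below style of graph shown earlier.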

Why Doctor Who Confidential mattered

Behind-the-scenes documentaries, like Doctor Who Confidential, matter. They matter because they show viewers, in particular children still deciding what to do with their lives, that it takes more to produce a high-class TV programme than just a few actors who become famous. They show what other creative and/or technical jobs there are in television.

A couple of weekends ago, we went to the Doctor Who Official Convention (#dwcuk) in Cardiff. While one of the three main panels featured the three stars, Matt Smith, Karen Gillan, and Arthur Darvill (along with executive producers Steven Moffat and Caroline Skinner), most of the other scheduled events were focused on how Doctor Who is made.


At the very start of the day, we went to see Danny Hargreaves blow things up and talk about the special effects on Doctor Who. In his Q&A session (after making it snow indoors), the first question asked was “How did you get into special effects work?” and, between questions like how he blew up the Torchwood Hub and how he makes the Doctor’s hands and head fiery during a regeneration, a later question was “When did you realise you wanted to work in special effects?”. Attendees were interested not just in the fictional stories and characters but in how the programme is made and the interesting careers they might not otherwise have come across.


Throughout the day, I heard audience members ask how to become costume and prosthetics designers and how to become script writers. Danny described how his team designs and creates the effects, assess the risks of blowing things up, and who they work with to make it all happen. He also explained how he came to be a trainee in the nascent world of special effects before studying Mechanical Engineering so that he could build the devices they need for Doctor Who (and the other shows he’s worked on, like Coronation Street). Directors of photography, set designers, executive producers, writers, and directors went on to talk about what their own jobs entailed day-to-day and how it all comes together to make an episode of Doctor Who.

These discussions continued the story that used to be told after each new episode of Doctor Who by Doctor Who Confidential on BBC3. Doctor Who Confidential started in 2005 with the return of Doctor Who. As well as offering some interesting perspectives on the making of that night’s episode, it featured interviews with, and ‘day-in-the-life’ documentaries about, the actors (including showing the less glamorous side of shivering in tents and quilted coats between takes), the casting directors, the producers, the writers, the choreographers, the costume designers, the special effects supervisors, the monster designers, the prosthetics experts, the directors, the assistant directors, and many, many others. It also held competitions for children to write a mini episode and then see the process of making it, which would’ve been an amazing experience!

Yes, it took a slightly odd turn in the last series when it turned a bit Top Gear by showing Karen Gillan having a driving lesson and Arthur Darvill swimming with sharks; possibly a misguided attempt to increase its popularity before it got canned anyway to cut costs.

I think it’s a real shame to lose Doctor Who Confidential and its insights into the skill, hard work, and opportunities in TV and film production.

Cool photo of Danny in the snow by Tony Whitmore.

The post Why Doctor Who Confidential mattered appeared first on

Reflecting on our total home energy usage

The graph of our total gas usage per year doesn’t decrease quite so impressively as our electricity graph, which I blogged about halving over five years. Because the numbers were getting ridiculously big and difficult to compare at a glance, I’ve re-created the electricity graph here in terms of our average daily electricity usage instead of our annual usage (click the graph to see a larger version):

Graph of daily electricity usage per year.


If you compare it with the average daily gas usage graph below, you can see (just from the scales of the y-axes) that we use much more gas than electricity (except in 2007, which was an anomalous year because we didn’t have a gas fire during the winter so we used an electric halogen heater instead):


Graph of daily gas usage per year.

Our gas usage has come down overall since 2005 (from 11280 kWh in 2005 to 8660 kWh in 2011; or 31 kWh per day to 24 kWh per day on average) but not so dramatically as our electricity usage has. Between 2005 and 2011, we reduced our electricity usage by about a half and our gas usage by about a quarter.

Gas, in our house, is used only for heating rooms and water. So if I were to chart the average outside temperatures of each year, they’d probably track reasonably closely to our gas usage. In 2005 (when we used an average of 31 kWh per day), we still had our old back boiler (with a lovely 1970s gas fire attached) which our central heating installer reckoned was about 50% efficient. In 2006 (26 kWh per day), we replaced it with a new condensing boiler (apparently 95% efficient) but didn’t replace the gas fire until mid-2007 (the dodgy year that doesn’t really count). In 2006, we also had the living-room (our most heated room) extended so it had a much better insulated outside wall, door, and window. These changes could explain the pattern of reducing gas usage year by year up till then.

Old boiler being removed

In 2009, January saw sub-zero temperatures and it snowed in November and December. I think that must be the reason why our usage for the whole year shot back up again, despite the new boiler, to 31 kWh per day. In 2010 (21 kWh per day), it was again very cold and snowy in January; I think the slight dip in gas usage that year compared with both 2008 (25 kWh per day) and 2011 (24 kWh per day) was down to a problem with the gas fire that meant we used the electric halogen heater again during the coldest month. In 2011 it snowed in January but was fairly mild for the rest of the year.

I think 2008, 2010, and 2011 probably represent ‘typical’ years of heating our house with its new boiler and gas fire. Like I concluded about reducing our electricity usage, I think our gas usage went down mostly through better insulation and a more efficient boiler. But a couple of years ago we also reduced the default temperature of our heating thermostat to about 17 degrees C (instead of 20 degrees C); we increase it when we need to, but it stays low if we don’t. I think that has made some difference too, though it’s hard to tell when our heating usage is so closely tied to the outside temperature. Also, we don’t currently have any way of separating out our water heating from our central heating, or our gas fire from the boiler.

Of course, what really matters overall is the total amount of energy we use (that is, the gas and electricity numbers combined). So I’ve made a graph of that too. Now we’re talking numbers like 48 kWh per day in 2005 to 33 kWh per day in 2011.


Graph of total daily energy usage per year.

Overall, that means we reduced our total energy usage by about one-third over seven years.

Thanks again to @andysc for helping create the graph from meter readings on irregular dates.
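The awkward part of that job is that meter readings rarely fall on neat year boundaries. A rough sketch of the general approach (my own reconstruction for illustration, not the actual code used for these graphs): linearly interpolate between readings to estimate the meter value on arbitrary dates, then divide the difference by the days elapsed.

```python
# Sketch: estimate average daily usage between two dates from meter
# readings taken on irregular dates. My own illustration of the idea,
# not the code actually used for the graphs in this post.
from datetime import date

def interpolate_reading(readings, when):
    """Estimate the meter reading on 'when' by linear interpolation.
    'readings' is a date-sorted list of (date, kWh) tuples that spans
    the requested date."""
    for (d0, r0), (d1, r1) in zip(readings, readings[1:]):
        if d0 <= when <= d1:
            fraction = (when - d0).days / float((d1 - d0).days)
            return r0 + fraction * (r1 - r0)
    raise ValueError("date outside the range of readings")

def average_daily_usage(readings, start, end):
    """Average kWh per day used between 'start' and 'end'."""
    used = interpolate_reading(readings, end) - interpolate_reading(readings, start)
    return used / float((end - start).days)
```

So readings taken in, say, November 2010 and February 2011 can still yield an estimated figure for the whole of 2011.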

The post Reflecting on our total home energy usage appeared first on

Failing to Invent

We IBM employees are encouraged, indeed incented, to be innovative and to invent. This is particularly pertinent for people like myself working on the leading edge of the latest technologies. I work in IBM Emerging Technologies, which is all about taking the latest available technology to our customers; we do this in a number of different ways, but that’s a blog post in itself. Innovation is often confused with, or used interchangeably with, invention, but they are different: invention, for IBM, means patents, patenting, and the patent process. That is, if I come up with something inventive I’m very much encouraged to protect that idea using patents, and there are processes and help available to allow me to do that.

This comic strip really sums up what can often happen when you investigate protecting one of your ideas with a patent. It struck me recently, while out to dinner with friends, that there’s nothing wrong with failing to invent, as the cartoon above says Leibniz did. It’s the innovation that’s important here, and unlucky for Leibniz that he wasn’t seen to be inventing. It can be quite difficult to think of something sufficiently new that it is patent-worthy, and this often happens to me and those I work with while trying to protect our own ideas.

The example I was drawing upon on this occasion was an idea I was discussing at work with some colleagues about a certain usage of your mobile phone [I’m being intentionally vague here]. After thinking it all through, we came to the realisation that, while the idea was good and the solution innovative, all the technology was already known, available, and assembled in the way we were proposing, but used somewhere completely different.

So, failing to invent is no bad thing. We tried, and on this particular occasion decided we could innovate but not invent. Next time things could be the other way around, but according to these definitions we shouldn’t be afraid to innovate at the price of invention anyway.

Where are they now?

Ian Hughes / epredator

As part of the recent reorganisation of the Eightbar site, I’ve been catching up with some of the honoured past Eightbar members. We say “past” in the loosest sense, of course; Eightbar was set up with the principle that “Once you’re Eightbar, you’re always Eightbar”. Here, I manage to muscle in on some of Ian Hughes’ (a.k.a. epredator) time, as he’s kindly answered some questions for us. What follows is a 10-question interview-style post where I talk to Ian about life after IBM – in more than 140 characters. I think it’s a really interesting read, enjoy…!

Ian, you worked for IBM for a long time (somewhere around 20 years!) before making the big decision to leave and form your own start-up at Feeding Edge nearly 3 years ago!

1. What have you found are the main things keeping you busy now?

Just as when I was at IBM my work life is very varied. Living and working with technology and social changes, and being a bit of a polymath I find myself mixing a lot of skills.

Sometimes I am coding or combining code, usually on open source platforms but often in Unity3D, building some game elements for a startup. Other times I am on the conference circuit helping people to see the future by showing examples of how various things have changed already and how they link together to form a disruptive future – i.e. carrying on as an evangelist.

Much of this is still related to virtual worlds because they form a social and technical glue that still surprises many people only just getting to grips with Twitter and Facebook.

2. We’ve seen your continued rise to stardom on the ITV programme The Cool Stuff Collective – how did that come about?

Stardom is a very strong word 🙂 It was an ambition I had tucked away to do some more TV work. Like many things, though, it was serendipity that brought it about.

As I still blog many of my ideas and thoughts about interesting advances, many of my friends still read that. A good friend and IBMer, Scotty (Kevin Scott / @starbase37), had told his friend John Marley (@marleyman007), who runs the TV production company Archie Productions, about all the stuff I was talking about: games, 3D printing, virtual worlds, and so on. So we got connected and had a meeting about a new show John was looking to start.

The aim of the meeting was really a friendly catch-up and for me to give John a list of things that he could put on his show. Somewhere in the conversation he said “and then you will come on set and explain that to camera and the other presenter?” – at which point I still thought he wanted me to be a tech advisor for the kids’ show. Then it clicked and I realised I was being thrown in at the deep end. It was one of the few shows ITV/CITV has commissioned over the past few years.

So really, because I have always shared what I know, and used the web and social media to explain and offer a kind of open-source advice, I ended up with a character and role on the show – which we have done 3 series of too!

Cue Showreels 🙂 TV Showreel

3. You must enjoy being the CSC resident g33k and teaching the viewers, what do you learn from them?

It has been the most fun and rewarding thing I have done. In the third series in particular, we moved from a studio and just the crew to being on location with schools, in a Top Gear style. Whilst we were making a show for a mass audience, it became even more important to be able to reach kids directly. I learned, and re-learned, that the willingness to go with the flow on some ideas, just because they are cool, is still a magical thing. The things I say on the show are the same things I say in boardrooms and at conferences. The kids put many adults to shame though in not worrying straight away about ROI or marketing blurb. They get the idea and then fly with it.

It was also great to be able to reclaim geek/g33k. In a few schools the kids who were the tech geeks were suddenly allowed to be cool too. After all there was a bloke off the telly they could talk to.

We always had questions at the end of my future tech slot and I often didn’t get to know what they were up front, they were their questions and they were always taking me by surprise with their new angles or just the depth of understanding they showed. Once again putting many adults to shame.

4. Your time on Eightbar was mainly filled with virtual worlds work. What’s going down with the 3D Internet now – has it progressed as you thought?

It’s interesting, as in many ways parts of the metaverse are now so mainstream, yet still not so much in the “business” world as you may have expected. We know that people tend to have to evolve through things, hence understanding the power of connection in social media is still a struggle for many decision makers in business. In a time of global recession with restricted travel, it seems that the obvious use of virtual environments for communication and understanding is still not being exploited. Much of this is due to people being risk averse when they think their jobs are on the line. I find that many of the things we do and talk about are still reaching an audience who then say “wow, I didn’t think of it like that”.

When they are used in their various forms they have a huge impact. Imperial College have some of the best examples: even a simple OpenSim environment used to help people plan a particular event showed up, within the first five minutes, real-world procedures that needed fixing, which saved more than money.

Lots of companies who were virtual world providers have floundered, but equally lots of their code is now open source. At the same time, much of the games industry has been turned on its head by the arrival of Minecraft, a “game” that uses co-creation tools live in the environment. It has done a lot to help the games industry (who also did not understand virtual worlds of this sort) to look and say “oh! that’s what it’s all about”.

So none of it has gone away. It hit the usual Gartner trough of disillusionment after the confused hype and is now ploughing up the right slope.

Regular business will get hit with a Minecraft moment though. A game changer in the same way open source software hit the IT industry, or Amazon hit retail. It’s just about being prepared to go with it when it arrives.

Another great development has been the ability to self-build game tech environments with products like Unity3D (a huge nod to Rob Smart for spotting Unity3D way back too!) and to have socket servers like Photon and SmartFoxServer.

I should also mention gamification, a horrible word, another thing for people to misunderstand, yet it covers the principles of applying both gaming and game technology in places they have not been before. It is often used in a lazy fashion, slapping badges on things and giving out points; however, at its heart the elements of playing with identity and expression online within a virtual environment in a business context provide way more benefit.

5. What has the past 3 years done for 3d printing, another of your interest areas?

3D printing has gone from strength to strength. It is appearing in more places, and more often than not people have already seen something about it when I talk about it. It is linked to the virtual worlds work: when you consider that a virtual environment is often about distributing digital assets from one place to another, you bolt a 3D printer on the end of that and you get digital design and distribution of physical product, and the world changes.

The increase in open source builds like the RepRap makes the hobby end of this accessible (around £400 of bits to build one). MakerBot provide some very cheap but clever printers too, which were featured heavily at CES 2012 (the Consumer Electronics Show; note the Consumer in that 🙂 !). Services that print for you, like Shapeways, initially funded by Philips, have grown and moved to New York.

It is still something that, when someone has never seen it, they think it is witchcraft, somewhat like Google used to seem to people 🙂 That magic is nice to share, but extrapolating that change to the entire world economy and manufacturing business then scares and excites in equal measure.

6. What would you like to see Eightbar doing more/less of after the departure of Andy Piper from Hursley recently?

When we all set up eightbar it was an antidote to the west coast US tech bloggers getting all the kudos. We’re doing some great things over here too 🙂 Just as tech blogging has evolved I would love to see eightbar carrying on as a mini brand and a voice of that same attitude wherever it needs to be.

7. Looking back at IBM, any regrets about leaving? Things you miss?

I miss all the people, well nearly all 😉 Though in reality much of the work was with people all over the world, having a base of people in the same timezone and same place, eating lunch in the same canteen, provides an anchor. As does having to battle the same corporate resistance to change, or political short-sightedness. There are still a great many sparky, slightly subversive (but for the right reasons) renegade thought leaders under the radar at IBM.

Oh and the regular pay 🙂

8. What’s been the best thing about moving on?

Diversity of experiences and freedom to explore them. Like the TV work, which came about just because of being open-minded and master of my own calendar. I like to link everything, to let one piece of work and its ideas flow into another. That is tricky in a billable utilisation environment when you are not in control of the finances and the workload. It is why big corporations will keep getting side-swiped by very small, fast-moving organisations with huge world connectivity at their fingertips.

I have also had to learn a lot about the various forms and processes needed to run even the smallest Ltd company. It’s an odd and archaic system, but they are the rules 🙂 It has also been fun picking various ideas, developing them, and getting people with the money interested. It all gets very Dragons’ Den.

Freedom also allows me to try and pick things based on whether I think they are beneficial in some way, not just because they are there. I have always prided myself on trying to act honourably in everything and with positive principles. So now it is up to me to stick to that and help others try to do the same.

9. Your personal life and work-life balance must have adjusted, what does a day in the life of epredator look like now you’re self-employed?

Aha! I called myself self employed once and my accountant was quick to point out I am not 🙂 This is part of what I was saying about companies and rules. As Feeding Edge is a limited company, it is a legal entity in its own right that I happen to be a director of. At the same time there is a person on its official payroll, an employee… me 🙂 So as many twists and turns in business language as in any piece of tech 🙂

My day is much more thinly sliced than ever. I get up, check a few streams of information, spot anything urgent, then do the school run, back home for a 45-minute workout on UFC Trainer on the Kinect, then do some calls whilst cooling down. Most of the day is spent talking to the US and/or my other business partners around the gaming startup we have, building some code, pitching how bizarre the idea is. This is usually interspersed with contacts from previous conferences getting in touch, or some BCS Animation and Games Dev SG business. Several times a month I pop along to a convention or meeting to talk about tech, usually with Cool Stuff Collective as a backdrop. So the cycle continues.

Then there are the ad hoc conversations around other possible TV shows, or helping other startup businesses who are focussed on using new tech, with some connections or ideas.

Evenings are a mix of cooking for the family, putting the kids (predlets) to bed, some gaming, heading to a Choi Kwang Do class, or late night calls with the US west coast for an interview or in Second Life.

However, there is no start or end to a working day; a tweet on the way back from the school run may lead to something as much as a scheduled Skype call at 2pm. The emphasis is still on talking and sharing online.

10. Finally, give us a plug for Feeding Edge, who might I be if I were your customer and what might you be able to do for me?

Feeding Edge is a vehicle for people to get help from me, consulting or hands-on development. As I say, I am taking a bite out of technology so you don’t have to. All the years of experience with corporate tech, and now several years out in the wild having to use what I talk about, give me a view on the world that many people don’t have time to consider: in person, in writing, on TV, on stage, in the lab. I cover how technology feels and changes your life as much as the more obvious ‘version x versus version y’ tech.

In conferences I am usually the one put there to shake everybody up. So if you need a jolt of inspiration and a view of the future, well, that’s Feeding Edge and epredator. Cue show reels again 🙂

Well, that’s it from Ian for now. It’s really good to hear him talking in a wider context again; reading about him drawing inspiration from such a wide variety of sources is really refreshing. It’s certainly reminded me to go “heads up” more often than I generally manage to, so easy is it to keep too narrow a view of your immediate work tasks.

Thanks Ian, it’s been a pleasure – as always!

Halving our electricity usage

I learnt something interesting today: between 2007 and 2011, we halved the amount of electricity we use in our house:

Total electricity usage per year (kWh)

In 2007, we used 6783 kWh of electricity (for electricity, a kilowatt hour is the same thing as a ‘unit’ on your bill). In 2011, by contrast, we used 3332 kWh (or ‘units’). 2007 was slightly on the high side (compared with 2006) because we had no gas fire in the living-room during the winter of 2006-7 so we’d used an electric oil heater during the coldest weeks (we don’t have central heating in that room) 1.

That’s an average of 19 kWh per day in 2007 compared with 9 kWh per day in 2011. Which is quite a difference. So what changed?

In early 2008, I got a plug-in Maplin meter (similar to this one) and one of the very early Current Cost monitors, which display in real-time how much electricity is being used in your whole house:

A classic Current Cost monitor

Aside from the fun of seeing the display numbers shoot up when we switched the kettle on, it informed us, more usefully, that when we went to bed at night or out to work, our house was still using about 350 Watts (which is 3066 kWh per year)2 of electricity. That’s when the house is pretty much doing nothing. Nothing, that is, apart from powering:

  • Fridge
  • Freezer
  • Boiler (gas combi boiler with an electricity supply)
  • Hob controls and clock
  • Microwave clock
  • Infrared outside light sensor
  • Print/file server (basically a PC)
  • Wireless access point
  • Firewall and Internet router
  • DAB clock radio
  • ADSL modem
  • MythTV box (homemade digital video recorder; basically another PC)

And that’s the thing: this ‘baseline’ often makes a lot of difference to how much electricity a house uses overall. 3066 kWh per year was 45% of 2007’s total electricity usage.
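The watts-to-kWh conversion behind these figures (spelled out in the second footnote) is simple enough to sketch; here’s a minimal Python version using the numbers from this post:

```python
# Convert a constant ("baseline") load in watts to energy used per year:
# watts / 1000 gives kilowatts; multiplying by the 8760 hours in a year
# gives kilowatt hours (the 'units' on an electricity bill).
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_kwh(watts: float) -> float:
    """Annual energy (kWh) for a load drawing `watts` continuously."""
    return watts / 1000 * HOURS_PER_YEAR

print(annual_kwh(350))         # original baseline: 3066.0 kWh per year
print(annual_kwh(100))         # the 'essentials': 876.0 kWh per year
print(round(annual_kwh(230)))  # reduced baseline: 2015 kWh per year
```

Handy for working out what any always-on gadget costs you over a year.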

The first six items on that list draw less than 100 Watts (876 kWh per year) altogether. They’re the things that we can’t really switch off. But there were clearly things that we could do something about.

Over the next couple of years, we reduced our baseline by about 100 Watts by getting rid of some of the excessive computer kit, buying more efficient versions when we replaced the old print/file server and MythTV box, and replacing most of our lightbulbs with energy-efficient equivalents. We also, importantly, changed our habits a bit: we got more careful about switching lights off when we weren’t using them (which wouldn’t affect the baseline but does affect the overall energy usage), and about switching off, say, the stereo amplifier when not in use.

That brought our baseline down to about 230 Watts (2015 kWh per year), which is a lot better, though it’s still relatively high considering that the ‘essentials’ (eg fridge and freezer) contribute less than half of that.

And that’s about where we are now. We tended to make changes in fits and starts but none of it has been that arduous. I don’t think we’re living much differently; just more efficiently.

1The complementary gas usage graph shows lower gas for that year for the same reason; I’ll blog about gas when I have a complete set of readings for 2011.
2350 Watts divided by 1000, then multiplied by 8760 hours in a year.
Photo of the Current Cost monitor was by Tristan Fearne.
Thanks also to @andysc for helping create the graph from meter readings on irregular dates.


UX hack at London Green Hackathon

At the London Green Hackathon a few weeks ago, the small team that had coalesced around our table (Alex, Alex, Andy, and me) had got to about 10pm on Saturday night without a good idea for a hack, in this case a piece of cool software relevant to the theme of sustainability. We were thinking about creating a UK version of the US-based Good Guide app using their API, to which we had access. The Good Guide rates products according to their social, environmental, and health impacts; the company makes this data available through an API, an interface that programmers can use to write applications. Good Guide uses this API itself to produce a mobile app which consumers can use to scan product barcodes to get information about them before purchase.

Discussing ideas

The problem is that the 60,000 products listed in the Good Guide are US brands. We guessed that some would be common to the UK though. We wondered if it would be possible to match the Good Guide list against Amazon’s product list so that we could look up the Good Guide information about at least those products. Unfortunately, when we (Andy) tried this, we discovered that Amazon uses non-standard product IDs on its site, so it wasn’t possible to match the two product lists.
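A join like this only works when both lists key their products by a shared standard identifier, such as an EAN/UPC barcode; Amazon’s own ASINs are not that, which is why the matching failed. Here’s a minimal Python sketch of the join we had in mind (the data and field names are made up for illustration, not taken from the real Good Guide API):

```python
# Sketch: match two product lists on a shared standard identifier
# (e.g. an EAN/UPC barcode). Field names here are hypothetical.
def match_products(us_products, uk_products):
    """Return UK products whose barcode also appears in the US list."""
    us_by_barcode = {p["barcode"]: p for p in us_products if p.get("barcode")}
    return [
        {"uk": uk, "us": us_by_barcode[uk["barcode"]]}
        for uk in uk_products
        if uk.get("barcode") in us_by_barcode
    ]

matches = match_products(
    [{"name": "Cola 330ml", "barcode": "0049000042566"}],
    [{"name": "Cola 330ml (UK)", "barcode": "0049000042566"},
     {"name": "Tea bags", "barcode": None}],  # no barcode: can't be matched
)
```

With a proprietary ID scheme on one side, the lookup dictionary simply never finds a key, which is essentially what we ran into.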

The equivalent of the Good Guide in the UK is The Good Shopping Guide, of which we had an old copy handy. The Good Shopping Guide is published each year as a paperback book which, while nicely laid out, isn’t that practical to carry with you when shopping. We discovered that The Ethical Company (who produce the Good Shopping Guide) have also released an iPhone app of the book’s content, but it hasn’t received especially good reviews; a viewing of the video tour of the app seems to reveal why.

Quite late at night

By this point it was getting on for midnight and the two coders in our team, Andy and Alex, had got distracted hacking a Kindle. Alex and I, therefore, decided to design the mobile app that we would’ve written had we (a) had access to the Good Shopping Guide API and (b) been able to write the code needed to develop the app.

While we didn’t have an actual software or hardware hack to present back at the end of the hackathon weekend, we were able to present our mockups which we called our ‘UX hack’ (a reference to the apparently poor user experience (UX) of the official Good Shopping Guide mobile app). Here are the mockups themselves, along with a summary of the various ideas our team had discussed throughout the first day of the hackathon:
