Halving our electricity usage

I learnt something interesting today: between 2007 and 2011, we halved the amount of electricity we use in our house:

Total electricity usage per year (kWh)

In 2007, we used 6783 kWh of electricity (for electricity, a kilowatt hour is the same thing as a ‘unit’ on your bill). In 2011, by contrast, we used 3332 kWh (or ‘units’). 2007 was slightly on the high side (compared with 2006) because we had no gas fire in the living-room during the winter of 2006-7 so we’d used an electric oil heater during the coldest weeks (we don’t have central heating in that room) 1.

That’s an average of 19 kWh per day in 2007 compared with 9 kWh per day in 2011. Which is quite a difference. So what changed?

In early 2008, I got a plug-in Maplin meter (similar to this one) and one of the very early Current Cost monitors, which display in real-time how much electricity is being used in your whole house:

A classic Current Cost monitor

Aside from the fun of seeing the display numbers shoot up when we switched the kettle on, it more usefully told us that when we went to bed at night or out to work, our house was still using about 350 Watts (which is 3066 kWh per year)2 of electricity. That’s when the house is pretty much doing nothing. Nothing, that is, apart from powering:

  • Fridge
  • Freezer
  • Boiler (gas combi boiler with an electricity supply)
  • Hob controls and clock
  • Microwave clock
  • Infrared outside light sensor
  • Print/file server (basically a PC)
  • Wireless access point
  • Firewall and Internet router
  • DAB clock radio
  • ADSL modem
  • MythTV box (homemade digital video recorder; basically another PC)

And that’s the thing: this ‘baseline’ often makes a lot of difference to how much electricity a house uses overall. 3066 kWh per year was 45% of 2007’s total electricity usage.
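The sums behind these figures are just the conversion from a constant draw in Watts to energy over a year (as footnote 2 below describes). A quick Python sketch of that conversion:

```python
# Convert a constant standby draw in Watts to annual energy in kWh:
# Watts multiplied by the hours in a year, divided by 1000.

HOURS_PER_YEAR = 24 * 365  # 8760

def annual_kwh(watts):
    """Annual energy use (kWh) of a constant load of `watts` Watts."""
    return watts * HOURS_PER_YEAR / 1000.0

print(annual_kwh(350))  # the ~350 W baseline we measured
```

annual_kwh(350) gives 3066 kWh, and the same sum for the later 230 W and 100 W figures gives roughly 2015 kWh and 876 kWh, matching the numbers in the text.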

The first six items on that list draw less than 100 Watts (876 kWh per year) altogether. They’re the things that we can’t really switch off. But there were clearly things that we could do something about.

Over the next couple of years, we reduced our baseline by about 100 Watts by getting rid of some of the excessive computer kit, buying more efficient versions when we replaced the old print/file server and MythTV box, and replacing most of our lightbulbs with energy-efficient equivalents. We also, importantly, changed our habits a bit and just got more careful about switching lights off when we weren’t using them (which wouldn’t affect the baseline but does affect the overall energy usage), and switching off, say, the stereo amplifier when we’re not using it.

That brought our baseline down to about 230 Watts (2015 kWh per year), which is a lot better, though it’s still relatively high considering that the ‘essentials’ (eg fridge and freezer) contribute less than half of that.

And that’s about where we are now. We tended to make changes in fits and starts but none of it has been that arduous. I don’t think we’re living much differently; just more efficiently.


1The complementary gas usage graph shows lower gas for that year for the same reason; I’ll blog about gas when I have a complete set of readings for 2011.
2350 Watts divided by 1000, then multiplied by 8760 hours in a year.
Photo of the Current Cost monitor was by Tristan Fearne.
Thanks also to @andysc for helping create the graph from meter readings on irregular dates.

The post Halving our electricity usage appeared first on LauraCowen.co.uk.

UX hack at London Green Hackathon

At the London Green Hackathon a few weeks ago, the small team that had coalesced around our table (Alex, Alex, Andy, and me) had got to about 10pm on Saturday night without a good idea for a hack, in this case a piece of cool software relevant to the theme of sustainability. We were thinking about creating a UK version of the US-based Good Guide app, using their API, to which we had access. The Good Guide rates products according to their social, environmental, and health impacts; the company makes this data available through an API, an interface that programmers can use to write applications. Good Guide uses this API itself to produce a mobile app which consumers can use to scan the barcodes of products to get information about them before purchase.

Discussing ideas

The problem is that the 60,000 products listed in the Good Guide are US brands. We guessed that some would be common to the UK, though. We wondered if it would be possible to match the Good Guide list against the Amazon.co.uk product list so that we could look up the Good Guide information about at least those products. Unfortunately, when we (Andy) tried this, we discovered that Amazon uses non-standard product IDs on its site, so it wasn’t possible to match the two product lists.
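In other words, it was a join with no shared key: Good Guide entries are reachable by barcode, while Amazon keys products by its own internal IDs. Had we carried on, the obvious fallback would have been fuzzy matching on product names, roughly this sort of thing (a hypothetical sketch with made-up product names; `difflib` is the Python standard library’s similarity matcher, not anything we actually ran on the night):

```python
import difflib

def match_products(goodguide_names, amazon_names, cutoff=0.8):
    """Map each Good Guide product name to its closest Amazon name, if any."""
    matches = {}
    for name in goodguide_names:
        # Best single candidate above the similarity cutoff, or nothing.
        close = difflib.get_close_matches(name, amazon_names, n=1, cutoff=cutoff)
        if close:
            matches[name] = close[0]
    return matches

# Made-up examples: near-identical names match, unrelated ones don't.
print(match_products(["Fairtrade Coffee 200g"],
                     ["Fairtrade Coffee 200 g", "Green Tea 50 bags"]))
```

Name matching like this is noisy, of course, which is exactly why a shared barcode-style identifier would have made the hack so much easier.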

The equivalent of the Good Guide in the UK is The Good Shopping Guide, of which we had an old copy handy. The Good Shopping Guide is published each year as a paperback book which, while a nicely laid out read, isn’t that practical for carrying with you to refer to when shopping. We discovered that The Ethical Company (who produce the Good Shopping Guide) have also released an iPhone app of the book’s content but it hasn’t received especially good reviews; a viewing of the video tour of the app seems to reveal why.

Quite late at night

By this point it was getting on for midnight and the two coders in our team, Andy and Alex, had got distracted hacking a Kindle. Alex and I, therefore, decided to design the mobile app that we would’ve written had we (a) had access to the Good Shopping Guide API and (b) been able to write the code needed to develop the app.

While we didn’t have an actual software or hardware hack to present back at the end of the hackathon weekend, we were able to present our mockups which we called our ‘UX hack’ (a reference to the apparently poor user experience (UX) of the official Good Shopping Guide mobile app). Here are the mockups themselves, along with a summary of the various ideas our team had discussed throughout the first day of the hackathon:

The post UX hack at London Green Hackathon appeared first on LauraCowen.co.uk.

BBC looking at mind control

Katia Moskvitch from the BBC has just published a nice article on using the mind to control technology.

As part of the article, as well as trying out the Emotiv headset*, she interviewed Ed Jellard and Kevin Brown from the IBM ETS team based in Hursley.

* This is the same headset used for the Bang Goes The Theory Taxi racing.

MQTT powered video wall

Scaling things up a little from my first eightbar post.

This was one of those projects that just sort of “turned up”. About 3 weeks ago one of the managers for the ETS department in Hursley got a call from the team building the new IBM Forum in IBM South Bank. IBM Forums are locations where IBM can showcase technologies and solutions for customers. The team were looking for a way to control a video wall and a projector to make them show specific videos on request. The requests would come from pedestals known as “provokers”, each having a perspex dome holding a thought-provoking item. The initial suggestions had been incredibly expensive and we were asked if we could come up with a solution.

Provoker

The provokers have access to power and an Ethernet connection. Taking all that into account, a few ideas came to mind, but the best seemed to be an Arduino board with Ethernet support and a button/sensor to trigger the video. There is a relatively new Arduino board available that has a built-in Ethernet shield, which seemed perfect for this project. Also, since a number of the items in the provokers would be related to IBM’s Smarter Planet initiative, it made sense to use MQTT as a messaging layer, as it has been used to implement a number of solutions in this space.

Nick O’Leary was enlisted to put together the hardware and also the sketch for the Arduino, as he had already written an MQTT client for the Arduino in the past.

Each provoker will publish a message containing a payload of “play” to a topic like

provoker/{n}/action

Where ‘{n}’ is the unique number identifying which of the 6 provokers sent the message.

To provide some feedback to the guest that pressed the button, the LED has been made to pulse while one of the provoker-specific videos is playing. This is controlled by each provoker subscribing to the following topic

provoker/{n}/ack

Sending “play” to this topic causes the LED to pulse; sending “stop” turns the LED solid again.

The video wall will be driven by software called Scala InfoChannel, which has a scripting interface supporting (among other things) Python. So a short script to subscribe to the ‘action’ topics and publish on the ‘ack’ topics got the videos changing on demand.
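The message-routing logic at the heart of that script fits in a few lines. Here’s a hedged sketch of it in Python (the function name is mine, not the real code; in the actual system an MQTT client’s message callback would call something like this and then start the right video):

```python
# Sketch of the provoker message routing: an incoming "play" on
# provoker/{n}/action is answered with "play" on provoker/{n}/ack,
# which makes that provoker's LED pulse.

def route(topic, payload):
    """For a "play" press on provoker/{n}/action, return the
    (ack_topic, ack_payload) to publish; otherwise return None."""
    parts = topic.split("/")
    if (len(parts) == 3 and parts[0] == "provoker"
            and parts[2] == "action" and payload == "play"):
        n = parts[1]  # which of the 6 provokers sent the message
        return ("provoker/%s/ack" % n, "play")
    return None  # not a provoker button press

print(route("provoker/3/action", "play"))
```

When the provoker-specific video finishes, publishing “stop” to the same ack topic turns that provoker’s LED solid again.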

And sat in the middle is an instance of the Really Small Message Broker to tie everything together.

Arduino in a box

This was also the perfect place to use some of my new “MQTT Inside” stickers.

First sticker deployed

This project only used one of the digital channels (for the button) and one of the analogue channels (for the LED) available on the Arduino – which leaves a lot of room for expansion for this type of device. I can see them being used for future projects.

Parts list

  1. Arduino Ethernet
  2. Blue LED Illuminated Button
  3. A single resistor to protect the LED
  4. 9v power supply
  5. Sparkfun Case

Hursley Extreme Blue 2011 Presentations

Extreme Blue logo

Extreme Blue is IBM’s summer intern scheme. Students can apply to IBM to be part of the scheme and those lucky enough to be selected are brought into various IBM locations worldwide to be mentored by IBM staff who have proposed an idea and small project for them to work on.

This morning I went along to listen to what the 16 students in the UK have been doing with their summer. These students were split into four groups of four, working on projects for an improved voting system, a smart cursor, smarter vehicles and FTP discovery.

You know you’re getting old when all the students seem rather young; I think “green” is the term people used to use when I was starting at IBM, and they do remind me of my early days at work. However, they all presented themselves beautifully, spoke very well using slick, rehearsed presentations they had clearly put a lot of effort into, and (barring one or two stutters) seemed entirely confident in what they were doing at the front of what must seem an intimidating auditorium full of knowledgeable IBM professionals. They handled questions well too; I don’t necessarily agree with all the answers, but the way they each went about receiving the questions and providing thoughtful answers was good.

Each team had 7 minutes to present their 12 weeks’ work, with every person in the team getting a chance to pitch in at some point, so they didn’t get very long to put their projects across. The audience were asked to keep questions until the end of each pitch, which allowed the teams to flow easily through their material. The range of presentations was interesting: some chose to click through PowerPoint-style slides manually, while other groups came up with stories or narrated over a video they had created. This variety kept the audience interested, with each style of presentation being effective for its purpose.

It was interesting to see how each of the projects had clearly been influenced by the four members of its team. Each team of four contained one business student and three technical students, and the range of skills came through in the presentations. Some groups had “deep-dived” straight into technical work while others had spent more time thinking about use cases, business cases, and how their project might fit in with IBM or be sold. I suspect this has a direct relationship both to how each team was led by the IBM staff and to the particular characters in each team, and it reminded me of the Myers-Briggs and Belbin style studies I’ve done in the past.

Now I’ll have a little look at each project in (very) brief… I’ll stress in advance that I’ve heard a small snippet of 12 weeks of hard work and any opinion here is mine alone and based solely on today’s pitches:

Improved voting system
The team gave an introduction to their solution, involving a three-phase voting system, followed by an example of the problem they were trying to solve and how their solution tackled it. The team had been working with a local council to identify requirements for such a system, so were able to work with real-world examples and solicit feedback, which seemed to have been good. Some doubts were expressed by the audience about the security of such a system; while possibly valid, it seemed to me that these could be addressed should the solution be implemented live. The team presented the solution as having environmental benefits, which might seem obvious at first but I thought was rather questionable given the requirement for computer hardware and power; a further study would be needed to determine whether the current system, using sustainably-sourced paper, could be bettered on the environmental front. Verification of voters appears to be vastly improved under their system, with less room for fraudulent votes thanks to connections to other systems, such as the DVLA, for authentication. Clearly any such automated voting system would have huge benefits for the speed of counting after voting has completed.

Smart Cursor
A new input device to control an on-screen cursor using any sort of body movement, aimed at improving human-computer interaction (primarily for disabled people). The system involves a hardware sensor strapped to a part of the body that the user can move. An initial calibration, run once for each new body part, sets up four movements (up/down/left/right). Other movements and gestures would also be possible, such as a mouse click, as would combining sensors on multiple parts of the body. The hardware could be built small enough to be permanently wearable without distress or difficulty to the user. Other uses of the technology appear to be rehabilitation, or monitoring a condition while the device is worn. Lots of room for customisation came out during questioning, as well as a few issues about how to set up the device in the first place. However, this seemed like a really worthwhile (if low-usage) piece of research that could be immensely useful to its target audience, and at low cost too.

Smarter Vehicles
The aim of this project is to personalise the driving experience for car users by adding three things to a car: (1) identifying which user is driving, (2) providing the car with knowledge about where it’s going, and (3) permanently connecting the car to a network. The team used a video-style presentation and monologue they had storyboarded, which was clearly well produced and rehearsed. It was unclear what the project had achieved, however, as no specifics were mentioned, but there were certainly plenty of good ideas as to what could be done in this area. The team do appear to have a demonstration available, which I’m looking forward to seeing at the Extreme Blue demonstration expo in Hursley tomorrow, after which I’m sure it will be a lot clearer which ideas they’ve followed through into something tangible and which are still in progress. Another great plus for this team was that they were aligned with an automotive manufacturer and will be presenting their ideas back to the board at a later date, which will be a fabulous experience for them all.

FTP Discovery
Tackles the problem of escalating FTP network complexity in enterprises. The project attempts to map FTP file transfers on the network in flight, and automatically provides a visualisation of the network in a node-graph-style format. This graph can be annotated manually, for example by adding the cost of various transfers and links, to allow users to build up a visual picture of their FTP services and what they cost the company. The team advocate the use of managed file transfer (as provided by WMQ File Transfer Edition, for example) but didn’t clearly state what the problem with FTP as a service is. That said, they seem to have a very clever way of detecting FTP traffic by sniffing the network, and could easily extend their architecture to include all sorts of other protocols. They have also thought carefully about how their work might be used in the future: for example, as a tool for IBM pre-sales, a saleable IBM product, or (most likely) a component of one or more existing IBM products.
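To give a flavour of how in-flight detection like theirs can work (a speculative sketch, not the team’s actual code): FTP is a plain-text protocol, so transfers are visible on the control channel as RETR (download) and STOR (upload) commands, and each command observed can become an edge in the node graph:

```python
def parse_ftp_commands(lines, client, server):
    """Turn sniffed FTP control-channel lines into (src, dst, filename)
    edges for a transfer graph. `client` and `server` are the two ends
    of the control connection."""
    edges = []
    for line in lines:
        parts = line.strip().split(None, 1)
        if len(parts) == 2 and parts[0].upper() in ("RETR", "STOR"):
            cmd, filename = parts[0].upper(), parts[1]
            if cmd == "RETR":  # download: the file flows server -> client
                edges.append((server, client, filename))
            else:              # STOR, an upload: client -> server
                edges.append((client, server, filename))
    return edges

print(parse_ftp_commands(["USER anonymous", "RETR report.csv"],
                         "10.0.0.5", "ftp.example.com"))
```

A real sniffer would of course reassemble TCP streams first and handle many more commands, but the edges-from-commands idea is the core of this style of visualisation.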

Congratulations to all the teams and people involved. The presentations were great, a very entertaining hour, and it seems like some really useful work has come out of Extreme Blue in the UK again this year. Well done!

Minihacks and Open Technologies

It’s not all about process, software development, and quadricopters… 🙂

Guruplug

This week we’ve had what could be described as a “mini Hackday”, instigated by an idea from Andy Stanford-Clark and organised by Hursley newcomer Vaibhavi Joshi. The idea was to spend a few hours exploring the world of plug computers (in this case, a model called a Guruplug); to brainstorm some ideas around utility computers; and to generally see what we could do with this kind of form factor.

Some great ideas emerged, and quite a few of us were severely tempted to order our new shiny gadgets on the spot… by the end of the morning the Really Small Message Broker was built and running on the Guruplug and some exciting MQTT-related thoughts were flying around. A nice break from the norm for all of us!

Inspired by some of the “social technologies for internal communications” discussions I’d had with Abi Signorelli at Social Media Week London the previous week – in particular, the ease of capturing a brief audio snippet on any particular topic – I thought I’d ask Vaibhavi what she thought – here’s a quick interview:

Straight after the hacking, it was time to move on to the Open Technologies event that was being run to promote Linux, Firefox and Symphony. I’m a user and a big fan of all of these tools, so it was nice to see a local Hursley event, part of IBM’s global awareness month, dedicated to helping those in the internal community who weren’t yet up to speed on what people were using. The best part? Free stickers 🙂

Open Technologies

Parrot AR.Drone

Andy Piper brought his new toy to the lab today. While on a whistle-stop tour of China recently, he called in at Hong Kong on the way back, where he picked up one of the Parrot AR.Drones which were released this month.

The AR.Drone is a quadricopter with two video cameras, one mounted in the nose and one downward-facing. The drone acts as an ad-hoc Wi-Fi access point, allowing it to be controlled from any device with Wi-Fi. At the moment Parrot are only shipping a client for the iPhone, but there is an API available, and there is already footage on the net of an Android Nexus One being used to control one. It’s loaded with a bunch of other sensors as well: an accelerometer to help keep it stable and an ultrasound altimeter to help it maintain altitude over changing ground.

The iPhone interface for flying the drone uses the accelerometer and is a bit tricky to start with, but I think with a little practice it shouldn’t take too long to get the hang of it. The feed from the video cameras is sent back to the handset, giving you a pilot’s-eye view. At the moment none of the software allows you to capture this video, but that’s expected to be added soon. You can also use the camera to play AR games, or have the drone hover and hold station over markers on the floor.

The whole thing runs an embedded Linux build on an ARM chip and you can even telnet into it. It comes with two chassis: one for outside, and one with protective shrouds for the propellers for use indoors.

I think some very cool stuff should be possible with a platform like this.

Here are 2 short videos I shot of a few of us having a go with it on the lawn in front of Hursley House.

A day with the inventors

Retail Lab

A few weeks ago, the Financial Times Digital Business podcast visited Hursley and checked out some of the innovations that are being worked on. The result is a nice 22 minute episode which tours the lab (including the Retail Lab pictured on the left!) and talks to John McLean, Andy Stanford-Clark, Bharat Bedi and Jamie Caffrey.

If you’re into augmented apps, location awareness, Emotiv headsets (as featured in our last post here, too!), e-paper labels on shop shelves, telemetry, instrumented houses, and Smarter Planet – it’s a great listen.

Bang went the theory…

As with yesterday’s post, I really don’t have to do too much work on this one, as the detail has already been written up elsewhere…

If you watched this week’s edition of Bang Goes The Theory on BBC1, you will have seen Nick O’Leary and Kevin Brown from IBM Hursley helping Jem and Dallas to drive taxis. That probably wouldn’t have been entirely revolutionary, had it not been done through a combination of an Emotiv brain-signal-reading headset, and some MQTT and Arduino funkiness… no hands on the wheel or feet on the pedals!

Nick has a great write-up of what sounds like a fun (but cold) event. You may still be able to catch the fun on iPlayer, or there are some clips over here.


(Image: Creative Commons Attribution Non-Commercial Share-Alike (2.0) from knolleary’s photostream, used with permission – full set)

The Christmas lights

As a festive entry on eightbar this year, let’s talk about Christmas lights. Twitter-controlled ones! 🙂

Andy Stanford-Clark hooked up a set of lights to Twitter. As reported in Computer Weekly:

Using some clever IBM middleware, the microcontroller sets the illumination colour based on a signal from the internet or via SMS over a GSM network – so you can tweet “ibmlights” with the word RED, GREEN or BLUE to change their colour.

In fact, the commands got a bit more sophisticated than that, with more colours and lighting patterns. Towards the middle of last week the lights ended up over Laura‘s desk, and a growing band of folks delightedly tweeted the @ibmlights account with instructions to change colour or pattern. She took some pictures for me (and some video as well, but I didn’t have time to edit it…).
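The command handling described in that quote is pleasingly simple to picture. A hypothetical sketch of the tweet-to-colour mapping (the names, RGB values, and structure are mine for illustration, not Andy’s actual implementation):

```python
# Illustrative colour table; the real lights supported more colours
# and lighting patterns than this.
COLOURS = {"RED": (255, 0, 0), "GREEN": (0, 255, 0), "BLUE": (0, 0, 255)}

def parse_tweet(text):
    """Return the first recognised colour word in a tweet, or None."""
    for word in text.upper().split():
        if word in COLOURS:
            return word
    return None

print(parse_tweet("@ibmlights BLUE please"))
```

The returned colour name would then be looked up and sent on to the microcontroller driving the lights.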

Another year of innovation and fun at Hursley! 🙂 Happy Christmas!

(by the way, well worth taking a look at the rest of the Computer Weekly article I linked above – lots more coolness from Hursley! oh, and I’m not sure how long the lights will be online… it’s just a bit of fun really)