ARTag has come up again recently in a few posts. I was just getting an older webcam working on my laptop, so I used ARTag to see if it was working OK and did this little render test, in this case with a head model of me generated by CyberExtruder from a 2D photo. It loses registration a bit but, as I said, it was an old camera.
I was also intrigued to see, whilst looking for a newer webcam, that Logitech have some avatar webcam software: it responds to your talking movements but overlays or replaces you with a rendered object. I must try this. It would let me use Seesmic more, as I have not got the hang of small clips to camera yet.
I was trying not to post too much today and let the discussion run on Roo’s points about the TV ad. However, this was both cool and funny, as well as shameless self-publicity: http://www.fastcompany.com/articles/2008/01/ten-jobs.html The metaverse evangelist role is listed alongside nine others 🙂 Flavourist, Brewmaster, Sensory Brander, Carbon Coach, Sleep Instructor, Interaction Designer, Roller Coaster Engineer, Animator and Travel Writer. It was interesting that Metaverse ended up under “enhancing life and the bottom line”, especially given the recent discussion on money and that TV ad.
I’m not sure. I can see why it looks like that, but I’m not prepared to be annoyed with IBM for damaging its influential position in virtual worlds just yet.
Before I go any further, I should make it clear that I had nothing to do with the making of this ad (if it’s not perfectly clear already, I don’t get asked about things like this <grin>). Additionally, I have not even seen the ad yet, I’ve only read transcripts, so I may be missing some subtle undertones here.
The commercial starts with an employee showing off his avatar to someone else, presumably a boss. The employee is all pumped about how he can conduct business in this virtual world and how he owns an island there. The boss asks if he can make money. The employee responds with something like, “Virtual money or real money?” This sets up the boss’s response that “The point of innovation is to make actual money.”
I’m not sure what to think. Who is the fool here? Is the boss even right? Isn’t innovation about far more than just making money? (Would training and rehearsal count? What about collaboration, recruitment, developer relations, …)
Based on the transcript (tell me if I’m wrong), I think there’s another way of looking at it. What about that ambiguous question: “virtual money or real money”? The implication is that the ‘employee’ character can’t use the virtual world to make real money, but everyone who reads Business Week knows that there is real money to be made in virtual worlds. What if the boss actually “gets it” and (unlike the hesitant employee?) knows that real money can be made in virtual worlds, and is pointing this out to him, and us? Suddenly the ad takes on a new perspective.
I’m not at all sure it justifies my broadminded interpretation though, and I’m as annoyed as anyone that the ad might be interpreted as “look, aren’t virtual worlds silly” and perhaps risk undermining IBM’s amazing position in this area. What do you think?
Update: since this post went up, Ogilvy have posted the ads from this campaign to YouTube. Here’s the avatar advert:
Someone recently showed me this BT advert from 2002: “The more connections we make, the more possibilities we have”.
It’s interesting for many reasons. It nicely sums up the internet of course, but it increasingly reminds me of a popular misconception about virtual worlds.
Yes, the internet is a bit like an auditorium which seats millions of people, and in which we all get a chance to talk to any (and all) of them. But it’s not exactly like it. If we were creating a 3D digital social space (with ooh, let’s say, avatars to represent people) would we actually create a great big auditorium to bring them all together at once? Probably not, no. For the same reason that although large numbers of people may come together for specific events in the real world, the number of connections you can make inside a huge crowd doesn’t scale with the size of the crowd. How many people are you likely to talk to at a sports event, or a music gig, with a thousand people around you? Tens? Hundreds? If the crowd was a million rather than a thousand, would your experience be any better?
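The point above can be put in a toy model (my own illustration, not from the post, and the “earshot” limit is an assumed parameter): however big the crowd gets, the number of people you can realistically talk to is capped by proximity and attention, so the per-person experience stops improving almost immediately.

```python
# Toy model: meaningful connections per attendee are capped by an
# assumed "earshot" limit, not by how many people share the space.

def connections_per_person(crowd_size, earshot=8):
    """People you can realistically talk to in one gathering."""
    return min(crowd_size - 1, earshot)

for crowd in (10, 1_000, 1_000_000):
    # The result is identical for a thousand or a million people.
    print(crowd, connections_per_person(crowd))
```

Run it and the answer is the same flat number whether the crowd is a thousand or a million, which is exactly why the giant-auditorium design buys so little.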
The advert itself highlights how ridiculous (though naively adorable and even romantic) the idea of bringing millions of people together in one physical space would be, so why is it so easy to obsess about doing the exact same thing with virtual worlds? IBM is not immune from this: when the 12 IBM islands were first launched, you’ll remember, there were three large auditoria, capable of holding around 200 people each.
I’m pleased to say that although these spaces are useful, they’re used rarely in comparison to the rest of the IBM territory in Second Life. We rarely even attempt to fill these spaces, perhaps because of a realisation that an even bigger and more scalable approach is needed. We need to (continue to) concentrate on what works on the web: the idea of multiple ongoing and concurrent conversations, with people picking an area (or even environment) which is most appropriate and interesting to them.
Ian just highlighted this very point in a post about the recent NRF (National Retail Federation) show, in which a custom shopping experience can be dynamically created. I find the idea of small social environments which can be dynamically created much more interesting than huge social environments which attempt to cater to crowds of unmanageable and unlikely sizes.
All things virtual featured in the latest NRF retail show. I spent some time talking in Second Life with my colleague and fellow member of eightbar, Siobhan Cioc, about the Dallas-based GSC demo centre.
There is a background story to this that is also of interest to the Web2.0 community trying to establish the value of both blogging and virtual worlds.
I had seen a press release about IBM at NRF. Now that metaverse acceptance has spread, there are too many things and spin-offs for even us tuned-in metaverse evangelists to keep up with. So here is what happened…
I saw the press release and the mention of a CAVE demo. I blogged internally about it to see if anyone had anything. Of course, many of the people involved were at the show, so that was always going to be a slow-burn approach. However, the question was out there: what are we doing? Then an article about NRF appeared on our intranet with a link to … yes, an NRF blog. I tracked back on the blog entry, asking the question again about the SL/CAVE piece to try and connect the threads. Siobhan Cioc’s real-life presence both replied on my blog and Sametime instant-messaged me. We then both dived into our public SL islands, where she explained what the project was all about. I listened and also took a small snap of film, which I just put on YouTube.
Now we have connected, discussed who we know in common and worked out some ways to help one another, because we both used all the available technology and approaches to connect. Why blog inside a company firewall? Well, there is your answer. We get people connected and questions answered. Some instantly.
I digress (I think I may have turned into Ronnie Corbett).
So, the CAVE project. You may be able to see it here, though I am not sure how well the link to the Fox News item will work; I will post a better one when I find it. This is hot off the press, after all.
So just in case here is the explanation with some SL footage.
The team have created a configurable room in SL. The room is HUD-controlled: the HUD allows elements in the room (in this case TVs, speakers and even the starlit sky above it) to be adjusted. This approach has been shown before in various ways, such as the Circuit City couch and the Sears kitchens over on IBM 10. The difference here, though, was that the demo was built specifically to integrate with a head-mounted display, and the booth was built for this sort of interaction to occur.
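The HUD-to-room pattern is simple to sketch. This is a hedged toy model in Python, not the team’s actual Second Life scripts (which would be written in LSL and pass link or chat messages); the element names here are illustrative only.

```python
# Toy sketch of a HUD-controlled configurable room: the HUD sends named
# commands, and the room applies them to its adjustable elements.

class ConfigurableRoom:
    def __init__(self):
        # Adjustable elements; names are hypothetical, for illustration.
        self.elements = {"tv": "off", "speakers": "muted", "sky": "clear"}

    def apply(self, element, setting):
        if element not in self.elements:
            raise KeyError(f"unknown element: {element}")
        self.elements[element] = setting

class HUD:
    def __init__(self, room):
        self.room = room

    def send(self, element, setting):
        # In SL this would be a message from the HUD script to the
        # room's script rather than a direct method call.
        self.room.apply(element, setting)

room = ConfigurableRoom()
hud = HUD(room)
hud.send("sky", "starlit")
print(room.elements["sky"])
```

The interesting part of the real demo is that this state lives on a shared server, so everyone present sees the room change the moment the HUD command is sent.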
Yes, we have seen 3D rooms in VR before, but the difference here is that this is on a public multi-user platform. Much of what we saw with VR before was single-user or just very expensive. This example was at a general retail conference: mainstream.
Being able to configure a room or a product experience, and to share it with others, whether they are friends, other people with a vested interest and/or experts from the store or business, is a very significant point.
Having a booth with a single SL or metaverse experience is good, but multiple headset stations might have made it more obvious to people that others can, and need to, be involved in the process.
I think we will see lots more concrete examples this year both of additional interfaces into virtual worlds, but more importantly interaction with existing enterprise systems such as product fulfilment.
Another thing to consider here is scale. People often worry about how many people you can get in one space (Roo has a good post on the way very soon). This is an example of the opposite: if you are having a personal-shopper experience, you do not want a huge crowd of thousands around you.
Last summer, Ian blogged here on Eightbar about an experiment with running the ARTag system alongside Second Life, and augmenting SL with additional 3D content, like this…
The Georgia Tech project goes the other way, augmenting the real world with live content from SL. Like this…
It hints at a future in which the lines between virtual worlds and the real world will be crossed by more than just the use of a keyboard, mouse and monitor combination. The ability to see and interact with other people in virtual worlds is one of the things that has allowed interest in 3D environments to expand far beyond what we ever saw back in the days of (largely single-user) ‘virtual reality’. Being able to go beyond clunky user interfaces and blend those interactions naturally and intuitively with the real world is something I expect we’ll see a lot more of this year.
Now, I am no artist (I guess you noticed), but I felt quite intrigued by my results and the use of sculpties to do it. Bear in mind this was a very, very quick experiment. In fact, it took longer to process the video than it did to create and import the ‘head’.
As the object is made of multiple sculpties, it opens up the options. I was going to script it to move around; that’s the next little experiment.
Our colleagues and friends Gizzy Electricteeth, Pipe Hesse and their cat Linden, down in Australia, are running a public build of the Australian Open tennis right now. Whilst British hopes were dashed early, with Andy Murray getting knocked out in the first round, hopes are still very high with respect to the Aus Open in SL. Regular readers will know the links between what we do and tennis: IBM sponsors and provides the sites for the major grand slam tennis events around the world.
Wimbledon two years ago was where we cut our teeth in seeing how we could represent the tennis data in a metaverse. This was done privately. Pipe and Gizzy then built a massive stadium for the Aus Open, which very much expanded the early Wimbledon proof of concept; last year it was private, though. Then, in June, we swapped places again, taking and adding to Pipe and Gizzy’s work to create the public Wimbledon 07 event. This was an experiment in how people would respond, and we kept it small and intimate but staffed during the live event. So now we are back again with the Australian Open, and their awesome stadium build on Slam1 is open.
In order to keep things fresh they are having an official fashion contest for all you couture experts out there in Second Life.
We don’t often do things that are press releases, but in this case I am pasting this in here to help you all out there.
This year, in conjunction with its sponsorship of the Australian Open tennis, IBM is holding a ‘Couture on the Court’ competition to find out who can create the best avatar tennis outfit in Second Life.

The winner will be voted for by the SL residents, and the first prize is 250,000 Linden Dollars. The top 10 eligible entrants will be awarded prizes, which will be amounts in Linden Dollars.

Entry is open to any avatar created before Jan 8 2008; however, eligibility to receive the Linden Dollar prizes is restricted to those who fill in the full details of the entry form.
It is strange: we talk about this project all the time, yet when Ugotrade was looking for a link to it on eightbar, it turned out we had not covered it well enough. So here we go.
We have talked before about taking things from one place and regenerating them in another, to find interesting ways to visualize data, system architectures and workflows in a business.
This project by our colleagues in Watson took the results of some complex protein-folding calculations on a Blue Gene supercomputer and injected the resulting structure of a rhodopsin molecule into a scripted object in Second Life, which then built a huge representation of that molecule.
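The injection step is easy to imagine in outline. This is a hedged sketch, not the Watson team’s actual code: it simply takes atom coordinates from a calculation and scales and offsets them into in-world positions at which a scripted builder object could rez one prim per atom. The scale factor and region origin are assumed values for illustration.

```python
# Sketch: convert molecular atom coordinates (in angstroms) into
# in-world positions for a scripted builder to place prims at.

def to_world_positions(atoms, scale=10.0, origin=(128.0, 128.0, 50.0)):
    """atoms: list of (element, x, y, z) -> list of (element, wx, wy, wz)."""
    ox, oy, oz = origin
    return [
        (elem, ox + x * scale, oy + y * scale, oz + z * scale)
        for elem, x, y, z in atoms
    ]

# Two illustrative atoms; real input would be thousands of them.
atoms = [("C", 0.0, 0.0, 0.0), ("N", 0.15, 0.0, 0.1)]
for elem, x, y, z in to_world_positions(atoms):
    print(elem, x, y, z)
```

The key point from the post survives even in this toy form: the build is driven entirely by the calculated data, so the in-world molecule is generated, not hand-modelled.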
The guys have an interview over on SLNN.com; Rez Tone and Zha Ewry are the main contacts.
The interesting thing is that, as well as being a scientific visualization, it has a place as an art installation and a thought-provoking place to gather. Inevitably, when people gather there they tend to ask, “So what is this then?” This leads to much deeper conversations about life sciences and supercomputing at IBM.
I tend to use this little clip of me sitting in the molecule to demonstrate the scale of it. Remember, it’s all generated, not a manual build.