The excellent charity Sport Relief is holding a charity event 14th-16th March in Second Life. There are various press releases flying around, but as this is for a good cause a little bit of viral marketing does not go amiss. The sims themselves are not yet public, but the avatar-based racing looks like it will probably need you to join the group “Sport Relief Fund Raisers” owned by Lottie WeAreHere.
I am sure I will be able to get along and participate, and as with many of the other charity events we can all do some good with our virtual world presences and get donating.
If you are not sure what this is all about then check out the official Sport Relief site.
More to come over the weekend and good luck to everyone running it.
*Updated the dates and the group name 🙂
Crossing worlds – Video Avatars
I recently tried a little experiment using the excellent Live! Cam Avatar application I have, the one that I re-did Daz with.
In this I tried using a live avatar from one place, namely my machine, and injecting it into Second Life as a video feed. It’s a simple but effective demonstration of pushing live content around.
The key to this is that the video avatar was a live puppet, in this case singing a song from an audio stream. The mapping is just a sphere texture, but I know that done with more care it would work on a sculpty.
So I can have an avatar from one system injected into another in a simple but effective way.
We can of course take this further with better data interchange, but this shows what we can do now. I know that this video avatar on a bubble can be moved around from external stimuli too, i.e. the mapped prim can be moved because of an event external to Second Life. So technically (as I own the parcel) I could walk around and act as a normal avatar despite being rendered on another system entirely.
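To make that last point a little more concrete, here is a minimal sketch of the external half of the trick: a small script that pushes position updates for the media-mapped prim towards an in-world listener. The endpoint URL and payload shape are invented for illustration, and the in-world half (a script that receives the update and repositions the prim) is not shown; this is a sketch of the approach, not the actual setup used here.

```python
# Hypothetical sketch: drive the prim showing the video avatar from an
# external process. The endpoint and payload are invented for illustration;
# in-world you would need a script listening for these updates and
# repositioning the prim accordingly.
import json
import time
import urllib.request

# Hypothetical in-world listener endpoint (not a real Second Life URL).
ENDPOINT = "http://example.com/inworld-listener"


def send_position(x, y, z):
    """POST a new position for the media-mapped prim as a JSON payload."""
    payload = json.dumps({"cmd": "move", "pos": [x, y, z]}).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    # Walk the prim slowly across the parcel, one metre per second.
    for step in range(10):
        send_position(128.0 + step, 128.0, 25.0)
        time.sleep(1.0)
```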
Compare and contrast – SXSW
Roo is off at the fantastic SXSW conference in Texas. You can follow his fun and frolics over on his blog. I found it quite amusing that we both posted photos up on Flickr customizing our respective laptops, which I thought worth a comparison.
Mine was the arrival of my epredator Moo stickers with QR codes for epredator.com and eightbar, which I then stuck on my work Lenovo T61 Thinkpad.
Roo, on the other hand, had got his new personal MacBook Pro custom laser etched in Texas with an Autobot logo.
Just to keep the linked flow of things Roo also twittered he was just off to the Moo party 🙂
Note also how this was not about us hooking up and following one another via one single social network or virtual world. Twitter, Flickr and various blogs all feature in keeping us apprised of what one another is up to. Even though in this case Roo is having the lion’s share of the fun and a little bit of Metaverse Evangelist PR 🙂
*update I just noticed this post said comments were turned off. That was unintentional despite the spam we get; normal service is now resumed. It must have been the mad UK weather ATM.
Spimes, Motes and Data Centres
A few of you may have noticed recent coverage around the blogs about Michael Osias’s 3d datacentres. Ugotrade has, as usual, a great write up and analysis. You may also have seen the work that our friend David Orban has been doing with OpenSpime. What’s that all about, I hear some of you ask?
Spime is a word that Bruce Sterling created, along with Spime Wranglers (the people who control and gather information from Spimes). A Spime is a small self-contained device that broadcasts all sorts of information about its surroundings. Again Ugotrade has covered this in some depth in Tish’s most recent post.
For a while here in Hursley Andy Stanford-Clark has been using the term “mote”, as in remote, and we have shown his instrumented house replicated in Second Life. Dave Conway-Jones has also been busy with various forms of sensors and actuators. And in some of the public research going on here into sensor arrays of the future, spime-like devices are being simulated in game environments to aid in understanding what would happen if they were applied over a large area.
So it would appear that Hursley and many members of eightbar are in fact Spime Wranglers already.
This ability to instrument the world fits into the principles of mirror worlds rather than the pure escapist virtual worlds and metaverses. Being able to augment reality, or augment virtual reality requires masses of live information, so a Spime or a Mote array is fairly crucial to the whole thing. In many ways instrumenting a data centre makes the data centre an entire spime in its own right, so you can see these things are linked very closely.
The balance between generic nanotech smart dust, gathering general information from which organic patterns form, and specific devices built to monitor one particular piece of information, will make for interesting wrangling decisions too.
My key interest in this at the moment is using this approach to not just monitor and report on the real world, but on multiple virtual environments. A spime can be virtual too.
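To give a feel for what one of these spime or mote style devices might actually emit, whether it is sat in a real data centre or simulated inside a virtual world, here is a minimal sketch. The collector URL, device id and field names are all assumptions made up for the example; a real deployment would more likely sit behind a lightweight pub/sub broker than a plain HTTP POST.

```python
# Minimal sketch of a spime-like device: periodically broadcast a small,
# self-describing packet of readings about its surroundings. The collector
# URL and message shape are assumptions invented for this example.
import json
import random
import time
import urllib.request

COLLECTOR = "http://example.com/spime-collector"  # hypothetical endpoint


def read_sensors():
    """Stand-in for real sensor reads (temperature, power draw, etc.)."""
    return {
        "device_id": "spime-0042",
        "timestamp": time.time(),
        "temperature_c": round(random.uniform(18.0, 30.0), 1),
        "power_w": round(random.uniform(80.0, 250.0), 1),
    }


def broadcast(reading):
    """Push one reading to the collector as JSON."""
    data = json.dumps(reading).encode("utf-8")
    req = urllib.request.Request(
        COLLECTOR, data=data, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    while True:
        broadcast(read_sensors())
        time.sleep(30)  # spimes report little and often
```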
Finally, as I twittered to David, I can’t get the song out of my head: Spimerman, Spimerman, does whatever a Spime can. So maybe the Spime Wranglers are going to be the information superheroes for the next generation.
Interaction the way we want it as Humans
I noticed a great post by Christian over on the Cisco virtual world blog about the rise of expectation in new interfaces. In nearly every pitch I do, towards the end I remind people that we seem to have tied ourselves to keyboards (supposedly to stop typewriters jamming), mice for navigating 2d windows, and a few other metaphors for interaction that we are now in a position to break away from. I wrote some of this last year in a mini predictions post.
One of the things people always seem to say on entering a virtual world (those who are not metarati or gamers) is that it is hard to move around. That may not be the case in reality, but just as people struggled with a mouse and menus and windows 15 years ago, they are doing the same with arrow keys, mouselook and the various other convoluted ways we seek to interact with the computer.
Clearly people’s expectation of display devices will be changed by the multitouch iPhone, or simple gesture interaction as we see with the Wii controller. All that is well trodden technology in some respects now. It has become commercially robust and is now in all our hands to push things forward.
Another exciting development, and one I am sure we will cover in a lot more depth in the near future, is Emotiv. There is a great BBC article on it here, and you will notice a certain company mentioned alongside it; those of you at GDC may well have seen it. A soon-to-be-available commercial device to detect brain patterns and allow us to interact with the machines in yet another way.
Combine all these with the augmented reality, projection and headset approaches and we have a very rich set of tools to work with to see how we as humans are able to free ourselves from some of the self-imposed shackles we have for interaction. Another article here on Kurzweil’s keynote at GDC hints at an even deeper future.
Of course, that’s not to throw away any of the old ways. We still use command lines where needed, we still use books and print where needed. But having more and richer options, more suited to an individual’s neuro-linguistic programming stack, or adding in accessibility so we can all interact however and wherever regardless of particular limitations, can only be a good thing?
The Eightbar brand – another angle
Well, you have seen eightbar represent itself all over the place. A meta-guild across multiple virtual worlds, a very large Second Life group, an Eve corporation, even a Halo 3 clan. We have t-shirts virtual and real, custom Ferraris on Xbox and we have even been in a book. However our very own Graham spotted that we were missing what can best be described as a gang sign.
Last night in Winchester, on a night out paid for by Daz (thank you again Daz, who also pays for the hosting of eightbar.co.uk), this sign was thrown for the first time. It’s fairly self explanatory 🙂
Holly on the BBC, it’s not all Roo and I on virtual worlds you know
At the VW Forum Europe in October, Holly Stewart was thrown in front of the camera for a BBC Click interview. Holly/Ada Alfa has been in with this since very early on and is very much core to eightbar. Not only that, but she is currently the jointly elected guildmaster for our Virtual Universe Community.
It has taken a while, but the BBC just ran it on BBC Click as part of a virtual worlds piece. The page is here and UK people can watch the video.
A few small things: it’s Holly Stewart not Stewarts, and the other IBMer is Paul Ledak not Paul Ladek, but you can’t have everything can you.
**Update: as Holly just pointed out on Twitter, the IBM SL machinima used in the piece was Rob Smart’s work, so credit where credit is due 🙂
The key angle is also about interoperability, which as you will notice has been a bit of a subject lately.
Anyway, well done to our Holly for a great piece to camera. At last I can take Click off series link on Sky+.
Also great performances from Valerie from ESC, the ubiquitous Justin from RRR and a great advert for Corey’s Multiverse in the middle of it.
Just for the record, I think we (eightbar) have appeared in virtual worlds pieces on Click three times now in some way or another. Now where is the royalty cheque?
Exploring communication options in that metaverse middleground
As part of a bit of forward thinking I have been doing more experimenting with some levels of visualization, working on the assumption that all-video or all-avatar is not the only way forward.
One way and another I ended up using Daz’s very amusing photo from Flickr to illustrate the point here.
This is a mix of a static photo (an insane one) blended into an emotive 3d-ish representation, but with a voice synthesized from text.
Daz crazy talked up from his 2d http://www.flickr.com/photos/shawdm/2… photo they will not read my mind
and speech synth lyrics by Kanye West
I did one of these the other day on my epredator.com blog, which was a little more just to see if the technology worked, but it also let me see if that would help in expressiveness and enhance a pure video conversation (which I think it does).
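For anyone wanting to try the speech half of this, the text-to-speech step is easy to sketch. The example below uses the off-the-shelf pyttsx3 Python library purely as an illustration; it is not what produced the clip above, and the text is a placeholder rather than the actual lyrics.

```python
# Sketch of the text-to-speech half of the experiment, using the off-the-shelf
# pyttsx3 library (an assumption for illustration, not what was used above).
import pyttsx3

engine = pyttsx3.init()
engine.setProperty("rate", 150)  # words per minute, slowed down a touch

text = "Placeholder lyric text goes here."  # stand-in for the actual lyrics

# Speak it aloud...
engine.say(text)
# ...and also save it to a file that could be fed into the lip-sync tool.
engine.save_to_file(text, "lyrics.wav")
engine.runAndWait()
```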
Here comes another wave of ideas for metaverses?
Metaverse technology and approaches to how people can interact in an MMO-type way are appearing thick and fast. It always opens debates around one world versus many, and it starts technical arguments around platforms. However that diversity is both a rich source of ideas and approaches and a restrictive and confusing situation in social media circles.
Eric “Spin Martin” Rice comments on some of the problems of this in a recent post: just where and why are people choosing to gather in 3d spaces?
Recently Roo and I have been discussing the evolution of all these fragmented spaces. I don’t think it is enough that we just tell people what is out there at the moment. It is by no means solved, and possibly may not be solvable, but it is worth considering some things here. Often interoperability reduces to a pure technical discussion when in fact it’s a social and organizational problem too. As virtual world companies and communities attempt to own their customers/members in a traditional sense, they clearly want you to come to them to experience their wares and their way of doing things. This is a wider web2.0 conversation around who owns me and my stuff.
We are starting to see some words appear in up-and-coming virtual environments that start to hint at maybe some different metaphors. “Widgetized” is a forced word, but if you read the press around RocketOn (props to Xantherus for twittering this company the other day) you start to see that we do not have to stick with the real world analogies that we have today. I am second guessing what RocketOn is doing, but having a thing you take around with you from world to world appears to be their approach.
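Purely to illustrate the idea, and absolutely not how RocketOn actually does it, the “thing you take around with you” could be as simple as a small world-agnostic document that each environment renders in its own way:

```python
# Purely illustrative: a world-agnostic "widget" you carry between worlds,
# expressed as a small JSON document each environment renders in its own way.
# None of these field names come from RocketOn; they are invented for the sketch.
import json

widget = {
    "owner": "epredator",            # who carries it
    "kind": "pet",                   # what it is, in world-neutral terms
    "display_name": "Eightbar Bot",
    "appearance": {
        "colour": "#ff6600",
        "badge": "eightbar",
    },
    "visited_worlds": ["Second Life", "Habbo"],  # placeholder history
}

# Serialise it so it can be handed to the next world's client or server.
print(json.dumps(widget, indent=2))
```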
So I made a little picture, not so much a roadmap as a suggestion of where we are today and the ? as to where we need to evolve to in our understanding tomorrow. It is fairly self explanatory I hope.
We have gone from not knowing about anything going on around us, to our friends being online and sharing their thoughts/pictures/videos asynchronously, to a set of single worlds where our avatar presence is part of the experience for us and those around us, with a nominal amount of that earlier awareness pulled into the environment too.
The trick is to think about the evolution from that, not to just replace real world metaphors but to extend them.
We already see this adoption as people start thinking about metaverses. They start with the replicas of themselves and of their offices and of their existing assets. They very quickly start to evolve their thinking and challenge why we need to stay on the floor in an office, do powerpoint, market with billboards etc. The non-real world representations start to flow as ideas.
My suggestion here is that the very container of those ideas, the world itself may also need to have this sort of evolutionary thought applied to it.
Single worlds, single avatars and a single live presence may be too restrictive, though it is a comfortable metaphor to help people adopt metaverses and to feel some benefit from them.
This idea I think flows across each of the quadrants we see in the metaverse roadmap, with the distinction being made between the types of virtual worlds and metaverses: Mirror Worlds, Virtual Worlds, Augmented Reality and Lifelogging.
Any thoughts?
IBM at the NRF
Does your avatar know how to make actual money? Bernadette Duponchel’s does. She was recently at the National Retail Federation conference with the rest of her team, presenting IBM’s take on virtual worlds for the fashion design industry.
This is the second consecutive year IBM has demonstrated the use of virtual worlds at the NRF. The brief demo highlights the benefits of real-time collaborative design, short feedback loops when tweaking materials and costs, and even pre-selling the item before it is physically manufactured.