Thursday, January 29, 2009

Sitting on Top of the (virtual) world

With apologies to Howlin' Wolf, here's a pic we took in the Social Computing Room yesterday. It shows an adaptation of the Second Life client that provides a 360-degree view of the virtual world. The development work on the SL client was done by a colleague, David Borland.

The idea is to embed the Social Computing Room within a larger virtual space, so that you can look out in every direction to see what is happening in the virtual world. In the pic below, I'm in the SCR looking at my avatar...



How about looking in? We have video cameras covering all angles, so I thought about building a 'fish tank' from the SL perspective, so that avatars can walk around the SCR and see inside it. The bread and butter of two-way audio and video, along with text chat, is the obvious next step. Here's a shot from SL looking into the Social Computing Room...


Right now it's just a bare media object, and one interesting question is what to build on the virtual side. Do I want to mirror the room, or could the SCR be sitting on the bottom of the sea? Once the basics are tackled, the SCR and the 4-channel Second Life client make for a unique research space. The SCR is well suited for installation of all sorts of sensors, robotics, and input devices. This is something of a side project, but I've noted how intrigued people are when they visit the SCR and see the first prototypes!
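To make the "look out in every direction" idea a bit more concrete, here's a rough Python sketch of the geometry behind a 4-channel view: four cameras, one per wall, each covering a 90-degree slice of the full 360. This is only an illustration of the idea, not David's actual client code.

import math

# Illustrative sketch: split the 360-degree view across the SCR's four walls,
# with one camera per wall, each covering a 90-degree slice.
WALL_COUNT = 4
FOV_DEGREES = 360 / WALL_COUNT  # 90 degrees per channel

def camera_headings(base_yaw_degrees=0.0):
    """Return the yaw, in radians, for each of the four rendering channels."""
    return [
        math.radians(base_yaw_degrees + wall * FOV_DEGREES)
        for wall in range(WALL_COUNT)
    ]

for wall, yaw in enumerate(camera_headings()):
    print(f"wall {wall}: yaw {math.degrees(yaw):.0f} degrees")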

Wednesday, January 28, 2009

The world of mobile sensors

I've been passing around a white paper from Nokia that considers the mobile phone as a sensor. The key point:

As mobile device subscriptions pass the four billion mark, we’re looking at the world’s most distributed and pervasive sensing instrument. Thanks to an increasing number of built-in sensors—ambient light, orientation, acoustical, video, velocity, GPS—each device can capture, classify, and transmit many types of data with exceptional granularity. The perfect platform for sensing the world is already in our hands.

Well, here's a cool example from OpenSpime: WideNoise is an iPhone application that uses the iPhone's microphone to measure environmental noise. These geo-tagged reports can then be used to create a noise map.
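The basic recipe is simple enough to sketch: sample the microphone, reduce the samples to a loudness number, and attach a location. Here's a hedged Python illustration; it is not WideNoise's actual code, and the function names and uncalibrated dB figure are mine.

import math

def rough_noise_level(samples):
    """Very rough loudness estimate (dB relative to full scale) from
    normalized microphone samples in the range [-1.0, 1.0].

    Generic RMS-to-dB math for illustration only; a real app would
    calibrate against the device's microphone to report meaningful values.
    """
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(max(rms, 1e-9))

def noise_report(samples, latitude, longitude):
    """Bundle a reading with a geo-tag, ready to plot on a noise map."""
    return {"db": round(rough_noise_level(samples), 1),
            "lat": latitude, "lon": longitude}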

Rewind a bit: 'Spime' is a blend of 'space' and 'time'. A Spime is defined by OpenSpime as a technologically enabled device that interacts with both the physical and the digital environment, aware of its location and carrying a history about itself. OpenSpime is working to create a Jabber/XMPP protocol that allows Spimes to report information about themselves, and has a set of Python libraries in development as a first project.
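To give a feel for the reporting idea, here's a rough Python sketch of the kind of geo-tagged stanza a Spime might push over XMPP. It uses only the standard library and made-up element names; it is not the OpenSpime protocol or its Python libraries.

import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def spime_report_stanza(spime_jid, latitude, longitude, noise_db):
    """Sketch of a geo-tagged sensor report wrapped in an XMPP-style
    message stanza. The element names and attributes are illustrative;
    the real OpenSpime protocol defines its own namespaces and payloads.
    """
    message = ET.Element("message", {"from": spime_jid,
                                     "to": "datasink@example.org"})
    report = ET.SubElement(message, "report",
                           {"timestamp": datetime.now(timezone.utc).isoformat()})
    ET.SubElement(report, "geo", {"lat": str(latitude), "lon": str(longitude)})
    ET.SubElement(report, "noise", {"db": str(noise_db)})
    return ET.tostring(message, encoding="unicode")

print(spime_report_stanza("widenoise-phone@example.org", 35.91, -79.05, 62.3))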

A handful of threads that really are woven together...

Friday, January 16, 2009

Renci featured in Endeavors Magazine

Cool article on Renci and the power of visualization...in Endeavors Magazine right here.

Thursday, January 15, 2009

InfoMesa and Databases in the Cloud - Windows Live

InfoMesa is a technology demonstrator from Microsoft Life Sciences that I've blogged about previously. I've been working (as time permits) on two tracks.

First, I've forked the code to investigate how the interactive whiteboard/electronic research notebook would work in an environment like the Social Computing Room. I've already had some interesting results, and the product of that work is in day-to-day use by researchers. The main departure in this case is turning the interface 'inside out': essentially creating a 12,288x768 workspace while migrating all of the other interface items into context menus. In the SCR port, there's more concern with laying out content around the room than with having a free-flowing scrollable palette. The plan is to put this fork 'out there' and then ask researchers what sorts of tools they would like. I'm incrementally porting InfoMesa features to this environment as I can, and am especially keen on adding annotation features.
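As a back-of-the-envelope picture of the 'inside out' layout, here's a Python sketch that assumes the 12,288x768 workspace maps evenly onto the SCR's four walls (3,072 pixels per wall). The actual port may carve the space up differently; this is just the coordinate arithmetic.

# Laying content out "around the room" instead of on a scrollable canvas.
WORKSPACE_WIDTH = 12288
WALLS = 4
WALL_WIDTH = WORKSPACE_WIDTH // WALLS  # 3,072 pixels per wall (assumed)

def wall_for_x(x):
    """Which wall (0-3) a whiteboard object at horizontal position x lands on."""
    return (x % WORKSPACE_WIDTH) // WALL_WIDTH

def workspace_position(wall, offset_x, y):
    """Convert a wall-relative position into workspace coordinates."""
    return wall * WALL_WIDTH + offset_x, y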

This brings me to the second track, which is looking at the 'cloud' as the source of data. In this way, InfoMesa becomes a 'browser' of sorts. The InfoMesa database becomes a link repository, a tagging service, and an annotation service. The whiteboards are composed of objects that may exist outside of the InfoMesa metadata repository. The data may live in a database, may be a resource on the web identified by a URI, or may be stored in a cloud database such as SQL Data Services in Microsoft Azure or Amazon S3. The InfoMesa blog has some interesting demonstrations of porting the whiteboard metadata to Azure in the post InfoMesa and Databases in the Cloud - Windows Live.
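As a rough picture of what such a link/metadata record might hold, here's a Python sketch. The field names are my own shorthand, not the actual InfoMesa schema.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class WhiteboardObject:
    """Illustrative metadata record for a whiteboard object whose content
    lives outside InfoMesa and is reached through a URI."""
    object_id: str
    uri: str                   # e.g. an S3 object, an Azure/SDS entity, or a plain web URL
    content_type: str          # image, video, document, dataset, ...
    position: Tuple[int, int]  # (x, y) in the whiteboard's coordinate space
    tags: List[str] = field(default_factory=list)
    annotations: List[str] = field(default_factory=list)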

I think the cloud data idea is useful, making whiteboards accessible anywhere. What I think is potentially more interesting is to treat InfoMesa as a resource browser and annotation platform, and to look at creating pluggable hooks in InfoMesa for retrieving whiteboard objects from different locations, such as the above-mentioned cloud data services. Questions to answer include designing such a pluggable architecture, laying out a metadata schema that can store and properly reference data in the cloud, and defining the security layer so that a whiteboard can access data with the proper credentials.
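One way to picture the 'pluggable hooks' idea: providers register themselves by URI scheme, and the whiteboard asks a registry to fetch an object's bytes, passing along whatever the security layer hands it. A hedged Python sketch; the provider functions here are stubs, not real web or cloud-storage clients.

from urllib.parse import urlparse

_PROVIDERS = {}

def register_provider(scheme, fetch_fn):
    """Register a fetch function for a URI scheme (e.g. 'http', 's3')."""
    _PROVIDERS[scheme] = fetch_fn

def fetch_object(uri, credentials=None):
    """Dispatch to the provider that handles this URI's scheme."""
    scheme = urlparse(uri).scheme
    if scheme not in _PROVIDERS:
        raise ValueError(f"no provider registered for '{scheme}' URIs")
    return _PROVIDERS[scheme](uri, credentials)

# Stub registrations standing in for real clients.
register_provider("http", lambda uri, creds: b"...fetched over HTTP...")
register_provider("s3", lambda uri, creds: b"...fetched from Amazon S3...")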

The ability to store and annotate data in these views has other interesting benefits, including 'wall-to-wall' collaboration. I imagine augmenting collaboration sessions with video conferencing between two visualization environments, where some ability to synchronize whiteboards brings remote parties together. I'm also interested in the more pragmatic ability to let a researcher design a whiteboard at their desktop that represents the agenda for a research group meeting, then walk up to a visualization wall, or an environment like the SCR, and have their data appear, ready to go. Think of this as a more fluid way of developing a presentation, where sequences of slides are less useful than a free-flowing group interaction with imagery, video, and other types of visual data.
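The 'walk up and have your data appear' scenario boils down to serializing the whiteboard built at the desktop and letting the wall or SCR client load it. A minimal sketch, assuming a simple JSON handoff; the real synchronization channel is still an open question.

import json

def save_whiteboard(objects, path):
    """Serialize a list of whiteboard-object records (dicts) to a file."""
    with open(path, "w") as f:
        json.dump(objects, f, indent=2)

def load_whiteboard(path):
    """Load the records back so the wall client can lay them out."""
    with open(path) as f:
        return json.load(f)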

You can see that the potential is endless. I'm trying to keep it focused on practical use, making working in the SCR a productive and fun experience.

Wednesday, January 7, 2009

Working on a blog/website for Renci@UNC

I'm working on a WordPress-based website and blog for the Renci@UNC engagement center. I've added a Flickr photostream (well, at least a start), which should be visible on my blog. I'm using a WordPress plug-in (FlickrBox) to incorporate this into the new Renci@UNC site.

Lots of cool stuff to talk about; alas, I've been a bit busy actually doing the stuff rather than blogging about it, but I will soon (New Year's resolution). Anyhow, I'm back from the holidays and glad to be into it again!