Thursday, May 31, 2007

Google Gears

This was touted as Google going straight at Microsoft: a framework for taking on-line apps off-line, called Google Gears. It appears that Google Reader and other apps are going to be outfitted with this capability, which is very cool.

One of the most frequently requested features for Google's web applications is the ability to use them offline. Unfortunately, today's web browsers lack some fundamental building blocks necessary to make offline web applications a reality. In other words, we found we needed to add a few new gears to the web machinery before we could get our apps to run offline. Gears is a browser extension that we hope -- with time and plenty of input and collaboration from outside of Google -- can make not just our applications but everyone's applications work offline.


Too many irons in the fire to play with this now, but I'll file it away, sort of today's mini-buzz after yesterday's MSoft Surface Computer wave. I'm also catching that people are complaining about Google Street View; check out Boing Boing for the blowback on that!

update: Here's another article on Gears.

Wednesday, May 30, 2007

Microsoft Surface Computing - HCI and UbiComp

Sometimes, it's hard for me to pin down what my own blog is about. I tend to run many threads at once, and end up thrashing sometimes, as I suspect anyone working in technology does these days. The past few weeks, it's been about Second Life, and that continues, but I'm looking at other areas as well, such as plain old Web2.0, ubiquitous computing, agent computing, mobility, location aware services, SOA, and dynamic scripting languages (specifically Ruby and Rails these days). In the mix somewhere is my original interest in Java/J2EE, along with things like Spring.

Rather than a testament to a short attention span, I think this wide variation in themes is actually a sign of the times we live in. Developers no longer learn one language, and roll into every project with the same set of tools. The evolving web, the evolution of mobility, and the pervasive field of networked information and devices that surround us everywhere we go make for an interesting and challenging time. I'd like to suggest that the disparate topics covered in this blog are on a converging trajectory. Maybe that's what this blog can be about.

Case in point, check out this short video on Microsoft's Surface Computer. I think this is an exciting platform that brings together a bunch of ideas. Essentially, this is a big, touch-sensitive display that uses gestures to manipulate data. The cool thing is that it's multi-touch, so you can gesture with both hands, and multiple people can interact with the computer at the same time. In addition, the Surface Computer is sensitive to physical objects. It can sense these objects, and also interact with other computers placed on the surface.

  • The 'multi-touch' is collaborative. Technology is getting more and more social. This reality is core to Web2.0, as well as the evolving 3D web. We're not isolated from each other anymore, we Twitter and blog, we IM and message, now we can compute together.
  • The Surface Computer bridges the physical and the virtual. In the video, they demonstrate placing a device on the surface, having it dynamically connect, and using a gesture to shoot a photograph into the device. The natural action of placing a device of interest on the collaborative surface, and being able to manipulate it, is a step towards useful ubiquitous computing.
  • The Surface Computer could be an interesting new metaphor for web collaboration in the way that avatar representation in Second Life creates a sense of immersion. I think it won't be long before you can assemble remotely around a common 3D web surface, with remote participants as avatars.
The combination of natural interface, immersion, and the ability to easily incorporate data from the web, or from other devices, in collaborative ways seems like a natural progression.

Tuesday, May 29, 2007

Google street view

Just go here...it's amazing. Alas, Franklin Street not available...

Second Life Best Practices in Education - Link Dump

Here's a nice link dump to catch up on the SLBP confo last week...

WUNC today, the rise and fall of Friendster

On 'The Story' today... Jonathan Abrams from Friendster.

quote:

Dick talks with Jonathan about what he learned from the success and later failure of Friendster, and how he plans to compete with a new social networking project in what has become a very crowded field.


Friday, May 25, 2007

More of SL Best Practices in Education confo


A few keynotes this afternoon worth catching... I'm catching the end of the IBM keynote now. The place is SRO.

Search for IBM and Second Life on YouTube (do that later) to see some examples of their use of virtual worlds. Question on when virtual worlds truly become mainstream? Mainstream is difficult to argue... what is mainstream? The talk frames it in terms of Internet 1 (democratization of access): connecting lots of people took about 10 years. Web2.0 (democratization of participation) took about half that time. The 3D internet is about people coming back. People are going to be involved in every aspect of the environment, and it will happen fast.

Pirate Shipman, adjunct faculty at Pepperdine, is the next keynote. Right-brain attitudes are important in today's world:

  • Design
  • Story
  • Symphony
  • Empathy
  • Play
  • Meaning
The basic premise is that we need to focus on these areas, versus left-brain logic. Pirate did a class project to develop virtual content reflecting these six aptitudes. This comes from the book "A Whole New Mind". Students were given a small area and a 150-prim limit for their projects. We watch slides of the building progress on the class island. Students experimenting... The conclusion was that it was a powerful learning experience for all involved.




That's the keynote in progress..

Question... how do we teach in a virtual world? We need to discuss strengths and weaknesses.

Strengths:

  • SL is a spatial experience. Virtual world has physicality. Shapes, sizes, movement, spatial relationships take on deep meaning.
  • SL is an immersive experience. We can respond as if we are really there... being in a virtual situation affects emotion and mood.
  • SL is a social platform. We can craft and present an identity to others. "This makes the us that engages with others easier to become". (Interesting).
  • SL has tools to connect to and communicate with others. LSL allows us to develop socially aware objects.
  • SL democratizes the ability to create content and learning artifacts, it's a participatory medium. (This is what I think is the key point, which SL captures well).
  • SL enables collaborative development of objects. (I think building things with others, and having the tools in-world, is the key...this is why SL works, and why importing from blender, etc is not important, and rather not the point, but that's just me!).
Some talk about the playful spirit that is the game part of the environment... the keynote speaker is wearing a pirate eyepatch.

Weaknesses:

  • Effective communication of large amounts of data is difficult.
  • Technological overhead high.
  • Combo of 2D with SL lacks synergy most of the time. Showing 'flat' images, for example, is still easier in a browser interface...
  • Activities outside of the scope of what second life does can usually be done better outside of SL. "Sometimes, though, the novelty may be enough".

OK...gonna hit some posters, then I gotta go....

Here's a parting shot of one of the posters, this one for the SL Genetics Center. All in all a remarkable day, and an effective use of SL. I've even got an inventory full of junk now that I have to sift through!

Second Life Best Practices in Education



Popped into the SL Best practices conference. Hit the registration tables, got my gift bag, off to a discussion about DRM and contracts in virtual space.


Here's the scene at the presentation space...Parallels right now to Viacom v YouTube, discussion of safe harbor provisions as they will apply to Second Life. Real life money is involved in Second Life, real life legal action will follow.




What happens if someone misappropriates my virtual-space content outside of virtual space (outside of the 'rules' of Second Life)? There are layers of rights (game code, copyright), but do they give a practical answer?

Interesting angle... What happens when the players don't like what the game designers/owners are doing? Interesting because this could be seen as a game, but significant investments could be made in virtual space. How does freedom of speech, etc. play in?

Second Life may be becoming more open source. How about Creative Commons licenses? There are some mappings between the SL object permissions and CC permissions. What does CC give us in SL that copyright does not give us in SL? We need to be careful here about coding or contracting away rights that may be in copyright. Note this is from an Australian perspective.

Next pres, Profiling e-world customers...This is an e-marketing presentation.

  • personalization of products and services (CRM)
  • online profiling of customers (what's that?)
  • all e-world transactions leave traces of data
  • issues involving trust, accuracy, privacy
Online profiling is the practice of aggregating info about consumers' lifestyles, product preferences, and purchase history. It adds up to a picture of the customer's buying behavior. Now the talk turns to SL-specific approaches...

Brands and self image apply in virtual space. The research discussed addresses the issue of online profiling by introducing self-profiling. Capture the online ideal self-image via brand personality. What evolving brand traits could be highlighted online? SL objects must be designed in a way to construct an ad-hoc shop display for each customer, sensing their profile, brand identity, and self profile.

Between presentations. It's interesting to watch typical real life confo behaviors play out in virtual space. Small groups congregating, people walking in to check the scene, and deciding to attend the other talk after looking at the boards. Presenters trying to figure out how to get their slides to show, etc. IOW oddly familiar. Also cool is how people linger to associate, and look around to check who else is in the room. Lots of networking going on, and informal off-line talk.

Into a panel discussion now... gonna sneak to the Grind for coffee. Just like a real confo!

OK..came back at the end of the panel..darn..took too long. Off to the vendor area, outreach, etc.


Popped in to a bit of discussion, SL Education is about collaboration and partnerships. Example, various teachers can swap sections to teach content that they are best at, and enjoy teaching the most. The content is the focus.




That's a shot from the discussion area. I'll wander over to the exhibits now. Saw the Moodle Booth, vendors who build sims, libraries. Here's an interesting thing, a booth for the National Underground Railroad Freedom Center, who will be coming to Second Life...




More wandering..

Digital Campfires Foundation, looking at creating thinklets within Second Life. UTD has an impressive booth; going to have to visit their campus. Some discussion ensued about an open source model within SL. SLForge was one attempt to start this. Perfect for higher ed!

Schome looks like an interesting pilot project. "Schome Park was the site of a pilot project during which over 100 members of NAGTY (National Academy for Gifted and Talented Youth) collaborated with staff in Second Life to seek ways to transform educational spaces and practices."

Map of forces shaping the future of education, from the Knowledge Foundation... Feed of current projects from My SL Project. Here's their academia edition. Saw Columbia College Chicago has a cool sim, I AM Columbia. I'm going to do a quick teleport to their sim.




Checking out the galleries at Columbia College.

Back to the presentation hall. Instead of turning off our cell phones, we are reminded to kill all particles and animations. No SL bling for lag's sake.

Suzi Mazzenga (Xirconnia Morphett in SL): Drawing on Second Life Experiences to Enrich the First Life.


This presentation is being simulcast over the plain old web via SLCN (Second Life Cable Network) now.

Living outside of the circuit box... applying learning about self in Second Life to First Life... Compare your avatar in SL to real life. Mine is a virtual doppelganger. Or is your avatar a subconscious blueprint for making a better 1st life you?

What does your SL interaction say about you? Is your avatar a virtual extension of your personality? What does your SL home say about you? Did you put up a virtual wall to block your neighbor's view? Would you do the same things you do in SL in real life? Is there morality in SL? Various giggles and grins in audience.



Here's a shot of the talk, note the video screen showing the live simulcast.

How many have group meetings in SL? Do you have a traditional board room for meetings? Look at other non-traditional settings. Coffee shops, koi ponds...no walls, no chairs. Restrictions and expectations can be changed by working within SL as groups.

There is no 'back of the room' in SL collaborations. Gonna break out, do some RL work, and hit the IBM keynote...

Thursday, May 24, 2007

IBM Promo Video on their new Virtual Biz Center

The simple thing that stuck with me, back when I heard Dr. Irving Wladawsky-Berger talk as part of the RENCI Distinguished Lecture series, was the observation that the '2D' web was about taking the catalog and putting it into the browser, while the 3D web is about taking the whole store, sales staff included, and putting it in a virtual space. The purely commercial side of the web is not the whole story by any means, but I suspect that the next Amazon or eBay will rise from the 3D web, and that makes this stuff exciting to watch.

So dig this little YouTube video about the new IBM Virtual Biz center. I took five and logged in, and happened upon the real virtual site, and saw this thing on the Eightbar blog. IBM's Second Life presence is impressive, and I often point people there when they let me know that the virtual web is just a game....

Approx 31 Blogging about mobile + SL, Croquet

Willi's got a bit in his blog about some prototypes we're working on. This one is meant to show on-scene disaster workers walking around assessing damage with GPS- and camera-equipped mobile devices.

I messed with Croquet a bit yesterday, just out of interest, and also a bit because I was miffed at Second Life for deciding to bring the grid down. Such is life on the front end of the hype curve!

Initial impressions...

Second Life is more like the web: it has the chaotic, random feel that comes from exploration, searching, and inputs from your social/professional network. Croquet has more of a self-contained, peer-to-peer feel. Right now I see Croquet as a targeted application development platform. I also see Croquet as lending itself to more abstract presentation. I really love the metaphor in Croquet of jumping through portals to navigate to new worlds! I'm quite sure I've only scratched the surface on Croquet, but it seems less collaborative, and more about cool new metaphors for information. In other words, I don't feel as much immersion, or presence of myself or others, in the Croquet demos, but I do see more of a free-flowing, abstract experience.

That's not a knock, I see different horses for different courses. I'm even more of a Croquet noob than a Second Life noob, so take everything with a grain of salt. I've got to dive into Blender and Squeak, I think, before I really have a good grounding on the potential of Croquet.

Best Practices in Education Confo in Second Life Tomorrow

By way of Kathy Kyzer..

Planning is currently underway for the first Second Life International Best Practices in Education Conference!

The conference will be held on May 25, 2007 in venues all over the Second Life world, with exciting presentations, vendors and exhibitors, and everything an educator needs to know to get started exploring the possibilities for teaching, learning, and research in Second Life!

Here's a link with more info

Tuesday, May 22, 2007

IBM Presentation on Virtual Worlds

I think IBM has correctly identified the conceptual link between today's Web2.0 formulation and the future 3D virtual web.

Check out this presentation from the eightbar blog, now in my 2D Google Reader. One great point is made in a quote from Ian Davis' blog, where he says, about Web2.0: "Web2.0 is an attitude, not a technology. It's about enabling and encouraging participation through open applications and servers."

I have been asked several times why I'm messing with Second Life, when X, Y, or Z allows you to use your 3D modeling tool of choice to import a CAD drawing, etc. This is a very good question, and I can't claim to know the answer, but here are some observations.

First, I think it's been said that people working in technology A are probably better at predicting the future of technology B, which may be a less familiar field, than in their 'own' field. This could be true. I know that as a hard-core Java/J2EE developer, it took me some time, and distance, to understand the Web2.0 ethos, as well as the value of dynamic scripting languages such as Ruby. I think the same holds true thinking about virtual worlds. My immediate reaction to the questions about various virtual worlds is to try and remove the lens of past experience with 3D technologies.

Second, my reaction is to reframe the question, and to reconsider the 'things that matter' when judging virtual content. I helped (in small ways) as UNC worked on its Second Life campus. The natural first step was to make the UNC island look like UNC by building various familiar landmarks, such as the Old Well and the Bell Tower. Great... Then soon after the island opened up, Intelligirl took one look and pretty much said it can be a mistake to push the real world into the virtual. I took a step back, and had to agree that the virtual space is different, and is neither a game, nor is it real life. It needs to be taken as it is. The true goal in virtual space is not the ability to accurately replicate real life; the goal is to provide content formed by community participation.

Third, remember when Mosaic came out, and all there was was a tour of Amsterdam with some hyperlinked pictures? Anyone looked at any HTML source lately? HTML is crap, I'll go ahead and say it! The most important point is to imagine the end product based on the creative power of the participants, not on the nature of the component tools. Here is where Web2.0 comes into play. The power of Web2.0 is built from simple pieces of technology... RSS, HTML, AJAX, XML, etc. It's the fact that modest tools have been put in the hands of the many, with low barriers to entry, that's made the web a revolution. Comparing the tools in a space like Second Life to those on the workstations of professional designers is not the test! These comparisons lead to faulty conclusions. The real test is: does the environment provide sufficient facility to create compelling content, and enough critical mass to create communities around that content? To me that's the test, much different than the ability to manipulate sophisticated polygons!

I'm still somewhat skeptical that Second Life is the answer. Its move towards open source gets things moving in the right direction, but if the Web2.0 analogy is to hold, then it implies that a walled garden will fail. Certainly the ecosystem is there, but new generations of web users are fickle, and will jump if a more open, buzz-worthy environment arises. My guess is that alternative virtual worlds will reach hype-parity with the current Second Life buzz, there will be an explosion, and potentially open source will move in to create some standardization. Either way, it's going to be an interesting, and increasingly virtual, web.




Second Life controlling Real Life



We're working with some ambient orbs for multiple purposes. Since we had one handy, I got a developer account for a custom channel, and slipped in a mash-up between the orb and Second Life.

The basic idea of a custom channel is that you can call a URL for your own orb, and specify the current color and animation. These URLs look something like this...

http://www.theurltheygiveyou/thescripttheygiveyou.jsp
?devID=YOURDEVICEID&anim=0&
color=12&comment=this+is+steady+green

And off you go. From that, it's a simple matter to plug in an HttpRequest in LSL. For this first cut, I just wanted to get the thing working, so I have a display, along with several color buttons. Push the button, the color display changes, and the correct site is called with the correct parms. This took all of a cup of coffee to get through, but I think it shows some interesting potential. One thing I've discussed for quite a while is the idea that the physical and the virtual are merging.
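For grins, here's a minimal Ruby sketch of making that same call from outside of Second Life, which is handy for sanity-checking the channel; the host, script name, and devID are just the placeholders from the URL pattern above, not real values.

require 'net/http'
require 'uri'

# Placeholders below come straight from the URL pattern above; swap in the
# host, script, and devID that come with the developer account.
def set_orb(color, comment)
  params = { 'devID' => 'YOURDEVICEID', 'anim' => 0,
             'color' => color, 'comment' => comment }
  query  = params.map { |k, v| "#{k}=#{URI.escape(v.to_s)}" }.join('&')
  Net::HTTP.get_response(URI.parse(
    "http://www.theurltheygiveyou/thescripttheygiveyou.jsp?#{query}"))
end

set_orb(12, 'this is steady green')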

Anyhow, this changes the status of the orb on the ambient site, but I'm having a heck of a time getting the physical orb to respond. Perhaps there's a radio coverage problem; I'll have to look into it. But this cap from my orb administration page shows that, at least on the web site, my orb is the correct color.






The initial concept is to create a DEFCON-style alert system. Since we're playing with a sort of collaborative disaster management scenario, this SL widget could control a group of orbs for people who need to be alerted when something is going on. Perhaps the orbs on their desks start flashing red, and everyone teleports into the collaborative space... that sort of thing. Another more generic idea would be an app that changes the orb color based on whether anyone is in the sim, showing the number of people, for example; a rough sketch of that mapping follows below. If you were a help desk for a retailer running a virtual store, the orb could show the amount of foot traffic, and alert virtual sales avatars to pop in to close a sale.
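Just to make that idea concrete, here's a toy Ruby mapping from sim headcount to an orb color code. Only the green code (12) comes from the example URL above; the thresholds and the other codes are made up, and the count itself would come from an in-world sensor sweep.

# Toy mapping from number of avatars in the sim to an orb color code.
# 12 is the 'steady green' from the example URL; the rest are placeholders
# to be looked up in the channel docs.
def orb_color_for(avatar_count)
  case avatar_count
  when 0    then 12   # steady green: nobody home
  when 1..5 then 6    # 'yellow': some foot traffic
  else           0    # 'red': the sim is hopping, send in the sales avatars
  end
end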

Monday, May 21, 2007

Imagine that..

Maps with 3D symbols representing real-world data in real time...this is very similar to the things we're working on in a disaster scenario...Here's how it's described. This is very cool, and worth a visit.



The floor plan is of the offices surrounding CJ in Real Life. The blue balls with white designs represent active Bluetooth devices. The pyramids scattered about the floor represent other people working, with the color designating things like physical presence or telepresence. The flame in the pic on the left is a probe under the lamp, “so when lamp is on the temp goes up.” The black rectangle in the righthand pic is a RL door (so CJ knows when someone enters his lab).

Friday, May 18, 2007

Some SL Shots...another mash-up too

In previous posts, I'd been talking about "SlIcer". It's really a simple framework, and it's only in rev 1, so don't let me give you the impression that it's more than some Ruby scripts, but I do think SlIcer is illustrating some cool 3D web mash-up ideas.

I am working on interacting with 'mash-up' info in the 3D world, concentrating on a few concepts:

  • Representation of, and interaction with, spatial information in a 3D environment.
  • Triggering real-world actions from a 3D interface.
  • Creating visualizations of real-world information using primitives in Second Life.
Here are a few simple illustrations I've been working on. First is a 'Mapping Room'...



There are a few things going on in this pic. See that small green sphere above the map? That's a Second Life part of SlIcer. It's an object sensor/hub. This thing scans for objects in range, and sends an inventory up to the RL server with key, object name, and position. I'm standing on a big map, and on this map are 'counters' for real-world objects. The counter closest to me on the right is a representation of a communications truck with a balloon antenna. This has a known name, and using SlIcer, I can hook up a GPS signal from the truck, get the lat/long, as well as other information, and push it into SL to move the truck counter on the map as the real truck moves (I sketch the coordinate mapping a bit further below). It's hard to see, but the object has a balloon that deploys as the real-world balloon deploys. People collaborating on the map in Second Life can walk around, point, etc. This has a couple of benefits:

  • Collaborators in Second Life are able to point, position their avatars, and refer to objects on the maps. This is hard to do with other forms of remote collaboration.
  • Situational information is presented as ambient information. In a funny way, you can create virtual physical devices representing virtual data. How cool is that? As a side-line, our work with AmbientDevices orbs is going to merge in here too, where an orb can reflect information about what is happening in the virtual space.
Note also that SlIcer is keeping track of objects on the map, and it is possible to view the positions of counters as symbols generated by a GIS map server. Individuals in Second Life could move a counter to a particular place on the virtual map, causing real-life GIS to be updated! This works (with some bugs to work out), but it was hosing up when I took the screenshot. There is a media viewer in the above picture that's displaying a 2D barcode link to approx31's blog...drat...it's supposed to show a map with the Second Life data as symbols in a GML layer. Just trust me, it does kinda work.
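Since I get asked how the truck counter 'knows' where to sit, here's the rough idea of the coordinate mapping, sketched in Ruby; the map bounds and sim-local numbers are invented for illustration, not the real build.

# Hypothetical bounds: where the map prim sits in the sim, and the
# real-world area it depicts. All numbers here are made up.
MAP_ORIGIN = { :x => 100.0, :y => 100.0 }       # sim-local SW corner of the map
MAP_SIZE   = { :x => 20.0,  :y => 20.0 }        # map footprint in meters
AREA_SW    = { :lat => 35.80, :lon => -79.10 }  # real-world SW corner
AREA_NE    = { :lat => 35.95, :lon => -78.90 }  # real-world NE corner

# Linear interpolation of a GPS fix onto the virtual map, clamped to the edges.
def gps_to_map(lat, lon)
  fx = (lon - AREA_SW[:lon]) / (AREA_NE[:lon] - AREA_SW[:lon])
  fy = (lat - AREA_SW[:lat]) / (AREA_NE[:lat] - AREA_SW[:lat])
  fx = [[fx, 0.0].max, 1.0].min
  fy = [[fy, 0.0].max, 1.0].min
  { :x => MAP_ORIGIN[:x] + fx * MAP_SIZE[:x],
    :y => MAP_ORIGIN[:y] + fy * MAP_SIZE[:y] }
end

The resulting x/y pair is what would get pushed through SlIcer so the in-world counter can be repositioned.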

Anyhow, here's another mash-up, attempting to do visualization of data. If only you could load textures on prims dynamically from a URL! This pic shows a tropical storm mash-up. It's a mock-up right now, as I look for a good data source. The idea would be to parse a data stream, such as RSS, that shows current tropical depressions and hurricanes, and depict them on a map. Also shown is a fly-out I'm working on. As it operates now, you click on a storm, and the fly-out rezzes. The fly-out uses prims to depict intensity (the red bar), storm direction (the compass rose and pointer), and storm track speed (the blue bar). If I can find a data feed, this could be wrapped up into a stand-alone mash-up!
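If a feed does turn up, the parsing side is simple enough; here's a hedged Ruby sketch using the standard RSS library, assuming a made-up feed URL and a made-up convention of lat/lon/wind pairs in each item description. A real feed would need its own parsing.

require 'open-uri'
require 'rss'

# FEED_URL is a placeholder -- I still need to find a real data source.
FEED_URL = 'http://example.com/tropical_storms.rss'

feed = RSS::Parser.parse(open(FEED_URL).read, false)
storms = feed.items.map do |item|
  # Assume each description carries "lat=.. lon=.. wind=.." pairs.
  fields = Hash[*item.description.scan(/(\w+)=([-\d.]+)/).flatten]
  { :name => item.title,
    :lat  => fields['lat'].to_f,
    :lon  => fields['lon'].to_f,
    :wind => fields['wind'].to_f }
end

Each storm hash could then be handed to the same lat/long-to-map mapping used for the truck counter above.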

The storms are prims too; in (real) Second Life they spin and stuff, way cool. It's dumb stuff like that that amuses us programmers.

Wednesday, May 16, 2007

REST-ful Rails reflections, SlIcer, and the 3D Web

So I've been quite busy throwing together a web/Second Life mash-up, implementing ideas outlined in a previous post. The basic premise is to build a framework (I apply the term loosely; perhaps 'hack' is the more appropriate term) that could start bridging the gap between Second Life and the web.

I called the framework SlIcer for the hell of it, and began implementing it using Rails. The basic functions I'm working on include:

  • A set of sensors, deployed into Second Life, to detect people, and to detect scripted objects within range. These sensors are installed in various areas, and tuned for sensor range and angle to provide localized service. The sensors repeatedly sweep for named objects and people, and report up to SlIcer via HttpRequest with the object name, Second Life key, and the location vector (a rough sketch of the receiving side follows after this list). It would certainly be possible to add state properties to the report for real-life storage as well.
  • Co-located with certain sensor types are 'hubs', which act as the bridge between Second Life and real life. This may be a dumb idea, but I thought it useful to create a database to store and forward incoming data in SlIcer. The hub would poll SlIcer, and get messages as bundles. The bundles are pulled into Second Life, marked as distributed in the database, and then distributed to the target objects in range. The main benefits are:
    • A real time view of the objects in range, and the current position. Again, state properties are also possible. External applications could use this data in ways I haven't quite come up with yet...
    • An ability for real life apps to address objects within Second Life by a 'plain english' handle. This obviates the need to know the current key of a rezzed object in the sim.
  • A plain http call can be put into any application to post a message bound for a Second Life object. Any application can push data into Second Life without any SL cruft, as long as the name of the target object is known.
  • The (future) ability to add reliable delivery (sequencing of messages, and delivery confirmation) semantics if this seems useful.
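To make the first bullet concrete, here's a rough cut of what the receiving side of a sensor report might look like in a Rails controller; the SlObject model and its field names are my placeholders here, not the actual SlIcer code.

# app/controllers/reports_controller.rb -- sketch only.
# The sensor's HttpRequest is assumed to POST name, sl_key, and x/y/z params.
class ReportsController < ApplicationController
  def create
    obj = SlObject.find_by_sl_key(params[:sl_key]) ||
          SlObject.new(:sl_key => params[:sl_key])
    obj.update_attributes(
      :object_name  => params[:name],
      :x => params[:x], :y => params[:y], :z => params[:z],
      :last_seen_at => Time.now)
    render :text => 'ok'   # keep the response tiny for the LSL side
  end
end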
A later development might be to have some sort of event pub/sub mechanism. For example, anytime an object within a location is touched, it could report that event, and listeners outside of Second Life could tie in. We're looking at ambient orbs, as an example. It would be possible, given these mechanisms, to watch for certain thresholds of people occupying a certain room in a sim, and change the orb status...really, any number of interesting things would be possible.

So I've got the basics working using Rails. I have not tried NetBeans and Rails yet, sticking with RadRails for now; that seems to be all I need at this point. My original impulse was to use a REST-ful approach, given the new facilities in Rails for scaffold_resource. The particulars are described in this great tutorial. This was particularly attractive as the llHttpRequest LSL function supports GET, POST, PUT, and DELETE. At any rate, I had some initial success, but rapidly ran into strange problems with Second Life. There are likely some odd Accept headers sent out from an HttpRequest, because I began running into 406 Not Acceptable errors. Lazy developer that I am, I just decided to punt REST for now, and just use GET and POST within normal Rails controllers. Now it works fine. I'll go back and dig when I have time... but I just don't.
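For what it's worth, the punt looks roughly like this in routes.rb terms: skip map.resources and wire plain named routes to ordinary GET/POST actions. The paths and controller names here are mine, just to show the shape of it.

# config/routes.rb -- plain routes instead of map.resources, since the
# resource routes kept answering the in-world HttpRequest with 406s for me.
ActionController::Routing::Routes.draw do |map|
  map.connect 'slicer/report',  :controller => 'reports',  :action => 'create'
  map.connect 'slicer/pickup',  :controller => 'messages', :action => 'pickup'
  map.connect 'slicer/enqueue', :controller => 'messages', :action => 'enqueue'
end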

It's an odd thing if you think about it. I was going through all sorts of gyrations to implement REST-ful interfaces, which was supposed to make things 'simple'. Really, the gyrations were there so I could code to a particular programming philosophy, and while attaining a philosophically pure implementation still has its attraction, I was wasting time. Ironic!

I want to add props for a pretty interesting blog on the 3D web; there's more out there than I realized, and I thought I was pretty tiresome hyping this technology around my shop! Look through some of the reported developments, and see if you agree... hype or not?

Well, the sim is down for maintenance, hence the attention to the blog. I'll put up some screen shots of how we're using SlIcer when I can get back in! Cheers...

Thursday, May 3, 2007

REST-ful Rails

Ok, I'm drinkin the kool-aid...

I've been hacking around (with purpose) on the ideas described in my previous post. I've got a prim in Second Life that acts as a sensor, identifying active objects (those running scripts), and capturing the name, SL key, position, etc. of each responding object. The first case is to collect these relations between the 'given' name and the SL-assigned key for each object. Given that my sensor works, I'm working tonight to create the interface on the real-life side. The sensor can take each detected object, and use the LSL http request to send this observation for storage outside of Second Life.

Http requests from SL... there's good reason to take advantage of REST-ful approaches. That's where I am tonight, going through a really interesting tutorial/document about building a REST-ful interface using Ruby and Rails. I'm losing steam, and won't have it all done tonight, but should be soon! As I posted yesterday, maintaining an easily updated database/registry of SL objects is the first step. Next, I'm working on more formalized mechanisms for a store-and-forward queue from RL => SL, and some fashion of an event pub-sub mechanism for SL => RL. The latter may be a cursory attempt, just for grins, but I could see applications.

Wednesday, May 2, 2007

2-way messaging with Second Life

So I'm messing with the idea of a messaging hub for Second Life. The idea would be to create a web app that would accumulate messages bound for named objects in Second Life. These would queue up in a database, and be delivered in batches over some period that would not kill the sim in question.

So you'd have a table with stuff like this:

sl_bound_messages
----------------
id
destination (fk to id)
contents (right now, big varchar text)
source (source of message)
delivered_at (time of batch pick-up)
hub_pickup_id (fk to hub)
status (p = pending, b = batched, e = error, n = no object)
date_created (will be queue time)
date_updated

So an external application, fed by a GPS unit, could report the location of an RL object. This hub would accept the report, and hold it until it can push it through to Second Life. The hub could wait for polls from a 'repeater' in SL, which could be a prim running http requests on a timer. The 'repeater' would call up to the hub, get a batch of messages, and then distribute them around to target objects.
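The hub end of that poll could be as simple as this sketch: grab a batch of pending rows, mark them as picked up, and hand back something the in-world side can parse. The pipe-delimited response format and the batch size are just assumptions.

# app/controllers/messages_controller.rb (fragment) -- rough sketch of the
# repeater's poll. Field names follow the sl_bound_messages table above.
def pickup
  msgs = SlBoundMessage.find(:all,
           :conditions => ["status = ?", 'p'], :limit => 25)
  msgs.each do |m|
    m.update_attributes(:status => 'b',
                        :hub_pickup_id => params[:hub_id],
                        :delivered_at  => Time.now)
  end
  # One message per line; LSL can split this with llParseString2List.
  render :text => msgs.map { |m| "#{m.destination}|#{m.contents}" }.join("\n")
end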

Objects in SL could also be hooked into the hub, in two ways:

  • An object can be scripted to listen for, and register with, a hub. The object could report its location in the sim as it is pinged, as well as the unique id that is assigned to it. This gives some ability to track the location of objects within SL, and allows the hub to provide a handle to objects in the sim by relating a name with the SL-assigned unique id.
  • The hub could know the objects it 'has', and route messages to the objects. The hub would request batches of messages on some period.
The hub could keep a registry that looks something like this:

sl_uids
-------
id
sl_uid
object_name
description
type (a = avatar, o = object, h = hub)
last_check_in
object_location_x
object_location_y
object_location_z
date_updated
date_created
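A check-in action to keep that registry current might look like the sketch below; note that a column literally named 'type' collides with Rails' single-table inheritance, so the sketch calls it type_code, and the other names are placeholders.

# app/controllers/registrations_controller.rb (fragment) -- sketch of the
# object check-in that relates a 'given' name to the SL-assigned key.
def check_in
  rec = SlUid.find_by_object_name(params[:object_name]) ||
        SlUid.new(:object_name => params[:object_name])
  rec.update_attributes(
    :sl_uid            => params[:sl_uid],
    :type_code         => params[:type_code],  # a = avatar, o = object, h = hub
    :object_location_x => params[:x],
    :object_location_y => params[:y],
    :object_location_z => params[:z],
    :last_check_in     => Time.now)
  render :text => 'registered'
end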

How to efficiently distribute messages, and how many http requests and deliveries can be done without causing lag, is an open question, but I have some things in mind given these facilities. Much of LSL is undocumented, so there could also be better ways of doing this, but it sure sounds like a fun experiment.


Also, a simple event publishing mechanism seems useful, where SL objects publish an event, and RL things can be notified. Simply, a prim with an LSL script could shoot out an http request on some event in SL; this would go up to the hub, and the hub could allow interested parties outside to register for these events. The events could note the object, object location, and any other relevant data, and the real world could react.
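A first pass at the hub end of that could look like this sketch, where outside parties register a callback URL and the hub pings them when an event arrives; the Subscription model and its callback_url are hypothetical.

# app/controllers/events_controller.rb (fragment) -- speculative pub/sub sketch.
require 'net/http'
require 'uri'

def publish
  event = { 'object_name' => params[:object_name],
            'event_type'  => params[:event_type],
            'x' => params[:x], 'y' => params[:y], 'z' => params[:z] }
  Subscription.find_all_by_event_type(params[:event_type]).each do |sub|
    # Fire-and-forget notification to the registered real-life listener.
    Net::HTTP.post_form(URI.parse(sub.callback_url), event)
  end
  render :text => 'published'
end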

I've got RadRails fired up, and gonna start hacking.

Tuesday, May 1, 2007

Ruby tidbits

This came across my reader, but I have not had a chance to dig and delve yet.

Sun Microsystems, Inc. (Nasdaq: SUNW) and the NetBeans(TM) Community, today announced an early access release of the NetBeans Ruby Pack which provides support for the Ruby programming language. The NetBeans plug-in offers developers added support for dynamic and scripting languages and includes editing features for both Ruby and JRuby - a 100% pure-Java(TM) implementation of the Ruby programming language that runs on the Java Virtual Machine.

In the past, I've messed with RadRails, and like it too. I've got a Ruby project in mind that I might start digging into this week, and I'll give it a spin. RadRails is now available over on the Aptana site, and is to be folded into the Aptana IDE, which is another IDE for web development/Ajax/CSS/JavaScript. I have not tried that either, but some demo shots of Aptana are here.

Needless to say, I've got some catching up to do. I've been focused on a few projects, and if you take your attention away from your feeds for a couple of weeks (and neglect your blog), then you end up behind the curve!

As far as the idea for Ruby, I'm trying to merge my interest in Ruby/Rails, mash-ups, and Second Life. Our group has put up a server dedicated to some Second Life mash-ups, basically a place where scripters working on several UNC islands in SL can collaborate on supporting code. We're starting out looking at tools like silo, and thinking about tools we might gin up. A prime target early on will be to look at better ways to get data in and out of SL. XML-RPC is one mechanism available to push data into Second Life, but it seems unreliable, and I wonder about its long-term viability.

As an alternative, here's a Rube Goldberg idea, which brings me back to messing with Rails IDEs. Essentially, I'd like to write a store-and-forward queue arrangement. Put out a REST-ful interface on our mash-up server. You may send messages to an arbitrary object in a Second Life sim (and the framework would have to have hooks to resolve a name to a unique id assigned within SL). The framework would accumulate and sequence these messages. Then, build a series of prims that check the queue on a timer, and retrieve bundles of queued messages, routing them to the proper objects within Second Life. These routers could also manage the resolution of SL unique ids.
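To sketch the 'accumulate and sequence' bit in Ruby: an enqueue method could resolve the friendly name against the registry and stamp each message with a per-destination sequence number. The seq column and the model names are assumptions for illustration, not working code.

# app/models/sl_bound_message.rb -- sketch of enqueueing with sequencing.
class SlBoundMessage < ActiveRecord::Base
  def self.enqueue(object_name, contents, source)
    dest = SlUid.find_by_object_name(object_name)
    return nil unless dest   # unknown name; let the caller decide what to do
    last = find(:first, :conditions => ["destination = ?", dest.id],
                        :order => 'seq DESC')
    create(:destination => dest.id,
           :contents    => contents,
           :source      => source,
           :seq         => (last ? last.seq + 1 : 1),
           :status      => 'p')
  end
end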

I'm still new to the environment, so I don't know how silly this is, but it sure sounds fun to try!