Thursday, November 15, 2007

SunSPOTS talk and demo today at Sitterson

Paul Jones sent out this note, and I'll be attending for sure:

About SunSPOTS http://www.sunspotworld.com/

Where: Sitterson 014
When: Thursday, November 15th at 3:30
Who: David Simmons, a member of Sun Labs
http://blogs.sun.com/davidgs
More:

David, who has hands-on experience building applications for SunSPOTs and was instrumental in their design and development, will be on hand to offer his insight into this amazing product. http://www.sunspotworld.com/

The SunSPOT (Small Programmable Object Technology) was developed in the Sun Labs and represents the future of embedded systems. Already used throughout academia, students and professors alike are finding new and interesting uses for SunSPOTs. Each SunSPOT comes equipped with a
processor, memory, eight external tri-color LEDs, light sensors, temperature sensors, an accelerometer, and several digital/analog inputs and outputs, offering seemingly countless practical uses.

At its core, a SunSPOT is an embedded system. But unlike other embedded systems, which must be programmed in a low-level language such as assembly or C, SunSPOT applications are developed in Java. By allowing Java applications to be uploaded and run on an internal Java Virtual Machine, Sun is not only opening up SunSPOTs to more users than many other embedded systems, it is also leaving the final function of each SunSPOT up to the end user. Using a simple API to interface with the SunSPOT, developers nationwide have created unique uses for SunSPOTs - everything from animal research to rocket testing and much more!
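
That "developed in Java" part is literal: a SunSPOT application is just a Java ME MIDlet that you deploy to the device, and the sensor board shows up as ordinary Java objects. Here's a minimal sketch of what one looks like; the class names (EDemoBoard, ITriColorLED, Utils) are from my memory of the developer kit, so they may not line up exactly with your SDK version:

import javax.microedition.midlet.MIDlet;
import javax.microedition.midlet.MIDletStateChangeException;

import com.sun.spot.sensorboard.EDemoBoard;
import com.sun.spot.sensorboard.peripheral.ITriColorLED;
import com.sun.spot.util.Utils;

// A SunSPOT app is a MIDlet; startApp() is its entry point.
public class BlinkDemo extends MIDlet {

    protected void startApp() throws MIDletStateChangeException {
        // The demo sensor board is a singleton, and the eight
        // tri-color LEDs come back as a plain Java array.
        ITriColorLED[] leds = EDemoBoard.getInstance().getLEDs();

        while (true) {
            for (int i = 0; i < leds.length; i++) {
                leds[i].setRGB(0, 0, 255);  // blue
                leds[i].setOn();
                Utils.sleep(100);           // wait 100 ms
                leds[i].setOff();
            }
        }
    }

    protected void pauseApp() {
    }

    protected void destroyApp(boolean unconditional) throws MIDletStateChangeException {
    }
}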


I'm currently working with the SunSPOT developer kit, and have been going through (and hacking on) the demo apps. One of the first things I'm trying is tapping into the 3D accelerometer. I took the telemetry example and added tilt to the packets coming off the SunSPOT, so that data is now available on the host. At the same time I've created a virtual SunSPOT in Second Life and scripted it to mirror the pitch, yaw, and roll coming into the LSL script. Just a few more tweaks and the virtual SunSPOT will be controllable from a real one. This has been done before, but not in Second Life. The lag will probably be pretty bad, but I want to explore how multiple SunSPOTs, used by different people in an immersive environment, can create cool experiences.
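
For the curious, the SPOT-side change amounts to something like the sketch below: sample the accelerometer's tilt and broadcast it over the radio for the host to pick up. Treat it as a rough outline rather than the exact code; it assumes a broadcast radiogram connection like the one the stock telemetry demo opens (the port number here is made up), and the EDemoBoard/IAccelerometer3D names are from memory of the developer kit.

import java.io.IOException;

import javax.microedition.io.Connector;
import javax.microedition.io.Datagram;

import com.sun.spot.io.j2me.radiogram.RadiogramConnection;
import com.sun.spot.sensorboard.EDemoBoard;
import com.sun.spot.sensorboard.peripheral.IAccelerometer3D;
import com.sun.spot.util.Utils;

// Sketch of the SPOT side: read tilt from the 3D accelerometer and
// broadcast it so the host (and, eventually, Second Life) can use it.
public class TiltSender {

    // Made-up port number; the real telemetry demo defines its own.
    private static final String BROADCAST_URL = "radiogram://broadcast:42";

    public void run() throws IOException {
        IAccelerometer3D accel = EDemoBoard.getInstance().getAccelerometer();
        RadiogramConnection conn =
                (RadiogramConnection) Connector.open(BROADCAST_URL);
        Datagram dg = conn.newDatagram(conn.getMaximumLength());

        while (true) {
            dg.reset();
            // Tilt around each axis, derived from gravity.
            dg.writeDouble(accel.getTiltX());
            dg.writeDouble(accel.getTiltY());
            dg.writeDouble(accel.getTiltZ());
            conn.send(dg);
            Utils.sleep(50);  // don't flood the radio
        }
    }
}

The host side just reads the same three doubles back out of each radiogram and hands them to whatever is relaying pitch, yaw, and roll into the LSL script.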

Anyway, here's a shot of the virtual SunSPOT; when I get it hooked up, I'll shoot a video. I might have it by this afternoon, if the creek don't rise. Anyhow, see you all at the talk this afternoon!
