Showing posts with label sunspot.

Monday, April 14, 2008

NetBeans 6.0.1 running like a pig...here's how I fixed it

NetBeans 6.0.1 was running like a pig on my ThinkPad T60p. I did a bit of poking around and found this set of config changes quite helpful, so I'll pass them along:

This is in my netbeans.conf, which should be under Program Files/NetBeans 6.0.1/etc on Windows. The critical change was the memory config:
netbeans_default_options="-J-Dcom.sun.aas.installRoot=\"C:\Program Files\glassfish-v2ur1\" -J-client -J-Xss2m -J-Xms32m -J-XX:PermSize=32m -J-XX:MaxPermSize=200m -J-Xverify:none -J-Dapple.laf.useScreenMenuBar=true"

Now NetBeans is running quite well. I'm hacking some Sun code samples to get data from the accelerometer to build a prototype air mouse. This isn't a standard mouse, but rather a way for multiple users to manipulate visualizations in the Social Computing Room. For grins, here's a shot of the space...
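In case it helps anyone, the core of the air mouse idea is just trigonometry: turn the accelerometer's gravity vector into tilt angles, then scale those into pointer deltas. Here's a rough sketch in plain Java (the method names and gain factor are mine, not the SPOT SDK's):

```java
// A minimal sketch (not the actual demo code): mapping a 3-axis
// accelerometer reading to pointer deltas for an "air mouse".
// The scaling factor ("gain") is an assumption for illustration.
public class TiltToPointer {
    // Convert gravity components (in g) to tilt angles, then to pixel deltas.
    static int[] pointerDelta(double ax, double ay, double az, double gain) {
        double pitch = Math.atan2(ax, Math.sqrt(ay * ay + az * az)); // radians
        double roll  = Math.atan2(ay, Math.sqrt(ax * ax + az * az));
        return new int[] { (int) Math.round(Math.toDegrees(roll) * gain),
                           (int) Math.round(Math.toDegrees(pitch) * gain) };
    }

    public static void main(String[] args) {
        // Device held level (gravity straight down the z axis): no movement.
        int[] flat = pointerDelta(0.0, 0.0, 1.0, 0.5);
        System.out.println(flat[0] + "," + flat[1]); // prints 0,0
    }
}
```

On the real hardware you'd feed this from the demo board's accelerometer readings instead of hard-coded values.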

Thursday, April 10, 2008

Note to Self

I knew this once and forget where I wrote it down, so I'll memorialize it here. I needed to add some files to a SunSPOT project (in this case a desktop client), but couldn't remember the property in the Ant script that points to additional classpath entries... voilà!


main.class=org.sunspotworld.demo.TelemetryFrame
#main.args=COM1
#port=COM1
user.classpath=lib/log4j-1.2.15.jar

Of course, now I'll forget that I stuck it in my blog. I'm looking at using the SPOTs to create a multi-user input interface to a 360 degree visualization environment (our Social Computing Room), at least as a proof-of-concept.

Monday, March 3, 2008

Play with a SunSPOT without buying a developer kit

Sun has released, in beta, an SDK that lets you code and run on an emulated SunSPOT. Take your mobile Java experience in a new direction, and start learning about sensors!

http://blogs.sun.com/davidgs/entry/beta_starts

Monday, November 19, 2007

Virtual SunSPOT controlled by a real SunSPOT

Pardon the Zapruder-like quality of this film, but it shows the hack I mentioned in my last post. I'm in Second Life, controlling a virtual SunSPOT from a real one. In this case, I'm tapping into the 3D accelerometer to pick up the xyz rotation and sending it through my framework to rotate the virtual one. It's a bit laggy, and not 100 percent there, but enough to get the idea.

If I ever find the time, the next cool example would be to implement the ectoplasmic bouncing ball demo using one real and one virtual SPOT. Anyhow, it works. The point really is to learn about the SPOT, and why not do something interesting while testing them...?



Thursday, November 15, 2007

SunSPOTS talk and demo today at Sitterson

Paul Jones sent out this note, and I'll be attending for sure:

About SunSPOTS http://www.sunspotworld.com/

Where: Sitterson 014
When: Thursday, November 15th at 3:30
Who: David Simmons, a member of Sun Labs
http://blogs.sun.com/davidgs
More:

David, who has hands-on experience building applications for SunSPOTs and was instrumental in their design and development, will be on hand to offer his insight into this amazing product. http://www.sunspotworld.com/

The SunSPOT (Small Programmable Object Technology) was developed in the Sun Labs and represents the future of embedded systems. Already used throughout academia, students and professors alike are finding new and interesting uses for SunSPOTs. Each SunSPOT comes equipped with a processor, memory, eight external tri-color LEDs, light sensors, temperature sensors, an accelerometer, and several digital/analog inputs and outputs, offering seemingly countless practical uses.

At its core, a SunSPOT is an embedded system. But unlike other embedded systems, which must be programmed in a low-level language such as assembly or C, SunSPOT applications are developed in Java. By allowing Java applications to be uploaded and run on an internal Java Virtual Machine, Sun is not only opening up SunSPOTs to more users than many other embedded systems, it is also leaving the final function of each SunSPOT up to the end user. By following a simple API for interfacing with the SunSPOT, developers nationwide have created unique uses for SunSPOTs - everything from animal research to rocket testing and much more!
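To give a flavor of that programming model, here's a minimal read-decide-act loop in plain Java. The two little interfaces are stand-ins for the SDK's sensor and LED classes (so this runs on any JVM); the real thing would use the demo sensor board's API instead:

```java
// Illustrative only: these interfaces stand in for the SunSPOT SDK's
// light-sensor and LED classes, so the loop runs on any desktop JVM.
interface LightSensor { int read(); }          // stand-in; raw light level
interface Led { void setOn(boolean on); }      // stand-in for an LED

public class NightLight {
    static final int THRESHOLD = 200; // assumed raw "dark" cutoff

    // Turn the LED on when it gets dark; returns the state chosen.
    static boolean step(LightSensor sensor, Led led) {
        boolean dark = sensor.read() < THRESHOLD;
        led.setOn(dark);
        return dark;
    }

    public static void main(String[] args) {
        final boolean[] ledState = new boolean[1];
        Led led = on -> ledState[0] = on;
        System.out.println(step(() -> 50, led));  // dark room: prints true
        System.out.println(step(() -> 800, led)); // bright room: prints false
    }
}
```

On a real SPOT the same structure would sit inside a MIDlet's run loop, polling the sensor board a few times a second.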


I'm currently working with the SunSPOT developers kit, and have been going through (and hacking on) the demo apps. One of the first things I am trying is to tap into the 3D accelerometer. I took the telemetry example and added tilt to the packets coming off the SunSPOT, and have that available on the host. At the same time I've created a virtual SunSPOT in Second Life, and have scripted that to mirror the pitch, yaw, and roll coming into the LSL script. Just a few more tweaks, and the virtual SunSPOT will be controllable from a real one. This has been done before, but not to Second Life. The lag will probably be pretty bad, but I want to explore how multiple SunSPOTs, used by different people in an immersive environment, can create cool experiences.
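For the curious, adding tilt to the telemetry packets is simple byte-level work. Here's a sketch of the idea in plain Java - the field layout is made up for illustration, not the demo's actual packet format:

```java
import java.io.*;

// A sketch of appending tilt readings to a telemetry payload and
// decoding them on the host. The field layout here is an assumption
// for illustration, not the telemetry demo's actual packet format.
public class TiltPacket {
    static byte[] encode(long timestamp, double tiltX, double tiltY, double tiltZ)
            throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeLong(timestamp);
        out.writeDouble(tiltX);   // the added fields: tilt angles in degrees
        out.writeDouble(tiltY);
        out.writeDouble(tiltZ);
        return buf.toByteArray();
    }

    static double[] decodeTilt(byte[] packet) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(packet));
        in.readLong(); // skip timestamp
        return new double[] { in.readDouble(), in.readDouble(), in.readDouble() };
    }

    public static void main(String[] args) throws IOException {
        byte[] p = encode(123L, 10.5, -3.0, 90.0);
        double[] t = decodeTilt(p);
        System.out.println(t[0] + " " + t[1] + " " + t[2]); // prints 10.5 -3.0 90.0
    }
}
```

The host side just reads the extra fields off the end of each packet; from there it's one HTTP or socket hop to the LSL script.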

Anyway, here's a shot of the virtual SunSPOT; when I get it hooked up, I'll shoot a video. I might have it by this afternoon, if the creek don't rise. Anyhow, see you all at the talk this afternoon!

Thursday, October 11, 2007

Turning Turing Around

I was reading Irving Wladawsky-Berger's blog today when I happened upon this wonderful observation:

I was reminded of the Turing Test recently, as I have been watching the huge progress we are making in social networks, virtual worlds and personal robots. Our objective in these applications can perhaps be viewed as the flip side of the Turing Test. We are leveraging technology to enable real people to infuse virtual objects - avatars, personal robots, etc. - with intelligence, as opposed to leveraging technology to enable machines and software to behave as if they are intelligent.

What intrigues me so much about virtual worlds like Second Life is this ability of avatar-based virtual spaces to allow you to push through the barrier, and cross over. How's that for a bunch of metaphysical BS! This is a different aim than something like Looking Glass, which is trying to apply a 3D metaphor to a 2D interaction... it's about stepping through to live with the data, or the sensors, or the other distant collaborators. As the real world becomes more inhabited by pervasive computing, it only seems natural that we go and visit the virtual on its own turf. One wonders about the definition of an application interface in the future. As machines grow smarter, perhaps we'll pop into the 'living room' of our personal agent to have a chat.

At any rate, there are a couple of fun things I'll be looking at in the near future that tie in to these ideas. First, the idea of pervasive, wireless sensors everywhere. I'm waiting for a SunSPOT developer kit, and there will be some sensor applications coming down the pike that could involve these extremely cool sensors. The fact that they use Java is a plus in my book. Needless to say, I'll be brushing up on my J2ME.

The next thing I see coming down the pike is real time location tracking, using the UbiSense platform. This is being leveraged for an intriguing space called a Social Computing Room, and has all sorts of potential uses. Here, I'm going to be doing some .Net programming.

Like the blog quote above, I've had a unique chance to push the physical into the virtual, and with the mentioned projects, there's a chance to work in the other direction. Where these meet is getting to be a pretty interesting space!