In brief, here's a demo of a physical/virtual mashup. In this case, individuals are tracked via UbiSense within a space called the Social Computing Room (SCR), and their positions are depicted within a virtual representation of the same space.
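The core of this mashup is a simple coordinate mapping: tracked positions in the physical room get translated into positions in the virtual replica. Here's a minimal sketch of that idea; the function name, origins, and scale factor are all hypothetical calibration values, not anything from the actual demo.

```python
# Hypothetical sketch: mapping tracked room coordinates (meters) into a
# virtual-world scene. Origins and scale are illustrative placeholders.

def room_to_virtual(x, y, room_origin=(0.0, 0.0), scale=1.0,
                    virtual_origin=(100.0, 100.0)):
    """Translate a tracked (x, y) position in the physical room
    into coordinates in the virtual replica of the same space."""
    vx = virtual_origin[0] + (x - room_origin[0]) * scale
    vy = virtual_origin[1] + (y - room_origin[1]) * scale
    return (vx, vy)

# A tag reading 2.5 m east, 1.0 m north of the room origin:
print(room_to_virtual(2.5, 1.0))
```

In practice you'd stream these updates continuously, moving each person's avatar as new sensor readings arrive.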
One can think of a ton of ways to take this sort of thing. There are many examples of using the virtual world as a control panel for real-world devices and sensors, such as the Eolus One project. How can this idea be applied to communication between people, for social applications, and so on? What sorts of person-to-person interactions are possible between people in the SCR and remote visitors? I have this idea that virtual visitors would fly in and view the actual SCR from a video wall. Then they could fly through the wall (through the looking glass) to see and communicate with the virtual people as they are arranged in the room. A fun thing we'll be using as a demo at HASTAC.