Project concluded

This is just a short post to let any visitors know that the project has concluded. It seems that the university has closed down the servers the website was hosted on, probably to use them for something else. However, all the source code, the blog posts, our videos on YouTube, and the final project report are available. With this information you should be able to build your own prototype.

We have since gone our separate ways and started working on our respective master's theses. Perhaps of interest to people interested in mediating presence are the LTU Telepresence Master's Thesis by Jimmy and Nicklas Nyström and The Online Paris Café by Patrik Burström and Alexandra Tsampikakis.

Usage of the sketching table

We have collected some videos showing our sketching table in use. They will hopefully give a better understanding of how the table is used.

The first video shows Robin Schaeverbeke teaching Jimmy drawing techniques using the table. It also contains some reflections on the experience.

The next video shows other people using the table for the first time during the exhibition in Delft.

We also created two timelapse picture series. The first shows the first day of the exhibition.

The second shows more use of the table during the second exhibition day, and how we took down the stations.


Project presentation

Tomorrow, 13 December, from 13:00 to 14:00 in A109 at LTU, we will hold a presentation about the project. If you are not able to come, there is a possibility to see it on Adobe Connect:

After the presentation there will be a demonstration in A1301. In between there is a break with some mulled wine and gingerbread.

Hope to see you there!


Release 4 aka "Dolph"

It's time to announce our fourth release.

In this release we have updated the workspace transform code to use WebGL, as Jimmy hinted at before we went to Delft. So that code is a thousand times faster now. :)
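For readers curious about where the speedup comes from: the workspace transform is essentially a planar perspective (homography) warp, and moving the per-pixel math from a JavaScript loop onto the GPU via a WebGL fragment shader is what makes it so much faster. As a rough illustration (an assumed sketch, not our actual code), the math applied at each pixel looks like this:

```javascript
// Apply a 3x3 homography H (row-major array of 9 numbers) to a 2D point.
// This is the per-pixel calculation a WebGL fragment shader performs in
// parallel on the GPU; doing it in a JavaScript loop over every pixel of
// every video frame is what made the old transform code so slow.
// Illustrative sketch only, not the project's actual implementation.
function applyHomography(H, x, y) {
  const xp = H[0] * x + H[1] * y + H[2];
  const yp = H[3] * x + H[4] * y + H[5];
  const w  = H[6] * x + H[7] * y + H[8];
  return [xp / w, yp / w]; // perspective divide
}

// The identity homography leaves points unchanged.
const I = [1, 0, 0, 0, 1, 0, 0, 0, 1];
console.log(applyHomography(I, 10, 20)); // [10, 20]
```

In a shader, this whole function collapses to a single matrix-vector multiply and a divide, executed for thousands of pixels at once.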

There is also a small bug fix in the tangible binaries. You can continue to use the old executables if you haven't encountered any problems. You can get the new binaries here: Tangibles-Release4.zip

Remember that the latest instructions are always up to date on the wiki: github.com/stefansundin/The-Tangibles/wiki

Go to Tangible.se to try it out, as usual!


Reflections from the WebRTC group

Working with our system and setting up a live demo in Delft revealed several issues we had not previously encountered.

The first problem, which affects the system as a whole, is that Chrome currently has problems handling more than one video stream. The resulting effect is that the video views swap rapidly between the two streams. It does not happen every time, but often enough to be a problem. In our case, the camera view swaps between showing the desktop view and the face view. Google has been working on this issue from the start, but it is not prioritized. To get around it we decided to use two computers per workstation. This worked well and allowed us to run a successful demonstration and workshop. A one-computer solution is not really viable until Google fixes the issue in Chrome.

The other main issue is freezing video streams. This seems to vary between computers and simply requires the user to exit and re-enter the room. Relatedly, a rather powerful computer is required to run the system with several participants; test results examining this will be presented in the report. The first set of computers we received in Delft turned out to be too weak to render several video streams.

These issues are something that we have to accept when using technology that is currently in development.


Projector group - reflections

Time for a report from the Projector group!
As you know, we’re back from our trip to Delft. During our demonstrations at the university there, we made a number of interesting observations, which we'll now share with all of our dear readers:

  • Our system is very sensitive to changing light conditions: a lot of manual calibration of the camera settings is needed to cope with this.
  • The virtual buttons need to be more intuitive: people didn't realize what the buttons did, and often pushed them by accident.
  • The virtual buttons trigger themselves when the video stream projected on them changes drastically, meaning that one user can turn other users' incoming video streams on and off.
  • A single user with bad settings ruins the experience for everyone in the shared workspace, since their bad projection is filmed and sent to all other users.
  • An unstable workstation setup is a problem: the projection gets thrown off if the camera or projector is moved even slightly.
  • When the shared workspace was used by an architecture teacher to teach Jimmy some basic drawing concepts, we noticed that the teacher's ability to see the student's face was very important.
  • The system places high demands on the Internet connection: a slow connection results in low-resolution video, to the point where the system becomes unusable.
  • We need better resolution on the video streams, which Jimmy is currently trying to fix by rewriting the code to use WebGL for our transformations.
A last observation is that people didn’t seem to notice the technical issues the way we do (since they were having too much fun drawing, maybe?), which was a positive surprise.
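The accidental button presses come down to how the virtual buttons are detected: the camera watches the button's region of the projection, and a drastic change in brightness there (normally a hand) counts as a press. A rapidly changing video stream projected onto the button looks just like a hand to such a check. A minimal sketch of this kind of frame-difference test (assumed, not our actual detection code):

```javascript
// Decide whether a virtual button is "pressed" by comparing the current
// camera frame against a reference frame over the button's pixel region.
// If the mean absolute brightness difference exceeds a threshold, the
// region has changed drastically -- e.g. a hand covering the button, but
// also a rapidly changing video stream projected onto it, which is what
// causes the accidental presses. Illustrative sketch, not project code.
function buttonTriggered(referencePixels, currentPixels, threshold) {
  let total = 0;
  for (let i = 0; i < referencePixels.length; i++) {
    total += Math.abs(currentPixels[i] - referencePixels[i]);
  }
  return total / referencePixels.length > threshold;
}

const ref = [10, 10, 10, 10];
console.log(buttonTriggered(ref, [12, 9, 11, 10], 30));     // small change: false
console.log(buttonTriggered(ref, [200, 180, 210, 190], 30)); // drastic change: true
```

A fix would be to require the change to persist for several consecutive frames, or to mask out pixels belonging to incoming video streams before comparing.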

That’s all from us!

Tangible group - reflections

The team is back in Luleå again after the Delft trip. We're happy to report that the days here have become even shorter, the weather has turned colder again, and the snow is back!

The tangible group has returned to the basement lab and started to reflect on our impressions concerning the role of the tangible devices in this project. Here are a couple of thoughts:

  1. In Delft we discovered that, in that setting, the surfaces of the Sifteo cubes were too shiny for the camera to recognize the QR code images on the cubes during calibration. Our solution in Delft was to print the QR codes on paper instead.
  2. Our group felt a bit superfluous during the demonstration in Delft. It turned out that demonstrating the initialization phase, including resizing the shared space, was very cumbersome to do many times, since the camera hardware needed to be re-calibrated every time the webpage was reloaded. We therefore chose not to demonstrate this part, with the result that the tangible devices were not used at all in Delft.
  3. Overall, we feel that the architect use case does not need tangible devices. All we use them for here is accepting a call and calibrating the shared space, and we don't think that alone justifies their existence. Other use cases would be needed.
  4. A new use case was suggested during a reflection session in Delft: one could use a Sphero to rotate 3D objects on the screen, for example during an architecture teaching session where a student is to draw a sketch based on a 3D model.
  5. Another use case could be using Sifteo cubes to show and move around virtual documents, such as a PDF, in the shared workspace.
  6. While running in a web browser sounds like a better idea from the user's perspective, it did not free the user from running programs locally in our case. The user had to start the driver locally and repeatedly press 'allow' to start the camera in the web browser. The user also had to leave the browser window open in order to accept an invitation to join a room. We believe the experience would be smoother if the whole program were an application that runs locally at start-up.

We hope that these reflections relating to the tangible devices will be helpful input for future projects.

Samuel, Mattias, Viktor, Alex