2012-12-13

Usage of the sketching table

We have collected some videos showing our sketching table in use. They will hopefully give a better understanding of how the table is used.

The first video shows Robin Schaeverbeke teaching Jimmy drawing techniques using the table. It also contains reflections on using it.


The next video shows other people using the table for the first time during the exhibition in Delft.

We also created two time-lapse picture series. The first one shows the first day of the exhibition.

The other one shows more usage of the table during the second exhibition day, and how we took down the stations.

2012-12-12

Project presentation

Tomorrow, 13 December, from 13:00 to 14:00 in A109 at LTU, we will hold a presentation about the project. If you are not able to come, you can watch it on Adobe Connect:
https://connect.sunet.se/ltu_srt_pmc

After the presentation there will be a demonstration in A1301. In between, there is a break with mulled wine and gingerbread.

Hope to see you there!

2012-12-10

Release 4 aka "Dolph"

It's time to announce our fourth release.

In this release we have updated the workspace transform code to use WebGL, as Jimmy hinted at before we went to Delft. So that code is a thousand times faster now. :)
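To give a rough idea of the work that moves to the GPU: the workspace transform is applied for every pixel, every frame. A minimal sketch of that per-pixel mapping, assuming the transform is a planar homography (the function and matrix here are illustrative, not our actual code):

```python
# Hypothetical sketch: applying a 3x3 planar homography H to a pixel (x, y).
# On the CPU this runs once per pixel per frame; doing it in a WebGL shader
# lets the GPU process all pixels in parallel, hence the large speedup.

def apply_homography(H, x, y):
    """Map (x, y) through homography H (3x3 nested list), with perspective divide."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

# The identity homography leaves points unchanged:
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(apply_homography(I, 10, 20))  # (10.0, 20.0)
```

In a shader, the same three dot products and the perspective divide run once per fragment, which is why the GPU version scales so much better than a pixel loop in JavaScript.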

There is also a small bug fix in the tangible binaries. You can continue to use the old executables if you haven't encountered any problems. You can get the new binaries here: Tangibles-Release4.zip

Remember that the instructions on the wiki are always kept up to date: github.com/stefansundin/The-Tangibles/wiki

Go to Tangible.se to try it out, as usual!

2012-12-04

Reflections from the WebRTC group

Working with our system and setting up a live demo in Delft revealed several issues we had not encountered before.

The first problem, which affects the system as a whole, is that Chrome currently has trouble handling more than one video stream. The result is that the video views swap rapidly between the two streams. It does not happen every time, but often enough to be a problem. In our case, the camera view swaps between showing the desktop view and the face view. Google has been working on this issue from the start, but it is not prioritized. To work around it, we decided to use two computers per workstation. This worked well and allowed us to hold a successful demonstration and workshop. A one-computer solution is not really viable until Google fixes the issue in Chrome.

The other main issue is freezing video streams. This seems to vary between computers, and the fix is simply for the user to exit and re-enter the room. Related to this, a rather powerful computer is required to run the system with several participants. Test results examining this will be presented in the report. The first set of computers we received in Delft turned out to be too weak to render several video streams.

These issues are something that we have to accept when using technology that is currently in development.

2012-11-29

Projector group - reflections

Time for a report from the Projector group!
As you know, we’re back from our trip to Delft. During our demonstrations at the university there, we made a number of interesting observations, which we'll now share with all of our dear readers:

  • Our system is very sensitive to changing light conditions: a lot of manual calibration of the camera settings is needed to cope with this.
  • The virtual buttons need to be more intuitive: people didn’t realize what they did, and often pushed them by accident.
  • The virtual buttons push themselves when the video stream shown on them changes drastically, meaning that you can turn other users’ incoming video streams on and off.
  • A single user with bad settings ruins the experience for everyone in the shared workspace, since their bad projection is filmed and sent to all other users.
  • An unstable workstation setup poses a problem: the projection is thrown off if the camera or projector is moved even slightly.
  • When the shared workspace was used by an architecture teacher to teach Jimmy some basic drawing concepts, we noticed that the teacher’s ability to see the student’s face was very important.
  • The system places high demands on the Internet connection: a slow connection results in low-resolution video, to the point where the system becomes unusable.
  • We need better resolution on the video streams, which Jimmy is currently trying to fix by rewriting the code to use WebGL for our transformations.
A last observation is that people didn’t seem to notice the technical issues the way we do (since they were having too much fun drawing, maybe?), which was a positive surprise.
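The virtual-button problem can be sketched in code: a button is typically "pressed" when the pixel difference inside its region exceeds a threshold, so any drastic change in the video stream under it can trigger it, not just a hand. This is a hypothetical illustration; the region coordinates and threshold are made up, not values from our system.

```python
# Hypothetical frame-differencing check for a virtual button.
# prev/curr are grayscale frames as 2D lists; region is (x0, y0, x1, y1).
# A sudden change in the video shown under the button region exceeds the
# threshold just like a real press would, which is the bug described above.

def button_pressed(prev, curr, region, threshold=40):
    """Return True if the mean absolute pixel difference in region > threshold."""
    x0, y0, x1, y1 = region
    total, count = 0, 0
    for y in range(y0, y1):
        for x in range(x0, x1):
            total += abs(curr[y][x] - prev[y][x])
            count += 1
    return total / count > threshold

prev = [[0] * 8 for _ in range(8)]
curr = [[0] * 8 for _ in range(8)]
for y in range(2, 6):          # simulate a sudden bright change in the button area
    for x in range(2, 6):
        curr[y][x] = 255

print(button_pressed(prev, curr, (2, 2, 6, 6)))  # True
```

A more robust trigger would require the change to persist for several frames, or to match a hand-like shape, rather than firing on a single large frame difference.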

That’s all from us!

Tangible group - reflections

The team is back in Luleå again after the Delft trip. We're happy to report that the days here have become even shorter, the weather has turned colder again, and the snow is back!

The tangible group has returned to the basement lab and started to reflect on our impressions of the tangible devices' role in this project. Here are a couple of thoughts:

  1. In Delft we discovered that, in that setting, the surface of the Sifteo cubes was too shiny for the camera to recognize the QR code images on the cubes during calibration. Our solution in Delft was to print the QR codes on paper instead.
  2. Our group felt a bit superfluous during the demonstration in Delft. Demonstrating the initialization phase, including resizing the shared space, turned out to be very cumbersome to do many times, since the camera hardware needed to be re-calibrated every time the webpage was reloaded. Because of this, the team chose not to demonstrate this part, with the result that the tangible devices were not used at all in Delft.
  3. Overall, we feel that the architect use case does not need tangible devices. All we use them for is accepting a call and calibrating the shared space, and we don't think that alone justifies their existence. Other use cases would be needed.
  4. A new use case was suggested during a reflection session in Delft: one could use a Sphero to rotate 3D objects on the screen, for example during an architect teaching session when a student is to draw a sketch based on a 3D model.
  5. Another use case could be using Sifteo cubes for showing and moving around virtual documents, like a PDF document, in the shared workspace.
  6. While running in a web browser sounds like a better idea from the user's perspective, in our case it did not free the user from running programs locally. The user had to start the driver locally and repeatedly press 'Allow' to start the camera in the web browser. The user also had to keep the browser window open in order to accept an invitation to join a room. We believe the experience would be smoother if the whole program were an application running locally at start-up.

We hope that these reflections relating to the tangible devices will be helpful input for future projects.

Samuel, Mattias, Viktor, Alex

2012-11-26

Release 3 aka "Sylvester"

It's time to announce our third release. This is the same code that we used during our time at TUD in Delft.

The changes in this release are mainly bug fixes and stability improvements. The most visible new feature is the new transform used in the shared workspace. For the tangibles, there are no changes in the applications that you have installed on your computer; just continue using the same binaries from release 2.

We have started using the wiki at Github to document our instructions. Please go there to read the most up to date instructions: github.com/stefansundin/The-Tangibles/wiki/

Go to Tangible.se to try it out, as usual!

2012-11-24

Conference concluded

Hello! Stefan here again. I'm glad to report that the conference went very well and that we are very satisfied with the outcome.

We arrived late on Sunday, then spent three days setting up the prototype before showing it to the other participants. The shared workspace is quite sensitive to different lighting conditions, which is probably the issue that bothered us the most. It was a little unfortunate that our setup was situated beside some very large windows. We started each day by calibrating the camera, and a few hours later we had to re-calibrate it to account for the changed sunlight.

We displayed the setup to many interested people, among them the architect and lecturer Robin Schaeverbeke who gave Jimmy a lesson in architecture over the shared workspace. More about this in a later blog post.

I want to thank the whole team. It makes me feel proud as a project leader to see the whole team working so hard towards a common goal, and achieving it! Also a big thank you to the people who have helped us, among them Dr. Tjerk de Greef, Dr. Charlie Gullström, Dr. Caroline Nevejan and Prof. Peter Parnes.

We still have work to do, and we will release more media and thoughts from the conference as time goes on. Here are some pictures in the meantime:

Robin Schaeverbeke teaching Jimmy using the shared workspace.

Dr. Tjerk de Greef and Dr. Charlie Gullström

Prof. Peter Parnes and Dr. Leif Handberg

2012-11-20

Delft!!

We are also helping out with finalizing the exhibition.

And the goldfish are soon ready to start chatting.

Delft!

We are currently setting up our stations. We have encountered some problems, for example that Chrome seems to have changed its WebRTC version.




A note: QR codes and bicycles are very popular here in the Netherlands.



We did get a very nice dinner yesterday :)



And now we have got ourselves some goldfish to keep us company. The rumor is that the goldfish will be in a Skype conversation on Thursday.