If the connector point (above a marker) sits in another marker's connector area (the area below it) for x frames (perhaps 5-10), the two markers will be linked. The link persists regardless of whether the markers are subsequently moved.
If a marker is removed from the board for y frames, its links are broken; y must be large enough that briefly blocking the camera while moving markers doesn't count as removal. To break a link deliberately, the two markers could be placed in a detachment area for z frames (2-3), or alternatively both markers could be rotated so that they face the wrong way relative to each other.
For this method to work I need to know each marker's angle (which way up it is, i.e. where the top and bottom are), its location, and possibly even its size.
Tuesday, 30 November 2010
Monday, 29 November 2010
Progress update
So I've now got multiple markers displayed. The next step involves working on the tracking/developing the overall concept.
For next week I aim to have a good idea of how to link the markers together, and a better picture of where we're going, so that when event tracking is introduced it is developed in a sensible and productive manner.
I'll publish further thoughts this afternoon or tomorrow.
Saturday, 20 November 2010
XML parsing successful. Single Marker displayed in place
The XML is now parsed and the display is being designed. Currently a single marker is shown: it is persistent, moves to wherever the user places it, and displays its ID number.
Screenshot of current design:
The main aim for the next week is to get several markers displayed concurrently, using a QVector to hold the different markers.
Thursday, 4 November 2010
XML parsing
So, where am I up to this week?
I've not quite managed to get an XML parser working; I'm currently stuck on compiler errors. Hopefully by next week I'll be past this and can get to work on the actual application.
I have, however, created an ecs.forge project at https://forge.ecs.soton.ac.uk/projects/dl1g08-tui/; this should provide version control as well as letting others view my code.
I've also read a few background papers. The key things I learnt from "Experiments with Face-To-Face Collaborative AR Interfaces" by M. Billinghurst, H. Kato, K. Kiyokawa, D. Belcher and I. Poupyrev were that we communicate much better when facing each other across a table top, doing something together: tasks are achieved more quickly and the experience is generally much better than sitting side by side facing a screen. It also showed that using physical objects is much easier and more satisfying than using virtual ones.
"Alternative Tools for Tangible Interaction: A Usability Evaluation" by Morten Fjeld, Sissel Guttormsen Schär, Domenico Signorello and Helmut Krueger demonstrated that, for spatial problems, physically manipulating the object is best, a TUI comes second, a 2D physical system third, and working it out mathematically a distant fourth, in terms of both speed and enjoyment. This suggests humans work much better on such problems practically rather than theoretically. It also showed that the TUI and the fully physical condition were very similar in usability, except that users found the physical test more enjoyable.
This coming week I hope to:
- Build the parser fully (functioning!)
- Set up dtserver using QProcess (if there's time; low priority)
- Start dealing with data from dtserver (again, if there's time)
- Continue background research.