
2/27 Team Meeting

-- use trailer hitches on the stands, one on each short end ("table mounts").

-- hitch ball on the stand.

-- ball mount (hole) on each short end of the projection-surface apparatus (the "table," distinct from the mounts). also many smaller holes along a semicircle at a fixed radius from the ball-mount hole, for locking the table at a specific angle (the holes take pegs like the weight-stack pins found in gym machines).

Jas will talk to his dad about trailer hitches (and hopefully DXF files). we still need to procure the balls & mounts on our own.

matt -- the big premise is making touches look like mouse pointer events (mouse actions) -- the pre-release version of the X server has multi-pointer support.

multi-pointer support vs. touches or gestures is a separate concern.

we agree that straight touches are similar to mouse pointer events -- the thinking is that the touch driver (CUDA & image-processing component) could present itself to the operating system (or windowing system) as a mouse driver. if touch input looks like straight mouse input, then apps like Firefox could be driven by touch.
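The idea above can be sketched as a simple translation layer; this is an illustrative pure-Python sketch (all names are made up, not a real driver API), showing how touch down/move/up events would map onto the pointer events a windowing system expects:

```python
# Hypothetical sketch: translating raw touch points into mouse-style
# events so the windowing system sees an ordinary pointer.

def touch_to_mouse(touch_events):
    """Map a stream of (x, y, state) touch tuples to mouse events.

    state is 'down', 'move', or 'up' -- mirroring button press,
    drag, and release.
    """
    mouse_events = []
    for x, y, state in touch_events:
        if state == "down":
            mouse_events.append(("move", x, y))       # warp pointer to touch
            mouse_events.append(("button_press", 1))  # left button down
        elif state == "move":
            mouse_events.append(("move", x, y))       # drag
        elif state == "up":
            mouse_events.append(("button_release", 1))
    return mouse_events

# A tap at (100, 200) comes out looking like a click:
events = touch_to_mouse([(100, 200, "down"), (100, 200, "up")])
```

A real implementation on Linux would expose these events through a virtual input device (e.g. uinput) so that unmodified apps receive them as mouse input.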

this lets you use existing apps, like the DJ idea zach mentioned earlier.

is support for multiple mouse pointers handled by the operating system or by the specific program?

problem -- right now, if you plug 5 keyboards into a linux box, they all operate as the same device.

MPX -- Multi-Pointer X -- is integrated into the next version of the X server.

eddie -- why does the touch driver need to interface with the x server?

matt -- in linux, the GUI runs as a dedicated, more privileged program (as root) -- the X server handles any and all GUI apps (talking over a network protocol, wrapped in an API) and handles all input and output for a program.

where does the gesture library fit into the block diagram?

lengthy discussion on interfacing the touch driver to the gesture library and the purpose of the gesture library.

eddie's take on x windows -- feels like the gesture library's output could be fed into the x server, rather than just the touch driver's.

-- we've decided that the touch driver should output directly to the gesture library only, and that the gesture library should drive the x server and the applications / gesture server directly.
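The agreed data flow (touch driver → gesture library → X server / apps) can be sketched as a small fan-out; class and method names here are illustrative stand-ins, not real APIs:

```python
# Sketch of the decided pipeline: the touch driver feeds only the
# gesture library, which in turn drives its downstream consumers
# (X server bridge, applications, gesture server).

class GestureLibrary:
    def __init__(self):
        self.sinks = []              # downstream consumers

    def attach(self, sink):
        """Register a consumer callback (e.g. the X server bridge)."""
        self.sinks.append(sink)

    def on_touch(self, point):
        # A real gesture library would buffer and classify points;
        # here we just tag each point and fan it out unchanged.
        event = ("pointer", point)
        for sink in self.sinks:
            sink(event)

received = []
lib = GestureLibrary()
lib.attach(received.append)          # stand-in for the X server bridge
lib.on_touch((10, 20))               # the touch driver calls into the library
```

The key property is that applications never see the touch driver directly; everything routes through the gesture layer, which is what the block diagram captures.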

-- jas -- feels like we might need to port some of OpenCV to CUDA.

eddie -- also (from the nvidia talk) -- local memory and global memory are off-board, so we need a sizeable amount of RAM on the motherboard.

numerical calculations (how much memory do we need?): 640x480 pixels x 24 bits per pixel (3 bytes) = 921,600 bytes ≈ 0.92 MB per frame; x 60-frame buffer ≈ 55 MB per camera. the 221 MB quoted in the meeting is exactly four of these streams (presumably one per camera on the big table).
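The estimate is easy to verify; the resolution, bit depth, and 60-frame buffer come from the notes, while the four-stream multiplier is an assumption about the full-size table:

```python
# Frame-buffer memory estimate for one 640x480, 24-bit camera stream.

WIDTH, HEIGHT = 640, 480
BYTES_PER_PIXEL = 3                              # 24 bits per pixel
FRAMES = 60                                      # buffer depth

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL   # 921,600 bytes per frame
buffer_mb = frame_bytes * FRAMES / 1e6           # ~55 MB for one camera

# Assumption: the 221 MB figure from the meeting is four such streams.
four_camera_mb = buffer_mb * 4                   # ~221 MB
```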

we can gauge the computational power required for the big table by deliberately under-spec'ing the prototype. if the prototype runs fine, we know we can scale appropriately. if it falls over, we know we need to up the computational power.

-- touchlib vs tbeta vs OpenCV. is OpenCV available on CUDA? does OpenGL compete with CUDA for resources?

matt's research on motherboards -- looking for 3 PCIe slots.

a card that does x16 will downscale to x8 if the bus doesn't support it, but then you're running at half the data-transfer speed. the best board matt found had 3 PCIe x16 slots, but one was electrically x8 despite being physically x16 (so one card would be downscaled on that board).
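Rough numbers for the x16-vs-x8 point, assuming PCIe 2.0 (roughly 500 MB/s per lane in each direction, the current generation at the time); exact figures depend on the board:

```python
# Why an electrically-x8 slot means half-speed transfers for an x16 card.

MB_PER_LANE = 500                  # PCIe 2.0, approx. MB/s per lane, per direction
x16_bandwidth = 16 * MB_PER_LANE   # ~8000 MB/s on a full x16 slot
x8_bandwidth = 8 * MB_PER_LANE     # ~4000 MB/s on the electrically-x8 slot
```

So whichever card lands in the x8 slot gets half the bus bandwidth of its siblings, which matters if that card is streaming camera frames or frame buffers.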

actually, the comment about gauging computational power by under-spec'ing the prototype is moot, since the prototype only has one projector and one camera and doesn't need the processing power to scale.

attack plan -- build it, run it, locate the bottleneck, and fix it.

how do we locate the bottleneck? (must determine where the most computationally intensive part is)

jas thinks we may be able to get this info with gdb.
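One simple way to locate the bottleneck, whatever tool we end up using, is to time each pipeline stage in isolation; this is a generic sketch (the stage names and stand-in workloads are made up, not the real capture/processing code):

```python
# Locate the bottleneck by timing each pipeline stage separately.
import time

def time_stage(fn, *args, repeats=100):
    """Return the average wall-clock seconds per call of fn."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn(*args)
    return (time.perf_counter() - start) / repeats

# Stand-in stages with deliberately different workloads; in practice
# these would be the real capture, blob-detection, and gesture steps.
timings = {
    "capture": time_stage(lambda: sum(range(1_000))),
    "blob_detect": time_stage(lambda: sum(range(10_000))),
}
bottleneck = max(timings, key=timings.get)
```

Whichever stage dominates the per-frame budget is the one to move onto the GPU or optimize first.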

unambitious version -- no OS hooks; require any applications that run on the table to be designed specifically for it.

block diagram is done. images and numbers will be sent out in the next email.

Block Diagram