Sunday, March 13, 2011



3D modelling with KINECT + ARDUINO + PD

I did this to check out Kinect's precision
and usability for virtual reality applications on one side,

and on the other,

to create algorithms to SELECT / MOVE / LINK objects
for multitouch applications,

which is all part of my attempts at developing
innovative human/computer interfaces.


While working with the Kinect, I realized pretty soon that
it's quite annoying to detect something as essential as a MOUSECLICK
from the skeleton data alone. I made an attempt where clicks
occur by pointing at an object and then straightening the arm,
but this was far too exhausting after a while.

So I decided to add comfortable clicking possibilities with data gloves.

I had an Arduino and some resistors lying around, went to Lagerhaus,
bought some fancy gloves for 3 euros and soldered this together:
[image: Arduino / gloves / hand states]
Using one analog pin per glove, I get 4 different currents for each finger state.
The Arduino connects with Pd using Pduino.
Of course it would be much nicer to have a wireless version of this;
it could easily be done with an XBee-equipped Arduino that sends the data wirelessly.
A rumble pack would be cool as well, to provide haptic feedback.
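The "4 different currents per analog pin" idea can be sketched in code. This is a hypothetical decoding step, assuming each fingertip contact switches a different resistor into a voltage divider; the ADC band edges below are made up for illustration, not measured from the original build.

```python
# Sketch: decode the glove's single analog reading into a finger state.
# Each fingertip contact inserts a different resistor into a voltage
# divider, so the Arduino's 10-bit ADC (0-1023) sees one of a few
# distinct voltage bands. Band edges are assumptions for illustration.

# (state name, lower ADC bound), checked from the highest band down
THRESHOLDS = [
    ("finger3", 768),   # smallest resistor -> highest voltage
    ("finger2", 512),
    ("finger1", 256),
    ("open",      0),   # no contact
]

def finger_state(adc_value: int) -> str:
    """Map a raw 10-bit ADC reading to a named finger state."""
    for name, lower in THRESHOLDS:
        if adc_value >= lower:
            return name
    return "open"
```

On the Arduino side the same thresholds would sit in the sketch that forwards the pin value to Pd via Pduino.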

How does it work?
[image: sketch 1]
The Kinect sends a depth image of the user to
OpenNI / NITE, which tracks the user's skeleton and
sends the 3D position of every joint to
Pure Data over OSC.
The Arduino sends the finger states to Pd.
In Pd this data is streamed to various patches to serve multiple purposes:
[image: sketch 2]

Navigation
Essential for 3D modelling is the virtual camera, which lets you move around
the objects in virtual space.
Finger states 2 + 3 of either hand are reserved for this.
[image: navigation]
With finger 2 pressed, the camera orbits around its target.
With finger 3 pressed, camera and target move left/right and up/down in screen space.
I also added head tracking, which moves the camera up/down/left/right/back/front in screen space.
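A minimal sketch of the two navigation modes, assuming a spherical orbit camera; all names and ranges are illustrative, not taken from the patches.

```python
import math

# Sketch of the two navigation modes: finger 2 orbits the eye point
# around the target, finger 3 pans camera and target together.
class OrbitCamera:
    def __init__(self):
        self.target = [0.0, 0.0, 0.0]
        self.azimuth = 0.0      # horizontal angle around the target
        self.elevation = 0.0    # vertical angle
        self.distance = 5.0

    def orbit(self, d_az, d_el):
        """Finger 2 held: rotate the eye point around the target."""
        self.azimuth += d_az
        self.elevation = max(-1.5, min(1.5, self.elevation + d_el))

    def pan(self, dx, dy):
        """Finger 3 held: move camera and target together in screen space."""
        right = (math.cos(self.azimuth), 0.0, -math.sin(self.azimuth))
        for i in range(3):
            self.target[i] += right[i] * dx
        self.target[1] += dy  # screen-space up

    def eye(self):
        """Camera position derived from target + spherical offset."""
        ce = math.cos(self.elevation)
        return (
            self.target[0] + self.distance * ce * math.sin(self.azimuth),
            self.target[1] + self.distance * math.sin(self.elevation),
            self.target[2] + self.distance * ce * math.cos(self.azimuth),
        )
```

Head tracking would add a second, smaller screen-space offset on top of `eye()`.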

Finger 1 is reserved for
Selection/Moving
[image: time lapse]

Here I need to satisfy two different requirements.
First, I want to select by pointing at something on the screen, for the 2D menu stuff.
For this I made a calibration algorithm that works by pointing
at the top-left and bottom-right corners of the screen. Then only hand positions within
this pyramid are used, and mapped to screen space.
Two cursors are calculated, making a total of three with the mouse cursor.
Selection is done by comparing the unit vectors camera->object and camera->cursor.
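The calibration and pointing selection could look roughly like this in code; `to_screen`, `pick`, and the angle threshold are illustrative names and values, not lifted from the actual patch.

```python
import math

# Sketch of pointing-based selection. Calibration records the hand
# position while pointing at the top-left and bottom-right screen
# corners; later hand positions are linearly remapped into [0,1]^2.
# Objects are then picked by comparing the unit vectors
# camera->object and camera->cursor: the smallest deviation wins.

def to_screen(hand, top_left, bottom_right):
    """Remap a hand position into normalized screen coordinates."""
    return tuple(
        (hand[i] - top_left[i]) / (bottom_right[i] - top_left[i])
        for i in range(2)
    )

def unit(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def pick(camera, cursor_dir, objects, max_angle=0.1):
    """Return the object id best aligned with the cursor ray, or None."""
    best_id, best_dot = None, math.cos(max_angle)
    for obj_id, pos in objects.items():
        d = unit(tuple(p - c for p, c in zip(pos, camera)))
        dot = sum(a * b for a, b in zip(d, unit(cursor_dir)))
        if dot > best_dot:  # smaller angle = larger dot product
            best_id, best_dot = obj_id, dot
    return best_id
```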

For the modeling I select things by moving my hands in 3D space.
For this I fix the skeleton's hands in front of the camera
so that they move and rotate with the view.
This works by multiplying the skeleton's world coordinates with the camera matrix.
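A minimal sketch of that transform, assuming a row-major 4x4 camera matrix applied to column vectors in homogeneous coordinates.

```python
# Sketch: attach a skeleton point to the camera by transforming it
# with a 4x4 camera matrix (row-major, column vectors assumed). With
# the identity camera the point is unchanged; any camera motion then
# carries the hands along with the view.

def transform(matrix, point):
    """Multiply a 4x4 matrix with a 3D point (homogeneous w = 1)."""
    x, y, z = point
    v = (x, y, z, 1.0)
    out = [sum(matrix[r][c] * v[c] for c in range(4)) for r in range(4)]
    return tuple(out[i] / out[3] for i in range(3))  # back to 3D
```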

The most demanding part was the multitouch logic, which locks a cursor away from all other objects once one is selected: if you select something and move it around, you don't want to drag everything else along with it.
So there are many tricks and hacks in here to avoid unwanted situations.
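The locking rule could be sketched as a small grab manager (class and method names are illustrative, not from the patch):

```python
# Sketch of the cursor-locking rule: once a cursor has grabbed an
# object, that cursor is ignored by every other object (and that
# object by every other cursor) until the grab is released.

class GrabManager:
    def __init__(self):
        self.cursor_to_obj = {}   # cursor id -> grabbed object id
        self.obj_to_cursor = {}   # object id -> grabbing cursor id

    def try_grab(self, cursor, obj):
        """A grab succeeds only if both cursor and object are free."""
        if cursor in self.cursor_to_obj or obj in self.obj_to_cursor:
            return False
        self.cursor_to_obj[cursor] = obj
        self.obj_to_cursor[obj] = cursor
        return True

    def release(self, cursor):
        """Free both sides of the pairing when finger 1 is let go."""
        obj = self.cursor_to_obj.pop(cursor, None)
        if obj is not None:
            del self.obj_to_cursor[obj]
```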

The rest was a piece of cake:
for the modeling I made 2D menu points that switch between the four basic operations needed for mesh modeling:
[create Vertex] [create QuadFace] [create TriFace] [Move]
[image: creating a box and a pyramid]

When [create Vertex] is on,
a selectable and movable point with a unique ID is created at the hand's 3D position when finger 1 is pressed.

When [create QuadFace] or [create TriFace] is on,
a polygon is created and linked to 3 or 4 vertices by selecting them one after the other.
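The vertex/face bookkeeping behind these modes could look roughly like this; `Mesh`, `create_vertex`, and `pick_for_face` are illustrative names, not from the patch.

```python
import itertools

# Sketch of the mesh-editing state behind the four menu modes:
# vertices get unique ids at creation, and a face is linked to 3 or 4
# vertex ids picked one after the other.
class Mesh:
    def __init__(self):
        self._ids = itertools.count(1)
        self.vertices = {}   # id -> (x, y, z)
        self.faces = []      # lists of 3 or 4 vertex ids
        self._pending = []   # ids picked so far for the next face

    def create_vertex(self, pos):
        """[create Vertex] + finger 1: new point at the hand position."""
        vid = next(self._ids)
        self.vertices[vid] = pos
        return vid

    def pick_for_face(self, vid, size):
        """[create TriFace] / [create QuadFace]: collect 3 or 4 picks."""
        self._pending.append(vid)
        if len(self._pending) == size:
            self.faces.append(self._pending)
            self._pending = []
```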

Finally, I needed to save and load the results of my modeling work.
[image: 3ds Max]
I did this using the same syntax as the .obj format,
so the model can be exchanged directly with 3ds Max or other 3D apps.
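The save step can be sketched as a tiny .obj-style writer (`save_obj` is an illustrative name): vertex lines first, then face lines that reference vertices by 1-based index, which is what lets the result load in other 3D apps.

```python
# Sketch: serialize the mesh in .obj syntax. "v x y z" lines list the
# vertices in order; "f i j k [l]" lines reference them by 1-based
# index, as the .obj format requires.

def save_obj(vertices, faces):
    """vertices: ordered (x, y, z) tuples; faces: 1-based index lists."""
    lines = ["v %f %f %f" % v for v in vertices]
    lines += ["f " + " ".join(str(i) for i in face) for face in faces]
    return "\n".join(lines) + "\n"
```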

And of course,
[image: stereo]
it's double the fun in stereo.

credits go out to
all the people
who made this possible
by developing
these wonderful pieces of
open source software

Pure Data

GEM

OpenNI

Arduino

Sensebloom

Linux

and all the Kinect Hackers

9 comments:

Oleg Kostour said…

Man, this is great, I'm very impressed. I wonder if there's a way to direct message you through your blog. Let me know if there is. I wanted to shoot you an email if possible.

Oleg, from Canada.

Unknown said…

Is it open source? Is it possible to recreate the experiment here in Russia?!

Sebastian said…

@ oleg: sonsofsol@gmx.net
@ STRIBOJICH: it's all open source. You will need some Pd skills though.
I can send you the patches once you've got Kinect, OpenNI + NITE + Sensebloom OSCeleton running. This took me about 2 weeks... ;-/

PureVision said…

This is great work!! I can't wait to get this running! I will have Kinect running soon. Your work on user interfaces is cutting edge!
Blaine

Unknown said…

Very impressive... can you publish or send your patch?
I already have Kinect and OSCeleton up and running.

This is a sort of patch dream... Kinect (and similar devices) are so young and so promising, and connected to Pd it's just magic stuff (because Pd is magic).

Thanks a lot

ciao!

Sebastian said…

I uploaded the patches to
www.3rd-eye.at/GEM-engine-kinect.zip
It's in a very beta state.
Watch the tutorial video about GEM-engine first.
Much success and fun!

Sebastian said…

I'm sending you the patches I made for the video.
You will need to start them in the following order.

1
/GEM_engine1.pd - for viewport navigation (explanation here: http://www.youtube.com/watch?v=5lNDBVoqico)

OSCeleton + Arduino connected?
2
/kinect3Dmodeling/0_osceleton.pd - receives OSC from OSCeleton + Arduino (set the ports here)
3
/kinect3Dmodeling/1_3Dmodeling.pd


I did some documentation, but it's still quite messy...
I hope everything is there; I couldn't test it, as I changed my setup.
If you run into trouble, just write me a mail. Of course I'd be happy to hear what you guys make of it. Have fun!
schena gruas
sebastian

jjbarrows said…

hi,
super cool program - I had been hoping I could use my Kinect for interactive 3D modelling, and I'm an avid Pd/GEM user - but my setup fails to recognize the unitvector3D object used in your patch (I'm using Pd-extended on Ubuntu).

Sebastian said…

@ jjbarrows:
You can find that file in gemengine1.pd/pd-tools/
It's called pd-unitvector3D.
Copy the content to a new file and save it in the GEM-engine folder; maybe add this to the search path.
Sorry about the inconvenience.