Blockplayer is a system that uses a depth sensor to acquire and track a building block model in real time, as the user assembles and interacts with the physical model. This is the open-source project code that accompanies the IEEE TVCG 2012 (IEEE VR 2012) paper "Interactive 3D Model Acquisition and Tracking for Building Block Structures".

[Teaser image of the BlockPlayer project]

Building

The blockplayer project is intended to be run from the command line, with the project root as the current working directory. The following command builds the necessary Cython extensions in place:

python setup.py build_ext --inplace

It is also possible to build it in the normal way and install it as a library:

python setup.py install

Running the experiment

To run our experiment, you will need to download at least part of the dataset. The dataset is available at http://isue-server.eecs.ucf.edu/amillervr2012/dataset/ as 75 tar.gz files, each containing one run (15 seconds). Extract these into the data/sets directory. Use the following commands to run the experiment and prepare the HTML results report:

python experiments/make_output.py
python makewww/make_grid.py
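Downloading and unpacking many runs by hand is tedious. The sketch below is a hypothetical convenience script, not part of the project: only the base URL and the data/sets layout come from this README, and the helper names and the assumption that each archive unpacks under data/sets/ are ours.

```python
# Hypothetical helpers for fetching dataset runs. Only the base URL and the
# data/sets layout come from the README; everything else is an assumption.
import tarfile
import urllib.request
from pathlib import Path

BASE_URL = "http://isue-server.eecs.ucf.edu/amillervr2012/dataset/"

def fetch_run(name, dest="."):
    """Download one run archive, e.g. 'study_user1_z1m_add.tar.gz'."""
    out = Path(dest) / name
    urllib.request.urlretrieve(BASE_URL + name, out)
    return out

def extract_run(archive, dest="."):
    """Unpack an archive; runs are assumed to extract under data/sets/."""
    with tarfile.open(archive, "r:gz") as tf:
        tf.extractall(dest)
```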

If you have downloaded the entire dataset, then you can produce the average error graph as shown in Figure 9 of the paper:

python experiments/exp_avg.py
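The internals of exp_avg.py aren't described here. As a generic illustration of the idea only (the function name and data shapes are our assumptions, not the script's actual code), per-frame error can be averaged across runs of unequal length by padding with NaN and taking a NaN-aware mean:

```python
# Generic sketch of averaging per-frame error across runs -- not the actual
# code in experiments/exp_avg.py.
import numpy as np

def average_error(runs):
    """runs: list of 1-D sequences of per-frame error, possibly unequal length."""
    n = max(len(r) for r in runs)
    padded = np.full((len(runs), n), np.nan)
    for i, r in enumerate(runs):
        padded[i, : len(r)] = r
    # NaN-aware mean: shorter runs simply stop contributing past their end.
    return np.nanmean(padded, axis=0)
```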

Running with a prerecorded dataset

Download one or two of the files from http://isue-server.eecs.ucf.edu/amillervr2012/dataset/.

    $ wget http://isue-server.eecs.ucf.edu/amillervr2012/dataset/study_user1_z1m_add.tar.gz
    $ tar -xzf study_user1_z1m_add.tar.gz
    $ ipython --pylab=wx
    [1] run -i demos/demo_grid.py
    [2] go('data/sets/study_user1_z1m_add')

Running in live real-time mode

The system will first need to be calibrated. Place the Kinect sensor on a stand about a half meter above the table surface, facing down at around 45 degrees. Check that your desired work area is within the field of view of the camera. The minimum distance the sensor perceives is about half a meter. From the IPython shell, use the following commands:

    [1] from blockplayer.table_calibration import run_calib
    [2] run_calib()

The system will take a snapshot of the table surface, which should be clear of any objects. Click four points (clockwise) to define a quadrilateral work area. Calibration results are saved in data/newest_calibration. Next, run the demo with the following commands:

    [3] run -i demos/demo_grid.py
    [4] go(forreal=True)
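The four clicked points bound the work area. As a rough illustration of how a convex quadrilateral can mask 2-D points (this is a hypothetical helper, not BlockPlayer's own calibration code), one can check that every edge cross product has the same sign:

```python
# Hypothetical helper for illustration; not part of the BlockPlayer code.
import numpy as np

def in_quad(points, quad):
    """Boolean mask: which 2-D points fall inside a convex quadrilateral.

    Accepts either vertex winding by requiring all four edge cross
    products to share a sign (zero, i.e. on an edge, counts as inside).
    """
    points = np.asarray(points, float)
    quad = np.asarray(quad, float)
    crosses = []
    for i in range(4):
        a, b = quad[i], quad[(i + 1) % 4]
        edge, rel = b - a, points - a
        crosses.append(edge[0] * rel[:, 1] - edge[1] * rel[:, 0])
    crosses = np.stack(crosses)
    return np.all(crosses <= 0, axis=0) | np.all(crosses >= 0, axis=0)
```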

Dependencies

Blockplayer has several library dependencies that may be difficult to satisfy on your system. The script vmdist/install_vm.sh is the best reference for how to set them up.

In some cases the script refers to the most recent version of a project, so it may not be future-proof. For that reason, in addition to the source code, this project comes with an Open Virtualization Format machine image (*.ova) that has all necessary dependencies pre-installed. The machine image has been tested with VirtualBox OSE. The scripts in the vmdist directory can be used to build the virtual machine from a stock Ubuntu image, but some undocumented manual configuration is required.

Running BlockPlayer on a headless machine

To run the experiment and display the results on a headless machine, first start a virtual X server:

Xvfb

If a graphics card isn't available, specify the Mesa (rather than NVIDIA) OpenGL driver, e.g. with:

LD_PRELOAD=/usr/lib/mesa/libGL.so xvfb-run bash
