Stefan

e[cam]Board - ToF skating

We, Lutz and Stefan, are building our own hand-gesture-controlled skateboard! Yay! Inspired by CM’s Board of Awesomeness and the endless possibilities offered by the CamBoard nano, we decided to create our own version of an e-powered skateboard. The superiority of the Nano over other market solutions for gesture control really calls for it: the Nano makes fast and direct 3D data acquisition easy. This will allow for a stable and robust gesture control algorithm that can cope with the harsh conditions of outdoor skateboarding (e.g. changing illumination, direct sunlight, and vibrations).

We would like to give you a weekly update of our current status. Make sure to follow this thread by clicking the button on the top right to stay tuned!

Current status:

Some time ago we ordered a remote-controlled 800 W e-skateboard and had to wait three weeks, full of anticipation, for its delivery. In the meantime, we already started to do what can be done without the board: software development.

We will use a small, fast SSD tablet PC for number crunching, and we are working on a simple algorithm to extract a robust speed value from a set of gestures that is still to be defined. For interfacing with the skateboard we decided to go “nano” again and use an “Arduino Nano” microcontroller. Atmel's microcontrollers are easily programmable in C/C++ using the AVR-GCC compiler and offer lots of I/O pins.

So the first thing we did when the skateboard was delivered last week (after two or three test rounds in the parking lot, which were really cool! PS: helmet and protective clothing are not a bad idea!) was to turn it inside out and figure out how to build the interface...

[Attached photos]

Thank you for your offer, Christian! Actually we are members of the PMDTeam, but we are working on this project after hours in our spare time. Regarding software: we want to try out a few things of our own first, but good ideas are always welcome. Stay tuned for an update on this topic!

Software development started right after unboxing our other toys: a Slate tablet PC and our very own Nano! As a first try we used Matlab and PMD's Matlab SDK in combination with the Nano and played around with some basic approaches to hand recognition. Detection can be done by simply looking for the nearest few pixels in the distance data and selecting all pixels that are not more than a few centimeters further away.
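
For illustration, here is a minimal sketch of that nearest-pixel segmentation, written in C++ with OpenCV rather than Matlab; the 3 cm band and all names are our own illustrative choices, not the exact values we use.

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

// Minimal sketch of the hand segmentation described above: take the closest
// valid pixel in the distance image and keep everything within a few
// centimeters of it. The 3 cm band is an assumed value.
cv::Mat segmentHand(const cv::Mat& distance) // CV_32F distance image in meters
{
    cv::Mat valid = distance > 0.0f;                 // zero marks invalid depth
    double minDist = 0.0;
    cv::minMaxLoc(distance, &minDist, nullptr, nullptr, nullptr, valid);

    cv::Mat hand;                                    // binary CV_8U mask of the hand
    cv::inRange(distance, cv::Scalar(minDist), cv::Scalar(minDist + 0.03), hand);

    // Small morphological opening against flying pixels and noise.
    cv::morphologyEx(hand, hand, cv::MORPH_OPEN,
                     cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(3, 3)));
    return hand;
}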

Our first approach for a speed control input is to take the area around the hand and detect open, closed, and continuous values in between. The easiest and probably fastest algorithm for this task is fitting a rectangle around the hand. And because we have a PMD camera with absolute distance data, we can normalize the pixel area by multiplying it with the square of the mean distance to get the real area size! This way, closing and opening the hand (and thus changing the area size) gives reproducible results that are independent of the hand's absolute position. This is quite an important point when you are balancing on our e[cam]Board: our preliminary tests on the virgin, unmodified eBoard clearly proved that with none of us being a skate or snowboard expert, all hands are needed to stay upright. In addition, we believe that although fancy gestures would be cool (of course), for e[cam]Board riding less is more. So we will keep it simple on purpose.
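
As a rough sketch of that normalization, continuing the C++/OpenCV illustration from above (kLens is an assumed per-camera constant relating pixel size to metric size, and all names are ours):

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

// Distance-normalized hand area: fit a rectangle around the hand pixels and
// scale its pixel area by the squared mean distance of the hand.
// kLens is an assumed camera constant (depends on focal length / pixel pitch).
double normalizedHandArea(const cv::Mat& distance, const cv::Mat& handMask, double kLens)
{
    std::vector<cv::Point> handPixels;
    cv::findNonZero(handMask, handPixels);
    if (handPixels.empty()) return 0.0;

    cv::Rect box = cv::boundingRect(handPixels);        // rectangle fitted around the hand
    double meanDist = cv::mean(distance, handMask)[0];  // mean absolute distance in meters

    // For a fixed field of view the real-world area scales with distance^2.
    return kLens * static_cast<double>(box.area()) * meanDist * meanDist;
}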

Current status: It works very well in Matlab - and fast! So...

--- ! Mission accomplished ! ---

Okay...but now we're hooked. With OpenCV for C++ there is a huge, free library of 2D image operations ready for hand tracking. Let's see what we can do with that!

PS: For our purpose we tuned the Matlab PMD SDK a bit. We get around 60 fps with the Nano and the Slate with an integration time of 1.5 ms, with all calculations - in Matlab! (78 fps at 500 µs). If you try this with the normal PMD-MDK you end up with a much lower frame rate. We put the "Separate Threaded" example from the downloads section into a little Matlab mex file. This takes care of the data sampling in a separate thread (like a free run mode) and Matlab can fetch new frames at any time. Source code and compiled mex files are attached here as well.
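
The idea behind the mex wrapper, sketched conceptually in plain C++ (the real code in the attached zip uses the PMD SDK; grabFrame() below is just a placeholder for those calls, and all names are illustrative):

#include <atomic>
#include <mutex>
#include <thread>
#include <utility>
#include <vector>

// Conceptual sketch of the "separate threaded" capture: a background thread
// samples the camera continuously (free-run style) while the caller - Matlab
// via the mex gateway in our case - fetches the latest frame whenever it likes.
struct Frame { std::vector<float> distances; };

Frame grabFrame() { return Frame{}; }   // placeholder: the real code calls the PMD SDK here

class ThreadedGrabber {
public:
    void start() {
        running = true;
        worker = std::thread([this] {
            while (running) {
                Frame f = grabFrame();               // blocks until a new frame arrives
                std::lock_guard<std::mutex> lock(m);
                latest = std::move(f);               // keep only the newest frame
            }
        });
    }
    Frame fetchLatest() {                            // non-blocking fetch from the caller side
        std::lock_guard<std::mutex> lock(m);
        return latest;
    }
    void stop() {
        running = false;
        if (worker.joinable()) worker.join();
    }
private:
    std::thread worker;
    std::mutex m;
    Frame latest;
    std::atomic<bool> running{false};
};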

[Attached screenshots]

PMDMDK_Multithreading.zip

Hardware status report:

When we took a first glimpse into the electronic bowels of the e-skateboard some time ago, it turned out that there is a "mainboard" that basically controls the motor (what else) and a separate, smaller PCB, the receiver unit, that picks up the commands from the remote control (see first photo). So all we had to do in order to gain control was to somehow get in between these two modules. A quick scan with an oscilloscope revealed that the communication is digital, one-wire, and one-way. It took a few minutes to break down the signal chain of bits and bytes into speed value bits, a speed/brake bit, a checksum bit, static bits, and mirror bits, but once this was done the hardest part was over.
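
To give an idea of what such a decoded frame looks like, here is a hypothetical sketch; the bit positions, field widths, and checksum rule below are invented placeholders and do not reproduce our board's actual protocol:

#include <cstdint>

// Hypothetical decoder for a one-wire remote frame. All offsets, widths and the
// parity-style checksum are invented for illustration only.
struct RemoteCommand {
    uint8_t speedValue;   // requested motor power or braking intensity
    bool    brake;        // speed/brake bit: true = brake, false = accelerate
    bool    checksumOk;   // result of the checksum test over the payload
};

RemoteCommand decodeFrame(uint16_t raw)
{
    RemoteCommand cmd;
    cmd.speedValue = (raw >> 4) & 0x3F;                 // assumed: 6 speed value bits
    cmd.brake      = ((raw >> 10) & 0x01) != 0;         // assumed: 1 speed/brake bit

    uint8_t parity = 0;
    for (int i = 4; i <= 10; ++i) parity ^= (raw >> i) & 0x01;
    cmd.checksumOk = (parity == ((raw >> 11) & 0x01));  // assumed: 1 checksum bit
    return cmd;
}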

As mentioned before, we decided to use an Arduino Nano to do all the work. The integrated Atmega microcontroller runs at 16 MHz and should be potent enough to handle all current and future requirements. It has plenty of I/O pins and also offers an integrated USB port, so direct communication between our tablet and the skateboard should be no problem. However, we thought that a wireless skateboard is much handier and added a class 1 Bluetooth module for the communication with the tablet. If no connection is established, the signals of the remote control are simply repeated, thus preserving the original functionality of our new e-Board. We designed a tiny PCB to accommodate the Arduino Nano, the Bluetooth module, and the old receiver unit. Since it is quite small, everything still fits into the original housing, so our mod is not visible from the outside (see second photo). Now - thanks to the greater range of Bluetooth - we are able to take control of the e-Board whenever we want, even if someone else is using it with the standard remote control.
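
A conceptual Arduino sketch of that pass-through / override logic (pin numbers, baud rate, the 2-byte Bluetooth command, and the sendToMainboard() helper are all assumptions for illustration):

#include <Arduino.h>

// Conceptual pass-through / override logic: commands arriving over Bluetooth
// (wired to the hardware UART here) take precedence; if no connection is
// active, the original receiver signal is simply repeated to the mainboard.
const uint8_t RECEIVER_PIN  = 2;          // assumed input from the original receiver unit
const uint8_t MAINBOARD_PIN = 3;          // assumed output towards the motor mainboard
const unsigned long BT_TIMEOUT_MS = 500;  // fall back to the remote after this gap

unsigned long lastBtCommandMs = 0;

void sendToMainboard(uint8_t speed, bool brake) {
    // Placeholder: the real firmware re-encodes and bit-bangs the one-wire frame here.
    (void)speed;
    (void)brake;
}

void setup() {
    pinMode(RECEIVER_PIN, INPUT);
    pinMode(MAINBOARD_PIN, OUTPUT);
    Serial.begin(115200);                 // Bluetooth module on the hardware UART (assumed)
}

void loop() {
    if (Serial.available() >= 2) {        // assumed 2-byte command: speed, brake flag
        uint8_t speed = Serial.read();
        bool brake = Serial.read() != 0;
        sendToMainboard(speed, brake);
        lastBtCommandMs = millis();
    } else if (millis() - lastBtCommandMs > BT_TIMEOUT_MS) {
        // No Bluetooth connection: repeat the original remote control signal.
        digitalWrite(MAINBOARD_PIN, digitalRead(RECEIVER_PIN));
    }
}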

Coming up next:

First tests of our combined software and hardware - finally making our e-Board an e[cam]Board.

PS: after two weeks of evening skateboard rides, still no broken bones - fingers crossed!

[Attached photos]

CamBoard nano + e-SkateBoard = e[cam]Board

And here we are again with our weekly update:

Concerning our software, we have reimplemented our tracking algorithm in C++ with OpenCV. Using our very own ROI and filter functions in conjunction with the library's contour functions improves the hand gesture input a lot. Now it’s even more stable and robust and allows really fine steps of motor power or braking intensity. The hand now has to give a starting signal (a spread hand) and is then tracked. This turns out to be especially important in our application, because with the Nano mounted on the skateboard, the hand is not necessarily the closest object. Good luck trying to keep your leg out of a 90-degree field of view! Okay, we could have just cut off that part of the image or tilted the Nano the right way...but hand tracking gives you more freedom of movement.
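
A condensed C++/OpenCV sketch of that contour step (our actual ROI and filter functions are not shown, and the "spread hand" start threshold is an illustrative placeholder):

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

// Condensed sketch of the contour-based hand detection: find the largest
// contour in the hand mask, fit a rectangle around it, and start tracking only
// after a "spread hand" (large area) has been seen once.
struct HandState {
    bool tracked = false;
    cv::Rect box;        // rectangle fitted around the hand
    double area = 0.0;   // contour area in pixels (distance-normalized later)
};

HandState updateHand(const cv::Mat& handMask, const HandState& previous)
{
    cv::Mat work = handMask.clone();     // findContours may modify its input
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(work, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) return HandState{};            // hand lost

    std::size_t best = 0;                                 // pick the largest contour
    for (std::size_t i = 1; i < contours.size(); ++i)
        if (cv::contourArea(contours[i]) > cv::contourArea(contours[best])) best = i;

    HandState state;
    state.area = cv::contourArea(contours[best]);
    state.box  = cv::boundingRect(contours[best]);

    const double START_AREA = 4000.0;                     // assumed "spread hand" start threshold
    state.tracked = previous.tracked || state.area > START_AREA;
    return state;
}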

On the hardware front, we decided to mount the slate below the skateboard. On top it takes up way too much space, and below there is plenty of clearance. Although some tape might have been sufficient for functionality, we designed a small aluminum chassis for the tablet's protection (and its looks). One may argue that this way we lose all optical feedback, but that is useful for debugging only. Eyes belong on the road, and the Nano needs to see you, not the other way around. We now use sound files as acoustic feedback for the "hand recognized" and "hand lost" signals - that's all you really need.

Last week it turned out that the internal 5 V power supply can't handle the additional circuitry of the microcontroller and Bluetooth module. Since the board was almost completely disassembled anyway while we built the slate chassis, we took the opportunity to add a more potent voltage converter. The fully assembled board is shown in one of the attached images. Apart from the Nano on top, it looks quite "normal", with the tablet well hidden. First preliminary indoor tests (with the wheels lifted off the floor) showed that everything works great! Next week (weather permitting) we will be able to show our first outdoor trial. Break a leg!

[Attached photos]

Just a tiny update this week. Vacations are coming up (at least for one of us).

Sadly the weather was rainy all week, so we only had a few opportunities to try it out. Indoors it works great so far, so we are really excited! Actually driving outside will require some trial runs to find the perfect settings, compared to "driving" in the living room with the wheels lifted off the floor.

The weather forecast gives us some hope for the upcoming days (Yeah vacation! :) ).

We will try to get some videos online in the next few days, so stay tuned!

As the test driver of our group was away this week, I tuned the software algorithms a bit. After the few outdoor tries we had before last week, we decided to add an additional gesture, which you can already see in one of the previous videos: you only brake if the hand is spread wide in the horizontal direction. We hope that this gives a better idle state. I tried to implement it by multiplying the original power input (the contour's size) with the contour width, reduced by an offset - but only if the contour size is above the braking threshold. This means that the speed control is unchanged; only the brake input is rescaled.
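
In code form this is roughly the following (a small C++ sketch; the threshold and offset values are placeholders, not our tuned settings):

// Sketch of the rescaled brake input: the original power input (contour size)
// is multiplied by the horizontal spread of the hand (contour width minus an
// offset), and only applied above the braking threshold. Values are placeholders.
double brakeInput(double contourArea, double contourWidth)
{
    const double BRAKE_THRESHOLD = 6000.0;   // assumed contour-size threshold
    const double WIDTH_OFFSET    = 40.0;     // assumed width offset in pixels

    if (contourArea <= BRAKE_THRESHOLD)
        return 0.0;                          // below threshold: no braking, speed control unchanged

    double spread = contourWidth - WIDTH_OFFSET;
    if (spread < 0.0) spread = 0.0;          // a narrow (pointing) hand does not brake
    return contourArea * spread;             // rescaled brake intensity (normalized elsewhere)
}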

Well, it's untested so far, so we will see whether we keep this input method or stick to the old one with some threshold variations. We just need a few more test runs outside...

We used one of the very few recent days with reasonably good weather (meaning it was not raining all day) to do some outdoor tests. And there is just one thing to say: it’s a lot of fun!

We had already spent the last weeks riding the eBoard with the standard remote control, so we already had a feel for the board.

So now the only change is leaving the remote control at home. It still feels kind of strange to ride without one (and you certainly get lots of curious looks), but the gestures we employ are quite intuitive. Pointing at the camera - like Captain Picard’s famous “Engage!” gesture - increases the speed, and braking is simply achieved by spreading the fingers - the standard stop sign. In between, there is a range of intermediate speeds, so it really feels like an analog control.

In conclusion: The basic gesture controlled e[cam]Board works great! Of course, we have many more ideas about possible upgrades (if time allows).

Check out the final video of the e[cam]Board, it's online now!

Thanks to Lutz & Stefan for this awesome blog about their development progress - it was really fun to follow your weekly updates!

...and it is also really fun :D to ride the e[cam]Board - helmet recommended :ph34r:. Watch the video here:
