Monday, April 26, 2010

Making the BeagleCam - frame processing for objects

After a bit of fiddling with opkg, the OpenCV samples were installed. The plan is to make an office behaviour analysis system using a hallway camera. The school already has several surveillance cameras, but I can't really get access to them; I did, however, make a JavaScript-based frame-by-frame loop for this system.

After testing the OpenCV samples for a bit I realized that a filtered image stream can be very handy for saving processing and storage. The idea is to utilize gstreamer chaining to slot in an OpenCV facedetect module and save only frames where a face is detected.
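The gstreamer facedetect element is the goal; as a stand-in sketch of the detection step itself, here it is in the OpenCV Python bindings. The cascade file path and detection parameters are assumptions, not values tested on the board.

# face_filter.py - decide whether a frame is worth keeping.
# A sketch only; the cascade path below is an assumption and
# may live elsewhere depending on the OpenCV install.
import cv2

CASCADE_PATH = "haarcascade_frontalface_default.xml"
cascade = cv2.CascadeClassifier(CASCADE_PATH)

def has_face(frame):
    """True if the Haar cascade finds at least one face in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)   # evens out dim hallway lighting
    faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=3)
    return len(faces) > 0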

I can currently capture timestamped frames using gstreamer:

gst-launch v4l2src num-buffers=1 ! video/x-raw-yuv,width=640,height=480,framerate=30/1 ! ffmpegcolorspace ! jpegenc ! filesink location=$(date +"%s").jpg

Then I can use the OpenCV peopledetect and facedetect stages in the chain with queued-up frames and start saving once the detection starts producing output frames. This will greatly reduce the number of boring hallway frames, and the board can concentrate on processing interesting events, i.e. people approaching the camera. The result can then be used for interesting data-mining work such as time spent in the office, walking speed, hallway meetings between colleagues and number of days without changing shirts.
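Until the gstreamer plumbing is sorted, the save-on-detection logic can be sketched directly in Python, reusing has_face() from above. The camera index is an assumption (the same /dev/video0 the gst-launch pipeline uses).

# capture_filter.py - keep only timestamped frames that contain a face.
import time

import cv2

from face_filter import has_face   # the sketch above

cap = cv2.VideoCapture(0)           # assumed to be /dev/video0
while True:
    ok, frame = cap.read()
    if not ok:
        break                       # camera dropped out
    if has_face(frame):
        # epoch-seconds filename, matching the $(date +"%s") convention above
        cv2.imwrite("%d.jpg" % int(time.time()), frame)
cap.release()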

Time to ensure the gstreamer OpenCV module works properly on the BeagleBoard.

Saturday, April 17, 2010

Faking Earthquakes - Live IMU data

Adelaide had a very small tremor on Friday and the "scientists" are using statistics to say a stronger one is on the way. Well, that is cause for concern indeed; let's get paranoid and install personal seismographs in all our houses, in addition to the bushfire, flood and tsunami alert systems.

At this week's hackerspace I tried to stabilize the video stream from the BeagleBoard and push it out to the web via gstreamer, with limited success. Flumotion was recommended to me. I will push the stream out over USB networking from the BeagleBoard, aggregate it on a secondary server and publish via Flumotion. Still, the kernel lockups when reading the webcam are annoying.

I also finally managed to get the 3DM-GX2 reading done with Python and pyserial. This provides fun ways of automating testing on the rig, e.g. measuring mechanical properties by driving the motors at certain speeds while talking to the Arduino over serial and recording IMU data. I have both logging and a live display working. The live display without caching has a lower sampling rate, since I haven't threaded the sampling separately from the display, but at least the graph updates as I move the IMU about. So here are some pylab plots.
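For reference, the polling loop boils down to something like the sketch below. The port name, baud rate and the 0xC2 record layout are my recollection of the GX2 data communications protocol, so treat them as assumptions.

# imu_read.py - minimal sketch of polling the 3DM-GX2 with pyserial.
import struct
import serial

port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

def read_accel_angrate():
    """Poll one Acceleration & Angular Rate record (command 0xC2)."""
    port.write(b"\xc2")
    reply = port.read(31)        # echo byte + 6 floats + timer + checksum
    if len(reply) != 31 or reply[0:1] != b"\xc2":
        return None              # short read or bad echo, drop the sample
    # six big-endian floats: accel x, y, z (g), ang rate x, y, z (rad/s)
    return struct.unpack(">6f", reply[1:25])

if __name__ == "__main__":
    # plain CSV logging; the live display is handled separately below
    with open("imu_log.csv", "a") as log:
        while True:
            sample = read_accel_angrate()
            if sample:
                log.write(",".join("%.5f" % v for v in sample) + "\n")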

Note that the red curve settles at -1 g; this is equal to the gravitational acceleration. I wonder what it would record at high altitude. Maybe it could be used as a gravity-based locator system in GPS-denied underwater scenarios. The live graph uses the wxPython backend for matplotlib and samples very slowly.
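Moving the sampling into its own thread should fix the rate. Here is a minimal sketch of the split, reusing the read_accel_angrate() helper from above; the buffer size, axis limits and redraw rate are arbitrary choices, and I use plain matplotlib interactive mode rather than the wx embedding.

# imu_live.py - sample in a background thread, draw in the main thread.
import collections
import threading

import matplotlib.pyplot as plt

from imu_read import read_accel_angrate  # pyserial helper sketched above

buf = collections.deque(maxlen=500)       # ring buffer of recent accel z

def sampler():
    # runs flat out in the background; the display no longer throttles it
    while True:
        sample = read_accel_angrate()
        if sample:
            buf.append(sample[2])          # accel z settles near -1 g at rest

t = threading.Thread(target=sampler)
t.daemon = True                            # die with the main thread
t.start()

plt.ion()
fig, ax = plt.subplots()
line, = ax.plot([], [])
ax.set_ylim(-2, 2)
ax.set_ylabel("accel z (g)")
while True:
    data = list(buf)
    line.set_data(range(len(data)), data)
    ax.set_xlim(0, max(len(data), 1))
    plt.pause(0.05)                        # redraw about 20 times a second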

Final word - with proper sensitivity and stable mounting points, you can build yourself a seismograph with an IMU.
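In the crudest form that could just mean flagging samples that stray from the -1 g resting value noted above; a toy sketch, where the threshold is a pure guess to be tuned on the real mounting.

# tremor.py - naive trigger on deviation from the resting accel z.
import time

from imu_read import read_accel_angrate  # pyserial helper sketched above

REST_G = -1.0      # resting accel z from the plots above
THRESHOLD = 0.05   # g; a pure guess

while True:
    sample = read_accel_angrate()
    if sample and abs(sample[2] - REST_G) > THRESHOLD:
        print("tremor? accel z = %+.4f g at %s" % (sample[2], time.ctime()))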

Wednesday, April 14, 2010

Streaming video from Beagle - not from Mars

I can proudly claim to have a video feed from the Beagle; unfortunately it is not the one on Mars. The BeagleBoard is running the Angstrom 2.6.29 kernel and using the uvcvideo module in conjunction with an el-cheapo ($4 cost + $4 shipping) USB webcam.

root@beagleboard:~# lsusb
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 002: ID 1e4e:0100  <- Webcam
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub


Then I connected to the device using Ethernet-over-USB (which also supplies power).

ubuntu@ubuntu:~$ lsmod | grep cdc
cdc_acm                16544  0
cdc_subset              3100  0
cdc_ether               4924  0
usbnet                 17188  2 cdc_subset,cdc_ether

I SSH-ed in with X forwarding enabled and streamed mplayer's display from the Beagle to my main screen.

mplayer -fps 30 -tv driver=v4l2:width=640:height=480:device=/dev/video0 tv://

So here is a screencap of the webcam looking back at the Beagle.

Sunday, April 11, 2010

PIRR Review - Radars are everywhere

Last week I attended the PIRR (Progress in Radar Research) workshop sponsored by DSTO. As expected, it was full of DSTO employees from various parts of the country, as well as some university people.

Zheng Shu from CSIRO CMIS had the most relevant presentation pertaining to my research area. CSIRO and UNSW are developing techniques for combined use of Landsat and PALSAR for forest monitoring in Tasmania.

There were also talks dense with applied mathematics and beamforming, including some where people were tracking artillery shells with HH/VV radar, even though a shell is symmetric and does not show much difference in cross-section between the two polarizations. Then there were interesting spotlight views of ships from Ingara.

I also saw a very small and neat K-band (24 GHz) dual-pol patch antenna used for termite detection. Getting closer to a flying radar in the lab then; hope there are no termites around.

Wednesday, April 7, 2010

HackerSpace Adelaide - Flying Crates and Hawkboard Screen

Today at hackerspace we got a bit carried away with helicopters. There were even attempts to get a crate to fly; instead of flying, the crate nudged along the ground. By the end of the night the radio control had come out of said helicopter, and the motor-controlling PWM was being viewed on an Arduino oscilloscope.

We also managed to get the Hawkboard up on a VGA monitor, all the way to showing the Ubuntu login screen. We are planning to pool money and buy a few more Hawkboards; everybody is dreaming up stuff to do with them. I have mentioned RADAR on a small UAV before, and now it seems to be a real possibility with the ISEE RADAR. The time will come when there are drones flying around giving out speeding tickets.