Sunday, March 10, 2013

Digitizing the Em2

Some of you may be familiar with the Raspberry Pi minicomputer. Some of you might also be familiar with the Arduino digital interface/controller. My colleague Todd ordered a Raspberry Pi, and it arrived this past week. I couldn't help but imagine how we could incorporate this tiny computer into our Em2 hacks.

The Raspberry Pi has a few advantages over the Arduino board that make the Pi a better base platform for digitizing the Em2.

  • Linux-based operating system, programming opportunities with Python

  • HDMI output for display on a large monitor

  • USB input/output for keyboard, mouse, hard drives, Kinect scanner, etc.

  • Ethernet (10/100) connectivity

  • SD card as boot/flash drive (download/mount data and images with regular computer)

  • General purpose Input/Output pins for connecting other devices



That last item is the big one: the general-purpose I/O pins provide opportunities to connect to digital devices like the LRRD's digital flow controller, or to other sensors, so we could collect data from several sources and then combine and sync all of the output.
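To make that concrete, here is a minimal sketch of the kind of thing the GPIO pins make possible, assuming the RPi.GPIO Python library and a hypothetical flow sensor that pulls a pin low once per pulse (the pin number and the sensor itself are placeholders for illustration):

    # Count pulses from a hypothetical pulse-output flow sensor on GPIO pin 17.
    # Assumes the RPi.GPIO library is installed and the sensor pulls the pin low
    # once per pulse.
    import time
    import RPi.GPIO as GPIO

    FLOW_PIN = 17          # BCM pin number (placeholder - use whatever pin you wire up)
    pulse_count = 0

    def count_pulse(channel):
        """Called by RPi.GPIO each time the pin sees a falling edge."""
        global pulse_count
        pulse_count += 1

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(FLOW_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
    GPIO.add_event_detect(FLOW_PIN, GPIO.FALLING, callback=count_pulse)

    try:
        while True:
            time.sleep(10)
            print("pulses in the last 10 s: %d" % pulse_count)
            pulse_count = 0
    finally:
        GPIO.cleanup()

The same pattern - edge detection plus a counter or a log file - should work for just about any pulse- or switch-type sensor hung off the header.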

My first priority is to get the Raspberry Pi to talk to the Kinect and automate the 3D scanning process. Ideally, we'd have two Kinects to cover the entire stream table. The scans would then be linked to an overall timeline of a particular experiment run (with information on discharge, sediment supply, and base level). There is also a 5 MP digital camera in development for the Pi that could form the basis of photogrammetry measurements. With everything automated, the cameras could be locked into position to maintain consistency from scan to scan.
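As a first proof of concept, something like the sketch below could grab and save a single depth frame (assuming the libfreenect driver and its Python wrapper - the freenect module - plus numpy are installed on the Pi, and that the Pi can keep up with the Kinect's data stream):

    # Grab one depth frame from the Kinect and save it for later processing.
    # Assumes libfreenect, its Python bindings (freenect), and numpy are installed.
    import time
    import numpy as np
    import freenect

    depth, _ = freenect.sync_get_depth()       # 640x480 array of raw 11-bit depth values
    filename = "depth_%d.npy" % int(time.time())
    np.save(filename, depth)                   # timestamped file name ties the scan to the run timeline
    print("saved %s" % filename)

Run something like that from cron or a timing loop and you have the beginnings of an automated scan schedule.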

My dream setup includes a Raspberry Pi (or two) controlling and logging:

  • Discharge from the digital flow controller (or monitored via a simple Venturi tube, pump voltage, etc.)
  • Kinect/3D data
  • Sediment supply system voltage (either the LEGO version I've got, or something more robust)
  • Base level elevation measurements
  • Time-lapse photography
  • An optical sediment sensor consisting of a UV LED to detect individual fluorescent plastic bits that make up a small fraction of the sediment (as an estimate of bedload transport)
  • All of it synchronized to a single timeline (a rough logging sketch follows this list)
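The "single timeline" part is mostly a logging problem. A rough sketch of how it could work in Python: poll each instrument on a fixed interval and write everything into one timestamped CSV file. The read_* functions below are placeholders for whatever each sensor ends up needing:

    # Log all instruments to one CSV with a shared time column.
    # The read_* functions are placeholders - each would be replaced by real
    # code talking to the flow controller, Kinect, sediment sensor, etc.
    import csv
    import time

    def read_discharge():
        return 0.0      # placeholder: flow controller, Venturi tube, or pump voltage

    def read_base_level():
        return 0.0      # placeholder: base level elevation measurement

    def read_sediment_pulses():
        return 0        # placeholder: counts from the UV/fluorescent bedload sensor

    INTERVAL = 5.0      # seconds between rows (placeholder value)

    with open("run_log.csv", "w") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "discharge", "base_level", "sediment_pulses"])
        start = time.time()
        while True:                              # stop with Ctrl-C
            t = time.time() - start
            writer.writerow([round(t, 1), read_discharge(),
                             read_base_level(), read_sediment_pulses()])
            f.flush()                            # keep the file current in case the run dies
            time.sleep(INTERVAL)

Each device could log on its own and be merged afterward, but writing one row per time step keeps the syncing trivial.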

So, I'll keep on hacking and we'll see where we're at in a month, semester, or year. Who knows - these kinds of things always end up changing as new opportunities arise, plans end up being overly ambitious, technology doesn't cooperate, or whatever. The thing to keep in mind is that (to rephrase John Lennon) "Research is what happens while you are busy making other plans."

4 comments:

1. Matt, I'm pretty inexperienced at data acquisition, but why not just use a desktop and Labview?

2. Hey Jeff - I think part of the problem is the cost of Labview... I haven't looked recently, but if I remember correctly, it is quite expensive.
   Katie

3. Jeff (and Katie), the goal with my Emriver hacks is to keep the cost of any particular mod under $50 (although a used Kinect sensor goes for about $70). There's a student version of Labview that doesn't cost too much, and if the institution has the resources, it could be an effective setup. Todd and I are looking at using ImageJ for more of the data processing and analysis because of the cost.

4. Matt - that is a laudable goal! Labview is indeed pricey, and you actually need to do quite a bit of behind-the-scenes work to get it to communicate with the Kinect. And then you have to decide what you want to use to analyze the data. In addition to the low equipment cost, I'm interested in exploring options that use the fewest pieces of software. Not only is the learning curve steep for me, but if I want to use some of this in labs, I can't ask the students to learn three different software packages.
