Aug 03 2011

Kinect 3D Scanning for Archeologists

As we’ve seen before, Kinect 3D scanning keeps getting more popular, including for outdoor work in the sciences:  “Archaeologists Now Use Kinect to Build 3-D Models During Digs”.

 

There are still some clear and major issues with using the Kinect outdoors and for scanning forests, so maybe it is time to give this a try in the lab?

Jul 29 2011

Introducing "Vanga"

I work for REBIOMA - a joint project of UC Berkeley's Kremen Lab and the Wildlife Conservation Society, Madagascar. We develop and apply spatial tools for biodiversity conservation in Madagascar. For example, we work with a wide array of individuals and institutions to publish high-quality biodiversity occurrence data and species distribution models on our data portal - work that has helped to identify 4 million hectares of new protected areas.

Last week, I visited the Ecosynth team to build and practice flying what we're calling "Vanga" - a Hexacopter that we will take to Madagascar in late 2011 to map forest cover and forest disturbance in the Makira and Masoala protected areas. 

We're excited about the potential for low-cost, high-frequency forest monitoring in two and three dimensions. We will start by testing the capacity of the system for producing high-resolution 2D ortho-mosaics of selected field sites. We also hope to explore the 3D modeling capabilities - this has real potential for contributing to ongoing biomass measurements, and contributing to forest carbon inventories. Finally, we plan to evaluate the potential of this system as a tool to help communities adjacent to protected areas measure and monitor their forest resources.

Jul 29 2011

Multirotors on the Colbert Report

Check out multirotors on the Colbert Report!!!  The clip starts at about 15 minutes into the program.

The researcher, Missy Cummings, an Associate Professor at MIT, is developing better human-multirotor interfaces so that people can steer the units using only a smartphone, which makes me wonder how different it is from the Parrot AR.Drone.

 

http://www.colbertnation.com/full-episodes/wed-july-27-2011-missy-cummings

Seeing this video reminded me of something I noticed when flying the Hexakopters on campus with Tom Allnutt last week (see his post here).  Many people stopped and asked, ‘What is that?’, as usual, while we were out practicing in the Quad at UMBC.  But almost everyone also asked if we had put a camera on it, as if that was the obvious thing to do with such a cool device.  I explained our research and that we do usually fly with cameras, and thought to myself that something is different now than when we were practicing last year.  In September 2010, when people asked us what we were doing, they never asked whether we were putting cameras on the devices, and they thought it was an odd thing to do when we told them about our work.  Now the practice seems to be expected.  I hope this signals a shift in perception of autonomous vehicles as useful tools for research and for recreational aerial photography, and not just greater public awareness of the other uses of such devices.

UPDATE: I've been thinking about this post, and in all fairness, the researcher is discussing the use of multirotors by the armed forces.  I posted for the sake of noting the significance of these devices in pop culture.

Jul 27 2011

ArduPilot/ArduCopter Update

As many of you know, our attempt at photographing Elbow Ridge Farm via two EasyStars this past weekend was anything but successful. Although the primary issue of Auto mode being inactive was eventually resolved at the field by trimming the endpoints on the mode toggle switch, we were still unable to fly our missions because the system failed to obtain a GPS lock. Even after that weekend, when the GPS was relocated to a more familiar area, it was unable to obtain a signal. Out of frustration, I decided to clear the GPS settings and delete the current firmware. Starting from scratch, I reloaded both the firmware for the GPS and the script that enables it to communicate with the ArduPilot, and in doing so I also updated the GPS to a more recent firmware version. Just as it had in the past, the ArduPilot was able to get a GPS lock within 5 minutes of being powered up (as indicated by the solid blue LED on the IMU, shown in the picture above). I’m planning on flying a short mission with the EasyStar tomorrow afternoon to make sure everything is working as it should. Hopefully this will better prepare us for our next trip to Elbow Ridge Farm.
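As an aside, for anyone who wants to confirm a GPS fix without staring at the IMU LED, here is a minimal sketch of watching the fix-quality field of NMEA $GPGGA sentences over a serial connection. The port name and baud rate are assumptions for illustration, and it only applies while the GPS is temporarily outputting standard NMEA rather than the binary protocol it normally uses with the ArduPilot.

```python
# gps_fix_check.py -- hedged sketch: watch NMEA $GPGGA sentences for a fix.
# Assumptions (not from the post): GPS wired to a USB-serial adapter on COM4,
# outputting standard NMEA at 38400 baud. Adjust PORT/BAUD for your setup.
import serial  # pyserial

PORT, BAUD = "COM4", 38400

def has_fix(nmea_line: str) -> bool:
    """Return True if a $GPGGA sentence reports fix quality > 0."""
    if not nmea_line.startswith("$GPGGA"):
        return False
    fields = nmea_line.split(",")
    # Field 6 of GGA is fix quality: 0 = no fix, 1 = GPS fix, 2 = DGPS fix.
    return len(fields) > 6 and fields[6].isdigit() and int(fields[6]) > 0

with serial.Serial(PORT, BAUD, timeout=2) as ser:
    while True:
        line = ser.readline().decode("ascii", errors="ignore").strip()
        if line.startswith("$GPGGA"):
            print("FIX" if has_fix(line) else "waiting for lock...", line)
```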

This week Jonathan gave me the camera mount and landing gear from one of the old Gaui quads so I could attach them to the new ArduCopter system, so I figured I’d post a picture of the new setup. The landing gear had to be extended by 2 inches to provide enough clearance for the camera, so I drew up a CAD model and laser-cut a set of extensions out of 1/8’’ thick plastic sheet (the triangular support structure on the bottom). I also added a servo to the camera mount and attached it to one of the output ports on the ArduPilot Mega to provide automatic camera stabilization. So far it seems to work great, but we’ll need to upgrade the stabilizing servo if we plan to fly missions with camera stabilization activated.
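As a rough illustration of what the stabilization channel is doing (not the actual ArduPilot Mega code), the controller reads the airframe pitch from the IMU and drives the camera-tilt servo the opposite way so the camera stays level. A minimal sketch of that mapping, with made-up gain, center, and travel values:

```python
# camera_tilt.py -- hedged sketch of IMU-driven camera tilt compensation.
# The gain, center pulse, and travel limits below are illustrative, not the
# values used on our ArduCopter.

SERVO_CENTER_US = 1500   # neutral pulse width (microseconds)
US_PER_DEGREE = 10       # how far the servo moves per degree of pitch
SERVO_MIN_US, SERVO_MAX_US = 1100, 1900

def camera_tilt_pulse(pitch_deg: float) -> int:
    """Counter the airframe pitch so the camera stays level."""
    pulse = SERVO_CENTER_US - pitch_deg * US_PER_DEGREE
    return int(min(SERVO_MAX_US, max(SERVO_MIN_US, pulse)))

if __name__ == "__main__":
    for pitch in (-20, -10, 0, 10, 20):            # degrees, nose-up positive
        print(pitch, "->", camera_tilt_pulse(pitch), "us")
```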


Jul 22 2011

Cheap UAV LIDAR System

I just came across this post on DIYDrones: http://diydrones.com/profiles/blogs/found-cheapestever-lidar-seems

Basically, the company Electro-Optical sells kits and spare parts that you would need to make your own LIDAR system. This includes things like the nanosecond timer, transmitter module, receiver module, controller, and laser diodes.
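The nanosecond timer is the key piece: a pulsed LIDAR measures how long the laser takes to bounce back, and at the speed of light one nanosecond of timing error is roughly 15 cm of range error. A quick back-of-the-envelope sketch of the time-of-flight math:

```python
# tof_range.py -- time-of-flight range from pulse round-trip time.
C = 299_792_458.0  # speed of light, m/s

def range_m(round_trip_ns: float) -> float:
    """Distance to target: the pulse travels out and back, so divide by 2."""
    return C * (round_trip_ns * 1e-9) / 2.0

if __name__ == "__main__":
    for t_ns in (10, 100, 667):
        print(f"{t_ns:4d} ns round trip -> {range_m(t_ns):7.2f} m")
    # A 1 ns timing error corresponds to ~0.15 m of range error:
    print("1 ns error ->", round(range_m(1), 3), "m")
```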

The cheapest kit runs about $430 and supposedly the boards are pretty small. Check out their site for yourself (it’s under construction).

http://www.eodevices.com/main_home_frameset.htm

Jul 18 2011

XBees Again


I think I have fixed the XBees, again, maybe…

I wanted to get our tablet laptop up and running again as a Hexakopter flying machine for the field – especially since I got the new Pentax WG-1 GPS camera in the mail today (I’ll post on that soon).  This laptop had already been running MikroKopter Tool v1.74a, allowing us to do 3D waypoint runs, but the XBees were not functioning at all.  I also had it in my head to install an SSD in this old laptop to give it a new lease on life – what better opportunity to try a fresh setup! 

A quick note to anyone who has found their way here with their own XBee woes: we are using XBee Pro DigiMesh 900 modules.  This post discusses the (hopefully) successful configuration of a pair of XBee Pro 900s, each mounted on an XBee Explorer USB.  In a previous post, Xbee Solutions?, I suggested that it is necessary to have an XBee Explorer Regulated on the MK end, but based on the results described below that may not be necessary.

I got all the standard drivers and software installed and running (XCTU and UART drivers) and plugged in the suspect XBees.  Windows 7 said it correctly installed the new hardware, but when I opened up MikroKopter Tool I could not get any XBee communication. AAAAAAAH!

Back to the internet, I found this long thread about XBee problems that offered promise: http://forum.mikrokopter.de/topic-21969.html

Taking a suggestion from the thread, I set up two XBees on the same machine in two instances of XCTU so I could effectively range test and compare parameters. Why had I never thought of that!?  I read the modem configuration from each unit, mostly noting anything that differed from the defaults and confirming that the baud rates were set correctly.  I quickly noticed that the Modem VID numbers were different, and the help dialog reads: “Only radio modems with matching VIDs can communicate with each other.”  One XBee was set to the default and the other to a specific number.  I didn’t remember making this change, but decided to set them both to the same number.  The range test then worked perfectly (see post picture).  Back in MikroKopter Tool I was back in business with wireless telemetry, but I still couldn’t transfer waypoints.  I kept getting that ‘Communication Timeout’ error.

I tried another suggestion from this post in the same thread and manually adjusted the Destination Addressing fields on each unit.  I noted the high and low serial numbers of each unit (SH and SL) and manually configured the high and low destination addresses to point at each other: XBee1 DL = XBee2 SL, XBee1 DH = XBee2 SH, and vice versa.

I flashed these settings, booted up MikroKopter Tool and was wirelessly transferring waypoints and receiving telemetry with no problems.
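For reference, the same cross-addressing can also be scripted over a serial connection instead of clicking through XCTU. This is only a sketch, not what I actually ran: it assumes each XBee is on its Explorer USB at 57600 baud (match whatever your modules are really set to) and uses the standard XBee AT command set (+++ to enter command mode, ATSH/ATSL to read the serial number, ATDH/ATDL to set the destination, ATWR to save).

```python
# xbee_crosslink.py -- hedged sketch: point two XBee Pro 900s at each other
# by copying each unit's serial number (SH/SL) into the other's destination
# address (DH/DL). Port names and baud rate are assumptions for illustration.
import time
import serial  # pyserial

def at_command(ser, cmd: str) -> str:
    """Send one AT command (already in command mode) and return the reply."""
    ser.write((cmd + "\r").encode("ascii"))
    return ser.read_until(b"\r").decode("ascii").strip()

def enter_command_mode(ser) -> None:
    time.sleep(1.1)                 # guard time before the escape sequence
    ser.write(b"+++")               # no carriage return after +++
    time.sleep(1.1)                 # guard time after
    assert ser.read_until(b"\r").strip() == b"OK"

def read_serial_number(port: str, baud: int = 57600):
    with serial.Serial(port, baud, timeout=3) as ser:
        enter_command_mode(ser)
        sh, sl = at_command(ser, "ATSH"), at_command(ser, "ATSL")
        at_command(ser, "ATCN")     # drop out of command mode
        return sh, sl

def set_destination(port: str, dh: str, dl: str, baud: int = 57600) -> None:
    with serial.Serial(port, baud, timeout=3) as ser:
        enter_command_mode(ser)
        at_command(ser, "ATDH " + dh)
        at_command(ser, "ATDL " + dl)
        at_command(ser, "ATWR")     # save settings to non-volatile memory
        at_command(ser, "ATCN")

if __name__ == "__main__":
    a, b = "COM5", "COM6"                      # assumed port names
    sh_a, sl_a = read_serial_number(a)
    sh_b, sl_b = read_serial_number(b)
    set_destination(a, sh_b, sl_b)             # XBee A -> XBee B
    set_destination(b, sh_a, sl_a)             # XBee B -> XBee A
```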

Of course, now we just have to see if it’s actually going to work in the field!

Next up: playing with the GPS camera!

Jul 15 2011

First Altitude Controlled Hexakopter Flight!!!

 

This past week I've been working on flashing the new firmware to fly altitude-controlled waypoints. As it turns out, the newest hardware (FC 2.1 ME, BL 2.0 required) was not needed to use the latest firmware. After working out some compatibility issues with the old version of MKtools, I was finally able to connect to the Hexakopter. Today we were able to do a flight test; check out the video for yourself (best viewed in full-screen HD).

Next week I plan to flash the new firmware onto the two remaining Hexakopters.

 

Why are you reading this? Watch the video!

Jul 14 2011

Sub-centimeter positioning on mobile phones?

Just came across this today at Slashdot: "Sub-centimeter positioning coming to mobile phones": http://bit.ly/pIvQ0e.

Apparently this is based on a technique called “SLAM”.  From Wikipedia: “Simultaneous localization and mapping (SLAM) is a technique used by robots and autonomous vehicles to build up a map within an unknown environment (without a priori knowledge), or to update a map within a known environment (with a priori knowledge from a given map), while at the same time keeping track of their current location.”
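To make the “mapping while localizing” idea concrete, here is a toy one-dimensional Kalman-filter SLAM sketch (nothing to do with the phone system in the article, which is vision-based): a robot with noisy odometry repeatedly measures its range to a single fixed landmark, and one filter jointly refines both the robot’s position and the landmark’s position. The interesting part is that the two estimates are correlated, so every range measurement improves both.

```python
# slam_1d.py -- toy 1-D SLAM: jointly estimate robot and landmark position.
# Purely illustrative; real visual SLAM on a phone is far more involved.
import numpy as np

rng = np.random.default_rng(0)
Q_ODOM, R_RANGE = 0.05, 0.02           # odometry / range noise (variances)

true_robot, true_landmark = 0.0, 10.0

# State: [robot_x, landmark_x], with its covariance.
x = np.array([0.0, 8.0])               # poor initial guess of the landmark
P = np.diag([0.01, 4.0])

H = np.array([[-1.0, 1.0]])            # measurement: range = landmark - robot

for step in range(50):
    # --- move the real robot and predict ---
    u = 0.5                                        # commanded step
    true_robot += u + rng.normal(0, np.sqrt(Q_ODOM))
    x[0] += u                                      # motion model (landmark static)
    P[0, 0] += Q_ODOM

    # --- noisy range measurement to the landmark ---
    z = (true_landmark - true_robot) + rng.normal(0, np.sqrt(R_RANGE))

    # --- Kalman update of robot AND landmark together ---
    S = H @ P @ H.T + R_RANGE
    K = P @ H.T / S
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P

print("true robot %.2f  est %.2f" % (true_robot, x[0]))
print("true landmark %.2f  est %.2f" % (true_landmark, x[1]))
```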

I could imagine this becoming VERY interesting for high spatial resolution 3D scanning in Ecosynth, but maybe I am missing some potential limitation to this?

Your thoughts?

Jul 09 2011

Leafsnap: An Electronic Field Guide


Yet another dimension of computer vision...

Leafsnap:

“…is the first in a series of electronic field guides being developed by researchers from Columbia University, the University of Maryland, and the Smithsonian Institution. This free mobile app uses visual recognition software to help identify tree species from photographs of their leaves. Leafsnap contains beautiful high-resolution images of leaves, flowers, fruit, petiole, seeds, and bark. Leafsnap currently includes the trees of New York City and Washington, D.C., and will soon grow to include the trees of the entire continental United States. This website shows the tree species included in Leafsnap, the collections of its users, and the team of research volunteers working to produce it.”

Jul 09 2011

Image-based Tree Modeling


Can the geometry of trees be captured using computer vision and then used to create models of tree structure?  YES!  Super cool work is described at Ping Tan’s website at the National University of Singapore:

http://www.ece.nus.edu.sg/stfpage/eletp/Projects/ImageBasedModeling/

 

There is still a long way to go before this will be useful for ecologists, but it is a huge step in the right direction!

YouTube version here…