Figure 13: Localised bee-flowers. (a) “My treasures” (“Meine Schätze”), (b) “All treasures” (“Alle Schätze”). Source: Shohrab Uddin.

The “Radar” function activates the visualization of the geolocation. Figure 13 shows how localised bee-flowers are presented using Google Maps. With the option “My treasures” (Figure 13a) all plants found by the user are presented. Small thumbnail images indicate the locations, and circles of 1 km, 2.5 km and 5 km radius show the usual flight ranges of honey bees. With the option “All treasures” the plants collected by all users are presented (Figure 13b). Via the additional function “Mehr” (“More”) the user can, for example, find a list of all flowers integrated in the current version of the app, select further detailed meta-information on each flower, apply for an extension of the flower catalogue, or simply access the Web-portal itself via a web browser.
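The flight-range rings around a hive can be reproduced with a standard great-circle (haversine) distance check. The following is a minimal sketch; the function names and the hive/flower coordinates are illustrative assumptions, not code from the app.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two WGS84 points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flight_ring(hive, flower, radii_km=(1.0, 2.5, 5.0)):
    """Return the smallest flight-range ring (in km) containing the flower,
    or None if it lies outside the outermost 5 km ring."""
    d = haversine_km(hive[0], hive[1], flower[0], flower[1])
    for r in radii_km:
        if d <= r:
            return r
    return None

# Hypothetical hive and flower positions (lat, lon):
hive = (48.80, 9.63)
print(flight_ring(hive, (48.81, 9.63)))  # ~1.1 km away -> 2.5 km ring
print(flight_ring(hive, (48.90, 9.63)))  # ~11 km away -> None (outside)
```

A map front end would then only need to draw the three fixed-radius circles around the hive and place the thumbnail markers at the flower coordinates.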
3.3 Design and development of the Web-portal
A core component is the Web-portal www.trachtfliessband.de with the plant catalogue and the functionalities needed for this project. User-friendly account and registration functions have been implemented to ensure the protection of data privacy. In addition, the Web-portal allows updating of the smartphone application with the newest database contents. A user-friendly GUI design for the presentation of data and for the analysis and query functions was essential. A major issue was the design and implementation of the internal structure of the databases involved. The data provided by the smartphone application has to be received and prepared for presentation, including the meta-data information. The generation of the honey yield radar image is a core module, as already shown in chapter 3.2. For the cartographic presentation a structure was designed that can use Google Maps, but also other services such as Bing Maps or OpenStreetMap. An area clustering of the found flowers, for positioning and for presenting area and quantity information, was developed with many options for the user. As a final function, the honey yield can be estimated based on the user-provided “All treasures” data. The database allows a simulation of the honey yield at a specific location, using the amounts of nectar and pollen derived from the user-collected flower types and locations. The example in Figure 14 estimates the honey yield in an area with 102 found flowers on 411 m² and a very high pollen diversity, which is very good for the honey bees. An important aspect is also the good distribution of pollen and nectar.

Figure 14: Automatic estimation of honey yield in a selected area with different flower locations. Source: Boris Willi.
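How such an estimate could be assembled is sketched below from flower counts, area, and per-species nectar values. All numeric constants (nectar sugar per flower, the sugar-to-honey factor) and the species mix are illustrative assumptions, not the actual figures used by the portal.

```python
# Sketch of a honey-yield estimate from crowdsourced flower finds.
# Per-species nectar values are illustrative assumptions, not the
# actual figures used by trachtfliessband.de.

NECTAR_MG_SUGAR_PER_FLOWER_DAY = {  # assumed: mg sugar per flower per day
    "red clover": 0.6,
    "yarrow": 0.2,
    "dandelion": 0.9,
}

def estimate_yield(finds, area_m2, season_days=30):
    """finds: dict species -> number of found flowers in the area.
    Returns (flower density per m2, rough honey estimate in g)."""
    n_flowers = sum(finds.values())
    density = n_flowers / area_m2
    sugar_mg = sum(NECTAR_MG_SUGAR_PER_FLOWER_DAY.get(s, 0.0) * n
                   for s, n in finds.items()) * season_days
    honey_g = sugar_mg / 1000.0 / 0.8  # honey is roughly 80 % sugar
    return density, honey_g

# 102 finds on 411 m2, as in the Figure 14 example (species split assumed):
density, honey = estimate_yield(
    {"red clover": 70, "yarrow": 20, "dandelion": 12}, area_m2=411.0)
print(f"{density:.2f} flowers/m2, ~{honey:.1f} g honey over the season")
```

A real simulation would additionally weight the pollen diversity and the spatial distribution of the finds, as noted above.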
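The area clustering of the found flowers for map presentation can be sketched as simple grid binning: nearby finds collapse into one marker with a count. Cell size, coordinates, and function names here are illustrative assumptions.

```python
from collections import defaultdict

def grid_cluster(points, cell_deg=0.01):
    """Bin (lat, lon) flower finds into grid cells (~1 km at 0.01 deg)
    and return one marker per cell: (centre lat, centre lon, count)."""
    cells = defaultdict(list)
    for lat, lon in points:
        key = (int(lat // cell_deg), int(lon // cell_deg))
        cells[key].append((lat, lon))
    markers = []
    for pts in cells.values():
        clat = sum(p[0] for p in pts) / len(pts)
        clon = sum(p[1] for p in pts) / len(pts)
        markers.append((clat, clon, len(pts)))
    return markers

# Three finds close together and one far away -> two map markers:
finds = [(48.801, 9.631), (48.802, 9.632), (48.803, 9.633), (48.900, 9.700)]
print(grid_cluster(finds))
```

Varying `cell_deg` with the map zoom level gives the user the coarser or finer quantity view mentioned above.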
3.4 Analysis of UAV imagery for flower recognition
The honey yield radar can be extended by using UAV platforms to monitor larger areas at reasonable cost. It is, however, not possible to safely identify single flowers; the intention is rather to identify regions where bee-flowers occur and to estimate their extent. The basis is the acquisition of digital imagery with an RGB camera and a NIR (near-infrared) camera, in order to be able to use classical classification algorithms for multi-spectral imagery. In co-operation with the company GerMAP GmbH, four test flights with different fixed-wing UAVs, two RGB cameras and a NIR camera were conducted in July 2014 (Chaudry, 2015). The flying height above ground was about 100 m, resulting in a ground sampling distance of about 2-3 cm. At that time the simultaneous use of two cameras on one platform was not possible; the RGB and NIR flights were therefore conducted directly after each other within a short time span. The resulting images had to be registered and then analysed as multispectral imagery. For this purpose, aerial triangulation, DTM/DSM generation and orthophoto/orthomosaic processing were applied. Figure 15 presents the RGB orthomosaic and a patch of it; Figure 16 shows the NIR orthomosaic.
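Once the RGB and NIR orthomosaics are co-registered, a classical vegetation index such as NDVI can be computed per pixel as one basis for multispectral classification. The sketch below uses synthetic band values; it is not the project's actual processing code.

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red) for co-registered bands."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    # Avoid division by zero on pixels where both bands are 0.
    np.divide(nir - red, denom, out=out, where=denom > 0)
    return out

# Synthetic 2x2 example: vegetation reflects strongly in NIR, soil does not.
nir = np.array([[200, 180], [60, 50]], dtype=np.uint8)
red = np.array([[ 40,  50], [55, 45]], dtype=np.uint8)
print(ndvi(nir, red))  # top row (vegetation) high, bottom row (soil) near 0
```

Because the two flights were separated in time, the registration step above is what makes such a per-pixel band combination meaningful at all.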
Figure 15: (a) RGB orthomosaic of the test area Welzheim, (b) patch with a field of red clover (arrow). Source: Flight and AT: GerMAP GmbH; Orthomosaic: Pieneering OY.
This contribution has been peer-reviewed. doi:10.5194/isprsarchives-XLI-B3-863-2016
Figure 16: NIR orthomosaic image patch with a field of red clover (arrow). Source: Flight and AT: GerMAP GmbH; Orthomosaic: Pieneering OY.

The classification was based on the registered RGB and NIR orthomosaic images as well as on the digital surface model and the digital terrain model, using the eCognition software. Figure 17 presents the result of a supervised classification. The classes “Bare soil”, “Mixed Soil/Vegetation”, “Road” and “Vegetation” could be estimated. The class “Vegetation” was divided into the sub-classes “Ground Object”, “Low pasture”, “High Pasture” and “Trees/Bushes”. The result is very promising: in the class “High Pasture” the occurrence of yarrow could be confirmed, and in the class “Low Pasture” a whole field of red clover was correctly found. This shows that with UAV imagery including NIR data, larger flower areas can be classified, which could complement the crowdsourcing acquisition with smartphones on the ground.
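The role of the surface and terrain models in this class hierarchy can be illustrated with a toy pixel-based rule set on NDVI and vegetation height (DSM minus DTM). Note this is only a sketch: eCognition works object-based, and all thresholds below are invented for illustration.

```python
import numpy as np

def classify(ndvi, dsm, dtm, ndvi_veg=0.3, h_high=0.6, h_tree=3.0):
    """Toy per-pixel rule set mimicking the class hierarchy:
    0 bare soil, 1 low pasture, 2 high pasture, 3 trees/bushes.
    NDVI and height thresholds (in m) are illustrative assumptions."""
    height = dsm - dtm                    # normalised surface model (nDSM)
    cls = np.zeros(ndvi.shape, dtype=np.int32)            # default: bare soil
    veg = ndvi >= ndvi_veg
    cls[veg & (height < h_high)] = 1                      # low pasture
    cls[veg & (height >= h_high) & (height < h_tree)] = 2  # high pasture
    cls[veg & (height >= h_tree)] = 3                     # trees/bushes
    return cls

# Four synthetic pixels: soil, low pasture, high pasture, tree.
ndvi = np.array([0.05, 0.5, 0.6, 0.7])
dsm  = np.array([310.0, 310.3, 311.0, 315.0])
dtm  = np.array([310.0, 310.0, 310.0, 310.0])
print(classify(ndvi, dsm, dtm))  # -> [0 1 2 3]
```

The same height criterion is what allows separating a low field of red clover from high pasture with yarrow, as described above.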
Figure 17: UAV imagery – result of the classification. Source: Chaudry, 2015.
4. TESTS AND ANALYSIS RESULTS