The team at Greenroom has been busy writing lines and lines of code to bring the first version of AniML to life. What is AniML? It is a post-processing package for UAV (drone) imagery, used to accurately map animals over large areas. Initially this is done manually, with a point-and-click approach in which the user marks where animals are present and identifies their species. But that is only the starting point, and it is where things get interesting.
Let me introduce you to the ML of AniML: the machine learning component. Every time an animal is spotted, its details are saved both for post-mission analysis in AniML and to the mission database as training data for future automated counting. In practice, this involves splitting each image into animal-sized chunks to show Greenroom's machine learning algorithm, which then "learns" what we show it.
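To give a flavour of what that chunking step might look like, here is a minimal sketch of cutting fixed-size training chips around labelled animal positions. The function name, chip size, and clamping behaviour are illustrative assumptions, not AniML's actual implementation.

```python
import numpy as np

def extract_chips(image, points, chip_size=64):
    """Cut a fixed-size training chip around each labelled animal.

    image: H x W x C numpy array; points: list of (row, col) clicks.
    Chips that would extend past the frame are clamped to its edges,
    so every chip comes back exactly chip_size x chip_size.
    (Hypothetical sketch, not AniML's real pipeline.)
    """
    half = chip_size // 2
    h, w = image.shape[:2]
    chips = []
    for row, col in points:
        top = min(max(row - half, 0), h - chip_size)
        left = min(max(col - half, 0), w - chip_size)
        chips.append(image[top:top + chip_size, left:left + chip_size])
    return chips

# Example: two labelled animals in a 480 x 640 RGB frame,
# one near the centre and one near the bottom-left corner
frame = np.zeros((480, 640, 3), dtype=np.uint8)
chips = extract_chips(frame, [(100, 200), (470, 5)])
```

Each chip then becomes one training example, paired with the species label the user clicked, which is how a point-and-click session doubles as dataset curation.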
Machine learning is a bit of a buzzword these days, but in simple terms it means we can teach the software what it is looking at, so that down the track, once enough data has been gathered, we can run automated detection without the need for human interaction.
That’s right: it’s software that learns, with the goal of taking the tedium out of data collection and significantly increasing the accuracy of the data being gathered.
The initial in-house release is focused on animal detection; however, Greenroom's longer-term vision is to map any object to real-world coordinates, giving powerful data-analytic insights into whatever you may be looking at mapping. That might mean introduced pests, endangered animal species, or scientific studies focused on population assessments. We are building tools that make hard, repetitive and potentially error-prone work easier, so more time can be spent on the important work of making sense of the collected data and applying it to real-world problems.
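Mapping a detection from image pixels to real-world coordinates can be sketched roughly as below, assuming a nadir-pointing camera, flat terrain, and a known flight altitude. The function, the sensor and focal-length defaults, and the simple equirectangular degree conversion are all illustrative assumptions, not AniML's actual georeferencing method.

```python
import math

def pixel_to_world(px, py, image_w, image_h,
                   cam_lat, cam_lon, altitude_m,
                   sensor_w_mm=13.2, focal_mm=8.8):
    """Project an image pixel to approximate WGS-84 coordinates.

    Assumes a nadir-pointing camera over flat terrain; the sensor
    width and focal length defaults are placeholder values.
    (Hypothetical sketch, not AniML's real georeferencing.)
    """
    # Ground sample distance: metres on the ground per pixel
    gsd = (sensor_w_mm * altitude_m) / (focal_mm * image_w)
    # Offset from the image centre in metres (x east, y north)
    east = (px - image_w / 2) * gsd
    north = (image_h / 2 - py) * gsd
    # Convert the metric offset to degrees of latitude/longitude
    lat = cam_lat + north / 111_320.0
    lon = cam_lon + east / (111_320.0 * math.cos(math.radians(cam_lat)))
    return lat, lon
```

A real system would also account for camera attitude, lens distortion, and terrain elevation, but even this simple projection shows how a detection in a frame becomes a point you can plot on a map and count against repeat surveys.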
If you are interested in the power of machine learning and how it can be applied to animal detection from UAV imagery, feel free to get in touch with us at: [email protected]