Greenroom is moving increasingly into the maritime domain with our edge-computing and machine learning capabilities. We’re doing this as part of our ongoing R&D, and increasingly with industry partners looking to apply our capabilities to their unique applications. These projects are all exciting, but in the meantime we thought we would hold an R&D day to test ourselves: develop a new application in under 24 hours to exercise our data pipeline and explore feasibility for a future project.
Using footage captured from an elevated vantage point, the Greenroom R&D team developed a system, running on embedded hardware, that could detect boats and whales. We did this in under 24 hours, which was a good test of our data pipelines and automated training processes.
The video above shows the system detecting and tracking objects of interest, in this case vessels and marine mammals (whales). By applying Greenroom Robotics Application Programming Interfaces (APIs), further insights can be derived, in real time. For instance, by tagging the location and frequency of whale sightings we could begin to model migratory patterns and interactions with whale-watching vessels.
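As a minimal sketch of the tagging idea: if each detection is stamped with a time and position, sightings can be binned onto a coarse geographic grid, and the per-cell counts become the raw input for a migration-pattern model. The `Sighting` type, grid size, and function names below are illustrative assumptions, not Greenroom's actual API.

```python
import math
from collections import Counter
from typing import NamedTuple

class Sighting(NamedTuple):
    # Hypothetical record of one tagged whale detection (not the real API)
    timestamp: float  # Unix time of the detection
    lat: float        # latitude in degrees
    lon: float        # longitude in degrees

def grid_cell(lat: float, lon: float) -> tuple:
    # Snap a coordinate onto a 0.1-degree grid (~11 km at the equator),
    # so nearby sightings aggregate into the same cell
    return (math.floor(lat * 10), math.floor(lon * 10))

def sighting_frequency(sightings):
    # Count sightings per grid cell: a crude density map over time
    # that a migratory-pattern model could be fitted against
    return Counter(grid_cell(s.lat, s.lon) for s in sightings)

# Example: two sightings near each other, one far away
sightings = [
    Sighting(0.0, 10.00, 20.00),
    Sighting(60.0, 10.02, 20.03),   # same 0.1-degree cell as the first
    Sighting(120.0, 50.00, 60.00),  # a different cell
]
freq = sighting_frequency(sightings)
```

A real system would also keep the timestamps per cell (to separate seasons) and deduplicate repeated detections of the same animal by the tracker's object ID.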
What’s especially exciting is that all of the processing is done on small devices designed to be embedded in robotic systems. This means the AI requires only minimal power, and the system can operate standalone, beyond communications range. This is going to let us push these algorithms literally to the edge.
Greenroom’s ability to run all of our custom software on small embedded devices (smaller than a lunch box) means we can deploy vision-based AI into any domain: Sea, Land, Air and Space. We are excited about working with Australian and international clients, so get in touch if you have a unique remote-sensing challenge (for manned or unmanned systems) that you think could be enhanced by vision-based AI running in real time, on the edge. Feel free to email [email protected] to start the conversation.