Atlas of Living Australia Webinar: Exploring next-gen biodiversity detection technologies

Good morning everyone, and welcome to the September webinar for the Atlas of Living Australia. My name is Martin Westgate; I run the science and decision support team at the ALA, and I'm really thrilled to be here. We've got a great webinar for you today, but before I launch into that I'd like to acknowledge that, while normally I'd be calling from the lands of the Ngunnawal People, I'm interstate today, calling from Sydney, from the lands of the Gadigal People of the Eora Nation, and I'd like to pay my respects to Elders past and present.

I'm really thrilled to be here today with three speakers from around the country. The topic of today's webinar, as you'll have seen whether you're watching live or on a future recording, is technology. That may not be entirely surprising: we've been doing these webinars for coming up on four years now, with a lot of interesting technology talks. But something that's particularly exciting to me about today is that it touches on an area very close to my heart, which is ecological monitoring. To say that technological development is moving quickly is an obvious statement these days, and yet in ecology the ability to detect species in the environment is increasing rapidly and markedly. So I'm really thrilled to hear from three experts in that field today, talking about different technologies that share that goal and that I think are genuinely revolutionising environmental monitoring in Australia and around the world.

We've got three speakers today: Christine from Macquarie University (give us a wave, Christine), Paul calling in from QUT, and Matthew calling in from WildObs at the University of Queensland. For those watching live, we have a questions portal, which you'll see on your screen, and a poll at the end if you want to tell us something about yourselves; that always helps us customise future webinars to you and your needs. With that in mind, I'll get out of the way. Our first speaker is Christine, so I'll hand over to you.

Thank you so much. Good morning everyone. As Martin said, my name is Christine Chivas; I'm a PhD candidate at Macquarie University, and today I'm going to give you an insight into some of the work we've been doing on monitoring birds and mammals in Kakadu using minimally invasive approaches. I'd first like to begin with an acknowledgement of Country: today I'm tuning in from the lands of the Wallumattagal Clan of the Darug Nation. I'd also like to acknowledge the Mirarr people, on whose lands in Kakadu I am fortunate enough to be able to do research, and extend that respect to any Aboriginal or Torres Strait Islander People present today.

First and foremost, why do we need novel approaches to biodiversity monitoring? Currently we're seeing a global biodiversity decline, and this has been particularly pronounced in Australia's loss of mammals: a third of all modern mammal extinction events have occurred in Australia, despite it being home to only 6% of the world's mammalian diversity. This has resulted in 39 mammals going extinct since European settlement, with a further 110 listed as at risk of extinction. Although Australian birds are faring better, we do have 163 birds currently listed as at risk of extinction, and we've seen a 60% decline in the abundance of these threatened birds over the last 40 years.

This brings me to next-gen biodiversity techniques. Ultimately, the first step in being able to temper this rate of biodiversity decline is gaining reliable and robust biodiversity monitoring data. Today I'm going to talk about three of these techniques: passive acoustic monitors, camera traps, and eDNA- or iDNA-based approaches. People are probably more familiar with the first two than with the eDNA or iDNA side of things. Some may have heard the term eDNA, which stands for environmental DNA; iDNA is essentially a subdiscipline within this field and stands for invertebrate-derived (or ingested) DNA. It relies on the capture of invertebrate samplers, things such as carrion flies, leeches and mosquitoes, which through their day-to-day feeding are collecting vertebrate DNA for us. This offers the opportunity to collect these invertebrates as a means of passively sampling the vertebrate diversity of an area. So far the approach has shown great promise in its ability to detect multiple vertebrate groups with a single method, which can be challenging with traditional techniques, as well as increased sensitivity for difficult-to-monitor groups, including small-bodied and arboreal mammals. Studies have demonstrated its ability to detect more mammals than established minimally invasive techniques such as camera traps, and in some cases to differentiate between cryptic and morphologically similar species.

This brings me to what we've been doing in Kakadu. We sampled six one-hectare reference plots on the north-eastern edge of Kakadu National Park during two rounds of sampling, undertaken in a wet and a dry season, employing these three minimally invasive techniques: acoustic monitors were deployed only during the dry season, camera traps for a six-to-eight-week deployment during each field campaign, and mosquito iDNA for two consecutive nights of sampling at the beginning of each field campaign. To give you an idea of what our set-up looks like: we have an acoustic monitor in the middle, programmed to record for two hours in the morning and two hours in the afternoon to capture the dawn and dusk chorus; a camera trap baited with rolled oats, honey and peanut butter; and a CO2-baited mosquito trap that pulls the mosquitoes in so they're sucked into a net that we come along and collect the next day.

As you can see, we definitely had no shortage of mosquitoes in Kakadu: we were getting on average between 800 and 1,500 mosquitoes per trap, and collected over 48,000 mosquitoes across the two field campaigns. So what do you do with 48,000 mosquitoes? Essentially, iDNA follows the same metabarcoding workflow you would use in an eDNA-based study. To simplify it: it starts with a DNA extraction, in our case from bulk mosquito samples; that DNA is amplified using mammal, vertebrate and avian primers and goes off for next-generation sequencing; in return you receive millions of reads back, which then go through a bioinformatic and taxonomic assignment process.
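To make that final taxonomic-assignment step a little more concrete, here is a minimal sketch in Python. The taxa, sequences and the 97% identity threshold are illustrative assumptions, and difflib stands in for the dedicated alignment tools (such as BLAST) that real pipelines run against curated reference libraries.

```python
# Minimal sketch of taxonomic assignment in a metabarcoding pipeline:
# match each sequencing read against a reference library and keep the
# best hit above an identity threshold. Sequences, taxa and the 0.97
# threshold are illustrative assumptions, not the study's parameters.
from difflib import SequenceMatcher

# Hypothetical reference library: taxon -> barcode fragment.
REFERENCE = {
    "Isoodon macrourus (northern brown bandicoot)": "ACGTTAGCCTGAGGTCATTA",
    "Petaurus breviceps (sugar glider)":            "ACGTAAGCCTGAGCTCATTC",
}

def identity(a: str, b: str) -> float:
    """Crude pairwise identity; real pipelines use BLAST or similar."""
    return SequenceMatcher(None, a, b).ratio()

def assign_taxon(read: str, threshold: float = 0.97) -> str:
    """Assign a read to the best-matching taxon, or 'unassigned'."""
    best = max(REFERENCE, key=lambda t: identity(read, REFERENCE[t]))
    return best if identity(read, REFERENCE[best]) >= threshold else "unassigned"

for read in ["ACGTTAGCCTGAGGTCATTA", "TTTTTTTTTTTTTTTTTTTT"]:
    print(read, "->", assign_taxon(read))
```

As Christine notes later in the talk, the assignment is only as good as the reference library: reads with no sufficiently close match stay unassigned.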
So what do these techniques tell us about the detection of birds? Just to clarify what we mean by sampling effort on these species recovery curves: for acoustic monitors it's the number of recording periods per monitor (due to time constraints we were only able to analyse the period that corresponded with the collection of iDNA); for iDNA it corresponds to each of those bulk mosquito samples; and for camera traps it's the number of days of deployment per camera trap. As you can see, we're getting significantly more bird detections with both acoustic monitors and iDNA than with camera traps. In total, during the dry season when we had all three methods deployed, we detected 79 avian taxa: 52 with acoustic monitors, 49 with mosquito iDNA and 11 with camera traps. Only six of these were detected by all methods, while 21 were detected by both acoustic monitors and iDNA. Despite this, there was still a considerable number of taxa unique to each method, essentially telling us that these two methods are additive and together will detect more taxa than either would on its own.

To summarise some examples of the birds we were getting with different methods: with the acoustic monitors we were detecting additional honeyeaters as well as transient species, things such as the magpie goose and red-tailed black cockatoo, that may be passing over our sites and getting picked up by the monitors but aren't necessarily stopping to be fed on by mosquitoes. Taxa detected with both methods included relatively common birds such as the blue-winged kookaburra, tawny frogmouth and galah. With iDNA we were getting more ground-dwelling birds, things such as the brown quail and the threatened white-throated grasswren, as well as semi-aquatic birds such as the Nankeen night heron and intermediate egret; aquatic birds can generally be challenging to monitor with approaches such as acoustic monitors.

So what about mammals? We found we were getting pretty much double the number of mammal detections with iDNA as with camera trapping, with roughly a quarter of the sampling effort. With camera traps we detected a total of nine taxa across both field campaigns, including relatively common ground-dwelling species such as dingoes, agile wallabies, feral cats and the northern brown bandicoot. The prolonged deployment of the camera traps over that six-to-eight-week period did, however, result in the detection of additional species that weren't picked up by iDNA, most notably the threatened black-footed tree-rat. With iDNA we detected a total of 20 taxa. This included additional arboreal species that aren't picked up by camera traps, such as the sugar glider and ghost bat; small mammals such as the pale field rat and common rock rat; and macropods, most notably the short-eared rock-wallaby at one of our sites near the sandstone escarpment, and the relatively elusive spectacled hare-wallaby.
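The species recovery curves behind these comparisons plot cumulative unique taxa against sampling effort. A minimal sketch of that computation, with made-up detection records; real analyses typically average the curve over many random orderings of the sampling units (rarefaction).

```python
# Sketch of a species accumulation (recovery) curve: cumulative number
# of unique taxa detected as sampling effort (camera-trap days or bulk
# mosquito samples) increases. The detection lists are made up.
surveys = [
    {"dingo", "agile wallaby"},     # sampling unit 1
    {"dingo", "feral cat"},         # sampling unit 2
    {"sugar glider", "ghost bat"},  # sampling unit 3
    {"dingo", "pale field rat"},    # sampling unit 4
]

seen: set[str] = set()
curve = []
for survey in surveys:
    seen |= survey                  # add any newly detected taxa
    curve.append(len(seen))

print(curve)  # [2, 3, 5, 6]: the curve flattens as fewer new taxa appear
```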
So, what about reptiles, you might ask. We found in this study that although mosquito iDNA appears to be a good approach for mammals and birds, that isn't so much the case for reptiles. We did find that camera traps detected the most reptiles; however, these were mostly large and diurnal species. In the bottom-left corner you can see a frill-necked lizard, and on the right-hand side of the right-hand picture a yellow-spotted monitor. Current studies suggest that a combination of camera traps, to catch those larger species, and live trapping, for smaller and nocturnal species, appears to be the best approach, really highlighting that there's still work to do in developing minimally invasive, sensitive techniques for the detection of reptiles.

As with any approach, all of these methods have their own sets of challenges. Correct identification can be challenging with all of them: with acoustic monitors you can have problems with incomplete calls, crossover, or variation in species' calls; with camera traps you can have issues with blurred imagery, especially for small, fast-moving species, and with morphologically similar species. To put this in perspective, with iDNA we were able to detect three different rat species, including the introduced black rat as well as the pale field rat and common rock rat; with camera traps, however, we struggled to get any species-level identifications from our rat images. iDNA has its own challenges too: essentially, any iDNA- or eDNA-based study is only as good as the reference library you have to match those sequences against. Furthermore, all of these techniques require their own specialised skill sets, ranging from the molecular skills required for iDNA-based approaches to the localised knowledge needed to identify acoustic calls. In saying that, though, both the use of AI and the commercial labs now offering eDNA services are making these approaches more accessible to a wider audience. Additionally, we have challenges with the transportation of CO2 to capture mosquitoes, particularly when working in places such as Kakadu, and you always have challenges with equipment failure and destruction: over half of our bait stations were destroyed during our campaigns. I'll let you take your best guess at who may have been responsible; we've got suspect number eight down in that right-hand corner.

Just to summarise all of this: we found that mosquito iDNA outperformed camera traps in the detection of mammals, detecting additional macropods and arboreal species. Acoustic monitors and iDNA appear to be complementary approaches, with acoustic monitors detecting more honeyeaters and transient species, and iDNA more ground-dwelling and semi-aquatic birds. Camera traps appeared to be best for reptiles in this study, although the current approach of combining camera traps and live trapping appears to be the best option overall, which really demonstrates the need to keep working on new, sensitive techniques for reptiles, and the continuing need to improve DNA reference libraries to support the uptake of eDNA- and iDNA-based approaches.

Just to wrap up, I'd like to acknowledge a few people. This research wouldn't have been possible without my supervisors Anthony Chariton, Adam Stow and Andrew Harford, the support of Macquarie University and the eDNA and Biomonitoring lab, and our collaborators and project partners at the Supervising Scientist, without whom this project couldn't have happened. Special thanks to Tom Mooney, David Lowensteiner and Kate Montgomery.
That's wonderful, thanks Christine. Look, I've already got so many questions, and there are already some coming through the chat as well, but normally in these events questions come at the end, so I'll queue those up. A fascinating talk, and thanks so much for sharing your research with us. Now, Matthew, over to you.

Thank you. Can you hear me? (Yes, and the slides are looking great too.) Thank you. I'm very happy to present the Wildlife Observatory of Australia, which has recently been funded through the ARDC, QCIF, TERN and the University of Queensland. It's a big initiative with a lot of people involved, whom I will acknowledge at the end. I also want to start by acknowledging the traditional owners of the land where I work here at UQ, and then I'm going to hop right in.

So what is WildObs, and why do we need it? We are going to create a national wildlife database that will deliver information for robust monitoring, science and management. We need this information on wildlife to track threatened species; we need it to track invasive species and their management; we need it to produce high-quality scientific innovations; we need it so the public can access information about where their amazing wildlife is; and we need it to meet the UN biodiversity goals we have committed to as a country. We also need it because there are dramatic events that require timely information so that we can manage them, things like bushfires, disease outbreaks and climate change, and we want to be able to sell biodiversity credits alongside our carbon credits, which means we have to be able to track wildlife robustly and quantitatively.

So we have some opportunities and some problems with cameras. Cameras are relatively easy to set: strap them to a tree and leave them for a few months; it's easy to set 50 cameras per survey. But they create a huge amount of data, hundreds of thousands of images, over 100 gigabytes per survey, and the many surveys in Australia are completely uncoordinated, often targeting a single species. Yet they capture so many species that, if the data were brought together, we could use it as a monitoring tool. We are collecting a lot of data, but we are not publishing as much as we would expect based on this investment in sampling. This basically boils down to: camera traps are easy to use in the field but generate unmanageably large data streams, and those data streams are not being leveraged to produce high-quality science and monitoring.

So who is using cameras? Basically everybody. They're very accessible to researchers at all stages of their careers and are used in a wide variety of biodiversity research, management and monitoring. People using cameras in Australia include NGOs, government, consultancies, academics and Indigenous groups.

Right now, camera data is being boiled down to presence-only records. This is not the richest format for the data, but it is suitable for species distribution models, something Australia really excels at. Species distribution models do have limitations, though: they're essentially really good range maps, telling us how likely a species is to be there. Camera surveys actually include detections and non-detections, and this enables a type of modelling called hierarchical modelling, used to derive occupancy and abundance while accounting for variable detection. What that means is that two people can set cameras, one with baits and one without, and we can account for the fact that species are detected at different rates with different methods. That's extremely powerful for robust monitoring.
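As a rough illustration of that idea, here is the likelihood of a single-season occupancy model with a survey-level bait covariate on detection, fitted by maximum likelihood. The data and the NumPy/SciPy implementation are assumptions for this sketch only; in practice people use established tools such as the R package unmarked.

```python
# Sketch of a single-season occupancy model: sites are occupied with
# probability psi; at occupied sites each survey detects the species
# with probability p, which here depends on whether the camera was
# baited. Data are made up.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # inverse-logit

# Detection histories: rows = sites, columns = repeat surveys (1 = detected).
y = np.array([[1, 0, 1],
              [0, 0, 0],
              [0, 1, 1],
              [0, 0, 0]])
baited = np.array([[1, 0, 1],
                   [0, 0, 0],
                   [1, 1, 0],
                   [1, 0, 1]])  # survey-level covariate (1 = baited)

def neg_log_lik(params):
    psi = expit(params[0])                     # occupancy probability
    p = expit(params[1] + params[2] * baited)  # detection, with bait effect
    # Probability of each site's history given the site is occupied...
    lik_occ = np.prod(np.where(y == 1, p, 1 - p), axis=1)
    detected = y.sum(axis=1) > 0
    # ...marginalised over the latent occupancy state:
    site_lik = psi * lik_occ + (1 - psi) * (~detected)
    return -np.sum(np.log(site_lik))

fit = minimize(neg_log_lik, x0=np.zeros(3), method="Nelder-Mead")
print("psi:", expit(fit.x[0]), "bait effect (logit scale):", fit.x[2])
```

The key move is marginalising over the unknown occupancy state: an all-zero history can mean the site is unoccupied or that the species was simply missed, and the model weighs both possibilities.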
Hierarchical modelling also allows us to look at trends through time and to infer species interactions, both of which I'll show you examples of. I also want to note that while the ALA has focused on presence-only data to date, we are in the process of making it possible to download full survey data through the same website we all know and trust for our biodiversity information. Right now only a third of Australian publications use analyses that account for detectability; that's relatively low, and it could explain our comparatively low publication rates relative to other places.

So what are the things WildObs is going to address? We're going to enable coordination and collaboration in sampling and fieldwork, so that we can be more productive and efficient, save money, and answer our questions more efficiently. We're going to help with image management, meaning the storage and efficient processing of images; this needs to be free and cloud-based, so we can access images anywhere, have many people sorting images, and power AI identifications, which I'll explain in more detail. We also need open access to datasets: right now these are often siloed away at specific institutions or on someone's old dusty hard drive. This is especially important for publicly funded efforts; anything funded by the ARC, the federal government (like DCCEEW) or state governments is required to be open access, but such data is currently rarely open access. And we need training in advanced statistics for impactful research and monitoring, which is something we're going to be really focused on.

In order to store, organise and identify millions of images, we need access to a cloud-based image platform with AI identifications. One example is Wildlife Insights, but there are many, and we are debating right now which one we're going to promote and provide free for all Australians. We also need tagged images in a repository, so that AI researchers here in Australia have what they need to do new AI research and improve our algorithms. We need an aggregation of all the spreadsheet data currently being derived in separate places, separate institutions and separate labs; we need standards for our data, and APIs to share it, and then we need it to be accessible and searchable, such as online via the ALA. Then we need robust analytics to drive key inferences; this requires meticulous data standardisation and preparation, robust statistics, and intuitive results, so that people can really understand what is being derived from their sampling. We want to create automated reports: you provide us data, and we give you back an automated report that includes the best robust analyses possible. We're going to feed those analyses and results into government regulation, like the EPBC Act for endangered species, and into indices like the Threatened Species Index, which I'll talk about in a second. We're also providing training in statistical skills, and we're initiating many more collaborative papers and reports, which really is a carrot to get people involved, because they can get a lot of publications by being part of this group.
As a proof of concept, our lab worked in Asia to bring together 239 camera-trapping studies and published a data paper; that paper has since been used in over 20 publications within the last two years, including one that made the cover of Nature. This indicates that if we can do this here, there's really powerful science to be derived. We really want to contribute to the Threatened Species Index, which shows trends through time for birds, mammals and plants. Right now there's very little camera-trap data in it, for a variety of reasons, and we can solve that: we can make that orange line extremely confident in what it's reflecting, and we can produce an orange line for every single mammal species.

I want to give you one example, which is the Eyes on Recovery project. This involved thousands of cameras set out after the 2019-2020 Black Summer fires, producing massive amounts of data. We were able to analyse the images using Wildlife Insights, with the AI helping to make that efficient, and we then processed all that data in eight weeks, providing robust species-specific results with hierarchical models for 93 species. Today we can do that in one to two weeks; this is what we need to provide timely, management-relevant information.

We can also use these richer datasets to look at species interactions. Here's an example from Tom Bruce in our lab: the green line is dingo activity throughout the day as captured by camera traps; the blue line shows how cats behave in the absence of dingoes; and the red line shows how cats behave when dingoes are present, a strong shift toward nocturnality. These sorts of inferences cannot be gleaned from presence-only data. We're expanding that type of research to look at entire food webs, and if you want to learn more, Tom's actually giving a seminar next week; we'll drop the information for that in the chat.
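Analyses like that dingo-and-cat example estimate each species' diel activity pattern from detection timestamps and summarise how much the two curves overlap. A toy sketch with made-up times, using hourly bins; dedicated tools (such as the R package overlap) use circular kernel densities instead.

```python
# Toy sketch of diel-activity overlap: bin detection times (hours,
# 0-24) into a density for each species, then compute the coefficient
# of overlap, Delta = integral of min(d1, d2). Times are made up.
import numpy as np

dingo_hours = np.array([5.5, 6.2, 7.0, 17.5, 18.3, 19.0, 20.1])
cat_hours   = np.array([0.5, 1.2, 2.0, 3.3, 21.5, 22.4, 23.1])

bins = np.linspace(0, 24, 25)  # hourly bins
d1, _ = np.histogram(dingo_hours, bins=bins, density=True)
d2, _ = np.histogram(cat_hours, bins=bins, density=True)

# Coefficient of overlap: 1 = identical activity, 0 = no overlap.
delta = np.sum(np.minimum(d1, d2)) * (bins[1] - bins[0])
print(f"activity overlap: {delta:.2f}")
```

Comparing this overlap for cats with and without dingoes present is one way to quantify the behavioural shift shown on the slide.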
Working in the lab managing all this data leads to a few easy conclusions: going out in the field and setting cameras is really fun; managing 25 million photos is not fun; but we're there to help. Some of our progress to date: we've already published a review paper with 194 Australian authors representing all major universities, federal and state governments and NGOs. We've also been collating existing data and are already up near 30 million images. We have a well-defined data standardisation process, led by Dr Zach Amir, and we are nearing a choice of platform to provide for all Australians, offering free limitless image storage, easy management, and AI identifications exceeding 95%, with a target of 98%, so that we don't even have to review them; once we get to that accuracy we don't even have to look at our images, we can just trust that the computer is doing a great job. We're also working with DCCEEW to provide a metadata collection app, to be called Monitor; you'll hear more about it in the next year. It's going to let everyone enter data in the exact same format from their smartphone in the field, automatically uploaded to the cloud, so the government can see where you've sampled; it provides everything you need for reporting, and it's linked to the image outcomes later. And finally, we're hosting statistics seminars and providing statistical tools, so if you have data and need help analysing it, we can help you.

I'm going to let you under the hood for a second. First, our pipeline for images: images are sent to us either as SD cards or uploaded to the cloud in a Dropbox. They're moved into what we call a WIMP, a wildlife image management platform; examples include Wildlife Insights, and there's even one being developed in Tasmania by Barry Brook, called MEWC. This platform hosts all your images, lets you see which cameras collected each image and sort by species, and it also puts all unidentified images through something called MegaDetector, which tells you which images contain wildlife versus anything else. If an image has wildlife, it goes to a computer vision classifier, an AI tool to identify species. If the species is identified with over 97% confidence, it goes straight in as data; if it's less than that, it gets verified by a human. All that data then goes into our data standardisation process and our camera database.
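That triage logic (detect wildlife, classify, auto-accept above a confidence threshold, otherwise send to a human) can be summarised as a simple gate. In this sketch, run_detector and run_classifier are hypothetical stand-ins for calls to MegaDetector and the species classifier, not real APIs.

```python
# Sketch of the image-routing gate described above: a detector flags
# whether an image contains wildlife; flagged images go to a species
# classifier; classifications at or above a confidence threshold (97%
# in the talk) become data, the rest go to a human reviewer.
from dataclasses import dataclass

@dataclass
class Classification:
    species: str
    confidence: float

def run_detector(path: str) -> bool:
    """Hypothetical stand-in for MegaDetector: does the image contain an animal?"""
    return True  # stub for illustration

def run_classifier(path: str) -> Classification:
    """Hypothetical stand-in for the species classifier."""
    return Classification("dingo", 0.99)  # stub for illustration

def route_image(image_path: str, threshold: float = 0.97) -> str:
    if not run_detector(image_path):
        return "discard: no wildlife"
    result = run_classifier(image_path)
    if result.confidence >= threshold:
        return f"accept as data: {result.species}"
    return f"queue for human review: {result.species}?"

print(route_image("IMG_0001.jpg"))  # accept as data: dingo
```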
Okay, now I'm going to shift up to show you some more. If you already have data, if your institution already has data that's been sorted, or you have your own image management platform, we can pull that directly, either from your platform or through the ALA, which has an API, and that also goes into our data standardisation process and database. We're also pulling directly from biodiversity aggregators: Queensland's WildNet, New South Wales' BioNet, and the federal government's Biodiversity Data Repository (BDR). This is where a lot of the data already exists, but it's currently siloed in these places, and right now we cannot pull full detection histories, only presence-only records, so we're changing all that. We're providing external access through the ALA, and this is what's going to enable wildlife biologists and government officials to monitor biodiversity easily. We're also building a tagged image repository, where we keep all these images collated in a nice format so they can be used directly by AI researchers, so we can be at the forefront of developing these tools. With that, I'd like to thank the many people listed on this slide, and thank the funders. To learn more about work we've already done, for example in Asia, see my website, ecologicalcascades.com, and if you want to learn more about WildObs, the website is tern.org.au/wildobs. Thank you.

Thanks Matthew, fascinating stuff, and I see that other people have problems with too much data at times as well; it's something that hits us a lot at the ALA. So, Paul, you're our last speaker today; would you mind sharing your screen? That looks great, thank you.

Excellent, okay. Thanks Martin, and thanks to the ALA for inviting me to give a talk. It's great to speak after Christine and Matt, because a lot of what they've already talked about in terms of motivations is exactly why I'm excited about ecoacoustics. My name is Paul Roe; I'm an academic at Queensland University of Technology, and for the past 10 years I've been on this journey into ecoacoustics. What I want to do today, briefly, is talk about a couple of projects I'm involved with. I'd like to start by acknowledging the traditional owners of the land where I now am, where QUT is: the Turrbal and Yuggara people, the First Nations owners and traditional custodians of the land, which was never ceded. I'd like to pay my respects to the Elders, laws, customs and creation spirits, and recognise that these have always been places of teaching, research and science, and that Aboriginal and Torres Strait Islander People play an incredible role in all of our communities.

Okay, I'd like to start by talking a little about ecoacoustics and what it is. Ecoacoustics is a new way to understand our world through continuous sound recording. It provides a scalable way of monitoring biodiversity and a direct environmental record. It's a little like remote sensing, drones and satellite imagery, which collect images of the continent and its vegetation; from those images we can deduce what's happening to a forest, look at flowering events and understand the vegetation. But up until now the problem has been that we haven't had scalable methods for understanding faunal biodiversity, and that's why we're so interested in things like camera traps, ecoacoustics and iDNA. Ecoacoustics is a rather young discipline and there are a number of unique issues; Matt alluded to some of them, and some of the issues we face in ecoacoustics are the same as with camera traps: we've got a lot of data, it's collected all over Australia, we need to bring it together, and it's quite difficult to analyse. There's been some quite rapid progress over the last 10 years, with Australia taking a leading role, and I want to talk about a couple of those projects I've been involved with. Before I do, though, let me just play you some sound, collected on a property near Brisbane; this was a beautiful morning chorus. [birdsong]

One of the great things about working with sound is that you get to listen to these beautiful choruses, to beautiful birds, and not just birds but also frogs and some mammals vocalising like that. Of course, the issue is how we make sense of it all. One of the projects I've helped create is the Australian Acoustic Observatory, a collaboration with colleagues from other universities around the country. What we set out to do was realise that vision of continental-scale biodiversity monitoring: to understand how we can monitor faunal biodiversity at scale, how we can actually understand what's happening. We were successful in getting funding through the Australian Research Council, and we set up a large network of recorders. One of the things about ecoacoustics is that we regard it as very much a data-first approach to science; in that sense it's a little like astronomy or bioinformatics, where you collect data first and mine it to understand what's happening and to find questions. Hence the name Observatory, because we're observing, or listening, if you like, to sounds across Australia. We're also very much driven by an open-science methodology: we believe that the data, having been funded through an Australian Government research grant, should be made freely available, just like the observations in the ALA.
We're using the data to support ecology, computer science and citizen science, but there are also creative uses: as you can imagine, with such beautiful recordings, a lot of artists and other people are interested in using the data. We started deploying recorders in 2019, which was of course a great time to start, so we had a few delays due to bushfires and COVID, but we now have over 365 sensors deployed all around Australia. We're continuously recording sound; there's no sampling involved. Each recorder is a little solar-powered box that records to a set of SD cards, and we're very much leveraging the goodwill of landowners to collect the SD cards, dust off the solar panels, and send the cards to us. So we have this huge amount of data rolling in, and the difficulty, of course, as Matt has already said, is how to manage it all. We obviously want to get the data into the Atlas, but how do we get it there from remote areas like the Sturt Desert, from a bunch of SD cards sent to us? We need to get it into a database where it can be analysed, and the observations then put into the Atlas, so we can generate those environmental insights. To do this, along with some of my colleagues, we created an open-source platform, supported through the Australian Research Data Commons, to essentially complete that pipeline, that workflow for processing the data and getting it into the ALA. It's been very important for us to work with communities: we are co-designing the tools and systems we're producing, and all of them are open source for everyone to use. So it's very much a project to produce a platform but also to bring the community along with us, because it's a young discipline, so that we can understand how best to use the data, how people want to use the data, and the tools they need, to deliver on this vision of being able to understand what's happening to our beautiful country.

Once we've got all this data, there's still the issue of how to find calls; somehow we need to search through all the data. So we thought: who are we going to call when we need to do a search? We've got a project with Google where we can undertake a kind of search: if you've got an example call that you want to find, we can scan through all of the data, through hundreds of gigabytes, to find those particular calls. If you want, take a photo with your phone, or grab that image, or follow the link from the QR code, and you can give the similarity search a go; this is all driven by Google AI.
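Similarity search of this kind typically embeds each audio window as a fixed-length vector and ranks windows by similarity to the query call. This sketch uses a crude spectral average as the embedding, purely as a stand-in for the learned embeddings a production system like the Google-backed search would use.

```python
# Sketch of call similarity search: embed each audio window as a
# fixed-length vector, then rank windows by cosine similarity to a
# query call. The "embedding" here is just a mean magnitude spectrum,
# for illustration only; real systems use learned embeddings.
import numpy as np

def embed(window: np.ndarray, n_fft: int = 256) -> np.ndarray:
    """Toy embedding: average magnitude spectrum of the window."""
    frames = window[: len(window) // n_fft * n_fft].reshape(-1, n_fft)
    return np.abs(np.fft.rfft(frames, axis=1)).mean(axis=0)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
recording = rng.normal(size=48_000)      # stand-in for a stretch of audio
query = recording[10_000:12_000].copy()  # pretend this is the example call

windows = [recording[i:i + 2_000] for i in range(0, 46_000, 1_000)]
scores = [cosine(embed(query), embed(w)) for w in windows]
best = int(np.argmax(scores))
print(f"best match at sample {best * 1_000} (score {scores[best]:.2f})")
```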
Just to wrap up: we're collecting data with the Australian Acoustic Observatory from all over Australia, and we've got other specific projects that target threatened species or other monitoring regimes that various NGOs and governments are running. We've got literally thousands of users, over 200 projects, over half a petabyte of data now, almost 500 years of continuous audio, which is obviously too much to listen to, which is why we need the Open Ecoacoustics platform to enable us to store and analyse the data. With such big data you also get a lot of errors, so we've built some very sophisticated tools to fix them. But as much as this is about data, platforms and things, it's also about people, so I'd like to thank all of my colleagues and collaborators who have made this happen, and I'd like to thank the ALA for inviting me to give this talk. Thank you.

Lovely, thanks for that Paul. I'm slightly unsure where to go; we now have about 20 minutes, which is a wonderful amount of time for a discussion, so thank you all for your presentations. People have been queuing up questions in the chat, and I'll try to get to what I can and summarise what I can't. So, to get started: it seems that in each of those three talks you've benefited from some pretty massive leaps in technology in the last few years. Matt, you talked about processing things with machine learning and AI, and Paul the same; and the eDNA boom has been well documented as well, in terms of the technologies for genomics. Is everything else keeping up? Now that you're generating all this data, do we need new developments in our statistics to manage it, or have we really just been waiting for it? Sorry, an open question there, if anyone wants to jump in.

I think this could be a good opportunity for me to jump in, because the statistics for wildlife ecology have largely been developed for cameras, which came on the scene a little earlier than acoustics and eDNA, but they can be used for those methods too. For example, I mentioned these hierarchical occupancy and abundance models: those take detection histories, so we need detection versus non-detection in each period of time at each sensor, and that can be created through multiple surveys of eDNA or multiple surveys of acoustic data. The statistics are there; they're a bit complicated, but there are packages being developed in R and other tools that make them accessible, and one of the things we're doing is producing automated reports that include all those complicated analyses, which we run on our supercomputers here at UQ and QCIF, so that users who don't have a degree in statistics can still have access to robust outcomes from their data.
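The detection histories Matt mentions are just site-by-occasion matrices of ones and zeros, and any stream of timestamped detections (camera, acoustic or repeated eDNA samples) can be collapsed into that form. A minimal sketch with made-up records:

```python
# Sketch of building a detection history: collapse timestamped
# detections into a site x occasion matrix of 1/0, the input format
# for occupancy models. Sites, records and occasion length are made up.
from datetime import date

SITES = ["site1", "site2", "site3"]
START, OCCASION_DAYS, N_OCCASIONS = date(2024, 6, 1), 7, 4

detections = [  # (site, date) records for one species
    ("site1", date(2024, 6, 3)),
    ("site1", date(2024, 6, 20)),
    ("site3", date(2024, 6, 10)),
]

history = {s: [0] * N_OCCASIONS for s in SITES}
for site, day in detections:
    occ = (day - START).days // OCCASION_DAYS  # which weekly occasion?
    if 0 <= occ < N_OCCASIONS:
        history[site][occ] = 1

for site, row in history.items():
    print(site, row)
# site1 [1, 0, 1, 0]; site2 [0, 0, 0, 0]; site3 [0, 1, 0, 0]
```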
Thanks Matt. Paul, do you want to add to that at all?

Well, I'll probably defer to Matt, I'm not a statistician, but certainly with the new developments in hardware and software, the AI developments, which are both the compute and the techniques to implement AI, those deep learning networks, and the decreasing price of storage, we've been able to collect and analyse more data. As we get more data, we're potentially going to find out and understand new things, but it is going to require, I think, new kinds of statistics, or certainly new kinds of models. Things like the occupancy modelling and hierarchical models that Matt talked about have perhaps been around for a little while, but we haven't really been able to use them because we just haven't had the data. The great thing about putting out iDNA traps or camera traps or ecoacoustic recorders is that you're not reliant on a person being out there in the field making all of those observations, and so it's going to enable a whole new kind of science. That's obviously really important for conservation and understanding our environment, as well as for producing food and all the other ecosystem services we rely upon.

Yeah, as someone who did a PhD in a long-term monitoring lab where every observation was someone standing in the field, it's amazing to see the wealth of data coming through. Christine, if I could pick you up on this question of dealing with data: you demonstrated a really stark example of a long-standing problem in ecology, which is that different methods detect different things. How do you see this playing out in future? Presumably there's a lot of testing needed to compare these methods?

Yeah, I definitely think there's more work required to really understand the biases, especially those associated with iDNA as a newer approach; understanding that, and using it in a decision-making framework when planning biodiversity monitoring, working out what focal groups you want to focus on and how to go about that. There are obviously still a lot more questions than answers at the moment, especially with iDNA, but it's definitely showing great promise.

No, it's fascinating, and it's really good to see that work being done as well. There was a question...

Oh, sorry Martin, I want to bring up a frontier, which is that because these hierarchical models allow you to include a detection covariate for each survey, they also allow you to merge eDNA, acoustics and cameras, with each of them having a different detection probability. For example, Christine mentioned the vastly different detection of mammals, reptiles and birds using these methods: cameras excel at vertebrates on the ground, predominantly mammals and ground-dwelling birds; acoustics excels at birds particularly, and insects; and eDNA covers a lot of the most cryptic species that are hard to see on a camera and don't make a lot of noise. Putting them together in a single model, especially these multi-species, multi-site models, allows you to learn about the entire community in ways we've never been able to before. This is the frontier; there are models already available. It's not easy, but it's exciting, and it is the future.

And presumably the scale you've got, the amount of data going into these models, and the degree of standardisation help with that, I'd imagine?

Absolutely. I think the most important innovation in this space is the ability to include species-specific and detector-specific covariates, whether eDNA-, acoustics- or camera-specific, for each of these sampling approaches, and that allows you to bring them all together under a single modelling framework.
No, it's fascinating to see; it'll be good to see how it flows into things like the TSX, for example, which you mentioned, Matt. Look, I'm going to try to get through some of the questions that have come in from people online. One of them, sorry to come back to you so soon, Christine, was asking about co-benefits: for example, whether it's possible to detect pathogens from mosquitoes simultaneously with wildlife. Is that something on the horizon?

That is definitely possible. You can do DNA and RNA co-extractions if you want to look at that as well. In our case we were just focusing on the DNA, but it's definitely possible to also do an RNA extraction to isolate any mosquito-borne pathogens.

I dread to think of the amount of data you'd get from something like that, but I imagine it would answer some very exciting questions. There was a similar one for you, Paul, about whether ecoacoustics is feasible in the marine environment; I wondered if you have any comments on that.

Oh yes, there's a long history of using acoustics for cetaceans and the like. I was also going to jump back to the previous question, if I may, and say that there are other things you can use acoustics for, because it does detect other information: potentially you can get information about wind and rain, and you can also detect human disturbance, gunshots and the like, or motor vehicles, noise pollution, which of course may itself be affecting the environment. So you can do a lot with these methods and detect more than just a single target species; you really can get a handle on what's happening to the whole environment.

And how far along are we on that journey? There was a specific question in the chat about mimics, birds that copy other birds; presumably we're still at a point where some of that is challenging for these models. Is that fair to say?

That's a great question. We have done some work with mimics, particularly with Albert's lyrebirds, and generally I think we're at the stage where, if a human can perceive the difference, then usually, provided we've got enough test and training data, we can train an AI model to distinguish the two. We certainly had success with Albert's lyrebird, but obviously there are limits to what you can do, and sometimes, as Christine was saying, it's important to understand the bias of the technique. All techniques are biased, even a human observer in the field. Sometimes we can't distinguish particular species; for example, we sometimes can't distinguish between different raptors that make the same kinds of noises. Other times we may actually be able to distinguish individuals: with some birds, and some koalas and the like, we can potentially get a handle even on the individual that's calling.
That's an interesting point, actually, because we've had a couple of questions in the chat about comparisons against citizen science data. I'll pick on birds in particular, just because we get a lot of data on birds and I see it a lot; many of the people submitting those observations are extremely expert, to the extent that they could be doing it professionally but don't. Do you have a sense of the extent to which automated methods, which obviously scale much better and are in a sense cheaper per unit, are comparable to expert observers in the field?

I think they're pretty good. In a sense, the advantage of a lot of these methods is that they can be deployed by non-experts, and you can then have experts review the data, if you keep the data. Even if you've got an expert birder in the field, if you hear a weird sound that's not a bird, say a frog, and you don't have a herpetologist with you, what do you do? But if you've got the recording, you can take the data to the expert. So keeping the data is partly about the scalability these techniques provide, but it's also about having a record that can be audited and assessed, and used for things like green accounting schemes; and if it's a call that not many people know, you can take it to an expert. I see the AI as being an assistant, really; I don't think it's ever going to replace people.

An entirely fair point. Matt, if I could come back to you on this?

On what Paul said about the AI being an assistant: I agree with that, and traditionally that's how we've used it. But for cameras, for a majority of species, though not every species, the AI is significantly better than a human, and its accuracy is much better. So we're getting to a point where we never have to see the images, and we can have a workflow where, once we have, let's say, universal access to a signal, the cameras upload images, the AI processes them, and the data flows into the spreadsheet so we can analyse it, completely removing the human from the chain. That will allow scalability on a whole other level. I suspect that will happen for many species of birds in the future, maybe not all. I'm curious what Paul thinks on that one.

Well, I guess some bird calls evolve over time, so it's not a static system that you can completely analyse. I think AI is going to greatly help with the massive amounts of data we've got, but I think there's always going to be a role for a certain amount of validation and checking. I think we'll also need to look at other techniques: finding bird calls in a long recording is a very reductionist approach to understanding the sound, and all you find out is what the recognisers have been trained to find. It's also useful to have unsupervised approaches where you try to characterise the overall soundscape. Crudely, if a microphone gets eaten or damaged, as Christine knows all too well from some of the problems she's had, the recognisers won't tell you that.
They'll simply say they haven't found anything; they won't tell you that there's actually no sound data there.

Yeah, we've all had that happen. I had a bushfire go through some of my sites when I was trying to record things, and that was in the tape days, but let's not go back to that. While we're talking about this sort of scale: Christine, you've mentioned eDNA and iDNA, which are in some ways newer technologies but also becoming quite established. Do you think they have the potential to scale to the same sort of levels we're seeing in ecoacoustics and camera-trap data?

Definitely. Especially with the uptake of things like citizen science projects employing eDNA, we definitely have the ability to extend its use. It's getting more accessible and increasing in scalability, especially, as I was saying earlier, with more commercial labs offering eDNA services and that workflow becoming more streamlined.

And you'd think that a body of users in the bioinformatics community who are used to large data already would be a benefit straight off the bat?

Definitely. With supercomputing and increasing bioinformatic capability, it's going to be easier to process that data as well. There are people working on ways to take that data and have it come out in a format that's accessible to government agencies, or to people who might not be familiar with what an OTU table or anything like that looks like. So there are definitely things in the pipeline to make it a data source that's more easily understood by general users. Definitely some challenges there, though, and I'm sure some opportunities as well.

Look, we're coming close to time, but I'll try to sneak in one sneaky question, and I'll direct it to you first, Matt, if that's all right. Some time ago, a year or two perhaps, the Australian Academy of Science put out a report suggesting that Australia needs a Bureau of Biodiversity in the same way we have a Bureau of Meteorology, and some of the things the presenters have put forward today look like steps towards something like that. Do you feel that in future there's the potential for an integrated monitoring system for biodiversity in Australia that could be productionised in that way?

Yeah, absolutely; that's in the cards. There's a variety of initiatives around biodiversity playing out right now, so the landscape's evolving really quickly. One thing that has been great, at least for Paul and me, is that the Australian Research Data Commons (ARDC) has a new arm called ARDC Planet, devoted to large-scale data commons for biodiversity and environmental data. They've supported both me and Paul, and they're looking at expanding that to drones and potentially eDNA in the future. So the ball is rolling, but the path it will take is unclear at this stage.

That makes sense, and obviously we're speculating at this point; with the technology moving so fast it's always hard to be sure, but it is interesting to think that these things would have been impossible a decade ago, for example.
Lovely. Okay, with that in mind, thanks everyone for your time. I'll remind everyone watching along at home that we have a poll; if you want to tell us something about yourselves, we're always glad to hear it, so please click on that on your screen if you'd like. With that, thanks so much to all our speakers today for those insights. This will be available on YouTube in due course. Enjoy the rest of your day, and we'll see you at the next webinar. Thanks so much.
