THD 83: IDUN Brain-Sensing Earbuds that Measure Neural Activity (EEG)


Welcome back to another episode of the THD podcast. Today we have a company joining us from Switzerland called IDUN, and they're doing electroencephalogram technology, or EEG, and yes, I did practice saying that word earlier. So we're going to find out about this technology; basically, they're embedding another sensor into earbuds, and that's why we're going to talk about it. But without delay, let's not forget about our sponsor, the ALTI association. They're going to be at the High End show in Munich, and they'll also be holding another event just before InfoComm in June in Orlando. We'll put the information right here on the screen so you can go check it out on your own.

Without further delay, let's say hello. Simon's joining us from Japan with some nice pink headphones. Yep, pretty in pink. And Mark Melnykowycz, how are you doing? Wonderful, thank you very much, great to be here.

Okay, so you're lead product designer at IDUN, and you're doing EEG technology to put largely into earbuds. Is that the core focus, or are we going into other spaces as well?

Exactly, we're really looking at how we can take earbuds, in-ear devices, to the next level of understanding humans, and that means understanding the brain. We're at a very interesting point where hearables, or earbud devices as you say, are really transitioning from being just devices that deliver sound and media to eventually being things that are actually driving applications, with the rise of multi-core processor technology and the ear OS or CosmOS systems from companies like Sonical. We want to be the layer, essentially the glue, that helps understand human intention and needs, and also how the brain is responding to different media. It's a sensor for a type of signal, brain activity, that we really can't measure in any other way except by going to the source. And it's a beautiful complementary technology to the integration of different audio features, AI to enhance audio experiences or the sound landscape around the person, plus motion, so for example having a lot of the functions that you currently have in your smartwatch but actually putting them into the earbud.

Yeah, okay, so maybe to illustrate this we might walk people through a presentation. Good, thank you. IDUN Technologies is really a company that is creating a new brain analytics platform for hearables. Now, when we talk about hearables we don't usually think about sleep, but sleep is one of the most critical problems we have in our daily lives and throughout the world, and understanding sleep as a metric, and as part of being a person, is one of the things we're currently applying this technology to. That sounds a bit strange when you think about hearables, but when we talk about sleep, we're talking about the evolution of where earbuds are going in the future, and it's really interesting to look at the industry. Currently, if you're looking at companies like LG: if you were at CES this year you hopefully saw the LG Breeze earbuds. These are earbuds that are also measuring EEG, or electroencephalography, from the ear; their electrodes actually sit a little bit outside the ear canal.
You also see great products coming from companies like Bose, where musical intervention is used to actually help people's sleep journey. What's missing is really understanding the feedback loop between what's happening in a person's brain, what they really need, and then how we can change the acoustic environment for them. And you can see these are very small, highly packaged devices, and that's a big challenge we have at IDUN: how can we bring complex neurotechnology into the earbud form factor?

Now, for sleep, many people make the analogy to wearables. Things like smartwatches do track sleep, but they track it in a way that's not as accurate as what we get in a sleep laboratory. An Apple Watch or a Pixel Watch will have things like a heart rate sensor and a motion sensor, which give you an indication of where you are in the sleep cycle, but they're not really getting the information directly from your brain, and that's why this is super important for the future.

Traditionally, if you want to measure brain activity, you have to put electrodes on the top of your head, preferably glue them on. There is dry electrode technology to make it easier, but it's still a rather cumbersome device; it's not a device you're actually going to use day to day. At IDUN we're really focused on creating this concept of the everyday neurotechnology device, the everyday brain-computer interface. So we actually started from the electrode material for the earbud and then used that as a platform to measure brain activity, and that allows us to make things that are very integrated into the consumer lifestyle, like earbuds. We have also already done validation of our technology: if you compare full-scalp technology to our in-ear system, you're able to measure the brain signal, the brain activity, in a very similar way.

If you think about how your brain actually changes during sleep, you have things like changes in the way the signal is oscillating, you'll have bursts of alpha activity, you'll have the changes of deep sleep, and that's stuff you can't really measure when you look at just the macro architecture of sleep that you get from a smartwatch. We also see that this can be connected up with how people are hearing or experiencing the world. Part of what we do at IDUN is building that technology, but also building the understanding of how to interpret that signal from the brain, so being able to extract signal patterns for things like the K-complex or the sleep spindle. It's a very simple but ultimately very powerful algorithm, because if we can do this throughout a night, and also throughout the day, we can really start to understand how you're reacting to different situations and then bring interventions, such as acoustics, to help you focus better, relax better, and sleep better. We're really looking at bringing this layer to earbud technologies.
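To make that pattern-extraction idea a bit more concrete for readers, here is a minimal sketch of a spindle-style burst detector, assuming a single-channel recording in microvolts sampled at 250 Hz (the rate Mark quotes later in the episode) and using NumPy/SciPy. Sleep spindles are brief 11-16 Hz bursts, so the sketch band-passes that range and flags stretches where the amplitude envelope sits well above its median; it is only an illustration of the general approach, not IDUN's algorithm.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

FS = 250  # Hz; sampling rate quoted later in the episode

def bandpass(x, lo, hi, fs=FS, order=4):
    sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def detect_spindles(eeg_uv, fs=FS, thresh=3.0, min_dur_s=0.5):
    """Flag spindle-like bursts: 11-16 Hz activity whose amplitude envelope
    stays above `thresh` times its median for at least `min_dur_s` seconds."""
    sigma = bandpass(np.asarray(eeg_uv, dtype=float), 11, 16, fs)
    envelope = np.abs(hilbert(sigma))
    above = envelope > thresh * np.median(envelope)

    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i                                   # burst begins
        elif not flag and start is not None:
            if (i - start) / fs >= min_dur_s:
                events.append((start / fs, i / fs))     # (onset_s, offset_s)
            start = None
    return events
```

A K-complex detector would instead look for isolated large, slow deflections, but the pattern-matching idea is the same.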
And just for the layperson, the sleep spindles and K-complexes, what are those? I mean, I've heard of REM and some of the others.

So you can imagine that when you're awake, your brain is at a certain higher state. When you close your eyes, it starts to go into what we call the alpha phase, so you're going from, let's say, 30 or 40 Hertz down to about 10 Hertz of brain activity, and that's really what we measure: the electrical biopotential at the ear, or on the surface of the scalp. Then as you start to fall into actual sleep, your brain waves shift another little bit, from about 10 Hertz down to about 8 or 6 Hertz, and as you start to go into dreaming states you get concepts such as REM, the rapid eye movement that happens when you're dreaming. These are all markers that we use in the laboratory to understand where you are in the sleep cycle. Usually you have to sit there and look at this on a computer screen, and a technician will sleep-stage an entire night. We're really looking at ways to take algorithms and make this automatic and easy to use, so that this complex history of how your brain is changing during the night can be easily interpreted through a mobile device and then used to build interventions, like finding the perfect type of music for you to fall asleep to. Right, so if I like Rammstein and you like... what is your favorite music, what do you use to fall asleep? Deep house, or white noise, usually. Yeah, so different people will fall asleep better in different ways, they'll have their own optimal sleep journey, and we want to help design that and understand it.

Sorry Mark, you may be going to come to this, but what we're looking at is an electrical signal. Where does that actually come from? You mentioned biopotential, I think. So you can imagine that in your brain you have neurons, and a single neuron is a very small thing you're not really going to see, but when they all get together and start firing, they create an electrical potential, and this electrical potential can be sensed on the surface of the scalp. In this way we can basically get an indication and an understanding of what's happening inside your brain.

Okay, so it's just curious that you take this electrical brain activity down to essentially a single, one-dimensional potential, a voltage ultimately? Yeah, exactly. You can have many electrodes on your scalp, some people use up to, I think, 1,024, so you can have a very high density and learn a lot about the brain, but then it's something that is never going to get outside the laboratory. What we're doing is bringing that down to just an ear tip, an ear-tip electrode, to measure things throughout the day, so you can think about it as a smartwatch for your brain. Okay, thanks. My pleasure.

And you can imagine that if we can understand a bit of what's happening in the brain over a long time period, going from days to weeks to months to years to decades, we can actually start to see how your needs are changing throughout your life, and we can understand how your cognition and your focus are changing. One of the higher-purpose goals is to see if we can connect this up with disorders that are happening in the brain, ultimately dementia and Alzheimer's. As you know, music and dementia treatment are starting to be paired together, and the idea of music as medicine is coming to the forefront, so one idea is that we can connect what's happening in the brain with the musical intervention coming through the earbuds, and then we can create essentially a new way for people to live their lives as they're aging.
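Going back to the automated sleep staging Mark describes, the simplest possible version of "reading the frequency shifts" is a per-epoch band-power heuristic like the toy sketch below, again assuming 250 Hz single-channel data. Real sleep stagers (and certainly any commercial one) use far more features and context than this; it is only meant to show the shape of the idea.

```python
import numpy as np
from scipy.signal import welch

FS = 250  # Hz

def band_power(epoch, lo, hi, fs=FS):
    """Power in the [lo, hi) Hz band of one epoch, via Welch's method."""
    freqs, psd = welch(epoch, fs=fs, nperseg=2 * fs)
    return psd[(freqs >= lo) & (freqs < hi)].sum()

def rough_stage(epoch_30s, fs=FS):
    """Crude wake / drowsy / deep guess for one 30-second epoch based only on
    relative band power; thresholds are illustrative, not clinically validated."""
    delta = band_power(epoch_30s, 0.5, 4, fs)   # slow waves dominate deep sleep
    alpha = band_power(epoch_30s, 8, 12, fs)    # relaxed, eyes-closed wakefulness
    beta = band_power(epoch_30s, 15, 30, fs)    # active wakefulness
    total = delta + alpha + beta + 1e-12
    if delta / total > 0.6:
        return "deep sleep (N3-like)"
    if alpha / total > 0.4:
        return "drowsy / eyes closed"
    return "wake or light sleep"
```

Running that over every 30-second epoch of a night gives a very coarse hypnogram; spindle and K-complex detection, REM scoring, and artifact handling are what separate this toy from a usable stager.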
When we think about our product, we first want to focus on sleep, because there's a big need in the market, but there are so many ways we can expand beyond sleep into daytime activities as well: things like measuring focus, as well as measuring interactions like the direction of your gaze, things we can actually extract just from the ear canal.

So we are building a platform in addition to our earbud technologies. We just released our Guardian earbuds onto the market; it's a B2B product, so companies and enterprises can use this technology, understand how to integrate it into their businesses and into their own reference designs. We're really interested in connecting with more companies in the hearing and earbud space to see how we can build and integrate these reference designs eventually into consumer products on the market.

Beyond just a physical device, we have our brain analytics platform, so we have the ability to stream brain data through a mobile device or a computer onto our cloud platform. This can be done in a variety of ways: we can go directly through our web page, using Web Bluetooth to connect to the earbuds, and we also have a Python SDK, so companies can create more advanced scenarios, like playing a type of music or showing a type of media, understanding what happens in the brain during that time, and then seeing how we could do things like personalization of music algorithms. Our application layer within the cloud platform allows us to learn different neuromarkers, deploy different neuromarkers, and create a really seamless path from collecting brain activity data to building all these new use cases. The challenge is then bringing that down to an earbud form factor, and this is where I think all these cool technologies coming out, like multi-core processors, are hopefully going to bring a paradigm shift in what an earbud is, going from what people consider to be a relatively passive device to one which is really integrated into their lifestyles and is also replacing some functions of their smartphone and other technologies.

With this platform, going beyond sleep, we're looking at how we can do things like acoustic attention with audio, and we're thinking about gaming: how can we do more hands-free game interaction, or have games that actually adapt themselves to our mental states? People get bored of games when they don't have the right balance between the frustration of solving a hard problem and the flow of the game environment. In the future, and this is the cool thing with generative media, we'll have generative AI creating environments for users in the metaverse, and also acoustic media being generated directly for users, because that's the next logical step: we've seen it happen with visual media like still images, then video, and now the ability to adapt or actually create new musical pieces is coming too. If we can combine that with our neurotechnology at the ear, it's going to bring some really interesting, very seamless, autonomous new experiences.

Also interesting is that when we read brain data, one part is understanding brain state, but we can also understand quite a bit about what's happening with the eye and the face. So we're starting to develop these unique classifiers where you can actually do gaze detection and monitor how the eyes are moving, in addition to measuring the brain activity.
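Eye movements show up in ear-EEG mainly as large, slow deflections, so a first-pass detector for this kind of classifier can be sketched very simply. The example below is a toy version of that idea only: the 40 µV jump threshold is arbitrary, the sign-to-direction mapping depends entirely on electrode placement, and it is not the classifier IDUN describes.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250  # Hz

def detect_eye_movements(ear_eeg_uv, fs=FS, jump_uv=40.0):
    """Toy saccade detector: flag places where the low-passed (sub-5 Hz) signal
    jumps by more than `jump_uv` microvolts within roughly 100 ms."""
    sos = butter(4, 5, btype="lowpass", fs=fs, output="sos")
    slow = sosfiltfilt(sos, np.asarray(ear_eeg_uv, dtype=float))

    step = int(0.1 * fs)                      # ~100 ms comparison window
    jump = slow[step:] - slow[:-step]
    candidates = np.flatnonzero(np.abs(jump) > jump_uv)

    events, last = [], -step
    for i in candidates:
        if i - last >= step:                  # keep one event per deflection
            events.append((i / fs, "right" if jump[i] > 0 else "left"))
            last = i
    return events
```

Hooking an output like this up to spatial audio, so the rendered sound field follows where you look, is the kind of UI idea the next answer gets into.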
So you can imagine that when we start to think about integrating more UI or UX into the earbud, the ability to change the direction in which your 3D audio is being delivered is one thing we can start to hook up with our in-ear technologies.

We were recently at Mobile World Congress as a guest of Oppo at their innovation event, where we could show this technology to people coming from all over the world, and it was really cool and really engaging to see how people reacted to it. It's still a relatively large device that we're selling; this is really to prove the business case and the application, and we're really looking for partners to help us take our reference design and package it into a smaller system, because integrating the ability to stream brain data with TWS technologies is something that takes time to develop, and it's something where we really want to be closer to the device manufacturers to realize these new products.

A lot of our core technology actually goes back to materials. One of our key USPs is the ability to measure this electrical brain activity from inside the ear canal, and we do this by developing our own conductive ear-tip materials and then building our product around that. Things like tuning the acoustics of the design, tuning the way sound is transmitted through the sound channel in our earbuds, are also some of the interesting challenges we have. People always ask, ah, but is there an interaction between the ear and the sound, is there a problem with measuring brain waves at the same time as delivering acoustics? And the answer is no, we have really good performance. So we've basically proved that we can measure the brain in a mobile way, in an earbud form factor, and now we want to take it to the next step, connect with more people in the industry, and see how we're going to build headphone 3.0, as people call it, the next evolution of earbuds and hearing.

This is a great segue into our roadmap as a company. We started as a biosensor company; we then proved that we could measure in-ear EEG; with our new device we're connecting with all the business cases and POC projects, in consumer and B2B use cases; and then we have a concept for our Guardian Audio, IDUN Audio, where everything is packaged in a very small device, first based on TWS technology. But we're also interested to see what happens with LE Audio and what will really make the most sense in the future for combining easy-to-develop, scalable audio platforms with highly scalable brain analytics. We want to make the brain analytics easy to integrate into the audio platforms, so we can get to the stage where instead of buying a TWS chip you're essentially buying a TWS/EEG chip, where we have our amplifier combined with our reference design for the ear-tip and earpiece electrode system. That's the grand vision for the company, and we're very open to having conversations about the future, getting feedback, and seeing how we can all build this future of earbuds together.
So what I'm seeing is that there are a couple of components to it. There's the sensor part, and it sounds to me like there are two elements to that: one would be the ear tip itself, where you're getting physical contact, and then there must be something like an amplifier, or some handling of that input. Then the second part would be how you process that data; we've been looking at waveforms, but you're obviously doing something more in terms of analyzing those waveforms to create results, essentially. And then another part of the package is what you do with that data, how you interpret it, which would generally be done outside of the earbud. Is that fair to say?

Yeah, it's a really interesting question, because right now we stream all of our brain data to the internet and do everything on our cloud platform for extracting features, and then we have an API, and that API can talk to your Spotify playlist, for example. I'm also seeing a future where there's a hybrid solution: we have an edge processing device, which could be one of the new multi-core processors coming out, providing the ability to modulate audio at the same time as taking in and processing the brain signals, and maybe 40 to 50 percent of the processing is happening on the earbud. Then we still have that longitudinal view of what's happening in your life, how your music preferences are changing with your brain activity, how your sleep is changing, and that stays more in the cloud, so it's personalized and of course transferable across all the different digital services through API integration. Yep, nice.
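The hybrid split described there, small features computed at the edge with summaries and the longitudinal view kept in the cloud, might look roughly like the sketch below. The endpoint URL, the `read_block` callable, and the beta/alpha "focus" proxy are all assumptions made up for illustration; this is not IDUN's SDK or API.

```python
import json
import time
from collections import deque

import numpy as np
import requests  # any HTTP client would do

FS = 250                                                   # Hz
CLOUD_URL = "https://cloud.example.com/v1/brain-features"  # placeholder endpoint

def focus_index(block):
    """Crude 'focus' proxy: ratio of beta (15-30 Hz) to alpha (8-12 Hz) power."""
    freqs = np.fft.rfftfreq(len(block), d=1.0 / FS)
    psd = np.abs(np.fft.rfft(block)) ** 2
    beta = psd[(freqs >= 15) & (freqs < 30)].sum()
    alpha = psd[(freqs >= 8) & (freqs < 12)].sum() + 1e-12
    return float(beta / alpha)

def stream_features(read_block, api_key):
    """Edge side of the split: compute small features locally and ship only the
    summaries upstream. `read_block` is any callable returning the next couple
    of seconds of EEG samples (from BLE, a vendor SDK, or a file while testing)."""
    recent = deque(maxlen=30)                  # about a minute of 2-second summaries
    while True:
        recent.append(focus_index(read_block()))
        payload = {"ts": time.time(),
                   "focus": recent[-1],
                   "focus_1min_avg": float(np.mean(recent))}
        requests.post(CLOUD_URL, data=json.dumps(payload),
                      headers={"Authorization": f"Bearer {api_key}",
                               "Content-Type": "application/json"},
                      timeout=5)
```

Only the derived numbers leave the device in this layout, which also connects to the point about restricting raw-data access in the privacy discussion that follows.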
Okay, and just on the legal aspect: is there any issue with collecting those brain waves, as there is with collecting other data? Because I can imagine your partners are potentially going to be people like Spotify, who curate the playlists, or Amazon Music, and they're going to be very interested to know what makes Mark happy and what makes Simon mad. So that's an interesting dynamic as we collect more and more data; where could this end up?

So we actually have a neuroethics board as part of the IDUN company, and our CTO, Moritz, has created our neuroethics charter. When we're talking with B2B companies we go through a process with them to understand what they actually want to do with this data and to be sure that things are handled in an ethical way, so that we're not, for example, stealing your thoughts or trying to steal the thoughts of people, and of course keeping the user in control of who they're sharing their brain activity with. Also, when we talk about extracting these classifiers or neuromarkers from the data, that's another layer of protection: since we control the full value chain from collecting data to having an actionable insight, access can be restricted, so it's not the case that just anyone could connect to an API endpoint and access raw brain data. We have these protection mechanisms in place. It's also very interesting to see what's happening in different parts of the world with the interpretation of brain data, because you can also extract tons of information from a smartwatch; the ability to really characterize human behavior and then connect it up with identity is a pressing problem across pretty much all wearables, especially as more data is collected and stored. So our goal is really to have the user in control, to have a process for neuroethics that's done together with industry representatives from neurotechnology and neuroscience, and to really form, I hope, a standard for how we can sustainably and ethically build these types of neurotechnology consumer products.

So the current state, this proof of concept, is that what you would call it, a proof-of-concept design? We've definitely gone beyond the proof of concept. In our roadmap, our proof of concept was essentially the Guardian development kit, as we call it, which was released last year, and that looked more like a prototype product; it didn't really have a lot of scalability built into it. With our current product we have great scalability of the actual ear-tip material: it's injection moldable, it's transferable to our system for producing hardware, and we have a much better supply chain. We had a lot of compounding issues that were blocking the general ability to scale the product, but it's definitely beyond the proof-of-concept phase we had earlier, and now it's a product that is basically ready to scale. We also have new product development with a company, Cerebra Health in Canada, where we're co-developing a sleep device for the consumer market in North America and Europe. So the product is very quickly moving towards mass scalability.

Yeah, and so for somebody who wants to get their hands on it and implement it, would the model be that people license the technology, or buy components, or both? Right now, for our Guardian earbuds it's essentially a subscription basis: companies come to us, we discuss, and they essentially rent the device on a monthly basis. We're working on a reference design business model for OEMs for the later business case, where we can license a reference design, sell it, and then scale it, and that would include essentially the way we measure EEG, so the ear-tip production material, as well as things like firmware, the type of biopotential amplifier to use, and how to integrate it all into earbuds. So we're looking at these two different business cases for scaling.

Okay, so I guess it would be a fairly complicated process to implement at this stage? Well, I don't know. We're talking with some of the larger companies about how to build reference designs, and it's not that complicated. I would say it's really about changing a few processes, for example having conductive strips on the earpiece itself so we can make contact with the conductive ear tip, and then having a more standardized way of integrating system-on-a-chip modules with the hardware. This is also one of the interesting challenges for the future: will it make more sense to do a die-stacked chip ourselves, or to work with other companies and take more of a system-on-module approach to the design?

Okay, so this device, the current Guardian earbuds model, does it have a fairly powerful processor in it, or is it something where the software could be implemented on existing TWS chips? Yeah, so we're actually working on a TWS reference design in parallel; it should be ready probably at the beginning of Q2, so actually quite soon. There we're integrating with a Qualcomm chip, so we have EEG streaming directly into the Qualcomm chip, which helps with our partners.
And that's going to be our main reference design for doing earbuds at scale, so it's actually almost ready, and I'd love to talk with companies about working with it and adopting it.

Yep. Can I ask a little bit about what kind of processing is actually done? What are you sampling at? Currently we sample at 250 Hertz for the brain signal, and we're not doing too much processing on the device itself, because for the development kit we want to bring a lot of this data into the cloud for algorithm development. If we're going towards a specific use case where, for example, we're just measuring a change in focus, those are things we can more easily do on the device, and where it could make more sense to do them on the device for a specific consumer product. So we're looking more at, let's say, microcontroller-level processing, and we don't need an eight-core multi-core processor right now to do the product that we need.

That suggests almost two kinds of things: one where you collect an enormous amount of data and process it offline, and another where you're essentially doing a sample-block-by-sample-block characterization, looking for some characteristic or feature, right? Exactly. If we have a neuromarker, such as one for focus, and we want to deploy just that one neuromarker on the device, that would be feasible. But for our development kit, many of the customers are still looking for the application, the killer application, that they want to develop for the technology, and therefore we've made a much more flexible platform in the cloud, so we can pull in all this data, you can run experiments much more seamlessly, and then do the data science on it. Yep, very good.

So Simon, what do you think about this type of idea, integrating brain activity with hearables? I like it a lot, and all of these types of features going into earbuds, heart rate and pulse-related things, but this one actually sounds a little more interesting. Oh, thank you. And really, I see us integrating all of that in the future, either in our own reference design if we take it to the third or fourth version, or together with a commercial partner, because of course there are many devices already out there measuring heart rate, and heart rate plus motion is not too difficult. When you add our technology to that, you have this great combination of understanding what's happening in the heart, the brain, and the body, and I think when we have that, we're going to have a really nice, scalable, powerful system for doing all these things, like also measuring stress, because then you can measure heart rate variability and combine it with your focus, combine it with whether you took a walk in the morning or a nap in the afternoon after lunch. So I think it's really an amazing, exciting space.
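As a rough picture of how an EEG-derived focus score and heart-rate variability could be fused the way Mark describes, here is a small sketch. RMSSD is a standard short-term HRV measure computed from successive RR intervals; the cutoffs below are arbitrary placeholders rather than validated values, and the focus score is assumed to come from something like the beta/alpha ratio sketched earlier.

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """RMSSD: root mean square of successive differences between RR intervals
    (in milliseconds), as a PPG or ECG sensor in the same earbud might provide."""
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

def stress_snapshot(focus_score, rr_intervals_ms, low_hrv_ms=20.0, high_focus=2.0):
    """Fuse an EEG-derived focus score with HRV into one crude state estimate.
    Low HRV together with high arousal is flagged as a 'stressed' guess."""
    hrv = rmssd(rr_intervals_ms)
    return {
        "focus": float(focus_score),
        "hrv_rmssd_ms": round(hrv, 1),
        "stressed_guess": hrv < low_hrv_ms and focus_score > high_focus,
    }
```

Folding in motion or context (the morning walk, the after-lunch nap) would just add more fields to the snapshot; the point is only that the sensor fusion itself can start very simply.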
Yeah. Well, when you're the developer of this, you can see an enormous range of possibilities, but I guess from an implementer's point of view you might like to choose: I'm going to make, for example, sports earphones, so this is the one specific capability I want to give to my users, and I don't care about sleep; maybe they do, but you get my point, you want to try and consolidate onto one specific thing that you get a very robust response out of.

Yeah, and one of the reasons we started with sleep is that we saw a lot of value we could bring to the sleep market, the sleep-tech market. For that reason, and also because of the component supply we had at the time, we built this type of device with sleep in mind; it's not completely optimized for sleep, so it can also be used during the day as a more normal earbud device. Solving sleep with some of our first commercial partners made a lot of sense from a market perspective, and when we talk to other companies in the earphone space, this question of what the application for neurotech earbuds would be is still a bit open. It's really great to see companies like LG coming into the market with these types of devices, because first it proves there's an appetite from the industry to bring in these new types of technologies, but it's also going to be great for testing the market and understanding how people respond to that type of use case, and they also saw the value in going with sleep first and optimizing their value proposition around sleep. But beyond that, once you've solved sleep, where you might have earbuds falling out, you have to have good contact, people are moving around, and the device is being used for probably six to eight hours already, then transitioning to those daytime applications like tracking focus or tracking music is going to be a lot easier and much easier to scale. So having the market proven, having devices in the market for sleep, and then going into these other amazing use cases is, I think, really going to be the way we go as a company.

Do you mind if I ask about this concept of focus, or concentration, let's say? Is there a metric for that? I mean, is there a way to say your concentration is at a certain level? Right now companies are all doing it individually. It's going to be the same, I think, as with many of these different applications: you basically see a change, a synchronization, in brain waves, and you can do it together with music. There's this concept of a flow state, where your brain is very focused and you're doing one thing; it can happen in sports, it can happen in creativity and the creative industries. Seeing how the brain is essentially shifting from one state to another is one of the things we do with the device, and having a real, let's say, biomarker for focus in the ear, and being able to scale that, is one of the big things we're going to be bringing to the market. We're doing that right now with sleep, where, like I said, you're essentially seeing these shifts in brainwave frequencies to define the sleep state, and it's something similar to that for focus.

I'd be interested in a study headphone, earphones that can essentially prompt you when you're losing focus, because you've had the experience too, when you're working: you can fall into a state where you're just not concentrating, just drifting, and an hour can go by. Wouldn't it be nice if the earphone just told you to go and take a break, come back, and regain your focus, rather than leaving you to drift off?
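A "nudge me when I drift" feature like the one floated here could be prototyped with nothing more than a smoothed focus score, a learned personal baseline, and some patience before prompting. The sketch below assumes a focus score arrives every few seconds (with 5-second updates, 24 consecutive low readings is about two minutes); all of the numbers are placeholders for illustration, not anything IDUN has shipped.

```python
from dataclasses import dataclass

@dataclass
class FocusNudger:
    """Suggest a break (or cue relaxation audio) once the smoothed focus score
    has stayed below a fraction of the user's learned baseline for a while."""
    smoothing: float = 0.1      # exponential moving average factor
    drop_ratio: float = 0.7     # "drifting" means below 70 % of baseline
    patience: int = 24          # consecutive low windows before nudging
    warmup: int = 60            # windows used only to learn the baseline
    baseline: float = 0.0
    smoothed: float = 0.0
    low_count: int = 0
    seen: int = 0

    def update(self, focus_score: float) -> bool:
        """Feed one focus reading; returns True when it's time to nudge."""
        self.seen += 1
        self.smoothed += self.smoothing * (focus_score - self.smoothed)
        if self.seen <= self.warmup:
            self.baseline = self.smoothed          # still calibrating
            return False
        if self.smoothed < self.drop_ratio * self.baseline:
            self.low_count += 1
        else:
            self.low_count = 0
        if self.low_count >= self.patience:
            self.low_count = 0                     # reset after prompting
            return True
        return False
```

The same trigger could just as easily start a relaxation track instead of showing a prompt, which is exactly where the conversation goes next.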
Well, or having your relaxation music or audio start to play automatically, right? I really love this idea of the work-time cycle, where, especially if you're working on a new problem, your thoughts are expanding and then contracting, and you need that time; your brain plasticity needs that time to think about something and then step away from it, and then you have memory consolidation. There's also a lot of cool research happening in places like MIT where they're looking at memory consolidation during sleep to assist learning, so you have an audio cue while you're learning something during the day and then bring that audio cue back while you're sleeping. That's one of the research topics where they're trying to see if we can actually improve learning processes during the day by understanding and doing interventions during the night, at the right sleep stage. It's still really in the research phase, but the potential to affect and improve all these different small parts of our lives is amazing.

All right, that's some good stuff, very interesting stuff, Mark. Thank you for joining us today. We'll put some information down below for people to follow up and find out more about IDUN. So everybody: like, subscribe, share, all that good stuff, and we'll see you on the next episode. See you for now. Cool, thank you. Thank you, David.

2023-04-16
