Digital Humanities Lab: Exploring musical instruments using digital technologies
Good morning everyone, my name's Alexis Tindall, I'm the Manager of Digital Innovation here at the University of Adelaide Library, and thank you all for joining us for today's Digital Humanities Lab webinar. This is the second in this year's five-part series. Before we start I'd just like to mention that I live and work on the lands of the Kaurna people, and I'd very much like to acknowledge and pay my respects to them as the traditional custodians from whose ancestral lands I'm joining you. I acknowledge the deep feelings of attachment and relationship the Kaurna people have with country, and respect their past, present and ongoing connection to the land and cultural beliefs. As this webinar is online it's very possible you're joining us from other parts of Australia, so I extend that respect to relevant communities in other regions, and I especially welcome any Aboriginal and Torres Strait Islander people who are joining us today. With this
webinar series the University Library hopes to inspire and inform humanities researchers and their potential collaborators who are interested in using digital approaches in their work. When I say potential collaborators I mean everything from researchers in other disciplines to representatives of the galleries, libraries, archives and museums sector, students and whoever else is interested; everyone is welcome at these webinars. We showcase real projects and researchers that are using data, digital tools or digital research methods to drive, enrich, complement or communicate their research. Just a couple of bits of housekeeping before we get started: today's session will be recorded and shared after the event on the University of Adelaide YouTube page. As a registrant you will be sent a link to that
recording. We will have all of our presenters first today, and then bring them back for questions and discussion at the end. If you have any questions, please look to the bottom of your screen, where you can see the chat icon, and ask your questions in that chat window. Feel free to introduce yourself in the chat as well and let us know where you're coming in from today. Your questions can be sent through at any time and I will collect them
and pose them to our speakers at the end. Online we've also got my colleague Matt Lumsden, who is helping us today and will be keeping an eye on that chat for any technical questions you might have. So, we have a lot to get through, so without any further ado: we will be hearing about three projects today, all of which are centered around musical instruments. They
include everything from researchers working with traditional instruments found in historical collections through to those who are creating brand new instruments, enabling new forms of music, engagement and performance. So first up we have Dr Anthea Skinner from the University of Melbourne's Victorian College of the Arts, and Dr Alon Ilsar. Anthea is an ethnomusicologist who specializes in disability music culture, military music, organology and archiving. Anthea's research into disability music focuses on professional musicians with
a disability, their creative output and career pathways, as well as adaptive musical instrument design. She's a musician herself, with the all-disabled Bearbrass Asylum Orchestra. And Dr Alon Ilsar is a drummer, composer, instrument designer and researcher. He has researched the uses of a new gestural instrument he co-designed, the AirSticks, at Monash University's SensiLab in the field of health and well-being, making music creation more accessible to the broader community. He too is a musician, having played with a range of theater productions, orchestras around the world and contemporary artists. And with that I will hand over to Alon to get us started. Thank you Alexis, I'll just share my screen. So yeah, I feel really lucky to be part of this project, the AirSticks and the AirDancer: Adapting a Gestural Controller for a Dancer with Disability. I'll try to give a quick background to the AirSticks as an instrument, and then we'll speak a little bit about Melinda Smith's practice and how we adapted that instrument for her practice. It's very much an ongoing project, so we look forward to presenting more work on this in the future. So firstly, what are the
AirSticks? Basically my background is as a drummer, and I wanted to combine the expressiveness of acoustic instruments with the endless sonic possibilities of electronic music. That's me holding the AirStick, and here's the latest one. We'll get to this; I'll give you a quick system overview. Basically, the AirSticks at the moment have a nine-degrees-of-freedom IMU and a Nordic BLE microcontroller that sends information via Bluetooth to a computer. On the computer we have a bit of
software that we call Airware, which translates the gestural data into MIDI or OSC, and that gets sent to music software. I often use Ableton Live, but we can also use Max or Logic or Cubase or whatever we like to use. We've also triggered synthesizers with it and so forth, so it's very open. And because we're talking more specifically about hardware
here, I thought I'd give an overview of all the different hardware, and when I started looking at what I've been doing over the last 15 or 16 years with this, I realized there have been a lot of prototypes. The first one, on the top left, used a camera: we had a camera detecting a mallet, using the shape of a square to judge how close it was to the screen or further back. But this was 2007 and the cameras were quite slow then, so we tried an exoskeleton, there in the middle, and that was very clunky, and then we sat for a very long time on the Razer Hydra gaming controllers, which are an off-the-shelf VR controller. But I always dreamed of building my own hardware for this project; at the bottom left is a CAD model of what we wanted to create, and before joining SensiLab this was some of the development we had made, at the bottom here. So my PhD
centered around my own practice as a drummer and electronic producer using the Razer Hydra gaming controllers to create gestural music. And then when I started at SensiLab at the end of 2019 we started to develop our own AirSticks, what we call AirSticks 2.0, which uses our own custom hardware. Here's one that you can see; this is the first prototype we made of
AirSticks 2.0. As you can see it slots onto a drumstick very easily, because I'm a drummer and it was within my own practice that I wanted to use this. But it's also got no buttons, so the point of this was to become a much more accessible instrument: being able to just grab it, move it, not worry about pressing buttons, map it in a way that was quite intuitive for people to use for the first time, and then develop it further with them in terms of what sounds they wanted to make and what movements they wanted to make to make those sounds. So we developed that further for drummers as well; this is the AirSticks 2.1, which also has a sensor at the bottom that you can squeeze sound out of. Just to talk about the printing part of this a little bit as well: these are all printed from sintered nylon, and as you can see there is the process, which is kind of magical (this is not my field), but they pop out as this hardened material, we shave them off, and these are the parts that we put together, for this AirStick at least.
So yeah, I'd like to move on and quickly play a video of the first session where I had two of these sticks and worked with Melinda Smith, and then Anthea can talk a little bit more about Melinda's practice. [Music plays] Shall we leave it there, given we're very tight for time? Yeah. Oh, can we go back just so we can see Melinda while I talk about it? There we go, thank you. I'm Dr Anthea Skinner
and my field of expertise is disability music, so when I came into this project my aim was to look for professional artists who hadn't had access to instruments before. And I have, for about 15 years, worked on and off with Dr Melinda Smith, who is a professional dancer with Cerebral Palsy, and she's also hard of hearing. And immediately the work Alon was doing with the AirSticks seemed like a really obvious thing to work on with Melinda, because, as you can see, this was literally her first day using it; you don't need to be able to read music to use it, you don't need to have an in-depth understanding of music theory. And so we thought this would be a really great
starting point for Melinda, and as you saw, this is her first day. Alon obviously already had a really functional musical instrument, but we then worked with him to make it suit Melinda more specifically, both in relation to her disability and in relation to her actual performance practice, which in some ways turned out to be the more exciting part for me, I think, and I think Alon would agree. So on the first day, for me, looking at Melinda here: she's clearly playing it really well, but when you have Cerebral Palsy or any kind of muscular spasming issue, the grip involved in holding the drumstick like that actually takes a massive amount of concentration. So when I'm watching this, someone who doesn't know Melinda's work will
see it and say wow, she's making beautiful music; what I see is that she's very limited in the way she can move, because she's concentrating really hard on holding that instrument. So we then started looking at different ways that she could hold it. I think we first started looking at maybe creating a glove that would actually hold the stick, because we really love that sort of flick that you can get with the stick, but as we moved forward we started to realize that perhaps something that would attach to her body would work a lot better, and we started working with a costume designer to help that happen. Musically speaking, though, I think that's where she really surprised us, in that I had seen this as a musical instrument. I know Alon had done a little bit of work with poetry before, but Melinda is a published poet as well, and the first thing she wanted to do was to put her poetry into the machine so that she could point in various directions and play lines of poetry. So she ended up with this wonderful music-poetry collaboration in and of itself. Melinda doesn't speak very clearly as a result of her Cerebral Palsy, so we recorded the poem in her electronic communication voice, and we also had her speak it in her own voice, which people often struggle to understand, but by layering those two things it gave a deeper level of understanding. I was
just joking earlier that our main tactic when it came to redesigning the shape of the AirSticks to suit Mel is what I call the take-it-and-break-it technique. We sent the instrument home with Melinda, and I think within six hours she'd broken it, broken the charger, because she has limited hand movement. And that's really what we wanted to know: where are the glitches for someone with the kind of hand movement Melinda has, what's not working for her and what is? So by repeatedly breaking the instrument she gave Alon information that allowed him to come back and redesign and redesign and redesign until she couldn't break it anymore. And now it still gives the music that she
wants, but she can't break it, and that is wonderful, and that's really important, because there's no point in having a piece of technology if you need to constantly have it fixed or have someone else turn it on and off for you. Should we move on to the next bit, Alon? Sure, yeah, just very aware of time. So we came up with the idea of, at first, strapping the AirStick that we had onto Melinda's wrist with a wristband, and she did her first performance of a piece called The Rhythm of My Body Shapes with this technique. But as an instrument for her to take home this wasn't accessible: it had no way for her to press any buttons to change to other settings when she was rehearsing, and it also had a very small on-off switch and a recharge port that was quite difficult to find new chargers for, as it's not a standard USB. So we went on to create a touch-sensitive and pressure-sensitive pad on top, in the form of this new AirStick, which is a bit bulkier, but because she's wearing it that's not a problem for us or for her. I've taken off the pad here, but this is where the pad will sit for her to be able to slide her finger on it or touch it. The button is also a lot more accessible, and I went and visited her and brought some new buttons to make sure that she could switch this on and off, but not switch it on and off accidentally; that's the kind of compromise that we came to. The strap goes through here: if you can see, we've created a gap in the print, and again, this form of printing is so strong that we can afford to have this very thin bit of plastic that's not going to break when you put a strap through it. There's also a light so she knows it's charging, and it's using normal USB chargers now as well. So these are all the things that we looked at to make this instrument more accessible within the hardware.
Within the software is a whole other thing and maybe not so much to be discussed as part of this conversation.
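Alon leaves the software side out of scope, but just to make the earlier pipeline description concrete (gestural data translated into MIDI before it reaches music software), here is a minimal, hypothetical Python sketch of that kind of mapping layer. The angle range, the CC numbers and the function names are illustrative assumptions on my part, not the actual Airware implementation.

```python
def angle_to_cc(angle_deg, lo=-90.0, hi=90.0):
    """Map a tilt angle in degrees (clamped to [lo, hi]) to a MIDI
    Control Change value in 0..127."""
    clamped = max(lo, min(hi, angle_deg))
    return round((clamped - lo) / (hi - lo) * 127)

def gesture_to_midi(pitch_deg, roll_deg):
    """Return (status, controller, value) triples for two Control Change
    messages. 0xB0 is Control Change on MIDI channel 1; CC 1 (mod wheel)
    and CC 7 (volume) are arbitrary choices for this sketch."""
    return [
        (0xB0, 1, angle_to_cc(pitch_deg)),
        (0xB0, 7, angle_to_cc(roll_deg)),
    ]
```

A real system would continuously stream sensor-fused orientation from the IMU into mappings like this, with the byte triples handed to a MIDI output port for Ableton Live or another host to consume.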
So finally, now that we have a prototype that fits onto Mel's body, our next step is creating what we call the AirDancer harness, and we've been working with Anna Cordingley, who's a costume designer at the Victorian College of the Arts, to do that. Here you can see her basic designs, and I don't know if you can see the little pink outlines on her arms; they're the little pockets where the AirSticks will go. There are little pockets as well on her legs, and there'll be one on her chest. I really wanted to show you this picture mainly to give you an idea of what working with Mel is like, because I envisaged this as a very utilitarian, black and white harness. Mel, as well as being a dancer and a poet, is a painter, and she said, well, can't we make it look pretty too? So she and Anna got together, and they're actually going to be laser printing Mel's designs onto the AirDancer harness, and indeed that's something that we can change for different performances and for different artists if we ever decide to make it more widely available. So that's us, that is the AirDancer and the AirSticks, and I'll hand over to Aaron and Kim. Thanks so much Alon and Anthea, that was a really interesting presentation, and thanks so much for sharing that video as well, it's wonderful to see. So coming up next we have Professor Aaron Corn and Dr Yeonuk Kim. Aaron is presently the Director of the University of Melbourne's Indigenous Knowledge Institute and was previously a Director of the National Centre for Aboriginal Language and Music Studies and the Centre for Aboriginal Studies in Music here at the University of Adelaide. He has decades of experience in close collaboration with Indigenous colleagues, working towards broader recognition of the significance of Indigenous knowledge through multiple research and education initiatives.
Kim is a mechanical engineer whose research explores the mechanical and chemical properties and structure of various biomaterials using combined engineering tools. Working with multidisciplinary teams, he has been developing 3D visualization technology for imaging and reconstruction of biomaterials. I invite him to join us; I can see you, so take it away. Thanks Alexis, and thanks to the University of Adelaide for having us today, and to all the other speakers on this panel. I'm speaking from the lands of the Wurundjeri Woi-wurrung here in Melbourne, and as Alexis said, I'm currently inaugural Director of the Indigenous Knowledge Institute at the University of Melbourne, and until recently was Director of the Centre for Aboriginal Studies in Music, CASM, at the University of Adelaide. So the project that brings us all together is a linkage project funded by the Australian Research Council, the ARC for short, on which I'm the lead chief investigator; Anthea here is one of the other eight chief investigators, and everybody here is connected to those chief investigators one way or the other through the work that they do in their various institutes and labs. And that project is called 3D Printing of Custom Musical Instruments for Heritage and Industry Needs, which is a boring but efficient way of talking about all the bases we're trying to cover in that project. I think it's quite a unique project, because for the most part
3D printing technologies haven't really been applied to producing musical instruments in a way that enables them to produce sound as intended by their original models. But before I get into that, the partners on the project at an institutional level (and there are honestly just too many individual people to name on this project at once, there are a lot of people on it, so when this goes live Alexis will make sure that we've credited everybody sufficiently): the University of Adelaide as lead organization, Monash University, the University of Melbourne, The Mulka Project in Yirrkala in the Northern Territory, which is a local Indigenous-led community archive in the town of Yirrkala in Northeast Arnhem Land and a leading example of what Indigenous-led archiving can look like, and the South Australian Museum is the final partner organization there. And we conceived of this as a highly interdisciplinary project and team, and we have a whole range of people who work on the project who span disciplines including, but not limited to, music, computer science, museums and collections, material culture conservation (so the physical maintenance and restoration of things), and engineering including acoustics; and the team included, until their sad and untimely passing away, one of the Indigenous community leaders from Yirrkala.
And there is also a growing constellation of other projects intersecting with this one that has brought us all together, like the one that Anthea and Alon, funded by various means, are working on through the AirSticks that they've just exemplified. So despite the challenges of being locked down and not being able to get into labs, having to wait a long time for the right computer chips to become available to manufacture the equipment that we needed, and then getting held up in shipping delays over the past two years, it's nonetheless great that this project has already in a way outgrown the original vision, picked up momentum and gone in a whole bunch of intended and unintended directions; that's one of the great strengths of what we do. When we first came to the project, it went through the wringer with the funding body several times before it was finally funded, so it had various iterations of development and tightening up before we finally got over the line to land the funding. We wanted to test how 3D printing techniques could work for both preservation and documentation outcomes, initially from a museums and collections context, because that really hadn't been done before. The other challenge, of course, is that even though 3D modelling has been used for non-sound-producing, non-musical artifacts as a way of preserving them, there's the added challenge of trying to document what a musical instrument is and how it works: if you scan something and then print it, will it still make a sound that's anything like the original?
And so there are limitations with the kinds of printing materials you can use, but there are also opportunities, because what we soon discovered, as the project was being refined over the three or four years we were talking about it, is: well, what if we could make playable scans and prints of instruments of various kinds? And what if, in some cases, the materials that we could use, and I'm thinking of very high-tech materials at the moment like carbon fiber and fiberglass, could actually produce instruments that were more robust than the originals? Because a lot of musicians around the world, particularly ones who play instruments made of wood, won't travel on long flights to other parts of the world with their best instruments, because wooden instruments have a terrible tendency to crack in the holds of planes due to temperature and humidity fluctuations.
And there was one thing that really stuck with me as I was writing the application about how this could be possible, because there's a very famous Australian band that I worked with for many years that toured the whole world with a Didgeridoo, or Yidaki to use the Yolngu term from Northeast Arnhem Land, made of fibreglass. It was actually cast from an instrument that had been made from wood, then split vertically down the middle, cast on both sides in fibreglass using traditional surfboard-making techniques, and glued back together. It was painted, and nobody ever knew that that instrument was fibreglass, which is why I'm being a bit cagey about which band that was, precisely. They never told anybody, they never advertised it. It was as light as a feather; you could pick it up and just be surprised at how light it was given its size, and it never cracked, it never warped, it never failed. A wooden instrument would eventually just have been completely destroyed by temperature and humidity fluctuations in the hold of a plane. So that really got me thinking about how this project could be about more than merely preserving instruments that exist: what about adapting them in a whole bunch of ways that hadn't been considered before, to be more robust or more affordable? What about adapting instruments for the kinds of people Anthea and Alon are working with at the moment, who don't really have access to cheap and affordable bespoke instruments, which, with traditional manufacturing methods, cost a fortune to make for an individual, completely unaffordable for most people, particularly people living on a disability pension.
So what can we do in that space became the real question of the application and the work that we're doing now. Before I finish I just want to talk about some of the novel challenges that we had to consider, because ultimately there's not much point producing musical instruments that can't be used as musical instruments. If we're talking about anything beyond a cosmetic, dead-display-behind-glass-in-a-cabinet approach to making a replica of something, if you actually want to use these technologies to make functional musical instruments, there's a whole range of things you've got to think about. To replicate instruments properly, particularly ones held in collections, you just can't destroy them; you can't slice them down the middle and scan their insides. Their insides, of course, their resonators, are absolutely integral to sound production, so unless you're scanning the internal structures of an instrument, you're unlikely to make anything that will sound remotely like the original. You can't just scan something with a handheld device externally and expect to produce an instrument that is in any way faithful to the original. So we've had to come up with a range of non-invasive methods for scanning the interior structures of the instruments that we're seeking to print, and there are a range of techniques for how to do that: some of it involves putting instruments into MRI machines, X-ray, ultrasound, all non-invasive techniques; you can feed speckle lasers up pipes, into the bores of wind instruments; and there's even a robot arm that's been worked on at Monash to send tiny scanners up. All of that is
still really being prototyped at the moment, and we have another interesting and novel challenge as well, because a lot of musical instruments are bigger than most commercially available 3D printer beds, so how we deal with that is another great question. Every time a printer comes onto the market that can print bigger instruments, we look at it very closely; Anthea just found an infinite printer that runs on a treadmill that we're currently looking at trialling. And then we've got other options: rather than finding bigger printer beds or infinite printers, we can just make the instrument in parts and glue them together. That might work for instruments that are made in parts and joined together traditionally; it probably would be less satisfying for something like a didgeridoo, which is a long pipe. So look, they're the various engineering challenges that we have, and I think I've gone pretty much over time, so I'm going to throw to Kim, thanks. Okay, thank you Aaron. Let me share my screen so I can talk about what we did for 3D scanning and printing of a Yidaki. I'll start by talking about some 3D scanning techniques, which Aaron covered a little bit as well, and the challenges we have with these 3D scanning techniques when scanning a Yidaki, then the prototype scanner we developed, and some challenges and results with regard to 3D printing. Most commercial scanners you see around the labs around
the world and around your homes, like the handheld 3D scanners, are relatively fast, really easy to use and quite accurate, and the interesting thing is that because they use cameras they can actually capture the textures, colors and images of the object you're trying to scan. But one of the issues is that they require a line of sight: if the scanner can't see a surface, you cannot scan it. So for a Yidaki, or any other wind instrument that has internal tubes and internal structures, this is not really suitable. So what about a CT or MRI scanner, like this micro-CT scanner? In this example of what I did before with a clay ocarina, it gives you phenomenal, amazing results: you can even scan the tiny little grooves and engravings on the instrument, and you can see inside, of course. But the downside is, first, that it takes an extremely long time to scan; I believe the ocarina is around 15 centimeters wide, and it took about 36 hours in total to get the model. And there is a size limitation, so for a Yidaki that is at least one meter to several meters long, you can't really use this type of 3D scanner. Then what about just measuring the instrument and drawing it up using computer-aided design to model it? This is very viable in most cases: you can measure the diameter at each section and the diameter of the fingering holes, and, assuming the internal tube is straight, you can relatively easily model this kind of instrument. However, a Yidaki has amazing organic shapes, with curvatures and grooves inside, so those are challenging to capture using this kind of 3D modeling technique.
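The measure-and-model approach Kim describes (diameters at measured stations, assuming a straight internal tube) can be sketched in a few lines of Python. This is my illustrative reconstruction, not the team's actual tooling: each measured station becomes a ring of points, and a meshing step would then skin the rings into a printable surface.

```python
import math

def straight_bore_profile(stations, points_per_ring=16):
    """Build a point cloud for a straight-bore instrument from measured
    (position_mm, diameter_mm) stations. Each station becomes a ring of
    points around the bore axis (the z-axis here)."""
    cloud = []
    for z, diameter in stations:
        r = diameter / 2.0
        for k in range(points_per_ring):
            theta = 2.0 * math.pi * k / points_per_ring
            cloud.append((r * math.cos(theta), r * math.sin(theta), z))
    return cloud
```

This also makes the limitation concrete: the model is a surface of revolution, so the organic curvature and internal grooves of a real Yidaki are exactly what it cannot represent, which is why the team's laser-line scanner was needed.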
So, the challenges in scanning a Yidaki: as you can see in the images in the slides, there are very organic shapes, something you can't really measure or model easily, and because it's a wind instrument there's an internal tube you can't see inside, which means we can't use a handheld scanner. With the large size we can't use a micro-CT scanner, and because of the organic shapes we can't use SolidWorks or any other 3D modeling software. So we had to come up with something new that can scan the inside of the instrument along its length, and we developed a prototype scanner that uses a laser line module with a 45-degree mirror that reflects the laser line, plus an endoscope camera; it looks like this. What it does is shoot out the laser, bounce the laser line off the 45-degree mirror, and illuminate a single layer, I would say, of the instrument with the laser line, creating a contour that follows the instrument. We repeat these scans along the length, in this particular example at every 10 millimeter increment, and after each pass along the length we rotate the instrument roughly 90 degrees, because the laser line can't really capture the entire contour of the instrument. We rotate the instrument 90 degrees and repeat
the process four times. With the laser images we capture, we align those images, and using 3D software we can create a point cloud, which is one of the formats for 3D models, and from the point cloud we can create a 3D mesh structure, which is a file type used for 3D printing. So for the internal imaging we follow those steps; for the external scanning we use something relatively simpler, photogrammetry, where you take multiple images of the object you're trying to scan, in this case the Yidaki, from all around the model, and using trigonometry, following multiple points on the instrument and placing the camera in a virtual 3D environment, and by merging those captured images, you get a 3D structure of the Yidaki's external features. And since we now have internal and external 3D models, we can merge them together to create a single, final 3D model, and as you can see we have quite accurate external features, and we have a model of the internal tube structure as well. So with this model we can put it into any 3D printer and print it out, and it comes out like this; there's a pen next to it to give you a sense of the scale of what you're seeing. As you can see, the printed result is very small, only a few centimeters long, and you definitely can't play this. The issue is that there are some limitations with 3D printing. The first limitation, obviously, is size. Most of the conventional FDM
3D printers you have in your homes can print maybe up to 250 millimeters, maybe 300 at the maximum, whereas a Yidaki itself is like 1.5 meters long, so you can't really use that. As Aaron talked about before, there are new 3D printers that use a treadmill system, where you can technically print infinite lengths, but the setup can be a little more challenging, and imagine a print failure in the middle: that means you've wasted so much time and material as well. The other way is to print small section by small section and glue them together; if you can imagine saxophones and clarinets, where you put multiple sections together, you can do something like that, but printing the entire instrument in a single print is still something of a challenge. And another challenge is material. Most
of the 3D printers use either resin or plastics, whereas Yidakis are wooden instruments, and there are certain qualities the wood produces in terms of sound quality and resonance. There are some 3D printing filaments that are infused with wood grains and wood dust, but because only a small amount of wood is mixed into the plastic, it doesn't really have the same qualities as wood. There are other materials, like fiberglass or carbon fiber infused into plastic filaments, but the same applies: the reason fiberglass and resin produce really good sound quality and good resonance is their fibrous structure, whereas those infused filaments use very short fiber strands, so they don't have that structural similarity. This is something we still need to investigate and experiment with. So to conclude: so far we have successfully constructed a process flow for building 3D models of the Yidaki, we developed a prototype for this process, and we created a 3D model that is a fairly accurate representation, with a scaled model Yidaki printed using an FDM 3D printer. And as I said before, for the 3D printing process, both new materials and size are things we still need to look into. I think we are way over time, so this should be the end of the presentation, thank you very much. Thanks very much Kim and Aaron, yes, we are a bit over time, and I just want to make sure we have plenty of time for Daniel at the end as well; we were trying to fit a lot into today's session. I will just remind attendees that they can ask a question in the chat window now if they like, and hopefully we'll have a few minutes for questions at the end. So finally,
we have Daniel Bornstein, who is a student in the Master of Cultural Materials Conservation at the Grimwade Centre. He has a professional background and broad specialization in photographs, books, paper objects and machines. He's worked in the private sector and with institutions including the Powerhouse Museum, the University of Melbourne and the National Library of Australia. Today he's going to share some findings and research on 3D scanning traditional Indonesian instruments, so if I can just ask Kim to turn your camera off, we'll hand over to Daniel.
I'll be chatting a little bit about 3D imaging techniques, so this dovetails nicely with what Aaron and Kim were just talking about. This is based on my minor thesis project, which was fed by the same research but was specifically looking at imaging techniques for the instruments, and specifically accessible imaging techniques for small institutions or individuals wanting to try this. So, just a sec, I was looking at two instruments from the Music Archive of Monash University. It's an interesting case study because the music archive is a significant collection of Indonesian ethnographic musical instruments dating back to the 19th century. For some areas it is probably one of the most complete records of this material, just because in some regions a lot of it has been destroyed by natural disasters and other things, so in some cases this archive provides one of the most complete records of the cultural instruments that it holds, and so it's a very significant archive. The only problem with the archive is that it's roughly six thousand kilometers from its source communities. In this case the instruments I was looking at were both from the Sumatra region, and they're held in the collection store in Clayton, which is a very, very long way away. So you can imagine that in order to increase access they're looking at digital or virtual access, and the way they're doing that at the moment is through the Figshare platform. Figshare is quite a good platform, it allows them to host things like audio recordings, but it only hosts 2D visual assets, so they can put photographs in but not 3D assets.
So in my research, of course, I was asking: can 2D assets make effective digital surrogates for objects which are defined by their three-dimensionality? That's what Aaron was just saying a moment ago: you can record a sense of the shape of an object from a photograph, but how much of what makes the object culturally meaningful can you actually capture in that photograph? In the case of musical instruments, of course, they're defined by their three-dimensionality. They're acoustically defined by it: the internal shapes and resonating chambers within them are very important for the sounds they make, and also for the way they're played, which contributes to the way they're used as cultural objects. So I've posed this as a
question; obviously the answer that I think is no, for the reasons that others have just discussed, and I think that 3D assets are a really important thing for many cultural institutions to look into. My background is in conservation, and conservators generally like to keep things, I can't remember Aaron's exact phrasing, something to the effect of static and dead on the wall, and that tends to be how we like our objects, but in the case of musical instruments I think that really doesn't capture what's important about them. As conservators I think we're generally looking to manage cultural transmission, or the transmission of cultural meaning, through an object's change of state or condition, and if that's the stated aim then you really need to be thinking about making sure you're transmitting all the aspects which make it culturally meaningful, and I think 3D is a really good way to do that. There are lots of 3D imaging technologies available. I'm not going to talk about all of these, although I think Kim and Aaron did just mention quite a lot of them and it looks as if they're looking into a lot of them, but in my research I looked at two in particular: photogrammetry, because it is by far the most accessible and where most people looking to get into 3D will start out, and x-ray micro-CT, because it is the most comprehensive. As Kim said, you can get extraordinary detail out of it, but it does have drawbacks and limitations; well, both of them do.
So I think we've already discussed what photogrammetry is, but basically it works on the same principle as the way your eyes calculate depth perception using the distance between them. Photogrammetry does that, but at a scale where, rather than just two eyes, you give it lots and lots of different viewpoints, usually between about 150 and 250 of them, so you just photograph the object from lots of different perspectives. What this means, though, is that since it is generating the 3D data from photographs, you can do it with virtually any camera, so it's a very good starting point, because most institutions will have some kind of setup for two-dimensional digitization, and photogrammetry can use that setup, so you don't actually need to buy any new equipment. For my own research I was looking at the Grimwade Centre. My background is in photography, so I found photogrammetry very easy, but I know that lots of other people don't have a background in photography and lots of objects are quite difficult to light, so I came up with a workflow, well, I didn't come up with it, I adapted a workflow, which uses cross-polarization and turntable capture in order to make it as easy as possible to get repeatable results. It's also a scalable process, with minimal purchase of new equipment. So
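The triangulation principle described here, recovering a 3D point from the same feature seen in two or more calibrated views, can be sketched with the standard linear (DLT) method. This is a textbook illustration, not the adapted workflow itself; real photogrammetry software matches thousands of features across all 150-250 photos and refines the result with bundle adjustment.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two camera views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) image coordinates of the same feature in each view.
    Each observation contributes two linear constraints on the homogeneous
    point X; the least-squares solution is the smallest singular vector."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean coordinates
```

With more cameras you simply stack two more rows per view into `A`.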
this is about five hundred to a thousand dollars' worth of new equipment that the Grimwade had to purchase, and you can see in the bottom left there, that's everything it needs: a camera, the turntable, the strobe which is on the camera, and just a computer for processing. I won't talk too much about the process. As for the assets you actually get out of it: as earlier speakers indicated, you really only get the exterior details, although in the case of the slit drum there you can see it actually did get decent results from the inside, and you could probably clean that up in post. But for the sealed instrument there, the Gendang, there's absolutely no way you can get any data from inside that drum, so in terms of reproducing it, or taking an acoustically accurate document of it, there'd be no way you could do that using photogrammetry. Still, you can see these would be useful assets for lots of different things, and combined with a sound recording they could be quite a good digital resource. I also looked at x-ray micro-CT in the study. This works very similarly to medical x-ray CT: you can see on the left there it's basically a stack of a whole lot of x-rays, the same as you would get if you broke your arm, about 1,800 of them were used for this model, and it stacks them all together and builds a model from that. In the middle and on the right there you can see what's called a DICOM stack; if you've ever had medical imaging done, you'll often get this.
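The stacking idea behind a CT volume, turning a series of 2D cross-sections into a 3D array of densities that can then be segmented by material, can be illustrated with synthetic data. This is a toy sketch, not actual DICOM handling; a real series would be read with a DICOM library and the threshold chosen from the scanner's density values.

```python
import numpy as np

def stack_slices(slices):
    """Stack 2D cross-sections (e.g. a reconstructed CT series) into a
    3D volume, one slice per position along the scan axis."""
    return np.stack(slices, axis=0)

def segment_material(volume, threshold):
    """Boolean voxel mask of everything denser than the threshold --
    conceptually how wood or skin is separated from air and from a
    low-density foam mount that 'doesn't come up in the scan'."""
    return volume > threshold

# Synthetic example: three identical slices of a hollow square tube.
slice_2d = np.zeros((5, 5))
slice_2d[0, :] = slice_2d[-1, :] = slice_2d[:, 0] = slice_2d[:, -1] = 100.0
volume = stack_slices([slice_2d] * 3)
mask = segment_material(volume, threshold=50.0)
```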
You can see extraordinary detail in there, and it's also geometrically accurate for both the inside and the outside. In this case I was able to image the inside and determine that the skin was stretched around what's called a flesh hoop, the ring that tensions it. I wasn't sure what that flesh hoop was made of, but you can see from this data, and it's probably moving too fast to actually see it there, but on the right, that the flesh hoop is made from little strips of bamboo which have been twisted around, and then the skin has been rolled around it and tension applied. So that's something that describes the construction of the artifact, which is something that as conservators we're always looking to document. If somebody wanted to recreate this, they could do so from this data, and we didn't need to saw it in half with a bandsaw, which is always a plus when you're dealing with cultural objects. As was indicated
earlier, it's a specialized process, so I couldn't do this myself. This is the machine; it costs, I think, between 250,000 and 500,000 dollars. It's a good device, and this one was quite a bit quicker than the one Kim used, I think: it only took about two hours for the complete imaging and then another three to four hours for the post-processing, and it really only took that long because I was very picky about what I wanted, and also Dr Jay Black, who you can see there on the left and who is very good at what he does, just hadn't really imaged ethnographic musical instruments before. So it's a very specialized thing, but together we were able to come up with the mounting system you can see in the bottom right there. Basically the instrument is put inside a PVC tube, which protects it in the machine so there's no way there can be a collision or anything like that, and there are foam blocks put inside. The foam doesn't come up in the scan, because it's a different density to the wood, which is why I've had to render the blocks in those colors there. So you can get very clean, good data from this method, and it's very safe. There's actually a standard, or a document, that's come out of a project called the MUSICES project, which describes and details a process for CT scanning of musical instruments, and there are open-architecture systems, so you can scan very large objects with this method.
But it is quite specialized, as I said; it's not necessarily that accessible. With that said, because it is used so much in industry, there are commercial operators, so if there's a particular research project you're looking at, or a particular object you really want to see, you can pay to have it done, and that brings the cost down to maybe between 500 dollars and a couple of thousand dollars per scan, which I would say puts it in reach for specific research projects. So just to give you a sense of what you can get from
the data: you can see on the right there, that's the mesh that's generated, and it's a beautiful mesh, there are about five million faces in it, it's lovely, and you can do all sorts of fancy things with it. On the left there is a false-color density map of the instrument, so you can see areas of density: the more dense areas are yellow and the less dense areas are red. As you can probably guess from that, this can be used for acoustic modeling. Some manufacturers claim sub-micron levels of accuracy and resolution for these scans, so it's really amazing data, and you can use it for all sorts of things; people do use it to work out, say, what sound an instrument would make without having to hit it with a drumstick, or even an air drumstick. Just to compare the two methods: obviously this is external data only, but this is the photogrammetry scan overlaid onto the CT mesh, with the photogrammetry scan in blue and the CT mesh in red. Because the meshes are different, particularly around the strap, I wasn't able to get a quantitative measurement, but you can eyeball it here. This drum is only about 10 centimeters long by 7 centimeters wide, so it's about this big, very small.
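The kind of overlay comparison described here, judging how far one scan deviates from another, can also be quantified with a simple cloud-to-cloud distance. A brute-force sketch, fine for small point sets; real comparison tools sample the meshes densely and use spatial indexing, and would first register the two scans:

```python
import numpy as np

def mean_deviation(points_a, points_b):
    """Mean nearest-neighbour distance from each point in cloud A to
    cloud B -- a crude measure of how closely two scans of the same
    object (e.g. photogrammetry vs CT) agree after alignment."""
    # Pairwise difference vectors, shape (len_a, len_b, 3)
    diffs = points_a[:, None, :] - points_b[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)
    return dists.min(axis=1).mean()
```

A near-zero result means the surfaces coincide; local blobs of disagreement (like the dent from the foam mount) show up as outliers in the per-point distances.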
So you can see they're surprisingly accurate: even though there's a couple of orders of magnitude difference in resolution there, about a thousand times, the overlay is pretty close. You can see a big blob of red, sorry, it's moving very fast, I can't actually control it, it's a GIF, but there's a big blob of red on the bottom, and that's where the foam block was in contact with the drum, so it's pushed it in a little bit; that difference is down to the CT mount, and obviously the straps are positioned differently in the two scans. But by and large it's surprisingly accurate data. You can actually overlay the textures generated from photogrammetry, the surface appearance that's captured photographically, onto CT scans if you can get them close enough; that's not something I've done here, but it is something that I know he's done. Unfortunately I've only got eight minutes
to talk about it, but as many people know, if you approach somebody and ask them to talk about their thesis they'll talk for hours, and I happily could, but I'll cut it off there. Basically, something to take away is that those are two methods which are very accessible, which capture very different data, and which are definitely worth considering for anybody looking into digitization of musical instruments or any three-dimensional objects, and it's something I think a lot more institutions should be thinking about, and I'm sure will start to. I've just got a few acknowledgements here; there are a lot more thanks due as well, I think, but those are the main people who contributed and helped me with my thesis. I've also got a little QR code there that says scan the code to see the 3D assets. Now, I don't actually have the permissions yet to host them, so they're not up at that link, but by the time you see the recording hopefully they will be, and as a placeholder I have put in a little scan of a little bronze tortoise, so if you feel like you really want to scan something and look at a 3D asset, you still can. You're also welcome to email me, my email address is at the bottom there, so feel free to take a screenshot of this slide. Otherwise I will wrap up there, thank you very much. That's great, thank you very much Daniel. Sorry, we
are running quite a lot over time, but we've got a couple of minutes left, so if I can invite the other speakers to turn their cameras back on. We did have one last question come in from Amy, who asked about the opportunity to use 3D printed Yidakis in exhibition environments to give people access to actually play the instruments. I think Aaron answered it, but it might have only gone to the panellists, so Aaron, did you want to give an answer to that question? And perhaps also to the comment you've put in there about another application, bringing these projects together with the AirSticks and the sound of the drums. Ah yes, so the answer to the question, had we thought about it: yes, we have. We've thought about bringing scanned instruments into gallery spaces so that people can actually play them. That would be great as a way of providing that kind of experiential learning and access while saving the originals from wear and tear, which is a great concern for people working on the conservation and curation side of things. I also thought, as Daniel was talking, that another great application for that kind of gallery-floor use would be: why not program some AirSticks to play a voice bank of sounds from drums like the ones Daniel has scanned? So that's another crossover that could potentially come out of this. I mean, the AirSticks could really be used for literally anything, but all the resources that come out of this are pretty special, so it's a really great privilege to be able to work with everybody on this. Alon, do you have any comments on using the AirSticks for different sounds like that?
You can't blow into them, but that would be the next stage. They're a gestural instrument, so they're really great for shakers, or for triggering samples through gesture. The closest we've done to that is in the Grainger Museum: we took samples from the Grainger instruments, for those who have heard of Percy Grainger and his whips and chains and instruments, and that was great because we got to play his instruments alongside the samples, and to play around the instruments and trigger the kind of ghost sounds of the instruments around them. So it's definitely something that I like doing, taking samples from instruments. Look, I think we'll wind that up there because we've come to the end of our time today. Thank you so much for all these presentations. I really feel there was a great thread running through all of them, which is thinking about how we capture the experience of playing, the sounds, the activity of playing, because these objects are more than just standalone objects in themselves. It's been really interesting, and I thank all of our speakers today for their presentations. I'd also like to thank Matt for capably handling things behind the
scenes, and just before we sign off I want to point out that we have the next installment in our DH Lab webinar series coming up really soon, on Tuesday the 6th of September. It will bring together a couple of specialists in history and education who are going to share their experiences designing games and VR experiences for how we learn and engage in university and interactive spaces. As attendees here, I'll send you an email with details for the next one, and I'll also send the video out to everyone who registered today. With that we'll sign off; everybody have a fantastic day. Thank you.