Music and Machine Learning (Google I/O'19)



I'm Jesse, and I'm Adam, and we're from a project within Google called Project Magenta. It's part of the machine learning group within Google. It's an open-source research project, so we do cutting-edge machine learning research, but we're really interested in the role that machine learning can play for creative technologies and for artists and musicians. Everything we do, we put out in open source, and we also focus on building tools for developers and for artists so that they can actively explore using AI and machine learning in the creative process. If you're interested in the project, a real quick plug: you can go to g.co/magenta. If you're more on the research side, you can find all sorts of research papers and datasets; if you're a musician, there are a lot of integrated tools you can try; and we also have JavaScript and other libraries for coders. So I'm going to pass it over to Adam, who's going to talk about what this means in practice.

Yeah, so let's look at one concrete example of the type of work we do. This is a project we released just last week called Groove, and it actually started with inviting — actually, hiring — professional drummers to come into the office. We recorded them playing on an electronic drum kit that allowed us to capture a symbolic representation of their performances, and then we trained machine learning models to do various tasks with this data that we thought could be interesting in a creative context. I'll get into some of those tasks in a moment, but first I want to go through how we actually seed this stuff into the world. First, we're a research group, so we write an academic paper and submit it to a machine learning conference. We also release the dataset, so that other researchers can take it and either reproduce our results or, hopefully, expand and improve upon them. We also always put our stuff into open source, because we want coders to have access to this technology as well, so we release a TensorFlow implementation of our models. We also typically — and in this case we did — re-implement it in TensorFlow.js, which is a really useful technology for building interfaces and applications on top of these methods; putting it into JavaScript makes that easy. Lastly, in this case we put the dataset into TensorFlow Datasets, which gives you a single-line API for accessing it, so you can train new models or use it for whatever purpose you'd like. And finally, we typically build some sort of tools for musicians, artists, or whatever types of creators we're targeting. In this case we took our JavaScript implementation and built some plugins, called Magenta Studio, for Ableton Live. If you're not familiar with Ableton Live, it's a professional software package that people use to produce or compose music, and these plugins add new functionality that didn't exist before. I want to focus in on Drumify, which is one of the two plugins we built from the models we made with our Groove dataset.
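(Aside: the "single-line API" Adam mentions looks roughly like the sketch below. This is a minimal example; the dataset config name and feature keys are assumptions based on the public Groove MIDI Dataset release, so verify them against the TFDS catalog.)

```python
# Minimal sketch of loading the Groove MIDI Dataset through TensorFlow
# Datasets. The config name 'groove/full-midionly' and the feature keys
# below are assumptions from the public release -- verify against TFDS.
import tensorflow_datasets as tfds

ds = tfds.load('groove/full-midionly', split='train')

for example in ds.take(1):
    # Each example holds one captured drum performance plus metadata.
    midi_bytes = example['midi'].numpy()  # raw MIDI of the performance (assumed key)
    print(example['bpm'].numpy(), example['style'])  # tempo and style metadata (assumed keys)
```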
What this plugin does is let you take any sort of rhythmic music and turn it into a drum beat — or create a drum beat that accompanies it well. So imagine you're a producer: you're starting a new song and you have a bass line you like a lot, but maybe you don't have access to a drum kit, or you're not a talented or skillful drummer yourself. You can use the Drumify plugin to take that bass line and create an accompanying drum beat, to continue your compositional process.
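(Conceptually, Drumify extracts the onset rhythm of whatever you give it and asks a groove model to turn that rhythm into a drum performance. Here's a rough sketch of that idea using Magenta's Python GrooVAE models — not the plugin's actual code; the config name, checkpoint path, and input file below are all assumptions to verify against the Magenta release.)

```python
# Rough sketch of the Drumify idea: reduce a bass line to its onset rhythm
# ("taps"), then let a GrooVAE-style model turn the taps into a drum beat.
# Not the Ableton plugin's actual code; config/checkpoint names are assumed.
import note_seq
from magenta.models.music_vae import configs
from magenta.models.music_vae.trained_model import TrainedModel

bass = note_seq.midi_file_to_note_sequence('bass_line.mid')  # hypothetical input

# Keep only the rhythm: collapse every note onset onto one neutral "tap" pitch.
tap = note_seq.NoteSequence()
tap.CopyFrom(bass)
for note in tap.notes:
    note.pitch = 42  # GM closed hi-hat, standing in as the tap sound

config = configs.CONFIG_MAP['groovae_2bar_tap_fixed_velocity']  # assumed config name
model = TrainedModel(config, batch_size=1,
                     checkpoint_dir_or_path='groovae_2bar_tap.ckpt')  # assumed path

# Encode the tap rhythm, then decode it as a full, humanized drum beat.
z, _, _ = model.encode([tap])
drums = model.decode(z, length=32)[0]  # 32 sixteenth-note steps = 2 bars
note_seq.sequence_proto_to_midi_file(drums, 'drum_beat.mid')
```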

So let's just hear a quick example of this in practice. First, you're going to hear a bass line that somebody made. [bass line plays] Now we're going to take that and turn it into a drum beat to accompany it, with Drumify. [drum beat plays] So that was just using the bass line — taking the onsets of the bass notes and extracting that rhythm — and then, with just a few clicks, you can create a drum beat to go along with it. That's just one example of the types of things we're working on. There's a lot more at g.co/magenta; we have many more projects, and everything is free, open-source, and easy to use, so please check that out if you're interested. But now we want to transition over and focus a bit on some of the creators who have actually started using this technology in their artistic practice. Specifically, we have two musicians here today that we're really excited about. The first is Claire Evans. She's one third of the band YACHT; she's also an artist in her own right and a very accomplished author. Her book is called Broad Band: The Untold Story of the Women Who Made the Internet, and I highly recommend it — I consider it required reading, so definitely check that out. But today she's not going to be talking about that; she's going to be talking about how her band has recently adopted some of these machine learning technologies in their process, and we can see where that's taken them as a band. So let's invite Claire up to the stage.

Thanks, guys. Hi, everyone. I'm Claire, obviously, and I'm here as a representative of my band. We're playing tonight on the main stage, so if you find this interesting, we're going to be playing a lot of the songs I'm talking about here today. For anyone who isn't familiar with YACHT, I'm going to start with a quick preface so you get a sense of who we are and where we're coming from before I start talking about how we got into machine learning. YACHT was founded in 2002 by my partner Jona Bechtolt, who's sitting over there. It was named after this kind of decrepit-looking sign he saw on the street in Portland, Oregon: YACHT — Young Americans Challenging High Technology. We have no idea what this business did; it is, frankly, ungoogleable — some things are; we've tried to find out many times over the years. But even though we've had a lot of different incarnations of the band — it's been 17 years of making music — we've kept this acronym, because it articulates something really core about who we are, which is that we want to stay in constant, engaged dialogue with technology. Obviously we're not Luddites, because I'm here, and we're not particularly adversarial either. It's "challenging" in the sense that we want to remain engaged, always, and we want to be aware of the push and pull between using tools and having our work be affected by the tools we use — the tools shaping our work. I want to say that I'm not a coder, like, at all — I understood maybe 20% of what Adam and Jesse were saying just now, and we've been working with them for three years. So our relationship to technology in the context of art-making has always been from the outside looking in. We're interested in tools — in getting access to interesting tools and then finding our own novel ways of using them, ways that are kind of sideways.

We like to force creative applications of non-creative technologies, both consumer-facing and non-consumer-facing, just to see what we can do with them and how they can be applied to our historically pretty DIY, punk rock operation. If we have one basic prime directive that dominates what we do, it's to do as much as possible with as little as possible, which is something we picked up from reading a lot of Buckminster Fuller and drawing the requisite analogies to our own background in decentralized punk rock communities in the Pacific Northwest.

I'm going to talk about a couple of projects we've done to give you a sense of who we are. A couple of years ago, we revealed new album cover artwork exclusively via fax. We did that by building a web application that identified fax machines near our fans — at FedEx or UPS, or their parents' offices — and sent the artwork directly to them with an editioned cover letter. A fax machine transmits information through sound, which is basically what music does, and we liked the idea of activating a dormant or latent technology as a way of showing what creative possibilities exist at the brink of obsolescence. In that same spirit, we recently wrapped a four-year project to reactivate a really dormant piece of technology in downtown Los Angeles: a public artwork from 1975 called the Triforium, which was originally designed as the world's first "polyphonoptic" instrument — an instrument that synchronized light and sound into an original new art form. The computer system it was built with in 1975 obviously was not up to snuff, so it had been broken for a very long time. But we got some money, we got an interdisciplinary team together, and we managed to bring the lights back using a custom-built LED installation. We also managed to salvage the original 8-bit paper-tape code that ran the original computer system, so it could be responsive to live musical input once again. So again, this is a cohabitation of old and new technologies; we love making use of things that are just waiting to be reborn and reimagined — and, again, with as few resources as possible.

We became really interested in machine learning about four years ago, because we felt like it was maybe the next step for us. We were interested both in the reflective qualities of machine learning — the way it could help us understand ourselves — and in its generative qualities — the ways it could help us make something entirely new. So I'm speaking to you now from basically the tail end of a years-long project of trying to find a machine-learning-driven compositional process that would work for our purposes — that would let us make music that wasn't just passable as human-generated music, but genuinely interesting and meaningful, and in line with our back catalogue of recordings. We didn't just want to make a record using machine learning; we wanted to make a YACHT record using machine learning, and that's a much different proposition.

I'll get into the nitty-gritty, because I figure that's the scene. This is basically what we used to make our record. We experimented with a bunch of different strategies, but we found that the best compositional tool for us was Magenta's MusicVAE model, which is a latent-space interpolation model that allowed us to find — I know this isn't a technical way of explaining it — melodies hidden in between songs from our own back catalogue. This is what the user-facing side of that model looked like when we were initially recording the record last May: it's a Colab notebook, so not exactly the kind of thing musicians are used to bringing into the studio. Unfortunately, we started doing this before Magenta made user-friendly Ableton Live plugins for musicians — but, you know, whatever, it gives us street cred, so I'm okay with it.

In order to work this way — to bring something like a Colab notebook into the studio — we had to do a lot of prep work. First, we manually annotated our entire back catalogue of music — that's 82 songs — into MIDI. Then we broke out all of the bass lines, vocal melodies, keyboard lines, and drum parts into four-bar loops. Then we ran pairs of those loops through the Colab notebook at different temperatures — sometimes dozens, if not hundreds, of times — to generate this massive body of melodic information that we could then use as source material for creating new songs.
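(For the technically curious: the heart of that Colab workflow can be sketched in a few lines with Magenta's Python MusicVAE API. This is a minimal sketch under assumptions — the config name, checkpoint path, and file names are placeholders, and since YACHT worked with four-bar loops, a longer-sequence checkpoint would match their process more closely.)

```python
# Minimal sketch of interpolating between two loops with MusicVAE.
# Config/checkpoint names and file paths are assumptions, not YACHT's setup.
import note_seq
from magenta.models.music_vae import configs
from magenta.models.music_vae.trained_model import TrainedModel

model = TrainedModel(
    configs.CONFIG_MAP['cat-mel_2bar_big'],          # assumed melody config
    batch_size=8,
    checkpoint_dir_or_path='cat-mel_2bar_big.ckpt')  # assumed checkpoint path

loop_a = note_seq.midi_file_to_note_sequence('song_a_loop.mid')
loop_b = note_seq.midi_file_to_note_sequence('song_b_loop.mid')

# Decode evenly spaced points on the latent-space line between the two loops:
# the "melodies hidden in between" two songs. Higher temperature means
# riskier, less predictable output.
in_betweens = model.interpolate(loop_a, loop_b, num_steps=8,
                                length=32, temperature=0.8)

for i, seq in enumerate(in_betweens):
    note_seq.sequence_proto_to_midi_file(seq, f'interp_{i:02d}.mid')
```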

When we had this massive amount of musical information, that's when the human process began. This is when we started manually combing through all of this MIDI data, trying to find interesting moments — things that spoke to us, things that felt interesting, things we wanted to explore further. As some of you might know, using machine learning to make a song with structure — with a beginning, middle, and end, with a verse-chorus-verse — is still a little bit out of reach. But that's a good thing: the melody was the model's job, but the arrangement and the performance were entirely our job.

To demonstrate what I mean a little more concretely, let's focus on a single melody from a single song. I'm going to play you a melody that came straight out of the MusicVAE model. It was one of several different MIDI sequences generated by an interpolation between two different YACHT songs — one called "Hologram," and one that has a swear word in the title, so I feel like I probably shouldn't say it out loud: "I Want to ... You Till I'm Dead." [melody plays]

Okay. This particular melody was, for us, exceptionally aesthetically interesting, but like every melody generated by the MusicVAE model, it's just an endless sequence of notes that goes on and on until it stops. It's not exactly pop material. So this is where the rules come in — and I don't mean technical rules; I mean human rules, working rules for our specific process. We have always thrived, like many artists, I believe, under self-imposed constraints, because when you sit down to write a pop song about anything in the world, it's overwhelming; but if you have some boundaries in place, you can begin to think about it more concretely. For us, we decided that every single song we were going to create with this process had to be interpolated from existing melodies from our back catalogue. We hoped this would result in songs that had that kind of indefinable YACHT feeling — which we don't know how to quantify, and I don't think the model can either — but those were the parameters we decided on. We also decided that we could not add any notes, we could not add any harmonies, and we could not jam or improvise or otherwise interpret or, essentially, be creative in any way. There was no additive alteration, only subtractive or transpositional changes. So we could assign any melody to any instrument — the melody we just heard could have been a keyboard line, a bass line, a guitar line, or a vocal melody; that was our decision to make. We could transpose melodies to our working key, and we could structure and cut up and collage as much as we wanted.

Now let's talk about lyrics, another important element of any song. For this project we collaborated with a different creative technologist, Ross Goodwin, who was a free agent when we started working with him — he's now with Google's Artists and Machine Intelligence group, which is something that happened kind of a lot during our process. We worked with Ross to create a lyrics model with sort of the same ethos as our melodic model: we wanted it to be reflective of our own inspiration, our own background, our own history, our own back catalogue. So the model we built with Ross was trained on a corpus of 20 megabytes of text — that's about 500,000 pages, or approximately 2 million words — all from bands that we consider to be our influences:

music we grew up listening to, music our parents listened to, our own music, our friends, collaborators, and peers. We saw this as an opportunity to teach the machine our values, our history, our community, and where we come from as artists.

The end result was this. This is one instance — one block of output — from the lyrics model Ross created with us, which we printed out on a single sheet of dot matrix printer paper (because — do they still make dot matrix printers? You can buy them on Amazon). We wanted to visualize it as something really physical, so we had this massive block of text, one continuous sheet, that we brought into the studio with us, and I literally sat down on the studio floor highlighting interesting passages. It's interesting because it contains a range of low- to high-temperature material. The low-temperature stuff — because it's taking fewer risks — is much more repetitive, much more simplistic; it's kind of a punk rock lyrics engine; it taps into the more elemental things in song. So there are entire pages and pages of repetitive phrases like "I want your brain," or "I want to rock," or "I love you" — that's the really low-temperature material. The high-temperature material is full of nonsensical run-on sentences and lots of really weird proper nouns and names of things. In order to make songs that had a range of emotional feeling, we combined a lot of low-temperature and high-temperature material in the same songs. As with the melodies, we didn't take anything as-is. We went through manually and combed for exceptionally interesting phrases or images or passages — things that spoke to us and felt meaningful to who we are and where we're coming from. The biggest influences on our working method with the text were really kind of low-tech — anti-technological, really: we were looking at William S. Burroughs' cut-up writing methods, and the Dadaists. High-tech/low-tech is kind of our modus operandi.

In order to actually make songs with this giant block of text and this giant pile of melodies, we had to take the interesting passages and place them on top of the melodies we had decided would make interesting vocal melodies. The problem is, the melodies generated by the VAE don't have any relationship to the human body — least of all to our human bodies, or to our competencies as performers and singers — and they certainly have no relationship to the internal rhythms of the English language. So a lot of the time we had to kind of break the lyrics on top of the melodies to force them into working, and that meant pulling apart syllables, pronouncing things in really weird ways, and doing things that were deeply unintuitive — which will certainly lead to a lot of people listening to this music and mishearing lyrics constantly. It's like a mondegreen maker.

So, here's a closer look at some of the lyrics we decided to work with. Here's a passage from a song: "I want your phone to my brain / I want you to call my name / I want you to do it too / Oh won't you come, won't you come, won't you work on my head / Be my number 9 / To be alive, to be with you / Like a weed, I can feel it in my head / Like a dog in bed, I know."
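(The "temperature" Claire keeps mentioning is just a scaling factor applied to the model's output probabilities before sampling. A small, self-contained illustration with made-up numbers:)

```python
# Temperature sampling in miniature: low temperature concentrates probability
# on the top choice (repetitive "I want to rock" territory); high temperature
# flattens it (weird run-ons and odd proper nouns). The logits are made up.
import numpy as np

def sample_with_temperature(logits, temperature, rng):
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    probs = np.exp(scaled - scaled.max())  # subtract max for numerical stability
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

rng = np.random.default_rng(0)
logits = [2.0, 1.0, 0.2]  # hypothetical scores for three next-word candidates

for t in (0.2, 1.0, 2.0):
    picks = [sample_with_temperature(logits, t, rng) for _ in range(1000)]
    print(f'temperature {t}: choice frequencies '
          f'{np.bincount(picks, minlength=3) / 1000}')
```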

Speaking as someone who normally writes songs that have a relationship to meaning or cadence, or are personal in some way — singing these kinds of lyrics really forced me to step outside of my embodied habits and develop a relationship to words first as sounds, and then to kind of grow to love and appreciate the meaning that comes after sound. That's pretty liberating. But the lyrics also contain these really strong, strange images that I would never have written, like "I can feel it in my head like a dog in bed." I mean, what a phrase. It has the form of idiomatic English, but the meaning is completely sideways — and yet it also still kind of means something, because I think we can all easily imagine a feeling that is as warm and willful and present as a dog that's sneaking into bed at night. That's the magic of working with this stuff: it really opens you up to new ways of thinking about language, of thinking about music, and of thinking about the interplay between those two things.

Okay, let's hear how these lyrics fit on that melody I just played you, which we determined at the outset would be a good vocal melody. [song plays: "My friend / I want you to do it too / Oh won't you come, won't you come, won't you work on my head / Be mine, number 9 / To be alive, to be with you / Like a weed, I can feel it in my head..."]

So, one of the most interesting and challenging things about working this way is actually performing the generative material. Like I said earlier, it's often far beyond our competencies, and sometimes things that sound simple — I mean, this sounds simple — are sideways from the embodied patterns of play, of singing and performance, that we're accustomed to. I can't tell you how many times we were in the studio just trying to nail some seemingly simple guitar line, but just because it was slightly different from what we were accustomed to doing, it was impossible. That happened a lot, and it was kind of brain-breakingly difficult in many moments. But at the same time, it often forcibly pushed us outside of our comfort zone — outside of the patterns we had fallen into, often patterns we hadn't even perceived were there to begin with — and forced us to play differently and think differently about how we work.

Okay, finally, I'm going to play you the first minute of the final song with everything together. You'll hear the first chorus, which has its own amazing made-up idiom, and then you'll hear the verse I played you before. Again: melodically, everything you'll hear was generated by the MusicVAE model, but the performance, arrangement, production, structure — everything else — is ours. [song plays]

This is what we see as a collaborative strategy. It's not so much about an artist being replaced by machine learning, but rather about being given the opportunity to focus our energies in different directions and different places than we're accustomed to. It's not about revoking control at all; it's not about letting go. It's about holding on, and letting the process change you. Obviously, this is just one way of working with machine learning to make music, and it's not even really the "right" way — I don't think there is one. There are tons of approaches, and many of them are going to be far more technical. Again, we are getting in where we fit in; we are engaging at the level we know how. But beyond the challenges it brings to workflow — because it's definitely not intuitive or fun to pull up a Colab notebook in the browser in the studio — the challenges are really satisfying and exciting, because they're the kinds of challenges that make you stop and consider what you're actually doing. For us, the process has been infuriating at times, but ultimately really deeply gratifying. The best way I can describe it is that it feels like you're doing a puzzle, and when you're done, the picture on the puzzle is not what's on the box — but who cares? Who cares what's on the box? Okay — there's more we can talk about during the panel. Thank you.

Yeah, so we're really grateful to YACHT for coming to us so early in this process, because, as you can see from their story, a lot of our tools were just in their early stages, and we got a lot of really useful feedback about how an artist would actually want to interact with different kinds of machine learning. This next project we're going to introduce shows, I feel, how we've come along in this process — to the point where these tools are available enough that someone can do a project in a much shorter amount of time. We've done a project with The Flaming Lips very specifically for I/O, so we're going to give you a sneak peek of things that are going to be happening at the concert tonight, and then we're going to have a nice discussion panel here talking about all these things. So we'll play a video.

[Video] Any time The Flaming Lips have stumbled upon a new little instrument, it's changed what we created. The goal of the Magenta team is to really explore the role of machine learning in creativity — in the creative process — to enable people to express themselves in new ways. Piano Genie is the great work of an intern we had, Chris Donahue. He designed an algorithm that takes piano playing and tries to reproduce it while hitting only a couple of buttons on a controller, and what comes out naturally sounds a lot more like a professional piano player. One of the things we really focus on, very differently from a lot of other machine learning projects, is how we can let people manipulate these algorithms.
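(An aside on what Piano Genie actually does: eight buttons in, 88 piano keys out, with a trained recurrent network choosing which key each button press should trigger given everything played so far. The snippet below is a deliberately toy, hypothetical stand-in for that interface — not Magenta's model, which is a trained LSTM available in magenta.js as PianoGenie.)

```python
# Toy stand-in for the Piano Genie interface: 8 buttons -> 88 keys, with a
# little state so higher buttons tend to produce higher notes. The real model
# replaces this heuristic with a trained RNN conditioned on playing history.
import numpy as np

class ToyPianoGenie:
    def __init__(self, seed=0):
        self.rng = np.random.default_rng(seed)
        self.last_key = 44  # start near the middle of the 88-key range

    def next(self, button: int) -> int:
        """Map a button press (0-7) to a piano key (0-87)."""
        target = round(button / 7 * 87)                     # button height -> register
        step = int(np.clip(target - self.last_key, -7, 7))  # move toward it gradually
        step += int(self.rng.integers(-1, 2))               # small random variation
        self.last_key = int(np.clip(self.last_key + step, 0, 87))
        return self.last_key

genie = ToyPianoGenie()
print([genie.next(b) for b in [0, 2, 4, 7, 7, 5, 3, 1]])  # rising/falling contour
```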

It's been really exciting collaborating with The Flaming Lips, because they're so creative in their approach to music that we just sort of showed them everything we have. We're hoping to create a new experience where the audience can co-create music with the band in real time using Piano Genie. "So we made an intelligent musical instrument and melody creator out of fruit, with Google. When I played the fruit, I'm touching it, and I didn't know exactly what it was going to do every time — each one is announcing what it is. So we worked with Google AI, and they sent us the software, called Piano Genie. You hit a note and it automatically plays music. And the more we play with it, the more it understands what it's playing against and who it's playing with. So you play a different rhythm or a different note, and this goes on and on — so it actually wrote a melody that we would not have written. Instead of the machine doing it for you, you're kind of encouraging the machine to do something. It's kind of cool. If you're a banana, that's pretty good for a banana." [End of video]

So, all right — mic, am I working there? Yeah? All right. So this is Wayne Coyne. Hello, everybody. We wanted to start this discussion because there's obviously a lot of hype about machine learning and artificial intelligence, and when you come to one of these projects, there can be a lot of preconceptions about what it's going to be like to interact with these things. Claire, I want to start with you: how were those preconceptions met, or what was different from what you expected?

Oh yeah — I'm not ashamed to say that at the outset we thought we could just push a button and make songs. We thought that's where we were at in terms of the technological development of AI — maybe that's what the mainstream hype makes us believe, that it's going to come for our jobs in this way that's so visceral. We really thought we could put all of our songs into a machine and it would give us a new YACHT song. We found out very quickly, of course, that that's not at all where we're at — which was really exciting, because it meant we got to be the humans in the loop, and we got to have way more control over the process than we initially believed we would. Initially we thought it would be about committing to whatever the machine made, and then we would have to play it and perform it and make it our own. But actually, we got to co-create with the models, and we had to take a much stronger hand and come up with rule sets and systems and processes that were uniquely our own. So even if you gave other musicians — or really anyone in this room — the exact same machine-generated lyric output and the same notation data, we'd all make different records, because it's really about the personal interpretation of what to do with all this source material. So I was pleasantly surprised at the lack of sophistication, I suppose.

Yeah. Wayne, does that vibe with your experience?

I would say, you know, our thing had so much momentum going. And I always feel like I have too many questions — you know, I have questions even to ask you. What was it, what was the word? Okay, so there's the light thing that's in Los Angeles, right? The Triforium.
Yeah, yeah. So what's the word — it says "polyphonoptic"? I know, I figured — you know, what is that? I mean,

"phonoptic" — is that sort of real? I mean, it's a word, okay, but it's a made-up word. It's everywhere — dozens of made-up words. I was like, what? Yeah, well, that's true.

So, from my experience — I didn't really have any idea what we were going to do. I think we were deciding what we were going to do based on what we did five minutes ago. Everything we've done starts to accelerate with ideas and energy and momentum, and I think that's why you guys wanted The Flaming Lips, you know — because it's like, they're good, they're going to do something. You talked about these self-imposed rules or whatever, and I think we were lucky that the only rule we had is: you have to do it now, you have to get going. And I love that. A lot of times, with time, you second-guess — is this any good? — and you go back and redo it. And sometimes, with that energy of do it, do it, do it, you make decisions, and you're twenty decisions in before you've had time to be too insecure about it. And we were very lucky, because we had you guys sitting there every step of the way. If something didn't work, we just blamed you and said, you know, you've got to fix this thing. To me, that's always the intimidating part of a new thing. We have a brand-new Volvo, and I don't quite know how to turn on the Sirius stations in the car yet. You're driving, you're trying not to kill anybody, and you're trying to get back to the last Beatles station you were on, and I always get intimidated that I'm going to just turn the car off or something. If I don't know how it works, I'm always afraid. But having you guys there, showing us how it worked — and then we kind of immediately want to go, oh, I want to do this, I want to do that — I think that's not a luxury very many people have, but I felt like it encouraged us to be, you know, as absurd as you guys thankfully allowed us to be.

I mean, the idea that this quickly went from there's a piano there with this little device to — thirty seconds later — there's a bowl of fruit there, and Steven's playing this amazing, classical kind of piece on it using bananas and strawberries and oranges. And when you're there and all of that's happening, it's exhilarating. And I think, too, for someone like me, the idea that we're not playing it on a keyboard, we're not playing it on a guitar — we're playing it on fruit — just takes it into this other realm. And some people say, well, that's what children would do: they'd say, can I play musical notes on an orange or a strawberry? And I'd be like: yeah. Yeah, yeah.

So even though we were there, right — so we could help get around any issues you came across — there is some level of unpredictability to these algorithms, and I think you've both experienced that in your work. I'm curious to hear, maybe starting with Claire: in your process, there were parts where you were giving up some agency, right — like, you were letting it write lyrics for you; and in your case, Wayne, you're going to be doing a live performance where you don't know exactly what it's going to play. So even though you're giving up a little bit of this agency, how do you keep control and make sure your artistic vision is still shining through on top of this?

Yeah — I mean, ultimately, nothing goes out into the world with our name on it that hasn't been fussed over for months on the computer in our living room. We have the ultimate agency, and the fact that we're even doing the project to begin with is a kind of aesthetic provocation that we've decided we want to make. But at the same time, I think what's being replaced here, in terms of our process, is the initial jam — the initial sound-gathering, the initial filling-up of notebooks with lyric ideas, that sort of generative moment. Instead of jamming in a room with each other, we're jamming in a server with a model, right?

And what we end up with is a mass of source material to work with, and in my mind, that's when the work begins — that's when the creative work begins. I'm a writer too, so I believe strongly that writing is editing. What you put on the page at the very beginning is just this thing that happens in a fugue state; it's a total mess, and it's useless until you actually do something with it. I think it's the same thing here. A notebook full of lyric ideas isn't a song; a jam you did in the studio that sounded really cool in the moment isn't a song. What you do with those things — how you bring them together, how you structure them, how you arrange, produce, and perform them, commit to them, and then perform them for possibly years to come — that's the song. So I don't really feel like we gave up anything; we just sped up or changed the nature of the process. Actually, I don't even think it was faster — I think it was slower. It took way longer to manually annotate our back catalogue into MIDI and to come up with a corpus of two million words of things we thought matched what was kicking around in our heads on some level. I think it was actually more tedious in many ways. So yeah — you decide what you give up, but you also really determine the parameters of how you do it, and then what you do with it afterwards is what really matters.

And that was more of a compositional process, where you have to deal with how much you're controlling and how much you're editing. So, Wayne — this thing is more of a performance, right? You have this different interaction, where it's: how much are you controlling what's happening, and how much is there a chance element in the performance?

I mean, the way we looked at it — and even the way we looked at you guys — is that we're accepting that we're bringing a collaborator in.

You know what I mean? And we do that all the time — and sometimes you regret it. We had a digital trumpet part on a song once. We really thought it was great, and we had a trumpet player come to the studio, and we said, yeah, just play whatever you want. And little by little, you know, we didn't like what he played — for whatever reason; it wasn't his fault, we were stuck on a certain idea. And by the end of it we said, can you play exactly what we played, only play it on a real trumpet? And after he left, we actually didn't even use his trumpet. But all of this is a process of saying: do I like this thing? How much do I care about it? How much does it matter to me? And you go through all these different reasons, and there's no real reason — your reason at the end of it is, I just like it; I don't have any deep meaning behind it. So for us — we went into it knowing we wanted to do this collaboration, and unlike with the trumpet player, everything we would touch — and I said this to you guys — we were starting with sounds that I really liked to begin with. And a lot of times I'll start to hear the sounds and I'm already starting to think of a song. I'm like, hey, play that again — you're going off on things, and I'm like, wait, go back to this other sound. To me, that's really all I'm ever doing: it's sparking something that says, oh, I can turn that into a song; I already know what I want to sing to that. And knowing that we're going to turn it into something we have to perform — we never consider that when we're thinking of what it's going to be. To me, those are just two different worlds. And knowing that we want the audience to be part of it — I think, as we went, our initial surge of an idea quickly turned into a way to make it something where, when we're there with an audience and they're doing this thing with us, they'll understand their contribution to the thing. We've all been to interactive things where you're not quite sure what you're doing to make it go — you walk in and you feel like you're supposed to move your arm or stand in place, and it does something, but you can't quite tell what. I think we've gone to great effort to say: when this thing is happening — good, bad, indifferent — you contributed to its growing, its dying, its boredom, its excitement. You contributed to it. And to me, that's already what we want, because that's where the fun is; that's where the energy comes from. We're not really saying we're going to create the greatest piece of music ever — we might! — but to me, that's not what concerts, or anything, are about. It's: we're all going to participate in this thing; we're going to create our own energy, our own time, our own happening.

So, you mentioned there's some unpredictability in the interaction with this thing — and a lot of music technology has always co-evolved with that unpredictability. Guitar amplifiers weren't meant to distort, but then people turned them up and found it sounds great. Or the 808 drum machine, which was originally supposed to be an accurate reproduction of a drum set — which it's not — but man, it sounds really good in electronic and hip-hop music. So with these new machine learning technologies — Claire, did you have any experiences where they failed in ways where the failure was interesting?

I mean, I think it's all about the failure. The minute it's perfect is when I stop being interested. Those moments where the melodic information deviates from anything a human being would do or perform, or those moments where a tool doesn't meet your expectations — or does what it's supposed to do, but completely differently — that's more interesting, like the 808, or the NSynth, by the way, which is an amazing neural synthesizer that Google makes. Those moments are the most interesting output, because they allow you to determine what your own taste is. I think taste is often a response to something, rather than something that comes directly from you: you see something, you like it or you don't, and that's how you determine who you are, what you want to be doing, and how you want to express yourself. And I think it's very interesting and helpful to have this sort of semi-neutral — I know saying that AI is neutral is kind of a loaded thing — but semi-neutral other party in the room that is not one of your bandmates, that proposes an idea. We can all agree with it, we can all disagree with it; no one's feelings are hurt if we dispense with the idea, because it didn't come from any of us. We can fall in love with it and move forward with it as a group, or not, and we can determine who we are, in terms of our relationship to each other, by agreeing or disagreeing about some machine-generated output. The NSynth, for example, is kind of like the 808 in many ways, because it's this tool that's supposed to do one thing — it's supposed to split the difference between lots of different sounds using latent space — and, in my opinion, it sort of fails at that. It doesn't convincingly split the difference between a car horn and a flute, because the sample rate is so low that it ends up sounding kind of reedy and wonky and weird.
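(Claire's description of NSynth — "splitting the difference between sounds using latent space" — can be sketched with the NSynth WaveNet encoder: encode two sounds, blend the encodings, synthesize the blend. This follows the magenta.models.nsynth demo API as I recall it; treat the exact signatures, checkpoint path, and input files as assumptions to verify.)

```python
# Sketch of NSynth-style latent interpolation: encode a car horn and a flute,
# average the encodings, and synthesize the in-between sound. API names follow
# the NSynth demo code from memory; verify signatures and paths before use.
from magenta.models.nsynth import utils
from magenta.models.nsynth.wavenet import fastgen

SR = 16000        # NSynth models run at 16 kHz
LENGTH = SR * 4   # four seconds of audio

horn = utils.load_audio('car_horn.wav', sample_length=LENGTH, sr=SR)  # assumed file
flute = utils.load_audio('flute.wav', sample_length=LENGTH, sr=SR)    # assumed file

ckpt = 'wavenet-ckpt/model.ckpt-200000'  # assumed checkpoint path
enc_horn = fastgen.encode(horn, ckpt, sample_length=LENGTH)
enc_flute = fastgen.encode(flute, ckpt, sample_length=LENGTH)

# "Split the difference": a 50/50 blend in the latent space.
blend = (enc_horn + enc_flute) / 2.0
fastgen.synthesize(blend, save_paths=['horn_flute_blend.wav'],
                   checkpoint_path=ckpt)
```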

Initially, we thought that was a failure — we thought it wasn't interesting. It wasn't until we started thinking about it like the 808 that we realized it was a tool with its own aesthetic — its own kind of weird, wonky, reedy sound — and it became really interesting to us. Now it's a huge part of our record, because it is, again, one of these objects that's both high-tech and low-tech at the same time: it sounds really lo-fi, but it takes millions of dollars of machinery and research to make that sound. That juxtaposition is pretty fascinating to us.

So, I think unfortunately we're out of time, but let's thank Claire and Wayne. All right, thank you — and they're both playing tonight, so check out the show.

2019-05-10 21:19


Comments:

do it!

Nice!

Being a drummer and programmer...I'm totally terrified :D

"Every word is made word." Wow Claire thanks for that clairty


Apple logic did this years ago

Great ! It looks really expert! Keep up the great work!

it's very intuitive

So you have your algorithm identifying things but it only can identify stuff its familiar with in familiar scenes. Whats missing if I see a pink sheep on a stair case my mind adapts but a computer will likely relate it to something it is familiar with and identify that there's a pile of flowers on a stair case. So in short AI is today good at deep learning with things it can get familiar with but is not good at adaptive learning like a human or even an ant colony can be. So where do you start well I would suggest a robot like ant colony where you challenge the ant colony with things unfamiliar to it to test new systems abilities to adapt or that sort of thing. Imagine that Ai in the future ran more of our lives but wasn't very adaptive this could lead to some small or big problems as new challenges arose. It also gets one thinking about the nature of consciousness is consciousness simply a mechanism that allows life to adapt to new challenges.

Almost switched to YouTube music from Spotify. The AI / Machine learning replaces the 1000s of playlists I have. YouTube is better at being the music plug

I find artists to be a little bit weird

that's my idea

The part that shows up as [INAUDIBLE] in the subtitles is "mondegreen".

Totally terrified but smiling after you say it. Can you expand on that?
