We're ready to get rolling with our next speaker, Mike Kent. Mike is an industry leader in the MIDI and musical instrument industries. He's a member of the Technical Standards Board of the MIDI Manufacturers Association, co-author of the USB MIDI specification, principal architect of MIDI-CI, and a contributor to the USB audio 1, 2, and 3 specifications. I know that's a lot of wonky stuff, but for those of you, especially my students, you know what that means. He had a 30-year career at Roland before starting his own consulting business. He's contributed components to MIDI for Windows and for Nintendo's developer SDK, and he's worked closely with Apple on MIDI and audio solutions. For many years he contributed to an audio processing system for a NASA event simulation center and to audio capture for an archiving network for the US Coast Guard. Most of Mike's work is now focused on the future expansion of MIDI, working as a consultant to Yamaha R&D. Mike owns 19 synthesizers and eight guitars. Please welcome Mike Kent.

Thank you. I'm nervous to know now that there are 10 people watching online; one of them is probably my mother. Okay. But before we start, I would like to take your picture. This is proof to my wife that sometimes people like to hear what I have to say. All right. Okay.

So the topic for the next presentation here is MIDI. MIDI has been with us, and we call it MIDI 1.0, because fundamentally it was invented 35 years ago and we still use the same language to control musical instruments today. How many people here use MIDI? Everybody; every hand went up. Okay, how many people are experts in MIDI? Okay, half a dozen hands went up. All right, excellent. That gives me some help in knowing how quickly to go through different sections here.

All right, so history. I'd like to talk a little bit about how we got here. The arrival of digital technology in the musical instrument business that I come from was 1977. Ralph Dyck was a real pioneer, just across the border in Vancouver, where I live. He worked out of Little Mountain Sound recording studios, he had a studio there, and he invented a sequencer that had a microprocessor in it. Mr. Kakehashi, the founder of Roland, met Ralph, and in 1977 they released the MC-8, which was an update to his sequencer; it was the first commercial musical instrument product with a microprocessor in it. So Ralph was a real pioneer. Another pioneer, right around the same time, was Dave Smith, who also happened to be building sequencers, and he released the Prophet-5, which was the first synthesizer with a microprocessor for storing memory.
It could store the whole patch of the synthesizer: really a breakthrough instrument. Digital interfaces started to happen, so instruments were connected to each other with digital connections, and this was all a breakthrough. In 1981 Roland had a system called DCB, Oberheim had their own system for connecting their synthesizers, and Sequential Circuits had a third one. Mr. Kakehashi of Roland went to Tom Oberheim and said, we're doing this thing, you're doing the same thing, but they don't talk to each other; how about we both agree to do the same thing? And Tom Oberheim went to his friend Dave Smith and said, what do you think of this, and Dave got really excited about it. They went and told the whole world: let's do this together. You'll see that that's a bit of a theme through what I have to say here about MIDI today. MIDI is a cooperative effort, where even competitors got together, agreed on something, and pushed the industry forward. Talking to other people that are in your industry is always a good thing; belonging to the GANG network is a good thing.

Out of that came MIDI 1.0. The protocol was almost basically DCB version two, the hardware came from Dave Smith and Sequential Circuits, and those two things came together and formed MIDI. I was going to tell you something here... I forget. Okay, it wasn't important, apparently.

So I thought I should go into the background of what MIDI does. It's a communication language where one device can talk to another device. I've got the pink synthesizer down here, and if I play a note, it sends out a message saying turn on that note, and another synthesizer will turn on the same note; and when I release my finger, there's a simple message that says stop playing that note. These are just note on and note off, and this is probably the most important MIDI message, at least for musical instruments. In addition to notes on and off, there are a couple of other types of messages that get sent. There's a message called program change that allows you simply to select which sound, or which function, a device is using. And then you've got controllers, and these are things like turning the volume up and down, bending your pitch up and down, or adding vibrato to a note.
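To make those message types concrete, here is a minimal sketch of what MIDI 1.0 channel messages look like as raw bytes. The byte values follow the MIDI 1.0 message layout; the send() function is just a stand-in for whatever output port a real program would use.

```python
# Minimal sketch of MIDI 1.0 channel messages as raw bytes.
# send() is a placeholder for a real MIDI output (a serial port, USB port, etc.).

def send(message: bytes) -> None:
    print(message.hex(" "))

def note_on(channel: int, note: int, velocity: int) -> bytes:
    # Status byte 0x9n: "note on" on channel n (0-15), then note number and velocity (0-127 each).
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(channel: int, note: int) -> bytes:
    # Status byte 0x8n: "note off" on channel n.
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0x40])

def program_change(channel: int, program: int) -> bytes:
    # Status byte 0xCn: select which sound (program) the receiver uses.
    return bytes([0xC0 | (channel & 0x0F), program & 0x7F])

def control_change(channel: int, controller: int, value: int) -> bytes:
    # Status byte 0xBn: controllers such as channel volume (CC 7) or modulation/vibrato (CC 1).
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

send(note_on(0, 60, 100))        # press middle C on channel 1
send(note_off(0, 60))            # release it
send(control_change(0, 7, 90))   # turn the channel volume down a little
```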
Note on and off, program change, and controllers form the fundamental types of control that MIDI communicates from one device to another.

MIDI has 16 channels. So if this pink synthesizer wants to talk to the green synthesizer, it can do so on channel 1, and the blue synthesizer will be set to receive on channel 2, and so on. When I set the pink synthesizer to channel 2, it'll switch from talking to the green one to talking to the blue one. So you have multiple channels on one cable: 16 of them.

These are digital signals that can be recorded with something called a sequencer. A sequencer is simply a MIDI recorder. Sequencers can record data, they can do all kinds of manipulation and editing on that data, and then play it back to the device. And when that sequencer is playing the device, it's actually a real-time performance: the sequencer is sending "play this note now, now stop playing that note." It's just playing back the commands it has recorded, or that you've edited. Similarly to the keyboard talking on different channels, sequencers can send on multiple channels, so a sequencer can send a bass line to one type of synthesizer, the drum notes to a different synthesizer, and chords and melody to other synthesizers, and there are 16 channels for doing that. So these are kind of the core ideas of MIDI.

That led to different types of instruments. If I'm using a system with a sequencer like this, maybe the blue and green synthesizers don't need all the keys on them; I can generate all the notes from my pink synthesizer there. And today we have plug-in synthesizers, which might be represented by these green and blue blocks up here. So the synthesizers don't all have keys on them anymore, but they did originally.

We've also been able to move beyond just the keyboard. MIDI does come from the keyboard industry, so the note on and note off model works really well for keyboards; it becomes a little bit more problematic elsewhere, but we've found, over the 35 years, solutions for using MIDI in other applications. Typically a guitar with six strings will send on seven MIDI channels: there's one MIDI channel per string, and then the seventh channel carries master volume and those kinds of controls. It becomes important, compared to a keyboard, to use multiple channels, because typically on a piano I can't play two versions of middle C; there's only one key and it's either up or down. But on a guitar I could have a C coming from one string, and up a few frets on the next string is the same C, so I need to do that on multiple channels. So we've found, over the years, ways to expand and use MIDI in new ways, to do new things beyond its original design.
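Pulling the channel and sequencer ideas above together, here is a tiny hypothetical playback sketch: a recorded sequence is just a list of timed messages on different channels, replayed in real time. The event format and the send() stand-in are invented for illustration.

```python
import time

# A recorded "sequence": (seconds from start, channel, note number, note on?).
# Channel 1 might carry a bass line and channel 10 the drums, for example.
sequence = [
    (0.0, 0, 36, True),   # bass note on
    (0.0, 9, 42, True),   # hi-hat on
    (0.5, 0, 36, False),  # bass note off
    (0.5, 9, 42, False),  # hi-hat off
]

def send(channel, note, is_on):
    status = (0x90 if is_on else 0x80) | channel
    print(f"{status:02X} {note:02X}")  # stand-in for a real MIDI output port

start = time.monotonic()
for when, channel, note, is_on in sequence:
    # Wait until each event's time, then send it: playback is a real-time performance.
    time.sleep(max(0.0, when - (time.monotonic() - start)))
    send(channel, note, is_on)
```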
Now, back to history; it's funny that Brian talked about history in his talk this morning, and I have the same kind of idea. This is how MIDI expansion happened. The first thing was that personal computers were coming out at around the same time that MIDI came out, and Roland had the MPU-401. It was a standard MIDI interface: a little box connected to an ISA card that went into an IBM PC or compatible, back when we used to talk about the PC as an IBM PC or compatible. It could also plug into an Apple or a Commodore 64 and add MIDI functions to those personal computers.

Then there was multimedia. Boy, did you know that computers could show pictures, and they could even be in color? Or they could create sound? And they could do both at one time, a picture and sound together? These were the early days of MIDI, in the late '80s and early '90s. Multimedia was a big part of driving where MIDI was used and what went into the MIDI specifications, and General MIDI was one of the extensions to MIDI that came along, which defined a default set of sounds. Here's the MPU-401, by the way: an MPU-401 connected to an ISA card, or a card for an Apple, or a little cartridge that went into a Commodore.
And beside it here is a Sound Blaster. This is what you had if you played games and you wanted sound: you had a Sound Blaster sound card, and it had a Yamaha FM chip on board, the chip that Brian was talking about today, using the FM technology from Stanford. There's a joystick connector port on there, and some of the pins on the joystick connector were the pins used for MIDI interfaces, so that was the standard way MIDI was done: off of a joystick connector. And so MIDI advanced and moved into personal computers and became part of multimedia because of products like this.

In 1997, Microsoft came to me and said: Mike, you know that MPU-401 thing that you guys set the de facto standard for, the one Creative Labs is putting into their Sound Blaster as an MPU-401-compatible MIDI interface? That's going to go away, because we're going to take away the joystick port. Other companies like MOTU and Opcode were using the parallel ports and serial ports; they said all of those are going away and USB is coming, so you'd better think about what to do with MIDI on USB. How is it going to work when the joystick port disappears? So technology forced us to adopt USB. I went to Roland and said, hey, we're going to have to write a USB MIDI spec, and they said, okay, why don't you go do that. So we worked with the USB Implementers Forum, and with Microsoft and Apple at the time, and Creative Labs, and a few other companies joined, and we wrote the USB MIDI spec, and it came out in 1999. It's hard for me to believe that I actually started this work about 20 years ago.

USB added some things to MIDI. First of all, it allows MIDI to run at much higher speeds. MIDI was 31.25 kbaud; we don't measure things in kbaud anymore. USB allows MIDI to run at hundreds of times the speed it was originally designed for, so that's great. It also goes beyond the 16 MIDI channels, because we have 16 virtual cables on the one USB cable, on what's called an endpoint, and that gives us, on one device, easily 256 MIDI channels. And if I instantiate another endpoint, I actually get another 256. I haven't seen a device that needed more than 256, but it's possible. So USB added a bunch of things to MIDI, and MIDI continued to advance and allow us to do new things.
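The USB MIDI 1.0 class specification carries MIDI in 32-bit event packets: the high nibble of the first byte is the virtual cable number (0-15) and the low nibble is a code index that classifies the message, followed by up to three MIDI bytes. Here's a minimal sketch of packing a note-on into such a packet; treat it as illustrative rather than a complete class-driver implementation.

```python
def usb_midi_event(cable: int, midi_bytes: bytes) -> bytes:
    # USB-MIDI 1.0 event packet: byte 0 = cable number (high nibble) + code index (low nibble),
    # bytes 1-3 = the MIDI message, zero-padded. For channel voice messages the code index
    # matches the high nibble of the status byte (0x9 for note on, 0x8 for note off, ...).
    cin = midi_bytes[0] >> 4
    padded = midi_bytes + bytes(3 - len(midi_bytes))
    return bytes([((cable & 0x0F) << 4) | cin]) + padded

# Note on, channel 1, middle C, on virtual cable 2 of the same USB endpoint:
packet = usb_midi_event(2, bytes([0x90, 60, 100]))
print(packet.hex(" "))  # 29 90 3c 64

# 16 virtual cables x 16 MIDI channels = 256 addressable channels per endpoint.
print(16 * 16)
```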
We can also run MIDI over a network using RTP, the Real-time Transport Protocol, which is built on top of UDP. That's another way of running high-speed MIDI, over long distances and in a networked environment. This is often used with something called MIDI Show Control; if you go to theme parks, a lot of things are run by MIDI using RTP-MIDI. If you've been to Universal Studios in Southern California, I assume they still have the Waterworld show, where the plane comes down and lands on the water and there's fire and smoke; all those systems are controlled by MIDI. The fountains at the Bellagio in Las Vegas are controlled by MIDI as well. So RTP-MIDI takes MIDI into new environments too.

More recently we have MPE, MIDI Polyphonic Expression. If you've seen the ROLI Seaboard, or the LinnStrument, as the instrument from Roger Linn is called, these use MPE, which allows more expressive music. If I play a chord, I can add vibrato to just one note of the chord, or a filter swell, or make one note swell up in volume compared to the others. So MPE adds more expressive control.

In fact, we've added a lot of things to MIDI since MIDI 1.0 came out in 1983. This is a list of the expansions we've made to MIDI. On this side is the Confirmation of Approvals list; I don't know, the name doesn't make sense to me, but these are new message specifications, new messages in MIDI. And on the far side we have other specifications called Recommended Practices, which take a set of messages and say, let's all agree to use them in a particular way to solve a particular need, like MIDI Show Control. So we've done a lot to expand MIDI. While MIDI is still the MIDI 1.0 language, and MIDI 1.0 is what we call it today, the MIDI 1.0 of 1983 is not what we have today; we've done a lot.

But we've added all of that within kind of a walled garden, and we're kind of pushing at the walls; there's not a lot more that we can do, for several reasons. First of all, users expect backwards compatibility. They want to use the new MIDI devices they buy today with devices they bought 10, 20, and 30 years ago, and they just expect MIDI to continue as it's done for 35 years: to mix old and new with good compatibility. And the MIDI specification is actually fairly tightly defined, in order to serve interoperability and compatibility, and it's hard, when you have a specification that is fixed, to expand it sometimes. Then there are common practices: we just know how to use MIDI, we like the way it works, and we know how to use it, so we don't want to change too much. Even as we get new ideas and the tech environment changes, we still have our ways of working that we like.
And really, fundamentally, MIDI has no opcodes left for new messages; we've exhausted all the opcodes, and some of the things we're doing to expand MIDI today are, well, I'd call them hacks. We have to hack MIDI in order to add things, and it's just becoming more difficult to expand.

But today MIDI works, in fact it works really well, and so we've kind of come to the edge of the garden, and the question today is: are we done? I don't think we are, but the fact is there's some resistance to going any further, because one, it's difficult, and hey, it really works. Everybody in the room here is using it. Anybody not like it? There are times I don't like it, but generally MIDI works really well. So do we need to change?

I'd like to take a fork in my presentation here for a moment and talk about the fork. The fork works really well. It's looked like this for about a hundred and fifty years. A couple of thousand years ago they looked like this; now, if they still looked like this, they'd be good for poking your little brother under the table, but not so great for putting food in your mouth compared to the modern fork. We've built other forks along the way; here's one with seven tines on it. Now, I'm sure that's really good for something, but I'm not going to use it to put food in my mouth. Okay, so there are other types of forks, and there are other things you can do in controlling musical instruments that MIDI maybe doesn't do. But for what MIDI does, it does it really well. Do we need to change this? This has been with us for 150 years. Can we make the handle longer? Well, now I have to eat from way out here. Make it shorter? It works the way it is. So do you change the fork?

MIDI works. Should we change it? Well, the fact is there are some reasons why we might have to change it, and there are some reasons we might want to change it, but we should change it in the right way. Okay, so there's a push for change. First of all, user expectations and desires: users vote with their dollars. You buy something because you like it, and therefore manufacturers sell you more of the same thing; whatever you buy, they're going to make more of that.
MIDI works well and people continue to buy MIDI products, so we continue to make more. People want backwards compatibility, so the manufacturers have to look at that and say: whatever we do in the future has got to work with the old stuff, because our customers vote with their dollars, and they're not going to buy something that breaks all their old gear. So user expectation is a huge factor to weigh as we look at change in technology.

Then there are small, one-man companies. These didn't exist in the earliest days of the technology; today one person can do so many cool things. I'm prototyping new MIDI products just on an Arduino board. There are so many great tools, and one-man companies can produce great products, whether it's an app on an iPhone or some small MIDI widget. One-man companies are really nimble; they can change the direction of the company very easily, very quickly. So they want change, and a lot of innovation comes from small companies. Big companies, on the other hand, like Yamaha, that I work for, and Apple, and Roland, that I used to work for: for those kinds of companies change is difficult, even though they're in the tech industry; major change is very costly.

So new technologies are all around us, and we want to take advantage of them, or sometimes we're forced to, just like when Microsoft came to me and said you'd better do something about MIDI because USB is coming and the joystick port is going away: we had no choice but to adapt. And to some extent we're in the same situation with MIDI today; we almost have no choice, because new technologies come along, and those feed back into user expectations. Every device is going to be on the Internet; they're all becoming connected devices. Your synthesizers today are generally not connected to the Internet, but that's changing, through a new thing called Web MIDI. So there is technology change around us that pushes us forward whether we like it or not. And then there are things that we want to do, like MPE, that make more expressive musical instruments so we can do more creative things. The technology moves ahead so that new creative things can happen as well, and so it's not just technology; we keep in mind that, fundamentally, this is about making music. MIDI is used for a lot of other things, like the fountains of the Bellagio, but fundamentally it's got to serve making music, and we can make music better if we have a better MIDI.

Okay, so one of the things we have to consider with MIDI is the value of the existing knowledge. We've done an awful lot to learn how MIDI works. Users have invested: they bought books in the past; there used to be a best-selling book called MIDI for Musicians, and they must have sold a lot of books. Users today go online and read, or watch YouTube videos, on how MIDI works. The press invested an awful lot over the years: magazines in the past, today of course mostly online.
But there's a whole mechanism for serving information, and there's this value of knowledge that exists around MIDI today. The dealers know about it; manufacturers educated all their dealers and said, this is what MIDI does. We have 35 years of this, almost a pedagogy, an existing knowledge base, that we've built up around MIDI, and we don't want to abandon all of that. You know, in the early days of MIDI, when I joined Roland, we used to ship a MIDI cable inside every box. People would buy a keyboard and say, what do I do with this cable? I only own one synthesizer; I guess I'd better buy another one so I can use this cable and connect them. That was an investment that musical instrument companies made in promoting and educating about MIDI, and it cost millions and millions of dollars, particularly over the first five years; it was just very expensive. So we want to use the existing resources of knowledge; we don't want to change completely to something where we have to throw all of that existing knowledge base away.

We want to innovate, but there's this thing, backwards compatibility. Backwards compatibility is a theme that comes up over and over. When we've asked users, in surveys we've done over the last few years from the MMA, what do you want out of MIDI, and what should we do to move ahead, people say: hey, do cool things, but please don't make it incompatible with my old gear. Users have time and money invested in their gear. There's comfort in the workflow, in how MIDI works, how sequencers work; all of those things we don't really want to change. And there's also this, which is different from many technologies: when a new GPS comes out, you throw away the old one because you like the new one, and when your iPhone does GPS, you probably don't use the GPS in your car anymore. We embrace new technology. But musical instruments are very personal to people: I've invested a lot of time in this musical instrument, I have an emotional connection with it, so don't make me throw that away; allow me to use it in a new environment. It's part of my creative being, of who I am. All right, so these are all factors we consider as we look at the future expansion of MIDI, or changing the fork.

So, two years ago I had this idea. Now, we've had a lot of proposals made over the years to do new things. Some of them were very MIDI-like, and some were kind of like the fork with the seven prongs on it: they do something really well, but it's a little bit different. There have been other proposals that came along, and some of those ended up in MIDI as Recommended Practices and so on, but some things have not worked out. I came up with an idea that I proposed two years ago, called MIDI Capability Inquiry, and I proposed it to the MIDI Manufacturers Association. Basically, the idea is this: MIDI was fundamentally designed as a one-way communication. I send you controls and I control you, from sender to receiver. What if MIDI were always a two-way communication? A USB cable, while it has MIDI data on it, guarantees that two-way communication can happen, whereas the MIDI cable was a one-way cable; you had to hook up two of them if you wanted two-way. So if we assume two-way communication, can we do new things that we didn't do before? And the idea is that if two MIDI devices agree to do something new, then they can do that new thing, even if it somewhat breaks the rules of MIDI 1.0.
Maybe you could do something new. You can agree to run at higher speeds than the 31.25 kbaud, if both units agree. You could agree to use new messages that didn't exist and couldn't be added to MIDI 1.0, if you both agree to break the rules and say, no, I'm going to add messages even though MIDI was out of messages. Okay, so this is the fundamental idea of MIDI Capability Inquiry.
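MIDI-CI itself is a set of SysEx transactions; without getting into the actual byte layout (which isn't reproduced here), the core inquire-and-agree idea might be sketched like this. The class, message names, and capability flags below are hypothetical placeholders, not the real MIDI-CI encoding.

```python
# Conceptual sketch of the MIDI-CI idea: two devices exchange inquiries over a
# bidirectional connection and only use a new feature if both sides agree.
# The names and capability strings here are invented, not the real encoding.

class Device:
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = set(capabilities)

    def inquire(self, other, feature):
        # "Do you support X?" -- only possible because the link is two-way.
        return feature in other.capabilities

    def negotiate(self, other, feature):
        if feature in self.capabilities and self.inquire(other, feature):
            print(f"{self.name} and {other.name} agree to use {feature}")
            return True
        print(f"{feature} not supported by both; staying with plain MIDI 1.0")
        return False

keyboard = Device("keyboard", {"higher-speed", "new-protocol"})
module = Device("sound module", {"higher-speed"})

keyboard.negotiate(module, "higher-speed")   # both agree: break the 31.25 kbaud rule
keyboard.negotiate(module, "new-protocol")   # module declines: keep MIDI 1.0
```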
I took that to the MMA, and Athan Billias of Yamaha was quite excited about it. Several other people in the MMA thought it was a cool idea, but Athan was really adamant about it, and just like Dave Smith got excited about the idea of the synthesizer companies all using one common standard, Athan got excited about this, and he convinced Yamaha to continue with it. I had just started a consulting company, so Yamaha hired me to actually continue developing these ideas; he convinced Ritchie Hasegawa, who was the head of technology at Yamaha at the time, to hire me. Ritchie Hasegawa also happened to be the chairman of AMEI, and I'll come back to AMEI in a minute. Athan went and told people: he visited ROLI and Moog, Steinberg, Arturia, Ableton, MOTU and, very importantly, Apple, Microsoft, and Google, and said we should all go and do these things, and tried to enlist support for the idea. And it's because those companies came along and said, yeah, we like that idea, that we were able to advance this in our industry. Some people said no. Dave Smith said, you know what, I really like MIDI the way it is now; it was his, okay, he was one of the inventors, and he won a Grammy Award for it, so maybe he has the right to say, I kind of like it the way it is. But, he said, go ahead and do your thing. And so we moved ahead with MIDI Capability Inquiry.

Yamaha, Roland, and Korg all built prototypes as proofs of concept as we were writing the spec; they made prototypes to make sure it works. AMEI is the Association of Musical Electronics Industry, in Japan; the MMA, in the United States, is the MIDI Manufacturers Association. The two of them jointly develop new MIDI specifications. So we took the specifications there: AMEI was on board immediately when Ritchie Hasegawa took it to them, and then AMEI proposed it back to the MMA two months later, and they said, yeah, let's go for it. The MMA formed a working group, and 35 companies joined the working group to write the specification. And finally MIDI-CI was adopted: in November by AMEI and this January by the MMA, so it's an official specification.
All right, so what is it? What does it do for me? It really is just laying down a framework, or maybe a gate out of this walled garden: it's the door that says here's how we're going to expand beyond the garden that was MIDI 1.0. It doesn't actually do anything by itself; it just shows how to do it. There are three P's in the current version of MIDI-CI: Profile Configuration, Property Exchange, and Protocol Negotiation. So I'm going to talk a little bit about the three P's of MIDI-CI.

Okay, the first one is Profile Configuration. Here are three different instruments from three companies. Here's a traditional B-3, which is now actually owned by Suzuki; if you buy a B-3 organ today from Hammond, it's got MIDI on it. This is a Yamaha drawbar organ plug-in. This is a Roland portable synthesizer with drawbars on it for controlling its organ sound. Three different instruments that do the same things: they have the same drawbars, and yet they don't talk to each other, because they all use different MIDI messages for the same function. The idea of a profile is to define that the eight-foot drawbar will always be controlled by this particular message. So we write a profile specification, and then MIDI-CI allows two instruments to talk to each other and say, do you support the drawbar organ profile? If the other device says yes, I support the drawbar organ profile, then they agree to use it, and now, when I move the drawbars on my B-3, they automatically move on my Yamaha B-3 plug-in, and I don't have to do mapping, which many of us have already learned how to do; suddenly these things just happen automatically.

I have an example that is even worse than these three instruments. Yamaha, who I work for a lot these days, has three organs: they have an organ in the Montage, they have that plug-in, and then they have the Reface YC, which is a little organ with drawbars on it. So Yamaha has three organs, and none of them use the same messages; even within one company they don't talk to each other. But through a profile, devices that have similar functions, like the nine drawbars, the percussion on/off, the chorus and the vibrato selections on an organ, can all agree to do them the same way. So this is the idea of a profile.

Here are the types of profiles that we expect to come. MIDI-CI does not define these profiles; it just says how profiles work. So we still need to write these specifications. I've already written the drawbar organ profile, and I've written the rotary speaker one to go along with it, and the effects profiles. There are three main categories of profile configuration, or types of profiles. Instrument profiles cover very typical instruments, and there may be, I'm guessing, maybe 30 profiles that would describe the common instruments, like orchestral strings; brass, other than trombone, because trombones have got the slide thing and the rest of them generally have valves, so there might be a brass profile and a trombone profile, or maybe they'll be the same thing, I don't know; nobody's written that one yet. Somebody who's an expert in that area needs to write that profile. We need Roland and Yamaha, because they make MIDI-controlled drum sets, to probably write the drum set profile. If you're an expert in one of these areas, why don't you write the profile for that area?
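To show what a profile buys you, here is a hypothetical sketch: if both devices report that they support a "drawbar organ" profile, a fixed controller map can be assumed instead of per-product mapping. The controller numbers and the profile check below are invented for illustration; the real drawbar organ profile defines its own message assignments.

```python
# Hypothetical illustration of profile configuration. The CC numbers here are made up;
# a real profile specification fixes its own assignments.

DRAWBAR_ORGAN_PROFILE = {
    # drawbar footage -> control change number (illustrative values only)
    "16'": 70, "5 1/3'": 71, "8'": 72, "4'": 73,
    "2 2/3'": 74, "2'": 75, "1 3/5'": 76, "1 1/3'": 77, "1'": 78,
}

def supports(device_profiles, profile_name):
    # MIDI-CI lets one device ask the other which profiles it supports.
    return profile_name in device_profiles

def send_drawbar(channel, footage, amount):
    cc = DRAWBAR_ORGAN_PROFILE[footage]
    print(f"B{channel:X} {cc:02X} {amount:02X}")  # stand-in for a real MIDI output

remote_profiles = {"drawbar organ", "rotary speaker"}

if supports(remote_profiles, "drawbar organ"):
    # No manual mapping needed: both ends already agree what each drawbar message means.
    send_drawbar(0, "8'", 0x7F)   # pull the 8' drawbar all the way out
    send_drawbar(0, "16'", 0x40)
```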
We also need effects profiles. There are probably a hundred effects out there: distortions, phasers, flangers, compressors, multiband compressors. There's just a whole lot of profiles that can be written, so that on a reverb unit, if I turn the reverb time on one, it also applies to another reverb somewhere else, so that I'm using a commonality of messages.

And things auto-map, and systems auto-configure. If all of your devices and all your plug-ins support profiles, your DAW could say, who are you, what do you do? A Yamaha piano could reply, I'm a piano that conforms to the piano profile, and the DAW could automatically pop up a piano editing window. Or maybe I've got a plug-in with an orchestral strings sound selected, and it says it conforms to the orchestral strings profile.
Then it'll give me an editor window with things about articulation in string sections, and when I switch the patch to a drawbar organ, suddenly that window goes away and a drawbar organ window comes up. So automatic configuration is one of the core ideas of this concept of profile configuration: making things easier for the user. I think that's really powerful, and it will really help workflow.

So, as I said, for users it means easier configuration. And for manufacturers, profile configuration is actually quite simple, because they're already designing musical instruments with all of these controls; they just happen to all choose different messages to control them. So profile configuration is not a big challenge for manufacturers, and it's all built around SysEx messages, which we've used for 35 years; we know how to do this. So this is an incremental step. It's not completely changing the fork; it's just refining the fork. It's a simple thing for us to do, and not too far away from what we already do today.

Property Exchange goes one step further than that. Property Exchange is a set of messages that can ask far more detailed questions, like: what parameter is controller 15 mapped to? Or: do you have an amplitude envelope generator with a decay? You do; what controller do you want me to send for that? So you can ask very detailed questions of a device. You can also get large data sets out of a device. One of the ideas with Property Exchange is that we'll probably use JSON, or some other such technology, rather than standard MIDI System Exclusive, which is really just a blob of data. JSON is human-readable, and it's very easy for us to then include things like text strings, so we can ask for a list of all your patch names out of a synthesizer and get a list of the patch names, and then some meta tags to go along with that. So Property Exchange will do that.

The core ideas of Property Exchange are the ability to get and set properties. Get and set are used in other technologies; they've just really never been done in MIDI before. There are a lot of other places, you know, HTTP has get-and-set-type mechanisms, and pings, and so on, so we're going to borrow some concepts from other industries; there's no reason to invent new things, and that's why one proposal is that the language for this would be JSON. It's machine-readable, and once again we're going to help the user with configuration. Devices can talk to each other and self-map, or self-configure; a DAW can configure itself to support very detailed features of a MIDI device, whether that's in hardware or in software. This does present some new fields for manufacturers; MIDI manufacturers may never have done JSON before. So there are some new things here, but it's not too far away from the technology that many of us are using today. So it's a reasonable step and a reasonable addition to MIDI, and it's going to do powerful new things and help workflow in particular.
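Property Exchange, then, is a get/set mechanism with structured payloads. The resource names and fields below are placeholders to show the shape of the idea; the actual Property Exchange messages are being defined by the MMA and AMEI and aren't reproduced here.

```python
import json

# Conceptual sketch of Property Exchange: an initiator "gets" or "sets" named properties
# on a responder, and the payload is structured, human-readable data rather than an
# opaque SysEx blob. Resource names and fields here are hypothetical.

device_properties = {
    "PatchList": [
        {"bank": 0, "program": 0, "name": "Grand Piano", "tags": ["acoustic", "keys"]},
        {"bank": 0, "program": 16, "name": "Drawbar Organ", "tags": ["organ"]},
    ],
    "ChannelVolume": 100,
}

def get_property(resource):
    # In the real mechanism this would be a SysEx request/reply transaction.
    return json.dumps(device_properties[resource])

def set_property(resource, payload):
    device_properties[resource] = json.loads(payload)

print(get_property("PatchList"))      # patch names with meta tags, readable as text
set_property("ChannelVolume", "64")   # set a property on the remote device
```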
The last area, the last P of the three P's, is Protocol Negotiation, and this allows devices to select the protocol they're going to use. This is potentially one of the biggest areas, because it's going to allow new creativity, not just ease of use and better configuration. The idea is the ability to negotiate the language you're going to use: either the MIDI 1.0 language, or the new MIDI protocol that is currently being defined, which we're simply calling the next-generation protocol at the moment. So devices can decide, and agree together, what language they're going to use to talk to each other. Therefore we can kind of break the rules of MIDI 1.0: the MIDI 1.0 language said things had to be done a certain way, and we can break some of those rules and do new things that you could not do with MIDI 1.0; we can do them in a next-generation protocol.
By the way, you also have the ability to switch to something completely manufacturer-specific. So if you want your device to generally do MIDI, but every now and then go away and do something that's not MIDI-like at all, you can go and do that, and then negotiate back to using MIDI again.

So protocol negotiation is part of MIDI Capability Inquiry. It comes with a test for compatibility: when you switch to the next-gen protocol, one of the two devices that are talking sends a test message. If the other one receives it and everything's good, it sends back a test message, and the first one receives that, and then the two devices know they're talking successfully in the next-gen protocol. If something doesn't work out, after a timeout both devices fall back to the MIDI 1.0 language, and so this preserves compatibility. If I take a new product and connect it to an old product, plug it in and say, can we use the next-gen protocol, by hitting a configure button or something like that, it talks to the old device, the old device doesn't know anything about it, and the new device times out and says, okay, we're going to use MIDI 1.0. That looks after backwards compatibility in this test mechanism. So that's protocol negotiation. Again, we don't define the new protocol in MIDI Capability Inquiry; we just tell you how to get there.

Okay, so let's take a look at what this next-gen protocol might look like. One of the things we've decided, as we're writing the next-generation protocol, is that it has to use the existing semantics and mechanisms of MIDI. Remember, as I told you, one of the fundamental ideas of MIDI 1.0 is: I press a note and it says start playing this note, and when I stop playing it says stop playing that note. We don't want to change that mechanism; we want to keep the mechanisms and architecture that MIDI has always had. So the next-generation protocol is really more MIDI, as you'll see on the bottom slide; it really is MIDI, but with more. It's incremental change; it doesn't change everything wholesale to something completely new. It's a step-by-step expansion, and there's room for more expansion later. In its first version the next-generation protocol looks an awful lot like MIDI 1.0, but there's a whole lot of room for expansion; whereas MIDI 1.0 was this walled garden, we've now moved out of that to a much larger space that's far more expandable.

Backwards compatibility is important, so in addition to the negotiation compatibility I told you about, easy translation from the new protocol to the MIDI 1.0 protocol is important. If I record a performance into a sequencer, and later I want to play it back on another device that is MIDI 1.0, I don't want that to be completely meaningless; I've got to be able to translate it and have something meaningful come out on the other device. So translation is an important part, a requirement, of this next-generation protocol.

I said it's more MIDI, so what is the "more"? We want higher resolution. MIDI is a seven-bit data language: everything is 8-bit words, and you only get to use seven bits of each for data, because the first bit is a status bit on every message. So we want high resolution; today, computers use 32-bit or 64-bit resolution for their data.
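The seven-bit limit comes from the top bit of every byte being reserved to mark status bytes, so a single data byte can only carry 0-127. Here is a small illustration of that limit and of naively scaling a 7-bit controller value up to a 32-bit range; the linear mapping is just for comparison, not the official next-generation conversion rule, which isn't public here.

```python
# Why MIDI 1.0 data is 7-bit: the top bit of every byte distinguishes status bytes
# (1xxxxxxx) from data bytes (0xxxxxxx), so data bytes can only carry values 0-127.

def is_status(byte):
    return byte & 0x80 != 0

print(is_status(0x90), is_status(0x3C))  # True (note-on status), False (data byte 60)

# A naive linear upscale of a 7-bit value to a 32-bit range, just to show the jump in
# resolution (this is an assumption for illustration, not the official conversion).
def upscale_7_to_32(value7):
    return round(value7 * (0xFFFFFFFF / 127))

print(0x7F, "->", hex(upscale_7_to_32(0x7F)))  # 127 -> 0xffffffff
print(1, "->", hex(upscale_7_to_32(1)))        # each 7-bit step spans ~33.8 million values
```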
So we want to be able to support up to 32-bit resolution. Thirty-two bits is probably overkill for a lot of properties or parameters, but we want to be able to support up to 32 bits. We want high resolution, and we want more channels, so we're going to provide more channels, and more controllers; I'll come back to more controllers on the next slide in a minute.

Something that MIDI doesn't have is per-note controllers. MPE gets around that, and MPE is actually a little bit of a hack in order to get per-note controllers, so we want to take that concept and make it native to the next-generation protocol. So we've added per-note controllers and per-note pitch bend, so that I can detune one note from the rest, or bend a pitch from this note to this note. Per-note pitch bend is a new addition. These last two are a little bit unique, in that the things at the top here, high resolution and so on, translate back very well to MIDI 1.0, but these bottom two probably don't translate back to MIDI 1.0.
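Per-note control is the part that doesn't map cleanly onto MIDI 1.0: a controller or pitch bend applies to one sounding note rather than a whole channel. Here is a hypothetical data-model sketch of the difference; it is not the actual next-generation message format, and MPE approximates the same effect today by giving each note its own channel.

```python
# Conceptual difference between channel-wide and per-note control.
# This is a data-model sketch, not the real next-generation message encoding.

class Voice:
    def __init__(self, note):
        self.note = note
        self.pitch_bend = 0.0   # in semitones, applying to this note only

voices = {60: Voice(60), 64: Voice(64), 67: Voice(67)}  # a C major chord

# MIDI 1.0 channel pitch bend: one value bends every note on the channel.
channel_bend = +0.5
for v in voices.values():
    v.pitch_bend = channel_bend

# Per-note pitch bend: detune just one note of the chord (MPE fakes this by
# putting each note on its own channel; natively it would be a per-note message).
voices[64].pitch_bend = -0.12

print({n: v.pitch_bend for n, v in voices.items()})
```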
So not everything is going to translate back. You can't do something completely new and have the old things do exactly the same thing; otherwise they'd be the same thing. So here is some new functionality.

We have articulation control. If you're scoring an orchestral piece, using a strings library in a plug-in, you're switching between bowed, pizzicato, and spiccato on your violins, and so on, and we want to have articulation control in your note messages and be able to do that. That's part of the protocol, so that we all start to do those things the same way. If I use somebody's string library and use their articulation controls, and then load up another string library today, the articulation controls don't necessarily match; they're done in different manners. So we want to standardize articulation control in this next-generation protocol.

Then there are expanded tuning capabilities. MIDI 1.0 was designed around a twelve-tone equal temperament scale, the Western scale that we would generally use in this country and in our Western music. But there's a lot of other music and a lot of other tunings, and MIDI does actually have something called MIDI scale tuning, where you can do non-Western scales. We're expanding on that in the next-generation protocol and allowing far more ways of using tuning. I'm really excited about this one. Just to get musical: how many of you are music students here? Okay, we've got a third of the room. Say I'm in the key of G and I play a D augmented chord; where is that D augmented chord leading, if I'm leading from it? If I look at the top note of that chord, the augmented note is an A sharp. If I tune that a little bit sharp, above equal temperament, it starts pulling even harder up to the B that it's kind of hinting at and moving towards, right? I want to be able to do that kind of thing. Or, in that same augmented chord, the D augmented in the key of G, if the next chord is going to be a C major seven, going from the five augmented to the four major seven, I'm going to take that D and maybe make it a little bit flat, and see if that helps pull towards that four major seven. I want to do those kinds of things and experiment with the tuning of my chords, and that's musical creativity that comes out of expanded tuning capabilities in the next-gen protocol. You can tell I'm really excited about those kinds of things.
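As a worked example of the kind of per-note tuning described here, this is the arithmetic for nudging one note a little sharp of twelve-tone equal temperament. Frequencies follow the standard 2^(cents/1200) relationship; the 15-cent offset is an arbitrary illustration, not a value from any specification.

```python
def note_freq(midi_note, cents=0.0):
    # Standard equal-temperament pitch, with an optional detune in cents.
    return 440.0 * 2 ** ((midi_note - 69) / 12 + cents / 1200)

a_sharp = 70  # A#4, the augmented note in the D augmented chord example
print(round(note_freq(a_sharp), 2))        # ~466.16 Hz in equal temperament
print(round(note_freq(a_sharp, +15), 2))   # ~470.22 Hz, pulled a little sharp toward B
```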
Simpler NRPNs and RPNs. Has anybody ever used NRPNs or RPNs in MIDI? No? Anybody know what they are? Cord probably does; Dave does. With NRPNs, we had run out of controllers; there were only 128 of them, and we ran out, and we said, okay, we need more. So we said: we'll use one control change message as the most significant part of a new address, and another control change as the least significant part of that address, so now we have a 14-bit address space. Then we'll send controller number 6 as seven bits worth of data, the MSB of the actual value that I want to set at that address, and controller number 38 is the LSB I'm going to use for that value, and then I'm going to send a clear message at the end of that to say I'm finished controlling that one thing. So I send five messages to accomplish one change. It's really complex, and basically it has not been implemented very well, even by companies who do know MIDI very well; it's not been implemented very much at all, and therefore none of you are using it, because your tools don't allow you to use it, because even the tool manufacturers say, man, that's ugly. So we want to fix that. NRPNs and RPNs become one atomic message in the next-generation protocol: it is just a message, a controller, with the index, or address, and the value, which is probably a 32-bit value. It translates back perfectly to RPNs and NRPNs in MIDI 1.0; they're unwieldy to use there, but they're really easy to use in the next-generation protocol. That's a big improvement over MIDI 1.0's structure for expanding its controllers. It goes back to the first slide where I said more controllers: it's actually not more controllers, it's the same amount; we're just taking the thirty-two thousand of them that were unusable and making them really usable.

We're also making an atomic message out of bank select and program change. Bank select is, again, three messages: there's an MSB of your bank, then an LSB of your bank, and then you send a program change to select your sound. With program change there were only 128 choices, and most synthesizers have a thousand or two, or ten thousand, sounds, so we needed to expand. MIDI didn't allow for that in program change, so we defined this bank select idea, where we're using control change messages to set an address. So it's a little bit unwieldy in MIDI 1.0; it works, but it's a little bit unwieldy. Fortunately a lot of DAWs look after that unwieldiness for us and give us just lists of patches and so on, and that's a lot easier, but we want to make all of that easier even for the DAW manufacturers. So a simpler bank and program change would be a requirement of this next-generation protocol.
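To make the NRPN complaint concrete: setting one non-registered parameter in MIDI 1.0 takes a run of control changes, CC 99/98 to select the 14-bit parameter number and CC 6/38 for the data (conventionally followed by a null so later data entry isn't misinterpreted). Here's that sequence next to the kind of single atomic message described above; the atomic form shown is only a conceptual stand-in, not the actual next-generation format.

```python
def cc(channel, controller, value):
    return bytes([0xB0 | channel, controller, value])

# MIDI 1.0: setting NRPN parameter 0x0123 to the 14-bit value 0x2000 takes a run of
# control changes (plus, conventionally, an RPN null afterwards via CC 101/100 = 127).
nrpn_number, value14 = 0x0123, 0x2000
sequence = [
    cc(0, 99, nrpn_number >> 7),    # NRPN select, MSB of the parameter number
    cc(0, 98, nrpn_number & 0x7F),  # NRPN select, LSB
    cc(0, 6,  value14 >> 7),        # data entry MSB
    cc(0, 38, value14 & 0x7F),      # data entry LSB
]
print([m.hex(" ") for m in sequence])

# The next-generation idea: one atomic "set controller" message carrying the full
# parameter index and a wide value. This tuple is a conceptual stand-in only.
atomic = ("assignable controller", 0, nrpn_number, 0x40000000)
print(atomic)
```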
The next thing is improved musical timing, and timestamps are something that can give us better musical timing. Just by adding higher speed, as on USB MIDI, we can improve timing. In MIDI 1.0, running at 31.25 kbaud, if I want to play ten notes simultaneously, all on the downbeat, I send the first one, then the next one, then the next one, and the last one arrives several milliseconds after the first one. So I get this jitter: when I try to play everything simultaneously, it doesn't come out simultaneously, because it's a slow serial port. If I run it a lot faster, the last note arrives much closer to the first one, so just by running faster we get better timing out of MIDI. But timestamps allow us to capture a performance with high resolution. Say I've got a great performer, Oscar Peterson on the piano; I used to work with Oscar, I was his tech support guy. He'd record his playing on the piano into sequencers, and we have captures of him playing, but it's all got this time skew. He had massive hands; he'd put his hands down and cover four octaves on the piano, and he'd play these massive chords, and all those notes, when he played them, happened simultaneously, but the recording didn't necessarily get it that way, because over MIDI some of those notes were skewed later than the first note. With timestamps, the piano can say: these things all happened at the same time, and send every one of those notes out with a timestamp that says they all happened together. And when you play it back from the sequencer, the sequencer can say to the rendering engine: these things all happen together, so play them together. So we're going to get better musical timing out of this next-generation protocol through the addition of timestamps.
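The jitter argument is easy to put numbers on: at 31.25 kbaud each byte takes 10 bit-times (a start bit, 8 data bits, and a stop bit), so a 3-byte note-on takes roughly a millisecond, and a 10-note chord sent back to back is smeared across something like 9-10 ms. The arithmetic below is a simplification that ignores running status.

```python
BAUD = 31250
BITS_PER_BYTE = 10          # start bit + 8 data bits + stop bit on a 5-pin DIN link

def transmit_ms(num_bytes):
    return num_bytes * BITS_PER_BYTE / BAUD * 1000

print(round(transmit_ms(3), 2))        # one 3-byte note-on: ~0.96 ms
print(round(transmit_ms(3 * 10), 2))   # a 10-note chord sent back to back: ~9.6 ms of skew

# At, say, a hundred times the speed the same chord spans well under 0.1 ms, and
# timestamps let the sender mark that all ten notes belong on the same instant anyway.
print(round(transmit_ms(3 * 10) / 100, 3))
```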
I can't really tell you a lot more details than that, because when it comes to writing these specifications, the work is done within the MIDI Manufacturers Association, and these things are actually confidential; the actual design of this, and how it's all done, is confidential until the specification is released. If you'd like to participate, you can join and participate, but then you sign a confidentiality agreement. So this is about all I can tell you about the next-generation protocol without getting in trouble, and I know some people watching online right now might even be saying, I think you're sharing too much already.

All right, so to review: MIDI-CI has the three P's. Profile Configuration allows me to auto-configure various instrument types or effects types. Property Exchange is a get-and-set mechanism, to dig deeper into details or to get large data sets out of instruments. And then there's Protocol Negotiation.

Now I'd like to cover these three a little bit in terms of backwards compatibility. For Profile Configuration: if two devices talk and the device on the other end says I don't support any profiles, it doesn't break anything; MIDI 1.0 still continues to work just fine, the way it always has. So profiles are just something we add, that devices support or don't, and if they don't, MIDI continues to work the way it always has. Same with Property Exchange: if a device doesn't support it, it doesn't support it, and you just continue to use MIDI 1.0 the way it was. Protocol Negotiation is a little bit different, in that we have a test for the new next-generation protocol. Two devices agree to use it, you switch, and then you do a test; if the test fails, or it doesn't get completed within a certain timeout, then the devices both revert back to MIDI 1.0, and they have an agreement that they're going to do so. So they revert to MIDI 1.0 and you continue to use MIDI the way you always did. While we advance and step outside the gate, we have ways to say, no, this thing belongs back inside the wall, and we look after backwards compatibility.

I mentioned this as well: down at the bottom here you'll see the MIDI 1.0 protocol and the next-generation protocol, and translation between these two has to be simple. The same concepts of note on and note off need to exist in order to make easy translation. So the next-generation protocol, as I said, looks a lot like MIDI 1.0; it's just more of the same.
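Here is a rough sketch of the fallback behavior just described: try the new protocol, run the test exchange, and revert to MIDI 1.0 if a valid reply doesn't arrive before a timeout. The function names and the timeout value are illustrative only; the real negotiation and test messages are defined in the confidential draft specification.

```python
import time

# Illustrative only: shows the agree / test / fall-back shape, not the real MIDI-CI messages.

def negotiate_protocol(send_test, reply_received, timeout_s=0.3):
    send_test()                      # initiator sends the new-protocol test message
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if reply_received():         # responder echoed the test: both sides are happy
            return "next-generation protocol"
        time.sleep(0.01)
    return "MIDI 1.0"                # no valid reply in time: both ends revert

# A device that knows nothing about MIDI-CI never answers, so we stay on MIDI 1.0:
print(negotiate_protocol(send_test=lambda: None, reply_received=lambda: False))
# A device that answers lets us switch:
print(negotiate_protocol(send_test=lambda: None, reply_received=lambda: True))
```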
So I asked a question earlier: what's next? What's next, really, is just more MIDI. It's not something completely new; it's an incremental step. MIDI-CI allows us to step outside the walls we've had so far and start to do new things.

So what do we need to do? What are the actual action items going on right now? First of all, Profile Configuration: we're defining profile specifications for common instrument types. If you're an expert in an area (Cord, I'm talking to you right now about profiles for tuning), maybe you should write a tuning profile specification. Property Exchange: we need to define properties and then standard applications, so we're defining applications for getting patch lists, with text and meta tags and so on, and we're defining those kinds of things in AMEI and the MMA. And we're defining the new protocol specification. In fact, we're pretty much done with the new protocol specification; in the past couple of weeks, in the MMA and AMEI, we agreed that we're basically done, and now we're going to do proof-of-concept prototypes. We're going to build some products that implement the new protocol and test it for the next six months, before we vote and say, yes, it works, and release it as a spec; or, after six months, find out that everything we invented is terrible, that we can't prototype it, and show that we shouldn't do it that way. So that's where we're going.

So what can you do? Join the MIDI Association; it's free. Go to midi.org and join the MIDI Association, and there are discussion forums there where you can discuss all of these ideas. If you join, you'll get emails every now and then telling you about the latest things happening in MIDI; you'll find out when new profiles are released, you'll find out when things are happening. If you actually want to define specifications, then you join the MIDI Manufacturers Association, also found at midi.org. So there are things that you can do. Everybody in the room should at least join the MIDI Association; if you use MIDI, you should join. It's a lot of fun and it's free. All right, that's it. Any questions?

Questions? I'm going to go to Brian Schmidt. That was a great talk. Looking at your presentation about the protocol and the parameter sets, it seems like a very similar problem to the one the Internet of Things has; you could easily see configuration profiles for door locks, profiles for refrigerators or thermostats. Are there any other protocols out there trying to solve this similar problem that you're aware of, that you might think about incorporating into this? Or is there, like, a secret plan to have MIDI-CI take over the Internet of Things?

To take over the world? AMEI did a presentation at the NAMM show where they showed MIDI-controlled robots, and a MIDI-controlled toaster, which was a bit of a joke, but just the idea of appliances being controlled by MIDI, like the fountains at the Bellagio are controlled by MIDI. MIDI is a pretty simple language; it's not expensive, it's not heavy, and there are applications and reasons you might want to use it. So one of the things we want to do with profiles is expand MIDI to non-musical-instrument control. For example, if you saw the Intel keynote at the Consumer Electronics Show, they had a performer on stage playing a piano, and then they had AI listen to what he played and play back music in the same style and so on, and they had these huge, like fifty-foot-tall, graphic performers being controlled by MIDI. So imagine a holographic performer sitting at a piano, and that performer is controlled by MIDI.
That allows the holographic performer to be on the same data stream as the music, where the clock messages for the beat timing and so on are. So in fact our interest in profiles is partly to take MIDI to other things. But we don't want to ignore what's happening elsewhere, which is why we're looking at JSON for part of Property Exchange; JSON is a perfect way to get and set properties. And how are we going to store this stuff? Cord asked me this earlier today: what about storing all of this new information? There's the Standard MIDI File, a file format for storing MIDI, but maybe, if we're doing new things that include JSON and other new types of messages, we should be looking at whether XML is a better way to store MIDI in the future. So AMEI and the MMA definitely want to be outward-looking, and we don't want to invent things that already exist. A lot of this stuff is just driven by dollars, and it's a lot cheaper to borrow an idea that's already done elsewhere.

I'm really curious: what do you think about Open Sound Control? Open Sound Control is really cool. It's a little bit like a slightly different fork; it's maybe the dessert fork of forks, you know, different from the standard fork. The problem with Open Sound Control, if there's a problem, is that it's a little bit unlike MIDI, and therefore it's a step further away: it's harder to translate it to MIDI, and it's harder for manufacturers to build products using all their existing investment; all the testing tools they have for MIDI don't necessarily apply to OSC. So OSC has been somewhat unsuccessful because it's a little bit too far away from MIDI. A couple of times I've mentioned that when I wrote the USB MIDI spec, I went and visited MOTU, I visited Opcode, I went to visit MIDIMAN, which became M-Audio, and said, hey, this USB thing is coming, you're going to be implementing this, and this is what I'm doing about it. OSC was not really well presented to manufacturers. Same thing with MIDI-CI: it was Athan Billias who went and presented it to all of the manufacturers that count; I went with him at times, and Ben Israel from Yamaha was active in this, and the AMEI companies were all active in this, and various people started to spread the news. OSC just really wasn't sold well to manufacturers, and so it hasn't caught on with the core musical instrument companies. It's for two reasons: yeah, it's a bit far away from MIDI, and it wasn't sold well. Thanks.

What is the state of audio-to-MIDI? You know, with the ability of people to play instruments and have the audio then interpreted as MIDI data? Sure. Really, audio-to-MIDI has been done for a long time; for many years guitar controllers have done audio-to-MIDI, and there are newer technologies in the last five or ten years that have come about that can actually do it from a polyphonic source, or listen to a WAV file and figure out at least the chords that are being played. So some of that is happening, and I think the new next-generation protocol will help us in that regard, in that it allows us to define new messages that possibly are tied to that. One guy has proposed a perceptual pitch profile.
On many instruments, when they start making a sound, at the very beginning it's just noise; there's no pitch, and as time goes on, over a number of milliseconds or whatever, a pitch becomes more and more clear, less and less random, until a well-defined note comes out with all of its harmonics above it. He wants to write a profile for defining that kind of transition, from a note that has no pitch to a note that has pitch, and I think profiles are going to be great helpers to do that and to allow us to do those things in a common way. It's great for one manufacturer to innovate and do something, and some manufacturers' businesses are built around "I'm doing it my own way and nobody else can share with me." But MIDI was successful because Oberheim was willing to give up their system, and Sequential Circuits and Roland were willing to give up their systems, to say, no, let's use a common one, and that became the bigger business. There is a business model for doing it a different way, so I'm not saying companies shouldn't do that, but by and large the business is going to happen when we agree to do things the same way, and I think profiles will be a big part of that.
So I see audio-to-MIDI moving ahead and becoming more and more viable as we do it all the same way, learn from each other, and compete with each other. So we're going to take one more question here, and then I think we're going to move on.

Hey, my name is Ben. Great talk. You mentioned the progression of, sort of, the MIDI fork from the 5-pin DIN connector to the USB connection; I wanted to know what your thoughts are on wireless MIDI connections, since it's zeros and ones, and the feasibility of that. Right. So RTP-MIDI is already doing wireless; you can do wireless MIDI over RTP, and in fact I think macOS and iOS both support RTP-MIDI wirelessly. And there's also a new specification that we put out a year and a half ago now, BLE-MIDI, Bluetooth MIDI, and that works really quite well, so you'll find it in a bunch of products now.

Wireless always has a couple of problems. One is lost packets, so you need a protocol that's going to look after that, and one way you look after that is by adding latency. And wireless generally is high latency. While a 5-pin DIN cable can get a single byte across in about a third of a millisecond, on Bluetooth LE, to send one message, it depends on the service interval of the Bluetooth devices you're using, and I believe on a Mac, like my MacBook Pro here, the service interval is, I think, 11.5 milliseconds, so my average latency is going to be on the order of 11.5 milliseconds before something happens. So wireless adds latency on top of the latency that's already there. If I'm playing a keyboard, from the time I press a key to the time it actually gets put on a bus, whether it's 5-pin DIN or Bluetooth or RTP or whatever, there's some latency in the keyboard to actually make that happen; and then it arrives in a synthesizer, or maybe it arrives into an OS, and Core MIDI gets the message to a driver, and USB has its latency, and so on. There's always latency somewhere in the chain.
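A rough back-of-the-envelope comparison of those numbers: a DIN byte takes about a third of a millisecond, while over Bluetooth LE a message can't leave until the next connection event, so with an interval of roughly 11.5 ms the radio alone adds somewhere between almost nothing and the full interval, before the instrument's own processing latency is counted. This is a simplified model, not a measurement.

```python
# Back-of-the-envelope latency comparison (simplified model, not a measurement).
DIN_BAUD = 31250
din_byte_ms = 10 / DIN_BAUD * 1000                        # ~0.32 ms per byte on 5-pin DIN
print(round(din_byte_ms, 2), round(3 * din_byte_ms, 2))   # one byte, one 3-byte message

# Over Bluetooth LE a message waits for the next connection event, so with an
# ~11.5 ms connection interval the radio adds between ~0 and the full interval,
# on top of the keyboard's and receiver's own latency.
ble_interval_ms = 11.5
print(ble_interval_ms / 2, ble_interval_ms)               # rough average wait, worst case
```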
And the wireless latency is added on top of the latencies that already exist in the system. Really, the wireless technologies that exist today are great for things where real-time performance is not critical. Where real-time performance does matter, like playing a keyboard over Bluetooth, we're really just on the verge there: it's not completely friendly and playable yet, but it does work, and for most people it's good. A great player can tell the difference between a USB MIDI connection and a Bluetooth connection, because the latency is there.

Mike, thank you so much. Another round of applause for Mike Kent: fantastic talk.