Shai Machnes: Machine learning for quantum computing
Hi everyone, and welcome to a new interview. In today's episode, which we recorded in the early half of July 2023, I'm talking to Shai Machnes. Shai is the CEO of qruise, and we're going to partially focus on the work they're doing at that company. We're also going to focus on the intersection between quantum computing and machine learning, which is where Shai is focusing his research, and we're going to talk about upcoming technologies on the machine learning side and how they extend to quantum computing architectures. So let's get started with that discussion.

Hey Shai, thanks for joining.

My pleasure.

Today I figured we could kick this discussion off by talking about a concept that you've articulated in the past, namely an "AI physicist". You're working at the intersection, I guess you could
say, between machine learning and quantum computing, and there's a concept that you've — not necessarily coined, but at least stated — about AI physicists. I'd just like to pick your brain a little on what you think about when you say that concept, and what the long-term ambition is for how we can integrate machine learning with quantum computing in general.

Right. So you can do it in one of two ways at the moment. Well, at the moment there is one way.
There'll probably be two others going forward. Right now quantum computers aren't very good, so what you can do is use machine learning to help improve and design quantum computers, which is what our company does. Once we get better quantum computers, one can use quantum computers to do machine learning, and one of the machine learning things you can do on future quantum computers is improve quantum computers. So if today, to do what qruise does, you need to classically simulate quantum devices, in the future you could do the simulation on another quantum device. It would be a bootstrapping process, just like people are now using output from ChatGPT to train other machine learning models. But right now quantum computers aren't very useful yet. I would categorize them as engineering prototypes, meaning they're good enough for physicists and engineers to learn how to build quantum computers, how to use them, et cetera, and maybe even for the wider pool of potential users. But you can't run anything useful on a quantum computer that you wouldn't rather run on a classical computer. Not yet. So for now, doing classical machine learning to help improve quantum computers is what we do — and also other types of devices like quantum sensors; it's a pretty general field.

Before we focus on that in more detail, let's maybe talk about the other approach first. You mentioned this sort of cyclical behavior and how you can make these enhancements. Is the key reason why you would want to use quantum computers to then enhance your quantum computers that you have a much more natural overlap in terms of the physics associated with them?

Yes. What quantum computers will do best is simulate quantum physics, because the memory requirements for simulating quantum physics are exponential in the number of qubits, or degrees of freedom, of the quantum system
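As a quick aside, that exponential scaling is easy to see with a back-of-the-envelope calculation. The sketch below (plain Python, with the usual assumption of one complex128 amplitude per basis state) estimates the memory a classical computer would need just to store the state vector of an n-qubit system:

```python
# Memory needed to store the full state vector of an n-qubit system on a
# classical computer: 2**n complex amplitudes at 16 bytes each (complex128).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.6g} GiB")
```

Ten qubits fit in a few kilobytes, 30 qubits already need 16 GiB, and 50 qubits would need around 16 million GiB of memory, which is why classical simulation runs out of steam so quickly.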
— which is partly why, for example, we don't fully understand high-temperature superconductors, and we don't have room-temperature superconductors yet. These are deeply quantum materials, and even if we can write down the equations for one atom and one electron and how they interact, in these materials you have a large number of electrons interacting with a large number of atoms, and we simply can't solve those equations — not analytically, but not even numerically. Quantum computers should be able to do that pretty directly, because it's a natural fit. But, as I mentioned, not quite yet.

Right. Besides the specific materials that quantum computers might help simulate, are there general — I don't want to get into the details of qruise yet, but this sort of comes into its domain — when you think about optimizing a quantum system, a lot of the development that needs to happen, as I understand it, especially with superconducting circuits, is that you want to construct better models of your quantum device. Is the bottleneck there right now that you're not able to simulate quantum-like systems well enough, or is it some other bottleneck? I'm wondering to what extent having quantum computers in this space actually helps resolve those kinds of problems.

So the problems we're having with quantum — they are about quantum, but you can also contextualize them in a bigger framework. When you're developing physical devices that are really at the cutting edge, you don't fully understand what's going on in there. There is something called the forward problem in engineering, which is creating an accurate simulation of something, so that if you did your simulation right and you didn't mess up the construction, the device will behave as indicated by the simulation. And there are fields of engineering where this works great
— if you design a house in the appropriate software, and you compute the stresses and the loads and everything, and you don't cheat on the materials but actually build it, it'll be stable just like the simulation predicts. There are other fields — say, electric motors — where the technology is quite well understood, so I can simulate an electric motor in detail, and the performance the simulation indicates very likely will be the performance in practice. But there are technologies where this is not the case, for example quantum technologies — but not only. Take silicon photonics: there are chips with photonic pathways, fabricated with generally the same lithographic approaches you would use to make chips, but humanity in 2023 cannot control the optical length of these pathways to sufficient accuracy. For the system to behave the way we want it to, we need to control the lengths of these pathways to within a small fraction of a wavelength, and we don't know how to do that yet. So you fabricate the device, but the phase difference between the two pathways is essentially random, because we don't have sufficiently good manufacturing control. Therefore, once the device comes out of manufacturing, you need — per device — to actually measure what's going on and make corrections. In quantum it's partially that. With superconducting qubits, at the bottom of the qubit is a very small, super-sensitive component based on quantum tunneling, known as the Josephson junction. Quantum tunneling has exponential sensitivity to changes in distance and potential and other things, so a few atoms move to the left or to the right, and now this thing behaves differently. We don't know how to make things with atomic accuracy — humanity hasn't
figured that out yet. So we make qubits, and they don't come out exactly the way we planned. On top of that you have other issues: you control these qubits with electromagnetic fields coming from boxes of very high-end electronics that can manipulate fields at the nanosecond timescale, but nothing's perfect, so on these controls you get, say, 12 bits of accuracy in the voltage, and the 13th bit is random-ish. It's actually even worse, because the way it's random is problematic, and you need to start accounting for it — measuring exactly how much noise you have and how much crosstalk you have between all the wires. So there's a gap between what you design and what you are able to fabricate, and that's what's limiting the performance of quantum computers. We need tools to, in some sense, reverse-engineer the devices we ourselves are building: we plan them to some accuracy, we fabricate them to the best knowledge of humanity today, and yet the gap between what we actually get and what we planned is the pain point for the performance of the device. Then you need tools to understand in great detail what's going on. For a certain quantum operation you could easily have between five and ten different sources of noise simultaneously, and being able to understand which of these is actually the pain point, which is actually the limiting factor, tells you what to focus on when you're developing the next iteration of hardware. This whole thing is really, really hard, but it's not limited to quantum. In quantum computing the equation that describes the dynamics of the system is the Schrödinger equation, but in other systems it could be electromagnetic waves, or the propagation of light in matter, which is an electromagnetic equation but in a
different frequency range, or other things. You could have phonons, which are essentially slight vibrations that also affect things. And sometimes it's really silly things, like the 50 hertz of the electrical signal from the wall somehow weakly propagating through all the devices and all the electronics. It's tiny — 0.1 percent or whatever — but when you're trying to get to four and five nines of accuracy, when we're trying to get a quantum operation to be 99.999% accurate, you get a tenth of a thousandth of noise from somewhere, and that's what the field is wrestling with. Everybody's wrestling with it in different hardware, with different attempts to get there.

One thing I want to linger on for a second that struck me: you talked about Josephson junctions and the way those are fabricated, and you said that depending on where the atoms actually end up in the fabrication process, that has quite significant effects on the actual model, or the actual product that you get in the end. Is there also a time dimension to that? I'm thinking, once the architecture or the material has been fabricated, does it have a tendency to shift over time, let's say?

Yeah. So it's not just that you need to characterize all the noise you have — on the material level you also have changes. I'll tell you something we experienced in the last 48 hours. We're working on this superconducting quantum computer, and the lab had a glitch in the cooling mechanism, so where the chip is usually three hundredths of a degree above absolute zero, it went all the way up to one degree above absolute zero. That's still a lot colder than outer space — it's really, really cold — but it was enough that at first nothing worked, and then, once it had been cooled back down (which takes hours), atoms had shifted because of
the higher temperature, and where they settled wasn't exactly the same place they were previously. So things kind of worked, but the accuracy was very bad, and we had to re-measure all the qubit frequencies and all the parameters of the chip, because everything had drifted. You heat it up — not all the way to room temperature, just from 0.03 degrees above absolute zero to one degree above absolute zero — and that was still enough to mess things up. These things are painfully sensitive. There are other technologies, by the way, that are far less sensitive.

Okay, would you say superconducting architectures are the most sensitive to these kinds of things? Or when you say other architectures that are less sensitive, what are you referring to?

To take the other extreme: NV centers, meaning you take nanodiamonds, or artificially manufactured diamonds, and you shoot nitrogen atoms into them. Sometimes the nitrogen atom will kick out a carbon atom, and that creates a quantum system you can use as a qubit. It's a bit more elaborate than that, but generally speaking this is valid. Now, this system is quantum at room temperature, so you don't need to cool it at all. And, for example, trapped ions: most variants don't need cryogenic cooling either, but they need excellent vacuum. However, it's not that hard to build a hundred-qubit superconducting quantum computer, because if you know how to build 10, going from 10 to 100 with superconducting is mostly more of the same — it's an engineering challenge, not so much a scientific one. But with trapped ions and NV centers and others, there are inherent limits that make it really hard to scale beyond a certain size.

Sorry for interrupting — what would you say...

Some have different advantages: some scale better, some are more sensitive, some this, some that. It's not clear — that's why people are trying multiple directions, because for every technology
people are working on, you can count advantages and disadvantages, and it's not clear where we'll make the breakthrough.

Can we stay on trapped-ion computers for just a second, and particularly focus on scaling up trapped-ion quantum computers compared to superconducting ones? Is your view that that would be easier to do? This is all maybe in parentheses, because it's hard to predict what roadblocks might be ahead, but comparing these two architectures in general, would you say there are more significant roadblocks for scaling up trapped-ion quantum computers as opposed to superconducting ones? The reason I'm asking is that I actually talked to someone before who was under the impression that scaling up trapped-ion quantum computers could be quite replicable, because you circumvent some of the fabrication issues you have with, for example, the Josephson junctions — the atoms you use in a trapped-ion quantum computer are identical, so that would potentially be an argument for scaling it up.

Right. So the atoms are identical, but the environment in which they are held — the magnetic fields, the potentials used to control them — is not exactly identical, and we need everything to be identical. The other point is that superconducting qubits are fabricated with lithography-type techniques, and we know how to make chips with a billion transistors today, so from that perspective the chip part is a little bit easier to scale. However, if you think of trapped ions: people used to do linear traps, where the fields are arranged so the ions sit in a row — Rainer Blatt used to spearhead that. He got to 14 qubits with a lot of effort and then basically stopped, because you couldn't scale bigger than
that. Since then people have come up with solutions — segmented traps, or other ideas. But I'll say this: if you have a 10-qubit superconducting quantum chip, going to 100 is more of the same. You're going to have a lot of problems with wires, it gets a little big, there's crosstalk — it's not trivial, but you don't need a fundamental change of architecture. To get to 100 trapped ions, you need to start playing around with the architecture, because you can't do it in one simple trap. So people pursue ideas where the ions themselves move and you shuttle them around, but then you're looking at something that's maybe a little like a pinball machine with a hundred balls, and you need to be very careful with all of them, because if you lose one, the quantum state goes — you open up the box and the cat either dies or meows, but it's not a quantum cat anymore. So in some sense you can think of trapped ions as a nanomechanical system, and that comes with its own difficulties. That said, these systems can be very, very well isolated, so as far as I know the records for the best single- and two-qubit gates are all in trapped-ion systems — but the biggest systems are superconducting, not trapped ions. Then again, you have neutral atoms, Rydberg systems, where you can handle larger numbers of atoms, but then you have some difficulty with controlling individual atoms: you can do global operations, but it's more difficult to do individual gates, let's call it. As a result these systems are very well suited for simulation, but for what I would call digital quantum circuits it's a bit harder. So again, there are advantages and disadvantages to every approach. Intel is working on the quantum-dots-in-silicon idea, which is even closer to the lithographic chip-industry approach to things, and of course there are also
difficulties there, and I think the largest they've got is somewhere between 10 and 20 qubits.

Okay, so the reason you have people — and very smart people — working in so many directions is that there's no clear winner to this problem?

That's right. Today, every chip you can find, with really tiny exceptions, is made of MOSFET transistors — one technology took over the world. We're not there in quantum; we haven't found our transistor, the thing you can replicate a billion times. You have multiple labs, commercial and academic, exploring multiple candidates and trying to come up with new candidates all the time, because all the options we have are not that great — all really, really hard to push forward. Now, there are so many smart people working on this that I wouldn't be surprised if somebody figures something out and suddenly we have a leap. On the other hand, there are also a lot of smart people working on fusion, and there are a lot of approaches to fusion, and fusion has been five years away for the last 50 years. So it's extremely hard to predict. Extremely hard.

One thought I had — and we could use this to tailor the discussion a bit towards machine learning as well: when you mentioned the incident with the superconducting quantum computer and the refrigerator that wasn't quite at the right temperature, are you seeing any tools that go beyond just characterizing the architectures in a better way, and actually help with these kinds of everyday considerations for running these systems? Obviously the natural thing your mind tends towards when you hear the term AI combined with quantum physics is that you can use it to characterize your systems better and get better performance, but in terms of helping out with all the surrounding tasks that need to be done, are you seeing right now, in the academic space,
that there are lots of tools popping up for those kinds of things, or is that still lagging behind a little?

It very much depends at which level you're looking. There's a lot of activity in machine learning for science in general — there's a lot of activity in machine learning for anything, whatever you want to put there. But it hasn't coalesced into large, very powerful, mature software packages or anything like that. If you want simulations for quantum systems, on the simpler side you have packages like QuTiP, an open-source package that's really, really good — but it only goes so far, which is fine. Then there's error mitigation, which is the idea that you have a noisy quantum computer, and if you not only run the original calculation that you wanted, but modify it in really smart ways and run multiple calculations, you can figure out from the various results what the result would have been if there were no noise. There's a very powerful package called Mitiq to do that. There are other parts, but there's no comprehensive solution.

Okay, understood. One other development that's really been coming along heavily is all the GPT models.

Oh yeah.

Have you seen any applications of that in the overall space that you're in? To me it doesn't come very naturally how you would actually apply a technology like that to a space like quantum computing — perhaps I'm wrong there.

Not yet. ChatGPT today is like a very junior programmer. Take Qiskit, for example — a very nice open-source package from IBM for things having to do with compilation and searches; a good collection, a very useful thing. You can tell ChatGPT: okay, read the documentation for Qiskit and then implement for me, I don't know, the Shor
algorithm for seven qubits, and it'll figure out how to do that. But it's very much a junior, and when it comes to doing things at research level, it's not there, and I don't think it'll get there anytime soon, because the way ChatGPT thinks about math is too human. By the way, if you ask ChatGPT to sum up 15 numbers that don't all have the same number of digits or something, it'll almost always give you the wrong sum. It's kind of ridiculous — this is a computer. To do the inference for the conversation you're having, while it's trying to add up the numbers, it's utilizing many, many billions of assembly instructions, or the equivalent on a GPU, where it could do this with five assembly instructions if it operated like a computer and not like a human. LLMs — large language models — were designed to process human language, to process fuzzy-reality sorts of things. Math is not that: there is no fuzziness; one and one is always two. There's this joke where you go to an accountant and ask how much is one plus one, and he says: well, how much do you need it to be? With math and physics, when you manipulate algebraic equations, there is no room for fuzziness. If you ask ChatGPT to solve a math problem from a high-school or undergrad math book, the chance of getting an error is much greater if you just ask for the solution than if you ask it to detail all the steps along the way. That's very human thinking. In universities, when we take people coming out of high school through undergrad and graduate studies and a PhD, we train them that in certain things you should not think like a human — you should try to think like a computer. If you have a set of equations, the decision of what step to take is perhaps based on intuition and experience, but once you've decided to move the variables from the left side to the right side, there's
only one correct way to do that; there's no wiggle room. Even in elementary school we teach kids to do arithmetic, but when they get to high school we say: no, don't do arithmetic by hand, it takes too much time and you're more likely to make mistakes — here's a calculator, use the calculator. Right now the main approach is: there's this shiny new hammer, these LLMs, and they do wonderful things, but not everything is a nail. qruise is going in the direction of, okay, if not everything is a nail, what's next? And we have some ideas, but we're not quite ready to speak about them yet.

Maybe we can segue to qruise then, and speak generally about the kinds of problems you're solving with that framework. Would you like to give a quick, high-level pitch of what qruise does, and then we'll get into some more details?

Right. So qruise is developing, essentially, machine learning physicists. As I said earlier, right now language models aren't very good at math and physics — we believe we know why — and the long-term goal is to help people in development labs develop new sorts of devices by having, in effect, an infinite number of junior physicists working with them 24/7.
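To make the error-mitigation idea mentioned earlier concrete: one common flavor is zero-noise extrapolation, where you deliberately run the computation at several amplified noise levels and extrapolate the measured results back to zero noise. The toy sketch below is plain Python; the linear noise model and all the numbers in it are invented purely for illustration, and real tools like Mitiq are far more sophisticated:

```python
# Toy zero-noise extrapolation. We pretend an observable's measured value
# degrades linearly with a noise scale factor c; real noise is messier.
def measure(c: float) -> float:
    true_value = 1.0        # the noise-free expectation value (unknown in practice)
    noise_slope = -0.15     # invented: each unit of noise pulls the result down
    return true_value + noise_slope * c

def richardson_extrapolate(points):
    # Two-point linear extrapolation of (c, E(c)) measurements back to c = 0.
    (c1, e1), (c2, e2) = points
    slope = (e2 - e1) / (c2 - c1)
    return e1 - slope * c1

# Run at the native noise level (c = 1) and at doubled noise (c = 2),
# then extrapolate back to the zero-noise limit.
data = [(1.0, measure(1.0)), (2.0, measure(2.0))]
print(richardson_extrapolate(data))  # recovers 1.0, the noise-free value
```

The noisy runs individually report 0.85 and 0.70, yet the extrapolation recovers the ideal value — exactly the "figure out what the result would have been if there were no noise" idea.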
Now, we're not going to replace the more senior people anytime soon, and just having a lot of junior people in software doesn't solve all the problems — a thousand monkeys at a thousand keyboards don't generate Shakespeare, and you can't take a thousand PhD students and tell them: build me a quantum computer. They need somebody with more experience to guide them. So the software won't be able to replace the more senior judgment and strategy, but a lot of the day-to-day can be done by software, by machine learning. What we're focusing on right now is what's known as the inverse problem: okay, I've built something, it doesn't quite work the way I planned, now I need to understand why. We originally built the software very much for quantum technology, but it turns out it's a little bit like Amazon, which started by building a bookstore: once they had the web infrastructure and fulfillment and logistics all worked out, they realized, hey, we can actually use it almost directly for selling absolutely everything else. We started out trying to provide algorithms to help quantum computers work better and to understand why they're not perfect, and it turns out that if you swap the simulation from the Schrödinger equation to the Maxwell equations, suddenly you can solve very similar problems in many other fields — fluid mechanics or whatever it is — because the methodology and the ideas that sit on top of the simulation don't really care what's inside the simulation; they're quite general. The way we train physicists to think about how you analyze data to figure out what's going on — this methodology is quite general, not very dependent on which sub-subfield of physics you're in, and the equivalent algorithms are similarly not very dependent on the field. Of course, you need a very reliable simulation
in whatever field you're in, but I'll give you an example. Say I have a model of a system, and I've optimized it to be as close as possible to the data, but one of the parameters in the model still has a very big uncertainty. Why? Because my data doesn't include sufficient information to narrow down that uncertainty. You can ask the computer to design a set of experiments that, once executed, will give you information that, once analyzed, can narrow the uncertainty of this parameter — which is extremely useful and important. And all these algorithms don't really care what sort of physics there is; all they need is the simulation, and for the simulation to have certain mathematical properties. Once you have that, you can do what's known as Bayesian experiment design in almost any field of physics.

So if I understand that correctly, you're saying essentially that you would develop an algorithm that identifies not only whether a system is underfitted, let's say, but also where you would most likely find the root cause of that underfitting — where the model would need additional parameterization in order to fit the data well. Is that roughly the right way of thinking about it?

There are two issues here. One is: my model is too simplistic, it doesn't include the right physical phenomena, it's missing some parameters, and I need to figure out what to add to the model. This, at the moment, is beyond our capabilities to do automatically; this is where you need physical intuition and experience, and that's really hard. But say you have a model that actually doesn't do a bad job, yet you still want to know the value of certain parameters to high accuracy, and you haven't collected the right data. You want the computer to figure out what's the right data to collect so that you can narrow down those parameters — that we can do automatically. The part of trying to figure out what
to add to the model — we don't have a very smart way of doing that. We can just try stuff, throw things against the wall and see what sticks, and since it's a computer, it can throw things really fast, so maybe you can make some progress that way — but that's a brute-force approach. We have ideas for a more refined approach, but we're not there yet.

Okay. When we talk about this being generalizable, I don't want to dig too much into the details of which fields it would generalize to, but just in terms of where you're coming from: do you mean that it's generalizable across, let's say, architectures? I would assume these kinds of approaches are — but perhaps even more fundamentally than that?

Yeah. So we came from quantum computing and quantum sensing, and our academic background is control of these types of systems. We had to deal with very different systems, from room-temperature NV centers to millikelvin superconducting systems, and with timescales that vary from the nanosecond to the microsecond to the millisecond. This forced us to create approaches that are actually very flexible, because although we were always dealing with things that fall under the headline of quantum computing or quantum technology, if you look down into the details they're actually very different from each other. That led us to seek approaches that are general and not very specific to the physics, and once we developed them for quantum, we realized we'd made something even more flexible than we'd realized. The flexibility stems from the fact that over the last — for me — 15 years, and for some of the professors it's more than 25,
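The Bayesian experiment design described above can be illustrated with a toy example. In the sketch below (plain Python; the Ramsey-style measurement model, the parameter grid, and the candidate settings are all invented for illustration), the computer compares candidate experiment settings and picks the one expected to narrow a parameter's uncertainty the most:

```python
import math

# Toy Bayesian experiment design for one unknown parameter "theta".
# Invented measurement model: an experiment with setting t returns outcome 1
# with probability (1 + cos(theta * t)) / 2 (a Ramsey-style fringe).

GRID = [0.5 + 0.01 * i for i in range(101)]   # candidate theta values
PRIOR = [1.0 / len(GRID)] * len(GRID)         # flat prior over the grid

def likelihood(outcome, theta, t):
    p1 = (1 + math.cos(theta * t)) / 2
    return p1 if outcome == 1 else 1 - p1

def posterior(prior, outcome, t):
    # Bayes update of the grid distribution after observing one outcome.
    post = [p * likelihood(outcome, th, t) for p, th in zip(prior, GRID)]
    norm = sum(post)
    return [p / norm for p in post]

def variance(dist):
    mean = sum(p * th for p, th in zip(dist, GRID))
    return sum(p * (th - mean) ** 2 for p, th in zip(dist, GRID))

def expected_posterior_variance(prior, t):
    # Average the post-measurement uncertainty over both possible outcomes,
    # weighted by how likely each outcome is under the current prior.
    total = 0.0
    for outcome in (0, 1):
        p_out = sum(p * likelihood(outcome, th, t) for p, th in zip(prior, GRID))
        if p_out > 0:
            total += p_out * variance(posterior(prior, outcome, t))
    return total

# Ask: which experiment setting will teach us the most about theta?
candidates = [0.1, 1.0, 3.0]
best = min(candidates, key=lambda t: expected_posterior_variance(PRIOR, t))
print("most informative setting:", best)
```

A barely-sensitive setting (t = 0.1) leaves the posterior essentially as wide as the prior, while a more sensitive setting shrinks it; the same logic, with a real device simulation in place of the toy likelihood, lets the software propose which measurement to run next.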
they've been dealing with a wide range of physical systems, all of them under the title of quantum technology but all very different, so we had no choice but to develop flexible approaches that aren't too specific to the architecture or the type of system. When you're trying to optimize a quantum gate, you're looking for ways to do it that will work no matter whether it's an NV center or a trapped ion or superconducting, and you end up with a flexible approach that's actually even more flexible than you thought, and you can take it into other places.

Okay. One consideration when it comes to this offering concerns another initiative you've been involved in: I know you've been involved in OpenSuperQ, which I think has now evolved into OpenSuperQPlus — I don't know if you're still part of that project. Are you able to use these technologies you've developed as part of qruise there?

So the story of qruise is intimately tied to OpenSuperQ and OpenSuperQPlus — a lot more to the original OpenSuperQ. We actually started as an academic open-source project developed as part of OpenSuperQ: we needed to provide control algorithms and those sorts of things for OpenSuperQ, and that's how we created — not the methodology, the methodology predated that — but the software. Then, when we created qruise, the company, we essentially forked the open-source project and continued developing it as proprietary software, because it was MIT-licensed and we could do that.

Okay. Just to help people put the inventions into context: in the OpenSuperQ project, what are you actually doing that requires the involvement of the qruise products? If you just want to paint a picture of what the target of that project is and how it's being facilitated.

Okay. So the idea was: you have this quantum computer, every qubit is slightly different, and a priori
you don't even know the exact parameters of the qubits, and yet you want to create the same gates, the same operations, on all the qubits, and they have to be very, very accurate. Think of it as getting a set of very different cars with different steering, different tires, different engines, and they need to traverse a really complex path, and they need to do it exactly the same, to the millimeter, but they're all slightly different. I mean, they're all cars, but they're all slightly different. We had to build algorithms that will be able to do that: to quickly understand the particular parameters of each qubit, each car, and then figure out how we need to tweak the driving. Superconducting qubits are driven by electromagnetic fields in the gigahertz range, which is why these are known as microwaves. The microwave oven you have at home runs at around 2.4 gigahertz, which is why, back when Wi-Fi was at 2.4 gigahertz, it failed when the microwave was on, because they were basically using the same frequencies. So superconducting qubits are driven by microwaves, typically not as low as 2.4, more like five, six, seven, or eight gigahertz,
something of that nature. So we needed to figure out how to drive the qubits very precisely and to tailor it for each qubit in each device. And if you heat the chip up and cool it down again, you need to do a do-over, because the engines are now different and the tires are all different; it's still the same car, but not exactly the same car, and you need to fine-tune everything again. That was the role of our team. Okay, how would you say the resources for a project like that compare to some of the other players in the space? One of the considerations I have is that I would assume, for example, that the development of machine learning models is quite resource-intensive, and the players that have large budgets to allocate to these kinds of projects are able to get a long way, especially when it comes to scaling up the system; the largest superconducting quantum computers come from players like Google and IBM. How well budgeted would you say the project was from the perspective of actually driving this kind of innovation, and how far would you say you've come so far in that project? So I think that's the wrong question, and you can correct it if you want. In Europe we're trying to compete with the big commercial companies using university-led projects, and building a 100-qubit quantum computer is a huge engineering feat, and one to which universities are not well suited, in my mind. If I were in charge of project funding, instead of giving 50 to 100 million to a project to build a big quantum computer, I would give 10 million, or 5 million, to a bunch of universities and tell them to look for better qubits, to build the best 10-qubit chip possible, and then patent it and create a commercial company to scale it up. Yeah, the budget you've got in universities isn't as large, but there are other issues. The European approach is to have
everything distributed: multiple countries, multiple universities, etc. These things benefit from concentration: at IBM, everybody working on quantum, or almost everybody, at least on the hardware, is in upstate New York. It's really hard if people are at different universities, and then they have different motivations and different agendas, and everybody is in multiple projects, so they have multiple priorities. I would change the way quantum funding is done in Europe: a lot more small projects to make breakthroughs in qubit design, circuit design, gate design, that sort of thing, which is what the field needs, and leave the scaling up to commercial companies. Because if you created a 10-qubit chip that has far better fidelities than anybody on the market, and you patented it, you'll have a line of VCs around the block waiting to give you money. And this is the sort of thing that universities do well: go very, very deep, understand everything to the finest detail, come up with new ideas, do essentially a technical proof of concept. On the other hand, there are things that universities don't do quite as well, which I think is large projects with a very significant engineering component. Now, sometimes you have no choice, right? If you're trying to build the LHC, there's no commercial side to this equation, so you have to do it in an academic environment. But here, universities in Europe are competing against commercial companies in the US, and the skill sets are different and the strengths are different. We can choose our fights, and I think we're choosing the wrong fight. Understood. So I think that leads us very well to the last question I was going to ask you, which is specifically about what parts of research are gaining sufficient attention, in your view, and which ones would require more
attention and perhaps more funding on behalf of the universities. You mentioned that, in your opinion, universities should focus on, for example, qubit design, but do you have anything more concrete? Yeah, I do, but I want to clarify: this is not the universities' decision, it's the funding agencies' decision. Because if the funding agencies put out, say, 200 million euros, saying we're going to fund three collaborations to build quantum computers and give each 70 million or so, then that's what the universities will do. And if the funding agencies said, okay, we're going to fund 25 programs to build better qubits, each a small, five-million-euro project, and you could have them everywhere, it doesn't matter, as long as every project is localized so that people can sit and work together, I think overall you'd get better results. But it's not the universities' choice, it's the funding agencies' choice. That's one thing. Another thing, since you asked what's not covered well: I think machine learning. A person today can finish a PhD in quantum physics, work on building quantum computers, and never take a single course in machine learning. I think in 2023 that's insane. I think machine learning is clearly becoming a very basic tool. You don't need to develop new approaches in machine learning, but just like you can't really do quantum technology without knowing how to program, you can't do quantum technology without properly understanding the tools and approaches that machine learning makes available. You don't need to design new neural nets, but you do need to know how to utilize existing architectures in an effective way. And most, okay, I don't know if most, but probably most people finishing a PhD in quantum technology don't go through machine learning training, and I think that's a big mistake. Good, so then, more resources to
machine learning, or perhaps more prioritization towards machine learning, at least. We're over time now, and I think we should cut it here. I really enjoyed having this discussion, Shai, so thanks a lot for that. Yeah, I hope you have a great rest of your evening. You too.