Intel Turbo Technologies Explained by Guy Therien | Talking Tech

- Hi, welcome to "Talking Tech." I'm your host, Marcus Yam, and today, we're talking turbo. That's a feature that will increase the frequency of your processor automatically. To tell me more about that today, joining me is Intel Fellow, Guy Therien.

Guy, thanks for joining "Talking Tech." - You're welcome, Marcus, good to be here. - Thanks, so would you mind telling me a little bit about yourself and what you do at Intel? - Sure. I've been at Intel about 30 years, and nowadays I concentrate on performance segmentation, that is, increasing the performance of our products to deliver real end-user value.

- Great, thanks. So, I know we're talking about turbo today, so I wanna share kind of my own experience of what turbo means to me, what kind of comes to my mind. So when I was younger, my family had cars from Sweden, and I remember my dad got this new car, and he said it's a lot faster, and it's because it had a turbo. So that's my first association that turbo means faster. Later on, we got a PC with a 486 CPU in it. And I remember on that PC, there was a button on the front of the case next to an LED that said 33 megahertz.

And when I pressed the turbo button, it said 66 megahertz. Now, I know that was a very clever trick to get the most performance out of the 486 while maintaining compatibility with older programs, but I knew at that point that turbo had some association with frequency, and I think that still carries through to today. So can you tell me more about the relationship between frequency and performance? - Sure thing.

So generally speaking, frequency is what we call the clock speed of the processor. A clock is an electrical signal that goes up and down, and each up-and-down is one cycle. How many cycles happen per second, that is, how many times the clock goes up and down per second, is what we consider frequency. Within the processor itself, you get a certain amount of work done with every clock cycle. So the higher the speed, or the higher the frequency, the more work gets done, and thus the higher the performance you get. - And to distill it down, when you mention these cycles, another word for this is hertz, right? Is that correct? - Yes, we measure the speed of the cycles in hertz, megahertz, or gigahertz.
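To make that relationship concrete, here is a rough, illustrative sketch of "work per cycle times cycles per second." The IPC (instructions per cycle) figure is an assumed, made-up value for illustration only, not a number from the interview.

```python
# Illustrative sketch of "work per cycle x cycles per second".
# The IPC value below is a hypothetical assumption; real IPC varies widely
# by workload and microarchitecture.

frequency_hz = 3.0e9   # 3.0 GHz = 3 billion clock cycles per second
assumed_ipc = 2.0      # hypothetical average instructions retired per cycle

instructions_per_second = frequency_hz * assumed_ipc
print(f"~{instructions_per_second:.1e} instructions/second at {frequency_hz / 1e9:.1f} GHz")

# Raising the frequency with the same IPC scales throughput proportionally:
print(f"~{3.6e9 * assumed_ipc:.1e} instructions/second at 3.6 GHz")
```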

- Okay, so we're going up to billions of, is that billions, gigahertz is billions of cycles per second? Do I have that right? - I believe you do. - Okay, all right, I've been doing my homework. (chuckles) So let's dig a little bit deeper before we get to the term turbo. I know there have been other frequency-related technologies in Intel processors, including SpeedStep and Speed Shift. - That's right.

- Can you tell me about those and how they relate to frequency and performance? - Sure, sure. Prior to the introduction of SpeedStep Technology, we had one frequency and one voltage to reach that frequency. But power is related to the square of the voltage and to the frequency, and as you increase frequency, you have to supply a higher and higher voltage, and that voltage contributes to power as its square. So the introduction of SpeedStep said, well, sometimes we only need to run at a lower frequency, a lower speed, because the demand is low, and other times we will run at a higher frequency when the demand is high. So we're gonna switch between voltage and frequency pairs, some for when demand is low and some for when it's high, and that will be a better, more power-efficient approach.

And so this was the adoption of SpeedStep Technology: we provided that capability within our hardware, and we also provided an interface that allowed the operating system to find out how many of these voltage-frequency pairs, which we call P-states, we had, and to select which one it wanted depending on the demand placed on the system. - So if I understand this correctly, Speed Shift and SpeedStep were created for power and efficiency considerations. - So SpeedStep, which was the first implementation of voltage-frequency pair switching, was indeed for getting the most performance within the best power.
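The voltage-squared relationship Guy refers to is the standard approximation for CMOS dynamic power, roughly proportional to C · V² · f. Here is a minimal sketch under that assumption; the voltage-frequency pairs below are arbitrary illustrative numbers standing in for P-states, not real part specifications.

```python
# Minimal sketch of why lower voltage-frequency pairs (P-states) save power.
# Dynamic power is roughly proportional to C * V^2 * f; the capacitance term
# and the example P-states are illustrative values only.

def relative_dynamic_power(voltage, freq_ghz, c=1.0):
    """Return a unitless figure proportional to C * V^2 * f."""
    return c * voltage ** 2 * freq_ghz

p_states = {
    "low-demand P-state":  {"voltage": 0.8, "freq_ghz": 1.2},
    "high-demand P-state": {"voltage": 1.1, "freq_ghz": 3.0},
}

for name, ps in p_states.items():
    power = relative_dynamic_power(ps["voltage"], ps["freq_ghz"])
    print(f"{name}: {ps['freq_ghz']} GHz @ {ps['voltage']} V -> relative power {power:.2f}")
```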

Speed Shift is a different technology that we can talk about next. - Okay, so tell me about Speed Shift. - Okay. As I mentioned previously, with SpeedStep the OS controlled which P-state you're in. Speed Shift is a mechanism whereby the processor itself looks at the demand placed on it and adjusts the voltage and frequency according to its own decision-making, inside its own algorithm.
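For the curious, one rough way to check whether this hardware-managed mode (Speed Shift, also known as HWP, Hardware P-states) is in use on a Linux machine is through the intel_pstate sysfs interface and the CPU flags. The paths below assume a Linux kernel with the intel_pstate driver and may not exist on every system.

```python
# Rough check for hardware-managed P-states (Intel Speed Shift / HWP) on Linux.
# Assumes the intel_pstate driver; these paths may be absent on other platforms.
from pathlib import Path

status_path = Path("/sys/devices/system/cpu/intel_pstate/status")
if status_path.exists():
    # "active" generally means intel_pstate (with HWP, if present) manages P-states.
    print("intel_pstate status:", status_path.read_text().strip())
else:
    print("intel_pstate sysfs interface not found")

cpuinfo = Path("/proc/cpuinfo")
if cpuinfo.exists():
    # The 'hwp' CPU flag indicates the processor supports Hardware P-states.
    print("CPU advertises HWP (Speed Shift):", "hwp" in cpuinfo.read_text().split())
```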

- So Speed Shift is just, not to oversimplify, but it's a more advanced iteration of SpeedStep. - It is in many ways. It's more autonomous, it's hardware-autonomous. It actually allows you to switch voltage and frequency at a much higher rate than was previously possible with OS-based control. - Okay, so does this bring us to turbo yet? Is turbo the inverse of SpeedStep, where SpeedStep would be looking at frequency and powering down? Is that one way to think of what turbo is, or is it something else? - I wouldn't put it exactly in those terms, so let me say it the way I would consider it.

So SpeedStep was selecting voltage-frequency pairs that were available from the processor. When we manufacture parts, we assure that the parts run between the specified minimum and maximum voltage and frequency, and that they do so when running certain specified workloads within a certain power. So manufacturing assures that they can operate within those spec conditions. Then there is the concept of turbo, which is opportunistic.

You know, outside of those conditions, we may have opportunity depending on the workload that's running and the temperature of the ambient environment. You could potentially get to much higher performance, but I can't necessarily assure that every part gets identical performance within that power outside of what we call the manufacturing-assured range. So turbo is the range above the range that we have assured in manufacturing, and it can deliver a significant number of bins of performance, that is, hundreds of megahertz, times five, six, seven bins depending on the product. It can even be more.

- So when people see a base frequency and then a turbo frequency or max turbo on their processor specs, that's what you're referring to. - Yes, but it depends on the kind of turbo. We have single-core turbo and then we have multi-core turbo. Most likely, every part can reach its single-core turbo. It's when you get to multi-core turbo, when you're running a very heavy workload, that not all parts can necessarily reach that frequency for every workload you run.
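A hedged sketch of how one might read these two numbers on Linux: cpufreq sysfs reports frequencies in kHz, and the base_frequency file is only exposed by the intel_pstate driver, so treat the paths as assumptions that vary by platform.

```python
# Read the manufacturing-assured base frequency vs. the maximum (turbo)
# frequency for cpu0 on Linux. Assumes the intel_pstate driver; values in kHz.
from pathlib import Path

cpufreq = Path("/sys/devices/system/cpu/cpu0/cpufreq")

def read_khz(name):
    p = cpufreq / name
    return int(p.read_text()) if p.exists() else None

base_khz = read_khz("base_frequency")    # exposed by intel_pstate
max_khz = read_khz("cpuinfo_max_freq")   # includes the turbo range

print("Base frequency :", f"{base_khz / 1e6:.2f} GHz" if base_khz else "unavailable")
print("Max turbo freq :", f"{max_khz / 1e6:.2f} GHz" if max_khz else "unavailable")

# Whether turbo is currently allowed at all (0 = enabled, 1 = disabled):
no_turbo = Path("/sys/devices/system/cpu/intel_pstate/no_turbo")
if no_turbo.exists():
    print("Turbo enabled  :", no_turbo.read_text().strip() == "0")
```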

- Okay, and I know that Intel has several different types of turbo technologies, so could you give us an overview of the evolution of turbo? - Okay, sure thing. All right, so as I mentioned, we have an assured range, and then outside of that assured range, we have a turbo range, okay? Historically, within the turbo range, every core in the processor could get to the single-core max. But it turns out that, through natural manufacturing variation, some cores are actually better than others: they have higher performance capability at the same voltage.

Providing this diversity among cores was an alien concept to operating systems, so we pioneered a technique called Intel Turbo Boost Max 3.0, where some cores might be favored, or considered to be higher performance, than other cores. This was implemented first with our silicon and our own control software, and then Microsoft adopted control of it. So Microsoft's Windows operating systems, and then Linux operating systems, understand that some cores might be higher performance than others, and they utilize them where possible: when there are demanding workloads, those workloads are placed on the higher-performance cores first. So this is Turbo Boost Max 3.0, as opposed to the standard Turbo Boost 2.0.

- So, just to simplify things, what is the key difference between Turbo Boost 2.0 and Turbo Boost Max 3.0? - The main difference is that in Turbo Boost 2.0, the maximum frequency of all the cores is the same, but in Turbo Boost Max 3.0, some cores can run at a higher frequency than others, sometimes levels of performance above the others.

- Okay. And since Turbo Boost Max, there are other types of adaptive or thermal-related turbos. - Yes. - Can you tell me about those too? - Sure, so we have a feature known as Thermal Velocity Boost, and Thermal Velocity Boost takes advantage of certain aspects of temperature to determine that, potentially, a lower voltage could be used to reach a frequency that would not be possible otherwise. As you could imagine, we're extremely quality-focused at Intel.

And that means when we manufacture, we make sure that the part works within our specifications, and we also need to make sure that the part does not wear out before we expect it to. So there's always a concern to make sure that our test programs perform the right tests to ensure the quality that we need and that end-users need to have. We have to be very careful, but for some parts, in some cases, and at some times, we can detect when it is what we might call safe, that is, it won't cause an adverse wear-out impact, to run at higher frequencies using the same voltage. So that's the kind of technology Thermal Velocity Boost is: it looks at the temperature and says, if the temperature is below a certain threshold for a certain part, we can actually run at a higher frequency. - So on the user side, if an enthusiast is building a PC and has a very robust cooling solution, that would mean Thermal Velocity Boost would be able to activate, or kick in, to a greater degree? - That's right. And so if you're an end-user, or even an OEM, who is investing in the power delivery and in the thermal solution provided in the system, we believe we should provide the means to allow you to realize more benefit from having made that investment.

So Thermal Velocity Boost is a way to provide that. - And as a person who builds his own PCs, I definitely think of it as "I can choose my cooling solution," but we should be clear that this also extends to notebooks and other form factors where it's really up to the system designer and the OEM. - Yes, and of course, as you can imagine, the form factor plays into this. There are trade-offs to be made across all form factors with regard to the cost of the cooling solution that is implemented and the size of the platform, and they all play together to give you different kinds of performance based on the processor features that are supplied.

- I think you gave me a clue a little bit earlier, just to go back, about the difference between Turbo Boost 2.0 and Turbo Boost Max 3.0, which is, I guess, our shift in architecture from single-core to multi-core to now actually hybrid multi-core. Does the core count or core configuration influence turbo, or is it a consideration when creating these technologies? - Sure. As you can imagine, the more cores you add, the higher the power would be, okay? If you keep the power constant, then you have to reduce performance in order to remain within the same power envelope. So as you add cores, performance will go down in order to remain within that envelope.

- And I guess a simple way for me to think about this: is this load balancing, where you have a certain power budget, you've got this multi-core system, you're running a bursty workload and only need one, maybe two cores, so is it sustainable to intelligently divert that power to the most favored cores and just get the maximum performance? - So when you have a large number of cores, a large number of logical processors, and work is placed on the system with Turbo Boost Max 3.0, the OS is very much aware of which cores are the highest-performance cores, and it places the work on those first. Demanding work, not just low-demand work, but high-demand work. And then it expands that out to other cores in succession: the most favored cores, the second most favored cores, maybe the rest of the cores successively.

And that keeps the work from causing power to be spent unnecessarily elsewhere, so that maximizes performance. - Now, you mentioned thermals are a factor, and again, that's Thermal Velocity Boost. What about what some people think of as kind of a floating turbo? Is there an Intel technology for that? - So we do have a technology, it's called Adaptive Boost Technology.

This is a technology that came out in the 11th Gen processors, and we're also gonna supply it in one of the SKUs of our 12th Gen processor generation. Intel historically has had a limit on the maximum speed when a certain number of cores were active, and this was done for various reasons. Adaptive Boost Technology removes that limit: it says let's not have a proactive limit on the frequency, let's have a reactive limit instead. The reactive limit engages power analysis and current detection to make sure the maximum speed can be attained under light workloads, but when there's a heavier, more complex workload, the frequency may be controlled, or moved down, to stay within the power budget.

So this ability to naturally flow up to whatever frequency is possible, one that fits within the current and power capability of the processor, is what we call Adaptive Boost Technology. - So there are a lot of these kinds of adaptive, almost automatic, you even used the word autonomous, technologies in the suite of turbos. Is there anything the user needs to do to take advantage of them, make best use of them, or even activate them? - Generally speaking, no. The concept is that these should be seamless and they should just work.

These are generational improvements to try to get every last little bit of performance out of your processor. To help that, the environment, the cooling, and the power delivery need to be there to support it. So again, as the OEM and/or the end-user invests in these capabilities in the system, we want the parts to have capabilities that are naturally exposed, so the end-user gets the benefit in terms of performance when they're employed.
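As a side note tying back to the favored-core idea in Turbo Boost Max 3.0: on Linux systems that expose ACPI CPPC data, each core advertises a highest_perf value, and favored cores typically report a higher number. A sketch under that assumption (the acpi_cppc sysfs directory is not present on every platform):

```python
# List per-core performance rankings as exposed through ACPI CPPC on Linux.
# On parts with Turbo Boost Max 3.0, favored cores typically report a higher
# highest_perf value. Assumes /sys exposes acpi_cppc; not all systems do.
from pathlib import Path

rankings = {}
for cpu_dir in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*")):
    perf_file = cpu_dir / "acpi_cppc" / "highest_perf"
    if perf_file.exists():
        rankings[cpu_dir.name] = int(perf_file.read_text())

if rankings:
    for cpu, perf in sorted(rankings.items(), key=lambda kv: -kv[1]):
        print(f"{cpu}: highest_perf = {perf}")
else:
    print("No ACPI CPPC per-core performance data exposed on this system")
```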

- And these technologies, aside from getting more work done in a given period of time, who are they designed for? Who's the ideal user who would benefit the most from these turbo technologies? - Okay, that's a good question. Generally speaking, there are a couple of kinds of performance you might say you benefit from. One is responsiveness.

There is a natural flow to working, right? When you're switching between tasks, you wanna make sure there's no lag. Whenever you mentally think, oh, I had to wait for that, it's irritating, it breaks your flow. - Right. - Okay? So the first aspect of turbo is making sure that we can race up to an appropriate clock speed to give you responsiveness as quickly as possible. The other aspect of turbo is what I talked about as non-guaranteed performance within power. It still gives you great performance, but I just can't guarantee it to be the same on every part.

And that continuous turbo capability is something you might see in content creation, when you're doing an encode or a transcode or things like that. So turbo gives you higher performance within the power envelope, and it also gives you peak performance for responsiveness. - And when you're talking about responsiveness, I know these things happen, they switch billions of times a second, but how quickly are these clock speeds able to ramp up in response to the workload? - Oh, we can change that every millisecond. - Okay. - Right. - So really, when you're talking about lag- - And less. - There isn't going to be any lag. - Yeah. Well, none that's associated with the processor clock speed changing. - Right. - Of course, you also have software overhead, and you perceive the lag in terms of what's displayed on your screen via the application and the operating system. But you know, this is much better than it was historically, and so there is a tangible end-user experience delta. And if you've heard of the Intel Evo branding, right? That is also a way to ensure that a system carrying that brand meets certain responsiveness requirements to deliver that great experience.

- So with all these technologies, which you've been working on for so long, what was one of the most challenging technologies or aspects of turbo to bring to market and make work reliably? - Myself and my colleague Barnes Cooper, also an Intel Fellow, went up to Microsoft, and we told them one time in a meeting, "You know what? We are gonna change the voltage and frequency of the processor on the fly." And that was a big step, right? To be able to have a modern operating system that changes the voltage and frequency on the fly was the big step forward.

The turbo aspect of it is just associated with whether or not we assure the entire range. There are great things associated with the tricks we used originally to make turbo possible. Like I mentioned, SpeedStep Technology was not designed with turbo in mind. So we did a small one-megahertz bump in the control interface that meant the hardware could take over and manipulate six, seven bins of performance above that.

So it was pretty fun to be able to take interfaces that were meant for one thing, add turbo, and then see them used for a different purpose very successfully. And then of course, when we went to Speed Shift, we finally comprehended the entire range, and we said, look, we have an assured range and a not-so-assured range. We talked to the ACPI spec folks and Microsoft engineers, and it was great to see everyone lean in and start to implement OS controls for turbo, rightfully understood and comprehended.

- So, lots of work by you, your colleagues, even partners. Sure, enthusiast gamers and people who care about performance really understand and appreciate turbo, but overall, do you feel this technology has been appreciated and given enough exposure? - Yeah, you know, my personal view is that end-users have a very difficult time selecting their systems and deciding what to buy. Usually, you ask your neighbor, what should I get, right? But the word turbo has become synonymous with performance, and it is something people will look for when they buy a system, and we actually deliver that performance in those systems. So from the perspective of actually understanding exactly what it buys you, I don't think people really get it, but they do know that when that feature's on there, you definitely get higher performance, and it has become a buyer's choice. And it does deliver that performance; it's just, do people really understand exactly how much it gives them? I don't think so, but you know, the proof is in the pudding, as we say.

Look at the benchmark scores: you see a significant delta between having turbo and not having turbo, because there's so much opportunistic range that we provide in turbo. - So with all your technical background and expertise, and to help people understand the value of turbo more, I'm gonna borrow a page from Reddit, where they have this subreddit called Explain Like I'm Five. Can you try to explain what turbo is to a five-year-old? How would you do that? - Let's see, well, I happen to have a five-year-old. - Okay, well. (Guy chuckles) Is he like a little mini Intel Fellow too? 'Cause that's- - He's a little bit more honorary, but yes. (Marcus chuckles) So I would probably not be able to explain it to a five-year-old.

I would just say, because they won't understand it otherwise, that computers run at different speeds, some are fast and some are slow, and that turbo is really, really fast, you know? And that's the most they can understand, I think. - That's actually how my dad explained his car with forced induction, with the turbocharger, to me. So on that note, I can't think of a better way to wrap things up.

We're kind of out of time here, but this has been a very exciting and insightful conversation on turbo, so thanks for joining me, and I hope we can do this again. - You're welcome, and I'd love to. - Thanks, Guy. (bright music)
