Google Presents Pixel Fall Launch

Show video

[Person singing: "Baby Come Back"] Person: Stop. [Cass Elliot "New World Coming" singing] Person: I am so sorry! Person: Hey Google, turn the lights off, please. [Cass Elliot "New World Coming" continued...] Person: Excuse me sir, excuse me. [Japanese] Person: Hey Google, help me speak Japanese. [Japanese] Google Assistant: What's the name of this dog? Person: Oh, Tink.

[Cass Elliot "New World Coming" continued...] Person: Stay. Person: Look at me. Person: Good girl. [Cass Elliot "New World Coming" continued...] Person 1: Hey Person 2: Hey, what's up? Person 2: Um, I forgot my keys today.

Person 1: Not today. Person 2: Thank you. [Cass Elliot "New World Coming" continued...] Rick: Hi, everybody. I'm so glad you decided to join us today. I'm here in our beautiful Google store in the heart of New York City, where we're gonna show you a few new things we've been working on, some of them for quite a long time. Today, you're gonna see real breakthroughs across many different technologies and teams at Google, all coming together to create a more helpful, personal smartphone experience for each person.

We're doing what Google does: developing advanced technologies and taking on massive scale tech challenges, which, in the end, are all about helping you out and making your day easier. For so many Google products from Search to Gmail to Google Photos, AI is the foundational technology that lets us deliver helpful experiences to each of our billions of users. AI is driving our vision for ambient computing as well, the idea that helpful technology should always be available, whenever and wherever you need it. From a Google hardware perspective, everything we're doing is in pursuit of that vision and the smartphone has a huge role to play.

The phone is the most personal technology in people's lives, so it's only natural that your phone should be the central control device of an ambient system. Your phone needs to understand you and your world- your context, your unique needs and preferences, how you speak, and what you care about. That's what we set out to create with Pixel- the most personal, most helpful phone.

A true Google phone, which takes all the helpfulness and intelligence of Google and adapts it to you. Now, this approach has led to a lot of groundbreaking smartphone innovation over the years, from the Pixel camera to the Google Assistant, to the speech and translation models we've pioneered. And we've always done a great job of tuning our software to get the best real-world performance out of Pixel's hardware. Now, this year is quite a bit different.

We have state-of-the-art hardware, which means Pixel can deliver even more impressive real-world performance, as well as new AI-driven experiences that have never been possible before. We're getting the most out of leading-edge hardware and software and AI. The brains of our new Pixel lineup is Google Tensor, a mobile system on a chip that we designed specifically around our ambient computing vision and Google's work in AI. Tensor was years in the making. We worked closely with our AI researchers to create machine learning models that run well on Pixel.

And we use those models to power all kinds of new experiences that you'll see today. With Tensor, we're connecting Google's AI breakthroughs directly to Pixel. So we can push the limits of helpfulness in a smartphone and turn it from a one-size-fits-all piece of hardware into a device that's intelligent enough to respect and accommodate the differences between each person. Now we can give you Google's best speech recognition in a smartphone.

The speech models are trained to understand natural speaking patterns, accents, dialects, and how to translate them accurately into other languages. We've developed the most advanced smartphone camera, ever. It intuitively understands how to get the best picture for you: Your life, your friends, your face, and skin tone, as well as the nuance in the world around you. We can deliver our most helpful smartphone experience, one that understands and anticipates what you need when you need it.

The phone experience is designed for inclusion at its core: we've taken real steps in hardware, software, and AI to prioritize historically underserved communities, particularly in the camera. And it's the highest-rated phone for security, one that keeps your information safe and private. We can even extend the support window for security updates so your phone has the most up-to-date protection.

With Tensor, we can pursue our own vision for smartphones. This vision comes to life right here in the new Pixel 6, our most helpful, most personal, most Google phone ever. Pixel 6 is designed inside and out to understand the nuance of you. You can see right away, a major evolution in the phone's design language, with a distinct new color palette. The new Pixel Camera Bar houses a completely redesigned camera system, with upgraded sensors, lenses, and computational photography that enables entirely new features and modes. You can see the always-on display, and the expressive, adaptive new Android 12 UI, which we call Material You.

Pixel 6 is designed with powerful new security and privacy features to safeguard your information. And we've made big improvements to boost every aspect of performance, from the all-day Adaptive Battery with fast charging speeds, to 5G connectivity for even faster downloading and streaming. The product is full of more features that we're gonna show you today.

Pixel 6 is available today starting at $599. And now, we're gonna talk about a first for us... our first flagship phone for people who want our most advanced technology.

The new Pixel 6 Pro has everything Pixel 6 has, including our Tensor chip, and a lot more too: It has a pro triple rear camera system packed into the Camera Bar, with a new telephoto lens. And on the front, you have an ultra-wide front camera that shoots 4K video. It's got a bigger and better 6.7-inch dynamic display that goes up to 120Hz for smooth scrolling and fluid animation, while going down to 10Hz to save power. Pro includes 12GB of RAM for speedy performance, and seamless app switching and multitasking.

It's our fastest and smartest Pixel yet. Pro has the biggest battery we've ever put in a Pixel. And it has a fresh, high-end design with sophisticated colors and refined finishes.

Pixel 6 Pro is available today starting at $899. It sets a new bar for performance and helpfulness you can expect from a Pixel phone. Today, you're getting the first full tour of Pixel 6 and Pixel 6 Pro, starting at the outside and working our way in. Here's Isabelle from our design studio. Isabelle: We completely reimagined Pixel as a personal fluid phone experience that is expressed throughout the device, from the form factor to the materials to the UI.

And the first design element I'll show you is probably the first one you noticed - the Camera Bar. The Pixel Camera has been celebrated since the beginning, and we've been subtly evolving our camera design over the years. But Pixel 6 is not the phone for subtle evolutions. The new Tensor chip is unlocking lots of new camera capabilities, so we upgraded the camera system to take advantage of this.

Both phones use bigger, better rear camera sensors, and a laser autofocus sensor, which helps make your low-light photos sharper. And Pixel 6 Pro has a telephoto camera with folded optics, for impressive photos in as small a form factor as possible. It's a powerful collection of hardware that helps you take a beautiful photo under any condition. This is the kind of step-change innovation that deserves a total rethink, to think outside the box, so to speak. So the industrial design team designed the phone itself to celebrate the camera. The Camera Bar brings a clean, symmetrical design that puts the camera front-and-center.

It is a major design evolution that encapsulates real technical innovation. You've probably also noticed that Pixel 6 has a distinctively different look and feel from previous models. We designed the phone to be graphic, vibrant, and unique.

Pixel 6 has a textured black metal band combined with expressive but versatile colors that look like nothing else in the smartphone world. We're introducing a tinted gray and a vibrant yellow combination for sophisticated but fun vibes. The next colorway is our take on our iconic Google coral: pastel pink paired with juicy red that feels inviting and energetic, accented with a punch from the textured black metal band.

And our dark color option is simple, but not plain. We're using dark gray tones for an understated and minimalistic look. And for the Pixel 6 Pro, we were inspired by high-quality finishes used in jewelry and watches.

It's our most refined phone, made with polished unibody metal transitioning fluidly into the gorgeous 3D glass. The phone feels refined and unique in-hand, with unexpected color combinations: Our white Pixel 6 Pro, with a clean silhouette and timeless style, is accentuated by the warm white and tinted gray colors, and the high polished silver frame. We're also pairing a light gold with warm colors for a luxurious and uplifting feel.

And of course, we have a tailored and modern black Pixel 6 Pro, which combines dark neutral colors with a high polished black metal, for an elegant and understated look. A beautiful phone demands a big, beautiful display, so we took extra care with the Pixel 6 displays. Pixel 6 uses a 6.4-inch 90Hz display. And Pixel 6 Pro has a big and beautiful 6.7-inch high resolution waterfall display that really melts into the frame of the device. Scrolling is super smooth, with a dynamic refresh rate that gives you peak performance when you need it, and lower power consumption when you don't.

Pixel 6 and Pixel 6 Pro are as durable as they are beautiful. Both phones are water and dust resistant, with Corning Gorilla Glass Victus displays. It is the toughest Gorilla Glass, with twice the scratch-resistance of previous Pixels, so the phone can withstand inevitable scuffs and scratches. Underneath the display, the new Pixels have a fast and secure fingerprint sensor for easy unlocking. We're also continuing our investment in making our products more sustainable.

For example, Pixel 6 and Pixel 6 Pro are designed with recycled aluminum to reduce their carbon footprint. But perhaps the most important sustainability move we can make is improving the phone that you already have, so that you can go longer between upgrades. That's the idea behind Pixel's Feature Drops: even older Pixels are constantly getting new features, so they keep on getting better over time.

And when it's time to upgrade, just send us your old Pixel, and we'll make sure it's responsibly recycled. And to protect your phone until then, our series of Pixel 6 cases are made of recycled materials, and they're a perfect match for accentuating Pixel's design and colors. The frosted translucent cases fit seamlessly around the Camera Bar, letting the colors show through, so your Pixel is protected by its case, without the design and the colors of the phone disappearing. The accessories, the colors, the display, the materials- everything about Pixel 6 has been reimagined.

But reimagining the form factor wasn't our ultimate goal. We wanted to create a complete phone experience, built around you. Our design team developed the striking Pixel 6 exterior in parallel with the software development on the Android team, so that the colors, the forms, and what's on the screen all work together in a single, beautiful, and fluid experience.

And here's Sabrina, to share what we've been working on together. Sabrina: Thanks, Isabelle. With Android 12, we set out to build an OS that's the foundation for the future of hardware and software together, for an experience that's uniquely made for you. Android 12 looks especially stunning on Pixel 6. We designed it using our years of mobile OS experience, while keeping our own hardware in mind.

So Pixel 6 is the best expression of Android. One of the best examples of hardware and software working together is in the color and customization logic of our new Material You design. Material You is a first-of-its-kind UI breakthrough, and we think you're gonna love it. For starters, your entire phone UI updates to reflect whatever image you choose as your wallpaper.

The system does the rest, determining which colors are dominant, which are complementary, and which just plain look good. It then applies different shades to different parts of the interface. You can see this gorgeous wallpaper that I've picked, and how it influences the surfaces on my phone as a sunny gold color scheme. It even matches the hardware. As often as I want, I can give my phone a fresh feel just by updating my wallpaper. I really like this blue I'm wearing today, so I can quickly update my Pixel to match it.
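The wallpaper-to-theme step described here can be sketched in a few lines. This is a minimal illustration in Python, not Android's actual Material You engine: it assumes a wallpaper supplied as a flat list of RGB tuples, finds a dominant color by coarse quantization, and derives a complementary accent by rotating the hue halfway around the color wheel.

```python
import colorsys
from collections import Counter

def dominant_color(pixels, bucket=32):
    # Quantize each pixel into a coarse RGB bucket and return the most
    # common bucket -- a crude stand-in for picking a theme "seed" color.
    counts = Counter(
        (r // bucket * bucket, g // bucket * bucket, b // bucket * bucket)
        for r, g, b in pixels
    )
    return counts.most_common(1)[0][0]

def complementary(rgb):
    # Rotate the hue 180 degrees to get a complementary accent color.
    r, g, b = (c / 255 for c in rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    return tuple(round(c * 255) for c in colorsys.hls_to_rgb((h + 0.5) % 1.0, l, s))

# A tiny synthetic "wallpaper": mostly a sunny gold with a few blue pixels.
wallpaper = [(224, 160, 32)] * 90 + [(30, 60, 200)] * 10
seed = dominant_color(wallpaper)    # the gold wins: (224, 160, 32)
accent = complementary(seed)
```

Android's real extraction, and the tonal palettes it builds from the seed color, are far richer than this, but the shape of the problem is the same: reduce millions of wallpaper pixels to a handful of colors the whole UI can key off.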

You can see how the different UI elements immediately adapt, including the system UI, apps, icons, and search bar. And we've re-imagined every aspect of the UI. So, you'll instantly notice how responsive and smooth everything feels. It's more fluid and adapted to your needs. The redesigned Quick Settings provide easy access to more useful controls.

And take a look at the updated widgets- Google Photos, Maps, Gmail, Weather, YouTube Music- all get redesigned with Material You, with the goal of making the UI more intuitive and beautiful. So my phone adapts to me, rather than the other way around. Material You also brings a fresh look and new features to At A Glance, the new home and lock screen experience. It contextually surfaces information you need, right when you need it- like having access to your flight details and your boarding pass, right on the lock screen. Or showing you a time-to-leave reminder that makes sure you get out the door on time. It's great for when I work out.

When I'm getting ready to start my run, I pop in my Pixel Buds and I'm able to jump right into my YouTube Music playlist with a single tap. Then when I'm running, At A Glance tracks my run and shows my live workout stats. As Isabelle said, everything is new about the Pixel 6 design, from the powerful Camera Bar, to the elegant materials and finishes, to the vibrant new colors, to the personalized look and feel of Material You. It's a completely reimagined Pixel. Of course, we can't create a personal phone experience without a real focus on security and privacy.

Pixel 6 has to get this right, because your life is on your phone: your photos, your financial information, personal data, and everything else. Pixels have led the way in smartphone security for years, tested against real-world threats. And Pixel 6 is the most secure Pixel yet.

It includes Tensor security core, a new CPU-based subsystem. That means it's a real core on the Tensor chip that's separate from the application processor, so sensitive tasks and controls run in an isolated environment. We're also introducing the next generation of our industry-leading dedicated security chip, Titan M2, which works with Tensor security core to protect your sensitive user data, PINs, and passwords. We made Titan M2 even more resilient to advanced attacks like electromagnetic analysis, voltage glitching, and even laser fault injection. And it's been proven through independent security lab testing.

Titan M2, Tensor security core, and our new open source trusted execution environment give Pixel the most layers of hardware security in any phone. And thanks to Tensor, we're now able to extend our support window for the first time, so every Pixel 6 user gets at least five years of security updates. That ensures you have the latest, most up-to-date protection. We've also pulled a lot of helpful privacy and security features forward in the UI, so you have full control over your settings and preferences. Here are a few safety features that are coming first to Pixel: This is the new Security Hub.

It's got simple steps you can take to better protect your data. This green checkmark means I'm all clear and don't need to take any actions right now. You'll now see indicators right on the screen when an app is accessing your mic and camera feed. And, there are toggles so you can quickly turn the camera and mic on and off, across every app and use case.

The new Privacy dashboard helps you make informed decisions about which apps get your permissions to access personal information. There are also strong protections working behind the scenes. Many of the ML models for personalized experiences are running in a secure, on-device sandbox called Private Compute Core, so sensitive information never leaves your device without your explicit action.

Captions and translations, suspicious messaging alerts, and Gboard data are all handled securely and privately. We're stepping up anti-phishing protections too, detecting potential risks in phone calls, text messages, emails, and links, and letting you know if something looks suspicious. Google keeps more people safe online than anyone else in the world, and our commitment to safety and security guides our hardware development as well. When you add up all these new security and privacy enhancements across the OS, the UI, and the hardware itself, Pixel 6 demonstrates Google's continued commitment to security innovation.

We're going to keep moving to the brain of Pixel 6. It's a mobile AI computer called Google Tensor, and we're excited to tell you more. Here's Rick. Rick: Google Tensor is the biggest mobile hardware innovation in the history of the company. It's the culmination of years of investment in AI, and Google's deep experience in silicon.

The name is a nod to tensors, a building block of machine learning computation. The Tensor name also connects it to Google technologies like TensorFlow, our open-source AI and ML software library, and our Tensor Processing Units which Google developed to power machine learning in our data centers. The Tensor chip is specifically designed to offer Google's latest advances in AI directly on a mobile device.

This is an area where we've been held back for years, but now, we're able to open a new chapter in AI-driven smartphone innovation. Tensor also gives us a hardware foundation that we'll be building on for years to come, so you get the personal, helpful experiences you'd expect from a Google phone. Monika is here to explain what's so different about Tensor. Monika: Every couple of years, Google comes out with something that completely changes how people use technology in their lives.

We started in Search, and kept going with Google Translate, Google Photos, and Assistant, and these innovations are built around our machine learning research. It's in Google's DNA and drives everything we do. While Google is known for groundbreaking work in ML, there's one place we haven't always been able to bring it, and that's the smartphone. Mobile chips simply haven't been able to keep pace with Google research. And rather than wait for them to catch up, we decided to make one ourselves. We needed a chip that was engineered to fulfill our vision of what should be possible on Pixel.

So a few years ago, Google's team of researchers came together to collaborate across hardware, software, and ML. The result of that work is Google Tensor! We approached Tensor differently. Every aspect of Tensor was designed and optimized to run Google's ML models. This permeates our entire chip. We're fortunate to have great insights when it comes to ML, and built our chip based on where ML models are heading, not where they are today. Starting with the integrated ML engine, the TPU: it was custom-made by Google Research, for Google Research.

For the image signal processor, or ISP, we brought key algorithms directly into the silicon for power efficiency. Even our choices for CPU and GPU were designed to complement our ML to deliver advanced computational photography. The CPU cluster is a 2+2+4 configuration, including two big Arm X1 cores. The 20-core GPU also delivers a premium gaming experience for the most popular Android games. There's a context hub that brings machine learning to the ultra-low-power domain. It enables ambient experiences like Now Playing and Always-On Display to run all the time, without draining your battery. Tensor was designed for total performance and efficiency when it comes to running Google experiences. It has to be really good at heterogeneous computing. Here's what that means... As software applications on mobile phones become more complex, they run on multiple parts of the chip. This is heterogeneous computing. To get good performance for these complex applications, we made system-level decisions for the SoC.

We ensured different subsystems inside Tensor work really well together, rather than optimizing individual elements for peak speeds. Peak CPU and GPU speeds look great in benchmarks, but they don't always reflect real-world user experience. Pixel 5 is a good example of our approach. Google software delivered a great experience, even on a chip that didn't win on benchmarks. Don't get me wrong, Tensor's CPU and GPU are much faster than any past Pixel's.

But what matters more are the brand new experiences enabled by Google Tensor that weren't possible until now. So let's get to those. Here's Brian to talk about how all this comes together in Pixel 6. Brian: Thanks, Monika. Let's start with the camera.

Ever since our first Pixel five years ago, the Pixel Camera has set the bar and reshaped the industry. Our leadership in computational photography and machine learning has led to some remarkable camera capabilities over the years, and has let Pixel users take some extraordinary pictures, even when we've used ordinary camera components. With Pixel 6, we're applying all that software expertise to a fully upgraded camera system, for the most advanced smartphone camera in the world.

It's leagues ahead of our previous Pixel Cameras from the hardware, to the software, to the computational photography. For starters, let's take a look at the main camera. Both Pixel 6 and 6 Pro have a massive new 1/1.3 inch 50-megapixel sensor.

We combine adjacent pixels on the sensor to get extra-large 2.4-micron pixels. With Night Sight, the Pixel Camera has always been able to do a lot with very little light, but now the primary sensor captures up to 2.5x as much light thanks to those huge pixels. This means you're gonna get photos with even greater detail and richer color. Both phones also have completely new ultra-wide cameras with larger sensors than before, so photos look great when you wanna fit more in your shot. Pixel 6 Pro has a larger ultra-wide front camera that records 4K video.
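The "combine adjacent pixels" step is what's usually called 2x2 pixel binning: four neighboring photosites are read out as one larger effective pixel, trading resolution for light gathering. The 2.4-micron figure implies roughly 1.2-micron native photosites, and a 50-megapixel sensor bins down to about 12.5 megapixels of output. A toy digital version in Python, assuming a grayscale sensor patch as a nested list:

```python
def bin_2x2(image):
    # Average each 2x2 block of sensor values into one output pixel:
    # four small photosites act as one larger one, so resolution drops
    # 4x while each output pixel aggregates the signal of four inputs.
    h, w = len(image), len(image[0])
    return [
        [(image[y][x] + image[y][x + 1] + image[y + 1][x] + image[y + 1][x + 1]) / 4
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

# A 4x4 sensor patch becomes a 2x2 output.
patch = [[10, 12, 20, 22],
         [14, 16, 24, 26],
         [30, 32, 40, 42],
         [34, 36, 44, 46]]
binned = bin_2x2(patch)   # [[13.0, 23.0], [33.0, 43.0]]
```

Real sensors bin in the analog domain rather than averaging digital values, and the 2.5x light figure quoted in the talk is a comparison against previous Pixel sensors, but the resolution-for-signal trade is the same idea.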

It also has a telephoto lens with 4X optical zoom for getting in close. That's not easy to fit in a phone without making it super thick. To get that much magnification, the Pixel Camera uses what's called "folded optics," a flawless prism bends the light 90 degrees, so that the camera can fit in the body of the phone.

And, you can get up to 20X zoom with an improved version of Pixel's Super Res Zoom, our advanced computational approach to combining optical and digital zoom. Finally, the sensor behind the telephoto lens is even larger than the primary rear sensor in past Pixel phones, so you can capture great low-light zoom shots with Night Sight. When this amazing hardware is paired with Tensor, we can build new camera features that were impossible before. Video is a great example. Video is a hard use case for computational photography, because you're basically taking lots of photos very quickly. Applying a machine learning algorithm to a single photo is very different than running the same algorithm for each frame, 60 times per second.

We've always dreamed of getting Pixel's video quality up to the signature photo quality, but it wasn't possible. The processor just wouldn't be able to keep up. So we spent years on this problem and have made a lot of progress. We started by developing more efficient methods for applying tonemapping edits very quickly, and doing everything we could to get the most out of the sensor.

We also developed an algorithm called HDRnet, which could deliver the signature Pixel look much more efficiently. With Tensor, we're able to embed parts of HDRnet directly into the ISP and accelerate it to make the process faster and more efficient. With this system, Pixel 6 can now run HDRnet on 4K video at 60 frames per second. That's 498 million pixels each second.
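The 498 million figure is straightforward to check: a 4K UHD frame is 3840 x 2160 pixels, and the pipeline touches every pixel of every frame at 60 fps.

```python
pixels_per_frame = 3840 * 2160             # 8,294,400 pixels in one 4K UHD frame
pixels_per_second = pixels_per_frame * 60  # 60 frames every second
print(f"{pixels_per_second:,}")            # 497,664,000 -- "498 million" rounded
```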

And this is what Pixel 6 video looks like. You can see what a huge improvement this is. The color accuracy is excellent with a big boost to the vividness, the stabilization, and overall video quality. This is all thanks to the bigger camera sensors, Google's cutting-edge machine learning, and the efficiency gains from the new Tensor chip. It's a giant step forward. Have you ever had a perfect photo ruined by something random in the background? Let's say you wanna be the only one on the beach in your photos.

If you don't have access to a deserted island or don't wanna spend hours in a photo editing suite, Pixel's new Magic Eraser can do the job. In Google Photos, you'll see suggestions for distractions you might wanna remove from your photo. Erase them all at once, or tap to remove them one by one. What really sets this feature apart is how we're able to figure out what you're trying to remove, and how well we can fill in what's left in its place. Even if something isn't suggested, you can still erase those distractions.

Just circle them and they disappear. And you can use Magic Eraser on Pixel to clean up all your photos, whether you took them a minute ago or years ago. Here's Hollywood production designer Hannah Beachler to show you what's possible with Magic Eraser. Hannah: Hey, I'm Hannah Beachler, production designer and world builder working on "Black Panther 2: Wakanda Forever." I oftentimes consider myself a story designer, and I'm designing towards mood and tones.

I love this building. It takes me back to my dad. We would drive around and we would just make up fantasy places. I can just remember seeing everything that he would say. Yeah, if I crop that. When I go on a location, I'm photographing hundreds of places. And for me, I have to envision what that certain place is gonna look like for our story.

Oh, my gosh. This was the key gym in "Creed" and they had this big workout station that we had to take out. And this is what I could have showed 'em all along. If I just have that gone. Oh, Magic Eraser. Yes. Oh, wow.

That is not the right period car, so let's get rid of that. So, I can present this as a 1950s space. Game changer! Ooh. It's so integral and so important to have a blank canvas to have the creative conversations, and I think anyone should be able to do that.

And I think they should be able to do it on the spot. Wow. Once you start using that muscle- of seeing past something- you're gonna do it a lot and then you're gonna see the world in a creative way, and I think that it's a great tool. I know I'm gonna use it. Brian: Here's a problem everyone has seen before. You go to take a picture but the lighting isn't great and the subject is moving around. You can't quite get the perfect photo.

It's a little blurry. Here's the same scene with and without our new Face Unblur feature. Normally, this great moment would be a blurry throwaway photo. There's too much motion, and not enough light- It's a physics problem that Tensor's on-device machine learning can solve. Let's talk about what's happening here. Before you even take a picture, the Pixel Camera is using FaceSSD to figure out if there are faces in the scene. If they're blurry, it spins up a second camera, so it's primed and ready to go when you tap the shutter button.

In that moment, Pixel 6 takes two images simultaneously, one from the ultra-wide camera and one from the main. The main image uses a normal exposure to reduce noise, and the ultra-wide uses a faster exposure that minimizes blur. Machine learning fuses the sharper face from the ultra-wide with the low-noise shot from the main camera to get the best of both into the image. As a last step, Pixel Camera takes one final look to see if there's any remaining blur in the fused image, estimates the level and direction of the blur, and then removes it for you. In all, it takes four machine learning models combining data from two cameras to deliver the scene you know you saw but couldn't get from your camera until now with Face Unblur.
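The fusion step described here can be caricatured in a few lines of Python. Everything in this sketch is a toy assumption (grayscale frames as nested lists, a face box handed in from a detector); the real pipeline chains four ML models for detection, alignment, fusion, and the final deblur pass.

```python
def fuse_face(main, ultrawide, face_box):
    # Keep the low-noise main exposure everywhere, but swap in the
    # sharper short-exposure ultra-wide pixels inside the face box --
    # the "best of both frames" idea behind Face Unblur, minus the ML.
    x0, y0, x1, y1 = face_box
    fused = [row[:] for row in main]
    for y in range(y0, y1):
        for x in range(x0, x1):
            fused[y][x] = ultrawide[y][x]
    return fused

main_frame = [[100] * 4 for _ in range(4)]   # clean but motion-blurred face
wide_frame = [[200] * 4 for _ in range(4)]   # sharp face, noisier overall
result = fuse_face(main_frame, wide_frame, (1, 1, 3, 3))
# result keeps 100s outside the 2x2 face box and 200s inside it
```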

So most of the time we wanna eliminate blurriness from our pictures, but sometimes a bit of blur can actually add to the picture, especially for action shots that don't seem to have much action. Pixel 6 introduces Motion Mode, which brings a professional look to your nature scenes, urban photos, or even a night out. Typically, you'd create these effects with panning and long exposures- techniques that require fancy equipment, and lots of practice.

Motion Mode makes it easy. For action shots like this one, the Pixel Camera takes several photos and combines them, using on-device machine learning and computational photography to identify the subject of the photo, figure out what's moving, and add aesthetic blur to the background. For a nature shot like this, the camera applies computational photography and ML to align multiple frames, determine motion vectors, and interpolate intermediate frames that are blurred so you get this silky smooth waterfall. That sounds hard but watch how easy this is.
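At its core, the long-exposure look comes from averaging a burst of aligned frames: static regions average to themselves and stay sharp, while anything that moved smears into a trail. A minimal sketch under that assumption (pre-aligned grayscale frames as nested lists); real Motion Mode additionally segments the subject to keep it crisp and interpolates intermediate frames for smoother trails.

```python
def average_frames(frames):
    # Pixel-wise mean over a burst of aligned frames. A bright spot that
    # moves one pixel per frame leaves a faint trail; static pixels are
    # unchanged, which is exactly the long-exposure effect.
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(w)]
            for y in range(h)]

# Three 1x4 frames of a bright pixel sweeping left to right.
burst = [[[0, 255, 0, 0]],
         [[0, 0, 255, 0]],
         [[0, 0, 0, 255]]]
trail = average_frames(burst)   # [[0.0, 85.0, 85.0, 85.0]]
```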

Nothing captures the energy of New York like a fast-moving subway train. With Motion Mode, just wait on the subway platform for the right moment, snap a photo of your friend, and you have a vibrant, artistic photo to remember this moment. Now, we know that not every picture is taken in the Pixel Camera app. Some of these new camera capabilities and image quality improvements extend to any app that uses the camera, including your favorite camera apps.

Here's Snap founder and CEO, Evan Spiegel, to tell you more. Evan: Hey, I'm Evan. The camera was once a tool for documenting important life moments. Today, people use the Snapchat camera for so much more: as a platform for self-expression, creativity, and visual communication with friends. For Snapchatters, speed really matters. Billions of snaps are created every day, and our community wants to be ready to Snap everyday moments as they happen.

That's why we are always working on new ways to help Snapchatters get to our camera as quickly and easily as possible. We're excited to announce today that we are partnering with Google on a Pixel 6 feature called Quick Tap to Snap. This Pixel-first feature puts the Snap camera directly into the lock screen for fast and easy access to the Snapchat camera. Just tap the back of your phone twice and you're into the camera. This new feature is a speedy and simple gesture that will help our community Snap more moments before they disappear.

We've designed Quick Tap to launch into camera-only mode so Snapchatters can create Snaps even if they haven't yet unlocked their device. Once you make a great snap that you want to share, simply authenticate on your device to unlock the full app experience. With Quick Tap to Snap, Pixel 6 will be the fastest phone to make a Snap, and we're also working with Google on exclusive augmented reality lenses, and bringing other key Pixel features, like live translation, directly into the Chat feature on Snapchat. Snapchatters can talk to their friends in more than 10 languages and conversations will be translated in real-time.

These are the first features coming to Snapchat on Pixel 6, and we can't wait to bring more innovation to our community with our partners at Google. Florian: Hey, y'all. I'm Florian, and I lead Google's Image Equity Initiative, our cross-product mission to improve camera and image tools for people of color. Going back decades, cameras have centered light skin- a bias that's crept into many of our modern digital imaging products and algorithms, especially because they're not being tested with diverse enough groups of people. Photos are symbols of what and who matter to us collectively, so it's critical that they work equitably for everyone- especially communities of color like mine, who haven't always been seen fairly by these tools. This year, one of the advances in Pixel 6 and Google Photos that we're most excited about is Real Tone.

We knew that building for the community meant we had to acknowledge our own gaps, and learn from the folks who know this issue best. So, we started by working with image experts- like photographers, cinematographers, and colorists- who are celebrated for their beautiful and accurate imagery of communities of color. We asked them to test our cameras in a wide range of tough lighting conditions. And in the process, they took thousands of portraits that made our image data sets 25x more diverse to look more like the world around us. They worked directly with our engineers, telling them what was already working well, and where we needed to do better, to make sure our images highlighted the nuances of all skin tones equally. And, they kept it real.

Zuly: When they had a really white background on the back or a light or anything, like, they just looked very washed out. Natacha: And you know, it's a beautiful pink sunset, there's no reason why she should be looking this green. Kira: For this, I would love it if this picture was darker. Deun: The second image, absolutely not. It...No. It was just...the color was...everything was just ashy. Adrienne: If there isn't adequate light, the skin can sometimes skew a little gray or desaturated.

When you take a picture of someone and they look gray? Like, that's not, you know, that's not good. Mambu: It negated the blue. It, like, washed out the blue that would've had the, the brown hue show up more in the skin. Kira: I think that the instinct is suddenly if somebody is not used to shooting darker skin tones the, the instinct is to just be like, you know, to shoot them much brighter.

And it's just like no, there really is like, we should appreciate and really sort of work toward, like, all the different hues and all the different tonalities. Mambu: How do we make sure that when someone grabs that phone, and it takes a photo of them, they see themselves? Kira: It should be that everybody just kind of looks like they look. Florian: All of that wisdom helped us make a more equitable camera: First, to make a great portrait, your phone has to SEE a face in the picture. And our experts helped us

improve our face detection models, so the camera sees you as you are. From there, we improved our auto-white balance tuning to better reflect the beauty of your skin tone and we improved our auto-exposure tuning to make sure your skin looks like you- not unnaturally darker or brighter. They also inspired our teams to make advances like algorithmically reducing stray light that can make darker skin tones look ashy or washed out. And making Night Sight portraits less blurry for folks like me. All of these changes are part of Real Tone, improvements that led these experts to vote Pixel 6 as the most inclusive camera available.

In a blind test across top smartphone cameras, they rated Pixel 6's camera as best in rendering skin tone, brightness, richness, and detail for people of color. Google Photos will also have Real Tone baked into its auto-enhance editing feature, so you always feel seen- from the instant you take a photo to the moment you edit and share it. But this mission goes beyond Google apps: Real Tone will improve the camera performance for photos and videos in third-party apps like Snapchat. Because feeling seen shouldn't be limited to just one tool or company.

This fall, we partnered with the New York Times to see how some of the most compelling contemporary image makers put Real Tone to work in their own art. Let's take a look. Vida: For decades, you know, there've been generalizations made about people of color in imagery and just how they've been shown and show up in their portrayals. It, it whitewashes our history in some ways. Image technology was calibrated for white skin, and that meant that the variety and spectrum of colors had to therefore conform to that calibration.

Picture Progress is a campaign created by T Brand, our content studio at New York Times Advertising in partnership with Google. The campaign for us is an opportunity to literally lay the groundwork of opening the dialogue around what does it look like to have a more equitable and visually representative future? We put the Pixel 6 camera in the hands of BIPOC creators; we felt as though getting their unique perspective on image equity was very important. Kennedi Carter joined us on this effort, and we were so excited to work with her. What we love about Kennedi is that her work expresses and really captures and elicits really unique questions around what "pride" and "power" and "compassion" and "poise" mean in the Black community. Kennedi: It's important to think of images as facts. Whether it's an image of your family and your grandma cooking dinner or it's something like a protest.

This will be a point of reference for a very long time. I found that a lot of my sitters looked good no matter where they were. And that was just straight out the camera, and people were on set looking at the image output and they were like, "Wow, this looks really great."

If you're an artist, you should go into every community with the intention of trying to elevate a voice and make people feel heard and make people feel good. And make people feel proud. Vida: This will be a game-changer for sure. There's something very empowering about being able to see yourself as you see yourself. To see the nuances that make us all unique. Being able to know that you take a photo with a group of friends and everyone is represented equally, all of their beautiful hues are showing up in the same photo. It's our truth in our hands.

Florian: I'll hand it back to Brian now to take you through some remarkable speech innovations in Pixel 6. Brian: Tensor is unlocking so many helpful camera features in Pixel 6 from improved video quality to Face Unblur to Motion Mode. We think it all adds up to Pixel 6 being the most advanced, intelligent smartphone camera. Next up, let's talk about how Tensor is making typing much faster on Pixel.

Obviously, speaking is the fastest and most natural way to go, but today's voice input systems just haven't been up to the task. First, understanding the nuance of human speech is really a hard problem with challenges that are unique to each person. It means understanding syntax, intent, and the context of your request; knowing the terms and names that are important to you but potentially uncommon to others; understanding your accent and dialect, isolating your voice when there's background noise; and even hearing you correctly if your mouth is full! Google has led the industry for years with our natural language processing. It's why Search queries are so accurate, and Google Translate is so good. Thanks to some outstanding work from our speech team, Pixel 6 has an on-device speech recognition model that can transcribe speech with incredible accuracy. And thanks to Tensor, it does it using half as much power as previously possible! It's the most advanced speech recognition model ever released by Google.

With this new capability, we rethought the voice typing experience based on some key insights about how people write. Let me show you how it works: Hey, Rani, are you still up for dinner tomorrow with our friends? [SEND] You can see how accurate the transcription of my friend's name is. Pixel is smart enough to take the hint from my contacts list, and the mic stays open while I tap to insert the clarification about who's coming. Pixel even automatically adds the right punctuation, so I don't need to specify where to put commas and question marks. Let's keep going. I hope you can come. Catherine is hosting this week.

Of course, Pixel won't always know which Catherine I'm talking about and the one I have in mind spells her name with a K. The language model also helps with transcription suggestions, so corrections make sense based on what you're saying. Historically, transcription suggestions have been the same ones designed for typing, based on keystroke proximity. But that doesn't make any sense if you aren't typing with keystrokes! Now, Pixel's model is phonetically based, meaning it suggests corrections that sound similar. So Catherine with a C becomes Katherine with a K, instead of "catharsis" or "cathedral" or whatever else you might've ended up with.
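The idea behind phonetically based suggestions can be sketched as a simple lookup: encode each word into a rough "sound key," then match the transcribed word against names that share the same key. The `phonetic_key` rules and `suggest` helper below are a toy illustration of the concept, not Google's actual on-device model.

```python
def phonetic_key(word: str) -> str:
    """Toy phonetic encoding: collapse spellings that sound alike.

    A simplified sketch -- maps common same-sound spellings to one
    form, then drops vowels and repeated consonants.
    """
    w = word.lower()
    if not w:
        return ""
    # Normalize spellings that produce the same sound.
    for src, dst in (("ph", "f"), ("ck", "k"), ("c", "k")):
        w = w.replace(src, dst)
    # Keep the first character, then skip vowels and repeats.
    out = w[0]
    for ch in w[1:]:
        if ch in "aeiou":
            continue
        if ch != out[-1]:
            out += ch
    return out


def suggest(heard: str, contacts: list[str]) -> list[str]:
    """Return contact names whose sound key matches the transcription."""
    key = phonetic_key(heard)
    return [name for name in contacts if phonetic_key(name) == key]


print(suggest("Catherine", ["Katherine", "Cathedral", "Rani"]))
# → ['Katherine']
```

Here "Catherine" and "Katherine" both encode to the same key, while "cathedral" does not, so the suggestion list surfaces the contact spelled with a K rather than a keystroke-adjacent word.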

And to make voice typing as fast as possible, I should be able to do more with my voice, including commands like "clear" or "send." Pixel is smart enough to understand that I don't want those words in my message. I'm sure it's gonna be great. Clear. I'm looking forward to it. Katherine has a really nice new place. Send. See how fast that was?

Also, notice that your phone remembered it's Katherine with a K so you won't have to make the same edit twice. The model adapts to your usage. And watch this: The model also unlocks the accuracy of emoji transcriptions, which is so much faster than searching through little pages of icons. I think we're making homemade [pasta emoji] and [ice cream emoji]. Send. With the new speech model,

typing with your voice is faster and easier than just typing with your fingers. Thanks to Tensor and these new breakthroughs, we can do all this on-device without connecting to a server, so it's super-fast and reliable. We really think it's gonna change how you use your phone.

Google's natural language understanding allows us to rethink phone calls too. Call Screen is great for shielding you from spammers- unknown callers have to say why they're calling before you decide to pick up or hang up. It's a useful feature and the Tensor speech models are making Call Screen even more accurate in Pixel 6.

The next problem we wanted to solve is calling a business. It's been the same bad experience for decades with hold music, the same long automated system saying "your call is very important". It's the worst, but Pixel 6 makes it so much better for you. Now, before you even place your call, Pixel shows you the current and projected wait times, so you can call when it works for you.

And when you do call and encounter that endless list of options like "press one for branch hours and locations," you actually don't need to listen carefully and remember all the automated menu options. Google Assistant listens for the details, and shows them onscreen for you to tap. Of course, if you still find yourself in the hold queue, don't worry. Google Assistant will hang out on the line and listen to the hold music for you. It understands the difference between a recorded message and an actual representative on the line, and lets you know when a real live human is ready to talk.

Tensor also dramatically improves translation features in Pixel, so you can interact naturally with people who don't speak the same language that you do. Pixel can help with smart, simple translation that's there wherever you need it. Just like with voice typing, speed and accuracy are best-in-class.

Until now, these kinds of breakthroughs haven't been able to run on a phone. They require intensive computation on highly specialized hardware, which is exactly what Tensor is good at. It improves Pixel's translation quality by 18%, a level of improvement that typically takes multiple years of research from our world-class research team. And the improved model uses less than half of the power when running on Tensor. We all have friends or family members whose native language isn't the same as ours, and I've always wished I could speak with them in the language they're most comfortable in. With Pixel 6, I can finally do that.

My sister-in-law is Japanese, and here's a message from her. It says, "I have a question about the latest Pixel." Now, I can reply in English and it will show up in Japanese for her. "Sure, now, I can finally tell you about it!" It all happens right on the device in the Messages app, so I don't have to deal with cutting and pasting text into Google Translate. In-app translation is also available in WhatsApp, Google Chat, Snapchat, Instagram, Twitter, Line, and a growing list of other chat apps. Now, if we combine our real-time translation models with our speech recognition models, we can translate audio from any source, in real-time, right on the device itself.

It opens up a world of content for people. You can watch YouTube videos in your own language, stream a sports broadcast from another country, or watch Instagram live videos from around the world, all translated and captioned in real-time for you, right on your Pixel 6. Translation is even integrated into the Pixel Camera so you can read signs, product labels, documents, and even menus in your own language.

Across all the speech and translation features in Pixel 6, you can see a few of the ways Tensor unlocks new capabilities. Google's ML research enables things like helpful voice typing, calling assistance, and translation on Pixel. And we've got one last translation feature to show you today. The improved Interpreter Mode in Pixel 6 is more fluent, faster than ever, and available in 48 languages.

Here's my colleague Shenaz with Marie Kondo to try it out. Shenaz: Marie Kondo is famous for changing people's lives through her KonMari method of organization. She has a television show, has written books, and is a huge star in the United States. Thank you very much for being our first live translate interviewee.

[AI voice: Japanese Translation] I love seeing what sparks joy in other people's lives. What sparks joy for Marie? [Marie: Speaking Japanese into phone] AI voice: For me, it is a pleasure for as many people as possible to finish cleaning up and lead a thrilling life. [Marie: Speaking Japanese into phone] AI voice: I'm surprised that it's translated so perfectly! Shenaz: What is it like working in a place where people don't speak the language? [Marie: Speaking Japanese into phone] AI voice: Speaking English is certainly one of the difficult things for me.

But it's just fun to be able to join people with different values, different languages, and cultures. Shenaz: It was so nice to talk and connect with you in this way. Shenaz: Thank you. [Marie: Speaking Japanese into phone] AI voice: To be honest, I'm really surprised that my words have been translated and conveyed so well. Marie: To be honest, very nice. Shenaz: We are so happy to hear your excitement because we worked very hard on this.

[Marie: Speaking Japanese into phone] AI voice: I think that the language barrier is getting smaller now. [Laughter] Brian: Let's go back to Rick for a few final Pixel 6 details. Rick: So that's Pixel 6 and Pixel 6 Pro.

They're available for pre-order today starting at $599 and $899 for the Pro. Head over to our online Google Store to pre-order. They'll be on store shelves starting on October 28th. You'll also see our lineup of Pixel accessories, including the beautiful new cases and the hands-free Pixel Buds A-Series.

Between the incredible new Pixel 6 design, the phenomenal camera and speech capabilities, helpful new features, and all the Android 12 goodness, there's a lot to be excited about this year. Pixel 6 and Pixel 6 Pro completely reimagine Google's approach to smartphones and represent the very best of Google's helpfulness. They are phones that adapt to you, and I can't wait for you to check them out for yourself. Thanks so much for watching today. We'll see you again very soon.

2021-10-20