Ultra HDR images Android Build Time


Welcome to Build Time, the show where the Android team talks with Android developers about building great apps. I'm Christopher Cartland, and I'm ultra excited to have some special guests here with us today: we have Levi from the Google Drive team and Mayuri from the Android developer relations team. Thanks for being with us today.

Thanks for having us.

Thank you.

Today we're going to be talking about some cool new capabilities that are available on Android devices and for Android apps to take advantage of. It's something called Ultra HDR, and I'll let the guests explain a little more about exactly what that is. But before we get into that, and how Google Drive has been able to support it in their app, let's have our guests introduce themselves. Let's start with Levi.

Yeah, my name is Levi Schmid. I've been at Google for over eight years, and I've been working on Android for probably 12 at this point, maybe a little longer, from the early days, pre-Lollipop. I initially was actually a mechanical engineer; I joined a startup and ended up falling into software development. It was a small company, so I wore lots of hats, but my favorite of all those hats, from storage to web to mobile, was Android. I've always been an Android user myself (I've actually never owned an iPhone), and I just love how DIY Android has felt from the beginning, how anyone could just download Studio and get started building something fun, exploring, and seeing what you can do. These days I'm one of the eng leads for Drive Android, and I also lead experimentation across Drive here at Google.

Awesome, thank you. I really appreciate you being with us today. And Mayuri?

Hi, excited to be here. I'm Mayuri Khinvasara. I've been working on Android for the past 12 years, pretty much from the days of Froyo, when we were building our own Android OS builds, customizing apps, and putting them on tablets and all of that, so it was a lot of fun to be in that. Android has been a long journey for me; I've always been an Android user and loved how customizable and easily accessible it is. I've been at Google for the past four years, and at Google I've had the opportunity to work on a variety of things: Android TV, Android performance (specifically startup time), backup and restore, and now Ultra HDR, which is one of my favorite features as a camera user myself. It's been great being on the show today.

Awesome, this is really exciting. I think everyone's familiar with Google Drive, but Levi, do you want to give some context about Google Drive so we have something specific to talk about as we get into the topic itself?

Absolutely. What Google Drive does is basically file storage in the cloud. We allow users to easily upload files from your computer or from your phone, you can scan documents using your phone's camera, and those things get uploaded to Drive, to the cloud. The big value add is that you can access your files from anywhere. We've been doing this for a long time now, and Google Drive is a pretty mature product: iOS, Android, web, and a desktop app as well. Lots of consumer users use Drive, and lots of paying customers use Drive for their businesses as part of Google Workspace.

Yeah, and so that's what Drive does.

Awesome, all the files in one place. And then for Ultra HDR specifically, how did you find out about it?

I found out about Ultra HDR through what is pretty much a yearly process at Google. The Android product area here is pretty good about consolidating all of the new capabilities, requirements, and features into a yearly set of asks for different teams, apps, and organizations. Over the past few years I've been part of the small team within Google Workspace (so Gmail, Drive, Chat, Meet, all of the productivity apps that come together as part of Workspace) that evaluates all of those possibilities, asks, and feature requests from the Android PA and then decides which ones we as a team are able to commit to: which ones we have the resources available for, which ones are high priority for us, and which ones aren't. It was as part of that feature evaluation process that I came across Ultra HDR.

Maybe this is a moment to also define what Ultra HDR is. Mayuri, would you be able to take that?

Yeah, I would love to. Ultra HDR is a new capability we introduced in Android 14, and it basically transforms how we view images on the new HDR screens. Before, our cameras could capture more colors, more dynamic range, more vibrance, but we used to translate all of that to display on a standard dynamic range screen, an SDR screen, so you really lose the HDR quality: even if you have the HDR data, you're not able to see it. That's where Ultra HDR comes into play. With Ultra HDR it's really an end-to-end experience for the user: not just capturing the photo in HDR, but also displaying it in HDR, editing those images, and then sharing them. The complete user journey for an Android user has been transformed with Ultra HDR. It displays more vibrance, more brightness, more darkness in the colors, and gives you a very real-life experience. Imagine a case where you're capturing a sunset, but when you look at it on the phone you don't really see what you saw with your own eyes; there's a difference there, and that's what Ultra HDR is trying to solve, a more true-to-eye experience for the user. It's been great, and I would urge users to really see it on a screen: it's a feature to experience, not just to see in screenshots or on projectors. See it on the device and you will feel the difference.

Awesome. It seems like something that's really hard to... we can explain it, but it's also hard to show to most people, because it's about the viewing experience, which requires some sort of hardware dependency, right? So how do you actually view these images?

Yes. With Ultra HDR you definitely need HDR-capable displays, which most phones today have; all premium phones today will have HDR-capable displays. And to display it, Android 14 has a new API which is really simple and straightforward: you just enable HDR mode in your activity and you're able to see HDR on screen. The beauty of this API is that it's completely backwards compatible, so by design it will work as-is on an SDR screen, a standard dynamic range screen. As developers you really don't have to make any change to support it across devices, and that's the beauty of the API.

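(For readers following along in code: the "enable HDR mode in your activity" step Mayuri describes corresponds to setting the window's color mode. Here is a minimal Kotlin sketch; the activity and layout names are hypothetical, and on SDR-only displays or pre-Android-14 devices the flag simply has no visible effect, which is the backwards compatibility mentioned above.)

```kotlin
import android.content.pm.ActivityInfo
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

// Hypothetical full-screen photo viewer activity.
class PhotoViewerActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Ask the system to render this window in HDR. On devices or displays
        // without HDR support this is ignored, so the same code path is safe
        // everywhere.
        window.colorMode = ActivityInfo.COLOR_MODE_HDR
        setContentView(R.layout.activity_photo_viewer) // hypothetical layout
    }
}
```
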
Okay. And just to make sure we understand where this fits: we talked about the cameras being able to capture these images, and the display being able to show them, but where does the image get stored in the middle? What's the common part of Ultra HDR in all of this?

Ultra HDR is designed so that it's a JPEG image, a format that has been standard across industries for decades, so all devices will support it. Generally with HDR you would want 10-bit output, but that restricts you, so instead we have two 8-bit images: one is the standard 8-bit JPEG, the SDR image, and there is one more computation called the gain map, which stores the brightness and darkness data. The size of the gain map is pretty much just 5% of the original image, so it's not really heavy on size. When we combine these two, the base SDR image and the gain map (the extra delta highlights of the image), they're displayed together on screen as HDR. Android has out-of-the-box support for displaying this whenever that extra gain map data is available. If it's not available, or if you're displaying on a screen that doesn't have HDR capability, it just shows as-is.

So there's an image format based on JPEG, and it's backwards compatible. Since many cameras might capture this, and users may back up these images or share them with other people, that means they're probably getting stored in Google Drive. So maybe we can talk a little bit about these images: even if you hadn't heard about Ultra HDR, users would be sharing them and storing them in Google Drive. What does that mean for an app like Google Drive?

Yeah, I can speak to that one. Google Drive stores hundreds of different file types, probably more than that, probably thousands, and we have client-side viewers for different file types. Ultimately there's various logic by which the client pulls down the bytes of a file from our backend and then shows them on the client side. When Mayuri and I first started talking about Ultra HDR, I actually thought that backend integration was going to be a problem, and that we were going to have to add additional logic to our backend to make sure it was serving the right version of these files. It turns out it wasn't a problem for us at all: if you upload the bytes, Drive is going to store the bytes. We do have logic when it comes to serving thumbnails, for example in a list view, where we request a transformation of those bytes to a lower resolution, and we certainly would have had to add logic there if we wanted our thumbnails in the list view to be Ultra HDR. But that's not really where this thing shines, so our thumbnails continue to be SDR, low-resolution images. Fundamentally, though, when you open a file and you get that full-screen view (we call it the projector view), we're just pulling down the bytes of the file, and since Ultra HDR is designed so that it's just a JPEG, the bytes contain the information. So it was really just about the client-side changes to make sure we're enabling this functionality in the framework to take advantage of that extra data and show the best version of the image to the user.

Awesome. So you were able to basically simplify the server side by saying: it can keep doing the regular standard dynamic range for the thumbnails and the smaller surfaces, and your focus was on the hero experience, when you're really trying to look at an image and make it look its best. So the changes you made were all focused around viewing a full-screen, or at least a large, version of the image.

That's exactly right, yeah. And Mayuri, if I remember correctly, that was pretty much the guidance from the team responsible for Ultra HDR, is that right?

Yes. Ultra HDR will definitely give you a pop of colors on the screen and use the full range of display colors, so if it's a small image, like a thumbnail, you really can't notice that much of a difference. You would also have multiple images together, and maybe one Ultra HDR image might pop up too bright, which is not a great user experience. So our recommendation was always to use Ultra HDR when the image uses at least 75% of the screen; that's when the user will be able to see the true colors of Ultra HDR.

Got it. And when you are going to use this, what do those changes entail? I'm curious to get Levi's perspective, since we have a lot of other content where we describe how we think you should do it, but as an app developer, when you're approaching this, what did you have to do?

Yeah, this is actually a topic I'm excited to talk through. This is one of those rare cases where a feature request comes in, we look through the documentation, we estimate how much work we think it will be, and we're always trying to foresee the potential problems we'll have; like I said, I initially thought there would be a backend component involved. I will admit I severely overestimated how much of an investment this feature would require, and it's a testament to the way the team designed it: this was exceptionally simple, a few lines of code. I can go into a little bit of detail here. The easiest way to do this is that you can actually control it at an activity-by-activity level and just flip a flag that says, hey, this activity should support HDR... what is it called, Mayuri? Remind me of the actual API name.

Color mode.

Yes, color mode HDR. Once you've told your activity, hey, support HDR, then all of the underlying framework classes, BitmapFactory, all that kind of stuff, will automatically give you back the full HDR bitmaps. So that was pretty much one or two lines of code. We put it behind a flag, made sure we rolled it out safely, added some logging and that kind of stuff, but that was pretty much it.

Now, the way our app works, like I mentioned, Drive can be used to view hundreds of different file types. Images, by the way, are one of the top file types: over two-thirds of the files opened in Google Drive are images, and that accounts for over a billion images opened in Drive every month. So images are a big part of what we do.

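(To make the "one or two lines of code" concrete, here is a rough sketch of a full-screen image path along the lines Levi describes: the file bytes have already been fetched, BitmapFactory decodes them, and because the window is in HDR color mode the framework applies the embedded gain map at render time. The function and parameter names are illustrative, not Drive's actual code.)

```kotlin
import android.graphics.BitmapFactory
import android.widget.ImageView

// Hypothetical "projector"-style full-screen viewer path. The hosting window
// is assumed to have window.colorMode = ActivityInfo.COLOR_MODE_HDR set
// (see the earlier sketch).
fun showFullScreenImage(imageView: ImageView, fileBytes: ByteArray) {
    // BitmapFactory decodes the gain map embedded in an Ultra HDR JPEG along
    // with the base SDR image; a plain JPEG decodes exactly as before.
    val bitmap = BitmapFactory.decodeByteArray(fileBytes, 0, fileBytes.size)
    imageView.setImageBitmap(bitmap)
}
```
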
But that same activity that we use to view images is also used to swipe through the different files in a folder. So I might have a folder that has an image, a PDF, a video, and an audio file; if I tap on the image I see the image, and then I can swipe to the next file in that folder, and so on. That one activity handles all those different file types.

So when we just turned it on at the activity level and started to roll it out, we had some issues that were specific to a couple of OEMs. Basically, the issue was that we were flipping this flag no matter what. We were just like, hey, party time: I don't care what kind of file you're opening, I don't care whether it's actually an HDR image or not, let's just flip this flag, and if it is, you get this functionality. Because in the framework that flag is basically safe: you're supposed to be able to flip it, and it has no effect if the thing you're looking at is not an Ultra HDR image. On 90% of devices that was the case (I'm just throwing out that number; it was probably even higher than that). But we did start getting a couple of reports from users on specific OEM handsets that had a side effect: because we were flipping this Ultra HDR flag, if they were not looking at HDR content it was actually lowering the brightness, or at least users were observing lower brightness. We looked at that and reached out to the OEMs to try to figure out what might be going on. It turns out, kind of predictably, that as an OEM you can't roll out a quick fix for an issue like that. So what we decided to do was limit turning on this flag to only the cases where we know we're viewing an Ultra HDR image. There's also an API the team built for that; I think it's called hasGainmap. You can basically ask that of the bitmap: hey, does this have gain map data associated with it? If so, then we turn the flag on. That way we weren't invariably turning it on and lowering the brightness on those few handsets for any file type or any image; we were only doing it so that users who actually are looking at an Ultra HDR image get to see it in full Ultra HDR glory, and everything else was unaffected.

So it seems like there's an activity level, and then a flag you can dynamically change in your code based on whether you want to enable Ultra HDR viewing.

Exactly, yeah.

So Mayuri, can you tell us more about when developers should use the activity level versus the dynamic flag?

Yes. Ultra HDR can also be enabled at the manifest level, but we don't really recommend that unless your app consists only of images and viewing them. The second option is enabling it at the activity level: if your user journey in the app is a screen with a full photo viewer, where you keep swiping left, right, up, down and constantly see images, that's where we recommend setting HDR mode on the activity. But if you don't have full HDR content and your influx of images is mixed, we recommend you only activate HDR mode when you have that data. And it's a pretty simple API: you just read the hasGainmap true-or-false value off the bitmap, and you can easily toggle the activity mode based on that.

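(A sketch of that conditional approach: decode the image, ask the bitmap whether it carries gain map data, and only then switch the window into HDR color mode. Bitmap.hasGainmap() is available from Android 14, API 34; the function name here is illustrative.)

```kotlin
import android.app.Activity
import android.content.pm.ActivityInfo
import android.graphics.Bitmap
import android.graphics.BitmapFactory
import android.os.Build

// Only opt the window into HDR when the image actually has gain map data,
// so non-HDR content never triggers the brightness side effect described
// above. applyColorModeFor is a hypothetical helper name.
fun applyColorModeFor(activity: Activity, imageBytes: ByteArray): Bitmap {
    val bitmap = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.size)
    val isUltraHdr = Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE &&
        bitmap.hasGainmap()
    activity.window.colorMode =
        if (isUltraHdr) ActivityInfo.COLOR_MODE_HDR else ActivityInfo.COLOR_MODE_DEFAULT
    return bitmap
}
```
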
Additionally, what if you don't use an ImageView but use advanced image-loading libraries to load images in your app, which is what most top apps do today? Glide and Coil also have out-of-the-box support for Ultra HDR, so you don't really need to worry, and you can always intercept the bitmaps to check whether an image is Ultra HDR or not. So essentially: use it at the activity level when you have a full-screen view and only that; but when you keep changing your views, or when you have an influx of mixed images, only activate it when you need it. That's more conservative and better for power, because you're also using less of the screen's brightness.

I wanted to ask a little bit more about the API before getting into the library implications. Is this API for turning the flag on and off at the activity level, or does it live in some kind of display manager or window manager? Where does that API actually live?

The API is at the activity level, and that's why, anywhere in the activity lifecycle, you can just set the flag on and off and the screen will use the hardware layer to display more colors.

Got it, great. And now for the image libraries you mentioned, you said they handle this out of the box. Does that mean you don't have to set the flag, and if the library is aware of Ultra HDR images it will set the flag for you?

No, it's not that the image library will automatically set it for you; that's not the right behavior, because you haven't requested it. But if you set the activity mode to HDR, and anywhere in your activity you're displaying images through these libraries, they will detect that you're in HDR mode and will be able to show that extra information in the Ultra HDR image.

Got it, awesome, thanks. So you don't have to make any changes to your image logic anywhere in your activity if you just set the flag.

Right, and it's an opt-in API; it's not forced on the developer or the user.

So in Google Drive, it sounds like you initially did the simple thing and just enabled it everywhere, found that it caused some user-experience problems in some cases, and then moved to a more case-by-case basis: when an Ultra HDR image is displayed, tell the activity to enable that display mode.

That's exactly right, yeah.

Were there any other steps?

That's pretty much it: launch it, profit, done.

Fantastic.

I would say it's pretty much a low-effort but high-value feature, and in terms of the user experience, it really transforms things when you see an Ultra HDR image.

Yeah, and it's worth talking about: I mentioned we evaluate close to a hundred features from the Android org every year, of "hey, this is the new thing, take advantage of this," and unfortunately we don't have the bandwidth to do every single one, so we do have to triage and prioritize. I mentioned this was easier than I thought it would be, but I will also mention it was higher value than I thought it would be.

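(Circling back briefly to the image-loading-library point from earlier in this exchange, here is one way to "intercept the bitmaps," sketched with Glide's asBitmap/CustomTarget API. Exact gain map handling depends on the library version you're on, so treat this as an assumption-laden sketch rather than a documented Ultra HDR recipe from Glide or Coil.)

```kotlin
import android.app.Activity
import android.content.pm.ActivityInfo
import android.graphics.Bitmap
import android.graphics.drawable.Drawable
import android.os.Build
import android.widget.ImageView
import com.bumptech.glide.Glide
import com.bumptech.glide.request.target.CustomTarget
import com.bumptech.glide.request.transition.Transition

// Intercept the decoded bitmap, check for a gain map, then display it.
fun loadWithGlide(activity: Activity, imageView: ImageView, url: String) {
    Glide.with(activity)
        .asBitmap()
        .load(url)
        .into(object : CustomTarget<Bitmap>() {
            override fun onResourceReady(resource: Bitmap, transition: Transition<in Bitmap>?) {
                val isUltraHdr = Build.VERSION.SDK_INT >= 34 && resource.hasGainmap()
                activity.window.colorMode =
                    if (isUltraHdr) ActivityInfo.COLOR_MODE_HDR
                    else ActivityInfo.COLOR_MODE_DEFAULT
                imageView.setImageBitmap(resource)
            }

            override fun onLoadCleared(placeholder: Drawable?) {
                imageView.setImageDrawable(placeholder)
            }
        })
}
```
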
And we kind of alluded to this already: it's hard to communicate what this actually feels like in your hand, especially when, as a developer, you might not have a debug device or a device that supports Ultra HDR in your hand (maybe that was more of a problem a year or two ago, and hopefully less so now). So it's hard to really deeply grok how much of a better experience this will be for your users until you can hold it in your hand, and not just have an Ultra HDR phone but also a non-Ultra-HDR phone, and then actually view the same image in Ultra HDR. It's easy to get it wrong when you're first evaluating this and asking, is this really different? You might think you have a good comparison case, and then it turns out you're using an app that's not flipping the flag, or you flipped the flag but it's not an Ultra HDR image. There are a lot of variables. But once we actually got it right and I got it in my hands, I was like, oh, okay, I see the value here, I see how this is so much better. So I thought that was worth mentioning as well.

It seems like it's a feature you can't really experience until you've done it. You can't just do a mock and pitch it to the team: is this going to be good enough? I can see a pessimistic, resource-constrained team saying, "ah, but it is backwards compatible, so do we really need to add this? The image will still be displayed, so why do you think this is so important?" It seems like it's hard to convey the value.

It really is, in so many ways. There were times when I was communicating this to other SWEs on my team or up my management chain, and it was like, okay, let's write up our one-pager about what this is, let's just take screenshots... oh wait, screenshots don't actually show you the difference. You'd have to take a picture of the phone with another phone, but then... it's really hard to find a way to communicate this without actually walking up to someone with two different phones and showing them: hey, this is what this looks like.

And to add to that, just to solve this problem of understanding the value, we've made a sample. In the sample we have a toggle button for SDR, for the gain map, and for the Ultra HDR image, so on the same screen you just keep toggling and you see the same image in the different formats. You also see the gain map, which is like a black-and-white map of the brightness and darkness applicable in the image. So in one app you can see the versions side by side and see the difference; you don't have to go check somewhere else, just use that sample app to see the difference for yourself.

Got it. And where do we find that sample app?

All of these samples are available on GitHub; they're part of the platform samples in the Android repository. We also have documentation on Ultra HDR in our developer documentation, and that lists all the samples we have. We have around seven to eight samples for Ultra HDR: displaying an image in an ImageView, in Android Views, in Compose views, combining multiple Ultra HDR images into a video, editing the images, viewing the gain map; all of those samples are very much the user journeys an app would need. So developers can just go look and play with them for themselves.

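(The gain map view that Mayuri describes in the sample, the black-and-white map of which areas get boosted, can be pulled straight off a decoded bitmap. A minimal sketch using the Android 14 Gainmap API; the ImageView usage is illustrative.)

```kotlin
import android.graphics.Bitmap
import android.widget.ImageView

// Show the gain map itself next to the base image, similar to the platform
// sample's toggle. Requires API 34+, where Bitmap.getGainmap() is available.
fun showGainmap(source: Bitmap, target: ImageView) {
    val gainmap = source.gainmap          // null for a plain SDR JPEG
    if (gainmap != null) {
        target.setImageBitmap(gainmap.gainmapContents)
    } else {
        target.setImageBitmap(source)     // fall back to the base image
    }
}
```
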
Awesome. I wanted to talk a little bit more about other formats with similar-sounding names. We've been talking about Ultra HDR, and about images. So first of all, is this an image-only format?

Yes, Ultra HDR is an image-only format, because it's pretty much based on the JPEG format.

Got it. And so there are other HDRs out there; what is it not?

The other format is HDR video, which is showing an HDR-capable video on your device, and that's completely different in terms of how you use codecs and so on to display an HDR video. Displaying an Ultra HDR image is very low effort by comparison.

Great. So we've been talking about images only today, and that's on purpose; it's not because there's another hidden part of Ultra HDR that covers video.

Yes.

In terms of making sure this works well: I can tell that Levi is happy with it when he's able to get the two phones side by side and show that to the team, but how else can you make sure this is working well for your users?

In terms of working well, it's a system API that works out of the box, and usually OEMs have hardware support for it in terms of the hardware layer they add for brightness, so that part you can only check when you see it on a device. But generally it's a JPEG format: if it doesn't work, there's no error that you see; you'll just see a standard dynamic range image. That's the easiest part of the API, that even the error case is a fallback to a standard image. You won't see an error or anything else; you either see it in HDR or you see a standard image, so that's the easiest error condition to handle, I guess.

Got it. And for the Google Drive team, is there anything you do in testing or monitoring to make sure this is working well?

Yeah, so, well, I should start at the beginning. First and foremost, we have automated testing. We already had a test suite that was set up to test our image viewer, and we made sure we added a new test case with an actual Ultra HDR image to make sure that continues to work. Those tests run on emulators, and right now our emulators do not support Ultra HDR, so that's really testing the fallback case, which is still important, right? That still has to work, because fundamentally we never want to break the 99% for the 1%. Of the entire Drive corpus, Ultra HDR images are not yet the lion's share. I think over time they will be, as more and more users get devices in hand that are able to capture Ultra HDR content, but right now that's not the case. So step one, automated testing. Step two, we have a QA team that does manual testing, and our typical process whenever we have a new feature is to write up a test plan based on the new feature and instruct the QA team how to access it. Thankfully the QA team had some premium devices that do support Ultra HDR, and we were able to write up a test plan and make sure they understood how to evaluate that this is working properly. And then once we've got sign-off from QA, we go ahead and launch.

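(As a rough illustration of the kind of automated check described here, not Drive's actual test suite: an instrumentation-style test that decodes a bundled Ultra HDR test image, asserts the decode succeeds everywhere, including emulators that can't display HDR, and asserts the gain map is present on Android 14+. The asset name and test class are hypothetical.)

```kotlin
import android.graphics.BitmapFactory
import android.os.Build
import androidx.test.platform.app.InstrumentationRegistry
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertTrue
import org.junit.Test

class UltraHdrDecodeTest {

    @Test
    fun ultraHdrAsset_decodesAndFallsBackCleanly() {
        // "ultra_hdr_sample.jpg" is a hypothetical asset bundled with the
        // instrumentation APK.
        val assets = InstrumentationRegistry.getInstrumentation().context.assets
        val bytes = assets.open("ultra_hdr_sample.jpg").use { it.readBytes() }
        val bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.size)

        // The decode itself must succeed everywhere (the SDR fallback case).
        assertNotNull(bitmap)

        // On Android 14+ the decoded bitmap should carry the gain map data.
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE) {
            assertTrue(bitmap.hasGainmap())
        }
    }
}
```
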
Cool. And are there any lessons that you learned while implementing this that you might take into future releases?

Yeah, there are several. I think we've alluded to a few of them, but just to call them out again. First and foremost, I think there was a lesson to be learned about how to communicate quality-related projects. A lot of times we're used to things like, hey, we're doing a new Material refresh and we're going to be updating the UI, and those things are easy to see; it's easy to put the before and after side by side and see the difference. This was one of those rare cases where it was really difficult to communicate what the real change was going to feel like, and that's something for us to be on the lookout for in the future, because initially we were going to not do this project. We were like, nope, we don't have the bandwidth to do it. Mayuri really saved us from that by going above and beyond and showing us: hey, this is how easy this is going to be for you, and this is the value. So I'm really thankful to Mayuri for that.

The other lesson, and we alluded to this as well: our initial implementation was just, hey, this is supposed to work in all cases, so let's just flip the flag on and not worry about it. But given the Android ecosystem, the diversity of different devices and OEMs, the fragmentation, and how far back folks are still using Android devices from years and years ago, it's always a good idea, if something has value for a subset of cases, to limit that thing to that subset of cases. That's what we ended up going with, and we were able to launch this successfully because of that. So that was definitely a lesson learned for me on this project.

So what devices are supported? You mentioned potentially older versions of Android; is there a minimum Android version or some other requirement for this API?

Ultra HDR as a format was launched with Android 14. It's also the default capture format on most flagship phones, for example Samsung, Pixel, and OnePlus. Since it's the default capture format, there's a huge influx of these images coming into the ecosystem, and if apps are able to show them, that's a great user experience. So it works from Android 14 onwards, and most flagship phones, as well as other phones that are HDR capable, can display those images. And, for example, Drive is an image-sharing app in some cases, so even if I don't capture an Ultra HDR image myself, somebody can share one with me and I can still view it in Ultra HDR. That's how the capability is designed, for the end-to-end journey.

Got it. So it seems like if you are an app that displays images that have been captured on one of these devices, regardless of who captured them, then your app could benefit from showing Ultra HDR.

Yes. And one more reason for that: many developers assume that their server-side compression could affect Ultra HDR images, because you always have the gain map data. But since the format is JPEG, we've seen that most compression pipelines do not affect an Ultra HDR image, so the gain map data, which is just about 5% of the original image, is still preserved; and if that's passed on, the Android system displays it by default.

Can you describe a little bit what the gain map actually is? I know that a regular image ends up being a two-dimensional representation of some data, but what does the gain map look like?

Sure. The gain map comes from a format that Adobe created for images, and Google's Ultra HDR implementation also uses Adobe's format as its base. The gain map is again an 8-bit matrix of information: it gives you the information about brightness and darkness.

For example, if you have a sunset image and you just apply a brightness filter, the entire image gets brighter. But we only want certain places in the image to be bright, so that they pop and give you a real-life experience. So Ultra HDR only stores data about which pixels should be brighter or darker, and that information, in terms of black and white, in simpler words, is what the gain map stores. It does not store every coordinate of the original image pixel for pixel; it only stores where there is a delta, where there's a difference, and that's why the size is around 5% and not 100% as a direct one-to-one mapping.

So it sounds like it still has some pixel-by-pixel data, but not for every pixel.

Yes, and that's by design, because it has to consume less space. It cannot be a replica of the original image, but it also has to carry the data that creates the big difference when it hits the display layer.

Got it. I'm curious if you tested any extremes, like an extremely bright and extremely dark image where every pixel needs something. Is there a maximum amount that Ultra HDR might add to the size?

Interestingly, with the next version of Android we're now able to give developers control over the HDR headroom, so they can control how much of the hardware display's brightness they want to use, and how much brighter or darker the image should be. So we give that control to developers as well.

Got it. Is that available now, or is that coming soon?

That's launched in Android 15; Android 14 launched Ultra HDR. We saw developers asking for more granular control over the brightness, because they often want to keep multiple images side by side, and that's where the new API, which gives them more control over the hardware layer, comes in.

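(The "more granular control" Mayuri mentions corresponds to the headroom API added in Android 15. A minimal sketch; the specific value is just an example, and the helper name is hypothetical.)

```kotlin
import android.app.Activity
import android.os.Build

// Cap how far above SDR white this window is allowed to go (Android 15+).
// A headroom of 1.0 means "no brighter than SDR"; larger values allow more
// of the display's HDR range. The 2.0 here is an arbitrary example value.
fun limitHdrHeadroom(activity: Activity) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.VANILLA_ICE_CREAM) {
        activity.window.setDesiredHdrHeadroom(2.0f)
    }
}
```
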
So what's coming next with Ultra HDR?

The influx of images is going to be great, and we're working with a lot of apps to adopt this as a format. Earlier, developers always thought this was mostly relevant for social media apps, because images are the primary form of communication there, but with the example of Google Drive and other such apps we've seen that they benefit a lot from Ultra HDR too.

So as an Android user, where else am I going to find Ultra HDR?

Google Photos has been one of our earliest adopters of Ultra HDR, and we worked very closely with them to actually define the format for Android, so Google Photos is one great place to see Ultra HDR images, not only on Android but on iOS as well. We have Instagram, who were able to adopt Ultra HDR within just three months of the OS release. We have Google Messages, Google Files, and a host of other apps that are able to give you the Ultra HDR image experience. And notice that only one or two of these are social media apps with a primary use case of photos; the others are also able to see the value. So if you're an Android user and you have a premium phone today, you're already seeing a lot of Ultra HDR images.

We've talked a lot about Android, and since these are files that are being shared by users wherever they are, and Google Drive has clients across different operating systems and platforms, was this work really isolated to Android, or was there any implication that you needed to coordinate with other teams?

Yeah, in this case the solution was entirely client-side, which means it was limited to just the Android code base, so it ended up being just an Android solution. The cool thing about Google Drive, just like many other file-sharing apps or apps that support viewing images, is that we support viewing images that may have been taken on the same phone you're on, or images that have been taken on some other phone owned by someone else, or on some other device. So this is just part of improving the quality and the user experience on Google Drive. For us it is a subset of usage, the case where you are viewing an Ultra HDR image on a handset that does support Ultra HDR, but it's closing those gaps one by one that ultimately enables us to ship a product that's really high quality.

And additionally, just to add to that context: even if a user today does not have an HDR-capable display, they might upgrade in the future, and whenever they upgrade they would be able to see all their past images in Ultra HDR and true colors. So that's also what we're building for the future, because phones will keep upgrading, hardware will keep getting better, and we already have this image data; so without doing anything more, we'll be able to offer a premium-quality experience to our users in the future.

That's a great point. Awesome, thank you for that. Well, it's great to be able to talk to the Google Drive team, to hear their insights, and also to know that this is something everyone is working on together to make the experience better for all of our users.

Yeah, it was great talking to you and revisiting the entire journey we went through with Drive to get them onto Ultra HDR.

Fantastic. And I guess before we go, I'd like to ask if anyone has any Android pro tips to share with our audience.

Android pro tips... I have one. This is about Android in general and not specific to Ultra HDR: managing background tasks is a big challenge in Android, but with the newest APIs that have been launched in the last two versions, I think it's great in terms of having the guarantee that your task will be executed, and in how efficiently you use the system resources to manage those background tasks. As a developer it's a big challenge to keep a lot of work happening in the background, but using the latest APIs it's getting so much easier.

Well, thank you so much for taking the time to talk with us today. It was really great to learn the insights and learn how easy it was, and we hope to see more Ultra HDR images in the near future.

Awesome, thank you. Let's go, Ultra HDR!

2024-11-16
