In this episode of the Denoise podcast, we're going to talk about NAB predictions, what we already know, and OpenAI's move into open weights. Let's get into it. All right, welcome back to the podcast. Hey Joey, are you going to NAB? I am going to NAB, getting ready to go tomorrow. Just had a team meeting with all our editors, so getting ready for the circus of a bunch of video coverage. It sounds like it's going to be a really exciting year, lots of new announcements. I'm sorry I'll miss it. I know, but I'm sure we'll catch up. Yes, we'll have a debrief after the fact, but I'm bummed you won't be there; we can't tag-team around the show. There have already been a good handful of announcements and interesting products coming out, so I want to break down what we already know and what's circulating around the internet. We're recording this Thursday, and Blackmagic is doing their big press live stream (usually it's pre-recorded), their big announcement of all their new updates, tomorrow morning, Friday. By the time this comes out, that will probably have already played. So if you're wondering, "Why aren't you talking about all the new stuff that came out?"
that is why: we're recording on Thursday. But there was a leak today of an update to the PYXIS camera. They're releasing a 12K version, and it looks like it's with their RGBW sensor. Oh wow. I was wondering whether they were going to start putting that sensor into more of their smaller cameras, and it looks like they are, which is very exciting. As we talked about in the episode breaking that sensor down, the really cool thing is that if you shoot different resolutions, it can automatically remap on the sensor, so you don't have to deal with the crop issue. It doesn't do the typical windowing where it punches into a section of the sensor. So even if you're shooting 4K, which in a lot of cases you will be rather than 12K, you're not forced to crop in; you still get the wider field of view, which means you're still getting a full-frame lens response and the photo sensitivity of the entire sensor, yet the files stay lightweight because you're shooting 4K. So that's exciting. I am curious, because this was a leak: Blackmagic usually does a lot of outdoor signage around the convention center and the hotel across the way. The downside is you have to put it up early, and people take photos of it, and that's what happened; it was posted on Reddit while they were putting the signage out, so all we have is a picture. I'm curious whether it's going to be the same exact body, or whether it will address a few of the slightly annoying things on the PYXIS, like having only one mini XLR input and no internal ND. So I wonder if they're going to bump anything up on the PYXIS itself, or if it's just a higher-resolution version in the same body. And then there's the URSA Cine 12K. I was just going to say, this is the sort of issue a lot of manufacturers have with product lines at different tiers: you don't want the tier at the
bottom to start nabbing sales of the tier up top, which makes more money. So I think they'll keep a really clean line between the PYXIS and the URSA: you need more inputs, you need more X, Y, and Z, go get the URSA; if you're good with this, get the PYXIS. What do you think on pricing? The PYXIS as it exists now is $3,000. The URSA Cine 12K, they dropped the body-only price to, what, $6,000? I thought $7,000. Yeah. And I'm guessing this is maybe $4,000 or $4,500. I wouldn't be surprised if they offer it at $3,000, blow the RED Komodo out of the water, and drop the existing PYXIS down to something like $2,000. It's incredible, the world we live in, where $3,000 gets you a solid cinema camera. And with RED dropping their prices too, I've never seen affordable, high-quality cinema cameras like this. Do you remember, about a decade ago, when the hardest part of a shoot was getting a camera with enough dynamic range, with the correct lens mount, that even had video features? We're talking about the 5D Mark II: it looked great, but you could only record a couple of minutes at a time before you had to cut, it had basically zero video features, and you were hacking your way through it to get something that looked cool. Then we transitioned to the mirrorless world, and now we're in the world of proper sensors made for cinema, which is what the PYXIS and URSA sensors are. I am wondering about the price, and I feel like it's going to be crazy that I'll probably have to ask companies about economic policy at NAB. There are a bunch of tariffs that just went into effect in the US, and I'm wondering whether that's going to affect pricing for these companies. Especially someone like Blackmagic; they usually are pretty outspoken
about their price and make it a big part of their marketing campaign. Is that going to change? Where are these things made? Probably not in America. I highly doubt it; they're an Australian company, so it's probably Southeast Asia, Thailand or Vietnam, is my guess. So I'm wondering, for their US sales, does whatever price they announce stay the same? And I'm wondering that across the board for all the hardware companies. That's the bigger question for the economy: are the manufacturers that get hit with tariffs going to absorb the cost, cutting back in other areas like sales or marketing, or are they going to pass that cost to consumers and look 25% more expensive? Or a combination of all of the above. So I think I'm going to have to become an economist at NAB, but I feel like that's going to surprisingly play into the conversation. Just to play devil's advocate, who is manufacturing cinema cameras in America? As far as camera sensors go, is RED making their sensors here? Probably not. Has anyone ever made their sensors here? I don't know enough about manufacturing, but I don't think so. And that's the same issue with any kind of silicon, whether it's a CPU or a GPU: the wafers aren't made in America anymore, and camera sensors are wafers. So in my opinion there is not a single camera manufacturer that isn't going to be hit with tariffs, where their cost of sales in the US will be affected. One last thing about the PYXIS: I am excited about the way this news leaked. I've talked about how the excitement of pre-pandemic NAB hasn't bounced back yet, but this was the kind of thing that would happen around NAB pre-pandemic, where some booth would be setting up their signage and
someone would sneak a photo and post it online, and it would be, oh look, they're launching this new thing. We haven't had that buzz and excitement around a leaked photo in a while, so I'm excited that anything leaked at all, that there's some mystery and anticipation before it's officially announced. It's really nice, right? It takes you back to when you were excited about new technology and there were leaps and bounds in each generation. To go from the old PYXIS to a 12K with an RGBW sensor, and still offer it at whatever price, which I'm sure will be super competitive, is a big deal. So, very excited about that. All right, other product announcements, workflow-wise. They've been teasing this for a couple of weeks: Strada is releasing a new app they're calling Strada Agents. It's basically software you load on your computer that makes any hard drive plugged into that computer remotely accessible. You can access the media files, stream them, download them, without putting them on a cloud intermediary; it's directly peer-to-peer. That's insane. How safe or secure is it? That's my first question. I'll ask them for more details about the security, but I believe it's streaming over WebRTC, and I don't think there's any extra encryption happening when you're playing it, but access is gated through your account, which has credentials. Currently it's still in alpha; they sent me a preview to download, and I messed with it yesterday, and it worked. It's wild. I'm accessing a computer somewhere else and I can just browse. It only works with video files, but I can browse, click, hit play, and it's just playing
back on the computer remotely. They said they've tested it with a computer in Australia and playback was pretty seamless. When this comes out, if it works the way it's supposed to (and in my test it did), it will solve a lot of the issues I had when I got a NAS and was trying to get the team onto it. I realized that once you go the media asset manager route, nothing works directly with the server: you have to upload a proxy to a cloud intermediary, and if someone wants to download a file, it has to go through a cloud service first. Like the Blackmagic Cloud thing. I think you did a video or a live stream on that; how do you compare that solution, which I think is really polished and elegant, to something like this? Blackmagic Cloud is definitely more polished. With Camera to Cloud they went the Apple route: if you're using Blackmagic cameras, Resolve, and Blackmagic Cloud, it's a very seamless experience. I'll find out more, because our plan is to use that entire workflow for our NAB coverage, so I'll see how it works. You're really going to put it to the test. I am, because we've tried Frame.io and Camera to Cloud solutions in the past, across a variety of manufacturers, and the main issue we hit was that the internet at the convention center sucks, and everyone knows this. Paying up for the Teradek with 5G just wasn't in the budget this year, so I'm going to use a phone as a hotspot to try to solve that. We'll see how it goes. To your original question, comparing it to Blackmagic Cloud: it's different, because with Blackmagic Cloud you still have to put the media on cloud storage, which then downloads a local copy to whichever editor has access to that footage. With this, you plug it in, you stream it, you get access to the actual file. You're basically turning your hard drive
into a cloud-accessible drive, your own personal cloud, without actually using a cloud service. And no subscriptions, none of the baggage that comes with one. How would they make money from this? They have a subscription service right now; I don't know how they're going to price this into it. It's still early. I think they're previewing it, and the official rollout they have planned is in June. It worked when I tested it and the playback was really smooth, so I'm excited to see where this goes. It's pretty innovative, what they're doing at Strada. It is. I've never heard of anything like it, and of course production is so decentralized now. We cover tax incentives and where everything is going; this goes hand in hand with all of that. You could just plug the drive into a main computer and your whole team can access it, from your stage in Ireland to the team in LA. I think the next stage would be doing stuff with this media: running transcription, running AI analysis, doing stringout edits in the cloud using the media on the hard drive, and then, whatever your workflow, downloading an XML file or just the media clips you need to your computer. A lot of possibilities here, and it's cool that they're going the route where no cloud is involved, because otherwise you end up with an overarching cloud service where you have your files locally but have to copy or upload them to the cloud, and then you have duplicates of files to manage, and it gets messy and expensive. Agreed. All right, Adobe announced a bunch of updates and features yesterday. The one that was most interesting to me in Premiere is smart search, the contextual search. I thought Generative Extend was
also really interesting. Okay, maybe I'm being picky on that one, because they talked about it last year and it's been in the beta, so to me it's not new. But it's now officially in the public release, and the fact that they've rolled it out to millions of Premiere users is a big deal. This is generative AI at scale, in the hands of consumers, and a very practical application of it. Absolutely. Generative Extend, if you're not familiar with it: a common issue you run into as an editor is, I want to add a transition, or I need to extend this clip, and they cut too soon. There are a couple of tricks you can do as an editor to work around that, but Adobe's solution is you just drag the handle out, and it uses their Firefly model to look at the video you have and generate the extra frames you need. The first question everybody's going to ask is, how good is it? I don't know; I've not tested it. I've seen some videos online, and I think in some cases it's absolutely usable, but I wouldn't go as far as saying a lot of editors will use it day one. It's for the case where you need a couple of extra frames to hold for a dissolve and there weren't enough; that's probably the most common. And if it's going to dissolve anyway, you're not seeing it at final quality. My solution for that has always been to just slow the clip down and buy a couple of extra frames. So that is cool. The one they announced that is new: I don't know what model they're using, whether it's their own, but it can analyze your footage and you can start searching based on context. You can search your footage for, say, the
plate, or a medium shot of a person. That smarter contextual search of your media inside Premiere is video-based, not audio-based. Video-based, and you mentioning that is really important. Really important, yeah. And that's something not a lot of companies have been tackling. Twelve Labs is a company we've interviewed a few times, and they have a dedicated model for understanding video. But this is the challenge and the benefit: as an editor, you want these tools in the one app you use all the time. It gets complicated when, even if they offer an extension you can bring in, it never works as well as something built into the app that just works. And that gives Adobe users so much more reason to keep their Creative Cloud subscription: this tool just keeps getting better and better. Another very practical use of AI. I feel like maybe Adobe went a little too hard on generative, and that scared a lot of the user base; this is much more day-to-day practical. You have a lot of footage, you need to sort through it or find the shots. Although I will say, and I haven't looked into it too much, there was a little bit of heat around Adobe's end-user agreement. In the promo videos they did specifically say, "We don't train on your videos." Right.
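A quick editor's aside on how that contextual search likely works under the hood. Features like this are generally built on joint text-video embeddings: a model maps each clip and each text query to a vector, and search is just ranking clips by vector similarity. This is not Adobe's actual implementation; the filenames and vector values below are made up, and real embeddings have hundreds of dimensions, but the ranking step looks roughly like this:

```python
from math import sqrt

def cosine_sim(a, b):
    # Similarity of two embedding vectors: dot product over the product of norms.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Toy 4-dimensional embeddings (hypothetical values, not from any real model).
# Pretend each clip vector came from a video-understanding model.
clip_embeddings = {
    "medium_shot_person.mov": [0.9, 0.1, 0.0, 0.2],
    "wide_landscape.mov":     [0.1, 0.8, 0.3, 0.0],
    "closeup_plate.mov":      [0.0, 0.2, 0.9, 0.1],
}

# Pretend this came from embedding the text query "medium shot of a person".
query = [0.85, 0.15, 0.05, 0.1]

# Rank clips by similarity to the query, best match first.
ranked = sorted(clip_embeddings,
                key=lambda name: cosine_sim(query, clip_embeddings[name]),
                reverse=True)
print(ranked[0])  # the clip most similar to the query
```

The reason this matters for video (versus audio transcript search) is that the clip embedding is computed from the pixels, so "medium shot of a person" can match footage where nobody ever says those words.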
Yeah, I think they wanted to clarify that, because there was some vague language in the end-user agreement that someone noticed, and that caused a stir. When some of Adobe's first AI features rolled out in Photoshop, the agreement said something to the effect of, paraphrasing, hey, we may use your artwork to generate more stuff. This one is vaguer, because in the promo video they said we only use your video, for Generative Extend, to understand what we have to generate; we're not training a model on it. I would be surprised if they did, because the whole foundation of Firefly has been that it's one of the only models trained on licensed data: we know everything it's trained on, we have permission to train on it, it's commercially usable. I think they back it with a million-dollar guarantee or something. They have been hardcore about it being a commercially safe model; they know what went into it, and their business depends on it. And then Frame.io launched an update with transcription. Oh, load me some Frame.io. I saw this announcement, and then I was in Frame and this little thing popped up: would you like to transcribe your video? And I was like, what? That's another useful one, because it keeps everything in the same tool. Transcription has been another round-trip process for us: we need a transcript, so we take it to another tool. They never had transcript support before, even to search. It's nice to see them extending that out. What I would love to see in Frame.io, though, that they still have not launched in version 4, and it's been a year now: the API. There is still no API access in version 4, so I hope that comes soon. The flip side is, it's so strong in a browser. There is no other tool I can think of, even
like SyncSketch or ShotGrid, that is as good as Frame.io, as clean and easy to use in a browser or on a phone. It just works; it's a seamless experience. Adobe's move to acquire Frame.io was so brilliant. And did you hear, Joey, that they're in talks with Figma right now? Weren't they trying that, and then it fell through? I think there was a monopoly concern, and that just got cleared, so I think they're good to go. Oh, so they are going to move forward with it. Okay. So, Figma. Canva was eating their lunch; Canva is the competitor to Adobe Express. Figma is the industry-standard tool for any kind of UI design, graphic design for apps, mobile apps and such; you can very quickly put animations together in Figma. It's all web-based, a very cool tool. And that's moving forward; I remember it was something like a $20 billion deal. I feel like the shift to more web-based tools is another trend. For that to work, especially with something like Premiere, you'd need the same project and the same experience, not a web version of Premiere where you can cut but which has absolutely nothing to do with your desktop app. I would want something where they mimic each other: a producer could have a Descript-like experience and make an assembly cut, and it syncs and is represented back in the desktop version where the professional editor does the finishing. Version control, same file. Isn't that like Photoshop Express, or what is it, Adobe Express? Adobe Express, yeah, the web version that has some cloud syncing with Photoshop. But Adobe Express is, in my opinion, just an attempt at Canva, where Canva clearly shines on how
sophisticated it is as a browser-based tool, but Adobe Express is nothing compared to the power you have in Photoshop. So you're absolutely right; that bridge needs to be more intertwined. The other thing I think they could do more of is connecting Adobe and Frame. Compare it to the Blackmagic experience, where we're now working in a Blackmagic Cloud project and it integrates directly with Blackmagic storage and Resolve, and the media just syncs. Adobe has Teams, which is a cloud-based collaborative project, but it only handles the project files; it doesn't do anything with the media. Their answer is, use a third-party solution like LucidLink and put your proxies there, but we tried that and it was not a very seamless experience. I agree. And they own the cloud, right? They have Frame, their own cloud service. You could put the two together. The other one was an update from Celtx. Are you familiar with Celtx? No. I would never have known how to pronounce that, by the way. They were one of the first web-based screenwriting programs. I remember; they've been around for a while, competing on the web with Final Draft and the other screenwriting programs. They were bought a while ago by Backlight, which is the company that owns iconik and a couple of other media tools. I haven't seen many updates from them, so I'm excited to see there's progress. One thing they're launching that looks cool is a screenplay plugin for Premiere. Tell me more about that. It seems like a similar feature to Avid's ScriptSync, the built-in Avid feature where you import the script and it matches it with your shots, helps you pick takes, and helps you build out your first assembly
cut. This seems like a similar feature, but with Celtx and Premiere: you can connect your Celtx script inside Premiere to help find your shots and do your initial rough cut. I've never done feature film editing; I'm sure it's very daunting. I'm sure; maybe not a feature, but really long, big projects. The longest editing project I've done is probably 15 minutes, so imagine editing a 90-minute thing; you would need tools like this to speed it up. Especially if you're doing something episodic where you have a script, like a vertical soap opera. By the way, one of our podcast listeners from the Philippines messaged me and said they're doing some vertical dramas over there, and I asked him to send me some. I want to check that out. So yeah, that was another update I thought was cool and worth mentioning. Celtx has been interesting, and I'm glad to see there's still new product development there. Very cool. Moving into virtual production: what's Mo-Sys up to? One of their updates is a smaller version of their StarTracker, their camera tracker. I saw a picture; it looks much smaller. I mean, the StarTracker is already small. This one's smaller. I wonder if they're targeting more of the creator space: something to put on your camera, maybe not necessarily for in-camera visual effects or real-time tracking, but to track the 3D space accurately for use later in post. So if you're running with a Sony FX3 or A7R, you just pop it on top; it's as small as those cameras. Then you get your camera data and can use it later. A simple thing, similar to a use case you might use Lightcraft for with your phone. It might do real time as well, and just be smaller. Also,
they're probably competing more down-market, where the HTC Vive Mars is a pretty solid camera tracking system for a couple of grand. The whole kit's five grand, and a Mo-Sys system like the StarTracker is probably double that. So I'm wondering, is this going to be in the $10K range? Are they going to loosen up on the continual subscription fees these higher-end trackers charge? I'm excited. Any kind of technical advancement, miniaturization, or streamlining of technology in the virtual production space benefits everybody. I don't think two new trackers were on my bingo card this time: this and the Ocellus from Sony, two new trackers on the market. The other thing they announced, and I was trying to think about what trends we'll see at NAB: AI has always been a hot topic, but I still don't think we're going to see anything revolutionary. Definitely not at NAB; the AI updates are all on Twitter. That's what I was telling you. Last episode you asked, is Runway dropping something ahead of NAB? Runway has no idea what NAB is; those are two different worlds. I think the big theme to me feels like refinement: refinement, workflow, integration. I see a lot of AI updates, but at the enterprise level: smart dynamic ads, or upscaling your older broadcast streams from HD to 4K with AI. The manufacturers are smart; they're following the money, and the money is no longer in a classic film and TV shoot per se, because that market has been weak for the last couple of years. So where is the money? Broadcast, and VP outside of film and TV: corporate, houses of worship. That's why I'm wondering about having it as a smaller, standalone camera tracker, and not necessarily,
oh, you want to build it into a soundstage with a real-time LED wall and ICVFX; you just want highly accurate camera tracking. Look at a consumer-oriented company like DJI: they're selling to millions and millions of people. I think Mo-Sys wants a piece of that market. There's a market in education too. But a trend I do think is sneaking up: Gaussian splats. I've seen three or four different company updates around integrating something with Gaussian splats. One of them is another update from Mo-Sys; we got the email this morning. They're calling it their Scan to Shoot VP workflow, and it seems to be a combination of LiDAR and Gaussian splats for scanning a scene for previs and tech scouts. You scan an environment, bring it into their software, and create a photorealistic-looking, highly accurate 3D representation of whatever location you're trying to scout, and then everyone else can come in and virtually scout it. That's super forward-thinking on Mo-Sys's part, because they already have the ecosystem, so why not bring Gaussian splat technology into it. The announcement I saw was that they scanned something in London, sent it to Studio X in Thailand (shout-out to Linda), and shot in Thailand as if they were in London. So this wasn't just for previs; this was also putting it on the wall? I believe so. I guess that makes sense with Scan to Shoot, shooting a location from across the world. When I was reading the press release it seemed more like a tech-scout thing, but okay, going to final. So how high a resolution would you need to display on an LED wall? Pretty high, and Gaussian splats are not there yet as far as resolution goes. Not because the scans aren't good enough; it's a real-time performance limitation, because they're computing those ellipsoids,
those little Gaussian bubbles, on the fly, millions of them at a time, and they'd need to be in the billions. I don't think the current algorithms and compute power are there yet. It always comes down to what kind of shots you need and whether the background is going to be blurry anyway. I'm curious; I'm talking to Mo-Sys and will see what this workflow looks like. It's definitely promising, because the whole upside of Gaussian splats is you get the parallax, all the benefits of photogrammetry, without the work involved in building a photogrammetry asset. You just did a whole thing with Global Objects and Leica. Yeah, mostly Leica Geosystems; we've got the part two with Global Objects, which isn't released yet. Global Objects takes the Leica high-resolution LiDAR scanners and combines them with their own method of high-resolution photography, merging the two to get photorealistic, highly accurate digital twins good enough to display on an LED wall. The Global Objects outputs will probably be really high resolution and good to go on an LED volume, because they're done the traditional way, probably using something like RealityCapture, a very industry-standard piece of software. But what if you could get half of that quality at 5% of the work? That's the promise of Gaussian splats, and they're believers too, that Gaussian splats are the future. It's just not there yet, but that is where it's going. We're seeing a lot of updates around this now. The other one: Lightcraft Jetset, which we've talked about repeatedly, where they have the app on the phone and you can strap it to your camera. They've been doing a lot with loading Gaussian splats directly onto the phone: much lighter weight, you can have a bigger world, and you don't have to deal with running Unreal Engine, which is a lot for a phone. We'll see how that works and get a demo of the
workflow with that; it was another big push they were making. Another of their updates: when you're recording on your phone, it will also record the LiDAR data from the phone's depth sensor as another data track you can use in post-production. I think the LiDAR data comes in as a grayscale image, so it's pretty easy to work with in Nuke or something; you essentially get a video depth map built from that data. Then there's this other update from Volinga, another splat-digitizing company; shout-out to Fernando Rivas. Volinga announced that they have ACES color-managed Gaussian splats. Wow. How does that even work? I know Volinga a little, and they have been finding ways to take their 3D Gaussian splat engine and make it production-ready. Some of the things missing from Gaussian splats right now are bit depth, since it's mostly 8-bit, and color space, which is a big issue. You're shooting something on an iPhone, it goes through the Gaussian splat solve, and when it comes out onto a wall, what color space is it in? Well, it depends on the color space you shot in, but is there any transform happening during the solve? I think this is an attempt at fixing all of that. If you have an ACES-ready Gaussian splat, you can color-manage it with any standard media server, like a disguise system, and that ensures your oranges are exactly the shade of orange you intended, and so on. This is a really big deal. It doesn't seem like much, but without this step Gaussian splats won't be taken seriously in the production world. Right, this puts it in the actual production pipeline, viable for professional use. Pair this with Stability AI joining the Academy Software Foundation: I'm speculating here, but they're probably working on making
their generations ACES-compliant, right? So it doesn't matter what you can build with it if that stuff can't go into a movie pipeline. Mhm, right, and that's been a big issue with AI outputs. Yeah, you can get them, but it's a weird low-res format that's barely usable for editing outside of the web. Yeah, it's typically sRGB 8-bit, just like any JPEG image on the internet, but that's not what movies are made with. Yeah. So, do you know, if you were shooting a Gaussian splat for this pipeline, do you have to shoot it on a camera that can handle this color system, or in a format that can go into the pipeline and maintain the color space? Yeah, I would imagine you take some really high-dynamic-range camera, you know, just like a Pyxis, right, shoot it raw, then color grade it on your computer, conform it to something like Rec 2020, push that into Volinga, and it'll output a Gaussian splat that is now built from Rec 2020 files. Volinga will be aware that it's in Rec 2020 space, so when you plug it into a system that can recognize color spaces, like ACES, you can check the Rec 2020 box. My guess is it's something like that. Yeah, that would be cool. Very, very cool. Yeah, so I feel like Gaussian splats are a little bit sneakier as a trend, something I've been seeing more of than just AI. It is just a more practical use case of AI technology. I mean, the Gaussian splat system is solving using machine learning, it's just not generating stuff, it's recreating the world, and I think that has far more implications, not just in film and TV. You're talking about museums and virtual experiences and domes and anywhere you're trying to recreate something from the physical world. 3DGS is very cool. And the last one we had flagged as interesting: another update from Sony. They have an update to their Venice extension system, the Venice Extension System Mini. I have been a little bit fuzzy on how different this is than the Rialto
and why it's not just called the Rialto Mini. I think the Rialto was very niche, boutique, and experimental. I know they used it for Avatar 2 and they used it for Top Gun. And just to set the stage: you have the Sony Venice camera body, and with the Rialto you take the sensor block, put your lens on it, and have an eight-foot-ish cable, so you can mount the camera in spots that are physically too small for the full body, or go handheld, yeah, because you can separate the body from the sensor. So you put the body in your backpack or somewhere else. Yeah, just to give a frame of reference for our listeners, the Venice 2 is about the size of your head, and let's say you want to shoot Top Gun. Tom Cruise is in a cockpit, in front of him is the HUD of the plane, and there is probably a six-inch by six-inch space to put a camera. You can't put a Venice 2 there, so this cable extends just the sensor into that little spot, which is that size, and then all of the processing, the memory card, and all the writing is happening up to, I think, 18 feet away. Something like 18 feet, okay, yeah, that's right. So you can put the body by his feet or somewhere else where you have more space, where you can mount it out of the shot. I think it's a fiber optic cable that's super ruggedized. So the Rialto, this was two or three years ago when it made a big splash, still felt like a very boutique thing that not a lot of cinematographers were using. It's very expensive, it's something that Sony is probably supporting one to one. This feels more like a product that's just ready to buy and go out into the world, and it is smaller than the Rialto was. Physically it's the same sensor, but I think it's pretty much just the sensor in a little box. Yeah, it's a little bit bigger than a GoPro. Uh-huh, yeah, it's very small for full frame. I mean, I think the Venice 2 sensors are even bigger than full frame. That's a giant sensor in that little package
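The ACES and Rec 2020 conversation a few minutes back boils down to two things: a 3x3 primary conversion and explicit metadata that travels with the pixels. A minimal sketch, assuming linear light on both sides; the matrix is the published BT.2087 Rec.709-to-Rec.2020 conversion, but the tagging scheme here is purely hypothetical, just to show the idea of a color-space-aware asset:

```python
# Sketch: convert a linear Rec.709 RGB value to Rec.2020 primaries with
# the BT.2087 3x3 matrix, then carry it with an explicit color-space tag
# so a downstream system (media server, ACES pipeline) knows what it has.
# The tag() scheme is hypothetical; the matrix values are the published ones.

REC709_TO_REC2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def to_rec2020(rgb):
    """Convert a linear Rec.709 RGB triple to linear Rec.2020."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in REC709_TO_REC2020)

def tag(rgb, space="Rec.2020"):
    """Bundle a pixel with its color-space tag (hypothetical scheme)."""
    return {"rgb": rgb, "space": space}

# pure Rec.709 red lands inside the wider Rec.2020 gamut as a less
# saturated red: that is why an untagged asset looks wrong on a wall
pixel = tag(to_rec2020((1.0, 0.0, 0.0)))
```

Note the row sums are all 1.0: white maps to white, only the saturated colors move, which is exactly the kind of transform that silently happens (or fails to) when a splat solve ignores color management.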
Yeah, yeah. There's another update from them. I heard rumors, Joey, it was just a rumor, but you know, we're on Denoise here, that Sony's looking at an extension system for other cameras, like the FX3. So more of a consumer-based extension system. That's already so small, but imagine if it could be like one inch by one inch. I could see it for their more traditional body, the FX6. Yeah, I can see that, where you have kind of a bigger body and so you can put the body somewhere else. Yeah, the FX3 is like an A7-style mirrorless type, so it's small already. Yeah, we'll see how it goes, but you know, they shot The Creator on the FX3. There you go, you heard it here first. You heard it here after 20,000 other people kept talking about it. Yeah, that'd be interesting. And I know DJI came out with a version of this idea for the Ronin 4D. Yeah, which, I love that camera, and the Corridor Crew folks, the YouTubers, they love that camera, they use it all the time. I've never used it, but it looks fascinating for very interesting shots. They used it for Civil War. Yeah, they did. I saw, I think I told you this too, I saw them using it on F1 in reality, because when they were shooting the F1 film they were bringing the actors into actual F1 events and stuff. Oh, that's right, you said. I saw, when they had Brad Pitt on the podium and they had everyone act like he really won, there was a person with the Ronin 4D filming him, because, yeah, you don't want a giant camera there. No, and they talked about that on Civil War: you can tweak the stabilization balance, so you can have a little bit of handheld feel but still stabilize it, so you have a lot more control, whether it's rock-solid gyro-stabilized or something in between handheld and stabilized. I think it has kind of a weird
lens mount, that's my only note, how do I mount a lens? It's DJI's lens system, the Zenmuse system or whatever that is. Yeah, yeah, so you can't just throw any lens on it, but I heard the sensor is pretty solid. Yeah, I remember when drones were such a hot thing and they were all coming out at NAB. Do you remember 3D Robotics? Yeah, they were short-lived. I have one, I bought one. They didn't make the actual cameras, they had a GoPro mount, but they had some of the earliest ones, they were one of the first companies where you could program smart moves into the drone, and then it would hold a GoPro camera. My thought was, well, I would rather go with GoPro, who makes the sensors, than with DJI, where it's like, what, they make drones and sensors? Right, that was the wrong choice, clearly, they figured out the whole game. So yeah, that's our kind of NAB roundup of what we know now, but obviously there'll probably be more updates, and we'll be doing a bunch of video coverage, so all of our videos are going to be on YouTube, so check us out on YouTube, and we will do another debrief next week after we check out everything and see what we didn't cover here and all the new updates. Absolutely, yeah, and I'm sure there'll be a few surprises, so definitely tune in next week for our full NAB coverage. All right, and last story here, another kind of interesting update in the AI space. What do we got? Yeah, so OpenAI, one of the companies that we're all following closely, as they're really setting the trend for what is happening in the AI world, right? They're generally first to market. The GPT-4o image generator blew a lot of image generators out of the water, they were the first to introduce an LLM to the public with ChatGPT, first to come to market with a video generation model, Sora. So this is a company that's clearly steering the ship, and so now they announced
what they're calling an open-weight, open-source model. So let me back up here. Yeah, what is that? Yeah, so basically, when you have an AI model that's fully trained, it's quote-unquote closed, which means the actual secret sauce, how it is reasoning, is all built into neural network weights, and there are billions and billions of these weights, which connect one matrix multiplier to another. Imagine a very complex circuit with a lot of wires, and each wire is either very thick or very thin, and that's the weight of the wire, and this is more or less the AI model's personality. That is the snowflake, right, that is what makes the AI model unique from the next one. So for them to just come out and release, quote-unquote, their secret sauce of a model, that's huge. This is exactly what DeepSeek did, and that's why anybody can run DeepSeek on their computer, because you can just download the weights, take them or modify them, make your own version. Yes, and I believe Stable Diffusion is open weight as well, that's why Stable Diffusion is so popular, and Meta's Llama models, Llama 2 and Llama 3, are open weight as well. Yeah. But you remember, this was like a year ago, when Elon Musk had a problem with Sam Altman, and Elon Musk, who was one of the primary founders of OpenAI back in the day, said, well, you might as well call your company Closed AI, because there's nothing open about it. Mhm. So fast
forward to today, and they're coming through on that promise, making it open. Yeah, they're making it open, and I don't think they're doing it because they're altruistic, you know. I think they're doing it for very hard-nosed business reasons. To me it feels like enterprise is where all of the money is to be made, and you cannot make that money with a closed model. So if it's open, you can just download the model and then modify it yourself? Correct. Do companies charge for the models if they're open weight, or is it pretty much standard that they're freely available? If it's an open-weight model, the model's already been trained, and you can modify the output to some extent by attaching LoRAs and some front-end tools. But the biggest benefit is, thinking in the media and entertainment world, you have a Disney or an Amazon or a Netflix, and these guys are very protective over where their data lives. They have release slates that cannot be made public, secretive movies they're working on. They're not going to use a black box on the cloud for any kind of work, it just doesn't make any security sense. So having an open-weight model will enable them to bring this OpenAI model into a controlled environment, within the four walls of their own data center, run it locally, do the inference locally, and have full control over it. Mhm. But let's say you're a small company, like with DeepSeek, people download that and they don't have to pay, DeepSeek's model is free. So is OpenAI going to do the same thing, or are they going to make you pay to license it? Yeah, so I'm just following what Stability AI is doing with Stable Diffusion. There are two sets of licenses, and this is the same with Unreal Engine: you have a consumer license for the average amateur user that's free, practically free, and then if you're going to use it to make a commercially viable output, like a movie or TV show, then you're going to
have to pay the company. So my guess is OpenAI will roll out a free version of this open-weight model, probably limited in some way or the last generation of it, and then if you want the latest and greatest, you're going to have to pay some kind of license fee to get the open weights and maintain them at your own site. Right. And do you think this is going to be the money play in the future? Like maybe it's not charging for API access to the black box of the models they host, but building out custom solutions, or support, something similar to how Linux is a free operating system but companies still make a lot of money offering support and packages and custom solutions. Yeah, Joey, I've got to say, I'm not sure how AI revenue will grow and evolve over the next couple of years. It's not as linear as we all thought, which was: you spend $10 billion on making this model, you host the model, people log into the model, they get what they need, and they pay you, which is kind of what ChatGPT is. But it's clearly not where the money is. The money is not selling $20 or $200 a month subscriptions, the money is selling a $1 million a year subscription to a big B2B company, and those guys don't want to run a black box on the cloud, right? So yeah, I agree, I think something like Red Hat, where the Linux is free, you can download it, but then good luck building your own application on it without Red Hat support. Something breaks, who do you call? And all the training that you need to build a Linux application, that's where Red Hat makes their money. I think technically they're a nonprofit company, if I'm not mistaken. I would imagine OpenAI goes that route. First of all, it's an arms race: the LLM model that's amazing today is completely eclipsed in six months, so you can't just spend $10 billion and sit on it. You have to get ready to spend the next $15 billion to get the next
bigger model done, and while you're doing that, you have to figure out how you recoup the cost of the $10 billion you just spent. So it's an extremely aggressive business, with lots of cost, lots of expenditure, and unclear revenue. That is the part that worries me a little bit. I think the AI bubble will burst a little bit. Having said that, what AI promises, and how transformational it is, is absolutely true. I'm just not sure the business model is going to be as linear as we thought. Mhm. I mean, I'm thinking too, is it the million-dollar-a-year subscriptions, or is it someone building an actual solution that makes it more user-friendly to the wider user base? There's a lot of criticism like, well, your product's just a ChatGPT wrapper. It's a ChatGPT wrapper, but that could be very useful, because a lot of people are not going to want to figure out how to prompt-hack, or, going back to our editing examples, go back and forth between multiple tools. They want their one tool to work, on a consumer level, ignoring the enterprise needs for private data and private training and serving. Yeah, I think the uncharted territory, where a lot of money is to be made and still hasn't been made, is applications. I think there are no killer applications with LLMs yet. Of course you have the ChatGPT app, but it's really direct access to the cloud-based black box. That's why everything Adobe is doing with Premiere is so promising to me, because this is an application that's widely used. If you drop a couple of new features into it, it immediately gets utilization, you get feedback and data on it, and you can improve those features over time. You need more application-layer stuff in this AI ecosystem. One of the examples early on, you know, remember when Midjourney was on Discord? Yeah, yeah, yeah
and you wanted to generate an image, this is early days, yeah, early-to-middle of last year, back in the day, and instead of that, Leonardo, which is a competitor to Midjourney, had an iOS app, so you could do everything Midjourney could do but from your phone, and it had all the controls. So which one would you use? Leonardo, right? I used Midjourney. Well, for me it's all about convenience. I figured out how to use Discord so I could use Midjourney, and then you're waiting for your picture to come back, but you had to do it on public channels, so as you're waiting for your image, every other image is just popping up, popping up, popping up. Okay, but again, I think the application layer hasn't been cracked, and so these models, like ChatGPT or Runway Gen-4, the models themselves are not the money maker, it's everything around them that'll make money. Yeah, and how to make it more useful. I mean, I think the other thing to crack, the killer app or the useful app, is when it can get into the data that we use every day, and I think the two companies most poised for that, but which have not really been doing the best at it, are Google, with Gemini, which has sort of been integrated into Google Docs, but then I ask it, do this thing with this doc, and it can't access that info, and it's like, what are you doing in my Google Docs if you can't do all this stuff yet? And then Apple, which is on our phone and could easily handle calendar and contacts and all of it, a true personal assistant, and they've been delaying their AI launch, and I'm sure, with their billions of users, when they launch something it's got to work. I understand that partly, but yeah, Marques Brownlee has an Apple AI video out now that kind of paints them in a negative light, and kind of rightfully so. You know, it's like: big tech company
reaches full maturity, is at the cusp of a big technical breakthrough, and can no longer keep up. Mhm. That happened to Kodak, that happened to IBM, that's happened to practically everybody. So the question is, is Apple there right now? Mhm. I still have a lot of faith in Apple, but yeah, I think the biggest fumble was announcing this, with a big push for it last year, and then coming out this year and saying, sorry, not ready, we're going to wait another year. Yeah, no, I think Apple Intelligence is half rolled out on iPhones now. I have it turned on and it's not useful, I'm not getting anything mind-blowing out of it. It summarizes emails and stuff. You're not using Apple Memoji, uh, the AI emojis? Yeah, it does a weird text summary and email summary, which nobody asked for. Yeah, they're not useful. Yeah, okay, all right, let's wrap it up there. All right, sounds good. Links to everything we talked about, and if you're at NAB next week, give a shout-out to Joey, or just leave us a comment, and if you want to connect in real life, let us know. Yep, all right, catch you in the next episode
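A footnote on the open-weights segment above: the "thick and thin wires" picture can be made concrete with a toy sketch. This is not any real model's format or API, just an illustration, under the assumption of a single tiny layer, that a model's behavior is nothing but numbers you can download and edit:

```python
# Toy illustration (not any real model) of what "open weights" means:
# the entire behavior of the network lives in these numeric matrices.
# If you have the file of numbers, you can run and modify the model;
# if you don't, the model is a black box behind an API.

weights = [  # one tiny layer: 3 inputs -> 2 outputs
    [0.9, -0.3, 0.1],   # one "thick" wire and some "thin" ones
    [0.05, 0.7, -0.2],
]

def layer(x, w):
    """One matrix-multiply step; a real model stacks billions of such weights."""
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

out = layer([1.0, 2.0, 3.0], weights)

# "making it your own version" = editing the numbers, e.g. fine-tuning;
# this is possible only because the weights are open
weights[0][0] = 1.1
```

A LoRA, mentioned in the licensing discussion, is essentially a small, cheap-to-train additive patch on top of matrices like these, which is why it can be attached to an already-trained open-weight model.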
2025-04-07 22:11