Innovation and Technologies in Neuroaesthetics with Amar Alnemer - modulyss Talks


Hello and welcome to yet another inspiring session of modulyss Talks. My name is Isabelle, one of the sales directors of modulyss, and I will be your host for today. For those joining for the first time: during a modulyss Talk, we give the stage to experts in the design industry, and we are lucky to be inspired by them. Today is our second webinar on the topic of neuroaesthetics, the science behind how our brains perceive and respond to art and design experiences. In the first session we had the expert Karen Haller, who shared her expertise on the psychological effect of colors and patterns in interiors.

Our expert today is an award-winning Canadian creative director, Amar Alnemer. Amar specializes in creativity, design, innovation and technology. When you scroll through his Instagram, you're never quite sure whether you're looking at reality or his fantasy; it takes you on a beautiful journey through time. We are lucky, because today Amar will share with us some of his knowledge and expertise on how AI is transforming the design process and how to define color and patterns in AI.

So without further ado, please welcome Amar here on our stage. Amar, I'm looking forward to learning from you. Thank you very much, much appreciated.

Well, good morning everyone, and thank you for joining this exciting session on AI innovation and design. This is the second installment that modulyss is hosting, and I'm presenting the AI and design innovation segment of it. Allow me to introduce myself: my name is Amar Alnemer, and I've been in the creative and design world for over 20 years.

I've worked in architecture, events, exhibitions, trade shows and branding, on all sorts of projects spanning production, videography, presentations and experiential work. I am thrilled to take you through how AI is revolutionizing the design industry. Today I'll be covering three core areas where AI is making a significant impact: architecture, and the related elements we'll be diving into further, namely colors and patterns, both of which reflect on how an architectural concept is presented. We will see how AI tools are enhancing the creative process, helping us push beyond our traditional design limits and shaping the future of how we approach design. So, let's get started.

First and foremost, let's talk about AI in design, and let's first address what AI means in that context. AI is not here to replace designers but to serve as a powerful tool that helps us automate certain aspects of the creative process while elevating our capacity to innovate. With AI we can generate fresh concepts, design patterns and even experiment with colors in ways that weren't possible before. It is honestly a tool that enhances our capability to push further: it makes the work faster and more efficient, enhances our creativity and saves time.

It functions under the direction of a creative director or creator; it does not function on its own, though it offers a lot of opportunities for an expanded vision of whatever we create. In my experience, AI has been incredibly useful for transforming ideas quickly, enabling us to spend more time refining and perfecting our designs rather than starting over from scratch. For us, AI isn't just a tool, it's a creative partner: it helps us produce work faster and more efficiently, and it has been integrated well within our process as we speak. In our world, we've started using AI for conceptual ideation much more, especially in the past six months, and we have found its presence to be a catalyst for how we operate right now. Basically, for any project we receive, our workflow now starts with AI to create the conceptuals. In essence, rather than sitting on a project for two or three weeks building up 3D models, presentations and all of that, we capture the concept from the start with AI, by presenting a concept that we feel is most suitable for the client. Because of the time this saves, it sometimes allows us to present two or three concepts, and it gives us much more opportunity to push color palettes, color schemes and color theory within the design, so that we can showcase that to clients faster. Once the concepts are approved, we move into the traditional modeling, though even there we have started to see that we can integrate more of these tools into what we're doing.

So what I want to talk about is three segments, mostly related to creative work, design and architecture: how AI is transforming the work in architecture and interior design. Basically, it allows us a visual exploration where we are able to present unique and stunning concepts in a fraction of the time traditionally needed. AI brings us a lot of inspiration: even when we're designing in one specific direction, it offers opportunities for inspiration along the way, where you see things you did not expect.

And that's almost like having a team of creatives where you brainstorm ideas and somebody comes up with something you didn't think about; there's always one idea that might pop up where you say, well, this was actually something (...) we didn't think about. The other huge benefit of AI is the instant visuals you generate, versus the traditional method of drawing a 3D model, sending it into a render engine and texturing it. What you get is rapid idea generation that you can quickly scan through to see what works and what doesn't. In essence, it offers endless possibilities and fresh design perspectives, things you did not expect alongside things you did. It offers you the creative freedom to transform imagination into visuals. Now, it does have limitations in terms of accuracy, where you're not able to reproduce an exact concept, but there are ways to control what you're trying to say. If we dive into how AI transforms architecture, one of the most exciting things is how AI is driving innovation by generating designs that take inspiration from nature: think of organic forms such as honeycomb structures that not only look beautiful but are also structurally sound and suitable. We are able to integrate any element, and I will show you some examples moving forward, any element from nature or inspired by culture, and transform it into a design. AI really allows us to explore multiple iterations quickly, experimenting with different materials and design techniques that would have taken much longer using traditional methods, and this is where we found its advantage in terms of time and efficiency.

We're also seeing how AI algorithms can suggest materials for us that we did not expect, materials that are both eco-friendly and innovative, pushing the boundaries of what we can achieve in architecture. I will run through some inspirational images. I'm sure you have seen hundreds if not thousands of images on the web, on LinkedIn or Instagram, representing what AI can do, but I'll run through certain images quickly to give you an idea, and some of these images are chosen for a reason: to show you that what you're seeing is controlled in terms of color, texture, finishing and end result. I'm going to pause at this image right here and try to take you on a journey through what I call the AI transformation. Down memory lane, to when we actually started with AI: what we're looking at right now is not the product of ten years of work; this is the product of probably two years, since AI first offered text-to-image and image-to-image iteration with Midjourney, after which 10, 20, 30 other companies popped up with very similar products. They're all fantastic products and they all have their own use, but what I want to do is take you (some of you may not be aware, though I'm sure a lot of you are) from where we were not even two years ago to where we are today. Rather than go from past to future, I want to go from future to past, to show you what the output of certain images looked like not even six months ago versus today. I'm going to start with the image you're looking at right here. This is what I call a controlled output, meaning that there were some elements other than just a prompt used to generate this image: elements in terms of style references, and other elements in terms of color,
theme and content. There are a lot of elements that go into this, and a lot of them I'm putting in the document at the bottom, so whenever we share this document you'll see all these references and can make use of them. I'll start with this image, which reflects today's technology, and it's improving almost every month, getting better and better, but I'll show you the difference between today and not even a year or two ago. Starting with this: this is what you can get out of what I call, again, a controlled output iteration. That's what the product looks like when I prompt for a luxury pop-up with this kind of finish and geometry. The reason I say controlled is because, just after that, this next image uses the exact same prompt as the previous one, except it is not controlled: it is the generic content, where I entered the exact same prompt and exact same requirements, but without all the other elements that control the output.

It's still very nice, but it is not what I was envisioning; the previous one is what I envisioned, while this one comes out as generic. It still looks good, it's very realistic, it offers you the content that you seek and you can work with it, but it isn't what I intended. I wanted to showcase it anyway just to show you the difference between a controlled output and a generic output. And the beauty of what I'm going to show you now is that Midjourney, which is one of the main tools we use for generating images, has actually maintained all its previous versions until today, from 6 down through 5, 4, 3, 2 and 1, so I'm able to show you exactly what each version looked like using the exact same prompt iteration. So this was again version 6, just like the previous one I showed you, except with no controlling elements within it. Moving back one version to version 5, which I believe was released not that long ago, months rather than years: that's what it looked like. Even at the time it came out, that was spectacular, but as you can see, by comparison to today's versions it's not even close in terms of realism. It compares more to what we could get from 3D renderings: a 3D output which was still good, even phenomenal, and we could always create great images with it, but that's roughly the quality we were getting at version 5, which I believe was maybe six to eight months ago.
Moving on, this is roughly what version 4 looked like with the exact same prompt requirements. As you can see, we're now losing resolution, accuracy and texture, and definitely losing realism. To push this further, that is what version 3 looked like, by the same token using the exact same prompt; as you can see, we're losing almost everything here. This is version 2, where I'm not sure we can even recognize what it is, and this is version 1, where it was basically a novelty tool for simple drawings that, at the time, you could never imagine being used in business, in your real workflow as an architect, designer or creative.

This is where it is today, just to give you the scope again with another picture. As you can see, we have moved so far within the past two years; it's not like anything else we've seen. And the beauty for us consumers is that there are so many companies competing and positioning for this that we're always getting the best product there is. Everybody's competing, nobody's being lazy, everybody's trying to stay on top, so they push the boundaries in terms of what they offer us as consumers, which is good for us because we're always getting the best and latest, and I will take you through other technologies that we could use. So in essence, as I mentioned before, AI has literally changed the way we envision and see our work: how we integrate AI tools within our production, our visualization and our presentation. It has transformed the process in which we work and has really elevated it, making the work, as I mentioned before, more efficient, faster and a lot better. We're now even competing with the standard methods of presenting 3D, because AI does such a great job that we've had to elevate even the 3D presentations we work with. Like I said, it really changed the way we see things, no pun intended. I mentioned before that one of the drawbacks of using AI to generate visuals was that you're not able to depict exactly what you are thinking in terms of the visual product or the design concept. Now, that is not entirely true: there are tools out there that can output your exact design, sketch or 3D model, if you like, and what they do is produce a relatively good rendered image out of a 3D sketch, whether it's a sketch or a perspective of a 3D model, it doesn't really matter. It does produce something that's decent enough to utilize and present to your clients.
It does not match the accuracy, in terms of realism, of, say, a Midjourney output; however, what it offers is complete accuracy over your conceptual design. I'll just go back and show you how we started: this is the actual sketch, and this is the actual render. As I said, the quality does not match that of, say, a Midjourney output, with its shadows, lighting and realism, but what it makes up for that with is the accuracy of the model. Having said that, there are also AI tools that...

You can actually take the model in and match the way it looks. There are a couple of tools, listed in my index, that you can use to upscale an image. So in this instance you're able to reproduce a sketch without going through the whole 3D modeling process; from that sketch you can generate an image, a render, and from that render you can elevate just a little bit more exactly how that image would look. This is one tool I like to use when we're trying to generate a 3D render from a 3D sketch image rather than go through the whole process. We always end up doing the 3D model anyway, because we still need to do engineering and production on it, but along the way this is one of the methods we use to generate 3D images. I also wanted to show another tool, again listed in my index. This is more of an instant image-producing tool. I'm not going to show the demo, just because it takes too much time to show it live, but I took a screenshot to give you an idea. Basically you give it a prompt of exactly what you want to see, for example a double-story, open-plan condo in Manhattan, New York, with open windows, and whatever you draw and sketch here gets reproduced on this side of the image.
If I move the circle, the light moves up; if I move the square, the window opens up; if I change the lines, I get different stripes in here. This can be used for interior, for exterior, for any image you want, and you can keep it stationed and then start changing the prompt. Let's say you like this image but you don't like the finish: you can change it to white or marble or another finish. These are some of the ideas I want to give you in terms of producing a more accurate visual, or a different way of using AI to produce one. In this case you'd basically be capturing, almost live, exactly what your model looks like; once you're happy, you click save and you can show that. Or, if you have a client sitting with you and you want to understand more of their vision, you can do this live with them to understand more of what they want to do. All these tools are basically there to help you enhance the work further and make it more efficient and faster. Now that we have finished talking about architecture, we're going to touch on AI and the power of colors, and how we implement that into architecture.

They're all related, at the end of the day, to producing a space, whether exterior or interior: architecture models, retail spaces, hospitality, commercial buildings, in the end it's the same thing. But each one builds on another. What we started with was basically how to generate architectural images; what we're diving into now is what we call AI and colors: how selecting colors and creating reference colors can completely dominate and control the output of what you're doing. Here, AI is empowering designers by enabling us to explore a vast array of color combinations that are personalized and predictive. Whether we are working on a single piece of art or an entire architectural project, AI-driven tools can suggest color palettes that align perfectly with user preferences and emerging trends.

In essence, there are a few AI tools that actually use psychology, and use branding, to integrate. A perfect example is how AI can predict color trends by analyzing historical data and cultural influences; we've sometimes seen that in Pantone color forecasts. By using AI we can create color experiences that are not only visually striking but also emotionally resonant with our audience. In that sense, there are methods of predicting color psychology, like emotional and psychological analysis tools that can predict which colors should be used. There's brand personality matching, where you create a color scheme that aligns with the brand's personality: a luxury brand, for example, might lean towards deep, elegant tones like black or royal blue, while a playful tech startup might favor brighter, more energetic shades like neon green or orange. There are tools out there that can literally incorporate a brand, a state of mind, or exactly what the output should look like. There are also user-centric algorithms, where AI considers target audiences, their preferences and behaviors. All of these tools are out there to be used in generating colors. I'm not going to get into them, because I think part of this was covered in last week's session, which I encourage you to watch; it was a really good one on the psychology of colors, so I won't go so deep into the implementation or psychology of colors. I'm just telling you that there are amazing tools out there to harmonize colors with a brand's essence. In that sense, AI can combine an understanding of brand identity and data-driven insight into a color palette, so you can literally create a color palette, or an image, that represents your brand. My focus will not be on that. For example, you can upload a brand asset, a logo, previous campaigns.
It will suggest a harmonized color palette based on the brand's core identity. These aren't tools I'm going to showcase today, but I'm telling you they're out there in case you need to output certain colors, a certain look or an identity. As a practical example in a design workflow: let's say you're working with a luxury eco-resort brand that wants to evoke a sense of calm and sustainability. You can input the brand mission, sustainability and a luxury look, into an AI tool, which would suggest a calming palette of earth tones, greens, soft blues and whites. These tools are out there, and I will put some of them on the list so you can utilize them if need be, to integrate a complete brand color and come up with a color palette you can use.
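The brand-personality matching described above can be sketched as a toy lookup. Everything here is illustrative: the trait names, the hex values and the `suggest_palette` helper are my own assumptions, not how any real tool works; actual tools derive palettes from data rather than a hand-written table like this one.

```python
# Toy sketch of "brand personality matching": map brand traits to palettes.
# The table below is purely illustrative, not taken from any real product.
BRAND_PALETTES = {
    "luxury": ["#0B0B0B", "#1D2951", "#C9A227"],        # black, royal blue, gold
    "eco": ["#4E6E58", "#A3B18A", "#F0EFEB"],           # earth greens, off-white
    "tech-startup": ["#39FF14", "#FF6F00", "#FFFFFF"],  # neon green, orange
}

def suggest_palette(brand_keywords):
    """Merge the palettes of every known trait the brand lists, in order."""
    palette = []
    for kw in brand_keywords:
        for color in BRAND_PALETTES.get(kw, []):
            if color not in palette:
                palette.append(color)
    return palette

# A "luxury eco resort" brief would pull from both rows of the table:
print(suggest_palette(["luxury", "eco"]))
```

A real system would replace the lookup with a model trained on branding data, but the interface, brand traits in, palette out, is the same idea.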

What I will be doing, however, is showing you how you can apply that method in AI to generate what I would now call a more controlled output of your branding or concept design. Again, back to that favorite concept, and I'm only using it for reference because we've already seen it, so it's easier to refer to an image you've already identified with. This is a workflow example. Say you had a client that was interested in this color palette, or you like this color palette or this color image, or whatever the case may be; these are colors that you want, or can foresee, on a project you're working on. There are some tools online where you can grab that image, plug it into the tool, and it will give you all the color palettes within that image. It's an incredibly easy, simple way to generate color palettes that reflect the design you're looking at. We will go into other approaches for patterns, but staying with the color scheme: basically, you would generate an image like this, you can export it, and now the magic begins. As you can see, these are the color palettes you get out of it, and this can be any image; I'll show you other ones as well.
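The palette-extraction step, pulling the dominant colors out of a reference image, can be approximated in a few lines. This is a minimal, dependency-free sketch rather than any specific tool's method: it coarsely buckets each pixel's channels and counts the buckets, whereas real extractors typically use proper clustering such as k-means.

```python
# Minimal dominant-color extraction: quantize pixels into coarse buckets,
# count the buckets, and report the most common ones as hex codes.
from collections import Counter

def dominant_colors(pixels, k=5, bucket=32):
    """pixels: iterable of (r, g, b) tuples; returns up to k hex strings."""
    def quantize(c):
        # snap each channel to the centre of its bucket, capped at 255
        return tuple(min(255, (v // bucket) * bucket + bucket // 2) for v in c)
    counts = Counter(quantize(p) for p in pixels)
    return ["#%02x%02x%02x" % c for c, _ in counts.most_common(k)]

# A mostly-red image with a few blue pixels:
pixels = [(250, 10, 10)] * 90 + [(10, 10, 250)] * 10
print(dominant_colors(pixels, k=2))  # -> ['#f01010', '#1010f0']
```

In practice you would feed it real pixel data (e.g. loaded with an imaging library) and a larger `k`, then hand the resulting hex codes to the next step of the workflow.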

More magic begins when you go into a tool like Midjourney. As you can see, I've entered the color reference here; this is going to be my reference image for whatever I want to generate. What I have done is enter my prompt. You can use any prompt you want; I'm asking for a modern architectural interior design concept, but you can use it for anything to see what it generates. And here's the magic: the output you get is absolutely controlled within the color palette you uploaded as a style reference. So any image I now produce carries only those palette colors, and I'm able to produce exterior and interior images of any space within the content I'm working on, all reflecting the exact same color palette. They don't veer off, so I can stay within the same content when I'm designing something to present to the client. This is what I meant by making the work so much more efficient, faster and easier: now I can group a concept within a specific color palette to show the client, this is what we're thinking, in terms of what we think works for you. Again, any iteration of output you do, as long as you have that color palette reference, stays within the same imagery, and you can group it up as one. As you can see, any image I produce always carries the same exact palette, and this is the intent: we now have more control over the output of the images in terms of color content and theme. This is what we carry over to the client: look, this is the theme we're thinking about, this is the look and feel, these are the colors. What do you think of that? And if they say, well, we're thinking a bit brighter, or whatever the case might be, we can always do something else and ask, what image inspires you, or what would you like to see?
They would say, bring us something like this. I won't go through the same exercise again, but it is exactly the same: I would take the color palette from this particular image, plug it into the system, and do that process, just using it as a reference. Now I'm in complete control over the output's color theme, and this is where the color theory comes in.
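For reference, Midjourney exposes this style-reference workflow through its `--sref` parameter (available from version 6 onward). A small helper that assembles such a prompt might look like the sketch below; the `build_prompt` function and the example URL are hypothetical, and exact parameter behavior can change between Midjourney versions.

```python
# Sketch of assembling a Midjourney prompt that pins the output to a style
# reference, as in the palette workflow described above. `--sref` points at
# a style-reference image; `--stylize` tunes how strongly style is applied.

def build_prompt(subject, sref_url, stylize=None):
    parts = [subject, f"--sref {sref_url}"]
    if stylize is not None:
        parts.append(f"--stylize {stylize}")
    return " ".join(parts)

prompt = build_prompt(
    "modern architectural interior design concept",
    "https://example.com/palette.png",  # placeholder, not a real asset
)
print(prompt)
```

Every image generated with the same `--sref` value then shares the uploaded reference's look, which is exactly the "grouped concept" effect described above.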

Just a few simple examples that I'm going to run through to give you an idea. This is the same thing, where we're using this color palette to produce these images. They can be anything in this respect; whatever you produce will always carry over that exact color palette theme. The same goes for this case here, just showing you different ideas: whatever output you do, you're now able to do close-up shots of furniture, kitchens, lobbies or receptions, and it will always carry almost the same colors. I will show you this example that we did when trying to present something for a large hospitality project. These are the color palettes we wanted to introduce, to showcase what that project would look like. I'm only showing you the overall shot of one image; in the next image you will see exactly what I mean, where every single product comes out within the same color palette.

In this hospitality case, whether we're doing the bar, the reception, the wide shots, the entryway, the bedroom, the lounge, any area, as you can see, carries exactly the same color palette. This is what I meant by it being more controlled now: you have a theme and a scheme that captures your vision. Again, very similar: this is a different idea using different textures and materials, but the output is all along the same lines, and as you can see it doesn't veer off from that. I will show you another example here using the same color palette; now, when you're typing up your prompts, you're always going to get the same reference imagery.

As you can see, it's consistent. This is what I meant by a controlled theme: now, when I'm presenting this, I'm presenting a very consistent look and feel. Last but not least, this is another concept we did, with soft colors, and here we go: the results are almost the same in terms of what you see. This gives you an idea of how to employ colors and generate more controlled output, using color references and style references within AI images. That concludes the second segment of this process: we started off with just architecture, then added colors to architecture. The next phase is going to be adding textures to the element: how to include textures now

into the process. With this content, I'd first like to talk about patterns, and what AI has done in creating unique patterns. It has really revolutionized the way we create and incorporate patterns into our designs through generative design. AI is capable of creating unique, intricate patterns that can be applied across architecture, textiles, products and everything in between: marketing, branding, imagery, visuals, photography. We are now able to literally imagine things that were hard to achieve, and generate any image we want in a fraction of the time. One of the most exciting aspects is how AI can adapt motifs or art from various cultures, or from nature, to generate patterns that are completely new yet deeply rooted in tradition or organic forms. This really opens the door to endless possibilities for customizing and personalizing any image, whether in architecture or any other application or industry. By applying customized patterns, it allows us to create designs that are not only unique but also deeply connected to their source of inspiration.

In this process I'm going to showcase a slightly different workflow. This workflow can also apply to your color palette selection, but I wanted to save it for textures, which are slightly more complex, so I gave you alternative ways of using AI tools to generate palettes. In this case we are going to be looking at the traditional ChatGPT tool.

Again, I'm not going to go into live iterations of it, but I will give you an idea of exactly how this workflow works. Basically everybody knows ChatGPT, whether they use it or not, and you can use any other similar tool; it doesn't really make a difference. The idea is that ChatGPT has evolved a little, so you can pick any image you like or see for inspiration, plug it into ChatGPT, and ask it to analyze the image and tell you what it is and where it is. Sometimes it has a very good idea of where the image was taken, the time of day, the colors used, the patterns and textures, the culture; it really can extract that much information from it.

From that information, with the image you uploaded, after the analysis you can tell ChatGPT exactly what you want it to do. If you want a prompt, it can generate one for you, and you can tell it exactly what you want the prompt to be like. You can then put that into one of the AI tools like Midjourney or DALL-E to utilize the prompt and generate textures or patterns out of it. So the first step is to upload an image, or just request a prompt that you can use, within ChatGPT; I always try to upload an image for reference. Once I get the prompt, I go to Midjourney, enter it, and I get the pattern or texture, whatever I'm trying to see, that I can utilize. I put the prompt into Midjourney, asked for a texture, and got this texture. And again, this is where the magic starts, in two forms. There are some tools, and I've put a list of them in as well, where you can take any output of a pattern or texture and generate an endless, seamless amount of it, so you can use it at large or small scale; you can scale it up or down whichever way you want. That's taking a pattern and upscaling it into any image we want.
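To illustrate the seamless-tiling idea in the simplest possible terms: mirroring a pattern horizontally and then vertically guarantees that every edge of the resulting block lines up with its neighbor when the block is repeated. The dedicated tools mentioned above do far more (blending, regenerating content across seams), so treat this as a bare-bones sketch of the principle only.

```python
# Make any pattern tile seamlessly by mirroring it in both directions.
# The output is a 2H x 2W block whose opposite edges match, so repeating
# it produces no visible seams (at the cost of mirror symmetry).

def mirror_tile(grid):
    """grid: 2D list of pixel values; returns a 2H x 2W seamless tile."""
    mirrored_rows = [row + row[::-1] for row in grid]  # mirror left-to-right
    return mirrored_rows + mirrored_rows[::-1]         # mirror top-to-bottom

pattern = [[1, 2],
           [3, 4]]
for row in mirror_tile(pattern):
    print(row)
```

Running this on the 2x2 pattern yields a 4x4 tile whose first and last columns (and first and last rows) match, which is exactly the property that lets the tile repeat endlessly.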

What I will do next: I have taken the process I just showed you with ChatGPT, asked it to create a pattern of a (...) culture that I would like to use, and it generated this. Now that I've got this pattern, I plug it in; again, this was generated by Midjourney, and I use it as my texture reference. Now it's an image reference for generating the interior design of the architecture. Again, I'm not going to show you the exact prompt process, because I'm sure everybody's familiar with it, but this is the output you get: this is the pattern or texture I'm using, I want to apply it to a space, and this is what I'm getting.

There are ways, when you're working with this, to change what the wall looks like or what the carpet looks like, but your first output completely reflects exactly what you were looking at in terms of the pattern and how to apply it. You can see other outputs using the exact same pattern. In the same manner we talked about with color, we're seeing the same thing with texture: we can control and direct what the texture output should look like if we want to show a specific vision of the colors or patterns we're using. I will show you later how to apply that in wider commercial applications, but this is the idea: we're using the same texture finish to generate all sorts of architectural images, controlled within that same imagery. Of course this extends to much wider applications. I just showed you one, but when we're doing a commercial building, tiling for floors, wall finishes, it's the same thing. This was generated from a similar idea, using a textured carpet image to generate this flooring carpet. Again, you can do the same thing and ask for different types of finishes to be generated. As you can see, the detail is just unbelievable; this is not photographed, this is all a Midjourney product from AI. You can apply this in different places; I'm showing you different images of different places and different applications of it, and you can see there's a different feel and look using the textures and patterns.

Again, the same thing: applying the textures and patterns to flooring. You can always control exactly which area to change if you don't like something. Let's say, in this output we had here, I love what I'm looking at, I'm just not sure the carpet looks exactly how I want it to. I used a texture reference, a pattern reference of these pastel colors, as you can see, but I don't like how the layout looks in the imagery. This is the beauty of some of these tools, such as Midjourney: you can keep everything fixed and change only the exact area you want to change, so everything else remains the same. I love the image, I love what I'm looking at, I'm just not crazy about the carpet, so I do a couple of other runs until I get something I truly like, and I go with that. In the same vein as what I showed you with color palettes, where we controlled the output of an image, I will show you some image results here. It's a bit different because these are more purely conceptual. Let's say we take this texture and pattern we created here. As you can see, all my outputs follow the same lines of color, texture and finish in this imagery, so we're able to control the output, now using colors and patterns, a combination of the two. As I mentioned, we started off with straight-up architecture controlled by color, and now we're controlling it with patterns as well. The same here: we wanted to create an output for a Nike Lego concept, and again everything is controlled along the same lines in terms of color and finishing. So this basically concludes what we talked about in terms of applying AI, the benefits of AI, integrating it in design, and the different means and ways you can integrate it.
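The "keep everything, change only the carpet" step is region-restricted regeneration: Midjourney's Vary Region and Stable Diffusion inpainting both work from this idea, where a binary mask marks which pixels may be regenerated while the rest stay frozen. A minimal conceptual sketch, with a made-up rectangular region standing in for the carpet:

```python
def make_region_mask(width, height, region):
    """Build a binary mask: 1 = pixel may be regenerated (the carpet),
    0 = pixel stays frozen. `region` is (left, top, right, bottom)."""
    left, top, right, bottom = region
    return [
        [1 if (left <= x < right and top <= y < bottom) else 0
         for x in range(width)]
        for y in range(height)
    ]

# Tiny 8x6 "image"; the carpet occupies the lower half of the frame.
mask = make_region_mask(8, 6, (0, 3, 8, 6))
editable = sum(sum(row) for row in mask)
print(editable)  # 24 pixels open for regeneration; the other 24 stay fixed
```

In the real tools the mask is painted by hand over the render; only the generator's output inside the masked area changes between runs, which is why the rest of the image stays identical.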
These are, to me, the most essential tools, the ones I like using most. ChatGPT is the workhorse of all AI prompts, and I use Midjourney, but there are other tools you could use, like DALL·E, that are just as good. The two tools I showed you, Krea AI and PromeAI, let you generate instant AI outputs from sketches.

Topaz Photo AI and Magnific are two amazing tools for upscaling your imagery. I use them on any design, not just AI designs: any pictures, any photos, anything I want to upscale, I drop into one of these tools and it can upscale it by six times, which is great for print purposes when you have an image that worked well. And let's not forget the Adobe Suite has some amazing tools: Illustrator integrates vector-based imagery from AI, Photoshop handles image changes, and InDesign lets you generate images on the fly. The Adobe Suite has a lot of interesting tools; I'm not showcasing them here, but they are worth using. Then there is also open-source AI with Stable Diffusion and ComfyUI.
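The six-times upscale matters mainly for print, where required pixel dimensions are physical size times resolution. A quick back-of-envelope helper (the 300 DPI figure is the usual print rule of thumb, not a number from the talk):

```python
def required_pixels(width_in, height_in, dpi=300):
    """Pixel dimensions needed to print at the given physical size and DPI."""
    return round(width_in * dpi), round(height_in * dpi)

def upscale_factor(src_width, needed_width):
    """Linear scale factor to bring a source image up to the needed width."""
    return needed_width / src_width

# A 1024px-wide AI render printed as a 20 x 30 inch poster at 300 DPI:
w, h = required_pixels(20, 30)
print(w, h, round(upscale_factor(1024, w), 1))  # 6000 9000 5.9 -> roughly 6x
```

This is why a typical AI render, fine on screen, needs roughly a 6x upscale before it survives large-format printing.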

In conclusion, I honestly just want to stress the importance of staying on top of AI as a co-creator tool. Try to explore all the tools AI offers and see which ones work best for you. I showed you what works for me and for us, but other tools work for other people. There are a lot of tools out there right now for converting sketches into architectural renderings; there are so, so many.

These are my preferences, the ones I think work best for me, but you can definitely go ahead and explore whatever you want. The point is, it really helps so much in terms of workflow, work efficiency and how we generate our work right now. So as we wrap this up, I just want to make clear that AI is here to stay and will continue to play a key role in the future of design. It's not just about automation; as you can see, it's really becoming a co-creator. I use it as a tool to help me in many different ways, not just in the design process but in other areas as well. It helps me push the boundaries of what's possible. Many times it produces things I did not imagine, and I can go in that direction; sometimes it's just way off and I don't use it. So honestly, I would encourage every designer out there to explore all these AI tools and start implementing them into their workflow, not as a replacement but as a way to enhance your creativity and efficiency. AI is opening many doors, and as we look ahead, I'm excited to see where it will take us. I think the future is bright and friendly, and there is so much to be discovered. Like I said, we get new tools almost every week or two, so I can only imagine, given the tools we're using today, where we will be in a year or two, considering where we were just a year ago based on what I showed you. With that I end my presentation. Thank you all for your patience, and we can open it up now for a Q&A if anybody has anything to ask.

Amar, fantastic, inspiring us on this AI topic. Where before maybe we, or at least I, felt it as a threat, we now see it as a co-creator. You took us on a creative journey, a creation journey actually, with a lot of tools and inspiring elements. Thank you very much for that.

Oh, it's my pleasure, Isabelle, and that's honestly the whole idea: for people who are very aware of it to see that there is a lot more potential, and for people who are not aware of it to see that it's really just a tool for me, like any other 3D design tool. It just expands the horizon of what I can offer and what I can do, at a much deeper level.

I'm going to take a few questions. All the questions will come on our website anyway, and when the webinar recording is sent to you, you will get all the questions with answers as well. For now we're going to limit it, because it was so interesting we didn't want to interrupt.

One of the questions is: 'Is the advance in AI image rendering pinned to the newer versions, or is it rather the input the users give that makes the AI evolve?'

I mean, it's a tool, and I'll repeat it again: just like the traditional tools some people are familiar with, such as AutoCAD, 3ds Max or V-Ray, the tool is only as good as what the user puts into it. That's why I keep repeating that it will not take anybody's job; it will actually enhance and support you as a co-creator. So for sure, the quality of some of the imagery you're seeing is controlled by the amount of input I put into it, because, like I mentioned, as with any other tool, it's how I control the output. There are a lot of aspects, tricks and tips you can use to generate that output, but whatever you see me do, you are able to do if you just learn exactly how to use it. So in essence, what you see is basically a learning curve: understanding things like camera angles.

And how to use daylight versus sunlight shots. It's like any other tool: you input the information you need, then you play around to see what works best, and then you create your own template, so to speak, that generates the images you want for interiors, exteriors, furniture, close-ups. It matters, but the tool is the same for everybody. I don't have a different tool than anybody else; it's just the way you use it, the way you put input into it.

All right. 'What future advancements in technology do you foresee that will influence color usage in interior design?'

So one of the drawbacks of AI is that it does not produce the exact imagery you imagine; it does its own iterations. Yes, you can control it up to a limit with your prompts, content and descriptions. You can be as detailed as you like, and it will get close; in some instances it gets 90% of the way to what you want, which can work great for presentation, but for production it is not enough. As for the future advancements I would like to see: something like the way sketch-to-render works, but within AI, where I can produce an exact image of what I'm designing, and, here's some wishful thinking from me, be able to generate a 3D model from that AI output.

This is something that's not there yet; I just gave everybody an idea to work on. It does exist in a sense, where you can almost generate a 3D model, it's just not exactly accurate. The idea is that if I'm looking at this model here in the Q&A, for example, I'd like the ability to generate a 3D model out of it, so that I can take it straight into engineering and fabrication instead of redrawing the concept I just created for you. That's my wishful thinking. It will happen, I'm sure it will happen.
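The "create your own template" idea Amar mentioned in the previous answer, a fixed camera and lighting vocabulary with swappable subjects, can be sketched with Python's standard `string.Template`. The vocabulary here is illustrative, not his actual prompt wording:

```python
from string import Template

# Reusable skeleton: only the subject changes per image; camera and
# lighting defaults encode the "house style" you settled on.
INTERIOR_TEMPLATE = Template(
    "$subject, $camera, $lighting, photorealistic interior render"
)

def render_prompt(subject, camera="wide-angle eye-level shot",
                  lighting="soft morning daylight"):
    """Fill the reusable prompt template with per-image details."""
    return INTERIOR_TEMPLATE.substitute(
        subject=subject, camera=camera, lighting=lighting
    )

print(render_prompt("boutique hotel lobby with patterned carpet"))
```

Keeping the template fixed and varying only the subject is what makes a series of outputs read as one coherent set, the consistency Amar shows across his interior, exterior and close-up renders.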

It's fun, because it links immediately to a question that keeps popping up. You are so enthusiastic about this creation journey, so I'm wondering: 'What's the most exciting part, the creation, or seeing what you have created with AI?'

I would say honestly it's twofold. The journey, the creation aspect, actually seeing what you're imagining come to life as you work on it, is very exciting, because you have an idea in your head and you want to see how the output comes out. And the randomness of it is exciting for me too, because it sometimes pops up things I did not expect. So the process, the journey, is exciting, but the realization, seeing it implemented, that's the icing on the cake, of course.

All right, thank you. We have to stop this topic, limited time, but thank you very much for inspiring us. That leaves me to invite you all to our next webinar. The next modulyss Talk will also be linked to neuroaesthetics, and there we will talk about case studies, realizations of interiors with neuroaesthetic designs. Our experts will be Lucy Rees and Elena Nunziata from Tétris London. They will share their knowledge then. Thank you very much for the presentation and for sharing your knowledge with us.

Absolutely, my pleasure as always. Have a good day, and we'll talk later. Thank you so much. Thank you very much.

2024-10-04
