Hello, everyone. My name is Oliver Emberton and I'm your host today. I’m the CEO of Silktide, and we'll be talking about how AI is transforming web design.
Now, I'm going to begin with a quick demonstration. So this demonstration is brought to you entirely by generative AI. In fact, the following took under 5 minutes for me to create, and I wasn't really trying too hard to be that quick about it.
But to show you this, I need to explain a little bit of back story. So firstly, this is Bloom. Bloom is an entirely fictional organization that specializes in selling indoor plants.
So I want you to imagine the kind of company that sells designer plants to people with too much money and not enough time. Now, that's my business idea. Let's pretend I need a website.
Let's pretend I'm going to ask AI to help me out. So I spent all of a few minutes and I was able to pull together a design like this. That's pretty much my second attempt. Spent a little bit more time, I got one like this. A bit more rustic.
A little bit more time, I got one like this. This I really like. Maybe it's not that practical, but it's certainly visually impressive. And again, so far I've spent under 5 minutes. Now the thing is, if you spend any time with generative AI or other recent advanced AI tools, it's pretty much impossible to escape the realization that it seems to challenge what we're actually doing on a fundamental level.
The question you might well ask yourself is how much web design will AI do for us? That's one of the main questions we're gonna answer today. There will be many more besides. To get us started, let me show you what we're going to cover. Firstly, what goes into web design. Hopefully everyone on this call is familiar, but we're still going to take a first-principles approach and break it down.
We're then going to look at the state of AI today, specifically with regard to web design. And lastly, we'll talk about how to predict AI in 2 to 20 years. And if that sounds absurd, you should stick around, because it is. But it's also quite a lot of fun. Right.
So firstly, let's take a look at what goes into web design. Now, you can cut this up any which way you want. You can look at roles, you can look at tools, you can look at responsibilities.
I took a relatively simple approach, looking at six main areas of almost intellectual responsibility. Skills, if you will. And you'll see why, because what we're going to do is take each of these six and dive into them from the perspective of generative AI. So to begin, let's look at understanding.
So understanding a website is a core competency, even if it's not a specific role. Managers of a website project, for example, may be responsible. They may be creative directors. To be honest, some level of understanding is usually present in everyone working on a web design project. And a typical artifact this results in is a requirements specification document.
Understanding may be just a load of messages passed around on Slack, but it's something about requirements and something about your audience, which most commonly is represented as personas. Next up is messaging. So messaging can often be thought of as copywriting, but I think it's actually deeper than that.
This is where you actually try to figure out what specifically you are going to say and how you are going to say it in a way that's compelling to your audience. It's the difference between just directly stating the point of your business and actually getting into the head of your customer and writing something for them that unlocks the desire, excitement and intrigue in your prospective audience. Next up is architecture, which is just- I had to make it one word, but it's information architecture. It's essentially the structure of your site. It's the planning of the navigation, the content on the page, prototyping, etc. Then we get to design, of course what many people think of as core web design, but it's just the visual component.
So visual design is making it look great. Then you also have the user experience component, which is generating a great experience for a user, and that encompasses a whole range of other things, including like how efficient the content is or the design is, as well as how accessible it is. Next up, media.
Now, anyone who's been on the Internet for the last year has probably come across the application of AI to this. We'll take a little more of a detailed look later on. But typically websites are going to use some form of illustration or fonts or icons or photography, often a combination of them all. So we'll take a look at those. And then lastly, we have testing. With testing, we're looking at things like, well, running it on real users, actually taking your prototypes, potentially your low-fidelity prototypes, wireframes, etc.,
and just running them in front of real people and saying, Hey, can you make sense of this? And then interpreting all of that and turning it into useful insight, helping improve the quality of your website. That's the theory. Of course, we all know not every web design team necessarily does all of these things, certainly not perhaps as much as they’d like, but this is the kind of ideal framework.
So let's dive into the first one: understanding. Okay. So AI understanding a website, how would that even work? Well, I'm going to start you off with a tool you probably haven't heard of called QoQo. And you may be wondering, how do you pronounce that? Honestly, I have to tell you, I have no idea. You'll see there's quite a few AI companies out there and they all seem to have quite terrible names.
But anyway, QoQo specialize in generating user personas with AI, so you can type in something as simple as a 35 year old payroll manager and they will actually attempt to generate a complete user persona from just that description. Generally you’ll want to give it a better one. This is incidentally a Figma plug-in. If you use Figma, check it out for free. It's pretty cool.
It's like a couple-day free trial or something. But yeah, no affiliation. I just- I've used it and I thought it's pretty useful. So in this example, I'm going to take our Bloom website and I'm going to give it a persona of a young, busy professional looking to decorate their apartment with lots of plants.
They're affluent and well-educated, and I ask for a persona card, and I get something like this. Now, if you actually stop to read this, it's kind of impressive. Take a look at some of the needs of this person. They want clear and concise information on plant care and maintenance.
They want guides on proper watering techniques, recommendations for low-light indoor plants, tips for preventing and dealing with common pests in indoor plants, etc. This is actually legit. It's surprisingly insightful, especially given the brevity of my description. And this goes on beyond the one page you're seeing here.
There's actually way more content below the fold that I'm not showing you. So that was pretty handy and it took me about a minute. So if I'm trying to get ideas for, say, my content calendar or my website structure, this was a genuinely useful starting point. Now maybe you aren't using QoQo; maybe, as in this case, you're using ChatGPT. You can still do something very similar. And I'm going to show you the contrast between the two because it's quite informative.
So imagine I just type in a prompt like this. I'm starting a business called Bloom, which delivers indoor plants that are easy to take care of. Help me create some user personas.
And sure enough, it does that. It comes up with about six. This is the first one, which is my favorite: Eco-Friendly Emma, and it's even given a funky name which I quite like. Bit of alliteration is always good and it's a very, very lightweight user persona in this case, right? Not really what I want. I want to get a bit more detail. So this is one of the things you're going to start to see. You'll see that the first tool was a very specialized AI product that was designed for a specific use case, and it does a very good job at it with minimal effort.
Here I'm using a general purpose AI tool, GPT, which I suspect, hopefully, 90% of the people on this call have actually used, or one of the equivalents like Claude or Bard. A general purpose tool can perform much the same task, but you'd have to put a lot more effort in. You'd have to sit here and kind of nudge it in the right direction. But anyway, let's say I like this Eco-Friendly Emma, great. I want a persona around that.
I actually think, you know what, it'd be quite fun to have a cartoon of Eco-Friendly Emma, that would look good. So let's get a cartoon. There we go. Took me about 30 seconds. So I took my cartoon, copied that into Word and pasted in the persona that it gave me, with one extra prompt. And there, in about 2 minutes, I've now got a reasonably decent user persona for almost no work whatsoever.
And this is actually useful in driving understanding, shared understanding, of this website. I can share these personas, as you should be doing, with your staff and hopefully align people around a better understanding of your audience and what they can do. Crucially, you can also share that, of course, with your other AIs, but we'll get to that. So again, to look at what we have here, this is a user persona generated by a general purpose tool and a little bit of manual work.
And this is the persona that was generated by a specialized tool. And to be honest, the specialized tool did a better job. It came up with more genuinely valuable insights and it was quicker. But of course you have to pay for it. So you'll find that there'll be this divide between using specialized and generalized tools.
I suspect everyone's going to have a generalized AI and a handful of people will have specialized AIs for their specific roles. So let's move on to messaging. Messaging, to recap, is where we're looking at the ways that you specifically explain and communicate your product, your benefits, your services. So in this case, I might ask GPT, Hey, can you write ten three-word slogans for this business? And literally off the bat it came up with Nature Delivered, which I love. Green Dreams Delivered, Fresh Foliage Straight to Your Door, Blossom Your Space, From Soil to Sanctuary.
Not sure I like that, but it's certainly creative. Nurture Nature Indoors. Again, that's kind of maybe a little too clever for its own good, but it's pretty cool. These are impressive, right? These are objectively impressive. They're not going to win a Pulitzer Prize.
But if you're not getting value out of this, you're probably not looking closely enough. These are hard for people to do. Most human copywriters are not this effective or nearly as fast. So I took Nature Delivered, which I liked, and I stuck it on this website design, which took me, again, another sentence or two of text.
And there you go. I've got a concept early on, something to think about, to spin ideas around, and it took me almost no time whatsoever. I could follow up. I could say, All right, can you identify some key messages that would resonate with Emma's user persona? The persona I had earlier on, feed that back in. It's your briefing document to staff, and it's also your briefing document to AI. Think of it as like onboarding your AI, which is a terrifying and brilliant new reality we now live in.
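Incidentally, if you would rather script that onboarding step than paste it into a chat window, the same thing can be done through the API. Here is a minimal sketch, assuming the OpenAI Python client and an API key in your environment; the persona text and prompts are illustrative stand-ins, not the exact ones I used:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# The persona text you generated earlier becomes part of every request
emma_persona = (
    "Eco-Friendly Emma: a young, busy, affluent professional decorating "
    "her apartment with low-maintenance indoor plants."
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        # "Onboarding" the AI: the persona rides along as a system message
        {"role": "system", "content": "You write marketing copy for Bloom, an indoor "
         "plant delivery business. Write for this audience:\n" + emma_persona},
        {"role": "user", "content": "Identify five key messages that would resonate "
         "with this persona. Give each a heading and one sentence of copy."},
    ],
)
print(response.choices[0].message.content)
```

The point is simply that the persona is reusable context: whichever tool you use, the same briefing document gets fed to the model every time.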
So having now onboarded your AI with Emma's user persona, can it give you a heading and a single sentence of text copy for each of these key messages? So for Emma being eco-conscious, it suggests something like Green Living Made Simple, or Eco-Friendly, Always. Breathe Easy with Air-Purifying Plants. These aren't terrible. It came up with a whole bunch more.
But let's just take the first one, Green Living Made Simple, put it on a web page that it designs. You can sort of get a feel for what that might look like. Now, again, I wouldn't use that design, but it took me minutes, literally minutes to ideate like this. Let's move on a bit.
Architecture. So information architecture, in this case not to be confused with any building construction. The information architecture is typically things like the pages your site will have, the structure of those pages, the naming of those pages, things like tags if you have search, etc., as well as
what you put in those pages, what kind of structure you give them. Now for this, I want to show you another AI tool called UIzard. And yes, this continues the long-running trend of terribly named products. Seriously, what are these guys doing? They literally have access to AI to name their products. I don't know. But anyway, I genuinely do not know how to pronounce this name.
UIzard, QoQo... anyway. But it's pretty cool. And you can use it, as I'm going to show you here, for free. So again, no affiliation, I just encourage you to check them out. It's kind of worth it for a play.
What they make is a design tool that's a little bit like Figma. The design aspect isn't nearly as good as Figma, but the AI bit is pretty impressive, and it's worth playing with. So what I'm going to do here is use the auto designer. I'm just going to give it a brief. My brief is four words. I say "Bloom" plant delivery business.
That's it. I was very, very lazy. And I just told it, Green, verdant, and contemporary for qualities. That's all I did, right? It's really, really basic.
And I told it I want a web page or website, and it goes ahead and puts together some sketches for pages. Now, these are not going to win design awards. They are not, you know, spectacularly beautiful or anything at all. But I encourage you to look at the structure and kind of the copy and so on that it's thrown together as a starting point, and consider what it's actually doing to pull this off. So it's got "Discover a wide variety of plants for your home", maybe not necessarily the copy you want to lead with, but it came up with something that's technically valid. It put a search box under it.
Search for plants, browse by plant type, care level... You've got categories in there, indoor outdoor, flowering, succulents, herbs, ferns. I mean, I'm not an expert on plants, but that seems legit to me and it's a pretty decent starting point. If I didn't know plants the first thing I'd probably do is try to learn enough about it to figure out categories and logical sections to work with.
It's done it for me and it's just put them together on the web page design. I mean, again, the design isn't beautiful, but look here, this is a search box, right? So it's got recommendations at the top with some appropriate image. Well, maybe not totally appropriate. And I think the fourth one, is it a cloud?
But still, it's trying. It's got some tags, pet friendly, easy care, recently viewed plants. That's actually legit functionality that I may not have thought to include. And it's done all of this for me literally from a four-word brief.
There aren't many humans that can pull that off quite so effectively. Right. Moving on. We also see a shopping cart here. It's actually designed a complete shopping cart for us, which I'm not going to show you, but it's kind of impressive that it's capable of analyzing what you've got, speculating on pricing and categories and so on for your products and putting it all together.
And I can iterate on this. I can go into a chat box and I can say, Hey, you know, a landing page which sells our most popular cacti, and generate that. And sure enough, it comes up with an extra page, which is so long I can't show you the whole thing. But again, this is not perfect. You're not going to publish this on the Internet, but it's a starting point for conversation.
It's reasonably insightful and there's some value in it, and it's not hard to imagine how a product or tool like this could become far, far more capable in the near future, because none of this stuff existed 12 months ago. Moving on to design. So at the start of this, I gave you a quick demonstration and I'm going to go back to that. The demonstration I gave you was this: I asked ChatGPT 4, which has an image generator known as Dall-E 3.
I asked it to draw the home page of a plant delivery business called Bloom, and I literally gave it this short two-sentence brief. That was it. And it came back from that and gave me this, which I wouldn't go live with.
And it's got some obvious flaws, but still, it’s a start for conversation. And then I asked it again and gave it some more refinement. I was like, Can you make it more colorful, for example? So it came up with this, and I continued in this line. So it's not difficult.
You just type in what you want and it comes back. And so I wanted a more rustic style. Now do I actually want a more rustic style? I don't know. At this early stage, I'm literally trying to feel out what I think works best.
So I might say, for example, do a crazy artistic one. Do a really old fashioned stylized one. I might ask for one that looks like a Salvador Dali painting because that just hits me in the moment right? You have all sorts of options there and this is really, really powerful. But it also reveals a range of strengths and weaknesses of this kind of technology and this kind of approach. So it's really impressively creative, to be honest.
I think we like to deride AI as derivative and whatever. But the truth is, if you got a human who did all three of those designs, especially if they did them in under 5 minutes, you'd be kind of impressed. It's as creative as the prompts that you give it, but it's surprisingly expressive within that. And because it knows basically everything, it can do any style you choose to throw at it.
It's also unquestionably relatively appealing. You may not realize this, but all of the major image generators, the Midjourneys, the Dall-E's, are trained on human taste, on human aesthetics. So they've actually been given lots and lots of images that have been scored by people, and they've essentially tried to learn what people like to look at. And for that reason, they sometimes overplay making a design that looks good, even if it doesn't quite deliver the qualities that you actually want. And of course, it's very, very fast.
You can't really argue with getting a design in 20 to 30 seconds. So it's a starting point, but it's definitely not something you could use as a real web page design. The main problem here, in this specific toolchain, is that it's fixed. These are bitmaps, and the tool that I'm using here has no ability to refine them.
I can't go here and say, I like that design, but can you move the logo ten pixels to the left? It literally can't. If you ask for it, it will just do a new design again, which is not what you want. And in most cases that alone is going to kill stone dead the idea of using this as a web designer, but it's very useful as a form of inspiration.
And you can take the components of these designs and pull them out so you can say something like, I love that image in the middle. Can you generate an image like that? Just the image, and then you can put that into your own design tool, for example. It's inaccurate.
So anyone who's played with these kinds of tools knows this, but you can ask for something and it can outright ignore what you're asking for. So in this case, the first design on the left gave me what I wanted, but then the second one, it put it inside a monitor. Right. I don't want it inside a monitor.
I want it to just look like a screenshot. That's what I care about. So I specifically said, Could you not do that? Could you not put it in a computer? Can you just give me an image? And of course it came back with a third image that was exactly what I didn't ask for. This is a common experience and it is somewhat frustrating. And of course you can't help but notice that text-to-image generators like this are all kind of hit and miss. They've gotten better; they've got a long way to go. Lots of reasons. It's pretty hard to get right.
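Incidentally, everything I've been doing here through the chat window you can also drive programmatically; Dall-E 3 is exposed through the OpenAI images API. A minimal sketch, assuming the OpenAI Python client, where the prompt is just an example:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Each call is a fresh, stateless render: you get back a bitmap, not an editable design
response = client.images.generate(
    model="dall-e-3",
    prompt=("Home page design for Bloom, an indoor plant delivery business. "
            "Rustic style, warm colors, flat mockup, no monitor or device frame."),
    size="1024x1792",
    n=1,
)
print(response.data[0].url)  # temporary URL to the generated image
```

The same caveats apply here: you can plead for "no device frame" in the prompt and it may still ignore you.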
Anyway, I also asked the same AI to help me generate some logos and again, well, these are not something you would commercially apply. They are kind of informative. The one on the left. I actually quite like the way it's joined the two O's into an infinity symbol.
If I were designing my own logo, I might use that as inspiration. I'm less keen on the way 'delivery' joins the V into the E, but you can see what it's trying to do. It's trying to create a pattern, right? The design on the right is kind of accidentally hilarious. It's trying to loop the L of Bloom into the O, but it's accidentally ended up writing the word boom, which is perhaps not the connotation I want for my plant company, but provocative nonetheless.
I played with this slightly longer, a couple more minutes, and I got this, and this I actually quite like. It's not great, but as an inspiration, as a starting point, that's pretty reasonable. And historically what I might have done is sketch with pen and pad, or just played around with clip art and Google for an hour or so.
So this is a faster, more interactive way of exploring ideas like that. Another thing you may not realize you can do with modern design tools is expand your range of modality. And what I mean by that is, let's say, like me, you like to sketch on whiteboards or pieces of paper. You can do that, then upload your design and give it to a tool like UIzard, and it will turn your hand-drawn sketch on the left into a vector design like you see there on the right. This is kind of magic.
Now, it's not a great interpretation of that design. It's very functional, it's not very aesthetic, but nonetheless, it just saved me a whole ton of time. Are you going to see tools like this supercharged? I guarantee you Photoshop and Illustrator will get the same tooling.
There's no way they won't. Both have been leading in AI advancements of late. So this is the sort of thing that will progressively transform more and more of how we work and how efficiently we can work. Right. Next up, media.
So media covers things like photography, illustration and clip art, logos, icons, etc., and it can't really have escaped anyone's attention that this has been turned upside down over the last year by things like Midjourney. This is Midjourney, which is probably the most skilled and capable image generation tool out there. What you may not be familiar with is what Midjourney has looked like over the years.
The top left corner, that's the first version of Midjourney from February 2022. I encourage you to look at this. In February 2022, this was state of the art. This was a revolution. No one had ever done anything like this before.
It's actually kind of crazy how transformative this was at the time. And then less than a year later, we had this. This is November, the same year. So significantly improved, but still far from perfect. I mean, pretty impressive.
If a human artist did that, you'd have assumed it took them quite a few hours, but you know, we could do it in 30, 60 seconds, something like that. Anyway, a year later from there, we're at this level. It's not hard to see the significant advance in quality and capability for these tools. But just to underscore it, I want to show you another example.
This is again, v1 of Midjourney, February 2022. These images are clearly far from ideal, but let's look at less than a year later. We got to here and then a year later from there, we're at this. And of course that's December 2023, which in AI terms means a thousand years ago. So let's look at where we're at a month later.
So this is where we are right now. This is Midjourney 6, the latest version of Midjourney. And it's amazing. It can do things like this. This looks like CGI, right? This is done by a generative AI tool, which is absurd.
Normally you'd need a 3D artist or something. But anyway, this one, frankly, is hard to believe. It's computer generated. That is outrageous as far as I'm concerned. I mean, honestly, if you didn't know, could you tell that wasn't a photo? Really? What about this? This looks like hand-drawn art to me.
I mean, consider the detail. Not just the nuance of like, you know, the reflection, the lighting on the jacket and the rain, but also consider like the artistry applied to generate something that creative and expressive and quite frankly, beautiful. It's amazing. And I can get it in 30 seconds by typing in the box.
And of course, this stuff, all right. Maybe you've seen it before. Sure. But have you seen this? This is also Midjourney 6.
This fools a lot of people. It's hard to believe, but AI can generate pretty much anything at this point. And realistic-looking old-style photos are a genre it's hauntingly good at. None of these people ever existed.
Now, to that end, there is a whole wealth of new and exciting specialized tools designed to help designers put together better web pages or graphic designs. And one of my favorites is this tool, Generated Humans. This tool is essentially a database of humans that they've generated entirely with AI, which you can explore and download. And if you pay them a little bit of money, you can actually create and iterate yourself. So you can go in here and do things like set the hair color, skin tone, emotion, everything you want. You can play around, drag and drop aspects of that face, and generate literally any photo, any image you like.
You can even do it with a full body pose like this. Now, the potential applications of this are numerous, but one of the most obvious is that you can essentially have models pose for you with any profile that you want. You can take some of your user personas from earlier and ask the AI to render them as prospective illustrations or photos for your website, which is kind of crazy. And there's no licensing agreement, no concern that you need a release, or that anyone else can use the same clip art that you chose to use. So the practical applications of this are quite profound, and tools like this were not even possible 6 to 12 months ago, and now there's tons springing up. You can try this one for free. I recommend you check it out. Anyway.
Lastly, let's look at testing. So testing seems like a pretty human-oriented thing. The whole point is you've built out a web page, or potentially just a prototype, a sketch, a wireframe, and you want to test it on real people.
You want to understand that it works. You want to understand that people resonate with it. Maybe you've just come up with some messaging, your core pitches and so on. You want to get that in front of them as well.
AI is actually affecting this in ways that may not be obvious. So there's a company I'm going to draw your attention to, called Attention Insight, and what they do is quite novel, quite specialized: they have synthesized the human behavior behind eye tracking. And so what they're able to do is create things like heatmaps, where they predict how users will interact with a design before the design is even live. So in this case, you're seeing this web page on the left, and on the right-hand side you'll see the heatmap for it. And the heatmap is looking at things like the text.
And in this case, it's paid particular attention to the woman's face on the bottom left, and it's derived this through AI. AI essentially has trained on a lot of human behavior and has learned how to imitate it. And so if you're working on a design like this, you can get an at-a-glance, immediate assessment of how your users are likely to react to your design.
Now many of us have probably heard of shifting left from the accessibility community: the idea of moving the accessibility components of your production process earlier, into the design process. Essentially shifting them left in your timeline, if you will. Now, I suspect we're going to see a similar transformation here, with AI bringing aspects of user testing forward into the design process. We could literally be working on graphic designs, and as you're working on the graphic design of a web page, AI could be commenting in real time. It could be highlighting in real time how it expects users to respond to the page that you are sketching right now.
And similar tools could be integrated into CMSs, into all manner of toolchains. So we won't necessarily even need to wait to make something live in order to get feedback on it, which is kind of wild. Anyway, moving on, I want to show you something new and exciting from my own company, Silktide. We're launching a feature today for our Silktide AI platform, which we call Improve UX. Improve UX leverages AI to improve the user experience of your website.
And it works really, really simply. It's literally a button. Let's say you're looking at this web page here in Silktide right now. This is a fictional e-commerce website that sells wine. You click the one button, the AI takes a look at this page, slides in a panel like this, and assesses what this page is actually trying to do. So it'll say, okay, the goal appears to be showcasing a specific wine, emphasizing its quality and value.
And another aim is to encourage visitors to make a purchase through a call-to-action button. Correct. The AI has figured that out. It's also figured out the strengths. It determines that the contrast between the sale price and the original price is a good strategy to highlight the deal being offered, and that the use of an image of the wine bottle and glass helps to visually engage you and showcase the product. So that's all good.
But then it goes on for quite a while with improvements to consider, things that might make this page more effective. It suggests more detailed product information, like information about the wine's characteristics. And it also notes that the video or audio player in the middle seems to be kind of dodgy, because it says it has a duration of 0 minutes and 0 seconds, which it's actually smart enough to figure out not only exists but is probably not appropriate. So this is modern AI for you. It's literally looking at your web page and performing an intelligent user experience critique of your visual design, which is wild.
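I'm not going to walk through how our feature is built, but the general pattern, which is not Silktide-specific, is handing a screenshot to a vision-capable model and asking for a structured critique. A rough sketch of that pattern, with an assumed model name, file path and prompt, using the OpenAI Python client:

```python
import base64
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Screenshot of the page we want critiqued (path is illustrative)
with open("wine-product-page.png", "rb") as f:
    screenshot_b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="gpt-4-vision-preview",  # any vision-capable model
    max_tokens=600,
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": (
                "Identify the goal of this web page, its strengths, and a list of "
                "user experience improvements to consider.")},
            {"type": "image_url", "image_url": {
                "url": f"data:image/png;base64,{screenshot_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)
```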
I know, because I worked on this, that six months ago, to be honest almost four months ago, none of this would have even seemed possible. It would have seemed like a dream. And now it's something that we're able to experience today. It's wild. Anyway, you may have noticed there's a chat box in the bottom right-hand corner, and if you click in that, you can have a conversation with that UX assistant.
You can ask it follow up questions. So let's do that. I could ask what would be a better order for the content on this page? It could tell us, Well, I would put the product name and image first and then I would follow it up with the pricing information. And then lastly, I'd have the call to action right below the pricing.
So it's actually proposing reordering this. Right now the page has the title, some text, a video player, the price, and right at the bottom it's got the Buy Now button. And I believe, quite correctly, it's assessing this and saying that's not a very good idea and you should do it like this instead. I could also say, could you write some better text for the buy button? It says, well, replace "Get me now", which is terrible, with "Add to cart".
And I could ask why that wording. Add to cart is a widely recognized and understood call to action that specifically indicates to users they'll be adding the product to their shopping cart. What's important to realize here, and this is true of all the AI you've seen so far, is that at its best, AI is not just parroting copied-and-pasted points. It actually understands: it has reasoning for the thing it is advocating, and it can justify it.
And if you're not convinced, you can ask it. It's kind of crazy. Anyway, we're very excited and proud to launch this feature. This is free to all existing Silktide customers and it will be going live in a few hours after this webinar today. So all you need to do is view a page in the Inspector, and click on the Improve UX button at the top. So let's wrap up this section.
So we've been looking at the state of AI today, so January 2024, and I have to add the January 2024 because this is AI, so by February 2024 everything will probably be different. But we can take a pretty good assessment of the kind of tools I've shown you and consider how they affect what goes into web design today. So if we look at this breakdown, the understanding, the messaging, the architecture and so on, I've attempted to chart out how much present-day AI is affecting, or able to affect, each area.
Now this is nonscientific, but I just want to give you some kind of frame to work with. So the areas that I would consider the most impacted are messaging and media. Messaging is where you're looking at things like writing copy, particularly your original ideas for copy, and media is things like illustration, photography, clip art and so on.
It's not hard to see why. ChatGPT takes care of messaging, or at least it doesn't take care of it, but it significantly accelerates your ability to perform in that area. And I've been writing copy for 20 odd years and love it, but I still wouldn't know how to tie my own shoelaces anymore without using GPT because for me it's just such a magnificent way of accelerating my creative process.
I don't rely on what it gives me. A lot of the time, I use what it gives me to reject the cliches, so I write something better. But it is incredibly impressive at getting you started.
And likewise with media. I wouldn't necessarily, you know, stop using real photography, real clip art, real illustrations. But I can try stuff out in minutes or even in seconds with generative AI and sometimes, yeah, the generative AI honestly, it's better. It's more custom, it's quicker.
You'll certainly notice I've used plenty of it in this presentation today. And then as we move down, requirements and personas, well, there's definitely an impact of AI there, but it's not going to take over. It's definitely making a big difference, though. And actually that's a key point to mention. When I say AI disruption here, I'm not talking about AI doing this job.
That's a whole other story. I'm talking about AI changing the nature of this job, right? So like, imagine if you were a writer and you didn't move from using pen and paper to a word processor. At some point you might be left behind. Well, in the same way here, if you're working on copywriting and you're not using generative AI, you are probably going to be left behind. So if you look at architecture, it's somewhere in the middle.
There's tools to help with planning and prototyping, but they're relatively early stage at this point. And then with things like visual design, UX design and user testing, there's relatively little. At the moment AI's ability to affect those specific areas is quite low, but I would expect that to improve. So when we asked a question earlier on, how much web design will AI do for us, well, I guess at least for the moment, even though we can produce designs like this, they're not actually that commercially applicable yet. This is great. This is beautiful.
And it's an intellectual curiosity, if nothing else. It certainly helps get me started towards a real design. I might look at that and go, I had not thought about that. That's interesting. But then, is this actually going to work as a real design? I mean, ignore the obvious flaws.
Like the text is nonsense. This doesn't necessarily work for me commercially, doesn't necessarily sell the products in the way you actually want, and it's certainly not something I can just lift into a design product like Figma and work with. Not yet, but it's a good start. AI right now is giving us superpowers. It's giving us new abilities to do things that we've already been doing, but faster, more capably and to a higher standard. But the thing with superpowers is that with great superpowers comes great responsibility.
So if you think about going from handwritten pen and paper to a typewriter to a word processor, with each subsequent advance we were expected to work faster or to make things better, because the word processor lets you go back and edit, or, you know, even a typewriter just lets you write quicker than you could with pen and paper. But of course it also comes with the expectation that you're going to have to know more. Each subsequent technical revolution introduces a new set of skills, and in this case it'll be the skill of understanding and effectively applying AI, and that is going to transform huge aspects of what we do. It's well worth thinking of it, I think, the way this diagram shows you: you are in charge, almost, of a little robot army. You are all going to become a director, a creative director if you will, or a leader, a CEO if you will, of a potentially countless array of AIs, and it's up to you to be a good boss to them.
Properly instructed they will transform your potential, but improperly instructed, it'll be like being a bad boss to people. You won't get what you want at all. So as we move into the final section of today's presentation, let's take a look at something a little wilder and a little more speculative.
We've looked at AI as it currently exists within the web design industry, but I want to take a look at predicting AI 2 to 20 years in the future. Now, this may be something of a fool's errand. I'm setting myself up here to put this on YouTube so that someone can come back in 20 years and laugh at me.
But I have thought about it at considerable length and this is something I've been following for, well almost 30 years at this point. So let's take a shot at it. Let's look at 7 AI breakthroughs that you could reasonably expect.
I'm going to start with the most certain, most likely, moving our way down to the least certain. And as we go, for each subsequent breakthrough, which hasn't happened yet but which you might expect, we're gonna look at how it would affect the web design industry. But we're also going to look at how it might disrupt us all, right, in different ways from where we are right now.
And my thesis would be essentially that some, if not all, of these will occur, with varying levels of certainty. So let's take a look at what could happen. So the first thing, the most boring and predictable thing, is speed. Things are going to get faster, right? Anyone who's worked with computers for 10, 20, whatever years will tell you computers used to be really, really slow at stuff, but they are now really, really fast at it. For example, you're probably all too young to remember, but Google once upon a time used to take a few seconds to reply to a search, and now you can type into a search box in Google and, as fast as you can type, Google will autocomplete.
Literally as fast as you can type. You're going to see the exact same thing with AI. So right now, that image on the right-hand side, predictably, was created with AI, and it took about 30 seconds for Midjourney- no, it wasn't Midjourney, sorry, that was Dall-E, to create that image.
Right. I want you to imagine that that goes from 30 seconds to 3 seconds and then from 3 seconds to 0.3 seconds and then 0.00-whatever seconds, it doesn't matter, right?
So quick that as you type, the AI is generating the art in real time. That is basically inevitable. Seems crazy, but it's about the most predictable thing because computers just consistently get faster, faster, faster, and we get better at writing algorithms and technology and so on.
So AI is just going to keep catapulting forward. So the most boring and banal prediction is that you will be able to have AI generating things as fast as you can describe them. And that in itself doesn't sound- All right. It's nice, but why does it matter? Well, it's because it transforms everything else. So once AI is fast, it can fix a whole load of other problems. Let's take a look at the next one. Accuracy.
So accuracy is a common problem with say these image generators and to some extent with things like ChatGPT or text generators or video generators or whatever. You ask for something and it doesn't necessarily give you exactly what you asked for. I mean, it's trying, but it doesn't quite get there. Now, this is a hard problem to solve for lots of reasons, but it turns out speed gets you a lot of the way there. If you ask for an image and you don't get the image you want right now, what you generally do is you try it again. Maybe you try again with a slightly different prompt, but you try it again.
But if AI is fast enough, AI can try again for you and AI can try again maybe a thousand times in a second. And then another AI that is also super, super fast can check all those thousand versions so they can figure out which ones actually gave you what you wanted. Figure out the best one then give it back to you. And if that sounds wild, we're already doing it. We already do a version of that right now, when you ask for an image, for most things, like Midjourney, it's actually doing multiple images and it's killing off the ones that it thinks are not that great.
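That generate-and-check loop is sometimes called best-of-N sampling, and you can sketch the idea in a few lines today. Everything here is illustrative: the model, the number of candidates and the scoring prompt are all stand-ins.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

def generate_candidates(brief: str, n: int = 8) -> list[str]:
    # One request, several independent attempts at the same brief
    response = client.chat.completions.create(
        model="gpt-4",
        n=n,
        messages=[{"role": "user", "content": brief}],
    )
    return [choice.message.content for choice in response.choices]

def score(candidate: str, brief: str) -> float:
    # A second model call acts as the checker, rating how well each attempt fits
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": (
            f"Brief: {brief}\n\nAttempt: {candidate}\n\n"
            "Rate from 0 to 10 how well the attempt satisfies the brief. "
            "Reply with the number only.")}],
    )
    return float(response.choices[0].message.content.strip())

brief = "Write a three-word slogan for Bloom, an indoor plant delivery business."
candidates = generate_candidates(brief)
best = max(candidates, key=lambda c: score(c, brief))
print(best)
```

Make both steps fast and cheap enough and you could run them hundreds of times without the user ever noticing.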
But we could just get better and faster at that process, to the point where what you type in is pretty consistently what you get. So I would say that speed and accuracy are the two most likely, almost inevitable, advances in all the AI that you're experiencing so far. Then we get to the slightly less easy, slightly more speculative. So, consistency.
So at the moment, present-day AI technology is very deliberately what we call stateless, which is to say you ask for something, it deals with it, and then it forgets everything you told it immediately, and it's not affected by what you did in the past. I know there are applications where there's chat and so on, and they kind of cheat by feeding the history of the conversation into the next thing you say. But basically, if I ask for something like this image here on the right and then I say, Hey, can you turn his head to the left? It can't do it. If I ask for a design of a logo and it gives me a great logo, and then I'm like, Yeah, let's put that logo in this poster. It can't do it.
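That history-feeding cheat, by the way, is literally just replaying the transcript: the state lives on your side and gets sent back in with every turn. A minimal sketch, with illustrative prompts, using the OpenAI Python client:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment
history = []       # the "memory" lives entirely on our side

def ask(user_message: str) -> str:
    # Append the new turn, send the whole conversation so far, remember the reply
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(model="gpt-4", messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

ask("Describe a logo concept for Bloom, an indoor plant delivery business.")
# The model only "remembers" the logo because its description is replayed as text
print(ask("Now describe a poster that uses that exact logo."))
```

It works for text because text survives the round trip; a generated image does not, which is exactly the consistency gap.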
There's no consistency to its understanding of the broader world. We have an ability to try and do that, but it's, to use annoying technical jargon, kind of reduced dimensionality. It's like, imagine if I showed you the Pepsi Cola logo and I said, Right, look at this logo. Now hide, hide the can. Hide the logo.
Do me a poster of that logo. And you can't look at the logo. You'd kind of remember the gist of it, but you wouldn't get it right. AI is a little bit like that. What we need to do is be able to give it access to look at the logo or whatever it is in front of it while it works.
We don't have that technology just yet, but we're going to, probably, and when we do, it's going to make these tools vastly, vastly more useful. Next up, and related, is iterative. Iteration is a superpower that humans developed, probably the most transformative technology that humans ever, ever developed.
And I know fire is definitely up there, the wheel is great, whatever, but it's writing. And I'll explain to you why, because it's not obvious. Imagine I ask someone here to write a short story or a college essay or something like that. What you do is you write it onto a piece of paper, and you look at what you're doing, and you think about it, and you edit it, and so on and so on. Or you write on a word processor, or whatever, right? There's a process where you take what's in your head and you dump it down in front of you so you can reflect on it and inspect it. Now, AI is not doing that.
It doesn't have that ability yet. AI is doing the equivalent of you speaking off the top of your head at all times. So intellectually, when you ask GPT for example, to write an essay and it does a pretty good job.
It's doing the same kind of mental work in a way as you would be doing if you had to make an essay up off the top of your head and you couldn't stop and you couldn't pause and you couldn’t write it down. You just had to speak the whole thing out in one go. So it's pretty amazingly good considering it's doing that. Same with image generation in a different way.
So considering that, it's remarkable. Now imagine how much better humans got when we got the ability to write things down, look at what we'd said, and then reflect on it and improve it, and also to work on each other's writing and so on. This is the difference between an oral tradition that has a couple of stories and what we have now, a modern civilization with books and literature and so on. So AI has not gone through that revolution yet.
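You can crudely fake that loop today by chaining prompts yourself, draft, critique, rewrite, even though the model isn't doing it natively. A rough sketch, where the prompts and the number of passes are arbitrary:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

def complete(prompt: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

brief = "Write 150 words of homepage copy for Bloom, an indoor plant delivery business."
draft = complete(brief)

for _ in range(3):  # write it down, look at it, improve it
    critique = complete(f"Critique this copy against the brief '{brief}':\n\n{draft}")
    draft = complete(
        f"Rewrite the copy to address this critique.\n\nCritique:\n{critique}\n\nCopy:\n{draft}")

print(draft)
```

The reflection happens in the scaffolding, not in the model itself.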
It's going to, almost certainly. And when it does, that's going to transform the potential output and capacity and capability of AI. So that's a remarkable revolution to expect at some point. As we proceed, now we're getting into somewhat more speculative territory.
So this is where it gets harder, and each subsequent wave is going to get both less likely and more disruptive at the same time. So the next one would be autonomy: developing the ability for AI to direct itself, to behave independently to achieve high-level goals. You'll notice that if you've used any of these AI tools, they are generally fire and forget. You'll type in something and it'll do it, and then you're done. Or you'll make a request.
And it can only operate in the very narrow frame of solving that one thing at that moment in time. But of course, humans are not like this, right? You can go to a human and say, Hey, I want you to create a brand campaign. And the brand campaign may involve coming up with ideas, doing research, then creating some of those, testing them, getting feedback and iterating, doing that ten times, then taking the end result to their boss and saying, Hey, what do you think of this? AI can't do that yet, but potentially it will. And when it does, we go from a world where we say to a tool, Hey, can you do me a design? To one where you can say, Hey, here's my brief. What I want to accomplish are these goals.
You know, selling more indoor plants or whatever. And it will figure out the most efficient ways to do that. And present them to you at the end. That would be a remarkable transformation.
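Mechanically, the pattern people are experimenting with today is a plan-act-evaluate loop wrapped around the model. This is purely a sketch of the shape of the thing: the prompts, the stopping rule and the "acting" step are all made up, and a real agent would be calling tools and gathering data rather than just generating more text.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

def complete(prompt: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

goal = "Increase online sales of indoor plants for Bloom."

# Plan: the model proposes its own next steps
plan = complete(f"Goal: {goal}\nPropose three concrete marketing experiments, one per line.")

results = []
for step in plan.splitlines():
    if not step.strip():
        continue
    # Act: here that is just more generation; a real agent would run tools and tests
    work = complete(f"Goal: {goal}\nExperiment: {step}\nDraft the copy and asset brief.")
    # Evaluate: the model judges its own output against the goal
    verdict = complete(f"Goal: {goal}\nWork:\n{work}\nDoes this plausibly advance the goal? "
                       "Answer yes or no, then one sentence of reasoning.")
    results.append((step, verdict))

# Report back to the human "boss" at the end
print(complete(f"Summarise these experiments and recommend one to run first:\n{results}"))
```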
Next up, and perhaps more surprising, would be taste. We actually have some advances in this area already. I mentioned this before: Midjourney and Dall-E, which is what drew this actually quite beautiful image here on the right, have internal models of human taste, but they're quite basic. Essentially, AI has tried to learn what humans like to look at, and it's fairly good at it. It uses that information to create art that we find appealing, but it's also quite basic, quite derivative, quite vanilla in a way, which may be one of the reasons why AI tools have yet to produce some magnificent new piece of art that transcends the genre. They tend to create stuff that looks a lot like the things we see every day. But still, developing the ability for computers to reliably model human taste could let AI self-direct towards higher-quality outputs.
So imagine, when you're asking GPT to write you a blog post, if it had a really, really good understanding of human taste, if it could model emotional states like, you know, desire or intrigue, curiosity, etc. Understanding and modeling those emotional states effectively would enable AI to potentially produce absolutely dynamite copy. What it's doing right now is mostly imitating the style of copy that it sees in the wild, right? So it knows how to write a slogan because it knows what slogans look like.
But what it doesn't know is how to model human psychology so effectively that it could kind of get inside the reader's head and do something that might surprise and delight them. That, again, could unlock a new world of incredible creative potential. And then lastly, and most speculatively, would be brilliant.
So this is definitely not- well, this would be the most controversial among researchers. So at the moment, generative AI works on the principle that we teach AI through an enormous array of examples, what the real world looks like, and then it kind of figures out an internal model for creating its own stuff. And so essentially in a way they’re kind of imitating styles, right? You give ChatGPT the prompts to write something in the style of Shakespeare, and it will do it and it will do a surprisingly good job. It will imitate Shakespeare. But what it won't do is it won't write as well as Shakespeare.
It will get his style. It will imitate his style like I might imitate his style. I'll use old words. I'll use certain iambic pentameters or whatever. Great. But it won't be Shakespeare, it won't be brilliant.
And it seems that at least so far, whatever we feed into these systems, it learns how to imitate. But it doesn't learn how to transcend, right? There's no kind of spark of genius of true, spectacular creativity. And so it remains to be seen whether or not that will ever change. If it does, of course, we have the potential that AI may, in fact, transcend what humans can do, or at least transcend what the vast majority of us could do. That would be immeasurably disruptive in ways that we honestly, scarcely could even try to imagine. But it's also by far and away the least likely to occur, certainly for some time in the near future.
So let's put all of that together. So this model, fast, accurate, consistent, iterative, autonomous, tasteful and brilliant, these are what I would pose as the breakthroughs you can anticipate. And the disruption of AI is pretty much going to come down to where we land on these individual steps. So we asked a question earlier on: how much web design will AI do for us? Well, the answer is largely where we end up in this chart.
If we're in the top four, AI is pretty much going to be totally subservient to us. It's going to do exactly what it's told, and that's going to be incredibly easy for us to work with and to add value. If we're in the middle, it's going to be doing some level of self-direction. So you might imagine products like, say, automated A/B testing, where an AI looks at your website, comes up with A/B tests, runs them and automatically improves your website. That seems very, very feasible.
And ultimately, if you end up at the very bottom, if you end up at the end in brilliance, well, honestly, that's a challenge to what it means to be human. But it's also not something that I would be expecting anytime in the next 25 minutes at least. So as we come to the end... Ending a presentation on AI job disruption with a memorable and impactful line is crucial. ChatGPT has some thoughts and it suggests we should say AI is not the end of jobs, but the beginning of a new way of working.
Good job, GPT. AI is not the end of jobs, but the beginning of a new way of working. I actually like this visual image here as well because it kind of shows what I suspect is our likely future. Not that we’re all dressed up in Iron Man costumes, but the idea that we ultimately, as with all technology, are somewhat transformed by it. Think how the smartphone has almost become an extension of our reality. Right.
Your intelligence, your brilliance, whatever, is now extended by the presence of technology you hold in your hand. And I suspect AI will do something similar and even more disruptive to us over the coming years. But yes, whichever way it goes, it's going to be a wild and exciting new future. Unfortunately, at this exact moment, OpenAI chose to go down. This is actually a true story. It crashed and prevented me from doing any additional work on this presentation.
So as a result, I'm going to wrap here, but thank you for listening, guys. I'm going to move on now to Q&A. My name was Oliver Emberton.
You can follow me on LinkedIn. And if you're interested in our AI Solutions, you take a look at silktide.com/ai. Jess, are you there? (Jess) Hi, everyone. There's been a lot of discussion going on in the question and answer section.
(Oliver) Oh good. (Jess) To cover one thing just briefly. We did record this session. We will be sending it out later in the week to everyone who was signed up for the webinar.
So if you missed the beginning, don't worry, you'll be able to watch it. And I believe Oliver mentioned it's probably going to go on YouTube too. (Oliver) That's right. (Jess) So some of the questions we’ve been getting... Dean Brady asked, is this going to create just more of the same look and feel? Can we assume it's going to just make more vanilla site designs? (Oliver) Actually, I would go the other way. So I'm old enough that web design- So I was doing web design when I was 16, and that was back in 1995.
I was doing web design, but it was more creative than it is now. So modern day web design has become very derivative and predictable. I would say most sites look the same as each other nowadays. AI actually gives you new superpowers. So let me give you an example. Right?
So I'm literally just going to go random here. Say you've decided on the best way to represent your brand. Say you sell honey.
Okay, I'm making this up, right? Literally off the top of my head. Say you sell honey. Then you go, You know what? I'm going to want people dressed up as bumblebees in funny costumes. Okay? I just decided I want to do that. Now, if you're a traditional web designer, good luck, because you don't have the multi-hundred-thousand-dollar budget required to go and get a bunch of models, do all the photoshoots, maybe make videos and trailers and whatever. Try all that stuff, test it, just to go, oh shoot, that actually sucks.
That doesn't work at all. But what's probably going to happen in the coming months and years is you will just type that in. In fact, you could do some of it now. You could just say, All right, I want to try that, and try an entire array of wild, exciting new concepts that previously would have been outside your reach. I think one of the reasons web design looks so derivative now is because it's been reduced around our toolchain. It's built around the idea of how can I quickly produce an efficient website that has minimal typography, has these three-column layouts with the icon, with the text underneath it, all the things, all the cliches that you take for granted in web design right now. That's because they're easy to do. But when they're not easy to do- Sorry.
When new, exciting things become easy to do, maybe we get more variety. I'm actually quite excited. I think it could be a new golden age. (Jess) Okay. Because AI imagery is learned from other imagery, like actual artists’ work, how do you reconcile the copyright and ownership issue with AI imagery? I believe there are lawsuits ongoing in the US. (Oliver) Correct. Okay.
So this is a complex topic. I'm going to do my best to distill it grossly. Okay. So firstly, all the major tools that are doing this are aware of potential or actual legal jeopardy to them. You'll notice that they've all, I think, created some kind of legal insulation for their users.
So, for example, OpenAI has actually said if you're sued, we'll cover you, basically. It's their problem. It is likely there will be some legal fallout for some of the large providers that are doing this. That is seemingly inevitable, but it is also already something that they have taken enormous steps towards eliminating. So, for example, you may not know this, but OpenAI has tried to eliminate modern art, or potentially copyrighted art, from their training data entirely, so they don't even have the ability to learn from it. And they've got a whole range of guardrails around prompts: if you ask for certain protected things, they are going to try and stop you, and so on. Now that's imperfect, but one of the things it revealed is that even if you don't use that training data, AI can learn to be really, really good at art regardless, with the art you don't have to worry about, the copyright-free stuff, or the stuff that's old, or just looking at the universe or whatever.
It turns out AI can do an incredibly good job of learning things. And so I think