Virtual Vigilance: Protecting Yourself in an Age of AI and Deepfakes | Access Tech Live

- [Narrator] Coming up on Access Tech Live, - [Marc Aflalo] ChatGPT is getting a memory, the state of cybersecurity in Canada and tech expert Amber Mac joins us with tips to stay safe online. - [Narrator] This is Access Tech Live with Steven Scott and Marc Aflalo. The latest in tech and accessibility every week. Follow us now and get involved at Access Tech Live.

(upbeat soft rock music) - Welcome to Access Tech Live. Little bit of a glitch there. Can't hear Steven Scott. We'll get him soon, don't worry about that. Thank you for joining us this week.

Hi, I'm Marc Aflalo, Steven Scott, somewhere on the other end in Glasgow, Scotland. This week we're diving into the world of cybersecurity and safety. Tech expert Amber Mac will be joining us, plus the associate head of the Canadian Center for Cyber Security, to get a handle on where we stand as a country. But let's kick things off with the headlines this week. - [Narrator] Now, Access Tech Live headlines. - Well, if you've ever used ChatGPT, then you know that when you ask it a question, it's almost like Groundhog Day. After a while, you have to keep reminding it how you want it to answer and the formatting of that answer, et cetera, et cetera.

Well, OpenAI knows about it and is working on fixing that problem. It's rolling out memory for ChatGPT that will allow the bot to remember information about you and your conversations over time. It'll work in one of two ways.

You can tell it to remember something specific like who's my wife, who's my son, or how you want it to phrase answers. Or it'll try and pick up things over time, like your preferences. The feature is rolling out now and we expect people to love it or hate it in the near future. Well, it was just a matter of time, but reports are in that hackers are turning to ChatGPT and other large language models to make their phishing emails and attacks smarter and more customized than ever.

Microsoft and OpenAI are reporting that hackers are attempting to use their tools for research into targets to improve scripts, and to help build social engineering techniques. Now, while the companies haven't detected anything significant in terms of attacks with their AI yet, they've been shutting down a lot of accounts and assets associated with known hacking groups. Microsoft is also warning of future use cases like voice impersonation, another topic we'll be covering a bit later on with Amber Mac in the program.

Trestle Lab, an India-based company, has secured funding for its accessibility technology called Kibo, courtesy of Shark Tank India. The startup enables users to listen to, translate and digitize printed, handwritten or digital documents. And thanks to the popular TV series, it secured funding that will fuel the growth of its products. That's right, the sharks recognized the immense potential of Kibo and joined forces to seal a deal, which will provide them with the resources necessary to make the technology accessible to millions of visually impaired individuals worldwide. A Tinder-style app being used by teens is causing major concern after numerous reports of potential sexual extortion have been made to the Canadian national tip line.

Cybertip.ca is a site where Canadians can report cases of sexual exploitation of children online, and it's warning parents about the app, which is called Wizz, spelled W-I-Z-Z. The app touts itself as the ultimate online platform for random chats with people from all around the world and asks the user to swipe through profiles, randomly pairing them with strangers in video chats. Since its launch in 2021, the cybertip.ca line has received over 180 reports tied specifically to the app Wizz, involving sextortion and other luring-type incidents.

We'll dive into this deeper with both our guests later on in the show. Scary stuff here, Steven, but we have to tackle it head on. - Absolutely.

Can you hear me now? - I can hear you now. It's like you are a voice, a distant voice in my head, but now you are back. Welcome back, Steven.

- It's the great irony of doing a technology show live and guess what? Technology breaks. How typical is that? It's lovely to be with you this week, Marc. I hope you're having a wonderful week so far. Mine is going great. - Yes, clearly it's going awesome and you know what? You at home, if you want to get involved, every week we throw out a question and this week's question, Steven, is have you ever been a victim of an online scam? Have you ever been a victim of an online scam? - I have nearly been. Yeah, I've always got these emails that you get.

- [Marc] Oh, nearly. - [Steven] You know those emails, it says, oh, just, you know, open this link and sign in with your bank details. And they're getting so clever these days. That's the problem.

So smart that it's getting to the point where you're starting to worry that, is this really the bank that's contacting me? 'Course you gotta be so careful nowadays. - Yeah but you were telling me a story where your screen reader has actually saved you a couple times because it reads through all the links and everything, so you actually have a little bit of a heads-up on other people because you can actually hear the link read out loud. You don't have to really look for it. - Yeah so, you know, for example, let's say it's a link from apple.com,

you know the screen reader might notice that the L in Apple is maybe an I, so it might say appie.com and you think, hang on a minute, something's not quite right there. So, you know, yeah, there's one time that being blind is actually quite useful.
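What Steven describes here is essentially a lookalike-domain check: hearing the address read out exactly exposes that "appie.com" is not "apple.com". Below is a minimal sketch of that idea in Python, not anything from the show; the character map and example domains are illustrative assumptions only.

```python
from urllib.parse import urlparse

# Characters commonly swapped in lookalike domains, mapped to what they imitate.
# This tiny map is illustrative only; real homoglyph lists are much larger.
CONFUSABLES = str.maketrans("i10", "llo")

def normalize(domain: str) -> str:
    """Lowercase the domain and collapse easily confused characters."""
    return domain.lower().translate(CONFUSABLES)

def looks_like_spoof(link: str, expected_domain: str) -> bool:
    """True if the link's host resembles, but is not, the domain you expected."""
    host = urlparse(link).hostname or ""
    return host != expected_domain and normalize(host) == normalize(expected_domain)

print(looks_like_spoof("https://appie.com/login", "apple.com"))  # True: lookalike
print(looks_like_spoof("https://apple.com/login", "apple.com"))  # False: genuine
```

Real phishing filters go much further (punycode checks, edit distance, reputation lists), but the principle is the same one Steven's screen reader stumbled into: read the address exactly as it is, not as your eye expects it to be.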

You can get ahead of some of the scams a little bit earlier, which is good. - Yeah, I mean listen, they are getting really smart. I'm getting text messages that are telling me that my phone number's been updated for my online profile on this bank or that bank. I think the trick at the end of the day, and something we're gonna hope that you take away from this week's show is just be very cautious, be vigilant about, if you suspect anything, then just don't, just don't click.

Don't click on anything, okay? Just don't click on anything for the rest of your life. - And always call whoever it is. If it's a bank or it's a, you know, something that you are actually associated with 'cause that's the thing, they get smart now. So you know, for example, it could be RBC or something you might get a text from.

You think, hang on a minute, that is my bank. Just be very careful and always call the bank on the number you know, and ask them if they've contacted you. And more often than not, they will tell you no we haven't. - Yeah, no, I've gotten calls where people say, can you just let us know your address again? I'm like, wait a second, shouldn't you have that on file? Yeah, that seems a little bit suspicious. So let us know. Have you ever been a victim of an online scam? That's a question we're gonna ask throughout the show today and we're gonna get to your answers a bit later on.

If you want to connect with us on all our platforms, it is @AccessTechLive. Our email address is always there for you, feedback@accesstechlive.com and you can get involved. Steven, coming up when we come back, she's a tech expert, award-winning podcaster and bestselling author and she's a mom.

Amber Mac joins us next here on Access Tech Live. - [Narrator] There's more Access Tech Live to come. Get involved and have your say at Access Tech Live on social media. We'll be right back. (upbeat music) - [Announcer] The latest in tech and accessibility.

This is "Access Tech Live" with Steven Scott and Marc Aflalo. - And we're back on "Access Tech Live". Now, Marc, before this big football game of this past weekend, Taylor Swift was in the news, but not for any celebratory reasons. - No, Steven, a couple weeks back, we found out that a whole bunch of Taylor Swift deepfakes, where they put her head on the body of something that she probably didn't wanna be part of, was circulating online. And later, we found out that it was actually the result of a challenge from the cyber group 4chan.

So, that got us thinking about AI, and while we try to spin the positive sometimes, you gotta focus on the negatives more so for our protection, I think. And while AI isn't in the Terminator mode taking over the world, we definitely have to think about safeguarding it. So, Steven, joining us right now is Amber Mac. She's an award-winning podcaster, a bestselling author, the co-host of "The Feed" on SiriusXM.

She's a mother and she's an all-around tech expert. Amber, thank you for taking the time to join us this week. - Thanks so much for having me. - Now, Amber, the day after I emailed you, your newsletter came out and it was on the exact same topic, so I think our worlds were definitely on the same wavelength. Here's the softest question I could possibly start an interview off with, which is, what is your take on some of this AI seediness and how on earth we're gonna combat this? And when I say we, I mean the collective we, the world, the governments, the people, the everybody that's involved, here.

- Well, I think for all of us who have been on the internet for a really long time, we know that there has always been a dark side to internet technologies and internet communities. That has always existed. But I think what's different in 2024 in terms of AI is that the threat of deepfakes, both with audio and video, is targeting everyday citizens, including young people. I think the concern really is that, first of all, as a parent, I'm worried about the next generation being targets of deepfake content. And secondly, I'm not entirely convinced that we have the right leaders in place in both big tech as well as in government to really guide us safely down this path. So, it's really creating a perfect storm when the lives of young people, and especially young girls and women, are truly at risk.

- Now, stories came out just last week about Meta and likely others as well to start identifying images that are AI-generated. I guess that's an interesting start to this, but the question is, Amber, is it enough and is it even helpful? - Well, I think when we see some of these big tech companies trying to make moves to allow us to move into this world of artificial intelligence with more transparency, it is always a good thing. But I think it's also fair to say that those of us who understand tech really know that, yes, you can, perhaps, identify images that are manipulated online, but it's easy enough for someone to take screenshots and for those images to spread around the internet. So, it's not really foolproof in terms of that level of protection that people really need.

It reminds me a little bit of the days where we would talk about disappearing social media. Like, oh, you send a message, and then it disappears and no one can see it. Well, we all know that that could be screenshotted and it was easy to do so. I think, again, these big tech companies are trying to keep up with these changes, but I think it's just a drop in the bucket in terms of some of the things that they're offering to do, when really, at the end of the day, I think what we need, especially from those big tech companies, is more human moderators, more customer service channels, places that we can go when we have legitimate complaints about some of these deepfake videos and audio files that are impacting our lives. - Amber, you talked about not having necessarily the right leaders in place, and I think that goes for some of the companies, the big tech companies, some of the government as well.

What do you think it takes for that individual or the right people to be in place? What criteria do we need to have in place, there? - Well, I think it's fair to say that when we look at some of the social media companies that are running these online spaces around the world, there's not a lot of diversity among those leaders. I think we can start there. I think when you don't have diversity among some of those leaders in the social media space, what you get from them is a very narrow view of what experiences are like online. Those views often differ from what many of us are experiencing day to day. That's, I think, first and foremost. If we go over to the government side on the leadership issue, I think it's also fair to say that within government, we don't necessarily have that many leaders who truly understand digital in a way that is aligned with the impact that digital is having on our lives.

And again, this creates an environment where I think the average person is left out there living in the Wild West, unsure of what protections do exist and what avenues to follow if, in fact, they experience or encounter something online that is detrimental to their experience. - I wanna pick up on something you said earlier, Amber, about the disappearing messages. I think back to Snapchat, and of course, WhatsApp as well has that functionality. Are you saying that methods like that are not safe if we use, for example, incognito mode? If someone thinks, "Well, hey, I'm browsing on something maybe I don't want to be stored on my computer, so I'll go into incognito mode on my browser." Is that not working? Does it matter if we use those features, or is the data still presenting itself to these companies, so really, we're not making ourselves any safer by using those features? - I think it's good when people take precautions, and I would call them precautions.

I think it's good when you use some of the tools that do exist to further protect your experiences online. At the same time, I think people should understand that there is no safe experience online in terms of your data. If you don't want your privacy to potentially be impacted, you probably shouldn't be online at all. That's not realistic for most people.

But I would never tell someone that, "Hey, you know what? If you use incognito mode or if you use disappearing messages, you're 100% safe and no one can ever see those messages." All of us who understand the tech world get that there's always workarounds in terms of your information being shared. So, I don't wanna discourage people from using the layers of protection that do exist in the digital space, but I do want people to be aware that it's never 100% safe in terms of our online experiences. And when we approach these online experiences with that viewpoint, I think we start to better understand some of the risks that do exist.

- I think about our smart speakers as well, and there's always that question that comes up. Are they listening? Are they aware of what I'm saying? What's your take on this from your understanding? Because there are people who will say: Well, I don't go online. I don't search that much.

I don't spend a lot of time using websites that are nefarious or doing anything dodgy. But when I go on to Facebook or I go on to Twitter, an ad will pop up about something I was just either thinking about or I've been talking to a friend about. - I think it's really important for people to weigh the benefits versus the risks with all technology.

I speak a lot across the country about emerging technologies and I talk about smart speakers a lot. And I often get that reply that, "Oh, my gosh, I would never have a smart speaker, because it's listening." Well, guess what, that's actually how it works. It has to listen to you. That's how the technology is designed, so when you prompt it to wake up, it is listening to what you're saying, and then it responds. And so, that's just how it works.

But there are so many instances in terms of accessibility where voice assistants are hugely beneficial, even to seniors, for example. I think of my father-in-law. He had a fall a couple years ago and he used a smart speaker to call his daughter, and he was not near a phone. And essentially, that allowed him to have this vehicle to be able to get in touch with a loved one, and then the ambulance came. So, the benefits in that situation really outweighed the risk.

Was he worried, living alone, that the speaker was listening to him? Well, not really, he wasn't talking to anyone. So, we have to be intelligent about how these benefits outweigh the risks and accept that there's always risks to anything. Just like when I drive on the 401 in Downtown Toronto, there are really big risks to that, but sometimes, I have to get to where I'm going.

- Amber, what do you think we can do about some of the inequality when it comes to digital access? I mean, you think about the accessibility community, and the fact of the matter is, when you look at the numbers, that not everybody has equal access to devices or the skills in which they need to even identify the things that can keep them safe. Are there things we can do other than having conversations like this? - Well, I think when we talk about access to technology, there are a number of issues. And so, it's kind of like peeling back an onion, right? And the issues include things like access to the internet. We know there are still lots of communities across the country that don't have adequate access to the internet, and that creates a digital inequality. I mean, that's something that we would all like to see fixed in the very near future.

Also, when we talk about access to newer technologies, you're exactly right. I mean, there's plenty of examples of how technologies are just totally out of reach for the average person. That creates another issue in terms of people not having access. When we talk about skills, I mean, think about artificial intelligence as just one example, and I think it's very clear to say that the leading companies in this country and around the world are doing as much as they can to ensure that they are ready for this transformation that's happening that's fueled by artificial intelligence, and they're upskilling the average worker. Do I think that that upskilling is happening in call centers where people answering phones are gonna lose their job any day now? Probably not.

Unfortunately, as much as I love technology, I love the accessibility that technology does provide, I think in the future, unfortunately, I think we're going to continue to see this large gap in terms of access to the technology and to the skills in the digital space that can all help us really advance in our lives. And that comes back again, I think, to having leaders in place who understand that the skills gap is one example, the infrastructure gap, and really trying to close it, because there's no excuse. I mean, the digital world is not new, right? It's decades in the making. - Yeah, I mean, I do feel, I must admit, when I hear from younger generations, and I don't mean children.

I know we're gonna get into talking later about parents and children and staying safe online and all of the important stuff around that. But I think the younger generations, the teens and those in their 20s today are far less concerned about privacy than perhaps generations like mine are and were. Do you feel that there's a change in the air? - I think there's absolutely a change, and you're bang on in terms of the next generation and how they view privacy. I think they're much more open and accepting of the fact that, hey, again, I wanna use these services, I wanna use these tools, I wanna connect with my friends. And there's a trade-off.

- I mean, even look at the app Snap as one example for young people. I mean, one of the most popular features within that app is their maps. That means at any time, my 15-year-old son can see where all of his friends are at a given moment, just clicking and seeing where they are around the city or across the country. To me, it feels like you're exactly right. I don't know if privacy is something that they perhaps take as seriously, but I think there's a downside to that as well.

I mean, Snap Maps, for example, really increases that kind of FOMO among the next generation, the fear of missing out in a very real way that is causing mental health issues among young people because at any moment, they feel like they're not included. That's the last thing that this generation needs. - Amber, stick around. We're gonna take a quick break.

When we come back, I wanna talk about that parent inside of you and tell you a couple stories that scared the crap out of me. This is "Access Tech Live". We are in conversation with Amber Mac. We'll be back in just a moment.

- [Announcer] We wanna hear from you. Follow us on social media and get involved at @accesstechlive. We'll be right back.

- [Announcer] The latest in tech and accessibility. This is "Access Tech Live" with Steven Scott and Marc Aflalo. - We're back on "Access Tech Live". Amber Mac is back with us again. - Now, Amber, you co-wrote the Amazon bestseller, by the way, "Outsmarting Your Kids Online", which is a perfect segue into a conversation I wanted to have, which is our kids.

You're a parent, I'm a parent. Steven has some dogs. We all have similar concerns.

And I'm gonna tell you two short stories. One story is about a friend of mine whose 13-year-old son was a victim of sextortion. Someone in his class made threats against him and put him into a deep depression, to the point of being borderline suicidal at times. He was scared to call the police, scared to talk to anybody, and it was over something that never actually even happened. Unfortunately, he was led down this path, but everybody's in a better place right now.

Along those lines, in schools these days there's a lot of conversation about access to cell phones. And there are two sides to that story. On one side, well, it's access to information in the classroom if they need it.

If something happens in the school, you can get in touch with your child to make sure they were okay. And there have been many instances like that, especially south of the border where there have been school shootings and people have had access to their kids. So there's a big debate going on as to how much is too much access, and what do we allow our kids to actually have in school.

Where do you stand on that? It's a hard stand to take, because I'm so torn in both directions here. - Well, first I will say that when I co-wrote "Outsmarting Your Kids Online", it was quite a few years ago. The book is totally out of date, don't buy it. (Marc and Steven laugh) - But I was kind of naive I think in some ways, because I thought, okay, let's layer on more technology to help to protect our kids from the dangers of the social media world.

And I think what I've learned over the years is that that is not the answer. I think the answer generally speaking for all young people is really two words, get outside. And I know that sounds really simplistic, but I think one of the things that we've forgotten in this whole conversation is that it's not just technology that we have to blame. It's that kids are sitting inside for eight hours a day, then they're going home and then they're on their screens.

They're not getting any physical exercise necessarily. They're not necessarily connecting with nature. All these things that we know are really good for your mental health. And so I would say to that question about whether or not we have too much technology in the classroom, I think the answer is absolutely yes. But I would add to it, and I'm certainly not a health expert, but I would encourage people who are deciding on curriculum to include movement, include outdoor activities for kids, include other opportunities to get away from that technology.

And I think we'll see that we would have happier kids in the long run. So I think all of us really have an unhealthy relationship in many ways with technology. And we could all benefit from less technology in our lives. I don't think you're gonna be able to solve this problem with saying, "Hey, no phones in the classroom," because the kids are on computers with many of their courses, right? They know how to get text messages on those computers. They know how to bypass security layers that have been built in. But I think that's not the answer.

I think the answer is, hey, do we need to rethink education? And do we need to rethink it in a way where we recognize that hey, nobody benefits from sitting all day, including adults, and maybe we just have to do things differently. - I love that you say including adults, because I'm a victim of that. I'm guilty of that, sorry. It's an interesting challenge. My son is autistic and he socializes electronically. That's how he socializes with his friends.

And he has trouble face-to-face, which is an interesting challenge, because we're constantly balancing the how much is too much versus, oh, but he's playing with a friend that he wouldn't be playing with otherwise. So we try to do that in our house, but it is definitely challenging. I think everybody has a unique situation as well.

On the classroom side of things, I mean, there was a story I heard last week of a girl who was singing in class as part of a presentation, and someone filmed it and it was out as a meme about five minutes later. So it's a tough balance, and I can't imagine dealing with it in the education system, where I know the conversation's happening now. As a parent, what do you do to try and encourage your child to get outside and be off technology? How do you balance that? - Well, I think everybody wants to kind of have this magical formula for this answer to how much technology is the right amount of technology. And the one thing that I did learn when we were writing this book and I've been researching technology when it comes to how young people are using it, is that there's a real difference between creation versus consumption. And when we talk about technology, for example, to me, in the category of consumption, I would include browsing endlessly on TikTok, right? You're just consuming content that's out there. You're not really being creative at all.

You're not collaborating or engaging. To your point about your son, I would say that video games are very social for a lot of kids, especially young boys. And I don't think that's a bad thing. I think it's great to be able to connect and have those relationships, because I don't think that falls into the category of consumption. The worst type of technology, and the one that we always try to get our son away from, is that consumption piece, when you're just sitting and scrolling endlessly for hours and hours. I would prefer that he spends more time doing things like learning how to use Cricut machines, which belong more in that kind of maker category of technology.

So creating things versus consuming things, collaborating versus consuming. I think we have to kind of steer them in that right direction and add that physicality back into their day if you can and if that can be part of how they learn. - I often get the impression there are two types of parents in this arena.

There's the parents like yourselves, like you two, who are very keen to get your kids out there and do things and have a healthy relationship with technology and maybe sometimes struggle with that. But then there's other parents who will gladly give their child an iPad or an iPhone and say, "Off you go, leave me in peace." At which point do these parents not realize that if you're gonna take the time to bring a child into the world, you've kind of gotta look after him or her? - Well, I think a lot of parents really do struggle with this question about using technology and how to guide their kids, because maybe they don't know how to use it in their own lives. Maybe they don't have sort of those guardrails in place for how they engage in the online space either. So I think it comes down in some instances to a real lack of understanding of how technology works, the impact that it can have.

And also I do believe that there's a little bit of a misconception from a lot of parents as far as how the platforms and the tools their kids are using, how those are protecting those kids. I will say first and foremost that a lot of these tech companies don't necessarily put kids first when they're designing some of this technology, so that's a growing issue. I don't really wanna be hugely judgmental, because I think there are those parents, like the single parents who maybe they're out working and their kid has nothing to do, but kind of watch the iPad and it's a harder ask for them to be that digital guide. But at the same time, I think that all of the research points in the same direction, more screen time, more mental health issues, and kids already are facing an overwhelming number of issues with their mental health, especially coming out of the pandemic. - Yeah, yeah, I'll tell you a quick story, Steven, before I was a parent, I remember having some friends over who had two kids at the time, and we were at the dinner table and one of their kids was being rowdy, so they handed the kid an iPad or a game.

And I remember being very judgmental at the time going, "How can you do that?" Fast forward to when you're a parent and you're in a restaurant and your kids just are just not, they're not behaving and sometimes there's the easy way out and the easy way out just makes it a calm situation for everybody. So it's very easy to judge. And I'm not saying that in a mean way, Steven, but it's easy to judge before you're in that situation, which is why you talk about the single parent Amber.

And it's a perfect situation, you have to really look at these things case by case, Steven. - Oh, absolutely, yeah. But I think it's also the responsibility of parents to look after the child they bring into the world. So, I think there's very much a responsibility here. And on that point of responsibility, Amber, I want to ask you about the responsibility of parents to understand the types of safety nets that the devices that they give their children have, because often parents, certainly ones I've spoken to in the past, say, oh, I don't have a clue.

I just give them the device and they figure it out. That's not good enough. Parents need to take responsibility there too, right? They have to actually take the time to look into what restrictions perhaps maybe need to be put onto devices.

- I think one of the best things that parents can do is that they should use the same tools that their kids are using. They should play a little "Fortnite", they should jump onto TikTok, they should download Snapchat. They should start to understand how those tools work.

I think it gives you an insight to your point into the experience that your kids are having online when you fully understand the tools that they're using, and you can kind of, you can nudge them in the right direction, right? To your point where all of a sudden if you say to a young person, okay, just take the iPad, do whatever you want. Parents have this tendency sometimes if they don't understand the digital space to kind of throw up their arms and say, "Hey, it's complicated, I don't get it, my kid is more tech savvy. I don't know what they're doing, oh my goodness." And they don't understand the full risks that exist. That was probably one of the most troubling things that I've seen over the years.

The kids who tended to get into the most trouble online tended to have parents who really didn't even understand that there were real threats in the online space. And that disconnect, I think, creates sort of this divide where it's almost impossible to guide your child if you have no sense of the path they're going down versus in the physical world. We know our neighborhoods, we know what streets are safe, we tell our kids where to go, how to cross the street, how to ask for help. We tend to guide them, but I think you're right. There are probably too many parents who've just said, "Hey, you know what? I don't know what you're doing. I'm not gonna get involved."

But let's face it, as we're talking about today, the risks are very real. - Couldn't have said that better, Amber. I actually encourage parents, as you do, to try those apps, and actually use the opportunity: if your child knows more than you, let them teach you. It's a good bonding moment too. It's a good way to kind of get them off the computer in that consumption mode and let them be a little bit more, empower them to teach a bit. Any final tips or tricks that you use with your son to keep him safe online, anything you look out to in the future that you're kind of worried about? - Well, I think that you actually presented probably the best tip that there is, which is ask your child to teach you, ask your child questions, ask them to send you the TikTok videos that they think are funny so that you can post them on your Facebook account or Instagram account. Try to build a relationship where technology really is that thread that can kind of bring all of you together.

I think that's the best tip. And I think secondly, I think I would just warn parents about this deepfake danger that we are entering into where all of a sudden audio and video can be manipulated. And I would encourage them to warn their kids and explain to their kids what these risks are and that there are repercussions if you're the creator of those deepfake videos and if they are the victims of receiving them, then talk to them about what paths they can follow. - It's a hard question to answer. Just a final one, Amber, in a word, what's the right age for a child to have a smartphone? - Oh God.

- My answer is none. (group laugh) - Amber, Marc, thank you. - It's never a perfect answer. - Thank you so much for coming on.

It's been really interesting talking to you. Thank you so much for coming onto the show. Tell people how they can follow you, find out more about what you're doing. - You can follow me at ambermac.com. I do a newsletter each Tuesday that's totally free at ambermac.com/newsletter. - And it is a great newsletter.

Coming up, how to prevent cyber attacks in the country? Well, our next guest is the associate head of the Canadian Center for Cyber Security and we'll dive in, stick around. - [Announcer] We wanna hear from you, follow us on social media and get involved at "Access Tech Live". We'll be right back. - Now back to "Access Tech Live", the latest in Tech and accessibility with Steven Scott and Marc Aflalo. - Hey everyone, welcome back to Access Tech Live.

I'm Steven Scott, Marc Aflalo is with me. And don't forget all the conversations that we are having, you can catch up with on AMI Plus and also on YouTube as well. Marc, a really interesting conversation today on the state of, you know, the nation when it comes to cyber crime and staying safe online.

- Oh, absolutely, and our next guest is gonna be, you know, contributing to that conversation. Rajiv Gupta is the associate head for the Canadian Center for Cyber Security and he joins us now. Rajiv, thank you for joining us. - Thank you very much for having me. - Yeah, great to meet you Rajiv.

I wanna ask you first off, how many acts of cyber crime happen in Canada every year? - So difficult to measure, right? So as a cyber center we always encourage voluntary reporting. So from our perspective, it's businesses that are calling into us to report cyber crimes. That is obviously very under-reported, so we don't usually publish the numbers. In 2022, we had said it was just over 200 ransomware events that we'd seen, but that's by no means all of cyber crime.

When you get into actual cyber crime and fraud, we worked very closely with the RCMP and the Canadian Anti-Fraud Center as well. And in 2023, they had said that there were over 62,000 reports of fraud in Canada, which is a pretty ridiculous number. - Yeah, I guess there's a fine line there, right? Because you know, getting a text with a link is technically cyber crime, you know, getting an email is a cyber crime. So I guess, where is that line? Is there a line, or is all fraud at this point just considered somewhat cyber crime? - Well, all of those digital, you know, like cyber malicious activities, they're crimes in Canada, so we always encourage, you know, report them to the police of jurisdiction as well as the Canadian Anti-Fraud Center. When it gets into businesses, you know, we would like to know from a cyber center perspective. We work with businesses whether they're critical infrastructure, government, et cetera, right? So that's really where we draw the line.

But at the same point in time, these are all criminal acts, but there's so many and there's so many variants, it's difficult to track on an ongoing basis within Canada. - Have we seen an uptick since these large language models and AI have become a little bit more, you know, I guess more publicly focused? Have we seen more uptick or is it really not measurable yet? - Well, it's hard to attribute the activity to actual large language models. We've seen, you know, the activity grow year over year from a cyber criminal perspective. Where we're seeing the, you know, AI involvement, this is something that we'd put in our national cyber threat assessment from 2022, which was a look forward into 2023 and 2024. And we had said that the usage of AI, in particular LLMs or large language models and generative AI, would be enabling cyber threat actors to actually, you know, pursue their goals, right? Which is really making money through cyber criminal means. We are seeing, you know, better-crafted spear phishing emails.

We know that cyber criminals are gonna use AI to augment what they do with respect to the whole, you know, ransomware ecosystem that exists nowadays. But at the same point in time, we're also using AI within the cyber center to make sure that we can protect both the government better and leverage our defenses better in that sense as well. So, you know, we're hoping to use it in a defensive way, but it's also enabling cyber criminals to pursue what they're doing. - See, I find this whole subject very interesting because we talk about cybercrime today on the show, and I think to myself, I don't consider a text message like Marc was saying, that a text message to be cybercrime. I just think of that as a text and I usually just delete it.

I wouldn't even think about reporting it necessarily. So is it perhaps the case that these numbers that you're talking about are actually way higher? - Absolutely. We know that, right? We think we only get a small percentage of what's really happening in Canada reported to us, both on, you know, the individual cyber criminal activity that happens to individuals, but also on a business level in terms of critical infrastructure as well.

So we know they're under-reported. We want people to report, you know, the more we know about the landscape, the more we can actually put out the right advice and guidance, the more government can actually make solid policy decisions and really recognize the scope of the problem. Very important to report as well.

But then, you know, we have put programs in place when you think, like you just mentioned the text message. So within Canada we're working with telcos on a program where you can actually forward the text message to 7726, and if it's a malicious message, then we will work jointly with the telecom providers to take down those websites and malicious links to make sure they don't impact other Canadians. - That is excellent.

That's a fantastic service to be aware of, and of course a great way for people to report it, which helps give you a bigger picture of what's actually going on. That is fantastic. You know, it might be an obvious answer to this, maybe not, but is there a particular segment of the population that's more susceptible to acting on messages or emails that they get? Perhaps, I mean, the obvious answer to this might be older people, but what's your take on that? What's your data on that? - I would say that we would hesitate to say that there's any segment that's not targeted, right? So we would say that the threat is there for everyone. Different people will be more susceptible to clicking on different things and the threat actors know that and they'll target their types of lures or the types of intent behind their messaging to actually attract that demographic to click on the messages. So we always say, you know, be skeptical of the information you receive, in a healthy way. You know, you have to be discerning of the messages you receive, question whether it's, you know, a real message, whether it's coming from a legit source, and use whatever context you have to make sure that you're, you know, double checking before you're clicking on any of these things.

- You know, I'm very aware of phishing, I'm very aware of the text scams that are out there, so I do my best to report, but I didn't think I was in the minority of people when I tried to report it. That text number again, was it 7727, you said? - You can just text 7 7 2 6. - 7726. That's brilliant because I was actually looking.

Just the other day I got, it was during the Super Bowl, I got a message that said, your phone number has been updated in your RBC profile, click here to verify it. And you know, I knew it from the start, you know, obviously I wouldn't get a text message like this, but they're just getting smarter. They're just getting smarter and more creative, and guaranteed if I clicked that link it would look really, really close to the RBC website when they tried to get my username and password. Other than phishing though, what other forms of attacks should we be on the lookout for that we may not be programmed to really see yet? - Well, I would say that phishing is a big part of it, but really phishing takes different forms as well. So sometimes it's just getting that message that makes you click on the link.

Another very common way that threat actors get access to accounts and other sensitive information is really by people giving up their credentials, right? So their usernames and passwords. So you have to be very careful in terms of where you're entering your username and password, how those are being collected, making sure they're at legitimate sites. And often a phishing email will have a link to a site that actually, you know, encourages you to give up your username and password or other sensitive information.

And then that information is resold on the dark web so that, you know, other cyber criminals can use your username and password for whatever they are going to do. If it's, you know, a password for an individual, it could be selling that account online. If it's a password that gives access to an organization, then that could actually just be creating access to another ransomware group that they'd also sell online through the dark web. And that just gives another cyber criminal group the ability to go and ransomware an organization. So I guess at the most basic level, it's protecting your identity and your password and being careful about how you actually use those credentials. - And they're really fast.

Like I've heard stories where you type in that username and password and within minutes they're already compromising your accounts. So if you did click on it and type something in, don't just act like, oh, nothing's gonna happen to me. These are things that you gotta be very aware of, you know, and I think that's important, Steven.

- Yeah, I agree, and I think, you know, there is another side to this though, I think that's important and it's the language I think around all of this sometimes that kind of confuses people and it probably doesn't help your cause either, because you're trying to make people aware of these issues, make people report them, but even the language around it could be confusing, right? I mean, when I was growing up, fishing was a nice pastime with dad. Nowadays it's, you know, it's taking over my email account and terrifying the heck out of me. How do you get across the language and the meaning of some of these terms we throw around? - It's all education, right? It's a new domain and some of us are, you know, newer to technologies than others, but it's about educating people right from the start. We have a Get Cyber Safe campaign from the Cyber Center where we're trying to push out, you know, information on how to protect your devices, how to protect your accounts, how to be safe online, you know, socializing the terms like phishing, smishing, vishing, these sorts of things, you know, but then also reinforcing all the really important elements of being safe online, which is understanding, you know, what controls are in your social media accounts and how you can lock those down, how you can get properly strong authentication with multifactor authentication and stronger passwords and these sorts of things.
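One concrete piece of the advice Rajiv lists above is stronger passwords. As a hedged illustration only, not a Cyber Center tool, here is a minimal Python sketch of a passphrase generator; the eight-word list is a toy stand-in for a real diceware-style list of several thousand words.

```python
import secrets

# Toy word list for illustration only; a real passphrase generator would draw
# from a published list of several thousand words (e.g. a diceware list).
WORDS = ["maple", "harbour", "lantern", "quartz", "meadow", "pixel", "walrus", "ember"]

def passphrase(n_words: int = 4, separator: str = "-") -> str:
    """Join n_words chosen with a cryptographically secure random generator."""
    return separator.join(secrets.choice(WORDS) for _ in range(n_words))

print(passphrase())  # e.g. "ember-quartz-maple-walrus"
```

Length and genuine randomness do most of the work here; pairing a passphrase like this with multifactor authentication covers the two habits Rajiv singles out.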

But it's all basically building up a cognitive digital resilience in our society. It's an all-of-society challenge and we have to educate ourselves. We have to know what's available, and we're trying from our perspective, in terms of the Get Cyber Safe campaign, and reaching out and doing shows like this as well, which is very useful.

And thanks for hosting me. - Well that's kind of the thing, right? We want to be able to get people at least to get to the point where they start questioning these terms. Questioning all of this and finding out about your service of course is a big part of that. Okay, let's talk about the worst case scenario here. Someone has clicked the link or tapped on the message and gone through and had their data vished and smished and whatever else, you know, what happens next? Is there any recourse here? What happens to someone who falls victim? - Yeah, so for individuals, we always recommend talking to the, you know, reporting to the police of jurisdiction. And then look at what the provider is, if it's financial, reach out to your bank, make sure you notify them of what's happened, see what recourse is available there.

The Canadian Anti-Fraud Center has lots of useful advice and guidance as well. So that's an opportunity to report to the Canadian Anti-Fraud Center as well. If it's a business, you know, whether small business or critical infrastructure, especially is where the cyber center comes in because, you know, we've seen lots of these compromises and often when we talk to large organizations when they get compromised by ransomware or some other sort of cyber actor, it's, you know, one of their worst days of their lives in terms of trying to get their organization back online while losing money and downtime if their systems have been put outta service from the cyber criminal. And the Cyber Center has seen many, many of these over the years, and we're happy to help.

So we always encourage people to reach out. We're happy to provide advice and guidance, we're happy to, you know, give them other information to help understand the extent of the compromise within their organization. And even if need be, we can actually send people to site to work with them to help them mitigate and provide them the advice and guidance that they need possibly to properly recover from the event in a safe way. So there are different scopes, I guess, in terms of whether it's an individual, a small business or a large critical infrastructure provider. But definitely, you know, there are different elements within government that can certainly help, with local police being important for the average Canadian.

- It's great to know these resources are out there. And so we got the text number. What other ways can people report crimes today if they wanted to? - Yeah, so crimes, I mean the police of jurisdiction, right? So depending on what jurisdiction you're in, but then the Canadian Anti-Fraud Center, and then the cyber center, which you can definitely, you know, contact at cyber.gc.ca or 1-833-CYBER-88. We tend to focus on businesses, but even if it's not business, we'll make sure the person gets routed to the right other government organization to make sure they get the help that they need. - Rajiv, thank you so much for taking the time to join us.

This has been enlightening. You're welcome back anytime you need to help spread some of that message, we'll gladly do that for you. - Well, thank you very much.

- When we come back, we get to your answers to our question of the day. Stick around. This is Access Tech Live.

- [Announcer] There's more Access Tech Live to come. Get involved and have your say at Access Tech Live on social media. We'll be right back. (upbeat music) - [Narrator] Now back to Access Tech Live.

The latest in tech and accessibility with Steven Scott and Marc Aflalo. - And let's get to our question of the week. Today Marc, the question is, have you ever been a victim of an online scam? What kind of response have we had? - [Marc] Vicki wrote us a good one here. We were scammed by a very clever website advertising cheap barbecues. Payment was via credit card with a 2.99% fee,

or direct transfer with a 5% discount. Then we received an email saying that due to logistical issues our order was canceled and being refunded. Not surprisingly, the money has not been refunded. Very clever approach.

The first time I've ever been scammed. Can't say I haven't seen that one before. James wrote, I was contacted by phone. The perpetrator was able to convince me that he was working for Telstra and that my internet, my IP address was compromised. I was able to help him by sending money overseas.

This trap would catch the hackers. We would deposit money into my savings account. I would then use that money to send overseas via MoneyGram at the local 7-Eleven store. It was important that I did no internet banking during this time for security reasons.

I did this several times until I became suspicious and checked my bank balances. He'd been getting cash advances on my credit card and depositing money into my savings account. I immediately reported it to the bank.

That's like, I mean, my fear, Steven. This is what I try to teach my kids. And the last one here comes from Ellis, who says, the scammers called my mother-in-law's phone advising that they were from Revenue Canada and that she owed a $4,000 tax debt. My mother-in-law was very worried as they were threatening her, saying the police were coming to arrest her.

She purchased $3,000 in iTunes gift cards, $500 in Google Play cards, and $500 in Steam cards. And then they sent a WhatsApp message requesting her to send photos of the cards, which she did. Then of course they requested more money and more cards, which she declined, and she contacted family. That's, I mean, again, you know, they're not dumb. These scammers really know what they're doing and they know how to target, you know, a specific part of the audience, and you know, they throw things out and hope it sticks. If they send a million of these out and even 1% come back, they've won. You know, and that's what we're trying to stop.

Them winning at all. - [Steven] Yeah, absolutely. - Steven, next week we're going to Vienna. - It's so important that we talk about it though, right? We gotta talk about these things.

You're absolutely right. - We're going to Vienna next week. We're going to be at the Zero Project conference in Vienna, Austria, in person, you, myself, Shaun Preece, a bunch of people from AMI.

Two live shows Wednesday and Thursday 12:00 Eastern. It's gonna be a lot of fun. You were there last year. What was the ambiance like? - It's an incredible event and a great opportunity to learn about what goes on outside of the bubble of Canada, the US, the UK, Australia. You know, it's nice to talk about countries that are doing amazing stuff for disabled people across India, across Africa.

You know, we often hear negative stories out of these countries. It is great to hear positive stories, especially around disability. And that is exactly what the Zero Project is all about.

The conference is all about rewarding those people doing great things. So lots of great stories to come on our next two special episodes of Access Tech Live, and we'll be there for Double Tap as well for those fans of that. So you can absolutely check us out here on AMI TV, and of course on AMI+, YouTube and AMI audio; we're simulcasting on there as well.

So you can check us out everywhere. We're taking over. - We're taking over. When do you get in? You get in Tuesday.

So we're gonna be all primed and ready for next Wednesday. It's gonna be a fun time. I'm sure we'll be tired when it's all said and done. - Check out the fried chicken places for me beforehand and then we're all good. - Oh my God, fried chicken places.

Everywhere we go Steven wants KFC. We could be in the most gourmet place in the world and all he wants is KFC. - I don't care. - I don't care. Well, thank you obviously to Amber Mac for joining us this week and sharing her perspective. We definitely have to have her back.

Rajiv Gupta, thank you so much for being here as well. We'll share all that information on social media so that you can make sure that you are protected, and spread those messages to your friends and family so they report fraud when it happens, especially cybersecurity incidents, so we don't have any situations like these in the future. We don't want to hear any more scam stories.

On behalf of Steven Scott, I am Marc Aflalo. Thank you for joining us this week and every week. We'll catch you live from Vienna next week here on Wednesday at 12:00 Eastern. - [Narrator] Thanks for tuning in to Access Tech Live. Follow us online on all social media @accesstechlive. Email us feedback@accesstechlive.com.

Hosted by Steven Scott in Glasgow and Marc Aflalo in Montreal. Written by Steven Scott and Marc Aflalo. Producer - Marc Aflalo. Live Show Director - Anastasia Spalding-Stenhouse. Technical Director - Kaitlynn Robinson. Audio - Jordan Mulgrave.

Live Graphics and Playback - Kingsley Juuko. Graphics Coordinator - Eliza Rocco. Integrated Described Video Specialist - Em Williams. Supervising Producer - Michelle Dudas.

Produced in collaboration with Aflalo Communications Inc. and Double Tap Productions. Copyright 2024 Accessible Media, Inc. An AMI Original Production.
