
Aristos Education Services
Episode 3 - Cyber Safety and Privacy in Education

Michael
Welcome, and thank you for joining us. Today, I have a special guest. I'm joined by Claudio Popa. Claudio is the CEO of Informatica Security and founder of Knowledge Flow Cybersecurity Foundation, which is a nonprofit designed to raise cyber situational awareness in the home. I met Claudio through a roundtable discussion on misinformation and disinformation for Project Defuse, which he co-hosted in conjunction with NATO. One of the focuses of the Knowledge Flow Foundation is to teach safe online interactions for school-aged children, and cyber safety and privacy in education is our main topic today. Claudio, thank you for joining us.
Claudio
Well, thank you for having me on the show. It's a pleasure.
Michael
So before we dive into the topic of education, last time we spoke, you mentioned the concept of dark patterns, and I was hoping you'd just say a little bit about what dark patterns are.
Claudio
Oh, well, thank you for asking. Dark patterns are something I've been fascinated with for about fifteen years now. They were one of the reasons I decided to write the Canadian Cyber Fraud Handbook, which was published by Reuters, but that's for a legal audience, for law enforcement. Dark patterns have been an offshoot of the pressure put on marketers to deliver results. We've seen the internet as a whole turned into a for-business, for-profit tool, much to the horror of people like me who have been online since the green screens back in the late eighties and watched the web come up. You used to be able to do transactions and make your own decisions, but with the more advanced capabilities of the internet and the new languages behind the World Wide Web, we've seen capabilities for marketing and for influence take shape. That kind of influence has been tremendously useful for businesses, small and large, but usually larger ones. I have a presentation in which I took a screenshot of an Amazon page, and it shows five different elements of the page that all converge to get you to buy more, or to buy a particular product, or to be influenced to have a positive view of a particular decision that you should be making right this moment. So essentially dark patterns are influences that you do not perceive to be happening and that often have a financial driver behind them. And we can certainly provide examples. They range from something as innocuous as literally showing up to pick up your pizza and finding a tip request. And you're like, well, this is a pickup; we're not sitting down to eat at a restaurant. Why has a tip been added to my bill? Oh, it's actually a service fee for us to cover operating costs. Well, that's a significant service fee at fifteen or eighteen percent, which sounds like a tip amount. But online and offline, these things are now normalized.
And that's a concern, because we have seen not just Amazon do this type of thing. I often get calls from my own mother, who says, you know, my antivirus is expiring and it's asking me for money. And I say, no, no, just click the free option and stick with that for now until I've had a chance to review it. And she says, no, there is no free option. And I look at the screen and the free option is clearly there, but it's grayed out. I keep seeing more indications that there's a subset of the population that only sees highlighted things on screens. It's very difficult for people to actually perceive grayed-out buttons and prompts, words that simply say "download free" in gray on light gray, versus "buy now" in bright yellow with a contrasting text color. And so she feels that only the buy now option is available. It's not just this type of deceptive practice, but the anxiety that goes along with it for a certain subset of the population, that led the Electronic Frontier Foundation, a nonprofit group that protects people's digital rights and privacy, to partner with Consumer Reports to create a tip line where people can report the dark patterns they encounter. If you're looking for it, it's at darkpatternstipline.org, which is very interesting. I found out about it myself when I was preparing for our interview today. It's a growing aspect, and it's a type of deception that unfortunately just adds to all the other baggage we carry whenever we go online. Everything is trying to influence us, and whether that's ethical or not, marketers under pressure sometimes don't seem to care, and they try to push the envelope.
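The grayed-out "free" option Claudio describes can actually be quantified. The WCAG 2 accessibility specification defines a contrast ratio between two colors, with a 4.5:1 minimum recommended for normal text. A minimal sketch of that calculation, using hypothetical hex colors chosen to mimic the pattern (gray-on-gray "free" text versus a bright "buy now" button):

```python
def _linearize(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light (WCAG 2 formula)."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color: str) -> float:
    """Relative luminance of a color given as '#RRGGBB'."""
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linearize(r) + 0.7152 * _linearize(g) + 0.0722 * _linearize(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio, from 1:1 (invisible) to 21:1 (black on white)."""
    lighter, darker = sorted([relative_luminance(fg), relative_luminance(bg)], reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Gray "download free" text on a light-gray background: barely perceptible.
free_option = contrast_ratio("#999999", "#DDDDDD")
# Dark text on a bright yellow "buy now" button: impossible to miss.
buy_now = contrast_ratio("#222222", "#FFE000")

print(f"free option: {free_option:.1f}:1, buy now: {buy_now:.1f}:1")
```

With these colors the free option lands around 2:1, well below the 4.5:1 threshold, while the buy-now styling exceeds 10:1, which is one plausible mechanism for why a reader's eye never registers the free choice.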
Michael
Right. So these are design choices in the presentation of information that leverage what's known about human behavior, psychology, and tendencies to try to direct us towards the outcome the company or the marketers desire. So just to play out your example a little more: you might join a marketplace looking to shop for a particular item, and you think you're surveying the options and making an informed decision based on what's presented to you. But in reality, the website is actually funneling or otherwise manipulating you towards an outcome they desire, rather than letting you have free rein among the available choices. Does that sound like a fair description?
Claudio
That's exactly right. And it uses psychology mixed in with misdirection to get you to make a decision, and not just the particular decision they're looking for, but that particular decision right now. That is really the goal, because keep in mind, this is transactions per hour. That's the metric most organizations care about. And this is why there's such high pressure, when there's a data breach, to stall investigative reports. As part of my business in information security, whenever there's a reported data breach, there's an entire communications program designed to delay providing information about the breach for as long as possible, because it impacts sales per unit of time, and that's expensive. We've seen it in a number of organizations. Most notably here in Canada, we saw Indigo taken offline for a month. This was an e-commerce operation that brought in something like a billion dollars, or hundreds of millions; I may be completely wrong on the numbers, but all that to say, as soon as that revenue is interrupted, it's a big deal. So if people perceive that there's no incentive to buy, they simply walk away and give their money to somebody else. So that's the impetus for a lot of this. What we perceive to be misleading and deceptive, they simply perceive as efficiency and effectiveness in e-commerce transactions.
Michael
Right. So there seems like a willingness to sacrifice the autonomy of the consumer in the name of the desired outcome for the company.
Claudio
That's correct. It brings about an entire set of discussions: is it fair to influence someone's decision process using psychology, misdirection, et cetera? Do they still decide willingly and intentionally, or have they just been influenced by somebody who's an expert at this type of thing? And who's accountable?
Michael
That's very interesting. Probably should dive into that on a different day, but thank you for describing that to us. It is a very intriguing concept, something that people need to be aware of, I think.
Claudio
I'm sure our listeners can think of any number of these dark patterns, because sometimes you emerge from a website and find you've just ordered a whole bunch of stuff. Amazon Prime members are typically primed to keep ordering, because otherwise two days suddenly go by and nothing arrives at the door. It's a similar situation to staying up late at night watching infomercials and finding yourself buying something. You certainly were not planning on staying up late and buying stuff, but there you are.
Michael
Thank you for that. Let's pivot now and talk about our intended topic for today's podcast. You're an expert in privacy concerns, specifically as they relate to children and education. This was already happening before COVID, but COVID certainly accelerated the process of incorporating digital tools into our classrooms. I was wondering if you could say a little about some of the concerns you're seeing in the way digital tools are being used, implemented, or introduced in education.
Claudio
Absolutely. I think it's always been inevitable, and quite frankly desirable, for technology to enhance the way we educate. Technology has always been an enabler; it's something near and dear to my heart. I've created and designed tons of technology tools. And it's always been the intention of public and private education to leverage tools. Right around the turn of the century, twenty-odd years ago, we saw more and more programs that are both engaging for students and able to track the performance of a student population over time, so that you're able to extract some sort of statistical data based on the usage patterns of the software running on school board equipment, which used to be some of the safest in the world. Anytime I visited a school board, I saw that they had servers, some heavy-duty servers, mainframes, et cetera, and some regular Windows machines that were secured and lived inside server rooms, with people who took care of these things. So the care and feeding of these machines was always handled very carefully by qualified IT experts and specialists. Then came cloud services and software-as-a-service. Software-as-a-service had actually been coming for a long time; we called it the ASP model for a while, and it just means delivering technology services over the internet. Instead of having programs running on your computer, suddenly you open a browser and there's the program, and it looks like a simple interface. That simple interface is, technically speaking, a website. It's served up from a server room somewhere. It may be in our country or another country. That's why we call it the cloud: it obfuscates this entire idea of where the data is. So that's the beginning of the slippery slope. When people stop asking, where is my data, you know that there's a vast risk looming, because this is what the word cloud wants you to do.
It tries to make you forget that all data is stored somewhere. So what we typically say in our business is: the cloud is somebody else's computer. You can get stickers that say that; just type it into Google Images and you get your sticker. It's a reminder that anytime you put data into the cloud or into any website, it's certainly not on your computer, so it's outside of your control instantly. In the years that followed, tons of organizations started putting a lot of effort into developing these website-based educational tools. They initially did exactly what regular educational technology did, and then they started expanding and getting more ambitious. The reason for that ambition is that venture capitalists and investors came along and said, well, it would be nice if we had some data from demographics that we typically aren't allowed to collect, which is to say children, minors, students. That data is definitely something they can't directly exploit. But what about indirectly exploiting it? What about just knowing how many children are in a particular jurisdiction? What about knowing a distribution of ages, or seeing patterns in their likelihood to click on things? Eventually the industry matured to the point where we were looking at tens of millions of dollars being invested into the e-learning space, and these venture-backed education companies took a turn for the worse, for two reasons. One, they started aggregating information without ever offering to delete it. There was no data retention policy in place. The data just goes one way: it goes into a system, and that system is under the control of an external company. The case was made for school boards to not need server rooms anymore. And if they don't have server rooms, they have no need for IT specialists to manage them. So they lost a lot of brainpower and expertise at that point.
So suddenly, you're depending on an actual vendor, which is usually a small company that has just started in the past few years, or in some cases, months. And you depend on these developers, who are not security or privacy specialists; they are software development professionals. They work as part of a group to create an interface that is pleasing, so that people will enjoy using the technology, but they are funded for a specific reason. The funding goes into them because they are able to have touch points with specific information collected from students. This is important because the agreements, the legal agreements, the contracts put in place with the school board can never be changed. The reason they can't be changed is that they are tied to the investment that goes into these e-learning companies. So the companies come out and say, look, we are going to monetize the data at some point. We may be able to sell part of this company, or the entire company, at which point all of this data will go along with it. They have a stated objective to share this information with as many connectors as they can. Those connectors are external companies that school boards and schools have no visibility into. They simply provide little tidbits of code and additional services. They are part of the supply chain of the vendor, of the e-learning company, and so they're not even included in the agreement and the contract. This is how far the personal information, or perhaps the statistical data, collected from students can go. It can go until you can no longer see it. Then, in the year of the virus, we saw the beginning of massive data breaches. We saw that e-learning companies did not have adequate security in place, and it resulted in tens of millions of students' data being lost, being breached as part of these security incidents, which resonate to this day.
I was doing a TV interview a couple of months ago, and the cameraman actually said, well, that's an interesting topic you're speaking about. My son and I are still getting spam that resulted from a data breach that took place. He was with one of the largest school boards in the Toronto area, and my son's information was stolen. And I said, well, that was quite a few years ago. And he said, yeah, we've been getting spam ever since. The data was apparently leaked and released online, which is why this is such a big deal. It's a bigger deal than information belonging to adults: these are young people who may be impacted for tens of years, for a generation. So anyway, the breaches began in twenty sixteen, and they continued right up until twenty nineteen, when companies started becoming aware that their reputation was being impacted by these losses, by these privacy breaches. And of course, there was additional pressure from regulators to say, hey, we appreciate the innovation, but could you please tighten up your security practices, because it's impacting children in so many different ways. So that, in a nutshell, is the evolution of e-learning in public education in particular. There really hasn't been a concerted effort to figure out procurement. The procurement question is the crux of the problem, because you have a number of incentivized individuals who build relationships with the school board. They sell technology to administrators. Administrators are not security or privacy folks, so they don't know what questions to ask as part of the service contracts, and they don't really have much of a grasp of the actual harm that can come from data being stolen. I presented to a number of law enforcement teams and said, here are some screenshots, just so that I'm not doing all the talking. These were screenshots of cell phones.
They were text messages sent to mothers saying: I can see that your child is in such and such a classroom. You're not going to see them again. There's going to be a bomb attack on the school. We know exactly where they are at any time of day. You need to pay us now or you'll never see your child. So, of course, it's extremely stressful, and the accountability for this is not just on the part of the criminals; it obviously needs to be shared by the decision makers who adopt these types of insecure technologies.
Michael
Yeah, absolutely. To that point, I know that you work with the Privacy Commissioner. Is it the case that they were just behind the times in terms of having legislation in place that requires school boards to vet these tools, or that limits how the data can be used by these venture-backed education companies? What was the role of the Privacy Commissioner there?
Claudio
Well, that's a very interesting situation. My previous work with Privacy Commissioners has seen a number of activities take place. One was the design of best practices for handling personal information, for kids, for adults, et cetera. That falls within the scope of privacy by design, which has everything to do with adopting privacy and security for their own sake, as opposed to making the case that, hey, we are in business, we need to maximize profits, but we will comply with the law as much as we can. In this particular case, we're talking about school boards that are regulated provincially, so it's provincial legislation that typically deals with privacy. Well, that kind of provincial legislation used to be very innovative back in the early eighties. But it predated e-commerce and the internet itself, and even some bulletin board systems, BBSs, if any of our listeners go back that far. So we're talking about the kind of legislation that is structurally meant to protect, you know, names and addresses. Back in those days, when we used to buy, let's say, a computer program from a store that sold such things, we would get a little card in the box, and the card said, please fill out this card so that we can know our customer, and we'll send you an update, because we appreciate your paying for this thing. That is a form of consent that today is frowned upon, because just look at the opportunities for friction. Look at how many people do not return that card; all of those are seen as wasted opportunities. Back then it was seen as a nice option, a way to know your customer, and if they didn't want to return the card, then so be it; we've already got their twenty-two dollars. Now, companies need to be able to extract information in real time and potentially monetize it in some way or another. The legislation, municipal or even federal, basically said you need to care about personal information wherever possible.
Try to ask for consent, but if you can't, it could be passive: you could opt people in and then maybe give them an opportunity to opt out. Of course, all of this hinges upon the great assumption that people understand what privacy is, what control over their information might be or entail, and what the harms are in having your data in the custody of someone else. There's the simple fact that when somebody has your information, they don't have your information, they have a copy of your information. Therefore, you do not necessarily suffer at that particular moment from somebody else having a copy. But of course, if everyone has a copy of your information, your information has less value. I was having this discussion yesterday with a company that creates identity management software. They have three different identifiers, and they use that as a selling point, as a differentiator. And I explained that the more of their partners that information is shared with, the higher the risk of identity fraud, because with that many identifiers, you're essentially inviting somebody to clone the identity of a victim. The law has not been in place to curtail that type of data collection. It has not been in place to force people to request express consent, never mind informed consent, which is verbiage that is used now at the federal level. How do you make sure that your customers are informed but not scared? Because there's this concept of data salience, or privacy salience, that says the more you talk to people about privacy, the less likely they are to share their information. So if it's your duty to inform people, then you're shooting yourself in the foot by providing them with information and choice. And that is the big ethical question for marketers today.
Michael
Yeah. One of the topics we talk about in my class is not just the importance of informed consent as a goal, but its practicality. Some people say that, given the recursive, dynamic nature of information dissemination, packaging, and resale, even if you could clearly convey to a client at a given point in time exactly how their information is going to be used, that story might not be true a week from now, right? As you mentioned, there are different connectors, there are third-party relationships, where this information is in a vast swirling whirlpool of intermingling with other information, repackaging, and connection to other things. So, as you said, you could inform a client of how you're using their information now, but without safeguards in place, you have no idea how that information is going to be used in the future. And so informed consent becomes exponentially more difficult: first you have the obstacle of explaining to somebody who's not an expert what's going on with their information now, but you're also in the impossible situation of trying to give an exhaustive account of how that information might be used in the future, which, even if companies have good intentions and want to do that, which as you point out they're disincentivized to do, may not be practically achievable.
Claudio
Absolutely. To my way of thinking, when you ask organizations to get informed consent, you're essentially asking them to find loopholes around that verbiage. And if it puts them in an ethically challenging position, they don't mind, because their loyalty is to the investors, for instance, or to their growth objectives. What I prefer, and what I recommend, is for informed consent to be accompanied by a list of privacy harms. If you can list the categories of privacy harms, you are building credibility with the audience. You're building credibility in your own space, and you would by default become a privacy champion as an organization, simply by saying, you know what, there are a number of harms that come along with this. You don't have to list out scenarios, right? There are infinite scenarios. But there's almost a finite set of types of harm: for example, economic harms, discrimination harms, autonomy harms, physical harms, reputational and relationship harms, psychological harms. These are serious-sounding things, and if I read them in a privacy policy, I would say, oh, well, that's interesting. There could be a physical harm in the simple act of me agreeing to allow my information to remain in the custody of this organization, right? Because they've already collected it; they're just notifying me that there are such harms. So maybe I can start there and look into what scenarios might unfold. This is much better than empty use of the words express consent: I consent to sharing something and I accept all risks. How can I accept all the risks if I don't even know what the risk categories are?
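The appeal of a finite harm taxonomy is that it can be made concrete and even machine-readable. A minimal sketch of what a consent notice built on Claudio's categories could look like; the enum names, the tool name, and the rendering format are all hypothetical details invented for illustration, not any real consent framework:

```python
from enum import Enum

class PrivacyHarm(Enum):
    """The finite set of harm categories, as a machine-readable taxonomy."""
    ECONOMIC = "economic"
    DISCRIMINATION = "discrimination"
    AUTONOMY = "autonomy"
    PHYSICAL = "physical"
    REPUTATIONAL_RELATIONSHIP = "reputational/relationship"
    PSYCHOLOGICAL = "psychological"

def consent_notice(tool_name: str, harms: set) -> str:
    """Render a consent notice that names harm categories instead of
    hiding behind the empty phrase 'I accept all risks'."""
    if not harms:
        raise ValueError("informed consent requires at least one named harm category")
    lines = [f"Using '{tool_name}' means your data could be exposed to:"]
    lines += [f"  - {h.value} harm" for h in sorted(harms, key=lambda h: h.value)]
    lines.append("Do you consent? [yes/no]")
    return "\n".join(lines)

# Hypothetical example: a math tool whose vendor identifies two harm classes.
notice = consent_notice("MathTutor Online", {PrivacyHarm.ECONOMIC, PrivacyHarm.PHYSICAL})
print(notice)
```

The design choice mirrors the argument in the conversation: scenarios are infinite, so the notice enumerates only the closed set of categories, and refuses to render at all when no category is declared.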
Michael
That's an interesting approach; I think it's possibly fruitful. So with categories, instead of needing to spell out in detail exactly what's going to happen, you might be better positioned to explain the possible nature of the harms, right? Like you said, there could be physical security concerns with the type of information they're sharing. That's interesting.
Claudio
Again, we always get inspiration from somewhere, and I got this idea because, if I say it in a vacuum, it sounds outlandish: who's ever going to adopt this? But it already exists today, when you buy shares on the open market. You buy a couple of shares of a particular stock and it comes with a prospectus. In my case, I was reading a prospectus that was four hundred and twenty-seven pages long to figure out why the stock dropped, and in it, one hundred and sixty pages were dedicated to risks associated with the investment. They were itemized exhaustively in a printed book that I have. That, to me, is an ethical approach to informing people about their investment in a company that could go to zero. It's the same with educational technology: your investment can not only deliver no value from an educational perspective, it can go beyond zero, in that economic harms, physical harms, and all the other classes of harm can actually occur during a breach, during mishandling, during monetization of your data, et cetera. This could actually ruin your life. And that should be made clear, not in those words (hey, you're about to click on a button that will ruin your life), but if we're looking for an effective use of the term informed consent, then the ethical thing to do would be to list the categories of harm that can come as a result of your accepting the risk. Because keep in mind, that is what they're always doing: they're asking you to accept the risk. The problem in the public education sphere is that administrators do not understand the risk. Teachers are not educated to identify it, or to exercise their autonomy and say, you know what, we read the privacy policy and we don't think this is appropriate for our classroom. Parents are never asked, and students are certainly not in a legal position to accept or reject a privacy consent form.
So there is an ethical dimension to the entire e-learning stack, the technology innovation stack, because the word innovation comes up a lot. There's a ton of pressure to modernize public education processes, and if you don't, then you're a laggard. There is an ethical stack here that should be looked at in a layered way. And the lack of leadership, particularly from regulators, shows that as a country, or as an economy, we're not quite able to protect individuals. We are prioritizing growth and profit by monetizing the personal information of those most vulnerable, who don't even have a voice: children.
Michael
Yeah, that's an important point you raised there, and I just want to touch on it for a moment. Privacy is a valid concern for people considered fully autonomous adults, but in the situation of school-aged children, it's further complicated by the fact that the parents are surrogate decision makers for information that's ultimately about the student. That further complicates things, and I think it's another area worth diving into. But for today, let's refocus on education, as you mentioned. If you had the ear of the Minister of Education, or the various ones across the provinces, what are some recommendations you would make to address some of these concerns you've been raising?
Claudio
One big one goes back to our discussion about dark patterns. At the beginning of each school year, a letter ideally arrives, because parents need to be aware of which technology tools are going to touch the personal information of their children. That entire process is fraught with ethical concerns, and anyone can see it. For one thing, there's a delay: when this letter arrives, the school year has already started. Secondly, the letter contains only a tiny bit of information about each tool, and each description downplays the impact. It does not talk about harms. It doesn't talk about the processing that takes place within these e-learning tools. And the biggest issue is that it's an incomplete list. For example, at our school, we have something like a hundred tools that are used as part of the e-learning ecosystem, and we are asked about three of them. Are you okay with using this math program, which has been endorsed by such and such? Yes, no. Keep in mind that if you say no, your student will be excluded from such and such activities. So there's a ton of the kind of pressure that may be considered a dark pattern, both in the verbiage used in these forms and in the implied threat of exclusion that comes with parents exercising their right not to share that information. There's a ton of peer pressure that happens: all the parents are saying, what do you mean, you opted out of that? Why would you do that to your child, et cetera. So there are concerns with the consent process, and that's one thing I would definitely speak to the Ministry of Education about. There are two additional things I would talk about. The most important one is securing the supply chain. That doesn't just mean auditing the individuals who knock on the door and say, I have a better e-learning tool. It means auditing the fourth and fifth and sixth parties that they depend on, because no cloud-based tool is self-sustaining.
They all depend on suppliers, and that relationship is based on sharing data. Data is the raw material that makes the new business model of the web function. All of that hinges upon somebody taking responsibility for the use of data. In all cases today, school boards come out to parents and say, don't worry, the school board is accountable and responsible for your children's data. As soon as you push back on that and ask questions, they either use secrecy to withhold information, or they simply say, we are authorized to do this because of the Education Act, and therefore you do not have the right to withhold consent as a parent. That requires a ton of reform, but the focus here is on securing the supply chain and carrying out proper professional audits. That means learning how to do it, and employing professionals who know how to audit a vendor. Right now, I have not seen it. I just have not met a school board that has a competent individual or set of individuals who know how to carry out proper vendor risk assessments. And these are year-over-year relationships that just get renewed at the end of the year. Why? Because the data is cumulative. And that is my third point, and perhaps the most important: you need to have data deletion, you need to have a data purge. Data is like hot coals: the more of it you hold, the more you increase your appeal to criminals and others, and the more likely you are to have a data breach. At the moment, part of the reason for the success of the cloud model is the fact that you no longer have visibility into the attacks. It used to be that IT folks could name the exact number of attacks against their firewall per second, or per minute, or per day. All of that is outside their control today. It is under the control of vendors and businesses, people who have not been properly vetted. Do we know that they're using software developers who have been trained in secure coding? I really see that as a rarity.
But what we need to remember is that data deletion is a way to protect students. Once a year, the information collected throughout the year needs to be deleted securely from all of the vendor's systems and from their own ecosystem of third, fourth, and fifth parties and partners. This is critical. Today, it's not happening anywhere in Canada. The data just grows immensely. That means that if my son skips school on a particular day, that will be remembered by the system as he goes into university, and it may be used in a decision about whether he is accepted, because you never know what metrics are going to be in use when the competitiveness of the field increases, right? At that point, they might look at any kind of metric in order to put a quantitative score on somebody's eligibility. And so that data, which could be mixed in with camera views, with schoolyard incidents, with all kinds of stuff dating back to grade one, just keeps on accumulating and becoming that much more valuable for the vendor company. It's a terrible idea to store that information beyond the end of any school year. So that is the huge data purge that I recommend: there should be a data deletion week at the end of each school year, where you celebrate and partner with vendors who are conducting their business ethically and saying, you know what, we are securely purging, and you get a data deletion, data disposal certificate that says we've deleted everything, and in the fall we start fresh. That is responsible use of children's data, and to me it would be a hugely ethical improvement to public education, at least in Canada, but really all around the world.
Michael
Right. Yeah, absolutely. That's very insightful. I appreciate that. It's an important point you make: it's not just that we want to protect the privacy of individuals as a principle we uphold, but that the nefarious possible uses of that information against a person, without their initial consent, are obviously a significant concern. That's fantastic. Well, thank you so much, Claudio. I really appreciate you taking the time today. You've got a lot of amazing insights into these topics. Would you like to list your website or some places where people could learn more about this type of information from you?
Claudio
Well, sure. I don't know if it was mentioned, but as the founder, or a co-founder, of the Knowledge Flow Cyber Safety Foundation, I'm very privileged to be working with a fantastic team of privacy-focused people of all ages. And we keep adding resources to the knowledgeflow.org website. Anyone who goes there, whether they're seniors, new Canadians, students, teachers, administrators, or law enforcement, will find a section with free resources in multiple languages, professionally translated, and professionally created tip sheets of all kinds. There are many opportunities to volunteer as well, because this is the only cyber safety foundation in Canada focused on these topics. And we always appreciate people reaching out and saying, hey, you know what, we really appreciate your materials; would you like to speak to our group? That's how we end up speaking all over Canada. As far as my company is concerned, it's DataRisk.ca. It's a cybersecurity company that's actually a twin sister of the Managed Privacy Canada organization. So ManagedPrivacy.ca is a twin sister of DataRisk.ca. Together, they cover both security and privacy. I only mention it from the perspective that any organization, any association, needs to really pay attention to both security and privacy, but you can pick and choose what you're looking for, whether that's privacy compliance or just security, to have the assurance that you can protect yourself and others. It's an interesting way of doing that. And if you're a nonprofit, you go through Knowledge Flow, and you will get those same services from certified professionals, but at a fraction of the cost, because our goal with Knowledge Flow is to deliver as much as possible for free.
So forget everything I said about companies. What's important to us is to deliver free information and knowledge through Knowledge Flow, through kids, through volunteers, through passionate professionals. And also don't forget the coalition I mentioned earlier, darkpatternstipline.org. Now that you've been sensitized, and hopefully all of our listeners have been sensitized, to the existence of dark patterns, you can go to darkpatternstipline.org and think of something to submit, because a lot of companies are extremely innovative, and it's always neat to see examples of dark patterns. At the moment, they're infinite; I have three-hour presentations on dark patterns, and it's nice to have a place to submit these things. I have nothing to do with the site, but it seems like a really cool place to visit. Either way, thank you very much. It's been an honor to take part in today's podcast, and I look forward to next time.
Michael
Thank you so much. This has been incredibly informative and interesting. Obviously, there's so much more we could have delved into, but I appreciate you taking the time today. Thank you very much for joining us on the podcast, and enjoy the rest of your day.
Claudio
Thank you. Thank you. You too.
