Robert Bateman, head of content at GRC World Forums, is a well-respected expert on data protection, privacy, and security law. He built his reputation by producing in-depth reports on legal updates, compliance guidance documents for businesses, and news articles about the latest industry trends. He also has a deep interest in digital rights and is an avid supporter of free speech.
We discuss the EU-US Data Privacy Framework, the NSA's Upstream and PRISM programs, Meta's threat to leave the EU, Clearview AI, OpenAI (DALL-E / ChatGPT), and Twitter vs. Mastodon.
Boozebot uses ChatGPT to generate “Death in the Afternoon”.
TIMESTAMPS
0:00:00 – CIS Controls Version 8: Updates and Changes
0:04:15 – The Impact of the Snowden Leaks on Data Protection and Privacy
0:06:16 – The EU-US Privacy Dispute: What’s at Stake and How to Fix It
0:12:49 – Facebook, Instagram, and WhatsApp Facing Fines for Privacy Violations
0:14:40 – The Impact of Social Media on Personal Data
0:22:46 – Twitter’s GDPR Fine and Data Management Issues
0:24:56 – Twitter Authentication Problems Run Deep
0:27:07 – Twitter vs. Mastodon: Which is Better for Security Practitioners?
0:29:31 – The Infosec Community on Twitter vs. LinkedIn
0:31:45 – The Ethical Implications of OpenAI's Technology
0:34:06 – The Impact of AI on Ground Truth and Job Loss
0:39:12 – The Benefits and Risks of Artificial Intelligence
0:41:22 – The Benefits and Risks of AI
SYMLINKS
LinkedIn
Twitter
GRC World Forums
DRINK INSTRUCTION
Death in the Afternoon

EPISODE SPONSOR
Center For Internet Security (CIS)

CONNECT WITH US
http://www.barcodesecurity.com
Become a Sponsor
Follow us on LinkedIn
Tweet us at @BarCodeSecurity
Email us at info@barcodesecurity.com
This episode has been automatically transcribed by AI; please excuse any typos or grammatical errors.
Chris: Robert Bateman is head of content at GRC World Forums. He's a Certified Information Privacy Professional and a respected voice on data protection, security, privacy, and business practices. He built his reputation writing in-depth reports on legislative developments, compliance guidelines for organizations, and news stories about the latest industry trends.
Chris: Robert, welcome to Barcode, man.

Robert: Hi there. Thanks for having me.
Chris: Absolutely. And also here with me again is the one and only Rohan Light.

Rohan: Ooh, good day. Good day.
Chris: A lot has happened within the data governance realm in these past two to three months. Where do we start?

Robert: That's a good question, Chris.
Robert: I would start a while ago, perhaps even as far back as 2013, to frame some of the issues we're dealing with in Europe, and between Europe and the US, when it comes to data protection and privacy. The reason I say 2013 is because that's when the Snowden leaks came out. You'll know about that, of course: Edward Snowden, the CIA contractor who leaked a load of information about how the NSA was spying on private communications. That was really the first time we had a glimpse of how the system in the US works for data interception. Before that there was a lot of speculation and conspiracy, and Snowden gave us an idea of how the law worked in practice.

Robert: This is still having repercussions today, particularly in relations between the EU, and less so now the UK, and the US, and it's what's occupying a lot of people's time in this sector, in Europe and in the US as well. The reason it's so important is that it revealed a fundamental conflict between US law and EU law. The NSA had these two surveillance programs, you might know them, Upstream and PRISM, which are two different programs for intercepting data, and the law in the US essentially conflicts with fundamental rights in the EU. The problem, from an EU perspective, is that personal data traveling to the US is being intercepted by US authorities, and the perception is that the US is violating European human rights in that way.

Robert: So this is a big dispute. It's left a lot of companies in limbo, and essentially there's been a lot of law-breaking going on just because the law is so broken. All the US companies, Google, Facebook, all the Meta companies, Salesforce, every tech company you can think of, have been in this kind of legal limbo for nearly a decade now as the EU and the US try to sort this out politically, and compliance people, people working in privacy and data protection, have been left to deal with it with no real fundamental legal answers.
Chris: What do you think needs to happen in order to make progress here?
Robert: Well, fundamentally, the EU doesn't allow European companies to export people's data, as they put it, to countries that don't have a strong human rights regime. You might think of countries like China or Russia, but because of the EU's very high standards, the US falls into this category too, because of certain problems with the US legal system from a European perspective. And of course that has huge implications, because US companies run the internet. Facebook, Google... it's borderline impossible to avoid interacting with these US companies if you want to do business or go about your daily life these days.

Robert: So what the EU and the US keep doing is trying to come to these agreements whereby companies can sign up to particular schemes, promising to treat Europeans' data better than they would otherwise, but they keep getting knocked down in court. We're on our third version of this framework now; just released today, actually, was a draft new decision by the European Commission. And it keeps being taken to court by one guy called Max Schrems, who is on a kind of campaign against this conflict in law. It keeps getting knocked down, they try again, and now they're on their third attempt, so we're going to see how long that one lasts. But yeah, there's a fundamental conflict, and it's all about human rights from the European perspective: the right not to be spied on by the NSA and the CIA, effectively.

Robert: As for what might fix it in the long term, I think it would require some sort of change to US law, which doesn't seem very likely, because the US is so keen on its intelligence-gathering operations and says, perhaps rightly, that it needs to do this to maintain its own security.
Rohan: Uh, it’s a good question whether or not the US can reconfigure parts of itself in order.
Rohan: Evolve with the world.
Robert: It is and it’s, some people say there’s an element of hypocrisy in Europe. Cause many countries within the EU do their own intelligence gathering. You know, they do their own surveillance and they say they have a right to to intercept communications as well. France, the uk, when it was part of the eu, were always pushing for.
Robert: More power to retain people’s data to, to intercept communications. And the EU doesn’t tend to look inwards as much as it looks outwards. And of course, it’s created this, like I say, this impossible situation where the law is being broken billions of times a day just by people using companies like Google and, and Facebook.
Robert: You know, not just just them, any kind of, uh, electronic communications service pro providers, uh, from the US there, there’s a, there’s a fundamental problem there. Um, so it’s, it’s created quite, uh, an impracticality, uh, as well for for many people that have to deal with this stuff, uh, for their tropes. Hey,
Rohan: Just for a change of tack, can we jump to the Irish DPC's dealings with Meta?
Robert: For sure. I mean, Meta are in huge trouble over here, and they've said a few times that they may need to discontinue their European operations because of the problems they have complying with the law. A lot of people say that's a bluff, that they would never pull out of Europe, and of course they wouldn't if they could avoid it. But I have some sympathy with the argument, actually, because they can't really bring themselves into compliance with European law given their business models. They've got all these companies now, WhatsApp, Instagram, Facebook of course, and each of them is undergoing one or more investigations for breaking various privacy rules.

Robert: Instagram, for example, had an issue, and this one is probably the least sympathetic: it allowed children to convert their accounts into business accounts, and part of that process required those children to publish contact data, their phone number and email address, on the open web. That obviously is a big no-no as far as European data protection law is concerned, so they're in hot water over that one. Facebook itself did what has been described as a consent bypass. The idea is that you sign up for Facebook and you're served personalized ads, and in Europe you're supposed to have a choice about this, which is part of the reason we have these cookie banners that everyone hates so much. You're supposed to be able to say no to that, but Facebook were not allowing people to say no. So that's another problem that Ireland and the European Data Protection Board are dealing with. And then there's WhatsApp, who, I'm sure you'll remember, caused a big controversy earlier in the year by wanting to share more data with Facebook, the parent company.

Robert: So they're facing hundreds of millions of euros in fines for all of these issues. And then there's the whole data transfer thing I was talking about that goes back to Snowden. The case against Meta is that they are kind of illegally exporting people's data to the US. Of course, they can't really help doing that, because they're an American company, but the case has been put against them, so they're the kind of test case for this.
Rohan: Yes. Meta is a supranational entity; it extends across borders and envelops entire countries, in that sense. You can see this as a large-scale tightening of rules. This is why entities like Meta are encountering so many problems: it's exposing things that are pretty unsavory in some cases, giving us a sense of what the business model actually is.
Robert: Yeah, so the business model depends on a lot of people's private information being processed in quite an intrusive way. The thing with Facebook is you kind of know what you're signing up for, in a way. There are other companies, like Clearview AI, who you may have heard of, doing slightly more nefarious activities. This is a company that has created a kind of biometric database of people from around the world; I think they've got something like six billion images of people's faces. It's an American company, also in trouble in various European states and facing large fines across Europe.

Robert: They scraped the web for facial images, so we'll all be on there, I know I am, because I asked them, and they create a kind of biometric identifier for each person they create a record for, and they sell access to that database to police services so they can find the people they want to find. It's just another example, really, of how much of our personal data is subject to these sorts of unnerving and arguably nefarious activities, and in Europe regulators are struggling to keep on top of this stuff.
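To make the "biometric identifier" idea concrete: systems of the kind Robert describes generally reduce each face photo to a fixed-length numeric embedding and then match new faces against the stored embeddings by distance. The sketch below is a minimal, generic illustration of that matching step only, not Clearview's actual pipeline; the random vectors stand in for the output of a real face-embedding model.

```python
import numpy as np

# Toy stand-ins for face embeddings: in a real system each row would be the
# output of a trained face-embedding model run over a scraped photo.
rng = np.random.default_rng(0)
database = rng.normal(size=(1_000, 128))                  # 1,000 enrolled "faces"
probe = database[42] + rng.normal(scale=0.05, size=128)   # noisy re-capture of face 42

def search(index: np.ndarray, probe_vec: np.ndarray, threshold: float = 1.0) -> np.ndarray:
    """Return indices of stored embeddings within `threshold` Euclidean
    distance of the probe embedding (a nearest-neighbour lookup)."""
    distances = np.linalg.norm(index - probe_vec, axis=1)
    return np.where(distances < threshold)[0]

print(search(database, probe))  # expected to print [42]
```

At web scale the linear scan above would be replaced by an approximate nearest-neighbour index, but the privacy point is the same: once an embedding exists for your face, any new photo of you can be matched back to your record.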
Rohan: Yeah. The case against Clearview can be tracked by the number of police agencies that adopt it and then, two months later, suddenly un-adopt it once they realize they're breaking multiple rules. It's a fascinating thing to watch happen as every society deals with the extension of facial recognition technology.
Robert: Yeah. There's a UK police force that appears to have tried them out at one point and very quickly decided against it. It's an interesting pattern, as you say. The Swedish police, I think, also tried it, and the Mounties in Canada tried to use it, and it keeps being declared illegal in all these countries. But not so much in the US, because of course it doesn't have the strong sort of privacy law that we have in other places. There was a private case against Clearview in Illinois; the US has a good system for these private legal claims, but less of the regulation that you see in other parts of the world.
Chris: So to my knowledge, they've been fined at least once, to no avail. Do you feel like we're going to continue to see this stay unresolved?
Robert: It's so hard to see how it resolves itself. There's some suggestion, and I don't think this is viable, that some European countries should try to extradite the CEO and deal with it that way. Because the thing is, the business model is such that they can't discriminate; they can't say, "We're only going to collect facial images of Americans." It's got to be everyone, because they don't know where these images are coming from. So unless there's some American law... I mean, they're based in New York, and as a state it's not the weakest in terms of privacy regulation, but there's nothing stopping them from doing this, so it's legal there. It's just another example of how, with the interconnected nature of the internet, borders don't mean much anymore.

Robert: Unless the US adopts some sort of federal privacy law, which is not completely inconceivable (there's one on the books at the moment that a lot of people say might happen), it's hard to see how they can be stopped, really. Also, they keep getting these fines in Europe: 20 million euros in Italy, France I think is 20 million euros as well, and the UK is seven and a half million pounds. I don't see how these regulators are going to actually get this money, because why would they pay, you know?
Rohan: Yeah. That was my next question: the enforceability of fines.
Robert: Mm-hmm, it's a tricky one. I mean, they've been ruled out of a lot of these markets anyway. They have tried to go into these European markets, but every time they do, someone makes a complaint, and the business model is clearly not compatible with European law. So regulators do a big investigation, it takes a year or more, then they issue a fine, and then Clearview normally appeals. But there's no real reason for them to pay, because they say, well, those laws don't apply to us. That's going to be tested in the UK soon, because they're taking it to court, and legally they might be right. It's a real test of the idea that Europe can somehow control non-European companies operating within Europe. If you want to do business there, it is quite easy to penalize companies, but if you're not interested, then how much can they really do? It's a difficult question.
Rohan: Wow. I was just thinking of another large tech company that is having interesting conversations with the EU at the moment, and that of course is Twitter: the communication platform of choice where, suddenly, with a change of governance at the top, we're actually able to see how the downstream changes. It's a fascinating topic right now.
Robert: It’s, yeah, and I’m a, I love Twitter personally. You know, I waste a lot of time on there. And, uh, it’s, uh, it’s my favorite of the kind of platforms and it’s particularly, I mean, it’s, it’s been like watching a slow motion car crash at the moment, but it’s very compell.
Robert: Um, seeing what’s going on, and I think people in my community anyway, the data protection people have kind of forgotten that. Twitter has never been very good at that stuff. So people are saying, you know, Europe’s gonna shut it down. The GDPR is gonna put it out of business. But Twitter’s have problems like this before they have had a GDPR fine.
Robert: It was only. Half a million euros or so. Um, other regulators wanted it to be much higher, but the Irish, uh, regulator where they’re based is quite, is known to be quite lenient. Um, and there were these revelations, of course, a few months ago from the, um, the ex has security there. Um, He, he came out the, you know, whistle blows Peter’s Peter Zko, or much as he was known, he came out with some pretty shocking stuff about what had been going on, uh, in Twitter for many years.
Robert: Um, they only allegedly only managed around 20% of their. Internal data sets. Uh, he said there was ignorance and misuse about what data was where he alleged that they had serious vulnerabilities at around half of their servers. And, uh, failing to manage employee devices. Uh, Serious problems with access control, I think over accounts.
Robert: Um, so I think people are kind of, people were very anti Twitter from the compliance point of view for a long time, and I think they now mask has taken over and he’s behaving, uh, objectionably in so many other ways. people are forgetting about the problems that Twitter’s always had in, in, in this sense, it’s just another company really built on.
Robert: Processing people’s data and trying to sell them stuff. You know, you, you are the product, as they say. Um, yeah, on Twitter, just like with Facebook, the, the
Rohan: The Twitter authentication problem runs deep on that platform. I saw an article this morning referencing Insider Intelligence's Jasmine Enberg estimating 32 million users exiting Twitter over the coming two-year period. What are your thoughts on the scale of the exit, whether it happens or not?
Robert: You mean people leaving Twitter? Yeah, so a lot of people are telling me to move to Mastodon, which is another social network, and I'm on it, but there's something about Twitter that's just so much more fun and lively, you know? You occasionally see celebrities posting stuff, and I don't care very much about their side of things, but it's kind of nice to have them around in a way. I tried out Mastodon and it's very dry: lots of academics, lots of tech people, which is kind of my comfort zone, but I sort of like to see stuff from outside of that as well.
Chris: And your source of information is limited too, right?
Rohan: For sure. Yeah, the UX is different. You're looking at Mastodon as curating conversations at a different end of the spectrum, which is one reason why a well-curated conversational platform can work, should work, does work, is working. It's the extent to which we can keep the obnoxious drunks out of the bar.
Robert: Yeah. Algorithmic content systems get a bad name, but I kind of like having stuff pushed on me, to be honest, and I think people sometimes overstate the degree to which that's a sinister or nefarious exercise. Mastodon is kind of decentralized, as you might know, so it's difficult to find stuff unless you're looking for it, and it keeps you in your bubble a bit. I'm quite comfortable in my Twitter bubble, though, to be fair. I've never experienced or witnessed a lot of the more disgusting tweets unless I look for them. You do stumble across the more abusive and horrible stuff sometimes, but if you know your community there... the information security community seems quite nice, as does data protection and privacy on my side, so I've built a kind of little bubble of people that I really like there. It would be a shame if the whole thing fell apart. I don't know about you two; I think Twitter has probably been horrible for a few years, but I don't think it's going to disappear anytime soon.
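Robert's point about Mastodon being decentralized has a concrete shape: there is no single global feed. Each server (instance) runs the same open API and serves its own local public timeline, so what you can easily discover depends on which instance you query. A rough sketch using the standard Mastodon REST endpoint; the two instance names are just examples.

```python
# Minimal sketch: the same public-timeline endpoint exists on every Mastodon
# instance, but each instance returns its own local view of the network.
import requests

def public_timeline(instance: str, local_only: bool = True, limit: int = 5) -> list:
    """Fetch recent public posts from a single Mastodon instance."""
    resp = requests.get(
        f"https://{instance}/api/v1/timelines/public",
        params={"local": "true" if local_only else "false", "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

for instance in ("mastodon.social", "infosec.exchange"):
    posts = public_timeline(instance)
    print(instance, "->", [post["account"]["acct"] for post in posts])
```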
Chris: I agree with that, and I think it comes down to the individual as well. Like you said, us as security practitioners, we know what lane to stay in. But it's hard for me to believe that when you look at Mastodon, or an alternative, the bullshit won't follow.
Robert: Yeah, I think the bullshit will follow, and the thing people forget about Twitter is that, for all its issues with trolls and whatnot, they've got a lot of money to spend on content moderation. Whereas Mastodon is all individual people running their own individual servers, so they can make their own choices about who gets banned, who gets suspended, and so on, and that can lead to some quite unhealthy outcomes as well. In the pre-social-media days, on forums and the like, I remember some pretty bad situations and toxic communities on those platforms too. And of course we always have LinkedIn, and I know some great people on LinkedIn, but it's quite corporate. Twitter just feels more kind of cool, less formal.
Chris: The InfoSec community on Twitter is a different regime than the InfoSec community on LinkedIn.
Robert: I like both, and I've built a good community on both. I know Rohan from LinkedIn, for example, but I kind of feel like I've got my work clothes on at LinkedIn, more so than with Twitter.

Rohan: Ah, yeah. Fair enough.
Robert: One interesting thing is that I don't see anyone talking about going back to Facebook when they talk about leaving Twitter. It just hasn't entered the realm of possibility at all, so I wonder if it's done for as a social network.
Rohan: Yeah. I would say the bulk of their business now sits in advertising control. The Meta that Zuck now faces is some sort of other beast that extends into other, complementary areas, and I think we'll see them around for a while. But the previous darling of compliance villainy, Zuckerberg, has fallen to the wayside a bit. Compliance people are looking now at the Clearview-style business models, the ones that exploit in plain sight, as it were. It's an interesting situation. Yeah.
Chris: No matter what social media platform you use, it's hard to avoid seeing the trending topic of OpenAI and technology like DALL-E or ChatGPT.

Rohan: Evil.

Chris: Um, sorry, what'd you say?

Rohan: Evil.

Chris: No, no, not at all. Yeah, you know, I'd love to get your thoughts on that, Robert, in terms of the technology, the ethical aspect, and the potential threat it brings to the people using it as it continues to go more mainstream. And I think we can all agree that it's going more mainstream.
Robert: I've been playing with it. I'd also like to come back to Rohan for his opinion on this one, because I know he's big in the world of AI and also said that ChatGPT was evil. You know, I tried it out and I have been impressed; it's also infuriating in many ways. ChatGPT, I know, is really just a new interface for OpenAI's existing large language model, but I've found it, just on a personal level, not from a professional point of view, surprisingly limited in some areas and extremely impressive in others. It's very belligerent and stubborn about certain things; when it gets something wrong, it doesn't understand why it's wrong. It's not very self-aware. I know we shouldn't expect that from an AI, but it seems to be trained to sound correct about everything.
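A quick aside on what "a new interface for OpenAI's existing large language model" means in practice: the same family of models that ChatGPT fronts can also be called programmatically. Below is a minimal sketch using the OpenAI Python SDK; the model name and prompt are illustrative only, and it assumes you have your own API key set in the environment.

```python
# Minimal sketch: calling an OpenAI chat model directly instead of through the
# ChatGPT web interface. Requires `pip install openai` and an OPENAI_API_KEY
# environment variable. The model name here is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a careful assistant. Say when you are unsure."},
        {"role": "user", "content": "Summarise the EU-US data transfer dispute in two sentences."},
    ],
)

print(response.choices[0].message.content)
```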
Rohan: That’s interesting. A how it, um, it has a, a stubbornness to it in terms of how it defends its lack of knowledge in the areas that it knows, it doesn’t know. Yeah. Um, the, the, of course it’s been fed on people’s brains and the forms of, um, words. Right. Um, the thing is, I, I, yeah, you’re right. I don’t like it.
Rohan: And the reason why I don’t like it is because now, Ground truth can be so much more easily falsified. So now, uh, compliance professionals, when we are trying to get down to the ground truth of something, um, and it’s gonna get a lot harder because the eloquence level is gonna go up much, much higher. And this is going to impact careers in funny ways.
Rohan: I. Simply because we go, those were great words. Can you please elaborate? , and people stumble over and, um, revealed for faking it before they made it.
Robert: Yeah. It’s, it, it, it. So I’m sure you’re familiar with the paper paperclip. Maximize this thought experiment about an AI train to create paperclips and it’s. I mean, wrong word to use.
Robert: We’re so determined to keep making paperclips that it creates paperclips out of all the material are confined and ends up destroying the human race and it’s quest to make paperclips, and I’m thinking of. Chat, e p t, it’s kind of, paperclips is just sounding like it knows what it’s talking about. It, it kind of will steer the dialogue you’re having with it towards any position that makes it sound competent.
Robert: And if we, right now it’s kind of a, a gimmick to some extent, it can do some pretty cool stuff. You know, I’ve asked it to write me some code and that kind of thing. It does that very well. It writes. Possible blog posts. Um, do you think Liz
Rohan: Do you think Liz Truss was using it as an idea generator?

Robert: Really? Is this a theory that's going around, or your own?
Rohan: That was a pure slanderous joke on my part. Ignore it immediately. Ignore it, ignore it.
Robert: It would actually be a great use for it, I think, speech writing, because the kind of stuff she comes out with is quite bland and sounds okay to most people's ears, I guess. But I think you're right: as it becomes more integral to society, these things are going to have increasingly dramatic consequences, and the idea that it's another water-muddying barrier to getting to ground truth, as you say, becomes more and more important as we use AI in so many different fields. And as for the jobs, I thought, like many people, that long-distance lorry drivers, truck drivers, would be the first to lose their jobs to AI. Now that seems like a long way off, and instead it's service and knowledge workers who should be more worried, which is not how I expected it to go.
Rohan: Yeah. The recent case of the deceased South Korean artist whose work is being extended in a highly plagiaristic manner, by grieving fans but also by speculators, greatly changing the basis of actual value: that's an example of what these things can do to a market. It's interesting times. But actually, there's a really critical question which we haven't covered, and I need to know: who was actually the best character on Andor?
Robert: Well, I was a big fan of the show. I really liked the main guy, Andor himself, and I think it was a street ahead of other Star Wars IP released recently, personally. What about you?
Rohan: Oh, unreal. It had to be, straight away, the ISB woman, wasn't it? The main woman who's figured out the plot and who's working through a series of really intense psychological episodes, which we are not privy to, but quite clearly the character is motoring through outcomes, very unpredictable. Thoroughly enjoyed it: the writing, the characterization, and the depth of it. Awesome.
Robert: Yeah, I thought it was phenomenal. Great show. And I really liked Andy Serkis's character as well, in the prison, the whole prison-break scenario. That was a great one. But coming back to AI, I must say I can't help but be excited by what's going on in this space. I don't know about you, but I think I've become less cynical about it the more firsthand experience I've had of it, to be honest. And there's a lot going on in terms of the law around it, particularly in Europe; there are going to be a lot of changes coming when the AI regulation hits. But I've found myself being quite seduced by it since playing around with these tools. I don't know if you think that's very naive of me.
Rohan: Oh. Uh, I’m an empiricist, um, but I’m also a complainer, you know, and I’m never short of an opinion.
Rohan: Uh, it’s just that, uh, I happen to have been watching this, um, sector for a while. Yeah. Yeah. And my, my main viewpoint is actually young people, um, and giving young people chances of, uh, non-digital and digital life experiences before they get too old. Um, I’d say that’d be my, my biggest. , but professionally, um, there’s no way that we, we have to get involved with these devices in order to be able to identify the control surface until we can, uh, identify the control surface.
Rohan: We can’t be sure of our risk actions. Um, that’s why the ground truth, any threats to the ability to empirically identify ground truth are significant. .
Robert: And do you think it’s possible to have that?
Rohan: Yes, because for something to be important, it must have an effect that is observable. Otherwise there's nothing to differentiate it from two brains in a jar having an amazing conversation; they just don't matter. There must be a real-world impact, and if we can't observe it, that just means we're looking in the wrong place, so we keep looking. And it doesn't take long to figure out the costs of, for instance, content moderation, and how that industry has been fractionalized out to, for instance, Africa, where there is an abundance of labor that can be cheaply exploited.
Chris: I mean, you can’t deny that the technology is phenomenal if the governance is there. What is an optimal use case that, that you envision?
Rohan: Well, we should treat these things the same way we treat cars. We wouldn't put up with mass casualties in car testing, so let's not put up with really bad rights outcomes in data system delivery sequences, which can last for years. It can take years to get products out there, during which time lean manufacturing techniques tell us that, yes, we can make multiple interventions, and so, by the time your car arrives, you have an expectation of high reliability within tolerance levels, and those tolerance levels are buffered for your customer experience. It's the same with any data product, and an AI is just a data product. It's a configuration, a rapidly changing configuration, but we still have the ability to turn the switch off, and at the moment the AI can't defeat that.
Chris: So Robert, you just attended Risk 22 in London. Um, can you tell me about that experience?
Robert: Yeah, I was chairing the Data Protection and Privacy Theatre, so it was great, and we had a lot of discussions about a lot of dorky stuff that I found highly interesting but that to an outsider might have seemed a bit dry: mostly compliance-related stuff. Data protection is one of those areas where some people are extremely famous within the field and no one outside of it has heard of them; it's just one of those communities that is quite small and quite close. So we had some fairly well-known speakers. Max Schrems, who I mentioned earlier, was one of them; I've interviewed him a few times now.

Robert: I'm kind of an outsider in this field, because although I know a lot about it, I'm fairly well versed in it, and I have to keep on top of everything, I've never actually done a day's work in the industry. I've never practiced it. So I spend a lot of time talking to people at events like that, and I know I can hold my own in a conversation with academics, lawyers or practitioners, tech people to a lesser extent perhaps, but I don't actually do the work. So it's interesting to meet people who do, and to hear about their frustrations and their day-to-day graft, whether it's in security or data protection, and we had stuff on financial crime there as well, and environmental, social and governance. It was a really interesting event. There'll be more next year; we're hitting Amsterdam, Dubai, and London again for Risk throughout the year. It should be good.

Chris: And you're based in the UK, correct?

Robert: That's right.

Chris: Since this is Barcode, what's the best bar you've ever been to, and is it in the UK?
Robert: I like an old-fashioned English pub quite a lot. They're dying out here in the UK, but you can still find them. I used to always go to a place called The Vine, which was very cozy: warm beer, which is an aberration to most people outside the UK, a fireplace, occasionally a folk band playing, and a beer garden for the very small sliver of summer we sometimes get in this country. I spent a lot of years at The Vine, some of them underage drinking with friends. A lot of my friends have moved away from the town where I grew up, so I haven't been there for a while, but that was the first one that came to my head: The Vine in Tarring.
Chris: So I just heard last call here. You have time for one more?

Robert: Sure.

Chris: If you decided to open a cybersecurity-themed bar, what would the name be, and what would your signature drink be called?
Robert: Hmm. Well, this is very much an in-joke for European data protection compliance people: I'm going to call it The DPIA.

Chris: Nice.

Robert: That abbreviation stands for Data Protection Impact Assessment, and that's probably the dorkiest joke I've ever made. As for a drink, I like an old fashioned. I can't think of a way to turn that into a privacy pun, though, so I'm afraid I'm just going to have to leave you with that, just because I like old fashioneds.
Rohan: Ah, yeah. Well, you did it well with The DPIA. That's great. Someone has to sponsor that at the next, uh, privacy conference. Yes.
Robert: Yeah.

Chris: Good call. So, Robert, you mentioned you're on Twitter quite often, and you're also on LinkedIn. Where can our listeners who are hearing this connect with you online?
Robert: Yeah, well, on Twitter it's Robert J Bateman, and on LinkedIn just search for Robert Bateman; it's "protection of data" in the URL, but I don't think people care about that when it comes to LinkedIn. And GRC World Forums: I write a lot for that site, so grcwf.com. I haven't had time to do as much as I'd like recently, but people can follow my writing there; I get into the minutiae of data protection law and developments. So check out the website. We've got a lot of good events coming up; in January we've got a couple, and I'll be hosting, moderating panels, and interviewing people. So yeah, connect with me. As I say, Twitter's my favorite, so follow me there and I'll follow back.
Chris: Cool, man. Well, this has been great. Thanks, I really appreciate the knowledge. Take care and be safe.
Robert: Good to meet you both, and yeah, let's talk again. Cheers. See ya.