Biomatrix

Jeff Jockisch is a privacy expert with experience in privacy rights, privacy laws, data breaches, intrusion detection, and other areas such as cognitive computing, content development, and trust systems. He is also cohost of the “Your Bytes, Your Rights” podcast.

Dave Burnett is the head of global business development at Zero Biometrics. He’s a serial entrepreneur who brings global executive expertise and experience from private and public companies into the security, biometric, and digital identity markets.

We discuss biometric systems and how they work, biometric system attack vectors, privacy concerns, public perception, regulatory and legal implications, and Zero Biometrics.

SYMLINKS
Jeff’s LinkedIn
Dave’s LinkedIn
PrivacyPlan.net
Your Bytes Your Rights Podcast
ZeroBiometrics
Terra Fermata | Stuart, FL
Dr. Funks | San Jose, CA
Tonga Room | San Francisco, CA
Le Tiki Lounge | Paris, FR
Frankie’s Tiki Room | Las Vegas, NV

DRINK INSTRUCTION
THE RED PILL
2 oz Pomegranate Juice
1 1/2 oz Bourbon
1/2 oz Raspberry Liqueur
1 tsp Maple Syrup
1/2 Lemon, Juiced
2 oz cold Club Soda
Add the pomegranate juice, bourbon, raspberry liqueur, and maple syrup to a large glass and mix with a cocktail stirrer until combined. Place an ice ball into another glass, then pour in the bourbon mixture. Place the tip of the spoon at the very top surface of the bourbon mixture, then slowly pour the lemon juice over the spoon so it layers over the bourbon mixture. Repeat with the club soda layer to fill the glass.

EPISODE SPONSOR
Center For Internet Security (CIS)

CONNECT WITH US
http://www.barcodesecurity.com
Become a Sponsor
Follow us on LinkedIn
Tweet us at @BarCodeSecurity
Email us at info@barcodesecurity.com

This episode has been automatically transcribed by AI, please excuse any typos or grammatical errors.

Chris: I’m here with Jeff Jockisch, who is a titan in the data privacy realm with experience in privacy rights, privacy laws, data breaches, intrusion detection, and other areas such as cognitive computing, content development, and trust systems. He also cohosts the extremely popular and just truly awesome “Your Bytes, Your Rights” podcast.

Chris: And I’m also here with Dave Burnett, who is the head of global business development at ZeroBiometrics. He’s a serial entrepreneur who brings global executive expertise and experience from private and public companies into the security, biometric, and digital identity markets.

Chris: Gentlemen, thank you for joining me today.

Dave: Chris, wonderful to be here. I’m looking forward to this conversation. Thanks for hosting us.

Chris: Absolutely. So yes, in cybersecurity, one aspect that we all focus on and zero in on is authentication mechanisms and the risks associated with them, and biometric systems are one of those methods of authentication, which in turn raises both security and privacy concerns.

Chris: So I want to discuss, first off, just to level set, and Dave, you may be the right person to ask this initial question: could you define what biometrics is, and how do these authentication systems typically work?

Dave: Yes, I’m happy to do that. And let’s put some context around biometrics and authentication,

Dave: and why there’s been so much excitement about using biometrics in the first place. We all know that name-and-password-based authentication has a lot of risk associated with it. You forget your passwords to begin with, but almost every single data breach of any note in the last

Dave: 10 years is inevitably linked to a name and password breach. It could be the result of a phishing attack, it could be the result of unsecured credentials, any number of ways, but it’s a very, very common attack vector. And so biometrics, which is the process of recognizing you based off of what you look like, or what some part of your body looks like, has gotten

Dave: very popular. It’s fast, it’s fairly convenient, and the way it works is that a picture, let’s just talk about face, a picture is taken of your face. Then it is digitized; the term is that it goes through a binarization process, and that’s saved. Then when you come back to authenticate with your face, another photo is taken.

Dave: It is binarized, and then it is compared to the first photo. If there’s a match confidence above a certain threshold, then that’s considered to be a probabilistic match and you pass. So it sounds really complicated, but in fact the user experience is very elegant, and it prevents a lot of the problems that we all know exist with name and password.
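The capture, binarize, and threshold-compare flow Dave describes can be sketched in a few lines. This is an illustrative toy under assumed details: the `binarize` step and the threshold value are invented stand-ins, not any vendor’s actual algorithm.

```python
# Toy sketch of the traditional biometric pipeline:
# capture -> binarize into a template -> compare with a confidence threshold.
# Real systems use proprietary feature extractors, not raw pixel thresholds.

def binarize(pixels):
    """Stand-in for the vendor's binarization step: raw capture -> template."""
    return [p > 128 for p in pixels]

def match_confidence(enrolled, presented):
    """Fraction of template positions that agree between the two captures."""
    agree = sum(a == b for a, b in zip(enrolled, presented))
    return agree / len(enrolled)

THRESHOLD = 0.9  # hypothetical; vendors tune the false-accept/false-reject trade-off

def authenticate(enrolled, presented):
    """Probabilistic match: pass if confidence clears the threshold."""
    return match_confidence(enrolled, presented) >= THRESHOLD

enrolled = binarize([10, 200, 130, 40, 220, 90, 170, 60])   # enrollment photo
fresh    = binarize([14, 196, 133, 45, 218, 93, 168, 64])   # new photo, same face
print(authenticate(enrolled, fresh))  # True: similar captures clear the threshold
```

Note the match is probabilistic, not exact: two captures of the same face never produce identical data, so the threshold absorbs the variation.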

Chris: You mentioned what you look like, so I’m envisioning facial recognition or retina scanning or something along those lines. What about fingerprints? Is that classified as biometric?

Dave: It is. In fact, the reason why I say that it’s a picture of you is that, even though it’s easy to think that means face, frankly it’s any of face, iris, palm print, and fingerprint, because a fingerprint sensor is really nothing more than a flat,

Dave: planar camera that you touch, and it picks up a picture of what your fingerprint looks like. Then it goes through the same basic process that I described. And that process, the design that I described, was invented back in the sixties, originally for face authentication, but the same model applies to other,

Dave: what we call biometric modalities: iris, voice, finger, face, et cetera. But one thing that’s important to remember is that it was invented back in the sixties. If you recall, there was no internet back then, and if you wanted to hack a system, you had to have physical access. That architecture hasn’t adapted to the present,

Dave: where we have these mobile devices and cloud-based services. And this data about you is very sensitive data; it’s personally identifying data. In order to make these systems usable, you have to wrap an enormous amount of security around them.

Chris: I’m curious. From your experience, where are you seeing these biometric systems implemented?

Chris: And how rapidly are you seeing these biometric authentication systems evolving into the mainstream?

Dave: That’s a great question, and let me step back and answer it from a very big-picture view. So up until about seven years or so ago, biometrics were largely restricted to highly sensitive

Dave: installations or locations where you wanted to do basically physical security access control. This could be a room at a hospital where you store drugs and various medicines, or it could be a military facility. It was really primarily targeted at access control. And then

Dave: biometrics came to our mobile devices, our smartphones, when Apple launched the first iPhone with a fingerprint sensor. That started a whole craze in the industry, because every Android smartphone vendor had to add a fingerprint sensor to their phones. So that was the second big wave in biometrics.

Dave: The third big wave really got underway around the COVID time period, when we all had to work from home. How would you onboard new employees in a COVID environment, where you couldn’t go into the office and couldn’t show identity-proofing documents? That started a whole new wave of services that used biometrics to compare your face

Dave: to a photo on an ID document, a passport, a driver’s license, and so forth. So those are the three big waves. And now you’re seeing the technology extend into many other use cases. I booked a hotel reservation in Palm Springs a couple of months ago, and the hotel wanted to see my driver’s license and take a picture of me to verify my identity before I could complete the reservation.

Dave: It’s becoming very, very common and it’s only going to become more common in the next few years.

Chris: Very interesting. Yeah, I’m definitely paying attention to the evolution, and it is becoming a lot more aggressive. With that, Jeff, I want to switch over to you for a moment as a privacy expert.

Chris: Based off of what Dave has described thus far, you know, what are some of your concerns in terms of the data privacy implications of biometrics?

Jeff: Yeah, thanks Chris. Well, to give you a bit of scale on the biometric waves that Dave’s talking about: the facial recognition industry generated about $5 billion in revenue last year, in 2021.

Jeff: So it’s growing really rapidly; the projections are that in five years it’ll be at about $13 billion, so sort of massive growth. Also, there’s really much more than just facial recognition and fingerprinting, which get all of the attention. I made a sort of database last year of the different types of biometric indicators,

Jeff: and there are really close to 40 different ones, some sort of on the edge, but quite a few different ways that you can be tracked biometrically. We won’t go into those because that’s not really the subject matter here, but it is sort of amazing the different ways that you can be tracked by your biologicals, I guess.

Jeff: So to get more to the point, though, there are a lot of different concerns with people being tracked by facial recognition and by biometrics. Right now, 80% of the world’s governments and 70% of the world’s police forces use facial recognition tech in some kind of way.

Jeff: And only two countries in the world have banned it: Belgium and Luxembourg, and they’re pretty small in terms of population. So the real problem is that while the technology has gotten better quickly, there are still some problems with bias and with accuracy. You’ll hear a lot of vendors say that their accuracy is like 99.9%, but that’s generally in the lab.

Jeff: It’s not in the field, where shadows and bad cameras and bad techniques and different things like that actually happen. You know, NIST found a few years back that it was pretty bad in terms of bias. We won’t really go into those sorts of statistics, but even in terms of accuracy, it’s pretty bad.

Jeff: I think in normal use cases it’s about 10% in terms of false positives in the field, if you just look at some of the studies that have been out there. And the problem is that as these face templates are collected, a lot of companies will tell you that they have really great security, and that probably is true, but you’re collecting biometric information, and that information, if it gets leaked, is really

Jeff: bad. I mean, somebody can potentially find out who you are, and depending upon what information that is connected to, they could get into your bank account. They could find out that you’re in witness protection. They could find out how much money you make. They could assume your identity.

Jeff: They could find out all kinds of different things, right? It depends on what kind of information that face is protecting, and because we think it’s very safe, we’re tending to use it to protect really important information. There are political dissidents, people in third-world countries or even in other countries, who have a lot at stake.

Jeff: And if we’re using these things in government databases to protect our taxes, like the IRS wanted to do, it’s really pretty scary that this stuff is not a hundred percent foolproof.

Dave: Yeah, if I could add onto that: I violently agree with Jeff on the points that he’s making.

Dave: I think there’s a kind of transfer of believed safety from our mobile phones, which we generally believe to be safe. We’ve gotten very comfortable with the use of finger or face on our phones, so we kind of transfer the idea that, okay, if my bank or my government wants to use my face for authentication, then it’s probably safe too.

Dave: But that is really a very bad assumption to make, because just because a company or a government claims that the data’s secure, it doesn’t mean that’s true. It’s incredibly hard to make a system that cannot be hacked or broken into. We’ve just been lucky as an industry that there hasn’t been a major

Dave: biometric breach outside of things like OPM, and that biometrics are not being used as attack vectors for breaking into our accounts yet. But that will happen, especially if we continue to have models like traditional face and traditional finger, where biometric data is kept and therefore can be stolen.

Chris: Is the biometric data stored as an image, or is it stored digitally as ones and zeros? I guess what I’m getting at is, if an attack were to occur, what does that look like in terms of breach data? Are the attackers retrieving the actual fingerprint of someone, or is it encrypted in some way, or does it really depend on what system you’re looking at?

Dave: It’s an excellent question, and frankly we could spend an hour on all of the specific ways to answer it, so let me just start with what a biometric company generally sells. What they sell is the capability to take an image, face, finger, palm print,

Dave: what have you, and turn it into a binary object. When I say binarization, I don’t just mean ones and zeros; I mean a format that will help it do matching. Matching literally face to face, a human-readable photo to a human-readable photo, is outside of what these systems can do.

Dave: They all do some sort of transformation into something that a computer can more easily process. However, regardless of what some vendors may claim, these are not one-way transforms. By and large, they are all reversible back into source image data, and there’s a lot of academic research that shows that.

Dave: So they sell you the technology to take image data, transcode it into a format the system recognizes, and then match on it. That’s it; any security beyond that is up to the buyer of that technology. So if I’m making, let’s say, a padlock with a fingerprint sensor, the fingerprint sensor vendor doesn’t sell you anything to keep your biometric data secure.

Dave: It’s 100% up to the implementer, the designer of that lock, to keep your personal data safe. Same with banks that might use face recognition, for instance: they have to provide all of the security for the biometric systems that they enable for their customers’ use. So there are many, many attack vectors

Dave: that can be leveraged against these systems. One is a hammering attack, where you try to hammer the system over and over with a synthetic face or a known face. There are other attacks where you break into the back end of the system and steal template data.

Dave: That’s what these files that save the biometric data are called; they’re referred to as biometric templates, or templates for short. You can steal them from one company, then use them to create an account at another company and log in. Or even worse,

Dave: you could substitute the template for Dave by stealing it from one bank and then injecting it into another bank.
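The theft-and-injection scenario Dave outlines works precisely because the stored template is the whole secret. A hypothetical sketch (all function names and toy templates are invented for illustration):

```python
# Why stolen templates are dangerous: if a service stores raw templates,
# whoever holds a template can inject or replay it. Toy matcher only.

def enroll(db, user, template):
    db[user] = template  # the service keeps the biometric template itself

def verify(db, user, presented, threshold=0.9):
    """Accept if the presented template closely matches the stored one."""
    stored = db[user]
    score = sum(a == b for a, b in zip(stored, presented)) / len(stored)
    return score >= threshold

stolen = [1, 0, 1, 1, 0, 1, 0, 0]  # Dave's template, stolen from Bank A's back end

bank_b = {"dave": [0, 1, 0, 0, 1, 0, 1, 1]}  # Dave's legitimate enrollment at Bank B
# Injection: the attacker overwrites the stored template at Bank B...
enroll(bank_b, "dave", stolen)
# ...then replays the stolen template and is verified as Dave.
print(verify(bank_b, "dave", stolen))  # True
```

Unlike a password, the “secret” here cannot be rotated after the breach, which is what makes template theft so much worse than password theft.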

Jeff: Now, I want to interject here: these attacks are not easy, right? But as computing power increases, they’re going to become easier. So it really depends upon who implements the security around these systems.

Jeff: It’s always a matter of risk, right? As consumers, we don’t know what the risk is; we’re not privy to how much risk we’re actually taking. We don’t know, and that’s what’s really, really scary.

Dave: Yes. And while Apple has done a fantastic job of publishing white papers around their security architecture, they’re unusual in that respect. While some companies may also have done something like what Apple’s done, it is not common.

Dave: There’s no law, and there’s no public pressure on any organization that uses biometrics to publish the security architecture that’s protecting your data.

Jeff: It might be good to actually talk a little bit about public perception of facial recognition tech, because it’s a little bit all over the board. I don’t want to hold up the conversation, but I think this is sort of crucial.

Jeff: Yeah, I want to throw a couple of stats out, if that’s okay, Chris. (Chris: Absolutely.) So here are a few, sort of at random: about 68% of Americans say facial recognition can make society safer. About 30% of adults say it’s acceptable for companies to use facial recognition to monitor employee attendance.

Jeff: About 70% of adults think facial recognition can enhance security systems and are comfortable with it being used at workplaces, schools, and places they visit. About 76% of Americans support facial recognition technology to identify child predators. So those are all sort of positives, right? And there are a couple more:

Jeff: only about 25% of Americans think the federal government should limit use of facial recognition tech, and 60% of Americans are okay with using facial recognition tech if it’s right a hundred percent of the time, which of course we know it’s not. But then on the opposite side, 81% of consumers are concerned about misuse of their biometric data,

Jeff: and 84% of people would support federal regulation of facial recognition tech, which seems to be the opposite of what I just told you in a couple of other stats, right? And 95% of Americans feel that they should have the right to opt out of facial recognition systems, which they generally do not.

Jeff: So yeah, I think the way a lot of this stuff gets phrased when people are asked these questions really changes how they perceive these systems: whether they think they’re getting more security out of it, or whether they think it’s a privacy problem.

Dave: Not to overcomplicate things, but I think how people answer those questions is also somewhat dependent upon what they imagine the facial recognition is doing.

Dave: So I would respond very differently to some of those questions if I was being asked about a security camera on a public street that might be doing face recognition on me as I walk down the street. That’s a public setting where my facial image is being compared to potentially a very large list of faces on some sort of watch list somewhere.

Dave: But I would respond differently if the question is around me using my face to connect to my bank or a government agency. Some scenarios I’m going to answer the same way, but there are very different ways that these systems are used. The street scenario is what’s called a one-to-N search, where you’re seeing a face and you’re searching a large

Dave: list of potential face matches to see if I’m one of those people; you’re really trying to identify me out of a population. And then there’s: hey, we think it’s Dave trying to log into an account, let’s see his face and check whether it’s really Dave.

Jeff: Yeah, I think that’s something we should point out here, Dave: that big difference between using your face to authenticate you, right,

Jeff: where essentially the system has just your face and it’s comparing it to your face, versus the system having a million faces, or a hundred million faces, or 5 billion faces, and searching your face, trying to find a match within that massive database. That’s surveillance, right? That’s more what they call facial identification rather than facial authentication.

Dave: There’s an important footnote here, and you hit the nail on the head about the distinction, Jeff. The interesting thing is that, by and large, any face solution that you acquire from a vendor to do authentication, that same technology can be used with the same data set for surveillance.

Dave: So if I’m a bank with 10 million customers and I’ve got 10 million faces, the same tools that I use to authenticate a person to an account could be used to just do a search on all of the templates I have for all of my 10 million customers, to see if this one person is one of my customers by face.

Dave: That’s a deduplication example, but it’s the same surveillance concept.
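The 1:1 versus 1:N distinction, and Dave’s point that the same template store serves both, can be sketched like this (toy similarity function, invented names; real matchers are far more complex):

```python
# Authentication (1:1) checks one claimed identity; identification (1:N)
# searches the whole template store -- the surveillance mode Jeff describes.
# Same data, same matcher, different question being asked.

def similarity(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

def authenticate(templates, claimed_id, probe, threshold=0.9):
    """1:1 -- is this probe the person it claims to be?"""
    return similarity(templates[claimed_id], probe) >= threshold

def identify(templates, probe, threshold=0.9):
    """1:N -- who in the whole database does this probe match?"""
    return [uid for uid, tmpl in templates.items()
            if similarity(tmpl, probe) >= threshold]

bank = {"dave": [1, 0, 1, 1, 0, 1], "jeff": [0, 1, 0, 0, 1, 0]}
probe = [1, 0, 1, 1, 0, 1]
print(authenticate(bank, "dave", probe))  # True
print(identify(bank, probe))              # ['dave'] -- a search, not a check
```

Notice that `identify` needs no claimed identity at all, which is exactly what makes the same template store usable for surveillance.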

Chris: So I’m going to go down a rabbit hole here and ask you about the legal aspects. Where does the regulatory line get drawn in terms of law enforcement tapping into these databases for use in the court system?

Jeff: The police are using it a lot already.

Jeff: Well, it sort of depends whether they need a warrant. Generally they already have the data. I don’t know of any cases where they’re going to Bank of America and saying, hey, we want Dave’s biometric data. But what they have been doing is things like going to hospitals and grabbing DNA, which is not face data but is biometric, biological data, from blood samples from a child that you might be related to, and using that to

Jeff: potentially identify perpetrators. I could see that starting to happen with facial recognition data, though I haven’t actually heard of cases of it yet.

Chris: Considering all the data privacy and the risk involved, do you see the advantages of biometric scanning systems, or would you prefer to just go back to,

Chris: you know, a user ID and password to authenticate? Do the risks outweigh the benefits?

Jeff: Well, I think it’s already a foregone conclusion that we’re going to be using facial recognition; it’s just, how do we make it safe? Clearview AI’s already got 3 billion images in their database, and they’re already selling it to every law enforcement organization in the United States,

Jeff: and half of them around the world, from what I can tell. And there are lots of other organizations trying to do the same. Now, there are laws that are trying to reel that stuff in, privacy laws; for example, Illinois has a law called BIPA, the Biometric Information Privacy Act, which even has a private right of action.

Jeff: That’s what’s keeping a little bit of this stuff at bay. That’s why Microsoft and Amazon and Google have sort of stopped doing some of their biometric research on facial recognition. But I think it’s only a matter of time before they figure out a way around that, and maybe a better way to do it.

Jeff: Maybe without biometric information, like what Dave’s company does.

Dave: Yeah. I think one thing that’s really important to remember is convenience: the more convenient you make any system, the more it’s going to be used. So biometrics have been playing a really important role in driving usability of a lot of different systems online, and in addressing a very,

Dave: very significant issue around password retention and data breaches that come from passwords. I could do a quick search on the dark web and get passwords that I’ve used at different organizations, passwords that have been stolen from those organizations. And every time there’s a theft of password data, that information is turned around, and you see it in hammering attacks, where they’re using that password to try to break into your banks and other organizations that you do business with.

Dave: So biometrics are really a valuable tool toward a safer online world. The challenge is that the design model for almost every existing biometric solution out there requires that you retain sensitive personal information. That’s what my company is focused on: a new approach to authentication where the biggest weakness, which is the protection of this biometric data, is fixed by not having any biometric data saved about you.

Dave: We rely upon a branch of mathematics called zero-knowledge proofs to make that actually possible.

Chris: Interesting. So is it still the biometric concept where, as we talked about at the beginning of the conversation, it’s gathering that imagery of what you look like, but it’s not storing it? Could you talk a little bit more about the technology?

Dave: Absolutely. This is one of the things that I’d love to talk about, not just because I’m with a startup that’s focused on a new type of innovative technology, but because I’ve been in this industry for so long, and we really have to come up with better, more modern solutions that address this and do other things, like record consent and give us more control over how our biometrics are used.

Dave: So to answer your question: as I mentioned before, traditional biometrics is essentially take a picture, save that, take another picture of you when you’re trying to authenticate, and then compare it to the original picture that was taken. Again, face, finger, iris, doesn’t matter.

Dave: It’s the same concept. What we do is look at your face with our technology. The company name is ZeroBiometrics, and our face product is called ZeroFace. When we look at your face, we do something radically different: we start by identifying a number of places on your face where there are interesting and distinguishing biometric features.

Dave: We do not save what we see at those locations. We just save a map of, kind of, where those features are located on your face. Then we generate what’s called a zero-knowledge proof, which is a hash value of what we would see at those locations if it’s actually you. Hashes are an important cryptographic concept where you take some data and you transform it

Dave: one way, and it cannot be reversed back out into the original data. So we save a formula that describes your face, sorry, that describes where the interesting things are located on your face, and then we store a one-way transformation of what we would expect to see. Then when I come back, it takes that data,

Dave: applies the formula for my face to whatever face is presented, and if it’s actually my face combined with my formula, the system will recreate that unique value. As I mentioned, it’s not reversible back into any sort of image, and that is how we match. We don’t compare faces and probabilities.

Dave: We get a hash, a long string of numbers that, just for your face, is a unique value. Then when we authenticate you, we generate a new string, and if those strings match exactly, then it’s you.
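A rough sketch of the store-a-map-plus-a-hash idea Dave describes. Everything here is invented for illustration: the feature extractor, coordinates, and quantization are hypothetical stand-ins, and ZeroBiometrics’ actual construction, including the noise tolerance a real system needs before hashing, is far more sophisticated and is elided here.

```python
# Sketch: store only (1) a map of feature locations and (2) a one-way hash
# of the expected feature values. No biometric image data is retained.
import hashlib

def feature_locations(face):
    """Hypothetical: coordinates of distinguishing features (stored in the clear)."""
    return [(12, 40), (33, 71), (58, 22)]

def features_at(face, locations):
    """Hypothetical: coarsely quantized feature values read at those spots."""
    return bytes(face[y][x] // 16 for (x, y) in locations)

def enroll(face):
    locs = feature_locations(face)
    digest = hashlib.sha256(features_at(face, locs)).hexdigest()
    return locs, digest  # only the map and the one-way hash are saved

def authenticate(face, locs, digest):
    # Recompute the hash from the presented face: an exact match, not a probability.
    return hashlib.sha256(features_at(face, locs)).hexdigest() == digest

me = [[(x * y) % 256 for x in range(80)] for y in range(80)]  # stand-in "image"
locs, digest = enroll(me)
print(authenticate(me, locs, digest))        # True: same face recreates the hash
other = [[(x + y) % 256 for x in range(80)] for y in range(80)]
print(authenticate(other, locs, digest))     # False: different features, different hash
```

The design point is that a leaked `(locs, digest)` pair reveals where features are, but not what they look like, because the SHA-256 digest cannot be reversed into the feature values.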

Chris: That’s very interesting. And I assume that the false positive rate would be much lower as well.

Chris: Correct? Because you’re only looking for those, I guess, unique attributes.

Dave: Yes. In fact, this is one of the interesting side effects, and this happens with a variety of different technologies, right? It can even happen when you’re writing. Back when we were all in school writing papers, it happened to all of us: you write something and then accidentally delete it.

Dave: Then you have to write it a second time, and you wind up writing a much better paper than if you had stayed with your original draft. There’s stress, of course, when you lose it, but you wind up with a much better paper. That’s exactly the experience we had when we implemented ZeroFace.

Dave: We got an order of magnitude improvement on the most important measure, which is how likely we are to incorrectly identify someone. Meaning, if Jeff tries to log into my account, how likely is it that we’re going to make a mistake and allow Jeff, with his face, to log into my account? Normal measures for face

Dave: are, for a good face system, one in a million, one chance in a million. There’s one particular vendor that’s doing some really good work, and they’ve gotten to one over 125 million, I believe. Fingerprint, believe it or not, fingerprint is only

Dave: usually at one over 50K; best case it’s one over a hundred thousand, but it’s not very common that it’s that good. With our implementation, we’re at one chance in a billion that we’re going to mistakenly recognize the person being presented for authentication. One in a billion, that’s unheard of.

Chris: Yeah.

Dave: And with no biometric data saved, right? I mean, we literally don’t know what you look like; we don’t save anything about what you look like. And this is such a radical claim that we went out to a national science agency in Australia, the company’s based in Australia and S.

Dave: And we had their scientists, it’s essentially the equivalent of the US NIST, this is the CSIRO in Australia, and they validated all of these claims and our science and our numbers. So not just validation of these numbers, the performance claims, but the basic science and the claim that we don’t know what you look like.

Chris: Pardon my ignorance here, but what if I just had a picture of this person and I held the picture up? What is distinguishing a real person from a picture when you’re talking about facial recognition?

Dave: That’s an excellent question. And I assume that you’re using this to describe a scenario where you might be trying to break into a system, correct?

Dave: Yeah, so several things. Frankly, all biometric technologies, well, most of them at least, but generally all the ones that are popular, have some form of what’s called liveness detection to defeat what are called presentation attacks: to both detect that a presentation attack is happening and then to defeat it.

Dave: The actual technology varies by the vendor, but several things occur, and it’s very dependent upon whose liveness algorithms you’re using. In some cases they do things to look to see whether they’re looking at a two-dimensional planar object. In some cases they look for artifacts that would indicate they’re watching a video replay, let’s say on a phone that you’re holding up to your PC camera.

Dave: There are other types of analysis that go on, much of which is very proprietary to the companies. But fortunately, there are some consensus tests and standards around biometric presentation attack detection. So even if every company says, what we do, our methodology, is super secret,

Dave: there are fortunately basic standardized tests that these companies have to meet in order to claim accreditation for their presentation attack detection.

Chris: The only reason I ask that is because I’ve seen it done. I’ve seen it done with an iPhone, and I think it was an older version, an older iOS version.

Chris: But I’ve actually seen someone, you know, have a picture and be able to unlock an iPhone with that method.

Dave: That’s surprising. I mean, maybe it was a third-party face solution on an iPhone as opposed to Apple’s own Face ID solution, you know, because Apple’s Face ID is actually pretty sophisticated, right?

Dave: I mean, there are ways to spoof it. But the thing, and actually this is a pretty important point to make, is that there’s no solution that is uncrackable, right? There’s no solution that can’t be defeated with enough time and energy, especially if, like with an iPhone or an Android device, you have physical control over the device itself. The whole point is to make defeating the system expensive and really time consuming. You don’t want any of these attacks to be scalable, where they can be run against a large population all at the same time, and you want to make the thieves, or the attackers, have to work pretty hard. Right? So a photo is too easy, right? If face systems could be routinely hacked by just holding up a printout of some photo that you find on the web, that would be a massively risky scenario.

Dave: Fortunately, that’s not the case, and Apple’s Face ID takes it a step further by using infrared lights and dot projectors to look at your face in 3D. And then they layer in a bunch of other security characteristics. So while it can be spoofed, the expense that you have to go through to actually defeat it is quite significant.

Dave: In fact, the cost of the attack is one of the measures that’s used in these certification programs. You know, like, you must defend against these different types of attacks, and the attacker can’t spend more than $200 in materials to carry out the attack.
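The liveness methods Dave describes are largely proprietary, but the intuition behind the simplest class of checks can be sketched. The following is a toy illustration only, not any vendor’s actual algorithm and far weaker than a certified PAD system: a printed photo held up to a camera produces essentially no frame-to-frame motion, while a live face blinks and shifts.

```python
import numpy as np

def naive_liveness_score(frames, threshold=2.0):
    """Toy presentation-attack heuristic: mean absolute frame-to-frame
    pixel difference over a short capture. A static printout yields
    near-zero motion; a live face (blinks, micro-movements) scores
    higher. Returns (motion_score, looks_live)."""
    frames = np.asarray(frames, dtype=np.float64)
    if len(frames) < 2:
        raise ValueError("need at least two frames")
    motion = float(np.mean(np.abs(np.diff(frames, axis=0))))
    return motion, motion >= threshold

# Synthetic demo: a "printed photo" (identical frames) vs. a "live" face,
# approximated here as the same image plus small per-frame variation.
rng = np.random.default_rng(0)
base = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
photo_frames = [base.copy() for _ in range(10)]
live_frames = [base + rng.normal(0.0, 5.0, base.shape) for _ in range(10)]

_, photo_looks_live = naive_liveness_score(photo_frames)
_, face_looks_live = naive_liveness_score(live_frames)
```

Real systems layer many signals, including the 3D depth, texture, and replay-artifact checks mentioned above, precisely because any single heuristic like this one is cheap to fool.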

Chris: So, Jeff, what are your thoughts on that?

Chris: I mean, would this approach help you sleep better at night?

Jeff: Yeah, to some extent. I didn’t know about the cost-of-materials thing; that’s pretty interesting. And it’s sort of interesting how Dave talked earlier about how a lot of solutions are not necessarily integrated. So if a bank or another organization is buying a solution, they might buy facial recognition here, but the liveness detection or the security that they implement might be something that they bolt on.

Jeff: And each of those different pieces could be good or could be great. A lot of them are getting better. I mean, some of the liveness detection is amazing, and even some of the security pieces are getting better and better. You just have to worry about what happens if somebody does get access to that information: if the organization isn’t taking good care of your data, or if the information that you’re protecting with that data, or that’s connected to that data, is particularly vital.

Jeff: If they can get into that data, or other data that you’ve protected with your face in another system, what does that mean for you? Right? It’s still pretty scary. I’m very happy that the industry is getting better and better, but it doesn’t make me sleep well yet.

Dave: Yeah, and let me add on to that. Part of what’s interesting about our company, and here I’m mixing speaking as a neutral industry observer with the fact that I’m just passionate about what we do at this particular company, is that in our system design, we started with a whole-system approach, meaning we don’t just deliver a biometric piece.

Dave: You know, we don’t just deliver a piece that is privacy-preserving by not saving your face data. We start with fundamentally different design assumptions. Biometric vendors typically just give you the biometric piece; whether it’s safe or not is up to you. As we’ve discussed, we actually deliver a system that defends against data breaches, one that assumes that whatever company buys our technology is not going to be able to resist professional hackers, and that data from our system will get breached from whatever customer has deployed our technology. And so we’ve built layered defenses into all aspects of our stack to defend against some of these risks that we’ve talked about, and to insert a variety of privacy-sensitive controls.

Dave: For instance, we enable the user to say, you may only use my zero face data for 30 days, or for six months, and after that you lose the right to use that information. The system will not continue to use your zero face data for authentication; it just expires beyond that time period. We can also limit it to only being used at a specific organization, or, heck, even at specific physical locations, so that if someone tries to suddenly use your zero face data to authenticate you from Russia, or any other location that isn’t a normal one for you, the system will just refuse.

Dave: There are many, many other layers of defense and anonymity that we build in, because of course we don’t want our technology to be used to violate your personal privacy, much less track your location. We’re really trying to both make a safer system and return more control to the people that are actually generating the biometric data, because our face is our property. Until you start to use it for authentication, that is, and then suddenly, you know, everyone starts to think that somehow they have a right to use your face. Right? And that’s just not the case. We need to move to a model where we control our biometric data and how it’s used in the world.

Dave: And that’s one of the many things that’s different about our solution.

Chris: Yeah. Dave, I love what you’re doing, and I love the fact that ZeroBiometrics is developing this revolutionary solution with security and privacy in mind as well. You don’t often see that with cutting-edge tech.

Chris: So I’m really happy to hear that. Plus, you’ve got Jeff Jockisch on your side.

Dave: You know, usually the biometrics technology people and the privacy folks are on opposite sides of the fence.

Jeff: It’s true. You know, one of the things that’s sort of funny about this is that I sort of became a believer in the ZeroBiometrics solution.

Jeff: It really flips your mindset, because if you can actually trust facial authentication, if it really is a hundred percent safe because there’s no biometric data stored, it really opens up so many different possibilities for what you can do. Because then, if you’re not constantly worried about your data being stolen, all the different ways that you can actually apply facial recognition to different problems...

Jeff: ...it’s amazing, right? I mean, that’s why it’s going to be, you know, a $13 billion industry. Imagine a $13 billion industry with no privacy concerns. It might actually be...

Dave: ...double that. Yep. And there’s no compromise in usability, right? So it’s all the convenience with a radically changed security architecture, and in many ways it’s more convenient. Because unlike today’s world, where if you’ve got a device that has biometrics, like your phone, and let’s say your PC and your tablet, you have to enroll three times, once for each of those devices,

we allow you to enroll in one place and then push that enrollment data to all of the other devices that you use, and to do it safely, which is something that is just not safe to do with traditional biometrics.
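The user-set constraints Dave described earlier, an expiry window, an allowed organization, and allowed locations, can be sketched as a simple policy check. This is a hypothetical illustration of the idea only; all names and structure here are the editor’s assumptions, not ZeroBiometrics’ actual API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class ConsentPolicy:
    """Illustrative user-granted constraints on zero face data
    (hypothetical names; not a real ZeroBiometrics interface)."""
    granted_at: datetime
    max_age: timedelta                 # e.g. 30 days or 6 months
    allowed_org: str                   # only this organization may authenticate
    allowed_countries: set = field(default_factory=set)  # empty = anywhere

    def permits(self, org: str, country: str, now: datetime) -> bool:
        if now - self.granted_at > self.max_age:
            return False               # enrollment has expired
        if org != self.allowed_org:
            return False               # wrong organization
        if self.allowed_countries and country not in self.allowed_countries:
            return False               # e.g. a sudden login attempt from Russia
        return True

policy = ConsentPolicy(
    granted_at=datetime(2022, 6, 1),
    max_age=timedelta(days=30),
    allowed_org="example-bank",
    allowed_countries={"US"},
)

ok = policy.permits("example-bank", "US", datetime(2022, 6, 15))       # allowed
expired = policy.permits("example-bank", "US", datetime(2022, 8, 1))   # too old
abroad = policy.permits("example-bank", "RU", datetime(2022, 6, 15))   # wrong place
```

The design point is that the check runs on every authentication attempt, so control stays with the person who generated the biometric data rather than with whoever holds a copy of it.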

Chris: So we’re coming up on time.

Chris: I do wanna say that this has been a real learning experience for me, and I thank you both for sharing your knowledge here. As we escape the matrix of biometrics, I’m gonna call this the Biomatrix, because it’s sort of what I feel like I’m in right now. I need to ask you both: where are you based geographically?

Jeff: I’m down in South Florida.

Dave: And I’m in Redwood City, California.

Chris: Okay, coast to coast. Jeff, I’ll start with you then. Since this is Barcode, what’s the best bar in South Florida that does not use biometrics to let you in?

Jeff: I’m gonna have to fail you on this one, Chris. I’m not much of a drinker. But there is a really cool music venue that serves drinks that I go to a lot, called Terra Fermata.

Jeff: So that’s probably what I’d recommend. It’s a really great place.

Dave: All right, this is a question that I’m just thrilled you’re asking. I actually do like to go to bars, Tiki bars in particular, and my job, as you might imagine, requires a lot of international travel.

Dave: And so wherever I go, I always find that town’s Tiki bar, literally all over the world. I’ve got a collection of mugs from every country that I’ve traveled to, where I’ve gone to the obscure Tiki bar in Paris or in Prague or in Hong Kong, or what have you. And here in the Bay Area, my favorite one, cuz there’s a lot of Tiki bars here in the Bay Area, is Dr. Funks down in San Jose.

Dave: For me, it’s a little bit of a drive, but wow, it’s just great drinks, great ambiance, great food. It’s a great escape from daily life.

Chris: So, traveling internationally, I have to ask: best Tiki bar worldwide?

Dave: Oh, that is such a hard question, right? Because none of these places are tourist draws; it’s always locals at these places. So the question is, how interesting were some of these places? Okay, first of all, for me personally, if you just want the most off-the-wall Tiki experience, it’s the Tonga Room in San Francisco, right? Because where else do you have a pool in the middle of a Tiki bar, a band playing on a boat floating around on it, and it rains every 30 minutes or something? I mean, that’s just ridiculous.

Dave: But my just kind of personal favorite, and the name’s escaping me, was the Tiki bar that I went to in Paris, just because it was such a local scene. Especially in Paris in the summertime, many of the locals can be a little bit disdainful of Americans, but everyone at the bar was super happy to chit chat and talk with someone from a different part of the world.

Dave: It was just a great, friendly, engaging experience. Yeah, those are my two top places.

Chris: The best Tiki bar I’ve been to, and I don’t seek out Tiki bars, but I do go to one in Vegas, is called Frankie’s Tiki Room.

Dave: Ah, that’s a classic. But next time you travel for business, or even just personal travel, if you go to a Tiki bar, you are guaranteed that it’s going to be a locals’ experience wherever you go, right? Sports bars, who knows, sometimes yes, sometimes no. At really highly rated mixology-type bars or speakeasies, you’ll get international folks wandering through. But I don’t know that I’ve ever met an American in a Tiki bar anywhere other than the United States.

Dave: So it’s a total slice of local life, for good or bad, right? It’s a total slice of local life.

Chris: I just heard last call here. Do you guys have time for one more? Absolutely. All right: if you opened a cybersecurity or data privacy themed bar, what would the name be, and what would your signature drink be?

Dave: For me, it’s an easy one. I keep talking with my wife about actually opening a Tiki bar at some point, but in the theme that you’re talking about, the bar would absolutely have to be named Enigma, after the German encryption machine. The drink would have to be something like, you know, the Memory Obliterator or something like that, a drink strong enough to make you forget all your passwords, right? Yes, the Hammerhead would be good: the drink that will make you forget all your passwords, so, you know, you get hammered. And it’s a nautical thing too; most Tiki bars have a good nautical component.

Jeff: I don’t know about the name of the place, but I think I’d like to do a drink called The Encryption, one that really just locks your brain up.

Chris: Nice. And that could be scary, cuz you don’t know what’s in it.

Jeff: Right. It can be different each time.

Chris: Exactly. Yeah.

Dave: All of the ingredients are a protected secret, right?

Chris: I like it. Hey, Jeff, Dave, thank you so much for stopping by. Again, I appreciate all the knowledge. Before we go, let us know where you can be found online.

Jeff: I’m at PrivacyPlan.net, and you can also find me on LinkedIn pretty much any time of the day or night, which is a little sad, but true.

Dave: Yeah, the best way to contact me is through the company, and that’s ZeroBiometrics, with an “s”, dot com. And of course LinkedIn as well: David Lee Burnett on LinkedIn.

Chris: All right, well, awesome, guys. Thanks again. Be safe getting home, and I’ll talk to you soon.

Dave: Appreciate it, Chris. You take care.

Dave: Thanks so much, Chris.
