[Anthony Jimenez]
Welcome back to Carahcast, the podcast from Carahsoft, the trusted government IT solutions provider. Subscribe to get the latest technology updates in the public sector. I'm Anthony Jimenez, your host from the Carahsoft production team.
On behalf of Bastille Networks, we would like to welcome you to today's podcast. What happens when everyday glasses can record you, identify you, and stream your data without you ever knowing? In this episode, Adrian Sanabria and Jon Bundy break down the real security and privacy risks of modern smart glasses, from AI-powered features and live translation to covert surveillance and facial recognition.
They explore how these devices are evolving, how they can be detected, and why their potential for abuse may be higher than most people realize.
[Adrian Sanabria]
Welcome to the Wireless Threat Podcast Series sponsored by Bastille Networks. I'm Adrian Sanabria, and joining me is Jon Bundy. How are you doing, Jon?
[Jon Bundy]
Yeah, I'm doing great. How are you doing, Adrian?
[Adrian Sanabria]
I'm doing good. I've been looking forward to this one. We've been talking about this one for a little while.
And I'm also excited for this one because we have some show and tell. We have some demos we can even do here. In this podcast series, we explore a new class of device threat each episode.
We help you understand the threat, walk through some real-life scenarios, and even do, as I just mentioned, the occasional live demo. Ultimately, the goal is to answer the question, should you be worried about this? If you have any devices, threats, or attacks you want us to dissect on this podcast, please let us know in the comments.
In this episode, we are discussing smart glasses. Let's get started. I think the very beginning, if you're to go way back in this, was Google Glass.
And Google Glass failed hard, failed hard enough that nobody touched this market segment for a long, long time. And I was selected as a Glass Explorer, as they called it back then. You had to basically write an essay on how you would use the things, how you envisioned using the things.
Because they wanted people to develop for them. They wanted to see what kind of use cases people came up with. And I was really excited about it, up until the point where they said, OK, you're ready to get your glasses, $1,600.
I was like, nope. I don't have that much extra cash laying around. So this is back, I don't know, 2010, 2012.
I forget when these things first came out.
[Jon Bundy]
It's been at least 10 years. It's amazing what the backlash was to them back then. But times are certainly different today, aren't they?
[Adrian Sanabria]
Yeah, well, I think one of the key differences, obviously, is that Meta decided to partner with Ray-Ban and use one of their iconic styles. And it's only been in the last... I've owned this pair here, which have prescription lenses.
I am blind. I am heavily myopic and rely heavily on my glasses. And I've had these for about two years.
And really only in the last two or three months have people recognized these as smart glasses and not just normal glasses on my face. So first off, we had the term Glassholes, right? From those Google Glass days, because they looked weird.
They stood out. They made people uncomfortable. And while these make people more comfortable because they look like normal glasses, I think we're kind of slowly seeing some backlash to them, because people now realize that they're surrounded by them.
I think you did some research and found that Meta has sold 7 million of these to date.
[Jon Bundy]
Well, that was just last year, right? And that tripled their sales from 2023 to 2024. So they're probably close to 10 million units from them alone.
So they're really leading the charge here. It looks like they're doing pretty well.
[Adrian Sanabria]
Yeah, so there's actually a Gen 2 of these out that has a better camera. I think it has more battery. So eight hours, you can leave the things turned on.
And they now have like sport versions of them. So instead of Ray-Ban, they have a collaboration with Oakley. And so those are much more durable.
These are not waterproof. Those have some IP dust and water resistance rating. And those can do high-speed photography as well.
So like you're snowboarding or something like that, they want to be able to capture high frame rates.
[Jon Bundy]
Yeah, just working out all those different use cases now, I think, right? Just feeling out the market and expanding in different directions.
[Adrian Sanabria]
So before we go deeper on these, maybe let's talk a little bit about the rest of the market here because you've done a lot of research here.
[Jon Bundy]
Yeah, I mean, first, I think it helps just to kind of describe the market in general. Like you said, what are the classes or types of smart glasses out there? And you can slice and dice it a lot of ways.
But one way I've been looking at it is three or four categories of glasses. There's your AI assistants. These are the lighter weight glasses that are meant to look more normal and connect to an AI and just help you every day, right?
They're usually lightweight. They try to look like a normal set of glasses. They might have a camera.
They might not. They probably have a microphone for voice input. They probably have speakers on the frame there for feedback.
They might have a little monocular display. They're called glanceable displays because they're meant for you to kind of glance up. So if you see somebody kind of looking up all the time.
Usually like on this side, up in the corner.
[Adrian Sanabria]
These don't have those, just to be clear.
[Jon Bundy]
But the next model does, right? They've got a model with a display out now. So those are kind of one category.
Like I said, I'll call them the AI assistants when we refer back to them. The ones that are connected usually through your phone to an AI and have some sort of voice input. They may or may not have a camera, may or may not have a heads-up display, usually monocular if it is.
And it's meant to display text or simple arrows for navigation, just simple things, not like an AR overlay, which we'll get to. So that's one class. Another class would be you kind of take that and turbocharge it: get yourself binocular vision, put some more sensors on it.
Now you have an augmented reality system like you see in the sci-fi movies where you look at something and you can see something in 3D space that these glasses are generating for you to help you out. Still probably connected to an AI. But these will have more cameras, more sensors to determine where you're looking, where you are in space.
They'll do something called SLAM, simultaneous localization and mapping, to kind of figure out what you're looking at and what they should display there. Now this is cool because now you can pin apps on your desktop. You look over here on your desktop and there's something floating there that gives you the weather.
Look over there, there's a different app. Or you could get information about the 3D world you're in as you're walking around. These tend to be bigger.
You have more sensors going on, you need a bigger battery, you've got more cameras, you've got a lot going on there. A little more noticeable than the first class of AI assistants. Then you've got something I'll call extended reality, or "display on face" is a good term.
And I've got some of those. These are meant to be kind of tethered. Could use a high-speed wireless.
Here's some. You can see they're a bit thick, a little chunky, I'd say. Heavily tinted.
Kind of obvious that you're wearing them. You're not going to walk around with these. You can see through them.
So if I put them on, you can kind of tell something's out here. But what I see now is a screen that's floating like 10 feet in front of me in stereo. It's just mirroring whatever I'm connected to.
And what I'm connected to is just a laptop. You can kind of see. This isn't a good video, but there.
That's a reflection. There we go. So there's some stuff in there.
It uses a birdbath display and fancy optics to curve all the light. But these are thick guys. And it's meant just, like I said, for productivity enhancement, entertainment.
You can hook it up to your Switch. You get a big display floating in front of you. There's no network connectivity.
And so we're talking about threats. We can almost just kind of toss this one out right away as not being much of a threat. So those are kind of three categories.
And then if you want to add to that, there's some niches within each to focus on health. And then there's industrial use cases that are less threatening that we can talk about. And then there's what's upcoming with maybe Android XR and how they're building a framework to expand this market and capabilities.
But those will fit into those three categories I kind of said before.
[Adrian Sanabria]
Does that help? Yeah, absolutely. So it's interesting.
You know, one thing I should mention, manufacturers are not completely unaware of the concerns about privacy and things like that. So if I take a picture, you should see – oh, I have to turn it on first. So this is something I didn't even realize until months after I owned these things, is that they actually have a switch to turn them on and off, which is really useful, because if I just need these to see, and I do need them to see, but I'm not planning on using them for anything, I can turn them off and save battery life.
Before that, I thought they just turned on the moment you took them out of the case, which they do if you have the switch in the on position. I've got it in the off position right now. I'm going to switch them on right now.
And, yeah, you can see there's not a whole lot of obvious hardware in the design here. You can see, like, underneath are the speakers you mentioned. I'm not sure where the microphone is, but it does have some pretty good microphones.
I believe it's right there in one of the nose pads. And then you can clearly see it actually looks like there's a camera on each side, but one is an LED and one is the actual optical camera. So they're on now.
They're connected. They only take, you know, maybe 20 seconds to boot up. And when I take a picture, we see a little flash right here.
If I take some video, and I can do that by voice or by holding down the button for more than two seconds or something like that, it's now taking a video, so this will kind of slowly pulse. But we very quickly saw some services where you could send these off and somebody would kind of expertly drill out the LED here so that other people around you don't get these kind of notifications. And even with these, honestly, I've never had anybody spot it or say anything to me when I am taking a video.
Like, you know, maybe I'm biking on a greenway or something like that. You know, when I've been a tourist in different countries, I'm often using some of these features. You know, just I want to show the kids or somebody back at home, you know, what it's like to actually be in this place rather than just still images.
So I'll take a lot of videos for that purpose.
[Jon Bundy]
It's not obvious. You know, we're tuned to see, like, changes, and that's just a steady light. And it's not a very bright light or anything like that.
And you're right, immediately. So that's like the token privacy concern that the company is addressing, right? Oh, we'll put an LED on when you're taking a picture.
But you saw right away, and you can go to Amazon now and look for stickers just to cover that up. And then I think the next thing Meta did was they said, well, we'll put the LED right in the middle of the camera hole so you can't just cover the whole thing up. Well, they came up with tinted stickers that still kind of cover up that part.
And, you know, so it's an arms race between privacy advocates and people that are not so concerned about that, which is one of the threats.
[Adrian Sanabria]
You know, so talking about some of the threats, I also want to talk about some of the things it's capable of and some of the things I use it for. So if you're going to another country and you see a sign in a language that you don't speak, I'm going to see if I can demo this live here. But Meta, tell me what the sign says.
I see some Japanese characters on the sign, but the text isn't clear enough to read accurately. It might be a warning or instruction. Try moving closer to the sign so I can get a better look at the details.
Could you hear that?
[Jon Bundy]
Yeah. Came through very well. It seems like Meta was being a little camera shy there for a minute.
[Adrian Sanabria]
Yeah. I've had a few problems just prompting it like that. Most of the time I don't have trouble.
But so one of the key things there is because I don't have a display, I don't know how things are framed in the camera. So maybe, you know, maybe half of this was cut off the way I was holding it. I don't know because I get absolutely no feedback.
It's not until I actually download the images and videos from this onto my phone that I actually see what I've taken a picture of. So adding a little display, I don't know if they're doing that now, but that might help you frame your shot and make sure you're getting everything you think you're getting into the shot.
But everything that this camera does is portrait mode because it's designed to be used on Instagram. So I can also ask Meta to just start sending stuff live to Instagram and it will start live streaming something to Instagram. So some of those capabilities are pretty advanced.
It's pretty good at like, hey, what's this tree? What's this plant? Can I eat this?
It's pretty good with the analysis. And I think it's a pretty interesting use case. Like, again, if you're in a country and there's some kind of statue and you want to know, like, why is this here?
You know, why is it so weird? What is this thing called? It does a pretty good job of doing that stuff.
So there are some pretty compelling use cases, I think, here. Also, I look like less of a tourist, you know, when I'm not walking around like this everywhere. I can avoid drawing attention to myself as a tourist and still leave with some pictures and videos that I can share.
[Jon Bundy]
Yeah. So you're right. They definitely do have a lot of capability and positive use cases.
Those AI assistant glasses, everything you use AI for, now it's just a little bit easier because you can just ask it. It's always there with you. It's working through your phone, connecting to a cloud-based AI.
And some of them, like you said, have cameras. Now they can do visual stuff. You can extend that to, you know, health-focused things.
For the hearing-impaired, you can have live translation using microphones to pick up conversations and either send it to the cloud for more accurate translations or have a local model for speed on your phone. And you get, you know, captions via speech-to-text right there in your display. It's less obvious than wearing hearing aids and, you know, might help with comfort levels.
And the same thing for visual-impaired. You can let those cameras in the lenses feed to an AI that can help identify objects or maybe move focus to the periphery if you have problems in your central field of view or do an edge detection in real time to help highlight areas of interest. So there's a lot of good use cases for these AI-based assistants.
[Adrian Sanabria]
Yeah, there's another one called Be My Eyes, to your point about accessibility. You could even have some vision issues or blindness, and this thing will try to navigate you around. I'm tempted to try this to see how well it works.
But, yeah, there are these always-on modes that you can use. Like, if you're building something and you need some guidance on, like, what tools to use, you know, how to put things together, you know, it can look at what you're doing, analyze what you're doing, give you some guidance. So on the one hand, it's a compelling form factor because I'm doing all this hands-free, right?
You know, I'm not having to, like, juggle this, hold this with one hand while I'm using a tool with the other hand. So that makes it a bit more compelling. But these, at least, are dependent on my smartphone, which some of the other ones you were talking about, I think where this gets a little bit more interesting, is where the entire computer is in the glasses and they're standalone.
[Jon Bundy]
Yeah, so if you switch more to those kind of full AR experiences, those are still split. Some of them will still use a camera for the heavy lifting. Some of them are exploring the idea of using a hardware puck that's kind of attached to help with the processing and battery power.
Yeah, the battery's just not going to be as long-lasting with all those sensors and constant display going on. Then there's some that are aiming to be just completely computer on your face, like you said. And you've got the whole thing.
Don't need your phone. They might have a SIM, cellular connection, do all that processing and give you that full experience and maybe replace your phone. You can send emails, can take calls, get your SMS, your text messages, have apps, like I said, the apps floating around.
And that's where Android XR that's upcoming is really going to probably shoot for something like that because they're developing an open ecosystem. So a lot of these glasses now, they have a proprietary OS. They might have a few apps for it, but you're really limited.
Now, when Android gets in with this, they're going to have the whole Android ecosystem available for apps, as well as now what they'll call spatially aware apps that will have some sort of location-based functionality that will derive from the sensors in the glasses.
[Adrian Sanabria]
Yeah, another thing I wanted to mention here is that I mentioned people starting to see these as a threat. I'm trying to clear all my notifications here so that you don't see everything going on on this phone.
[Jon Bundy]
Some of that PII, afraid of leaking some of that, are you?
[Adrian Sanabria]
But, you know, we now have an app that will let you know if somebody nearby is using a set of smart glasses. And it's called Nearby Glasses. I think it's only on the Google Play Store.
And basically, it's just looking for, I guess, Bluetooth packets that match certain manufacturers of known smart glasses. And sure enough, I installed this this morning. And as soon as I turned my glasses on and had the app running, it did tell me, hey, somebody with Meta glasses is near you.
[Jon Bundy]
And so, yeah, like you called it, they're using BLE advertising. I don't know if we've talked about that before, but a quick refresher. Bluetooth Low Energy is what BLE stands for.
A lot of these glasses use it. And the reason why, it's right there in the name, Low Energy. Really helpful for battery life.
All of our phones have it. Most of our headphones, earphones, earbuds will use it, keyboards. And it's a huge battery saving because the radio spends most of its time sleeping.
So you've got Bluetooth Low Energy. One of the big changes they made from Bluetooth Classic was in Bluetooth Low Energy, the peripherals will advertise. And it's just like it sounds.
They kind of shout out, here I am. Here's what I am. Here's what I do.
Whatever it is. And so these glasses, when they're on, will advertise their presence. It's not clear to me how long they'll continue to advertise.
Usually once a device connects, so it sees, oh, there's the phone. Let's make the connection. I'm connected to the app.
They'll stop advertising because the advertising job is done. The advertising purpose is to tell the other devices, which are called centrals, that the peripheral is nearby and what state it's in, if it's ready to connect or not or whatever. So these advertisements will often have some identifying information so you know what the device is, so you know if you want to connect to it or not.
And in this case, what the app does is it's going to look for certain unique identifiers that are assigned to manufacturers. And I think the developer just took a list of manufacturers that may make smart glasses and said, well, if I see one of these manufacturers, let's just pop a notification. So it might throw some false positives, especially given that it's in its early days.
But it's certainly better than nothing, as you saw it work properly with those Meta glasses. So that might be something useful to people.
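The detection approach Jon describes can be sketched in a few lines of Python. This is a hypothetical illustration, not the Nearby Glasses app's actual code: it parses the AD structures in a raw BLE advertising payload, pulls the 16-bit company identifier out of any Manufacturer Specific Data field (AD type 0xFF), and checks it against a watchlist. The company ID and vendor name in `WATCHLIST` are made-up placeholders, not Meta's real Bluetooth SIG assignment.

```python
# BLE advertising data is a sequence of AD structures: [length][type][data...],
# where the length byte covers the type byte plus the data.
MANUFACTURER_SPECIFIC_DATA = 0xFF  # AD type defined in the Bluetooth Core Spec

# Hypothetical watchlist of {company_id: vendor}. A real detector would use
# the Bluetooth SIG assigned-numbers list for known smart-glasses vendors.
WATCHLIST = {0x1234: "ExampleGlassCo"}

def company_ids(adv_payload: bytes):
    """Yield every 16-bit company ID found in an advertising payload."""
    i = 0
    while i < len(adv_payload):
        length = adv_payload[i]
        if length == 0:  # a zero-length structure terminates the payload
            break
        ad_type = adv_payload[i + 1]
        data = adv_payload[i + 2 : i + 1 + length]
        if ad_type == MANUFACTURER_SPECIFIC_DATA and len(data) >= 2:
            # The first two data bytes are the company ID, little-endian.
            yield int.from_bytes(data[:2], "little")
        i += 1 + length

def detect_glasses(adv_payload: bytes):
    """Return watchlisted vendor names seen in this advertisement."""
    return [WATCHLIST[cid] for cid in company_ids(adv_payload) if cid in WATCHLIST]

# Example payload: a flags structure, then manufacturer data for company 0x1234.
payload = bytes([0x02, 0x01, 0x06,                      # flags AD structure
                 0x05, 0xFF, 0x34, 0x12, 0xAB, 0xCD])   # mfr data, ID 0x1234
print(detect_glasses(payload))  # -> ['ExampleGlassCo']
```

As discussed above, this only works while the peripheral is actually advertising; once the glasses connect to their phone and stop advertising, there is nothing for a scanner to parse.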
[Adrian Sanabria]
Yeah, and to your point, it's been about four minutes and it hasn't notified again. So I wonder if it does stop notifying as soon as these things are connected because it was sending. You know, you can see it does have a debug log here.
I think I found an angle where you can actually read it. But yeah, I stopped seeing those and they are still on. So I wonder if I ask.
Hey, Meta, what am I looking at? I don't know if you can hear any of that.
[Jon Bundy]
Only mumbles. Yeah.
[Adrian Sanabria]
OK, yeah, I should have held up the mic. It didn't notify me anymore. So, yeah, I wonder if this is kind of missing a feature there where...
[Jon Bundy]
Well, it's not missing a feature so much as Bluetooth is working as designed.
[Adrian Sanabria]
It's a limitation, right?
[Jon Bundy]
Yeah, it's a feature. So, I mean, that's how it's supposed to work: the glasses advertise.
[Adrian Sanabria]
I mean, a limitation of the app. Yeah. Well, yeah.
[Jon Bundy]
I don't know if there's a way to really solve that, based on the way Bluetooth Low Energy works. But so right now, the phone that your glasses are connected to, the screen's off, right? Yeah.
Screen's off. Yeah. So the glasses are still connected to that phone.
They've stopped advertising, so the app that detects them can no longer see the advertisements. They're not there, and it doesn't work. So try prompting it.
[Adrian Sanabria]
It might cause some additional activity, but I didn't see anything. I still haven't seen anything for the last five minutes.
[Jon Bundy]
Yeah. Once they're connected, the glasses will probably stop advertising. That's what I'd seen in the past.
Now, some devices will keep advertising even when they're connected. Notably, headphones and earbuds that are meant to connect to two phones at the same time. They need to make their presence known to the second phone that might want to connect.
So they'll keep advertising that they're there.
[Adrian Sanabria]
So apparently this Nearby Glasses app needs the Bluetooth device to be advertising itself, which doesn't happen the whole time that the device is on and paired and connected to a phone. Apparently, maybe it just happens when it's first started. Unless it's one of the devices you mentioned that has a feature where it can connect to multiple devices.
So it just has to do it continually, like my Sony Mark 4s, you know. A bit big. You have a pair of them there. Yeah.
So those can connect to devices at the same time.
[Jon Bundy]
Two devices at the same time. They'll connect to one device. They'll keep advertising.
So the second device knows that they're available to be connected to. Exactly right. But from what I've seen with the limited glasses that I've tested, once those glasses do connect to the phone, advertising is done, and that's just by design.
So that detector will have some usefulness, but limited. I expect most of the time people are using these glasses, they will want to be connected to their phones so they can use the AI.
Right. That's a lot of the purpose. But you could still walk around without your phone and take pictures and videos.
[Adrian Sanabria]
Yeah. Yeah. So a few other threats that we didn't mention.
There were some students from, I believe it was Harvard, that modified their Meta glasses to pair them with this search tool called PimEyes. And if you give it a photo of someone, it has an index of all photos from the Internet.
I guess they scoured the whole Internet and pulled down all that stuff, and they can now match faces. So they do facial recognition, match it to everything that's in their database. So it's kind of like a reverse Web search from a face.
So you can basically dox somebody in a lot of cases, like if you're online, if you're on LinkedIn or something like that. You know, these reverse face searches can very quickly come up with details on you. And they just automated this entire process.
And a couple of years later, Meta just recently announced that they are going to have first party facial recognition support on their glasses. But only for people in your contact list, apparently. Which to me, the use case is I'm at a conference.
I don't remember this person's name because I haven't talked to him in a year. Maybe they're in my contact list. Maybe they're not.
But like, I see the use case there. But I feel like that's a step in the wrong direction if you don't want to get these devices banned in like schools, government buildings. All of a sudden, I think we're at risk of seeing signs like no guns and no Meta glasses allowed on the front door.
[Jon Bundy]
And in fact, the U.S. Air Force did ban smart glasses in January from uniformed personnel. They can't wear any glasses that have microphones, cameras or AI. Well, that's a smart glass.
That's your AI assistant. That's that's what we're talking about. So they've recognized the threat and acted on it, which is probably the right thing to do.
[Adrian Sanabria]
So wrapping up here, you know, I think it's time for just our conclusions here. Our threat score. I didn't think smart watches were a huge threat.
I think I gave that a three. I think these are going to be a lot higher on my list, although I love mine. Most of the time I just use mine to listen to audio books while I do chores.
Like I just prefer to have the speakers up here than something in my my ears. Things fall out of my ears. They're they get sore after a while.
You know, it's much more comfortable for me to have them up there for me. You know, I don't think the way I use them is is a threat to other people's private privacy or safety. But I think the potential for abuse there is going to give it maybe a seven out of 10 for me.
[Jon Bundy]
Yeah. And I mean, we could spend a long time talking about the threats. We've barely scratched the surface of those.
I mean, to businesses, I think it's obvious. These are covert surveillance devices. Almost everyone's got a no-phone policy.
Well, when someone walks in with these, they have a microphone and a camera. Like, would you allow a video camera in there? If not, maybe your policy should block these as well.
Exactly. A lot of the recent security threats involving espionage have been using phones to take pictures of sensitive documents. Well, now you come up with your glasses, you take a video, you just scroll through them and there it is.
So there's a lot of threat, obvious threat, to business. There's breaking corporate policies, privacy laws like GDPR, two-party consent, one-party consent. There's just so many things.
PII leakage. So there's a lot of threats and risks to a business, I'd say, for certain models. The ones with cameras, certainly microphones, a little bit less so.
But, yeah, I'd score them pretty high on a threat to business. Probably seven, eight. Yeah.
Now, the nice thing is, to address that, there are some industrial and enterprise versions of these that will remove the camera, remove the microphone and just provide a heads-up display, or connect to a local AI so you're not going to the cloud. So there are workarounds, and I think they're fabulous tools for field technicians, for engineers that want to see overlays and schematics and keep their hands free, like you said, checklists. There's a lot of great use cases in business.
And I think there are models that address it that will be better used. As far as personal threats, we didn't talk about that much, but like smartwatches, you have to remember these are connected to a cloud AI. There's going to be all of your usual threats against that AI. There's going to be threats of your AI data leaking out, or other people's AI data leaking out. So the privacy concerns there, you know, it could be a threat surface as well.
I see in the future with all the data these are collecting, some of these are meant to be your own digital memory. They even advertise themselves as your digital memory. Remember everything about your life.
Who was that person you met? It's all kept in there.
That's a really juicy target for hackers. So I worry about that in the future. They might be more targeted.
But even without the threat of a malicious actor, you know, just by using it, you're sending data about your personal life to the cloud, where it might be shared, it might be leaked, probably go to advertisers, and all the usual things with cloud-based solutions you should be concerned about. So I think there's some risk to consumers as well.
And I'd find them a threat to consumers in that way. Maybe not as high, but I'm probably more privacy-concerned than most, and it's something I probably wouldn't use. So it's kind of middle of the road there for me on threats to consumers.
[Adrian Sanabria]
Yes, certainly. I think awareness is going to be a big thing. You know, we saw a similar pushback with smartphones when they first came out because it put phones in everyone's pockets all the time.
So, yeah, this is, I think, another adjustment period.
[Jon Bundy]
Yeah. And they're going to see the sales are great. The market is booming.
There's a lot of new entries. And I think the market, they're still trying to figure out what's allowable, what's not allowable. I think there's going to be policy changes as a result.
We've certainly already seen it from the U.S. Air Force saying, well, for now, no. So it's going to be interesting to see how this is going to go. And again, there's a lot of really useful use cases, the health-focused ones, the assistive ones, even just personal organization.
So it's there. It's just understand the risk. And that's what we're trying to kind of talk about is just understand what risk there is there.
For me, these aren't going to be security-first devices. Right. Yeah.
So there's always a risk of your data. And there's a lot of data being collected. You just videotaped me a little earlier, recorded me.
[Adrian Sanabria]
Yeah. Yeah. And easy to forget that you did it, too.
Right. Like it's not even been downloaded to my phone, you know, like a week from now. I'll grab that and delete it.
[Jon Bundy]
Hopefully it's not on Instagram already.
[Adrian Sanabria]
Yeah. It could be if I've misconfigured something. Absolutely.
[Jon Bundy]
What happens if I yell, hey, Meta, you know, record this to Instagram. If I yell it loud enough, are your glasses going to do it?
[Adrian Sanabria]
Yeah. If I had speakers on and these weren't in, it would be interesting to see how easy they could be triggered.
[Jon Bundy]
So, yeah. Interesting devices. Lots of great use cases.
Definitely risks and threats involved. And just be aware of those to your business and to yourself.
[Adrian Sanabria]
All right. And with that, we're going to wrap up this episode. Thanks again, Jon, for joining me today.
Very interesting conversation. Glad we got to have this one.
[Jon Bundy]
It was my pleasure. Thanks for being here, too. It was fun to talk about it.
[Adrian Sanabria]
And big thanks to Bastille for sponsoring this series. You can check out Bastille.net/blog for more information on wireless threats. And don't forget to leave a comment with what you'd like to see us discuss next.
Until then, see you later.
[Anthony Jimenez]
Thanks for listening. And thank you to our guests, Adrian Sanabria and Jon Bundy. Don't forget to like, comment and subscribe to Carahcast, and be sure to listen to our other discussions.
If you'd like more information on how Bastille can assist your organization, please visit www.Carahsoft.com or email us at bastille@carahsoft.com. Thanks again for listening, and have a great day.