Fractals of Change

Exposure to Discernment

Mary Schaub Season 2 Episode 24


Duration: 1:00:34

Mary Schaub speaks with Alec Harris about how privacy, security, and personal risk have evolved in a world shaped by data brokers, algorithmic profiling, AI, and digital surveillance. Rather than focusing only on hackers and technical systems, they explore a more fundamental question: what are we protecting—and how visible have we become without fully realizing it?

They examine why privacy is not about “having something to hide,” but about autonomy, safety, and discernment. Along the way, they unpack metadata, device fingerprinting, dynamic pricing, reputational risk, AI’s double edge, and the growing gap between convenience and control.

This is not a conversation about disappearing. It’s about calibration—understanding where the risks are, what matters most, and how to make more conscious tradeoffs in a world that increasingly rewards exposure.


Topics

  • Privacy vs. convenience in the digital age
  • The “human attack surface” in cybersecurity
  • Why metadata can be more revealing than content
  • Data brokers and the commercial data economy
  • Device fingerprinting and cross-platform tracking
  • Dynamic pricing and behavioral targeting
  • AI as both a defensive and offensive tool
  • Public exposure, doxing, and reputational risk
  • Practical digital security and privacy hygiene
  • Balancing visibility with protection


Memorable Quotes

  • “It’s less about who we have to protect ourselves from, and more about what we have to protect.”
  • “You don’t have to be the fastest antelope. You just can’t be the slowest.”
  • “The metadata around how we behave is often more revealing than what we say.”
  • “You can have a public persona and still protect the crown jewels.”
  • “It’s not about hiding. It’s about discernment and balance.”
  • “Doing a little bit for a long time is a fantastic place to land.”


Resources

  • How to Disappear — Benjamin Wallace, The Atlantic (on modern privacy and anonymity)
  • The Age of Surveillance Capitalism — Shoshana Zuboff (data and behavioral prediction)
  • Cambridge Analytica case (data-driven political targeting)
  • GDPR (EU privacy and data protection law)
  • Real-Time Bidding (RTB) (how user data is auctioned for ads)
  • Device fingerprinting (tracking users across platforms)
  • Managed attribution (how metadata reveals identity)
  • Don’t F**k With Cats (online investigation and digital tracing)


Keywords  

privacy and security podcast, digital privacy, personal cybersecurity, metadata privacy, online privacy, data brokers, surveillance capitalism, AI and privacy, digital security tips, reputational risk online, device fingerprinting, real-time bidding data, how to protect your privacy online, why privacy matters, personal security in the digital age, social media privacy risks, cybersecurity for individuals, doxing and online harassment, AI security risks, privacy vs convenience

Disclaimer:

***The information, opinions, and recommendations presented in this Podcast are for general information only and any reliance on the information provided in this Podcast is done at your own risk. This Podcast should not be considered professional advice.***

Credits: Written, produced and hosted by: Mary Schaub. Theme song written by: Mary Schaub

Contact: FractalsofChange@outlook.com  

Website: M. Schaub Advisory (MSA)

Understanding Risk in Personal Contexts

SPEAKER_01

Today I sat down with Alec Harris, CEO of Haven X and a leading voice in privacy and security, known for his work at the intersection of digital systems and human behavior. His work focuses on a simple but increasingly urgent question: what does it mean to protect ourselves in a world designed to expose us? Alec approaches this not just as a technologist, but as a practitioner, as someone who understands that the real attack surface isn't just our systems, it's us. You'll notice his video is intentionally darkened. That's not a stylistic choice; it's a reflection of the principles that he lives by. In a conversation about privacy, Alec practices what he teaches. In this episode, we explore the growing sophistication of tracking and evaluation technologies and the quieter skill required to navigate them: discernment. Not withdrawal, not paranoia, but balance. The ability to engage with modern systems without being unconsciously shaped by them. Here's where we begin. For most of human history, danger was visible. You could see the enemy, you could see the weapon. But today it's different. When most people think about cybersecurity, they imagine hackers and machines attacking machines. But increasingly, the real battlefield isn't technical, it's human. We're living in a world where CEOs are being targeted in the streets, companies are engineering our attention to shape behavior, and sophisticated actors can track down personal information with intelligence-agency precision. As a species, we've built incredibly powerful systems, but we're still learning how to live with them. So in this environment, who are we actually protecting ourselves from?

SPEAKER_00

Wow, what a great opener.

So the way we, or at least I, would segment risk is less on who the adversaries are and more on who the individual is. Right. And so, you know, me with an average suburban, kind of pedestrian lifestyle, I have different slices of risk than someone who's the CEO of a publicly traded company. And so the answer to who are we protecting ourselves from depends a little bit on what do we have to lose, or what does the outside world, what does an adversary think they could gain from targeting us. And targeting, right, could be digital targeting, it could be cyber targeting, it could be physical targeting, it could be reputational, right? It could be espionage, commercial intelligence. And so, you know, it's easy to think of, and we've all seen this photo of kind of the hacker with the cloaked mask, and then maybe the Guy Fawkes face mask. And that's an easy one. But really, I would say, as opposed to who do we have to protect ourselves from, it's what do we have to protect. And then from there we can build a framework about who we have to be concerned about. And a very simple and tangible example would be: someone who owns bearer assets, whether it's diamonds or stock certificates or Bitcoin or gold or whatever, has a very different risk calculation about their home security than someone who keeps everything in the bank and at Vanguard and in a safe-deposit box at the bank, right? And so that's just an example of how you might have two people who are similar who would bifurcate that risk very differently.

SPEAKER_01

That's fascinating. That's a really great way to look at it. I was imagining Mr. Robot as you were talking about that. That was a great show.

SPEAKER_00

Yeah.

SPEAKER_01

I was having dinner, this is very timely, with some friends a couple weeks ago, and the topic of privacy and security came up. And someone at the table said, I'm happy to trade privacy for convenience because I have nothing to hide. How do you respond when someone says something like this? And at an existential level, like, why does privacy matter?

The Importance of Privacy in a Digital Age

Preparedness and the Human Condition

SPEAKER_00

Yeah, so I would say more than half of the people that I meet would have that worldview. And it's not necessarily wrong, because the chances of any one individual being targeted are somewhat slim on the whole, right? But it can happen. And the reason that someone who thinks they might not have something to hide or to protect might be lulled into a false sense of security is there are countless examples, vignette after vignette, of people who didn't realize that they were going to be subject to scrutiny, or the ire of the public, or to some kind of security incident. And by the time that occurs, it's way too late to go back and put things in a box. And I'll just give you two super commonly understood examples. One was last summer. Remember the CEO and his head of HR, right? So those two people probably would have said exactly that 15 minutes before they became the most recognized people in the country. And then all of a sudden, I bet they're getting text messages and emails and threats, and who knows who drove by the house and figured out where their kids go to school. And I bet, you know, it was a very uncomfortable period for the two of them. Now, the world moved on and they're probably not experiencing that anymore. But those are two people that didn't expect the spotlight and suddenly were in it. And one a little bit more near and dear, that I saw firsthand, is, you know, five or so years ago, when the whole Reddit GameStop, hedge-funds-versus-retail story was going on. People were maybe aware that the people who ran those hedge funds became household names for a little while. And there were lots of retail people, people who were just buying a few shares at a time, who felt like they were the victim of, you know, the pressure that the big boys were putting on the market. Yeah. So the guys that were the billionaires that ran those hedge funds, they took some heat.
But what people don't realize is that rank-and-file employees in those hedge funds, you know, the finance level-two analysts, the executive assistants, those people got it too.

SPEAKER_01

Wow.

SPEAKER_00

Because people just went on LinkedIn and they're like, yeah, okay, here's the boss, but that guy has executive protection. He's got a guard in front of his house, right? He's probably got the chief information security officer guarding his digital perimeter. I'm gonna just go after this mid-level person, who maybe had nothing to do with any of these decisions before they got hired at these hedge funds. And so those would be two examples of people that I would say rightly probably thought that they didn't have anything to hide and, you know, maybe didn't need to think about privacy, and all of a sudden were thrown into the spotlight.

The Luxury of Extreme Privacy

SPEAKER_01

On the topic of not realizing what you need until it's too late: I'm thinking back to 2001, when I was much younger and living alone in New Jersey. And of course, when you're young and carefree, you're living with few supplies, right? I used credit cards for everything. I probably walked around with less than $20 on me at any point in time. And then 9/11 happened and everything shut down, and I had no cash, very little food. And it was the first time I realized how fragile the systems around me actually were. And my dad, who was a firefighter and actually a member of FEMA, took this opportunity to lecture me about not being prepared, having cash and non-perishables, water, even a go bag. And in the aftermath, I did all those things. I was very, very vigilant. But then as time passed, I sort of loosened up. I guess it's sort of a blissful optimism. And I was thinking that maybe on some level being prepared makes sense intellectually, but it also means confronting scary realities that we don't like to think about. And I'm wondering if there is resistance to thinking about security and privacy, like the example we just talked about. Is underlying that a discomfort with acknowledging vulnerability?

SPEAKER_00

So let's take something somewhat timely. We have nation-state, peer, and near-peer adversaries that we know, anyone who reads the news knows, have taken an interest in our power and utility grids and their operations from a cyber, at least pre-positioning, standpoint. This is well documented and has been announced by various parts of the government, including the FBI. So we all know this, yet everyone didn't go run out and buy a generator. And so somewhere along the way, people either aren't aware of this because it's not in their news digest, or they saw it and they thought: not enough of an inherent or immediate risk to me and my family, or I'll let the power company deal with that, or it's probably not gonna happen. And so there's maybe some willful ignorance, or bliss, in that possibility. And I think the other part of it, though, is very pragmatic: everyone's very busy, and you're doing the core blocking and tackling of work, and then you've got to pick up the kids and get the groceries. And now you're saying, I need a generator, right? And so I have a lot of compassion for people who, you know, maybe aren't averse to security, but they just have a lot of other things to manage, and buying soccer cleats is more immediate.

SPEAKER_01

Absolutely. It leads me to think about this level of attentiveness to security and privacy. You were featured in Benjamin Wallace's piece in The Atlantic last May, titled How to Disappear. And it raised something I think connects to this, which is the idea that extreme privacy is becoming sort of a luxury service. On the other end of the spectrum, there has always been a group of people who have been hyper-attentive to these types of things. They used to be called preppers or, you know, extreme thinkers. And I'm wondering now, based on what I'm reading in the news day in and day out, if actually all of this is starting to become a more rational survival strategy for complicated times.

The Shift from Government to Private Sector Security

SPEAKER_00

Maybe. I mean, I have a bit of that in me too, but that won't surprise anyone. So let me answer in two parts. Yes, extreme privacy to the nth degree is probably a luxury good, or you can do it yourself, but you've got to be a hobbyist, right? You would never do it in passing. So that's true. However, what I tell people, and I firmly believe this, and I've seen this to be true, is, and we've all heard this saying, right: you don't have to be the fastest antelope, you just can't be the slowest. And there's another version of that that I heard from a buddy of mine who's in the Navy, which is: you make yourself abort-mission criteria. There are adversaries out there, many of them are criminal and they're opportunistic, and they're looking around, they're looking up and down the street, proverbially, maybe actually the street, or digitally, or cyber, and they're checking out which house is the best and which has the security alarm and which has the cameras and which has the gate and which has the dog. And so some of those houses are going to be abort-mission criteria, because they appear to have taken some effort to implement security measures. The same would be true around privacy. You don't have to come along with me and fully disappear. But if you're hard to find and it's a little bit ambiguous where you live, the adversary is like, ah, I could either do three hours more work on this person or I could go to the next person. And I understand there's an inherent callousness to that, because it just means it's someone else's problem. But realistically, if we're thinking tribally, right, can I protect me and my family, my colleagues, my block, whatever that is, it might be the best you can do. So that's one part of it. But then, you know, is this sort of paranoid thinking becoming more mainstream or de rigueur or something?
I think what we see is trends along the lines of security measures becoming commonplace. When I grew up, I don't remember anyone having security cameras at their houses. Some people had alarm systems, and even that was like, oh, wow. But you locked your doors. And I think when my parents grew up, you didn't even really lock your doors. Maybe you kept the door shut at night. I don't know. Right. And so what we would consider baseline security, and driving down your block or my block, I'm sure more than half the houses have some kind of home security in place, would have been abnormally paranoid not that long ago, in the same lifetime. And so the trend seems to be toward whatever the conventional minimum level of security is, and people will gravitate to that, but that minimum seems to be escalating. It's not like people are letting go of security measures. That's my view on it.

SPEAKER_01

That sounds fair. And I grew up in the suburbs of New Jersey, on a dead-end street, and during the day, certainly, doors were unlocked. It's interesting: out where I live, nine times out of ten, when a car is stolen, someone left their key fob in the dashboard. And so, to your point, a lot of crime is just going to be either opportunistic or sort of easy. And maybe the ones where it isn't, there's something else going on, and then there's some percentage where it's: this really sucked, this was really bad luck for you. Unlike most guests, it wasn't possible to do deep background research on your world, because most of that has been redacted. But what I found publicly is that you founded Halo Privacy, focused on government-level security, and then launched Haven X to serve the private sector. I'm curious what brought you to that decision. Was there a moment or a pattern when it became clear that this level of protection wasn't just a state-level concern anymore?

Navigating the Commercial Data Market

SPEAKER_00

Great question. And just to clarify, to give credit where it's due: there were actually two founders that really kicked things off, Mark and Lance. And I was lucky enough to know them and join up with them right out of the gate. But they certainly deserve the credit for getting things off the ground and continuing to provide the leadership. It came as no surprise to anyone that knew me that working in a privacy domain would be fitting. My mom, from when I was a young kid, would always say that I was very reserved with information, very private, comfortable on my own. Certainly not a loner, because I love people, but, you know, I'd be okay not oversharing. So there are some inherent personality traits. And I got to know the guy who is our CTO and founder on the technical side, a cryptographer and a great American, some number of years ago. And he really helped me understand the connection between privacy and liberty. Those are his two things. I hadn't really extrapolated it out into why privacy might be important for freedom of speech and the ability to say what you want to say. Yes, gratefully, in America we can say what we want to say, but also to do so in an uncensored way, where technology can't be layered on top of those speech rights. Cryptography is a core pillar of that, right? Because if you can have private, encrypted conversations with people, then you can speak freely in those. And so he really opened my eyes to what it meant at this more philosophical level, beyond just personal privacy. And so Halo got started, as you mentioned, and still to this day it's much more facing the US government, much more around cryptography, much more around secure communications, and then around something that I would now think of as a building block of everything we do, but I learned about it early on, which is called managed attribution. And this is a really fascinating component of privacy.
What it basically would say is: if I have an encrypted call with my cardiologist once a week, every week for eight weeks, you don't need to break into that call to have, like, 80% context on what's going on. Obviously, I have a heart problem. And so the attribution there, just the link between me and the cardiologist and the regularity of that call, and that's metadata, right? The time of the call, the length, the counterparties of the call, is sufficient information to glean almost everything you would need to know about that communication. And so, yeah, maybe an adversary couldn't break into the conversation, but they got the information they needed. Well, take that and project it out into a much more critical scenario where someone is, you know, working with someone in a hostile foreign country, and maybe just the fact that they're communicating from America to this other country is damning enough to get them thrown in jail, right? So in those cases, you need to actually manage the attribution of how that communication occurs, and then you layer on cryptography. And that way, Mary, you and I can have a conversation without creating a link between us. So no one knows we're planning on doing this podcast together, and they wouldn't know the topics that we're discussing for the podcast, right? And so that's been super interesting. And the parallel to that in kind of all of our lives is what we think of as the social graph, which is our connections from our contact book and our social media connections and who we email and who we call and who we're connected to on LinkedIn, and all of these things that paint a picture of who we are and what our interests are and who our inner circle is versus our, you know, concentric rings of contacts. And we all remember Cambridge Analytica, right? Which was the company that was doing data extraction on top of Facebook to influence elections.
Well, what's very, very telling about that is: okay, so Facebook, huge black eye; Cambridge Analytica, completely tarnished reputation. And they were using these, we remember them from Facebook, it's like, oh, fill out this survey and tell us, if you were a cloud, what kind of cloud would you be, right? It was all hokey. So they were directly eliciting information from the subjects. Well, what they learned over the same period of time at Facebook and other social media platforms was that the metadata of your activities on social media is actually more valuable and more authentically true than what you self-report. And think about it: humans always have these cognitive biases that we bring to conversations, and to how we want to be seen, certainly on social media. So Facebook kind of suffered, and Cambridge Analytica suffered, but they were really okay with it, because the metadata around how we click, how often we click, what time, how many messages we send, you know, the type of words that we use, that was actually more indicative of mental state, or predictive of, you know, socioeconomic status or spending or voting information. And so they were like, oh, okay, we're just gonna do metadata and it'll be anonymized. And they were fine with that, because it was better information.
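The managed-attribution point, that call metadata alone reveals context, can be sketched in a few lines of Python. The records and field names here are invented for illustration; the logic just surfaces the pattern Alec describes, a recurring counterparty that gives away the story without any content:

```python
from collections import Counter
from datetime import datetime

# Invented call-metadata records: no message content, just who, when, how long.
calls = [
    {"to": "cardiology-clinic", "when": datetime(2024, 5, 7, 9, 0), "minutes": 20},
    {"to": "cardiology-clinic", "when": datetime(2024, 5, 14, 9, 0), "minutes": 25},
    {"to": "cardiology-clinic", "when": datetime(2024, 5, 21, 9, 0), "minutes": 20},
    {"to": "pizza-place", "when": datetime(2024, 5, 16, 18, 30), "minutes": 2},
]

def recurring_contacts(calls, min_calls=3):
    """Counterparties contacted repeatedly: the attribution leak itself."""
    counts = Counter(c["to"] for c in calls)
    return sorted(to for to, n in counts.items() if n >= min_calls)

print(recurring_contacts(calls))  # ['cardiology-clinic']
```

The encrypted content never enters into it; the weekly link to the cardiologist is enough.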

SPEAKER_01

I received an email from, I think it was ZoomInfo, but it wasn't from Zoom, it was a third party. And they were giving me the opportunity to opt out. And this really frustrates me. It's different than in Europe, right? In Europe, you need to opt in, and in the US, you have to opt out, which is, you know, a real policy change that needs to happen. But they were giving me an opportunity to opt out of my data on my business Zoom account being sent to this third party. And so, just as you're saying, the emails, the meeting subjects, and now in the advanced Zoom application, you can put agendas in there, you can put meeting notes in there, you can put attachments. And unbeknownst to me, and I'm paying for this business version of Zoom, all of that's going to this third party and I have to opt out. Good thing I caught it. And in many cases, you know, where are we not even being notified that that's even happening? Wow.

The Implications of Data Breaches

SPEAKER_00

Yeah. Just to add a little bit to that: this opt-out function is something I know very well, because we work in that space. And you're right, Europe, GDPR, they have more regulations around this. And so ostensibly, your baseline there is actually better than it is here. In the US, we have more optionality. So we have more latitude; like, it would be very hard for me to live my lifestyle in Europe, I couldn't actually do some of it. But the average European has more privacy than the average American. This is true. In the US, even when you opt out, and you have to really dive deep into the terms of service, that opt-out is actually only valid for one year. And then they're allowed to repopulate the data if a new data event occurs. So let's say I opt out of American Express's data-selling regime. Well, 12 months later, they can put me back in anyway. But also, if I apply for a new card during those same 12 months, that's a new data event, so they would repopulate it as well. And so they've figured out a way to really minimize the efficacy of even the compliance regime that they're held to for data removal. The other version of this, and I've seen this firsthand because I go and request data from different platforms. You've probably seen it at the bottom of a website, it'll say, you know, request your data. So I do that. I've done it with data brokers, and you'll get a sort of half result. And what I mean is, they'll show you, okay, here's your email address and here are some associations with it. And then they'll say, but by the way, your data is also mixed in with a bunch of other data, and we can't actually extract it from the data lake because it's bound and, you know, hashed with a bunch of other information. And so here's the data we can show you, and then there's a bunch of other stuff that we can't do anything about. And so what they're saying is, go ahead and opt out, and we'll let you opt out of this part.
But we've actually retained it in such a way that, who knows if it's technically feasible or not, but they're saying that they couldn't possibly extract it. That, to me, is problematic.
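The opt-out lifecycle Alec describes can be written down as a small function. To be clear, this is a model of his characterization, not any broker's actual API: an opt-out lapses 12 months after it was filed, and a "new data event," like applying for a new card, lets the broker repopulate the record immediately.

```python
from datetime import date, timedelta

def opt_out_active(opt_out_date: date, today: date, new_data_event: bool = False) -> bool:
    """Hypothetical model of the opt-out behavior described in the episode:
    valid for 12 months from filing, voided at once by a new data event."""
    if new_data_event:
        return False
    return today < opt_out_date + timedelta(days=365)

print(opt_out_active(date(2024, 1, 1), date(2024, 6, 1)))  # True: within the year
print(opt_out_active(date(2024, 1, 1), date(2025, 2, 1)))  # False: window lapsed
```

Seen this way, the compliance obligation is a timer the broker simply waits out.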

SPEAKER_01

And is the assumption that when you signed up for that service, there was some sort of terms and conditions, a hundred pages that you signed off on, and that was in there?

SPEAKER_00

Yeah. So there's this great book called The Age of Surveillance Capitalism by Shoshana Zuboff. And she pulls a vignette into that book, and I'm gonna get this slightly wrong, but it'll be directionally correct: some academics researched how long it would take you to read all of the terms of service that an average person engages with in a year. And it would actually take 76 days of your year to read all the terms of service that you have to engage with to get through the year. So they know we're not reading it. And I've read some, but I don't read all of them either.

SPEAKER_01

I mean, even if we ran it through GPT or something, ultimately, like with our iPhones, are we just not gonna use that product? It's not like we really have much choice at this point. And speaking of choice: in the online world, even having a social footprint has become more important for livelihood and even professional belonging. So most people are off Facebook. I'm pretty much off Facebook and the Meta apps, and I want to talk about Meta apps in a minute, but LinkedIn is really important professionally. And recently I was asking GPT, it does this very clever thing, and I want to talk about AI too, because I'm very conflicted: I'm loving it, and I'm also kind of having, you know, flashbacks of early social media. But I took screenshots of my LinkedIn profile and asked it for improvements, or how to optimize it. And it told me, you know, you're not posting enough, you're not engaging enough. So, based on the algorithm, it could tell me where I was sort of existing in the LinkedIn universe. So it is legitimately important to be out there. But I read this week that AI can now identify supposedly anonymous social media users by correlating posts across platforms. So even if people want to have some footprint on LinkedIn, and then maybe, you know, like, I'll disclose: my cats have an Instagram account. I don't like to be on social media because I don't like how it makes me feel and how it takes my time. But every once in a while there's some fun stuff on there. So my cats have an account. But now it appears that it doesn't matter, all that clever stuff, setting up different Gmail accounts; there's some sort of meta view of Mary that knows exactly what all these things are. And so I'm probably toiling away thinking I'm being very clever, and it sounds like it doesn't really matter anymore.
So at some point, it feels like we need to figure out how to be visible but reduce the risks that come with it. How are you advising your clients on this?

SPEAKER_00

Yeah, so, like, you have found value in being discoverable to some extent, right? And it's actually professionally reassuring to people that, yeah, okay, he looks like he does this privacy stuff, but at least I can see he's a person, right? Like, there's something there. And so I too am on LinkedIn; we're connected on LinkedIn. And I will tell people, clients or otherwise: you can absolutely have a curated public persona that could be accretive to your professional life or personal endeavors, philanthropy, whatever it is, and you can separate that out from the crown jewels, which are the names of your kids, where they go to school, your home residence, your personal phone number, your email address, the things that don't actually need to be revealed in furtherance of your career. Those two things can coexist. Now, what you referred to, and I know exactly what you're talking about, because I saw the same study and I shared it with some of the technical people in our organization, I was like, can we do this too? There are certain forms of stylometry. It's a heuristic, it's not dispositive, but it's a heuristic saying, like, oh, the writing style here, or the posting cadence, or even the way that they use the keyboard, seems to be the same. And so we think that's the same person. That's been used as an investigative tool for many years, and it's been used to try to figure out who Satoshi Nakamoto was, right? Using these stylometry tools. But then there's another part of it, which is cross-app tracking. And so on your phone, when you're running your LinkedIn application, it can actually see down into the phone, into the operating system. It knows how many pixels are on the screen and what colors you're using and what language your keyboard's in. And it actually drills down even into the hardware, which you can't change.
And it'll tell you things about the actual physical handset. And so what LinkedIn sees is that. And then LinkedIn might be buying data from Zoom, and it'll see that same, it's called a fingerprint, it sees that same fingerprint. And even though maybe my Zoom account is under one email address and my LinkedIn is under another, it'll say with high certainty that's actually the same user. And so there's the stylometry, the heuristic, really educated-guess version, and then there's the fingerprinting, the technical assessment of the device. And between the two, it's really hard to hide.
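The fingerprinting idea can be sketched roughly like this. The attribute names are illustrative, and the hashing scheme is a simplification; real trackers combine dozens of signals (canvas rendering, installed fonts, audio stack, hardware model), but the linkage mechanism is the same:

```python
import hashlib
import json

def device_fingerprint(attrs: dict) -> str:
    """Collapse a set of device/browser attributes into one stable identifier.
    Illustrative only: serialize the attributes deterministically, then hash."""
    canonical = json.dumps(attrs, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# The same handset seen by two different apps, under two different accounts,
# yields the same fingerprint, so the accounts can be linked with no email at all.
handset = {
    "screen": "2556x1179",
    "keyboard_lang": "en-US",
    "hw_model": "Phone15,2",
    "timezone": "America/New_York",
}
seen_by_app_a = device_fingerprint(handset)
seen_by_app_b = device_fingerprint(dict(handset))
print(seen_by_app_a == seen_by_app_b)  # True
```

Because the hardware-derived attributes don't change when you switch accounts or browsers, the identifier survives exactly the kinds of tricks the conversation describes.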

SPEAKER_01

Wow. Wow. So I had something similar happen. I was looking for a new camera for this podcast. And I went on Amazon and I saw it, but then I wanted to go to the company's website. So I flicked over. When I went back to Amazon, the price went up. And so I opened up another browser and entered Amazon without an account. And again, I think I thought I was being clever, but to your point, it knew it was me doing that, either from my IP address or whatever. And I was using a browser, I wasn't using an app, because I had heard having the app itself has more danger. But it appears that if there's this cohesion of technology companies, in whatever way that might be formalized or not, if it's on your phone, if it's in the hardware, if it's in the browser that another company has, if those three or four different organizations are connected in some way, even in selling data, then it sounds like a lot of what I thought I was doing that was clever is really pointless.

SPEAKER_00

Well, yeah, I wouldn't say it's pointless. And you're talking about dynamic pricing, which I think is gonna be an issue that probably gets all the way into regulation, because it's predatory. What they're seeing is: look at Mary. She lives in this zip code that has an average income of X, and whatever other socioeconomic descriptors can be assigned to your browser session. And then they'll say, well, if she's looking at Delta and she's from that zip code, let's push economy plus and business class seats. And if she comes back again, let's add 20%, right? Because maybe she has discretionary funding to go on vacation, whether it's $800 or $1,200. And that for sure happens with the airlines. It seems to be happening elsewhere. But there's another side of this that's even more insidious, which is the marketplace where that data gets exchanged, called RTB, real-time bidding. What happens is, it's like a stock exchange for marketing information. So that browser session you've engaged in, it may have been served up by a Google search, let's just say. Google is then taking that session and bringing it into this RTB market, saying: here's what we know about this session and this IP address and this browser fingerprint. Who wants to buy an ad on this? And maybe it's United, right? Because they see that you're looking at Delta, and they go: oh, that's a high-value customer. Let's go get Mary. Well, when they do that, they're also buying little slices of data associated with the session, which can include the location of the device. And so, in the case of a mobile phone, if there are multiple real-time bidding sessions going on throughout the day, some of which are bound to device location, I could go into that real-time bidding market, even though I'm not looking to serve ads, and just buy the underlying geotemporal data of where that phone moves over time.
And I could use that to see where your phone goes. And if I know where your phone goes, I have a pretty good idea where you go. And this data is 100% available commercially. I would argue it's probably not the intended use of that data, but it's a byproduct of it for sure.
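
To make the geotemporal point concrete, here is a toy sketch of what a buyer in that market could do with a pile of bid-request records. The field names are invented for this sketch, not a real exchange's schema, though real bid requests do carry device identifiers and location objects:

```python
# Illustrative bid-request records scattered across a day of ad auctions.
bid_requests = [
    {"device": "fp-7a3c", "ts": "2024-05-01T08:05", "lat": 40.71, "lon": -74.00},
    {"device": "fp-9b11", "ts": "2024-05-01T08:30", "lat": 34.05, "lon": -118.24},
    {"device": "fp-7a3c", "ts": "2024-05-01T18:15", "lat": 40.69, "lon": -74.04},
    {"device": "fp-7a3c", "ts": "2024-05-01T12:40", "lat": 40.75, "lon": -73.98},
]

def location_trail(records, device):
    """Pull every sighting of one device and order the sightings by time."""
    hits = [r for r in records if r["device"] == device]
    hits.sort(key=lambda r: r["ts"])  # ISO timestamps sort lexicographically
    return [(r["ts"], r["lat"], r["lon"]) for r in hits]

# No single ad bid reveals much, but the sightings add up to a day of
# movement for one phone.
trail = location_trail(bid_requests, "fp-7a3c")
```

That reassembled trail is the "geotemporal data" being described: nobody needed to hack the phone, only to buy what the auction process leaks.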

SPEAKER_01

Wow. Okay, I've got a million things coming up for me right now. The first thing is about the woman who made that comment, "I'm happy to trade off convenience and privacy." This was the exact example I gave her. I said: so next time you have a wedding on the West Coast, and the airline can tell it's in San Francisco and you have to go, the flight costs are going to be different. Are you okay with that? That was the one thing that made it a little clearer to her. This week there was a report that federal agencies were able to track phone locations using exactly what you're describing, the advertising data. And I think this is what scares most of us: how much personal information exists in this commercial data market without our real awareness. We don't know how it's being hosted or who it's being sold to. What is the public misunderstanding about how this kind of ambient data capture is going to affect them, like this manipulation of pricing? But also, is it secure, and could it be sold to people who have a more harmful intent?

The Philosophical Dilemma of Technology and Privacy

SPEAKER_00

Yeah, 100%, it for sure is being sold that way. So I'll give you some metrics to try to frame this a little bit. This is data that's a couple of years old, but it's probably only been amplified since I saw it. If you take the gross revenue assigned to just the domestic data broker marketplace, it's somewhere north of $250 billion a year. And that's just the direct data broker industry, not counting the secondary markets and the resellers and the collectors. That is actually a larger number than the budget of the entire intelligence community in the United States. And everyone's like, oh, the NSA has the most exquisite targeting. Maybe they do, right? But really, the marketplace for commercially available open source data is much bigger than this more narrowly captured intelligence data that's out there. Okay, so if that's true: we took a look at one of the top 10 data brokers in the US, and they had profiles on a billion different individuals across the planet, 200 million phones in the United States. They're selling this data to nine of the top 10 banks, seven of the top 10 insurance companies, all the telcos, the AT&Ts and Verizons on down the list, credit card companies. And those companies are using that to make decisions around what's called alternative credit, which is: instead of using your FICO score, they're gonna say, oh, this person actually might have a decent FICO score, but we see all these less common or deviant metadata signatures for them, so they're actually more of a credit risk than we think. One of those is letting your cell phone battery die. That is considered an alternative credit risk, because it means you're not reliably charging your phone. And there are some people who might let their cell phone battery die for a whole host of reasons that have nothing to do with their creditworthiness.
So that data gets gobbled up, it's worth a lot of money, and it gets sold all over the place. However, this same data broker experienced a massive data breach, and those same billion records they had on people got stolen. Were they stolen by a nation state? By an organized criminal? By a lone hacker? I don't know if we know. So there's the problem that people can buy it, and there's the problem that the data wasn't really well secured in the first place, so it's out there either way. And I think that's problematic. And it's not like I'm not in those databases, too. It's impossible to live a life with a cell phone without being in those databases.
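
The alternative-credit logic can be caricatured in a few lines. Every signal name and weight below is invented; real brokers' scoring models are proprietary. The battery signal is the one from the conversation, and the sketch also shows why it misfires: the score has no idea *why* the battery died.

```python
def alternative_credit_adjustment(signals):
    """Toy score adjustment from behavioral metadata (invented weights)."""
    score = 0
    # Letting the phone battery die repeatedly is read, fairly or not,
    # as a sign of unreliability.
    score -= 5 * signals.get("battery_died_events_90d", 0)
    # Made-up examples of other "deviant" metadata signatures.
    score -= 10 * signals.get("sim_swaps_last_year", 0)
    score += 2 * signals.get("years_at_same_address", 0)
    return score

# Same FICO score, different metadata, different treatment.
reliable = alternative_credit_adjustment({"years_at_same_address": 10})
flagged = alternative_credit_adjustment({"battery_died_events_90d": 4,
                                         "sim_swaps_last_year": 1})
```

A night-shift worker whose phone dies on long commutes gets the same penalty as someone genuinely disorganized, which is the fairness problem with these proxies.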

SPEAKER_01

I was listening to a podcast this morning. They were bringing up a recent Sam Altman quote. Someone had asked him about the cost of data centers and their impact on the environment, and his response was something along the lines of: well, do you know how many resources a child takes up? It gets to the point where it can seem a little like domination and control versus optimization and convenience. About eight years ago, we had a leak in the apartment, and I was having a verbal conversation about needing to get new flooring. At the time we had an Alexa device in the apartment. Probably within hours, I was getting Home Depot flooring ads on Google, on Facebook, on everything. Alexa was gone the next day, and she hasn't been back. But now we have smart TVs, and I haven't even looked into what they're doing. So it really starts to become a little overwhelming. I don't know if you have a view on whether regulation is gonna start to wake up in this country. What does it take for this to become a top-of-the-ticket issue? Because it doesn't feel like there's a whole lot being said about this.

The Impact of AI on Mental Health and Privacy

SPEAKER_00

Yeah, this is a really good question. And I maybe have a slightly off-brand view on this. So, okay: Sam Altman, gifted entrepreneur, obviously; sometimes not a gifted communicator. He's made statements like that which I think are kind of fumbles. But if we give a charitable interpretation of what he's saying, I think it's: isn't AI absolutely critical, the way the human race is critical, the way the next generation is critical? And on that question, I completely agree. AI that is underpinned by the democratic values and principles we would have in the United States, in the West, is essential; it has to have a place in the technology stack of the future. So much so that I would actually rather see a largely unregulated or loosely regulated approach, with fewer guardrails but quicker growth, than an over-regulated, maybe more safety-oriented but laggard approach that allows a foreign nation's tech stack to dominate. I actually think it's more important that we win, candidly. That is completely contrary to the privacy stance I have on most things, because a less regulated, more data-consumptive version of AI is going to gobble up private information and use it for training or inference or whatever. That just shows you how important I think it is that we win on that. But the other side of this: if AI is so incredibly important and essential, then who gets to harness it? I think that question is being wrestled with right now. We have discussions around regulation at the state level. Should those be disbanded and managed at the federal level? Should tech companies just self-regulate and come up with their own regimes?
And doesn't that pull up the ladder for the smaller entities that maybe don't have the budget for big compliance teams and are more startup or venture-backed? I think there are far more questions than answers on that front. It's interesting to watch, and it's moving so quickly that I barely feel like I can keep tabs on it. Actually, I can't. I can't keep tabs on it, but I watch it with interest.

SPEAKER_01

Well, thank you for your candor. And I completely agree. We schedule like two hours a week just to watch the latest things that have happened in AI, and you can't keep up. It could be a full-time job just watching this stuff. The podcast this morning was talking about how to balance privacy around how people might be using this. For example, there was a very high-profile and tragic story about a man who had hurt himself because of AI, and another about a girl who had learned how to inflict harm on others using AI. And so the debate arose: well, there must be something in the programming that flags this and sends it up the chain and sends somebody to do a welfare check or something. And then that brought up the privacy questions around it. Ads, of course, are becoming a question. Will the recommendations be tarnished or skewed by the ads? So that's just one debate of many. But what's interesting, and I completely agree with you by the way, is that I'm very aware it could be sort of like what Google Maps has been to my personal navigation capabilities, going back to being 17. I think I drove to Florida and back in my 20s, by myself, with a paper map. And now if Google Maps doesn't respond quickly enough, I get very anxious: are you on? Is the volume on? So I can see how reliance on a tool like Google Maps has compromised my ease in navigating for myself. And I'm aware of that if I use it for things like writing too much. Editing is fine, spell checking is fine, being a thought partner is fine. But I don't want to lose these other muscles just because the tool exists. People are using it for things like medical diagnosis, personal decision making, emotional processing, even therapy and companionship. But at the same time, I'm assuming malicious actors have access to these same tools.
I'm curious where you see AI playing out in the future of privacy and individual security. Will it strengthen our discernment or could it erode it?

SPEAKER_00

Wow, yeah, I think the jury's still out. I actually don't know. It's one of those questions: is the good AI gonna save us from the bad AI?

SPEAKER_01

Yeah.

Digital Visibility and Social Behavior

SPEAKER_00

And I don't know. It certainly lowers the barrier to capabilities that would have required lots of training previously. The example would be: a bunch of people can code now who couldn't code six months ago, because of Claude Code. Okay, so that's great. People are building more apps and replacing expensive SaaS services or whatever, but some of those people are going to figure out how to code something nefarious. And so the question becomes: that person was probably a bad person to start with, but have we just made it super easy for them to amplify that instinct in a way that would have been inaccessible to them previously? Maybe. The stories about people who have spoken to AIs and then later committed suicide are pretty horrifying. At the same time, would those have been just Google searches six years ago? Fair. Maybe. I think the counterpoint is: well, yeah, but a Google search isn't gonna encourage you and talk back to you and guide you down this path. So I think it's fair to say AI is not the same as a Google search for rope. If I'm totally honest, I just don't know yet. It's greenfield. We just don't know.

SPEAKER_01

It feels like this is going to really impact your space. I can imagine it's literally day by day, seeing what arises and then figuring out how to respond. Just this last week there was a report that AI-powered web browsers can be tricked into phishing attacks within minutes, right? It tricks the AI assistant, not the human. It does feel like an arms race, the nuclear equivalent. And to your point earlier, which I agree with, it was Oppenheimer's retort: someone's gonna build this, so we need to build it. It feels like it's moving pretty rapidly. Speaking of moving rapidly, let me lighten it up and switch gears, go back to being more philosophical. A little palate cleanser. So: humans evolved in tribes where visibility meant safety, but digital networks seem to have inverted this. A lot of this visibility, and you mentioned some examples earlier, amplifies outrage, accelerates public shaming, and creates attacks that didn't exist before. I'm wondering what your thoughts are about how we evolve from here. I recently did an episode on recursion. I talk about patterns, universal patterns. So there's this idea of recursion, of feedback loops, where behavior that starts out as very bizarre and random, a sort of random data point, all of a sudden becomes a thing. Terms I'm just learning, like mobbing and doxing. Do you think it's the case that something happens, it becomes amplified through the system that we all have on 24/7, the algorithm pushes it around, and then maybe someone else does it, and then it starts to become normal? I don't know if you can speak a little to this behavior and how we operate as a society.

The New Privacy Threats in a Digital Age

SPEAKER_00

Yeah, I'm sure you're thinking some of the same stuff: digital relationships, digital interactions can be transactional, they can be disposable. It's easier to feel empowered behind a keyboard than you might in person. And you don't learn the same social skills and the cueing of nonverbal communication. If you want to be a bully in high school and it's 1985, you kind of gotta be a bully and live up to it, right? Now you can just hide behind a keyboard, or you can create a deepfake of someone that's non-consensual and maybe explicit. I feel fortunate to not have grown up with that experience. For sure. And yet, as a parent, I worry about it for these up-and-coming generations, and there certainly seems to be an anxiety that comes with being digitally native that's well documented. I love to meet people in person. I feel like one coffee is worth 10 Zoom calls, a phone call is better than a text, and a video call is better than a phone call. And I don't know if those are things that are gonna be valued in the future. I could see two versions. One is that they become more valuable, that there's an artisanal value to a human interaction; the other is that it just becomes an old-timey thing. Oh, you're meeting for coffee, how quaint, right?

SPEAKER_01

Yeah.

unknown

Right.

SPEAKER_00

My brother brings me on coffee. Yeah. I think humans are social beings, right? And I think we actually suffer in isolation; that's well documented. And so we are tribal, like you mentioned. I think this is sort of a trope, right? Many, many generations ago, the most people you could know was 50. You might know your tribe and the next tribe. Even 200 years ago, you'd live in a small town, and a big city was what we would now consider a neighborhood. So you could know everyone, and there was an inherent privacy to that, because there weren't that many people to tell, and there wasn't anyone who could see what was going on in the privacy of your home. Well, now when we say "in the privacy of our home," that's a misnomer entirely. The smart TV you mentioned is listening, your phone is listening, and who knows what kind of internet-of-things devices and cameras and microphones you have. So I don't even know if "in the privacy of your own home" is a thing anymore.

SPEAKER_01

You were recently quoted in the Daily Mail about a woman who had been involved with a married man who then tragically killed his entire family. She was exonerated, but it didn't matter, because strangers from around the world still tried to locate and confront her. This feels a little bit like people watching or listening to too many true crime podcasts and needing to find a hobby, but it does feel qualitatively different from traditional privacy threats. I'm just curious: is this becoming a new category of risk that people didn't imagine they needed protection from? To your point about the Coldplay concert at the beginning, these are people who had no idea this would be an issue. This was someone on the periphery, or maybe like some of those employees who were almost collateral damage in the process. Are you seeing more of this?

The Growing Demand for Privacy Professionals

SPEAKER_00

Yeah, it's sort of a cultural idiosyncrasy we have, at least in the US, where something will happen and it'll be in the zeitgeist, and then the sort of amateur analysts or sleuths of the internet will dig in. This has happened in sports, right? Where someone interferes with a baseball game, inadvertently catching a ball that maybe could have been caught otherwise. And then all of a sudden their picture comes up on ESPN, everyone figures out who that is, and the home team makes their life miserable for 72 hours. And I've actually seen this up close. Someone I know in my life, their brother is a notable athlete, not a household name, but notable. And he did something in a game; I forget the particulars. He started getting a lot of hate mail, hate texts, hate calls. But it wasn't just him. It was his sister, his mother, his father, his brother. And it actually persisted for quite a while. So this is another example: who would think that what their brother does on the football field would have privacy implications for them? The other side of that, though, is there's this sort of white-hat vigilantism, where people will find someone who is doing something nefarious and report them, or there's a form of doxing that is well intended. I don't know if you've ever seen the documentary. It's called Don't F with Cats.

SPEAKER_01

I've heard the premise. But please continue, please explain it.

SPEAKER_00

It's a horrible premise, right? It's implied in the title. But the actual story is about these good-guy, good-girl vigilante online sleuths who spend years tracking this bad guy around the world using the tiniest little clues in the backgrounds of photos. They figure out who he is and where he is, and they help the police catch him. And by the time they find him, he's escalated from animals to people. So it's a very dark story, but it's also a story about what you can do with persistence and a group of people willing to spend their own time trying to find someone. The takeaway is that the tiniest little clue could, in the right hands, be magnified into an identifying point on someone.

SPEAKER_01

Based on everything we've discussed, I would expect that there'd be more demand for this type of work. And we certainly know that many people are looking to reskill right now, given AI's impact on traditional employment. Do you see this becoming a larger professional path? And how does someone even enter a field that requires staying out of the spotlight?

SPEAKER_00

Great, great question. So I always say our business is growing. Anecdotally, we see demand, and the cognizance that this privacy thing is a thing, right? But the challenge is, it's not like some of these more bona fide disciplines like cybersecurity, where every school in the country has some kind of cybersecurity program that will graduate you, and there are certificates. That doesn't really exist in privacy. I was sort of self-taught, lucky enough to be around other people who thought the same way, and I read books from people who've done this and are better at it than me, and then implemented things and learned along the way. For that reason, it's actually very hard to hire someone who's already fully fledged and can start on day one. We tend to, as you mentioned the word apprenticeship, bring someone in. And the best training is to live it. The reason for that is there are gonna be things on the periphery of your pattern of life, or your life experience, or your dynamic at home, that are very hard to anticipate until you get there. Then you're gonna be there and go: oh, I hadn't thought about that. What am I gonna do? I'll give you a throwaway example. We receive most of our mail at what's called a commercial mail receiving agency, like a UPS Store or a P.O. box or a FedEx. When I set that up, I thought: great, I've got mail figured out. Then we moved in, and my wife said, well, we need to order a rug. And the rug's not gonna fit into a P.O. box. So we had to figure out something else. Those are the things you learn over years and years, so I can speak to a client very much from my own experience. When you're moving and trying to redecorate, that's not the time you want to add friction. So I can tell you it's gonna add friction, and we'll be there to help.
But the reality is, certain lived experiences are really hard to translate into an academic setting.

SPEAKER_01

Thank you for sharing that. As mentioned in the Atlantic article, you practice what you recommend: you and your family live by many of the same protocols that you give to your clients. It's interesting hearing you mention some of the stress of that. At the same time, I'm wondering if there are some positive upsides. As an example, I have a couple of friends who are very vigilant about not posting their kids' names or photos online, and they've kept their teenagers off social media entirely, which is pretty amazing. I assumed, when I was talking to them about this, that it would mean fighting every night in the house, but actually they all seem genuinely closer. And in all cases, the kids are thriving academically and socially. So I'm wondering if you've seen any upside to this way of living. I think those of us on the other side can imagine the stresses and some of the friction points. But what have been some of the unexpected positives?

SPEAKER_00

Yeah, great question. So, first of all, and this is gonna sound a little hokey, but I absolutely believe this and practice it: my privacy quirks are mine. They're not your problem, they're not the plumber's problem, and they're not the problem of the people at the post office. So I genuinely approach it in as kind and solicitous a way as possible, not indignant, not frustrated that the world hasn't bestowed upon me an easier privacy lifestyle. And therefore the interactions I have are generally really pleasant. I've also done it the wrong way, for sure. But I've found that one of the upsides is I've actually had a lot of interesting conversations with people when I ask them to help me with some weird privacy request, and I tell them why, and I say: would you mind? I know it's more work for you. It opens up the very human side of this. Some people are like, oh, that's so interesting, tell me more. And most of the time, people are like, no problem, right? When you ask nicely. So that's been a silver lining I wouldn't have anticipated. I really try not to be indignant about it; it doesn't serve me or the others. The other side is, there's a real benefit to personal security that comes from it. I've got kids, and it's not entirely prophylactic, but if you're hard to find, that's one more layer of protection. There's some peace of mind that comes from that. So that's nice. And then the third thing is I've had the opportunity to get phone calls from people I've been able to help, sometimes professionally, but other times not. If I can share my experience and that has value, that's also been a silver lining.

SPEAKER_01

You are absolutely helpful, and you've been so generous coming on here, but you're also really helpful on LinkedIn. I would strongly recommend that folks follow you, because you post some really practical and actionable recommendations. So, as I was saying, I recently asked GPT to do an assessment of my LinkedIn. And then I thought: well, let me ask it to do an assessment of my security. So I said: here's my phone and my operating system, I have this brand of smart TV, give me some realistic security settings. It gave me some very helpful things. I was very surprised about microphone access, camera access, some of that kind of stuff. But it also told me that there are a couple of apps where you really can't change any of that: TikTok, which I've never used, Google, and the Meta apps. So I started to do some research about moving off Google, and I was looking at Proton Mail, and then I saw a post from you that Proton may not be as private as most people assume. So it feels like things are rapidly changing here, and maybe our choices are becoming more limited. What are your thoughts?

SPEAKER_00

Yeah, wow. It is getting harder. And Proton Mail is probably not as private as they market it, but it's certainly much more private than Gmail. And I'm 100% with you. I don't use Google products. I don't have any Meta apps or Google apps on my devices. In my son's sports league, all the parents chat on WhatsApp, and I'm actually the only one who's not in that chat. So I've been willing to bear some social consequences as a result. It is getting harder. Even for a tool that is adept at building privacy into its functions, whatever that is, it's still hard, because of, going back a couple of questions, all the different vectors for data collection. Even if you're cutting off one or two, there are still so many others to consider. But being a little more private than the next person, and making it hard for big data aggregators to collect on you at a granular level, those are worthwhile victories to me. It's a little more work, but I find value in that, so I'm okay with it. And thanks for the shout-out on the LinkedIn stuff. I try to keep it light, but if I'm interested in it, then hopefully someone else sees the same thing.

SPEAKER_01

Well, some of the things in your world are probably geared more toward a certain type of person. But we all use browsers. We all have issues around passwords and things that now are just table stakes. The other thing you recently posted was, selfishly, so helpful to me. Again, I had just done this thing where GPT gave me recommendations, like: have a really great password manager. I'm like, duh, of course I need a great password manager. So I buy one and install it, and then I found another browser, and that browser didn't have an extension for the password manager I used. So then I found one that did. And then you posted: actually, it's not safe to use the browser extensions. To your point, I now feel comforted by what you just said, which is that being reasonable here is probably gonna get you X percent there. But if I fret about every possible thing, it's a slippery slope, right? I'm never gonna get there.

SPEAKER_00

Yeah, and it would be great if all I ate was broccoli and grilled chicken, but that's not the reality, and it's not the reality for most people. So improving, being cognizant, and exploring a little bit of incremental change on these fronts is, I think, a great victory, and it's accessible to people. There are so many tools out there that are free or cheap and don't require an advisor or someone selling you expensive software. Those are really great starting points. And they could be endpoints for people, too. If someone hears this, gets interested, and just ends up deleting a couple of apps, using a VPN, and using a password manager, that is outstanding news.

SPEAKER_01

Well, it's perfect. You just teed me up perfectly. But as we wind down, I was gonna ask if there were a couple of tips, just very generally, that are good basic hygiene for people to know and understand.

SPEAKER_00

Where to start? So I have some that I recommend, right? A password manager, for sure, is a big one. And we've hit on some: avoiding some of the worst offenders, like Google, whose entire business model is predicated on collecting and selling data. And that's fine. It's an amazing company, and their stuff works well. They're actually great at security; they're just not really good at privacy. But what I would actually say to people, as opposed to specific apps or things I would recommend, is to explore a little bit of what's going on in this privacy world, understand how data is collected, and consider what in that world is meaningful to you. Maybe if you're a parent, it's just protecting your children. Maybe if you're a public-facing person, it's protecting some of the core equities around your personal information. And maybe if you've built your entire brand on reputation, it's protecting the reputational equities around your digital persona. It varies from person to person, but just understanding what's going on is a great starting place. And there are so many resources for that, not the least of which is one you mentioned, which I really like: you can actually talk to your AI and say, tell me how you collect data. How does Google collect data? How does Amazon collect data? And you'll get great, succinct, largely accurate answers. From there you can leapfrog into: okay, now I'm willing to explore. The reason I say this is, I could just recommend using a VPN, and I do.
But then you use a VPN without being quite sure why, other than you heard someone say to use a VPN. What might happen six months down the road is you think: ah, using a VPN is annoying, because I can't really watch Netflix with it; they're always blocking me. That's because Netflix needs to know where you are, to make sure the programming you're paying for is regionally allocated.

SPEAKER_01

Well, that's right.

SPEAKER_00

And so the likelihood that you're just going to give up on the VPN, because the understanding of why might not have been there, is higher. And so I would rather see someone do a little bit of privacy stuff that they care about forever than try to take on a bunch of things because they heard it somewhere. It's the same as when you hear someone mention a supplement, and you're like, oh, I've got to get that supplement, I heard it cures whatever. But you didn't really understand: do I even have that problem, right? Does it work? Has anyone else validated this? And so I know that maybe the answer you were looking for was these three apps to rule them all. But my answer is that just being armed with the information is a great starting point.

SPEAKER_01

No, that was perfect. And thank you. I think what you said is exactly right on. And I know in this conversation we hit some of the outer layers of the things that are really scary and big, but I really appreciate this conversation because you bring a pragmatic and practical lens. So we find the balance between being naive and being hysterical and living in fear. And what I'm taking away from this conversation is that it's not about hiding; it's more about discernment and balance, and knowing when to be open, when to protect, and when the trade-offs make sense. So I want to thank you. I genuinely loved this conversation and the work that you do. And I'm grateful for the generosity, not just in speaking with me today, but, as I have said, you're very generous with your insights online. And I've called you a couple of times about some things. I know that you take this very, very personally, and that comes through in your work. So thank you very much.

SPEAKER_00

Well, that's super kind. Right back at you. It's a pleasure to interact with you, and thank you for the kind invitation to be here. What I would leave everyone with is exactly what you said: there's a right-sizing and a calibration that's appropriate for everyone. And doing a little bit for a long time is a fantastic place to land.

SPEAKER_01

That's a great place to land this episode. So thank you so much.

SPEAKER_00

Awesome.