"Focus more on resilience than cybersecurity." – Dr. Georgianna Shea
Published on April 02, 2025 by Rixon Technology
In this episode of the Rixon Podcast, Dr. Georgianna Shea, Chief Technologist at the Foundation for Defense of Democracies and board member of Rixon Technology, joins host Heidi Trost for a deep dive into cyber-resilience, a concept that’s quickly becoming more vital than traditional cybersecurity.
With real-world examples like the Colonial Pipeline incident and the impact of untested software updates, Dr. Shea breaks down how businesses can anticipate, withstand, recover, and adapt in the face of cyber threats. She challenges outdated notions of cybersecurity, urging leaders to refocus on mission continuity, interdependency awareness, and resilient system design.
From industry-wide ripple effects to the evolving role of quantum computing in encryption, this interview highlights actionable insights for CISOs, CTOs, and executives navigating the complexity of modern digital ecosystems.
The conversation also explores advanced strategies such as tokenization, zero-knowledge proofs, and distributed data storage: critical tools for protecting sensitive information beyond traditional encryption methods.
Want to better understand how resilience impacts your organization’s future? Watch the full video above, then read the complete transcript below for an in-depth breakdown of everything discussed.
From zero-knowledge proofs to real-world case studies, resilience isn’t just theory; it’s strategy.
If you’re ready to take the next step in securing your systems and sustaining your mission, resilience isn’t a buzzword. It’s your next move.
Introductions
Heidi: Well, hello, everyone, and welcome to the Rixon Podcast. I am here with Dr. Georgianna Shea, and we’re super excited to talk about cyber-resilience and what that means. So thank you, Dr. Shea, for joining me today.
Dr. Shea: Well, thank you for having me. And please, call me George.
Heidi: Okay, George. So, George is the Chief Technologist for the Foundation for Defense of Democracies (sometimes shortened as FDD), the Center on Cyber and Technology Innovation. Gosh, that is a mouthful, so apologies. Barely getting that right. She is also a board member at Rixon, and fairly recently, she served on the Cyber Physical Resilience Working Group of the President’s Council of Advisors on Science and Technology (also known as PCAST). I think you only work on things that have very long names.
Dr. Shea: Yes, and it has to have a bunch of acronyms as well.
Heidi: FDD, yeah.
Dr. Shea: Yes. Yeah. And I’ll probably mention at some point in the podcast that I was working with GRF and their BRC on the ORF and go over all those letters as well.
Heidi: It’s fitting because in cybersecurity, we just have endless acronyms, so of course, you know, you would also have these acronyms with the different organizations you work with.
What Does Cyber-Resilience Mean?
Heidi: Okay, let’s talk about cyber-resilience. What does that mean? What does cyber-resilience mean?
Dr. Shea: So, um, I look at resilience as being able to anticipate, withstand, recover from, and adapt to various cyber events.
Typically when you talk about cybersecurity, people are going to think about:
- What do I have to do in terms of compliance?
- What are the controls that you’re putting in?
- What does the audit look like?
- What are the requirements under GDPR, PCI, RMF?
You knowâwhatever that isâputting in the actual protections in your system.
But resilience really goes beyond just the cybersecurity. It’s about merging your mission with your security strategy:
- What is the mission of the organization or system?
- And how do you ensure that you’re able to continue that mission, even in a degraded state or under attack?
That might mean:
- Identifying the critical path of your processes
- Pinpointing where you have choke points (like a single system that, if it fails, halts operations)
- Building in redundancy to prevent total failure
I really get back to the engineering fundamentals of the organization.
Itâs not just about componentsâit’s about people, process, and technologies working together.
You have to explore that critical path across all three to ensure your mission is metâeven when facing degraded or adverse events.
Heidi: Yeah, I liken it to kind of like the Boy Scout on a camping trip, right? Likeâthey’re prepared for anything.
Dr. Shea: Yeah, always be prepared. Be able to anticipate, be aware of what’s going on, and make sure you can get through it.
Why Is Cyber-Resilience Important Now?
Heidi: Yeah, resilience and cyber-resilience aren’t necessarily new terms, but in the things I’ve read recently, those words seem to come up a lot more.
Dr. Shea: Good.
Heidi: Yes. Good. Maybe it’s like when you’re thinking about buying a new car, and then suddenly you see that car everywhere. So maybe I just noticed the word resilience once, and now it’s showing up all over.
But why is that important now? Why should businesses care, and why is it coming up more and more lately?
Dr. Shea: Well, I don’t know if I’m the best judge of that because I kind of live in that space.
By trade, you’d probably put me in a cybersecurity field, but honestly? I kind of hate that word.
It doesn’t really mean anything unless you define:
- What are the requirements?
- What are you actually trying to secure?
- What is the required outcome?
I now push for resilience over cybersecurity.

Whatever the mission is for your organization, how do you ensure that you can continue that mission, even when something goes wrong?
The public sector is beginning to feel the impact more because of how interconnected everything is.
Think about critical infrastructure: the interdependencies between systems are enormous.
Dr. Shea (continued):
You have things like the CrowdStrike issue that happened a couple months ago. A single piece of software that wasn’t fully tested gets deployed, and the ripple effect was huge.
Not just one or two organizations. All of critical infrastructure was touched... even my mom, calling me because ancestry.com wasn’t working.
“The Internet is broken!”
When these events start affecting everyone, they become front-and-center concerns.
Organizations must be able to anticipate, test, and build in resilienceâwhether the threat is a cyberattack, human error, or even a natural disaster.
Heidi: I love that. You’re totally speaking my language. So glad I asked this!
Dr. Shea: Good.
Heidi: And when you said "What does cybersecurity even mean?", that really hit me.
It’s meaningless unless it’s grounded in your organization’s mission.
That’s the why: why businesses need to care. Your mission is your revenue stream. Whether you’re:
- Supplying water to a city
- Manufacturing widgets
...your entire business depends on protecting that mission objective.
Dr. Shea: Right.
Real-World Examples of Resilience
Heidi: So I gave a couple examples, but could you give some of your own, just to make it a little bit more concrete for people? What exactly do you mean when you’re...
Dr. Shea: In terms of resilience? Yeah. So you mentioned that I worked on the cyber-physical systems resilience paper for the President’s Council of Advisors on Science and Technology, the PCAST. We put out a number of different recommendations in there, some high-level recommendations on resilience, and we brought up some examples of past issues like the Colonial Pipeline.
So, you know, the Colonial Pipeline is a great example. There was a cyberattack on the office systems of the Colonial Pipeline, and they then shut down access to the operational technology piece and cut the flow of fuel to the entire East Coast, not really understanding the impact of that. The thinking was: let’s just contain this ransomware attack and make sure it doesn’t spread, so we’re going to cut access here, contain it. And then, by the steps that were taken, the East Coast didn’t have fuel for, I forget, a couple of days, a week. It was almost a national disaster. I think people were filling Piggly Wiggly bags and plastic totes full of gasoline at the gas pump to ensure that they had it.

So not understanding how the systems are interconnected to the mission, to the public, to your customers, that is a recipe for disaster. So you really have to understand what those impacts would be. And then, internally, how those technologies relate to each other.
And then, aside from the Colonial Pipeline, you can look at things like Log4j, for example. That was a piece of code a couple years ago that was found to be exploitable, and a lot of organizations didn’t even know if they had it. Like, does this pertain to us? We don’t really know. It’s a piece of code that’s within other pieces of code. So it’s an embedded piece of code, and if they didn’t have a clear software bill of materials and an understanding of what their assets are, they’re spending all of their time just trying to identify: are we susceptible to this? You know, before they can even take the mitigations to go through and say, okay, we are susceptible, now let’s update this version to the secure version.
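The "are we even susceptible?" scramble is exactly what a software bill of materials answers quickly. As a hedged sketch only: once the SBOM exists as structured data, the check becomes a few lines. The component names and the vulnerable range below are illustrative, not a real advisory feed, and real-world version strings like "2.0-beta9" would need a richer parser.

```python
# Toy SBOM scan in the spirit of the Log4j scramble: given a flat software
# bill of materials, flag components whose version falls in a known-bad
# range. The advisory data below is made up for illustration.
VULNERABLE = {
    "log4j-core": ("2.0", "2.14.1"),   # hypothetical inclusive bad range
}

def parse_version(v: str) -> tuple[int, ...]:
    """Turn '2.14.1' into (2, 14, 1) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def affected(sbom: list[dict[str, str]]) -> list[str]:
    """Return 'name version' strings for every vulnerable component."""
    hits = []
    for comp in sbom:
        bad = VULNERABLE.get(comp["name"])
        if bad is None:
            continue                     # component has no known advisory
        lo, hi = map(parse_version, bad)
        if lo <= parse_version(comp["version"]) <= hi:
            hits.append(f'{comp["name"]} {comp["version"]}')
    return hits
```

Without the SBOM as input, there is nothing to run this over, which is the point Dr. Shea is making about asset visibility.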
You could look at Ukraine, not even on the cyberattack side. If they were producing something, sending it to you, and then all of a sudden they’re at war, that disrupts your supply chain and your dependencies. So do you have alternate dependencies?
So it’s just really important to go through and, again, map out your mission, what I call the minimal viable objective or service for your organization. That was a recommendation in the PCAST report: identifying what is that minimal viable service or product that you have to deliver to sustain operations, so that your customers can sustain operations.
And, as I mentioned at the beginning, that did come from the GRF, BRC, ORF, so I’ll explain those acronyms. There’s an organization, the Global Resilience Federation, which works with many of the, I think about 17 or 19, various ISACs out there, your Information Sharing and Analysis Centers. Those are your sort of belly buttons for different topic areas: your K-12 ISAC, your Space ISAC, your Operational Technology ISAC, your Manufacturing ISAC... I call it, you know, phone a friend.
So if you’re working in this industry and you want to talk to a similar organization, securely share information, have indications and warning, non-disclosure-agreement kind of things, you can share information with them. So it’s the phone-a-friend: "Hey, I work at a K through 12 organization. This is what we’re seeing. These are our priorities." Another K through 12 organization is probably going to have the exact same issues and concerns, so you can share information. But anyway, GRF works with a lot of those, and they’ve pulled together the BRC, the Business Resilience Council, which is representatives from various sectors of critical infrastructure.
In the United States, we have 16 designated sectors of critical infrastructure. So when I say critical infrastructure, I don’t just mean things that are important. I mean the designated sectors: water, energy, the defense industrial base, the healthcare sector. I’m not going to name all of them, but those designated sectors. And this BRC group had invited me to work with them on developing an operational resilience framework.
So I liken it to a business continuity plan, but more advanced, a very advanced business continuity plan. Because it’s not just: what are the risks, what are the impacts, and what do we need to do for backup? It is identifying that minimal viable objective or service that the company has, and looking at the upstream and downstream dependencies. Who are our suppliers? What does that look like? Could there be a choke point? Do we have redundancy? Do we have a stockpile of this flux capacitor, and only one person makes it? What would it look like if the supply chain is disrupted? And then also looking at our customers: what do they have to have from us in order to continue their operations?
Because as we saw with COVID, there are so many interdependencies and connections among organizations in that supply chain of services and products. When COVID hit, you’re like, "Oh, we’re not getting this because some mom-and-pop shop way down the supply chain had some issue," which supplied a critical component to one organization, which then was a ripple effect through these major organizations.
So by understanding that, the ORF puts that into a business strategy for companies to better merge the business piece of it, the mission and processes, with the technology. I think that was a lot. I don’t even know if I answered your question.
Heidi: You did. You did.
Incentivizing Accountability in Leadership
Heidi: So one of the things that you talk about in the PCAST report, and maybe this is kind of what you were getting at before: one of the recommendations is to, and I’m reading this, this is a quote, "develop greater industry board, CEO, and executive accountability." Can you give some examples of what that might be, and how we can incentivize that sort of accountability outside of the government just telling people that they have to?
Dr. Shea: Yeah. So it’s a very complex issue. When we talk about critical infrastructure, I think it’s, last I looked, 80 to 85 percent of our critical infrastructure that is privately owned. So it’s not a "the government said so, so you have to do this," or Cyber Command put out this order and now all of the military is going to follow it. It’s not the same pattern of activity.
In the defense industrial base, it’s a little easier. The Army said do it, so you’re doing it. That’s how the Army works. That’s how the military works. That’s how DoD works. But when you put that out to the financial sector, it’s a privately owned bank. Put it out to the healthcare sector, it’s a privately owned hospital. You put it out to a lot of companies, and they don’t have the infinite, well, not quite infinite, resources that some of the federal organizations have. They don’t have to worry as much. They’re not handed a bucket of money: "Here’s a bucket of money for cybersecurity." That doesn’t happen in private industry.
Utilities, for example: you’re charging customers, and the money you get from those charges you then have to put back into your company for services, payroll, other activity, developing product. And cybersecurity is one of those very difficult areas to show a return on investment. Explain to me why I need to spend a million dollars on this particular thing when we haven’t been hacked yet. What’s the risk? We’ve been fine without it. Why do we have to spend this money now?
So it’s a constantly evolving thing. It’s a very difficult argument to make, and the board engagement and getting CEOs involved is about convincing them of the importance of building out these resilient strategies and investing in cybersecurity. Convincing them that there are really two types of organizations out there: those that have been compromised and those that are going to be compromised.
And convincing them that you may think you can accept this risk, and maybe they can. But understand the ripple effects of that.
You can look at the Target breach from years ago now, back in 2013. Target had a breach, and that ended up being millions of dollars’ worth of cost to them.

So, to incentivize boards and the CEO piece, the PCAST report talked about:
- Better engagement
- Public-private partnerships
- Involving the private sector more
- Having relationships and sharing more information
On the federal side, there’s usually more access to the intelligence that’s available. We’re seeing these types of things. We’re seeing this type of nation-state attacking our organizations. The commercial sector doesn’t necessarily have that kind of insight. So sharing that intel, the threats, and what we’re seeing across other sectors is helpful.
And then to incentivize them, I guess there are a couple ways you can do that:
One is to paint that picture of what the impact is if you don’t invest in these strategies or technologies.
Then you may be that Colonial Pipeline. You may be that organization that’s preventing the entire East Coast from having gasoline for a long period of time.
Or you may have the financial burden of recovering from ransomware attacks.
But one of the recommendations we had talked about, and I don’t know if it actually made it into the paper, I don’t remember, was... I don’t want to say public shaming, but... transparency.
You know, posting, or requiring companies to show, what they are doing for that resilience and cybersecurity.
How are they meeting some of these requirements? How much are they investing? 1%, 0.5%, 0.3%, 20%?
It ranges.
And if the public sees that you’re not investing, then ultimately there’s going to be a big issue.
So maybe just through that public shaming, they would be motivated. I don’t think we actually put that in the paper, but it was a fun discussion.
We actually compared that to environmental standards, where companies post their carbon footprint, and because it was embarrassing, they took steps to reduce it.
So we talked about that.
But I think through:
- Sharing of information
- Understanding the financial impact
- And understanding their role in the ecosystem
That should incentivize them. But they’d have to really understand what that means.
Heidi: Yeah, really painting that picture.
Dr. Shea: I kind of feel like the silver bullet for motivating everyone, not just CEOs and board members, is the insurance industry.
So I think the insurance industry is going to end up being the cyber savior to the country, once they get all their ducks in a row, get organized, and put out meaningful direction, because right now it’s a little bit disparate.
Once the insurance industry says, "We’re not going to insure you unless you do X, Y, and Z," that’s really going to motivate companies, because they’re like, "Oh, we need insurance."
You know, in cybersecurity classes, they talk about different types of risk management. There’s risk transference, right?
"We’ll just get insurance, so who cares?"
But insurance companies aren’t insuring people anymore because the payout is massive.
The insurance industry is sort of imploding on the cyber side.
It’s not just the cost of the computer. It’s:
- the cost of a new computer
- retraining systems
- ransomware payments (could be millions)
- audits
- third-party recovery teams
It’s not like car insurance: "crash it, replace it." It’s exponential.
So in my mind, the insurance industry is eventually going to lead everyone to:
"This is what you really need to do."
And we’re not going to accept just risk transference anymore.
Heidi: Yeah, that’s really interesting. Yeah, nothing like the insurance company telling you what you have to...
Dr. Shea: Right, right. I mean, you talk about the federal government telling you what to do, right? Yeah, no, it’ll be the insurance companies.
Heidi: There you have it. The inside scoop.
The Future of Encryption Standards
Heidi: Um, I want to shift gears and talk about a recent CSO article. The title of the article was "European law enforcement breaks high-end encryption app used by suspects." In the article you say, "CISOs should be taking note of the diminishing lifespans of current encryption standards." So sorry for kind of abruptly changing subjects, but it was a really interesting article, and I thought what you said and explained in it was really interesting as well. So when you’re talking about the diminishing lifespan of current encryption standards, can you talk about what you mean there?
Dr. Shea: So on the federal side, we’ve already started to move to new algorithms for encryption. Encryption is based on math, and it’s hard math. And with the development of quantum computing, the math is getting easier to break, because the computational power of quantum computing is so advanced. So that means new algorithms are being developed based on harder math, just to keep it super simple. And there are requirements in the federal space right now to change to these new algorithms that work with the harder math, to make things more difficult given the expected development of cryptographically relevant quantum computers (CRQCs), the computers that can break modern-day encryption.

And in modern-day encryption right now, you look at RSA encryption. With the way computers work today, it’s expected that it would take a billion years to break the encryption. However, with a quantum computer, that same encryption is expected to be broken within something like six minutes. So that means:
No more passwords are safe. No more encryption is safe. No more data security. Everything can be broken.
So the solution is harder math against more computational power. In my mind it’s sort of a never-ending arms race of computing power versus hard math, because we’re going to have harder math and more advanced computing power in the next 10 years, 15 years, whatever it is. So I believe you should just remove the math problem and approach things from an information-theoretically secure state, where you’re not relying on the hardness of the math at all. Then, regardless of how much computational power an attacker has, you still have secure data.
An example of that is distributed data storage. If I have all of my data in one database and I have encryption on it, and someone were to break into that particular database, they have access to the encrypted data. They go through, break the hard math, and they have all your data. But if you take the same data and distribute it across multiple locations, so you have fragments of data everywhere, an attacker would first have to collect all of those pieces of data and put them together before they could apply that quantum computing capability to break the encryption.
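The fragmentation idea can be made concrete with the simplest possible scheme, XOR-based secret splitting, where every location must be compromised before any information leaks at all. This is an illustrative sketch of the information-theoretic principle, not a description of Rixon’s or any other vendor’s actual product.

```python
import secrets

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(data: bytes, n: int) -> list[bytes]:
    """Split data into n equal-length fragments. Any n-1 fragments are
    uniformly random on their own (one-time-pad style), so a thief who
    steals all but one location learns nothing about the data."""
    if n < 2:
        raise ValueError("need at least two fragments")
    fragments = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    last = data
    for frag in fragments:
        last = _xor(last, frag)      # fold each random pad into the final share
    return fragments + [last]

def combine(fragments: list[bytes]) -> bytes:
    """XOR all fragments back together to recover the original data."""
    out = bytes(len(fragments[0]))
    for frag in fragments:
        out = _xor(out, frag)
    return out
```

Because the security here comes from the fragments being statistically independent of the data, not from a hard math problem, extra computing power (quantum or otherwise) does not help an attacker who is missing a fragment.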
So there are strategies in place: distributed storage, and tokenization, which is a big piece of how to ensure the confidentiality of your data. If the information is compromised, they’re not getting anything. It’s a multi-layered approach to security, and you don’t have to worry about that quantum piece of it.
And I will say, if you have this conversation about quantum computing with people, the idea that comes up usually is:
“Well, itâs like 30 years out. Who cares? No big deal.”
I would really like to see a conversation among the people working on this. And I’m not a quantum expert; I research it, I study it, but I’m not a quantum expert in any way. I just look at it from the cybersecurity standpoint. But you have experts in that field who are actively working on the development of those CRQCs, and they have wildly different opinions based on their work and what they’re seeing in their research.
Some experts, bona fide experts in that area, believe we’re going to get to the stage where we have to worry about the breaking of modern encryption within the next five years, maybe two years. But then you have an equally qualified expert who says:
“Itâs like 30 years, so who cares?”
And people will hear both of those opinions and, depending on their own position and their own bias, feel like, "30 years, so I’m not going to worry about it." But that’s not necessarily the case anymore, because now there are steps that have been taken by the federal government to move over to the new algorithms.
You have to take some action if you’re going to align with government and federal standards, and you need to be aware of it. So if you’re the CTO, CISO, or CIO, and you’re buying new equipment, developing an architecture, setting up systems, you’re going to want to look at:
- What is that future standard? What are those new algorithms we have to use?
...and implement those so that you’re compatible, and then know that, regardless of whether it’s two years from now or 30 years from now, it’s still a threat. So when you buy this equipment or develop these architectures and strategies:
- Are you planning for 30 years out?
- Are you planning for just one year out?
Take out the math; that’s what I say.
Heidi: Okay, that makes perfect sense. And I want to drill down a little bit deeper. I’m sure you can anticipate what my next questions will be. So when you’re talking about this, you say strategic, you say multi-layered, and I want to unpack that a little. Some of the things that you say in this CSO Online article are:
Those are a lot of terms, and you kind of sprinkled them in as you were explaining this. But I’m hoping that we can drill a little bit deeper into each one of them.
Unpacking Multi-Layered Defenses
Dr. Shea: Okay, so, tokenization. Tokenization is the process of replacing the actual data with tokens. I don’t know if you remember going to Chuck E. Cheese. I tried to never go to Chuck E. Cheese, but every now and then my sons were invited to go there. You don’t hand your kids, like, here’s 20 bucks. You go to the token machine, you give the money to the machine or the person, and they give you tokens. Then you just give your kids the tokens, and they run around the arcade putting the tokens in the machines.
So they’re not actually handling the money, the sensitive currency. They’re just dealing with those representations of the money. So let’s say your child is then cornered in the ball pit, and someone’s like, "Give me all your money." It’s not really the money. It’s just the tokens. I’m sure for the child it’s just as traumatic, and they don’t get to play as many games, but the money itself is still protected because it was never actually in the ball pit with the kids.
So the tokenization strategy is that way of ensuring that, again, if the encryption is broken and someone gets into your system or your data flows, and they’re able to man-in-the-middle attack it, or get to that data wherever it’s being stored, it’s not the data. It’s just a representation of the data.
Heidi: It’s the darn Chuck E. Cheese tokens!
Dr. Shea: It’s just Chuck E. Cheese tokens. Yeah.
Heidi: That would make an attacker mad.
Dr. Shea: Yeah. Yeah. So it’s like, oh, I broke into it, and this is useless. It’s useless data. So the actual crown jewels, the sensitive data, your PII, your PHI, all those elements that you’re trying to protect, they’re still protected.
Heidi: Awesome. I love a great analogy.
Dr. Shea: I just came up with that. I should have thought of that before. Some sticky, dirty tokens. Yeah.
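The arcade-token analogy maps directly onto a vault-style tokenization design: every sensitive value is exchanged for a random surrogate, and a stolen token is as useless as a Chuck E. Cheese coin outside the arcade. A minimal in-memory sketch, with class and field names made up for illustration; real systems add durable protected storage, access control, and auditing.

```python
import secrets

class TokenVault:
    """Toy vault-style tokenization: sensitive values are swapped for
    random tokens, and only the vault can map a token back."""

    def __init__(self) -> None:
        self._by_value: dict[str, str] = {}   # sensitive value -> token
        self._by_token: dict[str, str] = {}   # token -> sensitive value

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so downstream joins and lookups still work.
        if value in self._by_value:
            return self._by_value[value]
        token = "tok_" + secrets.token_hex(8)  # random surrogate, no math to break
        self._by_value[value] = token
        self._by_token[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._by_token[token]
```

Because the token is random rather than mathematically derived from the data, there is no underlying "hard math" for a quantum computer to attack; an intercepted token reveals nothing without access to the vault itself.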
Heidi: Zero-knowledge proofs, so explain those to us.
Dr. Shea: So zero-knowledge proofs, it’s sort of a scenario-by-scenario situation in which you would use them. There’s a lot of data sharing that takes place among organizations, for audits or for compliance; you have to share information. So a zero-knowledge proof is a way of sharing information without having to share your sensitive information. It may not apply in all instances, but you can definitely use it in some instances.

And I’ll give you an example from something I’m working on. Within the PCAST paper, we had recommended the standup of an organization called the National Critical Infrastructure Observatory, which would act as a digital twin for critical infrastructure, so you have an understanding of the security and resilience posture of all of the critical systems, the critical infrastructure, of the United States. Because right now you don’t really have that belly button out there for: what is our posture?
So if you develop this sort of digital-twin organization, it would then have to get information from organizations. And we just talked about how 85 percent of critical infrastructure is privately owned. I don’t know a lot of private organizations that would comply if the government says, "Hey, send me a copy of, like, all of the CVEs that you’re compliant with." Or a simple example, passwords: "Send me a list of all your passwords so we can ensure they’re at least 16 characters, with special characters, capital letters, lowercase letters, and symbols." Yeah, they’re going to be like, "Yeah, no, I’m not sending you a list of my passwords. Just take my word for it that we’re good."
And they’re going to say, "No, we don’t want to just take your word for it. We want some type of verification to understand that you’re using complex passwords, basic cyber hygiene."
So instead of the organization sending in, "Here’s a list of all my passwords for you to verify," the two organizations, this made-up organization that we’re promoting and the organization they’re working with, would work with maybe a third party that develops a proof, like a scan: "Let me scan all of your passwords and do a simple check. Is there a capital letter? Is there a lowercase letter? Is it 16 characters? Is there a special character? Is there a number?"
And then for every password at that private company they’re trying to get the information from, it would come back as yes, no, yes, no. Hopefully it comes back all yeses. And then the organization they’re sending it to, like this National Critical Infrastructure Observatory, would get the responses: "Yes, it meets it. Yes."
So they both trust the proof. They both looked at the proof. They know the code, and they know what it looked for, and now they both trust: "Yes, you absolutely have complex passwords that are meeting these requirements, without me having to see your passwords."
So it’s a zero-knowledge proof. I don’t know what any of your passwords are, but I have the confidence that they are meeting the requirement, because we both trusted this particular proof that was written, this code, and now I can say confidently, yes, you’re doing this.
So it’s a way to exchange information from organization to organization without sharing the PII, the PHI, the GDPR issues, that protected, sensitive data.
Or even something as simple as: I want to know which of your systems may be affected by this new CVE that came out. You can run scans throughout the organization, and the Nessus scans that come out of it would say this software, this software version, affected or not. But nobody wants to show their attack surface: "These are all the software pieces I’m using."
So instead of showing that sensitive information, you could just come back with yes, no, yes, yes, no. It’s a simple way to have trusted verification and shared information.
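The "trusted verification without seeing the data" idea is what classic zero-knowledge protocols formalize. As a hedged illustration, here is a toy Schnorr identification protocol in Python: the prover convinces the verifier it knows a secret exponent without ever revealing it. This is a textbook protocol, not the specific proof system the observatory example would use, and the parameters below are demo-sized, far too small for real security.

```python
import secrets

# Toy Schnorr protocol: a zero-knowledge proof that the prover knows a
# secret x with y = G**x mod P, without ever revealing x.
P = 2039   # safe prime, P = 2*Q + 1 (real use needs 2048-bit+ groups)
Q = 1019   # prime order of the subgroup of squares mod P
G = 4      # generator of that subgroup (2**2 mod P)

def schnorr_round(x: int) -> bool:
    """Run one interactive round; returns True when the verifier accepts."""
    y = pow(G, x, P)                 # public key the prover has published
    # Prover commits to a fresh random nonce.
    r = secrets.randbelow(Q)
    t = pow(G, r, P)                 # commitment sent to the verifier
    # Verifier replies with a random challenge.
    c = secrets.randbelow(Q)
    # Prover answers; s leaks nothing about x because r is uniform.
    s = (r + c * x) % Q
    # Verifier checks G^s == t * y^c (mod P), which holds iff the prover knew x.
    return pow(G, s, P) == (t * pow(y, c, P)) % P
```

An honest prover passes every round, while a cheater who does not know x can only succeed by guessing the challenge in advance, so repeating the round drives the cheating probability toward zero, without the verifier ever learning the secret.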
Heidi: Awesome. That’s really helpful. Also, I’ve never heard anyone explain it so succinctly. I feel like I understand it a lot better.
Closing Thoughts
Heidi: I know that we’re at time here, so I want to say thank you so much for sharing your insights, such cool stuff that you’re working on, and I really appreciate you taking the time to unpack this all. So again, George works for FDD, and I will link to the PCAST report in the show notes. Any parting words for our listeners?
Dr. Shea: No, thank you for having me. And I guess I will finish with what I started with, and that’s: focus more on resilience than cybersecurity.
Heidi: Yeah. CISOs everywhere are like, what?
Dr. Shea: Yeah, I know. I know. It makes no sense. And we’re pushing everyone into the cyber field right now with STEM, and I don’t even think the word resilience comes up. So they’re like, "I’m supposed to do these things." I’m like, "No, no, no. Have the big picture. Implement the big picture and mission success."
Heidi: Love it. Thank you so much.
Dr. Shea: All right. Thank you.
Ready to align resilience with your mission?
Let’s explore how your organization can anticipate, adapt, and thrive, no matter what comes next.
Disclaimer:
This transcript is provided by Rixon Technology for general informational and educational purposes only and reflects the personal views and opinions expressed during a podcast interview in April 2025. The content does not constitute professional advice, legal guidance, or official endorsement by Rixon Technology, its affiliates, or the individuals featured. Dr. Georgianna Shea and Heidi Trost, as contributors, share their insights based on their expertise; however, these statements are not intended to represent definitive solutions or guarantees. Rixon Technology, Dr. Georgianna Shea, and Heidi Trost are not liable for any actions, decisions, or consequences arising from the use of this information. Consultation with a personal legal or professional advisor should be sought where appropriate for applying the information in this podcast to an individual’s own circumstances.
© 2025 Rixon Technology. Reproduction or distribution without prior written permission is prohibited.