Mister Non Grata: From Blame and Ego to No-Fault Problem Solving
📜 Mister Non Grata – Podcast Description
Why do the very experts who solve your organization’s biggest problems often end up unwelcome?
Mister Non Grata explores this paradox. Hosted by veteran technologist Bill Alderson, who has spent decades parachuting into high-stakes crises at Fortune 100 companies, the Pentagon, and critical infrastructure, alongside screenwriter and consultant Michael Rogan, the show reveals the human and cultural dynamics that make problem-solvers persona non grata.
Through real-world stories and candid conversations, the podcast uncovers:
- Why teams resist outside experts, even after breakthrough solutions.
- How ego, fear, and misplaced credit can sabotage collaboration.
- What leaders can do to foster no-fault, inclusive problem solving.
- Practical frameworks for building resilient, innovative, and engaged teams.
More than technology, Mister Non Grata is about culture, leadership, and transformation. Bill brings deep technical war stories from his 44-year career in network forensics and incident response; Michael reframes them with narrative clarity and fresh perspective. Together they challenge outdated paradigms and offer new ways to align leadership, teams, and external experts.
If you’ve ever felt the tension of solving a problem but getting resented for it, or you lead teams where egos and fear of exposure block progress, this show will resonate.
👉 Tune in to Mister Non Grata to learn how to turn resistance into collaboration, problems into opportunities, and cultures of blame into cultures of growth.
Transcript
Unlock the secret to transforming your organization's problem-solving culture. Are unresolved issues draining your team's productivity? Does bringing in external experts create more tension than solutions? It's time for a paradigm shift.

Introducing Mister Non Grata, a revolutionary program that delves into the heart of organizational dynamics to turn unwelcome problem-solvers into catalysts for collaborative success. Join Bill Alderson, a veteran technologist with decades of experience resolving high-stakes technical challenges, and Michael Rogan, a renowned screenwriter and marketing consultant, as they uncover: the hidden reasons why external experts become Mister Non Grata after solving critical problems; powerful strategies for leaders to foster an inclusive, no-fault culture that embraces collaboration; insights into overcoming ego, fear, and resistance within teams to unlock true potential; and practical steps to implement best practices in systemization and documentation for lasting success.

Don't let outdated paradigms hold your organization back. Embrace a new approach that not only solves problems but also builds a resilient, innovative, and engaged team. Transform the way you lead. Empower your team. Redefine success.

Listen to the first session of Mister Non Grata now, and start your journey toward a powerful and effective organizational culture. Ready to change the game? Tune in to Mister Non Grata and discover how to turn challenges into opportunities for unparalleled growth and collaboration.
Bill Alderson: Yeah, so I'm here with the consultant, the marketing consultant who helped me create Mister Non Grata out of thin air. This is Michael Rogan. He is a famous screenwriter, and I really appreciate his assistance. Of course, it was very simple for him to do, because he just has the brilliance that is awaiting some conversation. So anyway, I'm Bill Alderson, and I am Mister Non Grata. Because I go in and I solve a problem for some high-level leader, and all of the rank-and-file members of the team resent me because the boss brought me in. And even if they solved it while I was on site and it was their assistance, it doesn't matter; I get all the credit, because for 12 months they couldn't solve the problem, and then all of a sudden we focused on the problem resolution. I was the guy hired to come in and do it, and so I get all the credit. And of course they are resentful, because had they been allowed all the time, opportunity, etc., they probably would have solved it as well.

So it involves, yes, some technology that people need to understand, but there's the focus. There's the problem statement. There's an accuracy of the problem statement and a consensus on the problem statement. That has to be done. So I end up being Mister Non Grata. Not welcome. Because why? Because I solved the problem and they didn't. Leadership gives me all the credit. They're now resentful. Boom. We have, what, a recipe for unhappiness, right?

And what we want to do is expose this in these sessions so that we can arrive at a paradigm shift where the leader understands that this happens. So when they bring someone in, it's great and fine, but we need to essentially lay everything out on the table and let their team understand what my role is and what others' roles are, so that we can truly, no kidding, solve the problem. And it's a no-fault, no-credit kind of thing. We're just going to solve it together.
Michael Rogan: And manage expectations, right? Of your own team. There are egos and feelings, and even technologists are still somewhat human. And by appealing to that, I think an executive can foresee problems ahead and, I'm sure, foster a culture where people are encouraged to point to things that aren't working, as opposed to hiding them until you show up.
Bill Alderson: And in actuality, it's like pulling teeth to get evidence or symptoms from people, because everyone has an idea and they all have a different perspective. It's like the old story of the elephant and the definition of the elephant: one person touches the trunk, one person touches the leg, one touches the tail. And every one of them seems like the definition of a different kind of animal. But in actuality, it's the same animal. That's what we're doing. It's just trying to pull all the different perspectives together to come up with a concise consensus, yes, a concise consensus of what the problem is, and that really is how we solve problems.
Michael Rogan: It just occurred to me that you're not unlike internal affairs. You are very necessary, and you're resented by every rank-and-file officer in this metaphor. And it is pulling teeth trying to get information; you're seeking the truth, but there are a lot of obstructions to this truth. And it sounds like a little bit of work before you even hit the ground can stave off a lot of future conflicts.
Bill Alderson: And not only that, I'm persona non grata. I have actually, literally, had multiple chief information officers tell me that the way they get their team to solve a problem that has been a resistant problem, a stubborn problem, is to say, you guys either solve it by this time or I'm bringing in Bill. So persona non grata, amplified at that point, right? Because, yeah, they don't want me to come in, because they know I bruise their egos. And the executive allowed all this to happen and didn't understand the paradigm that we were working in. So what we're trying to do with the Mister Non Grata podcast is to help uncover these situations, to grow our environment, and to help all of us: myself as the incoming facilitator of the root cause analysis, all the other members of the root cause analysis team, and the executive or the leader putting it all together, in order to arrive at a concise consensus of what the problem is and then work together as a team to resolve it.
Michael Rogan: When you said that, I was thinking, boy, if the executive is saying, "Solve it or here comes Bill," that puts you in an impossible situation. Oh, here comes the stepdad with the paddle, or something like that. And so you are initially met with resistance, or sluggish responses, and all that kind of stuff.
Bill Alderson: People, frankly, don't want to participate. Why? Because they know that if they provide the piece of information that helps us diagnose the problem, they're not going to get credit for it; I am. So that's the paradigm shift that needs to occur. Is that in the...
Michael Rogan: In the executive, which I really didn't piece together. In the executive as well.
Bill Alderson: So this is a paradigm that we end up in. And I don't know what we would call the paradigm, but it's certainly an ego paradigm. It's certainly a negative-team-experience paradigm. It's not effective. And so what we're trying to do is identify all these elements from all the different stories I have and then say, okay, here's the scenario. Here's the story. Here's the antagonist. Here's the protagonist, here's the hero, here's the journey, here's the whole thing that happened. Now, how do we deconstruct that and find out how to make it a positive experience for everyone?
Michael Rogan: Yeah, it's... love me a metaphor, but in sports it's often hard to get people to give up the ego of putting up points, or stats, or whatever. It's just natural for all of us, and I'm sure these egos that come up are just self-preservation and instinct. And what I'm getting for the first time, Bill, is that this paradigm shift isn't just about the tech, right? Because this could have far-reaching implications for the culture of a department. And a company, right? If you are encouraged to help somebody who comes in to solve the problem, you're rewarded for it. You're rewarded for saying, that's not working, and I don't know how to solve it. That's actually a good thing. Then you're rewarded. I'm assuming that the relationship and the results would probably be better and more efficient, right?
Bill Alderson: Yes. In these situations, they put together what's called a Tiger team or a SWAT team. One of the organizations didn't like the terms Tiger or SWAT, because they had various connotations, so they called it a Care team. A Care team. That was a female leader who brought in a perspective that was different, and it was very positive. And so we created not just a Care team, a Critical Problem Resolution Team, or a Root Cause Team; we had several different teams. One focused on system documentation, diagramming the dependencies of applications and deconstructing them, so that we knew, gee, these hundred things have to be working perfectly in order for this application to work, or in order for this system to work. There are hundreds of things; identifying what those are, and then identifying which ones are at risk, is associated with business continuity. It's also part of disaster recovery planning, and then it's just very smart business sense to put these things together. But these human dynamics, these psychological human dynamics that occur, are something that we need to take care of and to study.
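A minimal, illustrative sketch of the kind of dependency mapping Bill describes here: list what each application depends on, then invert the map to see which components many applications share. Every application and component name below is hypothetical, not taken from any engagement mentioned in the episode.

```python
# Toy dependency map of the kind a documentation team might maintain.
# All names are made up for illustration.
from collections import defaultdict

# Each application lists the components that must all be working for it to work.
dependencies = {
    "order-entry":   ["database-1", "core-switch-1", "firewall-a"],
    "file-transfer": ["wan-link-chicago", "firewall-a"],
    "email":         ["dns", "core-switch-1", "wan-link-chicago"],
}

# Invert the map: which applications break if a given component fails?
impact = defaultdict(list)
for app, components in dependencies.items():
    for component in components:
        impact[component].append(app)

# Components shared by several applications are the ones to prioritize in
# business continuity and disaster recovery planning.
for component, apps in sorted(impact.items(), key=lambda kv: -len(kv[1])):
    if len(apps) > 1:
        print(f"{component}: needed by {', '.join(apps)}")
```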
Michael Rogan: Because words matter, right? Words matter a lot, because they paint pictures. And I'm imagining myself: a SWAT team coming in, my fight-or-flight goes up. Just, oh, they're going to rush through, and there's a sniper. The imagery is there, right? And Tigers... I don't want to have a Tiger in the room. It's interesting, that different perspective that she, that leader, had. That is obviously what's needed more in the technology space, but it really changes everyone's frame, as the therapists say. Reframing something as collaborative and helpful and creative, as opposed to instantly confrontational.
Bill Alderson: Yes. Let me tell you a story. I went into this organization and they had this big problem. It was a technical problem, and I did diagnose it with my technical prowess, which none of the folks there had. I do deep packet inspection, and you have to understand ports and protocols and systems that most people don't understand. It's an under-the-covers, under-the-hood kind of understanding that people who are in a role of maintaining or building a network or a system don't have. And so I have those diagnostic capabilities. It was my diagnostic capabilities that were the key to this, and that's why I get called in, because I have those. However, in other cases I've just been the facilitator of problem resolution, because I have a methodology. I have a whole system that I use.

And so I went into this company. There was a big problem. We started working and diagnosing the problem. And as we find evidence, or a consensus of evidence, those evidentiary facts or findings start coming out, and they begin to create a funnel, or a pointer, to where we believe the problem is, and it starts to become a consensus, and it starts to become obvious. Oh, it very well could be this thing over here that we hadn't anticipated for the last three months that we've had the problem; everything is starting to point to this type of thing. And in this one situation, it happened to be a firewall that was in the middle of these transactions and these problems, and the technology is somewhat irrelevant. But it was starting to point to this one firewall.

Who do you think started becoming sensitive? Either the manager who was in charge of security and the firewalls, of course, or the technologist who was in charge of those firewalls. And in this case, it was the architect who was in charge of the decision to bring in this particular firewall and configure it for such a purpose. So there was a lot of, what would you call it, buy-in. They were dug in.
Michael Rogan: You would get some pushback.
Bill Alderson: It wasn't just pushback. It's that they owned it. And there was a lot of, what do you call it, when you get too close to something, when you're so heavily invested. Yeah. So this guy was incredibly heavily invested, and we were really zeroing in. And usually in most of these problems I have five days, and it's on the third or fourth day that all the symptoms come together, all the analysis, and we start to arrive at a conclusion or a diagnosis. And so this guy is out to lunch, and he doesn't come back from lunch for a while. And we literally start getting concerned, because we're about to convict his firewall of the problem. And he was very invested, overly invested. He had too big a stake and lost his perspective. He lost his perspective, and of course he took it personally. Instead of just saying, okay.

So it ended up being a Linux firewall on a Linux platform, but it was the Linux kernel that was at fault. It wasn't even the software that we were running on the Linux platform. It was the Linux platform that had the fault. And we diagnosed it, we proved it: we took all of the firewall software off of the Linux station and had it just be a router, and it did the same thing, even without all the firewall stuff. So our only alternative was to buy a Cisco firewall at the time. And so we bought a Cisco firewall. We put the Cisco firewall in and, boom, got it all running. And the problem was completely gone. But it was there when the Linux firewall was in, and it was there when we turned it off and recreated the whole Linux firewall with no firewall software on it, when it was just a router at that point, and it still had the same manifestation of the problem. So it was the Linux kernel.

Now, how can this engineer feel like he is at fault in some way? Millions of people use Linux around the world. How can he feel that way? But he was so invested, and his leadership allowed him to be invested, and wasn't identifying how invested he was, until he didn't show up after lunch. He was like an hour late, and we were all saying, whoa, is this guy up on the roof? Do we need to talk him down? When you're talking about men, and it can happen to women as well, it's anyone who's over-invested in whatever it is: it's very difficult when their own fingers start pointing back at them. And that's just resilience, being able to accept that we can make a mistake. So we just have to own it.
Michael Rogan: I don't know if you use the term stress test in technology, but you're stress-testing a culture when you come in. And not only was he invested, he was obviously fearful of being exposed, of being vulnerable, because he didn't know what he didn't know. But what he also... you guys didn't know either. You weren't looking to assign blame. He just felt that he should have known the problem.
Bill Alderson: We were looking to assign blame, but it wasn't to a person.
Michael Rogan: Right. That's a culture that needs to be instilled and set up before you arrive, or adjusted to, right? As in this situation, as it's revealed. Do you think that's something the leadership should do before someone like yourself arrives? Or is it something where, when you arrive, you have sort of a foundational get-together?
Bill Alderson: That's one of the things that I would like as an outcome of what we're doing with Mister Non Grata: an outcome that says, hey, here are some scenarios where these things are happening, and they're the truth. And they're difficult for humans to deal with, and they're also difficult because of the adverse feelings, or the reticence to solve the problem, because we don't really want to know what we don't want to know.
Michael Rogan: This is a string that could be pulled in a lot of areas of the tech industry, it occurs to me, as someone who knows nothing about it.
Bill Alderson: Yeah, like I said, Mister Non Grata has little or nothing to do with the technology itself. It's just that I am this guy who, like Forrest Gump, has gotten all these calls to go into the wild to solve critical problems, high-visibility, high-stakes problems, and I end up arriving on site. And more than my technology is utilized to help unravel the problem, to work our way through the knothole, right? It's soft skills.
Michael Rogan: I didn't realize you were an on-site therapist as well.
Bill Alderson: There you go. Yeah. And I wasn't when I started, but these things got revealed, and then it became, oh, wow, we need to make sure. Because one of the things that I always did after a while was say, look, if I come in, we need to have no-fault troubleshooting; we need to make sure everybody feels open and welcome to participate and to help us. Because you can't throw me in a wiring closet or over in a room and say, solve the problem, Mister Wizard. It doesn't work that way. We have to interact with other people, and we have to do it in such a way that there is trust. So first up is building that situation. And I have dozens and dozens of these types of situations that I want to discuss and help people understand, through the experiences and through the podcast of Mister Non Grata.
Michael Rogan: It seems that the leadership almost needs the reframe, right? And I don't know if you've ever experienced this, Bill, but have you ever come in and leadership is exactly certain what the problem is?
Bill Alderson: Let me give you another example. I went into a very large company. They called me about two in the afternoon. I was in Sacramento at the time; I lived up there. The company was in San Jose or Sunnyvale. I had worked with them in the past to some degree and trained a bunch of their technologists. And they called me up about two o'clock in the afternoon and said, Bill, our network, our global network, has been down since 11 o'clock. This company had Network in its name. This company had just turned a billion dollars in revenue. Very successful. And this company could not take a phone call, could not take an order, could not ship a product, could not take a support call, could not do anything internally, could not send an email, could not receive an email. They were persona non grata. They could not do anything.

So I got the call three hours after this had occurred. And they said, Bill, can you help us? And I said, sure. It'll take me 30 minutes to get down to the airport where my airplane is. I'll hop in the airplane, I'll fly down, and that'll take about 30 minutes. I said it'll take me, altogether, an hour and a half to two hours to get to the San Jose airport, drive up to this company in Sunnyvale, and help solve the problem. And so I did, and I guess it was probably 4:30, maybe 5, by the time I got there, a couple of hours later.

And I walk in with my equipment, and the team is all there. There are a hundred people from this company in a war room, and they're trying another method of rebooting all of their network switches in a different order. Because when they turned them all on at the same time, it didn't work. When they turned these on first and those on second, it didn't work. When they turned those on first and these on second, it didn't work. They were about to reboot all of the switches and routers in yet another order.
Michael Rogan: Does that ever work?
Bill Alderson: Never. I shouldn't say never. Sometimes. When it's that big of a problem, we're reaching...
Michael Rogan: The bottom of the barrel of solutions. Yeah. Yeah.
Bill Alderson: It's non-scientific. It's just, yeah. So they said, we're about to do this, and we've all agreed and have a consensus that this is what we're going to do, and we believe it's going to solve the problem. I said, do you? Are you sure? Is there any scientific evidence or anything? No. And I said, why don't you let me take a look? And they said, how long will it take you? And I said, I'd need to power up my network analyzer, hook it to your network, turn it on for 10 minutes, and then take a look, and I have some ideas on what to look for, because I've got a lot of experience.
Michael Rogan: This isn't my first rodeo.
Bill Alderson: This isn't my first rodeo. So they said, okay, we'll give you 20 minutes. So they gave me 20 minutes. I turned on my analyzer, I plugged it into their network, I looked at a few packets, and I said, turn this parameter on, on every... They had a big campus, about 10 buildings, and they had network links between all of them. And they couldn't go out internationally, they couldn't come in; everything was down. And I said, put this parameter in the configuration: go in and make this mass change on every port of every switch; put this parameter on. And one of the technologists came over and said, Bill, this isn't going to work, because when we tried some of that in the past, OSPF stopped working when we did it. And I said, it sure seems like right now OSPF and a lot of other things aren't working. I'm a theorist. So I said, put those parameters on; if you don't want to do it, it's fine, I'm not going to force you. But if you want to do it scientifically, put this parameter on every port of every switch. So they went ahead and did it. It was a spanning tree parameter. They put it on every port of every switch and turned it up. And then I said, and then turn them on. Is there a certain order? No, just, after you get that parameter on, turn one on, configure it, turn the next one on, configure it, turn the next one on, configure it. And then, after that parameter is on every port of every switch in every building, I said, it's all going to come up and work. And by golly, it did.
:So the Chief Information
Officer comes over to me.
409
:And I knew what he was looking for.
410
:He wanted to know who to blame.
411
:Michael Rogan: I want a name.
412
:Bill Alderson: And I
said, thou art the man.
413
:Just like, Nathan said to King David
after he was caught stealing Bethsheba.
414
:I said, thou art the man,
you're the one who did this.
415
:And he says, what do you mean?
416
:I said you have no network documentation
that's really materially beneficial,
417
:so they couldn't see the effect of
not having these parameters on because
418
:they were not able to visualize
something so big and complex.
419
:Because it wasn't on paper to where
they could, like a, a construction
420
:an architect does drawings and
drafting in order to build a building.
421
:And if you have no architecture diagrams,
no blueprints, you can't solve it.
422
:I said and you're the one who
is the technology leader, right?
423
:Chief information officer.
424
:You're the.
425
:Technology Management Leader, and
this falls into technology management.
426
:You didn't have the proper
network visibility and diagrams.
427
Michael Rogan: And so, did you get a hug afterwards?
Bill Alderson: No, but I got a $100,000 purchase order on my desk Monday. Hello. And then that company became known for having spectacular network documentation and systems. You'd go into the network area for all the network engineers, and their diagrams were up on every cubicle wall, and some of them went 10, 15, 20 feet. True story.
Michael Rogan: And I'll bet they're not going to have a configuration problem again.
Bill Alderson: Maybe they will, but the issue is that we solved that problem, and we solved a whole lot of other problems, because now they had the song sheet from which everyone could sing and understand and visualize the system. So those are best practices that they should have had and didn't. And it wasn't some technologist who didn't configure something right. It was because it was too big; they couldn't see it until it was on paper and they could trace lines and understand it.
Michael Rogan: So how often do you find in these gigs, where you come in and do this, that you're not just fixing a problem, but you're fixing what they call the problem under the problem, right? How many times did you find that you had to actually fix best practices?
Bill Alderson: I have a diagram of a tree, and the tree is on the top; that's operations. Down below is the trunk, and the trunk goes into the ground, and the ground shows the roots. The initial root cause of the problem was a technology problem, which we fixed. But down below, the root cause was due to poor systemization, poor technology management. Those are best practices. How often does that occur? Almost every single time. Oh yeah. Identify the root cause, not only of the technology problem, but of how that technology problem came to exist. Did we not have good change control? Did we not...? There are just a million things, and I have dozens of stories that I can explain in exactly the same vein that point to the system management, and why the root cause that was technical wasn't truly only technical; it was managerial and leadership.
Michael Rogan: Sounds like you, and people who are following this paradigm, right, are solving problems you weren't hired to solve. Yeah. You're coming in, and... it's just like when someone comes to a therapist and says, the problem is the person at home. The problem is the person not here. And I'm sure the therapist is, huh. Okay. Let's unpack that one.
Bill Alderson: And the therapist, if you've ever had therapy, is always pointing out when you're trying to focus on the reason being my boss, my wife, my friend, or somebody else, and not focusing on what you have power over and can control, which is you.
Michael Rogan: Yeah. In another, completely different market, the implication was a day of no productivity; but you think of a company like Boeing, which seems to have a culture of finger-pointing and hiding, it seems, from what I...
I didn't know if I could bring it up, but sure, yeah, go ahead, Mister Non Grata.
Bill Alderson: I go to a defense contractor, and I'm training about 30 people in how to analyze their computer network, and in doing so, we use their computer network for some of our analysis tasks, so that they would know how to analyze their own computer network, not a generic computer network. And during this analysis, I said, okay, let's go find all of these distributed sniffers. There was a certain mechanism to find these servers, so I told them how to do it, and they went out and found them, and they said, oh, I found one over here. I found one over here. I found it on the network. And it's, okay, now let's try and connect to it. So they tried to connect to it. And the bottom line is that there are passwords for this system that come as the default. The default password was NGC, and I knew it, and everybody knew it. And we found one, and we tried to log into it, and we put in NGC, and by golly, we got into a sniffer. The sniffer allows you to look at all the packets on the network. On one of those sniffers, the part of the system that allows you to log into it was actually on the internet, but it allowed you to see the packets on the inside of the network. So that means anyone on the internet could connect to that distributed sniffer and, by looking at it and capturing, could see what was on the inside network at a defense contractor.

So I said, halt, stop everything immediately. Don't do anything else. We have got to call your security department and bring them in here, and go find out who those sniffers belong to, and you need to change this right now, because that is the biggest security violation there possibly could be. It means you're allowing somebody to get to a sniffer on the internet that is looking at and capturing all the packets on the inside of your network.

So they call up security. Security comes down, and the first thing they want to do, short of me being arrested... they literally, no kidding, wanted to escort me out, because I found this security hole. Mister Non Grata? Yeah. Why? They didn't find it. I found it. It embarrassed them.
Michael Rogan: Embarrassed them.
Bill Alderson: So what did they want to do? They wanted to shoot the messenger. So 20 of the people in the class, who had a deep respect for what I had taught them already, said, you escort him out and I quit. Yeah, it was that bad. So the security people backed off, everything backed off. They went and found those sniffers, reconfigured them or got them out, and boom. But that's the story. That is a defense contractor corporation.
Michael Rogan: I was just gonna say, it's a good thing that they cleaned up their act, right? Yeah. It sounds like the same kind of stuff...
Bill Alderson: That's going on right now. Right. Bottom line is, these sorts of things happen, and you see how anecdotal all these are. I could go on for days talking about my experience in military environments, in top secret environments, in small corporations, in healthcare organizations. You name it. Dozens and dozens of these exact types of examples where I became Mister Non Grata. Not welcome. We don't like him. We want to kill the messenger.
Michael Rogan: And it sounds like you're empowered to help future network professionals like yourself avoid that same sort of...
Bill Alderson: It's not just for me. It's for the rank and file. But like you said, it becomes incumbent upon the leadership, the quote-unquote technology leaders. And sadly, when the internet and all this networking stuff grew from zero to huge in a very short period of time, we pulled in a lot of leaders from other areas, and they weren't necessarily technical, so they didn't know what best practices were for the systemization of technology. They didn't know what these things were: to have diagrams, to have SWAT teams, to have Tiger teams, to have all of these mechanisms. They didn't know it. So consequently they had to learn on the job, and that's what I ended up doing for 44 years: finding all of those types of occasions. And now I have all of this, and I want, through Mister Non Grata, to help these environments, leaders, followers, and so on come up and understand and have a functional environment as opposed to a dysfunctional environment.
Michael Rogan: Have you found that some of the soft-skill stuff is something you have to massage on site?
Bill Alderson: First of all, I'm a hardcore technologist.
Michael Rogan: Yeah, oh yeah.
Bill Alderson: No, of course. Hardcore. Yeah. My skills... I don't know who this side of you is right now. My skills are very hard skills, but through these experiences, using my hard skills, I uncovered all of the need for these soft skills. Some of them are systemization methods, best practices, what do you need; and then, secondarily, how do you... I think you said it earlier, it's the culture. How do you set the culture as a leader, or even as a follower, or as a sub-leader?
Michael Rogan: And I can see, if I were a hardcore tech person, which I'm not, that if the leadership, the CTO or CIO, was him- or herself afraid of being exposed, because they don't know as much as they probably should to be in that chair, then everyone else is a little bit... they just don't want to be exposed for what they don't know.
Bill Alderson: All of those things are so powerful in our world. Yeah. And it starts with... what is that thing that people say, where you don't think you know anything or you're not qualified? What is the term for that?
Michael Rogan: To be... oh, the Peter Principle? Not the Peter Principle, but... it's a psychological term; it's when you're fake, you're not real, you're a something syndrome. It's... Impostor Syndrome? Impostor Syndrome. That's it. It's Impostor Syndrome! And what's funny is, like, all these different industries, healthcare, military...
Bill Alderson: You name it, it doesn't matter.
Michael Rogan: They all think they have unique, personal problems that no one else can solve, and it sounds like there are as many human problems sometimes as technology problems.
Bill Alderson: Yes. Vis-à-vis the guy we were afraid was on the roof, ready to jump off because he was over-invested. He probably had to sell that solution. And he was very proud of that solution, and was proud of it regularly at his reviews, letting all the other technologists know he saved the company a quarter of a million dollars by using Linux firewalls instead of Cisco firewalls. He was rewarded for that repeatedly, and now, all of a sudden, this thing that he was so proud of was the cause of three months' worth of dysfunction, because that company had to move files across the nation.

This company was actually a printing... well, it wasn't a printing organization; they did marketing and graphics. But you know how you get inserts in your newspaper, for Target, for Walmart? They did the Target and Walmart insertions. But you didn't print all of those in Chicago, for instance, and put them on trucks to ship a million pounds around the nation in order to put them in all the newspapers. What do you do? You transfer files from Chicago around the nation, and then the printers print them locally. They couldn't do it. They couldn't send the files, because this particular firewall kept killing the session, and it would die before the file transfer was complete. And they had to resort to putting all these files on DVDs. Initially they put people on airplanes to fly them out to L.A. and Seattle and Miami in order to get the files there, and then they ended up just shipping them, FedExing them, when they were done. That happened for three months, because they couldn't do a file transfer because of this firewall. That was costly. And everyone in the company knew it. It was a big problem. When we diagnosed it and then solved it... this guy had been like the hero for having saved the company a quarter of a million dollars, but now he had cost... oh, he cost millions more.
Michael Rogan: Yeah.
Bill Alderson: Yeah. And these are the sorts of intrinsic things that happen if you don't have an understanding of technology management, of how to go about preventing the Mister Non Grata.
Michael Rogan: Seems like there's a lot of noses being cut off to spite the face. Yeah. But it is just a naturally human reaction to get a little defensive, and... exactly. Status is important, reputation, and there are a lot of mistakes.
Bill Alderson: This has been a very interesting conversation, Michael Rogan. I really appreciate your help and look forward to additional conversations. I know that you're not particularly a technologist in this arena, but you now understand how these things can happen, and you don't have to have all my technology understanding, because the story stands on its own, doesn't it?
Michael Rogan: It also is a little bit refreshing to hear that, despite the machine aspect or the tech aspect of all these things, there are still human problems.
Bill Alderson: Yes. And it gives us all hope that we can be part of the process of improving.
Michael Rogan: Because if it's a human problem, that same approach could work in a medical hospital setting, in aeronautics, and in whatever this is over here. This paradigm.
Bill Alderson: This Mister Non Grata paradigm has many applications, and people who hear this will understand and apply it in many different areas. But like I was telling you, my desire is to create diagrams that show how this works, the steps people go through and where they find themselves in the process, the Mister Non Grata Paradigm Story, or The Journey, and then educate people in ways to avoid these things, or to fully embrace them as great leverage tools to improve the culture.
Michael Rogan: There's a... I remember a story where I think Ford Motor Company was trying to pull in some management practices from Japan, the Kaizen, Six Sigma stuff. They were like, we want to constantly improve, that kind of thing. And they put up a notice that said, for a hundred dollars, you put in a suggestion of something that would improve things, and you could win a hundred dollars. And nobody put one in. There just was no response. Then they lowered it to $5, and everybody was throwing them in. And what they found was that people got a little bit overwhelmed and anxious: do I have a $100 idea? But everybody's got a $5 one. Heck, that's improving the faucet in the bathroom.
Bill Alderson: I think that is where the gig-economy organization called Fiverr.com came from.
Michael Rogan: You think so? Oh gosh, I didn't even think about it. Yeah, that's pretty good. That's pretty good. Yeah. Bill, I thank you so much; every time I have a conversation with you about technology, I always feel like I'm closer to getting my Masters of Science in stuff that I don't know how to pronounce.
Bill Alderson: Thanks again, Michael. Look forward to talking to you again.

Michael Rogan: Thank you, Bill.