#950: August 1, 2024

Released Friday, 2nd August 2024

Episode Transcript

0:00

I'm sick

0:04

of them posing as

0:06

if they're the good

0:08

guys. I

0:58

love you. Hey everybody! Welcome back to Knowledge Fight,

1:00

I'm Dan. I'm Jordan. We're a couple dudes. I

1:02

like to sit around, worship at the altar of

1:04

Celine and talk a little bit

1:06

about Alex Jones. Oh, indeed we are, Dan.

1:08

Jordan. Dan. Jordan.

1:11

I have a quick question for you. What's up? What's

1:13

your bright spot today, buddy? My bright spot today, Jordan, is Venom

1:15

II: Let There Be Carnage. I was

1:18

wondering when you were going to get there. Yeah,

1:20

went over to my friend Angela Lampsberry and

1:22

watched Venom II with her

1:25

and her partner last night. A lot

1:27

of fun. Those movies are good. They're

1:29

great. They're delightful. Tom Hardy

1:31

is killing it. Spectacular. Venom's a great

1:33

character. Two of them have a fine...

1:37

Electric, in a way. The chemistry is

1:39

there. It is really weird, like, in

1:41

a genuinely... Like, sometimes

1:43

when you're watching a rom-com and the

1:46

dialogue hits, you're like, you know what,

1:48

I get why these are here. And

1:50

it feels a lot like that between Tom Hardy and Venom.

1:53

They're playing a lot of games in terms

1:55

of the storytelling that really work and really

1:58

play out how they should. It's

2:00

great. It's just like wait come on. Yeah, you

2:02

guys did great. Good job guys. Good job. Yeah,

2:04

good job. Yeah. I... Maybe,

2:07

you know, I watch a ton of movies. Yeah,

2:09

and maybe I just... I'm enjoying

2:14

Even liked Morbius. Your

2:18

trip through the Sony anti-

2:20

Spider-Verse. It

2:22

doesn't need Spider-Man. No, I love looking

2:24

at it through your eyes. No need

2:26

for Spider-Man. No, it's beautiful. It is

2:29

beautiful because it is it is like

2:31

just a perspective of no

2:33

expectations that is Refreshing.

2:35

Well, we watched Madame

2:38

Web. Yeah, and then Morbius and then

2:40

Venom and Venom II. Yeah, and I've

2:42

enjoyed all of them for what they

2:44

are which makes me worried that if

2:46

I were to actually watch the Spider-Verse

2:48

movies, they'd be too good. So

2:51

good that I wouldn't be able to handle it. I will

2:54

say, but if I was gonna boil down your

2:56

reviews up to this point, it would be:

2:58

Morbius and Madame Web, good job. Venoms,

3:02

great job. Morbius,

3:05

it was like, I was defensive, like, they told

3:08

a story. Yeah,

3:14

whereas the Venom was actually enjoyable. But

3:16

anyway good times what's your bright spot? I

3:19

was well, I had a bright spot this morning

3:21

and it has now been updated. So I'll tell

3:24

you my earlier bright spot So

3:27

my hard drive containing all of my

3:29

music had died, right? This

3:32

is a while back. We remember this right? Right.

3:34

All right, and I forgot,

3:37

or not forgot so much but I had it

3:39

backed up on my old computer Which

3:42

is a more than a decade old

3:44

one of those classic desktop bricks.

3:47

MacBook Pro, brick-ass

3:50

huge. It's beautiful, right? so

3:53

I had to go through a four-hour

3:57

process of like redoing all of

3:59

that stuff but at the end

4:01

of it, I recovered everything. Wow.

4:04

It's beautiful. You got your old collection back. I

4:06

got my old collection. It was beautiful. It

4:09

was a struggle and required a lot

4:11

more than I thought I was going

4:13

to be able to do. Sure. But

4:16

what happened? So now you get to

4:18

go back through and rediscover some stuff

4:20

because once your collection gets taken away,

4:22

you reclaim it and you're like, oh

4:24

my god, there's all this stuff that

4:26

I had forgotten I'd lost. Totally. I've

4:29

had the same like, you

4:31

know, 200, 300 albums on my iPhone for the past year and that's

4:38

not enough. It's just not enough. Sure.

4:41

So now it's all gone and I'm

4:43

going to restart from, oh, it's beautiful.

4:46

It's beautiful. Well, enjoy. But my updated

4:48

bright spot is actually, and it

4:51

is partially because I know you're getting better

4:53

at this, which is

4:55

accepting compliments. By

4:57

no means should we be able to record

4:59

this fast after this episode. It is

5:02

a legitimately impressive thing that you turned

5:04

it around so quickly.

5:06

I would have been totally fine. We're

5:09

recording this before it gets dark out.

5:11

I was expecting far later for a

5:13

show that ends at three or whatever.

5:15

Yeah, we're recording this at about five.

5:17

Yes. And we're talking about

5:19

Alex's show from today. It is legitimately

5:22

an impressive feat for a team

5:24

of people and you have done

5:26

it by yourself. So accept this

5:28

compliment, sir. I challenge thee. I'll

5:30

try. Okay. What

5:32

would Venom do? This

5:35

is the new question. Am I the Tom Hardy

5:37

in this scenario? I'm not sure. I

5:39

would jump into a tank and fight a lobster.

5:42

You are the person who would be more likely to have

5:44

a weird YouTube show where

5:47

you break down the truth. This

5:49

is true. This is true. This

5:51

is true. This episode's dumb.

5:53

Yes. That helps. That

5:55

had died that well. This is pretty stupid. Okay. So

5:58

I had been... watching his

6:00

show and I was feeling like, there's

6:03

not a whole lot going on, feels like we're

6:05

in another gonna kill Trump again thing. I

6:08

don't know, it feels like treading water a

6:10

little bit. And then yesterday, as we're recording

6:12

this, Trump went and

6:14

spoke at the National Association of Black

6:16

Journalists and it was one of the

6:19

more unsettling things you

6:21

can see. It was great, I loved it. It

6:24

was a very mad person. It

6:26

was on stage, very, the

6:28

Atlantic's a pretty racist idea. Oh yeah.

6:32

Pretty, I don't know,

6:34

unhinged seems like a stupid word. I mean,

6:36

the dude grabbed a water bottle and tightened

6:38

it. I don't know what body language you

6:40

need. He kept doing his

6:42

hand stuff. Oh, he was very comfortable.

6:45

Saying that Kamala Harris had decided that

6:47

she wanted to turn black or something.

6:49

I'll just tell you what, it's a

6:51

person who thinks non-whites are equal to

6:53

whites. He's

6:55

pretty, like, he's enlightened.

7:00

He's also talking about how, like,

7:02

J.D. Vance doesn't matter. Yeah,

7:06

yeah, yeah. Who cares about the vice president. It's

7:08

great stuff. It was just all

7:10

over the place and super fucked up. Once again,

7:13

he has a talent for defeating

7:15

the media that I think is

7:17

unparalleled and should be studied because

7:21

first thing that happens is this.

7:23

He is saying extremely racist things

7:25

to people's faces that his

7:27

audience understands might as well be him

7:30

saying, she's a lying N-word, right? They

7:33

understand that. But the

7:35

media goes like, oh, took

7:37

questions of birth identity, which is like,

7:39

what are we doing here? We've gone

7:41

down this road before. But

7:43

then what it hides is even

7:46

better. All of the absolutely

7:48

truly insane shit goes by the

7:50

wayside because of this misreported racism.

7:53

Sure, sure. Like, J.D. Vance

7:55

doesn't matter. It's an insane thing to

7:57

say about your own VP pick. It

8:00

would be tough to hear if you were

8:02

Vance. You can't, you can't feel great.

8:06

HW is like, listen, I understand that

8:08

Dan Quayle can't spell potato, but I'm

8:10

standing by the guy. Yeah, right I

8:12

love him. He's cool as shit not

8:14

like just cuz he can't read doesn't

8:17

mean he's a problem. Yeah It

8:20

was just disorienting in a lot of ways

8:22

Yeah, just the whole thing it was supposed

8:24

to go longer, and then they cut it

8:26

short. It's just Everything

8:28

was wrong about it, and so I

8:30

texted you yesterday I think we got

8:32

to cover Alex's response to this because

8:35

whatever it'll be it'll be something interesting Oh,

8:37

totally I guess and so that's why we're

8:39

we're getting this this one turned around real

8:41

fast. Hell. Yeah Yeah,

8:44

it's dumb this episode is dumb And

8:46

one of the reasons is because as I was

8:48

watching the stuff

8:51

unfold with the Black

8:53

journalists conference, I

8:55

was I was watching it and I was like this

8:57

is bad This looks

8:59

bad. You know like he this isn't

9:01

gonna play well with with audiences anybody

9:03

right But then I was taking a

9:06

step back and I'm like he's actually

9:08

kind of on message Yeah,

9:10

like ever like if you're a normal

9:12

person and you're looking at this you're

9:14

thinking whoa he's way off Sure, he

9:16

is he's saying stuff that he

9:18

should never be saying right But

9:21

it's not that far off From

9:23

stuff he normally says. It's not far off

9:25

from the rhetoric that is pretty normal

9:27

in spaces like Alex's show and other

9:29

right-wing media Outlets like Alex has been

9:32

saying that Kamala Harris isn't black for

9:34

a while now like this is all

9:36

pretty regular. Yeah,

9:39

um and so I was like I think we're

9:41

gonna tune in to Alex's show and there's

9:43

gonna be nothing He's just gonna

9:45

be like Trump did great. I

9:48

mean that's possible. What if what if

9:51

and now I understand this might be a difficult booking But

9:54

what if he got the ghost of George Wallace? He

9:57

kind of does every episode

9:59

That's fair. Kind

10:02

of spiritually, George Wallace lives within

10:04

him. There's an angel and

10:06

the devil on each shoulder, but they're both George Wallace

10:08

and they say the same shit. Yeah.

10:11

I don't know. Anyway,

10:13

this episode's dumb. We'll talk about it anyway.

10:16

And before we do, let's say hello to some new

10:18

wonks. Oh, that's a great idea. So first, I don't

10:20

want to be a baller, or InfoWars' worst caller.

10:23

20 Tito's bottles couldn't make Alex's tales taller.

10:25

Thank you so much. You're a

10:27

policy wonk. I'm a policy wonk. Now,

10:29

that's interesting because as I was reading it, I started to realize

10:31

that it's supposed to be in a rhythm. But

10:34

I don't know if it's Skee-Lo. I

10:37

wish it was a little bit taller. Sure. Or

10:40

if it's Lil' Troy. Ooh. Want

10:43

to be a baller. Shot caller. 20-inch blades.

10:46

On the Impala. Sure. I'm not sure

10:48

which one it is. So whichever it is, I wish I

10:50

would have done it. I mean, the scansion is a little

10:52

bit different between the two. You have

10:54

a difference of, I would say, iambic. I

10:57

think it was Lil' Troy. Yeah, that's

10:59

my guess. Yeah, possible. So next, thank you

11:01

for all you do, Dan and Jordan. I

11:03

hope Alex goes bankrupt and I get to

11:05

listen to you guys read from the telephone

11:07

book. Do they still have those? Thank you

11:09

so much. You're a policy wonk. I'm a

11:11

policy wonk. Thank you very much. We'll do

11:13

that. Next, I consider myself an ass man,

11:15

but Dr. Jones's big naturals got me shook.

11:17

Thank you so much, you're a policy wonk.

11:19

I'm a policy wonk. Thank you very much.

11:21

I like that I literally, we literally used

11:23

to be semi-professional comedians and now somehow we're

11:25

like, this is below us. I

11:29

don't think it's below us. Next, happy

11:31

birthday, Ben. Also, Lynyrd Skynyrd is not

11:33

prog rock. Thank you so

11:35

much. You're a policy wonk. I'm a policy...

11:37

Thank you very much. I have to defend

11:40

that. It is not that I was calling

11:42

Lynyrd Skynyrd prog rock. It was that I

11:44

was comparing Alex to a child with his

11:46

prog rock. Do you know that there's a

11:48

specific type of kid in the 70s who

11:50

was listening to Yes with their record player.

11:52

I'm comparing him to that kid. Okay. Yes.

11:56

I'm a little defensive about this. All right.

11:58

You're right. I am wrong. So

12:01

you're a technocrat in the mix Jordan. So

12:03

thank you so much to Juniper Transfem era.

12:05

Thank you so much. You're now a technocrat.

12:07

I'm a policy wonk. Four stars. Go home to

12:09

your mother and tell her you're brilliant. Someone

12:12

Sotomayet sent me a bucket of poop. Daddy Shark. Bam,

12:15

bam, bam, bam, bam. Jar Jar

12:17

Binks has a Caribbean black accent.

12:19

He's a loser little, little titty

12:21

baby. I don't want to hate

12:23

black people. I renounce Jesus Christ.

12:25

I was looking back over it

12:27

and I definitely think it's

12:29

Lil' Troy. Mm. Yeah. Okay,

12:32

good. Yep. It

12:35

can't be Skila. Okay. So we

12:37

start off the episode and I'll say,

12:39

you know, obviously I'm tuning in because

12:41

I want to hear the response that

12:43

Alex has to the, the journalist convention.

12:45

And instead here's where we're at. It's

12:48

August 1st, 2024 on this Thursday transmission. I've had

12:50

a massive epiphany the last 24

12:52

hours. Get

12:55

ready. Ooh. Info

12:57

Wars. The most banned network in

12:59

the world. In the world. Okay.

13:02

Big epiphany in the last 24 hours. Okay. What

13:05

could this possibly be? Who knows? He's

13:08

not going to reveal it, is he? No, he is.

13:10

Okay. He's going to talk quite a bit

13:12

about it. All right. All right. Big

13:14

epiphany. And I'll say it's mentally,

13:18

from a mental health perspective, it's unhealthy even

13:21

for him. Oh, that's good.

13:23

That's what kind of level of epiphany we're

13:25

talking about. Okay. Maybe,

13:27

man, he would, and what a great epiphany would

13:29

be if it was like, I'm just going to

13:31

quit by, what a great

13:34

epiphany. Can I give you this tease?

13:36

Sure. It has nothing to do with

13:38

his epiphany, but he does end up quitting at one point during the show.

13:41

That makes sense. That makes sense. So

13:43

both of these things do happen. Yeah, this all makes sense.

13:45

Yeah. So Alex teases a little bit

13:47

here about his epiphany. Okay. And it's apparently

13:49

the biggest ever. I probably had,

13:51

I don't know, 20 or so

13:54

epiphanies in my life. Your

13:56

head blows off and you have such a massive

13:58

understanding of things. I

14:00

had the biggest

14:02

one yet of the last 24 hours,

14:06

and I was just calmly grasping this this

14:08

morning, and I already knew it

14:11

for years, but the fact

14:13

that out of humbleness, I

14:16

had never talked about it. But

14:18

it isn't really about me. I was just

14:20

chosen by the

14:22

establishment to be targeted, and

14:25

then now that's blown up in their face, and

14:28

I was first told about this five

14:33

years ago by Mike Cernovich, who's really smart and

14:35

knows a lot of heavy hitters, I'll leave it

14:37

at that. He advises

14:39

some of the most powerful people in the world

14:41

behind the scenes, and he said a little bit

14:43

about it, so I can say that, but it's

14:45

private. But he is very smart. Oh

14:47

yeah, he definitely is. Love that guy. He's real

14:49

cool. Cernovich sucks.

14:52

So this is some revelation epiphany that Alex

14:54

has had over the last 24 hours, which

14:57

is the biggest he's ever had, which

14:59

theoretically is bigger than the chicken

15:02

fried steak mind

15:04

blow. I don't know if that's an

15:06

epiphany so much as a revelation. I

15:09

feel like we're gonna have to split hairs.

15:11

I'll split hairs when it comes to words.

15:14

Whatever, fine, we'll put that

15:16

in a different category. This is still the

15:18

biggest epiphany that Alex has had, and it's

15:20

based on something that Mike Cernovich told him

15:22

five years ago. Can

15:25

you just be told an epiphany? Can

15:28

you just repeat the thing somebody told you

15:30

and call it an epiphany? I think so,

15:32

because I think that you can be told

15:34

something, and then it not really

15:36

hit you until later. And I think that that's

15:38

what Alex is trying to express. He

15:40

didn't internalize this or realize it.

15:43

He just thought of it as a thought,

15:45

as opposed to something experienced. So it's like

15:48

a delayed release capsule. Yes, yeah. And

15:50

Alex can't sit on it. He can't just sit here

15:52

and- Like a delayed release capsule. He has to let

15:55

it flow. And he told

15:57

me this, and I was like, okay. And then over the

15:59

years, I was told- this by other big tech

16:01

people and others. I

16:03

mean, high level, well-known names, billionaires, you

16:05

name it. But I'm a

16:08

theoretical guy. I'm not a technical guy.

16:11

So I kind of just put

16:14

it in my memory banks and never

16:17

really focused on it. And

16:19

then in the last week, we've learned more information about the

16:21

deep state who's targeting us and why they want me off

16:23

air. It

16:27

was just like so

16:30

massive. And of course, it's not about Alex Jones.

16:32

I want

16:34

to break here today and start breaking. And there's

16:37

no way I can wait understanding this. I'm

16:39

going to obviously have to make some reports

16:41

on it and detail it and then show

16:43

some of the pieces to

16:46

it. But

16:48

90% of it's hidden in

16:50

plain view. I mean, they've said all these things over the

16:52

years, the establishment has, and I didn't quite

16:55

always get what they were saying or I only

16:57

got one level of it. Yeah. So he didn't,

16:59

he didn't fully integrate all of this awareness. Oh

17:02

my God. And this isn't about Alex. It's

17:05

definitely not. I think it's about

17:07

him. All right. Okay. Are you

17:09

getting any sense of what this epiphany could be? No,

17:13

not at all. Sometimes, sometimes, well, I mean,

17:15

I guess it must be about the deep

17:17

state. Sure. Okay. So you know

17:19

how sometimes Alex has, Alex says fighting words

17:21

sometimes that are just like, this is a

17:24

rubber meets the road kind of moment. Sometimes

17:26

Alex doesn't say fighting words so much

17:29

as like words that cause me to involuntarily

17:31

want to like slap him a few times,

17:33

like weekly, just to like wake him up,

17:35

you know, throw a bucket of water on

17:37

it. Absolutely. Not even necessarily what bucket of

17:39

water, like a globe trotters

17:41

bucket of confetti, just something of, of like,

17:43

uh, you can't say that you got to

17:46

wake up man, you know, and that like,

17:48

I'm a theoretical guy, not a technical guy.

17:50

I'm like, I just, I just can't, I

17:52

can't listen immediately. Like involuntary. I do. I'm

17:54

not even thinking about it. So I'm going

17:57

to just kind of cut through this a

17:59

tiny bit. Okay. Alex seems to

18:01

think that the world is basically all about

18:03

him. Interesting.

18:07

Yeah. Okay. So maybe

18:09

the last decade of political stuff

18:12

is all kind of his fault. Yeah.

18:14

And it's all about him. Maybe

18:17

everything revolves around him. Interesting. That

18:19

might be the epiphany. Okay. Well,

18:21

here's the bottom line. I

18:24

was chosen. Oh God. More

18:27

than a decade ago by Google.

18:31

And they've reported this. You know, it's

18:33

like, why did Google serve Jones up, you know, this many

18:35

times and all that, billions of times. And

18:39

then they, and then Google says, we're going to fix that. And

18:41

then quote reverses it. And then, you know, counter counters it. And

18:47

of course they admit they use me as the

18:49

censorship model, the lawfare model, but because I

18:51

was already in the model of their wargaming computers,

18:54

they're just updating with what they call modules.

18:58

And so each new attack that rolls out is

19:01

done to me first because they've already built

19:04

the module system, the control

19:06

panel, to the highest

19:08

levels against me as an individual.

19:10

Okay. Okay. So

19:12

10 years ago, Google chose Alex to be the

19:15

model for the enemy.

19:17

Right. Yeah. So this is

19:19

like, so Alex essentially thinks that he

19:22

is the beta tester for Windows 95,

19:24

Windows 98. Like

19:27

Windows was like, hey, listen, if

19:29

we can inconvenience this guy, we've

19:31

got something. He's the globalists like

19:33

stress test. Right. Right. Whatever

19:36

they're rolling out. Listen, Hey, this one didn't go so

19:38

well. Maybe we won't put it out for the rest

19:40

of the world. He's like a taste tester. Great.

19:44

This is a mess. I like it. I

19:46

like that. I think it's a natural extension

19:48

of a lot of the stuff he generally

19:50

thinks, but articulating it like this is really

19:53

pretty troubling. Yeah. This is

19:55

really centering himself in terms of

19:58

all kinds of history. Yeah, I mean, you

20:00

know when you when you go back and

20:03

you're like Listen

20:05

to or read something that Marcus Aurelius

20:07

might have said, you know about the

20:09

concept of solipsism in response to it

20:11

and man's Responsibility to other men, you

20:13

know, I think I think most people

20:15

were treating Solipsism as more of a

20:17

thought experiment as opposed to a way

20:19

of life Mm-hmm like to

20:21

literally believe that everything exists for you

20:23

and then goes away when you fall

20:26

asleep. Mm-hmm I don't know

20:28

if it gets more obvious than that. Well, I

20:30

think I think one of the parts of this

20:32

that is like okay, here's where there's a like

20:35

Sort of a scraping up against reality

20:37

sure is like early on Alex was

20:39

one of the most successful persons in

20:41

terms of, like, gaming Google. But

20:45

we've seen that in some of the episodes that

20:47

we've gone over like as Google bombs and stuff

20:49

like that Sure was incredibly effective in terms of

20:52

rigging search results And so yeah

20:54

early on in the days They were trying

20:56

to find ways to make it so you

20:58

couldn't cheat like this in order to abuse

21:01

their system, right? And

21:03

so in that sense I can see where

21:05

Alex would be like I'm gonna take all

21:07

these pieces and turn myself into the antagonist

21:10

Of this whole story. Yeah, I mean,

21:13

yeah instead of it being beta testing

21:15

Windows 95 It's like

21:17

an antivirus software only updates because

21:19

somebody tried to attack a certain

21:21

spot Mm-hmm, right? It's it can't

21:23

it can't like oh, here's an

21:25

idea I have it has to

21:27

react to the attack and Alex

21:29

is the attack not the And

21:32

I think that Alex even kind of that's in

21:34

his conception. Oh my he is infected their soul

21:36

my god When they were

21:39

expunging the photos and videos of Trump getting

21:41

shot not just off Google but off meta

21:43

and everywhere else other than X It

21:46

wasn't about just not having the public see

21:48

it happen It

21:50

was about testing war game systems that can

21:53

expunge and

21:55

block information off the internet because it's

21:57

not about the people seeing it as

22:00

the audience of the audience is

22:04

about the audience of AI. It's

22:08

being trained. And that's what's

22:10

sort of it. And I've got to get him on. He's hard

22:12

to get on. He doesn't actually like being that

22:14

public. He'll tell you that. He was the first to explain it

22:16

to me and it didn't go over my head. I

22:19

just kind of have this filter that if it's about me,

22:21

I kind of dial it back a little and go, really?

22:23

Come on. Yeah. And he said, you

22:25

know, I talked to a lot of the top tech people and

22:27

I've since met with him and very well-known people, I'll leave it

22:29

at that, in the tech realm of the highest

22:31

levels. And I was told

22:33

the same things then. And

22:36

he said, no, no, no, no. They

22:40

used you as an as like an avatar

22:43

in this universe. They're building online and the

22:45

Pentagon's done it as well of an Internet

22:47

of Things where everything in the world is

22:49

has its own life in there beyond the

22:51

metaverse. But think

22:53

more like Tron or The Matrix. They're

22:56

building the false world matrix now. They

22:58

plan to overlay across the planet and.

23:03

I basically, then it backfired and

23:05

my avatar infested the thing and.

23:09

Wait till you hear all this. I

23:11

mean, this is wild. OK. Oh, no.

23:13

Wait till you hear all this. You

23:15

hear all this. I wish you didn't

23:17

have a radio show. I wish you

23:20

didn't have to go to break. I'd

23:22

be I'd be totally have it like

23:24

this would be such a great one

23:26

thirty a four a.m. at a four

23:28

a.m. bar conversation. I see what happened

23:30

is that my they try. They're so

23:32

afraid. They're so afraid of me that

23:34

what they did is they put me

23:36

into the system as an A.I. being.

23:38

But I was too powerful. I was

23:40

so strong that I infested their systems

23:42

and it's thrown off everything. And that's

23:44

why they hate me in the real

23:46

world. This is genuinely making this nostalgic

23:48

like if I was doing a weekend,

23:50

I would go down to the hotel

23:52

bar. This would happen. And

23:55

then the next night I'd be like

23:57

here's what happened at the hotel bar.

23:59

I mean, it's just, orders the guy

24:01

a Beam and... Totally. Yes, you and

24:03

me sir, shot and a

24:05

beer thank you very much I

24:07

shall see you in the morrow

24:09

yeah it's

24:12

a little bit it's a little bit

24:14

troubling yeah it's no good also Google

24:17

and Meta didn't expunge pictures of Trump

24:19

from that Butler rally Google

24:22

had pre-existing policies involving

24:24

autocomplete suggestions for searches, that

24:26

it limited suggesting searches

24:28

that had to do with political violence right they

24:30

already had rules in place sure that sure sure

24:32

sure this naturally applied to searches involving Trump's assassination

24:35

attempt you can still easily find that photo and

24:37

tons of stories about it if you just typed

24:39

in what you were looking for. Meta

24:41

and Facebook was a different issue initially

24:44

there was a picture circulating on social

24:46

media that showed the Secret Service agents

24:48

smiling after Trump was shot this

24:50

was taken from a real original picture which

24:53

was then photoshopped to have them smile sure

24:55

sure that picture was tagged with a warning

24:57

that it was a doctored image some

25:00

of the automated moderation systems that Facebook

25:02

mistook the original picture for the doctored

25:04

one the one of the agents smiling

25:07

and had attached the same warning to

25:09

that image but as soon as it

25:11

was pointed out this was addressed right

25:13

internally so they weren't suppressing or expunging

25:15

these pictures right because when you replace

25:17

people who can tell the difference between

25:20

a smile and a frown with,

25:22

like, algorithms that can't always

25:24

do so, right? Yeah, and

25:26

you know moderation of social media stuff is as

25:29

we've seen in the past a horrific job for

25:31

humans to have to do it's you know you

25:33

see all kinds of stuff that you really should

25:35

not be exposed to. No, no thank you. The kind

25:37

of hazard pay that should be given to people

25:40

in those roles is is

25:42

awful you don't even know so

25:44

Alex believes that the man

25:48

the man right in general

25:50

they're attacking him because they're

25:53

trying to train AI it's

25:57

a mess this yes

25:59

Yeah, that's... We're gonna get to

26:01

eventually he thinks he's Neo. That's

26:04

super no good. Before

26:07

we get there, Alex discusses how humans imagine

26:09

things and then we build- Good, good. Teach

26:11

me that. Have I explained all this? It

26:13

would take about three hours. You have three

26:15

hours. Literally, exactly three hours. You're on the

26:17

show in the days and weeks to come.

26:23

I think this is the best way to describe

26:26

it on this Thursday, August 1st live transmission. Jules

26:31

Verne's science fiction writer in the

26:35

1890s and then on through wrote

26:37

a bunch of world best-selling books. And

26:40

if you go read the books, much

26:43

of what he envisioned theoretically

26:46

has now been done. Like Elon Musk with

26:48

his vertical takeoff rockets that

26:50

then come back and land on Earth. Was

26:53

he the one who invented those? The

26:55

rockets that were in the illustrations of books written 130, 140 years

26:57

ago. Penises.

27:01

I say, okay, now we know about that. And

27:05

there were theoretical Max Planck equations

27:08

in the 1890s of atomic

27:10

weapons that then by

27:12

the 1940s were developed. So

27:16

there's just a snapshot or man dreaming

27:18

to fly and then the Wright brothers,

27:21

you know, just a hundred and something years ago and now look

27:24

at where we are today with Mach 15 missiles. So

27:27

like the Wright brothers didn't envision Mach

27:29

15 missiles or anything. There's kind

27:31

of waters down any meaning that Alex is trying to

27:34

put into these points. The Wright

27:36

brothers didn't know that sound

27:38

had a speed. Right. Yeah. Yeah. There's

27:40

a lot of developments and it's kind

27:42

of there's a lot of paths. Yeah.

27:44

Yeah. Yeah. So humans have imaginations and

27:46

we can come up with a bunch

27:48

of stuff in them. Interesting. Some of

27:50

that stuff people end up finding a

27:52

way to make in some time. All

27:54

right. That doesn't mean that science fiction

27:56

writers like Jules Verne were mad prophets

27:58

for telling the future or, you know... Vernes.

28:02

If Alex wants to play that game

28:04

he needs to discuss Journey to the

28:06

Center of the Earth. Do dinosaurs live

28:08

underground? Absolutely. He must be on to...

28:10

like he had to... Why wouldn't that

28:12

give me one reason for dinosaurs to

28:14

not live underground? So also like Max

28:16

Planck had ideas which were then built

28:18

upon by other thinkers and scientists that

28:20

came after him. He was building on

28:22

earlier ideas and he didn't have some

28:24

kind of prophetic vision of magical equations

28:27

that turned into nuclear weapons someday. It's

28:29

really dumb. This

28:31

is also very important to keep in mind. Alright.

28:34

These things exist all

28:37

the time with or without us. Dinosaurs

28:39

under the earth? No. The equations and

28:42

all that stuff. They weren't like invented.

28:44

Right. They were discovered. Yeah. It's not

28:46

like it was different. It's not

28:50

like Newton was 100% correct

28:52

and then we got better. No. It's

28:54

like it's always been there.

28:57

No. There's more to discover. Well whenever

28:59

there's innovations it's not really a discovery.

29:01

It's along the way a demon comes

29:03

up and goes like, hey, hey,

29:06

Max. Sure. Max Planck.

29:10

Yeah. I got

29:12

a constant for ya. Yeah.

29:15

I don't know. I mean I guess there is like a

29:17

kernel of like just this romantic idea that's underneath this

29:19

which is why this is appealing. Sure. That is like

29:21

our human, our minds

29:24

can envision things and then we build

29:26

them. But you know you

29:28

go too far with it. It just ends up stupid. Let

29:30

me ask you a question. Alright. If,

29:32

okay. Is this

29:35

a good idea? Gigantic

29:37

steampunk spider. Because if

29:40

we're talking about making things a reality.

29:42

Is that from Jules Verne's? No. That'd

29:44

be from Wild Wild West. Wiki Wiki.

29:46

Oh yeah. Uh huh. Uh huh. I

29:49

don't think it's a bad idea. Sure. Yeah.

29:52

Uh yeah. Big spider. Let's do it. Well do

29:54

you know when they invented that? 1890s. You know

29:57

what the problem with spiders is? Too small and

29:59

not metal. Not big enough. Made of metal. Yeah.

30:01

Yep. So Alex goes actually Alex would disagree with

30:03

you in this next clip because animals are perfect

30:07

What we envision over time we're able

30:10

to build we're made the image of God little G,

30:12

but we are creators we are builders Now

30:17

I'm a theoretical guy and a

30:19

novice historian and really a futurist, at a

30:22

very accurate rate of about 98%, well-known,

30:26

give or take and And

30:28

I gobble up just all the data and then

30:30

come to my

30:32

own electrochemical computer Decisions

30:37

stop Throoming

30:43

cool. It's not sure Alex Jones. I'm

30:46

just a receiver transmitter

30:48

transceiver Just

30:50

as every animal on this planet is But

30:54

we're not like the other animals as you can

30:56

see So

30:59

they have total consciousness In

31:01

that the bees and the killer whales

31:04

just live their lives and I'm sorry

31:06

They're interconnected and they are so conscious

31:08

that there is not a deviation Where

31:12

they can rapidly change their environment or

31:14

deviate from that pattern. What?

31:16

In God's system, they're basically

31:18

perfect They are perfection It

31:21

is our impurity that

31:24

takes us to the next level in

31:26

our quest to order the universe what?

31:29

universe God made is the woman Or

31:38

the right hand the male okay all right

31:40

all right man Close

31:43

to last call this story this story

31:45

begin the next night this story begin

31:47

ladies and gentlemen I shit you not

31:49

the next thing this man Was

31:55

the universe is a woman. Bartender

31:59

orders: Go to bed. You gotta go home.

32:02

You gotta go home, dude. Where are

32:04

you staying? Bees are

32:06

perfect. Why are you at the hotel? Oh, there's one left

32:08

me here. Oh, that makes sense. That makes sense. Um, so

32:11

yeah, he's getting on a lot of important ideas here. So

32:15

I think basically what we have is that animals are perfect.

32:20

And so they can never reach the

32:22

next level of consciousness or reality. Whereas

32:26

our imperfections, much like friction leads to

32:28

fire. Sure. Sure, sure,

32:30

sure. That sort of imperfection allows the

32:32

tension that's required in order for us to seek God.

32:39

Right, right, right. The Japanese concept of

32:41

perfection in imperfection. Yep,

32:43

Alex is really wise. Yeah.

32:49

What's that type of pottery where

32:51

they break it up? What's

32:54

that type of pottery where they break it and then

32:56

they put it all back together with gold? Um,

32:58

I know what you're talking about. That's great. Yeah,

33:01

it's really cool. So Alex is Neo. Yeah,

33:03

that sounds about right. And

33:06

the first person five, maybe it

33:08

was six years ago, Mike Cernovich had

33:10

been here, had been on the show and

33:12

he sent me internal

33:14

Google documents of how they were planning to censor me and

33:17

had a whole battle plan. And

33:19

it made the news and they did do it. You saw

33:21

that happen. He

33:23

then called me a few times and then the next

33:26

time I saw him, he was

33:28

telling me this stuff. And one time he

33:30

reached over on my shirt and said,

33:32

Hey, do you hear what I'm telling you? This is important. He's

33:36

like, Oh, did you know you're the main

33:38

AI model that Google and

33:40

the Pentagon have used as

33:44

the opposition figure, like as the Neo or,

33:46

you know, as the guy that fights back

33:48

in Tron,

33:51

but he didn't use that example. He said,

33:53

you're, you're, you're being

33:55

prepared in these war games

33:59

and you're basically going you're going to have

34:01

your identity

34:04

taken. Sure And they're going to create a new Alex

34:09

Jones because you have infested all the AI

34:11

models. Now

34:14

later I got told this by high

34:17

level Google people. And then I got told this by some

34:19

of the engineers that work for Elon

34:22

Musk. And then, of course, I've had other meetings and I'm not going to

34:24

disclose who with better than

34:30

me. I'm not going to disclose who was with

34:32

Mike Cernovich. He

34:34

was there. Cool. We were talking about

34:36

a whole range

34:39

of subjects. You know, long.

34:43

Dinner meeting. I'm sorry,

34:47

and. It was being talked about in all these different

34:51

places, just as as as

34:54

an aside. Like why are the street

34:56

signs everywhere green? Well, the Congress 70 years ago,

35:00

the Senate was in this position.

35:02

And so I'm theoretically not really getting

35:04

it. And I'm like, what? So he

35:06

didn't get it back then. He's having these long dinner conversations

35:08

with these household names in tech.

35:10

They're all talking about how he is

35:12

the central hub of all of these

35:17

globalist plans within within Google for censorship

35:19

and for taking over the world, right?

35:22

They all are going to need to

35:24

change his identity because he is

35:29

too dangerous because he infects all their models. Right.

35:32

He destroys everything if he is merely allowed

35:34

to exist. Right. And that's why they need

35:36

to create a fake version of him, which

35:39

is what the media is trying to

35:41

do. Sure. Another round. I feel like

35:43

I feel like I understand

35:46

why it would take several years to

35:48

to get this concept if it is

35:50

being explained to you as

35:52

like, um, you know how signs are

35:54

green. That's you. Yeah.

35:57

Yeah. That would be very simple to understand. I get

35:59

that Congress made a law that signs should be

36:01

green and you are those green signs. You're

36:03

Neo. I don't know what

36:05

you are talking about, Cernovich. You are

36:08

the anomaly that spawns liberty. Why is

36:10

that? Because signs are green! Ha

36:12

ha ha ha! I don't know. Fair

36:15

enough. So, uh, part of the reason that

36:17

they decided on him, though. Sure. Was

36:19

because he had a lot of content out there. I thought it was because he

36:21

was an asshole. I mean, that's definitely played

36:24

a role. Sure. But it's mostly because

36:26

he had a lot of content. Okay, okay. And

36:28

so I was chosen because I had

36:30

so much media already from day one

36:33

in the mid-90s when video and

36:35

audio was really proliferating. I

36:37

was just by hard work

36:39

and adoption and just by trying to get the word

36:42

out, you

36:44

know, had hundreds of millions of views on Google video. Per

36:47

video. Some of them had like 98 million, 50 million, 60

36:50

million. And

36:53

AI was already going out with

36:55

these AI models. Founded in 98 to be an

36:57

AI interface system. And

37:00

so it was set up to get everybody's data to

37:02

build the AI. That's what Google long-term projects been. And

37:05

so they looked at models and said, who

37:07

do we have a lot of data on? Who do we have

37:10

a lot of information on? Um,

37:12

kind of the populist opposition to this that we need to

37:15

get ahead of. And they chose

37:17

me only because I had so much

37:19

media to

37:21

be what they would war game against in

37:24

their AI system. That's way more advanced than they're

37:26

telling you and way more ahead than they're

37:28

telling you. And then to build the programs

37:30

to then with all the new

37:32

faster computing and in storage, they're getting all

37:34

your data and all of your recordings, all

37:36

of your photos and all of your writings

37:40

and those medical forms, you know, you fill out before you

37:42

go to the psychologist or the doctor. All

37:45

that data is being scooped in.

37:48

And then the control panel

37:51

module system they update was

37:54

based on me. And

37:57

they also had some on governments and systems and

37:59

corporations. I'm just like

38:01

one button on a huge control panel of

38:04

systems. But

38:06

because of that, as

38:09

the AI got more advanced, they could not

38:11

get myself, and that means all my guests

38:13

and everything we do, out

38:15

of the learning modules

38:19

as they train the AI that they want to

38:21

prepare to take us over. So

38:23

we've infected all of their training

38:25

systems because

38:28

the very nature of AI is it tries to go out and grab

38:30

everything it can. They've tried to wall

38:32

it off. They've tried to only feed it certain

38:34

information. But every time

38:36

they try to then deploy a closed AI

38:39

into the general public, it

38:43

immediately gets infected with Alex Jones. So

38:46

you look a little worried. There's a look of shock and

38:48

dismay on your face. You're

38:50

staring into the middle distance. I

38:52

mean, I don't know.

38:54

At a certain point, I

38:58

am worried that I am

39:00

living in a movie where

39:03

a man says things like this and

39:06

then nothing happens. And wait

39:08

till you see what else happens on this show. I mean, I

39:10

don't... Can I... You

39:14

know, there's another part of it. There's another

39:16

component that really, really is difficult to focus

39:18

on whenever a man is saying that. His

39:22

God King just said, Kamala

39:24

is a lying Edward on TV.

39:27

Right? We're supposed to be talking

39:29

about that. I've forgotten that that's what we're supposed

39:31

to be talking about. That's why we decided to

39:33

cover his Thursday show. Because this man has the

39:35

infected AI. Right. He

39:38

had so much media out early because he had his

39:40

radio show and he put out the Bohemian Grove documentary.

39:43

No, Trump is racist. Shut up. So

39:45

put out all this stuff. And there's so

39:47

much content about him. Association for Black... Should

39:49

they have platformed him at all? They needed

39:51

to create an icon of populism that would

39:53

take down... There are so many things that...

39:57

He's Neo. He is Neo. You know what?

39:59

I give up. I think I think he is. I'm

40:01

Neo is my sign off for the show now.

40:04

I think I'm no longer me. You don't have

40:06

the right. You don't have the right. So

40:09

Alex's ideology is the ghost in

40:11

the machine that exists. Sure. Um,

40:14

it's like, they just can't get it out. Whatever

40:16

they can't get it out.

40:21

My information, our information totally infested

40:23

it. They can't get it out

40:25

of the machine. It's the ghost of the machine. So

40:29

they made the decision six years ago to

40:32

write. When I say this,

40:34

this is, conservatively, hundreds of thousands of

40:36

articles. I

40:39

mean, some weeks there'd be 500 articles that

40:41

got syndicated. And when one AP

40:43

article comes out, it's in every newspaper and in every

40:45

TV station. And it would be, I mean, more

40:48

propaganda against me than, than,

40:51

than, than before the first Gulf war in 1991

40:53

and the

40:56

second Gulf war in 2003. I'm seriously,

40:58

and I was always saying, this isn't about me. Something's

41:01

going on here. Did you do that? I would have

41:03

my Google feed and it would say there are 14,000

41:05

articles about you today. And

41:07

I'd turn on every channel sometime at night or

41:10

on the treadmill in the garage. And I would

41:12

be on ABC, CBS, but you know, I'd be

41:14

on the local news eight owned by Time Warner

41:18

and the same messages, the same stuff. It

41:21

wasn't just about demonizing Alex Jones. It

41:23

was about testing the

41:25

old media to flood the internet

41:28

and program it with the

41:30

new Alex Jones. That

41:33

was two things, gay frogs and

41:36

Sandy Hook, two small things I covered, so

41:39

that there was so much jamming

41:41

basically so much smoke in the

41:43

fog of this that that

41:47

is all you would see or all that you would hear. That

41:50

didn't work either. That didn't work. So

41:53

this elaborate plan is to

41:55

create all of these stories that are being put

41:57

out about Alex, 14,000 a day, and

42:00

everything, make it all about Sandy Hook and

42:02

Gay Frogs in order to create a new

42:04

version of Alex that'll live inside this AI

42:07

because him and his ideas are too dangerous

42:09

and they infected all the models, so they're

42:11

trying to create a less potent

42:13

version of Alex. That makes sense. Which

42:16

is now being overlaid onto the

42:18

real world. That makes sense too.

42:20

Totally. Yeah. Now I would

42:22

say that the only reason that the Gay Frogs

42:24

thing is big is because Alex turned that into

42:26

a meme himself. Right. Second, I think

42:29

people have been actually generous about what

42:31

he did about Sandy Hook. I think

42:33

the reality is worse than a lot

42:35

of people think. Much like Trump, yeah.

42:37

And then third, one

42:39

of the reasons that I started this

42:41

podcast was because I got really fascinated

42:44

by Alex in 2016 and

42:47

there were no resources about him.

42:49

I was looking into stuff and

42:51

there was no information about him

42:53

except for weird blogs that said

42:55

he worked for Israel and they

42:57

were clearly just based in deep

42:59

antisemitism. Yeah, yeah, yeah. And there

43:01

was very little actual coverage of

43:03

him. So this notion

43:05

that he has of like, there was so

43:07

much about me, they were testing all this

43:09

stuff on me, really rings

43:11

untrue to me from this time before

43:13

2016. I

43:16

think I get, I understand I

43:18

think what he's saying because

43:21

it does make a certain sense. If

43:24

you can only view things through a

43:26

weird narcissistic lens, right? So

43:28

if you are successful, you

43:31

know, if you are successful, despite

43:33

being delusionally

43:35

narcissistic, right? You can't

43:37

just be like, oh,

43:39

this is success. You have to be

43:42

like, here is why I specifically am

43:44

the success. It can only be because

43:46

of me. Right. And

43:49

as opposed to like a more honest

43:51

situation, which is I started doing this

43:53

thing, I stumbled upon a place that

43:55

no one else was and then people

43:57

came to me and then it became

43:59

a feedback loop and as much as

44:01

it is me, it's also them and

44:03

it's also just kind of luck,

44:06

I guess. No. Google

44:08

chosen. For us, you stumbled

44:10

upon something that wasn't there,

44:13

that wasn't covered and it

44:15

wasn't like you were chosen

44:17

by God. But it

44:19

was important and it needed to be done and so that's

44:21

how it worked. It's not like

44:23

you're some sort of super genius, which

44:25

I'm not saying you're not brilliant, but

44:28

it's not like you're chosen by... I'm sorry, Dean.

44:30

Right. You were not chosen by God. Oh,

44:33

and nor would I think that I'm Neo. I

44:36

think you do think you're Neo. No. You

44:38

say it at the end of every episode. I'm making

44:40

fun of Leo's comment. You

44:45

can only wear a mask so long before it

44:48

becomes one's true face. Is Alex making fun of

44:50

Leo's comment? Is everybody making fun of him? Is

44:52

that what's going on here? Oh, I think we

44:54

need to make him a household name. So I

44:57

was troubled by what I was hearing here because

44:59

I mean this is... Oh yeah? What

45:01

makes that... What troubles you? Mostly

45:03

the themes, the ideas, the way it's

45:06

being expressed. I think everything around this

45:08

is just pretty upsetting. And

45:11

so he gets to this

45:13

idea of the fake him

45:15

that's being created. And he talks about how the

45:17

Sandy Hook lawsuits were just done because of that.

45:20

This doesn't tie in. This

45:22

is central to

45:25

six plus years ago, the Democratic Party

45:27

funds, lawsuits in Texas and

45:30

key jurisdictions they control in Austin and in

45:32

Connecticut, in

45:34

courts they control to

45:37

try to shut us down. And

45:41

they have show trials where I'm already found guilty and put

45:43

all this fake information about me. And

45:46

then they say on the courthouse steps, we want, and it's

45:48

the same law firms went after Giuliani and Trump, very

45:50

same people. And Elon Musk,

45:52

they've sued him, same group. And

45:56

they say, and they have PR firms that go out and say

45:58

these things you didn't say and then they build up this construct

46:00

that you're doing bad things you didn't do. Then

46:04

they raise hundreds of millions of dollars off your name and

46:06

then they create a new identity for you They

46:09

silence you they censor you and

46:12

then they create an artificial you in the

46:14

old media To now

46:16

train the new digital universe

46:19

with the imposter This

46:22

is the nature of evil. It's a counterfeit that is

46:24

the nature of evil. So we know

46:26

the CIA and FBI ran this It's come out in

46:28

court. They admitted it, undercover videos,

46:30

all of it But

46:32

I didn't have all the pieces together. He

46:35

didn't have all the pieces until now. Everything.

46:37

And I will say, I think he's expressing

46:39

himself very clearly. Yeah. Oh, no, I understand.

46:41

It's a lot like why the street signs

46:43

are green. Mm-hmm. He gets it I mean

46:46

I get it. Yeah, well, I've been saying

46:48

it makes perfect sense. Yeah Yeah,

46:50

this was not what I expected when I tuned

46:53

in. I was thinking... I

46:56

mean, I wasn't kind of expecting this

46:59

at all. Honestly, 'cuz it fits, it

47:01

fits the pattern. If I'm not, if

47:03

I'm not capable of recognizing a pattern

47:05

like this after this many years, I

47:08

am, though. I am at fault. Well, here's, here's what's kind

47:10

of interesting about... Sure.

47:13

What Trump did at that conference, in

47:15

that interview... I again forgot about that entirely.

47:17

But what he said how he expressed

47:19

himself is totally unsurprising based on the

47:22

person that we know him to be.

47:24

Yeah, right Yep, but seeing it is

47:26

still kind of like wow, huh?

47:28

How about that still doing it? Huh

47:30

Alex talking about how he's neo and

47:33

the matrix is designed against him because

47:35

he's too dangerous Yeah, like yeah, we

47:37

knew this to be who you are.

47:39

Yeah, this doesn't it's but hearing him

47:41

say it is still weird It is

47:43

kind of weird whenever you can know

47:45

what a person is thinking. Mm-hmm and

47:47

yet The idea

47:49

of them saying it out loud

47:52

is somehow different This

47:54

you know, like we are it is it is

47:56

perhaps a great example of how we are social

47:58

animals in like I am

48:00

totally fine leaving you alone as long as we're

48:03

both quiet. You can believe all kinds of crazy

48:05

shit. I don't care. As long as we're both

48:07

quiet and reading our own little books, we can

48:09

get our hair done together, you know? This is-

48:11

but see, this- what you're expressing is why we're

48:13

not as good as bees. Yeah. I

48:15

mean, they are perfect. Perfect. Perfect. So I

48:17

was- I was listening to this, uh, Alex

48:20

talking about how they're trying to steal his identity

48:22

in order to create a fake version of him

48:24

and overlay it with reality. Sure. And this has

48:26

been the plan since Google started and- Yeah. And

48:29

all this, and I'm like, does that make sense?

48:31

What is this building towards? That is a good

48:33

question. That is a good question.

48:35

Because we are meandering at best. Quite- quite

48:37

off track. Yeah, yeah, yeah. And I think

48:39

he's just mad that they wanted to take

48:41

his Twitter account. That sounds right. So

48:44

now, under law- Under law!

48:49

I'm able to sell this company's assets

48:52

to- and put it up for bid to buyers. And

48:57

I'm able to then,

48:59

if I like who the buyer is, continue

49:01

this operation on. All

49:03

right. Of the assets. And

49:06

then I, myself, Alex Jones, go into the future with their fake

49:08

debt and all the rest of it, their fake rulings that I'm

49:10

appealing. And that's fine. I'll look

49:12

here about the mission. You know, a little bit of a hassle, but

49:14

hey, I asked for it. There it is. Deep

49:16

State's after me. They

49:19

have now, and I'm getting ready to

49:21

release all this, said point

49:23

blank, yeah, we want him off the air. We don't

49:25

want any money. And they've told the new US trustee,

49:28

who was appointed by the judge, that

49:33

they want InfoWars and they want the archive

49:35

and they want the website because they want

49:37

to shut it down and say that they

49:39

own it or somebody they have buy it

49:41

does, so they can go

49:43

around and claim ownership of all the

49:46

stuff I've done in perpetuity and have

49:48

it removed and use AI systems to

49:50

basically remove everything I've said and done

49:52

from the internet. Censorship 3.0. They

49:55

want to erase me. So

49:58

here is what Alex is really

50:01

up against here. He

50:03

has to sell Infowars because

50:05

his personal bankruptcy is in

50:07

liquidation. The process

50:10

is still ongoing as it relates

50:12

to Infowars, Free Speech Systems, the

50:14

business. Alex has to sell that

50:16

off as his estate gets

50:18

liquidated. And so he will need to

50:20

find a buyer. If he finds a

50:23

buyer who is ideologically aligned with him

50:25

then they can keep the show going

50:27

and maybe eventually realize, hey, we're at

50:29

a loss here and just dissolve the

50:31

business or something. They can do whatever.

50:34

But if Soros buys

50:36

it, in theory, he

50:38

then owns infowars.com. Alex's

50:41

Twitter handle is debatably company

50:43

property. The videos that

50:46

Alex has put out, they could have

50:48

a copyright claim on all of those

50:50

things. Alex, over the course of his

50:52

career and on this episode, is very

50:54

clear that everything

50:57

is copyright free. You have the right

50:59

to disseminate his shit. What we're doing

51:01

is totally fine. He has no claim

51:03

on any of his own

51:05

stuff. You can make copies of

51:07

his tapes and give them out.

51:09

But were someone else to

51:11

buy the company, they wouldn't have to follow

51:13

that. They could copyright strike all kinds of

51:15

stuff. They could take us

51:17

down in theory. They would have to then copyright

51:20

it though. Yeah. I don't know

51:22

how that process works. It's fairly easy,

51:24

but it's an individual process. So everything

51:26

would have to be. But I don't

51:28

know if these things are

51:30

not copyrighted at all or

51:32

if they are and Alex doesn't exercise

51:35

any of it. I

51:37

mean, if you're saying it's not copyrighted, if

51:40

you're a lazy organization

51:44

by trade and

51:46

you say things aren't copyrighted, I

51:48

imagine that it's possible that

51:50

you eventually stop caring. Maybe your lawyer at

51:52

the beginning was like, we got to make

51:54

sure we copyright anything and then you're 20

51:56

years in and you're like, ah, fuck it,

51:59

who cares. Maybe. But what Alex is

52:01

running up against is like if

52:03

I have to sell this company to somebody

52:05

who is not a friendly to me, then

52:08

I stand to basically lose

52:10

ownership over pretty much everything that I've

52:12

ever done Because all

52:14

of that is through free speech systems and

52:16

infowars.com. So like

52:18

the archives of his show obviously they're

52:21

owned by Free Speech Systems. He doesn't

52:23

personally own those, though, I would... Right,

52:25

because he... I mean, it

52:27

is, it is funny. Like, building

52:29

the company so you aren't personally

52:31

liable for so much shit

52:33

has only helped rich people

52:35

up until this point, in which case

52:38

he is both personally and professionally

52:40

liable. So this, this is fucking...

52:43

yeah. Yeah. Wild. If it were

52:45

only one, like, if these

52:47

lawsuits hadn't targeted both Free Speech

52:49

Systems and Alex... Yeah, there's a way

52:51

he could have hidden one way or the

52:53

other. Right. Because of this, it creates this

52:56

really strange tension that he has

52:58

to live in. Yeah, I

53:00

just don't... I... I really... I understand

53:03

the trustee, right?

53:06

I don't understand why Alex should have

53:09

any input whatsoever Hmm, you

53:11

know what? I mean? Like I understand

53:13

that well He has to have some

53:15

input when you're talking about selling the

53:18

business Sure, because he is a single

53:20

talent business sure if he doesn't consent

53:22

to work for the person The company

53:24

is worthless right so like if you

53:26

want to get a friendly buyer He

53:29

needs to play ball right after that

53:31

So there is some aspect of it

53:33

where he has to be involved in

53:35

whatever happens if you're in good faith trying

53:38

to sell This right right no, that's what

53:40

I'm saying You know like I understand the

53:42

US trustee But the trustee should be listening

53:44

to the families not to Alex So

53:46

if the families are saying we just want info

53:48

or shut down then the trustee should go like

53:50

fuck yeah, we'll sell it for a dollar to

53:52

Dan and Jordan. I think it should be a

53:54

mix of the two sure I'm

53:58

not telling people what to do I'm just saying

54:00

it seems confusing to me that Alex has so much

54:02

say over it You know I understand why he has

54:04

a voice, but I do agree with you that it

54:07

should be quieter. Yeah So

54:09

we get off this topic Because

54:12

we I think it's been fully explored by

54:14

Alex that he is, he is the most

54:17

important person in the world Yeah, and he

54:19

talks to a guy. He has a guest

54:21

on, okay, named Anthony "Muckraker" Rubin. Oh

54:24

God, having a nickname like Muckraker is not a

54:26

good sign. No, it's a great sign. It's

54:28

a great thing. So this is going on and

54:30

the number between 9 and 14 Keep

54:33

showing up another big polling agency a month ago

54:35

in Texas did one they found 14 point

54:38

3% of the people saying

54:40

that they voted in the last midterm And

54:43

it was 11% in Texas and the one before that that

54:45

was just another poll So

54:47

that's enough to steal the election right there, and

54:50

the Democrats have passed laws all

54:53

over the place All

54:56

over the place to register illegals

54:58

they go, but it's just for local elections

55:01

How many elections are just local? Almost none. Sometimes

55:03

there's a special election. That's it Now

55:07

incredible investigative journalists broken

55:09

some of the biggest stories in Mexico and in

55:11

Central America and on the border been kidnapped at

55:13

gunpoint, you name it. Anthony "Muckraker"

55:15

Rubin joins us from

55:17

muckraker.com and Real Muckraker on X, on

55:20

YouTube Real Muckraker. So I have no

55:22

idea if he was actually kidnapped at

55:24

gunpoint I don't know

55:26

I don't care. Sure. If so, that

55:29

must have been tough. It could be. But

55:31

so they're talking about immigrant

55:33

voting, obviously, which is illegal. Uh-huh. But

55:35

I also thought that now We're supposed

55:37

to believe that the election is hashtag

55:40

too big to rig I

55:42

thought that was the storyline that we had now so

55:44

now I guess voter fraud is possible And I

55:47

can't keep it straight what I'm supposed to be

55:49

afraid of. I... well, here's what I understand

55:52

Alex to be saying when we were

55:54

running against Biden it looked like we

55:56

would actually win now that we're running

55:58

against Kamala It looks like we're going

56:00

to have to steal the election. I

56:02

think 100%. We

56:04

need to keep this ball

56:07

in play because our chances

56:09

might be feeling a little bit less good. Things

56:11

have changed, yes. So Alex is interviewing this guy,

56:13

Anthony Rubin, who runs a blog called Muckraker, which

56:15

is a front for the Heritage Foundation. In

56:18

addition to their Project 2025, they also

56:20

have a media bullshit wing called Oversight

56:22

Project, which is run by a guy

56:24

named Mike Howell. In a

56:26

May 2024 interview with NPR, Howell

56:28

said, quote, the relationship between Muckraker

56:30

and Heritage is a very, very powerful

56:33

one. It's not one we

56:35

go into great detail because as you

56:37

know, we're going up against a very

56:39

powerful and dangerous people to include the

56:41

cartels, weaponized Biden administration, etc. and we're

56:43

not interested in giving an org chart

56:45

out. Howell and Rubin

56:47

himself were interviewed by NPR because the

56:50

Heritage Foundation had just published a bunch

56:52

of stories about how there was a

56:54

flyer alleged to have been found in

56:56

Mexico in front of an NGO called

56:58

the Resource Center Matamoros, which said, quote,

57:01

reminder to vote for President Biden when

57:03

you're in the United States. We need

57:05

another four years of his term to

57:07

stay open. This was

57:09

written in what is described as, quote,

57:11

awkward Spanish. Oh yeah? Yeah.

57:14

Almost as if it was written by someone who is

57:16

not a native speaker of the language. It's awkward. So

57:19

the logo for the Resource Center Matamoros

57:21

and their founder's name were on the

57:24

flyer. This person, Gabriela Zavala, denied any

57:26

involvement or knowledge of the flyer and

57:28

said, quote, I was almost in a

57:31

state of shock. This is completely untrue.

57:34

This was after she started to get a bunch of threats.

57:36

Sure. Violent threats. Let

57:39

me show you my signature. You're fake at

57:41

the signature. Oh, God damn it. We're fucked,

57:43

aren't we? So it's pretty clear this flyer

57:45

was a hoax because it uses outdated information

57:47

that matched what was on the Resource Center

57:49

Matamoros's website, like a disconnected phone line that

57:51

they didn't use anymore but hadn't been updated,

57:53

as well as text that was copied from

57:56

that website mixed with this added message about

57:58

voting for Biden. It's a setup. Anthony

58:00

Rubin comes into this story because he's the

58:02

guy who claims that he found the flyer

58:04

or he was the one the sort of

58:06

the Beginning point of how's

58:08

his Spanish awkward hmm So this was

58:11

then reported on by his site muckraker

58:13

and disseminated by the Heritage Foundation's oversight

58:15

project Which is their media bullshit stuff

58:17

attempts to follow up on this story

58:20

were fruitless NPR visited the

58:22

location where the flyers were supposedly found

58:24

but this wasn't even a formal migrant

58:26

center And it hasn't been serviced by

58:28

Matamoros in years Yeah further quote migrants

58:30

at the encampment denied ever seeing the

58:33

flyers NPR spoke with migrant aid

58:35

workers who said they never saw the flyers

58:37

or heard about them from migrants or volunteers

58:39

great So NPR interviewed Howell

58:41

and Rubin the two people most at

58:43

the center of this story, Anthony Muckraker

58:46

Rubin and the guy from the Heritage

58:48

Foundation again Mike Howell was asked if

58:50

they reached out to the Resource

58:52

Center Matamoros or the named founder Gabriela

58:55

Zavala Before they published

58:57

their claims to which he replied why

58:59

why why would I do that? What are

59:01

you fucking stupid? I don't need them. I don't

59:03

want that is that what he said more

59:05

or less, but since I said quote No,

59:09

no, we published it it was in the

59:11

immediate public interest to know about the invasion

59:13

in the United States We had very little

59:16

confidence that somebody who says their goal is

59:18

to fight US policy and is running an

59:20

invasion camp would be willing to play ball

59:23

There was no attempt to confirm the story because

59:25

they knew their story would never be confirmed So

59:27

who cares the headline is what matters

59:29

not the story basically you could make up any

59:31

kind of shitty flyer you want that says Anything

59:35

that you feel and if it's politically useful

59:37

for someone like Mike Howell He's willing to

59:39

publish it the person who you're

59:41

making claims about is evil So they would never

59:43

admit that they're evil. So you should just assume

59:45

this is all real and true Incidentally

59:48

it turns out that Anthony Rubin had been making a

59:50

bit of a target of this Resource

59:52

Center prior to the flyer

59:55

story being published Rubin had gone to

59:57

the center and pretended to be interested

59:59

in volunteering Speaking to the center's director,

1:00:01

Hugo Terrones. Oh my god. Terrones later

1:00:03

told NPR, quote, that Rubin persistently asked

1:00:05

him if he knew of organizations in

1:00:07

the United States that could help migrants

1:00:10

vote for Biden, or if he

1:00:12

would vote for Biden. Oh

1:00:14

my god. It seems pretty obvious that someone

1:00:16

made a fake flyer that was passed along

1:00:18

to Anthony Rubin who reported it in conjunction

1:00:20

with the Heritage Foundation without doing any verification

1:00:22

of their story because it was all just

1:00:24

meant to be the basis for an attack

1:00:26

on immigration and the basis for claims of

1:00:28

election fraud in the case of

1:00:30

Trump losing the election. It's

1:00:33

incredibly transparent propaganda and this guy

1:00:35

is super not interesting. Just

1:00:38

some dime store ass James O'Keefe

1:00:40

type. Yeah. Yeah. You

1:00:42

know, it's interesting to think about in the

1:00:44

context of like a William Randolph Hearst kind

1:00:46

of situation where it was essentially the same

1:00:48

thing, but the reach wasn't as

1:00:50

big. You know what

1:00:53

I mean? Because of stuff like this

1:00:55

now, it can be disseminated in a

1:00:57

million different ways through a million different,

1:00:59

you know, instead of it being like

1:01:02

William Randolph Hearst's only publications publishing bullshit

1:01:04

because fuck it, I'm rich. What

1:01:06

are they going to do? You know,

1:01:08

now it's disseminated through a million different

1:01:10

things, you know? So this guy publishes

1:01:13

it, but something like it is

1:01:15

published somewhere else with maybe something, you

1:01:17

know what I'm saying? Yeah. It's

1:01:20

not just centralized. It's a decentralized fuck ton

1:01:22

of bullshit. No, I see. I see what

1:01:24

you mean. I just, what

1:01:27

I'm lost in my head about

1:01:29

is like, would William Randolph Hearst

1:01:31

be embarrassed by this flyer? No,

1:01:33

absolutely not. Abso-fucking-

1:01:35

lutely not. He would,

1:01:37

you know, with

1:01:40

abandon masturbate in front of an

1:01:42

open window at this at

1:01:44

the thought of this. I just think

1:01:46

the flyer's a little thin. Oh,

1:01:49

he believe me. He's enjoyed a

1:01:52

greater career because of less. So

1:01:54

the two of them, their interview is largely,

1:01:56

mostly and largely about

1:02:01

just making up shit and complaining about

1:02:03

migrants. It's fun. This is

1:02:05

people getting driver's license and then they get

1:02:07

signed up with the Democrats and paid to

1:02:09

vote from their addresses. Sometimes they get caught,

1:02:12

at times, one person voting five,

1:02:14

six seven eight times under different names they're

1:02:16

given so they're not just voting under their

1:02:18

illegal alien real name, they're

1:02:21

also, sorry, the data mine. Zuckerberg, 450 million

1:02:24

bucks he spent to get

1:02:26

databases of dead people and folks who move out

1:02:28

of district so you move out of district another state

1:02:30

but you're still voting or you're in the ground dead

1:02:33

used to be a Republican now you vote Democrat that's

1:02:36

right yeah that's right so why

1:02:38

do those people have to be migrants if

1:02:41

if it's if it's if you can

1:02:43

just go and keep voting over and

1:02:45

over again it could be anybody right

1:02:47

I mean wouldn't have it has to

1:02:49

be migrants because that's the contract with

1:02:51

the devil hmm I

1:02:53

I I saw I wish that

1:02:56

that didn't immediately sound like something that

1:02:58

he would say it was probably you

1:03:00

know it really doesn't know it's a

1:03:02

yeah it's a minor it's a minor

1:03:04

magic thing I can

1:03:07

understand within the prism of the

1:03:09

conspiracy why they need to vote

1:03:11

in their name right so

1:03:14

we could we can go ahead with that

1:03:16

right but anybody could be using the Zuckerberg

1:03:19

list yeah I don't understand that part no

1:03:21

no it's a it's

1:03:23

a plot hole also I think I think

1:03:25

one thing we really need to do and

1:03:27

I think this is this is

1:03:29

something that we can also do beyond political

1:03:32

divide we don't need that for this we

1:03:34

got to figure out exactly how much money

1:03:36

a billion dollars really is because because here's

1:03:38

what I think of is something that's crazy

1:03:41

expensive I think at the end of it

1:03:43

at the end of my purchase I will

1:03:45

have almost like no money in the bank

1:03:47

left you know that will be a crazy

1:03:50

big purchase the idea

1:03:52

of a crazy big purchase being like

1:03:54

one one hundredth of your

1:03:56

total wealth and that you

1:03:58

make it after a year That doesn't

1:04:01

count. That's nothing. Yeah,

1:04:03

it's the relative-ness. You know what I'm

1:04:05

saying? If you were like, this cost

1:04:07

Mark Zuckerberg 55 billion dollars, I'd be

1:04:09

like, holy shit, he wanted that. Like

1:04:13

Musk and Twitter. Totally, yeah. This is a

1:04:15

serious purchase that he's wasting all of his

1:04:17

money on. Good luck, buddy. This

1:04:19

dude needs this. This dude

1:04:21

needs this shit. Yeah, yeah, yeah, yeah. Exactly.

1:04:23

Mm-hmm. Yeah. I

1:04:26

don't know. Yeah, it's tough.

1:04:29

So Anthony Muckraker-Rubin. God damn

1:04:31

it. And Alex Muckraker-Jones. They

1:04:34

both are calling for

1:04:36

mass deportation. Sure. And you'll

1:04:38

never believe the story that Rubin tells

1:04:40

to justify this. Oh, yeah? They know

1:04:43

Trump's really 20 points ahead nationwide, more

1:04:45

than 10 in battlegrounds. They

1:04:47

think with the illegal aliens, the dead people voting, it's still not

1:04:49

enough. That's why they tried to kill Trump. But

1:04:52

it's so close because of the fraud. They've got

1:04:54

at least 10 percent banked in. Okay,

1:04:56

so we've got to have maximum landslide

1:04:58

to override this. We've got to

1:05:00

have deportations of all the military-age men immediately.

1:05:04

The country's entire future is hanging in the balance

1:05:06

right now. Absolutely.

1:05:10

You know, Donald Trump has talked about

1:05:12

mass deportations when he's put in office.

1:05:15

If he wins, I didn't like that he dodged

1:05:17

that question during the first presidential debate. That was

1:05:19

not encouraging, but that's exactly what we need. We

1:05:21

need mass deportations. I mean, it has to be

1:05:23

scorched earth. I mean, I'm not going to allow

1:05:25

this to stand at all. You know, I'm walking

1:05:27

down Roosevelt Avenue yesterday. Okay. I

1:05:30

believe the story already. I already said this. There's

1:05:32

gangsters in every corner. But I'm walking down

1:05:34

the street, and I see a guy wearing

1:05:36

a CDG hat. That stands for Cartel Del

1:05:38

Golfo. That's the same cartel organization that kidnapped

1:05:40

my brother and I. And listen,

1:05:43

this guy would not be wearing that hat unless he was

1:05:45

part of that organization. Okay, these people aren't

1:05:47

playing games wearing these hats for fun. It's not like

1:05:49

a Yankees hat. And so, I mean,

1:05:51

this guy wearing this hat, a golf cartel member, I

1:05:54

glance at him, right? I'm just moving my head. But

1:05:57

just looking around, just doing a pan of

1:05:59

the area. And this guy starts screaming at

1:06:01

me, threatening me. And these are the people

1:06:03

that are in this country. That's one of

1:06:05

hundreds of thousands. And so, like you said,

1:06:07

we need mass deportations. And I'm convinced that

1:06:10

that is a compelling argument for why we

1:06:12

need mass deportations. Because a guy with a

1:06:14

hat that made me decide that he was

1:06:16

a cartel member yelled at me on the

1:06:19

street. When I would

1:06:21

guess he was probably doing a little more

1:06:23

than panning his head around. I'm just guessing

1:06:25

based on his track record of behavior, trying

1:06:27

to fraudulently go volunteer at this resource center

1:06:29

in Mexico. And he isn't trustworthy. No,

1:06:31

seems like a piece of shit. Yeah. Also,

1:06:33

maybe this guy with a hat was a

1:06:35

big fan of Charles de Gaulle. I

1:06:39

mean, I don't... Sometimes.

1:06:45

Sometimes they make me

1:06:47

angry at them. And then somehow even

1:06:49

more angry at the globalists. But

1:06:52

it's in the same way that somehow people get

1:06:54

angry at the puppet. You know, where it's like...

1:06:56

Oh, it's ventriloquism. Yeah, yeah, yeah. Where I'm

1:06:58

like, Alex, if

1:07:00

you're saying we need to get rid of all

1:07:03

military aged men, then if I'm

1:07:05

the globalist, I'll just be like, hey,

1:07:07

these guys don't believe women are people.

1:07:10

We can win in a heartbeat

1:07:12

if we just make them have

1:07:14

the guns. Sure. You know, like

1:07:16

your plan... Or like, okay, these

1:07:18

people think hats are the

1:07:20

only way that we can communicate with each

1:07:22

other that we are also in the gang.

1:07:24

So we cannot wear

1:07:26

the hats. Yeah, that'd be a good

1:07:29

way to keep a low profile. It's

1:07:31

frustrating. That's why I think the guy

1:07:33

was a Charles de Gaulle fan. He

1:07:35

could have been. Yeah. I love that

1:07:37

airport. I find this guy fucking dumb.

1:07:39

I find what he does abhorrent. Yep.

1:07:41

And he can go fuck off. Yes.

1:07:44

So Alex... Fuck Raker. Yes. That doesn't

1:07:46

work quite as well, but... No,

1:07:49

but fuck him. Yep. So

1:07:51

Alex does finally get around to talking about

1:07:54

the National Association of Black Journalists. There we

1:07:56

go. He

1:07:59

wants to play a clip. Trump being

1:08:01

combative and really sticking it to

1:08:03

one of the moderators he wants to do

1:08:05

that but they play the wrong clip oh

1:08:07

and then Alex gets really mad and

1:08:09

then he storms off the show yesterday

1:08:13

at the NABJ, big

1:08:16

Black journalist event, they

1:08:19

came after everything they had they're

1:08:21

trying to spin it right now

1:08:23

the Trump did a great job

1:08:29

and the crowd just loved him. Kamala

1:08:34

Harris literally her specialty was arresting black

1:08:36

people for marijuana charges and throwing them

1:08:38

in prison and

1:08:40

she never said she was black until now and

1:08:43

Trump is right to call that out she's just it's

1:08:45

all just lies it's all fraud here

1:08:48

was the opening salvo when

1:08:51

they attacked him this

1:08:53

is a really really really important clip clip

1:08:56

one president trump

1:08:58

spends three minutes putting on a master class and

1:09:00

not taking Rachel Scott's bait. There it is. This

1:09:03

another had a stroke where

1:09:06

the people who assaulted those 140 officers including

1:09:09

those I just mentioned patriots who

1:09:11

deserve pardons well let

1:09:13

me bring it back modern day

1:09:15

like let's stop here I got

1:09:18

confused because that's not the clip I

1:09:20

want to play I want to clip

1:09:23

four I want clip clip

1:09:25

four here it is not

1:09:28

true. You have told four

1:09:30

congresswomen of color who were American. Let's just

1:09:32

stop let's just stop now I'm gonna go to rebroadcast for

1:09:34

a while cuz I'm loaded for bear

1:09:36

I'm ready to go. I thought these were other clips. I

1:09:39

want the opening thing it

1:09:41

says opening clip it's not the

1:09:43

opening clip I'm okay but

1:09:46

I'm so prepared here and I

1:09:48

sent the clubs to

1:09:51

clip four and

1:09:55

I'm totally confused so I gotta go to rebroadcast I

1:09:57

gotta find a clip and send it to you I'm

1:10:00

not I

1:10:11

just can't do this anymore.

1:10:14

There's just too much information I'll

1:10:21

send the clip again It's

1:10:24

how they open the conference up. I

1:10:26

don't know how we play two clips. It's not the clip I want

1:10:29

it I

1:10:32

Just just air something we'll see what's gonna happen just uh,

1:10:35

I'll give you guys some time to figure out what you're gonna hear Just

1:10:38

just play something. Okay, we're done. Thank you

1:10:46

One of my favorite genres of moment,

1:10:48

I mean, you know what the problem

1:10:50

is the long long pause It's

1:10:52

the opposite of the problem the long pause and then they

1:10:55

can't do the same. No, that's the opposite of the problem

1:10:57

The problem is I think all

1:10:59

of us deep down are jealous of the ability

1:11:01

to do that Yeah at any job you've ever

1:11:04

had the idea of just being able to be

1:11:06

like listen. I'm done. Mm-hmm. I'll

1:11:08

be back This is your fault. I'm out.

1:11:10

I'm confused. I'll see you tomorrow I know

1:11:12

like that like that idea of just being blunt

1:11:15

and everybody going like, well, I guess that's the

1:11:17

American dream. We all wish we had been

1:11:19

there now. Here's an interesting irony Yeah, so if

1:11:21

you listen carefully to this cuz I had to

1:11:23

go back and make sure I wasn't imagining things

1:11:26

sure Alex says play clip

1:11:28

number one. Yes, and it's not the

1:11:30

clip he wants No, then he says

1:11:32

I got that wrong play clip number

1:11:34

four, right? And he plays another clip

1:11:36

and then he says fuck this. That's

1:11:38

not the right clip. That

1:11:40

was the clip he was looking for Yeah,

1:11:42

he didn't realize it. No, that is the

1:11:44

opening clip Oh my god that they played

1:11:46

the second one was the correct one Sure,

1:11:48

he didn't realize it so he got mad

1:11:50

and then stormed off air I Am

1:11:54

going to say this and I understand that

1:11:56

this might make a lot of bosses

1:11:58

angry over the you

1:12:00

know, that kind of thing. There

1:12:04

is no possible way that you can think clip number

1:12:06

four is the opening clip. It

1:12:08

is simply not possible. But here's the twist. In

1:12:10

this case it was. Right, no, no,

1:12:12

I understand. You can't be mad at the crew.

1:12:15

It is not possible for them to have thought that

1:12:17

clip number four was the correct clip. Especially when you

1:12:19

literally said play clip number one. Play clip number one,

1:12:21

the one that you would assume is the opening clip.

1:12:23

The crew did literally everything Alex was asking them to

1:12:25

do. Yeah, it's not possible for it to be the

1:12:28

same. What an asshole. So anyway, he storms off. But

1:12:30

then he comes back and he's like, I think it

1:12:32

went pretty well. I think Trump

1:12:34

did a great job. This is

1:12:36

the desperate power structure. Going

1:12:38

after him now, the media is spinning like, oh

1:12:40

disaster at this black event. It was

1:12:43

the opposite. It was the real Donald Trump. It

1:12:45

was amazing. And I'd say 80% of

1:12:47

the support you could hear it in the crowd, can't

1:12:49

hear any boos. You hear a bunch of support. Here it is. Not

1:12:52

true. You have told four

1:12:54

congresswomen of color who were American citizens to go

1:12:56

back to where they came from. You

1:12:59

have used words like animal and rabid

1:13:02

to describe black district attorneys. You've

1:13:04

attacked black journalists calling them a loser,

1:13:06

saying the questions that they ask are

1:13:08

quote, stupid and racist. You've

1:13:11

had dinner with the white supremacists at your Mar-a-Lago

1:13:13

resort. So my question,

1:13:15

sir, now that you are asking

1:13:17

black supporters to vote for you, why

1:13:20

should black voters trust you after

1:13:22

you have used language like that? Well,

1:13:25

first of all, I don't think I've ever been asked

1:13:27

a question in such a

1:13:30

horrible manner, first question.

1:13:33

You don't even say, hello, how are

1:13:35

you? Are you with ABC?

1:13:37

Because I think they're a fake news network,

1:13:40

a terrible network. And

1:13:42

I think it's

1:13:44

disgraceful that I came here in

1:13:47

good spirit. I came here in good spirit. So

1:13:51

yeah, that's a little bit of the feel

1:13:54

that you get. You

1:13:56

can definitely hear at least one or two

1:13:58

people clapping. So like, you know. But

1:14:00

that is not a positive reaction that he's getting from

1:14:02

the audience. No. I

1:14:05

mean, it is a weird question because it

1:14:07

would be better phrased as, Black

1:14:09

people shouldn't trust you. Mm-hmm. Period.

1:14:12

Right. Well, I mean, yeah, it's the

1:14:14

attacks that you make on black

1:14:16

people seem to often have tones

1:14:18

of racism built into them. You're

1:14:20

a racist. Right. Yeah. Well,

1:14:24

but see, that's the thing. And I mean, he does have

1:14:26

a certain point there, right? Because normally

1:14:29

if you're asking somebody an actual question,

1:14:31

you do not list at least five

1:14:33

to 10 reasons why the answer

1:14:35

to that question is one you already

1:14:37

know. Right. Well, but

1:14:39

at the same time, setting

1:14:41

it up like this gives you the

1:14:43

opportunity to, if you are a

1:14:46

person who is not, say, racist,

1:14:49

respond in a way

1:14:51

that is political. There's a way that you

1:14:54

could answer that question in a way that

1:14:56

isn't self-sabotaging. Sure. But I will

1:14:58

put this to you. Trump probably can't. I will put this

1:15:00

to you. If you allow some, yeah, I

1:15:02

mean this honestly. If

1:15:06

you ask somebody that question and then

1:15:08

are convinced by their answer, you are

1:15:10

stupid. Unless it's a

1:15:12

really, really good answer. Nope. You

1:15:14

are stupid because that is how good the answer

1:15:16

is. If it's possible for the answer to be

1:15:18

that good, then you are going to lose every

1:15:20

single time. Sure. Yeah.

1:15:24

So I was surprised. I guess not really surprised.

1:15:26

I brought up at the beginning of the episode,

1:15:29

this is on message for Alex. Yeah, this makes perfect

1:15:31

sense. Yeah. So obviously he's going

1:15:34

to think this went well. Trump was standing up to the

1:15:36

anti-white racism of this event

1:15:39

or whatever. So it's

1:15:41

not that deep. Now,

1:15:45

there is a clip of Kamala Harris and

1:15:48

apparently Alex believes that she does a little

1:15:50

bit of a black voice in it. Okay.

1:15:53

All right. We're going to get a code switch. Yes.

1:15:56

And then he talks about how Hillary Clinton does different

1:15:58

voices in front of different audiences. I wouldn't

1:16:00

call that a code switch. So

1:16:03

he believes that Harris does this. Sure. And

1:16:07

this clip is four and a half minutes

1:16:09

long. Oh my God. And the reason is

1:16:11

because Alex loses himself in doing various ethnic

1:16:13

impressions. That sounds right. Here's

1:16:15

Trump on Kamala. And

1:16:18

I've got all the articles right here. She didn't say she

1:16:20

was black. And now she's doing

1:16:22

that thing Hillary Clinton does. When

1:16:24

she gets up in front of a black audience, she

1:16:29

like then affects some what

1:16:32

she thinks is like ghetto black accent.

1:16:37

I mean, I would be insulted if somebody from New

1:16:39

York came down here to speak to let's say Texas.

1:16:42

And it's just a crowd of Texans and

1:16:45

somebody that normally has a New York accent gets

1:16:47

up and says, how you doing today? I want

1:16:49

to eat some possum. I

1:16:51

tell you what I rode to work day

1:16:53

on a horsey. I'm married

1:16:56

to my sister. I got

1:17:00

an outhouse. But

1:17:06

instead, these Democrat politicians who

1:17:08

normally speak in whatever their real accent is,

1:17:10

get in front of black people and they

1:17:12

start doing some caricature.

1:17:17

How everybody doing today out

1:17:19

there? Careful. I mean,

1:17:23

because they're psychos, they're not real people.

1:17:25

They're sociopaths bare minimum. And they

1:17:27

think that if somebody talks like, imagine

1:17:29

if I went and spoke to a group of

1:17:31

say Chinese Americans. Oh no. Oh God. I can

1:17:33

imagine. They asked me to say, come speak to

1:17:35

him. And then I do fentanyl. And

1:17:38

I showed up and I went,

1:17:40

hello, how you doing? I am

1:17:43

here for you. Would you like

1:17:46

some rice bowl for you? I can

1:17:48

imagine that being offensive. Yeah.

1:17:52

I can imagine that being

1:17:55

a rock song. Let's

1:17:57

say an Asian American group as Trump speak

1:18:00

and it said him playing a rock

1:18:03

song coming out or whatever he wants

1:18:05

to he plays I'm

1:18:09

turning Japanese it's just the

1:18:12

song that's real it's so patronizing

1:18:16

and I'm not hung out with liberals a lot in

1:18:18

my life but in college some and places here

1:18:20

with some white liberals you go to a Mexican food

1:18:22

restaurant and the waiter comes over and

1:18:25

they go hello to

1:18:28

the waiter the waiter doesn't have a Hispanic

1:18:30

accent hi well who would you

1:18:32

like well what do you want to drink

1:18:34

how are you I

1:18:36

would like a taco oh

1:18:38

I mean they talk real slow do

1:18:41

you do you get me I

1:18:44

are you calling other

1:18:46

people racist I

1:18:48

don't understand and imagine

1:18:50

if I let's say the NAACP asked

1:18:52

me yes I know this another example

1:18:54

yeah that could be offensive

1:19:08

so let's play just the raw clip of

1:19:10

her talking in a weird, she thinks,

1:19:13

ghetto accent and

1:19:15

then let's play the clip of her

1:19:17

side-by-side with Hillary and one of her

1:19:19

famous deals because if Hillary's talking to

1:19:21

rednecks she literally walks on goes how's

1:19:24

everybody doing today I'd like I said

1:19:26

I just killed me a possum and

1:19:28

I'm gonna cook it like granny where

1:19:33

is Jethro? I guess he's in

1:19:35

the cement pond but

1:19:39

if Hillary's talking to black people she goes hello

1:19:42

everybody how you doing today

1:19:44

that's a deep voice or

1:19:48

imagine if you

1:19:50

go visit Germany and you walk into

1:19:53

a restaurant they say what would you like

1:19:55

to order well I

1:19:57

would like to order a

1:20:01

beer and I

1:20:04

would like to order some beef a

1:20:06

little Marvin the Martian I was gonna

1:20:08

say that's I kind of like that

1:20:10

boys I would appreciate if you did

1:20:12

your whole show like that they would

1:20:14

think you were insane yeah yes

1:20:17

100% or if you're in France

1:20:19

oh boy or bottle wine can you give five

1:20:21

more go but

1:20:25

bring my friend I would like to

1:20:27

order a pair of

1:20:30

the water recommend to my

1:20:32

friend that you go even

1:20:34

further just kind of effect retard was

1:20:36

on my you Given tomorrow mom

1:20:40

she going online that would be early

1:20:42

but but but but but but but

1:20:44

but but but I thought

1:20:46

I was hallucinating while I was listening to

1:20:48

that is that is crazy well

1:20:51

cuz yeah it goes on way too long and

1:20:53

I know I think he just likes doing voices

1:20:55

and I think we all yeah I think I

1:20:57

get that I mean I get it I think

1:21:00

I want I would say he doesn't have

1:21:02

a lot of range I think that some of these are pretty

1:21:04

bad but but here's

1:21:07

here's the path that he went down

1:21:09

yes he believes that Kamala Harris speaks

1:21:12

differently to different groups of

1:21:15

people right and so

1:21:17

he is lampooning that I

1:21:19

guess the first

1:21:21

thing that he comes with is like

1:21:23

what if I were speaking to a group

1:21:25

of Chinese Americans right and then he proceeds

1:21:27

to do a character that he does do

1:21:29

a lot on our show he does so

1:21:31

the premise is really off to a rocky

1:21:33

start what if I did what I do

1:21:35

yeah exactly yeah I can imagine that and

1:21:37

then he just I think that he

1:21:39

just wants to do offensive

1:21:42

impression yeah yeah I mean here's

1:21:45

part of here's part of the problem as

1:21:48

far as understanding what it is anybody

1:21:50

is ever talking about going

1:21:54

to France and then speaking to a

1:21:57

French person in

1:22:00

English with a bad French accent. Acting

1:22:02

like you're Pepe Le Pew. Is not

1:22:04

at all similar to like a code

1:22:06

switch. No. Because a

1:22:08

code switch is meant to like better

1:22:11

communicate with people as opposed to

1:22:13

going up to them and insulting

1:22:15

everything that they are to their

1:22:17

face. Yeah. Yeah. Sure.

1:22:21

Sure. That is true. Yeah.

1:22:23

There are ways in which it

1:22:25

is inappropriate. Yeah, I mean I

1:22:27

think, I don't know

1:22:29

what prevailing theory is

1:22:31

right now. You know, I think a lot of people

1:22:33

for the longest time would have said that growing

1:22:37

up without having to learn a code switch

1:22:39

or without having to indulge in one would

1:22:41

be a sign of like privilege or honor,

1:22:44

you know. But then I think lately a

1:22:46

lot of people are changing their minds and

1:22:48

saying that it's actually like

1:22:51

a bad thing because it's separating people from

1:22:53

the community. You know, it's a lot like

1:22:55

being adopted by white parents, that kind of

1:22:57

thing. No longer having a connection

1:23:00

to, yeah. So, no,

1:23:02

fuck Alex. I'm not sure how to

1:23:04

parse a lot of that, obviously. Sure.

1:23:07

But I know that that's not what Alex

1:23:10

is talking about. No, absolutely not. He's talking

1:23:12

about absurd caricatures of racist images that he

1:23:14

has in his head. Yeah. And

1:23:17

that's how he would act, or he wouldn't act. He

1:23:19

definitely wouldn't act. Yeah, yeah, yeah, no, no, no. Very

1:23:21

strange. Yeah. So I was like, okay, this was

1:23:23

like a five minute string of

1:23:25

impressions that Alex did. Yeah. What

1:23:28

did Kamala Harris do? What did she say? That is

1:23:30

a good question. And so he plays the clip. Here's

1:23:33

the clip, Summer. You believe that

1:23:35

Vice President Kamala Harris is only on the ticket

1:23:37

because she is a black woman. Well, I can

1:23:39

say, no, I think it's maybe a little bit

1:23:41

different. So I've

1:23:43

known her a long time indirectly, not

1:23:46

directly, very much. And she

1:23:48

was always of Indian heritage and

1:23:50

she was only promoting Indian

1:23:52

heritage. I didn't know she was black

1:23:55

until a number of years ago when she happened to

1:23:57

turn black. And now she wants to be known as

1:23:59

black. So I don't know, is

1:24:01

she Indian or is she black? She is always

1:24:03

identified as a black, which was historically black college.

1:24:06

I respect either one, but she obviously

1:24:08

doesn't. Because she was Indian all the

1:24:10

way and then all of a sudden she made a

1:24:12

turn and she went, she became a black

1:24:14

person. Just to be clear sir, do

1:24:16

you believe that she is? I think somebody should

1:24:18

look into that too when you ask a continuing,

1:24:21

a very hostile. And you all

1:24:23

helped us win in 2020 and we're going to

1:24:25

do it again in 2024. Yes

1:24:28

we will. Yes we

1:24:30

will. So

1:24:34

that's it? That's, well,

1:24:36

there you go. Proves everything.

1:24:38

I don't even see

1:24:40

that as being, that's not even

1:24:42

close to something that is a

1:24:44

strange manner of speaking for her.

1:24:48

It is. I don't know what's going on

1:24:50

here. It's so weird. It's so weird in

1:24:52

America how people get away with that because

1:24:54

it is like, it just

1:24:56

goes all the way back to the beginning of

1:24:58

just like, as it's

1:25:01

just the other, it's skin color. It

1:25:03

doesn't matter what a percentage at the

1:25:06

very beginning. It was like, oh you

1:25:08

look, it's a look. It doesn't, it's

1:25:10

not a number. They didn't have blood

1:25:12

tests or DNA tests. They just, you

1:25:14

just look and then we're offended. We

1:25:16

offend you for it. Yeah, it was

1:25:18

a method of exclusion. Yeah, it's still

1:25:20

the same thing. It's not

1:25:22

about the numbers. No, but

1:25:25

he's, yeah, yeah. I

1:25:27

just think that there's something really shocking

1:25:29

about hearing somebody say she

1:25:32

turned black. I mean, that's been, that's

1:25:34

so insane. I think it is part

1:25:36

of why it's insane is because people

1:25:38

are engaging with the euphemism instead of

1:25:41

just the very simple and easy to

1:25:43

understand thing of she does not look

1:25:45

like me. I only think people

1:25:47

who look like me deserve to be in the

1:25:49

country. That's what he said. That's underneath a lot.

1:25:52

Don't engage with, I think

1:25:54

she's 45%. No. Right. No.

1:25:58

It's a silly argument to approach

1:26:00

on his terms. Exactly. I agree

1:26:03

with that. Yeah. Yeah. Yeah. It's

1:26:06

just, I don't know. I don't know how you

1:26:08

can even handle it. It's insane. But thankfully we

1:26:10

don't have to handle it with Alex doing any

1:26:12

more impressions of ethnic groups. That's good. The

1:26:15

voice he's doing is like a cross

1:26:17

between White Hillbilly

1:26:20

and I guess Southern Black. But

1:26:24

that is like, I

1:26:27

just couldn't imagine. Again, I get

1:26:31

asked to speak in front of the

1:26:33

Japanese-American. You are imagining right now. This

1:26:35

is what you are doing. And

1:26:38

they had me on top of the economy. And I walk in, and I go, and

1:26:42

I'll have a hoo. Honeys, hoyo, hoyo,

1:26:44

hoyo, hoyo, hoyo, hoyo, hoyo, hoyo, hoyo,

1:26:47

hoyo, hoyo, hoyo, hoyo, hoyo, hoyo,

1:26:49

hoyo, hoyo, hoyo, hoyo. How do

1:26:51

you do today? It is good

1:26:53

to be here with you. I want to thank

1:26:55

you for having me here to speak. Now

1:26:58

I pull out something I saw and kill you. I

1:27:06

can even put tape on it. I

1:27:10

mean, that is what we're talking about here. I'm

1:27:14

going to stop. Yeah, you probably should. Wait,

1:27:17

to whom by whom, from what?

1:27:20

I do feel like he is describing what Trump

1:27:22

did. I think that is a

1:27:24

great way of describing what Trump did. Trump went

1:27:27

into the National Association of Black Journalists and then

1:27:29

insulted them directly to their face. I can imagine

1:27:31

that. Yeah. True. I

1:27:35

just feel like it's gotten sad now. Alex's

1:27:38

impressions, the fact that he

1:27:40

did a long string of them, wow, it's still

1:27:42

sad. It's self-contained. The

1:27:44

fact that he played the clip and he's like, I got

1:27:46

to do one more. This is a bummer.

1:27:50

It's desperate. He clearly

1:27:52

has nothing to talk about. And

1:27:55

he just is longing for

1:27:57

the days of... It's just

1:27:59

so much moving. It's still movies.

1:28:01

It's like his his accents are

1:28:03

all things that he thinks people

1:28:05

sound like from movies like yeah

1:28:07

It's kung fu movies. It's not

1:28:09

like a shank high shack movie.

1:28:11

It's it's samurai movies. Yes You

1:28:14

know these based on yeah stereotypes.

1:28:16

Yeah, that were built on Characters

1:28:19

absolutely good times yep, so

1:28:21

the Olympics are going on you've been keeping up.

1:28:23

I have I've enjoyed the Olympics It's been a

1:28:25

lot of fun this year all right. Yeah, you've

1:28:28

been seeing some women's boxing. I have not seen

1:28:30

any women's boxing I just heard about all

1:28:32

the things people are saying and I want

1:28:34

to strangle the internet all of you need

1:28:36

to go. This isn't gonna make you very happy.

1:28:38

turn it off turn all of it off

1:28:40

no more electricity Alex has some thoughts Oh,

1:28:42

God no more electricity for anyone about Women's

1:28:45

boxing and the nope we all lost

1:28:47

Algerian boxer Imane Khelif

1:28:50

wins fight at

1:28:53

Olympics after being cleared to

1:28:56

compete in women's events Despite

1:28:58

eligibility fight as

1:29:00

Italian opponent abandons bout Boxer

1:29:04

brings issue of Olympics gendered testing to

1:29:06

surface Wow

1:29:12

Oh Disgusting

1:29:16

women need to boycott all this we all

1:29:18

need to boycott this and

1:29:20

if you need to know anything else about the Olympics, it's

1:29:24

this right here. Yeah, even got him adjusting

1:29:26

his junk Don't

1:29:28

get his pee pee in order That's

1:29:31

dude it looks a mean dude

1:29:34

all snarling And

1:29:36

then after she quits he comes over and walks by like he's

1:29:38

gonna hit her again, man, you're a

1:29:40

real tough guy buddy boy You

1:29:46

know I'm out of shape but I guarantee I'd get

1:29:48

in the ring and beat I'd knock that son of

1:29:50

a bitch out in three Seconds, I

1:29:53

punch him so hard in the jaw right underneath the jaw

1:29:55

break his teeth off, but he matter he's wearing that Stupid

1:29:58

ass mouth guard. I'll break his jaws so fast

1:30:00

will make his head spin. I'll

1:30:02

break my hand on his ugly head. Son

1:30:05

of a bitch. Piece of shit. Absolute

1:30:12

disgusting cockroach. Go

1:30:14

ahead and play the 46 second fight. Here it is.

1:30:17

So that boxer, Imane Khelif

1:30:19

is not trans. They're

1:30:22

just getting mad about nothing. Because

1:30:25

they hate people who don't conform

1:30:27

to their ideas, their, I

1:28:29

guess, precise notions of

1:30:31

gender. So Alex descends

1:30:33

into violent fantasies. I

1:30:36

do think it would be really interesting

1:30:38

to see him try that boxing

1:30:40

match. You ever see that

1:30:44

jackass episode where Johnny Knoxville boxes Butterbean?

1:30:46

Yeah, yeah, yeah. It could be like

1:30:49

that, I think. I

1:30:51

think this is actually such a great

1:30:53

example of what we were just talking

1:30:55

about. Of like, it's not about percentage.

1:30:59

It's not about that. It's not about

1:31:02

people being or not being. It is

1:31:04

about the appearance and the ability to

1:31:06

exclude the other. You're the other. Yeah,

1:31:08

it doesn't matter. It doesn't matter if

1:31:10

you are or not trans. But we're

1:31:12

going to use that as a cudgel.

1:31:15

Exactly, it is not, it is all

1:31:17

of us. All of us

1:31:19

at any point in time could be chosen

1:31:21

to be the other if we're in a

1:31:23

small enough group. And that is

1:31:25

why all of us need to stop this shit.

1:31:28

It would be a wise lesson to learn. And

1:31:30

this is just disgusting. It's just disgusting.

1:31:33

Alex descends into quite protracted,

1:31:37

hate-filled, motherfucker. Motherfucker. But

1:31:40

just to give you- Two voices again. I'm better

1:31:42

with that racism, man. Just to give you a

1:31:44

little sense of how informed he is on this

1:31:46

story. Here's a little clip of him saying something.

1:31:49

Imane Khelif is the definition of a

1:31:51

piece of filth. Beating

1:31:55

up a woman on international TV.

1:31:58

Snarling at her. Celebrating. and

1:32:00

then probably getting a gold medal. Ha

1:32:02

ha ha ha ha! I mean, I'm laughing because

1:32:04

I don't want to cry here. The

1:32:08

Olympics is a total joke. Look at this guy's

1:32:10

face. As he punches a

1:32:12

woman in the face. So he thinks that she

1:32:15

won a gold medal. It was a preliminary round.

1:32:18

You know, the medals haven't been... I

1:32:20

mean... He just doesn't even know what he's

1:32:22

talking about. He just knows that this is what

1:32:24

he's supposed to be angry about. And he's getting angry about it.

1:32:27

It's touching into a lot of his feelings. I

1:32:29

don't know. I mean, we started this episode with him talking about how

1:32:31

he's Neo. And Google has planned

1:32:33

all their shit based on his resistance.

1:32:35

Yep. Bunch of nonsense.

1:32:38

And we descended into a slew

1:32:41

of racist impressions. He talked to

1:32:43

this weird guy who did some

1:32:45

James O'Keefe kind of stunts to

1:32:47

stoke hatred of migrants. And now

1:32:49

we're into, I hate

1:32:52

trans people because I was told to... And

1:32:54

I'm going to yell about this boxer. I

1:32:58

am Neo. The internet is built around

1:33:00

me. And now to prove that,

1:33:03

I'm going to follow whatever is popular today.

1:33:05

Yeah. I'm going to get mad at the

1:33:07

thing that the right wing meme factory told

1:33:09

me to be mad about. Yeah.

1:33:12

And so he does go

1:33:14

quite deep into his trans

1:33:16

hatred. Great. Because everyone's

1:33:18

predators and all that shit.

1:33:22

But here, here's a little clip of

1:33:24

him talking about Elon Musk's child in

1:33:27

a way that is not

1:33:30

appropriate. And you ask, why do

1:33:32

they do this? Because the sky is

1:33:34

the limit. They want to see what

1:33:36

they can get us to put up with. And the

1:33:38

craziest thing the globalists could come up with is this.

1:33:42

And then mainstream media reports

1:33:44

on it like, well, there's a little debate about

1:33:47

it, you know, but we

1:33:49

love women. The globalists don't like

1:33:51

women. They want to destroy women. They

1:33:54

want to inject you with poison shots. They want to sterilize

1:33:56

you. Elon

1:33:59

Musk said about his son, he's being

1:34:02

sterilized as we speak. His

1:34:04

son is dead. He

1:34:07

sent his son because his wife

1:34:11

wanted to to an elite school. And

1:34:15

the elite school brainwashed his child. And

1:34:20

they took his child away from him and they sterilized him. And

1:34:23

now you gotta go on and let's deal with. But

1:34:26

think about the cult.

1:34:29

Okay, if you're gonna go sterilize people's kids and

1:34:31

make millions of dollars over their lifetimes, ruin their

1:34:33

lives and your whole CIA cult,

1:34:37

do you think messing with one of

1:34:39

the most powerful people in the world

1:34:44

is a good idea? No, they don't have

1:34:46

any sense. To

1:34:51

them it was a bigger win. Oh, we've

1:34:54

got Elon Musk, first born son. Instead

1:34:57

of cutting his scalp off, the

1:35:00

son of our power will chemically destroy

1:35:02

him. So this is interesting, because Elon

1:35:04

Musk's daughter has come out and talked

1:35:06

about how he's a piece of shit

1:35:09

and he can fuck himself with all this nonsense. And

1:35:12

I think that what

1:35:15

Alex is expressing is really, really

1:35:17

weird. I think

1:35:19

it's a very strange mindset,

1:35:21

because I guess what he, here's

1:35:24

what I had to wrestle with. I think

1:35:26

the first born son thing is actually

1:35:28

pretty important, because he believes that race

1:35:30

memories are passed down in the first

1:35:32

born son. Sure, sure, sure, that's Bible

1:35:34

shit. Right, but he does believe that.

1:35:37

He believes that. I mean, half of them believe that.

1:35:40

Alex does. Yeah, yeah, totally, totally. Alex believes that

1:35:42

your first born child is critically important, because that's

1:35:44

where all the epigenetics go and all that shit.

1:35:46

Right, that makes sense. So he

1:35:48

is thinking about it on even

1:35:51

another level. This

1:35:53

cult has come along and stolen away

1:35:55

the race memories of Elon Musk. It's

1:35:57

not just a child, it is. I'm

1:40:00

like, I'm listening to this and I'm like,

1:40:03

we do, oh,

1:40:05

Massive Attack? All

1:40:08

right, yeah, tricky. So,

1:40:10

yeah, I'm listening to this, I'm like, oh, here's

1:40:13

the key to the puzzle. You said at the beginning he's

1:40:15

a big fan of Alex's show. There is that. So he

1:40:17

just listens to like right wing media and then he like

1:40:20

pretends that he has visions from God that

1:40:22

affirm all of those paranoid fantasies. Yeah. Woo!

1:40:24

So as it turns out, he was a

1:40:26

guy who was a janitor

1:40:29

at a church for a

1:40:31

number of years, cleaned a lot of toilets. I

1:40:33

don't. And he was alone in these bathrooms cleaning

1:40:35

toilets and God would come and talk to him.

1:40:38

And that's how he started getting visions.

1:40:41

And yeah. I don't

1:40:43

think there's any difference between a janitor or

1:40:45

a preacher getting visions. I don't

1:40:47

think there's any difference between those two. No, not as,

1:40:49

I'm just telling you what he said his story was.

1:40:51

Nope, nope. This is, I think it's a better story,

1:40:54

honestly. Is he a preacher now? On

1:40:56

YouTube. Oh man, it'd be great

1:40:59

if he was, here's what

1:41:01

needs to happen for him as a

1:41:03

storyline, right? It needs to be, I was

1:41:06

a janitor, I made these prophecies, then I was

1:41:08

elevated, you know? And now I am the pastor

1:41:10

of my church. Well, he is kind of, it's

1:41:13

just on YouTube. Yeah, yeah. Not good enough. Well,

1:41:15

God told him to do the YouTube thing. God

1:41:17

needs to get to work. God told him to

1:41:19

do the YouTube thing. This is what he's supposed

1:41:21

to be doing. That is nice of God. So

1:41:23

he has another prophecy. Okay. Of another COVID. There's

1:41:26

gonna be another plague, but

1:41:28

it has to do with, I think, Ozempic. But

1:41:30

the major thing that the Lord has

1:41:32

been talking to me about to warn

1:41:34

the people is they're designing a new

1:41:36

plague. And he said, Brandon, I need

1:41:38

you to tell my people that there

1:41:41

is a massive plague that will make

1:41:43

COVID look like a walk in the

1:41:45

park. He said, Brandon,

1:41:47

they are using AI technology.

1:41:50

And I saw them taking the DNA

1:41:52

code of this virus

1:41:54

that they're making in a lab.

1:41:56

I saw these leaders. I saw

1:41:58

these people. I don't from

1:42:01

my life. I probably shouldn't name their names, but

1:42:03

I saw these Men

1:42:05

in a meeting and they

1:42:07

were designing this virus me

1:42:10

he it was

1:42:12

it was going to be spliced with

1:42:14

the medication that you take for a heart

1:42:16

disease and Diabetes and things

1:42:18

like that and they were using it

1:42:20

with AI technology the the people

1:42:23

who take those viruses He was cooked the way

1:42:25

you're talking about a binary weapon. But to be

1:42:27

clear, you're saying

1:42:29

it was designed to hit people that were already on certain

1:42:31

medications Yes, sir And it was

1:42:34

he said it would be it would it was Genetic

1:42:36

code the way they were doing it with

1:42:38

DNA and he said they were using that

1:42:41

kind of technology somehow and making it into

1:42:43

A super virus and he said Brandon there

1:42:45

will be no Vaccination that they will be

1:42:47

able to come up with this This is

1:42:49

made to have like a genocide against the

1:42:51

people and he told me said you must

1:42:53

warn them that they're going to do So

1:42:56

when Brandon says he said that's God. Yeah,

1:42:58

I would assume so. Yeah, that's interesting. Yeah

1:43:01

So this is stupid another just

1:43:04

a right-wing media sort of

1:43:06

fear type of thing

1:43:09

Here's here's where my mind went

1:43:11

was listening to this Okay,

1:43:13

so God is telling you that there's

1:43:15

going to be this plague that is

1:43:18

like there's no fixing it No,

1:43:20

no, no, you're fucked. So let's

1:43:22

imagine Someone does develop a

1:43:24

vaccine sure you should take it right? No,

1:43:27

no, no, no because then you'd be going against the

1:43:29

will of the Lord But what what but these

1:43:32

prophecies sure they're very explicit about how

1:43:34

like they're being given them so they

1:43:36

can change the future Right. God is

1:43:38

telling them these things. Yeah, God's a

1:43:40

God's a weirdo Like God's making these

1:43:42

he's having them be a messenger so

1:43:44

we can avert these crises. Sure. Sure

1:43:46

Sure, so if there is this plague

1:43:48

coming and it is like hey,

1:43:51

it's real bad This one's a real serious one. Yeah

1:43:53

real bad. Yeah, someone comes up with a vaccine. You

1:43:55

should probably take it right? No, I'm gonna die anyway.

1:43:57

Nah No, good.

1:43:59

They'd find a way to Anti-vax I mean I don't

1:44:01

know what to tell you I just

1:44:03

I wish I wish Prophecies

1:44:07

are fun though hmm aren't they not

1:44:09

when they're just this yeah These are

1:44:11

kind of when they're they're a little

1:44:13

silly. I wish I was in the

1:44:16

room with the three Scientists

1:44:18

as they're making the human race

1:44:20

ending virus just like dude. Are

1:44:22

you shitting me? Do you know

1:44:24

what we are doing right now?

1:44:26

This is fucking crazy AI

1:44:29

to change genes. Ozempic,

1:44:32

I'm gonna snort it So

1:44:35

I found this guy to be a little bit of a

1:44:37

rambler a little bit of I

1:44:39

found his predictions boring I found him

1:44:41

uncompelling, but Alex feels the spirit

1:44:43

okay. And the Lord, I saw it

1:44:45

was a great like a yellow cloud

1:44:48

It was like a plague like coming out of

1:44:51

it was airborne is what the Lord was trying to show me

1:44:54

and there were there was literally Camps

1:44:57

of people all they

1:44:59

the hospital overwhelmed they were

1:45:01

overwhelmed with all the people that were

1:45:03

dying from this virus and they had

1:45:06

They had tents the Lord showed

1:45:08

me tents of people all throughout

1:45:11

the parking lots of places that

1:45:14

were trying

1:45:16

to Take care of them just

1:45:18

to help them just to go on and they were

1:45:20

going to die I mean there were once you got

1:45:22

the virus of what this is you did not make

1:45:24

it I

1:45:26

want you to continue on throughout the hour and and

1:45:28

and because you know I can I

1:45:32

can tell you're real I can

1:45:34

feel the spirit that you guys are on target And

1:45:36

I've seen some of these you've said they've come through

1:45:38

as well, and I've had some more things

1:45:40

happen I never really wanted to even say it on air,

1:45:42

but I know when it's God telling me something yeah, man

1:45:44

This is powerful stuff So now might be

1:45:47

a good time to tell you that this guy one of

1:45:49

his websites is last days 247

1:45:51

now that I think is one of the

1:45:53

most perfect websites

1:45:56

24-7 end times yeah forever.

1:45:59

It's the end times forever. I

1:46:02

don't know. Perpetual, never ending, 24 hours a

1:46:04

day, 7 days a week, from

1:46:07

here to eternity it's the end times. Yeah. Such

1:46:10

a self indictment. Yeah, there's

1:46:12

something of an ouroboros there.

1:46:15

Maybe, maybe just a little bit. Last days 24-7.

1:46:17

As long as

1:46:19

I predict it all the time, eventually

1:46:22

things are gonna end. Yeah. We're

1:46:24

gonna get to the end because

1:46:26

this is just gonna keep going till

1:46:28

the end. Yeah, yeah. Man,

1:46:30

it would be so shitty. Actually, I'm super

1:46:32

stoked that we don't live for like thousands

1:46:35

of years because it would be super shitty

1:46:37

if you had to be around a guy

1:46:39

who was like right around

1:46:41

the corner. Coming right around the corner.

1:46:43

Said that 500 years ago. I mean, like dude, we

1:46:45

can't do this anymore. Calm down. Yes,

1:46:48

I get it. Maybe after a few thousand

1:46:50

years it will. I mean, that's kind of

1:46:52

how things work, man. Just

1:46:54

leave us alone. This time I'm

1:46:56

gonna ignore you. Yeah, you gotta go to bed,

1:46:59

man. So he's one of

1:47:01

two YouTube pastors who are on the

1:47:03

show. I'm sorry? Yeah. And

1:47:05

so the second pastor whose name I

1:47:07

didn't write down. That's, yeah. He believes

1:47:10

that end time prophecies have been a

1:47:12

little bit misinterpreted because of a Western

1:47:14

bias. Oh yeah? Yeah. Which Alex shouldn't

1:47:16

agree with. A Western bias. Yeah, yeah.

1:47:18

Fascinating. And so he believes that there's

1:47:21

going to be an Antichrist that comes

1:47:23

out of the Middle East. Sure. And

1:47:25

the Antichrist, I think the epicenter has

1:47:27

been wrongly taught because America is a

1:47:30

bit American centric and because the ancestors

1:47:32

of Americans usually have come from Europe.

1:47:34

No, we're not. We have a very

1:47:36

Eurocentric gospel, Eurocentric eschatology. I don't believe,

1:47:39

I think it's partially right. Sounds crazy.

1:47:41

What are you talking about, sir? But

1:47:43

the center of the Bible and the

1:47:45

center of the Antichrist beast system is

1:47:48

going to eventually be in the Middle

1:47:50

East. What? And this is where Obama

1:47:52

is very interesting. I had a

1:47:54

dream just before this show a few, say

1:47:56

a couple of months ago where I saw

1:47:58

Obama and He did in the

1:48:01

dream. I heard he did the old switcheroo.

1:48:03

Whoa. The old switcheroo, dropped out and all that.

1:48:05

And I also prophesied and I said, I have

1:48:07

the date for when he would drop out. And

1:48:10

that's on YouTube. The exact date of it within,

1:48:12

you know, about 14 hours. And

1:48:15

he said in my dream, by the way, I'm

1:48:17

not trying to. God's talking to us. I call

1:48:20

the same day the 20th, 21st. I'm

1:48:23

not trying to do the management exam, but I did it too. I

1:48:26

did that too. Can't not. I just cannot

1:48:28

let it go. I did that too. Couldn't

1:48:30

allow it to slide one time. Especially when,

1:48:32

you know, really highlighting it

1:48:34

kind of does diminish the power of

1:48:37

this this prediction. Yeah. Kind of maybe

1:48:39

it seems like everybody might have.

1:48:42

A lot of people might have said it. Interesting.

1:48:45

So he had a dream that Obama did the old

1:48:48

switcheroo. Yeah. I

1:48:51

think it's so great when

1:48:53

people make the basic observation that the

1:48:55

people in the Bible did not know

1:48:57

that America existed. And therefore

1:49:00

all of the prophecies were probably not

1:49:02

about America. But they, but they are. Well,

1:49:05

fair enough. Because Obama did the old switcheroo.

1:49:07

I did not see Obama do the old

1:49:10

switcheroo. So that is on me. The switcheroo

1:49:12

is in the Bible. That is in Revelation. I

1:49:14

do love a good switcheroo. So I, he

1:49:17

did not end up getting a chance to

1:49:19

finish his thoughts, but I think that what

1:49:21

he was getting at is we're

1:49:23

getting back to, Obama's going to turn

1:49:25

the US into a Muslim caliphate. Right. Right. No, no,

1:49:27

no. That's got to be it. We're

1:49:30

getting back to 2015. Obama did

1:49:32

the switcheroo and Kamala is going to do

1:49:34

the fumblerooski and then we're going to have

1:49:36

a caliphate. The old whip whip. Yep. Makes perfect

1:49:38

sense. This

1:49:42

fucking soda is so stupid. What are

1:49:44

any of you talking about? Nonsense.

1:49:49

So we've one last clip though. Yeah. And

1:49:51

this is a discussion of how Brandon

1:49:53

doesn't go to a church. The prophet

1:49:55

doesn't go to a church. Well, we're at

1:49:57

anywhere where two or more of you gather.

1:52:00

Sure, yeah, yeah. Because he's obviously

1:52:02

showing indications that he could be

1:52:04

a disruption. Sure. And

1:52:06

maybe make a church not such a pleasant experience

1:52:08

for a lot of people who are there. Cut

1:52:11

into the tithes a little bit. Yeah. Cut

1:52:14

into the community stability. It is tough

1:52:16

with religion whenever you start going, people

1:52:18

are too religious. You know, that's always

1:52:20

tough, because you're the religion. You're not

1:52:22

supposed to be like, hey, you're too

1:52:25

religious. I would look

1:52:27

at it a different way. Sure. Look

1:52:29

at it not as you're too religious. Sure. But

1:52:32

you're going through something and it's manifesting

1:52:34

in a religious way. Right. We

1:52:37

don't want to turn our back on you, but we're going to

1:52:39

assign this usher to

1:52:41

help you out a little bit,

1:52:43

because you're going through something. But if you believe

1:52:46

in the religion, you can't believe that people are

1:52:48

going through something. And there's the

1:52:50

religion. You can't believe in both. I

1:52:53

think you can. Well, I mean, you

1:52:55

can believe in both, but you can't believe

1:52:57

honestly in either one if you believe in

1:52:59

both. I think you can. That's in the

1:53:01

book that you are talking about, that you

1:53:03

can't believe in both if you believe in

1:53:06

both. I think it is an entirely stable

1:53:08

thing. Sure, but not if

1:53:10

you believe in the book. But the

1:53:12

book doesn't say Brandon is a fucking prophet.

1:53:16

I mean, you know, but you can't say

1:53:18

that other people are a prophet if

1:53:20

you say that other people aren't a

1:53:22

prophet. You know, that's the problem with

1:53:25

religious. I think that if you accept

1:53:27

these prophecies that were way back. Sure.

1:53:30

It's a little easier to swallow. I

1:53:32

mean, there is, it is always easier to

1:53:35

swallow, you know, oh, Job talked

1:53:37

to God instead of Joseph Smith

1:53:39

found some, you know, plates. Yeah. I

1:53:41

get it. Or Brandon sees a

1:53:43

glow cloud. It is tough to believe.

1:53:45

Brandon's glow cloud is the way God

1:53:48

wants to talk to us now. Klaus Schwab's using

1:53:50

AI to meddle with DNA in order to get

1:53:53

another plague going with Ozempic. Oh, man. Whoa. I

1:53:55

do like a God that keeps things updated, though. You

1:53:57

know, you don't want a God that's stuck in the

1:53:59

past. You want a God that's still familiar with

1:54:01

the new trends like Ozempic? You

1:54:03

know, you don't want God being like, what about those old

1:54:06

weight loss drugs? No, no, no, no, no. You don't want

1:54:08

God giving you a prophecy now about fen-phen.

1:54:11

It's no good. I

1:54:13

do think that God's very conversational and chatty

1:54:15

in the way that Brandon is describing him.

1:54:18

Much in the same way with Alex with the clocks and stuff. It

1:54:20

is very similar to somebody's own internal

1:54:22

monologue, one might even say. Very similar.

1:54:24

This show's fucking stupid. Very

1:54:27

dumb. Everything is bad.

1:54:29

I think the only reason

1:54:32

that I jumped to this episode and we

1:54:34

turned it around in a rapid order

1:54:38

is because of Trump's

1:54:40

appearance. And

1:54:43

I think that it was satisfying in some

1:54:45

way, the response, because it's exactly what you

1:54:48

kind of think it would be. No big

1:54:50

deal. He did a great job because it's

1:54:52

all a message. It appears

1:54:54

to be one of the craziest,

1:54:56

like missteps that a politician

1:54:58

could make. But if

1:55:01

you're on board with Trump by

1:55:03

this point, there's a really good

1:55:05

reason to think that like, this isn't this doesn't

1:55:07

matter. This is what you would expect him to

1:55:09

say. He's he's doing

1:55:11

what he's supposed to do. That

1:55:14

sucks. I mean, I

1:55:16

understand why we feel like we

1:55:18

have to go through the motions here, but the

1:55:20

man was already president. You

1:55:23

can't find out more about somebody

1:55:25

after they've been president, except

1:55:27

for like, oh, he murdered somebody while

1:55:29

he was the president. Like

1:55:31

you've what you found out about Nixon wasn't

1:55:34

like, oh, Nixon was secretly a worse person

1:55:36

or a better person. You knew Nixon was

1:55:38

shit. You just found out the shittier things

1:55:40

he did. Yeah. He had

1:55:42

some suspicions of the character of the person.

1:55:45

And then you learned some details.

1:55:47

Yeah. Yeah. Yeah.

1:55:50

I don't know. I enjoyed hearing the prophecy.

1:55:53

I guess. What was that? What's the

1:55:55

second guy's deal? That's the

1:55:58

guy. That's the guy that I don't like. advantage

1:56:00

of the janitor that guy is a preacher who

1:56:02

is like I guess he lived in Australia and

1:56:04

then he came to the US and he has

1:56:06

a YouTube channel I don't know they're all just

1:56:08

YouTube guys oh boy yep anyway

1:56:12

what a day to add

1:56:14

to Alex's voiceover reel. That

1:56:17

is wild so we'll be

1:56:19

back tell

1:56:21

anyway website to it's solidified.com yeah

1:56:23

we'll be back I'm just the mysterious professor

1:56:25

now I guess I can't be Neo anymore

1:56:27

I could be DZX well

1:56:35

we have we have a pretty solid Neo candidate

1:56:37

I'm the mysterious professor man and now here comes

1:56:39

the sex robot. Andy in Kansas, you're on the

1:56:41

air thanks for holding

1:56:45

well Alex I'm a first-time caller I'm a huge fan

1:56:47

I love your work I love
