61 Years Featuring Kaylee Jade Peterson and David Eliot
In this episode commemorating the 61st birthday of the host, Kaylee Jade Peterson talks about her run for Congress in Idaho and David Eliot discusses his book, Artificially Intelligent.
In this episode Erik Fleming celebrates his 61st birthday and leads a powerful conversation covering local and national news, protests, and justice. He interviews Kaylee Jade Peterson, a progressive candidate running for Congress in rural Idaho, about restoring trust in government, public lands, and economic challenges for working-class families. David Eliot, an AI researcher and author, discusses the human origins of AI, its societal impacts, surveillance concerns, and paths forward. The show also reflects on recent shootings, protests, and the urgency for civic action.
00:05 - Welcome to A Moment with Erik Fleming
01:14 - The NBG Podcast Network Introduction
05:52 - News Moments with Grace G.
08:10 - Interview with Kaylee Jade Peterson
10:18 - Icebreakers with Kaylee
18:15 - Kaylee’s Political Journey
19:28 - Restoring Trust in Politics
22:38 - Addressing Idaho’s Concerns
28:41 - Abolishing ICE Discussion
31:22 - Rapid Fire Questions
50:31 - Introduction of David Eliot
51:56 - AI and Its Human Story
01:02:39 - Knowledge Sharing and the Book
01:03:15 - Navigating AI in Everyday Life
01:04:21 - Understanding AI’s Implications
01:06:37 - The Dangers of Systemic Bias
01:11:23 - Data: The New Oil Debate
01:13:58 - Google and Surveillance
01:16:51 - The Value of Personal Data
01:19:03 - The Luddite Fallacy in AI
01:25:54 - AI in Healthcare and Accessibility
01:26:34 - Hope for the Future with AI
01:29:22 - Closing Thoughts on AI and Society
01:46:35 - Current Events in Minnesota
01:54:51 - The Age of Ridiculousness
02:05:09 - A Call to Action for Change
WEBVTT
00:00:00.017 --> 00:00:06.117
Welcome. I'm Erik Fleming, host of A Moment with Erik Fleming, the podcast of our time.
00:00:06.417 --> 00:00:08.977
I want to personally thank you for listening to the podcast.
00:00:09.337 --> 00:00:12.737
If you like what you're hearing, then I need you to do a few things.
00:00:13.257 --> 00:00:19.397
First, I need subscribers. I'm on Patreon at patreon.com slash amomentwitherikfleming.
00:00:19.757 --> 00:00:24.637
Your subscription allows an independent podcaster like me the freedom to speak
00:00:24.637 --> 00:00:27.957
truth to power, and to expand and improve the show.
00:00:28.557 --> 00:00:32.837
Second, leave a five-star review for the podcast on the streaming service you
00:00:32.837 --> 00:00:35.577
listen to it on. That will help the podcast tremendously.
00:00:36.217 --> 00:00:41.917
Third, go to the website, momenterik.com. There you can subscribe to the podcast,
00:00:42.297 --> 00:00:47.257
leave reviews and comments, listen to past episodes, and even learn a little bit about your host.
00:00:47.857 --> 00:00:51.857
Lastly, don't keep this a secret like it's your own personal guilty pleasure.
00:00:52.577 --> 00:00:57.097
Tell someone else about the podcast. Encourage others to listen to the podcast
00:00:57.097 --> 00:01:02.417
and share the podcast on your social media platforms, because it is time to
00:01:02.417 --> 00:01:04.177
make this moment a movement.
00:01:04.597 --> 00:01:10.177
Thanks in advance for supporting the podcast of our time. I hope you enjoy this episode as well.
00:01:15.377 --> 00:01:20.197
The following program is hosted by the NBG Podcast Network.
00:02:00.642 --> 00:02:06.302
Hello, and welcome to another moment with Erik Fleming. I am your host, Erik Fleming.
00:02:07.202 --> 00:02:13.302
So today, as this episode drops, I'll be officially 61 years old,
00:02:13.502 --> 00:02:17.962
and it'll be just another day for me.
00:02:18.162 --> 00:02:25.942
I do acknowledge the fact that I am 61 years old, and I'm thankful that I've
00:02:25.942 --> 00:02:27.182
been able to live this long,
00:02:28.422 --> 00:02:34.522
especially as a black man in America, and we'll get into that later on in the
00:02:34.522 --> 00:02:37.242
show, along with some other things that happened.
00:02:37.782 --> 00:02:40.822
Update on Nekima Armstrong, if you,
00:02:41.542 --> 00:02:47.622
Nekima Levy Armstrong, if you haven't been following, she was released along
00:02:47.622 --> 00:02:52.562
with the other two people that were arrested that were organizing the protest,
00:02:52.882 --> 00:02:59.022
and Grace will mention that in the news summary. And they're still out there.
00:02:59.162 --> 00:03:04.822
They just had a press conference last week and, you know, we got,
00:03:04.922 --> 00:03:07.582
we got all sorts of things coming.
00:03:08.202 --> 00:03:10.302
You also know there was another shooting.
00:03:11.082 --> 00:03:14.042
Grace is going to mention that. I'm going to talk about that a little bit.
00:03:15.842 --> 00:03:19.482
And Don Lemon and Georgia Fort got arrested.
00:03:22.770 --> 00:03:28.650
So, yeah. Yeah, I'm going to get into a lot of that. But before I do,
00:03:29.090 --> 00:03:30.690
I'm going to have two guests on.
00:03:31.450 --> 00:03:37.490
So I'm very fortunate. Another person that's running for Congress is going to be on.
00:03:38.150 --> 00:03:43.910
And I'm going to have another person talking about artificial intelligence coming on.
00:03:46.110 --> 00:03:52.810
And this author is coming at AI from a sociologist's viewpoint,
00:03:52.810 --> 00:03:59.230
and it's a very, very good read, and it turned out to be a great interview, too.
00:03:59.410 --> 00:04:01.930
So I hope that you enjoy all that.
00:04:03.730 --> 00:04:07.750
And, you know, we're in an interesting time, but like I said,
00:04:07.850 --> 00:04:09.070
I'll get into that a little later.
00:04:09.810 --> 00:04:18.790
Please support us. You know, Georgia is an independent journalist in Minnesota.
00:04:19.570 --> 00:04:23.790
Don Lemon, after he left CNN, he became an independent journalist.
00:04:24.250 --> 00:04:30.950
Those, all of us that are doing podcasts, you know, show your love and support.
00:04:30.950 --> 00:04:34.030
Listen to us. Follow us on social media.
00:04:34.770 --> 00:04:40.230
I don't post as much as some of these other folks, but I do at least post to
00:04:40.230 --> 00:04:45.670
let you know who's going to be on the show and, you know, and just updates about
00:04:45.670 --> 00:04:48.370
what's going on with the podcast.
00:04:48.990 --> 00:04:52.850
But also to just, you know, do what you can to support the podcast.
00:04:52.850 --> 00:05:01.030
This is our 13th season and we do the math, right? Par...
00:05:02.002 --> 00:05:08.922
Sixth year? Seventh year. It'll be our seventh year in July of doing the podcast.
00:05:10.402 --> 00:05:14.262
So support us any way you can. If you want to support this podcast,
00:05:14.742 --> 00:05:20.322
go to www.momenterik.com and do that.
00:05:20.542 --> 00:05:26.262
But please, please stand by those folks who are trying to tell you the truth
00:05:26.262 --> 00:05:29.442
and bring you people that are doing the work.
00:05:30.422 --> 00:05:35.522
Because we are inundated by the foolishness every day.
00:05:36.082 --> 00:05:42.142
So those of us that are trying to give you hope and give you encouragement and
00:05:42.142 --> 00:05:47.182
give you what's really going on, please, please stand by us.
00:05:48.002 --> 00:05:52.722
All right, speaking about news, let's go ahead and kick this program off.
00:05:52.722 --> 00:05:56.722
And as always, we kick it off with a moment of news with Grace G.
00:06:04.189 --> 00:06:10.729
Thanks, Erik, and happy birthday. Federal agents killed Alex Pretti, a 37-year-old U.S.
00:06:10.809 --> 00:06:15.349
citizen, in Minneapolis during a confrontation, sparking intense local protests
00:06:15.349 --> 00:06:18.949
and condemnation following the second such incident this month.
00:06:19.489 --> 00:06:24.729
Gregory Bovino was removed from his high-level U.S. Border Patrol post and replaced
00:06:24.729 --> 00:06:26.629
by Border Czar Tom Homan.
00:06:27.349 --> 00:06:31.509
Tracy Mergen, an FBI agent investigating the fatal shooting of Renee Good by
00:06:31.509 --> 00:06:33.169
a federal officer, resigned.
00:06:33.729 --> 00:06:38.149
Three activists, including Nekima Levy Armstrong, were released from detention
00:06:38.149 --> 00:06:42.589
after being charged with federal crimes for disrupting a church service to protest
00:06:42.589 --> 00:06:45.489
an ICE official's leadership role within the congregation.
00:06:45.809 --> 00:06:50.949
A man was arrested after spraying Representative Ilhan Omar with a pungent liquid
00:06:50.949 --> 00:06:55.509
during a Minneapolis event where she was speaking against federal immigration policies.
00:06:56.209 --> 00:07:01.429
A 28-year-old man was taken into custody on Friday night following an alleged assault on U.S.
00:07:01.969 --> 00:07:06.529
Representative Maxwell Frost during a Sundance Film Festival party in Park City, Utah.
00:07:06.909 --> 00:07:11.589
The FBI searched a Fulton County election office in Georgia following President
00:07:11.589 --> 00:07:14.569
Trump's claims regarding the 2020 election results.
00:07:15.309 --> 00:07:19.649
A Virginia judge has halted an effort by state Democrats to implement a more
00:07:19.649 --> 00:07:23.089
favorable congressional redistricting map. The U.S.
00:07:23.449 --> 00:07:27.189
Justice Department has joined a lawsuit alleging that the UCLA David Geffen
00:07:27.189 --> 00:07:31.629
School of Medicine illegally uses race as a factor in its admissions process.
00:07:32.309 --> 00:07:36.349
Relatives of two Trinidadian men killed in a U.S. missile strike near Venezuela
00:07:36.349 --> 00:07:40.949
have filed a wrongful death lawsuit claiming the military campaign unlawfully
00:07:40.949 --> 00:07:42.309
targeted civilian vessels.
00:07:42.709 --> 00:07:47.289
A severe winter storm, along with Arctic cold, paralyzed much of the eastern
00:07:47.289 --> 00:07:52.369
United States, killing 62 people, causing massive travel disruptions and leaving
00:07:52.369 --> 00:07:54.309
over a million people without power.
00:07:54.829 --> 00:08:00.249
And the South Carolina measles outbreak has reached 789 reported cases.
00:08:00.529 --> 00:08:04.069
I am Grace Gee, and this has been a Moment of News.
00:08:11.139 --> 00:08:15.759
All right. Thank you, Grace, for that moment of news and for the birthday wishes.
00:08:16.319 --> 00:08:23.919
That means a lot. Thank you so much. All right. It's time for my guest, Kaylee Jade Peterson.
00:08:24.579 --> 00:08:29.579
Kaylee Jade Peterson is a 35-year-old working-class mother of two who is making
00:08:29.579 --> 00:08:31.259
her third run for the U.S.
00:08:31.339 --> 00:08:34.839
House as a Democrat in Idaho's first congressional district.
00:08:35.059 --> 00:08:38.199
She was recruited to run during her freshman year at community college,
00:08:38.199 --> 00:08:44.439
where she double majored in criminal justice and political science, because
00:08:44.439 --> 00:08:45.799
it was nearly impossible to find
00:08:45.799 --> 00:08:49.959
candidates in a district that received no financial support or attention.
00:08:50.219 --> 00:08:55.559
She grew up with a single mother who worked two to three jobs at a time,
00:08:55.559 --> 00:08:58.239
and was very politically active as a teenager,
00:08:59.079 --> 00:09:03.579
serving as an advocate and public speaker for diverse students in Idaho's foster
00:09:03.579 --> 00:09:10.339
and school systems and managing local state representative campaigns in 2008, before
00:09:10.339 --> 00:09:11.839
life went a different direction.
00:09:12.199 --> 00:09:18.279
She met her husband 16 years ago, had her daughter at 21, and became a stay-at-home
00:09:18.279 --> 00:09:21.899
mom who worked on the side because they couldn't afford childcare.
00:09:22.359 --> 00:09:29.139
She spent eight years at home, had her son, and became a foster mom before deciding
00:09:29.139 --> 00:09:34.979
she needed to get involved in public policy again and started attending college at 30 years old.
00:09:35.159 --> 00:09:39.259
She has spent the last four years learning how to do rural progressive politics
00:09:39.259 --> 00:09:44.079
differently, how to communicate to rural voters successfully without compromising
00:09:44.079 --> 00:09:48.619
her progressive policy, putting the focus on working-class families and people,
00:09:49.219 --> 00:09:54.299
holding elected officials accountable, and organizing in a rural district that
00:09:54.299 --> 00:09:58.199
has been left behind by national progressives for nearly three decades.
00:09:58.799 --> 00:10:03.199
Ladies and gentlemen, it is my distinct honor and privilege to have as a guest
00:10:03.199 --> 00:10:07.599
on this podcast, Kaylee Jade Peterson.
00:10:18.989 --> 00:10:23.309
All right. Kaylee Jade Peterson, how are you doing?
00:10:23.909 --> 00:10:28.329
I'm doing all right. I think it's a hard mix, especially right now in American
00:10:28.329 --> 00:10:33.409
politics, where you feel kind of this heavy weight of everything that's happening
00:10:33.409 --> 00:10:38.089
right now, but then also the joys of getting to work with a really amazing community.
00:10:38.369 --> 00:10:41.009
So we try and ride that balance right now.
00:10:41.649 --> 00:10:46.769
Yeah, yeah. You know, I always tell candidates when they come on the show that
00:10:46.769 --> 00:10:49.769
I'm doing better than they are because I'm not running.
00:10:50.549 --> 00:10:54.369
I've been there, done that. So I know what you're going through and I appreciate
00:10:54.369 --> 00:10:58.129
you sharing some time in the campaign to talk to me.
00:10:59.131 --> 00:11:04.131
One of the things that I like to do to kind of kick everything off is I do a couple of icebreakers.
00:11:04.491 --> 00:11:12.771
So the first icebreaker I want you to participate in is a quote that I want you to respond to.
00:11:13.051 --> 00:11:18.671
And the quote is, together we possess a shared stake in improving and streamlining
00:11:18.671 --> 00:11:22.551
government, even if it means facing uncomfortable truths.
00:11:22.851 --> 00:11:28.271
The pursuit of a more perfect union demands this commitment each and every day.
00:11:28.271 --> 00:11:29.571
What does that quote mean?
00:11:30.591 --> 00:11:37.091
To me, it's everything that I run on, which is this idea that our country and
00:11:37.091 --> 00:11:42.471
our government is not just some abstract political process and a bunch of men in suits.
00:11:42.611 --> 00:11:48.491
It is really the supposed ideals that we believe in as a shared nation.
00:11:48.491 --> 00:11:53.071
It's the constant striving to fulfill these ideals and these promises.
00:11:53.071 --> 00:11:58.991
It is the opportunity of us to innovate the brightest possible future for each
00:11:58.991 --> 00:12:00.411
and every person that lives here.
00:12:00.591 --> 00:12:07.291
Every child, every student, every working class mom and dad,
00:12:07.491 --> 00:12:09.451
every small business owner, every farmer.
00:12:09.711 --> 00:12:15.551
I mean, literally the success of every member of the community rides on how
00:12:15.551 --> 00:12:21.491
well our government is performing and what our government is doing in order to serve these people.
00:12:22.111 --> 00:12:26.591
And so there is this really, really heavy obligation for those in elected office
00:12:26.591 --> 00:12:33.331
to try and constantly strive for that level of perfection or at least fulfill those promises.
00:12:33.711 --> 00:12:37.431
It's everything that government's supposed to be that I think we've lost sight of.
00:12:38.751 --> 00:12:43.651
Yeah. OK. Now, the next icebreaker is going to be what we call 20 questions.
00:12:44.211 --> 00:12:50.351
So I need you to give me a number between one and 20. Eleven. All right.
00:12:51.542 --> 00:12:57.022
Where do you go to check a fact that you see, hear, or read?
00:12:58.102 --> 00:13:04.102
Oh, I'm a former speech and debate kid, so there isn't really one source of information.
00:13:04.102 --> 00:13:08.422
What I'll tend to do is research where the information came from,
00:13:08.422 --> 00:13:14.002
and then I'll go to any institutions or organizations that have a basis in this,
00:13:14.102 --> 00:13:15.742
so whether it's scientific, academic,
00:13:16.062 --> 00:13:20.662
whether it's a governmental agency, and then I'll look and see where there might
00:13:20.662 --> 00:13:25.882
be other sources of information on that same topic and try to balance it.
00:13:25.982 --> 00:13:29.562
I think that's the best way we can try to find truth right now.
00:13:30.441 --> 00:13:36.221
Yeah. All right. So I've been to 36 states in my lifetime, but I've never been to Idaho.
00:13:36.701 --> 00:13:41.421
So tell me something about the state you grew up in and want to represent in Congress.
00:13:42.881 --> 00:13:47.841
Idaho is this weird, magical place because the state itself is huge.
00:13:48.001 --> 00:13:53.181
My district is over 500 miles long. And so it takes about 10 to 12 hours to
00:13:53.181 --> 00:13:55.261
get from one side of my district to the other.
00:13:55.261 --> 00:14:02.041
And while Idaho seems like this very homogenous state, my district is almost
00:14:02.041 --> 00:14:04.321
like four separate states all in one.
00:14:04.481 --> 00:14:08.181
So you go to North Idaho, you have the Pacific Northwest, you have these gorgeous
00:14:08.181 --> 00:14:12.481
lakes and mountains and the most beautiful places you can imagine.
00:14:12.481 --> 00:14:16.781
And then you go a little further south and you have the Palouse and these rolling
00:14:16.781 --> 00:14:22.601
waves of yellow grain and purple grain and such incredible agriculture and small
00:14:22.601 --> 00:14:24.821
farming families that have been doing it for generations.
00:14:24.821 --> 00:14:31.281
You have a college town, University of Idaho, which is one of the most vibrant
00:14:31.281 --> 00:14:34.521
college communities that I've ever been part of.
00:14:34.741 --> 00:14:37.281
And then you also have really urban areas.
00:14:37.541 --> 00:14:41.641
And then peppered throughout, we have areas that are so rural that they're still
00:14:41.641 --> 00:14:47.541
considered frontier. So Idaho is really this magical melting pot where you can
00:14:47.541 --> 00:14:51.101
get a little bit of everything depending on what part of the state you're in.
00:14:51.461 --> 00:14:58.741
I think what we remember and what we value and try to represent is how kind
00:14:58.741 --> 00:15:02.521
Idaho was and how close-knit our communities were.
00:15:02.521 --> 00:15:05.621
It was the kind of place where you could never meet a stranger,
00:15:05.621 --> 00:15:10.181
whether you were in line at a gas station or walking down the sidewalk in the middle of Boise.
00:15:10.481 --> 00:15:14.501
It was the kind of place where you could easily build community and connection.
00:15:15.506 --> 00:15:18.546
Yeah. So how did your journey in politics start?
00:15:19.686 --> 00:15:26.086
Mine is a unique and definitely non-traditional trajectory into the political process.
00:15:26.086 --> 00:15:32.686
I was a very, very political child back before there was Google and websites
00:15:32.686 --> 00:15:37.266
when it was still considered odd to be talking about the Bush presidency in elementary school.
00:15:37.466 --> 00:15:42.446
So I got my start really writing the Clinton administration over the handling
00:15:42.446 --> 00:15:46.866
of Kosovo. And I wrote all of my state senators, I wrote the Clinton administration,
00:15:46.866 --> 00:15:48.406
and I got these responses.
00:15:48.446 --> 00:15:53.646
And it just kind of pushed me out into being more politically involved.
00:15:54.086 --> 00:15:58.586
So I remember the no blood for oil protests that actually went on during the
00:15:58.586 --> 00:16:01.346
first deployment of Operation Iraqi Freedom.
00:16:01.586 --> 00:16:06.846
And I spoke at many of the protests across Idaho, trying to hold the Bush administration
00:16:06.846 --> 00:16:10.506
accountable for the lies that they told to get us into that conflict.
00:16:11.266 --> 00:16:17.166
And then I was grateful I became a public speaker and advocate by the time I was 12, 13 years old.
00:16:17.366 --> 00:16:21.866
I would speak at conferences for diverse youth in our foster and school systems
00:16:21.866 --> 00:16:25.486
here, working with law enforcement and social workers, students,
00:16:25.726 --> 00:16:30.126
educators on how we could try and legislate a more successful pathway for all
00:16:30.126 --> 00:16:31.406
students to be successful.
00:16:31.866 --> 00:16:37.266
I ended up managing a campaign when I was 18 years old, and this was in 08.
00:16:37.266 --> 00:16:42.986
So it was Obama's first election. That was my first foray into managing a local
00:16:42.986 --> 00:16:48.326
campaign and being a part of this much bigger political process of community
00:16:48.326 --> 00:16:52.306
and volunteers and door knocking and events on campus.
00:16:53.415 --> 00:16:56.315
So for me, then life kind of just took a different direction.
00:16:56.575 --> 00:16:59.055
I wasn't able to jump into university at 18.
00:16:59.395 --> 00:17:03.935
And so I almost kind of started over, met my husband. We had a family.
00:17:04.215 --> 00:17:09.795
I had my daughter when I was 21. We became foster parents. Then we had my second child.
00:17:10.375 --> 00:17:13.575
And I became a stay-at-home mom almost out of necessity.
00:17:13.835 --> 00:17:18.895
I worked nights and weekends, but I was home with my kids during the day until
00:17:18.895 --> 00:17:24.055
the 2016 election. And I think I saw such a drastic shift.
00:17:24.255 --> 00:17:27.875
It wasn't just kind of corporate corruption or the normal corruption we were
00:17:27.875 --> 00:17:29.875
expecting to see from our federal government.
00:17:30.035 --> 00:17:34.775
There was this drastic shift in the dialogue, the way we talked about policy,
00:17:35.055 --> 00:17:39.895
name-calling, the vitriol, the way that we looked at somebody with an R or a D next
00:17:39.895 --> 00:17:43.695
to their name instead of any kind of character or integrity.
00:17:44.415 --> 00:17:47.655
And I decided that as soon as my kids were old enough that I'd go back to school
00:17:47.655 --> 00:17:52.375
and I would get my degrees and hopefully have some kind of positive impact.
00:17:52.715 --> 00:17:57.515
So in 2020, right after the COVID lockdown, I was 30 years old.
00:17:57.515 --> 00:17:59.575
I registered for college.
00:17:59.775 --> 00:18:03.075
I double majored in criminal justice and political science.
00:18:03.075 --> 00:18:08.415
And it was my freshman year at the College of Western Idaho that I saw a Facebook
00:18:08.415 --> 00:18:13.455
post from a local political group I had joined looking for volunteers that were
00:18:13.455 --> 00:18:15.115
willing to put their name down on the ballot.
00:18:15.515 --> 00:18:18.595
And at that point, I said, that's definitely something I can do.
00:18:18.755 --> 00:18:22.955
I'm not shy. I'm not worried about having my name out there. I'll do this.
00:18:23.996 --> 00:18:30.096
But ultimately, it became this opportunity to build something that Idaho was desperately needing.
00:18:30.516 --> 00:18:33.936
There's no money in my district. There's no support in my district.
00:18:34.196 --> 00:18:36.736
The DNC doesn't invest in my district.
00:18:37.056 --> 00:18:42.116
There hadn't been any real progressive politics in the state of Idaho in nearly two decades.
00:18:42.396 --> 00:18:46.816
And I fell in love with the work that I get to do, the communities that I was
00:18:46.816 --> 00:18:51.036
building with, the contacts I was making, getting volunteers mobilized,
00:18:51.176 --> 00:18:52.616
getting neighbors connected.
00:18:53.096 --> 00:18:56.816
And I realized that there was huge opportunity that I love the work and that
00:18:56.816 --> 00:19:01.396
I was going to continue doing until we could right the wrongs of the current
00:19:01.396 --> 00:19:02.476
leadership in the state.
00:19:02.676 --> 00:19:06.876
So, yeah, it's been a long time coming, but also kind of came out of nowhere at the same time.
00:19:08.076 --> 00:19:14.396
Well, it sounds like you were an activist and then, you know,
00:19:14.516 --> 00:19:19.256
that fire just got rekindled based on what you were seeing.
00:19:19.676 --> 00:19:22.836
So I don't think that's an unusual path to politics at all.
00:19:22.936 --> 00:19:27.896
I think that's commendable that you decided to get back in there.
00:19:28.677 --> 00:19:34.617
And to take it to another level. Why is restoring trust your top priority as a candidate?
00:19:35.797 --> 00:19:42.197
I think it is the distrust that voters across the country have in the government
00:19:42.197 --> 00:19:47.997
that has led to the kind of hyper-partisan rhetoric manipulation,
00:19:47.997 --> 00:19:54.637
specifically of rural communities that I see happening from media and social media networks.
00:19:54.637 --> 00:19:58.377
And it feels like because the government
00:19:58.377 --> 00:20:02.057
had corruption because career politicians
00:20:02.057 --> 00:20:05.897
and elected officials didn't truly value their place
00:20:05.897 --> 00:20:09.257
and were known for lying, you
00:20:09.257 --> 00:20:13.377
know, everybody knew that politicians were taking money from places they should
00:20:13.377 --> 00:20:17.677
never have, were voting for things that didn't work for their state, and that distrust
00:20:17.677 --> 00:20:25.437
bred this vulnerability to, you know, Ellison and all of these right-wing
00:20:25.437 --> 00:20:28.117
kind of media networks to shift the narrative.
00:20:28.597 --> 00:20:33.657
And so for me, if we want to fix this hyper-partisan divide,
00:20:33.877 --> 00:20:39.577
if we want to fix the ability to communicate with these communities about what needs to happen,
00:20:40.037 --> 00:20:43.297
what would actually help, that if I have a D next to my name,
00:20:43.397 --> 00:20:46.697
I'm not the enemy, that I'm not trying to destroy their way of life.
00:20:47.417 --> 00:20:53.417
Then I think kind of dismantling a system that didn't work for anybody and replacing
00:20:53.417 --> 00:21:00.217
it with this transparency and integrity and the idea of leadership that we all have,
00:21:00.397 --> 00:21:05.217
where elected officials were a part of the community, that they represented the community,
00:21:05.437 --> 00:21:08.097
that they knew the families, businesses,
00:21:08.477 --> 00:21:10.097
and industries and issues, and
00:21:10.097 --> 00:21:13.397
they were present even when they weren't in office.
00:21:13.617 --> 00:21:19.437
And I think returning that will help us overcome that R versus D,
00:21:19.817 --> 00:21:25.357
that MAGA versus progressives, that trying to overcome that huge divide that
00:21:25.357 --> 00:21:28.317
is hurting rural communities and rural districts now.
00:21:28.497 --> 00:21:31.957
And I think that there's a lot of opportunity in restoring that trust.
00:21:32.836 --> 00:21:41.356
Yeah, and it's very important because I think somebody mentioned the word authenticity
00:21:41.356 --> 00:21:48.456
when I was talking to them, and I think that's what people really want.
00:21:48.816 --> 00:21:54.676
I don't think people are so, the majority of Americans want,
00:21:54.996 --> 00:22:00.396
you know, we've got about, I don't know, maybe a third of the nation is Democrat,
00:22:00.416 --> 00:22:02.096
a third of the nation is Republican.
00:22:02.096 --> 00:22:05.656
And it all varies when we start breaking it down in districts.
00:22:05.876 --> 00:22:11.336
But I think the overwhelming majority of people, regardless of what camp they
00:22:11.336 --> 00:22:14.976
lie in, they want somebody that's genuine and authentic.
00:22:16.656 --> 00:22:20.596
And that's the most important thing to convey.
00:22:21.196 --> 00:22:27.296
And they also want to elect somebody that seems like they're enthused about doing the job.
00:22:27.836 --> 00:22:32.056
And just in the brief conversation we've had so far, I think you're enthused.
00:22:32.196 --> 00:22:37.096
I think you are a person who really wants to do this. So I think that's going
00:22:37.096 --> 00:22:38.436
to carry a long way for you.
00:22:39.156 --> 00:22:44.356
All right, let's get into some issues. Are Idahoans concerned about the Epstein files?
00:22:44.396 --> 00:22:49.276
And if not, what are they concerned about and how will you go about addressing those concerns?
00:22:50.013 --> 00:22:55.853
The Epstein Files might be the first bipartisan issue that we have in the state of Idaho.
00:22:55.853 --> 00:23:00.933
It is really the one place where I don't care how conservative or progressive
00:23:00.933 --> 00:23:07.553
you are, we all understood that this was the epitome of corruptive power and
00:23:07.553 --> 00:23:10.273
the abuse of the most vulnerable among us.
00:23:10.553 --> 00:23:15.153
And that's what the Epstein Files truly represents is our ability to hold those
00:23:15.153 --> 00:23:19.013
at the top levels of either government or business accountable.
00:23:19.753 --> 00:23:25.213
We're all sick and tired of watching those at the top have little to no consequence,
00:23:25.213 --> 00:23:28.313
no matter how horrific the things that they've done are.
00:23:28.513 --> 00:23:30.973
And the Epstein files is that case.
00:23:31.253 --> 00:23:36.733
And it's been really, really frustrating for me specifically because I'm running
00:23:36.733 --> 00:23:40.233
against somebody who's incredibly close to the Trump administration.
00:23:40.233 --> 00:23:41.913
He's close with the family.
00:23:42.073 --> 00:23:45.533
They actually just started businesses and enterprises together.
00:23:46.113 --> 00:23:50.193
So when he's in the state of Idaho, when he's talking on the radio or in the
00:23:50.193 --> 00:23:54.433
news, he's 100 percent supportive of releasing the Epstein files.
00:23:54.633 --> 00:23:59.173
But he refuses to do anything that would actually cause the release.
00:23:59.293 --> 00:24:00.353
He refused to sign the petition.
00:24:00.753 --> 00:24:04.753
He refuses to hold the oversight committee responsible for the fact that we're
00:24:04.753 --> 00:24:08.313
over a month out now from when the files were supposed to be released.
00:24:08.673 --> 00:24:13.193
So for me, it's talking about this isn't just some big political headline.
00:24:13.853 --> 00:24:19.593
This is the abuse of the most vulnerable in our community. This is the abuse of children.
00:24:20.521 --> 00:24:25.401
And I think in conservative circles, they use the term save the children so
00:24:25.401 --> 00:24:32.001
often that the Epstein files become an opportunity for us to come together and agree on this.
00:24:32.281 --> 00:24:34.901
And it's something we're trying to push.
00:24:35.201 --> 00:24:39.501
But at the same time, it perfectly highlights the level of corruption.
00:24:39.501 --> 00:24:43.481
No matter what somebody says, what they do is often totally different when they're
00:24:43.481 --> 00:24:48.541
in elected office. And trying to hold them accountable, especially on Epstein,
00:24:48.881 --> 00:24:51.841
and every single name in that file.
00:24:52.421 --> 00:24:55.901
I think it's horrific. You saw the testimony of Sasha Riley,
00:24:56.121 --> 00:25:01.681
and he's naming people like Jim Jordan and Andy Biggs who are sitting in these
00:25:01.681 --> 00:25:03.161
kind of oversight committees.
00:25:03.421 --> 00:25:09.781
So this is far-reaching, and every single person that I've met who's not in
00:25:09.781 --> 00:25:14.561
the files wants to see the text released immediately and wants to see a real
00:25:14.561 --> 00:25:18.541
hammer of justice brought down on those culpable in it.
00:25:19.324 --> 00:25:25.924
Yeah. So what's another big issue that you want to
00:25:25.924 --> 00:25:28.324
address once you get elected to Congress?
00:25:28.864 --> 00:25:34.424
It's interesting. In Idaho, I think there are a few top issues.
00:25:35.044 --> 00:25:40.024
Idaho, because we have such a large majority of federally owned lands.
00:25:40.324 --> 00:25:44.624
So we are a very big wilderness state. The last major name in the state was
00:25:44.624 --> 00:25:50.564
Frank Church, and he was such an incredible pioneer for protecting environmental issues.
00:25:50.804 --> 00:25:53.024
We actually have the Frank Church Wilderness Area.
00:25:53.284 --> 00:25:58.664
So the people who move to our state are often sportsmen.
00:25:58.844 --> 00:26:00.684
They love hiking and camping.
00:26:01.064 --> 00:26:05.644
They love being able to take family out on the weekends and spend it in nature.
00:26:05.804 --> 00:26:07.024
They love rafting the rivers.
00:26:07.304 --> 00:26:11.764
We have a huge amount of business that's done alongside our rivers.
00:26:12.564 --> 00:26:18.424
And public lands then becomes one of the other few issues that the vast majority of Idahoans agree on.
00:26:18.544 --> 00:26:24.544
So 98% of Idahoans agree that public lands are essential to the quality of life in our state.
00:26:24.704 --> 00:26:28.424
And right now, they are 100% under threat.
00:26:28.644 --> 00:26:32.704
Obviously, we have the new BLM director who's coming in, who has been a huge
00:26:32.704 --> 00:26:34.744
proponent of selling off lots.
00:26:34.744 --> 00:26:40.784
They had the Keep Public Lands in Public Hands bill, and my opponent was actually
00:26:40.784 --> 00:26:44.684
the only member of our congressional delegation, even though they're all Republican.
00:26:44.864 --> 00:26:50.684
My opponent was the only one who voted against keeping public lands in public hands.
00:26:50.684 --> 00:26:56.364
So it's not just a quality of life issue because of how clean our water and our
00:26:56.364 --> 00:27:01.384
soil is and the four or five tribes that operate in my district off of the land,
00:27:01.584 --> 00:27:07.244
but also just how essential it is to how we view our way of life here and that
00:27:07.244 --> 00:27:09.324
it is under such threat right now.
00:27:10.007 --> 00:27:14.967
Not even to mention the huge cuts that were made to the Forest Service.
00:27:14.967 --> 00:27:17.727
I work really closely with the Federal Employees Union.
00:27:17.987 --> 00:27:23.967
They were gutted earlier last year. And they actually laid off most of the people
00:27:23.967 --> 00:27:27.007
in the Forest Service on Valentine's Day last year.
00:27:27.187 --> 00:27:32.107
The people now left are trying to manage these lands on a shoestring budget
00:27:32.107 --> 00:27:34.567
without staff that they need in really difficult places.
00:27:35.167 --> 00:27:40.447
So public lands is huge. But then day to day, we are a working class state.
00:27:40.607 --> 00:27:44.887
The average income in my district is $37,000 a year.
00:27:45.387 --> 00:27:49.647
And rural health care is suffering. Rural education is suffering.
00:27:49.867 --> 00:27:53.687
Our wages are stagnating while the cost of housing and rent explodes.
00:27:53.687 --> 00:27:59.647
So for me, I think one of my biggest priorities is just the economic well-being
00:27:59.647 --> 00:28:04.967
of lower working class families in the state and how they are able to thrive
00:28:04.967 --> 00:28:10.807
and compete in a market where we've kind of sold them out to monopolized interests,
00:28:11.187 --> 00:28:15.927
to corporate interests, to big ag that they cannot compete against,
00:28:15.927 --> 00:28:18.167
that exploits their labor and their time.
00:28:18.167 --> 00:28:22.567
And then they're left just trying to pick up the pieces and take care of the family.
00:28:22.827 --> 00:28:27.467
And we are an incredibly hardworking state. And we take a lot of pride in our
00:28:27.467 --> 00:28:29.107
work ethic and the grind.
00:28:29.467 --> 00:28:34.407
And so to see these incredible people who work so hard to give back to our communities
00:28:34.407 --> 00:28:39.167
then suffer just trying to have the basic essentials is really difficult and
00:28:39.167 --> 00:28:40.967
definitely a priority for me.
00:28:41.839 --> 00:28:48.759
Yeah. All right. So would you vote for legislation that will abolish ICE?
00:28:49.739 --> 00:28:58.419
Yes, 100%. I want to dismantle this entire system, this DHS,
00:28:58.839 --> 00:29:00.719
the Immigration Services.
00:29:00.939 --> 00:29:07.239
I mean, we need to completely reimagine the way that our government functions.
00:29:07.839 --> 00:29:12.699
One, because incarceration is the most expensive and least effective tool that
00:29:12.699 --> 00:29:15.119
we have in the country to address social issues.
00:29:15.459 --> 00:29:18.899
We know that it's failed. It's been failing for over five decades.
00:29:19.139 --> 00:29:22.879
And there are only a handful of people who can benefit from ICE,
00:29:23.399 --> 00:29:27.999
from DHS, and from the way that our government's operated. But also the lack of trust.
00:29:28.259 --> 00:29:33.639
They have slaughtered American citizens. They are snatching American children,
00:29:33.639 --> 00:29:36.059
whether they are black or brown or white.
00:29:36.059 --> 00:29:39.399
They are taking our children, our neighbors, our co-workers,
00:29:39.979 --> 00:29:45.759
valued members of our communities in indefinite detention into these camps that
00:29:45.759 --> 00:29:51.159
are obviously privately owned and privately profiting off of this kind of harm to our communities.
00:29:51.379 --> 00:29:53.419
And there's no unringing that bell.
00:29:53.659 --> 00:29:58.519
You cannot walk back and say, OK, well, we've addressed these problems and we've
00:29:58.519 --> 00:30:02.839
retrained, and now this is an agency that can keep operating.
00:30:03.059 --> 00:30:05.679
That's just not how this is going to work going forward.
00:30:06.059 --> 00:30:11.159
So 100%, I'm all for abolishing ICE. But I think it's also important we communicate
00:30:11.159 --> 00:30:13.919
that doesn't mean just dismantling a system.
00:30:14.259 --> 00:30:19.219
That means reimagining the way that we get to do it. We get to build something
00:30:19.219 --> 00:30:21.719
back better than it has ever been before.
00:30:22.395 --> 00:30:27.315
And that's where I think that common ground lies, where why don't we streamline immigration?
00:30:27.575 --> 00:30:32.975
Why don't we reimagine and innovate an immigration system that works for migrants,
00:30:33.255 --> 00:30:38.395
that works for those seeking asylum, that works for local communities, law enforcement?
00:30:38.875 --> 00:30:45.555
That's where I really want to spend our time and our funding and leave this
00:30:45.555 --> 00:30:47.115
incredibly dark chapter behind.
00:30:47.355 --> 00:30:52.435
There's no excuses that can be made. There's no apologies that can be had.
00:30:52.715 --> 00:30:58.335
They have murdered at least two American citizens, if not the 38 people that
00:30:58.335 --> 00:31:00.835
have died in ICE custody since this began.
00:31:01.435 --> 00:31:05.675
And there has to be accountability there. And I'm not even just talking about
00:31:05.675 --> 00:31:07.155
dismantling and abolishing ICE.
00:31:07.295 --> 00:31:12.555
I am talking about bringing justice, actual criminal justice,
00:31:12.555 --> 00:31:16.615
for those who participated in these murders.
00:31:17.295 --> 00:31:22.395
Yeah. Yeah. And I agree totally on that last point for sure.
00:31:22.695 --> 00:31:31.455
You know, when I ran in 2008, in 2006, even, you know, I made the argument that,
00:31:31.675 --> 00:31:36.215
you know, we've had immigrants coming into this nation.
00:31:37.335 --> 00:31:42.575
And, you know, when we had Ellis Island and all that stuff, and then even the
00:31:42.575 --> 00:31:44.695
movement of people within this nation,
00:31:44.895 --> 00:31:50.135
we didn't have all this technology, but yet we were allowing people to come
00:31:50.135 --> 00:31:54.675
in and make contributions eventually to the growth of this nation.
00:31:55.015 --> 00:32:01.575
And I'm just amazed that now we are so technologically advanced that we can't
00:32:01.575 --> 00:32:04.695
seem to have a simple solution for immigration.
00:32:04.975 --> 00:32:09.255
So I'm with you on that, on totally dismantling it.
00:32:10.305 --> 00:32:15.025
Did you support the Democrats' position during the government shutdown the first time?
00:32:16.245 --> 00:32:22.505
You know, I would have been able to find a way to support it if they had actually
00:32:22.505 --> 00:32:27.385
accomplished affordable and accessible health care on the other side.
00:32:27.825 --> 00:32:33.045
But right now, with not only health care, but what's happening with ICE,
00:32:33.465 --> 00:32:38.585
I think every single one of us is saying, where are these supposed leaders?
00:32:38.585 --> 00:32:43.745
Like, where are these Democrats that are in positions of power and authority
00:32:43.745 --> 00:32:48.825
that can truly lead a movement to offering things like health care and protection
00:32:48.825 --> 00:32:53.525
from an out-of-control executive branch mercenary army?
00:32:53.525 --> 00:32:59.225
And so when the shutdown happened, we were all trying to hold tight.
00:32:59.485 --> 00:33:03.925
I was so proud of our American Federation of Government Employees,
00:33:04.125 --> 00:33:06.645
which oversaw our TSA members.
00:33:06.885 --> 00:33:10.845
They stepped up during that government shutdown to make sure that these federal
00:33:10.845 --> 00:33:15.405
employees had household goods and toiletries and hygiene and food.
00:33:15.405 --> 00:33:20.565
And I saw this brotherhood of our unions step up in this community there.
00:33:20.565 --> 00:33:23.865
So we were all saying, okay, we will hold the line.
00:33:24.065 --> 00:33:26.565
We believe in this. We will make this work.
00:33:27.005 --> 00:33:31.285
So then to turn their back on these federal employees who sacrificed so much
00:33:31.285 --> 00:33:34.985
and the harm that came to these communities, and not be able to deliver the
00:33:34.985 --> 00:33:39.645
one thing that they were standing for, it's so frustrating.
00:33:39.665 --> 00:33:44.085
And I get angry because I look at what my opponent was saying during that government shutdown.
00:33:44.425 --> 00:33:49.085
The entire alt-right had spent those 30 days saying that
00:33:49.123 --> 00:33:53.903
Democrats wanted to give $1.5 trillion, and they used the word illegal.
00:33:54.123 --> 00:33:55.663
We would say undocumented immigrants.
00:33:55.923 --> 00:34:01.323
They tried to make it look like the ACA subsidies and our Medicaid was all going
00:34:01.323 --> 00:34:05.463
to the undocumented criminals, the way that they tried to define this community
00:34:05.463 --> 00:34:06.843
that doesn't exist, right?
00:34:06.843 --> 00:34:10.023
So for me it was hard to support
00:34:10.023 --> 00:34:13.383
the Democrats when they weren't up there trying to
00:34:13.383 --> 00:34:17.383
fight off this completely false narrative. It's
00:34:17.383 --> 00:34:20.323
hard to excuse their absence right
00:34:20.323 --> 00:34:25.763
now. And you can't justify the shutdown, considering they gave in and didn't get
00:34:25.763 --> 00:34:29.943
anything accomplished for the people. So at the time I was trying to be supportive,
00:34:29.943 --> 00:34:37.883
but looking back, there are no excuses. Yeah, so your thing is, I'm down with
00:34:37.883 --> 00:34:39.663
the fight, but we got to win the fight.
00:34:39.823 --> 00:34:47.923
We can't just, you know, say, okay, it's like enough is enough and we'll play ball again.
00:34:48.143 --> 00:34:52.683
Your thing is if we're going to do this, let's get some results that really help the people.
00:34:54.074 --> 00:35:00.354
Exactly. Especially, I mean, the Democrats in Congress weren't suffering during the shutdown, right?
00:35:00.594 --> 00:35:05.914
It was my working class constituents and the people serving my communities that
00:35:05.914 --> 00:35:08.754
felt, you know, this shutdown.
00:35:09.334 --> 00:35:14.574
So I'm trying to understand the movements. I'm trying to understand the logic.
00:35:14.854 --> 00:35:16.954
I'm trying to look at the bigger picture and think, OK, well,
00:35:17.014 --> 00:35:20.934
if they are trying to achieve affordable and accessible health care for these
00:35:20.934 --> 00:35:23.994
same families, then I will stand in this line with them.
00:35:24.074 --> 00:35:26.774
But then to give up was just inexcusable.
00:35:26.974 --> 00:35:31.314
And I think that's the problem the Democrats have had really for the last
00:35:31.314 --> 00:35:34.434
few years: where is the leadership?
00:35:34.674 --> 00:35:36.914
Where is the fight? Where is the strategy?
00:35:37.214 --> 00:35:41.754
Because the right is incredibly well organized. No matter how much they dislike
00:35:41.754 --> 00:35:48.714
each other, no matter how different their outcomes are or their values are, they are all united.
00:35:48.974 --> 00:35:52.714
They are pushing the same message. They are using the same strategies.
00:35:52.714 --> 00:35:56.574
They are all coordinated, especially in states like Idaho.
00:35:57.074 --> 00:36:00.154
And so for the left and for the Democratic Party, it's like,
00:36:00.314 --> 00:36:02.534
where is the strategy? Where is this fight?
00:36:03.014 --> 00:36:08.554
And I'd like to see them step up in a way where it feels like there are adults
00:36:08.554 --> 00:36:14.194
in the house trying to do the right thing back in D.C., but we have yet to see that, really.
00:36:15.374 --> 00:36:19.534
Yeah. All right. So this is going to be kind of a rapid-fire deal, but...
00:36:20.739 --> 00:36:26.299
Would you support, well, do you support the current situation that we have with
00:36:26.299 --> 00:36:32.959
Venezuela that I guess now the president has declared himself in charge of Venezuela?
00:36:33.459 --> 00:36:37.839
So I guess it's a territory. Do you support that position at all?
00:36:39.159 --> 00:36:43.019
No, no. I mean, listen, I should note I'm no fan of Maduro.
00:36:43.239 --> 00:36:47.959
I think he was horrific. But the idea that we are bombing civilians in the capital
00:36:47.959 --> 00:36:51.459
city, the fact that we have not legitimized the opposition party,
00:36:51.599 --> 00:36:53.819
which did democratically win the last election,
00:36:54.439 --> 00:37:01.519
this seems like just a land grab for oil execs at Exxon, and incredibly reckless.
00:37:01.899 --> 00:37:05.059
It's definitely not America first. No, definitely not.
00:37:05.439 --> 00:37:09.099
Okay. Do you support military aid to Ukraine? No.
00:37:09.585 --> 00:37:16.985
I do. I think, for one, Russia is a huge threat to our national security,
00:37:17.105 --> 00:37:20.665
and the fact that we are not addressing it as if they are really aggrieves me.
00:37:21.145 --> 00:37:25.945
Ukraine is essential. The elements, the mineral rights, and for Russia to get
00:37:25.945 --> 00:37:30.885
those becomes a huge issue, not just for America, but for the global community and stability.
00:37:31.125 --> 00:37:34.645
So providing aid to Ukraine isn't just the right thing.
00:37:34.845 --> 00:37:38.345
It isn't just that it's important for us to be there for our allies to push back
00:37:38.345 --> 00:37:43.305
against a threat like Putin, but it's also to protect these elements that Russia
00:37:43.305 --> 00:37:44.765
would be able to do a lot of harm with.
00:37:45.225 --> 00:37:47.945
All right. Do you support military aid to Israel?
00:37:48.565 --> 00:37:50.985
No, not a cent.
00:37:51.885 --> 00:37:56.505
Netanyahu and this government is committing absolute atrocities.
00:37:56.505 --> 00:38:00.685
And I think our position on the global stage, or at least the position that we had
00:38:00.685 --> 00:38:07.205
six months ago before the Trump administration destroyed any trust we had on the national stage,
00:38:07.425 --> 00:38:13.525
we are supposed to be the ones stepping up and drawing a line against these war crimes.
00:38:14.165 --> 00:38:17.525
There's no two ways about it.
00:38:17.685 --> 00:38:21.825
I mean, this is a genocide and atrocities, and we should be standing up and
00:38:21.825 --> 00:38:23.405
protecting those most vulnerable.
00:38:23.605 --> 00:38:26.905
But it looks like the Trump administration just wants to profit off the land,
00:38:27.085 --> 00:38:28.545
and that's inexplicable.
00:38:29.645 --> 00:38:34.625
Yeah. I don't know if you saw what was introduced over in Davos,
00:38:34.825 --> 00:38:41.005
the conceptualizations of Gaza, the new Gaza city, whatever,
00:38:41.805 --> 00:38:44.085
with the high rises and all that stuff.
00:38:44.265 --> 00:38:48.385
You know, it was like, you know, one reporter dared ask the question,
00:38:48.565 --> 00:38:50.985
where are the Palestinians in this conversation?
00:38:51.265 --> 00:38:55.745
And nobody, nobody went to the mic to address that. So that's,
00:38:55.765 --> 00:38:58.225
that's really, really a crazy thing that's happening there.
00:38:58.545 --> 00:39:04.445
All right. Taking into account that less than 1% of the population in your state is African-American,
00:39:05.354 --> 00:39:11.234
What is your position on reparations? Yeah, I think we're actually in the top
00:39:11.234 --> 00:39:13.634
five whitest states in the nation.
00:39:13.634 --> 00:39:18.014
And I think that's actually caused a lot of issues and allowed a lot of the
00:39:18.014 --> 00:39:19.754
rhetoric and the misinformation,
00:39:19.754 --> 00:39:24.634
the manipulation of information for all communities that don't have firsthand
00:39:24.634 --> 00:39:29.034
experience with the Black community and with people of color.
00:39:30.614 --> 00:39:39.574
I think that we have to acknowledge that over the last 50 years, systemic oppression has created a huge rift.
00:39:39.774 --> 00:39:43.154
I mean, we know this. The data is there. The research is there.
00:39:43.414 --> 00:39:47.114
I don't know exactly how reparations should look.
00:39:47.114 --> 00:39:51.854
I don't know exactly how the federal government can create programs to try and
00:39:51.854 --> 00:39:56.434
address that inequity and that oppression that's happening, but I know that
00:39:56.434 --> 00:39:59.254
it does need to be addressed and it does need to happen.
00:39:59.534 --> 00:40:04.774
And I think the pendulum swung kind of in the right direction under the Obama
00:40:04.774 --> 00:40:08.734
administration, but then it felt like there was such a violent pushback that
00:40:08.734 --> 00:40:11.854
we got so much further away from it than we were 10 years ago.
00:40:11.854 --> 00:40:17.574
So I would like to see an administration where we have the representation there,
00:40:17.814 --> 00:40:24.114
that we have the community there trying to devise programs that do address this
00:40:24.114 --> 00:40:27.994
inequality and oppression that has been systemic for so long.
00:40:27.994 --> 00:40:32.994
But right now it's devastating and we're not even allowed to address the inequality
00:40:32.994 --> 00:40:35.954
in our criminal justice system, which is so black and white,
00:40:36.154 --> 00:40:39.154
so obvious when you look at these statistics and these numbers.
00:40:39.534 --> 00:40:46.114
So I 100% want to address it and I will support it. I just don't know exactly what that looks like.
00:40:46.294 --> 00:40:50.374
And usually I'm such a data nerd. Usually I'll have the policy or the solution.
00:40:50.634 --> 00:40:53.854
And that's one issue that I don't think I have the solution yet,
00:40:53.854 --> 00:40:57.654
but I want to support the right solution in the end. Yeah.
00:40:58.014 --> 00:40:59.854
And I appreciate that. So.
00:41:00.826 --> 00:41:04.546
There's a bill that always comes up. I think the latest number was H.R.
00:41:04.566 --> 00:41:08.926
40 that asked for a study on the issue.
00:41:09.086 --> 00:41:12.526
Would you, so you would support that bill if you got elected?
00:41:13.066 --> 00:41:16.166
Oh, a hundred percent. I would go even further.
00:41:16.426 --> 00:41:20.786
I would love to create a department that is looking at addressing these issues
00:41:20.786 --> 00:41:23.966
with our Black community, even with our tribal communities.
00:41:24.066 --> 00:41:27.546
I mean, we have to look at how do we make up the ground.
00:41:27.906 --> 00:41:33.466
And I mean, history is how I fell in love with politics and government in the first place.
00:41:33.666 --> 00:41:39.426
And so I get incredibly frustrated that those in elected office aren't familiar with
00:41:39.426 --> 00:41:43.486
the history or at the very least are intentionally blind to it.
00:41:43.786 --> 00:41:47.086
And so whether it's creating a committee, whether it's creating a department,
00:41:47.426 --> 00:41:52.366
I would support any and all efforts to create a study or to find the right solution.
00:41:53.066 --> 00:41:59.426
Since the circuits of the U.S. Court of Appeals are assigned a circuit justice from the U.S.
00:41:59.706 --> 00:42:04.806
Supreme Court and there are 13 Court of Appeal circuits, would you vote for
00:42:04.806 --> 00:42:08.326
an expansion of the U.S. Supreme Court to 13 justices?
00:42:09.235 --> 00:42:16.515
I want to say yes to this because we have got to address the issues within the Supreme Court.
00:42:16.755 --> 00:42:19.435
The Supreme Court has been compromised.
00:42:19.875 --> 00:42:23.215
There are no ifs, ands, or buts about it. We watched this happen.
00:42:23.435 --> 00:42:27.575
So we have to address it. The expansion is one way of doing that.
00:42:27.755 --> 00:42:30.695
I don't know if there are other ways. I know that other people are bringing
00:42:30.695 --> 00:42:35.155
up term limits or trying to cut that down. I don't know if that creates a whole
00:42:35.155 --> 00:42:40.415
other issue where we politicize these benches even more than it has already been.
00:42:40.715 --> 00:42:46.115
But yes, expansion has been the way that I have been looking at trying to fix it.
00:42:46.275 --> 00:42:50.315
So that's where my support is right now. But it is something where I am trying to
00:42:50.315 --> 00:42:52.075
look at the potential solution.
00:42:52.475 --> 00:42:55.375
But it's a priority that we have to address immediately.
00:42:55.995 --> 00:43:02.435
All right. Idaho has not voted for the Democratic presidential nominee since 1964.
00:43:03.135 --> 00:43:07.455
Since 1990, only two Democrats have represented the 1st Congressional District
00:43:07.455 --> 00:43:09.655
for a total of six years of service.
00:43:09.855 --> 00:43:14.495
The last Democratic woman to represent the 1st District left office in 1963.
00:43:15.824 --> 00:43:23.484
Last time you ran in 2022, you received 27% of the vote. Why is 2026 going to be different?
00:43:24.784 --> 00:43:27.884
This is where I knew my promise meant something.
00:43:28.084 --> 00:43:34.004
I knew immediately in 2022, because in 2024, I never once said we could win the election.
00:43:34.244 --> 00:43:38.924
I knew that we needed a candidate who was willing to commit the time and look
00:43:38.924 --> 00:43:43.644
at the picture and do long-term campaigning to build out the infrastructure,
00:43:43.844 --> 00:43:48.524
the trust, the networking, to oppose the kind of dark money that's coming from
00:43:48.524 --> 00:43:52.864
places like the Heritage Foundation and the Freedom Foundation and all of these
00:43:52.864 --> 00:43:56.804
special interests that profit off of these far-right kind of interests.
00:43:57.424 --> 00:44:02.904
And when I started, in most communities I could not find a single progressive
00:44:02.904 --> 00:44:05.544
contact. I didn't have any Democratic voters.
00:44:05.784 --> 00:44:07.644
There were no local county parties.
00:44:08.064 --> 00:44:12.504
In most of these areas, most people considered it dangerous to just go in and wave
00:44:12.504 --> 00:44:14.264
your hand around saying, I'm a Democrat.
00:44:14.944 --> 00:44:21.344
And I think I benefited from the fact that I am a blonde woman who looks like
00:44:21.344 --> 00:44:23.044
a conservative Idaho mom.
00:44:23.204 --> 00:44:27.344
I think I have also benefited from growing up in these circles.
00:44:27.344 --> 00:44:31.064
My family are kind of old school Eisenhower, Reagan Republicans.
00:44:31.304 --> 00:44:35.904
My dad was a Trump supporter. I married into a Christian evangelical family.
00:44:35.904 --> 00:44:40.724
So I understand what these discussions look like at the dinner table versus
00:44:40.724 --> 00:44:45.284
what we understand from actually being involved in politics and on the ground.
00:44:45.624 --> 00:44:50.504
So I think I was able to try and bridge that gap. And it takes time, right?
00:44:50.624 --> 00:44:54.884
I go in and I'm with VFWs and Grange Halls and every local union,
00:44:55.144 --> 00:44:58.244
even though 75, 80% of their membership is Republican.
00:44:58.704 --> 00:45:03.324
I am going straight into kind of the belly of the beast, as some Democrats might
00:45:03.324 --> 00:45:06.584
put it, and showing them that I'm not the enemy. I'm,
00:45:06.696 --> 00:45:11.156
more importantly, offering them the solutions Republicans refuse to offer.
00:45:11.476 --> 00:45:15.896
Life has gotten much harder in Idaho, and we have had a Republican supermajority
00:45:15.896 --> 00:45:17.996
for 30 years. You can't blame the Democrats.
00:45:18.456 --> 00:45:22.316
But when there hasn't been a Democrat in over 20 years in these communities
00:45:22.316 --> 00:45:25.916
to combat that narrative, then we're not going to get anywhere.
00:45:25.916 --> 00:45:31.836
So 22 and 24, it was recruiting local candidates, recruiting local contacts,
00:45:32.076 --> 00:45:38.356
volunteers, building trust and relationships with local communities and organizations, expanding that.
00:45:38.536 --> 00:45:41.336
So it's not just Kaylee for Congress doing the work in these communities,
00:45:41.416 --> 00:45:47.356
but now we have motivated and connected and given the resources necessary for local
00:45:47.356 --> 00:45:49.736
community members to grow and organize.
00:45:50.556 --> 00:45:55.736
2026 is a really, really unique opportunity for us. One, because the Republican
00:45:55.736 --> 00:46:00.596
Party hasn't delivered on a single promise that they made to the American people.
00:46:00.976 --> 00:46:06.096
Wages are still stagnant. We can't afford groceries or gas, and public lands are under threat.
00:46:06.436 --> 00:46:11.316
Things have gotten so bad that most Republicans I know are questioning this industry.
00:46:11.656 --> 00:46:17.776
But also, I have four years of trust, contacts, and volunteers that are so ready
00:46:17.776 --> 00:46:24.616
to mobilize. And on top of that, we have reproductive rights on the ballot for the first time in 26.
00:46:25.116 --> 00:46:30.556
So you have 75% of the population that are progressive already,
00:46:30.716 --> 00:46:36.316
that are women already, that have a really important life-threatening issue
00:46:36.316 --> 00:46:37.956
to show up for on Election Day.
00:46:37.956 --> 00:46:43.656
And even on top of that, we have a decriminalization of marijuana and medicinal
00:46:43.656 --> 00:46:47.236
marijuana on the ballot for the first time, which my veterans,
00:46:47.476 --> 00:46:51.056
my affiliated voters, my libertarian voters 100% support.
00:46:51.276 --> 00:46:53.456
So we have the momentum.
00:46:53.916 --> 00:46:59.196
It is not just Kaylee for Congress out yelling into this giant district trying
00:46:59.196 --> 00:47:00.436
to get people to pay attention.
00:47:00.436 --> 00:47:06.216
We have built a massive community that is able to do the work to win,
00:47:06.316 --> 00:47:10.416
and for the first time I know that we have a path to success.
00:47:11.108 --> 00:47:16.448
It'll be a Hail Mary. Don't get me wrong. It's not going to be easy. It's not guaranteed.
00:47:16.908 --> 00:47:21.128
But I know exactly where the voters we need to win are.
00:47:21.288 --> 00:47:26.168
I know exactly how we reach them to win. And I know we have the trust in the community
00:47:26.168 --> 00:47:28.648
to mobilize and engage them to do so.
00:47:28.888 --> 00:47:35.748
So this is a really, really special election and opportunity for us to surprise everyone. All right.
00:47:35.908 --> 00:47:40.488
So I'm asking this question to all of my guests this year.
00:47:40.788 --> 00:47:44.348
Finish this sentence. I have hope because.
00:47:44.348 --> 00:47:52.948
No matter how bad it gets, no matter how hard life is for everyday families,
00:47:53.428 --> 00:47:59.888
no matter how dire and depressing the situation in the news becomes,
00:47:59.888 --> 00:48:04.748
the worse it gets, the more I see our communities come together.
00:48:05.148 --> 00:48:12.308
The worse it gets, the more I see everyday people step up and do extraordinary things.
00:48:12.308 --> 00:48:16.968
And I think when we are put into these incredibly difficult times,
00:48:17.268 --> 00:48:20.348
it's when we really truly get the chance to see the best of us.
00:48:20.548 --> 00:48:24.508
And that is what I have seen, especially over the last six months,
00:48:24.768 --> 00:48:30.648
is I have seen people that don't have resources or experience step up and do
00:48:30.648 --> 00:48:33.288
incredible things and organize incredible solutions.
00:48:33.888 --> 00:48:40.008
And they gave me hope and I think they make it really easy to work as hard as
00:48:40.008 --> 00:48:44.908
we have to work to try and provide some kind of relief and solutions for the people of my state.
00:48:45.428 --> 00:48:50.528
All right. So if people want to get involved with the Kaylee Jade Peterson campaign
00:48:52.908 --> 00:48:57.548
to reform and revive Idaho, how can they get involved?
00:48:58.406 --> 00:49:02.786
I'm the only Kaylee for Congress that has ever run for Congress in America.
00:49:03.186 --> 00:49:07.926
So I'm the only one you'll find online. It's just Kaylee for Congress,
00:49:08.146 --> 00:49:12.406
K-A-Y-L-E-E, all spelled out. Dot com is my website.
00:49:12.806 --> 00:49:17.986
At gmail.com is my email. And it's Kaylee for Congress on all of my social medias.
00:49:18.386 --> 00:49:23.066
We're big on TikTok, Facebook and Instagram. We're incredibly consistent there.
00:49:23.226 --> 00:49:25.406
We're getting all of our volunteers together.
00:49:25.626 --> 00:49:29.406
And we have volunteers from New York to Texas to Florida to California.
00:49:29.506 --> 00:49:31.886
We really are a nationwide team.
00:49:32.046 --> 00:49:34.466
We do a lot of remote virtual work together.
00:49:34.866 --> 00:49:38.426
But all those pathways go right to me.
00:49:38.586 --> 00:49:44.206
So if somebody had a question, if they want me to come do some work in the community,
00:49:44.466 --> 00:49:46.386
I am always accessible to them.
00:49:46.486 --> 00:49:49.606
And I look forward to hearing from people listening today.
00:49:50.246 --> 00:49:54.926
Well, Kaylee Jade Peterson, I am really, really honored that you took the time
00:49:54.926 --> 00:49:56.546
out of the campaign to do this.
00:49:56.686 --> 00:49:58.906
I wish you much success in the campaign.
00:49:59.486 --> 00:50:03.426
One of the rules I have is that, you know, once you've been a guest,
00:50:03.626 --> 00:50:09.506
you have an open invitation to come back, and it would be really, really sweet if a U.S.
00:50:09.806 --> 00:50:13.586
Congresswoman from Idaho would come back on to be a guest on the program.
00:50:13.766 --> 00:50:18.006
So thank you so much for doing this, and again, good luck on the campaign.
00:50:18.546 --> 00:50:22.646
Thank you so much for the time. And I look forward to seeing you at the end
00:50:22.646 --> 00:50:28.766
of January next year when we've been initiated and sworn in.
00:50:28.986 --> 00:50:31.586
Thank you so much, Erik. And thank you for the work that you're doing.
00:50:32.346 --> 00:50:34.686
All right, guys. And we're going to catch y'all on the other side.
00:50:54.121 --> 00:50:59.741
All right, and we are back. And so now it is time for my next guest, David Eliot.
00:51:00.101 --> 00:51:04.861
David Eliot is a PhD candidate at the University of Ottawa,
00:51:04.861 --> 00:51:09.721
where he researches the social and political effects of artificial intelligence.
00:51:10.081 --> 00:51:15.301
He is a member of the Critical Surveillance Studies Lab, and his work on AI
00:51:15.301 --> 00:51:21.201
has been recognized with numerous awards, including the 2022 Pierre Elliott
00:51:21.201 --> 00:51:24.541
Trudeau Foundation PhD scholarship.
00:51:24.921 --> 00:51:30.081
His first book, Artificially Intelligent, The Very Human Story of AI,
00:51:30.501 --> 00:51:35.101
was recently published by the University of Toronto Press, and we're going to
00:51:35.101 --> 00:51:37.641
be talking about that during the interview.
00:51:37.921 --> 00:51:42.121
So, ladies and gentlemen, it is my distinct honor and privilege to have as a
00:51:42.121 --> 00:51:45.361
guest on this podcast, David Eliot.
00:51:56.846 --> 00:52:00.526
All right. David Eliot, how are you doing, sir? You doing good?
00:52:01.266 --> 00:52:04.646
I'm doing great. Thank you so much for having me on today. Well,
00:52:04.786 --> 00:52:06.266
I appreciate you coming on.
00:52:06.486 --> 00:52:12.146
I just recently had a guest, so I don't know what's going on in Canada with
00:52:12.146 --> 00:52:16.106
AI, but I recently, I just had a professor up at the University of Waterloo,
00:52:16.246 --> 00:52:19.146
and she had written a book about AI, and I looked and I said,
00:52:19.266 --> 00:52:21.166
oh, yeah, I got another guest coming on.
00:52:21.166 --> 00:52:26.346
So even though this is a political show, it seems like I'm getting some of the
00:52:26.346 --> 00:52:29.946
top minds as far as artificial intelligence goes.
00:52:30.126 --> 00:52:34.186
So I really appreciate that. That raises my level, makes people think that I'm intelligent.
00:52:35.166 --> 00:52:39.346
Oh, I'm sure you don't need the help with that, but I'm glad I can be of assistance.
00:52:40.646 --> 00:52:46.406
Yeah, yeah, yeah, yeah. I appreciate that. So look, I usually start off the
00:52:46.406 --> 00:52:49.166
interview with a couple of icebreakers.
00:52:49.946 --> 00:52:54.426
So the first icebreaker is a quote that I want you to respond to.
00:52:54.666 --> 00:53:01.546
And the quote is, what use is producing knowledge if we cannot effectively share
00:53:01.546 --> 00:53:03.126
it with those who need it most?
00:53:04.618 --> 00:53:07.378
Yeah. So, I mean, it's a quote from the prologue of the book.
00:53:08.258 --> 00:53:12.938
And it was one of the major reasons I wrote this book. It was one of the first lines I actually wrote.
00:53:13.358 --> 00:53:17.498
And it came from a frustration I have of academia. And I am an academic.
00:53:17.938 --> 00:53:21.498
My grandparents were academics. I was born to this world and I love it.
00:53:22.078 --> 00:53:25.698
But I'm also critical of it because I feel like a lot of the time we end up
00:53:25.698 --> 00:53:30.718
in these ivory towers producing this knowledge that we don't effectively share
00:53:30.718 --> 00:53:33.738
with people, and that just becomes a bit of an echo chamber.
00:53:34.038 --> 00:53:38.058
And at times we can just be justifying our own existence. That's not everybody,
00:53:38.058 --> 00:53:41.198
but it's become more and more common, I think, with academics.
00:53:41.918 --> 00:53:47.758
And I see people in the everyday world experiencing problems that in the academy,
00:53:47.758 --> 00:53:52.838
we feel like we have the answers for, and then they get upset that people aren't doing things.
00:53:53.198 --> 00:53:58.378
And I feel like, well, we aren't reaching out. We aren't producing work in ways that are accessible.
00:53:58.558 --> 00:54:01.258
You know, we write peer-reviewed papers. We write journal articles.
00:54:01.638 --> 00:54:04.698
We respond to each other, but we just build this echo chamber.
00:54:05.998 --> 00:54:09.218
And what I really wanted to do in this book, which I feel like a lot of great authors
00:54:09.218 --> 00:54:13.218
have done, this is not unique to me, was to really kind of try and take the
00:54:13.218 --> 00:54:17.218
academic research and present it in ways that it gets to the communities who need it.
00:54:17.498 --> 00:54:20.918
And that we meet people where they are. We don't say, oh, well,
00:54:20.998 --> 00:54:22.658
you should come to us for our knowledge.
00:54:22.918 --> 00:54:26.378
That we should meet people where they are, make it accessible to them.
00:54:26.618 --> 00:54:32.098
And beyond everything else, make it enjoyable, because people want to enjoy
00:54:32.098 --> 00:54:36.518
learning. They want to enjoy the content they're getting. And not everybody's an academic.
00:54:36.818 --> 00:54:39.498
Not everybody was meant to be in this life.
00:54:40.598 --> 00:54:46.098
Yeah, that's true about being an academic. So the next icebreaker is what we call
00:54:46.098 --> 00:54:51.138
20 questions. So I need you to give me a number between 1 and 20.
00:54:51.958 --> 00:55:00.278
Go 18. Okay. What's one thing we might all agree is important no matter our differences?
00:55:01.558 --> 00:55:02.758
Ooh, that's a good one.
00:55:04.447 --> 00:55:06.887
It's one of those things where I feel like there's a lot we agree on that's
00:55:06.887 --> 00:55:10.247
important. And when you get asked the question, I kind of blank on it.
00:55:10.867 --> 00:55:16.927
I just think central values of humanity in general and trying to be kind to each other.
00:55:17.067 --> 00:55:19.327
And I think in practice, what that
00:55:19.327 --> 00:55:23.347
takes and promoting human thriving is something I think we all agree on.
00:55:24.027 --> 00:55:27.667
I think where we differ is in practice of how we achieve that.
00:55:27.887 --> 00:55:31.967
And that's where a lot of division can arise. And I think actually with this,
00:55:32.127 --> 00:55:36.867
there's a great quote from an Indigenous leader in Canada I got to spend some time with.
00:55:37.047 --> 00:55:40.727
And she's a very famous environmentalist, has done amazing work,
00:55:40.747 --> 00:55:46.127
but is constantly speaking to Fortune 500 companies, to oil companies, to all these groups.
00:55:46.287 --> 00:55:51.067
And I asked her, how do you work with them when they seem to be so diametrically opposed to you?
00:55:51.227 --> 00:55:55.227
And she said, I always need to remember that every human being at their core
00:55:55.227 --> 00:56:00.607
has 98% in common. Our core values and what drives us is usually the same.
00:56:01.087 --> 00:56:06.047
That 2% of how we try and actualize it and how we understand how we do that is where we differ.
00:56:06.427 --> 00:56:10.707
But if we focus on that 2%, we'll get nowhere. If we can try and build from
00:56:10.707 --> 00:56:13.227
that 98%, we can build something together.
00:56:13.627 --> 00:56:17.627
And that's what sat with me. So I think what we share tends to be core values.
00:56:18.267 --> 00:56:24.127
You know, it's interesting because I always made that argument when I was in the legislature.
00:56:24.147 --> 00:56:31.247
I used to tell people that about 98% of the time we all agreed on stuff; it's
00:56:31.247 --> 00:56:33.127
just the 2% that made the news.
00:56:34.467 --> 00:56:39.727
And it looked like we hated each other's guts and all that. So that's interesting.
00:56:40.367 --> 00:56:44.287
That's a good philosophy to maintain.
00:56:44.827 --> 00:56:50.227
Maybe that'll help us here in the United States kind of navigate things a little better.
00:56:50.707 --> 00:56:56.407
How does one evolve from a magician to a researcher of artificial intelligence?
00:56:57.605 --> 00:57:01.605
Yeah, it was a very interesting and weird path, because I was working in that
00:57:01.605 --> 00:57:07.045
industry, having some nice success, touring, and I got a little tired.
00:57:07.365 --> 00:57:10.845
The entertainment lifestyle is exhausting, and I wanted a bit more stability.
00:57:11.225 --> 00:57:16.145
So I decided to go to university during the winter months and tour during the summer months.
00:57:16.465 --> 00:57:20.325
So I was doing my degree for eight months of the year, tour for four months,
00:57:20.505 --> 00:57:23.785
and I ended up studying sociology, which I loved.
00:57:24.125 --> 00:57:26.005
Thought I was going to continue that research.
00:57:26.385 --> 00:57:28.965
And actually, my original research was on American politics.
00:57:29.165 --> 00:57:33.485
I was doing the sociology of American politics at the time, specifically as
00:57:33.485 --> 00:57:36.045
it related to the rise of Donald Trump.
00:57:37.459 --> 00:57:41.519
And in that, we focused a lot on misinformation. So how does misinformation
00:57:41.519 --> 00:57:44.699
get produced? How does it get spread? Why is it so difficult to deal with?
00:57:44.939 --> 00:57:48.719
And one of the things with it is how easy it is to produce. It's much harder
00:57:48.719 --> 00:57:51.299
to produce good journalism than it is just to write, you know,
00:57:51.379 --> 00:57:52.559
a falsehood and put it out there.
00:57:53.399 --> 00:57:57.479
And then I came across a program which said it could write like a human,
00:57:57.659 --> 00:58:01.859
that you could just give it a prompt and it would write like a human. This was 2019.
00:58:02.399 --> 00:58:06.399
I'm like, well, if that was real, that could be really dangerous for misinformation.
00:58:07.039 --> 00:58:11.119
So I started researching this, met with some friends from Silicon Valley,
00:58:11.159 --> 00:58:14.599
and they're all like, it's real. It's really crazy. You need to see it.
00:58:14.859 --> 00:58:18.479
They directed me towards the company. I got to see some early demos of it.
00:58:18.759 --> 00:58:22.179
And I was floored. And I instantly said, this is going to change everything.
00:58:22.519 --> 00:58:24.599
And originally I was thinking about misinformation.
00:58:25.139 --> 00:58:28.779
But quickly I realized, no, this is going to change everything.
00:58:29.259 --> 00:58:33.899
And that program was GPT-2. So the company that I was looking at was OpenAI.
00:58:34.339 --> 00:58:38.259
And I just had the same moment that everybody else had when they saw ChatGPT.
00:58:38.719 --> 00:58:42.039
And went, this is going to change everything. And every academic started doing
00:58:42.039 --> 00:58:46.739
AI research at that point. I just got lucky that I got to have that moment three
00:58:46.739 --> 00:58:50.339
years earlier, or four years earlier. Yeah, four years earlier.
00:58:51.079 --> 00:58:54.939
And then the pandemic hit, I couldn't tour. So I said, well,
00:58:54.979 --> 00:58:58.939
I'll just continue on this academic path, researching AI. At that point,
00:58:59.019 --> 00:59:01.839
I'm like, I feel like we've got at least 10 or 15 years before this becomes
00:59:01.839 --> 00:59:04.659
a really big deal. Boy, was I wrong on the timeline.
00:59:05.299 --> 00:59:09.799
And things just transitioned from there. I never left this job when the world opened up.
00:59:09.899 --> 00:59:14.379
And I'm so happy to be in this field and getting to do work and try and help
00:59:14.379 --> 00:59:15.999
people understand this moment we're in.
00:59:17.099 --> 00:59:21.959
Why is artificial intelligence a very human story?
00:59:23.145 --> 00:59:29.225
So I think it's interesting because we tend to talk about AI as if it's some alien technology.
00:59:29.585 --> 00:59:34.845
I actually think in Yuval Noah Harari's recent book, he called it like an alien multiple
00:59:34.845 --> 00:59:38.945
times, this idea of framing it as this like extraterrestrial thing that's come
00:59:38.945 --> 00:59:41.125
in that's so different, that's this other.
00:59:41.705 --> 00:59:44.425
But the reality is that AI is a
00:59:44.425 --> 00:59:47.965
human technology. It's a technology that was made by us
00:59:47.965 --> 00:59:51.065
and it's controlled by us.
00:59:51.065 --> 00:59:54.165
In the story of this book, I try and explore
00:59:54.165 --> 00:59:56.885
the human foundations of AI. So in this, we're
00:59:56.885 --> 00:59:59.905
looking at the humans who made it, how they built it, the decisions they
00:59:59.905 --> 01:00:02.865
made, and I think it teaches us some interesting things.
01:00:02.865 --> 01:00:05.925
And one is how the behaviors of AI really take
01:00:05.925 --> 01:00:09.325
on our behaviors, how the grief of
01:00:09.325 --> 01:00:12.205
some of its creators, how their objectives, really shaped this
01:00:12.205 --> 01:00:16.005
technology we're dealing with right now. That AI, it's
01:00:16.005 --> 01:00:18.705
directed by us. It might feel separate, but if
01:00:18.705 --> 01:00:23.005
we treat it like that, we treat it as something that we need to control, as something
01:00:23.005 --> 01:00:27.285
that, you know, is an existential threat, instead of realizing that it is oftentimes
01:00:27.285 --> 01:00:32.505
a reflection of us, a reflection of our decisions. That's why AI in the United
01:00:32.505 --> 01:00:37.645
States is so different than AI in Europe, and why AI in Europe is so different than AI in China.
01:00:38.025 --> 01:00:42.065
It builds and reflects the people building it, the cultures building it,
01:00:42.145 --> 01:00:43.705
the understandings and the values.
01:00:44.205 --> 01:00:47.685
And I think one of the really empowering things about that that I wanted readers
01:00:47.685 --> 01:00:50.885
to take out of this book is understanding that the shape it's taking,
01:00:51.165 --> 01:00:54.265
the way it's affecting us, is caused by human decisions.
01:00:54.765 --> 01:00:58.985
And that there are still decisions left to be made. We as humans get to make
01:00:58.985 --> 01:01:01.345
more decisions now that will decide the future of AI,
01:01:01.585 --> 01:01:05.505
that will decide how it's implemented into our society, how it's built, how it's
01:01:05.505 --> 01:01:11.225
designed. So we get to make choices right now that will define the AI world that
01:01:11.225 --> 01:01:15.225
we get to live in, that our grandchildren get to live in, and potentially people
01:01:15.225 --> 01:01:19.265
for the next 100, 200 years of civilization get to live in.
01:01:20.444 --> 01:01:25.744
Yeah, I, you know, it was, it was really the way you started the book out.
01:01:25.924 --> 01:01:32.884
That was really, really fascinating because, you know, I, there were some terms
01:01:32.884 --> 01:01:35.304
like Boolean. I had heard that term before.
01:01:35.904 --> 01:01:38.924
I guess I was paying attention in math class when that happened,
01:01:39.044 --> 01:01:44.564
but it was like, so to hear the origin story behind that and how we got the
01:01:44.564 --> 01:01:46.544
word algorithm and all that stuff.
01:01:46.544 --> 01:01:52.824
It was like, you could tell from the jump how you were reminding us of
01:01:52.824 --> 01:01:57.204
how much of an impact humans have had in this.
01:01:57.204 --> 01:02:03.524
Because, you know, a lot of, you know, even though in the back of our mind,
01:02:03.684 --> 01:02:06.824
a very, very basic understanding is like,
01:02:07.264 --> 01:02:15.164
yeah, we humans created AI, but it just, you know, it's another thing to make a real
01:02:15.164 --> 01:02:19.724
historical connection all the way back to the beginning of mathematics.
01:02:19.824 --> 01:02:23.144
So I really appreciated how you did that. Well, thanks.
01:02:23.804 --> 01:02:28.344
Did you? So when you were giving your answer on the quote,
01:02:28.604 --> 01:02:36.484
you were talking about why you felt compelled to share knowledge,
01:02:36.484 --> 01:02:40.184
how to take the knowledge from the ivory tower and bring it to the masses.
01:02:40.284 --> 01:02:43.904
Was that the same compulsion that led you to write the book?
01:02:45.129 --> 01:02:51.029
Yeah, yeah, 100%. Because I really felt like with AI, especially at that time,
01:02:51.229 --> 01:02:54.769
when I started the book, you know, we were really starting to kind of develop
01:02:54.769 --> 01:02:56.609
the public conversation on AI.
01:02:57.189 --> 01:03:00.829
And what I saw was in academia, there was a lot of really good research,
01:03:01.089 --> 01:03:02.069
a lot of really good stories.
01:03:02.349 --> 01:03:05.109
You know, you can look at the reference list in this, I'm trying to pull from,
01:03:05.249 --> 01:03:07.409
you know, this academic body of knowledge.
01:03:08.069 --> 01:03:10.969
But then I found with my friends, whenever they would ask me,
01:03:11.169 --> 01:03:12.389
you know, what book should I read?
01:03:12.529 --> 01:03:15.729
What do I need to do to like learn how to navigate AI in my work?
01:03:15.869 --> 01:03:18.389
And you know, my friends come from every background.
01:03:18.609 --> 01:03:23.029
You know, I'm friends with marketing researchers, I'm friends with athletes,
01:03:23.209 --> 01:03:27.049
I'm friends with people from, you know, unfortunately, I'm friends with some politicians too.
01:03:27.529 --> 01:03:30.809
And they will always ask, you know, what do I read? And I'd give them these
01:03:30.809 --> 01:03:32.049
readings that I thought were valuable.
01:03:32.549 --> 01:03:35.129
And they would never read them. And then I would be like, well,
01:03:35.169 --> 01:03:36.929
what are you watching? What are you learning from?
01:03:37.289 --> 01:03:41.009
And what I came to realize, through the way they talked about it or the things
01:03:41.009 --> 01:03:45.329
I saw was that what was accessible tended to be,
01:03:45.749 --> 01:03:50.149
you know, these crypto bros turned AI bros who were spewing what was really
01:03:50.149 --> 01:03:53.249
like not great advice, in my opinion.
01:03:53.689 --> 01:03:58.529
So you kind of saw this really weird thing where I felt like this good knowledge
01:03:58.529 --> 01:04:00.409
wasn't available to these people.
01:04:00.709 --> 01:04:03.629
So I really wanted to kind of fill that gap. I want to provide,
01:04:03.829 --> 01:04:07.529
I'm like, if this book doesn't exist, I want to provide it because right now
01:04:07.529 --> 01:04:11.229
it seems like the options are, you know, easily accessible and digestible
01:04:11.329 --> 01:04:15.089
but not good information, or very good but not easily accessible information.
01:04:15.089 --> 01:04:20.009
So I kind of felt like I was craving that book so I could suggest it to my friends.
01:04:20.149 --> 01:04:21.249
And I'm like, you know what?
01:04:21.609 --> 01:04:25.709
If there is an opening here, someone's going to make it. Might as well be me.
01:04:26.569 --> 01:04:32.309
Yeah. You stated that it wasn't necessary for you to know how AI works,
01:04:32.309 --> 01:04:35.069
but to study its implications.
01:04:35.669 --> 01:04:37.209
Why, why that approach?
01:04:38.675 --> 01:04:42.455
Yeah. So I think a major thing with this is you, of course, need to know how
01:04:42.455 --> 01:04:44.235
it works to a certain extent.
01:04:44.555 --> 01:04:48.675
But I think one of the really intimidating things with AI is the mathematics
01:04:48.675 --> 01:04:54.255
behind it and understanding how vectors work, understanding how neural networks work.
01:04:54.755 --> 01:04:58.675
And that nitty gritty is very important when you're trying to understand specific
01:04:58.675 --> 01:05:01.775
implications and the specific way it evolves.
01:05:02.035 --> 01:05:05.035
But in general, I think you only need
01:05:05.035 --> 01:05:07.835
to know a certain amount. Like, there's a base level of knowledge you
01:05:07.835 --> 01:05:10.535
need to join the conversation and to be
01:05:10.535 --> 01:05:14.375
involved in it, and to be able to, you know, identify when someone's
01:05:14.375 --> 01:05:18.455
selling you a lie, to be able to identify, you know, who might actually know what
01:05:18.455 --> 01:05:22.315
they're talking about in this area, to be able to look at a scenario, look at
01:05:22.315 --> 01:05:27.855
a situation, see a news story and say, hey, that's a problem because of this, the
01:05:27.855 --> 01:05:30.975
way they're using AI here is a problem because of these principles.
01:05:31.755 --> 01:05:34.615
So the book tries to teach you how AI works.
01:05:34.915 --> 01:05:38.795
Like we talk about the actual structures of it. We talk about how these structures
01:05:38.795 --> 01:05:41.835
cause issues, how they interact with social processes.
01:05:42.235 --> 01:05:48.415
But the specifics of the mathematics, those aren't as important for the general
01:05:48.415 --> 01:05:49.355
knowledge understanding.
01:05:49.555 --> 01:05:52.975
They aren't as important for your everyday person and what I feel like they
01:05:52.975 --> 01:05:55.535
need to effectively participate in democracy.
01:05:56.695 --> 01:06:01.655
Yeah. Yeah. Well, I'm glad you took that approach because I hated math.
01:06:02.095 --> 01:06:06.955
And my dad was a math major, and he would get frustrated. It's like,
01:06:07.015 --> 01:06:08.215
why are you not getting this?
01:06:08.315 --> 01:06:10.815
It's like, because I don't like it. He said, well, you got to like it because
01:06:10.815 --> 01:06:12.395
you got to pass, all that stuff.
01:06:12.755 --> 01:06:21.055
But he would have got real heavy into the math part of the whole thing, but not me, not so much.
01:06:22.555 --> 01:06:26.635
I think there's an actual moment in the book too where I am talking about something
01:06:26.635 --> 01:06:30.815
and I say, don't worry, we're not going to go into the math on this, and then I
01:06:30.815 --> 01:06:35.875
felt like I also had to add, in parentheses, possibly because I don't understand it either.
01:06:37.928 --> 01:06:45.548
All right. So how dangerous is systemic bias in facial recognition?
01:06:46.628 --> 01:06:50.988
Massively dangerous. It is one of the biggest issues with AI.
01:06:51.608 --> 01:06:55.148
And it's been there from the beginning. And even as it gets better,
01:06:55.428 --> 01:06:58.248
we still have these issues and they continue to come up.
01:06:58.428 --> 01:07:02.108
So for your listeners who might not understand how the systemic bias works,
01:07:02.648 --> 01:07:05.988
you know, with AI, it's trained on a data set. So when
01:07:05.988 --> 01:07:08.688
you're doing that data set, there's kind of two ways, in a sense, this
01:07:08.688 --> 01:07:11.708
can happen. One is just like a data set made for, say,
01:07:11.708 --> 01:07:14.848
recognizing faces. If that data set,
01:07:14.848 --> 01:07:17.648
like the original data sets which were used, done in Silicon
01:07:17.648 --> 01:07:21.128
Valley, are heavily, you know, say, white people, it
01:07:21.128 --> 01:07:24.108
will have a difficult time identifying people of different races.
01:07:24.108 --> 01:07:27.208
This shows up in
01:07:27.208 --> 01:07:30.228
many different ways, because you have one thing where, like, the system
01:07:30.228 --> 01:07:32.968
might not be that good at recognizing people from one group or another,
01:07:32.968 --> 01:07:35.708
so it can be shown to be, you know, very successful in
01:07:35.708 --> 01:07:38.548
one place, but then it confuses two people together. So in
01:07:38.548 --> 01:07:44.268
stuff like criminal justice applications, that has massive issues. We also see
01:07:44.268 --> 01:07:48.628
it where people tend to think about facial recognition in the sense of, you know,
01:07:48.628 --> 01:07:52.408
we're going to take one person, take their photo, say, who is this, and we'll find
01:07:52.408 --> 01:07:55.748
who it is. So misidentification is a huge problem there.
01:07:56.698 --> 01:08:01.258
But facial recognition extends beyond that. There's ideas of using facial recognition
01:08:01.258 --> 01:08:05.178
to, you know, detect someone's mood, to detect, you know, to say,
01:08:05.298 --> 01:08:08.498
this person's being aggressive or things like that.
01:08:08.878 --> 01:08:14.898
Again, you have massive bias implication there because these are not objective
01:08:14.898 --> 01:08:16.778
ideas. These are social ideas.
01:08:16.998 --> 01:08:21.558
So if you've trained, say, the system in one region where the way that someone
01:08:21.558 --> 01:08:24.778
shows, you know, aggressiveness on their face has a very, you know,
01:08:24.858 --> 01:08:28.318
pronounced kind of expression, and that's how it is in one culture,
01:08:28.538 --> 01:08:30.258
that might not be the same in another.
01:08:30.458 --> 01:08:34.238
So it could easily misread that in a different context.
01:08:34.338 --> 01:08:38.498
So a system that's meant to recognize if someone's being aggressive, that can be a problem.
01:08:38.758 --> 01:08:42.258
I was at a company that was showing that they actually have facial recognition
01:08:42.258 --> 01:08:46.678
systems on their computer now, that when dealing with sensitive documents,
01:08:46.858 --> 01:08:51.778
it is looking to see if you seem upset or agitated, and it will actually restrict
01:08:51.778 --> 01:08:55.798
your access to sensitive documents if you're upset or agitated out of a fear
01:08:55.798 --> 01:08:59.958
that you're going to be stealing them or taking some sort of retribution against the company.
01:09:00.398 --> 01:09:03.058
And just in a cultural sense.
01:09:04.106 --> 01:09:08.426
It doesn't work like that. Like we don't have that ability to adequately do
01:09:08.426 --> 01:09:09.906
that because of the bias in the data.
01:09:10.166 --> 01:09:15.646
So yeah, there are huge problems, especially when it comes to policing in the United States.
01:09:15.726 --> 01:09:18.846
We see a lot of it used in policing systems there.
01:09:19.266 --> 01:09:23.386
It's being used way more in England right now. I have a friend doing a lot of
01:09:23.386 --> 01:09:25.606
research on facial recognition systems in England.
01:09:26.326 --> 01:09:32.986
It's just a whole can of worms that has an entire subset of academia looking at it.
01:09:33.046 --> 01:09:35.766
And most of the best researchers that I know all kind of say,
01:09:35.766 --> 01:09:40.986
we should not be using this for anything important because it's just going to
01:09:40.986 --> 01:09:42.406
cause more problems than it's worth.
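To make concrete how a system can look accurate overall while failing one group, here is a minimal Python sketch of the kind of per-group error audit researchers run on face-matching systems. The group labels and numbers are invented for illustration; they are not results from any real benchmark or vendor.

```python
# Illustrative audit of a face-matching system's error rates by group.
# The counts below are made up; they are NOT real evaluation data.
from collections import defaultdict

# Each record is (group, same_person, predicted_match).
records = (
      [("group_a", False, False)] * 480 + [("group_a", False, True)] * 20
    + [("group_a", True, True)] * 490 + [("group_a", True, False)] * 10
    + [("group_b", False, False)] * 430 + [("group_b", False, True)] * 70
    + [("group_b", True, True)] * 455 + [("group_b", True, False)] * 45
)

correct = sum(1 for _, same, pred in records if same == pred)
print(f"overall accuracy: {correct / len(records):.1%}")  # about 93%, looks fine in aggregate

# False match rate per group: how often two different people are declared a match.
stats = defaultdict(lambda: [0, 0])  # group -> [false matches, different-person pairs]
for group, same, pred in records:
    if not same:
        stats[group][1] += 1
        if pred:
            stats[group][0] += 1

for group, (false_matches, pairs) in sorted(stats.items()):
    print(f"{group}: false match rate = {false_matches / pairs:.1%}")
# One aggregate accuracy number hides that group_b's false match rate (14%)
# is several times group_a's (4%), which is exactly the kind of disparity
# that matters in policing or identity checks.
```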
01:09:43.026 --> 01:09:47.126
Yeah. I, you know, I know about the, the policing thing.
01:09:47.266 --> 01:09:52.086
That's if you watch any cop show that's based out of Britain,
01:09:52.706 --> 01:09:57.506
every cop, it doesn't matter if they're a normal beat cop or they're the head of Scotland Yard.
01:09:57.786 --> 01:10:01.086
Everybody's like, check the facial recognition. You know what I'm saying?
01:10:01.086 --> 01:10:06.726
And so it's like, I know it's a big deal in London, and it's something
01:10:06.726 --> 01:10:08.846
that's being incorporated in the United States.
01:10:09.046 --> 01:10:13.206
And the young lady I was talking about, Kem-Laurin Lubin, she'd written a book
01:10:13.206 --> 01:10:14.666
called Design Heuristics.
01:10:15.466 --> 01:10:21.086
And she talks about, you know, tries to talk about how we can get to a better
01:10:21.086 --> 01:10:25.286
way of dealing with those kind of biases and stuff.
01:10:26.426 --> 01:10:29.366
And there's a lot of work being done on it because it's important.
01:10:29.366 --> 01:10:32.386
But like one example I like to throw out on these biases,
01:10:32.386 --> 01:10:36.906
too, though, which is interesting, was, I believe it was California that tried to use
01:10:36.906 --> 01:10:41.346
an AI system to do sentencing, or it might have been bail.
01:10:41.626 --> 01:10:45.306
It was either bail or sentencing, one of the two. And the idea was they were
01:10:45.306 --> 01:10:46.746
trying to use it to be less biased.
01:10:47.066 --> 01:10:50.046
They were like, instead of having a human, which is going to use their judgment,
01:10:50.286 --> 01:10:54.386
we're going to have an AI system do it that will be objective that, you know, won't judge.
01:10:55.770 --> 01:10:58.790
And what they found when they audited it was that the system was being racist.
01:10:58.870 --> 01:11:02.290
It was, you know, having harsher penalties against minorities.
01:11:02.510 --> 01:11:07.570
And it comes from the principle of bias, of garbage data in, garbage outputs,
01:11:07.570 --> 01:11:11.270
because the data that these systems were learning from, again,
01:11:11.410 --> 01:11:15.290
this is why we say AI is very human, is data that we chose to collect.
01:11:15.470 --> 01:11:18.890
It's data that reflects our actions. So it's reflecting us.
01:11:19.090 --> 01:11:23.310
So if we have racist data, we're going to have racist outcomes.
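As a toy illustration of that garbage-in, garbage-out point, here is a short Python sketch with made-up records, not data from the California program he mentions, showing how a model that simply learns historical detention rates reproduces the disparity baked into those past decisions.

```python
# Toy "garbage in, garbage out" demo; invented records, not any real bail system.
from collections import defaultdict

# (group, prior_offenses, detained): for the same prior record, group_b was
# historically detained far more often than group_a. Real systems usually absorb
# this through proxy features rather than an explicit group label, but the
# effect is the same: the bias lives in the data.
history = (
      [("group_a", 1, True)] * 20 + [("group_a", 1, False)] * 80
    + [("group_b", 1, True)] * 60 + [("group_b", 1, False)] * 40
)

# "Training" here is just memorizing the historical detention rate per cell.
counts = defaultdict(lambda: [0, 0])  # (group, priors) -> [detained, total]
for group, priors, detained in history:
    counts[(group, priors)][0] += int(detained)
    counts[(group, priors)][1] += 1

def predicted_risk(group, priors):
    detained, total = counts[(group, priors)]
    return detained / total

# Two defendants with identical priors get very different "risk" scores,
# because the model faithfully reproduces the bias in past human decisions.
for group in ("group_a", "group_b"):
    print(group, f"risk score: {predicted_risk(group, 1):.0%}")
```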
01:11:23.910 --> 01:11:28.550
Yeah, so let me ask this question. So how is data the new oil?
01:11:30.130 --> 01:11:34.830
It's an interesting one because on one hand, I actually reject that idea.
01:11:35.090 --> 01:11:37.390
But the other hand, it is kind of true.
01:11:37.850 --> 01:11:43.030
So data is the new oil, as people like to say, because it's incredibly valuable right now.
01:11:43.230 --> 01:11:47.270
It is the raw resource that allows AI systems to work.
01:11:47.270 --> 01:11:50.550
And it's interesting because it both fuels the
01:11:50.550 --> 01:11:53.650
creation of AI systems and acts as the gasoline that
01:11:53.650 --> 01:11:56.770
powers them. So when you're training an AI system like
01:11:56.770 --> 01:11:59.710
we're talking about there, you need the data for it to make its
01:11:59.710 --> 01:12:02.770
inferences from, to develop its algorithm. But then
01:12:02.770 --> 01:12:07.410
for it to actually act, it also needs data coming in, so real-time data processing.
01:12:07.410 --> 01:12:11.750
It needs, you know, to be observing something. So even with facial recognition,
01:12:11.750 --> 01:12:16.330
the video camera is producing this data, which is being interpreted
01:12:16.330 --> 01:12:20.450
by an AI system designed and built from data.
01:12:20.970 --> 01:12:25.550
So data is just so important. Having a rich data set to train from is incredibly
01:12:25.550 --> 01:12:27.070
important and incredibly valuable.
01:12:27.290 --> 01:12:32.090
It's why Google is such a valuable company and is actually far more valuable
01:12:32.090 --> 01:12:36.250
than they show in their balance sheets because they have more data than anybody else in the world.
01:12:36.930 --> 01:12:40.450
And that doesn't come up in their balance sheets anywhere. There's nowhere on
01:12:40.450 --> 01:12:42.630
their financial statements that list the value for that data,
01:12:42.790 --> 01:12:47.170
because we just don't even know how to value it. We just know it's worth a lot.
01:12:48.083 --> 01:12:52.883
The reason I say it's not the new oil, though, is because data and oil are completely
01:12:52.883 --> 01:12:54.963
different in how they act as resources.
01:12:55.603 --> 01:12:58.523
When we talk about oil, oil is a rival good.
01:12:59.263 --> 01:13:03.863
If I have a barrel of oil, only I can burn it. You can't also use it.
01:13:04.103 --> 01:13:07.583
You know, if you want to use it, you have to buy it from me. Only one of us can use it.
01:13:07.903 --> 01:13:12.303
But if I have a hard drive full of data, and we both want to train AIs,
01:13:12.563 --> 01:13:17.083
I can train my AI on the data and hand it over to you and you can train your AI.
01:13:17.083 --> 01:13:19.803
It's not diminished by that. So I kind
01:13:19.803 --> 01:13:23.903
of reject this data as the new oil, because economically it
01:13:23.903 --> 01:13:27.463
works completely differently. It's a fascinating
01:13:27.463 --> 01:13:32.423
situation where, when we talk about it as oil, we make this mistake of kind of this
01:13:32.423 --> 01:13:36.963
idea of why we created economies, like capitalist economies: you know, we have
01:13:36.963 --> 01:13:40.523
limited resources, we need to figure out how to distribute them, what's the best
01:13:40.523 --> 01:13:44.803
way to do it, so, you know, we distribute it that way. But with something like
01:13:44.803 --> 01:13:47.283
data, there's questions about, well,
01:13:47.523 --> 01:13:50.623
everybody can use it. This could just be a free good.
01:13:50.803 --> 01:13:53.963
It's completely different. And that's something governments are really struggling
01:13:53.963 --> 01:13:58.023
with right now in understanding how we should build data economies.
01:13:58.663 --> 01:14:04.103
So since you mentioned Google, you state that Google search is a form of surveillance.
01:14:04.763 --> 01:14:08.283
Why did you feel it was important for your readers to know that?
01:14:08.403 --> 01:14:12.283
Because everybody jokes about that, right? They'll say, oh, well, you know.
01:14:13.370 --> 01:14:16.450
You know, if you put too much information in there, you know,
01:14:16.550 --> 01:14:18.310
they, you know, they spying on you and stuff.
01:14:18.490 --> 01:14:23.810
And then, of course, you know, we'll see whatever we research or whatever.
01:14:24.070 --> 01:14:28.430
Then it's like all of a sudden we start getting ads. So kind of explain how
01:14:28.430 --> 01:14:30.590
Google search is a form of surveillance.
01:14:31.210 --> 01:14:36.890
So it's not actually even just Google search. It's the entire suite of the Google infrastructure.
01:14:37.130 --> 01:14:40.350
So we call Google a surveillance advertising company.
01:14:41.070 --> 01:14:44.570
And you might think that your ads come purely from your Google search.
01:14:45.010 --> 01:14:48.450
But there's this interesting thing. Go to any website. I'm sure if you have
01:14:48.450 --> 01:14:50.430
a website, you might not even realize this.
01:14:50.770 --> 01:14:54.550
If you scroll down to the bottom and look at it. I can't remember the latest stat,
01:14:54.670 --> 01:14:58.730
but it was something like 75% of the websites on the internet run off Google Analytics.
01:14:59.550 --> 01:15:04.090
What that means is whenever a user is on your website, Google can see what they're doing.
01:15:04.470 --> 01:15:08.510
All of that is data that they can see as well. They can see those actions along
01:15:08.510 --> 01:15:09.870
with the searches they're making,
01:15:10.350 --> 01:15:13.250
to build a bigger profile on that user.
01:15:14.342 --> 01:15:17.822
And then they use that profile to run through, you know, advertising algorithms.
01:15:18.142 --> 01:15:21.702
But they're creating this knowledge of who you are, which is,
01:15:21.842 --> 01:15:25.822
I think, really important to understand kind of this economic engine that we're
01:15:25.822 --> 01:15:29.222
in right now of surveillance, of that they are surveilling you,
01:15:29.342 --> 01:15:30.562
they're producing data about you.
01:15:30.742 --> 01:15:33.142
And then that is something that they're using to create value,
01:15:33.342 --> 01:15:37.902
whether that's advertising or now we're seeing this data used to create AI systems.
01:15:38.342 --> 01:15:42.802
So I think it's really important to recognize that what's going on here is surveillance
01:15:42.802 --> 01:15:44.862
and to speak about it in that way.
01:15:45.362 --> 01:15:50.642
Because once we kind of recognize that reality, we gain a better ability to
01:15:50.642 --> 01:15:53.182
kind of regulate it and say, how do we want to regulate this?
01:15:53.362 --> 01:15:54.842
How do we feel about this?
01:15:55.062 --> 01:16:00.102
And I think it's important to kind of build an understanding of the power of these systems.
01:16:00.342 --> 01:16:04.562
And I don't want to say, you know, I might be a bit more radical on the privacy side with this.
01:16:04.862 --> 01:16:08.222
Not everybody needs to be. Some people might be more comfortable with this.
01:16:08.322 --> 01:16:12.202
I have family members who know all this information, who are very comfortable with it.
01:16:12.262 --> 01:16:18.802
But I think we need that knowledge to have these conversations and decide what we as a society want.
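For readers who want a picture of the mechanism being described, here is a small, hypothetical Python sketch (not Google's actual pipeline or API) of how page-view events reported from many unrelated sites by the same embedded tracker can be stitched into one interest profile for a single visitor.

```python
# Hypothetical illustration of cross-site surveillance advertising; not any real tracker's code.
from collections import Counter

# Events as an embedded third-party analytics script might report them:
# (visitor_id, site, page_topic). The visitor never handed this to one company;
# it is visible because the same script runs on each of these unrelated sites.
events = [
    ("visitor-123", "news-site.example", "politics"),
    ("visitor-123", "shoe-shop.example", "running shoes"),
    ("visitor-123", "forum.example", "marathon training"),
    ("visitor-123", "recipe-site.example", "high-protein meals"),
    ("visitor-456", "news-site.example", "politics"),
]

def build_profile(visitor_id, events):
    """Aggregate one visitor's page topics across every site into a targeting profile."""
    return Counter(topic for vid, _site, topic in events if vid == visitor_id).most_common()

print(build_profile("visitor-123", events))
# [('politics', 1), ('running shoes', 1), ('marathon training', 1), ('high-protein meals', 1)]
# No name is attached, yet the cross-site view already sketches a runner who follows
# the news and watches their diet, which is plenty for ad targeting or model training.
```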
01:16:19.382 --> 01:16:25.702
Yeah, because you basically make the argument that the legislation that says, okay,
01:16:26.002 --> 01:16:32.682
well, you can opt out of putting in your personal data is not even really scratching
01:16:32.682 --> 01:16:37.802
the surface as far as how these companies like, you know,
01:16:38.042 --> 01:16:43.082
Google and I guess AWS, how they can get information.
01:16:43.082 --> 01:16:47.742
They don't necessarily need your personal data to get the information they need
01:16:47.742 --> 01:16:50.962
to cater to or market to you.
01:16:52.126 --> 01:16:55.526
Yeah, and that really is an interesting one because there's two sides to this.
01:16:55.926 --> 01:16:59.626
One is that in the old economy, which was surveillance advertising,
01:17:00.606 --> 01:17:03.186
they did need a certain level of personal data.
01:17:03.386 --> 01:17:04.566
And that's why we started regulating
01:17:04.566 --> 01:17:07.946
it. Because they needed to know stuff about you to market to you.
01:17:08.406 --> 01:17:14.446
In the new AI economy, the valuable data is what was not that valuable in that economy.
01:17:14.686 --> 01:17:16.906
The valuable data is just like everything.
01:17:17.186 --> 01:17:20.786
They're just trying to sweep up everything. Personal data is interesting, but
01:17:20.786 --> 01:17:23.626
it doesn't cover everything. So now they're
01:17:23.626 --> 01:17:26.946
kind of trying to segment it off and being like, hey, we'll let you pass privacy laws
01:17:26.946 --> 01:17:30.286
that focus on personal data so you can feel safe, as long
01:17:30.286 --> 01:17:34.006
as you let us have all this other stuff. That, like, "not personal data"
01:17:34.006 --> 01:17:36.846
just means that they can't identify you from it. It could
01:17:36.846 --> 01:17:40.106
still be, like, you know, your heart rate from your smartwatch.
01:17:40.106 --> 01:17:43.446
There are countries in the world where that's not considered personal data. You
01:17:43.446 --> 01:17:46.186
know, what you're doing on websites, they'll just
01:17:46.186 --> 01:17:48.946
cut it off: well, David's name isn't attached to
01:17:48.946 --> 01:17:52.046
it, so, you know, we can use this information, what
01:17:52.046 --> 01:17:54.906
he did, because they're not looking to market to me, they're looking
01:17:54.906 --> 01:18:00.886
to aggregate that into AI systems. An interesting one is how that has moved into
01:18:00.886 --> 01:18:05.546
marketing algorithms. A really interesting specific one that I worked on was
01:18:05.546 --> 01:18:09.506
a proposal by Google where they were like, you know what, we're going to stop
01:18:09.506 --> 01:18:12.746
collecting all this personal data, we're going to stop doing that. And instead,
01:18:12.866 --> 01:18:16.226
they came up with an algorithm that would basically hop into your computer,
01:18:16.466 --> 01:18:18.386
I think it was like once a week,
01:18:18.906 --> 01:18:23.786
look at your last five URLs you had visited, and from that, create a profile
01:18:23.786 --> 01:18:26.726
of who you were based on just your last five searches.
01:18:26.726 --> 01:18:29.666
That was far more accurate at predicting,
01:18:29.666 --> 01:18:34.346
like, who you were, what you wanted to see, what your feelings were that period
01:18:34.346 --> 01:18:39.586
of time, than they had ever got from doing this massive personalized version
01:18:39.586 --> 01:18:43.566
where they had to collect all this data, which just shows the power of AI there.
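As a rough sketch of what a profile built from just your last five URLs might look like, here is a deliberately simple, hypothetical Python example; the domain-to-topic mapping and the scoring are inventions for illustration, not the actual proposal he describes (which sounds similar to Google's FLoC and Topics experiments).

```python
# Hypothetical sketch of on-device profiling from recent URLs; not Google's real algorithm.
from collections import Counter
from urllib.parse import urlparse

# Invented mapping from domains to coarse interest categories.
DOMAIN_TOPICS = {
    "runnersworld.example": "fitness",
    "marathonforum.example": "fitness",
    "shoestore.example": "athletic shoes",
    "weather.example": "weather",
    "news.example": "news",
}

def weekly_profile(last_five_urls):
    """Turn the last five visited URLs into a ranked interest profile, computed locally."""
    topics = Counter()
    for url in last_five_urls:
        topic = DOMAIN_TOPICS.get(urlparse(url).netloc)
        if topic:
            topics[topic] += 1
    return topics.most_common()

recent = [
    "https://runnersworld.example/articles/tempo-runs",
    "https://shoestore.example/mens/trail",
    "https://marathonforum.example/threads/taper-week",
    "https://weather.example/saturday",
    "https://news.example/local",
]
print(weekly_profile(recent))
# [('fitness', 2), ('athletic shoes', 1), ('weather', 1), ('news', 1)]
# Five recent URLs already imply a runner shopping for shoes, which is his point:
# a small, fresh slice of behavior can predict interests better than a stale dossier.
```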
01:18:43.566 --> 01:18:48.666
Will AI create a new Luddite fallacy in our society?
01:18:49.992 --> 01:18:54.552
It's an interesting proposition because the Luddite fallacy,
01:18:54.592 --> 01:18:59.552
right, is this idea that, well, the fallacy is the idea that the,
01:18:59.712 --> 01:19:03.932
or I need to hop back and remember which one specifically the Luddite fallacy is.
01:19:04.232 --> 01:19:08.012
Because the Luddites are the people who rejected the technology or are seen
01:19:08.012 --> 01:19:11.392
as being the ones who, you know, rejected the technology.
01:19:11.392 --> 01:19:17.692
But then, oh, I'm trying to work through the economic fallacy in my head.
01:19:17.692 --> 01:19:20.932
Because, yeah, the fallacy is that they rejected new
01:19:20.932 --> 01:19:24.372
technologies of the Industrial Revolution and didn't recognize
01:19:24.372 --> 01:19:27.412
that those technologies would help them in the
01:19:27.412 --> 01:19:33.552
long run, overall, you know, would raise economic standings. It is also based off
01:19:33.552 --> 01:19:38.032
a misunderstanding of the Luddites, that the Luddites, they didn't reject the
01:19:38.032 --> 01:19:42.312
technology, they were actually fine with it. They rejected how it was being integrated
01:19:42.312 --> 01:19:46.192
into society, and what they were saying is that this is destroying our communities.
01:19:46.852 --> 01:19:48.972
So the Luddites were mainly skilled artisans.
01:19:49.312 --> 01:19:51.752
So when they saw the technology come in, they're like, whoa,
01:19:51.932 --> 01:19:55.212
we can produce so much better stuff now. We can make much better products for
01:19:55.212 --> 01:19:56.652
people and we can mass produce it.
01:19:56.852 --> 01:20:00.712
But the factory owner said, no, we're firing all the artisans and we're mass
01:20:00.712 --> 01:20:03.372
producing cheap things and we're going to get more profit from that.
01:20:04.012 --> 01:20:07.252
That's what the Luddites were upset about. They saw it as this destruction of
01:20:07.252 --> 01:20:11.432
agricultural work, the forcing of people into cities, into factories.
01:20:11.692 --> 01:20:16.712
So what they were more concerned about was a quality of living and a quality of life in their time.
01:20:17.782 --> 01:20:20.842
So I think what we could see is, yeah, that rise again, where,
01:20:21.202 --> 01:20:23.922
you know, we see it right now, like with the economy, right,
01:20:24.082 --> 01:20:27.702
where that line's going up, the stock market's going up.
01:20:27.822 --> 01:20:30.482
And this is kind of like what that fallacy is talking about.
01:20:31.142 --> 01:20:34.762
You know, look, everything's going great. Economic performance is awesome.
01:20:34.942 --> 01:20:38.402
But that doesn't account for the lived experiences of people within it.
01:20:38.642 --> 01:20:43.262
It doesn't account for the fact that the Luddites were experiencing a massive trauma.
01:20:43.382 --> 01:20:46.362
They were having their livelihoods and their jobs taken from them.
01:20:46.362 --> 01:20:48.202
And that's what they were pushing back against.
01:20:48.402 --> 01:20:52.582
They were saying, we need to do this in a humane way. We approve of this technology.
01:20:52.582 --> 01:20:55.062
We agree it can create massive economic benefit.
01:20:55.482 --> 01:20:58.622
We need to implement it in a way that's humane. And I think,
01:20:58.722 --> 01:21:03.302
you know, with AI, we do risk dealing with the same problems there because,
01:21:03.322 --> 01:21:06.862
you know, people are like, oh, it's going to be so much better for productivity, all of this.
01:21:06.862 --> 01:21:10.102
But you have situations where, you know, a 40-
01:21:10.102 --> 01:21:12.962
year-old father who's been in a
01:21:12.962 --> 01:21:15.962
field, you know, has his education, might get
01:21:15.962 --> 01:21:18.782
laid off, and they'll say, sorry, you're redundant now, AI can
01:21:18.782 --> 01:21:23.702
do this. What is that person going to do? You know, they're so far into their career,
01:21:23.702 --> 01:21:27.782
retooling is difficult. Do they go back to school? Do they accept a job that's
01:21:27.782 --> 01:21:31.922
now much under their salary of what they had before? And we've seen this happen
01:21:31.922 --> 01:21:36.322
with the Rust Belt states in the U.S., right, and we've seen the effect this has
01:21:36.322 --> 01:21:37.962
on people when they are de-skilled,
01:21:38.162 --> 01:21:41.202
when you have these massive automation capabilities.
01:21:41.502 --> 01:21:43.642
And I think that's something we really need to be worried about.
01:21:43.782 --> 01:21:48.442
So in a political sense, one of the things I really advocate for is we don't
01:21:48.442 --> 01:21:51.162
know how difficult this transition is going to be.
01:21:51.302 --> 01:21:57.042
We still don't know how big the AI job transformation is going to be. We know it will exist.
01:21:58.419 --> 01:22:02.479
We need to prepare for that worst case scenario. We need to prepare and build
01:22:02.479 --> 01:22:06.499
social safety nets to build understandings that like if people are being de-skilled,
01:22:06.539 --> 01:22:08.559
you know, in their 40s, how are we going to deal with that?
01:22:08.839 --> 01:22:11.159
That'll be different than people who just went to university,
01:22:11.159 --> 01:22:15.999
got a business degree, paid $100,000 for it, you know, have now had basically
01:22:15.999 --> 01:22:17.179
all their knowledge wiped out.
01:22:17.659 --> 01:22:21.339
What do we do about those people? These are questions we need to be asking to
01:22:21.339 --> 01:22:26.679
figure out like what systems could we have that reduce the harm to those people
01:22:26.679 --> 01:22:27.819
from that because that harm is real.
01:22:28.419 --> 01:22:31.899
That harm has massive social effects, massive psychological effects,
01:22:32.079 --> 01:22:34.079
and very long-term effects.
01:22:34.079 --> 01:22:39.019
As someone who personally was affected by the 2008 recession with my family,
01:22:39.279 --> 01:22:43.499
I can speak to being a kid that lives in one of those families that experiences that.
01:22:43.839 --> 01:22:49.639
As we've seen from the Rust Belt states, we can see the political shift it can cause.
01:22:49.739 --> 01:22:52.119
We can see the resentment that it can cause.
01:22:52.759 --> 01:22:57.019
So I think it's something that we as a society need to be preparing for and
01:22:57.019 --> 01:23:01.279
need to be asking difficult questions about, yes, AI is beneficial for society.
01:23:01.419 --> 01:23:04.059
There's no stopping the train. It's left the station.
01:23:04.579 --> 01:23:10.759
How do we make sure it runs smoothly? Yeah, because Nike just, for example,
01:23:11.799 --> 01:23:17.259
made a decision that they were going to lay off like 700-some employees and
01:23:17.259 --> 01:23:20.259
they were saying that AI was going to handle
01:23:17.259 --> 01:23:20.259
the stuff that they would handle. And I guess it's going to be a hybrid where some
01:23:27.277 --> 01:23:29.557
of the mundane things AI can do,
01:23:29.797 --> 01:23:34.397
but some of the things that require human touch, they'll stay,
01:23:34.617 --> 01:23:36.577
which I'm trying to envision in my mind.
01:23:36.737 --> 01:23:42.997
It's like, okay, well, as far as ringing up the items, AI will do that some kind of way.
01:23:43.517 --> 01:23:49.257
But as far as the sales pitch and getting you to buy these particular shoes
01:23:49.257 --> 01:23:53.397
or whatever, you still need that human creativity to make that happen.
01:23:53.757 --> 01:23:58.617
I don't know what their plan is and how that's going to work,
01:23:58.617 --> 01:24:05.397
but that just highlights to me what you just addressed in that.
01:24:05.737 --> 01:24:12.017
A couple more questions. What particular application for AI are you most excited about?
01:24:12.697 --> 01:24:17.257
I think I'm really excited about it in accessibility and healthcare.
01:24:17.617 --> 01:24:21.557
I think those are two areas. I mean, right now, healthcare really is the big one.
01:24:22.057 --> 01:24:25.817
And the major reason I'm really excited about that is not just the advances
01:24:25.817 --> 01:24:30.797
that we are making, but also the structure of the healthcare system and why
01:24:30.797 --> 01:24:33.937
it's actually a really well-structured space for automation.
01:24:34.317 --> 01:24:39.177
Because the advances are amazing in reading radiology scans.
01:24:39.557 --> 01:24:44.417
And I was just at a rural health care center in Saskatchewan that's implementing
01:24:44.417 --> 01:24:49.657
AI to help bring accessible health care to rural communities that are run in
01:24:49.657 --> 01:24:50.797
centralized locations.
01:24:51.297 --> 01:24:55.757
Just amazing technologies here. And what I think is interesting about it is
01:24:55.757 --> 01:25:00.157
when we're applying it in these situations, it's about expanding.
01:25:00.477 --> 01:25:03.317
It's about expanding the health care offerings we already have,
01:25:03.537 --> 01:25:07.077
expanding them to more regions, trying to bring them to people that don't already have them.
01:25:07.237 --> 01:25:10.397
It tends to be less about replacing something with a worse product.
01:25:11.571 --> 01:25:14.511
So we end up seeing this situation where, you know,
01:25:14.511 --> 01:25:18.611
because it's heavily regulated, because health care is incredibly heavily regulated,
01:25:18.611 --> 01:25:24.211
you see more kind of skepticism and you see more testing of systems
01:25:24.211 --> 01:25:28.231
before they're implemented, because the cost of messing up is so much higher
01:25:28.231 --> 01:25:33.831
and the regulatory cost of messing up is very high. So you see much better systems being used.
01:25:33.831 --> 01:25:39.471
On top of that, we don't tend to see true automation, and what I mean in that
01:25:39.471 --> 01:25:41.011
sense is, if you are a doctor.
01:25:41.951 --> 01:25:45.311
And so if you're a radiologist, let's say, and they bring in,
01:25:45.391 --> 01:25:47.911
you know, an AI technology to read scans for you.
01:25:48.451 --> 01:25:51.831
In other fields, as we see, like maybe with Nike, they bring in that technology
01:25:51.831 --> 01:25:54.591
and they go, lay off, lay off the person.
01:25:55.131 --> 01:25:59.651
We are in a doctor shortage. Doctors already are not doing all the work they
01:25:59.651 --> 01:26:01.751
could be doing. There's so much more that could be done.
01:26:02.071 --> 01:26:04.751
So when you bring in something like this that can read the scans,
01:26:04.951 --> 01:26:06.311
you're liberating their time.
01:26:06.671 --> 01:26:10.451
You're giving them more time to engage in other activities. So now maybe that
01:26:10.451 --> 01:26:12.451
radiologist will treat more patients.
01:26:12.731 --> 01:26:16.891
Maybe that radiologist will spend more time on research, helping to design new
01:26:16.891 --> 01:26:19.911
cancer detection tools or new cancer treatment tools.
01:26:20.411 --> 01:26:25.731
So I think the healthcare implications are massive and I think really exciting
01:26:25.731 --> 01:26:29.971
both in the technological developments and in just the structure that the healthcare
01:26:29.971 --> 01:26:33.871
system has to be able to implement them in very humane ways.
01:26:34.811 --> 01:26:37.611
What do you want the readers to take from this book?
01:26:38.614 --> 01:26:42.374
I really want this book to be a starting point for readers. I want you to start
01:26:42.374 --> 01:26:46.514
here and not end here. I want it to be a place that introduces you to the things
01:26:46.514 --> 01:26:49.134
you need to know, but also that's empowering,
01:26:49.774 --> 01:26:55.094
because I think a lot of talk about AI is very doom and gloom. I talked before about how
01:26:55.554 --> 01:26:59.574
I felt like my friends were seeing not-great information. The other side
01:26:59.574 --> 01:27:04.454
of that coin was information that was just, you know, doomerism, that was just,
01:27:04.454 --> 01:27:07.474
you know, AI is going to take over, AI is going to destroy everything,
01:27:07.634 --> 01:27:08.874
it's this all-powerful force.
01:27:09.574 --> 01:27:11.594
I want people to come out with a bit of optimism.
01:27:12.814 --> 01:27:16.134
Or optimism might be the wrong word. The last chapter is called Hope,
01:27:16.274 --> 01:27:18.674
and I think there's a difference between hope and optimism.
01:27:19.234 --> 01:27:21.754
And hope is the belief that there's still a good path forward.
01:27:22.074 --> 01:27:23.774
Hope is the last thing we lose.
01:27:24.254 --> 01:27:28.214
And I hope that people come out with a sense of hope, not only that the future
01:27:28.214 --> 01:27:32.174
with AI can be better, but hope that they can make a difference,
01:27:32.374 --> 01:27:36.674
and hope that they can understand it, a realization that we get to shape this future.
01:27:37.094 --> 01:27:40.014
And it's not one person who gets to make the decisions here.
01:27:40.254 --> 01:27:43.594
The future comes from us. The future comes from democracy.
01:27:43.974 --> 01:27:48.634
The future comes from us working together and demanding a better future.
01:27:48.854 --> 01:27:52.474
So I hope you gain the tools from this book to advocate for yourself.
01:27:52.814 --> 01:27:57.154
You gain the tools from this book to be able to speak to your own experiences
01:27:57.154 --> 01:27:59.374
and to be able to ask for more.
01:27:59.514 --> 01:28:03.494
And you become a part of this great conversation, which is, I think,
01:28:03.534 --> 01:28:05.354
the most important conversation of our time.
01:28:05.734 --> 01:28:12.514
So it's funny you bring up hope because one of the questions I'm asking every
01:28:12.514 --> 01:28:16.634
guest as we close out is to finish this sentence.
01:28:17.154 --> 01:28:19.474
I have hope because...
01:28:21.231 --> 01:28:25.251
I have hope because there's still many decisions left to be made.
01:28:26.111 --> 01:28:30.531
Okay. All right. It's succinct and to the point.
01:28:31.071 --> 01:28:35.011
David, how can people get this book, Artificially Intelligent:
01:28:35.331 --> 01:28:39.431
The Very Human Story of AI, and how can they reach out to you?
01:28:40.071 --> 01:28:43.571
So you can find Artificially Intelligent anywhere books are sold.
01:28:43.571 --> 01:28:49.331
You can go to your local bookshop or Barnes & Noble, or order it online through bookshop.org.
01:28:49.331 --> 01:28:52.711
It's a great place if you're looking to order it, and it supports local bookstores.
01:28:53.271 --> 01:28:56.911
Get it through Barnes & Noble, get it through Amazon. You can get it directly
01:28:56.911 --> 01:28:59.351
through my publisher, University of Toronto Press.
01:28:59.551 --> 01:29:03.851
I think they have a big sale on right now actually to celebrate their birthday.
01:29:04.131 --> 01:29:05.251
So you could go check that out.
01:29:05.591 --> 01:29:08.451
And you can find me at davideliot.org.
01:29:08.951 --> 01:29:13.351
Eliot has one L and one T in it. I will be spelling that out for the rest of my life.
01:29:13.631 --> 01:29:17.471
So davideliot, one L, one T, dot org. I have an email form on there.
01:29:17.591 --> 01:29:21.711
If you want to send me an email, get in touch with me. And I hope you pick up
01:29:21.711 --> 01:29:22.971
the book. I hope you enjoy it.
01:29:23.131 --> 01:29:26.671
I hope you join the conversation and thank you so much for having me on today, Erik.
01:29:27.251 --> 01:29:31.631
Well, David Eliot, it was an honor to have you on and I can relate,
01:29:31.851 --> 01:29:39.031
you know, the way I spell Erik seems to be unique to some people, instead of with a C it's with a K,
01:31:39.571 --> 01:31:44.091
but you know, even, even my AI, when it does the transcript,
01:31:44.351 --> 01:31:46.691
I have to go in and edit that all the time.
01:29:47.211 --> 01:29:51.291
Oh no. But, but David, I'm, I'm really glad that we had this discussion and
01:29:51.291 --> 01:29:52.451
I appreciate you coming on.
01:29:52.931 --> 01:29:56.511
Perfect. Thank you very much. All right, guys. And we're going to catch y'all on the other side.
01:30:08.952 --> 01:30:13.872
All right. And we are back. So I want to thank Kaylee Jade
01:30:14.252 --> 01:30:17.752
Peterson and David Eliot for coming on the show.
01:30:19.152 --> 01:30:23.752
You know, I wish Kaylee well. She has, you know,
01:30:24.812 --> 01:30:30.772
been really steadfast and has made it a crusade to make sure that progressive
01:30:30.772 --> 01:30:35.552
voices are heard and are organized in Idaho.
01:30:35.552 --> 01:30:38.612
And I know a lot of people are like, really?
01:30:39.552 --> 01:30:41.712
Is that really a worthwhile endeavor?
01:30:42.792 --> 01:30:48.352
And yes, it is, especially for her and her family, because she lives there.
01:30:48.972 --> 01:30:57.792
And I really hope, I think that if she gets in, she's going to be a positive voice for us.
01:30:58.992 --> 01:31:05.032
And it's definitely going to be better than what they got up there already.
01:31:05.212 --> 01:31:07.812
So I wish her well. You can...
01:31:09.150 --> 01:31:13.790
Just Google her, and if you want to support her, go ahead and do that.
01:31:14.290 --> 01:31:18.150
And then David Eliot, young man, very insightful young man,
01:31:18.310 --> 01:31:23.290
who basically has written his book, Artificially Intelligent.
01:31:24.070 --> 01:31:28.050
And I really learned a lot of stuff.
01:31:28.050 --> 01:31:35.510
He is a sociologist by training, and he's done a lot of research
01:31:35.510 --> 01:31:41.410
from that capacity dealing with artificial intelligence and its impact and all that.
01:31:41.550 --> 01:31:43.850
So he's not a computer guy.
01:31:44.270 --> 01:31:46.050
But just like Dr.
01:31:46.950 --> 01:31:57.750
Lubin, who came on before, he brings a particular perspective, especially from the human side.
01:31:57.750 --> 01:32:06.310
But I learned a lot from him as far as like the origin of the word algorithm, you know.
01:32:07.010 --> 01:32:10.750
Yeah, I got to get that book. And as a matter of fact,
01:32:10.870 --> 01:32:16.070
if you heard the interview, remember, I mentioned Dr. Lubin's work.
01:32:16.270 --> 01:32:18.910
And David was scribbling that down.
01:32:19.290 --> 01:32:23.350
And so, Dr. Lubin, I think you're going to have one more person buying your
01:32:23.350 --> 01:32:27.910
book for sure. Yes, I just want to thank them for coming on,
01:32:28.290 --> 01:32:33.370
especially during this moment that we are in.
01:32:34.250 --> 01:32:42.550
So as I'm recording this, a lot of things have happened in Minnesota over the last week.
01:32:43.881 --> 01:32:50.261
You know, this is a weekly show, so with the way we record stuff and all that,
01:32:51.201 --> 01:32:57.601
things happen right around the time we're recording or I miss it because I'm recording.
01:32:57.841 --> 01:33:03.581
But a young man got shot, same age as Renee Good. His name was Alex Pretti.
01:33:03.921 --> 01:33:07.221
Alex was an ICU nurse.
01:33:08.341 --> 01:33:13.421
And part of my job when I worked for the Fulton County Sheriff's Office,
01:33:13.881 --> 01:33:21.281
when I was assigned to Grady Hospital, was that I would have to sit on patients that were sent to ICU.
01:33:22.021 --> 01:33:25.401
That is a very, very diligent job.
01:33:25.621 --> 01:33:33.381
It's not as hectic as being down in the emergency room, but it's still very,
01:33:33.501 --> 01:33:38.241
very stressful, a very tenuous situation, because these people made it from the
01:33:38.241 --> 01:33:40.001
emergency room to the ICU.
01:33:41.640 --> 01:33:46.500
Their life is still in the balance. And so these people have to be very sensitive,
01:33:47.020 --> 01:33:52.240
and very professional and very aware of what's going on with the patients they're assigned.
01:33:53.860 --> 01:33:59.100
And there's a video that has gone around of Mr.
01:33:59.220 --> 01:34:07.440
Pretti giving basically a eulogy to a veteran who died at the VA hospital in
01:34:07.440 --> 01:34:09.620
Minnesota where he worked.
01:34:11.640 --> 01:34:15.800
You know, that's just kind of a glimpse of what this young man was.
01:34:15.980 --> 01:34:18.960
He was, you know, he was an avid outdoorsman.
01:34:19.220 --> 01:34:22.000
He liked to hike and bike and all that stuff.
01:34:23.220 --> 01:34:31.320
But he was a pretty active protester of ICE because another video has come out.
01:34:31.440 --> 01:34:33.160
And again, these people are so dumb.
01:34:34.280 --> 01:34:38.840
And I'm kind of like Tiffany Cross now. You know, everybody else is trying to
01:34:38.840 --> 01:34:45.300
be polite and, you know, not say direct things.
01:34:45.540 --> 01:34:49.860
You know, she'll get on CNN and just say, you're lying. Right.
01:34:50.420 --> 01:34:53.600
Instead of, you know, saying, well, I don't agree with that or whatever.
01:34:53.820 --> 01:34:57.360
No, she just comes out and says it. And I think that's the way you have
01:34:57.360 --> 01:35:03.060
to treat these people, because they take advantage of kindness.
01:35:03.940 --> 01:35:07.880
Right. They take advantage of civility, because they have none.
01:35:08.520 --> 01:35:13.820
And, you know, so they're bulls in a china shop for real, right?
01:35:14.460 --> 01:35:20.100
But they're also stupid people, which usually kind of goes hand in hand with brashness.
01:35:20.840 --> 01:35:25.340
And so they released a video talking about.
01:35:26.633 --> 01:35:30.293
Well, not talking about, but it was a video showing Mr.
01:35:30.433 --> 01:35:35.293
Pretti like 11 days before he got shot at another protest.
01:35:36.253 --> 01:35:42.753
And obviously there was some exchange between him and one of the ICE or Border Patrol
01:35:42.753 --> 01:35:46.253
or one of the federal people that was out there.
01:35:46.653 --> 01:35:50.033
And he didn't take too kindly to it.
01:35:50.213 --> 01:35:56.713
He walked up to the car. I think fluids were exchanged, like spitting.
01:35:57.873 --> 01:36:02.873
Then he kicked the taillight out, right, of the car.
01:36:04.053 --> 01:36:07.553
And somebody said, wow, he's got to be pretty strong. You know,
01:36:07.953 --> 01:36:11.333
cold weather kind of helps, makes things brittle.
01:36:12.433 --> 01:36:16.173
So when he kicked the taillight out of the car, the officers swarmed him and
01:36:16.173 --> 01:36:19.873
all that, took him down. But he didn't get arrested.
01:36:20.853 --> 01:36:23.613
They just kind of subdued him.
01:36:24.653 --> 01:36:31.673
And then when he stood up, you could see that he had a gun holstered.
01:36:32.853 --> 01:36:36.733
Didn't take the gun from him. They almost acted like they didn't see it.
01:36:37.693 --> 01:36:45.113
And, you know, they got up, they walked away, got in their SUV with a damaged
01:36:45.113 --> 01:36:46.213
taillight and drove off.
01:36:47.733 --> 01:36:51.993
So obviously it was a different group of people that Mr.
01:36:52.153 --> 01:37:00.973
Pretti encountered 11 days later, when he was recording an action that they were
01:37:00.973 --> 01:37:05.813
carrying out according to Secretary Noem. And with her, basically,
01:37:06.173 --> 01:37:11.453
you got to fact-check her breathing, right? Because she lies that much.
01:37:11.893 --> 01:37:17.753
She said that they were about to arrest a child pedophile.
01:37:17.953 --> 01:37:20.473
And like some comedian said, is there a difference?
01:37:21.333 --> 01:37:22.893
Is there any other kind of pedophile?
01:37:24.253 --> 01:37:28.273
But the person
01:37:28.273 --> 01:37:33.313
of interest they were after was somebody that had been accused of being a
01:37:33.313 --> 01:37:39.733
pedophile. And so the story that they're telling is that he impeded them. Now, initially
01:37:39.733 --> 01:37:45.913
they said that he came in with the gun drawn, was basically doing his best Edward G.
01:37:46.053 --> 01:37:51.173
Robinson, Jimmy Cagney imitation, saying, I'm gonna take you coppers out, you see.
01:37:51.173 --> 01:37:56.573
I mean, just to the extreme, right, when it was totally the opposite.
01:37:56.933 --> 01:38:00.393
There was a woman who was beside Mr.
01:38:00.593 --> 01:38:05.353
Pretti who was also videotaping, and obviously she was saying some things,
01:38:05.593 --> 01:38:09.213
and one of the officers got offended and pushed her down in the snow.
01:38:10.799 --> 01:38:15.859
Mr. Pretti went to go pick her up. And in the process of picking her up,
01:38:16.219 --> 01:38:20.979
another officer came and pepper sprayed her and Mr. Pretti.
01:38:22.779 --> 01:38:25.939
And needless to say, he didn't take too kindly to that.
01:38:26.199 --> 01:38:34.199
But before he really could do anything, they grabbed him and started beating
01:38:34.199 --> 01:38:36.799
him and holding him down.
01:38:36.799 --> 01:38:42.719
And then one of the officers saw the gun this time.
01:38:42.919 --> 01:38:47.159
One officer saw the gun in his holster and pulled it out.
01:38:49.139 --> 01:38:56.799
Now, the audio is terrible, so you can't really hear anything but the gunshots, clearly.
01:38:58.239 --> 01:39:03.539
But I'm sure the officer said something to the effect of, I have the gun.
01:39:03.539 --> 01:39:12.819
Now, I am believing that the officers heard the word gun.
01:39:14.119 --> 01:39:17.199
And at first I thought it was...
01:39:19.666 --> 01:39:23.206
Thought it was like somebody accidentally might have, you know,
01:39:23.326 --> 01:39:26.286
fired off a round in all that commotion and chaos.
01:39:27.226 --> 01:39:31.686
And, you know, and then more shots were fired. That wasn't the case.
01:39:32.506 --> 01:39:38.646
It was like Mr. Pretti got up or he looked like he was about to get up and officers
01:39:38.646 --> 01:39:41.106
shot him directly in the back.
01:39:42.066 --> 01:39:48.606
And not once, not twice, but four times. And as he was rolling over from that,
01:39:49.006 --> 01:39:52.166
he got shot five or six more times.
01:39:52.826 --> 01:39:56.786
I'm still waiting on the autopsy to see how many bullets actually hit him,
01:39:56.886 --> 01:39:58.666
but at least 10 shots were fired.
01:39:59.246 --> 01:40:01.366
So basically they killed him.
01:40:02.706 --> 01:40:10.126
And, you know, the scenario I gave would probably be the best scenario they
01:40:10.126 --> 01:40:11.186
would have in their defense.
01:40:12.526 --> 01:40:21.126
But, you know, even if the person had a weapon, once you have them subdued like
01:40:21.126 --> 01:40:26.086
that, you know, if you have their arms, they can't shoot you.
01:40:26.526 --> 01:40:32.546
They can't. I have yet to see a human being who is being held down by police,
01:40:33.106 --> 01:40:37.166
both of their arms held down, and they're able to shoot somebody.
01:40:37.166 --> 01:40:39.926
I have never seen that happen.
01:40:41.442 --> 01:40:45.322
In the wildest Ripley's Believe It or Not moment, I have never seen that happen.
01:40:45.682 --> 01:40:53.242
And so for that officer to fire the weapon, again, just bad policing.
01:40:54.162 --> 01:41:00.822
And the way that he did it, he's going to get a murder charge if the state brings
01:41:00.822 --> 01:41:02.222
charges, and they should.
01:41:03.922 --> 01:41:11.122
He might get off with manslaughter, but he shot an unarmed man at that point.
01:41:12.022 --> 01:41:17.542
A person that was in control, a person that was subdued.
01:41:18.522 --> 01:41:22.082
The only thing they should have done at that point was put handcuffs on him,
01:41:22.422 --> 01:41:23.622
if that was their intention.
01:41:26.162 --> 01:41:30.842
So there's that. And then, you know, we've been trying to keep y'all abreast
01:41:30.842 --> 01:41:33.442
of what happened with, you know,
01:41:34.462 --> 01:41:38.202
Nekima Levy Armstrong, who's been on the show a couple of times and a couple
01:41:38.202 --> 01:41:43.442
of other folks that had organized a protest at a church where the leader of
01:41:43.442 --> 01:41:47.802
ICE in the state of Minnesota is a pastor.
01:41:48.502 --> 01:41:52.622
And we know that she got arrested, and then they used AI to make it
01:41:53.122 --> 01:41:57.922
seem like she was crying when she got arrested, which wasn't the case.
01:41:58.242 --> 01:42:02.462
Thank goodness for cameras everywhere because the camera in her apartment complex
01:42:02.462 --> 01:42:08.222
showed somebody that was not crying, and so did the cameras at the courthouse.
01:42:08.922 --> 01:42:15.882
At no time was she boo-hooing like they depicted and put on the actual White House website, right?
01:42:17.155 --> 01:42:22.455
So she got released. Her, all three of them got released.
01:42:23.095 --> 01:42:28.075
Now, as I'm recording, two of the reporters that were there,
01:42:28.255 --> 01:42:36.415
Georgia Fort, who is an independent journalist, like I said in the intro, out of Minnesota.
01:42:36.795 --> 01:42:42.435
She was like a local anchor there and basically started her own news service.
01:42:42.435 --> 01:42:50.495
And, of course, she's been covering, you know, any and everything dealing with
01:42:50.495 --> 01:42:57.275
the protests and all that, and she knows Nekima.
01:42:57.955 --> 01:43:07.615
And so she was at the first press conference Nekima and the coalition had right
01:43:07.615 --> 01:43:10.435
after Renee Good got killed.
01:43:12.035 --> 01:43:14.275
And, you know, just been keeping track.
01:43:14.475 --> 01:43:19.195
She's been keeping track ever since ICE showed up in force in Minneapolis.
01:43:19.895 --> 01:43:25.875
And so Don Lemon has, you know, he just gets on a plane and he goes wherever he wants to go.
01:43:26.255 --> 01:43:29.875
If he's not doing man-in-the-street stuff, he's going where the action is.
01:43:29.975 --> 01:43:31.495
So, of course, he was in Minneapolis.
01:43:32.135 --> 01:43:34.335
He got wind of this protest.
01:43:34.995 --> 01:43:38.075
Well, he got wind that some activity was getting ready to happen.
01:43:38.915 --> 01:43:44.615
And he kind of explains that if you follow him. He explained what was going to happen.
01:43:45.355 --> 01:43:49.575
He had no idea what they were going to do. They just knew some kind of
01:43:49.575 --> 01:43:50.675
action was going to take place.
01:43:51.095 --> 01:43:57.275
And so when Nekima and the group went in to protest, he and Georgia and all the other
01:43:59.519 --> 01:44:04.579
journalists that were covering it went in with them. And, you know, Don,
01:44:04.979 --> 01:44:10.599
being the aggressive person he is, he basically kind of got the pastor to the
01:44:10.599 --> 01:44:14.639
side, who was the guest pastor, I guess, or the assistant pastor.
01:44:14.879 --> 01:44:16.379
He wasn't the guy that was over ICE.
01:44:17.119 --> 01:44:21.739
But, you know, he started interviewing him. And then he interviewed like two
01:44:21.739 --> 01:44:26.199
or three parishioners of the church, you know, afterwards. And,
01:44:26.279 --> 01:44:28.979
of course, he interviewed some of the protesters.
01:44:30.779 --> 01:44:38.679
So somehow, Ms. Dillon, who is over the Civil Rights Division of the Department
01:44:38.679 --> 01:44:43.939
of Justice now, which is clearly an oxymoron in the Trump universe,
01:44:44.939 --> 01:44:50.739
she and Pam Bondi, the Attorney General, decided Don Lemon was the ringleader.
01:44:52.119 --> 01:44:56.099
And so they decided to go after him,
01:44:56.959 --> 01:45:00.059
and then they went after Georgia, and then
01:45:00.059 --> 01:45:05.899
they arrested Georgia as well. Because initially they couldn't get any charges
01:45:05.899 --> 01:45:11.999
when they got Nekima and the other organizers, and they couldn't get an indictment
01:45:11.999 --> 01:45:19.419
on Don. And I guess they kept working on it and doctored it up. I'm sure Pam was putting pressure on Ms.
01:45:19.519 --> 01:45:21.779
Dillon to come up with something.
01:45:22.419 --> 01:45:32.199
And so they got a federal grand jury to indict them, and they arrested them.
01:45:34.119 --> 01:45:39.239
And, you know, there were other journalists there, but you arrested the two
01:45:39.239 --> 01:45:41.359
black journalists who were there.
01:45:42.739 --> 01:45:48.719
So no coincidence, right? It just is what it is. You arrested the two black
01:45:48.719 --> 01:45:50.399
journalists that were there. Thank you.
01:45:51.508 --> 01:45:53.908
They must have had something to do with it because the protesters,
01:45:54.168 --> 01:45:57.948
the majority of them were black. So they must have had something to do with it, right?
01:45:58.868 --> 01:46:05.248
So, you know, meanwhile, in D.C., Donald Trump,
01:46:05.548 --> 01:46:12.788
you know, was attending a grand opening or premiere of his wife's documentary,
01:46:12.788 --> 01:46:23.248
which was basically a $40 million bribe from Amazon to the Trumps to stay in their good graces.
01:46:24.915 --> 01:46:32.155
It cost $40 million to do a documentary on, I think it's like 10 days before the election.
01:46:36.155 --> 01:46:46.535
Anyway, $40 million on a documentary. Anyway, so once they heard that Mr.
01:46:46.635 --> 01:46:50.235
Pretti was shot, they didn't think about, well, maybe we can move it back a day.
01:46:50.275 --> 01:46:52.055
Oh, no, the show must go on.
01:46:52.615 --> 01:46:57.895
It didn't happen here in D.C. It happened in Minneapolis, so why should we care?
01:46:57.995 --> 01:47:01.215
Now, the NBA stopped an actual basketball game.
01:47:01.495 --> 01:47:05.495
That didn't come from the Timberwolves. That didn't come from the visiting team.
01:47:05.615 --> 01:47:09.495
That came from the league. The league said, yeah, no, we're not playing a game
01:47:09.495 --> 01:47:11.195
tonight in Minneapolis.
01:47:11.555 --> 01:47:16.735
The NBA canceled a game, but the first lady of the United States couldn't put
01:47:16.735 --> 01:47:18.775
off a premiere for one day.
01:47:19.715 --> 01:47:22.935
I mean, those are the kind of people we're dealing with. It's just,
01:47:23.215 --> 01:47:29.435
it's to the point now where it's beyond being angry about it,
01:47:29.615 --> 01:47:34.635
which I encourage people to still be angry, but it's really ridiculous.
01:47:35.575 --> 01:47:42.555
It's, you know, there's anger and then there's frustration added to that.
01:47:42.675 --> 01:47:47.915
And the ridiculous part creates the frustration because we don't seem to have
01:47:47.915 --> 01:47:54.195
any real leadership, at least leadership that has backbone, right?
01:47:54.615 --> 01:47:58.875
You know, and there's some people I don't have high expectations for, like Henry Cuellar.
01:47:58.875 --> 01:48:04.855
Or, you know, after this young man was shot, and now there have been two people
01:48:04.855 --> 01:48:09.735
who have been killed, three people who have been shot in Minneapolis,
01:48:10.055 --> 01:48:12.255
but two people have been killed,
01:48:12.955 --> 01:48:18.395
which has basically tripled the homicide rate in Minneapolis for this year.
01:48:19.973 --> 01:48:25.413
So after that, they had a vote. Congress decided to actually take a vote on something.
01:48:25.833 --> 01:48:32.893
They voted on a budget package to avoid the shutdown, because that was
01:48:32.893 --> 01:48:36.113
looming at the end of January. That was looming.
01:48:39.013 --> 01:48:41.873
And so as I'm recording this, I don't know
01:48:41.873 --> 01:48:48.253
if the shutdown has happened or not. You know, we'll catch up if it did. But
01:48:48.253 --> 01:48:52.693
as of right now, it looks like it wasn't going to happen. And here's why. Because
01:48:52.693 --> 01:48:59.393
the shutdown could have happened in the House, but seven Democrats, including Henry Cuellar,
01:49:00.213 --> 01:49:05.453
sided with the Republicans to pass the continuing resolutions.
01:49:06.493 --> 01:49:10.513
Now, again, I don't have a whole lot of high hopes for Cuellar.
01:49:10.673 --> 01:49:15.693
Cuellar's been like, I don't even know, does he know the difference between
01:49:15.693 --> 01:49:21.553
a Democrat and Republican? I think he just runs as a Democrat because the district is Democratic.
01:49:22.273 --> 01:49:27.093
But, you know, he got Donald Trump to pardon him and then turned around and told
01:49:27.093 --> 01:49:28.613
Donald Trump, I'm not switching parties.
01:49:29.393 --> 01:49:31.073
So he gets over on everybody.
01:49:32.093 --> 01:49:37.793
That's just him. And if the people keep voting for him, you get what you vote for.
01:49:39.391 --> 01:49:47.231
But one of the seven decided to do a video, a young lady out of Spokane,
01:49:47.351 --> 01:49:53.051
Washington, who basically beat a super MAGA guy to get into Congress.
01:49:53.131 --> 01:49:59.131
But she's more or less like the House version of Kyrsten Sinema because she's
01:49:59.131 --> 01:50:05.391
voted against student loan relief and, you know, some other progressive things.
01:50:06.051 --> 01:50:11.851
And, you know, she tried to explain why she voted for the resolution, and it
01:50:11.851 --> 01:50:15.731
was just kind of like, probably better off not even doing the video,
01:50:15.931 --> 01:50:17.831
probably just take your lumps.
01:50:18.031 --> 01:50:20.431
She's got an opponent in the primary.
01:50:22.031 --> 01:50:27.171
So we'll see how that goes. So anyway, it passed the House and now it was over
01:50:27.171 --> 01:50:33.531
in the Senate and so, you know, the senators were kind of talking tough and
01:50:33.531 --> 01:50:36.251
saying that they weren't going to vote for it.
01:50:36.371 --> 01:50:40.191
The Democrats, as well as a couple of Republicans, said they weren't going to vote
01:50:40.191 --> 01:50:44.711
for the continuing resolutions until something happens at Homeland Security.
01:50:45.311 --> 01:50:48.771
You know, got to get these folks out of Minnesota.
01:50:48.831 --> 01:50:54.551
You got to, you know, get Kristi Noem out. Just, you know, there were a lot of demands.
01:50:55.651 --> 01:50:59.271
But, you know, and then Hakeem Jeffries got up there and said,
01:50:59.271 --> 01:51:05.591
well, if she doesn't resign, we're going to go forward with impeachment proceedings,
01:51:06.531 --> 01:51:07.851
you know, started drafting it up.
01:51:08.031 --> 01:51:15.351
Now, of course, he couched that by saying, you know, once the Democrats get
01:51:15.351 --> 01:51:19.511
control of the House, that's going to be the first order of business if she
01:51:19.511 --> 01:51:21.031
hasn't resigned by November,
01:51:21.451 --> 01:51:24.111
I guess. So...
01:51:26.095 --> 01:51:29.895
And supposedly they've worked out some kind of deal where they're going to let
01:51:29.895 --> 01:51:32.095
the other continuing resolutions go,
01:51:32.275 --> 01:51:39.775
but they're going to set aside the Homeland Security continuing resolution and
01:51:39.775 --> 01:51:46.895
change that to like keep it going for two weeks instead of, you know,
01:51:47.735 --> 01:51:52.175
voting for the package like they did in the House for the whole fiscal year.
01:51:52.175 --> 01:51:58.355
And then give them a couple weeks to discuss all of the demands that the Democrats are making.
01:51:58.675 --> 01:52:04.255
And supposedly they've agreed on that. Supposedly President Trump has given his blessing to that.
01:52:04.855 --> 01:52:14.215
And I'm like, it's just gotten to a point now where nobody will take the hard stand.
01:52:15.455 --> 01:52:18.755
Nobody will just say, you know what, just shut this down.
01:52:19.655 --> 01:52:25.795
I've been trying to get these folks to shut it down from the very beginning, right?
01:52:26.555 --> 01:52:32.395
If Chuck Schumer had shut it down the first time they had the opportunity to shut it down, cool.
01:52:32.575 --> 01:52:36.235
Then when you did shut it down, you didn't really have a plan.
01:52:36.635 --> 01:52:38.315
You didn't have an exit strategy.
01:52:38.915 --> 01:52:42.415
You just said, well, we got to shut it down because we got jumped on for not
01:52:42.415 --> 01:52:43.635
shutting it down the first time.
01:52:45.350 --> 01:52:48.630
You didn't have an exit strategy. When they were cutting off everybody's food stamps,
01:52:48.910 --> 01:52:51.550
you didn't know how to counter that, right?
01:52:52.570 --> 01:52:57.210
And of course, government employees were being laid off. And, you know,
01:52:57.430 --> 01:53:04.790
so that ended with no real changes in healthcare subsidies.
01:53:05.070 --> 01:53:12.250
People still didn't get the tax break or the subsidy to offset the health care costs.
01:53:13.010 --> 01:53:14.490
So now here's the third opportunity.
01:53:15.270 --> 01:53:21.690
We've had two people killed by the federal government and the government is still operating.
01:53:24.290 --> 01:53:27.450
We're making deals instead of making demands.
01:53:28.470 --> 01:53:38.030
I don't get it. I don't get it, but, you know, maybe people smarter than me do, but I don't get it.
01:53:38.030 --> 01:53:45.690
And, you know, this is the time where you use all the tools in the rule book to shut it
01:53:46.701 --> 01:53:51.441
down. Whatever parliamentary procedure you have to pull out,
01:53:51.701 --> 01:53:58.401
whatever no votes you have to give, whatever you need to do, this is the time.
01:53:59.101 --> 01:54:03.801
You know, the president is under the delusion that he's still popular.
01:54:04.581 --> 01:54:07.341
He's always going to think that for the rest of his life.
01:54:08.341 --> 01:54:14.221
People say that's senility, but he was that way when he was fully functional,
01:54:14.221 --> 01:54:17.781
if he ever was technically fully functional.
01:54:18.041 --> 01:54:22.021
But when he was much younger, he had these same delusions of grandeur.
01:54:22.201 --> 01:54:26.481
So that's not going to change. It's like his nature.
01:54:27.181 --> 01:54:31.721
But the polls are showing that people are pissed, right?
01:54:32.001 --> 01:54:37.581
And now they're planning this No Kings march in March.
01:54:38.341 --> 01:54:43.541
But I don't know how many people are going to be dead by then at the rate they're going, right?
01:54:44.041 --> 01:54:48.141
And then we got the flip side.
01:54:52.401 --> 01:54:59.861
Where we've got somebody like a Nicki Minaj who is so caught up in her personal
01:54:59.861 --> 01:55:03.901
agenda that she's tone deaf to what's happening.
01:55:04.581 --> 01:55:11.981
You know, she's trying to get some kind of relief for her husband,
01:55:11.981 --> 01:55:13.501
some kind of relief for her brother.
01:55:14.241 --> 01:55:19.841
And now she's got her Trump gold card, which means that she paid a million dollars to
01:55:21.801 --> 01:55:26.841
get permanent residency in the United States. Up until this point, she had not done that.
01:55:27.461 --> 01:55:32.121
She's paid taxes. She's lived in the country. She's done her thing,
01:55:32.121 --> 01:55:37.301
but she had not gone through all the steps to be at least a permanent resident.
01:55:38.221 --> 01:55:41.921
And I guess with all the stuff that was going on and some of the things she
01:55:41.921 --> 01:55:48.921
had said about Trump in the past, I guess she felt this was her time to get her manumission.
01:55:50.101 --> 01:55:53.921
And, you know, she got on Twitter and showed it.
01:55:55.801 --> 01:55:59.601
So, you know, it's just a terrible time.
01:56:00.261 --> 01:56:03.541
But this is the reason why we don't deify human beings.
01:56:03.701 --> 01:56:10.521
This is the reason why we don't glorify human beings too much or that we shouldn't do it.
01:56:11.521 --> 01:56:15.381
Because human beings are human beings, which means that if you put them up on
01:56:15.381 --> 01:56:18.801
a high pedestal, they're going to let you down, because they're going to be human.
01:56:19.581 --> 01:56:23.041
And I think, you know, if you're a fan of the music, great.
01:56:23.301 --> 01:56:29.761
But if you're part of the fan club, that's pushing it because fan is short for fanatic.
01:56:31.021 --> 01:56:37.701
But it's like, you know, a lot of these people that we are entertained by are not on our side.
01:56:39.861 --> 01:56:47.081
And, you know, they might rap about this or, you know, because of our culture,
01:56:47.081 --> 01:56:49.901
we embrace them for their athletic prowess.
01:56:50.301 --> 01:56:56.241
But a lot of them, once they get to a certain financial status, not on our side.
01:56:56.381 --> 01:57:01.281
I just watched Mike Epps' special and he was joking about the fact that when
01:57:01.281 --> 01:57:05.021
black folks get some money, they try to move away from black folk, right?
01:57:06.704 --> 01:57:15.964
You know, and so, you know, I know there's a lot of people that are disappointed about that.
01:57:16.644 --> 01:57:23.524
But as we stated, I am now 61 years old and I've been a black man all my life.
01:57:24.044 --> 01:57:27.304
So I'm used to it. I'm not happy about it.
01:57:27.624 --> 01:57:32.444
It's like, OK, they got another one of us to do their work.
01:57:32.444 --> 01:57:37.164
Right? You know, this whole thing popped off when they were trying to arrest Don
01:57:37.164 --> 01:57:44.124
Lemon. Nicki Minaj started going in on Don Lemon and using homophobic slurs and
01:57:44.124 --> 01:57:48.864
all this stuff. And, you know, I guess because Don went in on her for
01:57:49.544 --> 01:57:56.444
being on the show with Charlie Kirk's wife, right, or widow I should say.
01:57:57.564 --> 01:58:02.104
And that's a bizarre, weird thing.
01:58:03.484 --> 01:58:10.384
So, you know, we're in an age, you know, we're a long way from an age of heroes.
01:58:10.964 --> 01:58:17.684
We're in an age of ridiculousness. It makes sense that a show called Ridiculousness
01:58:17.684 --> 01:58:22.844
is the longest running show on American television right now.
01:58:23.224 --> 01:58:26.864
Because that's where we are. We're at this point.
01:58:28.924 --> 01:58:35.284
And if it wasn't killing us, if it wasn't oppressing us, it'd be funny.
01:58:35.944 --> 01:58:39.204
And we try to make light of it so we can get through it.
01:58:39.584 --> 01:58:45.764
That's why we value comedians like Jimmy Kimmel and Dave Chappelle and all these other folks.
01:58:45.764 --> 01:58:57.444
But this has really gone to a level that if you ain't trying to fix the problem, you are the problem.
01:58:57.744 --> 01:59:07.304
If you are not willing to shut it down, if you're not willing to give it your all, you know, then...
01:59:10.743 --> 01:59:15.403
But part of the problem, the old Army adage is lead, follow,
01:59:15.543 --> 01:59:17.323
or get out of the way, right?
01:59:18.163 --> 01:59:23.003
And the people that need to get out of the way are the biggest impediments to
01:59:23.003 --> 01:59:24.023
what we're trying to deal with.
01:59:25.703 --> 01:59:32.263
Again, like I said, man, this is a crazy time that we live in.
01:59:33.143 --> 01:59:43.083
I don't know what else to tell you, man. I just hope that we get some resolve, right?
01:59:43.623 --> 01:59:48.243
You know, I know our everyday lives dictate that we do something.
01:59:48.423 --> 01:59:51.223
We got to go to work and deal with whatever's going on at work.
01:59:51.363 --> 01:59:54.623
We have families and we got to deal with whatever's going on with our family.
01:59:55.283 --> 02:00:03.103
But I think the most important sacrifice we need to make is to make time to fight for this nation.
02:00:03.103 --> 02:00:09.803
Because anything that's going on in your life right now, if this nation falls,
02:00:10.203 --> 02:00:11.963
that's going to change it.
02:00:12.563 --> 02:00:16.783
You're worried about what school? In an authoritarian setting,
02:00:16.783 --> 02:00:18.383
you may not get that choice anymore.
02:00:19.143 --> 02:00:23.683
You know, what kind of job you got, you may not have that choice anymore.
02:00:25.646 --> 02:00:30.586
How much money you make, you may not have a choice anymore.
02:00:31.266 --> 02:00:38.166
You know, I can imagine, because we're all human beings, and human beings have evolved
02:00:38.166 --> 02:00:43.866
to a degree, but behavior-wise, not that much.
02:00:44.846 --> 02:00:48.886
I'm sure there were folks that were just trying to go about their everyday life
02:00:48.886 --> 02:00:52.246
in the middle of the American Revolution 250 years ago.
02:00:52.466 --> 02:00:57.626
I imagine. I imagine there were some people that were more concerned about what
02:00:57.626 --> 02:01:03.486
they were going to do at their blacksmith shop or candle making shop or whatever,
02:01:03.926 --> 02:01:08.746
more so than the politics that was going on, than the revolution that was ongoing,
02:01:09.426 --> 02:01:14.966
you know, that was coming forth, you know, or that was already taking place.
02:01:15.366 --> 02:01:17.946
Right. Shots had been fired. Right.
02:01:19.155 --> 02:01:23.735
There were people that were trying to cozy up to the British, you know?
02:01:24.235 --> 02:01:26.975
You know, there were black folks that looked at the British and said,
02:01:27.115 --> 02:01:28.355
well, at least I'll be free.
02:01:29.195 --> 02:01:34.195
Not understanding that all the ships that were bringing them in were British ships.
02:01:34.935 --> 02:01:40.375
And were taking their cousins to the Caribbean islands, right?
02:01:42.375 --> 02:01:47.575
And, but they figured, well, you know, if we side with the British, we'd be free.
02:01:48.235 --> 02:01:53.215
And there were those of us that sided with the colonists thinking,
02:01:53.675 --> 02:01:59.195
well, if we fight for the freedom of this nation, we might get our own freedom too, right?
02:02:00.995 --> 02:02:09.055
So I get it that there's no guarantees that things are going to be better if we get rid of Trump.
02:02:09.055 --> 02:02:17.655
But just like the gamble that was taken by Crispus Attucks and other blacks
02:02:17.655 --> 02:02:23.935
like him to side with the colonists, I just think the alternative would have been much worse.
02:02:25.515 --> 02:02:32.335
Because we won the war, when the Constitution was written, there was an effort
02:02:32.335 --> 02:02:35.155
on paper to end the slave trade.
02:02:35.395 --> 02:02:40.695
There was actually a date put into the Constitution to end the slave trade, right?
02:02:41.135 --> 02:02:46.015
Don't know if that would have happened if King George had succeeded.
02:02:46.655 --> 02:02:50.995
I mean, eventually it would have ended, right? Maybe.
02:02:51.635 --> 02:02:57.995
But there wouldn't have been a civil war to decide it, because we'd all still have been colonies of England.
02:02:58.955 --> 02:03:05.375
We'd all still have been part of the British Empire, which was exploiting everybody and everything. Right?
02:03:06.215 --> 02:03:13.995
I mean, at some point, we've got to show some resolve and fight for what is right.
02:03:15.515 --> 02:03:17.875
We've got to do that, eventually. Right?
02:03:19.461 --> 02:03:26.301
If that means that you might not need to go to choir practice, or you might not
02:03:26.301 --> 02:03:33.001
need to go out to dinner, or you might not need to go shopping, not grocery
02:03:33.001 --> 02:03:34.621
shopping, but just shopping,
02:03:35.141 --> 02:03:38.881
you know, an indulgence, if you will. Right.
02:03:39.441 --> 02:03:42.941
Because people have talked about, well, why don't we just have a national strike?
02:03:43.721 --> 02:03:45.761
We're too selfish for that to work.
02:03:47.081 --> 02:03:48.121
Nobody's willing to make the
02:03:48.121 --> 02:03:55.221
sacrifice for that to work. But I pray that we get to that point.
02:03:57.478 --> 02:04:02.938
It doesn't make sense now. It is ridiculous now. It is a joke now.
02:04:03.678 --> 02:04:11.598
These folks have succeeded in making something that took 250 years to build almost irrelevant.
02:04:12.418 --> 02:04:21.398
And for the few of us that still believe, you know, we just need the rest of y'all to trust us.
02:04:22.538 --> 02:04:27.138
Make that sacrifice. Make that sacrifice to give these people hell.
02:04:27.478 --> 02:04:35.118
And just, you know, at some point we just got to shut this down, because I can't
02:04:35.118 --> 02:04:42.278
imagine the leadership that we have now doing what needs to be done like Mandela
02:04:42.278 --> 02:04:46.438
and them did in South Africa or what we did,
02:04:46.758 --> 02:04:51.238
what the world did with the Nuremberg trials, right?
02:04:51.738 --> 02:04:57.838
I just can't see us doing that. I can't envision us doing the right thing and
02:04:57.838 --> 02:05:01.718
disbarring all of these lawyers doing this evil work.
02:05:01.918 --> 02:05:09.438
I can't envision us putting people in jail for sedition for what they've done to this nation.
02:05:09.958 --> 02:05:16.398
Right? Because all of this foolishness, all of this ridiculous stuff is basically sedition.
02:05:17.238 --> 02:05:23.158
That might sound harsh to people, but if you're trying to destroy the American
02:05:23.158 --> 02:05:25.878
government, that's what sedition is.
02:05:26.298 --> 02:05:33.038
But we want to just say, oh, it's politics, da-da-da-da-da. Nothing is normal. It is not normal.
02:05:33.938 --> 02:05:42.458
So I just want to end with this. I hope and pray that, when we decide we've had enough,
02:05:42.458 --> 02:05:45.898
we can be unified enough to let them know.
02:05:46.398 --> 02:05:52.658
And shut it down. Now, whatever comes after that, we'll deal with that.
02:05:54.715 --> 02:05:58.755
We took the gamble 250 years ago to break away from an empire.
02:05:59.475 --> 02:06:06.655
There were a lot of bumps on that road because there were some mentalities that we needed to overcome.
02:06:07.275 --> 02:06:09.495
But now we're at another point.
02:06:10.415 --> 02:06:15.315
We're at another road marker. And we're going to have to make a decision.
02:06:15.915 --> 02:06:23.615
You have to overcome your fear, your timidity, and your apathy, because of this
02:06:15.915 --> 02:06:23.615
stuff, this ridiculous state that we're in.
02:06:27.835 --> 02:06:36.935
Oh, I didn't even mention, they came and took the voting files from the 2020 election in Georgia.
02:06:37.575 --> 02:06:42.715
And you got Tulsi Gabbard, who is trying to stay in good graces.
02:06:43.955 --> 02:06:48.815
She had no input about what was going on in Venezuela, had no input of what
02:06:48.815 --> 02:06:53.055
was going on with Iran or any of these international conflicts.
02:06:53.055 --> 02:06:54.895
Or even in the Greenland conversation.
02:06:55.435 --> 02:07:00.215
But they sent the Director of National Intelligence to go get voter files from
02:07:00.215 --> 02:07:05.795
the county warehouse in Georgia, in Fulton County, in Atlanta.
02:07:06.735 --> 02:07:08.455
All of this stuff, man.
02:07:09.675 --> 02:07:14.415
Somebody said maybe they're going to throw Tulsi in there.
02:07:14.555 --> 02:07:18.775
Maybe they're going to try to get Maduro to say that Venezuela had something
02:07:18.775 --> 02:07:21.915
to do with the 2020 elections being rigged and Donald Trump losing.
02:07:23.055 --> 02:07:27.775
I wouldn't put it past them because we're at a point of everything being ridiculous.
02:07:29.175 --> 02:07:32.055
So guys, just y'all chew on that.
02:07:32.415 --> 02:07:36.355
And y'all say, well, Fleming, you know, you, you can say all that,
02:07:36.475 --> 02:07:38.895
but I got to live my life. I feel you.
02:07:39.815 --> 02:07:45.735
But when your life is totally shattered, when you had a chance to do something
02:07:45.735 --> 02:07:49.315
to stop that, you got to deal with that too.
02:07:50.255 --> 02:07:53.235
All right, guys, thank y'all for listening until next time.
DAVID ELIOT is a PhD candidate at the University of Ottawa, where he researches the social and political effects of artificial intelligence. He is a member of the Critical Surveillance Studies Lab, and his work on AI has been recognized with numerous awards, including the 2022 Pierre Elliott Trudeau Foundation PhD Scholarship. His first book, Artificially Intelligent: The Very Human Story of AI, was recently published by University of Toronto Press.
Congressional Candidate
I'm a 35-year-old working-class mother of two on my 3rd run for the U.S. House as a Democrat in Idaho's First Congressional District. I was recruited to run out of my freshman year of community college, where I double majored in criminal justice and political science, because it was nearly impossible to find candidates in a district that receives no financial support or attention. I grew up with a single mother who worked 2-3 jobs at a time, was very politically active as a teenager, being an advocate and public speaker for diverse students in Idaho's foster and school systems, and managed local state rep. campaigns in '08 before life went a different direction. I met my husband 16 years ago, had my daughter at 21, and became a stay-at-home mom who worked on the side because we couldn't afford childcare. I spent eight years at home, had my son, and became a foster mom before deciding I needed to get involved in public policy again and going to college at 30 years old. I have spent the last four years learning how to do rural progressive politics differently, how to communicate to rural voters successfully without compromising my progressive policy: putting the focus on working-class families and people, holding elected officials accountable, and organizing in a rural district that has been left behind by national progressives for nearly 3 decades.