Sci-Fi IRL
Released on 12/07/2023
Hi. Hello. Good afternoon, welcome to Sci-fi IRL.
Or as I've been calling it all week, Sci-fi-RL.
That, for the one or two of you
who are baffled by those abbreviations,
means science fiction in real life.
As the voice of God said,
I am Jason Kehe, I'm a Features Editor at Wired.
And most of the science fiction we've published,
I'd say in the last 10 years, I've probably edited,
including work by at least two of our three panelists today.
And you can read about their many books and distinctions
in the program.
We don't have a ton of time,
so I just wanna bring them up here as fast as possible.
So please help me welcome Charlie Jane, Annalee,
and Yudha to the stage.
[upbeat music]
Oh, fantastic. So let's get started right away.
It sort of strikes me that when
technologists imagine the future,
when they predict the future, we call it futurism.
When writers do it, we call it science fiction.
But maybe I had that wrong, or maybe I'm oversimplifying.
So Annalee, let's start with you.
Do you consider yourself a futurist?
I think that I do futurism more in my non-fiction
than my fiction.
I do think it's useful to have a distinction
between futurism, which is often hopefully evidence-based,
unless it's about paperclips,
in which case it's sort of based on faith.
And then science fiction,
I think we're able to bring in our imaginations more.
We can kind of bring in a tooth fairy or two
to make things happen.
Whereas in good futurism, I think,
you always wanna be keeping it as real as possible.
Charlie Jane, same question. What do you make of futurism?
Do you consider yourself doing it?
I mean, I sort of think of it as like a side hustle,
to be honest.
I don't know.
I mean, I love what Annalee said about faith-based futurism.
I think we're dealing with,
we're actually struggling under the weight
of a lot of faith-based futurism right now.
And the job of speculative fiction authors
is basically to be apostates and to say,
Yeah, your beautiful candy-colored,
candy-covered future that you've imagined
probably isn't gonna come to pass,
and you haven't thought through any of the possible
second order impacts of all the shiny technologies
you think we're gonna have miraculously in the future.
And so I think skepticism is what
a good speculative fiction author can bring
to the kind of pie in the sky futurism we often see.
And I should add for the audience, real quick,
that all three of our panelists write both fiction
and nonfiction.
I wanna ask Annalee about her upcoming book
a little later, but Yudha, where are you beaming in from?
I'm in Singapore right now.
So I'm from Sri Lanka, but right now I'm in Singapore.
It's about four in the morning.
And I'm in a hotel room that's, let's just say,
it's basically booked by the hour.
[audience laughing]
It's the cleanest place that I could find
at four in the morning.
[Jason] Well, we really appreciate you being here
at such an insane time.
Futurism. Do you consider yourself doing it, ever?
As a writer, as a thinker?
I've been accused of it, yes.
I think of it as rolling the dice
and seeing where they fall.
I'm very wary of painting myself as a futurist
because of a lot of the things that I see in futurism.
I'm from Sri Lanka,
a country which is mostly known these days
for economic collapse and before that, for war.
So some of the things I read in these novels, I stop and go,
Well, that's not too bad, that's just Tuesday.
So I'm sometimes hesitant to put myself into that camp.
So I think of it as rolling the dice.
Some of my all-time favorite writers,
I'm guessing they're also some of your favorite writers,
have made versions of the remark
that science fiction as a genre is not predictive,
it's descriptive.
That's Le Guin. Gibson says the work is not prescient.
Ray Bradbury said, I'm not a predictor of futures.
I'm a preventer of futures.
So when I run down some of those lines,
what does that make you think?
Do you agree with those?
Do you consider yourself part of that lineage?
Or are you doing something different, Charlie Jane?
I mean, is it predicting the future
to say if you lick a light socket,
you're not gonna have a nice time?
Like is that prediction or is that just like,
Hey, don't lick light sockets, people!
'Cause I feel like a lot of what people consider
to be prediction,
especially when it comes to like dystopian
or apocalyptic science fiction,
is literally just don't lick light sockets.
I love that.
The don't lick light sockets PSA version of science fiction.
I mean, I think in some ways that's true,
that dystopian science fiction,
which now is quite mainstream
and is often just considered drama, really is a PSA.
Like, yeah, if you keep releasing carbon emissions,
the planet's gonna warm up.
We're gonna have more storms.
I think that science fiction,
when it's not being kind of near future dystopian,
does have a chance to be not predictive,
but just opening up a bigger map
that allows us to see what we're doing in a larger social
or cultural context, a larger temporal context,
and just make us think about that,
and kinda consider the implications,
the second and third-order effects
that we were talking about before.
Yudha, anything to add?
Nope. Agree completely.
It's a bit like The Garden of Forking Paths, right?
You take an idea and you test it and you sort of logic,
you try to extrapolate where this can go.
And science fiction is a really good canvas for that.
Doesn't necessarily mean that it might come true
in exactly that fashion, but along the way,
you might learn really useful lessons.
Like, let's not lick the light sockets.
I do wanna try to unpack this a little bit more
because the fact is so much of the time,
the futures in science fiction do come to pass in some form.
I mean, most people in the tech world,
from Elon Musk and Peter Thiel,
all the way down to some coder hacking away
in her parents' basement, read science fiction.
Not only do they read it,
they kind of maybe not even so secretly
want to make it real.
I mean, flying cars, teleportation, AI,
these were all in science fiction for many, many decades.
And now they're all starting to come to pass.
So is there, I guess the question is,
is there an obligation on the part
of a science fiction writer
to responsibly predict the future,
knowing that it might inspire a generation
of real world technologists
who then want to go on and build it?
I mean, one of my favorite examples of this is,
the classic dystopian novel 1984,
which predicts telescreens.
And then clearly Silicon Valley was like,
Great, let's have Zoom. Let's have social media.
Like telescreen sounds so awesome, badass.
You never turn it off and it can always see you?
That sounds great!
Think of all the marketing that we can build around this.
So I think George Orwell was not sitting there,
hoping that his invention would become something
that VCs were funding in the future.
And I don't know how he could have been more responsible.
He was basically like, This will turn your world to shit.
And yet it still became a marketing,
it still became part of a business plan.
So I think that our responsibility
is really to engage with our audiences
and to make sure that we don't just put a story
into the world without also kind of engaging
with factual issues
and kind of talking about the real-world issues
that go along with the fiction.
Yeah, actually, it's a shameless plug.
Annalee and I did a series on our podcast recently
called Silicon Valley versus Science Fiction.
And our podcast is called Our Opinions Are Correct.
And we kind of delved into some of the ways
that Silicon Valley kind of either deliberately,
or kind of inadvertently misinterprets speculative fiction.
And often when they say that they're inspired
by science fiction, that's just marketing.
It's a marketing thing.
They just are like,
This is a thing that people have seen in popular media
that we can say we were inspired by.
Or they're kind of just taking what Annalee said,
the kind of dystopian stuff from a dark future
and then making it look shiny for consumers.
But I think that the idea of responsibly predicting
the future feels like a trap to me.
Like it feels like you can't possibly,
whenever you write a story,
you can't possibly know how people are going to receive it
or what they're gonna get from it.
And you just have to tell the best story you can.
Is there, I think I remember an op-ed you wrote,
Charlie Jane, several years ago,
in maybe the Washington Post, about climate fiction?
Is this ringing a bell?
Yeah, yeah, I think so. Yeah.
You, I think, called on all science fiction writers
to incorporate some climate element into their storytelling.
I think you may have even said
there was a moral obligation to do so.
So I wonder how you think about that now?
'Cause that was a few years ago.
And if it plays a part in what I'm sort
of asking about the responsibility of writers
to imagine futures a certain way?
Yeah, I mean, thank you for asking that.
'Cause that is something that's really close to my heart.
I think the crux of my argument
isn't just that you have a moral obligation
to warn about climate change, which I think is true.
But it's really that if you ignore this huge thing
that's on the horizon, and now even more so than, like,
whenever I wrote that op-ed, right?
I think it was almost four or five years ago.
Now, it's looming even more on the horizon.
If you're like,
Yeah, my novel takes place 20 years from now,
and there's no climate change.
That's not realism.
Like realism is acknowledging that 20 years from now,
we are going to be really, really up to our necks in it.
And there's no way to imagine a future
or depict a future of the 2040s
where climate change isn't a huge, big deal
at this point.
So it's just like,
Are you gonna write about what's actually gonna happen?
Or are you gonna just be like,
'Oh, the magical fairies came down.'
And you can write a fantasy novel where fairies came down
and just sucked up all the carbon.
They were carbon-eating fairies,
who just came in, nom, nom, nom.
These fairies have been starving for some CO2,
for thousands of years in fairy land.
And then they found a rich source of it.
But other than that, no, you have to deal with it.
Yudha, any thoughts on moral obligations?
The responsibilities of writers?
Well, I used to be of the camp
that we have a moral obligation to address certain things.
As of late, my view has changed.
And it's changed on a very personal scale,
which is Sri Lanka, like last year,
went through an economic crisis, mass protests,
the whole nine yards.
And most of my work also involves being a journalist.
We mapped 600-odd protests as they happened.
We used satellite imagery to locate a mass grave.
And the strain of this work, I would say,
forced me to think about writing in a different way,
which is to take the brakes off and say,
Okay, there's no moral obligation to do anything.
There are things that are on the horizon, yes,
but it's up to the individual
whether they want to talk about it or not.
So I can only answer for myself in this,
which is I've started to approach my fiction
from a perspective of, What ideas am I interested in?
Because from where I sit,
the world is broken in so many ways.
And if I were to write about that,
I would write forever about things that depress me
and I would end up killing myself.
So I've tried to strike a balance in my own practice,
if you will, and sometimes I fail.
In 2021, I wrote a novel about Sri Lanka failing,
the economy being artificially collapsed,
mass protests, and the President being thrown out.
In 2022, that happened. And people gave me an award for it.
I absolutely hate that it happened.
I was in that crowd as it happened,
and that was not a prediction I ever wanted to come true.
So I would start to think of it from the perspective
of what concerns us,
what issues do we really want to talk about?
What do we have funded?
I'm staying away from the moral responsibility
of science fiction, or this question altogether.
I know it's a cop-out answer, but...
No, I like it.
I do wanna talk now about optimism then,
since you sort of alluded to it.
And I think optimism, maybe as a philosophy,
sometimes gets a bad rap,
or dismissed as intellectually unserious.
Or I don't know.
I'm curious how each of you thinks about optimism as writers
of science fiction when you are imagining futures
that could be shitty or could be hopeful.
Well, I've just spent the past couple of years
working on a nonfiction book
that's coming out next year that's a history of PSYOPs
and culture war in the United States,
which was very depressing.
There isn't a lot of hope in that field.
But one of the ways that I dealt with it,
because a lot of it was very heavy,
like a lot of learning about
how people have been emotionally tormented for 200 years.
I, at the same time wrote a novel called The Terraformers,
which is out.
And it's actually about a revolution on another planet,
which is relatively successful as revolutions go.
And I was imagining the book in some ways,
the fiction book as a counter PSYOP.
And I think it was because I was so immersed
in this language of PSYOPs
that I started thinking of how a story itself
can actually embed itself into people's minds
and provide a way out of some of this dystopian thinking
that we're stuck in.
And there's a reason we're stuck in dystopian thinking.
Like, things are going badly. I'm sure you've noticed.
And I think that being able to have stories
that map out a way beyond warfare,
a way that goes beyond constant resource extraction
is really important.
And so, I don't know if I would say that I'm optimistic,
but I am definitely in favor of attempting
to put down our weapons to use stories
as a form of peacemaking.
Charlie Jane, I mean, do you see? Yeah.
Yeah. I mean, I think everything I write is hopeful.
I don't write anything that's like despairing
or overly depressing or doom-saying.
I feel like everything I write is hopeful
about our ability to surmount challenges.
And I think that's what part of what science fiction
is really good for.
Science fiction is the literature of problem-solving.
And it's great for imagining ways
that we can get past some of the challenges
that we are undoubtedly gonna be facing.
And by the way,
when I say you have to deal with climate change
in your work,
I don't mean that you have to write about climate change
or that it has to be what the story is about.
It can be in the background.
It can be like there's all this other stuff going on,
that's your story.
But then we have to at least be aware
that climate change is a thing in the world,
is sort of where I'm at.
But yeah, I think hope.
Annalee mentioned Terraformers.
I'd love, Charlie Jane,
if you can mention a book that maybe speaks
to some of these themes, tell us about it.
Tell us, talk about the hope involved in it,
and how you kind of conceptualized it.
A book that just seems relevant to this conversation.
Wait, so, of your own?
Oh, you mean All the Birds -
Just plug one of your books, basically.
Oh, okay. I mean -
Tell us about your work, Charlie Jane.
I mean, I feel like all my books,
like All the Birds in the Sky
is probably the one you're thinking of.
That's the one that's like about technology and nature
and the ability to kind of see them
as parts of a whole, rather than in opposition.
And to show that like we can actually,
they can work in harmony, and we can,
through human connection,
through learning to understand each other,
we can actually build a world
where technology and nature are harmonious,
rather than opposites.
That's definitely the book that
where I deal with that most directly.
But all of my work, I mean my recent young adult books,
are all about that as well.
About, like, people...
Like Victories Greater Than Death
is all about people learning
to kind of understand each other
so that they can join together
and solve huge intractable problems.
That's like a theme that's like,
that really resonates for me.
Yudha? A book of yours, I'd love to hear.
Huh. I'd say The Salvage Crew.
Since there's a bunch more stuff to come,
but I would definitely say The Salvage Crew,
which is about a C-tier crew of humans and an AI poet
who was once a human,
just dropped onto a planet with barely any resources.
One of the most horrible jobs
that a person can undertake in the universe.
And they end up meeting something other.
And that something other recognizes the AI poet,
not because of its machinery,
but because it's playing the language game.
I've sort of,
I think I've adopted the pessimism of the intellect,
optimism of the will approach in writing these things.
This panel is called Sci-fi IRL.
And I think a lot about the question
of whether our lives are science fictional.
Has technology science fictionalized our lives?
Is the future here, in some sense?
I'd be curious what each of you think.
Do you think we're living a science fictional reality?
Does that question even make sense?
I mean, science fiction has become
an incredibly mainstream form of entertainment,
whereas earlier in the 20th century,
it was kind of this marginal weird thing
that a bunch of dorks did.
And I'm not saying that dorks aren't still doing it,
like, Go, my dork brothers and sisters.
But I think that we like to imagine
that we're in a science fictional world
because that would be great
if we really were making the future come true that we want.
And I wanna bring this back to
what Charlie Jane was saying earlier about science fiction
kind of functioning as marketing sometimes?
I think this is a story we like to tell ourselves,
but it really masks some of the ways
that we're in fact not living in a shiny future.
And lots and lots of people don't have access
to the technology that we have
saturating this room right now.
So it's a wonderful fantasy.
I think many science fictional stories are something
that we should aspire to, but no,
we are not living in science fiction.
We're still stuck in the Middle Ages.
I mean, I said before
that science fiction is the literature of problem-solving.
I think science fiction is also the literature of change.
And to the extent that it feels like the world
is changing more rapidly,
that's when science fiction becomes more relevant.
But that's very subjective.
I want to add to what Annalee said, which is that yes,
we have this story saying
that science fiction is a predictive field, if you will.
But that claim is always a bit off
because to make that claim,
you kind of have to survey every piece
of science fiction ever written,
map out all the predictions there and say,
Yes, this percentage of predictions came true.
And if you did that,
I'll wager you'll find that the rate is actually quite low.
But I am going to give three anecdotes
as to why this does feel a little bit science fictiony
to me.
So firstly, growing up,
I got access to the Internet in about 2012.
That was about the time that broadband
really arrived in Sri Lanka, right?
And until then it was this junkyard Pentium II
and me trying to play Caesar III on it.
And suddenly there was this thing
where I could go out and learn about things
that I would otherwise never have been able to.
And I suddenly felt like I had access to the sum total
of human knowledge at my fingertips.
The Internet as a whole, the connections that we have,
the fact that I'm presenting here
even though we are thousands of miles apart,
we are having this conversation.
I'm able to read people's books, they're able to read mine.
Many of the sort of technological progressions
that we see do feel science fictional.
But it's uneven.
Whereas science fiction, the sort of stuff that we looked to
as reference markers, used to imagine better infrastructure.
And this is a thing that we really seem to have failed at.
Better infrastructure for humans.
Not just concrete, but also better legal infrastructure,
better social infrastructure.
And we seem to have kind of shit the bed
on all those fronts.
What we do have is an incredibly advanced ARM Cortex-whatever processor
and another 300 megapixels of camera.
And those keep increasing at a rate
that's truly mind-boggling.
I just went out and bought a VR headset today.
I'm looking at it and going, Wow, this is amazing.
But so much of what we imagine,
so much of what we consider to be the common,
so much of what we consider to be essential
for the functioning of the civilization,
it seems like we've let those stagnate.
So it kind of feels like the architectural bedrock
of our civilization is still somewhere around the 1960s,
and we've just bolted on a bunch of stuff on top.
It is progress, yes.
But it's not really a holistic type of progress.
It's a very scattershot type of progress.
I'm hoping this leads to progress in other fields,
but it feels weird that our infrastructure is way behind
what it should be.
Our energy infrastructure is way behind what it should be.
Our ability to deal with climate change
is way behind what it should be.
Our progress on human rights, on norms,
on prevention of warfare,
these things have fallen behind,
and we've just sort of let them go, which feels odd.
So it's half-half.
Some parts of it are very futuristic.
Some parts of it are just,
Why? This just still feels like Cold War era stuff.
Okay, we have time for probably one more question.
So I have to ask about artificial intelligence,
which not only have you all written about,
you've all written about it before this current boom.
I mean Annalee's first book was called Autonomous.
Charlie Jane, you wrote a piece for Wired
about AI and journalism in 2020,
and maybe even 2018, I think?
Yudha, you wrote a piece about AI and advertising
for one of our science fiction packages.
So now that it's kind of come to pass in some form,
is this what you thought would happen when it happened?
How's it playing out from your perspective?
I mean, I don't think it has come to pass.
You mean corporations pretending that they have a product
that does things that it doesn't do?
Yeah, I thought that would happen.
Yeah. I am both excited,
and often, like, shaking my head at it.
So for The Salvage Crew, for example,
the main character was a machine poet.
So back then GPT-2 was pretty hot
and had just come off the press.
I retrained GPT-2 for the poetry of the machine poet
because it felt natural and I wanted to experiment
and I ended up building these little Markov chain systems
for simulating weather and character events
and sort of sitting in the middle
of like a nest of half-baked R and Python programs
with things feeding information to me.
And it felt interesting.
Not my usual process, but interesting.
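(Purely as an illustration, not the author's actual code: a minimal sketch of the kind of little Markov chain weather simulator described here, where the next in-game weather state is drawn from transition probabilities conditioned on the current one. The states and probabilities are made up for the example.)

```python
import random

# Hypothetical transition table: probability of moving from each weather
# state to the next. Each row sums to 1.
TRANSITIONS = {
    "clear":    {"clear": 0.6, "overcast": 0.3, "storm": 0.1},
    "overcast": {"clear": 0.3, "overcast": 0.4, "storm": 0.3},
    "storm":    {"clear": 0.2, "overcast": 0.5, "storm": 0.3},
}

def step(state: str) -> str:
    """Draw the next weather state from the Markov transition probabilities."""
    nxt, weights = zip(*TRANSITIONS[state].items())
    return random.choices(nxt, weights=weights, k=1)[0]

def simulate(days: int, start: str = "clear") -> list[str]:
    """Run the chain forward for a number of in-game days."""
    states = [start]
    for _ in range(days - 1):
        states.append(step(states[-1]))
    return states

if __name__ == "__main__":
    print(simulate(10))  # e.g. ['clear', 'clear', 'overcast', 'storm', ...]
```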
And then I saw the tilt,
which is where this went from being interesting and cool
and weird and experimental to,
Oh no, the crypto folks and the NFT people
have stumbled on this
and I'm now profoundly bored by the sheer amount
of idiotic stuff coming out.
That being said,
I'm also extremely excited at what the open source community
is doing.
I see that as a lovely counterweight.
There have been incredible efforts by some people,
particularly from Meta and so on and so forth,
releasing the Llama and Mistral models,
the fine-tunes on top of that,
Gerganov's work, the quantization stuff.
So I'm hoping that there's a future
where all this stuff comes back to the hands of people,
people being able to individually run their models
without necessarily having to open up Twitter
and see 100 crap skins on top of an OpenAI API.
Because that is a very boring and also kind of shit future.
Whereas the other direction is a little bit interesting.
The other direction is gaining traction again.
So I'm interested again.
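(As a rough illustration of that "run your own model" future, a minimal sketch assuming the llama-cpp-python bindings to Gerganov's llama.cpp: the quantized GGUF file path and the parameters are placeholders, not anything specified on the panel.)

```python
from llama_cpp import Llama

# Load a quantized open model entirely on the local machine -- no hosted API involved.
llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=2048,   # context window
    n_threads=8,  # CPU threads used for inference
)

# Ask for a short completion and print the generated text.
output = llm(
    "Write a two-line poem about a salvage crew on a distant planet.",
    max_tokens=64,
    temperature=0.8,
)
print(output["choices"][0]["text"])
```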
Well, on that note,
please buy the books by these three writers.
They really are some of the most exciting writers
in the genre today.
And so please give them a round of applause.
[audience applauding]
Thank you, Yudha. Thank you.