I like to balance out all the left-wing propaganda I enjoy by reading a news aggregator called Hacker News. It’s a Reddit offshoot created by a venture capital firm, so it has the predictable right-libertarian tech bro-ey-ness.
For instance, there is a kind of interesting (to me at least) discussion around a dystopian sci-fi film which can be found here:
https://news.ycombinator.com/item?id=37807281#37808591
Here’s one of the comments I found interesting; it’s a pretty normal question that I think points out a specific inconsistency in how right-libertarians understand individual action:
The theme with so many of these movies is “once we created AI it is out of our control!”, which I guess is a specific version of “humans and especially scientists shouldn’t play God” angle. It always felt very… I dunno, ignorant of our own ability to maybe not do something?
Like, fear of “the singularity” is narratively appealing maybe, but it’s ignoring that in the real world problems like climate change or enshittification through [whatever buzzword tech] are actually the result of constantly passively choosing to stay on a destructive path.
Is there any sci-fi in which all incoming disasters are perfectly or mostly preventable but the real problem is that people in power just choose to ignore it for their own short-sighted gain? Aside from “Don’t Look Up”, I mean.
vanderZwan
This was my response:
I get your point that things don’t have to go certain ways.
However, there are plenty of systems where a giant and real public good isn’t fought for because the “good” is highly distributed while the “bad” that kills it produces highly concentrated profits for a few folks.
These are systemic issues because the massive group of folks who are passively harmed have a difficult or impossible time “choosing otherwise” or fighting back or whatever, because there are so many divergent bad actors operating on them.
The only way to avoid that situation is to change the underlying system so that it is more difficult to concentrate the outcomes for a few people; very few people here want to institute those kinds of changes, and they are correct in understanding them as broadly anti-capitalist.
That’s neither here nor there; even if you can’t accept that political stance, the trope of “incoming disasters [which] are perfectly or mostly preventable but the real problem is that people in power just choose to ignore it for their own short-sighted gain” is literally everywhere.
Dr. Strangelove can be understood this way, though for “MAD” rather than climate change. The “Alien” films (especially the second one, “Aliens”) can be understood that way. Hell, “Who Framed Roger Rabbit” can be read that way, but for public transportation.
“I just wanna make this cool tech so I will ignore how it impacts other people because I see the short term benefits” is literally everywhere in this culture.
If someone could make $2MM releasing Skynet and knew it would result in the entire plots of the first three Terminator films occurring, I have zero doubt someone would do it.
I don’t think we, in general, are “ignorant of our own ability to maybe not do something?” I think most of us have a very realistic understanding of how the people who unleash their technology into our shared ecosystem have worked historically.