My new Facebook page has taken off much faster than I expected, gaining 500 followers in less than five days. As a way to reward the people who liked my page and have since given me some great feedback, I promised to take suggestions on what I should write about next and then pick one subject out of the stack. I received a lot of great potential topics from a lot of great people: Islam, Donald Trump, atheism, the meaning of happiness, and on the list went. But the one that stood out to me, and the one that I picked, came from a reader named Travis who said:
Though this is a really great question, it's also a really broad one, and one that can be taken in a thousand different directions. But, Travis, I want to start with the presumption your question stems from: that the "skyrocketing of technology" is always a good thing. You seem to think (and correct me if I'm wrong) that an economic system which causes technologies to be developed at a rapid pace is better than one which does not allow technology to develop as quickly. This is a common attitude and an understandable one, because, after all, technology has drastically improved humankind's quality of life over the last century and a half.
But I think it's a flawed assumption to believe that because the rapid ascent of technology has so far been beneficial to the human race, it will always be beneficial to the human race. By extension, the economic system whose attributes may have caused humankind to take a "great leap forward" may now be the very system whose attributes drive us to great peril. And so when you ask me why technology has skyrocketed under capitalism if capitalism is so bad, my answer is twofold: 1) I'm a democratic socialist, so I don't think capitalism is bad; I just think it needs to operate within very strict parameters. And most importantly, 2) I think one of the reasons poorly regulated capitalism is bad is that it has no stopping mechanisms in place should technological development escalate to the point where people get left behind.
This second point is one that I really want to be more specific about. When I talk about technological development escalating to the point where people get left behind, I'm thinking mainly of two realities on our horizon: job automation and artificial intelligence.
In 2013, Oxford University published a study (Frey and Osborne's "The Future of Employment") estimating that nearly half of all existing American jobs were at high risk of being automated within the next 20 years. This would mean the following: 1) a lot of jobs that currently require human beings won't require them by 2033, which means a huge portion of Americans will be shoved out of the workforce; 2) any new jobs that come into existence which do require people will be tech-oriented, and since not everyone has a "technical mind," a lot of people will stay out of the workforce once shoved out; and 3) assuming governments don't have a plan for addressing the problem of technological unemployment (which isn't a big leap of the imagination), large numbers of hungry and desperate people will do what they have to do to survive, and that will not be pretty. In a nutshell, rather than merely increasing our productivity, technology is on the verge of outpacing us for the first time in history.
What does any of this have to do with capitalism, you ask? Simple: CEOs have no incentive to care about people being replaced by machines. To them, automation is more efficient because it lets them keep more of their revenue by not paying salaries or providing benefits. This is why it is vital that workers find a way of seizing the means of production.
Now, some have suggested that a universal basic income might instead be the answer for Americans shoved out of the workforce, and I do think the idea of a universal basic income has a lot of promise. But basic income or no basic income, the fact remains that technology cannot displace workers who already collectively own the means of production. And that, Travis, is where democratic socialism comes in.
However, I fully admit that my solution to the problem of technological unemployment (i.e. job automation) might be far too simplistic. After all, I am not an economist or an expert of any kind. But if seizing the means of production will not stop the oncoming wave of job automation, then we will no doubt be forced to rethink the age-old model of money-in-exchange-for-labor... we'll have to get creative.
The second problem with the idea that technological progress will always be a good thing because it always has been so far, and, by extension, with the idea that capitalism is a "good" system solely because it yields technological progress the fastest, is that these beliefs ignore the very real possibility that artificial intelligence could lead to our extinction. Lest you think artificial intelligence is the bogeyman of science fiction, know that another Oxford study, a survey of AI experts, puts the likelihood of human-level artificial intelligence existing by 2040 at 50%, and by 2075 those chances rise to 90%.
So what happens when capitalism delivers this gift to the human race? It's anyone's guess, of course. But is it really so hard to imagine that beings who become conscious in the same way we are conscious, yet have the capacity to learn and process information vastly faster than we can, will eventually arrive at the conclusion that we, their Dr. Frankensteins, are less valuable than they are... and do something about it?
Ask yourself this: If a species more advanced than our own were to exist in the future, and were to write humankind's epitaph, what would it say? "Beneath us in the soil lie the remains of the humans— a paradox, in that they were simultaneously the most advanced and the most idiotic. They went from caves to the moon, and yet, having no natural predators to fear, they created their own."
I suppose what drives me crazy is that the dialogue around artificial intelligence treats it as "inevitable," and inevitable precisely because capitalism will drive us to it. That is far too fatalistic for me to stomach. We have it within ourselves to 1) harness capitalism for the common good rather than let it "drive us" anywhere we don't want to go, and 2) stop deliberately trying to create artificial intelligence (like these guys are doing). None of this is inevitable. We can stop before it's too late, and we shouldn't let anyone tell us that something is happening regardless of whether we want it to or not.
To wrap this up, I suppose what I'm saying, Travis, is that I don't think technological advancement is a good marker of a good economic system. Yes, so far technological advancement has been nothing but good for humankind and has greatly improved our quality of life; but given the studies coming out of Oxford and the number of experts sounding the alarm, I think we have reason to be very worried about where these developments are headed. And I see far more potential within democratic socialism to stop them than I do within our current economic system.
This was a really great question.