This dystopia rests on 3 premises:
a) Half of the jobs done by humans are going to be taken away.
b) Mass joblessness follows, with no way for people without work to quickly re-skill and compete with the robots.
c) Mass joblessness results in major global unrest.
Technological progress means productivity increases: we achieve more with less.
Lump of Labor Fallacy: assuming the amount of work to be done in society is fixed.
Human needs are infinite (divine discontent!), and we automate the boring and painful parts of jobs, so we get to do more meaningful work (e.g. coaching).
Every tech revolution has brought more new jobs.
We have replaced half of our jobs roughly every 90 years for about 300 years now.
People thought that the rise of the auto would result in job loss for people who took care of the horses for a living.
“Cars created so many jobs we had to bail out the car companies!”
2nd-order effects too:
Restaurants, apartment complexes, suburbs, etc.
Some jobs that may be created in this revolution:
— Coaching
— Senior care
— VR world designers
— Chiefs of staff
— Data analysis
— Energy
Rather than AI being something that just replaces the machine-like component of existing jobs.
Also, we don’t consider that lots of jobs are mentally or physically back-breaking, sometimes literally (e.g. farming).
When Deep Blue beat the reigning human champion, people thought it was the end of human chess. But in fact the best play comes from a mix of AI + human.
Same with medicine, education, etc.
Get diagnoses from computers, but get the explanation from a doctor.
More teachers and more doctors too.
Timelines matter too. In the short term, it sucked for railroad engineers who were displaced by the car.
Even if things are better in the long run, it will temporarily suck for drivers and truckers displaced by self-driving cars.
What can we do to ease the transition?
— Make it easier to relocate (ease immigration, fight NIMBYism, build better transportation)
— Make it easier, from a regulatory perspective, to be an entrepreneur and to invest
— Skill development
— Vigorous social safety net
They say general AI will mean that, sure, there will be new jobs, but AI will do them better than humans can.
If that happens, our first question wouldn’t be: “What happens to the unemployment rate?” We’d have bigger problems (e.g. are the aliens friendly?).
In fact, the concern is not that they’ll take your job. It’s that they won’t.
We need economic progress more than ever, while also having a social safety net like the one above that takes care of the people displaced by automation.
/fin