Post by Trades on Apr 7, 2016 10:57:25 GMT -5
Are robots going to steal your job? Probably
If you put water on the stove and heat it up, it will at first just get hotter and hotter. You may then conclude that heating water results only in hotter water. But at some point everything changes – the water starts to boil, turning from hot liquid into steam. Physicists call this a “phase transition”.
Automation, driven by technological progress, has been increasing inexorably for the past several decades. Two schools of economic thinking have for many years been engaged in a debate about the potential effects of automation on jobs, employment and human activity: will new technology spawn mass unemployment, as the robots take jobs away from humans? Or will the jobs robots take over release or unveil – or even create – demand for new human jobs?
The debate has flared up again recently because of technological achievements such as deep learning, which enabled a Google software program called AlphaGo to beat Go world champion Lee Sedol, a feat considered even harder than beating the world’s chess champions.
Ultimately, the question boils down to this: are today’s technological innovations like those of the past, which made the job of buggy maker obsolete but created the job of automobile manufacturer? Or is there something about today that is markedly different?
Malcolm Gladwell’s 2000 book The Tipping Point highlighted what he called “that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire”. Can we really be confident that we are not approaching a tipping point, a phase transition – that we are not mistaking the trend of technology both destroying and creating jobs for a law that it will always continue this way?
This is not a new concern. Since at least as early as the time of the Luddites, in early 19th-century Britain, new technologies have caused fear about the inevitable changes they bring.
It may seem easy to dismiss today’s concerns as unfounded. But economists Jeffrey Sachs of Columbia University and Laurence Kotlikoff of Boston University argue: “What if machines are getting so smart, thanks to their microprocessor brains, that they no longer need unskilled labor to operate?” After all, they write:
Smart machines now collect our highway tolls, check us out at stores, take our blood pressure, massage our backs, give us directions, answer our phones, print our documents, transmit our messages, rock our babies, read our books, turn on our lights, shine our shoes, guard our homes, fly our planes, write our wills, teach our children, kill our enemies, and the list goes on.
There is considerable evidence that this concern may be justified. Erik Brynjolfsson and Andrew McAfee of MIT recently wrote:
For several decades after World War II the economic statistics we care most about all rose together here in America as if they were tightly coupled. GDP grew, and so did productivity – our ability to get more output from each worker. At the same time, we created millions of jobs, and many of these were the kinds of jobs that allowed the average American worker, who didn’t (and still doesn’t) have a college degree, to enjoy a high and rising standard of living. But … productivity growth and employment growth started to become decoupled from each other.
[Chart: productivity and average real earnings in the US. Source: US Bureau of Labor Statistics]
As the decoupling data show, the US economy has been performing quite poorly for the bottom 90% of Americans for the past 40 years. Technology is driving productivity improvements, which grow the economy. But the rising tide is not lifting all boats, and most people are not seeing any benefit from this growth. While the US economy is still creating jobs, it is not creating enough of them. The labor force participation rate – the share of the working-age population that is either employed or looking for work – has been dropping since the late 1990s.
While manufacturing output is at an all-time high, manufacturing employment is today lower than it was in the late 1940s. Wages for private nonsupervisory employees have stagnated since the late 1960s, and the wages-to-GDP ratio has been declining since 1970. Long-term unemployment is trending upwards, and inequality has become a global discussion topic following the publication of Thomas Piketty’s 2014 book Capital in the Twenty-First Century.
Most shockingly, economists Angus Deaton, winner of the 2015 Nobel memorial prize in economic sciences, and Anne Case found that mortality for white middle-aged Americans has been increasing over the past 25 years, due to an epidemic of suicides and afflictions stemming from substance abuse.
Is automation – driven by progress in technology in general, and in artificial intelligence and robotics in particular – the main cause of the economic decline of working Americans?
In economics, it is easier to agree on the data than on causality. Many other factors may be at play, such as globalization, deregulation and the decline of unions. Yet in a 2014 poll of leading academic economists conducted by the Initiative on Global Markets at the University of Chicago, regarding the impact of technology on employment and earnings, 43% of those polled agreed with the statement that “information technology and automation are a central reason why median wages have been stagnant in the US over the past decade, despite rising productivity”, while only 28% disagreed. Similarly, a 2015 study by the International Monetary Fund concluded that technological progress is a major factor in the increase of inequality over the past decades.
The bottom line is that while automation is eliminating many jobs in the economy that were once done by people, there is no sign that the introduction of technologies in recent years is creating an equal number of well-paying jobs to compensate for those losses. A 2014 Oxford study found that the number of US workers shifting into new industries has been strikingly small: in 2010, only 0.5% of the labor force was employed in industries that did not exist in 2000.
The discussion about humans, machines and work tends to be a discussion about some undetermined point in the far future. But it is time to face reality. The future is now.