When I was young, Yellow Pages was ubiquitous. Businesses paid a modest fee to appear in the directory (or a less modest one if they wanted a bigger notice). The big yellow books of listings were delivered free to almost every household. The company brought together Balham’s plumbers with its inhabitants’ leaking taps; and summoned minicab drivers wheresoever they were needed at 2am on a Sunday morning. So Yellow Pages made a healthy profit by providing a valuable service. They also produced some outstanding television advertisements. No more. The internet has seen to that. Hibu, as the company is now called, appears to be on its last legs.
Has the internet destroyed the value in this once profitable company? It has certainly destroyed many viable businesses. And not just Yellow Pages. It has done the same to bookshops and it is beginning to eat into other retailers as well. So where has the value gone? The answer is that it has moved to you and me. We find it more convenient to do things online. It frees up time and saves us money. But our extra free time isn’t immediately monetised and we might not spend the cash we save. Eventually, we’ll reassign our time and money to more profitable activities, but that isn’t much comfort if you publish a telephone directory.
It was the same in the late eighteenth century. New machinery like the spinning jenny and the mule meant that fewer workers were needed to produce the same amount of cotton fabric. People saw the machines as a threat to their livelihoods. And they were right. A few went so far as to try to hold back progress by force. My old friend Jenny Jones, a Green Party London Assembly Member, recently described the Luddites as fiery and reasonable. And you can see their point, even if they turned out to be on the wrong side of history (although when household appliances made domestic service obsolete, no one seemed so worried).
Productivity is a good word. Businesses and governments strive for it. But basically, it means fewer people doing the same amount of work. An increase in productivity removes money from the pockets of workers and deposits it in the pockets of consumers (as well as companies’ coffers). The service sector used to be immune to this effect (which is why the number of jobs in the manufacturing sector always seems to be shrinking relative to the services sector). No one ever managed to automate salespeople or waiters. But the internet has begun to increase productivity (or destroy jobs, depending on your point of view) in the service sector as well. For instance, I’ve stopped using my firm’s helpline when I have an IT problem. Just logging into a chat session is so much easier, and the worker at the other end can manage multiple queries at the same time.
But of course, this is only part of the story. Markets reassign resources, including workers, to where they are needed. We can enjoy our extra free time or work even harder if we want to. We can write blogs, play computer games and read to our children. The hole in GDP left by the loss of telephone directories is filled by app designers and delivery drivers. Companies invent things like iPads that we never knew we wanted or needed. Making things more efficient is ultimately good for all of us. Doing away with Yellow Pages increases the demand for other things. But we should not forget that, even though capitalism’s destruction is creative, the destruction is still real.
Discuss this post at the Quodlibeta Forum
Monday, September 24, 2012
Saturday, September 15, 2012
Islam: the Untold Story
Tom Holland wrote and presented Islam: The Untold Story for Channel 4. The show follows on from his book In the Shadow of the Sword on the early history of Islam. Most commentary on the show has concentrated on the complaints and threats it has engendered from the Muslim community. Holland himself deserves a great deal of praise for tackling this subject. I caught the show on Channel 4’s internet service (a screening for journalists and experts was cancelled due to security concerns). I haven’t read Holland’s book, so the comments below are based on the show rather than his more detailed written treatment.
Holland takes a sceptical approach to early Islamic history that echoes the work of the nineteenth-century higher critics on Christianity. His thesis is that the Arab invaders who destroyed the Persian Empire and took over the Levantine provinces of Byzantium were not originally Muslim. Islam only arrived sixty years later, when it was introduced to hold the disparate Caliphate together.
The early written sources for Islam are indeed scarce. So, what do we know? Professor Patricia Crone, the dean of critical studies of early Islam, summarised the situation in Holland’s film. There is the Koran, which dates from the early-seventh century. Muhammad certainly lived about that time. But the Koran contains almost no historical information and exists in a vacuum. There is no way to connect it to Mecca (which it only mentions once) and no evidence of the Arabs being Muslims until sixty years after Muhammad died. Holland and the critics assume this absence of written evidence is evidence of absence and fill it in with their own speculations.
Let me say from the outset that I think Holland is wrong and that the traditional account of Islam’s origins, in its rough contours, is basically correct. We can’t trust the early biographies of Muhammad to be completely accurate, but alternative accounts (that Mecca was in Syria, or that the Arab invaders weren’t Muslims) are simply implausible. Looking only at the evidence presented by Holland in his show (which, you’d hope, is the best he’s got), his alternative hypothesis of a late adoption of Islam by the Arabs simply falls apart.
For instance, he notes that the Arabs initially worshipped at the Jewish temple. They were certainly monotheists, but not Christians or Jews. Early Christian sources have no idea what they were, and they would surely have recognised a Jew if they saw one. But sixty years later, we are asked to believe that these Arabs became Muslims; that they did so right across their now vast empire; and that no one else converted at the same time. We know the early conquerors made no effort to proselytise their subject populations. But a new religion coming from outside after the conquests would surely have led to conversions across the board and not just in a single narrow stratum of society. Far more likely that the Arabs were Muslims from square one and that Islam spread across their new empire as they conquered it, not following a couple of generations later.
In fact, the earliest record of Muhammad comes from a coin minted by a pretender to the throne of the Umayyads called Ibn al-Zubair. But al-Zubair, who was based in Medina and controlled Mecca, lost the resulting civil war. So Holland asks us to believe that the Umayyads defeated the rebel and then successfully adopted his new religion over their vast domains. Again, it is far easier to imagine that both sides were already Muslims and that the rebellion only prompted the Umayyads to advertise their religious adherence more widely. This they did with spectacular effect at the Dome of the Rock.
Holland also notes that much of the Koran is addressed to farmers. This is odd because Mecca was a trading entrepôt in the middle of the desert. There were no farmers to address. What Holland forgets to mention is that Muhammad spent much of his life in Medina (he is even buried there) and it was here that he made his first converts. Medina is an oasis city which had plenty of agriculture. Of course, the Saudi authorities have been industriously destroying all trace of early Islam from Mecca and Medina in their misguided iconoclasm. Many ancient mosques and monuments have been bulldozed and dynamited in an orgy of destruction that makes the Taliban look like museum curators.
Still, the traditional story of Islam’s origins is likely to be accurate in its main themes. Admittedly, a huge bundle of traditions grew up around Muhammad’s life, many of which are not true. These were distilled into the received version of his biography and teachings between one and two hundred years after his death. Critical analysis of these texts is likely to give us a picture of Islam’s origins with more nuance than we currently possess. But today’s critical scholars have, like Christianity’s higher critics, gone too far in their scepticism. None of this means this isn’t a debate worth having. But non-Muslims need to be careful that they don’t gleefully turn it into a rod to beat Islam. This, at least, is something Tom Holland could never be accused of. Discuss this post at the Quodlibeta Forum
Sunday, September 09, 2012
Lukewarmism
England has had a cool wet summer. The relevance of this in the debate on global warming is pretty close to nil. Likewise, the cold winters in 2010 and 2011 didn’t tell us much about the long term trends of the world’s climate either. The data that does matter, the average world temperature records, show that temperature rose to about 0.7 degrees centigrade above pre-industrial levels in the years up to 1998. Since then, it has basically stood still. And not all the climate alarmism of all the lefties and greens on this Earth has managed to budge it one iota.
I’ve long struggled to articulate my position on global warming. Obviously, I’m a sceptic about the silly claims about tipping points, massive rises in sea level (we’ve had about 8 inches in the last 150 years) and extinct polar bears. The world isn’t going to see temperatures rise by 6 degrees any time soon, and if it does, it won’t have anything to do with us. But there is also no point in denying the temperature records that do show moderate warming in the last half-century or that carbon dioxide is part of the reason for this. Doubling the level of atmospheric carbon dioxide from pre-industrial levels should lead to an increase in average global temperatures of about 1 degree. This means that we can expect some further moderate warming in the coming decades.
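The back-of-the-envelope arithmetic behind the “about 1 degree per doubling” figure can be sketched as follows. This is my illustration, not anything from the post: the 5.35 coefficient is the standard logarithmic approximation for CO2 forcing, and the 0.3 K per W/m² no-feedback response is an assumed round number for the Planck response.

```python
import math

# Radiative forcing from a CO2 change, using the standard
# logarithmic approximation: dF = 5.35 * ln(C/C0), in W/m^2.
def co2_forcing(c_new_ppm, c_old_ppm):
    return 5.35 * math.log(c_new_ppm / c_old_ppm)

# Planck (no-feedback) response: roughly 0.3 K of warming per W/m^2.
# This is an assumed round number for illustration.
NO_FEEDBACK_SENSITIVITY = 0.3  # K per (W/m^2)

def no_feedback_warming(c_new_ppm, c_old_ppm):
    return NO_FEEDBACK_SENSITIVITY * co2_forcing(c_new_ppm, c_old_ppm)

# A doubling of CO2: forcing ~3.7 W/m^2, warming ~1.1 K before feedbacks.
print(round(co2_forcing(560, 280), 2))          # 3.71
print(round(no_feedback_warming(560, 280), 2))  # 1.11
# The rise so far, 280 -> 400 ppm, gives roughly half a degree.
print(round(no_feedback_warming(400, 280), 2))  # 0.57
```

Because the forcing is logarithmic in concentration, each successive doubling adds the same increment of warming, which is why the next 120 ppm matter less than the last 120 ppm did.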
You can only reach the alarming figures of warming of 3 degrees plus by adding feedback mechanisms to the mix. The scientific justification for these mechanisms is that they are needed to explain the past. Climate scientists have found that to get their computer models to reflect temperature records from the last hundred years accurately, they need to add more factors than just carbon dioxide to the mix. Then they extrapolated their models into the future and declared, on this basis, that things would get a whole lot hotter. They might be right. But simply using a model that explains the past to make predictions about the future is scientific garbage. You cannot take a complicated model that you’ve tweaked to fit a certain curve and then roll it forward. This is about as accurate a predictive device as tossing a coin. Unfortunately, it is also human nature. We are very good at spotting trends and expecting them to continue. There is even a name for this: it’s called the hot-hand fallacy.
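The point about tweaked models can be made concrete with a toy sketch (mine, not the post’s; it assumes numpy, and the “temperature record” here is invented noise around a linear trend, not real data). A heavily tuned high-degree polynomial matches the past record more closely than a straight line, yet rolled forward it goes badly wrong:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented "temperature record": a gentle linear trend plus noise,
# with time rescaled to [0, 1] for numerical stability.
t_past = np.linspace(0.0, 1.0, 30)
record = 0.6 * t_past + rng.normal(0.0, 0.1, size=t_past.size)

# A simple model and a heavily tweaked one, both fitted to the same past.
simple = np.polyfit(t_past, record, 1)    # straight line
tweaked = np.polyfit(t_past, record, 9)   # degree-9 polynomial

# The tweaked model reproduces the past record more closely...
err_past_simple = np.abs(np.polyval(simple, t_past) - record).mean()
err_past_tweaked = np.abs(np.polyval(tweaked, t_past) - record).mean()

# ...but extrapolated over the next "interval", compared with the true
# underlying trend, it is far worse than the simple fit.
t_future = np.linspace(1.0, 2.0, 30)
truth = 0.6 * t_future
err_future_simple = np.abs(np.polyval(simple, t_future) - truth).mean()
err_future_tweaked = np.abs(np.polyval(tweaked, t_future) - truth).mean()
```

Goodness of fit to history, in other words, tells you very little about forecasting skill; only out-of-sample verification does.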
Back in 2001, the IPCC told us to expect rises in temperature of 0.1 to 0.2 degrees per decade. Since 2001, that simply hasn’t happened. Now, there might be good reasons for this tied up with long-term patterns that were not properly understood even ten years ago. But there might just as well be other even more long-term effects that could cause temperatures to fall in the next fifty years, or not, as the case may be. We simply don’t know. And until we have models that have been shown to work in advance of what they are supposed to predict, we can’t rely on them.
That’s why I am sceptical about predictions of future climate. You should be too. But that doesn’t make me a global warming denialist, since I certainly accept the solidly-based past temperature records. I also expect further moderate warming on the basis of the well-understood “greenhouse effect”.
So what am I? Matt Ridley, author of the excellent The Rational Optimist, has the answer. He calls himself a luke-warmist. And that is what I am too. As a luke-warmist, I accept the following aspects of climate change orthodoxy: the world is getting (a bit) warmer; atmospheric carbon dioxide is a major cause of this; and human activity has increased the level of atmospheric carbon dioxide from about 280 to 400 parts per million.
But thereafter, luke-warmists part company with orthodoxy. For instance, I do not claim that the world’s temperature was an optimum in 1970 (or whenever). Instead, I recognise from history that warmer climates have been beneficial in the past and this is likely to remain the case in the future. I reject forecasts of future climate change based on computer models that have not been shown to make accurate predictions. I believe it is abhorrent to encourage poor countries to use expensive energy when cheaper carbon-based alternatives would boost their economic performance. I believe wind power is an expensive white elephant and using agricultural land to grow biofuels a scandal (on this last point, I find I’m in agreement with many Greens). I note most anecdotal reports on the effects of global warming turn out to be false or misleading (from Himalayan glaciers to vanishing coral reefs). I note that the best way to reduce carbon emissions is to use more gas (thankfully there is plenty of it about and the fracking technology to extract it). And finally, I note that climate change propaganda is well-funded and pervasive (unlike climate change scepticism, which has to work on a shoestring).
In short, global warming isn’t yet much of a problem and is more likely to be mildly beneficial. If it does turn out differently, then we should adapt. But the current fad for slowly destroying the world’s economy by artificially increasing the cost of energy isn’t just stupid; it is actively harmful. Cheap energy drove the industrial revolution and much of the development that has given us such high standards of living today. The oil shocks of the 1970s caused two damaging recessions. Keeping energy affordable, when we know the consequences of high prices will eventually be disastrous, should be a much greater priority than preventing uncertain climate change.
Note: this post was amended on 14 September to clarify certain points. It seems that there are only two boxes allowed in this debate: denialist and true believer. I have tried to make clearer that I am neither.
Discuss this post at the Quodlibeta Forum