Monday, October 22, 2012

Please tell me there's more to this story

Italian scientists have been convicted of manslaughter and sentenced to six years in prison for ... failing to predict an earthquake.

There has to be more to this, right? Some exculpatory detail that makes this story not crazy?

Discuss this post at the Quodlibeta Forum

Wednesday, October 17, 2012

Talks in Oxford tomorrow and in Minneapolis in November

Sorry this is rather late notice.  I am speaking tomorrow 18 October at 8.30 in the Sutro Room, Trinity College, Oxford on ancient Greek and medieval science.  The talk is a seminar for the Ian Ramsey Centre but everyone is welcome.  Do drop by if interested.

I am also giving a Faith and Life lecture at St Philip the Deacon Church in Minneapolis at 7pm on 15 November.  Again, this is a free public event.

Update: you can watch a video of the Oxford talk here.

Discuss this post at the Quodlibeta Forum

Monday, October 08, 2012

Dennett contra Weinberg

There's a relatively famous quote by physicist Steven Weinberg: "With or without religion, good people can behave well and bad people can do evil; but for good people to do evil—that takes religion." I think this is an incredibly naive claim. I would replace "religion" in that quote with "ideology." After all, good people do evil in the service of political ideologies all the time. But that's a post for another day. Right now I want to point to an interesting passage from Daniel Dennett's Darwin's Dangerous Idea that contradicts Weinberg's claim. It's from page 264 of my copy, the third page of chapter 10; the emphases are mine.

Anybody as prolific and energetic as [Stephen Jay] Gould would surely have an agenda beyond that of simply educating and delighting his fellow human beings about the Darwinian view of life. In fact, he has had numerous agendas. He has fought hard against prejudice, and particularly against the abuse of scientific research (and scientific prestige) by those who would clothe their political ideologies in the potent mantle of scientific respectability. It is important to recognize that Darwinism has always had an unfortunate power to attract the most unwelcome enthusiasts -- demagogues and psychopaths and misanthropes and other abusers of Darwin's dangerous idea. Gould has laid this sad story bare in dozens of tales, about the Social Darwinists, about unspeakable racists, and most poignantly about basically good people who got confused -- seduced and abandoned, you might say -- by one Darwinian siren or another. It is all too easy to run off half cocked with some poorly understood version of Darwinian thinking, and Gould has made it a major part of his life's work to protect his hero from this sort of abuse.

So Dennett not only affirms that science can lead good people to do evil, but that evolution in particular can do so. Of course, Dennett and Weinberg and I would respond to this charge by saying that such people are obviously misunderstanding science and evolution. But then I don't see why this defense isn't available for religion as well.

(cross-posted at Agent Intellect)

Discuss this post at the Quodlibeta Forum

Thursday, October 04, 2012

Fakes: Jesus' wife, boyfriend and brother's coffin

I have a post at On the Square, the blog for First Things magazine, looking at the Jesus' wife papyrus and some other notorious forgeries.

Discuss this post at the Quodlibeta Forum

Monday, September 24, 2012

Luddites and the internet

When I was young, Yellow Pages was ubiquitous. Businesses paid a modest fee to appear in the directory (or a less modest one if they wanted a bigger notice). The big yellow books of listings were delivered free to almost every household. The company brought together Balham’s plumbers with its inhabitants’ leaking taps; and summoned minicab drivers wheresoever they were needed at 2am on a Sunday morning. So Yellow Pages made a healthy profit by providing a valuable service. They also produced some outstanding television advertisements. No more. The internet has seen to that. Hibu, as the company is now called, appears to be on its last legs.

Has the internet destroyed the value in this once profitable company? It has certainly destroyed many viable businesses.  And not just Yellow Pages. It has done the same to bookshops and it is beginning to eat into other retailers as well. So where has the value gone? The answer is that it has moved to you and me. We find it more convenient to do things online. It frees up time and saves us money. But our extra free time isn’t immediately monetised and we might not spend the cash we save. Eventually, we’ll reassign our time and money to more profitable activities, but that isn’t much comfort if you publish a telephone directory.

It was the same in the late eighteenth century. New machinery like the spinning jenny and the mule meant that fewer workers were needed to produce the same amount of cotton fabric. People saw the machines as a threat to their livelihoods. And they were right. A few went so far as to try to hold back progress by force. My old friend Jenny Jones, a Green Party London Assembly Member, recently described the Luddites as fiery and reasonable. And you can see their point, even if they turned out to be on the wrong side of history (although when household appliances made domestic service obsolete, no one seemed so worried).

Productivity is a good word. Businesses and governments strive for it. But basically, it means fewer people doing the same amount of work. An increase in productivity removes money from the pockets of workers and deposits it in the pockets of consumers (as well as companies’ coffers). The service sector used to be immune to this effect (which is why the number of jobs in manufacturing always seems to be shrinking relative to services). No one ever managed to automate salespeople or waiters. But the internet has begun to increase productivity (or destroy jobs, depending on your point of view) in the service sector as well. For instance, I’ve stopped using my firm’s helpline when I have an IT problem. Logging into a chat session is so much easier, and the worker at the other end can manage multiple queries at the same time.

But of course, this is only part of the story. Markets reassign resources, including workers, to where they are needed. We can enjoy our extra free time or work even harder if we want to. We can write blogs, play computer games and read to our children. The hole in GDP left by the loss of telephone directories is filled by app designers and delivery drivers. Companies invent things like iPads that we never knew we wanted or needed. Making things more efficient is ultimately good for all of us. Doing away with Yellow Pages increases the demand for other things. But we should not forget that, even though capitalism’s destruction is creative, the destruction is still real.

Discuss this post at the Quodlibeta Forum

Saturday, September 15, 2012

Islam: the Untold Story

Tom Holland wrote and presented Islam: The Untold Story for Channel 4.  The show follows on from his book In the Shadow of the Sword on the early history of Islam.  Most commentary on the show has concentrated on the complaints and threats it has engendered from the Muslim community. Holland himself deserves a great deal of praise for tackling this subject. I caught the show on Channel 4’s internet service (a screening for journalists and experts was cancelled due to security concerns). I haven’t read Holland’s book, so the comments below are based on the show rather than his more detailed written treatment.
Holland takes a sceptical approach to early Islamic history that echoes the work of the nineteenth-century higher critics on Christianity.  His thesis is that the Arab invaders who destroyed the Persian Empire and took over the Levantine provinces of Byzantium were not originally Muslim.  Islam only arrived sixty years later, when it was introduced to hold the disparate Caliphate together.
The early written sources for Islam are indeed scarce. So, what do we know?  Professor Patricia Crone, the dean of critical studies of early Islam, summarised the situation in Holland’s film.  There is the Koran, which dates from the early seventh century.  Muhammad certainly lived about that time.  But the Koran contains almost no historical information and exists in a vacuum.  There is no way to connect it to Mecca (which it mentions only once) and no evidence of the Arabs being Muslims until sixty years after Muhammad died.  Holland and the critics assume this absence of written evidence is evidence of absence and fill the gap with their own speculations.
Let me say from the outset that I think Holland is wrong and that the traditional account of Islam’s origins, in its rough contours, is basically correct.  We can’t trust the early biographies of Muhammad to be completely accurate, but alternative accounts (that Mecca was in Syria, or that the Arab invaders weren’t Muslims) are simply implausible.  Looking only at the evidence presented by Holland in his show (which, you’d hope, is the best he’s got), his alternative hypothesis of a late adoption of Islam by the Arabs simply falls apart. 
For instance, he notes that the Arabs initially worshipped at the Jewish temple.  They were certainly monotheists, but not Christians or Jews.  Early Christian sources have no idea what they were, and they would surely have recognised a Jew if they saw one.  But sixty years later, we are asked to believe that these Arabs became Muslims; that they did so right across their now-vast empire; and that no one else converted at the same time.  We know the early conquerors made no effort to proselytise their subject populations.  But a new religion coming from outside after the conquests would surely have led to conversions across the board and not just in a single narrow stratum of society.  Far more likely that the Arabs were Muslims from square one and that Islam spread across their new empire as they conquered it, not following a couple of generations later.
In fact, the earliest record of Muhammad comes from a coin minted by a pretender to the throne of the Umayyads called Ibn al-Zubair.  But al-Zubair, who was based in Medina and controlled Mecca, lost the resulting civil war. So, Holland asks us to believe that the Umayyads defeated the rebel and then successfully adopted his new religion over their vast domains.  Again, it is far easier to imagine that both sides were already Muslims and that the rebellion only prompted the Umayyads to advertise their religious adherence more widely.  This they did with spectacular effect at the Dome of the Rock.
Holland also notes that much of the Koran is addressed to farmers.  This is odd because Mecca was a trading entrepôt in the middle of the desert.  There were no farmers to address.  What Holland forgets to mention is that Muhammad spent much of his life in Medina (he is even buried there) and it was here that he made his first converts.  Medina is an oasis city which had plenty of agriculture.  Of course, the Saudi authorities have been industriously destroying all trace of early Islam from Mecca and Medina in their misguided iconoclasm.  Many ancient mosques and monuments have been bulldozed and dynamited in an orgy of destruction that makes the Taliban look like museum curators.
Still, the traditional story of Islam’s origins is likely to be accurate in its main themes.  Admittedly, a huge bundle of traditions grew up around Muhammad’s life, many of which are not true.  These were culled into the received version of his biography and teachings between one and two hundred years after his death.  Critical analysis of these texts is likely to give us a picture of Islam’s origins with more nuance than we currently possess.  But today’s critical scholars have, like Christianity’s higher critics, gone too far in their scepticism.  None of this means this isn’t a debate worth having.  But non-Muslims need to be careful that they don’t gleefully turn it into a rod to beat Islam.  This, at least, is something Tom Holland could never be accused of. 

Discuss this post at the Quodlibeta Forum

Sunday, September 09, 2012

Lukewarmism

England has had a cool wet summer. The relevance of this in the debate on global warming is pretty close to nil. Likewise, the cold winters in 2010 and 2011 didn’t tell us much about the long term trends of the world’s climate either. The data that does matter, the average world temperature records, show that temperature rose to about 0.7 degrees centigrade above pre-industrial levels in the years up to 1998. Since then, it has basically stood still. And not all the climate alarmism of all the lefties and greens on this Earth has managed to budge it one iota.

I’ve long struggled to articulate my position on global warming. Obviously, I’m a sceptic about the silly claims about tipping points, massive rises in sea level (we’ve had about 8 inches in the last 150 years) and extinct polar bears. The world isn’t going to see temperatures rise by 6 degrees any time soon, and if it does, it won’t have anything to do with us. But there is also no point in denying the temperature records that do show moderate warming in the last half-century or that carbon dioxide is part of the reason for this. Doubling the level of atmospheric carbon dioxide from pre-industrial levels should lead to an increase in average global temperatures of about 1 degree. This means that we can expect some further moderate warming in the coming decades.
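The "about 1 degree" figure can be checked with a standard back-of-the-envelope calculation. The logarithmic forcing approximation and the no-feedback ("Planck") response below are textbook values, not figures taken from this post:

```latex
% Radiative forcing from a doubling of CO2 (standard logarithmic approximation):
\Delta F \approx 5.35 \,\ln\!\left(\frac{C}{C_0}\right) \,\mathrm{W\,m^{-2}}
         = 5.35 \,\ln 2 \approx 3.7 \,\mathrm{W\,m^{-2}}

% No-feedback temperature response, with Planck sensitivity
% \lambda_0 \approx 0.3 \,\mathrm{K\,W^{-1}\,m^{2}}:
\Delta T = \lambda_0 \,\Delta F \approx 0.3 \times 3.7 \approx 1.1\,\mathrm{K}
```

Feedbacks enter as a multiplier on this basic response, which is where the much higher estimates come from.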

You can only reach the alarming figures of 3 degrees or more of warming by adding feedback mechanisms to the mix. The scientific justification for these mechanisms is that they are needed to explain the past. Climate scientists have found that, to get their computer models to reflect the temperature records of the last hundred years accurately, they need to add more factors than carbon dioxide alone. They then extrapolated their models into the future and declared, on this basis, that things would get a whole lot hotter. They might be right. But simply using a model that explains the past to make predictions about the future is scientific garbage. You cannot take a complicated model that you’ve tweaked to fit a certain curve and then roll it forward. As a predictive device, this is about as accurate as tossing a coin. Unfortunately, it is also human nature. We are very good at spotting trends and expecting them to continue: psychologists call this the hot-hand fallacy (the gambler’s fallacy is its mirror image, expecting a run to reverse).
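The curve-fitting point can be made concrete with a toy sketch. This is illustration only: the "record" below is invented data with a linear trend plus noise, not a real temperature series, and the degree-10 polynomial stands in for any model tweaked until it fits the past:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented "temperature record": a gentle linear trend plus measurement noise.
years = np.arange(1900, 2001)
x = (years - 1950) / 50.0                 # normalise to [-1, 1] for a stable fit
record = 0.007 * (years - 1900) + rng.normal(0.0, 0.1, years.size)

# Tweak a complicated model (a degree-10 polynomial) until it fits the past well...
coeffs = np.polyfit(x, record, deg=10)
in_sample_error = np.max(np.abs(np.polyval(coeffs, x) - record))

# ...then roll it forward fifty years, beyond the data it was tuned on.
future_years = np.arange(2001, 2051)
fx = (future_years - 1950) / 50.0         # extrapolation region: x > 1
projection = np.polyval(coeffs, fx)
true_future = 0.007 * (future_years - 1900)
out_of_sample_error = np.max(np.abs(projection - true_future))

print(f"worst error explaining the past:   {in_sample_error:.2f} C")
print(f"worst error predicting the future: {out_of_sample_error:.2f} C")
```

The same model that reproduces the past almost perfectly produces nonsense the moment it is extrapolated.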

Back in 2001, the IPCC told us to expect rises in temperature of 0.1 to 0.2 degrees per decade. Since 2001, that simply hasn’t happened. Now, there might be good reasons for this tied up with long-term patterns that were not properly understood even ten years ago. But there might just as well be other even more long-term effects that could cause temperatures to fall in the next fifty years, or not, as the case may be. We simply don’t know. And until we have models that have been shown to work in advance of what they are supposed to predict, we can’t rely on them.

That’s why I am sceptical about predictions of future climate. You should be too. But that doesn’t make me a global warming denialist, since I certainly accept the solidly-based past temperature records. I also expect further moderate warming on the basis of the well-understood “greenhouse effect”.

So what am I? Matt Ridley, author of the excellent The Rational Optimist, has the answer. He calls himself a luke-warmist. And that is what I am too. As a luke-warmist, I accept the following aspects of climate change orthodoxy: the world is getting (a bit) warmer; atmospheric carbon dioxide is a major cause of this; and human activity has increased the level of atmospheric carbon dioxide from about 280 to 400 parts per million.

But thereafter, luke-warmists part company with orthodoxy. For instance, I do not claim that the world’s temperature was an optimum in 1970 (or whenever). Instead, I recognise from history that warmer climates have been beneficial in the past and this is likely to remain the case in the future. I reject forecasts of future climate change based on computer models that have not been shown to make accurate predictions. I believe it is abhorrent to encourage poor countries to use expensive energy when cheaper carbon-based alternatives would boost their economic performance. I believe wind power is an expensive white elephant and using agricultural land to grow biofuels a scandal (on this last point, I find I’m in agreement with many Greens). I note most anecdotal reports on the effects of global warming turn out to be false or misleading (from Himalayan glaciers to vanishing coral reefs). I note that the best way to reduce carbon emissions is to use more gas (thankfully there is plenty of it about and the fracking technology to extract it). And finally, I note that climate change propaganda is well-funded and pervasive (unlike climate change scepticism, which has to work on a shoestring).

In short, global warming isn’t yet much of a problem and is more likely to be mildly beneficial. If it does turn out differently, then we should adapt. But the current fad for slowly destroying the world’s economy by artificially increasing the cost of energy isn’t just stupid; it is dangerous. Cheap energy drove the industrial revolution and much of the development that has given us such high standards of living today. The oil shocks of the 1970s caused two damaging recessions. Given that we know high energy prices eventually prove disastrous, keeping energy affordable should be a much greater priority than preventing uncertain climate change.

Note: this post was amended on 14 September to clarify certain points.  It seems that there are only two boxes allowed in this debate - denialist and true believer.  I have tried to make clearer that I am neither.

Discuss this post at the Quodlibeta Forum

Wednesday, August 22, 2012

UK and EU: A Doomed Marriage?

With the Euro-crisis dragging on without any sign of resolution, even the most excitable journalists are beginning to get a bit fed up with waiting for nemesis. The most likely scenario is not a sudden collapse. The European elites are clearly willing to spend however many billions of Euros that are required to avert that possibility. They have also managed to convince their populations that such a collapse would be worse than the current impasse. So, crisis has become the new normal and we can expect it to drag on for many years. Europe will slowly stagnate. It won’t be until a new generation of politicians arise, without any commitment to the failed Euro-project, that a major change of direction becomes possible. The tragedy is that these politicians are most likely to come from the radical left or right, although in most cases the two polarities are difficult to tell apart.


This means that the UK finds itself shackled to a corpse, which makes Daniel Hannan’s new book restating the case against British membership of the EU especially timely. He has called it A Doomed Marriage (and you can read an extract in the Daily Mail). The title comes from something he once heard about marriage guidance counsellors. They say that, as long as the couple are still arguing, the marriage can be saved. The very fact of disagreement means both parties value the other’s opinion enough to want to change it. It is when they settle into silent contempt that the union is finished. And that is the attitude of the British towards the EU: icy disdain. There is no longer even a token effort to put a positive case for rule from Brussels. Few think joining the EU has done us any good and, aside from a few harmless cranks (one of whom, regrettably, is deputy prime minister), British Euro-enthusiasm is dead. Particularly telling is the number of ex-Europhiles now claiming that they were sceptics all along. Our marriage with Europe has now got to the stage where the only reason we stay in is fear of life outside. It is the equivalent of a warring couple sticking it out for the sake of the children.

Realising this, Dan has begun to develop a narrative for life outside the EU centred around our shared culture and strong trading relationships with the English-speaking peoples of North America, Australasia, India and (the honorary English-speakers) of Scandinavia. In parallel, his American bestseller, The New Road to Serfdom, warned Americans to avoid the mistakes of Europe. A Doomed Marriage catalogues the arguments that have been made in favour of the EU and systematically debunks them. We learn that we joined late in the day not because we deliberately stood aside, but because our constructive suggestions about what shape the nascent EU should take were ignored. Only when we ditched all our arguments and completely accepted the federalist model, common agricultural policy (“CAP”), external tariffs and the rest were we allowed to join. And yet, Euro-enthusiasts spent decades telling us we had to be in to have influence. After forty years, it is clear that our membership has had no influence at all. CAP still extorts money from consumers and third-world farmers, there is no single market in financial services (which the British are best at) and we have not been able to stall the constant flow of regulations from Brussels. To add pecuniary injury to diplomatic insult, we have been the largest net contributor to EU funds apart from Germany.

In his most shocking chapter, Dan explains how the EU has funnelled cash through charities like Oxfam and Friends of the Earth, to buy support. Millions of Euros are paid to these organisations who, in turn, provide useful propaganda. The same has long gone for Brussels journalists who are carefully cultivated in return for uncritical coverage. The EU has learnt the hard way that it is a waste of time trying to convince the public of its virtues. Referendum defeats in Ireland, Denmark, the Netherlands and France (all duly ignored) have put paid to that. Bankrolling pressure groups and NGOs serves the EU’s purposes much better. And there are now so many articulate people on the European payroll, it is hardly surprising that our elites cannot bring themselves to accept we are better off out.

The other main reason for Euro-enthusiasm that Dan identifies has been political tribalism. Never mind the arguments: the BBC, the intelligentsia and most politicians simply defined the EU as progressive, modern and moderate. They have dismissed the EU’s opponents as lunatics. Even though the sceptics have been proven right, it is very hard for the enthusiasts to admit this (see here for a pretty good example of Michael White both claiming to be a Eurosceptic himself and insisting that Dan is nuts. White is refreshingly ignorant of Dan’s politics and assumes he is a social conservative on marriage; Dan is actually a libertarian supporter of gay marriage). After all, pro-Europeans would have to accept that they have been the lunatics all along and that they have been the ones running the asylum.

Dan’s short book eloquently sets out what is wrong with the EU and our relationship with it. It is hard to escape the conclusion that Britain’s marriage to Brussels isn’t just doomed, it is abusive.


Discuss this post at the Quodlibeta Forum

Thursday, July 12, 2012

Defining Ignorance

In the comments to a recent post, Tim argued that agnosticism should be defined according to its provenance in T.H. Huxley. By agnosticism, Huxley meant the claim that no one can know if God exists because God is inherently unknowable. I argued in the comments that the term agnosticism is much more diverse than this: it can be personal ("I don't know") or universal ("no one knows"), it can be diffident ("we don't know") or assertive ("we can't know"), it can relate to the object ("it can't be known") or the subject ("we're incapable of knowing"). Tim argued to the contrary that while agnosticism has taken on these colloquial meanings, its technical meaning is Huxley's definition, and we should stick with the technical definition because appealing to colloquial definitions is a recipe for disaster.

The problem with this is that Tim is wrong. I'm giving the technical definition of agnosticism. Tim thinks this is due to creeping colloquialism, but he's simply incorrect. Here's why: agnosticism is used by philosophers to denote the withholding of belief in any proposition. If you are unpersuaded that there is life elsewhere in the universe but don't actively disbelieve it, you are agnostic about it: you withhold belief. If you think the evidence for string theory is lacking but think it is very possible that forthcoming evidence may shore up the gaps, you are agnostic about it. This may also be how it is used colloquially, but that doesn't mean it's not also the technical definition.

Can we say that agnosticism means one thing when it bears on knowledge in general but something else when it bears on knowledge of God? Well, we could, but that's not how it's used by philosophers (not to mention that using any term in such a way would be a recipe for disaster if anything would be). For example, agnosticism about God's existence can refer to an individual's claim not to know one way or the other, or to the universal claim that no one really knows. I've heard this distinction referred to as subjective/objective, personal/universal, individual/general, but regardless, this is a real distinction made by philosophers regarding agnosticism about God's existence. (In a debate William Lane Craig once called it ordinary agnosticism vs. ornery agnosticism.) To insist that the technical use of this term should be abandoned in favor of how it was originally conceived is incorrect. Naturally, we shouldn't read our definition of agnosticism back into Huxley's writings; the point is just that, regardless of how a term originated, its technical definition is determined by how it is used in technical discussions. I don't know if there is an established name for this linguistic fallacy -- treating a term's original meaning as its technical definition -- but it strikes me as one. It's a close cousin of the root fallacy, where one defines a term by the parts that make it up, and of the fallacy of semantic obsolescence. Perhaps we can call it the fallacy of provenance.

Part of the problem is that Tim tried to define knowledge in an absolute way: "No-one can definitively know if something exists or not, unless they are omniscient." I presume his reason for this is that unless you know everything, the possibility remains that one of the items you don't know would invalidate your claim to knowledge. But epistemologists know that knowledge doesn't work that way. The classical definition of knowledge is justified true belief, but this has fallen on hard times in the last several decades. Regardless, whatever one wants to call the third condition of knowledge (or fourth condition, if one wants to add something to justification), very few philosophers, if any, maintain that it requires the absolute assurance that only omniscience could provide. (Maybe Peter Unger would accept this, but I haven't really read his defense of skepticism.) Some epistemologists do require a reasonably thorough acquaintance with the object known before a belief about it qualifies as knowledge, but nothing like omniscience. Infallibilism is the view that knowledge is ... wait for it ... infallible, and so would be in that general direction, but infallibilists don't make omniscience a condition of infallibility; far from it. In fact, as far as I can tell, the majority view among epistemologists today is fallibilism: knowledge doesn't have to be absolute or certain or anything like it. As long as one believes the proposition, even weakly, and the right conditions are met (and the belief is true), then it qualifies as knowledge.

The whole discussion was raised by the issue of burden of proof. Many atheists claim that only the one who is making a positive claim bears the burden of proof; therefore, they do not have to have any evidence or reason or grounds for their atheism. This doesn't really work, since the claim that God does not exist is a claim to know something, just as much as the claim that he does. Disbelief is the belief that a proposition is false. Some atheists respond by saying "You can't prove a negative," which was the topic of the post (quick answer: of course you can). So the atheist has to shoulder his share of the burden of proof. Some atheists respond further by redefining atheism to mean the absence of belief in God rather than disbelief. Disbelief is hard atheism; lacking a belief is soft atheism. I've written about that before too: I understand what belief means; I understand what disbelief means. I also understand what the withholding of belief (i.e. agnosticism) means. Further, I understand what it would mean to have no conception of something: prior to hearing about Russell's orbiting teapot or the flying spaghetti monster, I had no conception of them. I could see calling this last case "lacking a belief," but this doesn't help the soft atheist, since he has obviously heard of the concept of God. Once I've heard of a concept I no longer lack a belief in it: I believe, disbelieve, or withhold belief in it. So I asked Tim, as I've asked others, to define what lacking a belief in God means and how it differs from belief (theism), disbelief (atheism), and withholding belief (agnosticism). Tim, much to his credit, offered the first response to this I've ever received: he said it seemed to be closer to withholding belief. The problem is that this is not atheism; it is agnosticism. Hence the discussion. I suspect that soft atheists disbelieve in God, but weakly; they're right on the border of being agnostic but leaning towards disbelief. But they insist that's not what they mean: they mean they lack a belief -- a concept I can't get them to define.

The believer must shoulder the burden of proof, but so must the disbeliever, the one who believes that the proposition up for grabs is false. I, for example, disbelieve in Russell's orbiting teapot and the flying spaghetti monster. I must therefore shoulder the burden of proof, I must supply a justification or reason for my belief that these proposals are false. I've written about this before too: The reason why I don't believe these claims -- and the reason why everyone else doesn't believe them either -- is because they are completely ad hoc. The more ad hoc, or contrived, a claim is, the less likely it is true. As I wrote before, "The degree to which it is ad hoc is the degree to which it is implausible. This is particularly evident with the absurdly ad hoc propositions mentioned above: we react against such suggestions because they are completely contrived. It's not merely that we have no reason to think they are true; we think, for whatever reason, that they are just "made up," and this is a specific reason to think they are not true." Of course, these ideas were originally conceived in order to try to compare them to belief in God, and to show that, just as the rational response to the orbiting teapot and spaghetti monster is to disbelieve them until we have evidence for them, so the rational response to God is to disbelieve until we have evidence. Obviously this attempt fails: the concept of God is not inherently ad hoc, it has been present in all cultures in all times. Of course the concept of God can be used in an ad hoc way, as can other concepts (Marxists, for example, use economics in an ad hoc way), but this does not mean that the concept itself is ad hoc.

Only the withholder of belief, the agnostic, does not have to bear any burden of proof, because he is not believing or asserting anything (and of course the one who has never heard of the concept does not bear the burden of proof for it). Soft atheism strikes me as an attempt to get the negative position of the atheist and the burden-free position of the agnostic. It looks like an attempt to disbelieve in God without having to go through all the rigmarole of having any burden of proof placed upon one's shoulders. Until I hear a definition of lacking a belief that is distinct from disbelief and withholding belief, I see no reason to question this impression.

Let me just conclude by reiterating something I posted on Agent Intellect: "Agnostic" is the Greek-based term for someone with no knowledge. The Latin-based term is "ignoramus."

Discuss this post at the Quodlibeta Forum

Friday, July 06, 2012

God particle found

This is so cool. The Higgs boson has been discovered.
In physics terms, evidence for a new particle requires a “3-sigma” measurement, corresponding to a 1-in-740 chance that a random fluke could explain the observations, and a claim of discovery requires a 5-sigma effect, or a 1-in-3.5 million shot that the observations are due to chance. In December representatives of the two experiments had announced what one called “intriguing, tantalizing hints” of something brewing in the collider data. But those hints fell short of the 3-sigma level. The new ATLAS finding met not just that level of significance but cleared the gold standard 5-sigma threshold, and CMS very nearly did as well, with a 4.9-sigma finding.
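The quoted odds drop straight out of the one-sided tail of a normal distribution; here is a quick standard-library check (the function name is mine):

```python
import math

def tail_probability(sigma):
    # One-sided probability that pure chance produces an excess
    # of at least `sigma` standard deviations.
    return math.erfc(sigma / math.sqrt(2)) / 2

p3 = tail_probability(3)   # roughly 1 in 740
p5 = tail_probability(5)   # roughly 1 in 3.5 million
print(f"3-sigma: 1 in {1 / p3:,.0f}")
print(f"5-sigma: 1 in {1 / p5:,.0f}")
```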
So let the jokes begin:

-- The Higgs boson was recently found herding two of each kind of subatomic particle into a subatomic boat of some kind.

-- The scientists who discovered the Higgs boson announced today that its first commandment was "Thou shalt not misuse the name of the Higgs boson in the science/religion conflict."

-- Scientists have successfully detected a Higgs boson but were unable to determine if there were any others. There is one Higgs boson and the Large Hadron Collider is its prophet.

-- It has been determined that the Higgs boson used to be a lowly electron, but it died and was resurrected.

Yeah I know, they suck. Supply your own in the comments.

Discuss this post at the Quodlibeta Forum