
Institute of Applied Data Science


News

What are the chances of that? The church minister's hobby and clever machines

19 May 2021

Professor Norman Fenton

Professor Norman Fenton, Queen Mary Turing Fellow and Professor of Risk Information Management at the School of Electronic Engineering and Computer Science, Queen Mary University of London, has recently contributed a number of articles to the latest issue of the Computer Science for Fun (CS4FN) magazine: Issue 27: Smart Health: Decisions, decisions, decisions.

Computer Science for Fun's aim is to share a passion for all things to do with Computer Science, and especially to show that it is an exciting subject that is great to learn about just for the fun of it. The CS4FN team, which includes researchers from Queen Mary's School of Electronic Engineering and Computer Science, produces a free magazine twice a year as well as a series of special booklets (for example, three books on magic tricks and the computer science behind them). They have also produced magazines on Electronic Engineering and Audio Engineering and their links with computing. All are sent free to schools in the UK (over 20,000 copies per issue). Online PDFs of CS4FN's magazines and booklets have been downloaded hundreds of thousands of times by people in over 80 countries. CS4FN partnered with the BBC's Make it Digital campaign and is supported by UKRI EPSRC.

A copy of Professor Fenton's article on "What are the chances of that? The church minister's hobby and clever machines" is provided below.

What are the chances of that? The church minister's hobby and clever machines

The hobby of a church minister over 250 years ago is helping computers make clever decisions.

Thomas Bayes was an English church minister who died in 1761. His hobby was a type of maths that today we call probability and statistics, though his writings were never really recognised during his own lifetime. So, how is the hobby of this 18th century church minister driving computers to become smarter than ever? His work is now being used in applications as varied as: helping to diagnose and treat various diseases; deciding whether a suspect's DNA was at a crime scene; accurately recommending which books and films we will like; setting insurance premiums for rare events; filtering out spam emails; and more.

How likely is that?

Bayes was interested in calculating how likely things were to happen (their probability) and particularly things that cannot be observed directly. Suppose, for example, you want to know the probability that you have an infectious virus, something you can't just tell by looking. Perhaps you're going to a concert of your favourite band - one for which you've already paid a lot of money. So you need to know you are not infected. If recent data shows that the virus currently affects one in 200 of the population, then it is reasonable to start with the assumption that the probability YOU have the virus is one in 200 (we call this the 'prior probability'). Another way of saying that is that the prior probability is 0.5 per cent.

A better estimate

However, you can get a much better estimate of how likely it is that you have the virus if you can gather more evidence of your personal situation. With a virus you can get tested. If the test was always correct, then you would know for certain. Tests are never perfect though. Let's suppose that for every 100 people taking the test, two will test positive when they actually do NOT have the virus. Scientists call this the false positive rate: here two per cent. You take the test and it is positive. You can use this information to get a better idea of the likelihood you have the virus.

How? Bayes worked out a general equation for calculating this new, more accurate probability, called the 'posterior' probability (see page 8). It is based, here, on the probability of having the virus before testing (the original, prior probability) and any new evidence, which here is the test result.

A surprising result

If we assume in our example that every person who does have the virus is certain to test positive, then plugging the numbers into Bayes' theorem tells us there is actually a surprisingly low, one in five (i.e., 20 per cent) chance you have the virus after testing positive. See "A Graphical Explanation of Bayes' theorem" for why the answer is correct. Although this is much higher than the probability of having the virus without testing (0.5 per cent), it still means you are unlikely to have the virus despite the positive test result!
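The calculation above can be sketched in a few lines of Python, using the article's numbers: a one-in-200 prior, a test that always detects the virus, and a two per cent false positive rate.

```python
def posterior(prior, sensitivity, false_positive_rate):
    # Bayes' theorem for a positive test result:
    # P(virus | positive) = P(positive | virus) * P(virus) / P(positive)
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# The article's numbers: prior 1 in 200, the test always detects the virus,
# and 2 in 100 uninfected people wrongly test positive.
p = posterior(prior=1 / 200, sensitivity=1.0, false_positive_rate=0.02)
print(round(p, 2))  # 0.2, i.e. a one in five chance
```

Notice how the answer is driven by sheer numbers: the uninfected vastly outnumber the infected, so even a small false positive rate produces more false alarms than true detections.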

If you understand Bayes' theorem, you might feel it unfair if your doctor still insists that you have the virus and must miss the concert. In fact, many people find the result very surprising; generally, doctors who do not know Bayes' theorem massively overestimate the likelihood that patients have a disease after a positive test result. But that is why Bayes' theorem is so important.

To go or not to go

Of course, no one knows which five of the 25 concert goers who tested positive are the ones actually infected (remember, only one in five positive results is a true positive). If all 25 ignore their doctor, that means five infectious people mingling in the crowd, passing on the virus, which would mean lots more people catch the virus who pass it on to lots more, who ... (see Ping pong vaccination).

We have seen that, with a little extra information (such as a test result), we can work out a more accurate probability and so have better information upon which to make decisions. In practice, there are many different kinds of information that we can use to improve our estimate of the real probability. There are symptoms, such as loss of taste or smell, that are quite specific to the virus. Others, like a cough, are common in people with the virus but also in people with flu. There are also factors that can make a person more likely to have the virus in the first place, such as close contact with an infected relative. So, instead of just inferring the probability of having the virus from one piece of information, like the test result, we can consider lots of interconnected data, each with its own prior probability. This is where computers come in: to do all the calculations for us.
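One simple way to combine several pieces of evidence is to apply Bayes' theorem repeatedly, feeding each posterior back in as the next prior (this sketch assumes the pieces of evidence are independent, which real symptoms often are not). Only the test numbers come from the article; the likelihoods for loss of smell below are invented purely for illustration.

```python
def update(prior, p_evidence_if_virus, p_evidence_if_no_virus):
    # One application of Bayes' theorem for a single piece of evidence.
    numerator = p_evidence_if_virus * prior
    return numerator / (numerator + p_evidence_if_no_virus * (1 - prior))

p = 1 / 200               # prior: 1 in 200 of the population infected
p = update(p, 1.0, 0.02)  # positive test (numbers from the article)
p = update(p, 0.6, 0.01)  # loss of smell (hypothetical likelihoods)
print(round(p, 2))        # each observation sharpens the estimate
```

After the test alone the probability is about 20 per cent, as in the article; adding a second, fairly virus-specific symptom pushes it much higher.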

We first need to tell the computer about what causes what. A convenient way to do this is to draw a diagram of the connections and probabilities called a 'Bayesian network' (see A Simple Bayesian Network). Once a computer has been given the Bayesian network, it can not only work out more accurate probabilities, but it can also use them to start making decisions for us. This is where all those applications come in. Deciding whether a suspect's DNA was at a crime scene, for example, needs the same kind of reasoning as deciding whether you have the virus.
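A minimal sketch of that idea: the hypothetical three-node network below (close contact → virus → test result) answers a query by brute-force enumeration, the same calculation a Bayesian network tool performs far more efficiently. Only the test probabilities come from the article's example; the contact numbers are invented.

```python
# Hypothetical three-node Bayesian network: Contact -> Virus -> Test.
P_contact = {True: 0.1, False: 0.9}     # invented prior for close contact
P_virus = {True: 0.04, False: 0.001}    # P(virus | contact), invented
P_positive = {True: 1.0, False: 0.02}   # P(positive | virus), from the article

def joint(contact, virus, positive):
    # Probability of one complete assignment of all three variables.
    p = P_contact[contact]
    p *= P_virus[contact] if virus else 1 - P_virus[contact]
    p *= P_positive[virus] if positive else 1 - P_positive[virus]
    return p

# P(virus | positive test): sum the joint over the unobserved variable.
num = sum(joint(c, True, True) for c in (True, False))
den = sum(joint(c, v, True) for c in (True, False) for v in (True, False))
print(round(num / den, 3))
```

The diagram tells the computer which probabilities depend on which; once it is encoded, the same enumeration answers any query, such as the probability of infection given both a positive test and a known contact.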

Obviously, it is more complex to apply Bayes' theorem in realistic situations and, until quite recently, even the fastest computers weren't able to do the calculations. However, breakthroughs by computer scientists developing new algorithms mean that very complex Bayesian networks, with lots of inter-connected causes, can now be computed efficiently. Because of this, Bayesian networks can now be applied to a multitude of important problems that were previously impossible to solve. And that is why, perhaps surprisingly, the ideas of Thomas Bayes, from over 250 years ago, are showing us how to build machines that make smarter decisions when things are uncertain.

The latest issue of the CS4FN magazine on 'Smart Health: Decisions, Decisions, Decisions' can be viewed and downloaded here.

Link: http://www.cs4fn.org/magazine/magazine.html