For instance, he writes about Bob Voulgaris, a basketball gambler:
"Bob’s money is on Bayes too. He does not literally apply Bayes’ theorem every time he makes a prediction. But his practice of testing statistical data in the context of hypotheses and beliefs derived from his basketball knowledge is very Bayesian, as is his comfort with accepting probabilistic answers to his questions."
But, judging from the description in the previous thirty pages, Voulgaris follows instinct, not fancy Bayesian math. Here, Silver seems to be using “Bayesian” not to mean the use of Bayes’s theorem but, rather, the general strategy of combining many different kinds of information.
January 26, 2013
"In some cases, [Nate] Silver tends to attribute successful reasoning to the use of Bayesian methods..."
"... without any evidence that those particular analyses were actually performed in Bayesian fashion."
38 comments:
Nate Silver is a popularizer, an effective communicator, and a pundit. The reason he isn't in Vegas raking in millions per week is because he's either too timid or plain not as good as his supporters think.
Seahawks v. Patriots in the Nate Silver Super Bowl.
Bayesian... at the Moon?
BRUTUS
Remember March, the ides of March remember:
Did not great Julius bleed for justice’ sake?
What villain touch’d his body, that did stab,
And not for justice? What, shall one of us
That struck the foremost man of all this world
But for supporting robbers, shall we now
Contaminate our fingers with base bribes,
And sell the mighty space of our large honours
For so much trash as may be grasped thus?
I had rather be a dog, and bay the moon,
Than such a Roman.
Did you ever hear someone who plays the lottery explain their "system"? If they win, what the hell can you say? You just go back to your drink and pretend to listen.
Silver is just a librul media hyped darling who has gotten lucky a couple times. He wasn't the only one to know that the repugs were fooling themselves in the 2012 election.
(I'll make a Bayesian prediction of a future Edoucher post: No one was fooled, Silver was wrong, the dems stole the election the Chicago way)
It would be interesting to read what the great Nate Silver calculates for the effects of the gun control push on the next election. I predict he would slant his priors to the left and fail.
Not to be critical of Silver, who has done some good analysis, but he needs to be careful of veering into punditry. This is particularly true when it comes to analysis of personal viewpoints or voting trends that can change based on new information.
Either you follow statistical methods or you don't. You can't be sort of Bayesian, just like you can't be sort of pregnant.
His methods have become... unsound.
I go with maximum entropy methods myself.
The real reason that too many published studies are false is not because lots of people are testing ridiculous things, which rarely happens in the top scientific journals; it’s because in any given year, drug companies and medical schools perform thousands of experiments. In any study, there is some small chance of a false positive; if you do a lot of experiments, you will eventually get a lot of false positive results (even putting aside self-deception, biases toward reporting positive results, and outright fraud)—as Silver himself actually explains two pages earlier.
And in any given year, thousands of pundits and analysts make predictions. With enough predictions, many people using faulty methods will make correct predictions. Given Bayesian logic and the assumption that true predictive talent is rare, the chance that Silver is good rather than lucky is under 50%.
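To put rough numbers on that argument, here is a minimal Bayes' theorem sketch in Python; the prior on talent and the two accuracy figures are illustrative assumptions, not numbers from the thread.

# Toy Bayes calculation for "is Silver good or just lucky?"
# All numbers below are illustrative assumptions, not figures from the thread.

p_talented = 0.001           # prior: true predictive talent is rare
p_correct_if_talented = 0.9  # chance a talented forecaster calls the election right
p_correct_if_lucky = 0.3     # chance an untalented pundit gets it right by luck

# P(talented | correct) via Bayes' theorem
numerator = p_correct_if_talented * p_talented
denominator = numerator + p_correct_if_lucky * (1 - p_talented)
posterior = numerator / denominator

print(f"P(talented | correct call) = {posterior:.3f}")  # ~0.003 with these inputs

With these particular inputs the posterior stays small, which is the commenter's point; with a less pessimistic prior it would not.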
His methods have become... unsound.
That's what they said at Baseball Prospectus when they had to rebuild Silver's PECOTA forecasting model.
A valid criticism of Nate Silver, which was apparent in his baseball days, is that his models are overcomplicated, with millions of moving parts that do nothing to increase the overall accuracy of the model. For example, his PECOTA system used dozens of interlinked spreadsheets but gave essentially no improvement over a simple 5/4/3 weighting of the last three years, plus regression to the mean.
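For concreteness, here is a minimal sketch of the simple baseline that comment describes, a 5/4/3 weighting of the last three seasons plus regression to the mean; the league average and the amount of regression are illustrative assumptions, not PECOTA's actual parameters.

# Minimal sketch of the simple baseline described above:
# a 5/4/3 weighting of the last three seasons plus regression to the mean.
# The league average and the amount of regression are illustrative assumptions.

def simple_projection(last3_years, league_avg=0.260, regress_weight=0.20):
    """last3_years: (most recent, year-2, year-3) batting averages."""
    w = (5, 4, 3)
    weighted = sum(wi * yi for wi, yi in zip(w, last3_years)) / sum(w)
    # Pull the weighted average part of the way back toward the league mean.
    return (1 - regress_weight) * weighted + regress_weight * league_avg

print(round(simple_projection((0.310, 0.285, 0.270)), 3))  # e.g. 0.285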
People who place too much emphasis on doing things scientifically sometimes have a bias to think that whatever they happen to be doing is therefore scientific.
Actually, the problem is that there's more than one definition of "Bayesian probability," and Silver doesn't make clear in "The Signal and the Noise" which one he means, or which one he believes the people he writes about are using. I will note that his comment about Bob Voulgaris "testing statistical data in the context of hypotheses and beliefs derived from his basketball knowledge" is indeed a perfectly good description of the "informative objective Bayesian" approach to statistics.
What a lot of people fail to realize is that the standard probability and statistics routinely taught in school is entirely based on repeated random experiments... that can't actually be performed in the overwhelming majority of situations that you care about. In these terms, you can't ask what the probability that it will rain today is, because today isn't going to happen 1,000,000 times for you to build a frequency distribution from. The various forms of Bayesian probability share an approach to updating probabilities based on new information. As the link above explains, where they differ is in their approach to assigning prior probabilities to plug into the equation.
We actually do know what the ideal prior probability to plug into the equation looks like—this is the result arising in Solomonoff induction—but relatively few people know about it, and it suffers from being only semi-computable, so it's exceedingly difficult to take advantage of in practice. Nevertheless, it serves as a kind of benchmark against which to compare other approaches to statistical inference.
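As a concrete illustration of the updating step described above, here is a single Bayesian update on the rain question; the prior and the two likelihoods are made-up numbers, chosen only to show the mechanics.

# A single Bayesian update, with made-up numbers: start from a prior
# probability of rain today and update on one piece of evidence (dark clouds).

p_rain = 0.20                 # prior, e.g. from the season's base rate
p_clouds_if_rain = 0.90       # likelihood of seeing these clouds on rainy days
p_clouds_if_dry = 0.25        # likelihood of seeing them on dry days

p_clouds = p_clouds_if_rain * p_rain + p_clouds_if_dry * (1 - p_rain)
p_rain_given_clouds = p_clouds_if_rain * p_rain / p_clouds

print(f"P(rain | clouds) = {p_rain_given_clouds:.2f}")  # 0.47 with these numbers

The updating step is the same for every school of Bayesianism; as the comment says, the schools differ in how they justify the 0.20 you start from.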
I thought being smart had advantages? Oh well, better for me.
rhhardin: I go with maximum entropy methods myself.
You, E.T. Jaynes, and me too. Among countless others these days.
My assumption was a much higher-turnout election than we actually got; I had often conceded that with a lower turnout, Obama had a better chance of winning than in a high-turnout race. It turns out the situation was what Silver predicted.
You didn't understand the article but google ought to help you out.
There's nothing particularly "fancy" about "Bayesian math" [sic]. The math is no more complicated than multiplying and dividing numbers or algebraic variables. Any literate high school student could apply it. The issue is about statistics, not mathematics.
Silver's explanation explicitly states how Voulgaris uses Bayesian thinking (as opposed to frequentist thinking). Silver explicitly rejects the notion that Voulgaris "follows instinct" by stating that he tests statistical data in the context of hypotheses.
"A very Bayesian way" stems from "beliefs derived from his baseball knowledge" and "his comfort with accepting probabilistic answers to his questions." The latter, in particular, seems to baffle the illiterate as does the former if they are so illiterate as to see quantifying beliefs as "fancy."
Silver's description is much more specific and informed than your mushy "general strategy of combining many different kinds of information."
The latter, in particular, seems to baffle the illiterate...
I think "probabilistic thinking" is the least baffling aspect of Bayesianism to the average person. It's implicit in any bet where people specify the odds.
But Silver seems to muddle the distinction between objective and subjective probabilities. He writes his column in the language of the former but discusses this basketball bettor in terms of the latter.
There is a saying in statistics - "All models are wrong, but some are useful."
Recently, Silver has come up with some useful models. Only time will tell how useful his future models will be.
Jason said...
His methods have become... unsound.
Of course we all recognize that line from the Nowsian Apocalypse: related
Top two comments in that video link are hysterical!
It helps if your side cheats.
Howard said...
Silver is just a librul media hyped darling who has gotten lucky a couple times. He wasn't the only one to know that the repugs were fooling themselves in the 2012 election.
(I'll make a Bayesian prediction of a future Edoucher post: No one was fooled, Silver was wrong, the dems stole the election the Chicago way)
Yes, they would have won honestly without all that vote fraud.
Right?
I take it you're Howard the Douche?
Reminds me of Rachel Ashwell throwing doilies on distressed furniture, and calling it "Shabby Chic!".
Chip S.:
Bayesianism is not well understood by the average person--or even the average journalist.
The media will report something like "80% of lung cancer patients were smokers"--and then lots of folks will think that means that smokers have an 80% chance of developing lung cancer.
Those kinds of Bayesian fallacies account for many of the breathless news reports about the latest nutritional supplement or fad diet.
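A quick sketch of why that inference is backwards; the 80% figure is the one in the comment, while the smoking rate and lung-cancer incidence below are rough illustrative assumptions, not real epidemiological data.

# Why "80% of lung cancer patients were smokers" does NOT mean
# "smokers have an 80% chance of lung cancer." The smoking rate and the
# lifetime incidence below are rough illustrative assumptions.

p_smoker_given_cancer = 0.80   # the figure reported in the news story
p_cancer = 0.06                # assumed lifetime incidence of lung cancer
p_smoker = 0.25                # assumed share of smokers in the population

# Bayes' theorem: P(cancer | smoker) = P(smoker | cancer) * P(cancer) / P(smoker)
p_cancer_given_smoker = p_smoker_given_cancer * p_cancer / p_smoker

print(f"P(cancer | smoker) = {p_cancer_given_smoker:.2f}")  # 0.19, nowhere near 0.80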
Writ Small:
Luck tends to run out.
Nate Silver didn't just make one or a few predictions, which could be attributed to luck.
He made 50 accurate predictions: he called every state race accurately, something that no other well-known pundit did. (Intrade.com came close, though.)
Now it's true that it doesn't take sophisticated math to realize that MA would vote for Obama, while Utah would vote for Romney. But he did call all 11 swing states accurately, which is quite an achievement given the narrowness of the winning margins in some of them.
sinz52, you misread my comment.
Yes, Rove and Dick "the Genius" Morris are better and more talented at picking these things...
You never cause me to update my priors, machine.
As the article correctly points out, Bayesian reasoning is mathematically elegant, but unless one has good estimates of the prior and conditional probabilities, it's simply a mathematically elegant way of doing "garbage in = garbage out."
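To illustrate that point, here is a small sketch showing how the same evidence yields wildly different posteriors when the prior is just a guess; the likelihoods used here are illustrative assumptions.

# Same evidence, different guessed priors: the posterior swings wildly,
# which is the "garbage in = garbage out" point. Likelihoods are illustrative.

def posterior(prior, p_evidence_if_true=0.9, p_evidence_if_false=0.1):
    num = p_evidence_if_true * prior
    return num / (num + p_evidence_if_false * (1 - prior))

for prior in (0.001, 0.05, 0.5):
    print(f"prior={prior:<6} posterior={posterior(prior):.3f}")
# prior=0.001  posterior=0.009
# prior=0.05   posterior=0.321
# prior=0.5    posterior=0.900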
Is this just criticism you engage as a Republican to make yourself feel better?
@ sinz52
Calling all the swing states correctly wasn't a great feat. Anyone who took the average of likely voter polls in the days prior to the election could have called them all, too.
That said, I don't know Nate's history. If he has a very good long term track record, then that increases the probability it wasn't just luck, of course.
I just find it ironic that if you apply Nate's own philosophy of prediction to the man himself, you have cause to be skeptical.
The left has put this guy on such a pedestal that it's hard to believe he can maintain that stance forever. From some of the fawning adulation, you'd think a new category of Nobel Prize should open, with him as the first recipient.
@Ritmo, math is math. If you don't understand the limits of Bayes' theorem, then there's not much help for you.
" He wasn't the only one to know that the repugs were fooling themselves in the 2012 election. "
1. Didn't find any of those "in the know" around here, did we.
2. Silver may not have been the only one who knew, but he was the only one who said so in the Times (every day), on the internet, and on national TV.
I suspect that Silver, like most gamblers, would rather be lucky than good.
"But he did call all 11 swing states accurately, which is quite an achievement given the narrowness of the winning margins in some of them."
-- The narrowness of the election doesn't matter, unless he called that too. Simple guessing will give you .00048828125; that is, of course, assuming these are independent events (if my early morning math is right). That sure seems low!
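A quick check of that arithmetic, assuming each of the 11 swing states is an independent 50/50 guess:

# Probability of guessing all 11 swing states correctly
# if each is an independent 50/50 coin flip.
p = 0.5 ** 11
print(p)  # 0.00048828125, matching the figure above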
But elections are not independent. If you assign VA to one of the candidates, then you should be fairly confident that Ohio and Florida, at least, are going to the same candidate. Silver made different assumptions than those who saw a closer or different outcome, yet his result pretty much fell in line with what most people said would happen if the electorate were depressed (among those I saw bring it up, at least). Did Silver ever estimate how many people would vote?
I think he was right simply because he made an assumption most other people didn't think was safe to make. Sort of like with the ACA decision: most people agreed, "Well, if it is a tax, it is constitutional. But no one said it was, and the government argued it was not, so that doesn't matter." Sometimes people are just sideswiped by an unexpected assumption becoming true. Those who gambled on the unsure thing get the payoff, though.
Now, taking that extremely unlikely way of calling the swing states, let's be honest. Wis. was not a swing state; N.C. was not a swing state. N.H.? Not a swing state.
Even Nev. and Iowa aren't really swing states. Colo. is only a theoretical swing state. Those were being competed in because Romney was expecting higher turnout; if he had known that this was going to be more of a heavy-base election, about turning out your consistent voters, then he would have focused on the actual swing states (Ohio, Virginia, Florida). Part of the reason Silver's estimates seem impressive is that we had an exceedingly broad range of states called swing states that had no business being in that category.
I only bet on elections after Titanic Thompson makes his bet.
Why, I say thar, I say thar, I bet I can throw a peanut across Times Square
//THE LEGENDARY GAMBLER WHO INSPIRED “GUYS AND DOLLS”
Alvin Clarence Thomas, a.k.a. Titanic Thompson, got his nickname because he sank everybody he gambled with. He also:
Bet he could drive a golf ball 500 yards—and won by teeing up beside a frozen lake.
Double-crossed mob king Arnold Rothstein (featured in Boardwalk Empire) in the high-stakes poker game that got Rothstein killed
Conned Al Capone by throwing a lemon over a building—a lemon weighted with buckshot
Won bets by claiming highway mileage signs were wrong—after he’d moved the signs the night before
Hosted the very first World Series of Poker
Flipped a playing card through the air, cutting a flower off its stem
Slept with movie star Jean Harlow and traded card tricks with Houdini
Made a loaded pistol disappear
Lost $1 million playing pool with Minnesota Fats, then joined forces with Fats and fleeced some of the country’s best sticks
Made putts by using a steel-centered golf ball—after he magnetized the steel cup-liner in the hole
Shot 66 right-handed, 65 left-handed, and told Ben Hogan, “Pro golf’s not for me. I couldn’t afford the pay cut.”
Threw a peanut across Times Square, tossed a watermelon onto a skyscraper’s roof, lifted a 10-foot boulder, leaped over a pool table, and shot six bullets through one bullet-sized hole//
Ya, that's the ticket. It's all math.
ChipS, you are one smart motherfucker. I would love to have you beside me at a craps table.