Matt Wong

NEXT TIME you're in the wine store trying to decide what to buy, and you see one of those tags boasting that a wine scored 92 points or won a gold medal—you know what? Ignore it. The system of rating wines by blind tastings is bogus. Here's why:

1. It's all subjective.

Yep, despite the claims of objectivity from the tasting industry (the magazines, star reviewers, bloggers, anyone with a palate and a keyboard), it all comes down to personal preference. Sure, they have experience and can eloquently express what's happening with the wine, but it's still just a personal opinion. I mean, just because Ann Coulter has read more policy papers than you doesn't mean you're going to give up your liberal stance on gay marriage and immigration, does it?

2. Wine critics are human.

They don't live in a hermetically sealed world where all they do is taste and spit into buckets. They get tired, they have bad days, they get hungry, or their palates fatigue during long tastings—all of which can affect their judgment. They're like you and me: fallible, emotional, and thinking of the weekend, or how the hell they're going to pay their bills.

3. The wines start to look the same.

When sifting through dozens of wines in one sitting, the bigger and heavier ones tend to shine, while more delicate ones are overlooked. Winemakers note this and then produce styles they think will score well, so that a trip down the wine aisle begins to look as interesting as a Beaverton subdivision.

4. Experts are inconsistent.

Numerous studies have shown they can't agree on a wine's characteristics. When judges at a wine fair were served a series of wines under blind-tasting conditions, they gave wildly different scores to pours that had been (unbeknownst to them) drawn from the same bottle. Another study demonstrated that there's no consistency behind wines that received awards—the winning medals may as well be handed out at random.

5. They ignore context.

Where, when, and with whom is just as important as the vintage. A cheap Chianti enjoyed on a sun-kissed terrace in Tuscany as you gaze into your significant other's eyes won't be as agreeable on a damp February night in Gresham after a fight with the significant other. (We recommend Night Train for that.)

6. They inflate prices.

Those blockbuster scores add big bucks to the price of a bottle—as much as a 7 percent bump per point in the case of a good review from a revered critic like Robert Parker.

7. The scores keep getting bigger.

Scores in the 90s used to be rare, but now more and more wines are being granted a perfect 100. So how do you score a wine that comes along that beats one with top marks? You could turn the scores up to a Spinal Tap-like 11, but that only defers the problem (as does the 1,000-point system that was recently developed).

8. The system is (allegedly) corrupt.

There are accusations that in certain magazines the relationship between the ads and the wine scores is a little too cozy....

9. They're prescriptive.

Sometimes the best experience with wine is when it confounds expectations, when it doesn't fall into neat categories, when it has nothing to do with what a wine is supposed to be like. Idiosyncrasy is the enemy of competitive rankings.

So if score systems and blind tastings don't work, what does? As prosaic and old-fashioned as it may sound, a personal relationship with your local bottle shop (or perhaps supermarket, if it has good wine stewards) will open a world of recommendations. They should be able to point you toward regions and varietals you might not have heard of but that offer value for money, wines that fit the mood or occasion (such as knowing when that $10 Portuguese red will suffice or when you need to upgrade), and bottles you might not have tried but may like. And if you hear the words "points" or "award winning," go find a different mentor.