Try this Wine: 2015 Hess Select North Coast Cabernet Sauvignon


The 2015 Hess Select North Coast Cabernet Sauvignon is riding a strong commercial tailwind. It’s cabernet sauvignon, which is second in popularity only to chardonnay in America, and it’s from California, which dominates America’s wine production, store shelves and exports. If one’s focus were on making wine that would sell easily and in large numbers, one would make cabernet sauvignon and chardonnay in California.

If there’s a headwind against this wine, it’s that at a suggested retail of $19 it is too expensive for the mainstream (though I wouldn’t be surprised to find it for $12-15 in many grocery stores nationwide). Even at $12, it’s outside the 78% of total domestic wine sales that come in under $10 per bottle. However, the tide is turning: in 2017, purchases of bottles priced $15-19.99 saw double-digit growth. Things were never down for the Hess Select cabernet sauvignon, but they are nonetheless looking up.

With so many Americans buying California cabernet sauvignon, this instance of Try this Wine aims to be a twofer. First, for those who regularly buy grocery store cabernet sauvignon, I hope to draw attention to a particular wine that over-delivers. And second, for those who normally eschew under-$20 cabernet sauvignon, I hope to draw attention to a wine, available in serious wine stores, that demonstrates real quality can be had for a lower-than-expected price.

I visited Hess last December during an epic five days in Napa, not knowing much about the producer and walking in with a critically wrong assumption about them. Here’s a line from the post that I wrote about Hess:

“I had sort of assumed that because of its size, its quality and personality were going to be, um, uninspiring. After trying the samples, I knew the only ass in that assumption was me.”

Hess was awesome. A medium-sized producer, which by California standards is pretty large, they poured me nearly their entire range, beginning with several Select wines. The Select series is the winery’s entry point, and accounts for 65% of total Hess production, making it the company’s financial backbone. The series begins with the $12.99 Select chardonnay and tops out around the $20 mark. From there we slowly climbed the ladder of the full range until reaching the top: the $185 Lion cabernet sauvignon. There wasn’t a bad wine in the bunch, and I found several to be inspiring. More than anything, though, I was impressed with the Select chardonnay: I was shocked that anyone could make a chardonnay of that quality that could retail for $12.99. I’ve had many $25-30 chardonnays that are by all accounts no better than the Select.

I vividly remember asking Hess’ winemaker how they made such a good thirteen-dollar wine and learning that they have vineyards dedicated to the Select line that get the same attention as their more prestigious vineyards, and an assistant winemaker who focuses on the Select line, giving it as much attention as the head winemaker gives the more expensive wines. Since then, I’ve included the chardonnay in several tastings I’ve led and talked it up on many occasions.

This is why it was fun to revisit the Select line with this cabernet sauvignon, which I received as a sample. They produce 175,000 cases of the Select cabernet, which represents 35% of total Select series production. That’s serious quantity, so achieving equally serious quality is no small feat, and rare at this scale. This alone is reason enough to try this wine.

Tasting note: The fresh, ripe nose gives off aromas of cherry and blackberry compote, toasted oak, potting soil, graphite minerality and blood orange zest. The body is very polished and lush, balanced nicely by good acidity that keeps it from becoming cloying or heavy. Flavors are focused on dark, juicy cherry and boysenberry, though tobacco, wet dirt and lavender peek through. 88 points. Value: A.

Where to buy:

This is a widely distributed wine – available in all fifty states and twenty-three countries outside America – and is available at serious wine stores, grocery stores and online retailers, including wine.com. Below are a few places where it is available. As always, you can head over to wine-searcher.com and input your zip code and a radius to find nearby stores.

Chicago area: Sal’s Beverage World with locations in Addison, Elmhurst and Villa Park.

Denver: Argonaut Wine & Liquor, 760 E. Colfax Ave, Denver CO 80203. 303-831-7788.

Florida: Crown Wine & Spirits, nine locations on Florida’s Atlantic Coast.

Los Angeles: Wally’s Wine & Spirits, three locations.

Memphis: Buster’s Liquors, 191 South Highland, Memphis TN 38111. 901-458-0929.

New York: Garnet Wines & Liquors, 929 Lexington Ave, New York NY 10065. 212-772-3211.

The 90-Point Rut


Looking back at the wines I’ve had over the last several months, I’ve clearly been in a 90-point rut: prior to this past weekend, I’d given either 90 or 91 points to 10 of the last 16 wines I’d had. Of the remaining six, three received fewer than 90 points and three more than 91. Some of these 90- or 91-point wines came from wineries I greatly respect: Waters and Baer in Washington State, Cameron and Bergstrom in Oregon, Melville in California. A Gigondas from Kermit Lynch’s Domaine les Pallieres was supremely disappointing. This past weekend, though, I devoured standout wines from Washington’s Reynvaan and Australia’s Torbreck, and they’ve solidly pulled me out of the 90-point rut. But it has me thinking: was the rut in the glass or in my head?

My gut tells me that when I’m on the fence about a wine, I default to 90 points. If it’s good but too expensive, do I take the easy way out with a 90? If it’s solid but unremarkable, do I go straight to 90? One way or another, perhaps, I rationalize my way to 90 if the wine satisfies but doesn’t excite. It’s my comfort zone. It’s aesthetically pleasing. 90 is also safe in a crowd. Experienced winos can disagree with a 90 – maybe they’d go 89 or 91 – but they tend to respect it either way, mostly because it doesn’t excite them one way or the other. 90 points can deflect attention, and sometimes that’s what we want.

It’s difficult to find out how many wines receive these scores from amateur or even professional reviewers. Two of the largest retail wine inventories online, K&L and wine.com, let you see all their wines with 90+ points, but can’t show you only the 93-point wines, for example. Cellartracker.com doesn’t allow you to search by score, either. I went back to the report written by winecurmudgeon.com on the “winestream media” bias towards giving red wines higher scores than whites to see if they broke down their sample of over 50,000 professional reviews, and the answer was: kind of. Still, it’s illuminating. The higher the score, the lower the quantity:

[Chart from the Wine Curmudgeon report: the number of reviews falls as the score rises.]

So what does a 90-point review really mean? What should the consumer take from a 90-point review? After all, wine reviews are primarily for the consumer. These are complicated questions, but I think there are some simple ways to evaluate the 90-point score, the bridgmanite of the wine aisle: like the mineral, it is abundant yet rarely examined.

First, a 90-point score is low enough, and abundant enough, that within the context of wines reviewed by a particular source it’s unlikely to mark a wine of distinction. If you’re looking for a uniquely expressive wine, then you probably shouldn’t spend your dollars on a wine just because it received 90 points.

Second, place the wine in the context of its category. Napa cabernet sauvignons aren’t cheap. There are some over-achieving bottles that start around $25, but most of the good stuff starts around the $50 price point, which is essentially where you also find bottles that hit a level of profile consistency that transcends vintage variation. So, if the 90-point bottle in question is a $50 Napa cab, it’s probably a well-executed version of the prototypical Napa cab, lacking the particularities that would make it unique. Sauvignon blancs from New Zealand are another good example. These routinely start around $10 and few go above $25, and it’s hard to find a version at any price that reaches the mid 90s on the 100-point scale of any major reviewer. If you find a 90-point version for $12 you’ve probably found an over-achiever, and if you’re looking at a 90-point $20 bottle you’ve probably got an under-achiever.

You can also do this evaluation based on the grapes involved. For example, a Bordeaux(-style) blend. There are fantastic Bordeaux blends, from Bordeaux, for $20-25, as well as from other parts of the world. If you see a wine of this ilk for $55 with a 90-point score, then you should probably do a bit more research before deciding.

Third, a 90-point score means, like all scores, very little in the end. It comes down to what you like and what intrigues you.

Finally, refer back to the second paragraph of this post. 90 points is a safe place to go for a wine reviewer if, for whatever reason, they’re unsure or unmoved by the wine but recognize it meets the broad concept of “quality wine.” It’s my belief that wines that achieve more than their parts, wines whose profiles transcend the varietal or blend, earn the right to be considered exceptional. I can say with a high degree of confidence that no 90-point wine meets either of those conditions, and I say that both from a good amount of experience drinking wines and reading wine reviews. I cannot recall seeing adjectives like “special” or “brilliant” used in 90-point reviews. At the end of the day, unless the wine is of exceeding value, I can take or leave 90-point wines, though I’m still not sure whether they’re in my head or my glass.

Is there a “Winestream Media” Bias?


Credit: wine-searcher.com / © Bob McClenahan/Stephen Tanzer/Nathaniel Welch; W. Blake Gray

On October 24th, the guys at Wine Curmudgeon released a study on whether American wine magazines are biased in favor of red wine. The anecdotal notion that red wines receive higher scores than white wines in these publications has been around for years, but this study does a much deeper dive into the data than anything I’ve seen. Their conclusion is that the “winestream media” (great line) does indeed have a red wine bias because it gives far more 90+ point scores to red wines than whites. Unfortunately, though, the study’s methodology and data collection are not adequate to provide either (1) instructive data or (2) reliable analysis. As the introduction says, the data was provided on the condition of anonymity, which means that we know nothing about how it was collected, and further that it “was not originally collected with any goal of being a representative sample.” Therefore, the 14,885 white wine scores and 46,924 red wine scores lack context and relational relevance, and this hollows out any explanatory power the study could have had. Statisticians would call it a convenience sample, from which no general conclusions can be drawn. The study is, however, quite interesting for the questions one can raise from it, and I thank Wine Curmudgeon for that.

The central observation of the study, that more of the 90+ point wines are red than white, seems obvious to anyone who follows wine scores. This could be, as the study wonders, because we only know the scores that are reported, and publications are more likely to publish scores above 90. Further, “winemakers are likely to promote scores above 90.” This rationale seems plausible, though it doesn’t tell us whether, or why, there is a red/white bias behind the scores. The study also wonders if this means red wines “are inherently better than white wines.” This is the question that got me thinking, though not in the direction the question would likely send someone.

I’ve noticed that reds tend to score better than whites, too, and I thought I had generally scored reds higher than whites myself – until I looked at the wines I’ve reviewed on Cellartracker: 121 reds at an average score of 90.9 and 45 whites at an average score of 90.3. So, um? I’ve noticed that other wine drinkers, from the casual drinker to the expert, tend to show preferences for red wine as well, though there are exceptions. The best chardonnays from Burgundy and California (and increasingly Oregon), sauvignon blancs and semillons (and their blends) from Bordeaux, chenin blancs from parts of the Loire, and rieslings from Germany not only often receive scores well above 90, but frequently outscore many of the reds produced in the same regions; that is to say, within certain regions the best whites and the best reds both score well into the 90s. And because each region is often covered by the same critic, this observation would seem to suggest that something other than skin color plays a role in scoring.

This presents another question: can a critic who does not have a particular liking for a grape or blend give it a high score based on factors like quality and complexity despite that disinclination? I don’t believe they can. I’m just not a riesling person, no matter how hard I try. I’ve had well-aged, super-expensive riesling and I’ve had $18 bottles that I’m told are awesome values, and I can hardly tell the difference between the two. I like to think I have a good palate and am able to detect intricate nuances, but my taste buds don’t register riesling’s notes well enough to discern between a “drink now” bottle and a cellar selection. And I imagine this is a very sad thing, because riesling is supposed to be a wine collector’s mecca.

Another question, though a bit off topic: should the price-to-quality ratio, or “value,” be a variable in a wine’s score? I’ll use old-school Rioja as an example. If you read my post on my most memorable reds, you’ll notice that it includes a ~$40 leathery Lopez de Heredia that I really enjoyed and scored 92 points – the lowest score of any of the wines in that post, and the same score I gave on Cellartracker to a 2014 Barkan Classic Pinot Noir from Israel that sells in the US for $8.99 and requires no aging. I gave the Barkan an extra point on Cellartracker because of its supreme value; had I posted it using my Good Vitis system, it would’ve scored a 91 with an “A” value rating (my twin scoring method isn’t captured by Cellartracker’s analytics, so I gave it a 92 there). I would prefer that value be kept out of the numerical score and captured separately by another rating. For the record, I would have given the Heredia Tondonia a B value rating at $40.

The final question I’ll pose: do we need to relate one wine to the body of wines we’ve had in order to pass judgment? I’m not sure what the answer should be. If a wine can be judged in a bubble, solely on its merits, then we get pretty solid insight into how that wine performs in its own right. Judging it only in relation to other wines would be akin to saying we don’t like burritos because we don’t like Chipotle – logically weak, because it lets one part stand in for the whole body. My reference-point wine critic is Stephen Tanzer because my tastes seem pretty similar to his: when he scores a wine, I’m likely to reasonably agree with that score. This is different from someone like Robert Parker, whose lower-scored wines tend to be more to my liking than his higher-scored wines. Knowing how my tastes line up with the critics’ is helpful in deciding whether I want to purchase a particular wine. The key to understanding how I align with these reviewers is their consistency and their ability to tie their scores to common wine characteristics, which can only be done if we relate one wine to others. This jury of one is still undecided on the question.

The subject of wine reviewing and scoring is a contentious one on which the wine world will never find consensus, but as you can see, that doesn’t discourage us wine lovers from weighing the viewpoints. The debate rages on…