Sunday, February 27, 2011
A More Sour Sourdough
I like sourdough. I make it often and I am generally content with my results. This time, I wanted to go for something a little more... assertive. I wanted my sourdough to be SOUR.
The amazing thing about sourdough is that its flavor profile is so deep while it has the fewest ingredients of any bread: flour, water, salt. The yeast is wild and, with the exception of a little pineapple juice that I needed to start the culture, the culture is composed entirely of flour, water, and the microorganisms living in it. Even more interesting are the many variables that come into play in determining the sourness of a loaf: the incorporation of whole wheat or rye flour in both the starter and the final dough; the ratio of starter or barm to dough; the hydration of the starter; the temperature of fermentation for the starter and the dough; whether the loaves are retarded while proofing; the length of the bulk fermentation...
Here is the problem in a nutshell: I need yeast to leaven the bread. Yeast like warm environments and, I believe, prefer wet environments (equal parts flour and water). I also need acetic bacteria. Acetic bacteria make the loaf sour. Acetic bacteria like cool, dense environments (think 2 parts flour to 1 part water). Many sourdough recipes call for the creation of a barm. The barm takes a 100% hydration starter (equal parts flour and water) and decreases its hydration. The barm then ferments in a cool environment overnight and, presumably, adds some acetic bacteria to the recipe. The yeast go dormant in cold environments, so presumably the barm allows a baker to keep a good amount of active yeast on hand while also helping the baker add some flavor depth.
My latest approach produced a complex flavor profile, but lacked the sourness I was after. I modeled my recipe on Norwich Sourdough from Susan's Wild Yeast Blog.
The recipe broke down as follows (a quick hydration check follows the steps):
120g Rye flour
600g Bread flour
200g All-purpose flour
300g 100% hydration sourdough starter, ripe
200g 50% hydration sourdough starter, after an unfed, 24 hour cool ferment (my stiff starter had 10% rye as well)
560g Water, 76°F
23g Salt
Mix everything but the salt; 30 minute autolyse
Machine mix for 3-4 minutes
Cool bulk ferment for 4-4.5 hours, with the dough's internal temperature at 65°F; stretch and fold after the first and second hours
Shape into five 400g batards; proof for 2.5 hours at room temperature
Score loaves (a battle unto itself)
Bake with steam at 475°F for 5 minutes, then drop the heat to 450°F. Total baking time with steam: 12 minutes.
Rotate loaves, continue baking for 20 minutes.
Cool for at least an hour, preferably several hours.
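Just to sanity-check the formula, here is a quick bit of back-of-the-envelope arithmetic (a rough Python sketch of my own, not part of the Norwich Sourdough recipe; the flour/water split inside each starter is inferred from its stated hydration):

```python
# Quick baker's-percentage check for the formula above (my own arithmetic,
# not part of the Norwich Sourdough recipe itself).

flour_direct = 120 + 600 + 200          # rye + bread + all-purpose, grams
water_direct = 560
salt = 23

# 300g of starter at 100% hydration = equal parts flour and water.
liquid_starter_flour = 300 / 2.0        # 150g
liquid_starter_water = 300 / 2.0        # 150g

# 200g of starter at 50% hydration = 2 parts flour to 1 part water.
stiff_starter_flour = 200 * 2 / 3.0     # ~133g
stiff_starter_water = 200 * 1 / 3.0     # ~67g

total_flour = flour_direct + liquid_starter_flour + stiff_starter_flour
total_water = water_direct + liquid_starter_water + stiff_starter_water
total_dough = flour_direct + water_direct + salt + 300 + 200

print("total flour: %.0fg" % total_flour)                                # ~1203g
print("total water: %.0fg" % total_water)                                # ~777g
print("overall hydration: %.1f%%" % (100 * total_water / total_flour))   # ~64.5%
print("salt: %.1f%%" % (100 * salt / total_flour))                       # ~1.9%
print("total dough: %dg" % total_dough)                                  # 2003g
```

It comes out to roughly 64.5% hydration and just under 2% salt, and the 2003g total squares with the five 400g batards.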
What I would do differently next time: retard the loaves. There is no way around this step. Also, I would proof at a lower temperature. I had to proof at room temperature because it was getting late. A really sour sourdough is clearly an all day process, and you don't even get the loaves the same day! Oh well. Next weekend I will give it another go.
Wednesday, February 23, 2011
Last Verse On G-Side's "No Radio"
"No Radio" is not the best song on the album. It might not even be the second best. I don't really care. The last verse on the song has been slaying me since I first got the album. The song has this heavy low end, distorted and slinky. The first two verses are thuggish and traffic in the violent, misogynist, and materialistic noise that has come to define the genre. I make an excuse for a lot of these artists (admittedly, not all): this is an art and their fans have come to expect certain things. Violent and materialistic rap moves units. Still, it's tough to make excuses for some of the stuff that, say, Rick Ross raps about, especially since he was never "trapping" but instead held a 9 to 5 as a prison security guard. Rick makes it all the harder by failing to show even a hint of self-awareness. At least give me some Jay-Z-style interview where you denounce a lot of this language or explain that your music is an art and you don't literally mean what you say. Something.
The last verse on "No Radio" rounds out the song and gives me that hint of self-awareness. Told from the point of view of a would-be mugger, the verse is a refreshing dose of humanity in an otherwise stone-faced genre. The crime is a desperate and sad last resort in an otherwise broken life. I am not even going to get into the social statement. Shit, at least it's a statement at all.
No Radio
This May Be The Answer
Sourdough is fun and delicious. It is not an especially simple bread, although I would hardly call it difficult. The main factor is whether the sourdough starter is active enough. Mine just sat in the fridge for a bit too long and I've spent the last few weeks feeding it twice a day to get it up to strength. My sourdough is good but not great. It is never sour enough and that really frustrates me. Well, I have a solution.
1. I will use two different sourdough starters. This comment on The Fresh Loaf addresses the double starter approach. I will have one highly active starter to leaven the bread and one cool-fermented starter to help with the sour taste.
2. The sour starter will include 10% whole wheat flour. This will help accentuate its sourness.
3. The sour starter will ferment in the refrigerator at 50% hydration for 36 hours before use.
4. The active starter will be returned to 100% hydration (it is currently at 50%). I'm planning on twice-a-day feedings for the next 36 hours as I increase its hydration (see the quick feeding math after this list).
5. I will degas the dough after the first bulk ferment and give it a second bulk ferment (which should take longer) before proofing.
6. I will keep the dough at or below 72 degrees (or try my best to).
7. Unfortunately, I do not have time to let the dough proof overnight in the fridge. I am hoping that the all-day baking process will be sufficient.
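For what it's worth, here is a rough sketch of how quickly repeated feedings pull a stiff starter's hydration up. The 1:1:1 feeding ratio (keep 100g of starter, add 100g each of flour and water) is just an assumption for illustration, not a rule:

```python
# Rough sketch: how a 50% hydration starter drifts toward 100% hydration
# under repeated 1:1:1 feedings (keep 100g of old starter, add 100g flour
# and 100g water each time). The ratio is an assumption for illustration.

keep = 100.0   # grams of old starter carried into each feeding
feed = 100.0   # grams each of fresh flour and water per feeding

hydration = 0.50  # water / flour in the starter right now
for feeding in range(1, 5):
    flour_kept = keep / (1 + hydration)   # flour carried over from the old starter
    water_kept = keep - flour_kept        # water carried over
    hydration = (water_kept + feed) / (flour_kept + feed)
    print("after feeding %d: %.0f%% hydration" % (feeding, 100 * hydration))

# after feeding 1: 80% hydration
# after feeding 2: 93% hydration
# after feeding 3: 98% hydration
# after feeding 4: 99% hydration
```

Three or four twice-a-day feedings get the starter within a couple of points of full hydration, which fits the 36-hour window.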
I also plan to do a 12 hour cold autolyse for the dough. What can I say? I like air holes.
Tuesday, February 22, 2011
Is There Discrimination Against Women In The Sciences?
Yes, clearly. This Slate piece is worth reading and provides a nice and brief digest of an ongoing debate.
A quick aside: I used to read Slate every day for several years until I began feeling like their headlines were just a contrarian take on whatever was dominating the news that week. I'm giving them another shot, but they are on notice.
Thursday, February 17, 2011
Another day, another helmet/concussion article
This one addresses the prohibition against helmets in women's lacrosse. While the two sides of the argument are familiar (helmets promote more violent play by giving a false sense of protection vs. helmets protect against the violent play already inherent in the sport), the facts are a little more complicated.
For instance, this excerpt only muddies the issue:
"According to research by Nationwide Children’s Hospital in Columbus, Ohio, not only does the sport have the third-highest rate of concussion among female scholastic sports (behind soccer and basketball), but its in-game rate is only about 15 percent less than the rougher male version."
The players themselves admit that they would play more violently if they had an extra layer of protection. With the women's concussion rate only marginally lower than the men's, would that violent behavior rise, or would the extra protection lower the concussion rate further? As the article rightly points out, once you add a helmet, the play may fundamentally change and removing the helmet may become impossible. Removing helmets from football next season would cause serious and widespread injuries.
Of course the biggest proponents of requiring helmets are associated with Nocsae. The above article does a terrible job of explaining their position. It was only a few months ago that the NY Times published this:
The fact that helmets are held to no standard regarding concussions surprised almost every one of dozens of people interviewed for this article, from coaches and parents to doctors and league officials. Even one member of the Nocsae board, Grant Teaff — who represents the American Football Coaches Association — said he was unaware of it.
“Obviously if you’re protecting against skull fracture, you’re protecting against any type of concussion,” Teaff said, incorrectly.
Nocsae receives no oversight from any independent agency, such as the Consumer Product Safety Commission or the Occupational Safety and Health Administration. Its 16-member board features five representatives of the helmet industry, six volunteer doctors, two athletic trainers, two equipment managers and one coach.
Nocsae’s annual budget of about $1.7 million is funded mostly by sporting-goods manufacturers whose products bear the Nocsae seal of approval. The largest share of that comes from football helmet makers and reconditioners.
Oh well. Looks like all of our athletes will be brain dead, including our child athletes. Let's just hope they learn that they can't go pro before they stop being able to learn.
Wednesday, February 16, 2011
Metal Map
Addendum to "Social Networks As Filters"
Well, sort of. This well-circulated NY Times article about J.C. Penney's intentional manipulation of the Google search algorithm is interesting in that it illuminates the limitations of inbound links as "votes" for a website's worth. That said, any balanced algorithm will place weight on links and tweets and likes, etc.
Will the internet's democratization of information sharing follow the same trajectory as our country's democracy? I don't know, but it is clearly possible. If there are known components in search algorithms that can be manipulated, like links, then there is massive incentive for larger companies to manufacture, buy, or create those links. If search rankings are dependent on resources, we could see the internet representing the voices of a very few and very powerful companies. There are plenty of preventative mechanisms in place and I am not predicting the end of search engine reliability, if such a thing could ever be achieved, but this is an interesting dance.
Wednesday, February 9, 2011
Negative 6
The low tonight is negative six degrees, excluding windchill. It is around this time that I wonder why I live in Chicago. Frankly, I live in Chicago for more reasons than I could quickly list, many of them compelling and at least one (career) nearly demands I stay here, at least for the foreseeable future. Unfortunately, my commitment is routinely tested.
Tonight I realized that my toes were frigid and I couldn't remember the last time they were warm. Do they even make slippers for sub-zero temperatures?
Monday, February 7, 2011
Social Networks as filters
A few weeks ago, I found myself in the position of having to explain why the company I work for should register with various social networking sites. The question was clear enough: aren't these sites just for killing time and keeping up with friends? The answer was a little more complicated. There are a variety of reasons why a company should join a social networking site. Some of those reasons will be specific to the company's business model, but at least one of them is universal: social networking sites filter out third-party influence from individuals' votes of interest in a website or online feature. You trust the link from your friend more than you trust the link from Nike. Let's get nerdy on this one.
A company must be visible and offer value. If no one knows I exist, I won't get customers, and if the customers who do see me don't want my product, I might as well not exist. Online, visibility is very much a product of search engine rankings ("rankings"). The following data is pretty much old news, but it drives home the point that visibility equals income. This is all the more true when a company can convert a visitor into a customer online (think Amazon.com).
Google SERP Click Through Rates – The Raw Numbers
Rank / # Click Throughs / % of Total / Change vs. Previous Rank / Change vs. Rank #1
Total: 19,434,540 (100%)
1 8,220,278 42.30% n/a n/a
2 2,316,738 11.92% -71.82% -71.82%
3 1,640,751 8.44% -29.46% -80.04%
4 1,171,642 6.03% -28.59% -85.75%
5 943,667 4.86% -19.46% -88.52%
6 774,718 3.99% -17.90% -90.58%
7 655,914 3.37% -15.34% -92.95%
8 579,196 2.98% -11.69% -92.95%
9 549,196 2.83% -5.18% -93.32%
10 577,325 2.97% -5.12% -92.98%
11 127,688 0.66% -77.88% -98.45%
12 108,555 0.66% -14.98% -98.68%
13 101,802 0.52% -6.22% -98.76%
14 94,221 0.48% -7.45% -98.85%
15 91,020 0.47% -3.40% -98.89%
16 75,006 0.39% -17.59% -99.09%
17 70,054 0.36% -6.60% -99.15%
18 65,832 0.34% -6.03% -99.20%
19 62,141 0.32% -5.61% -99.24%
20 58,382 0.30% -6.05% -99.29%
21 55,471 0.29% -4.99% -99.33%
31 23,041 0.12% -58.46% -99.72%
41 14,024 0.07% -39.13% -99.83%
(Source: http://www.redcardinal.ie/google/12-08-2006/clickthrough-analysis-of-aol-datatgz/)
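To put that table in perspective, summing the percentages straight from it shows how lopsided the distribution is (a quick sketch; the numbers are just the ones above):

```python
# Share of all recorded clicks that go to each of the top ten results,
# taken straight from the percentages in the table above (AOL data via
# redcardinal.ie).
top10_share = [42.30, 11.92, 8.44, 6.03, 4.86, 3.99, 3.37, 2.98, 2.83, 2.97]

print("top result alone: %.1f%% of clicks" % top10_share[0])
print("first page (top 10): %.1f%% of clicks" % sum(top10_share))              # ~89.7%
print("everything past rank 10: %.1f%% of clicks" % (100 - sum(top10_share)))  # ~10.3%
```

Roughly 90% of all clicks never leave the first page.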
Search engines have helped to democratize the web by placing a strong emphasis on providing users with the most relevant results to their search query, regardless of the results' origins. In a world without search engines, we would only know of the URLs that were shared between friends or those that were marketed to us. This would favor the largest and wealthiest companies. So, no search engines, no internet start-ups, etc.
With click through rates being as high as they are for the top ten search results, it is always in a company's best interest to be highly ranked for relevant queries. Hence the market for SEOs. What these click through rates have really done is spawn two simultaneous races. The first race is between companies, to see who can achieve the highest ranking. This is done through a large variety of methods that we can briefly touch on later. The important part here is that this has as much to do with making a better site as it does with trying to decode Google's algorithm (Bing's and Yahoo's market shares are negligible in comparison: http://www.seomoz.org/blog/top-10-things-the-microsoftyahoo-deal-change-for-seo). If a site can figure out how to get a high ranking without restructuring its website or creating new features for the user, then its return on investment ("ROI") will be higher. It is much easier to write a great title tag than it is to generate new and compelling content.
The second race is within the search engines themselves and, of course, between them. The search engines must always refine how they determine website rankings. If Google stops providing you with useful results, you will likely begin using a different search engine.
The interesting point is how these two races play off each other. Google refines the criteria for its rankings, companies then chase the new algorithm, and Google must further refine its rankings, lest they be co-opted by clever SEOs. If Yahoo! and Bing see Google slip, or see it slow to address an issue with its rankings, and they are able to address that issue first, they may be able to take some of Google's users away.
Let's use a quick hypothetical to illustrate how damning it would be if a search engine algorithm were decoded. If Nike ever figured out Google's algorithm, it would be very difficult to buy anything other than Nike products through Google. Nike would have every incentive to create additional satellite sites to occupy the top ten rankings. A relevant query would produce a variety of websites, but only one option. The result would be a serious blow to Google's market share.
So isn't this post about social networking sites? Yes, it is. Search ranking criteria have evolved over time. It started off basic: search engines wanted to see that your website used the searched terms. A lot. Keyword stuffing reigned supreme. Then we moved on to meta-data: meta tags, meta descriptions, meta keywords. This was all hidden data that web developers generated to describe their sites. It played a large role until it was abused; it is now a much smaller factor. What we have slowly moved toward is the importance of external links. External links, their sources, the anchor text in the link code, and things of this nature play a very large role in rankings now. Why? Because they represent votes, by people at large, of a site's trustworthiness. When I link to site X, I am saying that its content is worth reading and that I trust it. When I describe site X with text in the link, that is my interpretation of what the site is, which is more trustworthy than a web developer's meta description. Additionally, the more links that point to a site, the more trustworthy Google considers it.
Of course, when external links were first discovered to play a significant role, they were abused with linkfarm websites (think massive, useless directories). However, Google quickly remedied this by placing a greater emphasis on the link's source. A link from this blog is nowhere near as valuable as a link from the New York Times. The authority of the linking source plays a large role in determining the value of the link; it filters out third-party influence and lends its authority to the linked site.
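Here is a toy illustration of the difference between counting links and weighting them by source authority. The sites, the authority scores, and the simple additive scoring are all made up for the example; Google's actual algorithm is obviously far more involved:

```python
# Toy illustration: raw link counts vs. links weighted by source authority.
# The sites, the authority scores, and the additive scoring are invented for
# the example; this is not how Google actually computes anything.

links_to_site_a = [("nytimes.com", 0.95), ("some-university.edu", 0.80)]
links_to_site_b = [("linkfarm-%d.example" % i, 0.001) for i in range(500)]

def raw_count(links):
    return len(links)

def authority_weighted_score(links):
    return sum(authority for _, authority in links)

for name, links in [("site A", links_to_site_a), ("site B", links_to_site_b)]:
    print("%s: %d inbound links, weighted score %.2f"
          % (name, raw_count(links), authority_weighted_score(links)))

# site A: 2 inbound links, weighted score 1.75
# site B: 500 inbound links, weighted score 0.50
```

By raw count the link farm wins in a landslide; weighted by who is doing the linking, the two quality links still come out ahead.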
Social networking sites play a similar role. We can trace a link to its source because users must register with social networking sites. This means we know whether it's Nike or Nate who is advocating the newest Nike product. Search engines can trust these sources more because the source of the vote is transparent. Further, links and chatter from social networking sites are much harder to exploit in the first race (companies chasing the algorithm): even if a company gets wise to Google's algorithm, it cannot easily manufacture mass approval of a product.
Importantly, the value of a link, tweet, post, etc., increases with the amount of additional social networking support it receives. If I tweet something and it is retweeted 100 times, that tweet gains more influence than a tweet retweeted 5 times. Why? We know it is relevant; we are more confident it was not artificially manufactured; and the terms and information contained in that tweet are highly relevant to the website, which makes them a good indicator of what the site should be ranked for. All of this makes social signals a safer and easier input for search rankings. Social networks are filters in that they ensure the activity surrounding a website is genuine and reflective of popular opinion. Accordingly, every company must be active on social networks.
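The same weighting idea applies to social signals. Here is another made-up toy model, not anything a search engine publishes, just to show how amplification and account credibility might combine:

```python
# Toy model: the weight of a social mention grows with amplification
# (retweets) and the credibility of the account behind it. The formula and
# the numbers are invented purely to illustrate the filtering idea.
import math

def mention_weight(account_credibility, retweets):
    # Diminishing returns on amplification: each extra retweet adds a
    # little less, but 100 retweets still far outweigh 5.
    return account_credibility * (1 + math.log1p(retweets))

print("credible account, 100 retweets: %.2f" % mention_weight(0.9, 100))   # ~5.05
print("credible account, 5 retweets:   %.2f" % mention_weight(0.9, 5))     # ~2.51
print("throwaway account, 100 retweets: %.2f" % mention_weight(0.1, 100))  # ~0.56
```

A widely retweeted mention from a credible account carries far more weight than the same tweet with a handful of retweets, while a flood of activity around a throwaway account barely registers.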
And, just to back this up, here is a snippet from a recent NY Times article: "The move is part of a larger trend toward custom content. As news streams and Twitter feeds multiply on the Web, there is a flurry of new programs trying to help users cut through the noise. For instance, the iPad app Flipboard finds content from a user’s social networking feeds and displays it in a magazine-like layout of photos and text blocks that can be flipped through. AOL recently announced that it would release Editions, its own personalized iPad magazine application. The tagline is 'The magazine that reads you.'" (http://www.nytimes.com/2011/02/07/technology/07yahoo.html?_r=1&hpw)
One last note: whether this could be accomplished through online publishing is a different issue, but I would argue that information sharing will only become more personalized, and I question whether a site can obtain enough big links (NY Times) to beat out sites that obtain many more small links (blogs, Facebook posts).
Thursday, February 3, 2011
Pictures of bread
I haven't posted anything bread-related in a while, and it's been even longer since I posted a picture of any bread that I've made. Below are some pictures of somewhat recent baking excursions (admittedly not the most recent, but I have not been as good about photographing loaves as of late).
Sourdough with 36 hour cold autolyse
Baguettes with 36 hour cold autolyse
Italian Bread
Sourdough w/ 10% pumpernickel rye