Friday, June 25, 2010

A Brief Revisiting of Peak Oil


Although discussed previously, the recent explosion of the Deepwater Horizon rig and the resulting oil discharge into the Gulf of Mexico have some raising the issue of ‘Peak Oil’ as one of a hodgepodge of reasons to reduce society's general reliance on oil. Properly understood, ‘Peak Oil’ is the point where global oil production peaks, and some believe that point has already passed. Unfortunately, others have expanded the peaking of production to signify a lack of available supply: that global production cannot reach its previous high because the total remaining supply of oil is insufficient. That interpretation is not correct, for there are still large quantities of oil available.

Why can the statement be made that there are still large quantities of oil available? First, although most traditional existing wells are on the down-slope of their production curves, some undeveloped traditional sites remain, mostly located in Iraq and Russia. In fact, the prospective reserves in Western Iraq are thought to be especially large. Second, deepwater sites similar to Deepwater Horizon are still being explored, and as technology continues to advance, exploration of potential new sites will become more accurate and less expensive. Most of the best deepwater sites are thought to be located off the Atlantic coast of South America and in the unfortunately fast-melting Arctic. Third, untraditional sources of oil have been identified (oil shale, oil sands, etc.) with very large deposits in Venezuela, Canada and even in the United States (Rocky Mountains). So with all of these additional acquisition opportunities around the world, not just in the Middle East, why do so many believe the world has already entered the era of ‘Peak Oil’?

The simple answer is that these three types of resources have one major thing in common: oil extraction from them is more expensive than from current widespread traditional wells. Although many experts believe there are still large, globally meaningful deposits that can be accessed through traditional wells, the countries that control the land containing this oil have been reluctant to allow foreign companies to invest and conduct business under fair contracts. Thus, foreign companies do not view investment in these undeveloped areas as profitable enough to warrant their time. Deepwater drilling has caught on in recent years as a viable alternative to unsuccessfully haggling with uncooperative foreign governments. However, as witnessed in the Deepwater Horizon disaster, deepwater drilling can be troublesome, environmentally and economically, when things go wrong. Extraction from untraditional sources has been a hot topic in the petroleum industry for decades, and as traditional sources become less and less available, the first serious wells are being developed; mass production from these sources, however, has stalled due to high costs. Interestingly enough, despite the belief in ‘Peak Oil’, it seems more likely than not that all three of these sources will be tapped to a significant extent despite the significant cost obstacles, because of the sheer lack of alternatives and the necessity of oil in driving economic growth.

One of the chief elements of the cost obstacle is the classic volatility profile of oil prices. A general rule of thumb is that businesses love patterns and predictability. Some argue that the general volatility of oil prices will keep petroleum companies from fully investing in these alternative source types, resulting in global production values that fail to exceed current values. Basically, a company needs to know that it will make a profit on an investment a majority of the time, and price volatility clouds the ability to make that prediction. Another source of unpredictability is what big consumption countries like the United States, China and India will do regarding carbon emission policy and how any new policy will influence oil price and consumption.

With falling ‘easy’ supply sources, future oil price volatility is somewhat reduced in that prices will steadily increase with few, if any, significant drop periods. Basically, the future oil price curve will oscillate around a steadily increasing positive slope, with both the period and amplitude of the oscillations dropping as time progresses. The increase in oil price will eliminate a significant amount of trepidation about investment in more expensive sources. As long as a company can predict a profit to be made and has no viable alternative, effort will be applied to make that profit.
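The curve shape described above can be sketched as a toy numerical model. To be clear, every parameter below (base price, slope, oscillation amplitude, period, damping rate) is an arbitrary assumption chosen purely for illustration, not a forecast:

```python
import math

def projected_price(t, base=70.0, slope=4.0, amp0=25.0,
                    period0=6.0, damping=0.08):
    """Hypothetical oil price ($/barrel) t years out: a steady upward
    trend plus an oscillation whose amplitude decays and whose period
    shortens as 'easy' supply disappears. All parameters illustrative."""
    trend = base + slope * t                     # rising positive slope
    amplitude = amp0 * math.exp(-damping * t)    # shrinking swings
    period = period0 / (1.0 + 0.02 * t)          # slightly shorter cycles
    return trend + amplitude * math.sin(2 * math.pi * t / period)

for year in range(0, 21, 5):
    print(year, round(projected_price(year), 1))
```

The point of the sketch is only the shape: later prices sit higher, and the deviations around the trend line get smaller, which is what reduces investment trepidation.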

Look at it this way: suppose you normally climb a 10 ft. tree to collect apples that you sell at a net profit of $30 per apple. Unfortunately, almost all of the 10 ft. trees no longer have any apples available. However, there are 30 ft. trees with lots of apples, but the extra time and equipment required to collect apples from these higher trees reduce the net profit per apple to $15. Some may question the will to continue collecting apples at half the profit, but if the individual cannot devote time to a more profitable venture (the situation facing most oil companies) and people still want apples and can pay for them, then apple collection will still occur. Right now the individual does not have a viable option to sell oranges or some other fruit (biofuels) at $20 per unit, so apples it is.

Looking at possible environmental and energy regulations, some argue that changing dynamics in the United States will reduce demand for oil, thus reducing the total capacity for price increases and thereby reducing investment incentive and total future supply. While it is true that average motor vehicle gas mileage will increase in the future, with increasing design efficiencies of internal combustion engines along with further deployment of hybrid and 100% electric vehicles, some issues go unaddressed. While proponents are eager to mention these efficiency increases, they do not discuss the high probability of increasing fleet size. Motor vehicles will become more efficient, but there will also be more of them on the road, which, with the exception of 100% electric vehicles, will increase oil demand. So when including changes in fleet size, it is difficult to measure whether U.S. demand will actually drop and, if it does, by what amount. Increasing fleet size leading to increased oil demand is an especially large concern in developing countries such as China, India, Indonesia and Brazil that may not have the infrastructure to support electric vehicles and to develop cleaner power sources. Side note: some people cite China having 100-125 million electric bicycles as a good thing; funny that they fail to mention that those bicycles are being powered by coal.

A quick detour to explain the importance of market forces. The two chief reasons why some believe the era of ‘Peak Oil’ has begun are that official oil discoveries have been declining significantly over the last 40 years (most believe the discovery peak was reached in the late 60s to early 70s) and that prices rose significantly starting in the middle of 2005, only dropping due to the global recession and an inability of large consumers to buy oil. Many believe that this ‘catastrophic’ price increase was due to rising demand finally outpacing supply, thus ‘Peak Oil’ must be close. Unfortunately there may be a problem with this logic. As the average price of oil rose from approximately $41 a barrel to approximately $60 a barrel, global oil production was maintained at a consistent 85 million barrels a day.

A steady-state production level despite rising prices (almost a 50% increase) initially implies an inability to increase production, because basic economic theory anticipates some increase in supply to capture the profit potential of a price increase that is not accompanied by a demand decrease. However, when oil prices really jumped through the roof in late 2007 to mid 2008 (exceeding $130 a barrel), global production did slightly increase, which implies deliberate control over production rather than a simple response to basic economics. Without understanding the mindset of those in control of the principal rates of production (OPEC supplies somewhere in the upper 30s to lower 40s percent of global output), it is difficult to identify whether the size of the increase has any supply significance. Note that in this radical oil price time frame, oil production reached a new global peak when prices were around their own maximum. When prices began to fall due to the global recession, production dropped as well.
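The supply response (or lack of one) described above can be expressed as a rough arc elasticity of supply. The prices and the 85 million barrel/day figure come from the discussion above; the 86 million barrel/day figure for the 2007-2008 bump is purely an illustrative assumption standing in for "slightly increased":

```python
def supply_elasticity(p0, p1, q0, q1):
    """Arc (midpoint) price elasticity of supply:
    percent change in quantity divided by percent change in price."""
    pct_dq = (q1 - q0) / ((q1 + q0) / 2)
    pct_dp = (p1 - p0) / ((p1 + p0) / 2)
    return pct_dq / pct_dp

# 2005-2007: price ~$41 -> ~$60 a barrel, output steady at ~85 mb/d
print(supply_elasticity(41, 60, 85, 85))   # -> 0.0, no supply response

# 2007-2008: price ~$60 -> ~$130 a barrel; assume a small bump to
# ~86 mb/d (this production figure is an assumption, not a cited value)
print(supply_elasticity(60, 130, 85, 86))
```

Either way the elasticity comes out near zero, which is the puzzle the paragraph above identifies: a textbook supply curve would have responded far more strongly to a near-doubling of price.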

There are two possibilities regarding how existing tapped oil supplies changed in response to the radical price changes. First, oil supplies have become rather inelastic, making it difficult for supply to increase significantly in response to a price increase as basic economic theory anticipates. Second, increasing oil supply is more difficult than economic theory predicts due to an existing set of rules created by OPEC. There is reason to believe that the dramatic price shift was caused in part by oil speculation in the free market, fueled by the past free-flowing credit market in which anyone could get credit even without the necessary assets. The entry of purchasers that previously would have been unable to buy oil could easily spike prices. Such a price reaction, and even manipulation, may have been viewed as ‘short-term’ in the minds of the producers, thus the corresponding increase in production was slower than theoretically anticipated. Overall, a combination of both reasons probably makes up the rationale behind the production response, with greater weight on the inelasticity issue.

In the United States, continuing questions about the ability of possible buyers to acquire credit will reduce demand for oil, not from lack of want but from lack of ability to pay. However, regardless of the change in the U.S., it is almost impossible to envision a drop in global demand driven entirely by market forces, because of rapidly expanding demand in the developing world and a vast amount of credit still available for purchases. Therefore, despite higher production costs, petroleum companies should not be deterred by questions about demand.

Although it appears that the future marketplace will not drive a decrease in global demand, new global environmental regulations and new widespread electrical grid design fostered by government law and/or directives could. Administering a carbon tax or cap-and-trade system and/or streamlining grid construction would help lower oil demand by either adding cost to oil exploration and production or reducing the costs of oil alternatives, allowing them to be more competitive in the marketplace. Unfortunately, all of the large oil-consuming countries appear to be only slowly advancing toward adding additional costs to oil and other carbon sources, if moving at all; thus, as it stands, very little demand shift appears likely to come from the one element that could create a significant short-term shift.

Even though oil will still be required by the global community, and this requirement will continue to drive investment and exploration, historical evidence and economic theory predict that dealing with higher oil prices will be financially difficult for most societies, especially the United States. Historical evidence suggests a threshold price of $80-85 per barrel above which the probability of financial recession in the United States increases significantly. In fact, a majority of the time that oil has been at or above this price threshold, the economy has officially been in a recession. The two most important industries influenced by oil prices are transportation and food production (harvesting equipment for large factory farms). Therefore, if more expensive oil is almost guaranteed, then new strategies need to be executed to maintain quality of life in the United States and, probably later, in the developing world as well.

Unfortunately, the development and deployment of oil alternatives do not appear ready in the near future to play any significant substitution role for oil. The popular explanation for this deficiency is economics, but the real reason is scale. Even if algae and cellulose-based biofuels eventually churn out 10-20 million barrels a year by 2020, when global demand is 80-100 million barrels of oil a day that amount is rather meaningless regardless of how cheaply it can be produced. If ethanol-based fuel ever even got close to current demand for oil, a vast majority of the population would starve because of the land-use competition between biofuel and foodstuffs. With regard to electric and hybrid vehicles, despite heavy optimism from proponents, heavy deployment is unlikely. Even if such a revolution occurred, it remains to be seen whether the evolution of the electrical grid would keep pace with the new demand, or whether uncertainty and brownouts would become the norm. So without the ability to depend on the brute force of viable alternatives, the global community as a whole needs to decide on a course of action.
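The scale argument above is easy to check with back-of-the-envelope arithmetic, taking the midpoint of the 10-20 million barrel/year biofuel estimate and a demand figure within the 80-100 million barrel/day range cited:

```python
biofuel_per_year = 15e6       # barrels/yr, midpoint of the 10-20 million estimate
oil_demand_per_day = 85e6     # barrels/day, within the cited 80-100 million range

oil_demand_per_year = oil_demand_per_day * 365
share = biofuel_per_year / oil_demand_per_year
print(f"{share:.4%}")         # roughly 0.05% of annual oil demand
```

A year of projected biofuel output covers only a few hours of global oil consumption, which is why the production cost of biofuel is almost beside the point.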

For example, consider global trade and its dependence on shipping. A higher oil price increases the overall costs associated with shipping, requiring an increase in the price paid by the businesses importing the goods, a cost that will later be passed on to customers. Eventually the oil price will increase to a point where consumers are unable to purchase the good, which will end trade of that particular good in that particular region. Therefore, a new strategy must be implemented in the transportation sector to counteract the negative effects of the oil price increase. Shipping in general is rather efficient at maximizing capacity in order to save money (ships are loaded to near full capacity), so the change must come on the propulsion end. Reverting to ‘no active force’ (a.k.a. wind power) is not a suitable solution because of the transit times between departure and arrival, especially for perishable goods. The development of an electric motor powerful enough for a ship of any reasonable size seems unlikely and inherently inefficient.

Realistically, biofuels and nuclear power are the two most plausible alternatives for ship propulsion, and due to supply issues nuclear has the advantage. However, how safe would it be to have thousands of nuclear vessels transporting goods across the ocean on a daily basis? Such a fleet would be much larger and more exposed to danger than the ‘proof-of-concept’ nuclear submarines.

Overall, when discussing ‘Peak Oil’, it seems silly to speak of a lack of available oil or even a lack of drive to access oil in more expensive and/or difficult locations. Instead, the most ironic feature of ‘Peak Oil’ is that government and individual decision-making will most likely bring on its occurrence. Reducing demand through increasing efficiency of use and further developing alternatives are the only real ways to enter the ‘Peak Oil’ era in the short term. Maybe it would be more appropriate for ‘Peak Oil’ converts to talk about entering the ‘Expensive Oil’ era or the end of ‘Cheap Oil’. Sadly, the era of ‘Expensive Oil’ may be better than the era of ‘Peak Oil’, but not by much. If society is to effectively manage the era of ‘Expensive Oil’, significant changes need to be made with regard to travel and food production.

Monday, June 21, 2010

Brief Discussion of Precision Statistics

This blog has previously discussed the importance of using statistical analysis when making decisions and analyzing information. However, one point that was not made is that both the information used in the analysis and the methodology of the analysis need to focus on generating meaningful conclusions. Without meaningful conclusions the analysis itself is rather worthless and may even lead people to misunderstand the power of statistics.

A very simple example of the real descriptive power of statistics can be taken from analysis of possession percentage in soccer (football). Ball possession is often considered one of the more important statistics because it typically describes which team is controlling the flow of the game. However, that description of control is extremely broad. If possession were divided between the offensive and defensive halves, statistical analysis would be significantly more powerful at generating an understanding of the general behavior of the game. If the new possession statistic demonstrated a large amount of total possession, but most of it in the defensive half, then without watching any tape one could reasonably anticipate that such a team uses a long-ball based offense and pushes extra bodies back on defense. Interestingly, soccer has already demonstrated the power of deeper statistical analysis: stats are taken not only of shots attempted, but also of how many of those shots are on target, illustrating the overall effectiveness of the shots taken.
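A sketch of how such a split possession statistic might be computed. The match data, team labels and function name are entirely hypothetical, invented only to illustrate the idea:

```python
def possession_profile(events):
    """Summarize possession split by half of the pitch.

    events: iterable of (team, half, seconds) tuples, where half is
    'offensive' or 'defensive' from that team's perspective.
    """
    totals = {}
    for team, half, seconds in events:
        halves = totals.setdefault(team, {'offensive': 0, 'defensive': 0})
        halves[half] += seconds
    profile = {}
    for team, halves in totals.items():
        total = halves['offensive'] + halves['defensive']
        profile[team] = {
            'possession_s': total,
            'offensive_share': halves['offensive'] / total,
        }
    return profile

# Hypothetical possession log (seconds per spell of possession)
events = [('A', 'defensive', 1800), ('A', 'offensive', 600),
          ('B', 'defensive', 700), ('B', 'offensive', 900)]

for team, stats in sorted(possession_profile(events).items()):
    print(team, stats)
```

In this made-up match, team A "wins" raw possession 2400 s to 1600 s, yet only 25% of its possession is in the offensive half — exactly the long-ball, bodies-back profile that the raw possession percentage alone would hide.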

One may argue that using such a simplistic example does a disservice to the importance and power of precision statistics, but most people inherently shy away from statistics and would probably not appreciate and/or understand more elegant and complex examples. Therefore, when dealing with statistical neophytes it is important to introduce precision statistics through a medium that these individuals care about, thus motivating an attempt to understand, driven by a desire to better their own knowledge as a means to better enjoy the medium. Basically, killing two birds with one stone. Overall, not only is it important to include statistics in any deterministic analysis, but the representation of those statistics needs to be appropriate in both accuracy and depth so that they tell the important parts of the story.

Wednesday, June 16, 2010

Revisiting Quantum Mechanics and Consciousness

The recent evidence demonstrating that the brain in a relaxed wakeful state requires approximately 20 times more energy than when the conscious mind is focused on a single task provides an interesting angle for further exploring the principal action(s) behind consciousness. One initial avenue for explaining the additional energy requirement is the belief that, without a point of focus, the brain considers all possible outcomes rather than only those required for maintaining life. The consideration of all possible outcomes by a relaxing mind could make sense because the first duty of the primitive mind is survival. Survival odds increase when more information can be processed regarding a given situation, thus the brain would attempt to acquire as much information as possible by considering all possible scenarios. If that is the case, perhaps it provides some interesting insight back into theories relating quantum mechanical properties to consciousness.

Without a point of focus the brain runs in a default state that can be thought of as akin to various uncollapsed wave functions. Recall that in quantum mechanics a wave function is in a continuous state of flux and cannot be defined until it collapses, which usually occurs through observation. The very process of conscious thought could be considered the observation of a very complex wave function, which results in a set ‘focused’ pattern of firing, collapsing various smaller wave functions in the brain to create the characteristics of the original complex wave function. The collapse of all of these wave functions reduces the amount of energy required because the conscious mind has determined to focus on a single given scenario.

One potentially interesting experiment regarding the above supposition is to see whether different ‘noise’ environments produce different levels of energy expenditure. For example, suppose an individual is placed in an environment with a lot of foot traffic versus one with no foot traffic, and see whether and how energy consumption patterns change when the subject attempts to relax. The experiment is based on the presumption that the more possible scenarios that can be comprehended by the brain, the greater the number of subconscious wave functions left uncollapsed, increasing the total required energy expenditure. If energy consumption does change, then that result could be used as a very general element of support for quantum mechanical involvement in consciousness. However, if energy consumption does not change, then one of three conclusions can be drawn.

First, the subconscious brain already considers all possible scenarios for all possible situations through uncollapsed wave functions, regardless of whether a scenario is realistic for the given circumstances; thus the baseline energy consumption for a resting waking mind should not change. Second, wave functions as described above have little to nothing to do with energy consumption in the relaxed waking mind. Third, wave functions may have something to do with consciousness and energy consumption, but it is unclear how, and the suggested experiment does not help to determine this functionality. Regardless, the new energy consumption evidence does seem to suggest a role for quantum effects in consciousness and should be explored further.

Friday, June 4, 2010

Counteracting Possible Negative Effects of Anesthesia

For a number of years it was thought that a mind at rest utilized less energy than a mind focused on a particular task or multi-tasking. Such a thought seemed reasonable due to its apparent intuitiveness. However, this presumption does not appear to be correct. Although the experiments are only preliminary, neuronal imaging has provided evidence of a persistent level of activity when the conscious mind is not engaged in a task. In fact, this activity appears to be 20 times greater than when the mind is focused. Note that this neurological behavior is only applicable during periods of rest or anesthesia, not during periods of sleep.

A future blog post will discuss possible reasons behind this flood of activity during periods of rest, but one interesting issue that arises from this new understanding of energy distribution in the brain is how it affects preparation for surgery. Previously this blog discussed possible reasons why anesthesia may cause brain damage in young children and the elderly.

Could additional oxidative stress, brought on by a surplus energy demand due to the administration of anesthesia for an extended period of time (hours, given typical surgical lengths), be the primary reason for brain damage? If yes, then could a meal high in complex sugars (12-24 hours before surgery) reduce the probability of brain damage by reducing oxidative stress? The brain has its own specific store of glycogen, which has a unique slow-release property. A simple yet special pre-surgical diet for infants and young children may be a useful deterrent to anesthesia-derived brain damage. Bring on the mouse studies.

Wednesday, June 2, 2010

Why lay the blame on scientists?

For a considerable period of time there has been continuous criticism of scientists, most notably climate scientists, by environmentalists and others on the “Left” wing of the political spectrum regarding their inability to properly communicate the perils of climate change. Unfortunately, these critics have deluded themselves into believing that simply changing the semantics of the argument can significantly sway public opinion. This belief stems from the conclusion that the principal reason the public is not 90-95% in favor of new policy to reduce greenhouse gas emissions is a lack of understanding of both the realities and the dangers of climate change. It is highly unlikely that such a contention is true, because a number of easily understood and straightforward explanations of these two issues are available from a number of online sources. In fact, even this blog has produced such a document.

So if documentation that meets the desires of these critics is available, why do they continue to complain about a lack of scientific messaging from scientists? The principal reason for the criticism is that these critics are frustrated by the ratio of correct information to misinformation. Such frustration can definitely be understood, because both right-wing bloggers and the mainstream media either go out of their way to artificially hype or lie about information that does not support human-driven climate change, or offer equal weight to an issue that does not deserve such consideration.

However, to blame poor communication by climate scientists for this environment is silly. Changing the message by using clever analogies and simple plainspoken language is not going to significantly change the general environment perpetuated by right-wing bloggers and the “mainstream” media. The problem is that the public is accepting that there are two reasonable and logical opposing sides to the issue of climate change when there are not. Basically, a majority of the communication mediums have elected to present the options of 2 + 2 = 4 and 2 + 2 = 56 as a viable topic for debate. Offering a simpler, hopefully more easily understandable, analysis could help, but as long as another explanation exists the public will have to measure and judge the validity of each argument.

Regardless of how simply the pro human-driven climate change argument is presented, it will still involve scientific explanation, either in the form of analogy or straight science, thus it will require thought to understand. This thought requirement demands a level of understanding and effort, which is unfortunate when the opposing argument against human-driven climate change is ‘prove it beyond any doubt’ instead of ‘beyond a reasonable or scientific doubt’. Thus, according to the opposition, the pro human-driven climate change movement must prove that human-driven climate change is a scientific certainty. Interestingly enough, such a demand can be met by reasoning that the Greenhouse Effect is scientific fact and that a vast majority of the greenhouse molecules added to the atmosphere in the last 200 years are human-derived. Unfortunately, such an argument is heavily based in science, something most people do not want to wade through. Compound this opposing argument with the fact that major lifestyle changes at significant philosophical and short-term economic cost will be required when accepting and combating human-driven climate change, and one sees why human-driven climate change deniers have made so much headway at stemming positive steps to combat climate change.

Therefore, it may be wise to invest in a different strategy than trying to win the debate by simplifying an argument to battle an argument that is already as simple as it gets. The two most important questions to ask an individual during a debate are:

1. Why do you believe x?
2. What prevents you from believing y?

The primary goal of the first question is to engage the individual and discover what attributes drive his or her belief. For example, is the individual heavily influenced by upbringing, educational environment, logic, etc.? Once this information is acquired, arguments can be better tailored to attack flaws in the belief system, leading to a higher probability of success in dispelling incorrect beliefs.

The second question is standard scientific analysis: find out what the individual believes are the flaws of a presented argument. If the individual cites flaws that are incorrect and cannot accept that these ‘believed’ flaws are wrong, then there is no need to continue the conversation, because the individual is unwilling to change his/her beliefs. This is another flaw in the thinking of those who wish for scientists to simplify their explanations of human-driven climate change: they believe that everyone will be receptive to the argument once both sides are stripped down to nothing but logic and facts, and such is not the case.

Note that it is important to try to impress upon an individual the detriments of being incorrect and how being incorrect would negatively affect his/her life. Unfortunately, the focus of this influence needs to be on the short term if possible, due to the human psychological tendency toward instant gratification and discounting of distant consequences (the philosophy of ‘I’ll be dead, so who cares what happens that far into the future’). For example, one common argument made by environmentalists is to explain the ease and low cost of converting to a less fossil-fuel-dependent energy environment. Unfortunately, this argument is not as powerful as it should be, because even if switching costs less than previously anticipated, it still costs money, and if the rationale for the switch is not properly expressed then any cost will be considered wasted.

Overall, the problem of less-than-appropriate support for climate change policy is not caused by over-complicated scientific communication. Sure, a wider range of depth in explanation would be a useful tool, but it should be quite apparent that treating the issue of human-driven climate change as a debate and trying to ‘out-argue’ the other side is the wrong strategy, because the other side does not have a real argument. Instead of using a ‘fact shotgun’ approach to convince people, perhaps the better strategy is to ask what information people need to see to be convinced of the validity of human-driven climate change. Provide the necessary information to those who make reasonable requests and no longer be concerned with those who do not. Using this strategy, more precise effort can be applied to those who can be convinced, and one can avoid wasting effort on those who cannot or do not want to be convinced.