One day towards the end of 2000 my wife finally lost her financial patience with me. As she threw a pile of bills and receipts at me she screamed, “That’s it! I’m never doing the accounts or paying the bills again. You never collect receipts, you never write a memo in the checkbook, and your work expenses are impossible to understand!” She then stormed out of the room. I was in no doubt that she meant it, and she has remained free of the household accounting burden ever since. I, unfortunately, have not. It is true that until that day I had never balanced a checkbook in my life and habitually threw bank statements in the trash without even opening them – another contributing factor to my wife’s rant. In fact the only time I ever knew my bank balance was when the ATM refused to dispense cash. So it was with great trepidation that I began my fiscally responsible life. I figured that as I had designed and built large financial software systems I ought to be able to use a small one. So after some research I selected Quicken, as it had a large share of the market and a version for the Mac.

I have been using Quicken, on a Mac, for all our finances for four years now. I can monitor our financial activity at any level of detail. I can find any transactions in any account, and can create summary reports and graphs showing spending in various categories, for any time period. I can even calculate my net worth – a depressing activity! It takes between 15 and 30 minutes a week to maintain our accounts depending on the amount of financial activity that has occurred during the previous week.

In bringing our accounts under control I have come to realize that complete and accurate financial information is a powerful thing. We recently bought a house, and were able to calculate exactly what we could afford based on four years’ worth of data. Without Quicken we would have been guessing. Looking back over the past four years using Quicken I can now see that I have gone through several distinct phases. These phases are parts of an unconscious optimization process aimed at reducing the overall cost and effort involved in managing our finances. I did not plan this process; it just became obvious once I had access to complete and accurate data. The result is that the banks get less of my money and I spend less time managing it. From the banks’ perspective, financial management software like Quicken is a “bad thing” because customers who use it are less profitable. The following sections describe the optimization process that emerged and some of the pitfalls I encountered.

Stage 1 – Account Setup

It took about three months to get everything set up. First I had to find out what accounts we had – quite a few as it turned out. As I discovered each account I set it up in Quicken. The actual setup was trivial, but finding all the required information and getting my account “activated” so I could access it through Quicken took many phone calls to the customer support desks of each financial institution. Getting technical support from a bank is like getting medical advice from a hairstylist – it lacks authority and can be completely misleading. Sometimes it was not until I had been passed through several customer service representatives that I finally heard “Oh! We don’t support that feature for Quicken on a Mac!” Once an account is set up it takes at least one billing cycle to cross-check the paper statement with the Quicken-managed data to make sure they reconcile.

Quicken for Mac provides four levels of support for maintaining an account. These levels provide increasing control for the user and decreasing control for the bank. Most of the tradeoffs I made had to do with the cost and level of service I would accept from an institution for the benefits I would get. The four levels of support are described below.

Manual maintenance

This is the worst solution. If your account has more than a few transactions a month, entering them by hand will be a major headache. This is the only option for financial institutions that provide no support at all for Quicken. I avoid these institutions if I possibly can.

Web connect

This is the basic level of support provided by most financial institutions. It requires a visit to their website once a month to download a file containing all the transactions, which must then be imported into the correct account in Quicken. You can get data more frequently, but you have to remember exactly what dates you have already downloaded, since you can end up with duplicate transactions if you are not careful. This approach is fine if you have a limited number of accounts but rapidly becomes a pain if you have more than a few.

Direct connect

This is the best approach if you need to monitor an account on a daily basis. Transactions are downloaded whenever you request them and are automatically loaded into Quicken. I have this option for all my bank and credit card accounts, and I will not open a new account of this type unless I can get this level of service. I pay $3 a month to my current bank for this service, but my credit cards provide it free. It is interesting that credit card companies will provide this service free but banks will not. I believe the bank’s fee is a deterrent intended to drive customers to use the bank’s web-based service instead.

Direct connect with online bill payment

This provides all the benefits of direct connect but also allows payments to be made through Quicken. This feature means I hardly use checks at all anymore, and I can schedule payments to be made at any date in the future. I pay my current bank $6.95 a month for this service. Again I believe this fee is a deterrent and a way to make up for lost income, as I no longer use checks. Paying this fee is a compromise: as well as online bill payment through Quicken, I want access to a large network of ATMs so I can avoid out-of-network ATM fees.

Stage 2 – Categorization

Once my accounts were setup I started to classify transactions. Initially I used Quicken’s default categories but over time I started developing my own categories tailored to my needs. This meant I started to monitor the things that mattered to me. After a while I began to see patterns of spending where we could easily make savings. These patterns were only visible because of the customized categorization scheme.

Stage 3 – Cost Reduction

One of the obvious costs that could be reduced was banking fees. First we decided to only get cash from our bank’s ATMs. Those $2.00 fees for using other banks’ ATMs add up quickly. Then there was the $10 fee for automatic transfers from the savings account to the checking account at the end of the month. With the forward view available in Quicken I can anticipate these events and transfer money to avoid any fees. Cumulatively these savings easily cover the cost of the yearly software upgrade.
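
A rough back-of-the-envelope tally shows why. The usage figures and upgrade price below are illustrative assumptions (say, four out-of-network withdrawals a month now avoided, six end-of-month transfer fees avoided per year, and an upgrade around $60); only the $2 and $10 fees come from the text:

```python
# Back-of-the-envelope tally of avoided fees vs. the yearly upgrade cost.
# The per-fee amounts are from the text; all usage figures and the upgrade
# price are assumptions for illustration only.

atm_fee = 2.00                   # fee per out-of-network ATM withdrawal
transfer_fee = 10.00             # automatic savings-to-checking transfer fee
upgrade_cost = 60.00             # assumed price of a yearly Quicken upgrade

avoided_atm_per_month = 4        # assumed out-of-network withdrawals now avoided
avoided_transfers_per_year = 6   # assumed end-of-month transfers now anticipated

yearly_savings = (12 * avoided_atm_per_month * atm_fee
                  + avoided_transfers_per_year * transfer_fee)
print(f"Fees avoided per year: ${yearly_savings:.2f} vs upgrade ${upgrade_cost:.2f}")
# Fees avoided per year: $156.00 vs upgrade $60.00
```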

Stage 4 – Account Consolidation

The next big realization was that we had too many accounts. By consolidating accounts we could further reduce banking charges and the effort required to manage them. For example, one savings account was under the threshold balance for free banking. As a result we were being charged once a quarter for the privilege of leaving our money in the account. So I consolidated our savings into a few accounts. This put the account in question over the threshold and stopped the charges.

Stage 5 – Increasing Ease of Use

I have now reached the stage where the main driver for my financial decisions is the ease with which I can integrate new financial institutions into my system. There is enough choice out there that if an institution will not support my computing needs then I will take my business elsewhere. Even my primary checking account can be moved if necessary. I don’t rely on my bank for anything other than acting as the interface between my software and the rest of the financial world. In short, they have become a utility provider. The telephone companies provide dial tone; my bank provides the financial equivalent, and that’s it. I believe this is why banks make it so difficult to integrate their systems with personal financial software and why they try so hard to get you to use their proprietary online banking systems. They want to lock our data into their systems and prevent their services from becoming commoditized.

Quicken is without doubt the most valuable piece of software I use and has returned the small investment I made in it many times over. It is also a valuable weapon for managing the sharks of the personal finance world.

When Stephen Hawking said “The only thing nature abhors more than a vacuum is a naked singularity,” he was talking specifically about the laws of physics in relation to black holes. But his observation could equally apply to the body of human knowledge and the existence of unprecedented phenomena. The only thing that drives our desire for knowledge more than a complete absence of information is the presence of a single, undeniable but unprecedented piece of evidence. Such tantalizing evidence demands explanation.

Unique evidence is hard to handle precisely because it is unparalleled. Science relies on repeatable experiments that can be designed to test hypotheses. Part of the challenge with understanding unprecedented phenomena is learning why they are without parallel in the first place. Are they in some way special, or just rare and the first to be discovered?

Unprecedented phenomena do not fit easily into the general framework of science. They create a dilemma: Should existing theory be modified to handle a single piece of evidence? Or can the phenomenon be treated as a special exception? Or is a completely new theory required to explain the evidence?

Some of the greatest problems in science involve the explanation of unprecedented phenomena: the presence of life on earth but nowhere else in the solar system, the lack of a rigorous explanation for the existence of human consciousness, and the problem of why the Universe goes to the trouble of existing at all. No one is quite certain whether these phenomena are unique and, if they are, what their uniqueness might mean. One thing is certain – if we ever solve any of these problems the solution will have many consequences.

Understanding unprecedented phenomena like those described above is one of the best ways to rapidly expand the breadth of our knowledge. Such an understanding leads to a cascade of other dependent findings. The problems described above raise big questions that often attract religious explanations and more than their fair share of crackpots. In an effort to avoid theological arguments I tried to think of a case that does not deal with such big issues. Then I remembered the Oklo Fossil Reactors.

The Oklo Fossil Reactors

I first heard about the Oklo Fossil Reactors in the late 1980s while I was working at the British Geological Survey. A colleague of mine was developing calibration procedures for devices used to measure radioactivity. He was explaining various methods of calibration and mentioned that measuring the U-238/U-235 ratio was one way to calibrate a device, since the ratio was well known and constant throughout all natural sources in the solar system: U-238 (99.27%) and U-235 (0.72%). Except, he said, in the case of the uranium ore from the Oklo mine in Gabon, Africa. He then explained how in March 1972 some scientists at the Pierrelatte uranium enrichment plant in France were conducting mass spectrometry tests and noticed that the U-238/U-235 ratio in their samples was not what it should be. These measurements caused considerable consternation, and after dismissing all the obvious causes (mis-calibration, contamination, etc.) they traced the exceptional samples back to Africa. At first they suspected that someone had exploded a nuclear bomb in Africa without anyone noticing, as the only way to change the U-238/U-235 ratio is by nuclear reaction. This theory was dismissed when the unusual readings were found to be localized to ore from the Oklo mine. It gradually became apparent that natural nuclear fission reactions had occurred at various sites within the ore body and that these reactions had been responsible for changing the U-238/U-235 ratio. At least that’s how I remember the story.

Zone 15 is one of the remaining Oklo Fossil Reactors; it is accessible through a tunnel from the main mine workings.

The true Oklo detective story is considerably more elaborate. The main reason it was so difficult to make the leap to assuming a natural nuclear reaction is that such a reaction is no longer possible today.

When the earth was formed, natural uranium consisted of about 17% U-235. However, U-238 has a half-life of approximately 4.5 billion years and U-235 has a half-life of only 700 million years. Hence U-238 now accounts for 99.27% of all uranium and U-235 for only 0.72% (there are trace amounts of other isotopes). In today’s proportions U-235 can only support a nuclear reaction in the presence of a heavy water moderator, so it cannot react naturally, since heavy water does not occur naturally in any quantity. But, and this is the leap the scientists made, 1.7 billion years ago U-235 accounted for nearly 3% of uranium, and in that quantity it could have been moderated by ordinary water.
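
That leap is easy to check with the half-lives quoted above. The sketch below simply undoes the decay of each isotope over 1.7 billion years; the half-life values are the rounded figures from the text, and the calculation is illustrative rather than taken from any Oklo paper:

```python
# Undo 1.7 billion years of radioactive decay to recover the natural U-235
# fraction at the time the Oklo reactors operated. Half-lives are the
# rounded values quoted in the text; this is an illustrative check only.

HALF_LIFE_U238 = 4.5e9   # years
HALF_LIFE_U235 = 7.0e8   # years

u238_today = 99.27       # percent of natural uranium today
u235_today = 0.72        # percent of natural uranium today

def abundance_years_ago(abundance_today, half_life, years_ago):
    """N(past) = N(today) * 2**(t / half_life) -- decay run backwards."""
    return abundance_today * 2 ** (years_ago / half_life)

t = 1.7e9  # years before present
u238_then = abundance_years_ago(u238_today, HALF_LIFE_U238, t)
u235_then = abundance_years_ago(u235_today, HALF_LIFE_U235, t)

fraction = 100 * u235_then / (u235_then + u238_then)
print(f"U-235 fraction 1.7 billion years ago: {fraction:.1f}%")  # ~2.9%, i.e. nearly 3%
```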

The uranium ore body at Oklo is several kilometers long and contained pockets of very rich ore (up to 70% uranium as UO2). The ore was buried underground and in places was thick enough to create a critical mass of uranium. Water percolating down from the surface through cracks in the rock was present in sufficient quantity to moderate the reaction, and finally there were no natural poisons like boron that could prevent the reaction. The reaction generated heat, which turned the water to steam, which escaped from the reactor and reduced the amount of water, which reduced the intensity of the reaction, until the rocks cooled enough for the water to return – and so on – until the reactor reached a state of equilibrium. This occurred in 15 separate places within the ore body and continued for hundreds of thousands of years. In all, approximately 10 tons of uranium ore was depleted in this way.
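
The self-regulating cycle described above can be illustrated with a toy feedback loop. This is only a sketch with invented constants, not a model of the actual reactors:

```python
# Toy illustration of the Oklo feedback cycle: more water means more
# moderation and more heat, heat boils water away, the reaction slows,
# water seeps back, and the system settles towards equilibrium.
# All constants are invented for illustration.

water = 1.0            # water present in the reactor zone (arbitrary units)
inflow = 0.2           # water seeping in each time step
boil_per_power = 0.5   # water boiled off per unit of reactor power

for step in range(15):
    power = water      # crude assumption: reactor power scales with moderation
    water = max(water + inflow - boil_per_power * power, 0.0)
    print(f"step {step:2d}  power {power:.2f}  water {water:.2f}")

# power converges towards inflow / boil_per_power (0.4 here), an equilibrium
# analogous to the steady state the real reactors reached.
```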

Only one other natural reactor has been found, about 30 miles from Oklo.

One of the most fascinating consequences of the Oklo Fossil Reactors discovery is the chain of follow-on hypotheses, discoveries, and implications that has spread into different fields of research. Some of these are hard science, some are highly speculative hypotheses, and others are so extreme as to be worthy of note only because of their lunacy. The uniqueness of the Oklo phenomenon has provided a rich set of precedents that have fueled many claims.

Nuclear Waste Disposal

One of the most significant short-term consequences of the Oklo fossil reactors concerns the disposal and storage of nuclear waste. It is difficult to imagine a more unsuitable place to dispose of nuclear fission by-products than the environment at Oklo – buried with no containment vessel in an underground stream. And yet, remarkably, most of the fission by-products at Oklo have remained exactly where they were produced or have moved only a few meters in 1.7 billion years. This containment has been achieved without any of the elaborate precautions being designed for nuclear waste repositories like Yucca Mountain.

The periodic table showing the elements produced by the Oklo fossil reactors and the degree to which they were retained at the reactor site.

The Office of Civilian Radioactive Waste Management, behind the Yucca Mountain Project, uses the Oklo Fossil Reactors as a precedent for the long-term mobility of nuclear waste. Their argument: in the only natural example ever discovered, buried nuclear waste was safely contained, without man-made precautions, in a far worse environment than Yucca Mountain.

Cosmological Constants

It is generally assumed that the laws of nature are the same everywhere in the universe and have been so ever since the universe began. This fundamental assumption enables us to explain events billions of light years away and make predictions about the behavior of the early universe. But if it is not true, then many of our predictions about the universe could be called into question. Studies of light from distant quasars suggest the fine-structure constant, a number that determines the strength of interactions between charged particles and electromagnetic fields, could have changed over the history of the universe. If the fine-structure constant has changed, then other constants might have changed as well, and this could have major consequences for our understanding of the universe. Peter Moller, a Los Alamos physicist, collaborated on the paper Nuclear Data in Oklo and Time-Variability of Fundamental Coupling Constants. This paper used evidence of unusually low samarium-149 at Oklo to conclude that the fine-structure constant has not changed significantly between the time the fossil reactors were active and the present day. Only the existence of the Oklo reactors allowed this measurement to be made.

Planetary Magnetic Fields

Since 1990 J. Marvin Herndon, Ph.D., has pursued the idea that some planets in our solar system may have planetary-scale nuclear reactors at their cores. In the 1960s astronomers discovered that the planet Jupiter radiates about twice as much energy as it absorbs from the sun. Herndon believed the existing explanations for this phenomenon (gravity) were inadequate and suggested a natural nuclear reactor as the source. He cites Oklo as evidence that his theories are possible and has also suggested that the earth has a similar reactor at its core. By Herndon’s reckoning the earth’s reactor is 5 miles in diameter and is responsible for the earth’s magnetic field. He suggests that the natural shutting down and restarting of the reactor explains the frequent reversals in the Earth’s magnetic field, and that if the reactor should ever use up all its fuel it would shut down the earth’s magnetic field for good, with disastrous consequences for life on earth. [Deep-Earth Reactor: Nuclear Fission, Helium, and the Geomagnetic Field, by D. F. Hollenbach and J. M. Herndon, pdf]

The Earth’s magnetic field bent by the solar wind

Herndon’s theories have not been well received by other geophysicists. But before he is completely dismissed, it should be remembered that the more accepted theories are equally strange, and it took geologists 50 years to accept the theory of continental drift.

Current Biography Profile of J. Marvin Herndon

But Herndon found that instead of being the subject of discussion and debate, his work was systematically ignored. His grants were no longer renewed, and without that support, his position at the University of California at San Diego was eliminated.

Current Biography. November 2003

The Age of the Earth

Several people have suggested that the Oklo Fossil Reactors are hard evidence against the Creationist idea that the earth may only be a few thousand years old. Some defenders of creationism have actually tried to argue that scripture can be interpreted to account for the evidence of the Oklo Fossil Reactors. In the quote below C. L. Webster of the Geoscience Research Institute (“Integrating Science and Faith”) presents his desperate closing arguments and concludes that in the case of Oklo “We need to seek a better understanding … through the guidance of the Holy Spirit.” It appears that the only way for Christian Holy Scripture to account for the Oklo Fossil Reactors is with the support of a direct appeal to God through prayer! A circular argument if ever there was one!

A question that now arises in the face of these strong data supporting long ages for the existence of abiotic matter is, “Can these data be accepted within the Scriptural framework of a literal seven-day creation as described in Genesis?” I personally believe that the answer is “Yes!”

One of the immediate consequences of accepting these long ages for the abiotic material of the earth is the assumption that this matter existed on planet Earth before the creation of life. This assumption is supported by interpreting Genesis 1:1-2 as identifying God as the Creator of the “foundations” of the Earth, regardless of when that creation process took place. The creation of life and living processes, as we know them, begins with verse 3 of Genesis 1. In addition, one can add the fact that there is no specific reference in the scriptural account of Creation week that addresses the creation of water or the mineral components of dry land (“earth” that was created on day three). The only reference made to their creation is “in the beginning.” It seems possible then that the elementary abiotic matter is not bound by the limited age associated with living matter.

The implications of this approach would suggest that the radiometric clocks are not reset to zero whenever the minerals are transported by igneous or erosional processes. This approach also strongly suggests that the radiometric age assigned to the inorganic minerals associated with a fossil is more a reflection of the characteristics of the source of this inorganic material than an indication of the age of the fossil.

Conflicts between scientific and biblical interpretations are minimized with these assumptions. However, not all of the questions are answered, and areas that call for the exercise of faith remain.

In seeking to harmonize the revelation of God through Scripture and natural science, we must find a model that is consistent with both sources of revelation. Where such consistency is not found, we need to seek a better understanding of both sources through the guidance of the Holy Spirit.

[The Implications of the Oklo Phenomenon on the Constancy of Radiometric Decay C. L. Webster Geoscience Research Institute]

Ancient Civilizations

In the twilight zone of fringe religions and pseudoscience it has even been suggested that the Oklo reactors were created by an ancient, non-human race and that all that remains of this race is their nuclear waste. This claim seems to have been floating around since the 1970s. Below is a quote from a Falun Dafa website.

As a matter of fact, many people today know that the reactor is a relic from a prehistoric civilization. It’s probable that two billion years ago there was a fairly advanced civilization living at a place now called Oklo. This civilization was technologically superior to today’s civilization. Compared to this huge “natural” nuclear reactor, our current nuclear reactors are far less impressive.

[Prehistoric Nuclear Reactor In Gabon Republic, from Pure Insight]

It’s difficult to know what to say about such wild claims. At least C. L. Webster had the honesty to admit Christian Scripture could not account for the Oklo Fossil Reactors without a good dose of faith. Falun Dafa, on the other hand, appears not to be concerned with intellectual honesty!

Conclusion

Explaining unprecedented phenomena can lead to a cascade of new findings and theories spreading into many areas of human knowledge. These findings can have significant consequences as they ripple through other areas of research. However, it is hard to handle unique phenomena with scientific rigor precisely because they have no parallel with which they can be compared. As a result they are open to wild interpretation by those with a tenuous grasp on reality. It would be a shame if the wildly irrational were able to dissuade the strictly logical from making significant advances. But one gets the impression that only a few brave, confident, or foolhardy scientists are prepared to tackle such unprecedented phenomena, because they fear the damage that being associated with the lunatics would do to their careers.

Charles Babbage and Howard Aiken. How the Analytical Engine influenced the IBM Automatic Sequence Controlled Calculator, a.k.a. the Harvard Mk I

In 1936, [Howard] Aiken had proposed his idea [to build a giant calculating machine] to the [Harvard University] Physics Department, … He was told by the chairman, Frederick Saunders, that a lab technician, Carmelo Lanza, had told him about a similar contraption already stored up in the Science Center attic.

Intrigued, Aiken had Lanza lead him to the machine, which turned out to be a set of brass wheels from English mathematician and philosopher Charles Babbage’s unfinished “analytical engine” from nearly 100 years earlier.

Aiken immediately recognized that he and Babbage had the same mechanism in mind. Fortunately for Aiken, where lack of money and poor materials had left Babbage’s dream incomplete, he would have much more success.

Later, those brass wheels, along with a set of books that had been given to him by the grandson of Babbage, would occupy a prominent spot in Aiken’s office. In an interview with I. Bernard Cohen ’37, PhD ’47, Victor S. Thomas Professor of the History of Science Emeritus, Aiken pointed to Babbage’s books and said, “There’s my education in computers, right there; this is the whole thing, everything I took out of a book.”

The Harvard University Gazette. “Howard Aiken: Makin’ a Computer Wonder,” by Cassie Furguson

Charles Babbage's Difference Engine I Demo (front)

The quote is incorrect. The “brass wheels” were a small demonstration piece for the Difference Engine I, not the Analytical Engine. They were one of six such pieces constructed by Babbage’s son Henry after his father’s death. These demonstration pieces were distributed among various universities, including Harvard. Aiken must have been sufficiently intrigued by the mechanism to investigate Babbage. In the course of this investigation he would have discovered Babbage’s Analytical Engine and the similarities it bore to his own machine. It is not clear when Aiken was given Babbage’s “books” or indeed what they contained. They did not contain plans of the Analytical Engine, since the only plans have always been stored at the Science Museum in Kensington, London. Aiken may have been able to obtain some of the following documents, which together comprise the complete published account of the Analytical Engine.

These documents, along with Babbage’s “books”, would have given Aiken a high-level description of Babbage’s planned machine.

The pictures below show front and rear views of one of six demonstration pieces for the Difference Engine I created by Henry Babbage after his father’s death. This piece is similar to the one shown to Howard Aiken in 1936.

Charles Babbage's Difference Engine I Demo

Aiken may also have seen photographs of the largest test piece of the Difference Engine I held by the Science Museum, Kensington, London, UK.

Charles Babbage's Difference Engine Demo pieces held at the Science museum in London.

Two large fragments of the Analytical Engine were constructed by Babbage’s son, and Aiken may have seen photographs or otherwise become aware of their existence.

Charles Babbage's Analytical Engine pieces, constructed by his son.

In 1991 the Science Museum in London completed construction of the Difference Engine II; the printer was added in 2001. These pieces are on display in the Museum, which is well worth a visit. The construction of the Difference Engine II is documented by Doron Swade in his book The Difference Engine: Charles Babbage and the Quest to Build the First Computer. The Difference Engine II was the last machine Babbage designed and employs lessons he learned from both the Difference Engine I and the Analytical Engine. For example, the printer was designed for use by the Analytical Engine and Babbage reused it for the Difference Engine II. The similarity between the Difference Engine II and the machine that Aiken built is striking. Note the drive shaft running along the bottom of both machines and the general arrangement, with printers at one end of a long tall frame. This may be the result of convergent evolution rather than direct influence, but the similarity is still striking.

The Difference Engine II at the Science museum in London and Howard Aiken's Harvard Mk I, IBM Automatic Sequence Controlled Calculator under construction at Endicott.

In the foreword to the manual for the operation of the Automatic Sequence Controlled Calculator (ASCC), Howard Aiken states that “The appendices were prepared by Lieutenant [Grace] Hopper” with the assistance of others and that “[She] acted as general editor, and more than any other person is responsible for the book.” It seems safe to conclude that Howard Aiken and Grace Hopper were not only influenced by Charles Babbage but that they and their team held him in high regard and considered themselves guardians of his reputation and inheritors of his quest.

Chapter 1. Historical Introduction

“If, unwarned by my example, any man shall undertake and shall succeed in really constructing an engine embodying in itself the whole of the executive department of mathematical analysis upon different principles or by simpler mechanical means, I have no fear of leaving my reputation in his charge, for he alone will be fully able to appreciate the nature of my efforts and the value of their results.”

Charles Babbage The Life of a Philosopher (1864)

The Manual of Operation for the Automatic Sequence Controlled Calculator. By the staff of the Computation Laboratory, with a foreword by James Bryant Conant. Cambridge, Massachusetts: Harvard University Press, 1946.

The staff of the Computation Laboratory at the time the ASCC manual was published in 1946 are listed below. They went on to have a considerable influence on the development of the modern computer. Not least among them was Grace Hopper, who developed the first compiler and several popular programming languages.

  • Comdr. Howard H. Aiken USNR. Officer in Charge
  • Lt. Comdr. Hubert A. Arnold, USNR
  • Lt. Harry E. Goheen, USNR
  • Lt. Grace M. Hopper, USNR
  • Lt(jg) Richard M. Bloch, USNR
  • Lt(jg) Robert V. D. Campbell, USNR
  • Lt(jg) Brooks J. Lockhart, USNR
  • Ens. Ruth A. Brendel, USNR
  • William A. Porter, CEM
  • Frank L. Verdonck, YI/c
  • Delo A. Calvin, Sp(I)I/c
  • Hubert M. Livingston, Sp(I)I/c
  • John F. Mahoney, Sp(I)I/c
  • Durward R. White, Sp(I)I/c
  • Geary W. Huntsberger, MMS2/c
  • John M. Hourihan, MMS3/c
  • Kenneth C. Hanna
  • Joseph O. Harrison, Jr.
  • Robert L. Hawkins
  • Ruth G. Knowlton
  • Eunice H. MacMasters
  • Frederick G. Miller
  • John W. Roche
  • Robert E. Wilkins

The influence of both Howard Aiken and the IBM ASCC (Harvard Mk I) machine on the later development of computers should not be overstated. The published notes from the Moore School Lectures (held in 1946) are rather scathing with respect to Aiken and his understanding of the direction in which the new electronic computing machines would lead.

Hartree was very forward looking and was excited by the mathematical potential of the stored program computer. On the other hand, Aiken was absorbed in his own way of doing things and does not appear to have been aware of the significance of the new electronic machines.

The Moore School Lectures (Charles Babbage Institute Reprint)

Unlike Aiken and his machine, Grace Hopper and some of her colleagues went on to have a significant influence on the early development of compilers and language design. One wonders what, if any, influence Babbage and Ada Lovelace had on Grace Hopper’s ideas. Unfortunately I can find no comments by Hopper regarding either Babbage or Lovelace.

[ Pictures via The Science and Society Picture Library]

The International System of Units (SI) [72-page pdf brochure] is maintained by the Bureau International des Poids et Mesures at its headquarters in Sèvres, near Paris, France. The metric system, as it is often known, has a long history: supposedly invented in 1670 by Gabriel Mouton, a French clergyman, it was adopted by France in 1795 and sanctioned for use in the United States in 1866. The system gained international status with the signing of the Convention of the Meter in Paris on 20th May 1875. The U.S. was one of the original seventeen signatory nations and is the only industrialized nation that still does not use the system.

Note: At this time, only three countries – Burma, Liberia, and the US – have not adopted the International System of Units (SI, or metric system) as their official system of weights and measures. Although use of the metric system has been sanctioned by law in the US since 1866, it has been slow in displacing the American adaptation of the British Imperial System known as the U.S. Customary System. The US is the only industrialized nation that does not mainly use the metric system in its commercial and standards activities, but there is increasing acceptance in science, medicine, government, and many sectors of industry.

CIA World Fact Book. 2000

In 1971 the U.S. Metric Study by the National Bureau of Standards resulted in a report to Congress called “A Metric America: A Decision Whose Time Has Come.” The report recommended that the U.S. should switch to the metric system deliberately and carefully through a coordinated national program, and establish a target date 10 years ahead. In 1992 NIST, the National Institute of Standards and Technology (the successor to the National Bureau of Standards), published a report titled “A Metric America: A Decision Whose Time Has Come – For Real” (NISTIR 4858, June 1992). The US has spent over a century trying to switch to the metric system through a process of voluntary adoption.

The SI is without question more rational than the US Customary System and the British Imperial System on which the US system was based. But it has failed to gain acceptance in the US, and its adoption in many other countries has been painfully slow. This is a classic example of the influence of network effects resulting in lock-in. But it is also an example of economic protectionism masquerading as defence of cultural heritage and patriotism. The adoption problems of the SI are not unique. When an established ontology is challenged by a newer ontology there will be resistance to the new ontology no matter how good it is. Ontologies are like Kuhnian paradigms; new ontologies are resisted by those who have a vested interest in the old system. Ontologies, like jargon, can form barriers to market entry that effectively exclude potential competitors and protect established players. The benefits provided by these barriers can outweigh the benefits of adopting the new ontology. In such cases voluntary adoption will not occur. Established players are acting out of self-interest in resisting change.

Effective conversion from an old established ontology to a new replacement ontology requires more than encouragement; it must be accompanied by coercion. In the UK, enforcement of the metric system is proceeding through government-mandated use in the education system, the military, all government contracts, and all commerce. Government departments are also aggressively pursuing offenders like Tesco through the courts. This policy has generated a backlash. Groups like the BWMA are mounting popular resistance campaigns aimed at preventing change.

Building a great ontology is only the first step. Getting people to adopt it is far more challenging. Adoption is not driven by the merits of the new ontology alone; enforcement is often required. The US will not become metric until Congress is prepared to enact enforceable laws that mandate the use of the metric system.

Vannevar Bush and The Limits of Prescience

Today Vannevar Bush (rhymes with achiever) is often remembered for his July 1945 Atlantic Monthly article As We May Think, in which he describes a hypothetical machine called a Memex. This machine contained a large indexed store of information and allowed a user to navigate through the store using a system similar to hypertext links. At the time of writing his essay Bush knew more about the state of technology development in the US than almost any other person. During the war he was Roosevelt’s chief adviser on military research. He was responsible for many wartime research projects, including radar, the atomic bomb, and the development of early computers. If anyone should ever have been capable of predicting the future it was Vannevar Bush in 1945. He is an almost unprecedented test case for the art of prediction. Unlike almost anyone else before or since, Bush was actually in possession of ALL the facts – as only the head of technology research in a country at war could be.

The Editor of the Atlantic Monthly introduced the article as follows:

As Director of the Office of Scientific Research and Development, Dr. Vannevar Bush has coordinated the activities of some six thousand leading American scientists in the application of science to warfare. In this significant article he holds up an incentive for scientists when the fighting has ceased. He urges that men of science should then turn to the massive task of making more accessible our bewildering store of knowledge. For years inventions have extended man’s physical powers rather than the powers of his mind. Trip hammers that multiply the fists, microscopes that sharpen the eye, and engines of destruction and detection are new results, but not the end results, of modern science. Now, says Dr. Bush, instruments are at hand which, if properly developed, will give man access to and command over the inherited knowledge of the ages. The perfection of these pacific instruments should be the first objective of our scientists as they emerge from their war work. Like Emerson’s famous address of 1837 on “The American Scholar,” this paper by Dr. Bush calls for a new relationship between thinking man and the sum of our knowledge. -THE EDITOR

The essay was prescient in many respects. However, it failed to anticipate several innovations that are fundamental to modern information management and made many predictions that are only partially correct. It is easy to ignore Bush’s off-target predictions and focus solely on what he got right, but this would be a waste of an opportunity. By examining the innovations Bush failed to anticipate and the predictions he got half-right, and even wrong, we can develop a better understanding of prediction itself.

Background

Before the war Bush had been involved in the design and construction of analog computers for many years. At MIT he led colleagues and students in the development of a series of analog machines that could solve differential equations. In 1927 Bush and others started developing the Integraph – a machine capable of solving first-order differential equations. This was followed by the Bush-Hazen Differential Analyzer, a general-purpose equation solver that could solve 6th-order differential equations. The Bush-Hazen machine was operational at MIT in 1932 and served as the prototype for many similar machines built elsewhere. Finally, in December 1941 the Rockefeller Differential Analyzer (RDA) became operational at MIT. Financed by the Rockefeller Foundation, this machine used vacuum tubes and relays. It weighed 100 tons and was immediately classified. It spent the war calculating artillery tables. By the war’s end the RDA was redundant, having been superseded by totally electronic machines like the ENIAC.

As Director of the Office of Scientific Research and Development, Bush was Roosevelt’s chief adviser on military research. He was an engineer, an expert administrator, and a capable politician, and he was not afraid of a fight. He allocated funds and managed priorities for many of the major US-funded research projects of the Second World War. At the end of the war, when he wrote the essay, he knew many secrets.

Veiled Secrets not Predictions

Vannevar Bush’s paper was published at the dawn of the digital age in July 1945. Many of the “predictions” it contained were merely veiled descriptions of secret wartime developments that had yet to be declassified. When Bush wrote his essay the great electronic computers that had been developed to aid the war effort were still secret. The ENIAC was the first of these machines to be publicly announced, by the New York Times on February 16th, 1946. Bush undoubtedly knew of ENIAC and other machines under development. The following quote from the essay is stated as a prediction but is actually a fairly accurate description of the ENIAC.

Moreover, they [computers] will be far more versatile than present commercial machines [punch card tabulators and hand calculators], so that they may readily be adapted for a wide variety of operations. They will be controlled by a control card or film, they will select their own data and manipulate it in accordance with the instructions thus inserted, they will perform complex arithmetical computations at exceedingly high speeds, and they will record results in such form as to be readily available for distribution or for later further manipulation. Such machines will have enormous appetites. One of them will take instructions and data from a whole roomful of girls armed with simple key board punches, and will deliver sheets of computed results every few minutes. There will always be plenty of things to compute in the detailed affairs of millions of people doing complicated things.

The first atomic bomb was detonated at the Trinity site in New Mexico on July 16, 1945, a few weeks after the essay was published. Bush is said to have had a nervous collapse after witnessing the test detonation. Its success must have been a tremendous relief for Bush, who had persuaded the President to commit the $2 billion necessary to build the bomb. The following paragraph, describing the impact of the war on scientific research, especially physics, seems to refer to the massive Manhattan Project and all the physicists involved.

It is the physicists who have been thrown most violently off stride, who have left academic pursuits for the making of strange destructive gadgets, who have had to devise new methods for their unanticipated assignments. They have done their part on the devices that made it possible to turn back the enemy, have worked in combined effort with the physicists of our allies. They have felt within themselves the stir of achievement. They have been part of a great team. Now, as peace approaches, one asks where they will find objectives worthy of their best.

Predictions

Bush starts his visionary predictions by suggesting that computers could be made to manipulate premises in the same way they manipulate numbers.

It is readily possible to construct a machine which will manipulate premises in accordance with formal logic, simply by the clever use of relay circuits. Put a set of premises into such a device and turn the crank, and it will readily pass out conclusion after conclusion, all in accordance with logical law, and with no more slips than would be expected of a keyboard adding machine.

He then describes the Memex as a personal desktop interactive device. However, it is here that his foresight breaks down, because the Memex is described as analog, not digital. While it contained some computing components, information was stored photographically on microfilm and retrieved electro-mechanically. The Memex was nothing like the room-sized computers of the late 1940s. In the 1946 New York Times article announcing the ENIAC, the new computer was described as “an amazing machine which applies electronic speeds for the first time to mathematical tasks hitherto too difficult and cumbersome for solution.” It took a long time before people began to implement Bush’s suggestion that computers could manipulate premises as well as numbers. Alan Turing had understood that computers were manipulators of symbols and that those symbols could represent any concept. But this knowledge was tightly bound to his work on code breaking, and he in turn was bound by secrecy not to discuss it.

Ultimately Bush’s prescience was limited by two factors: a failure to anticipate the emergence of fundamentally new technologies, and a failure to predict the exponential improvements in many areas that such inventions would support.

The relay gave way to the thermionic valve, which in turn gave way to the transistor, which itself was replaced by the silicon chip. Each paradigm shift maintained the exponential rate of growth in computing power. Bush could not have predicted this chain of technological advances. But, as Moore has shown, the exponential growth it has produced is predictable.
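
To see how quickly such growth outruns any single prediction, compound an assumed doubling period over the six decades between the essay and today. The two-year figure below is an assumption in the spirit of Moore’s observation, not a number from the essay:

```python
# Compound an assumed two-year doubling period from 1945 to 2005 to show
# the scale of improvement no 1945 prediction could reasonably anticipate.
# The doubling period is an illustrative assumption.

doubling_period_years = 2
start_year, end_year = 1945, 2005

doublings = (end_year - start_year) / doubling_period_years
growth_factor = 2 ** doublings
print(f"{doublings:.0f} doublings -> roughly {growth_factor:.0e}x improvement")
# 30 doublings -> roughly 1e+09x improvement
```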

In 1945 the ENIAC could not even store its own meager program in what little memory it had, and all data was stored externally. The idea of storing vast quantities of data digitally was not considered realistic; it was accepted that there had to be some form of external physical storage. Bush merely replaced the punched card with microfilm. But memory storage advanced in a similar way to computing power: from mercury delay lines and magnetic drums, to Williams tubes, magnetic core memory and tape, to modern chip-based RAM and high-speed disc drives. Today a standard home computer is typically shipped with over 100 gigabytes of storage and several hundred megabytes of memory.

Bush’s biggest failings were in predicting implementation details, and his most accurate predictions concerned the interaction of people and technology. The Memex is eerily similar to a networked PC running a web browser. Even Bush’s description of the wearable camera is remarkably close. We don’t wear our cameras, because they double as portable phones, but everything else about them, from their size to the number of photos they can take, is remarkably accurate.

Conclusions

Reading this essay with hindsight, and knowing that large amounts of information were still secret in July 1945, one is forced to wonder who the intended audience of the essay was. The essay seems to be an inverse call to arms, aimed at the scientists and researchers who would have recognized their own secret wartime work between Bush’s lines. In effect Bush was suggesting a path for post-war research and development based on his uniquely broad knowledge of the state of technology. The Memex was a technological phoenix to be built from the ashes of wartime science. It was an example of how various wartime advances could be combined to create something awesome but benign. A modern Library of Alexandria on every desktop.

The Memex

The ability to see even a decade into the future is impressive. That Vannevar Bush was able to see much further is remarkable and a testament to his brilliance. It would be a shame if he were only remembered as the inventor of hypertext when in fact he foresaw the information revolution.