As I pen these words in early 1937, the prospects for my automatically controlled sequence calculator seem dim after years of personal effort. My fellow physicists and mathematicians at Harvard still view the idea as sheer fantasy, an impossible attempt at mimicking in metal, wire, and electricity Charles Babbage's 19th-century mechanical computing glory. But my experiences tabulating complex calculations by hand for endless days have given me a vision, one I'm determined to make a reality despite the skepticism and doubt that have plagued my quest thus far.
You see, in every observatory, laboratory, engineering firm, and university like our own, teams of low-paid workers specially trained as "computers" labor away carrying out the millions of mathematical operations needed to build astronomical tables, track radio signals, design new machinery, or test scientific theories. The best human computers become highly sought after for their combination of numerical aptitude, concentration, and meticulous accuracy checking. But for all their skills, reliance on people as slow and fallible computing elements creates a critical bottleneck holding back progress across nearly every technical field.
My own physics doctorate work a decade ago fell victim to this numbing mathematical burden. As I sought numerical solutions for complex differential equations on the behavior of electrons, most of my research time went not to analysis or experimentation, but rather to the calculation process itself. I still recall the excitement upon first learning of Charles Babbage and his 19th-century attempts at mechanical calculation. That a Victorian-era inventor without electricity conceived of a computing device surpassing human capabilities gave me immense hope. When I had a chance to examine a small demonstration fragment of Babbage's original Difference Engine, I felt as though he were speaking to me personally from the past about realizing the future of automation.
By 1936, the combination of my graduate research tribulations and inspiration from Babbage had led me to draft conceptual plans for what I called the Automatic Sequence Controlled Calculator, or ASCC. I envisioned a machine that could not just carry out individual calculations, but also automatically sequence a series of arithmetic, logical, and information storage operations without human direction. This automatic, programmable nature would allow complex algorithms and mathematical tables to be produced faster than ever before. After testing the idea with peers and refining capabilities like conditional branching based on results, I set out to make this computing dream a reality.
My first stop was the Monroe Calculating Machine Company, producers of advanced mechanical calculators. Their enthusiasm initially gave me hope. But after months of discussions, they ultimately balked at the development costs my plan would require. Thankfully, their chief engineer suggested I approach contacts at International Business Machines, in particular Harvard professor Theodore Brown, who worked closely with IBM president Thomas Watson.
I demonstrated my concept to IBM executives in early 1938. Though I lacked detailed engineering plans, they grasped the business potential of an improved scientific calculator based on their punch card machine experience. An agreement was reached by 1939 after chief engineer J.W. Bryce helped assign $300,000 to develop what we expected would be a five-year project. I committed to advising the engineering team on desired functionality as work began under IBM manager Clair Lake at their Endicott, New York labs.
As the calendar advanced into 1941 and Europe descended into world war, I balanced teaching physics at Harvard with monitoring progress and providing guidance during regular visits to Endicott. Chief designer Frank Hamilton skillfully worked through questions on translating specifications into machine parts and electric circuits. In early 1944, after five years of effort, the computer was fully assembled. Weighing multiple tons and occupying a large basement area, the imposing machine was disassembled and shipped to Harvard, where it became operational that March.
As the Mark I calculator whirred into motion using hundreds of miles of wire and over three-quarters of a million components, I couldn't help but feel Charles Babbage's presence approving our feat. The innovative high-speed relays and counters obtained from IBM let addition happen in a third of a second, six times faster than originally planned. Paper tapes drove operation while special electric typewriters output results. Over the remainder of 1944, Navy researchers used the machine to compute ballistics tables, design radar, and even support development of the first atomic weapons.
In the years before the Mark I's retirement in 1959, I established one of the nation's first academic programs focused on the emerging field of computer science. Students lined up for a chance to get hands-on time learning programming and machine operation. Many pioneers across computing and technology, like An Wang and Grace Hopper, started their careers under my guidance. While the Mark I relied on electromechanical parts soon rendered obsolete by electronic designs, I feel immense pride at the small role I played in bringing digital computing from concept to reality. That our calculator planted a seed helping inspire and educate the founders of today's computing revolution would have seemed unfathomable back in that doubtful year of 1937. But the cycle of vision birthing technology birthing greater visions persists, as the computing quest I began continues onward into the future.