Transcript
“My God,” he thought, looking at his newspaper. “They didn’t take my advice.”
Intro: Bob Bemer, 1999
Robert Bemer–legendary computer engineer, father of ASCII, the Escape key, backslashes, curly brackets, and COBOL, the Common Business Oriented Language–read newspapers from his beautiful clifftop home in rural Texas, along the winding King Possum Lake. We mentioned this place in our last episode.
We didn’t necessarily focus, though, on its darker side. The reason he moved to such a remote area. The reason he kept its location a secret for years.
In the early summer of 1999, he did give his address to reporters from the Washington Post, who took the long journey out to visit.
By this point, interviewing Bob was a bona fide cliche in their industry. He was in every paper, because everybody was talking about Y2K, and he was the guy who knew about it first. Not to mention, he designed the language running many of the systems people were worried would fail.
So everyone wanted to know: was this elder statesman, this oracle, as worried about Y2K as they were?
They would not be comforted by his answer.
They finally arrived and met the man: now 79 years old, with a gray mustache and a receding hairline, skin wrinkled and freckled.
Before they even got to the interview, the reporters wanted to know about the house.
Our first question is why the heck he recently moved from a big city all the way out to East Bumbleflop, U.S.A.
It’s a good place to be next New Year’s Eve, he says.
From a kitchen drawer he extracts two glass cylinders about the size of the pneumatic-tube capsules at a drive-through teller. Each is filled with what appears to be straw.
“They’re Danish,” he says. “They cost $500. We ran water with cow [poop] through them and they passed with flying colors.”
They’re filters, to purify water. If Y2K is as bad as he fears, he says, cocking a thumb toward his backyard, “we can drain the lake.”
Was Bob Bemer a genius scientist gone mad? Or was he, in the end, just a failed fortune teller? Or maybe something else entirely?
The answer depends on how we remember the cause to which he dedicated his later years. Was Y2K a hoax, like perhaps you think of it? Or was it a legitimate threat prevented only by billions of dollars and countless hours of effort, from people like Bob and thousands of others worldwide?
What’s remarkable about this dilemma is that even today–a whole quarter century after the fact–there isn’t a consensus answer to that question.
A common view is that it was a total non-event. National Geographic’s official encyclopedia entry for Y2K, as just one example, highlights how, quote, “Australia invested millions of dollars in preparing for the Y2K bug. Russia invested nearly none,” but, “Nothing happened. [. . .] Due to the lack of results, many people dismissed the Y2K bug as a hoax or an end-of-the-world cult.”
Others have looked at the same events, the same information, and come to the opposite conclusion. As one Stanford University futurist told Time Magazine in 2019, quote, “The Y2K crisis didn’t happen precisely because people started preparing for it over a decade in advance.” If some believe it’s a hoax…well, quote, “[it’s] better to be an anonymous success than a public failure.”
So who’s right here?
The World Prepares
“[Hillis] People took it quite seriously.”
Danny Hillis is an inventor and pioneer of parallel computing and AI.
“[Hillis] In fact, there were people who moved away from cities because they were afraid of the chaos that would happen.”
Like Bob Bemer.
“[Chen] It’s so simple. Right? You know, that’s why I can explain why I think the general public was so possessed by it, and the media and government, your non technically sophisticated government people. [. . .] Because you could explain it to people, right, like, we should have four digits but we only have two, and at the year 2000 some computers – maybe most, not all – will think it’s 1900. It’s very easy to understand that.”
Everyone’s worst fears were amplified in the news media, and in endless magazines and books. Perry Chen’s “Computers in Crisis” exhibition from 2014 documented 155 books written about the event, including 27 aimed at corporate audiences, 17 of a religious nature, and 52 survival guides.
Also five apocalypse-oriented cookbooks, and plenty of other interesting takes on the end of civilization: like “Y2K for Women,” “Crisis Investing for the Year 2000: How to Profit from the Coming Y2K Computer Crash,” and, of course, “Surviving Y2K: the Amish Way.”
Three books, among all 155 documented, focused on debunking fears of disaster wrought by the two-digit year.
U.S. Response
“[Chen] The question then is simply how many systems are like this – and it was most, I think, at the time – or or many critical infrastructure ones and then second, you know – what will happen from that? Nobody could answer that question.”
To try and contain the fire, the Clinton White House appointed a “Y2K czar.” The title sounds silly, but the project was anything but. 5 billion dollars of federal funding was allocated for bringing government systems nationwide into compliance, and another 3.2 billion was set aside in case of emergencies.
Even that, many believed, would not be enough. In 1997, Ed Yourdon, Director of the Cutter Consortium’s Y2000 Advisory Service, wrote to the Fannie Mae Year 2000 Team, quote:
“Nobody seems willing or able to say it in simple language, so let me be the one: the federal government is not going to finish its Y2000 project. No maybes, no ifs, ands, or buts. No qualifiers, no wishy-washy statements like “unless more money is spent” or “unless things improve.” We’re not going to avert the problem by appointing a Y2000 Czar or creating a National Y2000 Commission. Let me say it again, in plain English: The United States federal government will not finish its Y2000 project.”
India
It was the sheer scale of the problem which was so daunting. Remember, this was the peak of the Dot Com Bubble–when the web was already growing faster than people could reasonably account for–and now, suddenly, thousands upon thousands of more programmers were needed to fix the millennium bug, at companies across the United States, and the rest of the world.
It was a nightmare. Or, as one industry VP told The Guardian, quote, “Y2K has been a godsend.”
“[Hillis] There were hundreds of billions of dollars spent going back over software and rewriting it. Which was also a lot of what was driving it, because a lot of people were making a lot of money rewriting that software.”
That Guardian source was from Hyderabad–a south-central Indian city which, in the years leading up to 2000, had developed an entirely new cottage industry. Dozens of computer training camps were founded in that city alone, promising to turn people into computer experts. In just three months, they claimed, you could convert from another career path and get a job in America working on computers.
And they were right, thanks to the widespread computer issue which required an immense amount of diligent effort to fix, yet was easy enough to understand even if you only had three months’ experience in programming. “Anybody can enter into Y2K. You don’t need to be technically qualified,” one Indian tech recruiter marveled to journalists.
And so, for the first time at such a scale, Western companies began to outsource their IT needs to Indian companies. Across Hyderabad, Delhi, Bombay, and more, new ones sprung up, hiring thousands of people and training them to service Year 2000 bugs, setting a precedent for global trade that remains with us to this day.
Bemer Back in Business
The most obvious way in which these and other programmers went about fixing the Year 2000 problem was with date expansion: simply editing each program, database, and file to expand the two-digit year to four. This method was, as you can imagine, incredibly monotonous, costly, and, to Bob Bemer’s great concern, time consuming.
He saw the writing on the wall: with time running out until 2000, governments and companies worldwide simply hadn’t left themselves enough time to deal with the problem, at least if they went about it as they were.
And so, just as his son was entering retirement, he came out of his. As he told the Sun, simply, “I would feel guilty if I didn’t do something.”
He was now fifteen years removed from the workforce, and many decades more from his glory years. As one New Yorker reporter quipped, “Bob Bemer can’t program a VCR and is baffled by Windows.”
But he did have one advantage, at least.
“[Hillis] There was an interesting point around in the 1990s. When people started retiring that had built the first had written the first computer software.”
Not only had a generation passed, but they didn’t necessarily leave behind diligent notes. Call it shortsighted, or humble: they hardly thought their old, clunky systems would survive decades of technological advancement.
Who the heck would still be using FORTRAN and COBOL in 1999?
“[Hillis] And so literally, there was software around that nobody knew exactly how it worked.”
If anybody did know how it all worked–or, at least, could figure it out–that man would’ve been Bob Bemer.
For months, Bemer studied technical manuals and mainframes. How could all the world’s code–150 billion lines, at least, by his estimates–possibly be fixed in the time they had left?
With hardly a couple of years left until the big moment, he came up with his proposal. One which provided a fitting dovetail to his illustrious career.
Bigits
They called it “Bigits.” Bemer digits.
Officially, though, they called it “Vertex 2000”–“vertex” short for “vertical expansion.”
Back then, most computers used ASCII: a standard scheme to represent the 128 most common characters, numerical digits, and punctuation marks, with each character stored in a single byte (7 bits, to be exact). The traditional fix for the two-digit date, we mentioned, was to add two more digits to the end. Horizontal expansion. Two more bytes of information on the tail end.
But Bemer, being the Father of ASCII, knew that each of those bytes could, in fact, hold more than the 128 characters ASCII defined: in other words, one could “squeeze” a few more characters into the existing two bytes than the standard allowed. That is, nothing would change for the user but, under the hood, the two date bytes would convey more information than they previously had. These extra characters, the Bigits, could be used to store the century part of the date.
The beauty of Bemer’s suggestion was that if your system only stored dates falling in the 20th century, no change would be needed. Only systems which need to process dates from past or future centuries would need to be patched – and even then, that patch would be an extra layer of logic that supplemented, rather than replaced, the existing code – so it would be easier to implement and test. It might slow down performance by as much as 20%, Bemer warned, but it would speed up Y2K readiness efforts ten times over.
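To make the idea concrete, here is a toy sketch of vertical expansion in Python. This is my own illustration, not Bemer’s actual Vertex 2000 code: it uses the unused high bit of each date byte to flag a different century, while plain ASCII digits keep their old 20th-century meaning, so legacy records need no change at all.

```python
# Toy illustration of "vertical expansion" (my sketch, not Bemer's actual
# Vertex 2000 code). ASCII only defines values 0-127, so each 8-bit date
# byte has an unused high bit. Setting that bit on a stored digit marks it
# as a 21st-century "bigit"; plain ASCII digits keep meaning 19xx, so
# existing 20th-century records need no change at all.

def encode_year(two_digits: str, century: int) -> bytes:
    """Pack a two-digit year into the same two bytes legacy systems use."""
    if century == 19:
        return two_digits.encode("ascii")                # legacy layout, untouched
    if century == 20:
        return bytes(ord(c) | 0x80 for c in two_digits)  # high bit set => 20xx
    raise ValueError("toy sketch only handles 19xx and 20xx dates")

def decode_year(raw: bytes) -> int:
    """Recover the full four-digit year from the two stored bytes."""
    century = 20 if raw[0] & 0x80 else 19       # inspect the high bit
    digits = int(bytes(b & 0x7F for b in raw))  # strip the flag, read the digits
    return century * 100 + digits

assert decode_year(b"99") == 1999               # old data still reads correctly
assert decode_year(encode_year("05", 20)) == 2005
```

The design choice mirrors the transcript’s point: unpatched 20th-century data passes through unchanged, and only code that must handle other centuries needs the extra decoding layer.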
Not everyone was sold on it. The New Yorker noted that, quote, “When Bemer presented his idea to companies such as E.D.S., the information-services giant, and the investment bank Morgan Stanley, he got a polite ‘No thanks’ or, at best, a ‘Let’s wait and see.’”
More to the point: there was hardly more than a year left for Vertex 2000 to go to market, sell, and be implemented worldwide.
Other Solutions
Meanwhile, other companies came up with other solutions. With only a few months before the big day, a vendor called Planet City Software published the so-called “Millennium Bug Kit.” For the price of just $50 it offered both a corrected real-time clock and the capability to fix any underlying issues in your computer’s BIOS. Other products, like “Y2K Test” and “IntelliFIX 2000,” worked in much the same way. Norton offered software to test and fix the BIOS, and check against a number of known Year 2000-related bugs.
There were also other methods for fixing the two-digit year. “Date compression” compressed date code into binary 14-digit numbers, capable of counting 16,384 years. There was date “re-partitioning” and “windowing.” Bemer even came up with another method for solving the problem, called “XDay,” by playing with how dates are represented by the Gregorian versus Julian calendar standards.
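Of these, “windowing” is the easiest to sketch. The pivot value below is my own assumed example; real systems each chose their own cutoff. The stored data stays two digits; only the code that interprets it changes:

```python
# A minimal sketch of date "windowing" (the pivot of 30 is an assumed
# example, not a quoted standard). Two-digit years below the pivot are
# read as 20xx, the rest as 19xx. No stored data changes; only the
# interpretation does.

PIVOT = 30

def expand_year(yy: int) -> int:
    """Map a two-digit year onto a sliding 100-year window."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

assert expand_year(99) == 1999   # "99" still means 1999
assert expand_year(5) == 2005    # "05" now means 2005, not 1905
```

The catch, of course, is that a window only postpones the problem: once real years pass the pivot, the trick breaks down again.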
Efforts to patch the Year 2000 bug continued right up until the most anticipated day in 1,000 years finally arrived.
12/31/99
“[Hillis] There were certainly people who got home generators and extra food and things like that.”
9
Around the world–especially in America–people prepared for catastrophe.
8
“[Hillis] Many people that were worried that planes would fall out of the sky and oil depots would blow up and gas lines would catch on fire. [. . .] And it was very hard to prove they were wrong.”
7
“That’s why [Bemer] has requested that we not mention the town in which he lives,” the Washington Post reporters noted. “He doesn’t want nutballs descending on him in the hellish chaos of Jan. 1, somehow blaming him.”
6
From the Baltimore Sun, quote:
“Under the stairs of [Bemer’s] home are 61 cartons of freeze-dried delicacies such as instant couscous and pre-cooked scrambled eggs, enough for him and his wife to live on for a year. Atop the stack is “The Official Pocket Survival Manual.” In a nearby drawer he has stashed a water filter, a box of Fire Chief wooden matches and a collection of Duracell flashlight batteries.”
5
Working in shifts, U.S. and Russian soldiers monitor missile launches across the planet, from the Air Force mountain base near Colorado Springs.
With a network of radars, satellites, and sensors, they track the point of origin, time, number of missiles detected, trajectories, types launched, projected target area, and projected impact time for any launches reaching beyond 500 kilometers.
4
“[Chen] At the time I was like, Well, who knows on New Year’s Eve? You know, I didn’t think the world’s gonna end but I was like, you don’t know you kind of look out the window.”
3
2
1
1/1/00
Around the globe–first in the Pacific island nation of Kiribati, then Asia, Europe and Africa, and the Americas–two-digit computer clocks ticked from 12/31/99 to 1/1/00.
First, in China and Hong Kong, some government computers started glitching.
In Turkey, an oil pumping station failed, cutting off supply to the capital.
15 nuclear reactors worldwide shut down.
In Scotland, air traffic controllers phoned an emergency line to London: their radar failed, and they could no longer track any aircraft. In fact, their radar was working as well as always–they couldn’t see any aircraft because all flights had been canceled in anticipation of any Y2K-related safety failures.
Meanwhile, U.S. and French military communications satellites suffered information processing errors.
In Jamaica, a series of traffic lights at major intersections went out.
And three missiles fired off from Russia. They landed in Chechnya—intentionally, as part of the ongoing Second Chechen war.
“[Chen] There were some scattered issues, but in general, you know, overall, nothing like the problems that people have foreseen.”
“[Hillis] They just weren’t as ubiquitous as everybody imagined.”
Aftermath
In January, 2000, a social worker visited an elderly care facility in Norway. She was there to visit a young girl whose family hadn’t responded to an offer to enlist her in kindergarten. The girl, it turned out, was 105 years old. As one worker recalled, quote, “When our list showed she was born in ’94’ we just assumed it was 1994 rather than 1894.”
Most of the outcomes of Y2K were either minor–like taxi meters and bus ticketing machines failing–or easily remedied–like the court computers in Italy which added 100 years to some prisoners’ sentences and subtracted 100 from others’–or just comical–like the New Yorker who got slapped with a late fee for a video rental totalling $91,250.
Which raises the question: was it all a hoax?
Because the bugs that did occur were so tiny, it was easy enough for people to infer after Y2K that it was only ever going to have caused tiny bugs. And it didn’t help that those who were in a position to explain otherwise, didn’t.
Politics of Prevention
“[Chen] It’s not like COVID where it’s like we’re kind of forgetting about what it was like. It’s like literally, everybody somehow without saying it, you know, it was like, Listen, guys, we’re all embarrassed. Let’s never speak of it again. [. . .] First business has no interest in speaking about anything that’s not like making this look good. [. . .] The government Absolutely, in my opinion, should have had some follow up. And the press Absolutely. But they felt that they had egg on their face.”
Should they have felt that way, though?
“[Chen] The real story here is with the limited amount of information that they have, did the people in power make the right decision to spend resources and time on this and to stay that this is a real material potential problem that could have real consequences? [. . .] it’s hard because I don’t know…I don’t know how to have done it any better.”
Some companies and some countries may have survived easily enough without doing much prep, but the U.S. government, for its part, wasn’t spared Y2K by accident. According to the Department of Defense, 99.9% of its critical systems–2,101 in all–were addressed by December of 1999.
“[Chen] What I kind of discovered was, and what was lost with no post mortem, was basically something very human and important, which is that you know, we want the powers that be to address probabilistic problems if they can cause material harm. If a meteor is coming towards the Earth, and it has a 2% chance of hitting the Earth and destroying all or half life on the planet, do we want the government to work on that? And when they do, they spend $2 trillion on it. And they deploy something that they think helped move the trajectory. Won’t everybody say later – or a lot of people – that was wasted money, and it wasn’t gonna hit us anyway. “
It’s the same problem that haunts cybersecurity to this day, and hundreds of other areas of life–climate change, COVID, you name it. It’s easy to give credit or assign blame when something happens, but when something is prevented due to diligent preparation, we’re much less likely to care, or even notice it if we do. Governments and critical industries probably prevented significant Y2K-related errors, but even today it’s hard to say for certain, and it’s outright impossible to point to specific things that otherwise would’ve happened but didn’t.
Because of this underappreciation of prevention, we’re less likely on the whole to take preventative measures than reactive ones. Politicians don’t fix bridges because we don’t notice when they do. Organizations fund research into existing diseases, but not potential upcoming pandemics. And programmers deal with the bugs that affect their programs today, not tomorrow…
Y2038
Ten years ago, a video on the internet showed us a glimpse of a danger which still awaits us thanks to this politics of prevention. A second Y2K.
The internet phenomenon that was Gangnam Style got so popular that it nearly broke YouTube. And I don’t mean “broke” like we typically use that term: when something goes modestly viral. No, Gangnam Style actually threatened to reach the upper limits of the YouTube website’s capabilities.
Computer processing, as we know, runs on binary code–0s and 1s. A 16-bit processor can store and access up to 2^16 (or 65,536) distinct values, 32 bits gets you 2^32 different values (4,294,967,296), and 64 bits allows for somewhere in the range of 18 quintillion values.
When YouTube was created, its coders opted to use a 32-bit counter for views. This meant it could account for up to 2,147,483,647 views–half of the 4,294,967,296 possible values since, as a signed integer, the counter reserved half of its range for negative numbers it would never need.
For most of YouTube’s history, this counting problem was never even a remote consideration. But at the turn of winter in 2014, Gangnam Style was creeping past two billion views. The glitch was prevented when, with a Y2K-style edit, Google’s coders changed the view tracker from a 32- to 64-bit number.
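The failure mode is easy to demonstrate. Python’s own integers never overflow, so the snippet below explicitly emulates what a signed 32-bit counter does when pushed past its limit (a sketch of the bug class, not YouTube’s actual code):

```python
# Emulating signed 32-bit wraparound, the failure mode that threatened
# Gangnam Style's view counter (an illustration, not YouTube's actual code).

INT32_MAX = 2**31 - 1            # 2,147,483,647

def add_one_view_int32(n: int) -> int:
    """Increment n, wrapping the way a 32-bit signed integer would."""
    result = (n + 1) & 0xFFFFFFFF             # keep only the low 32 bits
    return result - 2**32 if result >= 2**31 else result

assert add_one_view_int32(100) == 101
assert add_one_view_int32(INT32_MAX) == -2**31   # one more view goes negative
```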
But why am I telling you this?
Back in our last episode, you’ll recall, Bob Bemer and his colleague unsuccessfully lobbied for the U.S. government to adopt the four-digit year standard. Two-digit years became enforced beginning on January 1st, 1970.
Coincidentally, just months earlier, two computer engineers created the first version of Unix on a PDP-7 minicomputer. Unix didn’t have a Y2K problem–instead, it counted time in seconds, beginning at midnight on that same date of January 1st, 1970.
Like YouTube’s view counter, Unix timekeeping has traditionally been encoded as a 32-bit signed integer, meaning it can count up to 2,147,483,647 seconds–a limit that will be reached at exactly 03:14:07 UTC on January 19th, 2038.
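You can verify that date yourself: the largest signed 32-bit value, interpreted as seconds since the epoch, lands exactly there.

```python
# Computing the Y2038 limit: the largest signed 32-bit value, read as
# seconds since midnight UTC on January 1st, 1970 (the Unix epoch).
from datetime import datetime, timezone

last_second = datetime.fromtimestamp(2**31 - 1, tz=timezone.utc)
assert last_second == datetime(2038, 1, 19, 3, 14, 7, tzinfo=timezone.utc)
```

One second later, a 32-bit timestamp wraps negative, which a naive program would read as a date before 1970.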
At that point, any of the world’s mainframes, mobile devices, workstations, and embedded systems – many of them running Linux, Unix’s offspring – that still store time in a 32-bit signed integer will experience an overflow error. Critical processes, dangerous equipment, and everyday computers could fail. Or, maybe, hardly anything will happen at all. Someone will get a big video rental bill, or a criminal will complain to their prison guard because the computer says they should’ve been released on parole in 1970!
We still have 14 years to fix the so-called “Y2038” bug. But we probably won’t.