ChatGPT is Just a Calculator . . . Crossed with a Nuclear Reactor

The good news is our careers are safe, even if our jobs are not. The bad news is society may bear some consequences even as we reap the benefits.

February 7, 2023 / 5 min read


People are worried about ChatGPT and other AI coming to take their jobs. It’s scary because it seems new, but it’s really not new and not that scary when it comes to job risk. If anything, it’s an exciting step forward for productivity. However, it also comes with a societal risk on the scale of Pandora’s box.

For those who are feeling concerned for their jobs, welcome to the rest of the world. People working on assembly lines have been losing out to cheaper labor for years, both from other humans and from machines. Have you talked to any toll booth collectors lately? They are harder and harder to find, but commuters aren’t weeping as those jobs disappear. Does anyone wish they had to use a travel agent instead of booking flights directly online? The travel agents may, but the rest of society does not. Automation of tasks has a negative impact on a few, but a greater positive impact on many as it lowers overall costs. However, there have traditionally been two big fears.

The first is the speed of change. In the US the number of people working on a farm dropped from 14 million in 1910 to about 3 million by the end of the century (source); the shift was even more dramatic the century before, as the US went from 83% of its workforce in agriculture in 1800 to 55% fifty years later (source). The numbers fell in both an absolute and a relative sense, all while the total population grew, because automation and other technological improvements allowed more yield for less work.

It wasn’t without casualties, as small farms went under and people lost their livelihoods. Still, the transition took decades, allowing the labor force to readjust as supply and demand shifted. The concern today is that change no longer takes place over decades and generations but over just a few years. What happens if driverless cars become the norm tomorrow? Within just 3-5 years hundreds of thousands of people could lose their jobs (to say nothing of the people driving private buses, trucks, and more). The response has always been: retrain them for the jobs of the future. Realistically, the labor market cannot respond that quickly, and this could create extended unemployment.

The second concern is one of sloth. In 1967 Texas Instruments created the first handheld electronic calculator. The news was full of warnings about how people would soon forget how to do math. (The fear echoes one of my favorite science fiction stories, The Feeling of Power by Isaac Asimov, in which humans have forgotten how to do math and instead rely solely on computers.)

Those concerns were both well-founded and incorrect. On the one hand, plenty of people, including my colleagues in science and engineering, very much know how to do math by hand. We can still do long division if needed, but generally we’ll use a calculator to save time. On the other hand, I’ve met plenty of people who tell me they can’t calculate the tip on a restaurant bill. They can pull out a calculator (or, years ago, one of those wallet cards with the tip amounts precomputed) at the restaurant, but the gap shows up elsewhere. When listening to a political speech, for example, they can’t make sense of the percentages being referenced, or weigh the long-term cost of borrowing money against the projected boost to the economy. Many people wind up in debt because they can’t create a budget or recognize that they are overspending and facing compounding interest. It’s not about getting the calculations perfectly accurate, but about having a sense of where the math is leading. That understanding often comes from the wax-on, wax-off training in the fundamentals.
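Since I keep leaning on the tip and the compound-interest examples, here is a quick back-of-the-envelope sketch of that arithmetic. The figures (an $80 bill, a 20% tip, a $5,000 balance at 24% APR compounded monthly) are illustrative assumptions of mine, not numbers from any study.

```python
# A rough sketch of the two everyday calculations mentioned above.
# All figures are made up for illustration.

bill = 80.00
tip = bill * 0.20                      # 20% tip: take 10% and double it
print(f"Tip on ${bill:.2f}: ${tip:.2f}")

balance = 5_000.00                     # carried credit-card balance (assumed)
apr = 0.24                             # 24% annual rate, compounded monthly (assumed)
for years in (1, 3, 5):
    grown = balance * (1 + apr / 12) ** (12 * years)
    print(f"After {years} year(s) of compounding: ${grown:,.2f}")
```

The point isn’t the script; it’s that having a rough feel for how these numbers move is what lets you sanity-check the calculator, the politician, or the chatbot.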

Both of those groups rely on calculators. The first group employs them for convenience, freeing the workers to do more sophisticated work. The second relies on them in place of understanding, and in doing so loses some autonomy and control.

ChatGPT and other AI tools are evolving rapidly. When used correctly, they will reduce the cost of work for society. Just as our road tolls sped up and the cost of booking a flight went down, so too will we see improvements in law, publishing, software, finance, medicine, and many other fields. This will benefit society. Thirty years ago, an editor would have needed to review this article; twenty years ago, word processors began correcting my typos; today, software can clean up my writing by correcting whole sentences; in the future, it will help me plan my article’s structure.

The two prior warnings remain. White-collar workers will need to retrain, as their blue-collar colleagues were told to do decades ago. The goal is to move up the value chain. I cannot create value by turning a screw; robots can do it better than I can (not to mention 24/7). I cannot create value booking flights for others, since it’s generally not cost-effective (there are still pockets of exceptions). Lawyers don’t create value by redlining contracts to catch little mistakes; they provide value with legal strategy. Software developers don’t create value by writing rote code to catch edge cases and exceptions; they do so by innovating data workflows. Most jobs have a lot of grunt work, and AI will remove it the same way software made doing my taxes easier. Workers will need to focus on the higher value they provide. We will need to use the tools, but not let them replace our thinking. (Personally, as a thought leader, I’m thankful that the countless volumes of prose my colleagues create will be devalued so that only those which provide new ideas will stand out.)

However, there is one other aspect to AI that differentiates it from the steam engine and word processor. It has all the power and complexity of a nuclear reactor.

Tools are not good or evil. A knife in the hands of a criminal can take a life; in the hands of a surgeon it can save one. The same is true of a nuclear reaction; the difference is that it affects more than one life at a time. Being software, AI scales like a virus, allowing for a far larger impact. For example, the data it’s trained on may contain a bias, and that bias is then amplified. The only thing a word processor ever amplified was the use of the Oxford comma.

When nuclear power was first conceived, people didn’t fully understand the implications. They knew it could be a reactor or a bomb, but they didn’t appreciate the level of proliferation, or the fact that unstable regimes would have access to it decades later, creating a global threat. ChatGPT has been compared to mansplaining: a tool that often has more confidence than accuracy. We don’t yet know the implications. Conspiracy theories (which often find the most fertile ground among people who can’t do the math themselves) spread via social networks. Those channels, and others, can now be used to rapidly spread machine-generated misinformation, especially as people abdicate the critical thinking that acts as a check on it.

Railroads were an economic boon to society. It wasn’t until 1873 that Eli Janney patented the knuckle coupler; before that, many railroad workers lost fingers to this economic boon. What’s a few fingers compared to railroad tycoon fortunes? Threshers, cars, and planes [thank you, word processor, for your devotion to Oxford commas] all began without safety features and only added them in earnest when mandated by the government. In the race to industrialize, factories sprang up without regard to pollutants and other externalities. The Cuyahoga River fire was seen as the cost of progress.

With nuclear power, we have seen accidents like Three Mile Island and Chernobyl, both under strict government regulation. What would have happened if nuclear reactors in the 1960s were built like factories in the 1860s with companies willing to say damn the consequences to others, full speed ahead? What will happen when we do this with AI?

For the labor market, AI is just an automaton, no different from a jackhammer or a calculator. It may be more complex, as are the problems it solves, but it doesn’t truly think. Job-wise, we must remember not to abdicate our need to think and strategize, and to actively understand the rote tasks being performed for us.

But a much, much, much more complex tool than the jackhammer or calculator is the nuclear reactor, which replaced tens of thousands of coal miners and provides cheap and relatively clean energy to millions. We understand how it works and have extensive safety systems, yet we still have serious accidents. In most industries other than nuclear power, regulation came decades later, and the societal costs were not insignificant. The question is how much regulation we need, and how soon, to prevent the internet from catching fire in the name of progress.

By Mark A. Herschberg