The first time I ever made a computer actually do something, I felt a bit like a magician. I was on an ancient little system from the '80s called the Acorn Electron, and with a couple of lines of BASIC, I did something very simple, like change the color of the screen or fill it up with words. It was incredible, and empowering.
Making computers do things is of course so much easier now. The connected era and graphical computing have long since supplanted the arcane, hands-on days of actual programming. Now we can make a website or edit a video without really having to know anything technical at all.
That change has inarguably democratized computing — but it has also come with unexpected costs. Consider just this week: Apple released the newest version of its Mac operating system and signaled that it will soon no longer allow software that Apple hasn't approved to be installed. In short, an ideal of computing in which users can make their devices do whatever they want seems to be dying — and we'll all be worse off for it.
That locked-down model is of course familiar to anyone who has used a smartphone or tablet; it's exactly how the App Store works. And to be clear, I think there is real power in having certain locked-down, closed-off computing systems. Being able to hand my elderly parents an iPad and tell them "nothing you can do will break this" is immensely freeing, and the fear and intimidation they felt with more traditional computers is gone.
That is ostensibly the reason Apple decided to change how the Mac works. Starting in 2020, all software installed on a Mac will have to be "notarized," meaning approved by Apple's developer program to verify that it contains no malware or spyware. For most people, that's great news. But it also means that do-it-yourself software and obscure or older programs will now have to clear a much higher bar to reach users. App creators who can't afford Apple's developer fee, who want to work independently, or who simply make custom apps for friends or coworkers to automate some specific task will face more work and higher costs.
I recently started a new project that involves a bit of self-publishing. To do it, I downloaded a piece of writing software still in beta, and a little app that lets me upload images and then automatically produces a link. I'm elated at how well it all works. But it's exactly the sort of niche, custom workflow that I can only really accomplish on a full computer, rather than a tablet or more locked-down operating system. And it's just that sort of personalized approach that Apple's new move will make harder.
Yes, this is a niche concern. But such marginal examples tend to augur broader changes that affect the masses. First comes a minor annoyance, like the debut of the Mac App Store, which only admits pre-approved apps. A few people kick up a fuss, but they're dismissed as paranoid because other avenues for installing software still exist. Then the model of pre-approved software extends to the whole operating system, and in hindsight, those early critics weren't so paranoid after all.
This insidious erosion of users' control and freedom goes beyond any single operating system. It is also playing out in cloud computing. Just this week, Adobe cancelled the accounts of all its users in Venezuela. The company says the move was to comply with the current U.S. administration's sanctions against the country, though Microsoft and other cloud computing providers haven't followed suit. Suddenly, users in Venezuela, whether they support or oppose the government, are shut out.
The point is that each evolution of tech, from the wild west of open computing to app stores to the shift from local software to the cloud, comes with hidden downsides. Cloud computing has real advantages, constant updates and improved security among them. But the centralization of control and removal of user choice, despite making computing easier for the masses, chips away at user autonomy. In prior eras, software you bought continued to exist and function on your device regardless of what happened in the outside world; there was real power in knowing that a tool you relied on would keep working.
Computers offer enormous power for democracy, individual autonomy, and even forms of resistance. But that power isn't guaranteed, nor is it safe from interference. Quite the contrary: as the freedom to use our computers and software as we see fit disappears, our capacity to create autonomously is degraded.