Centralized
Many consider the first computer to be the Z1, invented by the German engineer Konrad Zuse in 1936 and 1937. The Z1 could perform Boolean logic and binary floating-point arithmetic. Its biggest problem was that it was unreliable, as you would expect from an electromechanical machine: moving parts mean things can wear out. Zuse went on to create the Z3 in 1941, which is considered to be the first programmable computer, although, like the Z1, it was electromechanical.
The first purely electronic computer was ENIAC (Electronic Numerical Integrator And Computer). It was created at the U.S. Army's Ballistic Research Laboratory in 1945 by J. Presper Eckert and John W. Mauchly of the University of Pennsylvania, and it was used to calculate ballistic trajectories. Like the Z1, its input went through relays, set by flipping switches. This made "loading" a program take far longer than was feasible.
Computers in the years immediately following, into the early 1950s, got better, but they were still limited to a single, tedious input method. During this period, getting a program into the machine became important, and paper tape was used for quite a while. It was during the 1960s that IBM came up with terminals for connecting to its mainframes. At first, these were hardcopy terminals, meaning they were like a typewriter with a roll of paper: you typed on it and interacted with the mainframe that way. By the time the 1970s rolled around, those terminals had become CRTs, a screen with an attached keyboard.
The idea with these types of computers was that the computer itself was centralized, with various terminals attached to it. In some cases, remote access could be obtained through a phone line.
Coin Flip
But then in the 1980s, with the advent of IBM PCs, Windows, MacOS, and Unix, things began to go the other way. The industry moved toward single servers that would serve up a single application, such as email or file and print, for a particular floor of a university or business office. From there, the concept of distributed versus centralized computing would see-saw every several years. At the places where I worked, we had plenty of mainframes as well as distributed systems. It was never a holy war; whether an application ran on a mainframe or a distributed system simply depended on the application.
Grid computing came along in the early 2000s, and some consider it an early form of cloud computing. It meant lots of hardware and software being shared across various applications. From there, we have come to cloud computing. Cloud computing is that same blend of shared hardware and software running in the background; where it differs is that you can have both centralized and distributed systems under the hood. This is where things such as PaaS (Platform as a Service) and SaaS (Software as a Service) come from. You can have individual servers running in the cloud as well as software instances, without ever knowing what actual operating system or software is running the instance.
This leads me to wonder what the next technological innovation will be. Like the distributed systems of years ago, it feels like this is as good as it's going to get, but believing that would make me naive. As exciting as grid computing was in the early 2000s, cloud computing is just as exciting. We are about as close to the appliance-like computer as we have ever been: when an instance goes down, it gets destroyed and a new one is created in its place. In the end, the cycle of distributed versus centralized computing ended with both winning. So could this be the best of all worlds and the last great technology in computing? I would say no. There will be something new and shiny coming along in the next decade, and we'll all jump on it the moment it arrives.