The “Technology Quarterly” section in the March 12th-18th issue of The Economist is concerned with the future of computing. Quite a few people, many in the media, are predicting that Moore’s Law, which has described the incredible increase in computing power over the past 50 years, is beginning to break down. Initially, in 1965, Moore’s Law, named after Gordon Moore, one of the founders of Intel, predicted that the number of electrical components that could be put onto an integrated circuit would double every year.
In 1975, he changed that to every two years, and the revised prediction has been amazingly accurate. It is difficult to grasp what this means, but as the author of the first article in the section points out, an integrated circuit manufactured by Intel 44 years ago had 2,300 transistors on it, which was thought miraculous at the time. Since then the count has doubled roughly 22 times, leading to a recent Intel chip with over 5 billion transistors, spaced only 22 nanometers apart. This incredible shrinking of transistors and growth in chip power will probably continue, but at a slower rate. As the number of components on a chip has increased and their size has decreased, chips have become incredibly more powerful. But now the transistors are becoming so small that the performance gains are shrinking, and at the same time manufacturing is becoming more expensive, so it is probable that silicon chips will slow their miraculous growth to some extent.
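A quick back-of-the-envelope check (my arithmetic, not the article’s): 44 years at one doubling every two years is about 44 / 2 = 22 doublings, and 2,300 × 2^22 is roughly 9.6 billion, so getting from 2,300 transistors to over 5 billion corresponds to a bit more than 21 doublings. The order of magnitude works out.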
But no one expects the growth of computation to slow. Many organizations are working on alternate ways to build digital devices, ranging from exploiting the strange behavior predicted by quantum theory, to using light or carbon atoms, to mimicking the processes of the human brain. But The Economist articles emphasize that a great deal of development and business will accompany better and alternate uses of the integrated circuits we already possess. In particular, they speak of the predicted coming of the “internet of things”, in which low-cost, and perhaps lower-performance, integrated circuits will be embedded in almost everything we use or touch. As examples, one article predicts refrigerators that order food, washing machines that ask dirty clothes for instructions, and paving that is capable of monitoring traffic. Apparently a consulting company named Gartner is predicting that by 2020 there will be 21 billion connected devices in the world. You can find the section at http://www.economist.com/technology-quarterly/2016-03-12/after-moores-law#section-3
I read it with mixed feelings. I am an engineer, and have made use of computers since 1960, when I ran into an early IBM mainframe after going to work at the Jet Propulsion Lab. Like most people, I have been amazed at the rapidity of their growth in power. I am fascinated that the little thing in my pocket that insists on slipping out and falling on the ground has more computing power than the big computers of those days, and that the spacecraft we were designing and successfully launching on what were then exotic flights contained computers that would hardly deserve the name today. But these days I am not involved in building spacecraft or trying to model the atmosphere, and am quite frankly swamped by the relatively small amount of digital equipment I own and use. This is partly due to the constant “upgrading” I am being asked to do and learn, the proliferation of hardware and software, the various encryption and security packages I am required to have and use, and in particular the lack of standardization. Most of my friends who use Apple desk-tops (as I do) are fed up with the frequency of operating system upgrades, with what seems like backward progress in programs such as the move from iPhoto to Photos, and with the increasing “features” that make almost everything more, rather than less, confusing. In fact, many of us would switch to Windows, except we did that when the University decided to standardize on it, and although the two operating systems are not all that far apart, the distance is enough to be confusing, and Windows will not hold still either.
As to the reliability of “things”: within the last few months I have replaced a washing machine, a dishwasher, and a microwave oven because their digital control systems gave up and replacing them was more expensive than the devices were worth. The old-fashioned “machinery” part of them was fine. And now, a couple of months after I bought the microwave, the touch pads that control the cycles that depend on humidity are beginning to fail. On the other hand, we have a gas stove that is probably seventy years old and a refrigerator that is probably thirty, and they are doing fine: nothing digital.
I am tired of being unable to open my friends’ cars at night from the back seat, of being required to read the operator’s manual (if available) for modern TVs and audio systems in other people’s houses and in motels and hotels, and of risking my neck in random rental cars that have too many controls, located in what seem like strange places, operating functions I don’t want. Do we need this?
I realize we love innovation, and that digital capability is moving fast, but maybe it’s time to consider more standardization before we connect up all of those 21 billion devices. In fact, before we design them. I realize that the great diversity of digital devices makes for challenge and feelings of accomplishment for engineers, lots of IT jobs, openings for people who write instruction manuals and give classes on how to use the newest thing, and big profits for companies that can rapidly replace their products with “new and better” ones. But I want to keep using hardware and software that does what I need done, that I know how to use, that is reliable, and especially that does not become useless to me because the company developing it no longer “supports” it. Many devices being considered for the “internet of things” should have a lifetime of many years, and perhaps will be embedded in “things” in a way that makes them difficult to change, especially if the formats keep changing (think of Apple’s projector plugs, the increasing flavors of USB plugs, and the assortment of different-looking, and sometimes different-voltage, transformers that pile up in people’s basements until they hit the trash barrel).
It’s time for digital devices to start growing up. They are supposed to help us, not clutter up our lives with buying and learning to use new features we don’t need or want. Maybe if the magic growth of integrated circuits is going to slow down, we should take advantage of the opportunity to learn to build devices that fit the average user a little better, and the people who want to make exotic equipment should target users who need it—somebody must be building a simulation of expected earthquakes on Ceres.
Having criticized the direction Apple’s desk-tops are taking, I should say I think their iPhones are following a much more reasonable change pattern, since features (apps) are optional and the yearly releases don’t change radically in operation. Maybe Apple is putting more effort and thought into hand-helds than into desk-top models. But I just had to buy a new phone, because the operating system on my old one could not be encrypted by whatever software Stanford is using to secure smart phones. And it is still ridiculously difficult to change iPhone batteries.