Lamont Wood: Future of Computing Revealed

Written by Lamont Wood

Our Lamont Wood thought long. He thought hard. Past behavior being the only reliable predictor of future behavior, he has distilled three decades of watching tech into the future of computing — as he sees it. It’s the future of computing revealed. Enjoy.

1. Pundits will continue to talk as if Moore’s Law is about to be defunded by the Republicans or something. The “law” is an observation — first made by Gordon Moore, later a co-founder of Intel, in 1965 — that the number of transistors on computer chips doubles every 18 to 24 months at roughly the same cost (the exact formulation varies). Pundits have been predicting the impending demise of Moore’s Law ever since Moore came up with it.

But customers keep demanding faster systems and the vendors keep delivering. And the results look a lot like Moore’s Law. Chip makers can no longer do it through faster clock speeds, and soon they won’t be able to through smaller silicon geometries, but there is still 3D architecture, and maybe they’ll jettison silicon for carbon nanotubes. The efficient use of parallel processing on multiple cores — as in, dozens of them — remains an undiscovered country.
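The doubling at the heart of Moore’s Law is simple compounding. Here is a minimal sketch; the starting transistor count and the ten-year horizon are invented for illustration, not figures from any real road map:

```python
def transistors(start_count: int, years: float, doubling_months: float = 24) -> int:
    """Project a transistor count forward, doubling every `doubling_months`."""
    doublings = years * 12 / doubling_months
    return round(start_count * 2 ** doublings)

# From a notional 1 billion transistors, ten years of 24-month doublings
# is five doublings, i.e. a 32x increase:
print(transistors(1_000_000_000, 10))  # 32000000000
```

Shortening the doubling period to 18 months turns the same decade into roughly 6.7 doublings, which is why the “18 to 24 months” detail matters so much to the pundits.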

2. There will be no predicting the Next Big Thing. Remember scoffing at the first camera phone? Be honest, you did. As Moore’s Law continues its relentless march, more and more computational power is being dropped into the hands of end users. And no one knows what consumers and developers will decide to do with it. If you can stop playing Angry Birds and pay attention to what I’m saying, you know what I mean.

So often, the end result is that expensive stunts turn into commonplace functions, like video conferencing. The Internet of Things, 3D printing, gamification, robots, and mobile devices seem like trending-from-stunt candidates. Or maybe it’ll be some combination of everything.

Perhaps it’ll be all the rage to make, and control with your iPhone, a robot Orc that chases you around the office with a flaming sword as part of the gamification of your business software. Or something else.

3. We will achieve Storage Nirvana. Storage won’t matter. Yes, makers will produce non-volatile RAM that retains data for decades after the power is turned off, eliminating the need for hard drives and their mortality issues. We will still be obsessed with the possibility of hardware failure and the need to back up data. Meanwhile, we will be forced to confront a bigger issue: software obsolescence.

Okay, so you have all your old WordStar files backed up. Good for you. But do you have any software that can read them?

4. Pundits will continue predicting that IT, as a profession, will disappear. Yawn. After all, they argue, you don’t see desktop publishing as a profession any more – it was absorbed by word processing, thanks to increasingly sophisticated office software.

But the analogy doesn’t stand up. Enterprises will always try to stand out from the competition by pushing the envelope in whatever field they compete, creating a frontier where IT professionals will be worth every cent they’re paid.

5. Pundits also will continue predicting that IT will become a management function like any other, akin to accounting or marketing. This is really the expression of a long-standing hope that IT professionals will become more like other middle-level managers and be easier to talk to.

In enterprises where IT actually means word processing, this is probably already the case. In enterprises where competitive survival means pushing the technological envelope, it better not ever happen. The price of eternal tech vigilance is eternal tech geekness. But maybe the IT people could, for instance, be trained to read interoffice memos for content, rather than just correcting them for typos and sending them back.

6. Pundits will continue predicting the death of the personal computer. Maybe people will adopt tablets for browsing, but powerful, late-model desktop computers (be they of the PC or Apple persuasion) will continue to be the mainstay of office work. Yes, they’re a mature technology, and thus technologically unexciting compared to the rapidly evolving mobile devices. And while they offer more and more power, allowing bigger spreadsheets, etc., the users can’t type or read faster, so the crescendo of power does not equate to excitement. Users are now able to easily swap complex electronic documents and are actually getting away from paper, but paperless was prophesied decades ago. So that doesn’t count.

7. Total computer security will remain unobtainable. Actually, the general adoption of appropriate technology and security practices could (in a perfect world) reduce old-style viruses, Trojans and worms to irrelevance. But if this were to ever happen (and don’t hold your breath) we would be left confronting what has been the real danger all along: social engineering. But right now there’s a nice young man on the phone who says he’s from the help desk of the department store where I bought my computer (didn’t I buy it there? I must have) telling me that I need to go to a web site and download some kind of software patch. He seems so helpful …

8. White collar productivity will remain a myth. For a generation or two we all have preached that personal computers can raise white collar productivity. For a generation or two we danced around the fact that we cannot measure white collar productivity. Actually, we might be able to, but first we have to get past all these emails, instant messages, phone calls, scheduled meetings, news alerts, fascinating Wikipedia articles, and funny cat videos that the computer gives us instant access to.

Considering that we used factory automation to raise blue collar productivity and then moved manufacturing overseas, perhaps it’s just as well that we’re stuck.

9. Management will fund searches for the Holy Grail of data. Today’s digital commerce environment — where you can track everything — offers the glimmering possibility of certitude. With enough processing power, all available through servers-by-the-hour in the cloud, surely a rehashing of yesterday’s data will allow insight into tomorrow. If certitude remains elusive, pile on more processors and more data. However…

10. The Holy Grail will remain a myth. To get yesterday’s data to tell you something meaningful about tomorrow, you have to query it based on assumptions about tomorrow. In plain English, that means guesses.

Yes, you can guess that the customers will behave this winter like they did last winter—if the weather, the economy, and fashion fads are the same. They won’t be, but maybe there will be enough parallels to produce predictions superior to coin-flipping.
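The “this winter will look like last winter” guess has a name in forecasting circles: a seasonal-naive baseline, which simply replays the most recent full season. A minimal sketch, with invented numbers and a hypothetical three-month “winter”:

```python
def seasonal_naive(history: list[float], season_length: int) -> list[float]:
    """Forecast the next season by repeating the most recent full season."""
    if len(history) < season_length:
        raise ValueError("need at least one full season of history")
    return history[-season_length:]

# Two winters of made-up monthly sales figures:
sales = [120.0, 95.0, 110.0,   # winter before last
         130.0, 90.0, 115.0]   # last winter
forecast = seasonal_naive(sales, season_length=3)
print(forecast)  # [130.0, 90.0, 115.0] -- last winter, replayed
```

The point of the prediction above is that anything fancier than this baseline is still built on the same assumption — that the seasons rhyme — plus more guesses layered on top.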

11. Computers will continue to offer us mere facts. We, the users, have to supply the cognitive effort that turns facts into information, and information into knowledge, and knowledge into wisdom, and so on.

There are those who prefer to skip the effort and cherry-pick from the increasingly vast storehouse of undigested data available online, looking for isolated facts that will, when properly distorted, prop up their world view. Previously this obsession at least kept them off the streets, but today they seem to be running for office.

12. Deperimeterization. You will need to learn that word. With all your devices having high-speed Internet access, you don’t need office networks. You just need to have an Internet address on each device — and probably VPN software. Admittedly, servers will probably remain preferable to cloud-based processing for a lot of tasks, and those servers will probably remain networked to each other.
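To make the idea concrete: one way each device could get its own stable Internet address plus VPN software is an overlay network. Below is a hypothetical WireGuard-style configuration fragment — the keys, addresses, and endpoint hostname are all placeholders for illustration, not anything from the article:

```ini
# Per-device config: no office perimeter, just this device's own
# overlay address and an encrypted tunnel to the peers it needs.
[Interface]
PrivateKey = <this-device's-private-key>
Address = 10.0.0.2/32            ; the device's own stable VPN address

[Peer]
# e.g. the office file server, which remains networked to other servers
PublicKey = <server's-public-key>
AllowedIPs = 10.0.0.1/32
Endpoint = server.example.com:51820
```

Each device carries its identity and addressing with it, which is exactly the deperimeterized picture: the “network” is wherever the devices happen to be.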

And there you have it, a window on the future. Feel free to call yourself a consultant, set up a palatial office, and charge hefty fees to, basically, rewrite the preceding predictions as a lengthy series of bulleted PowerPoint slides, customized for each client. Just remember to dress well—we have a reputation to uphold, after all.

I’m Lamont Wood.

Based in San Antonio, Texas, Lamont Wood is a senior editor. He’s been covering tech for trade and mainstream publications for almost three decades now, and he’s a household name in Hong Kong and China. His tech reporting has appeared in innumerable tech journals, including the original BYTE (est. 1975). Email Lamont or follow him @LAMONTwood.


  • Interesting. But Mr. Wood is just as backwards about this as are other pundits.

    It’s true that we may be coming to the end of easy Moore’s Law processing advances, at least for a while. Intel confidently shows on its road map that 14nm is in final development. They don’t mention that 22nm was late by over a quarter, or that 14nm has now been pushed back by a total of six months. They also confidently show 10, 7 and even 5 nm in research. Well, 10nm will likely be pushed back by a year. It isn’t even known yet whether 7 and 5 nm are even possible, despite all the new technologies that are being developed. At some point, the iron hand of physics comes down—and that’s it!

    New technologies such as carbon nanotubes are still quite a ways out. We don’t even know if they are a practical substitute for silicon in CPUs and GPUs, even though they look to be a possible advance for memory.

    And as for tablets being just good for browsing, as Mr. Wood states: that’s just a joke, right? He does know that millions of iPads are used in business and government for more than browsing, I hope.