Local resources will probably become extremely expensive because the number of people who want them will be so small.
I dunno; local resources are necessary to run much of the 'thin client' to begin with. I mean, unless you are talking about having NO local processing whatsoever - something which is, to me, just crazy considering how cheap processors are these days - and NO local storage whatsoever, then why the hell would you deny people root access to their own machines? It doesn't make any sense!
You are getting hung up on "root access." You can have root access on a thin client, and it can still be a thin client model by design.
But I happen to agree with you about local resources not becoming expensive in this shift. After all, the cloud uses commodity hardware. They aren't using supercomputers; they're using the same off-the-shelf hardware that consumer computers use.
Additionally, games - one of the biggest if not the biggest driver of new sales for computer technology - are increasingly resource-intensive and the idea of a remote client for modern games is a little farcical.
I don't think you've thought this through. Growing resource intensiveness is actually a strong argument for the parallelization and lateral scalability that the cloud provides.
Games tend not to run off the cloud as much (though they increasingly do) because of the latency of delivering the multimedia art as well as the latency of response times (milliseconds matter in gaming). But games will increasingly be on the cloud as well: casual gaming is exploding on the web, and networked "online play" has become a must-have feature in gaming.
Right now there is a lot of need for local resources for these games (at the very least it's an easier way to get gigabytes of multimedia onto the computer right now), but ultimately the local machine may play more and more of a caching role, with the business logic for these games executed remotely.
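That caching split is easy to sketch. Below is a toy version (my own illustration, not any real game engine's code): heavy assets live in a local cache directory and the network is hit only on a miss, while the "business logic" is just whatever function the remote side runs. The `fetch` function stands in for a hypothetical remote asset server.

```python
import os

def make_asset_cache(cache_dir, fetch):
    """Build a get(name) that serves assets from a local cache.

    fetch(name) -> bytes stands in for a remote download; it is
    called only on a cache miss, so repeated plays cost no bandwidth.
    """
    os.makedirs(cache_dir, exist_ok=True)

    def get(name):
        path = os.path.join(cache_dir, name)
        if not os.path.exists(path):        # miss: one remote fetch, then cached
            with open(path, "wb") as f:
                f.write(fetch(name))
        with open(path, "rb") as f:         # hit: purely local read
            return f.read()

    return get
```

The point of the sketch is the asymmetry: the first request pays the network cost, every later one is a local disk read, which is exactly the "caching role" the local machine would shrink into.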
Bandwidth would have to both explode tremendously and decrease in price exponentially for that model to work for a huge percentage of computer users and 'early adopters.'
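Rough arithmetic makes the bandwidth objection concrete (the game size and link speeds below are illustrative figures I chose, not data from the thread):

```python
def transfer_time_hours(size_gb, mbps):
    """Hours to move size_gb gigabytes (decimal GB) over an mbps link."""
    bits = size_gb * 8 * 1000 ** 3          # GB -> bits
    return bits / (mbps * 1000 ** 2) / 3600  # bits / (bits per second) -> hours

# A hypothetical 8 GB game:
# at 1.5 Mbps DSL it takes roughly 12 hours;
# at 100 Mbps it takes roughly 11 minutes.
```

So until typical links look more like the second number than the first, shipping gigabytes of assets locally stays the easier path.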
Of course. That's why many people have talked this talk for many years and it has not yet come. But it will. Video online was a joke until bandwidth caught up and Flash video made the usability better (a proprietary platform, but sometimes proprietary labs lead innovation); now, in a few short years, it's ubiquitous.
Part of the reason that Thin Clients never caught on so long ago is that a huge group of people - I might as well say 'us' - don't want them and use programs for which they are not well suited.
Perhaps, but honestly the biggest reason they haven't caught on was that the technology and service offerings weren't ready. That's why despite the iPad's flaws I think it's important, as it will push the hardware specs forward and spur more innovation for other companies to improve their internet appliances and ultimately this will bring more digital consumption.
We also spend more on computers than the average person and are willing to be early adopters. This helps keep the market from being overwhelmed by the bozos out there who don't know **** about their boxes and don't care to.
I don't understand why I should be happy about a future in which computer use is dumbed down to the lowest common denominator. It represents a decline in our society's use and understanding of computers, not a gain.
I think you unfairly assume that the cloud model is "dumber," when in reality it is a dramatically more complex advancement in computer science that solves many common user problems and allows computing on a scale that was never possible locally (e.g. MapReduce, or Amazon S3 storage that scales linearly). I think you just call it dumb because you don't understand and/or like it. Scaling vertically (a more powerful computer) is dumb; scaling laterally is where it's at, and the thin client's reduced processing power is more than matched by the distributed computing of the cloud.
There's absolutely nothing dumber about leveraging the parallelization and networking of the internet, and the people most involved in this shift are the technology leaders, not the "bozos" you'd like to characterize as its target. The folks who most get technology are the ones driving it. "Bozos" aren't inventing cloud computing; it's hardcore geeks who are pushing the bounds of what is possible in the browser and increasingly relegating the OS to the role of browser-launcher. What one computer can do is nothing compared to what 100,000 networked computers can do together, and cloud computing is far more complex and powerful because it uses this expandable grid as its foundation instead of a vertically scaling local machine.
Desktop software is an old paradigm of old castle makers. The software is buggy, less connected and runs in fewer places. Read this desktop developer's experience for some examples:
He touches on just a few of the many reasons that web apps are killing off desktop apps. There are fundamental ways in which web apps are superior, and the ways they are inferior are steadily shrinking as browser technology advances (e.g. Ajax going mainstream made Google Docs possible, and more browser innovations are coming that will continue to make web apps richer).