
Good god the iPad is a steaming pile of disappointment

 
 
Cycloptichorn
 
  1  
Reply Fri 5 Feb, 2010 11:53 am
@Robert Gentel,
Robert Gentel wrote:

Cycloptichorn wrote:
This will be a sad, sad day. It represents a loss of control which I am personally unwilling to accept. When the programs don't reside on your own box, you don't own them; when the data doesn't reside locally, you can be locked out of it at any time. No thanks.

To me, this is a rather dystopian future of computing you are describing; one in which people will become beholden to subscription models for EVERYTHING they do.


Well, you can certainly describe anything negatively if you so desire; I'm sure you can even come up with something negative to say about the current paradigm of local, disconnected data and the security issues from your downloads.

There are upsides and there are downsides to each way, but ultimately it doesn't matter whether or not you or I like it. This is where it is going.


Well pardon me if I don't cheer for a future in which computer use is both dumbed down and restricted to subscription models. Guess I'm just old-fashioned that way, what with wanting actual control and ownership of my own data and where it's stored.

The thing about Chrome and Google is that they will create a device which works for me; one which allows me to have local storage and control as well as to utilize the cloud. They will do so because it is exceedingly cheap to do so (the iPad could do it right now, if they wanted to) and because the market exists to sell the product.

Everyone agrees that the iPad isn't a 'computer replacement.' You are talking as if these products eventually will be computer replacements. Unless they include the features that I'm talking about, they really won't be.

Cycloptichorn
rosborne979
 
  4  
Reply Fri 5 Feb, 2010 12:14 pm
@Cycloptichorn,
Cycloptichorn wrote:
This will be a sad, sad day. It represents a loss of control which I am personally unwilling to accept. When the programs don't reside on your own box, you don't own them; when the data doesn't reside locally, you can be locked out of it at any time. No thanks.

To me, this is a rather dystopian future of computing you are describing; one in which people will become beholden to subscription models for EVERYTHING they do.

Hearing you say this brings back memories for me. During my time at Oracle Corp (managing server support for their Apps division), this is EXACTLY what we heard from the software developers all the time. Even though Larry Ellison's vision was for ThinClients, and even though the engineers themselves were writing code to move more and more functionality to The "Cloud", none of them were comfortable with giving up any desktop power or desktop storage, even though it was costing the company huge dollars in regular equipment upgrades and challenges for my support teams.

The interesting thing was that ONLY the Engineers were of this mindset. We never heard this from the accounting people or the sales people or the marketing people or the managers. For every other user on the internal network (thousands of them), the location of their data was completely irrelevant; they didn't understand where it was, nor did they care. They only wanted quick, easy access from wherever they happened to be.

I believe our user base was very much representative of the general public as well (since it was just a small slice of that same public). I assume there will always be powerful machines and local storage for people who want them, but the vast majority of the public will (and already do) use computers as I/O devices, nothing more. And since people are already used to paying for subscriptions to services (power, cable, phone, etc.), I think they will easily gravitate toward paying for remote server services. Eventually there will be Power Providers, DataStream Providers, and Portal Providers, and they will all be subscription-based. And I suspect that Programs will all eventually be sold like Apps and run on whatever Portal Service a user has (functionality dependent on the number of dollars spent). Local resources will probably become extremely expensive because the number of people who want them will be so small.


Cycloptichorn
 
  1  
Reply Fri 5 Feb, 2010 12:24 pm
@rosborne979,
Quote:
Local resources will probably become extremely expensive because the number of people who want them will be so small.


I dunno; local resources are necessary to run much of the 'thin client' to begin with. I mean, unless you are talking about having NO local processing whatsoever - something which is, to me, just crazy considering how cheap processors are these days - and NO local storage whatsoever, then why the hell would you deny people root access to their own machines? It doesn't make any sense!

Content creation and the ability to control that content are a bigger part of the computer experience than you guys are letting on. Additionally, games - one of the biggest if not the biggest drivers of new sales for computer technology - are increasingly resource-intensive, and the idea of a remote client for modern games is a little farcical. Bandwidth would have to both explode tremendously and decrease in price exponentially for that model to work for a huge percentage of computer users and 'early adopters.'

Part of the reason that Thin Clients never caught on so long ago is that a huge group of people - I might as well say 'us' - don't want them and use programs for which they are not well suited. We also spend more on computers than the average person and are willing to be early adopters. This helps keep the market from being overwhelmed by the bozos out there who don't know **** about their boxes and don't care to.

I don't understand why I should be happy about a future in which computer use is dumbed down to the lowest common denominator. It represents a decline in our society's use and understanding of computers, not a gain.

Cycloptichorn
rosborne979
 
  2  
Reply Fri 5 Feb, 2010 12:31 pm
@Cycloptichorn,
Cycloptichorn wrote:
I dunno; local resources are necessary to run much of the 'thin client' to begin with. I mean, unless you are talking about having NO local processing whatsoever - something which is, to me, just crazy considering how cheap processors are these days - and NO local storage whatsoever, then why the hell would you deny people root access to their own machines? It doesn't make any sense!

That's pretty much what we're talking about with ThinClient, and it makes perfect sense, *if* you can provide the functionality that *enough* people want.

Cycloptichorn wrote:
Part of the reason that Thin Clients never caught on so long ago is that a huge group of people - I might as well say 'us' - don't want them and use programs for which they are not well suited.

If you really look at the data, I think you'll find that the real reason ThinClients haven't caught on yet has to do with bandwidth limitations. Oracle's internal network, which had tons of bandwidth, was rapidly going ThinClient. And now that Internet access for the general public is beginning to reach the necessary bandwidth/cost threshold... well, the writing is on the wall.

You should work for Oracle as a software developer; you would fit right in ;)
Cycloptichorn
 
  1  
Reply Fri 5 Feb, 2010 12:34 pm
@rosborne979,
I have doubts that the bandwidth will EVER keep up with the actual needs of computer users in the way that you describe. Right now pretty much all the major providers have to throttle their high-end users because they simply overload the system (Bittorrent gets throttled almost everywhere). When you have remote processing, throttling simply isn't an option - people are too used to hitting a button and having instant results.

And this is with a comparatively small amount of 'power users' using the net. Imagine what the situation would be like if EVERYONE was using massive amounts of data constantly, just to run their normal computer operations! You are talking about exponential increases in bandwidth. That isn't going to happen anytime soon.

Cycloptichorn
djjd62
 
  1  
Reply Fri 5 Feb, 2010 12:36 pm
http://www.batmancomic.info/gen/20100205113541_4b6c64fd32669.jpg
rosborne979
 
  1  
Reply Fri 5 Feb, 2010 12:47 pm
@Cycloptichorn,
Latency is the only real limiting factor, and they will eventually get around that by placing redundant storage farms at balanced geographic locations around the planet.

Bandwidth and access speeds will continue to grow at least as much as processing power does. In addition, new methodologies for dividing and overlapping the data streams will improve the carrying capacity of fiber networks and other transport media, including wireless. See Orthogonal frequency-division multiplexing for one example.
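For a rough sense of how that kind of multiplexing works, here is a minimal, illustrative OFDM round trip sketched in Python (assuming numpy is available); the subcarrier count and prefix length are arbitrary values chosen just for the example, not any particular standard:

```python
# Toy OFDM transmit/receive round trip: many parallel data streams are packed
# onto orthogonal subcarriers with an IFFT and share one physical channel.
import numpy as np

N = 64                                   # number of subcarriers (arbitrary for the sketch)
CP = 16                                  # cyclic prefix length
bits = np.random.randint(0, 2, 2 * N)    # random payload, 2 bits per subcarrier (QPSK)

# Map bit pairs to QPSK symbols, one symbol per subcarrier.
symbols = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])

# Transmitter: the IFFT turns the frequency-domain symbols into one time-domain
# signal; the cyclic prefix guards against multipath echoes.
tx = np.fft.ifft(symbols)
tx = np.concatenate([tx[-CP:], tx])

# Receiver: strip the prefix and FFT back to recover every subcarrier's symbol.
rx = np.fft.fft(tx[CP:])
recovered = np.empty(2 * N, dtype=int)
recovered[0::2] = (rx.real < 0).astype(int)
recovered[1::2] = (rx.imag < 0).astype(int)

print("bit errors:", int(np.sum(recovered != bits)))   # 0 on a clean channel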
rosborne979
 
  1  
Reply Fri 5 Feb, 2010 12:49 pm
@djjd62,
djjd62 wrote:
http://www.batmancomic.info/gen/20100205113541_4b6c64fd32669.jpg

Ha :) I can see we're going to milk this comic thing for everything it's worth ;)
Cycloptichorn
 
  1  
Reply Fri 5 Feb, 2010 12:51 pm
@rosborne979,
rosborne979 wrote:

Latency is the only real limiting factor, and they will eventually get around that by placing redundant storage farms at balanced geographic locations around the planet.

Bandwidth and access speeds will continue to grow at least as much as processing power does.


The question is, does it get cheaper at the same rate? Processors are getting pretty cheap and energy-efficient at the same time. Why start making devices which don't include them, when having local processing power provides a huge advantage in many situations - not the least of which being the ability to actually DO SOMETHING when there's no net connection.

When the power goes out, my laptop will still work (as long as there's battery life). The device you envision will do nothing. If the internet goes down in your area, you can't do anything. It's a limited device. Having a cheap local processor and storage on board removes these limitations.

Quote:
In addition, new methodologies for dividing and overlapping the data streams will improve the carrying capacity of fiber networks and other transport media, including wireless. See Orthogonal frequency-division multiplexing for one example.


Sure, but improve it to the point where we are streaming that much data? Not anytime soon.

I think if you asked people, 'Do you want a computer that does nothing at all if you can't get a web connection?' The answer would be a universal no. Sure, you can have toys and limited little devices that work this way, but hardly a replacement for our modern systems.

Cycloptichorn
Robert Gentel
 
  1  
Reply Fri 5 Feb, 2010 01:03 pm
@Cycloptichorn,
Cycloptichorn wrote:
Well pardon me if I don't cheer for a future in which computer use is both dumbed down and restricted to subscription models.


I don't care if you cheer for it or not, but you don't seem very informed about it. For example, it's not dumbed down (the networking of the data and the parallelization of the processing allow for much more powerful uses), and so far it has largely not been a subscription model at all; it's been a free, ad-supported model.

Quote:
Guess I'm just old-fashioned that way, what with wanting actual control and ownership of my own data and where it's stored.


And you can still have that with either model. It's true that for most people it would be harder to control their data in the networked model (they may have to run their own cloud), but the model itself doesn't mean you lose control.

In the currently predominant model, if you merely back up remotely you've just lost the same control over your data, and failure to back up at all tends to be a much bigger problem for end users than the theoretical concerns about data control (i.e. they have tended to be much worse about keeping their data than any of the clouds have).

Quote:
The thing about Chrome and Google is that they will create a device which works for me; one which allows you to have local storage and control as well as utilize the cloud.


You don't seem very familiar with the Chrome OS. Here is what they said:

Quote:
First, it's all about the web. All apps are web apps. The entire experience takes place within the browser and there are no conventional desktop applications. This means users do not have to deal with installing, managing and updating programs.

http://googleblog.blogspot.com/2009/11/releasing-chromium-os-open-source.html


Local resources will exist, of course; you should have a local processor, local storage, etc., if for nothing else than to cache data and reduce the latency issues. But no, their whole point with the Chrome OS is to give you no ability to install anything. There's only their browser.
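To illustrate the caching role local storage can play in a browser-only model like that, here is a minimal, hypothetical sketch in Python. The URL and cache directory are made up for the example, and this is not how Chrome OS itself is implemented; it only shows the idea of local disk cutting latency while the authoritative copy stays remote:

```python
# Local storage used purely as a latency-reducing cache in front of remote data.
import hashlib
import pathlib
import urllib.request

CACHE_DIR = pathlib.Path("./webapp_cache")   # hypothetical local cache location
CACHE_DIR.mkdir(exist_ok=True)

def fetch(url: str) -> bytes:
    """Return the resource from local disk if cached, otherwise from the network."""
    key = hashlib.sha256(url.encode()).hexdigest()
    cached = CACHE_DIR / key
    if cached.exists():
        return cached.read_bytes()             # local hit: no round trip, no latency
    data = urllib.request.urlopen(url).read()  # miss: go out to the cloud
    cached.write_bytes(data)                   # keep a local copy for next time
    return data

# Example with a hypothetical URL; the second call would be served from disk:
# page = fetch("https://example.com/app-data.json")
```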

Quote:
Everyone agrees that the iPad isn't a 'computer replacement.' You are talking as if these products eventually will be computer replacements. Unless they include the features that I'm talking about, they really won't be.


"Computer replacement" is ambiguous. When people talk about "desktop replacement" in laptops they mean something that can do it all. In that sense it certainly isn't. But for some users who just use their browsers it certainly can be and this will replace computers in millions of use cases.
Cycloptichorn
 
  1  
Reply Fri 5 Feb, 2010 01:07 pm
@Robert Gentel,
Quote:

"Computer replacement" is ambiguous. When people talk about "desktop replacement" in laptops they mean something that can do it all. In that sense it certainly isn't. But for some users who just use their browsers it certainly can be and this will replace computers in millions of use cases.


Yeah, like I said - a toy. Not for serious users. And please see my above objections to having a system which is totally dependent on web access for functionality.

Quote:

You don't seem very familiar with the Chrome OS. Here is what they said:

Quote:

First, it's all about the web. All apps are web apps. The entire experience takes place within the browser and there are no conventional desktop applications. This means users do not have to deal with installing, managing and updating programs.

http://googleblog.blogspot.com/2009/11/releasing-chromium-os-open-source.html



Local resources will exist, of course, you should have a local processor, local storage etc if for nothing else to cache data and reduce the latency issues. But no, their whole point with the Chrome OS is to give you no ability to install anything. There's only their browser.


You can still manage data on your drive - through the browser. Your computer and local data become just another site to view and edit. Can't do that on the iPad. The fact that all programs are 'web apps' doesn't mean that these apps won't let you do stuff on your local box with local data; hell, there are web apps that do that right now.

If you have local resources, then locking people out of utilizing them without being connected to a third party is ******* stupid! What's the point?

Cycloptichorn
djjd62
 
  1  
Reply Fri 5 Feb, 2010 01:20 pm
@rosborne979,
i may use it in all my replies
0 Replies
 
Robert Gentel
 
  2  
Reply Fri 5 Feb, 2010 01:41 pm
@Cycloptichorn,
Cycloptichorn wrote:
Quote:
Local resources will probably become extremely expensive because the number of people who want them will be so small.


I dunno; local resources are necessary to run much of the 'thin client' to begin with. I mean, unless you are talking about having NO local processing whatsoever - something which is, to me, just crazy considering how cheap processors are these days - and NO local storage whatsoever, then why the hell would you deny people root access to their own machines? It doesn't make any sense!


You are getting hung up on "root access"; you can have root access on a thin client and it can still be a thin client model by design.

But I happen to agree with you about local resources not becoming expensive in this shift. After all, the cloud uses commodity hardware. They aren't using supercomputers; they are using the same off-the-shelf hardware that consumer computers use.

Quote:
Additionally, games - one of the biggest if not the biggest driver of new sales for computer technology - are increasingly resource-intensive and the idea of a remote client for modern games is a little farcical.


I don't think you've thought this out well. The greater resource intensiveness is a strong argument for the parallelization and lateral scalability that the cloud gives.

Games tend not to run off the cloud as much (though they are increasingly doing so) because of the latency of delivering the multimedia art as well as the latency of response times (milliseconds matter in gaming), but games will increasingly be on the cloud as well; casual gaming is exploding on the web, and networked "online play" has become a must-have feature in gaming.

Right now there is a lot of need for local resources for these games (at the very least it's an easier way to get GBs of multimedia onto the computer right now), but ultimately local storage may play more and more of a caching role, and the business logic for these games can be executed remotely.

Quote:
Bandwidth would have to both explode tremendously and decrease in price exponentially for that model to work for a huge percentage of computer users and 'early adopters.'


Of course. That's why many people have talked this talk for many years and it has not yet come. But it will. Video online was a joke till bandwidth caught up and Flash video made the usability better (on a proprietary platform, but sometimes proprietary labs lead innovation), and now, a few short years later, it's ubiquitous.

Quote:
Part of the reason that Thin Clients never caught on so long ago is that a huge group of people - I might as well say 'us' - don't want them and use programs for which they are not well suited.


Perhaps, but honestly the biggest reason they haven't caught on was that the technology and service offerings weren't ready. That's why, despite the iPad's flaws, I think it's important: it will push the hardware specs forward and spur other companies to improve their internet appliances, and ultimately this will bring more digital consumption.

Quote:
We also spend more on computers than the average person and are willing to be early adopters. This helps keep the market from being overwhelmed by the bozos out there who don't know **** about their boxes and don't care to.

I don't understand why I should be happy about a future in which computer use is dumbed down to the lowest common denominator. It represents a decline in our society's use and understanding of computers, not a gain.


I think you unfairly assume that the cloud model is "dumber" when in reality it is a dramatically more complex advancement in computer science that solves many common user problems and allows for computing on a scale that was just never possible locally (e.g. things like MapReduce, or Amazon S3 storage that scales linearly). I think you just call it dumb because you don't understand and/or don't like it. Scaling vertically (a more powerful computer) is dumb; scaling laterally is where it is at, and the thin client's reduced processing power is more than matched by the distributed computing of the cloud.

There's absolutely nothing dumber about leveraging the parallelization and networking of the internet, and the people most involved in this shift are the technology leaders, not the "bozos" you'd like to characterize as its target. The folks who most get technology are the ones who are driving this. "Bozos" aren't inventing cloud computing; it's hardcore geeks who are pushing the bounds of what is possible in the browser and increasingly relegating the OS to the role of browser-launcher. What one computer can do is nothing compared to what 100,000 networked computers can do together, and cloud computing is much more complex and powerful for using this expandable grid as its foundation instead of a vertically scaling local machine.
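As a rough illustration of that map/reduce, scale-out idea, here is a toy word count in Python that splits the "map" work across local worker processes; in a real cloud the same split runs across many machines instead. The input lines are made up for the example:

```python
# Minimal MapReduce-style word count: scaling laterally by splitting the work.
from collections import Counter
from multiprocessing import Pool

def map_chunk(lines):
    """Map step: count words in one chunk of the input."""
    counts = Counter()
    for line in lines:
        counts.update(line.lower().split())
    return counts

def reduce_counts(partials):
    """Reduce step: merge the per-chunk counts into one result."""
    total = Counter()
    for c in partials:
        total.update(c)
    return total

if __name__ == "__main__":
    # Hypothetical input; on a real cluster each chunk would live on a different node.
    lines = ["the cloud scales out", "the desktop scales up", "the cloud wins on scale"]
    chunks = [lines[i::4] for i in range(4)]          # split the work four ways
    with Pool(4) as pool:
        partials = pool.map(map_chunk, chunks)        # "map" runs in parallel
    print(reduce_counts(partials).most_common(3))     # "reduce" combines the partials
```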

Desktop software is an old paradigm of old castle makers. The software is buggy, less connected and runs in fewer places. Read this desktop developer's experience for some examples:

http://www.kalzumeus.com/2009/09/05/desktop-aps-versus-web-apps/

He touches on just a few of the many reasons that web apps are killing off desktop apps. There are fundamental ways in which web apps are superior, and the ways they are inferior are steadily decreasing with the advancement of browser technology (e.g. AJAX going mainstream meant Google Docs was possible, and more innovations in the browser are coming that will continue to make web apps richer).
Robert Gentel
 
  1  
Reply Fri 5 Feb, 2010 01:45 pm
@rosborne979,
rosborne979 wrote:
Latency is the only real limiting factor, and they will eventually get around that by placing redundant storage farms at balanced geographic locations around the planet.


IMO the real limitation isn't bandwidth or network capacity but the availability of a data connection at all. There are still many places where the internet is not an always-on resource, and that is like having a hard drive that fails.

So IMO wireless data evolution is what drives this, and the more like a utility the internet data connection becomes, the more reliable the cloud is.
DrewDad
 
  1  
Reply Fri 5 Feb, 2010 01:47 pm
@Cycloptichorn,
Cycloptichorn wrote:
Quote:
leaving more and more people using their computers essentially as a monitor and keyboard that boots a browser.


This will be a sad, sad day.

For most people, that day is already here. Most people don't know or care what processor they're running, or how much RAM or disk storage they have.

They want their computer to run. They want it to run reliably. They want to check their E-mail, and surf the web, and do their daily work.



Now, your needs may be different, but you are not the marketplace. You have your needs, I have my needs, and Apple is looking for the sweet spot that means their product will be bought by a bunch of people.
0 Replies
 
DrewDad
 
  1  
Reply Fri 5 Feb, 2010 01:51 pm
@Cycloptichorn,
Cycloptichorn wrote:
I don't understand why I should be happy about a future in which computer use is dumbed down to the lowest common denominator. It represents a decline in our society's use and understanding of computers, not a gain.

This has been said about nearly every technology ever developed, and it is a silly argument.

Further, are you saying you really understand your computer? You understand how a transistor works? Boolean logic? How RAM works? You know what a register is?
Cycloptichorn
 
  1  
Reply Fri 5 Feb, 2010 01:54 pm
@DrewDad,
DrewDad wrote:

Cycloptichorn wrote:
I don't understand why I should be happy about a future in which computer use is dumbed down to the lowest common denominator. It represents a decline in our society's use and understanding of computers, not a gain.

This has been said about nearly every technology ever developed, and it is a silly argument.

Further, are you saying you really understand your computer? You understand how a transistor works? Boolean logic? How RAM works? You know what a register is?


100%. I built my first x86 computer when I was five years old (with my dad's help of course) and have been working on them extensively my entire life. I would wager I understand how modern computer hardware works as well as anyone.

Why would you assume that I don't know these extremely simple things?

Cycloptichorn
rosborne979
 
  1  
Reply Fri 5 Feb, 2010 01:55 pm
@Cycloptichorn,
Cycloptichorn wrote:
The question is, does it get cheaper at the same rate?

Yes.

Cycloptichorn wrote:
I think if you asked people, 'Do you want a computer that does nothing at all if you can't get a web connection?' The answer would be a universal no.

I think you would be surprised.

But you also have to ask a fair question. iPod- and iPad-level ThinClients will still have substantial functionality even without web access (just as they do today). Most people just want to browse the web, read email, listen to music, and watch podcasts and small shows. All of this can be fed through downloads and accessed online or offline.

Go ask some people who are not computer people if they care whether their computer ran Word over the network or on the local drive (if they couldn't tell the difference), and see what they say. If you can even get them to understand the question, I think you'll find that most people don't care where the data is; they only care about getting stuff easily.

DrewDad
 
  2  
Reply Fri 5 Feb, 2010 01:56 pm
@Cycloptichorn,
Cycloptichorn wrote:
when having local processing power provides a huge advantage in many situations - not the least of which being the ability to actually DO SOMETHING when there's no net connection.

Most people can't DO SOMETHING when there is no net connection, because DOING SOMETHING involves communicating over the network.


Cycloptichorn wrote:
When the power goes out, my laptop will still work (as long as there's battery life). The device you envision will do nothing. If the internet goes down in your area, you can't do anything. It's a limited device. Having a cheap local processor and storage on board removes these limitations.

And having local storage imposes limitations, too. If you drop your laptop, you've lost any changes that you've made since your last backup. If you lose your laptop, then proprietary information is potentially exposed.





You're sounding reactionary, frankly. The 'net is moving on, and you best do so as well.
Robert Gentel
 
  1  
Reply Fri 5 Feb, 2010 01:58 pm
@Cycloptichorn,
Cycloptichorn wrote:
Sure, but improve it to the point where we are streaming that much data? Not anytime soon.


You have a strange misconception about bandwidth. The networks aren't overloaded by cloud computing and thin clients; their biggest problems are multimedia downloads.

The growth of thin client models isn't a bandwidth issue so much as a concurrent-users issue, and the data providers will just have to improve. They whine a lot about it because they are defending things that a data connection as a commodity would kill (things like charging you for phone calls that can be made for free on the internet, or charging you for text messages at a rate more expensive than it would cost to send data to the moon, etc.).

Quote:
I think if you asked people, 'Do you want a computer that does nothing at all if you can't get a web connection?' The answer would be a universal no.


That's because they are used to not getting an internet connection, but see how that is changing? A few years ago the notion of the Kindle's Whispernet would have been hard to get. "What? I download books when I happen to have a cellular data connection?"

And this device is yet another that incorporates an "always-on" internet model. Sure, the cell data connection isn't perfect but for some people in some regions it will be a huge step closer to internet everywhere.

If you asked people that question but gave them the option of the always-on internet, you may get a different answer, and internet connections are only going to become more ubiquitous as wireless evolves.

Quote:
Sure, you can have toys and limited little devices that work this way, but hardly a replacement for our modern systems.


Just for reference, the model you are arguing for is the older one; you are arguing against the change to the more modern one (and sounding a bit like a tech dinosaur in the process). Not that describing something as "modern" makes it good or anything, but if you are going to imply as much, it works better when it's not the older model being called the "modern" one.
0 Replies
 
 
