I've just read the Pew Research Center's annual survey on cloud computing. As ever, it contains plenty of thought-provoking commentary from experts. One that really caught my attention was this quote from Nokia's Davis Fields:
"It's 2010 and I could already basically use only cloud-based applications on my computer. Local storage is already increasingly irrelevant – I have my all my photos stored on Flickr, my address book is in my Gmail and I've got all my emails stored there as well. Apple will likely move iTunes online in the next few years, and streaming movies from Netflix will eliminate the need to download movie files. I use Microsoft Office and Photoshop out of familiarity as my main two desktop apps, but good alternatives already exist online. I predict most people will do their work on ‘screens connected to the web,’ There won't be any sort of ‘computer’ anymore."
Now, I agree with the sentiment that every screen will be connected to the web and the experience on each will be interchangeable - each will be an extension of the other and a portal into data hosted elsewhere. The oversimplification is that each screen is, of course, a computer in its own right. Google TV's tie-up with Sony and Intel to produce connected TVs is an example of this. The TV, like the mobile phone, is becoming an interactive computer that enables rich use (as opposed to the computers that perform operational tasks behind the scenes today).
I suggest that the next step in the dynamics of the cloud will be to aggregate all this spare computing power and use it to form part of a flexible resource pool. The SETI project and similar initiatives started a wave of this in the early part of this century, but I can't help feeling that a more market-based approach, where resources are requested and drawn dynamically based on the prevailing 'weather', could open up new opportunities. Could there be money to be made in feeding your spare processor resources back up to the 'grid' when the machine is idle? As with power generation at home, the actual revenue is tiny, but psychologically it incentivises users to participate and may therefore add to the available resources at only incremental cost.
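To make the idea slightly more concrete, here is a minimal sketch in Python of how such a spot market for idle cycles might be simulated. Everything in it is hypothetical - the device and job names, the pricing rule, the single-round clearing - it simply illustrates devices advertising spare capacity, workloads bidding for it, and micro-payments accruing to the contributors.

```python
import random
from dataclasses import dataclass


@dataclass
class Device:
    """A connected screen (TV, phone, PC) offering spare cycles when idle."""
    name: str
    spare_units: int      # idle compute units on offer this period
    min_price: float      # smallest per-unit payment the owner will accept
    earnings: float = 0.0


@dataclass
class Job:
    """A cloud workload requesting compute from the 'grid'."""
    name: str
    units_needed: int
    max_price: float      # most the buyer will pay per unit


def clear_market(devices, jobs):
    """Naive single-round clearing: cheapest offers serve the highest bidders."""
    offers = sorted(devices, key=lambda d: d.min_price)
    bids = sorted(jobs, key=lambda j: j.max_price, reverse=True)
    for job in bids:
        for device in offers:
            if job.units_needed == 0:
                break
            if device.spare_units == 0 or device.min_price > job.max_price:
                continue
            used = min(device.spare_units, job.units_needed)
            price = (device.min_price + job.max_price) / 2  # split the surplus
            device.spare_units -= used
            job.units_needed -= used
            device.earnings += used * price
    return devices


if __name__ == "__main__":
    random.seed(1)
    grid = [Device(f"screen-{i}", random.randint(1, 5),
                   round(random.uniform(0.001, 0.01), 4)) for i in range(5)]
    work = [Job(f"job-{i}", random.randint(2, 6),
                round(random.uniform(0.005, 0.02), 4)) for i in range(3)]
    for d in clear_market(grid, work):
        print(f"{d.name}: earned ${d.earnings:.4f}, {d.spare_units} units unsold")
```

The per-device earnings come out to fractions of a cent, which is rather the point: the money itself hardly matters, but it gives people a reason to leave their idle screens plugged into the pool.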
Just a thought. Might be far-fetched!
"It's 2010 and I could already basically use only cloud-based applications on my computer. Local storage is already increasingly irrelevant – I have my all my photos stored on Flickr, my address book is in my Gmail and I've got all my emails stored there as well. Apple will likely move iTunes online in the next few years, and streaming movies from Netflix will eliminate the need to download movie files. I use Microsoft Office and Photoshop out of familiarity as my main two desktop apps, but good alternatives already exist online. I predict most people will do their work on ‘screens connected to the web,’ There won't be any sort of ‘computer’ anymore."
Now, I agree with the sentiment that every screen will be connected to the web and the experience on each will be interchangeable - each will be an extension of the other and a portal into data hosted elsewhere. The oversimplification is that each screen is, of course, a computer in its own right. Google TV's tie up with Sony and Intel to produce connected TV's is an example of this. The TV, like the mobile phone, is becoming an interactive computer that enables rich use (as opposed to the computers that perform operational tasks behind the scenes today).
I suggest that the next step in the dynamics of the cloud will be to aggregate up all this spare computing power and use it to form part of the flexible resource. The Seti project and similar initiatives started a wave of this in the early part of this Century, but I can't help feeling that a more market-based approach where resources are requested and drawn dynamically based on 'weather' could open up new opportunities. Could there be money to be made in enabling your spare processor resources to be fed back up to the 'grid' when the machine is idle? As with power generation at home, the actual revenue made is tiny, but psychologically it incents users to participate and may therefore add to the resources available at incremental cost.
Just a thought. Might be far fetched!
What I should have also said is that the computer in the screen is likely to have significant resources to spare thanks to Moore's Law. Just as the iPhone is as fast and capacious as a late-'90s PC...