__/ [ BearItAll ] on Tuesday 09 January 2007 12:01 \__
> Roy Schestowitz wrote:
>
>> Why Google's Network Will Kill Node Computing
>>
>> ,----[ Quote ]
>> | Node technology is already inferior to network computing. In not too
>> | many years, most node computing technology will also be rendered
>> | totally obsolete by the coalescing network computer. People will
>> | still use their node computers, but it will become a popular hobby
>> | like making your own beer, or repairing your own car. Maybe it is
>> | time for the stock market to assign a more accurate risk premium
>> | to the node computing companies given the near certain likelihood
>> | of an eventual meltdown in Nodeville.
>> `----
>>
>> http://biz.yahoo.com/seekingalpha/070108/23709_id.html?.v=1
>>
>
> Part of the idea of nodal computing in a network is shared processing,
> which has happened to some extent: each computer is a node in a common
> network.
Yes, I think so as well. Some would argue that the power of networks is
barely used. Much of the time I can get by with little RAM because I run
applications remotely.
> I suppose you could say that BitTorrent is one use of the nodal system,
> though it isn't really shared processing. A truer example was the BBC's
> experiment where your client actually added to the processing power of the
> whole calculation.
>
> If the Internet could be a totally trusted arena, then imagine the power
> of all the PCs on the Internet working together to share CPU time. Take
> the free CPU time of your own machine and multiply that by the one billion
> Internet users. That is a lot of spare CPU time.
Some search engines and indexers capitalise on this notion. What if you
offered free services (applications) in exchange for compute share? Some
would argue that Google gives free services in exchange for personal data
which can be used for all sorts of things (e.g. Google Trends). Privacy and
ad revenue aside...
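The pooled-CPU idea can be sketched as splitting one big job into work
units and farming them out to volunteer nodes. A toy sketch in Python,
with a thread pool standing in for networked machines (the workload and
chunk sizes are invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def crunch(chunk):
    # Stand-in for one work unit a volunteer node would process
    # (e.g. a slice of a climate-model run, as in the BBC experiment).
    return sum(x * x for x in chunk)

# Split one large job into ten work units and hand them to 4 "nodes".
chunks = [range(i, i + 100_000) for i in range(0, 1_000_000, 100_000)]
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = pool.map(crunch, chunks)

# Recombine the partial results, as a coordinating server would.
total = sum(partials)
```

In a real volunteer grid the coordinator also has to handle stragglers and
untrusted results (hence the "totally trusted arena" caveat above), usually
by issuing each work unit to more than one node and comparing answers.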
> (The 1 billion is Internet users of all kinds, including phones/PDAs etc.,
> so not an accurate sum, but the numbers are still large).
>
> I do think that applications will, and should, move back to the servers.
> That is both in-company and internet applications. But only in a way that
> the processing is sensibly shared between the server and client. Java is
> still possibly the most likely way this would work on the client, with the
> server putting out the client java/applets so that the user has the
> {view/interface/some tools} code local for best response.
As is usually argued, backups, patching, upgrades, security (e.g. firewalls)
etc. are then managed centrally, so one sysadmin can administer and manage
many users without leaving his/her seat, even logging in remotely.
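The server/client split described above (heavy processing on the server,
only view/interface code local for responsiveness) can be sketched with
Python's standard library rather than Java applets, purely for brevity;
the endpoint and the factorial workload are invented:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class ComputeHandler(BaseHTTPRequestHandler):
    """Server side: does the heavy computation centrally."""

    def do_GET(self):
        n = int(self.path.strip("/"))     # e.g. GET /10
        result = 1
        for i in range(2, n + 1):         # the "heavy" work unit
            result *= i
        body = json.dumps({"n": n, "result": result}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):         # keep the demo quiet
        pass

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), ComputeHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Thin client: fetch the result, keep only lightweight "view" code local.
with urlopen(f"http://127.0.0.1:{port}/10") as resp:
    payload = json.load(resp)
display = f"{payload['n']}! = {payload['result']}"

server.shutdown()
```

The point of the split is that the client never needs the computation code
installed or patched locally; it only formats what the server sends, which
is what keeps administration central.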
> Though of course there are other interactive systems that might prove
> better, for example PHP on the server and PHP locally too, to give better
> responses for the user. The reason I say Java is that the install/update
> system is already well proven, and this has to be a system that can make
> use of the client but is not dependent upon it.
>
> I find this sort of thing is one of the few exciting areas of computing
> left. It would be a huge boon for Linux, because as I said, the
> applications are meant to be platform independent.
Absolutely. Since you have well-defined interfaces such as SQL, you also rid
yourself of lock-in and vendor dependence. It ends customer abuse, whether
of the buyer or the end user.
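A well-defined interface like SQL is what makes the back end swappable: the
same standard statements should run against any conforming engine. A minimal
sketch (sqlite3 is used here only because it ships with Python; the table
and data are made up):

```python
import sqlite3

# Standard SQL: nothing below is specific to SQLite, so the same
# statements could be pointed at another conforming engine.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, visits INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", 3), ("bob", 5)])
total = conn.execute("SELECT SUM(visits) FROM users").fetchone()[0]
conn.close()
```

Because the application talks to the interface rather than to one vendor's
implementation, switching engines is a configuration change rather than a
rewrite, which is exactly what undercuts lock-in.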
--
~~ Best regards
Roy S. Schestowitz | Gas, brake, honk! Honk, honk, punch! Gas, gas!
http://Schestowitz.com | Open Prospects ¦ PGP-Key: 0x74572E8E
Tasks: 130 total, 1 running, 127 sleeping, 0 stopped, 2 zombie
http://iuron.com - knowledge engine, not a search engine