
Researchers Plan to Scrap the Internet


Don't know if anyone has read this article, but apparently plans are underway to scrap the Internet. I read this in April this year. Any comments?

You can read about it here: http://www.msnbc.msn.com/id/18095186/.

Part of the article says

"No longer constrained by slow connections and computer processors and high costs for storage, researchers say the time has come to rethink the Internet's underlying architecture, a move that could mean replacing networking equipment and rewriting software on computers to better channel future traffic over the existing pipes".


If you really want to improve the internet, make it more like P2P. If every computer connected to the internet acted like a proxy cache, with about 500GB of space reserved for internet data, and everyone was required to have as much upload speed as download speed, centralized servers could be reserved for tracking and backup only.

If you want to have a website, you host it on your own computer and then upload a very small tracking file to a central server. Say you have a 20MB video on your website. With normal hosting, this video must be uploaded to a server; when people download it, it taxes the server, and the video ends up sitting unused in the temporary internet files of everyone who downloads it.
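As a sketch of what that small tracking file might contain (the post gives no format, so the JSON fields below are my assumptions, not part of any real protocol):

```python
# Hypothetical tracking-file format; field names are assumptions,
# not taken from the post or the article.
import hashlib
import json

def make_tracking_file(path: str, origin_host: str) -> str:
    """Build the small record you would upload to the central server."""
    with open(path, "rb") as f:
        data = f.read()
    record = {
        "file_id": hashlib.sha256(data).hexdigest(),  # content-derived ID
        "origin_host": origin_host,                   # your own computer
        "size_bytes": len(data),                      # e.g. ~20 MB for the video
    }
    return json.dumps(record)

# make_tracking_file("video.mp4", "203.0.113.7") -> a few hundred bytes,
# versus uploading the whole 20 MB video to a host.
```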

With this method:

[Diagram: internet_3.jpg, illustrating the proposed distribution scheme]

When you download a file:

1. First you download a tracking file from the central server that holds the file ID information.
2. You then check a local server (like the neighborhood server of your ISP) for that file ID, and the local server gives you the IP addresses of all the computers in your area that have that file.
3. Your computer checks its own database to choose the computer most directly connected to yours and requests the file from it. If that fails, it tries the other computers.
4. If successful, your computer tells the local server it has downloaded the file, including the IP address of the computer it came from. The local server then tells the central server how many computers on its local network now have the file.
5. If a computer in another area requests the same file and no one in that area has it, the central server can be used to find which areas do have it, and the file can be transferred between the two areas.
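A minimal sketch of that lookup flow, with the central and local servers simulated as in-memory dicts (every name here is an assumption for illustration, not part of the original post):

```python
CENTRAL = {"abc123": {"area-1": 2}}                       # file_id -> copies per area
LOCAL = {"area-1": {"abc123": ["10.0.0.5", "10.0.0.9"]}}  # area -> file_id -> peer IPs

def request_from(peer: str, file_id: str):
    """Placeholder for the actual peer-to-peer transfer."""
    return b"file bytes"  # pretend the transfer succeeded

def report_success(area: str, file_id: str, peer: str) -> None:
    """Placeholder: the local server bumps its count and notifies central."""
    CENTRAL.setdefault(file_id, {})
    CENTRAL[file_id][area] = CENTRAL[file_id].get(area, 0) + 1

def download(file_id: str, my_area: str):
    # Steps 1-2: the tracking file gave us `file_id`; ask the local
    # (neighborhood) server which peers in our area hold it.
    peers = LOCAL.get(my_area, {}).get(file_id, [])
    # Step 3: try the most directly connected peer first, then the rest.
    for peer in peers:
        data = request_from(peer, file_id)
        if data is not None:
            # Step 4: report the successful download and its source.
            report_success(my_area, file_id, peer)
            return data
    # Step 5: nobody local has it; the central server says which areas do.
    areas_with_copies = [a for a, n in CENTRAL.get(file_id, {}).items() if n > 0]
    return areas_with_copies or None  # hand off to an inter-area transfer

print(download("abc123", "area-1"))
```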

A file would only ever be downloaded from the original source if that is the closest computer with the file. Each time a file is downloaded, there is one more server available for it, so more bandwidth becomes available for downloading that file.

The only potential problem is people replacing the actual file with a different file (like a virus) on their computer. With the tracking system, however, it would be easy to find the perpetrator.
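The post doesn't say how tampering would be caught, but if the tracking file carries a content hash (an assumption on my part, shown in the sketch above), a swapped file is detected immediately and the serving peer's IP is already on record:

```python
import hashlib

def verify(downloaded: bytes, expected_sha256: str) -> bool:
    """Reject any copy whose hash doesn't match the tracking file's hash."""
    return hashlib.sha256(downloaded).hexdigest() == expected_sha256

# If verify() fails, the local server has already logged which peer
# served the bad copy, so the perpetrator's IP is known.
```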

or maybe I'm just rambling on...

If you really want to improve the internet, make it more like P2P.

P2P fails when it comes to small files. A normal client-server connection is more efficient and faster when browsing web pages.
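Some rough, assumed numbers (illustrative only, not measurements) show why: the extra lookup round trips dominate when the payload is tiny and vanish when it is large.

```python
# Assumed figures: 40 ms per round trip, 1 MB/s effective transfer rate.
RTT_S = 0.040
RATE_BPS = 1_000_000

def client_server(size_bytes: int) -> float:
    return RTT_S + size_bytes / RATE_BPS       # one GET to the origin server

def p2p(size_bytes: int) -> float:
    # central tracker + local server + peer request, then the transfer
    return 3 * RTT_S + size_bytes / RATE_BPS

for size in (5_000, 20_000_000):               # a 5 KB page vs a 20 MB video
    print(size, round(client_server(size), 3), round(p2p(size), 3))
# 5 KB:  0.045 s vs 0.125 s -> lookups nearly triple the fetch time
# 20 MB: 20.04 s vs 20.12 s -> the overhead is noise
```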

If you really want to improve the internet, make it more like P2P.

P2P fails when it comes to small files. Normal server-client connection is more efficient and faster when browsing web pages.

It's not traditional P2P I'm talking about: each computer is a dedicated server and ISP. Yeah, it would probably be slightly slower for small files, but not BitTorrent-slow.


The problem with Freenet-style caching is that there is a high probability of exploitation that breaks certain laws, and innocent individuals may be prosecuted for content they never actively chose to share and were only sharing because the cache forced them to. You mentioned files, but this could be any material that is illegal on moral grounds within a given user's country. That would lead to regulation of the new network that would censor a lot of bad stuff in general, but some things that are alright in one society will be forbidden information in another. This could trickle down to everyday info, creating an extremely high form of censorship and destroying one of the reasons the internet is such a great tool in the first place.


I don't know about scrapping the whole infrastructure, but email sure as hell is broken.


I think ultimately this just boils down to: some moron is always trying to "reinvent the wheel." Almost all of the internet's major problems, in my opinion, stem from user error. If they want to improve the internet, or "build a better one", then they need sysadmins with baseball bats and samurai swords ready to take on anyone who even thinks about running unpatched Explorer.


I agree. I think the current setup is fine. My only problems are what individuals pay for bandwidth and the lack of basic, free, low-bandwidth access for everyone.


With all of the unlit fiber lying around, "scrapping" everything would be foolish. Time would be better spent developing replacement protocols for things like e-mail and newsgroups.

