Researchers Explore Scrapping The Internet

[The following article has Big Brother’s fingerprints all over it. This valuable internet tool, which somehow ‘magically’ appears before you, seems to have gotten into the hands of the wrong people… ordinary, free-thinking citizens!

Is it possible that ordinary folks just like us might stumble onto some real truths through the internet and actually discover that we, the common people, are deliberately kept in perpetual ignorance, and thus under their complete control, like penned cattle, for most of our lives?

These self-appointed cabals of ‘elite controllers’ have determined that the internet is much too accessible to the ‘useless eaters’ of the world. So now they’re trying to ‘put the genie back into the bottle’ by scrapping this indispensable tool you’re now viewing and starting over again with a “clean slate”.

Obviously, some parts of their arsenal of excuses for shutting down the internet will be pornography, spam and anything else that will stampede the masses into gullible acceptance. Hopefully, those same masses will awaken before the ‘pen masters’ are able to implement their nefarious plans.] *Elliotlakenews*

By ANICK JESDANUN, AP Internet Writer

NEW YORK – Although it has already taken nearly four decades to get this far in building the Internet, some university researchers with the federal government’s blessing want to scrap all that and start over.

The idea may seem unthinkable, even absurd, but many believe a “clean slate” approach is the only way to truly address security, mobility and other challenges that have cropped up since UCLA professor Leonard Kleinrock helped supervise the first exchange of meaningless test data between two machines on Sept. 2, 1969.

The Internet “works well in many situations but was designed for completely different assumptions,” said Dipankar Raychaudhuri, a Rutgers University professor overseeing three clean-slate projects. “It’s sort of a miracle that it continues to work well today.”

No longer constrained by slow connections, slow processors and high storage costs, researchers say the time has come to rethink the Internet’s underlying architecture, a move that could mean replacing networking equipment and rewriting software on computers to better channel future traffic over the existing pipes.

Even Vinton Cerf, one of the Internet’s founding fathers as co-developer of the key communications techniques, said the exercise was “generally healthy” because the current technology “does not satisfy all needs.”

One challenge in any reconstruction, though, will be balancing the interests of various constituencies. The first time around, researchers were able to toil away in their labs quietly. Industry is playing a bigger role this time, and law enforcement is bound to make its needs for wiretapping known.

There’s no evidence they are meddling yet, but once any research looks promising, “a number of people (will) want to be in the drawing room,” said Jonathan Zittrain, a law professor affiliated with Oxford and Harvard universities. “They’ll be wearing coats and ties and spilling out of the venue.”

[…]

Clean-slate advocates say the cozy world of researchers in the 1970s and 1980s doesn’t necessarily mesh with the realities and needs of the commercial Internet.

“The network is now mission critical for too many people, when in the (early days) it was just experimental,” Zittrain said.

The Internet’s early architects built the system on the principle of trust. Researchers largely knew one another, so they kept the shared network open and flexible — qualities that proved key to its rapid growth.

But spammers and hackers arrived as the network expanded and could roam freely because the Internet doesn’t have built-in mechanisms for knowing with certainty who sent what.
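For readers who want a concrete picture of that last point, here is a minimal Python sketch (not from the article or from any clean-slate project; the addresses are illustrative) showing that the source address in an ordinary IPv4 packet header is simply a field the sender fills in, with nothing in the packet format to prove who actually sent it:

```python
# Minimal sketch: the IPv4 header's source address is unauthenticated.
# Routers forward on the destination field and never verify the source,
# which is why the network can't know "with certainty who sent what".
import socket
import struct

def build_ipv4_header(src_ip: str, dst_ip: str, payload_len: int) -> bytes:
    """Pack a bare-bones 20-byte IPv4 header; any string can go in src_ip."""
    version_ihl = (4 << 4) | 5              # IPv4, 5 x 32-bit header words
    total_length = 20 + payload_len
    ttl, proto = 64, socket.IPPROTO_UDP
    return struct.pack(
        "!BBHHHBBH4s4s",
        version_ihl, 0, total_length,       # version/IHL, DSCP/ECN, total length
        0, 0,                               # identification, flags/fragment offset
        ttl, proto, 0,                      # TTL, protocol, checksum (left 0 here)
        socket.inet_aton(src_ip),           # source address: whatever the sender claims
        socket.inet_aton(dst_ip),           # destination address
    )

# The header is equally valid whether src_ip is the real sender or not.
hdr = build_ipv4_header("203.0.113.7", "198.51.100.9", payload_len=0)
print(len(hdr), "bytes, claimed source:", hdr[12:16].hex())
```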

The network’s designers also assumed that computers are in fixed locations and always connected. That’s no longer the case with the proliferation of laptops, personal digital assistants and other mobile devices, all hopping from one wireless access point to another, losing their signals here and there.

Engineers tacked on improvements to support mobility and improved security, but researchers say all that adds complexity, reduces performance and, in the case of security, amounts at most to bandages in a high-stakes game of cat and mouse.

Workarounds for mobile devices “can work quite well if a small fraction of the traffic is of that type,” but could overwhelm computer processors and create security holes when 90 percent or more of the traffic is mobile, said Nick McKeown, co-director of Stanford’s clean-slate program.
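To see why mobility is so awkward on today’s network, consider this small sketch (standard sockets, not any clean-slate design; the host name is illustrative): an established TCP connection is identified by the four-tuple below, so when a laptop hops to a new access point and receives a new IP address, that identity changes and the connection simply breaks, forcing the endpoints to reconnect or a workaround layer to paper over the change.

```python
# A TCP connection is named by (source IP, source port, dest IP, dest port).
# Change any element -- for example, the source IP after switching networks --
# and the connection no longer exists as far as the other end is concerned.
import socket

def connection_identity(sock: socket.socket) -> tuple:
    """Return the four-tuple that identifies this TCP connection."""
    src_ip, src_port = sock.getsockname()[:2]
    dst_ip, dst_port = sock.getpeername()[:2]
    return (src_ip, src_port, dst_ip, dst_port)

sock = socket.create_connection(("example.com", 80), timeout=5)
print(connection_identity(sock))   # a new source IP invalidates this identity
sock.close()
```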

The Internet will continue to face new challenges as applications require guaranteed transmissions — not the “best effort” approach that works better for e-mail and other tasks with less time sensitivity.
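As a rough illustration of what “best effort” means in practice (a hedged sketch using plain UDP; the address and port are made up), the send call below returns as soon as the datagram is handed to the network. Nothing promises that it arrives, arrives once, arrives in order or arrives on time, which is tolerable for email-style traffic but not for applications that need guaranteed, timely delivery.

```python
# Best-effort delivery: fire the datagram and hope. No acknowledgment,
# no deadline -- if the packet is dropped, only a higher layer can notice
# and compensate after the fact.
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(b"frame 1 of a live video stream", ("198.51.100.20", 5004))
sock.close()
```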

[…]

Full article HERE.