SAN FRANCISCO--The National Science Foundation is planning an effort to fundamentally re-engineer the Internet and overcome its shortcomings, creating a network more suited to the computerized world of the next decade.
The new project, the Global Environment for Networking Investigations, was described for the first time by researchers and foundation officials at a technical meeting held in Philadelphia last week.
The project, which has not yet received financing and may cost more than $300 million, is intended to include both a test facility and a research program. As described in documents circulated by National Science Foundation officials, the network will focus on security, "pervasive computing" environments populated by mobile, wireless and sensor networks, control of critical infrastructure and the ability to handle new services that can be used by millions of people.
Peter A. Freeman, assistant director of the science foundation for computer and information science and engineering, said that "simply to provide the kind of security everyone needs and carry the huge volumes of data necessary in the future, there was strong thinking that new architectures beyond the Internet were going to be needed."
The National Science Foundation is looking for more participants for the project, including other government agencies and potentially other countries, Freeman said.
To begin the development of the network, the government agency provided six small planning grants this summer and then introduced the idea at an all-day meeting involving a group of leading computer scientists and network experts in Washington last Monday.
A new network test bed for experimentation would allow scientists to make measurements and test new design ideas in ways that are not possible with the current Internet, said Leonard Kleinrock, a computer scientist at the University of California, Los Angeles, who was involved in developing the Arpanet, the network that preceded the modern Internet.
Kleinrock said it would be possible to design a network that was better able to handle traffic from the edge of the network, at the level of individual users. In the next decade, computer researchers expect an explosion of data from mobile and wireless devices as well as sensors that will vastly outnumber today's PCs.
The project described last week is an opportunity to work from a clean slate, according to several researchers involved in planning it.
"If you look at the Internet today, it does what it does really well," said David Clark, a senior research scientist at the Laboratory for Computer Science at the Massachusetts Institute of Technology. "It's profound, but we can look at it and see some things that aren't right. The most obvious is that there is no framework for security."
When the Internet was designed in the 1970s, its engineers did not expect that the project would have to be scaled to cover much of the world's population, and security was not an important consideration.
"The culture of the original Internet was one of trust," Kleinrock said.
Faster transmission speeds are not among the design goals of the new network.
"Making a network faster has never made it more secure or easier to use," Clark said.
There are several similar experimental networks in the United States, including Internet2, the National LambdaRail and PlanetLab. There was also an earlier effort to redesign the basic Internet protocols, known as IPv6.
But those efforts address only part of the problem, and in the case of IPv6, many of its features have already migrated into the existing Internet, Clark said.
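(An illustrative aside, not drawn from the article: one concrete example of that migration is IPv6's 128-bit addressing, which is now built into standard networking stacks. A minimal Python 3 sketch, using only the standard library, shows a host handling IPv6 alongside IPv4:

    import ipaddress
    import socket

    # Parse an IPv6 address (2001:db8::/32 is the reserved documentation prefix).
    addr = ipaddress.ip_address("2001:db8::1")
    print(addr.version)   # 6
    print(addr.exploded)  # 2001:0db8:0000:0000:0000:0000:0000:0001

    # Modern hosts are typically dual-stack: the same socket API covers both
    # address families, so IPv6 rides alongside IPv4 rather than replacing it.
    if socket.has_ipv6:
        with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as sock:
            print("This host's networking stack supports IPv6 sockets")

This is a sketch of today's dual-stack status quo, not of anything proposed for the new network.)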
Clark acknowledged that one of the principal challenges facing the new project's designers was the question of how to handle the transition to a better network.
"What we need to envision the future," Clark said, is to "stop thinking about the present and saying, 'Let's put a Band-Aid here.' "
http://news.com.com/Early+look+at+effort+to+re-engineer+the+Internet/2100-1028_3-5843922.html?tag=cd.top