A Practical Navigator for the Internet Economy

SURFnet Users Pay Cost of Connection

While Gov't Funding Buys Advanced Infrastructure Capabilities That Serve as Enablers of New Applications

Like Canada, an Emphasis on Dark Fiber Pushes the Network Forward While Other European Nations Focus Much More on Applications,

pp. 1-8

We interview Kees Neggers, Managing Director of SURFnet, who explains first and foremost the government's policy that the research and education community must pay for its own Internet connectivity, while government subsidies are used to "tender" for a commercially advanced network: "What we did is to ask for things that we think are technically possible, but for whatever reason the commercial market is not yet willing to deliver."

He states that "the nature of the resulting contract is more a partnership for a common development than just a normal supply contract. Research people from both Telfort (BT) and Cisco Systems are actively involved in our activities." Since the late 1980s he has operated on the following dynamic: "With the 'innovation' money we built a network - SURFnet1. With the user money, we made it operational and kept it operational. In parallel, we used the government money to build the next generation network. Now we are building SURFnet5 with government money and the users are paying for SURFnet4, which is essential, because it is the lifeline to the Internet for our customers."

Kees' philosophy is that leading-edge network infrastructure will enable many new and productive applications. How the bandwidth is produced is of primary importance, as government money is used to catalyze efforts by the commercial carriers in bringing new technologies to market.

SURFnet is asking for and receiving the same kind of optical, customer-owned lambda network as CANARIE in Canada. "In our next generation networking, we are no longer just talking to carriers to make plans, we are talking to customers to make plans. And the carriers are welcome to deliver what we ask. But you see, we have the strength of the end user community to tell them what we want. As with SURFnet5, we told them what we wanted: 10 gig lambdas, rather than ATM and SDH."

The fiber builds that SURFnet undertakes for its backbone are also used to attract and create commercial fiber infrastructure. SURFnet's philosophy has been one of continually pushing the envelope. To do this it needs to be able to move quickly and make its own decisions promptly. In part for these reasons it has chosen not to participate in the next generation pan-European GEANT project, which is a consortium of European national research and education networks.

Neggers: "SURFnet has found that European networking so far tends to be driven by the speed of the slowest. I don't like that model. It is a recipe for not being able to be state of the art. We wanted to use the opportunity of the setting up of GEANT to learn from the past and improve the situation. However, the way the GEANT network is organized is a continuation of the structure from the past. DANTE has no central management. It's a consortium of some twenty-six national research networking entities. All 26 have to agree on everything. I didn't want to be the 27th of that group. So my proposal was: DANTE should do the procurement, should operate the network, and I want to be a customer of DANTE. And the consortium should only be a consortium in its relationship to the European Commission to define the set-up. But none of the participants in Europe were willing to go that route." While the Netherlands is the only European country not participating in GEANT, SURFnet is connected to the pan-European TEN-155 research backbone and will connect to GEANT as well.

SURFnet has been following the development of OBGP. According to Kees: "The reason that Bill St. Arnaud and I are interested in this is that we are not providers. We work for a user community. If it is better for the user community to get many lambdas on their premises, we will deliver them. However, a provider might well want to keep a provider relationship where the provider is the smart network provider and it can force you to work through it."

SURFnet policies are designed so that the network's expenditures serve as a magnet for other telecom players and communities to lay dark fiber. "Everybody in the Netherlands is allowed to dig fiber in the ground and own it. . . If one provider asks permission to lay a fiber, it will be announced, and all others who are interested in the same route are free to use the same digging, and then the providers have to share the costs of the digging together." The national policy of the Netherlands government is to create conditions such that the combined actions of the research and education and commercial sectors will create a fiber-based infrastructure for the entire nation and keep the country in the top ten of European nations in information technology infrastructure expenditures.

State of the Internet:

Light, IP and Gigabit Ethernet

A Road Map for Evaluation of Technology Choices Driving the Future Evolution of Telecommunications,

pp. 9-10

From the COOK Report Annual Report: There appears to be a choice of two paths to our telecom future. One is to go with the highly innovative pure Internet approach of gigabit Ethernet over condominium fiber. Such a choice empowers the customer, favors decentralization over centralized control, and provides small and innovative businesses with the environment that they need in order to flourish. The other path is to try to forestall the innovation by squashing competitors with a massive vertically integrated company founded on older technology, leveraging access to content over a network monopoly so pervasive that people will find they have no choice but to buy it. What could be in store for us all, if things go in this direction, was summarized by Scott Cleland, CEO of the Precursor Group, on Friday, January 19th, 2001: "Precursor believes AOL-TW has budding 'Microsoft-like' potential to grow increasingly dominant by being the leading national company that brings together the various online interfaces (content, Internet access, buddy lists, instant messaging, etc.) to become the de facto consumer online access market standard much like Microsoft Windows brought together the various desktop applications to become the de facto consumer software market standard." See http://cookreport.com/lightipgige.shtml

Bandwidth, User Tools Migrate Toward Network Edge, Fueling Idea of Always-On Disciplinary Computational Grids

User Control of Bandwidth Raises Interest in Shaping Network Environment to Needs of Subject Matter Communities,

pp. 11-18

In the context of broadband networks, grids are becoming a much discussed subject. We interview Peter MacKinnon, who is Managing Director, Synergy Technology Management. Grids are seen as a pervasive computing fabric into which users can plug. Says MacKinnon: "Computational grids are viewed as a network of distributed computing resources that can work both cooperatively and independently of each other. They allow applications to operate in a distributed multi-platform environment across various geographic scales defined by the physical networks involved. Computational grids represent one of the frontiers of computing. They raise many fundamental challenges in computer science and communications engineering, many of which have to do with partitioning a problem across multiple machines, latency in the networks, and administration and allocation of the grid's resources."

Then there are access grids. According to MacKinnon: "That's another way to look at the grid, where it provides access to devices, such as, say, radio telescopes or optical telescopes. I make these references in technical terminology or scientific terminology, because this is the locus of this grid frontier. It's not in the commercial world yet. For the time being we're not talking about applications that relate to customer relationships."

A more complex example, called 'Neptune: a Fiber Optic Cable to Inner Space', is at this point only a proposal. It is being led by Woods Hole and JPL in the U.S. and the Department of Fisheries and Oceans in Canada, at this stage; there are other players. Basically, the intent is to put fiber optics in a grid-like form on the Juan de Fuca plate off the West Coast of North America. This will be an ocean floor-based grid with nodes spaced a hundred kilometers apart, each with a receptor. That receptor, or junction box, would be analogous to a satellite or space station system into which you will be able to plug instrument packages.

A number of financial systems could be turned into a grid by being connected within a single phone network. With such a grid, a global trading house, for example, could end up having its neural network system in London connected with that in New York, with that in Tokyo, with that in Sydney. So now you're monitoring on a different level. If you have the latency problem solved, so that you can both do this in real time and do the computations required, this becomes a capability that could lead to the development of new types of financial instruments, new ways of hedging, or new risk reduction capabilities. The financial area would probably be one of the first major commercial uses of grid-like capability. Another example could well be in utilities. These are organizations that already have distributed systems and that want to use the grid, in a simple sense, to do status checks, self-healing, monitoring, or whatever the case may be.

The Grid Forum is basically a place to talk shop for everyone from those who are trying to build grids to those who actually want to use grids. It is a common meeting ground as a place to discuss the entire dimension of grids. They have organized themselves into several working groups. See http://www.gridforum.org/

Many of the technical issues needing to be solved involve integrating current communications and computing advances with the architectural needs of grids. However, we have to find ways of solving certain problems in both areas before grids can deliver on their promise.

If grids deliver on their promise, as suggested by the notion of grid space just discussed, the following are likely: grids will provide powerful, interactive, dynamic and flexible environments that open opportunities for new discoveries on the level of Grand Challenges. They will also allow for more R&D without increasing other resources, as well as widen access and enhance educational uses.

Actual implementation of robust grids is going to require a great deal more advancement in software systems than is currently available. When you start to think about what it is that you're going to do in a grid-like problem-solving environment, then there are some really fundamental technical issues that need to be addressed.

If you have a computationally intensive problem, a lot of advances have been made in the development of parallel computing approaches in the last several years. These advances allow a problem to be partitioned so that multiple parts of it can be computed simultaneously on different processors. You have to understand in some detail, however, the kind of problem that you are dealing with in order to know how to partition the processing.
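MacKinnon's point about partitioning can be made concrete with a minimal sketch, which is ours rather than anything from the interview: it splits an embarrassingly parallel computation (a sum of squares) into chunks that separate worker processes compute simultaneously, with the processes standing in for the processors or nodes of a grid. The function names and the equal-chunk partitioning scheme are illustrative assumptions.

```python
# A minimal sketch (not from the article) of partitioning a computation
# across processors. An embarrassingly parallel sum of squares is split
# into equal chunks, each handled by a separate worker process.
from multiprocessing import Pool

def partial_sum_of_squares(bounds):
    """Compute the sum of squares over one partition [lo, hi)."""
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))

def grid_sum_of_squares(n, workers=4):
    """Partition [0, n) into roughly equal chunks and fan them out."""
    step = (n + workers - 1) // workers
    partitions = [(i, min(i + step, n)) for i in range(0, n, step)]
    with Pool(workers) as pool:
        # Each partition is computed independently; results are combined.
        return sum(pool.map(partial_sum_of_squares, partitions))

if __name__ == "__main__":
    # The partitioned result matches the serial computation.
    assert grid_sum_of_squares(10_000) == sum(n * n for n in range(10_000))
```

This works only because the chunks are independent; as the interview notes, problems whose parts must exchange intermediate results raise the much harder issues of latency and coordination across the grid.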


Chicago Civic Network

Fiber to Link More Than 1600 City Institutions

November RFI for Condo Style Build Yields 63 Responses

pp. 19-21 and

Request For Information Chicago CivicNet pp. 22-24

We interview Joe Mambretti, Director of the International Center for Advanced Internet Research at Northwestern University, about Chicago's plans for CivicNet, a public-private partnership that would link all public schools, libraries, and city agencies by fiber.

The City is doing what any large organization should naturally do, that is to say, it is planning for a future rise in demand, and it is planning for new services, while understanding that it has a set budget.

The projected CivicNet budget is $250 million over 10 years. The question is: how can this expenditure best be optimized? Part of this process is simply doing what organizations should normally do. Certainly, then, they want to ensure that their requirements are clearly communicated to potential providers of those services, because the providers are very eager to know what customers require so that they can respond. Another part of this process is establishing an ongoing dialogue between the City and potential providers of services to match what is being asked for with what can be provided. This is a process that is healthy for both sides and one that both sides appreciate.

Therefore, the idea isn't to go out and say: build something. It is rather to say to the world at large: here are the requirements, and then ask for a response. That is why the city has issued an RFI, not an RFP.

Thanks to subscriber Frank Coluccio we received a pointer to the CivicNet web site that has been built by the city of Chicago: http://www.chicagocivicnet.com/civicnet/SilverStream/Pages/civiclist.html The site and the RFI document itself - some 130 pages in Adobe PDF format - begin to bring home the seriousness of the project. For a project of this scope, it seems likely that the city is making unprecedented use of the Internet to bring a public focus to its CivicNet acquisition procedures. In addition to the RFI, the web site contains a forum for respondents to question the city, as well as access to city mapping tools and map sets that in non-electronic form would be hugely expensive to bring to bear on a venture such as this. Finally, it is perhaps the most significant example of CANARIE's telecom outlook influencing events in the United States. We republish here a shortened form of the RFI text, lacking its extensive appendices. The complete document is worth examining in order to get a sense of how ambitious this project is.

Congress Gives ICANN Second Look

Auerbach and Froomkin Testify Before a Senate That, Compared to the House, Is in an Early Learning Stage pp. 25-26

When the Senate Commerce Committee announced ICANN hearings on the heels of the House hearings and invited At-Large Board member Karl Auerbach to speak, we were encouraged. Unfortunately, the Senators were not well informed. While Auerbach's and Froomkin's testimony was accurate and should have commanded rapt attention, Mike Roberts and Roger Cochetti, as two of the people most responsible for the mess, sat there and made apologies for an ICANN that, they claimed, was not yet mature and had not yet had a chance to do what it was put into place for. In the opinion of one observer, Senator Burns called the hearings only because he figured that if his House colleagues were concerned, he had better find out what this was all about.

The highlight of the morning came when Senator Burns asked Michael Froomkin what he would propose in light of the criticisms of ICANN that were delivered to the Committee. Froomkin: "The most important issue is not setting a precedent by which a Department, like the Department of Commerce, can end-run the Administrative Procedures Act. And that is an issue that frankly is bigger than the Internet. The global concern here is not just in this process. For this is a way in which agencies can bypass ordinary procedures to create a privately organized regulator in all but name. A regulator that uses control over a federally dominated resource to make people sign contracts with it, pay it money and do what it says. And then not be subject to due process. Not be subject to court challenge. Not be subject to ordinary oversight. That is really cutting Congress and cutting the American people out of the regulatory process. So while in the case [of the seven selected new TLDs] you might have had an outcome which was better than no decision at all - I have nothing against any of the winners here. I have no reason to believe that any are bad or imperfect, and for all I know we would be all better off if they were all put in the root along with lots of others too. It seems to me that there is a good government issue that is pretty serious here. Someone needs to hold Commerce's feet to the fire on this one." Burns pledged a follow-up hearing to look at "redress of due process." Whether he really understood remains to be seen. As we have said before, and as one of the other witnesses pointed out, if ICANN succeeds there will be other ICANNs, all designed by corporate interests to engage in self-dealing and ignore all the due process rights of those whom they would regulate.

Hearings on ICANN Governance

Prepared Statement of Karl Auerbach before the Senate Commerce, Science and Transportation Committee

pp. 27-30

Auerbach: There are those who say that ICANN is merely a technical body. I am a technologist. Yet I have a difficult time understanding how any of ICANN's decisions concerned with the Domain Name System have any technical content at all.

One must wonder where the technical component might be in ICANN's Uniform Dispute Resolution Policy - a policy that expands the protection of trademarks to an extent not granted by any national legislature. And one must also wonder where the technical component might be in ICANN's preservation, indeed in ICANN's extension, of the hegemony of Network Solutions over the naming systems of the Internet. We know more about how the College of Cardinals in Rome elects a pope than we do about how ICANN makes its decisions.

There are lessons to be drawn from ICANN:

- ICANN has shown us that governmental powers ought not to be delegated to private bodies unless there is an equal obligation for full public participation and public accountability.

- ICANN has shown us that a public-benefit and tax exempt corporation may be readily captured by those who think of the public less as something to be benefited than as a body of consumers from whom a profit may be made.

- The role of the US Department of Commerce in ICANN has shown us that the Internet may be used as a camouflage under which administrative agencies may quietly expand their powers without statutory authorization from Congress or the Executive.

ICANN Governance

Prepared Statement of A. Michael Froomkin, Professor of Law, University of Miami School of Law, P.O. Box 248087, Coral Gables, FL 33124, before the Senate Commerce, Science and Transportation Committee, Communications Subcommittee,

pp. 31-35

Froomkin: If in 1985 the Internet itself had been a proposal placed before a committee that behaved as ICANN did in 2000, the Internet would have been rejected as too risky. Risk aversion of this type is antithetical to entrepreneurship and competition.

Worst of all, ICANN applied its criteria arbitrarily, even making them up as it went along. The striking arbitrariness of the ICANN decision-making process is illustrated by the rejection of the ".union" proposal based on unfounded last-minute speculation by an ICANN board member that the international labor organizations proposing the gTLD were somehow undemocratic. (That this same Board member was at the time recused from the process only adds to the strangeness.) The procedures ICANN designed gave the applicants no opportunity to reply to unfounded accusations. ICANN then rejected ".iii" because someone on the Board was concerned that the name was difficult to pronounce, even though the ability to pronounce a proposed gTLD had never before been mentioned as a decision criterion.

Testimony of the Domain Name Rights Coalition and Computer Professionals for Social Responsibility

pp. 35-37

DNRC: The sad fact is that ICANN has been "captured" from the beginning. Special interest groups have dictated the direction of ICANN and have morphed it into an Internet governance body with none of the protections afforded by governments.

As currently constituted, ICANN has failed on all counts. It has moved slowly; been unrepresentative; acted to limit competition; and failed to offer useful, fair, coherent policies, or even policies which encourage investment in virtual property. ICANN is a policy experiment that has failed.

ICANN is correct in that its formation was an unprecedented experiment in private sector consensus decision-making. Unfortunately, that experiment is in the process of failing. ICANN's claim of being "open and transparent, based on Internet community consensus, bottom-up in its orientation and globally representative" is far from the reality of the situation. ICANN is the classic top-down organizational structure without accountability. When its by-laws are inconvenient, they are changed without discussion.

The Internet is the single most significant communications medium ever created. Its power goes well beyond that of shopping malls and e-commerce, and it empowers individuals in a way never before imagined. It is thus a national as well as an international resource. The importance of the ability to control key aspects of this technology cannot be overestimated. It is up to all of us to remain vigilant when organizations are given special privilege by a branch of the US Government to control this vast means of expression. Safeguards must be put into place whereby individuals, non-profit entities, churches, tribal governments, and other disenfranchised groups may provide unencumbered input and opinion to an open, transparent and accountable entity. That entity is, unfortunately, not ICANN in its current form.