
A New Telecom IT Research Agenda

John Hagel and John Seely Brown Are Establishing a New Research Center at Deloitte's San Jose Office

Hagel and Seely Brown's future forecast and symposium discussion. Articles on problems of the Internet core and the IPv6 transition. How to purchase this issue: $350, or $1,400 for groups.

July 16, 2007 Ewing, NJ -- The September issue contains the expanded version of John Hagel's talk on the future of the firm in the new century.

Executive Summary

How to Help Enterprises Cope with Internet Globalization and the Increasing Speed of Change? pp. 1-8

Deloitte Touche, it seems, is one of a number of firms deciding that the future of its consulting services lies in showing the major enterprises of the world that its expertise in helping them understand the changing shape of the strategic business climate in which they operate is second to none. To accomplish this it made a smart move: hiring two of the top experts in the world, John Hagel and John Seely Brown, to establish a new research center focused on these issues.

 

These two men have been saying that the reason for the existence of “the firm” in the 21st century is likely to be talent development. In other words, how do you get really, really good at producing the firm’s stated product or service? How do you know what those should be, or how they should change over time? I can only imagine that Deloitte has decided that signing on these men accomplishes two things at one stroke: it signals the seriousness of Deloitte’s intention to compete, and it serves as an incubator within Deloitte for developing the talent of Deloitte’s own employees.

John gave one of the closing talks at Supernova; his subject was the research agenda for his and JSB’s new center. Here is how he framed what he said.

The challenge for me is to figure out what questions are both intellectually interesting and, even more important, have significant economic impact hanging on the answers, so that action on them really makes a difference in terms of where we come out.

Foundation questions – what is going on in the world around us?

What if there is no equilibrium? - what are the institutional architectures that are going to be required in a world where there is no equilibrium? Can the firm survive as action flows to the edges?

I think that the rationale for the firm in the 21st century is as an institutional platform to accelerate talent development. The firms that are good at economizing on transactions are not very good at accelerating talent at all. We need to figure out an institutional platform that will accelerate talent development.

It looks now to be likely that networks of creation and economic webs will trump all other ecosystems in value creation and the capture of business opportunities. These networks and webs have the ability to embrace and extend the value creation potential of other ecosystems.

If the world is so flat, why are spikes becoming more prominent?

Strategic questions – how can we create and capture value given the changes going on around us?

While adaptation is certainly necessary, it misses the real opportunity. That is, that in a world with accelerating change and growing uncertainty, there are increasing opportunities to shape outcomes in ways that were simply not feasible in much more stable environments.

We need to harness institutional innovation and, in particular, move from institutional arrangements designed to provide scalable efficiency to institutional arrangements that offer scalable learning, so that we can begin to learn faster and get ahead of the pack in a more sustainable fashion.

The ability to have a destination clearly defined is the big advantage of smaller, entrepreneurial companies relative to larger companies.

What are the opportunities for the bottom of the pyramid to attack the top of the pyramid? C. K. Prahalad has done a great job of describing the opportunity for innovation to serve the needs of low-income consumers around the world.

The bottom of the pyramid may become the basis for the formation of successful attacker strategies to go after the incumbents in more developed economies. In other words the institutional innovations required to serve customers at the bottom of the pyramid can also be used to carve out significant share in more developed economies.

So the sixth question is: How do we measure success when so many of the rules are changing?

Platform questions – how can various types of platforms help us to achieve our goals?

These have to do with various types of platforms that can augment our capabilities. Here I will focus on three questions. One of them is: when is self-organizing not enough?

We have done a lot of work on governance mechanisms in open source software.

What modifications to those governance mechanisms, or to the creation process itself, might we need in order for those mechanisms to really work beyond software, in other products or categories?

We are moving to much more flexible pull platforms that focus on helping people to connect with the resources that are most relevant to them whenever and wherever they need the resources.

Push programs in general treat people as passive consumers whose needs can be anticipated and shaped by centralized decision-makers (even when they are producers, like workers on an assembly line). Pull platforms, by contrast, treat everyone as networked creators (even when they are customers purchasing goods and services).

I think there is a real opportunity to start from scratch and say what would an IT architecture look like from the outside in? There you might start with the observation that you would want to organize a service base of ten thousand business partners. How would you do that? What kind of architecture would you require in order to do that?

Without these outside-in architectures, companies will be constrained in accelerating talent development, because they limit the ability of individuals to connect into rich networks of specialization and push themselves to get better faster by working with others. Individuals will still do it, but the scope and scale of the interactions will be much more limited.

Who’s Minding the Internet Core? p. 9

KC Claffy delivered an impassioned and provocative response at the end of one of the panels, arguing that the Internet in its current state is being left with seriously debilitating issues unattended because it is in no one’s immediate economic interest to fix them.

KC: “But let me just mention a few concrete problems that are under the hood of the internet architecture.

We are running out of address space. IPv4 space will be used up in two to three years given current rates of allocation. Now of course the IETF community has been working on IPv6 for some time now – over a decade. It’s just that usage in the US and a lot of other countries is not picking up. People are not adopting IPv6, despite the fact that DoD is trying to self-regulate it into existence.

Even if people do take up IPv6, THAT makes the routing problem worse. So we have a routing problem. But we don’t have a scalable routing system. We have no provable bounds on the scalability of the routing system. And no one is working on routing systems that can scale to an arbitrary number of networks.” [snip]

KC: Finally, I don't understand this thing that John Kneuer said earlier. His words struck me so hard that I wrote them down.

"Market forces will create an open network."

That is a little strange given that the US government funded people like Van Jacobson to create an open network in the form of the Internet. That's what the architecture was. That is where the end-to-end principle came from. What happened after 1994 with commercialization is that it became clear that there was no economically sustainable way to provide an open end-to-end network in the free market, and ten years later we see the free and open network going away." KC concluded to applause.

So What About IPv6?

At the beginning of July Geoff Huston posted an essay suggesting that at current rates of consumption the regional Internet registries are likely to allocate their last blocks of IPv4 addresses more than two but less than three years from now! If the Internet is to continue to grow, it will have to start routing IPv6 addresses, and that transition carries serious difficulties; KC enumerated many of them in her response summarized above.

This led to the following very relevant observation by John Curran, who is a board member of ARIN, the regional Internet registry for North America:

“While there are going to be efforts to recover unused IPv4 space, we're currently going through 10 to 12 blocks of /8 size annually, so you may get an additional year or two, but it doesn't change the end state.

There's no reason for end organizations to change their existing IPv4 infrastructure, but they do need to get their public facing servers reachable via IPv6.

Anyone who thinks that the ISP community can continue to grow using smaller and smaller pieces of reclaimed IPv4 address space hasn't considered the resulting routing table. We've built an entire Internet based on the assumption that most new end-user sites are getting hierarchical, aggregatable PA assignments. This assumption is soon to fail, until there's an option for connecting customers up via new hierarchical address space.

Interoperability is achieved by having public facing servers reachable via IPv4 and IPv6.”
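Editor: Curran's routing-table point is easy to demonstrate. (His exhaustion arithmetic is equally direct: at 10 to 12 /8s consumed per year, even reclaiming a couple of dozen /8s buys only about two more years.) Below is a toy sketch using Python's standard ipaddress module; the address blocks are reserved documentation and benchmarking ranges chosen purely for illustration, not real allocations.

    import ipaddress

    # Hierarchical, provider-aggregatable (PA) addressing: one provider
    # carves 198.18.0.0/16 (a reserved benchmarking range) into 256
    # customer /24s.
    customer_routes = [ipaddress.ip_network(f"198.18.{i}.0/24")
                       for i in range(256)]

    # The rest of the Internet needs only the single covering route:
    print(list(ipaddress.collapse_addresses(customer_routes)))
    # -> [IPv4Network('198.18.0.0/16')]

    # Scavenged, non-contiguous IPv4 scraps do not collapse; every scrap
    # becomes its own entry in the global routing table:
    scraps = [ipaddress.ip_network("192.0.2.0/24"),
              ipaddress.ip_network("198.51.100.0/24"),
              ipaddress.ip_network("203.0.113.0/24")]
    print(list(ipaddress.collapse_addresses(scraps)))
    # -> three separate routes

Grow the customer base out of reclaimed scraps and the second case dominates, which is exactly the table explosion Curran warns about.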

Symposium Discussion May 18 – July 5th p. 14

Volker Wessels Continues to Fiberize the Tulip Belt

Van der Woude: Dick Wessels' Reggefiber is now extending the FttH network it operates in Nuenen and parts of Eindhoven.

In the Geldrop-Mierlo area (map: http://tinyurl.com/2vaswl) they have been actively signing up families. After a whopping 57% of 16,000 homes signed up, they will now start to roll out the network, called "OurBrabantNet" (Brabant is the local province).

The Value of IMS? p.15

Two professionals wonder if it is safe to ignore IMS as a vendor- and telco-funded boondoggle that will collapse under its own weight.

Coluccio: I constantly face the question of whether it is worth my time and the time of others in my organization to become expert in it, or to just let it ride itself out, hoping that it will either go away or die a slow death through SP and vendor abandonment.

Dryburgh: If you read Benkler, then IMS is an attempt to cling to the industrial information economy. How does IMS aid peer production and the flow of information and "cultural productions" between network users? For the little it could add, at what cost? It comes down to the real question operators should be asking themselves, which is: "How can we aid the networked information economy? What assets do we have, or could we acquire, to add value in the sphere of the networked information economy?" I've been in telecoms long enough to know when something is being engineered for the sake of being engineered and for getting a tick from shareholders. But if you do want to hear some conceivable uses of IMS, speak to, say, Brough Turner (NMS), who I believe is already on this list.

Coluccio: I generally view IMS as something that will eventually pass, like Token Ring and ATM did. Which is not to suggest that both of those are no longer in use, but rather that they've been confined to areas where they do less harm than they otherwise might.

Goldstein: I don't see IMS in the same terms. ATM was real. It had concrete requirements and defined interfaces. You could easily see what it could be used for. IMS is just marketechture. It doesn't define the exact behavior of the reference points, so it's not suitable for multi-vendor implementations. Therefore it's not a standard. It's just a collection of bad ideas.

Spectrum Issues p. 20

David Reed explains to Roland Cole why the regulator’s spectrum auctioneers cause the blood pressure of the radio engineers to rise.

Here is my translation of what David says: basically it is because spectrum regulation was built on a 1920s understanding of spectrum behavior. What we have put in place is something similar to economics as run by the Chicago school of macroeconomics.

The innovations of complexity economics are there, petitioning the powers that be to recognize them. The powers that be are the telcos and MSOs, whose business models depend on exclusive-use monopoly spectrum allocation.

Thanks to the continued progress of Moore's Law enabling digital signal processing to do ever more amazing things, it is now possible for software defined radio to operate on radically different principles from the analog power broadcasting of the 1920s.

It is now possible to build software defined radios that can listen for other spectrum users in the neighborhood and fine-tune their behavior to stay out of each other's way and hence not interfere.
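To make that concrete, here is a minimal "listen before talk" loop of the kind such a radio would run. Everything below is an illustrative sketch: a real software defined radio would take its energy readings from the RF front end, not from the random stub used here, and the threshold is an assumed value.

    import random

    OCCUPIED_THRESHOLD = 0.2   # assumed energy level treated as "in use"

    def sense_energy(channel):
        """Stub for an energy measurement on one channel; a real SDR
        would integrate received power from its front end here."""
        return random.random()

    def pick_quiet_channel(channels):
        """Return the first channel with no detectable neighbor, or None."""
        for ch in channels:
            if sense_energy(ch) < OCCUPIED_THRESHOLD:
                return ch          # quiet: safe to transmit here
        return None                # everything busy: back off, retry later

    print(pick_quiet_channel(range(11)))   # e.g. scan 11 Wi-Fi-style channels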

The problem is that, even if the Republican FCC were not controlled by the duopolists it regulates, earlier versions of the agency, in a mindless effort to raise receipts for the federal government, sold these private-equity-backed, non-public-good companies exclusive rights to use spectrum, on the old assumption that only a single party could ever talk in a given band in a given time slot.

Imagine what would have happened in the fiber world had the telcos been able to buy exclusive use of all parts of the spectrum that might be defined as red, while another telco bought exclusive use of violet, and an MSO exclusive use of blue. The Chicago school missed a wonderful opportunity to help us dig our economic grave even deeper.

David put it this way: “the spectrum isn't even full, as Shared Spectrum demonstrated by measuring all of the spectrum used in NYC during the RNC, which meant that in addition to regular stuff, there were vast numbers of military and first responders active as well. They found that if you used very sensitive measurement tools, only 13% of the entire RF spectrum had *any* detectable signal energy during the full time period, and the actual occupancy was infinitesimal.

If we imagine using just the technology we know and can make cheaply today, we can make any one band much more efficient in terms of the number of bits/second available for use without confusion. But we have no bands to deploy that new technology into - because every band is allocated 3x over (primary, secondary, and tertiary users).” “Paradoxically, [the spectrum is,] except for cellular data, pretty much unused, until someone proposes to use it and then the current licensees say: wait, we have made crucial plans to use it someday, OR that use might make some grandmother who bought something a long time ago have some speckles on her flickering Dumont TV, so pay us lots of money.”

“Bob cites digital techniques - in particular what we know about channel coding - as a major direction for improvement. This is a mature subject, but only digital cellular systems and some military systems actually use it well. And beyond that are adaptive techniques - which adapt the signals to the physical environment.”

“All of the above are techniques - ones that are not employed, and in many cases not legal to employ, because they postdate the 1934 regulatory structure and are too hard for the lawyers to incorporate into a fundamentally decade-time-constant static frequency allocation scheme. In other words, the rate of change *built in* to the radio systems in the field by the *laws* is that radio system innovations take 10-20 years to introduce, though they now take 12 months to invent, and soon will take days to invent (with Software Defined Radio).”

David then showed that without too much effort you could separate out an interfering waveform in your radio design and come up with a software defined radio that would run in the 2.4 GHz band and not be subject to interference from the operation of your microwave oven. (Read pages 22 and 23 – it's worth it.)

He concludes in a way that reminds me of the assumptions the macroeconomists used to make with their models. “One of my students recently showed (in a class project to be a published paper) that you can even do the soft-decoding trick with retransmitted packets - or with packets that are detected at multiple points in space - so that one can decode and separate the packets with *far* higher reliability than one could otherwise.

Why do most "authorities" not get this? Well, there are lots of reasons. Most radio systems designers don't do coding theory, for a start. Second, most radio front ends are not designed to allow for this. Third, the standard engineering assumption is that ALL simultaneous signals are *gaussian random variables*.

A gaussian random variable is pretty much the signal with the maximum randomness you can send with a given signal energy per Hz of bandwidth. In other words, it is the least predictable. So a microwave oven is not a Gaussian Random Signal. If you don't look too closely, it may look like one. But that just means you didn't look, not that it is random.

If you design radios assuming that signals are all Gaussian Random Signals, there is a pretty result: the Shannon Capacity Theorem says that all you need to know about the other signals is their energy, because the energy is the variance of the Gaussian Process, as well as the amplitude. And since the Gaussian Random Signal is the *most* random, that suggests that you are designing for the worst case. (actually that is false, but the argument is made anyway - the worst case for a radio depends on its design, and a very predictable and unrandom signal might drive it bonkers).

Typically, if someone talks about SNR (signal to noise ratio), they are assuming that the interference is gaussian random. But in nature, NOTHING is gaussian and random.

So if you make that assumption to start, there is precious little you can do. But if you want to live with a *particular* microwave oven - and I think your house doesn't have a microwave oven that comes from a different manufacturer every millisecond or so - it is not a random or unpredictable noise source. And for milliseconds at a time, it is quite predictable, and therefore quite subtractible.”
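Editor: the “pretty result” David refers to is the Shannon-Hartley theorem. Stated minimally (standard textbook notation, not taken from the discussion itself):

    C = B \log_2 \left( 1 + \frac{S}{N} \right)

where C is channel capacity in bits per second, B is bandwidth in Hz, S is signal power, and N is noise-plus-interference power. The theorem licenses summarizing all interference in the single number N precisely because a Gaussian signal is fully characterized by its power. David's point is that a deterministic interferer such as a microwave oven violates that premise, so its waveform can in principle be predicted and subtracted rather than merely endured as "noise."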

Basically, thanks to spectrum regulation, there is no chance of building systems that are both innovative and legal. And this at a time when the technology has progressed to the point where untold wealth could be created were it not blocked by the regulatorium.

The Google 700 MHz Proposal p. 23

Reed: The idea of spectrum markets reminds us of California blackouts, not smooth reliable supplies of energy. Markets can smooth out capacity, but market dynamics can also amplify spikiness of capacity.

People who understand physical systems dynamics typically see all the hooey about speculators always making markets more "efficient" as complete malarkey. To stabilize an unconstrained physical system, you need to constrain it - which means some form of "regulator" - perhaps in code rather than in law or contract.

Even simple systems can have wild and unpredictable behavior. Look at the Mandelbrot set, which is a simple one stage feedback loop's deterministic output. Do you want spectrum allocation to be governed by a Mandelbrot process? If so, turn it over to a "market" that is not designed with stability as a criterion.

It is fashionable to imagine "Instant Efficiency - just add markets". Fashion is crappy engineering.
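Editor: the "simple one stage feedback loop" David mentions is the iteration z ← z² + c. A tiny sketch (the parameter values are my own, chosen for illustration) shows two nearly identical parameters producing qualitatively different behavior, which is his instability point about markets:

    def escape_time(c, max_iter=500):
        """Iterations until |z| exceeds 2 under z <- z*z + c,
        or max_iter if the orbit stays bounded that long."""
        z = 0j
        for n in range(max_iter):
            z = z * z + c
            if abs(z) > 2:
                return n
        return max_iter

    print(escape_time(-0.75 + 0.00j))   # stays bounded: prints 500
    print(escape_time(-0.75 + 0.02j))   # blows up after ~157 iterations

A tiny nudge to one parameter flips the system from bounded to divergent; a market mechanism with feedback and no stability criterion can behave the same way.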

[snip]

David Reed: Radios weren't on a "Moore's Law" curve until recently. They are now.

Instead, the economists are dividing up the illusory spectrum into chunks with granularity that is far too small in the frequency direction, far too large in the spatial direction, far too removed in the allocation direction, and with a presumed technology cycle time of decades, not minutes. And they have asked their friends the lawyers to reify (pretend that those concepts are real) those bad design decisions for the next hundred years.

1) far too small in frequency: the more wideband the system, the more adaptable and flexible. But the theory of the economists is micro-allocation of frequencies.

2) far too large in space: energy falls off with distance as the square, cube, or 4th power. This means that the most efficient systems are femto-cellular (whatever that means). [See the editor's note following this list.]

3) far too removed in allocation direction: if you want to "buy" a piece of spectrum for a few milliseconds, you have to send a message to the NY Spectrum Exchange, where it is arbitraged by a collection of NASDAQ supercomputers, traded 15 times, with various markups for middlemen, and then you get your allocation. I think it is probably better to just transmit, and assume that the police can't find you. The policing transaction costs then soar, and the spectrum price becomes 99.99% transaction costs, if not 5 9's. Great for the auctioneers, useless for radio designers. Yeah, I'm sure there's a middle ground, but Coase's Theory of the Firm would suggest that this is a *bad* idea. Why does spectrum have to be interconvertible into dollars or, worse, Euros?

4) presumed technology cycle: to enforce this, the FCC will have to certify code, since all radios will be SDR. Certifying code is a lot harder than certifying a fixed and simple hardware design. Thus the FCC certifier will become a bottleneck, and industry will have to pay for it. This works only if the technology innovation cycle is decades for any particular radio class.
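Editor: a note on David's second item above. In standard propagation models (textbook material, not from this discussion), received power falls off as

    P_r \propto \frac{P_t}{d^{\,n}}, \qquad 2 \le n \le 4

where P_t is transmitted power, d is distance, and the exponent n runs from 2 (free space) to roughly 4 (cluttered, ground-level paths). Halving a cell's radius therefore raises received power by a factor of 2^n, and because each smaller cell can reuse the same frequencies, capacity per unit area grows as cells shrink. That is the engineering case for "femto-cellular" systems.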

So yeah, in *theory* we could make auctions of spectrum the law of the land, and in some la-la land one can imagine technological solutions to accomplish it. But in my real world, I'd rather bet on digital signal processing progress than on lawyers and economists solving problems.

There is no "Moore's Law" in Law or Economics.

Life with Level 3 p. 28

Tom Hertz: what Tom W. says is typical. Usually there is an O&M charge for maintaining the physical integrity of the fiber ($$/route-mile/year) plus co-lo rack rent ($$/rack/mo plus or inclusive of DC power). Any management gear would be in the rack rented, and the rack itself may be in a cage. The IRU grantor doesn't get involved in ops (beyond maintenance of the site and the physical fiber itself). In the long run the fiber is the cheap part; O&M and co-lo space recurring charges dominate.

Erik Hunsinger: The rule of thumb used commercially is: over 20 years, for the network alone, 1/3 of cost is fiber and 2/3 is colo, power, and maintenance. (Here is where the market collapsed in the roaring 90s: the unaccounted-for recurring costs.)

To light it: optronics was two-thirds (in the 90s) of the day-one cost of a total up-and-running system. This has been turned upside down; some say optronics is down to 75% of its former glory.

Plus an operational team to run it: 300 people nationwide, or outsourced (???), at $100K/person/year (if you give them health care). Sales, marketing, and management with a CEO on top of that.

This is of course a commercial five-nines system, but it doesn't include extra fiber to upgrade gear. Companies really should have two pairs of fiber to do that effectively.
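Editor: those rules of thumb can be stitched into a toy 20-year cost model. The sketch below assumes a purely illustrative $100M fiber cost; the 1/3 : 2/3 split and the 300-person, $100K/year staffing figure come from the quotes above.

    YEARS = 20
    fiber_capex = 100e6                 # assumed one-time fiber/IRU cost
    recurring_total = 2 * fiber_capex   # 2/3 of 20-yr network-only cost
    recurring_annual = recurring_total / YEARS

    staff_annual = 300 * 100_000        # quoted headcount and salary

    total = fiber_capex + recurring_total + staff_annual * YEARS
    print(f"annual colo/power/maintenance: ${recurring_annual / 1e6:.0f}M")
    print(f"annual operations staff:       ${staff_annual / 1e6:.0f}M")
    print(f"20-year total:                 ${total / 1e9:.2f}B")

Even with made-up numbers the shape of the argument is visible: the fiber itself is the smallest line item, while recurring charges and payroll dominate, which is exactly Hertz's point above.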

Social Networking Tools on Mobile Phones p. 30

Bob Frankston: But do you want Cisco to do it for you? http://zdnet.com.com/1606-2_2-6186133.html?tag=nl.e539 Editor: The URL shows Cisco demonstrating a Tagr-like mashup.

Ed Pimentel: Yes. They were very excited about demonstrating the IBM Sametime client sending IM messages from a PC to the Mac. Their people finder (IM presence and mapping) was appealing. Finally they showed a TelePresence videoconference. Now that was nice.

Frankston: True but I am troubled by Cisco trying to be so visible rather than just staying down in the sewers where they belong.

Pimentel: Cisco is now going through another transformation (they do that often and are good at it). When I left Cisco, I was wondering when they would realize that they cannot solve all the world’s problems with a BOX.

Netness and What Will We Do with 15 Billion Transistors a Second? p. 32

Sheldon Renan: An 83-year-old man who lives independently in a suburban condo steps into a bathtub and turns on the shower. While reaching for something, he slips and begins to fall. The bathtub senses he is falling. It recognizes its user and knows he has osteoporosis. As the man falls, the bathtub autonomously modifies its surface characteristics in order to moderate the outcome of a potentially dangerous event. Instead of breaking the elderly man's hip, his bathtub softens to cushion his fall. At the same time, the bathtub connects to a remote attendant who can check to make sure its user is OK.

This is an example of netness. And of its value.

The virtuous bathtub is not actually that far from being a reality. Neither is netness.

COOK Report: The shower that catches the old man when he falls is a very interesting metaphor. I suspect that, as people think about it, some may actually be repelled. If my environment always protects me from harm, what does this do to the very conception of what it means to be human? 15 billion new transistors every second will transport us rapidly from technology into the realm of philosophy and religion.

Copper Landlines' Life Limited p. 35

Coluccio: The copper "last mile" line to the house won't exist in six years, according to Tom Evslin. The co-founder of Internet service provider AT&T Worldnet and voice-over-IP wholesaler ITXC, Evslin made the prediction this month in his Web log, "Fractals of Change." "By 2012 [there will be] no more reason to use our landlines--so we won't," Evslin wrote. "I don't think the copper plant will last past 2012. The problem is the cost of maintaining and operating it when it has very few subscribers. Obviously [it's] a huge problem for AT&T and Verizon. And an important social issue as well."

(more at http://telephonyonline.com/access/news/copper_landlines_gone_052507/index.html)

Isenberg: Like most engineers, Tom Evslin is seeing that the first 90% takes 90% of the effort, but forgets that the last 10% will take 90% of the effort too :-) . . . I've thought this one through, and I've concluded that the last 10% will need a big push, perhaps a tax break or other form of telco subsidy.

The French Regulator Acts to Unbundle Fiber p. 44

Dirk van der Woude: The French broadband market is quite dynamic, partly because their 'FCC', ARCEP, is quite actively making sure that local loop unbundling is taken seriously.

Yesterday ARCEP announced that it will take things a step further, in answer to the now-massive rollouts of fiber (in Paris, four different operators ;-).

The official press release below contains these telling sentences: "It is indispensable that the terminal part of networks be shared:
1. to limit disruptions in apartment buildings and houses by avoiding having different operators lay networks;
2. to let inhabitants put competition into play between very high speed service providers without being held captive by the first operator to have wired their building."
Seems Board member Mme Gauthey (see: http://tinyurl.com/2rceah) hasn't worn out yet...

Cheap Wi-Fi is Too Slow p. 52

Dirk van der Woude: Lafayette's John St Julien has an interesting post on WiFi, based on an article in the Sydney Morning Herald.

http://www.smh.com.au/news/wireless--broadband/cheap-wifi-too-slow/2007/06/18/1182019028191.html#

I agree with John that this is remarkable honesty from Earthlink, and not too good news for cities that think they can avoid the FttH debate by choosing Wi-Fi.

Tuesday, June 19, 2007 http://lafayetteprofiber.com/Blog/2007/06/cheap-wi-fi-too-slow.html "Cheap wi-fi too slow," so says Bill Tolpegin, vice-president of planning and development for the municipal networks unit of Earthlink, a US-based company that built municipal wi-fi networks for cities including New Orleans, Philadelphia, and Anaheim and has been asked to devise plans for networks in San Francisco, Houston, and Atlanta.

The 700 MHz Auction and Frontline Wireless p. 55

Evslin: Although I'd far rather see all this spectrum simply opened, the FCC doesn't have that discretion in this rule making. Simply put, Congress wants the money for the auction and has written that into law. So I'm trying to concentrate in this critical period on the rule making the FCC can do.

Ironically, the Frontline proposal may be rejected because it could be seen as reducing the take from the auction in a way Congress has not sanctioned. But I do think it is deeply flawed in a number of ways, including the assumption that the 10MHz is needed for public safety, the prioritization scheme, and the cordoning off of 10MHz nationally with likely very delayed usage in the areas that need it most. Certainly it's self-interested and commercial, but these facts alone don't make it a bad proposal; it's the other stuff.

From the floor at Supernova p.56

COOK Report: Yes - that they don't get it was MY editorializing. It was what they said in response to Kaliya's question that to me indicated they had not a freaking clue about what the whole point of this entire conference was about.

I am thinking also of David Young's response to another question. And David, please don't take this personally; I was delighted to meet you FTF for the first time. As David Isenberg said to David Young: "David, you are a good person, but unfortunately you work for a telco." So David, if I am wrong please correct me, but here's what I remember you saying to the question of why we have such lousy bandwidth at such high prices.

Namely, that countries have different regulatory philosophies and that in the US it is intermodal competition and that it is working really well because Verizon has been incented to lay fiber to compete with the MSOs and that now the MSOs are improving their plant to not be left in the dust by Verizon and that the result was healthy competition and a healthy market.

OUCH! ;-( To someone who doesn't know what we on this list know, that answer might pass muster... But oh boy... to me that ain't the way the world turns!

Big Snip: Coluccio: Ironically, what we've actually seen happening over the past five or six years, at least, has been an indirect form of government assistance to incumbents all along. Unlike direct cash infusions such as the S&Ls received "after" they collapsed, however, the government in the case of the telecoms industry, through the courts and its many agencies, has delivered aid by way of legislative and judicial bias instead, and it has been doing this in a rather transparent manner, to boot. And that makes it doubly reprehensible, if not embarrassing, from an interested citizen's point of view.

John Kneuer Presents The Bush Admin Point of View p.64

Doc Searls: My own 2¢, while also waiting for a plane, this time in Houston. (As thunderstorms loom… fun.)

What happened between John Kneuer and many (far from most) in the audience - including David Isenberg, David Weinberger, and myself (each of whom had exchanges with Kneuer from mikes on the floor) - was two frames that don't overlap. I listened closely to Kneuer's speech, and it was framed entirely on a set of assumptions that are not shared by his opposition in the audience - or by their community of like-minded folks, many of the most active and notable of which are on Gordon's list.

Kneuer's frame is the Regulatorium. To him, big infrastructure requires big commercial entities who will own and operate the infrastructure they build. Big companies doing Big Things is the free market at work. Duopolies are the free market at work. Auctions that bring billions to the feds exist to assure that the largest and most capable pockets will do the work required to build out this vast and new private infrastructure.

The opposing frame sees the Net as wide open and "stupid" in the Isenbergian sense - a rising tide that lifts all boats, including vast and wide-open markets that dwarf any amount of business the likely buyers of 700MHz spectrum are ever likely to grab for themselves while creating scarcities for everybody else who runs "on" their private networks.

The opposition (that's us) want open spectrum to allow an ocean to fill, and its tide to lift all boats. Kneuer wants to build canals across the empty spaces where seas might otherwise rise. The wi-fi success example means nothing to him. Wi-fi is cordless phones for laptops, not Real Infrastructure. It's not serious, and it's an example of nothing if you're talking about networks.

I think there *might* be common ground in here somewhere, but I doubt it. Now they're getting ready to call our rows for the plane, so I don't have more time to think and write about it. Meanwhile, thanks for bringing him in anyway, Kevin. It was a Good Thing, even if the two sides talked past each other.

John Kneuer: What's Good for AT&T is Good for America p.65

Kevin Werbach: We've just posted the full video from John Kneuer's Supernova session on spectrum and broadband policy, with the subsequent lively audience conversation around network neutrality: http://conversationhub.com/2007/06/27/video-john-kneuer-on-spectrum-policy-and-network-neutrality/

COOK Report: I have posted a good bit about Kneuer. I watched the video fragment, and I watched the whole video that Kevin just posted. The whole video is very much worth watching. It puts what the Bush Administration is doing to our economy in perspective. These people know only one thing: the Chicago school's abstract justification of so-called free markets, which of course are not free but are rigged to perpetuate the old technology of the last century. But the language is one of religion, not reasonable economics or technology. And they have no understanding of what the pre-Chicago school would call a natural monopoly, or of what used to be called a common carrier. Their framing, as Doc Searls points out, is alien to our own and to anyone who understands telecom as a utility system undergirding the rest of the economic system of the nation. Their concept is that they will run the printing presses and the rest of the world will accept our paper forever. Not likely. A day of reckoning will come.

IPSphere Forum p. 68

Coluccio: So then tell me Fred, All, what do all of these folks who sit on these IMS and IPSphere forums do year after year, only to learn at some point in time that all of their work has been relegated to anachronisms, if not dust, by yet another incarnation of that thirty-year-old technology called Ethernet?

Goldstein: Or whatever else comes along... The forums are largely populated by mid-level engineering types who enjoy that type of work. I was once on DEC's standards team, so I speak from firsthand experience attending such meetings. Since they like the work, they are in no hurry to finish. The projects drag on and on. And if one dies, they have other projects to work on -- a forum or committee tends to have parallel activities, enough to keep everyone who shows up busy. Actually delivering a product is irrelevant and potentially counter to their interests.

McCauly: Thanks for the useful responses! The main reason for my query was that IPsphere has been raised as something that should be considered for NZ Connected Health, which is an initiative to replace the NGHN (Next Generation Health Network, now known as the Not Going to Happen Network!). Connected Health is about setting standards and creating a brand to allow a qualified group of vendors to deliver interoperable and standardized offerings to the health sector (at least that's the short version!).

What's Wrong with this Picture? Fairpoint, USF, and Vermont p. 70

COOK Report: What's wrong with this picture? Verizon uses Reverse Morris Trust accounting to sell its northern New England properties to Fairpoint and reap $1.7 billion in income, while Fairpoint, a publicly traded corporation, pays dividends yielding over 8.5%. It pays out more in dividends than it earns, and apparently can do this because its companies are rural and impoverished and receive 6 to 7 percent of their total revenues from the Universal Service Fund.

Stephen Otter Holmes commented on http://gordoncook.net/wp/?p=181: I am an ex-employee of Fairpoint's Maine operations. I have been involved with the Maine Public Advocate's Office in several matters involving what we refer to as (Un)Fairpoint. I suggest anyone with an interest in the Verizon-to-Fairpoint sale check out the online posting of Randy Barber's testimony to the Vermont regulators.

COOK Report: I followed through on Stephen's suggestion, Googled, and found: State of Vermont Public Service Board, Docket 7270, May 24, 2007.

Scanning through the testimony I selected the following: FairPoint is a holding company that specializes in acquiring, operating, and selling small, primarily rural telephone companies. It currently owns 31 operating companies that provide communications services to rural and small urban communities (though it is in the process of selling one of them in Illinois). As I discuss below, FairPoint pays very high dividends, yielding over 8.5% at current share prices. Its dividend payments are significantly more than FairPoint earns, and it relies heavily on depreciation to generate the cash flows it requires to support its dividend policy and further acquisitions. [SNIP]

Fundamental to its financial strategy is the utilization of "free cash flow," derived primarily from depreciation, to pay very high dividends. FairPoint is cannibalizing itself by continually paying out more in dividends than it earns. It generates the cash to do this from depreciation - taking money that should be reinvested in its networks and, instead, paying it out to stockholders as a dividend. In its short life as a public company, FairPoint's shareholder equity has declined by $57 million, or 21.4%, even though its net income was $60 million for the same period. That is, in the last two years, FairPoint has paid dividends equal to nearly twice its level of net income. [SNIP ]
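Editor: the arithmetic behind that last sentence can be made explicit. Treating retained earnings as the only driver of shareholder equity (a simplification; the testimony excerpt does not itemize other equity movements):

    \Delta E \approx NI - D \;\Longrightarrow\; D \approx NI - \Delta E = \$60\text{M} - (-\$57\text{M}) = \$117\text{M} \approx 2 \times NI

where ΔE is the two-year change in shareholder equity (a decline of $57 million), NI is net income ($60 million), and D is total dividends paid. The "nearly twice its level of net income" claim falls straight out of the two figures quoted.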

ARCEP Paper on Functional Regulation p. 74

Dirk van der Woude: ARCEP did it again: a full newsletter with the pros and cons of functional separation. A lot of space for Openreach and Ofcom to state their case in favor - and for France Telecom to be against it.

The position of ARCEP itself? Undecided. For wired networks they seem (for now) to stick to their (successful) unbundled-local-loop policies; however, they also seem to accept that with FttH they might end up between a rock and a hard place. Which is a problem, as France is going to get fiberized at breakneck speed.

Wireless might be another matter, as the column below by one of their Board Members seems to hint at some form of measures that would ensure the theoretical benefits of separation.

The full 12-page document (1.1 MB) can be found at http://www.arcep.fr/uploads/tx_gspublication/lettre55-eng.pdf

On July 5th -- Isenberg: Friends & Colleagues, It occurs to me that many of us probably present to the general public as Bob sometimes presents to us. I'm sure we're sometimes seen as nattering self-righteously about ideas that seem obvious to us, but are obscure, arcane and opaque to others.

If the problem were simply developing technology or explaining "the right way," it'd be so easy. But instead, we must explain why something non-obvious is of critical importance -- and demands action -- to people who need to be on our side if the world we envision is to materialize.

 

There's a body of knowledge on how to get people on one's side, also a set of well-established practices and skills. I'm not too good at it, but I am slowly learning how because I don't want the promise of the Internet trampled under the feet of dinosaurs. If we want to be heard, we're probably going to need to speak in their language.

For the complete issue you must subscribe.

Contents

The Internet and Global Business

John Hagel Maps Out Agenda for Research Center on Intersection of ICT and Business Strategy

Foundation Questions – Understanding Our Business Environment p. 2

What if there is no equilibrium? p. 2

Strategy – How to Create and Capture Value p. 4

Platforms To Augment Our Capabilities p. 6

Questions p. 8

Who’s Minding the Internet Core?

Little Attention Given to the Health of the Internet as a Complex System p. 10

NANOG Mail List Discusses the Problems Inherent in IPv6 Transition That KC Touched on Above p. 11

Symposium Discussion May 18- July 5

Volker Wessels’ Reggefiber Announces:
57% of All Homes in Geldrop-Mierlo Area
Signed up, Let the Building Begin p. 14

The Value of IMS? p. 15

Spectrum Issues: Toleration of Interference and Unlicensed Spectrum p. 20

The Google 700 MHz Proposal p. 23

Life with Level 3, NLR and Internet2 p. 28

Social Networking Tools on Mobile Phones p. 30

Netness and What Will We Do with 15 Billion
Transistors a Second? p. 31

Copper Landlines Gone by 2013? p.35

The Reverse Morris Trust p. 40

Incumbent Schizophrenia: Lessons Learned from KPN's 1992-93 IP Backbone p. 42

The French Regulator Acts to Unbundle Fiber p. 44

The World Is Also Circuitous p. 47

Ellacoya, Deep Packet Inspection and Content
Distribution Methods p. 49

Cheap Wi-Fi is Too Slow p.52

The 700 MHz Auction and Frontline Wireless p. 55

Roses for Verizon – Brickbats for AT&T p. 56

They Just Don't Get it - From the Floor at Supernova June 22 p. 56

Fiber in Europe, by Dirk van der Woude, June 2007 Edition p. 60

What John Kneuer Didn't Say at Supernova p. 61

American Style Regulation p. 62

John Kneuer: What's Good for AT&T is Good for America p. 65

IPSphere Forum p.67

What's Wrong with this Picture? Fairpoint, USF, and Vermont p. 70

Google's Health Care Advertising Blog - Understanding What it Means to be an Internet Company p. 72

ARCEP Paper on Functional Regulation p. 74

Wireless: Aiming for a Harmonized Dividend p. 75

Symposium & Interview Contributors to this Issue

Affiliation given for purposes of identification - views expressed are those of the contributors alone

 

Kevin Barron, Network Manager, Kavli Institute for Theoretical Physics, U. Cal Santa Barbara

Nico Baken, Fiber Architect and KPN Network Strategist, Professor, Delft University

Vint Cerf, Internet Evangelist, Google, co-author of TCP/IP

Frank Coluccio, President, DTI Consulting Inc., New York City

Roland Cole, Director of Technology Policy, Sagamore Institute for Policy Research

Mark Cooper, Director of Research, Consumer Federation of America

Robert Cromwell, Senior Assistant City Attorney in Seattle specializing in telecom regulation

John Curran, Board member of ARIN, the RIR for North America

Lee Dryburgh, carriers consultant, now doctoral candidate, University College London

Tom Evslin, founder and CEO of ITXC

Bob Frankston, co-developer of VisiCalc, later of Lotus, and of home networking at Microsoft

Fred Goldstein, Principal of Ionary Consulting, author of The Great Telecom Meltdown

John Hagel, author, consultant, and now Director of the new research center at Deloitte Touche

Shant Hovnanian, President, Wibiki, Software Defined Radio advocate

Erik Hunsinger, Network Engineer, Level 3

David Isenberg, author of The Rise of the Stupid Network and proprietor of ISEN.com

Olivier Jerphagnon, Strategist, Calient Networks

Peter MacCaulay, Managing Director, Number 1 IT Group, Auckland, New Zealand

Andrew Odlyzko, Director, Digital Technology Center, University of Minnesota

Ed Pimentel, CTO, AgileCo, Alpharetta, Georgia

David Reed, HP Fellow at HP Labs and Adjunct Professor at MIT Media Lab

Sheldon Renan, President, Vision and Strategy

Hendrik Rood, Principal, Stratix Consulting, and Faculty, Delft University

Doc Searls, Editor, Linux Journal and the Doc's IT Garage blog, Cluetrain co-author

Mike Spector, Attorney and long-time friend of Bob Frankston

Jeff Sterling, ISP Emeritus, founder of IXA, broadband developer, Idaho

Bill St. Arnaud, Director, CA*net 4, Canada's high-speed research network

Chris Savage, Attorney, CRB, Washington, DC

Dirk van der Woude, Civil Servant, Amsterdam, and fiber expert

Tom West, CEO, National LambdaRail

Tom Vest, Senior Analyst, Internet Economics & Policy, CAIDA

John Waclawsky, Chief Software Architect, Motorola

Kevin Werbach, Assistant Professor, Wharton School, Producer of Supernova