The FCC should try a different economic model of supply and demand

Federal Communications Commission chairman Tom Wheeler posted the following statement today in a blog post:

“Getting the Incentive Auction right will revolutionize how spectrum is allocated. By marrying the economics of demand (think wireless providers) with the economics of current spectrum holders (think television broadcasters), the Incentive Auction will allow market forces to determine the highest and best use of spectrum.”

Under Chairman Wheeler’s supply-and-demand model, the market for assessing competition and demand runs between wireless providers and television broadcasters.  The auction is designed to make a market between participants in each group.

Chairman Wheeler also appears to lay the groundwork for the argument that because AT&T and Verizon hold a large amount of low-frequency spectrum (the kind that best travels long distances and penetrates buildings), there may be some adverse impact on smaller carriers.  In his words:

“A legacy of earlier spectrum assignments, however, is that two national carriers control the vast majority of low-band spectrum. As a result, rural consumers are denied the competition and choice that would be available if more wireless competitors also had access to low-band spectrum.

Low-band physics also makes this slice of spectrum essential in urban areas, since it permeates into buildings better than does high-band spectrum. With more and more Americans opting for wireless-only connectivity, they should not run the risk of being unable to place a 911 call from the interior of a building just because their wireless company has the wrong spectrum.

While many factors go into determining the quality of wireless service, access to a sufficient amount of low-band spectrum is a threshold requirement for extending and improving service in both rural and urban areas.”

Yes, AT&T and Verizon are the “Monsters of the Cellular Midway.”  According to data from the FCC, AT&T holds 32.54% of market share based on reported connections while Verizon holds 29.12%.  Sprint comes in at 17.44% while T-Mobile brings up the oligopoly rear at 10.26%.  But does the size of a wireless carrier, or even the services it provides, mean that it is the one driving demand for spectrum?  I argue no.
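To put those shares in context, here is a small illustrative calculation, not drawn from the FCC data cited above, of the Herfindahl-Hirschman Index (HHI) implied by the quoted figures.  Because the roughly 10% held by smaller carriers is omitted, the true index would be slightly higher:

```python
# Illustrative only: HHI computed from the FCC connection shares quoted
# above.  The HHI is a standard market-concentration measure (the sum of
# squared market shares, in percentage points); it is not part of the
# cited FCC data.
shares = {
    "AT&T": 32.54,
    "Verizon": 29.12,
    "Sprint": 17.44,
    "T-Mobile": 10.26,
}

hhi = sum(s ** 2 for s in shares.values())
print(round(hhi))  # prints 2316
```

An index of roughly 2,316 falls in the “moderately concentrated” band (1,500 to 2,500) under the DOJ/FTC Horizontal Merger Guidelines, which is consistent with the oligopoly picture the paragraph above describes.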

The wireless carrier isn’t the one making the call.  The consumer is.  The carrier, as its name implies, is just an intermediary that offers a set of technologies allowing a consumer to abandon two tin cans and a wire and use digital and wireless technology to make a call well beyond his front yard.  The consumer is driving demand for spectrum.  The carrier is merely his agent, one that promises to get its client’s call where he wants it with the best technology around.  The carrier shouldn’t be penalized because it was successful in aggregating the largest number of consumers into its portfolio.

The carrier shouldn’t be punished because it out-marketed and out-hustled a bunch of other players.  That’s tantamount to punishing a business for leveraging its capital and providing investors the best returns by providing consumers with the best service.

With its spectrum policy, the FCC should be allowing a wireless carrier, no matter its size, to plan new service offerings without the uncertainty of allocation rules that hinder organic growth and innovation.  Recognizing that consumers drive spectrum demand would help the FCC stay focused on providing a regulatory framework that encourages innovation.  When it comes to spectrum, it’s about consumer demand and a regulatory bottleneck that needs to be, at a minimum, widened, and in the ideal, straight up broken.

 

Dear Al Franken. You’re missing the globe for the bushes

The U.S. Senate Judiciary Committee met today to share its thoughts on the proposed merger between media companies Comcast and Time Warner Cable.

Wait a minute.  Did I say media companies?  Yes I did.  Comcast and Time Warner Cable provide end-users with access to content, whether they purchase that content from programmers such as ESPN or produce that content themselves, such as through their regional sports networks or other entertainment networks.  The questions posed by most of the senators displayed either their ignorance or fear of Comcast and Time Warner’s new roles as content providers.  Their unique position as owners of video distribution pipes that go into the homes of consumers shouldn’t lessen their primary roles as content providers nor should ownership of transmission mediums be the primary determinant of the legal and regulatory framework for their oversight.

Senators like Al Franken, Democrat of Minnesota, have a tendency to focus on the small issues that generate the most political excitement, and this tendency results in myopic analysis of the issue in front of them.  The senators would rather focus on consumer issues such as increased prices for ESPN and sports blackouts.  They would rather cater to testimony from content providers complaining about their inability to get their products displayed on the digital version of a grocery store shelf, complaining that the store brand is getting the prime spot in the middle of the eye-level shelf.

Take, for example, the testimony of James Bosworth, chief executive officer of Back9Network Inc.  Back9Network provides video programming that promotes the golf lifestyle.  Mr. Bosworth argues that for independent programmers like his company, it will be nearly impossible to compete against similar programming provided by Comcast.  He would like the merger halted because he believes his firm will not be able to compete with Comcast’s other golf and lifestyle programming.

Could the real issue be that programmers such as Back9Network don’t bring enough value to the end-user, much less to the “digital grocery store” that is Comcast, to put themselves in a position deserving of more eyeballs?  In an industry allegedly valued at $177 billion with approximately 26 million golfers, maybe Back9Network, still an infant having been in business only since 2010, hasn’t come up with the compelling business model that Comcast’s David Cohen admits is necessary for the company to place a network in its lineup.  Maybe programmers need to focus on creating something that people want to see in the first place.

But there is something more fundamentally telling in this debate over the merger of Comcast and Time Warner.  If there are so many independent programmers out there jostling for room on a media company’s platform, maybe it’s time for programmers to explore technological alternatives for getting their products into market.  For example, why couldn’t independent programmers combine their content, establish a network, and distribute their programming to end-user laptops, tablets, and smartphones via Roku devices, similar to the services provided by Aereo?

Mr. Franken and other senators would rather see the media bottleneck forcibly widened by denying mergers like the proposed Comcast-Time Warner combination.  Instead, politicians and policymakers should promote alternative methods of distribution, especially for content providers who are still trying to make a compelling case that their content provides consumer markets sufficient value.

The knowledge economy needs speed and net neutrality would slow it down

The knowledge economy stands on four pillars:

  1. Education and training,
  2. Information infrastructure,
  3. Economic incentive and institutional regime, and
  4. Innovation systems.

An educated and trained labor force is required for the creation, sharing, and use of knowledge.  To facilitate effective communication, dissemination, and processing of information, an infrastructure ranging from radio to Internet is needed.  This infrastructure needs capital, and capital needs an incentive in order to flow to infrastructure initiatives.  This calls for a regulatory framework that makes it easier for knowledge to flow while supporting future investment.  Research centers, universities, think tanks, and community groups create new knowledge via the information infrastructure and spawn innovation systems that tap into, accumulate, and adapt the knowledge stock for local use.

As John Houghton and Peter Sheehan pointed out in a 2000 study on the knowledge economy, what has taken center stage is how much the value of knowledge within an economic system has increased.  What drives that increase is the rising knowledge intensity of the world economy and our growing ability to distribute knowledge.  According to Houghton and Sheehan, “The implications of this are profound, not only for the strategies of firms and for the policies of government but also for the institutions and systems used to regulate economic behaviour.”

What’s driving the knowledge intensity is the combination of the information technology revolution and the increasing pace of technological change.  The increasing ease at which the flow of information moves around the globe is a result of deregulation and significant changes in technological capabilities and capacity.

As technology improves in capacity and capability, the marginal costs of storing, manipulating, and transmitting information continue to fall.  Also, as markets become increasingly networked and globalized, time, according to Houghton and Sheehan, becomes an increasingly important factor of production.  The accelerating accumulation of knowledge stocks has a positive impact on economic growth, which in turn helps make knowledge a significantly more important factor of production.  Global markets also find that it is less expensive to codify and spread information than it is to re-invent knowledge stocks.

What happens, however, when you throw net neutrality rules into a continually emerging knowledge economy and its underlying information and knowledge markets?  Sub-optimality, according to a persuasive argument from Roslyn Layton: “Net neutrality ensures that the lowest common denominator becomes the standard for all internet service.”

Even Netflix realizes this, which is why the online video distributor and content provider entered into an agreement with Comcast for a direct lane to the Internet access provider in order to get video to Netflix subscribers faster.  The reality of sluggish service to its subscribers forced Netflix to wisely open an alternative lane that bypasses backbone provider Cogent.  According to The Wall Street Journal, “Netflix’s carriers send far more traffic to broadband providers’ networks than they take back, sometimes accounting for a third of all North American peak Internet traffic, according to Internet traffic-management company Sandvine Corp.”

If the Federal Communications Commission were to, pursuant to section 706 of the Telecommunications Act of 1996, expand its jurisdiction over broadband to include traffic between edge providers and Internet access providers by imposing net neutrality requirements on this portion of the Internet eco-system, then, judging by Netflix’s example, the speed of diffusing information and knowledge over the Internet could slow down significantly.

The solution?  Don’t promulgate any rules.  Authority to regulate does not have to manifest itself in any more rules.  The FCC should consider the benefits consumers could receive when private participants in the knowledge economy work out issues surrounding traffic flow on their own.


Broadband and the global brain

I don’t remember where I saw this quote and I may not have it written correctly, but to summarize, “In the future, work will be about learning.”  I believe the speaker was trying to get across that we are moving steadily out of an economy that focused on what we could do physically and into one where the measure of our value will be placed on what we can learn and how we communicate it.  We are going to be relied on more for our brains, less on our brawn, and more on our broadband.

Just think of ourselves as mini encyclopedias connected to everyone and everything else via a digital connection, uploading our knowledge into a processing system that uses our know-how to design, construct, test, re-construct, market, and sell product.  As we move quickly to an Internet of Things, we will see that the new eco-system will be about machines and servers talking to each other and about how integrated human beings will be in the production process the Internet of Things supports.

In a report released last month, GE discussed how the process behind the development, construction, and delivery of products will change, with each step along the production chain tied together by digital networks and advanced design and construction devices such as 3-D printing.  Playing an important role in the new approach to production is a concept called the “global brain,” defined as the collective intelligence of human beings across the globe integrated by digital communications networks.  The hope is that as more people connect to the Internet, the global stock of knowledge will increase with each human being’s connection.

The expectation is that as more physical, menial work is offloaded onto machines, people will be freed up to exercise more of their creative and entrepreneurial sides.  The ideas that newly freed minds come up with can be uploaded into cloud-based data platforms for use in building products and providing services.  This is a lot more appealing than, and should not be confused with, the villages created by companies such as Facebook and Instagram that add next to nothing to true public knowledge.

“We are witnessing the rise of the global brain, when a buzzing hive of knowledge, connectivity, technology and access unites the human and the machine, the physical and the digital, in previously unimaginable ways,” says Beth Comstock, GE’s chief marketing officer. “Scientific discovery, information sharing and sheer ingenuity are giving us the ability to hack our human brains to learn, do, be more. At the same time, we can model human intelligence into machines to help us gain insights, increase speed and know more.”

 


Net neutrality will not encourage capital flow to edge providers

Capital flows to the activity that provides the highest returns.  That activity may involve a high level of risk, but the premium paid out by entrepreneurs to their investors hopefully compensates for the uncertainty.  Policy making by government regulators is a source of uncertainty for entrepreneurs in the Internet space.  Levels of uncertainty increased after the U.S. Court of Appeals for the District of Columbia Circuit’s ruling in Verizon v. FCC, where the court held that the Federal Communications Commission’s authority to regulate access to broadband is based on its duty to promote the deployment of advanced services.

Underlying the philosophy of net neutrality is an argument that the consumer, the end-user, should be responsible for paying all the costs associated with the transmission of data from an edge provider.  This would include costs associated with data traveling down a path that goes from the edge provider through a backbone provider onward to the end-user’s Internet access provider.  For example, Netflix, a company that distributes videos over the Internet, would not be responsible for paying fees to Cogent, a backbone provider, or, should Netflix interconnect directly with it, Comcast, an Internet access provider.

What the net neutrality argument ignores is the multi-sided market in which Internet access providers operate.  The argument also ignores costs generated at interconnection points throughout the Internet and fails to address the proper cost recovery mechanism at each interconnection point.  The net neutrality argument concludes that cost causation is one-way, from the end-user to and through the Internet, and that the fees the end-user pays to his Internet access provider should recover this cost.

In actuality, the Internet is what I term “multi-way.”  The self-healing and permanent virtual circuit characteristics of the Internet protocol require constant communications between nodes, routers, servers, and other devices on the Internet, regardless of whether an end-user’s laptop is on or not.  Information requests are constantly being exchanged between backbone providers, edge providers, and Internet access providers.  The end-user is not the only cost causer.  If entrepreneurs cannot explain to investors why costs are not being recovered from all cost causers on the network, capital will flow to the next best generator of returns.

Work by Larry F. Darby and Joseph P. Fuhr appears to bear out this conclusion.  In a 2007 study on the impact of net neutrality on capital flows, Messrs. Darby and Fuhr concluded that requiring end-users to pay for next generation networks:

  • Is inconsistent with practice in other multi-sided markets;
  • Will increase investor risk, suppress investment, and slow construction of next generation broadband networks;
  • Will increase rates to consumers; and
  • Will reduce the present value of consumer surplus.

Network providers are challenged with coming up with pricing schemes that optimize the scale of their networks while maximizing the networks’ values.  The prospect that investors will recover their investment in networks has a dark cloud hovering over it, a cloud generated by the 2000 dot-com bubble burst.  Wall Street tends to see reductions in short- and middle-term returns as a result of large infrastructure investment.  Investors also see growing threats to broadband network deployment from satellite and municipally provided broadband networks.

According to Messrs. Darby and Fuhr, “Specifically, investors have a big stake in the resolution of net neutrality issues and particularly in the outcome of the debate over who can be charged, by what principles and by whom–that is, in resolution of a set of network access pricing issues.”

If the FCC truly wants to promote the deployment of advanced services, it should talk more about the importance not only of spectrum as natural capital, but also of the financial capital necessary for deploying the networks on which advanced services are expected to run.