At the end of last week, we heard the news that the initial bids in the FCC’s auction of 700MHz spectrum reached $2.4 billion. This auction is being run to reallocate spectrum previously used for analogue TV that has been freed up by the move to digital.
I don’t know very much about the technical details of spectrum, but I find it a fascinating commodity and allocation process (OK, I’m a geek). It’s a classic public good – it isn’t at all clear who should own it in the first place in order to sell it. There is a tragedy-of-the-commons problem, though, if it isn’t allocated somehow: if everyone tries to use it, then no-one gets to, because of interference. This isn’t a problem for some low-power, short-range technologies such as Bluetooth or ultra-wideband (UWB) – not that UWB seems to be a problem anyway, given that no-one is using it – but you need allocation before you can have technologies like mobile phones, TV, radio etc.
Historically, governments tended to allocate spectrum to public bodies (e.g. the BBC) or to companies by beauty parade (read: cronyism). Back in the 1990s, the economists, mathematicians and game theorists crashed the party and argued that allocation by auction would bring a number of nice benefits:
- Revenue for the Government / taxpayer
- Efficient allocation – the spectrum goes to the party prepared to pay the most, which in principle is the party with the ‘best’ use for it, in the sense of the use that people are most prepared to pay for
- Fairness of allocation – no reliance on ‘contacts’ or cronyism – it becomes all about the business model. The markets sort it all out.
I am actually more qualified to write about this stuff than pretty much anything else you’ll see us blathering on about here on the Distilled blog since auction theory research formed part of my graduate studies. If you care about the background, I’ve included a bit at the bottom of this post.
As well as a bit of academic background, I also used to work for a company called Analysys, consulting for industry, financiers and the Government on telecoms and Internet issues. One of the things we studied (which I still can’t really talk about) had to do with re-allocation of spectrum used by the mobile operators for 2G services (which is theoretically being made redundant by the introduction of 3G). Another study I was involved with, which is public knowledge, examined reallocation options for a range of selected spectrum bands that had become free after use in various public safety, military and emergency services roles (among other uses).
US 700MHz spectrum
The 700MHz band is particularly attractive to many commercial users. Its wavelength is long enough that a single ‘cell’ can cover a significant distance, and it penetrates barriers such as walls and buildings well. As Wired says:
Generally, the lower a radio signal’s frequency, the farther it can propagate and the more easily it can penetrate obstacles like walls and buildings. Lower frequencies also tend to be more efficient, enabling radios to transmit more bits for each hertz of frequency band. As a result, the 700-MHz band should provide better coverage than current cellular bands, which are between 800 MHz and 1900 MHz. So if you’re frustrated by the lack of reception you get on your mobile phone while in the office, a cell service that uses 700 MHz spectrum could offer some relief.
GigaOM has more on the technical details.
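The propagation claim can be made concrete with the standard free-space path loss formula, in which loss in dB grows with 20·log10 of the carrier frequency. A back-of-envelope sketch in Python – the 5km link length is my own arbitrary illustration, not a figure from the articles above:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (standard formula, d in km, f in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Loss over the same 5 km link at three carrier frequencies.
for f in (700, 800, 1900):
    print(f"{f} MHz: {fspl_db(5, f):.1f} dB")
```

Running this shows roughly 8.7dB less loss at 700MHz than at 1900MHz over the same distance – real networks add terrain and building effects on top, but the ordering is the same.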
Why SEOs should care
The flippant answer for why SEOs should care about the results of the 700MHz auction is that Google is involved (and we all care about everything Google does, right?).
In fact, I think there is another reason. There is a pretty good chance that the eventual winner of the coveted, so-called “C-block” of spectrum is going to roll out a new (US) nation-wide wireless data transmission network. People who know about these kinds of things (see the GigaOM link above) estimate that using this spectrum in place of the most attractive currently available alternative will halve the cost of building the infrastructure for such a network, from $4 billion to more like $2 billion. That makes it pretty likely that someone planning this kind of roll-out will be the efficient user of this spectrum.
Especially in rural areas, the properties of the 700MHz spectrum mean a far greater chance of ubiquitous wireless connectivity. Finally, mobile internet comes of age.
More than anything we are currently seeing, this will result in game-changing shifts in power for online marketers. We are going to be writing more over the coming weeks about the differences between ‘regular’ online marketing and online marketing targeting mobile users. But for now, suffice it to say that much of what you currently do is going to need to be re-evaluated as more and more users go mobile.
Going back to the flippant reason – the ‘Google factor’ – there has been some speculation that Google would like to win the 700MHz spectrum not only to create ubiquitous wireless data connectivity, but also to launch free services supported by usage-contextual advertising (link). Scary, huh? Opportunities, though…!
Some light relief – past auction comedy
While my overview of auctions above lists loads of good reasons why they are an attractive allocation mechanism, they can also backfire badly (as can anything when governments are involved). From my paper on combinatorial auctions (compiled from J. McMillan, “Selling Spectrum Rights”, Journal of Economic Perspectives, Vol. 8, 145–162, 1994.):
Having been advised by the US/UK consulting firm National Economic Research Associates (NERA), the New Zealand government decided to adopt a second-price sealed-bid auction for their radio, television and cellular telephone spectrum. The scarcity of competition in the small New Zealand markets caused a politically embarrassing situation when winners paid prices far below their bids. In some extreme cases, a firm paid the second-highest bid of NZ$6 following a bid of NZ$100,000, and another paid NZ$5,000 after bidding NZ$7m (at the time, NZ$1 equaled US$0.55). Whilst these can perhaps be justified in terms of economic efficiency (if not revenue maximization), another case – a student from Otago University bidding NZ$1 for a television licence for a small city and being awarded it for free after no-one else bid – simply makes a mockery of the process.
If the New Zealand government wanted to take advantage of the nice theoretical properties of the second-price auction, they should have anticipated the relatively low competition and imposed reserve prices. Owing to the media furore surrounding the fiasco, however, this route was discarded in favour of first-price sealed-bid auctions for future allocations.
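The mechanics are easy to sketch: in a second-price (Vickrey) sealed-bid auction the highest bidder wins but pays only the second-highest bid, which is exactly why thin competition is dangerous and why a reserve price caps the downside. A minimal illustration – the function and the reserve figure are my own, with the bids echoing the NZ$100,000/NZ$6 case:

```python
def second_price_auction(bids, reserve=0):
    """Second-price (Vickrey) sealed-bid auction.

    The highest bidder at or above the reserve wins and pays the larger
    of the second-highest eligible bid and the reserve. Returns
    (winner_index, price), or None if no bid meets the reserve.
    """
    eligible = [(amount, i) for i, amount in enumerate(bids) if amount >= reserve]
    if not eligible:
        return None
    eligible.sort(reverse=True)
    winner = eligible[0][1]
    runner_up = eligible[1][0] if len(eligible) > 1 else reserve
    return winner, max(runner_up, reserve)

# The New Zealand failure mode: almost no competition.
print(second_price_auction([100_000, 6]))                  # winner pays NZ$6
# A reserve price would have protected revenue.
print(second_price_auction([100_000, 6], reserve=50_000))  # winner pays NZ$50,000
```

With competitive bidding the second price tracks the first closely and the mechanism works as advertised; the pathology only bites when, as in New Zealand, the gap between the top two bids is enormous.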
Similar difficulties were had in Australian auctions for satellite television services, described by an opposition politician as “one of the world’s great media licence fiascoes”. The licences were won unexpectedly by Hi Vision Ltd. and Ucom Pty. Ltd. (beating favourites including a consortium of Rupert Murdoch, Kerry Packer and Telecom Australia). They did this by exploiting a flaw in the auction design that allowed them to default on bids without paying a penalty. They bid very high – A$212m and A$177m respectively (at a time when A$1 was worth US$0.68) – which was widely hailed as demonstrating that the Australian television industry had come of age.
In fact, they never had any intention of paying these high prices. As they defaulted on their bids, the licences were awarded to the next highest bidders: the same companies. They had placed bids at A$5m intervals right down to A$117m and A$77m respectively (both in fact going to Ucom after Hi Vision defaulted on all their bids). Ucom proceeded to sell both licences at a profit. This auction had failed both to award the licences to their efficient owners and to generate decent revenue. In addition, the repeated default process had delayed by almost a year the introduction of pay television to a country already behind much of the world. If the cost of defaulting is low, the bidders are effectively bidding on options on the items for sale rather than the items themselves (P. Klemperer, “What Really Matters in Auction Design”, Economics Working Paper, WUSTL, 2000.).
It has been suggested that even the deposits of hundreds of millions of pounds may not have been enough to prevent default in the UK 3G telecom auctions had the desire arisen among the winning bidders – who were paying billions of pounds (P. Klemperer, “The Biggest Auction Ever: the Sale of the British 3G Telecom Licences”, Economic Journal, 2002.).
My background in auction theory
At university, I studied combinatorial auctions (which are often used in spectrum allocation) during my ‘Part III’ 4th year. Combinatorial auctions are those where bidders bid on multiple lots at once and may have preferences for particular groups of lots (making it inefficient to auction the lots one after another). The difficulty from a mathematical perspective is that, depending on how you structure the auction, either the bidders or the auctioneer has a very hard (NP-complete) problem to solve.
I even managed to introduce a small bit of original thinking (I believe) around a particular class of auction whereby the auctioneer distributes the hard calculations across all the bidders and asks them to search for the winner. The problem with this, of course, is that bidders have an incentive to only report improved solutions that benefit them. I wrote about game-theoretic approaches to stopping bidders gaming this stage of the process.
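To make the auctioneer’s side of that hardness concrete: with package bids, winner determination means picking the revenue-maximising set of bids whose packages share no lots – a weighted set-packing problem, which is NP-hard in general. A brute-force sketch over all bid combinations (the bid data is invented for illustration):

```python
from itertools import combinations

def winner_determination(bids):
    """Exhaustive winner determination for a combinatorial auction.

    bids: list of (bidder, items_frozenset, amount) tuples. Finds the
    revenue-maximising set of non-overlapping package bids. The search
    over all bid subsets is exponential - hence the hardness.
    """
    best_revenue, best_bids = 0, []
    for r in range(1, len(bids) + 1):
        for combo in combinations(bids, r):
            packages = [b[1] for b in combo]
            # Accept only combinations whose packages share no lots.
            if sum(len(p) for p in packages) == len(frozenset().union(*packages)):
                revenue = sum(b[2] for b in combo)
                if revenue > best_revenue:
                    best_revenue, best_bids = revenue, list(combo)
    return best_revenue, best_bids

bids = [
    ("a", frozenset({"lot1", "lot2"}), 10),  # values the pair together
    ("b", frozenset({"lot1"}), 6),
    ("c", frozenset({"lot2"}), 5),
]
print(winner_determination(bids))  # b and c together beat a's package bid
```

Real spectrum auctions obviously don’t brute-force thousands of bids like this; they lean on integer-programming solvers and carefully designed bid languages, which is exactly where the distributed-computation idea above comes in.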