Given the smartphone’s ubiquity and our dependence on it, “App Coverage” (AC) is something confronting us every day, yet we know little about it. At the CCA Global Expo this week in San Antonio, Glenn Laxdal of Ericsson spoke about “app coverage,” a concept the vendor first surfaced in 2013. AC is defined as “the proportion of a network’s coverage that has sufficient performance to run a particular app at an acceptable quality level.” In other words, the variety of demand from end-users for voice, data and video applications is outpacing the ability of carriers to keep up. According to Ericsson, monitoring and ensuring the performance of app coverage is the next wave in LTE networks. Here’s a good video explaining AC in simple, visual terms.
Years, nay, decades ago I used to say coverage should be measured in 3 important ways:
Geographic (national vs regional vs local)
In/Outdoors (50+% loss indoors)
Frequency (roughly double the capex at 1900 MHz vs 700 MHz)
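The capex point in the third bullet follows from basic radio physics: path loss rises with frequency, so a 1900 MHz network needs a denser (costlier) cell grid than a 700 MHz one to cover the same area. A minimal sketch using the standard free-space path loss formula; the ~2x capex figure is the text's claim, not derived here:

```python
import math

def fspl_db(freq_mhz: float, dist_km: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.45."""
    return 20 * math.log10(dist_km) + 20 * math.log10(freq_mhz) + 32.45

# Extra loss at 1900 MHz vs 700 MHz at the same distance:
delta_db = fspl_db(1900, 1.0) - fspl_db(700, 1.0)
print(round(delta_db, 1))  # ~8.7 dB more loss at 1900 MHz
```

That ~8.7 dB penalty shrinks each cell's usable radius, which is why higher-band buildouts need more sites for the same footprint.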
Each of these had specific supply/demand clearing implications across dozens of issues impacting balance sheets and P&L statements, ultimately determining winners and losers. They are principally why AT&T and Verizon today have 70% of subscribers (80% of enterprise customers), up from 55% just 5 years ago, 84% of total profit, and over 100% of industry free cash flow. Now we can add “applications” to that list. And it will only make it more challenging for competitors to wrest share from the “duopoly”.
The conference concluded with a panel of CEOs hailing Sprint’s approach, which Son outlined here, as one of benevolent dictator (perhaps not the best choice of words) and exhorting attendees to partner, partner, partner; something that Terry Addington of MobileNation said has taken far too long. Even then the panel agreed that pulling off partnerships will be challenging.
The Good & Bad of Wireless
Wireless is great because it is all things to all people, and that is what makes it bad too. Planning for and accounting for how users will access the network is very challenging across a wide user base. There are fundamentally different “zones” and contexts in which different apps can be used, and they often conflict with network capacity and performance. I used to say that one could walk, or even hang upside down from a tree, and talk, but one couldn’t “process data” doing those things. Of course the smartphone changed all that, and people are accessing their music apps, location services, searches and purchases, and watching video from anywhere; even hanging upside down in trees.
Today voice, music and video consume 12, 160 and 760 kbps of bandwidth, respectively, on average. Tomorrow those numbers might be 40, 500 and 1500, and that’s not even taking into account “upstream” bandwidth, which will be even more of a challenge for service providers to provision as consumers expect more 2-way collaboration everywhere. The law of wireless gravity, which states bits will seek out fiber/wire as quickly and cheaply as possible, will apply, necessitating sharing of facilities (wireless and wired), heterogeneous networks (HetNets), and aggressive wifi offload approaches; even consumers will be shared in the form of managed services across communities of users (known today as OTT). The show agenda included numerous presentations on distributed antenna networks and wifi offload applied to the rural coverage challenge.
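The per-app figures above imply very different growth multiples by traffic type; a quick back-of-the-envelope check using only the numbers quoted in the paragraph:

```python
# Average per-app bandwidth, kbps (figures from the text)
today = {"voice": 12, "music": 160, "video": 760}
tomorrow = {"voice": 40, "music": 500, "video": 1500}

for app in today:
    mult = tomorrow[app] / today[app]
    print(f"{app}: {mult:.1f}x")  # voice 3.3x, music 3.1x, video 2.0x
```

Voice and music roughly triple while video "only" doubles, but video's absolute increment (740 kbps per stream) dwarfs the others, which is what actually drives provisioning.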
Developing approaches ex ante to anticipate demand is even more critical if carriers want to play major roles in the internet of things, unified (video) communications and the connected car. As Ericsson states in its whitepaper,
“App coverage integrates all aspects of network performance – including radio-network throughput and latency, capacity, as well as the performance of the backhaul, packet core and the content-delivery networks. Ultimately, managing app coverage and performance demands a true end-to-end approach to designing, building and running mobile networks.”
The US has lacked a telecom network visionary for nearly 2 decades. There have certainly been strong and capable leaders, such as John Malone who not only predicted but brought about the 500 channel LinearTV model. But there hasn’t been someone like Bill McGowan who broke up AT&T or Craig McCaw who first had the vision to build a national, seamless wireless network, countering decades of provincial, balkanized thinking. Both of them fundamentally changed the thinking around public service provider networks.
But with a strong message to the markets in Washington DC on March 11 from Masayoshi Son, Sprint’s Chairman, the 20 year wait may finally be over. Son did what few have been capable of doing over the past 15-20 years since McGowan exited stage left and McCaw sold out to MaBell: telling it like it is. The fact is that today’s bandwidth prices are 20-150x higher than they should be with current technology.
This is no one’s fault in particular and in fact to most people (even informed ones) all measures of performance-to-price compared to 10 or 20 years ago look great. But, as Son illustrated, things could be much, much better. And he’s willing to make a bet on getting the US, the most advanced and heterogeneous society, back to a leadership role with respect to the ubiquity and cost of bandwidth. To get there he needs more scale and one avenue is to merge with T-Mobile.
There have been a lot of naysayers as to the possibility of a Sprint-T-Mo hookup, including leaders at the FCC. But don’t count me as one; it needs to happen. Initially skeptical when the rumors first surfaced in December, I quickly reasoned that a merger would be the best outcome for the incentive auctions. A merger would eliminate spectrum caps as a deterrent to active bidding and maximize total proceeds. It would also have a better chance of developing a credible third competitor with equal geographic reach. Then in January the FCC and DoJ came out in opposition to the merger.
In February, though, Comcast announced the much rumored merger with TW, and Son jumped on the opportunity to take his case for merging to a broader stage. He did so in front of a packed room of 300 communications pundits, press and politicos at the US Chamber of Commerce’s prestigious Hall of Flags; a poignant backdrop for his own rags-to-riches story. Son’s frank honesty about the state of broadband for the American public vs the rest of the world, as well as Sprint’s own miserable current performance, was impressive. It’s a story that resonates with my America’s Bandwidth Deficit presentation.
Here are some reasons the merger will likely pass:
The FCC can’t approve one horizontal merger (Comcast/TW) that brings much greater media concentration and control over content distribution, while disallowing a merger of two small players (really irritants as far as AT&T and Verizon are concerned).
Son has a solid track record of disruption and doing what he says.
The technology and economics are in his favor.
The vertically integrated service provider model will get disrupted sooner and faster, as Sprint will have to think outside the box, partner, and develop ecosystems that few in the telecom industry have thought about before; or if they have, they’ve been constrained by institutional inertia and hidebound by legacy regulatory and industry silos.
Here are some reasons why it might not go through:
The system is fundamentally corrupt. But the new FCC Chairman is cast from a different mold than his predecessors and is looking to make his mark on history.
The FCC shoots itself in the foot over the auctions. Given all the issues and sensitivities around incentive auctions the FCC wants this first one to succeed as it will serve as a model for all future spectrum refarming issues.
The FCC and/or DoJ find in the public interest that the merger reduces competition. But any analyst can see that T-Mo and Sprint do not have sustainable models at present on their own; especially when all the talk recently in Barcelona was already about 5G.
Personally I want Son’s vision to succeed because it’s the vision I had in 1997 when I originally brought the 2.5-2.6 (MMDS) spectrum to Sprint and later in 2001 and 2005 when I introduced Telcordia’s 8x8 MIMO solutions to their engineers. Unfortunately, past management regimes at Sprint were incapable of understanding the strategies and future vision that went along with those investment and technology pitches. Son has a different perspective (see in particular minute 10 of this interview with Walt Mossberg) with his enormous range of investments and clear understanding of price elasticity and the marginal cost of minutes and bits.
To be successful Sprint’s strategy will need to be focused, but at the same time open and sharing in order to simultaneously scale solutions across the three major layers of the informational stack (aka the InfoStack):
upper (application and content)
middle (control and settlement)
lower (access and transport)
This is the challenge for any company that attempts to disrupt the vertically integrated telecom or LinearTV markets; the antiquated and overpriced ones Son says he is going after in his presentation. But the US market is much larger and more robust than the rest of the world, not just geographically, but also from a 360 degree competitive perspective where supply and demand are constantly changing and shifting.
Ultimate success may well rest in the control layer, where Apple and Google have already built up formidable operating systems which control vastly profitable settlement systems across multiple networks. What few realize is that the current IP stack does not provide price signals and settlement systems that clear supply and demand between upper and lower layers (north-south) or between networks (east-west) in the newly converged “informational” stack of 1 and 2-way content and communications.
If Sprint’s Chairman realizes this and succeeds in disrupting those two markets with his strategy then he certainly will be seen as a visionary on par with McGowan and McCaw.
Intermodal competition is defined as: “provision of the same service by different technologies (i.e., a cable television company competing with a telephone company in the provision of video services).”
Intramodal competition is defined as: “competition among identical technologies in the provision of the same service (e.g., a cable television company competing with another cable television company in the offering of video services).”
Focus on 4 words: same, different, identical, same. “Same” appears twice.
Saying wireless represents intermodal competition to wired (fiber/coax) is like saying that books compete with magazines or radio competes with TV. Sure, the former both deliver the printed word. And the latter both pass for entertainment broadcast to us. Right?
Yet these are fundamentally different applications, even if they share common network layers and components; in plain English, similarities exist across production, distribution and consumption. But their business models are all different.
Wireless Is Just Access to Wireline
So are wireless and wired really the SAME? For voice they certainly aren’t. Wireless is still best efforts. It has the advantage of being mobile and with us all the time, which is value-added, while wired offers much, much better quality. For data the differences are more subtle. With wireless I can only consume stuff in bite sizes (email, twitter, perusing content, etc.) because of throughput and device limitations (screen, processor, memory). I certainly can’t multi-task and produce content the way I can on a PC linked to a high-speed broadband connection.
That said, increasingly people are using their smartphones as hotspots or repeaters to which they connect their notebooks and tablets and can then multi-task. I do this a fair bit and it is good while I'm on the road and mobile, but certainly no substitute for a fixed wired connection/hotspot in terms of speed and latency. Furthermore, wireless carriers, by virtue of their inefficient vertically integrated, siloed business models and the fact that wireless spectrum is both shared and reused, have implemented onerous price caps that limit total (stock) consumption even as they increase speed (flow). The latter creates a crowding-out effect: throughput is degraded as more people access the same radio, which I run into a lot. I know this because my speed decreases or the 4G bars mysteriously disappear on my handset and indicate 3G instead.
Lastly, one thing I can do with the phone that I can’t do with the PC is take pictures and video. So they really ARE different. And when it comes to video, there is as much comparison between the two as a tractor trailer and a motorcycle. Both will get us there, but really everything else is different.
At the end of the day, where the two are similar or related is when I say wireless is just a preferred access modality and extension of wired (both fixed and mobile) leading to the law of wireless gravity: a wireless bit will seek out fiber as quickly and cheaply as possible. And this will happen once we move to horizontal business models and service providers are incented to figure out the cheapest way to get a bit anywhere and everywhere.
Lack of Understanding Drives Bad Policy
By saying that intermodal competition exists between wireless and wired, FSF is selectively taking aspects of the production, distribution and consumption of content, information and communications and conjuring up similarities that exist. But they are really separate pieces of the bigger picture puzzle. I can almost cobble together a solution that is similar vis-a-vis the other, but it is still NOT the SAME for final demand!
This claiming to be one thing while being another has led to product bundling and on-net pricing--huge issues that policymakers and academics have ignored--both of which have promoted monopolies and limited competition. In the process, consumers have been left with overpriced, over-stuffed, unwieldy and poorly performing solutions.
In the words of Blevins, FSF is once again providing a “vague, conflicting, and even incoherent definition of intermodal competition.” 10 years ago the US seriously jumped off the competitive bandwagon after believing in the nonsense that FSF continues to espouse. As a result, bandwidth pricing in the middle and last mile disconnected from Moore’s and Metcalfe’s laws and is now overpriced 20-150x, impeding generative ecosystems and overall economic growth.
I've written about the impacts of and interplay between Moore’s, Metcalfe’s and Zipf’s laws on supply and demand of communication services and networks. Moore’s and Metcalfe’s laws can combine to drive bandwidth costs down 50% annually. Others have pointed out Butters’ law, coming from a Bell Labs wizard, Gerry Butters, which arrives at a more aggressive outcome: a 50% drop every 9 months! Anyway, those are the big laws that are immutable, washing against and over vertically integrated monopolies like giant unseen tsunamis.
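The two decline rates compound very differently over time; a quick sketch of the unit cost remaining after five years under each rate (rates as stated above, everything else is just arithmetic):

```python
def cost_after(years: float, halving_period_years: float) -> float:
    """Fraction of original unit cost remaining, given cost
    halves every `halving_period_years`."""
    return 0.5 ** (years / halving_period_years)

five_yr_annual = cost_after(5, 1.0)    # halving yearly (Moore + Metcalfe)
five_yr_butters = cost_after(5, 0.75)  # halving every 9 months (Butters)
print(round(five_yr_annual, 4), round(five_yr_butters, 4))
```

Under the annual-halving assumption, unit cost falls to about 3% of the original in five years; under Butters' 9-month halving, to about 1%. Either way, prices that stay flat for five years end up 20-100x out of line with underlying cost, which is the 20-150x gap the text keeps returning to.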
Then there are the smaller laws, like the one my friend Russ McGuire at Sprint penned: “The value of any product or service increases with its mobility.” Wow, that’s very Metcalfian and almost infinite in value, because the devices and associated pathways can move in 3 planes. I like that and have always believed in McGuire’s Law (even before he invented it!).
Since the early 1990s, when I was one of the few, if not the only, analysts on the Street to cover both wired and wireless telecoms, I’ve been maintaining that wireless is merely access to wireline applications. While that has finally been validated with “the cloud”, and business models and networks have been merging (at least at the corporate level), the majority of people still believe them to be fundamentally distinct. It shows in simple things like interfaces and the lack of interoperability across 4 screens. Thankfully all that is steadily eroding due to cloud ecosystems and the enormous fight happening in the data world between the edge and the core and open vs closed: GOOG vs AAPL vs MSFT (and let’s not forget Mozilla, the OS to rule all OS’?).
Anyone who works in or with the carriers knows wireless and wired networks are indelibly linked and always have been in terms of backhaul transport to the cell-tower. But over the past 6 years the symbiosis has become much greater because of the smartphone. 2G and 2.5G digital networks were capable of providing “data” connections from 1998-2006, but it really wasn’t until the iPhone happened on the scene in 2007, along with the advent of 3G networks, that things really started taking off.
The key was Steve Jobs’ demand to AT&T that smartphone applications purchased through the App Store have unfettered access to the internet, be it through:
2G, which was relatively pervasive, but slow at 50-300 kbps,
3G, which was not pervasive, but faster at 500-1500 kbps, or
Wifi (802.11g), which was pervasive in a lot of “fixed” areas like home, work or school.
The latter made a ton of sense in particular, because data apps, unlike voice, will more likely be used when one is relatively stationary, for obvious visual and coordination and safety reasons; the exception being music. In 2007 802.11g Wifi was already 54 mbps, or 30-50x faster than 3G, even though the Wifi radios on smartphones could only handle 30 mbps. It didn’t matter, since most apps rarely need more than 2 mbps to perform ok. Unfortunately, below 2 mbps they provided a dismal experience and that’s why 3G had such a short shelf-life and the carriers immediately began to roll out 4G.
Had Jobs not gotten his way, I think the world would be a much different place as the platforms would not have been so generative and scaled so quickly without unconstrained (or nearly ubiquitous) access. This is an example of what I call Metcalfian “suck” (network effect pull-through) of the application ecosystem for the carriers and nothing exemplified it better than the iPhone and App Store for the first few years as AT&T outpaced its rivals and the Android app ecosystem. And it also upset the normal order of business first and consumer second through the bring your own device (BYOD) trend, blurring the lines between the two traditionally separate market segments.
Few people to this day realize or appreciate the real impact that Steve Jobs had, namely reviving equal access. The latter was something the carriers and federal government conspired to kill, successfully, in the early 2000s. Equal access was the horse that brought us competitive voice in the early 1980s and competitive data in the early 1990s, and it helped scale digital wireless networks nationwide in the late 1990s. All things we’re thankful for, yet have forgotten, never entirely appreciated, or never knew how they came about.
Simply put, 70% of all mobile data access is over Wifi and we saw 4G networks develop 5 years faster than anyone thought possible. Importantly, not only is Wifi cheaper and faster access, it is almost always tied to a broadband pipe that is either fiber or becomes fiber very quickly.
Because of this “smart” or market-driven form of equal access, and in appreciation of Steve Jobs’ brilliance, I am going to introduce a new law. The Law of Wireless Gravity holds: “a wireless bit will seek out fiber as quickly and cheaply as possible.” I looked it up on Google and it doesn’t exist. So now I am introducing it into the public domain under Creative Commons. Of course there will be plenty of metaphors about clouds and attraction and lightning to go along with the law. As well, there will be numerous corollaries.
I hope people abide by this law in all their thinking about and planning for broadband, fiber, gigabit networks, application ecosystems, devices, control layers, residential and commercial demand, etc…because it holds across all of those instances. Oh, yeah, it might actually counter the confusion over and disinformation about spectrum scarcity at the same time. And it might solve the digital divide problem, and the USF problem, and the bandwidth deficit….and even the budget deficit. Ok, one step at a time.
A year ago it was rumored that 250 Apple employees were at CES 2012, even as the company refused to participate directly. The company could do no wrong and didn’t need the industry. For the better part of 9 months that appeared to be the case, and Apple’s stock outperformed the market by 55%. But a few months on, and a screen size too small for phones and too big for tablets, a mapping app too limited, and finally a buggy OS, Apple’s excess performance over the market has narrowed to 10%.
Two major themes of this year’s CES--mobile device screen size and extensive application ecosystems to connect just about anything--will place Apple’s mobile dominance and lead further in doubt. To us it was already in evidence last year when we talked about the singularity. But the real reason today is becoming apparent to all, namely that Apple wants to keep people siloed into their product specific verticals. People and their applications don’t want that because the cloud lets people update, access and view information across 3 different screens and any platform. If you want Apple on one device, all your devices have to be Apple. It’s a twist on the old Henry Ford maxim, “you can have any device…as long as it is Apple.”
This strategy will fail further when the access portion of the phone gets disconnected from all the other components of the phone. It may take a few years for that to happen, but it will make a lot of sense to just buy an inexpensive dongle or device that connects to the 4G/5G/Wifi network (metro-local or MAN/LAN) and radiates Bluetooth, NFC and Wifi to a plethora of connected devices in the personal network (PAN). Imagine how long your “connection hub” would last if it didn’t need to power a screen and huge processor for all the different apps. There goes your device-centric business model.
And all that potential device and application/cloud supply-side innovation means that current demand is far from saturated. The most recent Cisco forecasts indicate that 1/3rd of ALL internet traffic will be from mobile devices by 2016. In the US 37% of the mobile access will be via Wifi. Applications that utilize and benefit from mobility and transportability will continue to grow as overall internet access via a fixed computer will drop to 39% from 60% today.
While we believe this to be the case, the reality today is far different according to Sandvine, the broadband policy management company. This should cause the wireless carriers some concern as they look at future capacity costs. In their recent H2-2012 report Sandvine reveals that power smartphone users are already using 10x more than average smartphone users, or 317 megabytes a month vs 33. But even the former number is a far cry from the 7.3 gigabytes (over 20x) that the average person uses on their fixed broadband pipes (assuming 2.3 people per fixed broadband line). Sandvine estimates that total mobile access will grow from ~1 petabyte in H2-2012 to 17 petabytes by H2-2018.
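The multiples in that comparison are worth making explicit; a quick check using the Sandvine figures as quoted (the 2.3 persons-per-line divisor behind the per-person fixed number is the text's assumption):

```python
avg_smartphone_mb = 33      # average smartphone user, MB/month (Sandvine)
power_smartphone_mb = 317   # power smartphone user, MB/month (Sandvine)
fixed_per_person_gb = 7.3   # fixed broadband per person, GB/month

power_vs_avg = power_smartphone_mb / avg_smartphone_mb
fixed_vs_power = fixed_per_person_gb * 1000 / power_smartphone_mb
print(round(power_vs_avg, 1), round(fixed_vs_power, 1))
```

Power users consume roughly 10x the average, and even they sit at about 1/23rd of per-person fixed-line consumption, which is the headroom the carriers should be planning for.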
My own consumption, since moving from 3G to 4G and going from a 4 to a 4.7 inch screen, is a 10x increase: 1-2 gigs of 4G access and 3-6 gigs of wifi access, for a total of 4-8 gigs a month. This is because I have gone from a “store and forward” mentality to a 7x24 multimedia consumption model. And I am just getting comfortable with cloud-based access and streaming. All this sounds positive for growth and investment, especially as the other 95% of mobile users evolve to these usage levels, but it will do the carriers no good if they are not strategically and competitively well positioned to handle the demand. Look for a lot of development in the lower access and transport layers, including wifi offload and fiber and high-capacity microwave backhaul.
Since I began covering the sector in 1990, I’ve been waiting for Big Bang II. An adult flick? No, the sequel to Big Bang (aka the breakup of MaBell and the introduction of equal access) was supposed to be the breakup of the local monopoly. Well, thanks to the Telecom Act of 1996 and the well-intentioned farce that it was, that didn’t happen, and equal access officially died (equal access RIP) in 2005 with the Supreme Court's Brand-X decision vs the FCC. If it died, then we saw a resurrection that few noticed.
I am announcing that Equal Access is alive and well, albeit in a totally unexpected way. Thanks to Steve Jobs’ epochal demands put on AT&T to counter its terrible 2/3G network coverage and throughput, every smartphone has an 802.11 (WiFi) backdoor built-in. Together with the Apple and Google operating systems being firmly out of carriers’ hands and scaling across other devices (tablets, etc…), a large ecosystem of over-the-top (OTT), unified communications and traffic offloading applications is developing to attack the wireless hegemony.
First, a little history. Around the time of AT&T's breakup the government implemented 2 forms of equal access. Dial-1 in long-distance made marketing and application driven voice resellers out of the long-distance competitors. The FCC also mandated A/B cellular interconnect to ensure nationwide buildout of both cellular networks. This was extended to nascent PCS providers in the early to mid 1990s, leading to dramatic price declines and enormous demand elasticities. Earlier, the competitive WAN/IXC markets of the 1980s led to rapid price reductions and to monopoly (Baby Bell or ILEC) pricing responses that created the economic foundations of the internet in layers 1 and 2; aka flat-rate or "unlimited" local dial-up. The FCC protected the nascent ISPs by preventing the Bells from interfering at layer 2 or above. Of course this distinction of MAN/LAN "net-neutrality" went away with the advent of broadband, and today it is really just about WAN/MAN fights between the new (converged) ISPs or broadband service providers like Comcast, Verizon, etc... and the OTT or content providers like Google, Facebook, Netflix, etc...
(Incidentally, the FCC ironically refers to edge access providers, who have subsumed the term ISPs or "internet service providers", as "core" providers, while the over-the-top (OTT) messaging, communications, e-commerce and video streaming providers, who reside at the real core or WAN, are referred to as "edge" providers. There are way, way too many inconsistencies for truly intelligent people to a) come up with and b) continue to promulgate!)
But a third form of equal access, this one totally unintentioned, happened with 802.11 (WiFi) in the mid 1990s. The latter became "nano-cellular" in that power output was regulated, limiting hot-spot or cell-size to ~300 feet. This had the impact of making the frequency band nearly infinitely divisible. The combination was electric, and the market, unencumbered by monopoly standards and scaling along with related horizontal layer 2 data technologies (ethernet), quickly seeded itself. It really took off when Intel built WiFi capability directly into its Centrino chips in the early 2000s. Before then computers could only access WiFi with usb dongles or cables tethered to 2G phones.
Cisco just forecast that 50% of all internet traffic will be generated from 802.11 (WiFi) connected devices. Given that 802.11’s costs are 1/10th those of 4G, something HAS to give for the communications carrier. We’ve talked about them needing to address the pricing paradox of voice and data better, as well as the potential for real obviation at the hands of the application and control layer worlds. While they might think they have a near monopoly on the lower layers, Steve Jobs’ ghost may well come back to haunt them if alternative access networks/topologies get developed that take advantage of this equal access. For these networks to happen they will need to think digital; understand, project and foster vertically complete systems; and be able to turn the "lightswitch on" for their addressable markets.
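The economics behind "something HAS to give" can be sketched with a blended cost-per-bit calculation, combining the 1/10th cost ratio above with the earlier claim that 70% of mobile data access is over WiFi (both figures from the text; this is illustrative, not carrier cost accounting):

```python
wifi_share = 0.70      # share of mobile data over WiFi (per the text)
wifi_rel_cost = 0.10   # WiFi cost per bit relative to 4G (per the text)

# Blended cost per bit, with all-4G delivery normalized to 1.0
blended = wifi_share * wifi_rel_cost + (1 - wifi_share) * 1.0
print(round(blended, 2))  # 0.37
```

Under these assumptions, offload cuts the blended cost per bit by roughly 63% versus all-4G delivery, which is exactly why the carriers' lower-layer "near monopoly" is less defensible than it looks.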
The first quarter global smartphone stats are in and it isn’t even close. With the market growing more than 40%, Samsung controls 29% of the market and Apple 24%. The next largest, Nokia, came in 60-70% below the leaders at 8%, followed by RIMM at 7% and HTC at 5%, leaving the scraps (28%) to Sony, Motorola, LG and ZTE. They've all already lost on the scale front; they need to change the playing field.
While this spread sounds large and improbable, it is not without historical precedent. In 1914, just 6 years after its introduction, the Ford Model T commanded 48% market share. Even by 1923 Ford still had 40% market share. 2 years later the price stood at $260, which was 30% of the original model's 1908 price and less than 10% of what the average car cost in 1908; sounds awfully similar to Moore’s law and the pricing of computer/phone devices over the past 30 years. Also, a read on the Model T's technological and design principles sounds a lot like pages taken out of the book of Apple. Or is it the other way around?
Another similarity was Ford’s insistence on the use of black beginning in 1914. Over the life of the car 30 different variations of black were used! The color limitation was a key ingredient in the low cost, as prior to 1914 the company used blue, green, red and grey. Still, 30 variations of black (just like Apple’s choice of white and black only and take-it-or-leave-it siloed product approach) is impressive, and eerily similar to Dutch Master Frans Hals’ use of 27 variations of black, so inspirational to Van Gogh. Who says we can’t learn from history.
Ford’s commanding lead continued through 1925 even as competitors introduced many new features, designs and colors. Throughout, Ford was the price leader, but when the end came for that strategy it was swift. Within 3 years the company had completely changed its product philosophy, introducing the Model A (with 4 colors and no black) and running up staggering losses in 1927-28. But the company saw market share rebound from 30% to 45% in the process; something that might have been maintained for a while had the depression not hit.
The parallels between the smartphone and automobile seem striking. The networks are the roads. The pricing plans are the gasoline. Cars were the essential component for economic advancement in the first half of the 20th century, just as smartphones are the key for economic development in the first half of the 21st century. And now we are finding Samsung as Apple's GM; only the former is taking a page from both GM's and Ford's histories. Apple would do well to take note.
So what are the laggards to do to make it an even race? We don’t think Nokia’s strategy of coming out with a different color will matter. Nor do we think that more features will matter. Nor do we think it will be about price/cost. So the only answer lies in context; something we have raised in the past on the outlook for the carriers. More to come on how context can be applied to devices. Hint, I said devices, not smartphones. We'll also explore what sets Samsung and Apple apart from the rest of the pack.
I love talking to my smartphone and got a lot of grief for doing so from my friends last summer while on vacation at the shore. “There’s Michael talking crazy, again,” as I was talking TO my phone, not through it. “Let’s ask Michael to ‘look’ that up! Haha.” And then Siri came along and I felt vindicated and simultaneously awed. The latter by Apple’s (AAPL, OCF = 6.9x) marketing and packaging prowess; seemingly they had trumped Google/Android (GOOG, OCF = 9.6x) yet again. Or had they? What at first appeared to be a marketing coup may become Tim Cook’s Waterloo and sound a discordant note for Nuance (NUAN, OCF = 31x). At its peak in early February NUAN hit a high of 42x OCF, even as Apple stood at 8.2x.
The problem for Apple and Nuance in the short term is that the former is doubling down on its Siri bet with a brand new round of ads by well-known actors, such as Samuel L. Jackson and Zooey Deschanel. Advertising pundits noted this radical departure for a company that historically shunned celebrities. Furthermore, with two (NY and LA) class action suits against it and financial pundits weighing in on the potential consumer backlash, Apple could have a major Siri migraine in the second half. Could it be as big as Volvo’s advertising debacle 20 years ago; a proverbial worm in the apple? Time will tell.
The real problem isn’t with Apple, the phone or Siri and its technology DNA; rather, the problem lies with the bandwidth and performance of the wireless networks. Those debating whether Siri is a bandwidth hog are missing the point. Wireless spectrum is shared. The more people who are on the same spectrum band, the less bandwidth for each user and the higher the amount of noise. Both are bad for applications like Siri and Android’s voice recognition, since they talk with processors in the cloud (or WAN). Delays and missed packets have a serious impact on comprehension, translation and overall responsiveness. Being a frequent user of Google’s voice-rec, which has been around since August 2010, I know when to count on voice-rec by looking at the wifi/3G/2G indicator and the number of bars. But do others know this, and will the focus on Siri's performance shift to the network? Time will tell.
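The shared-spectrum point above can be sketched with some back-of-envelope arithmetic. This is an illustrative model only (the 5 Mbps figure is a hypothetical 3G cell, and real schedulers don't split capacity evenly), but it shows why a cloud-dependent app like voice-rec degrades as a cell fills up:

```python
# Illustrative sketch, not carrier data: per-user throughput on a shared
# cell, assuming capacity is split evenly among active users.

def per_user_kbps(cell_capacity_mbps: float, active_users: int) -> float:
    """Evenly divide shared downlink capacity; real schedulers are more complex."""
    return cell_capacity_mbps * 1000 / active_users

# Hypothetical cell with ~5 Mbps of usable shared downlink.
for users in (5, 25, 100):
    print(users, "users ->", per_user_kbps(5.0, users), "kbps each")
```

At 100 concurrent users, each gets roughly 50 kbps in this toy model; round-trip delays and packet loss pile on top of that, which is exactly what cloud-based speech recognition cannot tolerate.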
The argument that people know voice-rec is new and still in beta probably won’t cut it either. I am puzzled that major Apple fanboys don’t perceive it as a problem; it doesn't even make this list of 5 critical product issues for the company to address. But maybe that’s why companies like Apple are successful; they push the envelope. In the 1990s I told the WAN guys (Sprint, et al.) that they would have the advantage over the Baby Bells because the “cloud” or core was more important than the edge. The real answer, which Apple fully understands, is that the two go hand in hand. For Sprint (S, OCF = 3.2x) there were too many variables they couldn’t control, so they never rolled out voice recognition on wired networks. They probably should have taken the chance. Who knows, the “pin-drop” company could have been a whole lot better off!
A natural counterpart to "Siri" for the Droid crowd would be an "Eliza," after Audrey Hepburn's famous 'The Rain in Spain' character. Eliza is also the name of a company specializing in voice-rec healthcare apps and of a 1960s AI psychotherapy program.
Previously we have written about “being digital” in the context of shifting business models and approaches as we move from an analog world to a digital world. Underlying this change have been 3 significant tsunami waves of digitization in the communications arena over the past 30 years, underappreciated and unnoticed by almost all until after they had crashed onto the landscape:
The WAN wave between 1983-1990 in the competitive long-distance market, continuing through the 1990s;
The Data wave, itself a direct outgrowth of the first wave, began in the late 1980s with flat-rate local dial-up connections to ISPs and databases anywhere in the world (aka the Web);
The Wireless wave, beginning in the early 1990s, was a direct outgrowth of the first two. Digital cellphones were based on the same technology as the PCs that were exploding with internet usage. Likewise, super-low-cost WAN pricing paved the way for one-rate, national pricing plans; prices dropped from $0.50-$1.00 to less than $0.10. Back in 1996 we correctly modeled this trend before it happened.
Each wave may have looked different, but they followed the same patterns, building on each other. As unit prices dropped 99%+ over a 10-year period, unit demand exploded, resulting in 5-25% total market growth. In other words, as ARPu dropped, ARPU rose; u vs U, units vs Users. Elasticity.
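The arithmetic behind that elasticity claim is worth making explicit. Under the stated pattern (unit prices down 99%+ while the total market still grows), unit volume has to rise by roughly two orders of magnitude. A minimal sketch, with illustrative numbers:

```python
# Back-of-envelope elasticity: if unit price falls 99% over a decade while
# total revenue still grows ~10%, unit volume must rise ~110x.
# The 10% figure is illustrative; the post cites 5-25% total market growth.

price_drop = 0.99          # unit price falls to 1% of its starting level
revenue_growth = 0.10      # total market revenue still grows 10%

# revenue = price * units, so units scale by revenue_multiple / price_multiple
units_multiple = (1 + revenue_growth) / (1 - price_drop)
print(round(units_multiple), "x more units")  # ~110x
```

That ~110x explosion in units alongside growing revenue is exactly the "ARPu down, ARPU up" pattern described above.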
Yet with each new wave, people remained unconvinced about demand elasticity. They were just incapable of pivoting from the current view and extrapolating to a whole new demand paradigm. Without fail, demand exploded each time, coming from 3 broad areas: private-to-public shift, normal price elasticity, and application elasticity.
Private to Public Demand Capture. Monopolies are all about average costs and consumption, with little regard for the margin. As a result, they lose the high-volume customer who can develop their own private solution. This loss diminishes the scale economies of those who remain on the public, shared network, raising average costs; the network effect in reverse. Introducing digitization and competition drops prices and brings back not all, but a significant number, of these private users. Examples we can point to are private data and voice networks, private radio networks, private computer systems, etc., that all came back onto the public networks in the 1980s and 1990s. Incumbents can’t think marginally.
Normal Price Elasticity. As prices drop, people use more. It gets to the point where they forget how much it costs, since the relative value is so great. One thing to keep in mind is that lazy companies can rely too much on price and “all-you-can-eat” plans without regard for the real marginal-price-to-marginal-cost spread. The correct approach requires the right mix of pricing, packaging and marketing so that all customers at the margin feel they are deriving much more value than what they are paying for; thus generating the highest margins. Apple is a perfect example of this, as was Sprint’s famous “Dime” program. The failure of AYCE wireless data plans has led wireless carriers to implement arbitrary pricing caps, leading to new problems. Incumbents are lazy.
Application Elasticity. The largest and least definable component of demand is the new ways of using the lower-cost product that 3rd parties drive into the ecosystem. They are the ones that drive true usage via ease of use and better user interfaces. Arguably they ultimately account for 50% of the new demand, with the other two sources at 25% each. With each wave there has always been a large crowd of value-added resellers and application developers that more effectively ferret out new areas of demand. Incumbents move slowly.
Demand generated via these 3 mechanisms soaked up excess supply from the digital tsunamis. In each case competitive pricing was arrived at ex ante by new entrants developing new marginal-cost models and iterating future supply/demand scenarios. It is this ex ante competitive guess that so confounds the rest of the market both before and after the event. That's why few people recognize that these 3 historical waves are early warning signs for the final big one. The 4th and final wave of digitization will occur in the mid-to-last-mile broadband markets, yet many remain skeptical of what the "demand drivers" will be. These last-mile broadband markets are monopoly/duopoly controlled and have not yet seen the per-unit price declines realized in the prior waves. Jim Crowe of Level3 recently penned a piece in Forbes that speaks to this market failure. In coming posts we will illustrate where we think bandwidth pricing is headed, as people remain unconvinced about elasticity, just as before. But hopefully the market has learned from the prior 3 waves and will understand or believe in demand forecasts if someone comes along and says last-mile unit bandwidth pricing is dropping 99%. Because it will.
Is reality mimicking art? Is Android following the script from Genesis’ epochal hit Land of Confusion? Is it a bad dream on this day that happens once every four years? Yes, yes, and unfortunately no. Before I go into a litany of ills besetting the Android market and keeping Apple shareholders very happy, two points: a) I have an HTC Incredible and am a Droid fan, and b) the 1986 hit parodied superpower conflict and inept decisions by global leaders but presaged the fall of the Berlin Wall and 20+ years of incredible growth, albeit with a good deal of 3rd-world upheaval in the Balkans, Mid-east, and Africa. So maybe there is hope that out of the current state a new world order will arise as the old monopolies are dismantled.
Yesterday at MWC, Eric Schmidt prognosticated a world where more is better and cheaper; which may be good for Google but not necessarily the best thing for anyone else in the Droid ecosystem, including consumers. Yet at the same time Apple managed to steal the show with its iPad 3 announcement. Contrast this with HTC rolling out some awesome phones that will not be available in the US this year because their chip doesn't support 4G.
The answer is not better technology, but better ecosystems. The Droid device vendors should realize this and build a layer of software and standards above Google/ICS to facilitate interoperability across silos (at the individual, device and network level), instead of just depreciating their hardware value by competing on price and features many people do not want. They can then collectively win residual transaction streams (like collectively synching back to a Dropbox), as Apple does.
Examples include standardization and interoperability of free or subsidized wifi offload; over-the-top messaging, voice and other solutions; and the holy grail, mobile payments. Platforms like Cloud Foundry allow for cross-cloud application infrastructure support, and companies like AppFog and Iron Foundry are pursuing these approaches individually. But just think what would happen if Samsung, HTC, LG and Motorola were to band together, coordinate these approaches, and develop very low-cost balanced payment systems within the Droid ecosystem to promote interoperability and cooperation, counteract Google and restore some sanity to the market. Carriers (um, battleships?) will not be able to stop this effort and may even welcome it, just as the music industry opened its arms to Apple.
Apple hasn’t been an innovator so much as a great design company that understands big market opportunities and what the customer wants. The result is an established order that other industries and their customers clearly prefer. Millennials are too young to know Land of Confusion, but the current decision makers in the Droid ecosystem do, and they should take a lesson from history on this Leap Day. Hopefully we’ll wake up in 4 years to a wonderful new world order. Oh, and a Happy 4 Birthdays to everyone present and past who was born on this day.
Wireless service providers (WSPs) like AT&T and Verizon are battleships, not carriers. Indefatigable... and steaming their way to disaster even as the nature of combat around them changes. If over-the-top (OTT) missiles from voice and messaging application providers started fires on their superstructures, and WiFi offload torpedoes from alternative carriers and enterprises opened cracks in their hulls, then Dropbox bombs are about to score direct hits near their waterlines. The WSPs may well sink from new combatants coming out of nowhere with excellent synching and other novel end-user enablement solutions, even as pundits like Tomi Ahonen and others trumpet their glorious future. Full steam ahead.
Instead, WSP captains should shout “all engines stop” and rethink their vertical integration strategies to save their ships. A good start might be to look where smart VC money is focusing and figure out how they are outfitted at each level to defend against, or offensively incorporate, these rapidly developing new weapons. More broadly, WSPs should revisit the WinTel wars, which are eerily similar to the smartphone ecosystem battles, and see what steps IBM took to save its sinking ship in the early 1990s. One unfortunate possibility is that the fleet of battleships is now so widely disconnected that none has a chance to survive.
The bulls on Dropbox (see the pros and cons behind the story) suggest that increased reliance on cloud storage and synching will diminish reliance on any one device, operating system or network. This is the type of horizontalization we believe will continue to scale and undermine the (perceived) strength of vertical integration at every layer (upper, middle and lower). Extending the sea-battle analogy, horizontalization broadens the theatre of opportunity and threat away from the ship itself; exactly what aircraft carriers did for naval warfare.
Synching will allow everyone to manage and tailor their “states”, developing greater demand opportunity; something I pointed out a couple of months ago. People’s states could be defined a couple of ways, beginning with work, family, and leisure/social across time and distance, and extending to specific communities of (economic) interest. I first started talking about the “value of state” as Chief Strategist at Multex just as it was being sold to Reuters.
Back then I defined state as the information (open applications, communication threads, etc...) resident on a decision maker’s desktop at any point in time that could be retrieved later. Say I cover multiple industries: I am researching biotech in the morning and make a call to someone with a question. Hours later, after lunch meetings, I am working on chemicals when I get a call back with the answer. What’s the value of bringing me back automatically to the prior biotech state so I can better and more immediately incorporate and act on the answer? Quite large.
Fast forward nearly 10 years and people are connected 7x24, checking their wireless devices on average 150x/day. How many different states are they in during the day? 5, 10, 15, 20? The application world is just beginning to figure this out. Google, Facebook, Pinterest and others are developing data engines that facilitate “free access” to content and information paid for by centralized procurement; aka advertising. Synching across “states” will provide even greater opportunity to tailor messages and products to consumers.
Inevitably those producers (advertisers) will begin to require guaranteed QoS and availability levels to ensure a good consumer experience. Moreover, because of social media and BYOD, companies today are looking at their employees the same way they look at their consumers. The overall battlefield begins to resemble the 800 and VPN wars of the 1990s, when we had a vibrant competitive service provider market before its death at the hands of the 1996 Telecom Act (read this critique and another that questions the Bells' unnatural monopoly). Selling open, low-cost, widely available connectivity bandwidth into this advertising battlefield can give WSPs profit on every transaction/bullet/bit across their networks. That is the new “ship of state”: taking the battle elsewhere. Some call this dumb pipes; I call it a smart strategy to survive being sunk.
Data is just going nuts! Big data, little data, smartphones, clouds, application ecosystems. So why are Apple and Equinix two of only a few large-cap companies in this area with stocks up over 35% over the past 12 months, while AT&T, Verizon, Google and Sprint are market performers or worse? It has to do with pricing, revenues, margins and capex; all of which impact ROI. The former pair's ROI is going up while the latter group's is flat to declining. And this is all due to the wildness of mobile data.
Data services have been revealing flaws and weaknesses in the carriers' pricing models and networks for some time, but now the ante is being upped. Smartphones now account for almost all new phones sold, and soon they will represent over 50% of every carrier's base, likely ending this year over 66%. That might look good except when we consider these statistics and facts:
Streaming just 2 hours of music daily off a cloud service soaks up 3.5GB per month
Carriers still derive more than 2/3 of their revenues from voice.
Cellular wireless (just like WiFi) is shared.
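The streaming figure above checks out with simple bitrate arithmetic. A minimal sketch, assuming a typical compressed-audio stream of roughly 128 kbps (the bitrate is my assumption; the post states only the monthly total):

```python
# Rough check of the claim: ~128 kbps audio, 2 hours/day, 30 days/month.
STREAM_KBPS = 128                 # assumed compressed-audio bitrate
SECONDS_PER_MONTH = 2 * 3600 * 30 # 2 hours/day for 30 days

bytes_per_second = STREAM_KBPS * 1000 / 8
gb_per_month = bytes_per_second * SECONDS_PER_MONTH / 1e9
print(round(gb_per_month, 2), "GB/month")  # ~3.46, close to the 3.5 GB cited
```

So a single music habit, at a modest bitrate, consumes a few gigabytes a month on its own; video at far higher bitrates compounds the problem dramatically.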
Putting this together you can see that, on the one hand, a very small percentage of users consume the bulk of the network capacity. On the other, voice pricing and revenues are way out of sync with corresponding data pricing and revenues; especially as OTT IP voice and other applications become pervasive.
Furthermore, video, which is growing in popularity, will end up using 90% of the capacity, crowding out everything else, unless carriers change pricing to reflect differences in both marginal users and marginal applications. Marginal here = high-volume/leading-edge.
So how are carriers responding? By raising data prices. This started over a year ago as they began capping those “unlimited” data plans. Now they are raising the prices and doing so in wild and wacky ways; ways we think will come back to haunt them just like wild party photos on FB. Here are just two of many examples:
This past week AT&T simplified its pricing and scored a marketing coup by offering more for more and lowering unit prices, even as the media reported AT&T as “raising prices.” They sell you a bigger block of data at a higher initial price and then charge the same rate for additional blocks, which may or may not be used. Got that?
On the other hand, that might be better than Walmart’s new unlimited data plan, which requires PhD-level math skills to understand. Let me try to explain as simply as possible. Via T-Mobile, they offer 5GB/month at 3G speed; thereafter (the unlimited part) they throttle to 2G speed. But after March 16 the numbers change to 250MB initially at 3G, then unlimited 2G speeds after that. Beware the Ides of March’s consumer backlash!
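The block-pricing gimmick described above can be made concrete. The block size and price here are made-up numbers for illustration (not AT&T's or Walmart's actual terms); the point is that "lower price per GB offered" and "price per GB consumed" diverge sharply for light users:

```python
import math

# Hypothetical tiered plan: pay per block, whether or not the block is used.
def effective_price_per_gb(block_gb: float, block_price: float, used_gb: float) -> float:
    """Price per GB actually consumed; unused data in a block is still paid for."""
    blocks = max(1, math.ceil(used_gb / block_gb))
    return blocks * block_price / used_gb

# A bigger block at a higher price looks cheap per GB offered...
print(effective_price_per_gb(3.0, 30.0, 3.0))  # fully used block: $10/GB
# ...but a light user of the same block pays triple per GB consumed.
print(effective_price_per_gb(3.0, 30.0, 1.0))  # lightly used block: $30/GB
```

That spread between headline and effective pricing is exactly the kind of opacity that invites consumer backlash.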
Unless the carriers and their channels start coming up with realistic offload solutions, like France’s Free, and pricing that better matches underlying consumption, they will continue to generate lower or negative ROI. They need to get control of wild data. If they do not, the markets and customers will. With smartphones (like Apple's, which, by the way, drove WiFi as a feature knowing that AT&T's network was subpar) and cloud-based solutions (hosted by Equinix), it is becoming easier for companies like Republic Wireless to virtually bypass the expensive carrier plans on the carriers' very own networks. AT&T, VZ and Sprint will continue to be market performers at best.
“There is no clear definition, but usually the Singularity is meant as a future time when societal, scientific and economic change is so fast we cannot even imagine what will happen from our present perspective, and when humanity will become posthumanity.” Futurist Ray Kurzweil wrote The Singularity Is Near in 2005, and it sure felt that way at CES in Las Vegas in 2012.
In any event, we didn’t come up with this association. Matt Burns at TechCrunch did, during one of the live-streamed panel discussions from the show floor. But given that we’ve been at the forefront of mobile devices and networks for 2 decades, we couldn’t have said it any better. After all this time and all these promises, it really felt like all the devices were connected and this app-ecosystem-mass was beginning to move at a pace and direction that none of us could fully explain.
Vernor Vinge, a sci-fi writer, coined the expression back in the early 1990s. When you look at the list of publications at the Singularity Institute, one can’t help but think that the diaspora of different applications and devices self-organizing out of CES 2012 is the beginning of AI. Will AI develop out of the search algorithms that tailor to each individual's specific needs, possibly out-thinking or out-guessing the individual in the process?
We probably won’t know the answer for at least a decade, but what we do know is that we are about to embark on a wave of growth, development and change that will make the past 30 year WinTel-Internet development look embryonic at best. After all the Singularity is infinity in mathematical terms. Hold onto your seats.