SpectralShifts Blog 
Sunday, March 30 2014

Why App Coverage Will Drive Everything

Given the smartphone’s ubiquity and our dependence on it, “App Coverage” (AC) is something confronting us every day, yet we know little about it.  At the CCA Global Expo this week in San Antonio, Glenn Laxdal of Ericsson spoke about “app coverage,” a concept the vendor first surfaced in 2013.  AC is defined as “the proportion of a network’s coverage that has sufficient performance to run a particular app at an acceptable quality level.”  In other words, the variety of demand from end-users for voice, data and video applications is outpacing the ability of carriers to keep up.  According to Ericsson, monitoring and ensuring the performance of app coverage is the next wave in LTE networks.  Here’s a good video explaining AC in simple, visual terms.
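To make the definition concrete, here is a minimal sketch of how app coverage could be computed from drive-test or crowd-sourced measurements.  This is not Ericsson's methodology; the thresholds, sample data and function names are illustrative assumptions of mine.

```python
# Hypothetical sketch: estimate "app coverage" from per-location measurements.
# App coverage = share of measured locations that meet an app's performance needs.

# Example per-app requirements (illustrative thresholds, not Ericsson's numbers).
APP_REQUIREMENTS = {
    "voice":    {"min_kbps": 12,  "max_latency_ms": 150},
    "music":    {"min_kbps": 160, "max_latency_ms": 400},
    "hd_video": {"min_kbps": 760, "max_latency_ms": 200},
}

def app_coverage(samples, app):
    """samples: list of dicts with 'kbps' and 'latency_ms' measured across the footprint."""
    req = APP_REQUIREMENTS[app]
    ok = sum(1 for s in samples
             if s["kbps"] >= req["min_kbps"] and s["latency_ms"] <= req["max_latency_ms"])
    return ok / len(samples)

# Toy measurement set: three locations in the coverage area.
samples = [
    {"kbps": 2500, "latency_ms": 60},   # strong LTE
    {"kbps": 300,  "latency_ms": 120},  # cell edge
    {"kbps": 40,   "latency_ms": 500},  # indoor, congested
]

for app in APP_REQUIREMENTS:
    print(f"{app}: {app_coverage(samples, app):.0%} app coverage")
```

The same footprint scores very differently by app, which is the whole point of the metric: "coverage" is no longer a single number.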

Years, nay, decades ago I used to say coverage should be measured in 3 important ways:

  • Geographic (national vs regional vs local)
  • In/Outdoors (50+% loss indoors)
  • Frequency (roughly double the capex at 1900 MHz vs 700 MHz)

Each of these had specific supply/demand clearing implications across dozens of issues affecting balance sheets and P&L statements, ultimately determining winners and losers.  They are principally why AT&T and Verizon today hold 70% of subscribers (80% of enterprise customers), up from 55% just 5 years ago, along with 84% of total profit and over 100% of industry free cash flow.  Now we can add “applications” to that list.  And it will only make it more challenging for competitors to wrest share from the “duopoly”.

Cassidy Shield of Alcatel-Lucent further stated that fast-follower strategies against the duopoly would likely fail, implying that radical rethinking was necessary.  Some of that came quickly in the form of Masayoshi Son’s announcement of a broad partnership with NetAmerica and members of CCA covering preferred roaming, concerted network buildout, sharing of facilities and device purchase agreements.  This announcement came two weeks after Son visited Washington, DC and laid out Sprint’s vision for a new, more competitive wireless future in America.

The conference concluded with a panel of CEOs hailing Sprint’s approach, which Son outlined here, as that of a benevolent dictator (perhaps not the best choice of words) and repeating the mantra partner, partner, partner; something Terry Addington of MobileNation has said has taken way too long.  Even so, the panel agreed that pulling off partnerships will be challenging.

The Good & Bad of Wireless

Wireless is great because it is all things to all people, and that is what makes it bad too.  Planning for, and accounting for, how users will access the network is very challenging across a wide user base.  There are fundamentally different “zones” and contexts in which different apps can be used, and they often conflict with network capacity and performance.  I used to say that one could walk, or even hang upside down from a tree, and talk, but you couldn’t “process data” doing those things.  Of course the smartphone changed all that, and people are now accessing their music apps, location services, searches and purchases, and watching video from anywhere; even hanging upside down in trees.

Today voice, music and video consume 12, 160 and 760 kbps of bandwidth, respectively, on average.  Tomorrow those numbers might be 40, 500 and 1,500, and that’s not even taking into account “upstream” bandwidth, which will be even more of a challenge for service providers to provision as consumers expect more 2-way collaboration everywhere.  The law of wireless gravity, which states that bits will seek out fiber/wire as quickly and cheaply as possible, will apply, necessitating sharing of facilities (wireless and wired), heterogeneous networks (HetNets), and aggressive WiFi offload approaches; even consumers will be shared in the form of managed services across communities of users (known today as OTT).  The show agenda included numerous presentations on distributed antenna networks and WiFi offload applied to the rural coverage challenge.
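As a rough back-of-the-envelope sketch, the per-app bitrates above can be translated into aggregate busy-hour demand on a cell sector; the user count and activity mix below are my own illustrative assumptions, not figures from the conference.

```python
# Back-of-the-envelope busy-hour demand per cell sector (illustrative assumptions only).

APP_KBPS_TODAY    = {"voice": 12, "music": 160, "video": 760}
APP_KBPS_TOMORROW = {"voice": 40, "music": 500, "video": 1500}

# Hypothetical: 200 active users on a sector at the busy hour, with this activity mix.
ACTIVE_USERS = 200
ACTIVITY_MIX = {"voice": 0.30, "music": 0.25, "video": 0.15}  # idle/other: 0.30

def busy_hour_demand_mbps(app_kbps, users=ACTIVE_USERS, mix=ACTIVITY_MIX):
    kbps = sum(users * share * app_kbps[app] for app, share in mix.items())
    return kbps / 1000.0

print(f"Today:    {busy_hour_demand_mbps(APP_KBPS_TODAY):6.1f} Mbps per sector")
print(f"Tomorrow: {busy_hour_demand_mbps(APP_KBPS_TOMORROW):6.1f} Mbps per sector")
```

Under these assumptions the same sector goes from roughly 30 Mbps to over 70 Mbps of busy-hour demand, which is why offload and sharing stop being optional.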

Developing approaches ex ante to anticipate demand is even more critical if carriers want to play major roles in the internet of things, unified (video) communications and the connected car.  As Ericsson states in its whitepaper,

“App coverage integrates all aspects of network performance – including radio network throughput and latency, capacity, as well as the performance of the backhaul, packet core and the content-delivery networks. Ultimately, managing app coverage and performance demands a true end-to-end approach to designing, building and running mobile networks.”

Posted by: Michael Elling AT 10:37 am
Monday, March 17 2014

A New Visionary In Our Midst?

The US has lacked a telecom network visionary for nearly 2 decades.  There have certainly been strong and capable leaders, such as John Malone, who not only predicted but brought about the 500-channel LinearTV model.  But there hasn’t been someone like Bill McGowan, who broke up AT&T, or Craig McCaw, who first had the vision to build a national, seamless wireless network, countering decades of provincial, balkanized thinking.  Both of them fundamentally changed the thinking around public service provider networks.

But with a strong message to the markets in Washington, DC on March 11 from Masayoshi Son, Sprint’s Chairman, the 20-year wait may finally be over.  Son did what few have been capable of doing in the 15-20 years since McGowan exited stage left and McCaw sold out to Ma Bell: telling it like it is.  The fact is that today’s bandwidth prices are 20-150x higher than they should be given current technology.

This is no one’s fault in particular and in fact to most people (even informed ones) all measures of performance-to-price compared to 10 or 20 years ago look great.  But, as Son illustrated, things could be much, much better.  And he’s willing to make a bet on getting the US, the most advanced and heterogeneous society, back to a leadership role with respect to the ubiquity and cost of bandwidth.  To get there he needs more scale and one avenue is to merge with T-Mobile.

There have been a lot of naysayers as to the possibility of a Sprint-T-Mo hookup, including leaders at the FCC.  But don’t count me as one; it needs to happen.  Initially skeptical when the rumors first surfaced in December, I quickly reasoned that a merger would be the best outcome for the incentive auctions.  A merger would eliminate spectrum caps as a deterrent to active bidding and maximize total proceeds.  It would also have a better chance of developing a credible third competitor with equal geographic reach. Then in January the FCC and DoJ came out in opposition to the merger.

In February, though, Comcast announced its much-rumored merger with TW, and Son jumped on the opportunity to take his case for merging to a broader stage.  He did so in front of a packed room of 300 communications pundits, press and politicos at the US Chamber of Commerce’s prestigious Hall of Flags; a poignant backdrop for his own rags-to-riches story.  Son’s frank honesty about the state of broadband for the American public versus the rest of the world, as well as about Sprint’s own miserable current performance, was impressive.  It’s a story that resonates with my America’s Bandwidth Deficit presentation.

Here are some reasons the merger will likely pass:
  • The FCC can’t approve one horizontal merger (Comcast/TW) that brings much greater media concentration and control over content distribution, while disallowing a merger of two small players (really irritants as far as AT&T and Verizon are concerned).
  • Son has a solid track record of disruption and doing what he says.
  • The technology and economics are in his favor.
  • The vertically integrated service provider model will get disrupted faster and sooner, as Sprint will have to think outside the box, partner, and develop ecosystems that few in the telecom industry have thought about before; or, if they have, they’ve been constrained by institutional inertia and hidebound by legacy regulatory and industry silos.

Here are some reasons why it might not go through:

  • The system is fundamentally corrupt.  But the new FCC Chairman is cast from a different mold than his predecessors and is looking to make his mark on history.
  • The FCC shoots itself in the foot over the auctions.  Given all the issues and sensitivities around incentive auctions the FCC wants this first one to succeed as it will serve as a model for all future spectrum refarming issues. 
  • The FCC and/or DoJ find in the public interest that the merger reduces competition.  But any analyst can see that T-Mo and Sprint do not have sustainable models at present on their own; especially when all the talk recently in Barcelona was already about 5G.

Personally, I want Son’s vision to succeed because it’s the vision I had in 1997 when I originally brought the 2.5-2.6 GHz (MMDS) spectrum to Sprint, and later in 2001 and 2005 when I introduced Telcordia’s 8x8 MIMO solutions to their engineers.  Unfortunately, past management regimes at Sprint were incapable of understanding the strategies and future vision that went along with those investment and technology pitches.  Son has a different perspective (see in particular minute 10 of this interview with Walt Mossberg), with his enormous range of investments and clear understanding of price elasticity and the marginal cost of minutes and bits.

To be successful Sprint’s strategy will need to be focused, but at the same time open and sharing in order to simultaneously scale solutions across the three major layers of the informational stack (aka the InfoStack):

  • upper (application and content)
  • middle (control)
  • lower (access and transport)

This is the challenge for any company that attempts to disrupt the vertically integrated telecom or LinearTV markets; the antiquated and overpriced ones Son says he is going after in his presentation.    But the US market is much larger and more robust than the rest of the world, not just geographically, but also from a 360 degree competitive perspective where supply and demand are constantly changing and shifting.

Ultimate success may well rest in the control layer, where Apple and Google have already built up formidable operating systems that control vastly profitable settlement systems across multiple networks.  What few realize is that the current IP stack does not provide price signals and settlement systems that clear supply and demand between upper and lower layers (north-south) or between networks (east-west) in the newly converged “informational” stack of 1- and 2-way content and communications.

If Sprint’s Chairman realizes this and succeeds in disrupting those two markets with his strategy then he certainly will be seen as a visionary on par with McGowan and McCaw.

Posted by: Michael Elling AT 09:58 am
Friday, August 17 2012

How To Develop A Blue Ocean Strategy In A Digital Ecosystem

Back in 2002 I developed a 3-dimensional macro/micro framework-based strategy for Multex, one of the earliest and leading online providers of financial information services.  The result was that Multex sold itself to Reuters in a transaction that benefited both companies.  1+8 indeed equaled 12.  What I proposed to the CEO was simple: do “this” to grow into a $500m company, or sell yourself.  After 3-4 weeks of mulling it over, he took a plane to London and sold his company rather than undertake the “this”.

What I didn’t know at the time was that the “this” was a Blue Ocean Strategy (BOS) of creating new demand by connecting previously unconnected qualitative and quantitative information sets around the “state” of the user.  For example, a portfolio manager might be focused on biotech stocks in the morning and make outbound calls to analysts to answer certain questions.  Then the PM goes to a chemicals lunch and returns to focus on industrial products in the afternoon, at which point one of the biotech analysts gets back to him.  Problem: the PM’s mental and physical “state,” or context, is gone.  Multex had the ability to build a tool that could bring the PM back to his morning “state” in his electronic workplace.  The result: faster and better decisions.  Greater productivity, possibly better performance, definite value.

Sounds like a great story, except there was no BOS in 2002; the concept wasn’t published until 2005.  But the second slide of my 60-slide strategy deck to the CEO had this quote from the authors of BOS, W. Chan Kim and Renee Mauborgne of INSEAD, the Harvard Business School of Europe:

“Strategic planning based on drawing a picture…produces strategies that instantly illustrate if they will: stand out in the marketplace, are easy to understand and communicate, and ensure that every employee shares a single visual reference point.”

So you could argue that I anticipated the BOS concept to justify my use of 3D frameworks which were meant to illustrate this entirely new playing field for Multex.

But this piece is less about the InfoStack’s use in business and sports and more about the use of the 4Cs and 4Us of supply and demand as tools within the frameworks to navigate rapidly changing and evolving ecosystems.  And we use the BOS graphs postulated by Kim/Mauborgne.  The 4Cs and 4Us let someone introducing a new product, horizontal layer (exchange) or vertical market solution (service integration) figure out optimal product, marketing and pricing strategies and tactics a priori.  A good example is a BOS I created for a project I am working on in the area of WiFi offload and HetNets (heterogeneous access networks that can be self-organizing) called HotTowns (HOT).  Here’s a picture of it comparing 8 key supply and demand elements across fiber, 4G macro cellular and super-saturation offload in a rural community.  Note that the "blue area" representing the results of the model can be enhanced on the capacity front by fiber and on the coverage front by 4G.
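For readers who want to sketch this kind of comparison themselves, here is a minimal BOS-style "value curve" built from the 4Cs and 4Us.  The scores are invented placeholders on a 0-10 scale, not the actual HotTowns model values.

```python
# Minimal Blue-Ocean-style "value curve" comparison across the 4Cs (supply) and
# 4Us (demand).  Scores are hypothetical placeholders, NOT the HotTowns model outputs.

FACTORS = ["cost", "coverage", "capacity", "clarity",                        # 4Cs (supply)
           "usability", "usable contexts", "ubiquity", "universal appeal"]   # 4Us (demand)

ALTERNATIVES = {
    "fiber":              [3, 2, 10, 9, 6, 3, 2, 5],
    "4G macro":           [5, 8,  4, 6, 7, 8, 8, 7],
    "WiFi offload (HOT)": [9, 6,  8, 7, 8, 7, 6, 7],
}

def value_curve(name, scores):
    # Print a crude text "strategy canvas" row per factor.
    print(name)
    for factor, score in zip(FACTORS, scores):
        print(f"  {factor:18s} {'#' * score} ({score})")

for name, scores in ALTERNATIVES.items():
    value_curve(name, scores)
```

Plotting (or even just printing) the curves side by side makes the trade-offs visible at a glance, which is exactly what the Kim/Mauborgne quote above is getting at.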

The same approach can be used to rate mobile operating systems and any other product at a boundary of the infostack or horizontal or vertical solution in the market.  We'll do some of that in upcoming pieces.


Posted by: Michael Elling AT 09:49 am
Sunday, February 19 2012

The sports world is a great place to test and prove out management tactics and strategies.  Unlike in the commercial world, where the reasons for and measurement of product success are highly variable and of uncertain duration, sports leagues provide tightly controlled competitive conditions that enable consistent quantitative comparisons.  As a result, one can clearly see winning strategies and tactics based on season-over-season performance.  None is more remarkable than Sir Alex Ferguson’s record at Manchester United, probably the best record of any team/coach in the top leagues of any sport over the past 2+ decades.  Others have written about his management style here and here.

A great leader is one who gets the best from himself, his employees and his customers regardless of the situation or context, be it sports teams, armies, corporations, academic institutions or societies.  SAF's record is all the more remarkable in that he has improved the game of soccer at the same time.  Ultimately everyone is subservient to and bound by the objective of the team, including SAF himself.  SAF's style is best described as a disciplined command-and-control approach that leads to employee empowerment.  SAF has grown, bought, developed and nurtured superstars, and when they've outgrown the team, they are gone.

In his own words SAF says detail is important.  He is probably an acolyte of Edward Tufte, who says more information is better.  Through analytics SAF looks to gain the proverbial inch to give him the advantage over other teams.   In the process his staff has increased from 8 to 30 people over the past 2 decades.  In order to synthesize the information and implement strategy leading to individual decision making SAF has implemented an organizational matrix approach and not a management pyramid (or inverted one for that matter).  The result is that on game day ManU teams are capable of making decisions on the field and have the resolve to come from behind more often than not.

Furthermore, soccer records and trophies are won not just by outright wins, as in most other sports, but by a combination of wins, goal differentials and draws.  As well, home and away records are typically very important, and a better-than-50% winning percentage away from home means a lot.  In the table of SAF's points analysis we see that he achieved his remarkable record not just because he won, but because he managed to lose fewer than half his games away from home.  This record was achieved because he managed the goal differential ("for" over total goals in the last column) significantly in his team's favor away from home.  Anybody who knows the game of soccer knows the manager's strategy and tactics before and during the game typically have more of an impact on goal differential than luck.  The chart illustrates that ManU's goals-for percentage mirrors the total winning points percentage.  For those who don't understand the subtleties of soccer's scoring, a 0-0 draw away from home can be the deciding factor in a trophy if a home win is managed.  Furthermore, drawing 0-0 away from home to a big club is synonymous with a victory during the regular season.  (Of course, an away goal in a cup competition is worth more than a home goal.)

So how did SAF manage this?  As a soccer purist, I love the style of play that is uptempo and attacking.  As a former defender I love when ManU’s right or left defenders become 3rd and 4th attackers up the wings.  But at the same time, striking the correct balance offensively and defensively is key.  By establishing a matrix approach SAF made it easier to move players around between and during games and to identify and increase their value relative to other players around them, so that a weakness in one area can be countered with strength in another.  Furthermore, the matrix approach empowers weak players and makes superstars out of "ordinary" players like Ryan Giggs and Paul Scholes; whom many would never have guessed to have such an impact on the team, or the game, as individuals.

In a matrix, employees have a 3D view of their decisions and the impact on the rest of the organization (resource constraints/availability).  From a tactical perspective (micro level) it enables forwards to play midfield and help on defense and defenders to develop the attack and even score.  A good example is a player like Valencia who used to just be a wing at another team, but has become a very dangerous midfielder and at times has started at right-back for an injury-riddled defense this season.  Strategically (at a macro level), the matrix defines key technology/product (supply) and customer/market (demand) parameters.  With a matrix approach micro and macro perspectives can be easily related.  For instance, SAF would never have let an individual player take over an entire organization’s public perception as Luis Suarez has at Liverpool this season.

Because of the matrix, SAF’s strongest player is often the one others consider weakest.  Rather than focusing on his superstars, he focuses on getting the superstars to work with the weakest players and vice versa.  Bringing this closer to home and more timely, the New York Knicks should implement a matrix approach at both the management/organizational and team levels.  Just look at the benefit the team gets from a player considered weak, Jeremy Lin, who passes the ball effectively to very talented players, versus superstars like Carmelo Anthony or Baron Davis who have a hard time being “matrix” players.  Every organization should develop its matrix and understand how employees play together as a team and how the organization is positioned in competitive industries.

Related Reading

Josh Linkner: Next Play, Mike Krzyzewski and Linkedin

John Doerr of KPCB used a football example for OKRs at Google

No player is above the team according to SAF

Why soccer is so hard to predict

Posted by: Michael Elling AT 10:58 am
Sunday, February 12 2012

Last week we revisited our seminal analysis from 1996 of the 10-cent wireless minute plan (400 minutes for C$40) introduced by Microcell of Canada and came up with the investment theme titled “The 4Cs of Wireless”.  To generate sufficient ROI, wireless needed to replace wireline as a preferred access method/device (PAD).  Wireless would have to satisfy minimal cost, coverage, capacity and clarity requirements to disrupt the voice market.  We found:

  • marginal cost of a wireless minute (all-in) was 1.5-3 cents
  • dual-mode devices (coverage) would lead to far greater penetration
  • software-driven and wideband protocols would win the capacity and price wars
  • CDMA had the best voice clarity (QoS); pre-dating Verizon’s “Can you hear me now” campaign by 6 years

In our model we concluded (and mathematically proved) that demand elasticity would drive consumption to 800 MOUs/month and average ARPUs north of $70, from the low $40s.  It all happened within 2 short years; at least as perceived by the market when wireless stocks were booming in 1998.  But in 1996 the pricing was viewed as the kiss of death for the wireless industry by our competitors on the Street.  BTW, Microcell, the innovator, was at a disadvantage based on our analysis: the very reason it went to the aggressive pricing model to fill the digital pipes, namely lack of coverage due to a single-mode GSM phone, ended up being its downfall.  Coverage for a "mobility" product trumped price, as we see below.
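As a quick sanity check on that math, here is a minimal sketch using the round numbers cited in these pieces (roughly 50 cents and 80 MOUs/month before digital pricing, 10 cents and 700-800 MOUs/month after); the arc-elasticity framing is my own, not the original 1996 model.

```python
import math

# Round numbers from the post: ~50c/min and ~80 MOUs/month before digital pricing,
# 10c/min and 700-800 MOUs/month afterwards.  Sanity-check sketch, not the 1996 model.
price_before, mou_before = 0.50, 80
price_after,  mou_after  = 0.10, 750

arpu_before = price_before * mou_before      # ~$40/month
arpu_after  = price_after  * mou_after       # ~$75/month

# Implied (log) price elasticity of demand for minutes.
elasticity = math.log(mou_after / mou_before) / math.log(price_after / price_before)

print(f"ARPU before: ${arpu_before:.0f}, ARPU after: ${arpu_after:.0f}")
print(f"Implied price elasticity of demand: {elasticity:.2f}")
```

An elasticity magnitude above 1 is precisely the condition under which a price cut grows revenue, which is why ARPU rose toward $70-80 rather than falling.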

What we didn’t realize at the time was that the 4Cs approach was broadly applicable to supply of communication services and applications in general.  In the following decade, we further realized the need for a similar checklist on the demand side to understand how the supply would be soaked up and developed the 4Us of Demand in the process.  We found that solutions and services progressed rapidly if they were:

  • easy to use interface
  • usable across an array of contexts
  • ubiquitous in terms of their access
  • universal in terms of appeal

Typically, most people refer only to the user interface (UI) or user experience (UX), but those aren't granular enough to accommodate the enormous range of demand at the margin.  Look at any successful product or service introduction over the past 30 years and it scored high on all 4 demand elements.  The most profitable and self-sustaining products and solutions have been those that maximized perceived utility versus marginal cost.  Apple is the most recent example of this.

Putting the 4Cs and 4Us together in an iterative fashion is the best way to understand clearing of marginal supply and demand ex ante.  With rapid depreciation of supply (now in seconds, minutes and days) and infinitely diverse demand in digital networked ecosystems getting this process right is critical.

Back in the 1990s I used to say the difference between wireless and wired networks was like turning on a lightswitch in a dark room filled with people.  Reaction and interaction (demand) could be instantaneous for the wireless network.  So it was important to build out rapidly and load the systems quickly.  That made them generative and emergent, resulting in exponential demand growth.  (Importantly, this ubiquity resulted from interconnection mandated by regulations from the early 1980s and extended to new digital entrants (dual-mode) in the mid 1990s).  Conversely a wired network was like walking around with a flashlight and lighting discrete access points providing linear growth.

The growth in adoption we are witnessing today from applications like Pinterest, Facebook and Instagram (underscored in this blogpost from Fred Wilson) is like stadium lights compared with the candlelight of the 1990s.  What took 2 years is taking 2 months.  You’ll find the successful applications and technologies score high on the 4Cs and 4Us checklists before they turn the lights on and join the iOS and Android parties.

Related Reading:
Fred Wilson's 10 Golden Principles

Posted by: Michael Elling AT 11:37 am
Sunday, February 05 2012

You all know Monsieurs (MM.) Moore et Metcalfe.  But do you know Monsieur (M.) Zipf?  I made his acquaintance whilst researching infinite long tails.  Why does he matter, you inquire?  Because M. Zipf brings some respectability to Moore et Metcalfe, who can get a little out of control from time to time. 

Monsieur Moore is an aggressive chap who doubles his strength every 18 months or so and isn’t shy about it.  Monsieur Metcalfe has an insatiable appetite, and every bit he consumes increases his girth substantially.  Many people have made lots of money from MM Moore’s et Metcalfe’s antics over the past 30 years.  The first we refer to generally as the silicon or processing effect, the latter as the network effect.  Putting the two together should lead to declines of 50-60% in cost for like performance or throughput.  Heady, rather piggish stuff!

Monsieur Zipf, on the other hand, isn’t one for excess.  He follows a rather strict regimen; one that applies universally to almost everything around us, be it man-made or natural.  M. Zipf isn’t popular because he is rather unsocial.  He ensures that whatever one person has, the next chap can have only half as much, and the next chap half of that, and so on.  It’s a decreasing, undemocratic principle.  Or is it?

Despite his unpopularity, lack of obvious charm and people’s general ignorance of him, M. Zipf’s stature is about to rise.  Why?  Because of the smartphone and everyone’s desire to always be on and connected; things MM Moore and Metcalfe wholeheartedly support.

M. Zipf is related to the family of power law distributions.  Over the past 20 years, technologists have applied his law to understanding network traffic.  In a time of plenty, like the past 20 years, M. Zipf’s not been that important.  But after 15 years of consolidation and relative underinvestment we are seeing demand outstrip supply and scarcity is looming.  M. Zipf can help deal with that scarcity.

The capacity crunch will only get worse as LTE (4G) devices explode onto the scene in 2012; not only because of improved coverage, better handsets and improved Android (ICS), but mostly because of the iconic iPhone 5 coming this summer!  Here’s the thing with 4G phones: they have bigger screens and they load stuff 5-10x faster.  So what took 30 seconds now takes 3-10 seconds to load, and stuff will look 2-3x better!  People will get more and want more; much to MM Moore’s et Metcalfe’s great pleasure.

“Un moment!” cries M. Zipf.  “My users already skew access quite a bit and this will just make matters worse!  Today, 50% of capacity is used by 1% of my users.  The next 9% use 40% and the remaining 90% of users use just 10% of capacity.  With 4G the inequality can only get worse.  Indignez-vous!” (the latest French outcry for equality).  It turns out Zipf's law is actually democratic in that each person consumes at their marginal, not average, rate.  The latter is socialism.
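To see how a Zipf-like (power law) distribution produces exactly this kind of skew, here is a minimal sketch.  The population size and exponent are arbitrary illustrative choices, tuned only to show heavy concentration rather than to match real carrier data.

```python
# Sketch: rank users by a Zipf-like usage distribution and measure concentration.
# Population size and exponent are illustrative, not fitted to real carrier data.

N = 100_000            # subscribers on a hypothetical network
s = 1.0                # Zipf exponent; usage of rank-k user is proportional to 1/k^s

usage = [1.0 / (k ** s) for k in range(1, N + 1)]   # already sorted, heaviest first
total = sum(usage)

def share_of_top(fraction):
    top = int(N * fraction)
    return sum(usage[:top]) / total

for f in (0.01, 0.10, 1.00):
    print(f"Top {f:>5.0%} of users consume {share_of_top(f):.0%} of capacity")
```

With these particular parameters the top 1% take roughly 60% of capacity and the top 10% roughly 80%, the same neighborhood as the figures M. Zipf complains about above.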

Few of us will see this distribution of usage as a problem short-term, except when we’re on overloaded cellsites and out of reach of a friendly WiFi hotspot.  The carriers will throw more capex at the problem and continue to price inefficiently and ineffectively.  The larger problem will become apparent within 2 years, when the 90% begin consuming like the 10% and the carriers tell Wall Street they need to invest another $50B after 2015, just after spending $53B between 2010 and 2014.

Most people aware of this problem say there is a solution.  More Spectrum = more Bandwidth to satisfy MM Moore et Metcalfe.  But they’ve never heard of M. Zipf nor understood fully how networks are used.  Our solution, extended as a courtesy by M. Zipf, is to “understand the customer” and work on “traffic offloading” at the margin.  Pricing strategies, some clever code, and marketing are the tools to implement a strategy that can minimize the capital outlays, and rapidly amortize investment and generate positive ROI.

We’ve been thinking about this since 1996, when we first introduced our 4Cs of Wireless (cost, coverage, capacity and clarity) while analyzing, understanding and embracing 10-cent wireless pricing (introduced by French Canada's revolutionary Microcell).  As a result we were 2-3 years ahead of everybody with respect to penetration, consumption and wireline substitution thinking and forecasts.  Back in 1995 the best wireless prices were 50 cents per minute, and that only for buying a lot of local access.  Long-distance and roaming charges applied.  So a corporate executive who travelled a lot would regularly rack up $2,000-3,000 monthly phone bills.  The result was less than 10% penetration, 80 minutes of use per month, and ARPUs declining from $45 to $40 to $35 in analysts' models, because the marginal customers being added to the network were using the devices infrequently and more often than not putting them into the glove compartment in case of emergencies.  Fewer than 3% of the population actually used the devices more than once a day.

We used to poll taxi drivers continuously about wireless and found that their average perceived price of $0.75 per minute was simply too high to justify not having to pull over and use a payphone for $0.25.  So that was the magical inflection point in the elasticity curves.  When Microcell introduced $0.10 pricing late in the spring of 1996 and we polled the same set of users, they invariably got so excited we were barely able to avoid an accident.  So we reasoned and modeled that more than just taxi drivers would use wireless as a primary access device.  And use it a lot.  This wireless/wireline substitution would result in consumption of 700-800 minutes of use per month, penetration quickly hitting 100%, and ARPUs, rather than declining, actually increasing to $70.  The forecast was unbelievably bullish.  And of course no one believed it in 1996, even though all those numbers were mostly reached within 5 years.

But we also recognized that wireless was a two-edged sword with respect to localized capacity and throughput, taking into account the above 3 laws.  So we also created an optimal zone, or location-based, pricing and selling plan that increased ARPUs and effective yield and was vastly superior to all-you-can-eat (AYCE) and eat-what-you-want (EWYW) plans.  Unfortunately, carriers didn't understand or appreciate M. Zipf, and within 2 years they were giving away night and weekend minutes for free, where they could have monetized them for 3-6 cents each.  Then some carriers responded by giving away long-distance (whose marginal cost, without access, was less than that of a minute, but could still run 2-3 cents).  Then AT&T responded with the One Rate plan, which destroyed roaming surcharges and led to one rate everywhere; even though demand was different everywhere.

Here’s a snapshot of that analysis, which is quite simple, consistent with Zipf's law and highly applicable today.  Unfortunately, where my approach would have kept effective yield at 8 cents or higher, the competitive carriers responded by going to all-you-can-eat (AYCE) plans and the effective yield dropped to 4 cents by 2004.  Had intercarrier SMS not arrived in the 2003-04 timeframe, they would all have been sunk by those pricing models, as they were in the middle of massive 2G investment programs for the coming "wireless data explosion", which actually didn't happen until 3G and smartphones in the 2008-2009 timeframe.  It was still a voice and BlackBerry (texting and email) world in 2007 when the iPhone hit.  With ubiquitous SMS and people's preference to text instead of leaving voicemail, minutes dropped from 700 to 500, lowering carriers' costs, and they were able to generate incremental revenues on SMS pricing plans (called data) in the 2004-2007 timeframe.
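Effective yield here is simply billed revenue divided by minutes actually consumed.  A minimal sketch with hypothetical plan parameters (not the original snapshot data) shows why AYCE pricing collapsed it from 8 cents to 4:

```python
# Effective yield = monthly billed revenue / minutes actually consumed.
# Plan parameters below are hypothetical, chosen to illustrate the mechanism,
# not taken from the original analysis.

def effective_yield(monthly_bill, minutes_used):
    return monthly_bill / minutes_used

# Zone/usage-based plan: price roughly tracks consumption.
zone_plan = effective_yield(monthly_bill=60.0, minutes_used=750)    # ~8 cents/min

# All-you-can-eat plan: flat fee while heavy users pile on minutes.
ayce_plan = effective_yield(monthly_bill=50.0, minutes_used=1250)   # ~4 cents/min

print(f"Zone-based plan yield: {zone_plan * 100:.1f} cents/min")
print(f"AYCE plan yield:       {ayce_plan * 100:.1f} cents/min")
```

The flat fee doesn't fall, but the heaviest (Zipf-distributed) users drag the realized per-minute yield down, which is the crux of the pricing mistake described above.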

All that said, the analysis and approach are even more useful today, since extreme consumption of data will tend to occur disproportionately in the fixed mode (what many refer to as offload).  I'll let you come up with your own solutions.  À bientôt!  Oh, and look up Free in France to get an idea of where things are headed.  What is it about these French?  Must be something about Liberté, égalité, fraternité.

Related Reading:
It's All in Your Head by Robert Metcalfe

JD Power Surveys on Wireless Network Performance

The Law of the Few (the 20/80 rule) from the Tipping Point

Briscoe, Odlyzko get it wrong because of one dimensional thinking and mixing apples and oranges

See Larry Downes' Law of Disruption

Law of Bandwidth (but only in a period of monopoly)

See our own home cooked Law of Wireless Gravity

Keck's law of fiber capacity (may be coming to an end?)

Kurzweil's 20 Laws of Telecosm (how many right and wrong also depends on timing)

Posted by: Michael Elling AT 11:38 am
Sunday, January 29 2012

Every institution, every industry, every company has or is undergoing the transformation from analog to digital.  Many are failing, superseded by new entrants.  No more so than in the content and media industries: music, retail, radio, newspaper and publishing.  But why, especially as they’ve invested in the tools and systems to go digital?  Their failure can be summed up by this simple quote, “Our retail stores are all about customer service, and (so and so) shares that commitment like no one else we’ve met,” said Apple’s CEO. “We are thrilled to have him join our team and bring his incredible retail experience to Apple.”

Think about what Apple’s CEO emphasized.  “Customer service.”  Not selling; and yet the stores accounted for $15B of product sold in 2011!  When you walk into an Apple store it is like no other retailing experience, precisely because Apple stood the retail model on its head.  Apple thought digital: it sold not just 4 or 5 products (yes, that’s it) but rather 4-5 ecosystems that let the individual easily tailor a unique experience from beginning to end.

Analog does not scale.  Digital does.  Analog is manual.  Digital is automated.  Analog cannot easily be software defined and repurposed.  Digital can.  Analog is expensively two-way.  With Digital 2-way becomes ubiquitous and synchronous.  Analog is highly centralized.  Digital can be easily distributed.  All of this drives marginal cost down at all layers and boundary points meaning performance/price is constantly improving even as operator/vendor margins rise.

With Digital the long tail doesn’t just become infinite, but gives way to endless new tails.  The (analog) incumbent sees digital as disruptive, with per unit price declines and same-store revenues eroding.  They fail to see and benefit from relative cost declines and increased demand.  The latter invariably occurs due to a shift from "private" to public consumption, normal price elasticity, and "application" elasticity as the range of producers and consumers increases.  The result is overall revenue growth and margin expansion for every industry/market that has gone digital.

Digital also makes it easy for something that worked in one industry to be easily translated to another.   Bill Taylor of Fast Company recently wrote in HBR that keeping pace with rapid change in a digital world requires having the widest scope of vision, and implementing successful ideas from other fields.

The film and media industries are a case in point.  As this infographic illustrates Hollywood studios have resisted “thinking digital” for 80 years.  But there is much they could learn from the transformation of other information/content monopolies over the past 30 years.  This blog from Fred Wilson sums up the issues between the incumbents and new entrants well.  Hollywood would do well to listen and see what Apple did to the music industry and how it changed it fundamentally; because it is about to do the same to publishing and video.  If not Apple then others.

Another aspect of digital is the potential for user innovation.   Digital companies should constantly be looking for innovation at the edge.  This implies a focus on the “marginal” not average consumer.  Social media is developing tremendous tools and data architectures for this.  If companies don’t utilize these advances, those same users will develop new products and markets, as can be seen from the comments field of this blog on the financial services industry.

Digital is synonymous with flat which drives greater scale efficiency into markets.  Flat (horizontal) systems tend toward vertical completeness via ecosystems (the Apple or Android or WinTel approach).  Apple IS NOT vertically integrated.  It has pieced together and controls very effectively vertically complete solutions.  In contrast, vertically integrated monopolies ultimately fail because they don’t scale at every layer efficiently.  Thinking flat (horizontal) is the first step to digitization.

Related Reading:
Apple Answers the Question "Why?" Before "How?" And "What?"
US Govt to Textbook Publishers: You Will Go Digital!
This article confused vertical integration with vertical completeness
Walmart and Retailers Need to Rethink Strategy
Comparison of 3 top Fashion Retailers Web and App Strategies

Software Defined Networking translated to the real world

Posted by: Michael Elling AT 10:54 am
Sunday, January 22 2012

Data is just going nuts!  Big data, little data, smartphones, clouds, application ecosystems.  So why are Apple and Equinix two of only a few large-cap companies in this area with stocks up over 35% over the past 12 months, while AT&T, Verizon, Google and Sprint are market performers or worse?  It has to do with pricing, revenues, margins and capex, all of which impact ROI.  The former pair’s ROI is going up while the latter group’s is flat to declining.  And this is all due to the wildness of mobile data.

Data services have been revealing flaws and weaknesses in the carriers' pricing models and networks for some time, but now the ante is being upped.  Smartphones now account for almost all new phones sold, and soon they will represent over 50% of every carrier's base, likely ending this year at over 66%.  That might look good except when we look at these statistics and facts:

  • 1% of wireless users use 50% of the bandwidth, while the top 10% account for 90%.  That means 9 out of 10 users account for only 10% of network consumption; clearly overpaying for what they get.
  • 4G smartphone displays (720x1280 pixels) allow video viewing that uses 300x more capacity than voice.
  • Streaming just 2 hours of music daily off a cloud service soaks up roughly 3.5GB per month (see the quick arithmetic after this list).
  • Carriers still derive more than 2/3 of their revenues from voice.
  • Cellular wireless (just like WiFi) is shared.
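The streaming figure is easy to check.  A quick sketch, assuming typical music-stream bitrates of 128-160 kbps (my assumption, not a carrier statistic), reproduces the roughly 3.5GB number:

```python
# Quick check on monthly data usage from streaming music off a cloud service.
# The 128-160 kbps bitrates are assumed typical stream qualities, not carrier data.

def monthly_gb(bitrate_kbps, hours_per_day, days=30):
    bits = bitrate_kbps * 1000 * hours_per_day * 3600 * days
    return bits / 8 / 1e9   # bits -> bytes -> GB (decimal)

for kbps in (128, 160):
    print(f"{kbps} kbps, 2 h/day: {monthly_gb(kbps, 2):.1f} GB/month")
```

At 128 kbps that works out to about 3.5 GB a month, and a higher-quality 160 kbps stream pushes past 4 GB.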

Putting this together you can see that on the one hand a very small percentage of users use the bulk of the network.  Voice pricing and revenues are way out of sync with corresponding data pricing and revenues; especially as OTT IP voice and other applications become pervasive. 


Furthermore, video, which is growing in popularity, will end up using 90% of the capacity, crowding out everything else unless carriers change pricing to reflect differences in both marginal users and marginal applications.  Marginal here = high-volume/leading-edge.

So how are carriers responding?  By raising data prices.  This started over a year ago as they started capping those “unlimited” data plans.  Now they are raising the prices and doing so in wild and wacky ways; ways we think that will come back to haunt them just like wild party photos on FB.  Here are just two of many examples:

  • This past week AT&T simplified its pricing and scored a marketing coup by offering more for more and lowering prices even as the media reported AT&T as “raising prices.”  They sell you a bigger block of data at a higher initial price and then charge the same rate for additional blocks which may or may not be used.  Got that?
  • On the other hand that might be better than Walmart’s new unlimited data plan which requires PhD level math skills to understand.  Let me try to explain as simply as possible.  Via T-Mobile they offer 5GB/month at 3G speed, thereafter (the unlimited part) they throttle to 2G speed.  But after March 16 the numbers will change to 250MB initially at 3G, then 2G speeds unlimited after that.  Beware the Ides of March’s consumer backlash!

Unless the carriers and their channels start coming up with realistic offload solutions, like France's Free, and pricing that better matches underlying consumption, they will continue to generate lower or negative ROI.  They need to get control of wild data.  Furthermore, if they do not, the markets and customers will.  With smartphones (like Apple's, which by the way drove WiFi as a feature knowing that AT&T's network was subpar) and cloud-based solutions (hosted by Equinix), it is becoming easier for companies like Republic Wireless to virtually bypass the expensive carrier plans using the carriers' very own networks.  AT&T, VZ and Sprint will continue to be market performers at best.

Related Reading/Viewing:
AT&T pricing relies heavily on breakage
Useful stats on data growth from MobileFuture
FoxNews report on AT&T Data Throttling
This article actually suggests "dissuading usage" as 1 of 4 solutions
Consumer reports article whereby data equivalent voice pricing = 18 cents, OTT on lowest plan = 6.6 cents, OTT on highest plan = 1 cent
New shared data plans set a new high standard

Posted by: Michael Elling AT 09:13 am
Sunday, January 15 2012

“There is no clear definition, but usually the Singularity is meant as a future time when societal, scientific and economic change is so fast we cannot even imagine what will happen from our present perspective, and when humanity will become posthumanity.”  Futurist Ray Kurzweil wrote The Singularity Is Near in 2005, and it sure felt that way at CES in Las Vegas in 2012.

In any event, we didn’t come up with this association.  Matt Burns at TechCrunch did during one of the live streamed panel discussions from the show floor.  But given that we’ve been at the forefront of mobile devices and networks for 2 decades, we couldn’t have said it any better.  After all this time and all these promises, it really felt like all the devices were connected and this app-ecosystem-mass was beginning to move at a pace and direction that none of us could fully explain.

Vernor Vinge, a sci-fi writer, coined the expression back in the early 1990s.  When you look at the list of publications at the Singularity Institute, one can't help but think that the diaspora of different applications and devices self-organizing out of CES 2012 is the beginning of AI.  Will AI develop out of the search algorithms that tailor to each individual's specific needs, possibly out-thinking or out-guessing the individual in the process?

We probably won’t know the answer for at least a decade, but what we do know is that we are about to embark on a wave of growth, development and change that will make the past 30 year WinTel-Internet development look embryonic at best.  After all the Singularity is infinity in mathematical terms.  Hold onto your seats.

Posted by: Michael Elling AT 09:10 am
Sunday, January 08 2012

Yahoo executed on the portal concept better than any other company and at one time was worth $113B.  Today it is worth $17.6B and trades at 12.9x operating cash flow.  But its interests in Chinese and Japanese subsidiaries are worth $19B pretax, or $13B after tax.  Either way YHOO is absurdly cheap on $5B of revenue, $1.4B of cash flow and 700 million users.  So new CEO Thompson has a chance to pull off the turnaround of the century for a company that many find hard to define.

What would we do in his shoes?

First, we would go back to what made Yahoo so valuable in people’s eyes to begin with, namely an approach that used others’ content and turned it into gold via traffic streams and advertising revenue.  As a tools and application company they have been pretty poor at monetizing their innovations.  Ultimately, the chief reason people came to Yahoo was the breadth and easy accessibility of the information, making people shout “yahoo” in joy.  The brand is a great brand after all, one of the real internet originals, and it will fix itself.

So what is the “new” content today?  Applications.  Yahoo should evolve from content portal to "app-portal".  More specifically, we are about to witness an explosion of cross-application solutions and commerce that will develop entire new ecosystems of revenue opportunity.  And these will exist across multiple platforms and screens, as one writer points out.  Facebook is trying to harness this opportunity with Open Graph, but already people might be concerned that it is just too much confusion, overkill and TMI.

Yahoo can not only remove that confusion but also allow people to tailor their own environments.  HTML5 and connectivity across smartphone, TV, tablet and PC/ultrabook will pave the way.  Understanding and harnessing the big data required to do this is one of Thompson’s key attributes.  At the same time, virtual currencies can be created to monetize this interactivity beyond mere advertising, and that’s what Thompson brings from PayPal.

With over 70 different services/tools and 14,000 employees, Thompson will definitely have to restructure, reduce and refocus in order to have the resources to aggressively develop an app-portal future.

Posted by: Michael Elling AT 08:43 am
Thursday, January 05 2012

Counter-intuitive thinking often leads to success.  That’s why we practice and practice, so that at a critical moment we are not governed by intuition (chance) or emotion (fear).  There is no better example of this than skiing; an apt metaphor this time of year.  Few self-propelled sports provide such high risk-reward, requiring mental, physical and emotional control.  To master skiing one has to master a) the fear of staying square (looking/pointing) downhill, b) keeping one’s center over (or keeping forward on) the skis, and c) keeping a majority of pressure on the downhill (or danger-zone) ski/edge.  Master these 3 things and you will become a marvelous skier.  Unfortunately, all 3 run counter to our intuitions, driven by fear and the safety of the woods at the side of the trail, leaning back, and climbing back up the hill.  Overcoming any one is tough.

What got me thinking about all this was a Vint Cerf (one of the godfathers of the Internet) Op-Ed in the NYT this morning which a) references major internet access policy reports and decisions, b) mildly supports the notion of the Internet as a civil not human right, and c) trumpets the need for engineers to put in place controls that protect people’s civil (information) rights.  He is talking about policy and regulation from two perspectives, business/regulatory and technology/engineering, which is confusing.  In the process he weighs in, at a high level, on current debates over net neutrality, SOPA, universal service and access reform, from his positions at Google and IEEE and addresses the rights and governance from an emotional and intuitive sense.

Just as with skiing, let’s look at the issues critically, unemotionally and counter-intuitively.  We can’t do it all in this piece, so I will establish an outline and framework (just like the 3 main ways to master skiing) and we’ll use that as a basis in future pieces to expound on the above debates and understand corporate investment and strategy as 2012 unfolds.

First, everyone should agree that the value of networks goes up geometrically with each new participant.  It’s called Metcalfe’s law, or Metcalfe’s virtue.  Unfortunately people tend to focus on the scale economies and cost of networks, rarely the value.  It is hard to quantify that value because most have a hard time understanding elasticity and projecting unknown demand.  Further, few distinguish marginal from average cost.  The intuitive thing for most is to focus on supply, because people fear the unknown (demand).
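As a toy illustration of that first principle: the n-squared term below is the textbook statement of Metcalfe's law, while the linear per-subscriber cost is my own simplifying assumption, chosen only to contrast value with cost as participants are added.

```python
# Toy comparison of Metcalfe's law (network value grows with the number of
# possible connections, ~n^2) against a linear cost assumption (my simplification).

def metcalfe_value(n, value_per_link=1.0):
    return value_per_link * n * (n - 1) / 2   # number of possible pairwise links

def network_cost(n, cost_per_subscriber=100.0):
    return cost_per_subscriber * n

for n in (10, 100, 1_000, 10_000):
    v, c = metcalfe_value(n), network_cost(n)
    print(f"n={n:>6}: value ~ {v:>12,.0f}  cost ~ {c:>12,.0f}  value/cost = {v / c:8.2f}")
```

The crossover, where pairwise connection value overtakes linear cost, is exactly what gets missed when the focus stays on supply-side cost.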

Second, everyone needs to realize that there is a fundamental problem with policy making in that (social) democrats tend to support and be supported by free market competitors, just as (conservative) republicans have a similar relationship with socialist monopolies.  Call it the telecom regulatory paradox.  This policy paradox is a function of small business vs big business, not either sides’ political dogma; so counter-intuitive and likely to remain that way.

Third, the internet was never open and free.  Web 1.0 resulted principally from a judicial action and a series of regulatory access frameworks/decisions in the mid to late 1980s that resulted in significant unintended consequences in terms of people's pricing perception.  Markets and technology adapted to and worked around inefficient regulations.  Policy makers did not create or herald the internet, wireless and broadband explosions of the past 25 years.  But in trying to adjust or adapt past regulation they are creating more, not less, inefficiency, no matter how well intentioned their precepts.  Accept it as the law of unintended consequences.  People feel more comfortable explaining results from intended actions than something unintended or unexplainable.

So, just like skiing, we’ve identified 3 principles of telecoms and information networks that are counter-intuitive or run contrary to accepted notions and beliefs.  When we discuss policy debates, such as net neutrality or SOPA, and corporate activity such as AT&T’s aborted merger with T-Mobile or Verizon’s spectrum and programming agreement with the cable companies, we will approach and explain them in the context of Metcalfe’s Virtue (demand vs supply), the Regulatory Paradox (vertical vs horizontal orientation; not big vs small), and  the law of unintended consequences (particularly what payment systems stimulate network investment).  Hopefully the various parties involved can utilize this approach to better understand all sides of the issue and come to more informed, balanced and productive decisions.

Vint supports the notion of a civil right (akin to universal service) to internet access.  This is misguided and unachievable via regulatory edict/taxation.  He also argues that there should be greater control over the network.  This is disingenuous in that he wants to throttle the openness that resulted in his godchild’s growth.  But consider his positions at Google and IEEE.  A “counter-intuitive” combination of competition, horizontal orientation and balanced payments is the best approach for an enjoyable and rewarding experience on the slopes of the internet and, who knows, may ultimately and counter-intuitively offer free access to all.  The regulators should be like the ski patrol, ensuring the safety of all.  Ski school is now open.

Related reading:
A Perspective from Center for New American Security

Network Neutrality Squad (NNsquad) of which Cerf is a member

Sad State of Cyber-Politics from the Cato Institute

Bike racing also has a lot of counter-intuitive moments, like when your wheel locks with the rider in front.  Here's what to do!

Posted by: Michael Elling AT 01:23 pm
Sunday, December 18 2011

(The web is dead, long live the apps)

Is the web dead?  According to George Colony, CEO of Forrester, at LeWeb (Paris, Dec 7-9) it is; and on top of that social is running out of time, and social is where the enterprise is headed.  A lot to digest at once, particularly when Google’s Schmidt makes a compelling case for a revolutionary smartphone future that is still in its very, very early stages; courtesy of an ice cream sandwich.

Ok, so let’s break all this down.  The Web, dead?  Yes, Web 1.0 is officially dead, replaced by a mobile, app-driven future.  Social is saturated?  Yes, call it Social 1.0, and Social 2.0 will be utilitarian.  Time is money, knowledge is power.  Social is really knowledge, and that’s where enterprises will take the real-time, always-connected aspect of the smartphone and its ice cream sandwich applications that harness internal and external knowledge bases for rapid product development and customer support.  Utilitarian.  VIVA LA REVOLUTION!

Web 1.0 was a direct outgrowth of the breakup of AT&T, the US’ second revolution, 30 years ago, coinciding ironically with the bicentennial of the end of the 1st revolution.  The bandwidth bottleneck of the 1960s and 1970s (the telephone monopoly tyranny) that gave rise to Microsoft and Intel processing at the edge vs the core began to reverse course in the late 1980s and early 1990s as a result of flat-rate data access and an unlimited universe of things to easily look for (aka Web 1.0).  This flat-rate pricing was a direct competitive response by the RBOCs to the competitive WAN (low-cost, metered) threat.

As silicon scaled via Moore’s law (the WinTel sub-revolution) digital mobile became a low-cost, ubiquitous reality.  The same pricing concepts that laid the foundation for web 1.0 took hold in the wireless markets in the US in the late 1990s; courtesy of the software defined, high-capacity CDMA competitive approach (see pages 34 and 36) developed in the US.

The US is the MOST important market in wireless today and THE reason for its leadership in applications and smart cloud.  (Incidentally, it appears that most of LeWeb speakers were either American or from US companies.)  In the process the relationship between storage, processing and network has come full circle (as best described by Ben Horowitz).  The real question is, “will the network keep up?”  Or are we doomed to repeat the cycle of promise and dashed hopes we witnessed between 1998-2003?

The answer is, “maybe”; maybe the communications oligopolies will liken themselves to IBM in front of the approaching WinTel tsunami in 1987.  Will Verizon be the service provider that recognizes the importance of and embraces openness and horizontalization?  The 700 MHz auctions and recent spectrum acquisitions and agreements with the major cable companies might be a sign that it does.

But a bigger question is whether Verizon will adopt what I call a "balanced payment (or settlement) system" and move away from IP/ethernet’s "bill and keep" approach.  A balanced payment or settlement system for network interconnection simultaneously solves the issue of new service creation AND paves the way for applications to directly drive and pay for network investment.  So unlike Web 1.0, where communication networks were reluctantly pulled into a broadband present, maybe they can actually make money directly off the applications, instead of the bulk of the value accruing to Apple and Google.

Think of this as an “800” future on steroids or super advertising, where the majority of access is paid for by centralized buyers.  It’s a future where advertising, product marketing, technology, communications and corporate strategy converge.  This is the essence of what Colony and Schmidt are talking about.   Will Verizon CEO Seidenberg, or his rivals, recognize this?  That would indeed be revolutionary!

Related Reading:
February 2011 Prediction by Tellabs of Wireless Business Models Going Upside Down by 2013

InfoWeek Article on Looming Carrier Bandwidth Shortages


Posted by: Michael Elling AT 09:56 am
Sunday, December 11 2011

Look up the definition of information and you’ll see a lot of terminology circularity.  It’s all-encompassing and tough to define.  It’s intangible, yet it drives everything we do.  But information is pretty useless without people; in fact it doesn’t really exist.  Think about the tree that fell, unseen, in the forest.  Did it really fall?  I am interested in the velocity of information, its impact on economies, societies, institutions and as a result in the development of communication networks and exchange of ideas.

Over the past several years I have increasingly looked at the relationship between electricity and communications.  The former is the number one ingredient for the latter.  Ask anybody in the data-center or server farm world.  The relationship is circular.  One wonders why the NTIA under its BTOP program didn’t figure that out; or at least talk to the DOE.  Both spent billions separately, instead of jointly.  Gee, why didn’t we add a 70 kV line when we trenched fiber down that remote valley?

Cars, in moving people (information) around,  are a communications network, too; only powered by gasoline.  Until now.  The advent of electric vehicles (EV) is truly exciting.  Perhaps more than the introduction of digital cell phones nearly 20 years ago.  But to realize that future both the utility and auto industries should take a page from the competitive wireless playbook.

What got me thinking about all this was a  NYT article this week about Dan Akerson, a former MCI CFO  and Nextel CEO, who has been running (and shaking up) GM over the past 15 months.  It dealt specifically with Dan’s handling of the Chevy Volt fires.  Knowing Dan personally, I can say he is up to the task.  He is applying lessons learned from the competitive communications markets to the competitive automotive industry.  And he will win.

But will he and the automotive industry lose because of the utility industry?  You see, the auto industry, the economy and the environment have a lot to gain from the development of electric vehicles.  Unfortunately the utility industry, which is 30 years behind the communications and IT revolution in "digitizing" its business model, is not prepared for an EV future.  Ironically, utilities stand in the way of their own long-term success, as EVs would boost demand dramatically.

A lot has been spent on a "smart grid" with few meaningful results.  Primarily this is because most of the efforts and decisions are being driven by insiders who do not want to change the status quo.  That status quo includes little knowledge of the consumer, a 1-way mentality, and a focus on average peak production and consumption.  Utilities and their vendors loathe risk, consider "real time" to be 15 minutes (going down to 5), and view the production and consumption of electricity as paramount.  To them the smart grid typically means the opposite, namely a reduction in revenues.

So it's no surprise that they are building a smart grid which gives the consumer neither choice, flexibility and control, nor the ability to contribute to electricity production and be rewarded for being efficient and socially responsible.  Nor do they want a lot of big data to analyze and make the process even more efficient.  Funny, those are all byproducts of the competitive communications and IT industries we've become accustomed to.

So maybe once Dan has solved GM's problems and recognizes the obstacles facing an electric vehicle future, he will focus his interests, and those of his private equity brethren, on developing a market-driven smart grid; not one your grandmother's utility would build.

By the way, here's a "short", and by no means exhaustive, list of the alliances and organizations involved in developing standards and approaches to the smart grid.  Note: they are dominated by incumbents, and no two have the same membership!

 

Electricity Advisory Committee
Gridwise Alliance
Gridwise Architecture Council
NIST SmartGrid Architecture Council
NIST SmartGrid Advisory Committee
NIST SmartGrid Interoperability Panel
North American Energy Standards Board (NAESB)
SmartGrid Task Force Members (Second list under Smartgrid.gov)
Global SmartGrid Federation
NRECA SmartGrid Demonstration
IEEE SmartGrid Standards
SmartGrid Information Clearinghouse

Posted by: Michael Elling AT 10:52 am   |  Permalink   |  0 Comments  |  Email
Sunday, December 04 2011

Be careful what you wish for this holiday season?  After looking at Saks Fifth Avenue's "Snowflake & Bubbles" holiday window and sound and light display, I couldn't help but think of a darker subtext.  I had to ask the question answered infamously by Rolling Stone back in 2009: "who are the bubble makers?"  The fact that this year's theme was the grownup redux of last year's child fantasy, focusing on the "makers", was also striking.  An extensive Google search reveals that NO ONE has tied either year's bubble theme to manias in the broader economy or to the 1%.  In fact, the New York Times called the windows "new symbols of joy and hope."  Only one article referenced the recession and hardship for many people as a stark backdrop for such a dramatic display.  Ominously, one critic likened it to the "Nutcracker with bubbles", and we all know what happened to Tsarist Russia soon thereafter.

The light show created by Iris is spectacular and portends what I believe to be a big trend in the coming decade, namely using the smartphone to interact with signs and displays in the real world.  It is not unimaginable that every device will soon have a wifi connection and be controllable via an app from a smartphone.  Using the screen to type a message or draw an illustration that appears on a sign is already happening.  CNBC showcased the windows as significant commercial and technical successes, which they were.  Ironically the 1% appear to be doing just fine as Saks reported record sales in November.

Perhaps the lack of critical commentary has something to do with how quickly Occupy Wall Street rose and fell.  Are we really living in a Twitter world?  Fascinated and overwhelmed by trivia and endless information?  At least the displays were sponsored by FIAT, who is trying to revive two brands in the US market simultaneously, focusing on the very real-world pursuit of car manufacturing.  The same, unfortunately, cannot be said about MasterCard, (credit) bubble makers extraordinaire.  Manias and speculative bubbles are not new and they will not go away.  I’ve seen two build first hand and know that little could have been done to prevent them.  So it will be in the future.

One was the crash in 1987 of what I like to call the “bull-sheet market of the 1980s”.  More than anything, the 1980s was marked by the ascendance of the spreadsheet as a forecasting tool.  Give a green kid out of business school a tool to easily extrapolate logarithmic growth and you’ve created the ultimate risk deferral process; at least until the music stops in the form of one down year in the trend.  Who gave these tools out and blessed their use?  The bubble makers (aka my bosses).  But the market recovered and went to significant new highs (and speculative manias).

Similarly, a new communications paradigm (aka the internet) sprang to life in the early to mid 1990s as a relatively simple store-and-forward, database look-up solution.  By the end of the 1990s there was nothing the internet could not do, especially if communications markets remained competitive.  I remember the day in 1999 when Jeff Bezos said, in good bubble-maker fashion, that "everyone would be buying goods from their cellphones" as a justification for Amazon's then astronomical value of $30bn.  I was (unfortunately) smart enough to know that scenario was a good 5-10 years in the future.  10 years later it was happening and AMZN recently exceeded $100bn, but not before dropping below $5bn in 2001, along with $5 trillion of wealth evaporating from the market.

If the spreadsheet and internet were the tools of the bubble makers in the 1980s and 1990s, then wireless was the primary tool of the bubble makers in the 2000s.  Social media went into hyperdrive with texting, tweeting and 7x24 access from 3G phone apps.  Arguably, wireless mobility increased people's transiency and ability to move around, aiding the housing bubble.  So what is the primary tool of the bubble makers in the 2010s?  Arguably it is and will be the application ecosystems of iOS and Android.  And what could make for an ugly bubble/burst cycle?  Lack of bandwidth and lack of efficient clearinghouse systems (payments) for connecting networks.

Posted by: Michael Elling AT 08:51 am   |  Permalink   |  0 Comments  |  Email
Sunday, November 20 2011

Are We Stressing the Environment?

Two major global concerns are the price of oil and the level of carbon emissions. The US DOE conservatively estimates that oil will be consistently between $110 and $120 per barrel by the end of the decade. Population is the key driver, as evidenced by the chart to the left comparing growth in population since 1900 to the production of oil.  Note in particular that despite conservation efforts in the mid to late 1970s, production has matched population growth over the past 25 years. Supporting the general upward trend in demand, and hence prices, the UN expects population to continue to expand for the foreseeable future, as shown in the figure below.  From there we see that population will rise from the current 6.5B to between 8B and 10B by 2040.  That would imply production exceeding 100 million barrels per day.
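
As a rough check on that implication, here is a back-of-the-envelope sketch.  The population figures come from the paragraph above; the starting production level of roughly 85 million barrels per day is my assumption, since the post's chart is not reproduced here.

```python
# Back-of-the-envelope check: if oil production keeps tracking population
# (as the post argues it has for the past 25 years), what does 2040 imply?
# The ~85 mb/d starting point is an assumption, not a figure from the post.
current_population_bn = 6.5
projected_population_bn = (8.0, 10.0)   # UN range cited in the post
current_production_mbd = 85.0           # assumed current level, mb/d

for pop in projected_population_bn:
    implied = current_production_mbd * pop / current_population_bn
    print(f"Population {pop}B -> implied production ~{implied:.0f} mb/d")
# Both ends of the range land above 100 mb/d, consistent with the post.
```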

Additionally, and perhaps more alarming, is the increase in CO2 levels and average temperatures from the late 1800s through the present, shown in the figure below.  The critical number for long-term environmental sustainability is 350 ppm of CO2.  As can be seen from the chart, that level was surpassed around 1990 and the concentration now exceeds 370 ppm; up roughly 110 ppm over the past 130 years.

Electricity production accounts for 1/3 of all CO2 production.  The 2011 U.S. EIA Energy Outlook Report states that electricity currently accounts for 40% of total residential delivered energy consumption in the U.S., with both residential and commercial consumption projected to increase 1.2% annually from 2010 to 2035 (not including significant electric vehicle penetration). This growth will require over 200 GW of additional electrical generating capacity. With 40% of this capacity already under construction, and assuming current construction costs for a gas turbine plant with transmission facilities of $700/kW, additional electric generation costs will approach $90 billion in today's dollars, or $750 per U.S. household.
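
The arithmetic behind those figures can be reproduced roughly as follows; the ~120 million household count is backed out of the post's own $90 billion and $750-per-household figures rather than taken from another source.

```python
# Rough reproduction of the EIA-derived capacity cost figures cited above.
additional_capacity_gw = 200          # new capacity needed through 2035
share_not_yet_built = 1 - 0.40        # 40% already under construction
cost_per_kw = 700                     # gas turbine plant + transmission, $/kW

remaining_kw = additional_capacity_gw * share_not_yet_built * 1e6  # GW -> kW
total_cost = remaining_kw * cost_per_kw
print(f"Remaining build-out: ~${total_cost/1e9:.0f}B")  # ~$84B, i.e. approaching $90B

households = 120e6   # implied by $90B / $750 per household; an assumption, not an EIA figure
print(f"Per household: ~${90e9/households:.0f}")         # ~$750
```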

This represents both an energy problem and an opportunity for utilities, their customers and society as a whole.  Electric utilities and their customers need to focus on conservation and smart-grid solutions to offset the rise in prices and take advantage of new technologies making alternative energy and electric vehicles more economic. The incremental power generation cost of $750/HH can instead be invested in home energy management systems, at the same time reducing the total amount of CO2 that is generated.

Related Reading:

Map of US showing locations of renewable energy production

Map of US showing over 6400 facilities producing most CO2

 

Posted by: Michael Elling AT 08:00 am   |  Permalink   |  Email
Sunday, November 13 2011

A humble networking protocol 10 years ago, packet-based Ethernet (invented at Xerox in 1973) has now ascended to the top of the carrier networking pyramid over traditional circuit-based voice (time-division) protocols, due to the growth in data networks (storage and application connectivity) and 3G wireless.  According to AboveNet, the top 3 CIO priorities are cloud computing, virtualization and mobile, up from spots 16, 3 and 12, respectively, just 2 years ago!   Ethernet now accounts for 36% of all access, larger than any other single legacy technology, up from virtually nothing 10 years ago when the Metro Ethernet Forum was established.  With Gigabit and Terabit speeds, Ethernet is the only protocol for the future.

The recent Ethernet Expo 2011 in NYC underscored the trends and the importance of what is going on in the market.  Just like fiber and high-capacity wireless (MIMO) in the physical layer (aka layer 1), Ethernet has significant price/performance advantages in transport networks (aka layer 2).  This graphic illustrates why it has spread through the landscape so rapidly from LAN to MAN to WAN.   With 75% of US business buildings lacking access to fiber, EoC (Ethernet over Copper) will be the preferred access solution.  As bandwidth demand increases, Ethernet has a 5-10x price/performance advantage over legacy equipment.

Ethernet is also getting smarter via what has been pejoratively coined SPIT (Service Provider Information Technology).  The graphic below shows how the growing horizontalization is supported by vertical integration of information (i.e. exchanges) that will make Ethernet truly "on-demand".  This model is critical because of both the variability and the dispersion of traffic brought on by mobility and cloud computing.  Already, the underlying layers are being "re"-developed by companies like AlliedFiber, who are building new WAN fiber with interconnection points every 60 miles.  It will all be Ethernet.  Ultimately, app providers may centralize intelligence at these points, just as Akamai pushed content storage towards the edge of the network for Web 1.0.  At the core and at key boundary points, Ethernet exchanges will begin to develop.  Right now network connections are mostly private and there is significant debate as to whether there will be carrier exchanges.  The reality is that there will be exchanges in the future; and not just horizontal but vertical as well, to facilitate new service creation and a far larger range of on-demand bandwidth solutions.

By the way, I found this "old" (circa 2005) chart from the MEF illustrating what and where Ethernet is in the network stack.  It is consistent with my own definition of web 1.0 as a 4-layer stack.  Replace layer 4 with clouds and mobile and you get a sense for how much greater the complexity is today.  When you compare it to the above charts you see how far Ethernet has evolved in a very short time, and why companies like Telx, Equinix (8.6x cash flow) and Neutral Tandem (3.5x cash flow) will be interesting to watch, as well as larger carriers like Megapath and AboveNet (8.2x cash flow).   Certainly the next 3-5 years will see significant growth in Ethernet and the obsolescence of the PSTN and legacy voice (time-based) technologies.

Related Reading:
CoreSite and other data centers connect directly to Amazon AWS

Equinix and Neutral Tandem provide seamless service

 

Posted by: Michael Elling AT 12:46 pm   |  Permalink   |  0 Comments  |  Email
Sunday, November 06 2011

Would gamification work in the smart grid?  Possibly.  Others have asked the same question.  But some would ask, why do you need to incent people to save money?  Because people’s self-interest might not be aligned with the smart-grid as currently envisioned by vendors and utilities. 

Gamification's value lies in getting people to do something against their self-interest without realizing it.  At the same time, people play games to accomplish something aspirational.  How can these two somewhat contradictory precepts be applied to the smart grid? 

People resist the smart grid because of its perceived complexity, expense and intrusiveness; they are acting in their self-interest.  At the same time, the smart grid is supposedly about giving the end-user control over their own consumption.  Unfortunately, utilities are scared of this future, since it runs counter to revenue growth.

Enter gamification where everyone might win.  If introduced into the design of smart-grid solutions from the get-go it could have a fundamental impact on penetration, acceptance and ultimately revenue and profit growth for the utility industry.   Why?  Because the demand for electricity is potentially unlimited and the easier and more efficient the industry makes consumption the greater the growth potential.

So what might gamification of the smart grid look like?  It would need to satisfy the following conditions: personal growth, societal improvement and marketing engagement.   Right now solutions I’ve read about focus on individual rewards (see Welectricity and Lowfoot), but there is a growing body of evidence that people respond better when their use is compared to their neighbors.  So why not turn efficiency and production into a contest?  Research is already underway in Hawaii and Chicago.  Small, innovative app-driven solutions are entering the market; even supported by former US Vice Presidents.

To get as much participation as possible and ensure widespread rewards, smart-grid gamification contests should be held at the home, neighborhood, city, county and state levels, all the way up to national.  They should score both relative and absolute changes so that ALL users, not just the largest, have an incentive to win; and not just individuals, but groups as well.  Contests could also get down to the appliance level and ultimately should include contribution/cogeneration (here's another example). 
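
One way to make "both relative and absolute changes" concrete is a blended score.  This is only a sketch of the idea, not a reference to any existing smart-grid program; the weighting and the 100 kWh normalization are arbitrary choices of mine.

```python
# Illustrative only: blend absolute kWh saved with percentage improvement so
# that both heavy users and smaller, already-frugal households can win.
def contest_score(baseline_kwh: float, current_kwh: float,
                  weight_relative: float = 0.5) -> float:
    absolute_saving = baseline_kwh - current_kwh          # kWh saved vs baseline
    relative_saving = absolute_saving / baseline_kwh      # % improvement
    # Normalize absolute savings to a 0-1-ish scale (100 kWh here is arbitrary).
    return (1 - weight_relative) * (absolute_saving / 100.0) \
           + weight_relative * relative_saving

# A big user cutting 10% and a small user cutting 30% land in the same ballpark.
print(contest_score(baseline_kwh=1000, current_kwh=900))   # -> 0.55
print(contest_score(baseline_kwh=300,  current_kwh=210))   # -> 0.60
```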

Utilities have done a poor job of getting customers to look at their usage information online; less than 10% do, on average.   Playing games with customers and following recipes like this might be a way to change all that.  Win, win, win.

Related Reading:

Gaming across all industries

 

Posted by: Michael Elling AT 11:45 am   |  Permalink   |  0 Comments  |  Email
Sunday, October 23 2011

Even though the US has the most reliable electric system in the world, utility companies are not schooled in real-time or two-way concepts when it comes to gathering and reporting data, nor when it comes to customer service. All of that changes with a “smart-grid” and may be the best explanation why so many smart-grid solutions stop at the meter and do not extend fully into the customer premise. Unfortunately, utilities are not prepared to “get” so much information, let alone “give” much to the customer. Over 20 million smart meters, representing 15% penetration in residential markets, have been deployed as of June, 2011 according to IEE.  They forecast 65 million (50%) by 2015, at an average cost of $150-250 per household.  While these numbers are significant, it will have taken 15 years to get there and even then only 6 million premises, less than 5% of the market, are expected to have energy management devices by 2015.  So while the utilities will have a slightly better view of things and have greater controls and operating efficiencies, the consumer will not be engaged fully, if at all.  This is the challenge of the smart-grid today.
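
For a sense of scale, the IEE figures above imply roughly the following; the residential base is backed out of the quoted penetration percentages themselves rather than taken from census data.

```python
# Scale implied by the IEE figures quoted above.
meters_2011, penetration_2011 = 20e6, 0.15
meters_2015, penetration_2015 = 65e6, 0.50

base_2011 = meters_2011 / penetration_2011   # ~133M residential accounts implied
base_2015 = meters_2015 / penetration_2015   # ~130M, roughly consistent
print(f"Implied residential base: ~{base_2011/1e6:.0f}M vs ~{base_2015/1e6:.0f}M")

cost_low, cost_high = 150, 250               # $ per household for the meter rollout
print(f"Cost of 65M meters: ${meters_2015*cost_low/1e9:.1f}B - "
      f"${meters_2015*cost_high/1e9:.1f}B")

ems_premises = 6e6                           # energy management devices by 2015
print(f"EMS penetration: {ems_premises/base_2015:.1%}")   # under 5%, as noted above
```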

Part of the issue is incumbent organizations--regulatory bodies, large utilities and vendors--and their desire to stick to proven approaches, while not all agreeing on what those approaches are. According to NIST, there are no fewer than 75 key standards and 11 different standards bodies and associations involved in smart-grid research and trials. The result is numerous different approaches, many of which are proprietary and expensive.  As well, the industry breaks energy management within the smart grid into 2 broad categories, namely Demand Response Management (DRM, the side the utility controls) and Demand Side Management (DSM, the side the customer arguably controls), instead of just calling it "end-to-end energy management," which is how we refer to it.

Another challenge, specifically for rural utilities, is that over 60% have PLC meters, which don't work with most of the "standard" DRM solutions in the market, necessitating an upgrade. This could actually present an opportunity for a well-designed end-to-end solution that leapfrogs the current industry debate and offers a new approach; one that works around an expensive meter upgrade AND allows DSM at the same time. After working with utilities for over 10 years, we've found that rural utilities are the most receptive to this new way of thinking, not least because they are owned by their customers, and because their widely dispersed customer base means they stand to gain the most operating efficiency from end-to-end "smart" technology investment.

Ultimately the market will need low-cost, flexible end-to-end solutions to make the smart-grid pervasive and generate the expected ROI for utility and customer alike.

Posted by: Michael Elling AT 08:13 am   |  Permalink   |  Email
Sunday, April 24 2011

A couple of themes were prevalent this past week:

  • iPhone/Android location logging,
  • cloud computing (and a big cloud collapse at Amazon),
  • the tech valuation bubble because of Groupon et al,
  • profits at Apple, AT&T vs VZ, Google, most notably,
  • and who wins in social media and what is next.

In my opinion they are all related, and the Cloud plays the central role, metaphorically and physically.  Horowitz recently wrote about the new computing paradigm in defense of the supposed technology valuation bubble.  I agree wholeheartedly with his assessment, as I got my first taste of this historical computing cycle over 30 years ago when I had to cycle 10 miles to a high school in another district that had a dedicated line to the county mainframe.  A year or two later I was simulating virus growth on an Apple PC.  So when Windows came in 1987 I was already ahead of the curve with respect to distributed computing.  Moreover, as a communications analyst in the early 1990s I also realized what competition in the WAN post-1984 had begotten, namely Web 1.0 (aka the Internet) and the most advanced and cheapest digital paging/messaging services in the world.  Both of these trends would have a significant impact on me personally and professionally, and I will write about those evolutions and collapses in future Spectral issues.

The problem, the solution, the problem, the solution, etc….

The problem back in the 1970s and early 1980s was the telephone monopoly.  Moore's Law bypassed the analog access bottleneck with cheap processing and local transport.  Consumers, and then enterprises and institutions, began to buy PCs and link them together to communicate and share files and resources.   Things got exciting when we began to multitask in 1987, and by 1994 any PC provided access to information pretty much anywhere.  During the 1990s and well into the next decade, Web 1.0 was just a 1.2-way store-and-forward database lookup platform.  It was early cloud computing, sort of, but no one had high-speed access.  It was so bad in 1998, when I went independent, that I had 25x more dedicated bandwidth than my former colleagues at bulge-bracket Wall Street firms.  That's why we had the bust.  Web 1.0 was narrowband, not broadband, and certainly not 2-way.  Wireless was just beginning to wake up to data, even though Jeff Bezos had everyone believing they would be ordering books through their phones in 2000.

Two things happened in the 2000s.  First, high-speed bandwidth became ubiquitous.  I remember raising capital for The Feedroom, a leading video ASP, in 2003, when we were still watching high-speed access penetration reach the 40% "tipping point."  Second, the IP stack grew from a 4-layer model into something more robust.  We built CDNs.  We built border controllers that enabled Skype VoIP traffic to transit foreign networks "for free."  We built security.  HTML, browsers and web frontends grew to support multimedia.  By the second half of the decade, Web 2.0 became 1.7-way and true "cloud" services began to develop.  Web 2.0 is still not fully developed, as a lot of technical and pricing controls and "lubricants" are still missing for true 2-way synchronous high-definition communications; more about that in future Spectrals.

The New “Hidden Problem”

Unfortunately, over that time the underlying service provider market of 5-6 competitive providers (wired, wireless, cable) consolidated down to an oligopoly in most markets.  Wherever competition dropped to 3 or fewer providers, bandwidth pricing stopped falling 40-70% per annum, as it should have, and instead fell only 5-15%.  Yet technology prices at the edge and core (Moore's Law) kept on falling 50%+ every 12-18 months.  Today, the differential between "retail" price and "underlying economic" cost per bit is the widest it has been since 1984.
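
To see how fast that retail-versus-cost gap compounds, here is a simple sketch using mid-range values of the rates quoted above; the ten-year horizon and the choice of mid-points are illustrative assumptions of mine, not figures from the post.

```python
# How the gap between retail bandwidth pricing and underlying cost per bit
# compounds, using the rates quoted above (mid-range values; horizon is arbitrary).
retail_annual_decline = 0.10    # oligopoly markets: prices fall only ~5-15%/yr
cost_halving_months = 15        # Moore's Law-like: cost halves every 12-18 months

retail, cost = 1.0, 1.0
for year in range(1, 11):
    retail *= (1 - retail_annual_decline)
    cost *= 0.5 ** (12 / cost_halving_months)
    print(f"Year {year:2d}: retail {retail:5.2f}, cost {cost:6.4f}, "
          f"gap {retail/cost:6.1f}x")
# After a decade the retail price sits roughly 90x above the underlying cost,
# which is the widening differential the post describes.
```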

That wouldn't be a problem except for two recent developments:  the advent of the smartphone and the attendant application ecosystems.  So what does this have to do with cloud computing, especially when the cloud was "an enterprise phenomenon" begun by Salesforce.com with its Force.com and by Amazon Web Services?  A lot of the new consumer wireless applications run on the cloud.  There are entire developer ecosystems building new companies.  IDC estimates that the total amount of accessible information is going to grow 44x by 2020, to 35 zettabytes, and that the average number of unique files is going to grow 65x.  That means that while a lot of the applications and information will be high-bandwidth (video and multimedia), there will also be many smaller files and transactions (bits of information); i.e. telemetry, personal information or sensory inputs.  And this information will be constantly accessed by 3-5 billion wireless smartphones and devices.  The math of networks is (N*(N-1))/2.  That's an awful lot of IP session pathways.
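
The pathway math gets very large very quickly.  A quick sketch using the device counts from the paragraph above; the formula is the post's, the rest is just arithmetic.

```python
# Number of potential point-to-point IP session pathways among N endpoints,
# using the post's formula N*(N-1)/2, for the 3-5 billion devices cited.
def pathways(n: int) -> int:
    return n * (n - 1) // 2

for devices in (3_000_000_000, 5_000_000_000):
    print(f"{devices/1e9:.0f}B devices -> ~{pathways(devices):.2e} pathways")
# 3B devices already yield ~4.5e18 pairings; 5B push it to ~1.25e19.
```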

Why is That A Problem?

The problem is that the current wireless networks can't handle this onslaught.  Carriers have already been announcing data caps over the past 2 years.  While they are falling over themselves to announce 4G networks, the reality is that these are only designed to be 2-3x faster, and they are far from ubiquitous, either geographically (wide-area) or in-building.  That's a problem if the new applications and information sets require networks that are 20-50x faster and many factors more reliable and ubiquitous.  The smartphone and its wireless tether are becoming single points of access.  Add to that the fact that carriers derive increasingly less direct benefit from these application ecosystems, so they'll have less and less incentive to upgrade and reprice their network services along true technology-driven marginal cost.  Neustar is already warning carriers that they are being bypassed in the process.

Does The Bubble Have to Burst?

Just as in the late 1990s, the upper and middle layer players really don't know what is going on at the lower layers.  And if they don't, then surely the current bubble will burst as expectations get ahead of reality.  That may take another 2-3 years, but it will likely happen.  In the meantime, alternative access players are beginning to rise up.  Even the carriers themselves are talking about offloading traffic onto femto and wifi cells.  Wifi alliances are springing up again, and middle-layer software/application controls are developing to make it easier for end-users to offload traffic themselves.  Having lived through and analyzed the advent of competitive wired and wireless networks in the 1990s, my sense is that nothing, not even LightSquared or Clearwire in their current forms, will be significant enough to precipitate the dramatic restructuring that is necessary to service this coming tidal wave of demand.

What we need is something that I call centralized hierarchical networking (CHN)™.  Essentially we will see three major layers with the bottom access/transport layer being controlled by 3-4 hybrid networks.  The growth and dynamic from edge to core and vice versa will wax and wane in rather rapid fashion.  Until then, while I totally get and support the cloud and believe most applications are going that route, let the Cloud Players be forewarned of coming turbulence unless something is done to (re)solve the bandwidth bottleneck!

Posted by: Michael Elling AT 09:34 am   |  Permalink   |  0 Comments  |  Email
Tuesday, April 19 2011

5 Areas of Focus

1) How does information flow through our economic, social and political fabric?  I believe all of history can be modeled on the pathways and velocity of information.  To my knowledge there is no economic science regarding the velocity of information, but many write about it.  Davidow (OVERconnected) speaks to networks of people (information) being in 3 states of connectivity.  Tom Wheeler, someone whom I admire a great deal, often relates what is happening today to historical events and vice versa.  His book on Lincoln's use of the telegraph makes for a fascinating read.  Because of its current business emphasis and its potential to change many aspects of our economy and lives, social media will be worth modeling along the lines of information velocity.

2) Mapping the rapidly evolving infomedia landscape to explain both the chaos of convergence and the divergence of demand has interested me for 20 years.  This amounts to a taxonomy of things in the communications, technology and internet worlds.  The latest iteration, called the InfoStack, puts everything into a 3-dimensional framework with geographic, technological/operational, and network/application dispersions.  I've taken that a step further and, from 3-dimensional macro/micro models, developed 3-dimensional organizational matrices for companies.  3 coordinates capture 99% of everything that is relevant about a technology, product, company, industry or topic.

3) Mobile payments and ecommerce have been an area of focus over the past 3 years, and I will comment quite a bit on this topic.  There are hundreds of players, with everyone jockeying for dominance or their piece of the pie.  The area is also at the nexus of 3 very large groupings of companies:  financial services, communications services and transaction/information processors.  The latter includes Google and Facebook, which is why they are constantly being talked about.  That said, players in all 3 camps are constrained by vestigial business and pricing models.   Whoever ties the communications event/transaction to the underlying economic transaction will win.  New pricing will reflect digitization and true marginal cost.  Successful models/blueprints are 800, VPN, and advertising.  We believe 70-80% of all revenue in the future will derive from corporate users and less than 30% will be subscription based.

4) Exchange models and products/solutions that facilitate the flow of information across upper and lower layers, and from end to end, represent exciting and rewarding opportunities.  In a competitive world of infinite revenue clouds of demand, mechanisms must exist that drive down costs between participants as traffic volumes explode.  This holds for one-way and two-way traffic, and for narrowband and broadband applications.  The opposing sides of bill and keep (called party pays) and network neutrality are missing the point.  New services can only develop if there is a bilateral, balanced payment system.  It is easy to see why incumbent service and application models embrace bill and keep, as it stifles new entrants.  But long term it also stifles innovation and retards growth.

5) What will the new network and access topologies look like?  Clearly the current industry structure cannot withstand the dual onslaught of rapid technological change and obsolescence and enormously growing and diverging demand.  It’s great if everyone embraces the cloud, but what if we don’t have access to it?  Something I call “centralized hierarchical networking” will develop.  A significant amount of hybridization will exist.  No “one solution” will result.  Scale and ubiquity will be critical elements to commercial success.  As will anticipation and incorporation of developments in the middle and upper layers.  Policy must ensure that providers are not allowed to hide behind a mantra of “natural bottlenecks” and universal service requirements.  In fact, the open and competitive models ensure the latter as we saw from our pro-competitive and wireless policies of the 1980s and 1990s.

In conclusion, these are the 5 areas I focus on:

1) Information Velocity
2) Mapping the InfoStack
3) Applications and in particular, payment systems
4) Exchange models
5) Networks

The analysis will tend to focus on pricing (driven by marginal, not average, costs) and arbitrages; on the "directory value" of something, which some refer to as the network effect; and on key supply and demand drivers.

Posted by: Michael Elling AT 09:43 am   |  Permalink   |  0 Comments  |  Email
Monday, April 18 2011

Today, April 18, 2011 marks my first official blog post.  It is about making money and having fun.  Actually, I started blogging about telecommunications 20 years ago on Wall Street with my TelNotes daily and SpectralShifts weekly.  Looking back, I am happy to report that a lot of what I said about the space actually took place: consolidation, wireless usurpation of wireline access, IP growing into something more robust than a 4-layer stack, etc.  Over the past decade I've watched the advent of social media and application ecosystems, and the collapse of the competitive communications sector; the good, the bad, and the ugly, respectively.

Along the way I’ve participated in or been impacted by these trends as I helped startups and small companies raise money and improve their strategy, tactics and operations.  Overall, an entirely different perspective from my ivory tower Wall Street research perch of the 1980s-90s.  Hopefully what I have to say is of use to a broad audience and helps people cut through contradictory themes of chaotic convergence and diverging demand to take advantage of the rapidly shifting landscape.

I like examples of reality imitating art.  One of my favorites is Pink Floyd's The Wall, which preceded the destruction of the Berlin Wall by a decade.  Another is the devastating satire and 1976 classic Network, predating by 30 years what media has become in the age of reality TV, Twitter and the internet moment.  I feel like a lot has changed and it's time for me to start talking again.  So, in the words of Howard Beale (Peter Finch), "I'm as mad as hell, and I'm not going to take it anymore." 

Most of the time you'll see me take a stance opposite to consensus, or approach a topic or problem from a 90-degree angle.  That's my intrinsic value; don't look for consensus opinion here.  The ability to do this lies in my analytical framework, called the InfoStack.  It is a three-dimensional framework that maps information, topics and problems along geographic, network and application dispersions.  By geographic I mean WAN, MAN, LAN, PAN.  By network, I mean the 7-layer OSI stack.  And by applications, I mean clouds of intersecting demand.  You will see that I talk about horizontal layering and scale, vertically complete solutions, and unlimited "cloud-like" revenue opportunity.  Anything I analyze is in the context of what is going on in adjacent spaces of the matrix.  And I look for cause and effect amongst the layers.
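
For readers who think in code, here is a minimal sketch of how a single InfoStack coordinate could be encoded.  The axis values are the ones named in the paragraph above; the dataclass, the example application labels and the sample tag are purely my illustration, not the author's actual tooling.

```python
from dataclasses import dataclass

# Axis values taken from the post; the encoding itself is just a sketch.
GEOGRAPHIC = ("WAN", "MAN", "LAN", "PAN")
NETWORK_LAYERS = range(1, 8)                         # the 7-layer OSI stack
APPLICATIONS = ("voice", "video", "data", "cloud")   # example demand "clouds" (assumed)

@dataclass(frozen=True)
class InfoStackCoordinate:
    geography: str      # where in the WAN/MAN/LAN/PAN dispersion
    layer: int          # OSI layer 1-7
    application: str    # which cloud of demand

    def __post_init__(self):
        assert self.geography in GEOGRAPHIC
        assert self.layer in NETWORK_LAYERS
        assert self.application in APPLICATIONS

# Example: mobile video carried over a metro network at the transport layer.
tag = InfoStackCoordinate(geography="MAN", layer=2, application="video")
print(tag)
```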

I see us at the beginning of something very big; bigger than in 1987 at the dawn of the WinTel revolution.  The best way to enjoy the great literary authors is to start with their earliest works and read sequentially, growing and developing with them.  Grow with me as we sit at the dawn of the Infomedia revolution that is remaking, and will remake, the world around us.  In the process, let's make some money and build things that are substantial.

Posted by: Michael Elling AT 01:00 pm   |  Permalink   |  0 Comments  |  Email

Information Velocity Partners, LLC
88 East Main Street, Suite 209
Mendham, NJ 07930
Phone: 973-222-0759
Email:
contact@ivpcapital.com
