SpectralShifts Blog 
Friday, January 11 2013

A year ago it was rumored that 250 Apple employees were at CES 2012, even as the company refused to participate directly.  The company could do no wrong and didn't need the industry.  For the better part of 9 months that appeared to be the case, and Apple's stock outperformed the market by 55%.  But a few months on, after a screen size too small for phones and too big for tablets, a mapping app too limited, and finally a buggy OS, Apple's excess performance over the market has narrowed to 10%.

Two major themes of this year's CES--mobile device screen size and extensive application ecosystems that connect just about anything--will place Apple's mobile dominance and lead further in doubt.  To us that was already in evidence last year when we talked about the singularity.  But the real reason is becoming apparent to all, namely that Apple wants to keep people siloed in its product-specific verticals.  People and their applications don't want that, because the cloud lets people update, access and view information across 3 different screens and any platform.  If you want Apple on one device, all your devices have to be Apple.  It's a twist on the old Henry Ford maxim: "you can have any device…as long as it is Apple."

This strategy will fail further when the access portion of the phone gets disconnected from all the other components of the phone.  It may take a few years to happen, but it will make a lot of sense to just buy an inexpensive dongle or device that connects to the 4G/5G/WiFi network (metro-local or MAN/LAN) and radiates Bluetooth, NFC and WiFi to a plethora of connected devices in the personal area network (PAN).  Imagine how long your "connection hub" would last if it didn't need to power a screen and a huge processor for all the different apps.  There goes your device-centric business model.

And all that potential device and application/cloud supply-side innovation means that current demand is far from saturated.  The most recent Cisco forecasts indicate that 1/3rd of ALL internet traffic will come from mobile devices by 2016.  In the US, 37% of that mobile access will be via WiFi.  Applications that utilize and benefit from mobility and transportability will continue to grow, while overall internet access via fixed computers drops to 39% from 60% today.

While we believe this to be the case, the reality today is far different according to Sandvine, the broadband policy management company.  This should cause the wireless carriers some concern as they look at future capacity costs.  In its recent H2-2012 report Sandvine reveals that power smartphone users are already using 10x more than average smartphone users, or 317 megabytes a month vs 33.  But even the former number is a far cry from the 7.3 gigabytes (roughly 20x) that the average person uses on their fixed broadband pipe (assuming 2.3 people per fixed broadband line).  Sandvine estimates that total mobile access will grow from ~1 petabyte in H2-2012 to 17 petabytes by H2-2018.
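
For anyone who wants to check the math, here is a quick back-of-the-envelope in Python using only the figures above (the 2.3 people per fixed line is Sandvine's assumption).  It reproduces the ~10x power-user gap, puts the fixed-versus-mobile gap in the low 20s, and shows what the petabyte forecast implies in annual growth terms.

```python
# Quick sanity check of the Sandvine figures cited above (all numbers come from the text;
# the 2.3 people per fixed line is Sandvine's assumption, not ours).

avg_smartphone_mb   = 33      # average smartphone user, MB per month
power_smartphone_mb = 317     # "power" smartphone user, MB per month
fixed_per_person_gb = 7.3     # average person's share of a fixed broadband line, GB per month
people_per_line     = 2.3

print(f"power vs. average smartphone user: {power_smartphone_mb / avg_smartphone_mb:.0f}x")
print(f"fixed (per person) vs. power smartphone user: "
      f"{fixed_per_person_gb * 1000 / power_smartphone_mb:.0f}x")
print(f"implied usage per fixed line: {fixed_per_person_gb * people_per_line:.1f} GB/month")

# What the mobile forecast implies: ~1 PB in H2-2012 growing to 17 PB by H2-2018.
years = 6
implied_cagr = (17 / 1) ** (1 / years) - 1
print(f"implied mobile access growth: {implied_cagr:.0%} per year")
```

The last line is the striking one: a roughly 60% compound annual growth rate is what the carriers have to engineer and price for.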

My own consumption, since moving from 3G to 4G and going from a 4 to a 4.7 inch screen, has increased roughly 10x to 1-2 gigs of 4G access and 3-6 gigs of WiFi access, for a total of 4-8 gigs a month.  This is because I have gone from a "store and forward" mentality to a 7x24 multimedia consumption model.  And I am just getting comfortable with cloud-based access and streaming.  All this sounds positive for growth and investment, especially as the other 95% of mobile users evolve to these usage levels, but it will do the carriers no good if they are not strategically and competitively well positioned to handle the demand.  Look for a lot of development in the lower access and transport layers, including WiFi offload, fiber and high-capacity microwave backhaul.

Related Reading:

Smartphones use more data than tablets for the first time.
 
Broadcom develops small, multi-modal, multi-band modem chips!

 

Posted by: Michael Elling AT 01:51 pm   |  Permalink   |  0 Comments  |  Email
Sunday, June 03 2012

Since I began covering the sector in 1990, I've been waiting for Big Bang II.  An adult flick?  No, the sequel to Big Bang (aka the breakup of Ma Bell and the introduction of equal access) was supposed to be the breakup of the local monopoly.  Well, thanks to the Telecom Act of 1996 and the well-intentioned farce that it was, that didn't happen, and equal access officially died (equal access RIP) in 2005 with the Supreme Court's Brand X decision upholding the FCC.  But if it died, then we saw a resurrection that few noticed.

I am announcing that Equal Access is alive and well, albeit in a totally unexpected way.  Thanks to the epochal demands Steve Jobs put on AT&T to counter its terrible 2G/3G network coverage and throughput, every smartphone has an 802.11 (WiFi) backdoor built in.  Together with the Apple and Google operating systems being firmly out of carriers' hands and scaling across other devices (tablets, etc.), a large ecosystem of over-the-top (OTT), unified communications and traffic offloading applications is developing to attack the wireless hegemony.

First, a little history.  Around the time of AT&T's breakup the government implemented 2 forms of equal access.  Dial-1 equal access in long-distance made marketing- and application-driven voice resellers out of the long-distance competitors.  The FCC also mandated A/B cellular interconnect to ensure nationwide buildout of both cellular networks; this was extended to nascent PCS providers in the early-to-mid 1990s, leading to dramatic price declines and enormous demand elasticities.  Earlier, the competitive WAN/IXC markets of the 1980s led to rapid price reductions and to monopoly (Baby Bell or ILEC) pricing responses that created the economic foundations of the internet in layers 1 and 2, aka flat-rate or "unlimited" local dial-up.  The FCC protected the nascent ISPs by preventing the Bells from interfering at layer 2 or above.  Of course this MAN/LAN form of "net neutrality" went away with the advent of broadband, and today it is really just about WAN/MAN fights between the new (converged) ISPs or broadband service providers like Comcast, Verizon, etc. and the OTT or content providers like Google, Facebook, Netflix, etc.

(Incidentally, the FCC ironically refers to edge access providers, who have subsumed the term ISPs or "internet service providers", as "core" providers, while the over-the-top (OTT) messaging, communications, e-commerce and video streaming providers, who reside at the real core or WAN, are referred to as "edge" providers.  There are way, way too many inconsistencies for truly intelligent people to a) come up with and b) continue to promulgate!)

But a third form of equal access, this one totally unintended, happened with 802.11 (WiFi) in the mid 1990s.  WiFi became "nano-cellular" in that regulated power output limited hot-spot or cell size to ~300 feet.  That had the effect of making the frequency band nearly infinitely divisible.  The combination was electric, and the market, unencumbered by monopoly standards and scaling along with related horizontal layer 2 data technologies (ethernet), quickly seeded itself.  It really took off when Intel built WiFi capability directly into its Centrino chips in the early 2000s.  Before then, computers could only access WiFi with USB dongles or get online over cables tethered to 2G phones.

Cisco just forecast that 50% of all internet traffic will be generated from 802.11 (WiFi) connected devices.  Given that 802.11's costs are 1/10th those of 4G, something HAS to give for the communications carrier.  We've talked about carriers needing to better address the pricing paradox of voice and data, as well as the potential for real obviation at the hands of the application and control layer worlds.  While they might think they have a near monopoly on the lower layers, Steve Jobs' ghost may well come back to haunt them if alternative access networks/topologies get developed that take advantage of this equal access.  For these networks to happen, their builders will need to think digital, understand, project and foster vertically complete systems, and be able to turn the "light switch on" for their addressable markets.

Posted by: Michael Elling AT 10:21 am   |  Permalink   |  2 Comments  |  Email
Sunday, April 29 2012

The first quarter global smartphone stats are in and it isn't even close.  With the market growing more than 40%, Samsung controls 29% of the market and Apple 24%.  The next largest, Nokia, came in 60-70% below the leaders at 8%, followed by RIM at 7% and HTC at 5%, leaving the scraps (28%) to Sony, Motorola, LG and ZTE.  They've all already lost on the scale front; they need to change the playing field.

While this spread sounds large and improbable, it is not without historical precedent.  In 1914, just 6 years after its introduction, the Ford Model T commanded 48% market share.  Even by 1923 Ford still had 40% market share.  Two years later the price stood at $260, which was 30% of the original model's price in 1908 and less than 10% of what the average car cost in 1908; that sounds awfully similar to Moore's law and the pricing of computer/phone devices over the past 30 years.  Also, a read on the Model T's technological and design principles sounds a lot like pages taken out of the book of Apple.  Or is it the other way around?

Another similarity was Ford's insistence on the use of black beginning in 1914.  Over the life of the car 30 different variations of black were used!  The color limitation was a key ingredient in the low cost, as prior to 1914 the company used blue, green, red and grey.  Still, 30 variations of black (just like Apple's white-and-black-only, take-it-or-leave-it, silo-ed product approach) is impressive, and is eerily similar to Dutch Master Frans Hals' use of 27 variations of black, so inspirational to Van Gogh.  Who says we can't learn from history?

Ford's commanding lead continued through 1925 even as competitors introduced many new features, designs and colors.  Throughout, Ford was the price leader, but when the end came for that strategy it was swift.  Within 3 years the company had completely changed its product philosophy, introducing the Model A (with 4 colors and no black) and running up staggering losses in 1927-28 in the process.  But the company saw market share rebound from 30% to 45%; something that might have been maintained for a while had the Depression not hit.

The parallels between the smartphone and the automobile seem striking.  The networks are the roads.  The pricing plans are the gasoline.  Cars were the essential component of economic advancement in the first half of the 20th century, just as smartphones are the key to economic development in the first half of the 21st century.  And now we find Samsung playing GM to Apple's Ford; only Samsung is taking a page from both GM's and Ford's histories.  Apple would do well to take note.

So what are the laggards to do to make it an even race?  We don't think Nokia's strategy of coming out with a different color will matter.  Nor do we think that more features will matter.  Nor do we think it will be about price/cost.  The only answer lies in context; something we have raised in the past in our outlook for the carriers.  More to come on how context can be applied to devices.  Hint: I said devices, not smartphones.  We'll also explore what sets Samsung and Apple apart from the rest of the pack.

Related Reading:

Good article on Ford and his maverick ways; qualities which Jobs also possessed.

 

Posted by: Michael Elling AT 09:24 am   |  Permalink   |  0 Comments  |  Email
Sunday, April 22 2012

I love talking to my smartphone and got a lot of grief for doing so from my friends last summer while on vacation at the shore.  "There's Michael talking crazy, again," as I was talking TO my phone, not through it.  "Let's ask Michael to 'look' that up! Haha."  And then Siri came along and I felt vindicated and simultaneously awed; the latter by Apple's (AAPL, OCF = 6.9x) marketing and packaging prowess, which seemingly had trumped Google/Android (GOOG, OCF = 9.6x) yet again.  Or had it?  What at first appeared to be a marketing coup may become Tim Cook's Waterloo and sound a discordant note for Nuance (NUAN, OCF = 31x).  At its peak in early February NUAN hit a high of 42x OCF, even as Apple stood at 8.2x.

The problem for Apple and Nuance in the short term is that the former is doubling down on its Siri bet with a brand new round of ads featuring well-known actors such as Samuel L. Jackson and Zooey Deschanel.  Advertising pundits noted this radical departure for a company that historically shunned celebrities.  Furthermore, with two (NY and LA) class-action suits against it and financial pundits weighing in on the potential consumer backlash, Apple could have a major Siri migraine in the second half.  Could it be as big as Volvo's advertising debacle 20 years ago; a proverbial worm in the apple?  Time will tell.

The real problem isn't with Apple, the phone or Siri and its technology DNA; rather, the problem lies with the bandwidth and performance of the wireless networks.  Those debating whether Siri is a bandwidth hog are missing the point.  Wireless spectrum is shared.  The more people on the same spectrum band, the less bandwidth for each user and the higher the noise.  Both are bad for applications like Siri and Android's voice recognition, since they talk with processors in the cloud (or WAN).  Delays and missed packets have a serious impact on comprehension, translation and overall responsiveness.  As a frequent user of Google's voice-rec, which has been around since August 2010, I know when to count on it by looking at the WiFi/3G/2G indicator and the number of bars.  But do others know this, and will the focus on Siri's performance shift to the network?  Time will tell.
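
To make the shared-spectrum point concrete, here is a minimal sketch in Python; the cell capacity, user counts and linear congestion model are hypothetical illustrations rather than measurements, but they show why cloud-based voice-rec degrades as a cell fills up.

```python
# Minimal sketch of why shared wireless spectrum hurts cloud-based voice recognition.
# The cell capacity, user counts and congestion model are hypothetical, for illustration only.

def per_user_throughput_mbps(cell_capacity_mbps: float, active_users: int) -> float:
    """Airtime on a shared band is split, so each active user gets roughly an equal slice."""
    return cell_capacity_mbps / max(active_users, 1)

def round_trip_ms(base_latency_ms: float, active_users: int,
                  congestion_ms_per_user: float = 2.0) -> float:
    """Crude linear model: queueing delay grows as more users contend for the cell."""
    return base_latency_ms + congestion_ms_per_user * active_users

for users in (1, 10, 50):
    tput = per_user_throughput_mbps(5.0, users)   # assume a 5 Mbps 3G sector
    rtt = round_trip_ms(100.0, users)             # assume ~100 ms baseline to the cloud
    verdict = "snappy" if rtt < 200 else "sluggish"
    print(f"{users:>3} users: ~{tput:.2f} Mbps each, ~{rtt:.0f} ms round trip "
          f"-> voice-rec feels {verdict}")
```

None of these numbers are real measurements; the point is simply that throughput and latency on a loaded cell move in exactly the direction that hurts round-trip-dependent apps like Siri.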

The argument that people know voice-rec is new and still in beta probably won't cut it either.  I am puzzled why major Apple fanboys don't perceive it as a problem; it doesn't even make this list of 5 critical product issues for the company to address.  But maybe that's why companies like Apple are successful; they push the envelope.  In the 1990s I said to the WAN guys (Sprint, et al.) that they would have the advantage over the Baby Bells because the "cloud" or core was more important than the edge.  The real answer, which Apple fully understands, is that the two go hand in hand.  For Sprint (S, OCF = 3.2x) there were too many variables they couldn't control, so they never rolled out voice recognition on wired networks.  They probably should have taken the chance.  Who knows, the "pin-drop" company could have been a whole lot better off!

Related Reading:
First impressions of the iPhone 4S from an Android user, and the pros and cons of Siri

A natural opposite to "Siri" for the Droid crowd would be an "Eliza," after Audrey Hepburn's famous 'The Rain in Spain' character.  Eliza is also the name of a company specializing in voice-rec healthcare apps and of a 1960s AI psychotherapy program.
 

 

Posted by: Michael Elling AT 10:38 am   |  Permalink   |  0 Comments  |  Email
Wednesday, February 29 2012

Is reality mimicking art?  Is Android following the script from Genesis’ epochal hit Land of Confusion?  Is it a bad dream on this day that happens once every four years?  Yes, yes, and unfortunately no.  Before I go into a litany of ills besetting the Android market and keeping Apple shareholders very happy, two points: a) I have an HTC Incredible and am a Droid fan, and b) the 1986 hit parodied superpower conflict and inept decisions by global leaders but presaged the fall of the Berlin Wall and 20+ years of incredible growth, albeit with a good deal of 3rd world upheaval in the Balkans, Mid-east, and Africa.  So maybe there is hope that out of the current state a new world order will arise as the old monopolies are dismantled.

Apple clearly has the digital formula right at present: simplicity, ease of use, performance and yet, at the same time, unlimited choice and customization.  Contrast that with this SNL parody of Verizon and 4G/3G/2G/noG and the Samsung Super Bowl ad portraying a wild party.  The result is a disturbing trend if you are an Android phone lover: the ecosystem's rate of new technology adoption is slowing down even as better technology is being made, because consumers are clearly confused.  In the tablet market there is an even greater disparity, with Android tablets hardly making a dent in Apple's share.

Yesterday at MWC Eric Schmidt prognosticated a world where more is better and cheaper, which may be good for Google but not necessarily the best thing for anyone else in the Droid ecosystem, including consumers.  Yet at the same time Apple managed to steal the show with its iPad 3 announcement.  Contrast this with HTC rolling out some awesome phones that will not be available in the US this year because their chip doesn't support 4G.

The answer is not better technology, but better ecosystems.  The Droid device vendors should realize this and build a layer of software and standards above Google/ICS to facilitate interoperability across silos (at the individual, device and network level), instead of just depreciating their hardware value by competing on price and features many people do not want.  They could then collectively win residual transaction streams (like synching back to a shared dropbox), just as Apple does.

Examples include standardization and interoperability of free or subsidized WiFi offload, over-the-top messaging, voice and other solutions, and the holy grail, mobile payments.  Platforms like Cloud Foundry allow for cross-cloud application infrastructure support, and companies like AppFog and Iron Foundry are pursuing these approaches individually.  But just think what would happen if Samsung, HTC, LG and Motorola were to band together, coordinate these approaches and develop very low-cost balanced payment systems within the Droid ecosystem to promote interoperability and cooperation, counteract Google and restore some sanity to the market.  Carriers (um, battleships?) will not be able to stop this effort and may even welcome it, just as the music industry opened its arms to Apple.

Apple hasn't been an innovator so much as a great design company that understands big market opportunities and what the customer wants.  The result is an established order that other industries and their customers clearly prefer.  Millennials are too young to know Land of Confusion, but the current decision makers in the Droid ecosystem do, and they should take a lesson from history on this Leap Day.  Hopefully we'll wake up in 4 years and there will be a wonderful new world order.  Oh, and a happy 4 birthdays to everyone, present and past, who was born on this day.

Related Reading:

Good assessment of and comments on the fragmentation of Android

Is it the people Apple and Google hire?  Maybe, maybe not.

Posted by: Michael Elling AT 09:17 am   |  Permalink   |  0 Comments  |  Email
Sunday, January 15 2012

"There is no clear definition, but usually the Singularity is meant as a future time when societal, scientific and economic change is so fast we cannot even imagine what will happen from our present perspective, and when humanity will become posthumanity."  Futurist Ray Kurzweil wrote The Singularity Is Near in 2005, and it sure felt that way at CES in Las Vegas in 2012.

In any event, we didn’t come up with this association.  Matt Burns at TechCrunch did during one of the live streamed panel discussions from the show floor.  But given that we’ve been at the forefront of mobile devices and networks for 2 decades, we couldn’t have said it any better.  After all this time and all these promises, it really felt like all the devices were connected and this app-ecosystem-mass was beginning to move at a pace and direction that none of us could fully explain.

Vernor Vinge, a sci-fi writer, coined the expression back in the early 1990s.  Looking at the list of publications at the Singularity Institute, one can't help but think that the diaspora of different applications and devices self-organizing out of CES 2012 is the beginning of AI.  Will AI develop out of the search algorithms that tailor to each individual's specific needs, possibly out-thinking or out-guessing the individual in the process?

We probably won't know the answer for at least a decade, but what we do know is that we are about to embark on a wave of growth, development and change that will make the past 30 years of WinTel-Internet development look embryonic at best.  After all, the Singularity is infinity in mathematical terms.  Hold onto your seats.

Posted by: Michael Elling AT 09:10 am   |  Permalink   |  0 Comments  |  Email
Sunday, October 30 2011

Without access does the cloud exist?  Not really.

In 2006, cloud computing entered the collective intelligence in the form of Amazon Web Services.  By 2007, over 330,000 developers were registered on the platform.  This rapid uptake was an outgrowth of web 1.0 applications (scale) and growth in high-speed broadband access from 1998-2005 (ubiquity).  It became apparent to all that new solutions could be developed and efficiencies improved by collapsing to the core a portion of the processing and storage that had developed at the edge during the WinTel revolution.  The latter had fundamentally changed the IT landscape between the late 1980s and early 2000s from a mainframe to a client-server paradigm.

In 2007 the iPhone was born, just as 3G digital services were being rolled out by a competitive US wireless industry.  In 2009 "smartphone" penetration was 18% of the market.  By the 3rd quarter of 2011 that number had reached 44%.  The way people communicate and consume information is changing dramatically in a very short time.

The smartphone is driving cloud (aka back to the mainframe) adoption for 3 reasons: 1) it introduces a new computing device that complements, rather than replaces, existing computing devices at home and work; 2) its small screen limits what information can be shown and processed locally; 3) it increases the sociability, velocity and value of information.  Information knows no bounds at the edge or core.  And we are at the very, very early stages of this dramatic new revolution.

Ice Cream Sandwich (just like Windows 2.0 multitasking in 1987) heralds a radical new world of information generation and consumption.  Growth in processing and computation at the edge will drive the core and vice versa, just as chip advances from Intel fed software bloat on desktops, further necessitating faster chips.

But the process can only expand if the networks are there to support it.  Unfortunately, carriers have responded with data caps and bemoan the lack of new spectrum.  Fortunately, a hidden back door exists in the form of WiFi access.  And if carriers like AT&T and Verizon don't watch out, it will become the preferred form of access.

As a recent adopter of Google Music I have become very attuned to that.  First, it is truly amazing how seamless content storage and playback have become.  Second, I learned how to program my phone to always hunt for a WiFi connection.  Third, when I have access to neither the 3G wireless network nor WiFi and I want something stored online, a strange feeling of being disconnected overtakes me; akin to leaving one's cellphone at home in the morning.

With the smartphone we are getting used to choice and instant gratification.  The problem with WiFi is its variability and unreliability.  Capital and technology are being applied to solve that problem, and it will be interesting to see how service providers react to the potential threat (and/or opportunity).  Where carriers once imagined walled application gardens, there are now fertile iOS and Android fields watered by clouds over which carriers exert little control.  Storm clouds loom over their control of, and ROI from, access networks.

Posted by: Michael Elling AT 09:10 am   |  Permalink   |  0 Comments  |  Email
Sunday, October 23 2011

Even though the US has the most reliable electric system in the world, utility companies are not schooled in real-time or two-way concepts when it comes to gathering and reporting data, nor when it comes to customer service.  All of that changes with a "smart grid," which may be the best explanation of why so many smart-grid solutions stop at the meter and do not extend fully into the customer premise.  Unfortunately, utilities are not prepared to "get" so much information, let alone "give" much to the customer.  Over 20 million smart meters, representing 15% penetration of residential markets, had been deployed as of June 2011, according to IEE.  IEE forecasts 65 million (50%) by 2015, at an average cost of $150-250 per household.  While these numbers are significant, it will have taken 15 years to get there, and even then only 6 million premises, less than 5% of the market, are expected to have energy management devices by 2015.  So while the utilities will have a slightly better view of things and greater controls and operating efficiencies, the consumer will not be engaged fully, if at all.  This is the challenge of the smart grid today.
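
A quick back-of-the-envelope in Python on the IEE figures above; the residential market size is implied by the penetration percentages, and the cost range is IEE's, so treat the outputs as rough orders of magnitude rather than precise estimates.

```python
# Back-of-the-envelope on the IEE smart-meter figures cited above (June 2011 data).

meters_2011 = 20e6            # smart meters deployed
penetration_2011 = 0.15       # share of the residential market
meters_2015 = 65e6            # IEE forecast
cost_per_household = (150, 250)

households = meters_2011 / penetration_2011           # implied residential market size
spend_low  = meters_2015 * cost_per_household[0]
spend_high = meters_2015 * cost_per_household[1]

print(f"implied residential market: ~{households / 1e6:.0f} million households")
print(f"implied cumulative meter spend by 2015: "
      f"${spend_low / 1e9:.1f}-{spend_high / 1e9:.1f} billion")
print(f"premises with energy management devices by 2015: 6 million "
      f"(~{6e6 / households:.1%} of the market)")
```

Roughly $10-16 billion of cumulative meter spend against fewer than 5% of premises actually getting energy management devices is the gap the rest of this post is about.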

Part of the issue is incumbent organizations--regulatory bodies, large utilities and vendors--and their desire to stick to proven approaches, while not all agreeing on what those approaches are.  According to NIST, there are no fewer than 75 key standards and 11 different standards bodies and associations involved in smart-grid research and trials.  The result is numerous different approaches, many of which are proprietary and expensive.  As well, the industry breaks energy management within the smart grid into 2 broad categories, namely Demand Response Management (DRM, the side the utility controls) and Demand Side Management (DSM, the side the customer arguably controls), instead of just calling it "end-to-end energy management," which is how we refer to it.

Another challenge, specifically for rural utilities, is that over 60% have PLC meters, which don't work with most of the "standard" DRM solutions in the market, necessitating an upgrade.  This could actually present an opportunity for a well-designed end-to-end solution that leapfrogs the current industry debate and offers a new approach; one that works around an expensive meter upgrade AND allows DSM at the same time.  After working with utilities for over 10 years, we've discovered that rural utilities are the most receptive to this new way of thinking, not least because they are owned by their customers, and because their widely dispersed customer base means they can achieve greater operating efficiencies from end-to-end "smart" technology investment.

Ultimately the market will need low-cost, flexible end-to-end solutions to make the smart-grid pervasive and generate the expected ROI for utility and customer alike.

Posted by: Michael Elling AT 08:13 am   |  Permalink   |  Email

Information Velocity Partners, LLC
88 East Main Street, Suite 209
Mendham, NJ 07930
Phone: 973-222-0759
Email:
contact@ivpcapital.com

