Ask people to name the best thing that happened in tech in 2013, and you’ll get a wide variety of sunny answers. Ask for the worst, and one reply is likely to rear its ugly head.
That said, let’s take our medicine first, and leave the best for last.
WORST OF 2013
The Healthcare.gov Debacle
“Without a doubt, the biggest tech failure of 2013 was the Health Insurance Marketplace, the software system developed to facilitate the purchase of medical insurance as mandated under the Patient Protection and Affordable Care Act (more commonly referred to as Obamacare),” said Craig Clausen, executive vice president and principal analyst at New Paradigm Resources Group Inc.
“Given the central importance of the system to the new health care law, this failure to function would put this as the most significant tech failure of the year,” Clausen continued. “However, there are compounding factors that solidify the marketplace’s position as the most significant failure of the year. Nearly $1 billion was spent on developing the system and, based on previous experiences in software development, NPRG anticipates that correcting the system’s flaws will cost approximately 60 to 80 percent of what initial development cost. Finally, the Health Insurance Marketplace holds the position of this year’s (and perhaps the decade’s) biggest boondoggle because of the project managers’ lack of knowledge and experience.”
Peter Bernstein, a telecom veteran and TMCnet senior editor, concurred.
“As critical a part of the Obamacare implementation as it was/is, [its failure] made it absolutely the biggest tech failure of 2013 – or almost any other year,” said Bernstein. “Others have used the term catastrophic to describe the lack of preparation and testing; I would use the word inexplicable.”
Michael Stanford, a veteran VoIP entrepreneur and strategist and a monthly columnist for this magazine, agreed that healthcare.gov earned this dubious prize, but sees it more as a PR failure than a tech failure.
“Forty percent of major ERP projects come in more than four months late,” noted Stanford. “Assertions that commercial sites don't have problems like this are plain wrong. High-profile commercial sites frequently go down, sometimes for days, like the Apple Developer website in September, or the entire AT&T wireless data network when the iPhone was introduced. So misses like this are business as usual in tech.”
Here’s a fourth comment on healthcare.gov, just for good measure. It comes from Frank Stinson, partner and senior analyst with IntelliCom Analytics: “Well, that is a pretty easy one – the launch of the Obamacare federal exchange website. It was not exactly built using state of the art technology from what I understand, but it would be hard to imagine a more high-profile failure given its budget and lead time.”
Had healthcare.gov not arisen in the fourth quarter to capture the worst of 2013 prize, perhaps BlackBerry would’ve claimed the title. In fact, BlackBerry was named the worst of tech 2013 by Karl Dahlin, director of strategic partnerships at VCI-Group.
“BlackBerry [experienced] one of the biggest falls from industry hero to zero in modern business history,” said Dahlin. “Yes, they've been struggling for a while after plenty of early success, but they've had so many chances to get back in the game and become relevant again, yet they failed to execute on a viable turnaround strategy in 2013.”
The fall of BlackBerry was pretty bad, but let’s not be hasty in giving it the silver medal.
The End of Privacy
It seems to me that should instead go to the PRISM scandal.
While it probably shouldn’t come as a surprise to any of us that the U.S. government has been listening in on our exchanges (remember the Patriot Act?), the extent of the practice, and the low-level workers to whom such information was (is?) available, likely ramped up the furor over all of this.
Telecom people already knew this was going on, but it came as a shock to the general public, according to Stanford, who said PRISM stands as a marker of the official end of privacy.
“… privacy is obsolete, thanks to phone tracking, click tracking, call tracking, street view, traffic cams, satellite imagery, and the mass storage and big data software technology that makes all that data storable and mine-able for all eternity,” added Stanford. “George Orwell had it right, except he missed by 20 years, and by the scale. It isn't just Big Brother watching you: anybody can, and you fuel it with your Facebook Likes and Instagrams. This is a fundamental change in social reality, ranking with the printing press and the Industrial Revolution.”
Bernstein, who has deemed Edward Snowden the most interesting person in tech in 2013, added: “We are still dealing with the repercussions of the rolling thunder of revelations. These are already impacting how the Internet is used; how data is stored and accessed; how risks are addressed and mitigated in the government and commercial sectors; and will likely lead to new laws around the world regarding privacy and security. It will also set off an explosion of purchasing in the areas of encryption and other types of security measures.”
Indeed. And the fact that we are all increasingly connected makes this issue all the more important. That brings us to the discussion of the underlying network. While we have made many gains in adopting and building on IP communications, expanding our possibilities via broadband networks, and allowing rich media sharing even to mobile users through new LTE networks, there’s still a lot of old plumbing and gear moving our voice traffic along.
Here, let me interject my own nomination for worst of 2013: the legacy voice switch network.
Legacy telephone switches have far outlived their initially stated product lifecycles of 20 to 25 years, yet they still bear most of the load of the nation’s telephone traffic. In fact, they still support billions of dollars in revenue. There were approximately 13,000 legacy voice switches in operation in the U.S. at this time last year.
Recognizing that America’s traditional phone system is not as dependable as it used to be, the Federal Communications Commission in October ordered phone companies to start collecting statistics on calls that fail to complete. That’s because at least one estimate indicates as many as one in five incoming long-distance calls doesn’t connect, which may have something to do with the decay of traditional landline infrastructure.
Most operators believe that they can rely on the grey market to provide critical cards for continued operation, according to a white paper put out earlier this year by GENBAND. But, it noted, most cards have been recycled through repair and return processes multiple times and, according to one analysis, about 40 percent of used cards do not survive the power cycling involved in being put back into a live switch. If they do survive, the switch becomes dependent upon key control cards from the late 1970s or early 1980s, which may have 200,000 to 300,000 operating hours on them.
“This is comparable to buying a rusty old rebar to repair an aging bridge: it’s an improvement, but it will only slightly delay the bridge’s demise,” according to the paper.
The problem of outdated legacy switches remains such an important issue that Metaswitch just last month announced that it has extended its service and support to include a range of legacy Class 5 switches, including the Nortel DMS-10 and DMS-100 switches and Alcatel-Lucent 5ESS Class 5 switches. Many legacy voice switches are no longer supported by their makers, some of which are companies that have been acquired at least once in recent years, said Phil Harvey, director of corporate communications at Metaswitch.
Harvey explained that Metaswitch is not so much interested in helping carriers maintain out-of-date infrastructure as it is freeing them from worrying about such concerns so they can turn their attentions to migrating to IP environments.
“We want our customers looking forward and not backward,” he said.
“We think that now is the time for carriers to be transforming their networks,” he added. “TDM switches will not run forever, and available expertise is running short.”
BEST OF 2013
Software Eats the Network
That takes us to what many folks think is among the best developments of 2013: the concept of the software telco, and software-based networking in general.
This idea, which is frequently closely identified with network functions virtualization and software-defined networking, stands to offer service providers and other network operators big benefits in terms of flexibility, network optimization, scalability, and time to market.
"As organizations move beyond virtualization of production workloads, attention is shifting toward the management and automation of the software-defined data center," said Peter ffoulkes, TheInfoPro's research director for servers and virtualization. "Over the next two years, the foundations for enterprise cloud computing will be deployed with cloud platforms standing out as the hottest technology and the most critical strategic decision to be made."
TheInfoPro, a service of 451 Research, in December released new research indicating that spending on infrastructure will slow down over the next two years as attention shifts to software-defined data centers. It explained that cloud platforms stand out as the hottest technology for adoption in the next two years, followed by the management and automation functions required for production and virtualized data centers.
“The three pillars of the cloud are storage, the data center computing environment and the networks, i.e., the ones connecting data center elements and data centers to each other,” said Bernstein. “Whether it is the enterprise in the form of private clouds, the public cloud services, or the telecommunications global infrastructure, all aspects of ICT are rapidly becoming software-centric. SDN and NFV, while both in their infancies, are the future. In fact, how fast we get to the so-called software telco is going to be fascinating to watch as service providers seek to transform themselves in a rapidly changing world and assert their relevance in the face of OTT competition.”
The cloud, NFV and SDN will effectively turn carrier networks into living beings that can change as needs require, said Paul Miller, vice president of technology and strategy at GENBAND.
“The whole thing has to grow and shrink almost as if it were a living organism,” he said.
It’s early days for the software telco. Indeed, at this point software telco is more idea than reality. But this concept has legs. Several leading vendors, not to mention noted startups, already have introduced products to help enable the creation of the software telco; several standards efforts on NFV and SDN are under way; telco trials are in the works; and it doesn’t hurt that some of the world’s leading telcos are leading the way on NFV.
Cloud: Still Rolling In
As the comments above reflect, the cloud plays a central role in software-based networking and virtualization. But whether we’re talking about NFV and SDN or just networking in general, the cloud has become a – if not the – central theme in the communications and IT space.
When asked for the three most important tech developments of 2013, Phil Edholm of PKE Consulting LLC responded: “Cloud, cloud, and cloud.” New technologies like WebRTC and HTML5 are changing the landscape, he added, but behind it all the biggest and most important development is that cloud is changing the business of IT and information.
“Just like the Industrial Revolution changed manufacturing from a bespoke cottage business to mass manufacturing, cloud is changing information and communications,” said Edholm. “The big difference is that things will cost much less, but will be available in a wide range of defined options. Just as in the hardware store, you can choose among 10 hinges, but you cannot specify a unique hinge. Similarly, technology is becoming a range of choices, but not customized in each implementation.”
The impact of cloud networking is far and wide. For example, it now enables people to store and share video, pictures and presentations via the Internet using services such as Dropbox. It gives even the smallest businesses quick and affordable access to computing, infrastructure and platform as a service offerings, meaning upstarts can get to market faster and easier – and compete much more effectively with their much larger competitors. And the cloud provides organizations of all sizes access to services, such as unified communications offerings, that otherwise may have been out of their reach due to costly upfront investments and ongoing maintenance requirements.
There are a variety of approaches businesses adopting cloud solutions can take, of course. As Stinson noted, the cloud today is a bifurcated market where public cloud services have had the most success with small businesses while large enterprises have gravitated to private clouds hosted from their own data centers.
“In 2014,” he said, “you will see greater provider focus on hybrid approaches blending public and private cloud implementation to bridge that gap. And whether it is through your customers, partners, or own employees, the user experience of new applications and technologies will remain a key area of emphasis going forward.”
Ovum’s Sapien, meanwhile, sees the connection of clouds proliferating in many dimensions – clouds connected to other clouds, private clouds connected to public clouds, enterprise applications using multiple clouds within one application, and cloud service marketplaces where applications will be a combination of cloud services rather than a single cloud service. He also talked about the importance of cloud-network integration.
“Cloud providers of all kinds will be connecting to each other, many networks and many different applications, but the difference here is the network and cloud will be integrated and controlled from one portal or vendor,” he said. “Cloud providers will be integrating the required network, and telecom providers will be integrating the cloud services. SDN will be one of the catalysts or ingredients.”
Big Data
Big data was another important trend of 2013, and it will obviously continue to grow during 2014 and beyond.
The proliferation of structured and unstructured data within organizations, including service providers like telephone and broadband service providers, among others, creates opportunities for these organizations to analyze and aggregate information to better understand such things as customers’ past and likely future buying habits, the customer experience their own organization is providing, how to most efficiently deploy a fleet of vehicles, or even how to better engineer a network.
For example, a new company called Mobile Pulse has developed technology that gathers granular cellular and Wi-Fi network data from mobile devices, explained Clausen of NPRG.
“Initially, it will be used to analyze wireless network performance by network, carrier, device and geography,” said Clausen, who considers Mobile Pulse the top startup of 2013. “However, the technology coupled with the richness of the data it can provide holds tremendous potential for delivering insights into a range of network and usage related areas. Mobile Pulse’s technology will also contribute another component of the big data picture.”
But combing through and analyzing big data into something that’s useful is no small task, said Mark Ricca, partner and senior analyst of IntelliCom Analytics.
“The majority of companies in the world today are not suffering from a lack of data; instead they are challenged by an abundance of data and a lack of intelligence,” he said. “Enter providers that recognize the need and opportunity, and that have solutions to tailor big data into relevant and meaningful intelligence.”
M2M & The Internet of Things
The ability to mold multiple data sources into meaningful intelligence will become even more important as more of our world becomes connected. Increasingly, not only will we want to use and reuse information gathered from and about smartphones, Internet usage, customer service interactions and the like, but we will also be hunting for and gathering data relative to unmanned connected devices in factories; within homes and neighborhoods (Did you hear about the Amazon drone?); and on planes, trains and automobiles. That’s not to mention the connected devices we are, or will be, wearing on our bodies. That includes things like Google Glass, the tattoo-like tab Google is working on (see the M2M Transcendent column in this issue), smart watches, football helmets, personal health and wellness meters, and more.
“While these devices have been the butt of late-night comedians’ jokes, they are positioned to expand the way we interact with our surroundings, much the way smart mobile devices transformed the way we communicate and access information,” said Clausen.
Global M2M cellular connections are forecast to hit the 374.9 million mark by 2017, expanding at a compound annual growth rate of 26.5 percent from 91.4 million in 2011, according to research firm IHS. Berg Insight, meanwhile, forecasts that M2M devices with cellular connectivity will increase by 22 percent this year to reach 164.5 million in emerging markets, and estimates that M2M connections will grow at a CAGR of 24.4 percent with 489.9 million connections in 2018. And Analysys Mason says the M2M market will be worth $88 billion in the next 10 years.
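Forecasts like these imply specific compound annual growth rates, and it’s worth seeing how the cited endpoints and rates hang together. As a quick sanity check (a sketch of the standard CAGR formula, not drawn from any of the cited reports), the numbers above are internally consistent:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1.0 / years) - 1.0

# IHS: 91.4M global M2M cellular connections in 2011, 374.9M forecast for 2017
print(f"IHS implied CAGR: {cagr(91.4, 374.9, 2017 - 2011):.1%}")    # 26.5%

# Berg Insight: 164.5M connections in 2013, 489.9M forecast for 2018
print(f"Berg implied CAGR: {cagr(164.5, 489.9, 2018 - 2013):.1%}")  # 24.4%
```

In both cases, the implied rate matches the CAGR the firm quotes, which suggests the forecast endpoints and growth rates are drawn from the same underlying model.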
To get there, M2M will have to overcome at least some of its challenges, which include fragmentation within the marketplace and an absence of established business models. The good news is that M2M appears to be poised for big things. Just look at the expanding adoption of these solutions, the decreasing costs of hardware, the growing array of products and businesses in the M2M space, and rising interest from investors in machine-to-machine companies.
“The next Internet wave will be the M2M revolution, where almost anything – from an automobile, to a shipping container, to a home electricity meter – can become a part of a vast network,” said Sam Lucero, senior principal analyst for M2M & Internet of Things at IHS. “Cellular communications will play a key role in this new era of the Internet of things, serving as the glue that connects hundreds of millions of nodes together. However, the cost and complexity of developing, deploying and operating cellular M2M applications is daunting, leading increasing numbers of companies to outsource cellular M2M application development, deployment and – in many cases – operation, to VAS providers.”
WebRTC
While it’s very early days for WebRTC, we would be remiss if we didn’t include it among the best of 2013. The events that Technology Marketing Corp. puts on in collaboration with Edholm are in themselves an indicator of all the interest in the WebRTC space. The most recent event drew around 700 attendees.
WebRTC is already supported on more than one billion endpoints, according to Google, a key proponent of this new technology. Disruptive Analysis expects that to grow to 3.9 billion by 2016.
“I consider 2013 to be the year of WebRTC,” said Dahlin of VCI-Group. “This was the year we moved beyond the hype and were actually able to use commercial products and see why the world will be forever changed going forward because of WebRTC.”
Bernstein agreed, saying: “WebRTC is possibly the most disruptive thing to hit the communications industry in over a decade. It literally holds the promise to transform the way in which all of us interact professionally and personally.”
Evolve IP's CTO Scott Kinka added that WebRTC, while still a couple of years away from widespread adoption, will begin to see applications beyond the enterprise, adding benefits for customers, partners, suppliers, and beyond. According to Infonetics Research, 20 percent of Americans currently work from home, a number that is expected to increase by 63 percent in the next five years, he said. That points to the growing need for problem-free videoconferencing, he added, and the eventual adoption of WebRTC will facilitate this.
Edited by Cassandra Tucker