Making Peace With All That Data

Opening in Boston in April, To the Brink – an exhibit commemorating the Cuban Missile Crisis – will be the largest new exhibit hosted by the Kennedy Library in more than 10 years.

As an AT&T employee, I wonder if visitors to the exhibit will be aware that there is a deeper connection between the company and this landmark event in history.  Back in 1962, the United States government was still relying on the AT&T public switched telephone network (PSTN) as the underlying technology for its defense systems infrastructure.  And although the PSTN provided levels of reliability that are still the benchmark for today's services (is there anything we can rely on more than picking up the phone and hearing a dial tone?), its hierarchical design aggregated traffic into a small number of nodes that made an easy target for enemies with long-range weapons, as well as for other, non-military threats.

As the Department of Defense's dependence on technology grew along with global unrest, it was time to find a new technology that provided a more distributed, decentralized, and recoverable architecture.  This, along with the fact that computers were growing larger and more powerful (and therefore moving into specialized research facilities, with users around the country now needing reliable remote access), led to the development of the packet switched network.  The first iteration, called the ARPANET, went online in October of 1969 but crashed two letters into its first transmission…so instead of sending "login" between UCLA and Stanford…all that made it was "lo."

Obviously the problem was resolved, and we have been rebooting our computers ever since.  Only now, a few iterations later, the network that we are booting up on is called the Internet, and AT&T’s backbone is once again a major part of delivering that traffic.

Necessity is truly the mother of invention

Fast-forward to today, and the fear has shifted from big missiles to big DATA. It is lurking out there, just out of sight and ready to "attack" – its sheer enormity potentially bringing every server and spindle to a virtual halt. Generated by just about every conceivable device and application, from medical imaging to smartphones to kitchen appliances, data is on the attack.  Customer information, photos, software logs, RFID tags, search engines, and social media – it's all contributing to the situation.  In fact, in March 2012, the White House announced a national Big Data Initiative in which six Federal departments and agencies committed more than $200 million to Big Data research projects. Is this the Cuban Missile Crisis of our generation, from an IT standpoint? Are we at the "brink" again?

Developing ways to effectively store, back up, analyze, and secure the data has become a focal point for all of the leading networking providers, hardware manufacturers, and information management software companies. One report estimates the industry at $100 billion and growing. And no industry is immune from the phenomenon – it is affecting them all, including manufacturing, retail, financial, and medical.  Complicating the situation is the changing backdrop of technology, with companies also trying to pivot to cloud-based and mobile infrastructure models to reduce costs and improve business operations.

A diplomatic resolution

In Mastering Big Data: CFO Strategies to Transform Insight into Opportunity (December 2012), Gary Simon suggests that success in resolving this conflict starts at the top with the CFO: first, with a clear vision of what the company is trying to solve with Big Data; next, with the right organizational foundation in place (talent, experience, and insight) to legitimately treat data as a corporate asset; and third, with the right technology foundation – the proper data sources identified, capable hardware and software in place, and the right components to integrate the two.

Despite the challenges of the situation, it will undoubtedly be a great opportunity for the companies that execute best on an effective Big Data strategy, providing them with the valuable and reliable information that they need to quickly evaluate the market, analyze their environment, assess their competitors, and plot a course to success.

For those not armed with this type of "firepower," it could very well become another landmark crisis.

What do you think? What approach should we take to protect Big Data before its security becomes a crisis?