Cybersecurity Represents the Internet’s Technical Debt

August 20, 2014 / Security / 0 Comments

Not very long ago and not very far away, scientists, engineers, and possibly Al Gore began inventing the fundamental building blocks of the Internet.

[Figure: Internet timeline showing the development of the Internet from ARPANET to W3C (Internet Society graphic)]

During this period of accelerating invention, those making the important discoveries were primarily concerned with getting things to work. Few networks were connected to one another, and those that were tended to have close relationships and a high degree of trust. A network inside a building or facility was secured by physical access, and password access to the local network was largely sufficient.

Security was always an afterthought

The hacker movement closely followed the development of the Internet. In the 1970s many hackers amused themselves by hacking phone networks, so-called phreaking. One of the first large-scale network hacks came in 1983, when intruders broke into some 60 institutions including Los Alamos National Laboratory and the Memorial Sloan Kettering Cancer Center. The first firewalls emerged in the late 1980s as basic packet filters. It wasn’t until the 1990s that researchers began working on layering security on top of TCP/IP, the fundamental protocol suite of the network, which unfortunately happened to be insecure by design (no encryption, no authentication of endpoints, and so on). DNS, the service that translates natural-language names such as CloakLabs.com into machine addresses such as 50.19.227.130, was similarly insecure. Work on DNSSEC didn’t begin until the 1990s.
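
To make that name-to-address translation concrete, here is a minimal sketch in Python using only the standard library. The hostname is just the example from the paragraph above; the point is that a plain DNS answer like this arrives unencrypted and unauthenticated, which is exactly the gap DNSSEC set out to close:

```python
import socket

def resolve(hostname: str) -> list[str]:
    # Ask the system resolver for the IPv4 addresses behind a hostname.
    # The answer is neither encrypted nor signed, so nothing proves it
    # really came from the domain's owner.
    infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    return sorted({info[4][0] for info in infos})

if __name__ == "__main__":
    print(resolve("cloaklabs.com"))  # e.g. ['50.19.227.130'] in the article's example
```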

Enter Technical Debt

Technical debt is a metaphor referring to the eventual consequences of poor system design, software architecture, or software development. The debt can be thought of as work that needs to be done before a particular job can be considered complete or proper. If the debt is not repaid, it keeps accumulating interest, making it harder to implement changes later on.

Technical debt is incurred in almost every software project of any complexity. There’s always a nagging bit that could have been done better, an interface that should have been exposed, test cases or documentation that should have been written. Projects incur technical debt because of lack of time, lack of resources, sloppiness, poor training, poor vision, and countless other reasons. However, there’s little doubt that most projects would never ship if all the technical debt had to be paid beforehand. The Internet is not a single project; it is a mammoth collection of smaller projects, and the core projects carry significant debt on the security side.

Move fast and break things. Unless you are breaking stuff, you are not moving fast enough. – Mark Zuckerberg

The entire internet security industry, including Cloak Labs, owes its very existence to the technical debt incurred by the designers of the Internet. Of course, the Internet would still be under construction if it had been required to be secure from the very beginning; you wouldn’t be reading this in a web browser or on a mobile device, and you’d be wondering what to do with all the free time you would have had from not having to keep up with Facebook and Twitter. Those closely concerned with cyberwarfare have even argued that we shouldn’t have an Internet at all, given the risks it creates for our modern infrastructure (power, water, financial services, transportation, healthcare, and more). Have we amassed so much technical debt that our economy will eventually crash, not unlike what happened with mortgage debt in 2008? If you had been one of the inventors of the Internet, what would you have done differently knowing what you know now?

When you park your data, do you hand your keys to the valet?

August 5, 2014 / Security / 0 Comments

Security remains a top concern for IT managers considering the cloud. This is symptomatic of trust issues when working with cloud providers. After all, when you hand your precious data over to a cloud provider, in general you are also handing over the keys! Just like when you valet your car: you’ve never met the young gentleman with the red vest and the bow tie, but you are handing him the keys to your brand-new Mercedes. Your corporate data could be worth much, much more.

Once you’ve handed over the keys, you have no control over how the valet handles them or who they might be shared with, and the same goes for the keys to your data. The primary fear is that hackers might gain access to your data and exploit it, resulting in damage to your business reputation and real financial loss. Cloud applications, even when they encrypt data in the cloud, keep the keys to your data somewhere. Exploiting the application may reveal those keys, or the application might be manipulated into revealing your data.

Inadvertent release (leakage) is also a possibility. The application may have a bug or security hole that someone might stumble into: the application thinks you’re asking for your own data, but the requester is actually someone else. Such errors are unfortunately all too common.
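
The classic version of that bug is a lookup keyed on a record ID alone, with no check that the record actually belongs to the caller. Here is a hypothetical sketch in Python; the database helper and field names are invented for illustration, not taken from any particular product:

```python
# Hypothetical handler: the document is fetched by ID alone, so any
# authenticated caller who guesses or increments an ID can read
# someone else's data.
def get_document_insecure(db, document_id, current_user):
    return db.find_document(document_id)  # missing ownership check

# The fix is to verify that the requester owns what they asked for.
def get_document(db, document_id, current_user):
    doc = db.find_document(document_id)
    if doc is None or doc.owner_id != current_user.id:
        raise PermissionError("not your document")
    return doc
```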

Then there’s access by state actors. For data stored in the US, the US government has several legal tools available to get access to your data without you ever being informed. If you’ve given your keys to the valet, the feds present the right legal documents to the valet (a subpoena, a national security letter, sometimes even less) and the valet gives them your data. Recently, a New York court held that the US also has legal authority to request data that is stored overseas! This brings new legal risks for data stored everywhere. Foreign authorities may start making requests for data stored in the US by US companies that have overseas subsidiaries, and European efforts to keep data in-country may become moot if the providers have a presence in other countries. One of the key benefits of the cloud is making location irrelevant, but if this New York judge’s decision is upheld there could be a significant legal downside. Companies will feel that if they hold data in their own data centers, at least they will be informed when authorities request it.

At Cloak Labs we don’t hold our customers’ private keys. Only the sender and recipient of a message can read its contents. We don’t have to ask you to trust our cloud infrastructure: encryption and decryption happen on your premises, and our cloud infrastructure just queues and transports encrypted messages. Were we to be subpoenaed for your data, we would of course be legally forced to cooperate, but all we could provide the authorities with is highly encrypted messages. We don’t wear shiny red vests and bow ties, and we don’t have the keys to your data.
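
For illustration only, here is a minimal sketch of that general pattern: a fresh symmetric key is generated on the sender’s premises and wrapped with the recipient’s public key, so only encrypted blobs ever reach the transport provider. It uses the third-party Python cryptography package and a textbook hybrid scheme; it is not Cloak Labs’ actual protocol, and all names are invented:

```python
# pip install cryptography  (illustrative sketch, not any real product's protocol)
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

# The recipient generates a key pair; the private key never leaves their premises.
recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_public = recipient_private.public_key()

def encrypt_for_recipient(plaintext: bytes, public_key) -> tuple[bytes, bytes]:
    """Sender side: encrypt the message with a fresh symmetric key,
    then wrap that key with the recipient's public key."""
    session_key = Fernet.generate_key()
    ciphertext = Fernet(session_key).encrypt(plaintext)
    wrapped_key = public_key.encrypt(
        session_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    return wrapped_key, ciphertext  # only these opaque blobs reach the cloud queue

def decrypt_from_sender(wrapped_key: bytes, ciphertext: bytes, private_key) -> bytes:
    """Recipient side: unwrap the symmetric key, then decrypt the message."""
    session_key = private_key.decrypt(
        wrapped_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    return Fernet(session_key).decrypt(ciphertext)

wrapped, blob = encrypt_for_recipient(b"quarterly payroll file", recipient_public)
assert decrypt_from_sender(wrapped, blob, recipient_private) == b"quarterly payroll file"
```

A transport provider sitting between the two parties sees only the wrapped key and the ciphertext; without the recipient’s private key, a subpoena of the provider yields nothing but encrypted blobs.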