
Summary:

Many expect American cloud companies to suffer in the aftermath of the PRISM controversy, but will European cloud providers come out ahead?

Jason Hoffman (Joyent), Markus Rex (ownCloud), and Dan Gillmor (Arizona State University) at Structure:Europe 2013. Photo: Anna Gordon/GigaOM


Session Name: Post Prism Panopticon: Confidence in Public Cloud Computing

Chris Albrecht
David Meyer
Dan Gillmor
Jason Hoffman
Markus Rex
Joe Weinman

Chris Albrecht 00:00

with Dan Gillmor, author and professor at Arizona State University, Jason Hoffman, founder of Joyent, and Markus Rex, CEO of ownCloud. Please welcome our next panel to the stage.

David Meyer 00:28

As you’re aware, if you’d been sitting in on other sessions, it’s a topic that keeps coming up again and again related to anything cloud. Now we’re going to tackle it straight on. We’ve only got about 23 minutes and 44 seconds left, so let’s go straight into it. Dan, we’ll start with you. Do you think people are really worried about the surveillance revelations, the NSA stuff? I’m talking about consumers, enterprises, and do you think that is going to translate into action or inaction, in terms of not going into the cloud, potentially?

Dan Gillmor 01:08

Depends who we’re talking about, some people are terrifically worried, and I believe with justification. Those would include, on the individual level, people who think that in America the Constitution means something. People who believe that they should have a right to privacy of some kind, people who believe in little things like due process and Bill of Rights, these kinds of things, and people who believe that a pervasive surveillance state is deadening to society in ways that we can easily see from other nations where it’s happened. Companies should be very worried about it. The NSA has sabotaged some of the underpinnings of online communications and commerce. Sabotaged, the only good word for it. That would worry me if I were an enterprise, especially if I were an enterprise based in the US that tries to sell things to other people using security technology, where the assumption must be that it’s been compromised.

Jason Hoffman 02:26

I’ve seen specific examples when the Brazilian revelations came out, some of the things around Germany, where there were German customers, Brazilian customers put orders on hold.

David Meyer 02:35

Really? You must have seen great growth and demand for your product, I assume. This is very privacy-centric.

Markus Rex 02:42

Right. What was interesting was what we saw when the first revelation came out: there was a significant spike in access to our servers, and there was another spike when the first backdoor story came out. That was when we saw the second big spike, and people obviously started to be more interested in other solutions than just pure online cloud.

David Meyer 03:05

Has it just been a spike, or are we talking about a general upward trend?

Markus Rex 03:10

We saw more downloads of our free version. Clearly that has gone up quite a bit. As with all sales, enterprise sales take time, so we'll see. Ask me next year.

David Meyer 03:23

Let’s talk about the technical side of this a bit. Let’s talk about policy, first. Do you think that US providers can actually offer secure services in Europe, because this is obviously a big selling point for a lot of local cloud providers there. They’re playing a little bit on perception, but do you think a US provider can offer services in Europe and say, “The NSA isn’t nosing into this because it’s sited elsewhere,” Jason?

Jason Hoffman 03:53

I think it depends on the service. If you're thinking about someone running inside a virtual machine on a piece of block storage, the only way you can actually see that data is to have access to the virtual machine, or if it actually goes over a network below certain levels of encryption. A lot of the issues come from the fact that, even as a provider, there are people on those networks, you're on the internet, and the internet's a big sewer anyway; just about anyone can go and look at what's going on and very often exploit vulnerabilities on the customer side. The thing that's even more surprising, even as a provider, is that a lot of the tactics employed in this are the same tactics you'd use if you were running a botnet, if you were a spammer, if you were just trying to scrape pages from someone: listening on a network, looking for things, doing a lot of those typical exploit steps. As for the idea that one can come in from the provider standpoint and get at it from underneath, we don't normally even have access to that data, because it's usually sitting in a file system on a block store on a big black-box piece of storage.

David Meyer 05:13

In terms of the Patriot Act, Microsoft caused quite a stir a few years back when they said, "Yeah, even if the data center is in Europe, it's still beholden to the Patriot Act." What's your understanding of that arrangement?

Jason Hoffman 05:26

A lot of times when that stuff comes in, and I've seen it come in from the DEA all the way to the other three-letter agencies, they're actually typically fine with the response. We're like, "Well, it's in a database, in an operating system, on a file system, and we don't have access from below that. We're literally just a processor at that point, or we're running a VM and providing a piece of block storage. There's no way to discombobulate it from our level." Usually the only way one can really, truly get access to that is to actually break into the VM or listen on a network.

David Meyer 06:09

What do you think of the perception side of things? Do you think providers can actually promise security or do you think that there are some kind of rascally outfits out there who are saying, “We can do security!” but they don’t really know what they’re talking about?

Dan Gillmor 06:28

I’m not enough of an expert on that to give you a good answer. I will tell you what Bruce Schneier, who is an expert on these things has said, which is that it’s, at this point, safer to trust open source things, although we can’t absolutely trust anything anymore, but you’re probably better off with open source than a commercial provider when it comes to security products.

Jason Hoffman 06:53

And to add to that, if we look at the basic issues around security, the word that we don't talk about enough is integrity. Think about those two words: security is the prevention of corruption, the prevention of crime. Integrity is the detection of corruption. We can't be perfect at preventing crime, but we can be perfect at detecting crime. You can sit down and think about methods, for example, several that were developed in Estonia with the Estonian government, like what the Guardtime guys have done, where these are not about completely prohibiting people from looking at your data. They're about being able to say whether anyone has modified it in any way, in particular in a way where it's still usable, not obviously corrupted, but leads you to make a wrong decision. Is there a way to go and verify that a crime has occurred in a completely compromised, untrusted environment? I think we all have to take the same position that the NSA has internally, and that is that security isn't possible. Everything's been compromised, everyone's in your stuff, trust no one. So all you're basically left with is being able to verify that someone hasn't messed with your stuff in a way that leads you to make a bad decision.
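To make that detection-over-prevention point concrete, here is a minimal sketch, purely illustrative rather than Joyent's or Guardtime's actual mechanism: hash data when you write it, keep the digest somewhere the storage provider cannot silently rewrite, and re-check it on every read. The file name and the in-memory "ledger" are hypothetical stand-ins.

```python
# A minimal sketch of "integrity as detection": hash data at write time,
# keep the digest out of band, and verify it at read time. This does not
# stop anyone from reading or changing the data; it only makes a silent,
# still-usable modification detectable. Names and paths are illustrative.
import hashlib
from pathlib import Path

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def store(path: Path, data: bytes, ledger: dict) -> None:
    """Write the data and record its digest in the out-of-band ledger."""
    path.write_bytes(data)
    ledger[str(path)] = digest(data)

def verify(path: Path, ledger: dict) -> bool:
    """Re-hash what is on disk and compare it with the recorded digest."""
    return digest(path.read_bytes()) == ledger[str(path)]

if __name__ == "__main__":
    # In practice the ledger would live with a party you trust, or be widely published.
    ledger = {}
    record = Path("order.txt")
    store(record, b"ship 100 units to Berlin", ledger)
    print(verify(record, ledger))                    # True: data is as written
    record.write_bytes(b"ship 900 units to Berlin")  # quiet, still-usable tampering
    print(verify(record, ledger))                    # False: the change is detected
```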

David Meyer 08:11

Markus, what’s your take on this, because your product is open source, and what level of promise to you give around the security that you provide?

Markus Rex 08:20

What really matters is that people who are interested in finding out what goes on, have a chance to check for themselves whether there is anything in there.

David Meyer 08:34

You mean back doors in the code?

Markus Rex 08:35

In the code, yeah. Honestly, I've done open source for many years now, and I don't know of anybody that actually did check, but the fact that they could is a great promise. The point is, for example, if somebody came and wanted to do something to our stuff, could they order us to put some things in? Maybe, but other people would see it and would notice.

Jason Hoffman 08:59

The subtlety on that is that most back doors are not explicitly put in. Most back doors are bugs, and they're mistakes, and they're things that are being exploited in a particular way. The only time it gets suspicious is, for example, when you have things like the Stuxnet piece of malware or Flame. Those actually used zero-day vulnerabilities in Microsoft software that were known for four years. When you see things that don't get fixed after being reported, that's suspicious. But most of these things are relatively innocent bugs that get exploited.

Markus Rex 09:37

Of course, but nevertheless, if it is more open, the chances that things are being fixed are significantly higher.

Jason Hoffman 09:47

And you have more people looking at it.

Dan Gillmor 09:51

Here's an interesting thing on open source: even there, what if the NSA spent what is, for them, a trivial amount of money paying someone to become part of a community that's developing something, and five years in, that person does a commit of something that looks harmless? Everyone trusts that person.

Jason Hoffman 10:08

It’s really, really, really difficult to create a bug on purpose. It really honestly is. It is significantly easier to have an open market for exploits to actually purchase malware that hits those exploits. It’s a lot easier to take those developed things against those exploits than it is to actually obfuscate a bug. It’s one of these things where it’s part of the creative process of writing software, it’s not apparent to people, it becomes apparent during use. It’s a lot easier to buy exploits than it is to create bugs from scratch. It really, really is.

David Meyer 10:50

Obviously trust is fundamental to this whole thing, and there's been a lot of suspicion around the public statements of companies, and as you say, they don't necessarily insert a back door per se. What's your take on that in terms of your Microsofts and Googles and Yahoos, the general counsels and CEOs coming out and saying, "We don't know anything about this"? Do you think they're being honest?

Dan Gillmor 11:17

Some of them probably are saying what they believe. I would just ask each of them in turn, in the US, "Do you have security clearance?" If they don't, then I would ask, "Do you have a group inside your company that does?" And that group doesn't talk to them, so how could they know? They can make assurances without knowing.

David Meyer 11:42

What level are those groups at? How high up the chain are we talking?

Jason Hoffman 11:46

The best indication of how involved a company is, is really how rapidly they fix exploits. If you look at the Transparency Project at Google, they actually audit things pretty heavily. You really can't do anything on those systems without people knowing. What I would suggest you look at is the history of something being reported versus something being fixed. For example, if an exploit is reported and four, five, six years later it's not fixed, and yet somehow there was a piece of malware with a four-year life cycle on it, that's a suspicious bit of involvement.

Dan Gillmor 12:29

There’s some interesting research from Matt Blais, and others at University of Pennsylvania that suggests that, they don’t have it nailed yet, but they’re looking into the possibility that things that are rapidly iterated, like Chrome and like Mozilla Firefox, may be considerably safer because the malware people have to keep finding new exploits.

David Meyer 12:58

What’s your take?

Markus Rex 12:58

It’s very funny, we had a very similar discussion inside ownCloud with that same argument. We say, better be very quick at everything because it makes it more difficult for people.

Jason Hoffman 13:11

It’s expensive to write software. You don’t have people around doing it. We all known, just look at the new iPhone coming out, now you have to make different images, you’ve got to do this, you’ve got to do that, you’ve got to do the same thing if you’re making bad software. New iPhone, new exploit. It’s not easy.

David Meyer 13:33

What are the things that the general public doesn't know yet, that are extremely likely to come out? Give away the secrets of the industry, please. We're talking ideally knowledge, but also healthy educated guesses. Perhaps, Jason, start with you? In terms of security.

Jason Hoffman 14:00

In general, when you have an encryption method that's very easy to do and process, it makes it significantly easier to use hardware-based methods to crack it. In particular, anything less than 128-bit SSL can basically be done in real time on a deep-packet-inspection-type appliance. It's not a big deal.

David Meyer 14:26

128-bit? What’s safe, what isn’t?

Jason Hoffman 14:31

Depends on the method.

David Meyer 14:33

But hardware is the way through?

Jason Hoffman 14:36

Yeah, if you can have crypto-enabled pieces of hardware, then you can have decrypto-enabled pieces of hardware. We also have to keep in mind what people very often forget: nothing in DRAM is ever encrypted. If a private key is in DRAM and someone has access to the operating system itself, it's trivial to steal someone's entire private key infrastructure. So the reliance on keys and on that trust model, versus things where you don't need keys and you don't need trust, that's where we're really going to have to move to. If you look at the fundamental insider threat that Estonia faced at one point, they literally had to develop keyless integrity checking, and ended up with Guardtime and things like keyless encryption and shared-secret encryption. You start from the idea that everything's been compromised and nothing can be trusted. I would just suggest people get over security. Nothing can be secure. Stop trusting things, stop trying to secure them. Just figure out a way that you can verify you're not going to be working off a false positive.
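One way to picture the "keyless" part is a hash-linked log: there is no private key that could be lifted out of DRAM, only hashes, and publishing the latest link somewhere widely witnessed makes any later rewrite of an earlier entry detectable. The sketch below shows just that linking idea, greatly simplified; it is not Guardtime's actual protocol, and the entries are made up.

```python
# Toy hash-linked log: each link's hash covers the previous link plus the new
# entry, so rewriting any earlier entry changes every hash after it. Nothing
# secret is held in memory; verification only needs the published head hash.
import hashlib

def link(prev_hash: str, entry: bytes) -> str:
    return hashlib.sha256(prev_hash.encode() + entry).hexdigest()

def head(entries: list[bytes]) -> str:
    """Fold the whole log into a single head hash."""
    h = "genesis"
    for entry in entries:
        h = link(h, entry)
    return h

log = [b"2013-09-18 invoice #1", b"2013-09-19 invoice #2", b"2013-09-20 invoice #3"]
published_head = head(log)              # imagine this value is widely published

log[0] = b"2013-09-18 invoice #1, amount quietly changed"
print(head(log) == published_head)      # False: tampering is detectable, no keys involved
```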

Dan Gillmor 15:53

Go live in a cabin in the woods.

David Meyer 15:58

How do you deal with key management? And secondly, do you think that it is worth trying to encrypt and secure, or not?

Markus Rex 16:07

I think there are two different scenarios that we have to look at. One is, if somebody like the NSA, with a certain amount of resources, actually wants to get at things, it's hard to protect yourself against that. The other thing is, if you're a medium-sized business and it's another medium-sized business coming after you, it's significantly more difficult for them to do something, and against that you can relatively easily protect yourself, and you should. I agree with you, everything is compromised, but there are certain things where I don't have to make it that easy. I can still lock the door.
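Rex's "I can still lock the door" maps naturally onto encrypting data on your own machine before it ever reaches a provider. Below is a minimal sketch using the Fernet recipe from the third-party cryptography package; the key management question Meyer raised is deliberately waved away here, with the key living only in a local variable, which is exactly the in-RAM exposure Hoffman describes.

```python
# "Locking the door": encrypt client-side so the provider only ever stores
# ciphertext. Minimal sketch with the third-party `cryptography` package
# (pip install cryptography). Key management is glossed over: the key sits
# in a local variable, which is the in-RAM exposure discussed above.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()        # keep this off the provider's infrastructure
box = Fernet(key)

ciphertext = box.encrypt(b"customer list: ...")    # this is what gets uploaded
print(box.decrypt(ciphertext))                     # only the key holder can read it back

try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)   # wrong key
except InvalidToken:
    print("a different key can neither decrypt nor silently alter the data")
```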

Jason Hoffman 16:50

And we’re largely saved by the fact that a lot of us are not interested. A lot of us don’t have interesting data. A lot of companies don’t have interesting data.

Dan Gillmor 17:00

I spend a fair amount of time with journalists talking about this and explaining to them that they have to understand what the threat model is. What are they worried about? Protect as best as possible against that, and that applies to all of us. What are you worried about? Do your best against that, and understand that if a nation-state, like the one I'm from, is after you, you're really not going to be able to stop them unless you're incredibly lucky and good.

Jason Hoffman 17:28

If you data-mined my Facebook, and Path, and Instagram, you’d see that I like wine, food, and my kids. By all means, call me a criminal.

Dan Gillmor 17:39

I don’t like the argument, “Well, I have nothing to hide” because when someone says that to you, the response to that is, “Fine, then you won’t mind when we install cameras in your bedroom, bathroom, the rest of your house, and by the way, we’d like you to wear a camera and microphone around with you every day and we’ll store it all. Check back when we feel like it.” That’s for people who have really nothing to hide.

Jason Hoffman 18:05

The concern is like that DEA story that came out. There are plenty of ways you can function like a spam bot on Twitter and data-mine a lot about people on Twitter, or data-mine a lot about people on Facebook, as a result of behaving like a spammer. The big concern around what the DEA story exposed was–

Dan Gillmor 18:27

You should probably tell them what that is.

Jason Hoffman 18:29

These guys were data-mining the internet and coming up with, effectively, the potential possibility that someone might be a criminal, and then they tipped off an agent to pull them over for some other reason, and then they went back and filled in the due diligence process. The main concern is that you're basically identified in some way, and then that kicks off a process that fundamentally covers up the fact that you were illegally discovered. I think that's a big problem.

Markus Rex 19:07

I think this is a major problem. It goes back to the question of what a government should be allowed to do. I certainly think that a targeted investigation is a very fine thing. Governments have been doing this for the longest time; why should we prevent them from doing it?

Jason Hoffman 19:22

But at the same time, they arbitrarily make laws and sometimes make up crimes.

Markus Rex 19:29

True, but the thing I'm worried about is the dragnet approach, where everybody is considered a suspect until proven otherwise. I just don't know that I like that.

Jason Hoffman 19:39

But that, for example, is what is supposed to be illegal in the United States.

Markus Rex 19:45

You said the magic word, “supposed”.

Jason Hoffman 19:47

Supposed to be illegal, meaning we all thought that walking down the street, having weed fall out of your pocket is not supposed to be a crime. Nothing’s ever happened to me. I just live in San Francisco.

Dan Gillmor 20:06

You can move to Colorado where it’s completely legal.

Jason Hoffman 20:09

Is that where you live? Nice.

Dan Gillmor 20:14

He said, hopefully.

Jason Hoffman 20:16

Nice, nice. Email me your address, Dan.

David Meyer 20:20

The numbers that have come out so far, the disclosures that have actually been permitted so far, seem quite small. I'm talking about the numbers that came from whichever vendors it was in the US who said, we've had this many requests, subpoenas. Are those–

Dan Gillmor 20:41

From people who’d been lying for years? I totally believe them.

David Meyer 20:47

Is that back-fill, from what you can guess?

Jason Hoffman 20:53

If you bookend it, on one hand it could be back-fill, where you sit down and say, "Oh, we actually data-mine everything, and 1% is suspicious, so let's make sure we go through our process." On the other hand, that might actually be the case. The only saving grace, in my mind, is that if you take Google, Facebook, and Microsoft, for example, they have exceptionally large footprints, so the idea that somehow, somewhere, magically there's a government data center footprint the size of Google plus Facebook plus Microsoft that allows them to have copies of all of that is, "Hm." You can't hide that. That's literally a data center you can see from outer space.

Dan Gillmor 21:37

They’re building one hell of a big one in Utah.

Jason Hoffman 21:40

It made the news because it's big, but it's actually not that big. It's a relatively small data center. It's a single-digit percentage of even Google's footprint.

Dan Gillmor 21:49

Of the things they want and need to store, there are some where metadata is all you need, like what link did they go to. There are other things where you don't need the actual contents because you can go back and get that. And other things are probably being stored, including phone calls, which they say they're not storing. I don't buy that; I think they probably are.

Jason Hoffman 22:17

I plead the fifth.

Markus Rex 22:19

You only need the metadata, and if you have your back doors you can get the rest anyway, if you need it, and you don't have to pay for the storage.

Dan Gillmor 22:27

You have the internet archive.

David Meyer 22:30

Let’s bring it back to the topic of the whole show which is the cloud industry, particularly in Europe, do you think that the cloud industry here really has a lot to gain? Is that going to actually happen? There’s been some wild projections, some very high, some not so high projections of how much is this going to hurt the US cloud industry. Is that going to flow to Europe?

Dan Gillmor 22:54

It will if you buy the idea that the European governments aren't going to insist on doing exactly the same thing with the providers within their own jurisdictions. My sense is that a lot of the outrage from leaders in places like Germany is that they reserve the right to spy on their citizens for themselves, rather than leaving it to my government. So we'll see.

Markus Rex 23:17

I would wholeheartedly agree with that. There is a different perception of what the target is. If the target is to go after some economic data, people might feel, "I really want that in my country," because they feel that their own government is not necessarily spying on their trade secrets, which might be different if the data is in a different jurisdiction. But for personal stuff, well–

David Meyer 23:46

What do you think of that?

Jason Hoffman 23:47

Reality doesn’t matter when it comes to these things. Perceptually I think there’s a very big market opportunity for someone to capitalize on the perception of it. The truth is, when your service provider comes down to trust warranting indemnification and so it depends on who you trust. Who will actually warranty a contract? Who will indemnify in the contract? Who will actually make a promise and put it on paper? If you can couple that with a great perception in the market that you are the trustworthy one, I think there’s potentially a definite inside track for someone.

David Meyer 24:24

We’ve got a question from Mr. Joe Weinman over there.

Joe Weinman 24:27

A couple of quick thoughts on technology. On the one hand you've got some new techniques that are in various stages of reality: encrypted databases, homomorphic encryption, a quantum internet based on entanglement that you can't possibly eavesdrop on without screwing up the transmission. And on the other hand you've got the future cracking techniques, everything from leveraging GPUs; there's the guy that built the GPU appliance that can crack enormous passwords in 39 nanoseconds.

Jason Hoffman 25:05

Or mine Bitcoins, same thing.

Joe Weinman 25:08

Yeah, or mine Bitcoins. You've also got the quantum computer coming out of D-Wave Systems that's now at 512 qubits, so it has the potential to do cracking relatively efficiently. So, any notion as to any real technologies coming down the pike that might completely change the balance one way or another, as in much more secure or much less secure?

Jason Hoffman 25:34

Those come down to whether, in a perfect world, you can still do the keyless signatures based on time, so the Guardtime approach, or whether you'd actually be able to do homomorphic encryption on normal CPUs. If we actually manage to do those two, both of those are quantum-secure; you're not even going to be able to crack those with a quantum computer. A quantum computer right now is just that: if you're familiar with NMR and MRIs, you basically take a bunch of atoms and you put them in a magnet, and they either go with the magnet or against the magnet. You put energy in and they flip, so it's literally a state of one or a state of zero. Right now, most of us are encoding a bit in effectively eight atoms. Being able to get down to literally just a bit, and then a qubit, is the ultimate way to do a computer anyway. But homomorphic encryption and the keyless signatures are both quantum-secure; we just need to get to that point as quickly as possible. The keyless signature part is technically possible on millisecond to second time frames, but to actually do the equivalent with homomorphic encryption right now would sometimes take years. So if we can just get those algorithms down to the point where they're actually possible on chips, that would be a big win.
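To make "homomorphic encryption" less abstract, here is a toy version of the additively homomorphic Paillier scheme with tiny, hard-coded primes: a server can add two numbers it cannot read. It illustrates only the compute-on-ciphertext property Hoffman and Weinman are discussing, not the quantum-security claim, since textbook Paillier rests on factoring and the parameters here are deliberately far too small to be secure.

```python
# Toy Paillier cryptosystem with tiny, insecure parameters, only to show the
# homomorphic property: multiplying two ciphertexts yields an encryption of
# the SUM of the plaintexts, so an untrusted party can add numbers it never sees.
import math
import random

p, q = 61, 53                                    # toy primes; real deployments use thousands of bits
n = p * q
n2 = n * n
g = n + 1                                        # standard simple choice of generator
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
mu = pow(lam, -1, n)                             # valid because L(g^lam mod n^2) = lam when g = n + 1

def L(x: int) -> int:
    return (x - 1) // n

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 123, 456
c_sum = (encrypt(a) * encrypt(b)) % n2           # addition performed on ciphertexts only
print(decrypt(c_sum))                            # 579, recovered without ever decrypting a or b
```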

David Meyer 26:56

We’re out of time so I’m going to be very naughty and actually steal another couple of minutes out of the schedule, which is just to, very quickly, each one of you just say whether you think technology can fix this, or whether policy can fix this, or whether nothing can fix this.

Dan Gillmor 27:11

Both, we need both. We can’t rely on one or the other.

Markus Rex 27:18

Also both, and I think it's about due time that a discussion happens on what society wants to live with. Which kind of vigilance or surveillance do we want? Then we have to live with whatever the outcome is.

Jason Hoffman 27:30

Same, if we do the two things we just talked about with Joe, plus decide that we want to be a society that’s perfectly catching criminals and not preventing made-up crimes, then we’re good to go.

David Meyer 27:43

Thank you very much, all, for joining us.

Comments:

  1. I can confirm one important thing: PRISM is a huge issue for US cloud companies. Here in France, we already see a huge increase in businesses trying to relocate their data and cloud workloads. Great for us and bad for US companies.

  2. If you are a European company that stores data on US servers, or anywhere in the US, that data is subject to the Patriot Act. Now all the cloud vendors are trying to relocate their data anywhere but the US. Thanks for the read.
