From left, Michelle Finneran Dennedy, Jeremiah Grossman, Amit Mital, Erin Nealy Cox, Todd Simpson, and Laura Mather
Ray Ozzie of Talko
Mark Anderson of Strategic News Service
Michelle Finneran Dennedy
Vice President, Chief Privacy Officer, McAfee, a division of Intel Security
Jeremiah Grossman
Founder and Interim CEO, WhiteHat Security, Inc.
Amit Mital
Chief Technology Officer, EVP, GM IoT & Mobility, Symantec
Erin Nealy Cox
Executive Managing Director, Stroz Friedberg
Todd Simpson
Chief Strategy Officer, AVG Technologies
Laura Mather
Founder and CEO, Unitive
Mather: I’m thrilled that you all are here. My name is Laura Mather. Some of you saw me last night. Right now I’m working in the very exciting human resources space, but my background is cybersecurity, so I’m somewhat relevant to this panel.
But we’re not here to talk about me, which is awesome. Before we get started, I do want to point out that the way I like to run panels is, it’s all about you, right? You’re here for a reason, and if there are questions you have or comments you want to make, this format is actually fantastic for that. So we really want to encourage lots of interaction. We have microphones—I will tell you that if you raise your hand, you’re going to need to wait until the microphone gets to you because we’re recording this. But definitely we want to make this as interactive as possible.
So what we’re going to do is we’re going to start and have each of the panelists just do a quick introduction of themselves, start with sort of some overview questions, and then, again, the more this can be a discussion of all of this group, the better.
So, Todd, do you want to get started?
Simpson: Sure. I’m Todd Simpson with AVG Technologies. You probably remember AVG from 10 or 15 years ago as the free antivirus on PCs. We’re still a security company, but we now offer a lot more security solutions. We’re approaching a hundred million users on Android, in addition to all of our old PC base, and we offer not only antivirus but a lot of other malware protection. We talk about protecting devices, data, and people: your device, your data, and you as a human.
Mather: Great. Erin?
Cox: Hi, my name is Erin Nealy Cox, and I’m an executive managing director with Stroz Friedberg. We are a cybersecurity and investigations consulting firm. Before I joined Stroz, I was a federal prosecutor with the Department of Justice for almost ten years, and that informs my job at Stroz. I prosecuted economic espionage and cybercrime cases for the Department, and now, as part of my responsibilities at Stroz Friedberg, I run the incident response unit, whose teams of cyber specialists go into companies and assist them when they’ve been breached.
Mital: Amit Mital. I’m the CTO and also the GM of the IoT and mobile businesses at Symantec. Symantec is the largest security company in the world, and we have a very interesting and exciting challenge and opportunity in front of us with what is happening in security, along with enormous assets across the enterprise that we hope to leverage to provide better solutions for our customers.
Grossman: Hi, I’m Jeremiah Grossman. I am the founder and CEO of WhiteHat Security. But please don’t let the title fool you; I am a hacker. [LAUGHTER] And not like the code hackers you might read about. I’m the guy who actually breaks into systems. In my career I’ve found vulnerabilities in everything from Google, Facebook, and Microsoft to banks you probably use. The mission of the company I founded is to secure the web, and we do that by finding vulnerabilities in the sites and the systems before the bad guys eventually do, so that, hopefully, those issues get fixed and the web becomes just a little bit more secure.
Dennedy: Hi, Michelle Dennedy. I’m hopefully going to have a voice for at least one more hour. I am the chief privacy officer at Intel Security, formerly known as McAfee. I guess I would also characterize myself as a hacker of processes and myths about data privacy, and I do that during my day job. My team is responsible for compliance as well as product development at Intel Security. In my free time I run a site called The Identity Project to help educate parents and other vulnerable populations about the real impact of identity theft.
Mather: Great. So what we’re going to do is start with just getting a sense of what everyone on the panel thinks is the state of the union with cybersecurity. I don’t like to go in order again, so, Amit, do you want to go first?
Mital: Sure. Thank you. So cybersecurity, and I’d say the enterprise in particular, is at this perfect storm of need, opportunity, urgency. If you look at all the news that is happening in the last 12 months, with all the major breaches, you know, the number is escalating, the damage is enormous, and obviously, there’s a bunch more that’s not public.
If you think about why this is happening, one of the most fascinating things about security for me is that you have an active adversary, and so you have these really smart, really motivated, really creative, really agile guys creating amazing IP to steal your stuff and do damage to you. And we aren’t talking about dozens, we aren’t talking about hundreds, we’re talking about literally tens of thousands of people who are doing this day in and day out, and a breathtaking amount of IP created all the time. And so that by definition creates a huge demand for, quote/unquote, “the good guys” to create innovation. And so the opportunity for innovation is absolutely enormous. And because this data and the infrastructure that is being damaged is so important, security has now become the top of the CIO’s priority list. So as an engineer, this is amazingly exciting because it’s such an opportunity for continuous innovation, basically infinite amount of innovation. But also, from a business perspective, I believe it’s probably the most exciting place to be in the enterprise.
Mather: So let me ask something of the audience. When the panel was talking about cybersecurity, we were thinking about it in terms of sort of enterprise, consumer, and then even infrastructure as sort of a third category. How many of you think that the most threats right now are against enterprise or consumer or infrastructure? So how many people think the most threats right now are against the enterprise? A couple.
Washington: Are banks enterprise?
Mather: Yes, banks are enterprise.
Dennedy: That’s a good question.
Mather: Well, actually, no. Banks are enterprise until the attackers start infiltrating consumer accounts; at that point, the way I look at it, the attack is against the consumer. So enterprise, a few. Consumer?
Washington: I think your question’s artificially constrained.
Mather: Can we get a microphone?
Washington: So, sorry to disrupt your survey, but—
Mather: No, no, no, this is good.
Washington: But, you know, your question’s artificially constrained because ultimately what matters is how all of those, all three of those pathways eventually touch people. And if the enterprise is not secure, then individuals won’t be secure, and if infrastructure is not secure, then enterprises can’t be secure. And I think a lot of people are living without a clear understanding that these three ecosystems are all intertwined, and we can’t forget that. And so one of my interests, and my hope is that we have some discussion around how do you thoughtfully address the vulnerabilities when not all of those vulnerabilities come from your domain of influence.
Mather: Ah, so when you don’t have control over it.
Washington: Yes, exactly. Right.
Mather: So that’s actually a great question. And I want to be clear that, for this panel, we’re very thoughtful about—there’s a lot of really scary stuff we could talk about today. But the good news is there’s actually a lot of reasons to have opportunity and hope and innovation that is helping to protect, and so we’re going to try very hard to show you both sides of that coin, because it is not our goal for everyone to walk out and like close all their online accounts.
And is there another question or comment here?
Anderson: Yes, thank you. I agree completely with, I don’t know your name, but with what you just said.
Washington: Ken Washington.
Anderson: Ken. Yes, I’m Mark Anderson, and I believe that the wisest way to look at the threat landscape today, or portfolio, is through what people are after. And they’ll use any technique, as you describe, to get what they’re after. But they do kind of break cleanly into classes if you start thinking of them in terms of their own targets. And so if you do the card or ID thing, that’s one group of very distinct criminals from a certain number of countries and places. And then if you want to go after people who are taking crown jewel IP out of commercial enterprises, you’d have to use bring your own device to do it, or supply chain to do it. That’s a very distinct group of people in different countries. So I think that’s a very useful way of sorting this all out.
Dennedy: Can I just jump in with a point?
Dennedy: To sharpen that, I think we’ve spent a lot of time, in the tech industry in particular, looking for magic fairy dust that will fix everything for us, and I think what we haven’t really done in a root cause analysis is to get at where I thought you were going, and I think where your comment goes, which is the reason that people used to rob banks is because that’s where the money was. The reason people attack consumers, enterprise, and infrastructure is because that’s where the data is now. And if we’re truly living in a data economy, we need to stop looking at data as (a) being free, (b) not being an asset, and (c) being some sort of an exhaust fume that comes out of the technology. Because that’s what the hackers are after. They’re after personal data that they can leverage, they’re after enterprise disruption because of information flow stoppages, and then they’re up to no good in mechanical stoppages too, with mass SCADA system-type nation state attacks. But I think if we start to look at that corpus of what is the thing that we need to protect, I think it changes, and I think illuminates a new path of exploration for solutions.
Cox: And I’m going to jump in there too. I think it’s good to think about what the hackers are targeting. But in my line of business, and this was true at the Department of Justice and is true now at Stroz Friedberg, I have found that hackers are targeting almost anything; the buckets would have to consume every piece of data out there, right? So you can certainly think about it in terms of buckets of information and who the threat actors are. But if there’s any silver lining to the last 12 to 18 months, it’s the publicity these high-profile breaches have generated: some companies that used to think they would never be targets are now becoming aware that they in fact can be. You know, when I left the Department of Justice six years ago, I had been investigating these cases for years, and I would talk to potential clients and they would look at me like, you know, this is “War Games,” this is not going to happen to me. If there’s one thing we can say positively about what’s happening, it’s that more and more companies have come to the realization that they can in fact be targets for any of the data in their networks, and hopefully that means progress in terms of awareness and security, because ignorance is not bliss in this area of the world.
Mather: We have a question there?
Westby: Hi, Jody Westby. It’s kind of an add-on. In my work I do a lot of assessments of cybersecurity programs for large multinational corporations, from top to bottom, and the threat environment is very sophisticated. It involves all the actors Mark talked about, and it still involves some lone, random actors. It involves IP and sensitive, proprietary, confidential data, as well as customer data. So I prefer to look at this as an enterprise risk, and you have to look at all of those, because I see companies with vulnerable web apps and their websites being exploited. They don’t have the right configurations on their firewalls; they don’t have the right network architecture. There are so many different touch points that if you look at it as an enterprise risk and don’t try to break it apart, you don’t end up missing something you should have looked at. You have to look at it overall, because the threat actors are winning and we’re trying to catch up. Keep it as that enterprise risk, understand the whole environment and all the actors, and then ask: how do we best prepare for all of that?
Mather: Yes. Todd, did you have a comment?
Simpson: I was just going to comment that maybe, architecturally, we’re making it easy for people to target data as the asset. You know, we talk about privacy and best practices in collecting data, but we’ve evolved into a world where every device you own and every service you subscribe to is one of these vertical silos that takes all of our data and stores it in these nice, rich cloud databases, which become very attractive targets. That’s where a lot of the effort goes: breaking into those large, large datasets. So another component of thinking about the whole space is, are we designing the systems correctly? Should we step back and think about more distributed models, as opposed to putting all of our data into service after service and leaving ourselves exposed because of the way everything is architected today?
Mather: Did you have a question?
Audience: I was going to respond and ask about encryption: strong encryption for data at rest, data in motion, data in transit. It seems we’re still walking away from that as either too expensive or too cumbersome. Is that true, or should we be able to move to a world in which strong encryption is everywhere?
Simpson: Android and iOS have both started to step up to that, and on the web, HTTPS deployments are getting better and better. We’ve evolved from almost none to more, and I think it’s an accelerating trend, so I think we will see encryption much more widely deployed. It’s just one layer, though, right? There are obviously attack vectors through the current encryption protocols as well; as we’ve seen, bugs are still being found even in fundamental algorithms, bugs that lay dormant for 25 years.
Mital: Let me add to that as well. Encryption, as you know, has a substantial user experience component to it. As Erin said, most people hear about the breaches, they hear about the security vulnerabilities, but few people, at least historically, believed it applied to them: “It will never happen to me.” People were historically reluctant to pay the tax on the experience. Building it into the OS really helps, because then it hopefully becomes transparent to the user. But my belief is that we’re heading toward this idea of DRM everywhere, everything being encrypted all the time.
Mather: Can you define DRM?
Mital: Oh, sorry, Digital Rights Management, as an approach—
Audience: How about PRM, Personal Rights Management?
Audience: My data.
Mital: PRM, yes. I think that will, in the future, be one of the pillars of solving this security issue.
Mather: Yes. Go ahead Jeremiah.
Grossman: Encryption has to be table stakes, but with data at rest alone, you can’t do much. It’s necessary but insufficient. Let’s say Facebook or Google, or any bank, has all of its data encrypted. To an outside adversary like me, it’s not going to matter a whole lot. I’m going to use my browser, go to their website, and take all the data out, because the application has all the keys, and that’s the only thing that really matters. The adversary you’re really trying to stop with encryption is somebody with physical access to the machine or the ability to sniff the wire, mostly law enforcement.
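[Editor’s note] Grossman’s distinction, that transport encryption defeats a wire-sniffing adversary but not one who talks to the application itself, maps onto what a default TLS client enforces. A minimal sketch using Python’s standard `ssl` module (illustrative only, not any panelist’s product):

```python
import ssl

# A default client-side TLS context: this is the "table stakes" layer.
# It defeats an adversary sniffing the wire, but does nothing about an
# attacker who simply interacts with the application, which holds the keys.
ctx = ssl.create_default_context()

assert ctx.check_hostname is True              # server hostnames are verified
assert ctx.verify_mode == ssl.CERT_REQUIRED    # peer certificates must validate
```

Wrapping a socket with this context gives authenticated, encrypted transport; the application-level exposure Grossman describes remains untouched.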
Westby: One tiny point, which isn’t so tiny actually: one of the reasons people are targeting data, and it’s so easy for them to get at it, is that they don’t have people like Michelle in their companies.
Dennedy: Thank you Jody.
Westby: She knows how to integrate privacy requirements into the security program. Most people don’t, so they have a privacy officer who approaches this as a policy/legal thing and doesn’t really interact with IT that much. They’re not involved in the controls or any of the other stuff. In fact, most companies I walk into don’t have a data inventory. They know they have PII somewhere, PHI somewhere, they have stuff, but they don’t know who accesses it. She’s a big piece of this because she’s a privacy person with an engineering background, and she knows how to take privacy and get it into security programs—
Dennedy: This was not a plant. [LAUGHTER]
Westby: —and that’s one of the reasons data is so easily gotten to today. She didn’t pay me to do this.
Dennedy: I didn’t even know Jody was coming. Thank you. Just as a quick response, I wrote a book that is now an international best-seller, which for textbooks apparently is a very low threshold. [LAUGHTER] But we open-sourced it because we want people to see our methodologies; it’s in the back of the room if you would like a copy of the paper version. It’s called “The Privacy Engineer’s Manifesto,” and it’s a conversation we’ve been having with various people around this room for many, many years: taking what we knew from the 1960s, working with my father, who’s a coauthor, using UML, using data mapping, using the things even the youngest engineers and code slingers in high school hacker dens understand, business activity diagrams, looking at the assets systemically, and then figuring out where encryption really fits. What I’ve found as a privacy person in the security industry, and this goes to the lack of awareness Erin was pointing out, is that people believe they can just put a little encryption Band-Aid on it and be done. “Look, yay! Reasonable security! I met all of the executive order requirements.” And they’ve not really looked systemically at what’s going on, what this means for their customers, what this means for ownership of data and globalization. So I thank you for that, and I think the person or persons you need are starting to retire now, actually. I think we need the COBOL programmers; I think we need people who trained under Grace Hopper, who started out saying we have to have a systemic way of coding, not just an easy one, because we forked that way, and now we have to make it sustainable. I think that’s the big trend that’s really exciting me today.
Mather: So we can take a question here.
Sprague: This is Kara Sprague from McKinsey. Building on the point that you were making and the fact that cloning technology doesn’t work yet, what are we doing in a world where every company is at risk, our critical infrastructures are also at risk, and we just have a severe talent shortage? We can talk about teaching developers how to embed security protocols into their code, but is there an easier way to do this that’s simply based on hygiene?
Dennedy: Let’s start at the very top, and I promise not to monopolize too much, but I have such a passion about this. After the Target breach, what happens? We replace the CEO, then we hire a CSO who’s in charge of logical security. So now we’ve got physical security and logical security, and we’re holding ourselves like this, where someone can just sucker punch you in the face. Who’s sitting on that board of directors now? Where’s the chief privacy officer? Where’s the data asset and risk management person? Every board in the U.S. should have gotten scared. By January 1 we should have had every CEO calling the recruiters for the next board seat, saying, “Who knows about data asset management?” I mean, PII, personally identifiable information, is a form of IP, and we can all agree that IP, intellectual property, needs to be protected. Where are we sitting on our governance? That, I believe, is the one move that will trickle down. It starts with leadership.
Mather: And I have some data on that, and then we’ll get to another question. I went to an event by “Agenda Magazine”; they publish the magazine for board members. They polled the Fortune 5000, and of those, 126 companies had a risk-management board member. Out of the Fortune 5000. When asked if that was a priority for the next board seat, the overwhelming response was no. Just a little data about how boards see this. We have a question in the corner?
Anderson: Yes. I’d like to go back and underline what Jody said a minute ago, and I think all three of these comments. If you look at this from the human side of things, it’s absolutely right that the board is the fulcrum. I don’t know whether it’s the SEC, with a little tougher view forcing disclosure of all kinds of things, that will get the board on board, so to speak, but that’s got to happen somehow. They really have to be afraid for their own personal safety in terms of liability, I think, before they take action. They’ve got to understand that they have a responsibility, just like a fiduciary responsibility to the shareholders, to protect those assets. They’re not acting like that right now. You see it if you survey CEOs and CSOs, or whatever you want to call them: there’s this huge distance between what the CEO, who’s in denial, will say, “My company’s fine. It doesn’t need anything else. We’re spending enough now,” and what the chief security officer will say: “Absolutely not. We definitely need to spend more. We’re not good.” These guys know each other, but somehow they don’t agree. The board’s job is to fix that. The deeper problem, I think, comes back to what you were saying, Jody: we have to identify the assets. You said it very well, I thought. We work on this all the time, and the question is, if I go to any CEO, Ray Ozzie, say, and ask, “What’s your crown jewel IP?” almost no one can identify what that is. They have not been through the project of talking to their top people and actually, physically identifying where it is stored, how it is accessed, and what the policies are about the particular five people, rather than 5,000, who have access to those things. Those things haven’t been done properly. Just understand: if you’re a law firm and you’re going through a merger with someone in China, guess what, you’re going to get hacked. Have you put those documents and those bid numbers someplace that’s unplugged from the internet? No? Okay.
Then your guys are going to lose. It’s really important to identify what the IP is for Coke, or Ford. And trust me, Ford doesn’t know. No one knows.
Cox: Ken’s going to know. He’s only been there two months. [LAUGHTER]
Anderson: You and I talked about this before, I think. And Ford got hacked after our last conversation. We’ll have that conversation later. So anyway, it’s a big deal, and I think it’s a human thing: the boards have to apply the pressure, but then the CEOs have to put the pressure on their staff and say, “Let’s identify this important stuff, sequester it, and restrict access to it.”
Mather: Yes, but let’s be clear. How many direct reports of CEOs would say, “No, I’m good. I’ve got plenty of budget.” I don’t think any of them would, right?
Cox: And just to be clear, what CIO is going to go before the board and say, “We’re in a lousy position”? What CSO is going to do that? It’s inherently counterintuitive for those individuals to be there saying, “Yes, we have real problems.” The CEO is not going to do that. I wanted to echo a point you were making, which is that it is an enterprise issue, but what I’ve found over the years working with private-sector clients who have these problems is that, ultimately, they’re getting better. The CSO position didn’t really exist five or six years ago, except maybe at some financial institutions. Now we’re seeing more CSOs, and chief privacy officers as well. We’re seeing that companies need to think about it in a multidisciplinary way. It’s not just an IT problem and it’s not just the board’s problem; it involves almost every stakeholder within the company: corporate communications gets involved because it’s a crisis, the CFO gets involved because it’s an unbudgeted expense, the CEO obviously gets involved, the chief privacy officer gets involved, and of course the chief legal officer too, because of all the liability implications. So if all these people in a company are echoing the sentiment you’re echoing, that this is a hugely important issue for the company, and trying to get both the best inside and outside experts to help them with it, they’re going to be in a far better position.
Mather: And it sounds like people are moving forward with that. You had a question and then we’re going there.
Washington: Yes, I have a question I’d like to ask the panel. My name’s Ken Washington, and I’m the vice president of research at Ford. Six years ago I was asked by my company at the time, which was Lockheed Martin, to create a privacy program, and I served as their privacy officer. Through that experience I learned the importance of doing this not only from a governance point of view, but also of the fact that we were creating something from nothing, because no one had given it any priority before. The hardest part of protecting the data and the information, and building a system that could be responsive, was that we already had a bunch of infrastructure and a bunch of architectures in place, so you’re trying to Band-Aid or patch something that was fairly complex, already had significant sunk costs, and had a lot of momentum. This is not unique to the company I used to work for, which is the heart of my question: how do you deal with the fact that you’re dealing with immense amounts of legacy and a huge amount of momentum in a system? If we could architect the governance and the technology architectures to mitigate the risk today from nothing, it would be a whole lot easier. It wouldn’t be easy, but it would be easier.
Cox: Startups have it easier?
Washington: Startups have it a lot easier. Right. But most of us don’t have that advantage and I’m asking from the point of view of the challenges I now face at Ford which is, the automobile is about to get more connected than ever before and it’s in a situation where we can get it right the first time because no one had been thinking about cars as nodes on the Internet before, but now that’s going to be the case. So, we’re going to be architecting our solutions with the knowledge of the fact that we have to do it right the first time, and we’re committed to that, and I have the background from my former privacy officer and security days to do that. But I’d like to hear the panel’s thoughts, and maybe anyone else in the room regarding how do you deal with the fact that you have all this momentum and built up architecture, because you just can’t rip it out and start over again.
Mather: Go ahead.
Grossman: I deal with this exact problem every single day; no one empathizes with it more than a web security expert. It’s all very nice to have the conversation about how we build more secure systems, but it ignores the fact that the web is already built, so now what? If you’re up against professional hackers, guys who do this for a living, this is how they make their money, then, and I always bring the bad news, the bad news is you’re going to lose. You’ve got two choices, two strategies. One is to increase their costs, because they’re operating on an economic scheme too, so you have to increase their costs economically through whatever means possible, security through obscurity if you have to. The second is fast detection and response. Most of the damage from these breaches, pick your favorite Fortune 500 breach, was done because the bad guys were on the systems for months. If you can detect a compromise within a few hours or a few days and kick them off the systems, or get them out, it’s almost as if you were never breached, because the bad guys couldn’t really profit a whole lot. If you gave me root access on a bank, it would take me a long time to figure out how to get access to the mainframe system and extract money. It’s very difficult to do that, or to extract a gigabyte worth of information. With fast detection and response, the ROI changes.
One other point. One of the things you have to understand is that black hats have economics too. Imagine you’re a CSO and you have a million dollars to spend on defense. What does the bad guy have to spend to beat your million? Right now it’s probably no more than $100 to $1,000. If we look at our defenses, whether products, services, or the strategies we employ, the more we can bridge that gap, the more successful we’ll be. Right now it’s just way off. Pick your Fortune 500 company: a day, no more, to break in; that’s all it really takes.
Mather: Anybody else want to comment?
Mital: A couple of things. I fully agree with you, Jeremiah, on the ease of entry. Not only do the bad guys have to spend a lot less, but if you’re playing defense, which is what you’re doing when you’re protecting stuff, you have to be right all the time, without compromising the experience your customers have. If you’re a bad guy, you only have to be right one out of a hundred, one out of a thousand times, and you’re still ahead.
Cox: I like to say that bad guys don’t have to do QA. We have to do QA. [LAUGHTER] It sucks.
Mital: That’s one thing. The other is that the very thing that makes life easier for developers, a stable, homogenous, consistent operating system, also makes it much easier for the bad guy, because in the end they’re writing code and writing applications too. There are ways to constantly change the targets: obfuscation, polymorphism. Maybe that’s one of the interesting avenues people are investigating as well, leveling the playing field so that the attackers’ economic cost is equivalent. I’m not sure how well that works against state sponsors, because they mostly have deep pockets, but at least you begin to change the conversation.
Dennedy: I have another point that’s—
Cox: One quick point; I’m not going to reiterate, and I agree with everything that has been said. But I would push back a little on the idea that startups have it easier. In theory they have it easier: they have new systems and new technologies they can deploy. But I would suggest to this group that the tech sector is one of the most targeted sectors we have, and not a lot of people are talking about it because, unlike retail or hospitality, tech companies don’t have mandatory disclosure. We know there’s very valuable IP in the tech sector that’s being targeted by hackers all the time. And they often have bad security programs. Startups have the worst security programs.
Audience: They’re not putting any energy into it.
Cox: That’s right, because all their energy is into growth and R&D and hardly any of the energy is in security.
Audience: But that’s by choice.
Mather: This person’s been waiting.
Kvochko: Elena Kvochko. I’m from the World Economic Forum, and I’m responsible for our cyber resilience program. Talking about solutions, I would be interested to know who, in your opinion, should drive the cybersecurity agenda. Who should drive security of the online environment? Last year we did an executive survey together with McKinsey in which we asked about 300 companies who they think should take the primary responsibility, and the results were quite surprising. We have such a diverse panel here: fast-growing companies, a government perspective, industry leaders. So if you were to pick some leaders, who would you choose? Would it be law enforcement, the public sector, vendors, private-sector corporations? I guess you’re probably tempted to say, “We should collaborate,” but if it were a forced ranking, who would you pick as the primary driver?
Mather: I’ll make them choose one. You have to choose one. [LAUGHTER]
Simpson: I'll go first, and then you'll get some maybe better answers, because we're a consumer security company. We actually think a lot of this starts with having smarter users. There are a lot of tools available to users with which they can actually do a much better job—
Mather: I thought you were going to say, "We're a consumer company, so we think it's the government." [LAUGHTER]
Simpson: No. Obviously there's a lot of education that goes on in enterprises around privacy and security. There's also a lot of education that needs to go toward end users. We have a lot of tools and techniques that can actually make end users a lot more secure, which makes a lot of data more secure, et cetera. A lot of the attacks are still social engineering attacks. Password managers, real basic stuff that we have, but users don't deploy or use properly. We're actually starting an initiative called Smart Users, as opposed to smart phones: if the next two billion people come online on their smart phones, we'd also like them to be smart users. We launched this at the Clinton Global Initiative and would love to have other people help us work with the ecosystem to put better tools and smarter users behind the smart phones.
Mather: Does someone else—You have to say one. You can’t—
Cox: Okay. I’ll say one. I’m not going to pick the government because if we’re relying on the government we’re going to be woefully disappointed [LAUGHTER]. I think it has to be the private sector and doing it on behalf of their clients, and their consumers, and their customers. They have to be the leaders and I think that everyone has to be willing to give up some of the convenience. Security always comes at the price of convenience. Encryption is just one example of that. I think everyone has to be willing to give up some convenience to be more secure.
Mather: Other thoughts?
Grossman: Would you believe it, I’m a security person, I don’t trust anybody at all to do this. [LAUGHTER]
Mather: We have another question here.
Audience: I love the name of this panel which was “Civil Defense” because the first thing that came into my mind was air raid sirens going off and everybody running down into the shelters.
Mather: You don’t hear that. [LAUGHTER].
Audience: No. My question is, what makes us hear the siren going off? Running down into the shelters is not the answer, and if it's going to be civil defense, it's something beyond just the individual corporations, because I don't think individual companies can do it on their own. But I do believe you can enlist an army: there are a lot of developers out there with new ideas about how to make secure containers with virtual machines as everything moves to the cloud. I think we were saying, build forward. The legacy problem is just a horrible problem to solve, but building forward we could, I believe, make some real advances in technical solutions, if we were to enlist an entire civil defense corps for the Internet and for the digital age. It's going to take that kind of rallying cry to do it.
Mather: I think I'm starting to hear those sirens going off. Security budgets are going up, is what I'm hearing. I think some people said that earlier, right? We're not there, but it's much better than it was five years ago.
Mital: Three weeks ago Jamie Dimon, the CEO of JPMorgan, said he's going to double his security budget in two years, from a quarter billion to a half billion dollars.
Dennedy: And meanwhile he busted his CPO down from VP to director. It's this teeter-totter of when the siren rings and what our answer to the call is. Are we going into a bunker? Or are we—I like the answer Todd gave very much, and actually AVG has a great book out on digital parenting and protecting children; I can't think of the name of it, but it is good. I liked it. But I think there are a couple of things. One is, as Barton and I were discussing this morning, one of my feelings about Ed Snowden and his whole activities: I was ticked that it wasn't me. I've been crying and begging for people to care about this for almost two decades now, and I was pissed that it wasn't us that rang the siren. What will be the thing? How much worse does it have to get? I think the answer is we have to do the Randy Pausch head fake. We have to go into schools and we have to talk about STEM. There's a lot of money being spent on STEM, and I think one of the most exciting careers, deeply biased myself, is in tech, in cyber, in civil obedience, like massive civil obedience.
Mather: Can I make a point about that? CSI is actually coming out with "CSI: Cyber" in the next four months. Patricia Arquette is the lead actress, and I think this is amazing.
Dennedy: “Scorpion” has just come out and “Hackers.”
Mather: I think when CSI launched, the increase in people doing forensic science was like 1,000%. This can help. The media supporting this is also helpful.
Dennedy: I think so too, and I loved it when "The Big Bang Theory" came out because they talked about encryption, but of course the portrayal of women is just abysmal. But I do so want to—[LAUGHTER] I know, we're not actually all stupid. It's shocking. We can walk and talk. I know the portrayal is like these guys [LAUGHTER]—I know. I married one of those guys who is just like that, so I have a soft spot in my heart.
But I also wanted to respond to the question of balance, and I think we are giving this a very Western perspective, because having had these same conversations last week throughout Europe, which is why I sound like this, I think the answer there would be government, government and heavy regulation, particularly in the civil-law countries over there. I don't think that's necessarily the answer, because I don't think it opens things up to this self-defense and self-knowledge; I don't think there's any big daddy that's going to come and save us in big data. I think we need to figure out how to make kids smart about ethics, and morality, and legality, and technology, and I think all of those are great careers. Under programs such as the Department of Education and the Entertainment Network, we're launching a huge thing on Thursday, which I'm probably not supposed to preannounce, but we're going to reach 50 million kids this year and teach them how to be safe online. I'm very passionate about that. Yay!
So I'm really passionate about it. But to that point, I don't know: what's it going to take for people to care?
Grossman: After you have your first heart attack you buy some running shoes, and that's pretty much how it works. Information security doesn't have a whole lot of—there are no dead bodies yet. Yet. I don't like to bring this news forward and be all doom and gloom, but that will be an eventuality. We're going to have pacemakers with Wi-Fi access.
Audience: And the infrastructure hasn't been kept up, and that's what I'm afraid of.
Grossman: Medical devices running Windows XP.
Cox: Imagine if you could get control of the President’s car, right?
Simpson: It also comes back to architecture. I don't need my thermostat to go to the cloud to control my furnace, but that's the way the world is being designed to think.
My thermostat should talk directly to my furnace. Architecturally, I think we're on the wrong foot right now. We back up absolutely everything to the cloud, which means that all of that data is always accessible. Data should be routed appropriately, as close as it can be to the use case.
Mital: The thing I wonder, the cultural question I wonder about from a consumer perspective, is this: our banking and credit card infrastructure in America, anyway, is set up so that, I think, the maximum liability is only $50, and not even really that. So you know that even if you get hacked, whatever, as long as—
Cox: Yes, we’ve incented consumers the wrong way.
Mital: Yes, exactly. The consumer says, "Okay, well, I might get hacked. It's unlikely because there are so many other people, but even if I do, I'll be made whole by my bank." That's one. Then somebody breaking into my identity and stealing my pictures, that's a Paris Hilton problem, not a "me" problem.
Dennedy: Now it's Jennifer Lawrence.
Mital: Jennifer Lawrence, exactly. That's a problem most people wish they had, being famous enough that somebody wants to steal their pictures. For most people, at least culturally, the downside liabilities aren't quite there. Once we get into this IoT world where my car traveling 70 miles per hour down the highway has the same energy as a stick of dynamite, which is true, it's about a megajoule, then once something happens there the conversation changes, but that's a debilitating conversation.
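The dynamite comparison holds up as rough arithmetic. Here is a quick back-of-the-envelope check; the 1,500 kg car mass and the roughly-one-megajoule-per-stick figure for dynamite are assumed ballpark values, not numbers from the panel:

```python
# Kinetic energy of a car at highway speed, to sanity-check the
# "stick of dynamite" comparison. Assumed values: ~1,500 kg car,
# dynamite at roughly 1 MJ per stick.
mass_kg = 1500.0
speed_m_per_s = 70 * 1609.344 / 3600     # 70 mph converted to m/s (~31.3)
kinetic_joules = 0.5 * mass_kg * speed_m_per_s ** 2
print(round(kinetic_joules / 1e6, 2))    # ~0.73 MJ, on the order of a megajoule
```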
Mather: I want to point out that we've only got around 10 minutes left, so if there are any questions you really wanted to ask that haven't been asked, please speak up. But we're going to go here and then there, and then I think there's one over here. So we've got a line, but—
Gellman: I'm Bart Gellman, and I want to be a Johnny One-Note. I did want to ask the other panelists about the implications of Michelle's point on Snowden and how it relates. I know security folks talk a lot about threat models, and if you're representing, or if your clients are principally, big enterprises worried about their IP, I understand why the U.S. government isn't a big part of your threat model. But to what extent do the Snowden revelations have any impact on the threat, on the mitigation possibilities, or on your thinking about how to protect what you're trying to protect?
Simpson: Again, I think we should be deploying a lot more peer-to-peer technologies and fewer centralized systems. Before AVG I worked at Mozilla on WebRTC, which is a generic peer-to-peer technology that's deployed now by Google and by Firefox, so it's in a billion web browsers around the world. We should encourage people to use technologies like that, fully encrypted and peer-to-peer; when they develop services, they should be deployed that way.
Mather: That brings us back to, I think it was Jeremiah's point: you're just making it more expensive, right? Because you have to hit multiple places.
Grossman: I can tell you what our customers tell us. WhiteHat is a cloud provider; we host very important data, and our international customers, let's say in EMEA, want us to have entities in EMEA and host there so their data doesn't leave. Rather than having all the data in the U.S., where we get economies and efficiencies, there's now a lot more data being hosted outside the U.S. That's not good, bad, or otherwise; it's just what's going to happen.
Gellman: Probably misguided in terms of trying to defend yourself from the—
Mather: Can you say it into the microphone? Sorry. What was that?
Gellman: I was going to say that’s probably a misguided instinct if you’re trying to protect yourself from the NSA.
Grossman: You have to remember many of these Fortune 500s are multinational conglomerates, right? They don't trust the U.S. government. They don't trust the NSA, so to the extent that they can move the data outside U.S. jurisdiction, that's what they're going to do. Right or wrong, that's what they're doing.
Gellman: They're opening additional legal and technical doors to the NSA by keeping it outside the U.S.
Grossman: Known risks versus unknown.
Mather: Well also where the NSA can spy versus can’t.
Mathews: I'm Dave Mathews, I'm a reformed hardware hacker. I'm using my powers for software now [LAUGHTER]. A couple of things. The phrase "We're from the government, we're here to help you" was always a favorite of mine. I was with the CIA of Estonia last month, and they had had a country-wide denial-of-service attack. The whole country uses smart cards for everything, from driver's licenses to payments, et cetera, and they literally had to throw the kill switch when hackers going after this small country brought the Internet to its knees. That was interesting to hear. A couple of days after that I interviewed the kids from LulzSec, which is a pretty notorious hacker group out of Northern Europe with some U.S. ties too. We did a live hack on stage for a project with the Mayor of London. I gave them access to hack my WordPress site. I said, "In the past HTML was flat. Now we're all database-driven." In the preview before the hack, we were hanging out at a friend's flat, and these kids were like basketball players: one of them would use the laptop for a minute, volley it to the next one, the next one would hammer on it, and what they came up with was this injection that looked just like my domain name, and it wasn't a phishing scheme. It's amazing, the vulnerabilities these kids find, and they go at them with the precision of a highly tuned team. The reason I think we're in dire straits now is that we have all this intellectual capability, with kids going after vulnerabilities. Right when broadband put every Windows machine on the Internet, those machines became botnets very quickly from people sending an email out. We're about at that point now with the IoT world, where it's going to be a refrigerator or a television sending spam email messages. How about the network-attached storage servers that are mining bitcoin—
Mather: Just to be clear, IoT is the Internet of Things, in case anyone didn’t catch that. Sorry.
Mathews: So the NEST thermostat doesn't need to talk to the cloud unless I want to turn my Tahoe house heater up, because it takes 12 hours to bring it up to temperature. The problem with the startups is they don't put money into R&D and security. There are Internet-connected light bulbs that send WPA keys in the clear. I think we need not only best practices, maybe a security consultancy, an open-source set of rules anyone can go by. But ultimately I think this also plays into the operators, the cable providers: the Comcast Xfinity router, which gives this extra capacity for someone else to jump onto my Wi-Fi alongside my own protected network. Those guys are the gateways. They're the ones keeping the packets either in or out of the house, and if my NEST thermostat is going crazy and my network-attached storage device is mining bitcoin, and we detect those anomalies, that's going to be the only real way we can stop this or turn it off, because the government is not going to kill America's Internet.
Dennedy: Just a quick comment back on the kids who can do this kind of hacking. First of all, I think of the cement guy this morning; did you see the cement guy rolling out his thing? How many billions of dollars and how much fossil fuel have we burned with cement trucks that had liquid in them? All he did was desiccate the cement and water it later, and that's a huge innovation. I think with some of these hackers, if we can figure out how to remove the water, the chaff, the waste from some of this data, that's a business. How do we get these guys into the business? The other quick point I wanted to make is that the investment community throws and we as companies doing M&A catch, and I think a minimum viable product has to have a basic privacy and security infrastructure. It's easily done. We did it with my nephew over a weekend, with one pizza, and we were able to take his app and turn it into an architecture that can be shown to a VC. This can be done, this can be taught, and I think that's a minimum viable product. We shouldn't be funding things that aren't, and we shouldn't be buying things that aren't, and we certainly at Intel are punishing companies that come to us with crap.
Mather: Todd did you have a question?
Simpson: I just want to come back to the point that we as consumers have to choose smarter products; otherwise the startups' incentive is just to move faster and ship garbage. We have to educate people, and we ourselves have to actually—
Dennedy: There’s always the WhatsApp unicorn that’s haunting us. “Oh, look, it’s garbage.”
Simpson: We have to make privacy policies readable.
Mather: And when you say consumers, are you saying enterprises consuming?
Simpson: Enterprises consuming or end consumers. We just have to make smart choices.
Grossman: So we should recognize that if we make it the telecoms' or the ISPs' responsibility to secure the pipes, then they're going to have to terminate SSL. That has a very real consequence. I don't know if we know what that leads to.
Mather: Yeah, let’s get back to the whole encryption thing.
Audience: Why is that?
Grossman: Because they have to be able to see the data. If it's all encrypted, every IoT device just tunnels through. What IoT gives us is more targets, more things to hack, even if it doesn't increase bandwidth utilization much; there are lots more things to hack. But the other thing, the thing no one likes to talk about, is the liability aspect. The reason we need minimum secure certified products is that we have this little thing called the EULA that purposely disclaims all liability for crappy software. When that goes away, we'll get more secure products. The EULA is killing us.
Mather: Define EULA please.
Grossman: I’m sorry?
Mather: What’s the acronym?
Grossman: I’m sorry. End User License Agreement.
Simpson: The thing that you click through on software?
Grossman: Yes, that thing.
Cox: I would agree with you; we have to have minimum standards of security. I'm conflicted about that, though, because once you impose a minimum standard it becomes a compliance regime as opposed to a security regime. That's the danger: companies just satisfy the compliance regime, say they're secure, and aren't actually paying attention.
Mather: But isn’t it better to have that instead of nothing?
Cox: I agree it's better than nothing, but my concern is that you then shift to more of a compliance mindset.
Simpson: You’re just checking the boxes.
Cox: You're just checking the boxes. We see this with PCI DSS.
Audience: And they won't spend any more than the level needed to be compliant.
Anderson: I’d like to be a bit of a wet blanket at this point.
Mather: No no. Can’t allow that. Sorry. [LAUGHTER].
Anderson: I forget your name, but I think the head of WhiteHat made the most important statement of the day, which is that he can hack into any corporation in one day. I'll just tell you a little story that is not classified; I'm telling you what I said. I was talking to Sir Iain Lobban, who until recently was the head of GCHQ, and he's a pretty smart guy on this subject. He was talking about how Britain is doing really well, and it's true, in getting government and businesses to work together to solve these problems. He said, "We're going to solve 80% of the hack problems when we get all done here." I said, "That's great, but you're not going to solve any of the APT threats that are going after the crown jewels that matter in the global economy. It'll be 100% failure for the tip of the pyramid that matters the most, so don't pat yourself on the back yet." From what you've said today, there's nothing wrong; I agree with everything you've all said. But the big problem is that this is insolvable. It's a radar-trap problem: every time you improve the radar detector they improve the radar, that game never ends, and it keeps all of us getting paychecks. But if the real goal is security for the enterprise, or security of some kind, I don't see it. I think we're all talking about the wrong things. I don't think anything we've talked about today will provide that. So the question I'd like to raise is: is the whole system, we're kind of at that moment, is the whole system just not protectable? Then we've got to start from scratch in some other way. Because this system that I see, even before the IoT happens, no one can protect it. WhiteHat can go in there in one day? And I don't think that's going to change.
Grossman: When it comes to hacking systems—I'm from Hawaii, so if you're in the surf lineup and a shark comes, you don't have to outswim the shark, you have to outswim the other surfers. The same is true in security. That means you have to be a little bit more secure than the next guy against an opportunistic attacker. If you are targeted, and the shark just wants to eat you, well, then you have a different problem altogether. But fortunately we don't have to make truly impenetrable systems and spend the ungodly sums of time and money to do it. We have to make them, time-wise, a little more secure. The distance between 10 minutes and 15 minutes matters a great deal, and I've seen this from personal experience: if you make systems 20-minutes secure, the bad guys will go elsewhere. They're on an ROI model as well; that's why they cast wide nets across the Internet. But if you're targeted, that's another matter; hopefully you've made your systems difficult enough that you can actually see the bad guys, watch them win, and then kick them off. That's what you're looking for. You're looking to make it expensive and noisy.
Mather: And maybe we just haven't gotten to the place where we have the dead body that requires the reboot of the system. Maybe that brings back the conversation from earlier. Do you have a question?
Audience: Yes, actually two comments. Number one, I agree with you on the dead body. As much as we would like to educate and have smarter users, et cetera, I actually think it's going to be the 9/11 equivalent, an infrastructure attack, that wakes people up, because people are insulated on their credit cards and they don't really care a whole lot about their personal information anyway. But if suddenly there's a fire, or a reactor accident, or something that impacts people more dramatically, that will likely change the conversation. And it's going to happen. You know it's going to happen. On the point that app developers are sloppy and there's no real encouragement for them to improve, I would merely put forth an idea. Although I am strongly against government regulation of these things, there are a couple of entities that represent really good choke points, and if they encouraged developers to go through a specific audit in some way, shape, or form, it would be very effective: the Apple App Store and the Google Play Store. They could insert a certification requirement, where one of a handful of organizations will, for $1,000, do something. I don't know what. Just don't rule out the fact that they have a very powerful position in the ecosystem right now in terms of encouraging developers to adopt best practices.
Mather: That’s a good point.
Audience: We keep talking about the bad guys as if it’s an amorphous mass. Is it bigger than ISIS? Is it a couple million people? How many people are we talking about? How quickly are they recruiting?
Mital: It depends on the way you count it—we know of many organizations—let me back up. There are basically three general categories. There are the state-sponsored guys; the numbers vary and I don't really want to get into them. There are the commercial hackers, and there's leakage between the state-sponsored guys and the commercial hackers; that number is in the tens of thousands. Then there are people in the general category of activist hackers, sort of weekend hobbyists who do it for notoriety or fame or whatever, and that number varies. If you count somebody who's testing a system in college, it could be in the millions, but the people who are actually doing it are probably on the order of many tens, maybe hundreds, of thousands.
Audience: What would normal law enforcement say about how you counterattack?
Grossman: You mean actually counterattack them?
Audience: I thought when people were talking about dead bodies it was making examples of people who were doing—
Grossman: No, actual—
Mital: Somebody getting hurt.
Audience: I'm being facetious [LAUGHTER]. I'm just wondering, under normal circumstances we have a government here that will go out and kill a terrorist and say, "This is what happens to you when you attack American soil." Maybe we're neglecting more extreme methods of dealing with this. Not that you can deal with the Chinese government that way, because I think you're in a mutually assured destruction situation there, which is analogous to nuclear war. But in the case of hobbyists, there are going to be more effective counterattack measures than, "Gee, there are millions of them having fun in their college dorms."
Grossman: Attribution after a cyberattack gets really, really difficult, and if you're using truly extreme measures, be guaranteed I'm going to use them against you: I'm going to start targeting other people and make it look like them. So attribution is really challenging. But here's a different way to count the bad guys. I used to work at Yahoo way back when, and at the time we had 120 million active users. Our data said that roughly 1% of our users were malicious in some way, not just hackers. It was quite a lot: spammers and other deviants and things like that. So attribution is key. We should go after the key bad guys who cause a lot of damage, but we're not going to be able to catch everybody. It just doesn't scale with law enforcement, not one at a time. They're going to have to prioritize, and they kind of do already.
Mather: We have one more question and then we’ll do a little wrap up.
Audience: My question, thank you, was really in his direction. Unilateral offensive behavior by private parties is impermissible. But is it technically possible, and what would it look like, in a general way? And should we be talking about that? Because if we're talking about critical infrastructure and the problems with a perimeter-defense strategy, I'm sure firms are thinking deeply about what kinds of things they should be doing that have a more offensive character. If we could engage that question.
Grossman: Digital self-defense brings up a whole category of issues that's really, really tough. It's attractive, but it's also tough. It's not like protecting one's own life. A lot of times when you're getting attacked, it's by a system that—what happens if one Fortune 500 company sees itself getting hacked by another Fortune 500 company? What do we do then? That's a challenge. I don't think there's a really good answer. There's plenty of dialogue, but I certainly don't have any good answers on that one.
Mital: Translating a cyberattack to a kinetic response is a very dicey thing to think about. I mean today the world thinks about kinetic, sort of physical responses—
Mather: Military responses
Mital: —military responses. That's very separate from cyber. I don't think there's any doctrine there yet, other than within governments, on how that maturation happens.
Dennedy: But we are doing tactics like that; look at Brian Krebs, who's really a journalist, and we're constantly monitoring the darknets, we're embedding with these people, we're trying to figure out why people are in LulzSec, what motivates them, how we deal with these guys. I think that's almost the jawbone, if you will, of the cyber complex. And to your earlier point about why I think this is scratching at something: I think the only industry more pointless than security is healthcare. And yet it's pretty good to be here in better condition, to actually try to maximize what life is like during our shelf life of approximately 100 years. We're not going to be able to do a total reboot unless we've had a complete Third World War Armageddon with data, so we have to get as healthy as we can, we have to educate people, we have to eat our broccoli, we've got to test our systems. And then we have to figure out the dicey issues, which is where your question is really going: if I as an entity or an individual think that you are attacking me, and I attack you back, so I shut down your email because I see the attack coming from your IP address, but it turns out it was this guy over here, because he's out there swimming with the sharks trying to get me into a bad position, you get into all sorts of very difficult legal tangles, if you will. But I still think it's worth it.
Mather: All right. Unfortunately we're out of time. We've talked, again, with a lot of doom and gloom, and I promised we would come back to what is working, so I'd like to ask the panel to say what they're most hopeful about, or excited about, or see the most promise in. Todd, do you want to start?
Simpson: Sure. Again, from the consumer viewpoint, a quick comment: our houses are becoming like small businesses. We each have twenty-something connected devices and all these solutions. We're early enough in that curve in our personal lives that I think there's time to adopt solutions that make us, as small businesses, more secure, more private, and appropriately protected.
Mather: It gets to the lack of legacy thing.
Cox: I think the battleground has shifted from guarding the perimeter to recognizing as soon as possible when you have an attacker in your environment. I think that's where the focus needs to be. I'm not suggesting you shouldn't guard your perimeter; you should. But I've investigated cases where the attackers have been in environments for months and months, and they just have the opportunity to do more and more damage. So if we can shift our focus to identifying them as quickly as possible and kicking them out, they're going to have a lot fewer opportunities to damage the environment.
Mital: I think privacy and security are a triumvirate of policy, people, and technology. On the policy side, awareness is growing dramatically, and people are much more open to the idea of adhering to good hygiene. On the people side, for most security incidents we see, people are the biggest problem: they do sloppy things, they have weak passwords, they misconfigure stuff. Education can help, and I think things are improving there as well. The thing I'm most optimistic about, being an engineer, is the technology side. I do believe that if you can solve the first two, there are technological solutions that are feasible and workable and that dramatically improve the situation, maybe not 100%, but dramatically.
Grossman: I started hacking websites 15 years ago, and back then it was pretty easy, a matter of minutes. Now most of the major websites take you substantively longer. Things are moving in the right direction there. But I also just wanted to give a shout-out to the Ruby on Rails guys. You've heard of this little vulnerability called SQL injection? It's the cause of most of the compromises out there. We have yet to find a SQL injection vulnerability in any Ruby on Rails site we've ever tested. The sample size is smaller, we're talking a few hundred here, but that's a big deal. That one cause is getting rooted off the face of the planet in the Ruby on Rails world. Those guys: keep up the good security work.
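For readers who haven't seen SQL injection up close, here is a minimal sketch of the difference between a string-built query and a parameterized one. It uses Python's sqlite3 module rather than Rails, and the table, column names, and payload are all hypothetical; but parameterized binding of this kind is what ActiveRecord-style ORMs emit by default, which is the point Grossman is crediting:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "nobody' OR '1'='1"  # classic injection payload

# Vulnerable: user input concatenated into the SQL string.
# The OR '1'='1 clause makes the WHERE condition always true.
unsafe = conn.execute(
    "SELECT secret FROM users WHERE name = '" + attacker_input + "'"
).fetchall()
print(unsafe)   # leaks every row: [('s3cret',)]

# Safe: parameterized query; the driver treats the input as data, not SQL.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(safe)     # no user named "nobody' OR '1'='1": []
```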
Dennedy: Although I'm not an optimist, I'll completely pander to our moderator. One of the things that makes me very optimistic is that the diversity question is starting to be addressed. The reason I think that's so important is that the carbon-based unit is the biggest risk factor; that's us. If we don't address different thinking, if we don't address context, if I have to see another dropdown to click through, we're going to keep talking about inconvenience for security. But if we find beautiful iconography, if we find music, if we find contextual smells where a puff of alcohol comes out of your phone when you go on a health app, I think we're going to get to a different place. We have to really break this model. The other thing that is most important to me is that we're starting to see value in information in and of itself. Value drives markets, markets drive behavior, and behavior drives a greater push toward more and better technology. It's a very wholesome cycle that we're on; even though we're all going to die, we're going to live better while we're on the planet.
Mather: Just to follow up on that, I do want to point out that we've had a lot of women in here, women who are in technical security positions, and I think Simone did a really good job of making sure we've got great representation, so that's awesome. As Michelle said, if anybody wants one of her books, they're in the back, so help yourself. Can we thank our panelists please?