Approaching Privacy as a Business Plan for Data
By Zane Pokorny on April 8, 2019
Our guest today is Michelle Dennedy. She’s vice president and chief privacy officer for Cisco. An outspoken advocate for building technologies that not only enhance our lives but also promote integrity and respect for people regardless of their level of technical sophistication, Michelle is leading the charge for better understanding and implementation of privacy and data security policies around the world.
Our conversation includes her thoughts on why organizations find privacy so challenging, the differences between aspirational messaging and foundational values, and where she thinks the next generation of security and privacy professionals may take us.
This podcast was produced in partnership with the CyberWire.
For those of you who’d prefer to read, here’s the transcript:
This is Recorded Future, inside threat intelligence for cybersecurity.
Hello everyone, and welcome to episode 102 of the Recorded Future podcast. I’m Dave Bittner from the CyberWire.
Our guest today is Michelle Dennedy. She’s vice president and chief privacy officer for Cisco, an outspoken advocate for building technologies that not only enhance our lives, but also promote integrity and respect for people regardless of their level of technical sophistication. Michelle is leading the charge for better understanding and implementation of privacy and data security policies around the world.
Our conversation includes her thoughts on why organizations find privacy so challenging, the differences between aspirational messaging and foundational values, and where she thinks the next generation of security and privacy professionals may take us. Stay with us.
The long and winding road. I actually do get mentees saying, “How do you become a privacy officer?” And I think, first you go and you get an undergraduate … a science degree in psychology, hoping to become a psychiatrist. Then you take a gross anatomy class, you realize you don’t want to do that. So you end up going to New York City, and actually, on the advice of a PhD program I was interviewing for, he said, “Kid, you are not a researcher. You are an advocate. Go do something in advocacy for a year, and if you still want in my program, you’re in.” I moved to New York City, I became a paralegal in a very large law firm, and a couple of years later, I ended up going to law school and starting my career as a patent litigator, actually, in New York City, doing medical devices of all things.
And then I moved out to Silicon Valley, followed a man, as you do. Got a minivan, as you do, and I was recruited …
You were all in.
I was all in. I was like, “Okay burbs, here you go.” And so it was one of these things. It was like a series of weird accidents, and I stumbled into Sun Microsystems, and at the time called my then husband, now my ex-husband, and I said, “Hey, have you heard of Jay-va? Is that important?” And he said, “Java? Yeah. That’s really important.” I said, “Oh yeah, I’m going to this place called Sun. Have you heard of it?” He was like, “Good Lord. Don’t go shame the Dennedy name.”
Of course, I start off on this footing. I go in there, and I thought that I was interviewing for a patent law job, because the recruiter said that it was a patent law job. Well lo and behold, she told them I was interviewing for the trademark position. It was a great day and I walked home, and I got a call two days later and they said, “You know what? We have two candidates for this job. One is really well-qualified, the other one is you. And we think you’re perfect.”
And so I said, “Okay.” And it was just, I guess, now looking back, I was a good fit for Sun, which is also why I am a good fit for Cisco, very entrepreneurial, exploratory … We were changing the world with a high ethical foundation. But the big kicker was Scott McNealy, our CEO, who said privacy was dead: “You have zero privacy.” He topped it off with a hearty “Get over it.” And so no one in power, as you can imagine … Your CEO says, “This is garbage.” Only a crazy person would say, “I’ll do that!” And that’s what I did.
I leaned in partially because it was interesting to me as a former patent litigator. I break things down into spheres of ownership and influence and prior art, and so at the time I looked at the portfolio that we had in technology, in the data center, and in networking, and I broke that out into what does that mean in terms of these relatively newish privacy laws then coming out of the European theater. They were cast in 1995, but they really didn’t start getting implemented into the aughts … And then I said, “Gosh, this is not just legal compliance. This is a business.” When you’re selling virtual containers and you’re selling encryption and you’re putting together an identity portfolio, what you’re doing is, you’re putting in aspects of control across the network that help people tell their stories individually, and help us keep the integrity of stories about our employees, about our customers, about our governments. And I got really, really excited. That was really the earliest stages of my diving in deep to data protection and privacy.
It sounds like what you’re describing to me is something foundational. Help me understand the difference between building privacy in at that foundational level versus it being something that’s grafted on, or it’s something that is … Something that you’re ordered to do by the marketing department, for example.
I looked at it again from an intellectual property and a risk perspective, and I said, first of all, whenever you post a notice publicly … And remember, websites, it was still unsettled law whether that was a contract or not. Whenever you say something publicly that the public … If you’re B2C or B2B or B2G, for government, when you state something affirmatively that can affect someone’s material choice in doing business with you, you’re privately legislating. If you just post something, and you don’t have the backing for it, or you haven’t figured out where your claims are, you don’t know what your prior art is, you don’t really know that what you’re saying is (a) true, or (b) material, or could be impacting someone’s decision-making.
Bolting it on at the end means you have to do a lot of discovery work and maybe even a lot of retrospective repair, or you’ve got to lie. Lying is off the table for me. If it was something that was in line with the law and with ethics and with morality, but it wasn’t built into the product, if I started at the end of a project, we would have to redo, and you’d have to stand in front of the firing squad of, “Oh my God, we have a release date, and we have all these people and millions of dollars and dogs and cats living together.” No fun for anybody.
It’s a lot more fun to start in at the scoping basics. Now, everybody says, “Oh, call me early.” But they don’t say what kind of conversation to have. So that’s why I really focus on … And we actually offer a scoping workshop. We do a privacy engineering scoping workshop, and all of these elements of privacy engineering, I’ve honed across the years from trial and error, and working with industry people from places where it works. I worked with data quality people.
My father grew up as an architect and a security person in hardware and software. And so I would call home on the weekends and say, “Hey Dad, I’ve got a lot of different kinds of people, and portals are really hot right now.” And so we talked about segmentation and data modeling. And then I’d call home another weekend and go, “Hey, you know what? There’s this law, and it says that if things are encrypted, then they’re exempt.” And we talked about key management.
As you figure out what is the business problem you’re trying to solve, instead of what’s the technology gadget you’d like to have, then you start to really put together a plan that says, “Okay, here’s my data plan. I want to onboard people using a mobile device.”
“Okay, I got the geography. What kind of people are these? Are these … Am I the YMCA and I’m trying to get people who are underage and maybe even 12 to be in the babysitters? Ooh, okay. So I’ve got children’s data now.”
You start to collect all of those aspects. So it is the collection of these aspects, the monitoring of the data, and the constant overseeing as you do an agile type of development, whether it’s truly agile or whether you’re doing a modified waterfall sort of project. At every single step of the way, you should be thinking about, what is the data load? What’s the I and the O? We forget about the actual data that’s flowing when we talk about IO. And it’s really, really critical.
Is it fair to say that it’s almost … It sounds to me like it’s almost like a value statement.
Very much so. I have a four-part test. So when people come with their business case, we want … This is an old case, not from Cisco. We want to embed a chip under a prisoner’s skin, and it will have all their crimes, alleged and proven, on this chip, so that we know who the bad guys are. So that’s the business proposition, and the “what’s in it for IT” is, of course, lots and lots of gear, lots of processing, et cetera, and lots of consulting getting this crazy thing built.
Now, four-part test. Usually the four-part test for other people starts with the law. Is it legal? And in that jurisdiction, that’s legal. And the second part test is, can we make money, and in that jurisdiction, the answer was yes. But that fails my four-part test. My four-part test starts with (a), is it moral? And that one didn’t even pass stage one for me. I don’t find it, as a human, something that I could stomach, to put known and alleged crimes, even if they’re legal in that place, knowing what I know about the power of the network, and how information can be and will be abused over time.
And the appetite of those caregivers, and they are supposed to be state caregivers, to think about what happens upon release, and what happens as they go throughout their lives, et cetera. For me, that was a moral test. But I went down the rest of the tests here. So ethics, to your point, what is your brand? When you think about brand and you watch marketing people go through a brand exercise and a rebranding exercise, they go through a series of these ooshy skooshy things, like trust, quality, speed, efficiency, scalability.
Yeah. Your privacy is important to us.
Your privacy is important to us. These are your corporate ethics, and I always apologize to ethicists everywhere. This isn’t the Aristotle version. But your ethics are your corporate brand and I think as I get deeper and deeper down the rabbit hole of ethics engineering, I’m getting more refined in that thinking. But for today, think about all of those superlative names and the avoidance of the negative names about your brand. Has your business plan crossed that path? We’re an honest broker. We don’t do business with people that do terrible things. So that particular use case did not pass that one either.
And then you get to legal. So you’ve passed moral, you’ve passed ethical, now is it legal? And sometimes that answer is, in this jurisdiction and not in that jurisdiction. And then you get to the final big kahuna, which is, is it still commercially relevant? Have you spent so much time avoiding risk that you lose your margin, or are you doing something that’s so geographically hampered in scope that it now doesn’t really make sense to execute on that plan? And that doesn’t mean you throw out the plan wholesale. It means you’ve gone through that thought experiment fairly quickly, and you’re thinking about how could we make this different? What if we just had chips in people’s uniforms, and they were assigned uniforms, instead of putting it inside someone’s body? Okay, well now we’ve probably passed the moral test. Running that data center. Is that something that we could do with our ethical brand? Well, it depends. And then you go down that rabbit hole, and you go down and you go down.
That’s what I do, I think getting to that table and telling the people at that table how to relate to data is, how you make that all important distinction between data protection, privacy, and security, because security is your best friend the whole way. Security is the manner in which you manage the why. So the what and the why is what’s the data, and that’s why it’s a fallacy to think that privacy officers only deal with personal data or PII. You don’t know what the bucket is until you know what’s in the bucket.
I’ve never had a job where confidentiality and taxonomies don’t fall into that same methodology. So if you’re an efficient organization, you put those things into similar, if not the same, work streams to figure out what is your policy. You look at those various elements of what I will call intellectual property, broadly, and you figure out what are the rules that govern these types of data, and then who is allowed to access them. Whenever there’s a “who,” there’s privacy, because that’s a person. The “what” might entail a “who.” It’s medical information, so that’s “who” data, and managing that medical data involves a “who,” so there’s a privacy layer on there. But you also have intellectual property, the medications that are issued to that patient, et cetera.
As you’re going through this cycle, you understand, also, the risks and the opportunities that you can engage in with your security partners, and with your quality teams, and with your supply chain of data. Who are the third parties that you need to have to actually execute on this plan? Okay good, you’ve got to figure out how do you cross that barrier of my organization, third party organization, what do I need to know to make that work? So that’s why this is a constant hands-on … It’s not like, “Oh, this is PII and I’m done.” Or, “It’s not PII and I’m done.” It’s a conversation that is as rich as your business plan.
Yeah, so it’s an art, I will say. There’s not a distinct formula, and it also depends on who you are and how strong your brand is. You see a lot of very young companies and startups with huge statements, and they probably can live up to them, because they probably have one, maybe two, functions. If you only have one or two functions, I think it’s a fallacy to think only big companies can do privacy engineering. In fact, the best companies are startups. Know what your data nutrition label is, done. Know where it’s going, done. Document it, done. You’re doing it anyway to get funded, so why not do it right, and also have it so that when you’re acquired, we know what your data nutrition is.
I was having this conversation with the CPO of Uber, Ruby Zefo, who is fabulous. And we were talking about all these regulators, like, “It’s got to be readable. No one can read the policies!” And then they go ahead and write their law, and they require magic language. So we’re like, “Well, thanks for that magic language. You’ve just made my policy two pages longer. It’s not very readable.”
But if you looked at it line by line, there was either a graphic, a lesson, a call-out or an image that demonstrated those principles, so if you were looking at it … That’s something that I did at my last company that’s a little trickier. I’m not saying it can’t be done for Cisco, but we will probably do something different, because our layer, down in the network and in the services that we provide are a little bit different. What you do is, you figure out what’s that art form and who are you communicating to. I’m largely communicating to governments and other businesses at this stage. We do have a collaboration business that’s a little closer to B2C. Figure out the who.
The next layer down is your policy, and a big “P” Policy. And some people split them into parts, and we do here. But there’s a unified set of activities and promises that we make, and we train every single person that’s handling data. And then you have a series of either rule books, play books, standards … There’s a number of different names that are used for them, but those are the things that the person in the call center has a manual, here’s how you handle this type of data, and here’s what you do if that happens. All those FAQs, what do you actually do down and dirty? That goes down there.
If you stacked up that whole stack of paper, if you will, virtual paper, you probably would have a couple hundred page document. But if you provide it in … Our required standards of business conduct gives every single employee, and they’re required to go through the training … They have a top-line level on ethics, and our business plan, and our policy. And the whole policy is linked there. If you are onboarding in our TAC department dealing with customer problems, we have very specific training and rule books for you that will match up to those principles in the higher-level policy, but it will be much more in-depth for your part of the business.
I would say it’s the art of leadership, actually. It’s not just bolting anything on or covering … If you’re doing this for compliance, God bless you. I can’t imagine doing this for anything other than passion, because it’s just too hard. I’m constantly every day being told that I’m either critical and not done, or inconsequential. And it’s never anything in between.
So that leads to my next question, which is this, what are the natural tensions that exist within a company the scale of Cisco? Where do you find people bumping up against you? Where do those tensions lie?
Yeah, I think it’s certainly not specific to my company currently. I think there’s … (a), I think there’s a lack of data culture. We don’t have a model for data as an asset, and even when you say data as an asset, instantly your mind rushes to advertising and selling out and people aren’t in control and dogs and cats are living together again. What I mean by data asset is, something is either of value, and it’s causing you to make better decisions or run your business better or do something better, or make better, more ethical decisions, or it falls on the other side of the balance sheet. Even well-collected, well-curated data, if neglected, turns into a liability. We don’t necessarily think about the two halves of the whole. So thing number one is, we don’t really have a common curriculum and language.
I think the word privacy, which we primarily use in the United States … The academics will tell you, and I agree with them, that there is a fine line distinction between privacy and data protection in academia. In the enterprise application, it is data protection, but in the U.S. we use the word privacy, and so everyone has an opinion about what it means, “That’s not private, because it’s my work email.” But it has your name on it. And, “That’s not private because I gave a talk at some place.” Well, the fact of the talk is not, and maybe you’ve even published your slides, but it could be that there’s other aspects like what hotel you stayed at, that are private. It’s one of those things where you need a language for data, and that gets you some push-back.
And then the other thing is just old-fashioned shock and awe. There’s a lot to do in the modern company, to stay on budget, to feed the quarterly beast of the SEC and the investors that can sink you with the stroke of a pen based on an opinion. So you have to have a certain coordination of values. You have to have this quarterly report out, and if you’re doing something with data that gives you short-term benefit but real short-term harm if it’s breached, the long-term view is where you get to data strategy. If you’re building most efficiently, you have a strategy. If you’re building for compliance, you have tactics, and you’re busy. And so that runs into conflict as well, to figure out, where’s your sweet spot? Do you want to take the investment view? That doesn’t mean everything goes slow, by no means. Talk to my team. They’re, like, always gasping on the pavement. I’ve been driving them so hard.
We do take a longer-term view, and we think … I look at the top line strategy. I tell every privacy officer, at least if you’re not reading the whole 10-K report in the U.S., read the letter to shareholders. That’s where you’re going. At least, that’s where your CEO thinks we’re going in the next year, and it’s not the day-to-day touch that you need to have for your business, but you have to be a business person, as well as a privacy advocate, as well as being able to at least speak the language of technology to understand how they tick and how they work. And then understanding the special personality that is the IT crew. There’s a special type of person that was attracted to security. There’s a special type of person that was attracted to marketing. There’s a special type of person attracted to sales. And I work with every single one of them, so I tell you what, I probably use more of my psychology degree than I do my law degree on a day-to-day basis.
Now, I think we find ourselves these days where folks have a real cynicism when it comes to privacy. I think when we look at all the stories in the news about companies like Facebook and this massive aggregation of our data and … I think this feeling of violation that folks are gathering and making connections about us that maybe we’re not so comfortable about. Do you think there’s hope on the other side of that? Do you think this is a stage we’re going to get through? Will we push through and as a society, come up with some rules to make everybody feel a little better about all of this stuff?
Yeah, so for me it’s a day-to-day thing of recharging my batteries, because it’s very easy to go home and suffer from the same cynicism of, do people really care if they’re willing to do these things, and is the money so great in selling other humans’ data and stories that no fine will ever be high enough, no shame deep enough, that they will do the right things? And I do go home some days thinking that. More days than not, I go home and I think … You know, what’s really cool is … My business partner Jonathan was in the store replacing his phone the other day, and he said, “I was standing in line in Verizon, and this guy was like, ‘I don’t want that app on my phone, and I don’t want this one, because it collects too much data.'” And he was like, “20 years ago when we started this, you never would hear someone not taking an app because of the data.”
You hear people talking about getting off of a social network, because they now know that they were sharing their data for free. They had at least an idea that there was advertising going on, but the fact that someone would try to manipulate my choices in an election … It hits that moral and ethical thing for them. So I think that there’s that sort of outrage percolating. That’s thing number one. Thing number two is, this is a very well-paid job. Try, if you will, to find an eight-year-out privacy officer. You will not find them. During the recession, everybody got rid of the privacy people, stuffed them under security or told the security officers, “You are now security and …” So we have this big dumbbell. We have old farts like me, and then you have very young people coming up who are awesome.
Don’t get me wrong. I didn’t have a privacy class in my law school. It did not exist. There was one textbook by Bender, and it was on licensing, online licensing. And there was one book on online licensing when I went to law school. That’s how … It’s like ye olde wagon wheel law I am. Kids are coming out now with master’s degrees. There are LLMs in privacy. But the middle … I know. It’s crazy, right? Now it’s … The youth is making me very excited about that energy. The money and the market to try to hire the scant staff of experienced privacy officers is cool. And every now and again, I’m like, okay, remind yourself. I’ve never really done it for the money, but I do have a mortgage. Every now and again, I’m like, stop being outraged morally. This is your life, and it’s a good one. Be grateful.
And then the final thing is, I think GDPR, all other things aside, has certainly at least hinted at a market. It’s a market in downside risk, because of the large fining schema, but I think it’s also a market in prominence, because one of the key aspects is that you have to actually go and list your critical data. And by doing that, that is the start of how you start counting assets. Once you figure out how many apples are in your store, and how many are going off, you start to understand how many you want to order from the orchard. And once you figure out that you can get the not so great ones and sell them at a lower cost to a baker who’ll turn them into a pie, you have an empire.
I think now that we’re starting to be required to map our data, we’re being required to do privacy impact assessments, and assessing the impact to the individual, the impact to the organization, and the impact that controls … And here come my security friends again … All those lovely beautiful controls … Now you start to understand that that market starts to really sing.
I’ll give you a concrete number from last year: in getting ready for GDPR, $5 billion, with a “b,” was spent just on consultants and lawyers. I’d like to see at least some of that money go into automation and building tools. And what we’re building here is, I hope, a platform that will be the underpinnings of a network that I’m happy that my kids are traveling on. My kids are digital natives. They get on and off platforms when they don’t trust them. They will find their first job, probably find their first boyfriend or girlfriend online. They will definitely apply to colleges. I’ve got a child heading off to college, and everything was online. All of these things for my kids, I want to be safe, I want to be secure, I want their stories to have integrity, and so I build. And I get up tomorrow, and I build again.
Our thanks to Michelle Dennedy from Cisco for joining us.
Don’t forget to sign up for the Recorded Future Cyber Daily email, where every day you’ll receive the top results for trending technical indicators that are crossing the web, cyber news, targeted industries, threat actors, exploited vulnerabilities, malware, suspicious IP addresses, and much more. You can find that at recordedfuture.com/intel.
We hope you’ve enjoyed the show and that you’ll subscribe and help spread the word among your colleagues and online. The Recorded Future podcast team includes Coordinating Producer Zane Pokorny, Executive Producer Greg Barrette. The show is produced by the CyberWire, with Editor John Petrik, Executive Producer Peter Kilpe, and I’m Dave Bittner.
Thanks for listening.