Nishant Bhajaria, head of technical privacy and governance at Uber, joins David and Dominique to discuss how he has helped companies focus on data privacy. He outlines the stick of reputational risk versus the carrot of better data and governance, and why protecting data leads to better products, a more intelligent workforce, and a more engaged customer. They discuss why the amount of data being exchanged by companies and customers today is unlike any we’ve seen before, and how transparency to consumers and enhanced data privacy are critical for a company to thrive. Nishant also shares how his newest book, Data Privacy: A Runbook for Engineers, is a first-of-its-kind text for engineers on how to design, develop, and measure the effectiveness of privacy programs.

Listen to “Nishant Bhajaria: Head of Technical Privacy and Governance at Uber | Making Privacy a Priority – Episode 44” on Spreaker.


Episode Transcript

Dominique:

Welcome everyone to Decrypted Unscripted. My name is Dominique Shelton Leipzig. This is the opportunity that my partner, David Biderman, and I have to really unpack what’s going on in privacy, data security, and data in general.

David:

We talk about privacy and data issues in general. We cover everything from national security to mom-and-pops getting ransomware attacks. It’s just great to spend some time with Dominique talking about these issues.

Dominique:

Data is everything. If a company is not digital and they are not using technology and data, they’re really not a company in today’s world.

David:

Thank you all for listening.

Dominique:

Welcome everyone to Decrypted Unscripted. My name is Dominique Shelton Leipzig. I’m a partner at Perkins Coie, and I do this podcast with my partner, David Biderman. And we use this podcast as an opportunity to talk about what is cutting edge and most important: basically, data issues that matter. With that, before I introduce our guest, Nishant, who we’re really excited to have, I want to give David an opportunity to say hello.

David:

I will say one thing: this is a podcast on privacy and we’ve had phenomenal guests, but you’re the first guest we’ve had, I think, who really digs into the technical issues on privacy and how privacy can be protected by big platforms. And you’re, of course, the author of the book Data Privacy: A Runbook for Engineers, and you give instructions on YouTube on privacy. So anyway, we’re just so glad to have you, because you’re the one who makes it all happen in the background. And thank you for what you’re doing and we look forward to talking.

Dominique:

Oh, wonderful. I mean, you said it really well. We’re thrilled to welcome Nishant Bhajaria. He is an executive leader and industry expert in privacy, and his focus is on privacy engineering and governance. He is the head of the technical privacy and governance function at Uber, with wonderful prior experience at Netflix and other major platforms. So welcome, Nishant.

Nishant:

Thank you, thank you. Happy to be here.

Dominique:

You know, your book is really important, and cyber engineering and privacy engineering are becoming really critical to operationalize all of these laws. At UNSW, Professor Graham Greenleaf does this count of data privacy laws, and he has counted 148 countries as of last year with new privacy and data protection laws. And we need somebody such as yourself to help implement the engineering to comply with all that. But before we get there, can you tell us a little bit about your background? How did you get to this point where you’re focusing on privacy engineering?

Nishant:

Yeah. Thank you again. And I’ll just add a little color to what David said and then answer your question Dominique. So I’m not the guy that makes it happen. I used to be the guy that wrote code. Now I run teams where engineers write the code, they build the products. I want to make sure that those folks who actually do the work get recognized as such, but just wanted to add a little humility to that comment in the beginning.

But to answer your question, Dominique, I began my career when I was an undergrad in a very eclectic fashion. I was an engineer. I wrote code. I have a CS degree, but I was also part of the college debate team. I also wrote op-eds for the college newspaper. I was also an RA in the college dorm. So I’ve always been this person that does three or four different things and trying to build a career and a theory of life that combines all the best skills from those four different domains.

So fast forward 10 years, I was writing code at Intel and WebMD. And I was just like every other engineer, trying to write good code that wouldn’t crash when people tried to use it. And then something fundamentally changed. The ability for customers to onboard to services that connect with their Google ID, their Facebook ID meant that the connection between customers on the one side and platforms became a lot easier. We had mobile devices, fast internet. And what that meant is the amount of data that flows across the internet, across all these pipes that connect you and me, exploded. And the things that made engineering very powerful (silos, engineers being able to make their own decisions, very little bureaucracy and process) enabled the tech industry to build a lot of amazing products.

And I was at the forefront of that, but those very things, this autonomy, this freedom, this independence, made security and privacy very hard. It’s a bit like looking for something in the library without having alphabetized books, without having genres. It’s not fun, is it? So it’s great to buy all those books and stuff them wherever, but you need a way to search for them, right? And that’s what made privacy pretty hard. So I got into this domain over the last decade because I have been the bad guy. I’ve been the guy collecting a ton of data. I’ve been the guy making access requests to stuff that I shouldn’t have any access to.

So I know what not to do. So it’s a bit like the thieves guarding the jail in this case. So my job now is to build out tools, programs, processes that help companies protect customer data at scale. So I gradually moved… Some of it was privacy finding me, and some of it was me finding privacy. It was kind of a two-pronged process.

David:

When did you first convince a company that they needed somebody like you? I mean, I assume there was nobody that did what you did until you started. So who was the first company you went to, and what did you tell them to convince them that, “Hey, listen, you guys need to do this”?

Nishant:

Well, so there are two answers to your question. So the first change that happened to me from a career perspective was when I was at WebMD. We used to get a ton of data, and it would be sent by companies that used our software but didn’t always understand the implications of sensitive health data. So I started dabbling in this field and asking a lot of inconvenient questions. Fortunately, I had a very, very good set of managers who gave me every opportunity to do things right. So I was, by default, to an extent, a privacy engineer without having the title. But more formally, 2014 is when the move began, when I was at Nike.

When you have a global company, significant scale, data about customers, athletes, and what have you, that’s when you formally become a privacy expert, so to speak. So Nike, Netflix, Google, and Uber kind of followed afterwards. So there were two pivot points: one informally as an engineer, and the second one as an executive leader, starting with my time at Nike.

David:

And was it hard to convince the executives at Nike that they needed to do something like what you were doing?

Nishant:

So it’s not often that engineers get to quote Richard Nixon on your podcast, but I’m going to be the first one. So President Nixon said that you campaign in poetry, you govern in prose. So when you convince people to do privacy right, you have to use sort of both the stick and the carrot. The stick basically being that there’s reputational risk. There is a risk of fines. There’s a risk of loss of customer trust, bad media. And there are companies that have historically done a bad job with privacy but haven’t really paid a price.

And my response to that is, well, it only happens every so often and those companies have a ton of money. They have a ton of people like me on the payroll. But if you want to be on the right side of the customer trust equation, you got to do privacy right. So that’s the stick arm of it. But there is also an argument to be made for having better data, having better governance. It leads to better products. It leads to a more intelligent workforce. It leads to a more engaged customer. It leads to having to pay a lot less money for data storage. It means you can focus on the core important products that you build as a company that make you money, rather than having to fix the same old privacy problems again, and again, and again.

I have seen roadmaps end up on the ash heap of history because engineers committed to stuff and they couldn’t get to it, because the privacy issues kept coming up again and again. So I’ve used both those arguments depending upon the use case, but you got to have a bit of both prose and poetry, with due deference to our former president.

David:

That’s funny. You know, one of our guests said there’s three certainties in life: death, taxes, and data breaches.

Nishant:

Well, my job is to make sure that the third thing doesn’t happen.

David:

Okay. That’s wonderful.

Nishant:

I can’t fix death and taxes, but the breach issue, I can do something about.

David:

You got to move on to those two later on. That’s your third career.

Nishant:

Someday hopefully.

David:

If you were to grade the industry, how would you grade the industry in terms of its privacy practices?

Nishant:

I would say somewhere between a C and a B. The tech industry is not as bad as people think we are, but we are not as smart as we think we are. So we’re constantly overestimated by people on the outside. And a lot of times when I see privacy incidents go wrong, people often think it’s malevolence. Like my father-in-law genuinely believes that the tech industry’s out to get him. I’m like, you know, don’t mistake incompetence for malfeasance, is my response to him. But more seriously speaking, regardless of what the intention is, if your data gets misused, if somebody uses your data incorrectly, it doesn’t really make a difference whether it’s incompetence or malfeasance, right? The loss of trust, once it happens, is hard to reverse. You cannot undo certain things.

I would say the tech industry has done very, very well when it comes to harnessing data for growth. Like when you open your Netflix app, immediately you have all these options: movies you might like, movies you have seen. It works on all kinds of devices, at all kinds of internet speeds. And that’s a great triumph. Like the idea that we could transport that much entertainment over the internet to your location, wherever you happen to be, is a significant victory. And I feel like at a time when we are going through so much uncertainty as a society, so much uncertainty as a country, the tech industry is one bright spot that creates wealth, creates work across the board, right? That’s I think where the positive comes from.

On the negative side, I would say that we have not always been careful custodians of our reputation or our customers’ data. Too often people believe that somebody down the road will do the right thing. So I’m going to collect this data, use it, do with it what I want, but someone, some smart people down the road will fix it for me. And I feel that complacency, that lack of seriousness, has come back and bitten us a few times. So I would grade a B and C for that reason.

Dominique:

I’m interested in what you’re saying here, in terms of the industry taking leadership, or folks such as yourself helping, from the inside out, to heighten awareness about how easy it is to actually implement privacy at the beginning, in terms of baking it into the products. I think it’s so much harder to do when the product is already built. So I wanted to ask you, Nishant, in terms of engineering… I was talking to the founder of our privacy group at the firm [inaudible 00:10:42], who’s a good friend also, but I wonder if privacy is taught sufficiently in the engineering schools. Is there a role for someone such as yourself, you know, to guest lecture there?

Nishant:

Yeah. So I don’t think a whole lot of schools are educating us on privacy engineering or security engineering. I know Carnegie Mellon in Pittsburgh has an amazing program. Dr. Cranor and Dr. Sadeh, who run those programs, are phenomenal. I’ve been a guest speaker there a couple of times. I don’t think a formal curriculum exists. And that’s part of the problem here. If you could take existing engineering talent, people who write code, who build products, and teach them privacy engineering on the job, that’s one way out of this mess, which is why I wrote this book.

This book is aimed at people like yourselves who care about this from a domain perspective. It is aimed at engineers who need hands-on skills. It’s aimed at lawmakers and regulators who need to have some contextual understanding of how tech makes money. There was an iconic moment in 2018 when Mark Zuckerberg testified before Congress and a very senior U.S. Senator asked him how Facebook makes money. And he said, “Senator, we run ads.” It’s a pretty famous clip. And when I was in a room with a bunch of engineers, when that clip played, they laughed. I’m like, you know, this is not funny. The people who have the power to regulate us do not understand us, which is not a recipe for success.

So my hope would be that people read this book. Also, there is a company called Data Protocol that is sponsoring a privacy engineering certification, largely based on my book. So Dominique, to answer your question, that does not exist in any significant, scalable fashion right now, but my hope is to become the change I seek to see, as Mahatma Gandhi used to say, and start basically with some core engineering concepts that help you, for example, delete data correctly, anonymize it correctly, build privacy into the data at the point of ingestion, and then build out user consent platforms so our customers know that we care about collecting their data in a meaningful fashion.
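
To make that last concept concrete, here is a minimal sketch, in Python, of what a per-purpose user consent record might look like. Every name and field is a hypothetical illustration rather than any company’s actual platform; the point is simply that consent is recorded per user and per purpose, with a policy version and timestamp, so downstream code can check it before touching the data.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One user's consent decision for one processing purpose."""
    user_id: str
    purpose: str            # e.g. "personalization", "marketing_emails"
    granted: bool
    policy_version: str     # which privacy policy text the user actually saw
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class ConsentStore:
    """In-memory stand-in for a real consent service."""

    def __init__(self):
        self._records = {}  # (user_id, purpose) -> ConsentRecord

    def record(self, rec: ConsentRecord) -> None:
        # Latest decision wins; a real system would keep an audit history too.
        self._records[(rec.user_id, rec.purpose)] = rec

    def is_allowed(self, user_id: str, purpose: str) -> bool:
        rec = self._records.get((user_id, purpose))
        return rec is not None and rec.granted

# Downstream code checks consent before using the data for a given purpose.
store = ConsentStore()
store.record(ConsentRecord("u123", "personalization", granted=True,
                           policy_version="2022-01"))
assert store.is_allowed("u123", "personalization")
assert not store.is_allowed("u123", "marketing_emails")
```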

So all of those hands-on tricks and techniques exist in both of these offerings that I mentioned. I would love to sort of do some kind of guest lecturing for students, because I feel like you can start with what students already learn in college from an engineering perspective and add the privacy knowledge as topsoil. It will not make you a privacy domain expert, but it’ll help companies staff their privacy teams much faster at a time when the tech industry has a lot of jobs and not enough people.

That’s point number one. But the second thing I would like to do is to make sure that people like me have a seat at the table when Congress passes laws and discusses regulations. Because you need the people inside the company that actually understand how things work once you get past the first-principles conversation, because everybody will say, “We care about privacy. We care about security. We care about diversity.” Everybody says that, but doing the work, making the decisions, allocating the budgets is where the magic really happens.

David:

That’s what I was going to ask about, Nishant: what’s the role of government here? Listen, if you build a car, I mean, you have to comply with so many different rules and regulations. It’s for safety, right? But what I’ve learned, and I’m an outsider, I’m a litigator, is that there are just very few national standards on data protection. I wanted to get your take on that and your thoughts on what could be done in the future.

Nishant:

Definitely, definitely. So my favorite moment during one of my industry podcasts was when somebody on the regulatory side said, you know, this is why we passed GDPR: to go after the big tech companies. My response to them was, look at the date when GDPR became law and look at the stock price of some of the big tech companies on that day, and look at their stock price today. Some of those stock prices have doubled, if not more. So if the intent of GDPR was to go after big tech, it’s kind of a funny way to show it, to have their stock price go up so much, right?

So I feel like there is a role for government, the policy apparatus, the legal profession, and the engineers to work together. So what I encourage my engineers and people who report into me to do is to work with standards bodies, in their personal capacity or as emissaries of the company, and try and get those use cases in. So, as an example, when companies collect data, why do they retain data for more than X period?

So, as an example, let’s assume you work for a company that makes maps for navigation purposes, right? If you happen to go from, say, San Francisco to San Jose, it makes sense to track your location to give you correct directions. Maybe recommend a hotel for you to stay in or a restaurant for you to eat in. But that information, that location telemetry data, makes sense to collect and keep for maybe an hour or two afterwards. Because once you are home, maybe you don’t need that telemetry data anymore. Because I’m at home, I don’t need to stay at a hotel, right? So what you need is for the policy apparatus to give engineers a seat at the table. Because I feel like a lot of the laws that are written are written in a way that is almost impossible to violate.
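
That retention idea can be expressed as a simple time-to-live rule. The sketch below is hypothetical Python, not any mapping company’s real pipeline: each telemetry record carries a capture timestamp, and a purge step drops anything older than the chosen window.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical policy: keep trip telemetry for two hours after capture.
RETENTION_WINDOW = timedelta(hours=2)

def purge_expired(records, now=None):
    """Return only the telemetry records still inside the retention window.

    Each record must carry a timezone-aware 'captured_at' timestamp.
    """
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["captured_at"] <= RETENTION_WINDOW]

# A record captured three hours ago is dropped; a ten-minute-old one is kept.
now = datetime.now(timezone.utc)
records = [
    {"lat": 37.77, "lon": -122.42, "captured_at": now - timedelta(hours=3)},
    {"lat": 37.33, "lon": -121.89, "captured_at": now - timedelta(minutes=10)},
]
assert len(purge_expired(records, now)) == 1
```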

Remember when Target was breached in 2013? It’s a pretty famous breach. The bad guys got in through the HVAC system, talk about heating things up here. And they were PCI compliant. So they were compliant with the standard of the day and yet they got breached. And something very, very similar happened with the Colonial Pipeline breach just a few months ago. That was VPN credentials they found on the dark web, without MFA, right? So in eight years, allegedly, we learned nothing at all. And you want regulations that are somewhat prescriptive, that home in on specific actions. So for example, what can you collect? How long can you keep it? Who should have access to it?

So I would want the laws to speak to that level of detail. And when I make this argument, David, I’m arguing against economic self-interest, because the more ambiguous the laws, the more job security I have. I exist because the laws are not clear enough. So when I want the laws to be clear, I’m actually arguing against my interest. But I do think my main customer here is not me. My main customers are people like my dad, people who are not tech savvy, people who use their smartphones to send me fake news on WhatsApp, because they’re just so happy to have a smartphone, right? It’s those people we need to protect, because the burden of privacy protection too often is placed on the user when it belongs on the government and the companies that really have all the power in this equation.

David:

What I would say the framework has been (and Dominique, this is no criticism of you; Dominique was instrumental in developing the CCPA) is basically just saying, don’t let this data out, but really with no guidance about how to do it. In other words, you’re going to get punished if you let this data out, but there’s really not a lot of information about what standards, et cetera, need to be followed to prevent that from happening. I mean, I could be wrong, but you tell me if you agree with that. Sorry, Dominique, I interrupted you.

Dominique:

Mine is sort of a piggyback. Sorry, you should answer David’s question first, but in the back of your mind, if you can, think about what is your role, and that of folks such as yourself, in terms of what’s going on at the federal level right now.

Nishant:

So let me answer those three questions in sequence. I would say, and I know I like to make fun of GDPR more than the next guy, but I feel like people need to give credit to the folks who came up with the GDPR and the CCPA. They had to start from nothing in a world that had changed beyond recognition. It’s like that line in the Jane Eyre book, when she goes back to Gateshead: the things that could move have changed beyond recognition, and the things that cannot move haven’t changed at all.

So the tech industry went through this significant change and somehow we were supposed to come up with laws that would keep the customer safe. I feel like GDPR and CCPA get a lot of credit for actually putting something on the table. A lot of investment, a lot of awareness, a lot of discussion around this domain exists, partly because those folks had the courage and the fortitude to write these laws and at least get a start. It’s not like security has done much better. Cybersecurity as a discipline has been around a lot longer than privacy and yet we have so many different breach notification laws for different jurisdictions, right?

So I tend to defend the folks who wrote the GDPR and CCPA, because we need them in the room, frankly, because without them we would’ve had nothing at all right now. I think when it comes to the federal conversation, one of my biggest concerns is that we as a country, and I’m talking about just the US, we like to do big things. The people who run our country, whether it’s the president, the speaker of the house, or our senior leadership, they came of age in the ’60s when we did big things: person on the moon, the highway system, big infrastructure projects, right? But privacy is nuanced, and especially in the tech industry, scale doesn’t come from the size of physical artifacts. A small data set here, a small API there, the ability to create this algorithm or that AI model: all these small things create big impact.

So the idea that we’re going to have this huge privacy law that will fix all of these problems, I don’t think is going to really work out. So I’ll tell anybody who wants to listen, and even some people who do not want to listen, that I would optimize for some of these smaller problems one by one, and try and pass them bit by bit and see what happens. See what the impact is going to be on the big companies. What is the impact on the small startups? What is the impact on the VC sentiment, right? And get some data and then make decisions about the big privacy law based on that data, because I’m not sure 60 votes exist for any understandable privacy law right now.

I would love to have a law, but I feel like having a law that meaningfully protects customers is kind of what we need to optimize for. Because what I would not want is for this huge omnibus law to pass and for nothing to change. Because I feel like we are at a moment of institutional distrust, unlike any I have seen in my lifetime. I wasn’t around in the ’60s, but I’m fairly certain the level of distrust people have towards institutions, big business, government will not be helped if we pass a big privacy law and yet nothing changes. That’s the nightmare scenario for me.

Dominique:

That’s an interesting point.

David:

When you say smaller incremental changes or increments, suppose the government said, “Listen, all platforms that provide customer access need two-factor authentication.” If that was just mandated. I mean, is that the kind of thing you’re thinking about, or am I in the wrong direction?

Nishant:

No, I think that would be a great place to start. I would hope they have multifactor authentication. I would hope that companies are required to not just provide some level of authentication like you described, but also provide the different levels of protection, and up-level the debate a little bit and say, “What does 2FA or MFA get you? It gets you more access control. It gets you a more auditable trail as to who can look at the data, why, and when, right? It gives you the ability to change those access settings in a meaningful fashion, and therefore it gives the customer more control over their data, right?”

So yes, start with requiring some level of authentication, but then also put it on the companies to come up with other options as well, because not every company may have the ability to provide 2FA on the fly. Maybe they have the ability to sequester data, encrypt it in a certain fashion. The goal is to work backwards from the level of protection you want to provide to the customers, given what you know about the data right now.
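
As a rough illustration of that auditable trail, here is a hypothetical Python sketch in which every read of customer data goes through a single code path that both enforces a role-based policy and writes an audit entry; the roles, categories, and log fields are invented for the example.

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")

# Hypothetical policy: which roles may read which data categories.
ACCESS_POLICY = {
    "support_agent": {"contact_info"},
    "fraud_analyst": {"contact_info", "payment_history"},
}

def fetch(category):
    return f"<{category} records>"  # placeholder for the real data store

def read_data(actor, role, category, reason):
    """Gate every read through one function that always writes an audit entry."""
    allowed = category in ACCESS_POLICY.get(role, set())
    audit_log.info(
        "actor=%s role=%s category=%s reason=%r allowed=%s at=%s",
        actor, role, category, reason, allowed,
        datetime.now(timezone.utc).isoformat(),
    )
    if not allowed:
        raise PermissionError(f"{role} may not read {category}")
    return fetch(category)

# Every access, allowed or denied, now shows up in the audit trail.
read_data("alice", "support_agent", "contact_info", reason="ticket #4521")
```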

So I think that would be an excellent place to start. I feel like when you have that as your starting point, you can then say, “Here’s how it should work for companies with X amounts of data.” “Here’s how it should work for companies that have healthcare data.” It allows you to have a more intelligent conversation with the engineers, the platform owners, the policy experts, the attorneys at the table at the same time.

David:

And are privacy engineers deeply involved now in the discussion at the federal level?

Nishant:

Well, I don’t know at the federal level, but when it comes to industry-level conversations, if you work for me, you definitely are part of the conversation, because, selfishly speaking, for engineers to get promoted at tech companies, you have to demonstrate impact and you have to demonstrate that you are different. So as a leader, what I tend to do is make sure that my engineers build amazing tools, but also that they are able to get with the attorneys and the policy folks and make sure that the policy folks and the attorneys understand how these tools work, because at the end of the day, that helps us make impact.

I’m not sure that there is a platform at the federal level right now, although I wish there were one, and you can probably tell I’m eager to participate in it, because I feel like that level of experience is hard to have, and I’ve made my share of mistakes. I’ve made my share of comebacks from those mistakes. And having people at the federal forum have that conversation and say, “Yes, you think this is a good idea, but here’s what you’re missing,” that’s the part of the conversation that needs serious augmentation right now.

And to answer what Dominique asked earlier, what I have historically done in the last decade and a half or so is build teams that have engineers, architects, data analysts, people who can serve as privacy consultants who consult with engineers across the company at the point of whiteboarding. So not when the design is written. Not when the spec is done. Not when the product is ready to ship. It’s too late at that point. It’s a bit like telling me to use a little less salt after the cooking is done. It’s impossible to do.

So my job is to make sure that I staff the team with engineers who can talk to attorneys, who can talk to other engineers. So people skills: the ability to engage, the ability to negotiate, the ability to simulate. I need all that, and then have those conversations. So my hope would be that a similar engagement model could happen when we talk to the feds at some point.

Dominique:

Well, Nishant, what you’re saying is so important, and it’s almost the mirror image of what I talk about with our team: the importance of getting the facts about how the product actually works, understanding the data flow so that we can participate early and often in providing meaningful guidance on how to do this in a way that will be not just compliant, but help the company really leverage the data. So I was interested in your bio where you talked about, you know, your C-level communications, because there’s the granular communication to get the product done, and then there’s a leadership conversation that has to happen to impress upon leadership the importance of this to brand, value, trust. If you could say the name of your book for the audience, we’ll put it in the notes, but I’m wondering about that piece, as well as whether leaders such as yourself could get connected with Senator Thune and Senator Wicker on these federal bills. I’m sure they would be so interested to hear from you.

Nishant:

Oh definitely. I would love to be part of that conversation. So when it comes to the C-level conversation, I feel like the easiest approach I’ve had is to make it about numbers. You know, when you look at the C level, they care about efficiency, they care about predictability. Business hates uncertainty. Everybody knows that. So if you want to reduce your data footprint, if you want to reduce your storage costs, how do you do it? You tell engineers to delete what they don’t need, right? And I can make common cause with a lot of folks who may not care as much about privacy. I’m talking about people who run the data warehouses, the people who have to provide cloud security, the people who have to build AI models, and the better the quality of the data, the more intelligent, the more dependable the AI models.

So what I try to do is I don’t make it all about privacy, and that’s kind of the epiphany I had like four or five years ago, which is: if I make it about privacy, I come off as a scold. I come off as the kid in school that always has the right answer, that nobody likes. So if I can make it about, “Hey, if you improve these privacy controls, delete this data here, anonymize this data there, aggregate this data in this third place,” the overall data footprint shrinks. What that means is we’re paying less money to store data. We are having to spend less money encrypting data and managing those keys. We are querying the database a whole lot less, because there is not that much to query to begin with.

So then, instead of me making the argument, I have people who would otherwise be my internal adversaries making that argument. And then I can swoop in and say, “By the way, if we do these things, that connects to this GDPR control, this FedRAMP control. It means we can get into these markets that historically have not liked us. It means we can give customers a dashboard that lets them toggle their privacy settings. But for us to do that, we need to get these things done first. If you do these things, it helps us become more agile as a company.”

So ironically, the way for privacy to become top of mind is to get into the background a little bit more and let other people do the talking. But this takes a lot of relationship building. This takes the kind of team I like to build, where you make sure that you have architects and experts and consultants advising engineers across the company. And also, if those other engineering teams in the company don’t have the right tools, the tools my engineers build are available as a service.

So you really want to be very strategic about that conversation. For me, the meeting with the C level execs is the afterthought. For me, the conversation happens in the lead up and make sure that other people are ready to speak up for privacy without me often saying a word.

Dominique:

Brilliant.

David:

Speaking of that, privacy is different than just pure technology. It sounds like you have to think about what type of information you’re going to be covering, et cetera. Can you envision a time when there would be a chief privacy officer or C-level privacy officer?

Nishant:

Oh, a lot of companies have them and there’s one at every company.

David:

Oh, do they have?

Nishant:

Exactly. There’s been a chief privacy officer at every company I’ve worked at in the last five years or so. The interesting thing is that the chief privacy officer typically tends to be somebody from the legal profession, although I do think that as the discipline evolves, hopefully people with my background get that role, because you want to make sure that that role is very diverse. I feel like the CPO builds a team full of attorneys, and my hope is that the most technical attorney can have an intelligent conversation with one of my engineers, and that engineers in my mold, people who care about more than just writing code, are involved enough in the tech privacy conversation at the policy level to hold up their end of that chat.

So as an example, in multiple roles I’ve had, my teams have built sort of this internal standard. This goes back to the point, David, you made, which is that there is not a US or world standard. So engineers reporting to me build that standard, and they work with the legal team and the policy team to make sure that we have an internal abstraction that pulls from GDPR, pulls from CCPA, pulls from federal standards, and creates this one consolidated version that you can then update on a six-month basis, because the laws change and public sentiment changes.
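
One plausible shape for that internal abstraction, sketched in Python with invented control IDs and requirement text: each internal control names the external rules it is intended to satisfy, so engineers build against one standard and the mapping gets refreshed as laws and sentiment change. The citations are illustrative, not a compliance checklist.

```python
# Hypothetical internal privacy standard: one consolidated set of controls,
# each mapped to the external requirements it is meant to cover.
INTERNAL_STANDARD = {
    "DATA-DELETION-01": {
        "requirement": "Honor verified deletion requests within 30 days",
        "maps_to": ["GDPR Art. 17", "CCPA 1798.105"],
    },
    "DATA-ACCESS-02": {
        "requirement": "Provide users a copy of their data on request",
        "maps_to": ["GDPR Art. 15", "CCPA 1798.110"],
    },
    "DATA-RETENTION-03": {
        "requirement": "Purge location telemetry after the retention window",
        "maps_to": ["GDPR Art. 5(1)(e)"],
    },
}

def controls_for(regulation):
    """List internal controls that claim coverage of a given external rule."""
    return [cid for cid, ctrl in INTERNAL_STANDARD.items()
            if any(regulation in cite for cite in ctrl["maps_to"])]

print(controls_for("GDPR"))  # all three controls
print(controls_for("CCPA"))  # the deletion and access controls only
```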

So I feel like the chief privacy officer role is extremely important, but you want to aspire for a chief privacy officer role where the attorneys who work under the CPO can work with engineering directly. These silos that we place ourselves into as a society, as a country, as a discipline are to our detriment. I feel we need a lot more of a cross-functional muscle to make sure that we can pivot quickly when required.

Dominique:

I agree with you there. I guess one question I have in addition to everything you just said is how much time is being devoted to this at the CEO level, at the board level? I feel like there’s been some movement on this in terms of cyber, but I’m just wondering how much time the typical CPO or chief data strategy officer gets with the strategy team?

Nishant:

So I’m going to use another analogy. I love to speak in terms of analogies. So I would say five, six years ago, doing my job was like running the McGovern campaign in Texas in 1972. Doing my job today is like running the Biden campaign in Georgia in 2020. There’s hope. It’s not easy, but there’s hope. So what I would say is the conversation at the C level is definitely improving across the industry, especially since really big companies have been called out now. Like, I remember one of the most prominent GDPR fines was levied by the British regulator on a British company, British Airways, right? That was a much bigger fine than was previously permissible.

So I feel like those numbers offer clarity of mind. And I feel like the C-level conversation is improving. What I think is really improving is the level below the C level, which is kind of the level I play at: how do you make sure that the right metrics exist? How do you make sure that engineering roadmaps don’t get disrupted by certain privacy changes, right? That’s where the conversation is really improving, because a lot of engineers want certainty.

What you don’t want is a situation where a lot of data you’ve collected is now not usable because you collected it without consent. What you do not want is a situation where you end up having to rebuild a system because now it doesn’t work anymore, because the assumptions of two years ago are not valid anymore. So I feel like that conversation needs to happen on an ongoing basis. So let me give you a template. What I have done in my last five years is, on a weekly basis, my team meets with edge teams across the company to just make sure that the roof isn’t caving in.

We meet on a monthly basis to make sure that we are strategically aligned, that the tools my team builds are adopted across the board at the pace at which we need. And then when it comes to MBRs or QBRs, that is, monthly or quarterly business review meetings, we have the most important metrics, key asks, and key risks synthesized for the C level to fully absorb. But what that also means is at that conversation, you’ll have somebody like me, you’ll have somebody at the chief privacy officer level, and you’ll have somebody at the engineering VP, SVP, director level having that conversation too. And going into the meeting, we are aligned and asking for those resources together, rather than being at each other’s throats.

So I feel like at the C level, the conversation has improved significantly, but, and you could tell there’s a but coming, at this point we’re like the runner who’s gotten on the treadmill on the 1st of January after feasting on donuts over Christmas break, right? So we’re getting better. Things are moving in the right direction, but we have a bunch of holiday fat, which is tech debt, to work off. So it’s getting better, but I feel like we have a long way to go still.

David:

Switching things a little bit: we had a couple guests on disinformation and disinformation techniques, and at least one of the guests said they had developed some algorithms, et cetera, to try to detect disinformation. I was just going to ask, is that part of your role, to try to detect and protect from that, or does someone else do that at these companies?

Nishant:

So I would say it depends upon the leader and the company. So in my case, I have engineers who report into me, who have those roles as well, that routinely participate in conversations around fairness and algorithmic bias. Disinformation is kind of right next to it, although in my current role it’s not a big part of what I do. But I would say, when you are looking at privacy through the lens of whether it’s antitrust, whether it’s fairness, whether it’s abuse, safety, trust on platform, physical locations, things like that, it’s all part of the larger trust conversation.

So I would think of data security, data privacy, as part of this larger chat about platform fairness, trust, things like that. I would say another imperative here, and this is where security, fairness, and privacy are often in tension, is that when you log into a website, the company has to check to make sure that you are who you say you are. And for that you need data, but privacy says collect only what you need. So there is tension there. When it comes to fairness, when it comes to adjudicating on disinformation: was it really you that posted this stuff? Are you really who you say you are? Are you using somebody else’s account? Are you a bot? Are you spamming? Are you DDoSing?

All of those things require data collection, which is something privacy discourages, right? So then you get into things like, how long do we keep this data? How do we make sure that the access control is strict? So I would say disinformation is a pretty critical thing when it comes to privacy. I feel like it makes the argument I made a few minutes ago, which is that you need these folks working together and talking constantly. The more you can build in terms of tooling, the more deterministic you can be, the more careful you can be. And the more you can prove to people that you are collecting data for a very legitimate purpose, rather than just collecting it for the sake of it.

David:

That’s a great answer. I love that. The concept of trust overall. That’s [inaudible 00:33:33].

Nishant:

And from a career perspective, I think Dominique talked about it. So if I were to think about my own career, my apogee point would be making sure that the chief trust officer role is occupied by somebody like me. I’m not saying me, but somebody like me, because you’ll have somebody who has built the tech, who understands the policy, who’s worked with the legal teams, and who can basically represent the trust and safety metric for the entire organization, rather than just security or just privacy or just AI.

David:

Yeah. I was thinking that when you’re talking. Oh, sorry, Dom. Go ahead.

Dominique:

I really like that idea. I know this is an area… I also follow this a lot in terms of where the trainings are going with the International Association of Privacy Professionals, for example. And they just started the Certified Information Privacy Technologist certification. I think we just made a big hire for someone who’s helping out with the privacy engineering sort of leadership part, because so much of this is in the training. But this combination of skills that you’re talking about, technology, policy, and being able to speak between the C-suite and rank-and-file engineers, that’s a really unusual combination. If we could replicate your skillset and stick it in every enterprise, that would probably solve a lot of the tension that we have with the legal community.

Nishant:

No, definitely. I think, Dominique, you made such a great point here, which is that you want to make sure that some of the tension goes down, because honestly the disciplines are not that far apart. I began this conversation by saying that the AI team, the marketing team, the finance team, the cloud security team, and the privacy team have a lot of the same interests. Why does it matter that the reasons are different? At the end of the day, we’re all going to the same place, right? Nobody wants to store data they don’t need. Nobody wants to pay for cloud storage, right?

So I feel like having that conversation, making it about data and making it about shared interests, is pretty critical. I feel like the legal profession has made a lot of advances in this space. Like, increasingly I’m seeing people with JDs who actually read the journals, who actually look at product releases, who try to understand how breaches happen, things like that. Increasingly, when I talk to attorneys, they’re like, “No, don’t dumb it down for me. Let’s get into the specific details.” In fact, in my current role, and I’m here in my own personal capacity, I’m increasingly seeing that when the tools my team builds get presented across the company, you will have a panel with a product manager and engineer from my team and somebody on the legal team.

I want this to be a joint effort, because at the end of the day, my budget comes from attorneys or somebody on the CISO side. That’s typically who pays my salary. So it’s in my interest to make sure that those folks get bang for their buck. And fundamentally, if you look at what people care about when they see privacy: “Can I control my data? Do I trust this company?” All these changes that are taking place with Apple and Google with third-party cookies and whatnot, people are looking for a trust metric, and right now that’s what’s missing. So having this level of cross-functional work together means that those metrics are a lot easier to come by.

David:

You mentioned the book at the beginning. It’s Data Privacy: A Runbook for Engineers, which I want. I love that. So you think somebody who doesn’t really know a lot about engineering, like myself, could actually read that book and understand it?

Nishant:

Yeah. Let me get to the heart of that question. So first up, about the book: if you go to Amazon or any publisher or any retail website and put my name and the word privacy in it, the book comes up, because I’m literally the only person I know of with my first name and last name in the US, or anywhere in the world. I know that because when I was getting naturalized as a US citizen, the government told me that my name was that rare. So I have a vested interest in privacy, because if you do something bad to me, it’ll happen just to me and not some other Nishant Bhajaria out there. So the life you save could be mine in this case. But yes, to answer your question, David, absolutely. The book is aimed at engineers; that’s my primary audience. But if somebody like you or Dominique were to read it, it’ll give you the context of how the work flows.

What are the trade-offs that engineers have to make? Why does it take so long? What are the different choices in terms of doing it early versus doing it late? Why would you want to build an inventory of data? So let me give you a specific example. As I mentioned in the beginning, imagine searching for something in a library that does not have categories, that does not have alphabetically sorted books, that does not have different floors, that does not have those labels. That is what privacy feels like right now when companies don’t have those controls.

So my book talks about building a categorization approach. That is, when you classify data based on risk, how do you categorize data at the point of ingest so that before anybody uses data, they know exactly what they’re dealing with? How do you build centralized tools in a team like mine, but also make them adaptable and accessible across the company for engineers to use? How do you build user-facing consent tools so that your relationship with your customer is a direct function of you, your platform, and the customer, rather than having Apple or Facebook put a layer in the middle? How do you build that direct connection? How do you make common cause with security? How do you build out a maturity model for privacy, just like the ones that exist for organizational maturity?
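
To picture what that categorization might look like at the point of ingest, here is a minimal sketch with invented tier names and field rules: every field gets tagged with a risk tier as it enters the system, and anything unclassified defaults to the most restrictive tier until someone reviews it.

```python
from enum import Enum

class RiskTier(Enum):
    PUBLIC = 1        # safe to share broadly
    INTERNAL = 2      # business data, no direct identifiers
    CONFIDENTIAL = 3  # identifies a person
    RESTRICTED = 4    # sensitive: health, financial, precise location

# Hypothetical classification rules applied as data enters the warehouse.
FIELD_TIERS = {
    "country": RiskTier.PUBLIC,
    "device_model": RiskTier.INTERNAL,
    "email": RiskTier.CONFIDENTIAL,
    "gps_trace": RiskTier.RESTRICTED,
}

def ingest(record):
    """Attach a tier to every field; unknown fields default to the highest tier."""
    return {
        name: {"value": value,
               "tier": FIELD_TIERS.get(name, RiskTier.RESTRICTED)}
        for name, value in record.items()
    }

tagged = ingest({"email": "a@example.com", "country": "US", "shoe_size": 9})
# 'shoe_size' is unclassified, so it lands in RESTRICTED until it is reviewed.
assert tagged["shoe_size"]["tier"] is RiskTier.RESTRICTED
assert tagged["country"]["tier"] is RiskTier.PUBLIC
```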

So the book talks about this entire evolution, and I feel like somebody like yourself could read the first one-third of the book and the last one-third, because there’s a lot of stuff in the middle that is purely engineering. But I’ve tried to strike a balance to make sure that engineers walk away with hands-on skills, but at the same time, people in the legal profession, people in media, people in the policy space understand the context of what it’s like to actually make privacy happen at the engineering level.

David:

Oh, that’s fantastic. Dominique, we got to have your team read that book.

Dominique:

Absolutely. I just feel that for us, it’s just not possible to advise on the legal issues until we really understand the technology. I can’t tell you how many engineers I’ve interviewed and sat down with, just making them explain, whether it’s with diagrams or wireframes or data flows or network architecture, to just really get that down. But I encourage attorneys to get their CISSP training so that they can really understand the network architecture as it relates to breaches. I know I did that. I found it very helpful, even as someone who was not going to be creating the routing systems and so forth, but just to be able to have a cogent conversation.

But I really like this cross-pollination idea of getting all the stakeholders at the table: the product teams, the engineers, you, and the lawyers. Do you find that people are understanding each other better through this, or is there friction in those teams?

Nishant:

A bit of both. I think understanding doesn’t improve unless there is friction, and friction often leads to better understanding. So I feel like some of that friction is necessary. My word of advice to the engineers, and I know you talked about the legal side of the house, Dominique, is that words have meaning, my fellow engineers. When you say you deleted the data or soft-deleted the data, there’s a difference. So when the attorneys come ask you questions, please be patient. Please explain stuff to them the way you would to a new engineer on your team.

So I feel like understanding needs to happen on both sides. But Dominique, to answer your question, I would say you want that level of friction. So let me give you a specific example: in one of the companies I’ve worked for, I started a consulting arm within my team. The goal was to have my engineers talk to other engineers across the company and shape their products at the design stage, and make sure that data wasn’t collected unless it was required, or data wasn’t exported out unless there was proper protection on the third-party vendor side, right?

When we first launched the program, there were concerns raised on the legal side. I was surprised. I’m like, “Why would you be concerned? We’re trying to do the right thing here.” So that’s some of the friction we’re talking about, Dominique. The concern on the legal side was that it would create this illusion that there are now two privacy reviews, one conducted by the legal team, another one conducted by my team, and it would create swirl and entropy and confusion among the engineers. Just as critically, it would slow down the flow of products, and engineering would revolt.

So there was a contextual difference, and we came up with a better way to do this when I said, “Okay, I’ll call my team the privacy consulting team, and they will consult with engineers at the initial stage, so as to make sure that stuff that often gets caught too late in the process is caught at the early stage. And we can give the legal team a heads-up that, hey, this thing is coming your way, you might want to be careful.” When we got that business model to work over the course of three to six months, the DPIAs and the PIAs that the legal team was doing got much faster. The number of things that snuck through went down dramatically. The number of times the legal team had to say no shrank pretty significantly. And more importantly, my team had a better understanding of the sorts of things different business verticals do, in a way that would never be possible if you came in at the tail end of the process.

I love friction, by the way. I love disagreement. I love the contest of ideas. It’s when we try to get along just for the sake of make-believe that things go badly. I never get more scared than when everybody agrees in the first 10 minutes, because I’m like, “Just give me 10 minutes more.”

Dominique:

In terms of the privacy impact assessments and the data protection impact assessments that you were mentioning, the PIAs and DPIAs: this is one of those areas where having the heads-up ahead of time would be so great, because lots of times when you have to kill a project or say, “Oh, this will be a problem,” that, from a legal standpoint, is often not really welcome from the product team. So I love the idea of collaboration.

Nishant:

Yeah. I feel like it’s even more important because look at what we’re going through right now. We have kids learning from home. We have people getting telemedicine from doctors at home. We have people doing all their shopping at home. So the amount of data that is being exchanged by companies and customers today is unlike any we’ve seen before. I think some of that is going to continue even after COVID is done, which I hope happens tomorrow, because I’m done with this virus. Sorry, had to get that little gripe in there.

So what’s true is that if the amount of data companies collect increases by like 2x or 3x, that does not mean that my team can increase by 2x or 3x. I’m an expensive resource. There are not that many people like me. Also, the legal team cannot increase by 2x or 3x. So we’re going to have to find a way to scalably, ethically do these security and privacy reviews in a way that does not turn the engineering team off, nor does it create the customer risk situation that we do not want either. And I think it’s important to be very pragmatic about this, because engineers are very smart, whatever company you work at.

If you make the privacy review process overly bureaucratic, the engineers will find a way to work around you. That is just a fact. I hate to break it to you, but I used to be an engineer. I know where all the bodies are buried because I put some of them in there myself 10 years ago. So you have to figure out a way to make this process efficient. Otherwise, you’ll have a CEO that tells the world privacy is important and you have an engineer that at the same moment is collecting data that they should not be collecting. So I want to reduce this dichotomy here and make sure that the privacy review process is efficient, it’s intelligent, and it brings the engineers into the conversation early on in the process.

David:

On that note, you mentioned at the very outset that you had more than one major in college; you had a soft major and a hard major, which really… And you obviously combine ethics and science and technology. So just curious, knowing what you know about the way platforms are structured and the management structure, what would be your ideal position, and what would you be doing beyond privacy in terms of doing or fostering the kinds of collaboration that Dominique was talking about?

Nishant:

Yeah. So I think, having some sort of a trust board. So in my current role, I have people reporting into me, and it’s been that way for the last several years, but this is not always possible for companies. They may not be able to have a central privacy engineering team. So in that scenario, what would be awesome is to have somebody like me essentially have direct connections and commitments from edge partners, business partners, product partners across the company to say, we will figure out a way to build this stuff in a shared responsibility model. That would be pretty awesome.

That’s the more tactical answer, but the more strategic answer would be for me to be in a situation where I can be an emissary for industry, for the government, and essentially for trade associations, like the IAPP and others, the FPF and whatnot, and make sure that there is a way to come up with standards, how-tos, and how-not-tos to ensure that companies have a place to start. Because I talk to a lot of startup founders. I talk to a lot of VCs, and they are concerned because they do not know how to measure risk. How do you draw the line between an idea, the initial design, the data, the risk, and the engagement model?

How do you connect these dots in a way where you can come up with an understanding of how things break and how things work? To be able to tell people, “Hey, this is like this other thing that happened four years ago; thou shalt not do it,” and then codify that in a standard would be pretty phenomenal. I mean, my dream job after the one I already have right now would be to shape that conversation at a federal level and really make sure that customers, the people who pay our bills and use our services, end up being protected. That would be a dream job for me down the line.

David:

Excellent. Just quickly, speaking about the government, we’re going to have to wrap up quickly, but we’ve got something going on in Ukraine, and in terms of national security and protecting our infrastructure and things such as that, what are your thoughts about our current readiness for protection and what we can do to improve it? Anyway, we’ll get your thoughts on that before we conclude.

Nishant:

Definitely. I feel like this is one of my biggest fear factors. Our security as a country, as a society, is based on norms. Laws are great, but unless we agree on norms, like a common empirical truth, like adherence to societal respect and understanding the facts, we’re not going to get anywhere fast. So I feel like the fragmentation of our society, where we self-select and park ourselves with people who are like-minded, is great when you are on social media, but it’s not so great when you’re trying to protect your country or your bank or your company.

So having a working model where industry and government work on an ongoing basis in a proactive mode, rather than in a reactive mode, would be great. I think of this image of big tobacco CEOs testifying before Congress in the late ’90s. That’s not the way to go. I feel like the way to go is industry comes forward and says, “Here’s how we think security and privacy can work. Here are standards. Here are things that may not work well, but that we can do with government support.” And then government incorporating that into laws on an ongoing basis, and investing in the sorts of education and training programs that create the engineers who make it all possible, would be great.

My worry about our posture, David, is that I think when it comes to reaction and fixing, we’re great. When it comes to proactive work and planning, we’re not so great, which is where, again, the B-minus and C grade comes in, except I think the scale is much larger and the stakes are much higher. So I feel like we’re not as good as we can be or should be.

Dominique:

This concept with Ukraine, and also thinking about what Nishant said, really applies also to some of the newer issues that are coming up in terms of data fairness, discrimination, AI for… I’m just thinking about the Data & Trust Alliance that I’ve been reading about, where industry is trying to come up with standards with regard to things like bias, and folks such as yourself are critical to those discussions because you have the technological knowledge to know what can and can’t be implemented. So I’m so excited to see where your conversation goes. I personally have some introductions I want to make for you-

Nishant:

Would love to.

Dominique:

…at the federal level, but I’m excited for the discussion that you’re leading.

David:

Yeah. I don’t think we’ve had a guest like you in a while. Not that that’s critical of our other guests; all our other guests are great, but the way you span ethics and technology is so wonderful. It’s just terrific. So that’s going to lead me to my last question, because it is a podcast and you got to ask a silly question. So what is the best fiction book, fiction not nonfiction, that you’ve read during the course of the pandemic?

Nishant:

I reread A Tale of Two Cities, and I love Charles Dickens’ narrative style, because I’m not very patient as a person. I’m pretty happy, but if you read a Dickens novel, you learn how to take things gradually. It also teaches lessons about what happens when society gets too fragmented, when distrust becomes the currency of the land. It brought a lot of our current situation as a world into light. But I’ve enjoyed reading that book because I just love getting into a situation where it’s a vastly different world, but also, A Tale of Two Cities is a bit like watching a Law & Order episode. It’s close enough to reality, but it’s just different enough.

David:

Wonderful. All right. Well, with that note Nishant, thank you so much for joining us. It’s been terrific.

Nishant:

Thank you so much for having me here. Thank you.

Dominique:

Thank you for listening to Decrypted Unscripted. A podcast by David Biderman and Dominique Shelton Leipzig.

David:

If you’re enjoying the show, please rate, review, and subscribe on Apple Podcasts or wherever you listen.

Dominique:

To learn about the podcast, you can also go to our website.

David T. Biderman

David Biderman, a partner in Perkins Coie’s San Francisco and Los Angeles offices, focuses his practice on mass tort litigation and consumer class actions. He heads the firm’s Mass Tort and Consumer Litigation group. He has represented a wide variety of companies in state and federal courts in California for 30 years.

On consumer class actions, David represents packaged food companies, coffee companies, dairy companies, footwear companies and others whose nutritional or health claims have been challenged. He also has represented search engines and other online companies. He has a record of favorable results for clients. He successfully tried a major consumer fraud class action on behalf of one of the world’s major search engines in a case involving online gambling advertisements. For that same client, he negotiated a favorable settlement of a class action challenging its online advertising pricing. He represented a major coffee retailer in defeating a class action on standing grounds. He also has litigated pre-emption defenses arising out of food labeling and obtained a dismissal for a client whose nutritional statements were challenged.

For fifteen years, David managed the firm’s full-service product liability team responsible for defending over 1,000 toxic tort cases pending in Los Angeles and Northern California state courts. These cases entailed ongoing trial activity at various levels for several trials set each month. The highly experienced and well-coordinated team has handled thousands of asbestos toxic tort cases for a variety of clients, including FORTUNE 500 companies from such industries as consumer products, aerospace manufacturing, household goods, dry cleaning and industries that generate electromagnetic fields, such as electric utilities and operators of wireless communications systems.