
When GDPR and CCPA strike: Silver Lining for Security Teams in Data Protection Clouds

BSidesSF · 2020 · 24:50 · 139 views · Published 2020-03 · Watch on YouTube ↗
Speakers: Rafae Bhatti
Category: Policy
Style: Talk
Mentioned in this talk: Standard
About this talk
Rafae Bhatti - When GDPR and CCPA strike: Silver Lining for Security Teams in Data Protection Clouds

Data protection obligations can be an ally to the security team instead of a burden. Having a good understanding of them helps inform security risk modeling and prioritization, secure buy-in when setting the agenda of a security program, and reduce the overall liability exposure of the organization.
Transcript [en]

Please join me in welcoming Rafae from Mode.

Good afternoon, everyone. It's 4:00 p.m., and I know I'm standing between you and the closing, so trust me, I feel you. Thanks for staying. It's actually my wedding anniversary today, and for those of you who are married, you would know that I cannot afford to be late for dinner today, right? So let's try and get all of you out of here, as well as myself, on time.

I'm Rafae Bhatti. I am the director of security and compliance at Mode. I build security programs at startups: currently at Mode, prior to that at a healthcare company called HealthTap, and before that at New Relic when it was pre-IPO. More recently, I'm also a licensed attorney in California. Don't hold that against me, though, because in this new journey in my career I view my role not so much as a lawyer but more as a legal engineer: an engineer that's trained in law and its application.

So, enough about myself. What is this talk about? Well, this talk is going to be about scenarios that are based on data protection risks, risks like GDPR and CCPA. We all understand that they create risks, but why should security teams care? So let's talk about the problem that we're trying to solve, and then we'll go from there. Any Jason Bourne fans in the audience? Yeah.

This is the original motion picture soundtrack of that movie, and there's a song in it called "Backdoor Breach." The movie is about backdoor exploits to gain unauthorized access into computer systems, and the reason I'm choosing to start my talk with this is because the traditional security risk landscape is focused on backdoor breaches. In fact, this scene from the movie is a good analogy. This is a custom-built SWAT truck in a car chase sequence on the Vegas strip. Why do we build a custom SWAT truck here? It's designed to be not infiltrated. However, for those of you who have seen this car chase, the drivers still ended up trashing it due to poor navigation, and if you haven't, you should; it's pretty stunning how this ends.

So now imagine you are building an application that is designed to be impenetrable. You do your threat modeling, pen testing, and the like. But much like this SWAT truck, poor navigation means that the application can still be compromised by someone who is in the driving seat. So what do we do about that? Well, this is where we want to focus: between backdoor breaches and what I'm going to call front door breaches. Let me give you a couple of examples. Equifax: backdoor breach. Facebook and Cambridge Analytica: front door breach.

Now, I know you might be thinking that the concept of front door breaches is not easy for us to digest as security teams. After all, by information security standards, what happens in a front door breach is technically not a breach. In this case, no systems were compromised, and no sensitive information was accessed or leaked. However, what happened here is not a matter of their systems being infiltrated or compromised; it's their systems working as designed and allowing the data to be misused.

So the next logical question is: why should security teams care? Let's look at the legal consequences. Equifax: about 170 million accounts compromised; the penalties were $700 million and up. That was the settlement amount. In the case of Facebook and Cambridge Analytica, about half as many users were affected, about 87 million user profiles whose data was accessed for political campaigns. But at that time GDPR was not in effect, and so the fine was capped at half a million pounds. If that type of incident were to happen today, given Facebook's current market cap, the fine would have been upwards of 1.4 billion pounds. So security teams can no longer ignore front door breaches, or, as you might want to call them, legal breaches. What we can do is measure the data protection risk. To be able to do that, we need to view the risks from

both technical and legal lenses. What you see in this picture is a scientific phenomenon called parallax: the apparent displacement of an object based on the change in the observer's point of view. Astronomers use this technique to measure distance to the stars. In our world, parallax occurs when the legal and security teams view the risks to an organization from different perspectives. Security teams might consider this not a breach, because no systems were infiltrated; legal teams would say it's a violation of our legal obligations, so it is a data breach. To address this dilemma, engineers and lawyers need to do something they don't usually do: they need to talk.

Who would know what happened when an engineer and a lawyer walked into a bar? Usually nothing, because they never did. Except in this case the engineer-lawyer is one person, and he's talking to you right now, and so the sentence ends with: he ordered a drink. So when I walk into the bar with myself, what do I see? I'm able to see security from a data protection lens, and in my experience it distills down to the following. The biggest transition that security teams need to make is to view the risks from the data subject's perspective. In the world of legal breaches, it's not just about unauthorized access; it is about reasonable expectation. That's at the core of GDPR, CCPA, and almost all data protection laws. For example, getting clarity on the risks that are introduced throughout the data lifecycle: if you use customer data for other than the business purpose, you're going to be liable. It's not just about technical controls but also contractual obligations, both upstream and downstream: if your vendor or partner misuses customer data, it's a breach that you are liable for. And it's about implementing reasonable controls, security controls that help reduce the data protection risks. You may want to do risk calibration and trade-offs to prioritize the most important risks based on their legal consequences.

So it looks like we have a plan, but are we done yet? Not quite. While this is all good, the path to navigating these risks is often filled with uncertainty. So you might ask: what type of data can I collect, process, or store? What type of controls are adequate or reasonable for me? And what type of obligations should we accept or enforce? How do we find clarity amidst this uncertainty? If you ask a lawyer, their response: the two words most commonly used by lawyers. Any guesses? There you go: "it depends." The question then is, what would it take for us to go from "it depends" to "hey, what does it depend on?" And that's what we're going to focus on next. Next we're going to find a solution to this problem. When do we find a solution? When engineers and lawyers talk. So let's take a

look at an example from real life. Suppose you are driving a car. When do you buckle your seatbelt? Always, right; that doesn't even depend on anything. When do you turn on your headlights? When the visibility is low. And when can you drive if drunk? Technically, below the legal limit, but I'm not endorsing that. So what does this tell us? There's a correlation: there's a risk between driving without headlights or under the influence, and that risk depends on both technical and legal aspects. For example, the risk of driving without headlights depends on the level of ambient light at which visibility is low; that's the technical perspective. But it also depends on the legal interpretation of when the visibility is considered low. This Mississippi state statute says it includes fog, rain, dawn, dusk, etc.: the three magic letters. So how close to dusk or dawn? How dense a fog? You could choose to drive with headlights turned on all the time, and that's overestimating the risk; or you could choose to drive with headlights turned off until very close to sunset, and that's underestimating the risk. Now suppose turning the headlights on or off is the same as turning encryption on or off on your data. Keeping the data always encrypted is overestimating the risk, and never encrypting the data, even in storage, is underestimating the risk. This simple example at least

illustrates the idea of what I'm calling right-sizing your risk profile. So now we have a plan: we talked about paying attention to data protection risks, we talked about the uncertainty in navigating them and right-sizing the risk profile, and we talked about going from "it depends" to "what does it depend on." The best way to understand what this looks like in practice is by using example scenarios, so let's run with it.

Scenario one: a company's vendor launches an email campaign using the personal data of the company's users. The data belongs to consumers in California. Can a California consumer sue the company for a CCPA violation? It's a standard security practice to evaluate vendors, especially when they're processing personal data, so you'd wonder: how can a vendor that passed your security evaluations become a source of legal consequences for the company? To answer that question, we need to understand how the concept of data sharing under CCPA is defined, in particular the concept of sale. Data shared with a vendor can be considered a sale if it's used for any purpose other than providing the service, and for all data considered a sale, the consumer has a right to opt out that you must honor.

Let's look at an example. Say you have two tools: one is a UI tool vendor that helps pinpoint bugs in your UI, and the other is a customer support portal or ticketing system. Which one of them, you'd think, accesses personal data? You may think the UI tool vendor does not, but it may still be accessing personal data in the way it's integrated with your system, so it should be audited for data use. Let me ask you a question: when was the last time anyone here ran a cookie audit on your vendors? Very rarely, right? Vendor security checklists should include a cookie scan, but very few actually do, and you would need that to prevent front door breaches. As for the customer support vendor that has personal data, it must still continue to be held accountable. If it starts to use personal data for purposes other than customer support, such as sending email campaigns for its own business, that would be a CCPA violation and an unauthorized sale, and unlike the cookie scan, that type of risk is better addressed using contractual obligations. The takeaway here is that we need to tie the personal data audit to the vendor security review. Security teams can help prevent accidental risk accumulation by auditing the use of personal data through a combination of tools and contractual obligations. And remember: if the use of the personal data by the vendor for non-business purposes does not stop, then the consumer can sue the company.

Okay, moving on to the next scenario. A company accidentally shares a report belonging to organization A with

organization B. The data belongs to consumers in the EU. Is the company required to notify consumers of a data breach? Data sharing is the classic example of a front door breach, but unlike the vendor scenario, we are in a situation where the breach has already happened. This ship has sailed; the opportunity to avoid the risk has passed. However, accidental sharing is a reality and can happen to anyone. So what can the security team do now? Well, we can prevent a bad situation from getting worse. Let me give you two examples of bad situations. The SWAT truck driver in Jason Bourne navigated the risks poorly and crashed, but in the case of Sully, he avoided the crash; instead he did a water landing. Let's refresh our memory about Captain Sully and flight 1549 that landed on the Hudson. Let's see if this goes as expected. All right, because you can't see it, I'm just going to pause the video right here. Okay. [Music]

Flight 1549 had a unique risk profile. This is an example of finding clarity in uncertainty. If all Sully had at that time was an "it depends," without the clarity of what it depends on, then we can all imagine why it would not have ended so well. But Sully made a decision based on the risk profile of 1549, which was: speed was too high, altitude was too low. One of the most frequent comments that were left said "it's not closing." Okay, okay. So coming back to the breach: typically you also have limited time to decide. How would you decide if you require notification? Once again, it's going to be based on your unique profile: did the exposed data include personal data, was the exposed data encrypted, and was the source of the breach addressed? The right answer would make a difference between a water landing and a crash, and the answer may even be different based on whether you are in California or the EU. In California, if none of this was true, then no notification is required. Under GDPR, you also have to consider whether the processing was high-risk. So what is high-risk processing? Say you have two

applications: one is used to share reports with your users about their utility bills, and the other to share information about clinical tests. Say you have a filter that is used to select the recipients of these reports based on zip code, and you enter an incorrect zip code, so the reports go to unintended recipients. Ouch. But it happens all the time. In one case it's the billing info going out; in the other case, the clinical tests. Both of them are personal data, but one of them is considered high-risk under the law. What do we go back to? Data subject expectations. Processing of billing data is unlikely to be high-risk. Why? Because it's used to provide a public service. Not so much for clinical tests: the expectation of the patient would be for that data to remain private, so the processing of health data is considered high-risk. Therefore, under GDPR, this type of accidental sharing will require notification to the consumers.

The takeaway here is that focusing on data processing risk helps guide breach mitigation and response. Tying the risk of disclosure back to business consequences should help prioritize the security agenda. For example, if you have only one engineer on your team doing security reviews and you have two projects dealing with personal data, one of them in public utilities and the other in healthcare, you now know which one to focus on, because you know which processing is high-risk. But also keep in mind that the analysis is different in California, where both cases would be equal priority.

Continuing with breaches, here's our final scenario. Let's assume the previous breach was reportable: you did the right thing and reported the breach. What can possibly go wrong with that? Well, someone can come back and sue you. But not so fast. First of all, this right is not available under GDPR, and under CCPA it is available only when certain conditions are met. In other words, even if the breach is reportable, it's not always actionable. So here's another opportunity for us as security teams to prevent a bad situation from becoming worse. Let's see how. Let's first find out under what conditions the

breach is actionable. Once again, it's going to be based on our unique risk profile, and again the answers are different based on whether it's California or the EU. The right is not available under GDPR, and in California under CCPA, even if all of this was true, meaning it was a reportable breach, if the company had reasonable security in place, then it's not actionable. So let's break it down: what does it mean to have reasonable security? It means reasonableness of security procedures and practices; it means an operational cybersecurity program. And how do you demonstrate that? Through written documentation and certification by adherence to an established security standard such as ISO, NIST CSF, or SOC 2. SOC 2 is the one that most SaaS companies use, and it's the one that's most preferred, because it allows you to attest to the operational effectiveness of your program.

The takeaway here is that a demonstrably effective and operational cybersecurity program can be your last line of defense. Having an audit report helps demonstrate operational effectiveness. But let's be honest: most of us don't love compliance work. Oftentimes we question the value of putting effort into a compliance framework, but realizing that an audit report could be your ally and best friend in case of a breach, and help mitigate legal consequences, can drive prioritization.

Okay, so the title of this talk was "silver lining for security teams," so here's a recap of how

viewing security from a data protection lens helps with your security program: tying the personal data audit to vendor security reviews, focusing on data processing risk, and investing in the operational effectiveness of your program. Before I conclude, I would like to add one more thing: if you understand the risks, you will reap the rewards. So communicate the risks to the board, the CEO, engineers, everyone, because when you understand the risks, you can translate them into business needs. Why is that important? Because needs get buy-in; risks get a workaround: they will ask you to do it another way. You should also harness the power of awareness around data protection to build a security culture. GDPR and CCPA are good guidance that is becoming law; use the momentum to your advantage. And lastly, more and more security teams are now recognizing how data protection risks are tied to the security agenda. Privacy engineering roles are on the rise; I've seen quite a few companies put out job reqs for them. So take a look at your unique profile and make a business case if you need to hire one; again, demonstrating the need would help you get the buy-in.

So there you have it. We've all been hit by birds at one point or another; hopefully you are taking something away from this talk that would help you avoid crashes and do water landings instead. I'll leave you with this anecdote. After Sully landed flight 1549 on the water, he got in touch with the airline operations manager. That person, however, cut Sully off, saying, "I can't really talk right now, we have a plane down in the Hudson." To which Sully said, "I know, I'm the guy." So here's hoping when it's our time to land, we are on the Hudson, not in it. Thank you. Happy to take any questions. Oh, there is one.

Yeah, so going back to the data processing risk example: I think that was pretty concrete in terms of when you have two different situations requiring security review. Both of them have personal data, but one of them does

not have the risk of reporting, because the processing is not high-risk, whereas the other does. The healthcare data has high data processing risk attached to it, so you would want to focus more on that in case of an accidental sharing incident, because that requires notification. But if it's not high-risk processing and you meet all three criteria that I talked about, which is whether the exposed data contains personal information, whether it was encrypted, and whether the source of the breach has been killed: if you've met all those three things and the processing is not high-risk, then you don't have to worry about it. So going back to when you're creating your security priorities: you can look at the source of the data, you can look at the processing risk attached to it, and then you can prioritize where your investments need to be, or which area needs more focus. Another example would be vendors. A cookie scan can help you detect if the vendor is collecting more data than you are aware of. That's a risk of misuse of personal data by the vendor that you can translate into a business need, by saying that if the vendor does that, we will be liable, and therefore we need to invest in additional auditing of vendor data. Any other questions?
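The vendor cookie-audit idea that comes up in this answer can be sketched roughly in code. This is a toy illustration, not the speaker's tooling: the approved-vendor list and domains are hypothetical, and a real audit would need a headless browser to capture cookies that vendor JavaScript sets at runtime. The sketch only inspects `Set-Cookie` headers and flags any cookie whose domain is neither first-party nor an approved vendor.

```python
# Minimal sketch of a vendor cookie audit: flag Set-Cookie headers whose
# Domain is neither first-party nor on an approved vendor list.
# All names here are hypothetical; a real audit would crawl pages with a
# headless browser to also catch cookies set by vendor JavaScript.
from http.cookies import SimpleCookie

APPROVED_VENDORS = {"support-portal.example", "ui-analytics.example"}  # hypothetical

def audit_cookies(set_cookie_headers, page_domain):
    """Return (name, domain) pairs for cookies that are neither
    first-party nor attributable to an approved vendor."""
    findings = []
    for header in set_cookie_headers:
        cookie = SimpleCookie()
        cookie.load(header)
        for name, morsel in cookie.items():
            # Fall back to the page's own domain when no Domain attribute is set
            domain = (morsel["domain"] or page_domain).lstrip(".")
            if domain != page_domain and domain not in APPROVED_VENDORS:
                findings.append((name, domain))
    return findings

headers = [
    "session=abc123; Domain=mycompany.example; Path=/",
    "track_id=xyz; Domain=.unknown-tracker.example; Path=/",
]
print(audit_cookies(headers, "mycompany.example"))
# → [('track_id', 'unknown-tracker.example')]
```

The point of the exercise is the allow-list: anything a vendor sets that you did not knowingly approve is a candidate front door risk worth investigating.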

Otherwise I can go back to my dinner, and you all can go to wherever you need to go. Thank you so much. [Applause]
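The breach-notification triage walked through in scenarios two and three can be summarized as a rough decision sketch. This is illustrative only, not legal advice, and not an implementation from the talk: the parameters are simplifications of the criteria the speaker lists (personal data exposed, encryption, source of breach addressed, high-risk processing), and the jurisdiction handling compresses a nuanced legal analysis into two branches.

```python
# Rough sketch of the breach-triage logic described in the talk.
# Illustrative only: field names and branch conditions are hypothetical
# simplifications of "it depends" factors, not a statement of the law.

def notification_required(jurisdiction, has_personal_data, was_encrypted,
                          source_addressed, high_risk_processing):
    """Decide (roughly) whether an accidental disclosure requires
    notifying consumers, per the talk's simplified criteria."""
    if not has_personal_data:
        return False  # no personal data exposed: nothing to notify in either regime
    if jurisdiction == "california":
        # Per the talk: if the exposed data was encrypted,
        # notification is generally not required.
        return not was_encrypted
    if jurisdiction == "eu":
        # Per the talk: under GDPR, high-risk processing (e.g. health data
        # vs. utility bills) drives notification to data subjects.
        if was_encrypted and source_addressed:
            return high_risk_processing
        return True
    raise ValueError("unknown jurisdiction")

# The talk's example: unencrypted clinical-test reports sent to the wrong
# zip code in the EU require notification; contained utility-bill data does not.
print(notification_required("eu", True, False, True, True))   # True
print(notification_required("eu", True, True, True, False))   # False
```

The shape of the function is the takeaway: the inputs are exactly the risk-profile questions the talk says you should be able to answer under time pressure.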