
All right, thanks. Really excited to be talking here at BSides virtually. If you haven't gone over to the crypto puzzles channel in Slack, I highly encourage you to do so; and if you're coming here from the crypto puzzle channel, you might get some hints on the puzzle by listening to this talk. So, we're here to talk about modern symmetric encryption. Hiding important messages sent between two people has been a goal of people from as far back as second grade, and it might actually go back a little farther: in 1500 BC, a Mesopotamian potter encrypted the method for making his pottery glaze and inscribed it on a clay tablet. This would have been really important information for him, and valuable as a tradesman, so the added effort of protecting the method for waterproofing and decorating his pottery totally makes sense. Ancient Hebrew scribes also made use of encryption, using a substitution cipher to encode the names of their enemies in parts of the Bible as early as the 600s BC. By the time the Roman Empire overtook the Greeks in population and power, Julius Caesar improved the speed of encryption by taking each letter in a message and shifting it forward three places. Later, his nephew Augustus modified the cipher and, instead of shifting messages three places, only shifted them forward one letter; I guess three was a little too hard. By 1467, rather than encrypting just one letter at a time, encryption schemes started using multiple alphabets and keys based on words and phrases, rather than a rotation by a number from 1 to 25 like the Caesar cipher. This brought an important feature to enciphered messages: previously, any one plaintext letter always translated to the same cipher letter, but now there were many more plaintext possibilities for every letter in the ciphertext, making decryption much harder.

World War II is often considered the start of modern cryptography, with the advent of machines rather than by-hand calculations. These cipher machines allowed more complex ciphers to be executed faster, and prevented the calculations designed to break encryption from being done effectively by hand. Cipher designers from various countries contrived several machines for use in wartime, and many continued to use improved versions of these cipher machines until the advent of modern computing systems, which greatly impacted the machines' effectiveness. At the same time that cipher machine security was being improved and their internal complexity increased, mathematical advances in the field of cryptography likewise moved forward. During the half century following World War II, research into cryptography increased from both private companies and public universities, and that research has continued at an ever-accelerating pace. Since it was considered critical for war, limits were imposed on the now much stronger cryptography: these cryptographic systems were restricted from export from the US and put in a category with nuclear bomb designs and even horses, which probably came from a much earlier era. This is the landscape that bore the Data Encryption Standard, or DES, which is now considered woefully inadequate. It could have been better, though: its original design had twice the key size, but that ended up getting curtailed by the NSA's tweaks to the algorithm prior to standardization. The restrictions on cryptography were relaxed a little in the 90s, when the government realized it was a lost battle, largely due to the efforts of international publishers and open-source enthusiasts.

In the mid-90s, Triple DES was standardized publicly, and another cipher called RC4 was leaked to the public. Then, nearing the 2000s, NIST announced that they would create an Advanced Encryption Standard, better known simply as AES, and opened up submissions to the world. Textbooks, cryptography classes, and online examples have been created to explain those algorithms: Triple DES, RC4, and AES. Depending on the level of the course, explanations of the ciphers can range from extremely simple, like this, down to the full cipher detail of each individual byte, like this or this, or be put into a more fun format, like this one. The problem is that whenever you start trying to explain the algorithm at that level, you end up looking like this. Each of these two kinds of explanation is useful, depending on the level of understanding you need. The problem comes when you try to extrapolate out of either just that deep knowledge or the very shallow knowledge with no additional context: you end up with insecure systems. These systems could be nearly anything encrypting data: databases, application storage, transport mechanisms protecting data sent over the air. In this talk I want to take the road in between these two views, in order to give you a better understanding of the current best practices for safely encrypting data. Before we get too far, though, there's a little bit of background knowledge you need to get the most out of this
talk. First, it's important to understand that there are two main types of symmetric-key ciphers: block ciphers and stream ciphers. Both types have output that's indistinguishable from random noise, as long as the algorithm is cryptographically secure. DES and AES are both examples of block ciphers by design, while RC4 and Salsa20 are both stream ciphers. Block ciphers operate over chunks of data: they take the data and break it into blocks to operate over. Stream ciphers, on the other hand, generate a stream of randomness to combine with the message. The combining here is done using a logical XOR, which looks at a pair of bits and sets the bit (marks down a one) as the answer if the inputs are different, and doesn't set it (puts down a zero) if both inputs are the same. The way I was taught about XOR was: it's one or the other, but not both. One of the cool properties of XOR is that it's fully commutative. If you think of columns A and B and the answer as cups in a cups-and-balls trick, then to set a bit you put a ball into a cup. We'll set cups A and B, which means the answer cup, because A and B are the same, is not set. Now we can take these three cups and swish them around into every different position, and no matter what, the equation is still true; no matter how you mix up the cups, the math holds. This is the math behind what a stream cipher does. Those previous iterations went through all but one of the rows in the truth table we saw two slides ago. The last one is when both A and B are not set, which means you don't set the answer cup either, and of course at this point you can still slide all the cups around without changing the truth of the equation.
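The truth table and cup-shuffling property above can be sketched in a few lines of Python. This is a toy illustration of XOR's properties, not cryptographic code:

```python
# The XOR truth table and the "cups and balls" property: XOR is
# commutative (swapping inputs changes nothing) and self-inverting
# (applying the same value twice restores the original).

def xor_bit(a: int, b: int) -> int:
    """Set the answer bit only when the two inputs differ."""
    return a ^ b

# The four rows of the truth table: 0^0=0, 0^1=1, 1^0=1, 1^1=0.
truth_table = {(a, b): xor_bit(a, b) for a in (0, 1) for b in (0, 1)}

# Commutativity: shuffling the cups never changes the answer.
assert all(xor_bit(a, b) == xor_bit(b, a) for a, b in truth_table)

# Self-inverse: XORing with the same value twice gets you back.
assert all((a ^ b) ^ b == a for a, b in truth_table)
```

That self-inverse line is the whole reason stream ciphers work: the same operation both encrypts and decrypts.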
The random stream of data used with these ciphers goes back to a really old idea, described in 1882 by Frank Miller for securing the telegram. The idea was further developed in 1917 by somebody from Bell Labs, in conjunction with the US Army Signal Corps: have a really, really long random string of characters, a one-time pad, as the key, and combine that with the message. A famous mathematician named Claude Shannon, also working at Bell Labs, mathematically proved that with a one-time pad, if it is truly random, the cipher is completely unbreakable. The goal of stream ciphers is to approximate that idea by generating a pseudo-random stream of data (it looks random, but it was generated through a repeatable process) and combining it with the message to produce ciphertext. One of the best things about XOR is that commutative property: once you have the ciphertext, by rearranging things like the cups and balls, you can recreate the random stream. And if you can recreate that random stream, say by using a keyed stream cipher or a really long one-time pad, you can XOR the ciphertext and the random stream together and get the plaintext message back.
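The encrypt-then-decrypt round trip above can be shown with a toy one-time-pad sketch, using nothing but the standard library:

```python
import os

# Toy one-time-pad demonstration: XOR the message with a random
# stream to encrypt, then XOR the ciphertext with the *same* stream
# to decrypt. The commutative/self-inverse property does all the work.

def xor_bytes(data: bytes, stream: bytes) -> bytes:
    return bytes(d ^ s for d, s in zip(data, stream))

message = b"attack at dawn"
pad = os.urandom(len(message))          # the "really long random string"

ciphertext = xor_bytes(message, pad)    # encrypt
recovered = xor_bytes(ciphertext, pad)  # decrypt: rearrange the cups

assert recovered == message
```

A real stream cipher replaces `os.urandom` with a keyed, repeatable generator, so the receiver can recreate the same pad from just the key.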
So now let's quickly turn to block ciphers. This algorithm takes in data and a key, and outputs an array of random-looking bits. Now, remember from earlier that overly simplistic diagrams can be dangerous if you combine them without knowledge of the greater context; here's why. Block ciphers have security properties that are really desirable: they promise that with a secret key, a message goes in and random data comes out. That's exactly what we want. But what if the message going into the block is the same every single time? Then the output is also the same every single time. The good news is that each block of output is very random looking, but only if you look at each block by itself. One example of this effect was created by a student as part of a class paper and then added to the block cipher page on Wikipedia; that cool idea was greatly improved by Filippo Valsorda in 2013 with a pop-art rendition of it. Of course, you understand the effect when you see it: although the penguin is encrypted, you can definitely tell there's a penguin there. Cryptographers understood this and told people from the beginning never to use this mode of block ciphers, yet some cryptographic APIs, like Java's, helpfully made it the default. Microsoft's .NET API has a different default that's a little bit safer: cipher block chaining mode, or CBC mode. This mode was conceived and developed back in 1976. It prevents the penguin problem by taking the ciphertext output from the previous block and XORing it with the input to the next block. CBC mode is further improved by adding another input, one that gets passed along with the ciphertext or agreed upon before beginning any of the encryption. That input is called an initialization vector, commonly abbreviated IV, and depending on the protocol, the IV is combined somehow into the initial state of the cipher. So encrypting the same message over and over with the same key, but with a different IV, produces a different output as well. Because the IV is an input to the encryption function at the start, and it's not related to the key or the ciphertext, it's considered a public value and can be sent or stored alongside the ciphertext. CBC mode was used in protocols like TLS for many, many years, because AES was fast, built into processors, and really secure. But vulnerabilities arose in the way browsers implemented CBC mode.
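The ECB-versus-CBC behavior described above can be sketched with a toy block function. A hash stands in for AES here (it isn't invertible, so this can't decrypt; it only shows how the modes group identical input blocks):

```python
import hashlib

BLOCK = 16

def toy_block_cipher(key: bytes, block: bytes) -> bytes:
    # NOT a real cipher: a keyed hash standing in for AES, just to
    # visualize how each mode treats identical plaintext blocks.
    return hashlib.sha256(key + block).digest()[:BLOCK]

def ecb_encrypt(key, plaintext):
    blocks = [plaintext[i:i + BLOCK] for i in range(0, len(plaintext), BLOCK)]
    return [toy_block_cipher(key, b) for b in blocks]

def cbc_encrypt(key, iv, plaintext):
    blocks = [plaintext[i:i + BLOCK] for i in range(0, len(plaintext), BLOCK)]
    out, prev = [], iv
    for b in blocks:
        # XOR the previous ciphertext block into the next input block.
        mixed = bytes(x ^ y for x, y in zip(b, prev))
        prev = toy_block_cipher(key, mixed)
        out.append(prev)
    return out

key = b"sixteen byte key"
iv = b"\x00" * BLOCK
repeated = b"PENGUIN_PENGUIN_" * 2      # two identical 16-byte blocks

ecb = ecb_encrypt(key, repeated)
cbc = cbc_encrypt(key, iv, repeated)

assert ecb[0] == ecb[1]   # ECB leaks the repetition (the penguin)
assert cbc[0] != cbc[1]   # CBC chaining hides it
```

The two asserts are the penguin picture in miniature: identical plaintext blocks stay visible under ECB and disappear under chaining.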
These vulnerabilities allowed attackers to determine private information that was encrypted on the browser side, enabling things like cookie theft and session hijacking. One attack like that is called BEAST, from 2011, and another is the Lucky 13 attack from 2013; there are some great presentations out there on those that you should look up and watch if you're interested. The stream cipher RC4 was used as a way to mitigate both of these vulnerabilities, at least until a patch came out or, in the case of Lucky 13, a constant-time implementation could be found. But RC4, of course, was not devoid of problems: there was a whole series of serious SSL security publications surrounding RC4, starting in 2013 and stopping finally in 2015 with the RC4 NOMORE attack, which again allowed attackers to reveal private data about the encrypted text. By themselves, modern encryption and decryption constructs are excellent at protecting data: they take data in and put random data out, with no way to get back to the original without the key. Because of that, there's no clear way to tell if the data being decrypted has been modified since it was encrypted. A lot of the attacks I just described involved the ability of an attacker to re-inject encrypted blocks, or inject specific blocks into the data stream, for the receiving end to decrypt. And after the receiver decrypted the data,
it would then check a checksum to make sure it was correct. But the timing of that check, and the rejection or non-rejection of the decrypted data, sent a signal back to the attacker, allowing them to glean information about the nature of the plaintext. For this reason, encryption was modernized: it was designed to detect intentional, unauthorized modifications of the data as well as accidental modification. In other words, to protect against attacks that modify the ciphertext, encryption constructs using authenticated encryption started appearing in 2000, with research from IBM, universities, and RSA Laboratories. These constructs are specific ways to utilize an encryption cipher like AES to create not only ciphertext but also a tag that authenticates what was sent by the sender. Now, there are several different ways to perform this authentication, and it's easy to think of the tag like a secure hash of the data. One way, used by the original design of SSL and TLS, is to compute the authentication tag over the plaintext and then encrypt both the plaintext and the tag together. Another way is to do the opposite: you encrypt the text and then create an authentication tag from the encrypted value. The final way is to combine the two ideas: encrypt the plaintext, and separately send an authentication tag created from the plaintext. With any of these choices, if an authentication tag doesn't validate upon decryption, then the decryption returns an error rather than the plaintext. This is what's done in modern cryptography. Now, unlike basic encryption functions that just take in a key and a plaintext and output a ciphertext, or even the slightly more advanced version that adds an initialization vector (sometimes called a nonce, a number used once), authenticated encryption takes a plaintext, a key, an initialization vector, and something called associated data. This associated data is used to calculate the tag, but it does not get encrypted; you'll notice that the associated data doesn't go into the encrypt function itself.
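The encrypt-then-MAC composition described above can be sketched with the standard library's `hmac` module. The keystream "cipher" here is a toy stand-in (real systems should use a vetted AEAD like AES-GCM), but the authenticate-before-decrypt shape is the real one:

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Toy keystream: hash(key + nonce + counter), NOT a real cipher.
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def seal(enc_key, mac_key, nonce, aad, plaintext):
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(enc_key, nonce, len(plaintext))))
    # Encrypt-then-MAC: the tag covers the nonce, the associated data,
    # and the ciphertext, so tampering with any of them is detected.
    tag = hmac.new(mac_key, nonce + aad + ct, hashlib.sha256).digest()
    return ct, tag

def unseal(enc_key, mac_key, nonce, aad, ct, tag):
    expected = hmac.new(mac_key, nonce + aad + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")   # never release plaintext
    return bytes(c ^ k for c, k in zip(ct, keystream(enc_key, nonce, len(ct))))

enc_key, mac_key, nonce = os.urandom(32), os.urandom(32), os.urandom(12)
aad = b"from: alice, to: bob"            # authenticated but NOT encrypted
ct, tag = seal(enc_key, mac_key, nonce, aad, b"meet at noon")

assert unseal(enc_key, mac_key, nonce, aad, ct, tag) == b"meet at noon"

# Changing the associated data (the postcard's "from" address) fails:
tampered_rejected = False
try:
    unseal(enc_key, mac_key, nonce, b"from: eve, to: bob", ct, tag)
except ValueError:
    tampered_rejected = True
assert tampered_rejected
```

Note the two possible outcomes: the plaintext, or an error. There is no third option where unauthenticated bytes leak out.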
So what is associated data? Imagine sending a postcard with a message, using authenticated encryption. You encrypt the message on the back of the card, but you still have the to and from addresses in plaintext; otherwise the postcard wouldn't arrive at its destination, and you wouldn't be able to receive replies sent to the from address. The issue is that without authenticated associated data, anybody can copy the message and the authentication tag any number of times and do things like reroute it, or change the from address to make it look like it was sent by somebody else. You'll know the message you decrypted was correct, but you would think it came from another source. Associated data as an input into the algorithm is used by internet protocols like TLS 1.3, and QUIC, the protocol Google started to make a faster version of TLS; it's used to provide authenticated proof over the plaintext portions of the protocols. With unauthenticated encryption, the decrypt function always outputs a block or stream of plaintext, valid or not: it's up to the application to validate and reject bad data. With authenticated encryption, there are only two options for output: either the actual plaintext, or an error saying the input was bad. This error output will occur if somebody changes any of the inputs (the ciphertext, the tag, the initialization vector, or the associated data), and this ensures that the integrity of the plaintext is
maintained. Currently, one of the most widely used authenticated algorithms is AES-GCM. GCM stands for Galois/Counter Mode, and it refers to the way large amounts of data are encrypted (the counter mode) and the way the result is authenticated (using a Galois field in the algorithm that creates the authentication tag). A Galois field is just a mathematically defined structure: a very, very large set of numbers with defined ways of performing mathematical operations that ensure all values and results fall within the specific set, so anything that would normally fall outside the set wraps around to the start or the end, depending on the calculation. You can see the function there on the screen. The inputs and outputs for GCM are the same typical ins and outs you'd expect from any authenticated encryption algorithm. The counter mode part, the CM of GCM, takes the AES block cipher and turns it into a stream cipher by repeatedly feeding a counter into that AES block. This means that AES in GCM mode creates a strung-together series of blocks of bits, turned into a bunch of one-time pads, each one AES block size long, that then get XORed with the plaintext. The AES function output, XORed with the plaintext, is stored off as the stream of ciphertext.
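The counter-mode idea just described can be sketched like this. The block function is a hash standing in for AES (a toy, not the real thing), but the structure is the same: mint one-time-pad blocks from an incrementing counter, then XOR:

```python
import hashlib

# Counter mode: feed nonce + counter through a block function to
# produce a keystream, then XOR it with the data. The same operation
# both encrypts and decrypts.

def ctr_keystream(key: bytes, iv: bytes, n_bytes: int) -> bytes:
    stream, counter = b"", 0
    while len(stream) < n_bytes:
        block_input = iv + counter.to_bytes(4, "big")
        # One 16-byte "AES block" of keystream per counter value.
        stream += hashlib.sha256(key + block_input).digest()[:16]
        counter += 1
    return stream[:n_bytes]

def ctr_crypt(key, iv, data):
    return bytes(d ^ k for d, k in zip(data, ctr_keystream(key, iv, len(data))))

key, iv = b"k" * 16, b"iv-nonce-123"
ct = ctr_crypt(key, iv, b"stream of plaintext bytes")
assert ctr_crypt(key, iv, ct) == b"stream of plaintext bytes"  # round trip
```

This is how a block cipher becomes a stream cipher: the block primitive never touches the plaintext directly, only the counter.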
But at the same time, that ciphertext is fed into a multiplication function from the field, and each ciphertext block is multiplied with the previous result to create a small, constant-size value. That final block is ultimately XORed with a value created by encrypting the initialization vector with the key. This provides security for the final tag value, because while anybody can multiply the ciphertext blocks together, only the key holder can do that final operation to seal the tag. Remember the provably secure XOR encryption: we can't reuse or loop or replay any part of the one-time pad, because doing that can lead to a situation where
you end up revealing the plaintext. In the same way, GCM works by creating a pseudo-random pad: the same IV and key used together will always create the same padding stream. That's actually good news, because if they didn't, you wouldn't be able to decrypt any of the encrypted data given the same key. However, because of this, the randomness of the initialization vector is just as important as the randomness of the key, and it's imperative that it's only used once. Otherwise it's like reusing a one-time pad, and it destroys the security properties of the encryption scheme; there have been several recent vulnerabilities, which I'll touch on later, that have taken advantage of this.
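The nonce-reuse hazard is easy to show in miniature. If the same key and IV produce the same keystream for two messages, XORing the two ciphertexts cancels the keystream entirely (toy keystream again, not a real cipher):

```python
import hashlib

def pad(key: bytes, iv: bytes, n: int) -> bytes:
    # Toy keyed keystream; the point is only that it's deterministic
    # in (key, iv), like any real stream construction.
    out, i = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + iv + bytes([i])).digest()
        i += 1
    return out[:n]

def encrypt(key, iv, msg):
    return bytes(m ^ k for m, k in zip(msg, pad(key, iv, len(msg))))

key, iv = b"secret key", b"reused iv"
c1 = encrypt(key, iv, b"attack at dawn!")   # same key, same IV...
c2 = encrypt(key, iv, b"retreat at ten!")   # ...used twice: two-time pad

# The keystream drops out: an attacker with only the two ciphertexts
# learns p1 XOR p2, with no key required.
leaked = bytes(a ^ b for a, b in zip(c1, c2))
expected = bytes(a ^ b for a, b in zip(b"attack at dawn!", b"retreat at ten!"))
assert leaked == expected
```

From `p1 XOR p2`, classic crib-dragging techniques recover both plaintexts; that's why a repeated IV under the same key is fatal for GCM-style schemes.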
One of the other modern cipher systems in heavy use today is ChaCha20 with Poly1305, a descendant of the previously mentioned Salsa20 cipher. ChaCha20 is a stream cipher, used to create a long pseudo-random string of bits by using a counter, similar to the way AES-GCM works. The authentication portion of the cipher, Poly1305, refers to the polynomial equation used in the creation of the authentication tag. The 1305 comes from the special prime number 2^130 - 5, which serves as the upper bound of the values when performing the math on the tag polynomial; anything past it just wraps around to the beginning. So let's step through how ChaCha20 works in combination with the Poly1305 accumulator that creates the tag. I wanted to create this diagram because I couldn't find anything online that went through the entire process in a visual format at this level. The same way AES-GCM starts, you take the key, the IV, and the counter, and feed them into the encryption function. Then, as you continue, you crank the counter, adding one to it, using the same key and initialization vector, to create an output stream that's the same size as whatever plaintext you're trying to encrypt. That gets XORed with the plaintext and becomes your ciphertext. After that, you take the ciphertext and the associated data, put those two together with some padding and the lengths of the two, and make another buffer to push into the Poly1305 algorithm. This is different from GCM because it actually uses a key as input to Poly1305. So you run that entire buffer through the tag generation method, and that outputs the final tag value for ChaCha20-Poly1305. This 1305 prime used for the tag generation was chosen because it gives the ability to add optimizations in the way the message is broken up to perform the various cryptographic operations.
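The polynomial-accumulator idea behind Poly1305 can be sketched in a few lines. This toy omits real Poly1305 details (the clamping of r, the final addition of a second key s, the exact chunk encoding); it only shows the multiply-and-add-modulo-the-prime loop:

```python
# Poly1305-style accumulation: fold each message chunk into a running
# value via add-then-multiply, wrapping into the field mod 2**130 - 5.

P = 2**130 - 5   # the "1305" prime

def poly_tag(r, chunks):
    acc = 0
    for c in chunks:
        acc = ((acc + c) * r) % P   # accumulate, then wrap into the field
    return acc

r = 0x12345678                       # toy key; real Poly1305 clamps this
tag = poly_tag(r, [101, 202, 303])   # chunks stand in for encoded message blocks

# Flipping any chunk changes the tag.
assert poly_tag(r, [101, 202, 304]) != tag
```

Because each step is one multiply and one add modulo a prime chosen to fit machine words nicely, this evaluates extremely fast in software, which is the design goal the talk describes.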
In fact, almost all of the internal design of Poly1305 lends itself to a very fast implementation, and that's an important consideration: if adding authentication to encryption created a really large overhead in calculations or in size, nobody would want to adopt it, and its use would be relegated to obscurity. That brings us to one of the important considerations for choosing an authenticated encryption scheme: speed. AES has been around since the year 2000, and that means over the last 20 years, cryptographers, programmers, and processor chip designers have all been diligently working to ensure that the overhead of performing encryption with AES isn't a significant detractor to its use and adoption. Chip designers started including special registers and specific instruction sets in the microcode of their processors that allow for very, very fast performance of the different operations inside AES, and this makes AES faster than ChaCha20 by a fairly good margin, at least on computers that use those particular chips. Without them, however, ChaCha20 is actually superior: it was designed to be especially fast in software implementations, without requiring any special hardware tricks. So phones, tablets, and even some newer laptops are able to encrypt using ChaCha20 much faster than they could with AES. Even still, that didn't stop processor implementers from trying, and in 2018 they introduced some new instruction sets on CPUs that can actually boost ChaCha20's speed over AES-on-a-chip in certain circumstances. ChaCha20 was also designed to defeat the problematic side channels that plague AES implementations. Because of the way AES was designed, implementers had added in optimizations that could lead to timing issues, and when those timing issues happen, an attacker looking closely at how long an operation takes can actually derive the key, completely destroying the security of the protocol. But because of ChaCha20's design, all the operations happen in constant time, and there isn't an opportunity for any timing attacks on the encryption. So, since I managed to sneak this in-the-weeds talk into an in-the-cloud slot,
I figured I should probably talk a little bit about how this fits into the cloud, so we can go through the three big players. Besides transport-layer encryption, there are two main use cases for encryption in the cloud: first there's data at rest, and second what I'll call encrypting application data. I'll differentiate between the two by saying data at rest is anything done automatically by the cloud provider, for one specific type of data or even all of the data stored by the provider. Application data, on the other hand, is where the cloud provider provides a specific interface for applications running on its services to use keys to encrypt data within their actual application code. For data at rest, AWS allows administrators to set up encryption for their stored data using AES-GCM, which gives them a nice authenticated option for protecting all their data. GCP does the same, implementing AES-GCM to protect data at rest in their environment. Some added bonus material here: whenever Google sends data across their networks, instead of using TLS like other places, they use a special optimized protocol specific to their environment called ALTS. It allows them to transfer data using either AES-GCM or a different algorithm they made called AES-VCM, which uses a special authentication method based on integer arithmetic and is designed to be especially fast on Google's 64-bit processors. Microsoft Azure, on the other hand, does use AES for data-at-rest storage, but disappointingly only in cipher block chaining mode; they cite the reason for this as making retrieval of the storage faster. On the application side, every cloud provider here has an API that lets application teams encrypt specific pieces of data from their code running in the cloud. With AWS, application data, similar to their data at rest, can be encrypted using AES-GCM. The interface allows for providing additional associated data (that extra non-encrypted data we talked about) as well as an initialization vector, but it's up to the developer to make sure the IV is never reused and is always completely random. In Google's cloud, applications are given the ability to do something called envelope encryption. This encryption API allows developers to encrypt data by simply passing a byte array and a key name. Under the covers, it uses GCM and a key-management scheme designed by Google that uses a key-encrypting key to protect the key that does the actual data encryption, the data-encryption key. The data-encryption key and the initialization vector are randomly generated on the fly any time new data needs to be written.
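The envelope-encryption pattern just described can be sketched like this. The XOR "cipher" is a toy stand-in for AES-GCM, and the KEK here is just a local variable standing in for a key held by the provider's KMS:

```python
import hashlib
import os

# Envelope encryption: a fresh random data-encryption key (DEK)
# protects the data; a long-lived key-encryption key (KEK) protects
# the DEK. Only the wrapped DEK is stored next to the ciphertext.

def toy_encrypt(key, iv, data):
    # Toy XOR stream cipher (self-inverting), NOT real AES-GCM.
    stream, i = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + iv + bytes([i])).digest()
        i += 1
    return bytes(d ^ s for d, s in zip(data, stream))

def envelope_seal(kek, plaintext):
    dek, iv, kek_iv = os.urandom(32), os.urandom(12), os.urandom(12)
    ciphertext = toy_encrypt(dek, iv, plaintext)
    wrapped_dek = toy_encrypt(kek, kek_iv, dek)   # only the KEK holder can unwrap
    return wrapped_dek, kek_iv, iv, ciphertext

def envelope_open(kek, wrapped_dek, kek_iv, iv, ciphertext):
    dek = toy_encrypt(kek, kek_iv, wrapped_dek)   # XOR cipher is its own inverse
    return toy_encrypt(dek, iv, ciphertext)

kek = os.urandom(32)                    # stands in for the KMS-held key
blob = envelope_seal(kek, b"customer record")
assert envelope_open(kek, *blob) == b"customer record"
```

The payoff of the pattern: rotating or revoking the KEK instantly changes who can unwrap every DEK, without re-encrypting the bulk data.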
Azure, on the other hand, doesn't provide an API for symmetric encryption of data at all. It does provide an API that allows applications to create their own securely stored keys and then use them inside the application to encrypt data, using any encryption method supported by the programming language in use; of course, the examples in their documentation use cipher block chaining mode. But how does this protect you? See, the problem with encrypting data in the cloud is that most people don't understand the threat model it protects against. Cloud encryption doesn't protect against the cloud provider seeing your plaintext data: they have to be able to decrypt the data so that they can store it, sort it, index it, and provide it back to you. It also doesn't protect against operators setting permissions
incorrectly on the data or the key, where anybody could get access to it through a regular web interface or an application. This happened at Capital One about a year ago: they had 140,000 Social Security numbers and 80,000 bank account numbers accessed by an attacker, even though they were all encrypted in the cloud. The reason was that the access rights set on the data interface allowed anybody hitting the cloud service to tell the provider to go get the keys, decrypt the data, and send it to them. So what does cloud encryption even protect against? It prevents somebody with physical access to the hard drives from getting access to the raw data. You could achieve this access by breaking in and yanking drives out, by finding unwiped drives in the trash (however unlikely that would be), or by being an insider working in the data center with access to the physical drives. Another way to use the cloud keying mechanisms to protect your data depends on the particular way the cryptosystems in the cloud are set up. Cloud administrators can do something called bring your own key, or BYOK. This allows the admin to provide a key to the cloud from their local site. You'd want to do this if you were worried about the key persisting in the cloud after you didn't want it to, and trusted the cloud provider to delete copies of the key when asked. Cloud consultants pitch this as a way to quickly remove the provider's access to a given key or set of keys, which renders the data unreadable; you'd want to do that after terminating a cloud agreement, to make sure they couldn't get to the data anymore, or to quickly destroy the ability to decrypt the data for whatever reason. A similar concept, one step removed from that, is called key caching: every few minutes or hours, the cloud provider makes a call back to the local company's key server to request the key in order to perform cryptographic operations on the data. In an example not too far outside the realm of plausibility: given their access to the keys, a cloud provider could turn over data, without your knowledge or intervention, when faced with a court subpoena. However, if you knew a request for the data was imminent and wanted to retain the ability to fight the subpoena to prevent the court from gaining access, you could revoke the cloud provider's access to the keys and, with it, prevent their ability to provide meaningful data. Another example where companies could use this cloud encryption is to help protect data using a more complex data-access configuration. You allow specific users access to the stored encrypted data, where they can see things like the location of the data and possibly plaintext information about what the encrypted data is associated with, but then give another group access to the data as well as the ability to decrypt it. An example of this would be a select group of people, like doctors, that need to see images containing sensitive information, like dental scans, while other users, like office staff, need the ability to see information about people that have had dental scans but don't need to see the actual images. As long as the access rights to the keys are managed correctly, the data remains secure and protected. Now, everything we've talked about so far has been in place for the last five years or so, but of course
there are still some systems not using all of those best practices, and others are finding ways to work on the next best thing for encryption: bringing it into modern libraries and incorporating it into systems utilizing cryptography. One thing you might have noticed is that AES-GCM is extremely popular: it's used all over browsers, clouds, and applications, protecting important data for all types of uses, for all types of people. One of the problems with AES-GCM is that not everybody gets it right. Very similar to the way people see AES and just jump blindly into using it in electronic codebook mode, GCM allows implementers to easily, if accidentally, shoot themselves in the foot: if the same initialization vector is used with the same key but over different data, you end up in a situation where that data can be decrypted even without knowing the key. There are newer encryption constructs that haven't quite made it into mainstream product lines yet, but they've been designed to help protect against this footgun by changing the algorithm so that using the same initialization vector with the same key multiple times over different data doesn't affect the security properties of the encryption scheme as a whole. One of these methods is called AES-SIV: rather than taking an initialization vector as input to the algorithm, it takes the plaintext data and hashes it, and that resultant hash is fed into a block cipher like AES as the initialization vector.
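The synthetic-IV idea can be sketched like this. Real AES-SIV uses AES-CMAC and AES-CTR; this toy swaps in HMAC and a hash-based stream just to show the shape, with made-up key names:

```python
import hashlib
import hmac

# SIV ("synthetic IV") sketch: derive the IV deterministically by
# MACing the plaintext, so the same key over different messages still
# yields different effective IVs; no random nonce to mismanage.

def siv_encrypt(mac_key, enc_key, plaintext):
    siv = hmac.new(mac_key, plaintext, hashlib.sha256).digest()[:16]
    stream, counter = b"", 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(enc_key + siv + counter.to_bytes(4, "big")).digest()
        counter += 1
    ciphertext = bytes(p ^ s for p, s in zip(plaintext, stream))
    return siv, ciphertext   # the SIV doubles as the authentication tag

mac_key, enc_key = b"m" * 32, b"e" * 32   # illustrative fixed keys
siv1, _ = siv_encrypt(mac_key, enc_key, b"message one")
siv2, _ = siv_encrypt(mac_key, enc_key, b"message two")

assert siv1 != siv2   # different data -> different synthetic IV
```

The trade-off is that SIV is deterministic: encrypting the exact same message twice produces the exact same output, which reveals equality but nothing more.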
Then, like GCM, a counter mode is used here for each new AES block that's created, to deliver the pseudo-random string that eventually gets XORed with the plaintext. The CAESAR competition began in 2014 and wrapped up last year; it was designed to find a replacement for AES-GCM that was both robust and suitable for mass adoption. There were several categories of requirements, as you can see here, and different algorithms were recommended based on the needs of the encrypter. There are lots of different algorithm names here that probably haven't made it into anybody's lexicon quite yet, but for each of these categories, it was
either an equal choice, use whichever of the two you liked or whichever happened to be implemented best within your software, or it was a preferential ordering: use this one first, and if problems are found in it later, we can switch to the other one. One thing that's an ongoing process is an effort designed to help processors in smaller devices or constrained environments, looking at the algorithms in use and changing their insides to make sure they take as little time and as few resources as possible. This search for a new type of algorithm is headed up by NIST, the National Institute of Standards and Technology. It's currently ongoing and expected to complete over the next few years; it's been delayed a little bit due to the current pandemic, and the round three release of submissions is currently scheduled to come out in December. Homomorphic encryption is another encryption type, and it gives the ability to perform tasks or calculations over encrypted data. Technically it's not new: it's been around since the 70s, it just hasn't seen much use until recently, and with good reason, because there's one little problem with homomorphic encryption: it's very, very processing intensive and can end up taking a long time to complete any operation. There are several research institutions and
in companies that have products and tools that support homomorphic encryption ibm recently released i think last week one that works on both mac os and ios and they're going to be pulling out support for linux and android coming soon last but not least is the reaction to the threat of quantum computing and so quantum computing by building these different types of computers it can affect both symmetric algorithms and the asymmetric algorithms that are used in things like certificates and within tls and those asymmetric algorithms are hit a lot harder by quantum computing so most of the focus has been on finding something to replace any encryption needs for new signing algorithms so new types of certificates new types
of code signing things like that but there's another algorithm within that was developed i think back in the 80s that also affects symmetric cryptography like we've been talking about today luckily the this bit's easy all you have to do to prevent the problem is double the encryption security level in other words take the minimum key size you have set for any of the symmetric algorithms and just move it up from 128 to 256 and now your decryption of the data is safe and protected from the the threats of quantum computers so that brings me to the end of my talk i really hope that you've learned some new things about symmetric encryption and that your brain has been stretched a
little bit or a lot and that you have some questions that i can help answer we encourage the uh participants to uh provide their questions on our discord channel under the uh track two uh in the clouds yeah so one of the questions is why does doubling the key size work or protecting the encryption types from quantum computers so the reason for that is the particular algorithm that was developed by the researcher his name was grover and the algorithm that's used to break symmetric encryption with a quantum computer ends up scaling at a rate that makes it to where doubling the key size prevents the attacks from taking place so it works differently with the
asymmetric algorithms that scales linear linearly and so any doubling of key sizes with asymmetric algorithms you you end up having to just increase the number of qubits in your quantum computer by one
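The square-root speedup behind that answer can be sketched with back-of-the-envelope arithmetic. This is just a cost comparison, not an implementation of Grover's algorithm, and the function names are mine:

```python
# Why doubling the symmetric key size counters Grover's algorithm:
# Grover searches an n-bit keyspace in roughly 2**(n/2) steps
# instead of the classical brute-force 2**n.

def classical_work(bits):
    """Classical brute-force cost: 2**bits trial decryptions."""
    return 2 ** bits

def grover_work(bits):
    """Grover's quantum search cost: about 2**(bits / 2) iterations."""
    return 2 ** (bits // 2)

for bits in (128, 256):
    print(f"{bits}-bit key: classical ~2^{bits}, Grover ~2^{bits // 2}")

# A 128-bit key drops to ~2^64 effective security under Grover,
# which is uncomfortably close to feasible. A 256-bit key still
# leaves ~2^128 of work, matching the classical cost of a 128-bit
# key, so doubling the key size restores the security margin.
assert grover_work(256) == classical_work(128)
```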
So, next question, again on quantum computers: is the issue around prime factorization, or is there something else to it? With symmetric cryptography there are no prime numbers involved. With things like RSA, which do use prime numbers, the method to generate the private key is: come up with the biggest random number you can think of, make sure it's odd, and then keep increasing it by two until you find a prime. Shor's algorithm, which lets quantum computers quickly factor the products of those primes, is why asymmetric encryption is affected. But since there are no prime numbers used in symmetric encryption, that attack has no effect here. To break symmetric encryption with a quantum computer you're left with Grover's search, and even then a 256-bit key leaves on the order of 2^128 operations, which keeps it out of reach.

Another question is how to get the slides. I will post these on my website, cem.me, and I'll put a blog post together that goes through basically everything I talked about today, plus a little bit more.
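The naive prime search described a moment ago (pick a big odd random number, then step upward by two until you hit a prime) can be sketched in standard-library Python. The function names and the Miller-Rabin round count are illustrative choices, not from the talk:

```python
import secrets

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test (stdlib only)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # write n - 1 as d * 2**r with d odd
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = secrets.randbelow(n - 3) + 2  # random base in [2, n-2]
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # definitely composite
    return True

def find_prime(bits=512):
    """The search loop from the talk: start from a large random odd
    number and step by two until a (probable) prime turns up."""
    candidate = secrets.randbits(bits) | (1 << (bits - 1)) | 1  # big and odd
    while not is_probable_prime(candidate):
        candidate += 2
    return candidate

print(find_prime(256))  # a ~256-bit probable prime
```

One caveat worth noting: production RSA libraries typically draw a fresh random candidate on each failure rather than incrementing, since stepping upward slightly biases the result toward primes that follow large prime gaps.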
Cool, what other questions do you all have? Feel free to keep popping them in the Discord. All right, make sure you also go check out the crypto challenge; it's on ctfscoreboard.bsidesatx.com.

What would be some advice you'd give somebody to start those crypto challenges, like some things they need to think about going into it? One thing you can do is take the clue, put the word "cipher" after it, and throw that into Google and see what you come up with. That's a good way to find out what algorithm is used in the encryption puzzles, and the clues are there on the CTF site by intent.

All right, somebody also mentioned some of the content on my website: the big versions of some of the posters aren't available because I haven't updated my CDN, so yeah, I think I need to work on that.

There's another question about my thoughts on browser encryption. There are actually some new developments there: browsers now give JavaScript access to fast implementations of these algorithms, so you have AES-GCM or ChaCha20, as well as asymmetric algorithms like ECDSA, available in the browser to protect data. This can be useful if you're making an application to do encrypted communication between two parties, or if you're trying to store something locally in a safe manner. Otherwise, the browser encryption everybody knows and loves is TLS, which is a great way to protect your data as long as you configure it correctly.

Carl, one of the questions we received on Discord asks whether it's an appropriate takeaway that all the hype around quantum decryption doom is overblown. What are your thoughts on that? It depends on the method you're using to protect the data. One of the reasons people are focusing on post-quantum algorithms now, before quantum computers are available, is the way specific protocols work. For example, in TLS you come to a key agreement for the data to be encrypted with. The problem is that key agreement happens with asymmetric algorithms, which are highly brittle when exposed to quantum computers. Think about your Facebook logon on a computer or mobile device: when's the last time you entered your password for Facebook or Google? The authentication cookies you're sending are extremely long-lived, because the developers of those websites have decided they don't want to make people log in continuously. Because of that, we have really long-lived sensitive data being run through protocols like TLS. So if you capture and store data today that was encrypted with TLS, and a quantum computer is developed later, you can go back to that stored data and break the key agreement. You're not breaking AES, you're breaking the way the AES key was derived, and that lets you decrypt and get to the actual sensitive data, even though it was encrypted with a strong algorithm that isn't itself vulnerable to quantum attack. So it's not necessarily overblown, but you have to think about what the algorithm is, what type of data you're trying to protect, and how you should best protect that data.

I'm trying to pull up the right channel to send for the crypto puzzle, for the other question in the chat. There you go: look at the pinned messages on that channel and you'll get all the links you need for the crypto puzzle.
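To tie this back to the IV-reuse foot gun mentioned at the start, here's a toy demonstration of why reusing a nonce with the same key leaks data. The "cipher" below is SHA-256 run in counter mode purely for illustration (it is not AES and not a vetted construction), but it has the same keystream-XOR structure as CTR and GCM:

```python
import hashlib

def keystream(key, nonce, length):
    """Toy stream cipher: SHA-256 in counter mode. Illustrative only,
    NOT a real cipher -- it just mimics the CTR/GCM structure."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key, nonce, plaintext):
    ks = keystream(key, nonce, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))

key = b"super secret key"
nonce = b"fixed nonce!"                      # reused -- the foot gun
c1 = encrypt(key, nonce, b"attack at dawn!")
c2 = encrypt(key, nonce, b"retreat at ten!")

# With the same key and nonce, XORing the two ciphertexts cancels the
# keystream entirely, leaking the XOR of the plaintexts -- no key needed.
leak = bytes(a ^ b for a, b in zip(c1, c2))
assert leak == bytes(a ^ b for a, b in zip(b"attack at dawn!", b"retreat at ten!"))
```

This is exactly the property AES-SIV removes: because the synthetic IV is derived from the plaintext itself, two different messages never share a keystream even under the same key and nonce.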
All right, another question is what hobbies I have outside of doing crypto stuff. I enjoy music; I have a banjo here, it's fun to play, and I play a couple of other instruments too, so I really enjoy that. Reading as well. Nobody ever expects this banjo. It's like the Spanish Inquisition.