Blockchain Encryption in Detail

in #bitcoin · 7 years ago

Before we dive into Bitcoin development itself, we need to understand the basic principles and main categories of cryptography. Without this foundation, the later material will be very difficult to follow, so I hope all readers will work patiently through this part on cryptography. None of it is magic.

The importance of cryptography to information technology hardly needs stating: without the results of modern cryptography, human society could not have entered the information age at all. Cryptography is a vast field; this article covers only the basics relevant to blockchains, including hash algorithms and digests, encryption algorithms, digital signatures and certificates, PKI architecture, Merkle trees, and homomorphic encryption, and how these technologies are used to achieve confidentiality, integrity, authenticity, and non-repudiation of information.

Hash Algorithm
The hash (or digest) algorithm is a basic and very important technique in information technology. It maps a binary value of arbitrary length (the plaintext) to a short, fixed-length binary value (the hash value), and it should be hard for two different plaintexts to map to the same hash value.

For example, the MD5 hash of the string "hello blockchain world" is 89242549883a2ef85dc81b90fb606046:

$ echo "hello blockchain world" | md5
89242549883a2ef85dc81b90fb606046

This means that if we run an MD5 calculation on a file and get 89242549883a2ef85dc81b90fb606046 as the result, the file's content is extremely likely to be "hello blockchain world".

As you can see, the core idea of hashing is very similar to content-based addressing or naming.
Note: In applications, the hash value is also called a fingerprint or a digest.
Note: MD5 is a classic hash algorithm, but both MD5 and SHA-1 have been shown to be insufficiently secure for commercial scenarios.

An excellent hash algorithm provides:

  • Fast forward computation: given the plaintext and the hash algorithm, the hash value can be computed in limited time with limited resources.
  • Backward difficulty: given (only) the hash value, it is practically impossible to recover the plaintext in a reasonable time.
  • Input sensitivity: changing even a single bit of the input should produce a hash value that looks completely different.
  • Collision avoidance: it is hard to find two plaintexts with different contents whose hash values are identical (a collision).

Collision avoidance is sometimes called "collision resistance". If, given one plaintext, it is hard to find another plaintext that collides with it, the algorithm has "weak collision resistance"; if it is hard to find any two colliding plaintexts at all, it has "strong collision resistance".
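Two of these properties are easy to observe directly with Python's standard hashlib module; a quick sketch of forward speed and input sensitivity:

```python
import hashlib

# Forward computation is fast: a single call yields the digest.
d1 = hashlib.sha256(b"hello blockchain world").hexdigest()

# Input sensitivity: changing one character of the input yields a
# digest that bears no visible relationship to the first one.
d2 = hashlib.sha256(b"hello blockchain world!").hexdigest()

print(d1)
print(d2)
```

Backward difficulty and collision resistance cannot be demonstrated this directly; they are precisely the claims that recovering an input from a digest, or finding two colliding inputs, is computationally infeasible.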

In many scenarios, it is also required that the algorithm output a fixed-length hash for input content of any length.

Popular Algorithms
Currently popular hash algorithms include MD5, SHA-1, and SHA-2.

MD4 (RFC 1320) was designed by Ronald L. Rivest of MIT in 1990; MD is short for Message Digest. Its output is 128 bits. MD4 has been proven insecure.

MD5 (RFC 1321) is Rivest's 1991 improvement on MD4. It still processes the input in 512-bit blocks, and its output is 128 bits. MD5 is more complex than MD4, slower to compute, and safer, but it has been proven not to have strong collision resistance.

SHA (Secure Hash Algorithm) is a family of hash functions. The first algorithm in the family was published by NIST (the US National Institute of Standards and Technology) in 1993. The well-known SHA-1 followed in 1995; its 160-bit output makes it more resistant to brute force. SHA-1 is designed on the same principles as MD4 and mimics that algorithm, and it too has been proven not to have strong collision resistance. To improve security, NIST later designed the SHA-224, SHA-256, SHA-384, and SHA-512 algorithms (collectively SHA-2), which are similar in structure to SHA-1. SHA-3 algorithms have also since been proposed.

Today MD5 and SHA-1 are generally considered insufficiently secure, and using at least SHA2-256 is recommended.

In general, hash algorithms are compute-sensitive: computing power is the bottleneck, and a faster CPU hashes faster. Some hash algorithms are not compute-sensitive, however. scrypt, for example, also requires a large amount of memory, so a node cannot raise its hash performance simply by adding more CPUs.

Digital Digest
As the name suggests, a digital digest is the hash of some digital content, yielding a (practically) unique digest value that stands for the original content. Digital digests solve the problem of ensuring that content has not been tampered with (relying on the collision resistance of the hash function) and are the most important application of hash algorithms. When software or files are downloaded from the Internet, a digest value is often published alongside them; the user can hash the downloaded file and compare the result with the published digest to confirm the content has not been modified.
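The download-verification workflow just described can be sketched as follows; the file contents and the "published" digest here are made-up placeholders for illustration:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Compute the hex-encoded SHA-256 digest of some content."""
    return hashlib.sha256(data).hexdigest()

# Digest published on the download page (an illustrative value,
# computed here from the genuine content for the sake of the example).
genuine = b"contents of release-1.0.tar.gz"
published_digest = sha256_hex(genuine)

# The user hashes what was actually downloaded and compares.
downloaded = b"contents of release-1.0.tar.gz"
tampered = b"contents of release-1.0.tar.gz (modified in transit)"

print(sha256_hex(downloaded) == published_digest)  # match: accept the file
print(sha256_hex(tampered) == published_digest)    # mismatch: reject it
```

Note that the digest itself must come over a trusted channel; a tamperer who can also replace the published digest defeats this check.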

Encryption and Decryption Algorithms
The typical components of a modern encryption system are the encryption/decryption algorithm, an encryption key, and a decryption key. The algorithm itself is fixed and generally public; the keys usually differ from session to session and must be protected. In general, for the same algorithm, a longer key means greater encryption strength.

During encryption, the plaintext is turned into ciphertext by the encryption algorithm and the encryption key. During decryption, the ciphertext is turned back into plaintext by the decryption algorithm and the decryption key.

Depending on whether the encryption and decryption keys are the same, algorithms are classified as symmetric cryptography (common-key cryptography) or asymmetric cryptography (public-key cryptography). The two modes suit different needs and complement each other, and they are often combined into a hybrid encryption mechanism.

Not every encryption algorithm's strength can be proven mathematically. The well-recognized high-strength algorithms have earned trust through long practice and public analysis, which does not mean they have no flaws. In any case, inventing your own encryption algorithm is an unwise move.

Symmetric Encryption
As the name implies, the encryption and decryption keys are the same. The advantages are efficient encryption and decryption (fast, with little space overhead) and high encryption strength. The disadvantages: every participant must keep the key secret, so a single leak compromises everyone's security; and distributing the key over an insecure channel is itself a problem.

Symmetric ciphers come in two forms. Block ciphers split the plaintext into fixed-length blocks as the unit of encryption and are the most widely used. Stream ciphers encrypt one byte at a time with a constantly changing keystream and are used only in certain fields, such as encryption of digital media. Representative algorithms include DES, 3DES, AES, and IDEA.

DES (Data Encryption Standard): a classic block cipher, adopted as a US Federal Information Processing Standard (FIPS-46-3) in 1977. It encrypts a 64-bit plaintext block into a 64-bit ciphertext block using a 56-bit key plus 8 parity bits.

Today DES is easily brute-forced.

3DES: Triple DES applies DES three times (encrypt → decrypt → encrypt). Its processing cost and encryption strength exceed DES, but it too is now considered insufficiently secure.
AES (Advanced Encryption Standard): adopted by the US National Institute of Standards and Technology (NIST) to replace DES as the standard for symmetric encryption. From 1997 to 2000, NIST selected the Rijndael algorithm (invented by the Belgian cryptographers Joan Daemen and Vincent Rijmen) from 15 candidates as AES; the standard is FIPS-197. AES is also a block cipher, with a 128-bit block and key lengths of 128, 192, or 256 bits.

The advantages of AES are fast processing and a design that can be described mathematically throughout; no effective attack against it is currently known. Symmetric encryption in general is well suited to encrypting and decrypting large amounts of data, but it cannot be used for signature scenarios and requires the keys to be distributed in advance.

Note: a block cipher can only process a fixed-length block of plaintext at a time, so longer content must be encrypted under a mode of operation. "Practical Cryptography" recommends the Cipher Block Chaining (CBC) and Counter (CTR) modes.
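To show the structure of counter (CTR) mode without pulling in a crypto library, here is a toy sketch in pure Python that uses SHA-256 as a stand-in for the block cipher. This illustrates only the shape of the mode; real systems use AES-CTR from a vetted library, never a construction like this.

```python
import hashlib

def toy_ctr(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy CTR mode: derive each keystream block from key + nonce +
    counter, then XOR it with the data. SHA-256 stands in for the block
    cipher here, so this is an illustration of the mode, not real AES-CTR.
    Because XOR is its own inverse, the same function encrypts and decrypts."""
    out = bytearray()
    for i in range(0, len(data), 32):
        counter = (i // 32).to_bytes(8, "big")
        keystream = hashlib.sha256(key + nonce + counter).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], keystream))
    return bytes(out)

key = b"a shared secret key (toy value)"
nonce = b"use-once-nonce"
message = b"block ciphers handle fixed-size blocks; modes chain them together"

ciphertext = toy_ctr(key, nonce, message)
plaintext = toy_ctr(key, nonce, ciphertext)  # decrypt by re-applying
print(plaintext == message)
```

The key property on display is that each block's keystream depends on a counter rather than on the previous ciphertext block, so blocks can be processed independently and in parallel.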

Asymmetric Encryption
Asymmetric encryption is the greatest invention in the history of modern cryptography: it solves the advance key-distribution problem that symmetric encryption suffers from. As the name implies, the encryption key and the decryption key are different; they are called the public key and the private key, respectively. The public key is generally open for anyone to obtain, while the private key is held by its owner and must not be obtainable by others. The advantage is that, with public and private keys separated, even insecure channels can be used. The disadvantages are speed, generally two to three orders of magnitude slower than symmetric algorithms, and somewhat lower encryption strength.

The security of asymmetric algorithms usually rests on hard mathematical problems; the main approaches today are based on integer factorization, discrete logarithms, and elliptic curves. Representative algorithms include RSA, ElGamal, and elliptic curve cryptosystems (ECC).

RSA: the classic public-key algorithm, proposed jointly by Ron Rivest, Adi Shamir, and Leonard Adleman in 1978; the trio received the Turing Award in 2002. The algorithm exploits the difficulty of factoring large numbers into primes, but there is no mathematical proof that breaking RSA is as hard as factoring; an as-yet-unknown algorithm might decrypt without factoring at all.

Diffie-Hellman key exchange: based on the fact that discrete logarithms cannot be computed quickly, it lets two parties negotiate a shared key over an insecure channel.

ElGamal: designed by Taher ElGamal, exploiting the difficulty of computing discrete logarithms under modular arithmetic; it is used in security tools such as PGP.
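As a sketch of the RSA mechanics just described, here is the textbook construction with deliberately tiny primes. Every number below is illustrative; real RSA requires moduli of 2048 bits or more plus proper padding, so this is a picture of the math, never usable encryption. (The `pow(e, -1, phi)` modular inverse needs Python 3.8+.)

```python
# Toy RSA with tiny textbook primes -- insecure, illustration only.
p, q = 61, 53
n = p * q                 # public modulus: factoring n recovers p and q
phi = (p - 1) * (q - 1)   # Euler's totient of n
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e mod phi

message = 65                        # any number smaller than n
ciphertext = pow(message, e, n)     # encrypt with the public key (n, e)
recovered = pow(ciphertext, d, n)   # decrypt with the private key (n, d)
print(recovered == message)
```

With real key sizes, recovering d from the public pair (n, e) would require factoring n, which is exactly the hard problem the paragraph above describes.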

  • Elliptic curve cryptography (ECC):
    A family of modern algorithms whose security rests on the difficulty of inverting scalar multiplication of points on an elliptic curve (the elliptic curve discrete logarithm problem). It was first proposed by Neal Koblitz and Victor Miller in 1985. ECC algorithms are generally considered highly secure, but their encryption and decryption computations are relatively time-consuming, so they are usually used for signatures or key agreement rather than for bulk encryption and decryption.
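To make the elliptic-curve idea concrete, here is a toy curve over a 17-element field (a common textbook example). The parameters are far too small for any security and serve only to show the group operation and a Diffie-Hellman-style key agreement on it:

```python
# Toy elliptic-curve arithmetic over a tiny field -- illustration only.
# Curve: y^2 = x^3 + 2x + 2 over F_17 (a textbook example); real systems
# use curves over fields of roughly 256 bits.
P_MOD, A = 17, 2
O = None  # the point at infinity (the group identity)

def add(p1, p2):
    """Add two curve points using the chord-and-tangent rule."""
    if p1 is O: return p2
    if p2 is O: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return O  # vertical line: the points are inverses
    if p1 == p2:  # tangent slope for doubling
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:         # chord slope for distinct points
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def mul(k, point):
    """Scalar multiplication by double-and-add (the hard-to-invert op)."""
    result = O
    while k:
        if k & 1:
            result = add(result, point)
        point = add(point, point)
        k >>= 1
    return result

G = (5, 1)  # a generator of a subgroup of order 19 on this curve
# Diffie-Hellman on the curve: both sides derive the same shared point.
alice_pub, bob_pub = mul(3, G), mul(7, G)
print(mul(3, bob_pub) == mul(7, alice_pub))
```

Given only G and a public point k·G, recovering k is the discrete logarithm problem that the paragraph above relies on; on this 19-element toy group it is trivial, on a 256-bit curve it is believed infeasible.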

RSA and similar algorithms are increasingly considered less secure, and elliptic curve algorithms are generally recommended instead.

Hybrid Encryption
A hybrid encryption mechanism uses the computationally expensive asymmetric encryption to negotiate a temporary symmetric session key (generally much shorter than the content), after which both parties encrypt and decrypt the bulk of the transferred data symmetrically. The typical scenario is the HTTPS mechanism everyone uses today. HTTPS actually relies on Transport Layer Security/Secure Sockets Layer (TLS/SSL) for secure transport. TLS is the successor to SSL; the widely deployed TLS 1.0 corresponds to the SSL 3.1 release.

The specific steps for establishing a secure connection are as follows:

The client browser sends the server a random number R1, the supported encryption algorithms, the protocol version, and the compression algorithms. Note that this step is in cleartext.

The server returns a random number R2, the chosen encryption algorithm, the protocol version, and the server's certificate. This step is also in cleartext.

The browser checks the certificate, which carries the website's public key. The certificate must be issued by a third-party CA, and browsers and operating systems ship preloaded with the root certificates of authoritative CAs, so a forged certificate (a man-in-the-middle attack) is easily caught by verification against the CA's certificate.

If the certificate is fine, the browser encrypts a random number R3 with the public key from the certificate and sends it to the server. At this point only the client and the server hold R1, R2, and R3, and each side generates the same symmetric session key (for example, an AES key) from them. All subsequent communication is protected by symmetric encryption.
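The final key-derivation step above can be sketched as follows. Deriving the key with a single SHA-256 over R1 + R2 + R3 is a simplification for illustration; real TLS uses a dedicated PRF/HKDF construction.

```python
import hashlib
import os

# Both endpoints hold the three handshake random numbers; R3 reached the
# server encrypted under the certificate's public key, so only these two
# parties know all three values.
r1, r2, r3 = os.urandom(16), os.urandom(16), os.urandom(16)

def derive_session_key(r1: bytes, r2: bytes, r3: bytes) -> bytes:
    """Simplified stand-in for the TLS key derivation: hash the three
    random values into 32 bytes of symmetric key material."""
    return hashlib.sha256(r1 + r2 + r3).digest()

client_key = derive_session_key(r1, r2, r3)
server_key = derive_session_key(r1, r2, r3)
print(client_key == server_key)  # both ends derive the same key material
```

Because the derivation is deterministic, the client and server arrive at identical symmetric keys without the key itself ever crossing the wire.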


Very interesting post, thanks for taking the time to write it.

quite informative and interesting

This is a classic summary!

thanks @kiddady for your feedback

i got tired reading this article :-)

quite informative and interesting

thanks @tayyabali3 for your interest and reading

You just received a 15.38% lifting from @botox ! You can also earn by making delegation to @botox.

Nicely explains hashing and encryption

Hi, thanks for the post, though you used so many technical terms...