A hash, also called digest, is a way to verify the integrity of data. In typical cases, a hash is significantly shorter than the data itself, and even minuscule changes in the data lead to completely different hashes.
The hash functionality of this library subsumes and extends that of library(sha), library(hash_stream) and library(md5) by providing a unified interface to all available digest algorithms.

The underlying OpenSSL library (libcrypto) is dynamically loaded if library(crypto) or library(ssl) are loaded. Therefore, if your application uses library(ssl), you can use library(crypto) for hashing without increasing the memory footprint of your application. In other cases, the specialised hashing libraries are more lightweight but less general alternatives to library(crypto).
The most important predicates to compute hashes are:
crypto_data_hash(+Data, -Hash, +Options)
Hash is the hash of the given Data. Admissible options include:
algorithm(+Algorithm)
One of md5 (insecure), sha1 (insecure), ripemd160, sha224, sha256, sha384, sha512, sha3_224, sha3_256, sha3_384, sha3_512, blake2s256 or blake2b512. The BLAKE digest algorithms require OpenSSL 1.1.0 or greater, and the SHA-3 algorithms require OpenSSL 1.1.1 or greater. The default is a cryptographically secure algorithm. If you specify a variable, then that variable is unified with the algorithm that was used.
encoding(+Encoding)
If Data is an atom or string, this option specifies the encoding that is used to compute the hash. The default is utf8. The other meaningful value is octet, claiming that Data contains raw bytes.
|Data||is either an atom, string or code-list|
|Hash||is an atom that represents the hash in hexadecimal encoding.|
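For example, the SHA-256 hash of a string can be computed like this; the digest shown is the well-known SHA-256 hash of the UTF-8 bytes of "hello":

```prolog
?- crypto_data_hash("hello", Hash, [algorithm(sha256)]).
Hash = '2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824'.
```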
For the important case of deriving hashes from passwords, the following specialised predicates are provided:
crypto_password_hash(+Password, -Hash)
This predicate is equivalent to crypto_password_hash(Password, Hash, []) and computes a password-based hash using the default options.
crypto_password_hash(+Password, -Hash, +Options)
In contrast to crypto_data_hash/3 and related predicates, this predicate is slow by design, so that attackers cannot quickly test large numbers of candidate passwords. Another important distinction is that equal passwords must yield, with very high probability, different hashes. For this reason, cryptographically strong random numbers are automatically added to the password before a hash is derived.
Hash is unified with an atom that contains the computed hash and all parameters that were used, except for the password. Instead of storing passwords, store these hashes. Later, you can verify the validity of a password with crypto_password_hash/2, comparing the then entered password to the stored hash. If you need to export this atom, you should treat it as opaque ASCII data with up to 255 bytes of length. The maximal length may increase in the future.
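A typical workflow might look as follows; the stored hash shown here is abbreviated and purely illustrative:

```prolog
%% Derive a hash from a password, then store only the hash:
?- crypto_password_hash("s3cr3t", Hash).
Hash = '$pbkdf2-sha512$t=131072$...'.

%% Later, verify a candidate password against the stored hash:
?- crypto_password_hash("s3cr3t", '$pbkdf2-sha512$t=131072$...').
true.
```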
Admissible options are:
algorithm(+Algorithm)
Currently, the only admissible value is pbkdf2-sha512, which is therefore also the default.
cost(+C)
C is an integer denoting the binary logarithm of the number of iterations used to derive the hash; that is, the number of iterations is 2^C. Currently, the default is 17, and thus more than one hundred thousand iterations (2^17 = 131072). You should set this option as high as your server and users can tolerate. The default is subject to change and will likely increase in the future or adapt to new algorithms.
Currently, PBKDF2 with SHA-512 is used as the hash derivation function, using 128 bits of salt. All default parameters, including the algorithm, are subject to change, and other algorithms will also become available in the future. Since computed hashes store all parameters that were used during their derivation, such changes will not affect the operation of existing deployments. Note though that new hashes will then be computed with the new default parameters.
The following predicate implements the Hashed Message Authentication Code (HMAC)-based key derivation function, abbreviated as HKDF. It supports a wide range of applications and requirements by concentrating possibly dispersed entropy of the input keying material and then expanding it to the desired length. The number and lengths of the output keys depend on the specific cryptographic algorithms for which the keys are needed.

crypto_data_hkdf(+Data, +Length, -Bytes, +Options)
Bytes is unified with a list of bytes of length Length, derived from the input keying material Data. Admissible options are:
encoding(+Encoding)
Either utf8 (default) or octet, denoting the representation of Data as in crypto_data_hash/3.
info(+Info)
Optional context and application specific information. The info/1 option can be used to generate multiple keys from a single master key, using for example values such as key and iv, or the name of a file that is to be encrypted.
This predicate requires OpenSSL 1.1.0 or greater.
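For instance, two independent 32-byte keys can be derived from the same input keying material by varying the info/1 option; this is a sketch, and the master secret shown is a placeholder:

```prolog
?- crypto_data_hkdf("master secret", 32, EncKey, [info("key")]),
   crypto_data_hkdf("master secret", 32, IV,     [info("iv")]).
```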
The following predicates are provided for building hashes incrementally. This works by first creating a context with crypto_context_new/2, then using this context with crypto_data_context/3 to incrementally obtain further contexts, and finally extracting the resulting hash with crypto_context_hash/2.
|Context||is an opaque pure Prolog term that is subject to garbage collection.|
This predicate allows a hash to be computed in chunks, which may be important while working with Metalink (RFC 5854), BitTorrent or similar technologies, or simply with big files.
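For example, a list of data chunks can be hashed incrementally as sketched below, using foldl/4 to thread the context through crypto_data_context/3; the result is the same as hashing the concatenation of all chunks at once:

```prolog
%% Incrementally hash a list of chunks with SHA-256.
incremental_hash(Chunks, Hash) :-
    crypto_context_new(Context0, [algorithm(sha256)]),
    foldl(crypto_data_context, Chunks, Context0, Context),
    crypto_context_hash(Context, Hash).
```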
The following hashing predicates work over streams:
close_parent(+Bool)
If true (default), closing the filter stream also closes the original (parent) stream.
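A file's hash can thus be computed while reading it. The sketch below assumes crypto_open_hash_stream/3 together with its companion predicate crypto_stream_hash/2, which retrieves the hash of the data read so far:

```prolog
%% Compute the SHA-256 hash of File by draining a hash filter stream.
file_sha256(File, Hash) :-
    open(File, read, Org, [type(binary)]),
    crypto_open_hash_stream(Org, HashStream, [algorithm(sha256)]),
    read_string(HashStream, _Length, _Bytes),   % drain; hashing happens transparently
    crypto_stream_hash(HashStream, Hash),
    close(HashStream).   % with close_parent(true), this also closes Org
```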