Interesting article here on post-quantum cryptography. It goes over concepts I'm sure some of you guys are already familiar with: https://blog.trailofbits.com/2018/10/22/a-guide-to-post-quantum-cryptography/

For many high-assurance applications such as TLS traffic, medical databases, and blockchains, forward secrecy is absolutely essential. It is not sufficient to prevent an attacker from immediately decrypting sensitive information. Here the threat model encompasses situations where the adversary may dedicate many years to the decryption of ciphertexts after their collection.

One way forward secrecy might be broken is that a combination of increased computing power and number-theoretic breakthroughs makes attacking current cryptography tractable. However, unless someone finds a polynomial-time algorithm for factoring large integers, this risk is minimal for current best practices. We should be more concerned about the successful development of a quantum computer, since such a breakthrough would render most of the cryptography we use today insecure.

Quantum Computing Primer

Quantum computers are not just massively parallel classical computers. It is often thought that since a quantum bit can occupy both 0 and 1 at the same time, an n-bit quantum computer can be in 2^n states simultaneously and can therefore solve NP-complete problems extremely fast. This is not the case, because measuring a quantum state destroys much of the original information. For example, a quantum state can encode information about both an object's momentum and location, but any measurement of momentum destroys information about location, and vice versa; this is the Heisenberg uncertainty principle. Successful quantum algorithms therefore consist of a series of transformations of quantum bits such that, at the end of the computation, measuring the state of the system does not destroy the needed information.
In fact, it has been shown that there cannot exist a quantum algorithm that simultaneously tries all solutions to an NP-complete problem and outputs a correct answer. In other words, any quantum algorithm for solving hard classical problems must exploit the specific structure of the problem at hand. Today, there are two such algorithms relevant to cryptanalysis.

The ability to quickly factor large numbers would break both RSA and discrete log-based cryptography. The fastest classical algorithm for integer factorization is the general number field sieve, which runs in sub-exponential time. In 1994, however, Peter Shor developed a quantum algorithm (Shor's algorithm) for integer factorization that runs in polynomial time and would therefore be able to break any RSA or discrete log-based cryptosystem (including those using elliptic curves). This implies that all widely used public-key cryptography would be insecure if someone were to build a quantum computer.

The second is Grover's algorithm, which can invert a black-box function over a domain of size n in O(√n) time. This algorithm would reduce the security of symmetric-key cryptography by a square-root factor, so AES-256 would offer only 128 bits of security. Similarly, finding a preimage of a 256-bit hash function would take only 2^128 time. Since doubling the key or digest length of AES or a hash function is not very burdensome, Grover's algorithm does not pose a serious threat to symmetric cryptography. Furthermore, none of the pseudorandom number generators suggested for cryptographic use would be affected by the invention of a quantum computer, other than perhaps the O(√n) speedup from Grover's algorithm.

Types of Post-Quantum Algorithms

Post-quantum cryptography is the study of cryptosystems that can be run on a classical computer but are secure even if an adversary possesses a quantum computer.
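The square-root arithmetic above can be sketched in a few lines of Python (this is just the counting argument, not an actual quantum computation):

```python
import math

# Classical brute force on a k-bit key tries ~2**k candidates;
# Grover's algorithm needs ~sqrt(2**k) = 2**(k/2) black-box queries.
aes256_keyspace = 2**256
grover_queries = math.isqrt(aes256_keyspace)

print(grover_queries == 2**128)      # AES-256 drops to 128-bit security
print(math.isqrt(2**512) == 2**256)  # doubling the key size restores it
```

This is why the usual post-quantum advice for symmetric crypto is simply "double your key and digest sizes" rather than switching algorithms.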
Recently, NIST initiated a process for standardizing post-quantum cryptography and is currently reviewing first-round submissions. The most promising of these submissions included cryptosystems based on lattices, isogenies, hash functions, and codes. Before diving more deeply into each class of submissions, we briefly summarize the tradeoffs inherent in each type of cryptosystem, with comparisons to current (not post-quantum) elliptic-curve cryptography. Note that codes and isogenies are capable of producing digital signatures, but no such schemes were submitted to NIST.

                  Signatures   Key exchange   Fast?
Elliptic curves   64 bytes     32 bytes       ✓
Lattices          2.7 KB       1 KB           ✓
Isogenies         ✗            330 bytes      ✗
Codes             ✗            1 MB           ✓
Hash functions    41 KB        ✗              ✓

Table 1: Comparison of classical ECC vs. post-quantum schemes submitted to NIST

In terms of security proofs, none of the above cryptosystems reduce to NP-hard (or NP-complete) problems. In the case of lattices and codes, these cryptosystems are based on slight modifications of NP-hard problems. Hash-based constructions rely on the existence of good hash functions and make no other cryptographic assumptions. Finally, isogeny-based cryptography is based on a problem that is conjectured to be hard, but is not similar to an NP-hard problem or a prior cryptographic assumption.

It's worth mentioning, however, that just as we cannot prove any classical cryptosystem is unbreakable in polynomial time (since P could equal NP), it could be the case that problems thought to be difficult for quantum computers turn out not to be. Furthermore, a cryptosystem not reducing to some NP-hard or NP-complete problem shouldn't be a mark against it, per se, since integer factorization and the discrete log problem are not believed to be NP-complete.

Lattices

Of all the approaches to post-quantum cryptography, lattices are the most actively studied and the most flexible.
They have strong security reductions and are capable of key exchanges, digital signatures, and far more sophisticated constructions like fully homomorphic encryption. Despite the extremely complex math needed in both optimizations and security proofs for lattice cryptosystems, the foundational ideas require only basic linear algebra.

Suppose you have a system of linear equations of the form A·x = b. Solving for x is a classic linear algebra problem that can be solved quickly using Gaussian elimination. Another way to think about this is that we have a mystery function f(a) = a·x, where given a vector a, we see the result of a·x without knowing x. After querying this function enough times we can learn f in a short amount of time (by solving the system of equations above). This way we can reframe a linear algebra problem as a machine learning problem.

Now, suppose we introduce a small amount of noise to our function, so that after multiplying x and a, we add an error term e and reduce the whole thing modulo a (medium-sized) prime q. Then our noisy mystery function looks like f(a) = a·x + e (mod q).

Learning this noisy mystery function has been mathematically proven to be extremely difficult. The intuition is that at each step of the Gaussian elimination procedure we used in the non-noisy case, the error term gets bigger and bigger until it eclipses all useful information about the function. In the cryptographic literature this is known as the Learning With Errors (LWE) problem.

The reason cryptography based on LWE is called lattice-based cryptography is that the proof that LWE is hard relies on the fact that finding the shortest vector in something called a lattice is known to be NP-hard. We won't go into the mathematics of lattices in much depth here, but one can think of a lattice as a tiling of n-dimensional space.

Lattices are represented by coordinate vectors. In the example above, any point in the lattice can be reached by combining e1, e2, and e3 (via normal vector addition).
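To make the LWE setup concrete, here's a minimal sketch in Python. The parameters here are toy-sized and the error distribution is simplified to ±1; real schemes use much larger dimensions and moduli and a proper discrete Gaussian error:

```python
import random

q = 97  # toy "medium-sized" prime modulus
n = 4   # toy dimension of the secret vector x

rng = random.Random(0)
x = [rng.randrange(q) for _ in range(n)]  # the secret nobody should learn

def noisy_sample():
    """One LWE sample: (a, b) with b = <a, x> + e (mod q), e a small error."""
    a = [rng.randrange(q) for _ in range(n)]
    e = rng.choice([-1, 0, 1])  # small error term
    b = (sum(ai * xi for ai, xi in zip(a, x)) + e) % q
    return a, b

a, b = noisy_sample()
# Each sample is consistent with the secret only up to the small error e:
residue = (b - sum(ai * xi for ai, xi in zip(a, x))) % q
print(residue in (0, 1, q - 1))  # True: off by at most 1 mod q
```

Without the error term, n independent samples and Gaussian elimination would recover x immediately; the noise is the entire source of hardness.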
The shortest vector problem (SVP) asks: given a lattice, find the element whose length as a vector is shortest. The intuitive reason this is difficult is that not all coordinate systems for a given lattice are equally easy to work with. In the above example, we could have instead represented the lattice with three coordinate vectors that were extremely long and close together, which makes finding vectors close to t…
I'm curious to know if any of you have any comments or takeaways from this
that’ll take a few days to digest properly
good source, and seems to be an intro, thank you, good find
We've grown a lot, and we appreciate you being here. I want to let you know that we recognize things are a bit scattered and that it may not be clear what direction Snowblossom is currently moving in. I'm going to put more effort towards organizing and improving communication to users. Right now I think that'll look like:
* Building a roadmap of current/future projects
* Improving wiki substance
* Improving initial navigation, so there's a clear path for new users to explore
Hopefully this will help answer new users' questions and keep you informed and in the loop.
some cryptocurrency projects have blogs that get updated every so often
turtlecoin has a weekly update although that might be too frequent for snowblossom
people are also encouraged to contribute in whatever ways they can to turtlecoin's development/website/marketing. i'm not sure how much work it takes from the lead dev to coordinate everything
I'm going to be working on a 1 pager this weekend and I've secured a Snowblossom related domain for some informational stuff
If I were a conspiracy theorist, I would theorize that a big miner dumped from 40k sat on the ask to 20k sat on the bid in order to disincentivize adding hash, push off the next field, and extend llama's utility. Why else would someone hose the order book like that when there have been consistent market buys over the last week?
How long do you think we have until next field? I'm desperately trying to download field 7 so I can mine on my little Ultrabook
I don't know. It depends on how many of these big miners can keep their rigs running. Profitability just took a nose dive.
If you can buy it, it's not a bad idea to buy it. I don't see how the per rig mining profit number will ever be extraordinarily high unless there is a sweet spot yet to be found on another field
Can you fit field 8?
I absolutely cannot fit field 8 without buying another machine
@Fireduck @Clueless should I update MrPlow now that the version is 1.4.0?
latest version works with no issues on hamster pool.
got it thx
just avoid 1.4.2
Would be great to have a bi-yearly hard fork schedule like Monero
Also a treasury, something like https://forum.getmonero.org/8/funding-required where, once a pitched and approved idea has been picked up by a developer or team, it goes up for community fundraising
1.4.2 will be the best, but it isn't released yet
ah yes, you only broke master for a bit
got ahead of myself there
so rainbow console output incoming?
@Rotonen not a bad idea.
rainbow console?
@Fireduck color coded logging.
for the sake of terminal safety, no
@Rotonen I'm confused
as a decent reference, ctrl-f unsafe https://plumbum.readthedocs.io/en/latest/colors.html
if you don't get it from the examples, you'd need to get deeper into the history of terminals and the horrible horrible legacy they carry with them
like how, if on the remote end of an ssh session, you have GPM running, you can click with your mouse in your local terminal, and the click goes through on the other side, in something like a man page in less or links
for extra reference http://freshmeat.sourceforge.net/projects/gpm/ A mouse server for the console and xterm.
or the whole vt100 to xterm transition silliness