tptacek 4 days ago

If you don't know precisely why you want a library like this (and sometimes you do), you want libsodium instead.

  • mdhb 3 days ago

    Curious for your opinion on where something like Tink fits into the picture here?

    https://developers.google.com/tink

    • tptacek 3 days ago

      In the same place. By all means, use Tink. Tink and libsodium are the two best-known user-proof misuse-resistant crypto libraries, which are the kinds of libraries most people should be using, and the only kind unless you have very particular reasons not to and a decent-sized verification budget.

  • wslh 4 days ago

    How does Libsodium compare with Crypto++ [1] now? Wei Dai [2] is a highly reputable engineer.

    [1] https://github.com/weidai11/cryptopp

    [2] https://en.wikipedia.org/wiki/Wei_Dai

    • tptacek 4 days ago

      The MSB on these libraries isn't the author, but rather the intent of the library. Libsodium is designed to get you to basic core cryptographic functionality with the fewest possible landmines. Crypto++ is designed to be an interface to the maximum number of constructions and primitives. Those are radically different goals, and almost everybody is better served by the former.
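
      To make that concrete, here is roughly what libsodium's "happy path" for authenticated secret-key encryption looks like (a minimal sketch, assuming sodium_init() has been called at startup); note there is no cipher, mode, padding, or IV-size choice to get wrong:

        #include <sodium.h>
        #include <vector>

        // Sketch: seal a message with libsodium's secretbox. The construction is
        // fixed by the library; the caller only supplies a key and gets a random
        // nonce prepended to the ciphertext.
        std::vector<unsigned char> seal(const std::vector<unsigned char> &msg,
                                        const unsigned char key[crypto_secretbox_KEYBYTES]) {
            std::vector<unsigned char> out(crypto_secretbox_NONCEBYTES +
                                           crypto_secretbox_MACBYTES + msg.size());
            randombytes_buf(out.data(), crypto_secretbox_NONCEBYTES);
            crypto_secretbox_easy(out.data() + crypto_secretbox_NONCEBYTES,
                                  msg.data(), msg.size(),
                                  out.data(), key);
            return out;
        }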

      • blacklion 4 days ago

        For me, libsodium has one problem... grr... not exactly a problem, but a peculiarity, which is not mentioned anywhere in the docs.

        libsodium doesn't put any metadata (including a library and/or construction "version") in any of its outputs, like boxes, keys, etc. So there is no mechanism for versioning and backward compatibility.

        So it is more or less OK for real-time communication between two installations of a software product (and even then the two ends could be running different versions of the software and/or library!), but it is not suitable for long-term storage or for communication between different systems.

        You store your "boxed" backup [keys] on disk. You want to read them 15 years later. How can you be sure that libsodium hasn't changed algorithms in those 15 years? You need exactly the same version of the library that created this "box".

        And even an application that is aware of this problem cannot properly solve it: it could record & check the libsodium version and refuse to work with "old" data, but how could it force a future libsodium to read the output of an old one, when there is no provision for this in the API?

        libsodium is too simple for many tasks.
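
        A sketch of the kind of envelope an application ends up inventing for itself, since the box carries no metadata (the layout below is made up for illustration, not anything libsodium provides):

          #include <sodium.h>
          #include <cstdint>
          #include <string>
          #include <vector>

          // Hypothetical application-defined envelope: record which construction
          // and which library produced the ciphertext, because the box itself
          // says nothing about either.
          struct SealedBackup {
              uint32_t    format_version = 1;                        // our format, checked on read
              std::string primitive = crypto_secretbox_primitive();  // e.g. "xsalsa20poly1305"
              std::string library   = sodium_version_string();       // informational only
              std::vector<unsigned char> nonce;
              std::vector<unsigned char> box;
          };

        On read, even a much newer build of the application can then refuse loudly, or pick the right decryption path, instead of silently producing garbage.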

        • comex 3 days ago

          Has libsodium actually made backward incompatible changes to its formats?

          • blacklion 3 days ago

            Not YET, but there are no promises either.

            Newer APIs like secretstream include the algorithm in the API names, so, I hope, they are "stable forever", but that is a rather new feature.

            Basic APIs like "crypto_secretbox" are "black boxes", and there are no guarantees spelled out. The algorithms are mentioned in the documentation, though.

            What will libsodium do if the algorithm it uses gets broken? I don't know.
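
            For comparison, this is what "algorithm in the API name" looks like for secretstream (a small sketch, error handling omitted):

              #include <sodium.h>

              int main() {
                  if (sodium_init() < 0) return 1;

                  // The newer API carries the construction in every symbol name...
                  crypto_secretstream_xchacha20poly1305_state st;
                  unsigned char header[crypto_secretstream_xchacha20poly1305_HEADERBYTES];
                  unsigned char key[crypto_secretstream_xchacha20poly1305_KEYBYTES];
                  crypto_secretstream_xchacha20poly1305_keygen(key);
                  crypto_secretstream_xchacha20poly1305_init_push(&st, header, key);

                  // ...while the classic names (crypto_secretbox_easy, crypto_box_easy)
                  // leave the algorithm implicit.
                  return 0;
              }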

            • another2another 3 days ago

              I don't have the source right now, but I believe a lot of the algorithms used are set in build-time #defines, so it might be possible to capture them at build time and store them with the encrypted data.

              That means you'd have to find a compiled lib with the exact same settings to decrypt them, though.
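
              If I remember the headers right, something like this captures those identifiers (macro names from memory, so double-check them before relying on this):

                #include <sodium.h>
                #include <cstdio>

                int main() {
                    // These expand at compile time, so they describe the headers this
                    // binary was built against; store the strings next to the data.
                    std::printf("libsodium %s, secretbox primitive %s\n",
                                SODIUM_VERSION_STRING, crypto_secretbox_PRIMITIVE);
                }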

              • jedisct1 3 days ago

                If a high-level API ever has to be changed, that will be libsodium 2.0.

                In 12 years, libsodium never had any breaking API changes, even though I don't like the NaCl API much (especially usage of `unsigned long long` instead of `size_t` for sizes).

                API stability is something I'm very committed to, in all my software. APIs can always be improved. But from a developer perspective, a suboptimal but stable API is far better than something that requires changes to all your applications every time the dependencies are updated.

            • jedisct1 3 days ago

              This is documented in the FAQ.

    • adastra22 4 days ago

      I’m not casting shade on anyone, but “highly reputable engineer” is not how I would describe Wei Dai. “Early thinker in this field, respected for his opinions” might be more accurate.

      Especially if you are directly comparing against libsodium and Daniel Bernstein, who is a widely respected engineer whose work is widely used and heavily reviewed.

      • wslh 4 days ago

        Daniel Bernstein is not the creator of libsodium. Libsodium is based on his work on <https://nacl.cr.yp.to/>, which is not the same thing.

        • wolf550e 4 days ago

          Frank Denis, maintainer of libsodium, is also a reputable engineer.

        • adastra22 4 days ago

          Libsodium is a fork of NaCl.

          • tptacek 4 days ago

            It's certainly not the same as NaCl. But I'd reiterate: the point here isn't trust in the library's author. It's the intent of the library.

    • awaythrow999 4 days ago

      Because extending trust usually works retrospectively?

      Or

      which battle-tested applications exist today using Crypto++ that illustrate it's a better choice than what has so far held up under libsodium (which is a lot)?

      • wslh 4 days ago

        My comment was intended to be constructive, not to spark a flamewar. If you compare Crypto++ and Libsodium, you’ll notice they were created in different decades. This reflects that, at one point, C++ developers predominantly used Crypto++, while later, Libsodium emerged as an alternative.

thamer 4 days ago

For what it's worth, I've used Botan in a personal project where I needed a few hashing algorithms to verify file integrity (SHA-1, SHA-256, even MD5), and also used Botan's base64 decoder in the same project.

I found its modern "style" pleasant to write code for, and easy to integrate with good docs. That said, I did notice the large number of algorithms as others have pointed out, and I'm not sure I'd use it for more serious purposes when libsodium is well-established. It certainly wouldn't be an obvious choice.

But to quickly add support for common hash algorithms in a small project, I thought it was a good option and enjoyed its API design and simplicity.
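
For anyone curious what that looks like, here is a rough sketch from memory of the Botan 3 API (exact headers and names may be slightly off):

    #include <botan/hash.h>
    #include <botan/hex.h>
    #include <botan/base64.h>
    #include <iostream>
    #include <string>

    int main() {
        // One factory call per algorithm name; SHA-1, SHA-256 and MD5 all go
        // through the same object interface.
        auto sha256 = Botan::HashFunction::create_or_throw("SHA-256");
        sha256->update("hello");
        std::cout << Botan::hex_encode(sha256->final()) << "\n";

        // Base64 decoding is a free function in the same library.
        auto bytes = Botan::base64_decode("aGVsbG8=");
        std::cout << std::string(bytes.begin(), bytes.end()) << "\n";
    }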

JanisErdmanis 4 days ago

I personally don't like that every single cryptographic scheme is included in a single library. This creates a false sense of power, but it breaks down once more complicated things need to be done, such as zero-knowledge proofs or homomorphic computations, where it can become awkward at best. In a sensible language one makes separate modules for hashing, cryptographic groups, pseudorandom number generation, cryptographic signatures, various zero-knowledge proofs, and TLS, and uses the package manager to install what one needs.

  • deknos 4 days ago

    Why? It's better if it's in one place and the overall quality is maintained there, than having different libraries with different authors, quality levels and guidelines.

    • JanisErdmanis 4 days ago

      Small libraries are easier to get into and contribute to. Also, let's say one develops some zero-knowledge crypto system with Botan and later finds out that its elliptic curve implementation is not that performant. Improving the performance of elliptic curves is one of the dark arts that only a few know how to do, so he decides to wrap the one that the OpenSSL library provides.

      The essential question is whether he would be able to use the OpenSSL implementation without changing the internals of Botan or of his own zero-knowledge crypto system. In modular libraries this is less of an issue, as modularity generally implies working with abstract groups and writing wrappers or implementations outside.
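
      The seam I have in mind is something like this (a made-up sketch, not any existing library's interface): the zero-knowledge code talks to an abstract group, and a Botan- or OpenSSL-backed curve becomes a drop-in implementation behind it.

        #include <cstdint>
        #include <memory>
        #include <vector>

        // Hypothetical abstract group interface. Concrete subclasses could wrap
        // Botan, OpenSSL, or a pure-software curve; the protocol code above it
        // never changes.
        struct GroupElement {
            virtual ~GroupElement() = default;
            virtual std::unique_ptr<GroupElement> add(const GroupElement &other) const = 0;
            virtual std::unique_ptr<GroupElement> scalar_mul(const std::vector<uint8_t> &k) const = 0;
            virtual std::vector<uint8_t> encode() const = 0;
        };

        struct Group {
            virtual ~Group() = default;
            virtual std::unique_ptr<GroupElement> generator() const = 0;
            virtual std::vector<uint8_t> order() const = 0;
        };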

      • theamk 4 days ago

        Small libraries are the biggest contributors to supply chain problems, and that is the last thing you want in your crypto libraries!

        If you assemble your own system out of random components found on the web, there is a good chance that one of those components will either disappear or stop being supported. A single monolithic library is much more likely to survive once it gets a critical mass of developers.

        • JanisErdmanis 4 days ago

          Every developer is responsible for verifying the trustworthiness of a given component when bringing the dependency in and when upgrading it. If the libraries have a small API surface, that is much easier to do than verifying a big monolith. One can simply check that the specs for inputs and outputs remain unchanged, and that already provides some assurance against incorrect implementations.

          Regarding supply chain attacks, one of the important points is to establish consensus on dependency versions and their hashes via distributed replication or transparency logs (as in Go). A big monolith does not offer any advantages in this respect.

          • theamk 4 days ago

            (1) A small API surface does not mean small complexity. A high-level package like "curl" or some SSL library might have a very small API hiding lots of complexity.

            (2) How does "simply check the specs for inputs and outputs to remain unchanged" help with trustworthiness, or give _any_ assurance against incorrect implementations? (Unless it's Agda/Coq, but we are not talking about those.) The specs can say whatever they like, and yet the program might be full of bugs.

            (3) How do you select a library in an area you are not familiar with? It seems you know cryptography well, so let's choose a different topic - what would you examine to find, say, a good audio capture library, or a good OpenGL wrapper? For me, I'd look at popularity (number of reverse dependencies in my repo, mentions on the web, or GitHub stats); liveness (activity in the repo and mailing list, open/closed bugs); general quality (commit history, CI logs if they are public, etc.). All of those would be _vastly_ easier with a single large repo vs. tons of smaller ones.

            (4) Think of a few supply chain compromises that come to mind. How many of those would be prevented by "distributed replication or transparency logs"? I thought of the xz compromise, the left-pad pull, and Marak's colors.js. _None_ of them would be fixed by "consensus on dependency versions and their hashes". All of them would be fixed if people just kept packages bigger, so that each one has actual CI, a release team, and so on.

            • JanisErdmanis 3 days ago

              High-level packages generally have many more users, hence their inherent complexity is somewhat looked after by those users. Also, curl has only a small user API, but once you consider the implicit certificate store that is passed with every request, and its countless variations, it is in a sense untestable.

              On the other hand, libraries for hashes, encryption, cryptographic groups, and pseudorandom number generators are much more testable on their own. One can take specs and other engineered tests and check that inputs correspond to outputs. The rest is solved via memory-safe languages and the common sense not to comment tests out or add special cases for them.

              > How do you select a library in an area you are not familiar with?

              Either by popularity, or by being able to read the code and see that there are clear expectations communicated to the user via the API or documentation. I wouldn't consider security much when choosing user-facing libraries, but for network-facing libraries popularity would be essential to me.

              > All of them would be fixed if people just kept packages bigger, so that every one has actual CI, release team, and so on.

              A bunch of small libraries can still be maintained by a large organization. It does not make a difference that they are installed separately. However, modularity does add the benefit of letting contributors break the complexity into pieces, and it also gives more transparency to new users, who can see which parts of the library are actually widely tested in production and which are not and need more care.

              Regarding xz, the lesson there is that one should not use programming languages whose limitations are deferred to preprocessors and complex build systems that only a few can read. Similarly, the lesson from Log4j is that readability and base language features like string interpolation are much preferable to mere popularity.

  • volkadav 4 days ago

    tbh i'm just happy to see "crypto" and have it mean cryptography.

    sic transit gloria mundi or something. :)

  • asveikau 4 days ago

    It's good for most use cases for it to be a black box. TLS apis that were designed in the 90s still work with the newest protocols and ciphers and bug fixes. Consumers of the library can blindly and transparently adopt 30 years of research into the topic.

    • jeroenhd 4 days ago

      TLS APIs with moving cryptography targets have proven quite useful. I'm only sad that the same approach never got popular at a lower level. In a perfect world, you tell the OS what key/trust store to use and what domain to connect to, and you just get a plaintext socket that'll do all the cryptography for you, regardless of whether you're using TLS 1.0 or TLS 1.3.

      I know the layered network protocol design is flawed, but I really like the elegance of the old design where TLS/IPSec/DTLS/WireGuard is just an option you pass to connect() rather than a library you need to pull in and maintain. Windows has something like that and it feels like somewhat of a missed opportunity for the internet.

      • tialaramex 4 days ago

        TLS 1.0 and TLS 1.1 are both a bad idea. If the far end of the connection isn't on a trip out of the solar system or buried under the ocean it needs to be upgraded. TLS 1.0 is from last century.

        • jeroenhd 3 days ago

          That's not the point. The point is that the application could have been written when TLS 1.0 was state of the art and now works over TLS 1.3 simply by updating the operating system. The OS can pick a safe set of protocols for you (unless manually overridden by user config to keep broken old TLS 1.0 connections working).

      • asveikau 4 days ago

        connect(2) and these other kernel mode interfaces are definitely the wrong layer. Should be done in user mode. You also want to be able to replace the transport layer with whatever you want.

        I think an opaque layer that lets you poke at internals when you decide you know what you're doing is the way to go. E.g., about 10 years ago I implemented certificate pinning atop a 1990s Microsoft API... you can use the API the "dumb" way and not touch it, and then optionally you can easily switch to implementing features the API vendor never envisioned.

        • blacklion 4 days ago

          It cannot be done in user mode if you have a "hardware" implementation inside your NIC.

          And that is the only way to get performant TLS.

          • toast0 4 days ago

            You don't really need NIC TLS unless your bitrate is rather high and you're serving data that is stored on disk / in disk cache rather than data that was just generated on the CPU.

            Bulk ciphering is pretty inexpensive since Intel Haswell (2014), where there was a big improvement in AES-NI, but the loads and stores around ciphering static content that could otherwise avoid a trip through the CPU can be a significant bottleneck at 100 Gbps+. I'm sure there are measurable gains from TLS offload in other situations, but it's only critical for static content at high bit rates.

ch33zer 4 days ago

A long list of supported hashes/algorithms is imo an antipattern for crypto libraries. They should focus on being very obviously correct for a small set of supported algorithms. Crypto's hard and this just increases the surface area.

  • nickelpro 4 days ago

    This only makes sense if you're doing greenfield development. The second you need to support existing protocols and devices where those decisions were chiseled in stone years or decades before, you'll be happy for a long list of hashes and algorithms all contained inside a single package with a single API to learn.

  • mcraiha 4 days ago

    It would be easier if there were one standard combo that would be enough (e.g. something similar to WireGuard). Currently there are many IoT devices that mention TLS support but don't specify, e.g., the supported ciphers and hash functions.

    • ekr____ 4 days ago

      In the case of TLS, at least, there is a set of mandatory-to-implement algorithms, so in principle two conformant implementations should be able to interoperate using those. Currently, they are:

      - ECDSA with P-256 for signature
      - ECDH with P-256 for key establishment
      - AES_128_GCM for data encryption with SHA-256 for hashing and KDF

  • cryptonector 3 days ago

    And then you have to deal with some case where you have to interop with something that uses a secure ciphersuite that your library doesn't support, and you block hard on that.

    Yes, if you want a cryptography library for a greenfield project, then you want something tiny and simple and secure (e.g., libsodium). For other things you'll want a more complete library.

  • capitol_ 4 days ago

    I would say that this is true for protocols; TLS has been significantly improved by dropping support for many different algorithms.

    But it's not obvious that the same is true for a library.

    The RustCrypto project breaks each algorithm into its own crate, while Botan implements everything.

    It's not obvious to me that one approach is clearly superior to the other.

    • cryptonector 3 days ago

      Each algorithm in its own crate means a lower footprint when you need just a few.

      • randombit 3 days ago

        It's not really a significant factor here - the Botan build system allows you to select what features you want to include. For example, if you wanted a binary that supports just AES, GCM, SHA-256, and ECDSA, with nothing else, that's easily done.

mdaniel 4 days ago

I first learned of it because KeePassXC uses it https://github.com/keepassxreboot/keepassxc/blob/2.7.9/cmake...

  • frantathefranta 4 days ago

    Same for me; I found out what Botan was because my Nix-managed KeePassXC package would not compile. I had to switch to the brew cask for the time being.

    • Handprint4469 4 days ago

      is Nix worth it nowadays? my past experience has been full of these little inconveniences that made me give up on it, but I'm wondering if things have improved in the last 1-2 years.

      • frantathefranta 3 days ago

        It's nice on NixOS, but my experience on macOS isn't great. It could be because I have an x86 2017 MacBook Pro with only 8 GB of RAM, but I regularly have to wait for bug fixes to update my system.

mempko 4 days ago

It's a great, easy-to-learn library. I used it for Firestr.

gegabega 4 days ago

Is there a Rust alternative?

  • jedisct1 4 days ago

    The "boring" crate.

    It nicely wraps BoringSSL, and is actively maintained. The API is the most pleasant of all crypto APIs I've seen in Rust.

  • wolf550e 4 days ago
    • tialaramex 4 days ago

      That's an FFI mechanism for Rustls, the popular Rust TLS implementation (commonly used in Rust where, say, C or C++ programmers might use OpenSSL). So that's the component you would need if you've decided you want a Rust TLS implementation but you actually want to use it from, say, C++ instead.

      I assume the parent is actually a Rust programmer, or at least intends to use this from Rust, so, yeah, plain Rustls is an obvious choice. https://crates.io/crates/rustls

      As Thomas Ptacek emphasised above, despite Botan saying it's "Crypto and TLS", if you think you might need some sort of "Crypto" rather than just wanting to speak TLS, such libraries have way too many knobs and switches you don't understand, as does OpenSSL. The whole point of libsodium and similar libraries is not to have switches you don't need: spell out a use case and deliver exactly what's needed for that use case, nothing else, no switches, no "optional" features.

      Rust has a libsodium wrapper, which is especially useful if you either particularly trust libsodium or have experience with those APIs, and also some similar but Rust-flavoured approaches where they don't give you all the knobs and switches, just an API that makes sense for the specific use case you chose from their list - either the library is correct or it's not, with no "What kind of idiot pushes the red button?" problems.

jfwuasdfs 4 days ago

[flagged]

  • yazzku 4 days ago

    [flagged]

    • dang 4 days ago

      We've banned this account for repeatedly breaking the site guidelines.

      If you don't want to be banned, you're welcome to email hn@ycombinator.com and give us reason to believe that you'll follow the rules in the future. They're here: https://news.ycombinator.com/newsguidelines.html.