Verifiable delay functions: a quest in time

Johanna Maria Kirss

Chefs use recipes like computers use algorithms. Think of VDFs as slow recipes a computer cannot rush.

Algorithms are to computers what recipes are to chefs: sets of instructions to achieve a goal. Just like you might follow a recipe to make a pie, a computer might follow an algorithm to sort an array.

And just like some recipes have more steps than others, so do certain algorithms. We call that idea time complexity: the number of elementary steps an algorithm takes to finish running.

Notice that we say "time", but we don't think of it as the time we might measure with a stopwatch. That's not very helpful, since a computer can use many processors at the same time. For a visual, think of a chef with a team of assistants, as opposed to a solo cook. The recipe is the same for them both, but a team can finish much sooner than just one person.

So, this is why we think of the number of steps an algorithm has to perform instead. Unlike real running time, this number does not change depending on which computer is running it.*

Verifiable delay functions try to bridge that gap.

... but what if we wanted an algorithm to take the same amount of stopwatch time no matter how powerful the computer running it is? This is what verifiable delay functions (VDFs) aim to achieve.

A VDF has three properties. One, it creates a delay, meaning that it takes a specified amount of time to run. You give the function a time parameter (T), and computing the result from an input will take that amount of time, no matter how powerful the computer is. Two, it is a function, which means that if you give it the same input, it always gives you the same result.** Verifiability, the third property, means that once you have an input and a result, you can quickly check that the result is correct.

These three properties create a useful building block in cryptography.

One possible use of VDFs is in randomness beacons. Randomness beacons regularly publish verifiable random values - if this sounds abstract, imagine trying to convince someone over the internet that you flipped a coin and didn't cheat. That's what a randomness beacon does: it publishes random values everyone can trust are unbiased.

These values can then be used in very technical settings (like blockchain consensus protocols) but also societal ones (for example, in ensuring a random airport security check is indeed random).

But how do you make a VDF?

Think back to the recipe analogy. We talked about how a chef with a team can prepare a meal faster than a solo cook - but is that always true? Imagine making a layer cake: you can only add the next layer once the previous one is finished. Now a team isn't as helpful, since no two layers can be added at the same time anyway.

Making a VDF requires finding algorithms like that - sequential tasks. These are problems you can only solve step by step, not by multitasking. Making the computer solve one of those is what creates the delay we talked about, since it cannot put its parallel processors to use.
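To make "sequential" concrete, here is a minimal sketch in Python of one such task, a hash chain (the seed and step count are arbitrary choices of mine). Each step needs the previous step's output, so a thousand processors finish no sooner than one. Note that a hash chain on its own only gives the delay: the obvious way to check a result is to redo all the work.

```python
import hashlib

def hash_chain(seed: bytes, steps: int) -> bytes:
    """A sequential task: step i cannot start before step i-1 is done."""
    value = seed
    for _ in range(steps):
        value = hashlib.sha256(value).digest()  # depends on the previous hash
    return value

# Same seed and step count -> always the same 32-byte result.
slow = hash_chain(b"coin flip", 100_000)
```

Running this with a large step count takes a noticeable, tunable amount of wall-clock time, which is exactly the "delay" ingredient.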

If a sequential algorithm is also verifiable and a function, we can make a VDF out of it.

A handful of such algorithms have been found. The best known of these is a piece of mathematics called modular exponentiation in a group of unknown order. Two separate researchers, Krzysztof Pietrzak and Benjamin Wesolowski, made VDFs out of it in 2018, and since modular exponentiation is well-researched, the VDFs are quite robust. They are currently in use in various blockchain protocols.
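The sequential core of those constructions can be sketched in a few lines of Python. This is a toy under loud assumptions: the modulus below is tiny and its factorisation is known, purely for illustration, whereas the real VDFs use a modulus whose factorisation nobody knows, plus a short proof that makes verification fast for everyone (omitted here).

```python
def vdf_eval(x: int, T: int, N: int) -> int:
    """Compute x^(2^T) mod N by T squarings, one after another - the delay."""
    y = x % N
    for _ in range(T):
        y = (y * y) % N  # each squaring needs the previous result
    return y

# Toy modulus N = 391 = 17 * 23, for illustration only.
# Someone who knows the factorisation can take a shortcut:
# reduce the exponent 2^T modulo phi(N) first, which is fast.
N, phi = 391, 16 * 22
x, T = 5, 1000
y = vdf_eval(x, T, N)
shortcut = pow(x, pow(2, T, phi), N)  # fast, but needs the secret phi
```

The honest evaluator must do all T squarings in order; the shortcut shows why the factorisation must stay unknown, since anyone holding it could skip the delay entirely.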

Another interesting VDF was created this year by Saab et al. It uses mathematical objects called isogenies, and it is post-quantum secure. What does this mean? Well, once quantum computers become practical, a lot of the cryptography we use now can be broken. So, "post-quantum secure" means that even quantum computers cannot break this VDF. This is not true for the modular exponentiation VDFs.


Verifiable delay functions are a recent invention but have already seen a lot of research. They are useful in various ways, and an interesting effect of their popularity is that delay cryptography (research into algorithms that deal with wall-clock time) generally gets more limelight. A couple of papers have been published about delay encryption, for example, which is a cryptographic building block different from VDFs.

It will be interesting to see how this research develops, and whether it grows as a field or remains mostly at VDFs.


*This is not entirely true once quantum computers enter the picture. Breaking much of the cryptography we use now - factoring large numbers, for instance - takes so many steps on a classical computer that it is, luckily, infeasible. Shor's algorithm, however, solves those problems in far fewer steps on a quantum computer.
Post-quantum cryptography deals with preparing for when quantum computers become more feasible.

**Not all algorithms behave like that. Some might give a completely different output on different runs, even on the same input. Or they might not give any result at all, and instead create side effects: do things like changing a file on your computer or opening a website.