
News & Updates

December 6, 2016

Tech insights: the power of random numbers

Vernon Murdoch, Chief Architect at Haventec, explains the power of random numbers for security and fun.

Random numbers generated by software shouldn’t make sense. Whether they’re used in cryptography or to add extra fun to a game, they’re meant to be unsystematic, unpredictable.

Programmers traditionally rely on external sources of unpredictability to generate seeds and salts, making it harder to predict the next number in a random sequence.

These unpredictable sources are known as ‘entropy inputs’. As more random data is requested, these entropy sources become depleted, and programmers have to wait until they are replenished before they can get the seeds they need.
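
On Linux, for example, this is the classic difference between /dev/random and /dev/urandom. A minimal sketch in Python (assuming Linux, and the pre-5.6 kernel behaviour where /dev/random blocks whenever the kernel’s entropy estimate runs low):

    # Linux-specific illustration of entropy depletion.
    # /dev/random historically blocks when the kernel's entropy estimate
    # runs low; /dev/urandom never blocks. (Since kernel 5.6, /dev/random
    # also stops blocking once the pool is initialised.)
    with open("/dev/random", "rb") as blocking_source:
        seed = blocking_source.read(32)   # may stall until entropy is replenished

    with open("/dev/urandom", "rb") as nonblocking_source:
        seed = nonblocking_source.read(32)  # returns immediately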

System entropy is a scarce resource on most computers, so to stretch a small amount of true randomness into a long, hard-to-predict stream we use cryptography to generate ‘secure’ random numbers.
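
As a concrete illustration (a minimal sketch, not Haventec’s implementation), Python’s standard-library secrets module exposes exactly this kind of cryptographically secure generator, for security and fun alike:

    import secrets

    # secrets draws from the operating system's cryptographically
    # secure random number generator.
    token = secrets.token_hex(16)     # a 128-bit session token
    roll = secrets.randbelow(6) + 1   # an unpredictable die roll, 1-6
    print(token, roll)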

If the seed is known we have problems, because the supposedly random numbers become far easier to guess by anyone who captures the output and analyses it.

On a non-compromised system, by contrast, the secure random stream is much harder to predict from captured output alone.

There are a few major problems with random numbers being generated by computers:

(1)   Reliance on system entropy to seed and salt the number generation. Checking for randomness is difficult, and there is no way to know if the seed has been compromised (that is, whether the numbers have become guessable).

(2)   System entropy is always going to be a problem if it is purely software based. Using hardware number generators can help. Cryptographers have also designed a secure number generator, Fortuna, which minimises the reliance on system entropy using seed pools and cryptography. Fortuna builds recovery into the randomness generation, which stops compromised seeds from polluting the stream and making all future numbers guessable (a simplified sketch of this idea follows the list).

(3)   Testing a number generator’s output stream for randomness isn’t easy. Ideally you want to check that the random stream is still random and hasn’t been hijacked or entered a state where it is repeating sequences. The industry-standard tool is “Dieharder”, available through the package managers of most Unix-like operating systems. It gives an indication of how random the data looks but provides no guarantees. The other problem with this tool is that it takes a lot of CPU power to run all the tests needed to confirm that a sample of the stream is random. “Dieharder” is used to test random number generators while systems and software are being built, but it is never used during production random number generation, for obvious reasons.
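
To make point (2) concrete, here is a heavily simplified, hash-based sketch of Fortuna’s pooled-reseed idea. It is an illustration only: real Fortuna uses a block cipher in counter mode with 32 pools drained on a schedule, and the names here (TinyFortuna, add_entropy, random_bytes) are invented for this example.

    import hashlib

    class TinyFortuna:
        """Toy Fortuna-style generator built from SHA-256.

        The core recovery idea: incoming entropy accumulates in pools,
        and each reseed folds the pool digests into the key, so an
        attacker who learns the key at one moment is eventually locked
        out again as fresh entropy arrives.
        """

        def __init__(self, num_pools=4):
            self.pools = [hashlib.sha256() for _ in range(num_pools)]
            self.key = b"\x00" * 32
            self.counter = 0
            self.events = 0

        def add_entropy(self, data):
            # Spread incoming events round-robin across the pools.
            self.pools[self.events % len(self.pools)].update(data)
            self.events += 1

        def _reseed(self):
            # Fold every pool digest into the key. Without knowing the
            # pooled entropy, an old key no longer predicts the new one.
            material = self.key + b"".join(p.digest() for p in self.pools)
            self.key = hashlib.sha256(material).digest()

        def random_bytes(self, n):
            self._reseed()
            out = b""
            while len(out) < n:
                self.counter += 1
                out += hashlib.sha256(
                    self.key + self.counter.to_bytes(16, "big")
                ).digest()
            return out[:n]

    rng = TinyFortuna()
    rng.add_entropy(b"timing jitter, interrupts, mouse movement...")
    print(rng.random_bytes(16).hex())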

Compared with a full “Dieharder” run, a very quick and simple way to look for repeated sequences is to run the stream through a compression tool, such as gzip, and check the compression ratio.

A high compression ratio indicates the stream is not random. A low compression ratio does not guarantee a random stream, but it does suggest the stream is still random, and it requires far less compute time than running a full sequence of “Dieharder” tests.
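
A minimal sketch of that check, using Python’s zlib (the algorithm behind gzip); the 0.99 threshold is an illustrative assumption, not a calibrated value:

    import os
    import zlib

    def looks_random(sample, threshold=0.99):
        # Truly random bytes are essentially incompressible, so a
        # compressed size well below the original suggests repeated
        # sequences in the stream.
        ratio = len(zlib.compress(sample, 9)) / len(sample)
        return ratio >= threshold

    print(looks_random(os.urandom(1 << 20)))  # random input: True
    print(looks_random(b"ABCD" * (1 << 18)))  # repeating input: False

Anything that compresses noticeably is then worth a closer look with the heavier test suites.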

With all of this in place, random number generation is no longer blocked waiting for system entropy; it is CPU bound instead, which makes pseudo-random number generation a problem that can be solved by scaling.

December 6, 2016

Ric Richardson’s patently good lessons

Ric Richardson, inventor and co-founder at Haventec, shares the patent lessons gained during his Uniloc days.

Anyone who knows my story usually thinks of the $300 million headline, but I think about the things I learned leading up to that. One of those lessons is very counterintuitive. When I first went for the Uniloc ’216 patent I was extremely paranoid that someone else, somewhere in the world, must have done this before… it just seemed so simple to me. But that’s how most of my inventions are. They are really obvious after the fact.

More than a decade after filing for that patent we were in court proving no one else had done software activation before us. I realised then that the paranoia I had back in 1992 really worked against us.

Not for us.

If you look at the act of filing for a patent, you are actually laying your invention on the line, and very publicly. Inevitably, within a few years everyone will know what you are doing, so there is no point in trying to be secretive or obscure.

Your patent will stand or it won’t.

In either case it’s to your advantage to go public as soon as patent protection is in place. This way everyone is put on notice that you have claimed a space in your field of technology.

No one can say you lay in wait for other innocent technologists with the aim of making them pay up in court years later. Additionally, if someone has done something like it before, you may risk looking a bit silly, but you will not have wasted your time, effort and money, or your investors’ support and funds.

So, for Haventec, we are exploring the idea of provisionally patenting all important new technologies and then pretty much immediately publishing an example of our concept for peer review.

This sounds counterintuitive.

It sounds dangerous.

But the question is: if the patent is going to be published in another year or two anyway, isn’t it better to go public as soon as you can, while still being protected?

We’ll soon publish our technology for cloud-based random number generation.

While it’s not a front-page breakthrough, among security and enterprise technologists it may prove to be a very important technology.

Random number generation is the backbone of most security and encryption operations. And currently the only way to reliably provide random numbers is to use expensive customised random number generator hardware.

Going forward, with the blessing of Rob Morrish and Tony Castagna, we will research new technology, patent it, code a working example, and then publish.

And if the new technology survives some scrutiny from our growing circle of expert supporters we may even release it to the media for review by the general public.

So keep an eye on the media for something about cloud-based random number generation. Let’s see if the lessons learned bear fruit.

December 6, 2016

Passwords are an ancient concept

The first logins were set up for MIT’s Compatible Time-Sharing System (CTSS) in the early 1960s.

Back then, sys admins wanted simple authentication, so they chose username and password.

“Nobody wanted to devote machine resources to this authentication stuff,” explained Professor Fred Schneider of Cornell University’s Computer Science faculty in an interview with Wired magazine in January 2012.

Early sys admins could have used a knowledge-based system like mother’s maiden name, first pet, first school… but Schneider said “that would have required storing a fair bit of information about a person”, while a simple login only needed a few bits or bytes.

The concept of a password had its roots in armies and secret clubs, but as the inventor of the computer password, Prof Fernando Corbató of MIT, admitted to the Wall Street Journal in May 2014:

“Unfortunately it’s become kind of a nightmare with the World Wide Web. I don’t think anybody can possibly remember all the passwords that are issued or set up. That leaves people with two choices. Either you maintain a crib sheet, a mild no-no, or you use some sort of program as a password manager. Either one is a nuisance.”

Haventec’s Authenticate is a third method that is more secure, easier to manage and does away with the ancient form of authentication altogether.