Research on Pseudorandom Number Generation Functions Based on Generative Adversarial Networks
Date
2022
Abstract
Generating cryptographically secure and fast random sequences has long been a key problem in cryptography. In this thesis, we introduce how to train a GAN (Generative Adversarial Network) on hardware noise and generate random sequences of similar quality. Hardware noise produced by /dev/random in the Linux operating system serves as the training set for our GAN. During training, we also apply techniques such as early stopping to prevent the model from overfitting. Finally, using 128,000,000 bits of random sequences, we compare our GAN with other PRNGs (pseudorandom number generators) under the NIST (National Institute of Standards and Technology) Special Publication 800-22 test suite and the ENT test. The results show that our GAN outperforms most PRNGs, closely resembles the /dev/random data it was trained on, and generates random sequences at least 1044 times faster than /dev/random. This demonstrates that a GAN, as a neural-network PRNG, can imitate hardware noise from a non-deterministic source, combining the high security of hardware noise with the speed advantage of a PRNG. Moreover, it shows that a secure but slow hardware device can be replaced by a generator producing random sequences of similar quality, offering a new approach in the field of cryptography.
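As a rough illustration of the training setup described in the abstract, the sketch below reads hardware noise from /dev/random and trains a small GAN on it. The thesis does not disclose its network architecture, sample length, or hyperparameters, so the PyTorch layer sizes, sequence length, optimizer settings, and loop bounds here are assumptions for illustration only, not the authors' implementation.

```python
# Illustrative sketch only: architecture and hyperparameters are assumed,
# since the thesis abstract does not specify them.
import torch
import torch.nn as nn

SEQ_BITS = 1024  # assumed length of each training sample, in bits

def sample_dev_random(n_samples: int) -> torch.Tensor:
    """Read hardware noise from /dev/random and unpack it into {0,1} bits."""
    n_bytes = n_samples * SEQ_BITS // 8
    with open("/dev/random", "rb") as f:
        raw = f.read(n_bytes)
    bits = torch.tensor(
        [(b >> i) & 1 for b in raw for i in range(8)], dtype=torch.float32
    )
    return bits.view(n_samples, SEQ_BITS)

# Assumed fully connected generator and discriminator.
generator = nn.Sequential(
    nn.Linear(128, 512), nn.ReLU(),
    nn.Linear(512, SEQ_BITS), nn.Sigmoid(),  # values in (0,1), thresholded to bits later
)
discriminator = nn.Sequential(
    nn.Linear(SEQ_BITS, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

for step in range(1000):              # early stopping would wrap this loop
    real = sample_dev_random(64)      # hardware noise acts as the "real" data
    noise = torch.rand(64, 128)       # generator seed
    fake = generator(noise)

    # Discriminator: distinguish hardware noise from generated sequences.
    opt_d.zero_grad()
    d_loss = bce(discriminator(real), torch.ones(64, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # Generator: make its output indistinguishable from hardware noise.
    opt_g.zero_grad()
    g_loss = bce(discriminator(fake), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()

# After training, threshold the generator output to obtain a bitstream that
# could be fed to the NIST SP 800-22 and ENT test suites for evaluation.
bits = (generator(torch.rand(1, 128)) > 0.5).int()
```

In this kind of setup the trained generator, seeded with cheap pseudorandom input, produces output bitstreams without touching the slow hardware entropy source, which is where the reported speed advantage over /dev/random would come from.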
Keywords
Artificial intelligence, Deep learning, Classification