Demo Video: https://www.youtube.com/watch?v=blJhvUyQZiU
Talk by Mark C. (@LargeCardinal) given at BSides London 2018 - we discuss problems in random number generation on IoT devices, the security and cryptographic implications, and give a framework for assessing the fixes that are proposed for entropy gathering for PRNGs on IoT devices.
3. Why The Arduino Random Number Generator is so weak: an unseeded LCG
hi = seed / 127773L;
lo = seed % 127773L;
seed = 16807L * lo - 2836L * hi;
if (seed < 0)
seed += 0x7fffffffL;
Linear Congruential Generator code snippet
Arduino Random Numbers
▪ The RNG is a fully deterministic RNG (DRNG), with no entropy seeding
from a noise source whatsoever
▪ This means that the device will always generate
precisely the same stream of random bits
▪ Generated by use of a Linear Congruential Generator
(LCG)
▪ Even this relatively simple code is resource-heavy on
the AVR microcontroller
▪ It takes just over 2 days (~54hrs) to get through
the full cycle on an AVR
▪ It takes ~26 seconds to do the same
computations on an i7-4660U
[Test code cycling through the full LCG space on a laptop]
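For illustration, here is a minimal Python sketch of the same Park–Miller LCG (the constants above: multiplier 16807, modulus 2³¹ − 1, computed via Schrage's hi/lo split exactly as in the C snippet), showing the slide's point that an unseeded generator always produces the identical stream:

```python
def minstd(seed):
    # Park-Miller "minimal standard" LCG, as in avr-libc's random():
    # seed = 16807 * seed mod (2^31 - 1), via Schrage's method to
    # avoid 32-bit overflow (hi/lo split as in the C snippet above).
    while True:
        hi, lo = divmod(seed, 127773)
        seed = 16807 * lo - 2836 * hi
        if seed < 0:
            seed += 0x7FFFFFFF
        yield seed

a, b = minstd(1), minstd(1)
stream_a = [next(a) for _ in range(5)]
stream_b = [next(b) for _ in range(5)]
assert stream_a == stream_b          # same seed -> identical stream
print(stream_a[:2])                  # [16807, 282475249]
```

With no noise source to vary the seed, every device boots into this same sequence.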
Why are we picking on Arduino?
▪ In short, we’re not – the Arduino, specifically the avr-libc, is probably the best example of how this can go wrong
▪ Lots of ‘fixes’ are touted for the Arduino ecosystem and for ATMega ICs, yet many of them have not been tested
▪ They are cheap to get ahold of, and are used by many makers/creatives for producing IoT prototypes
▪ Nobody is surprised that Arduinos have bad random number generation – least of all the author!
▪ The aim here is to show how bad it is and how we can be proactive about fixing these issues
4. Some mathematical properties of the LCG – hyperplanes and Marsaglia’s theorem
hi = seed / 127773L;
lo = seed % 127773L;
seed = 16807L * lo - 2836L * hi;
if (seed < 0)
seed += 0x7fffffffL;
Linear Congruential Generator code snippet
Some of the mathematics behind LCGs
▪ LCGs and LFSRs are not considered cryptographically secure
PRNGs, but are still included in most legacy codebases
▪ Analysis of LCGs in particular goes back to the 1960s –
Marsaglia and others studied the properties of this generator
in order to understand better its limitations and features
▪ Marsaglia’s Theorem details the way that outputs of an LCG
can be characterised
▪ They lie on a specific, and relatively small, number of hyperplanes
▪ These hyperplanes intersect the unit n-cube
▪ Specifically, successive n-tuples obtained from an LCG
fall on at most (n! · m)^(1/n) parallel hyperplanes, where
m is the LCG modulus.
▪ See Marsaglia, G. (1968) ‘Random Numbers fall mainly
in the planes’
Fun Fact – the LSBs of an LCG output can have some quirks –
e.g. they alternate odd/even if the modulus is a power of 2; as
such, only the MSBs should be used.
Image credit: Innocente, R. ‘Random Numbers for Simulations’
https://www.slideshare.net/rinnocente/random-numbers-for-simulations
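The LSB quirk above is easy to demonstrate. A Python sketch using the classic ANSI C rand() constants (an assumed example with a power-of-two modulus, not the avr-libc generator, which uses a prime modulus):

```python
def lcg_pow2(seed, a=1103515245, c=12345, m=2**31):
    # Classic ANSI C rand() constants - an assumed example chosen
    # because its modulus is a power of two, unlike avr-libc's LCG.
    while True:
        seed = (a * seed + c) % m
        yield seed

g = lcg_pow2(42)
lsbs = [next(g) & 1 for _ in range(8)]
print(lsbs)   # the low bit strictly alternates: [1, 0, 1, 0, 1, 0, 1, 0]
assert lsbs == [1, 0] * 4
```

With a and c both odd and m a power of two, the low bit follows s' = (s + 1) mod 2, so it has period 2 regardless of the seed.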
5. Overview of some of the uses of random numbers in general
“Let us not conjecture at random about the most important things.”
Diogenes Laertius, Lives of the Philosophers
Cryptography
▪ Key Generation
▪ IV/Nonce Generation
▪ Hashing Salt Generation
▪ One-Time Pads
Gaming & Gambling
▪ Any chance rolls
▪ Dice throws
▪ Roulette spins
▪ Card deck shuffles
▪ Digital spins usually have a ‘win’ requirement
(4% in 100 million spins)
Sciences
▪ Statistical sampling
▪ Simulation randomness, e.g. Monte Carlo methods
▪ Control group selection
▪ Analysis – generating reference noise for comparison,
esp. in astronomy (finding patterns in background noise)
Art & History
▪ Divination – random chance is ‘influenced by fate’
in many cultures
▪ Greek democracy – sortition, with citizen selection
done by coloured dice in a kleroterion
▪ Dadaism
▪ E.g. Jackson Pollock paintings
6. Overview of the uses of random numbers in Cryptography
Uses of Random Numbers in Cryptography
▪ Key Generation
▪ Generating truly random bits for use as cryptographically
secure, unguessable keys
▪ IV/Nonce Generation
▪ Secure IVs and nonces are single-use, unguessable
secrets in secure communications
▪ Hashing Salt Generation
▪ Used for the secure storage of user artefacts by hashing
the data with a salt to prevent collisions with other user
artefacts (e.g. passwords)
▪ Cookies/Session Tokens
▪ Used to authenticate users to a web application after
they have logged in; should not be easily predictable,
to prevent session/account hijack/takeover
▪ One-Time Pads
▪ Theoretically ‘perfect’ security relies strongly on the
supply of good random numbers
Common RNG Methods
▪ ‘True’ Random Number Generation (TRNG)
▪ Hardware Security Modules (HSM)
▪ Trusted Platform Modules (TPM)
▪ Pseudo RNGs (PRNG)
▪ Linear Congruential Generator (LCG)
▪ Linear Feedback Shift Register (LFSR)
▪ Also called ‘DRNGs’ (Deterministic RNGs)
▪ Cryptographically Secure PRNGs (CSPRNG)
▪ ChaCha20
▪ Arc4Random
▪ ANSI X9.17
“The generation of Random Numbers is too important to be left to chance.”
-- Robert R. Coveyou
7. Example IoT Technology affected by random number generation – LoRaWAN OTAA
Over-the-Air Activation (OTAA)
NetSKey and AppSKey are generated by
devices based off an application-generated
AppNonce and a device-generated DevNonce
in a join-request/join-accept handshake
Activation by Personalisation (ABP)
This mode bypasses the join-request/accept
procedure, and is intended as a
development mode
LoRaWAN Device Activation Methods
[Diagram: Devices – Gateway – Network Server – Application Server]
NetSKey – Network Session Key
AppSKey – Application Session Key
▪ Encryption uses AES128-ECB mode, with key exchange apparently in the clear
▪ LoRaWAN does not support any key changing algorithm for AppKey
The UK Flood Network prototype sensors
used ATmega328 µCs
▪ Originally used Cisero RFU-328
▪ Changed to STM32F411-based mDots by
MultiConnect, now using Nemeus sensors
▪ This could have been quite problematic
for DevNonce generation
8. Example IoT Libraries with serious flaws in RNG/Entropy generation
Spritz for Arduino (still in use)
From file ‘SpritzBestPractice.ino’ (https://bit.ly/2L3Qzf1):
▪ ‘Entropy’ pool of characters of digits of pi
▪ Given how numbers are coded in ASCII, this fixes the first 4 bits of
each character to 0011 (0x3)
▪ A minimal fix has been essentially ignored by the developers
▪ So, instead, there is now a warning stating that ‘Arduino328P does
not have an official way of getting entropy’
▪ This warning, however, does not remedy the problematic entropy
pool, nor indicate any sources of help
Ceci n’est pas une solution… (this is not a solution…)
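A quick sketch of why the pi-digit pool is so weak: ASCII digits ‘0’–‘9’ occupy 0x30–0x39, so the top nibble of every byte in the pool is fixed (the digit string here is just a short stand-in for the library's pool):

```python
# ASCII digits of pi, as used for the 'entropy' pool (truncated stand-in)
pi_digits = "314159265358979323846264338327950288"

# high nibble of every pool byte
nibbles = {ord(ch) >> 4 for ch in pi_digits}
print(nibbles)   # {3} - the first 4 bits of every byte are 0011
assert nibbles == {3}
```

Half the bits of every pool byte are therefore known in advance before any attack even begins.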
byte LoRaClass::random()
{
  return readRegister(REG_RSSI_WIDEBAND);
}
Arduino-LoRa (current)
From file ‘LoRa.cpp’ (https://bit.ly/2xr5NZg):
▪ ‘Random’ numbers taken from the RSSI of a LoRa IC
(SX17xx) using its SPI interface
▪ No processing/whitening is performed on the numbers
– they are simply the current value of signal strength
▪ A PR has been made available to the developers to
push a minimal (non-cryptographically-secure) fix based
on the test harness we will cover later
Ceci n'est pas une solution non plus… (this is not a solution either…)
10. Some Mathematical Definitions – nonstandard symbols we will be using
Mathematics Term – Definition and Intuition
▪ σ – Greek letter ‘sigma’, used to denote a string.
▪ |σ| – the length of sigma – or indeed, of any string we put between
these vertical lines.
▪ log2(y) – the logarithm corresponding to the number x required to solve
the equation 2^x = y – however, we use this to denote the ‘number of
bits required to write y as a binary string’, more properly ⌈log2(y)⌉.
▪ 2^ω – the set of all infinite binary strings, also called the ‘Cantor
space’. Shorthand derived from {0,1}^ℕ, where ω is the cardinal of ℕ.
▪ 2^<ω – the set of finite, but arbitrary length, binary strings.
11. Mathematics of Random Numbers: Entropy
Example Entropy Space
12 Squares populated by 9 blocks
▪ We say that a block is ‘in structure’ if there is
some condition relating to the number of
blocks in the immediate neighbourhood
▪ No block has no neighbours at any time
(unless you introduce R/L motion)
▪ Ergo, we can say that this array tends towards
some sort of structure, in general
Entropy ≠ Disorder
▪ As such, bad sources of entropy exist
Counting the Entropy Space
▪ There are 4 possibilities for each column:
▪ We can enumerate these from left to right as:
▪ 00, 01, 10, and 11
▪ This gives us a very easy way to reference the state of
each column
▪ Note - we only need 2 bits per column to express the
full state of the system, so 6 bits overall
▪ As such, the entropy of this 12-bit fixed source,
variable state system is 6 ‘free’ bits
▪ The actual entropy may not be the full entropy
Motion
Idea: Entropy is the measure of the space of all possible outputs, usually with some probability distribution
applied - cf. Shannon entropy in information theory, which can be expressed as
H(X) = −∑ᵢ₌₁ⁿ P(xᵢ) · log_b P(xᵢ)
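As a sketch, the Shannon formula applied to the column states above recovers the slide's counts, assuming each column's four states are equally likely and the columns independent:

```python
import math

def shannon_entropy(probs, b=2):
    # H(X) = -sum_i P(x_i) * log_b P(x_i); terms with P = 0 contribute 0
    return -sum(p * math.log(p, b) for p in probs if p > 0)

per_column = shannon_entropy([0.25] * 4)   # 4 equally likely column states
print(per_column)        # 2.0 bits per column
print(per_column * 3)    # 6.0 'free' bits overall, as on the slide
```

If the states are not uniform (a "bad source of entropy"), the same formula gives less than 2 bits per column: the actual entropy falls short of the full entropy.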
12. Mathematics of Random Numbers: Entropy and Complexity
You have two bananas and 𝑛 buckets. You throw
the bananas and they land in the buckets. After
each round, you record where the bananas
landed on a square grid.
Question – how best can we describe each
possible game?
The Two Bananas Game
13. Mathematics of Random Numbers: Entropy and Complexity
You have two bananas and 𝑛 buckets. You throw
the bananas and they land in the buckets. After
each round, you record where the bananas
landed on a square grid.
Question – how best can we describe each
possible game?
The Two Bananas Game
Outcomes for various values of n
▪ n = 2: (fairly boring outcome)
▪ n = 3: (3 outcomes)
▪ n = 4: (6 outcomes)
▪ In general, the number of game states is C(n, 2) = (n² − n)/2
▪ If we ‘address’ each game state, we need log₂((n² − n)/2) many
bits to describe each address
▪ We could also measure the distance (in buckets) between
each banana, so we can represent each game in 2·log₂(n/2)
many bits
▪ This gives us the following general inequalities, for n ≥ 3:
2·log₂(n/2) ≤ log₂((n² − n)/2) < n
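These counts and bounds can be checked directly (a quick sketch; `game_states` is an illustrative name):

```python
import math

def game_states(n):
    # distinguishable games: two bananas in n buckets,
    # counted as unordered pairs, C(n, 2) = (n^2 - n) / 2
    return (n * n - n) // 2

assert game_states(3) == 3 and game_states(4) == 6   # as on the slide

for n in (3, 4, 10, 100):
    addr_bits = math.log2(game_states(n))   # bits to address a game state
    dist_bits = 2 * math.log2(n / 2)        # distance-based representation
    assert dist_bits <= addr_bits < n       # the slide's inequality, n >= 3
```

The left inequality follows since n²/4 ≤ (n² − n)/2 whenever n ≥ 2, with equality at n = 2.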
Kolmogorov Complexity
▪ We define the Kolmogorov Complexity of a
string σ ∈ 2^<ω as follows:
C_Φ(σ) = min{ |α| : Φ(α) = σ }
where Φ is a Universal Turing Machine (UTM)
▪ NB - if we use a prefix-free UTM, we use the
symbol K_Φ(σ) instead – the differences are
mainly technical in proofs
▪ It is sensible, therefore, to say things like this:
K_PYTHON(σ)
14. Mathematics of Random Numbers: Complexity
What is the shortest program to write some string?
▪ What is XX in the following strings?
1. 0101 0101 0101 0101 0101 0101 0101 01XX
2. 0110 0110 0110 0110 0110 0110 0110 01XX
3. 1001 0100 0010 1011 0101 1010 0010 11XX
▪ This is equivalent to asking ‘how easy are these strings to
compute?’
Kolmogorov Complexity
▪ We define the Kolmogorov Complexity of a
string σ ∈ 2^<ω as follows:
C_Φ(σ) = min{ |α| : Φ(α) = σ }
where Φ is a Universal Turing Machine (UTM)
▪ NB - if we use a prefix-free UTM, we use the
symbol K_Φ(σ) instead – the differences are
mainly technical in proofs
▪ It is sensible, therefore, to say things like this:
K_PYTHON(σ)
15. Mathematics of Random Numbers: Complexity
What is the shortest program to write some string?
▪ What is XX in the following strings?
1. 0101 0101 0101 0101 0101 0101 0101 0101
2. 0110 0110 0110 0110 0110 0110 0110 0110
3. 1001 0100 0010 1011 0101 1010 0010 1100
▪ The first two strings are easily computable, whilst the last
one is random, which is hard for us to effectively compute
Kolmogorov Complexity
▪ We define the Kolmogorov Complexity of a
string σ ∈ 2^<ω as follows:
C_Φ(σ) = min{ |α| : Φ(α) = σ }
where Φ is a Universal Turing Machine (UTM)
▪ NB - if we use a prefix-free UTM, we use the
symbol K_Φ(σ) instead – the differences are
mainly technical in proofs
▪ It is sensible, therefore, to say things like this:
K_PYTHON(σ)
‘Python Kolmogorov Complexity’ of our strings:
1. print "01"*16
2. print "0110"*8
3. print "10010100001010110101101000101100"
▪ NB – our program adds between 8 and 11 extra characters per
string for the print primitive, but these are constant
▪ Thus we can produce an arbitrary length-n string in approx.
log₂(n) + c many characters of Python code for 1. and 2.
▪ String 3., however, can only be produced by printing string 3 itself
c-incompressibility and Kolmogorov Complexity
▪ For σ ∈ 2^<ω, c ∈ ℕ, we say that σ is
c-incompressible if K(σ) ≥ |σ| − c, where |σ| is
the length of σ in bits.
▪ The core idea of an arbitrary (but finite)
length string being c-incompressible is rooted
in the idea on the left
▪ Intuitively – we cannot do better producing
an incompressible string than to ‘print it’
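The idea can be sketched by comparing program length to string length for the slide's examples (Python 3 here, unlike the Python 2 prints above; lengths in characters rather than bits, which only changes the constant):

```python
s1 = "01" * 16
s3 = "10010100001010110101101000101100"

p1 = 'print("01"*16)'    # a short generating program for string 1
p3 = f'print("{s3}")'    # no better than printing string 3 itself

assert len(p1) < len(s1)                       # string 1 is compressible
assert len(p3) == len(s3) + len('print("")')   # program = string + constant
print(len(p1), len(s1), len(p3), len(s3))
```

For the periodic string the program stays short as the string grows; for the incompressible one the program grows in lockstep with the string, up to the fixed print overhead.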
16. Mathematics of Random Numbers: 1-Randomness
c-incompressibility and Kolmogorov Complexity
▪ For σ ∈ 2^<ω, c ∈ ℕ, we say that σ is
c-incompressible if K(σ) ≥ |σ| − c, where |σ| is
the length of σ in bits.
▪ The core idea of an arbitrary (but finite)
length string being c-incompressible is rooted
in the idea on the left
▪ Intuitively – we cannot do better producing
an incompressible string than to ‘print it’
1-Randomness
▪ σ ∈ 2^ω is 1-Random if there exists c s.t. every
finite initial segment of σ is c-incompressible
▪ I.e. there is no advantage to knowing the first
n bits of σ in order to get the (n+1)th bit
▪ This was influential on Yao’s ‘next bit’ test in
cryptography (Yao, 1982)
▪ This is the most ‘intuitive’ definition, and is
equivalent to Martin-Löf randomness (Schnorr)
Tl;Dr – Here are the core concepts we wanted to convey
▪ Entropy is the measure of the number of ‘arrangements’ or ‘possibilities’ in a given space and the
probability of each occurring
▪ Kolmogorov Complexity is the measure of the shortest string that, on input to some UTM,
produces the desired output string
▪ in a sense, it is a measure of the information contained within a string
▪ A string is random if previous bits do not give any clue as to the next bits of the output
▪ There are formal definitions of randomness, of which we have covered one (1-Random) but have
not covered any of the wonderful theory and results that surround this quirky concept
▪ Further reading: Downey & Hirschfeldt; Algorithmic Randomness and Complexity
17. </maths>
NB – if you didn’t understand all that, you probably shouldn’t
be writing Random Number Generators…
18. The Linux Random Number Generator – an overview of a decent design
Random Numbers on Linux
▪ This diagram shows the complexity of the Linux Kernel Random Number Generator
▪ Although it is relatively infeasible to use a full instance of this on a microcontroller, it shows the
level of complexity that is generally considered to be required
▪ Taken from: Lacharme, Röck, et al., ‘The Linux Pseudorandom Number Generator Revisited’
https://eprint.iacr.org/2012/251.pdf
[Diagram: entropy sources feed an input pool via entropy estimation and mixing;
entropy is transferred, with per-pool entropy counters and further mixing, to a
blocking pool whose output is /dev/random and a non-blocking pool whose output
is /dev/urandom]
19. How to Improve random() on Arduino (ATMega-based platforms)
Commonly Suggested Methods of Entropy Pooling
▪ Watchdog Timer (WDT) Jitter
▪ Thermal noise (Johnson-Nyquist Noise) through
ADC Jitter
▪ RSSI Noise (Quantum Shot noise)
▪ Thermistor ADC Jitter
Minimal Improvement Model
[Diagram: Entropy Pooling → seeds/reseeds → PRNG]
Better Candidates for microcontroller PRNG
▪ Salsa20 – used as builtin for libsodium – NaCl is
included in, e.g. the ESP32 esp-idf, by default
▪ ChaCha20 – somewhat slower, and loses
statistical performance if fewer rounds used
▪ Gimli Permutation engine – used in LibHydrogen,
developed by Bernstein et al.
20. Developing a unified testbed for assessing entropy sources
Outputs
▪ Test results for different entropy sources
▪ Implementation recommendations for RNG in IoT
▪ Generally better understanding of the challenges at
hand
What we have already to improve things in IoT
▪ Better PRNG’s and embedded system optimised
permutation engines, e.g. Gimli
▪ A decent theoretical understanding of what we
want out of our improvements
▪ Some suggestions for decent entropy sources as
errors here will now propagate
What we need
▪ A simple test setup into which we can ‘plug and
play’ code for getting data out of entropy sources
that are under test
▪ Some code that will get us reasonably good
samples and data and is independent of entropy
source
▪ An idea of the raw entropy of a source’s
properties: time to generate random bits,
whether there are ways to ‘cheat it’, etc.
Our proposed workflow
Entropy Gathering
• We need code to gather random bits
• We then unify our output to something uniform, e.g. 1 single bit
Entropy Pooling
• Does some basic level of entropy monitoring and/or whitening
• Detect and avoid sink states
PRNG Seeding
• When do we initially seed the PRNG?
• How often do we re-seed the PRNG?
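The three stages can be sketched in Python as a toy model (the EntropyPool class and its method names are illustrative, not from any real library):

```python
import random

class EntropyPool:
    # Toy model of the gather -> pool/whiten -> seed workflow.
    def __init__(self):
        self.pool = 0
        self.count = 0          # crude entropy counter, capped at 32 bits

    def stir(self, bit):
        # rotate the 32-bit pool left by one and XOR in the new raw bit
        self.pool = ((self.pool << 1) | (self.pool >> 31)) & 0xFFFFFFFF
        self.pool ^= bit & 1
        self.count = min(self.count + 1, 32)

    def seed_prng(self):
        # only hand out a seed once the counter says the pool is full
        assert self.count >= 32, "not enough entropy gathered yet"
        return random.Random(self.pool)   # stand-in for the device PRNG

pool = EntropyPool()
for bit in [1, 0, 1, 1, 0, 0, 1, 0] * 4:   # stand-in for gathered raw bits
    pool.stir(bit)
prng = pool.seed_prng()
```

The entropy counter is what lets the pooling stage refuse to seed (or re-seed) until enough raw bits have been mixed in, which is the sink-state protection the workflow asks for.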
21. Code Test Harness - grabbing fair random bits from an entropy source
int coin_flip(void) {
  return (GetMagic() & 1); /* LSB of the entropy source */
}
int getFairBit(void) {
  while (1) {
    int a = coin_flip();
    if (a != coin_flip()) { return a; }
  }
}
int get_rand_byte(void) {
  int n = 0, bits = 8; /* 8 iterations to fill a full byte */
  while (bits--) {
    n <<= 1;
    n |= getFairBit();
  }
  return n;
}
Kaminsky Basic Generator
Explanatory Notes
coin_flip() gets the LSB – our hopeful random bit –
from whatever is returned by the GetMagic()
function; this is a generic interface for getting bits
getFairBit() applies a basic von Neumann
extractor function to ‘whiten’ the bitstream by
omitting ‘00’ and ’11’ pairs. Thus, we only catch our
entropy source on a differential, and we wait when/if it
sinks to ‘000...’ or ‘111...’; this normalises the
probabilities s.t. we get as close as we can to
P(1) = P(0) = 0.5
get_rand_byte() is what we will call to seed our
PRNG; it consequently fills a full byte with
whitened random bits
Source: Kaminsky in ‘PoC||GTFO 0x01’,§2,
https://www.alchemistowl.org/pocorgtfo/pocorgtfo01.pdf
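The extractor is easy to sanity-check off-device. A Python sketch with a deliberately biased stand-in for GetMagic() (the 80/20 bias is an assumption for the demo; function names mirror, but do not reproduce, the C harness above):

```python
import random

random.seed(1)   # deterministic demo

def coin_flip():
    # deliberately biased stand-in for GetMagic() & 1: P(1) = 0.8
    return 1 if random.random() < 0.8 else 0

def get_fair_bit():
    # von Neumann extractor: keep the first bit of unequal pairs,
    # discard '00' and '11' pairs
    while True:
        a = coin_flip()
        if a != coin_flip():
            return a

bits = [get_fair_bit() for _ in range(10000)]
print(sum(bits) / len(bits))   # ~0.5 despite the 80/20 biased source
```

Since P(01) = P(10) for any fixed bias, the surviving bits are unbiased; the cost is throughput, which collapses as the source sinks towards all-zeros or all-ones.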
22. Analysis – RC Noise entropy gathering
int coin_flip(void) {
  digitalWrite(2, 1);
  delay(2);
  return (analogRead(A6) & 1);
}
RC Noise – coin flip
RC Noise Entropy
▪ Addition of some basic components – a
capacitor and resistor across from A2 to A6,
with wire to GND
▪ Only tested for thru-hole components (see the
dirty hack on the left)
▪ Method:
▪ We drive the pin high, give the component
2ms to warm up slightly (1st run)
▪ Read the LSB from the ADC signal read
▪ Observations:
▪ Slow to gather entropy, as it relies on
thermal properties of the components
▪ Provides good entropy
▪ Requires two extra components in BOM
▪ Conclusion – Not a perfect solution, and not the
best return on investment from components
added in order to make it work
▪ Might be feasible if a reliable way of
storing/recovering the entropy pool is in play
23. Analysis – WDT jitter entropy gathering
#include <Entropy.h>

void setup()
{
  Serial.begin(9600);
  Entropy.initialize();
}

void loop()
{
  for (int i = 0; i < 10; i++) {
    randomSeed(Entropy.random(WDT_RETURN_BYTE));
    for (int j = 0; j < 1000; j++) {
      Serial.println(random());
    }
  }
}
WDT Jitter – library function by pmjdebruijn
WDT Jitter Entropy
▪ An oft-recommended implementation of entropy
gathering for AVR’s (cf. Internet fora)
▪ Has a flaw!! Locked value on device reset (see below)
▪ This is most likely due to the jitter being reset with the
device (essentially a full clock synchronisation)
▪ This entails that the RNG seed is predictable on device
reset – meaning a DoS and WD trigger can be used to
predict PRNG output
▪ Conclusion – would not recommend this for general use
Source –
https://github.com/pmjdebruijn/Arduino-Entropy-Library
24. Analysis – Thermistor Noise entropy gathering
int GetTemp(void)
{
unsigned int wADC;
ADMUX = (_BV(REFS1) | _BV(REFS0) | _BV(MUX3));
ADCSRA |= _BV(ADEN); // enable the ADC
delay(20); // wait for voltages to
// become stable.
ADCSRA |= _BV(ADSC); // Start the ADC
// Detect end-of-conversion
while (bit_is_set(ADCSRA,ADSC));
// Reading register "ADCW" takes care of
// how to read ADCL and ADCH.
wADC = ADCW; // Use this line after ADC is up
return (wADC);
}
Thermistor Noise – get temperature byte (mask for LSB in coin_flip())
Internal Thermistor Noise Entropy
▪ Generates random bits from the
noise in the LSB of the internal
thermistor ADC
▪ Method:
▪ Uses a ‘hidden’ A8 ADC ‘pin’
reserved for internal
thermistor
▪ Read this pin to get a byte of
temperature data
▪ Observations:
▪ Fairly quick to generate
entropy compared to RC ex.
▪ Requires no extra components
▪ Fun Fact – can be sped up by
breathing on the IC
▪ Conclusion – A very easy to
implement solution, requiring no
extra parts; would recommend!
Source - https://playground.arduino.cc/Main/InternalTemperatureSensor
25. Analysis – RSSI Noise entropy gathering
...
byte LoRaClass::random()
{
return readRegister(REG_RSSI_WIDEBAND);
}
...
int coin_flip(void) {
int little_bit;
digitalWrite(2,1);
delay(2);
return little_bit = (LoRa.random() & 1);
}
...
RSSI Noise – get RSSI byte and extract LSB
RSSI Noise
▪ Received Signal Strength Indicator (RSSI)
▪ Accurate to ~0.1 dB measurement of the strength of
a signal – as such, it fluctuates continually in the LSB
▪ Method:
▪ Communicate with radio IC/internal core
to get current RSSI value
▪ This can be done over SPI, I2C, etc.
▪ Take LSB of returned value
▪ Observations:
▪ Fast, and very reliable
▪ Some indications of repetition but this
was hard to reproduce reliably
▪ Conclusion – Another quite easy
implementation, but one requiring extra PCB
parts and interaction/setup of a radio. Certainly
a good candidate to be mixed with other
sources
▪ This should be tested for tolerance to RFI
Library Source: https://github.com/sandeepmistry/arduino-LoRa
NB – this test was performed on a Heltec ESP32-based LoRa
development board
26. Overall Findings – Preliminary analysis and overview of findings of entropy sources
(Sources compared by speed – relatively fast vs. relatively slow – and by whether
extra components are required to implement.)
▪ WDT Jitter (fast, no extra components) – not unreasonably slow, but the
phase lock of the clocks on device reset renders this approach flawed.
Not recommended
▪ RC Noise (slow, extra components required) – slow but reliable random bits
produced by adding 2 components to a microcontroller. Somewhat recommended
▪ RSSI Noise (fast, radio hardware required) – very effective, but requires a
radio core or radio IC to be present, and that this can give raw data output.
Somewhat recommended
▪ Internal Thermistor Noise (fast, no extra components) – very effective,
fast, and with no extra components required, but does require access to the
thermistor. Recommended
Resources:
▪ [githublab link to follow!!]
▪ https://medium.com/@LargeCardinal/random-numbers-on-the-arduino-fc34944615b
▪ https://medium.com/@LargeCardinal/random-numbers-on-the-arduino-97f325820284
▪ https://medium.com/@LargeCardinal/what-do-we-want-all-t3h-arduino-randomz-630dd2aceea7
▪ https://csrc.nist.gov/csrc/media/events/random-bit-generation-workshop-2016/documents/presentations/session-i-1-john-kelsey-presentation.pdf - SP800-90B Entropy Sources Spec
27. Closing Remarks – defining future work
Our tests need to be carried forward
▪ The problem is not limited to AVR libc (see right)
▪ We need to test more microcontrollers in a like-for-like
manner to enable meaningful comparison
▪ We need to look to improving RNG across the board,
as we move towards greater permeation of tech such
as IoT into our lives
▪ Manufacturers should be held to account should their
claims be found to be false
▪ Push for “Not CSPRNG” warnings in datasheets
▪ Developers need to be made aware – make your
cryptographic decisions early in IoT!!
▪ Bad cryptography decisions are very hard to fix later
if you suddenly find you need more crypto than
your hardware supports!
Future Work
▪ Analysis of open source RNG sources such as ‘Infinite Noise’ (thanks for the dongle!) and assessment of the design’s
feasibility for use in IoT or embedded environments
▪ Extend the test rig to include example code for entropy estimation, pool storage/restoration, etc.
▪ Use these results to inform projects like LibHydrogen to make embedded RNG much more effective and safe