Raspberry Pi Character LCD Degree Symbol

It is inevitable that you will want to write the degree symbol to your 16×2 character LCD once you have it wired to the Pi. My son wired it up, and then wanted to display the degree symbol. In the past I have solved that problem for the Arduino using the Adafruit character LCD Arduino library createChar() function. Relatively painless for the Arduino. Unfortunately, the Adafruit character LCD library for the Pi's GPIO (the non-i2c version) does not have an equivalent of the createChar() function. Sad.

The solution turns out to be easy. You rely on the fact that the LCD already has a degree symbol built in, and then use the write4bits() function to display it. There are other characters available; I found mine here. Here is our code.

# Load the 1-wire kernel modules for the temperature sensor
# (the Adafruit LCD library import and lcd setup are assumed above)
import os

os.system('modprobe w1-gpio')
os.system('modprobe w1-therm')

temp_c = 23.1
temp_f = 32. + temp_c*9./5.

lcd.clear()
lcd.message("temp C")
lcd.write4bits( 0xDF, True)  # 0xDF is the LCD's built-in degree symbol
lcd.message(": %.1f\n"%temp_c)

lcd.message("temp F")
lcd.write4bits( 0xDF, True)
lcd.message(": %.1f"%temp_f)



That Noisy Fan–Calibrated Measurement

I am working with the Freetronics microphone module, which is described as having a sensitivity of “-40 dB typical”—let us assume it is dB (SPL) referenced to 20 micropascals root mean square (rms) at 1 kHz. When I think of an electronic element’s sensitivity, though, I’m thinking volts per micropascal, and this is not provided.

The microphone’s schematic suggests the SPL measurement is the rms average over 3 milliseconds, and that the signal is proportional to rms pressure level. This suggests a log scale for display (a change from previous work).
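For reference, the convention assumed above is the standard one: SPL in decibels is twenty times the base-10 log of the rms pressure over the 20 µPa reference. A minimal sketch:

```python
import math

P_REF = 20e-6  # reference pressure, 20 micropascals rms

def spl_db(p_rms):
    """Sound pressure level in dB (SPL) for an rms pressure in pascals."""
    return 20.0 * math.log10(p_rms / P_REF)

# The reference pressure itself is 0 dB SPL; 1 Pa rms is about 94 dB SPL.
print(spl_db(20e-6))  # 0.0
print(spl_db(1.0))    # ~93.98
```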

The upshot is that the fan generates about 62 dBA, but to get to that conclusion I had to perform a spectral correction to match measurements from a better-calibrated sensor.

Remember that I made the measurements with the RIMU data logger I built. That data logger has a digital low-pass filter on the microphone SPL. The low-pass filter has much too slow a response, but the basic result is OK. The RIMU, shown in the next picture, is the instrument.

I borrowed an SPL meter from my father (thanks!). It is, unfortunately, a C-weighted instrument, measuring in dBC. The C-weighting is occasionally a very useful measurement of sound level, but most measurements are done A-weighted, which approximates human hearing. My challenge is to convert a measurement made with an unweighted microphone in arbitrary units to an A-weighted measurement in dBA. The answer is to take a measurement with the instrument in dBC, record the sound with a sound recorder, and figure out what converts C-weighting to A-weighting for this signal.

In the spectra below you can see the spectrum recorded, without weighting, by the Zoom recorder. I then applied an A weighting and a C weighting. What’s important is the conversion between the rms value for A and the rms value for C, which is a 6 dB correction in one case and a 13 dB correction in the other.

So, to rehash the steps:

• Record a sound level with the RIMU
• Measure a reference condition in dBC with the SPL meter
• Record a 15 second period with the Zoom
• Apply the A weighting to the Zoom record
• Apply the C weighting to the Zoom record
• Find the difference in dB between the A and C weighted records
• Assume the dBC weighting corresponds to the measurement made with the RIMU, and adjust the RIMU values so that they match the C-weighted measurement from the SPL meter
• Apply the C-to-A correction to the RIMU measurement.
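The weighting steps above can be sketched in the frequency domain: apply the standard A and C weighting curves (the IEC 61672 formulas) to the recording's power spectrum, and the difference between the two overall levels is the C-to-A correction.

```python
import math

def a_weight_db(f):
    """IEC 61672 A-weighting in dB at frequency f (Hz)."""
    f2 = f * f
    ra = (12194.0**2 * f2**2) / (
        (f2 + 20.6**2)
        * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
        * (f2 + 12194.0**2)
    )
    return 20.0 * math.log10(ra) + 2.00

def c_weight_db(f):
    """IEC 61672 C-weighting in dB at frequency f (Hz)."""
    f2 = f * f
    rc = (12194.0**2 * f2) / ((f2 + 20.6**2) * (f2 + 12194.0**2))
    return 20.0 * math.log10(rc) + 0.06

def c_to_a_correction(freqs, power):
    """dB difference between the A- and C-weighted overall levels of a
    power spectrum (power[i] at freqs[i], arbitrary linear units)."""
    pa = sum(p * 10.0 ** (a_weight_db(f) / 10.0) for f, p in zip(freqs, power))
    pc = sum(p * 10.0 ** (c_weight_db(f) / 10.0) for f, p in zip(freqs, power))
    return 10.0 * math.log10(pa / pc)

# Both weightings pass 1 kHz essentially unchanged
print(a_weight_db(1000.0), c_weight_db(1000.0))
```

For a signal dominated by low frequencies, like fan rumble, the correction comes out strongly negative, which is why the dBA figure sits well below the dBC reading.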

Pretty epic pain, but at least I have the measurements. In the future I will recode the RIMU to take short A-weighted and C-weighted snapshots, and then calibrate the RIMU on the dBA record. Look for a follow-up post far in the future.

Arduino Analog Sample Rate

The Arduino Uno is not the ultimate signal processing machine, but it can do some light-duty work on burst data. I had trouble finding the maximum sample rate the Uno can support, which is critical for this kind of work.

With native prescale

• Samples, type casts into floats, and stores in an array at about 8929 Hz, or 112 microseconds per read-and-store
• Samples and stores into an array of unsigned integers (16 bit) at about 8930 Hz, or 112 microseconds per read-and-store

We can step this up quite dramatically by setting the prescale to 16:

• Samples, type casts into floats, and stores in an array at about 58600 Hz, or 17 microseconds per read-and-store
• Samples and stores into an array of unsigned integers (16 bit) at about 58606 Hz, or 17 microseconds per read-and-store

Pretty close to a 60 kHz sample rate, more than adequate for audio sampling. Of course, the Arduino doesn’t have enough memory to be a serious audio processor, but it is pretty good.

The forums discuss this, and arrive at a similar conclusion.
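The measured rates also line up with the ADC clock arithmetic: a normal conversion on the ATmega328P takes about 13 ADC clock cycles, and the ADC clock is the 16 MHz system clock divided by the prescale. A quick sanity check:

```python
F_CPU = 16_000_000          # Uno system clock, Hz
CYCLES_PER_CONVERSION = 13  # ATmega328P ADC, normal conversion

def max_sample_rate(prescale):
    """Theoretical ceiling on the analogRead() rate for a given ADC prescale."""
    return F_CPU / prescale / CYCLES_PER_CONVERSION

print(max_sample_rate(128))  # default prescale: ~9615 Hz
print(max_sample_rate(16))   # ~76923 Hz
```

The measured 8929 Hz and 58.6 kHz sit below these ceilings because the loop, type cast, and array store add overhead on top of the conversion itself.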

#define NSAMP 5000

void setup(){
  Serial.begin( 57600);

  float array_float[ NSAMP];       // float
  unsigned int array_int[ NSAMP];  // 16 bit
  unsigned long int micsbegf, micsendf, micsbegi, micsendi;

  for( int i = 0; i < 2; i++){
    if ( i == 1){
      // Set prescale to 16, and retry
      ADCSRA = (ADCSRA & 0xF8) | 0x04;
    }

    // Record floats (extra time for type conversion presumably)
    micsbegf = micros();
    for( int j = 0; j < NSAMP; j++){
      array_float[ j] = (float) analogRead( A0);
    }
    micsendf = micros();

    // Record unsigned integers
    micsbegi = micros();
    for( int j = 0; j < NSAMP; j++){
      array_int[ j] = analogRead( A0);
    }
    micsendi = micros();

    if( i == 1){
      Serial.println("with prescale set to 16");
    }
    Serial.print("recorded ");
    Serial.print( NSAMP);
    Serial.print(" float samples in ");
    Serial.print(micsendf - micsbegf);
    Serial.println(" usec");

    Serial.print("recorded ");
    Serial.print( NSAMP);
    Serial.print(" unsigned integer samples in ");
    Serial.print(micsendi - micsbegi);
    Serial.println(" usec");
  }
}

void loop(){
  delay( 10);
}


Glue on Paper

Last summer I assigned my son, then six years old, to do an “inquiry.” Inquiries are guided studies of something, and empirical inquiries are like mini science fair projects. When I asked him to pick a topic, he selected “how does glue work?” Exploring this at the chemistry level seemed like too big a reach for a 6-year-old, so we morphed the question into “how well do different kinds of glue work?”

It turned out to be a really fun experiment. We gathered all the not-too-toxic glues from around the house. The oxiclean container has methyl cellulose inside.

He glued several trials of each paper together as crosses. Then, he brought the ends of the paper together and clamped them, one piece of paper to a board hanging over the edge of the desk, and the other piece to a bucket. In the picture below you can see the test setup. The paper is like two U-shapes that are glued where they come together. They don’t loop inside one another; they are held together only with glue. The glue joint area was approximately 1 inch square.

He added weight to the bucket until the paper failed, and then weighed the bucket.

The glues and results are in the table below. The “r” indicates that the paper ripped, otherwise the glue joint failed across the surface. The best and worst are boldface.

 #   Glue Name                       Average Weight Held (lbs.)   Trial 1   Trial 2     Trial 3
 1   Elmer’s Glue Stick              1.52                         1.096     1.234       1.222
 2   PVA:Methyl Cellulose 1:1 vol.   2.37                         3.064     2.888       2.062
 3   Elmer’s Glue-All                3.39                         3.270     3.944 (r)   2.928
 4   Gorilla Glue                    2.27                         5.438     3.180       0.746
 5   Methyl Cellulose                4.20                         4.468     4.392       3.752
 7   PVA                             3.65                         3.966     2.768       3.824
 8   Titebond III                    4.25                         3.824     5.180       3.576
 9   DAP Strongstick                 1.54                         1.460     1.562       1.576
 10  Cyanoacrylate (Super Glue)      4.15                         5.180     3.966 (r)   3.284 (r)

Most of these results fit my expectations. There were a few surprises though.

• Gorilla glue (polyurethane) was typically quite strong, but once was very poor.
• Methyl cellulose is usually considered a weak glue, but it performed almost as well as the best of them. Maybe it is strong in our dry climate and weaker in more humid climates?
• Glues that soak in well (super glue and methyl cellulose) seem to have an advantage.

The test was essentially a peel strength test. In many ways I think the setup was quite good. The stresses on the glue joint are fairly typical of how glued materials are stressed in use.

Does the Bathroom Fan Do Anything?

Other than make noise, that is. I built the most recent version of my Arduino-based data logger (the RIMU), and was looking for something to log. I’ve had this question for many, many years—does the bathroom fan actually do anything? It makes noise, deafening Niagara falls thundering noise. The mirror still gets foggy, though, and condensation still forms on the fixtures. Is there something good about all that noise?

Investigating this with the RIMU is more difficult than you might think. First, it is hard to control the variables. I recorded about seven days’ worth of data before I got two records with baselines similar enough to separate the effects of the fan. Second, the analysis has a great many measurements to work through.

Our bathroom is modest in size, with a counter and sink on one side and the shower on the opposite. The RIMU was sitting on the counter.

Humidity

The plot below shows the relative humidity over time. At approximately time zero I turned on the light. Shortly thereafter I turned on the shower, and the humidity began to rise. Weirdly, the humidity rose in a very similar fashion for the first few minutes. Probably the separation is when I climbed into the shower. I speculate that the dip at six minutes in the green “fan off” curve is climbing into the shower too. Opening the shower door, it seems, sets up different air currents.

One conclusion, clearly, is that the fan actually keeps the humidity under about 90%, instead of letting it rise to 100%. Another conclusion is that, compared to leaving the bathroom door open, the fan is really ineffective. A third conclusion is that the sound and light data are actually more interesting than the humidity data.

Light

The TSL chip that I use to sense light is quite a wonderful little device. It has a controllable integration time. The idea is that if there is a low light level the sensor can record longer to provide a better estimate of the actual level. In the RIMU I try to auto-tune the integration time: if the reading is very low, I increase the integration time; if it is very high, I decrease it. Unfortunately, when I programmed the RIMU I did not realize that the TSL library was providing measurements that had to be corrected for the integration time (counts, not lux).
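A sketch of the correction I should have applied, assuming counts scale linearly with integration time. The 402 ms reference matches the TSL2561's longest integration window, but treat the exact scaling here as an illustration rather than the library's actual conversion:

```python
REFERENCE_MS = 402.0  # longest TSL2561 integration window

def normalize_counts(raw_counts, integration_ms):
    """Scale raw sensor counts to what the reference integration time
    would have produced, assuming counts grow linearly with time."""
    return raw_counts * (REFERENCE_MS / integration_ms)

# A reading of 100 counts over a 101 ms window is equivalent to
# ~398 counts over the full 402 ms window.
print(normalize_counts(100, 101.0))
```

With this normalization the spike at an integration-time change disappears, because consecutive readings are put on the same scale.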

That brings us to the first interesting observation. In the blue “fan on” trace, there is a really whopping spike in the light level. What is actually happening is that there is a recorded measurement before the TSL integration time is reduced.

The second cool observation is that you can see the fluorescent lights increase their output as they warm up. I knew, intellectually, that this was happening—my scanner won’t scan until its fluorescent tube has warmed up. I always thought that was a 20 second process, not a three minute process.

The third cool observation is the visible background in the green “fan off” line. I took that recording on a Saturday morning, and the natural daylight trend is visible in the background.

Sound

The RIMU code runs in a constant loop, as fast as it can. The downside is that the sample rate on a polled sensor is whatever it can provide—not a specific value that you wish. RIMU records a sample to the SD card about every 10 seconds. In between it takes a reading of the sound pressure level as often as possible, and then averages them. The average is a moving average implemented with a single pole infinite impulse response filter. This is the simplest filter of all, something like

SPLaverage = SPLaverage*0.99 + measurement*0.01

The first observation is that the moving average is not very cleverly implemented. The response time is nearly 3 minutes—something like 10 seconds would have been better.
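The coefficient can be chosen from the desired response time. For a single-pole filter sampled every dt seconds, a time constant tau calls for a coefficient of 1 - exp(-dt/tau), roughly dt/tau when small. A sketch, where the 0.1 s loop interval is an assumed value, not a RIMU measurement:

```python
import math

def iir_coefficient(dt, tau):
    """Coefficient a for y = (1 - a)*y + a*x so that the filter has
    time constant tau, given a sample interval dt (same units)."""
    return 1.0 - math.exp(-dt / tau)

# If the loop samples every ~0.1 s, a 10 s response wants a coefficient
# near 0.01; the same 0.01 with a slower loop gives a much slower filter.
print(iir_coefficient(0.1, 10.0))  # ~0.00995
```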

The second interesting thing to observe is how unimaginably loud the fan is. The fan more than doubles the sound level compared to the water flow. The noise during shaving is due to turning on the faucet, which was very close to the RIMU.

Temperature

The temperature in both trials increased an average of about 2 degrees Fahrenheit. The fan may provide some mixing of the air, that keeps the temperature more even—the blue “fan on” line rises much less than the green line.

Conclusion

Yep. The fan does something. If controlling humidity is really important (is it?) then you should just leave the door open. If you can’t leave the door open, turn on the fan.

Revised Raspberry Pi TrueCrypt Benchmark

Revised March 31, 2013 with an updated benchmarking approach that uses actual access to the mounted volume. The new results show no appreciable sensitivity to the hash, which is as expected. The numbers are for encryption only (write); I have not pursued read.

 Hash Algorithm   Encryption Algorithm   Rate (MB/s)
 SHA-512          Twofish                2.8
 Whirlpool        Twofish                2.8
 RIPEMD-160       Twofish                2.8
 SHA-512          Serpent                2.6
 Whirlpool        Serpent                2.6
 RIPEMD-160       Serpent                2.6
 Whirlpool        AES                    2.1
 RIPEMD-160       AES                    2.1
 SHA-512          AES                    2.1
 SHA-512          Twofish-Serpent        2.0
 Whirlpool        Twofish-Serpent        2.0
 RIPEMD-160       Twofish-Serpent        1.9
 SHA-512          AES-Twofish            1.6
 RIPEMD-160       AES-Twofish            1.6
 Whirlpool        AES-Twofish            1.6
 Whirlpool        Serpent-AES            1.6
 SHA-512          Serpent-AES            1.6
 RIPEMD-160       Serpent-AES            1.6
 Whirlpool        AES-Twofish-Serpent    1.3
 Whirlpool        Serpent-Twofish-AES    1.3
 SHA-512          Serpent-Twofish-AES    1.3
 SHA-512          AES-Twofish-Serpent    1.3
 RIPEMD-160       Serpent-Twofish-AES    1.3
 RIPEMD-160       AES-Twofish-Serpent    1.3

Shell Script for Timing


#!/bin/bash

# Create a file of random elements, needs to be at least 300 bytes
dd if=/dev/random of=random bs=512 count=1

# Iterate over the hash functions
for HASH in RIPEMD-160 SHA-512 Whirlpool
do
  # Iterate over the available encryption algorithms
  for ENCALG in AES Serpent Twofish AES-Twofish AES-Twofish-Serpent Serpent-AES Serpent-Twofish-AES Twofish-Serpent
  do
    # Write the algorithms to the log
    echo "Algorithms: $HASH $ENCALG" >> log
    # Create the volume
    truecrypt -c /home/pi/test.tc --filesystem=fat --size=10485760 \
      --encryption=$ENCALG -p ppp --random-source=random --hash=$HASH --volume-type=normal --non-interactive
    # Mount the volume
    truecrypt --non-interactive -p ppp -m nokernelcrypto test.tc /home/pi/tcvol
    # Time a write to the mounted volume
    (time ./timeit) 2>> log
    truecrypt -d /home/pi/tcvol
    # Erase the created file
    rm test.tc
  done
done



Timed Routine


dd if=/dev/zero of=tcvol/test bs=5242880 count=1 &> /dev/null

sync



Python Reprocessor


import sys

fid = open( sys.argv[1], 'r')
lines = fid.readlines()
fid.close()

tsecs = None
algo = None
while len( lines) > 0:
    line = lines.pop(0)
    lls = line.strip()

    if lls.startswith( 'Algo'):
        # If we already have a tsecs, then print
        # the last elements
        toks = lls.split()
        if tsecs is None: # first record
            algo = ",".join( toks[1:3])
        else:
            print algo,",",tsecs
            algo = ",".join( toks[1:3])
    elif lls.startswith( 'real'):
        toks = lls.split()
        toks = toks[-1].split('m')
        tsecs = float( toks[0])*60 + float( toks[1].replace('s', ''))

print algo,",",tsecs



Raspberry Pi TrueCrypt Benchmark

Note: The results in this post have been improved with more accurate values at Revised Raspberry Pi TrueCrypt Benchmark.

I recently acquired a Raspberry Pi model B 512 MB from the excellent people at Adafruit. I am interested in it as a small computer for basic text processing, and am curious about its performance in consumer crypto. One part of the security of the Pi, or any modern computer, is disk encryption.

My disk encryption of choice is TrueCrypt, mainly because it is cross-platform. That it is also free and open source is a nice benefit, though the TrueCrypt License 3.0 may not rise to Stallman’s standard. I found several posts from people who compiled TrueCrypt on the RasPi, and it is relatively trouble-free. At the bottom of the post are my notes on how I did the install and a script that performs the benchmarking.

While I don’t understand the relationship between the hashing function and the encryption function, I expected that speed would be unrelated to the hash algorithm. This was not what I experienced, as shown in the data below.

Performance, in MB per second, as TrueCrypt reports it for initializing a 10,485,760 byte (10 MiB) file.

 Hash         Encryption             Speed (MB/s)
 RIPEMD-160   Twofish                3.4
 RIPEMD-160   Serpent                3.0
 RIPEMD-160   AES                    2.5
 SHA-512      Twofish                2.5
 RIPEMD-160   Twofish-Serpent        2.3
 SHA-512      Serpent                2.2
 SHA-512      AES                    2.0
 RIPEMD-160   AES-Twofish            2.0
 RIPEMD-160   Serpent-AES            1.9
 SHA-512      Twofish-Serpent        1.8
 SHA-512      AES-Twofish            1.6
 Whirlpool    Twofish                1.6
 RIPEMD-160   AES-Twofish-Serpent    1.5
 Whirlpool    Serpent                1.5
 SHA-512      Serpent-AES            1.5
 RIPEMD-160   Serpent-Twofish-AES    1.5
 Whirlpool    AES                    1.4
 SHA-512      AES-Twofish-Serpent    1.3
 SHA-512      Serpent-Twofish-AES    1.3
 Whirlpool    Twofish-Serpent        1.3
 Whirlpool    AES-Twofish            1.2
 Whirlpool    Serpent-AES            1.2
 Whirlpool    Serpent-Twofish-AES    1.0
 Whirlpool    AES-Twofish-Serpent    0.934

The upshot is that all of these are pretty slow, and all of them would be essentially unnoticeable for basic text file (or RTF) work. I wouldn’t want to do image or audio processing with this encryption, but then I wouldn’t want to do that on a Pi anyway.

Method of Speed Assessment

I wanted a non-interactive way to perform the test, so I wrote this script. I am relying on the data reported by the TrueCrypt volume creation process. Because TrueCrypt writes a status to the terminal it produces output that is dreadful to process, so I wrote the little python script to produce a CSV from the log.

The test was performed with an ARMv6-compatible processor rev 7 (v6l) at 464.48 BogoMIPS. The OS is Debian GNU/Linux 7.0 (Wheezy), installed as the 2013-02-09-wheezy-raspbian image. I built TrueCrypt 7.1a from source along with wxWidgets 2.8.12 (also built from source) and pkcs version 11.2.

Shell Script

#!/bin/bash

# Create a file of random elements, needs to be at least 300 bytes
dd if=/dev/random of=random bs=512 count=1

# Iterate over the hash functions
for HASH in RIPEMD-160 SHA-512 Whirlpool
do
  # Iterate over the available encryption algorithms
  for ENCALG in AES Serpent Twofish AES-Twofish AES-Twofish-Serpent Serpent-AES Serpent-Twofish-AES Twofish-Serpent
  do
    # Write the algorithms to the log
    echo "Algorithms: $HASH $ENCALG" >> log
    # TrueCrypt will report the performance in the output
    truecrypt -c /home/pi/test.tc --filesystem=fat --size=10485760 \
      --encryption=$ENCALG -p ppp --random-source=random --hash=$HASH --volume-type=normal --non-interactive >> log
    # Erase the created file
    rm test.tc
  done
done

Python Reprocessor

import sys

fid = open( sys.argv[1], 'r')
lines = fid.readlines()
fid.close()

speed = None
algo = None
while len( lines) > 0:
    line = lines.pop(0)
    lls = line.strip()

    if lls.startswith( 'Algo'):
        # If we already have a speed, then print
        # the last elements
        toks = lls.split()
        if speed is None: # first record
            algo = ",".join( toks[1:3])
        else:
            print algo,",",speed
            algo = ",".join( toks[1:3])
    elif lls.startswith( 'Done'):
        toks = lls.split()
        speed = ",".join(toks[-5:-3])

print algo,",",speed

Refigured Data Logger

The RIMU data logger I built last year received a major upgrade. I added a sound pressure level meter to learn how noisy it is and updated the case to make the switches easy to find, label, and operate.

It took a long time to get it working, mainly because I forgot that using the Adafruit Data Logger Shield occupies digital pins 10-13. Stupid mistakes were overcome and it is working now.

In the following picture you can see the sensor pod to the right and the main processor to the left. The reddish cable coil at the top right of the screen is the thermocouple, which I use to log high temperatures like the oven, barbecue grill, or smoker. The black cable running off-screen at the bottom right is for the thermistor. The thermistor is suitable for temperatures up to about 250 F, and is on thin enough wire that it can be placed outdoors while the electronics are sheltered indoors.

The display always shows the date and time—useful to confirm that logged data will be correctly time tagged. In the upper right of the display is a logging indicator (looks like L hugging a degree symbol). The degree symbol changes to a 1 when the logger is recording data.

The bottom row of the display shows the measurement from one sensor at a time. Hitting the mode button will step through all the modes. Unfortunately, this interface is awful. The Arduino is doing a lot, and it checks the button state as often as it can; however, it is common to press the mode button to no effect because the Arduino did not happen to check the pin at the time it was pressed.

There are 7 display states:

• Thermocouple temperature (degrees C)
• Barometric pressure (millibars) and temperature (C) from the BMP085
• Visible light luminosity (lux) from the TSL2561
• Infrared luminosity (lux) from the TSL2561
• Sound pressure level (uncalibrated units, an integer between 0 and 1023)
• Relative humidity (%) and temperature (C) from the DHT22
• Thermistor temperature (F)

The relationship between the display states and the sensor pod is straightforward.

I’m happy with this design, but there are a few things I would change. First, I would like the state control to be a physical, mechanical switch. I could easily accomplish this with a 7-position (or more) rotary switch: I would wire resistors between each position, and then use an analog pin to read the state directly. A variable resistor with coarse detents would work too. So far I have not been able to find a variable resistor with coarse detents (only ones with very fine detents), and rotary switches are way too big for the case.
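The resistor-ladder idea works out simply: with equal resistors between the switch positions, each position taps an evenly spaced fraction of the supply, so one analog pin can recover the state. A sketch of the arithmetic, with hypothetical values (5 V supply, 10-bit ADC):

```python
def ladder_voltages(n_positions, vcc=5.0):
    """Expected analog voltages for a rotary switch wiper on a chain of
    equal resistors between VCC and ground: position k taps k/(n-1) of VCC."""
    return [vcc * k / (n_positions - 1) for k in range(n_positions)]

def position_from_adc(adc, n_positions, adc_max=1023):
    """Map a 10-bit analogRead()-style value to the nearest switch position."""
    return round(adc * (n_positions - 1) / adc_max)

# 7 positions land at roughly 0.00, 0.83, 1.67, 2.50, 3.33, 4.17, 5.00 V
print(ladder_voltages(7))
print(position_from_adc(512, 7))  # mid-scale reads as position 3
```

The nearest-position rounding gives wide tolerance bands, so resistor variation of a few percent would not cause misreads.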

The other major thing I’d like to change is the physical interface between the Arduino case and the sensor pod. I would replace the terminal blocks with a finer pitch set that would fit on the board in a row and have enough room for labels. I would then use a bundle of 22 ga. stranded wires in different colors so that the sensor pod would be more flexible.

In software I need to make a change so that there is positive feedback to the user if there is an error writing to the log file. This should be easily accomplished, and is likely the next software change.

Smoking Good Ribs—Temperature Management

I have smoked ribs about four times in the last couple years. I’ve been fortunate enough to have my father’s MasterBuilt smoker. It is a box about the size of a dorm fridge, and can readily smoke two racks, each cut in half. It has electronic temperature management and is well insulated. I’m fond of the smoker, and totally committed to the ribs from it.

At the bottom of the smoker is a pan, like a porkpie hat upside down. MasterBuilt’s instructions say to put water in the pan. Other sources suggest putting something solid in the pan instead. The argument is that boiling water is around 212°F (205°F  at my altitude), and that the smoker’s temperature will not rise above that. A very reasonable argument. Consequently, I have used bricks wrapped in aluminum foil in the bottom of the pan.

What does the thermal mass do, whether bricks or water? One idea is that it causes the temperature to be more even, especially for it to bounce back faster after the door has been opened. For this to work well the thermal mass should a) be quite conductive, and b) have a high specific heat. Conductivity makes the heat inside the mass available to the air in the smoker. High specific heat means that a modest amount of mass can hold lots of heat.

 Material   Conductivity (W/m K)     Specific Heat (kJ/kg K)
 Brick      0.8                      0.9
 Water      0.58 (but can convect)   4.2

It is hard to beat water for specific heat, and convection within the water pool should make the effective conductivity much higher than the intrinsic conductivity.
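Putting the table's numbers on a per-volume basis makes the comparison concrete. The brick density here (~1900 kg/m³) is an assumed typical value, not a measurement:

```python
# Volumetric heat capacity = density * specific heat: how much energy a
# liter of moderator stores per degree of temperature change.
materials = {
    # name: (density kg/m^3, specific heat kJ/(kg K))
    "water": (1000.0, 4.2),
    "brick": (1900.0, 0.9),  # density is an assumed typical value
}

for name, (rho, cp) in materials.items():
    kj_per_liter_k = rho * cp / 1000.0  # convert m^3 to liters
    print(name, kj_per_liter_k, "kJ per liter per K")
# water stores roughly 2.5x the heat per unit volume of brick
```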

Which thermal mass provides faster temperature bounce-back? It turns out they are functionally identical. The following figure shows four trials, two with water and two with brick moderator. During each trial I held the door open for one minute. There were differences in breeze and in ambient temperature (not analyzed). However, the chart shows that in all cases the temperature returned to its pre-opened value in the same amount of time, around 8 minutes. Rebound time is not much affected by the choice of moderator.

Which thermal mass provides more even temperature? This is harder to measure because I opened the door for the rebound test. Nevertheless, by removing the open-door data I can see the thermostat’s temperature management wave, and estimate the variability.

The next figure overlaps these two sets. In both sets the temperature swing is quite high, either 13°F or 17.6°F as measured by twice the sample standard deviation. Water provides more even temperature, but one that still varies quite widely.

The chart reveals the tie-breaker fact: the average water temperature is almost 10°F closer to the thermostat’s set point. The set point is 225°F, as recommended for ribs. The thermocouple was positioned at the top 1/4 of the cooking chamber, in the center front/back and left/right. This is where the top slab of ribs goes.

Presumably the bottom half of the cooking chamber is cooler; rotating the food top-to-bottom may help. The full data record is plotted in the final chart. The warm-up and cool-down bracket the data set. The record started with brick moderator, until about 19:05, when I replaced the brick with boiling water. The deep dips in temperature come from opening the door.

The next experiment is to instrument the box at several heights in the column, but that is low priority. Spring is coming and pruning must be attended.

RIMU Calibration

I turned the freezer up to 7. I’m sure yours goes to 11, but I am not so lucky. After being turned up to 7, the resulting mean temperature is still too high at 18.8°F. And with that, we bid adieu to our refrigerator. It is around 20 years old—no way to be sure, and it has had a good life. I suppose the EPA would frown on a Viking funeral?

As a final check, I tested the thermometer by running it from a stirred ice bath up to a rolling boil. The ice bath temperature, averaged over about 1 minute, is about 33.1°F. A 1°F error does not present any problems.

Calibration in an ice bath is quite easy, but calibration at boiling is not. The pressure estimated from the RIMU predicts a temperature below the measurement, while the local airport’s barometric pressure predicts one above. In any case, the error is a few percent or less.
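The pressure-to-boiling-point prediction can be sketched with the Antoine equation for water. The constants below are the commonly quoted set for water between roughly 1 and 100 °C, with pressure in mmHg:

```python
import math

# Antoine constants for water (pressure in mmHg, temperature in deg C,
# valid roughly 1-100 C)
A, B, C = 8.07131, 1730.63, 233.426

def boiling_point_c(pressure_mmhg):
    """Boiling temperature of water (deg C) at a given pressure,
    from the Antoine equation solved for temperature."""
    return B / (A - math.log10(pressure_mmhg)) - C

print(boiling_point_c(760.0))  # ~100.0 C at sea level
# At lower pressure the boiling point drops by several degrees
print(boiling_point_c(586.0))
```

The spread between the RIMU's pressure reading and the airport's is what brackets the predicted boiling temperature above and below the measurement.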