A few days ago I was seduced by an offer for a CodeBug.
From the CodeBug.org.uk website:
“CodeBug is a cute, programmable and wearable device designed to introduce simple programming and electronic concepts to anyone, at any age. It is easy to program CodeBug using the online interface, which features colourful drag and drop blocks, an in-browser emulator and engaging community features. Create your own games, clothes, robots or whatever other wacky inventions you have in mind!”
It can also connect to a Raspberry Pi. Well, how could I resist? All this for only £15. It was too good to be true, so I bought 2. They arrived on Monday. Today is Friday. Here is my diary of the week.
CodeBug(s) arrived. Snazzy box. Unwrapped and took a look. Yes, it looks just like the picture. And comes with a USB cable, and a short instruction leaflet.
Visited CodeBug website and followed the get started instructions, watched the short intro video, which at less than 2 minutes shows you in real-time how to write your first CodeBug program, test it, deploy it to your CodeBug and finally run it.
(I didn’t believe the hype. This can’t be true. I tried it. It was!)
Within 10 minutes of opening the box I had a program up and running on the CodeBug. I understood the basics and had experienced a new programming language – Google’s Blockly.
I thought, 10 minutes. Is that all the fun there is to be had from a CodeBug? Done that. Next.
But, hang on. The CodeBug has two buttons on it, which respond to program control. So my next program included a bit more logic, using button presses to do different things and displaying text on the 5×5 LED display. (Amazing to think that a 25-pixel display can actually display text.)
Then I wrote a program to animate a dancing bear. A very simple dancing bear I admit. This time using loops as well as button press logic.
I was beginning to be impressed with this little £15 device. So why not go further. On the website, which is clean easy to use, and has a professional feel to it, there are some learning examples. Next up is a Fruit Keyboard.
I only had apples. So it was going to be a single fruit keyboard. A single key keyboard.
To make this you ideally connect a wire with crocodile-clip ends between the CodeBug and the apple. I only had a jumper lead, so I snipped off one end, wrapped the wire around one of the 4 input/output terminals, then pushed the pin end of the wire into the apple.
I modded the program to display “Apple”. Then I tested the program in the online emulator, to ensure first that it compiled, and second that it worked correctly. Happy with a quick test, and desperate to try it in the real world, I downloaded the program to the CodeBug and set it running.
Every time I touched the Apple, “Apple” was displayed on the LED display – scrolling across. This was the most immediate programming feedback I have ever had. I was very impressed at the simplicity of the CodeBug, yet how powerful and flexible it was, combined with a visual programming language and emulator, including cloud storage for all of my CodeBug programs.
This has been well thought through.
But there is more. A CodeBug can be tethered to a Raspberry Pi, and using Python programs running on the Pi you can control the CodeBug. Effectively a program is downloaded to the CodeBug, which then works as a client, whilst you run Python programs on your Pi, the server, communicating through the USB cable to your CodeBug. Client-server computing. With a CodeBug. Amazing.
I have to admit here to my first downer with the CodeBug. I plugged it in but it didn’t work. I spent a lot of time trying, but absolutely nothing was happening. I was obviously doing something wrong, but couldn’t work out exactly what. I was using a Motorola LapDock powering a Raspberry Pi Model A+.
There are 2 ways to tether a CodeBug to a Raspberry Pi: tethered over USB, and tethered with I2C, one of the interface standards supported by the Pi. For I2C, the CodeBug has expansion pins on its base and can be plugged directly onto the GPIO pins of the Pi.
Nothing I tried worked, so I slept on it. Thinking that a bit of rest might help. Before retiring though, I ordered some CR2032 batteries, because the CodeBug can be battery powered so it can work without a computer connection – it really is quite an amazing little device, and I ordered a USB Cable with a switch on it so I didn’t have to keep on plugging it in and out to reset and download programs.
The power of sleep is amazing. If you have a problem, then stepping away from it for a while can sometimes help. Meanwhile in the background your subconscious carries on working on your problem.
Wednesday morning I decided to rebuild the Pi OS. I started with the latest NOOBS build and tried again. After installing Geany, a lightweight IDE for writing Python, I was ready to go.
I plugged in the CodeBug to the GPIO ports on the Pi, then turned on the Pi. Before the Pi had finished booting a dancing bear was showing on the CodeBug LEDs. This made me realise that I had forgotten to download the tethering program, and that I probably needed to read the instructions a little more carefully.
In rereading the instructions I realised that tethering comes in two flavours on the Pi. The first tether mode I was attempting uses a USB cable, which, partly in my haste, I had missed. However I had plugged the CodeBug into the GPIO pins, so USB tethering was never going to work. But I2C tethering through the GPIO pins should have worked, so the software refresh wasn’t a waste of time. I also needed to read the instructions more carefully.
There were 2 references to connection via a USB cable. In a picture, that doesn’t actually show a Raspberry Pi, and then on the last line of the overview text of the example.
The examples are very good, and well documented. I admit I rushed, and didn’t read the single reference to USB tethering.
Lesson learnt: Use the latest build, and read the example text – thoroughly. Yes, I think that the USB tethering line could have been more prominent in the example, but there was a picture as well, and I had missed 2 clues!
Once plugged in, via a USB cable, I followed the example python scripts and all was well.
But that is the thing with the CodeBug: you then think, “well, I want to scroll some text like Blockly does“. Ah ha. When you write in Python the CodeBug tether library does not provide a scrolling routine, so you have to write your own. An example is given, but it is more fun working it out from basics. The way you get text to scroll is by adjusting the pixel-map start point. You display the same text over and over again, decrementing the starting column from 0 down to minus 5 (the columns per character you have to scroll across) times the number of characters in the string. Therefore “Hello CodeBug!“ (14 characters) is displayed from column 0 down to column 0 − (14 × 5) = −70, in decrements of 1.
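A minimal sketch of that column arithmetic. The display call itself (something like the tether library’s text-drawing function) needs the hardware, so here I just generate the sequence of start columns:

```python
def scroll_columns(text, char_width=5):
    """Start columns for scrolling text across a 5x5 LED grid.

    Begins at column 0 and decrements by 1 until the whole string
    (char_width columns per character) has scrolled off to the left.
    """
    return list(range(0, -(len(text) * char_width) - 1, -1))

# Each value would be fed to the CodeBug draw call in turn,
# e.g. (hypothetically) cb.write_text(col, 0, "Hello CodeBug!")
cols = scroll_columns("Hello CodeBug!")
# cols runs from 0 down to -70
```

This also explains the note below about longer strings: more characters means more start columns, so at a fixed per-column delay the full scroll simply takes longer.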
(NB. Longer lines of text seem to scroll slower!)
Then it was on to trying out the direct GPIO connect and the I2C interface. This means connecting the CodeBug directly onto the GPIO pins of the Raspberry Pi. Something that should always be done with care. And note that you should never connect power from the GPIO Pins, and either battery or USB. There is a warning on the page, but others seem not to be reading all of the text, just like me . . . .
I followed the scrolling digital CodeBug clock example. The Python program gets the current time and date, formats them, and sends them to the CodeBug, scrolling them across the LEDs. I slightly modded the program to display the short month name rather than digits (%b not %B). On my Model A it displayed the time then the date almost exactly twice a minute, to within a fraction of a second.
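That one-character mod is just a change of strftime format code. A quick illustration:

```python
from datetime import datetime

stamp = datetime(2015, 12, 4, 9, 30)
print(stamp.strftime("%H:%M"))  # the time, "09:30"
print(stamp.strftime("%d %b"))  # short month name, "04 Dec"
print(stamp.strftime("%d %B"))  # full month name, "04 December"
```

The full month name makes the scrolling string longer, which is exactly why I preferred %b on a 5-column-per-character display.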
I only experienced one small problem. I set the scrolling time parameter to 15 (15 seconds), not .15 (0.15 of a second), which meant that the display stood still for 15 seconds per column and would have taken approx 20 mins to scroll the time string! Oops.
I was pleased that I had overcome the initial problems – software, and not reading the instructions properly. I could now think of lots of things to do with the CodeBug, RasPi and Python. I think tethering via USB might be more useful than via the GPIO pins, but both are useful capabilities.
Being quite taken with Blockly I decided to investigate further. Following the links I eventually came across the Google Developer Pages for Blockly and the instructions for installing Blockly on your own machine here.
Now thinking about how to get Blockly onto RasPi, and get it to generate Python code. This sounds like fun. Start small. One step at a time. Then when I have learnt enough, completely revise the architecture and start again 🙂
The CR2032 batteries arrived meaning that the CodeBug can be used without plugging into a computer. This just worked!
The USB leads arrived. I had bought the wrong ones. No data cable! They only provide power to the CodeBug, but don’t do data exchange. Oh well. Out with the soldering iron to make the lead I really wanted. Should have read the small print. These will be recycled to use with Raspberry Pis.
I wrote this blog post as a way of documenting the CodeBug experience and started to think of other things to do with the CodeBug. I am definitely going to try the ‘get coding in under 2 minutes’ out on some friends 🙂 And the fruit keyboard – the tactile immediacy, having written the code, seeing all the wires, is great fun.
And I am thinking of doing more with my CodeBug. I bought 2. I gave the other one to my wife. She’s got the bug as well! In fact she got going a little quicker than me. Obviously reading the instructions more carefully!!! A lesson to be learnt there no doubt?
One project I want to try is to tether 2 CodeBugs to the same computer, then scroll a message across both CodeBugs, so that it appears as if the text is crossing from one to the other. If I get that going I will add a video, so watch this space . . . .
I am very impressed with the CodeBug. It is immediate. It is tactile. It is fun. But I am not only impressed with the device, but also with the complete environment surrounding it. The interactive, visual development environment, including compiler and simulator, is great. The cloud storage of your code is excellent. The price, £15, is fantastic. The examples are easy (ish) to follow – as long as you actually read them and don’t skip ahead. It is a well designed system, which just works.
If you have read this far and haven’t already ordered a CodeBug, then stop reading, and buy one now.
I have had a Raspberry Pi camera board since they were first available in May 2013.
One of the big advantages of the Pi camera is that it can be program controlled. That means you can write a program, in my case in Python, that can control when you take one or many photos.
I had seen a post on time-lapse photography and thought “I can build one of those“. So I did.
I think it looks fantastic when it is finished. If you can’t wait, fast forward to 42″, and watch a rain shower followed by brilliant blue sunshine. Don’t forget to watch in 1080p if you can.
It took about 75 lines of Python code and now I have a time-lapse camera program. Thanks for the inspiration and the example code, linked here.
Well, I have the program which takes the pictures. The images then need to be stitched together. And finally converted to something like an MP4 file.
Why 75 lines? Sounds like a lot? I wanted the program to be configurable, so using inputs I can decide how many shots to take and how far apart they are taken. This allows me to do a trial run, look at the results, check that the camera is pointing in the right direction, then go for the full run, for example 2,400 shots at a 6-second delay. Most of the code actually sorts out file names and directories. Just to be difficult I decided to use numbers for my file names, not just a time-date stamp, so a lot of the code enables that. It is possible to write a much shorter program, or even just use the time-lapse feature in raspistill.
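Most of that file-name housekeeping boils down to something like this sketch. The directory layout and the commented `picamera` calls are assumptions about my setup, not the exact program:

```python
import os

def frame_path(run_dir, n):
    """Zero-padded frame name, e.g. image0007.jpg, so the stills
    sort (and later encode) in shooting order."""
    return os.path.join(run_dir, "image{:04d}.jpg".format(n))

# On the Pi the capture loop is then roughly (needs picamera + camera):
# import time
# from picamera import PiCamera
# camera = PiCamera(resolution=(1920, 1080))
# for n in range(shots):
#     camera.capture(frame_path("run01", n))
#     time.sleep(delay)
```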
To make a 1080p HD movie you need to take 1920(w) × 1080(h) pictures and stitch them together at 25 frames per second (in the UK). So for 1 minute of video you need to take 25 × 60 = 1,500 pictures. At a 6-second delay between shots this is going to take 2½ hours.
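As a sanity check, that arithmetic in code:

```python
fps = 25              # UK frame rate
video_seconds = 60    # one minute of finished video
delay = 6             # seconds between shots

frames = fps * video_seconds         # stills needed: 1,500
shoot_hours = frames * delay / 3600  # shooting time: 2.5 hours
```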
(You will need to set some times aside to do this, make sure that the camera isn’t going to be moved, or become obstructed. Tip: Start with shorter runs.)
My set-up is simple: a Pi, WiFi dongle, and a Duracell emergency mobile phone charger battery. I run the Pi headless, which means without a monitor and keyboard. I connect through either my laptop or a tablet. And if I am out and about then I can open a hotspot on my phone.
I even bought a case for my Pi that looks like a camera. Not bad for about £10.
(Update: I have since added a magnetic lens, glued a washer to the front as a mount, and upgraded to a 12,000 mA-h PowerBank battery for even longer life. I am currently working on a tripod mount.)
I have recently bought a second camera case, a SmartPi, which has a GoPro tripod mount. This is designed for a B+ / B2. I have used a RasPi B+, which uses less power than the original B, and my PowerBank lasts even longer 🙂
(The SmartPi case has Lego mountings, which opens up a whole nother world of possibilities!)
Just a note: you can do all of the video encoding and conversion on the Pi, but I use my desktop PC. It is a lot quicker. For example, 50+ minutes on the Pi equals about 5 minutes on the desktop! And all of the software is open source, i.e. ‘free‘. The only additional expense need be the RasPi camera board.
“This is great, but I can do all this with my tablet and I don’t need to fiddle about with any of this Raspberry Pi stuff.” Well, yes, you can. But you are much less likely to stick your tablet in a Tupperware box, and leave it in the middle of a field for 24 hours, than you are with a Raspberry Pi. And where is the fun in using a tablet? With the Pi you have the satisfaction of knowing that you ‘made it‘.
One more that I made earlier:
PS. Avoid the sun being in the shot for long periods as here. This shot burnt off the IR filter and left a blue line across every subsequent image. The contrails look great though 🙂
I recently ran a workshop, with my good friend Phill Isles, at the Test Management Summit. The subject was Testing the Internet of Things: The Dark and the Light. One of the things that we wanted to do was demonstrate a live Internet of Things device, that the delegates could actually interact with, see how it works, and begin to understand what IoT means.
So I thought I would build a Raspberry Pi Tweet Cam that the delegates could use to take selfies.
It would need a Pi Camera, obviously. Then a button to press to take the photo. An LED to show the user what was happening. And finally another button so that we could turn it off.
The aim was to run headless, i.e. no monitor, keyboard or mouse.
Finally it would be equipped with a Wi-Fi dongle, to enable it to connect to the internet and Tweet.
A fun Raspberry Pi project. I mostly used the instructions for Tweeting from Alex Eames RasPi.TV site (which I find extremely helpful). Details can be found here RasPi.TV Taking and Tweeting a Photo. Then added my own design and functionality.
I needed some parts:
- Model B+ (Tick)
- Pibow Coupé Case (Tick)
The Pi looks great in the Coupé case.
- Breadboard Base for Pibow
Which replaces the bottom layer of the Pibow Coupé case and gives a larger platform onto which a half-size breadboard can be affixed.
- Some buttons.
I got ones with round and square tops.
- An RGB LED.
Why install three LEDs when you can fit one that does all three colours. You still need 3 input connections though – one per colour.
(You can then mix the inputs to create additional colours – Tough to do in an individual bulb!)
- Resistors (Tick)
I didn’t quite have the right resistors, so I managed with two in parallel. And I ordered a jumbo multi-pack of 2,000.
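The colour mixing mentioned above can be sketched as a truth table. Treating each leg of the RGB LED as simply on or off gives eight nameable colours (a simplification: real designs also vary brightness per leg):

```python
def led_colour(red, green, blue):
    """Name the colour an RGB LED shows for a given on/off mix
    of its three inputs."""
    names = {
        (False, False, False): "off",
        (True,  False, False): "red",
        (False, True,  False): "green",
        (False, False, True):  "blue",
        (True,  True,  False): "yellow",
        (True,  False, True):  "magenta",
        (False, True,  True):  "cyan",
        (True,  True,  True):  "white",
    }
    return names[(red, green, blue)]
```

This is exactly the trick the TweetCam uses later: one physical bulb, but green for ready, red for busy, and blue for taking a photo.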
The Build. Once all of the parts had arrived, I thought on the matter for a few days. When I had a rough idea of what I was going to do I started the build. I used a rapid prototyping approach.
First I assembled the Pi in the Coupé case extended with the breadboard base, then connected the camera using a simple flexible mount which plugs into the audio socket. (The mount works, but is a little loose in the socket – it holds the camera just fine though.)
I then added a resistor, button and some wiring to the breadboard, and some jumpers to connect the breadboard to the Pi. I wrote some code to detect the button press, then added code to take a picture when the button was pressed.
Next step was to add the RGB LED. There were no instructions for the RGB LED on the vendor’s site. I e-mailed them, and they responded with a two-page PDF giving the orientation and forward voltage. Not all RGB LEDs are the same – a simple internet search shows that.
After following some on-line guidance I connected the RGB LED, adding a resistor to the Red bulb. Then wrote a simple LED test program. When that was working I updated the TweetCam code to turn the LED Green when Ready, Red when not – I had decided that the TweetCam would only take a photo every 2 minutes, so as not to spam the world. And the LED would flash Blue when it was taking a photo. Wrote the code and tested it.
Then I added a second button, which was used to shut-down the Pi, as it would be running headless and this is always a good thing to do when turning off a Pi. And I made the LED flash Red whilst the Pi was shutting down.
Finally with the program doing everything but Tweet I added in the Tweet code. I followed the excellent instructions from Alex Eames. And yes, it worked. I pressed the button, the Pi took a photo, flashed the LED, and tweeted the picture.
This is ‘Testing in Production’. It is difficult to test a tweeting program without getting comments! So I only tweeted a few photos. I actually created a version of the program with the Tweeting line of code commented out, so that I could test changes, without bombarding Twitter.
The build took 6 hours from start to finish. I was quite impressed with the speed at which a functional and usable IoT (Internet of Things) device could be built and tested.
And if you are wondering what the pictures looked like the ‘live‘ output can be seen here TweetCam Pictures
We used the device in sessions on two days. On the first day the internet was not working at the conference venue, so it was all a bit of a damp squib. We were able, though, to demonstrate the inner workings of the Tweet Cam to the delegates, but unable to Tweet. Day two was perfect: press the blue button and tweet a picture of yourself.
A few days ago I met my good friend Phill Isles for coffee to plan an upcoming workshop centred on testing IoT devices.
Phill is quite interested in electronics and at the end of our meeting he handed me a small circuit board with what looked like half a small golf ball on one side (something like an icosahedron). It turns out to be a PIR (Passive InfraRed) motion sensor. He had recently bought 5 and thought I could have some fun with one (thanks Phill). And all for only 80 pence each, including shipping.
Later he sent me a link to the adafruit website with a tutorial for the PIR sensor.
I wired the PIR sensor into my Raspberry Pi, then slightly modified the example program to print a ‘Movement Detected‘ message on the screen. And then started to test the sensitivity of the device.
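The detection logic boils down to spotting a rising edge on the sensor pin. A minimal sketch, with the GPIO read replaced by a list of samples so it runs anywhere (on the Pi, `GPIO.input(pir_pin)` would supply the levels; my pin name and message are placeholders, not the tutorial’s exact code):

```python
def motion_events(samples):
    """Indices where the PIR output goes low -> high,
    i.e. where movement starts."""
    events = []
    previous = 0
    for i, level in enumerate(samples):
        if level and not previous:
            events.append(i)  # a 'Movement Detected' moment
        previous = level
    return events

# readings = [0, 0, 1, 1, 0, 1]  # would come from polling GPIO.input(pir_pin)
```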
As I moved away from the desk I could see the ‘Movement Detected‘ messages being displayed on the screen. But when I got 15 feet away I could no longer read the screen. Had a message been displayed? It was hard to tell.
How can I test the range? I was on my own with no-one to help. 2 minutes later I had a pair of binoculars to view the screen and all was well again.
It was the first time I have ever used binoculars for testing software. And a new tool was added to my software testing kit-bag.
According to Douglas Adams, it was the ‘Long Dark Tea-Time of the Soul‘, better known to all as Sunday afternoon. You can either hate them or engage them. Your mind begins to wander. And then you land on a thought. It won’t go away. An itch that needs scratching.
For me it was wondering how many lines of python the new Raspberry Pi 2 would execute per second. Just to give me a feel for how much work the Pi can actually do. I have 6 different Raspberry Pi models and a comparison would be interesting.
I wrote some simple Python code:

#!/usr/bin/env python
import time

n = 0
print " \nBefore " + str(time.time())
while n < 100000000:
    n += 1
    continue
print " \nAfter " + str(time.time())
NB. This is the final version of the code.
I wrote and tested the code on my Windows desktop before transferring it to the Pi. And the first time I ran it I had to crash out because I had forgotten to increment the counter. An endless loop! Ooops.
The code is very simple. I Just wanted a time-stamp before and after, and a minimal loop, to get a rough idea of speed. The amount of work that was being done.
I set the loop to 10,000 iterations to start with, then ran it. It completed within the same time-stamp – almost instantaneous. So I upped the loop to 100,000. Again a fraction of a second. Then a million. Again inside a second. So I jumped to 100,000,000 – 100 million iterations. This took 8 seconds on the desktop.
My Windows desktop is an Intel Core i7-3770K. The program was only running on 2 cores, using 25% of those cores – roughly 8% of the total processing capacity.
The code was working so I transferred it to the Pi 2. I thought that this was going to take ages to run. First time through it took 156 seconds, which I then got down to 115 seconds with the standard 1,000 MHz overclock. That is 300 million lines of Python code processed (100 million iterations × 3 lines of code), which I find amazing. It works out at about 2,600,000 lines of Python code a second. On a Pi.
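The back-of-an-envelope sums from my 115-second run:

```python
iterations = 100000000  # loop count
loop_lines = 3          # while test, increment, continue
seconds = 115           # overclocked Pi 2 run time

total_lines = iterations * loop_lines      # 300 million lines executed
lines_per_second = total_lines // seconds  # roughly 2.6 million
```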
(04/12/2015 Updated to include new Pi Zero.)
(04/03/2016 Updated to include new Pi 3 Model B, at just under a minute, and an original release Model B 256mb at 290 seconds.)
NB. The Pi 2 result came from running the program simultaneously in 4 cores and taking the longest core time.
This is in no way scientific. It is not a performance test. But as a very rough guide to just how powerful a Raspberry Pi can be, regardless of model, I found it amazing.
(Do try this at home!)
And links to some more thorough Raspberry Pi Benchmarking:
Some time ago I tripped across the adafruit temperature & humidity sensing page for Raspberry Pi and BeagleBone Black, showing you how to build your own weather station which records details in a Google Docs sheet online.
At the time here in the UK I couldn’t get hold of a sensor so I put the project on hold. Recently I found a sensor available from ModMyPi in the UK for only £8.49 + VAT so I bought it.
Brilliant. With only a little soldering, I should be up and running with a weather station that can record temperature & humidity, and upload to a Google Docs sheet. Now that I had the sensor I re-read the instructions. They looked more detailed than my initial perusal, so I put the sensor on the shelf for a week or two.
Two weeks later I remembered about it, and thought ‘how hard can this really be?‘
Revisiting the instructions, I thought they were actually very clear, and now that I was getting down to the build, it seemed quite simple and straightforward.
The instructions showed the connections made through a breadboard, but I wanted to connect direct to the GPIO pins on the Pi, so on each of the 3 sensor output wires, Red, Yellow and Black, I soldered a corresponding Red, Yellow and Black male to female jumper lead. (I soldered the Male pin to the sensor wire – the female pin connects to the Pi.) I also put some heatshrink around the joint to protect the connection.
So I now have a sensor with long wires which can connect directly to the Pi. Which I did.
Then I followed the software installs. Straightforward. I ran the test program and, to my surprise, the temperature and humidity were displayed straight away – yeah!
The next task was to follow the instructions to run the program and output the readings to a Google Docs sheet. This was a little harder for the following reasons:
- I had to change the sensor pin value from 23 to 4. The diagram showed wiring into GPIO pin 4, but for some reason the code had pin 23.
- I struggled to name my Google Docs sheet correctly. Just a standard typo. I thought I had removed the spaces in the file name but I hadn’t.
Then it worked. I was even more amazed, so I took a picture. And yes, everything looks a bit Heath Robinson’ish, and so it should. This is a prototype. I made a stand from a wire coat hanger. I still have to fix the heatshrink. Put the sensor on a different Pi with a full enclosure case. And I have to amend the software so that it runs automatically on boot, and checks that the internet is present before trying to write out to the Google Docs sheet. Then it will be a stand alone device that I can leave in situ working as a weather station.
I will update this post when I have completed the device, and post a finished picture for comparison. Just to say for now that if you want to build yourself a temperature & humidity sensing device, and want to see the output on the internet from anywhere in the world, it will take you about 4 hours of work and cost you less than £15 (assuming you already have a Raspberry Pi).
As I finish writing this the Pi is in another room in the house and I am viewing the spreadsheet on-line watching the temperature slowly go down overnight, and that gives me a real sense of achievement.
There are however a few blank lines being written out to the spreadsheet, and there is a note about this in the instructions, so I will also have to modify the code to overcome that. But it in no way diminishes the warm glow and in fact means more fun coding a solution 🙂
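The fix for the blank lines is simply to skip rows where the read failed. Assuming the sensor library signals a failed read by returning None values (as the Adafruit library does), a guard like this sketch does the job:

```python
def valid_reading(humidity, temperature):
    """True only when the DHT sensor actually returned values;
    a failed read gives None, which would write a blank row."""
    return humidity is not None and temperature is not None

# In the logging loop, only append to the sheet when the read succeeded:
# if valid_reading(humidity, temperature):
#     worksheet.append_row([timestamp, temperature, humidity])
```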
I made a couple of edits to the code, found some excellent new features in Google Docs sheets, and here is a ‘live‘ link to my Test WeatherStation data as a chart (Data No Longer Updated). This will be actively updated for a few days or so whilst I fine-tune the operation. [And I have now added low-temperature alert messages sent direct to my Pebble watch via Pushover, following these instructions.]