
Final year project – A remote laboratory

Intro

As part of my final (3rd Professional) year of Computer Engineering at the University of Canterbury, I have been working on a full-year project. The college views these projects as the capstone of the degree program. They are designed to allow students to focus on a specific area, working at their own pace under the guidance of an academic supervisor.

My project is to design a remote laboratory system to aid teaching of Embedded Software in the Electrical and Computer Engineering department. I’ll explain exactly what that means soon, but first some background.

In 2012, students in ENCE361 were assigned a project which involved writing an embedded program to control a helicopter. The helicopter was to move up/down and left/right in response to button presses and to maintain robust behaviour at all times. The helicopter is fixed in a stand which uses a light sensor to output an analogue voltage proportional to its height. Students are required to read this value using an ADC and to control the helicopter with PWM signals.
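
To give a flavour of what's involved, here is a rough sketch of a single control step using TI's StellarisWare driverlib. This is purely illustrative: the course material may do things quite differently, all peripheral initialisation is omitted, and the setpoint, gain and PWM output used here are made up.

#include "inc/hw_types.h"
#include "inc/hw_memmap.h"
#include "driverlib/adc.h"
#include "driverlib/pwm.h"

#define SETPOINT 512 /* desired height in raw ADC counts (illustrative) */

void control_step(void) {
  unsigned long height;

  /* Trigger a conversion on ADC sequencer 3 and wait for the result. */
  ADCProcessorTrigger(ADC0_BASE, 3);
  while (!ADCIntStatus(ADC0_BASE, 3, false)) {
  }
  ADCIntClear(ADC0_BASE, 3);
  ADCSequenceDataGet(ADC0_BASE, 3, &height);

  /* Crude proportional control of the main rotor duty cycle. */
  long width = 1000 + (SETPOINT - (long)height) / 4;
  if (width < 0) {
    width = 0;
  }
  PWMPulseWidthSet(PWM_BASE, PWM_OUT_1, (unsigned long)width);
}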

Students enjoyed this project; however, there were problems with access to the helicopter stands and with breakages. It was hard to ensure each group had an equal opportunity to use a stand.

Around the same time, my supervisor, Dr Steve Weddell, was in communication with the University of Technology Sydney (UTS) and had learnt about the concept of remote labs. He figured the helicopter project would be a suitable candidate for conversion to a remote lab format.

The Project

For my project to be successful it would have to provide the following features:

  • Two functioning helicopter rigs.

  • Ability to respond to ‘virtual’ button presses.

  • Ability to upload programs onto the microcontroller remotely.

  • Ability to view the helicopter on a webcam.

I’m pleased to say that all of these requirements have been met. The video below shows how students might use the system (best viewed full screen):

So, how does it all work?

The key to the whole system is SAHARA Labs, a set of software packages which provide a framework for developing custom remote laboratory setups. SAHARA is open source, released under a BSD license. To view and download the most up-to-date code, head to the project’s GitHub repositories.

SAHARA

SAHARA consists of three main components:

  1. Web Interface – this is the component that students (or other users of the system) are presented with. It provides facilities to log in and access rigs, and to queue or make a reservation if all rigs are in use. Academics are also able to monitor student usage and download reports through the web interface. Rig pages can be customized with buttons and other control elements.
  2. Rig Client – provides various functions to interact with hardware. It is written in Java and requires further development to provide the final, lowest layer of abstraction to a specific rig.
  3. Scheduling Server – ties multiple rigs together and coordinates user access through the web interface. It has the ability to tie into a university’s existing authentication system, such as LDAP.

I installed all three of these components on an Ubuntu machine. The next step was to extend the Rig Client and to choose hardware to interact with the helicopter and the Stellaris development board.

UTS had recently developed a rig with a number of similarities to our planned rig, and they were kind enough to provide us with their source code as an example to work from. Their rig involved students programming a Digilent Nexys FPGA, whereas ours uses a Texas Instruments Stellaris EKS-LM3S1968 development board.

Buttons

I modified the web interface using HTML5 and JS to include the required buttons. When these are pressed, they fire Rig Client methods which are routed to a custom class. The next decision was how to send these logic signals to the microcontroller, preferably using a USB device. I investigated a number of options, including an Arduino board, but ended up choosing an FTDI FT245R device. This provides a bit bang mode which was perfect for this application. The standard way of talking to one of these devices is to write C code using the libFTDI library. In order to achieve this from the Rig Client (which is written in Java), I used the Java Native Interface (JNI).

The following code snippet shows how pins are asserted in response to button presses routed from the web interface:


jboolean Java_au_edu_uts_eng_remotelabs_heli_HeliIO_setByte(JNIEnv *env, jobject thiz, jint addr) {
  if (!deviceExists) {
    // PRINTDEBUG("Cannot set data byte when not connected to Heli");
    return JNI_FALSE;
  }

  /* Map the button index from the web interface to an FT245R pin. */
  int pin;
  if (addr == 0) {
    pin = UP_PIN;
  } else if (addr == 1) {
    pin = DOWN_PIN;
  } else if (addr == 2) {
    pin = SELECT_PIN;
  } else if (addr == 3) {
    pin = RESET_PIN;
  } else {
    // Unknown button index.
    return JNI_FALSE;
  }

  /* Enable bitbang mode with a single output line */
  ftdi_set_bitmode(&ftdic, pin, BITMODE_BITBANG);

  /* Drive the pin low (ftdi_write_data returns the number of bytes
     written, or a negative error code). */
  unsigned char c = 0;
  if (ftdi_write_data(&ftdic, &c, 1) != 1) {
    innerDisconnect();
    return JNI_FALSE;
  }

  /* ...then raise it again after a short pulse. */
  usleep(200);
  c ^= pin;

  if (ftdi_write_data(&ftdic, &c, 1) != 1) {
    innerDisconnect();
    return JNI_FALSE;
  }

  return JNI_TRUE;
}
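
For context, the deviceExists flag, the ftdic context and innerDisconnect() used above belong to the connection-handling side of the class. A minimal sketch of what that side might look like, assuming the standard libFTDI API and the FT245R’s default VID/PID of 0x0403/0x6001 (this is a simplification for illustration, not the actual Rig Client source):

#include <ftdi.h>

static struct ftdi_context ftdic;
static int deviceExists = 0;

/* Open the FT245R and remember that it is connected. */
static int innerConnect(void) {
  if (ftdi_init(&ftdic) < 0) {
    return -1;
  }
  if (ftdi_usb_open(&ftdic, 0x0403, 0x6001) < 0) {
    ftdi_deinit(&ftdic);
    return -1;
  }
  deviceExists = 1;
  return 0;
}

/* Close the device and mark it disconnected so later calls fail fast. */
static void innerDisconnect(void) {
  ftdi_usb_close(&ftdic);
  ftdi_deinit(&ftdic);
  deviceExists = 0;
}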

Code Upload

The other major bit of functionality required was to provide a way for students to upload binaries of their programs and have them automatically programmed onto the microcontroller for testing.

Luckily, OpenOCD plays nicely with our chosen microcontroller. The Java Rig Client communicates with the OpenOCD daemon by spawning a Python script, which in turn makes use of the Pexpect library. This is best understood by looking at the source code below:


import pexpect
import argparse
import sys


def main(**kwargs):
    if kwargs['format'] == 'bin':
        upload_program(kwargs['program'])
    else:
        sys.exit(2)


def upload_program(program):
    # Connect to the OpenOCD daemon's telnet interface.
    child = pexpect.spawn('telnet localhost 4444')
    child.expect('>')

    child.sendline('reset')
    child.expect('>')

    child.sendline('halt')
    child.expect('>')

    # Erase and flash the uploaded image.
    child.sendline('flash write_image erase ' + program)
    child.expect('>')
    child.sendline('sleep 5')
    child.expect('>')

    # Restart the target so the new program runs.
    child.sendline('reset run')
    child.expect('>')

    child.sendline('exit')


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Flash a binary to the Stellaris')
    parser.add_argument('program', type=str, help='Program name')
    parser.add_argument('format', type=str, choices=['bin'], help='Specify the file format')
    args = parser.parse_args()
    main(**vars(args))
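
Assuming the script is saved as flash.py (a name I’m using here for illustration), the Rig Client ends up invoking it along the lines of:

 python flash.py program.bin bin 

where program.bin is the binary uploaded by the student.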

Webcam

Finally, the whole system is not of much use if students are unable to see the helicopter in action. A Logitech C920 is connected to the rig computer for this purpose. I had envisaged video streaming being one of the simpler aspects of this project, but unfortunately it was a pain in the ass to get working! The team at UTS said they used ffserver/ffmpeg, however I had no luck with the version in Ubuntu’s apt-get repositories. It turned out that building the latest version from source was the only way to get it working:

sudo git clone git://source.ffmpeg.org/ffmpeg.git
cd ffmpeg/
sudo ./configure
sudo make
sudo make install
sudo usermod -a -G video username

I was then able to stream SWF, Flash and Motion JPEG using the following configuration file:

# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 7070

# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0

# Number of simultaneous HTTP connections that can be handled. It has
# to be defined *before* the MaxClients parameter, since it defines the
# MaxClients maximum limit.
MaxHTTPConnections 200

# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 100

# This is the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 100000

# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -

# Suppress that if you want to launch ffserver as a daemon.
#NoDaemon

##################################################################
# Definition of the live feeds. Each live feed contains one video
# and/or audio sequence coming from an ffmpeg encoder or another
# ffserver. This sequence may be encoded simultaneously with several
# codecs at several resolutions.

<Feed feed1.ffm>

</Feed>

<Stream status.html>
 Format status
</Stream>

<Stream camera1.swf>
 Feed feed1.ffm
 Format swf
 VideoFrameRate 15
 VideoSize 320x240
 VideoBitRate 250
 VideoQMin 3
 VideoQMax 10
 NoAudio
</Stream>

<Stream camera1.flv>
 Feed feed1.ffm
 Format flv
 VideoFrameRate 15
 VideoSize 320x240
 VideoBitRate 250
 VideoQMin 3
 VideoQMax 10
 NoAudio
</Stream>

<Stream camera1.mjpg>
 Feed feed1.ffm
 Format mpjpeg
 VideoFrameRate 15
 VideoIntraOnly
 VideoSize 320x240
 VideoBitRate 500
 VideoQMin 3
 VideoQMax 10
 NoAudio
 Strict -1
</Stream>
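
With the configuration saved (as ffserver.conf, say), ffserver is started first and ffmpeg is then pointed at the feed. Something like the following should work, although the exact device path will depend on your system:

ffserver -f ffserver.conf &
ffmpeg -f video4linux2 -i /dev/video0 http://localhost:7070/feed1.ffm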

With these tasks complete, the basic system works! A second rig client has also been added – this involves installing another copy of the Rig Client on a second machine, which talks to the Scheduling Server over the network. A number of other features have been added since and I might detail these in a future post.

I have written a paper on this project and will present it at the 2013 Electronics New Zealand Conference (ENZCON) in September. More complete details can be found in my Engineering Report.


Flashing a SAM4S MCU

One of the final assignments in my final year of Computer Engineering is to build a ‘Wacky Racer’ car. It is a group assignment with six students in each team. The project requirements include:

  1. Be controllable from a bog-standard infrared TV remote control.
  2. Be controllable wirelessly from a mobile phone.
  3. Can capture still images and send them wirelessly to a mobile phone or laptop.
  4. Have a soft on/off switch that will power up your vehicle.
  5. Have a green LED that indicates the system is powered up.
  6. Have a red LED that indicates that the battery needs charging.
  7. Only use a single battery pack for everything.
  8. Have no trailing cables.
  9. Use almost exclusively surface mount components apart from the supplied connectors.
  10. Monitor the battery voltage and motor speed.
  11. Be dastardly!
  12. Comprise three PCBs, each with a microcontroller:
    1. Motor board: power supply/motor-control/steering/infrared comms
    2. Camera board: using the TCM8230MD image sensor.
    3. Communications board: Wi-Fi or Bluetooth.
  13. The boards are to communicate via an I2C bus only.
  14. Each board is to have a UART or USB debug interface.
  15. Each board needs to be able to run standalone for debugging.

We decided to use an Atmel ATSAM4S16B microcontroller for all three boards. With our PCBs designed and manufactured, the components placed, and the voltage levels appearing to be correct, we then needed to configure a toolchain to connect to our board using its JTAG header. The instructions below cover installing OpenOCD 0.6.1 and an ARM EABI toolchain. We are using a USB to JTAG converter supplied by our department (Electrical Engineering) at the University of Canterbury; however, the process will be similar for commercially available adapters. Note that our adapter is based on an FTDI FT2232 chip.

[Images: the populated board and the USB to JTAG adapter]

Ubuntu Linux (12.10) is used in this tutorial.

1). Installing OpenOCD

The FTDI drivers are required. Install these with the following command:

 sudo apt-get install libusb-dev libftdi-dev 

Now, download OpenOCD 0.6.1 from http://sourceforge.net/projects/openocd/files/openocd/0.6.1/openocd-0.6.1.tar.bz2/download

Extract it (I use the GUI file manager because I can never remember the command) and cd into the openocd-0.6.1 directory. Now build and install OpenOCD with the following commands:

 ./configure --enable-ft2232_libftdi
 make
 sudo make install

Test that the install worked by running openocd and checking that something like the following appears (the errors are expected at this stage, since we haven’t yet told OpenOCD which adapter or target to use):

Open On-Chip Debugger 0.6.1 (2013-05-17-12:52)
Licensed under GNU GPL v2
For bug reports, read
    http://openocd.sourceforge.net/doc/doxygen/bugs.html
Runtime Error: embedded:startup.tcl:47: Can't find openocd.cfg
in procedure 'script'
at file "embedded:startup.tcl", line 47
Error: Debug Adapter has to be specified, see "interface" command
in procedure 'init'

2). Running OpenOCD

The next thing to do is to get the OpenOCD config files sorted for the SAM4S target and the USB to JTAG interface. Do this by creating a file called myconfig.cfg in your home directory and putting the following in it:

# This section configures OpenOCD for using the university's USB-JTAG adapter.
interface ft2232
ft2232_layout usbjtag
ft2232_vid_pid 0x0403 0x6010
adapter_khz 4
adapter_nsrst_delay 200
jtag_ntrst_delay 200

# This section configures OpenOCD for working with a SAM4S chip.
source [find target/at91sam4sXX.cfg]

# Halt the MCU when GDB connects, otherwise the connection fails.
# ($_TARGETNAME is defined in at91sam4sXX.cfg)
$_TARGETNAME configure -event gdb-attach {
    echo "Halting target due to gdb attach"
    halt
}
$_TARGETNAME configure -event gdb-detach {
    echo "Resuming target due to gdb detach"
    resume
}

At this point, connect your board via the USB to JTAG adapter, and run OpenOCD from the same directory as your myconfig.cfg file with the following command:

 sudo openocd -f myconfig.cfg 

3). Connecting to OpenOCD

There are two approaches to connecting to OpenOCD: telnet or GDB. For a telnet session, you can use the following command in a new terminal:

 telnet localhost 4444 

For GDB, OpenOCD listens on port 3333. Rather than telnet, you connect to this port from within GDB itself:

 target remote localhost:3333 

For the remainder of this tutorial we shall focus on using the GDB connection method.

4). Configuring an ARM EABI toolchain (with GDB)

We need to install a version of GDB configured for an ARM target. We might as well install our cross-compiler at the same time. This can all be achieved in one swoop by using James Snyder’s build system from https://github.com/jsnyder/arm-eabi-toolchain.

Install this as follows (alternatively, if you are used to cloning Git repos, you can just follow the README on the GitHub page):

Install git:

 sudo apt-get install git 

Install the required dependencies for the toolchain:

sudo apt-get install curl flex bison libgmp3-dev libmpfr-dev texinfo \
      libelf-dev autoconf build-essential libncurses5-dev libmpc-dev

Clone the repo:

 git clone https://github.com/jsnyder/arm-eabi-toolchain.git 

Install the toolchain:

cd arm-eabi-toolchain
make install-cross

Now wait for ages while the makefile does its thing (the tools are installed into $HOME/arm-cs-tools, so sudo is not needed). When it’s done, add them to your path and clean up.

export PATH=$HOME/arm-cs-tools/bin:$PATH
make clean
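
Note that the export above only lasts for the current shell session. To make it permanent, you can append it to your ~/.bashrc (assuming you use bash):

echo 'export PATH=$HOME/arm-cs-tools/bin:$PATH' >> ~/.bashrc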

You can check this all went well by starting the ARM version of GDB at the command line.

arm-none-eabi-gdb

You should be presented with a (gdb) prompt.

5). Using arm-none-eabi-gdb alongside OpenOCD

The following script can be used to upload a program to the SAM4S target using the GDB method:

# Connect to the OpenOCD GDB server.
target remote tcp:localhost:3333
monitor reset
monitor sleep 500
monitor poll
# Halt the core so the flash can be written.
monitor soft_reset_halt
# Flash the program given on the command line.
load
# Reset the MCU so the new program runs.
monitor reset

Save this script as program.gdb. Once you have compiled your program, and with the OpenOCD daemon running, you can invoke this script using the following command:

arm-none-eabi-gdb -batch -x program.gdb myProgram.elf 

Where myProgram.elf is your compiled application. The -batch flag makes GDB exit once the script has finished, and -x specifies the command file to run.

TODO: 6). Compiling a program for the SAM4S + an example blinking LED program.