Monday, August 20, 2012

uJTAG, Open Source minimalistic JTAG dongle

I'm releasing the design files of a JTAG dongle. JTAG is an interface originally designed to test printed circuit boards (PCBs) using boundary scan. Today, it is also used to test Integrated Circuits (ICs); in particular, this interface is present in (I believe) all ARM processors.

The JTAG dongle I designed, named uJTAG, is a minimalistic implementation with a 6 pin connector, in contrast to the standard 20 pin connector. The purpose of this dongle is to flash and debug ARM microcontrollers. Specifically, this dongle has been tested with STM32 microcontrollers.

uJTAG old version

In the image above you can see the previous version of uJTAG I implemented. In the current version, I have removed the EEPROM memory, which I later realized was not necessary, as the default USB <-> UART configuration is enough. Also, the components have been moved to the top layer.


As I mentioned previously, the JTAG standard specifies a 20 pin connector, shown below:


The uJTAG connector uses only 6 pins. The pinout is shown below:


These four signals (TMS, TDI, TDO, TCK) are the minimum necessary to use JTAG on microcontrollers. The RESET signal is also important, but a RESET can be issued by software, i.e. as a combination of the other 4 signals.
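The software reset works because of how the JTAG TAP state machine is defined: per IEEE 1149.1, clocking TCK five times with TMS held high reaches the Test-Logic-Reset state from any starting state. Here's a small sketch of that property (illustrative Python, not the uJTAG firmware):

```python
# TMS=1 transition for every state of the JTAG TAP controller
# (the TMS=0 transitions are irrelevant during a reset sequence).
NEXT_ON_TMS_HIGH = {
    "Test-Logic-Reset": "Test-Logic-Reset",
    "Run-Test/Idle":    "Select-DR-Scan",
    "Select-DR-Scan":   "Select-IR-Scan",
    "Select-IR-Scan":   "Test-Logic-Reset",
    "Capture-DR":       "Exit1-DR",
    "Shift-DR":         "Exit1-DR",
    "Exit1-DR":         "Update-DR",
    "Pause-DR":         "Exit2-DR",
    "Exit2-DR":         "Update-DR",
    "Update-DR":        "Select-DR-Scan",
    "Capture-IR":       "Exit1-IR",
    "Shift-IR":         "Exit1-IR",
    "Exit1-IR":         "Update-IR",
    "Pause-IR":         "Exit2-IR",
    "Exit2-IR":         "Update-IR",
    "Update-IR":        "Select-IR-Scan",
}

def software_reset(state, cycles=5):
    """Clock TCK `cycles` times with TMS held high, from any state."""
    for _ in range(cycles):
        state = NEXT_ON_TMS_HIGH[state]
    return state

print(software_reset("Shift-DR"))  # -> Test-Logic-Reset
```

No matter which of the 16 states you start from, five TMS-high clocks land on Test-Logic-Reset, which is why a dedicated RESET pin can be omitted.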


I use this JTAG dongle with STM32 microcontrollers; specifically, I use it with F4Dev, an open source development board for STM32F4 microcontrollers, and with OpenOCD 0.5.0, software that provides transparent JTAG communication between the PC and the microcontroller/processor over USB.

uJTAG mated with F4Dev


You can find all the design files in this repository.

Wednesday, August 15, 2012

Robbie, the open humanoid robot

This is the project I have been working on this year. The title of the project is "Development of an Open Platform for Research on Humanoid Robots"; it's basically the implementation of the project I did last year.

This robot is used/will be used to test control stability, sensor fusion, stereo vision and path planning algorithms. Currently, my team and I are gradually releasing all of our source code. Ultimately, a public repository will be created, containing all the information necessary to replicate this project.

This is a very big project, which I'll break down in the following sections.

Simulation (Not released yet)

Our previous project used Simulink to simulate Robbie in a virtual environment. Since the simulation times were poor, we decided to switch to ODE (Open Dynamics Engine), which doesn't look as nice as Simulink, but its simulation times are much better.

Robbie in a virtual environment. ZMP inverted pendulum not part of the simulation.

My friend Victor is in charge of this section. We haven't released the source code yet, but we'll probably release it as an Eclipse project.

Mechanics (Not released yet)

Materials: Aluminium, 18 Hitec HS-645MG servomotors and 2 Towerpro SG-90 micro servomotors.

This is how Robbie looks today:

Robbie, the humanoid robot

Drawings not yet released.

Electronics (Partially released)

The electronics are split into three parts:

  • F4Dev: Sensor / actuator manager and stability controller.
  • Distribution board: For servo connectors.
  • Beagleboard: For image processing.
Robbie's brain

Firmware ("Partially" released)

The firmware is built on top of bareCortexM, an Eclipse based IDE for ARM Cortex M processors, and libstm32pp, a template peripheral library. Both projects are open source.

The final firmware has yet to be released.

Software Interface (Old version released)

The software interface, qSerialTerm, allows the user to:
  • Send messages to Robbie.
  • Visualize moving plots of Robbie's sensor readings
  • Manually control Robbie's servomotors.
qSerialTerm, a Qt based serial port interface

Sensor fusion

From the orientation sensor readings (accelerometer, gyroscope and magnetometer), we compute an absolute orientation using Madgwick's MARG algorithm. This algorithm computes the orientation of Robbie's torso, seen from the coordinate system formed by the gravity axis (-Z axis) and the geographic north (Y axis).
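Madgwick's full quaternion filter is beyond the scope of this post, but the accelerometer half of the idea, recovering tilt from the measured gravity vector, can be sketched like this (illustrative Python, not Robbie's firmware; the function name is mine):

```python
import math

def tilt_from_accel(ax, ay, az):
    """Roll and pitch (radians) from a static accelerometer reading.
    Madgwick's MARG filter additionally fuses the gyroscope (to track
    fast motion) and the magnetometer (to resolve heading/yaw)."""
    roll = math.atan2(ay, az)                     # rotation about X
    pitch = math.atan2(-ax, math.hypot(ay, az))   # rotation about Y
    return roll, pitch

# A torso standing still and upright measures gravity along -Z only:
print(tilt_from_accel(0.0, 0.0, 9.81))  # -> (0.0, 0.0)
```

The gyroscope and magnetometer are what make the estimate robust during fast motion, when the accelerometer no longer measures pure gravity.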

I owe you a video here.

This computed orientation is important for the stability control and will later be used for path planning.

Control Stability

The accelerometer data is used to compute a stability indicator, named the Zero Moment Point (ZMP). This indicator plus the gyroscope readings are the core of Robbie's stability control.

Robbie's stability control maximizes its support polygon, i.e. it keeps its feet parallel to the floor, and also keeps its ZMP inside its support polygon; this guarantees dynamic stability.

I owe you another video here.

Stereo Vision

Robbie has a stereo camera (a Minoru 3D webcam, to be precise), which is in essence two cameras properly aligned to resemble human eyes. Using a stereo vision algorithm, the video streams from these two cameras can be used to obtain a sense of depth.

Minoru 3D webcam, a.k.a. Robbie's head

We use a stereo vision algorithm to compute the distance from Robbie to nearby objects, and this information is used as an obstacle detector.

My friend Santiago trying to punch Robbie, testing Robbie's stereo vision.

My friend Santiago is in charge of this part. Source code not released yet.

Early footage

In the following video, you can see Robbie tracking gait trajectories without stability control.

Robbie walking, without stability control. (Feb 2012)


  • Prof. Jose Oliden, our advisor
  • Victor Paredes and Santiago Cortijo, team members
  • Control systems and Artificial Intelligence Research Group (GISCIA)
  • National University of Engineering (UNI)

F4Dev, Open Source Development Board for the STM32F4 microcontrollers

Today, I'm releasing the design files of a board I redesigned, you may have seen the previous version in my post about libKiCad.

F4Dev, previous non-released version.

This board, dubbed F4Dev, is a small (5x5 cm) TQFP-100 adapter-like board loaded with various sensors and some communication interfaces.

The board was redesigned because the old version had components on both sides and there was a problem with the USB <-> UART converter, due to a silicon limitation.

F4Dev, new released version.


  • Accelerometer, 3 DoF, 16 bits (LSM303DLHC)
  • Magnetometer, 3 DoF, 16 bits (LSM303DLHC)
  • Gyroscope, 3 DoF, 16 bits (L3GD20)
  • Barometer, 19 bits (BMP085)
  • (The four sensors above are equivalent to a 10 DoF IMU.)
  • Omnidirectional Microphone (MP45DT02)
  • USB <-> UART (FT230X)
  • Bluetooth <-> UART (See this post)
  • 6 pin JTAG connector (TMS, TDI, TDO, TCK, GND, VDD) (See uJTAG)
  • LED indicators

Intended applications

As you can see in the specifications, the F4Dev is loaded with a 10 DoF Inertial Measurement Unit (IMU) and wireless communication. This makes it ideal for robotics: wheeled robots, aerial robots, bipedal robots, etc.

Currently, the previous version of F4Dev is used in Robbie, a humanoid robot, and this new version will be used in a quadrotor/quadcopter.


You can check all the design files (in KiCad) in the following repository.

Monday, August 13, 2012

Simulation of a humanoid robot

Today, I'll show some of the work I did last year at my university. I worked on a project titled "Stability Control of a Humanoid Robot". The project goal was to test a stability control algorithm on a humanoid robot in a virtual environment. This project was a first step toward the actual implementation of a humanoid robot.

I'm going to show the steps I followed to achieve a full simulation environment, and I hope these steps allow other people to run simulations of their own robots (not necessarily bipedal ones).

Mechanical design

The project started with the mechanical design of the humanoid robot. Since my faculty had servomotors available, they were used as part of the design. The mechanical design was developed in Solidworks, and was inspired by various commercial designs. The robot has 18 degrees of freedom, 5 per leg and 4 per arm, and has no head!

Solidworks render of the mechanical design

The Solidworks design provides pretty much all the information (dimensions, mass and inertia) necessary for a dynamic simulation.

Mass properties from Solidworks

The simulation environment

The next task was selecting a simulation environment. At that time my team and I only knew the Simulink environment from MATLAB, so the choice was limited. The Simmechanics toolbox was used to simulate the multibody dynamics of the robot.

The simulink environment

Bodies and joints

Simmechanics provides body and joint blocks to construct various mechanical assemblies. A body holds the inertial information necessary for a dynamic simulation, and two bodies can be connected via a joint, which can add zero or more degrees of freedom between the bodies.

6 DOF Joint between the humanoid torso and the ground (absolute reference point)

The bodies and joints are referenced to each other via Coordinate Systems (CS), which can be absolute or relative. Also, sensors and actuators can be attached to the bodies or joints for control applications.

Torso (body) properties: Inertial parameters and CS

There is a plugin for Solidworks called Simmechanics Link that allows the conversion of a Solidworks assembly into a Simulink model. However, the bodies and joints were laid out manually for greater flexibility and control.

Modeling the floor

The problem with the Simmechanics toolbox was its lack of collision detection, which is crucial for this simulation, as the normal and friction forces between the floor and the robot are key for gait.

To solve this problem, the floor was modeled as a PD controller. This controller exerted normal forces on a few selected points of the robot's soles, only when these points were on or below the floor level.

8 support points selected per foot.

The error input of this PD controller was the deviation of these points from the floor level. High proportional and derivative gains were used to minimize the sinking of the feet into the floor. The floor also exerted viscous friction forces when the feet were in contact with the floor.
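The floor model can be sketched for a single support point as follows (illustrative Python; the gains here are made up, not the ones used in the Simulink model):

```python
def floor_normal_force(z, vz, kp, kd):
    """PD 'floor' acting on one support point.
    z:  height of the point above the floor (negative = penetration)
    vz: vertical velocity of the point"""
    if z > 0.0:
        return 0.0                   # point above the floor: no contact
    force = -kp * z - kd * vz        # spring-damper pushing the point up
    return max(force, 0.0)           # the floor never pulls the foot down

def floor_friction_force(z, vx, b):
    """Viscous friction, applied only while the point is in contact."""
    return -b * vx if z <= 0.0 else 0.0

# 1 mm of penetration with stiff (made-up) gains:
print(floor_normal_force(-0.001, 0.0, kp=5e4, kd=5e2))  # -> ~50 N
```

The `max(force, 0.0)` clamp matters: without it, the damper term could make the "floor" stick to a foot that is lifting off.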

Floor controller block diagram.

Modeling the servomotor

For the servomotor model, specifications like the maximum angular speed and the maximum torque were used to produce an accurate DC motor model. For the servomotor controller model, a position + speed controller with gravity compensation and angular speed feedforward was used. The full block diagram is below.

Servomotor model
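The actual model is a DC motor plus a position + speed controller, but its dominant behavior, a position servo whose speed saturates at the datasheet limit, can be sketched like this (illustrative Python; gains and limits are made up):

```python
def servo_step(theta, theta_ref, dt, kp=10.0, max_speed=5.0):
    """One integration step of a speed-limited position servo.
    max_speed would come from the datasheet (e.g. the HS-645MG's
    seconds-per-60-degrees figure converted to rad/s)."""
    omega = kp * (theta_ref - theta)                # P position control
    omega = max(-max_speed, min(max_speed, omega))  # speed saturation
    return theta + omega * dt                       # integrate position

# Track a 1 rad step reference at 1 kHz:
theta = 0.0
for _ in range(1000):
    theta = servo_step(theta, 1.0, dt=0.001)
```

Far from the reference the joint moves at `max_speed` (the slew-rate behavior you see on real hobby servos); close to it, the motion becomes proportional and settles.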

Each servomotor was attached to its joint via a joint actuator and joint sensors, as shown in the following image.

Actuated joint

Putting it all together

After modelling the servomotor and the floor, the rest of the work consisted of assigning each body its inertial parameters and its 3D model (.stl), and then gluing everything together with the servomotors. The final result is shown below.

High level assembly


The simulation of this whole system can take a considerable amount of time, depending on the number of degrees of freedom, CPU power, etc. Here's a tip about the animation: you can speed up the animation a little (not the simulation) by accessing the menu Simulation > Control Animation Speed, reducing the "Delay per frame" and tweaking the "Visualization sample time".

Controlling the animation speed

Making it move

Moving the robot is as simple as giving some references to the servomotors, but making it walk is a different story. The references that can be sent to the servomotors are angular positions (denoted as Q), but usually you want the robot to follow trajectories in XYZ space.

The process of going from the Q space, or articular space, to the XYZ space is called forward kinematics, and the reverse process is called inverse kinematics. The forward kinematics is usually easier to compute, and is required to compute the inverse kinematics.

Forward kinematics

To compute the forward kinematics, the Denavit-Hartenberg approach was used. I won't go into much detail, but basically the Denavit-Hartenberg transformation matrices map positions and orientations from one coordinate system to another.

With these Denavit-Hartenberg transformation matrices, the position of any body part, seen from any other body part (generally one of the feet), can be easily computed.
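As a concrete illustration (plain Python, not the original MATLAB code), here is the standard Denavit-Hartenberg transform and its chaining, applied to a toy 2-link planar arm rather than Robbie's actual leg:

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform (4x4)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [[ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0]]

def matmul(A, B):
    """4x4 matrix product (kept dependency-free on purpose)."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_params):
    """Chain one transform per joint; return the end effector (x, y, z)."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]
    for theta, d, a, alpha in dh_params:
        T = matmul(T, dh_matrix(theta, d, a, alpha))
    return T[0][3], T[1][3], T[2][3]

# Toy 2-link planar arm, both links of length 1, fully stretched:
print(forward_kinematics([(0.0, 0.0, 1.0, 0.0),
                          (0.0, 0.0, 1.0, 0.0)]))  # -> (2.0, 0.0, 0.0)
```

Each row of `dh_params` is one joint's (theta, d, a, alpha); for a real robot, theta is the servo angle and the other three come from the mechanical drawings.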

Inverse Kinematics

Using the forward kinematics, an analytic expression for the position of any body part seen from any other body part can be derived; the inputs of this expression are the joint angles (Q) and the output is the XYZ position. Inverting this expression analytically is impractical, as there are many solutions.

Since no analytical expression for the inverse kinematics can be computed, an iterative method named Damped Least Squares (DLS) was used. I won't cover the mathematics of the DLS method here, but it uses a damped pseudoinverse of the Jacobian to approach the desired position iteratively.
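A minimal DLS iteration on the same toy 2-link planar arm looks like this (illustrative Python; the real implementation worked on Robbie's full leg chains):

```python
import math

def fk(q1, q2, l1=1.0, l2=1.0):
    """Forward kinematics of a toy 2-link planar arm."""
    return (l1 * math.cos(q1) + l2 * math.cos(q1 + q2),
            l1 * math.sin(q1) + l2 * math.sin(q1 + q2))

def jacobian(q1, q2, l1=1.0, l2=1.0):
    s1, c1 = math.sin(q1), math.cos(q1)
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def dls_ik(target, q=(0.3, 0.3), damping=0.1, iters=500):
    """Iterate dq = J^T (J J^T + damping^2 I)^-1 * error."""
    q1, q2 = q
    for _ in range(iters):
        x, y = fk(q1, q2)
        ex, ey = target[0] - x, target[1] - y
        J = jacobian(q1, q2)
        # A = J J^T + damping^2 I (symmetric 2x2), inverted by hand
        a = J[0][0] ** 2 + J[0][1] ** 2 + damping ** 2
        b = J[0][0] * J[1][0] + J[0][1] * J[1][1]
        d = J[1][0] ** 2 + J[1][1] ** 2 + damping ** 2
        det = a * d - b * b
        u = (d * ex - b * ey) / det      # w = A^-1 * error
        v = (a * ey - b * ex) / det
        q1 += J[0][0] * u + J[1][0] * v  # dq = J^T * w
        q2 += J[0][1] * u + J[1][1] * v
    return q1, q2

q1, q2 = dls_ik((1.0, 1.0))  # fk(q1, q2) ends up close to (1.0, 1.0)
```

The damping term is what keeps the step sizes sane near singular poses (e.g. a fully stretched leg), at the cost of slightly slower convergence elsewhere.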

The trajectory

The chosen gait trajectory consists of a spline for the foot trajectory and another spline for the hip trajectory during the single support phase, i.e. when only one foot is in contact with the floor.

Gait trajectory

This continuous trajectory was discretized into 50 points, and these points were transformed to the Q space using the inverse kinematics. The computed Q coordinates were finally fed at 50 Hz to the servomotors.
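The discretization step can be sketched as follows (illustrative Python with made-up dimensions and a sine shape standing in for the original splines):

```python
import math

def foot_trajectory(step_length=0.04, lift=0.02, n=50):
    """Discretize one single-support foot path into n (x, z) points."""
    points = []
    for i in range(n):
        s = i / (n - 1)                   # gait phase, 0 to 1
        x = step_length * s               # steady forward progress
        z = lift * math.sin(math.pi * s)  # smooth lift and touchdown
        points.append((x, z))
    return points

path = foot_trajectory()
print(len(path))  # -> 50
# Each (x, z) point would then go through the inverse kinematics,
# and the resulting Q values streamed to the servos at 50 Hz.
```

At 50 points and 50 Hz, one single-support phase takes one second, which also gives a feel for how slow this early gait was.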

The stability indicator

Trying to walk just by using the previously computed references will result in failure; it's necessary to implement stability control, and for that we need to select a stability indicator.

For this project, we used the Zero Moment Point (ZMP) indicator, which is basically an extension of the concept of center of mass to a dynamic context. You might know that a static object stays in equilibrium as long as the projection (along the gravity axis) of its center of mass is inside its support polygon.

The ZMP takes into consideration the center of mass and the acceleration of the body, and it must reside inside the support polygon to guarantee the stability of the body.

The ZMP computation can be reduced to the cart-table problem, sketched in the following image, for each plane of interest.
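In the cart-table model, the ZMP in each plane follows from the center of mass position, its height, and its horizontal acceleration. A sketch (illustrative Python):

```python
def zmp_cart_table(x_com, z_com, x_accel, g=9.81):
    """ZMP along one axis: x_zmp = x_com - (z_com / g) * x_accel."""
    return x_com - (z_com / g) * x_accel

def is_dynamically_stable(zmp, heel, toe):
    """Stable when the ZMP lies inside the support polygon [heel, toe]."""
    return heel <= zmp <= toe

# With zero acceleration, the ZMP is just the CoM projection
# (the static case):
print(zmp_cart_table(0.02, 0.40, 0.0))  # -> 0.02
```

Accelerating the torso forward pushes the ZMP backward, which is exactly why an aggressive gait can tip the robot even while its CoM projection is still inside the support polygon.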

Cart table model for the XZ plane.

The stability control

Our problem reduces to keeping the ZMP inside the support polygon. To achieve this, we implemented two separate controls.

  • Feet control, which keeps the feet parallel to the floor, maximizing the support polygon.
 Left: Feet control disabled. Right: Feet control enabled
  • Hip control, which moves the ZMP inside the support polygon.
Hip control in action, with disturbance at t = .25s.
Yellow: ZMP, Cyan/Magenta: Support polygon

The final result

Here is your reward for getting this far.


  • Prof. Jose Oliden, our advisor
  • Victor Paredes and Santiago Cortijo, team members
  • Control systems and Artificial Intelligence Research Group (GISCIA) 
  • National University of Engineering (UNI)


Monday, August 6, 2012

qSerialTerm: Qt based Serial Port Data Acquisition Software with Moving Plots

With its latest update, qSerialTerm can now be used as a Serial Port Data Acquisition System (DAS) or as a Serial Port Terminal Emulator.

The DAS mode includes, on top of the data logger feature, a moving plot feature. To achieve these moving plots, qSerialTerm uses libqwt for plotting. This means that libqwt is a new dependency of qSerialTerm.

The user can select between these two modes using the "From Device" menu. In the following image, you can see qSerialTerm configured as a DAS.

qSerialTerm in "plot" mode.

The user can configure the number of plots and the data type, endianness and format of the incoming data.

In the following image, you can see a fully configured qSerialTerm, ready to receive two "variables".

qSerialTerm configured for two plots.

These "variables" must be grouped as packets, e.g. let's say we want to plot 3 variables, these variables are floats, and are encoded as little endian raw binary. Each packet of data will contain one data point for each variable, summing a total of 12 bytes (4 bytes per float) per packet.

Packet | Variable 1 | Variable 2 | Variable 3
   1   |   float1   |   float2   |   float3
   2   |   float4   |   float5   |   float6
Two structured packets

qSerialTerm should receive these two packets in order: float1, then float2, and so on until float6. Then, two data points will be added to each plot.
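The packet layout can be made concrete with Python's struct module (this is a sketch of the wire format only, not qSerialTerm's actual Qt/C++ parsing code):

```python
import struct

PACKET = struct.Struct("<3f")  # 3 little-endian floats = 12 bytes

def pack_packet(v1, v2, v3):
    """Build one 12-byte packet carrying one data point per variable."""
    return PACKET.pack(v1, v2, v3)

def unpack_stream(data):
    """Split a received byte stream back into (v1, v2, v3) data points."""
    n = len(data) // PACKET.size
    return [PACKET.unpack_from(data, i * PACKET.size) for i in range(n)]

stream = pack_packet(1.0, 2.0, 3.0) + pack_packet(4.0, 5.0, 6.0)
print(len(stream))            # -> 24 (two 12-byte packets)
print(unpack_stream(stream))  # -> [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
```

A microcontroller would produce the same bytes by simply streaming its float variables back-to-back in little endian order.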


I'll now show some screenshots of qSerialTerm in operation. For these demos, I shorted the TX and RX lines of my USB <-> Serial converter; this means that every byte sent from the PC was echoed back to the PC.

The data was sent using qSerialTerm's "frame" feature. Using the continuous mode, I sent data whose values were varied with sliders; this data was echoed back to the PC, and ended up being added to the plots.

1 uint8 variable

2 int8 variables

I'll post a demo video later.


You can get qSerialTerm source code from this repository.

More information

qSerialTerm as a serial port terminal emulator
qSerialTerm for data logging
qSerialTerm for serial servomotor control
qSerialTerm for image visualization 

Saturday, August 4, 2012

qSerialTerm: Serial port servomotor control

qSerialTerm, a Qt based serial port terminal emulator, now supports a "frame" feature, intended for servomotor control applications.

This "frame" feature allows the user to send multiple data points (uint8, int8, etc) grouped in a frame. The user can send this frame by click a button or choose to periodically send these frames via a checkbox.

The value of these data points can be varied via a slider or, alternatively, via a spin box. The frame can be sent in two formats: raw binary, or hexadecimal string, where each byte is represented by two characters. E.g. the byte 31 is sent either as one raw binary byte, or as the 2-character (2-byte) hexadecimal string "1F".

The minimum and maximum values for the data points and the endianness of data points can also be specified by the user.

Additionally, the user can append one byte and/or prepend one byte to the frame.

An image is worth a thousand words; the above description will be clearer with the following image.

qSerialTerm frame feature

In the following sections, I'll show some uses of this frame feature.

Raw binary format

Raw binary format

For this example, the frame was composed of the bytes 48, 49, 50 and 51, and sent in raw binary format. These bytes are the ASCII codes of the '0', '1', '2' and '3' characters shown in the terminal.
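The byte-to-character correspondence is just ASCII, which is easy to check in Python:

```python
# The four raw binary bytes from the example frame:
frame = bytes([48, 49, 50, 51])
# Decoded as ASCII, they are exactly what the terminal displays:
print(frame.decode("ascii"))  # -> 0123
```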

Hexadecimal string format and endianness

Hexadecimal string format

In this example, four words (uint16), namely 15, 255, 4095 and 65535, were sent in hexadecimal string format.

Notice that these 4 words take only 8 raw bytes, while the hexadecimal string format takes 16 characters (bytes).

Also, the frame is in little endian format; this means that the least significant byte occupies the lowest memory address. E.g. the word 15 is written as 0x000F in big endian byte order, and as 0x0F00 in little endian byte order.

Finally, in little endian format: 15 -> 0x0F00, 255 -> 0xFF00, 4095 -> 0xFF0F and 65535 -> 0xFFFF. All this results in the following string: "0F00FF00FF0FFFFF", which is shown in the terminal.
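This can be verified with Python's struct module ('<' selects little endian, 'H' a uint16):

```python
import struct

words = (15, 255, 4095, 65535)
raw = struct.pack("<4H", *words)  # 8 raw bytes, LSB of each word first
hex_string = raw.hex().upper()    # 16 ASCII characters, 2 per byte
print(hex_string)                 # -> 0F00FF00FF0FFFFF

# The same word in big endian byte order, for comparison:
print(struct.pack(">H", 15).hex().upper())  # -> 000F
```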

Servomotor control demo

I owe you this one for now. I've used this software to send control signals to a robot. I'll post a video later.


qSerialTerm source code is available in this repository.

More information

qSerialTerm as a serial port emulator
qSerialTerm for data logging
qSerialTerm for data acquisition
qSerialTerm for image visualization  

Wednesday, August 1, 2012

qSerialTerm: Improved and now can be used as Data Logger

qSerialTerm, the Qt based serial port terminal emulator, has been improved since its first version, and now has a data logging feature. All the improvements are listed below:

  • Added data logging feature (rewrite file or append to file).
  • Reading data from the serial port is now periodic.
  • Fixed loss of focus when data is received.
  • Rearranged docks.
  • Docks can be called back from the "View" menu.

You can see the new look of qSerialTerm below.

qSerialTerm new looks

I'll now show the new logging feature via images. Below, you can see the log file selection dialog.

Selecting the log file.

Before you start logging, you can select whether to overwrite or append to an existing file, using the "append" checkbox. After you start logging, a timer will show how long you have been logging.

Testing the logging feature.

In the following image, you can see the contents of "log.txt".

Checking the log file.