
An Input Device for the Control

of 3D Graphics

Final Report - Spring 2003

David Humphreys

Supervised by Dr. S.I.Woolley

This report summarises research into inertial tracking and the application of inertial sensors
for the control of 3D graphics on a PC. It covers how gyroscopes and accelerometers can be
used to passively sense the position of an object, and the problems caused by noise and
inertial drift. The report also covers the design and implementation of a prototype input device
for rotating 3D objects using gyroscopes and thresholding, and it summarises the different
topics that were studied during development of the prototype, including: implementing USB,
analogue-to-digital conversion and serial communications on a PIC microcontroller; interfacing
with a PC using RS232; implementing a graphical user interface in Visual Basic; drawing a 3D
cube from basic principles; the packaging of the prototype; and usability testing of the device.
The prototype device has the following attributes:

• Measures 3-axes of rotation in a 3D environment

• Can be used to control the rotation of a 3D cube

• Two different modes of operation

• Ergonomically designed

• Robust and reliable during normal operation

• Connects to an IBM compatible PC running MS Windows

• Built-in usability tests

Thanks go to all those people who were involved with this project, particularly
Dr S. Woolley
Mr D. Checkley
Mr W. Hay
Mr A. Zanatni
Mr A. Yates
Mr S. Greep

My thanks also go to my Parents who have provided encouragement and support whenever I
have needed it.

Table of Contents
1 INTRODUCTION...................................................................................................................................1
1.1 DESIGN BRIEF ...................................................................................................................................1
1.2 JUSTIFICATION ..................................................................................................................................2
1.3 BASIC CONCEPTS..............................................................................................................................3
1.4 REPORT APPROACH ..........................................................................................................................4
2 INITIAL SPECIFICATION....................................................................................................................6
2.1 FUNCTIONAL REQUIREMENTS.............................................................................................................6
2.2 ERGONOMICAL REQUIREMENTS .........................................................................................................6
3 PROJECT OVERVIEW..........................................................................................................................8
3.1 SYSTEM DIAGRAM .............................................................................................................................8
3.2 DESIGN PROCESS .............................................................................................................................9
4 PC INTERFACE ...................................................................................................................................12
4.1 AIM................................................................................................................................................ 12
4.2 SELECTING THE PC INTERFACE ...................................................................................................... 12
4.3 IMPLEMENTING PC INTERFACE USING RS232.................................................................................. 15
5 SENSORS .............................................................................................................................................17
5.1 INTRODUCTION TO INERTIAL SENSING.............................................................................................. 17
5.2 AIM................................................................................................................................................ 19
5.3 SENSOR RESEARCH ....................................................................................................................... 20
5.4.1 Accelerometers........................................................................................................................ 21
5.4.2 Gyroscopes ............................................................................................................................. 22
5.5 IMPLEMENTING SENSORS ............................................................................................................... 23
5.6 PRELIMINARY TESTING OF SENSORS ............................................................................................... 24
5.7 SUMMARY OF IMPLEMENTING INERTIAL SENSORS ............................................................................ 25
6 MICROCONTROLLER .......................................................................................................................26
6.1 AIM................................................................................................................................................ 26
6.2 HARDWARE SELECTION .................................................................................................................. 26
6.3 IMPLEMENTING MICRO-CONTROLLER TEST CIRCUIT ......................................................................... 27
6.4 FLOW DIAGRAM OF PIC FIRMWARE ................................................................................................. 28
6.5 IMPLEMENTING A/D CONVERSION ................................................................................................... 28
6.6 IMPLEMENTING ASYNCHRONOUS TRANSMISSION.............................................................................. 30
6.7 MISCELLANEOUS IMPLEMENTATION DETAILS .................................................................................... 31
6.7.1 Auto configuration.................................................................................................................... 31
6.7.2 Header bytes ........................................................................................................................... 31
6.8 TESTING AND RESULTS................................................................................................................... 32
7 DATA PROCESSING DESIGN...........................................................................................................33
7.1 AIM:............................................................................................................................................... 33
7.2 DATA PROCESSING METHOD SELECTION ......................................................................................... 33
7.2.1 Direct Mapping ........................................................................................................................ 33
7.2.2 Gesture Recognition................................................................................................................ 35
7.3 DESIGN CONSIDERATIONS .............................................................................................................. 36
7.3.1 Selecting Threshold Levels ..................................................................................................... 36
7.3.2 Normal Operation Mode .......................................................................................................... 37
7.3.3 ‘Auto-damping’ Mode............................................................................................................... 40
7.3.4 User Options............................................................................................................................ 41
7.4 USABILITY FLOW CHART ................................................................................................................. 44



8.1 AIM................................................................................................................................................ 46
8.2 SOFTWARE STRUCTURE ................................................................................................................. 47
8.3 DATA ACQUISITION ......................................................................................................................... 49
8.4 DATA STORAGE ............................................................................................................................. 51
8.5 CALIBRATING DEVICE ..................................................................................................................... 52
8.6 DATA PROCESSING ......................................................................................................................... 52
9 IMPLEMENTING THE GRAPHICAL USER INTERFACE..............................................................55
9.1 INTRODUCTION ............................................................................................................................... 55
9.2 GRAPHICAL BARS ........................................................................................................................... 56
9.3 3D CUBE ....................................................................................................................................... 57
9.4 SETTINGS ...................................................................................................................................... 60
10 PACKAGING .......................................................................................................................................61

11 RADIO LINK........................................................................................................................................63

12 TESTING ..............................................................................................................................................64
12.1 FUNCTIONALITY TESTING: METHOD ................................................................................................. 64
12.2 FUNCTIONALITY TESTING: RESULTS ................................................................................................ 65
12.3 USABILITY TESTING: METHOD ......................................................................................................... 66
12.4 USABILITY TESTING: RESULTS ........................................................................................................ 68
12.5 TESTING CONCLUSION ................................................................................................................... 70
12.5.1 Functionality ............................................................................................................................ 70
12.5.2 Usability (compared with a mouse) ......................................................................................... 70
12.5.3 Optimised Settings .................................................................................................................. 72
13 PROJECT CONCLUSION ...................................................................................................................73
13.1 MEETING THE SPECIFICATION ......................................................................................................... 73
13.2 DISCUSSION................................................................................................................................... 74
13.3 FURTHER WORK ............................................................................................................................ 78
14 REFERENCES......................................................................................................................................79

APPENDIX A: IMPLEMENTING USB ON A PIC 16C745 ......................................................................... I

APPENDIX B: CIRCUIT DIAGRAMS ....................................................................................................... IV

APPENDIX C: CORRESPONDENCE ....................................................................................................... VII

APPENDIX D: HARDWARE CAD DESIGNS........................................................................................... IX

APPENDIX E: COMPONENT LIST AND BUDGET ................................................................................ XI

APPENDIX G: PIC CODE.........................................................................................................................XIV

APPENDIX H: VISUAL BASIC CODE.................................................................................................. XVII

List of Figures

Figure 1: Project Scope 1

Figure 2: Initial Concept 3
Figure 3: Summary of 3D Input devices already on the market 5
Figure 4: System Overview 8
Figure 5: Design Process (i) 10
Figure 6: Design Process (ii) 11
Figure 7: Comparison of Modern PC Interfaces 12
Figure 8: RS232 Protocol 16
Figure 9: HyperTerminal Settings 16
Figure 10: Comparison between Different Types of Accelerometer Technology 18
Figure 11: Changes in Gyroscope Technology in Recent Years 18
Figure 12: How the Murata Gyrostar Gyroscopes Sense Angular Velocity 19
Figure 13: Comparison of Inertial Sensors 20
Figure 14: Arrangement of Accelerometers to measure 6 -DOF 21
Figure 15: Arrangement of 3 Gyros and 3 Accelerometers to measure 6-DOF 22
Figure 16: Suggested Method of Connecting Gyros 23
Figure 17: A Graph to Show Results From Initial Testing of the Gyroscopes 24
Figure 18: PIC 16C774 Pin Connections 27
Figure 19: PIC 16C774 Test Board 27
Figure 20: Flow Diagram of PIC Firmware 28
Figure 21: Flow Diagram of A/D conversion 29
Figure 22: ADCON1 Register 29
Figure 23: ADCON0 Register 30
Figure 24: Asynchronous Transmission Flow Diagram 30
Figure 25: TXSTA Register 31
Figure 26: Configuration Word 31
Figure 27: Header Bytes Implemented on PIC 32
Figure 28: How Thresholding is used to detect a gesture 37
Figure 29: Definition of a Gesture 38
Figure 30: Process of Recognising a Gesture 39
Figure 31: Output in Normal Mode 39
Figure 32: ‘Inverted Square’ Output used in ‘Auto-damping’ mode 40
Figure 33: Output in ‘Auto damping’ mode 41
Figure 34: Demonstration of the Effects of Different Thresholds 42
Figure 35: Demonstration of Amplification Setting Set to 2x 42
Figure 36: Demonstration of Maximum Output Setting 43
Figure 37: Demonstration of Time Out Setting 43


Figure 38: Usability Flow Chart 44

Figure 39: Software/GUI Structure 48
Figure 40: 9-byte Input buffer 50
Figure 41: Data Acquisition from RS232 Serial Port Flow Chart 50
Figure 42: Data Logging Form 51
Figure 43: Calibration Process Flow Chart 52
Figure 44: Normal Mode Implementation Flow Chart 53
Figure 45: Auto-damping Mode Implementation Flow Chart 54
Figure 46: The Graphical User Interface 55
Figure 47: Details of GUI Bars 56
Figure 48: Implementation of GUI Bars 56
Figure 49: Numbering of Cube Faces and Corners 57
Figure 50: Flow Diagram of Cube Implementation 58
Figure 51: Settings dialogue box 60
Figure 52: Photograph of Packaged Device 61
Figure 53: Exploded Diagram of Packaging 62
Figure 54: Methods of Testing Used to Test Each Area of the Prototype 64
Figure 55: Functionality Testing Conclusions 65
Figure 56: Usability Testing Flow Diagram 67
Figure 57: Optimised Settings for Device as a Result of Usability Testing 68
Figure 58: Usability Test Results for Prototype Device 68
Figure 58ii: Usability Test Results for Mouse and VRML 69
Figure 59: Mean Opinion Score for Device on Ergonomical Attributes 70
Figure 60: Rating of Prototype Against Specification 73
Figure 61: Max232 Circuit Diagram IV
Figure 62: PIC16C774 Circuit Diagram IV
Figure 63: PIC16C745 Circuit Diagram V
Figure 64: Max495 Pinout V
Figure 65: Gyroscope and Amplifier Schematic V
Figure 66: Transmitter Circuit Diagram VI
Figure 67: Receiver Circuit Diagram VI
Figure 68: Bottom Half of Spherical Case IX
Figure 69: Top Half of Spherical Case IX
Figure 70: Face of Aluminium Cube X
Figure 71: Component List and Budget XI
Figure 72: Example Inputs and Output for ‘Normal Mode’ XII
Figure 73: Example Inputs and Output for ‘Auto-damped Mode’ XIII

List of Abbreviations

A/D Analogue to Digital Converter

CNC Computer Numerical Control
DDK Driver Development Kit
DOF Degrees-of-Freedom
GUI Graphical User Interface
HID Human Interface Device
MDF Medium Density Fibreboard
MEMS Micro-Electro-Mechanical Systems
MOS Mean Opinion Score
PIC Programmable Interface Controller
RSSI Receiver Signal Strength Indicator
SDK Software Development Kit
SNR Signal to Noise Ratio
UART Universal Asynchronous Receiver Transmitter
USB Universal Serial Bus
VRML Virtual Reality Modelling Language

Throughout this report the following terms are used:

‘Department’ – The Department of Electronic, Electrical and Computer Engineering at the
University of Birmingham, England.

‘Gesture Recognition’ – In this report, the recognition of a movement intentionally made by the
user to manipulate the object on the computer screen. Thresholding is considered to be a
form of gesture recognition.

‘Direct Mapping’ – When the position of an object on the screen is directly mapped to the
position of a human interface device (often referred to as absolute position tracking).
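The thresholding approach defined above can be illustrated with a short sketch. The report's own software is written in Visual Basic; this Python fragment is only a hedged illustration, and the rest value, threshold and function name are invented for the example rather than taken from the report.

```python
# Illustrative sketch of thresholding as a simple form of gesture
# recognition: a sample counts as an intentional gesture only when the
# gyroscope reading moves outside a dead band around its rest value.
# The rest value (128) and threshold (10) are invented for illustration.

def detect_gesture(sample, rest_value=128, threshold=10):
    """Return the signed gesture magnitude, or 0 inside the noise band."""
    offset = sample - rest_value
    if abs(offset) <= threshold:
        return 0          # within the noise/drift band: no gesture
    return offset         # large excursion: intentional movement

readings = [128, 130, 127, 150, 160, 129]
gestures = [detect_gesture(r) for r in readings]
# only the large excursions (150 and 160) register as gestures
```

With direct mapping, by contrast, every sample is passed through unchanged, so sensor noise and drift accumulate directly in the on-screen position.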


1 Introduction

1.1 Design Brief

“To design and build an original controller for an IBM compatible PC to manoeuvre 3D objects
in a 3D environment.” The PC shall be of a high specification and shall run the Microsoft
Windows 2000 operating system.
The suggested solution is to construct a wireless, handheld sensor linked to the PC. The
sensor should detect 6 degrees of motion (translation along the x, y and z axes and rotation
about the x, y and z axes). The use of inertial sensors was strongly advised, both to form an
innovative solution and to be relevant to current research into inertial sensing within the
Electronic Engineering department. Figure 1, below, shows the scope of the project.

[Figure 1 shows the scope of the project: the input is the user's hand motion, this project forms
the process, and the output is a 3D object that moves and rotates on screen.]

Figure 1: Project Scope


1.2 Justification
This product is intended to be of an intuitive design that can be used to manoeuvre a 3D
object around a screen and rotate it. It would be useful when viewing 3D objects, such as the
detailed cuneiform tablets created using a 3D scanner in research carried out at the University
of Birmingham. Currently most users use a mouse to manoeuvre an object on a screen.
However, a mouse is not an intuitive device to use for this application as it only provides 2
degrees of translational freedom whereas an object in 3D space has 6 degrees of freedom.
“The user needs to be able to rotate the tablet in all three axes to be able to see all of the
faces of the tablet. This gives six degrees of freedom; but most computers have only two-
dimensional input devices (mouse, trackball or joystick)… An ideal solution would be a form of
data glove that measures the position of the user’s hand (in all six degrees of freedom) and
renders the image to correspond with the position of the hand,” (Woolley, S.I. 2001 [32]).
The solution will not involve designing a data glove, but rather a sensor that the user holds in
their hand. This makes it easier for the user to switch between devices when working at a
desktop PC with a mouse and keyboard. The proposed device will be used alongside the
mouse, which would be used for pointing tasks in the 3D world.
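Rotation in all three axes, as described above, can be expressed with the standard axis rotation formulae. The report's cube is drawn in Visual Basic (section 9.3); this Python sketch only illustrates the underlying mathematics, and the function name and the x-then-y-then-z rotation order are assumptions.

```python
import math

# Rotate a 3D point about the x, y and z axes in turn -- the three
# rotational degrees of freedom the device is intended to control.
# Angles are in radians; the rotation order here is an assumption.

def rotate(point, rx, ry, rz):
    x, y, z = point
    # rotation about the x-axis
    y, z = (y * math.cos(rx) - z * math.sin(rx),
            y * math.sin(rx) + z * math.cos(rx))
    # rotation about the y-axis
    x, z = (x * math.cos(ry) + z * math.sin(ry),
            -x * math.sin(ry) + z * math.cos(ry))
    # rotation about the z-axis
    x, y = (x * math.cos(rz) - y * math.sin(rz),
            x * math.sin(rz) + y * math.cos(rz))
    return (x, y, z)

# a quarter turn about z carries the x-axis onto the y-axis
```

Applying such a rotation to each corner of a cube, then projecting the corners onto the screen, is all that is needed to render the rotating object.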


1.3 Basic Concepts

The initial concept, given during the briefing, was interpreted as illustrated in Figure 2.
The user holds an ergonomically designed sensor that senses all 6 degrees of movement.
Within a specified range, the sensor would transmit data to a receiver module via a wireless
radio link. The base station would incorporate a micro-controller, which would receive data
from the sensor and relay it to the PC via a standard interface. The microcontroller would
format this information such that a generic 6-DOF input device driver could be used on the PC.
Figure 2 summarises these ideas.

[Figure 2 illustrates the initial concept: the user moves a hand-held sensor as they want the
object to move on screen; the sensor detects the movement of the hand and transmits
information to a receiver; the receiver relays the data from the sensor to the PC; and the 3D
object moves in conjunction with the user's hand.]

Figure 2: Initial Concept


1.4 Report Approach

This report will explain all the features of a prototype device that has been designed to meet
the specification. It will also cover all the design decisions that were made and, along with the
code listings in the appendices, provide an in-depth record of the implementation of the
prototype.
The first task was performing research into ergonomical issues that would help form a
preliminary specification. Research into inertial sensing and 3D input devices was also
undertaken to help provide direction for the project and to highlight any potential design issues
before they were encountered first hand, interrupting progress of the prototype. Figure 3
summarises some 3D input devices that have already been developed.
Following the specification, the report will concentrate on the final design developed to meet
the specification. A system diagram is given in Figure 4, which shows how the design was
fragmented into the sections that are considered to be significant. Each part of the system
diagram has a corresponding section in the report.
The solution as a whole is evaluated in chapters 12 and 13. Functionality and usability tests
were completed to compare the effectiveness of the prototype at completing 3D tasks to a
mouse. The results from the tests were used against the specification to determine the
success of the prototype at meeting objectives.

An Inertial Measurement Unit for User Interfaces [2]
– Sensors used: 3 Murata Gyrostar angular rate sensors and 3 ADXL202 accelerometers
– Technical attributes: connects via RS-232 interface; incorporates Kalman filtering and gesture recognition; RF Monolithics radio transmitter
– Other comments: prototype project to prove the principles of inertial tracking; prototype cost ~US$300

Application of inertial sensing to handheld terminals [4]
– Sensors used: ADXL202 connected to a Microchip 16C622
– Technical attributes: connects to a Linux PC using a development board, via a parallel interface
– Other comments: report into the development of an inertial navigation system for a configurable phone project

Hybrid Inertial and Vision Tracking for Augmented Reality Registration [19]
– Sensors used: accelerometers and angular rate sensors; vision-based sensing to compensate drift
– Technical attributes: detailed mathematical analysis; uses Kalman filtering
– Other comments: project using the InertiaCube to demonstrate the effectiveness of vision-based correction techniques

Intersense InertiaCube 2 [7]
– Sensors used: 9 discrete sensing elements with advanced Kalman filtering
– Technical attributes: RS-232 interface; measures roll, pitch and yaw; 0–1200 degrees/second; 180 Hz refresh rate
– Other comments: complete 3-DOF sensor; accurate to 1 degree at 25 degrees C; approximately 1" cube; costs US$1,695

Logitech 3D Mouse and Head Tracker [9]
– Sensors used: stationary ultrasonic transmitter; receiver relays back to the control unit
– Technical attributes: incorporates separate transmitter, control unit and power supply; connects via RS-232 interface
– Other comments: designed for high-end workstations; can be incorporated into VR headwear; costs US$1,999

Movy – a sourceless and wireless input device for real-world interaction [6]
– Sensors used: accelerometers and angular rate sensors
– Technical attributes: connects via RS-232 interface; uses a radio link; 50 Hz refresh rate
– Other comments: currently at the prototype stage; the MOVY ring uses accelerometers to monitor gestures of the finger

Miracle Mouse [11]
– Sensors used: senses position using infra-red transmitters and a detector
– Technical attributes: connects via USB and is powered from the USB port; IR transmitter fitted to headgear; IR detector sits on the monitor
– Other comments: intended for disabled users; uses gesture recognition for Windows applications

Gyration Ultramouse [5]
– Sensors used: the MicroGyro dual-axis gyroscope to sense rotation
– Technical attributes: connects via USB; refreshes at 80 Hz; NiMH or 3 AAA batteries; 25' radio range
– Other comments: commercial product aimed at business; works on or off the desktop; retails for US$179

Figure 3: Summary of 3D Input devices already on the market


2 Initial Specification

2.1 Functional Requirements

Below is a prioritised initial specification for the device, derived from the brief. It was important
that these features were kept in-mind when designing the project because the project was to
be tested against these in the conclusion.

• The device must be operable in a 3D environment

• The device must be ergonomically designed (see section 2.2)

• The device must be robust and reliable during normal operation

• The device must connect to an IBM compatible PC running MS Windows

• The device must measure movement in 6 degrees of freedom

• The device must be wireless

2.2 Ergonomical Requirements

The word ergonomics is defined in the Oxford English Dictionary as “the scientific study of
efficiency of man in his working environment” [25]. This project will look at the ergonomics of
pointing devices, and in particular those of a 3D controller. Ergonomic studies will be used to
test the success of the project, based on some of the following factors [33]:

Ease of Learning
This is the most important ergonomic factor of the project. The device needs to be intuitive to
use so that users can pick up the device and immediately start to use it.

Speed
It is important that the user is able to achieve their task quickly. The task might be to rotate an
object on the screen to the desired position.


Accuracy
A mouse has to be a very accurate device, as the user sometimes has to position the cursor
on very small targets such as icons. This device must be accurate, but not to the extent of a
mouse, because the targets involved in manipulating 3D objects are not so small.

Co-ordination
The co-ordination of a device is how well it works as a single unit. Because only one part of
this device interacts with the user, good co-ordination should be achievable. A device with
multiple controls, such as steering wheels and pedals, is harder to co-ordinate.

Device persistence and acquisition

Environmental effects and other obstacles should not cause the device to malfunction. For
example, 3D devices that rely on a magnetic field to calibrate themselves will be affected by
magnetic interference.

Fatigue
The proposed device will inevitably cause the user to become fatigued after prolonged use
because the user has to hold the device in the air. To reduce fatigue, the sensor should be
as light as possible.


3 Project Overview

3.1 System Diagram

[Figure 4 shows the system diagram: Packaging (Section 10), Sensors (Section 5),
Microcontroller (Section 6), Radio Link (Section 11), Graphical User Interface (Section 9),
Data Processing (Section 8) and PC Interface (Section 4).]

Figure 4: System Overview


3.2 Design Process

The project included a great deal of hardware selection and design, and background research
was undertaken throughout the project, looking at different alternatives to meet the
specification. There were both different component options and different ways to arrange
these components to form a system. Figures 5 and 6 summarise the different options that
were considered.
These diagrams address the design decisions in the order that they come in the system,
between user input and the graphical user interface. This report, however, is written in the
chronological order in which the different sections of the project were implemented. First the
PC interface was researched, because this would affect the format of the output from the
sensors and microcontroller, and also the PC side of the interface, as it was necessary to
know how the data would be received before any software was developed. The second stage
was researching inertial sensors. Having decided upon the sensors, it was necessary to
design a method of relaying their data to the PC interface in the correct format. Once the data
from the sensors could be read using software on a PC, data processing was implemented to
turn it into meaningful data used in the graphical user interface. Having developed a
prototype, packaging could be added without affecting the functionality of the electronics and
software, and a wireless link was attempted to replace the cable that connected the handheld
sensor to the PC interface.
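The data path just described (sensor readings framed by the microcontroller, then parsed by software on the PC) can be sketched as follows. The report mentions header bytes and a 9-byte input buffer (Figures 27 and 40), but the header value and frame layout below are assumptions for illustration only; the report's actual PC software is written in Visual Basic.

```python
# Hedged sketch of frame parsing on the PC side: the microcontroller
# prefixes each packet with header bytes so the PC can find frame
# boundaries in the serial stream. The 2-byte 0xFF 0xFF header and
# 7-byte payload (9 bytes per frame) are assumptions for illustration.

HEADER = bytes([0xFF, 0xFF])
FRAME_LEN = 9  # 2 header bytes + 7 payload bytes

def extract_frames(stream):
    """Return the payloads of all complete frames found in the stream."""
    frames = []
    i = 0
    while i + FRAME_LEN <= len(stream):
        if stream[i:i + 2] == HEADER:
            frames.append(stream[i + 2:i + FRAME_LEN])
            i += FRAME_LEN
        else:
            i += 1  # not at a frame boundary: resynchronise byte by byte
    return frames
```

Resynchronising on the header bytes means the PC software can recover if it starts reading mid-frame or a byte is lost on the serial link.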

[Figure 5: Design Process (i) — flowchart of the hardware and interface options considered.
Sensors: gyroscopes were chosen over accelerometers, which require significant processing
to remove the gravity component of acceleration; the chosen gyroscopes were within budget,
readily available, compact and low power. A/D conversion: the hardware A/D converters
available on PICs and the M16C were preferred, being reliable and straightforward to
implement. Microcontroller: the PIC 16C774 (5 V operation, hardware USART, ten 12-bit A/D
channels, £20) was chosen over similar PICs, the M16C and the Cypress enCoRe, primarily
because of the author's familiarity with the PIC family and its capability if extra sensors or
features were required. PC interface: the USB-capable PIC 16C745 could not be made to
work properly, and adding full functionality would have required developing a driver (rather
than 'kludging' the device as a joystick), losing the plug-and-play advantage USB has over
RS-232; at this stage the project was re-designed around RS-232 with data processing in
software.]

Figure 5: Design Process (i)

Project Overview

[Figure 6 (design-decision diagram; only partially recoverable from the extracted text). Key
points: vision and magnetic methods, and a user-recalibration scheme (re-calibrating the
device in the hand or in a base), were ruled out as requiring extra sensors and processing
outside the scope and time limits of the project; full gesture recognition (possibly via
Kalman filtering) was judged uncompletable within the time constraints, so data processing
was designed around thresholding to eliminate the effects of noise, using the gyroscopes'
differential output directly to measure the rotational velocity of a gesture; two operation
modes were chosen - 'normal mode' (friction-free: the output remains constant after a gesture
until a counteracting gesture is made) and 'auto-damping mode' (the output decays in an
inversely-squared manner, as in an environment where friction is present) - rather than linear
or exponential decay, which do not represent real-life movement; Visual Basic was chosen as
the programming language for its serial port connectivity and GUI development (Java was
rejected because the author had no knowledge of it, C/C++ because a GUI is hard to
implement); for displaying results in 3D, Fastgraph behaved erratically about real-world
axes, OpenGL was unfamiliar and complicated, and the Cortona SDK was outside the budget, so a
controllable 3D cube was implemented from first principles using circle geometry and
back-face culling; controlling the object within the project's own GUI cannot be used for a
real application but does facilitate testing; for packaging, a cylindrical 'grip-shaped'
aluminium frame was chosen (light weight but strong, machinable, and can be coated to give a
professional finish) over a sphere, a solid machined block or a solid plastic juggling ball
[24]; an LPRS Easy Radio transmitter/receiver pair (data in / data out operation, built-in
error correction, £40 a pair) was noted for a further design implementing a wireless link.]

Figure 6: Design Process (ii)

4 PC Interface

4.1 Aim
The aim was to design an interface between a microcontroller and the PC. The specification
for the link was:

• The micro-controller will produce an asynchronous serial data stream

• The PC shall be equipped with standard interfaces

• The design should be capable of a data rate of at least 9,600 bps

• The interface should be low cost and as user-friendly as possible (i.e. plug and play)

4.2 Selecting the PC Interface

There are several ways that the device could connect to a PC. Figure 7 summarises some of
the most popular computer interfaces (modified from [5]).

Interface        Format                  Number of  Maximum cable  Maximum      Typical use
                                         devices    length (m)     speed (b/s)
USB (low speed)  asynchronous serial     127        5              1.5M         Mouse, joystick
RS-232           asynchronous serial     2          15-30          20k          Modem, mouse
IrDA             asynchronous serial IR  2          2              115k         PDAs, mobile phones
I²C              synchronous serial      40         5.5            3.4M         Microcontroller comms
Parallel port    parallel                2          3-9            8M           Printers, scanners
Bluetooth        GFSK *                  72         10-100         721k         PDAs, mobile phones

Figure 7: Comparison of modern PC interfaces

* Gaussian Frequency Shift Keying

RS232 and USB were short-listed as being suitable for this project. They are both standard
PC interfaces that are low cost.
USB has some advantages and some disadvantages over RS232 that are relevant to this
project. Firstly the disadvantages: USB is not as well supported as RS232 in the Department,
is complex to implement and is more expensive. The advantages are the ability to provide a
five-volt power supply at up to 100mA to the device and a truly plug-and-play interface.
Microsoft describe the benefits of USB as “complete support for Plug and Play, power
management, and "hot plugging" to add or change devices without turning off the PC. USB
provides a fast, low-cost solution that is strongly recommended for gaming devices and
other input controls” [22].


Another benefit is future compatibility: manufacturers are starting to produce high-end
motherboards (e.g. the Abit IT7-MAX2 [2]) without RS232 interfaces, indicating that
peripherals will move to USB and Firewire in the future.
Plug-and-play is a big advantage for USB, brought on because more and more non-technical
home users own PCs nowadays and do not want to have to install drivers and change
settings. The Human Interface Device (HID) specification was designed to provide plug and
play connectivity to devices that interact with humans. This category of devices includes
joysticks and mice and specific game controllers, from bicycle pedals to pinball devices (see
HID usage tables [30]). The device being described in this project fits well into the HID
specification, which means that any PC with a USB controller and Windows 98 or later has built-in
drivers to support the device. Peter Sheerin [28] recommends to end-users in his report
written in February 2002, “go for the USB-interface models. They are built according to the
USB HID class driver specification, ensuring they can be supported via an industry-standard,
cross-platform driver interface now and later.”
The conditions that make a device compatible with the HID specification include data structure
(the format of the data), data rate and transmission type (interrupt, bulk, control or
isochronous) and are given in a checklist on page 293 of USB Complete [5]. The device being
described in this report meets the criteria for a HID device. The specification includes a
category for multi-axis controllers and data types including both absolute and relative units for
rotation and translation (see the HID usage tables [30], p29 of v1.11). The HID usage tables
define a multi-axis controller as, “an input device used to orientate eye points in 3D
space…that typically consists of 6 variable axis…for model manipulation and visualisation in
3D space,”[30]. This specification seemed to match the criteria for the prototype. At this stage
steps were taken to implement a USB controller on a Microchip PIC (Appendix A). Evidence
overwhelmingly suggested that USB and the HID specification would allow the device to be
plug and play, self powered and not require complicated interfacing to 3D software because
software supporting the specification would support the device.
The device is plug and play because the structure of the data for any HID device is read from
the USB controller during enumeration. Windows automatically enumerates the device by
reading a set of descriptors stored on the device firmware. This will, “let the host know what it
needs to do to communicate with the device,” [5] and tell the PC how to interpret the data,
including the type of data and units. Applications are able to handle the data received so the
user can interact using the device. VRML is a web-browser-based language that is commonly


used on the Internet and was also the modelling language of choice for the Cuneiform project
carried out in the Department in 1999 [32]. Research found that, in fact, most mainstream 3D
software (including VRML browsers such as Cosmo [9], Cortona [26] and Contact [8]) does not
provide generic support across the range of HID devices described in the specification.
Peter Sheerin is an industry expert on USB devices [28] who was contacted with reference to
problems finding support for unusual HID devices. The enquiring email and his full response is
listed in Appendix C. Sheerin commented in his email on the development of a USB HID
device, “If you continue on that path, that will make the list include your device, the USB
model of the SpaceBall, and future controllers from 3DConnexion…Unfortunately, the only
software I've seen that uses that spec is an internal utility from 3Dconnexion. But I wouldn't
give up quite yet on using that spec, since in the long run, it will result in greater compatibility.”
In Sheerin’s article written in February 2002 it is stated that [28]:

“Unfortunately, CAD and other 3D-software providers have not adopted the
HID/DirectInput device interface at all, with some exceptions. A few, including
thinkdesign from think3, have eliminated the requirement to load an application-
specific plug-in to use a 3D input controller, but these programs still connect to
only one of the proprietary device drivers (the SpaceMouse, in this case). And the
one 3D viewer (the Cortona VRML browser) that I found with DirectInput-support
doesn't allow all six axes of a device to be used at once, forcing you to switch
viewing modes in the application in order to switch from movement along or about
one axis to another”
Parallel Graphics [26] produce a widely used VRML browser called Cortona that was
mentioned in the passage above. Cortona is widely used and supported well by Parallel
Graphics which is known to support HID mice and joysticks. They were contacted and asked
whether or not it was possible to support a multi-axis HID class device using Cortona. The full
response is given in Appendix C. The extract below states that it is not possible to interface a
multi-axis HID device using Cortona unless the SDK was purchased, which was outside the
scope and budget of the project,
“To provide support for any other input devices in Cortona, an application, which
will handle events of the device and control movements in the Cortona 3D window,
should be developed. Such an application can be created with Cortona Software
Development Kit”

Whilst support for HID devices was being further researched, a low-speed USB controller was
being developed using a Microchip PIC 16C745. Initial research had
suggested that support would be available for such a well-documented specification from the


USB Implementers Forum that seemed to have many advantages. Unfortunately this was not
the case.
Appendix A details the selection of the PIC16C745 microcontroller and implementation as a
joystick. Cortona supports input from a joystick that could be used to rotate or translate a 3D
VRML object. Interfacing the device as a joystick would provide a valid way of controlling the
movement of the object, even though the device was not a joystick.
The attempted implementation of a USB controller was unfortunately unsuccessful.
It was possible to enumerate the PIC as a mouse and, using internally generated values on
the PIC, control the cursor on the screen using sample code from Microchip [21]. Modifying
the firmware code to enumerate the device as a joystick and control Cortona had limited
success. It was possible to rotate a VRML object in two axes but not in all three, which was inadequate for this application.
The unsuccessful attempt at implementing USB on a PIC could be put down to several things:
the lack of support for HID devices by software developers, lack of knowledgebase for the
PIC16C745 on the Internet and complexity of the USB specification. The results from
modifying the USB descriptors did not reflect the time that had been spent changing values in
the descriptors, which seemed to be correct (having been checked using the HID Descriptor
Tool [30] and USB Complete [5]). It was decided, at this stage, that there was no guarantee
that the innovative (in final year projects at least) USB solution would work, and it was
decided to implement device-to-PC communications using RS232 instead. RS232 has a successful
record in similar projects, is well supported in the Department and is low cost; its only
disadvantages compared with USB for this project are the loss of plug-and-play (a device
driver must be developed) and of future compatibility.

4.3 Implementing PC Interface using RS232

It was decided to use RS232 to interface with the PC. RS232 is a low cost and robust
interface that allows the developer to send data to the PC using a simple protocol:

• Asynchronous transfer (no clock signal)

• Start bit, 8-bits transmitted followed by a single stop bit (figure 8)

• ±12v voltage levels

• Data rate determined by transmission distance - typically 3m for 9,600 bps (960 bytes per second)
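As an illustration of the framing described above, the sketch below (an illustrative Python model, not project code) builds the 10-bit frame for one byte: a start bit, eight data bits sent LSB first, and a stop bit. Only logic levels are modelled; on the wire these become the ±12v levels produced by the level converter.

```python
def rs232_frame(byte):
    """Model the 10-bit RS232 frame for one byte (logic levels only).

    0 = start bit, then the 8 data bits LSB first, then 1 = stop bit.
    """
    bits = [0]                                   # start bit
    bits += [(byte >> i) & 1 for i in range(8)]  # 8 data bits, LSB first
    bits.append(1)                               # stop bit
    return bits
```

At 9,600 bps this 10-bit frame is why the effective throughput is 960 bytes per second.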


Figure 8: RS232 Protocol

RS232 communication was established using a Maxim [20] Max232 chip. The input to the
Max232 chip was the asynchronous data stream from the microcontroller described in section 6.
Figure 61 in Appendix B shows the schematic for connecting the Max232 chip.

The data transmitted from the microcontroller includes a 2-byte header to synchronise the
data with the software. See sections 6.7.2 and 8.3 respectively.
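The actual header values are defined in sections 6.7.2 and 8.3; as a hedged sketch of the synchronisation idea (the header bytes 0xFF 0xFF and the 3-byte payload, one byte per gyro axis, are assumptions for illustration only), the receiving software might scan the stream like this:

```python
def extract_samples(stream, header=(0xFF, 0xFF), payload=3):
    """Scan a byte stream and return the payload tuples that follow each header.

    Header and payload length are hypothetical; the real values are defined
    in the firmware (section 6.7.2) and software (section 8.3).
    """
    samples = []
    i = 0
    while i + len(header) + payload <= len(stream):
        if tuple(stream[i:i + len(header)]) == header:
            start = i + len(header)
            samples.append(tuple(stream[start:start + payload]))
            i = start + payload     # skip past the payload bytes
        else:
            i += 1                  # not synchronised yet: slide one byte
    return samples
```

Skipping past each payload after a match prevents a payload byte that happens to equal a header byte from causing a false re-synchronisation.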

Known data was transmitted from a PIC16C774 microcontroller to the Max232 chip. The data
from the PIC was also output to a set of 8 LEDs (for debugging purposes). MS Windows
HyperTerminal (Figure 9) was used to monitor data that was received on COM1, which could
be compared with the known values sent from the microcontroller and the LEDs. The data
received is represented in ASCII format and can be converted to the hexadecimal values
transmitted from the PIC using an ASCII table [4].

Connection Name Unspecified

Connect Using COM1

Bits Per Second 9600

Data Bits 8

Parity None

Stop Bits 1

Flow Control None

Figure 9: HyperTerminal Settings


5 Sensors

5.1 Introduction to Inertial Sensing

Inertial sensors were recommended for this project during the briefing to sense the movement
made by a user. The sensors would be built into a rigid construction, the ‘device’, which the
user could hold in their hand and use to manipulate objects on the screen by moving the
whole device. This project has investigated the two types of inertial devices available:
accelerometers and gyroscopes. It is hoped that this research will help further the
knowledge of these devices and their uses in the Department. This section will introduce the
main technologies used in inertial sensing.
Inertial sensors are devices that can measure their own movement and are completely
passive, meaning that they require no external interaction to operate. They have an
advantage over other sensors because they are not affected by external factors, such as
friction, interference or position: “inertial sensors are desirable for general motion sensing
because they operate regardless of external references, friction, winds, directions, and
speeds,” [31].
Inertial sensors have been used in aircraft and navigation systems for a long time. It is
only recently that new technology has brought the price and size of gyroscopes and
accelerometers down to levels that make them viable in consumer electronics. Of particular importance is
the MEMS (micro-electro-mechanical-systems) technology that has allowed small, cheap and
robust sensors to enter the market, ”recent advances in micro-electromechanical system
(MEMS) technologies have enabled inertial sensors to become available on the small size and
price scales associated with such commonplace devices as consumer appliances,”[31].
Accelerometers measure the translational force encountered due to their acceleration. To
convert this output to a velocity it would need to be integrated once and, to convert it to a
position, integrated twice. Accelerometers can use several different technologies, as shown in
figure 10.


Technology Advantages Disadvantages

Piezoelectric High Frequency response High cost

Piezoresistive Very low noise Very high cost

Capacitive Compromise - medium cost / medium noise

Differential capacitive Low cost Low frequency response, high noise
Figure 10: Comparison between different types of accelerometer technology [31]

Gyroscopes measure the angular velocity at which they are rotated; determining their
angular position requires a single integration. For reasons given in section 5.4, this
project shall concentrate on the use of gyroscopes.
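The single integration can be sketched numerically. The example below (illustrative only, not project code) integrates a gyro rate signal sampled at 50 Hz and shows how a small constant bias, such as the 0.5 °/s drift figure of the Murata part used later, accumulates into a large angular error:

```python
def integrate_rate(samples, dt):
    """Rectangular integration of angular velocity (deg/s) into angle (deg)."""
    angle = 0.0
    for rate in samples:
        angle += rate * dt
    return angle

dt = 1.0 / 50                      # 50 Hz sample rate
still = [0.5] * (50 * 60)          # one minute at rest with a 0.5 deg/s bias
drift = integrate_rate(still, dt)  # ~30 degrees of accumulated error
```

This is the inertial-drift problem in miniature: after only a minute at rest, a 0.5 °/s bias has become tens of degrees of apparent rotation, which motivates the thresholding approach adopted later in the report.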
Gyroscopes traditionally used rotating masses mounted on a set of gimbals to maintain
constant orientation when they were rotated. Mechanical rotating gyroscopes are expensive,
have high power consumption and may suffer wear after prolonged use. Modern
fabrication techniques have meant that alternative (vibrating) gyroscopes have been
designed. They are significantly smaller and lower cost than mechanical versions although at
a cost of being more prone to inertial drift. Figure 11 is a summary of how gyroscope
specification has changed in the last 30 years.

Gyro Type Date Stability (°/hr) Size (in3) Cost/Axis (US$)

Electrostatic 1970s 0.02 50-100 17000

Expected near term 1990s 0.02 10-20 5000-10000

Consumer Gyro Late 1990s 10 0.01-1 1-10

Figure 11: Changes in Gyroscope Technology in Recent Years [31]

Section 5.4 covers the selection process of the inertial sensors. The outcome was that three
Murata GyroStar gyros were used to sense rotation around 3 axes. The Murata gyros contain
three piezoelectric elements (elements that produce a current when they are subjected to
mechanical pressure) that sense the rotation about a single axis. Figure 12 shows how the
elements are arranged.


[Figure 12 (diagram): at rest, the driving oscillations produce equal outputs from sensing
elements A and B; under rotation, Coriolis oscillations cause the output of sensing element A
to differ from that of sensing element B.]

Figure 12: How the Murata Gyrostar Gyroscopes Sense Angular Velocity [31]

One of the elements is made to vibrate whilst the other two act as sensors. When the device
is rotated, the vibrating element experiences a coriolis force, causing a sinusoidally varying
difference between the two sensors with an amplitude proportional to their rotation. The
output is the difference between the outputs from the two elements.

5.2 Aim
The aim of this section is to select inertial sensors and implement them for use in a handheld
sensing device to provide passive sensing of movement in a 3-Dimensional space. The
following specification was drawn up to help clarify what was required from the sensors:

• The combination of sensors must measure, within appropriate limits, the movement
that the user makes to purposefully manipulate an object on the screen. Factors such
as accuracy (e.g. inertial drift and noise) must be considered.

• The total budget for the project is £100, from which the sensors need to be bought

• The device is going to be wireless so the sensors should be low power, preferably
operate at 5v (to complement the other components in the device), and be as small
and robust as possible


5.3 Sensor Research

Research has shown that there have already been similar projects that have investigated
inertial sensing and there are also several devices already on the market. Figure 3 shows
some of the main projects researched and lists the main technologies used. One project that
is particularly relevant is a thesis written by Benbasat in 2000 [7]. In his thesis, Benbasat
develops an inertial measurement unit using accelerometers and gyroscopes. His work is
referenced in this project on several occasions.

For this project there were several options for the gyroscopes, summarised in Figure 13. The
ADXL202 accelerometers have been used before in the Department (so there is already an
advanced knowledgebase) and represent good value for money and are readily available. The
ADXL202 accelerometers are highly suitable for this application so no others were considered.
Device              Type           Voltage      Current   Number   Range      Typical    Bandwidth  Drift      Size       Cost for
                                                per axis  of axes             noise                                       3 axes
Analog Devices      Accelerometer  2.7v-5.3v    0.6mA     2        ±2g        4.3mg      5kHz       N/A        <5mm³      US$24
Silicon CRS04       Gyroscope      4.85v-5.15v  <35mA     1        ±150°/sec  0.75°/sec  85Hz       0.55°/sec  30mm²x8mm  £279
Analog Devices      Gyroscope      4.75v-5.25v  6mA       1        ±150°/sec  0.35°/sec  500Hz      0.05°/sec  7mm²x3mm   pre-production
Murata Gyrostar     Gyroscope      2.7v-5.5v    3.2mA     1        ±300°/sec  0.5°/sec   50Hz       0.5°/sec   <15mm³     £90
Gyration Microgyro  Gyroscope      2.2v-5.5v    2.7mA     2        ±150°/sec  0.15°/sec  10Hz       0.12°/sec  <25mm³     US$450

Figure 13: Comparison of Inertial Sensors (modified from [7])


5.4 Methods of Using Inertial Sensors to Sense Movement in a 3D Space


This section will summarise different ways of using a combination of accelerometers and/or
gyroscopes to sense the movement made by the user to manipulate the object on the PC.

5.4.1 Accelerometers

Three dual-axis accelerometers could be used to measure all 6 degrees of freedom. Taking
the difference between the measurements observed by accelerometers placed opposite each
other makes it possible to separate rotation from translation: “rotation can be
measured inertially without gyroscopes, using the differential linear accelerations measured by
two (or more) accelerometers undergoing the same rotational motion but located at different
distances from the center of rotation,” [31]. The configuration of these accelerometers is
described in detail in [11], section 6.1. Using accelerometers in this configuration involves
complex mathematics to track and subtract the effect of gravity. For this reason it was decided
to fragment the project: using gyroscopes to measure rotation and 3 single axis
accelerometers to measure translation. This simplifies the task of removing the component of
gravity because the orientation of the device is known from the gyros (which do not sense
gravity like accelerometers).

Figure 14: Arrangement of accelerometers to measure 6 DOF


5.4.2 Gyroscopes

Gyroscopes can be used to measure the angular velocity around the three axis x, y and z (roll,
pitch and yaw respectively). Gyroscopes do not suffer the effects of gravity like
accelerometers, so they can be used to sense orientation and track gravity. 3 single-axis
accelerometers can be used in conjunction with the gyroscopes to measure translation.
One major advantage of using gyroscopes to measure roll, pitch and yaw, then adding x, y
and z translation using accelerometers, is that it increases the chance of the project succeeding.
Forming a complex solution using accelerometers alone carries the risk that either everything
or nothing will work.

Figure 15: Arrangement of 3 gyroscopes and 3 accelerometers to measure 6 DOF

It was decided that three Murata Gyrostars would be used to sense 3-DOF. The Murata
Gyrostar offers the following features:

• Low cost (~£35)

• No moving parts that could wear out

• Small (<15mm³)

• Low power consumption (<2mA, 5v)

• Produces a DC output proportional to angular velocity (a differential output)

• Good availability

• Low pin count (4)

Due to time limitations, it was decided at this stage to concentrate on building a 3-DOF
device, making the addition of accelerometers an extension to the project.


5.5 Implementing Sensors

The specification for the Murata Gyrostar gyroscope is available from the Murata website

[23]. The output from the sensors is a differential voltage that sits at approximately 1.35v when

stationary however, “depending on physical factors, like temperature, the frequency and

amplitude of this signal varies,” [14]

The Murata data sheet specified the circuitry represented in the following schematic to be
connected to the output of the gyroscopes. The same circuit would need to be made for each axis.

Sensor → Filter → Amplifier → A/D Converter → Data Processing

Figure 16: Suggested method of connecting gyros

In his thesis [7], Benbasat used the Murata Gyrostar for similar applications. His
implementation of the gyroscopes and amplifiers was proven when applied to a 6-DOF
gesture-recognition inertial measurement unit. He makes these comments: “The purpose of a low-
pass filter is to reduce the effect on the system of noise in the bandwidth of immediate
interest. The maximum frequency of interest for human arm gestures is considered to be
approximately 10 Hz, though quantitative analysis of the sample data stream suggests that
most of the gestures in which we are interested have a maximum frequency in the 3 - 5 Hz
range. High-pass filtering can be used to remove constant and slowly changing values from
the signals…which can be very useful if thresholding of the signals is desired”.[7]
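The effect of such a low-pass filter can be sketched with a single-pole IIR stage, the discrete analogue of the RC filter used in this design (illustrative only: the prototype filters in analogue hardware, and the 10 Hz cut-off and 100 Hz sample rate below are example values, not the circuit's):

```python
import math

def low_pass(samples, cutoff_hz, sample_hz):
    """Single-pole IIR low-pass filter, the discrete analogue of an RC stage."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / sample_hz
    alpha = dt / (rc + dt)
    out, y = [], samples[0]
    for x in samples:
        y += alpha * (x - y)   # y moves a fraction alpha towards each new sample
        out.append(y)
    return out
```

A constant (DC) input passes through unchanged, while high-frequency noise riding on it is heavily attenuated, which is exactly the behaviour wanted ahead of thresholding.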

His design was used, largely unmodified, for this project. It incorporates the following features:

• The differential output signal floats around a central value set by pin Vref

• The inverting amplifier has a gain of between 1.36 and 1.67 by adjusting the value of
VR1. It is possible to change the amplification so that the output utilises the whole
range of the a/d on the micro controller.

• Capacitor C2 provides a low pass filter with a cut-off of 66Hz (this is the upper limit of
the range of frequencies that were found to be significant for inertial input device
applications in [7]).


• The idle value can be adjusted precisely by varying the A/D reference voltages on the
micro controller (see section 6.5)

The Maxim Max495 operational amplifier was used to construct the inverting amplifier. The
Max495 has the following features:

• 5v operation

• Rail-to-rail output swing

• 150uA quiescent current

• 500kHz gain-bandwidth product

The circuit diagram and the Max495 pinout can be found in appendix B (Figure 64).

5.6 Preliminary Testing of Sensors

The three gyroscopes were inserted into three amplifier/filter circuits, which were built on
Veroboard and connected to an analogue to digital converter, where the outputs were quantised into eight bits.
Figure 17 shows a sample output from one of the gyroscopes. The movement was a sharp

90° turn clockwise followed by a sharp 90° anticlockwise turn back to the original position. More results

can be found in Appendix F. The variable resistor (VR1) that formed part of the amplifier could
be adjusted to prevent the output saturating but also to use the full range of the output.
Adjusting the A/D reference voltages could alter the constant level the output remained at
when the gyro was motionless.

[Figure 17 (graph): 8-bit quantised gyro output plotted against time, 0–2 s.]

Figure 17: A graph to show results from initial testing of gyroscopes


The signal to background noise ratio when the sensor was at rest was calculated to be
approximately 12.5dB (based on the output ‘wandering’ a maximum of 15 levels when stationary).
Implementing this circuit for each axis and fixing the gyroscopes perpendicular to each
other makes it possible to measure the rotation of the body in all three axes. The outputs of
the sensors are connected to an a/d converter, which is covered in section 6.5.
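The report does not show the working behind the SNR figure, but one plausible reading (an assumption, not the author's stated method) is a power ratio between the full 256-level range and the 15-level noise band:

```python
import math

FULL_RANGE = 256    # 8-bit quantiser levels
NOISE_BAND = 15     # maximum 'wander' of the idle output, in levels

# Power ratio in decibels: 10*log10(signal/noise) ~= 12.3 dB,
# close to the ~12.5 dB figure quoted in the text
snr_db = 10 * math.log10(FULL_RANGE / NOISE_BAND)
```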

5.7 Summary of Implementing Inertial Sensors

To summarise, the solution for implementing inertial sensing is compared to the specification
set for the sensors in section 5.2.

• The sensors measure the velocity of rotations that the user makes in a 3D space. The
output from the sensors has a range that can be adjusted so the maximum expected
rate of rotation occupies the full range of the output.

• The quality of the output from the sensors is poor compared to more expensive

gyroscopes (Figure 13) in terms of drift (0.5°/s[23]) and noise (SNR 12.5dB).

• The cost to build the circuits for the sensors is £105 (based on the gyroscopes costing
£33 and the Max495 operational amplifiers costing £2[12])

• The sensors’ operating voltage is 5v and the combined current consumption is <6.5mA


6 Microcontroller

6.1 Aim
The following specification was drawn up for the micro-controller that would convert the
outputs from the gyroscopes to an asynchronous serial data stream that could be connected
to the Max232 RS232 interface:

• The micro controller must have at least three analogue to digital converters. Extra
analogue inputs may be required if extra sensors need to be added

• There must be a way of implementing asynchronous serial data transfer using either
hardware or ‘bit banging’. The data rate must be 9600bps or greater. Assuming that
each sample is constructed of 5 bytes, this gives a refresh rate of 192Hz. The extra
baud rate allows for extra information to be transmitted (at a reduced refresh rate).

• The device should operate from a 5v supply and consume low power as it will be
powered off a battery

• There should be a means of testing the microcontroller and debugging any code
written during development

• The device must be low cost and development facilities must be available in the Department
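The 192 Hz refresh figure quoted above follows from the RS232 framing overhead of 10 bits on the wire per data byte. A quick check of the arithmetic:

```python
BAUD = 9600            # bits per second on the wire
BITS_PER_BYTE = 10     # start bit + 8 data bits + stop bit
SAMPLE_BYTES = 5       # bytes per sample, as assumed in the specification

bytes_per_second = BAUD / BITS_PER_BYTE        # 960 bytes/s
refresh_hz = bytes_per_second / SAMPLE_BYTES   # 192 samples/s
```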

6.2 Hardware Selection

The Microchip PIC range of micro-controllers was chosen because of department support,
availability and value for money. The author is also familiar with programming in Assembly.
Microchip[21] have a range of re-programmable micro controllers that have a/d converters.
Not all PICs have an asynchronous transmitter built into their hardware (this is called a
Universal Asynchronous Receiver Transmitter). To implement asynchronous serial
transmission in software (‘bit banging’) is both processor and development time intensive. A
range of devices that had hardware UARTs were considered, including devices from the
16CXXX family upwards. The Microchip PIC selection guide was used for this [21].
The 16C774 was selected because of its high specification. At the time the microcontroller
was chosen it was not known if any other sensors would need to be connected or what
processing the PIC would need to do. The micro controller was, in the final prototype, over-specified for the task.


The 16C774 has the following specification relevant to the project:

• 40 pin DIL packaging

• 10-channel, 12-bit analogue to digital converter

• Hardware USART

• Low power consumption (<2mA)

• 5v Operation under chosen operating conditions

• 25mA per source, sink or Vref

6.3 Implementing Micro-controller Test Circuit

The circuit drawn in figure 62 in Appendix B was built on Veroboard. The connections are
listed in Figure 18

Pin Numbers Pin Name Function

1 MCLR Connected to Vdd to activate PIC
2-4 AN0 – AN2 (Port A) Analogue inputs
5 Vref+ Upper reference voltage for ADC
11,12 Avdd, Avss ADC power supply
13,14 Osc1, Osc2 PIC clock
25 Tx Asynchronous data transmit
27 PortD Digital input
32,31 Vdd, Vss Power supply
33-40 PortB Drives LEDs (for debugging)
Figure 18: PIC 16C774 Pin Connections

[Figure 19 (photograph): the test board, showing the RS232 socket, the variable resistor used
to adjust Vref+, the test LEDs, and the connector and wire to the device.]

Figure 19: PIC 16C774 Test Board


6.4 Flow diagram of PIC Firmware

Below is a flow diagram of the code that has been written in assembly for the PIC16C774.

Initialise Variables

Initialise Ports

Perform A/D conversion

Transmit Data

Figure 20: Flow Diagram of PIC Firmware

The code was written in Assembly and built into a .hex file using MPAsm [21]. The PIC was
programmed using the Department PIC programmer.

6.5 Implementing A/D Conversion

Figure 21 shows the flow diagram for the code that performs analogue to digital conversion.
The following design considerations were made:

• The microcontroller needed to sample the output from the gyroscopes at a frequency
of at least 50Hz [7]

• The data needed to be quantised to 8 bits: “A sampling rate of approximately 50 to 100
Hz and a signal resolution of 8 to 12 bits are necessary for adequate recognition
latency and accuracy” [7]. 8 bits was chosen because it matches the serial interface,
which transmits one byte at a time. This gives a 256-level resolution.
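The reduction from the 12-bit conversion result to an 8-bit value follows from the left-justified result format: reading only the high result register (ADRESH) returns the top 8 bits. A minimal sketch of this, in Python for clarity (the firmware itself is PIC assembly):

```python
def adresh_byte(adc12):
    """Top 8 bits of a left-justified 12-bit conversion result, as read
    from ADRESH on the PIC (illustrative sketch only)."""
    assert 0 <= adc12 < 4096   # 12-bit result
    return adc12 >> 4          # discard the low 4 bits held in ADRESL

print(adresh_byte(0xFFF))  # 255 - full scale
print(adresh_byte(0x800))  # 128 - mid scale
```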


Set ADCON1 to 0x3B*

Set ADCON0 to select channel **

Wait ~50µs #

Set Go/Done bit in ADCON0

Poll ADCON0 until GO/DONE bit is clear

Read ADC result from ADRESH and store

Figure 21: Flow Diagram of A/D conversion

* Analogue to Digital Converter Control 1 (ADCON1)

 bit 7                          bit 0
   0   0   1   1   1   0   1   1

Result left justified; A/D high reference = external Vref+; A/D low reference = AVss;
AN0–AN3 configured as analogue inputs, AN4–AN7 as digital inputs

Figure 22: ADCON1 Register


** Analogue to Digital Converter Control 0 (ADCON0)

The microcontroller has only one analogue to digital converter. It can sample from any of the
analogue pins (set using ADCON1). The firmware samples AN0-AN2 in turn.

 bit 7                          bit 0
   0   1   0   0   0   0   0   1

Conversion clock = Fosc/8; CHS3:CHS0 channel select bits (0000 = channel 0,
0001 = channel 1, 0010 = channel 2); GO/DONE = 0 (conversion not in progress);
A/D module operating

Channel     ADCON0 Value
Channel 0   0x41
Channel 1   0x49
Channel 2   0x51

Figure 23: ADCON0 Register

# A delay of greater than 3 TAD is required. TAD for a 4MHz clock, with the conversion clock
select bits set in ADCON0, is 2µs. A delay of 50µs is invoked using a nested loop.

6.6 Implementing Asynchronous Transmission

Figure 24 shows the flow diagram of all the stages necessary to implement asynchronous
data transmission on a PIC16C774:

Set SPEN bit in RCSTA to enable serial port

Set value of SPBRG to 0x19*

Set value of TXSTA to 0x24**

Load 1 byte of data into TXREG

Poll TXSTA bit TRMT until it is clear (transmission complete)

Figure 24: Asynchronous Transmission Flow Diagram


The SPBRG register controls the baud rate at which data is transmitted from the asynchronous
transmitter. The desired baud rate is 9600bps. With the high baud rate selected (BRGH bit of
TXSTA), the corresponding value of SPBRG is calculated as follows [21, DS30275A]:

Desired Baud Rate = Fosc/(16(SPBRG+1))

SPBRG = (Fosc/(Baud Rate×16))−1 = (4×10⁶/(9600×16))−1 ≈ 25 = 0x19
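The same calculation can be sketched as a small helper (Python used for illustration; `spbrg_value` is a hypothetical name, not part of any toolchain):

```python
def spbrg_value(fosc_hz, baud, brgh=True):
    """SPBRG register value for the PIC USART.
    Baud = Fosc/(16*(SPBRG+1)) when BRGH=1, or Fosc/(64*(SPBRG+1)) when BRGH=0."""
    divisor = 16 if brgh else 64
    return round(fosc_hz / (divisor * baud)) - 1

# 4 MHz clock, 9600 bps, high-speed mode:
print(hex(spbrg_value(4_000_000, 9600)))  # 0x19
```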


 bit 7                          bit 0
   0   0   1   0   0   1   0   0

bit 7: don't care; bit 6: 8-bit transmission; bit 5: transmit enabled;
bit 4: asynchronous mode; bit 3: unimplemented; bit 2: high speed (BRGH = 1);
bit 1: TRMT (read only); bit 0: don't care

Figure 25: TXSTA Register

RCSTA has to be given a value although no data is being received. Bit <7> (SPEN) needs to
be set high to enable the serial port, so RCSTA is set to 0x90.

6.7 Miscellaneous Implementation Details

6.7.1 Auto configuration

A configuration word can be specified to tell the programmer the settings to use. The following
configuration word was used.


Code protect off | Watchdog timer off | Brown-out reset disabled | Power-up timer disabled | High-speed crystal

Figure 26: Configuration Word

6.7.2 Header bytes

The protocol used to synchronise the asynchronous data stream with the host software
included the use of header bytes. The PC software recognises the header bytes, removes
them and stores the following 3 data bytes. The header bytes also contain the state of the
activate button. See Figure 41 for details of how the header bytes were removed.


Activate button pressed

255 15 X value Y value Z value

Activate button not pressed

255 16 X value Y value Z value

Figure 27: Header Bytes Implemented on PIC

6.8 Testing and Results

The LEDs connected to Port B were used to debug the code. The following procedure was
adopted to develop the firmware:
1. Tests were carried out by sending known data to port B to test it worked
2. The A/D was implemented and the result was output to port B. A potentiometer
provided an analogue input and sampled when the activate button was pressed
3. A gyroscope was connected to AN0 and the A/D was tested as for step 2
4. Step 3 was repeated for AN1 and AN2
5. Known data was output to port B and to TXREG. Data was transmitted to the PC
using the MAX232 circuit (section 4.3) and monitored using MS HyperTerminal
6. Gyroscopes were connected to AN0-AN2 and the data was monitored as in step 5
7. A Visual Basic Program was used to remove header bytes and monitor and record
output from gyroscopes

The testing proved the design of the microcontroller circuit and code. The code was designed
to re-sample the analogue inputs immediately after the previous samples had been transmitted.
The refresh rate (determined from recorded data) was approximately 180Hz. The header
bytes had values within the range of the sensor outputs, rather than reserving two of the 256
values, so that the gyro outputs kept the full 8-bit resolution. This meant there was a
chance of the header byte sequence being accidentally detected. The probability of this
happening would be approximately 1.5×10⁻⁵ had the output from the sensors been evenly
spread between 0 and 255, but because the outputs usually remain at approximately 128 the
chance of an error was lower still. No such error was noticed throughout testing.
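The quoted probability can be checked directly, under the stated (pessimistic) assumption of uniformly distributed sensor bytes:

```python
# Probability that two consecutive, uniformly distributed data bytes
# happen to match one specific header pair (e.g. 255 followed by 15):
p = (1 / 256) ** 2
print(p)  # ~1.5e-5, the figure quoted above
```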


7 Data Processing Design

7.1 Aim:
The aim was to decide how to translate the data from the gyroscopes into data representing
rotation of an object in 3D space. The design would be a compromise between usability and
complexity. The following sections summarise the findings from research into possible
solutions and justify the decision to implement a form of gesture recognition using signal thresholding.

7.2 Data Processing Method Selection

Research found that methods of data processing for inertial devices can broadly be
categorised into gesture recognition and ‘direct mapping’. The following sections discuss each
method in detail.

7.2.1 Direct Mapping

‘Direct mapping’ is the term used in this report to refer to the orientation and position of a 3D
object on a screen being directly related to the orientation and position of the input device
(also referred to as absolute tracking). Referring to the rotation of the object (which this
project concentrates on), direct mapping implies that rotating the handheld sensor through
180°, for example, causes the object on the screen to also rotate 180°.

Direct mapping between input device and 3D object has an obvious advantage in terms of
usability. The user would be able to intuitively rotate an object to the desired position. If the
object on the screen were to rotate in the wrong direction the user would form part of a
feedback loop and be able to correct their hand movements accordingly.
To implement ‘direct mapping’ would require the integration of the data from the three
gyroscopes to give the absolute rotation of the sensor. There are problems that make this
solution non-trivial because the sensors themselves are not perfect at measuring their
rotation. This can lead to them getting increasingly disorientated (for gyroscopes,
proportionally with time).


The problems with inertial sensors are:

• Noise: Figure 17 showed a sample of the output from a gyroscope. This sample shows
the noise present in the signal as the output varies when the sensors are stationary.
Integrating the noise will lead to errors in tracking absolute rotation.

• Drift: The Murata gyroscopes have a quoted drift of 0.5°/s. Low cost gyroscopes tend

to have a high drift. Drift is effectively where the gyroscope ‘slips’ and accuracy of the
rotational velocity is lost. If the rotational velocity is not accurate the orientation of the
device cannot be accurately determined

• Axis misalignment: The gyroscopes need to be perpendicular to each other to sense

only one axis of rotation. If two gyroscopes sense components of rotation about a
single axis the device will become disorientated.
These imperfections are especially significant when the outputs are integrated. The drift alone

will cause a loss of accuracy up to 0.5°/s (i.e. 20s after starting from a known orientation the

sensors may be anything up to 10° misaligned in each axis).
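The effect of integrating an imperfect rate signal can be illustrated with a short simulation. This is a sketch only: the 0.5°/s bias is the drift figure from the text and the 50Hz rate matches the sampling requirement, while the noise level is an assumed value:

```python
import random

def integrate_orientation(duration_s, rate_hz=50, bias_dps=0.5, noise_dps=1.0):
    """Integrate a stationary gyro's output over time. The true rotation
    is zero, so the returned angle is pure accumulated error (drift)."""
    dt = 1.0 / rate_hz
    angle = 0.0
    for _ in range(int(duration_s * rate_hz)):
        measured = bias_dps + random.gauss(0, noise_dps)  # true rate is 0 deg/s
        angle += measured * dt
    return angle

random.seed(0)
print(integrate_orientation(20))  # roughly 10 degrees of apparent rotation
```

After only 20 seconds the estimated orientation has drifted by about 10°, matching the figure above, even though the device never moved.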

To overcome the effects that cause gyroscopes to become disorientated, devices use different
methods of re-calibration. Recalibration requires the device to periodically sense its actual
rotation/position by interacting with the outside world. There are a number of ways that this
can be done:

• Compass and Accelerometer to recalibrate on real-world phenomena [15]

• Vision Processing [29] to recalibrate on light sources in the local surrounding

• Artificial sonar sources to provide reference points [16]

These solutions have proven to effectively overcome the inaccuracies of inertial sensors. They
are, however, complex and require extra sensors and processing that makes them outside the
scope of this project.
A possible solution that was considered (of which no examples were found in any research)
was periodic ‘user’ recalibration. The idea was based on the user recalibrating the device
once it had become disorientated so it was unusable (either by placing the device in a cradle
or realigning the device with a specified orientation visually). It is hard to know, without
experimentation, how often user-recalibration is required.
‘User’ recalibration was considered along with the use of Kalman filtering, which research had
highlighted as a technique used frequently in inertial sensing devices, “Kalman filtering is the
main analysis technique for inertial data and is used almost exclusively for inertial tracking, the


determination of position and orientation from inertial readings and an initial state,” [7]. It was
decided not to track absolute position because of the likelihood that extra sensors would be
required, as was stated in research:
“Inertial systems are not well-suited for absolute position tracking. In such systems, positions
are found by integrating, over time, the signals of the sensors as well as any signal errors. As
a result, position errors accumulate. Inertial systems are most effective in sensing applications
involving relative motion”. [31]
“Inertial sensors are completely passive, requiring no external devices or targets, however, the
drift rates in portable strapdown configurations are too great for practical use.” [29]
And finally, when referring to inertial tracking [14] states: “the frequent recalibration of the
system, i.e. with a compass is necessary.”
Another solution to overcome the imperfections in inertial sensing was gesture recognition
(relative tracking), which did not require re-calibration or extra sensors and fitted better with
the time limits imposed on the project.

7.2.2 Gesture Recognition

Gesture recognition can be used to sense physical intentions made by a user. “Gesture
recognition offers a natural and intuitive way for users to input information” [6]. There are
numerous different ways that sensors can be positioned on the human body to monitor
movements made by the user, particularly their heads [19] and hands [7].
A commercial product that uses gesture recognition and inertial sensing is the GyroMouse [13]
which, through specialised drivers, can interact with Windows applications and is marketed as
working especially well with PowerPoint presentations. Benbasat [7] has used a Hidden Markov
Model based approach to perform gesture recognition. Gesture recognition is a recognised
technique in inertial devices. Verplaetse describes one such approach: a “method for estimating
motion and position is to use a Kalman filter state-estimation algorithm. Once the time-dependent
motions and positions of the system are estimated, a pattern recognition scheme such as a
neural network, hidden Markov model, or matched filter may be performed with that motion
data,” [31].
A form of gesture recognition, through thresholding the outputs from the sensors, has been
developed for this project. This solution has been designed to overcome the inertial drift and
noise that causes problems with absolute tracking. The output from the sensors is an angular
velocity that is directly proportional to the speed that the user has rotated the device. Using a


clutching method (for example, an activation button), the outputs from the sensors can be
separated into those made intentionally by the user and those that are general hand
movements. Thresholding is used to overcome gyroscope noise and small movements made
by the user, either from shaking or from rotational components made unintentionally on other axes.
In his thesis, Benbasat [7] states about his project, “a peak size threshold is used, to ignore
gestures that are caused either by the acceleration sensitivity of the gyroscopes, or by
misalignment of the sensors,” and then goes on to say in conclusion, “visual inspection
suggests that it would be possible to collect interesting information from that data stream,
certainly the presence of gestures and the number of peaks. However, it seems there is not
enough entropy in the stream for our current algorithm, which produced meaningless output. A
simpler scheme looking at pair-wise differences between data points could be successful in
this case and is left as possible future work.” It is understood that the phrase ‘looking at
pair-wise differences between data-points’ in his statement refers to the same idea being
considered in this project. The solution is described graphically in the following sections.

7.3 Design Considerations

7.3.1 Selecting Threshold Levels

Thresholding is used on the output of each sensor for the following reasons (numbers refer to
Figure 28):

• To eliminate noise from the sensors (1)

• To eliminate shake (2)

• To eliminate unwanted components of a movement in other axes (3). In the example in
Figure 28 a gesture was only intended on the X-axis but there is a small component in
the Z-axis that the user accidentally made. Selecting a threshold too low would mean
this would not be achieved, and selecting a threshold too high would mean low sensitivity.
The threshold is selectable in software depending on the preferences of the user. An output
that breaks the upper or lower threshold is considered to form part of a gesture. This will be
discussed in the next section.
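The basic thresholding decision can be sketched as follows (Python for illustration; the calibration level of 128 and threshold of 25 are assumed example values):

```python
def classify(sample, cal_level=128, threshold=25):
    """Label a raw sensor byte relative to the calibration level.
    Values inside the dead band are treated as noise or shake and ignored."""
    if sample >= cal_level + threshold:
        return "positive gesture component"
    if sample <= cal_level - threshold:
        return "negative gesture component"
    return "ignored"

print(classify(130))  # "ignored" - within the dead band
print(classify(200))  # "positive gesture component"
```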


[Plot of sample X, Y and Z axis outputs against time, annotated with (1) sensor noise,
(2) shake, (3) an unwanted Z-axis component, and the minimum threshold required to
eliminate the noise and unwanted components in this example]

Figure 28: How Thresholding is used to detect a gesture

7.3.2 Normal Operation Mode

Two modes of operation were implemented to suit the preferences of different users: normal
and auto-damping. Both modes are designed to be intuitive to use. Normal mode should be
considered as the user setting an object in motion in a frictionless environment and then
stopping it by counteracting this rotation. ‘Auto-damping’ mode should be considered as
setting the object in motion in an environment where friction exists and the object brings itself
to a halt. ‘Auto-damping’ will be covered in the following section.
The following explanation of ‘normal mode’ shall consider the input from a single gyroscope.
The same processing is applied to the three axes to give 3D rotation. All axes are processed
in quick succession, appearing simultaneous to the user.
Figure 29 shows how a user makes a gesture in a single axis.
1. The user rotates the device in one direction. The object on the screen rotates with an
angular velocity proportional to the amplitude of the input, up to the peak.
2. The user holds the device still and the 3D object continues to rotate at a velocity
proportional to the peak of the input
3. The user returns the device to the initial position and the 3D object stops rotating




Figure 29: Definition of a Gesture

The output from a single gyroscope corresponding to a gesture of this nature is sketched in
Figure 30. This is considered to be a ‘positive gesture’ because phase 1 is positive. If phase
one is negative the input is mirrored and the same processing technique is applied. Figure 30
also explains the different parts of a gesture.
A sample output for a single axis is shown in Figure 31. It can be seen how the magnitude of
the output rises when there is a threshold-breaking peak in the input, and how the output
remains constant until the input breaks the other threshold. If two peaks consecutively break
the same threshold, the second is ignored and the user has to rotate the device in the
opposite direction to counteract the first peak. The output falls proportionally with the peak of
the second phase of the gesture: if the magnitude of the second peak is smaller than the peak
in the first phase, then when a peak is detected in the second phase the output is
automatically returned to zero.


[Sketch of a positive gesture for a single axis, divided into stages A, G1, B, P1, C, G2, D
and P2 across phases 1 and 2, with the calibration level and the upper and lower
thresholds marked]

Stage  Input                           Corresponding Output
A      X_Val(t) < Upper Threshold      No output
G1     X_Val(t) >= Upper Threshold     X_Val(t) − Upper Threshold
B      X_Val(t) > X_Val(t−1)           Rises proportionally with X_Val(t)
P1     X_Val(t) <= X_Val(t−1)          X_Val(t−1) = X_Out_max
C      –                               X_Out_max
G2     X_Val(t) < Lower Threshold      X_Out_max − (Lower Threshold − X_Val(t))
D      X_Val(t) <= X_Val(t−1)          Falls proportionally with X_Val(t)
P2     X_Val(t) > X_Val(t−1)           No output

Figure 30: Process of Recognising a Gesture





[Graph of a sample input and the corresponding output in normal mode, plotted against
time over 7 seconds]

Figure 31: Output in Normal Mode


7.3.3 ‘Auto-damping’ Mode

Auto damping mode was developed to offer an alternative intuitive mode of use to normal
mode. The idea behind this mode is being able to rotate the object on the screen by making a
series of ‘nudges’ with the device. The starting angular velocity of the 3D object is proportional
to the amplitude of the gesture made by the user. The rotation is automatically damped so that
it comes to rest in a specified time, much as if the object were in an environment where
friction existed. The damping duration, Dd, is specified by the user and is the period of
time in which the cube comes to rest (Figure 32)

[Sketch of the inverted-square output curve, falling from its maximum to zero over the
damping duration, Dd]

Figure 32: ‘Inverted Square’ Output used in ‘Auto-damping’ mode

Output(t) = Max − (λ×t)², where Max is proportional to the amplitude of the gesture

and λ = √(Max)/Dd, so that Output(Dd) = 0
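Interpreting λ as √(Max)/Dd (so that the output reaches zero at exactly t = Dd), the curve can be checked numerically with a short sketch:

```python
import math

def damped_output(t, max_out, damping_duration):
    """Inverted-square decay: Max - (lambda*t)^2, with lambda = sqrt(Max)/Dd,
    clipped at zero once the damping duration has elapsed."""
    lam = math.sqrt(max_out) / damping_duration
    return max(max_out - (lam * t) ** 2, 0.0)

# A gesture with peak 200 damped over 2 seconds:
print(damped_output(0.0, 200, 2.0))  # 200.0 at the moment of the gesture
print(damped_output(2.0, 200, 2.0))  # 0.0 when the damping duration elapses
```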

Figure 33 is an example of an input and corresponding output in auto-damping mode.



[Graph of an input gesture with peak A(max) and the corresponding auto-damped output
decaying to zero; other peaks are ignored while there is an output]

Figure 33: Output in ‘Auto damping’ mode

7.3.4 User Options

As well as the different modes the user can select, there are some settings the user can
change. It was hoped that performing usability tests would find optimised values for these
settings.

Sensitivity: The upper and lower thresholds are set with the calibration level as the reference
point. The calibration level is measured when the device is at rest and the user presses the
calibrate button.

Figure 34 shows an example of different threshold levels being used. If the threshold is ±50
there are 4 peaks, if the threshold is ±25 there are 9 peaks, and if the threshold is ±10 there
are 16 peaks. The significance of this is that noise is detected as a peak when the threshold is
low, and only large peaks are detected when the threshold is high. In Figure 34 a threshold of
±25 is approximately the correct level to distinguish between noise and the wanted signal.
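The dependence of the peak count on the threshold can be sketched with a simple counter. The sample values below are synthetic; in the software the calibration level would come from the calibrate step:

```python
def count_peaks(samples, cal_level, threshold):
    """Count excursions that break the +/- threshold band around the
    calibration level (each contiguous excursion counts once)."""
    peaks, outside = 0, False
    for s in samples:
        breaks = abs(s - cal_level) > threshold
        if breaks and not outside:
            peaks += 1
        outside = breaks
    return peaks

signal = [128, 131, 124, 128, 190, 128, 60, 128, 140, 128]  # synthetic data
print(count_peaks(signal, 128, 50))  # 2 - only the two large excursions
print(count_peaks(signal, 128, 10))  # 3 - the small excursion to 140 as well
```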



[Graph of a sample signal over 5 seconds with threshold lines drawn at ±50, ±25 and ±10]

Figure 34: Demonstration of the Effects of Different Thresholds

Amplification: The amplification is the factor, between 1 and 3, by which the output is
multiplied (illustrated in Figure 35).



[Graph showing an input peak A1 and the corresponding output peak A2 = 2×A1, with the
amplification set to 2]

Figure 35: Demonstration of Amplification Setting Set to 2x


Maximum Output: The maximum angular velocity at the output, as illustrated in Figure 36. The
maximum output is the magnitude at which the output is ‘clipped’. It allows the user to define
the maximum angular velocity at which they are comfortable for the 3D object to spin.


[Graph showing an input peak A1 and the corresponding output capped at the maximum
output setting of 100]

Figure 36: Demonstration of Maximum Output Setting

Time Out: This is the minimum time between two gestures, in milliseconds. This setting was
developed to overcome problems in normal mode where the user ‘corrected corrections’,
which led to unintended gestures.





[Graph showing a series of gestures separated by the minimum time-out period, Tout]

Figure 37: Demonstration of Time Out Setting


7.4 Usability Flow Chart

Load the software, then follow either the device path or the simulate path:

With the device connected:
1. Connect the device to COM1 and power the device
2. Enter preferences in the ‘Settings’ dialogue box
3. Place the device on the desktop and click the calibrate button in ‘Settings’
4. Press the button on the device to activate
5. Use the device to manipulate the cube on the screen
6. Press the button again to deactivate the device and stop movement

In simulate mode:
1. Opt to simulate input for testing purposes
2. Enter preferences in the ‘Settings’ dialogue box
3. Use the sliders to select simulated calibration levels and click calibrate
4. Press the Activate button to activate
5. Use the sliders to manipulate the cube on the screen

From either path the user can:
• Select Tests > Test 1 or 2 to perform a usability test: read the test instructions, press the
activate button to start the test and complete the objective; the elapsed time is displayed
• Select View > Log data to change to the Data Logging dialogue box: enter a filename and
directory and start the data logger; click Stop to stop data logging
• Select View > Settings to load the Settings dialogue box (when not currently logging data)

Figure 38: Usability Flow Chart


The activate button is a toggle switch: pressed once to activate and once to deactivate, rather
than a press-and-hold button. The reason for this is that it may be difficult to hold down
the button whilst using the device. The Ergonomics of Computing Devices (page 53, [10])
quotes Mackenzie (Mackenzie et al, 1991) on the trackball (which is similar to this
device) having a ‘drag lock’ feature, meaning that the user does not have to hold the button
down at the same time as dragging. It was found that the trackball “performed poorly for
dragging because it is awkward to hold the button down and move the ball at the same time.”


8 Data Acquisition, Storage and Processing Implementation

8.1 Aim
The aim of this part of the project is to develop a piece of software that handles all the
incoming data from the device connected to the serial port and converts it into meaningful
data that can be used to manipulate the object in 3D space. It should also log inputs and
outputs so they can be graphed.
The software was split into several sections. Splitting the project helped set milestones and
test individual sections of the software before moving onto the next to help reduce the
chances of encountering bugs.
1. Data Acquisition
2. Data Storage
3. Data Analysis
Microsoft Visual Basic was chosen to implement the software part of the project. Visual Basic
has the following features:

• Advanced COM port functionality

• Advanced debugging (for example, syntax problems are detected after each line of
code is entered)

• Intuitive Graphical User Interfaces can be produced

Visual Basic places emphasis on the graphical user interface. Each window in the GUI is
called a ‘form’ and behind each form is the code that is executed after a particular event,
whether this is a button press, timer timing-out or data is received on the serial port. This
software has ten forms. The code part of each form (rather than the graphical part) can
include functions. Functions have been made ‘public’ in this software, made possible due to
the relatively small size (approximately 2000 lines of code). For a larger project it may be
necessary to keep functions ‘private’ to a particular form to prevent forms working incorrectly
together (which is particularly important if there are multiple authors). All variables in this
project are also public, meaning that all forms and functions have access to them. Variables
are declared in the three Modules that accompany the forms to make up the whole Project.


8.2 Software Structure

There are five main forms that make up the software. They are listed here along with the main
tasks each form performs (the form names below are taken from Figure 39). Figure 39 shows
the position of each form in the GUI. The full code listing is given in Appendix H.

Main:

• Parent to all the other forms

• Performs data acquisition (section 8.3)

• Stores received data and outputs (section 8.4)

• Performs data processing on inputs (section 8.6)

Cube:

• Draws a rotatable 3D cube that provides visual representation of output (Section 9.3)

• Handles part of the usability test functionality (Figure 56)

• Displays state of device

Bars:

• Graphically represents inputs, outputs, threshold levels and maximum output (Fig. 47)

Settings:

• User defined settings and calibration (sections 9.4 and 8.5 respectively)

Records:

• Specify location to log data (section 8.3)

• User can start/stop logging

• Opens and closes file written to by Main.frm

There are also five accompanying forms that are not core to the software. These include the
Simulate_input form, which allows the user to simulate input if the device is not physically
connected (for testing purposes), and the test forms (Testx), which display the instructions for
the usability tests, set up the cube position and display the clock for tests, and display the time
in which each test was completed (Figure 56).


[Screen layout: the Testx forms (accessed from the toolbar), the Cube and Simulate_input
forms, the Bars form, and the Settings/Records forms (changeable via the toolbar) are
arranged around the Main parent form]

Figure 39: Software/GUI Structure


8.3 Data Acquisition

The following specification was decided upon:

• The software must be able to receive data from COM 1

• It must be able to detect and remove header bytes included in data (Figure 27).

• Put received data in ‘public’ variables

The following stages make up the portion of the code that receives the data from the COM
port. The code for receiving data from the COM port forms part of the Main form.
1. The MS Comm object has to be added to the main form as follows:
Project > Components > Controls > Microsoft Comm Control 6.0
2. Set up COM port
The following VB code was used to set up the COM port using the settings shown in
Figure 9.

MSComm1.RThreshold = 1 Receive event fired after every byte is received

MSComm1.InputLen = 1 Number of characters from input buffer
MSComm1.Settings = "9600,N,8,1" Com port settings
MSComm1.CommPort = 1 Sets Com 1 as MSComm1
MSComm1.PortOpen = True Com port opened
3. Receive Data
If MSComm1.CommEvent = comEvReceive Then
Data = MSComm1.Input
End if
4. Convert received data to Decimal
Decimal = Asc(Data)

Every byte of data received is stored into an input buffer. The next stage is to remove the
header bytes and store the x, y and z values into separate variables. Header byte number 2
can have one of two values, indicating the state of the activate button. The stages in removing
the headers are shown in Figure 41.


‘Input_buffer’ is an array of nine entries. This is because this is the smallest size needed to
‘find’ the data, as shown in Figure 40 below:

H2 X Y Z H1 H2 X Y Z

Figure 40: 9-byte Input buffer

The buffer index Endval starts at 0. For each byte received:

1. The byte received on COM1 is stored into ‘Data’
2. Data is converted from ASCII to a decimal value
3. The value is stored into Input_buffer[Endval]
4. If Endval ≥ 4 and Input_buffer[Endval−4] = H1 and Input_buffer[Endval−3] = H2, then:
   ValueX = Input_buffer[Endval−2], ValueY = Input_buffer[Endval−1],
   ValueZ = Input_buffer[Endval], and Button Activated is set to True if H2 is of type 1,
   otherwise False
5. If Endval = 8 then Endval is reset to 0, otherwise Endval = Endval + 1

Figure 41: Data Acquisition from RS232 Serial Port Flow Chart
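The same framing logic can be sketched in a few lines (Python for clarity; the actual implementation is the Visual Basic code in Appendix H). As discussed in section 6.8, a data triple that happened to contain the header values could in principle cause a false frame:

```python
H1, H2_PRESSED, H2_RELEASED = 255, 15, 16  # header bytes from Figure 27

def parse_stream(byte_stream):
    """Yield (x, y, z, button_pressed) tuples from the raw serial byte
    stream, resynchronising on the two header bytes."""
    buf = []
    for b in byte_stream:
        buf.append(b)
        if len(buf) >= 5 and buf[-5] == H1 and buf[-4] in (H2_PRESSED, H2_RELEASED):
            yield buf[-3], buf[-2], buf[-1], buf[-4] == H2_PRESSED
        if len(buf) > 9:       # mirror the 9-byte circular input buffer
            buf.pop(0)

stream = [255, 15, 100, 128, 130, 255, 16, 99, 128, 131]
print(list(parse_stream(stream)))
# [(100, 128, 130, True), (99, 128, 131, False)]
```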


8.4 Data Storage

It is a requirement of the software to:

• Record all inputs and outputs against a time stamp whenever new data is received

• User must be able to start and stop recording and select location for logged data

• Record states of different variables for debugging purposes

It was decided to store the data in a text file. Each time stamp occupies a line and a comma
separates entries. The file can be opened in Excel (as a comma-delimited text file), which can
be used to plot graphs.
[The Data Logging form, showing the directory (Dir1.Path) and filename (Filename.Text)
fields and the Start/Stop Logging button, which is green when logging and red when not]

Figure 42: Data Logging Form

The following commands were used to open a text file:

Store_filename = Dir1.Path & "\" & Filename.Text
Open Store_filename For Output As #1
To write to the file the following command was then used:
Print #1, " "
Column headings were written to file when the logging was started. Data was written to file
when new data was received. The number of records and elapsed time since recording
started is displayed. When the user stops the data logging the file is closed:
Close #1
The recorded data can then be opened in Excel for analysis. This is how results were
obtained for this report.
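The log format can be sketched as follows (Python for illustration; the column names are assumed examples, not taken from the actual software):

```python
import csv, io, time

def start_log(fileobj):
    """Write the column headings and return the writer plus a start time."""
    writer = csv.writer(fileobj)
    writer.writerow(["time_s", "x_in", "y_in", "z_in", "x_out", "y_out", "z_out"])
    return writer, time.monotonic()

log = io.StringIO()            # stands in for the user-selected text file
writer, t0 = start_log(log)
# One record, time-stamped relative to the start of logging:
writer.writerow([round(time.monotonic() - t0, 3), 130, 128, 127, 4, 0, 0])
print(log.getvalue().splitlines()[0])  # the comma-separated heading row
```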


8.5 Calibrating Device

The device has both an on/off power switch and an activate/deactivate switch. The reason
that the device is not switched off to deactivate it is so that the device can be calibrated. The
calibration process is illustrated in Figure 43:

1. Device powered up: the device is uncalibrated and deactivated
2. The device remains still and the user calibrates the device
3. For each axis (repeated for Y and Z):
   X_cal = X_val
   X_high_thresh = X_cal + sensitivity
   X_low_thresh = X_cal − sensitivity
4. The other user defined settings are stored (amplification, time-out, maximum output and
   damping duration)
5. The device is calibrated and can be activated; if the user changes the device settings,
   calibration is repeated

Figure 43: Calibration Process Flow Chart
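The per-axis calibration step can be sketched as follows (Python for illustration; the variable names follow the flow chart, and the sample values are assumed):

```python
def calibrate(rest_samples, sensitivity):
    """Derive the per-axis calibration level and thresholds while the device
    is at rest. rest_samples maps axis name -> raw byte value."""
    settings = {}
    for axis, value in rest_samples.items():
        settings[axis] = {
            "cal": value,                       # X_cal = X_val at rest
            "high_thresh": value + sensitivity, # X_high_thresh
            "low_thresh": value - sensitivity,  # X_low_thresh
        }
    return settings

print(calibrate({"x": 128, "y": 126, "z": 131}, 25)["x"])
# {'cal': 128, 'high_thresh': 153, 'low_thresh': 103}
```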

8.6 Data processing

This section shall cover the processing of the raw data received from the device and
producing an output that directly relates to angular velocity. The design described in Section
7.3 was implemented.
Figures 44 and 45 show flow diagrams of the gesture recognition algorithms. The full code
listing is included in Appendix H, which includes comments explaining commands and
variables in more detail.
As well as the details given in the program flow diagram, the following features also formed
part of the processing:

• The device must be activated for a gesture to be recognised.

• The time-out feature prevents the user making two consecutive gestures in less than
the specified ‘time-out’. A gesture is not recognised when this timer has not elapsed.

• Outputs are clipped so they do not cross zero


[Flow chart of normal mode: received data is stored into ValueX; if the first phase of the
gesture is negative the input is flipped about the calibration level (the ‘Flipped’ flag). Breaks
of the upper and lower thresholds are detected as G1 and G2, and turning points as P1 and
P2. During G1, X_int = X_val − Upper_threshold; during G2, X_int = X_int −
(Lower_threshold − X_val); when P2 is detected, X_int = 0 and the time-out timer is started.
X_int is inverted if the input was flipped, multiplied by the amplification, and limited by the
maximum output to give X_out.]

G1             Gesture Phase 1*
G2             Gesture Phase 2*
P1             Peak Phase 1*
P2             Peak Phase 2*
ValueX         Data received from COM1
X_val          Value of input after possible inversion
X_int          Intermediate value of processed input
X_out          Value passed to 3D cube; directly proportional to angular velocity of cube
Amplification  User defined amplification
Max_out        User defined maximum angular velocity
* Refer to Figure 30

Figure 44: Normal Mode Implementation Flow Chart
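The single-axis processing of Figure 30 can be sketched as a simplified state machine (Python for illustration; this is an interpretation of the algorithm, not the actual Visual Basic implementation, and details such as the time-out and flipping for negative gestures are omitted):

```python
def process_axis(samples, up=25, lo=-25):
    """Simplified single-axis normal-mode processing (per Figure 30).
    Samples are already centred on the calibration level; a positive gesture
    raises the output, a plateau holds it, and a counteracting negative
    movement lowers it back to zero."""
    outputs, peak, active, prev, out = [], 0, False, 0, 0
    for v in samples:
        if not active and v >= up:
            active = True                       # G1: upper threshold broken
        if active:
            if v >= up and v >= prev:
                peak = max(peak, v - up)        # B: output rises with input
                out = peak
            elif v <= lo:
                out = max(peak - (lo - v), 0)   # G2/D: output falls with input
                if v > prev:                    # P2: second-phase peak passed
                    out, active, peak = 0, False, 0
            else:
                out = peak                      # C: plateau holds X_out_max
        else:
            out = 0
        outputs.append(out)
        prev = v
    return outputs

print(process_axis([0, 30, 60, 40, 0, 0, -40, -50, -30, 0]))
# [0, 5, 35, 35, 35, 35, 20, 10, 0, 0]
```

The trace shows the output rising with the first peak, holding at its maximum while the device is still, falling as the counter-rotation breaks the lower threshold, and snapping to zero once the second-phase peak passes.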

[Flow diagram. As in Figure 44, samples are stored in ValueX, optionally flipped about the
calibration level to give X_val, and compared against the thresholds to detect G1 and P1.
While the gesture is rising, X_int = X_val - Upper_threshold as before. Once the peak P1 is
detected, a damping factor is calculated and the auto-damping function is called; this
function produces an output that decreases to zero after the damping duration has elapsed.
X_out = X_int × Amplification, limited to Max_output.]

Key: as in Figure 44.
* Refer to Figure 30

Figure 45: Auto-damping Mode Implementation Flow Chart
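The auto-damping behaviour, where the output decays to zero after the gesture peak, can be sketched as follows. The linear decay law and the names are assumptions made for illustration; the report specifies only that the output decreases to zero once the damping duration has elapsed:

```python
def damped_output(peak_out, elapsed_ms, damping_duration_ms):
    """Decay the output captured at the gesture peak down to zero.

    Assumes a linear decay; the output reaches exactly zero once the
    user-defined damping duration has elapsed.
    """
    if elapsed_ms >= damping_duration_ms:
        return 0.0
    return peak_out * (1.0 - elapsed_ms / damping_duration_ms)

print(damped_output(45.0, 0, 2440))     # gesture just peaked: full output
print(damped_output(45.0, 1220, 2440))  # halfway through the duration: 22.5
print(damped_output(45.0, 2440, 2440))  # duration elapsed: 0.0
```

This is what lets the user make a single gesture and have the rotation coast to a stop, rather than having to make a counteracting gesture as in normal mode.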

Implementing the Graphical User Interface

9 Implementing the Graphical User Interface

9.1 Introduction
A graphical user interface was developed to meet the following specification:

• Provide an intuitive way for the user to set their preferences for the device

• Graphically represent the inputs and outputs from the data processing algorithms

• Graphically represent the outputs using a rotatable object

• Provide a means for testing the effectiveness of the device

The GUI was split into three main parts, as shown in Figure 46:

• Graphical bars to show magnitudes of inputs and outputs

• 3D rotatable object

• Settings and other features

[Screenshot of the application, annotated with the graphical bars, the 3D rotatable cube and the settings area.]

Figure 46: The Graphical User Interface


9.2 Graphical Bars

It was decided to use bars to represent the magnitudes of the inputs and outputs. Figure 47
shows an annotated illustration of each of the three types of bar (input, process and output;
see Figure 46). The process bar graphically shows the input in relation to the thresholds.
The functions of the bars are annotated below:

[Annotated bars:

Input bar: scale 0 to 255 (ScaleWidth = 255); displays the single-byte value received on COM1.

Process bar: scale 0 to 255 (ScaleWidth = 255); annotated with the lower threshold, the
calibration value and the upper threshold.

Output bar: ScaleWidth = 300 with zero output at the centre (150); extends from
-(Maximum Output) on the left to +(Maximum Output) on the right.]

Figure 47: Details of GUI Bars

The bars were drawn in VB by drawing a filled box inside a picture box named Picture_box_1.
The first coordinate pair is the top-left corner of the bar, the second is the bottom-right
corner, the RGB value sets the bar colour, and the BF flags draw a box (B) and fill it (F):

Picture_box_1.Line (0, 0)-(Value, Picture_box_1.ScaleHeight), RGB(255, 255, 17), BF

Figure 48: Implementation of GUI Bars
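The output bar differs from the other two in that zero sits at the centre of its 300-unit scale, so negative outputs extend left of centre and positive outputs extend right. A sketch of that mapping (the function name is illustrative, not the report's):

```python
def output_bar_extent(value, max_out=45.0, scale_width=300):
    """Return the (left, right) extent of the output bar.

    Zero output is the bar centre; +max_out maps to the right edge and
    -max_out to the left edge.
    """
    centre = scale_width / 2
    half = (value / max_out) * centre  # signed width of the bar
    return (centre, centre + half) if half >= 0 else (centre + half, centre)

print(output_bar_extent(0))      # (150.0, 150.0): empty bar at the centre
print(output_bar_extent(45.0))   # (150.0, 300.0): full positive deflection
print(output_bar_extent(-22.5))  # (75.0, 150.0): half negative deflection
```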


9.3 3D Cube
It was an important part of the project to provide a way to visualise the output from the
device so that it could be tested. There were two options: link the program to a third-party
piece of software to control the movement of an object, or implement a 3D object in the VB
interface that could be directly controlled.
VRML (virtual reality modelling language) is a language used frequently on the Internet to
display 3D objects and 3D worlds. Popular VRML browsers include Blaxxun Contact [8] and
Cortona by ParallelGraphics [26]. Research was carried out into ways of interfacing either of
these browsers with the VB software by controlling movement programmatically. Messages
were posted on newsgroups including the official Blaxxun Contact site but none of the replies
led to the solution. Research into Cortona by Parallel Graphics led to a VRML browser window
that could be embedded into the VB application and controlled, however, the Cortona SDK
was required at a cost that could not be justified. After the alternative solution (below) had
been implemented, a paper was found that "introduced a DeviceSensor for grabbing arbitrary
input and a Camera node to control the scene view and implemented both constructs in Blaxxun
Contact" [3]. This is an area for future development.
It was decided to construct a simple 3D object in Visual Basic from first principles. A cube
with different-coloured sides was chosen so that the orientation of the cube could be
recognised from the coloured faces. The flow diagram shown in Figure 50 describes the stages
involved in drawing the cube.
[Diagram: cube with corners numbered 1 to 8 and the x and y axes marked.]

Face No   Colour      Corners
1         Red         1, 2, 3, 4
2         Turquoise   2, 3, 7, 6
3         Blue        6, 7, 8, 5
4         Orange      5, 8, 4, 1
5         Yellow      4, 8, 7, 3
6         Purple      5, 1, 2, 6

Figure 49: Numbering of Cube Faces and Corners


[Flow diagram, summarised as a loop:

1. Define the corners of a 3D cube in an array (Corner Coordinate).
2. Add perspective and store the result in a new array (Perspective Coordinate).
3. Calculate the normal to each face using the points defined in Corner Coordinate.
4. Order the faces from 1 to 6, 1 being the face whose normal has the largest Y-component.
5. Draw faces 3, 2 then 1 using the polygon function described in section 9.3.
6. Sample Z_out from the gesture recognition and rotate the Corner Coordinates proportionally.
7. Sample Y_out from the gesture recognition and rotate the Corner Coordinates proportionally.
8. Sample X_out from the gesture recognition and rotate the Corner Coordinates proportionally.
9. Pause for 10 ms and repeat from step 2.]

Figure 50: Flow Diagram of Cube Implementation

The cube is defined in 3D coordinates in an array, corner_coordinate; each element of the
array holds an x, y and z coordinate. The initial position has face 1 facing the user straight
on. To draw the cube, perspective is first added by multiplying the on-screen coordinates of
each corner by a factor between 0.9 and 1, in proportion to the corner's depth (its y
coordinate). Next, the normal of each face is calculated. The three faces whose normals have
the largest positive components towards the user (those most facing the user) are drawn in
reverse order, back to front. Discarding the remaining faces is known as back-face culling,
and drawing the visible faces back to front makes sure they overlap correctly.
To draw a polygon in VB, the Polygon function from the Win32 GDI library must first be declared (this is done in module 'cube'):
Public Declare Function Polygon Lib "gdi32" _
(ByVal hdc As Long, lpPoint As PointAPI, _
ByVal nCount As Long) As Long

Public Type PointAPI

X As Long
Z As Long
End Type

Dim PolyPts(3) As PointAPI 'array of points passed to the gdi32 Polygon call

The next stage is to enter the x and z coordinates of the four corners of the polygon into
PolyPts. The command to draw a four-sided polygon is:
Polygon Cube.hdc, PolyPts(0), 4
This call is repeated for each of the three polygons that are drawn (only three faces of the
cube are ever visible at once).
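The normal calculation and face ordering can be sketched in Python. The corner numbering and face windings below are chosen for the example (they do not follow the report's Figure 49 numbering), and +Y is taken as the axis toward the viewer, as in the report:

```python
def cross(u, v):
    # Cross product of two 3-vectors.
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def face_normal(corners, face):
    """Outward normal of a face, computed from its first three corners."""
    p0, p1, p2 = (corners[i] for i in face[:3])
    u = tuple(p1[k] - p0[k] for k in range(3))
    v = tuple(p2[k] - p0[k] for k in range(3))
    return cross(u, v)

# A unit cube with +Y pointing toward the viewer (illustrative numbering).
corners = [(-1, -1, -1), (1, -1, -1), (1, -1, 1), (-1, -1, 1),
           (-1, 1, -1), (1, 1, -1), (1, 1, 1), (-1, 1, 1)]
faces = {"front": (7, 6, 5, 4), "back": (0, 1, 2, 3),
         "top": (3, 2, 6, 7), "bottom": (0, 4, 5, 1),
         "right": (1, 5, 6, 2), "left": (0, 3, 7, 4)}

# Sort faces by the Y component of their normals; the last three are the
# ones most facing the viewer, and painting them in this order draws
# back-to-front so nearer faces land on top.
order = sorted(faces, key=lambda f: face_normal(corners, faces[f])[1])
visible = order[-3:]
print(visible[-1])  # -> front
```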


The cube can then be rotated using circle geometry. For example, to rotate by angle θ
(radians) around the x-axis, all corners of the cube are rotated by θ in the Y-Z plane using
the following equations (both right-hand sides use the coordinates from before the rotation):

Y_coordinate = (Y_coordinate × Cos θ) - (Z_coordinate × Sin θ)

Z_coordinate = (Y_coordinate × Sin θ) + (Z_coordinate × Cos θ)

The cube can then be drawn with its new coordinates as before by adding perspective,
performing back-face culling and drawing the polygons.
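These equations can be sketched in Python; note that both right-hand sides must use the coordinates from before the rotation, which is why the sketch computes into temporaries rather than updating in place:

```python
import math

def rotate_x(corners, theta):
    """Rotate each (x, y, z) corner by theta radians in the Y-Z plane."""
    rotated = []
    for x, y, z in corners:
        # Both new values are computed from the OLD y and z.
        y_new = y * math.cos(theta) - z * math.sin(theta)
        z_new = y * math.sin(theta) + z * math.cos(theta)
        rotated.append((x, y_new, z_new))
    return rotated

# A 90 degree rotation carries the +Y axis onto the +Z axis.
x, y, z = rotate_x([(0.0, 1.0, 0.0)], math.pi / 2)[0]
print(round(y, 9), round(z, 9))  # 0.0 1.0
```

Calling this every 10 ms with θ = ω × 0.01 (ω in radians per second) yields a constant angular velocity ω, which is how the timer-driven rotation described below behaves.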
The output from the gesture recognition algorithm directly drives the rotation of the cube.
A timer samples the output at an interval of 10 ms and rotates the cube proportionally,
giving an angular velocity of between 0 and 50°/s.


9.4 Settings
Other parts of the GUI include a settings dialogue box, a simulated-input dialogue box, a
data-logging dialogue box and functionality to perform some usability tests. The usability
tests and their results are discussed in section 12.3, and the optimised settings are
discussed in section 12.5.3.
The Settings dialogue box is shown in Figure 51 below. The settings are changed using
sliders; the value shown above each slider is updated as the slider is moved. The values of
the sliders are stored to corresponding variables when the Calibrate button is pressed
(section 8.5).

Figure 51: Settings dialogue box

The Simulate Input dialogue box allows the user to produce X, Y and Z inputs in the range
0-255 when the device is not connected, in order to test the software. Sliders are used, and
the inputs into the application are updated every 10 ms.


10 Packaging
The packaging for the device was made in conjunction with the Department's mechanical
workshop. There are two main components: the aluminium cube that holds the PCBs in position,
and the MDF (medium-density fibreboard) sphere that provides an ergonomic case for the user
to hold.
The cube was machined out of aluminium, chosen for its very high strength-to-weight ratio.
It was machined from a solid block to make a rigid structure; rigidity was important to make
sure the gyroscopes remained aligned. The dimensions of the cube were dictated by the size
of the PCBs and are given in the technical designs in Figure 70.
The packaging that holds the electronics is spherical and is made from laminated MDF sheets.
The MDF is CNC-machined into a sphere, with a cavity machined to house the aluminium cube
that, in turn, holds the electronics. The sphere is made in two halves that are bolted
together and clamp the cube in place. The on/off and activate buttons are fixed onto the
casing. Alternative materials to MDF included acrylic and high-impact polystyrene, but MDF
was lighter and cheaper. Another idea was to use a 100 mm diameter rigid plastic juggling
ball [24]; it was decided that holding the juggling ball while it was machined would be
problematic, so this idea was not used. To get around the problem of holding a spherical
object, the cavity was machined while the MDF was still in block form, and the spherical
sides were machined last. The MDF could be finished in a plastic coating to give a
professional finish.
Figure 53 shows how the packaging fits together, including the proposed components to make
the device wireless (see section 11).

Figure 52: Photograph of Packaged Device


[Exploded view, top to bottom: MDF upper half of sphere; gyroscope circuit; battery holder;
radio transmitter; two further gyroscope circuits; aluminium cube to mount the PCBs; power
on/off slider; microcontroller PCB; MDF lower half of sphere.]
Figure 53: Exploded Diagram of Packaging

Radio Link

11 Radio Link
The initial specification stated that the device should be wireless. The prototype, up to
this point, used a wire to provide power to the device. The PIC was mounted on a PCB outside
of the device: the analogue outputs from the sensors/amplifiers and the button were connected
to the PIC via a 6-core wire (supply ×2, sensors ×3 and button ×1). Adding a single-channel
radio link would have affected the schematic of the prototype considerably. Data would need
to be transmitted over a single-channel, digital (for reliability) data link. The PIC would
need mounting inside the device along with the transmitter, and the device would need to be
powered by a battery. The packaging had been designed to accommodate these changes
(Figure 53). The radio receiver would need to be mounted with the Max232 PC interface.
There are radio transmitter/receiver pairs available that are compact, have low power
consumption, operate at RS232 data rates and include built-in error correction. Such a device
could be connected to the asynchronous data stream from the PIC (unchanged from the current
implementation) and fed straight into the Max232 serial interface. Because the radio link
could be added to the prototype without changing its functionality, an attempt to implement
it was made at this stage, once the rest of the prototype was functioning.
The LPRS Easy Radio 433 MHz radio transmitter/receiver pair was selected. The specification
and pinout for the device are available at [17], and an antenna design sheet is available
from RadioMetrix [27]. The data sheet was followed and a PCB was designed for each of the
transmitter and receiver.
The implementation of the radio link had limited success. It was possible to verify that the
transmitter was operating, using a spectrum analyser set to 433.92 MHz. The receiver,
however, did not work. Testing found that it was drawing less current than specified and,
despite the use of a high-powered transmitter, the received signal strength indicator (RSSI)
pin gave no output. The aerial was replaced and the power supplies were checked for problems
such as excessive voltage ripple. A second receiver was ordered and tested, and shortly
after being connected its RSSI pin voltage died (as the first had done). Due to time
limitations and the cost of the malfunctioning receivers, it was decided not to include the
wireless link in the prototype. The reason for the malfunctions is unknown: perhaps the
receivers were damaged by static, or were faulty when supplied.


12 Testing

12.1 Functionality Testing: Method

The functionality of the device was defined as how well the prototype worked in technical
terms. This meant testing for transmission errors and software bugs rather than testing the
design, which is evaluated under usability in the following section.
The project was split into fragments (Figure 4), which made it possible to test the different
sections of the project as and when they were completed. The following areas were tested:

Sensors and amplifiers
• Areas to test: the output must be reliable; the output must not saturate but must use the full range available.
• Method of testing: analysing graphs of the output.

Microcontroller
• Areas to test: the A/D must be reliable and accurate; the UART must be able to transmit data accurately and reliably.
• Method of testing: testing the A/D with a potentiometer and voltmeter; see section 6.8.

Radio Link
• Areas to test: the operating distance must be adequate; the link must be reliable and error free; performance degradation due to interference should be low.
• Method of testing: operating the device at different distances and observing behaviour; extensive user testing and studying graphs of the output.

RS232 Interface
• Areas to test: transmission errors should be low; data transfer must be reliable.
• Method of testing: sending known data and monitoring the received data in HyperTerminal.

Data Acquisition
• Areas to test: errors in received data must be detected; data must be received reliably.
• Method of testing: simulating data received on the COM port with known errors and monitoring the output.

Data Storage
• Areas to test: all required data must be stored.
• Method of testing: comparing recorded data with a qualitative description of the movements made.

Data Processing
• Areas to test: the data processing must be as specified in the design (section 7.3); the output should be predictable, reliable and error free.
• Method of testing: making graphs from logged data and comparing them with the design; analysing stored variable values to debug the code.

Graphical User Interface
• Areas to test: there should be no broken features; there should be no way the user can make the program unstable; the program must not allow the user to make invalid selections.
• Method of testing: performing extensive user testing; making sure that controls are disabled to prevent users making invalid selections.

Figure 54: Methods of Testing Used to Test Each Area of the Prototype


12.2 Functionality Testing: Results

After extensive use of the prototype and analysis of the results from the data logging, the
functionality of the device could be evaluated. The majority of problems had been fixed
during development, so few technical issues were expected at the final stage of testing. The
results of testing are summarised in the table in Figure 55.

Packaging
• Positive: functional and of good quality.
• Negative: the PCBs dictated the size of the cube-shaped frame, which in turn dictated the size of the sphere; the sphere is larger than ideal.

Sensors and Amplifiers
• Positive: reliable output; uses the full output range.
• Negative: noisy signal (~12.5 dB SNR).

Microcontroller
• Positive: reliable; carried out A/D and UART faultlessly; no noticeable errors were received at the PC.
• Negative: none.

Radio Link
• Positive: the radio transmitter operated (determined using a spectrum analyser).
• Negative: the radio receiver did not function correctly despite testing with different antennas and a powerful transmitter.

RS232 Interface
• Positive: reliable operation; no noticeable errors were received at the PC.
• Negative: it was necessary to properly ground the PSU for data transmission.

Data Acquisition
• Positive: reliable operation; no noticeable errors were received at the PC.
• Negative: none.

Data Storage
• Positive: any variable could be recorded; a 25-second test recording all data was 559 kB (115 kB for time, inputs and outputs only); results can be opened in Excel.
• Negative: none.

Data Processing
• The data processing was implemented as per the design and is evaluated in the following usability testing.

Graphical User Interface
• Positive: care was taken to make sure the user could not select invalid options; the GUI worked well and looked professional.
• Negative: it was not possible to interface with third-party 3D software to test the device for real applications.

Figure 55: Functionality Testing Conclusions


12.3 Usability Testing: Method

To test the usability of the device it was necessary to provide a way to compare its
performance with conventional ways of rotating an object on a screen.
The Ergonomics of Pointing Devices [10] gives details of 1-D and 2-D ergonomic tests that can
be carried out on a device to test usability. In particular it concentrates on Fitts' law and
the speed and accuracy with which a user can point, drag or draw using devices ranging from
mice to data gloves. There was, however, no data available for 3D devices: "while a number of
3D devices exist, we could find no published research that establishes Fitt's law for these
types of devices or for pointing (in general) in 3D virtual reality environments." [10]
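For reference, Fitts' law in its commonly used Shannon formulation predicts the movement time MT for a pointing task over distance D to a target of width W as MT = a + b·log2(D/W + 1), where a and b are constants fitted per device by regression. A one-function sketch (the constant values below are purely illustrative):

```python
import math

def fitts_mt(a, b, distance, width):
    """Predicted movement time: MT = a + b * log2(D / W + 1)."""
    return a + b * math.log2(distance / width + 1)

# With illustrative constants a = 0.1 s and b = 0.2 s/bit, a task with an
# index of difficulty of 3 bits (D/W = 7) predicts roughly 0.7 s.
print(fitts_mt(0.1, 0.2, 7.0, 1.0))
```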
Tests were developed along the lines of the 2D tests described in [10] to determine how
quickly the user could rotate an object to a certain orientation, to a specified accuracy. It
was necessary to compare results using the prototype with results from a standard input
device; to do this, the same objectives were tested using a mouse to rotate an object in
VRML. First, the implementation of the timed tests for the prototype is documented.

Because the cube was implemented from basic principles, there was total control over the
orientation of the cube at all times, and the direction of each face could be determined
using the normals worked out for the back-face culling. It was therefore possible to
implement a 3D rotation task, which is described in Figure 21. There are two tests, one more
difficult than the other. The harder test requires the user to perform a more complex
rotation and to align the cube into position to a higher accuracy.
The test requires the user to align the cube so that the red side faces them. This is done by
measuring the Y component of the normal to the red face, which is greatest when the red face
is facing the user. The user is required to bring the red face to within a certain tolerance
of facing straight forwards; the tolerance is tighter for the harder test.
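The alignment check can be sketched in Python. The angle between the red face's normal and the viewing axis (+Y, per the report's convention) follows from the Y component of the normal; the function names and the use of an angular tolerance are illustrative assumptions:

```python
import math

def alignment_error_deg(normal):
    """Angle (degrees) between a face normal and the +Y viewing axis."""
    nx, ny, nz = normal
    magnitude = math.sqrt(nx * nx + ny * ny + nz * nz)
    return math.degrees(math.acos(ny / magnitude))

def red_face_aligned(normal, tolerance_deg):
    # The clock stops once the error is within tolerance (and the cube is
    # stationary, which this sketch does not model).
    return alignment_error_deg(normal) <= tolerance_deg

print(red_face_aligned((0.0, 1.0, 0.0), 10.0))  # facing straight on: True
print(red_face_aligned((1.0, 1.0, 0.0), 10.0))  # 45 degrees off: False
```

A tighter tolerance for the harder test simply means passing a smaller `tolerance_deg`.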
These tests gave quantitative performance results on speed, accuracy and learning time.
Qualitative results were also important for measuring fatigue and obtaining recommendations.
The testers were asked to comment on how they rated the device against the ergonomical
criteria given at the start of the project (section 2.2) using mean opinion scoring. These results
are given in Figure 59.


[Flow diagram, summarised as a sequence:

1. The user opts to start the test from the toolbar; having read the instructions, they click Next.
2. The program deactivates the device and automatically rotates the cube to the start position.
3. The user presses the activate button; a clock automatically starts and is displayed.
4. The user rotates the cube to the position detailed in the instructions and brings the cube to a halt.
5. The software knows the orientation of the cube and when it is lined up accurately enough; the clock is stopped when the position of the cube meets the criteria and the cube is stationary.
6. The time in which the task was completed is displayed, and the user can either repeat the test (Yes) or continue to use the device (No).]

Figure 56: Usability Testing Flow Diagram


12.4 Usability Testing: Results

The two tests were used to quantitatively judge how usable the device was. A random sample
of people was selected to perform the test. They were all given a quick demonstration, a brief
lesson on how the device worked and then had a five-minute warm-up time to become familiar
with the device. They were able to choose the settings they felt worked best for them; the
settings that each user chose are shown in Figure 57. Half of the testers opted to use
auto-damping mode. The optimised values are the mean of each user's preferred
setting (not including exceptions).
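As a quick check on this averaging rule, taking the five damping-duration values in Figure 57 (the row with dashes for the five users who did not use auto-damping) reproduces the optimised value of 2440:

```python
# Preferred damping durations of the five users who chose auto-damping mode.
damping_durations = [1700, 3000, 2500, 2000, 3000]
optimised = sum(damping_durations) / len(damping_durations)
print(optimised)  # 2440.0
```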

Setting                  User preferred values (users 1 to 10)                  Optimised value
Amplification            2    2    2    1.5  2    2    2    3    2    2        2
Sensitivity (threshold)  15   23   20   15   30   25   22   32   30   25       24
Maximum output           45   45   45   45   45   45   45   45   45   45       45
Damping duration         -    -    1700 3000 2500 -    2000 -    -    3000     2440
Time-out                 250  250  250  250  250  250  250  250  250  250      250

Figure 57: Optimised Settings for Device as a Result of Usability Testing


[Bar chart of times to complete Test 1 and Test 2 with the prototype; vertical axis: Time (s), scale to 30 s.]
Figure 58: Usability Test Results for Prototype Device

Test 2 was designed to be significantly more difficult than Test 1, and the results show
this: every user took longer to complete it (in some cases, three times as long).


As a comparison, a similar test was undertaken in a VRML browser. The users were presented
with a 3D cube positioned in the same way and, after the same five-minute warm-up period,
were asked to perform the same objective as in the device testing. It was not possible to
have the same level of control over the VRML browser, so these tests were not automated and a
stopwatch was used instead. The results (shown in Figure 59) are discussed in the following
section.

[Bar chart of times to complete Test 1 and Test 2 with the mouse; vertical axis: Time (s), scale to 9 s.]
Figure 58ii: Usability Test Results for Mouse and VRML

The testers were also asked to give a verbal response to the device. Some comments were
favourable and others were constructive criticism. The comments included:

• "It's hard to get used to and at first I got slightly confused... I think there is a certain knack to it"

• "It seemed responsive"

• "The button was useful to stop the cube rotating"

• "I had to think quite hard at what movement I needed to make although I suppose it was"

• "Perhaps it would be easier with a more descriptive object.. part of the problem was knowing which face was where"
Similar comments are also made in the literature about comparable devices (see section

The following table (Figure 59) shows the MOS (Mean Opinion Score) for the different
ergonomic aspects of the project, as decided by the test team. Each term was explained and
the testers were asked to rate each ergonomic factor from 1 to 5 (a higher score is
preferable); the results are the average of the users' opinions. The users were also asked to
rate how well the mouse performed ergonomically in a 3D environment, to act as a comparison.


Ergonomical Factor (Mean Opinion Score)       Prototype Device    Mouse
Ease of Learning
Device persistence and acquisition

Figure 59: Mean Opinion Score for Device on Ergonomical Attributes

12.5 Testing Conclusion

12.5.1 Functionality

The functionality tests validated the design of the prototype. Fragmenting the project into
sections helped to identify problems: it was possible to test each module and verify correct
operation before the system was assembled. The sensors were able to sense all but the
slowest of rotations, and the output from the sensors utilised the full range of values
available during normal operation. Data was transferred successfully between the sensors and
the PC along a wire, and any errors that may have occurred did not affect the operation of
the device. A professional-looking graphical user interface was developed to present the
processed inertial-sensor data to the user. The graphical user interface formed the final
part of what turned out to be a very robust design. Two attempts at implementing a wireless
interface were unsuccessful because, for unknown reasons, the radio modules stopped working,
indicated by the sudden disappearance of the received signal strength indication on the
receiver. Despite the problems with the wireless link, the prototype was built, including the
manufacture of the packaging, and usability tests could be carried out on the design.

12.5.2 Usability (compared with a mouse)

A method described as ‘gesture recognition’ was used to turn the output from the sensors into
the rotation of a 3D cube. The gesture recognition had two main objectives: to overcome the
effects of noise and inertial drift that the gyroscopes produced and to make the device perform
well against the ergonomical criteria set at the start of the project. Thresholding was used to
eliminate background noise from the sensors. The use of thresholding in this way had a direct
effect on the usability of the device, and the results indicate that the device was not as
intuitive or as quick to use as a mouse for performing 3D rotation. The tests built into the
graphical user interface produced accurate, quantitative results that could be used to
compare the prototype device with a mouse. All ten users completed the simple rotation of a
3D object with the mouse in less than approximately half the time taken with the prototype.
These findings match those of Zhai [33], who, in his report, states, "With proper clutching
mechanism, it is conceivable to
implement an isometric device in position control mode or an isotonic device in rate control
mode. Interestingly, these two combinations tend to produce poor user performance. The
reason is quite simple: the self-centering mechanism in an isometric device facilitates the
‘start, speed-up, maintain speed, slow-down and stop’ cycle in rate control. The later half of
the cycle is somewhat automatic with the self-centering mechanism in isometric devices. With
a free moving device, one has to deliberately return to the null position.”
It is not felt, however, that this is conclusive evidence that this prototype device is significantly
less usable than a mouse for the simple fact all testers had experience at using a mouse that
ran into decades. The users only had approximately 15 minutes of contact time with the
device and as Zhai comments, “Rate control is an acquired skill. A user typically takes tens of
minutes, a significant duration for learning computer interaction tasks, to gain controllability of
isometric rate control devices. It may take hours of practice to approach the level of isotonic
position speed.”[33]
To test the device accurately against a mouse, tests would have to be carried out on users
who had no experience with either device. Performing such a test would be futile because an
overwhelming portion of people this device would be likely to attract would have experience
using a mouse. For such a 3D object controller to become marketable it the device would
need to allow the user to perform a task ‘better’ than a mouse having had a lot less exposure.
The alternative to finding a user with little experience using a mouse is a user with a lot of
experience with the device. The author performed a lot of tests using the device during
development. It can be claimed that the author was significantly better at using the device
than other users in terms of speed and accuracy of performing a task. As Zhai [33]
commented, using such a device is an acquired skill and it is unclear if users would be
prepared to use such a device for every-day tasks. Testing found the majority of users were
happier using a mouse, finding the prototype more of an interesting toy than a useful tool.


12.5.3 Optimised Settings

Allowing users to decide upon the settings they found most appropriate produced optimised
values for the different settings implemented. Adjusting these settings had a heavy impact
on the usability of the device.
The sensitivity value was set depending on how still the user could keep their hand,
especially when it came to isolating a single axis so that components of an intentional
rotation did not appear on the other axes. The sensitivity was a compromise between having to
make unnecessarily large 'jerks' of the hand to break the threshold and producing these
unwanted components of rotation. At no point was the preferred threshold level low enough
that background noise broke the threshold.
The amplification at the output stage was selected at approximately the same level by all
users. This was a compromise between having to make sharp 'jerks' of the wrist to produce a
large output and having an output that saturated even for a gentle rotation.
The maximum output was largely left at the default by the majority of users. The default
value is the comfortable maximum angular velocity at which the output is capped. The maximum
angular velocity was a setting worth experimenting with, especially with a larger
amplification in normal mode. One particular example is setting the amplification high in
normal mode so that small gestures are amplified until they saturate the output of the
device; in the extreme case the output is either zero or saturated. None of the users
selected such settings, opting instead for settings where the output was analogous to the
speed of the gesture. This was a positive outcome for the device, because users preferred to
use a variable-rate output as per the main design of the device.
Half of the testers opted to use the auto-damped output mode. This mode of operation is
simpler to use than normal mode because there is no need to think about how to counteract a
rotation, so some users found it easier and completed the usability tests more quickly in it.
That said, users who were comfortable with normal mode generally performed the usability
tests more quickly than testers who used auto-damping mode. Most users left the damping
duration at the default value. There is certainly potential to develop this mode of
operation, in particular allowing the user to 'nudge' the 3D object in a different direction
to the one it is moving in (currently the user has to wait for the output to decay to zero
before any other movement in that axis is possible).
The time-out feature used in normal mode was not adjusted by any of the users. This feature
was implemented to prevent a problem that was found during development (see Figure 37). The
feature should be included, but there is no need for the user to alter the setting from the
default.

Project Conclusion

13 Project Conclusion

13.1 Meeting the Specification

An overall specification was set at the beginning of the project from the design brief and as a
result of initial research. The specification set the objectives that were followed throughout the
development process. The criteria are listed, in order of decreasing importance, in Figure 60
along with a rating out of five. The rating has been decided as a result of the testing.

Criteria Rating

The device must be operable in a 3D environment

The device must be ergonomically designed (see Figure 59)

The device must be robust and reliable during normal operation

The device must connect to an IBM compatible PC running MS Windows

The device must measure movement in 6 degrees of freedom

The device must be wireless

Figure 60: Rating of Prototype Against Specification

It is hard to give an overall rating for the project because the criteria in the
specification are ordered by importance. The most important criterion, that the device be
capable of manipulating a 3D object, was achieved. The device also scored well on being
robust and reliable. The areas where the project was not completed successfully were the
implementation of 6 degrees of freedom (only rotation was implemented) and the wireless
link, which was not included.
The project was, on the whole, successful, and there are some interesting results and
conclusions that can be made on inertial sensing. These are discussed in the following
section.


13.2 Discussion
Having completed the project, this section reflects on how well the project was executed,
summarises the significant findings, and outlines further work that is likely to enhance the
functionality of the prototype. It is hoped that the findings from this investigation into an
inertial device for the control of 3D graphics will prompt further work in this area,
particularly in the Department, and add to the existing knowledge base on inertial sensing.

At the start of the project it was important to justify the work so that a clear set of objectives could be defined. It is believed that there is a need for a device to control 3D graphics, a need that is becoming increasingly apparent as 3D graphics are incorporated into web pages. The mouse is the default choice for the control of 3D graphics because it is low cost, robust and has excellent ergonomic attributes. The mouse, however, operates best in a 2-dimensional environment: for 2D pointing and dragging tasks the average time is generally found to be less than 1 second (Table 4.9 [10]), whereas for 3D tasks the experiments in this project found that the time taken was, for the majority of users, over 6 seconds (Figure 59). Zhai [33] states: “As
three-dimensional (3D) graphics moves to the core of many mainstream computer systems
and applications, the search for usable input devices for 3D object manipulation becomes both
an academic inquiry and a practical concern. In the case of the 2D Graphical User Interface
(GUI), the computer mouse established itself very early and quickly replaced the light pen as
the de facto standard input. In the case of 3D interfaces, however, there is still not an obvious
winner suitable for all applications.”

There are no mainstream inertial devices available that have a widespread user base. The
problem that faces inertial devices is the overwhelming domination of the mouse on desktop
PCs. Any new device would have to be as cheap, robust and usable as a mouse. The cost of
inertial sensors and the necessary accompanying circuitry is gradually falling, partly due to
new MEMS fabrication techniques: “The use of inertial components to measure gestures has
increased due to the availability of cheap micro-electromechanical systems,”[11]. Inertial
sensing is, however, a complex subject because of the fundamental problems of inertial sensors; this increases the cost and, unless a solution overcomes those problems completely, means the device will not be as robust or dependable as a mouse. Having to pick up the device is also a
disadvantage because it makes switching between the device and the mouse or keyboard slow. The final problem is the usability of a 3D input device that the user holds in their hand:
unless the device was very light, prolonged use would lead to fatigue. These disadvantages
might outweigh the intuitiveness and speed that a 3D inertial device could offer. To compete with a mouse the device would need to have considerable advantages, especially considering the versatility of the mouse and that the majority of the computing world is experienced at using one. Specialist applications (such as studying cuneiforms in museums) might be where there is a demand for such a device. Specialist markets might be able to justify the extra expense of purchasing a device, although the demand compared to the desktop PC market would be comparatively low, so the incentives for developers would be small. It is, however, worth developing the idea to see whether the performance of such a device can be proven; with the cost of inertial sensors dropping, it may become marketable in the future.

A prototype device was developed as part of an investigation into the application of inertial
sensors to the control of 3D graphics. Through books and substantial research on the Internet
it has been possible to propose many possible solutions to meet the design criteria. An industry expert was also contacted and provided interesting comments on what this project was trying to achieve. Before discussing the solution and further developments, two particular areas of the research shall be discussed because interesting conclusions can be drawn: the USB HID specification and the application of inertial sensors to human input devices.

USB connectivity was desirable for the device because of future compatibility with PCs and
because the device could be designed to be automatically supported by Windows without
additional drivers. The HID specification was released by the USB Implementers Forum to
allow hardware developers to build computer input devices that are able to interface directly
with applications that also have HID support. By adopting the specification, hardware
developers can ensure that their device will be compatible with existing and future software
that supports the specification. Research, including contact with an industry expert, found that the specification has not been adopted by much of the industry despite one huge advantage: application-specific drivers for each device would not have to be developed. For example, 3Dconnexion specify that their top-of-the-range SpaceMouse is supported by “100+ applications Catia, UG, Pro/ENGINEER, SolidWorks, SolidEdge, Inventor and 3Dstudio

among others,” [1]. If 3Dconnexion and the software developers had used the HID
specification, listing specific software packages would not be necessary. The problem could
be that software developers are reluctant to incorporate HID support properly; after all, it is up
to the device developers to make sure that their device can interface with software, “Since
each company created its own 3D-device interface, each had to work with many vendors to
incorporate support into CAD software, and not all of the vendors were willing to do this two or
three times to support all the devices available,” [28]. Alternatively, it could be the hardware
developers that are not demanding HID support from application developers. Either way, the HID specification has many advantages that would benefit developers and consumers by improving the compatibility between human input devices and applications, and hopefully developers will start to build human input devices with it.

This project also included a study into the application of accelerometers and gyroscopes to an
inertial input device, which would sense the movement of the user’s hand in all six degrees of
freedom. There are two ways of arranging inertial sensors to measure six degrees of freedom:
using three dual-axis accelerometers or three gyroscopes and three single axis
accelerometers. Gravity has severe implications for the first method because the accelerometers sense force, and therefore gravity needs to be cancelled out. The second
method (using gyroscopes and accelerometers) does not suffer from this because gyroscopes
are not affected by gravity, so the orientation of the sensor can be calculated and the
corresponding component of acceleration due to gravity sensed by the accelerometers can be
cancelled out. It was decided to use the latter method as the arrangement of the inertial
sensors. Due to time limitations, accelerometers were not used so the device was only able to
sense 3D rotation.
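The cancellation step described above can be sketched numerically. This is an illustrative example only, not code from the project: it assumes the pitch and roll angles have already been derived from the gyroscope outputs, and it ignores yaw, sensor noise and axis misalignment:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def gravity_vector(pitch, roll):
    """Component of gravity expected on each body axis for a given
    orientation (pitch and roll in radians, aerospace convention)."""
    gx = -G * math.sin(pitch)
    gy = G * math.cos(pitch) * math.sin(roll)
    gz = G * math.cos(pitch) * math.cos(roll)
    return (gx, gy, gz)

def linear_acceleration(accel_reading, pitch, roll):
    """Subtract the expected gravity component from a raw accelerometer
    reading, leaving only the acceleration due to motion."""
    g = gravity_vector(pitch, roll)
    return tuple(a - gi for a, gi in zip(accel_reading, g))

# A device lying flat and stationary reads 1 g on its z axis;
# after cancellation the residual motion acceleration is zero.
print(linear_acceleration((0.0, 0.0, 9.81), 0.0, 0.0))
```

With the orientation known from the gyroscopes, the same subtraction can be applied at any tilt, which is exactly what makes the gyroscope-plus-accelerometer arrangement attractive.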

There are significant problems, other than the effects of gravity, that will be encountered by any device that uses inertial sensors to measure its own orientation and position. These are caused by imperfections in the sensors’ outputs. Inertial sensors suffer from noise and also from inertial drift, because they do not measure their movement precisely (as if they were slipping). When calculating the absolute position and orientation of an object, the output from the sensors has to be integrated (twice for accelerometers, once for gyroscopes), which amplifies the imperfections in the sensor output. The result is that the device will become
increasingly disorientated with time. To overcome this problem the device must re-calibrate
periodically on fixed reference points. Methods include taking a reference on gravity and
magnetic north, vision processing to fix onto light sources in the local surrounding and using
two or more artificial sources of sonar.
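The way integration amplifies these imperfections can be demonstrated with a short numerical sketch (the bias and sample rate are arbitrary assumed values, not measurements from the prototype):

```python
# A stationary gyroscope should report 0 deg/s, but a small constant
# bias error survives in its output.  Integrating once to obtain an
# angle turns that bias into an error that grows linearly with time.
bias = 0.1          # assumed residual bias, degrees per second
sample_rate = 100   # assumed samples per second
dt = 1.0 / sample_rate

angle = 0.0
for step in range(60 * sample_rate):   # one minute, device not moving
    measured_rate = 0.0 + bias         # true rate is zero
    angle += measured_rate * dt        # single integration (gyroscope)

print(f"heading error after 1 minute: {angle:.1f} degrees")
# The error grows without bound, hence the need to re-calibrate
# periodically against a fixed external reference.
```

For accelerometers the position error is worse still, because the bias is integrated twice and the error grows with the square of time.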

Including the extra sensors and the complex data processing that is required to overcome the
problems with inertial sensing described previously was outside the time limitations imposed
on the project. A prototype was developed that used ‘gesture recognition’ to overcome these
problems. It uses thresholding to overcome noise, and orientation remains constant because the user returns the device to the ‘null’ position after each gesture. The output from the sensors is not integrated, preventing the inertial sensor inaccuracies from being amplified.
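The thresholding idea can be expressed in a few lines (an illustrative sketch; the null point and threshold values are assumptions, not the prototype’s calibration figures):

```python
NULL_POINT = 125   # assumed sensor output at rest (8-bit ADC midpoint)
THRESHOLD = 20     # assumed dead band either side of the null point

def gesture_output(sample):
    """Map a raw gyroscope sample to a rotation command.
    Readings inside the dead band are treated as noise and ignored,
    so the output is only non-zero for deliberate gestures."""
    offset = sample - NULL_POINT
    if abs(offset) <= THRESHOLD:
        return 0          # noise: no rotation
    return offset         # deliberate gesture: rotate in that direction

print(gesture_output(130))  # small wobble near the null point
print(gesture_output(180))  # clear gesture in one direction
print(gesture_output(60))   # clear gesture in the other direction
```

Because no integration takes place, a noisy sample can at worst cause a brief spurious rotation rather than a permanently accumulating orientation error.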

The gesture recognition based approach overcame the problems that would have been
encountered using direct mapping. The demonstrated downside of this method is that usability suffered as a result: testing showed that the device was slower than a mouse at rotating 3D objects. Particular usability problems included users not returning the device to the null position after making a gesture and then attempting to make another gesture, which was interpreted incorrectly because the device was misaligned. Also, some users did not find the device as intuitive as was hoped and found themselves having to concentrate on what movement to make to produce the desired rotation, especially when the device was already moving in another axis. In its current guise, the gesture recognition did not perform well against the mouse. There is scope for development of the gesture recognition design, however, and it is possible that gesture recognition could become as intuitive to use as direct mapping is expected to be. Improvements could include different operational modes (such as a linearly decreasing output) or matching outputs to complex mathematical models such as Hidden Markov Models, which have “had great success in the area of human gesture recognition.” [7].

Despite the ergonomic problems that have been discussed, the project successfully demonstrated a fully working, robust hardware system. The project proved the use of the MEMS Murata Gyrostar gyroscopes, produced a robust system that transmitted the output from the gyroscopes to the serial port on a PC using a PIC, and delivered a professional-looking and robust graphical user interface to test the device. The author benefited from the chance to learn Visual Basic, improve PIC programming skills, understand the USB interface, develop 3D graphics from basic principles, and research and gain first-hand experience of using inertial sensors.

13.3 Further Work

It is hoped that this project will prompt further development of the application of inertial sensors to the control of 3D graphics in the Department, as there are many ways in which this project could be continued:

• Gesture recognition algorithms could be developed to include different modes of

operation or complex algorithms that match specific patterns in the sensor
outputs to specific models

• Adding accelerometers to the prototype to measure translation. The PIC and the
serial interface have the capability to handle the extra complexity and data rate

• Solve the problems encountered with the radio transmitter and receiver module to
make the device wireless

• Add extra sensors to the prototype that are required for the device to recalibrate
electronically for ‘direct mapping’ mode

• Remove gyroscopes and implement 6-DOF using three dual-axis accelerometers.

The PIC and serial communications should be able to accommodate this.

• Attempt user-recalibration. The hardware and GUI could remain largely unchanged
and the gesture recognition algorithm could be replaced with data processing that
would probably require the use of Kalman filtering.
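To give a flavour of the data processing the last item implies, the sketch below uses a complementary filter, a common lightweight alternative to full Kalman filtering, to keep a tilt estimate bounded by blending the integrated gyroscope rate with the accelerometer’s gravity reference. The weighting, sample period and bias are assumed values for illustration only:

```python
ALPHA = 0.98   # assumed weighting: trust the gyroscope short-term
DT = 0.01      # assumed sample period, seconds

def update_tilt(angle, gyro_rate, accel_angle):
    """Blend the integrated gyroscope rate (smooth but drifting)
    with the accelerometer tilt angle (noisy but drift-free)."""
    return ALPHA * (angle + gyro_rate * DT) + (1 - ALPHA) * accel_angle

# A biased gyroscope reports 0.5 deg/s while the device is really still;
# the accelerometer reference keeps pulling the estimate back towards 0.
angle = 0.0
for _ in range(10000):   # 100 simulated seconds
    angle = update_tilt(angle, gyro_rate=0.5, accel_angle=0.0)
print(f"steady-state error: {angle:.3f} degrees")
```

Unlike pure integration, the error settles at a small constant value instead of growing without bound, which is the essential property any re-calibration scheme must provide.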


14 References

[1] 3Dconnexion (SpaceMouse)
[2] Abit (AT7-MAX2 motherboard)
[3] Altoff, F. et al. (2002) A Generic Approach for Interfacing VRML Browsers to Various
Input Devices and Creating Customizable 3D Applications. Technical University of
[4] ASCII-table
[5] Axelson, J. (2001) USB Complete (2nd Edition). Lakeview Research
[6] Baldis, S. F. (1997) Input Device Interface For 3-D Interaction. University of Washington
[7] Benbasat, A. Y. (2000) An Inertial Measurement Unit for User Interfaces. MIT
[8] Blaxxun Contact
[9] CAI Cosmo
[10] Douglas, S. A. (1997) The Ergonomics of Computer Pointing Devices. Springer-Verlag,
London
[11] Erikson, G. (2001) Application of Inertial Sensing to Handheld Terminals. Ericsson
Radio Systems AB, 164 80 Stockholm
[12] Farnell
[13] Gyration
[14] Henne, P. et al. (1999) MOVY – A Sourceless and Wireless Input Device for Real World
Interaction. German National Research Centre for Information Technology
[15] InterSense
[16] Logitech (3D Mouse & Head Tracker Technical Reference Manual, 1992, 620402-00 Rev B)
[17] Low Power Radio Solutions
[18] Macek, A.
[19] Maui Innovative (Miracle Mouse)
[20] Maxim (Max232 Data Sheet number 19-4323, 1999; Max495 Data Sheet number 19-0265, 1996)
[21] Microchip www.microchip.com (MoveCurs Demo; Product Selection Guide)
[22] Microsoft, USB and Game Devices, 2001
[23] Murata (Gyrostar Data Sheet SE42E2, 1999)
[24] Oddballs (Brian Dubé juggling ball)
[25] New Oxford Dictionary of English (2001) Oxford University Press
[26] ParallelGraphics – Cortona
[27] Radiometrix (Antenna Design Sheet)
[28] Sheerin, P. (2002) The Sheerin Report #6
[29] Suya You et al. (2000) Hybrid Inertial and Vision Tracking for Augmented Reality
Registration. University of Southern California
[30] USB Implementers Forum (2001) HID Usage Tables, Version 1.11; HID Descriptor Tool
[31] Verplaetse, C. (1996) Inertial Proprioceptive Devices: Self-motion-sensing Toys and Tools.
MIT Media Lab
[32] Woolley, S. I. et al. (2001) 3D Capture, Representation and Manipulation of Cuneiform
Tablets. SPIE, 0277-786X/01
[33] Zhai, S. (1998) User Performance in Relation to 3D Input Device Design. IBM Almaden
Research Centre

Appendix A: Implementing USB on a PIC 16C745

Section 4.2 introduced and summarised the implementation of USB on a PIC16C745. This appendix briefly describes the selection of the PIC16C745, how the USB mouse sample code was implemented, and how this code was modified to enumerate as a joystick.
Having decided to use the USB protocol to connect the device to the PC, the USB controller chip had to be chosen. The PIC16C745 was considered alongside some of the chips described in USB Complete [5] and was selected for the following reasons:

• Programmable in assembly (with which the author has experience)

• Widely supported family of microcontrollers

• Programmer available in EE department

• Re-programmable (UV erasable)

• Low cost (£15.12 [12])

• Provides HID example code that can be modified

Research found that there was limited support available for the 16C745 due to its relatively recent release in 2001. A project was found that used the PIC16C745 to develop a USB HID microphone [18]. This project gave details of the circuit built and the modifications made to the sample code provided by Microchip [21] to construct a microphone, and it was a useful reference during this project. The sample code is available from the Microchip website [21]. It contains 6 assembly files and one MPLAB project file that can be assembled and programmed onto the PIC to demonstrate the chip rotating the mouse cursor in circles on the screen.
The sample code included the following four files, as well as 22 associated files that were not modified:

• Usb_Ch9.asm Interface and core functions needed to enumerate the bus

• Descript.asm Device, configuration, interface, endpoint and string descriptors

• Hidclass.asm Hid class specification functions.

• Usb_main.asm Starting point for new applications


The sample mouse demonstration was built using MPLAB (loading the movecurs.pjt project file) into an Intel Hex file that was then loaded onto the microcontroller using the WPIC software and PIC programmer. The settings for the programmer were:

• Oscillator H4 (HS)*

• Watchdog OFF

• Powerup Timer OFF

• Code Protect OFF

* A 6 MHz high-speed crystal was used.
The circuit shown in Figure 63 was built to test the PIC16C745. Appendix E lists all the components used in the circuit.
The PIC16C745, loaded with the MoveCursor demonstration program, was connected to the USB port. The device immediately enumerated as a mouse and the cursor started to rotate around the point where the mouse had previously been. This proved that the circuit worked, so the code could now be modified to enumerate the PIC as a custom HID.
The HID Descriptor Tool [30] was used to develop the report descriptor for the device. It helps to create the report descriptor and will debug it before the hexadecimal values are entered into the PIC descriptor files. The report descriptor “defines the format and uses of the data that carries out the purpose of the device” [5]. The remaining configuration, device and interface descriptors were largely unchanged from the mouse example because of the similarities between a mouse and a joystick in terms of the type of data transferred. The following report descriptor functioned correctly for a two-axis joystick, although a three-axis joystick, when implemented, behaved erratically (negative values on one axis stopped working).


Report Descriptor
retlw 0x05
retlw 0x01 ; usage page (generic desktop) See HID Usage Tables [30]
retlw 0x09
retlw 0x04 ; usage (joystick)
retlw 0xA1
retlw 0x01 ; collection (application)
retlw 0x09
retlw 0x01 ; usage (pointer)
retlw 0xA1
retlw 0x00 ;collection (linked)
retlw 0x05
retlw 0x09 ; usage page (buttons)
retlw 0x19
retlw 0x01 ; usage minimum (1) Two-button joystick
retlw 0x29
retlw 0x02 ; usage maximum (2)
retlw 0x15
retlw 0x00 ; logical minimum (0) Logic value for buttons
retlw 0x25
retlw 0x01 ; logical maximum (1)
retlw 0x95
retlw 0x02 ; report count (2) Two bits..
retlw 0x75
retlw 0x01 ; report size (1) ..of single bit size
retlw 0x81
retlw 0x02 ; input (2 button bits)
retlw 0x95
retlw 0x01 ; report count (1)
retlw 0x75
retlw 0x06 ; report size (6) Six bits of padding
retlw 0x81
retlw 0x01 ; input (constant 6-bit padding)
retlw 0x05
retlw 0x01 ; usage page (generic desktop)
retlw 0x09
retlw 0x31 ; usage (Y) Two-axis joystick (0x31 is Y in the HID usage tables)
retlw 0x09
retlw 0x30 ; usage (X) (0x30 is X)
retlw 0x15
retlw 0x81 ; logical minimum (-127) 256 levels
retlw 0x25
retlw 0x7F ; logical maximum (127)
retlw 0x75
retlw 0x08 ; report size (8) Each report is eight bits
retlw 0x95
retlw 0x03 ; report count (3) There are three axis reports
retlw 0x81
retlw 0x02 ; input type(data, variable, absolute)
retlw 0xC0 ; end collection
retlw 0xC0 ; end collection
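For illustration, the input report produced by a descriptor of this shape is four bytes: one byte carrying the button bits plus constant padding, followed by one signed byte per axis. A hypothetical host-side sketch of unpacking such a report is shown below (a real application would receive the report through the operating system’s HID API rather than a raw byte string):

```python
import struct

def parse_report(report: bytes):
    """Unpack a 4-byte joystick input report: one byte holding the
    button bits (the remaining bits are constant padding), then three
    signed 8-bit axis values in the range -127..127."""
    buttons = report[0]
    x, y, z = struct.unpack_from("3b", report, 1)  # 'b' = signed char
    return {
        "button1": bool(buttons & 0x01),
        "button2": bool(buttons & 0x02),
        "axes": (x, y, z),
    }

# button 1 pressed, axes (+10, -20, 0); 0xEC is -20 as a signed byte
print(parse_report(bytes([0x01, 0x0A, 0xEC, 0x00])))
```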

Appendix B: Circuit Diagrams

Figure 61: Max232 Circuit Diagram

Figure 62: PIC16C774 Circuit Diagram


Figure 63: PIC16C745 Circuit Diagram

Figure 64: Max495 Pinout

Figure 65: Gyroscope and Amplifier Schematic

Gain = −(R1 + VR1) / R2


Component Values (see Figure 65):

Component Value
R1 9kΩ
R2 15kΩ
VR1 2kΩ
C1 0.1µF
C2 0.1µF

Figure 66: Transmitter Circuit Diagram

Figure 67: Receiver Circuit Diagram

Appendix C: Correspondence

Request sent on 21st November 2002 for information from Parallel Graphics
I'm currently building an inertial 3D input device as part of my university project. The device is USB HID
compatible and will come under the multi-axis controller group. Could you tell me if Cortona (PC
version) supports DirectInput and whether this device should work? I want the device to be able to
rotate the object (i.e. be used instead of the mouse in 'examine' mode). At the moment it doesn’t work and I am trying to work out why.
Thanks very much for your help

Response received 22nd November 2002 from Parallel Graphics

Thank you for the email. Your interest in ParallelGraphics products and technologies is sincerely
appreciated. ParallelGraphics Cortona VRML Client version 4.0 supports only standard mouse,
keyboard and two-button joystick as input devices. To provide support for any other input devices in
Cortona, an application, which will handle events of the device and control movements in the Cortona
3D window, should be developed. Such an application can be created with the Cortona Software Development Kit.
Best regards,
Mikhail Timofeev
ParallelGraphics Support Team


Request sent on 25th November 2002 for advice on HID support to Peter Sheerin
I am currently trying to develop a 3D input device for a PC using the USB HID specification. I have read
your article on Cadence that you wrote back in February and I hope you don't mind me contacting you.
The device I am making comes under group 8, multi axis controller, of the USB spec. Its an inertial
device that will send Rx, Ry and Rz data to the computer. The scope of my project is to build the device
but I need to interface it to some software to demonstrate it working. All it needs to do is rotate a cube
or something. I emailed Cortona and they say that I need to use their SDK to get my device to work. I
was thinking about grouping my device as a joystick instead, but I read on the MS website that there
might be problems as a joystick uses absolute position and I want to send angular velocity information.
Do you know of any software that might support my device for demonstration purposes or anyone else
who may be able to help me? Any help would be much appreciated.
Many thanks

Response Received 25th November 2002 from Peter Sheerin

I'm tickled pink to hear of someone else wanting to use the 6DOF-HID interface. If you continue on that
path, that will make the list include your device, the USB model of the SpaceBall, and future controllers
from 3DConnexion.
Unfortunately, the only software I've seen that uses that spec is an internal utility from 3Dconnexion.
But I wouldn't give up quite yet on using that spec, since in the long run, it will result in greater
compatibility--the joystick method is a kludge. There is a slight chance that the forthcoming CATIA
I've been meaning to pester the cortona folks again about proper HID support--and this note is a good
nudge. Asking you to use their propreitary SDK instead of an open standard just isn't right.
Also, I've had some interest in the past from SolidWorks developers, so with more than just
3Dconnexion asking for the HID interface, it might be possible to convince them to support it. I've also
seen references to a new glove controller that makes reference to HID, though I'm not sure which group
they're supporting. When I get back from the holidays, I'll track it down and test it.

Appendix D: Hardware CAD designs

Top View

Figure 68: Bottom Half of Spherical Case

Figure 69: Top Half of Spherical Case


Cube (identical faces)

Figure 70: Face of Aluminum Cube

Appendix E: Component List and Budget

The initial budget for this project was £100. Due to the cost of the Murata Gyrostar gyroscopes alone, this budget was exceeded. The approximate budget for this project was as follows:

Project Section | Major Components | Cost | Supplier / Reference
Packaging | Aluminium Cube | £0.00 |
 | MDF Sphere | £0.00 |
 | 'Activate' Button | £7.75 | Rapid Electronics 78-1704
Sensors | Murata Gyrostar (x3) | £99.00 | Farnell [12] 334-9317
 | Max495 op-amp | £6.24 | Farnell [12] 246-426
Microcontroller | PIC16C774 | £19.06 | Farnell [12] 315-4107
Radio Link | LPRS Easy Radio 433MHz (Receiver x2) | £60.50 | Rapid Electronics 43-0352, 43-0354
PC Interface | USB Complete [5] | £29.00 | 965081958
 | PIC16C745 | £15.12 | Farnell [12] 343-5188
 | Max232 IC | £3.68 |
Data Processing | None | £0.00 |
GUI | None | £0.00 |
Miscellaneous Components | (including electronic components and consumables) | |
Total (approximate) | | £300.35 |

Figure 71: Component List and Budget

Appendix F: Sample Data







[Chart: example inputs (X, Y and Z axes) and output plotted over a 20-second period]

Figure 72: Example Inputs and Output for ‘Normal Mode’



[Chart: example inputs (X, Y and Z axes) and output plotted over a 20-second period]

Figure 73: Example Inputs and Output for ‘Auto-damped Mode’

Appendix G: PIC code

; *
; Filename: 16C774-8.asm *
; Date: 5/3/03 *
; File Version: 8.0 *
; *
; Author: David Humphreys *
; Company: UoB *
; *
; Files required: none *
; *
; *
; Notes: *
; This program reads 3 values from analogue *
; pins and transmits them over the hardware *
; UART with 2 header bytes. The value of the header *
; byte changes depending if the activate button is *
; pressed *
; *
; *

list p=16c774 ; list directive to define processor

#include <> ; processor specific variable definitions

__CONFIG _CP_OFF & _WDT_OFF & _BODEN_OFF & _PWRTE_OFF & _HS_OSC ;configuration word

; Defining variables not declared in the processor include file
COUNT equ 0x22
TEMP equ 0x23
VALUEX equ 0x24
VALUEY equ 0x25
VALUEZ equ 0x26
ANALOGR equ 0x27
ADCHAN equ 0x28
BUTTON equ 0x29

; Startup instructions. This section comes from 16c745 template

ORG 0x000 ; processor reset vector

clrf PCLATH ; ensure page bits are cleared
goto main ; go to beginning of program

; Start of main program

main ; start main program

call Init_ports ; initialize all ports
call Init_serial ; initialize serial port



Loop bcf STATUS,RP0 ; select bank 0 (start of main acquisition loop)
movf PORTD,w ; read PORTD into working
movwf TEMP
btfsc TEMP,4 ; Checks activate button..
movlw 0x0f ; ..if pressed 15 moved to working
btfss TEMP,4
movlw 0x10 ; ..else 16 moved to working
movwf BUTTON ; stores the button header byte
movlw 0x41 ; channel 0 selected, fosc/8, a/d on: 01000001
movwf ADCHAN ; puts value into ADCHAN
call Analog
movf ANALOGR,w ; puts analog sub-routine result into working
movwf VALUEX ; stores into VALUEX

movlw 0x49 ; channel 1 selected, fosc/8, a/d on

movwf ADCHAN
call Analog
movf ANALOGR,w
movwf VALUEY ; stores result from channel 1 to VALUEY

movlw 0x51 ; channel 2 0 selected, fosc/8, a/d on

movwf ADCHAN
call Analog
movf ANALOGR,w
movwf VALUEZ ; stores result from channel 1 to VALUEZ

call Xmit ; calls Xmit to transmit data

goto Loop ; repeats data acquisition and transmission

;This section sets all ports.

Init_ports bcf STATUS,RP0 ; switch to bank 0
clrf PORTA ; clear ports A - E
clrf PORTB
clrf PORTC
clrf PORTD
clrf PORTE
bsf STATUS,RP0 ; switch to bank 1
movlw 0xff ; sets all of PORTA to inputs
movwf TRISA
movlw 0x00 ; sets PORTB to outputs
movwf TRISB
movlw 0x80 ; PORTC outputs apart from pin 7 which is the UART Rx
movwf TRISC
movlw 0x10 ; pin4 of PORTD is input for activate button
movwf TRISD
movlw 0xff ;PORTE set to inputs (not used)
movwf TRISE
movlw 0x3B ; sets up ADCON1
movwf ADCON1 ;left justified, ext. Vref+/AVss as reference, AN<0, 1, 2> analogue
return

; Sets up the UART

Init_serial bsf STATUS,RP0 ; select bank 1
movlw 0x19 ; high baud rate, 4mhz oscillator, 9600bps
movwf SPBRG
movlw 0x24 ;. brgh =1(high speed) and transmit =1 (enabled).
movwf TXSTA ; setup transmit status>
bcf STATUS,RP0 ; goto bank 0
movlw 0x90 ;Receive not used but must be implemented..
movwf RCSTA ; enable serial port (bit SPEN set to 1)
return


; Transmits data over UART and to LEDs connected to PORTB
Xmit bcf STATUS,RP0 ; select bank 0
call Transdelay ; waits 1ms for data to be transmitted
movlw 0xff ; moves 255 (header 1)to working
movwf TXREG ; moves to Transmit buffer
movwf PORTB ;moves to PORTB LEDs

call Transdelay
movf BUTTON,w ;second header. Value depends whether button pressed
movwf TXREG
movwf PORTB

call Transdelay
movf VALUEX,w ; Transmit VALUEX and output to LEDs
movwf TXREG
movwf PORTB

call Transdelay
movf VALUEY,w ; Transmit VALUEY and output to LEDs
movwf TXREG
movwf PORTB

call Transdelay
movf VALUEZ,w ; Transmit VALUEZ and output to LEDs
movwf TXREG
movwf PORTB
return

; Routine polled TXSTA bit 1 to see if the TSR register was empty. However polling TSR did not work
;so a timing loop has been implemented, based on the time to send 10 bits at 9600bps

Transdelay bcf STATUS,RP0 ;bank 0

;1000us delay needed in total
movlw 0xff
movwf COUNT
Loop3 decfsz COUNT,f
goto Loop3 ;765uS

movlw 0x54
movwf COUNT
Loop5 decfsz COUNT,f ;250uS
goto Loop5
return


; Routine to read values from a/d converter. p121 16c774 data sheet
Analog bcf STATUS,RP0 ; select bank 0
movf ADCHAN,w
movwf ADCON0 ;moves values of ADCHAN to ADCON0
movlw 0x0f ;waits ~50us
movwf COUNT
Loop2 decfsz COUNT,f
goto Loop2
bsf ADCON0,2 ; starts A/D conversion
Loop4 btfsc ADCON0,2
goto Loop4 ;loops until ADCON0 GO bit is clear, once conversion has taken place
movf ADRESH,w ;moves high byte of conversion to working
movwf ANALOGR ;stores result into ANALOGR
return

END ; directive 'end of program'
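The timing constants in the listing can be checked with a little arithmetic. With BRGH = 1 the PIC USART baud rate is Fosc / (16 × (SPBRG + 1)), and one asynchronous frame at 9600 bps (1 start + 8 data + 1 stop bit) takes just over a millisecond, which is what the ~1000 µs Transdelay loop allows for. A short verification sketch:

```python
fosc = 4_000_000          # 4 MHz oscillator, as noted in the listing
spbrg = 0x19              # value loaded into SPBRG (25 decimal)

# High-speed (BRGH = 1) asynchronous baud rate formula
baud = fosc / (16 * (spbrg + 1))
print(f"actual baud rate: {baud:.0f} bps")      # close to the 9600 target

# Time to shift out one frame: start bit + 8 data bits + stop bit
bits_per_frame = 1 + 8 + 1
frame_time_us = bits_per_frame / 9600 * 1e6
print(f"time per byte: {frame_time_us:.0f} us")
```

The 0.16% baud-rate error (9615 vs 9600 bps) is well within the tolerance of an asynchronous serial link.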

Appendix H: Visual Basic Code


Private Sub MDIForm_Load() 'entered when form is loaded

Bars.Show 'shows other 'child' forms
Record_form.Hide 'this form is not visible when loaded
Setup_parser 'calls these functions when loaded
Setup_device_monitor

End Sub

Public Sub Setup_parser() 'called when program is loaded

Activated_mode = False 'initial settings


X_val = 0
X_out = 0

Y_val = 0
Y_out = 0

Z_val = 0
Z_out = 0
End Sub

Public Sub Setup_device_monitor() 'called when program is loaded

MSComm1.RThreshold = 1 ' Rx event fired on every byte received

MSComm1.InputLen = 1 ' number of characters from input buffer
MSComm1.Settings = "9600,N,8,1" 'com port settings - 9600bps, no parity, 8 bit, 1 stop
MSComm1.CommPort = 1 ' sets com port one as mscomm1
MSComm1.PortOpen = True 'com port opened

Endval = 0 ' sets variables to zero on start up

Valuex = 125
Valuey = 125
Valuez = 125


Found_result = False ' sets found result to false

Calibrated = False
Settings_form.Uncalibrated_lock 'calls uncalibrated_lock function in settings_form
Started = False
Record_form.StartStop.Caption = "Start" 'start_stop button displays start
Record_form.Time_box.Caption = " " 'captions do not display anything
Record_form.Records_box = ""
Lasttime = 0

Previous_button_header = 16 'not pressed state

Simulate_input = False 'does not simulate input on startup
End Sub

Public Sub MSComm1_OnComm() 'function fired if data is received on com port 1

Dim Data As String 'declares variables for use in this function only
Dim Dec_data As Long

' If Rx Event then get data and process

If MSComm1.CommEvent = comEvReceive Then 'if the event that has happened at the com port is 'receive' then...
Data = MSComm1.Input ' Data = received byte

If Simulate_input = False Then 'simulated input has priority over the com port if enabled
Input_buffer(Endval) = Asc(Data) ' places a decimal conversion of the ascii symbol into input_buffer at next position
Update 'calls update
End If

End If
End Sub

Public Sub Update() 'update function, called when data received

Check_header 'calls check_header

If Endval = 8 Then 'if buffer reaches end then will reset

Endval = 0
If Found_result = False Then 'if header and value not detected endval incremented
Endval = Endval + 1
Found_result = False
End If
End If

End Sub

Public Sub Check_header() 'called by update

If Endval >= 4 Then 'endval goes from 0 to 4. 5 values in the input buffer is the minimum for 2 header and 3 data
'if the first and second of the previous 5 bytes are header bytes then...

If Input_buffer(Endval - 4) = 255 And (Input_buffer(Endval - 3) = 15 Or Input_buffer(Endval - 3) = 16) Then

If Input_buffer(Endval - 3) = 15 And Previous_button_header = 16 Then 'was not pressed, now pressed

If Activated_mode = True Then 'toggles activate mode

Activated_mode = False
Cube.Activated_caption.Caption = "Device Deactivated"
Else
If Calibrated = True Then 'only activates if calibrated
Activated_mode = True
Cube.Activated_caption.Caption = "Device Activated"
Start_test_timer = True 'starts timer for tests
End If
End If

End If


Previous_button_header = Input_buffer(Endval - 3) 'stores button header to compare to see if its changed

Valuex = Input_buffer(Endval - 2) 'takes the last 3 values and stored into valuex, y and z
Valuey = Input_buffer(Endval - 1)
Valuez = Input_buffer(Endval)
Store 'calls store

Endval = 0 'resets endval so points to start of array input_buffer

Found_result = True 'flags that a result has been found

End If

End If

End Sub

Public Sub Store() 'called by check_header

If Started = True Then 'will only store if started is flagged

Mytime = Timer - Starttime 'gives current time by subtracting start system time from current system time

Print #1, Mytime; ","; Valuex; ","; Valuey; ","; Valuez; ","; X_out; ","; Y_out; ","; Z_out; ","; X_cal; ","; Flipped_x; _
","; Gesture_phase1_detected_x; ","; Gesture_phase2_detected_x; ","; Peak_phase1_detected_x; _
","; Peak_phase2_detected_x; ","; Currently_auto_damping_x; ","; Valid_after_damp_x; _
","; X_val_previous; ","; Auto_damping; ","; Damping_factor_x; ","; X_int; ","; Max_peak_phase1_x; _
","; Damping_duration 'prints the current input, output and variable values

Lasttime = Mytime 'puts current relative time into lasttime

Record_form.Time_box.Caption = Int(Mytime) 'displays current relative time
Records = Records + 1 'displays number of records
Record_form.Records_box.Caption = Records

End If

End Sub

Private Sub Settings_Click() 'replaces the log data form with the settings form
End Sub

Private Sub Sim_Click() 'simulated input selected

Cube.Width = 8055 'resizes cube window
Simulate_input = True
If Activated_mode = True Then 'sets caption to activate/deactivate on button
Simulated_input_form.Activate_button.Caption = "Deactivate"
Else
Simulated_input_form.Activate_button.Caption = "Activate"
End If
Simulated_input_form.Show 'displays simulate input form
End Sub

Private Sub Exit_Click() 'Exit button. closes program

Unload Me
End Sub

Private Sub Log_data_Click() 'displays data logging window

Record_form.Show 'shows the data logging window
Settings_form.Hide 'hides settings window
End Sub

Private Sub test1_Click() 'test 1 dialogue box loaded

End Sub

Private Sub Test2_Click() 'test 2 dialogue box loaded

End Sub

Public Sub Threshold_calc()

X_low_thresh = X_cal - Threshold 'works out upper and lower threshold for x axis
X_high_thresh = X_cal + Threshold

Y_low_thresh = Y_cal - Threshold 'works out upper and lower threshold for y axis
Y_high_thresh = Y_cal + Threshold

Z_low_thresh = Z_cal - Threshold 'works out upper and lower threshold for z axis
Z_high_thresh = Z_cal + Threshold

End Sub

Public Sub Monitor_x() 'checks input from x axis gyro

If Flipped_x = True Then

X_val = (2 * X_cal) - Valuex 'if flipped is true, will invert all inputs

Else

X_val = Valuex 'if not flipped, x_val is simply the input

End If

'this function overcomes 'sticking' problem

If Gesture_phase1_detected_x = True And Peak_phase1_detected_x = False And X_val < X_high_thresh Then

Peak_phase1_detected_x = True 'if not rising/same value peak is detected

'conditions for start of damping

If Auto_damping = True And Currently_auto_damping_x = False And Valid_after_damp_x = True Then

Currently_auto_damping_x = True
Damping_factor_x = Sqr(X_int / ((Damping_duration) ^ 2)) 'works out damping factor
Max_peak_phase1_x = X_int 'parameter for damping
Damping_time_x = 0 'this is the current time from peak

End If

End If

'This overcomes sticking problem on peak phase 2

If Gesture_phase2_detected_x = True And Peak_phase2_detected_x = False And X_val > X_low_thresh Then

Peak_phase2_detected_x = True 'if not rising or same

If Currently_auto_damping_x = False Then

X_int = 0 'sets output to zero. this is so user doesn’t have to exactly counter-act phase 1 to make velocity 0

End If
End If

If X_val > X_high_thresh Then 'positive peak

If Gesture_phase1_detected_x = False Then 'does this first time threshold is broken

Gesture_phase1_detected_x = True 'phase 1 of a gesture is recognised

If Currently_auto_damping_x = False Then

Valid_after_damp_x = True 'uses this variable so a gesture is not recognised half cycle
X_int = X_val - X_high_thresh 'difference between input and upper threshold

Else

Valid_after_damp_x = False 'won't validate damping if damping when gesture phase 1 is detected

End If

X_val_previous = X_val 'stores x_val for a comparison: increasing / decreasing

End If

If Gesture_phase1_detected_x = True And Peak_phase1_detected_x = False Then 'stage 1 of the gesture

If X_val >= X_val_previous Then 'checks to see if input is increasing

If Currently_auto_damping_x = False And Valid_after_damp_x = True Then

X_int = X_val - X_high_thresh 'outputs only if not autodamping and cycle completed

End If

X_val_previous = X_val 'stores x_val for a comparison: increasing / decreasing

Else

Peak_phase1_detected_x = True 'if not rising/same value peak is detected

'conditions for start of damping

If Auto_damping = True And Currently_auto_damping_x = False And Valid_after_damp_x = True Then

Currently_auto_damping_x = True
Damping_factor_x = Sqr(X_int / ((Damping_duration) ^ 2)) 'works out damping factor
Max_peak_phase1_x = X_int 'parameter for damping
Damping_time_x = 0 'this is the current time from peak

End If

End If

End If

End If

If X_val < X_low_thresh Then 'detects if input goes below lower threshold

If Peak_phase1_detected_x = False Then 'if input is a negative gesture the input will need to be flipped

'flip: all inputs will be inverted and monitor called again
If Currently_auto_damping_x = False Then
Flipped_x = True
End If
Exit Sub

End If

If Gesture_phase2_detected_x = False Then

Gesture_phase2_detected_x = True 'recognises gesture

If Currently_auto_damping_x = False Then

X_int = X_int - (X_low_thresh - X_val) 'if auto-damping is off, will subtract input - calibration

If X_int < 0 Then

X_int = 0 'clips velocity at 0
End If

End If
X_val_previous = X_val 'stores previous value for comparison

End If

If Gesture_phase2_detected_x = True And Peak_phase2_detected_x = False Then 'stage 2 of the gesture

If X_val <= X_val_previous Then 'looks for a trough

If Currently_auto_damping_x = False Then 'if not damping

X_int = X_int - (X_val_previous - X_val) 'subtracts difference from previous input from output

If X_int < 0 Then

X_int = 0 'clips at 0
End If

End If
X_val_previous = X_val 'stores for comparison

Else

Peak_phase2_detected_x = True 'if not rising or same

If Currently_auto_damping_x = False Then

X_int = 0 'sets output to zero - this is so user does not have to exactly counter-act phase 1 to make velocity 0

End If
End If
End If
End If

'times out after a peak

If Currently_timed_out_x = True Then
If Current_Time_out_time_x > 0 Then
Current_Time_out_time_x = Current_Time_out_time_x - 10 'counts down from timeout
Else
Reset_all_x 'restores back to original state once timed out
End If

End If

If X_val > X_low_thresh And Peak_phase2_detected_x = True Then

If Currently_timed_out_x = False Then 'sets timed out the first time crosses back over lower threshold
Currently_timed_out_x = True
Current_Time_out_time_x = Time_out
End If

End If

If Flipped_x = False Then 'if flipped is true, will invert x_val before output

If (X_int * Sensitivity) <= Max_output Then 'limits output to max output

X_out = (Sensitivity * X_int) 'adds amplification

Else

X_out = Max_output

End If

Else

If (X_int * Sensitivity) <= Max_output Then 'flipped version

X_out = (Sensitivity * X_int * -1)

Else

X_out = Max_output * -1

End If

End If

End Sub

Monitor_X is then repeated for the Y and Z axes (Monitor_Y and Monitor_Z respectively)
Public Sub Timer2_Timer() 'triggered every 10ms
If Activated_mode = True Then 'if device is activated
Monitor_x 'calls gesture recognition functions in turn
Monitor_y
Monitor_z

Auto_damp_x 'calls autodamp functions in turn
Auto_damp_y
Auto_damp_z

Else

Reset_all_x 'will reset so if button is released all outputs set to zero
Reset_all_y
Reset_all_z

End If

End Sub

Public Sub Reset_all_x() 'called by timer2_timer

If Currently_auto_damping_x = False Then 'only resets if no auto damping

X_out = 0
X_int = 0
Flipped_x = False

End If

If Activated_mode = False Then 'if not activated will reset

X_out = 0
X_int = 0
Flipped_x = False
Currently_auto_damping_x = False
End If

X_val = 0 'resets variables back to start values

X_val_previous = 0
Gesture_phase1_detected_x = False
Gesture_phase2_detected_x = False
Peak_phase1_detected_x = False
Peak_phase2_detected_x = False

Auto_damping = Settings_form.Auto_damping_box.Value
Damping_duration = Settings_form.Damping_duration_slider.Value
Currently_timed_out_x = False

End Sub

Public Sub Reset_all_y() 'called by timer2_timer

If Currently_auto_damping_y = False Then 'only resets if no auto damping

Y_out = 0
Y_int = 0
Flipped_y = False

End If

If Activated_mode = False Then 'if not activated will reset

Y_out = 0
Y_int = 0
Flipped_y = False
Currently_auto_damping_y = False
End If

Y_val = 0 'resets variables back to start values

Y_val_previous = 0
Gesture_phase1_detected_y = False
Gesture_phase2_detected_y = False
Peak_phase1_detected_y = False
Peak_phase2_detected_y = False
Currently_timed_out_y = False

End Sub

Public Sub Reset_all_z() 'called by timer2_timer

If Currently_auto_damping_z = False Then 'only resets if no autodamping

Z_out = 0
Z_int = 0
Flipped_z = False

End If

If Activated_mode = False Then 'if not activated will reset

Z_out = 0
Z_int = 0
Flipped_z = False
Currently_auto_damping_z = False
End If

Z_val = 0 'resets variables back to start values

Z_val_previous = 0
Gesture_phase1_detected_z = False
Gesture_phase2_detected_z = False
Peak_phase1_detected_z = False
Peak_phase2_detected_z = False
Currently_timed_out_z = False

End Sub

Public Sub Auto_damp_x() 'auto-damping function

If Currently_auto_damping_x = True Then

X_int = Max_peak_phase1_x - ((Damping_factor_x * Damping_time_x) ^ 2) 'damping calculation

Damping_time_x = Damping_time_x + 10 'increments time

End If

If Damping_time_x >= Damping_duration Then 'duration of damping

Currently_auto_damping_x = False 'ends damping

Damping_time_x = 0 'resets variables

X_int = 0

End If

End Sub

Auto_damp_X is then repeated for the Y and Z axes (Auto_damp_Y and Auto_damp_Z)
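As an illustration, the Y-axis variant is a direct variable-for-variable substitution of Auto_damp_x. The following is a sketch only, assuming Y_int, Max_peak_phase1_y, Damping_factor_y, Damping_time_y and Currently_auto_damping_y are declared like their x-axis counterparts:

```vb
Public Sub Auto_damp_y() 'auto-damping function for the y axis (sketch, by substitution)

If Currently_auto_damping_y = True Then

Y_int = Max_peak_phase1_y - ((Damping_factor_y * Damping_time_y) ^ 2) 'damping calculation

Damping_time_y = Damping_time_y + 10 'increments time

End If

If Damping_time_y >= Damping_duration Then 'duration of damping

Currently_auto_damping_y = False 'ends damping

Damping_time_y = 0 'resets variables

Y_int = 0

End If

End Sub
```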


Public Sub Form_Load()

Center = 250 'centre point around which cube is rotated
Radius = 130 'distance from centre to corner of cube
Start_length = Sqr((Radius ^ 2) / 3) 'half the cube's side length (distance from centre to a face)

Pi = 3.14159265358979

Face_color_set 'sets colors of face_color array

Face_corner_numbering 'puts corners related to each face into face_corners array
Start_coordinates 'initial start points for all 8 corners

Me.Top = 0 'start position of form

Me.Left = 4710
Me.Width = 10545
Me.Height = 7950
Form_Paint 'draws cube on form

End Sub

Public Sub Start_coordinates()

'this function translates the x,y and z rotation of each corner into
'x,y and z co-ordinates

' face numbering 1: 1234
' 2: 2376
' 3: 6785
' 4: 5841
' 5: 4873
' 6: 5126
' x-axis: 0 degrees is vertical, going clockwise looking into origin
' y-axis: 0 degrees is vertical, going clockwise looking into origin
' z-axis: 0 degrees is in y-axis direction, going clockwise into origin

'sets initial coordinates for each corner

Corner_coordinate(1).x_coordinate = -Start_length
Corner_coordinate(2).x_coordinate = Start_length
Corner_coordinate(3).x_coordinate = Start_length
Corner_coordinate(4).x_coordinate = -Start_length
Corner_coordinate(5).x_coordinate = -Start_length
Corner_coordinate(6).x_coordinate = Start_length
Corner_coordinate(7).x_coordinate = Start_length
Corner_coordinate(8).x_coordinate = -Start_length

Corner_coordinate(1).y_coordinate = Start_length
Corner_coordinate(2).y_coordinate = Start_length
Corner_coordinate(3).y_coordinate = Start_length
Corner_coordinate(4).y_coordinate = Start_length
Corner_coordinate(5).y_coordinate = -Start_length
Corner_coordinate(6).y_coordinate = -Start_length
Corner_coordinate(7).y_coordinate = -Start_length
Corner_coordinate(8).y_coordinate = -Start_length

Corner_coordinate(1).z_coordinate = -Start_length
Corner_coordinate(2).z_coordinate = -Start_length
Corner_coordinate(3).z_coordinate = Start_length
Corner_coordinate(4).z_coordinate = Start_length
Corner_coordinate(5).z_coordinate = -Start_length
Corner_coordinate(6).z_coordinate = -Start_length
Corner_coordinate(7).z_coordinate = Start_length
Corner_coordinate(8).z_coordinate = Start_length

End Sub

Public Sub Add_perspective() 'adds perspective to cube

Dim i As Integer
Dim Scale_factor
For i = 1 To 8 Step 1 'applies the following to each corner
Scale_factor = ((Corner_coordinate(i).y_coordinate + 100) / 200) + 0.9 'x and z are scaled depending on how far away the corner is
Perspective_coordinate(i).x_coordinate = Corner_coordinate(i).x_coordinate * Scale_factor 'x scaled
Perspective_coordinate(i).z_coordinate = Corner_coordinate(i).z_coordinate * Scale_factor 'z scaled
Next i

End Sub

Public Sub Rotate_x_axis() 'rotation around the x axis

Dim i As Integer

For i = 1 To 8 Step 1 'applies rotation to every corner

'calculates rotation from X_rotation which is an angle in degrees

New_y_coordinate = (Corner_coordinate(i).y_coordinate * Cos(X_rotation * (Pi / 180))) - _
(Corner_coordinate(i).z_coordinate * Sin(X_rotation * (Pi / 180)))
New_z_coordinate = (Corner_coordinate(i).y_coordinate * Sin(X_rotation * (Pi / 180))) + _
(Corner_coordinate(i).z_coordinate * Cos(X_rotation * (Pi / 180)))

'the new values are calculated into temporary variables because neither coordinate
'should be replaced until both of the above calculations are done
Corner_coordinate(i).y_coordinate = New_y_coordinate
Corner_coordinate(i).z_coordinate = New_z_coordinate

Next i

End Sub

Rotate_x_axis is then repeated for the Y and Z axes (Rotate_y_axis and Rotate_z_axis)
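For reference, the Y-axis variant would apply the same rotation matrix in the x-z plane. The following is a sketch only, assuming the same structure as Rotate_x_axis; the sign convention used in the project may differ:

```vb
Public Sub Rotate_y_axis() 'rotation around the y axis (sketch)

Dim i As Integer

For i = 1 To 8 Step 1 'applies rotation to every corner

'calculates rotation from Y_rotation which is an angle in degrees

New_x_coordinate = (Corner_coordinate(i).x_coordinate * Cos(Y_rotation * (Pi / 180))) + _
(Corner_coordinate(i).z_coordinate * Sin(Y_rotation * (Pi / 180)))
New_z_coordinate = (Corner_coordinate(i).z_coordinate * Cos(Y_rotation * (Pi / 180))) - _
(Corner_coordinate(i).x_coordinate * Sin(Y_rotation * (Pi / 180)))

'temporary variables used so neither coordinate is replaced until both are calculated
Corner_coordinate(i).x_coordinate = New_x_coordinate
Corner_coordinate(i).z_coordinate = New_z_coordinate

Next i

End Sub
```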

Public Sub Timer1_Timer() '100ms intervals

X_rotation = X_out / 30 'sets rotation in degrees

Y_rotation = Y_out / 30
Z_rotation = Z_out / 30

Rotate_x_axis 'calls the 3 rotation functions in turn
Rotate_y_axis
Rotate_z_axis
Add_perspective 'adds perspective to the cube

Calculate_normal 'calculates the normal to each face of the cube
Calculate_rendering_order 'the faces are ordered depending on which normal has the largest value
Form_Paint 'draws what we've just calculated

Test1_function 'calls test functions
Test2_function

X_axis_velocity_caption.Caption = Int(X_rotation * 10) 'x_rotation is applied 10 times per second, so this shows degrees per second

Y_axis_velocity_caption.Caption = Int(Y_rotation * 10) 'Int shows only the integer part
Z_axis_velocity_caption.Caption = Int(Z_rotation * 10)

End Sub

Public Sub Form_Paint() 'paints the cube

Cls 'clears previous frame

Dim Count As Integer 'variables used in this function only
Dim Current_face As Integer

For Count = 4 To 6 Step 1 'counts through the 3 faces that are visible
Current_face = Render_order(Count) 'recalls the 3 visible faces (there may only be 1, in which case the other 2 are covered up by the front one)

FillStyle = 0 'solid fill

'retrieves RGB color from face_color array for each face

FillColor = RGB(Face_color(Current_face).R, Face_color(Current_face).G, Face_color(Current_face).B)

Dim PolyPts(3) As PointAPI 'data type defined in gdi32.dll

'plots the polygon for each face using coordinates

PolyPts(0).X = (Center + Perspective_coordinate(Face_corners(Current_face).Corner1).x_coordinate)
PolyPts(0).Z = (Center + Perspective_coordinate(Face_corners(Current_face).Corner1).z_coordinate)
PolyPts(1).X = (Center + Perspective_coordinate(Face_corners(Current_face).Corner2).x_coordinate)
PolyPts(1).Z = (Center + Perspective_coordinate(Face_corners(Current_face).Corner2).z_coordinate)
PolyPts(2).X = (Center + Perspective_coordinate(Face_corners(Current_face).Corner3).x_coordinate)
PolyPts(2).Z = (Center + Perspective_coordinate(Face_corners(Current_face).Corner3).z_coordinate)
PolyPts(3).X = (Center + Perspective_coordinate(Face_corners(Current_face).Corner4).x_coordinate)
PolyPts(3).Z = (Center + Perspective_coordinate(Face_corners(Current_face).Corner4).z_coordinate)

'function draws polygon

Polygon Cube.hdc, PolyPts(0), 4

Next Count

End Sub

Public Sub Calculate_normal() 'works out normal to each face

' face numbering 1: 1234

' 2: 2376
' 3: 6785
' 4: 5841
' 5: 4873
' 6: 5126

'calculates y component of normal to each face

Normal_coordinate(1).y_coordinate = (Corner_coordinate(1).y_coordinate + Corner_coordinate(3).y_coordinate) / 2

Normal_coordinate(2).y_coordinate = (Corner_coordinate(2).y_coordinate + Corner_coordinate(7).y_coordinate) / 2
Normal_coordinate(3).y_coordinate = (Corner_coordinate(6).y_coordinate + Corner_coordinate(8).y_coordinate) / 2
Normal_coordinate(4).y_coordinate = (Corner_coordinate(5).y_coordinate + Corner_coordinate(4).y_coordinate) / 2
Normal_coordinate(5).y_coordinate = (Corner_coordinate(4).y_coordinate + Corner_coordinate(7).y_coordinate) / 2
Normal_coordinate(6).y_coordinate = (Corner_coordinate(5).y_coordinate + Corner_coordinate(2).y_coordinate) / 2

End Sub

Public Sub Calculate_rendering_order() 'sorts faces into those most facing you

Dim Count As Integer 'variables used in this function

Dim Render_position As Integer

For Count = 1 To 6 Step 1 'all entries in sorted are set to false

Sorted(Count) = False

Next Count

For Render_position = 1 To 6 Step 1 'cycles through the 6 places in the render_order array
Lowest_y = 1000 'an arbitrary high number
For Count = 1 To 6 Step 1 'cycles through each of the faces
If Sorted(Count) = False Then 'for sides that have not already been sorted
If Normal_coordinate(Count).y_coordinate < Lowest_y Then 'if this side is the lowest of those still unsorted
Lowest_y = Normal_coordinate(Count).y_coordinate 'store the y component of this face's normal
Lowest_face = Count 'store the face number

End If
End If
Next Count

Sorted(Lowest_face) = True 'flag that this face has been sorted
Render_order(Render_position) = Lowest_face 'store face number in position in order

Next Render_position

End Sub

Public Sub Face_color_set() 'sets rgb values for each face

Face_color(1).R = 255
Face_color(1).G = 0
Face_color(1).B = 0

Face_color(2).R = 122
Face_color(2).G = 238
Face_color(2).B = 248

Face_color(3).R = 0
Face_color(3).G = 0
Face_color(3).B = 255

Face_color(4).R = 255
Face_color(4).G = 179
Face_color(4).B = 17

Face_color(5).R = 226
Face_color(5).G = 0
Face_color(5).B = 246

Face_color(6).R = 247
Face_color(6).G = 225
Face_color(6).B = 9
End Sub

Public Sub Face_corner_numbering() 'an array that stores corner numbers of faces
' face numbering 1: 1234
' 2: 2376
' 3: 6785
' 4: 5841
' 5: 4873
' 6: 5126

Face_corners(1).Corner1 = 1
Face_corners(1).Corner2 = 2
Face_corners(1).Corner3 = 3
Face_corners(1).Corner4 = 4

Face_corners(2).Corner1 = 2
Face_corners(2).Corner2 = 3
Face_corners(2).Corner3 = 7
Face_corners(2).Corner4 = 6

Face_corners(3).Corner1 = 6
Face_corners(3).Corner2 = 7
Face_corners(3).Corner3 = 8
Face_corners(3).Corner4 = 5

Face_corners(4).Corner1 = 5
Face_corners(4).Corner2 = 8
Face_corners(4).Corner3 = 4
Face_corners(4).Corner4 = 1

Face_corners(5).Corner1 = 4
Face_corners(5).Corner2 = 8
Face_corners(5).Corner3 = 7
Face_corners(5).Corner4 = 3

Face_corners(6).Corner1 = 5
Face_corners(6).Corner2 = 1
Face_corners(6).Corner3 = 2
Face_corners(6).Corner4 = 6

End Sub

Public Sub Test1_button_Click() 'show test 1 dialogue box

End Sub

Private Sub Test2_button_Click() 'show test 2 dialogue box

End Sub

Public Sub Test1_function() ' test one function

If Test1_active = True And Start_test_timer = True Then 'if test 1 started
Test_timer = Test_timer + 100 'increments clock
Test_time_caption.Caption = Test_timer / 1000 'displays time
'90% of normal to face 1 must be facing you to stop clock
If Normal_coordinate(1).y_coordinate > (Start_length * 0.9) And X_rotation = 0 And Y_rotation = 0 And _
Z_rotation = 0 Then
Test1_active = False 'end of test
Time_label.Caption = " " 'clears time display
Test_time_caption.Caption = " "
Settings_form.Unlock_settings_controls 'unlock settings
Cube.End_test.Visible = False
Test1_complete_dialog.Test1_time_caption = Test_timer / 1000 'displays the time the test was completed in
End If
End If
End Sub

Public Sub Test2_function() 'test two function

If Test2_active = True And Start_test_timer = True Then 'if test 2 started
Test_timer = Test_timer + 100 'increments clock
Test_time_caption.Caption = Test_timer / 1000 'displays time
'95% of normal to face 1 must be facing you, with the yellow face vertical, to stop the clock
If Normal_coordinate(1).y_coordinate > (Start_length * 0.95) And X_rotation = 0 And Y_rotation = 0 _
And Z_rotation = 0 And ((Corner_coordinate(5).x_coordinate + Corner_coordinate(4).x_coordinate) / 2) _
< (-0.95 * Start_length) Then

Test2_active = False 'end of test

Time_label.Caption = " " 'clears time display
Test_time_caption.Caption = " "
Settings_form.Unlock_settings_controls 'unlock settings
Cube.End_test.Visible = False
Test2_complete_dialog.Test1_time_caption = Test_timer / 1000 'displays the time the test was completed in
End If
End If
End Sub

Private Sub End_test_Click() 'if test is ended prematurely

Test1_active = False 'reset variables
Test2_active = False
Time_label.Caption = " " 'clear time display
Test_time_caption.Caption = " "
Settings_form.Unlock_settings_controls 'unlock controls
End_test.Visible = False
End Sub


Private Sub Form_Load()

Me.Top = 0 'position and size of form
Me.Left = 0
Me.Height = 10470
Me.Width = 4710
Me.ScaleHeight = 10065
Me.ScaleWidth = 4590

'scales for bars

X_input_bar.ScaleWidth = 255
Y_input_bar.ScaleWidth = 255
Z_input_bar.ScaleWidth = 255
X_process_bar.ScaleWidth = 255
Y_process_bar.ScaleWidth = 255
Z_process_bar.ScaleWidth = 255
X_output_bar.ScaleWidth = 300
Y_output_bar.ScaleWidth = 300
Z_output_bar.ScaleWidth = 300

'start values before calibration

X_cal = 125
X_low_thresh = 105
X_high_thresh = 145
Max_output = 100

Y_cal = 125
Y_low_thresh = 105
Y_high_thresh = 145

Z_cal = 125
Z_low_thresh = 105
Z_high_thresh = 145

End Sub

Private Sub Timer1_Timer() 'timer every 50ms

clear_bars 'call clear bars

'calls the bar-drawing functions

Process_X_input_bar
Process_Y_input_bar
Process_Z_input_bar
Process_X_process_bar
Process_Y_process_bar
Process_Z_process_bar
Process_X_output_bar
Process_Y_output_bar
Process_Z_output_bar

End Sub

Private Sub Process_X_input_bar() 'draws yellow line representing value of x input

X_input_bar.Line (0, 0)-(Valuex, X_input_bar.ScaleHeight), RGB(255, 255, 17), BF

End Sub

Private Sub Process_Y_input_bar() 'draws yellow line representing value of y input

Y_input_bar.Line (0, 0)-(Valuey, Y_input_bar.ScaleHeight), RGB(255, 255, 17), BF

End Sub

Private Sub Process_Z_input_bar() 'draws yellow line representing value of z input

Z_input_bar.Line (0, 0)-(Valuez, Z_input_bar.ScaleHeight), RGB(255, 255, 17), BF

End Sub

Private Sub Process_X_process_bar() 'draws process bar

If Valuex > X_high_thresh Then 'draws red up to high thresh and green past that
X_process_bar.Line (X_cal, 0)-(X_high_thresh, X_process_bar.ScaleHeight), RGB(255, 0, 0), BF
X_process_bar.Line (X_high_thresh, 0)-(Valuex, X_process_bar.ScaleHeight), RGB(0, 255, 0), BF
End If

If Valuex > X_cal And Valuex <= X_high_thresh Then 'between x-cal and high thresh just red line
X_process_bar.Line (X_cal, 0)-(Valuex, X_process_bar.ScaleHeight), RGB(255, 0, 0), BF
End If

If Valuex < X_cal And Valuex >= X_low_thresh Then 'between low threshold and x-cal just red line

X_process_bar.Line (Valuex, 0)-(X_cal, X_process_bar.ScaleHeight), RGB(255, 0, 0), BF

End If

If Valuex < X_low_thresh Then 'lower than low thresh then red line up to low thresh and green past that
X_process_bar.Line (X_low_thresh, 0)-(X_cal, X_process_bar.ScaleHeight), RGB(255, 0, 0), BF
X_process_bar.Line (Valuex, 0)-(X_low_thresh, X_process_bar.ScaleHeight), RGB(0, 255, 0), BF
End If

'draws thin calibration and threshold lines

X_process_bar.Line (X_cal, 0)-(X_cal, X_process_bar.ScaleHeight), RGB(0, 0, 255), BF
X_process_bar.Line (X_low_thresh, 0)-(X_low_thresh, X_process_bar.ScaleHeight), RGB(0, 0, 255), BF
X_process_bar.Line (X_high_thresh, 0)-(X_high_thresh, X_process_bar.ScaleHeight), RGB(0, 0, 255), BF

End Sub

Process_X_process_bar is then repeated for the Y and Z axes

Private Sub Process_X_output_bar() 'draws x output bar

' draws orange line representing output.

X_output_bar.Line (150, 0)-(X_out + 150, X_input_bar.ScaleHeight), RGB(255, 200, 0), BF

'draws thin zero and max_output lines

X_output_bar.Line (150, 0)-(150, X_output_bar.ScaleHeight), RGB(0, 0, 255)
X_output_bar.Line (Max_output + 150, 0)-(Max_output + 150, X_output_bar.ScaleHeight), RGB(0, 0, 255)
X_output_bar.Line (150 - Max_output, 0)-(150 - Max_output, X_output_bar.ScaleHeight), RGB(0, 0, 255)

End Sub

Process_X_output_bar is then repeated for the Y and Z axes

Private Sub clear_bars() 'clears all bars

X_input_bar.Cls
Y_input_bar.Cls
Z_input_bar.Cls
X_process_bar.Cls
Y_process_bar.Cls
Z_process_bar.Cls
X_output_bar.Cls
Y_output_bar.Cls
Z_output_bar.Cls

End Sub


Private Sub Form_Load()

Me.Height = 2520 'size and position of form

Me.Width = 10545
Me.ScaleHeight = 2115
Me.ScaleWidth = 10425
Me.Left = 4710
Me.Top = 7935

End Sub

Private Sub Drive1_Change() 'when drive is changed the directory box is changed

Dir1.Path = Drive1.Drive
End Sub

Private Sub StartStop_Click()

If Started = False Then 'if in not recording state

StartStop.Caption = "Stop Logging" 'once clicked display 'stop'
Started = True 'now in started state
Shape2.FillColor = RGB(0, 255, 0) 'green = on
Starttime = Timer 'sets starttime to current system time (seconds after midnight)
Store_filename = Dir1.Path & "\" & Filename.Text 'directory and filename

Open Store_filename For Output As #1 'opens file of name typed in filename text box
'prints column headers for the inputs, outputs and variables
Print #1, "Time"; ","; "Value X"; ","; "Value Y"; ","; "Value Z"; ","; "X_out"; ","; "Y_out"; ","; "Z_out"; ","; "X_cal"; _
","; "Flipped_x"; ","; "Gesture_phase1_detected_x"; ","; "Gesture_phase2_detected_x"; ","; _
"Peak_phase1_detected_x"; ","; "Peak_phase2_detected_x"; ","; "Currently_auto_damping_x"; _
","; "Valid_after_damp_x"; ","; "X_val_previous"; ","; "Auto_damping"; ","; "Damping_factor_x"; _
","; "X_int"; ","; "Max_peak_phase1_x"; ","; "Damping_duration"
Records = 0 'sets records stored to 0
Records_box.Caption = Records 'displays records stored as 0
Filename.Enabled = False
Dir1.Enabled = False
Drive1.Enabled = False
Main_Form.Tests.Enabled = False
Main_Form.View.Enabled = False

Else

Started = False 'if button pressed in start state then will stop
Shape2.FillColor = RGB(255, 0, 0)
StartStop.Caption = "Start Logging" 'displays start once again
Records_box.Caption = " " 'clears records box
Close #1 'closes file
Time_box.Caption = 0 ' time caption set back to zero

Filename.Enabled = True
Dir1.Enabled = True
Drive1.Enabled = True
Main_Form.Tests.Enabled = True
Main_Form.View.Enabled = True

End If
End Sub


Private Sub Form_Load()

Me.Height = 2520 'size and position of form
Me.Width = 10545
Me.ScaleHeight = 2115
Me.ScaleWidth = 10425
Me.Left = 4710
Me.Top = 7935

Threshold_caption.Caption = Threshold 'show initial values for variables

Sensitivity_caption.Caption = Sensitivity
Max_output_caption.Caption = Int(Max_out_slider.Value / 3)
Sensitivity_caption.Caption = 1 + (Sensitivity_slider.Value / 10)
Threshold_caption.Caption = Threshold_slider.Value
Time_out_caption.Caption = Time_out_slider.Value
Damping_duration_caption.Caption = Damping_duration_slider.Value
End Sub

Private Sub Calibrate_button_Click()

X_cal = Valuex 'sets calibration to current input on each axis

Y_cal = Valuey
Z_cal = Valuez
Time_out = Time_out_slider.Value 'stores values from sliders
Threshold = Threshold_slider.Value
Main_Form.Threshold_calc 'calls function to calculate upper and lower thresholds
Sensitivity = (Sensitivity_slider.Value / 10)
Max_output = Max_out_slider.Value
Main_Form.Reset_all_x 'clears the x, y and z processes and resets variables
Main_Form.Reset_all_y
Main_Form.Reset_all_z
Calibrated = True
Uncalibrated_unlock 'allows user access to controls now it's calibrated

End Sub

Private Sub Max_out_slider_Change() 'displays max_output in caption

Max_output_caption.Caption = Int(Max_out_slider.Value / 3) 'calculated from (X_out / 30) * 10
Uncalibrated_lock 'when setting changed device locked before re-calibration
End Sub

Private Sub Auto_damping_box_Click() 'checks the auto-damping check-box

Auto_damping = Auto_damping_box.Value 'auto-damp check box
Uncalibrated_lock 'when setting changed device locked before re-calibration
End Sub

Private Sub Sensitivity_slider_Change() 'displays sensitivity in caption

Sensitivity_caption.Caption = (Sensitivity_slider.Value / 10)
Uncalibrated_lock 'when setting changed device locked before re-calibration
End Sub

Private Sub Threshold_slider_Change() 'displays threshold level in caption

Threshold_caption.Caption = Threshold_slider.Value
Uncalibrated_lock 'when setting changed device locked before re-calibration
End Sub

Private Sub Time_out_slider_Change() 'displays time-out in caption

Time_out_caption.Caption = Time_out_slider.Value
Uncalibrated_lock 'when setting changed device locked before re-calibration
End Sub

Private Sub Damping_duration_slider_Change() 'displays damping duration in caption

Damping_duration_caption.Caption = Damping_duration_slider.Value
Uncalibrated_lock 'when setting changed device locked before re-calibration
End Sub

Public Sub Lock_settings_controls() 'prevents user being able to change settings

Auto_damping_box.Enabled = False 'disables settings

Calibrate_button.Enabled = False
Damping_duration_slider.Enabled = False
Max_out_slider.Enabled = False
Sensitivity_slider.Enabled = False
Threshold_slider.Enabled = False
Time_out_slider.Enabled = False

End Sub

Public Sub Unlock_settings_controls() 'unlocks settings controls

Auto_damping_box.Enabled = True 'enables settings

Calibrate_button.Enabled = True
Damping_duration_slider.Enabled = True
Max_out_slider.Enabled = True
Sensitivity_slider.Enabled = True
Threshold_slider.Enabled = True
Time_out_slider.Enabled = True

End Sub

Public Sub Uncalibrated_lock() 'locks menus and settings when uncalibrated

Cube.Uncalibrated_caption.Visible = True 'displays "uncalibrated"
Simulated_input_form.Activate_button.Enabled = False 'disables active button on simulate input
Main_Form.Tests.Enabled = False 'disables toolbar on main form
Main_Form.Log_data.Enabled = False
Calibrated = False
Activated_mode = False
Simulated_input_form.Activate_button.Caption = "Activate" 'device is deactivated
Cube.Activated_caption.Caption = "Device Deactivated"
End Sub

Public Sub Uncalibrated_unlock() 'unlocks menus and controls when device is calibrated
Cube.Uncalibrated_caption.Visible = False
Simulated_input_form.Activate_button.Enabled = True
Main_Form.Tests.Enabled = True
Main_Form.Log_data.Enabled = True

End Sub

Public Sub Test_lock() 'locks controls and menus when test is underway
Main_Form.Tests.Enabled = False
Main_Form.View.Enabled = False
Simulated_input_form.Close_simulated_control.Enabled = False

End Sub

Public Sub Test_unlock() 'unlocks controls and menus after test is finished
Main_Form.Tests.Enabled = True
Main_Form.View.Enabled = True
Simulated_input_form.Close_simulated_control.Enabled = True

End Sub

Simulated input

Private Sub Form_Load()

Me.Top = 0 'position of window
Me.Left = 12765
Me.Height = 7950 'size of window
Me.Width = 2535
Valuex_slider.Value = X_cal 'sets sliders to calibration level when loaded
Valuey_slider.Value = Y_cal
Valuez_slider.Value = Z_cal

End Sub

Private Sub Close_simulated_control_Click() 'close simulated control

Cube.Width = 10545 'resizes cube form now controls have disappeared
Simulate_input = False
Me.Hide 'form disappears

End Sub

Private Sub Timer1_Timer() '10ms timer

If Simulate_input = True Then

Valuex = Valuex_slider.Value 'puts values of sliders straight into valuex

Valuey = Valuey_slider.Value
Valuez = Valuez_slider.Value
End If
Main_Form.Store 'stores data like GUI would if connected to device
End Sub

Test 1/2 dialogue box

Test 1's dialogue box code is the same as Test 2's (given here), except for the lines that are specific to Test 2, as noted in the comments

Private Sub Next_Button_Click() 'user clicks next to start test

Activated_mode = False 'device de-activated

Simulated_input_form.Activate_button.Caption = "Activate" 'displays current mode
Cube.Activated_caption.Caption = "Device Deactivated"

Start_test_timer = False 'test timer not started

Cube.Time_label.Caption = "Time:" 'displays caption

Cube.Start_coordinates 'resets cube to start coordinates
X_rotation = 180 'moves cube from start coordinates into position
Y_rotation = 30
Z_rotation = -30
Cube.Rotate_x_axis 'calls the x rotation function
Cube.Rotate_y_axis 'calls the y rotation function
Cube.Rotate_z_axis 'calls the z rotation function

Cube.Add_perspective 'adds perspective to the cube

Cube.Calculate_normal 'calculates the normal to each face of the cube
Cube.Calculate_rendering_order 'the faces are ordered depending on which normal has the largest value
Cube.Form_Paint 'draws what we've just calculated

Test_timer = 0 'timer is reset

Test2_active = True 'test activated (Test1_activate = True for Test 1)
Cube.End_test.Visible = True 'button appears for user to end test if they want
Settings_form.Lock_settings_controls 'menus and controls are locked during testing

Me.Hide 'hides this form

End Sub

Private Sub Cancel_Button_Click() 'hides this form to cancel test

Me.Hide

End Sub

Test2 complete dialogue box

Private Sub Form_Load() 'when form loads displays time test completed in
Test1_time_caption.Caption = Test_timer / 1000

End Sub

Private Sub No_Button_Click() 'hides if user doesn't want to repeat test

Me.Hide

End Sub

Private Sub Yes_Button_Click() 'loads test2_dialogue if user wants to repeat test

Me.Hide
Test2_dialogue.Show 'reloads the Test 2 dialogue

End Sub

Module 1 (Main)
Public X_cal As Integer 'calibration levels
Public Y_cal As Integer
Public Z_cal As Integer

Public Flipped_x As Boolean 'inverted gesture

Public Flipped_y As Boolean
Public Flipped_z As Boolean

Public Gesture_phase1_detected_x As Boolean 'phase1 gesture detected

Public Gesture_phase1_detected_y As Boolean
Public Gesture_phase1_detected_z As Boolean

Public Gesture_phase2_detected_x As Boolean 'phase 2 gesture detected

Public Gesture_phase2_detected_y As Boolean
Public Gesture_phase2_detected_z As Boolean
Public Peak_phase1_detected_x As Boolean 'peak phase 1 detected
Public Peak_phase1_detected_y As Boolean
Public Peak_phase1_detected_z As Boolean

Public Peak_phase2_detected_x As Boolean 'peak phase 2 detected

Public Peak_phase2_detected_y As Boolean
Public Peak_phase2_detected_z As Boolean

Public Currently_auto_damping_x As Boolean 'auto-damping state

Public Currently_auto_damping_y As Boolean
Public Currently_auto_damping_z As Boolean

Public Valid_after_damp_x As Boolean ' this variable makes auto-damp wait for a complete cycle before working
Public Valid_after_damp_y As Boolean
Public Valid_after_damp_z As Boolean

Public X_val_previous As Integer 'stores previous value for of X_val for comparison
Public Y_val_previous As Integer
Public Z_val_previous As Integer

Public Auto_damping As Boolean 'whether or not to auto-damp

Public Damping_factor_x As Double 'damping factor

Public Damping_factor_y As Double
Public Damping_factor_z As Double

Public X_int As Integer ' intermediate value in data processing

Public Y_int As Integer
Public Z_int As Integer

Public Max_peak_phase1_x As Integer 'stores max value

Public Max_peak_phase1_y As Integer
Public Max_peak_phase1_z As Integer

Public Damping_time_x As Integer ' in milliseconds from time damping starts

Public Damping_time_y As Integer
Public Damping_time_z As Integer

Public Damping_duration As Integer 'in milliseconds how long you want it to last

Public X_low_thresh As Integer 'lower threshold

Public Y_low_thresh As Integer
Public Z_low_thresh As Integer

Public X_high_thresh As Integer 'upper threshold

Public Y_high_thresh As Integer
Public Z_high_thresh As Integer

Public Threshold As Integer 'value for threshold

Public X_out As Integer 'X_out is input to cube

Public Y_out As Integer
Public Z_out As Integer

Public Time_out As Integer 'time between consecutive gestures

Public Currently_timed_out_x As Boolean 'currently in timed-out state

Public Currently_timed_out_y As Boolean
Public Currently_timed_out_z As Boolean

Public Current_Time_out_time_x As Integer 'keeps record of time out elapsed

Public Current_Time_out_time_y As Integer
Public Current_Time_out_time_z As Integer

Public Max_output As Integer 'max output setting

Public Sensitivity As Integer 'sensitivity setting

Public Calibrated As Boolean 'calibrated variable

Module 2 (cube)
Public Type Corner_coordinate_type 'data structure for cube coordinates
x_coordinate As Double
y_coordinate As Double
z_coordinate As Double
End Type

Public Corner_coordinate(8) As Corner_coordinate_type 'array storing 3D coordinates of cube

Public Perspective_coordinate(8) As Corner_coordinate_type 'array storing 3D coordinates of cube with perspective added

Public Normal_coordinate(6) As Corner_coordinate_type 'coordinates of normal of each face (only y-component is used)

Public New_x_coordinate As Double 'allows backup to be made while circle geometry is used, then stored into x_coordinate
Public New_y_coordinate As Double
Public New_z_coordinate As Double

Public Center As Integer 'position of centre of cube on screen

Public Radius As Integer 'user defines radius of cube (centre to corner)
Public Start_length As Double 'calculated from radius. length of half a side

Public X_rotation As Double 'amount cube is rotated. derived from x_out

Public Y_rotation As Double
Public Z_rotation As Double

Public Pi As Double 'pi. for circle geometry

Public Declare Function Polygon Lib "gdi32" _
(ByVal hdc As Long, lpPoint As PointAPI, _
ByVal nCount As Long) As Long 'Win32 GDI function to draw polygons

Public Type PointAPI 'define type for polygon coordinates

X As Long
Z As Long
End Type

Public Sorted(6) As Boolean 'order of faces for back-face culling

Public Lowest_y As Double 'normal of furthest away face

Public Lowest_face As Integer 'furthest away face number
Public Render_order(6) As Integer 'order to render faces

Public Type Face_color_type 'type to define colours of faces

R As Double
G As Double
B As Double
End Type

Public Face_color(6) As Face_color_type 'array for face colours. type as above

Public Type Face_corner_type 'stores which corners link what face

Corner1 As Integer
Corner2 As Integer
Corner3 As Integer
Corner4 As Integer
End Type

Public Face_corners(6) As Face_corner_type 'array of type above. entry for each face

Public Test1_active As Boolean 'state of tests

Public Test2_active As Boolean

Public Test_timer As Long 'timer for elapsed test time

Module 3 (Device monitor)

Public Input_buffer(9) As Long 'array for received data
Public Endval As Integer 'pointer to next entry for received data
Public Valuex As Integer 'input values
Public Valuey As Integer
Public Valuez As Integer
Public Found_result As Integer 'true if valid header and 3 bytes of data in input buffer
Public Mytime As Double 'works out time from pc clock time
Public Starttime As Double 'clock time when logging started
Public Records As Integer 'number of records
Public Started As Integer 'logging data started
Public Activated_mode As Boolean 'state of device
Public Previous_button_header As Integer 'previous received button header to see if its changed
Public Simulate_input As Boolean 'state of simulated input feature
Public Start_test_timer As Boolean 'whether or not timing a test
Public Store_filename As String 'string of filename to store logged data file