T.R.I.S.S.H.
Target Response Inspection
Security System Home
EE 475 B
June 03, 2016
Jeffrey Nguyen
Jesus Sandoval
Minhhue H. Khuu
Table of Contents:
1. Abstract ................................................. 3
2. Introduction ............................................. 3
4. Testing ................................................. 23
   4.1 Test Plan ........................................... 23
   4.2 Test Specification .................................. 23
       4.2.1 Laser Movement ................................ 23
       4.2.2 Target Recognition ............................ 24
       4.2.3 Laser and Target Tracking ..................... 24
       4.2.4 User Menu ..................................... 24
       4.2.5 Manual Mode ................................... 24
   4.3 Test Case ........................................... 24
       4.3.1 Laser Movement ................................ 24
       4.3.2 Target Recognition ............................ 25
       4.3.3 Laser and Target Tracking ..................... 25
       4.3.4 User Menu ..................................... 25
       4.3.5 Manual Mode ................................... 25
1. Abstract
This report walks through the design cycle of our project, from initial design to final product. We begin with a discussion of the project, which describes our system, named by the acronym T.R.I.S.S.H.: Target Response Inspection Security System - Home edition. Its use and specification are defined by the design specification, which describes the final product we aimed to deliver to the customer/consumer or end user. We give a detailed description of the hardware and software modules that implement target tracking and selection, using image sensors (cameras) to drive a modeled weapon response: a laser mounted on servo motors for horizontal and vertical movement. We also cover the extra features, including a user interface with selectable modes: automatic mode, manual mode, and target selection. All of these features are described and explained in the discussion section of this report. The next section describes the test plan used to troubleshoot and verify proper function of our system. We then go over the results and analysis of that data, which we used to find errors, make design decisions and modifications, and, where problems were not resolved, to identify future work and research. Finally, we conclude the report with an analysis of the process and design cycle of the product.
2. Introduction
The purpose of this lab was to implement a target tracking and security system. Using camera sensors and image processing for the majority of the functionality, we took raw image size and location data to track multiple targets, then followed the object of choice with a motorized laser as an interactive response. We used a Raspberry Pi 3 running the Raspbian Linux distribution to process data and automate the system, programmed in C. Selectable functions and extra features were also programmed in for manual use of the system. Extensive testing and calibration of the location information and the corresponding laser movement gave us an accurate and effective system. We tested and limited target movement speed, characterized the lighting and color limitations of our system, and created a controlled, limited working prototype which can be expanded with higher-quality parts and further testing and calibration.
In manual mode, the user has complete control over the laser, with 10,000 points of motion via an analog stick, and can turn the laser on and off.
For more information about the project specification and its implementation, continue reading this section.
Physical dimensions: 30.5 cm × 13.4 cm × 17.8 cm; weight: 1687 g.
The environment for the use of this system is the following:
Moderate lighting
o White light, 7000 K color temperature
o Fluorescent light bulb
o 100 W power output
Moderate temperature
o 70.0 °F (21.1 °C)
o No wind conditions
o Low humidity
The system itself is quite durable and can be transported from one place to another with moderate ease, without worrying about damaging the product. That being said, the system cannot sustain large drops. Additional testing of the system's durability would be needed for actual production.
3.1.3 System Input and Output Specification
In this section, the system's input and output interaction with the user is explained: the user controls, and the display the user reads to understand the behavior and state of the system.
3.1.3.1 System Inputs
Buttons and Switches
o The buttons and switches are placed into a solderless breadboard with tight wiring to maintain signal integrity and a strong connection.
o The buttons and switches control some of the basic behavior of the system, including:
Reset, to reset the PIC microcontroller system.
Power, to shut down the system.
Enter button, to enter a selected option on the current menu system.
Back button, to return to the previous menu system.
o The input voltages have two states, logical low and logical high.
High readings: 5.0 V ± 0.025 V
Low readings: 0.0 V ± 0.025 V
The switches and buttons have a pull-up resistor for a defined voltage on an open circuit.
The buttons are active low, and are normally high.
Analog Stick
o Used to select menu options in the PIC.
o Used to manually control the laser when switched to manual mode.
o The input voltages are two analog values for the X and Y directions.
High readings: 5.0 V ± 0.025 V
Low readings: 0.0 V ± 0.025 V
Default readings: 2.5 V ± 0.025 V
The switches and buttons have a pull-up resistor for a defined voltage on an open circuit.
The buttons are active low, and are normally high.
Laser
SYSTEM               DESCRIPTION
PIC                  Reset the PIC system.
PIC                  Power the PIC system.
Raspberry Pi 3, PIC  A mode where the system will monitor the specified area.
Raspberry Pi 3, PIC  A mode where the user has full control over the laser.
Raspberry Pi 3, PIC  A mode to demonstrate the possible features this system can operate.
Raspberry Pi 3       Initializes the GPIO pins on the Raspberry Pi 3.
Raspberry Pi 3       A mode to calibrate the system on a white board.
PIC                  Select between the two targets.
PIC                  Turn the laser on or off while in manual mode.
Most of these operations will take time to map to the appropriate analog movements and switch combinations, but for now the following must be true.
SWITCH   FUNCTION
Reset    To reset the system
Power    To turn off the power
The user control for selecting a menu option on the LCD screen uses the following scheme.
CONTROL (ANALOG)   FUNCTION
Down/Up            Move the selection down and up on the LCD screen.
Right/Left         Move the selection right and left on the LCD screen.
Enter              Enter command for the selected option.
Back               Back command to the previous menu.
3.1.5
o UL 60065 Standard for Audio, Video and Similar Electronic Apparatus - Safety Requirements
microcontroller took 4 clock cycles, 5 instructions yielded a 1 microsecond delay. However, when dealing with function calls and for loops, each call causes a delay of some sort, so we had to take that into account. Overall, a for loop took about 26 instructions per iteration when iterating more than once, and a function call took 4 instructions. Using this, we were able to create an accurate usleep10() delay function, which delayed for 10 microseconds and gave us the ability to create pulses ranging from 1.00 ms to 2.00 ms.
After creating the delay function, we had to generate the correct pulse, then repeat it 10-20 times. This gave the motor enough time to reach its position: by sending the pulse 10-20 times, the motor was given roughly 200 ms to 400 ms to move into position, which was more than enough.
3.2.3 Image Detection
Image detection was done using the Pixy camera, which came with libraries of code for us to use. More detail on its implementation can be found on Pixy's website.
For our use case, we modified the hello_pixy.cpp code. Originally, the code detected an object and printed its target, x, y, height, and width information. Using this, we could map the given coordinate information to motion in our motor control system.
First, we had to send the data over to a temporary file. For this we used the mkfifo() call to create a named pipe, which held the camera program until its output was read by our movement program. Once read, our movement program parsed the given string and extracted the target, x, y, height, and width information, and the camera was able to continue this process continually. With two programs running simultaneously, we had to let the operating system and the Linux kernel schedule the programs, but from our test cases this seemed to be sufficient.
After getting the coordinate information from the camera, we took a series of measurements and derived a linear equation mapping camera coordinates to our laser motors, and the system was then capable of detecting an object and responding to it.
3.2.4 Raspberry Pi 3 and PIC Communication
Communication from the Raspberry Pi 3 to the PIC used two wires for send and confirm signals, plus 7 data lines, all in parallel. We used parallel transfer because of the 3.3 V to 5 V level issues that occurred; with parallel lines the PIC was able to reliably recognize each signal as high or low.
When sending data from the Raspberry Pi 3 to the PIC, a send signal is raised, and the Raspberry Pi 3 is then held until the PIC sees the send signal and returns a confirm signal. After that, the Raspberry Pi 3 recognizes that the transfer is finished and continues operation.
When sending data from the PIC to the Raspberry Pi 3, we used a single wire that acted as a multipurpose binary option wire, interpreted according to the current state of the Raspberry Pi 3. Due to the limitations of ports and time, we were not able to send more than 1 bit of information toward the Raspberry Pi 3, but this was sufficient for selecting between two objects in automatic mode and toggling the laser on/off in manual mode.
3.2.5 User Input
User input was a two-step process. The Raspberry Pi 3 first had to select the mode the user wanted. Then the user had to put the PIC into the matching operation state for it to retrieve the data from the Raspberry Pi 3 and respond correctly. With the communication system implemented, if either system was in the wrong state, the Raspberry Pi 3 or PIC would wait indefinitely until the send or confirm signal was received from the other system.
User input on the Raspberry Pi 3 was done at the terminal by a C program that waits for a single character input for mode selection, plus integer input for position information in demo mode (and nothing in all other modes).
User input on the PIC was done using the analog stick and two buttons to enter or back out of the menu system. This allowed the user to select an entry by using the analog stick to switch options.
3.2.6
3.3.1 System Inputs
Buttons and Switches
o The buttons and switches are placed into a solderless breadboard with tight wiring to maintain signal integrity and a strong connection.
o The buttons and switches control some of the basic behavior of the system, including:
Reset, to reset the PIC microcontroller system.
Power, to shut down the system.
Enter button, to enter a selected option on the current menu system.
Back button, to return to the previous menu system.
o The input voltages have two states, logical low and logical high.
High readings: 5.0 V ± 0.025 V
Low readings: 0.0 V ± 0.025 V
The switches and buttons have a pull-up resistor for a defined voltage on an open circuit.
The buttons are active low, and are normally high.
Analog Stick
o Used to select menu options in the PIC.
o Used to manually control the laser when switched to manual mode.
o The input voltages are two analog values for the X and Y directions.
High readings: 5.0 V ± 0.025 V
Low readings: 0.0 V ± 0.025 V
Default readings: 2.5 V ± 0.025 V
The switches and buttons have a pull-up resistor for a defined voltage on an open circuit.
The buttons are active low, and are normally high.
Data Input from Cameras
o PIXY Camera
Communicates with the Raspberry Pi 3 via mini USB 2.0.
Sends color tracking information for a specified colored object to the Raspberry Pi 3.
o NoIR Filtered Pi Camera
Communicates with the Raspberry Pi 3 via a proprietary cable.
Sends 8 MP images to the Raspberry Pi 3.
Capable of viewing in the infrared color spectrum.
o The camera data lines are measured against the following logic voltage levels:
High readings: 3.3 V ± 0.016 V
Low readings: 0.0 V ± 0.025 V
Console Control via Raspberry Pi 3
o Select modes to run when running the main program
i, initialization mode
Initializes GPIO pins
Typically run first
c, calibration mode
3.3.2 System Outputs
Servo Motors
o Motors are controlled using the PIC microcontroller, which generates accurate pulse waves 10 times for a given degree of motion.
o Motors are powered by a 5.0 V ± 0.025 V rail.
Horizontal motor ranges from 0 to 130 degrees.
Error of ± 2 degrees.
The horizontal axis is relative to the level plane of the system.
Vertical motor ranges from 0 to 130 degrees.
Error of ± 2 degrees.
The vertical axis is relative to the perpendicular plane of the system.
o Additional information on the motors is given in the implementation section.
Laser
o The laser can be turned on and off.
o Controlled by the Raspberry Pi 3.
o 650 nm, 5 mW output; red laser output when driven high.
o The output voltages have two states, logical low and logical high.
High readings: 5.0 V ± 0.025 V
Low readings: 0.0 V ± 0.025 V
LCD Display, to display menu and status.
o Used to give the user information on the status of the system and present options to operate on, with suggestions on operation.
o Uses a shift register for the data bits.
o Displays instructions and status on four lines.
o Display is powered by the system power of 5.0 V.
Image Saving
o The image is saved as a JPEG file onto an external USB drive.
o The image is captured when an object is detected for a set duration of time.
TRISA = 0b10110011;   // RA0/RA1 inputs for the joystick ADC, RA4 input
TRISC = 0b00000000;   // PORTC all outputs
TRISB = 0b11111111;   // PORTB all inputs (parallel data from the Raspberry Pi)
ANSELA = 0b00000011;  // RA0/RA1 analog, rest of PORTA digital
ANSELB = 0b00000000;  // PORTB digital
ANSELC = 0b00000000;  // PORTC digital
_RS = 0;              // LCD register select low
_RW = 0;              // LCD read/write low (write)
_OE = 0;              // shift register output enable low
msleep18();           // LCD power-up delay
lcdINIT();
Pins used: PORTBbits.RB0-RB7, PORTAbits.RA4, LATAbits.LATA2, LATAbits.LATA3, LATCbits.LATC1, LATCbits.LATC2, LATCbits.LATC3
cursorState = 0;
strcpypgm2ram(buf, "Automatic Mode ");
writeStr(buf);
trig = 1;
} else if (openADC(0) < 200 && !trig) {
    clearDisplay();
    cursorState = 1;
    strcpypgm2ram(buf, "Manual Mode ");
    writeStr(buf);
    trig = 1;
} else if (openADC(0) < 600 && openADC(0) > 400) {
    trig = 0;
}
Figure 3.4.3: Joystick Control Function
When manual mode is selected, the joystick is given the ability to control the laser movement. Figure 3.4.4 shows how we implemented the joystick control function for laser movement.
_CONFIRM = 0;
h_new = openADC(0);
v_new = openADC(1);
if (h != h_new) {
    h = h_new;
    h_degree(h / 10);
}
if (v != v_new) {
    v = v_new;
    v_degree(v / 10);
}
double a;
double b;
a = 0.2200 * x + 15.671;
b = 0.2300 * y - 9.607;
*h = (int) a;
*v = (int) b;
}
Figure 3.4.6: Mapped Signal For the Servo Motors
Another feature we added was a calibration mode, much like the calibration mode on touchscreens. Shown in Figure 3.4.7 is a small fragment of our 13-point calibration sequence.
printf("..........CALIBRATION..........\n");
printf("POINT 28-00\n");
degree(28,0);
while(!i) {
scanf("%d\n", &i);
}
i = 0;
printf("POINT 76-24\n");
degree(76,24);
while(!i) {
scanf("%d\n", &i);
}
i = 0;
printf("POINT 76-00\n");
degree(76,0);
while(!i) {
scanf("%d\n", &i);
}
i = 0;
Figure 3.4.7: Calibration Mode for laser.
Then for our manual pinpoint mode, we simply passed the two entered coordinates into our movement mapping equations, as shown in Figure 3.4.8.
printf("Enter a Horizontal Position\n");
scanf("%d", &h);
printf("Enter a Vertical Position\n");
scanf("%d", &v);
printf("Setting @ position (%d, %d)\n", h, v);
degree(h,v);
Figure 3.4.8: User Controlled input
Part No.                      Description/Purpose
N/A                           Main microprocessor
18f25k22                      2nd sub-microprocessor
CMUcam5                       Image processing, target tracking
NoIR PI cam V3                Capture low-light images
N/A                           Weapon simulation target tracking
N/A                           Display menu options
SG90 Micro Servo Motor 9G RC  Pan and tilt for tracking system
OSC-20                        Input clock for PIC microprocessor
c595N                         Reduce pin usage with serialization
Figure 3.5.2: Table of Parts Used
3.5.2 Raspberry Pi 3
We chose the Raspberry Pi 3 as our main processor because of the popularity of the platform and its ease of use. We avoided the older Raspberry Pi 1 and Raspberry Pi 2 models because of the following Raspberry Pi 3 specifications:
A 1.2 GHz 64-bit quad-core ARM Cortex-A53 CPU (~10x the performance of Raspberry Pi 1)
Integrated 802.11n wireless LAN and Bluetooth 4.1
Complete compatibility with Raspberry Pi 1 and 2
As these specs show, the Raspberry Pi 3 is nearly 10x faster than the Raspberry Pi 1, which let us compile and run our code comfortably and produce higher-resolution images from our cameras. It also includes LAN and Bluetooth. Given the nature of our project, we would want a homeowner, say, to receive notifications when an intruder has arrived; the wireless capabilities give us access to networking for future endeavors.
3.5.3 PIC Microprocessor
The PIC microprocessor was not initially planned for use. However, when implementing our servo motors, we noticed a 10% error in the duty cycle of the pulses we generated, caused by the unstable sleep timing the Pi provides. This 10% error was a huge issue when controlling the motors: they would swerve constantly back and forth. Using the PIC microprocessor also gave us a seamless transition to using an LCD for user-controlled actions.
3.5.4 Pixy Camera
The Pixy Cam was our main camera of choice for tracking, due to its built-in image processing and ease of use. For our project, we needed image processing that outputs data we can interpret and read, and the Pixy Cam does this automatically. Along with the ability to set signatures and adjust the hue and brightness of the camera, it gave us multiple readings that could then be interpreted for our laser and servo motors. Also, since the makers of the Pixy Cam (Charmed Labs) support communication between the Raspberry Pi and the Pixy Cam, this camera was ideal for what we wanted to accomplish.
3.5.5 Night Vision Camera
For the night vision camera, we used the NoIR PI cam V3, which hooks directly to the Raspberry Pi. We chose this camera because it solved our brightness issues and was easy to use. The NoIR PI cam V3 let us manually set a night mode for when our Pixy Cam cannot read anything. Also, since the camera itself records 1080p video, we could be assured the picture quality would be high.
3.5.6 Laser
For our laser, we used a generic laser that simply indicates where the system is pointing. This part is used solely for demoing purposes.
3.5.7 Servo Motors
For our motors, we used the SG90 Micro Servo Motor 9G RC to move our laser. The main reasons for choosing these motors were their small size and their cost. Since cost was a huge factor in our project, we wanted to use effective yet cheap parts, and these servos fit that requirement. With an operating speed of 0.12 s/60 degrees (4.8 V, no load), we were able to effectively model our laser movement.
3.5.8 Shift Registers
The C595N shift register was used on our data busses to consolidate pins and economize pin usage. We used 4 pins: 2 clocks generated from the PIC processor, 1 serialized data pin, and an output enable control pin, also generated by our PIC processor.
3.5.9 LCD
The LCD we chose was a standard HD44780 2x16 display. It allowed us to write ASCII characters to display our measurement readings. We connected control pins and data pins from the sub-processor, along with +5 V power and 0 V ground, and used software in the processor to encode the characters we wanted to display. With the 16 characters on one line, we could display a description of the reading along with a value of no more than 5 digits and its units.
4. Testing
4.1 Test Plan
The first main part of testing our system was calibrating the movement of the laser. The laser is controlled by a C program on the Raspberry Pi, which drives servo motors mounted on a pan-and-tilt pivot system. We have to test the accuracy of laser movement and calibrate both the horizontal and vertical movement range, then ensure correct placement of the laser as specified by our programming.
We also have to test the accuracy of the Pixy camera image. This means testing the accuracy of color signatures being recognized for tracking purposes, and analyzing what information we get about the desired color signature in a controlled space. So we want to test the accuracy of the object being recognized, meaning the dimensions of the object, and secondly the coordinates of our target object as processed for our test area.
Combining the image processing with laser movement, we want to test accurate movement of the laser relative to the object, as we want to track the object using its size and location information.
User Menu
Check LCD output of the menu
Use the controller to scroll through menu options, watching the LCD for the correct menu change
Select a menu option by pressing the button; watch for the correct LCD menu change
Check that the selected menu option's function is operational
4.2.5 Manual Mode
1. Select manual mode in the user menu
2. Use the joystick to move left, right, up and down
3. See the laser move left, right, up and down
4.3.1 Laser Movement
Set a grid box on a white board, 100 cm by 100 cm, at a distance of 132 cm from the laser
Set an origin point in the center of the grid
Move one degree of movement right, left, up, and down by stepping the servo motor
Then move to the limits of the test grid, furthest from the origin in each direction
4.3.2 Target Recognition
Set a grid box on a white board, 100 cm by 100 cm, at a distance of 132 cm from the camera
Put the target in the bottom left corner of the grid
Check the value of the measured target by pixel coordinates
Move 1 cm in the upward vertical direction; check the increase in the vertical y coordinate
Repeat the process 10 times for accurate response in the positive y direction
Reset to the bottom left corner of the grid
Move 1 cm in the right horizontal direction; check the increase in the horizontal x coordinate
Repeat the process 10 times for accurate response in the positive x direction
4.3.3 Laser and Target Tracking
4.3.4 User Menu
Check the joystick voltage at center orientation: should measure 2.5 V on both the x-axis and y-axis pins
Check the joystick voltage at the extremes: should measure 5 V in the right orientation and 0 V in the left orientation on the x-axis pin
Check the joystick voltage fully deflected: should measure 5 V in the up orientation and 0 V in the down orientation on the y-axis pin
Check that right, left, up, and down joystick selections match the desired menu option in the program
Repeat menu selection 10 times with controlled movements to verify correct operation of the user interface
4.3.5 Manual Mode
Check the joystick voltage at center orientation: should measure 2.5 V on both the x-axis and y-axis pins
Check the joystick voltage at the extremes: should measure 5 V in the right orientation and 0 V in the left orientation on the x-axis pin
Check the joystick voltage fully deflected: should measure 5 V in the up orientation and 0 V in the down orientation on the y-axis pin
Check for a linear increase across each axis pin from 0-5 V
XPoint  HorizontalDegree  YPoint  VerticalDegree
163     50                48      0
165     50                102     12
164     50                166     24
44      28                60      0
50      28                120     12
67      28                170     24
280     76                54      0
276     76                110     12
262     76                169     24
108     39                78      6
111     39                138     18
231     64                71      6
226     64                136     18
Analyzing the speed of our system over multiple runs, we found a lag between target acquisition and motor redirection to the specified target. After capturing multiple data points, we could see a common theme; the average time from recognition to targeting is shown in the following table. Testing on our grid, we chose opposite corners to measure the largest latency between recognition and acquisition, and found an average time of 1.056 s. This would not be ideal for a real-world application, but noting the delay, we were able to move on and continue to adjust and make our tracking system work correctly.
Trial  Time (s)
1      1.11
2      0.95
3      1.16
4      1.05
5      1.16
6      0.89
7      1.09
8      1.11
9      0.97
10     1.07
Precision was tested by selecting a specified location on our grid, marking it, and then repeating the move. Over multiple measurements, we found a slight difference in location each time we chose the same specified location. At 132 cm from the laser mount, our laser spot diameter was approximately 0.8 cm, and our standard error of movement averaged 1.6 cm when repeating a move from the same previous location. Where we saw a dramatic deviation, and a need for improvement, was in moving in from multiple different starting locations, where the error averaged 3.67 cm. We attribute this to the servo motors' motion being controlled by precision timing of a pulse wave: a short movement has a short travel time, while a large movement has a longer travel time and therefore a higher velocity, making it prone to a larger difference in final location than a short move. We accounted for this by setting up a calibration zone: a box extending 4 cm in each direction (up, down, left, and right), which absorbs the error difference and keeps us within range of the point we want. Although this is not ideal, for the purpose of this lab we were able to work with this amount of calibration and maintain a reasonably accurate location approximation.
Input Signal: SA0 Result
Reset: When the input is stuck at 0, the entire system will not operate, because the input signal is active low. If the power is also off while the reset is off, then the entire system will be powered off, yielding an output of 0 on all lines. If the power is on, then the output of the system will yield unstable and undeterminable results.
Clock: If Clock is stuck at 0, the entire system will fail; the system will never be able to execute an operation. The system depends on a pulsing clock to be able to move on and execute the program that is programmed on the PIC system.
X Coordinate: If the X coordinate from the analog stick is stuck at 0, then the selection will always move toward the down option while in the menu system. If the system is in manual mode at the time of the error, then the laser will go to its lowest horizontal angle and remain there.
Y Coordinate: If the Y coordinate from the analog stick is stuck at 0 while in the menu system, then nothing will happen whatsoever. If the system is in manual mode at the time of the error, then the laser will go to its lowest vertical angle and remain there.
SEND: The SEND signal only has effect in automatic, calibration, and demo modes. If the SEND signal from the Raspberry Pi 3 is stuck at 0, then the PIC will be signaled to read the data bits, and will read them indefinitely until the SEND signal turns 1.
DATA [6:0]: The DATA signal only has effect in automatic, calibration, and demo modes. If the data bits from the Raspberry Pi 3 are stuck at 0, then the data bits will be 0. This is an issue if the SEND signal is 0 as well, since the data bits will be read as 0 indefinitely.
Input Signal: SA1 Result
Reset: When the input is stuck at 1, the entire system will be operational, because the input signal is active low. The system will behave as programmed and described in the system description.
Clock: If Clock is stuck at 1, the entire system will fail; the system will never be able to execute an operation. The system depends on a pulsing clock to be able to move on and execute the program that is programmed on the PIC system.
X Coordinate: If the X coordinate from the analog stick is stuck at 1, then the selection will always move toward the up option while in the menu system. If the system is in manual mode at the time of the error, then the laser will go to its highest horizontal angle and remain there.
Y Coordinate: If the Y coordinate from the analog stick is stuck at 1 while in the menu system, then nothing will happen whatsoever. If the system is in manual mode at the time of the error, then the laser will go to its highest vertical angle and remain there.
SEND: The SEND signal only has effect in automatic, calibration, and demo modes. If the SEND signal from the Raspberry Pi 3 is stuck at 1, then the PIC will wait indefinitely for the SEND signal to become 0 before continuing automatic operation.
DATA [6:0]: The DATA signal only has effect in automatic, calibration, and demo modes. If the data bits from the Raspberry Pi 3 are stuck at 1, then the data bits will be 1. This is an issue if the SEND signal is 1 as well, since the data bits will be read as 1 indefinitely.
Input Signal: SA0 Result
CONFIRM: The CONFIRM signal only has effect in automatic, calibration, and demo modes. If CONFIRM is stuck at 0, then the Raspberry Pi 3 will continue operation, assuming that the PIC has received all of the information sent to it. As a result, when CONFIRM is stuck at 0, the PIC will move the motor to wherever the data bits are set by the Raspberry Pi 3.
LASER/TARGET: While in manual mode, if the LASER/TARGET signal is stuck at 0, then the laser will remain off. While in automatic mode, if the LASER/TARGET signal is stuck at 0, then the target selection will be stuck at target 1. In other modes, LASER/TARGET has no effect.
Table 5. SA1 Input of Raspberry Pi 3
Input Signal | SA1 Result
CONFIRM: The CONFIRM signal only has effect in automatic, calibration, and
demo modes. If CONFIRM is stuck at 1, the Raspberry Pi 3 will wait
indefinitely for the CONFIRM signal to return to 0 before continuing
operation. When CONFIRM is stuck at 1, the Raspberry Pi 3 assumes that the
PIC has not yet received the command. As a result, the Raspberry Pi 3 will be
stuck waiting.
LASER/TARGET: While in Manual Mode, if the LASER/TARGET signal is stuck at 1,
the laser will remain on. While in Automatic Mode, if the LASER/TARGET signal
is stuck at 1, the target selection will be stuck at target 2. In other modes,
LASER/TARGET has no effect.
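The SA0 and SA1 rows for CONFIRM can be read against a sketch of the Raspberry Pi 3 side of the handshake. This is a hypothetical reconstruction consistent with the table entries (the function names are ours): CONFIRM high is interpreted as "the PIC has not received the command yet", so stuck-at-1 traps the wait loop forever, while stuck-at-0 falls straight through and the Pi wrongly assumes delivery.

```python
def send_command(bits, write_data, write_send, read_confirm):
    # Hypothetical sketch of the Raspberry Pi 3 side of the handshake
    # (names are ours, not from the project code). CONFIRM high means
    # "PIC still receiving": stuck at 1 traps this loop forever (SA1),
    # stuck at 0 falls straight through so the Pi wrongly assumes the
    # PIC received everything (SA0).
    write_data(bits)
    write_send(1)
    while read_confirm() == 1:   # wait for the PIC to finish receiving
        pass
    write_send(0)

# Healthy exchange: CONFIRM goes high while the PIC latches, then low.
confirm_seq = [1, 1, 0]
data_writes, send_writes = [], []
send_command(7, data_writes.append, send_writes.append,
             lambda: confirm_seq.pop(0))
print(data_writes, send_writes)  # [7] [1, 0]
```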
5.5.1 Software
The bonus features we wanted to implement required many of the processor's
ports. In hindsight, the simple remedy would have been to use a different
microcontroller with more ports. However, since our software was written for
the PIC 18F25K22, we decided to share ports between features. This required us
to manually enter our settings through the console during the demo.
5.5.1.2 Image Processing
Another error occurred when calling the Save Image function to save images
directly onto the USB drive from the Raspberry Pi: the program would freeze
and crash. We attributed this to the high resolution of the camera. Because
the camera produced very high resolution pictures, CPU utilization would spike
to very high numbers, causing the program to freeze and then crash. To remedy
this problem we had two choices: use a stronger processor (something other
than the Raspberry Pi 3) or reduce the resolution of our camera.
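The second remedy can be sketched briefly. Assuming a captured frame arrives as a NumPy array (as it would from OpenCV or the picamera library; the 2592 x 1944 native size below is our assumption, not a figure from the report), even a crude stride-based downscale cuts the pixel count, and hence the CPU load of encoding and saving, by the square of the factor:

```python
import numpy as np

def downscale_frame(frame, factor=4):
    # Keep every factor-th pixel in each dimension before handing the
    # frame to the save routine; pixel count drops by factor ** 2.
    return frame[::factor, ::factor]

# A hypothetical full-resolution 1944 x 2592 RGB frame shrinks to 486 x 648.
frame = np.zeros((1944, 2592, 3), dtype=np.uint8)
print(downscale_frame(frame, factor=4).shape)  # (486, 648, 3)
```

A proper resize (for example cv2.resize with area interpolation) would preserve more detail than striding, but either way the point is the same trade-off we faced between image quality and processor load.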
5.5.2 Hardware
Many features needed to be implemented and tested for completeness of our
design. We came up with a plan to test the limits and the regular use cases
against expected results, and this enabled us to find and troubleshoot errors
in our design. In our test plan, taking many physical measurements of the
target space and laser angles was key to our system working correctly.
Calibration was the main focus of testing and troubleshooting, but signal
integrity and communication between processors and sensors were also key to
the system's function. Finally, we used our test plan to measure error in our
system and to document difficulties that could not be resolved within the
scope and timeline of our project. This error analysis will be key to
anticipating and resolving issues quickly in future development, as well as
to planning future design changes if we continue to enhance and develop our
product. We successfully implemented a working system that tracked selectable
targets automatically, and we implemented all of the extra ease-of-use
features, but there were limitations in accuracy and speed that would need
future work before the system could be an acceptable consumer security
product. These limitations do not deter us, as analyzing them gave rise to
possible solutions and suggests that an end product is attainable given more
time and resources.
In conclusion, we designed a motion tracking and response security device
which we named T.R.I.S.S.H., the acronym standing for Target Response
Inspection Security System - Home edition. The purpose was to design it using
cost-effective parts so that it could scale up to a possible weaponized system
for military and law enforcement use. Implementing a real-time, accurate
system with this technology was challenging, but given more time and resources
we feel our prototype would be capable of real consumer-quality results.
Analyzing the possible errors was informative because it raised our standards
for the future testing and product development that would be needed for a real
product. This was a great exercise in the product development cycle: most
importantly, isolating a problem in the world that could be addressed,
improved, or solved, and finding a cost-effective way of doing it. We feel it
helped shape our minds as innovators and revealed entrepreneurial
opportunities which we hope to capitalize on in industry. The work ethic
needed to start and complete a project of our own creation proved very
demanding and required real self-motivation. Again, we feel this helped
prepare us for life after the university, employing all of the Electrical
Engineering and Computer Science skills we have gained through our academic
careers to successfully implement this project idea. Finally, we are proud to
have worked in a great team that supported each other through the completion
and resolution of difficult tasks and assignments, again a most valuable skill
for life after academia.
CONTRIBUTIONS
Jeffrey Nguyen
Jesus Sandoval
Minhhue H. Khuu