
Final Project

T.R.I.S.S.H.
Target Response Inspection
Security System Home
EE 475 B
June 03, 2016
NAME
Jeffrey Nguyen
Jesus Sandoval
Minhhue H. Khuu

SIGNATURE

STUDENT ID

Table of Contents:

1. Abstract
2. Introduction
3. Discussion of the Project
   3.1 System Description
       3.1.1 System Physical Description
       3.1.2 Specification of External Environment
       3.1.3 System Input and Output Specification
       3.1.4 User Interface
       3.1.5 System Functional Specification
       3.1.6 Operating Specifications
       3.1.7 Reliability and Safety Specification
   3.2 Design Procedure
       3.2.1 Tools
       3.2.2 Motor Control
       3.2.3 Image Detection
       3.2.4 Raspberry Pi 3 and PIC Communication
       3.2.5 User Input
       3.2.6 Real Life Circuit
   3.3 System Description
       3.3.1 System Inputs
       3.3.2 System Outputs
   3.4 Software Implementation
   3.5 Hardware Implementation
       3.5.1 Top Level Block Diagram
       3.5.2 Raspberry Pi 3
       3.5.3 PIC Microprocessor
       3.5.4 Pixy Camera
       3.5.5 Night Vision Camera
       3.5.6 Laser
       3.5.7 Servo Motors
       3.5.8 Shift Registers
       3.5.9 LCD
4. Testing
   4.1 Test Plan
   4.2 Test Specification
       4.2.1 Laser Movement
       4.2.2 Target Recognition
       4.2.3 Laser and Target Tracking
       4.2.4 User Menu
       4.2.5 Manual Mode
   4.3 Test Case
       4.3.1 Laser Movement
       4.3.2 Target Recognition
       4.3.3 Laser and Target Tracking
       4.3.4 User Menu
       4.3.5 Manual Mode
5. Presentation, Discussion, and Analysis of the Results
   5.1 Testing Laser/Motor Range of Movement
   5.2 Target Recognition Results
   5.3 Motion Tracking and Target Following Calibration
       5.3.1 Speed and Precision
   5.4 Failure Mode Analysis
       5.4.1 PIC Failure Mode Analysis
       5.4.2 Raspberry Pi 3 Failure Mode Analysis
   5.5 Analysis of Errors and Possible Errors
       5.5.1 Software
       5.5.2 Hardware
6. Summary and Conclusion
7. Authors and Contributions

1. Abstract
This report walks through the design cycle of our project, from initial concept to final product. We begin with a discussion of the project and a description of our system, which we have named T.R.I.S.S.H., an acronym for Target Response Inspection Security System - Home edition. Its intended use is captured by the design specification, which defines the final product we set out to build for the customer or end user. We then give a detailed description of the hardware and software modules that implement target tracking and selection, using image sensors (cameras) to drive the modeled weapon response: a laser mounted on servo motors with horizontal and vertical movement. We also describe the extra features, including a user interface with selectable modes: automatic mode, manual mode, and target selection. All of these features are described and explained in the discussion section of this report. The following section describes the test plan used to troubleshoot the system and prove proper function. We then present the results and the analysis of that data, which we used to find errors, make design decisions and modifications, and, where problems remained unresolved, identify future work and research. Finally, we conclude the report with an analysis of the overall design cycle of the product.

2. Introduction
The purpose of this project was to implement a target tracking and security system. Using camera sensors and image processing for most of the functionality, we took raw image size and location data to track multiple targets, then followed the object of choice with a motorized laser as an interactive response. We used a Raspberry Pi 3 running the Raspbian Linux distribution to process the data, and programmed the system's automation in C. Selectable functions and extra features were also programmed in for manual use of the system. Extensive testing and calibration of the location information and the corresponding laser movement were required to make the system accurate and effective. We tested and bounded the movement speed of targets, characterized the lighting and color limitations of the system, and produced a controlled, limited working prototype that could be expanded with higher quality parts and further testing and calibration.

3. Discussion of the Project

The overall project was a final design project intended to walk through the entire development process: initial planning, building, debugging, and testing. The build itself was accomplished over a span of four weeks, with the planning and design phase carried out over the preceding six weeks while other lab tasks were being completed during the quarter.
The project is named T.R.I.S.S.H., short for its full name T.R.I.S.S.H.U.L.A., an acronym for Target Response Inspection Security System Home Use Laser Automation. As the name suggests, the project is intended for home use; detailed specifications are given later in this section. The system is an automatic/manual controlled laser system that uses color detection to distinguish a target and uses the laser to follow the colored object as it moves, in real time, with a latency of 500 ms. The current implementation is limited to two colored objects, red and yellow, but can follow either one in automatic mode, and the user may switch between the two. In manual mode, the user has complete control over the laser, with 10,000 points of motion via an analog stick, and can turn the laser on and off.
For more information about the project specification and its implementation, continue reading this section.

3.1 System Description

This system combines camera and image processing technology with a microprocessor to differentiate between targets and non-targets. A target is tracked and followed by a laser pointer (the modeled weapon) that locks onto and "shoots" targets. The user can set a target through the user interface by scanning a colored object that matches the color of the object of interest. Both targets and non-targets receive visual and audio warnings. The laser weapon and camera are mounted on a motorized rotational system so that the target can be followed. Image-processed color codes and gestures, along with an analog stick, buttons, and switches, activate the settings and modes for input to and output from the system. Information is also stored to memory in the form of images named with a time stamp, in order to keep a history of the objects detected in previous and current detections.
3.1.1 System Physical Description
The system's physical structure is built from a couple of 2x4 wood blocks that form the mount, with the hardware components glued onto the blocks. The physical dimensions and weight of the system are shown in Table 1.

Table 1. Physical Dimensions and Weight of the System

Measurement   Value
Height        30.5 cm
Width         13.4 cm
Depth         17.8 cm
Weight        1687 g

3.1.2 Specification of External Environment

The system is a camera/weapon unit mounted on a motorized pivot system, with the movement controlled by a PIC microcontroller that is hardwired to a Raspberry Pi 3, which computes the required movement. Power is supplied by two separate power rails, one for the PIC and one for the motors, using an external power supply. However, the system can also be powered by batteries, so long as the batteries supply 5 V at up to 1 A of current; for example, two packs of four AA batteries can power the system.
The scenario in which this targeting system should be used is strictly the home or other indoor environments. The target tracking system was made with home use in mind, which means moderate lighting, low wind conditions, moderate temperature, and other household conditions, listed below. The system was tested at a fixed distance from a white board in a lab environment, and as such that is the most ideal environment for use; more detail on the testing scenario can be found in the testing portion of this documentation.

The intended operating environment is as follows:
• Moderate lighting
  o White light, 7000 K color temperature
  o Fluorescent light bulb
  o 100 W power output
• Moderate temperature
  o 70.0 F (21.1 C)
  o No wind
  o Low humidity

The system itself is quite durable and can be transported from one place to another with moderate ease without worrying about damaging the product. That said, the system is not capable of sustaining large drops, and additional durability testing would need to be done before actual production.
3.1.3 System Input and Output Specification
This section explains the system's input and output interaction with the user: the user controls, and the display the user reads to understand the behavior and state of the system.
3.1.3.1 System Inputs
• Buttons and switches
  o The buttons and switches are placed in a solderless breadboard with tight wiring to maintain signal integrity and a strong connection.
  o The buttons and switches control some of the basic behavior of the system, including:
    ▪ Reset, to reset the PIC microcontroller system.
    ▪ Power, to shut down the system.
    ▪ Enter button, to enter the selected option in the current menu.
    ▪ Back button, to return to the previous menu.
  o The input voltages have two states, logical low and logical high:
    ▪ High readings: 5.0 V ± 0.025 V
    ▪ Low readings: 0.0 V ± 0.025 V
    ▪ The switches and buttons have a pull-up resistor for a defined voltage on an open circuit.
    ▪ The buttons are active low and normally high.
• Analog stick
  o Used to select menu options on the PIC.
  o Used to manually control the laser when switched to manual mode.
  o The input voltages are two analog values, for the X and Y directions:
    ▪ High readings: 5.0 V ± 0.025 V
    ▪ Low readings: 0.0 V ± 0.025 V
    ▪ Default readings: 2.5 V ± 0.025 V
• Data input from cameras
  o PIXY camera
    ▪ Communicates with the Raspberry Pi 3 via mini USB 2.0.
    ▪ Sends color tracking information for the specified colored object to the Raspberry Pi 3.
  o NoIR filtered Pi camera
    ▪ Communicates with the Raspberry Pi 3 via its proprietary cable.
    ▪ Sends 8 MP images to the Raspberry Pi 3.
    ▪ Capable of viewing the infrared color spectrum.
  o The camera data lines operate at the following voltage levels:
    ▪ High readings: 3.3 V ± 0.016 V
    ▪ Low readings: 0.0 V ± 0.025 V
• Console control via the Raspberry Pi 3
  o Selects the mode to run when running the main program:
    ▪ i, initialization mode: initializes the GPIO pins; typically run first.
    ▪ c, calibration mode: used to calibrate on a white board and to test the system.
    ▪ a, automatic mode: switches the system to automatic mode; waits for the PIC to switch into automatic mode, for the PIC to send the target select signal, and for the PIXY camera to send tracking information.
    ▪ d, demo mode: allows the user to change the position of the laser; horizontal input ranges from 1-99, where 50 is center; vertical input ranges from 1-99, where 0 is upright.
    ▪ m, manual mode: switches control to the PIC microcontroller and analog stick.

3.1.3.2 System Outputs

• Servo motors
  o The motors are controlled by the PIC microcontroller, which generates the accurate pulse waves 10 times for a given degree of motion.
  o The motors are powered by the 5.0 V ± 0.025 V rail.
    ▪ The horizontal motor ranges from 0 to 130 degrees, with an error of ±2 degrees. The horizontal axis is relative to the level plane of the system.
    ▪ The vertical motor ranges from 0 to 130 degrees, with an error of ±2 degrees. The vertical axis is relative to the plane perpendicular to the system.
  o Additional information on the motors is given in the implementation section.
• Laser
  o The laser can be turned on and off.
  o Controlled by the Raspberry Pi 3.
  o 650 nm, 5 mW output; a logic high corresponds to red laser output.
  o The output voltages have two states, logical low and logical high:
    ▪ High readings: 5.0 V ± 0.025 V
    ▪ Low readings: 0.0 V ± 0.025 V
• LCD display, for the menu and status
  o Gives the user information on the status of the system and presents the options available for operating it.
  o Uses a shift register for the data bits.
  o Displays instructions and status on four lines.
  o Powered by the 5.0 V system rail.
• Image saving
  o Images are saved as JPEG files onto an external USB drive.
  o An image is captured when an object is detected for a set duration of time.

3.1.4 User Interface

The user interface is where the user is told what they can and cannot do, based on the specification of the system. Some commands are entered through two subsystems that must collaborate to produce one cohesive operation. Commands can be sent to the PIC, the Raspberry Pi 3, or both. The operations the user can perform are:

COMMAND         INPUT SYSTEM           DESCRIPTION
Reset           PIC                    Reset the PIC system.
Power           PIC                    Power the PIC system.
Automatic       Raspberry Pi 3, PIC    A mode where the system monitors the specified area.
Manual          Raspberry Pi 3, PIC    A mode where the user has full control over the laser.
Demo            Raspberry Pi 3, PIC    A mode that demonstrates the features the system can perform.
Initialization  Raspberry Pi 3         Initializes the GPIO pins on the Raspberry Pi 3.
Calibration     Raspberry Pi 3         A mode to calibrate the system on a white board.
Select Target   PIC                    Select between the two targets.
Laser On/Off    PIC                    Turn the laser on or off while in manual mode.

Most of these operations take time to map to the appropriate analog movement and switch combinations, but for now the following must hold:

SWITCH   FUNCTION
Reset    Resets the system.
Power    Turns off the power.

The user control for selecting a menu option on the LCD screen follows this scheme:

CONTROL (ANALOG)   FUNCTION
Down/Up            Move down and up on the LCD screen.
Right/Left         Move right and left on the LCD screen.
Enter              Enter the selected option.
Back               Return to the previous menu.

3.1.5 System Functional Specification

The system is intended to differentiate between targets and non-targets using a camera and image processing, and to track and follow a target. A user or controller can use buttons, switches, and image-recognized color-coded control gestures to shoot a target, send audio/visual warnings, change the state of the system (on/off), and enter the data save and read modes. The system must be capable of tracking any object that matches the target description indefinitely, until otherwise specified. The system should also keep track of the events that occur and label each accurately; every event should capture a time stamp and save the image to an external USB drive for future viewing.
3.1.6 Operating Specifications
The system shall operate in a controlled room with color-coded targets and non-targets that move left, right, up, down, and back and forth toward the camera/weapon module. The operating voltages are set as follows:

COMPONENT                       VOLTAGE (relative to ground)
System rail                     5.0 V
LCD                             5.0 V
PIC                             5.0 V
Raspberry Pi 3 and PIXY cam     3.3 V
Analog stick                    0.0 V to 5.0 V
Buttons and switches            0.0 V or 5.0 V

3.1.7 Reliability and Safety Specification

Because this system deals with security and a targeting self-defense response, safety and reliability are a large part of its design ideology and philosophy. Additional research will be required to pin down the regulations the system must follow in order to become a truly safe and reliable product within the applicable standards.
The motion capture and target tracking system shall comply with the following safety standards:
• UL 73, Standard for Motor-Operated Appliances
• UL 1004-6, Standard for Servo and Stepper Motors
• UL 1004-7, Standard for Electronically Protected Motors
• UL 60065, Standard for Audio, Video and Similar Electronic Apparatus - Safety Requirements

3.2 Design Procedure

In this section we discuss the overall procedure we followed to design our system according to the design specification. We start with the tools we used for design and test, then proceed to the key designs: motor control, image detection, PIC to Raspberry Pi 3 communication, and user input.
3.2.1 Tools
In this lab we used a multitude of tools, primarily for programming and debugging the Raspberry Pi 3 and the PIC. These tools made the final project possible.

TOOL                               PURPOSE
InfiniiVision 4000 X-Series MSO    Analyze real-world digital logic waveforms.
DC power supply                    Fixed +5 V rail to power the circuit.
Digital multimeter                 Troubleshooting by measuring analog voltages.
Notepad++                          Text editor.
MPLAB IDE                          Programming the PIC microprocessor.
PICkit 3                           Hardware programmer/debugger for the PIC.
C                                  The primary programming language for the PIC and Raspberry Pi 3.
GCC                                Compiler for the C programs written on the Raspberry Pi 3.
Bash script                        Script to run the camera function.
3.2.2 Motor Control
One of the most important components to be designed was the motor control. The servo motors operated on pulses with a 20 ms period and used 0.5 ms to 2.5 ms high pulses to set the position of the motor. That equates to a duty cycle of 2.5% to 12.5%, which was not achievable on the Raspberry Pi 3 because of its imprecise sleep function. Since the sleep function on the Raspberry Pi 3 did not allow us to create precise pulses, we relied on the PIC to control the motors. With the PIC as the motor controller, we were able to control the motors with great precision and good resolution; in the end, we achieved 100 points of motion.
To design the motor control, we created a pulse that ranged from 1.0 ms to 2.0 ms, in 0.01 ms steps. Rather than generating the pulse relative to the full 20 ms period, we modeled the 20 ms frame as a guaranteed 1 ms high pulse at the start and a fixed 18 ms low period at the end. This left only a 1 ms segment under our control, which we divided into a resolution of 100 increments.
The first step was creating an exact 10 microsecond delay function. To do this, we had to calculate how many Nop() operations were required for a 10 microsecond delay. Since we were using a 20 MHz clock, and each instruction in the PIC microcontroller takes 4 clock cycles, five instructions yield a 1 microsecond delay. However, function calls and for loops each introduce overhead that had to be taken into account: overall, a for loop took about 26 instructions per iteration beyond the first, and a function call took 4 instructions. Using this, we created an exact usleep10() function, which delays for 10 microseconds and gave us the ability to create pulses ranging from 1.00 ms to 2.00 ms.
After creating the delay function, we had to generate the pulse correctly and then repeat it 10-20 times. This gave the motor enough time to reach its position: sending the pulse 10-20 times gives the motor roughly 200 ms to 400 ms to move, which was more than enough.
3.2.3 Image Detection
Image detection was done with the PIXY camera, which comes with code libraries we were able to use; more detail on its implementation can be found on PIXY's website. For our use case, we modified the hello_pixy.cpp example. The original code detects an object and prints its information: the target, x, y, height, and width. Using this coordinate information, we could map detections to motion with our motor control system.
First, we had to send the data over to a temporary file. We used the mkfifo() command to create a named pipe, which halts the camera program until its output is read by our movement program. Once read, the movement program parses the given string, extracts the target, x, y, height, and width, and the camera program continues, repeating this process continually. With two programs running simultaneously, we had to let the operating system and the Linux kernel schedule them, but in our test cases this proved sufficient.
After getting the coordinate information from the camera, we took a series of measurements and derived a linear equation mapping camera coordinates to laser motor positions, after which the system was capable of detecting an object and responding to it.
3.2.4 Raspberry Pi 3 and PIC Communication
Communication from the Raspberry Pi 3 to the PIC uses two wires for the send and confirm signals and 7 data lines, all in parallel. A parallel scheme was chosen because of the 3.3 V to 5 V level issues we encountered; with parallel data transfer, the PIC could still recognize each line as high or low.
When sending data from the Raspberry Pi 3 to the PIC, a send signal is asserted and the Raspberry Pi 3 halts until the PIC sees the send signal and returns a confirm signal. The Raspberry Pi 3 then recognizes that the transfer is finished and continues operation.
When sending data from the PIC to the Raspberry Pi 3, we used one wire acting as a multipurpose binary option line whose meaning depends on the current state of the Raspberry Pi 3. Due to limitations of ports and time, we were not able to send more than one bit of information to the Raspberry Pi 3, but this was sufficient for selecting between two objects in automatic mode and toggling the laser in manual mode.
3.2.5 User Input
User input is a two-step process. The Raspberry Pi 3 first selects the mode the user wants to be in; the user then puts the PIC into the corresponding operating state so that it retrieves the data from the Raspberry Pi 3 and responds correctly. With the communication system implemented, if either system is in the wrong state, the Raspberry Pi 3 or PIC waits indefinitely until the send or confirm signal is received from the other.
User input on the Raspberry Pi 3 is done at the terminal by a C program that waits for a single character for mode selection, plus integer position input in demo mode (and no further input in the other modes).
User input on the PIC is done with the analog stick and two buttons, to enter or back out of the menu system. The user selects an entry by switching between options with the analog stick.
3.2.6 Real Life Circuit

[Photographs of the assembled circuit]

3.3 System Description


3.3.1

System Inputs
Button and Switches
o The buttons and switches are placed in to a solderless breadboard with tight
wiring to maintain signal integrity and strong connection.
o The buttons and switches will be used to control some of the basic behavior of
the system, this includes:
Reset, to reset the PIC microcontroller system.
Power, to shut down the system.
Enter button, to enter a selected option on the current menu system.
Back button, to return to the previous menu system.
o The input voltages will have two states, logical low, and logical high.
High Readings: 5.0 V 0.025 V
Low Readings: 0.0 V 0.025 V
The switches and buttons have a pull up resistor for a defined voltage
on an open circuit.
The buttons activate on an active low system, and is normally high.
Analog Stick
o Used to select menu options in the PIC.
o Use to manual control the laser when switched to manual mode.
o The input voltages will have two analog values for the X and Y directions.
High Readings: 5.0 V 0.025 V
Low Readings: 0.0 V 0.025 V
Default Readings: 2.5 V 0.025 V
The switches and buttons have a pull up resistor for a defined voltage
on an open circuit.
The buttons activate on an active low system, and is normally high.
Data Input from Cameras
o PIXY Camera
Communicates with the Raspberry Pi 3 via mini USB 2.0.
Sends color tracking information of specified colored object to the
Raspberry Pi 3.
o NOIR Filtered Pi Camera
Communicates with the Raspberry Pi 3 via proprietary cable.
Sends 8 MP images to the Raspberry Pi 3
Capable of viewing in the Infrared color spectrum
o The data input of the camera will be used to measure the input voltage to
High Readings: 3.3 V 0.016 V
Low Readings: 0.0 V 0.025 V
Console Control via Raspberry Pi 3
o Select modes to run when running the main program
i, initialization mode
Initializes GPIO pins
Typically Ran first
c, calibration mode
14

3.3.2

Used to calibrated on a white board


Used to test the system
a, automatic mode
Switch system to automatic mode
Waits for PIC to switch into automatic mode
Waits for PIC to send target select signal
Waits for PIXY camera to send tracking information
d, demo mode
Allows the user to change the position of the laser
Horizontal input ranges from 1 to 99, where 50 is center
Vertical input ranges from 1 to 99, where 0 is upright
m, manual mode
Switches control to the PIC microcontroller and analog stick

3.3.2 System Outputs
Servo Motors
o Motors are controlled by the PIC microcontroller, which generates the
required pulse waves 10 times for a given degree of motion
o Motors are powered by the 5.0 V ± 0.025 V rail
Horizontal motor ranges from 0 to 130 degrees
Error of ±2 degrees
The horizontal axis is relative to the level plane of the system.
Vertical motor ranges from 0 to 130 degrees
Error of ±2 degrees
The vertical axis is relative to the plane perpendicular to the
system.
o Additional information of the motors is explained in implementation.
Laser
o Laser is capable of turning on and off.
o Controlled by the Raspberry Pi 3.
o 650 nm, 5 mW output; a logic high produces the red laser output
o The output voltages will have two states: logical low and logical high.
High readings: 5.0 V ± 0.025 V
Low readings: 0.0 V ± 0.025 V
LCD Display to display menu and status.
o Used to report the status of the system to the user and to present the
menu options available for operating the system.
o Uses a shift register for the data bits.
o Displays instructions and status on four lines.
o Display is powered by the system power of 5.0 V.
Image Saving
o Image is saved as a JPEG file onto an external USB drive.
o The image is captured when an object is detected for a set duration of
time.


3.4 Software Implementation


To develop this project, C was used as the primary programming language for both the Raspberry
Pi 3 and the PIC. The software can be found in the following folders:

Folder    Purpose
/PIC/     Contains the C code for the PIC
/PI/      Contains the C code for the Raspberry Pi 3

Figure 3.4.0: Software Block Diagram


To develop the home operating system, we used C. With our C code we were able to
build, compile, and program our microcontroller and Raspberry Pi 3. The microcontroller
allowed us to send pulses to our servo motors with timing resolution down to 10 µs, and to handle
user control signals. The PIC microprocessor handled the LCD, the user input, and the communication
between the Pi and the processor. The Raspberry Pi 3 code handled the calibration of the
board with respect to the laser, the manual console inputs, and the initial demo presentation.

3.4.1 Software Implementation for PIC18F25K22:


When programming the 18F25K22 PIC microprocessor, we first had to configure the
ports. The following declarations set the port directions and run the initialization process
shown in Figure 3.4.1.


TRISA = 0b10110011;   // RA0-1, RA4-5, RA7 as inputs
TRISC = 0b00000000;   // Port C all outputs
TRISB = 0b11111111;   // Port B all inputs (data bus and buttons)
ANSELA = 0b00000011;  // RA0-1 analog (joystick X and Y)
ANSELB = 0b00000000;  // Port B digital
ANSELC = 0b00000000;  // Port C digital
_RS = 0;              // LCD register select low
_RW = 0;              // LCD write mode
_OE = 0;              // Shift register outputs enabled (OE is active low)
msleep18();           // Wait for the LCD to power up
lcdINIT();            // Initialize the LCD

Figure 3.4.1: Initialization for PIC


Then, to directly read the inputs and outputs, we mapped the following names to
pins, as shown in Figure 3.4.2.
#define _IN_D0        PORTBbits.RB0
#define _IN_D1        PORTBbits.RB1
#define _IN_D2        PORTBbits.RB2
#define _IN_D3        PORTBbits.RB3
#define _IN_D4        PORTBbits.RB4
#define _IN_D5        PORTBbits.RB5
#define _IN_D6        PORTBbits.RB6
#define _SELECT       PORTBbits.RB7
#define _BACK         PORTAbits.RA4
#define _HORIZONTAL   LATAbits.LATA2
#define _VERTICAL     LATAbits.LATA3
#define _LASERCONTROL LATCbits.LATC0
#define _SEND         PORTAbits.RA5
#define _CONFIRM      LATAbits.LATA6
#define _RS           LATCbits.LATC3
#define _RW           LATCbits.LATC2
#define _E            LATCbits.LATC1
#define _SERIAL       LATCbits.LATC4
#define _OE           LATCbits.LATC5
#define _RCLK         LATCbits.LATC6
#define _SRCLK        LATCbits.LATC7
Figure 3.4.2: Declared Variables for Ports A, B, and C.


For the user-controlled analog joystick, we receive two voltage values digitized to a range
of 0 to 1023. These readings drive our control signals using thresholds: readings below 200
and readings above 800 register as deflections. Figure 3.4.3 shows how we implemented this
user control.
if (openADC(0) > 800 && !trig) {
    clearDisplay();
    cursorState = 0;
    strcpypgm2ram(buf, "Automatic Mode");
    writeStr(buf);
    trig = 1;
} else if (openADC(0) < 200 && !trig) {
    clearDisplay();
    cursorState = 1;
    strcpypgm2ram(buf, "Manual Mode");
    writeStr(buf);
    trig = 1;
} else if (openADC(0) < 600 && openADC(0) > 400) {
    trig = 0;   // Stick returned to center: re-arm the trigger
}
Figure 3.4.3: Joystick Control Function
When manual mode is selected, the joystick gains the ability to control the laser
movement. Figure 3.4.4 shows how we implemented the joystick control function to control
the laser movement.
_CONFIRM = 0;
h_new = openADC(0);     // Horizontal axis reading (0-1023)
v_new = openADC(1);     // Vertical axis reading (0-1023)
if (h != h_new) {
    h = h_new;
    h_degree(h / 10);   // Scale the reading down to the degree range
}
if (v != v_new) {
    v = v_new;
    v_degree(v / 10);
}

Figure 3.4.4: Joystick in Manual Mode


When in automatic mode, we send a signal to the Raspberry Pi indicating that the
motors are now controlled by the Pi. Figure 3.4.5 shows the signal handling.
_LASERCONTROL = 1;          // Hand laser control to the Raspberry Pi
while (_BACK == 1) {        // Until the user presses Back
    if (_SEND == 0) {       // Pi has data ready
        getDegree();
        setDegree();
    }
}


Figure 3.4.5: Signal Being sent for Automatic Mode

3.4.2 Software Implementation for Raspberry Pi:


When working with the Raspberry Pi, we had it send control signals to the PIC, which
generates the pulses that drive our servo motors. To set the control signals for the PIC,
we used the code shown in Figure 3.4.6.
// Horizontal value: drive the 7-bit data bus (LSB on GPIO 21)
gpio_write(21, (h)    & 1);
gpio_write(20, (h>>1) & 1);
gpio_write(16, (h>>2) & 1);
gpio_write(12, (h>>3) & 1);
gpio_write(7,  (h>>4) & 1);
gpio_write(8,  (h>>5) & 1);
gpio_write(25, (h>>6) & 1);
// Hold GPIO 23 low until GPIO 24 stops reading '1', then raise it
while (gpio_read(24) == '1') {
    gpio_write(23, LOW);
}
gpio_write(23, HIGH);
// Vertical value: same bus, same handshake
gpio_write(21, v      & 1);
gpio_write(20, (v>>1) & 1);
gpio_write(16, (v>>2) & 1);
gpio_write(12, (v>>3) & 1);
gpio_write(7,  (v>>4) & 1);
gpio_write(8,  (v>>5) & 1);
gpio_write(25, (v>>6) & 1);
while (gpio_read(24) == '1') {
    gpio_write(23, LOW);
}
gpio_write(23, HIGH);
Figure 3.4.6: Control Signal Being sent to PIC
Then, to map camera coordinates to laser movement, we used equations fitted from the
calibration data explained in the testing section. The mapping is coded in
Figure 3.4.7.
void map(int* h, int* v, int x, int y) {
    double a = 0.2200 * x + 15.671;   // horizontal: pixel x to degrees
    double b = 0.2300 * y - 9.607;    // vertical: pixel y to degrees
    *h = (int) a;
    *v = (int) b;
}
Figure 3.4.7: Mapped Signal for the Servo Motors
Another feature added was a calibration mode. Much like touchscreen calibration, we built
a calibration mode into our board. Shown in Figure 3.4.8 is a small fragment of one of our
13 calibration steps.

printf("..........CALIBRATION..........\n");
printf("POINT 28-00\n");
degree(28, 0);
while (!i) {
    scanf("%d\n", &i);   // Wait for the user to confirm the point
}
i = 0;
printf("POINT 76-24\n");
degree(76, 24);
while (!i) {
    scanf("%d\n", &i);
}
i = 0;
printf("POINT 76-00\n");
degree(76, 0);
while (!i) {
    scanf("%d\n", &i);
}
i = 0;
Figure 3.4.8: Calibration mode for the laser
Then, for our manual pinpoint mode, we simply passed the two entered coordinates into our
movement mapping equations, as shown in Figure 3.4.9.

printf("Enter a Horizontal Position\n");
scanf("%d", &h);
printf("Enter a Vertical Position\n");
scanf("%d", &v);
printf("Setting @ position (%d, %d)\n", h, v);
degree(h, v);
Figure 3.4.9: User-controlled input

3.5 Hardware Implementation


3.5.1 Top-Level Block Diagram

Figure 3.5.1: Hardware Block Diagram


Hardware
Components
Raspberry PI 3
Pic Microprocessor
Pixy video Camera
Night Vision Camera
Laser Light
LCD Display
Servo Motors
20 MHz Clock
Shift Register

Part No.

Description/Purpose

N/A
18f25k22
CMUcam5
NoIR PI cam V3
N/A

Main Microprocessor
2nd sub-Microprocessor
Image processing, target tracking
Capture low light images
Weapon Simulation target
tracking
N/A
Display menu options
SG90 Micro Servo Motor 9G Pan and tilt for tracking system
RC
OSC-20
Input clock for PIc microprocessor
c595N
Reduce pin usage with
serialization
Figure 3.5.2: Table of Parts Used


3.5.2 Raspberry Pi 3
We chose the Raspberry Pi 3 as our main processor because of the popularity of the platform
and its ease of use. We avoided the older Raspberry Pi 1 and Raspberry Pi 2 models because
of the following specifications of the Pi 3:
A 1.2 GHz 64-bit quad-core ARM Cortex-A53 CPU (~10x the performance of the Raspberry
Pi 1)
Integrated 802.11n wireless LAN and Bluetooth 4.1
Complete compatibility with Raspberry Pi 1 and 2
As these specs show, the Raspberry Pi 3 is nearly 10x faster than the Raspberry
Pi 1, which gives us enough headroom to compile and run our code and to produce
higher-resolution images from our cameras. The Pi 3 also includes wireless LAN and Bluetooth.
Given the nature of our project, we want homeowners to receive notifications when an intruder
arrives; the built-in wireless capability gives us a path to such networking features in
future work.
3.5.3 PIC Microprocessor
The PIC microprocessor was not initially planned for use. However, when implementing our
servo motors, we noticed a 10% error in the duty cycle of the generated pulses.
This 10% error causes a serious issue when controlling the motors: they would swerve constantly
back and forth due to the unstable sleep timing the Pi provides. Using the PIC
microprocessor also allowed a seamless transition to using an LCD for user-controlled
actions.
3.5.4 Pixy Camera
The Pixy cam was our main camera of choice for tracking, due to its built-in image processing
and ease of use. For our project, we needed image processing that outputs data we can
interpret and read, and the Pixy cam does this automatically. Along with the ability to set
signatures and adjust the hue and brightness of the camera, it gave us multiple
readings that could be interpreted for our laser and servo motors. Also, since the makers of
the Pixy cam (Charmed Labs) support communication between the Raspberry Pi and the
Pixy cam, this camera was ideal for what we wanted to accomplish.
3.5.5 Night Vision Camera
For the night vision camera, we used the NoIR Pi Cam V3, which connects directly
to the Raspberry Pi. We chose this camera because it solves the low-light brightness
issue and is easy to use. The NoIR Pi cam allowed us to manually enable a night mode
for when our Pixy cam cannot read anything. Also, since the camera itself records at
1080p, we could be assured the picture quality would be high.
3.5.6 Laser
For our laser, we simply used a generic laser that would indicate where it was pointing to.
This part is used solely for demoing purposes.
3.5.7 Servo Motors
For our motors we used the SG90 Micro Servo 9G RC to move our laser.
The main reasons for choosing these motors were their small size and their low cost.
Since cost was a major factor in our project, we wanted to make sure we used effective yet
cheap parts, and these servos fit. With an operating speed of 0.12 s per 60 degrees
(4.8 V, no load), we were able to effectively model our laser movement.
3.5.8 Shift Registers
The C595N shift register was used on our data busses to consolidate pins and
economize pin usage. It requires only four pins, all driven by the PIC: two clocks
(RCLK and SRCLK), one serialized data pin, and an output-enable control pin.
3.5.9 LCD
The LCD we chose was a standard HD44780 2x16 display. It allowed us to write ASCII
characters to display our measurement readings. We connected control pins and data pins from
the sub-processor, along with +5 V power and 0 V ground, and used software on the processor
to encode the characters we wanted to display. With 16 characters on one line, we could
display a description of the reading along with a value of no more than 5 digits plus units.

4. Testing
4.1 Test Plan
The first main part of testing our system was calibrating the movement of the laser. It is
controlled by a C program on the Raspberry Pi, which drives servo motors mounted on a pivot
system that gives the laser pan and tilt. We have to test the accuracy of laser movement and
calibrate both the horizontal and vertical movement ranges. Then we must ensure correct
placement of the laser as specified by our programming.
We also have to test the accuracy of the Pixy camera image. This means testing the
accuracy of color signatures being recognized for tracking purposes, and analyzing the desired
information about the color signature in a controlled space. We want to test the accuracy of
the object being recognized, meaning the dimensions of the object, and secondly the coordinates
of our target object as processed for our test area.
Combining the image processing with laser movement, we want to test accurate movement of
the laser relative to the object: we want to track the object, with the size and location
information being used to follow it.

4.2 Test Specification


4.2.1 Laser movement
1. Select a designated area with a specified 2D box at a set distance from the laser,
using a whiteboard
2. Measure and document the initial point
3. Move one step measure in any direction: left, right, up, or down
4. Measure the distance from the origin
5. Then pick a location on the grid and calculate the number of step measurements
needed to reach it
6. Adjust and calibrate for correct movement
7. Check speed limitations, starting slow and increasing speed without reducing accuracy
4.2.2 Target recognition
1. Select a designated area with a specified 2D box at a set distance from the camera,
using a whiteboard
2. Use a given color-signature object for recognition by the Pixy camera
3. Place that ball in the designated testing area and capture the target using the Pixy
4. Get the dimensions (height and length) of the ball as well as its x and y coordinates
5. Compare the dimensions and location of the ball against the grid of the test area
6. Move the ball a given step in any direction, preferably one full ball diameter, within
the measurement grid
7. Measure the data captured from the Pixy camera against the previous measurement
8. Compare and ensure a uniform change in the direction of choice
9. Check speed limitations, starting slow and increasing speed without reducing accuracy
4.2.3 Laser and Target tracking
1. Select a designated area with a specified 2D box at a set distance from the laser and
camera, using a whiteboard
2. Put a selectable color-coded target in the specified test area
3. Check the laser location as compared to the target
4. Move the target on the grid at a very slow pace to another location and measure the
response of the laser movement
5. Compare and measure the accuracy of the movement
6. Check speed limitations, starting slow and increasing speed without reducing accuracy
4.2.4 User Menu
1. Check LCD output of the menu
2. Use the controller to scroll through menu options, watching the LCD for the correct
menu change
3. Select a menu option by depressing the button; watch for the correct LCD menu change
4. Check that the selected menu option's function is operational

4.2.5 Manual Mode
1. Select manual mode in the user menu
2. Use the joystick to move left, right, up, and down
3. See the laser move left, right, up, and down

4.3 Test Case


4.3.1 Laser Movement
Set a grid box on a whiteboard, 100 cm by 100 cm, at a distance of 132 cm from the
laser
Set the origin point in the center of the grid
Move one degree of movement right, left, up, and down by stepping the servo motor
Then move to the limit of the test grid, furthest from the origin in any direction

4.3.2 Target Recognition
Set a grid box on a whiteboard, 100 cm by 100 cm, at a distance of 132 cm from the camera
Put the target on the bottom-left corner of the grid
Check the value of the measured target by pixel-value coordinates
Move 1 cm in the upward vertical direction; check the increase in the vertical
y-coordinate
Repeat the process 10 times for an accurate response in the positive y direction
Reset to the bottom-left corner of the grid
Move 1 cm in the right horizontal direction; check the increase in the horizontal
x-coordinate
Repeat the process 10 times for an accurate response in the positive x direction

4.3.3 Laser and Target Tracking
Set a grid box on a whiteboard, 100 cm by 100 cm, at a distance of 132 cm from the
camera/laser
Put the target on the bottom-left corner of the grid
Check correct placement of the laser on the center of the target
Move 1 cm in the upward vertical direction; check the increase in the vertical y
direction of the laser
Repeat the process 10 times for an accurate response in the positive y direction
Reset to the bottom-left corner of the grid
Move 1 cm in the right horizontal direction; check the increase in the horizontal x
direction of the laser
Repeat the process 10 times for an accurate response in the positive x direction

4.3.4 User Menu
Check the joystick voltage at center orientation; it should measure 2.5 V on both the
x-axis and y-axis pins
Check the joystick voltage in the right orientation; it should measure 5 V in the right
orientation and 0 V in the left orientation on the x-axis pin
Check the joystick voltage in the fully depressed orientation; it should measure 5 V
in the up orientation and 0 V in the down orientation on the y-axis pin
Check that the right, left, up, and down selections of the joystick produce the desired
menu option in the program
Repeat menu selection 10 times for correct operation of the user interface with
controlled movements

4.3.5 Manual Mode
Check the joystick voltage at center orientation; it should measure 2.5 V on both the
x-axis and y-axis pins
Check the joystick voltage in the right orientation; it should measure 5 V in the right
orientation and 0 V in the left orientation on the x-axis pin
Check the joystick voltage in the fully depressed orientation; it should measure 5 V
in the up orientation and 0 V in the down orientation on the y-axis pin
Check for a linear increase across each axis pin from 0-5 V
Check the mapping to laser movement by going to the extreme in all four directions
(up, down, right, and left) and verifying the corresponding vertical up/down and
horizontal left/right movement of the laser
Repeat 10 times for correct operation and orientation of manual-mode laser control

5. Presentation, Discussion, and Analysis of the Results


In this section we analyze the results of testing our system to ensure a working system. We
tested all major components, including the range of motion for laser movement, target capture
and location analysis, and the communication between target information processing and the
corresponding laser movement. After analyzing the results of all tests, we analyze how our
system behaves when driven into wrong states, how it handles incorrect input, and what
adjustments avoid these issues. We then analyze sources of error in both hardware and software
that contribute to system underperformance, and discuss our design choices, solutions, and
possible future solutions given more time and resources.

5.1 Testing Laser/Motor range of movement


Testing the laser mount's horizontal and vertical movement, we wanted to know how movement
under our control would translate to location at a specified distance. We used a whiteboard,
as described in the test specifications, to track and measure movements, locations, and
spacing. We first specified a range we wanted to work in, then positioned the laser with
programmed movement code at the center position; then we proceeded to make a grid
incrementing by 2 precision degrees of laser movement. We obtained a horizontal x-axis along
with a vertical y-axis, as displayed in the following image.


Figure 5.1.1 Board For demo


The grid led to a map limited by the size of the whiteboard, which fills our camera's viewing
screen, or picture. After measuring, we found the spacing between degrees of precision to be
approximately 3 cm, although due to the curve of movement this becomes slightly less precise
toward the corners of the grid, specifically the upper-left and lower-left corners. This was
expected, as the movement of our laser sweeps in circular degrees, not in straight left-right
or up-down motion. That being the case, in a small area near the center of the laser's travel
we could get mostly precise movement along straight horizontal and vertical lines. As seen in
the image above, the horizontal axis is slightly angled; this is due to our mount and camera
base, where any slight off-horizontal angle makes the system sit slightly off true horizontal
and vertical. This issue would need to be addressed with precise calibration of the laser
using a level; for the purposes of the lab, using the angled grid for our further analysis of
location made the adjustment automatic. Also, as stated before, due to the circular nature of
the laser's movement on both axes, the curve widens the distance between points as the angle
becomes more extreme. After marking the grid, we were able to plot 48 points of precision
horizontally and 30 points of precision vertically, limited only by the whiteboard, as our
system had 100 x 100 points of precision built in. These points can then be calibrated against
location information given by the camera system in order to plot laser movement from the
location information of the specified target.

5.2 Target Recognition Results


In target recognition testing we were able to get live location and size information for
multiple targets for processing. We verified these results using terminal print statements
that displayed the x and y coordinates along with the width and height of the target, in
pixels of the image, specifically for the target. Our Pixy camera is a 600 x 400 pixel camera,
so the range of the coordinates and sizes was limited to those values at most. After receiving
target information, which we tested by holding the target at a set location and observing the
values, we verified linear and correct readings by moving in either the horizontal or vertical
direction and seeing the corresponding value increase or decrease linearly. Seeing this
correct trend, we were able to set a coordinate area in our specified target range to be used
for the motion tracking and target following calibration part of our testing.

5.3 Motion tracking and target Following Calibration


Motion tracking and target following calibration was the most significant part of our testing
procedure, and would show whether the main functionality of our system was viable and
operational. Here we used the target information tested previously and combined it with our
tested grid axes in our specified target location; we then used test locations scattered over
the target area to set up the mapping for our targeting system. We picked 14 locations,
depicted in the previous photo as blue squares, as the calibration points. Each point had a
designated laser degree which was plotted against the matching location information to
calibrate precise degree-to-coordinate matching. We also computed an x and y point adjustment
which took height and width into consideration, and we chose to take the center coordinate of
the target. These new coordinate readings ensure the laser lands on a single point at the
center of the target. The following table lists the horizontal and vertical degree points
matched with the measured x-y coordinates for the given calibration points spread across the
board.

XPoint  HorizontalDegree  YPoint  VerticalDegree
163     50                48      0
165     50                102     12
164     50                166     24
44      28                60      0
50      28                120     12
67      28                170     24
280     76                54      0
276     76                110     12
262     76                169     24
108     39                78      6
111     39                138     18
231     64                71      6
226     64                136     18

Figure 5.3.1 : Data Taken Through Motion Tracking and Calibration


After we collected the measured x-y coordinates with their specified degree points, we put the
measured data into an Excel spreadsheet to capture the relationship between degrees and
points; the resulting graphs are shown below. We also plotted trend lines, which gave us a
linear relationship between the points, one for each axis. This gave us the slope and
y-intercept that we would use in our calculations for every point we wished to access in our
targeting scope. The two final equations used were y = 0.2161x + 15.671 for the horizontal
graph and y = 0.2039x - 10.307 for the vertical graph.

Figure 5.3.2 : Graph For Horizontal Degree vs X Point


Figure 5.3.3 : Graph For Vertical Degree vs Y Point

Speed and Precision

Analyzing the speed of our system over multiple runs, we found a lag between target
acquisition and motor redirection to the specified target. After capturing multiple data
points, we could see a common theme and an average run time from recognition to targeting,
shown in the following table. Testing our grid, we chose opposite corners to measure the
largest latency between recognition and acquisition, and found an average time of 1.056 s.
This would not be ideal for a real-world application, but having noted the delay we were
able to move on and continue to adjust and make our tracking system work correctly.

Trial   Runtime (sec)
1       1.11
2       0.95
3       1.16
4       1.05
5       1.16
6       0.89
7       1.09
8       1.11
9       0.97
10      1.07

Precision was tested by selecting a specified location on our grid, marking it, and then
repeating the move. We took multiple measurements and found a slight difference in location
each time we targeted the specified location. At 132 cm from the laser mount, our laser
diameter was approximately 0.8 cm, and our standard error of movement averaged 1.6 cm when
repeating the same move. Where we saw a dramatic reading, and a need for improvement, was in
moving between multiple locations, where we saw an average error of 3.67 cm. We attribute this
to the servo motors' motion being controlled by the precision timing of a pulse wave: a short
movement has little drive time, while a large movement has a long drive time and therefore a
higher velocity, making the motor prone to larger differences in final location than a short
move. We accounted for this by setting a calibration zone, a box extending 4 cm in each
direction (up, down, left, and right), which absorbs the error difference and keeps us within
range of the point we wanted to reach. Although this is not ideal, for the purposes of this
lab we were able to work with this amount of calibration and maintain a reasonably accurate
location approximation.


5.4 Failure Mode Analysis


This section covers the possible failures that can occur within our system based on the type
of input signals retrieved by both the Raspberry Pi 3 and the PIC subsystems. Within our
implementation, the only likely errors are related to the PIC and the communication from the
PIC to the Raspberry Pi 3. The analysis, and any further future analysis, will be done from
the perspective of the PIC and the Raspberry Pi 3.
5.4.1 PIC Failure Mode Analysis

Table 2. SA0 Inputs of the PIC

Input Signal    SA0 Result

Reset           When the input is stuck at 0, the entire system will not operate, because
                the signal is active low. If the power is also off while the reset is
                stuck, the entire system is powered off, yielding 0 on all output lines.
                If the power is on, the system's outputs will be unstable and
                undeterminable.

Clock           If the clock is stuck at 0, the entire system fails: it can never execute
                an operation. The system depends on a pulsing clock to step through and
                execute the program on the PIC.

X Coordinate    If the X coordinate from the analog stick is stuck at 0, the cursor will
                always move toward the down option while in the menu system. If the system
                is in manual mode at the time of the error, the laser will move to its
                lowest horizontal angle and remain there.

Y Coordinate    If the Y coordinate from the analog stick is stuck at 0 while in the menu
                system, nothing will happen. If the system is in manual mode at the time
                of the error, the laser will move to its lowest vertical angle and remain
                there.

SEND            The SEND signal only has effect in automatic, calibration, and demo modes.
                If the SEND signal from the Raspberry Pi 3 is stuck at 0, the PIC is
                signaled to read the data bits and will keep reading them indefinitely,
                since SEND never returns to 1.

DATA [6:0]      The DATA signals only have effect in automatic, calibration, and demo
                modes. If the data bits from the Raspberry Pi 3 are stuck at 0, the data
                reads as 0. This is an issue if SEND is stuck at 0 as well, since the data
                bits are then read as 0 indefinitely.


Table 3. SA1 Inputs of the PIC

Input Signal    SA1 Result

Reset           When the input is stuck at 1, the system remains operational, because the
                signal is active low. The system behaves as programmed and described in
                the system description.

Clock           If the clock is stuck at 1, the entire system fails: it can never execute
                an operation. The system depends on a pulsing clock to step through and
                execute the program on the PIC.

X Coordinate    If the X coordinate from the analog stick is stuck at 1, the cursor will
                always move toward the up option while in the menu system. If the system
                is in manual mode at the time of the error, the laser will move to its
                highest horizontal angle and remain there.

Y Coordinate    If the Y coordinate from the analog stick is stuck at 1 while in the menu
                system, nothing will happen. If the system is in manual mode at the time
                of the error, the laser will move to its highest vertical angle and remain
                there.

SEND            The SEND signal only has effect in automatic, calibration, and demo modes.
                If the SEND signal from the Raspberry Pi 3 is stuck at 1, the PIC will
                wait indefinitely for SEND to go to 0 before continuing automatic
                operation.

DATA [6:0]      The DATA signals only have effect in automatic, calibration, and demo
                modes. If the data bits from the Raspberry Pi 3 are stuck at 1, the data
                reads as 1. This is an issue if SEND is stuck at 1 as well, since the data
                bits are then read as 1 indefinitely.


5.4.2 Raspberry Pi 3 Failure Mode Analysis


This section focuses on the local controller unit that is responsible for selecting and
targeting slaves to start the communication system. The local controlling unit has the
following input signals: reset, clock, RS-232 RX, and I2C data.
Table 4. SA0 Inputs of the Raspberry Pi 3

Input Signal    SA0 Result

CONFIRM         The CONFIRM signal only has effect in automatic, calibration, and demo
                modes. If CONFIRM is stuck at 0, the Raspberry Pi 3 continues operation,
                assuming the PIC has received all the information sent to it. As a result,
                the PIC will move the motor to wherever the data bits are set by the
                Raspberry Pi 3.

LASER/TARGET    While in manual mode, if the LASER/TARGET signal is stuck at 0, the laser
                remains off. While in automatic mode, the target selection is stuck at
                target 1. In other modes, LASER/TARGET has no effect.
Table 5. SA1 Input of Raspberry Pi 3

Input Signal: CONFIRM
SA1 Result: The CONFIRM signal only has an effect in automatic, calibration, and
demo modes. If CONFIRM is stuck at 1, the Raspberry Pi 3 will wait indefinitely
for the CONFIRM signal to return to 0 before continuing operation; while the line
is stuck at 1, the Raspberry Pi 3 assumes the PIC has not yet received the
command. As a result, when CONFIRM is stuck at 1, the Raspberry Pi 3 will be
stuck waiting.

Input Signal: LASER/TARGET
SA1 Result: In Manual Mode, if the LASER/TARGET signal is stuck at 1, the laser
will remain on. In Automatic Mode, if the LASER/TARGET signal is stuck at 1,
target selection will be stuck at target 2. In other modes, LASER/TARGET has no
effect.
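The stuck-at-1 CONFIRM hang exists because the wait loop has no exit condition. Replacing the unconditional wait with a bounded one turns the hang into a detectable fault. The sketch below is our own illustration, not the actual project code; the pin-read callable and timeout values are assumptions:

```python
import time

def wait_for_low(read_pin, timeout_s=1.0, poll_s=0.01):
    """Poll a handshake line until it reads 0, or give up after timeout_s.

    read_pin is any callable returning the current line level (0 or 1);
    on the real system it would wrap a GPIO read on the Raspberry Pi.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if read_pin() == 0:
            return True          # PIC acknowledged; resume operation
        time.sleep(poll_s)       # avoid busy-spinning the CPU
    return False                 # line looks stuck; caller can flag a fault

# A CONFIRM line stuck at 1 now yields a fault flag instead of a hang:
assert wait_for_low(lambda: 1, timeout_s=0.05) is False
assert wait_for_low(lambda: 0) is True
```

With this guard, the Raspberry Pi 3 can log the fault or retry the transfer rather than waiting forever on a wedged handshake.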

5.5 Analysis of Errors and Possible Errors


5.5.1 Software

5.5.1.1 User Control


One error involved the user controls. When controlling the various options for
the main machine, we sent signals from the microcontroller directly to the Raspberry Pi.
However, because of our limited number of ports, we did not have an output for the choice
between automatic mode and manual mode. Even after reducing our port usage with shift
registers, the bonus features we wanted to implement required many of the processor's ports.
In hindsight, the simplest remedy would have been a different microcontroller with more I/O
ports. However, since the software for our microcontroller was coded for the PIC 18F25K22,
we decided to share ports among features. This then required us to enter our settings
manually through the console during the demo.
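The shift-register approach mentioned above trades one pin per output for three shared pins (data, clock, latch). The simulation below sketches the 74HC595-style shift-out behavior; it is our own illustration, not the actual PIC firmware, which was written in C:

```python
class SimulatedShiftRegister:
    """Stands in for a 74HC595-style serial-in, parallel-out register."""
    def __init__(self, width=8):
        self.width = width
        self.shift = []                  # bits clocked in so far, MSB first
        self.outputs = [0] * width       # latched parallel output pins

    def clock_in(self, bit):
        self.shift.append(bit & 1)
        self.shift = self.shift[-self.width:]   # stage holds only `width` bits

    def latch(self):
        # Copy the shift stage to the output pins, zero-padding on the left
        # if fewer than `width` bits have been clocked in.
        self.outputs = [0] * (self.width - len(self.shift)) + self.shift

def shift_out(reg, value, width=8):
    """Send `value` MSB-first, then latch it onto the parallel outputs."""
    for i in reversed(range(width)):
        reg.clock_in((value >> i) & 1)
    reg.latch()

reg = SimulatedShiftRegister()
shift_out(reg, 0b10110001)
assert reg.outputs == [1, 0, 1, 1, 0, 0, 0, 1]
```

Eight outputs per register, and registers can be daisy-chained, which is why this relieves port pressure at the cost of serialized (slower) updates.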
5.5.1.2 Image Processing
Another error that occurred was when calling Save Image function to directly save images
onto the USB/ Raspberry Pi, the Raspberry Pi would cause the program to freeze and crash. We
attributed this to the high resolution of the camera. Since the camera was producing very high
resolution pictures, the CPU processor utilization would hit very high numbers causing the
program to freeze and then crash. To remedy this solution, we would have two choices, either use
a stronger processor (something other than the Raspberry Pi 3) or reduce the resolution for our
camera.
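One low-cost version of the resolution remedy is to downscale frames before the expensive save step. The 2x2 box-downsample below, operating on a nested-list grayscale frame, is a minimal sketch of the idea; in practice one would instead lower the camera's capture resolution or use a library resize:

```python
def downsample_2x2(frame):
    """Average each 2x2 block of a grayscale frame (list of rows),
    quartering the pixel count before any expensive save step."""
    h, w = len(frame), len(frame[0])
    out = []
    for y in range(0, h - 1, 2):
        row = []
        for x in range(0, w - 1, 2):
            total = (frame[y][x] + frame[y][x + 1] +
                     frame[y + 1][x] + frame[y + 1][x + 1])
            row.append(total // 4)
        out.append(row)
    return out

frame = [[0, 0, 100, 100],
         [0, 0, 100, 100]]
assert downsample_2x2(frame) == [[0, 100]]
```

Each downsample halves both dimensions, so applying it once cuts the data the save path must handle to a quarter, which is the direction the CPU-load fix needs to go.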
5.5.2 Hardware

5.5.2.1 PIC resetting


One error that occurred constantly during debugging and the demo was that, in operation,
the PIC would repeatedly reset. Through research and testing, we determined that the PIC
microcontroller was resetting because of a voltage drop at the reset pin. When enough
current is drawn through the PIC, the voltage on the reset pin drops low enough to trigger
an automatic reset. To remedy this, we would simply add a small capacitor between
the power and the reset switch to ride out sudden drops in voltage.
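To size such a capacitor, a rough rule of thumb is the RC time constant of the reset node: supply dips much shorter than tau = R * C are absorbed before the reset pin falls to its threshold. The resistor and capacitor values below are illustrative assumptions, not measurements from our board:

```python
# Rough reset-pin sizing: tau = R * C is the time scale on which the
# node discharges, so dips much shorter than tau are ridden out.
# A 10 kOhm pull-up and 1 uF capacitor are assumed example values.
def time_constant_ms(r_ohms, c_farads):
    return r_ohms * c_farads * 1000.0

tau_ms = time_constant_ms(10_000, 1e-6)
assert abs(tau_ms - 10.0) < 1e-9    # about 10 ms of ride-through
```

A few milliseconds comfortably covers the brief dips caused by motor or laser switching transients, while still letting a deliberate press of the reset switch work.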
5.5.2.2 Ground
A problem that often occurred when programming the PIC was making sure that all
grounds were properly connected. We would notice that the target voltage from the
PIC microcontroller was often not met, yet our voltage measurements looked accurate.
The issue was that we were not measuring voltage with respect to the ground on the PIC.
In hindsight this was a small error that was easily fixed. Also, since we had two separate
power rails, we had to make sure the rails were never connected to one another.

6. Summary and Conclusion


In this project we went through a full design cycle of a project of our choice, starting from
the initial idea and design specification of a target tracking and response security system for
home consumer use. We used the design specifications to generate initial block diagrams and
ideas for implementing the design according to the desired specifications. In this write-up, we
walk through the software and hardware modules and components needed to execute the design.
We chose camera image sensors as our inputs for security surveillance, then re-used those
cameras to supply image processing and target tracking information. Combining these image
sensors with two microprocessors, we were able to process the information and generate a
response to the tracked target as the second main function of our system. We chose a laser on
a servo-motor mount to model a multidirectional weapon response. We also added several extra
features, such as a manual mode and user menu controlled by a joystick, and image capture
with night vision for low-light conditions.

All of these features needed to be implemented and tested for completeness of our design.
We came up with a plan to test the limits and regular use cases against expected results, which
enabled us to find and troubleshoot errors in our design. In our test plan, taking many physical
measurements of the target space and laser angles was key to our system working correctly.
Calibration was the main focus of testing and troubleshooting, but signal integrity and
communication between processors and sensors were also key to system function. Finally, we
used our test plan to measure error in our system, along with difficulties that could not be
resolved within the scope and timeline of the project. The error analysis will be key to
anticipating and resolving issues quickly in future development, as well as guiding design
changes if we continue to enhance and develop our product. We successfully implemented a
working system that tracked targets and allowed automatic target selection, and we implemented
all the extra ease-of-use features, but there were limitations in accuracy and speed that would
need future work before this is an acceptable consumer security system. These limitations do
not deter us, as analyzing them gave rise to possible solutions and suggests that an end
product is attainable given more time and resources.
In conclusion, we designed a motion tracking and response security device which we named
T.R.I.S.S.H., the acronym standing for Target Response Inspection Security System - Home
edition. The goal was to design it with cost-effective parts so it could scale up to a possible
weaponized system for military and law-enforcement use. The technology was challenging to
implement as a real-time, accurate system, but given more time and resources we feel our
prototype would be capable of real consumer-quality results. Analyzing the possible errors was
informative because it raised our standards for the future testing and product development that
would be needed for a real product. This was a great exercise in the product development
cycle: isolating a problem in the world that could be addressed, improved, or solved, and
finding a cost-effective way of doing it. We feel it helped shape our minds as innovators and
exposed entrepreneurial opportunities we hope to capitalize on in industry. The work ethic
needed to start and complete a project of our own creation proved very demanding, and staying
self-motivated was a challenge. Again, we feel this helped prepare us for life after the
university, employing all the electrical and computer engineering skills we have gained through
our academic careers to implement this project idea successfully. Finally, we are proud to have
worked in a great team that supported each other through difficult tasks and assignments, again
a most valuable skill for life after academia.


7. Authors and Contributions


NAME: Jeffrey Nguyen
CONTRIBUTIONS: Lead C Programmer, Print to LCD, Hardware/Software Debugging,
Lab Report

NAME: Jesus Sandoval
CONTRIBUTIONS: Lead Reporter (Diagrams and Analysis), Hardware/Software
Debugging, Lab Report

NAME: Minhhue H. Khuu
CONTRIBUTIONS: Lead Hardware Implementer, Lead Communication Implementer,
Hardware/Software Debugging, Made Internal C Libraries, Lab Report
