(19) United States
(12) Patent Application Publication — Northrup et al.
(10) Pub. No.: US 2017/0307333 A1
(43) Pub. Date: Oct. 26, 2017

(54) SYSTEM AND METHOD FOR MARKSMANSHIP TRAINING

(71) Applicant: Shot lator, LLC, Dallas, TX (US)

(72) Inventors: James L. Northrup, Dallas, TX (US); Robert P. Northrup, Dallas, TX (US); Peter F. Blakeley

(21) Appl. No.: 15/589,608

(22) Filed: May 8, 2017

Related U.S. Application Data

(63) Continuation-in-part of application No. 14/969,302, filed on Dec. 15, 2015, which is a continuation-in-part of application No. 14/686,398, filed on Apr. 14, 2015, which is a continuation-in-part of application No. 14/149,418, filed on Jan. 7, 2014, now Pat. No. 9,261,332, which is a continuation-in-part of application No. 13/890,997, filed on May 9, 2013, now Pat. No. 9,267,762.

Publication Classification

(51) Int. Cl. F41G 3/26 (2006.01)
(52) U.S. Cl. CPC F41G 3/2633 (2013.01)

(57) ABSTRACT

A system and method for simulating lead of a target includes a network, a simulation administrator and a user device connected to the network, a database connected to the simulation administrator, and a set of position trackers positioned at a simulator site. The user device includes a virtual reality unit and a computer connected to the virtual reality unit and to the network. A generated target is simulated. The target and the user are tracked to generate a phantom target and a phantom halo. The phantom target and the phantom halo are displayed on the virtual reality unit at a lead distance and a drop distance from the target as viewed through the virtual reality unit.

[Drawing sheets 1 through 22 omitted; the figures are enumerated in the Brief Description of the Drawings below.]
SYSTEM AND METHOD FOR MARKSMANSHIP TRAINING

CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application is a continuation-in-part of U.S. patent application Ser. No. 14/969,302 filed Dec. 15, 2015, which is a continuation-in-part of U.S. patent application Ser. No. 14/686,398 filed Apr. 14, 2015, which is a continuation-in-part of U.S. patent application Ser. No. 14/149,418 filed Jan. 7, 2014, granted as U.S. Pat. No. 9,261,332 on Feb. 16, 2016, which is a continuation-in-part of U.S. patent application Ser. No. 13/890,997 filed May 9, 2013, granted as U.S. Pat. No. 9,267,762 on Feb. 23, 2016. Each of the patent applications identified above is incorporated herein by reference in its entirety to provide continuity of disclosure.

FIELD OF THE INVENTION

[0002] The present invention relates to devices for teaching marksmen how to properly lead a moving target with a weapon. More particularly, the invention relates to optical projection systems to monitor and simulate trap, skeet, and sporting clay shooting.

BACKGROUND OF THE INVENTION

[0003] Marksmen typically train and hone their shooting skills by engaging in skeet, trap, or sporting clay shooting at a shooting range. The objective for a marksman is to successfully hit a moving target by tracking it at various distances and angles and anticipating the delay time between the shot and the impact. In order to hit the moving target, the marksman must aim the weapon ahead of and above the moving target by a distance sufficient to allow a projectile fired from the weapon sufficient time to reach the moving target. The process of aiming the weapon ahead of the moving target is known in the art as "leading the target." "Lead" is defined as the distance between the moving target and the aiming point. The correct lead distance is critical to successfully hit the moving target. Further, the correct lead distance is increasingly important as the distance of the marksman to the moving target increases, the speed of the moving target increases, and the direction of movement becomes more oblique.

[0004] Trap shooting range 200 comprises firing lanes 201 and trap house 202. Stations 203, 204, 205, 206, and 207 are positioned along radius 214 from center 218 of trap house 202. Radius 214 is distance 216 from center 218. Distance 216 is 48 feet. Each of stations 203, 204, 205, 206, and 207 is positioned at radius 214 at equal arc lengths. Arc length 213 is 9 feet. Stations 208, 209, 210, 211, and 212 are positioned along radius 215 from center 218. Radius 215 is distance 217 from center 218. Distance 217 is 81 feet. Each of stations 208, 209, 210, 211, and 212 is positioned at radius 215 at equal arc lengths. Arc length 227 is 12 feet. Field 226 has length 221 from center 218 along center line 220 of trap house 202 to point 219. Length 221 is 150 feet. Boundary line 222 extends 150 feet from center 218 at angle 224 from center line 220. Boundary line 223 extends 150 feet from center 218 at angle 225 from center line 220. Angles 224 and 225 are each 22° from center line 220. Trap house 202 launches clay targets at various trajectories within field 226. Marksman 228 positioned at any of stations 203, 204, 205, 206, 207, 208, 209, 210, 211, and 212 attempts to shoot and break the launched clay targets.
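Paragraph [0004] fixes the trap-field geometry numerically. As a minimal sketch of how station coordinates could be derived from those numbers (assuming the stations lie on circular arcs centered on the trap house; the function name and coordinate convention are illustrative, not from the disclosure):

```python
import math

def station_coordinates(radius_ft, arc_length_ft, n_stations=5):
    """Place n_stations on a circle of the given radius, separated by equal
    arc lengths, symmetric about the field's center line. Returns (x, y)
    pairs with the trap house center at the origin, targets flying toward
    +y, and the shooting stations behind the house at negative y."""
    delta = arc_length_ft / radius_ft           # angular spacing (radians)
    start = -delta * (n_stations - 1) / 2       # center the fan on the line
    coords = []
    for i in range(n_stations):
        theta = start + i * delta               # angle from the center line
        coords.append((radius_ft * math.sin(theta),
                       -radius_ft * math.cos(theta)))
    return coords

# 48 ft radius (16 yards) with 9 ft arcs; 81 ft radius (27 yards) with 12 ft arcs
print(station_coordinates(48, 9))
print(station_coordinates(81, 12))
```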
[0005] FIGS. 3A, 3B, 3C, and 3D depict examples of target paths and associated projectile paths illustrating the wide range of lead distances and directions required of the marksman. The term "projectile," as used in this application, means any projectile fired from a weapon, but more typically a shotgun round comprised of pellets of various sizes. For example, FIG. 3A shows a left to right trajectory 303 of target 301 and a left to right intercept trajectory 304 for projectile 302. In this example, the intercept path is oblique, requiring the lead to be a greater distance along the positive X-axis. FIG. 3B shows a left to right trajectory 307 of target 305 and intercept trajectory 308 for projectile 306. In this example, the intercept path is acute, requiring the lead to be a lesser distance in the positive X direction. FIG. 3C shows a right to left trajectory 311 of target 309 and intercepting trajectory 312 for projectile 310. In this example, the intercept path is oblique and requires a greater lead in the negative X direction. FIG. 3D shows a proximal to distal and right to left trajectory 315 of target 313 and intercept trajectory 316 for projectile 314. In this example, the intercept path is acute and requires a lesser lead in the negative X direction.

[0006] FIGS. 4A and 4B depict a range of paths of a clay target and an associated intercept projectile. The most typical projectile used in skeet and trap shooting is a shotgun round, such as a 12-gauge round or a 20-gauge round. Once fired, the pellets of the round spread out into a shot string having a generally circular cross-section. The cross-section increases as the flight time of the pellets increases. Referring to FIG. 4A, clay target 401 moves along path 402. Shot string 403 intercepts clay target 401. Path 402 is an ideal path, in that no variables are considered that may alter path 402 of clay target 401 once clay target 401 is launched.

[0007] Referring to FIG. 4B, path range 404 depicts a range of potential flight paths for a clay target after being released on a shooting range. The flight path of the clay target is affected by several variables. Variables include mass, wind, drag, lift force, altitude, humidity, and temperature, resulting in a range of probable flight paths, path range 404. Path range 404 has upper limit 405 and lower limit 406. Path range 404 from launch angle θ is extrapolated using:

    x = x₀ + v₀ₓt + ½aₓt² + Cₓ
    y = y₀ + v₀ᵧt + ½aᵧt² + Cᵧ

where x is the clay position along the x-axis, x₀ is the initial position of the clay target along the x-axis, v₀ₓ is the initial velocity along the x-axis, aₓ is the acceleration along the x-axis, t is time, and Cₓ is the drag and lift variable along the x-axis; y is the clay position along the y-axis, y₀ is the initial position of the clay target along the y-axis, v₀ᵧ is the initial velocity along the y-axis, aᵧ is the acceleration along the y-axis, t is time, and Cᵧ is the drag and lift variable along the y-axis. Upper limit 405 is a maximum distance along the x-axis with Cₓ at a maximum and a maximum along the y-axis with Cᵧ at a maximum. Lower limit 406 is a minimum distance along the x-axis with Cₓ at a minimum and a minimum along the y-axis with Cᵧ at a minimum. Drag and lift are given by:

    F_D = ½ρv²AC_D
    F_L = ½ρv²AC_L

where F_D is the drag force, ρ is the density of the air, v is the velocity, A is the cross-sectional area, and C_D is the drag coefficient; and where F_L is the lift force, ρ is the density of the air, v is the velocity, A is the planform area, and C_L is the lift coefficient.
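The extrapolation in paragraph [0007] can be sketched numerically. The following is a minimal illustration, assuming the drag-and-lift terms Cₓ and Cᵧ are lumped constants evaluated at their extremes; the function names and parameter values are hypothetical, not from the disclosure:

```python
import math

RHO = 1.225  # air density, kg/m^3 (sea level)

def clay_position(t, x0, y0, v0, theta, cx=0.0, cy=0.0, g=-9.81):
    """Kinematic extrapolation per [0007]: position at time t for launch
    speed v0 at angle theta (radians), with Cx/Cy lumping drag and lift."""
    v0x = v0 * math.cos(theta)
    v0y = v0 * math.sin(theta)
    x = x0 + v0x * t + cx                    # ax assumed 0 along x
    y = y0 + v0y * t + 0.5 * g * t**2 + cy   # ay = gravity along y
    return x, y

def drag_force(v, area, cd):
    """F_D = 1/2 * rho * v^2 * A * C_D (A = cross-sectional area)."""
    return 0.5 * RHO * v**2 * area * cd

def lift_force(v, area, cl):
    """F_L = 1/2 * rho * v^2 * A * C_L (A = planform area)."""
    return 0.5 * RHO * v**2 * area * cl

# Path range: same flight time with Cx, Cy at their extremes gives the
# upper limit 405 and lower limit 406 of the band of probable paths.
t = 1.5
upper = clay_position(t, 0.0, 1.0, 20.0, math.radians(20), cx=+0.8, cy=+0.5)
lower = clay_position(t, 0.0, 1.0, 20.0, math.radians(20), cx=-0.8, cy=-0.5)
print(upper, lower)
```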
[0008] Referring to FIG. 5, an example of lead from the perspective of the marksman is described. Marksman 501 aims weapon 502 at clay target 503 moving along path 504 left to right. In order to hit clay target 503, marksman 501 must anticipate the time delay for a projectile fired from weapon 502 to intercept clay target 503 by aiming weapon 502 ahead of clay target 503 at aim point 505. Aim point 505 is lead distance 506 ahead of clay target 503 along path 504. Marksman 501 must anticipate and adjust aim point 505 according to a best guess at the anticipated path of the target.

[0009] Clay target 503 has an initial trajectory angle, positional coordinates x₁, y₁, and a velocity v₁. Aim point 505 has coordinates x₂, y₂. Lead distance 506 has x-component 507 and y-component 508. X-component 507 and y-component 508 are calculated by:

    Δx = x₂ − x₁
    Δy = y₂ − y₁

where Δx is x-component 507 and Δy is y-component 508. As vᵧ increases, Δy must increase. As vₓ increases, Δx must increase. As t increases, Δy must increase.

[0010] The prior art has attempted to address the problem of teaching proper lead distance with limited success. For example, U.S. Pat. No. 3,748,751 to Breglia, et al. discloses a laser, automatic fire weapon simulator. The simulator includes a display screen and a projector for projecting a motion picture on the display screen. A housing attaches to the barrel of the weapon. A camera with a narrow band-pass filter positioned to view the display screen detects and records the laser light and the target shown on the display screen. However, the simulator requires the marksman to aim at an invisible object, thereby making the learning process of leading a target difficult and time-consuming.

[0011] U.S. Pat. No. 3,940,204 to Yokoi discloses a clay shooting simulation system. The system includes a screen, a first projector providing a visible mark on the screen, a second projector providing an infrared mark on the screen, a mirror adapted to reflect the visible mark and the infrared mark to the screen, and a mechanical apparatus for moving the mirror in three dimensions to move the two marks on the screen such that the infrared mark leads the visible mark to simulate a lead-sighting point in actual clay shooting. A light receiver receives the reflected infrared light. However, the system in Yokoi requires a complex mechanical device to project and move the target on the screen, which leads to frequent failure and increased maintenance.

[0012] U.S. Pat. No. 3,945,133 to Mohon, et al. discloses a weapons training simulator utilizing polarized light. The simulator includes a screen and a projector projecting a two-layer film. The two-layer film is formed of a normal film and a polarized film. The normal film shows a background scene with a target with non-polarized light. The polarized film shows a leading target with polarized light. The polarized film is layered on top of the normal non-polarized film. A polarized light sensor is mounted on the barrel of a gun. However, the weapons training simulator requires two cameras and two types of film to produce the two-layered film, making the simulator expensive and time-consuming to build and operate.
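The lead components in paragraph [0009] follow from an intercept calculation. Below is one hedged sketch that solves for the intercept time of a constant-velocity target and a constant-speed projectile; the closed-form quadratic and the function names are assumptions for illustration, not the disclosed method:

```python
import math

def lead_components(target_xy, target_v_xy, shot_speed, shooter_xy=(0.0, 0.0)):
    """Solve for the intercept of a constant-velocity target by a projectile
    of speed shot_speed fired from shooter_xy; return the lead vector
    (dx, dy) = aim point (x2, y2) - target position (x1, y1), plus time t."""
    x1 = target_xy[0] - shooter_xy[0]
    y1 = target_xy[1] - shooter_xy[1]
    vx, vy = target_v_xy
    # Intercept condition |target(t)| = shot_speed * t is quadratic in t.
    a = vx**2 + vy**2 - shot_speed**2
    b = 2 * (x1 * vx + y1 * vy)
    c = x1**2 + y1**2
    disc = b**2 - 4 * a * c
    if disc < 0:
        return None                           # target cannot be intercepted
    t = (-b - math.sqrt(disc)) / (2 * a)      # prefer the earlier root
    if t < 0:
        t = (-b + math.sqrt(disc)) / (2 * a)
    dx, dy = vx * t, vy * t                   # lead grows with v and t
    return dx, dy, t

print(lead_components((30.0, 0.0), (-10.0, 3.0), 400.0))
```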
[0013] U.S. Pat. No. 5,194,006 to Zaenglein, Jr. discloses a shooting simulator. The simulator includes a screen, a projector for displaying a moving target image on the screen, and a weapon connected to the projector. When a marksman pulls the trigger a beam of infrared light is emitted from the weapon. A delay is introduced between the time the trigger is pulled and the beam is emitted. An infrared light sensor detects the beam of infrared light. However, the training device in Zaenglein, Jr. requires the marksman to aim at an invisible object, thereby making the learning process of leading a target difficult and time-consuming.

[0014] U.S. Patent Publication No. 2010/0201620 to Sargent discloses a firearm training system for moving targets. The system includes a firearm, two cameras mounted on the firearm, a processor, and a display. The two cameras capture a set of stereo images of the moving target along the moving target's path when the trigger is pulled. However, the system requires the marksman to aim at an invisible object, thereby making the learning process of leading a target difficult and time-consuming. Further, the system requires two cameras mounted on the firearm, making the firearm heavy and difficult to manipulate, leading to inaccurate aiming and firing by the marksman when firing live ammunition without the mounted cameras.

[0015] The prior art fails to disclose or suggest a system and method for simulating a lead for a moving target using generated images of targets projected at the same scale as viewed in the field and a phantom target positioned ahead of the targets having a variable contrast. The prior art further fails to disclose or suggest a system and method for simulating lead in a virtual reality system. Therefore, there is a need in the art for a shooting simulator that recreates moving targets at the same visual scale as seen in the field with a phantom target to teach proper lead of a moving target in a virtual reality platform.

SUMMARY

[0016] A system and method for simulating lead of a target includes a network, a simulation administrator connected to the network, a database connected to the simulation administrator, and a user device connected to the network. The user device includes a virtual reality unit and a computer connected to the virtual reality unit and to the network. A set of position trackers is connected to the computer.

[0017] In a preferred embodiment, a target is simulated. In one embodiment, a simulated weapon is provided. In another embodiment, a set of sensors is attached to a real weapon. In another embodiment, a set of gloves having a set of sensors is worn by a user. The system generates a simulated target and displays the simulated target upon launch of the generated target. The computer tracks the position of the generated target and the position of the virtual reality unit and the weapon to generate a phantom target and a phantom halo. The generated phantom target and the generated phantom halo are displayed on the virtual reality unit at a lead distance and a drop distance from the live target as viewed through the virtual reality unit. The computer determines a hit or a miss of the generated target using the weapon, the phantom target, and the phantom halo. In one embodiment, the disclosed system and method is implemented in a two-dimensional video game.

[0018] The present disclosure provides a system which embodies significantly more than an abstract idea including technical advancements in the field of data processing and a transformation of data which is directly related to real world objects and situations. The disclosed embodiments create and transform imagery in hardware, for example, a weapon peripheral and a sensor attachment to a real weapon.

BRIEF DESCRIPTION OF THE DRAWINGS

[0019] The disclosed embodiments will be described with reference to the accompanying drawings.

[0020] FIG. 1 is a plan view of a skeet shooting range.

[0021] FIG. 2 is a plan view of a trap shooting range.

[0022] FIG. 3A is a target path and an associated projectile path.
[0023] FIG. 3B is a target path and an associated projectile path.

[0024] FIG. 3C is a target path and an associated projectile path.

[0025] FIG. 3D is a target path and an associated projectile path.

[0026] FIG. 4A is an ideal path of a moving target.

[0027] FIG. 4B is a range of probable flight paths of a target.

[0028] FIG. 5 is a perspective view of a marksman aiming at a moving target.

[0029] FIG. 6 is a schematic of a simulator system of a preferred embodiment.

[0030] FIG. 7 is a schematic of a simulation administrator of a preferred embodiment.

[0031] FIG. 8 is a schematic of a user device of a simulator system of a preferred embodiment.

[0032] FIG. 9A is a side view of a user device of a virtual reality simulator system of a preferred embodiment.

[0033] FIG. 9B is a front view of a user device of a virtual reality simulator system of a preferred embodiment.

[0034] FIG. 10A is a side view of a simulated weapon for a virtual reality system of a preferred embodiment.

[0035] FIG. 10B is a side view of a real weapon with a set of sensors attached for a virtual reality system of a preferred embodiment.

[0036] FIG. 10C is a detail view of a trigger sensor of a preferred embodiment.

[0037] FIG. 10D is a detail view of a set of muzzle sensors of a preferred embodiment.

[0038] FIG. 10E is a detail view of a transmitter base of a preferred embodiment.

[0039] FIG. 10F is a detail view of a set of muzzle sensors used with the transmitter base of FIG. 10E of a preferred embodiment.

[0040] FIG. 10G is a detail view of a removable plug with light emitting diodes for a weapon of a preferred embodiment.

[0041] FIG. 10H is a detail view of a removable plug with light emitting diodes attached to a weapon of a preferred embodiment.

[0042] FIG. 10I is a detail view of a removable collar with light emitting diodes attached to a weapon of a preferred embodiment.

[0043] FIG. 10J is a side view of a weapon with an adjustable stock for a virtual reality simulator system of a preferred embodiment.

[0044] FIG. 10K is a detail view of a trigger sensor of a preferred embodiment.

[0045] FIG. 11A is a simulation view of a weapon having an iron sight of a preferred embodiment.

[0046] FIG. 11B is a simulation view of a weapon having a rifle sight of a preferred embodiment.

[0047] FIG. 11C is a simulation view of a weapon having a holographic sight of a preferred embodiment.

[0048] FIG. 12 is a schematic view of a virtual reality simulation environment of a preferred embodiment.

[0049] FIG. 13 is a command input menu for a virtual reality simulator system of a preferred embodiment.

[0050] FIG. 14 is a flow chart of a method for a runtime process of a virtual reality simulation system of a preferred embodiment.

[0051] FIG. 15A is a top view of a user and a simulation environment of a preferred embodiment.

[0052] FIG. 15B is a flow chart of a method for determining a view for a user device with respect to a position and an orientation of the user device and the weapon.

[0053] FIG. 15C is a flow chart of a method for mapping the position and orientation of the user device and the weapon to the simulation environment for determining a display field of view of a preferred embodiment.

[0054] FIG. 16A is a flowchart of a method for determining a phantom and halo of a preferred embodiment.

[0055] FIG. 16B is a plan view of a target and a phantom of a preferred embodiment.

[0056] FIG. 16C is an isometric view of a target and a phantom of a preferred embodiment.

[0057] FIG. 17 is a user point of view of a virtual reality simulation system of a preferred embodiment.

[0058] FIG. 18 is an isometric view of an input device configured to be mounted on a rail system of a weapon of a preferred embodiment.
[0059] FIG. 19 is a simulation view that shows beams being projected from a barrel of a weapon of a preferred embodiment.

[0060] FIG. 20A is a five stand field of a preferred embodiment.

[0061] FIG. 20B is a sporting clay field of a preferred embodiment.

[0062] FIG. 21A is a diagram of a preferred embodiment.

[0063] FIG. 21B is a diagram of a virtual reality system of a preferred embodiment.

[0064] FIG. 21C is a diagram of an augmented reality system of a preferred embodiment.

[0065] FIG. 22A is a diagram of a system using a positioning detector at an end of a barrel in a preferred embodiment.

[0066] FIG. 22B is a diagram of a system using a positioning detector mounted under a barrel in a preferred embodiment.

[0067] FIG. 22C is a diagram of a system using sight markings in a preferred embodiment.

[0068] FIG. 22D is a diagram of a system using sight markings and a sensor thimble in a preferred embodiment.

[0069] FIG. 22E is a diagram of a positioning detector in a preferred embodiment.

[0070] FIGS. 23A and 23B are diagrams of a trigger unit in a preferred embodiment.

[0071] FIG. 23C is a diagram of a processor board of a trigger unit in a preferred embodiment.

[0072] FIGS. 24A and 24B are diagrams of a mounting arbor in a preferred embodiment.

[0073] FIGS. 24C and 24D are diagrams of a barrel clamp in a preferred embodiment.

[0074] FIGS. 25A through 25D are diagrams of electronic cartridges in preferred embodiments.

[0075] FIGS. 25E and 25F are diagrams of a sensor arbor in a preferred embodiment.

[0076] FIG. 25G is a diagram of a sensor thimble in a preferred embodiment.

[0077] FIG. 26 is a diagram of a computer implemented method for determining launcher location of a preferred embodiment.

[0078] FIG. 27 is a diagram of graphs of pellet spread of a preferred embodiment.

[0079] FIG. 28A is a diagram of a computer implemented method for simulating digital clay targets of a preferred embodiment.

[0080] FIG. 28B is a diagram of an original image captured by an augmented reality system in a preferred embodiment.

[0081] FIG. 28C is a diagram of a spatial map and anchors in an augmented reality system in a preferred embodiment.

[0082] FIG. 28D is a diagram of a virtual reality simulation in a preferred embodiment.

[0083] FIG. 29A is a diagram of initializing a computer implemented simulation of shooting digital clay targets.

[0084] FIG. 29B is a diagram for calculating a lead distance.

[0085] FIG. 29C is a diagram of an image from the system.

[0086] FIG. 29D is a diagram of a spatial map from the system.

[0087] FIG. 30 is a diagram of control movements in a preferred embodiment.

[0088] FIG. 31 is a flowchart of a method for processing control signals in a preferred embodiment.

DETAILED DESCRIPTION

[0089] It will be appreciated by those skilled in the art that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or context including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Therefore, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or combining software and hardware implementation that may all generally be referred to herein as a "circuit," "module," "component," or "system." Further, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
[0090] Any combination of one or more computer readable media may be utilized. The computer readable media may be a computer readable signal medium or a computer readable storage medium. For example, a computer readable storage medium may be, but is not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the computer readable storage medium would include, but are not limited to: a portable computer diskette, a hard disk, a random access memory ("RAM"), a read-only memory ("ROM"), an erasable programmable read-only memory ("EPROM" or Flash memory), an appropriate optical fiber with a repeater, a portable compact disc read-only memory ("CD-ROM"), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. Thus, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.

[0091] A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. The propagated data signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, or any suitable combination thereof.

[0092] Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages.

[0093] Aspects of the present disclosure are described with reference to flowchart illustrations and/or block diagrams of methods, systems and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable instruction execution apparatus, create a mechanism for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0094] These computer program instructions may also be stored in a computer readable medium that when executed can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions when stored in the computer readable medium produce an article of manufacture including instructions which, when executed, cause a computer to implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable instruction execution apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide a process for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

[0095] Referring to FIG. 6, system 600 includes network 601, simulation administrator 602 connected to network 601, and user device 604 connected to network 601. Simulation administrator 602 is further connected to simulation database 603 for storage of relevant data. For example, data includes a set of target data, a set of weapon data, and a set of environment data.

[0096] In one embodiment, network 601 is a local area network. In another embodiment, network 601 is a wide area network, such as the internet. In other embodiments, network 601 includes a combination of wide area networks and local area networks, including cellular networks.

[0097] In a preferred embodiment, user device 604 communicates with simulation administrator 602 and simulation database 603 to generate and project a simulation that includes a target, a phantom, and a phantom halo adjacent to the target, as will be further described below.

[0098] In another embodiment, simulation administrator 602 generates a simulation that includes a target, a phantom, a phantom halo adjacent to the target, and a weapon image, as will be further described below, and sends the simulation to the user device for projection.

[0099] FIG. 1 depicts the general dimensions of a skeet shooting range. Skeet shooting range 100 is a skeet field that includes eight shooter positions with two launcher locations. Cameras 180 and 181 are located in positions to view houses 101 and 102 and launchers 103 and 109. Skeet shooting range 100 has high house 101 and low house 102 separated by distance 111. Distance 111 is about 120 feet. Launcher 103 is adjacent high house 101. Launcher 109 is adjacent low house 102. Station 110 is equidistant from high house 101 and low house 102 at distance 112. Distance 112 is about 40 feet. Station 106 is equidistant from high house 101 and low house 102 and generally perpendicular to distance 111 at distance 113. Distance 113 is 45 feet. Station 106 is distance 114 from launcher 103. Distance 114 is about 73 feet. Stations 104 and 105 are positioned along arc 121 between launcher 103 and station 106 at equal arc lengths. Each of arc lengths 122, 123, and 124 is about 27 feet. Stations 107 and 108 are positioned along arc 121 between station 106 and launcher 109 at equal arc lengths. Each of arc lengths 125, 126, and 127 is 26 feet, 8⅜ inches.
[0100] Target flight path 115 extends from high house 101 to marker 117. Marker 117 is positioned about 130 feet from high house 101 along target flight path 115. Target flight path 116 extends from low house 102 to marker 118. Marker 118 is about 130 feet from low house 102 along target flight path 116. Target flight paths 115 and 116 intersect at target crossing point 119. Target crossing point 119 is positioned distance 120 from station 110 and is 15 feet above the ground. Distance 120 is 18 feet. Clay targets are launched from high house 101 and low house 102 along target flight paths 115 and 116, respectively. Marksman 128 positioned at any of stations 104, 105, 106, 107, 108, and 110 and launchers 103 and 109 attempts to shoot and break the launched clay targets.

[0101] FIG. 2 depicts the general dimensions of a trap shooting range. Trap shooting range 200 is a trap field that includes five shooter locations with one launcher location. Cameras 250 and 251 are located in positions to view trap house 202. Once all of the coordinates are set and the field dimensions are known, one good video at a normal lens setting at 60 frames per second (fps) of one trajectory can be used to recreate a trajectory and phantom position from any point of view (POV).

[0102] Referring to FIG. 7, simulation administrator 701 includes processor 702, network interface 703 connected to processor 702, and memory 704 connected to processor 702. Simulation application 705 is stored in memory 704 and executed by processor 702. Simulation application 705 includes position application 706, statistics engine 707, and target and phantom generator 708.

[0103] In a preferred embodiment, simulation administrator 701 is a PowerEdge C6100 server and includes a PowerEdge C410x PCIe Expansion Chassis available from Dell Inc. Other suitable servers, server arrangements, and computing devices known in the art may be employed.

[0104] In one embodiment, position application 706 communicates with a position tracker connected to the user device to detect the position of the user device for simulation application 705. Statistics engine 707 communicates with a database to retrieve relevant data and generate renderings according to desired simulation criteria, such as desired weapons, environments, and target types for simulation application 705. Target and phantom generator 708 calculates and generates a target along a target path, a phantom target, and a phantom halo for the desired target along a phantom path for simulation application 705, as will be further described below.

[0105] Referring to FIG. 8, user device 800 includes computer 801 connected to headset 802. Computer 801 is further connected to replaceable battery 803, microphone 804, speaker 805, and position tracker 806.

[0106] Computer 801 includes processor 807, memory 809 connected to processor 807, and network interface 808 connected to processor 807. Simulation application 810 is stored in memory 809 and executed by processor 807. Simulation application 810 includes position application 811, statistics engine 812, and target and phantom generator 813. In a preferred embodiment, position application 811 communicates with position tracker 806 to detect the position of headset 802 for simulation application 810. Statistics engine 812 communicates with a database to retrieve relevant data and generate renderings according to desired simulation criteria, such as desired weapons, environments, and target types for simulation application 810. Target and phantom generator 813 calculates and generates a target along a target path, a phantom target, and a phantom halo for the desired target along a phantom path for simulation application 810, as will be further described below.
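As an illustration of what a target and phantom generator such as 708 or 813 might compute, the following sketch places a phantom ahead of the target by the distance the target travels during the projectile's flight time and sizes a halo that grows with flight time like a shot string; the data layout, drop term, and spread rate are assumptions for illustration only, not the disclosed algorithm:

```python
from dataclasses import dataclass

@dataclass
class Phantom:
    x: float          # displayed phantom position
    y: float
    z: float
    halo_radius: float  # scales with shot-string spread

def phantom_for(target_pos, target_vel, flight_time, drop=0.0, spread_rate=0.3):
    """Place the phantom ahead of the target by the distance the target will
    travel during the projectile's flight time, lowered by a drop distance,
    with a halo radius that grows with flight time."""
    x = target_pos[0] + target_vel[0] * flight_time
    y = target_pos[1] + target_vel[1] * flight_time
    z = target_pos[2] + target_vel[2] * flight_time - drop
    return Phantom(x, y, z, halo_radius=spread_rate * flight_time)

print(phantom_for((30.0, 5.0, 4.0), (-12.0, 0.0, 1.5), 0.08))
```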
[0107] Input device 814 is connected to computer 801. Input device 814 includes processor 815, memory 816 connected to processor 815, communication interface 817 connected to processor 815, a set of sensors 818 connected to processor 815, and a set of controls 819 connected to processor 815.

[0108] In one embodiment, input device 814 is a simulated weapon, such as a shotgun, a rifle, or a handgun. In another embodiment, input device 814 is a set of sensors connected to a disabled real weapon, such as a shotgun, a rifle, or a handgun, to detect movement and actions of the real weapon. In another embodiment, input device 814 is a glove having a set of sensors worn by a user to detect positions and movements of a hand of a user.

[0109] Headset 802 includes processor 820, battery 821 connected to processor 820, memory 822 connected to processor 820, communication interface 823 connected to processor 820, display unit 824 connected to processor 820, and a set of sensors 825 connected to processor 820.

[0110] Referring to FIGS. 9A and 9B, a preferred implementation of user device 800 is described as user device 900. User 901 wears virtual reality unit 902 having straps 903 and 904. Virtual reality unit 902 is connected to computer 906 via connection 905. Computer 906 is preferably a portable computing device, such as a laptop or tablet computer, worn by user 901. In other embodiments, computer 906 is a desktop computer or a server, not worn by the user. Any suitable computing device known in the art may be employed. Connection 905 provides a data and power connection from computer 906 to virtual reality unit 902.

[0111] Virtual reality unit 902 includes skirt 907 attached to straps 903 and 904 and display portion 908 attached to skirt 907. Skirt 907 covers eyes 921 and 916 of user 901. Display portion 908 includes processor 911, display unit 910 connected to processor 911, a set of sensors 912 connected to processor 911, communication interface 913 connected to processor 911, and memory 914 connected to processor 911. Lens 909 is positioned adjacent to display unit 910 and eye 921 of user 901. Lens 915 is positioned adjacent to display unit 910 and eye 916 of user 901. Virtual reality unit 902 provides a stereoscopic three-dimensional view of images to user 901.

[0112] User 901 wears communication device 917. Communication device 917 includes earpiece speaker 918 and microphone 919. Communication device 917 is preferably connected to computer 906 via a wireless connection such as a Bluetooth connection. In other embodiments, other wireless or wired connections are employed. Communication device 917 enables voice activation and voice control of a simulation application stored in computer 906 by user 901.

[0113] In one embodiment, virtual reality unit 902 is the Oculus Rift headset available from Oculus VR, LLC. In another embodiment, virtual reality unit 902 is the HTC Vive headset available from HTC Corporation. In this embodiment, a set of laser position sensors 920 is attached to an external surface of virtual reality unit 902 to provide position data of virtual reality unit 902. Any suitable virtual reality unit known in the art may be employed.

[0114] In certain embodiments, set of sensors 912 includes sensors related to eye tracking. When the sensors related to eye tracking are based on infrared optical tracking, the set of sensors 912 includes one or more infrared light sources and one or more infrared cameras. Light from the infrared light sources is reflected from one or more surfaces of the user eye and is received by the infrared cameras. The reflected light is reduced to a digital signal which is representative of the positions of the user eye. These signals are transmitted to the computer. Computer 906 and processor 911 then determine the positioning and direction of the eyes of the user and record eye tracking data. With the eye tracking data, computer 906 determines whether the user is focusing on the simulated target or on the phantom target; how quickly the user focuses on the simulated target or phantom target; how long it takes for the user to aim the weapon after focusing on the simulated target or phantom target; how long the user focuses on the simulated target or phantom target before pulling the trigger; how long it takes the user to see and focus on the next target; whether the user's eyes were shut or closed before, during, or after the pull of the trigger; and so on. Computer 906 also determines eye training statistics based on the eye training data and the eye tracking data collected over multiple shots and rounds of the simulation. Feedback is given to the user that includes and is based on the eye tracking data, the eye training data, and the eye training statistics.
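Paragraph [0114] reduces gaze samples to a handful of timing metrics. The sketch below shows one plausible reduction; the sample format, the focus threshold, and the metric names are all assumed for illustration:

```python
def eye_metrics(samples, target_radius=0.05):
    """samples: list of (t_seconds, gaze_xy, target_xy, trigger_pulled).
    Returns time-to-first-focus and focus duration before the trigger pull,
    treating 'focus' as gaze falling within target_radius of the target."""
    def on_target(gaze, tgt):
        return ((gaze[0] - tgt[0])**2 + (gaze[1] - tgt[1])**2) ** 0.5 <= target_radius

    first_focus = None
    trigger_time = None
    for t, gaze, tgt, pulled in samples:
        if first_focus is None and on_target(gaze, tgt):
            first_focus = t
        if pulled and trigger_time is None:
            trigger_time = t
    focus_before_shot = (trigger_time - first_focus
                         if first_focus is not None and trigger_time is not None
                         else None)
    return {"time_to_focus": first_focus, "focus_before_shot": focus_before_shot}

demo = [(0.0, (0.30, 0.20), (0.0, 0.0), False),
        (0.4, (0.02, 0.01), (0.0, 0.0), False),
        (0.9, (0.01, 0.00), (0.0, 0.0), True)]
print(eye_metrics(demo))  # focused at t=0.4, fired at t=0.9
```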
[0115] In certain embodiments, the laser position sensors 920 are light emitting diodes (LEDs) that act as markers that can be seen or sensed by one or more cameras or sensors. Data from the cameras or sensors is processed to derive the location and orientation of virtual reality unit 902 based on the LEDs. Each LED emits light using particular transmission characteristics, such as phase, frequency, amplitude, and duty cycle. The differences in the phase, frequency, amplitude, and duty cycle of the light emitted by the LEDs allow a sensor to identify each LED by the LED's transmission characteristics. In certain embodiments, the LEDs on virtual reality unit 902 are spaced with placement characteristics so that there is a unique distance between any two LEDs, which gives the appearance of a slightly randomized placement on virtual reality unit 902. The transmission characteristics along with placement characteristics of the LEDs on virtual reality unit 902 allow the simulation system to determine the location and orientation of virtual reality unit 902 by sensing as few as three LEDs with a camera or other sensor.
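One way to read the LED-identification scheme just described is as blink-code matching; the sketch below is an illustrative decoder only, with the sampling model and the code table invented for the example:

```python
# Hypothetical decoder: each LED blinks with a distinct period/duty cycle.
# A camera samples each tracked blob as on/off over N frames; we match the
# observed pattern against the known per-LED codes.

CODES = {                      # LED id -> (frames per period, duty cycle)
    "led_a": (4, 0.25),
    "led_b": (4, 0.50),
    "led_c": (8, 0.25),
}

def expected_pattern(period, duty, n_frames):
    on_frames = max(1, round(period * duty))
    return [1 if (i % period) < on_frames else 0 for i in range(n_frames)]

def identify(observed):
    """Return the LED id whose blink code best matches the observed samples."""
    def score(led):
        period, duty = CODES[led]
        exp = expected_pattern(period, duty, len(observed))
        return sum(o == e for o, e in zip(observed, exp))
    return max(CODES, key=score)

print(identify([1, 1, 0, 0, 1, 1, 0, 0]))  # matches led_b (period 4, 50% duty)
```

Once three or more LEDs are identified this way, their unique pairwise spacings let a standard pose solver recover the headset's position and orientation.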
[0116] In a preferred embodiment, a simulation environment that includes a target is generated by computer 906. Computer 906 further generates a phantom target and a phantom halo in front of the generated target based on a generated target flight path. The simulation environment including the generated target, the phantom target, and the phantom halo is transmitted from computer 906 to virtual reality unit 902 for viewing adjacent eyes 916 and 921 of user 901, as will be further described below. The user aims a weapon at the phantom target to attempt to shoot the generated target.

[0117] Referring to FIG. 10A, in one embodiment, simulated weapon 1001 includes trigger 1002 connected to a set of sensors 1003, which is connected to processor 1004. Communication interface 1005 is connected to processor 1004 and to computer 1009. Battery 1026 is connected to processor 1004. Simulated weapon 1001 further includes a set of controls 1006 attached to an external surface of simulated weapon 1001 and connected to processor 1004. Set of controls 1006 includes directional pad 1007 and selection button 1008. Motor 1024 is connected to processor 1004 to provide haptic feedback.

[0118] In a preferred embodiment, simulated weapon 1001 is a shotgun. It will be appreciated by those skilled in the art that other weapon types may be employed.

[0119] In one embodiment, simulated weapon 1001 is a Delta Six first person shooter controller available from Avenger Advantage, LLC. In another embodiment, simulated weapon 1001 is an airsoft weapon or air gun replica of a real weapon. In another embodiment, simulated weapon 1001 is a firearm simulator that is an inert detailed replica of an actual weapon, such as "blueguns" from Ring's Manufacturing. Other suitable simulated weapons known in the art may be employed.

[0120] In a preferred embodiment, set of sensors 1003 includes a position sensor for trigger 1002 and a set of motion sensors to detect an orientation of simulated weapon 1001.

[0121] In a preferred embodiment, the position sensor is a Hall Effect sensor. In this embodiment, a magnet is attached to trigger 1002. Other types of Hall Effect sensors or any other suitable sensor type known in the art may be employed.

[0122] In a preferred embodiment, the set of motion sensors is a 9-axis motion tracking system-in-package sensor, model no. MPU-9150, available from InvenSense, Inc. In this embodiment, the 9-axis sensor combines a 3-axis gyroscope, a 3-axis accelerometer, an on-board digital motion processor, and a 3-axis digital compass. In other embodiments, other suitable sensors and/or suitable combinations of sensors may be employed.

[0123] Referring to FIGS. 10B, 10C, and 10D, in another embodiment, weapon 1010 includes simulation attachment 1011 removably attached to its stock. Simulation attachment 1011 includes on-off switch 1012 and pair button 1013 to communicate with computer 1009 via a Bluetooth connection. Any suitable wireless connection may be employed. Trigger sensor 1014 is removably attached to trigger 1022 and is in communication with simulation attachment 1011. A set of muzzle sensors 1015 is attached to removable plug 1016, which is removably inserted into barrel 1023 of weapon 1010. Set of muzzle sensors 1015 includes processor 1017, battery 1018 connected to processor 1017, gyroscope 1019 connected to processor 1017, accelerometer 1020 connected to processor 1017, and compass 1021 connected to processor 1017.

[0124] In one embodiment, set of muzzle sensors 1015 and removable plug 1016 are positioned partially protruding outside of barrel 1023 of weapon 1010.

[0125] In one embodiment, weapon 1010 includes rail 1025 attached to its stock in any position. In this embodiment, set of muzzle sensors 1015 is mounted to rail 1025.

[0126] In one embodiment, weapon 1010 fires blanks to provide live recoil to a user.

[0127] It will be appreciated by those skilled in the art that any weapon may be employed as weapon 1010, including any rifle or handgun. It will be further appreciated by those skilled in the art that rail 1025 is optionally mounted to any type of weapon. Set of muzzle sensors 1015 may be mounted in any position on weapon 1010. Any type of mounting means known in the art may be employed.
[0128] Referring to FIG. 10E, base 1028 comprises a sensor system that includes a magnetic field detector used to determine the location and orientation of a weapon, such as weapon 1010 with removable plug 1016 shown in FIG. 10F. Base 1028 includes processor 1032, which is connected to communication interface 1034, power source 1036, memory 1038, first coil 1040, second coil 1042, and third coil 1044. First coil 1040, second coil 1042, and third coil 1044 form the magnetic field detector of the sensor system of base 1028.

[0129] Processor 1032 of base 1028 receives positioning signals via first coil 1040, second coil 1042, and third coil 1044 that are used to determine the position and orientation of a weapon used in the simulation system. In a preferred embodiment, each of the positioning signals received via first coil 1040, second coil 1042, and third coil 1044 can be differentiated from one another by one or more of each positioning signal's phase, frequency, amplitude, and duty cycle so that each positioning signal transmitted by each coil is distinct. The differences in the positioning signals allow base 1028 to determine the position of a transmitting device, such as removable plug 1016 of FIG. 10F, based on the positioning signals that indicate the relative position between base 1028 and the transmitting device.

[0130] Referring to FIG. 10F, removable plug 1016 is inserted into a barrel of weapon 1010 and transmits positioning signals used to determine the location and orientation of removable plug 1016 and the weapon removable plug 1016 is connected to. Removable plug 1016 includes processor 1017, which is connected to battery 1018, communication interface 1046, first coil 1048, second coil 1050, and third coil 1052. First coil 1048, second coil 1050, and third coil 1052 form magnetic field transmitters of a sensor system of removable plug 1016. The magnetic fields generated and transmitted by first coil 1048, second coil 1050, and third coil 1052 are positioning signals used to determine the location and orientation of removable plug 1016, for example, by base 1028 of FIG. 10E.

[0131] Processor 1017 transmits positioning signals from first coil 1048, second coil 1050, and third coil 1052 that are received by processor 1032 of base 1028. From the transmitted positioning signals, the relative location and orientation between removable plug 1016 and base 1028 is determined so that the precise location of removable plug 1016 with respect to base 1028 is derived. The determinations and derivations may be performed by one or more of processor 1032 of base 1028, processor 1017 of removable plug 1016, and a processor of another computer of the simulation system, such as computer 1009. Once the position of removable plug 1016 is known, the position and orientation of weapon 1010 is determined based on the location and orientation of removable plug 1016, the geometry of removable plug 1016, the geometry of weapon 1010, and the placement of removable plug 1016 on weapon 1010. With the position and orientation of weapon 1010, the simulation application can display a simulated version of weapon 1010, calculate the proper position of a phantom target, and provide suggested adjustments to improve a user's marksmanship.

[0132] In an alternative embodiment, the sensor system of base 1028 includes the magnetic field transmitter and the sensor system of removable plug 1016 includes the magnetic field detector. In alternative embodiments, removable plug 1016 includes threading that corresponds to threading within the barrel of the weapon that is commonly used for a shotgun choke, and removable plug 1016 is fitted and secured to the barrel of the weapon via the threading.
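The three-coil arrangement described in paragraphs [0128] through [0131] resembles classic magnetic six-degree-of-freedom trackers. Below is a deliberately simplified, range-only sketch using the dipole amplitude falloff of roughly 1/r³; the calibration constant and signal model are assumptions for illustration, not the disclosed method:

```python
import numpy as np

def coil_amplitudes_to_range(amps, k=1.0):
    """Estimate transmitter-to-receiver distance from the combined amplitude
    of the three orthogonal coil signals. For a magnetic dipole, field
    strength falls off as 1/r^3, so r ~ (k / |B|)^(1/3). k is a per-system
    calibration constant (coil gain, drive current, etc.)."""
    total = np.linalg.norm(amps)     # combine the three coil readings
    return (k / total) ** (1.0 / 3.0)

# Synthetic check: place a transmitter at r = 0.5 m and invert the model.
r_true = 0.5
b = np.array([0.7, 0.2, 0.1])
b = b / np.linalg.norm(b) * (1.0 / r_true**3)   # |B| = k / r^3 with k = 1
print(coil_amplitudes_to_range(b))               # ~0.5
```

A full tracker would additionally use the per-coil signal ratios (separated by each coil's distinct phase or frequency, as in paragraph [0129]) to recover direction and orientation, not just range.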
[0133] Referring to FIG. 10G, removable collar 1054 fits onto barrel 1056 of a weapon, such as weapon 1010 of FIG. 10B. Removable collar 1054 includes tip 1058 and three members 1060, 1062, and 1064. Members 1060, 1062, and 1064 extend from a first side of tip 1058 that touches barrel 1056 when removable collar 1054 is fitted to barrel 1056. Removable collar 1054 includes light emitting diodes (LEDs), such as LEDs 1066 on member 1060, LEDs 1068 on member 1062, LEDs on member 1064, and LEDs 1070 on tip 1058. Removable collar 1054 includes additional LEDs that are occluded in FIG. 10G, such as on member 1064 and on tip 1058. The LEDs on removable collar 1054 may emit infrared light to be invisible to a user or may emit light in the visible spectrum. Removable collar 1054 acts as a marker from which the location and orientation of the weapon can be derived.

[0134] The LEDs on removable collar 1054 each emit light using particular transmission characteristics, such as phase, frequency, amplitude, and duty cycle. The differences in the phase, frequency, amplitude, and duty cycle of the light emitted by the LEDs allow a sensor to identify each LED on removable collar 1054 by the LED's transmission characteristics. The LEDs on removable collar 1054 are spaced with placement characteristics so that there is a unique distance between any two LEDs, which gives the appearance of a slightly randomized placement on removable collar 1054. The transmission characteristics along with placement characteristics of the LEDs on removable collar 1054 allow the simulation system to determine the location and orientation of the removable collar by sensing as few as three LEDs with a camera or other sensor. Once the location and orientation of removable collar 1054 is determined, the location and orientation of the weapon to which removable collar 1054 is attached is derived based on the known geometries of removable collar 1054 and the weapon, which are stored in a database.

[0135] Referring to FIG. 10H, removable collar 1054 is fitted onto barrel 1056 of a weapon. Inner portions of members 1060-1064 are rubberized and may contain an adhesive to prevent movement of removable collar 1054 with respect to the weapon it is attached to. After removable collar 1054 is installed for the first time to a weapon, the simulation system is calibrated to associate the location and orientation, including a roll angle, of removable collar 1054 to the location and orientation of the weapon.

[0136] In alternative embodiments, the portion of removable collar 1054 that fits against the barrel of the weapon is shaped to fit with only one orientation with respect to the weapon. The removable collar 1054 may include additional members that fit around the iron sight of the weapon so that there is only one possible fitment of removable collar 1054 to the weapon and the process of calibration can be reduced or eliminated.

[0137] Referring to FIG. 10I, removable collar 1054 is fitted to weapon 1010. Weapon 1010 is an over-under shotgun with over barrel 1056, under barrel 1057, and top rail 1059. Removable collar 1054 comprises a hollow portion 1055 that allows for the discharge of live or blank rounds of ammunition during the simulation. A front surface of removable collar 1054 is flush with the front surfaces of barrels 1056 and 1057 so that the position of removable collar 1054 with respect to each of barrels 1056 and 1057 is known and the trajectory of shots from weapon 1010 can be properly simulated. Removable collar 1054 includes hollow portion 1055, member 1061, mounting screws 1063, battery 1018, processor 1017, and LEDs 1067. Removable collar 1054 is customized to the particular shape of weapon 1010, which may include additional iron sights. Removable collar 1054 does not interfere with the sights of weapon 1010 so that weapon 1010 can be aimed normally while removable collar 1054 is fitted to weapon 1010.

[0138] Member 1061 is a flat elongated member that allows for removable collar 1054 to be precisely and tightly fitted to the end of barrel 1057 of weapon 1010 after removable collar 1054 is slid onto the end of barrel 1057. Member 1061 with mounting screws 1063 operates similar to a C-clamp, with mounting screws 1063 pressing into member 1061 and thereby securing removable collar 1054 to the end of barrel 1057 with sufficient force so that the position and orientation of removable collar 1054 with respect to weapon 1010 is not altered by the firing of live rounds or blank rounds of ammunition with weapon 1010.

[0139] Battery 1018 is connected to and powers the electrical components within removable collar 1054, including processor 1017 and LEDs 1067. Processor 1017 controls LEDs 1067. In additional embodiments, removable collar 1054 includes one or more accelerometers, gyroscopes, compasses, and communication interfaces connected to processor 1017. The sensor data from the accelerometers, gyroscopes, and compasses is sent from removable collar 1054 to computer 1009 via the communication interface. Removable collar 1054 includes button 1069 to turn on, turn off, and initiate the pairing of removable collar 1054.

[0140] LEDs 1067 emit light that is sensed by one or more cameras or sensors, from which the locations and orientations of removable collar 1054 and weapon 1010 can be determined. The locations and orientations are determined from the transmission characteristics of the light emitted from LEDs 1067 and the placement characteristics of LEDs 1067.

[0141] Weapon 1010, to which removable collar 1054 is fitted, is loaded with one or more live or blank rounds of ammunition that discharge through the hollow portion 1055 of removable collar 1054 when a trigger of weapon 1010 is pulled so that blank rounds or live rounds of ammunition can be used in conjunction with the simulation. Using blank rounds or live rounds with the simulation allows for a more accurate and realistic simulation of the shooting experience, including the experience of re-aiming weapon 1010 for a second shot after feeling the kickback from the discharge of a blank or live round from a first shot.

[0142] In alternative embodiments, the weapon is a multiple shot weapon, such as an automatic rifle, a semi-automatic shotgun, or a revolver. With a multiple shot weapon, the simulation experience includes the feeling of the transition between shots, such as the cycling of the receiver of a semi-automatic shotgun. When the weapon comprises an automatic or semi-automatic receiver, the simulation displays the ejection of a spent shell casing that may not correspond to the actual path or trajectory of the actual spent shell casing. Additional embodiments track the location of the spent shell casing as it is ejected and match the location and trajectory of the simulated shell casing to the location and trajectory of the spent shell casing. Additional embodiments also include one or more additional sensors, electronics, and power supplies embedded within the housing of removable collar 1054.

[0143] Referring to FIG. 10J, weapon 1072 is adapted for use in a simulation by the fitment of removable collar 1054 to the barrel of weapon 1072. Weapon 1072 is a try gun that includes a stock 1074 with adjustable components to fit users of different heights and statures. Each component may include electronic sensors that measure the length, angle, or position of the component so that weapon 1072 can be properly displayed in a simulation.

[0144] Stock 1074 of weapon 1072 includes comb 1076 with comb angle adjuster 1078 and comb height adjuster 1080. Comb 1076 rests against a cheek of a user to improve stability of weapon 1072 during use. The height of comb 1076 is adjustable via manipulation of comb height adjuster 1080. The angle of comb 1076 is adjustable via manipulation of comb angle adjuster 1078.

[0145] Stock 1074 of weapon 1072 also includes butt plate 1082 with butt plate angle adjuster 1084 and trigger length adjuster 1086. Trigger length 1088 is the length from trigger 1090 to butt plate 1082. Butt plate 1082 rests against a shoulder of a user to improve stability of weapon 1072 during use. Trigger length 1088 from butt plate 1082 to trigger 1090 is adjustable via manipulation of trigger length adjuster 1086. The angle of butt plate 1082 is adjustable via manipulation of butt plate angle adjuster 1084.

[0146] When weapon 1072 is used in a virtual reality simulation system with removable collar 1054, suggested adjustments to comb 1076 and butt plate 1082 are optionally provided. If shots are consistently to the right or left of an ideal shot placement for a right-handed shooter, it may be suggested to increase or decrease trigger length 1088, respectively. If shots are consistently above or below the ideal shot placement, it may be suggested to decrease or increase the height of comb 1076, respectively.
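A tiny rule-based sketch of the fit suggestions in the preceding paragraph follows; the thresholds and the miss-vector convention are assumptions for illustration only:

```python
def stock_suggestions(miss_vectors, threshold=0.02, right_handed=True):
    """miss_vectors: list of (dx, dy) offsets of shot centers from the ideal
    placement, dx > 0 meaning right, dy > 0 meaning high. Applies the rules
    from the disclosure: right/left misses -> increase/decrease trigger
    length; high/low misses -> decrease/increase comb height."""
    n = len(miss_vectors)
    mean_dx = sum(v[0] for v in miss_vectors) / n
    mean_dy = sum(v[1] for v in miss_vectors) / n
    tips = []
    if right_handed:
        if mean_dx > threshold:
            tips.append("increase trigger length")
        elif mean_dx < -threshold:
            tips.append("decrease trigger length")
    if mean_dy > threshold:
        tips.append("decrease comb height")
    elif mean_dy < -threshold:
        tips.append("increase comb height")
    return tips

# Shots consistently right and low -> longer trigger length, higher comb.
print(stock_suggestions([(0.05, -0.04), (0.04, -0.06), (0.06, -0.05)]))
```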
[0147] Referring to FIG. 10K, an alternative embodiment of trigger sensor 1014 is shown. Weapon 1010 includes trigger 1022 and trigger guard 1027. Trigger sensor 1014 is specially shaped and contoured to fit securely to the front of trigger guard 1027. Once trigger sensor 1014 is slid onto trigger guard 1027, screws 104 are tightened to further secure trigger sensor 1014 to trigger guard 1027 and weapon 1010.

[0148] Pull ring 1029 is connected to string 1030, which winds upon spindle 1031. Spindle 1031 includes spring 1033, which keeps tension on string 1030 and biases pull ring 1029 to be pulled away from trigger 1022 and towards trigger guard 1027 and trigger sensor 1014. In the resting state, there is no slack in string 1030 and pull ring 1029 rests against trigger sensor 1014.

[0149] Sensor 1035 provides data indicative of the rotation and/or position of spindle 1031. In one preferred embodiment, sensor 1035 is a potentiometer that is connected to and turns with spindle 1031, where a voltage of the potentiometer indicates the position of spindle 1031 and a change in voltage indicates a rotation of spindle 1031. In another preferred embodiment, sensor 1035 includes one or more photo emitters and photo detectors that surround an optical encoder wheel that is attached to spindle 1031, where light from the photo emitters passes through the encoder wheel to activate certain photo detectors to indicate the position of spindle 1031.

[0150] Controller 1037 receives data from sensor 1035 to determine the state of trigger sensor 1014 and communicates the state of trigger sensor 1014 by controlling the output of LED 1039 to create a coded signal that corresponds to the state of trigger sensor 1014. In a preferred embodiment, the states of trigger sensor 1014 include: pull ring not engaged; pull ring engaged but trigger not pulled; and pull ring engaged and trigger is pulled. Controller 1037, LED 1039, and sensor 1035 are powered by battery 1083.

[0151] The state of trigger sensor 1014 is communicated by controlling the output of LED 1039 with controller 1037. The output of LED 1039 forms a coded signal to indicate the state of trigger sensor 1014 and can also be used to aid in the determination of the position and orientation of weapon 1010 when the position of trigger sensor 1014 with respect to weapon 1010 and the geometry of weapon 1010 are known. The output of LED 1039 is cycled on and off to flash with a particular phase, frequency, amplitude, and duty cycle that form a set of output characteristics. Different output characteristics are used to indicate different states of trigger sensor 1014. A first set of output characteristics or first code is used to indicate the pull ring not engaged state, a second set of output characteristics or second code is used to indicate the pull ring engaged but trigger not pulled state, and a third set of output characteristics or third code is used to indicate the pull ring engaged and trigger is pulled state. In one embodiment, the pull ring not engaged state is indicated by a set of output characteristics where the duty cycle is 0% and/or the amplitude is 0 so that LED 1039 does not turn on. An external sensor or camera, such as one of position trackers 1208, 1206, and 1215, can be used to determine the state of trigger sensor 1014 by detecting the output from LED 1039 and decoding the output characteristics to determine which state trigger sensor 1014 is in.
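As a sketch of the three-state LED coding described above (the specific duty cycles here are arbitrary choices for illustration, not values from the disclosure; only the 0% duty cycle for the not-engaged state is taken from the text):

```python
# Hypothetical state codes: map each trigger-sensor state to LED output
# characteristics (duty cycle at a fixed flash frequency).
STATE_CODES = {
    "not_engaged": 0.00,   # LED off, per the 0% duty cycle embodiment
    "engaged":     0.25,
    "pulled":      0.75,
}

def encode(state, n_frames=8):
    """Render a state as the on/off frame pattern a camera would observe."""
    duty = STATE_CODES[state]
    on = round(duty * n_frames)
    return [1] * on + [0] * (n_frames - on)

def decode(frames):
    """Recover the state whose duty cycle is closest to the observed one."""
    duty = sum(frames) / len(frames)
    return min(STATE_CODES, key=lambda s: abs(STATE_CODES[s] - duty))

assert decode(encode("pulled")) == "pulled"
print(decode([1, 1, 0, 0, 0, 0, 0, 0]))  # -> engaged
```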
Additional embodiments also include one or more additional sensors, electronics, and power supplies embedded within the housing of removable collar 1054.

[0143] Referring to FIG. 10J, weapon 1072 is adapted for use in the simulation by the fitment of removable collar 1054 to the barrel of weapon 1072. Weapon 1072 is a try gun that includes a stock 1074 with adjustable components to fit users of different heights and statures. Each component may include electronic sensors that measure the length, angle, or position of the component so that weapon 1072 can be properly displayed in a simulation.

[0144] Stock 1074 of weapon 1072 includes comb 1076 with comb angle adjuster 1078 and comb height adjuster 1080. Comb 1076 rests against a cheek of a user to improve stability of weapon 1072 during use. The height of comb 1076 is adjustable via manipulation of comb height adjuster 1080. The angle of comb 1076 is adjustable via manipulation of comb angle adjuster 1078.

[0145] Stock 1074 of weapon 1072 also includes butt plate 1082 with butt plate angle adjuster 1084 and trigger length adjuster 1086. Trigger length 1088 is the length from trigger 1090 to butt plate 1082. Butt plate 1082 rests against a shoulder of a user to improve stability of weapon 1072 during use. Trigger length 1088 from butt plate 1082 to trigger 1090 is adjustable via manipulation of trigger length adjuster 1086. The angle of butt plate 1082 is adjustable via manipulation of butt plate angle adjuster 1084.

[0146] When weapon 1072 is used in a virtual reality simulation system with removable collar 1054, suggested adjustments to comb 1076 and butt plate 1082 are optionally provided. If shots are consistently to the right or left of an ideal shot placement for a right handed shooter, it may be suggested to increase or decrease trigger length 1088, respectively. If shots are consistently above or below the ideal shot placement, it may be suggested to decrease or increase the height of comb 1076, respectively.

[0147] Referring to FIG. 10K, an alternative embodiment of trigger sensor 1014 is shown. Weapon 1010 includes trigger 1022 and trigger guard 1027. Trigger sensor 1014 is specially shaped and contoured to fit securely to the front of trigger guard 1027. Once trigger sensor 1014 is slid onto trigger guard 1027, screws 104 are tightened to further secure trigger sensor 1014 to trigger guard 1027 and weapon 1010.

[0148] Pull ring 1029 is connected to string 1030, which winds upon spindle 1031. Spindle 1031 includes spring 1033, which keeps tension on string 1030 and biases pull ring 1029 to be pulled away from trigger 1022 and towards trigger guard 1027 and trigger sensor 1014. In the resting state, there is no slack in string 1030 and pull ring 1029 rests against trigger sensor 1014.

[0149] Sensor 1035 provides data indicative of the rotation and/or position of spindle 1031. In one preferred embodiment, sensor 1035 is a potentiometer that is connected to and turns with spindle 1031, where a voltage of the potentiometer indicates the position of spindle 1031 and a change in voltage indicates a rotation of spindle 1031. In another preferred embodiment, sensor 1035 includes one or more photo emitters and photo detectors that surround an optical encoder wheel that is attached to spindle 1031, where light from the photo emitters passes through the encoder wheel to activate certain photo detectors to indicate the position of spindle 1031.
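As an illustration only (not part of the patent disclosure), the potentiometer reading of paragraph [0149] can be mapped to the three trigger sensor states described in the following paragraph (pull ring not engaged; engaged but not pulled; engaged and pulled). The threshold voltages below are assumptions for demonstration.

    # Illustrative sketch: map a potentiometer voltage (proportional to
    # spindle rotation) to the three trigger-sensor states. Threshold
    # values are assumptions, not taken from the patent.
    ENGAGE_VOLTS = 0.2     # rotation at which the pull ring leaves its rest
    PULL_VOLTS = 1.5       # rotation corresponding to a full trigger pull

    def trigger_state(adc_volts: float) -> str:
        """Classify spindle position into one of the three sensor states."""
        if adc_volts < ENGAGE_VOLTS:
            return "pull ring not engaged"
        if adc_volts < PULL_VOLTS:
            return "pull ring engaged, trigger not pulled"
        return "pull ring engaged, trigger pulled"

    # Example: a reading partway through the pull.
    print(trigger_state(0.8))   # -> "pull ring engaged, trigger not pulled"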
[0150] Controller 1037 receives data from sensor 1035 to determine the state of trigger sensor 1014 and communicates the state of trigger sensor 1014 by controlling the output of LED 1039 to create a coded signal that corresponds to the state of trigger sensor 1014. In a preferred embodiment, the states of trigger sensor 1014 include: pull ring not engaged; pull ring engaged but trigger not pulled; and pull ring engaged and trigger is pulled. Controller 1037, LED 1039, and sensor 1035 are powered by battery 1083.

[0151] The state of trigger sensor 1014 is communicated by controlling the output of LED 1039 with controller 1037. The output of LED 1039 forms a coded signal to indicate the state of trigger sensor 1014 and can also be used to aid in the determination of the position and orientation of weapon 1010 when the position of trigger sensor 1014 with respect to weapon 1010 and the geometry of weapon 1010 are known. The output of LED 1039 is cycled on and off to flash with a particular phase, frequency, amplitude, and duty cycle that form a set of output characteristics. Different output characteristics are used to indicate different states of trigger sensor 1014. A first set of output characteristics or first code is used to indicate the pull ring not engaged state, a second set of output characteristics or second code is used to indicate the pull ring engaged but trigger not pulled state, and a third set of output characteristics or third code is used to indicate the pull ring engaged and trigger is pulled state. In one embodiment, the pull ring not engaged state is indicated by a set of output characteristics where the duty cycle is 0% and/or the amplitude is 0 so that LED 1039 does not turn on. An external sensor or camera, such as one of position trackers 1205, 1206, and 1215, can be used to determine the state of trigger sensor 1014 by detecting the output from LED 1039 and decoding the output characteristics to determine which state trigger sensor 1014 is in.

[0152] In an alternative embodiment, pull ring 1029 and string 1030 each include conductive material, trigger sensor 1014 includes a pull-up resistor connected to an input of controller 1037, and controller 1037 is electrically grounded to trigger guard 1027. When trigger 1022 and trigger guard 1027 are electrically connected and conductive pull ring 1029 is touched to trigger 1022, the pull-up resistor is grounded to change the state of the input of controller 1037 so that controller 1037 can determine whether pull ring 1029 is touching trigger 1022. Assuming that the user only touches pull ring 1029 to trigger 1022 when attempting to pull trigger 1022, the determination of whether pull ring 1029 is touching trigger 1022 can be used to indicate that the trigger has been pulled, which is communicated by changing the output coding of LED 1039.
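As an illustration only (not part of the patent disclosure), the coded signal of paragraph [0151] can be decoded from a camera's frame-by-frame view of LED 1039. The specific duty-cycle codes below are assumptions; the patent only requires that each state use distinguishable output characteristics.

    # Illustrative sketch: decode a trigger-sensor state from the measured
    # duty cycle of an LED sampled once per camera frame. The code table
    # (0%, 25%, 75%) is an assumption for demonstration.
    CODES = {0.0: "pull ring not engaged",          # LED off (0% duty cycle)
             0.25: "pull ring engaged, trigger not pulled",
             0.75: "pull ring engaged, trigger pulled"}

    def decode_state(samples: list[int]) -> str:
        """samples: 1 where the LED was lit in a camera frame, else 0."""
        duty = sum(samples) / len(samples)
        # Pick the code whose duty cycle is closest to the measured one.
        nearest = min(CODES, key=lambda c: abs(c - duty))
        return CODES[nearest]

    frames = [1, 0, 0, 0, 1, 0, 0, 0]               # roughly 25% duty cycle
    print(decode_state(frames))                      # -> engaged, not pulled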
[0153] Referring to FIGS. 11A, 11B, and 11C, different types and styles of sights may be used on weapons used with the simulation. Additionally, the simulation may display a sight on a weapon that is different from the sight actually on the weapon to allow different types of sights to be tested. In alternative embodiments, the halo around the phantom target can be adjusted to match or include the sight profile of the sight being used on the weapon.

[0154] In FIG. 11A, weapon 1102 includes iron sight 1104. Iron sight 1104 comprises two components, one proximate to the tip of the barrel of weapon 1102 and one distal to the tip of weapon 1102, that when aligned indicate the orientation of weapon 1102 to a user of weapon 1102.

[0155] In FIG. 11B, weapon 1102 includes reflex sight 1106, also referred to as a red-dot sight, which may be in addition to an iron sight on weapon 1102. Reflex sight 1106 is mounted on the barrel of weapon 1102 and includes sight profile 1108, shown as a dot. Sight profile 1108 may take any size, shape, color, or geometry and may include additional dots, lines, curves, and shapes of one or more colors. A user can only see the sight profile 1108 when the head of the user is properly positioned with respect to reflex sight 1106.

[0156] In FIG. 11C, weapon 1102 includes holographic sight 1110, which may be in addition to an iron sight. Holographic sight 1110 is mounted to the receiver of weapon 1102 and includes sight profile 1112, shown as a combination circle with dashes. Sight profile 1112 may take any size, shape, color, or geometry and may include additional dots, lines, curves, and shapes of one or more colors. A user can only see the sight profile 1112 when the head of the user is properly positioned with respect to holographic sight 1110.

[0157] Referring to FIG. 12, in simulation environment 1200, user 1201 wears user device 1202 connected to computer 1204 and holds weapon 1203. Each of position trackers 1205, 1206, and 1215 is connected to computer 1204. Position tracker 1205 has field of view 1207. Position tracker 1206 has field of view 1208. Position tracker 1215 has field of view 1216. User 1201 is positioned in fields of view 1207, 1208, and 1216.

[0158] In one embodiment, weapon 1203 is a simulated weapon. In another embodiment, weapon 1203 is a real weapon with a simulation attachment. In another embodiment, weapon 1203 is a real weapon and user 1201 wears a set of tracking gloves 1210. In other embodiments, user 1201 wears the set of tracking gloves 1210 and uses the simulated weapon or the real weapon with the simulation attachment.

[0159] In a preferred embodiment, each of position trackers 1205, 1206, and 1215 is a near infrared CMOS sensor having a refresh rate of 60 Hz. Other suitable position trackers known in the art may be employed. For example, position trackers 1205, 1206, and 1215 can be embodiments of base 1028 of FIG. 10E.

[0160] In a preferred embodiment, position trackers 1205, 1206, and 1215 capture the vertical and horizontal positions of user device 1202, weapon 1203, and/or set of gloves 1210. For example, position tracker 1205 captures the positions and movement of user device 1202, weapon 1203, and/or set of gloves 1210 in the y-z plane of coordinate system 1209, and position tracker 1206 captures the positions and movement of user device 1202, weapon 1203, and/or set of gloves 1210 in the x-z plane of coordinate system 1209. Further, a horizontal angle and an inclination angle of the weapon are tracked by analyzing image data from position trackers 1205, 1206, and 1215. Since the horizontal angle and the inclination angle are sufficient to describe the aim point of the weapon, the aim point of the weapon is tracked.
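As an illustration only (not part of the patent disclosure), the two planar observations of paragraph [0160] can be fused into a 3D position, and the horizontal and inclination angles of the barrel axis follow from two fused points on the weapon. Function names and sample values are assumptions.

    # Illustrative sketch: tracker 1205 observes the y-z plane, tracker 1206
    # the x-z plane; both report the shared z coordinate, which is averaged.
    import math

    def fuse(yz: tuple, xz: tuple) -> tuple:
        """yz = (y, z) from tracker 1205, xz = (x, z) from tracker 1206."""
        y, z1 = yz
        x, z2 = xz
        return x, y, (z1 + z2) / 2.0

    def aim_angles(muzzle: tuple, rear: tuple) -> tuple:
        """Horizontal and inclination angles (degrees) of the barrel axis,
        computed from two fused points on the weapon."""
        dx, dy, dz = (m - r for m, r in zip(muzzle, rear))
        horizontal = math.degrees(math.atan2(dy, dx))
        inclination = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
        return horizontal, inclination

    muzzle = fuse((1.2, 1.5), (0.4, 1.5))
    rear = fuse((1.0, 1.4), (0.1, 1.4))
    print(aim_angles(muzzle, rear))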
[0161] In a preferred embodiment, computer 1204 generates the set of target data, which includes a target launch position, a target launch angle, and a target launch velocity of the generated target. Computer 1204 retrieves a set of weapon data based on a desired weapon, including a weapon type, e.g., a shotgun, a rifle, or a handgun, a set of weapon dimensions, a weapon caliber or gauge, a shot type including a load, a caliber, a pellet size, and a shot mass, a barrel length, a choke type, and a muzzle velocity. Other weapon data may be employed. Computer 1204 further retrieves a set of environmental data that includes temperature, amount of daylight, amount of clouds, altitude, wind velocity, wind direction, precipitation type, precipitation amount, humidity, and barometric pressure for desired environmental conditions. Other types of environmental data may be employed.

[0162] Position trackers 1205, 1206, and 1215 capture a set of position image data of user device 1202, weapon 1203, and/or set of gloves 1210, and the set of images is sent to computer 1204. Sensors in user device 1202, weapon 1203, and/or set of gloves 1210 detect a set of orientation data and send the set of orientation data to computer 1204. Computer 1204 then calculates a generated target flight path for the generated target based on the set of target data, the set of environment data, and the position and orientation of the user device 1202. The position and orientation of the user device 1202, the weapon 1203, and/or set of gloves 1210 are determined from the set of position image data and the set of orientation data. Computer 1204 generates a phantom target and a phantom halo based on the generated target flight path and transmits the phantom target and the phantom halo to user device 1202 for viewing by user 1201. User 1201 aims weapon 1203 at the phantom target and the phantom halo to attempt to hit the generated target. Computer 1204 detects a trigger pull on weapon 1203 by a trigger sensor and/or a finger sensor and determines a hit or a miss of the generated target based on the timing of the trigger pull, the set of weapon data, the position and orientation of user device 1202, weapon 1203, and/or set of gloves 1210, the phantom target, and the phantom halo.

[0163] In an alternative embodiment, the set of gloves is replaced by a thimble worn on the trigger finger of the shooter and a simulation attachment on the weapon. The simulation attachment on the weapon indicates the position and direction of the weapon, and the trigger finger thimble is used to indicate when the trigger is pulled. The positions of the simulation attachment and the thimble are tracked by position trackers 1205, 1206, and 1215. When the user provides a "pull" command, such as by vocalizing the word "pull" so that it is picked up via voice recognition, the system launches a target and arms the trigger finger thimble, so that when sufficient movement of the thimble relative to the weapon is detected, the system will identify the trigger as being pulled and fire the weapon in the simulation. When the thimble is not armed, movement of the thimble with respect to the weapon is not used to identify if the trigger has been pulled.

[0164] When weapon 1203 is loaded with live or blank rounds of ammunition, the discharge of the live or blank rounds of ammunition is detected by one or more sensors, such as a microphone, of user device 1202. When the discharge of a live or blank round of ammunition is detected and weapon 1203 is a multi-shot weapon that includes a receiver that cycles between shots, the simulation displays the cycling of the receiver after the discharge of the live or blank round of ammunition is detected. When weapon 1203 is a revolver, the simulation displays the rotation of the cylinder. When the system detects the discharge of a number of rounds of live or blank ammunition that is equal to the maximum number of rounds that can be stored in weapon 1203, the system provides an indication to the user, via user device 1202, that it is time to reload weapon 1203.
[0165] Referring to FIG. 13, command menu 1300 includes simulation type 1301, weapon type 1302, weapon options 1312, ammunition 1303, target type 1304, station select 1305, phantom toggle 1306, day/night mode 1307, environmental conditions 1308, freeze frame 1309, instant replay 1310, and start/end simulation 1311. Simulation type 1301 enables a user to select different types of simulations. For example, the simulation type includes skeet shooting, trap shooting, sporting clays, and hunting. Weapon type 1302 enables the user to choose from different weapon types and sizes. Weapon types include shotguns, rifles, handguns, airsoft weapons, air guns, and so on. Weapon sizes include the different calibers or gauges for the weapon's type. The user further enters a weapon sensor location, for example, in the muzzle or on a rail, and whether the user is right or left handed. Weapon options 1312 enables the user to select different weapon options relating to the weapon selected via weapon type 1302. Weapon options 1312 include optional accessories that can be mounted to the weapon, such as tactical lights, laser aiming modules, forward hand grips, telescopic sights, reflex sights, red-dot sights, iron sights, holographic sights, bipods, bayonets, and so on, including iron sight 1104, reflex sight 1106, and holographic sight 1110 of FIG. 11. Weapon options 1312 also include one or more beams to be simulated with the weapon, such as beams 1906, 1912, 1916, 1920, 1924, 1928, 1932, and 1936 of FIG. 19, which show an approximated trajectory of a shot and are optionally adjusted for one or more of windage and gravity. Ammunition 1303 enables the user to select different types of ammunition for the selected weapon type. Target type 1304 enables the user to select different types of targets for the simulation, including clay targets, birds, rabbits, drones, helicopters, airplanes, and so on. Each type of target includes a target size, a target color, and a target shape. Station select 1305 enables the user to choose different stations to shoot from, for example, in a trap shooting range or field. The user further selects a number of shot sequences for the station select. In a preferred embodiment, the number of shot sequences in the set of shot sequences is determined by the type of shooting range used and the number of target flight path variations to be generated. For example, the representative number of shot sequences for a skeet shooting range is at least eight, one shot sequence per station. More than one shot per station may be utilized.

[0166] In a preferred embodiment, each simulation type 1301 is associated with one or more animated virtual reality shooting scenarios. As one example, when simulation type 1301 is hunting, the animated virtual reality shooting scenario includes a scenario for learning how to shoot over dogs. The shooting over dogs scenario displays an animated dog going on point as a part of the hunt in the simulation so that the user can learn to shoot the target and avoid shooting the dog.

[0167] Phantom toggle 1306 allows a user to select whether to display a phantom target and a phantom halo during the simulation.
The user further selects a phantom color, a phantom brightness level, and a phantom transparency level.

[0168] In certain embodiments, phantom toggle 1306 includes additional help options that adjust the amount of "help" given to the user based on how well the user is doing, such as with aim sensitive help and with dynamic help. When aim sensitive help is selected, aim sensitive help is provided that adjusts one or more of the transparency, color, and size of one or more beams from weapon options 1312, phantom targets, and halos based on how close the aim point of the weapon is to a phantom target. With aim sensitive help, the beams, phantom targets, and halos are displayed with less transparency, brighter colors, and larger sizes the further off-target the aim point of the weapon is. Conversely, the beams, phantom targets, and halos are displayed with more transparency, darker colors, and smaller sizes when the weapon is closer to being aimed on-target.

[0169] When dynamic help is selected, the amount of help provided to the user for each shot is adjusted dynamically based on how well the user is performing with respect to one or more of each shot, each round, and the simulation overall. When more help is provided, beams, phantom targets, and halos are given more conspicuous characteristics and, conversely, when less help is provided, the beams, phantom targets, and halos are shown more passively or not at all. The amount of help is dynamic in that when the previous one or more shots hit the target, a lesser amount of help is provided on the next one or more shots and, conversely, when the previous one or more shots did not hit the target, more help is provided for the subsequent one or more shots. As the user's skill level advances, the brightness of the phantom target can diminish until it is transparent; the user has learned correct lead by rote repetition and no longer needs the phantom as a visual aid.

[0170] Day/night mode 1307 enables the user to switch the environment between daytime and nighttime. Environmental conditions 1308 enables the user to select different simulation environmental conditions including temperature, amount of daylight, amount of clouds, altitude, wind velocity, wind direction, precipitation type, precipitation amount, humidity, and barometric pressure. Other types of environmental data may be employed. Freeze frame 1309 allows the user to "pause" the simulation. Instant replay 1310 enables the user to replay the last shot sequence including the shot attempt by the user. Start/end simulation 1311 enables the user to start or end the simulation. In one embodiment, selection of 1301, 1302, 1312, 1303, 1304, 1305, 1306, 1307, 1308, 1309, 1310, and 1311 is accomplished via voice controls. In another embodiment, selection of 1301, 1302, 1312, 1303, 1304, 1305, 1306, 1307, 1308, 1309, 1310, and 1311 is accomplished via a set of controls on a simulated weapon as previously described.
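As an illustration only (not part of the patent disclosure), the dynamic help behavior of paragraph [0169] amounts to scaling conspicuity with recent performance. The window size and scaling below are assumptions for demonstration.

    # Illustrative sketch: help conspicuity rises when recent shots miss
    # and falls when they hit, fading the phantom toward transparency.

    def help_level(recent_hits: list) -> float:
        """Return 0.0 (no help) .. 1.0 (maximum help) from recent results."""
        if not recent_hits:
            return 1.0                       # full help before any history
        hit_rate = sum(recent_hits) / len(recent_hits)
        return 1.0 - hit_rate                # more misses -> more help

    def phantom_alpha(recent_hits: list) -> float:
        """Phantom opacity: fades toward transparent as skill improves."""
        return 0.15 + 0.85 * help_level(recent_hits)

    print(phantom_alpha([True, True, True, False]))    # mostly hits -> faint
    print(phantom_alpha([False, False, True, False]))  # mostly misses -> bold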
[0171] Referring to FIG. 14, runtime method 1400 for a target simulation will be described. At step 1401, a baseline position and orientation of the user device and a baseline position and orientation of the weapon are set. In this step, the computer retrieves a set of position image data from a set of position trackers and a set of orientation data from a set of sensors in the user device, the weapon, and/or a set of gloves, and saves the current position and orientation of the user device and the weapon into memory. Based on the simulation choice, the virtual position of the launcher relative to the position and orientation of the user device is also set. If the user device is oriented toward the virtual location of the launcher, a virtual image of the launcher will be displayed. At step 1402, a set of target flight data, a set of environment data, and a set of weapon data are determined from a set of environment sensors and a database.

[0172] In a preferred embodiment, the set of weapon data is downloaded and saved into the database based on the type of weapon that is in use and the weapon options selected to be used with the weapon. In a preferred embodiment, the set of weapon data includes a weapon type, e.g., a shotgun, a rifle, or a handgun, a weapon caliber or gauge, a shot type including a load, a caliber, a pellet size, and a shot mass, a barrel length, a choke type, and a muzzle velocity. Other weapon data may be employed. In a preferred embodiment, the weapon options include one or more accessories and beams, including iron sight 1104, reflex sight 1106, and holographic sight 1110 of FIG. 11, and including beams 1906, 1912, 1916, 1920, 1924, 1928, 1932, and 1936 of FIG. 19.

[0173] In a preferred embodiment, the set of environment data is retrieved from the database and includes a wind velocity, an air temperature, an altitude, a relative air humidity, and an outdoor illuminance. Other types of environmental data may be employed.

[0174] In a preferred embodiment, the set of target flight data is retrieved from the database based on the type of target in use. In a preferred embodiment, the set of target flight data includes a launch angle of the target, an initial velocity of the target, a mass of the target, a target flight time, a drag force, a lift force, a shape of the target, a color of the target, and a target brightness level. In alternative embodiments, the target is a self-propelled flying object, such as a bird or drone, which traverses the simulated environment at a constant airspeed.

[0175] At step 1403, the target and environment are generated from the set of target flight data and the set of environmental data. At step 1404, a virtual weapon image that includes the selected weapon options is generated and saved in memory. In this step, images and the set of weapon data of the selected weapon and the selected weapon options for the simulation are retrieved from the database. At step 1405, the target is launched, and the target and environment are displayed in the user device. In a preferred embodiment, a marksman will initiate the launch with a voice command such as "pull."

[0176] At step 1406, a view of the user device with respect to a virtual target launched is determined, as will be further described below.

[0177] At step 1407, a phantom target and a phantom halo are generated based on a target path and the position and orientation of the user, as will be further described below. The target path is determined from the target position and the target velocity using the target flight equations previously described. At step 1408, the generated phantom target and the generated phantom halo are sent to the user device and displayed if the user device is oriented toward the target path. The generated weapon is displayed with the selected weapon options if the user device is oriented toward the position of the virtual weapon or the selected weapon options.
[0178] At step 1409, whether the trigger on the weapon has been pulled is determined from a set of weapon sensors and/or a set of glove sensors. In one preferred embodiment with the trigger sensor of FIG. 10K, the determination of whether the trigger is pulled is made responsive to detecting one of the codes that correspond to the state of trigger sensor 1014 from the output of LED 1039 by a sensor, such as one of position trackers 1205, 1206, and 1215 of FIG. 12.

[0179] If the trigger has not been pulled, then method 1400 returns to step 1408. If the trigger has been pulled, then method 1400 proceeds to step 1410.

[0180] At step 1410, a shot string is determined. In this step, a set of position trackers captures a set of weapon position images, and a set of weapon position data is received from a set of weapon sensors. The shot string is calculated by:

$R_{current} = R_{initial} + v_{spread}\,t$

$A_{shot\ string} = \pi R_{current}^2$

where $A_{shot\ string}$ is the area of the shot string, $R_{current}$ is the radius of the shot string, $R_{initial}$ is the radius of the shot as it leaves the weapon, $v_{spread}$ is the rate at which the shot spreads, and $t$ is the time it takes for the shot to travel from the weapon to the target. An aim point of the weapon is determined from the set of weapon position images and the set of weapon position data. A shot string position is determined from the position of the weapon at the time of firing and the area of the shot string.
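As an illustration only (not part of the patent disclosure), the shot-string relations reconstructed above can be evaluated numerically. The spread rate below is an assumption, chosen so the pattern grows from roughly one inch near the muzzle to roughly twenty-five inches at twenty-five yards, consistent with the cloud sizes described below.

    # Illustrative sketch of the step 1410 spread model:
    # R_current = R_initial + v_spread * t, A = pi * R_current^2.
    import math

    def shot_string_area(r_initial_ft: float, v_spread_fps: float,
                         distance_ft: float, shot_velocity_fps: float) -> float:
        """Area (sq ft) of the shot string when it reaches the target."""
        t = distance_ft / shot_velocity_fps           # flight time to target
        r_current = r_initial_ft + v_spread_fps * t   # radius grows linearly
        return math.pi * r_current ** 2

    # ~25 yards (75 ft) at ~1,225 ft/s with assumed spread rate of 16 ft/s:
    area = shot_string_area(r_initial_ft=1 / 24, v_spread_fps=16.0,
                            distance_ft=75.0, shot_velocity_fps=1225.0)
    print(f"{area:.2f} sq ft")   # a pattern roughly two feet across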
[0181] At step 1411, if the user device is oriented along the muzzle of the weapon, the shot string is displayed on the user device at the shot string position. Separately, a gunshot sound is played and weapon action is displayed. Weapon action is based on the type of the weapon and includes the display of mechanical movements of the weapon, such as the movement of a semi-automatic receiver and the strike of a hammer of the weapon.

[0182] At step 1412, whether the phantom target has been "hit" is determined. The simulation system determines the position of the shot string, as previously described. The simulation system compares the position of the shot string to the position of the phantom target. The shot string is optionally displayed as an elongated cloud of any color that moves from the tip of the user device towards the shot location, which, ideally, is the target, and provides visual feedback to the user of the path taken by the shot string. When the elongated cloud is close to the user device shortly after firing, the diameter of the elongated cloud is about one inch. When the elongated cloud is close to the target, about twenty five yards away from the user, the diameter of the cloud has expanded linearly to about twenty five inches.

[0183] If the position of the shot string overlaps the position of the phantom target, then the phantom target is "hit." If the position of the shot string does not overlap the phantom target, then the phantom target is "missed."

[0184] If the phantom target is hit and the user device is oriented toward the hit location, then method 1400 displays an animation of the target being destroyed on the user device at the appropriate coordinates and plays a sound of the target being destroyed at step 1413. At step 1414, the simulation system records a "hit" in the database.

[0185] If a "miss" is determined at step 1412, then method 1400 proceeds to step 1415. At step 1415, whether the phantom halo is hit is determined. In this step, whether the shot string overlaps an area of the phantom halo by a percentage greater than or equal to a predetermined percentage is determined. For example, the predetermined percentage is 80%: whether the shot string overlaps at least 80% of the area of the phantom halo is determined. Any predetermined percentage may be employed.

[0186] If the position of the shot string overlaps the phantom halo by a percentage greater than or equal to the predetermined percentage, then a "hit" is determined and method 1400 proceeds to step 1413, where the target hit is displayed.

[0187] If, at step 1415, the shot string does not overlap the area of the phantom halo by a percentage greater than or equal to the predetermined percentage, then a "miss" is determined and the simulation system records a "miss" in the database at step 1416.

[0188] The number of targets that are hit, the targets that are missed, the location of each shot with respect to the phantom target, and the location of the shot string with respect to the trajectory of the target are recorded to form tracking data. The tracking data is analyzed to provide insights and suggested adjustments for how to improve the user's performance with the simulation system.

[0189] At step 1417, whether an end command has been received to complete the simulation is determined. If not received, then method 1400 advances to the next target at step 1418.

[0190] If an end command has been received and the simulation is complete, then a trend of shot attempts is analyzed at step 1419 by retrieving a number of "hits" in the set of shot sequences and a number of "misses" in the set of shot sequences from the database. In this step, a shot improvement is determined by evaluating the number of hits in the set of shot sequences and the number of misses in the set of shot sequences. Method 1400 ends at step 1420.

[0191] Referring to FIG. 15A, user 1500 wears user device 1501 and holds weapon 1502 in simulation environment 1503. Simulation environment 1503 is a virtual sphere spanning 360° in all directions surrounding user 1500. User device 1501 has field of view 1504. Field of view 1504 is a cone that has angular range α and spans an arcuate portion (in two dimensions) or a sectoral portion (in three dimensions) of simulation environment 1503. User device orientation vector 1505 bisects field of view 1504 and angular range α into equal angles β. Weapon 1502 has weapon orientation vector 1506. Each of user device orientation vector 1505 and weapon orientation vector 1506 is independent of the other. The positions of user device 1501, weapon 1502, user device orientation vector 1505, and weapon orientation vector 1506 have Cartesian x, y, z coordinates. Simulation environment 1503 has spherical coordinates. Simulation environment 1503 includes virtual target launcher 1507, virtual target 1508, phantom target 1509, and phantom halo 1510. As can be seen, weapon 1502, virtual target 1508, phantom target 1509, and phantom halo 1510 are in field of view 1504 of user device 1501. Virtual target launcher 1507 is not in field of view 1504 of user device 1501. Weapon 1502, virtual target 1508, phantom target 1509, and phantom halo 1510 will be displayed in user device 1501, and virtual target launcher 1507 will not be displayed in user device 1501.

[0192] In a preferred embodiment, angular range α is approximately 110° and each of equal angles β is approximately 55°. Other angular ranges may be employed.

[0193] Referring to FIG. 15B, step 1406 will be further described as method 1511 for determining a view for a user device with respect to a position and an orientation of the user device and the weapon. Method 1511 begins at step 1512.
At step 1513, a set of current position image data is retrieved from a set of position trackers, and a set of current position and orientation data is retrieved from the user device and the weapon and/or set of gloves. At step 1514, a set of motion detection data is received from a set of sensors in the user device to determine movement of the user device and from the weapon and/or set of gloves to determine movement of the weapon. At step 1515, the set of motion detection data and the position of the user device and the weapon and/or set of gloves are combined to determine an x, y, z position of the user device and the weapon and a roll, pitch, and yaw orientation of the user device and the weapon. The current x, y, z orientation vectors for the user device and the weapon are calculated from the difference between the baseline position and orientation and the current position and orientation of the user device and the weapon. The set of motion detection data received is the roll, pitch, and yaw orientation movement of the head of the user and the weapon. At step 1516, the current positions and orientation vectors of the user device and the weapon are mapped to the simulation environment. In a preferred embodiment, the current positions and orientation vectors are a 1:1 ratio to the positions and orientation vectors in the simulation environment. For example, for every inch and/or degree that the user device and/or the weapon moves and/or rotates, the view of the user and/or the simulated weapon moves one inch and/or rotates one degree in the simulated environment. Other ratios may be employed. The mapping determines the display view, as will be further described below. At step 1517, the simulation environment that would be visible to the user based on the orientation of the user device and the weapon is displayed. Method 1511 ends at step 1518.

[0194] Referring to FIG. 15C, step 1516 will be further described as method 1519 for mapping the position and orientation of the user device and the weapon to the simulation environment for determining a display field of view. At step 1520, the x, y, z positions of the weapon and the weapon orientation vector are retrieved. At step 1521, the x, y, z positions of the weapon and the weapon orientation vector are converted to spherical coordinates (ρ, θ, φ) using:

$\rho = \sqrt{x^2 + y^2 + z^2}$   (9)

$\theta = \arccos(z/\rho)$   (10)

$\phi = \arctan(y/x)$   (11)
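As an illustration only (not part of the patent disclosure), Eqs. 9-11 and the view-cone comparison of method 1519 can be sketched as follows; atan2 is used in place of arctan(y/x) so the quadrant is preserved, and the half-angle β defaults to the approximately 55° of paragraph [0192].

    # Illustrative sketch: Cartesian-to-spherical conversion (Eqs. 9-11)
    # and a field-of-view test against a view cone of half-angle beta.
    import math

    def to_spherical(x: float, y: float, z: float) -> tuple:
        rho = math.sqrt(x * x + y * y + z * z)     # Eq. 9
        theta = math.acos(z / rho)                 # Eq. 10
        phi = math.atan2(y, x)                     # Eq. 11, quadrant-safe
        return rho, theta, phi

    def in_field_of_view(device_dir, object_dir, beta_deg: float = 55.0) -> bool:
        """True when object_dir lies inside the view cone of half-angle beta."""
        dot = sum(d * o for d, o in zip(device_dir, object_dir))
        norm = math.sqrt(sum(d * d for d in device_dir)) * \
               math.sqrt(sum(o * o for o in object_dir))
        return math.degrees(math.acos(dot / norm)) <= beta_deg

    print(to_spherical(1.0, 1.0, 1.0))
    print(in_field_of_view((0.0, 1.0, 0.0), (0.3, 1.0, 0.1)))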
[0195] At step 1522, the weapon is rendered in the simulation environment at the spherical position and orientation vector. At step 1523, the x, y, z positions of the user device and the user device orientation vector are retrieved. At step 1524, the x, y, z positions of the user device and the user device orientation vector are converted to spherical coordinates (ρ, θ, φ) using Eqs. 9, 10, and 11. At step 1525, the display field of view is determined from the spherical orientation vector coordinates. In this step, equal angles β are measured from the user device orientation vector to define the display field of view as a sector of the simulation environment in spherical coordinates. At step 1526, the field of view sector is compared to the simulation environment to determine a portion of the simulation environment within the field of view sector. At step 1527, the portion of the simulation environment within the field of view sector is displayed on the user device at the display field of view. At step 1528, the spherical position and orientation vector of the weapon is compared to the field of view sector to determine whether the weapon is in the display field of view. If the weapon is not in the display field of view, then method 1519 returns to step 1520. If the weapon is in the display field of view, then at step 1529, the weapon is displayed on the user device at the spherical position and orientation. Method 1519 then returns to step 1520.

[0196] Referring to FIG. 16A, step 1407 will be further described as method 1600 for generating a phantom target and a phantom halo. At step 1601, a phantom path is extrapolated. Referring to FIGS. 16B and 16C, target 1606 is launched from launch point 1611 and moves along target path 1607 to position P_T. Phantom target 1608 moves along phantom path 1609 ahead of target 1606 at position P_P. Position P_P is lead distance 1610 and drop distance 1616 from position P_T. Phantom path 1609 varies as target 1606 and target path 1607 vary, thereby varying lead distance 1610. Marksman 1612 is positioned at distance 1613 from launch point 1611. Marksman 1612 aims at phantom target 1608 and shoots along shot path 1614 to intercept target 1606. Target path 1607 is extrapolated over time using the set of target flight data. Target path 1607 is calculated using the target flight equations previously described.

[0197] Referring to FIG. 16B, lead distance 1610 is calculated using target path 1607, the relative marksman location, and the set of weapon data:

$\frac{D_{P_L}}{\sin\phi'} = \frac{D_{M_P}}{\sin\theta}, \qquad \frac{D_{T_L}}{\sin\phi} = \frac{D_{M_T}}{\sin\theta}$

where $D_{P_L}$ is the distance of phantom target 1608 at position P_P from launch point 1611, $D_{M_P}$ is the distance from marksman 1612 to phantom target 1608 along shot path 1614, $\phi'$ is the angle between shot path 1614 and distance 1613, $D_{T_L}$ is the distance of target 1606 at position P_T from launch point 1611, $D_{M_T}$ is the distance from marksman 1612 to target 1606 along shot path 1615, $\phi$ is the angle between shot path 1615 and distance 1613, and $\theta$ is the launch angle between target path 1607 and distance 1613. Lead distance is then:

$D_L = (D_{P_L} - D_{T_L}) + A\,\Delta D + B\,\Delta\phi + C$

where $D_L$ is lead distance 1610, $\Delta D$ is the difference between the distances of shot paths 1614 and 1615, $\Delta\phi$ is the difference between angles $\phi$ and $\phi'$, $\theta$ is the launch angle between target path 1607 and distance 1613, A is a variable multiplier for shot size, gauge, and shot mass, B is a variable multiplier for $\theta$ including vibration of a target thrower and a misaligned target in the target thrower, and C is a variable multiplier for drag, lift, and wind.

[0198] For example, the approximate times it takes for a 7 shot size shell with an initial muzzle velocity of approximately 1,225 feet per second to travel various distances are shown in Table 1.

TABLE 1
Distance from barrel | Time (seconds)
[table values are not legible in the source]

[0199] Various lead distances between target 1606 and phantom target 1608 for target 1606 having an initial velocity of approximately 30 mph are shown in Table 2.

TABLE 2
[table values are not legible in the source]
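As an illustration only (not part of the patent disclosure, and deliberately simpler than the multiplier form reconstructed above), a first-order lead for a crossing target follows from the basic intercept problem: the shot and the target must arrive at the same point at the same time. All values below are assumptions.

    # Illustrative sketch: first-order lead distance for a crossing target
    # moving at roughly constant speed. Not the patent's exact equation.
    def lead_distance(target_speed_fps: float, distance_to_target_ft: float,
                      shot_velocity_fps: float) -> float:
        """Lead (feet) so shot and target arrive together."""
        time_of_flight = distance_to_target_ft / shot_velocity_fps
        return target_speed_fps * time_of_flight

    # A 30 mph (44 ft/s) crossing target at 25 yards with a ~1,225 ft/s shot:
    print(f"{lead_distance(44.0, 75.0, 1225.0):.2f} ft lead")   # about 2.7 ft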
[0200] Referring to FIG. 16C, phantom path 1609 is offset from target path 1607 by drop distance 1616 to simulate and compensate for the average exterior ballistics drop of shot.

[0201] The "drop of a shot" is the effect of gravity on the shot during the distance traveled by the shot. The shot trajectory has a near parabolic shape. Due to the near parabolic shape of the shot trajectory, the line of sight or horizontal sighting plane will cross the shot trajectory at two points, called the near zero and far zero, in the case where the shot has a trajectory with an initial angle inclined upward with respect to the sighting device horizontal plane, thereby causing a portion of the shot trajectory to appear to "rise" above the horizontal sighting plane. The distance at which the weapon is zeroed, and the vertical distance between the sighting device axis and the barrel bore axis, determine the amount of the "rise" in both the X and Y axes, i.e., how far above the horizontal sighting plane the rise goes, and over what distance it lasts.

[0202] Drop distance 1616 is calculated by:

$D_{drop} = \tfrac{1}{2}\,g\,t_{impact}^2$

where $D_{drop}$ is drop distance 1616 and $t_{impact}$ is the time required for a shot string fired by marksman 1612 to impact phantom target 1608. $t_{impact}$ is determined by a set of lookup tables having various impact times at predetermined distances for various shot strings. The terminal velocity and characteristic time of the target are:

$v_t = \sqrt{\frac{2mg}{C_d\,\rho\,A}}, \qquad \tau = \frac{v_t}{g}$

where $v_t$ is the terminal velocity of target 1606, $m$ is the mass of target 1606, $g$ is the vertical acceleration due to gravity, $C_d$ is the drag coefficient for target 1606, $\rho$ is the density of the air, $A$ is the planform area of target 1606, and $\tau$ is the characteristic time.

[0203] Referring to FIGS. 16A and 16C, at step 1602, phantom halo 1617 is determined. Phantom halo 1617 is a simulation of a shot string at a distance of the phantom target from the position of the marksman. In a preferred embodiment, an area of phantom halo 1617 is determined from the set of weapon data and calculated by:

$A_{shot\ string} = \pi R_{current}^2$   (19)

$R_{current} = R_{initial} + c\,v_{spread}\,t$   (20)

$A_{phantom\ halo} = A_{shot\ string}$   (21)

where $A_{shot\ string}$ is the area of the shot string, $R_{current}$ is the radius of the shot string, $R_{initial}$ is the radius of the shot as it leaves the weapon, $c$ is a variable multiplier for any choke applied to the weapon as determined from the set of weapon data, $v_{spread}$ is the rate at which the shot spreads, $t$ is the time it takes for the shot to travel from the weapon to the target, and $A_{phantom\ halo}$ is the area of phantom halo 1617.

[0204] In one embodiment, the area of phantom halo 1617 varies as the amount of choke applied to the weapon varies.
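As an illustration only (not part of the patent disclosure), the drop-distance and halo-area relations reconstructed in [0202] and [0203] can be evaluated together. The choke multiplier and spread rate below are assumptions.

    # Illustrative sketch of drop distance (Eq. in [0202]) and phantom halo
    # area (Eqs. 19-21). Values are assumptions for demonstration.
    import math

    G_FPS2 = 32.174                      # gravitational acceleration, ft/s^2

    def drop_distance(t_impact_s: float) -> float:
        """Gravity drop (feet) of the shot over its flight time."""
        return 0.5 * G_FPS2 * t_impact_s ** 2

    def phantom_halo_area(r_initial_ft: float, choke: float,
                          v_spread_fps: float, t_s: float) -> float:
        """Halo area (sq ft): the shot-string area at the phantom target."""
        r_current = r_initial_ft + choke * v_spread_fps * t_s
        return math.pi * r_current ** 2

    t = 75.0 / 1225.0                    # ~25 yards at ~1,225 ft/s
    print(f"drop: {drop_distance(t):.3f} ft")              # under an inch
    print(f"halo: {phantom_halo_area(1/24, 0.8, 16.0, t):.2f} sq ft")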
[0205] Returning to FIG. 16A, at step 1603, a relative contrast value between the target and a background surrounding the target is analyzed by calculating the difference between a grayscale brightness of the target and an average brightness of the background surrounding the target and the difference between an average color of the target and a color of the background surrounding the target, based on a desired day/night setting and a set of desired environmental conditions.

[0206] At step 1604, a color and a contrast level of a phantom target are determined. In a preferred embodiment, the phantom target includes a set of pixels set at a predetermined contrast level. The predetermined contrast level is determined by the difference of the color between the phantom target and the target and the difference of the brightness between the phantom target and the target. In this embodiment, the predetermined contrast level is a range from a fully opaque image to a fully transparent image with respect to the image of the target and the image of the background.

[0207] In a preferred embodiment, the set of pixels is set at a predetermined color. For example, blaze orange has a pixel equivalent setting of R 232, G 110, B 0.

[0208] At step 1605, a color and a contrast level of the phantom halo are determined. In a preferred embodiment, the phantom halo includes a set of pixels set at a predetermined contrast level. The predetermined contrast level is determined by the difference of the color between the phantom halo and the target and the difference of the brightness between the phantom halo and the target. In this embodiment, the predetermined contrast level is a range from a fully opaque image to a fully transparent image with respect to the image of the target and the image of the background.

[0209] In a preferred embodiment, the set of pixels is set at a predetermined color. For example, black has a pixel equivalent setting of R 0, G 0, B 0. Any color may be employed.

[0210] Referring to FIG. 17, a view of a simulation from the perspective of a marksman wearing a user device, such as user device 900, is shown. Through display 1700, background environment 1701 and target 1702 are viewed. Phantom target 1703 is projected at a lead distance and at a drop distance from target 1702. Phantom halo 1704 is projected surrounding phantom target 1703. Marksman 1705 aims weapon 1706 at phantom target 1703.

[0211] In a preferred embodiment, shot center 1707 appears on display 1700 when marksman 1705 pulls a trigger of weapon 1706. Shot string 1708 surrounds shot center 1707. In a preferred embodiment, shot string 1708 is a simulation of a shot pellet spread fired from weapon 1706.

[0212] In an alternative embodiment, shot center 1707 is not displayed and shot string 1708 is displayed traveling from the barrel of weapon 1706 along a trajectory. The trajectory, size, positioning, and flight path of shot string 1708 are based on the location and orientation of weapon 1706 and are based on the type of ammunition selected for the simulation. When shot string 1708 intersects target 1702, target 1702 is destroyed. An image of one or more of target 1702, phantom target 1703, and phantom halo 1704 can be paused and displayed at their respective locations from when the trigger of weapon 1706 was pulled, while the target 1702 continues to move along its trajectory and shot string 1708 continues to move along its trajectory.

[0213] Referring to FIG. 18, an isometric view shows an input device configured to be mounted on a rail of a weapon. Input device 1802 is to be mounted to rail interface system 1804 of weapon 1806.

[0214] Weapon 1806 includes barrel 1808, sight 1846, frame 1842, member 1844, cylinder 1810, hammer 1812, handle 1814, trigger 1816, trigger guard 1818, trigger sensor 1860, and rail interface system 1804. Weapon 1806 is a double-action revolver wherein operation of trigger 1816 cocks and releases hammer 1812. Rotation of cylinder 1810 is linked to movement of hammer 1812 and trigger 1816.

[0215] Barrel 1808 is connected to frame 1842 and member 1844. Member 1844 supports barrel 1808 and is the portion of weapon 1806 to which rail interface system 1804 is mounted. In alternative embodiments, rail interface system 1804 is mounted to other parts or portions of weapon 1806, such as being directly mounted to barrel 1808.

[0216] Frame 1842 connects barrel 1808, member 1844, trigger guard 1818, trigger 1816, handle 1814, hammer 1812, and cylinder 1810. Frame 1842 and handle 1814 house the mechanisms that create action between trigger 1816, cylinder 1810, and hammer 1812.

[0217] Rail interface system 1804 is a rail system for interfacing additional accessories to weapon 1806, such as tactical lights, laser aiming modules, forward hand grips, telescopic sights, reflex sights, red-dot sights, iron sights, holographic sights, bipods, bayonets, and so on.
Rail interface system 1804 may conform to one or more standard rail systems, such as the Weaver rail mount, the Picatinny rail (also known as MIL-STD-1913), and the NATO Accessory Rail. Rail interface system 1804 includes screws 1820, base 1822, member 1848, and rail 1826.

[0218] Screws 1820 fit and secure rail interface system 1804 to member 1844 of weapon 1806. Screws 1820 compress base 1822 and member 1848 of rail interface system 1804 against member 1844 of weapon 1806.

[0219] Rail 1826 includes ridges 1824, slots 1850, and angled surfaces 1856. The longitudinal axis of rail 1826 is substantially parallel to the longitudinal axis of barrel 1808. Slots 1850 are the lateral voids or slots between ridges 1824 that are perpendicular to both the longitudinal axis of rail 1826 and the longitudinal axis of barrel 1808. Rail 1826 also includes a longitudinal slot 1852 that runs along the length of rail 1826 and is substantially parallel to the longitudinal axis of barrel 1808. Angled surfaces 1856 of rail 1826 allow for the precise mounting of accessories to rail 1826.

[0220] Input device 1802 includes rail mount 1828, first portion 1830, second portion 1832, battery 1834, processor 1836, LEDs 1854, button 1838, and screws 1840. Input device 1802 slides longitudinally onto rail 1826 of rail interface system 1804 of weapon 1806, and its position is secured by screws 1840. The front surface of input device 1802 is flush with a ridge 1824 of rail 1826 so that the location and orientation of input device 1802 with respect to barrel 1808 is known and the firing of weapon 1806 can be accurately simulated.

[0221] Rail mount 1828 of input device 1802 includes first portion 1830, second portion 1832, and angled surfaces 1858. Angled surfaces 1858 of rail mount 1828 correspond to angled surfaces 1856 of rail 1826 to allow for a tight and precise fitment of input device 1802 to rail interface system 1804. Screws 1840 of input device 1802 compress first portion 1830 and second portion 1832 against rail 1826 of rail interface system 1804 with sufficient force to prevent changes in the positioning or orientation of input device 1802 with respect to weapon 1806 as weapon 1806 is being used.

[0222] Battery 1834 of input device 1802 is connected to and powers the electrical components within input device 1802, including processor 1836 and LEDs 1854. Processor 1836 controls LEDs 1854. In additional embodiments, input device 1802 includes one or more sensors, accelerometers, gyroscopes, compasses, and communication interfaces. The sensor data from the sensors, accelerometers, gyroscopes, and compasses is sent from input device 1802 to a computer, such as computer 801 of FIG. 8, via the communication interface. Input device 1802 includes button 1838 to turn on, turn off, and initiate the pairing of input device 1802.

[0223] LEDs 1854 emit light that is sensed by one or more cameras or sensors from which the locations and orientations of input device 1802 and weapon 1806 can be determined. The locations and orientations are determined from the transmission characteristics of the light emitted from LEDs 1854 and the placement characteristics of LEDs 1854.
[0224] Trigger sensor 1860 detects the pull of trigger 1816 when trigger 1816 presses onto pressure switch 1862 with sufficient movement and force. When hammer 1812 is fully cocked, trigger 1816 rests just above pressure switch 1862 so that any additional movement will release hammer 1812 and will activate pressure switch 1862. One or more wires 1864 electrically connect trigger sensor 1860 to processor 1836 so that processor 1836 can determine when trigger 1816 is pulled when blanks or live rounds are not used. Trigger sensor 1860 is contoured to fit onto the back end of trigger guard 1818 behind trigger 1816, and trigger sensor 1860 is secured onto trigger guard 1818 by screws 1866.

[0225] In a two wire embodiment, current from processor 1836 through a first wire of wires 1864 to trigger sensor 1860 is returned through a second wire of wires 1864. In an alternative embodiment, wire 1864 is a single wire and a return path for the current from processor 1836 through wire 1864 to trigger sensor 1860 is created by electrically connecting trigger sensor 1860 to trigger guard 1818, which is electrically connected to frame 1842, rail interface system 1804, input device 1802, and processor 1836.

[0226] In alternative embodiments, weapon 1806 is loaded with one or more live or blank rounds of ammunition that discharge through barrel 1808 after hammer 1812 is cocked and trigger 1816 is then pulled. Weapon 1806 does not include sensors for measuring the precise location of cylinder 1810, hammer 1812, and trigger 1816. During simulation and after a round has been fired, the simulation shows the movement of cylinder 1810, hammer 1812, and trigger 1816 to prepare for a subsequent shot, which may or may not correspond to the actual state of weapon 1806.

[0227] In alternative embodiments, the computer that receives data from one or more sensors of input device 1802 derives the state of weapon 1806 from the data received from the one or more sensors and updates the display of weapon 1806 to show the state and/or firing of weapon 1806 in the simulation. For example, data from sensors, accelerometers, and gyroscopes within input device 1802 can indicate the click for when hammer 1812 is fully cocked, indicate the click for when cocked hammer 1812 is released and the chamber in cylinder 1810 is unloaded, and indicate the discharge of a live or blank round of ammunition. Data from a microphone, such as microphone 919 of FIG. 9, can be used to similarly detect one or more states of weapon 1806 and the discharge of live or blank rounds of ammunition. When cylinder 1810 is configured to hold six rounds of ammunition and six shots have been fired successively, the simulation may indicate to the user that it is time to reload weapon 1806. The simulation displays changes to the state of weapon 1806 as mechanical movements on weapon 1806 and displays the firing of weapon 1806 with associated mechanical movements of weapon 1806.

[0228] Referring to FIG. 19, a simulation view shows "beams" being projected from a barrel of a weapon. Weapon 1902 includes barrel 1904 with one or more simulated beams 1906, 1912, 1916, 1920, 1924, 1928, 1932, and 1936 that emanate from the tip of barrel 1904. Beams 1906, 1912, 1916, 1920, 1924, 1928, 1932, and 1936 follow and are adjusted with the movement of barrel 1904 of weapon 1902.
[0229] The beam of a laser in a real world environment is generally not visible to an observer unless reflected from an object in the environment. In a virtual reality environment, however, a simulated laser beam can be calculated and displayed. Simulated beams can be displayed with any level of transparency and can demonstrate characteristics that are not possible in the real world. For example, the simulated beam can be displayed as visible, and with a dispersion pattern or in a curved path.

[0230] As an example, beam 1906 is a beam of a simulated laser and is displayed as visible along its entire length. The beam is displayed as a line or a straight cylinder. Beam 1906 emanates from point 1908 that is central to and aligned with barrel 1904. Beam 1906 indicates the precise direction that barrel 1904 is pointed. Beam 1906 extends to point 1910 that is on the central longitudinal axis of barrel 1904 and is a fixed distance away from barrel 1904.

[0231] In another embodiment, beam 1912 is displayed as a conical frustum starting from barrel 1904 and extending to circular cross section 1914. The increase of the radius of beam 1912 from the radius of barrel 1904 to cross section 1914 approximates the increasing spread of a shot as it travels away from barrel 1904. Circular cross section 1914 is displayed at the termination plane of beam 1912 and provides an indication of the maximum distance at which a shot on target can reliably register as a hit.

[0232] Beams 1906 and 1912 maintain their respective shapes and orientation with respect to barrel 1904 as it is moved. Pulling the trigger of weapon 1902 while beam 1906 or beam 1912 is aligned with a phantom target or phantom halo, such as phantom target 1703 or phantom halo 1704 of FIG. 17, registers as a hit to the simulated target.

[0233] Beam 1916 is displayed as a curved line that extends from point 1908 at barrel 1904. Beam 1916 is tangential to beam 1906 at point 1908 and ends at point 1918.

[0234] In another embodiment, beams 1916 and 1920 are curved to approximate the drop of a shot due to gravity. The curvature of beams 1916 and 1920 is calculated based on the amount of simulated force due to gravity 1940 and the angle of barrel 1904 when the trigger is pulled. Pulling the trigger of weapon 1902 while beam 1916 or beam 1920 is aligned with a phantom target or phantom halo, such as phantom target 1703 or phantom halo 1704, registers as a hit to the simulated target.

[0235] In another embodiment, beam 1920 is displayed as a curved conical frustum beginning at barrel 1904 and ending at circular cross section 1922. Beam 1920 is curved to approximate the drop of a shot due to gravity and has a radius that increases along the length from barrel 1904 to cross section 1922 to simulate the spread of a shot.
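As an illustration only (not part of the patent disclosure), a gravity-curved beam such as beam 1916 can be sampled as a sequence of points: each point starts on the barrel axis and is offset downward by the ballistic drop for its flight time. Windage, used by the beams described below, could be added the same way along the wind axis. All names and values are assumptions.

    # Illustrative sketch: sample points along a gravity-curved beam.
    import math

    G_FPS2 = 32.174

    def curved_beam(muzzle, direction, shot_velocity_fps, length_ft, n=8):
        """Return n points (x, y, z) along the beam; z is vertical (feet)."""
        norm = math.sqrt(sum(c * c for c in direction))
        ux, uy, uz = (c / norm for c in direction)
        points = []
        for i in range(1, n + 1):
            s = length_ft * i / n             # distance along barrel axis
            t = s / shot_velocity_fps         # flight time to that distance
            drop = 0.5 * G_FPS2 * t * t       # gravity drop at time t
            points.append((muzzle[0] + ux * s,
                           muzzle[1] + uy * s,
                           muzzle[2] + uz * s - drop))
        return points

    for p in curved_beam((0, 0, 5), (1, 0, 0), 1225.0, 150.0, n=3):
        print(tuple(round(c, 3) for c in p))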
[0236] In another embodiment, beams 1924 and 1928 are curved to approximate changes in shot trajectory due to windage 1942. The amount of curvature of beams 1924 and 1928 is based on the amount of simulated force due to windage 1942 and the angle of barrel 1904 with respect to windage 1942. The simulation of windage may approximate changes in wind velocity and direction, such as found in a gusty wind. In this embodiment, the simulation is calculated so that the beam moves with respect to the longitudinal axis of the barrel to indicate how the shot would be affected by windy conditions. When windage 1942 is simulated, pulling the trigger of weapon 1902 while beam 1924 or beam 1928 is aligned with a phantom target or phantom halo, such as phantom target 1703 or phantom halo 1704, registers as a hit to the simulated target.

[0237] Beam 1924 is displayed as a curved line that extends from point 1908 at the tip of barrel 1904. Beam 1924 is tangential to beam 1906 at point 1908 and ends at point 1926.

[0238] Beam 1928 is displayed as a curved conical frustum starting at the circular tip of barrel 1904 and ending at circular cross section 1930. Beam 1928 is curved to approximate the changes in shot trajectory due to windage and has a radius that increases along the length from the tip of barrel 1904 to cross section 1930 to simulate the spread of a shot.

[0239] Beams 1932 and 1936 are curved to approximate changes in shot trajectory due to both gravity 1940 and windage 1942. The curvature of beams 1932 and 1936 is based on the amount of gravity 1940 and windage 1942 and based on the angle of barrel 1904 with respect to gravity 1940 and windage 1942. When both gravity 1940 and windage 1942 are simulated, pulling the trigger of weapon 1902 while beam 1932 or beam 1936 is aligned with a phantom target or phantom halo, such as phantom target 1703 or phantom halo 1704, registers as a hit to the simulated target.

[0240] Beam 1932 is displayed as a curved line that extends from point 1908 at the tip of barrel 1904. Beam 1932 is tangential to beam 1906 at point 1908 and ends at point 1934.

[0241] Beam 1936 is formed as a curved conical frustum starting at barrel 1904 and ending at circular cross section 1938. Beam 1936 is curved to approximate the changes in the trajectory of a shot due to both gravity 1940 and windage 1942, and the radius of beam 1936 increases along the length from the tip of barrel 1904 to cross section 1938 to approximate the spread of a shot.

[0242] In one preferred embodiment, a video capture system, such as Microsoft HoloLens, in combination with prerecorded videos of the shooting field and multiple actual clay target launches, is used to create a virtual model of the surroundings and trajectories of clay targets for display and use in the system.

[0243] The locations and orientations of the launchers are derived based on the known location of the camera with respect to the field, the known size and weight of the targets, and the known physical constants of the environment (e.g., gravity). After deriving the launcher locations and orientations, virtual or holographic launchers can be placed at similar positions in virtual reality or augmented reality simulations of the fields, as will be further described.

[0244] Referring to FIG. 20A, five stand field 2000 includes five shooter locations with six launchers. Five stand field 2000 includes launchers 2002, 2004, 2006, 2008, 2010, and 2012 that launch targets onto paths 2014, 2016, 2018, 2020, 2022, and 2024, respectively. Cameras 2026 and 2028 are positioned to view all towers and launchers. A video of the high tower and the low tower shot with a normal lens at 60 fps from station 4 can be processed and used to show correct trajectory and correct lead from any point of view at any station. The trajectory of the target is the same, merely viewed from different angles.

[0245] Referring to FIG. 20B, sporting clays field 2050 includes three shooter locations that each have four launcher locations. The shooter and launch locations in sporting clays are unique to the venue.
[0246] Referring to FIG. 21A, an alternate embodiment of the simulation system will be described. System 2100 includes system computer 2101. System computer 2101 includes programs 2102, 2103, and 2120. Program 2102 is software capable of operating the Microsoft HoloLens system, as will be further described. Program 2103 includes instructions to operate a Unity 3D simulation of the system, as will be further described. Program 2120 is simulation software capable of communicating with programs 2102 and 2103. In a preferred embodiment, program 2120 is the Unity 3D simulation engine, as will be further described.

[0247] Headset 2104 is connected to system computer 2101. Headset 2104 includes an augmented reality display or a virtual reality display, as will be further described. System computer 2101 is further connected to camera 2105 and camera 2106. The cameras are used in registering fixed objects, such as launchers and towers, and in creating trajectory models of moving objects, such as clay targets, in the Microsoft HoloLens system, as will be further described.

[0248] System computer 2101 is attached to wireless interface 2108. In a preferred embodiment, wireless interface 2108 is a Bluetooth interface. System computer 2101 is also attached to dongle 2109. In a preferred embodiment, dongle 2109 is compatible with the Vive Tracker, available from HTC.

[0249] System 2100 further includes trigger unit 2114. Trigger unit 2114, in a preferred embodiment, is attached to the weapon and includes sensors to detect trigger pulls. The sensors communicate signals through an onboard wireless interface to wireless interface 2108.

[0250] System 2100 further includes electronic cartridge 2112 and barrel bore arbor mounted sensor 2110. In a preferred embodiment, both include onboard wireless interfaces which communicate with wireless interface 2108. Electronic cartridge 2112 communicates with barrel bore arbor mounted sensor 2110 via light signal 2111, as will be further described.

[0251] Electronic cartridge 2112, in a typical usage, is chambered in the weapon. In a typical embodiment, arbor mounted sensor 2110 is secured in the muzzle of the weapon.

[0252] System 2100 also includes positioning detector 2204, as will be further described.

[0253] Referring to FIG. 21B, in a preferred embodiment of a virtual reality system, system computer 2121 is connected to head unit 2122 and position detector 2123.

[0254] System computer 2121 runs operating system 2124, which runs virtual reality simulation engine 2125. System computer 2121 receives input from head unit 2122 and position detector 2123 that includes measurement data, which is used to identify the positions of head unit 2122 and position detector 2123. System computer 2121 outputs images to head unit 2122 that are rendered using simulation engine 2125.

[0255] Head unit 2122 includes sensors 2135 that provide measurement data that is used to identify the position of head unit 2122. Head unit 2122 also includes display 2136, which shows three-dimensional images. The measurement data is processed by system computer 2121 and used to generate the images displayed by the one or more display screens.
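One frame of the FIG. 21B data flow just described might look like the following sketch (hypothetical Python interfaces; the disclosure's preferred embodiment runs on the Unity 3D simulation engine rather than on this code):

    class SimulationLoop:
        # Sketch of the FIG. 21B data flow: head unit 2122 and position
        # detector 2123 feed measurement data to simulation engine 2125,
        # which returns rendered images to display 2136.
        def __init__(self, head_unit, position_detector, engine):
            self.head_unit = head_unit         # driver handle, head unit 2122
            self.detector = position_detector  # driver handle, detector 2123
            self.engine = engine               # simulation engine 2125

        def frame(self, dt):
            head_pose = self.head_unit.read_pose()       # from sensors 2135
            weapon_pose = self.detector.read_pose()      # from sensors 2137
            buttons = self.detector.read_buttons()       # trigger / launch state
            self.engine.update(dt, head_pose, weapon_pose, buttons)
            left, right = self.engine.render(head_pose)  # stereo images
            self.head_unit.show(left, right)             # to display 2136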
[0256] Position detector 2123 includes sensors 2137, is mounted to a weapon, and provides measurement data. System computer 2121 receives and processes the measurement data from position detector 2123 to update the position of the weapon inside of the simulation.

[0257] Operating system 2124 runs on system computer 2121 and provides standard interfaces for applications to run and to access external hardware. Applications running under operating system 2124 on system computer 2121 access data provided by hardware devices, such as head unit 2122 and position detector 2123, through hardware drivers 2126.

[0258] Hardware drivers 2126 include device drivers for each of head unit 2122 and position detector 2123. Hardware drivers 2126 allow simulation engine 2125 to access the measurement data provided by head unit 2122 and position detector 2123 and to send images to head unit 2122.

[0259] Simulation engine 2125 runs under operating system 2124. In a preferred embodiment, simulation engine 2125 runs in program 2120. The simulation engine receives measurement data from head unit 2122 and position detector 2123, renders virtual reality images based on the measurement data and the state of the simulation, and sends the images back to head unit 2122 to be displayed to the user. In a preferred embodiment, simulation engine 2125 uses one or more software objects in the virtual reality simulation, including player object 2127, head unit object 2128, weapon object 2129, tracker object 2130, target object 2131, and launcher object 2132. Every time a new frame or image is generated, simulation engine 2125 updates each of the objects based on the measurement data, the amount of time since the last update, and the previous state of the simulation.

[0260] Player object 2127 represents the user inside of simulation engine 2125, and its location is based on the location of head unit 2122. Player object 2127 is linked to head unit object 2128, which stores the current location of head unit 2122. Head unit object 2128 identifies the current location of head unit 2122 by accessing the measurement data provided by head unit 2122 through hardware drivers 2126.

[0261] Weapon object 2129 represents, in simulation engine 2125, the weapon to which position detector 2123 is attached. The position of weapon object 2129 is linked to the position of position detector 2123 so that movements of position detector 2123 result in movements of weapon object 2129 inside of simulation engine 2125. Weapon object 2129 is linked to tracker object 2130 so that when tracker object 2130 updates its position, the position of weapon object 2129 is also updated.

[0262] Tracker object 2130 receives measurement data from position detector 2123 through hardware drivers 2126. Tracker object 2130 updates the position of position detector 2123, which is used by simulation engine 2125 and weapon object 2129 to update the visible location of weapon object 2129 within simulation engine 2125. Tracker object 2130 also receives button status data within the measurement data. The button status data is used to identify when a shot is fired and when a target should be launched.

[0263] Target object 2131 is a digital representation of a clay target. Target object 2131 is instantiated when a button is pressed on position detector 2123. The button press is identified by tracker object 2130, and target object 2131 is brought into the simulation at the location and direction specified by the launcher object. Target object 2131 is identified as a rigid body to the physics engine of simulation engine 2125, and its position is updated based on the simulated weight, position, and velocity of target object 2131. Upon initial placement of target object 2131, a simulated force is applied to target object 2131 to make it move inside of simulation engine 2125.

[0264] Launcher object 2132 represents the starting location of target object 2131 and can be placed at any position inside of simulation engine 2125. For simulations that include a launcher in a high house, launcher object 2132 is located inside a digital representation of the high house.
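The object relationships of paragraphs [0259] through [0264] might be sketched as follows (illustrative Python; the class and method names, the physics-engine interface, and the numeric values are assumptions, since the preferred embodiment expresses these as Unity 3D objects):

    class TrackerObject:
        # Tracker object 2130: reads measurement data from position
        # detector 2123 through the hardware drivers.
        def __init__(self, driver):
            self.driver = driver
            self.position = None
            self.direction = None

        def update(self, dt):
            data = self.driver.read()
            self.position, self.direction = data.pose
            return data.buttons  # shot fired / target launch requests

    class WeaponObject:
        # Weapon object 2129: its pose is linked to tracker object 2130.
        def __init__(self, tracker):
            self.tracker = tracker

        def update(self, dt):
            self.position = self.tracker.position
            self.direction = self.tracker.direction

    class LauncherObject:
        # Launcher object 2132: the start location and direction for new
        # targets, e.g., inside a digital representation of the high house.
        def __init__(self, position, direction):
            self.position = position
            self.direction = direction

        def spawn_target(self, physics):
            # Target object 2131 is registered as a rigid body with the
            # physics engine; an initial simulated force makes it fly.
            target = physics.add_rigid_body(position=self.position,
                                            mass=0.105)  # illustrative mass, kg
            physics.apply_force(target, direction=self.direction,
                                magnitude=120.0)         # illustrative force
            return target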
[0265] Referring to FIG. 21C, an augmented reality system includes head unit 2122 and positioning detector 2123.

[0266] Head unit 2122 includes system computer 2121, sensors 2135, and display 2136.

[0267] Positioning detector 2123 includes sensors 2137 and is mounted to the weapon. Positioning detector 2123 provides measurement data that is used to determine the location of positioning detector 2123 with respect to the environment and the location of head unit 2122.

[0268] Sensors 2135 of head unit 2122 are used to provide measurement data that identifies the position of head unit 2122 and that generates and updates mesh object 2134. Cameras 2138 of head unit 2122 are used to locate and track registration marks on the towers and the weapon, as will be further described.

[0269] Display 2136 is mounted within head unit 2122 and
