(51) International Patent Classification:
     A61B 19/00 (2006.01); H04N 13/00 (2006.01)
(21) International Application Number: PCT/US2015/023213
(22) International Filing Date: 28 March 2015 (28.03.2015)
(25) Filing Language: English
(26) Publication Language: English
(30) Priority Data:
     61/971,749    28 March 2014 (28.03.2014)        US
     62/096,518    23 December 2014 (23.12.2014)     US
(72) Inventors; and
(71) Applicants: PANESCU, Dorin [US/US]; 5275 Country Forge Lane, San Jose, CA 95136 (US). JONES, Daniel [US/US]; 716 S. Overlook Drive, Alexandria, VA 22305 (US).
(74) Agents: MADDEN, Robert et al.; Schwegman, Lundberg & Woessner, P.A., P.O. Box 2938, Minneapolis, MN 55402 (US).
(81) Designated States (unless otherwise indicated, for every kind of national protection available): AG, AL, AM, AO, AT, AU, BA, BG, BN, BR, BW, BY, CA, CH, CL, CN, CO, CR, CU, CZ, DE, DK, DM, DO, DZ, EC, EE, EG, ES, FI, GB, GD, GE, GH, GM, GT, HR, HU, ID, IL, IN, IR, IS, JP, KG, KR, LA, LC, LK, LR, LS, LU, LY, MD, ME, MG, MW, MX, MY, NG, NI, NO, PA, PE, PG, PH, PL, QA, RO, RS, RU, RW, SA, SC, SD, SE, SG, SK, SL, SM, ST, SV, SY, TJ, TR, UA, UG, US, UZ, VC, VN, ZW.
(84) Designated States (unless otherwise indicated, for every kind of regional protection available): ARIPO (BW, GH, GM, LR, LS, MW, RW, SD, SL, ST, SZ, UG, ZW), Eurasian (AM, BY, KG, RU, TJ), European (AL, AT, BE, BG, CH, CY, CZ, DE, DK, EE, ES, FI, FR, GB, GR, HR, HU, IE, IS, IT, LT, LU, LV, MC, NL, NO, PL, RO, RS, SE, SI, SK, SM, TR), OAPI (BF, BJ, CF, CG, CI, CM, GA, GN, GQ, GW, KM, ML, MR, SN, TD, TG).
(10) International Publication Number: WO 2015/149043 A1
Published: with international search report (Art. 21(3))
Representative figure, FIG. 28: Damaged Bone; Prosthetic Q3D Model
(57) Abstract: A system to produce a replacement anatomical structure, comprising: a quantitative three-dimensional (Q3D) endoscope disposed to image a structure within a field of view that includes target tissue; at least one processor configured to: produce a first Q3D model of an anatomical structure that includes target tissue; produce a second Q3D model of the anatomical structure that includes remainder tissue in a location from which the target tissue has been removed; and produce a third Q3D model of a replacement structure based at least in part upon the first Q3D model and the second Q3D model.
QUANTITATIVE THREE-DIMENSIONAL IMAGING AND PRINTING OF SURGICAL IMPLANTS
RELATED APPLICATIONS
FIELD
BACKGROUND
using CMOS fabrication techniques. Each detector has an associated high speed
counter that accumulates clock pulses in number directly proportional to time-
of-flight (TOF) for a system-emitted pulse to reflect from an object point and be
detected by a pixel detector focused upon that point. The TOF data provides a
direct digital measure of distance from the particular pixel to a point on the
object reflecting the emitted light pulse. In a second embodiment, the counters
and high speed clock circuits are eliminated, and instead each pixel detector is
provided with a charge accumulator and an electronic shutter. The shutters are
opened when a light pulse is emitted and closed thereafter such that each pixel
detector accumulates charge as a function of return photon energy falling upon
the associated pixel detector. The amount of accumulated charge provides a
direct measure of round-trip TOF.
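The background TOF relationship above reduces to a simple calculation: the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch follows, in which the 1 GHz counter clock and the helper names are illustrative assumptions rather than details of the referenced sensor.

SPEED_OF_LIGHT_M_PER_S = 3.0e8
CLOCK_HZ = 1.0e9  # assumed high-speed counter frequency, for illustration only

def distance_from_tof_counts(counts: int, clock_hz: float = CLOCK_HZ) -> float:
    """Round-trip TOF expressed in clock ticks -> one-way distance in meters."""
    round_trip_seconds = counts / clock_hz
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

def depth_map_from_counters(counter_grid):
    """Apply the conversion to every pixel detector's accumulated count."""
    return [[distance_from_tof_counts(c) for c in row] for row in counter_grid]

# Example: a pixel that accumulated 10 ticks at 1 GHz saw a 10 ns round trip,
# which corresponds to a one-way distance of 1.5 m:
# distance_from_tof_counts(10) -> 1.5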
and determining physical distances based upon a deformation pattern of the light
due to contours of the physical object.
An imager array camera has been built that includes a plurality of pixel
arrays that can be used to compute scene depth information for pixels in the
array. High resolution (HR) images are generated from multiple low resolution
(LR) images. A reference viewpoint is selected and an HR image is generated as
seen by that viewpoint. A parallax processing technique utilizes the effects of
aliasing to determine pixel correspondences for non-reference images with
respect to the reference image pixels. Fusion and superresolution are utilized to
produce the HR image from the multiple LR images. See, e.g., U.S. Patent No.
8,514,491, entitled "Capturing and Processing Images using Monolithic Camera
Array with Heterogeneous Imager"; U.S. Pat. App. Pub. No. 2013/0070060,
entitled "Systems and Methods for Determining Depth from Multiple Views of a
Scene that Include Aliasing using Hypothesized Fusion"; and K. Venkataraman
et al., "PiCam: An Ultra-Thin High Performance Monolithic Camera Array."
control and pixel digitization. In some embodiments, the sensors S11 through S33
are arranged into a grid format as illustrated in Figure 2. In other embodiments,
the sensors are arranged in a non-grid format. For example, the sensors may be
arranged in a circular pattern, zigzagged pattern, scattered pattern, or irregular
pattern including sub-pixel offsets.
SUMMARY
Aspects of the present disclosure are best understood from the following
detailed description when read with the accompanying figures. It is emphasized
that, in accordance with the standard practice in the industry, various features are
not drawn to scale. In fact, the dimensions of the various features may be
arbitrarily increased or reduced for clarity of discussion. In addition, the present
disclosure may repeat reference numerals and/or letters in the various examples.
This repetition is for the purpose of simplicity and clarity and does not in itself
dictate a relationship between the various embodiments and/or configurations
discussed.
DESCRIPTION OF EMBODIMENTS
Brief Overview
The patient-side system 152 includes one or more mechanical arms 158.
Typically, the patient-side system 152 includes at least three mechanical surgical
arms 158A-158C (generally referred to as mechanical surgical arms 158)
supported by corresponding positioning set-up arms 156. The central mechanical
surgical arm 158C may support an endoscopic camera 101C suitable for capture
of Q3D information for images within a field of view of the camera. The
mechanical surgical arms 158A and 158B to the left and right of center may
support instruments 101A and 101B, respectively, which may manipulate tissue.
Excluding a monitor arm 154, each mechanical surgical arm 158 is used
to control instruments 101A-101C. Moreover, each mechanical surgical arm 158
is coupled to a set-up arm 156 that is in turn coupled to a carriage housing 190 in
one embodiment of the invention. The one or more mechanical surgical arms
158 are each supported by their respective set-up arm 156, as is illustrated in
Figure 6.
The mechanical surgical arms 158A-158D may each include one or more
displacement transducers, orientational sensors, and/or positional sensors 185 to
coordinates of the points in a scene adjacent the tip 208 of the elongate portion
202 and drives both the video processor 104 and a 3D display driver 109 to
compose 3D scenes, which then can be displayed on a 3D display 110. In
accordance with some embodiments, Q3D information about a surgical scene is
generated, such as numerical indicia of dimensions of surface contours of objects
in a scene or distances from objects within the surgical scene, for example. As
explained more fully below, the numerical Q3D depth information can be used
to annotate a stereoscopic image of a surgical scene with distance information or
surface contour information.
Data buses 107 and 108 exchange information and control signals among
the video processor 104, the controller 106, and the display driver 109. In some
embodiments, these elements can be integrated with the image sensor array 210
inside the body of the endoscope. Alternatively, they can be distributed
internally and/or externally to the endoscope. The endoscope is shown
positioned, via a cannula 140, to penetrate body tissue 130 in order to provide
visual access to a surgical scene that includes a target 120. Alternatively, the
endoscope and one or more instruments may also pass through a single
opening (a single incision or natural orifice) to reach a surgical site. The target
120 can be an anatomic target, another surgical instrument, or any other aspect
of the surgical scene inside a patient's body.
A viewing system having two viewing elements 401R, 401L can provide
a good 3D viewing perspective. The Q3D imaging system supplements this
viewing perspective with physical dimension information for physical structures
in the surgical scene. The stereo viewer 312 used in conjunction with a Q3D
endoscopy system, can display Q3D information overlaid onto the stereo
image of the surgical scene. For example, as shown in Figure 4, the numerical
Q3D distance value "d_Instr_Trgt" between instrument 400 and target 410 can
be displayed within stereo viewer 312.
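As an illustrative sketch only (the names and units are assumptions, not the system's actual interfaces), the displayed value can be computed as the Euclidean distance between the Q3D coordinates of the instrument tip and of the target:

import math
from typing import Tuple

Point3D = Tuple[float, float, float]  # (x, y, z) coordinates, taken here to be in millimeters

def q3d_distance(instrument_tip: Point3D, target: Point3D) -> float:
    """Euclidean distance between two points of a Q3D model."""
    return math.dist(instrument_tip, target)

# Hypothetical coordinates for illustration:
# q3d_distance((12.0, -3.5, 47.2), (15.0, -1.0, 52.0)) -> about 6.2 mm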
candidate pixels illuminated by projections from the same target. Module 402.5
configures the controller to compute two or more (Nx, Ny) pixel coordinates
for the selected target to determine whether the candidate pixels are
illuminated by a projection from the same target. Decision module 402.6
determines whether the computed 2D pixel coordinate values indicate that the
candidate pixels are illuminated by a projection from the same target. The image
diversity caused by viewing the same scene with multiple sensors Sij plays a role
in correctly identifying the (Nx, Ny) associated with a specific target in the various
individual images Sij. For example, in accordance with some embodiments,
assuming a simplified scenario where only three sensors are used, S11, S12, and
S13, if the triplet of 2D pixel coordinates [(Nx11, Ny11), (Nx12, Ny12), (Nx13, Ny13)]
does not correspond to projections of the same target onto [S11, S12, and S13],
then the quantities ŷ12 and ŷ13 (which are estimates of the projection shift in
the y direction) will yield different values. According to the equations presented
below, ŷ12 and ŷ13 should be equal if the pixel coordinates (Nx11, Ny11), (Nx12, Ny12),
and (Nx13, Ny13) come from projections of the same target:
$$\hat{y}_{12} = \frac{N_{y11}}{N_{y11} - N_{y12}} \qquad (402.5\text{-}1)$$

$$\hat{y}_{13} = \frac{2\,N_{y11}}{N_{y11} - N_{y13}} \qquad (402.5\text{-}2)$$
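A minimal sketch of this check, assuming the reconstructed forms of (402.5-1) and (402.5-2) above and an illustrative tolerance; the function names are not taken from the disclosure:

def projection_shift_estimates(ny11: float, ny12: float, ny13: float):
    """Return (y_hat_12, y_hat_13) per equations (402.5-1) and (402.5-2)."""
    y_hat_12 = ny11 / (ny11 - ny12)
    y_hat_13 = 2.0 * ny11 / (ny11 - ny13)
    return y_hat_12, y_hat_13

def same_target(ny11: float, ny12: float, ny13: float, tolerance: float = 0.05) -> bool:
    """True if the candidate pixels appear to be projections of the same target."""
    y_hat_12, y_hat_13 = projection_shift_estimates(ny11, ny12, ny13)
    return abs(y_hat_12 - y_hat_13) <= tolerance

# Consistent candidates (the S11-S13 disparity is twice the S11-S12 disparity):
# same_target(120.0, 110.0, 100.0) -> True
# Inconsistent candidates fail the check:
# same_target(120.0, 110.0, 95.0) -> False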
met for the corresponding estimates for the x axis, x̂12 and x̂13. In response to
It will be appreciated that each pixel directly captures color and intensity
information from a world scene. Moreover, in accordance with the above
Table 1
In accordance with modules 401 and 402.1, sensors of the sensor array
210 separately capture images from a world view. Figure 12 is an illustrative
drawing representing projections of the three objects of Figure 11 onto the
sensors Sij (only S11, S12, and S13 are shown) in accordance with some
embodiments. A person of ordinary skill in the art will appreciate that the
reflected light rays incident upon the sensors project images of the objects that
are in the field of view. More specifically, the rays of light reflected from the
objects in the field of view that are incident upon multiple different image
sensors of the imager array produce multiple perspective projections of the
objects from three dimensions to two dimensions, i.e. a different projection in
each sensor that receives the reflected rays. In particular, the relative location of
projections of the objects is shifted from left to right when progressing from S11
to S12 to S13. Image sensor pixels that are illuminated by incident light rays
produce electrical signals in response to the incident light. Accordingly, for each
image sensor, a pattern of electrical signals is produced by its pixels in response
to the reflected rays that indicates the shape and location of the image projection
within that image sensor.
relative to their location in sensor S13. It will be appreciated that since the FOV
viewing axes of sensors S12 and S11 are each offset to the right of the FOV viewing
axis of sensor S13 (such viewing axes being perpendicular to the plane of the
sensors), the projected images from the ROI are offset to the left in sensors S12
and S11 relative to their location in sensor S13.
reference. More specifically, the projected images within the ROI in S12 are
shifted and cross-correlated with the projected images within the ROI in S13 until
a highest correlation coefficient is achieved. Likewise, the projected images
within the ROI in S11 are shifted and cross-correlated with the projected images
within the ROI in S13 until a highest correlation coefficient is achieved. Thus,
alignment of the projections of the ROI is used to identify the locations of the
projections of the ROI in sensors S11 and S12 by determining the offset between
the projection of the ROI in S13 and the projection of the ROI in S12 and by
determining the offset between the projection of the ROI in S13 and the
projection of the ROI in S11.
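A minimal sketch of the shift-and-correlate step, assuming 2D grayscale ROI patches of equal shape and integer horizontal offsets; a practical implementation would handle image borders rather than wrapping as numpy.roll does:

import numpy as np

def best_offset(reference_roi: np.ndarray, candidate_roi: np.ndarray, max_shift: int = 20) -> int:
    """Shift of candidate_roi (in pixels) that maximizes the normalized
    correlation coefficient against reference_roi."""
    best_shift, best_coeff = 0, -np.inf
    ref = reference_roi.ravel().astype(float)
    for shift in range(-max_shift, max_shift + 1):
        shifted = np.roll(candidate_roi, shift, axis=1)  # wraps at the borders
        coeff = np.corrcoef(ref, shifted.ravel().astype(float))[0, 1]
        if coeff > best_coeff:
            best_coeff, best_shift = coeff, shift
    return best_shift

# The offsets of the ROI projections in S11 and S12 relative to S13 would then be
# best_offset(roi_s13, roi_s11) and best_offset(roi_s13, roi_s12), respectively.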
For simplicity, let us assume that all sensors Sij have N×N pixels. At a
distance z from sensor S11, the N-pixel width of the sensor expands out to a y-
dimension field of view of S11 indicated by FOV1. Likewise, at a distance z from
sensor S12, the y-dimension field of view of sensor S12 is indicated by FOV2.
Also, at a distance z from sensor S13, the y-dimension field of view of sensor S13
is indicated by length FOV3. The lengths FOV1, FOV2, and FOV3 overlap each
other, signifying that sensors S11, S12, and S13 achieve a 3-way sampling
diversity of a target physically located at some (unknown) distance z. Of course,
if the sensors are identically built, as assumed in this example, lengths FOV1,
FOV2, and FOV3 will be identical as well. It will be appreciated that the three
lengths FOV1, FOV2, and FOV3 all have the same magnitude and are coplanar in
that they are at the same (unknown) z-distance from the imager array, although
for the purpose of illustration they are portrayed as if they were stacked adjacent
to each other.
Referring to both Figure 16 and Figure 18, it can be seen that the y pixel
distance of the projection of the target is different in each sensor. The projection
of a currently selected target is disposed ny1 pixels to the left of the origin in
S11. The projection of the selected target is disposed ny2 pixels to the left of the
origin in S12. The projection of the selected target is disposed ny3 pixels to the
left of the origin in S13. As mentioned above, to simplify the explanation, it is
assumed that the projection of the target falls at the same pixel distance from
the origin in all three sensors.
Recall that the candidate pixel selections are made based at least in part
upon a correlation process that can have a level of uncertainty that can result in
inaccuracy in determination of the physical location of the selected target. Thus,
a further check of the accuracy of the target projection candidate selections, in
accordance with some embodiments, is made as follows.
$$z = \frac{\delta \cdot N}{2\,(n_{y1} - n_{y2})\cdot \tan\!\left(\tfrac{\theta}{2}\right)} \qquad (1)$$

$$y = \frac{(2\,n_{y1} - N)\cdot \delta}{2\,(n_{y1} - n_{y2})} \qquad (2)$$
Where:
$$z = \frac{2\,\delta \cdot N}{2\,(n_{y1} - n_{y3})\cdot \tan\!\left(\tfrac{\theta}{2}\right)} \qquad (4)$$

$$y = \frac{(2\,n_{y1} - N)\cdot 2\,\delta}{2\,(n_{y1} - n_{y3})} \qquad (5)$$
$$y = z\cdot\left(\frac{2\,n_{y3}}{N} - 1\right)\cdot \tan\!\left(\tfrac{\theta}{2}\right) + 2\,\delta \qquad (6)$$
Where:
Given the assumed sensor symmetry around the x and y axes, persons
skilled in the art will appreciate that the same kind of determination can be made
for the x coordinates of the target using formulations similar to those in (2)
and (5), but using nxi instead of nyi. Formulations (3) and (6) cannot be used as part
of modules 402.5 and 402.6 because they require knowledge of the z coordinate.
However, the essence of modules 402.5 and 402.6 is to determine the correct
target projections on the planes of sensors S11, S12, and S13. For this purpose,
formulations (2) and (5), adjusted for the x and y axes, are sufficient. The complete
set of coordinates (x, y, z) is computed as part of modules 403 and 404, as
described below.
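Under the reconstructed formulations (1), (2), (4), and (5) and the stated assumptions (identical N×N sensors, adjacent-sensor pitch δ, field-of-view angle θ), the depth and y coordinate can be written directly as functions of the pixel positions; the following sketch uses illustrative variable names:

import math

def z_from_pair_12(ny1, ny2, delta, N, theta):
    """Equation (1): depth from the S11-S12 disparity."""
    return (delta * N) / (2.0 * (ny1 - ny2) * math.tan(theta / 2.0))

def y_from_pair_12(ny1, ny2, delta, N):
    """Equation (2): y coordinate from the S11-S12 pair."""
    return ((2.0 * ny1 - N) * delta) / (2.0 * (ny1 - ny2))

def z_from_pair_13(ny1, ny3, delta, N, theta):
    """Equation (4): depth from the S11-S13 pair (baseline 2*delta)."""
    return (2.0 * delta * N) / (2.0 * (ny1 - ny3) * math.tan(theta / 2.0))

def y_from_pair_13(ny1, ny3, delta, N):
    """Equation (5): y coordinate from the S11-S13 pair."""
    return ((2.0 * ny1 - N) * 2.0 * delta) / (2.0 * (ny1 - ny3))

# With consistent projections (ny1 - ny3 equal to twice ny1 - ny2), both pairs agree:
# z_from_pair_12(120, 110, 1.0, 200, math.radians(60)) -> about 17.3
# z_from_pair_13(120, 100, 1.0, 200, math.radians(60)) -> about 17.3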
If (3) and (6) are outside an acceptable tolerance ε, then the iteration
continues and a new estimate for z is tried, z1. In accordance with some
embodiments, the new estimate is defined automatically. For example, z1 = z0 +
Δ, where Δ is the size of the iteration step. In general, at the k-th iteration zk = zk-1 +
Δ. The iterative process stops when condition (7) is met. A smaller Δ yields
increased accuracy in determining the correct target coordinates, but it would
also require more computational time to complete the process, hence an
increased latency. An increased latency may result in delays between surgical
instrument movement and its visualization by the operating surgeon. In other
words, the surgeon may perceive the system as lagging behind commands. For a
surgical viewing space of 20-30 cm of depth, a Δ of 0.1-0.3 mm may be
sufficient. Of course, a person skilled in the art would know to balance the size
of Δ against the computational time required to complete the iterative process.
The above explanation has been simplified for presentation reasons and,
therefore, it included only three sensors, S11, S12, and S13. In general, more
sensors can be used to increase the accuracy of Q3D coordinate computations
but also to reduce the overall number of iterations. For example, if more than
three sensors are used, preferably an array of 3×3 sensors, then methods such as
the steepest gradient may be employed to trend the direction of estimation errors
made by modules 402.5 and 403. The iterative step size and direction can then
be adjusted to match the progression towards the local extremum of the 3D error
gradient surface.
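A minimal sketch of the iteration just described, with formulations (3) and (6) passed in as generic callables of z, and with Δ, the tolerance, and the iteration limit chosen as illustrative values:

def refine_z(y_from_3, y_from_6, z0, delta_step=0.2, tolerance=0.05, max_iters=1000):
    """Step a candidate depth z by delta_step until the two y estimates agree
    within tolerance (the stopping condition described above); None on failure."""
    z = z0
    for _ in range(max_iters):
        if abs(y_from_3(z) - y_from_6(z)) <= tolerance:
            return z
        z += delta_step  # a gradient-based update could replace this fixed step
    return None

# Toy example: two estimates that agree at z = 10.
# refine_z(lambda z: 0.5 * z, lambda z: 0.4 * z + 1.0, z0=8.0, tolerance=0.01) -> about 10.0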
the process 2000. Module 2002 configures the computer to receive user input to
select at least two objects within a surgeon's field of view when looking into the
viewer 312. Module 2004 configures the computer to display a menu on a
computer console in response to receipt of a user selection. Decision module
2006 configures the computer to determine whether user input to the menu is
received to display a distance. In response to a determination that user input is
received to display a distance, module 2008 configures the computer to display a
numerical distance within the video image in the surgeon's field of view.
Decision module 2010 configures the computer to wait for a prescribed time
interval for receipt of user input to select distance display and to end operation of
decision module 2006 in response to no receipt of user input within a time out
interval.
2104 that includes a first menu item "Display Distance" 2106 and a second
menu item "Set Proximity Alarm" 2108. In response to user input to select the
"Display Distance" menu item 2106, module 2008 causes a display of the Q3D
distance between two or more objects. Referring again to Figure 4, there is
shown a display of a Q3D distance "d_Instr_Trgt" between an instrument 400
and a target displayed using module 2008. In response to user input to select the
"Set Proximity Alarm" menu item 2108, an "Enter Distance" UI input 2110 is
displayed that includes a field in which a user can enter a proximity distance
threshold value, e.g., one cm. In an alternative embodiment (not shown), a
default proximity threshold may be set in advance for all instruments, and a user
may change the proximity threshold using the menu of Figure 21, for example.
In the alternative embodiment, a user can choose to elect the default threshold
value rather than enter a threshold value. In some embodiments, a user can select
both to display the distance and set a proximity alert.
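An illustrative sketch of the proximity-alarm comparison; the default threshold and the names below are assumptions for illustration, not details of the disclosed user interface:

DEFAULT_PROXIMITY_THRESHOLD_MM = 10.0  # e.g., the one-centimeter example above

def proximity_alert(d_instr_trgt_mm: float,
                    threshold_mm: float = DEFAULT_PROXIMITY_THRESHOLD_MM) -> bool:
    """True when the measured instrument-to-target distance falls below the threshold."""
    return d_instr_trgt_mm < threshold_mm

# proximity_alert(12.5)      -> False (farther than the default 10 mm)
# proximity_alert(7.0)       -> True
# proximity_alert(7.0, 5.0)  -> False (user-entered 5 mm threshold)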
input to select an object within a surgeon's field of view when looking into the
viewer 312. For example, referring again to Figure 22, user input is shown
received to create the second highlighting areas 2206L, 2206R of the instrument
tip 400L, 400R using the video marker tool. User input (not shown) is received
to actuate a selector (not shown) (e.g., press the ENTER key) to enter the
selection of the image of the instrument tip 400L, 400R.
damaged or diseased bone tissue, a Q3D model is produced based upon image
information captured from the anatomical structure that represents the contour of
an external surface of the bone tissue, including the damaged or diseased bone
tissue, before removal. After removal of the damaged or diseased bone tissue,
another Q3D model is produced based upon image information captured from
the anatomical structure that represents the contour of an external surface of the
bone tissue, including the remainder structure, after removal. An endoscope
101C associated with a Q3D sensor is moved to a position in which the Q3D
sensor can image externally visible contours of the remaining bone tissue
surrounding the void from which damaged or diseased tissue has been removed.
In accordance with some embodiments, and as described above with reference
to Figures 6-19, the Q3D sensor is disposed to capture a Q3D image of the
remainder bone tissue surrounding the void region, and a Q3D model of the
remainder bone tissue is produced. A prosthetic implant is produced based upon
the Q3D model of the remainder bone tissue. The prosthetic implant is placed
within the void defined by the remainder bone tissue to act as a replacement for
the removed bone tissue.
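The subtraction illustrated in Figure 28 can be sketched with the two Q3D models represented as boolean voxel occupancy grids registered to a common frame; the representation and names here are illustrative assumptions, not the disclosed algorithm:

import numpy as np

def prosthetic_model(pre_removal: np.ndarray, remainder: np.ndarray) -> np.ndarray:
    """Voxels occupied before removal but absent from the remainder model:
    the void that the prosthetic implant must fill."""
    return np.logical_and(pre_removal, np.logical_not(remainder))

# Toy example: an 8-voxel block of bone from which one voxel has been resected.
# pre = np.zeros((4, 4, 4), dtype=bool); pre[1:3, 1:3, 1:3] = True
# rem = pre.copy(); rem[1, 1, 1] = False
# prosthetic_model(pre, rem).sum() -> 1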
deployed from a distal opening of catheter 14, as shown by the arrow. Upon
deployment, imaging hood 12 may be unconstrained to expand or open into a
deployed imaging configuration, as shown in Figure 33. Imaging hood 12 may
be fabricated from a variety of pliable or conformable biocompatible material
including but not limited to, e.g., polymeric, plastic, or woven materials. One
example of a woven material is Kevlar, which is an aramid and which can be
made into thin (e.g., less than 0.001 in.) materials which maintain enough
integrity for applications described in this description. Moreover, the imaging
hood 12 may be fabricated from a translucent or opaque material and in a variety
of different colors to optimize or attenuate any reflected lighting from
surrounding fluids or structures, i.e., anatomical or mechanical structures or
instruments. In either case, imaging hood 12 may be fabricated into a uniform
structure or a scaffold-supported structure, in which case a scaffold made of a
shape memory alloy, such as Nitinol, or spring steel, or plastic, etc., may be
fabricated and covered with the polymeric, plastic, or woven material. Hence,
imaging hood 12 may comprise any of a wide variety of barriers or membrane
structures, as may generally be used to localize displacement of blood or the like
from a selected volume of a body lumen, such as a heart chamber. In exemplary
embodiments, a volume within an inner surface 13 of imaging hood 12 will be
significantly less than a volume of the hood 12 between inner surface 13 and
outer surface 11.
diameter, e.g., at contact lip or edge 22, is typically greater relative to a diameter
of the deployment catheter 16 (although a diameter of contact lip or edge 22 may
be made to have a smaller or equal diameter of deployment catheter 16). For
instance, the contact edge diameter may range anywhere from 1 to 5 times (or
even greater, as practicable) a diameter of deployment catheter 16. Figure 33C
shows an end view of the imaging hood 12 in its deployed configuration. Also
shown are the contact lip or edge 22 and fluid delivery lumen 18 and imaging
lumen 20.
Although contact edge 22 need not directly contact the underlying tissue,
it is at least preferably brought into close proximity to the tissue such that the
flow of clear fluid 28 from open area 26 may be maintained to inhibit significant
backflow of blood 30 back into open area 26. Contact edge 22 may also be made
of a soft elastomeric material such as certain soft grades of silicone or
polyurethane, as typically known, to help contact edge 22 conform to an uneven
or rough underlying anatomical tissue surface. Once the blood 30 has been
displaced from imaging hood 12, an image may then be viewed of the
underlying tissue through the clear fluid 28. This image may then be recorded or
available for real-time viewing for performing a therapeutic procedure. The
positive flow of fluid 28 may be maintained continuously to provide for clear
viewing of the underlying tissue. Alternatively, the fluid 28 may be pumped
temporarily or sporadically only until a clear view of the tissue is available to be
imaged and recorded, at which point the fluid flow 28 may cease and blood 30
may be allowed to seep or flow back into imaging hood 12. This process may be
repeated a number of times at the same tissue region or at multiple tissue
regions.
stent 84 has been installed in the right coronary artery (RCA), in accordance
with some embodiments. The stent 84 is installed over lesion 86. Image sensor
array 210 may be used to provide Q3D information to evaluate the deployment
and positioning of stent 84 within lumen 72. As an example, without limitations,
the Q3D information may be used to size stent 84 according to the size of the
patient's RCA OS. The Q3D information about the lesion 86 can also be used to
size the length of stent 84 appropriately. In yet another embodiment, the Q3D
information can be used to check the performance of stent 84 once deployed in
place. For example, the Q3D information can show precisely, using quantitative
information, the fit of the stent to the inner wall of the RCA. If there is
incomplete coverage of the lesion, or if the stent diameter is too large or too
small, these conditions can all be quantified based on the Q3D information.
Figure 41 is an illustrative drawing showing an example of quantitative
evaluation criteria that can be used to assess the performance of stent coverage
of a blood vessel. One of the quantitative goals is to minimize the difference
between mean stent cross-sectional area and the mean lumen cross-sectional
area.
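This criterion can be expressed as the difference of the two means computed over per-slice cross-sectional areas measured from the Q3D data; a minimal sketch with hypothetical values follows:

from statistics import mean
from typing import Sequence

def stent_lumen_area_difference(stent_areas_mm2: Sequence[float],
                                lumen_areas_mm2: Sequence[float]) -> float:
    """Mean stent cross-sectional area minus mean lumen cross-sectional area;
    values near zero indicate a close fit of the stent to the vessel wall."""
    return mean(stent_areas_mm2) - mean(lumen_areas_mm2)

# stent_lumen_area_difference([10.2, 10.5, 10.4, 10.3], [10.0, 10.1, 10.2, 10.1]) -> 0.25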
CLAIMS
[Drawing sheets 1/38 through 38/38. Labels recoverable from the sheets:
FIG. 1 (PRIOR ART); FIG. 2 (PRIOR ART); FIG. 3 (PRIOR ART); FIG. 4; FIG. 5; FIG. 6; FIG. 7A; FIG. 7B (101C'); FIG. 8 (120); FIG. 9; FIG. 10; FIG. 11; FIG. 12 (sensors S13, S12, S11); FIG. 13; FIG. 14 (Region of Interest; sensors S13, S12, S11); FIG. 15; FIG. 16 (sensors S12, S11); FIG. 17; FIG. 18 (pixel grids of the sensors, including S12); FIG. 19; FIG. 21; FIG. 22A (viewer 312); FIG. 22B; FIG. 23; FIG. 24; FIG. 26 (Damaged Tissue; Remainder); FIG. 27C; FIG. 28 (Damaged Bone; Prosthetic Q3D Model; Subtract); FIG. 29A (3D Printer Compatible Model Of Prosthetic); FIG. 29B (3D Printer Compatible Model Of Prosthetic; 3D Printer Machine; Fabrication Material; physical Prosthetic Structure); FIG. 30 (Prosthetic Implant); FIG. 32; FIG. 33 (PRIOR ART); FIG. 33C (PRIOR ART; imaging hood 12, elements 13 and 18); FIG. 34 (PRIOR ART); FIG. 35 (PRIOR ART); FIG. 36; FIG. 37; FIG. 38; FIG. 39 (Prosthetic Heart Valve; First Valve Flap; Second Valve Flap; sensor array 210); FIG. 40 (Prosthetic; sensor array 210); FIG. 41 (n = 12; the difference of the stent area minus the lumen area).]
INTERNATIONAL SEARCH REPORT
International application No. PCT/US2015/023213

A. CLASSIFICATION OF SUBJECT MATTER
A61B 19/00 (2006.01)i, H04N 13/00 (2006.01)i
According to International Patent Classification (IPC) or to both national classification and IPC

B. FIELDS SEARCHED
Minimum documentation searched (classification system followed by classification symbols):
A61B 19/00; H04N 13/00; A61B 1/00; G09B 23/28; A61C 7/02; A61B 1/01; A61B 1/04
Documentation searched other than minimum documentation to the extent that such documents are included in the fields searched:
Korean utility models and applications for utility models; Japanese utility models and applications for utility models
Electronic data base consulted during the international search (name of data base and, where practicable, search terms used):
eKOMPASS (KIPO internal) & keywords: quantitative, 3d, model, bone, tissue, remove, defect, compare, implant, manufacturing, stereo, capture, external, successive, image, oral scanner

Date of the actual completion of the international search: 14 July 2015 (14.07.2015)
Date of mailing of the international search report: 14 July 2015 (14.07.2015)
Name and mailing address of the ISA/KR: International Application Division, Korean Intellectual Property Office