The metaphor of a “walled garden” conjures the dominant model for the next generation
cellular service as illustrated in Figure 1. Traditional mobile telecommunications firms can be
thought of as operating within a walled garden, where the applications available to the end-users
are chosen and, in some cases, even developed, by the network operator. U.S. and European
wireless carriers are looking to deploy their 3G networks within the familiar walled garden. 1 For
instance, subscribers are offered a menu of choices like call forwarding, voice mail, *## services,
etc. Developed in a world of voice-centric communications, the walled garden approach was
extended by wireless carriers in their early forays into data services using the Wireless
Application Protocol (WAP). It is now widely believed that WAP failed largely because too few
applications were available for the extra cost of the service to seem worthwhile to end users.
Developers of WAP applications were constrained by having to rely on operators not only for
access to their services but also for the experimental environment. As a result, there was
insufficient innovation for WAP to catch on and tip the market; the applications that did emerge
were too costly and held little appeal in the eyes of end users. In WAP's case, the walled garden
proved too limiting to promote enough innovative applications. This strategy stands in contrast to
the unfolding landscape of 802.11 technology, which can be managed in a distributed fashion,
allowing groups of end users or entrepreneurs to experiment with the applications that are vital to
their business or personal needs – an "open garden" model.
1
The exception to the rule is Japan's NTT DoCoMo service, where the business model clearly encourages
innovation by third parties.
[Figure 1: The walled garden model. Each cellular provider offers its subscribers a fixed menu of
services (voice mail, local traffic, local movies, IP radio) from servers inside its own network.]
off: the valleys of 802.11 correspond to the peaks of 3G in the context of the attributes users want.
Service providers stand to win with this cooperative model because users will be better served,
and there will be more of them. Together, Wi-Fi and 3G technologies form a win-win
coexistence.
This paper uses the real options approach to analyze the growing market for wireless
network-based applications.2 We recognize that a large component of the value of wireless
networks is derived from the options that they create for future applications. The options inherent
in the network, in effect, confer rights to offer new services. As with any option, the value of
such networks will increase with increasing market uncertainty. The option value will also
depend on the abundance of available options – innovative new services that can be launched off
the network. We show that by allowing easy experimentation, distributed 802.11 networks will
foster greater innovation. As market uncertainty increases, so does the amount of
innovation, and hence, also the option value of the network as long as devices interoperate
between 802.11 and 3G networks so applications can migrate from one network infrastructure to
the other.
Background on 802.11 and 3G Cellular
Before we introduce our options model, we explore the two pertinent technologies and
their current management structures.
3G cellular: Technology [1][2]
Most of the world is currently using 2nd generation (2G) cellular systems. These systems
are based on one of two dominant 2G technology standards: Time Division Multiple Access
(TDMA) and Code Division Multiple Access (CDMA). More recently, some operators have
upgraded their systems to an intermediate step (2.5G) as a stepping-stone towards the eventual
offering of 3G services. The appropriate 2.5G network depends not only on the installed base of
2G but also on the planned technology strategy for the next generation. What will come after 3G?
For now, there is extreme uncertainty about what the dominant technology will be, and what
services will be profitable for service providers.
The primary motivation to adopt 3G was to address the capacity constraints for voice
services offered on existing 2G networks. The additional capacity is achieved via several
methods. First, 3G cellular is more efficient in using bandwidth. Second, 3G networks will
operate in higher frequencies, allowing for a much wider bandwidth capacity. The higher
frequencies also necessitate smaller cells, which lend themselves to greater spectrum reuse (in non-
adjacent cells) and further enhance capacity.
Voice capacity increase may only be the tip of the iceberg for sources of value. The
greatest benefits are expected from services that are enabled by packet-switched (and
hence, always-on) data. The potential 3G data services could also leverage additional attributes
of knowledge of location, presence, and greater intelligence in the network. Vendors of cell
phones are building IP enabled phones because they believe that services will emerge that users
demand. 3G technology is a big advance over 2G for data applications because of the increased
data rate, and always-on paradigm. 3G cellular networks are also moving towards the
2
See Amram and Kulatilaka [3] for a general survey of the literature on real options. Gaynor [4] applies
these ideas to the management of telecommunications networks.
convergence of voice and data networks seen elsewhere. Nobody knows what the landscape will
look like when data and voice finally converge, and 3G cellular networks that allow Voice-over-IP
will be better able to fit into whatever world evolves. These advantages of moving from 2G to 3G
technology are what drive companies to build equipment and networks.
The evolution to 3G cellular services is complex because of the variety of 2G systems in
the world, including the Global System for Mobile Communications (GSM), a TDMA technology
used extensively in Europe and some parts of the US, and IS-95, the popular CDMA system used
in North America.
Table 1 illustrates how these standards are expected to unfold, as described in [1].
Cellular phone networks are mostly designed and managed by traditional telephone
company types – bell heads, as some call them. It is not surprising that the networks are based on
a centralized structure, since that is what these providers know best. With current models of
cellular service, end-users can never connect directly to each other, but must follow a rigid set of
protocols to have a circuit established between them by a centralized switch. Once the connection
is established, if the user moves, the network must manage the handoff. This centralized
architecture works well, just like the wired PSTN, but – also like the wired PSTN – it is not
conducive to innovation, because of the stifling effects of centralized management on the ability of
users to experiment.
[Figure: Centralized cellular network architecture, showing the radio transmitter/receiver serving
the end-user, the Base Station Controller (BSC), and the Home Location Register (HLR).]
Base Station (BS) – The radio transmitter and receiver communicating with the end-user's
wireless device. These stations are arranged in cells to maximize spatial reuse.
Base Station Controller (BSC) – Controls many Base Stations. It is responsible for managing
the handoff of calls from one cell to another as the end-user moves between cells.
Mobile Switching Center (MSC) – The switch controlling many Base Station Controllers.
It is similar to a standard switch in the PSTN, but has additional functionality to handle
the mobility of the end user. The MSC works with the HLR to track the mobile user.
Home Location Register (HLR) – Keeps track of where the mobile user is. It helps manage
user mobility by communicating with the MSCs so that it always knows which MSC a
user is currently connected to.
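As a toy sketch of the HLR's bookkeeping role (illustrative only – real HLRs speak the SS7/MAP protocol, and the class and method names here are our own invention), the HLR can be modeled as a registry mapping each subscriber to the MSC currently serving them:

```python
class HomeLocationRegister:
    """Toy model of an HLR: tracks which MSC currently serves each subscriber.
    (A sketch of the idea only -- not the real GSM/MAP protocol.)"""

    def __init__(self):
        self._serving_msc = {}  # subscriber id -> name of serving MSC

    def location_update(self, subscriber, msc):
        """Called when a subscriber registers with a new MSC after moving."""
        self._serving_msc[subscriber] = msc

    def locate(self, subscriber):
        """Queried by the network to route an incoming call to the right MSC."""
        return self._serving_msc.get(subscriber)


hlr = HomeLocationRegister()
hlr.location_update("+1-617-555-0100", "MSC-Boston")
hlr.location_update("+1-617-555-0100", "MSC-Cambridge")  # user moved between cells
print(hlr.locate("+1-617-555-0100"))  # MSC-Cambridge
```

The point of the sketch is the centralization: every movement of every handset funnels a location update to this one registry, which is what lets the network find users but also what keeps control in the operator's hands.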
This structure is centralized because the network tracks usage. The network knows what
users are doing no matter where they are. With a cellular network, servers within the network
must know the location of the end-device even when it is the end-device that initiates network
communication. In effect, cellular networks impose a centralized structure on the distributed
structure of the Internet access they provide.
There are many advantages to the centralized management structure of cellular networks.
The spectrum, equipment, and network resources are used efficiently. Users can have a one-stop
shop where all billing is consolidated. Well-engineered cellular wireless networks are an efficient
way to utilize scarce resources. As a result, there are substantial economic incentives for profit-
motivated firms to undertake the operation of such services. However, we will see later that this
efficiency must be traded off against disincentives to foster innovation.
802.11: Technology [1][5][6]
Networks based on the IEEE 802.11b standard, also known as Wi-Fi, are popping up
everywhere – in homes, offices, hotels, conference venues, airports, and other hot spots. Individuals,
loosely organized groups, and enterprises are building such networks because they are easy, work well,
and are quite inexpensive. Anybody with an Ethernet connection to the Internet (including
DSL/Cable modems) can plug in an 802.11 access point and have broadband wireless Internet
connectivity. It is really that simple. We all have such networks in our offices and in our homes.
The technologies of 802.11 are a group of standards specified by the IEEE and classified
as low-power, license-free spread spectrum wireless communication systems [1]. License-free
spectrum means anybody can use it. The spectrum must also be shared with other devices (e.g.,
microwave ovens and X10 wireless cameras). The original 802.11 standard was introduced
with a 2 Mbps data rate, later increased to 11 Mbps by 802.11b. The standard specifies both
frequency hopping spread spectrum (FHSS) and direct sequence spread spectrum (DSSS)
physical layers. The table below summarizes the 802.11 standards [1][3].
Table 2: Characteristics of 802.11 Technologies

  Standard | Spectrum                | Modulation Technique              | Bandwidth    | Distance
  a        | 5 GHz                   | OFDM                              | 54 Mbps      | 60 feet
  b        | 2.4 GHz                 | DSSS (WiFi)                       | 11 Mbps      | 300 feet
  g        | 2.4 GHz (b), 5 GHz (a)  | CCK-OFDM, (a) and (b) compatible  | 11, 54 Mbps  | 300 feet (b), 60 feet (a)
802.11 technologies are designed for hot spots, but may turn out to work well in some
wide area applications. By increasing power, or with external antennas, the range of 802.11 can
be extended for miles [4]. This means Wi-Fi is a viable "last mile" technology. Covering
neighborhoods and business parks with 802.11 access points might be a cost effective method for
providing broadband wireless services in homes, and offices.
802.11 networks allow a completely distributed structure in which individual users, at home
and in the office, install and manage their own access points. Such networks provide
temporary IP addresses behind DHCP/NAT, so nothing within the Internet knows about the
connection, which makes tracking its usage impossible.3 These networks are simple to install.
However, when individuals buy, install, and manage their own wireless access points, the overall
hardware/software and management costs can become substantial. In the completely distributed
architecture depicted in Figure 2, both Mark and Nalin have 802.11 access points in their homes; in
Mark's case, his coverage overlaps a neighborhood access point. Mark and Nalin also have
access points in their offices. These office access points are within the Boston University School
of Management (SMG) network, but are not managed by the IT department of SMG. In both
cases the access points are managed by the individuals who are their primary users, leading to
inefficient use.
[Figure 2: A distributed 802.11 architecture. Mark and Nalin each manage access points in their
homes and in their School of Management offices; Mark's home coverage overlaps a neighbor's
access point.]
3
Interestingly, it is hard for service providers to tell if a user with wired access has installed their own
wireless access point.
The extreme uncertainty of wireless markets makes this distributed management structure highly
valued.
Capturing the Options Value of Experimentation
Options
Options endow their owners with a right, but not an obligation, to respond to uncertain
future contingencies. The owner of a call option on a stock has the right to buy the stock
at some future date at a pre-determined price. The owner of a dual-fuel power plant that can burn
oil or gas may operate now using less expensive gas, but has the right to switch to oil if and when
oil becomes cheaper than gas. The owner of a license to use a frequency band in the radio
spectrum has the right to develop a communications network.
Owners of such options will exercise their rights only if future market conditions result in
an advantageous situation. The stock option holder will only exercise the option and buy the
stock if the future price of the stock exceeds the pre-determined strike price. The power plant
will switch to oil only if oil prices fall below those of gas. The owner of the spectrum will only build
out the network if and when market demand and technology costs deem the network
investment profitable. As each case illustrates, the options derive their value from the uncertainty
surrounding some underlying economic driver – stock price, relative fuel price, revenue from
wireless services.
Options limit the downside risk of an investment decision while retaining the upside
potential. Consequently, option values increase monotonically with increasing uncertainty about
market conditions and with increasing managerial flexibility. While conventional discounted cash
flow analysis relies on computing the present value of a series of future cash flows, treating
investments as options can easily factor in the effect of managerial flexibility that allows a firm to
adapt to changing conditions.4
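To make the link between uncertainty and option value concrete, here is a small Monte Carlo sketch (our own illustration, not the paper's model; the function name and parameter values are hypothetical): the expected payoff of an at-the-money option, max(S − K, 0), rises with the spread of the underlying, because the downside is truncated at zero while the upside is kept.

```python
import random

def expected_option_payoff(strike, mean, std, n_sims=20000, seed=3):
    """Estimate E[max(S - strike, 0)] for S ~ Normal(mean, std).
    The payoff floor at zero means a wider spread raises the average payoff."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        s = rng.gauss(mean, std)
        total += max(s - strike, 0.0)
    return total / n_sims

low = expected_option_payoff(strike=100, mean=100, std=5)
high = expected_option_payoff(strike=100, mean=100, std=20)
# The expected value of S is the same in both cases, yet the option is worth
# more under higher uncertainty: high > low.
```

The same logic drives the paper's argument: the underlying driver (here, the stock price) need not have a higher mean for the option on it to be worth more; greater uncertainty alone raises the option's value.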
Experimentation
In markets for radically new products and services where producers do not understand
what users want, it is vital that they experiment with different applications and with feature sets
within applications. The user sees each experiment as a distinct product and as an attempt to
meet a market need. The economic value of experimentation is clearly linked to market
uncertainty. Uncertainty is the cause of the experimenter’s inability to predict the value of the
experiment. In the absence of uncertainty, the outcome of any experiment is known with perfect
accuracy. However, as uncertainty increases, outcomes are more widely dispersed and the
predictability of the success of any experiment’s outcome is lower. This means that successful
products may generate large profits, while unsuccessful products may not. This link between
experimentation and uncertainty is intuitive, as long as the definition of uncertainty is consistent
with the second moment of results from a set of experiments.
4
There is an extensive literature regarding options applications of investments in non-financial assets,
known as real options. This literature examines a plethora of situations in the real world such as staged
investment in IT infrastructure [8], oil field expansion, developing a drug [9], the value of modularity in
designing computer systems [10], and the value of modularity in standards [12][13]. See Amram and
Kulatilaka [3] for a survey of the literature and elaborations on real options valuation techniques.
In order to illustrate this effect, consider three sets of 100 experiments where the values in
each set are normally distributed with a different variance. Figure 5 shows the simulated data points,
generated using an algorithm given in [7], for a normal distribution with mean = 0 and variance = 1, 5, and 10.
This shows how the degree of uncertainty changes the value of experimentation; when
uncertainty is low (variance = 1), as expected, the best experiment has a value around 2.5 away
from the mean. However, when uncertainty is high (variance = 10), similar data yields a value of
25 away from the mean, an order of magnitude better than the low variance case. This shows
how, in an absolute sense, as uncertainty increases, so does the possibility of performing an
experiment that is a superior match to the market, as indicated by a value far above the mean of
the distribution.
[Figure 5: Simulated experiment values (y-axis, from -25 to 25) for 100 experiments (x-axis), with
the maximum value marked for each level of uncertainty.]
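The simulation behind this figure can be reproduced in a few lines. This is a sketch of the idea rather than the paper's exact procedure: the algorithm from [7] and the random seed are not given, and the paper's "variance" settings behave as the scale (spread) of the distribution, so we treat them as such here.

```python
import random

def best_of(n_experiments, scale, seed):
    """Draw n_experiments outcomes from Normal(0, scale) and keep the best one."""
    rng = random.Random(seed)
    return max(rng.gauss(0.0, scale) for _ in range(n_experiments))

# Widening the distribution stretches the best outcome proportionally:
# higher uncertainty makes a standout experiment possible.
for scale in (1, 5, 10):
    print(f"scale {scale:2d}: best of 100 experiments = {best_of(100, scale, seed=0):6.2f}")
```

Running this shows the pattern in the text: the best of 100 low-uncertainty experiments sits a couple of standard units above the mean, while at ten times the spread the best outcome is an order of magnitude further out.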
Wireless Experimentation
For now, nobody knows what the potential is for wireless services, or who will create the
best of them. We do know that web browsing on micro screens is not “the answer”, but what is?
Lots of experimentation is needed to figure this out. Users, graduate students, vendors, and
service providers need to think of new ideas, and see if users like them. This means applications
will likely be developed on traditional wired networks, or perhaps on 802.11 wireless networks,
because this is the environment in which most Internet innovation has occurred and will continue
to occur, given the inexpensive, easy-to-use equipment that is now available.
It is likely that new services for wireless devices will span many different areas for both
business and personal use. In a recent article [14], Martin Cooper (the inventor of the cell phone)
discusses a wireless camera with Internet connectivity that sends pictures in seconds.
What a great idea for both business and personal use: field agents doing on-site
inspections could instantly transmit pictures to a manager for approval, or to a group of experts for
advice. Or imagine the fun of sending a cool picture of your kid to the grandparents within
seconds of taking it. It is ideas like this, coming from all walks of life, that will enable service
providers to get what they need – a steady stream of revenue.
Ideas for new services will come from everywhere, and the more ideas for services, the
better for users. Wireless service providers such as Telcos do not have the breadth of knowledge
to innovate the best services in all areas (for example, the wireless camera discussed above).
While Telcos are good at creating successful traditional voice services, they are not good at
innovating new services in financial or other areas lying outside their core competencies, as
illustrated in Table 3. The rows correspond to types of services, such as location tracking, where
people or equipment are tracked, or radio and video services, such as IP radio or video
conferencing. Each column represents an industry type, such as the transportation industry or
telephone companies. Each entry is how well we expect experiments from one particular industry
to fare with services of a particular type. For example, we expect companies in the financial world
to do well when experimenting with financial applications, but poorly with all other types of
services. Allowing those with the most experience in an area to experiment within that area
creates the most value.
Figure 6 illustrates the effect of Table 3 on how the value of experiments will be
distributed. The figure shows normal distribution curves with different means and variances. Firms
with average information about a particular industry are expected to have average performance
when experimenting with new services. This is indicated by the centered distribution
with a mean of zero and a variance of 1. As a firm becomes more expert in a particular service area,
the expected value of its distribution shifts to the right and the variance decreases, because its
experiments should be more successful and more focused, thus meeting the market better. As
organizations experiment in areas they know nothing about, the mean is expected to shift to the
left, reflecting the firm's inexperience in the new area, but the variance should increase, indicating
that if they are extremely lucky, they might find a great application. The figure illustrates that when
firms become more expert in a particular area, their experimentation is more focused and successful.
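The trade-off between a shifted mean and a widened variance can be checked with a quick simulation. The parameter values below are our own illustrative choices, not the paper's: an expert firm draws experiment values from a right-shifted, tighter distribution, a novice from a left-shifted, wider one. With only a few experiments the expert's focus wins; a large number of trials lets the novice's fat tail pay off.

```python
import random

def expected_best(mean, std, n_experiments, n_sims=4000, seed=7):
    """Monte Carlo estimate of the expected value of a firm's best experiment,
    when each experiment's value is drawn from Normal(mean, std)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        total += max(rng.gauss(mean, std) for _ in range(n_experiments))
    return total / n_sims

# Hypothetical parameters: expert shifted right and focused,
# novice shifted left but widely dispersed.
expert_few = expected_best(mean=1.0, std=0.5, n_experiments=3)
novice_few = expected_best(mean=-1.0, std=2.0, n_experiments=3)
expert_many = expected_best(mean=1.0, std=0.5, n_experiments=50)
novice_many = expected_best(mean=-1.0, std=2.0, n_experiments=50)
# expert_few > novice_few, but novice_many > expert_many: enough cheap
# experimentation can compensate for a lack of expertise.
```

This is exactly the paper's point about easy experimentation: a wide, unfocused distribution is only valuable if running many experiments is cheap, as it is with 802.11.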
[Figure 6: Value of Focused Experimentation – normal probability density curves with means -1, 0,
and 1 plotted against experiment value, showing how expertise shifts the distribution to the right
and tightens it.]
[Figure: Landscape of Value – a probability surface over experiment value and the focus of the
firm, with contour bands from 0-0.2 up to 0.6-0.8.]
5
In the PSTN, a connection requires every central telephone office the circuit passes through to participate in
setting up the connection and to keep track of it.
message across the SS7 network to query the 800 database; the result is a phone number the
network knows how to switch. This is an example of a centralized structure used to provide a
name resolution protocol to translate the 800 number into an area code and telephone number.
This management structure is centralized because a central authority manages both the physical
800-number server and the data on the server.
[Figure: Centralized 800-number routing – a switch sends an SS7 query to an 800 server controlled
by a third party – contrasted with the DNS hierarchy (root server, com server, eecs server,
Harvard) used for name resolution on the IP network.]
existing service requires a network change, but no change to the end device. As discussed above,
changes to the network infrastructure are difficult and expensive. In addition, changes to
infrastructure require the permission of the central manager, who is not inclined towards wild
experimentation. This difficulty of obtaining authority to change infrastructure within a network is
one reason that centralized management inhibits innovation.
Distributed management promotes innovation
Services with distributed management have no central authority controlling the users.
Typically, users can do whatever they want, allowing them to experiment with new services. One
good example of the power of allowing users to innovate is the Web. Created by a user, it vastly
surpassed anything that the so-called experts of the day could imagine. The existence of the Web
is due to the creativity of one system manager trying to better meet the needs of his users. Users
find that the lack of a central manager controlling their destiny is a liberating experience, and
creates opportunities to innovate services that meet their needs better.
One advantage of end-2-end [11] services with distributed management is that the
network does not know what applications users are developing, or what services users are
accessing. This means network infrastructure does not need to adapt to new applications; it
remains constant, and the applications use the simple yet flexible services it provides. This
property of network infrastructure being application-unaware gives innovation a powerful boost,
because it simplifies the process of experimentation and allows users to participate in innovating
new services. On the Internet, anybody can try a new end-2-end experiment. While flexible and
conducive to innovation, distributed management structure for a service can be far from efficient
in terms of management and use of resources. These benefits and disadvantages of distributed
management make the choice of management structure unclear.
The intuition behind the proposition is straightforward. We have already established that a
large component of a network's value is derived from the options it generates. It is well known from
option theory that the value of options increases as market uncertainty increases (since options
protect against the downside while retaining the upside). In addition, we have shown that
decentralized management will uncover a greater number of options as a result of experimentation.
This effect is further enhanced with increased market uncertainty.
This theory is quantifiable using techniques from extreme order statistics and real options;
examples are in [10][12]. Our mathematical model validates the results from an analytical
viewpoint, showing the tradeoffs among market uncertainty, the value of experimentation, and the
advantages of centralized management structure. Figure 8 (see Appendix I for its derivation)
illustrates the key result. The surface represents the value to users of having many choices,
showing the value of experimentation and its relationship to both uncertainty and the
number of experimental attempts for a particular module.
[Figure 8: The value of experimentation (Z-axis) as a function of the number of trials N (Y-axis)
and market uncertainty MU (X-axis).]
As expected, the value of users having many choices increases at a decreasing rate with
respect to the number of experiments, N. The value increases at a constant rate with respect
to market uncertainty, MU. The surface in Figure 8 illustrates this by showing the value (Z-axis)
of running N (Y-axis) experiments with regard to MU (X-axis). The curved lines for increasing N
show the decreasing rate of increase, while the straight lines for increasing MU show the linear
increase. The surface illustrates the relationship between market uncertainty and the number of
experiments from which users can choose. When MU is low, large amounts of experimentation
are not useful, because the best experiment is only slightly better than the average. However,
with high MU, the best of N trials is significantly better than the average.
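The shape of this surface follows from a basic property of extreme order statistics: if experiment outcomes are distributed Normal(0, MU), the expected best of N draws equals MU times the expected maximum of N standard normals, which is linear in MU and concave in N. The sketch below is our own illustration of that relationship, not the appendix's exact model.

```python
import random

def value_of_choice(trial_counts, market_uncertainty, n_sims=5000, seed=11):
    """Estimate MU * E[max of N standard normal draws] for each N in
    trial_counts, nesting the trials so every N reuses the same draws."""
    rng = random.Random(seed)
    n_max = max(trial_counts)
    sums = dict.fromkeys(trial_counts, 0.0)
    for _ in range(n_sims):
        best = float("-inf")
        for i in range(1, n_max + 1):
            best = max(best, rng.gauss(0.0, 1.0))
            if i in sums:
                sums[i] += best  # running best after i experiments
    return {n: market_uncertainty * s / n_sims for n, s in sums.items()}

v = value_of_choice([5, 10, 20], market_uncertainty=1.0)
# Doubling MU doubles every value; doubling N adds less and less.
```

The two assertions of the surface are visible directly: scaling MU rescales the whole curve linearly, while each doubling of N contributes a smaller increment, since the expected maximum of N normals grows only on the order of the square root of log N.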
Application to 3G and 802.11
This section applies the above theory to predict the evolution of 802.11 and 3G cellular
wireless network infrastructures. Our theory illustrates the value of the experimentation made possible
by the distributed management structure of 802.11 technology. It also demonstrates
how centrally managed cellular wireless networks can capture this value of innovation by learning
what services work well in other wireless environments, and then using the business and
technical advantages of central management to provide these successful services efficiently. We
demonstrate how value is created when 3rd generation cellular service providers gain minutes of
airtime or packets of data transfer from services provided by independent service providers.
Additional value is gained from service providers that have business relationships with the cellular
provider, and from services the cellular providers themselves provide. This theory explains how
innovation in experiment-friendly environments is required to find the services users want, and
how centralized management is better able to use efficiently the limited resources inherent in
wireless networks. By co-existing and building on each other's strengths and weaknesses, 802.11
and next generation cellular networks will combine to create great value: experimentation
in distributed networks will allow users to get services that meet their needs, and the efficiencies
of centralized management will allow these services to be priced so that many users can afford them.
Building services that are unaware of the underlying network is important to capturing the
most value from the co-existence of 802.11, next generation cellular networks, and the rest of the
Internet. Applications that are unaware of the underlying network allow a seamless migration from one
network infrastructure to another. The most value is created when services developed on one
network architecture can migrate to another without changing the application, because this
migration allows the value of experimentation under distributed management to be captured by
networks with a centralized management structure.
802.11 compared to Cellular
The value of experimentation with 802.11 is tremendous: it is easy to do, and
inexpensive. 802.11 allows experimentation in the high-yield region of Figure 6 because it is so easy
to build an experimental environment; even students in many schools do wireless projects. The
point is that there is so much experimentation going on, and market uncertainty is so high, that one of
these days (hopefully soon) someone will discover a killer application, or at least enough niche
applications to keep service providers alive. This combination of high market uncertainty and easy
experimentation is very valuable in the context of meeting uncertain user needs.
Installing equipment is only part of the cost of wireless infrastructure, because the
infrastructure must also be managed, which turns out to be expensive. The centralized
management of cellular networks makes this easier: workers can be trained and equipment can
be standardized, unlike the heterogeneous equipment used in 802.11 infrastructure, where each
user decides which vendor's equipment to use. Standard equipment and widespread deployment
simplify these management tasks. This is a far cry from the distributed management of
802.11 networks, where each individual decides what to use and how to manage it. When it
comes to managing equipment, the centralized structure of 3G has value because it reduces cost.
The above illustrates the complex tradeoffs between 802.11 and 3G. The case for co-
existence is strong: compatibility between 802.11 and 3G is essential to capturing the
most value from network services in general, and from wireless network services in particular. Figure 9
is one view of the future world. It demonstrates the ubiquitous nature of current and next
generation cellular technology. The entire region is blanketed with cellular service, which works
everywhere – in buildings, in subways, outside the city. It is not the highest speed, but the ubiquitous
coverage makes it a valuable technology. There are other spots where the economics make
sense to build a higher speed, but more expensive, 802.11 infrastructure. Locations such as
airports, offices, and even homes are examples where this approach works well. Even business
parks and dense neighborhoods might be good candidates for 802.11 wireless coverage. There
are also many places where wired connectivity makes the most sense. Our desktop systems at home
and in the office fit this category nicely: they never move, and we want the highest bandwidth and
security, which standard switched Ethernet LANs provide. This figure demonstrates
that there is room for all three technologies; all three will co-exist and thrive – each service
creating value for the others because of the co-existence of these technologies and the
transparent usage of services between them all.
Conclusion
This paper illustrates the link between market uncertainty and management structure in
wireless services, predicted by previous work [4][12][13], to explain why 802.11 and next
generation cellular technologies will co-exist – each adding value to the other. For now, the needs
of users seem mysterious to vendors, service providers, and even the users themselves, who are unable to
articulate what they want. We explored why the flexibility of 802.11 technology, which allows a
distributed management structure, has such value when market uncertainty is high: it is so easy to
build infrastructure and experiment with new services. We also explained the benefits of 3G technology,
which has business and technical advantages over 802.11. 802.11 and 3G network
infrastructure will co-exist, each gaining value from the other.
This paper addressed one main economic factor important to the argument for co-
existence of 3G cellular and Wi-Fi technology: maximizing the value of services to cellular service
providers by supplying services that better meet user needs, thanks to the increased
experimentation possible with Wi-Fi and the migration of successful services into cellular
networks. The economic case for viewing Wi-Fi and 3G cellular as complements, not substitutes,
is strong because both users and service providers win.
Our argument uses an options framework to illustrate the value of 802.11 technologies,
which allow users to experiment, and to show how next generation cellular networks can capitalize on
the innovation occurring in 802.11 networks. We also discuss the value of services migrating from
one network infrastructure to another; this is based on the real options approach, and is something
we plan to investigate in future research. The main point of this paper is that giving users as many
choices as possible when market uncertainty is high creates the most value. These choices need
to include both what the applications are and how the user accesses them. The higher the market
uncertainty, the greater the value of giving users these many choices. This real options framework
is one way to quantify what seems to be happening with the ground-up adoption of 802.11
technology.
In uncertain markets, gaining the most from experimentation requires that the most users have the greatest number of choices. The more users there are, the greater the potential profit from providing a popular service; the more choices these users have, the better the odds of a really successful service. Encouraging interoperability between wireless technologies enlarges the total number of users. More total users means more users for each technology, since users will tend to use both, picking the technology that fits the particular situation, as long as the business model supports this interoperability. Furthermore, the users of 802.11 will have an increased ability to experiment, implying that many good services for the cellular market will incubate in the Wi-Fi environment. The effects of co-existence, illustrated in Figure 10, are helpful to both technologies. The x-axis is the number of users; the y-axis is the value of the service to each user.
N1 and N2 are the numbers of users of 3G and Wi-Fi wireless service when the two are treated as substitutes. The bottom curve is the value of 3G and the top curve is the value of Wi-Fi; as expected, as the number of users grows, so does the value to each user. The blue star is the value of having N1 users. More users implies more value: when cellular service providers can tap into the Wi-Fi market, the number of users increases to N1 + N2, and the value increases to 3GC, the red dot. The figure illustrates this because 3GS < 3GC. The same effect is seen on the Wi-Fi value curve: WiFiS < WiFiC. More users equates to more value per user. Thus, the minimal value of treating the technologies as complements is (3GC + WiFiC) - (3GS + WiFiS). The maximum value is obtained by adding the option value of services discovered in the Wi-Fi world that migrate into services provided by cellular providers; this best-of-many value is illustrated in Figure 6. In the case of these technologies, cooperation has more value than competition because the network effect is enhanced by the greater value of experimentation possible with Wi-Fi technology.
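The minimal complementarity gain can be sketched numerically. The sketch below assumes a hypothetical logarithmic network-effect curve and made-up user counts (`value_per_user`, `N1`, `N2`, and the scale factors are our assumptions, not numbers from the paper); it only illustrates that pooling the user bases raises the per-user value of both technologies.

```python
import math

def value_per_user(n_users, scale):
    # Hypothetical network-effect curve: value to each user rises
    # (with diminishing returns) as the reachable user base grows.
    return scale * math.log(1 + n_users)

# Made-up user counts: N1 (3G) and N2 (Wi-Fi).
N1, N2 = 2_000_000, 500_000
G3_SCALE, WIFI_SCALE = 1.0, 1.5   # Wi-Fi curve drawn above the 3G curve

# Substitutes: each technology reaches only its own users.
g3_s = value_per_user(N1, G3_SCALE)
wifi_s = value_per_user(N2, WIFI_SCALE)

# Complements: interoperability lets every user reach all N1 + N2 users.
g3_c = value_per_user(N1 + N2, G3_SCALE)
wifi_c = value_per_user(N1 + N2, WIFI_SCALE)

# Minimal complementarity gain: (3GC + WiFiC) - (3GS + WiFiS).
gain = (g3_c + wifi_c) - (g3_s + wifi_s)
print(g3_s < g3_c, wifi_s < wifi_c, gain > 0)
```

Any increasing value curve gives the same qualitative result: 3GS < 3GC, WiFiS < WiFiC, and a positive gain from treating the technologies as complements.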
Figure 10: Value to each user versus number of users. The Wi-Fi value curve (WiFiS rising to WiFiC) lies above the 3G curve (3GS rising to 3GC); both rise as the user base grows from N1 or N2 alone to N1 + N2 combined.
This co-existence of 3G and Wi-Fi is unfolding in Europe. In July 2002, the mobile arm of the Danish telephone company (TDC Mobile) announced plans to integrate 802.11 with its GSM cellular service [16]. The SIM within a GSM phone contains billing information about the wireless subscriber that can be used to authenticate the Wi-Fi user, and Nokia Corp. has developed an integrated Wi-Fi SIM card that uses the GSM network to authenticate the user. These are good examples of the synergy that Wi-Fi and cellular service can share to enhance users' broadband experience in a scalable and secure way.
Service providers must come from every nook and cranny if the best services are to be found. The addition of the N2 users is important because these are Wi-Fi users who can experiment and innovate. Furthermore, the N1 users who are normally constrained to 3G service will also be able to experiment and innovate, because they will be able to use both technologies. As discussed before, and illustrated in Figure xx and xx, different groups have different expertise, and with co-existence more of these groups will be able to experiment. Large Telco-like service providers need to look everywhere for new services, capitalizing on the massive innovation that occurs when niche players experiment in their areas of expertise. The most value comes from giving users the most choices, because some of those choices might be superior matches to the uncertain market.
The current business model of cellular service providers seems opposite to what is needed to capture value from next-generation cellular data services provided by smaller, niche providers specializing in particular markets. Today's wireless service providers' market plans encourage most users to buy more airtime than they use. This model will not transfer well to cellular data services: if wireless service providers gain only additional minutes from independent service providers, they lose, because they get no additional revenue as users' usage approaches their allowable allocation. As users begin to use up their minutes on services not directly related to the wireless provider, the wireless provider's costs rise while generating no additional income. To capture the most value from services provided by independent service providers, cellular providers should bill per data packet.
I-mode, a next-generation cellular data service started in Japan, is succeeding with its "open garden" business model. I-mode services are of three main types: those offered by the wireless service provider (DoCoMo); services offered by other vendors but affiliated with I-mode, because DoCoMo manages the billing; and services with no association with DoCoMo that provide only extra data-transfer charges to it. Models like I-mode, in which users pay as they go, may provide the economic incentives needed to convince wireless providers to open up their networks, as DoCoMo is successfully doing in Japan. By adapting their current business models to promote independent service providers, these wireless service providers can capture the most value.
[1] Rappaport, T. (2002). Wireless Communications: Principles and Practice, 2nd ed. Prentice Hall.
[2] Smith, C. and Collins, D. (2002). 3G Wireless Networks. McGraw-Hill.
[3] Amram, M. and Kulatilaka, N. (1999). Real Options: Managing Strategic Investment in an Uncertain World. Boston, MA: Harvard Business School Press.
[4] Gaynor, M. (2001). The Effect of Market Uncertainty on the Management Structure of Network-Based Services. Ph.D. thesis, Harvard University.
[5] Gast, M. (2002). 802.11 Wireless Networks: The Definitive Guide. O'Reilly.
[6] Flickenger, R. (2002). Building 802.11 Wireless Community Networks. O'Reilly.
[7] Hillier, F. S. and Lieberman, G. J. (1967). Operations Research. San Francisco, CA: Holden-Day, p. 631.
[8] Balasubramanian, P., Kulatilaka, N., and Storck, J. (1999). "Managing Information Technology Investments Using a Real-Options Approach," Journal of Strategic Information Systems, Vol. 9, Issue 1.
[9] Myers, S. and Howe, C. (1997). "A Life-Cycle Financial Model of Pharmaceutical R&D," Working Paper, Program on the Pharmaceutical Industry, Sloan School of Management, MIT.
[10] Baldwin, C. and Clark, K. (1999). Design Rules: The Power of Modularity. Cambridge, MA: MIT Press.
[11] Saltzer, J., Reed, D., and Clark, D. (1984). "End-to-End Arguments in System Design," ACM Transactions on Computer Systems 2(4), pp. 277-288.
[12] Gaynor, M. and Bradner, S. (2002). Knowledge, Technology and Policy, special issue on IT standardization.
[13] Gaynor, M. and Bradner, S. (2001). "The Real Options Approach to Standardization," Proceedings of the Hawaii International Conference on System Sciences.
[14] Cooper, M. (2002). "What Ails Broadband?" IEEE Spectrum, September 2002.
[15] Ambeler, C. "Custom Local Area Signaling Services Class," No. 111A ESS Div. 3 Sec. 1z3. hyperarchive.lcs.mit.edu/telecom-archives/archives/technical/class.ss7.features
[16] Blau, J. (2002). "Wi-Fi Hotspot Networks Sprout Like Mushrooms," IEEE Spectrum, September 2002.
Figure: A standard normal distribution (S.D. = 1) of experiment values with mean V. About 34.1% of the probability lies within one standard deviation of the mean on each side, 13.5% between one and two standard deviations, and 2% between two and three. The expected value of the best experiment is U = V + (s.d.)*Q(y), where Q(y) is the benefit of many parallel experiments; U(10) and U(100) are marked on the axis.
it, 13.5% between one and two standard deviations, and 2% between two and three standard deviations from the mean. This matches the simulation results in Figure 4. To find a module with a value greater than +3 standard deviations from the mean, we expect to need over 769 experiments on any particular standard; finding great results may take on the order of 1000 attempts.
Figure 10: Best of Many Experiments
This figure shows U(10) and U(100), the expected values of the maximum of 10 and 100 experiments; that is, U(10) is the value of the best experiment in a sample of 10. This maximum has two components: the mean of the distribution (V) and the offset from the mean. The offset itself reflects two factors: the standard deviation and the effect of parallel experimentation. Thus, we can express U(n) in terms of these parts: U(n) = V + Q(n)*S.D. That is, the maximum of n experiments equals the distribution mean plus Q(n) times the standard deviation of the normal distribution, where Q(n) measures how many standard deviations U(n) lies above the mean. Intuitively, it makes sense that U(n) >= V: to guarantee an expected value of V, we could run the n experiments, take the first one (expected value V), and disregard the rest. It also follows that the probability of U(n) greatly exceeding V increases as n or the variance grows.
Roughly, for n = 2, Q(n) = 0.85; for n = 10, Q(n) = 1.5; for n = 100, Q(n) = 2.5; and for n = 1000, Q(n) = 3, again matching what is observed in Figure 4. The intuition is that as the number of experiments increases, the value of the best experiment grows further from the mean, but at a decreasing rate.
As uncertainty increases, so does the gain from experimentation, and thus the potential for profit. To see how this works, consider the following example: let S.D. = 1 and n = 10, with a mean of zero. With n = 10, Q(10) = 1.5, so U = 1 * 1.5 = 1.5. However, if we increase the standard deviation to 2, then U = 2 * 1.5 = 3. This example shows that Q(n) measures the number of standard deviations U lies away from the mean, so the payoff from the best of n experiments scales with the uncertainty.
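The behavior of Q(n) can be checked with a quick Monte Carlo sketch (the function name and trial counts are ours; estimates vary with the random seed and need not match the rounded figures quoted above):

```python
import random

def q_estimate(n, trials=20000, seed=1):
    # Monte Carlo estimate of Q(n): the expected maximum of n
    # independent standard-normal experiment values, measured
    # in units of the standard deviation.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += max(rng.gauss(0.0, 1.0) for _ in range(n))
    return total / trials

q2, q10, q100 = q_estimate(2), q_estimate(10), q_estimate(100)
print(q2 < q10 < q100)                      # best-of-n grows with n ...
print((q10 - q2) / 8 > (q100 - q10) / 90)   # ... at a decreasing per-experiment rate
```

Because an experiment's value is V + (s.d.)*Z for a standard normal Z, doubling the standard deviation doubles the estimated offset of the best experiment from the mean, as in the example above.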
This model, based on the best of many experiments, is option-based, since a designer holds several options for a particular module. When only a single choice for a service exists, the expected value is lower than when the designer has many choices for the same service. The model illustrates how uncertainty increases the benefit of many choices.
We assume a normal distribution and experiments that are uncorrelated with each other, but the basic idea holds for other distributions and for correlated experiments: the expected value of the best of many experiments may greatly exceed the mean, and it is always at least as good as the expected value of a single experiment. The next section uses this result to model the expected value of giving users choice.
A Simple Model of Modularity in Standards
Below is one possible mathematical model based on our theory about the value of
modularity and the extreme order statistics discussed above. This model shows the value of having
many choices for any particular service.
As illustrated above, the marginal value of having more than one choice for a particular
service is given by:
Equation 1: U(n) = MU * Q(n)
Figure 8 shows this surface, representing the additional value of having n choices for a particular module. It plots the value of experimentation against both market uncertainty and the number of experimental attempts for a particular module.
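Equation 1 is easy to tabulate; the sketch below uses the rounded Q(n) values quoted earlier and treats MU as the standard deviation of an experiment's value (the function name is ours):

```python
# Rounded Q(n) values quoted in the text (units: standard deviations).
Q = {2: 0.85, 10: 1.5, 100: 2.5, 1000: 3.0}

def marginal_value(mu, n):
    # Equation 1: U(n) = MU * Q(n), where MU is the market
    # uncertainty (the standard deviation of an experiment's value).
    return mu * Q[n]

# The surface as a small grid: the value of experimentation grows
# with both market uncertainty and the number of experiments.
for mu in (0.5, 1.0, 2.0):
    print(mu, [marginal_value(mu, n) for n in sorted(Q)])
```

Reading along a row shows the diminishing returns to more experiments; reading down a column shows the value scaling linearly with uncertainty.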