The Incredible Shrinking Computer
During the past half century computers have increased in performance
even as they have shrunk in size. In the 1940s they took more than
a millisecond to perform even a simple addition, yet filled a larger
room and consumed more electrical power than even the largest of today's
supercomputers. By the 1960s transistors had allowed computers to
shrink to the size of a few refrigerators and consume only a few
kilowatts. The 1980s saw the advent of desktop computers. Laptop
computers emerged in the early 1990s, and handheld computers followed
soon after.
Software is Harder than Hardware
In the beginning the main problem was to build computers and keep them
running. Software was almost an afterthought: a few lines of code
sufficed to calculate the ballistics tables and other simple but tedious
computational tasks for which computers were originally conceived.
Today the situation is entirely the reverse. While the Central
Processing Unit (CPU) has grown enormously in complexity, its growth is
managed through tight discipline and clear specifications under the
leadership of a very small set of designers, in the case of the x86
architecture today principally Intel, AMD, and Via. The CPU and its
associated chipsets are then assembled into boards governed largely by
the same discipline, exercised through reference designs that the CPU
companies produce to guide board designers.
The software that runs on today's computers, on the other hand, has
grown in a relatively undisciplined way to many millions of lines of
code under largely decentralized management. Although Microsoft plays a
leading role in that growth, it is far from being the only contributor
to the software world. Moreover, even the area in which Microsoft
excels, operating systems, has grown like Topsy in a way that is
rapidly becoming impossible to manage in its present form, as witnessed
by the five-year gestation period and repeated delays Microsoft
experienced in bringing Vista, its latest incarnation of Windows NT, to
market.
Homogeneity-Heterogeneity: a moving barrier
As each new generation of smaller computers has emerged, its software
has mirrored the evolution of the software industry as a whole. Initially
the software for the new machines is relatively primitive and evolves in
a decentralized way. With increasing complexity comes increasing
unmanageability, until a point is reached at which the only path to
further growth is to replace the ad hoc software with the same software
already established on older, larger machines.
As this cycle repeats, the software on all machines above a certain size
becomes homogeneous, while the smaller machines continue to run a
heterogeneous mix of software.
Above this barrier, software written for a machine of any size will run
on a machine of any other size, provided the same software operating
environment is used. In particular, any program written for a deskside
or desktop workstation running Windows XP will also run on an XP
laptop. While this was not the case when laptops first appeared, the
competitive advantage of full compatibility with desktops ultimately
drove laptop software architecture to mimic that of desktops.
This common operating environment for laptops, desktops, and everything
larger in effect defines a notion of full-function computing, in
which every capability of one machine is a capability of all.
The Status Quo
Today, above the barrier, the smallest machines running
homogeneous software are laptops (counting the Tablet PC as a variant of
the laptop). Below the barrier, smaller computers divide mainly into
Personal Digital Assistants (PDAs), mobile phones, point-of-sale (POS)
computers, handheld data collection devices, and embedded computers.
The dominant operating systems below the barrier are Symbian, Windows
CE (the basis of Windows Mobile), PalmOS, and Linux. Symbian dominates
the cellphone market. PalmOS dominated the PDA market early on but has
since been overtaken by Windows CE. The economics of point-of-sale and
data collection devices justifies custom-built operating systems.
Historically the same held for embedded computers, but that area is now
well served by Linux, particularly in its real-time incarnations.
The Tiqit Bet: continued movement of the barrier
The Tiqit bet is that this barrier will move down into the handheld
computer arena, currently populated by four of the above-mentioned
kinds of devices: PDAs, cellphones, data collectors, and embedded
computers. Tiqit's mission is to bring full-function capability to the
handheld computer space.
The guiding principle here is that software is harder than
hardware. Instead of developing hardware and then writing software for
it, the strategy is to take the existing body of software defining
"full-function computing" and slide the appropriately scaled hardware
underneath it.
There are two basic challenges here. The first is to build a computer
having the same functionality as a laptop but in the form factor of a PDA.
The second is to adapt the user interfaces that have evolved
for large displays to the small displays of handheld computers.
Like its competitors in this new and thus far undeveloped market,
principally OQO, Sony, Samsung, and ASUS, Tiqit has begun with the
first of these. In parallel, Tiqit is exploring the necessary
adaptations of the user interface to computers of this size.